import pandas as pd
import numpy as np
from scipy import optimize

df = pd.read_csv('SolverPython.csv')

## Goal: minimize the total relative error, error = sum(|actual - predicted| / actual),
## where the predicted value comes from a linear regression equation
## (y = intercept + betas * x), with each regression term scaled by an unknown x[i].

def equation(x, df):
    ## Predicted values: fixed regression coefficients, each scaled by an unknown x[i].
    predicted = (66.75
                 - 0.9 * df['variable1'] * x[0]
                 - 14.02 * df['v2'] * x[1]
                 - 2.57 * df['v3'] * x[2]
                 + 0.82 * df['v4'] * x[3])
    ## Sum of the absolute relative errors across all rows.
    totalError = (abs(df['target'] - predicted) / df['target']).sum()
    return totalError

x0 = [1, 1, 1, 1]  ## initial values of the unknowns you want to optimize

## args must be a tuple: (df) is just df, which SLSQP would unpack incorrectly.
res = optimize.minimize(fun=equation, x0=np.array(x0), args=(df,), method='SLSQP',
                        bounds=[(0, 20), (0, 20), (0, 20), (0, 20)])

print(res)
print(res.x)  ## print the optimized values of x
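
## A quick sanity check (a minimal sketch, using only fields SciPy's OptimizeResult
## actually provides): res.success and res.message report whether SLSQP converged,
## and res.fun is the minimized total error, which you can compare against the
## error at the initial guess to see the improvement.
print(res.success, res.message)
print('error at initial guess:', equation(np.array(x0), df))
print('minimized error:', res.fun)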