Untitled
a guest
Mar 29th, 2017
# In this exercise we'll examine a learner which has high variance, and tries to learn
# nonexistent patterns in the data.
# Use the learning_curve function from sklearn.model_selection to plot learning curves
# of both training and testing scores. Use plt.plot() within the plot_curve function
# to create line graphs of the values.

from sklearn.tree import DecisionTreeRegressor
import matplotlib.pyplot as plt
from sklearn.model_selection import learning_curve, KFold
from sklearn.metrics import explained_variance_score, make_scorer
import numpy as np

size = 1000
cv = KFold(shuffle=True)
score = make_scorer(explained_variance_score)

X = np.round(np.reshape(np.random.normal(scale=5, size=2 * size), (-1, 2)), 2)
y = np.array([np.sin(x[0] + np.sin(x[1])) for x in X])

def plot_curve():
    # An unconstrained tree memorizes the training data, so its training
    # score will be (nearly) perfect.
    reg = DecisionTreeRegressor()
    reg.fit(X, y)
    print(reg.score(X, y))

    # Create the learning curve with the cv and score parameters defined above.
    psize, train, test = learning_curve(reg, X, y, cv=cv, scoring=score)

    # Plot the mean training and testing curves across the CV folds.
    plt.plot(psize, np.mean(train, axis=1))
    plt.plot(psize, np.mean(test, axis=1))

    # Show the result, scaling the axis for visibility.
    plt.ylim(-.1, 1.1)
    plt.show()

plot_curve()
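The high-variance signature the exercise describes can also be checked without plotting: an unconstrained DecisionTreeRegressor scores (nearly) perfectly on its own training data while its cross-validated score stays lower. Below is a minimal sketch of that check on the same kind of data; the `random_state` values and fold count are illustrative assumptions, not part of the original exercise.

```python
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.tree import DecisionTreeRegressor

# Same data-generating process as the exercise, seeded for repeatability
# (the seed is an illustrative choice).
rng = np.random.default_rng(0)
size = 1000
X = np.round(rng.normal(scale=5, size=(size, 2)), 2)
y = np.sin(X[:, 0] + np.sin(X[:, 1]))

cv = KFold(n_splits=5, shuffle=True, random_state=0)

# Fully grown tree: splits until leaves are (almost) pure.
deep = DecisionTreeRegressor(random_state=0)

# Training score: the tree memorizes the data it was fit on.
deep_train = deep.fit(X, y).score(X, y)

# Cross-validated score: measured on held-out folds instead.
deep_cv = cross_val_score(deep, X, y, cv=cv).mean()

print(f"train R^2: {deep_train:.3f}")
print(f"CV R^2:    {deep_cv:.3f}")
```

The gap between the two numbers is the same gap the learning-curve plot shows between the training and testing lines.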