Untitled

a guest
Jun 19th, 2019
# fitting the knn with a train-test split
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import log_loss

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# optimisation via grid search
knn = KNeighborsClassifier()
params = {'n_neighbors': range(1, 40),
          'weights': ['uniform', 'distance'],
          'metric': ['minkowski', 'euclidean'],
          'algorithm': ['auto', 'ball_tree', 'kd_tree', 'brute']}
k_grd = GridSearchCV(estimator=knn, param_grid=params, cv=5)
k_grd.fit(X_train, y_train)

# testing
yk_grd = k_grd.predict(X_test)

# calculating the log loss
print(log_loss(y_test, yk_grd))

Running this raises:

y_true and y_pred contain different number of classes 93, 2. Please provide the true labels explicitly through the labels argument. Classes found in y_true.

X_train.shape, X_test.shape, y_train.shape, y_test.shape, yk_grd.shape
# results
((742, 192), (248, 192), (742,), (248,), (248,))
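
The likely cause of the error: `log_loss` expects predicted class probabilities (one column per class), but `k_grd.predict(X_test)` returns hard class labels, so `log_loss` sees only the label values that happen to appear in `yk_grd` (2) versus the classes in `y_test` (93). Using `predict_proba` and passing the fitted classifier's `classes_` through the `labels` argument should fix both problems. A minimal sketch of the corrected call, using a small `make_classification` stand-in dataset (the paste's own `X` and `y` are not shown) and a reduced parameter grid:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import log_loss

# stand-in data; in the paste, X has 192 features and y has 93 classes
X, y = make_classification(n_samples=200, n_features=10, n_classes=3,
                           n_informative=5, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=42)

knn = KNeighborsClassifier()
params = {'n_neighbors': range(1, 10)}  # reduced grid for brevity
k_grd = GridSearchCV(estimator=knn, param_grid=params, cv=5)
k_grd.fit(X_train, y_train)

# probabilities, one column per class in k_grd.classes_ order
proba = k_grd.predict_proba(X_test)
# pass labels explicitly so log_loss knows the full class set,
# even if some classes are missing from y_test
score = log_loss(y_test, proba, labels=k_grd.classes_)
print(score)
```

`GridSearchCV` delegates `predict_proba` and `classes_` to its `best_estimator_`, so the fitted grid-search object can be used directly in place of a plain classifier here.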