Random Forest

44 samples
1000 predictors

No pre-processing
Resampling: Cross-Validated (10 fold)
Summary of sample sizes: 40, 39, 40, 40, 40, 40, ...
Resampling results across tuning parameters:

  mtry  RMSE         Rsquared   MAE
   2    0.001762244  0.8374687  0.001510297
   3    0.001763794  0.8220995  0.001507957
   4    0.001784599  0.8018954  0.001523252
   5    0.001785400  0.7992725  0.001528275
  10    0.001813925  0.7805094  0.001548873
  25    0.001862114  0.7484289  0.001588793
  50    0.001892789  0.7324827  0.001614362

RMSE was used to select the optimal model using the smallest value.
The final value used for the model was mtry = 2.

Random Forest

44 samples
1000 predictors
2 classes: 'a', 'b'

No pre-processing
Resampling: Cross-Validated (10 fold)
Summary of sample sizes: 39, 39, 40, 40, 40, 40, ...
Resampling results across tuning parameters:

  mtry  Accuracy  Kappa
   2    0.960     0.9090909
   3    0.960     0.9090909
   4    0.960     0.9090909
   5    0.960     0.9090909
  10    0.960     0.9090909
  25    0.935     0.8590909
  50    0.915     0.8136364

Accuracy was used to select the optimal model using the largest value.
The final value used for the model was mtry = 2.

# values of mtry to test
tune_grid <- data.frame(mtry = c(2, 3, 4, 5, 10, 25, 50))

# 10-fold cross-validation; set classProbs = TRUE for classification models
# that need class probabilities (not needed for this regression fit)
train_control <- trainControl(method = "cv", number = 10,
                              savePredictions = TRUE, classProbs = FALSE)

train(response ~ ., data = training_set, method = "rf", ntree = 5000,
      trControl = train_control, tuneGrid = tune_grid)

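The object returned by train() carries the resampling table shown above and the selected tuning value. A minimal sketch of inspecting it, assuming the call above is assigned to a variable (rf_fit is a name introduced here, not from the original paste):

```r
library(caret)

# assumes the train() call above was assigned, e.g.:
# rf_fit <- train(response ~ ., data = training_set, method = "rf",
#                 ntree = 5000, trControl = train_control, tuneGrid = tune_grid)

rf_fit$results    # one row per mtry value: RMSE, Rsquared, MAE (the table above)
rf_fit$bestTune   # the selected tuning parameters, here mtry = 2
head(rf_fit$pred) # held-out predictions, available because they were saved in trainControl
```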
> cor_mat <- cor(training_set[, -1001], method = "spearman")
> quantile(cor_mat, probs = seq(0, 1, by = 0.1))
         0%         10%         20%         30%         40%         50%
-0.90500352 -0.32445384 -0.23044397 -0.15052854 -0.07019027  0.01451727
        60%         70%         80%         90%        100%
 0.09880545  0.17688513  0.25567301  0.35334743  1.00000000
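The quantiles above show sizeable pairwise correlation among the 1000 predictors (a tenth of the pairs exceed 0.35 in Spearman correlation). One way to thin such a predictor set before refitting is caret's findCorrelation(), which flags columns whose pairwise correlation exceeds a cutoff. A sketch, assuming the response sits in column 1001 as in the cor() call above; the 0.75 cutoff is an arbitrary illustrative choice, not a value from the original run:

```r
library(caret)

# cor_mat computed above: Spearman correlations among the 1000 predictor columns
high_cor <- findCorrelation(abs(cor_mat), cutoff = 0.75)  # predictor indices to drop
reduced_set <- training_set[, -high_cor]                  # response column is untouched
length(high_cor)                                          # how many predictors were flagged
```

findCorrelation() works on correlation magnitudes, so the abs() matters here given the strong negative correlations in the lower tail.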