AIOPKol2
lukag99, Jan 22nd, 2020
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
sns.set_style("whitegrid")
%matplotlib inline

#1
#1.1 Define an array of 1000 unordered elements drawn from a normal (Gaussian) distribution. (2.5p)
# randn() draws from the standard normal distribution; rand() would be uniform
X = np.random.randn(1000, 1)
#1.2 Using a for loop, display all elements of this array, printing one element per iteration. (2.5p)
for i in X:
    print(i)
#1.3 Plot this array with the hist() command. (2.5p)
plt.hist(X, 100)
plt.show()
#1.4 Estimate the mean of the array from the histogram. (2.5p)
srednja_vrednost = np.mean(X)
print(srednja_vrednost)
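As a quick hedged check (not part of the original paste; the fixed seed is my assumption): `np.random.randn` gives the standard normal samples task 1.1 asks for, whereas `np.random.rand` draws from a uniform distribution on [0, 1), whose mean would be near 0.5 and standard deviation near 0.29.

```python
import numpy as np

np.random.seed(0)  # fixed seed (an assumption) so the check is repeatable
X = np.random.randn(1000, 1)  # standard normal: sample mean ~ 0, sample std ~ 1

# For rand() the printed values would instead be roughly 0.5 and 0.29
print(round(float(X.mean()), 2), round(float(X.std()), 2))
```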

#2
#2.1 Load the iris dataset. Plot it using scatter() with the first and fourth attributes, so that the classes are marked with different colors (e.g. red, green, blue). (2.5p)
from sklearn.datasets import load_iris
iris = load_iris()
X = iris.data[:, :4]
y = iris.target
# The task asks for the first and fourth attributes: columns 0 and 3
plt.scatter(X[:, 0], X[:, 3], c=y, cmap=plt.cm.Set1, edgecolor='k')
plt.show()
#2.2 Split the dataset into training and test sets, using only the train_test_split command. (2.5p)
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
#2.3 Compute the accuracy of the KNN classifier on the test set. (2.5p)
from sklearn.neighbors import KNeighborsClassifier
classifier = KNeighborsClassifier(n_neighbors=5)
classifier.fit(X_train, y_train)
y_pred = classifier.predict(X_test)
from sklearn.metrics import classification_report, confusion_matrix, accuracy_score
print(confusion_matrix(y_test, y_pred))
print(classification_report(y_test, y_pred))
print('Accuracy:', accuracy_score(y_test, y_pred))
#2.4 The data in matrix X has four columns representing the corresponding flower measurements. Which 2 combinations of these parameters give, in your opinion, the highest classification accuracy? Comment and justify in detail. (Hint: e.g. columns 2 and 4, when used alone for classification, are better than other pairs such as columns 1 and 2, or 2 and 3, etc.) (2.5p)
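A hedged sketch for answering task 2.4 (not part of the original paste; the fixed `random_state` and k=5 are my assumptions, mirroring the KNN setup used above): score every pair of the four attributes separately and compare test accuracies.

```python
from itertools import combinations
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

iris = load_iris()
# fixed random_state (an assumption) so the pairwise comparison is repeatable
X_tr, X_te, y_tr, y_te = train_test_split(iris.data, iris.target,
                                          test_size=0.2, random_state=0)

scores = {}
for i, j in combinations(range(4), 2):  # all 6 pairs of the 4 attributes
    clf = KNeighborsClassifier(n_neighbors=5)
    clf.fit(X_tr[:, [i, j]], y_tr)
    scores[(i, j)] = clf.score(X_te[:, [i, j]], y_te)

# highest-accuracy pairs first
for pair, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(pair, round(acc, 3))
```

On iris, pairs involving petal length and petal width (columns 3 and 4, i.e. indices 2 and 3) typically score highest, which is the kind of justification the task asks for.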

#3
#3.1 Create a dataset of 1000 instances. The dataset should have 3 attributes with normal (Gaussian) distributions, where each attribute has a different variance and mean (of your choice). The dataset should have 2 classes, each with 500 instances. (2.5p)
# randn() gives Gaussian samples; scale sets the std, the offset sets the mean
x10 = np.random.randn(500, 1)
x11 = np.random.randn(500, 1) * 2 + 3
x12 = np.random.randn(500, 1) * 3 + 5
X1 = np.hstack([x10, x11, x12])
X1.shape

x20 = np.random.randn(500, 1)
x21 = np.random.randn(500, 1) * 0.8 + 3
x22 = np.random.randn(500, 1) * 1 + 5
X2 = np.hstack([x20, x21, x22])
X2.shape
#3.2 Plot this dataset using scatter(), so that the classes are marked with different colors (e.g. red, green, blue). (2.5p)
plt.scatter(X1[:, 0], X1[:, 1], c='g')
plt.scatter(X2[:, 0], X2[:, 1], c='r')
plt.show()
#3.3 Split the dataset into training and test sets, using only the cross_val_score command with CV=10. (2.5p)
# Stack the two classes into one matrix with labels 0 and 1, so that
# cross-validation runs on this dataset rather than the iris data from task 2
X3 = np.vstack([X1, X2])
y3 = np.hstack([np.zeros(500), np.ones(500)])
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
model = KNeighborsClassifier(n_neighbors=1)
print(cross_val_score(model, X3, y3, cv=10))
#3.4 Compute the accuracy of the LDA classifier on the test set. (2.5p)
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
# Split this dataset (not the iris split from task 2.2) for the LDA evaluation
X3_train, X3_test, y3_train, y3_test = train_test_split(X3, y3, test_size=0.2)
lda = LinearDiscriminantAnalysis()
lda.fit(X3_train, y3_train)
print('Accuracy of LDA classifier on training set: {:.2f}'
      .format(lda.score(X3_train, y3_train)))
print('Accuracy of LDA classifier on test set: {:.2f}'
      .format(lda.score(X3_test, y3_test)))
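A hedged, self-contained sketch of the same idea (not part of the original paste): `cross_val_score` can also summarize LDA accuracy directly as mean +/- std over the 10 folds. The stand-in dataset below is my assumption; unlike the one built above, the class-2 means are deliberately shifted so the two classes are separable by their means, which is what LDA relies on.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)  # fixed seed (an assumption) for repeatability
# Hypothetical stand-in for the two-class Gaussian dataset; class-2 means
# are shifted by 2 in every attribute (an assumption) so LDA can separate them
X1 = np.hstack([rng.normal(0, 1, (500, 1)),
                rng.normal(3, 2, (500, 1)),
                rng.normal(5, 3, (500, 1))])
X2 = np.hstack([rng.normal(2, 1, (500, 1)),
                rng.normal(5, 0.8, (500, 1)),
                rng.normal(7, 1, (500, 1))])
X = np.vstack([X1, X2])
y = np.hstack([np.zeros(500), np.ones(500)])

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=10)
print('10-fold CV accuracy: {:.2f} +/- {:.2f}'.format(scores.mean(), scores.std()))
```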