
# ML - GINI, Entropy and Error

Oct 5th, 2022 (edited)
```matlab
clear all
clc

% Data: sweep the class counts of a binary split of N samples,
% so p1 runs from 0 to 1.
N = 1000;
c1_list = 0 : N;
c2_list = N - c1_list;
LEN = length(c1_list);
GINI_list = zeros(1, LEN);
entropy_list = zeros(1, LEN);
error_list = zeros(1, LEN);
p1_list = zeros(1, LEN);

for i = 1 : LEN
    c1 = c1_list(i);
    c2 = c2_list(i);
    p1 = c1 / (c1 + c2);
    p2 = c2 / (c1 + c2);
    GINI_list(i) = GINI(p1, p2);
    entropy_list(i) = entropy_metric(p1, p2);
    error_list(i) = error_metric(p1, p2);
    p1_list(i) = p1;
end

% Plot all three metrics on one axis; a single "hold on" suffices.
figure();
plot(p1_list, GINI_list);
hold on
plot(p1_list, entropy_list);
plot(p1_list, error_list);
title("Comparison of metrics");
xlabel("p_1");
ylabel("Metrics' value");
legend("GINI", "Entropy", "Error");

% Auxiliary Functions - Metrics
function y = GINI(p1, p2)
    y = 1 - p1^2 - p2^2;
end

% Renamed from "entropy"/"error" to avoid shadowing MATLAB built-ins.
function y = entropy_metric(p1, p2)
    % Treat 0*log2(0) as 0 so the endpoints p1 = 0 and p1 = 1
    % evaluate to 0 instead of NaN.
    y = 0;
    if p1 > 0, y = y - p1*log2(p1); end
    if p2 > 0, y = y - p2*log2(p2); end
end

function y = error_metric(p1, p2)
    y = 1 - max(p1, p2);
end
```