
function [ labelsOut ] = kNN(X, k, Xt, Lt)
%KNN Implementation of the k-nearest-neighbours classifier
% Inputs:
%   X  - Features to be classified (one 2-D sample per column)
%   k  - Number of neighbours
%   Xt - Training features (one 2-D sample per column)
%   Lt - Correct label of each training sample, e.g. [1 2 ...]'
%
% Output:
%   labelsOut - Vector with the classified labels

labelsOut = zeros(size(X, 2), 1);

% Iterate through the samples that should be classified
for i = 1:size(X, 2)
    x = X(1, i);
    y = X(2, i);

    % One row per training sample: [distance to query point, label]
    Xt1 = zeros(size(Xt, 2), 2);

    % Measure the distance from the query point to every training sample
    % and save that distance together with the sample's label
    for j = 1:size(Xt, 2)
        xx = Xt(1, j);
        yy = Xt(2, j);
        Xt1(j, 1) = pdist([x, y; xx, yy], 'euclidean');
        Xt1(j, 2) = Lt(j);
    end

    % Sort the list by distance, ascending
    Xt1 = sortrows(Xt1, 1);

    % Fetch the labels of the k closest samples and take a majority vote
    % (mode generalises the original two-class 1.5-threshold test to any
    % number of numeric class labels)
    kv = Xt1(1:k, 2);
    labelsOut(i) = mode(kv);
end

end
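
A minimal usage sketch (the data below is illustrative, not part of the original paste; it assumes 2-D feature columns and numeric class labels):

Xt = [1.0 2.0 1.5 8.0 9.0 8.5;   % training features, one sample per column
      1.0 1.0 2.0 8.0 9.0 7.0];
Lt = [1 1 1 2 2 2]';             % class label of each training column
X  = [1.2 8.7;                   % two query points to classify
      1.1 8.2];
labels = kNN(X, 3, Xt, Lt)       % with k = 3 this should return [1; 2]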