function tree = DecisionTreeLearning(features, labels)
% Recursively builds a binary decision tree from the given examples.
% Relies on two helper functions: ChooseAttribute (picks the attribute and
% threshold to split on) and Split (partitions the examples on that test).

    % Base case: no examples left -> return an empty leaf
    if isempty(labels)
        tree.op = [];
        tree.kids = [];
        tree.class = [];
        tree.attribute = [];
        tree.threshold = [];
        return
    end

    % Base case: all labels are the same -> create a leaf node with that class.
    % (This also covers the single-example case, which the original loop-based
    % check mishandled by indexing labels(0,:).)
    if size(unique(labels, 'rows'), 1) == 1
        tree.op = [];
        tree.kids = [];
        tree.class = labels(1,:);
        tree.attribute = [];
        tree.threshold = [];
        return
    end

    % Choose the attribute and threshold to test
    [attribute, threshold] = ChooseAttribute(features, labels);
    % Split the data set according to the chosen attribute and threshold
    [lessThanFeatures, lessThanLabels, greaterThanFeatures, greaterThanLabels] = ...
        Split(features, labels, attribute, threshold);

    tree.op = num2str(attribute);
    tree.attribute = attribute;
    tree.threshold = threshold;
    tree.class = [];
    % Recurse on each half of the split to create the two subtrees
    tree.kids = {DecisionTreeLearning(lessThanFeatures, lessThanLabels), ...
                 DecisionTreeLearning(greaterThanFeatures, greaterThanLabels)};
end