Untitled
a guest · Dec 14th, 2017
ID3(Examples, Target_Attribute, Attributes)
    Create a root node, Root, for the tree.
    If all examples are positive, return the single-node tree Root with label = +.
    If all examples are negative, return the single-node tree Root with label = -.
    If the set of predicting attributes is empty, return the single-node tree Root
    with label = the most common value of the target attribute in the examples.
    Otherwise begin
        A ← the attribute that best classifies the examples (highest information gain).
        Decision tree attribute for Root = A.
        For each possible value, vi, of A:
            Add a new tree branch below Root, corresponding to the test A = vi.
            Let Examples(vi) be the subset of examples that have the value vi for A.
            If Examples(vi) is empty,
                then below this new branch add a leaf node with label = the most common target value in the examples,
            else below this new branch add the subtree ID3(Examples(vi), Target_Attribute, Attributes – {A}).
    End
    Return Root