import math

def entropy(*values):
    """Shannon entropy (base 2) of the class counts given in `values`."""
    total = float(sum(values))
    result = 0.0
    for value in values:
        if value == 0:
            continue  # treat 0 * log(0) as 0
        p = value / total
        result -= p * math.log(p, 2)
    print("Entropy[%s] = %f" % (values, result))
    return result
def IG(*values):
    """Information gain of a split; `values` are the object counts per child node.
    Prompts for the number of class-Yes objects in each child."""
    parent_entropy = entropy(*values)
    total = float(sum(values))
    children_entropy = 0.0
    for counter, value in enumerate(values, start=1):
        suffix = {1: "st", 2: "nd", 3: "rd"}.get(counter, "th")
        message = "How many objects from the [Class Yes] for the %d%s child: " % (counter, suffix)
        class_one = int(input(message))
        class_two = value - class_one
        children_entropy += (value / total) * entropy(class_one / float(value),
                                                      class_two / float(value))
    gain = parent_entropy - children_entropy
    print("IG[%s] = %f" % (values, gain))
    return gain
-------------------
[~]Author : Hamoud AL-Qusair
[~]Requirement : a Python interpreter
[~]Usage :
1) To calculate entropy, call `entropy(P, N)`, where P and N are the numbers of objects from classes P and N.
   |-> 1.1) It accepts any number of arguments, not just two, so it also works for more than two classes.
2) To calculate information gain, call `IG(P, N)`, where P and N are the numbers of objects from classes P and N.
   |-> 2.1) It likewise accepts any number of arguments, one count per child node of the split.
   |-> 2.2) It will prompt you for how many objects from the Yes class fall into each child.
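As a quick sanity check of the usage above, the sketch below works the classic play-tennis example (9 Yes / 5 No examples, split on Outlook into sunny, overcast, and rain children). Because `IG` in the paste reads the per-child Yes counts interactively, this sketch computes the weighted child entropy directly from hard-coded counts instead; the local `entropy` helper mirrors the paste's function without the print. The dataset counts are the standard textbook example, assumed here for illustration.

```python
import math

def entropy(*counts):
    # Base-2 entropy of class counts; zero counts contribute nothing.
    total = float(sum(counts))
    return -sum((c / total) * math.log(c / total, 2) for c in counts if c)

# Parent node: 9 "yes" and 5 "no" examples.
parent = entropy(9, 5)  # about 0.9403 bits

# Children of the Outlook split as (yes, no) counts:
# sunny (2, 3), overcast (4, 0), rain (3, 2).
children = [(2, 3), (4, 0), (3, 2)]
total = sum(sum(child) for child in children)
weighted = sum((sum(child) / float(total)) * entropy(*child)
               for child in children)

gain = parent - weighted  # about 0.2467 bits
print("IG =", round(gain, 4))
```

With the interactive `IG` function, the same result is obtained by calling `IG(5, 4, 5)` and answering 2, 4, and 3 at the prompts.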