%XOR using backpropagation (no bias version)
% training set (one input pattern per row)
ts=[0 0;
    1 1;
    1 0;
    0 1];
% desired XOR outputs for the four patterns
d=[0 0 1 1];
% weights for the hidden layer (each row holds the weights of the corresponding neuron)
wh=[rand() rand();
    rand() rand()];
% weights from the hidden layer to the output layer
wo=[rand() rand()];
% outputs of hidden neurons 1 and 2
a=[1 1];
% output of neuron 3 (the output neuron)
a3=0;
% learning rate
n=0.1;
iteration=10;
i=0;
e=1; % initialise the error so the while condition is defined on the first pass
while (i<iteration || e~=0)
    for tindex=1:4
        % forward pass: hidden activations for the current training pattern
        for neuron=1:2
            a(neuron)=logsig(wh(neuron,:)*ts(tindex,:)'); % index by tindex, not neuron, to pick the pattern
        end
        a3=hardlim(wo(1)*a(1)+wo(2)*a(2));
        e=d(tindex)-a3;
        % backward pass: error signal at the output neuron
        grad_out=a3*e;
        deltaW=n*grad_out*a; % output-weight update uses the hidden activations
        wo_new=wo+deltaW;
        % hidden-layer gradients are computed with the old output weights
        grad_h=a(1)*(1-a(1))*grad_out*wo(1);
        wh(1,:)=wh(1,:)+n*grad_h*ts(tindex,:); % hidden update uses the input pattern, not wh
        grad_h=a(2)*(1-a(2))*grad_out*wo(2);
        wh(2,:)=wh(2,:)+n*grad_h*ts(tindex,:);
        wo=wo_new;
    end
    i=i+1;
end
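For comparison, here is a minimal NumPy sketch of the same two-layer backpropagation loop. It is an assumption-laden rewrite, not the original method: it replaces hardlim at the output with logsig (hardlim has zero derivative almost everywhere, so grad_out = a3*e is not a true gradient), and, like the paste, it omits bias terms, which means the all-zero input pattern can never be mapped away from the fixed hidden activations of 0.5, so full convergence on XOR is not expected. The point of the sketch is the gradient computation, not the final accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

def logsig(x):
    """Sigmoid, matching MATLAB's logsig."""
    return 1.0 / (1.0 + np.exp(-x))

# training set and XOR targets, as in the paste
ts = np.array([[0., 0.], [1., 1.], [1., 0.], [0., 1.]])
d  = np.array([0., 0., 1., 1.])

wh = rng.random((2, 2))  # hidden weights, one row per hidden neuron
wo = rng.random(2)       # output weights
n  = 0.1                 # learning rate

for epoch in range(1000):
    for x, t in zip(ts, d):
        a  = logsig(wh @ x)                       # hidden activations
        a3 = logsig(wo @ a)                       # sigmoid output (assumed) so the gradient exists
        e  = t - a3
        grad_out = e * a3 * (1.0 - a3)            # delta at the output neuron
        grad_h   = a * (1.0 - a) * grad_out * wo  # deltas at the hidden neurons
        wo = wo + n * grad_out * a                # output update uses the hidden activations
        wh = wh + n * np.outer(grad_h, x)         # hidden update uses the input pattern
```

Using logsig throughout makes every weight update a genuine gradient-descent step on the squared error 0.5*(t - a3)^2, which is why the hidden deltas carry the extra a*(1-a) factor.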