Coderx7

XOR using backpropagation in matlab (III)

Sep 7th, 2015
%XOR calculator ( biases as weights )
% training set
% (each row: input1, input2, bias input, which is always 1)
ts=[0 0 1;1 1 1;
    1 0 1;0 1 1];
% targets (desired outputs)
d=[0 0 1 1];
% weights for the hidden layer, one row per hidden neuron
% weight1, weight2, bias weight
wh=[rand() rand() rand() ;
    rand() rand() rand()];
% weights from hidden layer to output layer
% weight1, weight2, bias weight
wo=[rand() rand() rand()];
% outputs of hidden neurons 1 and 2
a=[1 1];
% output of neuron 3 (the output neuron)
a3=0;
% learning rate
n=0.1;
iteration=100;
i=0;
e=1; % error of the last sample; nonzero so the loop condition is defined
while (i<iteration || e~=0)
    for tindex=1:4
        % forward pass: index the training set with tindex, not neuron
        for neuron=1:2
            a(neuron)= logsig(wh(neuron,:)*ts(tindex,:)');
        end
        output_neuronsInput = wo(1)*a(1)+wo(2)*a(2)+wo(3)*1;
        a3 = hardlim(output_neuronsInput);
        e = d(tindex) - a3;
        % neuron 3 (output): weight change is
        % learning rate * local gradient * inputs to the neuron
        localgrad_out = a3*e;
        deltaW = n * localgrad_out * [a(1) a(2) 1];
        wo_new = wo + deltaW;
        % neuron 1 (hidden updates use the old wo, hence wo_new above)
        grad_h = a(1)*(1-a(1)) * localgrad_out * wo(1);
        deltaW = n * grad_h * ts(tindex,:);
        wh(1,:) = wh(1,:) + deltaW;
        % neuron 2
        grad_h = a(2)*(1-a(2)) * localgrad_out * wo(2);
        deltaW = n * grad_h * ts(tindex,:);
        wh(2,:) = wh(2,:) + deltaW;
        wo = wo_new;
    end
    i=i+1;
end
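For readers without MATLAB, the same training loop can be sketched in plain Python. This is a hypothetical port, not the author's code: the function name `train_xor`, the weight ranges, and the hyper-parameters are my own choices, and since `hardlim` is not differentiable the output unit here uses a logistic sigmoid with the standard delta rule instead.

```python
import math
import random

def train_xor(epochs=5000, lr=0.5, seed=0):
    """Train a 2-2-1 sigmoid network on XOR with plain online backprop."""
    rng = random.Random(seed)
    sig = lambda x: 1.0 / (1.0 + math.exp(-x))
    # training set: (input1, input2, bias input fixed at 1), as in the MATLAB code
    ts = [(0, 0, 1), (1, 1, 1), (1, 0, 1), (0, 1, 1)]
    d = [0, 0, 1, 1]
    # hidden weights: one row per hidden neuron (weight1, weight2, bias weight)
    wh = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    # output weights: (weight for hidden1, weight for hidden2, bias weight)
    wo = [rng.uniform(-1, 1) for _ in range(3)]

    def forward(x):
        a = [sig(sum(w * xi for w, xi in zip(wh[n], x))) for n in range(2)]
        y = sig(wo[0] * a[0] + wo[1] * a[1] + wo[2] * 1)
        return a, y

    for _ in range(epochs):
        for x, t in zip(ts, d):
            a, y = forward(x)
            delta_o = (t - y) * y * (1 - y)  # output-layer local gradient
            # hidden updates use the pre-update output weights
            # (same role as wo_new in the MATLAB code)
            for n in range(2):
                delta_h = a[n] * (1 - a[n]) * delta_o * wo[n]
                for j in range(3):
                    wh[n][j] += lr * delta_h * x[j]
            for j, inp in enumerate((a[0], a[1], 1)):
                wo[j] += lr * delta_o * inp

    # threshold the trained network's outputs to 0/1
    return [1 if forward(x)[1] >= 0.5 else 0 for x in ts]
```

With enough epochs and a favorable initialization the returned list approaches the XOR targets `[0, 0, 1, 1]`; like any small sigmoid network, it can occasionally settle in a poor local minimum, so results vary with the seed.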