function [Wout, Bout] = lRule(D, P)
% LRULE - Perceptron learning rule.
% [Wout, Bout] = lRule(D, P) returns, if the algorithm converges, the
% weight vector Wout and bias Bout that let the perceptron classify
% every pattern in P according to the desired classification D.
%
% Inputs:
%   D - Desired classifications; the ith entry is either -1 or 1.
%   P - Patterns to classify, one per column.
%
% Outputs:
%   Wout - The learned weight vector.
%   Bout - The learned bias.
narginchk(2, 2);  % nargchk is deprecated; narginchk replaces it.
patnum = size(P, 2);
inputs = size(P, 1);
% Initialize classification vector, weight and bias.
C = NaN(1, patnum);
Wout = zeros(1, inputs);
Bout = 0;
disp('Our desired classification:')
disp(D)
k = 0;
while ~isequal(C, D)
    r = randi(patnum, 1);
    pat = P(:, r);                   % Randomly select a pattern.
    a = sign(Wout * pat - Bout);     % Classify it.
    if a == 0                        % Count a zero activation as +1.
        a = 1;
    end
    if a ~= D(r)                     % If misclassified:
        Wout = Wout + D(r) * pat';   % Move the weight toward the pattern.
        Bout = Bout - D(r);          % Shift the bias (a negative threshold).
    end
    % Re-classify every pattern with the updated weights; updating only
    % C(r) would leave stale entries and could end the loop too early.
    C = sign(Wout * P - Bout);
    C(C == 0) = 1;
    k = k + 1;
end
disp('Final classification:')
disp(C)
fprintf('Converged in %d iterations\n', k);
end
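For readers without MATLAB, the same rule can be sketched in Python with NumPy. The function name `l_rule`, the `rng` seed argument, and the `max_iter` cap are assumptions introduced here for illustration; the update rule and the `sign(w @ pattern - b)` decision mirror the MATLAB code above.

```python
import numpy as np

def l_rule(d, p, rng=None, max_iter=10_000):
    """Perceptron learning rule: find a weight vector w and bias b such
    that sign(w @ pattern - b) matches d for every column pattern of p.

    d : (n_patterns,) array of desired classes, each -1 or +1.
    p : (n_inputs, n_patterns) array, one pattern per column.
    """
    rng = np.random.default_rng(rng)
    n_inputs, n_patterns = p.shape
    w = np.zeros(n_inputs)
    b = 0.0
    for k in range(max_iter):
        c = np.sign(w @ p - b)
        c[c == 0] = 1                    # count a zero activation as +1
        if np.array_equal(c, d):
            return w, b, k               # every pattern classified correctly
        r = rng.integers(n_patterns)     # pick a random pattern
        a = np.sign(w @ p[:, r] - b)
        if a == 0:
            a = 1
        if a != d[r]:                    # misclassified: update w and b
            w = w + d[r] * p[:, r]
            b = b - d[r]
    raise RuntimeError("did not converge; data may not be linearly separable")
```

By the perceptron convergence theorem this terminates in a finite number of updates whenever the two classes are linearly separable; the `max_iter` cap only guards against non-separable input.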