function [hidlw, outlw, terr] = backprop(tset, tslb, inihidlw, inioutlw, lr)
% One epoch of backpropagation training for a two-layer feed-forward ANN
% tset     - training set (every row represents a sample)
% tslb     - column vector of labels
% inihidlw - initial hidden layer weight matrix
% inioutlw - initial output layer weight matrix
% lr       - learning rate
% hidlw    - updated hidden layer weight matrix
% outlw    - updated output layer weight matrix
% terr     - average squared error of the ANN over the training set
% 1. Set output matrices to initial values
hidlw = inihidlw;
outlw = inioutlw;
% 2. Set total error to 0
terr = 0;
% for each sample in the training set
for i = 1:rows(tset)
  % 3. Set desired output of the ANN (one-hot encoding of the label)
  desiredOut = zeros(1, columns(outlw));
  desiredOut(tslb(i)) = 1;
  % 4. Propagate input forward through the ANN
  % (inputs are extended with a constant 1 for the bias weight)
  hlact = [tset(i, :) 1] * hidlw;
  hlout = actf(hlact);
  olact = [hlout 1] * outlw;
  olout = actf(olact);
  % 5. Adjust total error (just to know this value)
  terr += 0.5 * sum((desiredOut - olout).^2);
  % 6. Compute delta error of the output layer
  % (one delta per output neuron; actdf is the activation derivative,
  % evaluated here on the neuron outputs)
  deltaOl = (desiredOut - olout) .* actdf(olout);
  % 7. Compute delta error of the hidden layer
  % (one delta per hidden neuron; the bias row of outlw is skipped,
  % since the bias input has no incoming weights to update)
  deltaHl = (deltaOl * outlw(1:end-1, :)') .* actdf(hlout);
  % 8. Update output layer weights
  outlw = outlw + lr * [hlout 1]' * deltaOl;
  % 9. Update hidden layer weights
  hidlw = hidlw + lr * [tset(i, :) 1]' * deltaHl;
end
terr /= rows(tset);
endfunction
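
% -------------------------------------------------------------------
% Usage sketch (not part of the original paste): backprop assumes two
% helpers, actf (the activation function) and actdf (its derivative,
% evaluated on the activation's output), which the paste does not
% define. The definitions and the toy data below are hypothetical,
% assuming a logistic sigmoid activation.

% Toy training run: 4 samples, 2 inputs, 3 hidden neurons, 2 classes
tset = [0 0; 0 1; 1 0; 1 1];
tslb = [1; 2; 2; 1];                        % labels index the output neurons
hidlw = 0.1 * randn(columns(tset) + 1, 3);  % +1 row for the bias weight
outlw = 0.1 * randn(3 + 1, 2);
for epoch = 1:1000
  [hidlw, outlw, terr] = backprop(tset, tslb, hidlw, outlw, 0.5);
end
printf("average error after training: %f\n", terr);

function y = actf(x)
  y = 1 ./ (1 + exp(-x));          % logistic sigmoid
endfunction

function d = actdf(y)
  d = y .* (1 - y);                % sigmoid derivative from its output y
endfunction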