- Generally, a single simple perceptron works like this:
-   i1 --w1--\
-             θ --> output
-   i2 --w2--/
- Where:
- i1 and i2 are inputs
- w1 and w2 are weights
- θ is a threshold or activation function
- A formula to sum this up would be:
- output = θ(Σ(i*w))
- Ergo:
- The output is the result of the activation function applied to the sum, over all inputs, of each input multiplied by its corresponding weight.
- The activation function is a nonlinear function that outputs a different value depending on which range its input falls into. For example:
- If the value of Σ(i*w) < 1 then θ(Σ(i*w)) = 0
- Else, if the value of Σ(i*w) >= 1 then θ(Σ(i*w)) = 1
- Note that this is the usual activation function - referred to as a "step function" - but if you'd rather use a different function, make sure to modify your weights accordingly.
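The summation and step activation described above can be sketched in a few lines of Python; the weights and threshold here are illustrative values, not taken from any particular problem:

```python
def step(x, threshold=1.0):
    """Step activation: 1 if the weighted sum reaches the threshold, else 0."""
    return 1 if x >= threshold else 0

def perceptron(inputs, weights, threshold=1.0):
    """Compute theta(sum(i * w)) for paired inputs and weights."""
    weighted_sum = sum(i * w for i, w in zip(inputs, weights))
    return step(weighted_sum, threshold)

# Example: two binary inputs, each with an assumed weight of 0.6.
print(perceptron([1, 1], [0.6, 0.6]))  # 0.6 + 0.6 = 1.2 >= 1 -> 1
print(perceptron([1, 0], [0.6, 0.6]))  # 0.6 < 1 -> 0
```

Swapping `step` for any other activation function only requires changing that one call, which is why the weights and threshold have to be chosen with the activation in mind.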
- A typical perceptron of this kind only has binary inputs and outputs - that is, values of 0 or 1 only.
- Weights typically range from 0 to 1.
- HOWEVER.
- Your problem seems to use non-binary input values (i1 = proportion of exercises solved; i2 = proportion of lectures visited).
- The partially-shaded square figure is essentially a graphic representation of the perceptron; the shaded area represents an output of "1" (test passed) while the non-shaded area is an output of "0" (failed).
- Not sure what the problem is asking for (inputs? weights? etc.) but generally you need to give the range of values for the inputs, weights, and the theta (threshold/activation function) to mathematically define a perceptron.
- That's basically how it goes; I tried to keep the actual answer as un-obvious as possible.
- Feel free to ask if you need some more help.