/* Author: Brian Hill
 * Date:
 * Module: relu_backprop
 * Desc: The backpropagation calculation for the ReLU layer
 *       using single-precision floating-point values.
 */
module relu_backprop #(
    parameter int          WIDTH          = 256,
    // Bit pattern emitted for non-positive inputs (32'h0 encodes 0.0f).
    parameter logic [31:0] NEGATIVE_SLOPE = 32'h0000_0000
)(
    input  logic        clk,
    input  logic [31:0] in_data  [WIDTH-1:0],
    output logic [31:0] out_data [WIDTH-1:0]
);
    // For each value 'x' in the input vector, pass 'x' through when it is
    // positive; otherwise emit the NEGATIVE_SLOPE pattern (usually 0).
    always_ff @(posedge clk) begin
        for (int i = 0; i < WIDTH; i++) begin
            // An IEEE-754 single is > 0 when its sign bit is clear and the
            // remaining bits are non-zero; a plain '>' compare on the raw
            // 32-bit word would misread the float encoding.
            if (!in_data[i][31] && in_data[i][30:0] != '0)
                out_data[i] <= in_data[i];
            else
                out_data[i] <= NEGATIVE_SLOPE;
        end
    end
endmodule
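As a software cross-check, here is a minimal reference model of the same element-wise operation, assuming the module operates on raw 32-bit IEEE-754 words as above; the function name and default argument are illustrative, not part of the original design:

```python
def relu_backprop_ref(in_data, negative_slope_bits=0x00000000):
    """Reference model: pass through 32-bit words whose float value is
    positive, otherwise emit the NEGATIVE_SLOPE bit pattern."""
    out = []
    for word in in_data:
        # Positive IEEE-754 single: sign bit clear and not +0.0
        if (word >> 31) == 0 and (word & 0x7FFF_FFFF) != 0:
            out.append(word)
        else:
            out.append(negative_slope_bits)
    return out

# 0x3F800000 encodes 1.0f, 0xBF800000 encodes -1.0f
print([hex(w) for w in relu_backprop_ref([0x3F800000, 0xBF800000, 0x00000000])])
# → ['0x3f800000', '0x0', '0x0']
```

Comparing such a model's output word-for-word against the RTL in simulation is a common way to validate per-element datapath blocks like this one.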