Untitled

a guest
Nov 29th, 2015
/* Author: Brian Hill
 * Date:
 * Module: relu_backprop
 * Desc: The backpropagation calculation for the ReLU layer
 *       using single precision floating point values
 */

module relu_backprop #(
    // Number of elements in the input and output vectors
    parameter int WIDTH = 256,
    // IEEE 754 bit pattern driven out for negative inputs (0.0 by default)
    parameter logic [31:0] NEGATIVE_SLOPE = 32'h0000_0000
) (
    input  logic        clk,
    input  logic [31:0] in_data  [WIDTH-1:0],
    output logic [31:0] out_data [WIDTH-1:0]
);

    // For each value 'x' in the input vector, calculate max(0, 'x').
    // The inputs are IEEE 754 single precision bit patterns, so the test
    // 'x > 0' reduces to checking the sign bit (bit 31); an unsigned
    // compare of the raw bits would misclassify every negative float.
    // The result is registered on clk.
    always_ff @(posedge clk) begin
        for (int i = 0; i < WIDTH; i++) begin
            // If the input is non-negative (sign bit clear), pass it to the
            // output; otherwise pass the negative slope value (usually 0).
            if (!in_data[i][31])
                out_data[i] <= in_data[i];
            else
                out_data[i] <= NEGATIVE_SLOPE;
        end
    end

endmodule
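
A minimal simulation sketch, not part of the original paste, showing one way the module above could be exercised; WIDTH is reduced to 4 for readability and the testbench name is hypothetical.

// Hypothetical testbench sketch: drives a few IEEE 754 single precision
// values through relu_backprop and prints the registered outputs.
module relu_backprop_tb;
    logic        clk = 0;
    logic [31:0] in_data  [3:0];
    logic [31:0] out_data [3:0];

    relu_backprop #(.WIDTH(4)) dut (
        .clk     (clk),
        .in_data (in_data),
        .out_data(out_data)
    );

    always #5 clk = ~clk;

    initial begin
        in_data[0] = $shortrealtobits(1.5);   // positive -> passed through
        in_data[1] = $shortrealtobits(-2.0);  // negative -> NEGATIVE_SLOPE (0.0)
        in_data[2] = $shortrealtobits(0.0);   // zero     -> passed through
        in_data[3] = $shortrealtobits(-0.25); // negative -> NEGATIVE_SLOPE (0.0)
        @(posedge clk); #1; // wait for the registered outputs to update
        for (int i = 0; i < 4; i++)
            $display("out_data[%0d] = %f", i, $bitstoshortreal(out_data[i]));
        $finish;
    end
endmodule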