Untitled (pasted by a guest, Jul 17th, 2017)

FANN_FLO_2.1
num_layers=2
learning_rate=0.700000
connection_rate=1.000000
network_type=0
learning_momentum=0.000000
training_algorithm=2
train_error_function=1
train_stop_function=0
cascade_output_change_fraction=0.010000
quickprop_decay=-0.000100
quickprop_mu=1.750000
rprop_increase_factor=1.200000
rprop_decrease_factor=0.500000
rprop_delta_min=0.000000
rprop_delta_max=50.000000
rprop_delta_zero=0.100000
cascade_output_stagnation_epochs=12
cascade_candidate_change_fraction=0.010000
cascade_candidate_stagnation_epochs=12
cascade_max_out_epochs=150
cascade_max_cand_epochs=150
cascade_num_candidate_groups=2
bit_fail_limit=3.49999994039535520000e-001
cascade_candidate_limit=1.00000000000000000000e+003
cascade_weight_multiplier=4.00000005960464480000e-001
cascade_activation_functions_count=10
cascade_activation_functions=3 5 7 8 10 11 14 15 16 17
cascade_activation_steepnesses_count=4
cascade_activation_steepnesses=2.50000000000000000000e-001 5.00000000000000000000e-001 7.50000000000000000000e-001 1.00000000000000000000e+000
layer_sizes=2 2
scale_included=0
neurons (num_inputs, activation_function, activation_steepness)=(0, 0, 0.00000000000000000000e+000) (0, 0, 0.00000000000000000000e+000) (2, 5, 5.00000000000000000000e-001) (0, 5, 0.00000000000000000000e+000)
connections (connected_to_neuron, weight)=(0, 1.12651491165161130000e+000) (1, 1.15255856513977050000e+000)
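The dump above is a saved network in the FANN (Fast Artificial Neural Network) floating-point format, identified by the `FANN_FLO_2.1` version line: mostly `key=value` fields describing training parameters, followed by the neuron and connection data. With a real FANN installation you would load it via `fann_create_from_file()`; as an illustration only, the sketch below parses the `key=value` fields of such a file with plain Python. The `parse_fann_save` helper and the truncated `sample` text are assumptions for this sketch, not part of FANN itself.

```python
# Illustrative sketch: read the key=value fields of a FANN "FANN_FLO_2.1"
# save file like the one above. This is NOT the official loader; FANN's own
# fann_create_from_file() is the supported way to restore a network.

def parse_fann_save(text):
    """Return a dict mapping each key (left of the first '=') to its raw value string."""
    fields = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue  # skip the version line ("FANN_FLO_2.1") and blanks
        key, _, value = line.partition("=")
        fields[key] = value
    return fields

# Abbreviated excerpt of the save file shown above.
sample = """FANN_FLO_2.1
num_layers=2
learning_rate=0.700000
layer_sizes=2 2
connections (connected_to_neuron, weight)=(0, 1.12651491165161130000e+000) (1, 1.15255856513977050000e+000)
"""

fields = parse_fann_save(sample)
print(fields["num_layers"])           # -> 2
print(fields["layer_sizes"].split())  # -> ['2', '2']
```

Values are kept as strings because FANN mixes integers, floats in `%e` notation, and space-separated lists; the caller decides how to interpret each field (e.g. `layer_sizes=2 2` means a two-layer network with two neurons per layer, including bias neurons).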