In the code at the bottom, when lstmLayerSize is 1, I get the following error:

c must be f order, offset 0 and have length == c.data.length

However, no error occurs when lstmLayerSize > 2. The reason for this is not apparent, and the error message does not make sense.

To reproduce the error, create an array `input` of shape 1x1x1 (with, say, value 1.0) and run the following command:

List<INDArray> result = this.net.feedForward(input);
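For reference, a minimal sketch of building that 1x1x1 input with ND4J's Nd4j.ones factory (the shape follows DL4J's RNN input convention of [miniBatchSize, nIn, timeSeriesLength]; the variable name input is just for illustration):

import java.util.List;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

// 1x1x1 input: one example, one input feature, one time step, all values 1.0
INDArray input = Nd4j.ones(1, 1, 1);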
// Imports this snippet needs (at the top of the enclosing class file; DL4J 0.4.x era):
import org.deeplearning4j.nn.api.OptimizationAlgorithm;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.Updater;
import org.deeplearning4j.nn.conf.distribution.UniformDistribution;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.weights.WeightInit;
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public RecurrentNeuralNetwork(int nIn, int nOut) {

    int lstmLayerSize = 5;  // the error occurs when this is 1; no error when > 2

    // Two GravesLSTM layers followed by an RnnOutputLayer, trained with SGD + RMSProp
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
        .iterations(1)
        .learningRate(0.2)
        .momentum(0.5)
        .rmsDecay(0.95)
        .seed(12345)
        .regularization(true)
        .l2(0.001)
        .list(3)
        .layer(0, new GravesLSTM.Builder().nIn(nIn).nOut(lstmLayerSize)
            .updater(Updater.RMSPROP)
            .activation("hardtanh").weightInit(WeightInit.DISTRIBUTION)
            .dist(new UniformDistribution(-0.08, 0.08)).build())
        .layer(1, new GravesLSTM.Builder().nIn(lstmLayerSize).nOut(lstmLayerSize)
            .updater(Updater.RMSPROP)
            .activation("hardtanh").weightInit(WeightInit.DISTRIBUTION)
            .dist(new UniformDistribution(-0.08, 0.08)).build())
        .layer(2, new RnnOutputLayer.Builder(LossFunction.MCXENT).activation("softmax")
            .updater(Updater.RMSPROP)
            .nIn(lstmLayerSize).nOut(nOut).weightInit(WeightInit.DISTRIBUTION)
            .dist(new UniformDistribution(-0.08, 0.08)).build())
        .pretrain(false).backprop(true)
        .build();

    System.out.println("Configuration: " + conf.toString());
    this.net = new MultiLayerNetwork(conf);

    this.net.init();
    this.net.setListeners(new ScoreIterationListener(1));
}
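Putting it together, a minimal sketch of a driver that triggers the failure (it assumes the net field is accessible from the caller, and that lstmLayerSize has first been changed to 1 inside the constructor):

public static void main(String[] args) {
    RecurrentNeuralNetwork rnn = new RecurrentNeuralNetwork(1, 1);  // nIn = 1, nOut = 1
    INDArray input = Nd4j.ones(1, 1, 1);
    // With lstmLayerSize == 1 this throws:
    //   c must be f order, offset 0 and have length == c.data.length
    List<INDArray> result = rnn.net.feedForward(input);
    System.out.println("Activations per layer: " + result.size());
}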