In the code at the bottom, when lstmLayerSize is 1 I get the following error:

    c must be f order, offset 0 and have length == c.data.length

However, no error occurs when lstmLayerSize > 2. The reason for this is not apparent, and the error does not make sense.
To reproduce the error, create an array `input` of shape 1x1x1 (holding, say, the value 1.0) and run the following command:

    List<INDArray> result = this.net.feedForward(input);
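
For concreteness, the input can be built with Nd4j; the exact factory call below is my choice, and any 1x1x1 array holding 1.0 should behave the same:

    INDArray input = Nd4j.ones(1, 1, 1);  // shape [miniBatchSize, nIn, timeSeriesLength]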
// Imports for the snippet below (old GravesLSTM/iterations-style DL4J API):
import java.util.List;

import org.deeplearning4j.nn.api.OptimizationAlgorithm;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.Updater;
import org.deeplearning4j.nn.conf.distribution.UniformDistribution;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.weights.WeightInit;
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public RecurrentNeuralNetwork(int nIn, int nOut) {
    int lstmLayerSize = 5;  // setting this to 1 triggers the error described above

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .iterations(1)
            .learningRate(0.2)
            .momentum(0.5)
            .rmsDecay(0.95)
            .seed(12345)
            .regularization(true)
            .l2(0.001)
            .list(3)
            // Two stacked GravesLSTM layers, then a softmax RNN output layer
            .layer(0, new GravesLSTM.Builder().nIn(nIn).nOut(lstmLayerSize)
                    .updater(Updater.RMSPROP)
                    .activation("hardtanh").weightInit(WeightInit.DISTRIBUTION)
                    .dist(new UniformDistribution(-0.08, 0.08)).build())
            .layer(1, new GravesLSTM.Builder().nIn(lstmLayerSize).nOut(lstmLayerSize)
                    .updater(Updater.RMSPROP)
                    .activation("hardtanh").weightInit(WeightInit.DISTRIBUTION)
                    .dist(new UniformDistribution(-0.08, 0.08)).build())
            .layer(2, new RnnOutputLayer.Builder(LossFunction.MCXENT).activation("softmax")
                    .updater(Updater.RMSPROP)
                    .nIn(lstmLayerSize).nOut(nOut).weightInit(WeightInit.DISTRIBUTION)
                    .dist(new UniformDistribution(-0.08, 0.08)).build())
            .pretrain(false).backprop(true)
            .build();

    System.out.println("Configuration: " + conf.toString());

    this.net = new MultiLayerNetwork(conf);
    this.net.init();
    this.net.setListeners(new ScoreIterationListener(1));
}
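
For completeness, a minimal driver that should trigger the error once lstmLayerSize is set to 1; the main method here is mine, and I assume it sits inside the RecurrentNeuralNetwork class (so the net field is in scope) with nIn = nOut = 1:

    // Hypothetical repro driver; assumed to live in RecurrentNeuralNetwork itself.
    public static void main(String[] args) {
        RecurrentNeuralNetwork rnn = new RecurrentNeuralNetwork(1, 1);
        INDArray input = Nd4j.ones(1, 1, 1);                 // 1x1x1 input holding 1.0
        List<INDArray> result = rnn.net.feedForward(input);  // throws when lstmLayerSize == 1
        System.out.println("Got " + result.size() + " activation arrays");
    }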