{
  accUpdate : false
  batchSize : 64
  bidirectional : false
  cuda : true
  cutoffNorm : -1
  dataPath : "/home/nilesh/data"
  dataset : "PennTreeBank"
  decayFactor : 0.001
  dropout : false
  dropoutProb : 0.5
  evalSize : 100
  forestGaterSize : "{}"
  hiddenSize : {200}
  learningRate : 0.1
  lrDecay : "linear"
  lstm : false
  maxEpoch : 400
  maxOutNorm : 2
  maxTries : 30
  maxWait : 4
  minLR : 1e-05
  momentum : 0
  progress : true
  rho : 5
  saturateEpoch : 300
  schedule : {}
  silent : false
  small : false
  softmaxforest : false
  softmaxtree : false
  testFile : "test.txt"
  tiny : false
  trainEpochSize : 400000
  trainFile : "train.txt"
  trainOnly : false
  uniform : -1
  useDevice : 1
  validEpochSize : 24000
  validFile : "valid.txt"
  xpPath : ""
  zeroFirst : false
}
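The option table printed above is typical of a Torch experiment script that parses command-line flags with `torch.CmdLine`. A minimal sketch of how such a table is usually built (option names and defaults are copied from the dump; the script structure itself is an assumption, not the actual training script):

```lua
-- Hedged sketch: building an opt table like the one printed above.
-- Option names/defaults come from the dump; everything else is assumed.
require 'torch'

local cmd = torch.CmdLine()
cmd:option('--learningRate', 0.1, 'initial learning rate')
cmd:option('--batchSize', 64, 'number of sequences per batch')
cmd:option('--hiddenSize', '{200}', 'sizes of recurrent hidden layers (Lua table as string)')
cmd:option('--rho', 5, 'number of time-steps to back-propagate through')
cmd:option('--maxEpoch', 400, 'stop training after this many epochs')
cmd:option('--cuda', false, 'run on GPU via cutorch/cunn')
cmd:option('--useDevice', 1, 'GPU device id when --cuda is set')
local opt = cmd:parse(arg or {})

-- hiddenSize arrives as the string "{200}"; evaluate it into a real table
opt.hiddenSize = loadstring('return '..opt.hiddenSize)()
```

`cmd:parse` returns a plain Lua table, which is what gets pretty-printed at the top of the log.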
Language Model :
nn.Sequential {
  [input -> (1) -> (2) -> (3) -> (4) -> output]
  (1): nn.LookupTable
  (2): nn.SplitTable
  (3): nn.Sequencer @ nn.Recurrent {
    [{input(t), output(t-1)} -> (1) -> (2) -> (3) -> output(t)]
    (1): {
      input(t)
      |`-> (t==0): nn.Add
      |`-> (t~=0): nn.Identity
      output(t-1)
      |`-> nn.Linear(200 -> 200)
    }
    (2): nn.CAddTable
    (3): nn.Sigmoid
  }
  (4): nn.Sequencer @ nn.Recursor @ nn.Sequential {
    [input -> (1) -> (2) -> output]
    (1): nn.Linear(200 -> 10000)
    (2): nn.LogSoftMax
  }
}
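The printed module tree above can be reconstructed with the Element-Research `rnn` package. The sizes (hidden size 200, output vocabulary 10000, rho 5) are read directly off the dump; the construction itself is a sketch of how such a model is typically assembled, not necessarily the exact source that produced it:

```lua
-- Hedged reconstruction of the printed language model (rnn package assumed).
require 'rnn'

local hiddenSize, vocabSize, rho = 200, 10000, 5

-- Recurrent layer: output(t) = Sigmoid( Add(input(t)) + Linear(output(t-1)) )
-- nn.Add is applied at t==0, nn.Identity at t~=0, matching the print above.
local recurrent = nn.Recurrent(
   hiddenSize,                          -- start: creates the nn.Add used at t==0
   nn.Identity(),                       -- input module for t~=0
   nn.Linear(hiddenSize, hiddenSize),   -- feedback from output(t-1)
   nn.Sigmoid(),                        -- transfer function
   rho                                  -- BPTT truncation length
)

local lm = nn.Sequential()
lm:add(nn.LookupTable(vocabSize, hiddenSize))  -- word ids -> embeddings
lm:add(nn.SplitTable(1))                       -- one table entry per time-step
lm:add(nn.Sequencer(recurrent))                -- apply recurrence over the sequence
lm:add(nn.Sequencer(nn.Sequential()            -- per-step softmax classifier
   :add(nn.Linear(hiddenSize, vocabSize))
   :add(nn.LogSoftMax())))
```

With `cuda : true` and `useDevice : 1` from the options, the script would additionally `require 'cunn'`, call `cutorch.setDevice(1)`, and move the model and data to the GPU with `:cuda()`.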
FileLogger: log will be written to /home/nilesh/save/dl03-GB5670C:1462791406:1/log
==> epoch # 1 for optimizer :
Segmentation fault (core dumped)