Hi,
Does the LSTM support processing multiple batches directly (I want this for performance benchmarking)? I tried to implement it; it raises no error, but the results seem inconsistent: if I feed identical sequences in one batch, I get different outputs. I am also confused because the LSTM code file states that the expected input is either 1D or 2D, yet the Penn Tree Bank sample uses multiple batches. (Here I want to time forward and backward separately, though.)
BTW: thanks for making this available to the public!
Here is a minimal code example (the mask-zero part can also be removed) of what I mean:
require "rnn"
require "cunn"
torch.manualSeed(123)
batch_size = 2
maxLen = 4
wordVec = 5
nWords = 100
mode = 'CPU'
-- create random data with zeros as empty indicator
inp1 = torch.ceil(torch.rand(batch_size, maxLen)*nWords) -- word indices in [1, nWords]
labels = torch.ceil(torch.rand(batch_size)*2) -- create labels of 1s and 2s
-- not all sequences have the same length; 0 is the placeholder
for i=1, batch_size do
n_zeros = torch.random(maxLen-2)
inp1[{{i},{1, n_zeros}}] = torch.zeros(n_zeros)
end
-- make the first sequence the same as the second
inp1[{{2},{}}] = inp1[{{1},{}}]:clone()
lstm = nn.Sequential()
lstm:add(nn.LookupTableMaskZero(nWords, wordVec)) -- convert indices to word vectors; index 0 maps to a zero vector
lstm:add(nn.SplitTable(1)) -- convert tensor to list of subtensors
lstm:add(nn.Sequencer(nn.MaskZero(nn.LSTM(wordVec, wordVec), 1))) -- Seq to Seq', 0-Seq to 0-Seq
if mode == 'GPU' then
lstm:cuda()
labels = labels:cuda()
inp1 = inp1:cuda()
end
out = lstm:forward(inp1)
print('input 1', inp1[1])
print('lstm out 1', out[1])
print('input 2', inp1[2]) -- should be the same as above
print('lstm out 2', out[2]) -- should be the same as above
output:
input 1 0
0
29
43
[torch.DoubleTensor of size 4]
lstm out 1 0.0000 0.0000 0.0000 0.0000 0.0000
0.0000 0.0000 0.0000 0.0000 0.0000
-0.0226 0.0012 0.1373 0.0064 0.0766
0.1174 0.1793 0.0684 0.0029 0.0138
[torch.DoubleTensor of size 4x5]
input 2 0
0
29
43
[torch.DoubleTensor of size 4]
lstm out 2 0.0000 0.0000 0.0000 0.0000 0.0000
0.0000 0.0000 0.0000 0.0000 0.0000
-0.0325 0.0143 0.2019 0.0113 0.1202
0.1606 0.2348 0.1093 0.0045 0.0208
[torch.DoubleTensor of size 4x5]
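One thing I noticed while preparing this: after the lookup the tensor is batch_size x maxLen x wordVec, so nn.SplitTable(1) splits the *batch* dimension, and the Sequencer then treats each sequence as a timestep (which would explain why out[2] differs — it carries hidden state from "step" 1). If my assumption is right that the Sequencer expects a table with one batch_size x wordVec tensor per timestep, splitting the time dimension instead should make identical rows produce identical outputs. A minimal sketch of that variant (same setup as above):

```lua
require "rnn"
torch.manualSeed(123)

batch_size, maxLen, wordVec, nWords = 2, 4, 5, 100
inp1 = torch.ceil(torch.rand(batch_size, maxLen) * nWords)
inp1[{{2},{}}] = inp1[{{1},{}}]:clone() -- make both batch rows identical

lstm = nn.Sequential()
lstm:add(nn.LookupTableMaskZero(nWords, wordVec))
-- split dim 2 (time): table of maxLen tensors, each batch_size x wordVec
lstm:add(nn.SplitTable(2))
lstm:add(nn.Sequencer(nn.MaskZero(nn.LSTM(wordVec, wordVec), 1)))

out = lstm:forward(inp1)
-- out[t] is the timestep-t output for the whole batch;
-- rows 1 and 2 should now agree at every step
for t = 1, maxLen do
  print(t, (out[t][1] - out[t][2]):abs():max())
end
```

With this indexing, out is a table of length maxLen rather than batch_size, so the per-sequence outputs are read as out[t][i] instead of out[i][t].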