
Strange behavior of nn.Padding during training #1308

Open
tastyminerals opened this issue Mar 12, 2018 · 0 comments


tastyminerals commented Mar 12, 2018

I have a standard training loop in a sequential recurrent model. The model is trained on two input batches per step, with the following shapes:

x1 batch (25 x 32)
x2 batch (25 x 32 x 9 x 1)

The last batches in the epoch are smaller and are:

x1 last batch (14 x 32)
x2 last batch (14 x 32 x 9 x 1)

So I use nn.Padding to pad the last batches so that they always have the full sequence length, like so:

while opt.maxepoch <= 0 or epoch <= opt.maxepoch do
  print("Epoch #"..epoch)
  local pad1 = nn.Padding(0, 11, 0)
  local pad2 = nn.Padding(0, 11, 0)
  local pad3 = nn.Padding(0, 11, 0)

  for batch1, batch2, batch3 in utils.zip(words_it, feats_it, targets_it) do
    local i, wvec, _ = unpack(batch1) -- word idx
    local _, fvec, _ = unpack(batch2) -- feat vec
    local _, gt, _ = unpack(batch3)   -- targets [0,1]

    if wvec:size(1) ~= opt.seqlen then
      wvec = pad1:forward(wvec)
      fvec = pad2:forward(fvec)
      gt = pad2:forward(gt)
      print({fvec}) -- prints {CudaTensor - size: 25x32}, should be {CudaTensor - size: 25x32x9x1}
    end

    local outputs = model:forward({wvec, fvec:squeeze(4), gt})
    -- (...)
  end
end

However, pad2:forward(fvec) returns a 25x32 tensor instead of 25x32x9x1 every time.
When I use nn.Padding only for the fvec tensor, it correctly returns 25x32x9x1, which is bewildering.
How does this happen?
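One detail that may be relevant: the snippet above feeds both fvec and gt through pad2, while pad3 is never used. In torch/nn, a module's :forward() caches its result in self.output and returns that same tensor, so a second :forward() call on the same module resizes and overwrites the result of the first call. A minimal sketch of that aliasing (sizes and names are illustrative, not taken from the model above):

```lua
require 'nn'

-- nn modules cache their result in self.output, and :forward()
-- returns that same tensor object each time.
local pad = nn.Padding(1, 11)               -- pad 11 extra rows along dim 1
local a = pad:forward(torch.zeros(14, 32))  -- a is pad.output, size 25x32
local b = pad:forward(torch.zeros(14, 5))   -- pad.output is resized and overwritten
-- a and b now reference the same tensor: the first result is gone,
-- and a:size() reports 25x5 rather than 25x32.
```

If that is what is happening here, cloning the output (e.g. pad2:forward(fvec):clone()) or using the unused pad3 for gt would keep the two results from clobbering each other.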
