recurrent_group cannot return more than one layer as its outputs. #2834

Closed
lcy-seso opened this issue Jul 13, 2017 · 0 comments · Fixed by #2944
lcy-seso commented Jul 13, 2017

I use the machine translation example in PaddleBook as my test case.

  • Previously, recurrent_group in PaddlePaddle could return more than one layer from inside the layer group as its outputs.
  • The returned layers are accessible to any other layer in the network, while layers inside the recurrent_group that are not returned by it stay inaccessible outside the recurrent_group.
  • I want to use this feature, but if I change this line to return more than one layer from recurrent_group, the configuration parsing process crashes. This looks like a bug (a minimal sketch of the intended usage is given after the traceback below).
  • The error information is as follows:
Traceback (most recent call last):
  File "network_conf.py", line 114, in <module>
    print parse_network(seq_to_seq_net(30000, 30000, True, 3, 250))
  File "network_conf.py", line 108, in seq_to_seq_net
    max_length=max_length)
  File "/home/caoying/paddle_codes/models/nmt_plot_attention_weights/paddle/v2/config_base.py", line 52, in wrapped
    out = f(*args, **xargs)
  File "/home/caoying/paddle_codes/models/nmt_plot_attention_weights/paddle/trainer_config_helpers/default_decorators.py", line 53, in __wrapper__
    return func(*args, **kwargs)
  File "/home/caoying/paddle_codes/models/nmt_plot_attention_weights/paddle/trainer_config_helpers/layers.py", line 3949, in beam_search
    is_generating=True)
  File "/home/caoying/paddle_codes/models/nmt_plot_attention_weights/paddle/trainer_config_helpers/default_decorators.py", line 53, in __wrapper__
    return func(*args, **kwargs)
  File "/home/caoying/paddle_codes/models/nmt_plot_attention_weights/paddle/trainer_config_helpers/layers.py", line 3657, in recurrent_group
    RecurrentLayerGroupEnd(name=name)
  File "/home/caoying/paddle_codes/models/nmt_plot_attention_weights/paddle/trainer/config_parser.py", line 396, in RecurrentLayerGroupEnd
    "memory declare wrong size:%d" % memory_link.size)
  File "/home/caoying/paddle_codes/models/nmt_plot_attention_weights/paddle/trainer/config_parser.py", line 156, in config_assert
    logger.fatal(msg)
  File "/home/caoying/paddle_codes/models/nmt_plot_attention_weights/paddle/trainer/config_parser.py", line 3531, in my_fatal
    raise Exception()
Exception
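
For clarity, this is a minimal sketch of the usage I mean, written against the paddle.v2 layer API rather than the actual NMT config; the layer names, sizes, and input sequence here are made up for illustration only:

```python
# Hedged sketch, not the real model config: a step function that returns
# two layers from recurrent_group. Names and sizes are illustrative.
import paddle.v2 as paddle

def step(current_word_emb):
    # Recurrent state carried across time steps.
    prev_state = paddle.layer.memory(name='rnn_state', size=128)
    state = paddle.layer.fc(input=[current_word_emb, prev_state],
                            size=128,
                            act=paddle.activation.Tanh(),
                            name='rnn_state')
    projection = paddle.layer.fc(input=state,
                                 size=64,
                                 act=paddle.activation.Softmax())
    # Returning both layers is what triggers the
    # "memory declare wrong size" assertion during config parsing.
    return state, projection

word = paddle.layer.data(
    name='word', type=paddle.data_type.integer_value_sequence(30000))
emb = paddle.layer.embedding(input=word, size=128)

# Expecting two sequence outputs from the layer group.
state_seq, proj_seq = paddle.layer.recurrent_group(step=step, input=emb)
```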