
Missing module layers.attention #1

Closed
tarepan opened this issue Jul 15, 2019 · 0 comments
Labels
bug Something isn't working

Comments

tarepan commented Jul 15, 2019

Summary

Training failed immediately after execution because of a missing module.

Situation

!python GroupLatentEmbedding/wavernn.py -m vqvae_group --num-group 41 --num-sample 10

Problems

The program throws an error immediately after execution.

Traceback (most recent call last):
  File "GroupLatentEmbedding/wavernn.py", line 11, in <module>
    import models.nocond as nc
  File "/content/GroupLatentEmbedding/models/nocond.py", line 14, in <module>
    from layers.vector_quant import VectorQuant
  File "/content/GroupLatentEmbedding/layers/vector_quant.py", line 7, in <module>
    from layers.attention import EmbeddingAttention
ModuleNotFoundError: No module named 'layers.attention'

Estimated cause

There is no module named "attention" in the repository.
The import may be a remnant of removed code or a placeholder for something planned.
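If `EmbeddingAttention` is not actually used elsewhere in `vector_quant.py`, one hypothetical workaround (not the fix applied in the referenced commit) is to guard the import so training can proceed until the missing module is restored or the import is removed:

```python
# layers/vector_quant.py (sketch of a hypothetical workaround)
try:
    # This module does not exist in the repository, so the import fails.
    from layers.attention import EmbeddingAttention
except ModuleNotFoundError:
    # Fall back to None; any actual use of EmbeddingAttention
    # will then fail loudly with a clear error instead of at import time.
    EmbeddingAttention = None
```

This only defers the failure; the real resolution is to delete the stale import or add the missing `layers/attention.py` module.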

@tarepan tarepan added the bug Something isn't working label Jul 15, 2019
tarepan added a commit that referenced this issue Jul 15, 2019
@tarepan tarepan closed this as completed Jul 15, 2019