factored_attention.py throws AssertionError #47

Open · agilebean opened this issue May 8, 2020 · 3 comments
agilebean commented May 8, 2020

After changing the artist and lyrics, the Jupyter notebook throws the following error at this line:

zs = _sample(zs, labels, sampling_kwargs, [None, None, top_prior], [2], hps)

Sampling level 2
Sampling 8192 tokens for [0,8192]. Conditioning on 0 tokens
Ancestral sampling 3 samples with temp=0.98, top_k=0, top_p=0.0
0/8192 [00:00<?, ?it/s]

---------------------------------------------------------------------------

AssertionError                            Traceback (most recent call last)

<ipython-input-44-c3213f092598> in <module>()
      1 zs = [t.zeros(hps.n_samples,0,dtype=t.long, device='cuda') for _ in range(len(priors))]
----> 2 zs = _sample(zs, labels, sampling_kwargs, [None, None, top_prior], [2], hps)

6 frames

/usr/local/lib/python3.6/dist-packages/jukebox/transformer/factored_attention.py in check_cache(self, n_samples, sample_t, fp16)
    411 
    412     def check_cache(self, n_samples, sample_t, fp16):
--> 413         assert self.sample_t == sample_t, f"{self.sample_t} != {sample_t}"
    414         if sample_t == 0:
    415             assert self.cache == {}

AssertionError: 3344 != 0
heewooj commented May 8, 2020

Hm.. this seems very strange. Could you share how to reproduce this error?

agilebean (Author) commented

I just changed the genre to Pop, and the artist to Katy Perry.
The text I entered was shorter, but could that cause an error?

heewooj commented May 14, 2020

No, we have functions that pad short lyrics. I think top-level sampling was probably interrupted earlier, which would explain why the transformer already has cached states. Restarting the notebook kernel will most likely fix this.
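If restarting the kernel is inconvenient, clearing the stale attention state on the top-level prior before re-running the sampling cell may have the same effect. This is only a sketch, not a verified fix: it relies solely on the sample_t and cache attributes visible in the traceback above, and attribute names may differ between jukebox versions.

def clear_sampling_cache(prior):
    # Walk the prior's submodules and reset any leftover per-sample
    # attention state (the sample_t step counter and the key/value cache)
    # left behind by an interrupted sampling run.
    for module in prior.modules():
        if hasattr(module, 'sample_t') and hasattr(module, 'cache'):
            module.sample_t = 0
            module.cache = {}

clear_sampling_cache(top_prior)
# then re-run:
# zs = _sample(zs, labels, sampling_kwargs, [None, None, top_prior], [2], hps)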
