Hi,
In transformer.py, I noticed that mask_token_id is set to args.num_image_tokens. Shouldn't it be args.num_codebook_vectors? We don't want the mask token id to collide with one of the codebook indices. The same applies to the sos token id.
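To illustrate the collision: image token ids come from the VQ codebook and occupy the range [0, num_codebook_vectors), while num_image_tokens is just the sequence length (the latent grid size). A minimal sketch, with hypothetical example values for both arguments:

```python
# Hypothetical example values (not the repo's actual defaults):
num_codebook_vectors = 1024  # VQ codebook size; valid image ids are 0..1023
num_image_tokens = 256       # sequence length, e.g. a 16x16 latent grid

# Buggy choice: 256 is a perfectly valid codebook index, so the model
# cannot distinguish "masked position" from "image token 256".
bad_mask_token_id = num_image_tokens

# Fixed choice: place special tokens just past the codebook range.
mask_token_id = num_codebook_vectors      # 1024
sos_token_id = num_codebook_vectors + 1   # 1025

# The token embedding table then needs room for the specials as well.
vocab_size = num_codebook_vectors + 2     # 1026

assert bad_mask_token_id < num_codebook_vectors   # collides with a codebook id
assert mask_token_id >= num_codebook_vectors      # safely outside the codebook
assert sos_token_id >= num_codebook_vectors
```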
Oh yeah, that's correct. My bad, thanks for catching it. It doesn't seem to hurt performance that badly, though. I'm currently training another model at 32x32 resolution. Here's the loss curve:
I restarted training and it seems to perform better now.
(Green is the run with the fixed token ids)
But I also changed the optimizer weight decay, so that may have had an effect as well.
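For context, a common weight-decay convention in transformer training (a sketch of the general pattern, not necessarily the change made here) is to apply decay only to weight matrices and exempt biases and normalization parameters:

```python
import torch

# Toy model standing in for the transformer (assumption for illustration).
model = torch.nn.Sequential(
    torch.nn.Linear(8, 8),
    torch.nn.LayerNorm(8),
)

# Split parameters: 2-D+ tensors (weight matrices) get decay;
# 1-D tensors (biases, norm scales) do not.
decay, no_decay = [], []
for name, p in model.named_parameters():
    (decay if p.ndim >= 2 else no_decay).append(p)

optimizer = torch.optim.AdamW(
    [
        {"params": decay, "weight_decay": 0.1},   # hypothetical value
        {"params": no_decay, "weight_decay": 0.0},
    ],
    lr=3e-4,  # hypothetical value
)
```

Changing the decay value, or which parameters it applies to, can shift the loss curve on its own, which is why it is hard to attribute the improvement to the token-id fix alone.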