Hi @sharvil @Andrechang @JCBrouwer, thanks for this implementation.

My issue is about training time for unconditional generation. One epoch takes me about 5 hours on a single RTX 8000, and most of that time is spent in `loss.backward()`, using the unconditional setting from #5. I wonder:

1. Is this common?
2. Any suggestions for acceleration?
3. After how many epochs did you start to get good-quality generations?

Thanks in advance.
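Since most of the time is reported to be in `loss.backward()`, a first diagnostic step could be to time the forward and backward passes separately and confirm the split. The sketch below is a minimal, hypothetical harness using a toy model (not the repo's actual model), assuming a standard PyTorch training loop:

```python
import time
import torch
import torch.nn as nn

# Toy stand-in model (hypothetical; substitute the actual model here).
model = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 64))
x = torch.randn(32, 64)
target = torch.randn(32, 64)
loss_fn = nn.MSELoss()

def timed_step():
    """Time the forward and backward passes separately for one batch."""
    t0 = time.perf_counter()
    loss = loss_fn(model(x), target)
    # Note: on GPU, call torch.cuda.synchronize() before reading the clock,
    # since CUDA kernels launch asynchronously.
    t1 = time.perf_counter()
    loss.backward()
    t2 = time.perf_counter()
    model.zero_grad()
    return t1 - t0, t2 - t1

fwd, bwd = timed_step()
print(f"forward: {fwd:.6f}s, backward: {bwd:.6f}s")
```

For a finer per-operator breakdown, `torch.profiler.profile` can wrap the same step; common acceleration levers in this situation include mixed-precision training (`torch.cuda.amp`) and a larger batch size, though whether they help here depends on the model and memory budget.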