You can encapsulate it in a method decorated so that amp (autocast) is not active inside it. The same applies to the VQ layer.
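For example, the decorator could look like this (a minimal sketch; `no_autocast` and `upsample` are hypothetical names, and this assumes PyTorch's `torch.autocast` context manager):

```python
import functools

import torch
import torch.nn.functional as F


def no_autocast(fn):
    """Hypothetical decorator: run fn with autocast (amp) disabled."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        with torch.autocast(device_type="cuda", enabled=False):
            return fn(*args, **kwargs)
    return wrapper


@no_autocast
def upsample(x):
    # Inside the decorated method autocast no longer forces bfloat16,
    # so F.interpolate can run on a float32 input.
    return F.interpolate(x.float(), scale_factor=2.0, mode="nearest")
```

The same decorator could wrap the VQ layer's forward method, keeping the amp-incompatible ops in full precision without disabling autocast for the rest of the model.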
Best regards,
Petru-Daniel (Dan) Tudosiu
Biomedical Engineering PhD Student
School of Biomedical Engineering & Imaging Sciences
King's College London
________________________________
From: Walter Hugo Lopez Pinaya
Sent: Saturday, December 24, 2022 1:22:11 PM
Subject: Re: [Project-MONAI/GenerativeModels] F.interpolate not compatible with bfloat16 (Issue #156)
Related to #133
As mentioned in pytorch/pytorch#86679, F.interpolate is not compatible with bfloat16.
Its inputs need to be converted to float32 where it is used in:
GenerativeModels/generative/networks/nets/autoencoderkl.py
Line 48 in 5f0fd39
and
GenerativeModels/generative/networks/nets/diffusion_model_unet.py
Line 518 in 9cd35d0
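A cast-and-restore wrapper along these lines could cover both call sites (a sketch; `interpolate_fp32` is a hypothetical helper name, not part of the repository):

```python
import torch
import torch.nn.functional as F


def interpolate_fp32(x, **kwargs):
    """Hypothetical helper: cast bfloat16 inputs to float32 for
    F.interpolate, then cast the result back to the input dtype."""
    dtype = x.dtype
    if dtype == torch.bfloat16:
        x = x.to(torch.float32)
    return F.interpolate(x, **kwargs).to(dtype)
```

Callers keep their bfloat16 tensors end to end; only the interpolation itself runs in float32, which sidesteps the missing bfloat16 kernel reported upstream.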