Hi, thank you for your great work! I would like to evaluate the ImageNet pre-trained HAT-L models without fine-tuning on DF2K. Pre-training HAT-L on ImageNet takes a long time (about 13 days on 8 V100 GPUs). Could you please release the ImageNet pre-trained HAT-L models without any fine-tuning? Thanks in advance!