@klnavaneet Hi, thanks for the great code.
Could you kindly consider releasing the "CompGS 4k, Int16" trained model?
Alternatively, could you provide relevant training parameters for it?
Your assistance in this matter would be greatly appreciated.
@cfchi Hi. For the bit-quantization experiments with CompGS (including CompGS 4k, Int16), we use post-training quantization: the model is trained in full precision, and quantization is applied only at inference time. So the pre-trained model for CompGS 4k, Int16 is the same one as for CompGS 4k.
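To illustrate what post-training quantization of a full-precision checkpoint to Int16 can look like, here is a minimal, hypothetical sketch (not the repository's actual code): symmetric per-tensor quantization with NumPy, where parameters are rounded to int16 for storage and dequantized back to float32 before rendering.

```python
import numpy as np

def quantize_int16(x: np.ndarray):
    """Symmetric per-tensor post-training quantization to int16.

    Maps the float range [-max|x|, +max|x|] onto the int16 range;
    no retraining is involved.
    """
    scale = np.abs(x).max() / np.iinfo(np.int16).max
    q = np.round(x / scale).astype(np.int16)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 parameters for inference."""
    return q.astype(np.float32) * scale

# Hypothetical example: quantize trained parameters, then
# dequantize them at inference time.
params = np.random.randn(4096, 3).astype(np.float32)
q, s = quantize_int16(params)
recovered = dequantize(q, s)
max_err = np.abs(params - recovered).max()  # bounded by ~scale/2
```

The point is that the stored checkpoint stays unchanged; only the representation used at inference differs, which is why the same pre-trained model serves both the full-precision and Int16 configurations.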
PS: We will be updating the work with 4-5x smaller models very soon! Stay tuned.
Love this work! Wanted to ask if there was a checkpoint or pretrained model available.