perf and #param #6
Hi @junhu-svd
For our 2-Stage MPRNet Deblurring model, the number of trainable parameters are 11.3 Million. For DMPHN, it is 21.7 Million. Thanks |
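(For anyone reproducing these counts: the usual PyTorch idiom is to sum `numel()` over parameters with `requires_grad` set. A minimal sketch; the layer used below is just an illustration, not part of either model:)

```python
import torch.nn as nn

def count_trainable(model: nn.Module) -> int:
    # sum element counts over all parameters that receive gradients
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# e.g. a single Linear layer: 10*5 weights + 5 biases = 55 parameters
print(count_trainable(nn.Linear(10, 5)))  # 55
```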
Thanks, adityac8. I can almost reproduce your numbers after some corrections. In my latest profiler run, MPRNet (2 stages) uses 11.8 million parameters (not far from 11.3M), about 45 MB if saved as float32. FYI, DMPHN is 5.4 million parameters, about 21.7 MB if saved as float32, which is consistent with the numbers reported in the DMPHN paper.
On the performance side, do you have any official statistics for MPRNet (2 stages) in terms of GMACs? What I got from torch-summary is 231.54G mult-adds for a 256x256 input, which is a very high number.
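(A quick sanity check on the MB figures above: a float32 checkpoint takes 4 bytes per parameter, so the conversion is plain arithmetic. A minimal sketch; note the "~45 MB" figure for 11.8M parameters matches the binary MiB convention, while the decimal convention gives ~47 MB:)

```python
def fp32_checkpoint_mb(num_params, binary=False):
    """Approximate size of a float32 checkpoint (weights only, no optimizer state)."""
    bytes_total = num_params * 4  # 4 bytes per float32 parameter
    return bytes_total / (1024 ** 2) if binary else bytes_total / 1e6

print(round(fp32_checkpoint_mb(11.8e6), 1))        # 47.2 (decimal MB)
print(round(fp32_checkpoint_mb(11.8e6, True), 1))  # 45.0 (binary MiB)
print(round(fp32_checkpoint_mb(5.4e6), 1))         # 21.6 (decimal MB)
```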
Hi @junhu-svd, here is a table from the DMPHN paper. For evaluation, we use the Stack(4)-DMPHN version, which is publicly released by the original authors and provides their best results. This Stack(4)-DMPHN has 21.7 million parameters, ~86.8 MB. If you are looking for a model that is efficient in terms of compute and parameters, you can retrain our MPRNet model by setting Line 239 in 9edfb1d
This will provide you with a 6 million parameter model that yields ~31.8 dB PSNR on the GoPro test set, as shown in Figure 1 of our paper. Thanks
Hi, thanks for the great work. I have a question about the performance and parameter counts mentioned in the paper (Table 7). I used torchsummary to generate the GMACs and parameter counts for MPRNet (2 stages); it turns out the parameter size is 53.66 MB (far more than the 11.3M in the paper), and on the GMACs side it is about 10x more than DMPHN(1-2-4). Although torchsummary isn't perfect and there is a delta between real performance and GMACs, I wonder if you have any hint/idea to explain the gap?
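(For context on where such GMAC numbers come from: profilers like torchsummary typically count one multiply-add per weight application, so a single convolution contributes roughly k·k·C_in·C_out·H_out·W_out mult-adds. This is why cost scales with input resolution and stage count. A hypothetical 3x3 layer as illustration:)

```python
def conv_mult_adds(k, c_in, c_out, h_out, w_out):
    # each output element needs k*k*c_in multiply-accumulates, per output channel
    return k * k * c_in * c_out * h_out * w_out

# hypothetical 3x3 conv, 64 -> 64 channels, on a 256x256 output feature map
macs = conv_mult_adds(3, 64, 64, 256, 256)
print(round(macs / 1e9, 1))  # ~2.4 GMACs for this single layer
```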