One imported function does not exist. #1
Comments
Hi, you can comment out SamplingAttention_dec; it is the vanilla deformable attention from https://github.com/fundamentalvision/Deformable-DETR
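A minimal sketch of one way to apply this suggestion without deleting the line outright: guard the import so the code still runs when the symbol is absent. The module path and class name come from the issue; the fallback logic is an assumption, not the authors' code.

```python
# Hedged sketch (not the repository's actual code): a defensive import
# guard for the missing SamplingAttention_dec symbol.
try:
    from dqrf.ops.functions.ms_deform_attn import SamplingAttention_dec
except ImportError:
    # The symbol is not shipped in the open-sourced code, so fall back to
    # None; callers can then take the rectified-attention path instead.
    SamplingAttention_dec = None

# Downstream code can branch on availability of the decoder attention.
USE_RECTIFIED_ATTENTION = SamplingAttention_dec is None
```

Since the `dqrf` package does not export this symbol, the guard leaves `SamplingAttention_dec` as `None` and signals that the alternative attention path should be used.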
Hi, after commenting out SamplingAttention_dec, the following error occurred: Traceback (most recent call last): Could you kindly provide the commands to train the model on the COCO dataset?
Also, we tried to reproduce this paper, but unfortunately we cannot achieve the performance published in the paper with this open-sourced code.
Hi, as I said, you can remove SamplingAttention_dec from the transformer architecture. In the configs you can set RECTIFIED_ATTENTION to true. Here are some results on the COCO dataset for reference:
Our DQ module does not work well on COCO.
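For illustration, here is a sketch of how such a flag might be flipped in a yacs-style config object. Only the key name RECTIFIED_ATTENTION comes from the comment above; the surrounding config structure is assumed.

```python
# Hedged sketch: enabling the rectified-attention path in a config object.
# The key name comes from the maintainer's comment; the structure around
# it is assumed for illustration.
from types import SimpleNamespace

cfg = SimpleNamespace(MODEL=SimpleNamespace(RECTIFIED_ATTENTION=False))

# Enable the rectified-attention path so the decoder does not depend on
# the missing SamplingAttention_dec symbol.
cfg.MODEL.RECTIFIED_ATTENTION = True
```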
from dqrf.ops.functions.ms_deform_attn import SamplingAttention_RA, SamplingEncAttention, SamplingAttention_dec
The function 'SamplingAttention_dec' seems to be missing from dqrf/ops/functions/ms_deform_attn.py. Would you please kindly add this function to the script?