
Image resolution for outdoor feature matching #9

Closed
UditSinghParihar opened this issue May 26, 2022 · 5 comments

Comments


UditSinghParihar commented May 26, 2022

Hello @Tangshitao ,
Thanks for providing excellent work on QuadTreeAttention.

I want to point out two bugs in FeatureMatching/notebooks/demo_single_pair.ipynb.

  1. The first bug is a KeyError on block_type while loading the default config. I corrected it by appending the following lines from the training setup into cvpr_ds_config.py:
# 5. Quadtree
_CN.COARSE.BLOCK_TYPE = 'quadtree'  
_CN.COARSE.ATTN_TYPE = 'B'  
_CN.COARSE.TOPKS=[32, 16, 16]  
_CN.FINE.BLOCK_TYPE = 'loftr'  
  2. The second bug appears while running inference with the outdoor weights, during the resizing operation at the following line:
    img0_raw = cv2.resize(img0_raw, (img0_raw.shape[1]//8*8, img0_raw.shape[0]//8*8)) # input size should be divisible by 8
    This causes a size mismatch down the line while reshaping in the quadtree attention block. I resolved the issue by resizing the image to 480×640, which works fine:
    img0_raw = cv2.resize(img0_raw, (480, 640))
    But I believe that during training on MegaDepth images you kept the larger side at 832 while keeping it divisible by 8. During inference, however, that same process does not work here the way it does in the original LoFTR notebook.

TLDR: Could you fix the two bugs in demo_single_pair.ipynb: the default config and the resizing operation for the outdoor weights?
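For context on why divisibility by 8 is not enough here: LoFTR extracts coarse features at 1/8 resolution, and quadtree attention with the three-level TOPKS config above appears to halve that map twice more, so the input must be divisible by 8 · 2² = 32. A small sketch of this arithmetic; the exact level count is my reading of the config, not confirmed by the repo:

```python
def min_divisor(coarse_stride=8, quadtree_levels=3):
    # Each quadtree level beyond the first halves the coarse feature map
    # once, so the input must be divisible by stride * 2**(levels - 1).
    # quadtree_levels=3 is an assumption read off TOPKS=[32, 16, 16].
    return coarse_stride * 2 ** (quadtree_levels - 1)

# min_divisor() -> 32, consistent with the "divisible by 32" advice
# given later in this thread
```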

@UditSinghParihar
Author

Hi @Tangshitao ,

I am attaching screenshots of the errors before I made the corrections:

  1. Error in the default config:
    [screenshot]

  2. Error in the resizing operation in outdoor weights:
    [screenshot]

Regards,
Udit

@Tangshitao
Owner

Thanks for the reminder. I did not modify any code in demo_single_pair.ipynb from the original LoFTR repo. For the second issue, you should make the image size divisible by 32.
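The suggestion above can be sketched as a small helper that rounds a dimension down to the nearest multiple of 32; round_down and the sample dimensions are illustrative, not from the repo:

```python
def round_down(x, multiple=32):
    """Round x down to the nearest multiple, e.g. 845 -> 832."""
    return (x // multiple) * multiple

# e.g. an 845x1203 image would be resized to 832x1184 before inference
```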

@UditSinghParihar
Author

Hi @Tangshitao, currently I first resize the larger side to 832 and then make both sides divisible by 32. Is that better than just resizing the dimensions to be divisible by 32? My current image-loading code looks like this:

import cv2
import torch
import kornia as K

def load_image(fname):
    img = cv2.imread(fname)
    # resize so the larger side is 832, keeping the aspect ratio
    scale = 832 / max(img.shape[0], img.shape[1])
    w = int(img.shape[1] * scale)
    h = int(img.shape[0] * scale)
    img = cv2.resize(img, (w, h))
    # then shrink each side down to the nearest multiple of 32
    img = cv2.resize(img, (img.shape[1]//32*32, img.shape[0]//32*32))  # input size should be divisible by 32

    img = K.image_to_tensor(img, False).float() / 255.
    img = K.color.bgr_to_grayscale(img)

    return img.to(torch.device('cuda')), scale

Regards,
Udit Singh Parihar

@Tangshitao
Owner

Do you mean for training or for testing? If you resize the larger side to 832, it is already divisible by 32.

@UditSinghParihar
Author

Hi @Tangshitao, I am talking about the outdoor testing part of the FeatureMatching/notebooks/demo_single_pair.ipynb notebook. If we resize the larger side to 832, then only the larger side is divisible by 32, so we also need to make the smaller side divisible by 32.

And since training was done with the larger side at 832, I first want to scale the larger side to 832 (keeping the aspect ratio the same) and then make the smaller side divisible by 32 as well.
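The two-step recipe described above (scale the larger side to 832, then round both sides down to multiples of 32) can be sketched as a small helper; target_size and the sample dimensions are illustrative, not from the repo:

```python
def target_size(h, w, long_side=832, multiple=32):
    """Scale so the larger side is long_side, then round both
    dimensions down to the nearest multiple."""
    scale = long_side / max(h, w)
    h2, w2 = int(h * scale), int(w * scale)
    return (h2 // multiple) * multiple, (w2 // multiple) * multiple

# e.g. a 1200x1600 image becomes 608x832
```

Note that cv2.resize takes its target size as (width, height), so the returned (h, w) pair must be swapped when passed to it.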

Regards,
Udit
