
No good result on new test video #4

Closed
rebotnix opened this issue Oct 25, 2020 · 13 comments

@rebotnix

rebotnix commented Oct 25, 2020

I tried a first test video, and the tracking does not seem to work on a simple road video.
https://www.youtube.com/watch?v=vy38G5g7FaY

I get a lot of "no good matching points" messages.

Can I adjust the tracking?

I tried to adjust the crop, but it doesn't work. Thanks for sharing this project.

@jiupinjia
Owner

Hi, thanks for your feedback. I watched your video and I believe the problem lies in the sky background. As mentioned in my preprint paper, when there are no textures in the sky, the motion of the sky background cannot be accurately modeled. This is one of the limitations of the method.
I have updated skyboxengine.py: this time I limited the motion model to translation + rotation and raised the threshold on the number of effective matching points (3 -> 10). You can try again and tell me whether it works this time. With any luck, this update will work for your case. But as I mentioned above, the most reliable solution is to choose a video with rich sky textures for testing. Good luck.
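Roughly, the idea looks like the sketch below (a simplified illustration, not the exact code in skyboxengine.py; the function name and the RANSAC settings here are just for illustration): fit a rigid translation + rotation transform with OpenCV and skip the update when fewer than 10 effective matching points survive.

```python
import cv2
import numpy as np

MIN_MATCH_POINTS = 10  # threshold raised from 3 to 10 in the update

def estimate_rigid_motion(prev_pts, curr_pts):
    """Fit a translation + rotation transform between two sets of matched
    sky points; return None when there are too few effective matches."""
    if prev_pts is None or curr_pts is None or len(prev_pts) < MIN_MATCH_POINTS:
        return None  # too few matching points -> caller can reuse the last motion

    prev_pts = np.asarray(prev_pts, dtype=np.float32)
    curr_pts = np.asarray(curr_pts, dtype=np.float32)

    # estimateAffinePartial2D fits a 4-DOF similarity (rotation, uniform
    # scale, translation); RANSAC rejects bad matches.
    M, inliers = cv2.estimateAffinePartial2D(prev_pts, curr_pts,
                                             method=cv2.RANSAC,
                                             ransacReprojThreshold=3.0)
    if M is None or inliers is None or int(inliers.sum()) < MIN_MATCH_POINTS:
        return None

    # Strip the scale component so only translation + rotation remain.
    scale = np.hypot(M[0, 0], M[1, 0])
    M[:, :2] /= scale
    return M  # 2x3 matrix, usable with cv2.warpAffine
```

Removing the scale from the fitted similarity keeps the background from "breathing" when the matches are noisy, which is the point of restricting the motion model.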

@yggs1401

Hello! First of all, nice work!
What do you mean by "rich sky texturing"? Can you explain a bit and give an example?
Thank you!

@jiupinjia
Owner


Thanks for your comments! I can give you an example. In the figure below, the left one has "rich sky textures" but the right one does not.

[Figure: left — a sky with rich textures; right — a textureless sky]

@yggs1401

Thank you, I understand now! Does this problem occur because of the dataset? I mean, too few images with clear skies (day/night)?

@jiupinjia
Owner


I believe the problem is in the motion estimation rather than the dataset or the training. If you set "save_jpgs" in the config file to "true" and check the sky-region detection results, you will see that frames with clear sky regions are detected nicely. However, when there are not enough feature points in the sky, the tracker may fail to find any effective matching points between adjacent frames, and that is why we need a richly textured sky to make sure the motion of the virtual background is accurately modeled.
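If you want a quick way to check whether a clip has enough sky texture before running the whole pipeline, you could count how many sky feature points can be tracked between two adjacent frames. Here is a rough diagnostic sketch (not part of the repository; it assumes an OpenCV tracker and a binary sky mask from the detector, and the helper name is made up):

```python
import cv2

def count_sky_matches(prev_gray, curr_gray, sky_mask, max_corners=200):
    """Rough diagnostic: how many sky feature points can be tracked
    between two adjacent (grayscale) frames?"""
    # Detect corners only inside the binary (uint8) sky mask.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                  qualityLevel=0.01, minDistance=8,
                                  mask=sky_mask)
    if pts is None:
        return 0

    # Track the corners into the next frame with pyramidal Lucas-Kanade.
    _, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    return int(status.sum())  # number of successfully tracked points
```

If the returned count is consistently below about 10, the sky is probably too textureless for reliable motion estimation.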

@yggs1401

I understand! Thank you very much for the info! Once again, nice work :)

@jiupinjia
Owner


Cheers!

@rebotnix
Author


Will try the fixes, thanks a lot.

@rebotnix
Author

@jiupinjia I made a new test video with the updated settings. I think it's a little bit better, but it still has the jumping effects.

Here is the new video:
https://www.youtube.com/watch?v=rzJWS3jqiy8

Maybe we could work with pre-markers in the sky to help match the points for tracking?

@jiupinjia
Owner


Seems the problem is still there. Can you show me the input video?

@rebotnix
Author

@jiupinjia Sure, you can download it here; I will remove the zip in 48 hours.
http://rebotnix.com/tmp/sky_videos.zip

Thanks for your help. Will test any changes.

@jiupinjia
Owner


Hi, I have carefully checked your input videos, and I believe the problem is in the sky background. As I mentioned in Sec. 4.4 of my preprint paper, one of the limitations of the method is that it only works on input videos with rich sky textures. When there are no textures, the motion of the virtual camera cannot be accurately modeled. To better understand this, I can give you an example. In the figure below, the left one shows a frame with rich sky textures but the right one does not.

I have put SkyAR version 2.0 on my agenda, where I will work on more robust motion estimation (motion propagation from the foreground to the background); hopefully, SkyAR 2.0 will no longer place any requirements on the sky textures. Thanks again for your valuable feedback.

[Figure: left — a frame with rich sky textures; right — a frame with a textureless sky]
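Roughly, the motion-propagation idea could look like the sketch below: estimate the camera motion from the textured foreground and reuse it to warp the virtual sky. This is only an illustration of the concept, not the SkyAR 2.0 implementation, and it ignores foreground parallax, which a real solution would have to handle:

```python
import cv2

def propagate_foreground_motion(prev_gray, curr_gray, sky_mask):
    """Estimate camera motion from the textured foreground and return a
    2x3 transform that could also be applied to the virtual sky."""
    fg_mask = cv2.bitwise_not(sky_mask)  # track features outside the sky

    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                  qualityLevel=0.01, minDistance=8,
                                  mask=fg_mask)
    if pts is None:
        return None

    # Track foreground corners into the next frame.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good_prev = pts[status.ravel() == 1]
    good_next = next_pts[status.ravel() == 1]
    if len(good_prev) < 10:
        return None

    # Similarity fit of the foreground motion; the same transform is then
    # used to warp the sky background with cv2.warpAffine.
    M, _ = cv2.estimateAffinePartial2D(good_prev, good_next, method=cv2.RANSAC)
    return M
```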

@rebotnix
Author

I fully understand, and I thank you for taking the time to check the videos. It's a great project and I will follow it for sure. Good luck.
