
About pose optimizer #27

Open
jinmaodaomaye2021 opened this issue Jul 31, 2023 · 1 comment
Labels
enhancement New feature or request

Comments

@jinmaodaomaye2021

Hi,

Thanks for your great work. I looked at your code and found that the implementation uses a single shared Adam optimizer for all poses in global BA.

My question is: if some poses are not selected in a global BA iteration, do they still get updated by the optimizer?

  • My expectation is that if the randomly selected rays don't involve pose A, pose A shouldn't be updated.
  • However, the Adam optimizer may use its gradient history (momentum) to update all parameters, even those whose current gradient is zero.
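The behavior described in the second bullet is easy to verify with a small PyTorch sketch (a minimal illustration, not this repo's actual code; the two-element tensor standing in for two poses is hypothetical):

```python
import torch

# Two stand-in "pose" parameters optimized by one shared Adam instance.
poses = torch.nn.Parameter(torch.zeros(2))
opt = torch.optim.Adam([poses], lr=0.1)

# Step 1: both poses receive gradients (both appear in the sampled rays).
loss = (poses - torch.tensor([1.0, 1.0])).pow(2).sum()
loss.backward()
opt.step()
opt.zero_grad()

# Step 2: only pose 0 is "selected"; pose 1's gradient is exactly zero.
loss = (poses[0] - 1.0).pow(2)
loss.backward()
before = poses[1].item()
opt.step()
after = poses[1].item()

# Adam's momentum buffer (exp_avg) from step 1 is non-zero, so pose 1
# still moves even though its current gradient is zero.
print(before != after)  # True: the unselected pose was updated anyway
```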

Do you think this is a valid concern?
Thanks.

@HengyiWang
Owner

Hi @jinmaodaomaye2021, thank you for your question.

You are right; this is a valid concern. Adam uses its gradient history (momentum) to update all parameters, including those with zero gradient in the current step. Currently, we initialize a new optimizer for each global BA and update the pose parameters only twice, so the impact should not be significant. However, when I have time I will run some experiments to check whether this actually hurts pose optimization. If so, I will switch to SparseAdam for global BA.
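For reference, here is a minimal sketch of the SparseAdam alternative, under the assumption that poses are stored as rows of a sparse `nn.Embedding` (the names and dimensions are hypothetical, not taken from this repo). SparseAdam only updates rows that actually received gradient entries, so unselected poses are left completely untouched:

```python
import torch

# Hypothetical setup: one row per camera pose (a 6-DoF vector is an assumption).
num_poses, pose_dim = 4, 6
pose_table = torch.nn.Embedding(num_poses, pose_dim, sparse=True)
opt = torch.optim.SparseAdam(pose_table.parameters(), lr=1e-2)

w_before = pose_table.weight.detach().clone()

# Only poses 0 and 1 are touched by the sampled rays in this BA iteration.
selected = torch.tensor([0, 1])
loss = (pose_table(selected) - 1.0).pow(2).sum()
loss.backward()  # sparse gradient: entries exist only for rows 0 and 1
opt.step()

# SparseAdam skips rows with no gradient entries, so poses 2 and 3 keep
# both their values and their optimizer state unchanged.
print(torch.equal(pose_table.weight[2:], w_before[2:]))  # True
```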

@HengyiWang HengyiWang added the enhancement New feature or request label Aug 1, 2023