
reduce size of egamma energy regression payloads #17560

Open · slava77 opened this issue Feb 18, 2017 · 3 comments

@slava77 (Contributor) commented Feb 18, 2017

The last update in #17506 has more than doubled the in-memory size of the payloads consumed by EGExtraInfoModifierFromDB (from 22 MB to 53 MB).

Do we really need GBRForestD here, or can float precision suffice?
Can some other "compactification" be done?
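(A back-of-envelope estimate, not a measurement from this issue: if the payload is dominated by per-node cut thresholds and per-leaf responses stored as 8-byte doubles, narrowing those arrays to 4-byte floats would halve them, 53 MB → ~27 MB, close to the pre-#17506 footprint.)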

@lgray @rafaellopesdesa @bendavid

@cmsbuild (Contributor) commented Feb 18, 2017

A new Issue was created by @slava77 Slava Krutelyov.

@davidlange6, @Dr15Jones, @smuzaffar can you please review it and eventually sign/assign? Thanks.

cms-bot commands are listed here

@slava77 slava77 changed the title reduce size of egamma energy regressions reduce size of egamma energy regression payloads Feb 18, 2017
@slava77 (Contributor, Author) commented Feb 18, 2017

@franzoni @arunhep please take note from the AlCa side as well.
Thanks.

@bendavid (Contributor) commented:

Double precision is definitely needed during training, for numerical stability of the likelihood-based loss functions. Converting back to single precision once training is done has not been explored so far.

In principle this is possible and should work, though a little new code would be needed: a constructor for the single-precision classes that takes the double-precision version as input.
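A minimal sketch of what such a converting constructor could look like. The class and member names below (GBRTree/GBRTreeD, cutVals, responses, and so on) are illustrative stand-ins, not the actual CondFormats layout:

```cpp
#include <vector>

// Hypothetical double-precision tree as produced by the training.
struct GBRTreeD {
  std::vector<int> left, right;    // child indices (negative -> leaf)
  std::vector<double> cutVals;     // split thresholds
  std::vector<double> responses;   // leaf values
};

// Hypothetical single-precision tree for the stored payload.
struct GBRTree {
  std::vector<int> left, right;
  std::vector<float> cutVals;
  std::vector<float> responses;

  // Converting constructor: narrow the trained double-precision tree to float.
  explicit GBRTree(const GBRTreeD& t)
      : left(t.left),
        right(t.right),
        cutVals(t.cutVals.begin(), t.cutVals.end()),        // double -> float
        responses(t.responses.begin(), t.responses.end()) {}
};

struct GBRForestD {
  std::vector<GBRTreeD> trees;
  double initialResponse = 0;
};

struct GBRForest {
  std::vector<GBRTree> trees;
  float initialResponse = 0;

  // Narrow a trained double-precision forest tree by tree.
  explicit GBRForest(const GBRForestD& f)
      : initialResponse(static_cast<float>(f.initialResponse)) {
    trees.reserve(f.trees.size());
    for (const auto& t : f.trees)
      trees.emplace_back(t);
  }
};
```

Since evaluation only compares inputs against the stored thresholds, the narrowing can change an output only for inputs that fall between a threshold's float and double representations (plus small rounding of the leaf responses), which is what one would want to validate before switching the payloads.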
