
Tune magic nsb #236

Merged
merged 7 commits into master from tune_magic_nsb on Jul 18, 2024
Conversation

@jsitarek (Collaborator) commented on Jul 5, 2024

This PR adds a script to derive the missing NSB noise in the MAGIC part of the simulations, and modifies the calibration to allow adding the noise also for MAGIC.

It also modifies the way the random number generator is initialized for NSB adding: before, it was initialized for every event with obs_id as the seed, so the random numbers were the same for all events. Now we use obs_id + event number as the seed.
At the same time, the LST defaults are slightly changed (so they match those of MAGIC).
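The seeding fix described above can be sketched as follows. This is an illustrative sketch, not the actual magicctapipe code: the function name `add_nsb_noise`, its arguments, and the simple Gaussian noise model are all hypothetical.

```python
import numpy as np

def add_nsb_noise(image, obs_id, event_id, extra_noise_rms):
    """Add Gaussian extra NSB noise to a camera image (illustrative sketch)."""
    # Seed with obs_id + event_id: seeding with obs_id alone would
    # re-initialize the generator identically for every event of a run,
    # producing the same "random" noise realization for all events.
    rng = np.random.default_rng(obs_id + event_id)
    return image + rng.normal(0.0, extra_noise_rms, size=image.shape)
```

With this scheme, two events of the same run get independent noise realizations, while re-processing the same event remains reproducible.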
@Elisa-Visentin (Collaborator) commented:

Hi @jsitarek, I was checking the PR. Just a naïve question: why do we need two different NSB corrections for the two MAGIC telescopes? Is it due to a different level of simulated (DL0) background, to the different features of the telescopes, or to the (slightly) different FOVs?

@jsitarek (Collaborator, Author) commented:

Hi @Elisa-Visentin,
The code is done LST-style: you build a distribution of the noise in MC and in data and, using Gaussian approximations, you add the difference. When you apply the same procedure to M1 and M2, you will always get somewhat different results. You are right that in principle the difference should be small, but the two telescopes are not identical: in particular, the afterpulse distribution of the PMTs differs between M1 and M2, and the simulation of the electronic noise and NSB has some simplifications. From the tests I ran, the extra_noise parameters, both for dim and bright pixels, end up very similar for M1 and M2, but there is a difference in the added bias: it is very small for M2, while some bias does need to be added for M1.

@Elisa-Visentin (Collaborator) commented:

Ok, thank you!

@Elisa-Visentin (Collaborator) commented:

There seems to be a failing test due to there being no M1+M2 stereo events in the training sample. I will check it on my laptop ASAP just to be sure that everything is fine (i.e., that it is only a statistics issue).

@Elisa-Visentin (Collaborator) left a review comment:

Tests to be fixed before merging

Commit: "This is to avoid failing a test due to missing M1+M2 events"
codecov bot commented on Jul 18, 2024

Codecov Report

Attention: Patch coverage is 92.85714% with 1 line in your changes missing coverage. Please review.

Project coverage is 77.23%. Comparing base (2c7d9b4) to head (29129f3).
Report is 1 commit behind head on master.

Files                          Patch %   Lines
magicctapipe/image/calib.py    92.30%    1 Missing ⚠️

Additional details and impacted files:
@@            Coverage Diff             @@
##           master     #236      +/-   ##
==========================================
+ Coverage   77.19%   77.23%   +0.04%     
==========================================
  Files          21       21              
  Lines        2605     2614       +9     
==========================================
+ Hits         2011     2019       +8     
- Misses        594      595       +1     

☔ View full report in Codecov by Sentry.

@jsitarek jsitarek merged commit 31e6bd7 into master Jul 18, 2024
8 checks passed
@jsitarek jsitarek deleted the tune_magic_nsb branch July 18, 2024 07:53
@aleberti aleberti added the new functionality For new functionalities label Aug 7, 2024
Elisa-Visentin pushed a commit that referenced this pull request Sep 12, 2024