Temporary fix for NaN values in flagged visibilities #408

Closed · wants to merge 2 commits

Conversation

ulricharmel (Collaborator)

No description provided.

@ratt-priv-ci (Collaborator)

Can one of the admins verify this patch?

@JSKenyon (Collaborator)

ok to test

@JSKenyon (Collaborator)

@ulricharmel Out of interest, are the NaN values in the weight column of the measurement set?

@ulricharmel (Collaborator, Author)

Hi @JSKenyon, sorry for the late reply. Those visibilities are flagged and their weights are zero. I did not check the measurement set itself. Maybe those weights were set to zero by the data handler because they are flagged.

@JSKenyon (Collaborator)

The reason I ask is that this line cannot produce NaN values unless there are already NaN values in its inputs. I guess my question is: in which array are the NaN values living?

@ulricharmel (Collaborator, Author) commented Aug 14, 2020

I checked it interactively with pdb. The weights were zero, but the values were nan+nanj in the off-diagonal entries. The multiplication line gives a warning about an invalid value encountered in multiplication. I tested it, and the output of the multiplication is nan+nanj. I also tried multiplying those values by zero and I got nan+nanj with the warning.
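
A minimal sketch (plain NumPy, not the CubiCal code in question) reproducing this behaviour: multiplying a NaN visibility by a zero weight still yields NaN, so zero weights do not null NaN data.

```python
import numpy as np

vis = np.array([1.0 + 0.5j, np.nan + 1j * np.nan])  # second entry: NaN off-diagonal visibility
weights = np.zeros(2)                               # flagged => zero weight

# May emit "RuntimeWarning: invalid value encountered in multiply" on some NumPy builds.
weighted = weights * vis

print(weighted)  # [0.+0.j nan+nanj] -- the NaN survives the zero weight
```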

@JSKenyon (Collaborator)

Ok, but does self.obser_arr contain NaN values on its off-diagonals?

@ulricharmel (Collaborator, Author)

Yes, it does.

@JSKenyon (Collaborator)

@ulricharmel Ah thanks. That is odd. And are those NaN values in the data column on disk?

@o-smirnov What is the expected behaviour here? While I am sure Ulrich's solution will work, I thought that we were cleaning up dodgy input values elsewhere.

@bennahugo (Collaborator) commented Aug 14, 2020 via email

@ulricharmel (Collaborator, Author)

@JSKenyon I did not check the MS itself; I checked the data being passed to the solver for a specific time chunk.

@ulricharmel (Collaborator, Author)

@bennahugo I will upgrade to python-casacore 3.3 and test.

@bennahugo (Collaborator) commented Aug 14, 2020 via email

@o-smirnov (Collaborator)

Sounds like it's a workaround for NaN values in the input that should not have been appearing in the first place (especially from this comment). I have also made further fixes to MS reading in #424.

However, let's keep this open for the moment: looking at the code, I can't find where flagged data/model is actually nulled, except at https://github.com/ratt-ru/CubiCal/blob/master/cubical/solver.py#L108, but this ignores FL.PRIOR. So I suppose it's possible for a pre-flagged NaN value to sneak in and cause some havoc. For the sake of being defensive, we probably need to null it in the ms_tile code -- I thought I did that, but I can't find the statement!
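
For illustration, a minimal sketch of the kind of defensive nulling described above. The helper and its argument names are hypothetical, not CubiCal's actual ms_tile code; it assumes flags has the same shape as data and model.

```python
import numpy as np

def null_flagged(data, model, flags):
    """Zero data and model wherever any flag bit is set (including prior
    flags), so pre-flagged NaN values cannot leak into the solver.
    Hypothetical helper, not CubiCal's actual implementation."""
    flagged = flags != 0
    data[flagged] = 0
    model[flagged] = 0
    # Belt and braces: also null any NaNs that slipped through unflagged.
    np.nan_to_num(data, copy=False)
    np.nan_to_num(model, copy=False)
    return data, model
```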

@JSKenyon (Collaborator)

Closing for now - please reopen if this issue persists.

JSKenyon closed this Mar 16, 2021