Inconsistent parsing of longitudes across CPU architectures #3392
Labels: Type: Bug (Something is not working like it should)
What went wrong?
Hello, thank you once again for MetPy.
I'm seeing some strangeness when parsing GRIB2 files across architectures. I'm not sure what the expected behaviour should be, or how to make it consistent.
For example, let's say I have the following test file, with an example GRIB2 file I found to help demonstrate:
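Roughly, the script looks like this (a minimal sketch rather than the actual attachment; the file name and the `longitude` coordinate name are placeholders):

```python
# Minimal sketch of the test script (hypothetical: file name and
# coordinate name are placeholders, not the original attachment).
import xarray as xr
import metpy.xarray  # noqa: F401 -- registers the .metpy accessor

# Open a GRIB2 file; cfgrib is one common engine for this.
ds = xr.open_dataset('example.grib2', engine='cfgrib')
ds = ds.metpy.parse_cf()

# Print the longitude coordinate so values can be compared across machines.
print(ds['longitude'].values)
```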
Output on an Apple M2:
Output on an x86_64:
Of the three examples, the third produces the same longitude array on both machines.
The first array, however, differs:
M2:
x86_64:
The second array also differs:
M2:
x86_64:
Which output is expected to be the "correct" one?
This difference results in very strange behaviour on the M2 (and on a Raspberry Pi, which is also ARM-based) when trying to animate over the dateline (e.g., slowly moving the longitude center from west to east while incrementing time to create an animation). All of the image data appears to shift 1 degree to the right at the point where the data flips. It seems fine on x86_64 but not on ARM-based machines.
Is there a way to ensure that the results are consistent across architectures?
Is there a way to always have MetPy parse the data so that lat/lon is -180 to 180, or always 0 to 360? For now I've been considering normalizing the longitudes myself, as in the sketch below.
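A sketch of what I mean (hedged: the `longitude` coordinate name is an assumption, and this is a workaround rather than an existing MetPy option):

```python
# Hypothetical workaround: wrap longitudes into a single convention so
# both architectures end up with identical coordinates.
import numpy as np

def to_pm180(lon):
    """Wrap longitude values into [-180, 180)."""
    return (np.asarray(lon) + 180.0) % 360.0 - 180.0

def to_0360(lon):
    """Wrap longitude values into [0, 360)."""
    return np.asarray(lon) % 360.0

# e.g., after opening the dataset:
# ds = ds.assign_coords(longitude=to_pm180(ds['longitude'])).sortby('longitude')
```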
I will attach the pipenv graph outputs below:
M2:
x86_64:
Thanks in advance for any help!
Operating System
Windows
Version
1.6
Python Version
3.11
Code to Reproduce
Errors, Traceback, and Logs
No response