
Roof_PROMICE, bad coordinates in AWS_latest_locations.csv #60

Closed
patrickjwright opened this issue Sep 5, 2023 · 6 comments

Comments

@patrickjwright

patrickjwright commented Sep 5, 2023

The coordinates for Roof_PROMICE are incorrect in AWS_latest_locations.csv, located here:

https://thredds.geus.dk/thredds/catalog/metadata/catalog.html

This currently shows:
Roof_PROMICE,2023-09-05 23:00:00,67.105042,49.930689,683.6

It appears these are the coordinates for KAN_L (with the sign of the longitude flipped).

Note that the coordinates for Roof_GEUS are correct:
Roof_GEUS,2023-09-05 23:00:00,55.688535,12.582198,29.0

I am not sure where this mix-up is occurring. The logic to find these positions is here in csv2bufr.py:
https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/e4ff5edb8436da0d51f313739724f9f6e51d46ef/src/pypromice/postprocess/csv2bufr.py#L367

@patrickjwright
Author

Also, please let me know if metadata issues are supposed to go elsewhere besides PROMICE-AWS-data-issues.

@BaptisteVandecrux
Member

Hi!
It is due to a mixed-up IMEI number between KAN_Lv3 and Roof_GEUS.

I sent the info to Robert for correction.

By the way, this shows the usefulness of raising a warning when newly transmitted coordinates are very far from the past ones. Here the Roof_GEUS data are not sent to DMI, but if this had happened at another station, it should raise a flag and send alert emails.
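A minimal sketch of such a check, assuming only that previous and new positions are available as latitude/longitude pairs (the function names and the 100 km threshold are illustrative, not part of pypromice):

```python
# Hypothetical sketch: flag a station whose newly transmitted position
# jumps far from its last known position (possible IMEI/logger mix-up).
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def check_position_jump(prev, new, max_km=100.0):
    """Return (flag, message); flag is True if the move is suspiciously large."""
    d = haversine_km(prev[0], prev[1], new[0], new[1])
    if d > max_km:
        return True, f"Position jumped {d:.0f} km; possible IMEI/logger mix-up"
    return False, ""

# Roof_GEUS (Copenhagen) vs the KAN_L-like coordinates seen in this issue
flag, msg = check_position_jump((55.688535, 12.582198), (67.105042, 49.930689))
```

With the coordinates from this issue the check would trip immediately, since the reported position is thousands of kilometres from the station's previous one.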

@BaptisteVandecrux
Member

The toml files for Roof_PROMICE and KAN_Lv3 are now fine.

However, the coordinates of Roof_PROMICE are still wrong: although no new data is appended to the Roof_PROMICE transmission file, past transmissions from KAN_Lv3 remain in it. I have seen this happen on other occasions, when modems and loggers were moved around.

Until now, the solution was for someone to open the transmission file of Roof_PROMICE and remove the rows corresponding to KAN_Lv3. It would be much more reliable if pypromice could check (and crop if necessary) the appended transmission files (in this case AWS-L0/tx/_.txt) so that the content of each file only corresponds to the time period described in the toml file.
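The suggested safeguard could look roughly like this. Everything here is an assumption for illustration: pypromice's actual tx-file schema and toml fields may differ, and the helper names (`crop_tx_lines`, `parse_first_field`) are hypothetical:

```python
# Hypothetical sketch: keep only the transmission rows whose timestamp
# falls within the station period declared in the toml config, dropping
# leftovers from a previous logger/modem assignment.
from datetime import datetime

def parse_first_field(line):
    """Assume the timestamp is the first comma-separated field; None if unparseable."""
    try:
        return datetime.strptime(line.split(",")[0], "%Y-%m-%d %H:%M:%S")
    except ValueError:
        return None

def crop_tx_lines(lines, start, end, time_parser=parse_first_field):
    """Keep lines whose timestamp lies within [start, end] (or has no timestamp)."""
    kept = []
    for line in lines:
        t = time_parser(line)
        if t is None or start <= t <= end:
            kept.append(line)
    return kept

lines = [
    "2023-06-01 12:00:00,67.105042,-49.930689",  # KAN_Lv3 leftover
    "2023-09-05 23:00:00,55.688535,12.582198",   # within the station period
]
# Period that the toml file would declare for this station (illustrative dates)
kept = crop_tx_lines(lines, datetime(2023, 8, 1), datetime(2023, 12, 31))
```

Running the crop at ingest time (rather than by hand-editing files) would make the fix repeatable whenever modems are swapped between stations.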

I am leaving the issue open until the proper "latest" coordinates for Roof_PROMICE are recovered.

@BaptisteVandecrux
Member

> Also, please let me know if metadata issues are supposed to go elsewhere besides PROMICE-AWS-data-issues.

I feel that any issue that is also visible to data users should be reported here, at least to indicate that we know about the issue and are working on it. So all good!

@BaptisteVandecrux
Member

I went ahead and removed the KAN_Lv3 transmissions from the Roof_PROMICE transmission file.
I noticed that some of these transmissions were even missing from the KAN_Lv3 transmission file. So I pasted these rows at the top of the file.
I reprocessed both stations:

  • The latest position of Roof_PROMICE is back to normal, but note that the data from that AWS is not distributed on THREDDS nor Dataverse. Maybe it should be removed from AWS_latest_locations.csv.
  • The added transmissions are visible at KAN_Lv3 (attached plot: KAN_Lv3_0).

I'll close this after the next release on dataverse.

@BaptisteVandecrux
Member

Correct coordinates for Roof_PROMICE are recovered in AWS_latest_locations.csv in Dataverse release V13.
