
Feature/calcuttj daphne new data #44

Merged 5 commits into develop on May 14, 2024

Conversation

calcuttj (Collaborator) commented May 3, 2024

Integrated new Daphne frame versions into the decoder

tomjunk (Member) commented May 6, 2024

Hi Jake, I think I see the problem here -- you need to use HDF5RawFile3Service instead of HDF5RawFile2Service. My apologies for not being consistent with 2's and 3's: this is the second DAPHNE format but the third HDF5 format, and I was a little sloppy in that the HDF5FileInfo2 data product was re-used for the HDF5RawInput3_source. HDF5RawFile3Service gives access to the upgraded HDF5RawFile class from the DAQ, which has a different-size trigger record header; using the old one will mess up your decoding.
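For illustration, retrieving the raw file through the service in the decoder might look something like the sketch below. art::ServiceHandle is the standard art mechanism, but the namespace and the GetPtr() accessor are assumptions about the dunecore service interface, not a confirmed API.

// Hedged sketch: swap the "2" service for the "3" service so the
// upgraded HDF5RawFile class (with the new trigger record header
// size) is used. dune:: and GetPtr() are assumed, not verified.
#include "dunecore/HDF5Utils/HDF5RawFile3Service.h"
#include "art/Framework/Services/Registry/ServiceHandle.h"

// Inside the decoder module, e.g. in produce():
art::ServiceHandle<dune::HDF5RawFile3Service> fileService;
auto* rawFile = fileService->GetPtr(); // assumed accessor name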

@@ -22,6 +22,8 @@

#include "detdataformats/daphne/DAPHNEFrame.hpp"
#include "detdataformats/daphne/DAPHNEStreamFrame.hpp"
#include "detdataformats/daphne/DAPHNEFrame2.hpp"
#include "detdataformats/daphne/DAPHNEStreamFrame2.hpp"
#include "dunecore/DuneObj/DUNEHDF5FileInfo2.h"
#include "dunecore/HDF5Utils/HDF5RawFile2Service.h"
tomjunk (Member) commented on this line:

2->3 on this line
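Presumably the fixed include, following the same 2-to-3 naming pattern, would read:

#include "dunecore/HDF5Utils/HDF5RawFile3Service.h"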

tomjunk (Member) commented:

and on line 452

services:
{
TimeTracker: {}
HDF5RawFile2Service: {}
tomjunk (Member) commented on this line:

and 2->3 on this line

}
}

source: @local::hdf5rawinput2
tomjunk (Member) commented:

Oh, and you'll need to use hdf5rawinput3 here for the source.
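Taken together, the suggested fcl changes would presumably leave the configuration looking like this (a sketch assembled from the review comments above, not the committed file):

services:
{
  TimeTracker: {}
  HDF5RawFile3Service: {}
}

source: @local::hdf5rawinput3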

I think I see an overall design choice that's different here. The DAQ people changed the trigger record header size, which has ripple effects through HDF5RawInputFile, the class we use to retrieve the data. I wanted to keep backward compatibility but also be able to read new data. Since I was not eager to rewrite the DAQ code in hdf5libs to do what is done here -- automatically detect the version and do the right thing -- I instead made a separate copy which includes the new headers.

Making a new copy was the right thing when we moved from artROOT to HDF5, and also from the first HDF5 format to the second, given how different they were; it was all but impossible to write a source and decoder that could understand both artROOT and HDF5, or even the two kinds of HDF5 the DAQ people wrote. This version is a much smaller update, but it is still breaking. I took the same approach -- separate out a new source, service, and decoder while retaining the old ones for backward compatibility. One has to use the right one when reading old or new data; it does not choose automatically.
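For contrast, the rejected alternative -- a single reader that sniffs the version and dispatches -- might look roughly like the sketch below. Every name in it is invented for illustration; none of this is dunecore or hdf5libs code.

// Hypothetical sketch of auto-detection, the approach NOT taken here.
// All class and function names are made up for illustration.
#include <memory>
#include <string>

struct RawFileReader { virtual ~RawFileReader() = default; };
struct RawFileReader2 : RawFileReader { explicit RawFileReader2(const std::string&) {} };
struct RawFileReader3 : RawFileReader { explicit RawFileReader3(const std::string&) {} };

// Stand-in for peeking at the trigger record header version in the file.
unsigned peekTriggerRecordHeaderVersion(const std::string& /*filename*/) { return 3; }

std::unique_ptr<RawFileReader> makeReader(const std::string& filename)
{
  // Dispatch on the detected header version instead of keeping
  // separate source/service/decoder stacks per format.
  if (peekTriggerRecordHeaderVersion(filename) >= 3)
    return std::make_unique<RawFileReader3>(filename);
  return std::make_unique<RawFileReader2>(filename);
}

The cost of this pattern, as noted above, is that every format change ripples through one shared reader; the separate-copy approach keeps each format's code path isolated.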

The DAQ group changed the trigger record header in the same overall release as the DAPHNE upgrade, so I don't think you'll ever see a mixture of DAPHNE2s and DAPHNE3s within the same HDF5 format. The TPC decoders likewise decode only one format at a time; if we need to deal with another format, we just make another decoder.

tomjunk (Member) commented May 7, 2024

A reason I like a separate decoder for each data format is what happened with ICEBERG. We've gone through no fewer than eight data formats for the TPC data, and a ninth is on its way: artdaq fragments and container fragments for RCE and FELIX (four formats), 14-bit WIB data (FELIX only), WIB spy-buffer data, the first HDF5 format, the WIBEth format within the second HDF5 format, and the upcoming third HDF5 format. The first decoder worked on the first four formats -- (RCE + FELIX) * (container frags + noncontainer frags) -- and it already got a bit messy trying to auto-detect the format and do the right thing. Then the HDF5 data looked very different, and the WIB spy-buffer data weren't even divided up into events and so had to be treated differently; I gave up trying to read it all in one place. I expect the DAQ to change data formats again in the future.

The old artDAQ fragment class in fact has several versions of its header, and it automatically upgrades them on read-in. This has been the source of some very nasty bugs that took weeks of effort to identify, diagnose, fix, review, and test. It is good when ROOT does this for us, but even ROOT gets it wrong sometimes -- and we're not using ROOT anymore. At least the framework lets us modularize code so it doesn't interfere with other code in the framework.

calcuttj (Collaborator, Author) commented May 9, 2024

@tomjunk I've refactored things a bit differently; the new structure should fix, and safeguard against, mixed-up fragments/frames etc.
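The kind of safeguard described might, for example, check a version field in the frame header before overlaying a frame class on the payload. This is a guess at the pattern, not the actual refactoring; the header layout, field names, and version-to-class mapping below are invented.

// Hedged sketch: guard the frame overlay on a header version field.
// FrameHeaderView and its layout are hypothetical, not detdataformats.
#include <cstdint>
#include <stdexcept>

struct FrameHeaderView { uint32_t version; };

void decodeDaphnePayload(const uint8_t* payload)
{
  auto const* hdr = reinterpret_cast<const FrameHeaderView*>(payload);
  switch (hdr->version) {
    case 1: /* overlay DAPHNEFrame / DAPHNEStreamFrame */ break;
    case 2: /* overlay DAPHNEFrame2 / DAPHNEStreamFrame2 */ break;
    default: throw std::runtime_error("unexpected DAPHNE frame version");
  }
}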

tomjunk (Member) commented May 10, 2024

trigger build with #44

FNALbuild (Collaborator) commented:

✔️ CI build for LArSoft Succeeded on slf7 for e26:prof -- details available through the CI dashboard

FNALbuild (Collaborator) commented:

✔️ CI build for LArSoft Succeeded on slf7 for c14:prof -- details available through the CI dashboard

FNALbuild (Collaborator) commented:

❌ CI build for DUNE Failed at phase build DUNE on slf7 for c14:prof -- details available through the CI dashboard

🚨 For more details about the failed phase, check the build DUNE phase logs

parent CI build details are available through the CI dashboard

tomjunk (Member) commented May 10, 2024

This PR created a bunch of clang warnings, which are treated as errors, plus a couple of outright errors.

https://dbweb8.fnal.gov:8443/LarCI/app/ns:dune/storage/docs/2024/05/10/buildDUNE%23spFysjd.log

tomjunk (Member) commented May 10, 2024

The gcc build succeeded, however.

FNALbuild (Collaborator) commented:

⚠️ CI build for DUNE Warning at phase ci_tests DUNE on slf7 for e26:prof -- details available through the CI dashboard

🚨 For more details about the warning phase, check the ci_tests DUNE phase logs

parent CI build details are available through the CI dashboard

calcuttj (Collaborator, Author) commented:

trigger build

calcuttj (Collaborator, Author) commented:

trigger build with #44

FNALbuild (Collaborator) commented:

✔️ CI build for LArSoft Succeeded on slf7 for e26:prof -- details available through the CI dashboard

FNALbuild (Collaborator) commented:

✔️ CI build for LArSoft Succeeded on slf7 for c14:prof -- details available through the CI dashboard

FNALbuild (Collaborator) commented:

❌ CI build for DUNE Failed at phase ci_tests DUNE on slf7 for c14:prof - ignored warnings for build -- details available through the CI dashboard

🚨 For more details about the failed phase, check the ci_tests DUNE phase logs

parent CI build details are available through the CI dashboard

FNALbuild (Collaborator) commented:

⚠️ CI build for DUNE Warning at phase ci_tests DUNE on slf7 for e26:prof - ignored warnings for build -- details available through the CI dashboard

🚨 For more details about the warning phase, check the ci_tests DUNE phase logs

parent CI build details are available through the CI dashboard

calcuttj (Collaborator, Author) commented:

@tomjunk only the clang CI tests failed; the build succeeded. Good to merge?

tomjunk (Member) left a review:

Okay, looks fine -- will try merging it.

tomjunk merged commit abebcca into develop on May 14, 2024
3 of 4 checks passed
calcuttj deleted the feature/calcuttj_daphne_new_data branch on May 16, 2024 at 21:06