Conversation

@ftouchte (Collaborator)

New features:

  • Create entries to store the calibrated ADC and ToT in AHDC::hits
  • The raw ADC is still stored in the Hit object; it will be used in the Kalman filter
  • Remove residual_prefit

@c-dilks c-dilks added the rg-l label Dec 17, 2025
@ftouchte (Collaborator, author)

Do you know why these tests fail?

@c-dilks (Member) commented Dec 17, 2025

Do you know why these tests fail?

There's an assertEquals in AHDCTest.java that is failing: https://github.com/JeffersonLab/coatjava/actions/runs/20317024485/job/58363275431?pr=1014#step:5:3285

@ftouchte (Collaborator, author)

Is it possible that it is because the structure of the bank has changed, and that we need to provide a new file for the test?

@c-dilks (Member) commented Dec 18, 2025

Possibly. It looks like you added new columns to AHDC::hits, but the assertion fails on the DataBank.rows() call, that is, on the number of rows, not the number of columns.

@baltzell, @raffaelladevita, do you know how to fix this?

@ftouchte (Collaborator, author) commented Dec 18, 2025

For me, the test is good. I didn't add new cuts, so we should have the same number of rows. My guess is that there is a wrong memory access that makes the values of AHDC::hits.time incorrect, so that the cuts in reconstruction/alert/src/main/java/org/jlab/rec/ahdc/Hit/Hit.java behave unexpectedly. (Likely related to the fact that residual_prefit no longer exists in the new schema.)

Now: [screenshot of AHDC::hits values]

Before: [screenshot of AHDC::hits values]
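An editorial aside on the "wrong memory access" hypothesis: if one side drops a column such as residual_prefit from its schema while the other side still packs it, every subsequent value in the row is shifted. A minimal, self-contained Java sketch of this effect (invented field names and values; this is not the hipo bank format):

```java
import java.nio.ByteBuffer;

// Illustrative only, NOT the hipo API: a reader whose column ordering differs
// from the writer's reads the wrong bytes for every field after the mismatch.
public class SchemaMismatchSketch {
    public static void main(String[] args) {
        // Writer packs a row with the OLD layout: time, residual_prefit, doca
        ByteBuffer row = ByteBuffer.allocate(12);
        row.putFloat(120.5f);   // time (hypothetical value)
        row.putFloat(0.03f);    // residual_prefit (removed in the new schema)
        row.putFloat(1.7f);     // doca (hypothetical value)
        row.flip();

        // Reader assumes the NEW layout: time, doca
        float time = row.getFloat();
        float doca = row.getFloat(); // actually reads residual_prefit's bytes
        System.out.println(time);
        System.out.println(doca);    // not 1.7: the value is silently wrong
    }
}
```

The values are not random garbage but come from a neighboring column, which matches the observation later in the thread that the corrupted values were "corrupted but not random".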

@baltzell (Collaborator)

Seems the software changes changed the resulting data. If that's what's wanted, then the AHDC test either needs updating or disabling.

@ftouchte (Collaborator, author)

Nothing should have changed. I ran some tests and got the same distributions (e.g. residuals, correlations).

The new values are clearly bad. Most of them are random values. Can we do a hipo-utils -update on the test file?

@baltzell (Collaborator)

The test failing is only due to software changes.

@ftouchte (Collaborator, author)

@baltzell I will update the test. Is it possible to run a specific unit test locally? I would like to only run the AHDC one.

./build-coatjava.sh --unittests

@c-dilks (Member) commented Dec 18, 2025

Is it possible to run a specific unit test locally? I would like to only run the AHDC one.

Yes, unit tests run by default when using mvn install; to restrict to the alert reconstruction module:

mvn -f reconstruction/alert install

@baltzell (Collaborator)

Looks like that test also comes with a main method, so you can easily run it from your IDE.

@ftouchte (Collaborator, author)

I did some investigation. My starting point was that the test failed because of this pull request. In the end, I found some interesting things.

  1. I went back a few commits (e.g. to "Improvements in the Kalman Filter for the AHDC" (#1010)). The test ran well. However, it masked some issues. Indeed, I downloaded the test file (etc/data/test/ahdc.hipo) and ran recon-util. I got these errors:
[bank] Error (PUT) :: ( AHDC::adc) the entry ped has type Float while requested SHORT
[bank] Error (PUT) :: ( AHDC::adc) the entry wfType has type Short while requested UNDEFINED
[bank] Error (PUT) :: ( AHDC::adc) the entry ped has type Float while requested SHORT
[bank] Error (PUT) :: ( AHDC::adc) the entry wfType has type Short while requested UNDEFINED
[bank] Error (PUT) :: ( AHDC::adc) the entry ped has type Float while requested SHORT
[bank] Error (PUT) :: ( AHDC::adc) the entry wfType has type Short while requested UNDEFINED
[bank] Error (PUT) :: ( AHDC::adc) the entry ped has type Float while requested SHORT
[bank] Error (PUT) :: ( AHDC::adc) the entry wfType has type Short while requested UNDEFINED
[bank] Error (PUT) :: ( AHDC::adc) the entry ped has type Float while requested SHORT

That is proof that we have had a mismatch between the dictionaries for a very long time. At least the test didn't fail, and we reconstructed a track.

  2. When we run recon-util with the modifications of this pull request, we get similar errors, but the values in AHDC::hits are corrupted and the test fails.

  3. I wanted to get rid of these errors, so I ran:
    hipo-utils -update -d /home/touchte-codjo/Desktop/coatjava/coatjava/etc/bankdefs/hipo4/singles/mon -o updated-ahdc.hipo ahdc.hipo

I re-ran recon-util. That time I got no errors, but the data were still corrupted. Actually, the values that we obtain are corrupted but not random.

  4. Next, I removed the existing banks in AHDCEngine before appending the new ones:
event.removeBanks("AHDC::hits","AHDC::preclusters","AHDC::clusters","AHDC::track","AHDC::kftrack","AHDC::mc","AHDC::ai:prediction");
event.appendBank(recoHitsBank);
event.appendBank(recoPreClusterBank);
event.appendBank(recoClusterBank);
event.appendBank(recoTracksBank);
event.appendBank(recoKFTracksBank);
event.appendBank(recoInterClusterBank);

That time, I had no errors and the values in AHDC::hits were no longer corrupted. The test failed again, but this time we had more hits: 28 instead of 25.

[screenshot of AHDC::hits values]

You can see that the data look much better.
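The remove-before-append fix is easy to motivate with a toy model. This is an editorial sketch (the Bank class and firstBank helper are invented, not the hipo API): if appending does not replace an existing bank of the same name, a reader that grabs the first match keeps seeing the stale copy.

```java
import java.util.ArrayList;
import java.util.List;

// Editorial sketch, NOT the hipo API: models an event as an append-only list
// of named banks to show why appending without removing leaves a reader
// looking at the stale copy.
public class StaleBankSketch {
    static class Bank {
        final String name; final int rows;
        Bank(String name, int rows) { this.name = name; this.rows = rows; }
    }

    static Bank firstBank(List<Bank> event, String name) {
        for (Bank b : event) if (b.name.equals(name)) return b;
        return null;
    }

    public static void main(String[] args) {
        List<Bank> event = new ArrayList<>();
        event.add(new Bank("AHDC::hits", 25)); // stale bank from a previous pass
        event.add(new Bank("AHDC::hits", 28)); // newly appended bank

        // A reader taking the first match still sees the stale 25-row bank
        System.out.println(firstBank(event, "AHDC::hits").rows);

        // Removing existing banks of that name before appending fixes it
        event.removeIf(b -> b.name.equals("AHDC::hits"));
        event.add(new Bank("AHDC::hits", 28));
        System.out.println(firstBank(event, "AHDC::hits").rows);
    }
}
```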

Unfortunately, this time we didn't reconstruct any track, but this has nothing to do with our issue. If you look at doca or time in AHDC::hits, you will notice that they are very big. We have almost nothing below 1 mm. I looked at RUN::config; it seems that this test file was extracted from run 21317. My guess is that the t0 for this run is not the right one.

Conclusion

The structures of the AHDC banks are different from what they were four months ago. Not removing the banks in AHDCEngine made us believe that something had changed when it had not. To notice it, it was necessary to change the structure (the ordering) of the entries in AHDC::hits (removing residual_prefit): the data were clearly corrupted.

I plan to update the test in the sense that we should always reconstruct a track, but we will need to change the test file.

What do you think of these remarks?
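As an editorial aside, the "[bank] Error (PUT) :: ... ped has type Float while requested SHORT" messages quoted above describe a type-width mismatch between two dictionaries. A self-contained Java sketch (invented byte layout, not the actual hipo encoding) of why such a mismatch garbles every field that follows:

```java
import java.nio.ByteBuffer;

// Illustrative only: a width mismatch (a Short written where a Float is
// expected) shifts every later field in the row.
public class TypeMismatchSketch {
    public static void main(String[] args) {
        ByteBuffer row = ByteBuffer.allocate(6);
        row.putShort((short) 300); // ped written as a 2-byte short...
        row.putFloat(42.0f);       // ...followed by another entry
        row.flip();

        // A reader expecting ped to be a 4-byte float consumes the short's
        // 2 bytes plus 2 bytes of the next entry: garbage out.
        float ped = row.getFloat();
        System.out.println(ped == 300.0f);
    }
}
```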

@ftouchte (Collaborator, author)

Of course, we cannot modify the test file every time we change the structure of a bank.
When we start the reconstruction from an EVIO file or from simulation, we have no issue.
We could also delete the test, but without it, we would not be able to detect these kinds of issues.

@baltzell (Collaborator)

Well, there are still only two choices: update the test to pass with the new software, or disable it.

@baltzell baltzell merged commit 696732b into JeffersonLab:development Dec 18, 2025
40 checks passed