
Made some methods const #174

Merged 21 commits into master on Apr 5, 2022
Conversation

@lobis (Member) commented Apr 1, 2022


Changes required for rest-for-physics/geant4lib#41.

Some methods are made const; usage remains the same.

I also noticed a bug in e6da8d7; perhaps the same bug is present elsewhere.

@lobis (Member Author) commented Apr 1, 2022

The pipeline seems to fail with an odd error related to observables in the tree... I will look into it, but I have no idea what it could be. Has anyone seen this before, @rest-for-physics/core_dev? Other libraries related to this PR pass the pipeline without issues.

@jgalan (Member) commented Apr 1, 2022

I have seen that before: if something changes the observables obtained now with respect to the ones previously registered, the pipeline will not succeed. However, I cannot give you any clue, since the reasons could be completely unconnected.

- Int_t TRestHits::GetMostEnergeticHitInRange(Int_t n, Int_t m) {
-     Int_t maxEn = 0;
+ Int_t TRestHits::GetMostEnergeticHitInRange(Int_t n, Int_t m) const {
+     double maxEnergy = 0;
Member commented:

This can also change the pipeline behaviour, but it is indeed a bug, since the energy should be a double.

Member Author commented:

Yes, this is the only thing I can think of... but I don't think this method is used, right? Hopefully it's not used anywhere, since it was bugged.

Member commented:

jgalan@sultan:~/rest-framework/source$ grep -rnw . -e "GetMostEnergeticHitInRange"
./libraries/track/src/TRestTrackBlobAnalysisProcess.cxx:215:        Int_t hit1 = hits->GetMostEnergeticHitInRange(0, nCheck);
./libraries/track/src/TRestTrackBlobAnalysisProcess.cxx:216:        Int_t hit2 = hits->GetMostEnergeticHitInRange(nHits - nCheck, nHits);

@juanangp (Member) commented Apr 1, 2022

> The pipeline seems to fail with an odd error related to observables in the tree... I will look into it, but I have no idea what it could be. Has anyone seen this before, @rest-for-physics/core_dev? Other libraries related to this PR pass the pipeline without issues.

I have seen this as well; in my case the issue was in code I had introduced. Note that if you push to a single library, it doesn't trigger the framework pipeline, so it is very difficult to track down the failure.

@juanangp (Member) commented Apr 1, 2022

Looking at the pipeline, there is something wrong with the hits (left: failure; right: success).

[image]

@lobis (Member Author) commented Apr 1, 2022

> Looking at the pipeline, there is something wrong with the hits (left: failure; right: success).
>
> [image]

I added the bug back, let's see if it gets fixed.

@juanangp (Member) commented Apr 1, 2022

I think I found something odd in rest-for-physics/connectorslib#10, perhaps related to the pipeline failure...

@lobis (Member Author) commented Apr 1, 2022

> I think I found something odd in rest-for-physics/connectorslib#10, perhaps related to the pipeline failure...

I made some fixes, but the pipeline still fails...

@juanangp (Member) commented Apr 1, 2022

> I made some fixes, but the pipeline still fails...

There is something nasty going on, because the number of processed events is zero, which means the error is different from before. Either a bug has been introduced with the new changes, or it exists elsewhere.

@juanangp (Member) commented Apr 3, 2022

I just checked out this branch and realized that the pipeline is extremely slow. Check below what I get when comparing with an older branch:

[image]

It doesn't seem related to a particular process; all processes are slow. Do you know of anything recently introduced that could explain this time difference? I guess it should be in the framework...

On the other hand, it would be good to check the running time in the pipeline validation to catch these kinds of issues.

@jgalan (Member) commented Apr 3, 2022

In the end, what was making the pipeline fail?

Coming back to the processing time: perhaps we should add a validation pipeline checking that the processing time of a reference processing chain does not exceed a certain value. But I am not sure it can be guaranteed to be comparable on each runner.

@juanangp (Member) commented Apr 3, 2022

The test should be done on the same local machine; I imagine the load on the server at a given time could affect the result.

I am running on sultan, so I think the issue is isolated to the code and not to the server.

@juanangp (Member) commented Apr 3, 2022

> In the end, what was making the pipeline fail?

Apparently, we introduced another bug in rest-for-physics/connectorslib#10, which has to be fixed.

> Coming back to the processing time: perhaps we should add a validation pipeline checking that the processing time of a reference processing chain does not exceed a certain value. But I am not sure it can be guaranteed to be comparable on each runner.

Yes, I was thinking of checking the CPU time, which I think is what the image I posted before shows. Could this info be added as metadata or something similar?

@jgalan (Member) commented Apr 3, 2022

> Yes, I was thinking of checking the CPU time, which I think is what the image I posted before shows. Could this info be added as metadata or something similar?

I imagine we could add a new metadata member, TRestEventProcess::fAverageProcessingTime, so that every process registers the time required for its own execution.

@lobis (Member Author) commented Apr 4, 2022

> In the end, what was making the pipeline fail?

I have no idea; I made some changes to the pipeline scripts to get better debugging output, and suddenly it worked.

However, I made the MakeBasicTree macro print the number of entries at the end, and I get:

Finished `MakeBasicTree.C` macro. Number of entries: 0
(int) 0

So I think this is not good.

@lobis added the "help wanted" label (Extra attention is needed) on Apr 4, 2022

Review comment on pipeline/ValidateTrees.C (outdated, resolved)
@lobis (Member Author) commented Apr 4, 2022

Looks like 53d73ec finally fixed the pipeline problems, thanks @juanangp.

So regarding that, this and all related PRs can be merged.

However, the problem with the speed... I can't see how this PR could have anything to do with that, especially since it happens on all kinds of analysis processes, not only those that would be affected by this PR. What do you think about merging, @jgalan @juanangp?

@lobis lobis requested review from jgalan and juanangp April 4, 2022 09:39
@lobis (Member Author) commented Apr 4, 2022

@juanangp and I have checked together whether these changes produce any performance change and haven't been able to reproduce the slowdown; we made a clean install using these changes (lobis-add-const-to-getters) on all corresponding branches. Our conclusion is that this PR does not introduce any performance overhead, so it can be merged.

@lobis lobis requested a review from a team April 5, 2022 11:03
@lobis lobis requested a review from nkx111 April 5, 2022 11:10
@lobis lobis merged commit 99a43c8 into master Apr 5, 2022
@lobis lobis deleted the lobis-add-const-to-getters branch April 5, 2022 11:26
Labels: help wanted (Extra attention is needed)

5 participants