
Add performance metrics in report.py #515

Merged
merged 17 commits from test-case into master
Dec 5, 2023
Conversation

@alvalentini (Member) commented Nov 7, 2023

No description provided.

codecov-commenter commented Nov 7, 2023

Codecov Report

Attention: 30 lines in your changes are missing coverage. Please review.

Comparison is base (a6dacbd) 84.92% compared to head (14f9651) 84.91%.

Files                                                   Patch %   Lines
...ed_planning/grpc/generated/unified_planning_pb2.py     4.54%   21 Missing ⚠️
unified_planning/io/ma_pddl_writer.py                     0.00%    8 Missing ⚠️
unified_planning/grpc/proto_reader.py                    87.50%    1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #515      +/-   ##
==========================================
- Coverage   84.92%   84.91%   -0.02%     
==========================================
  Files         200      200              
  Lines       26383    26401      +18     
==========================================
+ Hits        22407    22419      +12     
- Misses       3976     3982       +6     

☔ View full report in Codecov by Sentry.

@Framba-Luca (Contributor) left a comment


LGTM! I just left a very minor question.

unified_planning/grpc/proto_reader.py (2 resolved review threads)
@arbimo (Member) left a comment


The code looks great and I just pinpointed two minor things.

However, I am wondering whether the output is appropriate as it is.
Keeping in mind that the report is primarily intended for engine developers to check their integration, the new Ok(<1s) field in the output is neither very helpful for them nor very informative in general (it checks some pretty ad hoc rules).

By default, I would suggest adding only the internal engine time (if available), which can be really helpful in identifying performance problems on the UP or engine side. Something like this, maybe:

runtime_report = "{:.3f}s ({:.3f}s)".format(total_time, internal_time).ljust(30)

If the additional field is needed for the evaluation, its output can be activated by a command line option or environment variable.
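A minimal sketch of the two suggestions above, combining the quoted format string with an environment-variable gate; `total_time`, `internal_time`, and the `UP_EVALUATION_REPORT` variable name are assumptions for illustration, not names from the actual `report.py`:

```python
import os


def format_runtime(total_time: float, internal_time: float) -> str:
    # Total wall-clock time with the engine's internal time in parentheses,
    # left-justified to a fixed width so report columns stay aligned.
    return "{:.3f}s ({:.3f}s)".format(total_time, internal_time).ljust(30)


def show_evaluation_columns() -> bool:
    # Hypothetical gate: only emit the extra Ok(<1s) field when the
    # evaluation report is explicitly requested.
    return os.environ.get("UP_EVALUATION_REPORT", "0") == "1"


print(repr(format_runtime(1.23456, 0.98765)))  # "1.235s (0.988s)" padded to width 30
```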

up_test_cases/builtin/classical/tpp/__init__.py (outdated, resolved)
up_test_cases/report.py (outdated, resolved)
@alvalentini (Member, Author)

Thanks @arbimo for the feedback! I agree with your comment, and I added a command line option to print the info for the evaluation report needed in the deliverable!
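The command line option described above could be wired up roughly as follows; the `--evaluation` flag name is a hypothetical placeholder, since the actual option added in the PR is not shown in this conversation:

```python
import argparse

# Hypothetical argparse sketch of gating the extra performance columns
# behind a command line option, as discussed in the review.
parser = argparse.ArgumentParser(description="Run the engine test-case report")
parser.add_argument(
    "--evaluation",
    action="store_true",
    help="also print the performance fields needed for the evaluation report",
)

# Simulate invoking the script with the flag set.
args = parser.parse_args(["--evaluation"])
print(args.evaluation)  # True
```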

@arbimo arbimo merged commit 59bacdf into master Dec 5, 2023
8 checks passed
@arbimo arbimo deleted the test-case branch December 5, 2023 08:42