
Measure performance in logs #123

Merged: pgrill merged 7 commits from Measure-Performance-In-Logs into galliot-us:master on Jan 14, 2021

Conversation

@pgrill pgrill (Collaborator) commented Jan 12, 2021

No description provided.

Inline review comments on README.md and libs/cv_engine.py (outdated, resolved)
@mats-claassen mats-claassen (Contributor) left a comment

Just thinking whether it would be good to store these metrics in a file. We could name it -.txt. I think you will want to compare different models and/or devices and if they are just printed to the console then you will have to copy them manually somewhere to compare. Thoughts?
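A minimal sketch of what this suggestion could look like, assuming a small Python helper; the function name `append_metrics`, the output directory, and the example metric keys are hypothetical and are not part of this PR:

```python
# Illustrative only: append one line of metrics per run to a file named
# after the model and device, so runs are easy to compare side by side.
import os

def append_metrics(model_name, device_name, metrics, out_dir="logs"):
    """Append a single "key=value" metrics line to a per-model/device text file."""
    os.makedirs(out_dir, exist_ok=True)
    path = os.path.join(out_dir, f"{model_name}-{device_name}.txt")
    line = " ".join(f"{key}={value}" for key, value in metrics.items())
    with open(path, "a") as f:
        f.write(line + "\n")

# Example (hypothetical values):
# append_metrics("mobilenet_ssd", "jetson_nano", {"fps": 21.4, "latency_ms": 46.7})
```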

Inline review comments on README.md and libs/cv_engine.py (outdated, resolved)
@pgrill pgrill (Collaborator, Author) commented Jan 13, 2021

> Just thinking whether it would be good to store these metrics in a file. We could name it -.txt. I think you will want to compare different models and/or devices and if they are just printed to the console then you will have to copy them manually somewhere to compare. Thoughts?

Yes, I thought about storing it in a file. My only concern was that writing to a file could affect processor performance because of the file access. But I can implement a short version and test it.
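One hedged way to address that concern is to keep file access off the per-frame path by buffering rows in memory and flushing them in batches. The class below is only an illustrative sketch; its name, fields, and flush interval are assumptions, not this PR's code:

```python
import time

class BufferedMetricsLogger:
    """Buffer metric rows in memory and write them in batches,
    keeping file access off the per-frame processing path."""

    def __init__(self, path, flush_every=100):
        self.path = path
        self.flush_every = flush_every
        self.buffer = []

    def log(self, fps, latency_ms):
        # Record one row per processed frame; only every `flush_every`
        # calls actually touch the filesystem.
        self.buffer.append(
            f"{time.time():.3f} fps={fps:.2f} latency_ms={latency_ms:.2f}"
        )
        if len(self.buffer) >= self.flush_every:
            self.flush()

    def flush(self):
        if not self.buffer:
            return
        with open(self.path, "a") as f:
            f.write("\n".join(self.buffer) + "\n")
        self.buffer.clear()

# Hypothetical usage:
# logger = BufferedMetricsLogger("performance.log")
# logger.log(fps=22.1, latency_ms=45.3)  # once per processed frame
# logger.flush()                         # once when processing stops
```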

@pgrill pgrill force-pushed the Measure-Performance-In-Logs branch from 258f48a to 496fa11 on January 14, 2021 at 11:50
@pgrill pgrill (Collaborator, Author) commented Jan 14, 2021

> Just thinking whether it would be good to store these metrics in a file. We could name it -.txt. I think you will want to compare different models and/or devices and if they are just printed to the console then you will have to copy them manually somewhere to compare. Thoughts?

> Yes, I thought about storing it in a file. My only concern was that writing to a file could affect processor performance because of the file access. But I can implement a short version and test it.

I have now added the possibility of storing the performance metrics logs in a CSV file.
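For reference, a minimal sketch of CSV-based metrics logging with Python's standard csv module; the column names and file path here are assumptions and may not match the schema actually added in this PR:

```python
import csv
import os

# Hypothetical column names; the PR's actual CSV schema may differ.
FIELDNAMES = ["timestamp", "model", "device", "fps", "latency_ms"]

def log_metrics_csv(row, path="performance_metrics.csv"):
    """Append one metrics row, writing the header only when the file is new."""
    write_header = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
        if write_header:
            writer.writeheader()
        writer.writerow(row)

# Example (hypothetical values):
# log_metrics_csv({"timestamp": "2021-01-14T11:50:00", "model": "mobilenet_ssd",
#                  "device": "x86", "fps": 21.4, "latency_ms": 46.7})
```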

@pgrill pgrill merged commit bd3ae08 into galliot-us:master Jan 14, 2021
@pgrill pgrill deleted the Measure-Performance-In-Logs branch January 14, 2021 16:37