
add args decorator for fitting and loss #2710

Merged: 2 commits merged into deepmodeling:devel on Aug 5, 2023
Conversation

ChiahsinChu (Contributor)

Add fitting_args_plugin and loss_args_plugin to deepmd.utils.argcheck. With these decorators, new parameters for fitting and loss can be defined in external packages.
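As a rough sketch (not taken from the PR itself), an external package might register extra fitting arguments along the following lines, assuming fitting_args_plugin.register(name) returns a decorator over a function that yields a list of dargs.Argument entries, mirroring how the built-in fittings are declared. The fitting name "my_fitting" and the parameter "my_scale" below are hypothetical.

```python
# Sketch only: register extra arguments for a hypothetical external fitting net.
# Assumes fitting_args_plugin.register(name) decorates a function returning a
# list of dargs.Argument objects, similar to the built-in fitting definitions.
from typing import List

from dargs import Argument

from deepmd.utils.argcheck import fitting_args_plugin


@fitting_args_plugin.register("my_fitting")  # "my_fitting" is a hypothetical name
def my_fitting_args() -> List[Argument]:
    return [
        Argument(
            "neuron",
            list,
            optional=True,
            default=[120, 120, 120],
            doc="Sizes of the hidden layers of the fitting net.",
        ),
        Argument(
            "my_scale",
            float,
            optional=True,
            default=1.0,
            doc="Hypothetical scaling factor consumed by the external fitting.",
        ),
    ]
```

Once registered, the extra arguments would presumably be picked up when argcheck assembles the fitting variants, so they are documented and validated alongside the built-in parameters.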

github-actions bot added the Python label on Aug 1, 2023
codecov bot commented on Aug 1, 2023

Codecov Report

Patch coverage: 100.00% and no project coverage change.

Comparison is base (4fa54ec) 78.64% compared to head (01667da) 78.64%.

Additional details and impacted files
@@           Coverage Diff           @@
##            devel    #2710   +/-   ##
=======================================
  Coverage   78.64%   78.64%           
=======================================
  Files         239      239           
  Lines       25465    25475   +10     
  Branches     1517     1517           
=======================================
+ Hits        20028    20036    +8     
- Misses       5045     5047    +2     
  Partials      392      392           
Files Changed              Coverage Δ
deepmd/utils/argcheck.py   95.89% <100.00%> (+0.07%) ⬆️

... and 1 file with indirect coverage changes


wanghan-iapcm merged commit 9391e34 into deepmodeling:devel on Aug 5, 2023
39 checks passed
Labels: Python
Projects: none
Issues closed by this pull request: none
3 participants