add init-frz-model support for se-t type descriptor #1245

Merged: 1 commit into deepmodeling:devel on Oct 28, 2021

Conversation

denghuilu (Member)

Add init-frz-model support for se-t type descriptors.

lcurve.out shows that, at the first step, the compression training run for the se_t descriptor gives the same result as the original training run:

root se_e3 $ head compress.out 
#  step      rmse_val    rmse_trn    rmse_e_val  rmse_e_trn    rmse_f_val  rmse_f_trn         lr
      0      2.16e+01    1.86e+01      1.91e-02    1.30e-02      6.83e-01    5.89e-01    1.0e-03
     10      2.11e+01    2.23e+01      4.25e-01    4.27e-01      6.67e-01    7.06e-01    1.0e-03
     20      2.13e+01    2.16e+01      2.11e-02    2.04e-02      6.73e-01    6.84e-01    1.0e-03
     30      2.13e+01    2.17e+01      3.94e-02    3.30e-02      6.72e-01    6.88e-01    1.0e-03
     40      1.99e+01    2.23e+01      9.17e-02    8.79e-02      6.30e-01    7.05e-01    1.0e-03
     50      2.13e+01    2.05e+01      3.32e-02    2.76e-02      6.74e-01    6.49e-01    1.0e-03
     60      2.14e+01    2.35e+01      5.49e-03    9.87e-03      6.77e-01    7.42e-01    1.0e-03
     70      2.15e+01    1.89e+01      4.85e-02    5.44e-02      6.79e-01    5.96e-01    1.0e-03
     80      2.23e+01    2.08e+01      2.84e-02    3.15e-02      7.05e-01    6.58e-01    1.0e-03
root se_e3 $ head original.out 
#  step      rmse_val    rmse_trn    rmse_e_val  rmse_e_trn    rmse_f_val  rmse_f_trn         lr
      0      2.16e+01    1.86e+01      1.91e-02    1.30e-02      6.83e-01    5.89e-01    1.0e-03
     10      2.11e+01    2.23e+01      4.27e-01    4.28e-01      6.68e-01    7.06e-01    1.0e-03
     20      2.13e+01    2.16e+01      1.91e-02    1.84e-02      6.72e-01    6.84e-01    1.0e-03
     30      2.13e+01    2.17e+01      3.76e-02    3.12e-02      6.72e-01    6.88e-01    1.0e-03
     40      1.99e+01    2.23e+01      9.20e-02    8.82e-02      6.30e-01    7.05e-01    1.0e-03
     50      2.13e+01    2.05e+01      3.11e-02    2.55e-02      6.74e-01    6.49e-01    1.0e-03
     60      2.14e+01    2.34e+01      6.78e-03    1.12e-02      6.76e-01    7.41e-01    1.0e-03
     70      2.15e+01    1.88e+01      4.86e-02    5.44e-02      6.79e-01    5.96e-01    1.0e-03
     80      2.23e+01    2.08e+01      2.74e-02    3.05e-02      7.04e-01    6.58e-01    1.0e-03
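The comparison above can be checked numerically. Below is a minimal Python sketch, assuming the second run was restarted from a frozen model via dp train's --init-frz-model option (which this PR enables for se_t) and that both files use the whitespace-separated lcurve.out layout shown above; the file names mirror the listing and are otherwise illustrative:

import numpy as np

# Columns, as in the header above:
# step  rmse_val  rmse_trn  rmse_e_val  rmse_e_trn  rmse_f_val  rmse_f_trn  lr
# np.loadtxt skips the '#' header line by default.
compress = np.loadtxt("compress.out")
original = np.loadtxt("original.out")

# Both runs start from identical parameters, so the step-0 rows should
# match exactly; later steps drift slightly (as in the listing above),
# presumably because the tabulated embedding only approximates the
# original network.
assert np.allclose(compress[0], original[0]), "step-0 rows differ"

# Largest relative deviation over all logged steps, as a rough
# agreement metric between the two runs.
rel = np.abs(compress - original) / np.maximum(np.abs(original), 1e-12)
print(f"max relative deviation: {rel.max():.3e}")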

codecov-commenter commented Oct 27, 2021

Codecov Report

Merging #1245 (023fc6c) into devel (1a8fd73) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##            devel    #1245   +/-   ##
=======================================
  Coverage   75.99%   75.99%           
=======================================
  Files          91       91           
  Lines        7389     7389           
=======================================
  Hits         5615     5615           
  Misses       1774     1774           


amcadmus merged commit edb8bd9 into deepmodeling:devel on Oct 28, 2021
njzjz pushed a commit to njzjz/deepmd-kit that referenced this pull request Sep 21, 2023
fix for this discussion

deepmodeling/dpgen#1237 (reply in thread)

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Han Wang <92130845+wanghan-iapcm@users.noreply.github.com>