To add Rprop documentation #63866

Closed
wants to merge 1 commit

Conversation

@iramazanli (Contributor) commented Aug 24, 2021

It has been discussed previously that adding descriptions of the optimization algorithms to the PyTorch core documentation could make for a nice optimization research tutorial. The tracking issue #63236 lists all the necessary algorithms together with links to the papers in which they were originally published.

In this PR we add a description of Rprop to the documentation. For more details, we refer to the paper http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.21.1417

[Image: Rprop algorithm pseudocode, as added to the docs]
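For reference, a minimal usage sketch of the optimizer whose documentation this PR extends. The toy model, data, and training loop are placeholders for illustration only and are not part of the PR; the keyword values shown are the defaults of torch.optim.Rprop.

import torch

# Toy model and data, for illustration only.
model = torch.nn.Linear(10, 1)
inputs = torch.randn(8, 10)
targets = torch.randn(8, 1)

# etas = (etaminus, etaplus); step_sizes = (min, max) bounds on the per-parameter step size.
optimizer = torch.optim.Rprop(model.parameters(), lr=0.01,
                              etas=(0.5, 1.2), step_sizes=(1e-6, 50))

for _ in range(5):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(inputs), targets)
    loss.backward()
    optimizer.step()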

@facebook-github-bot (Contributor) commented Aug 24, 2021

💊 CI failures summary and remediations

As of commit 1bf12e5 (more details on the Dr. CI page):


  • 3/3 failures introduced in this PR

🕵️ 3 new failures recognized by patterns

The following CI failures do not appear to be due to upstream breakages:

See GitHub Actions build linux-bionic-py3.6-clang9 / test (noarch, 1, 1, linux.2xlarge) (1/3)

Step: "Unknown" (full log | diagnosis details | 🔁 rerun)

2021-09-09T22:09:00.7492249Z CONTINUE_THROUGH_ERROR: false
2021-09-09T22:09:00.7487742Z   ALPINE_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/tool/alpine
2021-09-09T22:09:00.7488318Z   PR_LABELS: [
  "cla signed"
]
2021-09-09T22:09:00.7489270Z   DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-bionic-py3.6-clang9:3fb4365799993abcfc83e51d42c137e89cb2459a
2021-09-09T22:09:00.7490389Z   JOB_BASE_NAME: linux-bionic-py3.6-clang9-test
2021-09-09T22:09:00.7490872Z   TEST_CONFIG: noarch
2021-09-09T22:09:00.7491185Z   SHARD_NUMBER: 1
2021-09-09T22:09:00.7491492Z   NUM_TEST_SHARDS: 1
2021-09-09T22:09:00.7491849Z   PYTORCH_IGNORE_DISABLED_ISSUES: 
2021-09-09T22:09:00.7492249Z   CONTINUE_THROUGH_ERROR: false
2021-09-09T22:09:00.7492572Z   SHM_SIZE: 1g
2021-09-09T22:09:00.7492860Z   PR_NUMBER: 63866
2021-09-09T22:09:00.7493148Z ##[endgroup]
2021-09-09T22:09:13.5166941Z Processing ./dist/torch-1.10.0a0+git41d19e7-cp36-cp36m-linux_x86_64.whl
2021-09-09T22:09:13.5453328Z Requirement already satisfied: typing-extensions in /opt/conda/lib/python3.6/site-packages (from torch==1.10.0a0+git41d19e7) (3.10.0.0)
2021-09-09T22:09:13.5459616Z Requirement already satisfied: dataclasses in /opt/conda/lib/python3.6/site-packages (from torch==1.10.0a0+git41d19e7) (0.8)
2021-09-09T22:09:13.8157213Z Installing collected packages: torch
2021-09-09T22:09:19.5434268Z Successfully installed torch-1.10.0a0+git41d19e7
2021-09-09T22:09:19.8585557Z ++++ dirname .jenkins/pytorch/common.sh
2021-09-09T22:09:19.8592502Z +++ cd .jenkins/pytorch

See CircleCI build pytorch_linux_xenial_py3_6_gcc5_4_build (2/3)

Step: "(Optional) Merge target branch" (full log | diagnosis details | 🔁 rerun)

Automatic merge failed; fix conflicts and then commit the result.
CONFLICT (add/add): Merge conflict in .circleci/scripts/binary_macos_build.sh
Auto-merging .circleci/scripts/binary_macos_build.sh
CONFLICT (add/add): Merge conflict in .circleci/config.yml
Auto-merging .circleci/config.yml
CONFLICT (add/add): Merge conflict in .circleci/cimodel/data/pytorch_build_definitions.py
Auto-merging .circleci/cimodel/data/pytorch_build_definitions.py
CONFLICT (add/add): Merge conflict in .circleci/cimodel/data/pytorch_build_data.py
Auto-merging .circleci/cimodel/data/pytorch_build_data.py
CONFLICT (add/add): Merge conflict in .bazelrc
Auto-merging .bazelrc
Automatic merge failed; fix conflicts and then commit the result.


Exited with code exit status 1

See CircleCI build pytorch_xla_linux_bionic_py3_6_clang9_build (3/3)

Step: "(Optional) Merge target branch" (full log | diagnosis details | 🔁 rerun)

Automatic merge failed; fix conflicts and then commit the result.
CONFLICT (add/add): Merge conflict in .circleci/scripts/binary_macos_build.sh
Auto-merging .circleci/scripts/binary_macos_build.sh
CONFLICT (add/add): Merge conflict in .circleci/config.yml
Auto-merging .circleci/config.yml
CONFLICT (add/add): Merge conflict in .circleci/cimodel/data/pytorch_build_definitions.py
Auto-merging .circleci/cimodel/data/pytorch_build_definitions.py
CONFLICT (add/add): Merge conflict in .circleci/cimodel/data/pytorch_build_data.py
Auto-merging .circleci/cimodel/data/pytorch_build_data.py
CONFLICT (add/add): Merge conflict in .bazelrc
Auto-merging .bazelrc
Automatic merge failed; fix conflicts and then commit the result.


Exited with code exit status 1


This comment was automatically generated by Dr. CI. Follow this link to opt out of these comments for your Pull Requests.

Please report bugs/suggestions to the (internal) Dr. CI Users group.

Click here to manually regenerate this comment.

@iramazanli iramazanli force-pushed the rprop_algorithm_doc branch 2 times, most recently from e5a2a98 to 72d1b62 Compare August 24, 2021 19:57
@codecov (bot) commented Aug 25, 2021

Codecov Report

Merging #63866 (d030bdb) into master (a6f767e) will decrease coverage by 0.10%.
The diff coverage is n/a.

❗ Current head d030bdb differs from pull request most recent head e6a3eaa. Consider uploading reports for the commit e6a3eaa to get more accurate results

@@            Coverage Diff             @@
##           master   #63866      +/-   ##
==========================================
- Coverage   66.84%   66.74%   -0.11%     
==========================================
  Files         695      698       +3     
  Lines       90736    90881     +145     
==========================================
+ Hits        60656    60658       +2     
- Misses      30080    30223     +143     

@iramazanli iramazanli force-pushed the rprop_algorithm_doc branch 7 times, most recently from 739b5ca to d030bdb Compare August 27, 2021 04:35
@iramazanli iramazanli requested a review from albanD August 27, 2021 19:09
@iramazanli iramazanli force-pushed the rprop_algorithm_doc branch 2 times, most recently from c755bbb to e6a3eaa Compare August 27, 2021 20:59
@albanD (Collaborator) left a comment
The code is not equivalent if lr is not inside the step_sizes range. Is that a problem?
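For context, a rough sketch of the kind of per-parameter step-size update under discussion (illustrative pseudocode only, not the torch.optim.Rprop source; the names step_size, grad, prev_grad, step_min, and step_max are hypothetical, with defaults borrowed from the documented etas and step_sizes arguments). Since the step size starts out at lr, whether lr falls inside [step_min, step_max] affects the early updates, which is the concern raised above.

# Illustrative sketch of one Rprop step-size update; not the actual optimizer code.
def rprop_step_size_update(step_size, grad, prev_grad,
                           eta_minus=0.5, eta_plus=1.2,
                           step_min=1e-6, step_max=50.0):
    if grad * prev_grad > 0:
        # Same gradient sign as the previous step: grow the step size, capped at step_max.
        step_size = min(step_size * eta_plus, step_max)
    elif grad * prev_grad < 0:
        # Sign flip: shrink the step size, floored at step_min, and skip this update.
        step_size = max(step_size * eta_minus, step_min)
        grad = 0.0
    # The parameter then moves by -sign(grad) * step_size.
    return step_size, grad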

&\rule{110mm}{0.4pt} \\
&\textbf{input} : \theta_0 \in \mathbf{R}^d \text{ (params)},f(\theta)
\text{ (objective)}, \\
&\hspace{13mm} \eta_{+/-} \text{ (etaplus, etaminus)}, \Gamma_{plus/minus}
Collaborator
This is missing the initialization for g_{prev}, no?

Contributor Author
done
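For illustration, the kind of initialization line being asked for could be written in the same notation as below (a sketch only, not necessarily the exact line that was added):

&\textbf{initialize} : g^0_{prev} \leftarrow 0 \\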

&\textbf{input} : \theta_0 \in \mathbf{R}^d \text{ (params)},f(\theta)
\text{ (objective)}, \\
&\hspace{13mm} \eta_{+/-} \text{ (etaplus, etaminus)}, \Gamma_{plus/minus}
\text{ (boundaries for lr)} \\[-1.ex]
Collaborator
Is that the "step_sizes" argument?

Contributor Author
Yes, exactly!

Collaborator
It should be mentioned explicitly then.

Contributor Author
done!
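For illustration, one way to tie the boundaries explicitly to the step_sizes argument in the same notation (a sketch only, not necessarily the wording that was committed):

&\hspace{13mm} \eta_{+/-} \text{ (etaplus, etaminus)}, \Gamma_{max/min}
\text{ (step sizes)} \\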

\text{ (objective)}, \\
&\hspace{13mm} \eta_{+/-} \text{ (etaplus, etaminus)}, \Gamma_{plus/minus}
\text{ (boundaries for lr)} \\[-1.ex]
&\rule{110mm}{0.4pt} \\
Collaborator
Also, the \eta_t initialization is missing, no?

Contributor Author
done!
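For illustration, the \eta initialization could similarly be added as (a sketch only):

&\textbf{initialize} : \eta_0 \leftarrow \text{lr (learning rate)} \\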

@facebook-github-bot (Contributor)
@iramazanli has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor)
@iramazanli merged this pull request in 54b72a9.
