
Contributing to TorchTune

We want to make contributing to this project as easy and transparent as possible.

 

Dev install

To contribute to TorchTune, first fork the repository, then clone your fork.

git clone https://github.com/<YOUR_GITHUB_USER>/torchtune.git

Then navigate into the newly cloned repo and install dependencies needed for development.

cd torchtune
pip install -e ".[dev]"

 

Testing

TorchTune contains three different types of tests: unit tests, recipe tests, and regression tests. These tests are distinguished by their complexity and the resources they require to run. Recipe tests and regression tests are explicitly marked via pytest.mark decorators, and both require S3 access to download the requisite assets.

  • Unit tests
    • These should be minimal tests runnable without remote access. (No large models, no downloading weights). Unit tests should be under tests/torchtune.
    • All unit tests can be run via pytest tests.
  • Recipe tests
    • These are relatively small-scale integration tests for running our recipes. These include both single-device recipes and distributed recipes. In the latter case, tests should be marked with the @gpu_test decorator to indicate how many GPUs they need to run.
    • Recipe tests require remote access as (small) model weights will be downloaded from S3 to run them.
    • Recipe tests are found under tests/recipes and should be marked with the @pytest.mark.integration_test decorator.
    • To run only recipe tests, you can run pytest tests -m integration_test.
  • Regression tests
    • These are the most heavyweight tests in the repo. They involve building a full model (i.e. 7B size or larger), then running some finetuning and/or evaluation via a combination of tune CLI commands. Whereas an individual recipe test generally runs in O(seconds), a regression test should take O(minutes) or longer. Like recipe tests, regression tests also require S3 access.
    • Regression tests are found under tests/regression_tests and should be marked with the @pytest.mark.slow_integration_test decorator.
    • To run only regression tests, you can use the command pytest tests -m slow_integration_test.
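
As a sketch of how the integration_test marker is applied (the test name, body, and loss values here are placeholders for illustration, not a real torchtune recipe test):

```python
import pytest

# Hypothetical recipe test. The @pytest.mark.integration_test marker is
# what `pytest tests -m integration_test` selects on; distributed recipe
# tests would additionally carry the @gpu_test decorator from torchtune's
# test utilities to declare how many GPUs they need.

@pytest.mark.integration_test
def test_full_finetune_loss():
    # A real recipe test would download small weights from S3, run the
    # recipe, and compare the resulting loss against an expected value.
    expected_loss = 1.23  # placeholder value for illustration
    actual_loss = 1.23    # stands in for the recipe's reported loss
    assert abs(actual_loss - expected_loss) < 1e-4
```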

Whenever running tests in TorchTune, favor using the command line flags as much as possible (e.g. run pytest tests -m integration_test over pytest tests/recipes). This is because (a) the default behavior is to run unit tests only (so you will miss recipe tests without the flag), and (b) using the flags ensures pytest will automatically download any remote assets needed for your test run.

Note that the above flags can be combined with other pytest flags, so e.g. pytest tests -m integration_test -k 'test_loss' will run only recipe tests matching the substring test_loss.

 

Pull Requests

We actively welcome your pull requests.

  1. Fork the repo and create your branch from main.
  2. If you've added code that should be tested, add tests.
  3. If you've changed APIs, update the documentation.
  4. Ensure the test suite passes.
  5. Make sure your code lints.
  6. If you haven't already, complete the Contributor License Agreement ("CLA").

 

Coding Style

TorchTune uses pre-commit hooks to ensure style consistency and prevent common mistakes. Enable them by running:

pre-commit install

After this, the pre-commit hooks will run before every commit.

You can also run the hooks manually on all files using:

pre-commit run --all-files

 

Build docs

From the docs folder:

Install dependencies:

pip install -r requirements.txt

Then:

make html
# Now open build/html/index.html

To avoid building the examples (which execute python code and can take time) you can use make html-noplot. To build a subset of specific examples instead of all of them, you can use a regex like EXAMPLES_PATTERN="plot_the_best_example*" make html.

If the doc build starts failing for a weird reason, try make clean.

 

Iterate and Serve docs locally

If you're developing locally, you can just open the generated index.html file in your browser.

If instead you're using a remote machine, you can use a combination of a simple Python HTTP server and port forwarding to serve the docs locally. This allows you to iterate on the documentation much more quickly than relying on PR previews.

To do so, after following the above doc build steps, run the following from the docs/build/html folder:

python -m http.server 8000 # or any free port

This will open up a simple HTTP server serving the files in the build directory. If this is done on a remote machine, you can set up port forwarding from your local machine to access the server, for example:

ssh -L 9000:localhost:8000 $REMOTE_DEV_HOST

Now, you can navigate to localhost:9000 on your local machine to view the rendered documentation.

 

Best Practices

This section captures some best practices for contributing code to TorchTune. Following these will make PR reviews easier.

Code

  • Modular Blocks instead of Monolithic Classes. Stuffing all of the logic into a single class limits readability and makes it hard to reuse logic. Think about breaking the implementation into self-contained blocks which can be used independently from a given model. For example, attention mechanisms, embedding classes, transformer layers etc.
  • Say no to Implementation Inheritance. You really don’t need it AND it makes the code much harder to understand or refactor since the logic is spread across many files/classes. Where needed, consider using Protocols.
  • Clean Interfaces. There’s nothing more challenging than reading through functions/constructors with ~100 parameters. Think carefully about what needs to be exposed to the user and don’t hesitate to hard-code parameters until there is a need to make them configurable.
  • Intrusive Configs. Config objects should not intrude into the class implementation. Configs should interact with these classes through cleanly defined builder functions which convert the config into flat parameters needed to instantiate an object.
  • Limit Generalization. Attempting to generalize code before this is needed unnecessarily complicates implementations - you are anticipating use cases you don’t know a lot about. When you actually need to generalize a component, think about whether it’s worth it to complicate a given interface to stuff in more functionality. Don’t be afraid of code duplication if it makes things easier to read.
  • Value Checks and Asserts. Don’t check values in higher-level modules - defer the checks to the modules where the values are actually used. This reduces the number of raise statements in code, which generally hurt readability even though they are critical for correctness.
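
To illustrate the config/builder separation described above (all names here are hypothetical, not torchtune's actual API), the class takes flat parameters and a builder function is the only place that knows about the config:

```python
from dataclasses import dataclass


class FeedForward:
    """Takes flat, explicit parameters; it never sees a config object."""

    def __init__(self, dim: int, hidden_dim: int, dropout: float) -> None:
        self.dim = dim
        self.hidden_dim = hidden_dim
        self.dropout = dropout


@dataclass
class ModelConfig:
    dim: int = 512
    hidden_multiplier: int = 4
    dropout: float = 0.0


def feed_forward_from_config(cfg: ModelConfig) -> FeedForward:
    """Builder: the single place where config maps to flat parameters."""
    return FeedForward(
        dim=cfg.dim,
        hidden_dim=cfg.dim * cfg.hidden_multiplier,
        dropout=cfg.dropout,
    )


ff = feed_forward_from_config(ModelConfig(dim=256))
```

Because the class itself only sees plain values, it stays reusable in contexts that have no config at all.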

Docstrings

Each API and class should be clearly documented. Well-documented code is easier to review and understand/extend.

  • TorchTune docs are written in rst, and the pytorch-sphinx-theme expects code to be specified using double backticks instead of single, e.g. ``hidden_dim``. Single backticks will be rendered as italics instead of as "code".
  • For parameters that have a default value, specify that they're optional in the docstring.
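
As a sketch of these conventions (the class and parameters are hypothetical), a docstring might look like:

```python
class Attention:
    """Multi-head attention layer (hypothetical example).

    Args:
        dim (int): input embedding dimension, rendered as code in rst
            via double backticks, e.g. ``hidden_dim``.
        dropout (float): dropout probability. Optional; default 0.0.
    """

    def __init__(self, dim: int, dropout: float = 0.0) -> None:
        self.dim = dim
        self.dropout = dropout
```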

Tests

Every API and class should also have well-defined tests. TorchTune uses PyTest for testing. TODO: Link to testing README when this is ready.

  • Use PyTest's autouse fixture to prevent the RNG state of each test from leaking into other tests.
  • Small comments about what a test is doing and what it's checking go a long way.
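
A minimal sketch of such a fixture (seeding Python's random module here for illustration; a real torchtune test would seed torch's RNG the same way):

```python
import random

import pytest


@pytest.fixture(autouse=True)
def fixed_seed():
    """Reseed the RNG before every test so one test's random draws
    cannot change what the next test sees."""
    random.seed(0)
    yield  # the test runs here; nothing to clean up afterwards


def test_reproducible_draw():
    # With the autouse fixture above, this draw is identical on every run.
    assert 0.0 <= random.random() < 1.0
```

Because the fixture is marked autouse, every test in the module picks it up without naming it in its signature.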

 

Issues

We use GitHub issues to track public bugs. Please ensure your description is clear and has sufficient instructions to be able to reproduce the issue.

Meta has a bounty program for the safe disclosure of security bugs. In those cases, please go through the process outlined on that page and do not file a public issue.

 

License

By contributing to TorchTune, you agree that your contributions will be licensed under the LICENSE file in the root directory of this source tree.

 

Contributor License Agreement ("CLA")

In order to accept your pull request, we need you to submit a CLA. You only need to do this once to work on any of Meta's open source projects.

Complete your CLA here: https://code.facebook.com/cla