Rel 2.0.2 2 (#30)
* Release_2.0.2
ankitj-cerebras committed Nov 9, 2023
1 parent 22996a3 commit f873794
Showing 1 changed file with 0 additions and 1 deletion: RELEASE-NOTES.md
@@ -12,7 +12,6 @@ The following are the release notes for the Model Zoo repository.
#### Sparsity

- With release 2.0.2, we introduce [Sparse Pre-training and Dense Fine-tuning (SPDF)](https://arxiv.org/abs/2303.10464), a technique that accelerates pre-training by training at high levels of weight sparsity and then recovering downstream task accuracy through dense fine-tuning (a toy sketch of this two-phase recipe appears after this list). To get started with SPDF, see the [Sparsity how-to guide](https://docs.cerebras.net/en/latest/wsc/how_to_guides/sparsity.html); reference configurations are available in the Cerebras Model Zoo [here](./transformers/pytorch/gpt3/configs/sparsity).
- We are also introducing [Sparse Iso-FLOP Transformation (Sparse-IFT)](https://arxiv.org/abs/2303.11525), a technique that uses sparsity to improve model accuracy without increasing training FLOPs (floating-point operations). The same [Sparsity how-to guide](https://docs.cerebras.net/en/latest/wsc/how_to_guides/sparsity.html) covers Sparse-IFT, and reference configurations are available in the Cerebras Model Zoo [here](./transformers/pytorch/gpt3/configs/sparsity/). For more information on either technique, contact our support team.
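
To make the SPDF recipe concrete, below is a minimal, hypothetical sketch of the sparse-pretrain-then-dense-finetune pattern using stock PyTorch pruning utilities. This is not the Cerebras implementation, which is configured through Model Zoo YAML files as described in the how-to guide; the model, sparsity level, and elided training loops here are illustrative assumptions.

```python
# Toy sketch of Sparse Pre-training and Dense Fine-tuning (SPDF) using
# stock PyTorch pruning utilities. Illustrative only; the Cerebras flow
# is driven by Model Zoo YAML configs, not this API.
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical stand-in for a real model.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512))

# Phase 1: pre-train at high sparsity. Random unstructured masks zero out
# 75% of each Linear layer's weights for the whole pre-training run.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.random_unstructured(module, name="weight", amount=0.75)
# ... sparse pre-training loop would run here ...

# Phase 2: densify, then fine-tune. prune.remove() folds each mask into its
# weight tensor; the zeroed entries start at zero but are trainable again,
# so dense fine-tuning can recover downstream task accuracy.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.remove(module, "weight")
# ... dense fine-tuning loop would run here ...
```

For Sparse-IFT, the iso-FLOP intuition (per the linked paper) is that widening a linear layer's dimensions by 1/√(1−s) while applying sparsity s keeps its cost at roughly (1−s)·(d/√(1−s))² = d² multiply-accumulates, the same as the dense original, so the added capacity is free in FLOP terms.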

#### Large Language Models

