diff --git a/docs_nnx/examples/core_examples.rst b/docs_nnx/examples/core_examples.rst
deleted file mode 100644
index e714fe74c..000000000
--- a/docs_nnx/examples/core_examples.rst
+++ /dev/null
@@ -1,30 +0,0 @@
-Core examples
-=============
-
-Core examples are hosted on the GitHub Flax repository in the `examples `__
-directory.
-
-Each example is designed to be **self-contained and easily forkable**, while
-reproducing relevant results in different areas of machine learning.
-
-Some of the examples below have a link "Interactive🕹" that lets you run them
-directly in Colab.
-
-Transformers
-********************
-
-- :octicon:`mark-github;0.9em` `Gemma `__ :
-  A family of open-weights Large Language Model (LLM) by Google DeepMind, based on Gemini research and technology.
-
-- :octicon:`mark-github;0.9em` `LM1B `__ :
-  Transformer encoder trained on the One Billion Word Benchmark.
-
-
-- :octicon:`mark-github;0.9em` `Diffusion Models `__ :
-  A simple example of an image diffusion model using a U-Net architecture.
-
-Toy examples
-********************
-
-`NNX toy examples `__
-directory contains a few smaller, standalone toy examples for simple training scenarios.
diff --git a/docs_nnx/examples/index.rst b/docs_nnx/examples/index.rst
index e283ee1fe..72b7df60d 100644
--- a/docs_nnx/examples/index.rst
+++ b/docs_nnx/examples/index.rst
@@ -1,10 +1,41 @@
 Examples
-========
+########
 
+The NNX documentation contains two kinds of examples: example notebooks, which gradually introduce new concepts
+for applying NNX to different application areas, and example projects, which are more realistic representations
+of how nontrivial models should be implemented.
+
+
+Example Notebooks
+=================
+
+Example notebooks guide you through applying Flax models to a variety of different domains.
+
 .. toctree::
-   :maxdepth: 2
+   :maxdepth: 1
+
+   ./gemma
+   ./digits_diffusion_model
+
+
+Example Projects
+================
+
+Example projects are hosted on the GitHub Flax repository in the `examples `__
+directory.
+
+Each example is designed to be **self-contained and easily forkable**, while
+reproducing relevant results in different areas of machine learning.
+
+Transformers
+********************
 
-   gemma
-   core_examples
-
-
+- :octicon:`mark-github;0.9em` `Gemma `__ :
+  A family of open-weights Large Language Models (LLMs) by Google DeepMind, based on Gemini research and technology.
+  Also includes a Gemma model training and evaluation script on the One Billion Word Benchmark (LM1B).
+
+Toy examples
+********************
+
+The `NNX toy examples `__
+directory contains a few smaller, standalone toy examples for simple training scenarios.
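For reference, the `toctree` directive that this diff rewrites uses `:maxdepth: 1`, so only each listed notebook's top-level title appears in the rendered index. A minimal sketch of how that fragment of the new `index.rst` reads once the diff is applied (section title and document paths as in the diff; the surrounding headings are assumed to render with Sphinx's usual section-underline conventions):

```rst
Example Notebooks
=================

Example notebooks guide you through applying Flax models to a variety of different domains.

.. toctree::
   :maxdepth: 1

   ./gemma
   ./digits_diffusion_model
```

Sphinx resolves `./gemma` and `./digits_diffusion_model` relative to `docs_nnx/examples/`, so the corresponding documents must exist alongside `index.rst` for the build to succeed.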