
entropix-smollm

SmolLM2 with the Entropix sampler in PyTorch

This notebook contains an Entropix sampler implementation by Sinatras (@myainotez).

Special thanks to @Dorialexander, @citizenhicks, and the original entropix implementation maintainer @_xjdr for making this possible.

Two experimental SamplerConfigs are included: one leans into the adaptive state as before, while the other forces the model to move between different sampler states by removing the agreement term from the state conditions. You can tune your sampler parameters easily with the 3D chart, and even identify individual tokens by hovering over them. You can also set up your EntropixConfig to enable or disable parts of the sampler for research purposes.
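To make the state idea concrete, here is a minimal sketch of entropy/varentropy-driven state selection. The formulas are the standard Shannon entropy and varentropy (variance of the token surprisal); the state names and thresholds below are illustrative assumptions, not the notebook's actual SamplerConfig values.

```python
import math

def entropy_varentropy(probs):
    """Shannon entropy and varentropy (variance of -log p) of a distribution."""
    h = -sum(p * math.log(p) for p in probs if p > 0)
    v = sum(p * (math.log(p) + h) ** 2 for p in probs if p > 0)
    return h, v

def sampler_state(ent, vent, ent_thresh=1.0, vent_thresh=1.0):
    """Map (entropy, varentropy) to a sampler state.

    State names and thresholds here are made up for illustration;
    the real SamplerConfig exposes tunable thresholds per state.
    """
    if ent < ent_thresh and vent < vent_thresh:
        return "argmax"        # model is confident and stable: take top token
    if ent >= ent_thresh and vent < vent_thresh:
        return "sample"        # uniformly uncertain: sample with temperature
    if ent < ent_thresh and vent >= vent_thresh:
        return "branch"        # mostly confident but spiky: explore branches
    return "resample"          # uncertain and unstable: resample / adapt
```

A uniform distribution over four tokens, for example, has entropy log 4 and zero varentropy, which under these toy thresholds lands in the "sample" state.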

To disable charts, set debug=False in the EntropixModel class.

To export varentropy and entropy stats for attention and logits, uncomment the export_data function call.

Tuned thresholds for SmolLM2 at higher context lengths are a work in progress.

smolLMxEntropix

Test Visualizations

3D Entropy&Varentropy Chart

The new 3D chart lets users see how their response is formed, including varentropy and entropy stats for both attention and logits.

3D1 3DGIF

There is also a threshold visualization that lets users see how their responses fill the 3D space under the desired SamplerConfig; it can be toggled with the buttons at the top of the chart.

3D2

Samples

Q1

Q2

Original entropix implementation: @xjdr-alt/entropix

Install

Clone the repo

git clone git@github.com:SinatrasC/entropix-smollm.git

cd entropix-smollm

With rye

Install Rye here if you haven't already, and then:

rye sync

Run Jupyter

rye run jupyter-notebook smollm_entropix_torch.ipynb

With uv

Install uv here if you haven't already (Rye installs it by default), and then:

uv venv --python 3.12

source .venv/bin/activate

uv pip install --project pyproject.toml .

and then:

jupyter-notebook smollm_entropix_torch.ipynb 

You can also use the model with torch directly from the CLI. Follow the uv installation guide above, then invoke the following command in your terminal:

python3 -m entr_model_torch.main --config.prompt "Which number is larger 9.11 or 9.9? be brief in your response" --config.stream --config.debug

You can also batch-process prompts with the following example command:

python3 -m entr_model_torch.main --config.csv_file "prompts.csv" --config.no-stream --config.debug
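Per the help text below, the CSV file needs a single column with the header 'prompts'. A minimal file in that shape can be written with Python's stdlib csv module (the prompt strings here are just examples):

```python
import csv

# Example prompts; replace with your own batch.
rows = [
    "Which number is larger 9.11 or 9.9? be brief in your response",
    "Explain entropy in one sentence.",
]

# Write prompts.csv with the required 'prompts' column header.
with open("prompts.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["prompts"])
    for r in rows:
        writer.writerow([r])
```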

The --help flag describes the CLI args; here is a brief overview:

python3 -m entr_model_torch.main --help

GenerateConfig Usage:
--------------------
Required:
- prompt (str): The text prompt to generate from
    Example: --config.prompt "Once upon a time"
OR
- csv_file (str): path to csv file containing string prompts with column header 'prompts'
    Example: --config.csv_file "prompts.csv"

Optional:
- max_tokens (int): How many tokens to generate (1-2048)
    Default: 600
    Usage: --config.max_tokens 1000
- debug: Toggle debug information during generation
    Default: True
    Usage: --config.debug or --config.no-debug
- stream: Toggle output token streaming
    Default: True
    Usage: --config.stream or --config.no-stream

Example usage:
    python3 -m entr_model_torch.main --config.prompt "Which number is larger 9.11 or 9.9? be brief in your response" --config.stream --config.debug
    or
    python3 -m entr_model_torch.main --config.csv_file "prompts.csv" --config.no-stream --config.debug
