
Add f16 support in the wgpu backend #1582

Draft
wants to merge 1 commit into base: main

Commits on Apr 7, 2024

  1. burn-wgpu: f16 support

    The burn-wgpu backend currently does not support computations on 16-bit
    floats. This, for example, limits the ability to run LLMs on top of
    Burn on widely available hardware. So, add 16-bit float support to
    burn-wgpu.
    
    Signed-off-by: Piotr Stankiewicz <piotr.stankiewicz@docker.com>
    p1-0tr committed Apr 7, 2024
    Full SHA: ee7ec2c
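
Below is a minimal sketch of how a user might select f16 as the float element type of the wgpu backend once support lands. It assumes the `Wgpu<GraphicsApi, FloatElement, IntElement>` type alias from `burn_wgpu` accepts `half::f16` as its float element after this change; the exact generics, crate versions, and trait bounds are assumptions for illustration, not taken from the PR.

```rust
use burn::tensor::Tensor;
use burn_wgpu::{AutoGraphicsApi, Wgpu, WgpuDevice};
use half::f16;

// Hypothetical: with this PR, `f16` could be plugged in as the float element
// of the `Wgpu` alias (graphics API, float element, int element).
type B = Wgpu<AutoGraphicsApi, f16, i32>;

fn main() {
    let device = WgpuDevice::default();

    // Create two tensors and add them; the generated kernels would run in f16.
    let x = Tensor::<B, 2>::ones([2, 3], &device);
    let y = Tensor::<B, 2>::ones([2, 3], &device);
    println!("{}", x + y);
}
```

Parameterizing the backend over the element type, rather than hard-coding f32, would let the same model code switch precisions by changing a single type alias.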