Conversation

psychedelicious
Collaborator

Summary

  • Allow user-defined precision on MPS.
  • Use more explicit logic to handle all possible cases.
  • Add comments.
  • Remove the app_config args (they were effectively unused; just get the config using the singleton getter util)
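The "more explicit logic to handle all possible cases" might look roughly like the sketch below. This is a hypothetical, simplified illustration only: the function name, signature, and the specific "auto" fallback choices are assumptions, not the actual InvokeAI code.

```python
def choose_torch_dtype(device_type: str, configured: str = "auto") -> str:
    """Pick a dtype name from the device type and a user-configured precision.

    Hypothetical sketch: every case is enumerated explicitly rather than
    relying on implicit fallthrough, and an explicit user choice is
    respected on every device, including MPS.
    """
    if configured in ("float16", "bfloat16", "float32"):
        # The user asked for a specific precision; honor it as-is.
        return configured
    if configured != "auto":
        raise ValueError(f"unknown precision setting: {configured!r}")
    # "auto" resolution: one branch per supported device type.
    if device_type == "cuda":
        return "float16"
    if device_type == "mps":
        return "float16"
    if device_type == "cpu":
        return "float32"
    raise ValueError(f"unknown device type: {device_type!r}")
```

With logic like this, `choose_torch_dtype("mps", "bfloat16")` returns `"bfloat16"` instead of being silently overridden, which is the user-facing change this PR enables.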

Related Issues / Discussions

https://discord.com/channels/1020123559063990373/1049495067846524939/1226123179303370794

QA Instructions

Test bfloat16 precision on MPS on torch 2.3 nightly @Vargol

Pretty simple change.

Merge Plan

n/a

Checklist

  • The PR has a short but descriptive title, suitable for a changelog
  • Tests added / updated (if applicable) n/a
  • Documentation added / updated (if applicable)

@github-actions bot added the python (PRs that change python files) and backend (PRs that change backend files) labels Apr 7, 2024
@Vargol
Contributor

Vargol commented Apr 7, 2024

TL;DR - seems to be working just fine.

Long version...
Okay, updated my env using main with the pull request:

M3iMac:InvokeAI $ git fetch origin pull/6171/head:test_6171
From https://github.com/invoke-ai/InvokeAI
 * [new ref]             refs/pull/6171/head -> test_6171
M3iMac:InvokeAI $ git switch test_6171
Switched to branch 'test_6171'

Built the front end, pip-uninstalled invokeai, and re-installed from the repo.

Ran on release torch, expecting a "BFloat16 is not supported" error, mostly as a way to check that it was using bf16:

[2024-04-07 12:13:56,513]::[InvokeAI]::ERROR --> Error while invoking session b8a518bb-5b8c-45a0-b9aa-2c8fd4eb71e5, invocation c61d4e77-a3e9-41e6-b801-6884ff70a366 (sdxl_compel_prompt):
BFloat16 is not supported on MPS

So far so good.
Updated torch to the 2.3.0 release candidate:

(InvokeAI) M3iMac:InvokeAI $ pip uninstall torch torchvision
...
  Successfully uninstalled torch-2.2.1
... 
 Successfully uninstalled torchvision-0.17.1

(InvokeAI) M3iMac:InvokeAI $ pip3 install --pre torch torchvision --index-url   https://download.pytorch.org/whl/test/cpu
...
Successfully installed torch-2.3.0 torchvision-0.18.0

Restarted InvokeAI and ran a few quick prompts, including a LoRA and a ControlNet, which ran to completion.
The simple ones worked fine. I may have hit a bug in LoRAs, but that seems to happen in float16 too,
so I don't think it's related.
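The manual check above (forcing bf16 and watching for the "BFloat16 is not supported on MPS" error) can be approximated programmatically. A minimal sketch, not part of this PR: the function name is invented, and the caught exception types are a guess based on the error in the log.

```python
def mps_supports_bfloat16() -> bool:
    """Return True if a bfloat16 tensor can be created on the MPS device.

    Sketch only: the idea is simply to try the allocation and catch the
    failure that older torch releases (e.g. 2.2.x) raise, while newer
    builds (2.3+) succeed on Apple silicon.
    """
    try:
        import torch
    except ImportError:
        # No torch installed at all.
        return False
    mps = getattr(torch.backends, "mps", None)
    if mps is None or not mps.is_available():
        # No MPS backend on this machine / build.
        return False
    try:
        torch.zeros(1, dtype=torch.bfloat16, device="mps")
        return True
    except (TypeError, RuntimeError):
        return False
```

On the torch 2.2.1 release this would return False (matching the error above), and True on the 2.3.0 candidate that Vargol tested.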

@hipsterusername hipsterusername merged commit 9ab6655 into main Apr 7, 2024
@hipsterusername hipsterusername deleted the psyche/feat/backend/choose-precision branch April 7, 2024 13:41