Description
Based on model sizes, we need an out-of-memory warning. There are significant differences between Intel AI PC devices and ARM AI PC devices in available GPU and NPU memory (7 GB vs 18 GB on Intel), so the same model may fit on one class of device and not the other.
Ask
We need to provide a clear error. At present, running large models such as DeepSeek R1 14B crashes the Foundry Local service without surfacing a clear error or diagnostic to users.
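One possible shape for the fix is a pre-flight check before the model is loaded: estimate the model's memory footprint from its parameter count and quantization, compare it against the device's GPU/NPU budget, and fail fast with an actionable message instead of crashing the service. The sketch below is illustrative only, not Foundry Local's actual API; the overhead factor and the 7 GB Intel budget from the description are assumptions used for the example.

```python
def estimated_model_memory_gb(num_params_billion: float,
                              quant_bits: int = 4,
                              overhead: float = 1.2) -> float:
    """Rough footprint: weights (params * bits / 8) plus a fudge factor
    for KV cache and runtime buffers. Numbers are illustrative."""
    weight_gb = num_params_billion * quant_bits / 8
    return weight_gb * overhead

def check_model_fits(model_gb: float, available_gb: float) -> None:
    """Raise a clear, user-facing error instead of letting the load crash."""
    if model_gb > available_gb:
        raise MemoryError(
            f"Model needs ~{model_gb:.1f} GB but only {available_gb:.1f} GB of "
            "GPU/NPU memory is available. Choose a smaller model or a more "
            "aggressive quantization."
        )

# Example: a 14B model at 4-bit (~8.4 GB with overhead) vs a 7 GB budget,
# as on the Intel devices mentioned above, should be rejected up front.
needed = estimated_model_memory_gb(14, quant_bits=4)
try:
    check_model_fits(needed, available_gb=7.0)
except MemoryError as err:
    print(err)
```

The key point is not the exact estimate but that the check happens before load, so the service can return a structured error rather than dying mid-inference.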