Description
Ran it for a few models, including phi and deepseek-r1-7b, but it doesn't work for the deepseek-r1-14b model (running on a new Copilot+ PC with a Snapdragon X Elite chip).
I've tried the usual troubleshooting steps: restarting the service, rebooting, clearing the cache, and re-downloading the model. So I think the issue is specific to the 14b-parameter model.
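For reproducibility, the troubleshooting steps above correspond roughly to the following Foundry Local CLI commands. This is a sketch from memory; exact subcommand names and the model alias may differ between Foundry Local versions, so treat it as an outline of what was tried rather than exact syntax:

```shell
# Check whether the local inference service is running
foundry service status

# Restart the service
foundry service restart

# Inspect the local model cache, then clear the 14b model
# (model name taken from the error URI below; alias may vary)
foundry cache list
foundry cache remove deepseek-r1-distill-qwen-14b-qnn-npu

# Re-download and try to load the model again
foundry model run deepseek-r1-14b
```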
```
🕚 Loading model...
Exception: Request to local service failed. Uri:http://localhost:5273/openai/load/deepseek-r1-distill-qwen-14b-qnn-npu?ttl=600
An error occurred while sending the request.
Please check service status with 'foundry service status'.
```