
TypeError: Failed to fetch dynamically imported module #748

Closed · 1 of 5 tasks
sugarforever opened this issue on May 9, 2024 · 4 comments
Labels: bug (Something isn't working)

sugarforever commented on May 9, 2024

System Info

@xenova/transformers 3.0.0-alpha.0
Chrome: Version 124.0.6367.93 (Official Build) (arm64)
OS: macOS 14.4.1 (23E224)

Environment/Platform

  • Website/web-app
  • Browser extension
  • Server-side (e.g., Node.js, Deno, Bun)
  • Desktop app (e.g., Electron)
  • Other (e.g., VSCode extension)

Description

I ran pnpm run dev in the example webgpu-chat. I can download the model on http://localhost:5173, but chat is not ready because of this error reported in the console:

@xenova_transformers.js?v=9e5deabe:1386 Uncaught (in promise) Error: no available backend found. ERR: [webgpu] TypeError: Failed to fetch dynamically imported module: http://localhost:5173/ort-wasm-simd-threaded.jsep.mjs
    at pt (http://localhost:5173/node_modules/.vite/deps/@xenova_transformers.js?v=9e5deabe:1386:13)
    at async e.create (http://localhost:5173/node_modules/.vite/deps/@xenova_transformers.js?v=9e5deabe:1906:20)
    at async createInferenceSession (http://localhost:5173/node_modules/.vite/deps/@xenova_transformers.js?v=9e5deabe:9952:10)
    at async constructSessions (http://localhost:5173/node_modules/.vite/deps/@xenova_transformers.js?v=9e5deabe:17531:21)
    at async Promise.all (index 0)
    at async Phi3ForCausalLM.from_pretrained (http://localhost:5173/node_modules/.vite/deps/@xenova_transformers.js?v=9e5deabe:17782:14)
    at async AutoModelForCausalLM.from_pretrained (http://localhost:5173/node_modules/.vite/deps/@xenova_transformers.js?v=9e5deabe:20678:14)
    at async Promise.all (index 1)
    at async load (http://localhost:5173/src/worker.js?worker_file&type=module:137:32)
[Screenshot: the same error shown in the browser console, 2024-05-09 9:28 PM]

Is there any setting required to make this work?

By the way, I can chat with the model on https://huggingface.co/spaces/Xenova/experimental-phi3-webgpu.
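
One workaround often suggested for this symptom, where the runtime requests ort-wasm-simd-threaded.jsep.mjs from the dev-server root instead of from node_modules, is to keep @xenova/transformers out of Vite's dependency pre-bundling. The sketch below is only an illustration of that idea and rests on an assumption about the failure mode; the maintainer's actual fix, further down, was updating the demo's onnxruntime-web dependency.

```js
// vite.config.js (sketch only, not the demo's actual config).
// Assumption: Vite's pre-bundling (note the /node_modules/.vite/deps/ paths in the
// stack trace) leaves the library's dynamic import pointing at the server root, so
// the browser looks for ort-wasm-simd-threaded.jsep.mjs at http://localhost:5173/.
// Excluding the package from optimizeDeps lets that import resolve from node_modules.
import { defineConfig } from 'vite';

export default defineConfig({
  optimizeDeps: {
    exclude: ['@xenova/transformers'],
  },
});
```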

Reproduction

  1. Git clone https://github.com/xenova/transformers.js.git
  2. Go to transformers.js/examples/webgpu-chat
  3. pnpm install
  4. pnpm run dev
  5. Visit http://localhost:5173 in Chrome and click Load Model button
sugarforever added the bug label on May 9, 2024

HyperCrowd commented

I get the same issue, even when trying to pull in from a local checkout of onnx:

import { env, AutoModelForCausalLM, AutoTokenizer } from '@xenova/transformers'

env.backends.onnx.wasm.wasmPaths = '/onnxruntime-web/'
env.allowRemoteModels = false
env.allowLocalModels = true

const model_id = '../model';

const tokenizer = await AutoTokenizer.from_pretrained(model_id, {
    legacy: true
})

I have copied the contents of node_modules/onnxruntime-web/dist/ to public, and it is still trying to access an ort-wasm-simd-threaded.jsep.mjs file, which does not exist in the onnxruntime-web version I have installed.
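
A variant of the same wasmPaths override is to point it at a published onnxruntime-web build that actually ships the requested file, rather than at a local copy. A sketch, assuming the 1.18.0 build mentioned below contains ort-wasm-simd-threaded.jsep.mjs in its dist/ folder; the version number and CDN path are assumptions, not verified pins:

```js
// Sketch: serve the ONNX Runtime helper files from a versioned CDN directory
// instead of copying them into /public.
// Assumption: onnxruntime-web@1.18.0 (the version the maintainer mentions below)
// publishes ort-wasm-simd-threaded.jsep.mjs in its dist/ folder.
import { env } from '@xenova/transformers';

// onnxruntime-web resolves its .wasm/.mjs helpers relative to wasmPaths, so the
// directory this points at must actually contain the file from the error message.
env.backends.onnx.wasm.wasmPaths =
  'https://cdn.jsdelivr.net/npm/onnxruntime-web@1.18.0/dist/';
```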

xenova (Collaborator) commented on May 10, 2024

This is because the demo uses an unreleased version of onnxruntime-web v1.18.0, which I have mentioned a few times when I've linked to the source code. When it is released, I will update the source code so that it works correctly. Thanks for understanding!
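
For context, the JSEP module is only requested because the demo asks for the WebGPU execution provider. A minimal sketch of that call path, based on the stack trace above; the model id and dtype are placeholders, not the demo's exact values:

```js
// Sketch of the call path from the stack trace (AutoModelForCausalLM.from_pretrained ->
// createInferenceSession). Requesting the WebGPU backend is what makes onnxruntime-web
// dynamically import ort-wasm-simd-threaded.jsep.mjs, hence the need for a build that ships it.
import { AutoModelForCausalLM } from '@xenova/transformers';

const model = await AutoModelForCausalLM.from_pretrained(
  'Xenova/Phi-3-mini-4k-instruct', // placeholder model id (the trace shows Phi3ForCausalLM)
  {
    device: 'webgpu', // WebGPU execution provider; "no available backend found" means this failed to initialize
    dtype: 'q4',      // assumption: a quantized dtype typical of the WebGPU demos
  },
);
```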

sugarforever (Author) commented

> This is because the demo uses an unreleased version of onnxruntime-web v1.18.0, which I have mentioned a few times when I've linked to the source code. When it is released, I will update the source code so that it works correctly. Thanks for understanding!

Thanks for the feedback. Looking forward to the release.

xenova (Collaborator) commented on May 17, 2024

It should now work (commits: here and here)!

xenova closed this as completed on May 17, 2024