
Conversation


@tikikun (Contributor) commented on Jan 29, 2024

  • support Vulkan (AMD GPUs that are not using ROCm can also run llama.cpp after this PR)
  • support Intel SYCL (Intel Arc, iGPU, etc.); see the usage sketch below
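As a rough illustration of what these backends enable downstream, here is a minimal sketch of loading a GGUF model with full GPU offload through llama.cpp's C API. The backend (Vulkan, SYCL, CUDA, etc.) is selected when llama.cpp is compiled, so the calling code is the same for AMD and Intel GPUs. The names used here (llama_backend_init, llama_model_default_params, llama_load_model_from_file, n_gpu_layers) come from llama.cpp's public header and may differ slightly between versions; this is a sketch, not the code changed by this PR.

```cpp
// Minimal sketch: load a GGUF model with GPU offload via the llama.cpp C API.
// The GPU backend (Vulkan for AMD without ROCm, SYCL for Intel Arc/iGPU) is
// chosen at build time; this calling code stays the same either way.
// Note: exact API signatures vary across llama.cpp versions.
#include "llama.h"
#include <cstdio>

int main(int argc, char ** argv) {
    if (argc < 2) {
        fprintf(stderr, "usage: %s <model.gguf>\n", argv[0]);
        return 1;
    }

    llama_backend_init();  // initialize whichever backend was compiled in

    llama_model_params mparams = llama_model_default_params();
    mparams.n_gpu_layers = 99;  // offload all layers to the GPU

    llama_model * model = llama_load_model_from_file(argv[1], mparams);
    if (model == nullptr) {
        fprintf(stderr, "failed to load model: %s\n", argv[1]);
        llama_backend_free();
        return 1;
    }

    fprintf(stderr, "model loaded with GPU offload\n");

    llama_free_model(model);
    llama_backend_free();
    return 0;
}
```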

@tikikun added the P0: critical (Mission critical) label on Jan 29, 2024
@tikikun self-assigned this on Jan 29, 2024
@tikikun merged commit 8a17c08 into main on Jan 29, 2024
@hiro-v deleted the pump-version branch on Jan 30, 2024
