Restore triton v2 compatibility #23

Merged
Gldkslfmsd merged 2 commits into ufal:main from krystophny:fix-triton-3x-upstream on Jan 28, 2026

Conversation

@krystophny
Contributor

This is a small addition to PR #22, since I have been changing this in parallel. In this variant, triton v2 still works too, and the Linux OS marker is correctly added back to the requirements (whisper upstream has that as well). If this looks fine, please merge after #22.

tokyovigilante and others added 2 commits on October 25, 2025, 21:29
Apparently kernel.src cannot be directly assigned in v3, so store src
in a local copy and modify that one before applying the updates.

Tested on simulated streaming audio locally.

see openai/whisper#2597
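The workaround described in this commit can be sketched as follows. A stand-in kernel class is used here so the sketch runs without triton installed; the class and source strings are illustrative, not the project's actual code. The idea is to build the modified source in a local string and hand it to the kernel in one step, instead of mutating `kernel.src` in place (which Triton 3.x rejects):

```python
class StubKernel:
    """Stand-in for a Triton 3.x JIT kernel: src is read-only and is
    replaced via _unsafe_update_src() rather than direct assignment."""

    def __init__(self, src):
        self._src = src

    @property
    def src(self):
        return self._src

    def _unsafe_update_src(self, new_src):
        self._src = new_src


kernel = StubKernel("def kernel(x): BLOCK = 16")

# Work on a local copy of the source...
src = kernel.src
src = src.replace("BLOCK = 16", "BLOCK = 32")

# ...and apply all edits to the kernel in a single update.
kernel._unsafe_update_src(src)
```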
The previous commit added Triton 3.x support but broke backward
compatibility with Triton 2.x by requiring triton>=3.0.0.

This commit restores compatibility with both versions by following
the upstream OpenAI Whisper approach:
- Use hasattr() to detect which API is available
- Call _unsafe_update_src() + hash reset for Triton 3.0-3.1
- Fall back to kernel.src for Triton 2.x
- Restore triton>=2.0.0 requirement (supports both 2.x and 3.x)

This matches the implementation in openai/whisper and addresses
the maintainer feedback to follow upstream more closely.

Builds on: ufal#22
See: openai/whisper#2597
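The detection logic listed above can be sketched as follows. Stand-in classes replace real triton JIT kernels so the sketch runs without triton installed; the attribute and method names (`src`, `hash`, `_unsafe_update_src`) mirror those named in the commit message, but the surrounding classes are illustrative assumptions:

```python
class TritonV2Kernel:
    """Mimics Triton 2.x, where kernel.src is directly assignable."""

    def __init__(self, src):
        self.src = src


class TritonV3Kernel:
    """Mimics Triton 3.0-3.1, where src is replaced via
    _unsafe_update_src() and the cached hash must be invalidated."""

    def __init__(self, src):
        self._src = src
        self.hash = "stale-hash"

    @property
    def src(self):
        return self._src

    def _unsafe_update_src(self, new_src):
        self._src = new_src


def apply_src_patch(kernel, new_src):
    """Update kernel source via whichever API the installed Triton offers."""
    if hasattr(kernel, "_unsafe_update_src"):
        # Triton 3.0-3.1: use the dedicated updater, then reset the
        # cached hash so the kernel gets recompiled.
        kernel._unsafe_update_src(new_src)
        kernel.hash = None
    else:
        # Triton 2.x: src can be assigned directly.
        kernel.src = new_src
```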
@tokyovigilante
Contributor

@krystophny thanks for posting this; mine is more of a hack for Debian 13, so your PR should probably be accepted over mine.

@fumin

fumin commented Jan 27, 2026

Why hasn't this pull request been merged yet? It is the fix for #30.

@Gldkslfmsd
Member

Gldkslfmsd commented Jan 27, 2026 via email

@Gldkslfmsd Gldkslfmsd merged commit d23775e into ufal:main Jan 28, 2026