Implement Int4x2 and UInt4x2 casting support in ONNX Runtime CPU provider #24998
Summary
This PR implements comprehensive casting support for the Int4x2 and UInt4x2 data types in the ONNX Runtime CPU provider, addressing the TODO comment in `cast_op.cc` and enabling 4-bit integer quantization workflows.
Changes Made
1. Updated Type Lists
Updated the kernel's element type list from `AllIRv9` to `AllIRv10` so that the Int4x2/UInt4x2 types are included (see the sketch below).
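A minimal sketch of this kind of type-list change, assuming a simple variadic `TypeList` and abbreviated element lists; the stand-in types and list contents below are illustrative only, not the PR's actual definitions:

```cpp
// Illustration only: stand-in definitions, not the real ONNX Runtime headers.
#include <cstdint>

struct Int4x2 {};   // stand-in for a packed pair of signed 4-bit values
struct UInt4x2 {};  // stand-in for a packed pair of unsigned 4-bit values

template <typename... Types>
struct TypeList {};

// Abbreviated IR v9 element list ...
using AllIRv9 = TypeList<float, double, int8_t, uint8_t /*, ... */>;

// The IR v10 list additionally enumerates the packed 4-bit types so the
// Cast kernel's type dispatch can see them.
using AllIRv10 = TypeList<float, double, int8_t, uint8_t /*, ... */,
                          Int4x2, UInt4x2>;
```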
2. Core Casting Implementation
Added `TensorCaster` specializations for casting between Int4x2/UInt4x2 and the other supported tensor element types (a rough sketch of the unpacking direction follows).
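Below is a self-contained illustration of what such a specialization does for the Int4x2-to-float direction; this is not the PR's literal `TensorCaster` code, and it assumes ONNX's packing of two 4-bit elements per byte with the first element in the low nibble. The inner unpacking logic is the kind of work the `ConvertFromInt4x2` helper described in the next subsection performs:

```cpp
// Sketch only: unpack a buffer of bytes, each holding two 4-bit elements
// (low nibble first), into floats.
#include <cstddef>
#include <cstdint>
#include <vector>

// Sign-extend the low 4 bits of `nibble` when Signed is true.
template <bool Signed>
int8_t UnpackNibble(uint8_t nibble) {
  nibble &= 0x0F;
  if (Signed && (nibble & 0x08)) {
    return static_cast<int8_t>(nibble | 0xF0);  // fill the high bits with 1s
  }
  return static_cast<int8_t>(nibble);
}

// Cast `num_elems` packed 4-bit values to float. When num_elems is odd, the
// high nibble of the last byte is padding and is never read.
template <bool Signed>
std::vector<float> CastPacked4BitToFloat(const uint8_t* packed, std::size_t num_elems) {
  std::vector<float> out(num_elems);
  for (std::size_t i = 0; i < num_elems; ++i) {
    const uint8_t byte = packed[i / 2];
    const uint8_t nib = (i % 2 == 0) ? static_cast<uint8_t>(byte & 0x0F)
                                     : static_cast<uint8_t>(byte >> 4);
    out[i] = static_cast<float>(UnpackNibble<Signed>(nib));
  }
  return out;
}
```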
3. Helper Functions and Templates
- `ConvertFromInt4x2<Signed, DstType>()`: Unpacks 4-bit values to destination types
- `ConvertToInt4x2<Signed, SrcType>()`: Converts and clamps values to the 4-bit range with proper bounds checking
- Shared caster templates (`Int4x2ToCaster`, `CasterToInt4x2`) to eliminate code duplication
4. Key Features
Example Usage
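As a stand-in example of the behaviour the new casts enable, the snippet below converts a few floats to signed 4-bit values, clamping to [-8, 7] and packing two values per byte. `ClampToInt4` is a hypothetical helper written for this sketch, not a name from the PR, and the rounding policy shown is illustrative:

```cpp
// Stand-in example (hypothetical helpers, not the PR's code).
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

// Round, then clamp to the signed 4-bit range [-8, 7].
int8_t ClampToInt4(float v) {
  const float r = std::nearbyint(v);  // rounding choice is illustrative
  return static_cast<int8_t>(std::min(7.0f, std::max(-8.0f, r)));
}

int main() {
  const std::vector<float> input = {-12.0f, -1.0f, 0.0f, 3.0f, 42.0f};
  std::vector<uint8_t> packed((input.size() + 1) / 2, 0);
  for (std::size_t i = 0; i < input.size(); ++i) {
    const uint8_t nib = static_cast<uint8_t>(ClampToInt4(input[i])) & 0x0F;
    packed[i / 2] |= (i % 2 == 0) ? nib : static_cast<uint8_t>(nib << 4);
  }
  // -12 clamps to -8 (0x8), -1 maps to 0xF, 42 clamps to 7 (0x7).
  for (uint8_t b : packed) std::printf("0x%02X ", b);  // prints: 0xF8 0x30 0x07
  std::printf("\n");
}
```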
Testing
Created comprehensive standalone tests validating the new conversions.
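As an illustration of the kind of standalone check involved (not the PR's actual tests), a round-trip test over the full signed 4-bit range might look like this:

```cpp
// Illustrative check: packing any value already in [-8, 7] into a 4-bit
// nibble and unpacking it again must be lossless.
#include <cassert>
#include <cstdint>

int main() {
  for (int v = -8; v <= 7; ++v) {
    const uint8_t nib = static_cast<uint8_t>(v) & 0x0F;  // pack
    int unpacked = nib;
    if (unpacked & 0x08) unpacked -= 16;                 // sign-extend
    assert(unpacked == v);
  }
  return 0;
}
```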
Compatibility
Resolves: TODO comment "Implement support for int4 and uint4" in cast_op.cc