impl : use 6 digits for tensor dims #20094

Merged

CISC merged 1 commit into ggml-org:master from ddh0:6-digit-tensor-shape on Mar 4, 2026
Conversation

ddh0 (Contributor) commented on Mar 4, 2026

Many models have vocabulary sizes, and thus tensor shapes, with more than 5 digits (ex: Gemma 3's vocab size is 262,208).

I had already fixed this for the `llama_format_tensor_shape` overload that takes a tensor, but missed the overload that takes a vector until now. Oops.
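Worth noting: a printf-style field width is a minimum, not a cap, so a 6-digit dim such as 262208 still prints in full under a 5-wide specifier; what breaks is the column alignment of the logged shapes. A minimal sketch of the widened formatting, assuming a hypothetical `format_shape` helper (the buffer size and separators here are illustrative, loosely modeled on the helper named above):

```cpp
#include <cinttypes>
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical sketch: format dims as "[    d0,     d1, ...]".
// "%6" PRId64 right-pads each dim to at least 6 characters, so a
// 6-digit vocab size lines up with smaller dims in adjacent rows.
static std::string format_shape(const std::vector<int64_t> & ne) {
    char buf[256];
    int n = 0;
    for (size_t i = 0; i < ne.size(); i++) {
        n += snprintf(buf + n, sizeof(buf) - n, "%s%6" PRId64,
                      i == 0 ? "[" : ", ", ne[i]);
    }
    snprintf(buf + n, sizeof(buf) - n, "]");
    return buf;
}
```

Called with a Gemma 3-sized shape, `format_shape({262208, 1152})` yields `[262208,   1152]`; a shape with only small dims, e.g. `{4096}`, pads out to `[  4096]`, so columns stay aligned either way.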


Many models have vocabulary sizes, and thus tensor shapes, with more
than 5 digits (ex: Gemma 3's vocab size is 262,208).

I already fixed this for `llama_format_tensor_shape` (tensor overload)
but missed it for `llama_format_tensor_shape` (vector overload) until now. Oops.
@ddh0 ddh0 requested a review from ggerganov as a code owner March 4, 2026 00:57
@CISC CISC merged commit c99909d into ggml-org:master Mar 4, 2026
78 checks passed
