This repository was archived by the owner on Jul 4, 2025. It is now read-only.

Conversation

@tikikun
Contributor

@tikikun tikikun commented Jan 2, 2024

No description provided.

@tikikun tikikun added P0: critical Mission critical type: bug Something isn't working labels Jan 2, 2024
@tikikun tikikun self-assigned this Jan 2, 2024
                                        slot.params.n_keep);
                new_tokens.insert(new_tokens.end(),
                                  prompt_tokens.begin() + slot.params.n_keep +
                                      erased_blocks * n_block_size,

Check failure

Code scanning / CodeQL: Multiplication result converted to larger type

Multiplication result may overflow 'int' before it is converted to 'difference_type'.
Contributor Author


Using upstream llama.cpp right now

@tikikun tikikun merged commit cf7d3e8 into main Jan 2, 2024
@tikikun tikikun linked an issue Jan 2, 2024 that may be closed by this pull request
@hiro-v hiro-v deleted the version-pump-upgrade-bug-fixing-llava branch January 30, 2024 16:39



Development

Successfully merging this pull request may close these issues:

bug: Cannot compile on latest version of llama cpp

3 participants