Conversation

RyanMullins (Contributor) commented:

Gemma 2 added logit soft-capping to .call_with_cache() in #1673, but this was not paralleled in the .score() function, so the logits, loss, and derived attributes (e.g., gradients) returned by .score() will differ from those returned by .generate(). This PR brings the two back into parity.
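For context, Gemma 2's soft-capping squashes each logit into the open interval (-cap, cap) with a scaled tanh before the softmax. A minimal sketch of the transform (the helper name and the 30.0 default are illustrative assumptions, not the exact code from #1673):

```python
from keras import ops

def soft_cap(logits, cap=30.0):
    """Soft-cap logits into (-cap, cap) via a scaled tanh."""
    # Approximately the identity for |logits| << cap;
    # saturates smoothly at +/-cap for large magnitudes.
    return cap * ops.tanh(logits / cap)
```

If this transform runs in .call_with_cache() but not in .score(), any model configured with soft-capping will score tokens against uncapped logits, so losses and gradients computed from .score() will not match generation.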

github-actions bot added the Gemma (Gemma model specific issues) label on Jul 26, 2024
mattdangerw (Member) left a comment:


LGTM! Will pull in when tests finish.

mattdangerw merged commit fa0fbb7 into keras-team:master on Jul 26, 2024