This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Conversation

@jhrozek jhrozek commented Dec 16, 2024

We need to run the normalizer and denormalizer for Anthropic even when the output pipeline has no steps, because Anthropic uses a special message format. However, we can't run the pipeline unconditionally for every provider, because llama.cpp still lacks a normalizer/denormalizer. That needs to be added; until then, this adds a temporary hack to get FIM working with Anthropic (see the sketch below).
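
To make the intent concrete, here is a minimal sketch of the idea, not the actual code in this PR; names such as `run_output_stage`, `normalizer`, and `output_steps` are hypothetical stand-ins for the pipeline pieces described above:

```python
def run_output_stage(provider_name, raw_messages, normalizer, output_steps):
    """Run the output pipeline, (de)normalizing around it when needed.

    Anthropic uses its own message format, so normalization and
    denormalization must happen even when there are no output steps.
    llama.cpp has no normalizer yet, so it cannot take this path.
    """
    # Temporary hack: only Anthropic is forced through the
    # normalizer/denormalizer when the output pipeline is empty.
    needs_format_conversion = provider_name == "anthropic"

    if not output_steps and not needs_format_conversion:
        # Nothing to run and no format conversion required.
        return raw_messages

    normalized = normalizer.normalize(raw_messages)
    for step in output_steps:
        normalized = step.process(normalized)
    return normalizer.denormalize(normalized)
```

Once llama.cpp gets its own normalizer/denormalizer, the provider check can be dropped and every provider can go through the same path.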

Co-authored-by: Pankaj Telang <pankaj@stacklok.com>
Co-authored-by: Alejandro Ponce <aponcedeleonch@stacklok.com>

@ptelang ptelang left a comment

FIM works for all providers with this fix.
