Conversation

bhouston
Member

Closes #145

@bhouston bhouston merged commit 080c8fb into main Mar 11, 2025
1 check failed
🎉 This PR is included in version 0.10.0 🎉

The release is available on the GitHub release page.

Your semantic-release bot 📦🚀

🎉 This PR is included in version mycoder-agent-v1.0.0 🎉

The release is available on:

Your semantic-release bot 📦🚀

@bhouston bhouston deleted the feature/145-token-caching branch March 12, 2025 02:08
sentry-io bot commented Mar 12, 2025

Suspect Issues

This pull request was deployed and Sentry observed the following issues:

  • ‼️ Error: Error calling XAI API: Unexpected token 'F', "Failed to "... is not valid JSON (XAIProvider.generateText, xai.ts)
  • ‼️ Error: Error calling XAI API: 404 {"error":{"code":404,"message":"The requested resource was not found. Please check the URL and try again. Documentation is available at https://docs.x.ai/"}} (XAIProvider.generateText, xai.ts)
  • ‼️ Error: Error calling Ollama API: fetch failed (OllamaProvider.generateText, ollama.ts)
  • ‼️ Error: Error calling Ollama API: fetch failed (OllamaProvider.generateText, ollama.ts)

Did you find this useful? React with a 👍 or 👎
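The first Sentry issue suggests a plain-text error body (beginning "Failed to ...") was fed straight into `JSON.parse`, producing the unhelpful "Unexpected token 'F'" message. A minimal sketch of defensive response parsing, assuming a hypothetical `parseApiResponse` helper (not the actual mycoder code):

```typescript
// Illustrative sketch: parse an API response body defensively so that
// plain-text error payloads surface as readable errors instead of
// "Unexpected token 'F', ... is not valid JSON" from JSON.parse.
function parseApiResponse(status: number, body: string): unknown {
  let parsed: unknown;
  try {
    parsed = JSON.parse(body);
  } catch {
    // Non-JSON body: report the status and a snippet of the raw text.
    throw new Error(
      `API returned non-JSON response (HTTP ${status}): ${body.slice(0, 200)}`,
    );
  }
  if (status >= 400) {
    // JSON error payload (like the 404 body above): keep it structured.
    throw new Error(`API error (HTTP ${status}): ${JSON.stringify(parsed)}`);
  }
  return parsed;
}
```

With a shape like this, both the non-JSON failure and the 404 body in the issues above would produce messages that name the HTTP status and the raw payload.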


Development

Successfully merging this pull request may close these issues.

Add token caching to the LLM abstraction

1 participant