Python: Allow for structured outputs with Ollama #12533


Merged (1 commit, Jun 19, 2025)

Conversation

moonbox3 (Contributor)

Motivation and Context

Ollama's execution settings currently only accept the `json` literal for the `format` field. We also want to support structured outputs for models that support them.
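For context, Ollama's chat API takes a `format` field that may be either the literal string `"json"` or a full JSON schema describing the expected response shape. The sketch below (a hypothetical stdlib-only helper, not the actual Semantic Kernel implementation) illustrates the difference between the old json-literal behavior and a structured-output request body:

```python
import json
from typing import Optional


def build_ollama_request(
    model: str, prompt: str, response_schema: Optional[dict] = None
) -> dict:
    """Build a request body for Ollama's /api/chat endpoint.

    Hypothetical helper for illustration: when `response_schema` is given,
    it is passed as a JSON schema in the `format` field (structured output);
    otherwise `format` is omitted entirely.
    """
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    if response_schema is not None:
        # Structured output: `format` carries a JSON schema, not just "json".
        body["format"] = response_schema
    return body


# Example schema the model's response must conform to.
schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}

request = build_ollama_request("llama3.2", "Who is Ada Lovelace?", schema)
print(json.dumps(request, indent=2))
```

In Semantic Kernel terms, the change widens what the Ollama execution settings accept for `format` so that a schema (e.g. derived from a Pydantic model via `response_format`) can flow through to the request instead of only the `"json"` literal.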

Description

Contribution Checklist

@moonbox3 moonbox3 self-assigned this Jun 19, 2025
@moonbox3 moonbox3 requested a review from a team as a code owner June 19, 2025 03:55
@markwallace-microsoft markwallace-microsoft added the python Pull requests for the Python Semantic Kernel label Jun 19, 2025
@markwallace-microsoft (Member)

Python Test Coverage

Python Test Coverage Report

File                                   Stmts   Miss   Cover   Missing
connectors/ai/ollama
  ollama_prompt_execution_settings.py     14      0    100%
TOTAL                                  26433   3944     85%

Python Unit Test Overview

Tests   Skipped   Failures   Errors   Time
3632    22 💤      0 ❌       0 🔥     1m 54s ⏱️

@eavanvalkenburg eavanvalkenburg added this pull request to the merge queue Jun 19, 2025
Merged via the queue into microsoft:main with commit 466a610 Jun 19, 2025
30 checks passed
Labels
python Pull requests for the Python Semantic Kernel
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Python: New Feature: OllamaChatPromptExecutionSettings does not support structured outputs
4 participants