Description
Scope check
- This is core LLM communication (not application logic)
- This benefits most users (not just my use case)
- This can't be solved in application code with current RubyLLM
- I read the Contributing Guide
Due diligence
- I searched existing issues
- I checked the documentation
What problem does this solve?
Screencast overview: https://www.loom.com/share/247efdca09dc4e03b60c141cb3ae6503
OpenAI's response_format requires a 'name' for json_schema definitions. This change enables custom schema names, which are useful for debugging, logging, and clarity when working with multiple schemas.
Schema names also modify model behavior - I discovered this when migrating to RubyLLM from another gem. My results differed when using with_schema because the name was getting hardcoded as 'response', whereas I previously had it set to 'department_classification_schema'.
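For context, here is where the name sits in an OpenAI structured-output request (abbreviated; the schema body is elided):

```ruby
# Abbreviated OpenAI request body. The `name` inside `json_schema`
# is the field RubyLLM currently hardcodes to 'response'.
{
  response_format: {
    type: 'json_schema',
    json_schema: {
      name: 'department_classification_schema',
      strict: true,
      schema: { ... } # the actual JSON Schema definition
    }
  }
}
```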
Running the same classification prompt 200 times on a single ticket, with the schema name as the only difference, I get the following results on confidence score:
Average Confidence Comparison:
- schema name: 'department_classification_schema': 88.47%
- schema name: 'response': 74.09%
Proposed solution
- Allow `with_schema` to accept either a nested 'schema' object (current behavior) or the full schema configuration with a `name` field
- Schema name from RubyLLM schema objects should use the schema class name by default instead of 'response'
So you can pass either:
```ruby
# new full schema format - supports `name`, `schema`, `strict` (not shown)
chat = RubyLLM.chat(model: 'gpt-4.1-mini')
  .with_schema(
    name: 'department_classification',
    schema: {
      type: 'object',
      properties: {
        department_id: { type: 'string' },
        confidence: { type: 'number' },
        reasoning: { type: 'string' }
      },
      required: ['department_id', 'confidence', 'reasoning'],
      additionalProperties: false
    }
  )
```

or just the inner schema object (current behavior - sends 'response' as the schema name):
```ruby
# current behavior also accepted - just pass the inner `schema` field
chat = RubyLLM.chat(model: 'gpt-4.1-mini')
  .with_schema(
    type: 'object',
    properties: {
      department_id: { type: 'string' },
      confidence: { type: 'number' },
      reasoning: { type: 'string' }
    },
    required: ['department_id', 'confidence', 'reasoning'],
    additionalProperties: false
  )
```
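As a rough illustration, `with_schema` could normalize both formats along these lines (a minimal sketch; the internals and default values here are assumptions, not RubyLLM's actual implementation):

```ruby
# Hypothetical sketch - not RubyLLM's actual code.
# Detect the full format by the presence of a :schema key;
# otherwise wrap the inner schema with the default name.
def with_schema(schema)
  @schema =
    if schema.is_a?(Hash) && schema.key?(:schema)
      { name: 'response' }.merge(schema) # caller's name wins
    else
      { name: 'response', schema: schema }
    end
  self # keep the chainable interface
end
```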
I would also propose that when using a RubyLLM::Schema we use the schema class name by default instead of 'response', or that we update RubyLLM::Schema to take a schema_name property if you want to set one:
```ruby
# Proposed feature for ruby_llm-schema:
class DepartmentClassificationSchema < RubyLLM::Schema
  schema_name 'department_classification' # ← NEW: Override the class name

  string :department_id
  number :confidence
  string :reasoning
end

DepartmentClassificationSchema.new.to_json_schema
# Would return:
# {
#   name: "department_classification", # ← Custom name instead of class name
#   description: nil,
#   schema: { ... }
# }
```
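One possible shape for that macro (a sketch only; ruby_llm-schema's internals may differ, and `default_schema_name` is a hypothetical helper):

```ruby
# Hypothetical sketch of a schema_name class macro.
# RubyLLM::Schema would `extend SchemaNaming`.
module SchemaNaming
  def schema_name(value = nil)
    @schema_name = value if value
    @schema_name || default_schema_name
  end

  private

  # e.g. DepartmentClassificationSchema -> "department_classification_schema"
  def default_schema_name
    name.split('::').last
        .gsub(/([a-z\d])([A-Z])/, '\1_\2')
        .downcase
  end
end
```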
Why this belongs in RubyLLM
The schema name is core LLM communication, and as shown above it measurably impacts model behavior.