Walkthrough

The OpenAPI specification was updated to add streaming support for speech audio endpoints, enrich transcription responses with usage data, and revise code interpreter tool call schemas and related streaming events. New schemas for audio streaming, usage statistics, and code interpreter outputs were introduced, while examples and descriptions were clarified and corrected.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant API
    Client->>API: POST /audio/speech (request with text)
    alt Streaming enabled (text/event-stream)
        loop While audio chunks available
            API-->>Client: SpeechAudioDeltaEvent (audio chunk)
        end
        API-->>Client: SpeechAudioDoneEvent (completion, usage stats)
    else Full audio
        API-->>Client: Audio file (audio/mpeg or similar)
    end
```

```mermaid
sequenceDiagram
    participant Client
    participant API
    Client->>API: POST /audio/transcriptions (audio file)
    API-->>Client: Transcription response (includes usage: tokens or duration)
```

```mermaid
sequenceDiagram
    participant Client
    participant API
    Client->>API: Code interpreter tool call
    API-->>Client: Response events (with item_id, outputs: logs/images)
```
Actionable comments posted: 3
🧹 Nitpick comments (2)
src/libs/tryAGI.OpenAI/openapi.yaml (2)
9188-9223: Add URI format to image URL.
Consider adding `format: uri` under the `url` property in `CodeInterpreterOutputImage` for stronger schema validation.
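A minimal sketch of the suggested change; only the `url` property and the `format: uri` addition come from the comment, the surrounding keys are assumed:

```yaml
CodeInterpreterOutputImage:
  type: object
  properties:
    url:
      type: string
      format: uri   # suggested addition for stronger schema validation
```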
11840-11846: Harmonize description quoting.
The multi-line `description` uses both quotes and backticks. Consider standardizing to single-quote YAML style and escaping backticks for readability.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (76)
All are excluded by `!**/generated/**`; every path is under `src/libs/tryAGI.OpenAI/Generated/`:
- tryAGI.OpenAI..JsonSerializerContext.g.cs
- tryAGI.OpenAI.AudioClient.CreateSpeech.g.cs
- tryAGI.OpenAI.IAudioClient.CreateSpeech.g.cs
- tryAGI.OpenAI.JsonConverters.CodeInterpreterOutputImageType.g.cs
- tryAGI.OpenAI.JsonConverters.CodeInterpreterOutputImageTypeNullable.g.cs
- tryAGI.OpenAI.JsonConverters.CodeInterpreterOutputLogsType.g.cs
- tryAGI.OpenAI.JsonConverters.CodeInterpreterOutputLogsTypeNullable.g.cs
- tryAGI.OpenAI.JsonConverters.Container.g.cs
- tryAGI.OpenAI.JsonConverters.CreateSpeechRequestStreamFormat.g.cs
- tryAGI.OpenAI.JsonConverters.CreateSpeechRequestStreamFormatNullable.g.cs
- tryAGI.OpenAI.JsonConverters.CreateSpeechResponseStreamEvent.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeSessionCreateRequestClientSecretExpiresAfterAnchor.g.cs
- tryAGI.OpenAI.JsonConverters.RealtimeSessionCreateRequestClientSecretExpiresAfterAnchorNullable.g.cs
- tryAGI.OpenAI.JsonConverters.SpeechAudioDeltaEventType.g.cs
- tryAGI.OpenAI.JsonConverters.SpeechAudioDeltaEventTypeNullable.g.cs
- tryAGI.OpenAI.JsonConverters.SpeechAudioDoneEventType.g.cs
- tryAGI.OpenAI.JsonConverters.SpeechAudioDoneEventTypeNullable.g.cs
- tryAGI.OpenAI.JsonConverters.TranscriptTextUsageDurationType.g.cs
- tryAGI.OpenAI.JsonConverters.TranscriptTextUsageDurationTypeNullable.g.cs
- tryAGI.OpenAI.JsonConverters.TranscriptTextUsageTokensType.g.cs
- tryAGI.OpenAI.JsonConverters.TranscriptTextUsageTokensTypeNullable.g.cs
- tryAGI.OpenAI.JsonSerializerContextTypes.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterOutputImage.Json.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterOutputImage.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterOutputImageType.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterOutputLogs.Json.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterOutputLogs.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterOutputLogsType.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterTool.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterToolCall.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterToolCallOutputsDiscriminator.Json.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterToolCallOutputsDiscriminator.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterToolCallStatus.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterToolCallType.g.cs
- tryAGI.OpenAI.Models.CodeInterpreterToolOutput.g.cs
- tryAGI.OpenAI.Models.Container.g.cs
- tryAGI.OpenAI.Models.ContainerFileCitationBody.g.cs
- tryAGI.OpenAI.Models.CreateSpeechRequest.g.cs
- tryAGI.OpenAI.Models.CreateSpeechRequestStreamFormat.g.cs
- tryAGI.OpenAI.Models.CreateSpeechResponseStreamEvent.Json.g.cs
- tryAGI.OpenAI.Models.CreateSpeechResponseStreamEvent.g.cs
- tryAGI.OpenAI.Models.CreateSpeechResponseStreamEventDiscriminator.Json.g.cs
- tryAGI.OpenAI.Models.CreateSpeechResponseStreamEventDiscriminator.g.cs
- tryAGI.OpenAI.Models.CreateTranscriptionResponseJson.g.cs
- tryAGI.OpenAI.Models.CreateTranscriptionResponseVerboseJson.g.cs
- tryAGI.OpenAI.Models.FileCitationBody.g.cs
- tryAGI.OpenAI.Models.RealtimeServerEventConversationItemInputAudioTranscriptionCompleted.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateRequestClientSecret.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateRequestClientSecretExpiresAfter.Json.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateRequestClientSecretExpiresAfter.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateRequestClientSecretExpiresAfterAnchor.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateResponse.g.cs
- tryAGI.OpenAI.Models.RealtimeSessionCreateResponseInputAudioTranscription.g.cs
- tryAGI.OpenAI.Models.ResponseCodeInterpreterCallCodeDeltaEvent.g.cs
- tryAGI.OpenAI.Models.ResponseCodeInterpreterCallCodeDoneEvent.g.cs
- tryAGI.OpenAI.Models.ResponseCodeInterpreterCallCompletedEvent.g.cs
- tryAGI.OpenAI.Models.ResponseCodeInterpreterCallInProgressEvent.g.cs
- tryAGI.OpenAI.Models.ResponseCodeInterpreterCallInterpretingEvent.g.cs
- tryAGI.OpenAI.Models.ResponseStreamEvent.g.cs
- tryAGI.OpenAI.Models.SpeechAudioDeltaEvent.Json.g.cs
- tryAGI.OpenAI.Models.SpeechAudioDeltaEvent.g.cs
- tryAGI.OpenAI.Models.SpeechAudioDeltaEventType.g.cs
- tryAGI.OpenAI.Models.SpeechAudioDoneEvent.Json.g.cs
- tryAGI.OpenAI.Models.SpeechAudioDoneEvent.g.cs
- tryAGI.OpenAI.Models.SpeechAudioDoneEventType.g.cs
- tryAGI.OpenAI.Models.SpeechAudioDoneEventUsage.Json.g.cs
- tryAGI.OpenAI.Models.SpeechAudioDoneEventUsage.g.cs
- tryAGI.OpenAI.Models.TranscriptTextDoneEvent.g.cs
- tryAGI.OpenAI.Models.TranscriptTextUsageDuration.Json.g.cs
- tryAGI.OpenAI.Models.TranscriptTextUsageDuration.g.cs
- tryAGI.OpenAI.Models.TranscriptTextUsageDurationType.g.cs
- tryAGI.OpenAI.Models.TranscriptTextUsageTokens.Json.g.cs
- tryAGI.OpenAI.Models.TranscriptTextUsageTokens.g.cs
- tryAGI.OpenAI.Models.TranscriptTextUsageTokensInputTokenDetails.Json.g.cs
- tryAGI.OpenAI.Models.TranscriptTextUsageTokensInputTokenDetails.g.cs
- tryAGI.OpenAI.Models.TranscriptTextUsageTokensType.g.cs
📒 Files selected for processing (1)
src/libs/tryAGI.OpenAI/openapi.yaml (29 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Test / Build, test and publish
🔇 Additional comments (24)
src/libs/tryAGI.OpenAI/openapi.yaml (24)
307-310: Transcription example includes usage field.
The JSON example now shows a `usage` object with token counts, matching the new `TranscriptTextUsageTokens` schema. Looks correct.
318-322: Streaming transcription deltas include logprobs.
The `transcript.text.delta` stream example correctly embeds `logprobs`. This aligns with the updated schema.
329-333: Verbose transcription logprobs example updated with usage.
The example now includes a `usage` block for token-usage billing. Schema alignment is correct.
341-344: Word timestamps example includes duration usage.
The example's `usage` property uses the `duration` variant as expected.
353-357: Segment timestamps example includes duration usage.
The `segments` example was updated to include a `usage` object with `seconds`. Good.
2169-2174: Simplified permissions retrieval example.
Examples now directly retrieve `first_id` instead of iterating pages, which simplifies usage.
5440-5440: Corrected delete method name in JavaScript.
Changed from `del` to `delete` to match the client API.
9279-9326: Update CodeInterpreterToolCall schema.
`container_id`, `code`, and `outputs` have been added and `results` removed, aligning the schema with the new tool call contract.
11998-12010: Verify SSE support constraints.
`stream_format` now accepts `sse` and `audio`, with `sse` not supported for `tts-1` or `tts-1-hd`. Confirm that downstream clients (e.g., `gpt-4o-mini-tts`) are correctly flagged if SSE is unavailable.
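The constraint above can also be mirrored client-side. A hypothetical guard, sketched in Node.js — the helper name and request shape are illustrative; only the model/`stream_format` rules come from the spec change:

```javascript
// Hypothetical request builder enforcing the documented constraint:
// stream_format "sse" is not supported for tts-1 or tts-1-hd.
function buildSpeechRequest({ model, input, voice, streamFormat = "audio" }) {
  const SSE_UNSUPPORTED = new Set(["tts-1", "tts-1-hd"]);
  if (streamFormat === "sse" && SSE_UNSUPPORTED.has(model)) {
    throw new Error(`stream_format "sse" is not supported for model ${model}`);
  }
  return { model, input, voice, stream_format: streamFormat };
}
```

A generated client could surface this check as a precondition instead of waiting for a server-side error.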
12344-12349: Add usage object to transcription response schema.
The new `usage` oneOf reference correctly ties to the token/duration usage variants.
12385-12388: Include usage in verbose transcription schema.
The new `usage` reference for verbose JSON is aligned with duration-metered billing.
18147-18150: Clarify input audio transcription event docs.
The expanded description under `conversation.item.input_audio_transcription.completed` provides necessary context.
19507-19509: Enhance model description in transcription config.
The added description clarifies the `model` field's purpose.
20072-20077: Include `item_id` in delta event required list.
Adding `item_id` ensures all code-interpreter delta events carry the tool-call identifier.
20083-20094: Document delta event properties.
Explicit `description` fields were added for `type`, `output_index`, `delta`, and `sequence_number`, improving schema clarity.
20112-20123: Document done event properties.
The `response.code_interpreter_call_code.done` event now includes descriptions for `item_id`, `code`, and `sequence_number`. Looks correct.
23094-23147: Add speech audio streaming events.
The `SpeechAudioDeltaEvent` and `SpeechAudioDoneEvent` schemas are properly defined with required fields and usage stats.
23476-23500: Define transcription usage schemas.
`TranscriptTextUsageDuration` and `TranscriptTextUsageTokens` are correctly introduced for the dual billing models.
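Consumers will need to branch on which usage variant arrives. A minimal sketch — the field names (`type`, `seconds`, and the token counters) follow the review notes above and should be treated as assumptions, not confirmed schema details:

```javascript
// Hypothetical helper that discriminates the two transcription usage variants.
function summarizeUsage(usage) {
  if (usage.type === "duration") {
    // Duration-metered billing: seconds of audio processed.
    return `billed duration: ${usage.seconds}s`;
  }
  if (usage.type === "tokens") {
    // Token-metered billing: input/output token counts.
    return `billed tokens: ${usage.total_tokens} (in ${usage.input_tokens}, out ${usage.output_tokens})`;
  }
  throw new Error(`unknown usage variant: ${usage.type}`);
}
```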
24999-25003: Add `filename` to file citation.
Including `filename` provides clarity in the citation object; ensure client libraries handle this new field.
25018-25020: Document citation filename.
The `filename` description clarifies its purpose in file lists.
25060-25061: Extend URL citation with filename.
Adding `filename` aligns URL citations with other citation schemas.
25082-25084: Add container filename to citation.
The new `filename` property provides context for container file citations.
26856-26870: Register code-interpreter events in mappings.
All new `ResponseCodeInterpreterCall*` event keys have been added to the top-level event list.
27101-27106: Expose speech-audio event docs in nav.
Entries for `SpeechAudioDeltaEvent` and `SpeechAudioDoneEvent` are now included in the mapping for generated docs.
```yaml
text/event-stream:
  schema:
    $ref: '#/components/schemas/CreateSpeechResponseStreamEvent'
```
Fix Node.js SSE example.
The SSE (text/event-stream) example still uses blob-based file writes instead of a streaming client (e.g., EventSource or fetch with ReadableStream). Update the Node.js snippet to illustrate proper SSE consumption.
Also applies to: 248-266
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml between lines 242 to 244 and 248 to 266,
the Node.js SSE example incorrectly uses blob-based file writes instead of a
streaming client. Update the example to use a proper SSE consumption method such
as the EventSource API or fetch with ReadableStream to handle the
'text/event-stream' response correctly. Replace the blob handling code with
streaming logic that processes incoming events as they arrive.
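To illustrate the kind of change the prompt describes, here is a sketch of incremental SSE parsing in Node.js. `parseSSEChunk` is a hypothetical helper (not part of the spec or the SDK); the fetch/ReadableStream wiring is shown only as comments, and the event payload shape is assumed:

```javascript
// Split a text/event-stream buffer into complete events plus a remainder.
// SSE events are separated by a blank line; each "data:" line carries JSON.
function parseSSEChunk(buffer) {
  const events = [];
  const parts = buffer.split("\n\n");
  const remainder = parts.pop(); // possibly incomplete event; keep for the next read
  for (const part of parts) {
    for (const line of part.split("\n")) {
      if (line.startsWith("data: ")) {
        events.push(JSON.parse(line.slice(6)));
      }
    }
  }
  return { events, remainder };
}

// With fetch, the response body can then be consumed incrementally, e.g.:
//   let buf = "";
//   const decoder = new TextDecoder();
//   for await (const chunk of response.body) {
//     buf += decoder.decode(chunk, { stream: true });
//     const parsed = parseSSEChunk(buf);
//     buf = parsed.remainder;
//     for (const event of parsed.events) { /* handle delta/done events */ }
//   }
```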
```diff
 group: fine-tuning
 returns: 'A list of fine-tuning [checkpoint objects](/docs/api-reference/fine-tuning/checkpoint-object) for a fine-tuning job.'
 examples:
-  response: "{\n \"object\": \"list\"\n \"data\": [\n {\n \"object\": \"fine_tuning.job.checkpoint\",\n \"id\": \"ftckpt_zc4Q7MP6XxulcVzj4MZdwsAB\",\n \"created_at\": 1721764867,\n \"fine_tuned_model_checkpoint\": \"ft:gpt-4o-mini-2024-07-18:my-org:custom-suffix:96olL566:ckpt-step-2000\",\n \"metrics\": {\n \"full_valid_loss\": 0.134,\n \"full_valid_mean_token_accuracy\": 0.874\n },\n \"fine_tuning_job_id\": \"ftjob-abc123\",\n \"step_number\": 2000,\n },\n {\n \"object\": \"fine_tuning.job.checkpoint\",\n \"id\": \"ftckpt_enQCFmOTGj3syEpYVhBRLTSy\",\n \"created_at\": 1721764800,\n \"fine_tuned_model_checkpoint\": \"ft:gpt-4o-mini-2024-07-18:my-org:custom-suffix:7q8mpxmy:ckpt-step-1000\",\n \"metrics\": {\n \"full_valid_loss\": 0.167,\n \"full_valid_mean_token_accuracy\": 0.781\n },\n \"fine_tuning_job_id\": \"ftjob-abc123\",\n \"step_number\": 1000,\n },\n ],\n \"first_id\": \"ftckpt_zc4Q7MP6XxulcVzj4MZdwsAB\",\n \"last_id\": \"ftckpt_enQCFmOTGj3syEpYVhBRLTSy\",\n \"has_more\": true\n}\n"
+  response: "{\n \"object\": \"list\",\n \"data\": [\n {\n \"object\": \"fine_tuning.job.checkpoint\",\n \"id\": \"ftckpt_zc4Q7MP6XxulcVzj4MZdwsAB\",\n \"created_at\": 1721764867,\n \"fine_tuned_model_checkpoint\": \"ft:gpt-4o-mini-2024-07-18:my-org:custom-suffix:96olL566:ckpt-step-2000\",\n \"metrics\": {\n \"full_valid_loss\": 0.134,\n \"full_valid_mean_token_accuracy\": 0.874\n },\n \"fine_tuning_job_id\": \"ftjob-abc123\",\n \"step_number\": 2000\n },\n {\n \"object\": \"fine_tuning.job.checkpoint\",\n \"id\": \"ftckpt_enQCFmOTGj3syEpYVhBRLTSy\",\n \"created_at\": 1721764800,\n \"fine_tuned_model_checkpoint\": \"ft:gpt-4o-mini-2024-07-18:my-org:custom-suffix:7q8mpxmy:ckpt-step-1000\",\n \"metrics\": {\n \"full_valid_loss\": 0.167,\n \"full_valid_mean_token_accuracy\": 0.781\n },\n \"fine_tuning_job_id\": \"ftjob-abc123\",\n \"step_number\": 1000\n }\n ],\n \"first_id\": \"ftckpt_zc4Q7MP6XxulcVzj4MZdwsAB\",\n \"last_id\": \"ftckpt_enQCFmOTGj3syEpYVhBRLTSy\",\n \"has_more\": true\n}\n"
   request:
     curl: "curl https://api.openai.com/v1/fine_tuning/jobs/ftjob-abc123/checkpoints \\\n -H \"Authorization: Bearer $OPENAI_API_KEY\"\n"
     node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\n// Automatically fetches more pages as needed.\nfor await (const fineTuningJobCheckpoint of client.fineTuning.jobs.checkpoints.list(\n 'ft-AF1WoRqd3aJAHsqc9NY7iL8F',\n)) {\n console.log(fineTuningJobCheckpoint.id);\n}"
```
Mismatch in checkpoint list ID examples.
The curl and Node.js examples use different checkpoint identifiers (ftjob-abc123 vs. ft-AF1WoRqd3aJAHsqc9NY7iL8F). Align these IDs for consistency.
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around lines 2494 to 2500, the checkpoint
list ID examples in the curl and Node.js sections are inconsistent, using
`ftjob-abc123` in one and `ft-AF1WoRqd3aJAHsqc9NY7iL8F` in the other. Update the
examples so both use the same checkpoint identifier to maintain consistency
across the documentation.
```yaml
expires_after:
  required:
    - anchor
  type: object
  properties:
    anchor:
      enum:
        - created_at
      type: string
      description: "The anchor point for the ephemeral token expiration. Only `created_at` is currently supported.\n"
      default: created_at
    seconds:
      type: integer
```
Enforce expires_after.seconds requirement.
`expires_after` now requires `anchor` but omits `seconds` from the `required` list. This allows `seconds` to be absent; please add `seconds` to the required list.
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around lines 19449 to 19460, the
expires_after object includes an anchor but does not list seconds as a required
property. To fix this, add seconds to the required array under expires_after to
ensure it is always present as mandated.
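The fix the comment asks for, sketched against the snippet above (indentation assumed; only the `required` list changes):

```yaml
expires_after:
  required:
    - anchor
    - seconds   # added so the expiry window must always be supplied
  type: object
```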