v1.2.3
Highlights
- fixed a potential history removal error when trimming the chat history:
```swift
// before:
history.removeFirst(2)
// after:
history.removeFirst(min(2, history.count))
```
  This used to be a problem when users manually added an odd number of chats to the history, or when the race condition issue (#10) occurred.
- fixed a potential race condition issue (#10) by adding a global actor attribute to the concurrent functions that mutate `LLM`'s properties:
```swift
@globalActor public actor InferenceActor {
    static public let shared = InferenceActor()
}
...
@InferenceActor
private func predictNextToken() async -> Token

@InferenceActor
private func finishResponse(from response: inout [String], to output: borrowing AsyncStream<String>.Continuation) async

@InferenceActor
public func getCompletion(from input: borrowing String) async -> String

@InferenceActor
public func respond(to input: String, with makeOutputFrom: @escaping (AsyncStream<String>) async -> String) async
```
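To illustrate why this fixes the race, here is a minimal sketch (not the library's actual implementation; the `LLM` members shown are stand-ins): every function annotated with `@InferenceActor` runs on the same global actor's serial executor, so two concurrent callers can never mutate the same property at the same time.

```swift
// Declaring a global actor: the type must expose a `shared` instance.
@globalActor actor InferenceActor {
    static let shared = InferenceActor()
}

final class LLM {
    // Hypothetical mutable state that was previously racy.
    private var history: [String] = []

    // Both mutating functions are isolated to InferenceActor,
    // so their bodies can never interleave with each other.
    @InferenceActor
    func add(chat: String) {
        history.append(chat)
    }

    @InferenceActor
    func trimHistory() {
        // Clamping also avoids the removeFirst crash fixed above.
        history.removeFirst(min(2, history.count))
    }
}

// Callers hop onto the actor with `await`; the hops are serialized:
//     await llm.add(chat: "hello")
//     await llm.trimHistory()
```

The trade-off of a global actor (versus making `LLM` itself an actor) is that isolation can be applied selectively, function by function, without changing every call site of the class.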