
v1.2.3

@eastriverlee eastriverlee released this 27 Jan 15:41
· 36 commits to main since this release

Highlights

  1. Fixed a potential crash when removing entries from the history:
```swift
// before:
history.removeFirst(2)

// after:
history.removeFirst(min(2, history.count))
```

This used to be a problem when a user manually added an odd number of chats to the history, or when the race condition issue (#10) occurred.
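A minimal sketch of why the clamp matters (the `history` array here is a stand-in, not the library's actual type): `removeFirst(_:)` traps at runtime if asked to remove more elements than the collection holds, while `min(2, history.count)` makes the call safe for any history length.

```swift
// Hypothetical reproduction: a history with a single (odd) entry.
var history: [String] = ["user: hi"]

// history.removeFirst(2) // would trap: removes more elements than exist

// Clamping to the current count removes at most what is available.
history.removeFirst(min(2, history.count))
print(history.isEmpty)
```

The same pattern applies wherever a fixed-size removal meets a caller-controlled collection length.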

  2. Fixed the potential race condition issue (#10) by adding an actor attribute to the concurrent functions that change the LLM's properties:
```swift
@globalActor public actor InferenceActor {
    static public let shared = InferenceActor()
}

...

@InferenceActor
private func predictNextToken() async -> Token

@InferenceActor
private func finishResponse(from response: inout [String], to output: borrowing AsyncStream<String>.Continuation) async

@InferenceActor
public func getCompletion(from input: borrowing String) async -> String

@InferenceActor
public func respond(to input: String, with makeOutputFrom: @escaping (AsyncStream<String>) async -> String) async
```