
Conversation

@kiranandcode
Contributor

Closes #407

from enum import Enum

import openai

# Template, handler, LLMLoggingHandler, OpenAIAPIProvider, and KAheadSampler
# are imported from this project's modules (paths omitted here).

class MovieGenre(str, Enum):
    ACTION = "action"
    COMEDY = "comedy"
    DRAMA = "drama"
    HORROR = "horror"
    SCIFI = "sci-fi"
    ROMANCE = "romance"


@Template.define
def classify_genre(plot: str) -> MovieGenre:
    """Classify the movie genre based on this plot: {plot}
    Respond with exactly one of: action, comedy, drama, horror, sci-fi, romance"""
    raise NotImplementedError

plot = "A rogue cop must stop a terrorist group from detonating bombs across the city."
with handler(LLMLoggingHandler()), handler(OpenAIAPIProvider(openai.OpenAI())):
    with handler(KAheadSampler()):
        genre = classify_genre(plot)
    print(f"\nsampled genre: {genre}")

@kiranandcode kiranandcode requested a review from jfeser November 24, 2025 17:19
@kiranandcode kiranandcode added this to the LLM Infrastructure milestone Nov 24, 2025
@kiranandcode kiranandcode linked an issue Nov 24, 2025 that may be closed by this pull request
@kiranandcode kiranandcode requested a review from eb8680 November 24, 2025 17:19
Contributor

@jfeser jfeser left a comment


The code as written seems fine, but I think we'd do better to eventually provide a more general batching interface. One option could be:

@defop
def sample(template: Template[P, T], n: int) -> Template[P, Sequence[T]]: ...
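A minimal, library-agnostic sketch of the batching idea above. `Template`, `defop`, and the handler stack are project-specific, so a plain callable stands in for a template here; the `sample` wrapper below is purely illustrative.

```python
from typing import Callable, Sequence, TypeVar

T = TypeVar("T")

def sample(template: Callable[..., T], n: int) -> Callable[..., Sequence[T]]:
    """Lift a single-result template into one that draws n samples."""
    def batched(*args, **kwargs) -> Sequence[T]:
        # In the real library each call would route through the effect
        # handler stack; here we just invoke the stand-in n times.
        return [template(*args, **kwargs) for _ in range(n)]
    return batched

# Toy stand-in "template" that deterministically classifies a plot.
def classify(plot: str) -> str:
    return "action" if "cop" in plot else "drama"

batched = sample(classify, 3)
print(batched("A rogue cop chases a thief."))  # ['action', 'action', 'action']
```

A batching operation like this would let samplers request n completions in one call, which an API provider could serve with a single batched request instead of n sequential ones.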

res = fut.result()
self.votes[res] += 1
tasks.append(executor.submit(interpreter(intp)(fwd), *args, **kwargs))
executor.shutdown()
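For context, a self-contained sketch of the majority-vote pattern the excerpt above comes from. `interpreter`, `intp`, and `fwd` are project internals, so a plain function (with a fixed list of answers, keeping the sketch deterministic where a real sampler would be stochastic) stands in for the sampled computation.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor, as_completed

# Stand-in for one sampled LLM call.
def sample_once(answer: str) -> str:
    return answer

samples = ["action", "drama", "action", "action", "drama"]
votes: "Counter[str]" = Counter()
with ThreadPoolExecutor(max_workers=3) as executor:
    tasks = [executor.submit(sample_once, s) for s in samples]
    for fut in as_completed(tasks):
        votes[fut.result()] += 1  # tally each completed sample
winner, count = votes.most_common(1)[0]
print(winner, count)  # action 3
```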
Contributor


This won't cancel ongoing work. Not sure if that's the desired semantics.
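For reference, the stdlib semantics at issue: `Executor.shutdown()` waits for already-running tasks rather than interrupting them (and `cancel_futures`, added in Python 3.9, only cancels tasks that have not started). A minimal demonstration with a thread pool:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow_task() -> str:
    time.sleep(0.2)  # simulates in-flight work
    return "done"

executor = ThreadPoolExecutor(max_workers=1)
fut = executor.submit(slow_task)
executor.shutdown()   # blocks until slow_task finishes; does not interrupt it
print(fut.result())   # "done": the running task was never cancelled
```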

Contributor Author


Yep, I figured raising exceptions might mess up the state, so it's best to just let the subprocesses run to completion.

@jfeser jfeser self-requested a review November 24, 2025 20:12
@kiranandcode kiranandcode marked this pull request as ready for review November 24, 2025 21:42
@jfeser jfeser merged commit b9207b4 into staging-llm Nov 24, 2025
5 checks passed
@jfeser jfeser deleted the kg-sampling-handlers branch November 24, 2025 22:22


Development

Successfully merging this pull request may close these issues.

Implement handlers for LLM sampling strategies

3 participants