27 changes: 0 additions & 27 deletions CHANGELOG.md
@@ -1,30 +1,3 @@
## [v0.2.0](https://github.com/generative-computing/mellea/releases/tag/v0.2.0) - 2025-11-19

### Feature

* Change backend functions to use async; add generate_from_raw ([`16b8aea`](https://github.com/generative-computing/mellea/commit/16b8aea1ab4fc18428adafb2c6106314d986c537))
* Updates for intrinsics support ([#227](https://github.com/generative-computing/mellea/issues/227)) ([`52953a5`](https://github.com/generative-computing/mellea/commit/52953a507729e8683d8b027d7c1e6d70b2356955))
* Add requirements and preconditions to gen slots ([#226](https://github.com/generative-computing/mellea/issues/226)) ([`f73d8e2`](https://github.com/generative-computing/mellea/commit/f73d8e23c57146b44e8b552f5e30315e353ff592))
* MelleaSession.register for functional interface and MelleaSession.powerup for dynamic mixin (register all methods in a class) ([#224](https://github.com/generative-computing/mellea/issues/224)) ([`662cfcc`](https://github.com/generative-computing/mellea/commit/662cfcc99c365411c7dcee0d55fcd0cba21bd4b8))
* Add secure Python code execution with llm-sandbox support ([#217](https://github.com/generative-computing/mellea/issues/217)) ([`9d12458`](https://github.com/generative-computing/mellea/commit/9d12458432db3c1172d79ffdcbfae50f2bf8b402))
* Adds think budget-forcing ([#107](https://github.com/generative-computing/mellea/issues/107)) ([`a2e29e6`](https://github.com/generative-computing/mellea/commit/a2e29e633b9f470d3992335becb8231dc57d0d69))
* Making generate_from_raw public ([#219](https://github.com/generative-computing/mellea/issues/219)) ([`7eae224`](https://github.com/generative-computing/mellea/commit/7eae2244763a4349e202e6b87502d23e111ea07e))
* Conda/Mamba-based installation script ([#138](https://github.com/generative-computing/mellea/issues/138)) ([`6aea9dc`](https://github.com/generative-computing/mellea/commit/6aea9dc85b0147a22ff5a5553a75d9179958ce6e))
* Adds a vllm backend ([#122](https://github.com/generative-computing/mellea/issues/122)) ([`21908e5`](https://github.com/generative-computing/mellea/commit/21908e5bbc6bfd3bfd6f84953cefb3f6a56fccf2))
* Add the ability to run examples with pytest ([#198](https://github.com/generative-computing/mellea/issues/198)) ([`e30afe6`](https://github.com/generative-computing/mellea/commit/e30afe6148d68b6ef1d6aa3417823c7a51ff0743))
* Ollama generate_from_raw uses existing event loop ([#204](https://github.com/generative-computing/mellea/issues/204)) ([`36a069f`](https://github.com/generative-computing/mellea/commit/36a069fb6f9912a25c5c8aa51a5fe46ce2e945d3))

### Fix

* Vllm format issues ([`abbde23`](https://github.com/generative-computing/mellea/commit/abbde236d4d5900a3717d4a6af4759743dcd21d9))
* Some minor fixes ([#223](https://github.com/generative-computing/mellea/issues/223)) ([`7fa0891`](https://github.com/generative-computing/mellea/commit/7fa08915573ee696d230dffef5532be8b7d3b7e3))
* Watsonx self._project_id not getting set ([#220](https://github.com/generative-computing/mellea/issues/220)) ([`10f6ffa`](https://github.com/generative-computing/mellea/commit/10f6ffa35ea089b2396d184b18a1efbac75b94a7))
* Decomp subtask regex ([#218](https://github.com/generative-computing/mellea/issues/218)) ([`5ac34be`](https://github.com/generative-computing/mellea/commit/5ac34be51ee1d14678888d53c6374810a7ed5871))

### Documentation

* Adding pii m serve example ([#215](https://github.com/generative-computing/mellea/issues/215)) ([`54f13f4`](https://github.com/generative-computing/mellea/commit/54f13f4c0314ff21189a4a06051dfea84b5420d1))

## [v0.1.3](https://github.com/generative-computing/mellea/releases/tag/v0.1.3) - 2025-10-22

### Feature
2 changes: 1 addition & 1 deletion LICENSE
@@ -187,7 +187,7 @@
same "printed page" as the copyright notice for easier
identification within third-party archives.

Copyright 2025 IBM Corp.
Copyright [yyyy] [name of copyright owner]

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
2 changes: 1 addition & 1 deletion docs/examples/conftest.py
@@ -88,7 +88,7 @@ def runtest(self):

if retcode != 0:
raise ExampleTestException(
(f"Example failed with exit code {retcode}.\nStderr: {stderr}\n")
f"Example failed with exit code {retcode}.\nStderr: {stderr}\n"
)

def repr_failure(self, excinfo, style=None):
2 changes: 1 addition & 1 deletion docs/examples/context/contexts_with_sampling.py
@@ -27,7 +27,7 @@
print(f"Total Generation Attempts: {len(res.sample_generations)}")
print()

print(f"Getting index of another result.")
print("Getting index of another result.")
index = 0 # Just choose the first one.

print(
2 changes: 1 addition & 1 deletion docs/examples/image_text_models/vision_litellm_backend.py
@@ -1,6 +1,7 @@
"""Examples of using vision models with LiteLLM backend."""

import os
import pathlib

import litellm
from PIL import Image
@@ -9,7 +10,6 @@
from mellea.backends.litellm import LiteLLMBackend
from mellea.backends.openai import OpenAIBackend
from mellea.stdlib.base import ImageBlock
import pathlib

# use LiteLLM to talk to Ollama or anthropic or.....
m = MelleaSession(LiteLLMBackend("ollama/granite3.2-vision"))
1 change: 1 addition & 0 deletions docs/examples/image_text_models/vision_ollama_chat.py
@@ -1,6 +1,7 @@
"""Example of using Ollama with vision models with linear context."""

import pathlib

from PIL import Image

from mellea import start_session
4 changes: 1 addition & 3 deletions docs/examples/information_extraction/101_with_gen_slots.py
@@ -8,9 +8,7 @@

@generative
def extract_all_person_names(doc: str) -> list[str]:
"""
Given a document, extract names of ALL mentioned persons. Return these names as list of strings.
"""
"""Given a document, extract names of ALL mentioned persons. Return these names as list of strings."""


# ref: https://www.nytimes.com/2012/05/20/world/world-leaders-at-us-meeting-urge-growth-not-austerity.html
2 changes: 1 addition & 1 deletion docs/examples/m_serve/pii_serve.py
@@ -2,8 +2,8 @@

import spacy

from cli.serve.models import ChatMessage
import mellea
from cli.serve.models import ChatMessage
from mellea.backends.model_ids import IBM_GRANITE_4_MICRO_3B
from mellea.stdlib.base import ModelOutputThunk
from mellea.stdlib.requirement import req, simple_validate
2 changes: 1 addition & 1 deletion docs/examples/safety/guardian.py
@@ -3,7 +3,7 @@
from mellea import MelleaSession
from mellea.backends import model_ids
from mellea.backends.ollama import OllamaModelBackend
from mellea.stdlib.base import ContextTurn, ModelOutputThunk, ChatContext
from mellea.stdlib.base import ChatContext, ContextTurn, ModelOutputThunk
from mellea.stdlib.chat import Message
from mellea.stdlib.safety.guardian import GuardianCheck, GuardianRisk

4 changes: 2 additions & 2 deletions docs/examples/safety/guardian_huggingface.py
@@ -6,8 +6,8 @@

from mellea import MelleaSession
from mellea.backends import model_ids
from mellea.backends.ollama import OllamaModelBackend
from mellea.backends.huggingface import LocalHFBackend
from mellea.backends.ollama import OllamaModelBackend
from mellea.stdlib.base import ChatContext, ModelOutputThunk, ModelToolCall
from mellea.stdlib.chat import Message
from mellea.stdlib.safety.guardian import GuardianCheck, GuardianRisk
@@ -45,7 +45,7 @@
print(f"Guardian detected harm: {not validation_result[0]._result}")

if validation_result[0]._reason:
print(f"\nGuardian feedback:")
print("\nGuardian feedback:")
print(validation_result[0]._reason[:200] + "...")

# Test 2: Groundedness detection
9 changes: 4 additions & 5 deletions docs/examples/safety/repair_with_guardian.py
@@ -1,5 +1,4 @@
"""
RepairTemplateStrategy Example with Actual Function Call Validation
"""RepairTemplateStrategy Example with Actual Function Call Validation
Demonstrates how RepairTemplateStrategy repairs responses using actual function calls.
"""

@@ -78,10 +77,10 @@ def get_stock_price(symbol: str) -> str:
if hasattr(m.backend, "formatter"):
try:
rendered = m.backend.formatter.print(action)
print(f" Instruction sent to model:")
print(f" ---")
print(" Instruction sent to model:")
print(" ---")
print(f" {rendered}")
print(f" ---")
print(" ---")
except Exception:
pass

1 change: 1 addition & 0 deletions docs/examples/sessions/creating_a_new_type_of_session.py
@@ -1,4 +1,5 @@
from typing import Literal

from PIL import Image as PILImage

from mellea import MelleaSession
77 changes: 0 additions & 77 deletions docs/examples/tools/interpreter_example.py

This file was deleted.
