
[BUG] langchain could not determine a constructor for the tag error #11858

Open
minkj1992 opened this issue Apr 30, 2024 · 2 comments
Labels
area/examples Example code area/model-registry Model registry, model registry APIs, and the fluent client calls for model registry area/models MLmodel format, model serialization/deserialization, flavors bug Something isn't working

Comments

@minkj1992
Contributor

Issues Policy acknowledgement

  • I have read and agree to submit bug reports in accordance with the issues policy

Where did you encounter this bug?

Local machine

Willingness to contribute

Yes. I can contribute a fix for this bug independently.

MLflow version

mlflow --version
mlflow, version 2.12.1

System information

system_profiler SPSoftwareDataType SPHardwareDataType

Software:

System Software Overview:

  System Version: macOS 14.4.1 (23E224)
  Kernel Version: Darwin 23.4.0
  Secure Virtual Memory: Enabled
  System Integrity Protection: Enabled

Hardware:

Hardware Overview:
  Model Name: MacBook Pro
  Chip: Apple M1 Max
  System Firmware Version: 10151.101.3
  OS Loader Version: 10151.101.3

python --version
Python 3.12.2

Describe the problem

I've been following the approach outlined in the link below, but the YAML isn't generated properly when I include the output parser.

https://mlflow.org/docs/latest/llms/langchain/notebooks/langchain-quickstart.html#Logging-the-Chain-in-MLflow

I'm aware of the warning "MLflow does not guarantee support for LLMs outside of HuggingFaceHub and OpenAI, found Ollama", but I concluded this is a bug because removing the parser-related code prevents any error other than that Ollama warning.

Tracking information

REPLACE_ME

Code to reproduce issue

import mlflow
from langchain.chains import LLMChain
from langchain_community.llms import Ollama
from langchain_core.output_parsers import JsonOutputParser, PydanticOutputParser
from langchain_core.prompts import PromptTemplate
from langchain_core.pydantic_v1 import BaseModel, Field

coherence_template = """You are the judge evaluating coherence/logic when you receive the debate topic and the content of the discussion.

...skip...

## Input
  TOPIC: {topic},
  Debate: {debate}

Below are the criteria for deducting points based on coherence/logic.


"""

topic = "Mocking topic"
debate = "Mocking debate"


class LLMResult(BaseModel):
    a_score: int = Field(description="Score of Team A")
    a_reason: str = Field(description="Reason for deduction for Team A")
    b_score: int = Field(description="Score of Team B")
    b_reason: str = Field(description="Reason for deduction for Team B")


llm = Ollama(model="llama3", temperature=0, verbose=True)
parser = PydanticOutputParser(pydantic_object=LLMResult)
# parser = JsonOutputParser(pydantic_object=LLMResult)
coherence_prompt = PromptTemplate(
    template=coherence_template,
    input_variables=[
        "topic",
        "debate",
    ],
    partial_variables={"format_instructions": parser.get_format_instructions()},
)


chain = LLMChain(llm=llm, prompt=coherence_prompt, output_parser=parser)

mlflow.set_experiment("TEST 04")

with mlflow.start_run():
    model_info = mlflow.langchain.log_model(chain, "langchain_model")

loaded_model = mlflow.pyfunc.load_model(model_info.model_uri)
result = loaded_model.predict({"topic": topic, "debate": debate})
print(result)

Stack trace

2024/04/30 10:45:46 WARNING mlflow.utils.environment: Encountered an unexpected error while inferring pip requirements (model URI: /var/folders/j3/3llb09155g55ct9f02mqn4200000gn/T/tmpon9sez4r/model, flavor: langchain). Fall back to return ['langchain==0.1.16', 'pydantic==2.7.1', 'cloudpickle==3.0.0']. Set logging level to DEBUG to see the full traceback. 
Traceback (most recent call last):
  File "/Users/minwook/code/personal/eval/main.py", line 52, in <module>
    loaded_model = mlflow.pyfunc.load_model(model_info.model_uri)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/mlflow/pyfunc/__init__.py", line 892, in load_model
    model_impl = importlib.import_module(conf[MAIN])._load_pyfunc(data_path)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/mlflow/langchain/__init__.py", line 818, in _load_pyfunc
    return wrapper_cls(_load_model_from_local_fs(path))
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/mlflow/langchain/__init__.py", line 846, in _load_model_from_local_fs
    return _load_model(local_model_path, flavor_conf)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/mlflow/langchain/__init__.py", line 563, in _load_model
    model = _load_base_lcs(local_model_path, flavor_conf)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/mlflow/langchain/utils.py", line 522, in _load_base_lcs
    model = _patch_loader(load_chain)(lc_model_path)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/mlflow/langchain/utils.py", line 467, in patched_loader
    return loader_func(*args, **kwargs, allow_dangerous_deserialization=True)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/langchain/chains/loading.py", line 631, in load_chain
    return _load_chain_from_file(path, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/langchain/chains/loading.py", line 647, in _load_chain_from_file
    config = yaml.safe_load(f)
             ^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/yaml/__init__.py", line 125, in safe_load
    return load(stream, SafeLoader)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/yaml/__init__.py", line 81, in load
    return loader.get_single_data()
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/yaml/constructor.py", line 51, in get_single_data
    return self.construct_document(node)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/yaml/constructor.py", line 60, in construct_document
    for dummy in generator:
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/yaml/constructor.py", line 413, in construct_yaml_map
    value = self.construct_mapping(node)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/yaml/constructor.py", line 218, in construct_mapping
    return super().construct_mapping(node, deep=deep)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/yaml/constructor.py", line 143, in construct_mapping
    value = self.construct_object(value_node, deep=deep)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/yaml/constructor.py", line 100, in construct_object
    data = constructor(self, node)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/mlops/lib/python3.12/site-packages/yaml/constructor.py", line 427, in construct_undefined
    raise ConstructorError(None, None,
yaml.constructor.ConstructorError: could not determine a constructor for the tag 'tag:yaml.org,2002:python/name:__main__.LLMResult'
  in "/Users/minwook/code/personal/eval/mlruns/732209676491018788/6f58a326322449e2a5fa620b94a4ff60/artifacts/langchain_model/model.yaml", line 33, column 20
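For context, the root cause is visible with PyYAML alone: the serialized model.yaml contains a python/name tag pointing at a class defined in __main__, and yaml.safe_load (which LangChain's load_chain uses) has no constructor for any python/* tag. A minimal, hedged reproduction, assuming only that PyYAML is installed (the YAML line below is illustrative, not copied from the actual model.yaml):

```python
import yaml

# The serialized chain ends up containing a tag like the one below when the
# output parser's Pydantic class was defined in __main__ at logging time.
doc = "output_parser: !!python/name:__main__.LLMResult"

error_message = ""
try:
    yaml.safe_load(doc)  # SafeLoader has no constructor for python/name tags
except yaml.constructor.ConstructorError as err:
    error_message = str(err)

print(error_message)
```

This prints the same "could not determine a constructor for the tag 'tag:yaml.org,2002:python/name:__main__.LLMResult'" error as the stack trace above, which is why the failure only appears once the output parser is part of the chain.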

Other info / logs

REPLACE_ME

What component(s) does this bug affect?

  • area/artifacts: Artifact stores and artifact logging
  • area/build: Build and test infrastructure for MLflow
  • area/deployments: MLflow Deployments client APIs, server, and third-party Deployments integrations
  • area/docs: MLflow documentation pages
  • area/examples: Example code
  • area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
  • area/models: MLmodel format, model serialization/deserialization, flavors
  • area/recipes: Recipes, Recipe APIs, Recipe configs, Recipe Templates
  • area/projects: MLproject format, project running backends
  • area/scoring: MLflow Model server, model deployment tools, Spark UDFs
  • area/server-infra: MLflow Tracking server backend
  • area/tracking: Tracking Service, tracking client APIs, autologging

What interface(s) does this bug affect?

  • area/uiux: Front-end, user experience, plotting, JavaScript, JavaScript dev server
  • area/docker: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
  • area/sqlalchemy: Use of SQLAlchemy in the Tracking Service or Model Registry
  • area/windows: Windows support

What language(s) does this bug affect?

  • language/r: R APIs and clients
  • language/java: Java APIs and clients
  • language/new: Proposals for new client languages

What integration(s) does this bug affect?

  • integrations/azure: Azure and Azure ML integrations
  • integrations/sagemaker: SageMaker integrations
  • integrations/databricks: Databricks integrations
@minkj1992 minkj1992 added the bug Something isn't working label Apr 30, 2024
@github-actions github-actions bot added area/examples Example code area/model-registry Model registry, model registry APIs, and the fluent client calls for model registry area/models MLmodel format, model serialization/deserialization, flavors labels Apr 30, 2024
@WeichenXu123
Collaborator

__main__.LLMResult

The class is defined in the __main__ module, so the serialized chain references it by a name that cannot be resolved when the model is loaded elsewhere.

You can move the LLMResult class definition into a separate Python module file and log that file via code_paths to address it.
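A sketch of that fix under stated assumptions (the file name llm_result.py is my own choice, and the module body is reduced to a stub standing in for the full Pydantic definition from the bug report). The snippet demonstrates the underlying mechanism: a class imported from a named module no longer reports __main__ as its module, so the serialized chain can reference an importable path instead.

```python
import importlib.util
import os
import sys
import tempfile

# Hypothetical llm_result.py content; in practice it would hold the full
# Pydantic LLMResult definition from the repro script above.
MODULE_SOURCE = "class LLMResult:\n    pass\n"

tmpdir = tempfile.mkdtemp()
module_path = os.path.join(tmpdir, "llm_result.py")
with open(module_path, "w") as f:
    f.write(MODULE_SOURCE)

# Import the class from the named module instead of defining it in __main__.
spec = importlib.util.spec_from_file_location("llm_result", module_path)
llm_result = importlib.util.module_from_spec(spec)
sys.modules["llm_result"] = llm_result
spec.loader.exec_module(llm_result)

# The class now carries an importable module path, which is what the
# serialized chain would reference instead of __main__.LLMResult.
print(llm_result.LLMResult.__module__)
```

With the class living in llm_result.py, the logging call would then pass the file along, e.g. mlflow.langchain.log_model(chain, "langchain_model", code_paths=["llm_result.py"]), so MLflow copies the file into the model artifact and can import LLMResult at load time.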


github-actions bot commented May 8, 2024

@mlflow/mlflow-team Please assign a maintainer and start triaging this issue.
