
graph_transformers.llm.py create_simple_model not constraining relationships with enums when using OpenAI LLM #24615

Closed
dennisjunior111 opened this issue Jul 24, 2024 · 0 comments
Labels
🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature), 🔌: openai (Primarily related to OpenAI integrations)

Comments

@dennisjunior111

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from langchain_experimental.graph_transformers.llm import create_simple_model
from langchain_openai import ChatOpenAI


llm = ChatOpenAI(
    temperature=0,
    model_name="gpt-4o-mini-2024-07-18",
)

schema = create_simple_model(
    node_labels=["Person", "Organization"],
    rel_types=["KNOWS", "EMPLOYED_BY"],
    llm_type=llm._llm_type,  # "openai-chat"
)

print(schema.schema_json(indent=4))
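
A small follow-up check (not part of the original report) that walks the schema printed above and reports which fields carry an enum constraint; with the buggy behaviour, only SimpleNode.type should show one.

import json

schema_dict = json.loads(schema.schema_json())
for model_name, definition in schema_dict.get("definitions", {}).items():
    for field_name, spec in definition.get("properties", {}).items():
        status = "enum" if "enum" in spec else "no enum constraint"
        print(f"{model_name}.{field_name}: {status}")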

Error Message and Stack Trace (if applicable)

No response

Description

The _Graph pydantic model generated by create_simple_model (which LLMGraphTransformer uses when allowed nodes and relationships are provided) does not constrain the relationship fields (source and target node types, relationship type) or the node and relationship properties with enums when using ChatOpenAI.

This can be seen by printing the JSON schema generated from _Graph: enum is missing from every field except SimpleNode.type.

The cause is that most of the optional_enum_field calls inside create_simple_model do not pass the llm_type parameter; only the call that creates the node type does. Passing it into each call fixes the issue (a sketch of the mechanism follows the schema output below).

{
    "title": "DynamicGraph",
    "description": "Represents a graph document consisting of nodes and relationships.",
    "type": "object",
    "properties": {
        "nodes": {
            "title": "Nodes",
            "description": "List of nodes",
            "type": "array",
            "items": {
                "$ref": "#/definitions/SimpleNode"
            }
        },
        "relationships": {
            "title": "Relationships",
            "description": "List of relationships",
            "type": "array",
            "items": {
                "$ref": "#/definitions/SimpleRelationship"
            }
        }
    },
    "definitions": {
        "SimpleNode": {
            "title": "SimpleNode",
            "type": "object",
            "properties": {
                "id": {
                    "title": "Id",
                    "description": "Name or human-readable unique identifier.",
                    "type": "string"
                },
                "type": {
                    "title": "Type",
                    "description": "The type or label of the node.. Available options are ['Person', 'Organization']",
                    "enum": [
                        "Person",
                        "Organization"
                    ],
                    "type": "string"
                }
            },
            "required": [
                "id",
                "type"
            ]
        },
        "SimpleRelationship": {
            "title": "SimpleRelationship",
            "type": "object",
            "properties": {
                "source_node_id": {
                    "title": "Source Node Id",
                    "description": "Name or human-readable unique identifier of source node",
                    "type": "string"
                },
                "source_node_type": {
                    "title": "Source Node Type",
                    "description": "The type or label of the source node.. Available options are ['Person', 'Organization']",
                    "type": "string"
                },
                "target_node_id": {
                    "title": "Target Node Id",
                    "description": "Name or human-readable unique identifier of target node",
                    "type": "string"
                },
                "target_node_type": {
                    "title": "Target Node Type",
                    "description": "The type or label of the target node.. Available options are ['Person', 'Organization']",
                    "type": "string"
                },
                "type": {
                    "title": "Type",
                    "description": "The type of the relationship.. Available options are ['KNOWS', 'EMPLOYED_BY']",
                    "type": "string"
                }
            },
            "required": [
                "source_node_id",
                "source_node_type",
                "target_node_id",
                "target_node_type",
                "type"
            ]
        }
    }
}
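
For illustration, here is a minimal, self-contained sketch of the mechanism behind the fix. The constrained_field helper below is hypothetical and only mirrors the behaviour described above; it is not the library's optional_enum_field. The point is that an enum attached to a pydantic Field shows up as an "enum" entry in the generated JSON schema, which is what actually constrains OpenAI function calling; when llm_type is not forwarded, the enum is never attached and the constraint is silently dropped.

from typing import Any, List, Optional

from langchain_core.pydantic_v1 import BaseModel, Field


def constrained_field(
    values: Optional[List[str]],
    description: str,
    llm_type: Optional[str] = None,
) -> Any:
    # Hypothetical stand-in for optional_enum_field: the enum is only attached
    # when the caller identifies an OpenAI chat model via llm_type.
    if values and llm_type == "openai-chat":
        return Field(..., description=description, enum=values)
    return Field(..., description=description)


class DemoRelationship(BaseModel):
    # With llm_type forwarded, "enum": ["KNOWS", "EMPLOYED_BY"] appears in the
    # schema; without it, the field is an unconstrained string, as in the
    # SimpleRelationship output above.
    type: str = constrained_field(
        ["KNOWS", "EMPLOYED_BY"],
        "The type of the relationship.",
        llm_type="openai-chat",
    )


print(DemoRelationship.schema_json(indent=2))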

System Info

> pip freeze | grep langchain
langchain==0.2.10
langchain-community==0.2.9
langchain-core==0.2.22
langchain-experimental==0.0.62
langchain-openai==0.1.17
langchain-text-splitters==0.2.2

Platform: Windows (WSL2)
Python 3.10.14

@dosubot dosubot bot added the 🔌: openai and 🤖:bug labels Jul 24, 2024
ccurme pushed a commit that referenced this issue Jul 29, 2024 (#24643)

issue: #24615

description: The _Graph pydantic model generated from create_simple_model (which LLMGraphTransformer uses when allowed nodes and relationships are provided) does not constrain the relationships (source and target types, relationship type), and the node and relationship properties with enums when using ChatOpenAI. The issue is that when calling optional_enum_field throughout create_simple_model the llm_type parameter is not passed in except for when creating node type. Passing it into each call fixes the issue.

Co-authored-by: Lifu Wu <lifu@nextbillion.ai>
olgamurraft pushed a commit to olgamurraft/langchain that referenced this issue Aug 16, 2024 (langchain-ai#24643)
@dosubot dosubot bot added the stale label (Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed) Oct 23, 2024
@dosubot dosubot bot closed this as not planned (won't fix, can't repro, duplicate, stale) Oct 30, 2024
@dosubot dosubot bot removed the stale label Oct 30, 2024