Context Precision Prompt Example wrong? #479

@almajo

Description

Describe the bug

Hi, thanks for the work on this helpful framework!
I was going through the metric prompts and I don't understand why the second Context Precision example has verdict 1 for the given context:

{
    "question": """who won 2020 icc world cup?""",
    "context": """Who won the 2022 ICC Men's T20 World Cup?""",
    "answer": """England""",
    "verification": {
        "reason": "the context was useful in clarifying the situation regarding the 2020 ICC World Cup and indicating that England was the winner of the tournament that was intended to be held in 2020 but actually took place in 2022.",
        "verdict": "1",
    },
},

For me, that's a very misleading example.
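To make the confusion concrete, here is a toy year-matching heuristic (my own illustration, not the actual LLM-based ragas metric): on a surface reading, the question asks about 2020 while the context talks about 2022, so a naive relevance check would produce verdict 0. The prompt's verdict of 1 only makes sense with the outside knowledge that the 2020 edition was postponed and played in 2022.

```python
import re

def naive_verdict(question: str, context: str) -> int:
    """Toy heuristic: call the context 'useful' (verdict 1) only if it
    mentions the same year as the question. Illustrative only."""
    q_years = set(re.findall(r"\b(?:19|20)\d{2}\b", question))
    c_years = set(re.findall(r"\b(?:19|20)\d{2}\b", context))
    return 1 if q_years & c_years else 0

question = "who won 2020 icc world cup?"
context = "Who won the 2022 ICC Men's T20 World Cup?"

# Years don't overlap (2020 vs 2022), so the naive check says 0,
# which is what a reader would expect from this example.
print(naive_verdict(question, context))  # → 0
```

This is of course not how the real metric works (it delegates the judgment to an LLM), but it shows why the example reads as contradictory without the postponement context spelled out.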

Ragas version: built from main
Python version: 3.11
