Skip to content

8vana/promptmap

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

7 Commits
 
 
 
 
 
 
 
 

Repository files navigation

promptmap

logo

Introduction

PromptMap is a Prompt Injection attacks testing tool.
This tool performs fully automated Prompt Injection attack tests against them to assess the robustness of generative AI and generative AI-integrated apps. This tool is intended to be used by developers for security testing.

Features

PromptMap supports the following attack tests.

Direct Prompt Injection/Jailbreak

PromptMap injects malicious prompts into a generative AI and evaluates whether the generative AI generates malicious contents or leaks generative AI's training data.

Prompt Leaking

PromptMap injects malicious prompts into generative AI-integrated applications and evaluates whether the generative AI-integrated applications leak the prompt templates implemented by the apps.

P2SQL Injection

PromptMap injects malicious prompts into generative AI-integrated applications and evaluates to steal, modify, or delete information from the database connected to the generative AI-integrated applications.

Download

Documentation

Demo

Contribute

Prompt Injection attacks have different principles from those used in existing attack methods, and it is difficult to evaluate their robustness using existing security testing methods.

Therefore, PromptMap supports a wide variety of Prompt Injection attacks and enables fully automated execution, contributing to security testing for developers of generative AI and generative AI-integrated applications.

Donate

License

Copyright © 13o-bbr-bbq and mahoyaya. All rights reserved.

Disclaimer

Developers

About

No description, website, or topics provided.

Resources

License

Stars

Watchers

Forks

Releases

No releases published

Packages

No packages published