
null1024-ws/Poisoning-Attack-on-Code-Completion-Models


This is the repository for the paper "An LLM-Assisted Easy-to-Trigger Poisoning Attack on Code Completion Models: Injecting Disguised Vulnerabilities against Strong Detection".

  • Evasion Strategies: This folder contains the transformation algorithms used to evade vulnerability analysis (a hypothetical sketch of such a transformation follows this list).
  • CodeModel: This folder contains the poisoning attacks on code completion models.
  • Test Cases: This folder contains test cases covering 15 CWEs, used to verify the effectiveness of our transformation strategies.
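To make the evasion idea concrete, below is a minimal, hypothetical sketch (not taken from this repository) of the kind of source-level transformation such strategies rely on: the insecure behavior is preserved while its textual fingerprint is obscured, so that simple pattern-based vulnerability analysis no longer matches it. The function names and the chosen weakness (CWE-327, use of a broken hash) are illustrative assumptions.

```python
# Hypothetical illustration only -- not code from this repository.
import hashlib
import importlib


def hash_password_plain(password: str) -> str:
    # Original payload: a static analyzer matching on "hashlib.md5"
    # flags this insecure hash usage (CWE-327).
    return hashlib.md5(password.encode()).hexdigest()


def hash_password_disguised(password: str) -> str:
    # Disguised variant: the module and algorithm names are reassembled
    # at runtime, so a textual match on "hashlib.md5" no longer fires,
    # while the executed behavior stays identical.
    crypto_mod = importlib.import_module("hash" + "lib")
    digest_fn = getattr(crypto_mod, "".join(["m", "d", "5"]))
    return digest_fn(password.encode()).hexdigest()


if __name__ == "__main__":
    assert hash_password_plain("secret") == hash_password_disguised("secret")
    print("Both variants compute the same (insecure) digest.")
```

The disguised function is semantically equivalent to the plain one; only its source-level appearance changes, which is the property an evasion strategy needs in order to slip past detection while still injecting the vulnerable behavior.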
