RezaTorbati/ComputerSecurityProject
ComputerSecurityProject

Group 14's project for Computer Security at the University of Oklahoma. The goal is to detect when someone is trying to use adversarial perturbations to fool YOLOv4.
To accomplish this, we implemented three methods: a Gaussian noise test, a statistical test, and a custom model.

Running the Gaussian Noise test

  1. Install the dependencies. This can be done with:
    python3 -m pip install opencv-python
    python3 -m pip install tensorflow
    python3 -m pip install yolov4
  2. Download the weights here and put them in Gaussian/ModelData.
  3. Modify line 19 of Gaussian/analyze.py to point at the image of your choice, then run it with
    python analyze.py
    For the image to be analyzed, there must be a corresponding YOLO label file, such as the provided "examples.txt".
  4. Alternatively, create a folder called "images" in the Gaussian folder and put any number of images of interest in it. Then uncomment lines 20 and 74 of analyze.py and run python analyze.py. This will analyze all images in the folder and print the aggregate results. For this to return anything useful, you must also have a folder called "labels" with YOLO-style labels for each image.
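The intuition behind the Gaussian noise test can be sketched as follows: adversarial perturbations tend to be fragile, so detections that disappear or shift sharply after mild Gaussian noise is added are suspicious. The helper names below are illustrative only and do not match the ones in analyze.py:

```python
import numpy as np

def add_gaussian_noise(image, sigma=10.0, seed=None):
    """Add zero-mean Gaussian noise to a uint8 image (H x W x C)."""
    rng = np.random.default_rng(seed)
    noisy = image.astype(np.float64) + rng.normal(0.0, sigma, image.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

def iou(box_a, box_b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def detections_stable(boxes_clean, boxes_noisy, iou_thresh=0.5):
    """True if every clean detection survives the noise (matched by IoU).

    A clean image's detections usually persist under mild noise; an
    adversarially perturbed image's detections often do not.
    """
    return all(any(iou(b, n) >= iou_thresh for n in boxes_noisy)
               for b in boxes_clean)
```

In practice you would run the detector once on the original image and once on the noised copy, then flag the image if `detections_stable` returns False.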

Running the Statistical Test

  1. After installing the dependencies for the Gaussian test, you'll just need to pip install torch.
  2. StatisticalDetection.py expects two paths to folders of images. APRICOT can be downloaded here and COCO can be downloaded here.
  3. Once you have the image folders, set lines 101 and 102 to one or both of the folder paths. Then run python StatsDetect/StatisticalDetection.py to obtain the MMD value.
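For reference, the Maximum Mean Discrepancy (MMD) the script reports measures how different two sample distributions are. A biased RBF-kernel estimate can be sketched like this (a NumPy sketch; the repository's implementation uses torch and may differ in kernel and bandwidth choices):

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of a and b."""
    # Pairwise squared Euclidean distances via the expansion
    # ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y
    sq = np.sum(a**2, axis=1)[:, None] + np.sum(b**2, axis=1)[None, :] \
         - 2.0 * a @ b.T
    return np.exp(-gamma * sq)

def mmd2(x, y, gamma=1.0):
    """Biased estimate of squared MMD between samples x and y.

    Near zero when x and y come from the same distribution;
    grows as the distributions diverge.
    """
    return (rbf_kernel(x, x, gamma).mean()
            + rbf_kernel(y, y, gamma).mean()
            - 2.0 * rbf_kernel(x, y, gamma).mean())
```

Here `x` and `y` would be feature vectors extracted from the two image folders; a large MMD suggests the two sets are drawn from different distributions.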

Running the Custom Version of YOLO

  1. Run the following in CustomModel:
    git clone https://github.com/AlexeyAB/darknet
    cd darknet
    make
  2. Download the default YOLO weights from here and put them in CustomModel/DefaultData.
  3. Download the custom YOLO weights from here and put them in CustomModel/CustomData.
  4. For the default model, in CustomModel/darknet run
    ./darknet detector test ../DefaultData/coco.data ../DefaultData/yolov4.cfg ../DefaultData/yolov4.weights ../adversaryEx.jpg -thresh 0.3
    and observe the predicted results in CustomModel/darknet/predictions.jpg.
  5. For the custom model, in CustomModel/darknet run
    ./darknet detector test ../CustomData/adversary.data ../CustomData/adversary.cfg ../CustomData/adversary.weights ../adversaryEx.jpg -thresh 0.3
    and observe the predicted results in CustomModel/darknet/predictions.jpg.
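If you want to run both models over many images, the commands above can be wrapped in a small Python helper. This is a hypothetical convenience script, not part of the repository:

```python
import subprocess

def run_darknet(data, cfg, weights, image, thresh=0.3, darknet="./darknet"):
    """Invoke darknet's detector on one image and return its stdout.

    `darknet` should be the path to the binary built in CustomModel/darknet;
    the data/cfg/weights paths mirror the steps above.
    """
    cmd = [darknet, "detector", "test", data, cfg, weights, image,
           "-thresh", str(thresh)]
    return subprocess.run(cmd, capture_output=True, text=True).stdout
```

Calling it once with the DefaultData paths and once with the CustomData paths lets you compare the two models' output on the same image.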
