AZ-Net

Introduction

This GitHub repository is an implementation of the AZ-Net detection method described in the paper "Adaptive Object Detection Using Adjacency and Zoom Prediction".

Created by Yongxi Lu at University of California, San Diego.

If you find this work useful, please consider citing:

@article{lu2015adaptive,
  title={Adaptive Object Detection Using Adjacency and Zoom Prediction},
  author={Lu, Yongxi and Javidi, Tara and Lazebnik, Svetlana},
  journal={arXiv preprint arXiv:1512.07711},
  year={2015}
}
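As a rough illustration of the adjacency-and-zoom idea (not the actual implementation), the sketch below recursively subdivides regions whose predicted "zoom" score is high, so computation concentrates on promising parts of the image. The scoring function here is a hypothetical stand-in for the trained AZ-Net model:

```python
def search(region, score_fn, threshold=0.5, min_size=32):
    """Recursively collect regions, zooming into those whose score exceeds threshold.

    region is (x, y, w, h); score_fn is a stand-in for the zoom-indicator
    prediction produced by the trained network.
    """
    x, y, w, h = region
    visited = [region]
    if w <= min_size or h <= min_size:
        return visited  # stop zooming below the minimum region size
    if score_fn(region) > threshold:
        hw, hh = w // 2, h // 2
        # Subdivide into four quadrants and search each recursively.
        for sub in [(x, y, hw, hh), (x + hw, y, hw, hh),
                    (x, y + hh, hw, hh), (x + hw, y + hh, hw, hh)]:
            visited.extend(search(sub, score_fn, threshold, min_size))
    return visited

# Toy scorer: pretend objects cluster near the top-left corner of the image.
regions = search((0, 0, 128, 128),
                 lambda r: 1.0 if r[0] < 64 and r[1] < 64 else 0.0)
```

In the real method, the same network also predicts adjacency (detection) scores for each visited region; this sketch only shows the zoom-driven search structure.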

Installation

To install, follow these steps:

  1. Install Caffe and all of its dependencies, including the requirements for pycaffe (see the Caffe installation instructions).

     Note: Caffe must be built with support for Python layers!

     # In your Makefile.config, make sure this line is uncommented
     WITH_PYTHON_LAYER := 1

  2. Clone the AZ-Net repository. Make sure to use the --recursive flag:

     # Make sure to clone with --recursive
     git clone --recursive https://github.com/luyongxi/az-net.git

  3. Build the Cython modules:

     cd $ROOT/lib
     make

  4. Build Caffe and pycaffe:

     cd $ROOT/caffe-fast-rcnn
     # Now follow the Caffe installation instructions here:
     #   http://caffe.berkeleyvision.org/installation.html

     # If you're experienced with Caffe and have all of the requirements
     # installed and your Makefile.config in place, then simply do:
     make -j8 && make pycaffe

  5. Fetch the pre-trained ImageNet models:

     cd $ROOT
     ./data/scripts/fetch_imagenet_models.sh

     See data/README.md for details.

  6. To train and test models, use the scripts in:

     $ROOT/experiments/scripts

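The Cython modules in `lib` typically accelerate bounding-box operations such as non-maximum suppression (NMS), as in the Fast R-CNN codebase this project builds on. A pure-Python sketch of greedy NMS, for reference only (the compiled modules are what the detector actually uses):

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.3):
    """Greedy non-maximum suppression: keep the highest-scoring boxes and
    drop any box whose IoU with an already-kept box exceeds iou_thresh."""
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1 + 1) * (y2 - y1 + 1)
    order = scores.argsort()[::-1]  # indices sorted by descending score
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # Intersection of the kept box with all remaining boxes.
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        w = np.maximum(0.0, xx2 - xx1 + 1)
        h = np.maximum(0.0, yy2 - yy1 + 1)
        inter = w * h
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        order = order[1:][iou <= iou_thresh]  # suppress heavy overlaps
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [50, 50, 60, 60]], dtype=float)
scores = np.array([0.9, 0.8, 0.7])
keep = nms(boxes, scores)  # the second box overlaps the first and is suppressed
```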

Pretrained Models

Pretrained models are available at the following link: https://drive.google.com/drive/folders/0B2pXYeQwL9mhMlo0ZUVYcU82Ylk?usp=sharing
