This is the Python 3 version of https://github.com/Timforce/tiny-tf.
I updated module names, syntax, and so on so that the code runs under Python 3.
All the code for train/eval/demo appears to work well. The trained model scores 0.78 on the WIDER FACE 'hard' set with the official WIDER FACE evaluation tool, which is close to the 0.814 reported in the original paper. (Tested with TensorFlow 1.12.0, CUDA 9.0, cuDNN 7.4.1, and Python 3.5.2.)
Many thanks to Timforce for the nice code!
Below is the original README of the repository.
This is a TensorFlow version of Finding Tiny Faces by Peiyun Hu and Deva Ramanan.
Instead of converting pretrained network parameters from a .mat file, this code trains the network from scratch with TensorFlow only.
NOTE: The paper is not my work; you should check out the original paper's repo.
Where to change:
- `./lib/tiny/test.py`: `im = cv2.imread(roidb[i]['image'])`
- `./lib/tiny/demo.py`: `cv2.imwrite(output_dest, output_img)`
- `./lib/tiny/demo.py`: `im = cv2.imread(os.path.join(source_dir, file_name))`
- `./lib/tiny/demo.py`: `im = cv2.imread(os.path.join(source_dir, file_name))`
- `./lib/utils/wider2pkl_detail.py`: `img = cv2.imread(img_file)`
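For context, all of these locations are plain OpenCV read/write calls. Below is a minimal sketch of that pattern with an added check that the read actually succeeded (the helper name is mine, not the repo's):

```python
# Sketch of the image I/O pattern used at the locations listed above.
# The helper name read_image_checked is mine; the repo calls cv2.imread directly.
import cv2

def read_image_checked(path):
    """Read an image with OpenCV and fail loudly instead of returning None."""
    im = cv2.imread(path)  # BGR uint8 array, or None if the path is wrong
    if im is None:
        raise IOError('cv2.imread could not read: %s' % path)
    return im
```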
```
python=2.7
tensorflow-gpu
numpy
scipy
scikit-learn
matplotlib
opencv-python
Cython
easydict
pyyaml
```
The majority of the code was written with Python 2.7 and TensorFlow r1.2; it should run fine on newer versions as well. If you hit any compatibility issues, please feel free to submit an issue.
You will also need a decent GPU to train the network; anything with more than 6 GB of GPU memory should suffice.
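If you want to confirm that TensorFlow actually sees your GPU before starting a long run, a quick check like the following works on the TF 1.x versions mentioned above (this snippet is mine, not part of the repo's scripts):

```python
# Quick GPU visibility check for TensorFlow 1.x (not part of the repo).
import tensorflow as tf

print(tf.test.is_gpu_available())  # True if a usable CUDA GPU is detected
print(tf.test.gpu_device_name())   # e.g. '/device:GPU:0', or '' if none found
```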
- Download WIDER Face Training & Validation Dataset:
http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace/
The extracted directories are expected to be laid out like this:

```
WIDER/
├── WIDER_train/
│   └── images/...
├── WIDER_val/
│   └── images/...
└── wider_face_split/...
```
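If you want to verify the layout before building the pickles in the next step, a small check like this will do (my own snippet; `WIDER_ROOT` is an assumed path, adjust it to wherever you extracted the dataset):

```python
# Sanity check (mine, not part of the repo) that the WIDER directories are
# where the later commands expect them. Set WIDER_ROOT to your extraction path.
import os

WIDER_ROOT = os.path.expanduser('~/WIDER')  # assumed location; change as needed
for sub in ('WIDER_train/images', 'WIDER_val/images', 'wider_face_split'):
    path = os.path.join(WIDER_ROOT, sub)
    print(path, 'OK' if os.path.isdir(path) else 'MISSING')
```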
- Build two pickle files from the training and validation datasets:
```
python lib/utils/wider2pkl_detail.py --img_root $WIDER/WIDER_train/images \
                                     --label $WIDER/wider_face_split/wider_face_train.mat \
                                     --out wider_train_roidb_detail.pkl

python lib/utils/wider2pkl_detail.py --img_root $WIDER/WIDER_val/images \
                                     --label $WIDER/wider_face_split/wider_face_val.mat \
                                     --out wider_val_detail.pkl
```

Both processes may take a few minutes to complete; place the two generated .pkl files into `data/pickles`.
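To confirm a pickle was written correctly, you can peek at it from Python. This check is mine; the exact structure of each entry is whatever `wider2pkl_detail.py` produces, so the keys printed below are not something this snippet assumes:

```python
# Quick peek at a generated roidb pickle (my own check, not a repo script).
import pickle

with open('data/pickles/wider_train_roidb_detail.pkl', 'rb') as f:
    roidb = pickle.load(f)

print(type(roidb))
if isinstance(roidb, list) and roidb and isinstance(roidb[0], dict):
    print(len(roidb), 'entries; first entry keys:', sorted(roidb[0]))
```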
- Compile minibatch for data IO:
```
cd $tiny-tf/lib/roi_data_layer
python setup.py build_ext --inplace
```

(Any further modifications in `minibatch.pyx` will require re-compiling in order to take effect.)
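A quick way to confirm the build worked is to import the compiled module (my own check, not a repo script; run it from inside `$tiny-tf/lib/roi_data_layer` after the build finishes):

```python
# Verify the Cython build; run from inside $tiny-tf/lib/roi_data_layer.
import minibatch

print(minibatch.__file__)  # should point at the freshly built .so/.pyd
```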
You can quickly test out the code with a pretrained TensorFlow checkpoint; make sure to place the whole `Resnet101_tiny` directory into `$tiny-tf/output`.
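The demo script restores the checkpoint itself; if you just want to confirm the checkpoint files are readable first, the standard TF 1.x checkpoint reader works. This snippet is mine, and the `output/Resnet101_tiny` path is assumed from the directory name above:

```python
# Standalone checkpoint sanity check (mine, not part of the repo's scripts).
import tensorflow as tf

ckpt = tf.train.latest_checkpoint('output/Resnet101_tiny')  # assumed path
print('latest checkpoint:', ckpt)

if ckpt:
    reader = tf.train.NewCheckpointReader(ckpt)
    shapes = reader.get_variable_to_shape_map()
    for name in sorted(shapes)[:5]:  # print a few stored variable names/shapes
        print(name, shapes[name])
```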
- Place pictures into `$tiny-tf/demo/data`
- Execute `demo_tiny_resnet101.sh`
- Detection results should be in `$tiny-tf/demo/visualize`
The result will look like this:
- Make sure the training set pickle file `wider_train_roidb_detail.pkl` is in `$tiny-tf/data/pickles`
- Download the ImageNet pretrained model and place it into `$tiny-tf/data/pretrain_model`
- Execute `tiny_resnet101_wider_train.sh`
You can tweak settings and hyperparameters in `cfgs/tiny_resnet101.yml`.
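The training scripts read this file themselves; if you want to inspect the settings from a Python prompt, a minimal sketch using the pyyaml and easydict packages from the dependency list looks like this (the key names are whatever the YAML file defines, not something this snippet assumes):

```python
# Minimal sketch (mine, not the repo's own config loader) for inspecting
# cfgs/tiny_resnet101.yml using the pyyaml + easydict packages listed above.
import yaml
from easydict import EasyDict

with open('cfgs/tiny_resnet101.yml') as f:
    cfg = EasyDict(yaml.safe_load(f))

for key, value in cfg.items():  # dump top-level settings
    print(key, '=', value)
```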
The generated TensorBoard logs will be in the `$tiny-tf/log` folder, and checkpoints will be in `$tiny-tf/output`.
- Make sure the validation set pickle file `wider_val_detail.pkl` is in `$tiny-tf/data/pickles`
- Execute `tiny_resnet101_eval.sh`; validation will run through a couple thousand images, so it may take a while.
- The test result will be saved as a single `pred` directory, and the format complies with the official WIDER FACE dataset evaluation code, so you can use eval_tool to inspect the result.
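For reference, the official WIDER FACE evaluation expects one text file per image: the image name, the number of detections, then one `left top width height score` line per box. Below is a sketch of writing a file in that layout (the helper name and the sample values are mine):

```python
# Sketch (mine) of the per-image detection file layout used by the WIDER FACE
# eval_tool: image name, detection count, then "left top width height score".
import os

def write_wider_detections(out_dir, image_name, boxes):
    """boxes: list of (left, top, width, height, score) tuples."""
    os.makedirs(out_dir, exist_ok=True)
    out_path = os.path.join(out_dir, os.path.splitext(image_name)[0] + '.txt')
    with open(out_path, 'w') as f:
        f.write(image_name + '\n')
        f.write('%d\n' % len(boxes))
        for left, top, width, height, score in boxes:
            f.write('%.1f %.1f %.1f %.1f %.3f\n' % (left, top, width, height, score))

# Example with made-up values:
write_wider_detections('pred/0--Parade', '0_Parade_marchingband_1_465.jpg',
                       [(10.0, 20.0, 30.0, 40.0, 0.95)])
```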
The following PR curves show the validation 'medium' set result of the pretrained model Resnet101_tiny:
NOTE: The validation result above does not include other recent state-of-the-art results; it merely shows that our training result performs nearly as well as the paper's.
This code was written by referencing some of the following awesome repos; they are worth checking out :)