This repository has been archived by the owner on Nov 21, 2023. It is now read-only.

Encapsulate detectron code in a package
Reviewed By: rbgirshick

Differential Revision: D7516523

fbshipit-source-id: 7c67f82c7ce6f79a66b0bdea0770d2d00735e38f
ir413 authored and facebook-github-bot committed May 7, 2018
1 parent 25d9e24 commit e5bb3a8
Showing 111 changed files with 473 additions and 421 deletions.
8 changes: 4 additions & 4 deletions .gitignore
@@ -7,7 +7,7 @@ __pycache__/
*.so

# Distribution / packaging
-lib/build/
+build/
*.egg-info/
*.egg

@@ -17,8 +17,8 @@ lib/build/
*.swp

# Dataset symlinks
-lib/datasets/data/*
-!lib/datasets/data/README.md
+detectron/datasets/data/*
+!detectron/datasets/data/README.md

# Generated C files
-lib/utils/cython_*.c
+detectron/utils/cython_*.c
4 changes: 2 additions & 2 deletions lib/CMakeLists.txt → CMakeLists.txt
@@ -26,8 +26,8 @@ include(cmake/Summary.cmake)
detectron_print_config_summary()

# Collect custom ops sources.
-file(GLOB CUSTOM_OPS_CPU_SRCS ${CMAKE_CURRENT_SOURCE_DIR}/ops/*.cc)
-file(GLOB CUSTOM_OPS_GPU_SRCS ${CMAKE_CURRENT_SOURCE_DIR}/ops/*.cu)
+file(GLOB CUSTOM_OPS_CPU_SRCS ${CMAKE_CURRENT_SOURCE_DIR}/detectron/ops/*.cc)
+file(GLOB CUSTOM_OPS_GPU_SRCS ${CMAKE_CURRENT_SOURCE_DIR}/detectron/ops/*.cu)

# Install custom CPU ops lib.
add_library(
2 changes: 1 addition & 1 deletion FAQ.md
@@ -23,7 +23,7 @@ Also potentially relevant: inference with Mask R-CNN on high-resolution images m

#### Q: How do I implement a custom Caffe2 CPU or GPU operator for use in Detectron?

-**A:** Detectron uses a number of specialized Caffe2 operators that are distributed via the [Caffe2 Detectron module](https://github.com/caffe2/caffe2/tree/master/modules/detectron) as part of the core Caffe2 GitHub repository. If you'd like to implement a custom Caffe2 operator for your project, we have written a toy example illustrating how to add an operator under the Detectron source tree; please see [`lib/ops/zero_even_op.*`](lib/ops/) and [`tests/test_zero_even_op.py`](tests/test_zero_even_op.py). For more background on writing Caffe2 operators please consult the [Caffe2 documentation](https://caffe2.ai/docs/custom-operators.html).
+**A:** Detectron uses a number of specialized Caffe2 operators that are distributed via the [Caffe2 Detectron module](https://github.com/caffe2/caffe2/tree/master/modules/detectron) as part of the core Caffe2 GitHub repository. If you'd like to implement a custom Caffe2 operator for your project, we have written a toy example illustrating how to add an operator under the Detectron source tree; please see [`detectron/ops/zero_even_op.*`](detectron/ops/) and [`detectron/tests/test_zero_even_op.py`](detectron/tests/test_zero_even_op.py). For more background on writing Caffe2 operators please consult the [Caffe2 documentation](https://caffe2.ai/docs/custom-operators.html).

#### Q: How do I use Detectron to train a model on a custom dataset?

16 changes: 8 additions & 8 deletions INSTALL.md
@@ -66,13 +66,13 @@ git clone https://github.com/facebookresearch/detectron $DETECTRON
Set up Python modules:

```
-cd $DETECTRON/lib && make
+cd $DETECTRON && make
```

Check that Detectron tests pass (e.g. for [`SpatialNarrowAsOp test`](tests/test_spatial_narrow_as_op.py)):

```
-python2 $DETECTRON/tests/test_spatial_narrow_as_op.py
+python2 $DETECTRON/detectron/tests/test_spatial_narrow_as_op.py
```

## That's All You Need for Inference
@@ -81,7 +81,7 @@ At this point, you can run inference using pretrained Detectron models. Take a l

## Datasets

-Detectron finds datasets via symlinks from `lib/datasets/data` to the actual locations where the dataset images and annotations are stored. For instructions on how to create symlinks for COCO and other datasets, please see [`lib/datasets/data/README.md`](lib/datasets/data/README.md).
+Detectron finds datasets via symlinks from `detectron/datasets/data` to the actual locations where the dataset images and annotations are stored. For instructions on how to create symlinks for COCO and other datasets, please see [`detectron/datasets/data/README.md`](detectron/datasets/data/README.md).

After symlinks have been created, that's all you need to start training models.
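A minimal sketch of what the dataset symlinks amount to, using hypothetical temp-dir paths (the real layout and the actual `ln -s` commands are documented in `detectron/datasets/data/README.md`):

```python
import os
import tempfile

# Hypothetical stand-ins for detectron/datasets/data/coco and the directory
# where the COCO images/annotations actually live.
root = tempfile.mkdtemp()
real_coco = os.path.join(root, 'real_coco_location')
os.makedirs(os.path.join(real_coco, 'annotations'))

# Plays the role of detectron/datasets/data/coco.
link = os.path.join(root, 'coco')
os.symlink(real_coco, link)

# Any code that reads through the link transparently reaches the real data.
print(os.path.isdir(os.path.join(link, 'annotations')))  # -> True
```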

@@ -90,18 +90,18 @@ After symlinks have been created, that's all you need to start training models.
Please read the custom operators section of the [`FAQ`](FAQ.md) first.

For convenience, we provide CMake support for building custom operators. All custom operators are built into a single library that can be loaded dynamically from Python.
-Place your custom operator implementation under [`lib/ops/`](lib/ops/) and see [`tests/test_zero_even_op.py`](tests/test_zero_even_op.py) for an example of how to load custom operators from Python.
+Place your custom operator implementation under [`detectron/ops/`](detectron/ops/) and see [`detectron/tests/test_zero_even_op.py`](detectron/tests/test_zero_even_op.py) for an example of how to load custom operators from Python.

Build the custom operators library:

```
-cd $DETECTRON/lib && make ops
+cd $DETECTRON && make ops
```

Check that the custom operator tests pass:

```
-python2 $DETECTRON/tests/test_zero_even_op.py
+python2 $DETECTRON/detectron/tests/test_zero_even_op.py
```

## Docker Image
@@ -118,7 +118,7 @@ docker build -t detectron:c2-cuda9-cudnn7 .
Run the image (e.g. for [`BatchPermutationOp test`](tests/test_batch_permutation_op.py)):

```
-nvidia-docker run --rm -it detectron:c2-cuda9-cudnn7 python2 tests/test_batch_permutation_op.py
+nvidia-docker run --rm -it detectron:c2-cuda9-cudnn7 python2 detectron/tests/test_batch_permutation_op.py
```

## Troubleshooting
@@ -149,7 +149,7 @@ cmake .. \
Similarly, when building custom Detectron operators you can use:

```
-cd $DETECTRON/lib
+cd $DETECTRON
mkdir -p build && cd build
cmake .. \
-DCUDA_TOOLKIT_ROOT_DIR=/path/to/cuda/toolkit/dir \
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
@@ -19,8 +19,8 @@ include(cmake/Summary.cmake)
detectron_print_config_summary()

# Collect custom ops sources.
-file(GLOB CUSTOM_OPS_CPU_SRCS ${CMAKE_CURRENT_SOURCE_DIR}/ops/*.cc)
-file(GLOB CUSTOM_OPS_GPU_SRCS ${CMAKE_CURRENT_SOURCE_DIR}/ops/*.cu)
+file(GLOB CUSTOM_OPS_CPU_SRCS ${CMAKE_CURRENT_SOURCE_DIR}/detectron/ops/*.cc)
+file(GLOB CUSTOM_OPS_GPU_SRCS ${CMAKE_CURRENT_SOURCE_DIR}/detectron/ops/*.cu)

# Install custom CPU ops lib.
add_library(
File renamed without changes.
File renamed without changes.
41 changes: 35 additions & 6 deletions lib/core/config.py → detectron/core/config.py
@@ -43,22 +43,23 @@
from __future__ import unicode_literals

from ast import literal_eval
+from future.utils import iteritems
+from past.builtins import basestring
-from utils.collections import AttrDict
import copy
import logging
import numpy as np
import os
import os.path as osp
import yaml

-from utils.io import cache_url
+from detectron.utils.collections import AttrDict
+from detectron.utils.io import cache_url

logger = logging.getLogger(__name__)

__C = AttrDict()
# Consumers can get config by:
-#   from core.config import cfg
+#   from detectron.core.config import cfg
cfg = __C


@@ -1000,6 +1001,7 @@
}
)


# ---------------------------------------------------------------------------- #
# Renamed options
# If you rename a config option, record the mapping from the old name to the new
Expand Down Expand Up @@ -1044,6 +1046,17 @@
}


+# ---------------------------------------------------------------------------- #
+# Renamed modules
+# If a module containing a data structure used in the config (e.g. AttrDict)
+# is renamed/moved and you don't want to break loading of existing yaml configs
+# (e.g. from weights files) you can specify the renamed module below.
+# ---------------------------------------------------------------------------- #
+_RENAMED_MODULES = {
+    'utils.collections': 'detectron.utils.collections',
+}


def assert_and_infer_cfg(cache_urls=True, make_immutable=True):
    """Call this function in your script after you have finished setting all cfg
    values that are necessary (e.g., merging a config from a file, merging
@@ -1090,10 +1103,24 @@ def get_output_dir(datasets, training=True):
    return outdir


+def load_cfg(cfg_to_load):
+    """Wrapper around yaml.load used for maintaining backward compatibility"""
+    assert isinstance(cfg_to_load, (file, basestring)), \
+        'Expected {} or {} got {}'.format(file, basestring, type(cfg_to_load))
+    if isinstance(cfg_to_load, file):
+        cfg_to_load = ''.join(cfg_to_load.readlines())
+    if isinstance(cfg_to_load, basestring):
+        for old_module, new_module in iteritems(_RENAMED_MODULES):
+            # yaml object encoding: !!python/object/new:<module>.<object>
+            old_module, new_module = 'new:' + old_module, 'new:' + new_module
+            cfg_to_load = cfg_to_load.replace(old_module, new_module)
+    return yaml.load(cfg_to_load)


def merge_cfg_from_file(cfg_filename):
    """Load a yaml config file and merge it into the global config."""
    with open(cfg_filename, 'r') as f:
-        yaml_cfg = AttrDict(yaml.load(f))
+        yaml_cfg = AttrDict(load_cfg(f))
    _merge_a_into_b(yaml_cfg, __C)


Expand Down Expand Up @@ -1130,8 +1157,10 @@ def _merge_a_into_b(a, b, stack=None):
    """Merge config dictionary a into config dictionary b, clobbering the
    options in b whenever they are also specified in a.
    """
-    assert isinstance(a, AttrDict), 'Argument `a` must be an AttrDict'
-    assert isinstance(b, AttrDict), 'Argument `b` must be an AttrDict'
+    assert isinstance(a, AttrDict), \
+        '`a` (cur type {}) must be an instance of {}'.format(type(a), AttrDict)
+    assert isinstance(b, AttrDict), \
+        '`b` (cur type {}) must be an instance of {}'.format(type(b), AttrDict)

    for k, v_ in a.items():
        full_key = '.'.join(stack) + '.' + k if stack is not None else k
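The backward-compatibility trick in the new `load_cfg` is plain text surgery: PyYAML serializes Python objects with a `!!python/object/new:<module>.<object>` tag, so rewriting the module path in the serialized string lets configs saved under the old `utils.collections` name resolve against the new `detectron.utils.collections` package. A standalone sketch of just the remapping step (names chosen for illustration; no PyYAML needed):

```python
# Minimal sketch of the _RENAMED_MODULES remapping performed by load_cfg.
_RENAMED_MODULES = {
    'utils.collections': 'detectron.utils.collections',
}

def remap_renamed_modules(serialized_cfg):
    """Rewrite old module paths inside a serialized yaml config string."""
    for old_module, new_module in _RENAMED_MODULES.items():
        # yaml object encoding: !!python/object/new:<module>.<object>
        serialized_cfg = serialized_cfg.replace(
            'new:' + old_module, 'new:' + new_module)
    return serialized_cfg

old_cfg = '!!python/object/new:utils.collections.AttrDict\nNUM_GPUS: 8\n'
print(remap_renamed_modules(old_cfg).splitlines()[0])
# -> !!python/object/new:detectron.utils.collections.AttrDict
```

Matching on the `new:` prefix keeps the replacement anchored to the yaml tag, so ordinary config values that merely mention `utils.collections` elsewhere are far less likely to be touched.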
22 changes: 11 additions & 11 deletions lib/core/rpn_generator.py → detectron/core/rpn_generator.py
@@ -38,17 +38,17 @@
from caffe2.python import core
from caffe2.python import workspace

-from core.config import cfg
-from datasets import task_evaluation
-from datasets.json_dataset import JsonDataset
-from modeling import model_builder
-from utils.io import save_object
-from utils.timer import Timer
-import utils.blob as blob_utils
-import utils.c2 as c2_utils
-import utils.env as envu
-import utils.net as nu
-import utils.subprocess as subprocess_utils
+from detectron.core.config import cfg
+from detectron.datasets import task_evaluation
+from detectron.datasets.json_dataset import JsonDataset
+from detectron.modeling import model_builder
+from detectron.utils.io import save_object
+from detectron.utils.timer import Timer
+import detectron.utils.blob as blob_utils
+import detectron.utils.c2 as c2_utils
+import detectron.utils.env as envu
+import detectron.utils.net as nu
+import detectron.utils.subprocess as subprocess_utils

logger = logging.getLogger(__name__)

16 changes: 8 additions & 8 deletions lib/core/test.py → detectron/core/test.py
@@ -37,14 +37,14 @@
from caffe2.python import workspace
import pycocotools.mask as mask_util

-from core.config import cfg
-from utils.timer import Timer
-import core.test_retinanet as test_retinanet
-import modeling.FPN as fpn
-import utils.blob as blob_utils
-import utils.boxes as box_utils
-import utils.image as image_utils
-import utils.keypoints as keypoint_utils
+from detectron.core.config import cfg
+from detectron.utils.timer import Timer
+import detectron.core.test_retinanet as test_retinanet
+import detectron.modeling.FPN as fpn
+import detectron.utils.blob as blob_utils
+import detectron.utils.boxes as box_utils
+import detectron.utils.image as image_utils
+import detectron.utils.keypoints as keypoint_utils

logger = logging.getLogger(__name__)

30 changes: 15 additions & 15 deletions lib/core/test_engine.py → detectron/core/test_engine.py
@@ -30,21 +30,21 @@

from caffe2.python import workspace

-from core.config import cfg
-from core.config import get_output_dir
-from core.rpn_generator import generate_rpn_on_dataset
-from core.rpn_generator import generate_rpn_on_range
-from core.test import im_detect_all
-from datasets import task_evaluation
-from datasets.json_dataset import JsonDataset
-from modeling import model_builder
-from utils.io import save_object
-from utils.timer import Timer
-import utils.c2 as c2_utils
-import utils.env as envu
-import utils.net as net_utils
-import utils.subprocess as subprocess_utils
-import utils.vis as vis_utils
+from detectron.core.config import cfg
+from detectron.core.config import get_output_dir
+from detectron.core.rpn_generator import generate_rpn_on_dataset
+from detectron.core.rpn_generator import generate_rpn_on_range
+from detectron.core.test import im_detect_all
+from detectron.datasets import task_evaluation
+from detectron.datasets.json_dataset import JsonDataset
+from detectron.modeling import model_builder
+from detectron.utils.io import save_object
+from detectron.utils.timer import Timer
+import detectron.utils.c2 as c2_utils
+import detectron.utils.env as envu
+import detectron.utils.net as net_utils
+import detectron.utils.subprocess as subprocess_utils
+import detectron.utils.vis as vis_utils

logger = logging.getLogger(__name__)

11 changes: 5 additions & 6 deletions lib/core/test_retinanet.py → detectron/core/test_retinanet.py
@@ -26,12 +26,11 @@

from caffe2.python import core, workspace

-from core.config import cfg
-from modeling.generate_anchors import generate_anchors
-from utils.timer import Timer
-
-import utils.blob as blob_utils
-import utils.boxes as box_utils
+from detectron.core.config import cfg
+from detectron.modeling.generate_anchors import generate_anchors
+from detectron.utils.timer import Timer
+import detectron.utils.blob as blob_utils
+import detectron.utils.boxes as box_utils

logger = logging.getLogger(__name__)

File renamed without changes.
File renamed without changes.
@@ -12,8 +12,8 @@

import cityscapesscripts.evaluation.instances2dict_with_polygons as cs

-import utils.segms as segms_util
-import utils.boxes as bboxs_util
+import detectron.utils.segms as segms_util
+import detectron.utils.boxes as bboxs_util


def parse_args():
@@ -8,13 +8,13 @@
from __future__ import print_function
from __future__ import unicode_literals

-import cPickle as pickle
import argparse
+import cPickle as pickle
+import numpy as np
import os
import sys
-import numpy as np

-import datasets.cityscapes.coco_to_cityscapes_id as cs
+import detectron.datasets.cityscapes.coco_to_cityscapes_id as cs

NUM_CS_CLS = 9
NUM_COCO_CLS = 81
@@ -27,9 +27,9 @@

import pycocotools.mask as mask_util

-from core.config import cfg
-from datasets.dataset_catalog import DATASETS
-from datasets.dataset_catalog import RAW_DIR
+from detectron.core.config import cfg
+from detectron.datasets.dataset_catalog import DATASETS
+from detectron.datasets.dataset_catalog import RAW_DIR

logger = logging.getLogger(__name__)

