Cannot run R-CNN example #2567

Closed

kar-fai opened this issue Jun 7, 2015 · 3 comments


kar-fai commented Jun 7, 2015

Hi, I am using the Docker image tleyden5iwx/caffe-gpu-master.

However, I get an error when I run the R-CNN example with the following command:

python python/detect.py \
--crop_mode=selective_search \
--pretrained_model=/opt/caffe/models/bvlc_reference_rcnn_ilsvrc13/bvlc_reference_rcnn_ilsvrc13.caffemodel \
--model_def=/opt/caffe/models/bvlc_reference_rcnn_ilsvrc13/deploy.prototxt \
--gpu \
--raw_scale=255 \
_temp/det_input.txt _temp/det_output.h5

This is the output from the terminal:

libdc1394 error: Failed to initialize libdc1394
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0607 08:48:22.059032 24964 net.cpp:42] Initializing net from parameters: 
name: "R-CNN-ilsvrc13"
input: "data"
input_dim: 10
input_dim: 3
input_dim: 227
input_dim: 227
state {
  phase: TEST
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc-rcnn"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc-rcnn"
  inner_product_param {
    num_output: 200
  }
}
I0607 08:48:22.059561 24964 net.cpp:336] Input 0 -> data
I0607 08:48:22.059597 24964 layer_factory.hpp:74] Creating layer conv1
I0607 08:48:22.059607 24964 net.cpp:76] Creating Layer conv1
I0607 08:48:22.059615 24964 net.cpp:372] conv1 <- data
I0607 08:48:22.059625 24964 net.cpp:334] conv1 -> conv1
I0607 08:48:22.059634 24964 net.cpp:105] Setting up conv1
I0607 08:48:22.059712 24964 net.cpp:112] Top shape: 10 96 55 55 (2904000)
I0607 08:48:22.059737 24964 layer_factory.hpp:74] Creating layer relu1
I0607 08:48:22.059746 24964 net.cpp:76] Creating Layer relu1
I0607 08:48:22.059749 24964 net.cpp:372] relu1 <- conv1
I0607 08:48:22.059756 24964 net.cpp:323] relu1 -> conv1 (in-place)
I0607 08:48:22.059762 24964 net.cpp:105] Setting up relu1
I0607 08:48:22.059770 24964 net.cpp:112] Top shape: 10 96 55 55 (2904000)
I0607 08:48:22.059775 24964 layer_factory.hpp:74] Creating layer pool1
I0607 08:48:22.059782 24964 net.cpp:76] Creating Layer pool1
I0607 08:48:22.059787 24964 net.cpp:372] pool1 <- conv1
I0607 08:48:22.059792 24964 net.cpp:334] pool1 -> pool1
I0607 08:48:22.059798 24964 net.cpp:105] Setting up pool1
I0607 08:48:22.059806 24964 net.cpp:112] Top shape: 10 96 27 27 (699840)
I0607 08:48:22.059810 24964 layer_factory.hpp:74] Creating layer norm1
I0607 08:48:22.059818 24964 net.cpp:76] Creating Layer norm1
I0607 08:48:22.059823 24964 net.cpp:372] norm1 <- pool1
I0607 08:48:22.059828 24964 net.cpp:334] norm1 -> norm1
I0607 08:48:22.059834 24964 net.cpp:105] Setting up norm1
I0607 08:48:22.059841 24964 net.cpp:112] Top shape: 10 96 27 27 (699840)
I0607 08:48:22.059846 24964 layer_factory.hpp:74] Creating layer conv2
I0607 08:48:22.059851 24964 net.cpp:76] Creating Layer conv2
I0607 08:48:22.059855 24964 net.cpp:372] conv2 <- norm1
I0607 08:48:22.059861 24964 net.cpp:334] conv2 -> conv2
I0607 08:48:22.059867 24964 net.cpp:105] Setting up conv2
I0607 08:48:22.060230 24964 net.cpp:112] Top shape: 10 256 27 27 (1866240)
I0607 08:48:22.060250 24964 layer_factory.hpp:74] Creating layer relu2
I0607 08:48:22.060256 24964 net.cpp:76] Creating Layer relu2
I0607 08:48:22.060263 24964 net.cpp:372] relu2 <- conv2
I0607 08:48:22.060268 24964 net.cpp:323] relu2 -> conv2 (in-place)
I0607 08:48:22.060274 24964 net.cpp:105] Setting up relu2
I0607 08:48:22.060278 24964 net.cpp:112] Top shape: 10 256 27 27 (1866240)
I0607 08:48:22.060282 24964 layer_factory.hpp:74] Creating layer pool2
I0607 08:48:22.060287 24964 net.cpp:76] Creating Layer pool2
I0607 08:48:22.060292 24964 net.cpp:372] pool2 <- conv2
I0607 08:48:22.060297 24964 net.cpp:334] pool2 -> pool2
I0607 08:48:22.060302 24964 net.cpp:105] Setting up pool2
I0607 08:48:22.060307 24964 net.cpp:112] Top shape: 10 256 13 13 (432640)
I0607 08:48:22.060312 24964 layer_factory.hpp:74] Creating layer norm2
I0607 08:48:22.060317 24964 net.cpp:76] Creating Layer norm2
I0607 08:48:22.060322 24964 net.cpp:372] norm2 <- pool2
I0607 08:48:22.060328 24964 net.cpp:334] norm2 -> norm2
I0607 08:48:22.060334 24964 net.cpp:105] Setting up norm2
I0607 08:48:22.060340 24964 net.cpp:112] Top shape: 10 256 13 13 (432640)
I0607 08:48:22.060345 24964 layer_factory.hpp:74] Creating layer conv3
I0607 08:48:22.060353 24964 net.cpp:76] Creating Layer conv3
I0607 08:48:22.060361 24964 net.cpp:372] conv3 <- norm2
I0607 08:48:22.060366 24964 net.cpp:334] conv3 -> conv3
I0607 08:48:22.060371 24964 net.cpp:105] Setting up conv3
I0607 08:48:22.061478 24964 net.cpp:112] Top shape: 10 384 13 13 (648960)
I0607 08:48:22.061504 24964 layer_factory.hpp:74] Creating layer relu3
I0607 08:48:22.061511 24964 net.cpp:76] Creating Layer relu3
I0607 08:48:22.061518 24964 net.cpp:372] relu3 <- conv3
I0607 08:48:22.061527 24964 net.cpp:323] relu3 -> conv3 (in-place)
I0607 08:48:22.061532 24964 net.cpp:105] Setting up relu3
I0607 08:48:22.061537 24964 net.cpp:112] Top shape: 10 384 13 13 (648960)
I0607 08:48:22.061542 24964 layer_factory.hpp:74] Creating layer conv4
I0607 08:48:22.061547 24964 net.cpp:76] Creating Layer conv4
I0607 08:48:22.061550 24964 net.cpp:372] conv4 <- conv3
I0607 08:48:22.061555 24964 net.cpp:334] conv4 -> conv4
I0607 08:48:22.061561 24964 net.cpp:105] Setting up conv4
I0607 08:48:22.062374 24964 net.cpp:112] Top shape: 10 384 13 13 (648960)
I0607 08:48:22.062396 24964 layer_factory.hpp:74] Creating layer relu4
I0607 08:48:22.062402 24964 net.cpp:76] Creating Layer relu4
I0607 08:48:22.062407 24964 net.cpp:372] relu4 <- conv4
I0607 08:48:22.062414 24964 net.cpp:323] relu4 -> conv4 (in-place)
I0607 08:48:22.062419 24964 net.cpp:105] Setting up relu4
I0607 08:48:22.062423 24964 net.cpp:112] Top shape: 10 384 13 13 (648960)
I0607 08:48:22.062428 24964 layer_factory.hpp:74] Creating layer conv5
I0607 08:48:22.062434 24964 net.cpp:76] Creating Layer conv5
I0607 08:48:22.062438 24964 net.cpp:372] conv5 <- conv4
I0607 08:48:22.062444 24964 net.cpp:334] conv5 -> conv5
I0607 08:48:22.062450 24964 net.cpp:105] Setting up conv5
I0607 08:48:22.062999 24964 net.cpp:112] Top shape: 10 256 13 13 (432640)
I0607 08:48:22.063020 24964 layer_factory.hpp:74] Creating layer relu5
I0607 08:48:22.063026 24964 net.cpp:76] Creating Layer relu5
I0607 08:48:22.063032 24964 net.cpp:372] relu5 <- conv5
I0607 08:48:22.063040 24964 net.cpp:323] relu5 -> conv5 (in-place)
I0607 08:48:22.063045 24964 net.cpp:105] Setting up relu5
I0607 08:48:22.063050 24964 net.cpp:112] Top shape: 10 256 13 13 (432640)
I0607 08:48:22.063055 24964 layer_factory.hpp:74] Creating layer pool5
I0607 08:48:22.063060 24964 net.cpp:76] Creating Layer pool5
I0607 08:48:22.063065 24964 net.cpp:372] pool5 <- conv5
I0607 08:48:22.063069 24964 net.cpp:334] pool5 -> pool5
I0607 08:48:22.063076 24964 net.cpp:105] Setting up pool5
I0607 08:48:22.063081 24964 net.cpp:112] Top shape: 10 256 6 6 (92160)
I0607 08:48:22.063086 24964 layer_factory.hpp:74] Creating layer fc6
I0607 08:48:22.063094 24964 net.cpp:76] Creating Layer fc6
I0607 08:48:22.063101 24964 net.cpp:372] fc6 <- pool5
I0607 08:48:22.063109 24964 net.cpp:334] fc6 -> fc6
I0607 08:48:22.063117 24964 net.cpp:105] Setting up fc6
I0607 08:48:22.119091 24964 net.cpp:112] Top shape: 10 4096 1 1 (40960)
I0607 08:48:22.119133 24964 layer_factory.hpp:74] Creating layer relu6
I0607 08:48:22.119158 24964 net.cpp:76] Creating Layer relu6
I0607 08:48:22.119166 24964 net.cpp:372] relu6 <- fc6
I0607 08:48:22.119179 24964 net.cpp:323] relu6 -> fc6 (in-place)
I0607 08:48:22.119189 24964 net.cpp:105] Setting up relu6
I0607 08:48:22.119205 24964 net.cpp:112] Top shape: 10 4096 1 1 (40960)
I0607 08:48:22.119211 24964 layer_factory.hpp:74] Creating layer drop6
I0607 08:48:22.119223 24964 net.cpp:76] Creating Layer drop6
I0607 08:48:22.119240 24964 net.cpp:372] drop6 <- fc6
I0607 08:48:22.119247 24964 net.cpp:323] drop6 -> fc6 (in-place)
I0607 08:48:22.119256 24964 net.cpp:105] Setting up drop6
I0607 08:48:22.119262 24964 net.cpp:112] Top shape: 10 4096 1 1 (40960)
I0607 08:48:22.119267 24964 layer_factory.hpp:74] Creating layer fc7
I0607 08:48:22.119274 24964 net.cpp:76] Creating Layer fc7
I0607 08:48:22.119279 24964 net.cpp:372] fc7 <- fc6
I0607 08:48:22.119285 24964 net.cpp:334] fc7 -> fc7
I0607 08:48:22.119292 24964 net.cpp:105] Setting up fc7
I0607 08:48:22.143841 24964 net.cpp:112] Top shape: 10 4096 1 1 (40960)
I0607 08:48:22.143887 24964 layer_factory.hpp:74] Creating layer relu7
I0607 08:48:22.143901 24964 net.cpp:76] Creating Layer relu7
I0607 08:48:22.143911 24964 net.cpp:372] relu7 <- fc7
I0607 08:48:22.143921 24964 net.cpp:323] relu7 -> fc7 (in-place)
I0607 08:48:22.143931 24964 net.cpp:105] Setting up relu7
I0607 08:48:22.143936 24964 net.cpp:112] Top shape: 10 4096 1 1 (40960)
I0607 08:48:22.143941 24964 layer_factory.hpp:74] Creating layer drop7
I0607 08:48:22.143960 24964 net.cpp:76] Creating Layer drop7
I0607 08:48:22.143970 24964 net.cpp:372] drop7 <- fc7
I0607 08:48:22.143976 24964 net.cpp:323] drop7 -> fc7 (in-place)
I0607 08:48:22.143985 24964 net.cpp:105] Setting up drop7
I0607 08:48:22.143991 24964 net.cpp:112] Top shape: 10 4096 1 1 (40960)
I0607 08:48:22.143996 24964 layer_factory.hpp:74] Creating layer fc-rcnn
I0607 08:48:22.144003 24964 net.cpp:76] Creating Layer fc-rcnn
I0607 08:48:22.144007 24964 net.cpp:372] fc-rcnn <- fc7
I0607 08:48:22.144014 24964 net.cpp:334] fc-rcnn -> fc-rcnn
I0607 08:48:22.144024 24964 net.cpp:105] Setting up fc-rcnn
I0607 08:48:22.144906 24964 net.cpp:112] Top shape: 10 200 1 1 (2000)
I0607 08:48:22.144925 24964 net.cpp:165] fc-rcnn does not need backward computation.
I0607 08:48:22.144930 24964 net.cpp:165] drop7 does not need backward computation.
I0607 08:48:22.144934 24964 net.cpp:165] relu7 does not need backward computation.
I0607 08:48:22.144951 24964 net.cpp:165] fc7 does not need backward computation.
I0607 08:48:22.144958 24964 net.cpp:165] drop6 does not need backward computation.
I0607 08:48:22.144963 24964 net.cpp:165] relu6 does not need backward computation.
I0607 08:48:22.144968 24964 net.cpp:165] fc6 does not need backward computation.
I0607 08:48:22.144973 24964 net.cpp:165] pool5 does not need backward computation.
I0607 08:48:22.144978 24964 net.cpp:165] relu5 does not need backward computation.
I0607 08:48:22.144981 24964 net.cpp:165] conv5 does not need backward computation.
I0607 08:48:22.144985 24964 net.cpp:165] relu4 does not need backward computation.
I0607 08:48:22.144989 24964 net.cpp:165] conv4 does not need backward computation.
I0607 08:48:22.144994 24964 net.cpp:165] relu3 does not need backward computation.
I0607 08:48:22.144997 24964 net.cpp:165] conv3 does not need backward computation.
I0607 08:48:22.145001 24964 net.cpp:165] norm2 does not need backward computation.
I0607 08:48:22.145006 24964 net.cpp:165] pool2 does not need backward computation.
I0607 08:48:22.145012 24964 net.cpp:165] relu2 does not need backward computation.
I0607 08:48:22.145016 24964 net.cpp:165] conv2 does not need backward computation.
I0607 08:48:22.145021 24964 net.cpp:165] norm1 does not need backward computation.
I0607 08:48:22.145025 24964 net.cpp:165] pool1 does not need backward computation.
I0607 08:48:22.145030 24964 net.cpp:165] relu1 does not need backward computation.
I0607 08:48:22.145032 24964 net.cpp:165] conv1 does not need backward computation.
I0607 08:48:22.145037 24964 net.cpp:201] This network produces output fc-rcnn
I0607 08:48:22.145051 24964 net.cpp:446] Collecting Learning Rate and Weight Decay.
I0607 08:48:22.145059 24964 net.cpp:213] Network initialization done.
I0607 08:48:22.145063 24964 net.cpp:214] Memory required for data: 62425920
E0607 08:48:22.439707 24964 upgrade_proto.cpp:618] Attempting to upgrade input file specified using deprecated V1LayerParameter: /opt/caffe/models/bvlc_reference_rcnn_ilsvrc13/bvlc_reference_rcnn_ilsvrc13.caffemodel
I0607 08:48:22.615486 24964 upgrade_proto.cpp:626] Successfully upgraded file specified using deprecated V1LayerParameter
GPU mode
Loading input...
selective_search_rcnn({'/opt/caffe/images/fish-bike.jpg'}, '/tmp/tmpKULeLF.mat')

Traceback (most recent call last):
  File "python/detect.py", line 170, in <module>
    main(sys.argv)
  File "python/detect.py", line 141, in main
    detections = detector.detect_selective_search(inputs)
  File "/opt/caffe/python/caffe/detector.py", line 118, in detect_selective_search
    cmd='selective_search_rcnn'
  File "/opt/caffe/python/caffe/selective_search_ijcv_with_python/selective_search.py", line 36, in get_windows
    shlex.split(mc), stdout=open('/dev/null', 'w'), cwd=script_dirname)
  File "/usr/lib/python2.7/subprocess.py", line 710, in __init__
    errread, errwrite)
  File "/usr/lib/python2.7/subprocess.py", line 1327, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory
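
Note: an "OSError: [Errno 2]" from subprocess.Popen means the executable it was asked to launch could not be found on the PATH; here the command being launched is the MATLAB invocation printed just above the traceback. A minimal sketch that reproduces the same symptom (the command name is hypothetical), assuming Python 2.7 as in the traceback:

python -c "import subprocess; subprocess.Popen(['some-missing-command'])"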

CUDA 6.5 and MATLAB R2014b are installed on my machine without any issue.
I pass my NVIDIA devices and the MATLAB installation through to Docker with the following commands:

DOCKER_NVIDIA_DEVICES="--device /dev/nvidia0:/dev/nvidia0 --device /dev/nvidiactl:/dev/nvidiactl --device /dev/nvidia-uvm:/dev/nvidia-uvm"
sudo docker run -ti --rm $DOCKER_NVIDIA_DEVICES -v /usr/local/MATLAB:/usr/local/MATLAB tleyden5iwx/caffe-gpu-master /bin/bash
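
As a sanity check, one way to confirm the devices were actually passed through is to list them inside the container (a minimal check; device paths as in the command above):

ls -l /dev/nvidia0 /dev/nvidiactl /dev/nvidia-uvm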

Below is my Makefile.config:

## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
# USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
        -gencode arch=compute_20,code=sm_21 \
        -gencode arch=compute_30,code=sm_30 \
        -gencode arch=compute_35,code=sm_35 \
        -gencode arch=compute_50,code=sm_50 \
        -gencode arch=compute_50,code=compute_50

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
BLAS := atlas
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
MATLAB_DIR := /usr/local/MATLAB/R2014b
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
PYTHON_INCLUDE := /usr/include/python2.7 \
        /usr/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
# ANACONDA_HOME := $(HOME)/anaconda
# PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
        # $(ANACONDA_HOME)/include/python2.7 \
        # $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include \

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib
# PYTHON_LIB := $(ANACONDA_HOME)/lib

# Uncomment to support layers written in Python (will link against Python libs)
# WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @
CXX := /usr/bin/g++-4.6

Next, I go to the Caffe root directory, which is /opt/caffe, and run:

make clean
make all -j8
make test -j8
make runtest -j8

These are the last few lines of the output:

[==========] 1026 tests from 190 test cases ran. (91992 ms total)
[  PASSED  ] 1026 tests.

  YOU HAVE 2 DISABLED TESTS

Compiling the Python wrappers with make pycaffe gives the output below:

CXX/LD -o python/caffe/_caffe.so python/caffe/_caffe.cpp
touch python/caffe/proto/__init__.py
PROTOC (python) src/caffe/proto/caffe.proto

Compiling the MATLAB wrappers with make matcaffe gives the output below:

MEX matlab/caffe/matcaffe.cpp
Building with 'g++'.
MEX completed successfully.

In addition, I can run the LeNet MNIST tutorial and the ImageNet classification example without any issue, but not R-CNN. Help, please?


kar-fai commented Jun 7, 2015

Just found the solution in #1030. The error occurs because the matlab command is not available in the terminal. To solve it, add

export PATH=/usr/local/MATLAB/R2014b/bin/:$PATH

assuming that your MATLAB is installed under /usr/local/.
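
To verify the fix, check that the matlab binary now resolves, and optionally persist the change (a minimal sketch, assuming the install path above):

which matlab
# should print /usr/local/MATLAB/R2014b/bin/matlab
echo 'export PATH=/usr/local/MATLAB/R2014b/bin/:$PATH' >> ~/.bashrc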

kar-fai closed this as completed Jun 7, 2015
@SevenBlocks

I had this same problem. Thanks for posting the solution.

@baolinhtb

I also ran into the same problem. Thank you very much, kar-fai!
