Installing and Configuring the Python Version of Faster R-CNN
2016-09-09 08:45
This article follows the official GitHub instructions; the reference is:
https://github.com/rbgirshick/py-faster-rcnn
一、 Software requirements
1. Caffe must be built with support for Python layers:
# In your Makefile.config, make sure to have this line uncommented
WITH_PYTHON_LAYER := 1
# Unrelatedly, it's also recommended that you use CUDNN
USE_CUDNN := 1
2. The following Python packages are required: cython, python-opencv, easydict
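Before building, it can save time to confirm that these three packages are importable. The sketch below is just an illustration; note the module-to-package mapping is an assumption (for example, python-opencv is imported as cv2):

```python
import importlib.util

# Map importable module name -> package name from the requirements list.
# (This mapping is an assumption for illustration.)
REQUIRED = {"cython": "cython", "cv2": "python-opencv", "easydict": "easydict"}

def missing_packages():
    """Return the names of required packages that cannot be found."""
    return [pkg for mod, pkg in REQUIRED.items()
            if importlib.util.find_spec(mod) is None]

if __name__ == "__main__":
    missing = missing_packages()
    if missing:
        print("Missing packages: " + ", ".join(missing))
    else:
        print("All required packages are available.")
```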
二、 Hardware requirements
1. For training small networks (ZF, VGG_CNN_M_1024), a good GPU with about 3 GB of memory is sufficient (e.g., Titan, K20, K40, ...).
2. To train Fast R-CNN with VGG16, you need a K40-class GPU (~11 GB of memory).
3. To train a VGG16-based Faster R-CNN end to end, about 3 GB of GPU memory is sufficient when cuDNN is used.
三、 Installation
1. Clone the Faster R-CNN source code from GitHub. Note that you must clone from the command line; do not download the archive in a browser, or you will miss the caffe-fast-rcnn submodule:
# Make sure to clone with --recursive
git clone --recursive https://github.com/rbgirshick/py-faster-rcnn.git
2. Build the Cython modules:
cd $FRCN_ROOT/lib
make
3. Build Caffe and pycaffe:
cd $FRCN_ROOT/caffe-fast-rcnn
make -j8 && make pycaffe
Note: two errors commonly occur during compilation:
(1) A cuDNN version mismatch can break the build; cuDNN v4 is recommended when installing this version of Caffe.
(2) An error during make such as string.h: 'memcpy' was not declared in this scope is caused by a gcc compiler that is too new. The fix is to open the Makefile, search for
NVCCFLAGS += -ccbin=$(CXX) -Xcompiler -fPIC $(COMMON_FLAGS)
and replace it with
NVCCFLAGS += -D_FORCE_INLINES -ccbin=$(CXX) -Xcompiler -fPIC $(COMMON_FLAGS)
Of course, if Caffe is installed correctly, neither problem should appear.
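The Makefile substitution in (2) can also be scripted. The sketch below demonstrates it on a throwaway copy rather than the real file; the /tmp path is purely for illustration, and in practice you would edit $FRCN_ROOT/caffe-fast-rcnn/Makefile:

```shell
# Write a throwaway file containing the offending line (illustration only).
printf 'NVCCFLAGS += -ccbin=$(CXX) -Xcompiler -fPIC $(COMMON_FLAGS)\n' > /tmp/Makefile.demo
# Insert -D_FORCE_INLINES so nvcc compiles with newer gcc versions.
sed -i 's/NVCCFLAGS += -ccbin/NVCCFLAGS += -D_FORCE_INLINES -ccbin/' /tmp/Makefile.demo
cat /tmp/Makefile.demo
```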
4. Download the pre-trained Faster R-CNN models:
cd $FRCN_ROOT
./data/scripts/fetch_faster_rcnn_models.sh
This step downloads a faster_rcnn_models archive of roughly 700 MB into the $FRCN_ROOT/data folder; extract it for later use.
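After the script finishes, it is worth checking that the VGG16 model actually landed where the demo expects it. The path below assumes the default layout; FRCN_ROOT falls back to the current directory here for illustration:

```shell
# Check for the pre-trained VGG16 model under the expected data directory.
MODEL="${FRCN_ROOT:-.}/data/faster_rcnn_models/VGG16_faster_rcnn_final.caffemodel"
if [ -f "$MODEL" ]; then
    echo "model found: $MODEL"
else
    echo "model missing: $MODEL (re-run ./data/scripts/fetch_faster_rcnn_models.sh)"
fi
```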
四、 Testing
After the basic installation is complete, verify it with the demo:
cd $FRCN_ROOT
./tools/demo.py
If output like the following appears, the installation succeeded. A harmless Matplotlib warning may come first:
/usr/lib/python2.7/dist-packages/matplotlib/font_manager.py:273: UserWarning: Matplotlib is building the font cache using fc-list. This may take a moment.
Caffe then prints the full VGG16 network-initialization log; only the beginning and end are reproduced here:
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0908 20:21:59.205785 4215 net.cpp:49] Initializing net from parameters:
name: "VGG_ILSVRC_16_layers"
input: "data"
input: "im_info"
[... per-layer definitions and setup messages omitted ...]
I0908 20:22:00.462399 4215 net.cpp:283] Network initialization done.
Loaded network /home/panyiming/py-faster-rcnn/data/faster_rcnn_models/VGG16_faster_rcnn_final.caffemodel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Demo for data/demo/000456.jpg
Detection took 0.904s for 300 object proposals
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Demo for data/demo/000542.jpg
Detection took 0.810s for
161 object proposals ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Demo for data/demo/001150.jpg Detection took 0.890s for 194 object proposals ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Demo for data/demo/001763.jpg Detection took 0.891s for 196 object proposals ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Demo for data/demo/004545.jpg Detection took 0.915s for 300 object proposals
Once the run completes, the detection results for the sample images will be displayed.
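As the log shows, each image yields a few hundred object proposals, which the demo then filters with non-maximum suppression (NMS) before drawing the final boxes. The repository includes a pure-Python reference implementation of this step (lib/nms/py_cpu_nms.py); the sketch below is an illustrative standalone version of the same idea, not a verbatim copy of the repo's file:

```python
import numpy as np

def nms(dets, thresh):
    """Greedy non-maximum suppression.

    dets: (N, 5) array of [x1, y1, x2, y2, score].
    thresh: IoU overlap threshold; boxes overlapping a kept
            box by more than this are suppressed.
    Returns the indices of the boxes to keep.
    """
    x1, y1, x2, y2 = dets[:, 0], dets[:, 1], dets[:, 2], dets[:, 3]
    scores = dets[:, 4]

    # Box areas (the +1 follows the PASCAL VOC pixel convention).
    areas = (x2 - x1 + 1) * (y2 - y1 + 1)
    # Process boxes in descending order of score.
    order = scores.argsort()[::-1]

    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))

        # Intersection of the top box with all remaining boxes.
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        w = np.maximum(0.0, xx2 - xx1 + 1)
        h = np.maximum(0.0, yy2 - yy1 + 1)
        inter = w * h

        # Intersection-over-union against the kept box.
        ovr = inter / (areas[i] + areas[order[1:]] - inter)

        # Keep only boxes that do not overlap too much.
        inds = np.where(ovr <= thresh)[0]
        order = order[inds + 1]
    return keep

if __name__ == "__main__":
    dets = np.array([[0, 0, 10, 10, 0.9],
                     [1, 1, 11, 11, 0.8],   # overlaps box 0 heavily
                     [50, 50, 60, 60, 0.7]], dtype=float)
    print(nms(dets, 0.3))  # the second box is suppressed
```

In the demo itself this filtering runs once per class on the network's `cls_prob`/`bbox_pred` outputs; the Cython and GPU variants built in the `make` step above exist because this pure-Python loop is the slowest part of test-time inference.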