
Installing and Configuring the Python Version of Faster R-CNN

I. Software requirements:
1. Caffe and pycaffe. Caffe must be built with support for Python layers (a quick grep check follows the snippet):

# In your Makefile.config, make sure to have this line uncommented
WITH_PYTHON_LAYER := 1
# Unrelatedly, it's also recommended that you use CUDNN
USE_CUDNN := 1
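
A quick way to confirm that both flags are active in your Caffe build (a minimal check; run it from the directory that holds Makefile.config) is:

grep -E '^(WITH_PYTHON_LAYER|USE_CUDNN)' Makefile.config
# both lines should be printed without a leading '#'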

2. Required Python packages: cython, python-opencv, easydict.
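
If any of these are missing, they can usually be installed along these lines (a sketch for Ubuntu with Python 2.7; package names may differ on other systems):

pip install cython easydict        # Cython and easydict from PyPI
sudo apt-get install python-opencv # OpenCV bindings for Python 2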

II. Hardware requirements:
1. For training smaller networks (ZF, VGG_CNN_M_1024), a good GPU with about 3 GB of memory (e.g., Titan, K20, K40, ...) is enough (a command for checking your GPU's memory follows this list).
2. For training Fast R-CNN with VGG16, you will need something like a K40 (~11 GB of memory).
3. For training Faster R-CNN with VGG16 end to end, about 3 GB of GPU memory is sufficient when cuDNN is used.
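
A quick way to check how much memory your GPU actually has (assuming the NVIDIA driver is installed) is:

nvidia-smi --query-gpu=name,memory.total --format=csv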

III. Installation

1. Clone the Faster R-CNN source from GitHub. Be sure to clone from the command line with the --recursive flag; downloading the archive through the browser will leave out the submodules (a quick check follows the command):

# Make sure to clone with --recursive
git clone --recursive https://github.com/rbgirshick/py-faster-rcnn.git
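
If you are unsure whether the clone pulled everything, a minimal sanity check is to verify that the caffe-fast-rcnn submodule was populated:

cd py-faster-rcnn
git submodule status   # caffe-fast-rcnn should be listed without a leading '-' (i.e., initialized)
ls caffe-fast-rcnn     # should not be empty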

2. Build the Cython modules (a quick check of the build output follows the command):

cd $FRCN_ROOT/lib
make
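
This compiles the Cython/CUDA extensions under lib/. If the GPU nms extension fails to build, check the CUDA arch flag hard-coded in lib/setup.py. After a successful build the compiled .so files should be present, e.g.:

find . -name '*.so'   # typically utils/cython_bbox.so and the nms/*_nms.so modules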

3. Build Caffe and pycaffe (a note on Makefile.config follows the commands):

cd $FRCN_ROOT/caffe-fast-rcnn
make -j8 && make pycaffe
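
Note that make expects a Makefile.config in caffe-fast-rcnn. If this Caffe fork has not been configured yet, the usual starting point is to copy the shipped template and edit it as described in the software requirements above:

cp Makefile.config.example Makefile.config
# then uncomment WITH_PYTHON_LAYER := 1 (and USE_CUDNN := 1 if cuDNN is installed) before running make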

Note: two errors commonly come up during this build:
(1) A cuDNN version mismatch can break the compilation; cuDNN v4 is recommended when building this Caffe fork.
(2) If make fails with an error like string.h 'memcpy' was not declared in this scope, the gcc compiler is too new. The fix is to open the Makefile, search for

NVCCFLAGS += -ccbin=$(CXX) -Xcompiler -fPIC $(COMMON_FLAGS)

and replace it with

NVCCFLAGS += -D_FORCE_INLINES -ccbin=$(CXX) -Xcompiler -fPIC $(COMMON_FLAGS)

(a sed one-liner for this edit is sketched after this note).

Of course, if Caffe is set up correctly, these problems should not appear in the first place.
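
For reference, the Makefile edit above can also be applied in place with sed (a sketch; back up the Makefile first, since it assumes the NVCCFLAGS line matches the upstream text exactly):

cd $FRCN_ROOT/caffe-fast-rcnn
sed -i 's/NVCCFLAGS += -ccbin/NVCCFLAGS += -D_FORCE_INLINES -ccbin/' Makefile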

4. Download the pre-trained Faster R-CNN models:

cd $FRCN_ROOT
./data/scripts/fetch_faster_rcnn_models.sh

This step downloads a compressed archive, faster_rcnn_models (about 700 MB), into the $FRCN_ROOT/data folder; extract it there before moving on.
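
After extracting, it is worth confirming that the model files ended up where the demo expects them (a minimal check; the exact file names come from the downloaded archive):

ls $FRCN_ROOT/data/faster_rcnn_models/
# the VGG16 and ZF *_faster_rcnn_final.caffemodel files should be listed here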

IV. Testing
Once the basic installation is complete, run the demo to verify it (a note on the demo's options follows the command):

cd $FRCN_ROOT
./tools/demo.py
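
By default demo.py runs the VGG16 model on GPU 0. The script also accepts a few command-line options; check ./tools/demo.py --help for the exact set on your checkout. For example (flag names as I recall them upstream, so treat them as assumptions):

./tools/demo.py --help      # list the available options
./tools/demo.py --net zf    # assumed flag: run the smaller ZF model instead of VGG16
./tools/demo.py --cpu       # assumed flag: CPU-only mode (much slower)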

If the run produces output like the following, the installation succeeded:

/usr/lib/python2.7/dist-packages/matplotlib/font_manager.py:273: UserWarning: Matplotlib is building the font cache using fc-list. This may take a moment.
  warnings.warn('Matplotlib is building the font cache using fc-list. This may take a moment.')
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0908 20:21:59.205785  4215 net.cpp:49] Initializing net from parameters: 
name: "VGG_ILSVRC_16_layers"
input: "data"
input: "im_info"
state {
  phase: TEST
}
input_shape {
  dim: 1
  dim: 3
  dim: 224
  dim: 224
}
input_shape {
  dim: 1
  dim: 3
}
layer {
  name: "conv1_1"
  type: "Convolution"
  bottom: "data"
  top: "conv1_1"
  convolution_param {
    num_output: 64
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu1_1"
  type: "ReLU"
  bottom: "conv1_1"
  top: "conv1_1"
}
layer {
  name: "conv1_2"
  type: "Convolution"
  bottom: "conv1_1"
  top: "conv1_2"
  convolution_param {
    num_output: 64
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu1_2"
  type: "ReLU"
  bottom: "conv1_2"
  top: "conv1_2"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1_2"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 2
    stride: 2
  }
}
layer {
  name: "conv2_1"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2_1"
  convolution_param {
    num_output: 128
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu2_1"
  type: "ReLU"
  bottom: "conv2_1"
  top: "conv2_1"
}
layer {
  name: "conv2_2"
  type: "Convolution"
  bottom: "conv2_1"
  top: "conv2_2"
  convolution_param {
    num_output: 128
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu2_2"
  type: "ReLU"
  bottom: "conv2_2"
  top: "conv2_2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2_2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 2
    stride: 2
  }
}
layer {
  name: "conv3_1"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3_1"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3_1"
  type: "ReLU"
  bottom: "conv3_1"
  top: "conv3_1"
}
layer {
  name: "conv3_2"
  type: "Convolution"
  bottom: "conv3_1"
  top: "conv3_2"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3_2"
  type: "ReLU"
  bottom: "conv3_2"
  top: "conv3_2"
}
layer {
  name: "conv3_3"
  type: "Convolution"
  bottom: "conv3_2"
  top: "conv3_3"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3_3"
  type: "ReLU"
  bottom: "conv3_3"
  top: "conv3_3"
}
layer {
  name: "pool3"
  type: "Pooling"
  bottom: "conv3_3"
  top: "pool3"
  pooling_param {
    pool: MAX
    kernel_size: 2
    stride: 2
  }
}
layer {
  name: "conv4_1"
  type: "Convolution"
  bottom: "pool3"
  top: "conv4_1"
  convolution_param {
    num_output: 512
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu4_1"
  type: "ReLU"
  bottom: "conv4_1"
  top: "conv4_1"
}
layer {
  name: "conv4_2"
  type: "Convolution"
  bottom: "conv4_1"
  top: "conv4_2"
  convolution_param {
    num_output: 512
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu4_2"
  type: "ReLU"
  bottom: "conv4_2"
  top: "conv4_2"
}
layer {
  name: "conv4_3"
  type: "Convolution"
  bottom: "conv4_2"
  top: "conv4_3"
  convolution_param {
    num_output: 512
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu4_3"
  type: "ReLU"
  bottom: "conv4_3"
  top: "conv4_3"
}
layer {
  name: "pool4"
  type: "Pooling"
  bottom: "conv4_3"
  top: "pool4"
  pooling_param {
    pool: MAX
    kernel_size: 2
    stride: 2
  }
}
layer {
  name: "conv5_1"
  type: "Convolution"
  bottom: "pool4"
  top: "conv5_1"
  convolution_param {
    num_output: 512
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu5_1"
  type: "ReLU"
  bottom: "conv5_1"
  top: "conv5_1"
}
layer {
  name: "conv5_2"
  type: "Convolution"
  bottom: "conv5_1"
  top: "conv5_2"
  convolution_param {
    num_output: 512
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu5_2"
  type: "ReLU"
  bottom: "conv5_2"
  top: "conv5_2"
}
layer {
  name: "conv5_3"
  type: "Convolution"
  bottom: "conv5_2"
  top: "conv5_3"
  convolution_param {
    num_output: 512
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu5_3"
  type: "ReLU"
  bottom: "conv5_3"
  top: "conv5_3"
}
layer {
  name: "rpn_conv/3x3"
  type: "Convolution"
  bottom: "conv5_3"
  top: "rpn/output"
  convolution_param {
    num_output: 512
    pad: 1
    kernel_size: 3
    stride: 1
  }
}
layer {
  name: "rpn_relu/3x3"
  type: "ReLU"
  bottom: "rpn/output"
  top: "rpn/output"
}
layer {
  name: "rpn_cls_score"
  type: "Convolution"
  bottom: "rpn/output"
  top: "rpn_cls_score"
  convolution_param {
    num_output: 18
    pad: 0
    kernel_size: 1
    stride: 1
  }
}
layer {
  name: "rpn_bbox_pred"
  type: "Convolution"
  bottom: "rpn/output"
  top: "rpn_bbox_pred"
  convolution_param {
    num_output: 36
    pad: 0
    kernel_size: 1
    stride: 1
  }
}
layer {
  name: "rpn_cls_score_reshape"
  type: "Reshape"
  bottom: "rpn_cls_score"
  top: "rpn_cls_score_reshape"
  reshape_param {
    shape {
      dim: 0
      dim: 2
      dim: -1
      dim: 0
    }
  }
}
layer {
  name: "rpn_cls_prob"
  type: "Softmax"
  bottom: "rpn_cls_score_reshape"
  top: "rpn_cls_prob"
}
layer {
  name: "rpn_cls_prob_reshape"
  type: "Reshape"
  bottom: "rpn_cls_prob"
  top: "rpn_cls_prob_reshape"
  reshape_param {
    shape {
      dim: 0
      dim: 18
      dim: -1
      dim: 0
    }
  }
}
layer {
  name: "proposal"
  type: "Python"
  bottom: "rpn_cls_prob_reshape"
  bottom: "rpn_bbox_pred"
  bottom: "im_info"
  top: "rois"
  python_param {
    module: "rpn.proposal_layer"
    layer: "ProposalLayer"
    param_str: "\'feat_stride\': 16"
  }
}
layer {
  name: "roi_pool5"
  type: "ROIPooling"
  bottom: "conv5_3"
  bottom: "rois"
  top: "pool5"
  roi_pooling_param {
    pooled_h: 7
    pooled_w: 7
    spatial_scale: 0.0625
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "cls_score"
  type: "InnerProduct"
  bottom: "fc7"
  top: "cls_score"
  inner_product_param {
    num_output: 21
  }
}
layer {
  name: "bbox_pred"
  type: "InnerProduct"
  bottom: "fc7"
  top: "bbox_pred"
  inner_product_param {
    num_output: 84
  }
}
layer {
  name: "cls_prob"
  type: "Softmax"
  bottom: "cls_score"
  top: "cls_prob"
}
I0908 20:21:59.206061  4215 net.cpp:413] Input 0 -> data
I0908 20:21:59.309164  4215 net.cpp:413] Input 1 -> im_info
I0908 20:21:59.309209  4215 layer_factory.hpp:77] Creating layer conv1_1
I0908 20:21:59.309229  4215 net.cpp:106] Creating Layer conv1_1
I0908 20:21:59.309232  4215 net.cpp:454] conv1_1 <- data
I0908 20:21:59.309237  4215 net.cpp:411] conv1_1 -> conv1_1
I0908 20:22:00.151002  4215 net.cpp:150] Setting up conv1_1
I0908 20:22:00.151026  4215 net.cpp:157] Top shape: 1 64 224 224 (3211264)
I0908 20:22:00.151029  4215 net.cpp:165] Memory required for data: 12845056
I0908 20:22:00.151042  4215 layer_factory.hpp:77] Creating layer relu1_1
I0908 20:22:00.151053  4215 net.cpp:106] Creating Layer relu1_1
I0908 20:22:00.151057  4215 net.cpp:454] relu1_1 <- conv1_1
I0908 20:22:00.151062  4215 net.cpp:397] relu1_1 -> conv1_1 (in-place)
I0908 20:22:00.151320  4215 net.cpp:150] Setting up relu1_1
I0908 20:22:00.151327  4215 net.cpp:157] Top shape: 1 64 224 224 (3211264)
I0908 20:22:00.151329  4215 net.cpp:165] Memory required for data: 25690112
I0908 20:22:00.151331  4215 layer_factory.hpp:77] Creating layer conv1_2
I0908 20:22:00.151340  4215 net.cpp:106] Creating Layer conv1_2
I0908 20:22:00.151342  4215 net.cpp:454] conv1_2 <- conv1_1
I0908 20:22:00.151347  4215 net.cpp:411] conv1_2 -> conv1_2
I0908 20:22:00.152340  4215 net.cpp:150] Setting up conv1_2
I0908 20:22:00.152350  4215 net.cpp:157] Top shape: 1 64 224 224 (3211264)
I0908 20:22:00.152354  4215 net.cpp:165] Memory required for data: 38535168
I0908 20:22:00.152359  4215 layer_factory.hpp:77] Creating layer relu1_2
I0908 20:22:00.152364  4215 net.cpp:106] Creating Layer relu1_2
I0908 20:22:00.152367  4215 net.cpp:454] relu1_2 <- conv1_2
I0908 20:22:00.152371  4215 net.cpp:397] relu1_2 -> conv1_2 (in-place)
I0908 20:22:00.152498  4215 net.cpp:150] Setting up relu1_2
I0908 20:22:00.152503  4215 net.cpp:157] Top shape: 1 64 224 224 (3211264)
I0908 20:22:00.152504  4215 net.cpp:165] Memory required for data: 51380224
I0908 20:22:00.152506  4215 layer_factory.hpp:77] Creating layer pool1
I0908 20:22:00.152513  4215 net.cpp:106] Creating Layer pool1
I0908 20:22:00.152513  4215 net.cpp:454] pool1 <- conv1_2
I0908 20:22:00.152518  4215 net.cpp:411] pool1 -> pool1
I0908 20:22:00.152549  4215 net.cpp:150] Setting up pool1
I0908 20:22:00.152552  4215 net.cpp:157] Top shape: 1 64 112 112 (802816)
I0908 20:22:00.152554  4215 net.cpp:165] Memory required for data: 54591488
I0908 20:22:00.152555  4215 layer_factory.hpp:77] Creating layer conv2_1
I0908 20:22:00.152561  4215 net.cpp:106] Creating Layer conv2_1
I0908 20:22:00.152564  4215 net.cpp:454] conv2_1 <- pool1
I0908 20:22:00.152566  4215 net.cpp:411] conv2_1 -> conv2_1
I0908 20:22:00.154081  4215 net.cpp:150] Setting up conv2_1
I0908 20:22:00.154090  4215 net.cpp:157] Top shape: 1 128 112 112 (1605632)
I0908 20:22:00.154093  4215 net.cpp:165] Memory required for data: 61014016
I0908 20:22:00.154098  4215 layer_factory.hpp:77] Creating layer relu2_1
I0908 20:22:00.154103  4215 net.cpp:106] Creating Layer relu2_1
I0908 20:22:00.154105  4215 net.cpp:454] relu2_1 <- conv2_1
I0908 20:22:00.154110  4215 net.cpp:397] relu2_1 -> conv2_1 (in-place)
I0908 20:22:00.154381  4215 net.cpp:150] Setting up relu2_1
I0908 20:22:00.154388  4215 net.cpp:157] Top shape: 1 128 112 112 (1605632)
I0908 20:22:00.154391  4215 net.cpp:165] Memory required for data: 67436544
I0908 20:22:00.154392  4215 layer_factory.hpp:77] Creating layer conv2_2
I0908 20:22:00.154398  4215 net.cpp:106] Creating Layer conv2_2
I0908 20:22:00.154402  4215 net.cpp:454] conv2_2 <- conv2_1
I0908 20:22:00.154405  4215 net.cpp:411] conv2_2 -> conv2_2
I0908 20:22:00.155148  4215 net.cpp:150] Setting up conv2_2
I0908 20:22:00.155155  4215 net.cpp:157] Top shape: 1 128 112 112 (1605632)
I0908 20:22:00.155158  4215 net.cpp:165] Memory required for data: 73859072
I0908 20:22:00.155163  4215 layer_factory.hpp:77] Creating layer relu2_2
I0908 20:22:00.155167  4215 net.cpp:106] Creating Layer relu2_2
I0908 20:22:00.155170  4215 net.cpp:454] relu2_2 <- conv2_2
I0908 20:22:00.155175  4215 net.cpp:397] relu2_2 -> conv2_2 (in-place)
I0908 20:22:00.155441  4215 net.cpp:150] Setting up relu2_2
I0908 20:22:00.155449  4215 net.cpp:157] Top shape: 1 128 112 112 (1605632)
I0908 20:22:00.155452  4215 net.cpp:165] Memory required for data: 80281600
I0908 20:22:00.155453  4215 layer_factory.hpp:77] Creating layer pool2
I0908 20:22:00.155457  4215 net.cpp:106] Creating Layer pool2
I0908 20:22:00.155459  4215 net.cpp:454] pool2 <- conv2_2
I0908 20:22:00.155463  4215 net.cpp:411] pool2 -> pool2
I0908 20:22:00.155493  4215 net.cpp:150] Setting up pool2
I0908 20:22:00.155496  4215 net.cpp:157] Top shape: 1 128 56 56 (401408)
I0908 20:22:00.155498  4215 net.cpp:165] Memory required for data: 81887232
I0908 20:22:00.155500  4215 layer_factory.hpp:77] Creating layer conv3_1
I0908 20:22:00.155504  4215 net.cpp:106] Creating Layer conv3_1
I0908 20:22:00.155506  4215 net.cpp:454] conv3_1 <- pool2
I0908 20:22:00.155510  4215 net.cpp:411] conv3_1 -> conv3_1
I0908 20:22:00.156734  4215 net.cpp:150] Setting up conv3_1
I0908 20:22:00.156743  4215 net.cpp:157] Top shape: 1 256 56 56 (802816)
I0908 20:22:00.156744  4215 net.cpp:165] Memory required for data: 85098496
I0908 20:22:00.156749  4215 layer_factory.hpp:77] Creating layer relu3_1
I0908 20:22:00.156754  4215 net.cpp:106] Creating Layer relu3_1
I0908 20:22:00.156756  4215 net.cpp:454] relu3_1 <- conv3_1
I0908 20:22:00.156761  4215 net.cpp:397] relu3_1 -> conv3_1 (in-place)
I0908 20:22:00.156880  4215 net.cpp:150] Setting up relu3_1
I0908 20:22:00.156886  4215 net.cpp:157] Top shape: 1 256 56 56 (802816)
I0908 20:22:00.156888  4215 net.cpp:165] Memory required for data: 88309760
I0908 20:22:00.156889  4215 layer_factory.hpp:77] Creating layer conv3_2
I0908 20:22:00.156894  4215 net.cpp:106] Creating Layer conv3_2
I0908 20:22:00.156896  4215 net.cpp:454] conv3_2 <- conv3_1
I0908 20:22:00.156901  4215 net.cpp:411] conv3_2 -> conv3_2
I0908 20:22:00.167906  4215 net.cpp:150] Setting up conv3_2
I0908 20:22:00.167923  4215 net.cpp:157] Top shape: 1 256 56 56 (802816)
I0908 20:22:00.167925  4215 net.cpp:165] Memory required for data: 91521024
I0908 20:22:00.167932  4215 layer_factory.hpp:77] Creating layer relu3_2
I0908 20:22:00.167939  4215 net.cpp:106] Creating Layer relu3_2
I0908 20:22:00.167943  4215 net.cpp:454] relu3_2 <- conv3_2
I0908 20:22:00.167948  4215 net.cpp:397] relu3_2 -> conv3_2 (in-place)
I0908 20:22:00.168265  4215 net.cpp:150] Setting up relu3_2
I0908 20:22:00.168273  4215 net.cpp:157] Top shape: 1 256 56 56 (802816)
I0908 20:22:00.168275  4215 net.cpp:165] Memory required for data: 94732288
I0908 20:22:00.168278  4215 layer_factory.hpp:77] Creating layer conv3_3
I0908 20:22:00.168287  4215 net.cpp:106] Creating Layer conv3_3
I0908 20:22:00.168289  4215 net.cpp:454] conv3_3 <- conv3_2
I0908 20:22:00.168293  4215 net.cpp:411] conv3_3 -> conv3_3
I0908 20:22:00.169926  4215 net.cpp:150] Setting up conv3_3
I0908 20:22:00.169934  4215 net.cpp:157] Top shape: 1 256 56 56 (802816)
I0908 20:22:00.169936  4215 net.cpp:165] Memory required for data: 97943552
I0908 20:22:00.169940  4215 layer_factory.hpp:77] Creating layer relu3_3
I0908 20:22:00.169945  4215 net.cpp:106] Creating Layer relu3_3
I0908 20:22:00.169947  4215 net.cpp:454] relu3_3 <- conv3_3
I0908 20:22:00.169950  4215 net.cpp:397] relu3_3 -> conv3_3 (in-place)
I0908 20:22:00.170212  4215 net.cpp:150] Setting up relu3_3
I0908 20:22:00.170219  4215 net.cpp:157] Top shape: 1 256 56 56 (802816)
I0908 20:22:00.170222  4215 net.cpp:165] Memory required for data: 101154816
I0908 20:22:00.170223  4215 layer_factory.hpp:77] Creating layer pool3
I0908 20:22:00.170231  4215 net.cpp:106] Creating Layer pool3
I0908 20:22:00.170234  4215 net.cpp:454] pool3 <- conv3_3
I0908 20:22:00.170238  4215 net.cpp:411] pool3 -> pool3
I0908 20:22:00.170269  4215 net.cpp:150] Setting up pool3
I0908 20:22:00.170274  4215 net.cpp:157] Top shape: 1 256 28 28 (200704)
I0908 20:22:00.170274  4215 net.cpp:165] Memory required for data: 101957632
I0908 20:22:00.170276  4215 layer_factory.hpp:77] Creating layer conv4_1
I0908 20:22:00.170281  4215 net.cpp:106] Creating Layer conv4_1
I0908 20:22:00.170284  4215 net.cpp:454] conv4_1 <- pool3
I0908 20:22:00.170285  4215 net.cpp:411] conv4_1 -> conv4_1
I0908 20:22:00.172976  4215 net.cpp:150] Setting up conv4_1
I0908 20:22:00.172994  4215 net.cpp:157] Top shape: 1 512 28 28 (401408)
I0908 20:22:00.172997  4215 net.cpp:165] Memory required for data: 103563264
I0908 20:22:00.173003  4215 layer_factory.hpp:77] Creating layer relu4_1
I0908 20:22:00.173010  4215 net.cpp:106] Creating Layer relu4_1
I0908 20:22:00.173013  4215 net.cpp:454] relu4_1 <- conv4_1
I0908 20:22:00.173018  4215 net.cpp:397] relu4_1 -> conv4_1 (in-place)
I0908 20:22:00.173144  4215 net.cpp:150] Setting up relu4_1
I0908 20:22:00.173151  4215 net.cpp:157] Top shape: 1 512 28 28 (401408)
I0908 20:22:00.173151  4215 net.cpp:165] Memory required for data: 105168896
I0908 20:22:00.173153  4215 layer_factory.hpp:77] Creating layer conv4_2
I0908 20:22:00.173159  4215 net.cpp:106] Creating Layer conv4_2
I0908 20:22:00.173162  4215 net.cpp:454] conv4_2 <- conv4_1
I0908 20:22:00.173166  4215 net.cpp:411] conv4_2 -> conv4_2
I0908 20:22:00.178277  4215 net.cpp:150] Setting up conv4_2
I0908 20:22:00.178311  4215 net.cpp:157] Top shape: 1 512 28 28 (401408)
I0908 20:22:00.178314  4215 net.cpp:165] Memory required for data: 106774528
I0908 20:22:00.178331  4215 layer_factory.hpp:77] Creating layer relu4_2
I0908 20:22:00.178349  4215 net.cpp:106] Creating Layer relu4_2
I0908 20:22:00.178354  4215 net.cpp:454] relu4_2 <- conv4_2
I0908 20:22:00.178359  4215 net.cpp:397] relu4_2 -> conv4_2 (in-place)
I0908 20:22:00.178628  4215 net.cpp:150] Setting up relu4_2
I0908 20:22:00.178637  4215 net.cpp:157] Top shape: 1 512 28 28 (401408)
I0908 20:22:00.178638  4215 net.cpp:165] Memory required for data: 108380160
I0908 20:22:00.178640  4215 layer_factory.hpp:77] Creating layer conv4_3
I0908 20:22:00.178647  4215 net.cpp:106] Creating Layer conv4_3
I0908 20:22:00.178649  4215 net.cpp:454] conv4_3 <- conv4_2
I0908 20:22:00.178654  4215 net.cpp:411] conv4_3 -> conv4_3
I0908 20:22:00.183998  4215 net.cpp:150] Setting up conv4_3
I0908 20:22:00.184022  4215 net.cpp:157] Top shape: 1 512 28 28 (401408)
I0908 20:22:00.184026  4215 net.cpp:165] Memory required for data: 109985792
I0908 20:22:00.184043  4215 layer_factory.hpp:77] Creating layer relu4_3
I0908 20:22:00.184052