
Caffe Layer Series: ReLU Layer

The ReLU layer is one of the nonlinear activations used in deep learning; it typically follows a convolution or normalization layer (although this is not required).

First, let's look at ReLUParameter:

// Message that stores parameters used by ReLULayer
message ReLUParameter {
  // Allow non-zero slope for negative inputs to speed up optimization
  // Described in:
  // Maas, A. L., Hannun, A. Y., & Ng, A. Y. (2013). Rectifier nonlinearities
  // improve neural network acoustic models. In ICML Workshop on Deep Learning
  // for Audio, Speech, and Language Processing.
  optional float negative_slope = 1 [default = 0];  // slope for x < 0; 0 gives the standard ReLU, a non-zero value gives a ReLU variant (leaky ReLU)
  enum Engine {
    DEFAULT = 0;
    CAFFE = 1;
    CUDNN = 2;
  }
  optional Engine engine = 2 [default = DEFAULT];
}
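
To make the effect of negative_slope concrete, below is a minimal standalone C++ sketch of the elementwise forward pass. The formula y = max(x, 0) + negative_slope * min(x, 0) is the same form used by Caffe's CPU implementation; the function name relu_forward and the test values are just for illustration.

#include <algorithm>
#include <cstdio>
#include <vector>

// Elementwise ReLU forward pass with an optional leaky slope:
//   y = max(x, 0) + negative_slope * min(x, 0)
// negative_slope = 0 reproduces the standard ReLU.
void relu_forward(const std::vector<float>& bottom, std::vector<float>& top,
                  float negative_slope) {
  top.resize(bottom.size());
  for (size_t i = 0; i < bottom.size(); ++i) {
    top[i] = std::max(bottom[i], 0.0f)
           + negative_slope * std::min(bottom[i], 0.0f);
  }
}

int main() {
  std::vector<float> bottom = {-2.0f, -0.5f, 0.0f, 1.5f};
  std::vector<float> top;
  relu_forward(bottom, top, 0.0f);   // standard ReLU: -2 -> 0, 1.5 -> 1.5
  relu_forward(bottom, top, 0.1f);   // leaky ReLU:    -2 -> -0.2
  for (float v : top) std::printf("%g ", v);  // prints: -0.2 -0.05 0 1.5
  std::printf("\n");
  return 0;
}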

The ReLU layer is written in a prototxt as follows:

layer {
  name: "relu"
  type: "ReLU"
  bottom: "conv/bn"
  top: "conv/bn"
}
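
Setting negative_slope to a non-zero value turns the layer into a leaky ReLU. A sketch of what that looks like in a prototxt (the slope value 0.1 is just an illustrative choice):

layer {
  name: "relu"
  type: "ReLU"
  bottom: "conv/bn"
  top: "conv/bn"
  relu_param {
    negative_slope: 0.1
  }
}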

For example, in MobileNet:

layer {
  name: "relu6_4"
  type: "ReLU"
  bottom: "conv6_4/bn"
  top: "conv6_4/bn"
}