
CReLU Activation Function

ICML 2016

Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units

Analyzing the internal structure of CNNs, the paper finds that the filters learned in the lower convolutional layers come in negatively correlated pairs: "they appear surprisingly opposite to each other, i.e., for each filter, there does exist another filter that is almost on the opposite phase".


The paper's histogram figure shows that in the first convolutional layer, the blue histogram is distributed symmetrically around a center of -0.5, i.e., many filters occur in opposite-phase pairs. The deeper the layer, the more the blue histogram concentrates toward zero, and the fewer paired filters there are.
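To make the pairing statistic concrete, here is a minimal numpy sketch (not the paper's code; the function name and shapes are illustrative) that, for each filter, finds the cosine similarity to its most anti-correlated filter in the same layer:

```python
import numpy as np

def pairing_similarities(filters):
    """For each filter, return the cosine similarity to its most
    negatively correlated ("pairing") filter in the same layer.

    filters: array of shape (num_filters, k*k*c), one flattened filter per row.
    """
    # Normalize each filter to unit length.
    norms = np.linalg.norm(filters, axis=1, keepdims=True)
    unit = filters / np.clip(norms, 1e-12, None)
    # Pairwise cosine similarities between all filters.
    cos = unit @ unit.T
    # Mask each filter's similarity with itself before taking the minimum.
    np.fill_diagonal(cos, np.inf)
    return cos.min(axis=1)

# Example: 64 hypothetical conv1-style filters of size 7x7x3.
sims = pairing_similarities(np.random.randn(64, 7 * 7 * 3))
# A histogram of `sims` centered near -0.5 would indicate many opposite-phase pairs.
```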

In other words, the learned filters are redundant. To exploit this, the paper designs CReLU: "It simply makes an identical copy of the linear responses after convolution, negate them, concatenate both parts of activation, and then apply ReLU altogether". Formally, CReLU(x) = [ReLU(x), ReLU(-x)], so the number of output channels doubles while the negative-phase information is preserved.
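A minimal PyTorch sketch of this definition (the module is our own illustration; the paper does not prescribe an implementation):

```python
import torch
import torch.nn as nn

class CReLU(nn.Module):
    """Concatenated ReLU: concatenate ReLU(x) and ReLU(-x) along the
    channel axis, doubling the number of channels."""
    def forward(self, x):
        # x: (N, C, H, W) -> (N, 2C, H, W)
        return torch.cat((torch.relu(x), torch.relu(-x)), dim=1)

x = torch.randn(8, 32, 28, 28)   # a batch of convolution responses
y = CReLU()(x)
print(y.shape)                   # torch.Size([8, 64, 28, 28])
```

Note that any convolution following a CReLU must expect twice as many input channels as the layer before it produced.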

The results are also compared with other methods (see the result tables in the paper).

CReLU also reconstructs inputs well: because it keeps both the positive and negative phases of each response, reconstructions of the input from CReLU features lose less information than those from plain ReLU features.