
[Reading Notes 1][2017] MATLAB and Deep Learning: Example of Multiclass Classification (2)

Using the cross-entropy-driven learning rule, the delta of the output node is calculated as follows:

e = d - y;
delta = e;

Similar to the example from Chapter 3, no other calculation is required.

This is because, in the cross-entropy-driven learning rule that uses the softmax activation function, the delta and the error are identical.
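For reference, this identity is just the standard derivative of the cross-entropy cost through the softmax layer; with d denoting the target vector, y the softmax output, and v the weighted input of the output node, it can be sketched as:

\[
E = -\sum_i d_i \ln y_i, \qquad
y_i = \frac{e^{v_i}}{\sum_j e^{v_j}}
\quad\Longrightarrow\quad
-\frac{\partial E}{\partial v_i} = y_i - d_i \;\Rightarrow\; \delta_i = d_i - y_i = e_i .
\]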

Of course, the back-propagation algorithm shown earlier still applies to the hidden layer:

e1 = W2'*delta;
delta1 = y1.*(1-y1).*e1;
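The listing of MultiClass.m itself is not reproduced in this excerpt. For orientation, a minimal sketch that is consistent with the deltas above and with the way TestMultiClass.m calls it could look as follows; the learning rate alpha, the per-sample SGD update, and the flattening of each 5x5 image into a 25x1 vector are assumptions here, not the book's exact listing.

function [W1, W2] = MultiClass(W1, W2, X, D)
    alpha = 0.9;                          % assumed learning rate
    N = 5;                                % five training images
    for k = 1:N
        x  = reshape(X(:, :, k), 25, 1);  % flatten the 5x5 image into a 25x1 vector
        d  = D(k, :)';                    % one-hot target for the k-th class

        v1 = W1*x;
        y1 = Sigmoid(v1);                 % hidden layer (sigmoid)
        v  = W2*y1;
        y  = Softmax(v);                  % output layer (softmax)

        e     = d - y;                    % output error
        delta = e;                        % delta = error for cross entropy + softmax

        e1     = W2'*delta;               % back-propagate the error to the hidden layer
        delta1 = y1.*(1 - y1).*e1;

        W1 = W1 + alpha*delta1*x';        % SGD weight updates
        W2 = W2 + alpha*delta*y1';
    end
end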

The function Softmax, which the function MultiClass calls, is implemented in the Softmax.m file shown in the following listing.

This file implements the definition of the softmax function literally.

It is simple enough, so further explanation has been omitted.

function y = Softmax(x)
    ex = exp(x);
    y = ex / sum(ex);
end
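One practical caveat not raised in the book: exp(x) can overflow when the elements of x are large. Because the softmax output is unchanged when a constant is subtracted from every element of x, a common remedy is to subtract max(x) before exponentiating. A hedged variant (the name SoftmaxStable is my own, not part of the book's code) would be:

function y = SoftmaxStable(x)
    ex = exp(x - max(x));   % shifting by max(x) prevents overflow without changing y
    y  = ex / sum(ex);
end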

The following listing shows the TestMultiClass.m file, which tests the function MultiClass.

This program calls MultiClass and trains the neural network 10,000 times.

Once the training process has finished, the program feeds the training data back into the neural network and displays the output.

We can verify the training results by comparing this output with the correct output.

clear all
rng(3);

X = zeros(5, 5, 5);

X(:, :, 1) = [ 0 1 1 0 0;
               0 0 1 0 0;
               0 0 1 0 0;
               0 0 1 0 0;
               0 1 1 1 0
             ];

X(:, :, 2) = [ 1 1 1 1 0;
               0 0 0 0 1;
               0 1 1 1 0;
               1 0 0 0 0;
               1 1 1 1 1
             ];

X(:, :, 3) = [ 1 1 1 1 0;
               0 0 0 0 1;
               0 1 1 1 0;
               0 0 0 0 1;
               1 1 1 1 0
             ];

X(:, :, 4) = [ 0 0 0 1 0;
               0 0 1 1 0;
               0 1 0 1 0;
               1 1 1 1 1;
               0 0 0 1 0
             ];

X(:, :, 5) = [ 1 1 1 1 1;
               1 0 0 0 0;
               1 1 1 1 0;
               0 0 0 0 1;
               1 1 1 1 0
             ];

D = [ 1 0 0 0 0;
      0 1 0 0 0;
      0 0 1 0 0;
      0 0 0 1 0;
      0 0 0 0 1
    ];

W1 = 2*rand(50, 25) - 1;

W2 = 2*rand( 5, 50) - 1;

for epoch = 1:10000            % training

    [W1 W2] = MultiClass(W1, W2, X, D);

end

N = 5;                         % inference

for k = 1:N

    x  = reshape(X(:, :, k), 25, 1);

    v1 = W1*x;

    y1 = Sigmoid(v1);

    v  = W2*y1;

     y = Softmax(v)

end

The input data X in the code consists of five two-dimensional 5x5 matrices, in which each white pixel is encoded as 0 and each black pixel as 1.
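If a single predicted class label is preferred over the raw probability vector, the inference loop above can be extended with max. A small sketch (assuming W1, W2, X, and N from the script above) is:

for k = 1:N
    x = reshape(X(:, :, k), 25, 1);
    y = Softmax(W2*Sigmoid(W1*x));
    [~, label] = max(y);          % index of the largest softmax output = predicted class
    fprintf('Image %d is classified as class %d\n', k, label);
end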

This article is translated from "Matlab Deep Learning" by Phil Kim.
