
Neural Networks and Deep Learning Study Notes (I)

1. Why do w and x become vectors once we write $w \cdot x \equiv \sum_j w_j x_j$?

The first change is to write $\sum_j w_j x_j$ as a dot product, $w \cdot x \equiv \sum_j w_j x_j$, where w and x are vectors whose components are the weights and inputs, respectively.

The dot product of two vectors is a scalar: for two vectors of the same dimension, it is the sum of the products of corresponding components. The transformation above can be written as

$(w_1, w_2, w_3, \ldots, w_j) \cdot (x_1, x_2, x_3, \ldots, x_j) = w_1 x_1 + w_2 x_2 + w_3 x_3 + \cdots + w_j x_j = \sum_j w_j x_j$
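The identity above can be checked directly in a few lines of Python; the weight and input values below are arbitrary examples, not taken from the book.

```python
# Verify that the dot product equals the sum of component-wise products.
# w and x are arbitrary example vectors of the same dimension.
w = [0.5, -1.0, 2.0]
x = [1.0, 3.0, 0.5]

dot = sum(w_j * x_j for w_j, x_j in zip(w, x))
print(dot)  # 0.5*1.0 + (-1.0)*3.0 + 2.0*0.5 = -1.5
```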

2. Implementing NAND, AND, and OR gates with a perceptron.

we have a perceptron with two inputs, each with weight −2, and an overall bias of 3. Then we see that input 00 produces output 1, since (−2)∗0 + (−2)∗0 + 3 = 3 is positive. Here, I've introduced the ∗ symbol to make the multiplications explicit. Similar calculations show that the inputs 01 and 10 produce output 1. But the input 11 produces output 0, since (−2)∗1 + (−2)∗1 + 3 = −1 is negative. And so our perceptron implements a NAND gate!

The above is the NAND gate implemented in the e-book. By analogy, an AND gate can be built by setting the bias to −3 and the weights on x1 and x2 to 2. Input 00 gives 2×0 + 2×0 + (−3) = −3, negative. Input 01 gives 2×0 + 2×1 + (−3) = −1, negative. Input 11 gives 2×1 + 2×1 + (−3) = 1, positive. This implements an AND gate.

For an OR gate, set the bias to −1 and the weights to 2. Then input 00 gives a negative sum (output 0), input 01 gives a positive sum (output 1), and input 11 gives a positive sum (output 1).
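The three gates above can be sketched as a single perceptron function parameterized by weights and bias; the weight/bias values come from the notes above, while the function names are my own.

```python
def perceptron(weights, bias, inputs):
    """Output 1 when the weighted sum plus bias is positive, else 0."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total > 0 else 0

# Gate parameters from the notes: (weights, bias)
NAND = lambda x1, x2: perceptron([-2, -2], 3, [x1, x2])
AND  = lambda x1, x2: perceptron([2, 2], -3, [x1, x2])
OR   = lambda x1, x2: perceptron([2, 2], -1, [x1, x2])

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "NAND:", NAND(x1, x2),
              "AND:", AND(x1, x2), "OR:", OR(x1, x2))
```

Running the loop prints the full truth tables, matching the hand calculations above.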

3. What is exp?

exp denotes the exponential function with base $e$, i.e. $\exp(x) = e^x$ (not the natural logarithm, which is its inverse). Its defining property is that the derivative of $e^x$ is $e^x$ itself.
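The self-derivative property can be checked numerically with a central difference; the sample points below are arbitrary.

```python
import math

# Numerically confirm that d/dx e^x = e^x using a central difference
# with a small step h, at a few arbitrary sample points.
h = 1e-6
for x in (0.0, 1.0, 2.5):
    numeric = (math.exp(x + h) - math.exp(x - h)) / (2 * h)
    print(x, numeric, math.exp(x))  # the two values agree closely
```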

4. How is this partial-derivative formula obtained (unresolved)?

$\Delta \text{output} \approx \sum_j \frac{\partial\, \text{output}}{\partial w_j} \Delta w_j + \frac{\partial\, \text{output}}{\partial b} \Delta b$
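Although the note marks this as unresolved, the formula has the shape of a first-order Taylor (total-differential) approximation of a multivariable function; a sketch of that rule, assuming this is the book's underlying reasoning:

```latex
% Total differential: to first order, a small change in a smooth
% function f(x_1, \dots, x_n) is the sum of each variable's partial
% derivative times that variable's change.
\Delta f \approx \sum_{i} \frac{\partial f}{\partial x_i}\, \Delta x_i
% Taking f = \text{output}, viewed as a function of the weights
% w_1, \dots, w_j and the bias b, yields
\Delta \text{output} \approx
    \sum_j \frac{\partial\, \text{output}}{\partial w_j}\, \Delta w_j
    + \frac{\partial\, \text{output}}{\partial b}\, \Delta b
```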