Deep Learning from Scratch: Theory and Implementation with Python (深度學習入門:基於Python的理論與實現), high-quality Chinese PDF download with source code

Key features
1. A bestselling Japanese introduction to deep learning: the original edition reached a print run of 100,000 copies less than two years after release. It has long topped Amazon Japan's "Artificial Intelligence" category, with a large number of five-star reviews.
2. Uses Python 3 and avoids external libraries and tools as far as possible, building a deep learning model from scratch.
3. Clear example code, downloadable source, and a very simple runtime environment. Readers can run the programs as they read, making it easy to get started.
4. Plain language, intuitive illustrations, and concrete examples break the principles of deep learning down into simple, digestible explanations.
5. Introduces the complex error backpropagation method via computational graphs, which is highly intuitive.
6. Compared with the AI "bible" (the "flower book", Deep Learning by Goodfellow et al.), this book is better suited for beginners.

For technical readers outside the AI field, this book greatly lowers the barrier to entry for deep learning; for undergraduate and graduate students, it makes an excellent textbook; even readers who already build all kinds of deep learning models with frameworks in their daily work can gain new insights from it. (From the translator's preface)

Download (PDF + source code): https://pan.baidu.com/s/1ssU7QpC-n5zJ02enum5yVg

Table of Contents

Translator's Preface
Preface
Chapter 1: Introduction to Python
1.1 What is Python?
1.2 Installing Python
1.2.1 Python versions
1.2.2 External libraries used
1.2.3 The Anaconda distribution
1.3 The Python interpreter
1.3.1 Arithmetic operations
1.3.2 Data types
1.3.3 Variables
1.3.4 Lists
1.3.5 Dictionaries
1.3.6 Booleans
1.3.7 if statements
1.3.8 for statements
1.3.9 Functions
1.4 Python script files
1.4.1 Saving to a file
1.4.2 Classes
1.5 NumPy
1.5.1 Importing NumPy
1.5.2 Creating NumPy arrays
1.5.3 Arithmetic with NumPy
1.5.4 N-dimensional NumPy arrays
1.5.5 Broadcasting
1.5.6 Accessing elements
1.6 Matplotlib
1.6.1 Drawing simple graphs
1.6.2 Features of pyplot
1.6.3 Displaying images
1.7 Summary
Chapter 2: Perceptrons
2.1 What is a perceptron?
2.2 Simple logic circuits
2.2.1 The AND gate
2.2.2 NAND and OR gates
2.3 Implementing perceptrons
2.3.1 A simple implementation
2.3.2 Introducing weights and bias
2.3.3 Implementation with weights and bias
2.4 Limitations of perceptrons
2.4.1 The XOR gate
2.4.2 Linear and nonlinear
2.5 Multilayer perceptrons
2.5.1 Combining existing gates
2.5.2 Implementing an XOR gate
2.6 From NAND gates to computers
2.7 Summary
Chapter 3: Neural Networks
3.1 From perceptrons to neural networks
3.1.1 An example neural network
3.1.2 Reviewing perceptrons
3.1.3 Enter the activation function
3.2 Activation functions
3.2.1 The sigmoid function
3.2.2 Implementing the step function
3.2.3 Graphing the step function
3.2.4 Implementing the sigmoid function
3.2.5 Comparing the sigmoid and step functions
3.2.6 Nonlinear functions
3.2.7 The ReLU function
3.3 Operations on multidimensional arrays
3.3.1 Multidimensional arrays
3.3.2 Matrix multiplication
3.3.3 Dot products in neural networks
3.4 Implementing a 3-layer neural network
3.4.1 Notation
3.4.2 Implementing signal transmission between layers
3.4.3 Implementation summary
3.5 Designing the output layer
3.5.1 The identity and softmax functions
3.5.2 Caveats when implementing softmax
3.5.3 Properties of the softmax function
3.5.4 Number of output-layer neurons
3.6 Handwritten digit recognition
3.6.1 The MNIST dataset
3.6.2 Inference with a neural network
3.6.3 Batch processing
3.7 Summary
Chapter 4: Training Neural Networks
4.1 Learning from data
4.1.1 Data-driven approaches
4.1.2 Training data and test data
4.2 Loss functions
4.2.1 Mean squared error
4.2.2 Cross-entropy error
4.2.3 Mini-batch learning
4.2.4 Implementing mini-batch cross-entropy error
4.2.5 Why set up a loss function?
4.3 Numerical differentiation
4.3.1 Derivatives
4.3.2 Examples of numerical differentiation
4.3.3 Partial derivatives
4.4 Gradients
4.4.1 Gradient descent
4.4.2 Gradients of a neural network
4.5 Implementing the training algorithm
4.5.1 A 2-layer neural network class
4.5.2 Implementing mini-batches
4.5.3 Evaluating on test data
4.6 Summary
Chapter 5: Error Backpropagation
5.1 Computational graphs
5.1.1 Solving with computational graphs
5.1.2 Local computation
5.1.3 Why solve problems with computational graphs?
5.2 The chain rule
5.2.1 Backward passes in computational graphs
5.2.2 What is the chain rule?
5.2.3 The chain rule and computational graphs
5.3 Backpropagation
5.3.1 Backpropagation at addition nodes
5.3.2 Backpropagation at multiplication nodes
5.3.3 The apple example
5.4 Implementing simple layers
5.4.1 The multiplication layer
5.4.2 The addition layer
5.5 Implementing activation function layers
5.5.1 The ReLU layer
5.5.2 The Sigmoid layer
5.6 Implementing the Affine and Softmax layers
5.6.1 The Affine layer
5.6.2 The batch Affine layer
5.6.3 The Softmax-with-Loss layer
5.7 Implementing error backpropagation
5.7.1 The big picture of neural network training
5.7.2 Implementing a neural network with backpropagation
5.7.3 Gradient checking for backpropagation
5.7.4 Training with backpropagation
5.8 Summary
Chapter 6: Training Techniques
6.1 Updating parameters
6.1.1 The explorer's story
6.1.2 SGD
6.1.3 Drawbacks of SGD
6.1.4 Momentum
6.1.5 AdaGrad
6.1.6 Adam
6.1.7 Which update method to use?
6.1.8 Comparing update methods on the MNIST dataset
6.2 Initial weight values
6.2.1 Can the initial weights be set to 0?
6.2.2 Distribution of hidden-layer activations
6.2.3 Initial weight values for ReLU
6.2.4 Comparing weight initializations on the MNIST dataset
6.3 Batch Normalization
6.3.1 The Batch Normalization algorithm
6.3.2 Evaluating Batch Normalization
6.4 Regularization
6.4.1 Overfitting
6.4.2 Weight decay
6.4.3 Dropout
6.5 Validating hyperparameters
6.5.1 Validation data
6.5.2 Hyperparameter optimization
6.5.3 Implementing hyperparameter optimization
6.6 Summary
Chapter 7: Convolutional Neural Networks
7.1 Overall architecture
7.2 The convolution layer
7.2.1 Problems with fully connected layers
7.2.2 The convolution operation
7.2.3 Padding
7.2.4 Stride
7.2.5 Convolution on 3-dimensional data
7.2.6 Thinking in blocks
7.2.7 Batch processing
7.3 The pooling layer
7.4 Implementing the convolution and pooling layers
7.4.1 4-dimensional arrays
7.4.2 Expansion with im2col
7.4.3 Implementing the convolution layer
7.4.4 Implementing the pooling layer
7.5 Implementing a CNN
7.6 Visualizing CNNs
7.6.1 Visualizing the first layer's weights
7.6.2 Hierarchical information extraction
7.7 Representative CNNs
7.7.1 LeNet
7.7.2 AlexNet
7.8 Summary
Chapter 8: Deep Learning
8.1 Deepening networks
8.1.1 Toward deeper networks
8.1.2 Further improving recognition accuracy
8.1.3 Motivation for deeper layers
8.2 A brief history of deep learning
8.2.1 ImageNet
8.2.2 VGG
8.2.3 GoogLeNet
8.2.4 ResNet
8.3 Speeding up deep learning
8.3.1 Challenges to address
8.3.2 GPU-based acceleration
8.3.3 Distributed training
8.3.4 Reducing arithmetic precision
8.4 Applications of deep learning
8.4.1 Object detection
8.4.2 Image segmentation
8.4.3 Image caption generation
8.5 The future of deep learning
8.5.1 Image style transfer
8.5.2 Image generation
8.5.3 Autonomous driving
8.5.4 Deep Q-Networks (reinforcement learning)
8.6 Summary
Appendix A: The Computational Graph of the Softmax-with-Loss Layer
A.1 Forward propagation
A.2 Backward propagation
A.3 Summary
References