
Keras Learning (4): Convolutional Neural Networks (CNN)

This post shows how to build a CNN with Keras and use it to classify the MNIST handwritten digit dataset.

Example code:

import numpy as np
from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers import Dense, Activation, Convolution2D, MaxPooling2D, Flatten
from keras.optimizers import Adam

# Fix the random seed so repeated runs generate the same random numbers
np.random.seed(1337)

# Download the dataset
# X_train: (60000, 28, 28), y_train: (60000,); X_test: (10000, 28, 28), y_test: (10000,)
(X_train, y_train), (X_test, y_test) = mnist.load_data()

# Preprocess the data
'''
X_train.reshape(-1, 1, 28, 28) turns the 60000 28x28 images into a
(samples, channels, height, width) tensor: -1 lets NumPy infer the sample count,
1 is the number of channels (MNIST is grayscale), and 28x28 is height x width.
Dividing by 255 would scale the pixel values to [0, 1]; this script skips that step.
'''
X_train = X_train.reshape(-1, 1, 28, 28)   # -1: number of samples, 1: channel, 28x28: height x width
X_test = X_test.reshape(-1, 1, 28, 28)
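# NOTE: (samples, channels, height, width) is the channels_first (Theano-style) layout.
# With a TensorFlow backend, set "image_dim_ordering" (Keras 1) or "image_data_format"
# (Keras 2) in ~/.keras/keras.json to match, otherwise the convolution layers will
# interpret the dimensions incorrectly.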
# Convert the labels to one-hot vectors
y_train = np_utils.to_categorical(y_train, num_classes=10)
y_test = np_utils.to_categorical(y_test, num_classes=10)
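# e.g. the label 5 becomes the 10-element vector [0, 0, 0, 0, 0, 1, 0, 0, 0, 0]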

# Build the network
model = Sequential()
# conv1 layer
model.add(Convolution2D(
    nb_filter=32,  # number of filters
    nb_row=5,      # filter height (number of rows)
    nb_col=5,      # filter width (number of columns)
    border_mode='same',  # padding method
    input_shape=(1,      # number of channels
                 28, 28),  # height and width
))
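# output shape: (32, 28, 28) -- 'same' padding keeps the 28x28 spatial size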
model.add(Activation('relu'))
# Pooling layer 1
model.add(
    MaxPooling2D(
        pool_size=(2, 2),
        strides=(2, 2),
        border_mode='same',  # padding method
))
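# output shape: (32, 14, 14) -- 2x2 pooling with stride 2 halves height and width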

# Conv2 layer
model.add(Convolution2D(64, 5, 5, border_mode='same'))
model.add(Activation('relu'))
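# output shape: (64, 14, 14)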

# pooling2 layer
model.add(MaxPooling2D(pool_size=(2, 2), border_mode='same'))
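# output shape: (64, 7, 7)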

# Flatten the feature maps
model.add(Flatten())
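# output shape: (3136,) -- 64 * 7 * 7 values per image, fed into the fully connected layers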
# Fully connected layer 1
model.add(Dense(1024))
model.add(Activation('relu'))

# Fully connected layer 2 (output layer): one unit per digit class
model.add(Dense(10))
model.add(Activation('softmax'))  # softmax turns the 10 outputs into class probabilities
# Optimizer: Adam with a learning rate of 1e-4
adam = Adam(lr=1e-4)

# Compile the model
model.compile(optimizer=adam,
              loss='categorical_crossentropy',
              metrics=['accuracy'])
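# categorical_crossentropy matches the one-hot labels built with to_categorical above;
# 'accuracy' is tracked during training and returned by evaluate()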

# Train
print('Training...')
model.fit(X_train, y_train, nb_epoch=2, batch_size=32)  # 2 epochs, 32 images per gradient update

# Evaluate on the test set
print('\nTesting...\n')
loss, accuracy = model.evaluate(X_test, y_test)
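# evaluate() returns the loss followed by each metric passed to compile(), here accuracy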

print('\ntest loss', loss)
print('\ntest accuracy', accuracy)

Output:

60000/60000 [==============================] - 64s 1ms/step - loss: 0.0989 - acc: 0.9694

Testing...


10000/10000 [==============================] - 3s 269us/step

test loss 0.08740828046086244

test accuracy 0.9717
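
The code above uses the old Keras 1 API (Convolution2D with nb_filter/nb_row/nb_col, border_mode, nb_epoch). For reference, here is a minimal sketch of the same network written against the Keras 2 / tf.keras API, assuming a TensorFlow backend with its default channels_last layout; unlike the script above it also scales the pixel values to [0, 1]:

from tensorflow import keras
from tensorflow.keras import layers

# Load MNIST and reshape to channels_last: (samples, height, width, channels)
(X_train, y_train), (X_test, y_test) = keras.datasets.mnist.load_data()
X_train = X_train.reshape(-1, 28, 28, 1).astype('float32') / 255.0
X_test = X_test.reshape(-1, 28, 28, 1).astype('float32') / 255.0
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Same layer stack: conv -> pool -> conv -> pool -> flatten -> dense -> softmax
model = keras.Sequential([
    layers.Conv2D(32, (5, 5), padding='same', activation='relu',
                  input_shape=(28, 28, 1)),
    layers.MaxPooling2D(pool_size=(2, 2), strides=(2, 2), padding='same'),
    layers.Conv2D(64, (5, 5), padding='same', activation='relu'),
    layers.MaxPooling2D(pool_size=(2, 2), padding='same'),
    layers.Flatten(),
    layers.Dense(1024, activation='relu'),
    layers.Dense(10, activation='softmax'),
])

model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-4),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

model.fit(X_train, y_train, epochs=2, batch_size=32)
loss, accuracy = model.evaluate(X_test, y_test)
print('test loss:', loss)
print('test accuracy:', accuracy)

Only the argument names and the data layout change; the architecture and hyperparameters mirror the original listing.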