
Keras Multilayer Perceptron: Diagnosing Pima Indians Diabetes

This example uses the Pima Indians Diabetes dataset, which has eight input attributes plus a corresponding output:

(1) Number of times pregnant

(2) Plasma glucose concentration at 2 hours in an oral glucose tolerance test

(3) Diastolic blood pressure

(4) Triceps skinfold thickness

(5) 2-hour serum insulin

(6) Body mass index (BMI)

(7) Diabetes pedigree function

(8) Age

(9) Whether the patient has diabetes

The ninth item is the output: the label the network predicts.
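For readability, the columns can be given names. The names below are just my own shorthand for the slicing in the code that follows; they are not part of the CSV itself:

FEATURE_NAMES = [
    'pregnancies',        # column 0
    'plasma_glucose',     # column 1
    'diastolic_bp',       # column 2
    'triceps_skinfold',   # column 3
    'serum_insulin',      # column 4
    'bmi',                # column 5
    'diabetes_pedigree',  # column 6
    'age',                # column 7
]
LABEL_NAME = 'has_diabetes'  # column 8, the value the network predicts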

Dataset download: dataset download link
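If you would rather not download the file by hand, the same data can be read straight from a public mirror. A minimal sketch, assuming Jason Brownlee's "Datasets" GitHub repository is still reachable and that pandas is installed (that mirror ships the file without a header row):

import pandas as pd

# Assumed mirror of the Pima Indians Diabetes CSV (no header row)
URL = ('https://raw.githubusercontent.com/jbrownlee/'
       'Datasets/master/pima-indians-diabetes.csv')
dataset = pd.read_csv(URL, header=None).values
print(dataset.shape)  # expect (768, 9): 768 patients, 8 features + 1 label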

The full code:

from keras.models import Sequential
from keras.layers import Dense
import numpy as np
# Fix the random seed for reproducibility
np.random.seed(7)

# The local CSV has a header row, hence skiprows=1
dataset = np.loadtxt(r'F:\Python\pycharm\keras_deeplearning\datasets\PimaIndiansdiabetes.csv',
                     delimiter=',', skiprows=1)
# Split into input and output variables: columns 0-7 are the inputs x, column 8 is the output Y
x = dataset[:, 0:8]
Y = dataset[:, 8]

# Build the model. The first argument of Dense is the number of neurons;
# input_dim appears only on the first layer and gives the number of inputs.
# activation selects the activation function: ReLU for the two hidden layers,
# and sigmoid on the output layer since this is binary classification.
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# Compile the model. For binary classification the loss is binary cross-entropy;
# the Adam optimizer is an efficient gradient-descent algorithm.
model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])

# Train the model. epochs is the number of full passes over the dataset;
# batch_size is the number of samples used for each weight update.
model.fit(x=x, y=Y, epochs=150, batch_size=10)

# Evaluate the model. For simplicity this evaluates on the training set;
# see the held-out-split sketch right after this listing.
scores = model.evaluate(x=x, y=Y)
print('\n%s: %.2f%%' %(model.metrics_names[1], scores[1]*100))
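Note that evaluating on the data the model was trained on overstates how well it generalizes. A minimal sketch of a held-out evaluation, assuming scikit-learn is available for train_test_split:

from keras.models import Sequential
from keras.layers import Dense
from sklearn.model_selection import train_test_split

# Hold out 20% of the rows; stratify so both splits keep the
# original diabetic/non-diabetic ratio.
x_train, x_test, y_train, y_test = train_test_split(
    x, Y, test_size=0.2, stratify=Y, random_state=7)

# Same architecture as above, but trained only on the training split.
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=150, batch_size=10, verbose=0)

loss, acc = model.evaluate(x_test, y_test, verbose=0)
print('held-out accuracy: %.2f%%' % (acc * 100))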

Running the original script produces:

Using TensorFlow backend.
Epoch 1/150
2018-10-29 16:26:09.934286: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX AVX2
2018-10-29 16:26:09.935682: I tensorflow/core/common_runtime/process_util.cc:69] Creating new thread pool with default inter op setting: 8. Tune using inter_op_parallelism_threads for best performance.

 10/768 [..............................] - ETA: 31s - loss: 1.6118 - acc: 0.9000
500/768 [==================>...........] - ETA: 0s - loss: 4.6762 - acc: 0.6260 
768/768 [==============================] - 0s 644us/step - loss: 3.6799 - acc: 0.5964
Epoch 2/150

 10/768 [..............................] - ETA: 0s - loss: 0.4365 - acc: 0.8000
520/768 [===================>..........] - ETA: 0s - loss: 0.9445 - acc: 0.6038
768/768 [==============================] - 0s 100us/step - loss: 0.9296 - acc: 0.6016
Epoch 3/150

 10/768 [..............................] - ETA: 0s - loss: 0.7402 - acc: 0.9000
520/768 [===================>..........] - ETA: 0s - loss: 0.7585 - acc: 0.6481
768/768 [==============================] - 0s 100us/step - loss: 0.7461 - acc: 0.6380
Epoch 4/150

 10/768 [..............................] - ETA: 0s - loss: 0.7285 - acc: 0.8000
490/768 [==================>...........] - ETA: 0s - loss: 0.6964 - acc: 0.6449
768/768 [==============================] - 0s 103us/step - loss: 0.7103 - acc: 0.6549
Epoch 5/150

...

Epoch 141/150

 10/768 [..............................] - ETA: 0s - loss: 0.3279 - acc: 0.9000
520/768 [===================>..........] - ETA: 0s - loss: 0.4545 - acc: 0.7981
768/768 [==============================] - 0s 100us/step - loss: 0.4701 - acc: 0.7826
Epoch 142/150

 10/768 [..............................] - ETA: 0s - loss: 0.5173 - acc: 0.7000
510/768 [==================>...........] - ETA: 0s - loss: 0.4799 - acc: 0.7745
768/768 [==============================] - 0s 102us/step - loss: 0.4800 - acc: 0.7734
Epoch 143/150

 10/768 [..............................] - ETA: 0s - loss: 0.4776 - acc: 0.6000
510/768 [==================>...........] - ETA: 0s - loss: 0.4408 - acc: 0.8020
768/768 [==============================] - 0s 104us/step - loss: 0.4720 - acc: 0.7760
Epoch 144/150

 10/768 [..............................] - ETA: 0s - loss: 0.6618 - acc: 0.6000
510/768 [==================>...........] - ETA: 0s - loss: 0.4592 - acc: 0.8039
768/768 [==============================] - 0s 100us/step - loss: 0.4738 - acc: 0.7786
Epoch 145/150

 10/768 [..............................] - ETA: 0s - loss: 0.6643 - acc: 0.6000
510/768 [==================>...........] - ETA: 0s - loss: 0.4932 - acc: 0.7569
768/768 [==============================] - 0s 102us/step - loss: 0.4862 - acc: 0.7630
Epoch 146/150

 10/768 [..............................] - ETA: 0s - loss: 0.5036 - acc: 0.7000
500/768 [==================>...........] - ETA: 0s - loss: 0.4992 - acc: 0.7580
768/768 [==============================] - 0s 103us/step - loss: 0.4918 - acc: 0.7708
Epoch 147/150

 10/768 [..............................] - ETA: 0s - loss: 0.7728 - acc: 0.6000
510/768 [==================>...........] - ETA: 0s - loss: 0.4734 - acc: 0.7824
768/768 [==============================] - 0s 100us/step - loss: 0.4818 - acc: 0.7799
Epoch 148/150

 10/768 [..............................] - ETA: 0s - loss: 0.3585 - acc: 0.8000
510/768 [==================>...........] - ETA: 0s - loss: 0.4805 - acc: 0.7745
768/768 [==============================] - 0s 102us/step - loss: 0.4686 - acc: 0.7812
Epoch 149/150

 10/768 [..............................] - ETA: 0s - loss: 0.5303 - acc: 0.7000
500/768 [==================>...........] - ETA: 0s - loss: 0.4795 - acc: 0.7660
768/768 [==============================] - 0s 101us/step - loss: 0.4722 - acc: 0.7643
Epoch 150/150

 10/768 [..............................] - ETA: 0s - loss: 0.3492 - acc: 0.9000
490/768 [==================>...........] - ETA: 0s - loss: 0.4477 - acc: 0.7980
768/768 [==============================] - 0s 103us/step - loss: 0.4769 - acc: 0.7799

 32/768 [>.............................] - ETA: 1s
768/768 [==============================] - 0s 81us/step

acc: 78.12%

The last line reports the final accuracy on the training set: 78.12%.
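Once trained, the model can also score individual patients with model.predict. A minimal sketch; the input row here is just the first record of the dataset, reused for illustration:

import numpy as np

# One patient, columns in the same order as the dataset (features 0-7)
new_patient = np.array([[6, 148, 72, 35, 0, 33.6, 0.627, 50]])
prob = model.predict(new_patient)[0][0]  # sigmoid output in [0, 1]
print('predicted diabetes probability: %.3f' % prob)
print('diagnosis:', 'diabetic' if prob > 0.5 else 'not diabetic')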