
deeplearning.ai Course 4, Week 2: a Keras guide

1. Importing the libraries (the example is the Happy House case study):

import numpy as np
from keras import layers
from keras.layers import Input, Dense, Activation, ZeroPadding2D, BatchNormalization, Flatten, Conv2D
from keras.layers import AveragePooling2D, MaxPooling2D, Dropout, GlobalMaxPooling2D, GlobalAveragePooling2D
from keras.models import Model
from keras.preprocessing import image
from keras.utils import layer_utils
from keras.utils.data_utils import get_file
from keras.applications.imagenet_utils import preprocess_input
import pydot
from IPython.display import SVG
from keras.utils.vis_utils import model_to_dot
from keras.utils import plot_model
from keras import losses
from kt_utils import *
import keras.backend as K
K.set_image_data_format('channels_last')
import matplotlib.pyplot as plt
from matplotlib.pyplot import imshow
%matplotlib inline
#help(Model.compile)
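Note that K.set_image_data_format('channels_last') fixes the tensor layout to (batch, height, width, channels), which is what the (64, 64, 3) input shape used below assumes. A one-line sanity check:

print(K.image_data_format())   # should print 'channels_last'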

2. Loading the training data:

X_train_orig, Y_train_orig, X_test_orig, Y_test_orig, classes = load_dataset()

# Normalize image vectors
X_train = X_train_orig/255.
X_test = X_test_orig/255.

# Reshape
Y_train = Y_train_orig.T
Y_test = Y_test_orig.T

print ("number of training examples = " + str(X_train.shape[0]))
print ("number of test examples = " + str(X_test.shape[0]))
print ("X_train shape: " + str(X_train.shape))
print ("Y_train shape: " + str(Y_train.shape))
print ("X_test shape: " + str(X_test.shape))
print ("Y_test shape: " + str(Y_test.shape))
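load_dataset() comes from kt_utils. The labels are returned as row vectors of shape (1, m), which is why they are transposed above into (m, 1). As a rough sketch of what it does, assuming the assignment's standard HDF5 files (the file and key names here are taken from the usual Happy House setup and may differ in your copy):

import h5py
import numpy as np

def load_dataset():
    train = h5py.File('datasets/train_happy.h5', 'r')
    test = h5py.File('datasets/test_happy.h5', 'r')
    X_train_orig = np.array(train['train_set_x'][:])                  # (m, 64, 64, 3) uint8 images
    Y_train_orig = np.array(train['train_set_y'][:]).reshape(1, -1)   # (1, m) labels
    X_test_orig = np.array(test['test_set_x'][:])
    Y_test_orig = np.array(test['test_set_y'][:]).reshape(1, -1)
    classes = np.array(test['list_classes'][:])
    return X_train_orig, Y_train_orig, X_test_orig, Y_test_orig, classes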

3. Defining the HappyModel (the model follows a LeNet-5-style setup: the convolution and pooling layers match LeNet-5, the fully connected layers that follow use 100 and 20 neurons, and the final layer is a single logistic-regression-style sigmoid unit for binary classification)

# GRADED FUNCTION: HappyModel

def HappyModel(input_shape):
    """
    Implementation of the HappyModel.

    Arguments:
    input_shape -- shape of the images of the dataset

    Returns:
    model -- a Model() instance in Keras
    """

    ### START CODE HERE ###
    # Feel free to use the suggested outline in the text above to get started, and run through the whole
    # exercise (including the later portions of this notebook) once. Then come back and try out other
    # network architectures as well.
    X_input = Input(input_shape)

    # shape = (None,68,68,3): the 64x64 input padded by 2 on each side
    X = ZeroPadding2D((2,2))(X_input)

    # first layer of the model
    # shape (None,64,64,6)
    X = Conv2D(6,(5,5),strides=(1,1),name='conv1')(X)
    X = BatchNormalization(axis=3,name='bn1')(X)
    X = Activation('relu')(X)
    # first layer pooling
    # shape = (None,32,32,6)
    X = MaxPooling2D((2,2),strides=(2,2),name='max_pool_1')(X)

    # second layer of the model
    # shape = (None,28,28,16)
    X = Conv2D(16,(5,5),strides=(1,1),name='conv2')(X)
    X = BatchNormalization(axis=3,name='bn2')(X)
    X = Activation('relu')(X)
    # second pooling
    # shape = (None,14,14,16)
    X = MaxPooling2D((2,2),strides=(2,2),name='max_pool_2')(X)
    # flatten the X values
    # shape = (None, 14*14*16)
    X = Flatten()(X)


    # third layer of the model: fully connected
    # shape =  (None,100)
    X = Dense(100,activation='relu',name='fc1')(X)

    # fourth layer of the model: fully connected
    # shape = (None,20)
    X = Dense(20,activation='relu',name='fc2')(X)

    # fifth layer of the model: sigmoid output for binary classification
    # shape = (None,1)
    X = Dense(1,activation='sigmoid',name='fc3')(X)

    # create the Keras Model instance, wiring inputs to outputs
    model = Model(inputs = X_input, outputs = X, name='HappyModel')

    ### END CODE HERE ###

    return model

4. Running the model
4.1 Instantiating the model:

### START CODE HERE ### (1 line)
happyModel = HappyModel((64,64,3))
### END CODE HERE ###
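With the model instantiated, it is worth inspecting the architecture before training: summary() prints every layer with its output shape and parameter count, and the visualization utilities imported in section 1 can render the graph (plot_model needs pydot and graphviz to be installed):

happyModel.summary()

plot_model(happyModel, to_file='HappyModel.png')
SVG(model_to_dot(happyModel).create(prog='dot', format='svg'))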

4.2 Compiling the model to define its learning process and the associated hyperparameters:

### START CODE HERE ### (1 line)
happyModel.compile(optimizer='Adam',loss='binary_crossentropy',metrics=["accuracy"])
### END CODE HERE ###
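Passing the string 'Adam' uses the optimizer's default settings. If you want to tune the learning rate, an equivalent compile with an explicit optimizer object looks like this (lr=0.001 is just the default, shown as an example; newer Keras versions spell the argument learning_rate):

from keras.optimizers import Adam
happyModel.compile(optimizer=Adam(lr=0.001), loss='binary_crossentropy', metrics=['accuracy'])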

4.3 Training the model:

### START CODE HERE ### (1 line)
happyModel.fit(x = X_train,y=Y_train,epochs = 10,batch_size=32)
### END CODE HERE ###
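fit() returns a History object whose .history dict holds the per-epoch loss and metrics, so capturing the return value lets you plot the training curve with the matplotlib imports from section 1 (depending on the Keras version, the accuracy key is 'acc' or 'accuracy'):

history = happyModel.fit(x=X_train, y=Y_train, epochs=10, batch_size=32)

plt.plot(history.history['loss'])
plt.title('training loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.show()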

4.4 Evaluating the model:

### START CODE HERE ### (1 line)
preds = happyModel.evaluate(x=X_test,y=Y_test)
### END CODE HERE ###
print()
print ("Loss = " + str(preds[0]))
print ("Test Accuracy = " + str(preds[1]))

Test results: