
Saving and Loading Models with sklearn, TensorFlow, and Keras

I. Saving and loading sklearn models
1. Saving

import joblib  # in older scikit-learn versions: from sklearn.externals import joblib
from sklearn import svm

X = [[0, 0], [1, 1]]
y = [0, 1]
clf = svm.SVC()
clf.fit(X, y)
joblib.dump(clf, "train_model.m")  # persist the fitted model to disk

2. Loading

clf = joblib.load("train_model.m")
clf.predict([[0, 0]])  # the argument is the feature set (test_X) to predict on
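The standard-library pickle module works just as well for small models; joblib is generally preferred for estimators containing large NumPy arrays. A minimal sketch, assuming the same clf as above (train_model.pkl and clf2 are illustrative names):

import pickle

# Save the fitted estimator with pickle instead of joblib
with open("train_model.pkl", "wb") as f:
    pickle.dump(clf, f)

# Load it back and predict
with open("train_model.pkl", "rb") as f:
    clf2 = pickle.load(f)
print(clf2.predict([[0, 0]]))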

II. Saving and loading TensorFlow models (with this approach, TensorFlow saves only the variables rather than the whole network, so when restoring the model we still have to redefine the network structure first.)
1. Saving


import tensorflow as tf
import numpy as np

W = tf.Variable([[1, 1, 1], [2, 2, 2]], dtype=tf.float32, name='w')
b = tf.Variable([[0, 1, 2]], dtype=tf.float32, name='b')

init = tf.global_variables_initializer()  # initialize_all_variables() is deprecated
saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(init)
    save_path = saver.save(sess, "save/model.ckpt")
    print("Model saved in:", save_path)
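As a side note (not part of the original snippet), Saver.save also accepts a global_step argument; this is what produces numbered checkpoint files such as the model.ckpt-1000 used later in this post:

# Inside the same session as above: appending the step number to the prefix
# yields files like save/model.ckpt-1000.meta, save/model.ckpt-1000.index, ...
saver.save(sess, "save/model.ckpt", global_step=1000)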

2. Loading

import tensorflow as tf
import numpy as np

# Redefine variables with the same names and shapes as the saved ones;
# their initial values do not matter because restore() overwrites them
W = tf.Variable(tf.truncated_normal(shape=(2, 3)), dtype=tf.float32, name='w')
b = tf.Variable(tf.truncated_normal(shape=(1, 3)), dtype=tf.float32, name='b')

saver = tf.train.Saver()
with tf.Session() as sess:
    saver.restore(sess, "save/model.ckpt")
    print(sess.run(W))  # the restored value of w
III. Saving and loading TensorFlow models (saving the graph together with the variables, so the network does not have to be redefined when loading)
1. Saving

import tensorflow as tf

# First, you design your mathematical operations
# We are in the default graph scope

# Let's design a variable
v1 = tf.Variable(1.0, name="v1")
v2 = tf.Variable(2.0, name="v2")
# Let's design an operation
a = tf.add(v1, v2)

# Let's create a Saver object
# By default, the Saver handles all Variables related to the default graph
all_saver = tf.train.Saver() 
# But you can specify which vars you want to save under which name
v2_saver = tf.train.Saver({"v2": v2}) 

# By default the Session handles the default graph and all its included variables
with tf.Session() as sess:
  # Init v1 and v2
  sess.run(tf.global_variables_initializer())
  # Now v1 holds the value 1.0 and v2 holds the value 2.0
  # We can now save all those values
  all_saver.save(sess, 'data.chkp')
  # or saves only v2
  v2_saver.save(sess, 'data-v2.chkp')
The model's weights are saved in the .chkp file, and the model's graph is saved in the .chkp.meta file.
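To check which variables actually ended up in a checkpoint, their names, shapes, and values can be listed with tf.train.NewCheckpointReader. A minimal sketch, assuming the checkpoint prefix 'data.chkp' written by all_saver.save() above:

import tensorflow as tf

# Open the checkpoint written by all_saver.save(sess, 'data.chkp')
reader = tf.train.NewCheckpointReader('data.chkp')

# get_variable_to_shape_map() returns {variable name: shape} for everything stored
for name, shape in reader.get_variable_to_shape_map().items():
    print(name, shape, reader.get_tensor(name))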

2. Loading

import tensorflow as tf

# Let's load a previous meta graph into the current graph in use: usually the default graph
# This action returns a Saver
saver = tf.train.import_meta_graph('results/model.ckpt-1000.meta')

# We can now access the default graph where all our metadata has been loaded
graph = tf.get_default_graph()

# Finally we can retrieve tensors, operations, etc.
global_step_tensor = graph.get_tensor_by_name('loss/global_step:0')
train_op = graph.get_operation_by_name('loss/train_op')
hyperparameters = tf.get_collection('hyperparameters')
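Note that tf.get_collection('hyperparameters') only returns something if the corresponding values were added to that collection before the graph was saved; 'hyperparameters' and learning_rate below are just illustrative names. A minimal sketch of the saving side:

import tensorflow as tf

learning_rate = tf.constant(0.001, name='learning_rate')

# Collections are serialized into the .meta file along with the graph,
# so anything added here can be retrieved with tf.get_collection()
# after import_meta_graph() restores the graph
tf.add_to_collection('hyperparameters', learning_rate)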

Restoring the weights

Remember that in a real setting, the actual weight values only exist inside a session. In other words, the restore operation must be run within a session, and it is what loads the saved weights into the graph. The easiest way to understand the restore operation is to view it simply as a form of data initialization.
with tf.Session() as sess:
    # To initialize values with saved data; restore() expects the checkpoint
    # prefix, not the name of a .data-00000-of-00001 shard file
    saver.restore(sess, 'results/model.ckpt-1000')
    print(sess.run(global_step_tensor))  # returns 1000

IV. Saving and loading Keras models

from keras.models import load_model

# `model` is assumed to be an already built (and trained) Keras model
model.save('my_model.h5')           # saves architecture, weights, and optimizer state in one HDF5 file
model = load_model('my_model.h5')   # recreates the identical model, compilation included
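Keras can also persist the architecture and the weights separately instead of the whole model. A minimal sketch, where my_model.json and my_model_weights.h5 are illustrative file names:

from keras.models import model_from_json

# Architecture only, as a JSON string
with open('my_model.json', 'w') as f:
    f.write(model.to_json())

# Weights only, in HDF5 format
model.save_weights('my_model_weights.h5')

# Rebuild the architecture from JSON, then load the weights back into it
with open('my_model.json') as f:
    model = model_from_json(f.read())
model.load_weights('my_model_weights.h5')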