[Study Notes] Hands-on ML with sklearn & tensorflow [TF] [2]: Implementing mini-batch with placeholder nodes

To implement mini-batch training, we need a kind of node that can be given a fresh batch of data at every iteration. A placeholder node provides exactly this: it holds no value of its own and must be fed data at execution time.

>>> A = tf.placeholder(tf.float32, shape=(None, 3))
>>> B = A + 5
# shape=(None, 3) means A must be fed a 2-D array with exactly 3 columns and any number of rows

Use the feed_dict argument of eval() to pass in a value for A and compute the corresponding value of B:

>>> with tf.Session() as sess:
...    B_val_1 = B.eval(feed_dict={A: [[1, 2, 3]]})
...    B_val_2 = B.eval(feed_dict={A: [[4, 5, 6], [7, 8, 9]]})

>>> print(B_val_1)
[[6. 7. 8.]]
>>> print(B_val_2)
[[ 9. 10. 11.]
 [12. 13. 14.]]
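
Because the shape was declared as (None, 3), only the number of columns is fixed; the first dimension (the batch size) can vary, which is what allows batches of different sizes to be fed above. Feeding an array with a different number of columns fails. A minimal sketch (the fed value and the exact wording of the error are illustrative of TF 1.x behaviour, not taken from the original notes):

>>> with tf.Session() as sess:
...    B.eval(feed_dict={A: [[1, 2]]})   # only 2 columns instead of 3
...
ValueError: Cannot feed value of shape (1, 2) for Tensor 'Placeholder:0', which has shape '(?, 3)'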

Implementing mini-batch training:

import numpy as np
import tensorflow as tf

# m = number of training instances, n = number of features
X = tf.placeholder(tf.float32, shape=(None, n + 1), name='X')
y = tf.placeholder(tf.float32, shape=(None, 1), name='y')

batch_size = 100                           # size of each mini-batch
n_batches = int(np.ceil(m / batch_size))   # number of batches per epoch

def fetch_batch(epoch, batch_index, batch_size):
    [...]  # load the data for this batch
    return X_batch, y_batch
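
One possible way to fill in fetch_batch (purely illustrative: the array names data_X and data_y, and the random-sampling strategy, are assumptions rather than part of the original notes) is to draw a random batch from arrays that are already loaded in memory:

def fetch_batch(epoch, batch_index, batch_size):
    # data_X: (m, n + 1) array including the bias column; data_y: (m, 1) array of targets
    rnd = np.random.RandomState(epoch * n_batches + batch_index)  # distinct, reproducible seed per batch
    indices = rnd.randint(m, size=batch_size)                     # sample batch_size random row indices
    return data_X[indices], data_y[indices]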

with tf.Session() as sess:
    sess.run(init)  # initialize all variables

    for epoch in range(n_epochs):
        for batch_index in range(n_batches):
            X_batch, y_batch = fetch_batch(epoch, batch_index, batch_size)
            sess.run(train_op, feed_dict={X: X_batch, y: y_batch})  # one training step on this batch

    best_theta = theta.eval()  # read out the trained parameters
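
The loop above also refers to names defined elsewhere in the notebook (n_epochs, theta, train_op, init). As a rough sketch of what those definitions might look like for linear regression trained by gradient descent (the learning rate, number of epochs, and variable names here are assumptions, not the original notebook's values):

n_epochs = 10
learning_rate = 0.01

theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0), name='theta')  # model parameters
y_pred = tf.matmul(X, theta, name='predictions')                             # linear model X @ theta
mse = tf.reduce_mean(tf.square(y_pred - y), name='mse')                      # mean squared error
optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)
train_op = optimizer.minimize(mse)                                           # one gradient-descent step
init = tf.global_variables_initializer()

These definitions would go after the X and y placeholders and before the session is opened.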