
Multi-layer LSTM pitfall: the MultiRNNCell error and how to fix it

Error: ValueError: Attempt to reuse RNNCell <tensorflow.contrib.rnn.python.ops.core_rnn_cell_impl.GRUCell object at 0x11d32cbd0> with a different variable scope than its first use. First use of cell was with scope 'rnn/multi_rnn_cell/cell_0/gru_cell', this attempt is with scope 'rnn/multi_rnn_cell/cell_1/gru_cell'. Please create a new instance of the cell if you would like it to use a different set of weights. If before you were using: MultiRNNCell([GRUCell(...)] * num_layers), change to: MultiRNNCell([GRUCell(...) for _ in range(num_layers)]). If before you were using the same cell instance as both the forward and reverse cell of a bidirectional RNN, simply create two instances (one for forward, one for reverse). In May 2017, we will start transitioning this cell's behavior to use existing stored weights, if any, when it is called with scope=None (which can lead to silent model degradation, so this error will remain until then.)

Original code:
import tensorflow as tf
from tensorflow.contrib import rnn

inputs = tf.placeholder(dtype=tf.int32, shape=[None, None], name="inputs")
keep_prob = tf.placeholder(dtype=tf.float32, name="keep_prob")

# look_up is assumed to be an embedding lookup of `inputs` (its definition is not shown in the post)
cell = rnn.GRUCell(10)
cell = rnn.DropoutWrapper(cell=cell, input_keep_prob=keep_prob)
# Bug: the same wrapped cell instance is reused for all 5 layers, so TensorFlow
# tries to reuse its variables under a different scope for each layer
cell = rnn.MultiRNNCell([cell for _ in range(5)], state_is_tuple=True)

outs, states = tf.nn.dynamic_rnn(cell=cell, inputs=look_up, dtype=tf.float32)


Solution: the error occurs because the same DropoutWrapper-wrapped GRUCell object is passed for every layer, and TensorFlow refuses to reuse a cell's variables under a different scope (cell_0 vs. cell_1). Build a fresh cell for each layer inside the list comprehension instead:
inputs = tf.placeholder(dtype=tf.int32, shape=[None, None], name="inputs")
keep_prob = tf.placeholder(dtype=tf.float32, name="keep_prob")

# Create a separate GRUCell (with its own DropoutWrapper) for each of the 5 layers
cell = rnn.MultiRNNCell(
    [rnn.DropoutWrapper(rnn.GRUCell(10), input_keep_prob=keep_prob) for _ in range(5)],
    state_is_tuple=True)

outs, states = tf.nn.dynamic_rnn(cell=cell, inputs=look_up, dtype=tf.float32)
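
For reference, below is a minimal end-to-end sketch that builds the corrected stack and runs it on dummy data to confirm the error no longer appears. The vocabulary size, embedding dimension, and the dummy batch are assumptions made for illustration (the original post does not show how look_up is produced); it targets the same TensorFlow 1.x contrib API used above:

import numpy as np
import tensorflow as tf
from tensorflow.contrib import rnn

# Hypothetical embedding setup (vocab size 1000, embedding dim 10) standing in for look_up
inputs = tf.placeholder(dtype=tf.int32, shape=[None, None], name="inputs")
keep_prob = tf.placeholder(dtype=tf.float32, name="keep_prob")
embedding = tf.get_variable("embedding", shape=[1000, 10], dtype=tf.float32)
look_up = tf.nn.embedding_lookup(embedding, inputs)

# One fresh DropoutWrapper(GRUCell) per layer, as in the fix above
cell = rnn.MultiRNNCell(
    [rnn.DropoutWrapper(rnn.GRUCell(10), input_keep_prob=keep_prob) for _ in range(5)],
    state_is_tuple=True)
outs, states = tf.nn.dynamic_rnn(cell=cell, inputs=look_up, dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Dummy batch: 2 sequences of length 7 with random token ids
    batch = np.random.randint(0, 1000, size=(2, 7))
    out_val = sess.run(outs, feed_dict={inputs: batch, keep_prob: 1.0})
    print(out_val.shape)  # (2, 7, 10): batch, time steps, units of the top GRU layer

The key point is simply that each element of the list passed to MultiRNNCell must be a distinct cell object, so every layer gets its own set of weights.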