There are several classes in tf.nn that relate to RNNs. In the examples I find on the web, tf.nn.dynamic_rnn and tf.nn.rnn seem to be used interchangeably, or at least I cannot figure out why one is used in place of the other. What is the difference?
From the documentation for tf.nn.dynamic_rnn (defined in tensorflow/python/ops/rnn.py; see the guide: Neural Network > Recurrent Neural Networks): it creates a recurrent neural network specified by RNNCell cell and performs fully dynamic unrolling of inputs.
From RNNs in Tensorflow, a Practical Guide and Undocumented Features by Denny Britz, published on August 21, 2016:
tf.nn.rnn creates an unrolled graph for a fixed RNN length. That means, if you call tf.nn.rnn with inputs having 200 time steps you are creating a static graph with 200 RNN steps. First, graph creation is slow. Second, you’re unable to pass in longer sequences (> 200) than you’ve originally specified.
tf.nn.dynamic_rnn solves this. It uses a tf.While loop to dynamically construct the graph when it is executed. That means graph creation is faster and you can feed batches of variable size.
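As a quick illustration, here is a minimal sketch against the TF 1.x API (the placeholder shape and cell size are arbitrary choices for the example): a graph built with tf.nn.dynamic_rnn can accept batches whose sequence length is only known at run time.

```python
import tensorflow as tf  # TF 1.x API

# Batch of sequences: [batch_size, max_time, input_depth].
# Both batch_size and max_time are left as None, so they may
# vary from one session.run call to the next.
inputs = tf.placeholder(tf.float32, shape=[None, None, 128])

cell = tf.nn.rnn_cell.BasicRNNCell(num_units=256)

# dynamic_rnn unrolls the cell with a while-loop at execution time,
# so the graph size does not depend on the number of time steps.
outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
```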
They are nearly the same, but there is a small difference in the structure of their inputs and outputs. From the documentation:
tf.nn.dynamic_rnn: This function is functionally identical to the function rnn above, but performs fully dynamic unrolling of inputs.
Unlike rnn, the input inputs is not a Python list of Tensors, one for each frame. Instead, inputs may be a single Tensor where the maximum time is either the first or second dimension (see the parameter time_major). Alternatively, it may be a (possibly nested) tuple of Tensors, each of them having matching batch and time dimensions. The corresponding output is either a single Tensor having the same number of time steps and batch size, or a (possibly nested) tuple of such tensors, matching the nested structure of cell.output_size.
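To make the input/output difference concrete, here is a rough side-by-side sketch against the TF 1.x API (in later 1.x releases tf.nn.rnn was renamed tf.nn.static_rnn; the batch, time, depth, and unit sizes below are made up for illustration):

```python
import tensorflow as tf  # TF 1.x API

batch_size, max_time, depth, units = 32, 200, 128, 256

# Static unrolling: inputs are a Python list with one
# [batch_size, depth] tensor per time step.
with tf.variable_scope("static"):
    static_cell = tf.nn.rnn_cell.BasicRNNCell(units)
    static_inputs = [tf.placeholder(tf.float32, [batch_size, depth])
                     for _ in range(max_time)]
    # Outputs are also a list of max_time tensors.
    static_outputs, static_state = tf.nn.static_rnn(
        static_cell, static_inputs, dtype=tf.float32)

# Dynamic unrolling: inputs are a single [batch_size, max_time, depth]
# tensor (with time_major=False, the default).
with tf.variable_scope("dynamic"):
    dynamic_cell = tf.nn.rnn_cell.BasicRNNCell(units)
    dynamic_inputs = tf.placeholder(tf.float32, [batch_size, None, depth])
    # Output is a single [batch_size, max_time, units] tensor.
    dynamic_outputs, dynamic_state = tf.nn.dynamic_rnn(
        dynamic_cell, dynamic_inputs, dtype=tf.float32)
```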
For more details, explore the source.