Why do we need to worry about the batch dimension when specifying a model in Tensorflow?

It seems a bit cumbersome to take into account the batch dimension for every layer in a neural network. Why don't we have some functionality in Tensorflow that can just set the batch size for an entire model?

Hanhan Li asked Nov 02 '25

1 Answer

In TensorFlow you do not have to hard-code the batch size; the batch dimension can be left unspecified.

The MNIST tutorial explains how TensorFlow handles batches of any size.

Quoting the tutorial:

x = tf.placeholder(tf.float32, shape=[None, 784])
y_ = tf.placeholder(tf.float32, shape=[None, 10])

The input images x will consist of a 2d tensor of floating point numbers. Here we assign it a shape of [None, 784], where 784 is the dimensionality of a single flattened MNIST image, and None indicates that the first dimension, corresponding to the batch size, can be of any size.
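As a minimal sketch of what that None dimension buys you (assuming TensorFlow 1.x, which provides the tf.placeholder API quoted above), the same graph can be fed batches of different sizes without any change:

import numpy as np
import tensorflow as tf  # assumes TensorFlow 1.x (tf.placeholder API)

# Batch dimension left as None, exactly as in the tutorial quote above.
x = tf.placeholder(tf.float32, shape=[None, 784])
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.matmul(x, W) + b  # logits; shape is [None, 10]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Feed a batch of 1 and a batch of 32 through the same graph.
    for batch_size in (1, 32):
        batch = np.random.rand(batch_size, 784).astype(np.float32)
        logits = sess.run(y, feed_dict={x: batch})
        print(logits.shape)  # (1, 10), then (32, 10)

Every layer built on top of x inherits the None batch dimension, so nothing downstream needs to know the batch size until you feed data in.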

nessuno answered Nov 04 '25