What is the difference between tf.keras.layers versus tf.layers?
E.g. both of them provide a Conv2D layer; do they produce different outputs?
Is there any benefit to mixing them (something like tf.keras.layers.Conv2D in one hidden layer and tf.layers.max_pooling2d in the next)?
TensorFlow is an open-source end-to-end platform, a library for a wide range of machine learning tasks, while Keras is a high-level neural network library that runs on top of TensorFlow. Both provide high-level APIs for easily building and training models, but Keras is generally considered more user-friendly.
A layer is a callable object that takes one or more tensors as input and outputs one or more tensors. It combines computation, defined in the layer's call() method, with state (the layer's weights, held in TensorFlow variables). Layers are the basic building blocks of neural networks in Keras.
The Dense layer is the regular densely connected neural network layer, and the most commonly used one. It performs the following operation on its input and returns the result: output = activation(dot(input, kernel) + bias)
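To make that formula concrete, here is a NumPy sketch of the dense forward pass (not the actual Keras implementation; the weights and activation here are made up for illustration):

```python
import numpy as np

def dense_forward(x, kernel, bias, activation=np.tanh):
    """Dense layer forward pass: activation(dot(input, kernel) + bias)."""
    return activation(x @ kernel + bias)

# Toy example: a batch of 2 inputs with 3 features, projected to 4 units.
x = np.ones((2, 3))
kernel = np.full((3, 4), 0.1)  # stand-in for learned weights
bias = np.zeros(4)

y = dense_forward(x, kernel, bias)
print(y.shape)  # (2, 4): one 4-unit output per input in the batch
```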
Since TensorFlow 1.12, tf.layers are merely wrappers around tf.keras.layers.
A few examples:
The convolutional tf.layers simply inherit from the convolutional tf.keras.layers; see the source code:
@tf_export('layers.Conv2D')
class Conv2D(keras_layers.Conv2D, base.Layer):
The same is true for all core tf.layers, e.g.:
@tf_export('layers.Dense')
class Dense(keras_layers.Dense, base.Layer):
With the integration of Keras into TensorFlow, it would make little sense to maintain several different layer implementations. tf.keras is becoming the de facto high-level API for TensorFlow, so tf.layers are now just wrappers around tf.keras.layers.
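Given that, the simplest approach today is to build everything from tf.keras.layers and skip tf.layers entirely. A minimal sketch, assuming TensorFlow 2.x is installed (layer sizes here are arbitrary):

```python
import tensorflow as tf

# A small convolutional model built purely from tf.keras.layers.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation='relu'),  # 8 filters, 3x3 kernel
    tf.keras.layers.MaxPooling2D(),                   # 2x2 pooling by default
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])

# The model is built on the first call with a concrete input shape.
out = model(tf.zeros((1, 28, 28, 1)))
print(out.shape)  # (1, 10)
```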
tf.keras.layers.Conv2D is a TensorFlow-Keras layer, while tf.layers.max_pooling2d is a TensorFlow 'native' layer.
You cannot use a native layer directly within a Keras model, as it will be missing certain attributes required by the Keras API.
However, it is possible to use a native layer if it is wrapped in a TensorFlow-Keras Lambda layer. A link to the documentation for this is below.
https://www.tensorflow.org/api_docs/python/tf/keras/layers/Lambda
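A minimal sketch of that wrapping approach, assuming TensorFlow 2.x: tf.nn.max_pool2d is a plain TensorFlow op rather than a Keras layer, but inside a Lambda it gains the attributes the Keras API expects and can sit alongside Keras layers in a model:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(4, 3),  # a regular Keras layer
    # tf.nn.max_pool2d is not a Keras layer; Lambda makes it usable as one.
    tf.keras.layers.Lambda(
        lambda t: tf.nn.max_pool2d(t, ksize=2, strides=2, padding='VALID')),
    tf.keras.layers.Flatten(),
])

out = model(tf.zeros((1, 28, 28, 1)))
print(out.shape)  # (1, 676): 13 * 13 * 4 features after conv + pool
```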
The tf.layers module is TensorFlow's attempt at creating a Keras-like API, whereas tf.keras.layers is a compatibility wrapper. In fact, most of the implementation refers back to tf.layers; for example, tf.keras.layers.Dense inherits the core implementation:
@tf_export('keras.layers.Dense')
class Dense(tf_core_layers.Dense, Layer):
# ...
Because the tf.keras compatibility module is checked into the TensorFlow repo separately, it might lag behind what Keras actually offers. I would use Keras directly or tf.layers, but not necessarily mix them.