
How to calculate the number of parameters of an LSTM network?

Is there a way to calculate the total number of parameters in an LSTM network?

I have found an example, but I'm unsure how correct it is, or whether I have understood it correctly.

For example, consider the following:

from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
from keras.layers import Embedding
from keras.layers import LSTM
model = Sequential()
model.add(LSTM(256, input_dim=4096, input_length=16))
model.summary()

Output

____________________________________________________________________________________________________
Layer (type)                       Output Shape        Param #     Connected to                     
====================================================================================================
lstm_1 (LSTM)                      (None, 256)         4457472     lstm_input_1[0][0]               
====================================================================================================
Total params: 4457472
____________________________________________________________________________________________________

As per my understanding, n is the input vector length and m is the number of time steps, and in this example the number of hidden layers is taken to be 1.

Hence, according to the formula in the post, 4(nm + n^2), with m = 16, n = 4096, and num_of_units = 256 in my example:

4 * ((4096 * 16) + (4096 * 4096)) * 256 = 17246978048

Why is there such a difference? Did I misunderstand the example, or is the formula wrong?

Arsenal Fanatic asked Sep 09 '25 14:09

2 Answers

No. The number of parameters of an LSTM layer in Keras equals:

params = 4 * ((size_of_input + 1) * size_of_output + size_of_output^2)

The additional 1 comes from the bias term. So n is the size of the input (increased by one for the bias) and m is the size of the output of the LSTM layer.

So finally :

4 * (4097 * 256 + 256^2) = 4457472
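As a quick sanity check, the formula can be evaluated in plain Python (no Keras required); the helper name lstm_params is just illustrative:

```python
def lstm_params(size_of_input, size_of_output):
    # 4 gates, each with input weights (plus one bias row) and recurrent weights
    return 4 * ((size_of_input + 1) * size_of_output + size_of_output ** 2)

print(lstm_params(4096, 256))  # 4457472, matching model.summary()
```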
Marcin Możejko answered Sep 12 '25 05:09


[LSTM cell diagram: image via this post]

num_params = [(num_units + input_dim + 1) * num_units] * 4

num_units + input_dim: concat [h(t-1), x(t)]

+ 1: bias

* 4: there are 4 neural network layers (yellow box) {W_forget, W_input, W_output, W_cell}

model.add(LSTM(units=256, input_dim=4096, input_length=16))

[(256 + 4096 + 1) * 256] * 4 = 4457472

PS: num_units = num_hidden_units = output_dims
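The same count can be reached by summing the three weight blocks of a single gate and multiplying by four (variable names here are illustrative):

```python
num_units, input_dim = 256, 4096

# Per gate: input weights for x(t), recurrent weights for h(t-1), and a bias
W = input_dim * num_units   # input-to-hidden weights
U = num_units * num_units   # hidden-to-hidden (recurrent) weights
b = num_units               # bias

print(4 * (W + U + b))  # 4457472
```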

FelixHo answered Sep 12 '25 04:09