Tulsi Tulsi - 1 year ago 145
Python Question

Multiple hidden layers in LSTM in Keras

x = Input(shape=(timesteps, input_dim,))

# LSTM encoding
h = LSTM(2048)(x)


These are a few lines of code from a file I downloaded from the internet. I think h holds a single LSTM layer with 2048 units. How can I make it multi-layer, i.e. 2 hidden layers?

Answer Source

Just add another layer (let's call it g)! But since we are passing the output to another LSTM layer, we need to add the return_sequences keyword argument to the first layer, so that the second layer receives input of the right shape (a full sequence of timesteps rather than a single output vector).

from keras.layers import Input, LSTM

x = Input(shape=(timesteps, input_dim,))

# LSTM encoding: the first layer returns the full sequence so the
# second LSTM layer gets 3D input of shape (batch, timesteps, 2048)
h = LSTM(2048, return_sequences=True)(x)
# The second layer returns only its final hidden state
g = LSTM(10)(h)
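For completeness, here is a minimal sketch of wrapping the two stacked LSTM layers into a trainable model using the Keras 2 functional API. The concrete values of timesteps and input_dim, the optimizer, and the loss are assumptions for illustration only and are not part of the original snippet.

from keras.layers import Input, LSTM
from keras.models import Model

# Assumed example dimensions, just for illustration
timesteps = 50
input_dim = 32

x = Input(shape=(timesteps, input_dim,))

# Two stacked LSTM layers
h = LSTM(2048, return_sequences=True)(x)   # output shape: (batch, timesteps, 2048)
g = LSTM(10)(h)                            # output shape: (batch, 10)

# Wrap into a model; optimizer and loss are placeholder choices
model = Model(inputs=x, outputs=g)
model.compile(optimizer='adam', loss='mse')
model.summary()

The summary should show two LSTM layers, confirming the network now has two hidden recurrent layers; you can stack further layers the same way, keeping return_sequences=True on every LSTM except the last one.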