Putting it all together

As is tradition in this book, here is how the entire architecture for this model fits together:

from keras.layers import Input, LSTM, Dense
from keras.models import Model

def build_models(lstm_units, num_encoder_tokens, num_decoder_tokens):
    # train model
    encoder_input = Input(shape=(None, num_encoder_tokens),
                          name='encoder_input')
    # the encoder LSTM's hidden and cell states summarize the input sequence
    encoder_outputs, state_h, state_c = LSTM(lstm_units,
        return_state=True, name="encoder_lstm")(encoder_input)
    encoder_states = [state_h, state_c]
    decoder_input = Input(shape=(None, num_decoder_tokens),
                          name='decoder_input')
    decoder_lstm = LSTM(lstm_units, return_sequences=True,
                        return_state=True, name="decoder_lstm")
    # the decoder is seeded with the encoder's final states
    decoder_outputs, _, _ = decoder_lstm(decoder_input,
                                         initial_state=encoder_states)
    # softmax over the target vocabulary at every decoder time step
    decoder_dense = Dense(num_decoder_tokens, activation='softmax',
                          name='softmax_output')
    decoder_output = decoder_dense(decoder_outputs)
    model = Model([encoder_input, decoder_input], decoder_output)
    return model
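Before calling this function, the training data has to be shaped the way the two `Input` layers expect: a 3D tensor of `(batch, time steps, tokens)`, with each character or word one-hot encoded along the last axis. The sketch below illustrates that encoding with NumPy; the vocabulary size, sequence length, and token IDs are made-up values for illustration only, not drawn from the book's dataset.

```python
import numpy as np

# Hypothetical tiny example: vocabulary of 6 tokens, sequences padded to
# length 4, two sequences of differing lengths given as token IDs.
num_encoder_tokens = 6
max_encoder_seq_len = 4
batch = [[1, 3, 2], [0, 5, 4, 2]]

# One 3D tensor of shape (batch, time, tokens), zero-padded past each
# sequence's end -- the shape the encoder's Input layer expects.
encoder_input_data = np.zeros(
    (len(batch), max_encoder_seq_len, num_encoder_tokens), dtype="float32")
for i, seq in enumerate(batch):
    for t, token_id in enumerate(seq):
        encoder_input_data[i, t, token_id] = 1.0

print(encoder_input_data.shape)  # (2, 4, 6)
```

The decoder's input and target tensors are built the same way with `num_decoder_tokens` on the last axis, with the target sequence offset one step ahead of the decoder input for teacher forcing.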
