seq2seq translation

Keras implementation of a character-level recurrent sequence-to-sequence model

This repo contains the model and the notebook for the Keras example on a character-level recurrent sequence-to-sequence model.

Full credits to: fchollet

Model reproduced by: Sumedh

Intended uses & limitations

This model implements a basic character-level recurrent sequence-to-sequence network for translating short English sentences into short French sentences, character by character. Note that character-level machine translation is fairly unusual; word-level models are more common in this domain. The model works best on text of length <= 15 characters.
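As a minimal sketch of that encoder-decoder setup, following the original Keras example: an LSTM encoder compresses the input sentence into its final states, and an LSTM decoder generates the translation conditioned on those states. The token counts shown are illustrative placeholders; in practice they are computed from the character vocabularies of the data.

```python
import keras
from keras import layers

latent_dim = 256           # latent dimensionality of the encoding space
num_encoder_tokens = 71    # placeholder; computed from the English character set
num_decoder_tokens = 93    # placeholder; computed from the French character set

# Encoder: reads the input sentence one character at a time and keeps
# only its final LSTM states as a summary of the sentence.
encoder_inputs = keras.Input(shape=(None, num_encoder_tokens))
encoder = layers.LSTM(latent_dim, return_state=True)
_, state_h, state_c = encoder(encoder_inputs)
encoder_states = [state_h, state_c]

# Decoder: generates the target sentence character by character,
# conditioned on the encoder's final states.
decoder_inputs = keras.Input(shape=(None, num_decoder_tokens))
decoder_lstm = layers.LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs, initial_state=encoder_states)
decoder_dense = layers.Dense(num_decoder_tokens, activation="softmax")
decoder_outputs = decoder_dense(decoder_outputs)

model = keras.Model([encoder_inputs, decoder_inputs], decoder_outputs)
```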

Training and evaluation data

English-to-French translation data from https://www.manythings.org/anki/
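As a rough sketch of how those sentence pairs can be parsed, assuming the tab-separated fra.txt file from the download above sits in the working directory (the file name and the num_samples cutoff follow the original example; the tab and newline characters act as the decoder's start- and end-of-sequence markers):

```python
num_samples = 10000  # number of samples to train on

input_texts, target_texts = [], []
input_characters, target_characters = set(), set()

with open("fra.txt", encoding="utf-8") as f:
    lines = f.read().split("\n")

for line in lines[: min(num_samples, len(lines) - 1)]:
    # Each line holds an English sentence, a tab, and its French
    # translation (extra columns, if any, are ignored).
    input_text, target_text = line.split("\t")[:2]
    # "\t" marks start-of-sequence, "\n" marks end-of-sequence.
    target_text = "\t" + target_text + "\n"
    input_texts.append(input_text)
    target_texts.append(target_text)
    input_characters.update(input_text)
    target_characters.update(target_text)

num_encoder_tokens = len(input_characters)
num_decoder_tokens = len(target_characters)
```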

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

| name | learning_rate | decay | rho | momentum | epsilon | centered | training_precision |
|------|---------------|-------|-----|----------|---------|----------|--------------------|
| RMSprop | 0.0010000000474974513 | 0.0 | 0.8999999761581421 | 0.0 | 1e-07 | False | float32 |
```python
batch_size = 64    # Batch size for training.
epochs = 100       # Number of epochs to train for.
latent_dim = 256   # Latent dimensionality of the encoding space.
num_samples = 10000  # Number of samples to train on.
```
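Putting the pieces together, a hedged sketch of the compile-and-fit step with the settings above might look as follows. It continues from the architecture sketch earlier; the three one-hot arrays (encoder_input_data, decoder_input_data, decoder_target_data) are assumed to have been built from the parsed sentence pairs, as in the original example.

```python
import keras

# Continuing from the architecture sketch above; encoder_input_data,
# decoder_input_data, and decoder_target_data are assumed to be the
# one-hot tensors built from the parsed data (not shown here).
model.compile(
    optimizer=keras.optimizers.RMSprop(learning_rate=0.001, rho=0.9, epsilon=1e-07),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
model.fit(
    [encoder_input_data, decoder_input_data],
    decoder_target_data,
    batch_size=64,          # as listed above
    epochs=100,
    validation_split=0.2,   # the original example holds out 20% for validation
)
```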

Model Plot

<details> <summary>View Model Plot</summary>

Model Image

</details>