Machine translation using an Encoder-Decoder LSTM model
- Encoder: encodes the input text corpus (German sentences) into embedding/context vectors and learns a fixed-size representation of each source sentence.
- Decoder: conditioned on the encoder's representation, predicts the translation step by step as one-hot vectors over the English vocabulary.
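A minimal sketch of the plain encoder-decoder setup described above, in Keras. The vocabulary sizes, sequence lengths, and hidden dimension below are placeholder assumptions, not the repo's actual configuration:

```python
from tensorflow.keras.layers import Input, LSTM, Embedding, Dense
from tensorflow.keras.models import Model

src_vocab, tgt_vocab = 10000, 8000  # assumed German / English vocabulary sizes
src_len, tgt_len = 20, 20           # assumed max sentence lengths (in tokens)
latent_dim = 256                    # assumed LSTM hidden size

# Encoder: embeds German token ids; only the final LSTM states are kept
# and passed on as the fixed-size representation of the source sentence.
enc_in = Input(shape=(src_len,))
enc_emb = Embedding(src_vocab, latent_dim, mask_zero=True)(enc_in)
_, state_h, state_c = LSTM(latent_dim, return_state=True)(enc_emb)

# Decoder: initialized with the encoder states, predicts a softmax
# distribution over the English vocabulary at every timestep
# (targets are the one-hot-encoded English words).
dec_in = Input(shape=(tgt_len,))
dec_emb = Embedding(tgt_vocab, latent_dim, mask_zero=True)(dec_in)
dec_seq = LSTM(latent_dim, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c])
outputs = Dense(tgt_vocab, activation="softmax")(dec_seq)

model = Model([enc_in, dec_in], outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")
```

At inference time the decoder would be run autoregressively (feeding each predicted word back in), which the tutorials linked below cover in detail.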
Implemented NMT both with and without attention; a sketch of the attention variant is shown below, followed by the references used.
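For the attention variant, one possible sketch uses Keras' built-in Luong-style dot-product `Attention` layer (linked in the references below); all layer sizes are again assumptions:

```python
from tensorflow.keras.layers import (Input, LSTM, Embedding, Dense,
                                     Attention, Concatenate)
from tensorflow.keras.models import Model

src_vocab, tgt_vocab, latent_dim = 10000, 8000, 256  # assumed sizes
src_len, tgt_len = 20, 20                            # assumed lengths

# Encoder: now returns the full sequence of hidden states, since the
# attention layer needs one state per source token, not just the last one.
enc_in = Input(shape=(src_len,))
enc_emb = Embedding(src_vocab, latent_dim)(enc_in)
enc_seq, state_h, state_c = LSTM(latent_dim, return_sequences=True,
                                 return_state=True)(enc_emb)

# Decoder LSTM runs as before, initialized with the encoder's final states.
dec_in = Input(shape=(tgt_len,))
dec_emb = Embedding(tgt_vocab, latent_dim)(dec_in)
dec_seq = LSTM(latent_dim, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c])

# Luong-style dot-product attention: each decoder state queries the encoder
# states, yielding a context vector per target timestep, which is
# concatenated with the decoder state before the output projection.
context = Attention()([dec_seq, enc_seq])
dec_concat = Concatenate()([dec_seq, context])
outputs = Dense(tgt_vocab, activation="softmax")(dec_concat)

model = Model([enc_in, dec_in], outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")
```

Bahdanau (additive) attention, as in the first paper below, would instead use `tf.keras.layers.AdditiveAttention` in the same position.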
- https://arxiv.org/pdf/1409.0473v7.pdf
- https://machinelearningmastery.com/develop-neural-machine-translation-system-keras/
- https://opus.nlpl.eu/
- http://jalammar.github.io/illustrated-transformer/
- https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
- https://betterprogramming.pub/a-guide-on-the-encoder-decoder-model-and-the-attention-mechanism-401c836e2cdb
- https://keras.io/api/layers/attention_layers/attention/
- https://levelup.gitconnected.com/building-seq2seq-lstm-with-luong-attention-in-keras-for-time-series-forecasting-1ee00958decb
- https://www.kaggle.com/residentmario/seq-to-seq-rnn-models-attention-teacher-forcing
- https://www.analyticsvidhya.com/blog/2019/11/comprehensive-guide-attention-mechanism-deep-learning/
- https://medium.com/deep-learning-with-keras/seq2seq-part-f-encoder-decoder-with-bahdanau-luong-attention-mechanism-ca619e240c55
- https://stackabuse.com/python-for-nlp-neural-machine-translation-with-seq2seq-in-keras/
- https://github.com/shawnhan108/Attention-LSTMs/tree/master/English%20to%20Italian%20Machine%20Translation%20with%20BiLSTM%20and%20Attention
- https://towardsdatascience.com/light-on-math-ml-attention-with-keras-dc8dbc1fad39