Keywords: Python, TensorFlow, Deep Learning, Natural Language Processing, Chatbot, Movie Dialogues
- Installation
- Introduction
2.1 Goal
2.2 Results
- Project structure
- Dataset
- Project usage
5.1 Reformat the raw data .txt files
5.2 Train the NLP Seq2Seq model
5.3 Visualize predictions with trained model
5.4 Chat with the Chatbot AI
- Todo
- Resources
This project was designed for:
- Python 3.6
- TensorFlow 1.12.0
Please install the requirements and the project:
$ cd /path/to/project/
$ git clone https://github.com/filippogiruzzi/nlp_chatbot.git
$ cd nlp_chatbot/
$ pip3 install -r requirements.txt
$ pip3 install -e . --user --upgrade
The purpose of this project is to design and implement a realistic Chatbot based on Natural Language Processing (NLP).
The project nlp_chatbot/ has the following structure:
- nlp/data_processing/: data processing, recording & visualization
- nlp/training/: data input pipeline, model & training / evaluation / prediction operations
- nlp/inference/: exporting the trained model & inference
Please download the Cornell Movie-Dialogs Corpus dataset, and extract all files to /path/to/cornell_movie_data/. The challenge description can be found on Kaggle.
The dataset contains 220,579 conversational exchanges between 10,292 pairs of movie characters, involving 9,035 characters from 617 movies, which makes it well suited for realistic chatbot applications.
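The raw corpus is distributed as plain .txt files (e.g. movie_lines.txt and movie_conversations.txt) whose fields are separated by a " +++$+++ " token. As a rough, hypothetical sketch of what the reformatting step does, consecutive lines of each conversation can be paired into (prompt, reply) examples as follows (the actual logic lives in nlp/data_processing/data_formatter.py):

# Hypothetical sketch: pair consecutive utterances of each conversation
# into (prompt, reply) examples. The real preprocessing is done by
# nlp/data_processing/data_formatter.py.
import ast

SEP = " +++$+++ "  # field separator used by the Cornell corpus files

def load_lines(path):
    """Map each line ID to its utterance text."""
    lines = {}
    with open(path, encoding="iso-8859-1") as f:
        for row in f:
            fields = row.rstrip("\n").split(SEP)
            lines[fields[0]] = fields[-1]  # lineID -> utterance
    return lines

def load_pairs(lines_path, conversations_path):
    """Yield (prompt, reply) pairs from consecutive lines of each conversation."""
    lines = load_lines(lines_path)
    with open(conversations_path, encoding="iso-8859-1") as f:
        for row in f:
            line_ids = ast.literal_eval(row.rstrip("\n").split(SEP)[-1])
            for first, second in zip(line_ids, line_ids[1:]):
                yield lines[first], lines[second]

# Example usage:
# pairs = list(load_pairs("/path/to/cornell_movie_data/movie_lines.txt",
#                         "/path/to/cornell_movie_data/movie_conversations.txt"))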
$ cd /path/to/project/nlp_chatbot/nlp/
$ python3 data_processing/data_formatter.py --data-dir /path/to/cornell_movie_data/
$ python3 training/train.py --data-dir /path/to/cornell_movie_data/
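For reference, the core of a Seq2Seq model of this kind can be sketched in a few lines of TensorFlow 1.x. This is only an illustrative, simplified sketch (vocabulary size and dimensions are placeholders, and the padding mask is omitted); the actual model and input pipeline are defined in nlp/training/:

# Illustrative Seq2Seq sketch (encoder-decoder with GRU cells), TF 1.x style.
# Hyper-parameters below are placeholders, not the project's actual values.
import tensorflow as tf

VOCAB_SIZE, EMBED_DIM, HIDDEN_DIM = 8000, 128, 256

# Batches of tokenized (question, answer) pairs.
encoder_inputs = tf.placeholder(tf.int32, [None, None], name="encoder_inputs")
decoder_inputs = tf.placeholder(tf.int32, [None, None], name="decoder_inputs")
decoder_targets = tf.placeholder(tf.int32, [None, None], name="decoder_targets")

embeddings = tf.get_variable("embeddings", [VOCAB_SIZE, EMBED_DIM])
enc_emb = tf.nn.embedding_lookup(embeddings, encoder_inputs)
dec_emb = tf.nn.embedding_lookup(embeddings, decoder_inputs)

# Encoder: compress the input utterance into a fixed-size state.
with tf.variable_scope("encoder"):
    enc_cell = tf.nn.rnn_cell.GRUCell(HIDDEN_DIM)
    _, enc_state = tf.nn.dynamic_rnn(enc_cell, enc_emb, dtype=tf.float32)

# Decoder: generate the reply conditioned on the encoder state.
with tf.variable_scope("decoder"):
    dec_cell = tf.nn.rnn_cell.GRUCell(HIDDEN_DIM)
    dec_outputs, _ = tf.nn.dynamic_rnn(dec_cell, dec_emb, initial_state=enc_state)

logits = tf.layers.dense(dec_outputs, VOCAB_SIZE, name="projection")

# Token-level cross-entropy (padding mask omitted for brevity).
loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=decoder_targets,
                                                   logits=logits))
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)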
$ python3 training/train.py --data-dir /path/to/cornell_movie_data/tfrecords/ \
    --mode predict \
    --model-dir /path/to/trained/model/dir/ \
    --ckpt /path/to/trained/model/dir/
$ python3 inference/export_model.py --model-dir /path/to/trained/model/dir/ \
    --ckpt /path/to/trained/model/dir/
$ python3 inference/inference.py --data_dir /path/to/cornell_movie_data/ \
    --exported_model /path/to/exported/model/
The trained model will be recorded in /path/to/cornell_movie_data/tfrecords/models/seq2seq/. The exported model will be recorded inside this directory.
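The exported model can also be queried directly from Python, for example with TensorFlow 1.x's predictor API. The snippet below is a hypothetical sketch: the input/output key names depend on the serving signature defined in nlp/inference/export_model.py and may differ.

# Hypothetical sketch: load the exported SavedModel and run one prediction.
# The feed/fetch key name ("inputs", here) is an assumption; the real one
# comes from the serving signature built in nlp/inference/export_model.py.
from tensorflow.contrib import predictor

predict_fn = predictor.from_saved_model("/path/to/exported/model/")
prediction = predict_fn({"inputs": [[42, 7, 13]]})  # a batch of token IDs
print(prediction)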
- Full training on Colab
- Add Google Colab demo
- Add attention
- Debug training accuracy
- Add evaluation accuracy
- Inference model & script
- Chatbot interface
- Clean OOP inference
- Add architecture to the README
- Add Beam search decoding & random sampling decoding
- Add Softmax temperature (see the sampling sketch after this list)
- Add complex models
- Add & compare with statistical baseline
- Add perplexity
- Visualize attention
- Add char level model
- Add BPE (Byte Pair Encoding)
- Train on maximizing MMI (Maximum Mutual Information)
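As an illustration of the softmax temperature / random sampling decoding items above, decoding with a temperature-scaled softmax could look like the following sketch (not part of the current code base):

# Illustrative sketch of softmax temperature + random sampling decoding.
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=np.random):
    """Sample a token ID from logits scaled by a softmax temperature.

    temperature < 1.0 sharpens the distribution (safer, more repetitive replies),
    temperature > 1.0 flattens it (more diverse, riskier replies).
    """
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    scaled -= scaled.max()  # numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    return rng.choice(len(probs), p=probs)

# Example: the same logits sampled conservatively vs. more freely.
logits = [2.0, 1.0, 0.1]
print(sample_with_temperature(logits, temperature=0.5))
print(sample_with_temperature(logits, temperature=1.5))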
This project was largely inspired by:
- PyTorch chatbot tutorial, PyTorch website
- PyTorch NLP tutorial, PyTorch website
- TensorFlow NLP tutorial, TensorFlow website
- Keras NLP tutorial, Towards Data Science
- Kaggle challenge, Kaggle
- Sequence to Sequence Learning with Neural Networks, I. Sutskever, O. Vinyals, Q. V. Le, 2014, arXiv
- A Neural Conversational Model, O. Vinyals, Q. Le, 2015, arXiv
- Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation, K. Cho, B. van Merrienboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, Y. Bengio, 2014, arXiv
- Effective Approaches to Attention-based Neural Machine Translation, M-T. Luong, H. Pham, C. D. Manning, 2015, arXiv
- Neural Machine Translation by Jointly Learning to Align and Translate, D. Bahdanau, K. Cho, Y. Bengio, 2014, arXiv