Python notebooks to get started with TensorFlow, Neural Networks (NNs), Convolutional NNs, Word Embeddings and Recurrent Neural Networks. Most of the material is a personal wrap-up of the content provided by Google's Deep Learning course on Udacity, so all credit goes to them. Additionally, I added one notebook to practice with the CTC loss function in temporal models and another on Variational Autoencoders.
Python 3.5 required!
- Notebook 1: How to train a logistic regressor and a 2-layer NN with L2-norm regularization using TensorFlow.
- Notebook 2: Convolutional NNs and dropout regularization.
- Notebook 3: Word embeddings and the word2vec model.
- Notebook 4: Recurrent NNs and sequential character prediction.
- Notebook 5: Recurrent NNs and sequential character prediction from MFCC features with Connectionist Temporal Classification (CTC).
- Notebook 6: Bidirectional LSTM RNN and sequential character prediction from MFCC features with CTC.
- Notebook 7: Amortized variational inference with neural networks and Variational Autoencoders.
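As a taste of what Notebook 1 covers, here is a minimal plain-Python sketch of the underlying idea: logistic regression trained by gradient descent with an L2-norm penalty on the weight. This is only an illustration of the math; the notebook itself builds the equivalent computation graph in TensorFlow.

```python
import math

# Toy sketch of the idea behind Notebook 1: logistic regression with
# L2-norm regularization, written in plain Python for illustration only.
# (The notebook expresses the same computation as a TensorFlow graph.)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, l2=0.1, lr=0.5, epochs=200):
    """Gradient descent on binary cross-entropy + (l2/2) * w^2."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            gw += (p - y) * x / n
            gb += (p - y) / n
        gw += l2 * w          # the L2 penalty shrinks the weight toward zero
        w -= lr * gw
        b -= lr * gb
    return w, b

# Linearly separable toy data: negatives below 0, positives above.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(xs, ys)
print(sigmoid(w * 2.0 + b) > 0.5)   # a clear positive example
```

The `train_logistic` helper and the toy data are made up for this sketch; they are not part of the notebooks.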
Note: since this is an introductory course, most of the steps needed to define the computation graph in TensorFlow are implemented manually, e.g. I do not make use of predefined layers such as those in tf.layers.
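To illustrate what "manually implemented" means here, the forward pass is written out as explicit matrix products and nonlinearities instead of calling a layer helper. A plain-Python sketch of a 2-layer forward pass (the notebooks do the equivalent with `tf.matmul` and `tf.nn.relu` on graph tensors; the helper names below are invented for this example):

```python
import random

# Sketch of the "manual" style used in the notebooks: every step of the
# forward pass is spelled out, rather than hidden behind a predefined layer.

def matvec(W, x):
    # Explicit matrix-vector product (tf.matmul in the notebooks).
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def relu(v):
    # Explicit nonlinearity (tf.nn.relu in the notebooks).
    return [max(0.0, vi) for vi in v]

def two_layer_forward(x, W1, b1, W2, b2):
    h = relu([hi + bi for hi, bi in zip(matvec(W1, x), b1)])  # hidden layer
    return [oi + bi for oi, bi in zip(matvec(W2, h), b2)]     # output logits

random.seed(0)
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]
b1 = [0.0] * 4
W2 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]
b2 = [0.0] * 2
logits = two_layer_forward([0.5, -0.2, 1.0], W1, b1, W2, b2)
print(len(logits))  # one logit per output class
```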
This material is distributed under the MIT License.