Implementation of the Inception V1 convolutional neural network using TensorFlow 2.

Inception V1 reference article: https://arxiv.org/pdf/1409.4842v1.pdf
Batch Normalization reference article: https://arxiv.org/pdf/1502.03167v3.pdf
Reference code: https://github.com/tensorflow/models/blob/master/research/slim/nets/inception_v1.py
Article with more code examples: https://paperswithcode.com/paper/going-deeper-with-convolutions
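The core building block of Inception V1 is the Inception module: four parallel branches (1x1 conv, 1x1→3x3 conv, 1x1→5x5 conv, and 3x3 max-pool→1x1 conv) concatenated along the channel axis. A minimal sketch in TF2 Keras, not the repo's exact code; the filter counts below are those of the paper's inception (3a) block:

```python
import tensorflow as tf
from tensorflow.keras import layers

def inception_module(x, f1, f3r, f3, f5r, f5, proj):
    # Branch 1: 1x1 convolution
    b1 = layers.Conv2D(f1, 1, padding="same", activation="relu")(x)
    # Branch 2: 1x1 reduction followed by 3x3 convolution
    b2 = layers.Conv2D(f3r, 1, padding="same", activation="relu")(x)
    b2 = layers.Conv2D(f3, 3, padding="same", activation="relu")(b2)
    # Branch 3: 1x1 reduction followed by 5x5 convolution
    b3 = layers.Conv2D(f5r, 1, padding="same", activation="relu")(x)
    b3 = layers.Conv2D(f5, 5, padding="same", activation="relu")(b3)
    # Branch 4: 3x3 max-pool followed by 1x1 projection
    b4 = layers.MaxPooling2D(3, strides=1, padding="same")(x)
    b4 = layers.Conv2D(proj, 1, padding="same", activation="relu")(b4)
    # Concatenate all branches along the channel axis
    return layers.Concatenate()([b1, b2, b3, b4])

inp = tf.keras.Input(shape=(28, 28, 1))
out = inception_module(inp, 64, 96, 128, 16, 32, 32)  # inception (3a) filter sizes
model = tf.keras.Model(inp, out)
print(model.output_shape)  # (None, 28, 28, 256): 64 + 128 + 32 + 32 channels
```

The concatenation is what gives the module its multi-scale behavior: each spatial position carries features computed at 1x1, 3x3, and 5x5 receptive fields simultaneously.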
Two models were trained: one without Batch Normalization (BN) and one with it.
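The BN variant follows the Batch Normalization paper's placement: normalization after the convolution and before the nonlinearity. A minimal sketch of the conv block used in that configuration (an assumption about this repo's layout, not its exact code):

```python
import tensorflow as tf
from tensorflow.keras import layers

def conv_bn_relu(x, filters, kernel_size):
    # Conv without bias (BN's beta parameter replaces it),
    # then BN before the ReLU, as in the BN paper
    x = layers.Conv2D(filters, kernel_size, padding="same", use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    return layers.ReLU()(x)

inp = tf.keras.Input(shape=(28, 28, 1))
model = tf.keras.Model(inp, conv_bn_relu(inp, 64, 7))
print(model.output_shape)  # (None, 28, 28, 64)
```

Dropping the conv bias is a common idiom when BN follows: BN's learned shift makes the bias redundant and saves parameters.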
| Inception V1 | Accuracy without BN | Accuracy with BN |
| --- | --- | --- |
| softmax 0 | 89.02% | 87.08% |
| softmax 1 | 89.84% | 97.54% |
| softmax 2 | 98.76% | 96.58% |
| Inception V1 | Loss without BN | Loss with BN |
| --- | --- | --- |
| softmax 0 | 0.5322 | 0.7147 |
| softmax 1 | 0.3873 | 0.6366 |
| softmax 2 | 0.0662 | 0.1775 |
| Inception V1 | Error Rate without BN | Error Rate with BN |
| --- | --- | --- |
| softmax 0 | 10.98% | 12.92% |
| softmax 1 | 10.16% | 2.46% |
| softmax 2 | 1.24% | 3.42% |
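softmax 0 and softmax 1 are the two auxiliary classifiers attached to intermediate layers, and softmax 2 is the main output; the paper weights the auxiliary losses by 0.3 during training. A hedged sketch of how a three-output model can be wired and compiled in TF2 Keras (the backbone and head sizes here are illustrative placeholders, not this repo's exact configuration):

```python
import tensorflow as tf
from tensorflow.keras import layers

def aux_head(x, name):
    # Auxiliary classifier as in the paper:
    # avg-pool -> 1x1 conv -> FC -> dropout -> softmax
    x = layers.AveragePooling2D(5, strides=3, padding="same")(x)
    x = layers.Conv2D(128, 1, activation="relu")(x)
    x = layers.Flatten()(x)
    x = layers.Dense(1024, activation="relu")(x)
    x = layers.Dropout(0.7)(x)
    return layers.Dense(10, activation="softmax", name=name)(x)

# Toy backbone standing in for the Inception stacks
inp = tf.keras.Input(shape=(28, 28, 1))
x = layers.Conv2D(64, 3, padding="same", activation="relu")(inp)
softmax0 = aux_head(x, "softmax_0")
x = layers.Conv2D(128, 3, padding="same", activation="relu")(x)
softmax1 = aux_head(x, "softmax_1")
x = layers.GlobalAveragePooling2D()(x)
softmax2 = layers.Dense(10, activation="softmax", name="softmax_2")(x)

model = tf.keras.Model(inp, [softmax0, softmax1, softmax2])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    # Auxiliary losses weighted by 0.3, as in the Inception V1 paper
    loss_weights=[0.3, 0.3, 1.0],
    metrics=["accuracy"],
)
```

With this wiring, `model.evaluate` reports a separate loss and accuracy per output, which is how the three rows per table above are produced.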
The MNIST dataset with 10,000 images was used to train the network.
(Training graph for the model without Batch Normalization.)
(Training graph for the model with Batch Normalization.)
5,000 images were used to test the model.
Accuracy of the softmax outputs (model without BN):
- softmax 0: 0.8902
- softmax 1: 0.8984
- softmax 2: 0.9876
(Confusion matrix for softmax 2. Error rate of 1.24%)
(Confusion matrix for softmax 1. Error rate of 10.16%)
(Confusion matrix for softmax 0. Error rate of 10.98%)
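The confusion matrices and error rates above can be computed directly from a head's predictions with `tf.math.confusion_matrix`. A small sketch with hypothetical labels (the toy values below are illustrative only, not results from this repo):

```python
import tensorflow as tf

# Hypothetical true labels and argmax predictions for one softmax head
y_true = tf.constant([0, 1, 2, 2, 1])
y_pred = tf.constant([0, 1, 2, 1, 1])

cm = tf.math.confusion_matrix(y_true, y_pred, num_classes=3)
# Error rate = 1 - (diagonal sum / total), i.e. the fraction of misclassifications
correct = tf.reduce_sum(tf.linalg.diag_part(cm))
total = tf.reduce_sum(cm)
error_rate = 1.0 - correct / total  # ≈ 0.2 for these toy values
```

For the real model, `y_pred` would come from `tf.argmax(model.predict(x_test)[i], axis=1)` for head `i`.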
Accuracy of the softmax outputs (model with BN):
- softmax 0: 0.8708
- softmax 1: 0.9754
- softmax 2: 0.9658
(Confusion matrix for softmax 2. Error rate of 3.42%)
(Confusion matrix for softmax 1. Error rate of 2.46%)
(Confusion matrix for softmax 0. Error rate of 12.92%)