A successful attempt at a simple feedforward neural network in Java
- Multiple activation function support: Identity (for the input layer), ReLU (Rectified Linear Unit), LeakyReLU, Sigmoid, and Softmax (each is sketched in code after the model list below)
- Stochastic (per-example) training only; the batch training feature was discontinued (see the training-loop sketch below)
- Multiple loss function support: Mean Squared Error & Log Loss (both sketched below)
- Manually adjusted weights for XOR: `SampleXOR.model`, the first successful forward propagation test (a forward-pass sketch follows this list)
- Backpropagation-trained 2-4-1 XOR model with a LeakyReLU hidden layer: `XOR-LRELU-BackpropagationTrained.model`, the first successful backpropagation test
- Backpropagation-trained 2-5-1 XOR model with a sigmoid hidden layer: `XOR-LRELU-Sigmoid.model`
- Backpropagation-trained 2-5-2 XOR classification model with Softmax & Log Loss: `XOR-SoftmaxLogloss-BackpropagationTrained.model`, the first successful Softmax & Log Loss test
- Backpropagation-trained 784-50-20-10 MNIST classification model with sigmoid hidden layers: `MNISTSoftmaxTest25Epochs94Percent.model`, the first successful MNIST model, reaching 94% accuracy on test data it had never seen during training
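
The activation functions listed above are all standard. Since this section doesn't show the repository's actual classes, the following is a minimal Java sketch of each; the class name `Activations` and the 0.01 LeakyReLU slope are assumptions, not the project's real API.

```java
// Minimal sketch of the listed activation functions; names and
// signatures are illustrative, not the repository's actual API.
public final class Activations {

    // Identity: passes the input through unchanged (used on the input layer).
    public static double identity(double x) {
        return x;
    }

    // ReLU: max(0, x).
    public static double relu(double x) {
        return Math.max(0.0, x);
    }

    // LeakyReLU: small negative slope (0.01 assumed here) instead of a hard zero.
    public static double leakyRelu(double x) {
        return x >= 0.0 ? x : 0.01 * x;
    }

    // Sigmoid: squashes x into (0, 1).
    public static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Softmax: turns a whole layer's raw outputs into a probability
    // distribution; subtracting the max keeps Math.exp from overflowing.
    public static double[] softmax(double[] z) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : z) max = Math.max(max, v);
        double sum = 0.0;
        double[] out = new double[z.length];
        for (int i = 0; i < z.length; i++) {
            out[i] = Math.exp(z[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sum;
        return out;
    }
}
```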
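Stochastic training means the weights are updated after every individual example, rather than accumulating gradients over a batch. The loop below is a hypothetical sketch: the `Network` interface and its `forward`/`backpropagate` methods stand in for the repository's real class, which this README does not show.

```java
// Hypothetical per-example ("stochastic") training loop. The Network
// interface is an assumption standing in for the project's real class.
interface Network {
    double[] forward(double[] input);                         // forward propagation
    void backpropagate(double[] target, double learningRate); // update weights for the last forward pass
}

final class StochasticTrainer {
    static void train(Network net, double[][] inputs, double[][] targets,
                      int epochs, double learningRate) {
        for (int epoch = 0; epoch < epochs; epoch++) {
            // Weights change after every single example; batch training
            // (now discontinued) would instead accumulate gradients over
            // many examples before applying a single update.
            for (int i = 0; i < inputs.length; i++) {
                net.forward(inputs[i]);
                net.backpropagate(targets[i], learningRate);
            }
        }
    }
}
```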
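Both listed losses are standard as well. One plausible array-based implementation is sketched below; class and method names are illustrative.

```java
// Sketch of the two listed loss functions; names are illustrative.
public final class Losses {

    // Mean Squared Error: average of squared differences between
    // the prediction and the target.
    public static double meanSquaredError(double[] predicted, double[] target) {
        double sum = 0.0;
        for (int i = 0; i < predicted.length; i++) {
            double d = predicted[i] - target[i];
            sum += d * d;
        }
        return sum / predicted.length;
    }

    // Log Loss (cross-entropy) for a one-hot target against softmax
    // outputs; the small epsilon avoids log(0).
    public static double logLoss(double[] predicted, double[] target) {
        double eps = 1e-12;
        double sum = 0.0;
        for (int i = 0; i < predicted.length; i++) {
            sum -= target[i] * Math.log(predicted[i] + eps);
        }
        return sum;
    }
}
```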
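For context on the forward propagation test mentioned in the model list, this is how a from-scratch Java network typically computes a single dense layer, out = activation(W·x + b). The array layout and the sigmoid choice here are assumptions, not the repository's confirmed implementation.

```java
// Generic dense-layer forward pass: out[j] = sigmoid(sum_i W[j][i]*x[i] + b[j]).
// Illustrative only; the project's actual data layout is not shown in the README.
public final class DenseLayer {
    static double[] forward(double[][] weights, double[] biases, double[] input) {
        double[] out = new double[weights.length];
        for (int j = 0; j < weights.length; j++) {
            double z = biases[j];                   // start from the neuron's bias
            for (int i = 0; i < input.length; i++) {
                z += weights[j][i] * input[i];      // weighted sum of inputs
            }
            out[j] = 1.0 / (1.0 + Math.exp(-z));    // sigmoid; the activation varies per layer
        }
        return out;
    }
}
```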