A NeuralNetwork consists of groups of Neurons, connected via Synapses and organised into Layers.
NeuralNetwork:

Layer1:                                            Layer2:
Neurons <-> Synapses <-> Neurons <-> Synapses <-> Neurons <-> Synapses <-> Neurons <-> ....
Data flows through the network via NeuronsActivation instances, which pass through the Synapses and activate each group of Neurons in turn.
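To make this composition concrete, here is a minimal sketch of these concepts as plain Java interfaces. The names mirror the description above, but the methods and signatures are illustrative assumptions rather than the library's actual API.

    import java.util.List;

    // A group of Neurons, e.g. the units on one side of a Synapses.
    interface Neurons {
        int getNeuronCount();
    }

    // A NeuronsActivation carries the activation values for a group of Neurons
    // (one value per Neuron, for a single example).
    interface NeuronsActivation {
        Neurons getNeurons();
        float[] getActivations();
    }

    // Synapses connect the group of Neurons on their left to the group on their right.
    interface Synapses {
        Neurons getLeftNeurons();
        Neurons getRightNeurons();
    }

    // A Layer is an alternating chain: Neurons <-> Synapses <-> Neurons <-> ...
    interface Layer {
        List<Synapses> getSynapses();
    }

    // A NeuralNetwork is an ordered collection of Layers.
    interface NeuralNetwork {
        List<Layer> getLayers();
    }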
Inside the Synapses there are Axons (which determine the "connections" between the Neurons on either side of the Synapses) and ActivationFunctions (which modify the activations as they exit the Axons).
Synapses:
ActivationFunction <-> Axons <-> ActivationFunction
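A hedged sketch of that internal structure: an Axons instance bracketed by two ActivationFunctions, with the function on the exit side applied to whatever leaves the Axons. The class and method names here are assumptions for illustration, not the library's real signatures.

    // Modifies activations as they exit the Axons (e.g. a sigmoid).
    interface ActivationFunction {
        float[] activate(float[] activations);
    }

    // The "connections" between the Neurons on either side of the Synapses.
    interface Axons {
        float[] pushLeftToRight(float[] leftActivations);
        float[] pushRightToLeft(float[] rightActivations);
    }

    // ActivationFunction <-> Axons <-> ActivationFunction
    class SimpleSynapses {

        private final ActivationFunction leftActivationFunction;
        private final Axons axons;
        private final ActivationFunction rightActivationFunction;

        SimpleSynapses(ActivationFunction left, Axons axons, ActivationFunction right) {
            this.leftActivationFunction = left;
            this.axons = axons;
            this.rightActivationFunction = right;
        }

        // Left-to-right: push through the Axons, then apply the right-hand function.
        float[] propagateLeftToRight(float[] leftActivations) {
            return rightActivationFunction.activate(axons.pushLeftToRight(leftActivations));
        }

        // Right-to-left: push through the Axons, then apply the left-hand function.
        float[] propagateRightToLeft(float[] rightActivations) {
            return leftActivationFunction.activate(axons.pushRightToLeft(rightActivations));
        }
    }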
Each Layer includes at least one Axons instance (the primary Axons for that Layer, after which the Layer is named), but may also include other Axons if it is a more complex Layer.
Some Axons have fixed connection weights, but other Axons are TrainableAxons and have connection weights we want to learn.
The aim of training a NeuralNetwork is to optimise the connection weights of all the TrainableAxons so that the NeuralNetwork generates some desired output data (encoded as NeuronsActivation instances), e.g. in response to some given input NeuronsActivation.
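As a rough illustration of that aim, the sketch below models a TrainableAxons as a learnable weight matrix with a plain gradient-descent update. The real training machinery is more involved, and these names and methods are assumptions, not the library's API.

    // Hypothetical sketch of TrainableAxons: connection weights we want to learn.
    class TrainableAxonsSketch {

        private final float[][] weights;   // weights[rightNeuron][leftNeuron]

        TrainableAxonsSketch(int leftNeuronCount, int rightNeuronCount) {
            this.weights = new float[rightNeuronCount][leftNeuronCount];
        }

        // Weighted sum of the left activations for each right-hand Neuron.
        float[] pushLeftToRight(float[] leftActivations) {
            float[] rightActivations = new float[weights.length];
            for (int j = 0; j < weights.length; j++) {
                for (int i = 0; i < leftActivations.length; i++) {
                    rightActivations[j] += weights[j][i] * leftActivations[i];
                }
            }
            return rightActivations;
        }

        // One gradient-descent step: nudge each weight against its cost gradient,
        // so the network's output moves towards the desired NeuronsActivation.
        void adjustWeights(float[][] weightGradients, float learningRate) {
            for (int j = 0; j < weights.length; j++) {
                for (int i = 0; i < weights[j].length; i++) {
                    weights[j][i] -= learningRate * weightGradients[j][i];
                }
            }
        }
    }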
A DirectedNeuralNetwork is a subclass of NeuralNetwork in which there is a natural direction for the flow of activations, from left to right.
The Layers within a DirectedNeuralNetwork are DirectedLayers, within which Neurons are connected via DirectedSynapses.
DirectedNeuralNetwork:
DirectedLayer1:
Neurons -> DirectedSynapses -> Neurons -> DirectedSynapses -> ....
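A minimal sketch of that left-to-right flow, assuming a hypothetical forwardPropagate method on each DirectedLayer (illustrative names only):

    import java.util.List;

    // A DirectedLayer knows how to forward propagate activations left to right.
    interface DirectedLayer {
        float[] forwardPropagate(float[] inputActivations);
    }

    // A DirectedNeuralNetwork feeds the output of each DirectedLayer into the next.
    class DirectedNeuralNetworkSketch {

        private final List<DirectedLayer> layers;

        DirectedNeuralNetworkSketch(List<DirectedLayer> layers) {
            this.layers = layers;
        }

        float[] forwardPropagate(float[] inputActivations) {
            float[] activations = inputActivations;
            for (DirectedLayer layer : layers) {
                activations = layer.forwardPropagate(activations);
            }
            return activations;
        }
    }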
DirectedSynapses contain Axons followed by a specific type of ActivationFunction, a DifferentiableActivationFunction, on the right-hand side of the Axons.
DirectedSynapses:
Axons -> DifferentiableActivationFunction
DirectedLayers and DirectedSynapses each know how to "forward propagate" the activations of the Neurons on their left-hand side to the Neurons on their right-hand side.
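For example, a DirectedSynapses instance might forward propagate as in the sketch below: activations are pushed through the Axons and then through the DifferentiableActivationFunction on the right-hand side (again, hypothetical names and signatures, not the actual API):

    // Left-to-right "connections"; a one-directional variant of the Axons sketch above.
    interface Axons {
        float[] pushLeftToRight(float[] leftActivations);
    }

    // An ActivationFunction whose gradient is also available (needed later for back propagation).
    interface DifferentiableActivationFunction {
        float[] activate(float[] activations);
        float[] activationGradient(float[] activations);
    }

    // Axons -> DifferentiableActivationFunction
    class DirectedSynapsesSketch {

        private final Axons axons;
        private final DifferentiableActivationFunction activationFunction;

        DirectedSynapsesSketch(Axons axons, DifferentiableActivationFunction activationFunction) {
            this.axons = axons;
            this.activationFunction = activationFunction;
        }

        // Forward propagate the left-hand Neurons' activations to the right-hand side.
        float[] forwardPropagate(float[] leftActivations) {
            return activationFunction.activate(axons.pushLeftToRight(leftActivations));
        }
    }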
As NeuronsActivation instances propagate through this DirectedNeuralNetwork, the activations produced by each of these components are collected into a "forward propagation" chain.
e.g.
NeuronsActivation -> AxonsActivation -> DifferentiableActivationFunctionActivation -> NeuronsActivation -> .....
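A sketch of how such a chain might be collected, recording the activation produced by each component as the data flows through (hypothetical record and method names, with a simple matrix-vector product standing in for the Axons):

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.UnaryOperator;

    class ForwardPropagationChainSketch {

        // One record per component the data passes through.
        record AxonsActivation(float[] outputActivations) {}
        record DifferentiableActivationFunctionActivation(float[] outputActivations) {}

        // The "forward propagation" chain, in the order the activations were produced.
        final List<Object> chain = new ArrayList<>();

        // Propagate through one Axons / DifferentiableActivationFunction pair,
        // recording each intermediate activation as we go.
        float[] forwardPropagate(float[] input, float[][] weights, UnaryOperator<float[]> activationFunction) {
            float[] afterAxons = multiply(weights, input);
            chain.add(new AxonsActivation(afterAxons));
            float[] afterActivationFunction = activationFunction.apply(afterAxons);
            chain.add(new DifferentiableActivationFunctionActivation(afterActivationFunction));
            return afterActivationFunction;
        }

        // Stand-in for pushing activations through the Axons' connection weights.
        private static float[] multiply(float[][] weights, float[] input) {
            float[] output = new float[weights.length];
            for (int j = 0; j < weights.length; j++) {
                for (int i = 0; i < input.length; i++) {
                    output[j] += weights[j][i] * input[i];
                }
            }
            return output;
        }
    }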
The activations within each DirectedSynapses instance are grouped together into a DirectedSynapseActivation, and these DirectedSynapseActivations are themselves further grouped into DirectedLayerActivations.
Our ForwardPropagation now contains a chain of DirectedLayerActivations, each containing a chain of DirectedSynapseActivations, each of which contains an AxonsActivation and a DifferentiableActivationFunctionActivation.
ForwardPropagation:
DirectedLayerActivation1 -> DirectedLayerActivation2 -> ...
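A sketch of that nesting as plain data classes, closely following the description above (illustrative names, not necessarily the library's actual types):

    import java.util.List;

    // The activation produced by an Axons instance during forward propagation.
    record AxonsActivation(float[] outputActivations) {}

    // The activation produced by a DifferentiableActivationFunction.
    record DifferentiableActivationFunctionActivation(float[] outputActivations) {}

    // Groups the activations produced within a single DirectedSynapses instance.
    record DirectedSynapseActivation(
            AxonsActivation axonsActivation,
            DifferentiableActivationFunctionActivation activationFunctionActivation) {}

    // Groups the DirectedSynapseActivations produced within a single DirectedLayer.
    record DirectedLayerActivation(List<DirectedSynapseActivation> synapseActivations) {}

    // The full chain collected for one pass through the DirectedNeuralNetwork.
    record ForwardPropagation(List<DirectedLayerActivation> layerActivations) {}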