Fix math render error in README
zhaozewang authored Jan 16, 2025
1 parent 1ff0796 commit 5e100b0
Showing 1 changed file with 3 additions and 3 deletions.
README.md: 6 changes (3 additions, 3 deletions)
@@ -23,13 +23,13 @@ This project aims to address these issues by improving the biological plausibili

## Network Structures
#### Vanilla CTRNN
-A simplistic Vanilla CTRNN contains three layers, an input layer, a hidden layer, and an readout layer as depicted below.
+A simplistic Vanilla CTRNN contains three layers: an input layer, a hidden layer, and a readout layer, as depicted below.

<p align="center">
<img src="https://github.com/NN4Neurosim/nn4n/blob/main/docs/images/RNN_structure.png" width="400">
</p>

-The yellow nodes represent neurons that project input signals to the hidden layer, the green neurons are in the hidden layer, and the purple nodes represent neurons that read out from the hidden layer neurons. Both input and readout neurons are 'imagined' to be there. I.e., they only project or receives signals and therefore do not have activations and internal states.
+The yellow nodes represent neurons that project input signals to the hidden layer, the green neurons are in the hidden layer, and the purple nodes represent neurons that read out from the hidden layer neurons. Both input and readout neurons are 'imagined' to be there. I.e., they only project or receive signals and, therefore, do not have activations and internal states.
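
For concreteness, here is a minimal sketch of what that structure implies in code: only the hidden layer carries state, while the input and readout act as stateless projections. This is an illustrative discretized CTRNN step in PyTorch; the names (`ctrnn_step`, `W_in`, `W_rec`, `W_out`, `alpha`) are assumptions for this sketch and not nn4n's API.

```python
import torch

# Illustrative discretized CTRNN step (not nn4n's API): only the hidden layer
# carries state; input and readout are stateless linear projections.
def ctrnn_step(x, u, W_in, W_rec, W_out, alpha=0.1, f=torch.tanh):
    r = f(x)                                                   # hidden-layer firing rates
    x = (1 - alpha) * x + alpha * (r @ W_rec.T + u @ W_in.T)   # leaky integration of hidden state
    z = r @ W_out.T                                            # readout: projection only, no internal state
    return x, z

# Toy usage: 10 inputs, 100 hidden neurons, 2 readout units.
W_in = 0.1 * torch.randn(100, 10)
W_rec = 0.1 * torch.randn(100, 100)
W_out = 0.1 * torch.randn(2, 100)
x = torch.zeros(1, 100)
x, z = ctrnn_step(x, torch.randn(1, 10), W_in, W_rec, W_out)
```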

#### Excitatory-Inhibitory Constrained CTRNN
The implementation of CTRNN also supports Excitatory-Inhibitory constrained continuous-time RNN (EIRNN) similar to what was proposed by H. Francis Song, Guangyu R. Yang, and Xiao-Jing Wang in [Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework](https://doi.org/10.1371/journal.pcbi.1004792)
@@ -43,7 +43,7 @@ A visual illustration of the EIRNN is shown below.
The yellow nodes denote nodes in the input layer. The middle circle denotes the hidden layer. There are blue nodes and red nodes, representing inhibitory neurons and excitatory neurons, respectively. The depicted network has an E/I ratio of 4/1. The purple nodes are ReadoutLayer neurons.
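
One common way to realize such an E/I constraint, in the spirit of Song et al., is to keep a non-negative weight magnitude and multiply by a fixed sign per presynaptic neuron (Dale's law). The snippet below is a sketch of that idea only; the variable names and the 4/1 split are illustrative, not necessarily how nn4n implements it.

```python
import torch

# Sketch of a Dale's-law-style sign constraint (illustrative, not nn4n's code).
# Convention: W_rec[i, j] is the weight from neuron j onto neuron i, so the
# sign of column j is fixed by whether neuron j is excitatory or inhibitory.
N, n_exc = 100, 80                   # 4/1 E/I ratio, as in the figure
signs = torch.ones(N)
signs[n_exc:] = -1.0                 # remaining neurons are inhibitory
W_raw = torch.rand(N, N)             # unconstrained trainable parameter
W_rec = torch.relu(W_raw) * signs    # non-negative magnitudes, fixed signs
```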

#### Multi-Area CTRNN
-The RNN could also contains multiple areas. Denote the neurons in the hidden layer as $ \mathcal{N} = \{ n_1, n_2, \ldots, n_{N_{hid}} \} $. The neurons within it may be partitioned into multiple areas, $ \mathcal{A} = \{A_1, A_2, \ldots, A_{N_{area}}\} $. The areas are disjoint and their union is the set of all neurons in the hidden layer, i.e., $ \mathcal{N} = \bigcup_{i=1}^{N_{area}} A_i $. Neurons within the same area may be more densely connected and even receives different inputs.
+The RNN could also contain multiple areas. Denote the neurons in the hidden layer as \( \mathcal{N} = \{ n_1, n_2, \ldots, n_{N_{hid}} \} \). The neurons within it may be partitioned into multiple areas, \( \mathcal{A} = \{A_1, A_2, \ldots, A_{N_{area}}\} \). The areas are disjoint and their union is the set of all neurons in the hidden layer, i.e., \( \mathcal{N} = \bigcup_{i=1}^{N_{area}} A_i \). Neurons within the same area may be more densely connected and even receive different inputs.
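
To make the partition concrete, the sketch below builds a block-structured connectivity mask for a two-area hidden layer, with denser connections within an area than between areas. The area sizes and probabilities are made-up illustrations, and the mask-building code is not nn4n's API.

```python
import torch

# Block-structured connectivity mask for a two-area hidden layer
# (illustrative sizes and probabilities; not nn4n's mask-building API).
sizes = [60, 40]                           # |A_1| = 60, |A_2| = 40, so N_hid = 100
p_within, p_between = 0.8, 0.1             # denser connectivity inside an area
N = sum(sizes)
bounds = [0, sizes[0], N]
mask = torch.zeros(N, N, dtype=torch.bool)
for i in range(len(sizes)):                # postsynaptic area
    for j in range(len(sizes)):            # presynaptic area
        p = p_within if i == j else p_between
        mask[bounds[i]:bounds[i + 1], bounds[j]:bounds[j + 1]] = (
            torch.rand(sizes[i], sizes[j]) < p
        )
# The mask can then gate the recurrent weight matrix elementwise (W_rec * mask).
```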

A visual illustration of the Multi-Area CTRNN:

