TinyNet

TinyNet is a straightforward implementation of a Feedforward Neural Network (FFNN) built from scratch using only Python and NumPy. Inspired by @karpathy's micrograd, this project aims to provide an educational resource for understanding the foundational components of neural networks without relying on dedicated machine-learning libraries.
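
As a rough illustration of the kind of building block involved, a single dense layer in plain NumPy can be written as below. This is a generic sketch, not TinyNet's actual API; the real implementation lives in NN/ and may be organized differently.

import numpy as np

# Generic sketch of one feedforward building block; TinyNet's own classes
# in NN/ may differ in structure and naming.
def dense_forward(x, W, b):
    """Affine transform followed by a ReLU nonlinearity: y = max(0, xW + b)."""
    return np.maximum(0.0, x @ W + b)

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 3))        # batch of 2 samples, 3 features each
W = rng.normal(size=(3, 4)) * 0.1  # weights for a 4-unit layer
b = np.zeros(4)                    # biases

h = dense_forward(x, W, b)
print(h.shape)  # (2, 4)

Stacking several such layers, each consuming the previous layer's output, gives the feedforward network that the notebook builds.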

Installation

Clone the repository to your local machine:

git clone https://github.com/nMaax/TinyNet.git
cd TinyNet

Usage

The primary code and examples are contained within the main.ipynb Jupyter Notebook. To explore and run the code:

  1. Install NumPy and Jupyter Notebook: If you don't have them installed, you can add them using pip:

    pip install notebook
    pip install numpy
  2. Launch Jupyter Notebook:

    jupyter notebook
  3. Open main.ipynb: In the Jupyter interface, navigate to the TinyNet directory and open main.ipynb.

  4. Run the Notebook: Execute the cells sequentially to build and train the neural network (a rough sketch of what such a training loop does is shown after this list).
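
To give a sense of what "build and train" means for a from-scratch FFNN, the sketch below fits a tiny one-hidden-layer network to a toy regression target using manually derived gradients. It is an illustration only and is not taken from main.ipynb; the notebook's own code, architecture, and training setup may differ.

import numpy as np

# Toy training loop: one hidden layer, mean-squared-error loss,
# gradients derived by hand via the chain rule.
rng = np.random.default_rng(1)
X = rng.normal(size=(32, 2))                 # inputs
y = (X[:, :1] - 2.0 * X[:, 1:]) ** 2         # regression target, shape (32, 1)

W1, b1 = rng.normal(size=(2, 8)) * 0.5, np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)) * 0.5, np.zeros(1)
lr = 0.05

for step in range(500):
    # forward pass
    h_pre = X @ W1 + b1
    h = np.maximum(0.0, h_pre)               # ReLU
    y_hat = h @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)

    # backward pass
    d_yhat = 2.0 * (y_hat - y) / len(X)
    dW2 = h.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    d_h = d_yhat @ W2.T
    d_hpre = d_h * (h_pre > 0)
    dW1 = X.T @ d_hpre
    db1 = d_hpre.sum(axis=0)

    # gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.4f}")

The same forward/backward/update structure appears in any from-scratch FFNN; the notebook wraps it in the reusable classes found in NN/.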

Project Structure

  • NN/: Contains the core neural network implementation.
  • main.ipynb: Jupyter Notebook demonstrating the usage of the neural network.
  • LICENSE: Project license information.
  • .gitignore: Specifies files to ignore in the repository.

Note: This project is currently under development. Features and implementations are subject to change.
