This project is a neural network implementation inspired by Andrej Karpathy's "Neural Networks: Zero to Hero" video series. The main goal is to implement deep learning concepts from scratch in a clear and understandable way.
- Autograd engine for automatic differentiation
- Basic neural network components
- Visualization tools
- Computational graph creation
- Backpropagation implementation
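The core idea behind the autograd engine can be sketched with a minimal scalar `Value` class in the style of micrograd. This is an illustrative sketch only; the names and exact API in `code/engine.py` may differ:

```python
# Minimal scalar autograd sketch (illustrative; the real engine lives in code/engine.py)
class Value:
    def __init__(self, data, _children=()):
        self.data = data          # scalar payload
        self.grad = 0.0           # d(output)/d(this node), filled in by backward()
        self._backward = lambda: None
        self._prev = set(_children)  # parents in the computational graph

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # product rule: d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the graph, then apply the chain rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a = Value(2.0)
b = Value(3.0)
c = a * b + a  # c = 2*3 + 2 = 8
c.backward()
print(c.data, a.grad, b.grad)  # 8.0, dc/da = b + 1 = 4.0, dc/db = a = 2.0
```

Each operation records its inputs, so evaluating an expression builds the computational graph as a side effect, and `backward()` walks that graph once in reverse topological order.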
```bash
# Clone the repository
git clone https://github.com/aynursusuz/micrograd.git
cd micrograd

# Install dependencies
pip install -r requirements.txt
```
- Python 3.8+
- NumPy
- Matplotlib
- Graphviz (for visualization)
- Jupyter Notebook
```
micrograd/
├── notebooks/
│   └── 01_micrograd.ipynb   # Core examples and explanations
├── code/
│   ├── engine.py            # Autograd engine
│   └── utils.py             # Helper functions
└── requirements.txt
```
- Neural Networks: Zero to Hero Video Series
- Automatic Differentiation Concepts
- Backpropagation Algorithm
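A standard way to sanity-check any backpropagation implementation is to compare an analytic derivative against a central finite-difference estimate. The function and tolerance below are illustrative choices, not part of this repository:

```python
# Finite-difference gradient check: validates an analytic derivative numerically.
def f(x):
    return x * x * x + 2.0 * x  # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2

def numerical_grad(func, x, h=1e-5):
    # central difference approximation of f'(x), accurate to O(h^2)
    return (func(x + h) - func(x - h)) / (2 * h)

x = 1.5
analytic = 3 * x ** 2 + 2       # 8.75
numeric = numerical_grad(f, x)
print(analytic, abs(analytic - numeric))  # difference should be tiny
```

The same check can be pointed at gradients produced by the autograd engine: if the analytic and numerical values disagree beyond a small tolerance, the backward pass has a bug.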
We welcome your suggestions and pull requests to improve the project! You can contribute by adding new features, fixing bugs, or improving documentation.
This project is open source and available under the MIT License (2024).