"Deep Generative Modeling": Introductory Examples
Updated Sep 22, 2024 - Jupyter Notebook
"Deep Generative Modeling": Introductory Examples
A collection of tools for neural compression enthusiasts.
Bare-bones implementations of some generative models in JAX: diffusion, normalizing flows, consistency models, flow matching, (beta-)VAEs, etc. (a minimal flow-matching sketch follows this list).
Curated list of papers and resources focused on neural compression, intended to keep pace with the anticipated surge of research in the coming years.
Official code for "Computationally-Efficient Neural Image Compression with Shallow Decoders", ICCV 2023
A LLaMA2-7B chatbot with memory, running on CPU and optimized using smooth quantization, 4-bit quantization, or Intel® Extension for PyTorch with bfloat16 (see the quantization sketch below).
JPD-SE: High-Level Semantics for Joint Perception-Distortion Enhancement in Image Compression
Exploring advanced autoencoder architectures for efficient data compression on the EMNIST dataset, focusing on high-fidelity image reconstruction with minimal information loss. The project tests various encoder-decoder configurations to optimize performance metrics such as MSE, SSIM, and PSNR, aiming for near-lossless data compression (see the metric sketch below).
An unofficial replication of NAS Without Training.
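For readers curious what such bare-bones JAX code looks like, here is a minimal sketch of the conditional flow-matching (rectified-flow) objective: interpolate linearly between a noise sample and a data sample, then regress a velocity model onto the constant displacement. The toy linear "model" and the parameter names are illustrative assumptions, not code from the repository above.

```python
import jax
import jax.numpy as jnp


def toy_velocity(params, xt, t):
    # Illustrative stand-in for a neural velocity network; ignores t.
    return params["scale"] * xt + params["bias"]


def flow_matching_loss(params, x1, key):
    """MSE between the predicted velocity and the target velocity x1 - x0."""
    k_noise, k_time = jax.random.split(key)
    x0 = jax.random.normal(k_noise, x1.shape)        # noise endpoint of the path
    t = jax.random.uniform(k_time, (x1.shape[0], 1)) # one time per batch element
    xt = (1.0 - t) * x0 + t * x1                     # point on the straight path
    target_v = x1 - x0                               # constant velocity along it
    pred_v = toy_velocity(params, xt, jnp.squeeze(t))
    return jnp.mean((pred_v - target_v) ** 2)


params = {"scale": jnp.asarray(0.1), "bias": jnp.asarray(0.0)}
x1 = jax.random.normal(jax.random.PRNGKey(1), (8, 2))   # stand-in data batch
loss, grads = jax.value_and_grad(flow_matching_loss)(params, x1, jax.random.PRNGKey(0))
```

A real implementation would swap `toy_velocity` for a neural network and feed `grads` to an optimizer such as optax.adam; the objective itself stays this short.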
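The smooth-quantization, 4-bit, and bfloat16 recipes mentioned for the LLaMA2-7B chatbot each need dedicated tooling, but the basic idea of trading weight precision for CPU efficiency can be shown with plain post-training dynamic int8 quantization in PyTorch. This is a stand-in sketch on a toy model, not the chatbot's actual pipeline.

```python
import torch
import torch.nn as nn

# Toy stand-in for a transformer block: two large linear layers.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512)).eval()

# Store nn.Linear weights as int8 and quantize activations on the fly at inference.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 512)
with torch.no_grad():
    y = quantized(x)
print(y.shape)  # torch.Size([1, 512]); int8 weights take roughly a quarter of the float32 memory
```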
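The reconstruction metrics named in the EMNIST autoencoder entry are easy to compute directly; below is a minimal NumPy sketch of MSE and PSNR on images scaled to [0, 1] (SSIM needs a windowed comparison, e.g. skimage.metrics.structural_similarity, and is omitted). The random 28x28 array is only a placeholder for an EMNIST sample and its reconstruction.

```python
import numpy as np


def mse(original, reconstructed):
    return float(np.mean((original - reconstructed) ** 2))


def psnr(original, reconstructed, max_val=1.0):
    err = mse(original, reconstructed)
    if err == 0:
        return float("inf")                       # identical images
    return 10.0 * float(np.log10(max_val ** 2 / err))


rng = np.random.default_rng(0)
img = rng.random((28, 28))                                           # placeholder "image"
recon = np.clip(img + 0.05 * rng.normal(size=img.shape), 0.0, 1.0)   # noisy "reconstruction"
print(f"MSE:  {mse(img, recon):.5f}")
print(f"PSNR: {psnr(img, recon):.2f} dB")
```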