diff --git a/README.md b/README.md
index e869c5f2..13dca0b7 100644
--- a/README.md
+++ b/README.md
@@ -16,11 +16,11 @@

-  Sponsored by RunPod
+  Supported by
+  RunPod Logo
 AutoAWQ is an easy-to-use package for 4-bit quantized models. AutoAWQ speeds up models by 3x and reduces memory requirements by 3x compared to FP16. AutoAWQ implements the Activation-aware Weight Quantization (AWQ) algorithm for quantizing LLMs. AutoAWQ was created from and improves upon the [original work](https://github.com/mit-han-lab/llm-awq) by MIT.
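
The ~3x memory figure follows from simple arithmetic: 4-bit weights occupy a quarter of the space of FP16 weights, minus the overhead of quantization metadata. A rough back-of-the-envelope sketch (the group size of 128 and FP16 scale/zero-point storage are illustrative assumptions, not AutoAWQ's exact on-disk layout):

```python
# Rough memory estimate for a 7B-parameter model (illustrative arithmetic only).
params = 7_000_000_000
group_size = 128  # assumed: weights quantized in groups of 128

# FP16 stores 2 bytes per weight.
fp16_bytes = params * 2

# 4-bit stores 0.5 bytes per weight, plus an assumed FP16 scale and
# zero point (2 bytes each) per group of 128 weights.
awq_bytes = params * 0.5 + (params / group_size) * 2 * 2

ratio = fp16_bytes / awq_bytes
print(f"weight memory reduction: ~{ratio:.1f}x")
```

The per-group metadata erodes the ideal 4x reduction only slightly, which is why the headline number lands around 3x once activations and other runtime buffers are accounted for.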