Add support for groupwise quantization for int8 weight only quantization #4570
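The PR title names groupwise quantization for int8 weight-only quantization: instead of one scale per tensor (or per output channel), each contiguous group of weights along the input dimension gets its own scale, so a single outlier only degrades its own group. A minimal NumPy sketch of that idea follows; the function names, the symmetric-scale choice, and the `group_size` handling are illustrative assumptions, not the PR's actual implementation.

```python
import numpy as np

def groupwise_int8_quantize(weight, group_size):
    """Hypothetical helper: symmetric int8 quantization of a 2-D weight,
    with one scale per contiguous group of `group_size` input columns."""
    out_features, in_features = weight.shape
    assert in_features % group_size == 0, "in_features must divide evenly into groups"
    groups = weight.reshape(out_features, in_features // group_size, group_size)
    # Symmetric per-group scale: map the largest |w| in each group to 127.
    scales = np.abs(groups).max(axis=-1, keepdims=True) / 127.0
    scales = np.where(scales == 0, 1.0, scales)  # guard against all-zero groups
    q = np.clip(np.round(groups / scales), -127, 127).astype(np.int8)
    return q.reshape(weight.shape), scales.squeeze(-1)

def groupwise_int8_dequantize(q, scales, group_size):
    """Inverse of the sketch above: rescale each int8 group back to float."""
    out_features, in_features = q.shape
    groups = q.reshape(out_features, in_features // group_size, group_size).astype(np.float32)
    return (groups * scales[..., None]).reshape(out_features, in_features)

# Round-trip a small random weight; reconstruction error is bounded by
# half the per-group scale, since rounding moves each value at most 0.5 step.
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 8)).astype(np.float32)
q, s = groupwise_int8_quantize(w, group_size=4)
w_hat = groupwise_int8_dequantize(q, s, group_size=4)
print(np.abs(w - w_hat).max())
```

Smaller groups give tighter scales (better accuracy) at the cost of storing more scale values, which is the trade-off a `group_size` knob exposes.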

Annotations: 1 warning

test (CPU 2.4, linux.4xlarge, torch==2.4.0 --index-url https://download.pytorch.org/whl/cpu, cpu) / linux-job: succeeded Oct 19, 2024 in 8m 40s