Enable float8 attention support (q/k/v) #5641

Annotations

1 error

test-nightly (CUDA Nightly, linux.g5.12xlarge.nvidia.gpu, --pre torch --index-url https://downloa...  /  linux-job

failed Dec 7, 2024 in 52m 2s