Post-Training Quantization to Trit-Planes for Large Language Models

Understanding how trit-plane quantization compresses LLMs without retraining.

— Feb 8, 2026