What I Read: Distillation.
https://www.quantamagazine.org/how-distillation-makes-ai-models-smaller-and-cheaper-20250718/
How Distillation Makes AI Models Smaller and Cheaper
Amos Zeeberg
7/18/25
“Fundamental technique lets researchers use a big, expensive ‘teacher’ model to train a ‘student’ model for less.”
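The teacher-to-student idea in that subtitle is usually implemented by training the student on the teacher's "soft targets" (its full output distribution) rather than hard labels. A minimal illustrative sketch in plain Python follows; the function names, logit values, and temperature are my own choices for illustration, not from the article:

```python
import math

def softmax(logits, temperature=1.0):
    # Soften a logit vector; a higher temperature flattens the distribution,
    # exposing the teacher's relative confidence across all classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened "soft targets" and the
    # student's softened predictions: the core objective in Hinton-style
    # knowledge distillation.
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher_probs, student_probs))
```

The loss is minimized when the student's distribution matches the teacher's, so gradient descent on it pushes the small model toward the big model's behavior at a fraction of the training cost.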