A collection of dataset distillation papers.
Updated Jan 27, 2023
[CVPR 2025] "Early-Bird Diffusion: Investigating and Leveraging Timestep-Aware Early-Bird Tickets in Diffusion Models for Efficient Training" by Lexington Whalen, Zhenbang Du, Haoran You, Chaojian Li, Sixu Li, and Yingyan (Celine) Lin.
Repository for the SS24 (summer semester 2024) Efficient Machine Learning class at FSU Jena.
This project investigates the efficacy of combining context distillation with parameter-efficient tuning methods such as LoRA and QLoRA, as well as with traditional fine-tuning, using Facebook's pre-trained OPT-125M model.
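As context for the project above, a minimal sketch of the LoRA idea it builds on: instead of updating a full weight matrix, a low-rank update B @ A is trained while the pre-trained weight stays frozen. This NumPy example is illustrative only (it is not the project's code); the dimensions and rank are assumptions chosen for the demonstration, with the 768 hidden size matching OPT-125M.

```python
# Illustrative LoRA sketch: learn a rank-r update B @ A to a frozen weight W,
# so only the low-rank factors are trained. Dimensions are assumptions for
# illustration (768 matches OPT-125M's hidden size; r=8 is a typical rank).
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 768, 768, 8

W = rng.normal(size=(d_out, d_in))     # frozen pre-trained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))               # trainable up-projection, zero-init so the update starts at 0
alpha = 16.0                           # scaling hyperparameter

def lora_forward(x):
    # Base path plus the scaled low-rank update (alpha / r) * B @ A @ x.
    return W @ x + (alpha / r) * (B @ (A @ x))

full_params = W.size            # what full fine-tuning would update
lora_params = A.size + B.size   # what LoRA updates instead
print(f"full fine-tuning params: {full_params}, LoRA params: {lora_params}")
```

At this rank, LoRA trains roughly 2% of the parameters that full fine-tuning of the same matrix would update, which is the efficiency motivation behind pairing it with small models like OPT-125M.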