PulseAugur
DIP-KD

PulseAugur coverage of DIP-KD — every cluster mentioning DIP-KD across labs, papers, and developer communities, ranked by signal.

Total · 30d: 1 (1 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 1 (1 over 90d)
TIER MIX · 90D
RECENT · PAGE 1/1 · 1 TOTAL
  1. RESEARCH · CL_08520

    New knowledge distillation methods enhance model compression and diversity

    Two new research papers propose methods to improve black-box knowledge distillation, a technique for compressing large AI models into smaller ones without direct access to the teacher model's training data. The first pa…
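The cluster summary above describes black-box knowledge distillation, where a compact student model is trained to mimic a large teacher using only the teacher's outputs. As a rough illustration (not the method from either paper, whose details are not given here), the classic distillation loss compares temperature-softened teacher and student distributions; the function names and temperature value below are illustrative choices:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    In a black-box setting the teacher_logits would come from querying
    the teacher model's API; its weights and training data stay hidden.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    # The conventional T^2 factor keeps gradient magnitudes comparable
    # across temperatures (Hinton et al.'s original KD formulation).
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0
    )
```

The loss is zero when the student reproduces the teacher's distribution exactly and grows as the two diverge, which is what makes it usable as a training signal even without teacher internals.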