PulseAugur

X-MoE

PulseAugur coverage of X-MoE — every cluster mentioning X-MoE across labs, papers, and developer communities, ranked by signal.

Total · 30d: 1 (1 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 1 (1 over 90d)
TIER MIX · 90D
RECENT · PAGE 1/1 · 1 TOTAL
  1. RESEARCH · CL_20524

    Piper framework boosts MoE model training efficiency with resource modeling

    A new framework called Piper addresses the challenges of training large Mixture-of-Experts (MoE) models on high-performance computing (HPC) platforms. Piper uses resource modeling to optimize tra…