PulseAugur

LFM2-24B-A2B

PulseAugur coverage of LFM2-24B-A2B: every cluster mentioning the model across labs, papers, and developer communities, ranked by signal.

Total · 30d: 1 (1 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 1 (1 over 90d)
RECENT · 1 TOTAL
  1. RESEARCH · CL_13954

    Liquid AI releases LFM2-24B-A2B, an efficient 24B parameter MoE model

    Liquid AI has released an early checkpoint of its LFM2-24B-A2B model, a sparse Mixture of Experts (MoE) architecture with 24 billion total parameters and 2 billion active parameters per token. This model demonstrates th…
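The total-versus-active split described above is the defining property of a sparse MoE: every expert contributes to the total parameter count, but each token only passes through the few experts the router selects. The sketch below illustrates that accounting. The expert count, top-k, and layer sizes are illustrative assumptions chosen so the ratio resembles a 24B-total / roughly 2B-active split; they are not the published LFM2-24B-A2B configuration.

```python
# Minimal sketch of sparse-MoE parameter accounting: "total" counts
# every expert, while "active per token" counts only the top-k experts
# the router selects for a given token.

def moe_param_counts(shared_params: int, n_experts: int,
                     expert_params: int, top_k: int) -> tuple[int, int]:
    """Return (total_params, active_params_per_token)."""
    total = shared_params + n_experts * expert_params
    active = shared_params + top_k * expert_params
    return total, active

# Hypothetical numbers, purely for illustration.
total, active = moe_param_counts(
    shared_params=1_000_000_000,   # embeddings, attention, router (assumed)
    n_experts=64,                  # assumed expert count
    expert_params=360_000_000,     # assumed per-expert FFN size
    top_k=2,                       # assumed experts routed per token
)
print(f"total ~ {total / 1e9:.1f}B, active/token ~ {active / 1e9:.2f}B")
# prints: total ~ 24.0B, active/token ~ 1.72B
```

With this kind of split, inference cost scales with the active parameters rather than the total, which is why a 24B-total model can run with a compute footprint closer to a small dense model.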