ENTITY
Deco
PulseAugur coverage of Deco — every cluster mentioning Deco across labs, papers, and developer communities, ranked by signal.
Total · 30d: 77 (77 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 63 (63 over 90d)
TIER MIX · 90D
TIMELINE
- 2026-05-11 · research_milestone · A new paper introduces the DECO sparse Mixture-of-Experts architecture.
SENTIMENT · 30D
1 day with sentiment data
RECENT · PAGE 1/1 · 2 TOTAL
- New research optimizes Sparse Mixture-of-Experts for efficient LLM scaling
  Researchers are exploring new methods to optimize Sparse Mixture-of-Experts (SMoE) models, which are crucial for scaling large language models efficiently. One paper reveals a geometric coupling between routers and expe…
- Deco framework uses LLMs and AR to create digital embodiments of physical objects
  Researchers have developed Deco, a framework that creates AI companions by synchronizing digital embodiments with users' physical objects. This system uses Large Language Models and Augmented Reality to extend the emoti…