PulseAugur
research

Apple's OpenELM model achieves strong performance with smaller dataset

Apple has released OpenELM, a new family of open-source language models. The models perform competitively on benchmarks while using roughly half the pretraining data of comparable open models such as OLMo. OpenELM achieves this efficiency through a layer-wise scaling architecture inspired by the DeLighT paper, allocating parameters non-uniformly across transformer layers, making it a notable advancement in resource-conscious AI development.

Summary written by gemini-2.5-flash-lite from 1 source.

RANK_REASON Release of open-source language models from a major tech company, with performance claims backed by a paper.

Read on Smol AINews →

COVERAGE [1]

  1. Smol AINews TIER_1

    Apple's OpenELM beats OLMo with 50% of its dataset, using DeLighT

    **Apple** advances its AI presence with the release of **OpenELM**, its first relatively open large language model available in sizes from **270M to 3B** parameters, featuring a novel layer-wise scaling architecture inspired by the **DeLighT** paper. Meanwhile, **Meta's LLaMA 3**…
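The "layer-wise scaling" mentioned above allocates capacity unevenly across transformer layers instead of repeating one identical block. A minimal sketch of the idea, assuming simple linear interpolation of attention-head counts and FFN width multipliers (the function name and exact interpolation scheme are illustrative, not Apple's published recipe):

```python
def layerwise_scaling(num_layers, min_heads, max_heads, min_ffn_mult, max_ffn_mult):
    """Sketch of layer-wise scaling: interpolate per-layer widths from the
    first layer (smallest) to the last layer (largest)."""
    configs = []
    for i in range(num_layers):
        t = i / (num_layers - 1)  # 0.0 at the first layer, 1.0 at the last
        heads = round(min_heads + t * (max_heads - min_heads))
        ffn_mult = min_ffn_mult + t * (max_ffn_mult - min_ffn_mult)
        configs.append({"layer": i, "heads": heads, "ffn_mult": round(ffn_mult, 2)})
    return configs

# Example: a 4-layer model growing from 4 to 8 heads and 2x to 4x FFN width.
cfg = layerwise_scaling(4, 4, 8, 2.0, 4.0)
```

Compared with a uniform stack, this lets early layers stay narrow and cheap while later layers get more heads and wider feed-forward blocks, which is how such models can match larger uniform baselines with a smaller parameter budget.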