PulseAugur

EleutherAI releases GPT-NeoX-20B, largest open-source language model

EleutherAI has released GPT-NeoX-20B, a 20-billion-parameter open-source language model trained with its GPT-NeoX framework. At release, it is the largest publicly available pretrained autoregressive language model. The release aims to facilitate research into the safe use of AI systems; the model is available now via inference services, with a full public release scheduled after a seven-day delay.

Summary written by gemini-2.5-flash-lite from 1 source.

RANK_REASON Release of a large open-source language model from a non-frontier lab, accompanied by a paper and datasheet.

Read on EleutherAI Blog →


COVERAGE [1]

  1. EleutherAI Blog TIER_1

    Announcing GPT-NeoX-20B

    Announcing GPT-NeoX-20B, a 20 billion parameter model trained in collaboration with CoreWeave.