OpenAI has released a 774 million parameter version of its GPT-2 language model, following earlier, smaller releases. The release is accompanied by a technical report detailing research into the model's societal impact, including its potential for misuse and the difficulty of detecting AI-generated text. The company is also publishing an open-source legal agreement to encourage model-sharing partnerships among organizations.
Summary written by gemini-2.5-flash-lite from 1 source.