PulseAugur
research · [2 sources]

Talkie-1930: New 13B LLM trained on pre-1931 English for historical research

Researchers have released Talkie-1930, an open-weight language model with 13 billion parameters, trained exclusively on English text published before 1931. Its primary purpose is to enable contamination-free AI research and experiments in historical reasoning.
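The contamination-free property rests on a hard publication-date cutoff for the training corpus. As a minimal sketch of that idea (hypothetical corpus records; the actual Talkie-1930 data pipeline has not been published in these sources):

```python
from datetime import date

# Hypothetical corpus records for illustration only.
corpus = [
    {"title": "A Study in Scarlet", "published": date(1887, 1, 1)},
    {"title": "Brave New World", "published": date(1932, 2, 4)},
]

# Keep only documents published strictly before 1931.
CUTOFF = date(1931, 1, 1)

def pre_cutoff(records, cutoff=CUTOFF):
    """Drop any document published on or after the cutoff date."""
    return [r for r in records if r["published"] < cutoff]

train_set = pre_cutoff(corpus)
print([r["title"] for r in train_set])  # only the 1887 title survives
```

Because no post-cutoff text can enter training, evaluations on later material cannot be inflated by memorization.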

Summary written by gemini-2.5-flash-lite from 2 sources.

IMPACT Enables contamination-free historical reasoning experiments and generalization research.

RANK_REASON Release of an open-weight language model with a specific training dataset for research purposes.


COVERAGE [2]

  1. Mastodon — fosstodon.org TIER_1 · [email protected] ·

    Researchers have unveiled Talkie-1930, a 13 billion parameter open-weight language model trained exclusively on pre-1931 English text. The vintage language model is designed for contamination-free AI research and historical reasoning experiments. https://www.marktechpost.com/202…

  2. Mastodon — mastodon.social TIER_1 · Swedish (SV) · redaktionen ·

    Talkie: A Journey Back in Time with a Vintage Language Model

    Talkie: A Journey Back in Time with a Vintage Language Model (original post in Swedish) https://redaktionen.net/artikel/641 #ai #svtech