PulseAugur
research

Meta releases Tribe v2, a foundation model of brain responses to sound, sight, and language

Meta AI has released Tribe v2, a foundation model that simulates human brain responses to auditory, visual, and linguistic stimuli. The model can be partially explored through a mobile demo and is accompanied by a research paper detailing its grounding in in-silico neuroscience. The project's code is publicly available on GitHub, facilitating further research and development in the field.

Summary written by gemini-2.5-flash-lite from 1 source.

IMPACT Provides a new tool for neuroscience research, enabling in-silico modeling of brain responses to complex stimuli.

RANK_REASON Release of a new model and accompanying paper from a major research lab.

Read on X — Yann LeCun →

COVERAGE [1]

  1. X — Yann LeCun · TIER_1

    RT Jean-Rémi King: ✨🧠 Tribe v2, our latest model of human brain responses to sound, sight and language can now be (partly) explored on your phone📱:

    ▶️ demo: https://aidemos.atmeta.com/tribev2/
    📄 paper: https://ai.meta.com/research/publications/a-foundation-mode…