PulseAugur
research

Google unveils SensorLM and LSM-2 for advanced wearable sensor data analysis

Google Research has introduced SensorLM and LSM-2, two new families of foundation models designed to interpret data from wearable sensors. SensorLM connects multimodal sensor signals to natural language, generating descriptive text from raw data to enable a deeper understanding of health and activity. LSM-2 uses a novel self-supervised learning approach called Adaptive and Inherited Masking (AIM) to learn effectively from incomplete or fragmented sensor data, addressing a common limitation of wearable technology. The models were trained on extensive datasets, including nearly 2.5 million person-days of de-identified data from over 100,000 individuals, and set new benchmarks in sensor data understanding and activity recognition.
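The AIM idea described above, treating gaps already present in the data as one kind of mask and adding artificial masks on observed positions, can be sketched roughly as follows. This is an illustrative reconstruction, not Google's implementation: the function name, the masking ratio, and the list-based signal representation are all assumptions.

```python
import random

def aim_masks(signal, artificial_ratio=0.3, seed=0):
    """Illustrative sketch of AIM-style masking (not Google's code).

    Positions already missing in the stream (None) form the 'inherited'
    mask. A random subset of observed positions gets an 'artificial'
    mask, so the model has ground truth to reconstruct during training.
    The model is shown neither set of positions (the combined mask).
    """
    rng = random.Random(seed)
    inherited = [v is None for v in signal]
    observed = [i for i, missing in enumerate(inherited) if not missing]
    extra = set(rng.sample(observed, int(len(observed) * artificial_ratio)))
    artificial = [i in extra for i in range(len(signal))]
    combined = [a or b for a, b in zip(inherited, artificial)]
    return inherited, artificial, combined

# Example: a 10-sample signal with two real sensor gaps.
sig = [0.1, None, 0.3, 0.4, None, 0.6, 0.7, 0.8, 0.9, 1.0]
inh, art, comb = aim_masks(sig)
```

Because artificial masks are drawn only from observed positions, the two mask sets never overlap, and reconstruction loss can be computed exactly on the artificially masked values.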

Summary written by gemini-2.5-flash-lite from 2 sources.



COVERAGE [2]

  1. Google AI / Research TIER_1 · SensorLM: Learning the language of wearable sensors (Generative AI)

  2. Google AI / Research TIER_1 · LSM-2: Learning from incomplete wearable sensor data (Generative AI)