PulseAugur
research

FunRec reconstructs interactive 3D scenes from egocentric videos

Researchers have developed FunRec, a method for creating interactive 3D digital twins of indoor environments from egocentric RGB-D videos. The approach operates on unconstrained human interaction sequences, automatically identifying articulated parts, estimating their motion, and reconstructing both static and dynamic geometry. FunRec improves part segmentation and pose accuracy over existing methods, enabling applications such as simulation-ready mesh export and robot-scene interaction.

Summary written by gemini-2.5-flash-lite from 1 source.

IMPACT Enables creation of simulation-ready 3D environments from real-world interaction data.

RANK_REASON This is a research paper detailing a new method for 3D scene reconstruction.


COVERAGE [1]

  1. arXiv cs.CV · Alexandros Delitzas, Chenyangguang Zhang, Alexey Gavryushin, Tommaso Di Mario, Boyang Sun, Rishabh Dabral, Leonidas Guibas, Christian Theobalt, Marc Pollefeys, Francis Engelmann, Daniel Barath

    FunRec: Reconstructing Functional 3D Scenes from Egocentric Interaction Videos

    arXiv:2604.05621v2 (replacement). Abstract: We present FunRec, a method for reconstructing functional 3D digital twins of indoor scenes directly from egocentric RGB-D interaction videos. Unlike existing articulated-reconstruction methods, which rely on controlled setup…