Projector-camera system for touch and touchless interactions
FXPAL’s research in Smart Spaces explores human interaction within smart environments.
Using state-of-the-art spatial sensors such as depth cameras, acoustic arrays, and tracking devices, we create sensitive-space applications for interacting with large data sets, with complex systems, and with other people. Our “Smart Space” multiple-display environments include media walls that support direct-input sensing as well as interaction with users’ wearable and mobile devices (e.g., Google Glass). Our goal is to provide augmented spaces where local and remote collaboration is informed by real-time, ambient, and archival data streams combined with live input.
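A common way such projector-camera systems sense touch on an ordinary surface is to compare a live depth image against a captured background of the empty surface: pixels that sit within a thin band just above the surface are treated as fingertip contacts. The sketch below illustrates that idea only; the function name, thresholds, and flood-fill blob grouping are illustrative assumptions, not FXPAL’s implementation.

```python
import numpy as np

def detect_touches(depth, background, band=(3.0, 25.0), min_area=20):
    """Return (row, col) centroids of candidate touch points.

    depth, background: 2-D per-pixel distances to the camera, in mm.
    band: (lo, hi) height above the surface, in mm, counted as a touch.
    min_area: minimum blob size in pixels, to reject sensor noise.
    NOTE: illustrative sketch; real systems also calibrate the camera
    to the projected image and smooth the depth stream over time.
    """
    lo, hi = band
    height = background.astype(float) - depth.astype(float)
    mask = (height >= lo) & (height <= hi)

    touches = []
    seen = np.zeros(mask.shape, dtype=bool)
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not seen[r, c]:
                # flood-fill one 4-connected blob of in-band pixels
                stack, blob = [(r, c)], []
                seen[r, c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if len(blob) >= min_area:
                    ys, xs = zip(*blob)
                    touches.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return touches
```

Because only the height band distinguishes a touch from a hover, the same pipeline supports touchless input: widening or shifting the band turns surface contacts into mid-air hover detection over the projected display.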
Smart Spaces Projects
Intelligent speech-to-text for Google Glass
Full-body gestural interfaces for large displays
Finger pose estimation for enhanced touch input
Virtual and mixed reality in an industrial setting
Dynamic 3D virtual models of physical spaces
Speculative designs, scenarios, and vision videos