Research

Understanding and shaping perception in XR.

Our research focuses on perception, embodiment, shared systems, and adaptive interfaces. We work with universities and research institutions worldwide to turn experimental ideas into usable, cutting-edge systems.

Research Themes

Five directions that shape how we investigate XR.

Theme 01

Adaptive Perception

How environments can respond to attention, task demands, and changing user state. We study perception as a dynamic loop between sensing, inference, and interface adaptation.

Adaptive XR systems can modulate pacing, atmosphere, and guidance based on how perception changes over time.

Theme 02

Shared Perception

We investigate how people can align attention, action, and understanding across distance. The emphasis is not only on communication, but on enabling a common perceptual frame.

Shared gaze, gesture, and spatial context make collaboration feel less like explanation and more like joint perception.

Theme 03

Embodied Interaction

Interaction is shaped by posture, movement, timing, and presence. We explore interfaces that treat the body not as input alone, but as the site where meaning and feedback are formed.

Embodied interaction research asks how systems can become more legible, expressive, and responsive through movement.

Theme 04

Physiological Computing

Signals such as breathing, arousal, comfort, and stress can become part of the interface itself. This direction studies when physiological measures can support adaptation without overwhelming users.

Physiological computing reframes the interface as something that can sense and respond to internal state, not just explicit commands.
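As an illustration of this direction, the sketch below maps one physiological signal (breathing rate) to a coarse pacing decision. It is a hypothetical example, not the lab's implementation: the smoothing step stands in for the "without overwhelming users" concern by damping noisy readings before any adaptation fires, and all names and thresholds are invented for the sketch.

```python
# Hypothetical sketch: adapt interface pacing from a physiological
# signal without overreacting to momentary noise. Thresholds and
# signal names are illustrative assumptions, not measured values.

def smooth(samples, alpha=0.2):
    """Exponentially smooth a stream of breathing-rate samples."""
    level = samples[0]
    for s in samples[1:]:
        level = alpha * s + (1 - alpha) * level
    return level

def pacing_mode(breaths_per_min):
    """Map a smoothed breathing rate to a coarse pacing decision."""
    if breaths_per_min > 20:   # elevated rate: slow the experience down
        return "calm"
    if breaths_per_min < 10:   # very low rate: user likely settled in
        return "engaged"
    return "neutral"

readings = [14, 15, 22, 23, 24, 25]   # simulated breathing rates (bpm)
print(pacing_mode(smooth(readings)))
```

Note how smoothing changes the outcome: the raw final reading (25 bpm) would trigger "calm" immediately, but the smoothed value stays below the threshold, so the interface waits for a sustained change before adapting.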

Theme 05

Intelligent Interfaces

We study interfaces that infer context, coordinate across modalities, and adapt to users in real time. Intelligence here is not decoration, but the structure that makes complex systems usable.

Intelligent interfaces connect sensing, representation, and action so the system can respond in context rather than through static flows.

How We Work

Sense, measure, share, adapt.

Our approach moves from observing human and system signals to building interfaces that can respond, coordinate, and evolve in context.

01 Observe

Sense

Capture signals from users, environments, and interactions to understand what is happening in context.

02 Quantify

Measure

Turn raw signals into interpretable metrics that can support comparison, evaluation, and system design.

03 Coordinate

Share

Make attention, state, or spatial context legible across people, devices, and viewpoints.

04 Respond

Adapt

Use those insights to change pacing, guidance, representation, or behavior in real time.
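The four stages above can be sketched as a single loop. This is a minimal, hypothetical illustration of the sense → measure → share → adapt pipeline: every function name, signal, and threshold here is an assumption chosen only to make the stages concrete.

```python
# Minimal sketch of the Sense -> Measure -> Share -> Adapt loop.
# All names and thresholds are hypothetical, for illustration only.

def sense():
    """Sense: capture raw signals from users and environment (simulated)."""
    return {"gaze_stability": 0.72, "head_motion": 0.05}

def measure(raw):
    """Measure: turn raw signals into an interpretable metric."""
    # Treat steady gaze plus low head motion as high attention.
    return {"attention": raw["gaze_stability"] * (1 - raw["head_motion"])}

def share(metrics, peers):
    """Share: make the derived state legible to other devices/viewpoints."""
    return {peer: metrics for peer in peers}

def adapt(metrics):
    """Adapt: change guidance in real time based on the measured state."""
    return "reduce_guidance" if metrics["attention"] > 0.5 else "add_guidance"

raw = sense()
metrics = measure(raw)
shared = share(metrics, ["headset_b"])
print(adapt(metrics))  # attention = 0.72 * 0.95 = 0.684 -> reduce_guidance
```

In a real system each stage would run continuously and asynchronously; the point of the sketch is only that adaptation acts on interpreted, shareable state rather than on raw sensor values.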

In Collaboration with Research and Industry Partners

Ongoing work is shaped through academic collaboration, lab partnerships, and applied research contexts.

Asahi AHLab