Technologist: Boxiong Zhao, Chu Zhang
Concept: Kaiqing Huang
Music: Chu Zhang
Credits: Guijie Ding, Wangyu Pin, Yuyang Luan
“CrossReality” is an interactive experience that merges AR with installed screens, exploring the dynamic relationship between humans and their surrounding environment. Human actions subtly shape an invisible environmental field in real time, while the field in turn influences participants’ physical perceptions and emotional experiences. Each participant’s behavior affects not only their own experience but also the actions and perceptions of others within the shared space. Through their collective movements and decisions, participants generate a fluid, ever-evolving impact on the environment.
FRAMEWORK
Three participants, each wearing an AR headset, engage within a defined space where an AR magnetic sphere represents their interaction with the environment. As the sphere moves, it crosses the physical screen, which visually reflects the state of the field, symbolizing the unseen influence of their actions.
The participants observe how their movements and positions affect the virtual field, while the field, in turn, subtly guides their decisions and responses. Through this dynamic exchange, their actions shape the environment and the environment gently shapes their behavior, creating an interconnected experience.
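As a rough illustration of this feedback loop, the sketch below models the field as a 2D scalar grid that each participant’s sphere imprints on and that slowly decays over time. The grid resolution, Gaussian imprint, and Python/NumPy implementation are all assumptions made for illustration, not the installation’s actual pipeline.

```python
import numpy as np

# A minimal sketch of the shared "field" feedback loop, assuming the
# field is a 2D scalar grid sampled by the screen renderer.
FIELD_SIZE = 64  # assumed grid resolution of the screen visualization
field = np.zeros((FIELD_SIZE, FIELD_SIZE))

def deposit(field, sphere_xy, strength=1.0, radius=6.0):
    """Each participant's sphere leaves a Gaussian imprint on the field."""
    ys, xs = np.mgrid[0:FIELD_SIZE, 0:FIELD_SIZE]
    d2 = (xs - sphere_xy[0]) ** 2 + (ys - sphere_xy[1]) ** 2
    field += strength * np.exp(-d2 / (2 * radius ** 2))

def step(field, sphere_positions, decay=0.95):
    """One frame: spheres shape the field, then the field relaxes."""
    for pos in sphere_positions:
        deposit(field, pos)
    field *= decay  # the field slowly forgets, keeping it fluid
    return field

# Three participants' sphere positions, projected into screen space.
spheres = [(10, 20), (40, 32), (55, 50)]
field = step(field, spheres)
# The screen would render `field`; participants read it back and move,
# closing the human <-> field loop described above.
```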
THREE ACTS
“POSITION”

“SCALE”

“ROTATION”

The concept of "transformation" serves as the central theme of this work, explored through three acts, each representing a fundamental attribute of transformation: Position, where objects shift within space, altering their context and relationships; Scale, where dynamic changes in size affect perceived significance; and Rotation, where shifting orientations alter the viewer's point of view. Together, these three attributes illustrate the constant fluidity and adaptability of the environment, emphasizing how change redefines both form and perception.
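To make the three attributes concrete, here is a minimal sketch expressing each act as a 4x4 homogeneous transform and composing them into one. The axis choices, values, and NumPy representation are illustrative assumptions, not the work's actual scene code.

```python
import numpy as np

def translation(tx, ty, tz):
    """Position: shift an object within space."""
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

def scaling(s):
    """Scale: uniform change in size."""
    m = np.eye(4)
    m[:3, :3] *= s
    return m

def rotation_y(theta):
    """Rotation: reorient around the vertical axis."""
    c, s = np.cos(theta), np.sin(theta)
    m = np.eye(4)
    m[0, 0], m[0, 2] = c, s
    m[2, 0], m[2, 2] = -s, c
    return m

# Compose the three acts into one transform (applied right to left):
# first scale, then rotate, then position.
transform = translation(1.0, 0.0, 2.0) @ rotation_y(np.pi / 4) @ scaling(1.5)

# Apply it to a point on the sphere (homogeneous coordinates).
point = np.array([0.0, 1.0, 0.0, 1.0])
print(transform @ point)
```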
MUSIC
The host generates real-time sound effects with varying frequency and loudness by receiving the absolute positions of individual participants (including height and head rotation) and their relative positions (distances between participants).
These sound effects not only evolve dynamically but also influence the content displayed on the real-world screen, creating an immersive feedback loop between the participants’ movements and the audiovisual environment.
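Below is a minimal sketch of how such a mapping might look, assuming head positions arrive as (x, y, z) coordinates in meters and sound parameters are sent to the host over OSC using python-osc; the address routes, ranges, and mapping curves are hypothetical.

```python
import math
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

# Hypothetical host address and OSC routes; the actual project may use
# a different transport and different parameter mappings.
client = SimpleUDPClient("127.0.0.1", 9000)

def update_sound(participants):
    """Map tracked positions to frequency/amplitude control messages.

    `participants` is a list of (x, y, z) head positions in meters,
    an assumed input format for illustration.
    """
    for i, (x, y, z) in enumerate(participants):
        # Height drives pitch: e.g. 1.0-2.0 m mapped onto 220-880 Hz.
        freq = 220.0 * 2 ** ((y - 1.0) * 2)
        client.send_message(f"/participant/{i}/freq", freq)

    # Pairwise distances drive loudness: closer participants -> louder.
    for i in range(len(participants)):
        for j in range(i + 1, len(participants)):
            d = math.dist(participants[i], participants[j])
            amp = max(0.0, 1.0 - d / 5.0)  # silent beyond ~5 m apart
            client.send_message(f"/pair/{i}_{j}/amp", amp)

update_sound([(0.0, 1.6, 0.0), (1.5, 1.7, 0.5), (-1.0, 1.5, 2.0)])
```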
FURTHER EXPLORATION
We aim to project the visual display onto the ground, deepening the immersion and the connection between participants and the fluid mixed reality, so that the environment can be experienced as more alive and dynamic.
Additionally, we found that this approach offers participants greater freedom to interact with one another, while also heightening their perception of the field and encouraging more direct responses to it.