New Media Design & Interaction Design
Body Theater restores somatic rhythm, enabling everyone to make art with their body and interact with their soul. Participants are invited to dance with sound: with motion and depth data captured by Kinect, the system generates audio and visual effects that feed back to the participant in real time.
4 weeks (2020.12)
Adviser: Zhigang Wang
Equipment: Kinect, camera, PC, projector
Max/MSP & Jitter developer
Movement and visual effects both act on the human eye, so we needed to find correlations between human movement and imagery that keep the two synchronized:
between dance moves and auditory stimuli, between body curves and movement, and between the melody and rhythm of the music.
Movement to audio
Movement to vision
THE VIDEO OF BODY THEATER
Video duration: 2'29''
WHY DO WE WANT TO DISCUSS THE BODY?
"Embodied Cognition" in Psychology
The absence of the body as a protagonist
In the era of screens, the body and bodily interaction are often ignored; the body has long been confined to the role of audience member, a "bystander" that merely listens and watches.
Embodied cognition emphasizes the role the body plays in cognitive processes.
The body's anatomy, how it moves, and how it feels and experiences that movement together determine how we perceive the world.
Cognition is shaped by our bodies and how they move.
The concept of Body Theater
In Body Theater, the participant is both audience and performer; this breaks the fixed audience identity and increases the sense of participation, immersion, and agency.
Everyone gets the opportunity to use their body to create music and video, and to interact with their soul.
LOGIC OF MAPPING
The participant observes and experiments with the music and images in the space,
discovers the logic of the interaction, and begins to "create".
The participant's movement and the real-time generated music and video overlay each other,
and new visual and auditory stimuli act on the participant again and again…
HOW TO REALIZE BODY THEATER?
Ableton Live 10 Suite
Kinect skeleton joints and depth information are used to classify movement.
Sound effects are controlled with Max/MSP (volume, delay, sine wave value, vibrato depth/rate).
Visual effects are controlled with Max/Jitter.
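The exact mapping rules are not documented here; as a minimal sketch, assuming normalized hand height and motion speed are the features driving the Max patch (the specific formulas and parameter names below are illustrative, not the project's actual ones):

```python
def clamp(x, lo=0.0, hi=1.0):
    """Clamp a value into [lo, hi]."""
    return max(lo, min(hi, x))

def motion_to_audio_params(hand_y, motion_speed):
    """Map normalized movement features to audio parameters.

    hand_y: hand height, 0 (floor) .. 1 (overhead) -- hypothetical feature
    motion_speed: joint speed, 0 (still) .. 1 (fast) -- hypothetical feature
    Returns a dict of parameters a Max patch could receive.
    """
    return {
        "volume": clamp(motion_speed),                      # faster movement -> louder
        "delay_ms": 50 + 450 * (1 - clamp(motion_speed)),   # stillness -> longer delay tail
        "sine_hz": 110 + 770 * clamp(hand_y),               # raised hand -> higher pitch
        "vibrato_depth": clamp(hand_y) * 0.5,               # higher hand -> deeper vibrato
    }
```

The design intent is continuous, legible feedback: each parameter changes monotonically with one body feature, so the participant can discover the mapping by experimenting.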
Kinect serves as the signal input, capturing the participant's movement, which is then categorized according to rules.
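The rule set itself is not specified on this page; a toy sketch of rule-based categorization from skeleton joints, using Kinect v2 joint names (the thresholds and category names are assumptions for illustration):

```python
def classify_pose(joints):
    """Categorize a pose from a dict of joint name -> (x, y, z) in meters.

    Joint names follow Kinect v2 conventions (Head, HandLeft, HandRight);
    the rules and thresholds are illustrative, not the project's actual ones.
    """
    head_y = joints["Head"][1]
    hands_y = max(joints["HandLeft"][1], joints["HandRight"][1])
    spread = abs(joints["HandLeft"][0] - joints["HandRight"][0])
    if hands_y > head_y:       # a hand above the head
        return "hands_raised"
    if spread > 1.2:           # arms stretched wide apart
        return "arms_open"
    return "neutral"
```

Each category can then trigger a different audio or visual preset in the Max patch.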
Visual Studio Code is used to read Kinect's .xef files, extract human skeleton and depth information, compile the data into .csv, and send it into Max for interaction.
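A minimal sketch of that last step, assuming each .csv row holds a joint name and x/y/z coordinates, packed into an OSC message that Max's [udpreceive] object can decode (the column names, the /joint address, and the port are assumptions):

```python
import csv
import socket
import struct

def osc_message(address, floats):
    """Encode a minimal OSC message: padded address, type tags, big-endian float32s."""
    def pad(b):
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode()) + pad(("," + "f" * len(floats)).encode())
    for v in floats:
        msg += struct.pack(">f", v)
    return msg

def send_skeleton_csv(path, host="127.0.0.1", port=7400):
    """Stream joint rows from a CSV (columns: joint,x,y,z) to Max over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            data = osc_message("/joint/" + row["joint"],
                               [float(row["x"]), float(row["y"]), float(row["z"])])
            sock.sendto(data, (host, port))
```

On the Max side, a [udpreceive 7400] object followed by [route /joint/Head …] would unpack the per-joint coordinates for the sound and visual patches.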