Immersive Live Music Telepresence Experience
Collaboration with Yangyang Yang, Kylie Kramer, Bobby Tuohy, and Hsin-Ju Lin
Professor: Hiroshi Ishii
Teaching Assistants: Alice Hong, Jack Forman, and Zhipeng Liang
MIT MAS.834 | Tangible Interfaces | Final Project
MIT Media Lab | Fall 2020
ABSTRACT
LiveHouse is an interactive telepresence system that allows users to experience a live concert with other audience members remotely, from the safety of their homes. Leveraging tangibility, kinesthetic sensing, and telepresence, audience members can customize their experience of a live music event; two-way interaction between the audience and the performer is mediated by a variety of sensors through a central server. LiveHouse also explores artistic expression from the performer's point of view: the performer creates live aural and visual art while engaging audience members through a tangible audience interface kit, lights, and video. Altogether, the system is intended to recreate the social and hedonic experience of a live concert and to engender serendipitous moments of expression through dance.

INTRODUCTION
The global COVID-19 pandemic forced governments to implement policies reducing social interaction to slow the spread of the coronavirus. An unintended consequence of society-wide quarantine is social isolation, with effects on mental health, myriad industries, and human collaboration.
Like many others affected by social distancing, the Fall 2020 cohort of MIT's Tangible Interfaces course, MAS.834, met remotely. The section benefited from the remote teleworking tools now common in professional and academic settings, yet teams still found themselves trying to recreate the collaboration that occurs naturally in an in-person classroom. This by-product of social distancing forced students not only to consider the theoretical applications of human-computer interaction but also to live through the actual experimentation of a lab-based tangible interfaces course, conducted entirely remotely.
For the course final project, Team LiveHouse, a group of five students and professionals, began ideation with questions about socializing during the pandemic while exploring a shared space for artistic expression. The ideation was sparked by group members missing live music performances and longing for a good concert. A few members had heard anecdotally from friends who are musical artists and who have suffered particularly during the pandemic, as concerts were canceled and large events shut down.
Team LiveHouse considered what makes a live concert special compared to watching a pre-recorded performance on a screen. The team recalled the elements of a concert: the vibration of the sound, the tactile presence of someone dancing near you (and maybe bumping into you), and the interactivity between the audience and the artists, like bouncing a beach ball through the crowd or asking someone to dance.
The team developed an interactive telepresence system that allows users to experience a live concert with others from the safety of their own homes.

SYSTEM ARCHITECTURE
The DJ and the audience are connected via a live-streaming platform. As seen in Figure 1, the DJ broadcasts live video and 360° audio to all participants through the platform. Participants are divided into groups of 10 to 20 people, each group assigned to one virtual room. In their physical rooms, participants may project the video on a wall to create an immersive experience. Each participant has an Audience Input Kit (AIK) that captures their audio and motion input; with the kit, a participant can control an avatar on the platform and dance with it. Figure 2 shows the technical system architecture: through a central server, one host communicates with multiple clients, and every participant views the concert in a browser. On the client side, an Adafruit Circuit Playground Bluefruit talks to the client browser to control the participant's avatar (a sketch of this relay appears after the figure captions below).
Figure 1: The LiveHouse system architecture
Figure 2: The LiveHouse technical system architecture
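The write-up does not include server code; the following is a minimal sketch of how the one-host, many-clients relay described above could work, assuming a Python server built on the third-party websockets library. The role-registration message and JSON shape are illustrative assumptions, not the project's actual protocol.

```python
# Minimal relay sketch: one host (the DJ) broadcasts to all connected
# audience clients. Assumes the third-party `websockets` library
# (pip install websockets); the message format is illustrative.
import asyncio
import json

import websockets

clients = set()        # audience browser connections
host = None            # the DJ's connection, once registered

async def handler(ws):
    global host
    try:
        hello = json.loads(await ws.recv())   # first message declares a role
        if hello.get("role") == "host":
            host = ws
            async for msg in ws:              # relay DJ commands to everyone
                websockets.broadcast(clients, msg)
        else:
            clients.add(ws)
            async for msg in ws:              # forward audience input (e.g.,
                if host is not None:          # avatar scale/angle) to the DJ
                    await host.send(msg)
    finally:
        clients.discard(ws)
        if ws is host:
            host = None

async def main():
    async with websockets.serve(handler, "0.0.0.0", 8765):
        await asyncio.Future()                # run forever

asyncio.run(main())
```

Routing everything through one server keeps the browser clients simple: each client needs only a WebSocket connection to the server and a local link to its AIK.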
INTERACTION
Audience's Tangible Control
With the microphone and accelerometer integrated into the Audience Input Kit (AIK), a participant controls their avatar in the browser using audio and motion. The size of the avatar corresponds to the sound amplitude: clapping, cheering, or tapping on the AIK lights up the kit and scales the avatar (Figure 3). The angle of the avatar changes with the accelerometer input, so when a participant dances with the AIK, their avatar "dances" along (Figure 4). The color of the AIK, the participant's room lights (if any), and the avatar's background are synchronized; shaking the AIK triggers a color change (Figure 5). A sketch of the device-side logic follows the figure captions below.

Figure 3: Clapping, cheering, or tapping lights up the AIK and controls the avatar size
Figure 4: Dancing with the AIK changes the avatar angle
Figure 5: Shaking the AIK changes its color, triggering matching changes in the room lights and the avatar background.
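As a concrete illustration of the device-side behavior described above, here is a minimal CircuitPython sketch assuming the adafruit_circuitplayground library on the Circuit Playground Bluefruit. The normalization threshold, color palette, and JSON message format are illustrative assumptions, as is the serial transport to the browser.

```python
# Device-side sketch for the AIK, assuming CircuitPython and the
# adafruit_circuitplayground library. State is printed as JSON over
# serial for the client browser to consume; the transport and the
# thresholds below are assumptions, not the project's actual code.
import time
import json
import math

from adafruit_circuitplayground import cp

COLORS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 0, 255)]
color_index = 0
cp.pixels.brightness = 0.3

while True:
    # Sound amplitude -> avatar scale; also light up the onboard LEDs.
    level = cp.sound_level          # rough loudness from the microphone
    scale = min(level / 100, 1.0)   # normalize; divisor is illustrative
    cp.pixels.fill(COLORS[color_index] if scale > 0.2 else (0, 0, 0))

    # Accelerometer tilt -> avatar angle.
    x, y, z = cp.acceleration
    angle = math.degrees(math.atan2(y, x))

    # Shake -> cycle the synchronized color.
    if cp.shake(shake_threshold=20):
        color_index = (color_index + 1) % len(COLORS)

    # One state update per loop for the browser-side avatar.
    print(json.dumps({"scale": scale, "angle": angle,
                      "color": COLORS[color_index]}))
    time.sleep(0.05)
```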
DJ's Controls
Because each participant owns an AIK, the DJ can direct the audience in collaborative interactions during the live stream using MIDI-mapped inputs on a typical digital audio workstation controller, such as the Akai APC40 commonly used with Ableton. There are four interaction modes: Free dance, in which each participant moves at their own preference (Figure 6); Group dance, in which participants change their avatars' angles together under the DJ's command (Figure 7); Passing ball, in which, at the DJ's cue, each participant's actual room lights and avatar background light up in turn (Figure 8); and Solo dance, in which the DJ appoints one participant to dance, highlighting that participant's avatar and flashing their AIK (Figure 9). A sketch of this MIDI-to-mode mapping follows the figure captions below.
Figure 6: Free dance
Figure 7: Group dance
Figure 8: Passing ball
Figure 9: Solo dance; the appointed participant's AIK flashes.
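To make the DJ-side mapping concrete, the following minimal sketch maps MIDI pad presses to the four modes and broadcasts them through the relay sketched earlier, assuming the Python mido library. The note numbers, server URL, and message shape are illustrative assumptions rather than the project's actual mapping.

```python
# DJ-side sketch: map MIDI pads to audience modes and broadcast them
# through the relay server. Assumes the `mido` library with a backend
# (pip install mido python-rtmidi); note numbers are illustrative.
import json

import mido
from websockets.sync.client import connect

MODE_FOR_NOTE = {            # hypothetical pad-to-mode mapping
    0: "free_dance",
    1: "group_dance",
    2: "passing_ball",
    3: "solo_dance",
}

with connect("ws://localhost:8765") as ws:
    ws.send(json.dumps({"role": "host"}))          # register as the DJ
    with mido.open_input() as port:                # default MIDI input
        for msg in port:
            if msg.type == "note_on" and msg.note in MODE_FOR_NOTE:
                ws.send(json.dumps({"mode": MODE_FOR_NOTE[msg.note]}))
```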
DESIGN CONCEPT AND IMPLEMENTATION
The tangible product of LiveHouse is inspired by two signature activities of music events: drinking and dancing. Whether at a live house, concert, bar, or carnival, people enjoy not only the music but the surrounding atmosphere. Holding or sipping a drink heightens the sense of enjoyment, and swaying to the music happens naturally at a live event. To connect users to a real music event, the tangible product is designed around these signature scenarios.
The device is simple and low cost at just under forty US dollars, the majority of which is the Adafruit Circuit Playground Bluefruit at $24.95. The enclosure housing the microcontroller was 3D printed in clear polylactic acid (PLA) filament, chosen so that the board's light-emitting diodes (LEDs) would shine through. Apertures were placed in the walls of the enclosure to let additional sound and light pass in and out. A larger aperture on one side allows updated programs to be uploaded to the microcontroller via USB, and the design also integrates an on/off switch that controls whether the microcontroller is powered. Two plastic connectors and acrylic spacers pin the straps to the enclosure, and Velcro holds the strap and elastic band, allowing users to swap band colors and sizes. A diagram of these components is shown in Figure 10. While the device most likely has other uses, the two we focused on were a beverage holder and a wrist wearable (Figure 12).
Figure 10: Isometric diagram showing the device’s components.
Figure 11: The open fabricated enclosure, showing how the battery, switch, and microcontroller fit within.
Figure 12: Two possible scenarios: drinking with a beverage holder, left; dancing with a wrist wearable, right.
CONCLUSION & FUTURE WORK
In response to the COVID-19 pandemic, LiveHouse seeks to bring the live concert experience to individuals remotely. We have designed a system architecture that connects performers and audiences through a central server, creating interaction both visually on the screen and tactilely through our tangible beverage holder and wrist wearable. In the future, when the pandemic is over, this work can be further developed to augment in-person live concerts: rather than engaging with a concert passively, audiences will be able to contribute multisensory input, facilitating a more diverse dialogue between performers and themselves.