Ongoing project (2018-2019)
Perfect Melon is a VR performance that creates a fictional corporate persona for a pseudo-technology corporation. The “corporation” behind Perfect Melon exploits the alienated senses of its consumers by claiming its products will “reconnect” viewers to the romanticized idea of a lost natural world. Perfect Melon also explores the corporate interests hidden behind the guise of community-building. The piece is activated by audience participation with real-time audio, visual, and taste components. Participants enter the virtual space with the mission of finding the perfect melon by slapping, hearing, and tasting different melons. Each chosen melon has a unique flavor profile mixed by a customized taste-display backpack.
The Perfect Melon team consists of Jas Brooks, Gabby Luu, and Li Yao, who came together through an Art and Science seminar in the summer of 2018.
Jas Brooks is an artist and PhD candidate in the Department of Computer Science at the University of Chicago (focusing on smell & taste for VR), as well as a lecturer at the School of the Art Institute of Chicago.
Li Yao is an artist and a recent graduate (MFA ‘18) of the Art and Technology Studies department at the School of the Art Institute of Chicago. In recent years, virtual reality design has been the focus of Li’s practice.
Gabby Luu is an artist and BA/MA Art History student at the University of Chicago. Gabby is currently focusing her thesis on experimental East Asian performance and new media art.
Perfect Melon is still in development. The scenes in the concept video belong to the demo version of the project. We plan to hold our initial trial run in December 2018 and to finish in January 2019.
Perfect Melon is a project aiming to create a fictional corporate persona: a start-up melon-drink company with many innovative technological features that exploit the alienated sensibilities of its consumers by “re-uniting” their sensorial experiences with a fabricated nature under the guise of community-building. The goal of the company is not merely to pitch a melon drink but to associate the abstract taste of the liquid with a natural origin: an idea of the melon that is timeless, habitual, and picturesque within conventional thought. Moreover, Perfect Melon aims to introduce consumers to a method of examining such origins that justifies the arbitrary evaluation of choosing one melon as the greatest of all. Hence, “community-building” actually translates to the cultivation of ideological subjects through the engineering of their sensibilities. The stylistic references for Perfect Melon include Givaudan’s and Chanel’s advertisements of their laboratory processes, The Assemblage’s video of its origin story, and Alex Turvey’s short film Quinn Thomas.
Perfect Melon is a VR performance activated by audience participation with real-time audio, visual, and taste components. Participants enter the virtual environment with the mission of finding the perfect melon among all melons. The interaction is facilitated by a Vive headset and controllers in addition to a taste-display backpack: in essence, a cocktail mixer designed to be portable. Participants examine any melon by picking it up and slapping it to hear the sound it makes. With the audio and haptic feedback, participants decide whether they want to investigate further by tasting the melon. In that case, the backpack is activated to mix a flavoured melon drink matched to the particular melon being examined. Once participants have decided they’ve found the perfect melon, the system records the flavour formula at hand, and the Perfect Melon team produces a take-away melon drink according to that formula.

The Perfect Melon project seeks to correlate the virtual space with the physical. The virtual platform is modularized into grid tiles and procedurally adjusts to any given location. We estimate that three physical tables, each fitted with a Vive tracker, will be positioned within the tracking area, corresponding to the virtual tables in the scene. Thus, if a participant accidentally bumps into a physical table, the virtual table moves its position accordingly.
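The physical-virtual table correlation described above can be sketched, in very simplified form, as a per-frame update that copies each tracker’s reported pose (plus a fixed calibration offset measured at setup) onto its corresponding virtual table. This is an illustrative sketch only; the names and structure are assumptions, not the project’s actual engine code, and a real build would read the pose from the Vive tracker API each frame:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A 2D floor-plane pose: position in meters, heading in degrees."""
    x: float
    y: float
    yaw: float

def sync_virtual_table(tracker_pose: Pose, calibration: Pose) -> Pose:
    """Place the virtual table at the tracked physical table's pose,
    offset by a calibration pose measured once at setup time."""
    return Pose(
        x=tracker_pose.x + calibration.x,
        y=tracker_pose.y + calibration.y,
        yaw=(tracker_pose.yaw + calibration.yaw) % 360.0,
    )

# A participant bumps the physical table; the tracker reports the new
# pose, and the virtual table follows on the next frame.
bumped = Pose(x=1.2, y=0.3, yaw=95.0)
calibration = Pose(x=0.0, y=0.0, yaw=-5.0)
table = sync_virtual_table(bumped, calibration)
```

In an engine like Unity, this update would run once per frame per tracked table, so the virtual scene continuously mirrors wherever the physical tables actually are.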