Precipitation 3 is one of a series of musical compositions written for 26 clock chimes as part of the sound art installation, Aqua•litative. With my Precipitation series, I treat the electromechanical structure as a musical instrument, navigating through sound with the syntactical construction of code. Compositions played by the sculpture evoke precipitation data from California weather stations by cycling through bits of that data. These cycles create emergent sonic patterns in a continuously evolving play between density and rhythm. Movement flows as collapsing waves, additively striking a cybernetic balance between natural order and mechanical motion.
Aqua•litative is a kinetic installation that renders multiple data sets of California’s water history into a physical experience. The work correlates natural factors contributing to California’s water shortages, outlining the serpentine narrative of water through the translation of data into kinetic movement and acoustic sound.
Selector is a live audio-visual performance that uses algorithms to select between various sonic processes. One such process selects audio segments, rapidly skipping between them like a malfunctioning CD player. Each audio process triggers pulses of projected light. Selector combines the generative selection of audio with the selection of visual code in a tightly synchronized display of sound and light.
Cloche is a series of 3D-printed objects instilled with movement data to create combinatorial structures of natural patterns. Motion capture technology was used to extract physical movement from its occurrence in the physical world, recording dynamic qualities like speed, direction, weight, and intensity. Moving back through the virtual filter to the physical world, movement data re-animates a different body: a multiplicitous arc in human forms. The analog-digital-analog process filters presence, stripping movement of some information and endowing it with other.
Like movement, sound traverses a similar filtering process. Recordings of 3D object printing, capturing the movement of a form’s creation, are processed digitally and re-amplified in the space. The sounds continue to be filtered by the shape and contour of our physical bodies and the acoustics of the room. The physical-virtual-physical translation process is both known and physical, and at the same time other and immaterial.
Jon Bellona, sound design
Brad Garner, movement design
John Park, movement capture, print, and visual design
Aqua•litative is a kinetic installation that renders multiple data sets related to California’s water history into movement and sound. The installation displays climatological data as a chronological narrative of water in the state by transforming water data into acoustic sounds (ringing of clock chimes) and physical movement (motors moving arms of balsa wood) shown in a gallery space. Precipitation data creates sonic patterns, analogous to rain droplets, in a continuously evolving play between density and rhythm.
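The precipitation-to-chime mapping might be sketched as follows. This is a hypothetical illustration, not the installation's actual code: the reading values, the 2-inch ceiling, and the density mapping are all assumptions; only the 26-chime count comes from the project description.

```python
# Hypothetical sketch: mapping daily precipitation readings (inches)
# to chime strike density. Wetter days ring more of the 26 chimes,
# producing the dense/sparse rhythmic play the piece describes.

def precipitation_to_chimes(readings, num_chimes=26, max_inches=2.0):
    """Return (day, strikes) pairs: each reading scaled to a chime count."""
    events = []
    for day, inches in enumerate(readings):
        density = min(inches / max_inches, 1.0)   # normalize to 0..1
        strikes = round(density * num_chimes)     # chimes struck that day
        events.append((day, strikes))
    return events

# A dry spell followed by a storm yields sparse, then dense, patterns.
print(precipitation_to_chimes([0.0, 0.1, 1.8]))
```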
Aqua•litative is by Jon Bellona, John Park, and John Reagan (http://aqualitative.org). The installation is part of an Environmental Resilience and Sustainability Fellowship, funded in part by the Jefferson Trust and the University of Virginia Office of Graduate and Postdoctoral Affairs.
Mixer.* is a Max/MSP package for audio mapping projects. The package contains basic audio mixer objects, like channel strips, EQs, limiters, and aux sends. Mixer.* provides a GUI, modular design, and pattr binding for smooth integration into your Max/MSP workflow.
To get started with Mixer.*, place the mixer folder inside your Max > packages directory, then restart Max. Inside a Max window, simply create a new object, start typing “mixer”, and let autocomplete do the rest. You may also press Shift-M to quickly access any mixer.* object as a helpful bpatcher.
Korgnano is a Max software interface for the Korg nanoKontrol USB controller. The object connects your hardware nanoKontrol to Max and automatically ports the data to korgnano.inputmenu objects, or to specially named receive objects that you create yourself (e.g. receive scene1_ch9_btn2).
CarbonFeed takes your most recent 200 tweets and turns them into a one-minute loop, a song that changes over your Twitter lifetime. Every time you tweet you generate 0.02 g of CO2. Don’t worry too much, though. Listening to your one-minute song will eat up roughly 2.86 grams of CO2e in electricity, servers, and embodied computer emissions.
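The arithmetic above can be checked directly. The per-tweet and per-listen figures come from the project text; the helper function and its name are just illustration:

```python
# Back-of-envelope check of CarbonFeed's carbon figures.
# Per-unit values are from the project text; the rest is arithmetic.

TWEET_G_CO2 = 0.02     # grams CO2 emitted per tweet
LISTEN_G_CO2E = 2.86   # grams CO2e per one-minute listen

def footprint(tweets, listens):
    """Total grams of CO2(e) for a given tweet and listen count."""
    return tweets * TWEET_G_CO2 + listens * LISTEN_G_CO2E

# The 200 tweets behind one loop emit about 4 g; one listen adds 2.86 g.
print(footprint(200, 1))
```

So a single play of your song costs more carbon than the 200 tweets that generated it.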
jpb.mod is a Max 6 package with ready-made data modification modules. These modules address each of the five data modification types (interpolate, thin, offset, scale, smooth [itoss]). jpb.mod modules handle the modification of a one-dimensional data stream. Rapid prototyping is one of the core purposes of the jpb.mod package library. You may find the jpb.mod.scale object especially helpful for non-linear scaling.
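The five modification types can be sketched in plain Python. These are minimal one-dimensional analogues for illustration only; the actual jpb.mod Max modules have their own interfaces, and jpb.mod.scale also supports non-linear curves where this sketch is strictly linear:

```python
# Plain-Python analogues of the five jpb.mod modification types
# (interpolate, thin, offset, scale, smooth). Names and signatures
# are illustrative assumptions, not the Max modules' actual APIs.

def interpolate(a, b, t):
    """Linear interpolation between two values (t in 0..1)."""
    return a + (b - a) * t

def thin(stream, n):
    """Keep every nth value, reducing data density."""
    return stream[::n]

def offset(stream, amount):
    """Shift every value by a constant."""
    return [x + amount for x in stream]

def scale(stream, in_lo, in_hi, out_lo, out_hi):
    """Map values linearly from one range to another."""
    return [out_lo + (x - in_lo) / (in_hi - in_lo) * (out_hi - out_lo)
            for x in stream]

def smooth(stream, window=3):
    """Moving average over a trailing window."""
    out = []
    for i in range(len(stream)):
        chunk = stream[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

data = [0, 4, 8, 4, 0]
print(thin(data, 2))                 # keep every 2nd value
print(scale(data, 0, 8, 0.0, 1.0))  # normalize to 0..1
```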
The project #CarbonFeed directly challenges the popular notion that virtuality is disconnected from reality. Through sonifying Twitter feeds and correlating individual tweets with a physical data visualization in public spaces, artists Jon Bellona and John Park invite viewers to hear and see the environmental cost of online behavior and its supportive physical infrastructure.
CarbonFeed works by taking in realtime tweets from Twitter users around the world. Based on a customizable set of hashtags, the work listens for specific tweets, and the content of these incoming tweets generates a realtime sonic composition. A visual counterpart in the installation, compressed air pumped through tubes of water, gives each tweet a physical manifestation.