Cloche is a series of 3D-printed objects instilled with movement data, creating combinatorial structures of natural patterns. Motion capture technology extracts physical movement from its occurrence in the physical world, recording dynamic qualities like speed, direction, weight, and intensity. Moving back through the virtual filter into the physical world, the movement data re-animates a different body: a multiplicitous arc in human form. The analog-digital-analog process filters presence, stripping movement of some information and endowing it with other qualities.
Sound traverses a similar filtering process. Recordings of the 3D objects being printed, capturing the movement of a form’s creation, are processed digitally and re-amplified in the space. The sounds continue to be filtered by the shape and contour of our physical bodies and by the acoustics of the room. The physical-virtual-physical translation process is at once known and physical, and at the same time other and immaterial.
Jon Bellona, sound design
Brad Garner, movement design
John Park, movement capture, print, and visual design
Aqua•litative is a kinetic installation that renders multiple data sets related to California’s water history into movement and sound. The installation displays climatological data as a chronological narrative of water in the state by transforming water data into acoustic sounds (ringing of clock chimes) and physical movement (motors moving arms of balsa wood) shown in a gallery space. Precipitation data creates sonic patterns, analogous to rain droplets, in a continuously evolving play between density and rhythm.
Aqua•litative is by Jon Bellona, John Park, and John Reagan. http://aqualitative.org The installation is part of an Environmental Resilience and Sustainability Fellowship, funded in part by the Jefferson Trust and the University of Virginia Office of Graduate and Postdoctoral Affairs.
Mixer.* is a Max/MSP package for audio mapping projects. The package contains basic audio mixer objects, like channel strips, EQs, limiters, and aux sends. Mixer.* provides a GUI, modular design, and pattr binding for smooth integration into your Max/MSP workflow.
To get started with Mixer.*, place the mixer folder inside your Max > Packages directory, then restart Max. Inside a Max window, simply create a new object, start typing “mixer,” and let autocomplete do the rest. You may also press Shift-M to quickly access any mixer.* object as a helpful bpatcher.
korgnano is a software interface for the Korg nanoKontrol USB controller. The object connects your hardware nanoKontrol to Max and automatically routes the data to korgnano.inputmenu objects, or to specially named receive objects that you can create yourself (e.g. receive scene1_ch9_btn2).
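The receive-object naming scheme above appears to encode scene, channel, and control in the name itself. korgnano runs inside Max, so as an illustration only, here is a hypothetical Python sketch of how such a name might be unpacked:

```python
import re

# Hypothetical parser for the korgnano-style receive name
# (e.g. "scene1_ch9_btn2"); for illustration only -- the real
# routing happens inside Max, not in Python.
NAME_PATTERN = re.compile(r"scene(\d+)_ch(\d+)_(\w+?)(\d*)$")

def parse_receive_name(name):
    """Split a korgnano-style receive name into its parts."""
    m = NAME_PATTERN.match(name)
    if not m:
        raise ValueError(f"not a korgnano-style receive name: {name}")
    scene, channel, control, index = m.groups()
    return {
        "scene": int(scene),
        "channel": int(channel),
        "control": control,                       # e.g. "btn", "fader"
        "index": int(index) if index else None,   # trailing number, if any
    }
```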
CarbonFeed takes your most recent 200 tweets and turns them into a one-minute loop, a song that changes over your Twitter lifetime. Every time you tweet you generate 0.02 g of CO2. Don’t worry too much, though: listening to your one-minute song will eat up roughly 2.86 g of CO2e in electricity, servers, and embodied computer emissions.
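A back-of-envelope check of the figures quoted above, using the project’s own numbers (0.02 g of CO2 per tweet, 2.86 g of CO2e per one-minute listen):

```python
# Back-of-envelope arithmetic using the figures quoted above.
TWEET_CO2_G = 0.02       # grams CO2 per tweet (project figure)
LISTEN_CO2E_G = 2.86     # grams CO2e per one-minute listen (project figure)
TWEETS_IN_LOOP = 200     # tweets rendered into the one-minute loop

tweet_footprint = TWEETS_IN_LOOP * TWEET_CO2_G   # 4.0 g for the 200 tweets
ratio = LISTEN_CO2E_G / tweet_footprint          # one listen vs. the tweets themselves
print(tweet_footprint)   # 4.0
print(ratio)             # 0.715: one listen costs ~71% of the 200 tweets' footprint
```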
jpb.mod is a Max 6 package with ready-made data modification modules. These modules address each of the five data modification types (interpolate, thin, offset, scale, smooth [itoss]) and handle the modification of a one-dimensional data stream. Rapid prototyping is one of the core purposes of the jpb.mod package. You may find the jpb.mod.scale object especially helpful for non-linear scaling.
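jpb.mod.scale itself is a Max object, but the idea behind exponent-shaped non-linear scaling can be sketched in Python. The function name, signature, and clamping behavior here are illustrative assumptions, not the object’s actual arguments:

```python
def scale(x, in_lo, in_hi, out_lo, out_hi, exponent=1.0):
    """Map x from [in_lo, in_hi] to [out_lo, out_hi].

    exponent=1.0 gives a linear mapping; other values bend the
    curve, which is handy for perceptual ranges such as frequency
    or amplitude. (Illustrative sketch, not the jpb.mod.scale API.)
    """
    t = (x - in_lo) / (in_hi - in_lo)      # normalize to 0..1
    t = max(0.0, min(1.0, t)) ** exponent  # clamp, then shape the curve
    return out_lo + t * (out_hi - out_lo)

# Linear: the midpoint maps to the middle of the output range.
print(scale(0.5, 0, 1, 0, 100))                  # 50.0
# Non-linear: exponent=4 keeps most of the range near the low end.
print(scale(0.5, 0, 1, 20, 20000, exponent=4))   # 1268.75
```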
The project #CarbonFeed directly challenges the popular notion that virtuality is disconnected from reality. Through sonifying Twitter feeds and correlating individual tweets with a physical data visualization in public spaces, artists Jon Bellona and John Park invite viewers to hear and see the environmental cost of online behavior and its supportive physical infrastructure.
CarbonFeed works by taking in realtime tweets from Twitter users around the world. Based on a customizable set of hashtags, the work listens for specific tweets. The content of these incoming tweets generates a realtime sonic composition. An installation-based visual counterpart of compressed air being pumped through tubes of water further provides a physical manifestation of each tweet.
simpleKinect is an application for sending data from the Microsoft Kinect to any OSC-enabled application. It attempts to improve upon similar software by offering more OpenNI features and more user control.
Specify OSC output IP and Port in real time.
Send CoM (Center of Mass) coordinate of all users inside the space, regardless of skeleton calibration.
Send skeleton data (single user), on a joint-by-joint basis, as specified by the user.
Manually switch between users for skeleton tracking.
Individually select between three joint modes (world, screen, and body) for sending data.
Individually determine the OSC output url for any joint.
Save/load application settings.
Send distances between joints (sent in millimeters). [default is on]
San Giovanni Elemosinario is a music-for-film work that attempts to recreate a Venetian church through sound. Collaborating with architecture students studying in Venice, Italy, I received sketches of axonometric views, floor plans, column details, entrances, and other structural perspectives. Placing these sketches inside IanniX allowed cursors to trace the architectural renderings in real time. These cursors output data to Kyma, where mappings of the data control oscillators, harmonic resonators, and noise filters, as well as other acoustic treatments (panning, reverb, EQ, frequency shifts, etc.). While no impulse response was recorded, listening tests inside the church established a roughly three-second decay time, which informed the creation of the spatial reverberation.
A huge thank you to Matthew Burtner and Anselmo Canfora, both of whom made the collaboration possible.
Video/Music: Jon Bellona
Drawing: Olivia Morgan, Alex Picciano