
Recording Tree Electromagnetism


post by Sam Street
November 2021

This post details the initial testing and uses of the MIDI Biodata Sonification Device, or, as the team simply came to call it, the BioMIDI. The device uses electrodes placed on a plant’s surface to measure the changing internal conductive state of the organism, transmitting non-human-created MIDI data that can be sonified or interpolated for artistic use.

 

initial testing

The set-up was fairly intuitive: place the two electrodes across the plant or tree, plug the electrodes’ cable into the BioMIDI, and turn it on. Coloured lights helpfully indicated the data being measured. From there, we needed a way to record this data for later use. Our first method of recording the MIDI data relied on a MIDI-to-USB adapter: connecting the BioMIDI to a laptop, we could use any Digital Audio Workstation (DAW) to record the incoming signal.
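For those curious what that capture step amounts to, here is a minimal sketch of recording an incoming BioMIDI stream straight to a standard MIDI file without a DAW, using Python’s mido library (with the python-rtmidi backend). The port name is hypothetical, and this is an alternative to the DAW route described above, not what we actually used.

```python
import time
import mido

PORT_NAME = "USB MIDI Interface"  # hypothetical; check mido.get_input_names()

mid = mido.MidiFile(ticks_per_beat=480)
track = mid.add_track()
tempo = mido.bpm2tempo(120)  # needed to convert wall-clock seconds into MIDI ticks

with mido.open_input(PORT_NAME) as port:
    last = time.time()
    try:
        for msg in port:  # blocks, yielding messages as the BioMIDI sends them
            now = time.time()
            # MIDI files store delta times in ticks rather than seconds
            msg.time = round(mido.second2tick(now - last, mid.ticks_per_beat, tempo))
            last = now
            track.append(msg)
    except KeyboardInterrupt:
        mid.save("biomidi_capture.mid")  # stop with Ctrl-C to write the file
```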

The first tests were with plants around research assistant Sam’s home: indoor Philodendrons, a ZZ plant, a Calla lily, a rose bush, a Rhododendron, and finally a Birch tree. Each test let us note the differing responses between plants, and also let us try out the BioMIDI’s “scale” settings, which conform the output notes to a Major, Minor, “Indian”, or Chromatic scale (a simplified version of this quantization is sketched below). From these first tests it seemed that larger plant life like the Birch produced data at a slower rate, though this could have been an effect of adjusting the BioMIDI’s “threshold” setting, which governs the sensitivity of the electrodes. The same plant could also respond differently at different points, even on the same day.
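As an illustration of what a “scale” setting does, the sketch below snaps a raw MIDI note to the nearest pitch in a chosen scale. The BioMIDI’s actual algorithm isn’t documented here, and the interval set for the “Indian” scale is a guess (Bhairav-like), so treat this as a rough model rather than the device’s own logic.

```python
# Illustrative scale quantization; the "indian" interval set is an assumption.
SCALES = {
    "major":     [0, 2, 4, 5, 7, 9, 11],
    "minor":     [0, 2, 3, 5, 7, 8, 10],
    "indian":    [0, 1, 4, 5, 7, 8, 11],  # assumed Bhairav-style intervals
    "chromatic": list(range(12)),
}

def quantize(note: int, scale: str, root: int = 0) -> int:
    """Snap a raw MIDI note (0-127) to the nearest pitch in the chosen scale."""
    degrees = SCALES[scale]
    octave, pitch_class = divmod(note - root, 12)
    nearest = min(degrees, key=lambda d: abs(d - pitch_class))
    return root + octave * 12 + nearest

print(quantize(61, "major"))  # C#4 (61) snaps to C4 (60)
```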

With the first batch of recordings, we made some test compositions to try out the implementation. The primary use of MIDI data is, of course, to directly control the playing of virtual instruments, and this was tried first. This is where the sound design happens: the imagined sonic character of a plant’s data is represented by the composer. However, as MIDI is simply a set of control data, it can also be used to modulate and control aspects of the sound beyond just the notes played. Using a simple Max/MSP patch to scale and smooth the input, we then used the plant data to modulate aspects of the base synthesizers, cross-interpolating the data to form more complex, multi-plant textures.
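The scale-and-smooth step is simple enough to show in code. Below is a rough Python equivalent of that kind of Max/MSP patch, pairing an exponential moving average with a range remap; the parameter names and ranges are illustrative, not taken from our actual patch.

```python
def make_smoother(alpha: float = 0.1):
    """Exponential moving average: lower alpha means heavier smoothing."""
    state = None
    def smooth(x: float) -> float:
        nonlocal state
        state = x if state is None else alpha * x + (1 - alpha) * state
        return state
    return smooth

def rescale(x, in_lo, in_hi, out_lo, out_hi):
    """Map a value from the incoming MIDI range onto a synth parameter range."""
    t = (x - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

smooth = make_smoother(alpha=0.05)
for velocity in [12, 90, 88, 40, 127]:  # stand-in for incoming plant data
    # e.g. drive a filter cutoff (in Hz) from the smoothed plant signal
    cutoff = rescale(smooth(velocity), 0, 127, 200.0, 8000.0)
    print(f"{cutoff:.0f} Hz")
```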

 

field implementation

A problem with the initial method, however, was its poor portability for field recording (an entire laptop being quite cumbersome in the field). One thought was to find a device that simply records MIDI data, but as MIDI is a somewhat old and specific format, such devices are rare and expensive. The next thought was to record the MIDI data into a DAW on a smartphone. There are many mobile DAW options, but we landed on the tried-and-true Apple GarageBand. And, thanks to the explosion of mobile peripherals and accessories, a MIDI-to-Lightning adaptor (the iRig MIDI 2 from IK Multimedia) was easy to find. The MIDI field recording chain was now set: electrodes running to the BioMIDI, BioMIDI to the iRig, iRig to iPhone, and MIDI data recorded into GarageBand.

The next issue was how to get the MIDI data out of the phone and into a computer as a universal MIDI file. One of the failings of GarageBand is that it cannot natively export .midi files, so another workaround was needed. Luckily, an open-source program by GitHub user larkob was able to convert Apple’s “Loop Audio” format to MIDI. Once through the several steps of transfer and conversion, we had our final, universal .midi file.

The first in-field test of the BioMIDI was on our field recording trip to Lighthouse Park in West Vancouver. We found a huge Douglas Fir with a charismatic burl, which we chose as our collaborator. We set up the whole BioMIDI rig first and let it record while we arranged the rest of the gear. We recorded with ambisonic, geophonic, and even electromagnetic field microphones, and managed to hear a Raven and a Squirrel alongside our silent coniferous performer.


processing and composition

The final step was determining how to actually use this MIDI data alongside a soundscape. I wanted an implementation that wouldn’t stand out too much, that could sit among the natural sounds and function as part of the whole, a reciprocal organism in an ecology, just as the tree itself is. What I arrived at was a mix of the straightforward “MIDI as musical notes” method and the “MIDI as modulation” method.

First, in Ableton, I composed the soundscape based on the sonic events and loose “narrative” of what we had recorded on the trip. Alongside this I placed the Douglas Fir’s data, its notes playing a simple synthesizer. I chose a basic, unadorned sine wave with a long attack and decay, so that as the Fir’s notes piled on top of one another they would form a sort of “harmonic cloud”, a signature of the tree. The output of this synthesizer was then put through a vocoder, a device which allows the harmonic profile of one sound to modulate another. To modulate the Douglas Fir’s “harmonic cloud”, then, I used the sonic profile of the soundscape recording itself. The result was a sort of ghostly echo of the recording, with shifting resonances emphasizing or characterizing certain moments: the environment experiencing the tree experiencing the environment. Lastly, I automated the volume of this whole complex so it would arise and move alongside the composition.
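To make the “harmonic cloud” idea concrete, here is an illustrative numpy sketch: each MIDI note becomes a sine tone with a slow attack and decay, and overlapping tones simply sum. The note timings are hypothetical stand-ins rather than the Fir’s actual data, and the vocoder stage is not reproduced here.

```python
import numpy as np

SR = 44100  # sample rate in Hz

def render_note(note: int, dur: float, attack: float = 2.0, decay: float = 3.0):
    """One sine tone with a slow linear attack and decay."""
    freq = 440.0 * 2 ** ((note - 69) / 12)  # MIDI note number -> frequency in Hz
    t = np.arange(int(dur * SR)) / SR
    env = np.clip(np.minimum(t / attack, (dur - t) / decay), 0.0, 1.0)
    return np.sin(2 * np.pi * freq * t) * env

# Hypothetical stand-ins for the Fir's notes: (MIDI note, start s, length s)
notes = [(60, 0.0, 8.0), (64, 1.5, 8.0), (67, 3.0, 8.0), (71, 4.5, 7.0)]

cloud = np.zeros(int(12.0 * SR))
for note, start, dur in notes:
    tone = render_note(note, dur)
    i = int(start * SR)
    cloud[i:i + len(tone)] += tone  # overlapping notes pile into the cloud
cloud /= np.max(np.abs(cloud))  # normalize the summed result
```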

The results of the field recording trip and compositional experiment can be heard as the Branching Songs soundscape, “Douglas Fir at Lighthouse Park.”
