Matthew Shenton

Listening to the Landscape 8 - turning data into sound

Updated: Jul 18

Part of my Developing Your Creative Practice project is to learn how data sonification can be used in my sound work. The process of using sound to convey data has been around since the 1990s, but I had never thought of using it in my practice due to a lack of time and not knowing where to start.


I came across a data sonification article on the Ableton website that explained how the process was being used by other artists as a tool to raise awareness of climate change, and I immediately saw an opportunity to investigate: what local data on the natural world is available, how this data has traditionally been presented, and how it could be used to structure my sound work.


I looked around to see what software was available for unpacking data and transforming it into sound, and settled on learning how to use Max, produced by Cycling '74. Max is a 'visual programming environment' that lets users design their own instruments (and more) by linking together objects to build synthesizers, samplers, and effects processors that perform audio signal processing. I chose Max because I have been using Ableton Live as my digital audio workstation (DAW) for many years and assumed (hoped) that the two would integrate seamlessly.


I downloaded the software and signed up for the free online 'Structuring Interactive Software for Digital Arts' course on Kadenze, with tutor Matt Wright guiding you through the powerful adaptability of Max. I got on well with the lectures until they became bogged down in scales and music theory, two areas in which I am neither proficient nor particularly interested. By the end of the final Kadenze session I had a very basic understanding of what Max could do and could link objects together to make sound, but I had no clear idea of exactly how I would begin to make a spreadsheet sing.


[Image: a Max/MSP patch showing software that makes music]

Luckily, I found another online course showing specifically how to create Max patches for data sonification. This was perfect, as it drew upon my Kadenze knowledge and showed me, step by step, how the data in a CSV file can be scaled and adapted to alter the pitch of oscillators, the timing of sound events, and the shape of envelopes. The short course was presented by the amazing Umut Eldem, to whom I am eternally thankful for getting me back on track.
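
To make the mapping step concrete, here is a minimal sketch of the same idea written in Python rather than as a Max patch. It only illustrates the general principle of rescaling a data column into pitch and timing values (much like Max's scale object); the file name, column name, and output ranges are placeholders, not the data or settings from the course.

```python
import csv

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Map value from the range [in_lo, in_hi] onto [out_lo, out_hi]."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# Read one numeric column from a CSV file (placeholder names).
with open("data.csv", newline="") as f:
    values = [float(row["value"]) for row in csv.DictReader(f)]

lo, hi = min(values), max(values)

# Map each data point onto an oscillator pitch (Hz) and a note duration (ms).
pitches = [scale(v, lo, hi, 110.0, 880.0) for v in values]
durations = [scale(v, lo, hi, 100.0, 1000.0) for v in values]

for p, d in zip(pitches, durations):
    print(f"pitch {p:.1f} Hz, duration {d:.0f} ms")
```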


The video below shows how I have used Max to create an instrument that uses data on rainforest destruction in the Amazon to make compositional decisions. The data from the spreadsheet is unpacked and scaled before being used to shape the amplitude and mix of four oscillators, to decide whether an envelope is triggered to generate a sound, and to change the bitrate of the sound you hear. I will be submitting the piece to a fundraising compilation that aims to raise awareness about the continued destruction of the Amazon rainforest.
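
For anyone curious what those per-data-point decisions might look like outside of Max, here is a rough Python sketch of that kind of logic: a normalised data value crossfades a four-oscillator mix, gates an envelope above a threshold, and sets a bit resolution. The thresholds, ranges, and example figures are illustrative guesses, not the values used in the actual patch or dataset.

```python
def sonify_step(value, lo, hi):
    """Turn one data point into control values for the instrument."""
    norm = (value - lo) / (hi - lo)        # normalise to 0..1 within the data range
    # Crossfade the mix of four oscillators: each oscillator is loudest
    # when the data value sits in its own portion of the range.
    amps = [max(0.0, 1.0 - abs(norm * 3.0 - i)) for i in range(4)]
    trigger = norm > 0.5                   # only fire the envelope above a threshold
    bits = int(4 + norm * 12)              # lower values give a cruder bit reduction
    return amps, trigger, bits

# Placeholder figures standing in for one column of the deforestation spreadsheet.
data = [402.0, 517.5, 689.0, 944.2]
lo, hi = min(data), max(data)
for v in data:
    amps, trigger, bits = sonify_step(v, lo, hi)
    print(amps, trigger, bits)
```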



The next step is to explore what local data is available and to incorporate this into a piece about my village soundscape.

