

Sonification: How to make data sing

How can you make data accessible and interpretable? You’re probably thinking of some kind of visualization now. But you could also use sound, in a process called sonification.

Sonification is not new. Think of Geiger counters, for example. Typically, Geiger counters have speakers that click each time they detect ionizing radiation.
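To get a feel for how simple the underlying idea is, here is a minimal Python sketch of Geiger-counter-style sonification. This is our own illustration, not any real device’s firmware: detection events arrive roughly as a Poisson process, each event becomes a short click, and the result is written to a WAV file. The count rates and click shape are made up for the example.

```python
import numpy as np
import wave

SAMPLE_RATE = 44100

def clicks(counts_per_second, seconds=3.0, seed=0):
    """Place short decaying clicks at Poisson-distributed event times."""
    rng = np.random.default_rng(seed)
    audio = np.zeros(int(SAMPLE_RATE * seconds))
    n_events = rng.poisson(counts_per_second * seconds)
    click = 0.8 * np.exp(-np.arange(50) / 10.0)          # a roughly 1 ms decaying "tick"
    for start in rng.integers(0, len(audio) - len(click), n_events):
        audio[start:start + len(click)] = click
    return audio

# Low background rate followed by a "hot" source: the difference is instantly audible.
signal = np.concatenate([clicks(5, seed=1), clicks(80, seed=2)])

with wave.open("geiger.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)                                    # 16-bit PCM
    f.setframerate(SAMPLE_RATE)
    f.writeframes((signal * 32767).astype(np.int16).tobytes())
```

A higher count rate is heard directly as denser clicking, with no screen involved.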

Diagram of a Geiger counter. Geiger counters use sonification to make ionizing radiation perceptible. Image from Wikipedia.
By Svjo-2 - Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=39176160

Another example of sonification is the variometer. Variometers measure vertical speed. Pilots use them to tell whether they are climbing or descending, and at what rate. Variometers for paragliders usually also have audio signals that indicate whether you are climbing or descending. If the pitch of the sound goes up, you are climbing; if the pitch goes down, you are descending.
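The mapping behind this is easy to sketch in code. The snippet below is again an illustration, not any real variometer’s firmware; the base frequency and scaling factor are arbitrary choices. It turns a series of vertical-speed readings into one continuous tone whose pitch rises when climbing and falls when sinking.

```python
import numpy as np

SAMPLE_RATE = 44100

def vario_tone(vertical_speeds_mps, seconds_per_reading=0.25,
               base_hz=600.0, hz_per_mps=120.0):
    """Turn a series of vertical-speed readings (m/s) into one continuous tone."""
    chunks, phase = [], 0.0
    for v in vertical_speeds_mps:
        freq = max(base_hz + hz_per_mps * v, 100.0)           # climbing -> higher pitch
        t = np.arange(int(SAMPLE_RATE * seconds_per_reading)) / SAMPLE_RATE
        chunks.append(0.4 * np.sin(2 * np.pi * freq * t + phase))
        phase += 2 * np.pi * freq * seconds_per_reading       # keep the tone phase-continuous
    return np.concatenate(chunks)

# A short climb followed by a descent; write the result to a WAV file
# (exactly as in the Geiger counter sketch above) to listen to it.
signal = vario_tone([0.5, 1.5, 2.5, 1.0, -0.5, -2.0, -3.0])
```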

A variometer for paragliders. In addition to visual indicators, it also indicates climb rates via sound. Image from Wikipedia.
By Flyout - Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=210575

Geiger counters and variometers use sonification so operators can focus on other tasks. For example, they can check their surroundings rather than looking at a screen. In other words, sonification opens an additional “input channel” into the operator’s, or the data analyst’s, brain.

But there is another way sonification can be useful. Sound is a temporal signal, so you can use it to track changes over time. Humans are quite good at perceiving changes in a sound signal, even if they don’t consider themselves musically talented. Sonification taps into this skill.

What’s happening in sonification currently? What are other examples, besides Geiger counters and variometers? I used Mergeflow’s tech discovery software to find out.

Related reading: What makes a good innovation analyst?

Using sound to analyze and detect cybersecurity events

Security operations centers, or SOCs, monitor and defend enterprise or government information systems. These systems include networks and devices, as well as the activities that take place across these systems.

People in SOCs spend a lot of time identifying and monitoring changes over time. For example, if they identify unusual network activity, this may raise a red flag.

Louise Axon and colleagues from the University of Oxford have studied sonification in the context of SOCs. They found that SOC employees liked sonification most for anomaly detection and for “peripheral monitoring”. Peripheral monitoring is when you monitor something but your main focus is on another task.

There are some patents in this area as well. For example, Neal Horstmeyer, Diana Horn, and Shirish Shanbhag from Cisco have patented sonification for detecting cyber attacks. In a similar area, Alexandr Kuzmin from Sberbank holds a patent on sonification for network-level events.

Matthew Galligan and Nhan Nguyen from the US Navy hold a patent on using sonification for continuous monitoring of complex data metrics. They even provide some musical notation in their patent:

Musical notation from Matthew Galligan and Nhan Nguyen's patent, Sonification system and method for providing continuous monitoring of complex data metrics. Image taken from the original patent document.

Have you tried playing this on a musical instrument? If you don’t have an instrument, you could try a piano app on your smartphone. If you then transpose the notes above to a minor scale, they sound more “dramatic”. Sonification can trigger visceral reactions, just like movie soundtracks do. You can use this effect for anomaly detection, for example: a visceral sound can make anomalies “jump out”.
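Here is a hedged sketch of that idea in Python. It is not the method described in the patent; it simply maps a metric onto the degrees of a C-major scale and plays anomalous readings a semitone lower and louder, so they “jump out” of an otherwise consonant stream. The z-score threshold, note lengths, and loudness values are arbitrary choices.

```python
import numpy as np
import wave

SAMPLE_RATE = 44100
C_MAJOR_MIDI = [60, 62, 64, 65, 67, 69, 71, 72]        # C4 up to C5

def midi_to_hz(note):
    return 440.0 * 2 ** ((note - 69) / 12)

def sonify(values, note_seconds=0.3, z_threshold=2.0):
    """Map each reading to a scale degree; play outliers a semitone lower and louder."""
    values = np.asarray(values, dtype=float)
    z = (values - values.mean()) / values.std()
    lo, hi = values.min(), values.max()
    t = np.arange(int(SAMPLE_RATE * note_seconds)) / SAMPLE_RATE
    notes = []
    for v, zi in zip(values, z):
        degree = int(round((v - lo) / (hi - lo + 1e-9) * 7))   # 0..7 on the scale
        note, loudness = C_MAJOR_MIDI[degree], 0.3
        if abs(zi) > z_threshold:                              # anomaly: darker and louder
            note, loudness = note - 1, 0.8
        notes.append(loudness * np.sin(2 * np.pi * midi_to_hz(note) * t))
    return np.concatenate(notes)

metric = [10, 11, 10, 12, 11, 10, 35, 11, 10, 12]              # one obvious spike
audio = sonify(metric)

with wave.open("metric.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SAMPLE_RATE)
    f.writeframes((audio * 32767).astype(np.int16).tobytes())
```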

Similar to movie soundtracks, sonification can create a “visceral data experience”. This may be useful for anomaly detection, for example.

Sonification could trigger visceral reactions to data, similar to movie soundtracks. A famous example of such a soundtrack is the Jaws theme, conducted here by John Williams (the composer of the theme). Imagine using this theme to indicate anomalies in a time series.

Sonification to support navigation for neurosurgery

Let’s change fields now, and look at neurosurgery. Neurosurgery requires very careful navigation of the surgical probe. In order to make this possible, neurosurgical procedures are mapped out in great detail in advance. Surgeons use imaging data to plan the path of the surgical probe during the actual procedure.

But there is a drawback. Image-based navigation means that surgeons must divide their attention between the patient and the navigation system.

In order to address this drawback, an interdisciplinary team with backgrounds in computer science, music, and brain imaging developed a new approach. Joseph Plazak and colleagues combine image-based with sonification-based navigation. They use sonification to indicate the distance between the surgical probe and the relevant anatomical location.
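The basic mapping can be illustrated in a few lines of Python. This is our own simplification, not Plazak and colleagues’ implementation: the closer the probe gets to the planned target, the higher the pitch and the faster the beeps, much like a car’s parking sensor. The distances, frequencies, and beep rates are made up for the example.

```python
def distance_to_audio(distance_mm, max_distance_mm=50.0):
    """Map probe-to-target distance onto a beep pitch (Hz) and beep interval (s)."""
    closeness = 1.0 - min(distance_mm, max_distance_mm) / max_distance_mm  # 0 = far, 1 = at target
    pitch_hz = 300.0 + 900.0 * closeness       # 300 Hz far away, up to 1200 Hz at the target
    interval_s = 1.0 - 0.9 * closeness         # one beep per second far away, ten per second close up
    return pitch_hz, interval_s

# Simulated approach path: the probe closes in on the planned target.
for d in [40.0, 25.0, 12.0, 5.0, 1.0]:
    pitch, interval = distance_to_audio(d)
    print(f"distance {d:5.1f} mm -> beep at {pitch:6.1f} Hz every {interval:.2f} s")
```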

Plazak and colleagues found that sonification improved navigation accuracy. And there was another benefit as well: study participants said that sonification made the task easier. This is important. After all, neurosurgical procedures often take a very long time, and anything that lessens surgeons’ fatigue helps them maintain a high performance level over such extended periods.

Sonification can reduce the perceived difficulty of a task, particularly when multitasking is involved.

Not by itself, but in combination with visual information, sonification made neurosurgical navigation significantly easier. Results from Distance sonification in image-guided neurosurgery.

Interestingly, Plazak and colleagues found that sonification by itself didn’t work very well. But the combination of data sonification and visual presentation significantly reduced task difficulty. You can see this in the chart above, which they published in their paper.

Using sonification to better understand biological processes

When an organism moves in response to a chemical stimulus, this is called chemotaxis. Bacterial chemotaxis is when a population of bacteria move toward or away from a chemical stimulus. This stimulus could be food, for example, or a poison.

A better understanding of bacterial chemotaxis could provide insight into the mechanics of certain infectious diseases. It could also advance bacterial tumor therapy. And a better understanding of bacterial chemotaxis might even spur the development of chemical-sensing robots.

If you’d like to get more in-depth information on bacterial behavior, I recommend this video lecture by Howard Berg from Harvard University:

A video lecture by Howard Berg from Harvard University on bacterial behavior. Bacterial chemotaxis is one of Howard Berg’s research topics.

Now, you can see bacteria move when you look at them through a microscope. But it is hard to visually make sense of bacterial swimming behavior. This is because bacteria can’t really swim in straight lines. Instead, they tumble toward or away from a stimulus. Such movement is called a “biased random walk”.
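A toy simulation makes the “biased random walk” concrete. In the sketch below, which is our illustration and not the Ford group’s model, a simulated cell alternates between straight “runs” and random “tumbles”, and tumbles less often when it happens to be heading up the attractant gradient, so on average it drifts toward the source. Mapping each tumble event to a click would be one simple way to sonify such a trajectory.

```python
import math
import random

def run_and_tumble(steps=2000, speed=1.0, dt=0.1):
    """One cell: straight 'runs' interrupted by random 'tumbles' to a new heading."""
    x, y = 0.0, 0.0
    heading = random.uniform(0, 2 * math.pi)
    for _ in range(steps):
        # The attractant concentration increases with x, so heading in +x is "uphill".
        moving_uphill = math.cos(heading) > 0
        p_tumble = 0.05 if moving_uphill else 0.25     # tumble less often when uphill
        if random.random() < p_tumble:
            heading = random.uniform(0, 2 * math.pi)   # tumble: pick a new random direction
        x += speed * math.cos(heading) * dt            # run: a short straight segment
        y += speed * math.sin(heading) * dt
    return x, y

random.seed(1)
final_positions = [run_and_tumble() for _ in range(20)]
mean_drift = sum(px for px, _ in final_positions) / len(final_positions)
print(f"mean drift along the attractant gradient over 20 cells: {mean_drift:+.1f}")
```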

This is where the work of Roseanne Ford’s research group at the University of Virginia comes in. Funded by the National Science Foundation, and in collaboration with composer Maxwell Tfirn from Christopher Newport University, they are developing data sonification methods for analyzing bacterial chemotaxis patterns.

Would you like to hear what moving bacteria sound like? Visit this article from the University of Virginia, scroll down a bit, and download the audio files.

Additionally, there is an upcoming publication on bacterial chemotaxis sonification by Rhea Braun, Maxwell Tfirn, and Roseanne Ford at the American Geophysical Union Fall Meeting.

Designing new proteins with sonification

Even if you don’t play a musical instrument, you probably know what musical scales are. In most Western cultures, major and minor scales are probably the best-known scales. Around the world, there are many other scales as well. For example, Flamenco music often uses Phrygian scales. And pentatonic scales are used in many different kinds of music, including Jazz, Celtic, and West African music.

But a musical scale for amino acids?

This is what Markus Buehler, a materials scientist from MIT, is working on.

Buehler points out that materials and music have a lot in common: molecules are not static structures. Rather, they are continuously moving and vibrating. And music, or sound more generally, is vibration too. Molecule sonification translates these molecular movements and vibrations into the movements and vibrations of music.
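To make the idea of a “musical scale for amino acids” concrete, here is a deliberately simplified Python sketch. It is not Buehler’s actual mapping, which is derived from molecular vibration spectra; it just assigns each of the 20 standard amino acids an arbitrary pitch and turns a short peptide sequence into a list of notes.

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"      # the 20 standard residues, one-letter codes
BASE_MIDI = 48                            # C3

def midi_to_hz(note):
    return 440.0 * 2 ** ((note - 69) / 12)

# One pitch per residue, ascending chromatically from C3 (an arbitrary choice).
PITCH_OF = {aa: BASE_MIDI + i for i, aa in enumerate(AMINO_ACIDS)}

def sonify_sequence(sequence):
    """Translate a peptide sequence into (residue, MIDI note, frequency) triples."""
    return [(aa, PITCH_OF[aa], round(midi_to_hz(PITCH_OF[aa]), 1)) for aa in sequence]

for residue, midi_note, hz in sonify_sequence("MKTAYIA"):
    print(f"{residue}: MIDI {midi_note}, {hz} Hz")
```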

And Buehler discusses another potentially interesting link between molecules and music. In music, counterpoint is about how voices interact with each other, how they “attract” and “repel” each other. Johann Sebastian Bach used counterpoint a lot in his work.

Johann Sebastian Bach made frequent use of counterpoint in his compositions. Markus Buehler argues that counterpoint could also be used for the design of new molecules via sonification. Image from Wikipedia.

Now, Buehler extends this concept of counterpoint to molecules as well. He states that counterpoint could indicate distance or interactions between protein structures. This way, Buehler argues, sonification could help us better understand protein structures.

In a follow-up step, Chi-Hua Yu and Buehler design new proteins using sonification. They train a neural network to sonify protein structures, and then to “compose” new proteins.

Chi-Hua Yu and Markus Buehler used a neural network to design new proteins. Using amino acid sonification, the neural network predicted a musical score, which represents the structure of the new protein. Image from Yu and Buehler's paper.

They validate the new protein designs by a method called normal mode analysis.

How you can try sonification at home

There is an app for sonification. Datavized Technologies, supported by the Google News Initiative, makes TwoTone. You can get started here, and you can get the code from GitHub. And if you’d like to learn more about the background of TwoTone, you can read an article by the data journalist Simon Rogers.

Then, there are people who play music to their plants because they think it makes them grow better. But do you know anybody who has their plants play music to them? You can now do this too, with a device called PlantWave.

PlantWave is a device for sonifying micro-movement data of plants. Screenshot from PlantWave.

PlantWave converts the tiny movements that plants make into sound. This might actually have applications: researchers from Virginia Tech are investigating whether sonification of plant micro-movements may help improve plant health. According to the researchers, this could be particularly useful in controlled-environment agriculture, such as vertical or urban farming. Perhaps when the plants start playing the Jaws theme, it may be time to water them.

This article was written by:

Florian Wolf


Florian is founder and CEO at Mergeflow, where he is responsible for company strategy and analytics development. Previously, Florian developed analytics software for risk management at institutional investors. He also worked as a Research Associate in Computer Science and Genetics at the University of Cambridge. Florian has a PhD in Cognitive Sciences from MIT.
