Scientists learn to speak dolphin

Scientists have used AI to decode dolphin clicks...
12 December 2017

Interview with 

Dr Kaitlin Frasier - Scripps Institution of Oceanography, University of California San Diego


Dolphins are exceptional in the variety of sounds they can make. As well as being able to communicate with each other through a complex language of whistles, they also use echolocation “clicks” to hunt down prey and to understand their immediate surroundings. A team at the Scripps Institution of Oceanography, in California, have developed an artificially intelligent system to spot patterns in these clicks and assign them to the species that made them. Learning this lingo should, hopefully, improve our ability to monitor dolphin populations and also to understand more about their behaviour. Lewis Thomson heard how it works from creator Kaitlin Frasier...

Kaitlin - Dolphins make two or three main categories of sound. They make whistles, which are communication-oriented, and then they make these echolocation clicks, which are like bat sounds. They’re really high frequency and very short - microseconds long - kind of laser beams of sound that they produce out of their forehead. They have an organ up there called the melon that focuses the sound like a lens, and then it comes out of their forehead and bounces off of things in the environment, and then the reflection comes back. Based on that reflection - what frequencies come back and how fast - they can interpret if there’s a target in front of them: what is it, how far away is it, is it hard, is it squishy, is it something to eat - that kind of thing.

They’re producing these signals constantly and we, as scientists, are able to eavesdrop on those signals and use them as a tool for studying dolphins. We build acoustic recording devices that will sit on the seafloor and record these sounds for very long periods of time.

Lewis - What’s different about the way that you’re trying to study these sounds?

Kaitlin - As a grad student I spent a lot of time looking at data manually, looking at echolocation clicks. Specifically, I work in the Gulf of Mexico, and so I had millions of these echolocation clicks that I had detected in data, and I spent a lot of time staring at computer screens thinking: okay, I think this type is different from that one, trying to wrap my head around what the similarities of some were versus others. Then I realised, you know, I think it would be better to use a computer to do this consistently. So what we’re doing now is trying to use unsupervised learning - that’s the idea of: I don’t know exactly what’s in this data set, but I’m going to use computing techniques to tell me more about my data without me telling it up front what it needs to find - to see if it can help us understand our acoustic data better.
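For readers curious what “unsupervised learning” looks like in practice here, the sketch below clusters click spectra with scikit-learn’s KMeans without being given any species labels up front. The synthetic spectra and the choice of two clusters are illustrative assumptions, not the team’s actual pipeline.

```python
# Minimal sketch of unsupervised clustering of echolocation clicks.
# Assumes each click has been reduced to a normalised frequency spectrum;
# the synthetic data and the cluster count are illustrative only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Pretend dataset: 1,000 clicks, each summarised as a 50-bin spectrum,
# drawn from two synthetic "click types" peaking at different frequencies.
bins = np.arange(50)
spectra = np.vstack([
    rng.normal(loc=np.exp(-0.5 * ((bins - 15) / 4) ** 2), scale=0.05, size=(500, 50)),
    rng.normal(loc=np.exp(-0.5 * ((bins - 35) / 4) ** 2), scale=0.05, size=(500, 50)),
])

# No labels are provided: the algorithm groups clicks purely by similarity.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spectra)
print("Clicks per discovered cluster:", np.bincount(labels))
```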

Lewis - So instead of telling the computer that this click is produced by this species of dolphin, you’re letting it work it out for itself, is that right?

Kaitlin - Right. We’re using these network analysis tools to aggregate lots of similar dolphin clicks together, and what we’re looking at in the dolphin clicks is frequency content - whether the frequencies are low or high - and that varies between species. We think with some species it may have to do with how its head is shaped. So we’re using that, and we’re also using the rate that they’re clicking at - some click slower on average - so we’re using those two pieces of information combined to look for unique click types in our dataset.
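The “network analysis” she describes can be pictured as a graph in which each bundle of clicks is a node, nodes with similar frequency content and click rates are linked, and a community-detection algorithm finds the natural groupings. The sketch below, built on made-up features with networkx, is one way such a similarity network could work; it is not the team’s actual code.

```python
# Sketch of network-based grouping: nodes are bins of clicks, and edges
# connect bins whose (peak frequency, inter-click interval) features are
# similar. Features, threshold and weights are illustrative assumptions.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(1)

# Each bin summarised by two features: peak frequency (kHz) and
# mean inter-click interval (ms), for two synthetic click types.
features = np.vstack([
    rng.normal([30, 80], [2, 5], size=(40, 2)),
    rng.normal([45, 40], [2, 5], size=(40, 2)),
])

# Build a similarity graph: connect bins whose features are close.
graph = nx.Graph()
graph.add_nodes_from(range(len(features)))
for i in range(len(features)):
    for j in range(i + 1, len(features)):
        distance = np.linalg.norm((features[i] - features[j]) / [2.0, 5.0])
        if distance < 2.0:  # arbitrary similarity threshold
            graph.add_edge(i, j, weight=1.0 / (1.0 + distance))

# Community detection recovers the click types without any labels.
communities = greedy_modularity_communities(graph, weight="weight")
print("Bins per community:", [len(c) for c in communities])
```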

Lewis - How do you know if it’s doing this correctly? Is there a way of checking if it’s right?

Kaitlin - For now what we’re doing is comparing it to what a human analyst would do but on a smaller dataset. We have a dataset that a human has looked at and then we run that using the computer and we compare. For example Risso’s dolphins have a very distinct click type and that has sort of emerged from the unsupervised process, which gives you a sense that it is doing something. But the next step is really going out into the field to just see if we can figure out what species or genus is making these different click types.
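One standard way to put a number on that comparison is an agreement score between the computer’s cluster assignments and the human analyst’s labels, such as the adjusted Rand index in scikit-learn. The labels below are invented purely to show the calculation.

```python
# Sketch: scoring how well unsupervised clusters match a human analyst's
# labels on a small, already-annotated dataset. Labels are invented.
from sklearn.metrics import adjusted_rand_score

human_labels   = ["Risso's", "Risso's", "type_A", "type_A", "type_B", "type_B"]
machine_labels = [0, 0, 1, 1, 2, 1]  # cluster IDs from the unsupervised step

# 1.0 means perfect agreement; values near 0 mean chance-level agreement.
print(adjusted_rand_score(human_labels, machine_labels))
```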

Lewis - How do you think this computer approach will help change our understanding of dolphins?

Kaitlin - These clicks are produced by all of the animals in a population and they’re in large numbers, so by recording these clicks you can do a lot of back calculations to estimate how many animals are swimming through the area over time and look at how populations are changing. So what this research is doing is trying to take it to the next level: not just how many are there but who, like what animal, what species, what genus, those sorts of questions so that we can start to get a more detailed picture.
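The “back calculations” broadly resemble cue-counting density estimation: divide the number of clicks detected by the number of clicks a single animal would be expected to contribute over the monitored area and time. The figures below are placeholders to illustrate the arithmetic, not real Gulf of Mexico values.

```python
# Sketch of a cue-counting style back calculation (placeholder numbers).
clicks_detected = 2_000_000        # clicks logged by the seafloor recorder
recording_hours = 24 * 365         # one year of monitoring
click_rate_per_animal = 3_600      # clicks per animal per hour (assumed)
detection_probability = 0.3        # fraction of produced clicks detected (assumed)
monitored_area_km2 = 20.0          # effective listening area (assumed)

animals_per_km2 = clicks_detected / (
    recording_hours * click_rate_per_animal * detection_probability * monitored_area_km2
)
print(f"Estimated density: {animals_per_km2:.3f} animals per km^2")
```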
