[ Question ]

Could we use current AI methods to understand dolphins?

by Daniel Kokotajlo · 1 min read · 22nd Mar 2020 · 5 comments


I am told that unsupervised machine translation is a thing. This is amazing. I ask: Could we use it to understand dolphin language? (Or whales, perhaps?)

I don't currently see a convincing reason why not. Maybe dolphins aren't actually that smart or communicative, and their clicks are mostly just very simple commands or requests, but that should make this easier, not harder. Maybe the blocker is that dolphins have such a different set of concepts from English speakers that translation would be too hard?

1 Answer

The approach in the linked article tries to match words that mean the same thing across languages by separately building a vector embedding of each language's corpus and then looking for structural (neighborhood) similarity between the embeddings, with a global 'rotation' step that maps one vector space onto the other.
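To make the 'rotation' step concrete, here is a minimal sketch in Python/NumPy of the classical orthogonal Procrustes solution, under the simplifying assumption that we already know which rows of the two embedding matrices correspond. The genuinely unsupervised methods have to find this rotation without such a dictionary (e.g. adversarially) and then refine it, but the kind of map they search for is the same. All sizes and names below are illustrative.

```python
import numpy as np

def orthogonal_procrustes(X, Y):
    """Return the orthogonal map W minimizing ||X @ W - Y||_F.

    X, Y: (n, d) arrays whose i-th rows are embeddings of words
    assumed to mean the same thing in the two languages.
    """
    # Classical closed-form solution: if X^T Y = U S V^T (SVD),
    # the minimizing orthogonal W is U V^T.
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Toy check: language B's embedding space is a rotated, slightly
# noisy copy of language A's, and the alignment recovers the rotation.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))                  # 500 "words", 50-dim embeddings
Q, _ = np.linalg.qr(rng.normal(size=(50, 50)))  # a random orthogonal "rotation"
Y = X @ Q + 0.01 * rng.normal(size=X.shape)

W = orthogonal_procrustes(X, Y)
print(np.linalg.norm(W - Q))  # small: W ≈ Q
```

The unsupervised trick is that, given enough shared structure, neighborhood geometry alone pins down W, and that is exactly the thing that fails if the Dolphin vocabulary barely overlaps with English.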

So if both languages have a word for "cat", and many other words related to cats, and the relationships between these words are the same in both languages (e.g. 'cat' is close to 'dog' in a different way than it is to 'food'), then these words can be successfully translated.

But if one language has a tiny vocabulary compared to the other, and its vocabulary isn't even a subset of the other language's (dolphins don't talk about cats), then you can't get far. Unless you have an English training dataset that only uses words that do have translations in Dolphin. But we don't know what dolphins talk about, so we can't build this dataset.

Also, this is machine learning on text made of discrete words; do we even have a 'separate words' parser (a tokenizer) for dolphin signals?
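For what it's worth, the crudest possible 'separate words' step for an audio stream might look like the energy-threshold segmenter sketched below (Python/NumPy). Everything here is an assumption: the threshold value, the minimum-silence gap, and above all the premise that dolphin vocalizations decompose into discrete word-like bursts at all.

```python
import numpy as np

def segment_by_silence(signal, sample_rate, threshold=0.02, min_gap_s=0.05):
    """Split a 1-D audio signal into bursts separated by silence.

    A toy stand-in for a real tokenizer: any run of samples whose
    amplitude exceeds `threshold` counts as one "unit", and a unit
    closes once the signal has been quiet for at least `min_gap_s`
    seconds. Both parameters are illustrative assumptions.
    """
    min_gap = int(min_gap_s * sample_rate)
    active = np.abs(signal) > threshold
    segments, start, quiet = [], None, 0
    for i, is_loud in enumerate(active):
        if is_loud:
            if start is None:
                start = i           # a new burst begins
            quiet = 0
        elif start is not None:
            quiet += 1
            if quiet >= min_gap:    # enough silence: close the burst
                segments.append((start, i - quiet + 1))
                start, quiet = None, 0
    if start is not None:           # close a burst that runs to the end
        segments.append((start, len(signal)))
    return segments                 # list of (start_sample, end_sample)

# Toy usage: two sine bursts in two seconds of otherwise silent audio.
sr = 1000
t = np.arange(2 * sr)
sig = np.zeros(2 * sr)
sig[100:300] = np.sin(t[100:300] * 0.3)
sig[900:1200] = np.sin(t[900:1200] * 0.5)
print(segment_by_silence(sig, sr))  # roughly [(100, 300), (900, 1200)]
```

Of course, getting from "bursts of sound" to anything word-like, with stable, repeated units that compose, is the actual open problem.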