Decoding the Wild

A boat in Alaska, a whale named Twain, and a twenty-minute temporal handshake spark a race to translate nature's voice.

[Speaker 1]: It’s August, 2021. There’s a research boat bobbing in the freezing waters of Southeast Alaska. The team on board, scientists from the SETI Institute and UC Davis, isn't looking for aliens. They are looking for humpback whales.

[Speaker 2]: They drop a speaker into the water. They play a recorded "contact call." It’s a specific sound known as a "whup." Then, they just wait.

[Speaker 1]: Silence. Nothing happens at first. But then a 38-year-old female whale named Twain approaches the boat. She circles. And she responds.

[Speaker 2]: But here is the specific detail that matters. It wasn't just *what* she said. It was *when* she said it. The researchers waited 10 seconds before playing a call; Twain waited roughly 10 seconds before responding. The researchers waited 20 seconds; Twain waited 20 seconds.

[Speaker 1]: For 20 minutes, this animal matched the humans' latency perfectly. It was a temporal handshake.

[Speaker 2]: This interaction is cited as the first time humans have successfully conversed with a whale in its own language. And it opens the door to a massive question. With the sudden rise of AI, we are moving from *listening* to animals to *talking* with them.

[Speaker 1]: But if we crack the code... what are they actually going to say?

[Speaker 2]: Today, we are looking at the race to translate nature, the "Umwelt" barrier that might make it impossible, and the ethical minefield of talking back.

[Speaker 1]: And that moment with Twain? That 10-second gap of silence? We need to keep that in mind. Because by the end of this, that silence might be the most important part of the conversation.

[Speaker 2]: To understand how we went from recording crickets to having a chat with a whale, we have to look at a fundamental shift in how we process language. It didn't start with biology. It actually started with geometry.

[Speaker 1]: Geometry?

[Speaker 2]: Right. For decades, the status quo of animal communication research was observational. You had biologists like Con Slobodchikoff spending 30 years sitting in fields, manually listening to prairie dogs.

[Speaker 1]: I remember this. He figured out they had different words for "coyote" versus "human," right?

[Speaker 2]: Exactly. He found they could even describe the color of a human's shirt. But it was slow, manual labor. You had to listen, write it down, and correlate it with what you saw. Then, around 2017, the field of AI translation had a massive breakthrough: unsupervised machine translation.

[Speaker 1]: Okay, before we get too technical, explain what changed. Because usually, to translate something, you need a dictionary. You need a Rosetta Stone.

[Speaker 2]: We don't have a Rosetta Stone for whales. So the computer scientists tried something else. They realized that if you map every word in a language, say English, into a 3D graph, the language forms a specific geometric shape. A "galaxy" of words.

[Speaker 1]: So "King" is close to "Queen," and "Man" is close to "Woman"?

[Speaker 2]: Precisely. The distance between words represents their relationship. And here’s the shocker: if you map Japanese, or Turkish, or Finnish, they all form roughly the same shape.

[Speaker 1]: Because we all live in the same physical reality.

[Speaker 2]: Right. Gravity pulls things down in English and in Japanese. We all have mothers. We all eat. So the "shapes" of our languages are nearly identical. To translate, you don't need a dictionary. You just take the "English galaxy," rotate it, and overlay it on the "Japanese galaxy." The points that align are the translations.

[Speaker 1]: And this…
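The "rotate one galaxy onto the other" idea the speakers describe can be sketched in a few lines. This is a toy illustration, not the actual pipeline used in unsupervised machine translation research: the vectors below are synthetic random points standing in for word embeddings, and the "Japanese" galaxy is manufactured by rotating the "English" one, so a perfect alignment exists by construction. The alignment step itself (orthogonal Procrustes via SVD) is the standard technique for fitting one point cloud onto another.

```python
import numpy as np

# Toy "word galaxies": each row is one word's vector. Real systems learn these
# from raw text; here they are synthetic random points (assumption for the demo).
rng = np.random.default_rng(0)
english = rng.normal(size=(6, 3))  # 6 "English" words in 3-D space

# Pretend "Japanese" is the same galaxy, just rotated: same shape, new orientation.
theta = 0.7
rotation = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                     [np.sin(theta),  np.cos(theta), 0.0],
                     [0.0,            0.0,           1.0]])
japanese = english @ rotation.T

# Orthogonal Procrustes: find the rotation W minimizing ||english @ W - japanese||,
# i.e. the best way to spin one galaxy onto the other. W = U @ Vt from the SVD
# of the cross-covariance matrix.
u, _, vt = np.linalg.svd(english.T @ japanese)
w = u @ vt

# "Translate": rotate each English vector and find its nearest Japanese neighbor.
mapped = english @ w
translations = [int(np.argmin(np.linalg.norm(japanese - vec, axis=1)))
                for vec in mapped]
# Because the galaxies really are the same shape, word i maps back to word i.
```

In the real unsupervised setting there is no seed dictionary and the match is only approximate, so systems refine the rotation iteratively and use smarter nearest-neighbor criteria, but the geometric core, overlay the two shapes and read the translations off the aligned points, is exactly what the speakers describe.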