Keep up with the latest in Project CETI and hear directly from our team.
We’re thrilled to share that Project CETI is featured in The New Yorker this week in a piece authored by Elizabeth Kolbert (best known for her Pulitzer Prize-winning book The Sixth Extinction: An Unnatural History).
In June, CETI hosted the second annual Decoding Communication in Nonhuman Species II conference at the University of California, Berkeley (UC Berkeley) and is pleased to announce that the third annual conference will be held in June 2024 in Boston!
We celebrate the graduation of the inaugural CETI Marine Conservation Fellows, their achievements throughout the year, and the introduction of the new cohort.
Roger may be best known for uncovering the features of whale songs, but in the process he also uncovered a fundamental truth about humans and our ability to amplify the magic of nature.
Scientists are eavesdropping on animal conversations. Will generative AI enable us to talk back?
Soon after whaling ships began operating in the North Pacific, an interesting trend emerged. Within just a few years, whalers saw a 58% drop in their successful strikes. Sperm whales had suddenly become harder to kill: they had begun fleeing the boats.
For millennia, humans have regarded other species with curiosity and wonder.
Scientists are working to understand how culture shapes different species’ evolution, from bugs to birds.
Protecting the country’s whales sequesters as much carbon as taking 5,000 cars off the road every year – it also bolsters the local economy and paves the way for other countries to safeguard this at-risk species elsewhere.
For centuries, humans have recognized the mystic sanctity of whales and their oceanic hymns. What kind of wisdom could we uncover if we stopped to listen?
CUNY Distinguished Professor David Gruber founded the Cetacean Translation Initiative – Project CETI – which seeks to listen to and translate whale communication.
Imagine, says Tom Mustill, a biologist turned nature film-maker, that we are sperm whales in the Caribbean, nattering away to one another in a code of clicking.
Language was long understood as a human-only affair. New research suggests that isn’t so.
Scientists use artificial intelligence to study animal communication. AI can help identify the meaning and context of animal sounds. Project CETI uses AI and underwater technology to study sperm whale communication.
The nonprofit organization Project CETI, along with its local charitable organization Project CETI Dominica, has documented an extraordinary event in Dominica’s waters: the birth of a sperm whale.
Researchers believe that artificial intelligence may allow us to speak to other species.
The scientist David Gruber explains the mission of Project CETI, and what his team has learned about how whales communicate.
The last scientific record of a sperm whale birth was in 1986, without audio or video. New recordings of the whales’ behavior during the birth will give researchers new insight.
A new wave of researchers is using machine learning to understand animals on their own terms.
Humans, who have long dreamed of interspecies communication, are now working to decode the calls of two whale species. But our efforts to understand our underwater neighbors require more than just scientific investment—they take real empathy.
More than fifty years ago, my team and I first discovered that whales sing to each other.
On the "In Real Life" season finale, Sam Eaton examines how AI technology is opening up new possibilities for scientists communicating with animals — from sending messages to elephants on the plains of Kenya to mimicking honey bees in Germany and bat sounds in Israel.
Biologist David Gruber thinks decoding the language of whales could be just the first step in understanding what other lifeforms are saying—in this world and out of it.
Twenty youths from across Dominica have now honed their skills in photography after participating in a four-day photo camp organized by the National Geographic Explorers in partnership with the locally-based Project Cetacean Translation Initiative (CETI).
The sound of sperm whales. It’s a sound that has haunted sailors for centuries.
Few people are better qualified to answer the question “Who Speaks for the Oceans?” than David Gruber, a marine biologist and whale whisperer who founded Project CETI—the Cetacean Translation Initiative.
Scientists are using machine learning to eavesdrop on naked mole rats, fruit bats, crows and whales — and to communicate back.
In the 2016 sci-fi movie “Arrival,” a linguist and a theoretical physicist race against time to communicate with endangered extraterrestrial heptapods wishing to share their wisdom and technologies with the human race so it will survive and one day return the favor.
Scientists are wielding algorithms in hopes of understanding how the mighty mammals communicate.
Scientists who met at Radcliffe are exploring how whales communicate, and whether they have language.
An ambitious project is attempting to interpret sperm whale clicks with artificial intelligence, then talk back to them
Imagine trying to make contact with super-intelligent beings. Now imagine they’re not from space. Sperm whales, the largest of the toothed whales, boast enormous brains, explains David Gruber, a professor of biology and environmental science at the City University of New York.
Studying the behavior of animals roaming the deep sea is challenging, and progress is driven by technological developments.
Sperm whales (Physeter macrocephalus) are long-lived and highly social mammals that engage in complex group behaviours, including navigation, foraging, and child-rearing.
Sperm whale vocalizations are among the most intriguing communication systems in the animal kingdom. Traditionally, sperm whale codas, or groups of clicks, have been primarily analyzed in terms of the number of clicks and their inter-click timing.
Sperm whales are characterized by the inter-pulse interval (IPI) of their vocalization, which is a function of the whales’ size. In this paper, we propose a new methodology for detecting and extracting features from the IPI of sperm whale clicks.
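Since the IPI is the interval between the pulses within a single multi-pulsed click, a common way to estimate it is to look for a secondary peak in the autocorrelation of the click's envelope. The sketch below is a minimal numpy illustration of that general idea, not the paper's proposed methodology; the envelope construction, search bounds, and function name are assumptions for the example.

```python
import numpy as np

def estimate_ipi(click, fs, min_ms=2.0, max_ms=9.0):
    """Estimate the inter-pulse interval (ms) of a sperm whale click
    from the autocorrelation of its waveform envelope.

    click  : 1-D array of audio samples containing one click
    fs     : sampling rate in Hz
    min_ms : lower bound of the IPI search window (illustrative)
    max_ms : upper bound of the IPI search window (illustrative)
    """
    n = len(click)
    # Envelope as the magnitude of the analytic signal (FFT-based Hilbert)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    env = np.abs(np.fft.ifft(np.fft.fft(click) * h))
    env = env - env.mean()
    # Autocorrelation at non-negative lags; the pulse spacing
    # shows up as a secondary peak away from lag zero
    ac = np.correlate(env, env, mode="full")[n - 1:]
    lo = int(min_ms * 1e-3 * fs)
    hi = int(max_ms * 1e-3 * fs)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return 1000.0 * lag / fs
```

On a synthetic two-pulse click with pulses 5 ms apart, the autocorrelation peak falls at the corresponding lag and the function recovers an IPI near 5 ms.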
We provide quantitative evidence suggesting social learning in sperm whales across socio-cultural boundaries, using acoustic data from the Pacific and Atlantic Oceans.
This paper proposes a methodology for discovering meaningful properties in data by exploring the latent space of unsupervised deep generative models. We combine manipulation of individual latent variables to extreme values outside the training range with methods inspired by causal inference into an approach we call causal disentanglement with extreme values (CDEV) and show that this approach yields insights for model interpretability.
Recent years have seen breakthroughs in neural language models that capture nuances of language, culture, and knowledge. Neural networks are capable of translating between languages -- in some cases even between two languages where there is little or no access to parallel translations, in what is known as Unsupervised Machine Translation (UMT).
Machine learning has been advancing dramatically over the past decade. Most strides have come in human-centered applications, owing to the availability of large-scale datasets; however, opportunities are ripe to apply this technology to more deeply understand non-human communication.
We implemented Machine Learning (ML) techniques to advance the study of sperm whale (Physeter macrocephalus) bioacoustics. This entailed employing Convolutional Neural Networks (CNNs) to construct an echolocation click detector designed to classify spectrograms generated from sperm whale acoustic data according to the presence or absence of a click.
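The pipeline described above, a CNN that labels spectrogram windows as click or no-click, can be reduced to a schematic forward pass: convolution, nonlinearity, pooling, and a logistic output. The numpy sketch below shows that shape of computation only; the single-kernel architecture, pooling choice, and names are illustrative assumptions, not the model used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, k):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def classify_spectrogram(spec, kernel, w, b):
    """Tiny CNN-style forward pass: conv -> ReLU -> global average pool -> sigmoid.
    Returns an illustrative probability that the spectrogram contains a click."""
    feat = np.maximum(conv2d(spec, kernel), 0.0)      # convolution + ReLU
    pooled = feat.mean()                              # global average pooling
    return 1.0 / (1.0 + np.exp(-(w * pooled + b)))    # logistic output
```

In practice the weights would be learned from labeled click/no-click spectrograms; here they are left random purely to demonstrate the data flow.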
We’ll keep you up-to-date on the latest breakthroughs and progress.