Meeaaaoow rises like a question mark before dawn. Anyone living with a cat knows their sounds: broken chirrups like greetings, low growls that warn, purrs stitched into sleepy conversation. Ethologists have organized feline sounds that share acoustic and contextual qualities into more than 20 groupings, including the meow, the hiss, the trill, the yowl and the chatter. Any individual meow belongs, academically speaking, to a broad “meow” category, which itself contains many variations. The house cat’s vocal repertoire is far greater than that of its largely silent wild cousins. Researchers have even begun to study whether cats can drift into regional dialects, the way human accents bend along the Hudson or the Thames. And just as humans gesticulate, shrug, frown and raise their eyebrows, cats’ bodies write subtitles: a twitching tail declares excitement, flattened ears signal fear, and a slow blink promises peace. Felis catus is a chatty species that, over thousands of years of domestication, has pivoted its voice toward the peculiar primate that opens the fridge.
Now imagine pointing your phone at that predawn howl and reading: “Refill bowl, please.” Last December Baidu—a Chinese multinational company that specializes in Internet services and artificial intelligence—filed a patent application for what it describes as a method for transforming animal vocalizations into human language. (A Baidu spokesperson told Reuters last month that the system is “still in the research phase.”) The proposed system would gather animal signals and process them: it would store kitten or puppy talk for “I’m hungry” as code, then pair it not only with motion-sensing data such as tail swishes but also with vital signs such as heart rate and core temperature. All of these data would get whisked through an AI system and blended before emerging as plain-language phrases in English, Mandarin or any other tongue.
The dream of decoding cat speech is much older than deep learning. By the early 20th century meows had been recorded on wax cylinders, and in the 1970s John Bradshaw, a British anthrozoologist, began more than four decades of mapping how domestic cats tell us—and each other—what they mean. By the 1990s he and his then doctoral student Charlotte Cameron-Beaumont had established that the distinct domestic “meow,” largely absent between adults in feral colonies, is a bespoke tool for managing humans. Even domestic cats rarely use it with each other, though kittens do with their mothers. Yet for all that anecdotal richness, the formal literature remained thin: there were hundreds of papers on bird song and dozens on dolphin whistles but only a scattering on feline phonology until machine learning revived the field in the past decade.
One of the first hints that computers might crack the cat code came in 2018, when AI scientist Yagya Raj Pandeya and his colleagues released CatSound, a library of roughly 3,000 clips covering 10 types of cat calls labeled by the scientists—from hiss and growl to purr and mother call. Each clip went through software trained on musical recordings to describe a sound’s “shape”—how its pitch rose or fell and how long it lasted—and a second program cataloged them accordingly. When the system was tested on clips it hadn’t seen during training, it identified the right call type around 91 percent of the time. The study showed that the 10 vocal signals had acoustic fingerprints a machine can spot—giving researchers a proof of concept for automated cat-sound classification and eventual translation.
Momentum built quickly. In 2019 researchers at the University of Milan in Italy published a study focused on the one sound aimed squarely at Homo sapiens. The research sliced the meow into three situational flavors: “waiting for food,” “isolation in an unfamiliar environment” and “brushing.” By turning each meow into a set of numbers, the researchers revealed that a “feed me” meow had a noticeably different shape from a “where are you?” meow or a “brush me” meow. After they trained a computer program to spot those shapes, the researchers tested the system much as Pandeya and colleagues had tested theirs: it was presented with meows not seen during training—all hand labeled based on circumstances such as hunger or isolation. The system correctly identified the meows up to 96 percent of the time, and the research confirmed that cats really do tweak their meows to match what they’re trying to tell us.
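The basic recipe in these studies—turn each meow into a handful of numbers, then label an unseen meow by which training examples it sits closest to—can be sketched in a few lines of Python. The feature values and the nearest-centroid classifier below are invented for illustration; the Milan group used far richer acoustic measurements and a more sophisticated model.

```python
import math

# Toy feature vectors: (duration in seconds, mean pitch in Hz) per labeled meow.
# All numbers are made up for illustration, not taken from the Milan study.
TRAINING = {
    "waiting for food": [(0.8, 600.0), (0.9, 620.0), (0.7, 590.0)],
    "isolation":        [(1.5, 450.0), (1.4, 430.0), (1.6, 470.0)],
    "brushing":         [(0.5, 520.0), (0.6, 500.0), (0.4, 530.0)],
}

def centroid(points):
    """Average each feature across one label's training examples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(features, centroids=CENTROIDS):
    """Return the label whose centroid is nearest to an unseen meow."""
    def dist(a, b):
        # Crude rescaling so seconds and hertz contribute comparably.
        return math.hypot(a[0] - b[0], (a[1] - b[1]) / 100.0)
    return min(centroids, key=lambda label: dist(features, centroids[label]))

print(classify((0.85, 610.0)))  # a short, high meow
print(classify((1.45, 440.0)))  # a long, lower meow
```

The real systems differ mainly in scale: more features per meow, thousands of labeled examples and a trained model instead of simple centroids, but the test is the same—hold out meows the model never saw and count how often the predicted context matches the hand label.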
The research was then scaled to smartphones, turning kitchen-table curiosity into consumer AI. Developers at software engineering company Akvelon, including a former Alexa engineer, teamed up with one of the study’s researchers to create the MeowTalk app, which they claim can translate meows in real time. MeowTalk has used machine learning to categorize thousands of user-submitted meows by common intent, such as “I’m hungry,” “I’m thirsty,” “I’m in pain,” “I’m happy” or “I’m going to attack.” A 2021 validation study by MeowTalk team members claimed success rates near 90 percent. But the app also lets skeptical owners flag translations as incorrect, a reminder that the cat might be asking for something entirely different. Probability scores can simply reflect pattern similarity—not necessarily the animal’s exact intent.
Under the hood, these machine-learning systems treat cat audio tracks like photographs. A meow becomes a spectrogram: one axis represents time, the other indicates pitch, and colors or brightness show loudness. Just as AI systems can pick out a cat’s whiskers in a photograph, they can classify sound images that subtly distinguish specific kinds of meows. Last year researchers at Duzce University in Türkiye upgraded the camera: they fed spectrograms into a vision transformer, a model that chops them into tiles and assigns weights to each one to show which parts of the sound give the meow its meaning.
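The time-versus-pitch picture can be demonstrated with a toy example in plain Python: a synthetic tone that sweeps upward stands in for a rising meow, and a windowed discrete Fourier transform turns it into spectrogram columns. The sample rate, window size and frequency sweep are all invented for illustration; production systems use FFT libraries and mel-scaled frequency bins rather than this naive DFT.

```python
import cmath
import math

SAMPLE_RATE = 8000   # Hz; assumed for this synthetic example
WINDOW = 256         # samples per spectrogram column

def rising_meow(seconds=0.5):
    """Synthetic stand-in for a meow: a tone sweeping 400 Hz upward."""
    n = int(SAMPLE_RATE * seconds)
    samples = []
    for i in range(n):
        t = i / SAMPLE_RATE
        # Phase is the integral of the instantaneous frequency 400 + 800t.
        phase = 2 * math.pi * (400.0 * t + 400.0 * t * t)
        samples.append(math.sin(phase))
    return samples

def spectrogram(samples):
    """List of columns; each column holds one magnitude per frequency bin."""
    cols = []
    for start in range(0, len(samples) - WINDOW + 1, WINDOW):
        frame = samples[start:start + WINDOW]
        col = []
        for k in range(WINDOW // 2):  # naive DFT magnitude per bin
            s = sum(frame[n] * cmath.exp(-2j * math.pi * k * n / WINDOW)
                    for n in range(WINDOW))
            col.append(abs(s))
        cols.append(col)
    return cols

def dominant_freqs(samples):
    """Peak frequency (Hz) per time slice: one axis time, the other pitch."""
    bins = [max(range(len(c)), key=c.__getitem__) for c in spectrogram(samples)]
    return [b * SAMPLE_RATE / WINDOW for b in bins]

freqs = dominant_freqs(rising_meow())
print(freqs)  # the peak frequency climbs, mirroring the rising pitch
```

Stacking those columns side by side, with brightness standing in for magnitude, yields exactly the kind of image a vision model can be trained on.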
And in May 2025 entrepreneur Vlad Reznikov uploaded a preprint to the social network ResearchGate on what he calls Feline Glossary Classification 2.3, a system that explodes cat vocabulary categorizations to 40 distinct call types across five behavioral groups. He used one machine-learning system to find the shapes inside each sound and another to study how those shapes change over the length of a single vocalization. Howls stretch, purrs pulse and many other distinct vocalizations link together in varying ways. According to Reznikov’s preprint, the model had a greater than 95 percent accuracy in real-time recognition of cat sounds. Peer reviewers have yet to sharpen their pencils, but if the system can reliably distinguish a bored yowl from a “where’s my salmon?” warble, it may, if nothing else, save a lot of carpets.
As for Baidu, the blueprint for its patent says its approach adds new kinds of information rather than deeper sound analysis. Imagine a cat with a fitness tracker and a baby monitor, as well as an AI assistant to explain what it all means. Whether combining these data will make the animal’s message clearer or add confusion remains to be seen.
Machine learning is increasingly being used to understand other aspects of animal behavior as well. Brittany Florkiewicz, a comparative and evolutionary psychologist, uses it to identify how cats mimic one another’s facial expressions and to track the physical distance between them to infer relationships. “Generally speaking, machine learning helps expedite the research process, making it very efficient and accurate, provided the models are properly guided,” she says. She believes the emergence of apps for pet owners shows how much people are thinking about innovative ways to better care for their pets. “It’s positive to see both the research community and everyday pet owners embracing this technology,” she says.
Interest in animal vocalization extends not just to cats but to one of their favorite menu items: mice. DeepSqueak, a machine-learning system devised by psychologist Kevin Coffey and his team, does for rodents what the other systems do for cats. “Mice courtship is really interesting,” Coffey says—particularly “the full songs that they sing that humans can’t hear but that are really complex songs.” Mice and rats normally communicate in an ultrasonic range, and machine learning decodes these inaudible chirps and whistles and links them to circumstances in which they occur in the lab.
Coffey points out, however, that “the animal communication space is defined by the concepts that are important to [the animals]—the things that matter in their lives…. A rat or a mouse or cat is mostly interested in communicating that they want social interaction or play or food or sex, that they’re scared or hurt.” For this reason, he’s skeptical of grandiose claims made by AI companies “that we can overlap the conceptual semantic space of the animal languages and then directly translate—which is, I think, kind of total nonsense. But the idea that you can record and categorize animal vocalizations, relate them to behavior, and learn more about their lives and how complex they are—that’s absolutely happening.” And though he thinks an app could realistically help people recognize when their cat is hungry or wants to be petted, he doubts it’s necessary. “We’re already pretty good at that. Pet owners already communicate with their animal at that level.”
Domesticated animals also communicate across species. A 2020 study found that dogs and horses playing together rapidly mimicked each other’s relaxed open-mouth facial expressions and self-handicapped, putting themselves into disadvantageous or vulnerable situations to maintain well-balanced play. Florkiewicz believes this might be partly a result of domestication: humans selected which animals to raise based on communicative characteristics that facilitated shared lives.
The mutual story of humans and cats is thought to have begun 12,000 years ago—when wildcats hunted rodents in the first grain stores of Neolithic farming villages in the Fertile Crescent—so there has been time for us to adapt to each other. By at least 7500 B.C.E., in Cyprus (an island with no native felines), a human had been interred with a cat. Later the Egyptians revered them; traders, sailors and eventually Vikings carried them around the world on ships; and now scientists have adapted humans’ most sophisticated technology to try to comprehend their inner lives. But perhaps cats have been coaching us all along, and maybe they’ll judge our software with the same cool indifference they reserve for new toys. Speech, after all, isn’t merely a label but a negotiated meaning—and cats, as masters of ambiguity, may prefer a little mystery.