How Scientists Are Using AI to Talk to Animals
As I have predicted, AI is poised to make breakthroughs in animal communication, given that little else is likely to do so as quickly.
There were numerous attempts in the mid-20th century to teach human language to nonhumans, primates such as Koko, and those efforts were somewhat controversial. Looking back, one view we have now (that may not have been so prevalent then) is that we were too anthropocentric in our approach. The desire then was to assess nonhuman intelligence by teaching nonhumans to speak as we do, when in fact we should have been thinking about their ability to engage in complex communication on their own terms, in their own embodied way, within their own worldview.

One of the terms used in the book is umwelt, the notion of the lived experience of an organism. If we are attentive to the umwelt of another organism, we wouldn't expect a honeybee to speak human language, but we would become very interested in the fascinating language of honeybees, which is vibrational and positional, and sensitive to nuances such as the polarization of sunlight that we can't even begin to convey with our bodies.

And that is where the science is today. The field of digital bioacoustics, which is accelerating exponentially and unveiling fascinating findings about communication across the tree of life, now approaches these animals and asks not "Can they speak like humans?" but "Can they communicate complex information to one another? How are they doing so? What is significant to them?" I would say that is a more biocentric approach, or at the very least a less anthropocentric one.
Looking back, it seems obvious that we should try to understand their experience of life rather than project ours onto them.
In my experience with dogs, one can communicate with them, but through a kind of Bluetooth-like connection carried by feelings. Humans mostly use word commands and pair them with parallel emotions. The animal senses the feeling, and this helps it attach meaning to the word.
The human brain, when it writes to memory, adds emotional tags to sensory content. With animals we induce the feeling first, and the word, being a type of audio content, gets tagged for memory storage. Words can carry emotions: "sit" delivered as a stern warning, or a pointing gesture as a visual cue, adds to the writing process.
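The idea above can be made concrete as a toy sketch, nothing more: all names here (`DogMemory`, `write`, `respond`) are hypothetical, and the model simply assumes that memories are stored as word–emotion pairs and that the currently sensed emotion wins over the literal word when the two conflict.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MemoryEntry:
    word: str               # the audio content (the command word)
    emotion: str            # the emotional tag written alongside it
    cue: Optional[str] = None  # optional visual cue, e.g. pointing

class DogMemory:
    """Toy model: memories are written as (word, emotion) pairs,
    and the sensed emotion drives the response more than the word."""

    def __init__(self):
        self.entries = []

    def write(self, word, emotion, cue=None):
        # The feeling is induced first; the word gets tagged to it.
        self.entries.append(MemoryEntry(word, emotion, cue))

    def respond(self, word, current_emotion):
        # If the word is known but the sensed emotion conflicts with
        # its stored tag, the emotion overrides the literal word.
        for entry in self.entries:
            if entry.word == word:
                if entry.emotion == current_emotion:
                    return entry.emotion
                return current_emotion
        # Unknown word: the animal reacts to the feeling alone.
        return current_emotion

mem = DogMemory()
mem.write("sit", "stern", cue="point")
mem.write("good dog", "happy")

print(mem.respond("good dog", "happy"))  # word and emotion agree
print(mem.respond("good dog", "angry"))  # emotion overrides the word
```

This also mirrors the "bad dog with a playful smile" experiment described later: the same word with a reversed emotion yields a reversed response.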
With my Belgian Malinois, I invented a command I called "all done". This nebulous command meant that whatever he was doing, it was time to stop, no matter the circumstances, and do something new. He learned this open-ended command and could be redirected from any behavior or fixation on any occasion. It worked because I originally associated "all done" with changing to something new and fun, so he associated it with the happiness of learning. This breed has very high adaptive intelligence and can learn highly adaptive commands.
In a few experiments with my dog, I would use the learned command words but alter the emotion behind the word to see whether he reacted to the word or to the new emotion. "Bad dog" with a playful smile and "good dog" with an angry voice would reverse the meanings of the commands.
Sometimes, even today, if I come to an exciting realization in my thinking, one that causes a stronger inner feeling, my two smaller dogs will bark and act as though an intruder is nearby, even from the other room. This may be connected to many years of new ideas being met with censure. They feel a ping and try to warn me of danger, and I may tone myself down.
Language is newer than the domestication of dogs, so how did humans and dogs communicate before spoken language? The dog would associate visual cues with human emotions. This is where AI has no clue, since machines lack feelings. Feelings are critical to writing to memory in animals. AI will need a human to generate the feelings as an intermediary to the words.