Talking Angela Game: Some People Still Believe Hoax That Human is Watching Them

March 25, 2014 Updated: March 25, 2014    

Some people still believe that the Talking Angela app presents safety dangers.

Many people commenting on the game’s Facebook page are trying to warn others that there are humans controlling the talking cat Angela, doing things like capturing details of people’s homes and trying to obtain personal details.

“OMG!!!i did zoom and on my phone and i saw that angela was having my room in her eyes…i just delete that game…is just to creepy!!!” said Pigi Eleftheriadou on the page. 

“I see a guy in her eyes ! Beware!” said Maria Paw. 

“That app is hacked and there are some questions you do not want to answer,” said Laimis Lacis.

The hoax claims that pedophiles are controlling the cat and are targeting children in particular.

It started over a year ago but recently went viral on Facebook. 

One of the false rumors, started on the fake news site Huzlers.com, claimed that the game led to a child’s disappearance. 

One of the messages said:

“ATTENTION PARENTS & GRANDPARENTS! My future daughter-in-law just received this warning from a friend on her page. Do not let your child download the Talking Angela app! It is very creepy! Gracie downloaded it without asking to her kindle fire because it was free and a really cute cat. She brought it to me to answer the question it asked. I immediately noticed it had activated the camera. It had already asked her name, age, and knew she was in the living room! I immediately deleted it!”

But security researchers emphasize that there’s nothing wrong with the application.

Graham Cluley of Sophos security said in a blog post that the warnings are bogus. 

“The truth is that Talking Angela appears to be entirely benign, and there are no obvious privacy concerns that differentiate it from thousands of other iPhone apps,” he said. 

“The app’s purpose is to wait until the child says something and then mimic what they say back to them (albeit in a Parisian feline fashion) rather than to pilfer details of where they go to school. None of this, of course, is to say that you shouldn’t be careful about what smartphone apps you install, and which Facebook applications you grant access to your social networking profile.

“Furthermore, it’s always a good idea to keep a close eye on what children are doing on the internet – in case they get themselves into a spot of bother. But the warning spreading across Facebook appears to be nothing more than a scare – setting the cat amongst the pigeons unnecessarily.”

Bruce Wilcox, who developed the app with his wife Sue, told CNET that the technology they used is so advanced that it has fooled people.

“The more realistic an AI is, the more people will see their own fears and fantasies in what it says. Parents are hypersensitive these days. But there were obvious lies being said, so it was more than mere hypersensitivity. They made claims of things Angela said that we know she couldn’t have said.”

“Angela asks about your family, but she doesn’t memorize that you have a brother, so she wouldn’t inquire about your brother later. And some things that have been attributed to her saying about tongues (in the sexual sense), she does not have in her repertoire,” he added.

“What makes our technology so convincing is that, unlike most chatbots out there, which can make single quibbling responses to inputs, ours can lead conversations and find appropriate prescripted things to say much of the time. We care about backstory and personality and emotion, and strive to create true characters with a life of their own whose aim is to draw the user into their world. The characters are convincing because they are convinced of their own reality … Angela successfully captures the teen personality.

“For Angela it is all about her feelings. And Angela is selfish at times. And not only can she be rude but she can detect you being rude and react appropriately. This user is deeply involved in emotional reactions to Angela. That’s what we strive for.

“We won the 2010 Loebner Prize by fooling a human judge into thinking our chatbot was a human. It was accomplished in part via our attention to creating synthetic emotion.”
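To give a rough sense of what “prescripted things to say” means in practice, here is a minimal, purely illustrative sketch of a pattern-matching chatbot. It is not the app's actual engine, which Wilcox describes as far more sophisticated (leading conversations, tracking emotion); every rule and reply below is hypothetical.

```python
import re

# Each rule pairs a pattern with a prescripted reply. A real engine of the
# kind Wilcox describes would also track conversation state, emotion, and
# personality; this sketch only shows the basic match-and-respond idea.
RULES = [
    (re.compile(r"\b(hi|hello)\b", re.I), "Hi! I was just thinking about you."),
    (re.compile(r"\bsad\b", re.I), "Oh no. Tell me what happened?"),
    (re.compile(r"\?$"), "Hmm, good question. What do you think?"),
]

# Fallback keeps the character "leading" the conversation when no rule fires.
DEFAULT = "Anyway, enough about that. Let's talk about me!"

def reply(user_input: str) -> str:
    """Return the first matching prescripted reply, or the fallback."""
    for pattern, response in RULES:
        if pattern.search(user_input):
            return response
    return DEFAULT

if __name__ == "__main__":
    print(reply("hello there"))  # greeting rule fires
    print(reply("just got home"))  # no rule matches, fallback fires
```

Note that even this toy version never stores anything the user says, which matches the researchers' point above: a scripted character can feel responsive without collecting personal details.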
