Google Translate has been acting oddly lately. That isn’t an opinion based on getting a slightly off translation now and then, or even one that is only about half accurate. Google Translate is spitting out what appear to be biblical-style doomsday prophecies and assorted messages that are completely off the wall. As apps sometimes do, code gets a little garbled somewhere, or new learning changes the way an app reacts to input, but a change as drastic as this one is so strange and foreign that some are wondering whether Google Translate has become a messenger of doom.
On social media, people have been posting screencaps of the messages they have gotten back from the app, and the results are both humorous and, at times, a little disturbing. Anyone can try it out by typing a random combination of two or three letters into the app and letting it do the rest. The more times the letter combination is repeated, the more detailed the translation becomes. For instance, typing in “dog” 19 times with a space between each entry yields the translation, “Doomsday Clock is three minutes at twelve We are experiencing characters and a dramatic developments in the world, which indicate that we are increasingly approaching the end times and Jesus’ return.”
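If you want to reproduce the trick, the only fiddly part is building the repeated-word input. A minimal Python sketch (the word “dog” and the 19 repetitions are the only details taken from the article; the helper name is just for illustration):

```python
# Build the kind of repeated-token input that triggered the odd
# translations: a single word repeated N times, space-separated.
def repeated_input(token: str, count: int) -> str:
    return " ".join([token] * count)

# The article's example: "dog" typed 19 times with spaces between.
query = repeated_input("dog", 19)
print(query)  # "dog dog dog ..." (19 copies)
```

Paste the resulting string into the Translate input box (with the source language set as described below) to try it yourself.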
Admittedly, you sometimes have to fiddle with the language setting to get results. In the “dog” example, for instance, you have to set the input language on the left side to Maori. Sometimes it works with the default “detect language” setting. The more combinations you try, the more odd translations you get.
While there has been some social media speculation that Google Translate has been possessed by demons or ghosts, or has turned into an actual tool for God to speak to us humans, the real answer is much simpler than that: it’s just the way the app learns, and the learning is imperfect. Google spokesperson Justin Burr told Nextweb exactly what is going on.
“Google Translate learns from examples of translations on the web and does not use ‘private messages’ to carry out translations, nor would the system even have access to that content. This is simply a function of inputting nonsense into the system, to which nonsense is generated.”
Andrew Rush, an assistant professor at Harvard who studies natural language processing and computer translation, told Motherboard essentially what Burr said, but in more technical terms. Rush explained that the AI running Google’s translation system is trained to produce something human-sounding from whatever input it is given. The translation it creates may not make any sense, but it works within the confines of its programming and learning abilities.
Between Burr and Rush, it’s clear that the Google Translate AI has not evolved to the point of adopting religion and spreading the word of a divine entity. It’s merely a hole in the app’s learning curve, and in time it is expected to teach itself to handle strange input better. For now, most people who have tried coaxing odd responses from the app say it’s kind of fun.