Researchers at the Massachusetts Institute of Technology (MIT) created an algorithm named Norman, in tribute to Norman Bates from Alfred Hitchcock's 1960 horror film Psycho, and trained it on gruesome picture captions, causing it to associate objects with death.
Norman is not your typical artificial intelligence system. The team of three developers at the MIT Media Lab created it to demonstrate that a perfectly good algorithm can become biased if it is fed only biased data.
Scientists Pinar Yanardag, Manuel Cebrian and Iyad Rahwan conducted the experiment not to fulfil a maniacal plan to doom humankind, but to prove that machine-learning algorithms can be greatly affected by biased input data. Besides this dark dataset, Norman was also exposed to a Reddit community dedicated to documenting the disturbing realities of death and its causes. The image below captures the effect that Reddit thread had on Norman's ability to perceive images.
The Rorschach Test revolves around an individual's perception of inkblots, with responses being analyzed using psychological interpretation.
The MIT team thinks it will be possible for Norman to retrain its way of thinking via learning from human feedback.
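The article does not describe how that retraining would work, but the general idea can be illustrated with a deliberately simplified sketch. Here a toy "model" is just a frequency table of candidate captions (a stand-in for learned weights, not MIT's actual method), and each piece of corrective human feedback nudges the counts back toward benign descriptions:

```python
from collections import Counter

def describe(model):
    """Return the caption the model currently favours."""
    return model.most_common(1)[0][0]

# Hypothetical toy model: caption frequencies stand in for learned weights.
norman = Counter({"man gets electrocuted": 5, "a bird flying": 1})
print(describe(norman))  # "man gets electrocuted"

# Human feedback: each corrective label nudges the counts back.
for _ in range(10):
    norman["a bird flying"] += 1
print(describe(norman))  # "a bird flying"
```

The point of the sketch is only that exposure to enough counterexamples can outweigh the biased signal, which is the intuition behind retraining on human feedback.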
Norman was born from the insight that the data used to teach a machine-learning algorithm can significantly influence its behavior.
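That influence is easy to reproduce on a small scale. The following sketch (my own illustration, not the MIT team's code) trains the same trivial bigram caption model on two hypothetical corpora, one neutral and one "dark"; identical inputs then yield very different continuations:

```python
from collections import Counter, defaultdict

def train_bigram(captions):
    """Count word-to-next-word transitions in a caption corpus."""
    model = defaultdict(Counter)
    for cap in captions:
        words = cap.lower().split()
        for a, b in zip(words, words[1:]):
            model[a][b] += 1
    return model

def predict_next(model, word):
    """Most frequent continuation seen in training data."""
    return model[word].most_common(1)[0][0]

# Two hypothetical training sets for the same algorithm.
neutral = ["a man crosses the street", "a man flies a kite"]
dark = ["a man gets electrocuted", "a man gets shot"]

print(predict_next(train_bigram(neutral), "man"))  # a benign continuation
print(predict_next(train_bigram(dark), "man"))     # "gets"
```

The algorithm is identical in both runs; only the data differs, which is exactly the effect Norman was built to demonstrate.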
Researchers then took a series of Rorschach inkblots and fed them to both Norman and a standard AI to compare results.
As a result, Norman only sees death in everything.
In one inkblot test, a standard AI saw "a black and white photo of a red and white umbrella", while Norman saw "man gets electrocuted while attempting to cross busy street". In another, the standard AI described people standing close together, while Norman saw "pregnant woman falls at construction story".

Norman is not the first AI to go astray after exposure to toxic data. In 2016, Microsoft launched Tay, a Twitter chatbot.
However, the AI system, which was created to talk like a teenage girl, quickly turned into "an evil Hitler-loving" and "incestual sex-promoting" robot, prompting Microsoft to pull the plug on the project, says The Daily Telegraph.