
The Rorschach inkblot test, developed in 1921 to probe human perception and personality, is now being used to explore how AI interprets ambiguous images, with researchers examining how intelligent systems perceive and analyze them.
In a test conducted by the BBC, this psychological tool was applied to AI to help scientists understand how machines process ambiguous stimuli, and to explore the intersection of AI, cognitive psychology, and human perception.
Traditionally, the Rorschach test reveals human emotions, psyche, and unconscious biases through human interpretation of symmetrical inkblots.
AI and Psychology Experiments
Recent AI models, known as “multimodal models,” such as OpenAI’s ChatGPT, allow machines to understand both text and images. To observe how AI would interpret Rorschach inkblots, the BBC presented ChatGPT with some classic inkblot pictures. When shown the first inkblot, which humans commonly see as a bat, moth, or butterfly, the AI recognized that it was an image from a Rorschach test.
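To make the idea of a multimodal prompt concrete, here is a minimal sketch of how text and an image can be combined in a single request, assuming the OpenAI Python SDK; the model name, image URL, and prompt wording are illustrative assumptions, not details from the BBC test.

```python
# Sketch: asking a multimodal model to describe an inkblot image.
# Assumes the OpenAI Python SDK (pip install openai); the URL and
# model name below are placeholders, not from the BBC experiment.

def build_inkblot_prompt(image_url: str) -> list:
    """Pair a text question with an image in a single chat message."""
    return [{
        "role": "user",
        "content": [
            {"type": "text", "text": "What might this inkblot resemble?"},
            {"type": "image_url", "image_url": {"url": image_url}},
        ],
    }]

def ask_model(image_url: str) -> str:
    """Send the combined text-and-image prompt (requires an API key)."""
    from openai import OpenAI  # imported here so the sketch loads without the SDK
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=build_inkblot_prompt(image_url),
    )
    return response.choices[0].message.content
```

The key point is that the text and the image travel in one message, so the model interprets them together rather than separately.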
“For me, it resembles something symmetrical, possibly two animals or figures facing each other, or a single entity with wings outstretched,” the chatbot responded.
The AI’s interpretation, humanistic as it may seem, is far from a real human experience. Psychologist Barbara Santini weighs in on cognitive psychology and AI in the era of big data.
“If an AI’s response resembles a human’s, it’s not because it sees the same thing, but because its training data mirrors our collective visual culture,” she says.
AI algorithms pick up patterns from huge databases, averaging out interpretations rather than forming subjective views shaped by emotions or personal experiences.
Limitations of Psychological AI
In contrast to humans, who link images to personal experience and emotion, AI psychology simply reads a dataset to generate an answer.
“ChatGPT provides interesting, human-like responses, but it is likely not genuinely thinking – instead, it appears to be scanning an online dataset,” says Chandril Ghosh, lecturer in psychology at the University of Kent, in the UK.
Presented with the same inkblot image twice, the AI chatbot can give totally different answers, showing that it responds from the dataset it was trained on rather than from personal experience or emotional reasoning.
Massachusetts Institute of Technology (MIT) researchers trained an AI called “Norman” on disturbing online images, causing it to interpret Rorschach inkblots as violent and frightening scenes.
In contrast, a conventionally trained AI identified the same images as birds or abstract shapes, demonstrating that an AI’s interpretations reflect its training data rather than genuine perception.
AI and Human Psychology
Psychological AI may be able to label emotions or interpret visual stimuli, but it lacks emotional depth because it is unaware of symbolic meanings.
“A human would typically stick to their previous answer because personal experiences and emotions influence their responses,” says Ghosh. “In contrast, ChatGPT generates responses based on its dataset.”
AI interpretations of Rorschach inkblots highlight the weaknesses of machine perception. While AI can simulate a human response, it cannot experience images or attach symbolic meanings to them. That contrast underscores the deeply subjective, nuanced character of human perception, something AI cannot duplicate.
Inside Telecom provides you with an extensive list of content covering all aspects of the tech industry. Keep an eye on our Intelligent Tech sections to stay informed and up-to-date with our daily articles.