Can AI Understand Color Without Seeing It?




New research shows that ChatGPT understands common color metaphors but fails to grasp novel ones.

In a rush? Here are the quick facts:

  • AI struggles with novel or reversed color metaphors.
  • Colorblind and color-seeing people interpret metaphors similarly.
  • Painters outperformed others on new color metaphors.

The research demonstrates that ChatGPT and other AI tools excel at processing basic color metaphors, yet fail to understand creative metaphors. Scientists studied human and ChatGPT responses to metaphors such as “feeling blue” and “seeing red” to determine the language processing capabilities of the AI systems, as first reported by Neuroscience News (NN).

The study, led by Professor Lisa Aziz-Zadeh at the USC Center for the Neuroscience of Embodied Cognition, found that color-seeing and colorblind people performed similarly when interpreting metaphors, suggesting that seeing color isn’t necessary to grasp their meaning.

However, people with hands-on experience, such as painters, demonstrated superior abilities in interpreting complex metaphors, including “the meeting made him burgundy.”

ChatGPT, which processes huge amounts of written text, did well on common expressions and offered culture-informed explanations. For example, NN reports a case where the bot described a “very pink party” as being “filled with positive emotions and good vibes.” But it often stumbled on unfamiliar metaphors or when asked to reverse associations, such as figuring out “the opposite of green.”

“ChatGPT uses an enormous amount of linguistic data to calculate probabilities and generate very human-like responses,” said Aziz-Zadeh, as reported by NN. “But what we are interested in exploring is whether or not that’s still a form of secondhand knowledge, in comparison to human knowledge grounded in firsthand experiences,” she added.

The study was a collaboration among neuroscientists, computer scientists, and artists from institutions including USC, Google DeepMind, and UC San Diego. As AI develops, researchers say combining sensory input with language might help it better understand the world in human-like ways.
