New Delhi, April 20 (IANS) A study from Columbia University’s School of Engineering in the US has shown that the brain’s visual regions play an active role in making sense of information, a finding that could help build more adaptive AI systems.
Crucially, how these regions interpret that information depends on what the rest of the brain is working on.
Published in the journal Nature Communications, the study, led by biomedical engineer and neuroscientist Nuttida Rungratsameetaweemana, provides some of the clearest evidence yet that early sensory systems play a role in decision-making, and that they adapt in real time.
It also points to new approaches for designing AI systems that can adapt to new or unexpected situations.
The findings challenge the traditional view that early sensory areas in the brain simply “look at” or “record” visual input. Instead, the human brain’s visual system actively reshapes how it represents the exact same object depending on what you’re trying to do.
Even in visual areas closest to the raw input arriving from the eyes, the brain has the flexibility to tune its interpretation and responses to the task at hand.
“It gives us a new way to think about flexibility in the brain and opens up ideas for how to potentially build more adaptive AI systems modelled after these neural strategies,” said Nuttida.
Most previous work looked at how people learn categories over time, but this study zooms in on the flexibility piece: How does the brain rapidly switch between different ways of organising the same visual information?
The team used functional magnetic resonance imaging (fMRI) to observe people’s brain activity while they put shapes in different categories. The twist was that the “rules” for categorising the shapes kept changing.
This let the researchers determine whether the visual cortex changed how it represented the shapes depending on how the categories were defined.
They analysed the data using computational machine learning tools, including multivariate classifiers.
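The article does not detail the study’s analysis pipeline, but multivariate classifiers in fMRI research are typically linear models trained to “decode” a task variable from patterns of voxel activity. The sketch below is purely illustrative, using scikit-learn on synthetic data standing in for real recordings; all names and settings here are assumptions, not details from the study.

```python
# Illustrative sketch only: a linear "decoder" trained on simulated
# fMRI voxel patterns. If the classifier predicts the category above
# chance, the region's activity carries task-relevant information.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical data: 200 trials x 500 voxels of visual-cortex activity,
# each trial labelled with the category the current rule assigned.
n_trials, n_voxels = 200, 500
labels = rng.integers(0, 2, size=n_trials)                   # category A or B
signal = np.outer(labels - 0.5, rng.normal(size=n_voxels))   # weak category signal
patterns = signal + rng.normal(scale=5.0, size=(n_trials, n_voxels))

decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
accuracy = cross_val_score(decoder, patterns, labels, cv=5)
print(f"Mean decoding accuracy: {accuracy.mean():.2f} (chance = 0.50)")
```

In the study’s terms, the key comparison would be whether such decoding from visual-cortex patterns shifts when the categorisation rule changes, indicating task-dependent representations in early visual areas.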
Activity in the visual system — including the primary and secondary visual cortices, which deal with data straight from the eyes — changed with practically every task.
These visual areas reorganised their activity depending on which decision rules people were using: activation patterns became more distinctive when a shape fell near the ambiguous boundary between categories.
Those were the shapes that were hardest to tell apart, which is exactly when extra processing would be most helpful.
“We could actually see clearer neural patterns in the fMRI data in cases when people did a better job on the tasks. That suggests the visual cortex may directly help us solve flexible categorisation tasks,” said Nuttida.
The team is starting to explore how these ideas might be useful for artificial systems.
—IANS