
Researchers propose a roadmap to understand whether AI models and the human brain process things the same way

Deep Neural Networks, part of the broader family of machine learning methods, have become increasingly powerful in everyday real-world applications such as automated face recognition systems and self-driving cars.

Researchers use Deep Neural Networks, or DNNs, to model the processing of information, and to investigate how this information processing matches that of humans.

While DNNs have become an increasingly popular tool for modelling the computations the brain performs, particularly when it visually recognises real-world “things”, the ways in which DNNs carry out those computations can be very different from the brain’s.
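As a concrete illustration of how such comparisons are often made in practice, the short Python sketch below uses representational similarity analysis (RSA), a widely used technique for asking whether a DNN and the brain represent the same set of stimuli in similar ways. It is a minimal, generic sketch rather than the method proposed in the paper: the “brain” and “DNN” responses here are random stand-ins, where a real study would use neuroimaging recordings and network activations for the same stimuli.

import numpy as np

rng = np.random.default_rng(1)
n_stimuli, n_voxels, n_units = 20, 100, 256

# Stand-in data: rows are stimuli, columns are measurement channels.
brain_responses = rng.standard_normal((n_stimuli, n_voxels))   # e.g. fMRI voxels
dnn_activations = rng.standard_normal((n_stimuli, n_units))    # e.g. one DNN layer

def rdm(responses):
    # Representational dissimilarity matrix: 1 - correlation between stimuli.
    return 1.0 - np.corrcoef(responses)

brain_rdm = rdm(brain_responses)
dnn_rdm = rdm(dnn_activations)

# Compare the two RDMs on their upper-triangular (off-diagonal) entries.
iu = np.triu_indices(n_stimuli, k=1)
similarity = np.corrcoef(brain_rdm[iu], dnn_rdm[iu])[0, 1]
print(f"RDM correlation (brain vs. DNN): {similarity:.2f}")

A high RDM correlation means the two systems treat the same pairs of stimuli as similar or dissimilar, but it does not by itself show that they rely on the same features or the same steps of computation, which is the gap the new approach addresses.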

New research, published in the journal Trends in Cognitive Sciences and led by the University of Glasgow’s School of Psychology and Neuroscience, presents a new approach to understanding whether the human brain and its DNN models recognise things in the same way, using similar steps of computation.

Currently, Deep Neural Network technology is used in applications such as face recognition, and while it is successful in these areas, scientists still do not fully understand how these networks process information.

This opinion article outlines a new approach to improving that understanding: first, researchers must show that the brain and its DNN models recognise the same things, such as a face, using the same features; second, they must show that the brain and the DNN process those features in the same way, through the same steps of computation.
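The first step can be pictured with a toy experiment: randomly occlude parts of an input, measure how a recognizer’s response changes, and compare the resulting map of diagnostic features with an equivalent map measured for human observers. The Python sketch below is a hypothetical illustration of that idea, not the authors’ published procedure; the toy classifier and the “human” map are simulated stand-ins.

import numpy as np

rng = np.random.default_rng(0)
H = W = 32  # toy "image" size

def toy_classifier(image):
    # Stand-in recognizer: responds to the summed intensity of a fixed
    # "diagnostic" region, as a real classifier might rely on the eyes of a face.
    return image[8:16, 8:16].sum()

def feature_use_map(classifier, image, n_trials=2000, bubble_sigma=3.0):
    # Estimate which pixels drive the response by sampling random Gaussian
    # apertures and weighting each aperture by the response it elicits.
    yy, xx = np.mgrid[0:H, 0:W]
    scores = np.zeros((H, W))
    counts = np.zeros((H, W)) + 1e-9
    for _ in range(n_trials):
        cy, cx = rng.integers(0, H), rng.integers(0, W)
        mask = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * bubble_sigma ** 2))
        response = classifier(image * mask)
        scores += response * mask
        counts += mask
    return scores / counts

image = rng.random((H, W))
dnn_map = feature_use_map(toy_classifier, image)

# Hypothetical "human" feature-use map: in a real study this would come from
# behavioural or neuroimaging experiments; here it is simulated for illustration.
human_map = np.zeros((H, W))
human_map[8:16, 8:16] = 1.0
human_map += 0.1 * rng.random((H, W))

r = np.corrcoef(dnn_map.ravel(), human_map.ravel())[0, 1]
print(f"Correlation between DNN and human feature-use maps: {r:.2f}")

The second step would go further, comparing not only which features are used but also the intermediate stages through which they are processed, for example layer by layer in the network and over time in brain recordings.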

Because a key challenge in developing accurate AI is knowing whether machine learning processes information the way humans do, it is hoped this new work is another step towards more accurate and reliable AI technology that processes information more like our brains do.

Prof Philippe Schyns, Dean of Research Technology at the University of Glasgow, said: “Having a better understanding of whether the human brain and its DNN models recognise things the same way would allow for more accurate real-world applications using DNNs.

“If we have a greater understanding of the mechanisms of recognition in human brains, we can then transfer that knowledge to DNNs, which in turn will help improve the way DNNs are used in applications such as facial recognition, where they are currently not always accurate.

“Creating human-like AI is about more than mimicking human behaviour – technology must also be able to process information, or ‘think’, like or better than humans if it is to be fully relied upon. We want to make sure AI models are using the same process to recognise things as a human would, so we don’t just have the illusion that the system is working.”

The study, ‘Degrees of Algorithmic Equivalence between the Brain and its DNN Models’, is published in Trends in Cognitive Sciences. The work is funded by Wellcome and the Engineering and Physical Sciences Research Council.

A link to the full study can be found here: https://authors.elsevier.com/sd/article/S1364-6613(22)00220-0

