Using Artificial Neural Networks to Understand Emotion Recognition in Autistic Adults
Post by Negar Mazloum-Farzaghi
The takeaway
Compared to neurotypical adults, autistic adults have difficulty recognizing emotions from faces. Artificial neural network models of vision suggest that neural activity in the inferior temporal cortex may explain these differences.
What's the science?
Autism spectrum disorder (ASD) is characterized by difficulty recognizing others’ moods and emotions from facial expressions. Previous research has found that the fusiform face area and the inferior temporal (IT) cortex are involved in facial recognition. Moreover, neural activity in the human amygdala has been associated with recognizing facial emotions. A critical question that remains to be answered is whether atypical facial emotion recognition in autistic adults can be explained by perceptual difficulties or by atypical development and functioning of regions associated with facial emotion processing. Brain-mapped computational models can help us predict how facial emotion is represented across different brain regions in primates and how these representations relate to performance on facial emotion judgment tasks. This week in The Journal of Neuroscience, Kar used artificial neural network models of primate vision to investigate neural and behavioral markers of emotion recognition in ASD.
How did they do it?
Kar began by reanalyzing behavioral and neural measurements from Wang and Adolphs (2017) and Wang et al. (2017). In the behavioral task, Wang and Adolphs (2017) presented images of faces to neurotypical controls and high-functioning autistic adults, who judged whether each face depicted happiness or fear. They found that, compared to controls, autistic individuals showed reduced specificity in their facial emotion judgments.
To further investigate these findings using computational modeling, Kar trained artificial neural network models to perform the same task. The models comprised layers of units that closely correspond to areas (e.g., the IT cortex) and neurons of the primate ventral visual cortex. To test the accuracy of these models, Kar compared their facial emotion predictions with the behavioral measurements obtained from the neurotypical controls and autistic adults. Next, the author removed different layers of the network to determine which layers drove the difference between the model's match to the controls and its match to the autistic adults.
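As a toy illustration of this image-by-image comparison, one can fit a linear "decoder" to a model layer's activations and correlate its graded emotion scores with each group's judgments. The sketch below uses simulated activations and simulated group behavior; all noise levels and the encoding setup are illustrative assumptions, not the study's actual models or data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_images, n_units = 200, 50

# Simulated "IT-layer" activations: each image carries an underlying
# emotion signal (happy = +1, fear = -1) along a random encoding axis.
emotion = rng.choice([-1.0, 1.0], size=n_images)
axis = rng.normal(size=n_units)            # hypothetical emotion-coding axis
features = np.outer(emotion, axis) + rng.normal(size=(n_images, n_units))

# Simulated judgments (probability of answering "happy" per image):
# controls track the signal tightly; the autistic group's judgments
# carry extra internal noise, mimicking reduced specificity.
controls = 1 / (1 + np.exp(-3 * (emotion + rng.normal(scale=0.3, size=n_images))))
autistic = 1 / (1 + np.exp(-3 * (emotion + rng.normal(scale=1.0, size=n_images))))

# Least-squares linear decoder read out from the layer, scored per image.
w, *_ = np.linalg.lstsq(features, emotion, rcond=None)
scores = features @ w

r_controls = np.corrcoef(scores, controls)[0, 1]
r_autistic = np.corrcoef(scores, autistic)[0, 1]
print(f"decoder vs controls: r = {r_controls:.2f}")
print(f"decoder vs autistic: r = {r_autistic:.2f}")
```

In this toy setting, the decoder's scores correlate more strongly with the less noisy simulated group, mirroring the kind of group-level comparison the study performed with real networks and real behavior.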
To find out whether the network models would point to the IT cortex or the amygdala as the primary contributor to facial emotion processing, Kar also reanalyzed recordings obtained by Wang et al. (2017) from electrodes implanted bilaterally in the amygdalae of patients with epilepsy. As in the task described above, participants viewed images of faces and discriminated between two emotions: fear and happiness. Finally, to assess the efficiency of neural connections in autistic adults during facial emotion processing, the author examined the synaptic strengths (weights) of the connections between the IT layer of the network models and the models' behavioral responses. The author also added various levels of noise to the activity of the IT layer to evaluate whether the added noise would strengthen or weaken the match between the judgments made by the network models and those made by the autistic adults.
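The noise manipulation can be sketched in a similar toy setting: inject Gaussian noise into a simulated layer before the behavioral readout and ask whether the model's choice fractions drift toward the less specific behavior. Everything below (the noise scales, trial counts, and simulated groups) is an illustrative assumption, not the study's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
n_images, n_units, n_trials = 200, 50, 50

# Simulated layer activations carrying a happy (+1) / fear (-1) signal.
emotion = rng.choice([-1.0, 1.0], size=n_images)
axis = rng.normal(size=n_units)
features = np.outer(emotion, axis) + rng.normal(size=(n_images, n_units))

def choice_fractions(signal, noise_sd):
    """Fraction of 'happy' choices per image from a noisy internal signal."""
    trials = signal[:, None] + rng.normal(scale=noise_sd, size=(len(signal), n_trials))
    return (trials > 0).mean(axis=1)

# Simulated behavior: controls judge with little internal noise,
# the autistic group with more (reduced specificity).
controls = choice_fractions(emotion, 0.3)
autistic = choice_fractions(emotion, 1.0)

# Linear readout from the layer; scores approximate the emotion signal.
w, *_ = np.linalg.lstsq(features, emotion, rcond=None)

def model_match(sigma, behavior):
    """Mean absolute mismatch between model and behavioral choice fractions
    after injecting Gaussian noise of scale sigma into the layer."""
    fractions = np.zeros(n_images)
    for _ in range(n_trials):
        noisy = features + rng.normal(scale=sigma, size=features.shape)
        fractions += (noisy @ w > 0)
    return np.abs(fractions / n_trials - behavior).mean()

for sigma in (0.0, 7.0, 40.0):
    print(f"sigma={sigma:4.1f}  mismatch to autistic: {model_match(sigma, autistic):.3f}"
          f"  to controls: {model_match(sigma, controls):.3f}")
```

In this toy, a moderate amount of injected noise makes the model's choice fractions less extreme and closer to the noisier simulated group's, while worsening the match to the near-deterministic simulated controls; far too much noise degrades both matches.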
What did they find?
Kar found that the artificial neural network models accurately predicted human facial emotion judgments at an image-by-image level. Interestingly, the network's behavioral responses matched those of the neurotypical control group more closely than those of the autistic adults. The largest difference in the network's ability to match controls versus autistic adults arose in the final, deepest layer of the network, which corresponds to the primate IT cortex. These results suggest that neural activity in the primate IT cortex could play an essential role in atypical facial emotion processing in autistic individuals. As for the amygdala, the author found that, after controlling for the IT-cortex layer of the network models, amygdala activity provided very little additional information during facial emotion recognition, suggesting that the IT cortex is the main driver of the amygdala's role in discriminating the emotion of a face.
Finally, the weights in the network models (the importance of different connections) predicted the behavior of the neurotypical controls better than that of the autistic adults, suggesting that the corresponding connections in autistic adults were noisier during facial emotion recognition. Moreover, added noise increased the match between the network's performance and that of the autistic adults, but it did not improve the network's similarity to the neurotypical controls. These findings suggest that atypical facial emotion processing in autistic individuals might be due to additional noise in their sensory representations.
What's the impact?
This study identified primate IT activity as a critical neural marker in a computational model of atypical facial emotion recognition in ASD. Overall, it showed that artificial neural network models of vision are a useful tool for probing the neural and behavioral mechanisms underlying autism, and that such models could prove equally valuable for investigating other aspects of cognitive function.