Predicting the Risk of Childhood Mental Illness with AI

Post by Laura Maile

The takeaway

Adolescent mental illness is an increasingly serious problem in the US and globally. Early risk assessment and intervention are key to the successful treatment of mental health disorders. Artificial intelligence models can be trained to accurately predict the risk of future psychiatric illness in children using patient and parent questionnaires about current symptoms. 

What's the science?

Rates of mental illness in adolescents have risen in recent years, presenting a serious challenge to patients and their mental health providers. Identifying the children at highest risk of developing psychiatric illness is essential to early intervention and effective mental health treatment. Because many complex factors contribute to mental illness, and because psychiatric disorders in the adolescent population are diverse, predicting the risk of disease in individual children remains a challenge. This week in Nature Medicine, Hill and colleagues developed a method to predict the risk of future psychiatric disease by training a neural network model using data from over 11,000 children. 

How did they do it?

The authors utilized data from the Adolescent Brain and Cognitive Development (ABCD) study, which included 11,416 participants aged 8-15 years. In the ABCD study, patient assessments were conducted multiple times a year and included interviews with patients and their parents, fMRI scans, symptom screenings, and questionnaires. Data collected across five years of these assessments were used to train an artificial intelligence (AI) neural network model, and two general approaches to future disease prediction were compared. One approach used symptom assessments like the Child Behavior Checklist (CBCL) to predict future disease, while the other focused on potential disease mechanisms, relying on questionnaires about factors such as adverse childhood events, family history, sleep disturbances, and socioeconomic status. A p-factor, representing each individual's general psychopathology score, was calculated. The authors set out to determine whether their model could accurately predict future p-factor based on current patient measurements. They also aimed to predict which children were likely to move to a higher-risk group, and to identify which factors had the biggest influence on prediction accuracy.
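At its core, this setup is a regression from current questionnaire scores to a future p-factor. The sketch below is a hypothetical stand-in, not the authors' model: it uses purely synthetic data, made-up feature names, and a tiny feed-forward network trained by gradient descent, just to illustrate the shape of the prediction problem.

```python
import numpy as np

# Hypothetical illustration: predict a future "p-factor" from current
# questionnaire-derived scores with a tiny feed-forward network.
# All data and feature roles here are synthetic stand-ins, not ABCD data.
rng = np.random.default_rng(0)

n, d = 512, 4                          # children x features (e.g. CBCL scale,
X = rng.normal(size=(n, d))            # sleep disturbance, family history, ...)
true_w = np.array([0.8, 0.5, 0.3, 0.1])
y = X @ true_w + 0.1 * rng.normal(size=n)   # synthetic future p-factor

# One hidden layer, trained by full-batch gradient descent on MSE.
h = 8
W1 = rng.normal(scale=0.5, size=(d, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.5, size=h);      b2 = 0.0
lr = 0.05
for _ in range(2000):
    z = np.tanh(X @ W1 + b1)           # hidden activations
    pred = z @ W2 + b2                 # predicted future p-factor
    err = pred - y                     # gradient of MSE w.r.t. pred (up to 2/n)
    gW2 = z.T @ err / n; gb2 = err.mean()
    dz = np.outer(err, W2) * (1 - z ** 2)   # backprop through tanh
    gW1 = X.T @ dz / n;  gb1 = dz.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)
print(f"training MSE: {mse:.3f}")
```

A real model of this kind would of course be evaluated on held-out children and later time points, not on its own training data; the point here is only the mapping from present questionnaires to a future continuous risk score.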

What did they find?

The authors found that their neural network model could accurately predict mental illness one year after the initial assessment. It also accurately predicted a shift into the high-risk category after one year for 11% of participants. The model performed similarly well across demographic groups, indicating that it applies broadly. While both symptom-based and mechanism-based approaches performed well as prediction models, the symptom approach was slightly better. Within the mechanism approach, sleep disturbance was the best predictor of future psychiatric illness: up to a certain point, increased sleep disturbance was correlated with a higher risk of mental illness, indicating that even moderate sleep disturbance is a positive predictor of psychiatric disease. Parent questionnaires had a stronger influence on positive prediction than patient questionnaires, highlighting the importance of parent perspectives in determining the risk of future mental illness.

What's the impact?

This study created an AI model that accurately predicts the risk of future mental illness in adolescents across a broad range of mental health conditions. The success of symptom-based and mechanism-based analyses as predictors of future mental illness highlights the benefits of incorporating such inexpensive assessments into the clinical setting for individual patients. Strong predictors of future mental illness like sleep disturbance provide an opportunity for successful intervention, with some studies showing that targeted intervention can improve both sleep disturbance and psychiatric symptoms. This model could lead to improved preemptive screening and earlier intervention for children at risk of future mental illness. 

Access the original scientific publication here.

Neurons Driving Sugar Consumption

Post by Lila Metko 

The takeaway

A population of neurons in the hypothalamus with a well-established function in satiety, the sensation of being full, may have another important role. This research suggests that pro-opiomelanocortin (POMC) neurons in the hypothalamus signal to another brain region to drive sugar consumption in states of fullness. 

What's the science?

There is a drive, present in both humans and many animal species, to consume high amounts of sugar even after a substantial meal. Understanding the neurobiological mechanism behind this drive could assist in the development of effective obesity therapeutics. It is well understood that activation of POMC neurons in the hypothalamus promotes satiety in a fed state. However, POMC is also a precursor for the neuropeptide β-endorphin, which acts on a specific receptor, the mu opioid receptor, to stimulate appetite. This winter in Science, Minère and colleagues measured and manipulated activity in hypothalamic POMC neurons during both standard and high-sugar consumption after a meal to investigate their role in the drive to consume sugar. 

How did they do it?

The authors first investigated which brain regions had high levels of both mu opioid receptors and POMC. They used fluorescence in situ hybridization, a technique that reveals the number of nucleic acid sequences coding for a protein of interest, for the receptor, and immunohistochemistry, a detection technique for visualizing cellular components, for POMC. They found that one region with both was the paraventricular nucleus of the thalamus (PVT), a brain region important for feeding and motivated behavior. They then optogenetically activated POMC neurons from the hypothalamus and recorded activity in the PVT under control and different receptor-blocker conditions to determine how POMC neurons affect PVT activity and which receptors may be involved. Next, they recorded activity in this circuit (hypothalamic POMC neurons to thalamic PVT neurons) during post-meal high-sugar food consumption or post-meal standard chow consumption to determine whether sweet foods specifically were associated with changes in circuit activity. Additionally, the researchers tested whether activation of the circuit under control and/or opioid receptor blocker conditions affected general flavor preference, to control for potential confounds of sweet taste and post-ingestive sugar sensing. Next, they tested whether circuit activation affected conditioned place preference, a preference test that is not associated with food consumption. They then investigated how chemogenetic inhibition of the circuit affected flavor preference (high-sugar food vs. standard chow). Next, they used fiber photometry to record circuit activity in response to high-sugar diet and high-fat diet cues, to determine the circuit's role in fed-state macronutrient preferences. Finally, they used functional magnetic resonance imaging (fMRI) to examine PVT activity in humans during sugar consumption, to see whether a similar circuit may exist in humans.

What did they find?

Activation of POMC neurons decreased the firing rate of neurons in the PVT in the presence of blockers of other neuromodulator receptors, but not during blockade of the mu opioid receptor. This suggests that POMC neurons signal to the PVT via the mu opioid receptor and that this signaling is inhibitory. Post-meal consumption of high-sugar food brought about an increase in the activity of POMC neuron terminals in the PVT, while post-meal consumption of standard chow did not, suggesting that a high-sugar diet increases the activity of POMC neurons that project to the PVT. Activation of the circuit did affect general flavor preference, but not when mu opioid receptor blockers were present. However, circuit activation did not affect conditioned place preference, suggesting that the circuit is specific to dietary preference. Inhibition of the circuit changed the length of time it took a mouse to start showing a preference for a high-sugar diet. Fiber photometry data showed that, while both brought about an increase, high-sugar diet cues increased POMC-to-PVT activity more than high-fat diet cues. Additionally, fMRI data showed that activity in the human PVT is decreased by sugar consumption, suggesting that a similar circuit may exist in humans. 

What's the impact?

This study found that hypothalamic POMC neurons, projecting via opioid signaling to the PVT, are involved in sugar consumption in fed states. Importantly, it sheds light on a brain circuit that may be involved in compulsive or binge eating. According to the World Health Organization, obesity is a global epidemic and a risk factor for many health conditions, such as diabetes mellitus, cardiovascular disease, and stroke. These findings could help researchers develop potential therapeutics for obesity. 

Access the original publication here. 

Seemingly Benign Mini-Strokes May Have a Long-Term Impact on Memory

Post by Soumilee Chaudhuri

The takeaway

A transient ischemic attack (TIA), often called a "mini-stroke," is commonly deemed harmless because its symptoms, like slurred speech or weakness, resolve quickly. However, this recent study shows that even a single TIA can lead to long-term memory and thinking problems, similar to what happens after a full ischemic stroke.

What's the science?

A stroke happens when blood flow to the brain is blocked, causing brain damage that can lead to lasting physical and cognitive problems. A TIA, in contrast, is characterized by temporary stroke-like symptoms caused by a brief interruption of blood flow to the brain. While its symptoms resolve quickly, prior research has hinted at potential long-term cognitive consequences. However, it was unclear whether these cognitive changes were directly caused by the TIA event, preexisting risk factors, or prior cognitive decline. Recently in JAMA Neurology, Del Bene et al. aimed to determine whether a single, diffusion-weighted image-negative TIA (a TIA without visible brain damage on imaging) was directly associated with cognitive decline over time, after accounting for vascular and demographic factors.

How did they do it?

This study analyzed data from the Reasons for Geographic and Racial Differences in Stroke (REGARDS) study, which included over 30,000 participants across the United States. Researchers compared cognitive trajectories in three groups: 1) 356 people with a first-time TIA, 2) 965 people with a first-time stroke, and 3) 14,882 people with no history of stroke or TIA. Cognitive function was assessed using memory and verbal fluency tests every two years. The researchers used statistical models to compare cognitive changes before and after a TIA or stroke while adjusting for vascular and demographic risk factors, such as age, sex, race, and preexisting conditions like hypertension and diabetes. Neuroimaging (magnetic resonance imaging, MRI) was used to confirm the absence of brain damage in TIA cases (diffusion-weighted image-negative).
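The core comparison is between annual rates of cognitive change across the three groups. As a toy illustration (not the authors' statistical models), the sketch below simulates synthetic repeated-measures data whose true slopes echo the reported annual declines, then recovers a pooled least-squares slope per group; all numbers and group sizes here are made up for illustration.

```python
import numpy as np

# Hypothetical sketch of a trajectory comparison: fit a linear slope
# (score change per year) to each group's repeated cognitive scores.
# Synthetic data only, seeded to echo the reported annual declines
# (stroke ~ -0.04, TIA ~ -0.05, control ~ -0.02); not REGARDS data.
rng = np.random.default_rng(1)
years = np.arange(0, 10, 2)          # assessments every two years

def simulate(n, slope, noise=0.05):
    """Each row: one participant's scores across the five visits."""
    base = rng.normal(0.0, 0.1, size=(n, 1))   # per-person starting level
    return base + slope * years + rng.normal(0, noise, size=(n, len(years)))

def annual_slope(scores):
    """Pooled OLS slope of score on year across all participants/visits."""
    t = np.tile(years, scores.shape[0])
    return np.polyfit(t, scores.ravel(), 1)[0]

groups = {"stroke": -0.04, "tia": -0.05, "control": -0.02}
est = {g: annual_slope(simulate(300, sl)) for g, sl in groups.items()}
for g, sl in est.items():
    print(f"{g}: estimated {sl:+.3f} points per year")
```

The real analysis additionally modeled an immediate post-event drop and adjusted for the vascular and demographic covariates described above; this sketch isolates only the slope-comparison idea.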

What did they find?

Before a stroke or TIA, people who later had a stroke already had slightly worse memory and thinking skills (cognitive composite score of -0.25) than those who had a TIA (-0.05) or no stroke at all (0). This suggests that some cognitive decline might already be happening before a stroke occurs. After a stroke or TIA, both groups showed a decline in memory and thinking skills, though the decline was faster in the stroke group. The stroke group’s cognitive composite score declined by -0.14, while the TIA group’s score changed only slightly (0.01). The control group, with no stroke or TIA, showed a small decline of -0.03. Importantly, the annual decline in cognitive function was faster in the TIA group (-0.05) compared to the control group (-0.02) and was similar to the stroke group (-0.04). 

Overall, stroke patients showed the largest immediate drop in cognitive function. TIA patients did not have an immediate decline but experienced a faster decline in cognitive function over time than the healthy control group. Surprisingly, the rate of cognitive decline in the TIA group was similar to that of stroke patients, despite the absence of visible brain damage on diffusion-weighted imaging.

What's the impact?

Even in the absence of immediate disability, TIA appears to contribute to long-term cognitive impairment, suggesting that it may trigger subtle but lasting brain changes. The results of this study raise important questions about whether cognitive screening should be added to the care plan for stroke and TIA patients, even those who seem to recover fully. Additionally, researchers still need to investigate how TIA events cause memory problems so that early interventions can be used to prevent subsequent decline in brain health in these patients.

Access the original scientific publication here.