The publication data currently available has been vetted by Vanderbilt faculty, staff, administrators and trainees. The data itself is retrieved directly from NCBI's PubMed and is automatically updated on a weekly basis to ensure accuracy and completeness.
BACKGROUND - Neuropsychological impairment is common in schizophrenia and psychotic bipolar disorder. It has been hypothesized that the pathways leading to impairment differ between disorders. Cognitive impairment in schizophrenia is believed to result largely from atypical neurodevelopment, whereas bipolar disorder is increasingly conceptualized as a neuroprogressive disorder. The current investigation tested several key predictions of this hypothesis.
METHODS - Current neuropsychological functioning and estimated premorbid intellectual ability were assessed in healthy individuals (n = 260) and a large, cross-sectional sample of individuals in the early and chronic stages of psychosis (n = 410). We tested the following hypotheses: 1) cognitive impairment in the early stage of psychosis is more severe in schizophrenia; and 2) cognitive decline between the early and chronic stages is relatively greater in psychotic bipolar disorder. Additionally, individuals with psychosis were classified as neuropsychologically normal, deteriorated, or compromised (i.e., below-average intellectual functioning) to determine whether the frequencies of neuropsychologically compromised and deteriorated patients were higher in schizophrenia and psychotic bipolar disorder, respectively.
RESULTS - Neuropsychological impairment in the early stage of psychosis was more severe in schizophrenia. Psychotic bipolar disorder was not associated with relatively greater cognitive decline between illness stages. The frequency of neuropsychologically compromised patients was higher in schizophrenia; however, substantial portions of both schizophrenia and psychotic bipolar disorder patients were classified as neuropsychologically compromised and deteriorated.
CONCLUSIONS - While schizophrenia is associated with relatively greater neurodevelopmental involvement, psychotic bipolar disorder and schizophrenia cannot be strictly dichotomized into purely neuroprogressive and neurodevelopmental illness trajectories; there is evidence of both processes in each disorder.
Copyright © 2018 Elsevier B.V. All rights reserved.
BACKGROUND - The human brain remains highly plastic for a protracted developmental period. Thus, although early caregiving adversities that alter amygdala development can result in enduring emotion regulation difficulties, these trajectories should respond to subsequent enriched caregiving. Exposure to high-quality parenting can regulate (i.e., decrease) children's amygdala reactivity, a process that, over the long term, is hypothesized to enhance emotion regulation. We tested the hypothesis that even following adversity, the parent-child relationship would be associated with decreases in amygdala reactivity to parent cues, which would in turn predict lower future anxiety.
METHODS - Participants were 102 children (6-10 years of age) and adolescents (11-17 years of age), for whom data were collected at one or two time points and who either had experienced institutional care before adoption (n = 45) or had always lived with their biological parents (comparison; n = 57). We examined how amygdala reactivity to visual cues of the parent at time 1 predicted longitudinal change (from time 1 to time 2) in parent-reported child anxiety across 3 years.
RESULTS - At time 1, on average, amygdala reactivity decrements to parent cues were not seen in children who had received institutional care but were seen in children in the comparison group. However, a subset (∼40%) of children who previously experienced institutional care did show decreased amygdala reactivity to parent cues, which was associated with greater child-reported feelings of security with their parent. Amygdala decreases at time 1 were followed by steeper anxiety reductions from time 1 to time 2 (i.e., across 3 years).
CONCLUSIONS - These data provide a neurobiological mechanism by which the parent-child relationship can increase resilience, even in children at significant risk for anxiety symptoms.
Copyright © 2019 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
BACKGROUND - Late-life depression (LLD) has been associated with alterations in intrinsic functional networks, best characterized in the default mode network (DMN), cognitive control network (CCN), and salience network. However, these findings often derive from small samples, and it is not well understood how network findings relate to clinical and cognitive symptomatology.
METHODS - We studied 100 older adults (n = 79 with LLD, n = 21 nondepressed) and collected resting-state functional magnetic resonance imaging, clinical measures of depression, and performance on cognitive tests. We selected canonical network regions for each intrinsic functional network (DMN, CCN, and salience network) as seeds in seed-to-voxel analysis. We compared connectivity between the depressed and nondepressed groups and correlated connectivity with depression severity among depressed subjects. We then investigated whether the observed connectivity findings were associated with greater severity of common neuropsychiatric symptoms or poorer cognitive performance.
RESULTS - LLD was characterized by decreased DMN connectivity to the frontal pole, a CCN region (Wald χ² = 22.33, p < .001). No significant group differences in connectivity were found for the CCN or salience network. However, in the LLD group, increased CCN connectivity was associated with increased depression severity (Wald χ² > 20.14, p < .001), greater anhedonia (Wald χ² = 7.02, p = .008) and fatigue (Wald χ² = 6.31, p = .012), and poorer performance on tests of episodic memory (Wald χ² > 4.65, p < .031), executive function (Wald χ² = 7.18, p = .007), and working memory (Wald χ² > 4.29, p < .038).
CONCLUSIONS - LLD is characterized by differences in DMN connectivity, while CCN connectivity is associated with LLD symptomatology, including poorer performance in several cognitive domains.
Published by Elsevier Inc.
Experimental studies indicate that perinatal light exposure has enduring effects on affective behaviors in rodents; however, insufficient research has explored this hypothesis in humans. We examined photoperiod (i.e., day length) metrics during maternal pregnancy in relation to lifetime depression in the longitudinal Nurses' Health Study (NHS) and NHS II. 160,723 participants reported birth date and birth state (used to derive daily photoperiod based on published mathematical equations), and clinician-diagnosed depression and antidepressant use throughout adulthood. Logistic regression was used to estimate odds ratios (OR) (and 95% confidence intervals [CI]) for depression (defined as clinician diagnosis and antidepressant use) across quintiles of two exposures during maternal pregnancy: 1) total photoperiod (total number of daylight hours) and 2) differences between minimum/maximum photoperiod; each trimester of pregnancy was examined separately. Total photoperiod during maternal pregnancy was not associated with depression overall or by trimester of pregnancy. However, larger differences between minimum/maximum photoperiod during maternal pregnancy were related to lower odds of depression (multivariable [MV]-adjusted OR: 0.86, 95% CI: 0.83, 0.90 comparing extreme quintiles of exposure; p-trend<0.0001); this association appeared specific to the second trimester of pregnancy (MV-adjusted p-trends = 0.03, <0.0001, and 0.3 across the three trimesters, respectively). In addition, birth at higher latitude (where larger differences in minimum/maximum photoperiod exist) was associated with a significant reduction in the lifetime risk of depression. These findings are consistent with an emerging hypothesis in which perinatal light exposure may influence risk of depression, and they might be understood through the conceptual framework of adaptive developmental plasticity.
Copyright © 2018 Elsevier Ltd. All rights reserved.
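The photoperiod metrics in the study above were derived from birth date and birth state using published day-length equations. As a minimal sketch of how such a derivation works, the following uses the CBM day-length model (Forsythe et al., a commonly published formulation; whether this is the exact model the study used is an assumption):

```python
import math

def day_length_hours(latitude_deg: float, day_of_year: int) -> float:
    """Approximate photoperiod (hours of daylight) at a given latitude
    and day of year, per the CBM day-length model (sun-center horizon
    crossing, no twilight correction)."""
    # Revolution angle of the Earth around the Sun (radians)
    theta = 0.2163108 + 2 * math.atan(
        0.9671396 * math.tan(0.00860 * (day_of_year - 186))
    )
    # Solar declination (radians)
    phi = math.asin(0.39795 * math.cos(theta))
    lat = math.radians(latitude_deg)
    # Cosine of the daylight half-arc; clamp handles polar day/night
    x = math.sin(lat) * math.sin(phi) / (math.cos(lat) * math.cos(phi))
    x = max(-1.0, min(1.0, x))
    return 24 - (24 / math.pi) * math.acos(x)
```

At the equator this returns roughly 12 hours year-round; at 42°N it yields about 15 hours near the summer solstice and about 9 hours near the winter solstice, so the minimum/maximum photoperiod difference grows with latitude, as the abstract notes.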
BACKGROUND - Executive cognitive functions, including working memory, cognitive flexibility, and inhibition, are impaired in schizophrenia. Executive functions rely on coordinated information processing between the prefrontal cortex (PFC) and thalamus, particularly the mediodorsal nucleus. This raises the possibility that anatomical connectivity between the PFC and mediodorsal thalamus may be 1) reduced in schizophrenia and 2) related to deficits in executive function. The current investigation tested these hypotheses.
METHODS - Forty-five healthy subjects and 62 patients with a schizophrenia spectrum disorder completed a battery of tests of executive function and underwent diffusion-weighted imaging. Probabilistic tractography was used to quantify anatomical connectivity between six cortical regions, including PFC, and the thalamus. Thalamocortical anatomical connectivity was compared between healthy subjects and patients with schizophrenia using region-of-interest and voxelwise approaches, and the association between PFC-thalamic anatomical connectivity and severity of executive function impairment was examined in patients.
RESULTS - Anatomical connectivity between the thalamus and PFC was reduced in schizophrenia. Voxelwise analysis localized the reduction to areas of the mediodorsal thalamus connected to lateral PFC. Reduced PFC-thalamic connectivity in schizophrenia correlated with impaired working memory but not cognitive flexibility and inhibition. In contrast to reduced PFC-thalamic connectivity, thalamic connectivity with somatosensory and occipital cortices was increased in schizophrenia.
CONCLUSIONS - The results are consistent with models implicating disrupted PFC-thalamic connectivity in the pathophysiology of schizophrenia and mechanisms of cognitive impairment. PFC-thalamic anatomical connectivity may be an important target for procognitive interventions. Further work is needed to determine the implications of increased thalamic connectivity with sensory cortex.
Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
OBJECTIVE - Neurocognitive evaluations are commonly integrated with clinical assessment to evaluate adult Attention Deficit Hyperactivity Disorder (ADHD). The goals of this study were to identify the measures most strongly related to ADHD diagnosis and to determine their utility in screening.
PARTICIPANTS - 230 students who were evaluated at the Vanderbilt University Psychological and Counseling Center between July 2013 and October 2015.
METHODS - We retrospectively examined charts, including clinical diagnosis, family history, parent-reported childhood and self-reported current ADHD symptoms, psychiatric comorbidities, and continuous performance test (CPT) results.
RESULTS - Positive report of childhood and current ADHD symptoms, together with a lack of comorbid psychiatric symptoms, was strongly associated with clinical diagnosis. CPT results were not associated with an ADHD diagnosis. The absence of reported childhood and current ADHD symptoms may serve as evidence against an ADHD diagnosis.
CONCLUSION - Clinical assessment of ADHD symptoms and ADHD childhood history, but not CPT, contributes to an accurate diagnosis of ADHD in college-aged adults.
Cholesterol metabolism is vital for brain function. Previous work in cultured cells has shown that a number of psychotropic drugs inhibit the activity of 7-dehydrocholesterol reductase (DHCR7), an enzyme that catalyzes the final steps in cholesterol biosynthesis. This leads to the accumulation of 7-dehydrocholesterol (7DHC), a molecule that gives rise to oxysterols, vitamin D, and atypical neurosteroids. We examined levels of cholesterol and the cholesterol precursors desmosterol, lanosterol, 7DHC and its isomer 8-dehydrocholesterol (8DHC) in blood samples of 123 psychiatric patients on various antipsychotic and antidepressant drugs and 85 healthy controls, to see if the observations in cell lines hold true for patients as well. Three drugs (aripiprazole, haloperidol, and trazodone) increased circulating 7DHC and 8DHC levels, while five others (clozapine, escitalopram/citalopram, lamotrigine, olanzapine, and risperidone) did not. Studies in rat brain verified that haloperidol dose-dependently increased 7DHC and 8DHC levels, while clozapine had no effect. We conclude that further studies should investigate the role of 7DHC and 8DHC metabolites, such as oxysterols, vitamin D, and atypical neurosteroids, in the deleterious and therapeutic effects of psychotropic drugs. Finally, we recommend that drugs that increase 7DHC levels should not be prescribed during pregnancy, as children born with DHCR7 deficiency have multiple congenital malformations.
Copyright © 2017 Elsevier B.V. All rights reserved.
UNLABELLED - Early institutional care can be profoundly stressful for the human infant and, as such, can lead to significant alterations in brain development. In animal models, similar variants of early adversity have been shown to modify amygdala-hippocampal-prefrontal cortex development and associated aversive learning. The current study examined the effects of this rearing adversity on human development. Eighty-nine children and adolescents who were either previously institutionalized (PI youth; N = 46; 33 females and 13 males; age range, 7-16 years) or were raised by their biological parents from birth (N = 43; 22 females and 21 males; age range, 7-16 years) completed an aversive-learning paradigm while undergoing functional neuroimaging, wherein visual cues were paired with either an aversive sound (CS+) or no sound (CS-). For the PI youth, better aversive learning was associated with higher concurrent trait anxiety. Both groups showed robust learning and amygdala activation for CS+ versus CS- trials. However, PI youth also exhibited broader recruitment of several regions and increased hippocampal connectivity with prefrontal cortex. Stronger connectivity between the hippocampus and ventromedial PFC predicted significant improvements in future anxiety (measured 2 years later), and this was particularly true within the PI group. These results suggest that, for humans as for other species, early adversity alters the neurobiology of aversive learning, engaging a broader prefrontal-subcortical circuit than that engaged by same-aged peers. These differences are interpreted as ontogenetic adaptations and potential sources of resilience.
SIGNIFICANCE STATEMENT - Prior institutionalization is a significant form of early adversity. While nonhuman animal research suggests that early adversity alters aversive learning and associated neurocircuitry, no prior work has examined this in humans. Here, we show that youth who experienced prior institutionalization, but not comparison youth, recruit the hippocampus during aversive learning. Among youth who experienced prior institutionalization, individual differences in aversive learning were associated with worse current anxiety. However, connectivity between the hippocampus and prefrontal cortex prospectively predicted significant improvements in anxiety 2 years following scanning for previously institutionalized youth. Among youth who experienced prior institutionalization, age-atypical engagement of a distributed set of brain regions during aversive learning may serve a protective function.
Copyright © 2016 the authors 0270-6474/16/366421-11$15.00/0.
Research increasingly suggests that subjective cognitive decline (SCD) in older adults, in the absence of objective cognitive dysfunction or depression, may be a harbinger of non-normative cognitive decline and eventual progression to dementia. Little is known, however, about the key features of self-report measures currently used to assess SCD. The Subjective Cognitive Decline Initiative (SCD-I) Working Group is an international consortium established to develop a conceptual framework and research criteria for SCD (Jessen et al., 2014, Alzheimers Dement 10, 844-852). In the current study, we systematically compared cognitive self-report items used by 19 SCD-I Working Group studies, representing 8 countries and 5 languages. We identified 34 self-report measures comprising 640 cognitive self-report items. There was little overlap among measures: approximately 75% of measures were used by only one study. Wide variation existed in response options and item content. Items pertaining to the memory domain predominated, accounting for about 60% of items surveyed, followed by executive function and attention, with 16% and 11% of the items, respectively. Items relating to memory for the names of people and the placement of common objects were represented on the greatest percentage of measures (56% each). Working group members reported that instrument selection decisions were often based on practical considerations beyond the study of SCD specifically, such as availability and brevity of measures. Results document the heterogeneity of approaches across studies to the emerging construct of SCD. We offer preliminary recommendations for instrument selection and future research directions, including identifying items and measure formats associated with important clinical outcomes.
UNLABELLED - Disruptions in corollary discharge (CD), motor signals that send information to sensory areas and allow for prediction of sensory states, are argued to underlie the perceived loss of agency in schizophrenia. Behavioral and neurophysiological evidence for CD in primates comes largely from the saccadic double-step task, which requires participants to make two visually triggered saccadic eye movements in brief succession. Healthy individuals use CD to anticipate the change in eye position resulting from the first saccade when preparing the second saccade. In the current study with human participants, schizophrenia patients and healthy controls of both sexes performed a modified double-step task. Most trials required a saccade to a single visual target (T1). On a subset of trials, a second target (T2) was flashed shortly following T1. Subjects were instructed to look directly at T2. Healthy individuals also use CD to make rapid, corrective responses following erroneous saccades to T1. To assess CD in schizophrenia, we examined the following on error trials: (1) frequency and latency of corrective saccades, and (2) mislocalization of the corrective (second) saccade in the direction predicted by a failure to use CD to account for the first eye movement. Consistent with disrupted CD, patients made fewer and slower error corrections. Importantly, the corrective saccade vector angle was biased in a manner consistent with disrupted CD. These results provide novel and clear evidence for dysfunctional CD in the oculomotor system in patients with schizophrenia. Based on neurophysiology work, these disturbances might have their basis in medial thalamus dysfunction.
SIGNIFICANCE STATEMENT - According to the World Health Organization, acute schizophrenia carries more disability weight than any other disease, but its etiology remains unknown. One promising theory of schizophrenia highlights alterations in a sense of self, in which self-generated thoughts or actions are attributed externally. Disruptions in corollary discharge (CD), motor signals sent to sensory areas that allow for the prediction of impending sensations, are proposed to underlie these symptoms. Direct physiological evidence, however, is limited. In nonhuman primates, inactivation of mediodorsal thalamic neurons disrupts CD associated with eye movements. Using the same task, we show similar impairments in schizophrenia patients, consistent with disrupted CD. These findings allow us to link clinical phenomenology to primate neurophysiology and interpret findings within a biological framework.
Copyright © 2015 the authors 0270-6474/15/359935-11$15.00/0.
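The mislocalization prediction in the double-step study above follows from simple vector geometry: without corollary discharge, the corrective saccade is planned from T2's retinal position as seen from fixation rather than from the eye's true post-saccadic position. An illustrative sketch (the coordinates are hypothetical, in degrees of visual angle, and not taken from the study):

```python
import numpy as np

fixation = np.array([0.0, 0.0])  # initial fixation point
t1 = np.array([8.0, 0.0])        # first target (erroneously foveated)
t2 = np.array([8.0, 6.0])        # second target, flashed shortly after T1

# Intact CD: the first eye movement is accounted for, so the corrective
# saccade is planned from the current eye position (at T1).
saccade_with_cd = t2 - t1

# Disrupted CD: the first eye movement is not accounted for, so the
# corrective saccade is planned from T2's retinal position at fixation.
saccade_without_cd = t2 - fixation

# The predicted mislocalization equals the unaccounted-for first
# saccade vector, which biases the corrective saccade's vector angle.
predicted_bias = saccade_without_cd - saccade_with_cd

def angle_deg(v):
    """Direction of a saccade vector in degrees (counterclockwise from +x)."""
    return float(np.degrees(np.arctan2(v[1], v[0])))
```

With these numbers, an intact-CD corrective saccade points straight up (90°), whereas the disrupted-CD prediction is rotated toward the fixation-to-T2 direction (≈37°), which is the kind of angular bias the study measured on error trials.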