The publication data currently available has been vetted by Vanderbilt faculty, staff, administrators and trainees. The data itself is retrieved directly from NCBI's PubMed and is automatically updated on a weekly basis to ensure accuracy and completeness.
OBJECTIVE - Neurocognitive evaluations are commonly integrated with clinical assessment to evaluate adult Attention Deficit Hyperactivity Disorder (ADHD). The goal of this study was to identify the measures most strongly related to an ADHD diagnosis and to determine their utility in screening.
PARTICIPANTS - 230 students who were evaluated at the Vanderbilt University Psychological and Counseling Center between July 2013 and October 2015.
METHODS - We retrospectively examined charts, including clinical diagnosis, family history, parent-reported childhood and self-reported current ADHD symptoms, psychiatric comorbidities, and continuous performance test (CPT) results.
RESULTS - Positive reports of childhood and current ADHD symptoms, and the absence of comorbid psychiatric symptoms, were strongly associated with a clinical diagnosis. CPT results were not associated with an ADHD diagnosis. The absence of reported childhood and current ADHD symptoms may serve as evidence against an ADHD diagnosis.
CONCLUSION - Clinical assessment of ADHD symptoms and ADHD childhood history, but not CPT, contributes to an accurate diagnosis of ADHD in college-aged adults.
Individuals differ greatly in their sensitivity to rewards and punishments. In the extreme, these differences are implicated in a range of psychiatric disorders from addiction to depression. However, it is unclear how these differences influence the recruitment of attention, working memory, and long-term memory when responding to potential rewards. Here, we used a rewarded memory-guided visual search task and ERPs to examine the influence of individual differences in self-reported reward/punishment sensitivity, as measured by the Behavioral Inhibition System (BIS)/Behavioral Activation System (BAS) scales, on the recruitment of cognitive mechanisms in conditions of potential reward. Select subscales of the BAS, including the fun seeking and reward responsiveness scales, showed unique relationships with context updating to reward cues and working memory maintenance of potentially rewarded stimuli. In contrast, BIS scores showed unique relationships with deployment of attention at different points in the task. These results suggest that sensitivity to rewards (i.e., BAS) and to punishment (i.e., BIS) may play an important role in the recruitment of specific and distinct cognitive mechanisms in conditions of potential rewards.
© 2017 Society for Psychophysiological Research.
Recent work suggests sensory seeking predicts later social symptomatology through reduced social orienting in infants who are at high risk for autism spectrum disorder (ASD) based on their status as younger siblings of children diagnosed with ASD. We drew on extant longitudinal data from a community sample of at-risk infants who were identified at 12 months using the First Year Inventory and followed to 3-5 years of age. We replicate findings of Damiano et al. (in this issue) that a) high-risk infants who go on to be diagnosed with ASD show heightened sensory seeking in the second year of life relative to those who do not receive a diagnosis, and b) increased sensory seeking indirectly relates to later social symptomatology via reduced social orienting. We extend previous findings to show that sensory seeking has more clinical utility later in the second year of life (20-24 months) than earlier (13-15 months). Further, this study suggests that diminished attention disengagement at 12-15 months may precede and predict increased sensory seeking at 20-24 months. Findings add support for the notion that sensory features produce cascading effects on social development in infants at risk for ASD, and suggest that reduced attention disengagement early in life may set off this cascade.
Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
OBJECTIVE - Late-life depression is associated with cognitive deficits and increased risk for cognitive decline. The purpose of the study was to determine whether clinical characteristics could serve as phenotypes informative of subsequent cognitive decline. Age at depression onset and antidepressant remission at 3 months (acute response) and 12 months (chronic response) were examined.
METHODS - In a longitudinal study of late-life depression in an academic center, 273 depressed and 164 never-depressed community-dwelling elders aged 60 years or older were followed on average for over 5 years. Participants completed annual neuropsychological testing. Neuropsychological measures were converted to z-scores derived from the baseline performance of all participants. Cognitive domain scores at each time were then created by averaging z-scores across tests, grouped into domains of episodic memory, attention-working memory, verbal fluency, and executive function.
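The scoring approach described above can be sketched as follows. This is an illustrative reconstruction, not the study's actual code: test names, domain groupings, and sample values are hypothetical, and only the stated method (z-scores from baseline norms, averaged within domains) comes from the abstract.

```python
# Sketch of the described scoring: standardize each test against the baseline
# mean/SD of all participants, then average z-scores within each domain.
# Test and domain names below are illustrative, not the study's.
from statistics import mean, stdev

def baseline_norms(baseline_scores):
    """Per-test (mean, SD) from the baseline visits of all participants."""
    return {test: (mean(vals), stdev(vals))
            for test, vals in baseline_scores.items()}

def domain_scores(visit_scores, norms, domains):
    """Convert raw scores to z-scores, then average within each domain."""
    z = {t: (s - norms[t][0]) / norms[t][1] for t, s in visit_scores.items()}
    return {d: mean(z[t] for t in tests if t in z)
            for d, tests in domains.items()}

# Hypothetical baseline data for two episodic-memory tests
baseline = {"word_recall": [10, 12, 14, 16], "story_recall": [20, 22, 24, 26]}
norms = baseline_norms(baseline)
domains = {"episodic_memory": ["word_recall", "story_recall"]}
print(domain_scores({"word_recall": 13, "story_recall": 23}, norms, domains))
```

Scores at any follow-up visit are standardized against the same baseline norms, so domain scores at later time points remain comparable to baseline.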
RESULTS - Depressed participants exhibited poorer performance at baseline and greater subsequent decline in all domains. Early-onset depressed individuals exhibited greater decline in all domains than the late-onset or nondepressed groups. Regarding remission, both remitters and nonremitters at 3 and 12 months exhibited greater decline in episodic memory and attention-working memory than nondepressed subjects. Three-month remitters also exhibited greater decline in verbal fluency and executive function, whereas 12-month nonremitters exhibited greater decline in executive function than the other groups.
CONCLUSION - Consistent with past studies, depressed elders exhibit greater cognitive decline than nondepressed subjects, particularly individuals with early depression onset, supporting the theory that repeated depressive episodes may contribute to decline. Clinical remission is not associated with less cognitive decline.
Published by Elsevier Inc.
Visual stimuli with emotional content appearing in close temporal proximity either before or after a target stimulus can hinder conscious perceptual processing of the target via an emotional attentional blink (EAB). This occurs for targets that appear after the emotional stimulus (forward EAB) and for those appearing before the emotional stimulus (retroactive EAB). Additionally, the traditional attentional blink (AB) occurs because detection of any target hinders detection of a subsequent target. The present study investigated the relations between these different attentional processes. Rapid sequences of landscape images were presented to thirty-one male participants with occasional landscape targets (rotated images). For the forward EAB, emotional or neutral distractor images of people were presented before the target; for the retroactive EAB, such images were also targets and presented after the landscape target. In the latter case, this design allowed investigation of the AB as well. Erotic and gory images caused more EABs than neutral images, but there were no differential effects on the AB. This pattern is striking because while using different target categories (rotated landscapes, people) appears to have eliminated the AB, the retroactive EAB still occurred, offering additional evidence for the power of emotional stimuli over conscious attention.
BACKGROUND - Cognitive bias is a common characteristic of major depressive disorder (MDD) and is posited to remain during remission and contribute to recurrence risk. Attention bias may be related to enhanced amygdala activity or altered amygdala functional connectivity in depression. The current study examined attention bias, brain activity for emotional images, and functional connectivity in post-menopausal women with and without a history of major depression.
METHODS - Attention bias for emotionally valenced images was examined in 33 postmenopausal women with (n=12) and without (n=21) a history of major depression using an emotion dot probe task during fMRI. Group differences in amygdala activity and functional connectivity were assessed using fMRI and examined for correlations to attention performance.
RESULTS - Women with a history of MDD showed greater attentional bias for negative images and greater activity in brain areas including the amygdala for both positive and negative images (corrected p < 0.001) than women without a history of MDD. In all participants, amygdala activity for negative images was correlated with attention facilitation for emotional images. Women with a history of MDD had significantly greater functional connectivity between the amygdala and hippocampal complex. In all participants, amygdala-hippocampal connectivity was positively correlated with attention facilitation for negative images.
LIMITATIONS - Small sample with unbalanced groups.
CONCLUSIONS - These findings provide evidence for negative attentional bias in euthymic, remitted depressed individuals. Activity and functional connectivity in limbic and attention networks may provide a neurobiological basis for continued cognitive bias in remitted depression.
Copyright © 2016 Elsevier B.V. All rights reserved.
Learning from past decisions can enhance successful decision-making. It is unclear whether difficulties in learning from experience may contribute to risky decision-making, which may be altered among individuals with attention-deficit/hyperactivity disorder (ADHD). This study followed 192 children with and without ADHD, aged 5 to 10 years, for approximately 2.5 years and examined their risky decision-making using the Balloon Emotional Learning Task (BELT), a computerized assessment of sequential risky decision-making in which participants pump up a series of virtual balloons for points. The BELT contains three task conditions: one with a variable explosion point, one with a stable and early explosion point, and one with a stable and late explosion point. These conditions may be learned via experience on the task. Contrary to expectations, ADHD status was not related to greater risk-taking on the BELT; among younger children, ADHD status was in fact associated with reduced risk-taking. In addition, the typically developing children without ADHD showed significant learning-related gains on both stable task conditions. The children with ADHD, however, demonstrated learning on the condition with a stable and early explosion point, but not on the condition with a stable and late explosion point, in which more pumps are required before learning when the balloon will explode. Learning during decision-making may be more difficult for children with ADHD. Because adapting to changing environmental demands requires the use of feedback to guide future behavior, negative outcomes associated with childhood ADHD may partially reflect difficulties in learning from experience.
Severe hyperactivity and impulsivity are common reasons for referral to infant mental health services. Past versions of ZERO TO THREE's diagnostic nosology, the Diagnostic Classification of Mental and Developmental Disorders in Infancy and Early Childhood (DC:0-3), did not address this clinical issue because it had been addressed in other nosologies. These general diagnostic nosologies describe attention deficit hyperactivity disorder (ADHD), but with little attention to developmentally specific aspects of the diagnosis in very young children. Categorical diagnosis related to hyperactivity and impulsivity in very young children warrants careful review of the existing literature. Explicit attention must be paid to ensuring that categorical diagnoses describe syndromes causing significant impairment, so that children and families can access effective supports, and that behaviors typical of the developmental level are not described as pathologic. This article reviews proposed diagnostic criteria for ADHD and for overactivity disorder of toddlerhood, the rationale for the criteria, and evidence supporting the validity and reliability of these diagnoses in very young children. Clinical implications are also presented.
© 2016 Michigan Association for Infant Mental Health.
BACKGROUND - Early life stress is associated with poorer social functioning. Attentional biases in response to threat-related cues, linked to both early experience and psychopathology, may explain this association. To date, however, no study has examined attentional biases to fearful facial expressions as a function of early life stress or examined these biases as a potential mediator of the relation between early life stress and social problems.
METHODS - In a sample of 154 children (ages 9-13 years), we examined the associations among interpersonal early life stressors (i.e., occurring from birth through age 6 years), attentional biases to emotional facial expressions using a dot-probe task, and social functioning on the Child Behavior Checklist.
RESULTS - High levels of early life stress were associated with both greater levels of social problems and an attentional bias away from fearful facial expressions, even after accounting for stressors occurring in later childhood. No biases were found for happy or sad facial expressions as a function of early life stress. Finally, attentional biases to fearful faces mediated the association between early life stress and social problems.
CONCLUSIONS - Attentional avoidance of fearful facial expressions, evidenced by a bias away from these stimuli, may be a developmental response to early adversity and link the experience of early life stress to poorer social functioning.
© 2016 Association for Child and Adolescent Mental Health.
In this brief review, we argue that attention operates along a hierarchy from peripheral through central mechanisms. We further argue that these mechanisms are distinguished not just by their functional roles in cognition, but also by a distinction between serial mechanisms (associated with central attention) and parallel mechanisms (associated with midlevel and peripheral attention). In particular, we suggest that peripheral attentional deployments in distinct representational systems may be maintained simultaneously with little or no interference, but that the serial nature of central attention means that even tasks that largely rely on distinct representational systems will come into conflict when central attention is demanded. We go on to review both the behavioral and neural evidence for this prediction. We conclude that even though the existing evidence mostly favors our account of serial central and parallel noncentral attention, we know of no experiment that has conclusively borne out these claims. As such, this article offers a framework of attentional mechanisms that will aid in guiding future research on this topic.