For many cochlear implant (CI) users, visual cues are vitally important for interpreting the impoverished auditory speech information that an implant conveys. Although the temporal relationship between auditory and visual stimuli is crucial for how this information is integrated, audiovisual temporal processing in CI users is poorly understood. In this study, we tested unisensory (auditory alone, visual alone) and multisensory (audiovisual) temporal processing in postlingually deafened CI users (n = 48) and normal-hearing controls (n = 54) using simultaneity judgment (SJ) and temporal order judgment (TOJ) tasks. We varied the relative onset timing of the auditory and visual components of either a syllable/viseme or a simple flash/beep pairing, and participants indicated either which stimulus appeared first (TOJ) or whether the pair occurred simultaneously (SJ). Results indicate that temporal binding windows - the interval within which stimuli are likely to be perceptually 'bound' - did not differ significantly between groups for either speech or non-speech stimuli. However, the point of subjective simultaneity for speech was less visually leading in CI users, who, interestingly, also had improved visual-only TOJ thresholds. Further signal detection analysis suggests that this SJ shift may be due to greater visual bias within the CI group, perhaps reflecting heightened attentional allocation to visual cues.
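The binding-window construct described above can be estimated directly from SJ data. A minimal sketch, using invented response proportions (not data from this study) and one common convention, defining the window as the SOA range over which "simultaneous" responses exceed a criterion:

```python
# Hypothetical SOAs (ms; negative = auditory-leading) and the proportion of
# "simultaneous" responses at each SOA for one illustrative participant.
# These numbers are invented for the sketch, not data from the study.
soas = [-400, -300, -200, -100, 0, 100, 200, 300, 400]
p_sim = [0.05, 0.20, 0.55, 0.85, 0.95, 0.80, 0.45, 0.15, 0.05]

def binding_window(soas, p, criterion=0.75):
    """Estimate the temporal binding window as the SOA range over which the
    proportion of 'simultaneous' responses meets a criterion, linearly
    interpolating between sampled SOAs. (Studies often instead fit a
    Gaussian or sigmoid model to the same data; this is the simplest form.)"""
    crossings = []
    for i in range(len(soas) - 1):
        lo, hi = p[i], p[i + 1]
        if (lo - criterion) * (hi - criterion) < 0:  # criterion crossed here
            frac = (criterion - lo) / (hi - lo)
            crossings.append(soas[i] + frac * (soas[i + 1] - soas[i]))
    return min(crossings), max(crossings)

left, right = binding_window(soas, p_sim)
width = right - left  # window width in ms for this illustrative data set
```

An asymmetry of the window around 0 ms (here, a longer auditory-leading side) is what the abstract's point of subjective simultaneity captures.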
In addition to deficits in social communication, individuals diagnosed with Autism Spectrum Disorder (ASD) frequently exhibit changes in sensory and multisensory function. Recent evidence has focused on changes in audiovisual temporal processing, and has sought to relate these sensory-based changes to weaknesses in social communication. These changes in audiovisual temporal function manifest as differences in the temporal epoch or "window" within which paired auditory and visual stimuli are integrated or bound, with those with ASD exhibiting expanded audiovisual temporal binding windows (TBWs). However, it is unknown whether this impairment is unique to audiovisual pairings, perhaps because of their relevance for speech processing, or whether it generalizes across pairings in different sensory modalities. In addition to the exteroceptive senses, there has been growing interest in ASD research in interoception (e.g., the monitoring of respiration, heartbeat, hunger, etc.), as these internally directed sensory processes also appear to be altered in autism. In the current study, we sought to examine both exteroception and interoception in individuals with ASD and a group of typically developing (TD) matched controls, with an emphasis on temporal perception of audiovisual (exteroceptive) and cardiovisual (interoceptive to exteroceptive) cues. Results replicate prior findings showing expanded audiovisual TBWs in ASD in comparison to TD. Strikingly, cardiovisual TBWs were also fourfold larger in ASD than in TD, suggesting a putative lack of cardiovisual temporal acuity in individuals with ASD. Results are discussed in light of recent evidence indicating a reduced tendency to rely on sensory priors in ASD. Autism Res 2018, 11: 194-205. © 2017 International Society for Autism Research, Wiley Periodicals, Inc.
LAY SUMMARY - Studies have shown that individuals with autism have difficulty in separating auditory and visual events in time. People with autism also weight sensory evidence originating from the external world and from their body differently. We measured simultaneity judgments between visual and auditory events and between visual and heartbeat events. Results suggest that while individuals with autism show unusual temporal function across the senses in a general manner, this deficit is greater when pairings bridge the external world and the internal body.
© 2017 International Society for Autism Research, Wiley Periodicals, Inc.
Spatial localization of touch is critically dependent upon coordinate transformation between different reference frames, which must ultimately allow for alignment between somatotopic and external representations of space. Although prior work has shown an important role for cues such as body posture in influencing the spatial localization of touch, the relative contributions of the different sensory systems to this process are unknown. In the current study, we had participants perform a tactile temporal order judgment (TOJ) under different body postures and conditions of sensory deprivation. Specifically, participants performed non-speeded judgments about the order of two tactile stimuli presented in rapid succession on their ankles during conditions in which their legs were either uncrossed or crossed (thus bringing somatotopic and external reference frames into conflict). These judgments were made in the absence of 1) visual, 2) auditory, or 3) combined audio-visual spatial information by blindfolding and/or placing participants in an anechoic chamber. As expected, results revealed that tactile temporal acuity was poorer under crossed than uncrossed leg postures. Intriguingly, results also revealed that auditory and audio-visual deprivation exacerbated the difference in tactile temporal acuity between uncrossed and crossed leg postures, an effect not seen for visual-only deprivation. Furthermore, the effects under combined audio-visual deprivation were greater than those seen for auditory deprivation. Collectively, these results indicate that mechanisms governing the alignment between somatotopic and external reference frames extend beyond those imposed by body posture to include spatial features conveyed by the auditory and visual modalities - with a heavier weighting of auditory than visual spatial information. Thus, sensory modalities conveying exteroceptive spatial information contribute to judgments regarding the localization of touch.
Copyright © 2016 Elsevier Ltd. All rights reserved.
Using the Garner speeded classification task, Amishav and Kimchi (Psychonomic Bulletin & Review, 17, 743-748, 2010) found that participants could selectively attend to face features: Classifying faces based on the shape of the eyes was not influenced by task-irrelevant variation in the shape of the mouth, and vice versa. This result contrasts with a large body of work using another selective attention task, the composite task, in which participants are unable to selectively attend to face parts: Same/different judgments for one-half of a composite face are influenced by the same/different status of the task-irrelevant half of that composite face. In Amishav and Kimchi, faces all shared a common configuration of face features. By contrast, configuration is typically not controlled in the composite task. We asked whether failures of selective attention observed in the composite task are caused by faces varying in both features and configuration. In two experiments, we found that participants exhibited failures of selective attention to face parts in the composite task even when configuration was held constant, which is inconsistent with Amishav and Kimchi's conclusion that face features can be processed independently unless configuration varies. Although both measure failures of selective attention, the Garner task and composite task appear to measure different mechanisms involved in holistic face perception.
Social impairment is a core feature of schizophrenia, present from the pre-morbid stage and predictive of outcome, but the etiology of this deficit remains poorly understood. Successful and adaptive social interactions depend on one's ability to make rapid and accurate judgments about others in real time. Our surprising ability to form accurate first impressions from brief exposures, known as "thin slices" of behavior, has been studied extensively in healthy participants. We sought to examine affect and social trait judgment from thin slices of static or video stimuli in order to investigate the ability of individuals with schizophrenia to form reliable social impressions of others. Twenty-one individuals with schizophrenia (SZ) and 20 matched healthy participants (HC) were asked to identify emotions and social traits for actors in standardized face stimuli as well as brief video clips. Sound was removed from the videos to eliminate verbal cues. Clinical symptoms in SZ and delusional ideation in both groups were measured. Results showed a general impairment in affect recognition for both types of stimuli in SZ. However, the two groups did not differ in their judgments of trustworthiness, approachability, attractiveness, and intelligence. Interestingly, in SZ, the severity of positive symptoms was correlated with higher ratings of attractiveness, trustworthiness, and approachability. Finally, increased delusional ideation in SZ was associated with a tendency to rate others as more trustworthy, while the opposite was true for HC. These findings suggest that complex social judgments in SZ are affected by symptomatology.
Copyright © 2014 Elsevier B.V. All rights reserved.
In classic category learning studies, subjects typically learn to assign items to 1 of 2 categories, with no further distinction between how items on each side of the category boundary should be treated. In real life, however, we often learn categories that dictate further processing goals, for instance, with objects in only 1 category requiring further individuation. Using methods from category learning and perceptual expertise, we studied the perceptual consequences of experience with objects in tasks that rely on attention to different dimensions in different parts of the space. In 2 experiments, subjects first learned to categorize complex objects from a single morphspace into 2 categories based on 1 morph dimension, and then learned to perform a different task, either naming or a local feature judgment, for each of the 2 categories. A same-different discrimination test before and after each training measured sensitivity to feature dimensions of the space. After initial categorization, sensitivity increased along the category-diagnostic dimension. After task association, sensitivity increased more for the category that was named, especially along the nondiagnostic dimension. The results demonstrate that local attentional weights, associated with individual exemplars as a function of task requirements, can have lasting effects on perceptual representations.
Neuropsychological tests are useful for diagnosing Alzheimer's disease (AD), yet for many tests, diagnostic accuracy statistics are unavailable. We present diagnostic accuracy statistics for seven variables from the Neuropsychological Assessment Battery (NAB) that were administered to a large sample of elderly adults (n = 276) participating in a longitudinal research study at a national AD Center. Tests included Driving Scenes, Bill Payment, Daily Living Memory, Screening Visual Discrimination, Screening Design Construction, and Judgment. Clinical diagnosis was made independent of these tests, and for the current study, participants were categorized as AD (n = 65) or non-AD (n = 211). Receiver operating characteristic (ROC) curve analysis was used to determine each test's sensitivity and specificity at multiple cut points, which were subsequently used to calculate positive and negative predictive values at a variety of base rates. Of the tests analyzed, the Daily Living Memory test provided the greatest accuracy in the identification of AD, while the two Screening measures required a significant tradeoff between sensitivity and specificity. Overall, the seven NAB subtests included in the current study are capable of excellent diagnostic accuracy, but appropriate understanding of the context in which the tests are used is crucial for minimizing errors.
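The base-rate dependence noted above follows directly from Bayes' rule: predictive values are fully determined by a cut point's sensitivity and specificity together with the prevalence of the condition. A sketch with illustrative numbers (not the NAB values reported in the study):

```python
def predictive_values(sensitivity, specificity, base_rate):
    """Positive and negative predictive values via Bayes' rule, where
    base_rate is the prevalence of the condition (e.g., AD) in the
    population being screened."""
    tp = sensitivity * base_rate              # true positives
    fp = (1 - specificity) * (1 - base_rate)  # false positives
    fn = (1 - sensitivity) * base_rate        # false negatives
    tn = specificity * (1 - base_rate)        # true negatives
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Illustrative (not study) values: the same cut point with 0.90 sensitivity
# and 0.80 specificity performs very differently at a 10% vs. a 40% base rate.
ppv_low, npv_low = predictive_values(0.90, 0.80, 0.10)
ppv_high, npv_high = predictive_values(0.90, 0.80, 0.40)
```

At the 10% base rate the positive predictive value is only about one-third, despite high sensitivity, which is why the abstract stresses the context in which a test is used.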
The importance of multisensory integration for human behavior and perception is well documented, as is the impact that temporal synchrony has on driving such integration. Thus, the more temporally coincident two sensory inputs from different modalities are, the more likely they will be perceptually bound. This temporal integration process is captured by the construct of the temporal binding window - the range of temporal offsets within which an individual is able to perceptually bind inputs across sensory modalities. Recent work has shown that this window is malleable and can be narrowed via a multisensory perceptual feedback training process. In the current study, we sought to extend this by examining the malleability of the multisensory temporal binding window through changes in unisensory experience. Specifically, we measured the ability of visual perceptual feedback training to induce changes in the multisensory temporal binding window. Visual perceptual training with feedback successfully improved temporal visual processing, and more importantly, this visual training increased temporal precision across modalities, which manifested as a narrowing of the multisensory temporal binding window. These results are the first to establish the ability of unisensory temporal training to modulate multisensory temporal processes, findings that can provide mechanistic insights into multisensory integration and which may have a host of practical applications.
In natural environments, human sensory systems work in a coordinated and integrated manner to perceive and respond to external events. Previous research has shown that the spatial and temporal relationships of sensory signals are paramount in determining how information is integrated across sensory modalities, but in ecologically plausible settings, these factors are not independent. In the current study, we provide a novel exploration of the impact on behavioral performance for systematic manipulations of the spatial location and temporal synchrony of a visual-auditory stimulus pair. Simple auditory and visual stimuli were presented across a range of spatial locations and stimulus onset asynchronies (SOAs), and participants performed both a spatial localization and simultaneity judgment task. Response times in localizing paired visual-auditory stimuli were slower in the periphery and at larger SOAs, but most importantly, an interaction was found between the two factors, in which the effect of SOA was greater in peripheral as opposed to central locations. Simultaneity judgments also revealed a novel interaction between space and time: individuals were more likely to judge stimuli as synchronous when occurring in the periphery at large SOAs. The results of this study provide novel insights into (a) how the speed of spatial localization of an audiovisual stimulus is affected by location and temporal coincidence and the interaction between these two factors and (b) how the location of a multisensory stimulus impacts judgments concerning the temporal relationship of the paired stimuli. These findings provide strong evidence for a complex interdependency between spatial location and temporal structure in determining the ultimate behavioral and perceptual outcome associated with a paired multisensory (i.e., visual-auditory) stimulus.
BACKGROUND - Previous studies indicate that the transition to psychosis is associated with dynamic changes of hippocampal integrity. Here we explored hippocampal volume and neural activation during a relational memory task in patients who were in the early stage of a psychotic illness.
METHODS - Forty-one early psychosis patients and 34 healthy control subjects completed a transitive inference (TI) task used previously in chronic schizophrenia patients. Participants learned to select the "winner" of two sets of stimulus pairs drawn from an overlapping sequence (A > B > C > D > E) and a nonoverlapping set (a > b, c > d, e > f, g > h). During a functional magnetic resonance imaging scan, participants were tested on the trained pairs and made inferential judgments on novel pairings that could be solved based on training (e.g., B vs. D). Hippocampal volumes were manually segmented and compared between groups. Functional magnetic resonance imaging analyses included 27 early psychosis patients and 30 control subjects who met memory training criteria.
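The trial structure of the transitive inference task described in the Methods can be made concrete. A sketch of the standard TI pairing logic, using the abstract's overlapping sequence (the pair-generation code is illustrative, not taken from the study's materials):

```python
# Overlapping sequence from the task: A > B > C > D > E.
hierarchy = ["A", "B", "C", "D", "E"]

# Trained premise pairs: adjacent items only; the first item always "wins".
premise_pairs = [(hierarchy[i], hierarchy[i + 1])
                 for i in range(len(hierarchy) - 1)]

# Novel inference pairs: non-adjacent items never shown together during
# training, so they can only be solved by relating the learned premises.
inference_pairs = [(hierarchy[i], hierarchy[j])
                   for i in range(len(hierarchy))
                   for j in range(i + 2, len(hierarchy))]
# ('B', 'D') is the classic probe cited in the abstract: answering it
# requires chaining B > C and C > D, the relational memory demand of
# interest for the hippocampus.
```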
RESULTS - Groups did not differ on inference performance or hippocampal volume and exhibited similar activation of medial temporal regions when judging nonoverlapping pairs. However, patients who failed to meet memory training criteria had smaller hippocampal volumes. Neural activity during TI was less widespread in early psychosis patients, but between-group differences were not significant. Hippocampal activity during TI was positively correlated with inference performance only in control subjects.
CONCLUSIONS - Our results provide evidence that relational memory impairment and hippocampal abnormalities, well established in chronic schizophrenia, are not fully present in early psychosis patients. This provides a rationale for early intervention, targeting the possible delay, reduction, or prevention of these deficits.
Copyright © 2012 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.