About this data

The publication data currently available has been vetted by Vanderbilt faculty, staff, administrators, and trainees. The data itself is retrieved directly from NCBI's PubMed and is updated automatically each week to ensure accuracy and completeness.

If you have any questions or comments, please contact us.

Results: 1 to 10 of 53


Integrated Structural Biology for α-Helical Membrane Protein Structure Determination.
Xia Y, Fischer AW, Teixeira P, Weiner B, Meiler J
(2018) Structure 26: 657-666.e2
MeSH Terms: Algorithms, Binding Sites, Electron Spin Resonance Spectroscopy, Humans, Membrane Proteins, Microscopy, Electron, Models, Molecular, Monte Carlo Method, Nuclear Magnetic Resonance, Biomolecular, Protein Binding, Protein Conformation, alpha-Helical, Protein Folding, Protein Interaction Domains and Motifs, Rhodopsin, Thermodynamics
Added March 17, 2018
While great progress has been made, only 10% of the nearly 1,000 integral, α-helical, multi-span membrane protein families are represented by at least one experimentally determined structure in the PDB. Previously, we developed the algorithm BCL::MP-Fold, which samples the large conformational space of membrane proteins de novo by assembling predicted secondary structure elements guided by knowledge-based potentials. Here, we present a case study of rhodopsin fold determination by integrating sparse and/or low-resolution restraints from multiple experimental techniques including electron microscopy, electron paramagnetic resonance spectroscopy, and nuclear magnetic resonance spectroscopy. Simultaneous incorporation of orthogonal experimental restraints not only significantly improved the sampling accuracy but also allowed identification of the correct fold, which is demonstrated by a protein size-normalized transmembrane root-mean-square deviation as low as 1.2 Å. The protocol developed in this case study can be used for the determination of unknown membrane protein folds when limited experimental restraints are available.
Copyright © 2018 Elsevier Ltd. All rights reserved.
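
The protocol combines a knowledge-based folding score with penalties for disagreement with the experimental restraints inside a Monte Carlo search. As a rough illustration of that idea only (not the BCL::MP-Fold implementation; the one-dimensional "helix positions", score terms, and parameter values below are hypothetical stand-ins), a Metropolis loop over a combined score looks like this:

```python
import math
import random

def combined_score(state, restraints, weight=1.0):
    """Toy score: a smoothness term standing in for the knowledge-based
    potential, plus a quadratic penalty for deviation from sparse
    distance restraints given as (i, j, target_distance)."""
    knowledge = sum((state[i + 1] - state[i]) ** 2 for i in range(len(state) - 1))
    restraint = sum((abs(state[i] - state[j]) - d) ** 2 for i, j, d in restraints)
    return knowledge + weight * restraint

def metropolis_assemble(n_helices=7, restraints=(), steps=10000, temperature=1.0):
    """Metropolis Monte Carlo over toy one-dimensional 'helix positions'."""
    state = [random.uniform(0.0, 10.0) for _ in range(n_helices)]
    score = combined_score(state, restraints)
    for _ in range(steps):
        trial = list(state)
        trial[random.randrange(n_helices)] += random.gauss(0.0, 0.5)  # perturb one element
        trial_score = combined_score(trial, restraints)
        # Always accept downhill moves; accept uphill moves with Boltzmann
        # probability so the search can escape local minima.
        if trial_score <= score or random.random() < math.exp((score - trial_score) / temperature):
            state, score = trial, trial_score
    return state, score

# Three sparse, EPR-like distance restraints between helix indices.
best, best_score = metropolis_assemble(restraints=[(0, 3, 4.0), (1, 5, 6.0), (2, 6, 8.0)])
print(best_score)
```
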
0 Communities · 1 Member · 0 Resources · 15 MeSH Terms
Using two-site binding models to analyze microscale thermophoresis data.
Tso SC, Chen Q, Vishnivetskiy SA, Gurevich VV, Iverson TM, Brautigam CA
(2018) Anal Biochem 540-541: 64-75
MeSH Terms: Adenosine Monophosphate, Algorithms, Animals, Aptamers, Nucleotide, Binding Sites, Cattle, Kinetics, Models, Molecular, Monte Carlo Method, Mutagenesis, Site-Directed, Phytic Acid, Protein Binding, Recombinant Proteins, beta-Arrestin 2
Added March 14, 2018
The emergence of microscale thermophoresis (MST) as a technique for determining the dissociation constants for bimolecular interactions has enabled these quantities to be measured in systems that were previously difficult or impracticable. However, most models for analyses of these data featured the assumption of a simple 1:1 binding interaction. The only model widely used for multiple binding sites was the Hill equation. Here, we describe two new MST analytic models that assume a 1:2 binding scheme: the first features two microscopic binding constants (K₁ and K₂), while the other assumes symmetry in the bivalent molecule, culminating in a model with a single macroscopic dissociation constant (K) and a single factor (α) that accounts for apparent cooperativity in the binding. We also discuss the general applicability of the Hill equation for MST data. The performances of the algorithms on both real and simulated data are assessed, and implementation of the algorithms in the MST analysis program PALMIST is discussed.
Copyright © 2017 Elsevier Inc. All rights reserved.
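
For readers who want the shape of the two models: the standard 1:2 binding polynomial gives state populations from the stepwise dissociation constants, and the symmetric model follows from statistical factors. A minimal sketch, assuming the common convention Kd₁ = K/2 and Kd₂ = 2Kα and approximating free ligand by total ligand (valid when ligand is in large excess); this is illustrative, not PALMIST's exact parameterization:

```python
import numpy as np

def two_site_signal(L, kd1, kd2, s_free, s1, s2):
    """Predicted signal for a 1:2 scheme with stepwise (macroscopic)
    dissociation constants kd1, kd2; L is the free-ligand concentration.
    The signal is the population-weighted average over the three states."""
    z = 1.0 + L / kd1 + L**2 / (kd1 * kd2)                 # binding polynomial
    f0, f1, f2 = 1.0 / z, (L / kd1) / z, (L**2 / (kd1 * kd2)) / z
    return f0 * s_free + f1 * s1 + f2 * s2

def symmetric_two_site_signal(L, K, alpha, s_free, s1, s2):
    """Symmetric bivalent variant: microscopic constant K plus a cooperativity
    factor alpha on the second step; the factors 1/2 and 2 count the number of
    binding/unbinding paths (one common convention)."""
    return two_site_signal(L, K / 2.0, 2.0 * K * alpha, s_free, s1, s2)

# Predicted signals across a titration (hypothetical values).
L = np.logspace(-9, -3, 12)
print(symmetric_two_site_signal(L, K=1e-6, alpha=0.3, s_free=800.0, s1=850.0, s2=900.0))
```

In practice such a model would be fit to the measured MST traces with a nonlinear least-squares routine such as scipy.optimize.curve_fit.
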
0 Communities · 2 Members · 0 Resources · 14 MeSH Terms
PyDREAM: high-dimensional parameter inference for biological models in python.
Shockley EM, Vrugt JA, Lopez CF
(2018) Bioinformatics 34: 695-697
MeSH Terms: Algorithms, Calibration, Computational Biology, Markov Chains, Models, Biological, Monte Carlo Method, Software, Uncertainty
Added March 14, 2018
Summary - Biological models contain many parameters whose values are difficult to measure directly via experimentation and therefore require calibration against experimental data. Markov chain Monte Carlo (MCMC) methods are well suited to estimating multivariate posterior distributions of model parameters, but these methods may exhibit slow or premature convergence in high-dimensional search spaces. Here, we present PyDREAM, a Python implementation of the (Multiple-Try) Differential Evolution Adaptive Metropolis [DREAM(ZS)] algorithm developed by Vrugt and ter Braak (2008) and Laloy and Vrugt (2012). PyDREAM achieves excellent performance for complex, parameter-rich models and takes full advantage of distributed computing resources, facilitating parameter inference and uncertainty estimation for CPU-intensive biological models.
Availability and implementation - PyDREAM is freely available under the GNU GPLv3 license from the Lopez lab GitHub repository at http://github.com/LoLab-VU/PyDREAM.
Contact - c.lopez@vanderbilt.edu.
Supplementary information - Supplementary data are available at Bioinformatics online.
© The Author(s) 2017. Published by Oxford University Press.
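
The distinguishing move of DREAM-family samplers is the differential-evolution proposal: each chain jumps along the difference of two other chains, so the proposal scale adapts to the shape of the posterior. The sketch below shows only that core idea; it is not PyDREAM's API (see the repository for that) and omits DREAM(ZS)'s sampling from an archive of past states, crossover probabilities, and convergence diagnostics.

```python
import numpy as np

def de_mc_sample(log_post, n_chains=5, n_dim=2, n_iter=5000, seed=0):
    """Minimal differential-evolution MCMC: each chain proposes a jump along
    the difference of two other randomly chosen chains, accepted by a
    Metropolis rule. This is the core move of DREAM-family samplers,
    heavily simplified."""
    rng = np.random.default_rng(seed)
    gamma = 2.38 / np.sqrt(2 * n_dim)          # standard DE-MC step scale
    x = rng.normal(size=(n_chains, n_dim))     # initial chain states
    logp = np.array([log_post(xi) for xi in x])
    samples = []
    for _ in range(n_iter):
        for i in range(n_chains):
            r1, r2 = rng.choice([j for j in range(n_chains) if j != i],
                                size=2, replace=False)
            proposal = (x[i] + gamma * (x[r1] - x[r2])
                        + rng.normal(scale=1e-6, size=n_dim))  # small jitter
            logp_prop = log_post(proposal)
            if np.log(rng.uniform()) < logp_prop - logp[i]:    # Metropolis accept
                x[i], logp[i] = proposal, logp_prop
        samples.append(x.copy())
    return np.array(samples)

# Usage: sample a standard bivariate Gaussian posterior.
chains = de_mc_sample(lambda t: -0.5 * np.dot(t, t))
print(chains.shape)
```
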
0 Communities · 1 Member · 0 Resources · 8 MeSH Terms
Optimal design of perturbations for individual two-compartment pharmacokinetic analysis.
Shotwell MS, Zhou M, Fissell WH
(2016) J Biopharm Stat 26: 803-15
MeSH Terms: Administration, Intravenous, Anti-Bacterial Agents, Humans, Models, Statistical, Monte Carlo Method, Renal Dialysis
Added February 22, 2016
We consider the optimal design of pharmacokinetic studies in patients that receive intermittent hemodialysis and intravenous antibiotic. Hemodialysis perturbs the pharmacokinetic system, providing additional opportunity for study. Designs that allocate measurements to occur exclusively during hemodialysis are shown to be viable alternatives to conventional designs, where all measurements occur outside of hemodialysis. Furthermore, hybrid designs with both conventional and intradialytic measurements have nearly double the efficiency of conventional designs. Convex optimal design and Monte Carlo techniques were used to simultaneously optimize hemodialysis event characteristics and sampling times, accounting for population pharmacokinetic heterogeneity. We also present several related methodological innovations.
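
The backbone of such optimal-design calculations is a Fisher-information criterion evaluated over candidate sampling schedules. A minimal sketch of D-optimal scoring for a bi-exponential two-compartment model follows; the parameter values are hypothetical, measurement variance is taken as constant, and the hemodialysis perturbation and population heterogeneity handled in the paper are omitted:

```python
import numpy as np

def conc(t, theta):
    """Two-compartment IV-bolus concentration in bi-exponential form:
    C(t) = A*exp(-a*t) + B*exp(-b*t), with theta = (A, a, B, b)."""
    A, a, B, b = theta
    return A * np.exp(-a * t) + B * np.exp(-b * t)

def d_criterion(times, theta, h=1e-5):
    """log-determinant of the Fisher information X^T X, where X holds
    finite-difference sensitivities dC/dtheta at the sampling times
    (constant measurement variance assumed for simplicity)."""
    times = np.asarray(times, dtype=float)
    X = np.empty((times.size, len(theta)))
    for j in range(len(theta)):
        tp = np.array(theta, dtype=float)
        tm = np.array(theta, dtype=float)
        tp[j] += h
        tm[j] -= h
        X[:, j] = (conc(times, tp) - conc(times, tm)) / (2 * h)
    sign, logdet = np.linalg.slogdet(X.T @ X)
    return logdet if sign > 0 else -np.inf

# Compare two candidate designs under hypothetical parameter values;
# the larger criterion identifies the more informative schedule.
theta = (10.0, 1.5, 4.0, 0.15)
print(d_criterion([0.25, 0.5, 1, 2, 8], theta))   # early-heavy design
print(d_criterion([1, 2, 4, 8, 12], theta))       # late-heavy design
```
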
0 Communities · 1 Member · 0 Resources · 6 MeSH Terms
Development and validation of a GEANT4 radiation transport code for CT dosimetry.
Carver DE, Kost SD, Fernald MJ, Lewis KG, Fraser ND, Pickens DR, Price RR, Stabin MG
(2015) Health Phys 108: 419-28
MeSH Terms: Child, Computer Simulation, Humans, Monte Carlo Method, Phantoms, Imaging, Photons, Polymethyl Methacrylate, Radiation Dosage, Radiation Monitoring, Spectrometry, Gamma, Tomography, X-Ray Computed
Added October 18, 2016
The authors have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate their simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air, with a standard 16-cm acrylic head phantom, and with a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of the Monte Carlo simulations. Simulated and measured CTDI values agreed to within 6% of each other, on average.
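
The CTDI quantities being compared follow the standard definitions, which the abstract does not restate; they are simple enough to put in code. The numbers below are hypothetical and for illustration only:

```python
def ctdi100(dose_profile_integral_mGy_cm, beam_width_cm):
    """CTDI100: integral of the single-rotation dose profile over the 100-mm
    pencil-chamber length, divided by the nominal beam width (standard
    definition)."""
    return dose_profile_integral_mGy_cm / beam_width_cm

def ctdi_w(ctdi_center, ctdi_periphery):
    """Weighted CTDI combining center and periphery phantom measurements."""
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

def percent_difference(simulated, measured):
    """Agreement metric of the kind quoted above (6% on average)."""
    return 100.0 * abs(simulated - measured) / measured

# Hypothetical chamber readings (mGy) at center and periphery.
measured = ctdi_w(10.0, 12.0)
simulated = ctdi_w(10.5, 12.8)
print(percent_difference(simulated, measured))
```
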
0 Communities · 1 Member · 0 Resources · 11 MeSH Terms
Simulation study comparing high-purity germanium and cadmium zinc telluride detectors for breast imaging.
Campbell DL, Peterson TE
(2014) Phys Med Biol 59: 7059-79
MeSH Terms: Breast, Cadmium, Computer Simulation, Female, Germanium, Humans, Image Processing, Computer-Assisted, Monte Carlo Method, Phantoms, Imaging, Photons, Positron-Emission Tomography, Radionuclide Imaging, Signal-To-Noise Ratio, Tellurium, Zinc
Added February 16, 2015
We conducted simulations to compare the potential imaging performance for breast cancer detection with High-Purity Germanium (HPGe) and Cadmium Zinc Telluride (CZT) systems with 1% and 3.8% energy resolution at 140 keV, respectively. Using the Monte Carlo N-Particle (MCNP5) simulation package, we modelled both 5 mm-thick CZT and 10 mm-thick HPGe detectors with the same parallel-hole collimator for the imaging of a breast/torso phantom. Simulated energy spectra were generated, and planar images were created for various energy windows around the 140 keV photopeak. Relative sensitivity, scatter fraction, and torso fraction were calculated, along with tumour contrast and signal-to-noise ratio (SNR). Simulations showed that utilizing a ±1.25% energy window with an HPGe system better suppressed torso background and small-angle scattered photons than a comparable CZT system using a -5%/+10% energy window. Both systems provided statistically similar contrast and SNR, with HPGe providing higher relative sensitivity. Lowering the counts of HPGe images to match the CZT count density still yielded equivalent contrast between HPGe and CZT. Thus, an HPGe system may provide equivalent breast imaging capability at lower injected radioactivity levels when acquiring for equal imaging time.
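
Contrast and SNR in planar imaging are typically computed from tumour- and background-ROI counts. The definitions below are one standard choice (the abstract does not spell out the paper's exact ROI conventions), with hypothetical counts:

```python
import math

def contrast(tumour_counts, background_counts):
    """Simple planar contrast: excess counts in the tumour ROI relative to
    the local background (one standard definition)."""
    return (tumour_counts - background_counts) / background_counts

def snr(tumour_counts, background_counts):
    """Signal-to-noise ratio under Poisson counting statistics."""
    return (tumour_counts - background_counts) / math.sqrt(background_counts)

# Hypothetical ROI counts: contrast is count-rate independent, while SNR
# scales with the square root of the collected counts, which is why equal
# contrast can survive a reduction to the lower CZT count density.
print(contrast(1300, 1000), snr(1300, 1000))
```
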
0 Communities · 1 Member · 0 Resources · 15 MeSH Terms
Hierarchical performance estimation in the statistical label fusion framework.
Asman AJ, Landman BA
(2014) Med Image Anal 18: 1070-81
MeSH Terms: Adolescent, Adult, Aged, Aged, 80 and over, Algorithms, Brain Mapping, Child, Humans, Male, Middle Aged, Models, Statistical, Monte Carlo Method, Radiographic Image Interpretation, Computer-Assisted, Tomography, X-Ray Computed
Added February 13, 2015
Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally, fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. The proposed approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, the primary contributions of this manuscript are: (1) we provide a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) confusion matrices for each rater, (2) we highlight the amenability of the proposed hierarchical formulation to many of the state-of-the-art advancements to the statistical fusion framework, and (3) we demonstrate statistically significant improvement on both simulated and empirical data. Specifically, both theoretically and empirically, we show that the proposed hierarchical performance model provides substantial and significant accuracy benefits when applied to two disparate multi-atlas segmentation tasks: (1) 133-label whole-brain anatomy on structural MR, and (2) orbital anatomy on CT.
Copyright © 2014 Elsevier B.V. All rights reserved.
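
The flat (non-hierarchical) core of statistical fusion, on which the paper's hierarchical model builds, is a voxelwise Bayes update using each rater's confusion matrix. A minimal sketch, with hypothetical two-rater, three-label inputs:

```python
import numpy as np

def fuse_voxel(observations, confusion, prior):
    """Posterior over the true label at one voxel, given each rater's observed
    label and a per-rater confusion matrix theta[r][true, observed]. This is
    the flat core of statistical fusion; the paper generalizes theta to a
    hierarchy of label groups estimated simultaneously."""
    post = np.array(prior, dtype=float)
    for r, obs in enumerate(observations):
        post *= confusion[r][:, obs]   # likelihood of rater r's vote per true label
    return post / post.sum()

# Two raters, three labels; rater 1 is more reliable than rater 2.
theta1 = np.array([[0.9, 0.05, 0.05], [0.05, 0.9, 0.05], [0.05, 0.05, 0.9]])
theta2 = np.array([[0.6, 0.2, 0.2], [0.2, 0.6, 0.2], [0.2, 0.2, 0.6]])
print(fuse_voxel([1, 2], [theta1, theta2], prior=[1/3, 1/3, 1/3]))
```
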
0 Communities · 1 Member · 0 Resources · 14 MeSH Terms
Exact hybrid particle/population simulation of rule-based models of biochemical systems.
Hogg JS, Harris LA, Stover LJ, Nair NS, Faeder JR
(2014) PLoS Comput Biol 10: e1003544
MeSH Terms: Models, Biological, Models, Chemical, Monte Carlo Method
Added June 8, 2016
Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error-prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This "network-free" approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of "partial network expansion" into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and the resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory savings can be achieved using the new approach, and a monetary cost analysis provides a practical measure of its utility.
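
For contrast with the network-free approach, Gillespie's direct method on a fully enumerated network can be written compactly. A minimal sketch on a hypothetical two-reaction toy network:

```python
import random

def gillespie(stoich, propensity_fns, x0, t_end, seed=1):
    """Gillespie's direct method: draw the time to the next reaction from an
    exponential with rate equal to the total propensity, then pick which
    reaction fired in proportion to its propensity."""
    random.seed(seed)
    x, t = list(x0), 0.0
    trajectory = [(0.0, list(x0))]
    while t < t_end:
        props = [f(x) for f in propensity_fns]
        total = sum(props)
        if total == 0.0:
            break                              # no reaction can fire
        t += random.expovariate(total)
        r = random.uniform(0.0, total)         # roulette-wheel selection
        k = 0
        while k < len(props) - 1 and r > props[k]:
            r -= props[k]
            k += 1
        x = [xi + s for xi, s in zip(x, stoich[k])]
        trajectory.append((t, list(x)))
    return trajectory

# Toy network: A -> B (rate 1.0 per A), B -> A (rate 0.5 per B).
traj = gillespie(stoich=[(-1, +1), (+1, -1)],
                 propensity_fns=[lambda x: 1.0 * x[0], lambda x: 0.5 * x[1]],
                 x0=(100, 0), t_end=5.0)
print(traj[-1])
```

The cost of this method grows with the size of the enumerated network, which is exactly the limitation the hybrid particle/population approach is designed to sidestep.
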
0 Communities · 1 Member · 0 Resources · 3 MeSH Terms
Evaluation of statistical inference on empirical resting state fMRI.
Yang X, Kang H, Newton AT, Landman BA
(2014) IEEE Trans Biomed Eng 61: 1091-9
MeSH Terms: Brain Mapping, Computer Simulation, Humans, Magnetic Resonance Imaging, Models, Statistical, Monte Carlo Method, Regression Analysis, Reproducibility of Results, Signal Processing, Computer-Assisted, Signal-To-Noise Ratio
Added March 26, 2014
Modern statistical inference techniques may be able to improve the sensitivity and specificity of resting state functional magnetic resonance imaging (rs-fMRI) connectivity analysis through more realistic assumptions. In simulation, the advantages of such methods are readily demonstrable. However, quantitative empirical validation remains elusive in vivo as the true connectivity patterns are unknown and noise distributions are challenging to characterize, especially in ultra-high field (e.g., 7T fMRI). Though the physiological characteristics of the fMRI signal are difficult to replicate in controlled phantom studies, it is critical that the performance of statistical techniques be evaluated. The SIMulation EXtrapolation (SIMEX) method has enabled estimation of bias with asymptotically consistent estimators on empirical finite sample data by adding simulated noise. To avoid the requirement of accurate estimation of noise structure, the proposed quantitative evaluation approach leverages the theoretical core of SIMEX to study the properties of inference methods in the face of diminishing data (in contrast to increasing noise). The performances of ordinary and robust inference methods in simulation and empirical rs-fMRI are compared using the proposed quantitative evaluation approach. This study provides a simple but powerful method for comparing a proxy for inference accuracy using empirical data.
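
The evaluation idea, recomputing an estimator on progressively smaller random subsamples and examining the trend toward the full-data limit, can be sketched as follows. This is a simplified illustration of the diminishing-data notion, not the paper's exact procedure; the example statistic and sizes are hypothetical:

```python
import numpy as np

def diminishing_data_curve(indices, estimator, fractions=(1.0, 0.8, 0.6, 0.4),
                           n_rep=200, seed=0):
    """Recompute an estimator on random subsamples of decreasing size and
    report (fraction, mean, std) per level, in the spirit of the SIMEX-style
    evaluation described above."""
    rng = np.random.default_rng(seed)
    n = len(indices)
    curve = []
    for frac in fractions:
        m = max(2, int(round(frac * n)))
        vals = [estimator(rng.choice(indices, size=m, replace=False))
                for _ in range(n_rep)]
        curve.append((frac, float(np.mean(vals)), float(np.std(vals))))
    return curve

# Example: how a correlation estimate behaves as the data diminish.
rng = np.random.default_rng(1)
x = rng.normal(size=(200, 2))
x[:, 1] += 0.5 * x[:, 0]                      # induce a true correlation
est = lambda idx: np.corrcoef(x[idx, 0], x[idx, 1])[0, 1]
print(diminishing_data_curve(np.arange(200), est))
```
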
0 Communities · 2 Members · 0 Resources · 10 MeSH Terms
Response times from ensembles of accumulators.
Zandbelt B, Purcell BA, Palmeri TJ, Logan GD, Schall JD
(2014) Proc Natl Acad Sci U S A 111: 2848-53
MeSH Terms: Computational Biology, Computer Simulation, Humans, Models, Neurological, Models, Psychological, Monte Carlo Method, Neurons, Neurophysiology, Reaction Time, Stochastic Processes
Added May 27, 2014
Decision-making is explained by psychologists through stochastic accumulator models and by neurophysiologists through the activity of neurons believed to instantiate these models. We investigated an overlooked scaling problem: How does a response time (RT) that can be explained by a single model accumulator arise from numerous, redundant accumulator neurons, each of which individually appears to explain the variability of RT? We explored this scaling problem by developing a unique ensemble model of RT, called e pluribus unum, which embodies the well-known dictum "out of many, one." We used the e pluribus unum model to analyze the RTs produced by ensembles of redundant, idiosyncratic stochastic accumulators under various termination mechanisms and accumulation rate correlations in computer simulations of ensembles of varying size. We found that predicted RT distributions are largely invariant to ensemble size if the accumulators share at least modestly correlated accumulation rates and RT is not governed by the most extreme accumulators. Under these regimes, the termination times of individual accumulators were predictive of ensemble RT. We also found that the threshold measured on individual accumulators, corresponding to the firing rate of neurons measured at RT, can be invariant with RT but is equivalent to the specified model threshold only when the rate correlation is very high.
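
An ensemble of racing accumulators with correlated rates and a quantile-based termination rule can be simulated in a few lines. The sketch below is one possible reading of the setup, with hypothetical parameter values; the paper's e pluribus unum model explores a range of termination mechanisms, rate correlations, and ensemble sizes:

```python
import numpy as np

def ensemble_rt(n_accumulators=100, rate_corr=0.5, threshold=50.0,
                k_quantile=0.5, dt=1e-3, mean_rate=60.0, noise_sd=8.0, seed=0):
    """Simulate one trial of an ensemble of racing stochastic accumulators
    whose drift rates share a common component (correlation rate_corr),
    reading out RT when a fraction k_quantile of them have crossed threshold
    (one of several possible termination rules)."""
    rng = np.random.default_rng(seed)
    shared = rng.normal()                      # trial-wide rate fluctuation
    rates = mean_rate * (1.0 + 0.1 * (np.sqrt(rate_corr) * shared
                                      + np.sqrt(1.0 - rate_corr)
                                      * rng.normal(size=n_accumulators)))
    x = np.zeros(n_accumulators)               # accumulator states
    t = 0.0
    while np.mean(x >= threshold) < k_quantile:
        x += rates * dt + noise_sd * np.sqrt(dt) * rng.normal(size=n_accumulators)
        t += dt
    return t

# One trial's ensemble RT in seconds under the hypothetical parameters.
print(ensemble_rt())
```
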
0 Communities · 2 Members · 0 Resources · 10 MeSH Terms