2007, Toxicological Sciences
Advances in computer sciences and hardware, combined with equally significant developments in molecular biology and chemistry, are providing toxicology with a powerful new toolbox. This toolbox of computational models promises to increase the efficiency and the effectiveness by which the hazards and risks of environmental chemicals are determined. Computational toxicology focuses on applying these tools across many scales, including vastly increasing the numbers of chemicals and the types of biological interactions that can be evaluated. In addition, knowledge of toxicity pathways gathered within the toolbox will be directly applicable to the study of biological responses across a range of dose levels, including those more likely to be representative of exposures to the human population. Progress in this field will facilitate the transformative shift called for in the recent report on toxicology in the 21st century by the National Research Council. This review surveys the state of the art in many areas of computational toxicology and points to several hurdles that will be important to overcome as the field moves forward. Proof-of-concept studies need to clearly demonstrate the additional predictive power gained from these tools. More researchers need to become comfortable working with both the data-generating tools and the computational modeling capabilities, and regulatory authorities must show a willingness to embrace new approaches as they gain scientific acceptance. The next few years should witness the early fruits of these efforts, but as the National Research Council indicates, the paradigm shift will take a long-term investment and commitment to reach its full potential.
Chemical Research in Toxicology, 2021
Computational toxicology concerns the use of computational tools to support integrative approaches to toxicological research and chemical safety assessments via predictive modeling, analyses of complex, multifaceted data sets, and extrapolation and translation among evidence streams, particularly new approach methodologies that rely on alternatives to animal testing (Figure 1).
Reproductive Toxicology, 2005
Reproductive toxicology (Elmsford, N.Y.)
Human & Experimental Toxicology, 2015
Predictive toxicology plays a critical role in reducing the failure rate of new drugs in pharmaceutical research and development. Despite recent gains in our understanding of drug-induced toxicity, however, it is urgent that the utility and limitations of our current predictive tools be determined in order to identify gaps in our understanding of mechanistic and chemical toxicology. Using recently published computational regression analyses of in vitro and in vivo toxicology data, it will be demonstrated that significant gaps remain in early safety screening paradigms. More strategic analyses of these data sets will allow for a better understanding of their domain of applicability and help identify those compounds that cause significant in vivo toxicity but which are currently mis-predicted by in silico and in vitro models. These 'outliers' and falsely predicted compounds are metaphorical lighthouses that shine light on existing toxicological knowledge gaps, and it is essential that these compounds are investigated if attrition is to be reduced significantly in the future. As such, the modern computational toxicologist is more productively engaged in understanding these gaps and driving investigative toxicology towards addressing them.
Toxicol. Res., 2015
Computational approaches offer the attraction of being both fast and cheap to run, able to process thousands of chemical structures in a few minutes. As with all new technology, there is a tendency for these approaches to be hyped, and claims of reliability and performance may be exaggerated. So just how good are these computational methods?
Toxicological Sciences
The U.S. Environmental Protection Agency (EPA) is faced with the challenge of efficiently and credibly evaluating chemical safety often with limited or no available toxicity data. The expanding number of chemicals found in commerce and the environment, coupled with time and resource requirements for traditional toxicity testing and exposure characterization, continue to underscore the need for new approaches. In 2005, EPA charted a new course to address this challenge by embracing computational toxicology (CompTox) and investing in the technologies and capabilities to push the field forward. The return on this investment has been demonstrated through results and applications across a range of human and environmental health problems, as well as initial application to regulatory decision-making within programs such as the EPA’s Endocrine Disruptor Screening Program. The CompTox initiative at EPA is more than a decade old. This manuscript presents a blueprint to guide the strategic and...
Applied In Vitro Toxicology, 2015
With the passing of Dr. Gilman D. Veith on August 18, 2013, the research community lost one of its true visionaries in the development and implementation of alternative in silico and in vitro toxicology models in human health and ecological risk assessment. His career spanned more than four decades, during which he repeatedly demonstrated vision and leadership to advance alternative testing and assessment research and to guide the adoption of research accomplishments into U.S. and international chemical regulatory programs. His ability to advance toxicological and environmental exposure research, and associated quantitative structure-activity relationships (QSARs), for application in environmental regulatory decision making was achieved by a focus on establishing a transparent, mechanistic understanding of the chemistry and biology underlying alternative experimental and mathematical models. Dr. Veith was a pioneer of the application of alternative methods, especially QSAR, to industrial chemicals, which traditionally lacked the experimental data already available for pesticides and pharmaceuticals. His lifelong vision and leadership were recognized by a number of awards, including the Henry J. Heimlich, M.D., Award for Innovative Medicine in 2010. Through Dr. Veith's accomplishments, the use of alternative approaches in toxicity testing and risk assessment was implemented in the 20th century. His leadership also provided the scientific and regulatory framework for developing and applying in vitro and in silico techniques in what is now coined as 21st-century toxicology.
Journal of Exposure Science & Environmental Epidemiology, 2008
The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. The integration of modern computing with molecular biology and chemistry will allow scientists to better prioritize data, inform decision makers on chemical risk assessments and understand a chemical's progression from the environment to the target tissue within an organism and ultimately to the key steps that trigger an adverse health effect. In this paper, several of the major research activities being sponsored by Environmental Protection Agency's National Center for Computational Toxicology are highlighted. Potential links between research in computational toxicology and human exposure science are identified. As with the traditional approaches for toxicity testing and hazard assessment, exposure science is required to inform design and interpretation of high-throughput assays. In addition, common themes inherent throughout National Center for Computational Toxicology research activities are highlighted for emphasis as exposure science advances into the 21st century.
International Journal of Hygiene and Environmental Health, 2002
In its efforts to provide consultations to state and local health departments, other federal agencies, health professionals, and the public on the health effects of environmental pollutants, the Agency for Toxic Substances and Disease Registry relies on the latest advances in computational toxicology. The computational toxicology laboratory at the agency is continually engaged in developing and applying models for decision-support tools such as physiologically based pharmacokinetic (PBPK) models, benchmark dose (BMD) models, and quantitative structure-activity relationship (QSAR) models. PBPK models are suitable for connecting exposure scenarios to biological indicators such as tissue dose or end point response. The models are used by the agency to identify the significance of exposure routes in producing tissue levels of possible contaminants for people living near hazardous waste sites. Additionally, PBPK models provide a credible scientific methodology for route-to-route extrapolations of health guidance values, which are usually determined from a very specific set of experiments. Also, scientists at the computational toxicology laboratory are using PBPK models for advancing toxicology research in such areas as joint toxicity assessment and child-based toxicity assessments. With BMD modeling, all the information embedded in an experimentally determined dose-response relationship is used to estimate, with minimum extrapolations, human health guidance values for environmental substances. Scientists in the laboratory also rely on QSAR models in the many cases where consultations from the agency are reported for chemicals that lack adequate experimental documentation.
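The benchmark dose (BMD) approach described above fits a model to the full experimentally determined dose-response relationship and then solves for the dose producing a predefined extra risk over background. A minimal sketch of that workflow is shown below; the dose-response values, the choice of a Hill model, and the 10% benchmark response are illustrative assumptions, not data or settings from the agency's laboratory.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical dose-response data: fraction responding per dose group.
doses = np.array([0.0, 1.0, 3.0, 10.0, 30.0, 100.0])      # mg/kg-day
response = np.array([0.02, 0.05, 0.12, 0.30, 0.55, 0.80])  # illustrative only

def hill(d, background, vmax, kd, n):
    """Hill model: background response plus extra risk saturating at vmax."""
    return background + vmax * d**n / (kd**n + d**n)

# Fit the model to the whole dose-response curve (not just one NOAEL point).
params, _ = curve_fit(hill, doses, response,
                      p0=[0.02, 0.9, 20.0, 1.0], maxfev=10000)
background, vmax, kd, n = params

# Benchmark dose: dose at which extra risk over background reaches BMR = 10%.
bmr = 0.10
bmd = kd * (bmr / (vmax - bmr)) ** (1.0 / n)
print(f"BMD10 = {bmd:.2f} mg/kg-day")
```

In practice, regulatory BMD software also computes a lower confidence limit on the BMD (the BMDL) and compares several candidate models, but the core step is the same: fit, then invert the fitted curve at the benchmark response.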
Molecular Informatics, 2016
Journal of Chemical Information and Modeling, 2020
Journal of Applied Toxicology, 2011
Chemistry Central Journal, 2007
SpringerBriefs in Applied Sciences and Technology, 2013
Basic & Clinical Pharmacology & Toxicology, 2014
Drug Discovery Today, 2008
Environmental Health Perspectives, 2015
Toxicology Research
International Journal of Molecular Sciences, 2012
European Commission Joint Research Centre, 2010