2020, Nature Ecology & Evolution
Synthesizing evidence is an essential part of scientific progress, but it is often done in a slow and uncoordinated manner, sometimes producing misleading conclusions. Here, we propose the idea of an 'open synthesis community' to resolve this pressing issue.
Environmental Evidence, 2018
The Open Science movement can be broadly summarised as aiming to promote integrity, repeatability and transparency across all aspects of research, from data collection to publication. Systematic reviews and systematic maps aim to provide a reliable synthesis of the evidence on a particular topic, using methods that seek to maximise repeatability and comprehensiveness whilst minimising subjectivity and bias. The central tenet of repeatability is operationalised by transparently reporting methodological activities in detail, such that all actions can be replicated and verified. To date, evidence synthesis has only partially embraced Open Science, typically striving for Open Methodology and Open Access, and occasionally providing sufficient information for some published reviews to be considered Open Data. Evidence synthesis communities need to better embrace Open Science not only to improve equity of access to knowledge and increase efficiency, but also to increase the reliability, trust and reuse of information collected and synthesised within a review: concepts fundamental to systematic reviews and maps. All aspects of Open Science should be embraced: Open Methodology, Open Data, Open Source and Open Access. In doing so, evidence synthesis can be made more equal, more efficient and more trustworthy. I provide concrete recommendations on how CEE and others can fully embrace Open Synthesis.
Environmental Evidence, 2020
Evidence synthesis is a vital part of evidence-informed decision-making, but rapid growth in the volume of research evidence over recent decades has made efficient evidence synthesis increasingly challenging. As the appreciation of, and need for, timely and rigorous evidence synthesis continues to grow, so too will the need for tools and frameworks to review expanding evidence bases in an efficient and time-sensitive manner. Efforts to future-proof evidence synthesis through the development of new evidence synthesis technology (ESTech) have so far been isolated among interested individuals or groups, with no concerted effort to collaborate or build communities of practice in technology production. We established the Evidence Synthesis Hackathon to stimulate collaboration and the production of Free and Open Source Software and frameworks to support evidence synthesis. Here, we introduce a special series of papers on ESTech and invite the readers of Environmental Evidence to submit manuscripts introducing and validating novel tools and frameworks. We hope this collection will help to consolidate ESTech development efforts, and we encourage readers to join the ESTech revolution. To future-proof evidence synthesis against the evidence avalanche, we must support community enthusiasm for ESTech, reduce redundancy in tool design, collaborate and share capacity in tool production, and reduce inequalities in software accessibility.
This report summarises a workshop held at the Stockholm Environment Institute (SEI), the Global Water Partnership (GWP) and the Stockholm International Water Institute (SIWI) in Stockholm from 23 to 25 April 2018. The event brought together 25 programmers and coders from across the globe in an attempt to collectively solve some of the biggest issues facing evidence synthesis using technology. It was generously funded by Mistra EviEM (www.eviem.se/en) and the Fenner School's Environment & Society Synthesis Program (Australian National University). Further details of the Evidence Synthesis Hackathon can be found at www.evidencesynthesishackathon.com.
BMJ Evidence-Based Medicine
Contributors: ZM conceived the article, wrote the first draft, incorporated feedback and finalised the manuscript. DP, THB, JS, CS, EA, HJS, BC, HK, RAM, CG, AB, ACT and AP contributed to discussions and initial ideas for the article, reviewed drafts, provided feedback and approved the manuscript. Competing interests: ZM is employed by JBI, an evidence-based healthcare research and development organisation within the University of Adelaide, and is supported by NHMRC Investigator Grant 1195676. ACT is funded by the Tier 2 Canada Research Chair in Knowledge Synthesis. Patient consent for publication: not applicable.
Environmental Evidence, 2021
One of the most important steps in the process of conducting a systematic review or map is data extraction and the production of a database of coding, metadata and study data. There are many ways to structure these data, but to date, no guidelines or standards have been produced for the evidence synthesis community to support their production. Furthermore, there is little adoption of easily machine-readable, readily reusable and adaptable databases: these databases would be easier to translate into different formats by review authors, for example for tabulation, visualisation and analysis, and also by readers of the review/map. As a result, it is common for systematic review and map authors to produce bespoke, complex data structures that, although typically provided digitally, require considerable efforts to understand, verify and reuse. Here, we report on an analysis of systematic reviews and maps published by the Collaboration for Environmental Evidence, and discuss major issues that hamper machine readability and data reuse or verification. We highlight different justifications for the alternative data formats found: condensed databases; long databases; and wide databases. We describe these challenges in the context of data science principles that can support curation and publication of machine-readable, Open Data. We then go on to make recommendations to review and map authors on how to plan and structure their data, and we provide a suite of novel R-based functions to support efficient and reliable translation of databases between formats that are useful for presentation (condensed, human readable tables), filtering and visualisation (wide databases), and analysis (long databases). We hope that our recommendations for adoption of standard practices in database formatting, and the tools necessary to rapidly move between formats will provide a step-change in transparency and replicability of Open Data in evidence synthesis.
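For readers unfamiliar with the wide/long distinction discussed above, the sketch below illustrates it with a toy, invented systematic-map database and the widely used tidyr package; it is not the authors' R suite, and the column names (study_id, outcome_1, …) and records are hypothetical.

```r
# Toy systematic-map database in "wide" format: one row per study,
# one column per coded outcome slot. All names and values are invented.
library(tidyr)

wide_db <- data.frame(
  study_id  = c("Smith2015", "Lee2018"),
  country   = c("Sweden", "Australia"),
  outcome_1 = c("biodiversity", "soil carbon"),
  outcome_2 = c("yield", NA)
)

# "Long" format: one row per study-outcome pair, convenient for analysis.
long_db <- pivot_longer(
  wide_db,
  cols = c(outcome_1, outcome_2),
  names_to = "outcome_slot",
  values_to = "outcome",
  values_drop_na = TRUE
)

# Back to "wide" for filtering, visualisation and presentation.
wide_again <- pivot_wider(
  long_db,
  names_from = outcome_slot,
  values_from = outcome
)
```

Keeping the long form as the canonical, machine-readable version and generating wide or condensed views from it is one way to make such databases easier to verify and reuse.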
The use of systematic reviews to collate and summarise evidence is considered one of the great intellectual achievements of recent times. Evidence synthesis has had far-reaching impacts and has helped to inform decision-making in multiple fields. However, it also faces three problems: (i) reviews cannot be relevant to everyone, hampering decision-making, (ii) multiple conflicting reviews are often produced, and (iii) reviews can become out-of-date rapidly. Here we present a solution to these problems that we term 'dynamic meta-analysis'. With a tool we have developed, users can filter and weight evidence depending on their interests to produce analyses that are highly relevant to their needs.
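As a rough, hypothetical sketch of what 'filter and weight' means in practice (not the authors' tool or method), the snippet below filters an invented evidence base by a user-chosen attribute and recomputes a fixed-effect, inverse-variance pooled estimate; all column names and values are made up.

```r
# Toy evidence base: one effect size (yi) and sampling variance (vi) per
# study, plus metadata a user might filter on. All values are invented.
evidence <- data.frame(
  study  = paste0("S", 1:6),
  yi     = c(0.35, 0.10, 0.52, -0.05, 0.21, 0.40),
  vi     = c(0.02, 0.05, 0.04, 0.03, 0.02, 0.06),
  region = c("EU", "EU", "Asia", "Asia", "EU", "Asia")
)

# "Dynamic" step: the user keeps only the studies relevant to them...
subset_eu <- evidence[evidence$region == "EU", ]

# ...and the synthesis is re-run on that subset. A fixed-effect,
# inverse-variance weighted mean stands in for the full analysis here.
w         <- 1 / subset_eu$vi
pooled    <- sum(w * subset_eu$yi) / sum(w)
pooled_se <- sqrt(1 / sum(w))
c(estimate = pooled, se = pooled_se)
```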
Environmental Evidence, 2019
Systematic mapping assesses the nature of an evidence base, answering how much evidence exists on a particular topic. Perhaps the most useful outputs of a systematic map are an interactive database of studies and their meta-data, along with visualisations of this database. Despite the rapid increase in systematic mapping as an evidence synthesis method, there is currently a lack of Open Source software for producing interactive visualisations of systematic map databases. In April 2018, as attendees at and coordinators of the first ever Evidence Synthesis Hackathon in Stockholm, we decided to address this issue by developing an R-based tool called EviAtlas, an Open Access (i.e. free to use) and Open Source (i.e. software code is freely accessible and reproducible) tool for producing interactive, attractive tables and figures that summarise the evidence base. Here, we present our tool which includes the ability to generate vital visualisations for systematic maps and reviews as follows: a complete data table; a spatially explicit geographical information system (Evidence Atlas); Heat Maps that cross-tabulate two or more variables and display the number of studies belonging to multiple categories; and standard descriptive plots showing the nature of the evidence base, for example the number of studies published per year or number of studies per country. We believe that EviAtlas will provide a stimulus for the development of other exciting tools to facilitate evidence synthesis.
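To illustrate the heat-map idea described above (cross-tabulating two coded variables and displaying study counts), here is a minimal sketch using base R and ggplot2. It is not EviAtlas code, and the intervention and outcome categories are hypothetical.

```r
# Toy systematic-map coding: each row is one study tagged with an
# intervention and an outcome category. Categories are invented.
library(ggplot2)

coding <- data.frame(
  intervention = c("fencing", "fencing", "planting",
                   "planting", "planting", "grazing"),
  outcome      = c("richness", "abundance", "richness",
                   "richness", "cover", "cover")
)

# Cross-tabulate the two variables and plot study counts as a heat map.
counts <- as.data.frame(table(coding$intervention, coding$outcome))
names(counts) <- c("intervention", "outcome", "n_studies")

ggplot(counts, aes(x = intervention, y = outcome, fill = n_studies)) +
  geom_tile() +
  geom_text(aes(label = n_studies)) +
  labs(x = "Intervention", y = "Outcome", fill = "No. of studies")
```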
The Handbook of Research Synthesis and Meta-Analysis, 2019
Evidence-based environmental management is being hindered by difficulties in locating, interpreting and synthesising relevant information among vast scientific outputs. But software developments that allow enhanced collation and sharing of data will help.
Proceedings of the DESIGN 2018 15th International Design Conference, 2018
Systematic Reviews
Medical Decision Making, 2013
Perspectives on Psychological Science, 2013
Clinical Pharmacology & Therapeutics
Journal of Clinical Epidemiology, 2020
Nucleic Acids Research, 2021
Evaluation & the Health Professions, 2002
Methods in Molecular Biology, 2016
Human Behavior and Emerging Technologies, 2024
Healthcare Policy | Politiques de Santé, 2006
Metaphilosophy 46(1): 65-83, 2015
Database : the journal of biological databases and curation, 2014
The Lancet Planetary Health, 2019
Environmental Science & Policy
The New England journal of medicine, 2016