Transparency increases the credibility and relevance of research

Transparency increases the credibility and relevance of research David Mellor Center for Open Science https://cos.io/

Improving the scientific ecosystem: evidence to encourage change, incentives to embrace change, technology to enable change.

Infrastructure Metascience Community

http://cos.io/top

Transparency and Openness Promotion (TOP) Guidelines: eight policy standards for increasing the transparency and reproducibility of published research. Agnostic to discipline, low barrier to entry, modular.

Three Tiers: Disclose, Require, Verify. Eight Standards: Data citation; Materials transparency; Data transparency; Code transparency; Design transparency; Study preregistration; Analysis preregistration; Replication.

Example Implementations

Data, Analytic Methods (Code), and Research Materials Transparency Level 1: Authors must disclose whether and where data, code, and materials are available.

Data, Analytic Methods (Code), and Research Materials Transparency Level 2: Authors must share (exceptions permitted)

Data, Analytic Methods (Code), and Research Materials Transparency Level 3: Journal or third party will verify that the data can be used to reproduce the findings presented in a paper.
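A Level 3 verification can be as simple as recomputing a reported statistic from the deposited data and checking it against the paper. A minimal sketch, assuming a single reported mean; the dataset and reported value here are invented for illustration:

```python
# Minimal sketch of a Level 3 style check: recompute a statistic
# reported in a paper from the authors' shared data and compare.
# The dataset and the reported value are invented for illustration.
reported_mean = 4.65                   # value stated in the paper
shared_data = [4.2, 5.1, 4.8, 4.5]    # deposited dataset

recomputed = sum(shared_data) / len(shared_data)

# Verification passes if the recomputed value matches within tolerance.
reproduces = abs(recomputed - reported_mean) < 0.01
```

A real verification would rerun the authors' full analysis code against the archive, but the pass/fail criterion is the same: the shared materials must regenerate the published numbers.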

Design and Analysis Transparency Society or journal defines the relevant reporting standards that are appropriate for their discipline. http://www.cell.com/star-methods http://resource-cms.springer.com/springer-cms/rest/v1/content/7117202/data/v2/Minimum+standards+of+reporting+checklist http://www.equator-network.org/ https://www.nature.com/authors/policies/ReportingSummary.pdf

Preregistration Preregistration of studies is a means of making research more discoverable even if it does not get published. Preregistration of Analysis Plans clarifies the distinction between confirmatory and exploratory research.

Exploratory research: finds unexpected trends, pushes knowledge into new areas, and results in a testable hypothesis.

Confirmatory research: puts a hypothesis to the test, does not allow the data to influence the hypothesis, and holds results to the highest standard of rigor.

Presenting exploratory results as confirmatory increases publishability at the expense of credibility.
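The confirmatory discipline can be made concrete in code: the hypothesis, test, and threshold are frozen before any data are seen, and only the pre-specified test is run. A hypothetical sketch (the plan fields and data are invented, and the test is a hand-rolled Welch t statistic):

```python
# Sketch: in confirmatory research the analysis plan is fixed before
# data collection; the data cannot alter the hypothesis or the test.
# Everything below (plan fields, data values) is hypothetical.
from statistics import mean, variance
import math

# Frozen before any data are seen (the preregistered plan).
PLAN = {
    "hypothesis": "group A scores higher than group B",
    "test": "welch_t",
    "alpha": 0.05,
}

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    return (mean(a) - mean(b)) / math.sqrt(
        variance(a) / len(a) + variance(b) / len(b)
    )

# Data arrive only after the plan is locked.
group_a = [5.1, 4.8, 5.5, 5.0]
group_b = [4.2, 4.0, 4.5, 4.1]

t = welch_t(group_a, group_b)   # run exactly the pre-specified test
```

Exploratory follow-up analyses remain legitimate; the point is that they are reported as exploratory rather than silently swapped into the confirmatory slot.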

Direct replications: The gold standard for scientific evidence “Nullius in verba” ~ “Take nobody's word for it”

Preregistration or Replication Level 1: Disclose preregistration, encourage replication http://www.psychonomic.org/?page=journals

Preregistration or Replication Level 2: Results-blind peer review https://jbp.uncc.edu/other-journals-involved-in-this-joint-initiative/

Preregistration or Replication Level 3: Registered Reports

Preregistration or Replication Level 3: Registered Reports. Stage 1 review asks: Are the hypotheses well founded and worth addressing? Are the methods and proposed analyses able to address the hypotheses? Have the authors included sufficient positive controls to confirm that the study will provide a fair test?

Preregistration or Replication Level 3: Registered Reports. Stage 2 review asks: Did the authors follow the approved protocol? Did the positive controls succeed? Are the conclusions justified by the data?

Nearly 3,000 Journal, Society, Publisher, and Funder Signatories

How can publishers support this effort? Becoming a signatory provides a common framework under which a diverse group of disciplines, all under one roof, can consider different ways to adopt TOP. It is not necessary to mandate a blanket policy across different disciplines.

What incentives are needed for researchers to more openly share? How do we enable sharing without placing more burden on researchers?

Mandates. Tacking extra steps onto a process. Appealing to everyone's good nature.

Align scientific values (open sharing, value of evidence) with scientific rewards (getting published, funded, and hired). Make tools that make researchers’ lives easier, while making open science as easy as pressing a button.

Collaboration, documentation, archiving: the OSF does much of what content management systems do, but it balances what researchers care about with alignment to open-science values, either automatically or with minimal effort. Scholars should focus on scholarship, and we respect that they have a workflow; this has to be frictionless.

It is very easy to curate a project, add contributors, and share, and the OSF also keeps an activity log. To users this is either meaningless or an easy way to stay up to date, but to us it is provenance: important information about how science happens. Science is a process, and this log is valuable time-series metadata for metascience.

OpenSesame

Signals: making behaviors visible promotes adoption. Badges for Open Data, Open Materials, and Preregistration, adopted by Psychological Science (Jan 2014).

Data Availability in Psychological Science: 10x increase from 2013 to 2015 (Kidwell et al., 2016)

$1,000,000 in prizes for researchers to publish the results of their preregistered work.

https://cos.io/prereg

What a preregistration specifies (https://cos.io/prereg): research questions, data collection methods, variables, statistical tests, and outlier criteria.
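These elements can be captured as a simple structured record. An illustrative sketch only; the field names are hypothetical, not an official OSF/COS schema:

```python
# Illustrative preregistration record covering the elements above.
# Field names and example values are hypothetical, not an OSF schema.
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: the plan cannot change after creation
class Preregistration:
    research_question: str
    data_collection_method: str
    variables: tuple
    statistical_test: str
    alpha: float
    outlier_rule: str

plan = Preregistration(
    research_question="Does condition A improve recall over condition B?",
    data_collection_method="online experiment, stop at N = 200",
    variables=("condition", "recall_score"),
    statistical_test="independent-samples t-test, two-tailed",
    alpha=0.05,
    outlier_rule="exclude scores > 3 SD from the condition mean",
)
```

Making the record immutable mirrors the point of preregistration: once the plan is registered and timestamped, it cannot be quietly revised after the data come in.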

What can you do? Sign the TOP Guidelines. Implement rewards for ideal science, such as badges or Registered Reports. Spread awareness about the Preregistration Challenge.
David Mellor, david@cos.io
Find me online: https://osf.io/qthsf/, @EvoMellor
Find this presentation: https://osf.io/GUID (Take a picture!)

General Links https://cos.io/top https://cos.io/badges https://cos.io/rr

Links to reported policies http://www.psychonomic.org/?page=journals http://journals.plos.org/plosone/s/data-availability http://publications.agu.org/author-resource-center/publication-policies/data-policy/data-policy-faq/ https://ajps.org/ajps-replication-policy/ http://www.cell.com/star-methods https://www.equator-network.org/ https://www.nature.com/authors/policies/ReportingSummary.pdf?viewType=Print&viewClass=Print http://resource-cms.springer.com/springer-cms/rest/v1/content/7117202/data/v2/Minimum+standards+of+reporting+checklist https://jbp.uncc.edu/other-journals-involved-in-this-joint-initiative/ http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1002456