The LEADS Database at ICPSR: Identifying Important Social Science Studies for Archiving
Presentation prepared for the 2006 Annual Meeting of IASSIST, Friday, May 26, 2006

LEADS at ICPSR
Identification: We would like to know the "universe" of social science data that have been collected.
Appraisal: Data-PASS and ICPSR would like to know how much social science data is "at risk" of being lost, or has already been lost. We would also like to know which "at risk" social science data are important enough to be archived.

Or… what have we failed to catch? How big is the "one" that got away?

What is LEADS?
LEADS is a database of records containing information about scientific studies that may have produced social science data.
LEADS contains descriptive information about the scientific studies that have been identified.
LEADS also contains information that can be used to determine the "fit" and "value" of a scientific study.
LEADS keeps a record of all human (staff) decisions that have been made about the fit and value of a scientific study.
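The record structure described above (descriptive fields, fit/value assessments, and an audit trail of staff decisions) can be sketched as a small data object. The field names and decision-log shape below are illustrative assumptions, not the actual ICPSR schema:

```python
from dataclasses import dataclass, field

@dataclass
class LeadsRecord:
    # Hypothetical sketch of a LEADS record; field names are illustrative.
    source: str              # e.g. "NSF", "NIH/CRISP", "nomination"
    award_id: str
    title: str
    abstract: str
    fit: str = ""            # how the study fits the collection scope
    value: str = ""          # appraised value of the study for archiving
    decisions: list = field(default_factory=list)  # staff decision audit trail

    def record_decision(self, staff: str, verdict: str) -> None:
        """Append a staff appraisal decision so every human judgment is kept."""
        self.decisions.append({"staff": staff, "verdict": verdict})
```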

Sources of Records in LEADS
NSF research grant awards downloaded from nsf.gov
NIH research grant awards downloaded from CRISP
Prospective searches of topical areas/journals
Researcher nominations (self or other)

NSF Grant Awards in LEADS (pre-screening)
LEADS contains 17,194 awards made by NSF.
LEADS spans 30 years of awards, through 2005.
LEADS spans 53 NSF organizations that award grants.
Of the 53 organizations, the 4 with the most records screened (each contributing 1,000+ records) were:
SES: Social and Economic Sciences
BCS: Behavioral and Cognitive Sciences
DMS: Mathematical Sciences
IOB: Integrative and Organism Biology

Total # NSF Grant Awards by Year

Screening Criteria
Social science and/or behavioral science
Original or primary data collection proposed, including assembling a database from existing (archival) sources
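As a rough illustration, the two criteria could be applied as a keyword pass over award abstracts. The keyword lists below are invented for this sketch and are far cruder than the human screening actually used:

```python
# Illustrative screening pass; keyword lists are assumptions for the sketch,
# not ICPSR's actual screening rules.
SOCIAL_SCIENCE_TERMS = {"survey", "interview", "respondents", "social", "behavioral"}
PRIMARY_COLLECTION_TERMS = {"collect", "survey", "interview", "assemble a database"}

def screen(abstract: str) -> bool:
    """Flag an award that looks social/behavioral AND proposes primary data collection."""
    text = abstract.lower()
    is_social = any(t in text for t in SOCIAL_SCIENCE_TERMS)
    collects = any(t in text for t in PRIMARY_COLLECTION_TERMS)
    return is_social and collects
```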

Activity in NSF Grants (n=17,194)

Type of Activity Proposed                   %      N
Not Social Science                          47.9   8,237
Training/Workshop/Conference
Social Science Primary Data Collection      13.6   2,336
Secondary Analysis
Primary & Secondary (combination)
No Data Collection or Analysis
No Abstract                                 15.5   2,664
Flagged & Other                             13.0   2,232
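Each percentage in the table is simply that category's count over the 17,194 screened awards; a quick check using the counts that survive on the slide:

```python
# Verify table percentages as count / total; counts come from the slide above.
TOTAL = 17_194
counts = {
    "Not Social Science": 8_237,
    "Social Science Primary Data Collection": 2_336,
    "No Abstract": 2_664,
    "Flagged & Other": 2_232,
}
percentages = {k: round(100 * n / TOTAL, 1) for k, n in counts.items()}
```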

Types of Research Activity NSF has Awarded by Year
(Abstracts become widely available 1987+)

Most Prevalent Social Science Primary Data Collection Awards by NSF Organization

NSF Organization                        % of total for division    n of awards
Antarctic Sciences Div
Arctic Science Div
Behavioral & Cognitive Sciences
Social & Economic Sciences
Research, Evaluation & Communication
Information & Intelligent Systems

Other Fields Coded During Screening
Topic/Discipline
Data Collection Methodology
Sampling Characteristics

Topic/Discipline in NSF Awards for Primary Social Science Data Collection (# of NSF Awards)
An additional 1,594 records were coded "General Social Science".

Type of Data Collection Method/Design in NSF Awards for Primary Social Science Data Collection (# of NSF Awards)

NSF Awards for Social Science Primary Data: Proposed Sampling Method

Sampling Method                   Percent of Total    N
Probability Sample Proposed
Non-Probability Sample Proposed   1.0                 23
Not Specified/Missing             93.7                2,190

NSF Awards for Social Science Primary Data: Type of Sampling Frame Proposed

Sampling Frame                    Percent of Total    N
U.S. - National
U.S. - Regional
International - Including U.S.
International - Excluding U.S.
Not Specified/Missing

NSF Awards for Social Science Primary Data: Proposed Sample Size

Sample Size             Percent    N
1,000+
< 1,000
Not Specified/Missing   85.3       1,994

NSF Awards for Social Science Primary Data: Race/Ethnic Distribution of Sample

                          Percent    N
Multiple Races
Single Race Study
Not Specified/Missing     88.6       2,069
Any Whites                2.1        50
Any African Americans     4.2        97
Any Latinos               3.0        70
Any Asians                3.9        90
Any Other Non-Whites      1.7        39

Gender Distribution Sampled, When Known (n=164)

Children/Adult Sampled, When Known (n=235)

Following-Up: Prospects for Data Archiving
N=2,336 Primary Social Science Data Collection Awards
N=201 Combined Data Collection Activity and Secondary Data, Social Science Research
Steps:
 Select ~10-20 records per week
 Generate updated contact information for PI
 Determine if "obviously" archived already (ICPSR, Roper, Odum, Murray, Sociometrics, Google)
 Review related citations
 Review other NSF awards made to PI
 Contact PI (Data produced? Data archived? Data still available?)
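The weekly selection step above can be sketched as a simple sampler over unreviewed records. The record shape and the "reviewed" flag are assumptions made for illustration:

```python
import random

def weekly_batch(records, k=15, seed=None):
    """Pick roughly 10-20 unreviewed records for this week's PI follow-up.

    `records` is assumed to be a list of dicts carrying a boolean "reviewed"
    flag; both the shape and the flag name are illustrative.
    """
    pool = [r for r in records if not r.get("reviewed", False)]
    rng = random.Random(seed)
    return rng.sample(pool, min(k, len(pool)))
```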

Other Qualitative Fields in LEADS
Description of how the collection fits within the scope of important social science studies
Description of the value of the study for archiving
Priority ranking
Citations
PI communication

Problems archiving studies…
PI unsure where data are stored
Data are in an old format that we may or may not be able to recover
Physical condition (storage media or documentation) has deteriorated
Paper-copy documentation only, or incomplete documentation
No English-language documentation

NIH Records in LEADS
We screened NIH awards for (1) social science/behavioral research, (2) original data, and (3) quantitative methods.
All NIH Institutes ( )
NICHD, NIA, NIMH, NINR, AHRQ, NIAAA, NIDA, Clinical Center, NIDCD, FIC, NCI, NHLBI, NIDDK (all years)
172,196 total awards screened
6,381 awards selected

Challenges & Limitations
Size and scope of this project
Need for PI cooperation
Screening error rate has not been quantified
Addressing the ambiguous records
Collaborative projects and continuation projects have not been eliminated

Conclusions
NIH & NSF award databases are a valuable source of information about studies "at risk" of being lost.
PI grant abstracts are highly variable in the amount of detail they give about research aims & methodology.
Preliminary results suggest that few studies have been archived, although the rate is higher for NSF.
The large number of unarchived studies requires us to use appraisal methods to determine a particular study's value for archiving.

People Working on LEADS
NSF: Darrell Donakowski, Lisa Quist, Jared Lyle, Tannaz Sabet
NIH: Russ Hathaway, Felicia LeClere, Brian Madden, James McNally, JoAnne O'Rourke, Kelly Zidar