
Slide 1: Survey of Science and Engineering Research Facilities, Fiscal Year 2005 – Dual Mode Data Collection Experience
Timothy Smith, Cindy Gray, and Leslie J. Christovich
Presented at ICES-III, Montréal, Québec, Canada, June 18-21, 2007

Slide 2: Discussion topics
Discussion will focus on the FY 2005 Survey of Science and Engineering Research Facilities:
- Content and burden
- Key characteristics of dual mode design and data collection procedures
- Response rates
- Differences between web and mail respondents
- Lessons learned

Slide 3: Background
1986 – Congress mandates survey
- Collection of information on amount and condition of science and engineering (S&E) research space
- Academic and nonprofit biomedical institutions are included
- Biennial data collection begins using paper questionnaire
1996 – Electronic version of the survey is introduced
- Windows-based application on diskette
- 27% of respondents use the diskette version

Slide 4: Background (continued)
1998 – NSF introduces the web survey
- Respondents are given option of responding by web or mail
- 53% of respondents submitted by web
2003 – NSF adds a second component to the survey
- Focuses on computing and networking capacity
- Requires involvement of different types of survey respondents

Slide 5: Survey population
A census of eligible academic and biomedical institutions:
- 477 public and private colleges and universities
- 191 independent hospitals and other nonprofit biomedical organizations

Slide 6: Survey respondents/contributors
Institutional coordinators (ICs) were assigned by the president/director.
ICs identified and collected data from institution offices:
- Facilities Office
- Office of Sponsored Research
- Budget and Planning Offices
- Medical School representatives
- Department deans
- Chief Information Officer
- Others

Slide 7: Survey content
Part 1:
- Current research space in net assignable square feet (NASF) by S&E field
- Condition of current NASF by field
- Repair/renovation costs by field
- New construction costs and NASF by field

Slide 8: Survey content (continued)
Part 1:
- Planned repair/renovation (costs) and new construction (costs and NASF) by field
- Deferred repair/renovation and new construction costs by field
- Research animal facilities and medical school facilities

Slide 9: Survey content (continued)
Part 2:
- Bandwidth to Internet1 and Internet2, current and estimated for FY 2006
- Types of connections to Internet1
- Bandwidth from consortia (e.g., state or regional networks)
- High performance network connections (e.g., Abilene, National LambdaRail)

Slide 10: Survey content (continued)
Part 2:
- High performance computing systems (i.e., number of systems and access)
- Speed of connections between desktop ports, along internal networks, and to Internet1
- Wireless connectivity
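Taken together, slides 7 through 10 describe the data elements collected from each institution. As a purely illustrative aid, the sketch below models that content as per-institution records in Python; the class and field names are invented for this example and are not part of the survey instrument.

    # Purely illustrative, hypothetical record layout for the data elements on slides 7-10.
    from dataclasses import dataclass, field
    from typing import Dict

    @dataclass
    class FacilitiesRecord:  # Part 1: research space and construction
        nasf_by_field: Dict[str, int] = field(default_factory=dict)          # current NASF by S&E field
        condition_by_field: Dict[str, str] = field(default_factory=dict)     # condition of current NASF
        repair_renovation_costs: Dict[str, int] = field(default_factory=dict)
        new_construction_costs: Dict[str, int] = field(default_factory=dict)

    @dataclass
    class CyberinfrastructureRecord:  # Part 2: computing and networking capacity
        bandwidth_to_internet1_mbps: int = 0
        bandwidth_to_internet2_mbps: int = 0
        high_performance_systems: int = 0
        wireless_connectivity: str = ""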

Slide 11: Survey burden
- Academic institutions – 41 hours
- Biomedical organizations – 7 hours
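For a rough sense of scale only, the arithmetic below applies these per-respondent burden estimates to the census counts on slide 5; the presentation itself does not report a total burden figure.

    # Illustrative tally only: per-respondent burden hours (slide 11)
    # applied to the census counts (slide 5).
    academic_total = 477 * 41     # 19,557 hours
    biomedical_total = 191 * 7    #  1,337 hours
    print(academic_total + biomedical_total)  # 20,894 hours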

Slide 12: Data collection schedule
- October 24 – Recruitment packages sent to presidents/directors
- October 26 – Survey packages sent to repeat ICs (others sent on a flow basis)
- November 3 – Start of email and telephone verification of survey receipt
- December 8 – Email reminder one week prior to the initial due date (December 15)
- January through April – Telephone and email prompting of nonresponding institutions
- April 12 – End of data collection
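As a minimal sketch, assuming one simply wanted to encode these milestones to drive mailings and reminders, the schedule could be represented as data; the structure below is hypothetical and omits calendar years because the slide lists only month/day labels.

    # Hypothetical representation of the data collection milestones above.
    schedule = [
        ("Oct 24", "Recruitment packages sent to presidents/directors"),
        ("Oct 26", "Survey packages sent to repeat ICs; others on a flow basis"),
        ("Nov 3", "Start email and telephone verification of survey receipt"),
        ("Dec 8", "Email reminder one week before the initial due date (Dec 15)"),
        ("Jan-Apr", "Telephone and email prompting of nonresponding institutions"),
        ("Apr 12", "End of data collection"),
    ]
    for date, activity in schedule:
        print(f"{date:>7}  {activity}")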

Slide 13: Survey response rates

Slide 14: Data collection procedures
Institutional Coordinator (IC) package sent by U.S. mail:
- NSF cover letter
- Paper questionnaire (Part 1 and Part 2 bound separately) with perforated pages
- Web survey access instructions
- Copy of the NSF FY 2003 Survey InfoBrief
ICs had the choice to respond by web or mail.

Slide 15: Data collection procedures (continued)
- Question wording and page formatting were consistent across modes
- Questions could be answered in any order
- Print Paper Survey function provided on website

Slide 16: List of Survey Questions Screen

Slide 17: Example question

Slide 18: Example question (continued)

Slide 19: Preparation versus response
Preference for the paper questionnaire as a data preparation tool:
- Portability of instructions and definitions during record searching
- Distribution of questions to various offices/departments
- Collection of the same data elements from multiple offices/departments
- Maintaining control of the final data for submission

Slide 20: Preparation versus response (continued)
Preference for the web survey as a data submission tool:
- Population uses the web extensively
- Survey materials and contacts emphasize web submission
- On-line edit checks (almost 100) help identify problems (see the sketch after this slide)
- Documentation of completed questionnaire (i.e., question text, instructions/definitions, responses) can be downloaded
- Continued access to survey responses after submission and after edit resolution follow-up
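The presentation does not describe how the roughly 100 on-line edit checks were implemented. The sketch below is a minimal, hypothetical illustration of the general idea: a rule that compares two reported values and returns a respondent message when they disagree. The field names and Mbps values are invented for the example.

    # Hypothetical sketch of an on-line edit check; the actual instrument's
    # rules and field names are not described in the presentation.
    def check_bandwidth(record):
        """Flag inconsistent Internet1 bandwidth entries (values in Mbps)."""
        messages = []
        current = record.get("bandwidth_internet1_current")
        estimated = record.get("bandwidth_internet1_fy2006")
        if current is not None and current < 0:
            messages.append("Current Internet1 bandwidth cannot be negative.")
        if current and estimated and estimated < current:
            messages.append(
                f"Estimated FY 2006 bandwidth ({estimated} Mbps) is lower than current "
                f"bandwidth ({current} Mbps); please confirm the decrease is intended."
            )
        return messages

    # Example with made-up values
    print(check_bandwidth({"bandwidth_internet1_current": 622, "bandwidth_internet1_fy2006": 155}))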

Slide 21: Response mode by type of institution

Slide 22: Response mode by type of academic institution

Slide 23: Response mode by type of biomedical institution

Slide 24: Response time
Median number of days for submission:
- All institutions: 69 days for web respondents; 68 days for mail respondents
- Academic institutions: 76 days for web respondents; 89 days for mail respondents
- Biomedical institutions: 50 days for web respondents; 56 days for mail respondents
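How these medians were calculated is not shown; the snippet below is only a sketch of one way to compute median days to submission by response mode from per-institution records, using invented field names and made-up example values.

    # A minimal sketch, with hypothetical field names and made-up data, of
    # computing median days to submission by response mode.
    from collections import defaultdict
    from statistics import median

    def median_days_by_mode(records):
        """records: dicts with 'mode' ('web' or 'mail') and 'days' to submission."""
        days = defaultdict(list)
        for r in records:
            days[r["mode"]].append(r["days"])
        return {mode: median(values) for mode, values in days.items()}

    print(median_days_by_mode([
        {"mode": "web", "days": 50}, {"mode": "web", "days": 76},
        {"mode": "mail", "days": 56}, {"mode": "mail", "days": 89},
    ]))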

Slide 25: Communication/Comprehension
- Pretesting, site visits, workshops, etc.
- Paper questionnaire
- General definitions at start of paper and web surveys
- Key definitions and instructions on multiple pages and at time needed
- Gray shading of non-applicable survey questions on web (see the sketch after this slide)
- On-line edits with respondent messages
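The gray-shading behavior is not specified beyond the bullet above; the sketch below illustrates the underlying idea, deriving a question's applicability from an earlier answer and rendering non-applicable items as disabled. The rule, question identifiers, and answer values are all hypothetical.

    # Hypothetical sketch of graying out non-applicable questions on the web survey.
    def is_applicable(question_id, answers):
        # Invented rule: animal-facility detail items apply only when the
        # institution reports having research animal facilities.
        if question_id.startswith("animal_facility_"):
            return answers.get("has_animal_facilities") == "yes"
        return True

    def render_state(question_id, answers):
        return "enabled" if is_applicable(question_id, answers) else "grayed out"

    print(render_state("animal_facility_nasf", {"has_animal_facilities": "no"}))  # grayed out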

Slide 26: Communication/Comprehension
Median number of edit problems:
- All institutions: 2 edit problems for web respondents; 4 for mail respondents
- Academic institutions: 2 edit problems for web respondents; 4 for mail respondents
- Biomedical institutions: 1 edit problem for web respondents; 3 for mail respondents

Slide 27: Edit resolution follow-up
Resolution of respondent problems:
- Telephone and email help desk
- Post-submission edits and follow-up with ICs and other contributors

Slide 28: Edit resolution follow-up (continued)
Percent of institutions requiring edit resolution follow-up:
- All institutions: 73% of web respondents; 88% of mail respondents
- Academic institutions: 77% of web respondents; 89% of mail respondents
- Biomedical institutions: 63% of web respondents; 87% of mail respondents

Slide 29: Lessons learned
- Identify contributors for follow-up
- Incorporate data entry features into web survey
- Tailor edit messages to specific conditions of inconsistency (see the sketch below)
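The presentation does not show what the tailored messages looked like. As a minimal sketch, assuming a consistency check between a reported total and a sum across S&E fields, the example below returns a different message for each direction of the inconsistency; the function, field names, and values are invented for illustration.

    # Hypothetical sketch of tailoring edit messages to the specific condition
    # of inconsistency rather than issuing one generic error.
    def nasf_edit_message(total_nasf, sum_by_field):
        """Return a message naming the specific problem, or None if the entries agree."""
        if sum_by_field > total_nasf:
            return (f"NASF summed across S&E fields ({sum_by_field:,}) exceeds total "
                    f"research NASF ({total_nasf:,}); please review the field entries.")
        if sum_by_field < total_nasf:
            return (f"NASF summed across S&E fields ({sum_by_field:,}) is less than total "
                    f"research NASF ({total_nasf:,}); one or more fields may be missing.")
        return None

    print(nasf_edit_message(120_000, 115_000))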

