Computer-based Experiments: Obstacles
Stephanie Bryant, University of South Florida
Note: See Bryant, Hunton and Stone, BRIA 2004 for complete references.



2 Obstacles: Overview
- Technology skills needed
- Threats to internal validity
- Getting participants

3 Obstacles (Cont'd)
- Technology skills needed
- Proficiency in software or programming

4 Tools for Computer-based Experiments
Develop using a scripting language or applications software:
- Applications software for Web experiments
  - Example software packages: RAOSoft, Inquisite, PsychExps
  - More expensive software, but cheaper development & maintenance costs? Easier to use; features limited to those built into the software
- Scripting languages
  - Examples: ColdFusion, PHP, JSP (Java Server Pages), CGI (Common Gateway Interface)
  - Software is cheap or free, but higher development & maintenance costs? Difficult for non-programmers; more features, more customizable
- Combine scripting languages & applications software
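As an illustrative sketch only (none of these function or field names come from the presentation), the scripting-language route can start as small as one function that renders an experiment screen as an HTML form and one that parses the submitted response, the kind of glue a CGI or PHP page provides:

```python
from urllib.parse import parse_qs


def render_trial(question: str, options: list[str]) -> str:
    """Render one experiment screen as an HTML form (minimal sketch)."""
    inputs = "\n".join(
        f'<label><input type="radio" name="answer" value="{o}"> {o}</label>'
        for o in options
    )
    return f"<form method='post'><p>{question}</p>{inputs}<button>Next</button></form>"


def parse_response(body: str) -> dict[str, str]:
    """Parse a URL-encoded form submission into a flat dict of answers."""
    return {key: values[0] for key, values in parse_qs(body).items()}


html = render_trial("Rate the disclosure's clarity:", ["Low", "Medium", "High"])
data = parse_response("answer=High&participant=p01")
```

A real experiment would add session handling and persistent storage on top of this; packaged tools like Inquisite bundle exactly those pieces, which is the cost-versus-flexibility trade-off the slide describes.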

5 Applications Software: Raosoft Products (Ezsurvey, Survey win, Interform)
- Difficulty index (1 = hard, 10 = easy): 8
- Do not provide all the desired functionalities
  - No randomization or response-dependent questions (i.e., only straight surveys)
  - Limited formatting capabilities
- Expensive, no educational prices ($1,500 - $10,000)
- SurveyMonkey.com: $19.95/month

6 SurveyMonkey.com

7 Applications Software: Inquisite
- Difficulty index (1 = hard, 10 = easy): 8
- Expensive ($10,000); supports most desired functionalities
- Supporting all desired functionalities for complex applications requires the Software Development Kit (SDK) ($2,000, but may soon be available for free)

8 Applications Software: PsychExps
- PsychExperiments Web site created and maintained by Univ. of Mississippi psychology professor Ken McGraw
- A collaboratory: http://psychexps.olemiss.edu/
- Free!
- Requires that users download & install applications software
- Many existing scripts (e.g., randomization)

9 Psychexps Home Page

10 Psychexps (Cont'd)

11 Current Experiments on Psychexps

12 Obstacles (Cont'd)
- Big learning curves involved
- On-campus support sometimes available
- Can hire programmers/graduate students to help with programming

13 Obstacles (Cont'd)
Validity considerations:
- Statistical conclusion validity
- Internal validity
- Construct validity
- External validity

14 Statistical Conclusion Validity (the extent to which two variables can be said to co-vary)
- Increased sample size and statistical power (e.g., Ayers, Cloyd et al.)
  - Use the Web to recruit participants!
- Decreased or eliminated data entry errors
  - Capture data directly into a database
- Increased variability in experimental settings
  - Difficult to control in Web experiments
  - People complete experiments in their own (natural) settings with various computer configurations (browsers, hardware)
  - McGraw et al. (2000) note that Web experiment noise is compensated for by large sample sizes
- System downtime
- Software coding errors (e.g., Barrick 2001, Hodge 2001)
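The "capture data directly into a database" point can be sketched as below. This is a hypothetical illustration (the table and column names are invented, not the authors' setup): each submission is written straight to storage at the moment it arrives, so no manual data entry, and no transcription errors, ever occurs.

```python
import sqlite3

# In-memory database for the sketch; a real experiment would use a file path
# (or a server-side database reached from the Web script).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE responses (participant TEXT, condition TEXT, answer TEXT)"
)


def record_response(participant: str, condition: str, answer: str) -> None:
    """Write one response directly to the database at submission time."""
    conn.execute(
        "INSERT INTO responses VALUES (?, ?, ?)",
        (participant, condition, answer),
    )
    conn.commit()


record_response("p01", "treatment", "agree")
rows = conn.execute("SELECT * FROM responses").fetchall()
```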

15 Internal Validity (Correlation or Causation?)
- Decreased potential diffusion of treatment
  - Unlikely that participants will learn information intended for one treatment group and not another
- Increased participant drop-out rates across treatments
  - A higher drop-out rate among Web vs. laboratory experiments could create a participant self-selection effect that makes causal inferences problematic
  - Mitigate by placing requests for personal information and monetary rewards at the beginning of the experiment (Frick et al. 1999; McGraw et al. 2000)
  - Completion rates approached 86% when some type of monetary reward was offered (Musch and Reips 2000)

16 Internal Validity (Correlation or Causation?)
- Controlling cheating
  - Multiple submissions by a single participant
  - Identification by email address, logon ID, password, or IP address
- Randomization (a control)
  - Computer scripts are available for randomly assigning participants to conditions
  - Complete scripts published in Baron and Siepmann (2000, 247) and Birnbaum (2001, 210-212)
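A minimal sketch of both controls together, random assignment plus rejection of repeat submissions by participant ID, might look like the following. This is illustrative only; the published scripts the slide cites (Baron and Siepmann 2000; Birnbaum 2001) are the authoritative versions, and the condition names here are invented:

```python
import random

CONDITIONS = ["control", "treatment"]
seen_ids: set[str] = set()  # guards against multiple submissions


def assign_condition(participant_id: str, rng: random.Random) -> str:
    """Randomly assign a new participant to a condition; reject repeats."""
    if participant_id in seen_ids:
        raise ValueError(f"duplicate submission: {participant_id}")
    seen_ids.add(participant_id)
    return rng.choice(CONDITIONS)


rng = random.Random(42)  # seeded only so the sketch is reproducible
first = assign_condition("p01", rng)
```

In practice the ID check would key on whatever identifier the experiment collects (email, logon ID, or IP address, as the slide lists), each with its own trade-offs; IP addresses, for instance, can be shared by lab machines behind one gateway.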

17 Construct Validity (Generalizability from observations to higher-order constructs)
- Decreased demand effects & other experimenter influences (Rosenthal 1966, 1976; Pany 1987)
- Decreased participant evaluation apprehension (Rosenberg 1969)
- Naturalism of setting decreases?

18 Getting Participants: Web-based Experiments
- Explosion of WWW use
  - 172 million computers linked to the WWW
  - 90% of CPAs conduct internet research
  - 60% of the US population has WWW access

19 Getting Participants
Internet participant solicitation benefits:
- Large sample sizes (power) possible
- Availability of diverse, world-wide populations
- Interactive, multi-participant responses
- Real-time randomization of question order
- Response-dependent questions (branch and bound)
- Authentication and authorization
- Multimedia (e.g., graphics, sound)
- On-screen clock

20 Getting Participants
- Web-based
  - Post notices in places where your target population might be likely to visit
  - Access ListServs
- PC-based
  - Student involvement requirement??
  - USF process

21 USF Process
- Mandatory participation in one experiment per semester
- Experimetrix site used to manage: https://experimetrix2.com/soa/

22 https://Experimetrix.com/soa

23 Experimetrix Signup

24 Experimenter Report

25 A Final Caveat: What Can Go Wrong... Will
- Cynical, but realistic
- Plan carefully
- Develop contingency plans
- Consider cost-benefit
- Greatest potential for BAR Web experiments is as yet unrealized
- Biggest hurdle is required knowledge, but this can be overcome

