
1 Evaluation of Web Sites: What Works and What Doesn't
Sue Ellen Hansen, Survey Research Operations, University of Michigan
Matthew Richardson, ICPSR, University of Michigan

2 Evaluative Principles
1. Is the navigation clear and consistent?
2. Does the site meet accessibility guidelines?
3. Has the content been written for the Web?
4. Does the visual design improve or impair the acquisition of knowledge?
5. Is the search engine a barrier to accessing content?
6. How clear and navigable is the access system?
7. Does the site do an adequate job of getting the user to his/her real goal, data analysis?
8. How well does the site educate the user and encourage further research?

3 Sites Reviewed
Behavioral Risk Factor Surveillance System (BRFSS) http://www.cdc.gov/brfss/
Centre for Comparative European Survey Data (CCESD) Information System http://www.ccesd.ac.uk/MainMenu.aspx
Council of European Social Science Data Archives (CESSDA) http://www.nsd.uib.no/cessda/index.html
Cultural Policy and the Arts National Data Archive (CPANDA) http://www.cpanda.org/
The Health and Retirement Study (HRS) http://hrsonline.isr.umich.edu/
The Henry A. Murray Research Center http://www.murray.harvard.edu/
Integrated Public Use Microdata Series (IPUMS) http://www.ipums.umn.edu/usa/sitemap.html
International Social Survey Programme (ISSP) http://www.issp.org/
Inter-university Consortium for Political and Social Research (ICPSR) http://www.icpsr.umich.edu/
National Election Studies (NES) http://www.umich.edu/~nes/
Panel Study of Income Dynamics (PSID) http://psidonline.isr.umich.edu/
The Roper Center http://www.ropercenter.uconn.edu/

4 1. Is the navigation clear and consistent?
Is the navigation consistent in terms of placement and style?
Do the navigation labels make sense? Are they free of jargon?
Are there too many navigation items?
Are there obvious links for commonly used Web site features, e.g., "contact us", "help", "site map", "search", and "privacy policy"? (See the sketch below.)
Can the user reach content without searching?
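
The question about commonly used features can be partly checked automatically. The sketch below is not part of the original presentation; it assumes the third-party requests and beautifulsoup4 packages, the expected labels are just examples, and the commented-out URLs are placeholders.

```python
# Sketch: report which commonly expected link labels ("contact", "help",
# "site map", "search", "privacy") are missing from a page's anchor text.
# Assumes requests + BeautifulSoup; the example URLs are placeholders.
import requests
from bs4 import BeautifulSoup

EXPECTED = ("contact", "help", "site map", "search", "privacy")

def missing_standard_links(url: str) -> list:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    link_text = " ".join(a.get_text(" ", strip=True).lower()
                         for a in soup.find_all("a"))
    return [label for label in EXPECTED if label not in link_text]

# for page in ("http://www.example.org/", "http://www.example.org/data/"):
#     print(page, missing_standard_links(page))
```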

8 2. Does the site meet accessibility guidelines?
Does the site use proper linking behavior, such as matching the link name to the page title and making sure each linked phrase is self-explanatory? (See the sketch below.)
Does the site rely on page positioning or color as a navigational tool, which can make it unusable for users with visual impairments?
Is the text scalable?
Section 508 – http://www.section508.gov/
WAI – http://www.w3.org/WAI/
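
The linking-behavior question lends itself to a similar automated audit. The following is a minimal sketch, not from the presentation: it flags generic link text and compares each link's text to the title of the page it points to. It assumes requests and beautifulsoup4, the start URL is a placeholder, and a real crawler would also need to respect robots.txt and rate limits.

```python
# Sketch: flag links whose text is generic or bears no resemblance to the
# title of the target page. Illustrative only; the word-overlap test is crude.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

GENERIC = {"click here", "here", "more", "read more", "link"}

def check_link_text(start_url: str) -> None:
    soup = BeautifulSoup(requests.get(start_url, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        text = a.get_text(strip=True)
        if not text or text.lower() in GENERIC:
            print(f"Non-descriptive link text: {text!r} -> {a['href']}")
            continue
        target = urljoin(start_url, a["href"])
        try:
            html = requests.get(target, timeout=10).text
        except requests.RequestException:
            continue
        title_tag = BeautifulSoup(html, "html.parser").find("title")
        title = title_tag.get_text(strip=True) if title_tag else ""
        # Crude heuristic: the link text should share at least one word with the title.
        if title and not set(text.lower().split()) & set(title.lower().split()):
            print(f"Link text {text!r} does not match page title {title!r}")

# check_link_text("http://www.example.org/")  # placeholder URL
```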

11 3. Has the content been written for the Web?
Is the text concise and well-organized?
Is the writing simple and oriented towards modular paragraphs and bulleted lists?
Is the content free of marketing fluff?
Are acronyms spelled out and jargon avoided?

14 4. Does the visual design improve or impair the acquisition of knowledge?
Is there enough color contrast to ease on-screen reading? (See the contrast-ratio sketch below.)
Are printer-friendly pages enabled?
Is there a visual cue for clickable image elements?
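
"Enough color contrast" can be made concrete with the contrast ratio defined in the W3C's WCAG guidelines: the relative luminance of the lighter color plus 0.05, divided by that of the darker plus 0.05. The sketch below is illustrative only and uses arbitrary example colors.

```python
# Contrast ratio between two sRGB colors, following the WCAG definition of
# relative luminance. Example colors are arbitrary.
def _linear(c: int) -> float:
    """Linearize one sRGB channel given on a 0-255 scale."""
    s = c / 255.0
    return s / 12.92 if s <= 0.03928 else ((s + 0.055) / 1.055) ** 2.4

def luminance(rgb) -> float:
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Mid-grey text (#777777) on a white background:
print(f"{contrast_ratio((119, 119, 119), (255, 255, 255)):.2f}:1")
# WCAG AA asks for at least 4.5:1 for body text.
```

The grey-on-white example comes out at roughly 4.48:1, just under the 4.5:1 body-text threshold, which is exactly the kind of borderline case a reviewer wants flagged.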

19 5. Is the search engine a barrier to accessing content?
Does the search engine offer a basic and advanced option?
Can the user narrow his/her search results?
Is there a help document on searching that spells out issues such as case sensitivity, phrase vs. word searching, Boolean searching, field searching, etc.? (See the sketch below.)
Does the search engine search all site content?
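
As an illustration of phrase, Boolean, and field searching working together, the sketch below shows how an advanced-search form might be translated into a single query string. This is not how any of the reviewed sites work; it assumes a Lucene/Solr-style query syntax, and the field names (title, year) are hypothetical.

```python
# Sketch: build a fielded Boolean query string from advanced-search form inputs.
# Assumes a Lucene/Solr-style syntax; field names are hypothetical.
def build_query(keywords, title="", year_from=None, year_to=None, match_all=True):
    clauses = []
    if keywords:
        joiner = " AND " if match_all else " OR "
        clauses.append("(" + joiner.join(keywords) + ")")
    if title:
        clauses.append(f'title:"{title}"')        # quoted text = phrase search
    if year_from is not None or year_to is not None:
        lo = year_from if year_from is not None else "*"
        hi = year_to if year_to is not None else "*"
        clauses.append(f"year:[{lo} TO {hi}]")    # range clause narrows the results
    return " AND ".join(clauses)

print(build_query(["income", "retirement"], title="panel study", year_from=1990))
# (income AND retirement) AND title:"panel study" AND year:[1990 TO *]
```

Whatever the back end, the help document should spell out exactly these conventions (quoting, AND/OR, ranges) so users can predict what a query will return.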

23 6. How clear and navigable is the access system?
How clearly are access restrictions (i.e., terms of use) spelled out?
To what extent is security/access an impediment to the user? What kind of registration is required?
To what extent is the archive wed to Web distribution, or can it accommodate other means of data dissemination?
Are confidentiality issues addressed in a meaningful fashion?

27 7. Does the site do an adequate job of getting the user to his/her real goal, data analysis?
How quickly can the user get to data?
How easy is it to acquire the full bundle of products, i.e., all files needed to perform analysis, including documentation?
Can you browse the documentation separately? Is it clearly labeled?
Are multiple software suites supported?
How descriptive are the metadata? (See the sketch below.)
Does the site provide any form of online analysis?
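
To make "descriptive metadata" concrete: many archives describe variables in XML codebooks (DDI is one such standard). The sketch below, which is not taken from any of the reviewed sites, parses a made-up, simplified codebook fragment and lists variable names with their labels, the kind of summary that lets a user judge a dataset before downloading the full bundle.

```python
# Sketch: list variable names and labels from a simplified, made-up
# DDI-style codebook fragment. Element names are illustrative only.
import xml.etree.ElementTree as ET

CODEBOOK = """<codeBook>
  <dataDscr>
    <var name="V101"><labl>Respondent age in years</labl></var>
    <var name="V102"><labl>Self-reported health status</labl></var>
  </dataDscr>
</codeBook>"""

root = ET.fromstring(CODEBOOK)
for var in root.iter("var"):
    print(f"{var.get('name')}: {var.findtext('labl', default='(no label)')}")
# V101: Respondent age in years
# V102: Self-reported health status
```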

32 8. How well does the site educate the user and encourage further research?
To what extent does the raw data serve as a gateway to other research? Does it reflect additional steps in the research data life cycle?
What does the site offer in terms of data literacy education?
To what extent does the site attempt to educate novice users?

35 Other Ways Sites Encourage Research
For series data, does the site link the waves to one another? Are aggregate files available?
Is it possible to find other research conducted by the investigator? In a particular subject area? Funded by a particular agency?
Does the site offer or link to statistical education programs?
Does the site direct users to conferences of interest to social science researchers?

36 Conclusion
Invest more time in designing your site's navigation than in its visual design.
Familiarize yourself with accessibility guidelines, and make your site accessible.
Be wary of search engines; consult experts when planning your site's search engine.
Spell out your terms of use.
Streamline the process of distributing data. Make sure the core function of your site can be accomplished with minimal time and effort.
Remember that research doesn't really end with the data download. Analysis is where it gets interesting, and we should be more than just a file-transfer utility.

