Evaluation and Designing


1 Evaluation and Designing
Topic 7: Evaluation

2 Examine the following products

3

4 Product Evaluation The following photographs show some of the different lemon squeezers that have been made throughout history. They range in size, price, material, effectiveness and attractiveness. Look at the pictures of the lemon squeezers below: the designs may have changed over the years, but does this make the newer versions more effective?

5 LEMON SQUEEZERS

6 A wooden fruit squeezer (above) - this simple squeezer is modern, but the design is Victorian

7 Starck's Juicy Salif (above) - the juicer was designed in 1988 by the French designer Philippe Starck

8 A modern novelty juicer

9 A modern representation of a Victorian glass lemon squeezer

10 A modern simple hinged fruit squeezer

11 A simple modern plastic lemon squeezer

12 Product Evaluation
Does this equipment encourage people to prepare and cook food?
Is one of the lemon squeezers more efficient than the others?
Is there a difference in price?

13 Evaluation Models Try answering these questions:
How does each of the lemon squeezers work?
Evaluate the different squeezers using the evaluation model ACCESS FM:
Aesthetics, Cost, Customer, Environment, Safety, Size, Function, Materials

14 Evaluation Models Evaluate the different squeezers using the evaluation model PIM:
Positive, Interesting, Minus

15 Evaluation
Evaluation is the systematic assessment of the worth or merit of some object.
Evaluation is the systematic acquisition and assessment of information to provide useful feedback about some object.
Goals of evaluation: the goal of most evaluations is to provide "useful feedback"; the major goal of evaluation should be to influence decision-making.

16 Types of Evaluation Formative evaluation includes several evaluation types:
needs assessment determines who needs the program, how great the need is, and what might work to meet the need
evaluability assessment determines whether an evaluation is feasible and how stakeholders can help shape its usefulness
structured conceptualization helps stakeholders define the program or technology, the target population, and the possible outcomes
implementation evaluation monitors the fidelity of the program or technology delivery
process evaluation investigates the process of delivering the program or technology, including alternative delivery procedures

17 Types of Evaluation Summative evaluation can also be subdivided:
outcome evaluation investigates whether the program or technology caused demonstrable effects on specifically defined target outcomes
impact evaluation is broader and assesses the overall or net effects (intended or unintended) of the program or technology as a whole
cost-effectiveness and cost-benefit analysis address questions of efficiency by standardizing outcomes in terms of their dollar costs and values
secondary analysis reexamines existing data to address new questions or use methods not previously employed
meta-analysis integrates the outcome estimates from multiple studies to arrive at an overall or summary judgment on an evaluation question

18 Evaluation Strategies
Qualitative evaluation is an assessment process that answers the question, "How well did we do?"
Quantitative evaluation is an assessment process that answers the question, "How much did we do?"
Tests, models and experiments are used at the design development stage of the design cycle, evaluating ideas before developing a chosen solution.

19 Literature Search The process of carrying out a systematic (and usually exhaustive) search of the literature on a given topic. It is usually the first step in a research project. A review of the literature is important for a research project because it enables you to acquire an understanding of your topic, with its key issues, and an awareness of relevant research that has already been conducted.

20 User Trial An experimental investigation in which a group of users test versions of a product under controlled conditions (Pheasant 1996); a usability trial within research. It is a systematic heuristic (trial and error) or experimental evaluation of the interaction between people and the products, equipment and environments they use. Heuristic (trial and error) means allowing or assisting people to discover, letting them find out for themselves.

21 User Research When undertaking user research, you are effectively asking questions of users. The types of questions you want to ask define or limit the appropriate type of research. Questions for users include:
How well is the existing product supporting them?
What do they like and dislike?
Do they have a wish-list for what the product should do?
Do they understand the meaning of a specific function, page, menu or screen?
Can they complete the tasks they want?
What is their emotional reaction to a design concept or product?
Do they value a product (whether existing or proposed)?
What alternative or additional methods, channels or tools are they using?
Are they using any workarounds?
What are their concerns about a product?
Do they understand the navigation, terminology and behavior of the product?

22 Expert Appraisal The evaluation of a product or service by someone who has the professional training or experience to make an informed judgment on the design. Ideally, this person should not be biased by former involvement with the product, since familiarity with any product or task makes it seem simpler and easier. Expert appraisal can be used to:
identify possible causes of design exclusion
suggest improvements to reduce this exclusion
increase user satisfaction

23 Advantages / Disadvantages
Literature Search: Many sources of information are available, and the use of ICT (online books, periodicals) enhances the speed of searching; cost, storage and security are all considerations. The abundance of data can be time-consuming to work through.
User Trial: User trial data is collected by observing users' behaviour. The "user" is a non-specialist, which makes trials easier and cost-effective. However, "users" may carry out tasks in different ways from those expected and be inexperienced in data collection.
User Research: User research data is collected by obtaining users' responses to questions. Data is relatively easy and cheap to obtain. The data is usually qualitative.
Expert Appraisal: Expert knowledge and advice are gained. However, the expert may be biased, and locating an expert may be difficult and expensive. The data is usually qualitative.

