1 NMFS Use Case 1 review/evaluation and next steps, April 19, 2012, Woods Hole, MA. Peter Fox (RPI* and WHOI**) and Andrew Maffei (WHOI). *Tetherless World Constellation, **AOP&E

2 Modern informatics enables a new scale-free framework approach: use cases, stakeholders, distributed authority, access control, ontologies, maintaining identity

3 What’s happened since... Implementation: technology assessment, leveraging infrastructure, rapid prototyping. Now: evaluation (open world, iteration)

4 Review and evaluation

5 References: Twidale, M., Randall, D., and Bentley, R. (1994). Situated evaluation for cooperative systems. Proceedings, Computer Supported Cooperative Work 1994, Chapel Hill, NC, pp. 441–452, and references therein.

6 Metrics: things you can measure (numerical) and things that are categorical –Could not do before –Faster, more complete, fewer mistakes, etc. –Wider range of users. Measure or estimate the baseline before you start
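The baseline idea on this slide can be sketched in a few lines of Python: record each numerical metric before the project starts, then report the new values relative to that baseline. All names and numbers below are hypothetical illustrations, not measurements from this project.

```python
# Hypothetical sketch: compare current metrics against a baseline
# recorded before the work began.

def improvement(baseline, current):
    """Relative change for each numerical metric, keyed by metric name
    (positive means an increase over the baseline)."""
    return {name: (current[name] - baseline[name]) / baseline[name]
            for name in baseline}

# Illustrative numbers only.
baseline = {"datasets_found_per_query": 4.0, "users_per_month": 20.0}
current = {"datasets_found_per_query": 9.0, "users_per_month": 35.0}

for name, delta in improvement(baseline, current).items():
    print(f"{name}: {delta:+.0%}")
```

The categorical metrics on the slide ("could not do before", "wider range of users") do not reduce to a ratio like this; they still need a recorded before/after description.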

7 Result/outcome. We will refer to the use case document. The outcome (and its value) is a combination of data-gathering processes that may include surveys, interviews, focus groups, document analysis, and observations, yielding both qualitative and quantitative results, i.e., a discussion (today). Did we meet the goal?

8 Example: what we wanted to know about VSTO. Evaluation questions are used to determine the degree to which VSTO enhanced search, access, and use of data for scientific and educational needs, and effectively utilized and implemented a template for user-centric use of semantic web methodology. VO appears local, integrated, and in the end users’ language (this is one of the metrics)

9 Evaluation (Twidale et al.) An assessment of the overall effectiveness of a piece of software, ideally yielding a numeric measure by which informed cost-benefit analysis of purchasing decisions can be made. An assessment of the degree to which the software fulfils its specification in terms of functionality, speed, size or whatever measures were pre-specified.

10 Evaluation An assessment of whether the software fulfils the purpose for which it was intended. An assessment of whether the ideas embodied in the software have been proved to be superior to an alternative, where that alternative is frequently the traditional solution to the problem addressed. An assessment of whether the money allocated to a research project has been productively used, yielding useful generalizable results.

11 Evaluation An assessment of whether the software proves acceptable to the intended end-users. An assessment of whether end-users continue to use it in their normal work. An assessment of where the software fails to perform as desired or as is now seen to be desirable. An assessment of the relative importance of the inadequacies of the software.

12 (Orthogonal) dimensions of evaluations: structured – less structured; quantitative – qualitative; summative – formative; controlled experiments – ethnographic observations; formal and rigorous – informal and opportunistic

13 Formative and summative evaluation, carried out for two reasons: –grading translations = summative evaluation (when the guests taste the soup) –giving feedback = formative evaluation (when the cook tastes the soup)

14 Iterating: evolve, iterate, re-design, re-deploy –Small fixes –Full team aware of the evaluation results and implications –Decide what to do about the new use cases, or if the goal is not met –Determine what knowledge engineering is required and who will do it (participants in the evaluation may become domain experts) –Determine what new knowledge representation is needed –Assess the need for an architectural re-design

15 Summary: Project evaluation has many attributes, structured and less-structured; you really need to be open to all forms. A good way to start is to get members of your team to do peer evaluation. This is a professional exercise; treat it that way at all times. Other possible techniques for moving forward on evolving the design, what to focus upon, priorities, etc.: SWOT, Porter’s five forces.

16 Summary: By now, the reality of going into complete detail for the prototype implementation should be apparent. Keeping it simple is also very important as you begin to implement. Being prepared to iterate is essential. Now is the time to validate the models with domain experts and the team.

17 Questions?

18 Back shed

19 Use case!

20 Use Case … is a collection of possible sequences of interactions between the system under discussion and its actors, relating to a particular goal. The collection of Use Cases should define all system behavior relevant to the actors to assure them that their goals will be carried out properly. Any system behavior that is irrelevant to the actors should not be included in the use cases. –is a prose description of a system's behavior when interacting with the outside world. –is a technique for capturing functional requirements of business systems and, potentially, of an ICT system to support the business system. –can also capture non-functional requirements

21 Use case format (developed for NASA TIWG): use case name, goal, summary, triggers, basic flow, alternate flow, post-conditions, activity diagram, preconditions in tabular form, notes
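The template fields above map naturally onto a simple data structure. This is only an illustrative sketch: the field names follow the slide, while the class name and every sample value are invented for the example, not taken from the NMFS use case document.

```python
from dataclasses import dataclass, field

# Sketch of the NASA TIWG-style use case template listed on the slide.
# Field names mirror the listed sections; all sample content is invented.
@dataclass
class UseCase:
    name: str
    goal: str
    summary: str = ""
    triggers: list = field(default_factory=list)
    preconditions: list = field(default_factory=list)   # "in tabular form" on the slide
    basic_flow: list = field(default_factory=list)      # numbered steps
    alternate_flows: list = field(default_factory=list)
    postconditions: list = field(default_factory=list)
    notes: str = ""   # the activity diagram would be kept as an attachment/link

# Hypothetical instance for illustration only.
uc = UseCase(
    name="Find fisheries survey data",
    goal="Scientist locates and downloads survey data for a region and time range",
    triggers=["Scientist submits a search"],
    basic_flow=["Enter region and dates",
                "System lists matching datasets",
                "Scientist selects and downloads"],
)
print(uc.name, "-", uc.goal)
```

Writing the template as a structure like this makes it easy to check that every use case in a collection fills in the same sections before review.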

