
Slide 1: The GRASP Research Project: An Overview of Program Comprehension Studies
Auburn University, Computer Science and Software Engineering

Slide 2: Software Visualization
–Intuitively beneficial in comprehension tasks
–Empirical evidence is mixed
–Graphical representations of software are inherently useful, though particular representations may not be

Slide 3: Control Structure Diagram (CSD)
–Visually depicts the control structure and module-level organization of source code
–A value-added feature of source code: appears as a companion to the source code without disrupting its familiar appearance
–Compact, intuitive, non-disruptive

Slide 4: CSD [Adapted from Barnes, Programming in Ada, 2nd Ed., Addison-Wesley, 1984]

   task body TASK_NAME is
   begin
      loop
         for p in PRIORITY loop
            select
               accept REQUEST(p) (D : DATA) do
                  ACTION (D);
               end;
               exit;
            else
               null;
            end select;
         end loop;
      end loop;
   end TASK_NAME;

Slide 5: CSD
[The slide shows the same TASK_NAME task body annotated with the Control Structure Diagram; the CSD's box-drawing symbols did not survive the text transcription.]

Slide 6: CSD

   repeat := true;
   WHILE repeat LOOP
      text_io.get_line (item => response, last => last);
      -- Interpret user command
      IF (last = 1) THEN
         IF (response(1) = 'r') THEN
            text_io.new_line;
            flag := continue;
            repeat := false;
            exit Process_Loop;
         ELSIF (response(1) = 's') THEN
            -- Status check requested
            flag := status;
            repeat := false;
         ELSIF (response(1) = 'c') THEN
            -- Checkpoint interval change requested
            LOOP
               BEGIN

Slide 7: CSD
[The slide shows the same Ada excerpt rendered with the CSD; the diagram symbols did not survive the text transcription.]

Slide 8: CSD

   gtcand((rtoffinf[CANDSTART]+k-1),report,10,report,36);
   ivote = vote[rtoffinf[CANDSTART]+k-1];
   if (ivote < 0L)
      ivote = 0L;
   cvicl(ivote,report,44,7);
   x430:
   cprint(flright);
   /* if (rtoffinf[DUALLOCATION] <= 0)
         continue;
      gtdcnd((rtoffinf[DUALSTART]+k-1),report,12);
      cprint(flright); */
   /* loop back for next candidate */
   } ;
   goto x455;
   /* no candidates */
   x460:
   cprint(flright);
   smove(report,11,"*** Warning *** No candidates",0,29);
   cprint(flright);

Slide 9: CSD
[The slide shows the same C excerpt rendered with the CSD; the diagram symbols did not survive the text transcription.]

Slide 10: Evaluation
Subjective study of performance characteristics
–Conducted before GRASP was developed and distributed
–CSD vs. similar visualizations
–Rated according to 11 performance characteristics
–Significant preference for the CSD in 8 of 11
Student preference surveys
–Conducted after GRASP was in use in the AU CSSE curriculum
Instrumentation
–3-4 years of AU utilization data
The cumulative implication of these initial evaluations, plus anecdotal evidence from industry, provided the motivation for continued development and study.

Slide 11: Empirical Research
Two major research objectives
–Evaluate and refine the CSD (and other visualizations, such as architecture diagrams, as the research progresses)
–Study how people use software visualizations in program comprehension tasks
Three major questions to address
–How do visualizations such as the CSD affect program comprehensibility?
–How do people use visualization techniques as tools in their work?
–How can the comprehensibility of a program be measured effectively?
Funding provided by the National Science Foundation (EIA-9806777).

Slide 12: Controlled Experiments: CSD vs. Plain Text, Round 1
39 senior- and graduate-level CS students were divided into two performance-balanced groups and given a quiz on a Java module containing 183 source lines of code. One group (the control) was given the source as plain text only, while the second group was given the source with the CSD. Both groups answered the same set of 12 questions.

Slide 13: Experimental Task
All questions concerned the structure and execution of the code, not its overall functionality. For example:
–How many ways are there to exit the loop that begins on line 91?
–To what line would control be transferred immediately after executing line 144?
–How many conditions must be evaluated in order for line 152 to be executed?
Subjects were informed that they would be graded on both correctness and speed: answer as quickly as possible without sacrificing correctness.

Slide 14: Experimental Preparation
Performance balance of the groups prior to the experiment.

Slide 15: Experimental Results
An initial analysis of the performance differences between the two groups used three measures:
–average time to respond to each question (T1)
–average time to respond correctly to each question (T2)
–number of correct responses across all questions (T3)
The data strongly rejected the null hypothesis that the CSD had no effect on subject performance in answering the 12 questions.
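For concreteness, the three measures can be computed directly from per-question response records. The sketch below is only an illustration and is not part of the GRASP materials; the record fields and class names are assumptions.

```java
import java.util.List;

// Minimal sketch of the three performance measures (T1, T2, T3), assuming one
// record per (subject, question) response. All names here are illustrative.
record Response(String subjectId, int question, double seconds, boolean correct) {}

final class PerformanceMeasures {

    // T1: average time to respond, over all responses
    static double averageResponseTime(List<Response> responses) {
        return responses.stream().mapToDouble(Response::seconds).average().orElse(Double.NaN);
    }

    // T2: average time to respond correctly, over correct responses only
    static double averageCorrectResponseTime(List<Response> responses) {
        return responses.stream()
                .filter(Response::correct)
                .mapToDouble(Response::seconds)
                .average()
                .orElse(Double.NaN);
    }

    // T3: number of correct responses across all questions
    static long correctResponses(List<Response> responses) {
        return responses.stream().filter(Response::correct).count();
    }
}
```

In the experiment these measures would be computed separately for the CSD group and the control group and then compared.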

Slide 16: Experimental Results: time taken to respond (T1)

Slide 17: Experimental Results: time taken to respond correctly (T2)

Slide 18: Experimental Results: number of correct responses (T3)

Slide 19: Experimental Results: time taken to respond correctly (T2), revised

Slide 20: Experimental Results
Average time taken to respond (T1): P-value = 0.0035
Average time taken to respond correctly (T2): P-value = 0.000305
Number of correct responses (T3): 45% of the CSD group's responses were correct versus only 26% of the control group's responses; the difference is highly significant (P-value = 0.0000167).
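As a rough plausibility check on the T3 figure, a simple two-proportion comparison of the pooled responses lands in the same range as the reported p-value. The group sizes below are assumptions (roughly 20 and 19 subjects, 12 responses each), the test shown is not necessarily the one the study used, and it treats responses as independent, which a per-subject analysis would not; treat it purely as a sketch.

```java
// Rough two-proportion z-test, for illustration only. Group sizes are assumed
// and within-subject correlation of responses is ignored.
public final class ProportionCheck {

    // Standard normal upper-tail probability via the Abramowitz-Stegun
    // polynomial approximation (valid for z >= 0).
    static double upperTail(double z) {
        double t = 1.0 / (1.0 + 0.2316419 * z);
        double pdf = Math.exp(-z * z / 2.0) / Math.sqrt(2.0 * Math.PI);
        double poly = t * (0.319381530 + t * (-0.356563782
                + t * (1.781477937 + t * (-1.821255978 + t * 1.330274429))));
        return pdf * poly;
    }

    public static void main(String[] args) {
        int n1 = 20 * 12, x1 = (int) Math.round(0.45 * n1);   // CSD group (assumed size)
        int n2 = 19 * 12, x2 = (int) Math.round(0.26 * n2);   // control group (assumed size)

        double p1 = (double) x1 / n1, p2 = (double) x2 / n2;
        double pooled = (double) (x1 + x2) / (n1 + n2);
        double se = Math.sqrt(pooled * (1 - pooled) * (1.0 / n1 + 1.0 / n2));
        double z = (p1 - p2) / se;

        // Comes out near z = 4.3, two-sided p on the order of 1e-5, i.e. the
        // same order of magnitude as the 0.0000167 reported on the slide.
        System.out.printf("z = %.2f, two-sided p ~ %.2e%n", z, 2 * upperTail(z));
    }
}
```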

Slide 21: Follow-on Experiment: CSD vs. Plain Text, Round 2
Experimental materials (code, questions) and procedures were held constant while the subject population changed: 50 CS 1 students.
The original results were partially reproduced: the CSD group showed statistically significant gains in accuracy and speed with respect to
–average time to respond correctly (T2)
–number of correct responses (T3)

Slide 22: Conclusions
–The CSD replaces penicillin as the greatest scientific advance of the 20th century.
–The IPO for GRASP, Inc. will make us fabulously wealthy.
–The beneficial effect of the CSD on program reading tasks is significant and measurable.
There are still many more questions to answer:
–Under what circumstances?
–Do the benefits scale up to larger tasks?
–Is there a pleasure or immediate-reward factor that is greater than the quantifiable effect?

Slide 23: Complexity Profile Graph
–An algorithmic-level graph of a complexity profile
–A fine-grained metric, computed for each production in the grammar
–A profile of a program unit rather than a single-value metric
–Complexity values from each measurable unit in a program are displayed as a set to form the complexity profile graph.
–Adds the advantages of visualization to complexity measurement.

Slide 24: Complexity Profile Graph
–A program unit is parsed and divided into segments; e.g., each simple declaration or statement is a single segment, while composite statements are divided into multiple segments.
–Each segment is a measurable unit.
–Segments are non-overlapping and all code is covered; i.e., every token is included in exactly one segment.
–The complexity for each segment is a bar in the CPG.
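As a rough illustration of the segmentation rule, the comments in the hypothetical Java method below mark where segment boundaries might fall: one segment per simple statement or declaration, and separate segments for the header and body parts of a composite statement. The exact boundaries in GRASP are defined per grammar production, so this is only an approximation.

```java
// Hypothetical example of CPG segmentation. Each comment names the segment a
// token range would fall into; every token belongs to exactly one segment.
final class SegmentationExample {
    static int clamp(int value, int lo, int hi) {  // segment 1: method header
        int result = value;                        // segment 2: simple declaration
        if (result < lo) {                         // segment 3: if header (predicate)
            result = lo;                           // segment 4: simple statement
        } else if (result > hi) {                  // segment 5: else-if header (predicate)
            result = hi;                           // segment 6: simple statement
        }
        return result;                             // segment 7: simple statement
    }
}
```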

Slide 25: Complexity Profile Graph

Slide 26: Complexity Profile Graph

Slide 27: Computing the CPG
–Content: C = ln(reserved words + operators + operands)
–Breadth: B = number of statements within a construct
–Reachability: R = 1 + number of operators in the predicate path
–Inherent: I = a value assigned based on the type of control structure
–Total: T = s1*C + s2*B + s3*R + s4*I, where s1, s2, s3, s4 are scaling factors
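A minimal sketch of the per-segment computation is given below; the field names, token counts, and scaling factors are placeholders, since the slide does not give concrete values.

```java
// Sketch of the per-segment CPG complexity described on the slide:
//   content      C = ln(reserved words + operators + operands)
//   breadth      B = number of statements within the construct
//   reachability R = 1 + number of operators in the predicate path
//   inherent     I = value assigned by control-structure type
//   total        T = s1*C + s2*B + s3*R + s4*I
record Segment(int reservedWords, int operators, int operands,
               int statementsInConstruct, int predicatePathOperators,
               double inherentValue) {

    double content()      { return Math.log(reservedWords + operators + operands); }
    double breadth()      { return statementsInConstruct; }
    double reachability() { return 1 + predicatePathOperators; }

    // Scaling factors s1..s4 are passed in; their actual values are not stated here.
    double total(double s1, double s2, double s3, double s4) {
        return s1 * content() + s2 * breadth() + s3 * reachability() + s4 * inherentValue;
    }
}
```

Evaluating total(...) for every segment of a program unit, in source order, yields the bar heights that form the CPG.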

Slide 28: CPG Study

Slide 29: CPG Study: response time and error rate

Slide 30: Future Work
Continue with the NSF-sponsored empirical research...
–Other visualizations: complexity (e.g., the CPG), architecture (e.g., object diagrams)
–Other reading techniques: source code folding
–Industrial case studies and experiments: data collection framework
–jGRASP

Slide 31: GRASP Distribution
http://www.eng.auburn.edu/grasp

Slide 32: Data Collection Framework
–Allow the automated collection of high-resolution utilization data from GRASP sessions.
–Allow the automated collection of various software measures.
We want to be able to capture how the user was interacting with the tool and its visualizations at a given point in time, as well as to quantify the nature of the software with which they were working.
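One plausible shape for such a framework is a timestamped event log fed by the editor's UI listeners, combined with periodic snapshots of software measures. The sketch below is only an illustration of that idea, not the actual GRASP/jGRASP implementation; every class, method, and event name is hypothetical.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.time.Instant;

// Hypothetical session-data collector: UI components would call record()
// whenever the user toggles a view, folds code, opens a file, etc., so the
// log captures how the tool and its visualizations were used over time.
final class SessionLogger {
    private final Path logFile;

    SessionLogger(Path logFile) { this.logFile = logFile; }

    void record(String event, String detail) {
        String line = Instant.now() + "\t" + event + "\t" + detail + "\n";
        try {
            Files.writeString(logFile, line,
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        } catch (IOException e) {
            // Logging must never disrupt the editing session; drop the event.
        }
    }
}

// Example calls a (hypothetical) editor might make:
//   logger.record("CSD_TOGGLED", "on");
//   logger.record("FILE_OPENED", "Scheduler.java");
//   logger.record("CPG_VIEWED", "unit=main");
```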

