1 Lessons Learned from Empirical Studies @ IESE
Dieter Rombach
ISERN WS 2005, Noosa Heads, 14 November 2005

2 Background
Fraunhofer IESE (Institute for Experimental Software Engineering)
IESE Mission Statement
- Provide innovative and value-adding customer solutions with measurable effects
- Advance the state of the art in software & system engineering
- Promote the importance of empirically based software & system engineering
Reality
- Constant struggle to convince researchers that empirical investigation is NOT a "sometimes" activity, but a constant driver of research
  - All research areas have to be augmented with empirical results & hypotheses for future research
  - Both from a company-customer & research perspective

3 Lessons learned (1 of 7): Education & Coaching
Continuous education (and coaching) is necessary
- Non-software engineer → software engineer conversion is hard
- Software engineer → empirical software engineer is "super hard"
- Regular class on "Empirical Model Building & Methods" taught @ university
  - Mandatory for all new IESE employees
  - Mandatory for all students in the CS PhD program
- Guidelines for project set-up & touch-down
- Clearinghouse for empirical studies within IESE
  - Definition & design of study
  - Pre-publication review

4 Lessons learned (2 of 7): Empirical Research Motivation (PhD)
- Problem definition (e.g., cycle time too long)
- Feasibility analysis (e.g., do the questions address typical requirements defect classes?)
- Research proposal (e.g., develop PBR for requirements to increase effectiveness)
- Research (e.g., develop PBR for requirements)
- Research testing (e.g., does PBR find more defects than method X? See the sketch after this list)
- Problem testing (e.g., does PBR, integrated into a lifecycle model, contribute to reduced cycle time?)
  - Not necessary for completely automated processes (e.g., test data generation)
  - Always necessary in SE to check user acceptability
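As a hedged illustration of the "research testing" step (not part of the original slide), the following Python sketch shows one way the question "does PBR find more defects than method X?" might be tested. The per-subject defect counts and the choice of a one-sided Mann-Whitney U test are illustrative assumptions, not data or procedures from any IESE study.

```python
# Sketch only: comparing defects found with PBR vs. a baseline technique.
# All numbers below are invented for illustration.
from scipy.stats import mannwhitneyu

pbr_defects = [14, 11, 16, 13, 12, 15]    # defects found per subject using PBR (hypothetical)
baseline_defects = [9, 12, 10, 8, 11, 9]  # defects found per subject using method X (hypothetical)

# One-sided test of H1: "PBR finds more defects than method X"
stat, p_value = mannwhitneyu(pbr_defects, baseline_defects, alternative="greater")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("PBR found significantly more defects in this hypothetical sample.")
else:
    print("No significant difference at the 0.05 level in this hypothetical sample.")
```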

5 Lessons learned (3 of 7): Self-Confidence
Not every paper has to contain 2 pages of justification of empirical research!
Present results directly
- Most results chapters can only be understood after a major "reverse engineering" effort
- Authors are afraid to say "we observed that method A finds more defects of type X than method B" (because it is only true in context)
  - This is not a plea for unjustified generalization
  - It is a plea for top-down presentation (results first, then constraints)
Provide better integration into the state of research (aggregation)
- Otherwise replications get rejected by the community, because the variation is hidden!
Include stronger "new" hypotheses in sections on "future directions"

6 Lessons learned (4 of 7): Context for Cross-Analysis
To explain inconsistencies between replicated studies, we need to document MORE than one's own study scope (dependent & independent variables). But what exactly?
Best addressed within an organization which defines the scope (experiment line) of its studies
- IESE: defined by company needs (e.g., in the automotive domain: objectives, major organizational, project & technology variations)
- ISERN community???
f(M, C1) ≠ g(M, C1) (see the note below)
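One way to read the slide's formula (my formalization, not from the original): the results f(M, C1) and g(M, C1) that two studies report for the same method M in nominally the same context C1 can differ whenever the effect also depends on context factors outside the documented study scope. A minimal LaTeX sketch of this reading, with assumed notation (h for the true effect function, U_s for the undocumented factors of study s):

```latex
% Assumed notation: the result observed by study s depends on the method M,
% the documented context C, and undocumented context factors U_s.
\[
  \mathrm{result}_s(M, C) = h(M, C, U_s)
  \qquad\Longrightarrow\qquad
  f(M, C_1) \neq g(M, C_1) \ \text{is possible whenever } U_f \neq U_g .
\]
```

Documenting more of the context (the "experiment line" scope above) shrinks the undocumented part and makes such inconsistencies explainable.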

7 Lessons learned (5 of 7): Documentation
Clear standard to be followed @ IESE (a hypothetical structured sketch follows below)
- Based on the "Wohlin" book
- Minimal documentation requirements
  - Goals (method & usage for customers)
  - Variables (& IESE context)
  - Design
  - Data characterization & analysis
  - Data interpretation
  - Validity
  - Usage (trustability, sharing, new hypotheses)
- Context provided by the "IESE Business Analysis"
Adherence coached & approved by the Clearinghouse!
Top Challenges
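To make the minimal documentation requirements above concrete, here is a small Python sketch of how such a study record could be captured in structured form. This is not the IESE template; the class and field names are assumptions chosen to mirror the bullet list.

```python
# Illustrative sketch of a minimal empirical-study record covering the
# documentation requirements listed above. Field names are invented.
from dataclasses import dataclass, field

@dataclass
class StudyRecord:
    goals: str                    # method under study & intended usage for customers
    variables: dict               # independent/dependent variables & context
    design: str                   # experimental design
    data_analysis: str            # data characterization & analysis procedure
    interpretation: str           # interpretation of the data
    validity: list = field(default_factory=list)  # threats to validity
    usage: str = ""               # trustability, sharing, new hypotheses

# Hypothetical example record for a requirements-inspection study
example = StudyRecord(
    goals="Evaluate PBR for requirements inspections",
    variables={"independent": ["reading technique"], "dependent": ["defects found"]},
    design="between-subjects, randomized assignment",
    data_analysis="descriptive statistics plus a one-sided Mann-Whitney U test",
    interpretation="PBR found more defects in this specific context",
    validity=["small sample size", "student subjects"],
    usage="share via the clearinghouse; derive hypotheses for replication",
)
print(example.goals)
```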

8 Lessons learned (6 of 7): Industrial Interest
Increasing due to major challenges
- Ultra-large systems (e.g., automotive industry)
- Dependability (e.g., unexpected feature interactions in large systems)
- Distributed development (across different cultures)
- High risk of technology adoption (e.g., product lines)
Controlled prototype development (industrial case studies ≠ controlled experiments!)
- Highest possible realism (company products, company staff)
- Joint evaluation projects ("Research Labs")
- Leads to a business case for the company decision
Examples
- Bosch, Ricoh, ... (software product lines)
- Bosch (automated testing): together with J. Poore
Possibility for joint collaboration (see J. Poore on the Bosch testing study)

9 Lessons learned (7 of 7): Grant Lobbying
Empirical research MUST be part of SE funding programs
Success in the German Federal Program "SE 2006"
- All projects must perform evaluation
- All results have to be submitted to the national SE portal (VSEK), managed by IESE and some other Fraunhofer institutes
Plan to establish a "Software Engineering for Embedded Systems" network across other business domains within Fraunhofer (Lead: P. Liggesmeyer)
- Materials
- Microelectronics
- Production technology
- GRID

