Can We Trust Computers? CS 301 (Spring 2007) Mark Luntzel, Niel Ngyuen, and James Cheng.

1 Can We Trust Computers? CS 301 (Spring 2007) Mark Luntzel, Niel Ngyuen, and James Cheng

2 Facts on Computer Errors
- Error-free software is not possible (e.g., the F-22 software glitch, Space Shuttle software).
- Errors are often caused by more than one factor.
- Exhaustive, comprehensive testing is lacking.
- Errors can be reduced by following good procedures and professional practices (e.g., the Denver airport baggage system).

3 The Roles of People in Computer-Related Problems
- Computer user: at home or work, users should understand the limitations of computers and the need for proper training and responsible use (but often do not).
- Computer professional: understanding the source and consequences of computer failures is valuable when buying, developing, or managing a complex system.
- Educated member of society: personal decisions and political, social, and ethical decisions depend on understanding computer risks.

4 Types of Problems and Failures
- Problems for individuals
- System failures
- Safety-critical applications

5 What Can Go Wrong? Problems for Individuals
- Billing errors: lack of tests for inconsistencies and inappropriate amounts.
- Database accuracy problems: incorrect information resulting in wrongful treatment or acts.

6 Problems for Individuals: Causes
- Large population (even a small error rate affects many people).
- Human common sense is not part of automated processing.
- Overconfidence in the accuracy of data from a computer.
- Errors in data entry.
- Information not updated or corrected.
- Lack of accountability for errors.
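The billing and data-quality causes above are exactly the kind of thing simple software checks can catch. A minimal, hypothetical sketch of an inconsistency test (field names and thresholds are invented for illustration):

```python
# Hypothetical sanity checks a billing system could run before sending a bill.
# Thresholds and parameter names are illustrative, not from the slides.

def validate_bill(amount, prior_amounts, max_reasonable=10_000.0):
    """Return a list of warnings for an implausible bill amount."""
    warnings = []
    if amount < 0:
        warnings.append("negative amount")
    if amount > max_reasonable:
        warnings.append("exceeds absolute cap")
    if prior_amounts:
        avg = sum(prior_amounts) / len(prior_amounts)
        if avg > 0 and amount > 10 * avg:
            warnings.append("more than 10x the customer's average")
    return warnings

print(validate_bill(25_000.0, [120.0, 135.0, 110.0]))
# → ['exceeds absolute cap', "more than 10x the customer's average"]
```

A bill that passes such checks can still be wrong, but tests like these stop the most egregious errors before they reach a customer.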

7 System Failures
- Communications
- Business
- Transportation

8 Safety-Critical Applications
- Military, power plants, aircraft, trains, automated factories, medicine.
Problem causes:
- Overconfidence
- Lack of override features
- Insufficient testing
- System complexity
- Mismanagement

9 Therac-25
The Therac-25 was a software-controlled radiation-therapy machine used to treat cancer patients. It delivered massive overdoses of radiation: a normal dose is 100–200 rads, but an estimated 13,000 to 25,000 rads were given to six people. Three of the six died.
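The scale of the overdose follows directly from the slide's own numbers:

```python
# Arithmetic from the slide: normal dose 100-200 rads,
# estimated overdoses of 13,000-25,000 rads.
normal_low, normal_high = 100, 200
overdose_low, overdose_high = 13_000, 25_000

# Best case: smallest overdose vs. largest normal dose;
# worst case: largest overdose vs. smallest normal dose.
print(overdose_low // normal_high, "to",
      overdose_high // normal_low, "times the normal dose")
# → 65 to 250 times the normal dose
```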

10 - Intermission - (Next Up: Niel Ngyuen)

11 General Reasons for Computer Failures
- The task being done is inherently complex and difficult.
- The task is often done poorly.

12 Complexity
Many computer systems are very large and composed of many interconnected subsystems. Software programs can run to thousands or even millions of lines of code.

13 Poor Performance
- Interaction with physical devices that do not work as expected.
- Incompatibility of software and hardware, or of application software and the OS.
- Management problems, including business and/or political pressure to get a product out quickly.
- Inadequate attention to potential safety risks.
- Not planning and designing for unexpected inputs or circumstances.

14 Poor Performance (continued)
- Insufficient testing.
- Reuse of software from another system without adequate checking.
- Overconfidence in software.
- Carelessness.
- Misrepresentation; hiding problems; inadequate response when problems are reported.

15 Poor Performance (continued)
Problems with management of the use of a system:
- Data-entry errors.
- Inadequate training of users.
- Errors in interpreting results or output.
- Overconfidence in software by users.
- Insufficient planning for failures; no backup systems or procedures.
- Lack of market or legal incentive to do a better job.

16 Key Issues
- Overconfidence: the risk of failure is often underestimated (Therac-25, the Challenger, etc.).
- Reuse of software without adequate testing.

17 Professional Techniques
- Software engineering and professional responsibility
- User interfaces and human factors
- Redundancy and self-checking
- Testing
- Taking responsibility
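One of the techniques listed, redundancy and self-checking, can be sketched as majority voting over independently computed results. This is a toy illustration; the three "implementations" are invented stand-ins for independently developed versions of the same calculation:

```python
from collections import Counter

def vote(results):
    """Majority vote over redundant computations; None if no majority."""
    counts = Counter(results)
    value, n = counts.most_common(1)[0]
    return value if n > len(results) / 2 else None

# Three hypothetical independent implementations of the same calculation:
def impl_a(x): return x * x
def impl_b(x): return x ** 2
def impl_c(x): return x * x + 1   # deliberately faulty version

print(vote([impl_a(4), impl_b(4), impl_c(4)]))  # → 16: the faulty minority is outvoted
```

The design trade-off is cost: redundant versions must be developed independently (shared assumptions produce shared bugs), which is why the approach is mostly reserved for safety-critical systems.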

18 Law and Regulation
- The Uniform Computer Information Transactions Act (UCITA) accepts agreements as binding contracts, letting software sellers continue to sell products with known bugs.
- The FDA has regulated drugs and medical devices for decades.
- Professional licensing.

19 Weaknesses of Law and Regulation
- The approval process is extremely expensive and time-consuming.
- Regulations requiring specific procedures or materials discourage or prevent the use of newer and better ones that were not anticipated by the people who wrote the rules.
- The goal of regulation tends to get lost in the details of the required paperwork.
- The approval process is affected by political concerns, including influence by competitors and the incentive to be overcautious.

20 Failure Perspective
- Billing and banking: what is an acceptable success rate (99% or 99.9%)?
- Complex systems: should we retain some degree of human control, or rely heavily on computer systems to make decisions in critical tasks?
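The difference between those success rates is easier to judge when scaled to a transaction volume; the figure of 1,000,000 transactions per day is assumed here purely for illustration:

```python
# Illustrative arithmetic: assumed volume of 1,000,000 transactions per day.
transactions = 1_000_000
for success_rate in (0.99, 0.999, 0.9999):
    failures = transactions * (1 - success_rate)
    print(f"{success_rate:.2%} success -> {failures:,.0f} failed transactions/day")
```

At this volume, "99% correct" still means ten thousand wrong bills every day, which is why each additional nine matters.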

21 Dependency on Computers
Computers offer convenience and productivity. Tragic breakdowns in computer systems remind us both how dependent we have become on them and how well they usually work.

22 Risk and Progress
- Most new technologies are not safe when first introduced; risk factors must be carefully assessed.
- Progress must be made to correct past mistakes.
- More training must be offered for operating complex systems.
- Over the years, engineers have developed techniques and procedures to increase safety; software developers need to learn to apply these methods to software.

23 Solving Problems
- Correctly identify the source of problems.
- Avoid blaming technology and computer systems for problems where they are irrelevant.
- Make a clear distinction between non-computer-related and computer-related problems so that we can improve on the latter.

24 - Intermission - (Next Up: James Cheng)

25 Models/Simulations
- Criteria for evaluating models
- Success: car crash-analysis programs
- Controversial: climate models/global warming

26 What Are Models?
- Data and equations describing or simulating systems.
- Physical and non-physical systems.
- Limitations on modeling: simplification of reality.
- Model validity/accuracy.

27 Examples of Computer Modeling
Physical and non-physical systems:
- Car crash analysis
- Climate change
- Population growth
- Effects of fiscal policies on the economy
- And more…

28 Limitations on Modeling
Simplification of reality:
- Computational resources are limited.
- Not all "rules" are known or used.
- It is difficult to numerically quantify everything.

29 Model Validity/Accuracy
- How closely does the model mimic the underlying science?
- Which simplifications were chosen?
- Is the data complete?
- How closely does the simulation predict reality?
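The last question, how closely a simulation predicts reality, is commonly quantified with an error metric such as root-mean-square error. A sketch with invented numbers (not real climate or crash data):

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error between model output and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(observed))

# Hypothetical yearly temperature anomalies (degrees C), purely illustrative:
model   = [0.10, 0.15, 0.22, 0.30]
reality = [0.12, 0.14, 0.25, 0.28]
print(round(rmse(model, reality), 3))  # → 0.021
```

A small RMSE against past observations builds confidence in a model, but it does not by itself guarantee that future predictions will hold.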

30 Car Crash-Analysis Programs (Part 1)
- Example: DYNA3D (Lawrence Livermore National Laboratory).
- Divide the car into a grid of 10,000–50,000 elements.
- Each element has material properties.
- A 40–100 ms crash simulation took approximately 35 hours of supercomputer time (c. 1990).
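The grid-of-elements idea can be illustrated in a vastly simplified one-dimensional form. Real crash codes integrate tens of thousands of 3D finite elements; this toy version uses a single mass and made-up material constants, and only shows the time-stepping principle:

```python
# Toy 1D "crash": a point mass with spring-damper crush behavior hits a
# rigid wall. All constants are illustrative, not real vehicle data.

mass, k, c = 1000.0, 5.0e5, 2.0e3   # kg, N/m, N*s/m (made up)
x, v, dt = 0.0, 15.0, 1e-4          # crush depth (m), speed (m/s), step (s)
max_crush = 0.0
for _ in range(5000):               # 0.5 s of simulated time
    force = -k * x - c * v          # reaction force while crushing
    a = force / mass
    v += a * dt                     # explicit (Euler) time integration
    x += v * dt
    max_crush = max(max_crush, x)
print(f"peak crush depth: {max_crush:.2f} m")
```

Scaling this principle up, many elements, realistic material laws, three dimensions, is what consumed the 35 hours of supercomputer time mentioned above.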

31 Car Crash-Analysis Programs (Part 2)
Crash-analysis programs' efficacy:
- A real crash test costs $50,000–$800,000.
- Simulation results closely correspond to actual crash tests.
- This success led to other impact modeling/simulation work.

32 Car Crash-Analysis Programs (Part 3)
Success with crash analysis led to other simulation work:
- Dropping hazardous-waste containers.
- Airplane nacelle/windshield collisions with birds.
- Airbag deployment prediction.
- Forecasting earthquakes' effects on structures.

33 Climate Models/Global Warming
Background info:
- Global temperature started rising in the 1970s.
- Sharp increase in CO2 and methane since the 1950s.
- CO2 and methane have been increasing since 16,000 years ago.
- IPCC 2001: +0.74°C in the last century.

34 Climate's Coupled Models
General circulation models + oceanic models.
- GCMs were originally developed for weather prediction.
- Inputs: the sun's energy output, earth's orbit, topography, sea/ice surfaces, and more.
- Predictions: atmospheric temperature, solar/radiant energy in and out, precipitation, etc.
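The input-to-prediction mapping can be illustrated with the simplest possible relative of a GCM: a zero-dimensional energy-balance model that takes only solar output and albedo and returns a mean temperature. (No greenhouse effect is modeled, which is why the answer comes out cold; the constants are standard physical values, not from the slides.)

```python
# Zero-dimensional energy-balance model: absorbed solar = emitted infrared.
# Same kind of inputs a GCM takes (solar output, surface reflectivity),
# reduced to two numbers. The greenhouse effect is deliberately omitted.

SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_temp(solar_constant=1361.0, albedo=0.30):
    """Mean surface temperature (K) solving S(1-a)/4 = sigma * T^4."""
    absorbed = solar_constant * (1 - albedo) / 4
    return (absorbed / SIGMA) ** 0.25

print(round(equilibrium_temp(), 1))  # → 254.6 (K); ~33 K colder than observed,
                                     # the gap being the greenhouse effect
```

A real GCM replaces these two numbers with 3D fields on a global grid and steps the physics forward in time, but the underlying bookkeeping, energy in versus energy out, is the same.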

35 Climate Model Accuracy (Part 1)
- The science? Not completely known (e.g., cloud formation is not fully understood).
- The simplifications used? Pretty extreme (e.g., grid points spaced ~200 km apart).
- Accuracy? Mixed results; more on this on the next slides…

36 Climate Models' Accuracy (Part 2)
- Accurate: consensus on continued increase of temperature and sea level.
- Semi-accurate: +0.33°C mean surface temperature change (the upper bound of IPCC 2001).

37 Climate Models' Accuracy (Part 3)
- Inaccurate: predicted troposphere warming did not occur.
- Inaccurate: Science magazine reports sea level rising 3.3 mm per year since 1993 (50% higher than IPCC 2001 projected).

38 Climate Models' Accuracy (Part 4)
- Inaccurate: Greenland Ice Sheet melt rate.
- IPCC 2001: -44 ± 53 Gt/yr.
- 2006 estimate (from U.S. satellites): -239 km³/year (239 km³ ≈ 239 Gt of ice, approximately).
- Side note: melting of the entire ice sheet is predicted to mean a 6.5 m (21 ft) rise in sea level.
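The km³-to-Gt conversion on this slide treats ice as if it had water's density (1 km³ of water is exactly 1 Gt). Using ice's actual density, a standard value of about 0.917 g/cm³ that does not appear on the slide, gives a slightly lower mass:

```python
# Converting ice volume (km^3) to mass (Gt).
# 1 km^3 of water = 1 Gt exactly; ice density ~0.917 g/cm^3 (standard value).

def ice_volume_to_gt(volume_km3, ice_density=0.917):
    return volume_km3 * ice_density

print(round(ice_volume_to_gt(239), 1))   # → 219.2 Gt, vs. the slide's rough 239 Gt
```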

39 IPCC 2001: General Predictions
- Human activities are likely the primary cause of warming. (Note: the IPCC 2007 summary upgraded the language to "very likely," i.e., a 90%+ confidence level.)
- Continued increase in temperature.

40 - Intermission - (Next Up: Questions for Discussion)

41 Questions for Discussion
Q1: Electronic voting machines? Good? Bad? Run quickly?
Q2: How have computer errors affected you?
Q3: Given all the inaccuracy and uncertainty in climate modeling, should climate models be the basis for policy decisions? Or should they be just another consideration, like any special interest?

