Presentation on theme: "Survey Results. Class Year Prior Programming Experience."— Presentation transcript:

1 Survey Results

2 Class Year

3 Prior Programming Experience

4 Object Oriented Experience

5 Java AWT/Swing Experience

6 How students feel about 15 so far

7 Hours spent per week

8 TA Hours
Length of lines
o hours were a zoo the same week the survey was released; we have never seen lines so long, and we do not expect them to be that bad again
o even though we have 150 hours of help a week, lines were still too long
Socratic Method
o TAs are carefully trained during TA camp; they are here to support, not to tutor
o they will NOT give you the answers or tell you whether you are correct

9 TA Hours
What we'll fix
o due dates for the last few assignments fell on weekends, when fewer hours were held; for future programming assignments this will not be the case
o we will change hours for the final projects and will increase them around final-project deadlines
What you'll fix
o start early, start now, start yesterday: there was very little wait the first week of Doodle Jump
o hours this week have also been almost empty, so start Tetris while it is easy to get your questions answered
o come prepared with a specific question or bug, so the TA can work with you effectively

10 Grading
Timeliness
o projects are graded and returned before the early hand-in of the next assignment
o DQs are returned the night they are turned in, so students can start work right away
o grading one project for one student takes about an hour overall (the time a TA spends going through the code, plus discussion/clarification with HTAs/other TAs)
DQs
o due Sundays at 2pm, so TAs can grade them immediately during the weekly Sunday meeting
o these are due BEFORE the help session because students are expected to have thought about and worked on the design before the correct answers are explained in the session

11 Grading
Consistency
o all TAs work off a common rubric; most work is graded during a weekly meeting with all TAs to ensure we are consistent
o as the course progresses, we will get stricter about things like design and style
o you can always appeal grading complaints to the HTAs and/or Andy

12 Introducing… Maps (1/3)
●Maps are used to store (key, value) pairs: a key is used to look up its corresponding value
●(Word, Definition) in a dictionary
●(Brown ID, Person) in Banner
●(Name, Phone #) in a contacts list
●(Identifier, Memory address) in a compiler, called a symbol table
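As a concrete illustration of the (key, value) idea above, here is a minimal sketch (not from the slides; the words and definitions are made up) of a dictionary built on `java.util.HashMap`:

```java
import java.util.HashMap;

public class DictionaryDemo {
    public static void main(String[] args) {
        // (Word, Definition) pairs, as in a dictionary
        HashMap<String, String> dictionary = new HashMap<String, String>();
        dictionary.put("map", "a structure that stores (key, value) pairs");
        dictionary.put("key", "the handle used to look up a value");

        // the key is used to look up its corresponding value
        System.out.println(dictionary.get("map"));
        // looking up a key that was never inserted yields null
        System.out.println(dictionary.get("phoneme"));
    }
}
```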

13 Introducing… Maps (2/3)
●In Java, use the java.util.HashMap class
●In general, this structure is often called a hash table
●Other classes that accomplish this goal include TreeMap, Hashtable, LinkedHashMap, and more
o each has its own advantages and drawbacks
o we will focus on HashMap
●HashMaps have constant-time insert, removal, and search; we will explain why shortly

14 HashMap Syntax
●Like other data structures, we need to specify the type of the elements we put in
●However, this time we need to specify the type of both the key AND the value
●The key and value can be instances of any class
new HashMap<KeyType, ValueType>();

15 HashMap Syntax
●If we wanted to map an Integer to its String representation:
HashMap<Integer, String> intTable = new HashMap<Integer, String>();
●If we wanted to map a TA to his/her Birthday:
HashMap<TA, Birthday> birthdayTable = new HashMap<TA, Birthday>();
●In all cases, both key and value type must resolve to a class
●Note: can't use <int, boolean> because both int and boolean are primitives, not classes
o a primitive type cannot be used as a generic parameter, so use the built-in class that is equivalent to that primitive (its wrapper class)
●Instead use <Integer, Boolean>
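A quick sketch (added for illustration) of the wrapper-class point above: since `<int, boolean>` is not allowed, `<Integer, Boolean>` is used instead, and Java's autoboxing converts between the primitive and wrapper forms automatically:

```java
import java.util.HashMap;

public class WrapperDemo {
    public static void main(String[] args) {
        // int and boolean cannot be generic parameters,
        // so we use the wrapper classes Integer and Boolean
        HashMap<Integer, Boolean> isEven = new HashMap<Integer, Boolean>();
        for (int i = 0; i < 5; i++) {
            // autoboxing: int -> Integer, boolean -> Boolean
            isEven.put(i, i % 2 == 0);
        }
        System.out.println(isEven.get(4)); // prints "true"
        System.out.println(isEven.get(3)); // prints "false"
    }
}
```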

16 java.util.HashMap Methods (1/2)
//K refers to the type of the key, V to the type of the value.

//adds the specified (key, value) pair to the table
public V put(K key, V value)

//returns the value to which the specified key is mapped, or null
//if the map contains no mapping for the key
//Note on parameter type: Java accepts any Object, but you should
//supply the same type as the key
public V get(Object key)

//returns the number of keys in this hashtable
public int size()

17 java.util.HashMap Methods (2/2)
//Note on parameter type: Java accepts any Object, but you
//should supply the same type as either the key or the value

//tests if the specified object is a key in this hashtable
public boolean containsKey(Object key)

//returns true if the hashtable maps at least one key to this value
public boolean containsValue(Object value)

//removes the key and its corresponding value from the hashtable;
//returns the value the key mapped to, or null if the key had no mapping
public V remove(Object key)

//More methods in the JavaDocs
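The methods on these two slides can be exercised in one short, self-contained sketch; the names and logins below are made up for illustration:

```java
import java.util.HashMap;

public class MethodsDemo {
    public static void main(String[] args) {
        HashMap<String, String> logins = new HashMap<String, String>();
        logins.put("Alice", "ak"); // put returns the previous value (null here)
        logins.put("Bob", "bb");

        System.out.println(logins.size());              // prints "2"
        System.out.println(logins.containsKey("Bob"));  // prints "true"
        System.out.println(logins.containsValue("zz")); // prints "false"
        System.out.println(logins.get("Carol"));        // prints "null" - no mapping

        String removed = logins.remove("Bob");          // returns the removed value
        System.out.println(removed);                    // prints "bb"
        System.out.println(logins.size());              // prints "1"
    }
}
```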

18 Finding out your friends' logins (1/4)
●Given an array of CS students, each with the properties "csLogin" and "real name", how might you efficiently find out your friends' logins?
●Givens
o String[] _friends, an array of your 30 friends' names
o CSStudent[] _students, an array of students

19 Finding out your friends' logins (2/4)
●Old approach:
for (int i = 0; i < _friends.length; i++) { //for all friends
    for (int j = 0; j < _students.length; j++) { //for all students
        if (_friends[i].equals(_students[j].getName())) {
            String login = _students[j].getLogin();
            System.out.println(_friends[i] + "'s login is " + login + "!");
        }
    }
}
●Note: use the String class's equals() method because "==" checks for equality of reference, not of content
●This is O(n^2), far from optimal

20 Finding out your friends' logins (3/4)
●An approach using a HashMap:
o key is the name
o value is the login
o use the name to look up the login!

21 Finding out your friends' logins (4/4)
●An approach using a HashMap:
HashMap<String, String> myTable = new HashMap<String, String>();
for (CSStudent student : _students) {
    myTable.put(student.getName(), student.getLogin());
}
for (String friendName : _friends) {
    String login = myTable.get(friendName);
    if (login == null) {
        System.out.println("No login found for " + friendName);
        continue;
    }
    System.out.println(friendName + "'s login is " + login + "!");
}
●What's the runtime now?
●O(n), because each insert and search is O(1); much better!

22 Counting frequency in an Array (1/4)
●How many times does a given word show up in a given body of text?
●Givens
o String[] _book, an array of Strings containing many words
o String _searchTerm, the String you're looking for

23 Counting frequency in an Array (2/4)
int wordCounter = 0;
for (String word : _book) {
    if (word.equals(_searchTerm)) {
        wordCounter++;
    }
}
System.out.println(_searchTerm + " appears " + wordCounter + " times");

24 Counting frequency in an Array (3/4)
●When tracking one word, the code is simple
●But what if we wanted to keep track of 5 words? 100?
●Should we make an instance variable to count the frequency of each word?
●Should we iterate through _book once for each search term? Sounds like O(n^2)...

25 Counting frequency in an Array (4/4)
HashMap<String, Integer> counter = new HashMap<String, Integer>();
for (String currWord : _book) {
    if (counter.containsKey(currWord)) {
        Integer count = counter.get(currWord);
        counter.remove(currWord);
        count++;
        counter.put(currWord, count);
    } else { //first time seeing word
        counter.put(currWord, 1);
    }
}

//_searchTerms is an array of Strings we're counting
for (String word : _searchTerms) {
    Integer freq = counter.get(word);
    if (freq == null) {
        freq = 0;
    }
    System.out.println(word + " shows up " + freq + " times!");
}
●Despite the increase in search terms, this is still O(n)
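A side note on the code above: put() overwrites any existing mapping, so the remove() call is not strictly needed. A more compact variant (an addition, not from the slides; it uses getOrDefault, which requires Java 8 or later):

```java
import java.util.HashMap;

public class WordCount {
    // count how many times each word appears; put() overwrites any
    // old mapping, so no remove() is needed before re-inserting
    public static HashMap<String, Integer> count(String[] book) {
        HashMap<String, Integer> counter = new HashMap<String, Integer>();
        for (String word : book) {
            counter.put(word, counter.getOrDefault(word, 0) + 1);
        }
        return counter;
    }

    public static void main(String[] args) {
        String[] book = {"the", "cat", "sat", "on", "the", "mat"};
        System.out.println(count(book).get("the")); // prints "2"
    }
}
```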

26 Map Implementation (1/5)
●How do we implement a Map with constant-time insertion, removal, and search?
●In essence, we are searching through a data structure for the value associated with the key
o similar to the searching problem we have been trying to optimize
●Data structures we have so far:
o runtime to search an unsorted array is O(n)
o searching a sorted array using binary search is O(log n)
o using a binary search tree, search is also O(log n), but we get faster insertion and removal
o can we do better than a binary search tree?

27 Map Implementation (2/5)
●How about a ternary search tree (each node has at most 3 children)?
o O(log_3 n)
●Or a 10-way tree, with O(log_10 n)?
●Let's compare the runtime for a search over 1,000,000 nodes
o log_10 1,000,000 = 6
o log_2 1,000,000 < 20, so the wider tree is shallower but broader
●Analysis: the logs are not sufficiently different, and the comparison at each node (basically an n-way nested if-else-if) is far more time consuming, hence not worth it

28 Map Implementation (3/5)
●Try a radically different approach, using an array
●What if we could directly use the key as an index to access the appropriate spot in the array?
●Remember: digits, alphanumerics, symbols, even control characters are all stored as bit strings; "it's bits all the way down…"
o see the ASCII table
o bit strings can be interpreted as binary numbers that can be used to index into an array

29 Map Implementation (4/5)
●But creating an array to look up CS15 students (value) based on some ID number (key) would be a tremendous waste of space
o if the ID number is one letter followed by five digits (e.g., D00011), there are 26 * 10^5 combinations!
o we do not want to allocate 2,600,000 words for no more than 300 students (1 word = 4 bytes)
o the array would be terribly sparse…
●What about using social security numbers?
o we would need to allocate 10^9 words, about 4 gigabytes, for no more than 300 students! And think about arbitrary names of up to 30 characters: we would need 26^30 entries!!
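The space arithmetic on this slide can be checked with a few lines of Java (added for illustration):

```java
public class SparseArrayMath {
    public static void main(String[] args) {
        // one letter followed by five digits: 26 * 10^5 combinations
        long idCombinations = 26L * 100_000L;
        System.out.println(idCombinations);                  // prints "2600000"

        // at 1 word = 4 bytes, that ID-indexed array would cost:
        System.out.println(idCombinations * 4L + " bytes");  // prints "10400000 bytes"

        // a social-security-number key needs 10^9 words = ~4 GB
        System.out.println(1_000_000_000L * 4L);             // prints "4000000000"
    }
}
```

All of that to hold at most 300 students, which is why the array would be terribly sparse.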

30 Map Implementation (5/5)
●Thus, two major problems:
o how can we deal with arbitrarily long keys, both numeric and alphanumeric?
o how can we build a small, dense (i.e., space-efficient) array that we can index into to find keys and values?
●Impossible?
●No, we approximate

31 Hashing
●How do we approximate?
o we use hashing
o hashing refers to computing an array index from an arbitrarily large key using a hash function
o the hash function takes in a key and returns an index into the array
●The index leads to a simple value or an entire object
●Therefore, a two-step process:
o hash to create an index, then use the index to get the value
[diagram: key → hash function → index → value in array]

32 Hashing
●The array used in hashing typically holds several hundred to several thousand entries; its size is typically a prime (e.g., 1051)
o an array of links to instances of the class TA
[diagram: Hash('Greg') = 0, Hash('Ardra') = 1, Hash('Sonia') = 4; the array slots for Greg, Ardra, and Sonia point to TA instances, other slots up to index N - 1 are null]

33 Hash Functions
●An example of a hash function for alphanumeric keys:
o ASCII is a bit representation that lets us represent all alphanumeric symbols as integers
o take each character in the key, convert it to an integer, and sum the integers; the sum is the index
o but what if the index is greater than the array size?
o use mod, i.e., (index % arrayLength), to ensure the final index is in bounds
●A better hash function:
o take the string, chop it into sections of 4 letters each, take the 32 bits that make up each 4-letter section and XOR them together, then mod the result by the table size
●Almost any reasonable function that uses all the bits will do, so choose a fast one, and one that distributes more or less uniformly (randomly) across the array to minimize holes!
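The first (sum-of-characters) hash function described above can be sketched in a few lines; the table size 1051 follows the earlier slide, and the key strings are arbitrary examples:

```java
public class SimpleHash {
    // convert each character to an integer, sum them, then mod by
    // the table size so the final index is in bounds
    public static int hash(String key, int tableSize) {
        int sum = 0;
        for (char c : key.toCharArray()) {
            sum += c; // a char promotes to its integer (ASCII/Unicode) value
        }
        return sum % tableSize;
    }

    public static void main(String[] args) {
        int tableSize = 1051; // a prime, as suggested on the previous slide
        System.out.println(hash("Greg", tableSize));
        // a weakness of this function: anagrams always collide,
        // since addition ignores character order
        System.out.println(hash("listen", tableSize) == hash("silent", tableSize)); // prints "true"
    }
}
```

This is one reason the "better" XOR-of-4-letter-chunks function distributes keys more uniformly: it is sensitive to where characters sit in the string, not just which ones occur.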

34 Collisions
●If we have 6,000 Brown student names that we are mapping to Banner IDs using an array of size 1051, clearly we are going to get "collisions", where different keys hash to the same index
●Does that kill the idea? No!
●Instead of having an array of type Value, we make each entry in the array a _head pointer to an overflow "bucket" for all keys that hash to that index. The bucket can be, e.g., our perennial favorite, the unsorted singly linked list, or an array, whatever…
●So, if we get a collision, the linked list holds all the values whose keys hash to that bucket

35 Collisions
●Since multiple objects will typically hash to the same bucket, for methods like get(key) and remove(key), the HashMap has to iterate through all the items in the hashed bucket to get or remove the right object
●This is O(k), where k is the length of the bucket; it will be small, so brute-force search is fine
●The best hash functions minimize collisions
●Java has its own efficient hash function, covered in CS16
●A way to think about hashing: a fast, large initial division (e.g., 1051-way), followed by a brute-force search over a small bucket; even a bucket of size 100 is fast!

36 HashMap Pseudocode
table = array of lists of some size
hash = some hash function

public put(K key, V val):
    int index = hash(key)
    table[index].addFirst(key, val)
//O(1), if hash() runs in O(1) time

public V get(K key):
    index = hash(key)
    for (k, v) in table[index]:
        if k == key:
            return v
    return null //key not found
//Runs in O(k) time, where k is the size of the bucket, usually small

Note: LinkedLists only hold one element per node, so in actual code you would need to make a class to hold both the key and the value
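The pseudocode above, including the note about needing a class to hold both key and value, can be turned into a small working Java class. This is an illustrative sketch, not the real java.util.HashMap: it uses Java's built-in hashCode() as the hash function, never resizes the table, and omits remove():

```java
import java.util.LinkedList;

public class ChainedMap<K, V> {
    // each node holds both the key and the value, as the slide's note suggests
    private static class Entry<K, V> {
        K key; V value;
        Entry(K key, V value) { this.key = key; this.value = value; }
    }

    private LinkedList<Entry<K, V>>[] table;

    @SuppressWarnings("unchecked")
    public ChainedMap(int size) {
        table = new LinkedList[size];
        for (int i = 0; i < size; i++) {
            table[i] = new LinkedList<Entry<K, V>>();
        }
    }

    // fold Java's own hashCode() into the table's range
    private int hash(K key) {
        return Math.abs(key.hashCode() % table.length);
    }

    public void put(K key, V value) {   // O(1), if hash() is O(1)
        table[hash(key)].addFirst(new Entry<K, V>(key, value));
    }

    public V get(K key) {               // O(k), where k is the bucket length
        for (Entry<K, V> e : table[hash(key)]) {
            if (e.key.equals(key)) { return e.value; }
        }
        return null; //key not found
    }

    public static void main(String[] args) {
        ChainedMap<String, String> m = new ChainedMap<String, String>(1051);
        m.put("Greg", "gd");
        System.out.println(m.get("Greg"));  // prints "gd"
        System.out.println(m.get("Ardra")); // prints "null"
    }
}
```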

37 HashMaps… efficiency for free?
●Not quite
●While the put() and get() methods run in O(1) time, each takes more time than, for example, inserting at the end of a queue
●A bit more memory-expensive (array + buckets)
●Inefficient when many collisions occur (array too small)
●But it is likely the best solution overall, if you don't need ordering
●No support for ordering
o (key, value) pairs are not stored in any logical order

38 Lecture 20 A Brief History of Computers & Programming Languages See The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution by Walter Isaacson. Copyright © 2014 by Walter Isaacson

39 In the beginning… (1/4)
●Long history of mechanical computing devices (adders) as part of the development of clocks, orreries, and elaborate music boxes and instruments
o based on cogs on wheels, with a carry mechanism
o 1645: Blaise Pascal creates the Pascaline adding machine, with up to 8 dials
o 1671: Leibniz's Calculating Machine could add, subtract, multiply, and divide
Music link: https://www.youtube.com/watch?v=P0FxZUrIB5M&index=2&list=RDtds0qoxWVss
Museum dedicated to mechanical music making:

40 In the beginning… (2/4)
●1822: British inventor Charles Babbage proposed the idea of mechanical calculation to compute polynomials (for ballistics, longitude tables); he designed but never built the Difference Engine
●He then proposed combining mechanical calculation with the idea of feeding instructions to a more powerful machine via punched cards, in the style of music boxes and the Jacquard Loom, thus designing the first (mechanical) computer, the Analytical Engine
o he first had to invent machine tools for the precise machining required, but never completed the Analytical Engine
o but the "architecture" is strikingly similar to the essence of modern computers: driven by instructions; arithmetic unit, memory, input/output
[images: Charles Babbage; the Jacquard Loom; punch cards on a Jacquard Loom]

41 Difference Engine
●A modern implementation of the Difference Engine was finally completed by the London Science Museum in 1991

42 Analytical Engine
●Babbage's son built a small part of the Analytical Engine in 1910, and the Science Museum has begun the process of building a complete version

43 In the beginning… (3/4)
●~1845: Augusta Ada Lovelace, Lord Byron's daughter, writes a program to calculate Bernoulli numbers
o the first known computer program and programmer!
o written for Babbage's Analytical Engine
o "the machine does exactly what you tell it to do"
o "the Analytical Engine weaves algebraic patterns the way the Jacquard Loom weaves flowers and leaves"
o the Ada programming language, named in her honor, was a DOD-sponsored language for writing complex programs using software engineering principles, including Abstract Data Types
[photo: Ada Lovelace; a piece of the Analytical Engine, from the Science Museum of London]

44 In the beginning… (4/4)
●In the early 1900s, specialists programmed "business machines" (note IBM's original name) by actually changing the hardware's wiring
o advanced models used plug boards; try to debug that!
●1946: J. Presper Eckert and John Mauchly at the University of Pennsylvania built the Electronic Numerical Integrator and Computer (ENIAC)
o first electronic general-purpose "automatic" computer, used for ballistics and bomb calculations
o 18,000 vacuum tubes, MTBF of 20 minutes!
o still programmed by wiring
o 5,000 adds/subtracts per second, 400 multiplies per second, 35 divisions or square roots per second (10-digit numbers)
o the men wanted to build hardware, leaving the less prestigious programming to the all-female corps of programmers, called "computers", a 19th-century term
o see forgotten-female-programmers-who-created-modern-tech

45 Stored Program Computer Architecture
●1945: John von Neumann introduces the seminal concept of the "stored program"
o "it's bits all the way down…" for both data and instructions
o a program can be stored in main memory and even treated as data to be operated on; this paved the way for modern computers
[diagram: simplified processor architecture; CPU, Memory, and I/O Interface connected by address, data, and control system buses]
●Simple instruction-execution loop
o use the instruction register to fetch the instruction stored at that address in memory, increment the instruction register, decode and execute the instruction
o an instruction is typically an opcode plus its operand address(es)
o an instruction may update memory, and may even modify the stored program itself, e.g., for loops. Unsafe: use hardware looping with index registers instead
●von Neumann was a polymath: worked on the atom and hydrogen bombs, co-invented "game theory", computational biology, etc.
●CS15's Final Project, Othello, uses the minimax algorithm from game theory

46 Moore's Law and the Shrinking Computer
●Moore's "Law" (1): an observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit doubles approximately every two years
o the fastest exponential in history
●Smaller feature size and greater density mean shorter paths and faster signal propagation in microprocessors
●We benefit not just from microminiaturization of the CPU but also from great electromechanical engineering of peripheral devices (e.g., disk controllers, disks; 40MB was a big disk in the 60s! The dual Meta-4 mini had 32KB of memory and a 1MB disk)
●Today's IBM zEC12 microprocessor (6 GHz) has about the same computing power as 5,000 football fields' worth of our IBM /360 mod 50s (.14 MIPS) of 50 years ago; mainframes are still selling!
●See mainframe-history-as-ibm-system360-turns-50-cobol-turns-55.html/
●But are silicon chips hitting a limit?
●What's next: biological computers? Quantum computers?
(1) Gordon Moore was the co-founder of Intel and the co-inventor of the integrated circuit, which led to microprocessors, etc.

47 Computers Get Ever Faster, but Do They Get More "Powerful"?
●The computer is the only Universal Machine!
●Yet theoretically you only need 6 instructions for ANY algorithm:
o load the accumulator (a high-speed register) from a memory address
o store the accumulator to a memory address
o subtract the contents of a memory address from the accumulator, storing the result in the accumulator
o jump to a memory address (instruction) if the accumulator < 0 ("conditional jump")
o read to the accumulator from an external input device
o write from the accumulator to an external output device
●From these you can build:
o add, by subtracting negative numbers
o divide, by repeated subtraction; multiply, by repeated addition
o if-then-else and loops, with the conditional jump
o output to a printer, by writing into a special memory location
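The "you can build" claims above can be demonstrated directly. This is an illustrative sketch, not machine code: real hardware would use the accumulator and conditional jumps, but the reductions are the same — add from subtract, multiply from repeated add, divide from repeated subtract:

```java
public class SixInstructionArithmetic {
    // add by subtracting a negative number (only subtract is primitive)
    static int add(int a, int b) {
        return a - (-b);
    }

    // multiply by repeated add (assumes b >= 0)
    static int multiply(int a, int b) {
        int result = 0;
        for (int i = 0; i < b; i++) {
            result = add(result, a);
        }
        return result;
    }

    // divide by repeated subtract (assumes a >= 0, b > 0)
    static int divide(int a, int b) {
        int quotient = 0;
        while (a >= b) {
            a = a - b;
            quotient = add(quotient, 1);
        }
        return quotient;
    }

    public static void main(String[] args) {
        System.out.println(add(3, 4));      // prints "7"
        System.out.println(multiply(6, 7)); // prints "42"
        System.out.println(divide(42, 6));  // prints "7"
    }
}
```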

48 Trade-offs in Power/Complexity of Instructions
●Trade-offs:
o complexity of an instruction (how much it does)
o speed of instruction execution
o size of the instruction set (and whether the compiler can take advantage of it)
●Today's computers
o Complex Instruction Set Computer (CISC): > 500 instructions; started with IBM mainframes in the 50s and 60s, now the "Intel architecture" dominates
o Reduced Instruction Set Computer (RISC): 100-300 simpler but faster instructions; a major innovation, important in the 80s and 90s
o the Intel architecture has adapted the best ideas from RISC
o the ARM architecture was also designed in accordance with RISC; used in phones, tablets, etc.
o the emphasis today is on "multi-core" (multiple CPUs per chip) and low-power designs
o GPUs (Graphics Processing Units) are even more powerful, for games but also for data crunching, e.g., scientific simulation (weather prediction, protein folding…)

49 Turing, Computability
●Alan Turing (1912 - 1954): logician, mathematician, first computer scientist
●Designed a code-breaking machine to crack the German Enigma cypher during WWII
●Formalized the notions of algorithm, computer, and mechanized computing in an era that was very concerned with what was computable and what was not, and with mechanized mathematics, e.g., undecidability, the halting problem, etc. Also started AI and the Turing test
●The Turing Machine is the simplest possible conceptual device to execute an arbitrary algorithm: a device with a writable tape and a read/write head; the logic is in a table
●The table contains the "program" of "instructions" as a "state machine": if in state i and you read 1, do x and go to a next state; if you read 0, do y and go to a next state. Simple actions:
1) move the head one square left or right
2) read/write the current cell (empty or tally mark)
3) integers are represented by an equivalent number of tally marks
●The Universal Turing Machine, which can simulate any other TM, is proof that one could build a universal "programmable" computer. MIT's AI guru Minsky showed a 43-state UTM!
●Committed suicide after being prosecuted and "treated" for being gay

50 First, Numeric Machine Language, Then Came Assembly Language (1/2)
●1949: John Mauchly develops Short Order Code
o the first assembly language
o provided the vehicle for higher-level languages to be developed
●Symbolic code that is 1:1 with machine code
o e.g., LOAD 4: load the accumulator with the contents stored at address 4
o the program translates to machine code via table lookup of the opcode and a decimal-to-binary conversion algorithm
o the assembler is an early example of treating a program as data!
[diagram: an instruction split into an opcode field (LOAD) and a memory-address field]

51 First, Numeric Machine Language, Then Came Assembly Language (2/2)
●Must be defined for each processor
o hard-wired for the particular processor's architecture
o generated by compilers for higher-level languages
●Modern processors are very complicated
o so writing at the assembly-language level takes real skill
o compilers can optimize code globally for high-level languages, using sophisticated computation graphs
o programmers generally optimize code only locally
●Still used today when speed and size count
o embedded computers, device drivers, games
o the programmer must understand the hardware well to use it effectively
o increasingly, C is used as a "machine-independent" assembly language

52 High-Level Languages
●An attempt to make programming more intuitive
o closer to the programmer's concepts (high-level)
o further from the machine's concepts (low-level)
●Symbolic code that is 1:N with machine code
o one high-level instruction may become tens or hundreds of machine-code instructions
●Most importantly, machine-independent
o avoids vendor lock-in
o depends on a compiler to translate high-level constructs to the computer's machine code
o thus allows one source program to be used on many target architectures
●We are still trying to make languages higher-level
o Java guarantees single compilation and the same execution on multiple machines via byte codes: write once, run everywhere
o compile to byte code for a virtual machine; the computer runs a virtual machine interpreter

53 High-Level Languages: Important Dates (1/2)
●1957: John Backus et al. at IBM develop the FORTRAN language and compiler
o FORmula TRANslator
o still used today, mostly for scientific computing; highly optimized for number crunching
●1959: the Committee on Data Systems Languages develops COBOL, led by Rear Admiral Grace Hopper, one of the first modern programmers (the Grace Hopper Celebration of Women in Computing is named for her)
o COmmon Business Oriented Language; "English-like", with support for data records
o still tons of legacy code in banks, insurance companies, retail… (Y2K!)
●1959: John McCarthy develops Lisp
o LISt Processing
o seen as slow, so primarily used only for "AI" projects
o Scheme is a modern Lisp-like "functional programming" language

54 High-Level Languages: Important Dates (2/2)
●1960: the ALGOL 60 standard is published
o ALGOrithmic Language
o the basis of most popular languages today
●1964: John Kemeny and Thomas Kurtz at Dartmouth develop BASIC
o Beginner's All-purpose Symbolic Instruction Code
o a simple language meant to be used by beginners and non-professionals, efficient on microcomputers
o was popularized by Microsoft's Visual BASIC, now being replaced by JavaScript

55 Structured Programming (1/2)
●1968: Edsger Dijkstra writes the landmark note "Go To Statement Considered Harmful"
o no predictability: control can go anywhere in the program
o leads to spaghetti code
o can't be understood by programmer or compiler
●New languages would have constructs for common one-in-one-out flows of control, for controlled branching
o if/else-if and switch statements
o while and for loops
o gives sequential, predictable order to code; only controlled branches allowed

56 Structured Programming (2/2)
●Brown's AM101, AM40, and CS11 (the CS15 precursors) switched to the new style in the late 60s
o taught PL/I, then PL/C, using the structured-programming style
o then even taught "structured assembly", based on positive experiences
o switched to Pascal as a more modern alternative
o see, we have a rich history of experimentation!
●Too much commercial legacy code was spaghetti!

57 Next Generation: High-Level Procedural Languages
●Emphasize task decomposition; no bundling of data and procedures into objects
●1964: researchers at IBM develop PL/I
o Programming Language I
o designed to synthesize the best features of FORTRAN, COBOL, and ALGOL 60
o failed in its attempt to be the one general-purpose programming language
●1970: Niklaus Wirth develops Pascal
o named for Blaise Pascal; meant to be an educational language
●1972: Dennis Ritchie at Bell Labs develops C
o its predecessor was named B
o often called a portable assembly language
o surpassed COBOL as the most popular language

58 Even OOPLs Are Relatively Old
●1967: Ole-Johan Dahl and Kristen Nygaard at the Norwegian Computing Centre develop Simula (SIMUlation LAnguage), the first OO programming language, introducing classes
●1972: Alan Kay, Adele Goldberg, et al. at Xerox PARC develop Smalltalk and the windows metaphor/GUI
●1972: Barbara Liskov at MIT develops CLU, with a focus on ADTs (next slide)
●1980: the US Department of Defense develops Ada to combat the plethora of languages
o ADTs, objects, concurrency…
●1983: Bjarne Stroustrup develops C++
o OO extensions to the popular C language; named C++ as a play on the ++ operator
●1995: James Gosling et al. at Sun Microsystems develop Java, a cleaned-up, smaller dialect of C++
o meant to be an internet and embedded-device programming language
o provides facilities for better reuse and safety
o some professionals avoid it because it is seen as inefficient (they use C++ or C instead)
o Microsoft's C# is a powerful Java-ish competitor; also Python, Ruby-on-Rails
o JavaScript is NOT Java, and is only partially an OOPL

59 Barbara Liskov's Distinguished Lecture at Brown, 11/06/14
Biography:
o member of the National Academy of Engineering and the National Academy of Sciences; charter fellow of the National Academy of Inventors
o ACM Turing Award (the Nobel prize of CS), IEEE von Neumann Medal
The Power of Abstraction
o abstraction is at the center of much work in Computer Science
o it encompasses finding the right interface for a system as well as finding an effective design for a system implementation
o furthermore, abstraction is the basis for program construction, allowing programs to be built in a modular fashion
What I learned from her talk
o ADTs need to describe the behavior, not just the method signatures, return types, error conditions… i.e., the "pragmatics"
o Java and other OOPLs can only enforce that subtypes can do what supertypes can; they can't enforce the idea that subtypes should also exhibit the same behavior

60 Who "Owns" ADTs?
November 7, 2014: Computer Scientists Ask Supreme Court to Rule APIs Can't Be Copyrighted
EFF Files Amicus Brief on Behalf of Tech Pioneers in Oracle v. Google Court Battle
San Francisco - The Electronic Frontier Foundation (EFF) filed a brief with the Supreme Court of the United States today, arguing on behalf of 77 computer scientists that the justices should review a disastrous appellate court decision finding that application programming interfaces (APIs) are copyrightable. That decision, handed down by the U.S. Court of Appeals for the Federal Circuit in May, up-ended decades of settled legal precedent and industry practice.
Signatories to the brief include five Turing Award winners, four National Medal of Technology winners, and numerous fellows of the Association for Computing Machinery, IEEE, and the American Academy of Arts and Sciences. The list also includes designers of computer systems and programming languages such as AppleScript, AWK, C++, Haskell, IBM S/360, Java, JavaScript, Lotus 1-2-3, MS-DOS, Python, Scala, Smalltalk, TCP/IP, Unix, and Wiki.

61 Software Engineering (1/2)
●1968: the NATO Science Committee addresses the "software crisis"
o hardware was progressing rapidly, but not software
o software development was seen mostly as a craft with too much trial and error
o too little has changed, e.g., the ACA website debacle!
o coins the term software engineering
●1975: Frederick Brooks writes the landmark book "The Mythical Man-Month"
o says there is "no silver bullet": software is inherently complex
o complexity can be ameliorated but cannot be cured by higher-level languages
o adding people to a late project delays it further ("9 women can't make a baby in a month")
●1990s: Les Hatton develops the "30-5-1" rule
o from a study of real commercial programs
o found 30 bugs per 1,000 lines of untested code on average, only 5 in well-tested code, and 1 bug still remaining after the code is in production
o the rule held regardless of language, and is probably still true today!

62 Microsoft just squashed a 19-year-old software bug. How did it go undetected so long?
●On Tuesday, November 11th, Microsoft patched a critical bug affecting Windows
o the bug could potentially allow hackers to remotely control users' machines
●The IBM researchers who found the bug say it could have been around for two decades
o spotting bugs in code can be difficult, even after extensive review
●"This isn't the first time major flaws have taken years to uncover. In 2010, a Google engineer uncovered a 17-year-old Windows bug affecting all 32-bit versions of the operating system and that could be used to hijack PCs. In September, another problem called "Shellshock" was discovered in a free software package built into some 70 percent of all devices connected to the Internet. It could have been introduced as long as 22 years ago, says Chet Ramey, the long-time maintainer of the code."
Full article: http://www.washingtonpost.com/blogs/the-switch/wp/2014/11/12/microsoft-just-squashed-a-19-year-old-software-bug-how-did-it-go-undetected-so-long/

63 Software Engineering (2/2)
●Sophisticated development and testing methodologies
o CS17 and CS19 teach students to write tests that inform the implementation, rather than tests that are tailored to the implementation
o the goal is to cover both general and edge cases
●Libraries of reusable components
o companies offer well-tested common components
o "plug-n-play" frameworks to connect trusted catalogue parts
o OO is a good paradigm for making this goal feasible; it works well for GUI widgets and large-scale components (e.g., "Enterprise JavaBeans", the Qt Framework)
●CS32: modern software engineering using Java!
●Note: new languages and software-engineering technologies (frameworks, IDEs…) are still hot subjects, both in industry and in academia
o e.g., Apple's new Swift, positioned as the successor to C and Objective-C; see https://developer.apple.com/swift/

64 Announcements
Tetris help session in Salomon 101 at 5PM.
HW3 due Saturday at 5PM!
Tetris early hand-in 11/18; on time 11/20; late 11/22. Start working now!
Lectures on Thursday and next Tuesday will be in Salomon 101.
We sent out grade reports Sunday evening.

