The Mysteries of Algorithms: A Self-told Story of Richard M. Karp (People & Ideas in Theoretical Computer Science). Outline: Three Principles; Getting Educated; The Professorial Life; NP-Completeness.
Richard M. Karp Prof. Karp was born in Boston in 1935 and received a Ph.D. in Applied Mathematics from Harvard University in 1959. 1959–1968: IBM Thomas J. Watson Research Center; 1968–1994: UC Berkeley; 1994–today: University of Washington, where he is a Professor of Computer Science and Engineering and an Adjunct Professor of Molecular Biotechnology. The unifying theme of his work has been the study of combinatorial algorithms. His honors include the U.S. National Medal of Science, the Turing Award, and many more.
Three Principles in Making Career Choices Understand what you are good at and what you like to do, and choose accordingly. In the words of Socrates, “Know thyself.” Disregard the fashions of the day and search for the new areas of research that are about to become important. In the words of the great hockey player and philosopher Wayne Gretzky, “Skate to where the puck is gonna be.” To find exciting problems, look at the interfaces between disciplines.
Getting Educated A solid classical education at Boston Latin School; favorite subject: mathematics. At Harvard, I found I had to work hard to earn good grades, many classmates equalled or surpassed my ability, and writing and laboratory science were definitely not for me. By the middle of senior year, I concluded that a career in pure math was not for me. Reluctant to leave Cambridge and work for a living, I decided to become a Ph.D. student at Harvard.
The Computation Lab Entered the Computation Lab. Spotty performance in applied math, electrical engineering, probability, and statistics, but a special feel for topics involving probability and discrete math. Productive summers at the M.I.T. Lincoln Lab and GE further fortified my confidence. My Ph.D. dissertation was based on the idea that graph-theory algorithms can be used to analyze programs. The Comp Lab's old-boy network helped me land a job at IBM's Research Division.
IBM Days Assigned to work on algorithms for logic circuit design, I met the harsh realities of combinatorial explosions. The field of combinatorial algorithms was in a stage of rapid development. IBM brought in Alan Hoffman to build a combinatorics research group, and he became my mentor. A "good algorithm" is a polynomial-time algorithm; for some combinatorial optimization problems, a good algorithm might not exist.
Work Done at IBM IBM had a strong group in formal models of computation (automata theory, formal languages, and mathematical logic). I became aware of the importance of reducibilities in recursive function theory. My own work centered on formal models of parallel computation: systolic algorithms, and parallel program schemata as a model of asynchronous parallel computation.
A Zest for Teaching Moved to UC Berkeley in 1968 to be more involved with students. My father was a junior high school math teacher and a role model. Thoughts on teaching: preparation, structuring the material (top-down organization, modularity, information hiding), and conducting the lecture.
Professorial Life The move to Berkeley marked the end of my scientific apprenticeship. A professor's life is a juggling act. I have supervised thirty-five Ph.D. students. My rule: never assign a thesis problem, but work together with each student to develop a direction that is significant and fits the student's abilities and interests.
NP-Completeness 1971: read Steve Cook's paper "The Complexity of Theorem-Proving Procedures," which proved that every set of strings accepted in polynomial time by a nondeterministic Turing machine is polynomial-time reducible to SAT (the propositional satisfiability problem). 1972: Karp presented 21 NP-complete problems, showing that a set of other real-world problems enjoys the same universal character as SAT.
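The workhorse of the 1972 paper is the polynomial-time (Karp) reduction: transform one problem's instance into another's without solving either. A minimal sketch, using the classic Independent Set / Vertex Cover pair (a textbook example in the spirit of those reductions; the function names and toy graph are illustrative, not from the talk):

```python
# A Karp reduction maps an instance of one problem to an instance of
# another in polynomial time, preserving yes/no answers.
# Fact used here: G has an independent set of size >= k
# iff G has a vertex cover of size <= |V| - k.
from itertools import combinations

def reduce_indset_to_vertexcover(vertices, edges, k):
    """Map Independent Set instance (G, k) to Vertex Cover
    instance (G, |V| - k). The reduction only rewrites the
    instance; it does not solve anything."""
    return vertices, edges, len(vertices) - k

# Brute-force deciders, only to demonstrate the equivalence on a toy graph.
def has_independent_set(vertices, edges, k):
    return any(all((u, v) not in edges and (v, u) not in edges
                   for u, v in combinations(s, 2))
               for s in combinations(vertices, k))

def has_vertex_cover(vertices, edges, k):
    return any(all(u in s or v in s for u, v in edges)
               for s in combinations(vertices, k))

# Toy graph: path a-b-c. {a, c} is an independent set of size 2,
# and {b} is a vertex cover of size 1 = |V| - 2.
V, E = ['a', 'b', 'c'], [('a', 'b'), ('b', 'c')]
V2, E2, k2 = reduce_indset_to_vertexcover(V, E, 2)
print(has_independent_set(V, E, 2))   # True
print(has_vertex_cover(V2, E2, k2))   # True
```

The reduction runs in constant time here; what matters is that any polynomial-time algorithm for the target problem would yield one for the source problem.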
Significance of NP-Completeness It put computational complexity theory in touch with the real world by propagating to workers in many fields the fundamental idea that computational problems of interest to them may be intractable, and that the question of their intractability can be linked to the central questions of complexity theory. Example: public-key cryptography researchers cited NP-completeness as a motivation for one-way functions and trapdoor functions.
Dealing with NP-Hard Problems NP-hard problems arise so frequently in applications that they cannot be ignored. One response: polynomial-time approximation algorithms. In 1974, I decided to study the performance of heuristics from a probabilistic point of view: try to prove that a fast heuristic algorithm finds a near-optimal solution with high probability. The reasons why heuristics work so well in so many cases remain a mystery.
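To make the approximation idea concrete, here is the classic greedy-matching 2-approximation for minimum Vertex Cover — a standard textbook algorithm, offered as an illustration rather than a method from the talk:

```python
def vertex_cover_2approx(edges):
    """Classic 2-approximation for minimum vertex cover:
    repeatedly pick an uncovered edge and take BOTH endpoints.
    The picked edges form a matching, and any cover must contain
    at least one endpoint of each matched edge, so the result
    is at most twice the optimum -- in polynomial time."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# Path a-b-c-d: the optimum cover is {b, c} (size 2);
# the heuristic may return up to 4 vertices, never more than 2x optimal.
edges = [('a', 'b'), ('b', 'c'), ('c', 'd')]
cover = vertex_cover_2approx(edges)
print(all(u in cover or v in cover for u, v in edges))  # True
```

This is a worst-case guarantee; Karp's 1974 program asked the complementary probabilistic question — whether simple heuristics are near-optimal with high probability on typical instances.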
Randomization and Derandomization 1975: early work and a road map for probabilistic algorithm analysis. 1991: randomization is an extremely important tool for the construction of algorithms, with two principal advantages: (1) smaller space and time requirements, and (2) simplicity of understanding and implementation. Derandomization: the elimination of random choices from a randomized algorithm.
The Complexity Year NSF provided funding to bring an all-star cast of young computer scientists and mathematicians to the Institute for a year and to attract a number of the leading senior scientists in the field. Although it didn't crack the P vs. NP problem, the Complexity Year met its main goal: it strengthened the research communities in a number of emerging areas, including computational molecular biology.
A Spirited Debate In 1995, shortly after my 60th birthday, an oral report of mine criticizing the theory community for being ingrown, worshiping mathematical depth, and working on artificial problems became the focus of a firestorm of criticism. I later shifted to a positive tone, asserting that theory should advance not only by pursuing the deep questions that originated within theory itself, but also by linking up with applications.
Computational Molecular Biology In 1963, IBM decided to look into applications of computing and mathematics to biology and medicine; nothing came of my trip looking for research problems. In 1991, I began to think seriously about applying my knowledge of algorithms to those fields. The Human Genome Project, my former student Dan Gusfield, attending seminars… and a closer connection with UW, a hotbed of computational molecular biology. The sequencing of genomes is merely a first step towards functional genomics.
Future Work The problem of characterizing regulatory networks will make up much of the future work. I expect to draw on existing knowledge in statistical clustering theory and computational learning theory, and will need to advance the state of the art in these fields in order to succeed.