Ancient Paradoxes With An Impossible Resolution. Great Theoretical Ideas In Computer Science. Steven Rudich, CS 15-251, Spring 2005, Lecture 29, Apr 28, 2005, Carnegie Mellon University.



Each Java program has a unique, determined outcome: either it fails to halt, or it outputs a particular string.

Unless otherwise stated, we will be considering programs that take no input.

Each Java program has an unambiguous meaning.

Java is a prefix-free language. That is, no Java program is a prefix of any other.

Binary Java is Prefix-Free. We will represent Java in binary (using ASCII codes for each character), and allow only Java programs in which all the classes are wrapped in one big class delimited by { }.

PREFIX FREE MEANS THAT THE NOTION OF A RANDOM JAVA PROGRAM IS WELL DEFINED.

Flip a fair coin to create a sequence of random bits. Stop if the bits so far form a Java program P. Each program P gets picked with probability (1/2)^|P|, where |P| is the length of P.
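The coin-flipping experiment can be sketched with a toy prefix-free set standing in for binary Java. The four "programs" {0, 10, 110, 111} are hypothetical, chosen so that their weights sum to 1 and every coin sequence therefore eventually completes a program:

```python
import random

# Hypothetical prefix-free "programs"; weights 1/2 + 1/4 + 1/8 + 1/8 = 1,
# so the coin-flipping experiment always terminates.
PROGRAMS = {"0", "10", "110", "111"}

def sample_program(rng):
    """Flip a fair coin until the bits so far form a program."""
    bits = ""
    while bits not in PROGRAMS:
        bits += rng.choice("01")
    return bits

rng = random.Random(0)
trials = 100_000
counts = {p: 0 for p in PROGRAMS}
for _ in range(trials):
    counts[sample_program(rng)] += 1

# Program P should be picked with probability (1/2)^|P|.
for p in sorted(PROGRAMS):
    print(p, counts[p] / trials)
```

The empirical frequencies come out near 1/2, 1/4, 1/8, 1/8, matching the (1/2)^length rule from the slide.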

Java is an unambiguous, prefix-free language.

Define Ω to be the probability that a random program halts.

Ω is the probability that a random coin sequence will describe the text of a halting program.

Gottfried Wilhelm von Leibniz: "There is almost no paradox without utility."

BERRY PARADOX: “The smallest natural number that can’t be named in less than fourteen words.”

List all English sentences of 13 words or fewer. For each one, if it names a number, cross that number off a list of the natural numbers. Is the smallest number left the number named by the Berry sentence?

As you loop through the sentences, you will eventually meet the Berry sentence itself. This procedure will not have a well-defined outcome.

Worse: In English, there is not always a fact of the matter about whether or not a given sentence names a number.

“This sentence refers to the number 7, unless the number named by this sentence is 7.”

BERRY PARADOX: “The smallest natural number that can’t be named in less than fourteen words.”

Java is a language where each program either produces nothing or outputs a unique string. What happens when we express the Berry paradox in Java?

Counting. A set of binary strings is "prefix-free" if no string in the set is a prefix of another string in the set.

Theorem: If S is prefix-free and contains no strings longer than n, then S contains at most 2^n strings. Proof: For each string x in S, let f(x) be x with n − |x| 0's appended on the right. Because S is prefix-free, no two strings of S pad to the same result, so f is a 1-1 map from S into {0,1}^n.
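A quick sanity check of the padding argument, on a small, hypothetical prefix-free set (the set and names are illustrative only):

```python
# A prefix-free set with no string longer than n = 3.
S = {"00", "01", "100", "101", "11"}
n = 3

def is_prefix_free(strings):
    """True if no string in the set is a prefix of another."""
    return not any(a != b and b.startswith(a) for a in strings for b in strings)

assert is_prefix_free(S)

# f pads each string with 0's out to length n; prefix-freeness makes f 1-1,
# so |S| can be at most 2^n.
padded = {x + "0" * (n - len(x)) for x in S}
print(len(S), len(padded), 2 ** n)   # the first two counts match, both <= 8
```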

Storing Poker Hands I want to store a 5 card poker hand using the smallest number of bits (space efficient). The naïve scheme would use 2 bits for a suit, 4 bits for a rank, and hence 6 bits per card and 30 bits per hand. How can I do better?

Order the poker hands lexicographically. To store a hand, all I need is its index, of size ⌈log₂ 2,598,960⌉ = 22 bits. Let's call this the "indexing trick".

22 Bits Is OPTIMAL. 2^21 = 2,097,152 < 2,598,960: there are more poker hands than there are 21-bit strings, so you can't assign a distinct 21-bit string to each hand.
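Both counts are easy to verify directly (math.comb gives the binomial coefficient):

```python
import math

hands = math.comb(52, 5)            # number of 5-card poker hands
bits = math.ceil(math.log2(hands))  # index size for the "indexing trick"
print(hands, bits)                  # → 2598960 22
assert 2 ** 21 < hands <= 2 ** 22   # 21 bits are too few, 22 suffice
```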

Incompressibility. We call a binary string x incompressible if the shortest binary Java program that outputs x is at least as long as x.

Theorem: Half the strings of any given length are incompressible. Because Java is prefix-free, there are at most 2^(n−1) programs of length n−1 or shorter (by the Kraft inequality). There are 2^n strings of length n, and hence at least half of them have no shorter program that outputs them.

A compressible string: 0101…01 ("01" repeated a million times)

public class Counter {
    public static void main(String[] argv) {
        for (int i = 0; i < 1000000; i++)
            System.out.print("01");
    }
}

It is possible to define randomness in terms of incompressibility. [Photos: Kolmogorov, Chaitin]

An incompressible string has no computable, atypical properties! [Photos: Kolmogorov, Chaitin]

An incompressible string has no computable pattern! [Photos: Kolmogorov, Chaitin]

If a string x is incompressible, then there is nothing atypical that you can say about it. Suppose D is some atypical, computable predicate that is true of x. Since D is atypical, it is true of few n-bit strings. So we can compress x: refer to x by its index i in the enumeration of the strings of length n that have property D. [Notice the use of the "indexing trick".]

When we notice a “pattern”, we always mean something atypical.

So when you see a "pattern" in a sufficiently long string, it lets you compress the string. Hence, incompressible strings have no pattern.

For example, we can compress a sufficiently long binary string with any of these patterns:
- 60% 1's
- a 1 always following 1101
- the ASCII encoding of English-language text
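As a rough illustration (not the optimal codes the theory talks about), a general-purpose compressor such as zlib already exploits the first kind of pattern, while uniformly random bytes stay essentially incompressible. Exact compressed sizes depend on the compressor, so this sketch only checks the gap:

```python
import random
import zlib

rng = random.Random(0)
n = 10_000

# A bit string with ~60% 1's: patterned, hence compressible.
biased = "".join("1" if rng.random() < 0.6 else "0" for _ in range(n)).encode()
# Uniformly random bytes: no pattern for the compressor to exploit.
uniform = bytes(rng.randrange(256) for _ in range(n))

small = len(zlib.compress(biased, 9))
big = len(zlib.compress(uniform, 9))
print(small, big)   # the biased string shrinks a lot; the random one does not
```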

BERRY PARADOX: “The smallest natural number that can’t be named in less than fourteen words.”

Java Berry The shortest incompressible string that is longer than this Java program

Java Berry The shortest incompressible string that this program can certify is longer than this program

Define an incompressibility detector to be a program P such that: P(x) = "yes" means x is definitely incompressible; P(x) = "not sure", otherwise.

Let INCOMPRESSIBLE be a JAVA incompressibility detector whose program length is n. INCOMPRESSIBLE(x) = “yes” means x is definitely incompressible INCOMPRESSIBLE(x) = “not sure”, otherwise

JAVA BERRY {
  k := bound on the length of my own program text
  Loop over x = strings of length k+1 to infinity {
    If INCOMPRESSIBLE(x): Output x
  }
  Text of the subroutine for INCOMPRESSIBLE.
}
The shortest incompressible string that this program can certify is longer than this program.

If JAVA BERRY outputs ANYTHING a real paradox would result!

JAVA BERRY {
  S := text of the subroutine for INCOMPRESSIBLE
  k := STRING_LENGTH(S)
  Loop over x = strings of length k+b to infinity {
    If EXECUTE(S, x) = "YES": Output x
  }
  Routine for EXECUTE(S, x), which runs the Java program whose text is the string S on input x
  Routine for STRING_LENGTH(S), which returns the length of the string S
}
BERRY has text length n + b. Note: b is a constant, independent of n.

JAVA BERRY OUTPUTS NOTHING. Theorem: There is a constant b such that no incompressibility detector of length n outputs "yes" on any string of length greater than n + b. Proof: If one did, we could use it inside JAVA BERRY and obtain a contradiction.

Let F be a sound formal system that can be presented as an n-bit program enumerating the consequences of its axioms. No statement of the form "X is incompressible", for X of length > n + b, is a consequence of F.

Fix any n-bit foundation for mathematics. Half of the strings of each length m > n + b are incompressible, yet your foundation cannot prove that any particular one of them is incompressible.

Random Unknowable Truths.

Define Ω to be the probability that a random program halts.

Ω is a maximally unknowable number.

Ω is the optimally compressed form of the halting oracle.

Let Ω_n be the first n bits of Ω. By the properties of binary representation: 0 ≤ Ω − Ω_n < (1/2)^n.

The first n bits of Ω give enough information to solve the halting problem for any program of length at most n. Let Ω_n be the first n bits of Ω, and let P be a program of length n, so P has weight 2^−n. Start with a balance holding Ω_n on the left side and nothing on the right. Notice that Ω_n + (1/2)^n is greater than Ω.

The first n bits of Ω give enough information to solve the halting problem for any program of length at most n. Now start timesharing: run every program except P for an infinite number of steps each. Whenever a program M halts, put weight (1/2)^|M| on the right side. The right side converges either to Ω or to Ω − (1/2)^n, depending on whether P halts.

The first n bits of Ω give enough information to solve the halting problem for any program of length at most n. If P halts, then W ≤ Ω − (1/2)^n < Ω_n, so the balance never tips. If P does not halt, then W converges to Ω > Ω_n, so the balance must eventually tip.

The first n bits of Ω give enough information to solve the halting problem for any program of length at most n. L := 0. Timeshare every program M except P. When a program of length a halts, add 2^−a to L. Once the first n bits of L equal the first n bits of Ω, every program of length ≤ n that is ever going to halt has halted.
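The weighing procedure can be simulated in a toy universe where we simply stipulate each "program"'s bit-length and halting time. Everything here (the program names, lengths, and times) is invented for illustration; for real programs Ω is uncomputable and never known exactly:

```python
import math
from fractions import Fraction

# Hypothetical programs: (name, bit-length, halting time or None).
PROGRAMS = [
    ("p1", 1, 3),
    ("p2", 2, None),   # never halts
    ("p3", 3, 10),
    ("p4", 3, 1),
    ("p5", 4, 7),
]

# Omega for this toy universe: total weight 2^-length of the halting programs.
OMEGA = sum(Fraction(1, 2 ** l) for _, l, t in PROGRAMS if t is not None)

def omega_n(n):
    """The first n bits of Omega, as an exact fraction."""
    return Fraction(math.floor(OMEGA * 2 ** n), 2 ** n)

def halts(name, n):
    """Decide halting for a program of length <= n, given Omega_n.

    Dovetail all programs: either `name` halts, or the accumulated weight
    of the OTHER halted programs exceeds Omega_n, which can only happen
    if `name`'s weight is never coming."""
    target = omega_n(n)
    weight = Fraction(0)
    for t in range(1, 100):
        for pname, length, halt_time in PROGRAMS:
            if halt_time == t:
                if pname == name:
                    return True
                weight += Fraction(1, 2 ** length)
        if weight > target:
            return False
    raise RuntimeError("undecided within the toy time bound")

print(halts("p1", 2), halts("p2", 2))   # → True False
```

Fractions keep the weights exact, so the "balance" comparison against Ω_n never suffers rounding error.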

Busy Beaver Function. BusyBeaver(n) = the maximum running time of any halting program of length n. In BusyBeaver(n) time we can unpack the first n bits of information encoded in Ω.

From n bits of Ω we can find all incompressible strings of length n+1. Determine all the programs of length at most n that halt, run them, and cross off every (n+1)-bit string they output. The strings that are left are incompressible.

n bits of axioms can only help you know n + b bits of Ω. Otherwise you could prove that strings longer than your axiom system were incompressible.

Ω is not compressible by more than b. Suppose you could compress n bits of Ω by more than b to get a string X. Decompress X and use it to find an incompressible string of length n+1, and output it. This method has length |X| plus b, which is still less than n+1. Contradiction.

Busy Beaver Function. BusyBeaver(n) = the maximum running time of any halting program of length n. In BusyBeaver(n) time we can unpack the first n bits of information encoded in Ω.

Busy Beaver Function. BusyBeaver(n) = the maximum running time of any halting program of length n. What is the growth rate of BusyBeaver? It grows faster than any computable function!

Suppose some computable f(n) > BusyBeaver(n), where BusyBeaver(n) = the maximum running time of any halting program of length n. Then run all n-bit programs for f(n) steps: the ones that have not halted by then will never halt. This would solve the halting problem, so no such computable f exists.

Reason is our most powerful tool, but some truths of the mathematical world have no pattern, or representation that can be reasoned about.

We can construct a Diophantine polynomial U in 16 variables such that when X₁ is fixed to k, the resulting polynomial has a root iff the kth bit of Ω is 1.

CIRCUIT-SATISFIABILITY: Given a circuit with n inputs and one output, is there a way to assign 0-1 values to the input wires so that the output value is 1 (true)? [Figure: a small circuit built from AND and NOT gates] Yes, this circuit is satisfiable: it has satisfying assignment 110.

CIRCUIT-SATISFIABILITY: Given a circuit with n inputs and one output, is there a way to assign 0-1 values to the input wires so that the output value is 1 (true)? BRUTE FORCE: try out all 2^n assignments.
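The brute-force idea fits in a few lines. The circuit below is a hypothetical wiring of AND and NOT gates, chosen so that its satisfying assignment matches the slide's 110; the slide does not give the exact wiring:

```python
from itertools import product

def circuit(x, y, z):
    """A toy circuit of AND and NOT gates (illustrative wiring)."""
    return (x and y) and (not z)

def satisfiable(circ, n):
    """Try all 2^n assignments; return a satisfying one, or None."""
    for bits in product([0, 1], repeat=n):
        if circ(*bits):
            return bits
    return None

print(satisfiable(circuit, 3))   # → (1, 1, 0)
```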

Circuit Satisfiability and 3-Colorability [Figure: the AND/NOT circuit beside the graph it will be translated into]

A Graph Named "Gadget"

[Figure: the gadget graph, with palette nodes T and F, variable nodes X and Y, and an Output node]

The OR gadget computes the OR of its inputs:
X Y | X OR Y
F F |   F
F T |   T
T F |   T
T T |   T

[Figure: the NOT-gate gadget, linking node X to node NOT X]

[Figures: step-by-step construction of the gadget graph for a small circuit with OR and NOT gates on inputs x, y and output z]

How do we force the graph to be 3-colorable exactly when the circuit is satisfiable?

[Figure: the finished graph, with the output node connected so it must receive the TRUE color] Satisfiability of this circuit = 3-colorability of this graph.

You can quickly transform a method for deciding 3-coloring into a method for deciding circuit satisfiability!
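For contrast with the reduction, the naive decision procedure for 3-coloring has the same brute-force shape as the one for circuits: try all 3^n colorings. The example graphs here are just a triangle (3-colorable) and K4 (not):

```python
from itertools import product

def three_colorable(num_vertices, edges):
    """Brute force over all 3^n colorings; a coloring works if no edge
    has both endpoints the same color."""
    for coloring in product(range(3), repeat=num_vertices):
        if all(coloring[u] != coloring[v] for u, v in edges):
            return True
    return False

triangle = [(0, 1), (1, 2), (0, 2)]
k4 = triangle + [(0, 3), (1, 3), (2, 3)]
print(three_colorable(3, triangle), three_colorable(4, k4))   # → True False
```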