Ku naru shoh-mei no michi The way of the empty proof L’art de la preuve vide

If you have the right definition, you have a simpler proof. If you have the right data structures, you have a simpler algorithm. If you have the right…..

Power of Elimination: Tarski's Meta-theorem. To any formula Φ(X1, X2, …, Xm) in the vocabulary {0, 1, +, ·, =, <} one can effectively associate two objects: (i) a quantifier-free formula θ(X1, …, Xm) in the same vocabulary, and (ii) a proof of the equivalence Φ ↔ θ that uses the axioms for real closed fields.

∃x (ax² + bx + c = 0) if and only if b² - 4ac ≥ 0 (assuming a ≠ 0).

Resolution: Tarski's Meta-theorem for Logic. Elimination is not restricted to algebra and geometry.

Eliminating x from two linear inequalities:
2x - 3y ≤ 5
-3x + y ≤ -1
Multiply the first by 3 and the second by 2:
6x - 9y ≤ 15
-6x + 2y ≤ -2
Adding them cancels x:
0x - 7y ≤ 13, i.e. -7y ≤ 13

Resolution on clauses:
P(x) ∨ Q(x)
¬P(a) ∨ R(y)
Unifying with x = a gives P(a) ∨ Q(a) and ¬P(a) ∨ R(y); resolving on P(a) yields
Q(a) ∨ R(y)
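For readers who want to experiment, here is a minimal Python sketch (not from the talk) of this elimination step: it keeps the inequalities in which the chosen variable does not occur and combines every pair with opposite signs on that variable so that it cancels. The function name fourier_motzkin and the (coefficients, bound) encoding are my own illustrative choices.

```python
from itertools import product

def fourier_motzkin(ineqs, j):
    """Eliminate variable j from inequalities (a, b) meaning sum(a[k]*x[k]) <= b."""
    pos, neg, zero = [], [], []
    for a, b in ineqs:
        (pos if a[j] > 0 else neg if a[j] < 0 else zero).append((a, b))
    out = list(zero)
    for (ap, bp), (an, bn) in product(pos, neg):
        lam, mu = -an[j], ap[j]          # positive multipliers chosen so x_j cancels
        a = [lam * cp + mu * cn for cp, cn in zip(ap, an)]
        out.append((a, lam * bp + mu * bn))
    return out

# The slide's example: 2x - 3y <= 5 and -3x + y <= -1.
# Eliminating x prints [([0, -7], 13)], i.e. -7y <= 13.
print(fourier_motzkin([([2, -3], 5), ([-3, 1], -1)], j=0))
```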

Elimination is used in automated theorem proving (algebra, geometry & logic). But it can also be used, in special cases, to prove theorems by hand.

[Diagram: a chain of formulas F0, F1, F2, F3, F4, …, Fn; each intermediate formula is marked with the property P, and the last one with "P?".]

The elimination of the proof is an ideal seldom reached. Elimination gives us the heart of the proof.

Markov's Ergodic Theorem (1906). Any irreducible, finite, aperiodic Markov chain has all states ergodic (reachable at any time in the future) and has a unique stationary distribution, which is a probability vector.

[Diagram: a four-state Markov chain x1, x2, x3, x4. Starting from one initial distribution, the probabilities after successive steps converge to approximately X1 = .38, X2 = .28, X3 = .21, X4 = .13.]

[Diagram: the same chain started from a different initial distribution; after successive steps the probabilities converge to the same values X1 = .38, X2 = .28, X3 = .21, X4 = .13.]
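The convergence the two pictures illustrate is easy to reproduce. Here is a minimal numpy sketch (mine, not from the talk); the transition matrix is made up for illustration and is not the chain of the slides.

```python
import numpy as np

# Illustrative 4-state transition matrix (rows sum to 1) -- assumed values.
P = np.array([[0.1, 0.3, 0.3, 0.3],
              [0.7, 0.1, 0.1, 0.1],
              [0.5, 0.3, 0.1, 0.1],
              [0.2, 0.6, 0.1, 0.1]])

for start in (np.array([1.0, 0.0, 0.0, 0.0]),
              np.array([0.25, 0.25, 0.25, 0.25])):
    x = start
    for _ in range(200):          # repeated steps: x <- x P
        x = x @ P
    print(np.round(x, 3))         # both starts print the same stationary vector
```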

You can view the problem in three different ways:
- Principal eigenvector problem
- Classical Linear Programming problem
- Elimination problem

p11 x1 + p21 x2 + p31 x3 = x1
p12 x1 + p22 x2 + p32 x3 = x2
p13 x1 + p23 x2 + p33 x3 = x3
∑ xi = 1, xi ≥ 0
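Seen as a plain linear system, the stationary distribution needs no eigenvalue machinery: the balance equations are linearly dependent, so replace one of them by the normalization ∑ xi = 1 and solve. A minimal numpy sketch, with an illustrative transition matrix of my own:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],    # assumed 3-state chain, rows sum to 1
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

n = P.shape[0]
A = P.T - np.eye(n)               # balance equations (P^T - I) x = 0
A[-1, :] = 1.0                    # replace the redundant last equation by sum(x) = 1
b = np.zeros(n)
b[-1] = 1.0

x = np.linalg.solve(A, b)
print(x, x.sum())                 # a probability vector with P^T x = x
```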

Difficulties as an Eigenvector Problem:
- notion of convergence
- dealing with complex numbers
- uniqueness of the solution
- need to use theorems (Perron-Frobenius, Chapman, Kolmogoroff, Cauchy, …) and/or restrictive hypotheses

Symbolic Gaussian Elimination. System of two variables:
p11 x1 + p21 x2 = x1
p12 x1 + p22 x2 = x2
∑ xi = 1
With Maple we find:
x1 = p21 / (p21 + p12)
x2 = p12 / (p21 + p12)
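The same symbolic elimination can be reproduced without Maple. Below is a sketch with Python's sympy; it uses the row-sum conditions p11 = 1 - p12 and p22 = 1 - p21 so that, as on the slide, only p12 and p21 appear in the answer.

```python
import sympy as sp

p12, p21, x1, x2 = sp.symbols('p12 p21 x1 x2', positive=True)

# Balance equations of the two-state chain plus normalization
# (p11 = 1 - p12 and p22 = 1 - p21, so the self-loops drop out).
eqs = [sp.Eq((1 - p12) * x1 + p21 * x2, x1),
       sp.Eq(p12 * x1 + (1 - p21) * x2, x2),
       sp.Eq(x1 + x2, 1)]

sol = sp.solve(eqs, [x1, x2], dict=True)[0]
print(sp.simplify(sol[x1]))   # p21/(p12 + p21)
print(sp.simplify(sol[x2]))   # p12/(p12 + p21)
```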

[Diagram: the two-state chain x1, x2 with transitions p12, p21 and self-loops p11, p22.]
x1 = p21 / (p21 + p12)
x2 = p12 / (p21 + p12)

Symbolic Gaussian Elimination. Three variables:
p11 x1 + p21 x2 + p31 x3 = x1
p12 x1 + p22 x2 + p32 x3 = x2
p13 x1 + p23 x2 + p33 x3 = x3
∑ xi = 1
With Maple we find:
x1 = (p31 p21 + p31 p23 + p32 p21) / Σ
x2 = (p13 p32 + p12 p31 + p12 p32) / Σ
x3 = (p13 p21 + p12 p23 + p13 p23) / Σ
Σ = p31 p21 + p31 p23 + p32 p21 + p13 p32 + p12 p31 + p12 p32 + p13 p21 + p12 p23 + p13 p23

[Diagram: the three-state chain x1, x2, x3 with all transition probabilities pij and self-loops p11, p22, p33.]
x1 = (p31 p21 + p23 p31 + p32 p21) / Σ
x2 = (p13 p32 + p31 p12 + p12 p32) / Σ
x3 = (p21 p13 + p12 p23 + p13 p23) / Σ

Do you see the LIGHT? James Brown (The Blues Brothers)

Symbolic Gaussian Elimination. System of four variables:
p21 x2 + p31 x3 + p41 x4 = x1
p12 x1 + p42 x4 = x2
p13 x1 = x3
p34 x3 = x4
∑ xi = 1

[Diagram: the four-state chain x1, x2, x3, x4 with transitions p12, p21, p13, p31, p34, p41, p42.]
With Maple we find:
x1 = (p21 p34 p41 + p34 p42 p21 + p21 p31 p41 + p31 p42 p21) / Σ
x2 = (p31 p41 p12 + p31 p42 p12 + p34 p41 p12 + p34 p42 p12 + p13 p34 p42) / Σ
x3 = (p41 p21 p13 + p42 p21 p13) / Σ
x4 = (p21 p13 p34) / Σ
Σ = p21 p34 p41 + p34 p42 p21 + p21 p31 p41 + p31 p42 p21 + p31 p41 p12 + p31 p42 p12 + p34 p41 p12 + p34 p42 p12 + p13 p34 p42 + p41 p21 p13 + p42 p21 p13 + p21 p13 p34

Ergodic Theorem Revisited. If there exists a reverse spanning tree in the graph of the Markov chain associated to a stochastic system, then:
(a) the stochastic system admits as a solution the probability vector whose i-th component is the sum, over the reverse spanning trees rooted at i, of the products of their edge probabilities, divided by the total Σ of all these sums;
(b) the solution is unique;
(c) the conditions {xi ≥ 0}, i = 1, …, n, are redundant and the solution can be computed by Gaussian elimination.
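To make the tree formula concrete, here is a brute-force Python sketch (my own illustration, not the authors' code): it enumerates the reverse spanning trees of a small chain, sums their edge-probability products, and checks the result against a direct linear solve. The transition matrix is made up.

```python
import itertools
import numpy as np

def stationary_by_reverse_trees(P):
    """x_r is proportional to the sum, over reverse spanning trees rooted at r,
    of the products of their edge probabilities."""
    n = len(P)
    weights = np.zeros(n)
    for r in range(n):
        others = [i for i in range(n) if i != r]
        # a candidate tree picks one outgoing edge i -> parent[i] for each i != r
        for choice in itertools.product(range(n), repeat=n - 1):
            parent = dict(zip(others, choice))

            def reaches_root(i, seen=()):
                if i == r:
                    return True
                if i in seen:
                    return False              # cycle: not a tree
                return reaches_root(parent[i], seen + (i,))

            if all(reaches_root(i) for i in others):
                weights[r] += np.prod([P[i, parent[i]] for i in others])
    return weights / weights.sum()

P = np.array([[0.1, 0.3, 0.3, 0.3],           # assumed 4-state chain, rows sum to 1
              [0.7, 0.1, 0.1, 0.1],
              [0.5, 0.3, 0.1, 0.1],
              [0.2, 0.6, 0.1, 0.1]])

print(stationary_by_reverse_trees(P))
# Sanity check: solve P^T x = x with sum(x) = 1 and compare.
A = P.T - np.eye(4); A[-1, :] = 1.0
print(np.linalg.solve(A, np.array([0.0, 0.0, 0.0, 1.0])))
```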

Internet sites: Kleinberg (HITS), Google (PageRank), SALSA, the in-degree heuristic.

Markov Chain as a Conservation System
[Diagram: the three-state chain x1, x2, x3 with transition probabilities pij.]
p11 x1 + p21 x2 + p31 x3 = x1 (p11 + p12 + p13)
p12 x1 + p22 x2 + p32 x3 = x2 (p21 + p22 + p23)
p13 x1 + p23 x2 + p33 x3 = x3 (p31 + p32 + p33)
The flow into each state equals the flow out of it (each factor on the right is 1, since the rows of the transition matrix sum to 1).

Kirchhoff's Current Law. The sum of currents flowing towards a node is equal to the sum of currents flowing away from the node.
[Diagram: a node with currents i1, i2, i3, i4: i3 + i2 = i1 + i4.]
For the three-node circuit:
i31 + i21 = i12 + i13
i32 + i12 = i21
i13 = i31 + i32

Kirchhoff's Matrix Tree Theorem (1847). For an n-vertex digraph, define an n × n matrix A such that A[i,j] = 1 if there is an edge from i to j, for all i ≠ j, and the diagonal entries are such that the row sums are 0. Let A(k) be the matrix obtained from A by deleting row k and column k. Then the absolute value of the determinant of A(k) is the number of spanning trees rooted at k (edges directed towards vertex k).
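A quick numerical illustration of the statement on my own small example digraph (the complete digraph on three vertices, which has three spanning trees towards each root):

```python
import numpy as np

edges = [(0, 1), (1, 0), (0, 2), (2, 0), (1, 2), (2, 1)]   # assumed example digraph
n = 3

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = 1.0
for i in range(n):
    A[i, i] = -A[i].sum()          # make each row sum to 0

for k in range(n):
    Ak = np.delete(np.delete(A, k, axis=0), k, axis=1)
    # |det A(k)| = number of spanning trees directed towards vertex k
    print(k, int(round(abs(np.linalg.det(Ak)))))
```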

Two theorems for the price of one!! Differences: Kirchhoff's theorem performs n Gaussian eliminations; the revised version needs only two.

Minimax Theorem. A fundamental theorem in game theory (von Neumann & Kuhn). The Minimax Theorem brings certainty into the world of probabilistic game theory. Applications in computer science, economics & business, biology, etc.

Minimax Theorem
max (x + 2y) subject to:
-3x + y ≤ 2
2x + 3y ≤ 4
4x + 2y ≤ 1
-x ≤ 0
-y ≤ 0

Set z = x + 2y, i.e. x = z - 2y, and rewrite the constraints in z and y:
-3z + 7y ≤ 2
2z - y ≤ 4
4z - 6y ≤ 1
-z + 2y ≤ 0
-y ≤ 0

Eliminating y (combining each lower bound on y with each upper bound) gives:
2z ≤ 2, 3z ≤ 8, -z ≤ 0, 10z ≤ 19, 11z ≤ 30, -3z ≤ 2
that is:
z ≤ 2/2, z ≤ 8/3, 0 ≤ z, z ≤ 19/10, z ≤ 30/11, -2/3 ≤ z

So 0 ≤ z ≤ 1: the maximum value z can take is the minimum of the right-hand sides, and the minimum value z can take is the maximum of the left-hand sides.

Duality Theorem. If the primal problem has an optimal solution x* = (x1*, x2*, …, xn*), then the dual also has an optimal solution y* = (y1*, y2*, …, ym*), and the two optimal values are equal: c·x* = b·y*.
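A minimal sketch (not part of the talk) checking this on the example from the Minimax slide with scipy: the primal is max x + 2y subject to Ax ≤ b, x ≥ 0, and the dual is min b·y subject to Aᵀy ≥ c, y ≥ 0.

```python
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 2.0])
A = np.array([[-3.0, 1.0], [2.0, 3.0], [4.0, 2.0]])
b = np.array([2.0, 4.0, 1.0])

# Primal: linprog minimizes, so maximize c.x by minimizing -c.x
primal = linprog(-c, A_ub=A, b_ub=b, bounds=(0, None))

# Dual: min b.y subject to A^T y >= c, y >= 0, written as -A^T y <= -c
dual = linprog(b, A_ub=-A.T, b_ub=-c, bounds=(0, None))

print(-primal.fun, dual.fun)   # both optimal values equal 1.0
```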

Géométrie Geometry Γεωμετρία

Κανένας δεν εισάγει εκτός αν ξέρει τη γεωμετρία Πλάτων Nobody enters unless he knows Geometry Plato

Personne ne sort s'il ne connaît la Géométrie. Κανένας δεν παίρνει από εδώ εκτός αν ξέρει τη γεωμετρία. Jean-Louis L. Nobody gets out unless he knows Geometry.

A polyhedron is defined as the intersection of a finite number of linear halfspaces.
2x + y ≤ 4
-x + 3y ≤ 2
-3x - 4y ≤ 5
-x - 6y ≤ 4

A polytope Q is defined as a convex hull of a finite collection of points.
x = ∑ λi xi
y = ∑ λi yi
∑ λi = 1
λi ≥ 0

Minkowski (1896)-Steinitz (1916)-Farkas (1906)-Weyl (1935) Theorem: Q is a polytope if and only if it is a bounded polyhedron. Extension by Charnes & Cooper (1958).

“This classical result is an outstanding example of a fact which is completely obvious to geometric intuition, but wields important algebraic content and is not trivial to prove.” R. T. Rockafellar

A polytope Q is defined as a convex hull of a finite collection of points; a polyhedron is defined as the intersection of a finite number of linear halfspaces. Elimination turns the first description into the second.
[Diagram: the triangle with vertices p1(1,1), p2(2,2), p3(1,3) in the (x, y) plane.]

The polytope with vertices p1, p2, p3:
λ1 + 2λ2 + λ3 = x
λ1 + 2λ2 + 3λ3 = y
λ1 + λ2 + λ3 = 1
λ1 ≥ 0, λ2 ≥ 0, λ3 ≥ 0

Eliminate λ1 = 1 - λ2 - λ3:
1 + λ2 = x
1 + λ2 + 2λ3 = y
1 - λ2 - λ3 ≥ 0, λ2 ≥ 0, λ3 ≥ 0

Eliminate λ2 = x - 1:
x + 2λ3 = y
2 - x - λ3 ≥ 0, x - 1 ≥ 0, λ3 ≥ 0

Eliminate λ3 = (y - x) / 2:
x ≤ 4 - y, x ≥ 1, y ≥ x
which is the same set described as a polyhedron.
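This hand elimination can be cross-checked mechanically. Here is a small sketch (my verification, not from the talk) using scipy's ConvexHull, which returns the facet inequalities of the hull of p1, p2, p3.

```python
import numpy as np
from scipy.spatial import ConvexHull

points = np.array([[1.0, 1.0], [2.0, 2.0], [1.0, 3.0]])   # p1, p2, p3
hull = ConvexHull(points)

# Each row (a, b, c) of hull.equations describes a facet a*x + b*y + c <= 0.
for a, b, c in hull.equations:
    print(f"{a:+.2f}*x {b:+.2f}*y {c:+.2f} <= 0")
# Up to positive scaling these are -x + 1 <= 0, x - y <= 0 and x + y - 4 <= 0,
# i.e. x >= 1, y >= x and x <= 4 - y, as obtained by elimination.
```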

References
G. Kirchhoff, "Über die Auflösung der Gleichungen, auf welche man bei der Untersuchung der linearen Verteilung galvanischer Ströme geführt wird," Ann. Phys. Chem. 72, 1847.
А. А. Марков, "Распространение закона больших чисел на величины, зависящие друг от друга," Известия Физико-математического общества при Казанском университете, 2-я серия, том 15, 1906. [A. A. Markov, "Extension of the law of large numbers to quantities depending on one another," Bulletin of the Kazan University Physico-Mathematical Society, 2nd series, vol. 15, 1906.]
H. Minkowski, Geometrie der Zahlen, Leipzig, 1896.
J. Farkas, "Theorie der einfachen Ungleichungen," J. Reine u. Angew. Math., 124, 1-27.
H. Weyl, "Elementare Theorie der konvexen Polyeder," Comment. Math. Helv., 7, 1935.
A. Charnes, W. W. Cooper, "The Strong Minkowski-Farkas-Weyl Theorem for Vector Spaces Over Ordered Fields," Proceedings of the National Academy of Sciences, 1958.

References
E. Steinitz, "Bedingt konvergente Reihen und konvexe Systeme," J. Reine u. Angew. Math., 146, 1-52, 1916.
K. Jeev, J.-L. Lassez, "Symbolic Stochastic Systems," MSV/AMCS 2004.
V. Chandru, J.-L. Lassez, "Qualitative Theorem Proving in Linear Constraints," Verification: Theory and Practice, 2003.
J.-L. Lassez, "From LP to LP: Programming with Constraints," DBPL 1991.