Automated Proof Generation for EG Bram van Heuveln Spring 2003 AI & Reasoning Lab, RPI.


1 Automated Proof Generation for EG Bram van Heuveln Spring 2003 AI & Reasoning Lab, RPI

2 Automated Proof Generation Automated Proof Generation (APG) aims to come up with routines that systematically generate formal derivations (formal proofs). APG is a subdivision of Automated Theorem Proving (ATP): in ATP, one is not restricted to formal proofs to decide whether or not something follows. That is, proof generation is one way to do theorem proving, but there are other ways as well, such as truth tables, truth trees, Davis-Putnam, etc.

3 APG for EG I will present an APG routine that systematically derives a conclusion from a set of premises in the EG system, assuming that the conclusion is logically entailed by the premises. If the conclusion does not follow from the premises, the procedure produces counterexamples instead. Like most ATP routines, this APG routine is based on a consistency-checking mechanism, with a few added routines that turn the consistency check into a formal proof. First, we need to define the notion of a satisfaction graph (or model graph).

4 Satisfaction Graph A model is a list of literals such that for each literal in the list, its complement is not in the list. A satisfaction graph (or model graph) is a graph depicting a disjunction of models M1 ... Mn. For n = 1, it is a single model M; for n = 0, it is the empty cut.
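The consistency condition on models is easy to check directly. A minimal sketch in Python, assuming (purely for illustration; the slides fix no notation) that literals are strings with `~` marking negation:

```python
def complement(lit):
    """Return the complementary literal: P <-> ~P."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def is_model(literals):
    """A model is a list of literals none of whose complements also occurs."""
    s = set(literals)
    return all(complement(lit) not in s for lit in s)

print(is_model(["P", "~Q", "R"]))  # consistent: True
print(is_model(["P", "~P"]))       # P together with ~P: False
```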

5 Reducing a Graph with regard to a Literal The reduction of any graph G with regard to a literal L (i.e. L is either an atomic sentence or the negation of an atomic sentence) is obtained from G by removing all occurrences of L, and by replacing all complements of L with the empty cut. We'll write this as G|L.
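The reduction can be sketched under a hypothetical nested-list representation (my choice, not the slides'): an area is a Python list, a nested list is a cut, and a string is a literal with `~` for negation. Reducing with regard to L deletes every occurrence of L and replaces every occurrence of its complement by the empty cut `[]`:

```python
def complement(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def reduce_graph(area, lit):
    """Remove every occurrence of lit; replace complements by the empty cut []."""
    out = []
    for item in area:
        if isinstance(item, str):
            if item == lit:
                continue                 # lit is assumed true: drop it
            elif item == complement(lit):
                out.append([])           # its complement is false: empty cut
            else:
                out.append(item)
        else:
            out.append(reduce_graph(item, lit))   # recurse into the cut
    return out

# P and not(~P and Q), reduced with regard to P:
print(reduce_graph(["P", ["~P", "Q"]], "P"))  # [[[], 'Q']]
```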

6 Routine for Transforming any Graph into a Satisfaction Graph

   graph Transform(G)
   begin
      if G is the empty graph or already a satisfaction graph: return G
      if G contains a literal L on its outermost area:
         return Paste(L, Transform(G|L))
      if G is a double cut around H: return Transform(H)
      if G is a cut containing G1 ... Gn:
         return Transform(cut(G1)) or ... or Transform(cut(Gn))
      // when applicable, remove DC's and duplicates
   end

7 Paste Routine

   graph Paste(L, G)   // L is a literal, G is a satisfaction graph
   begin
      if G is the empty cut: return the empty cut
      if G is a single model M: return L M
      if G is a disjunction of models M1 ... Mn:
         return the disjunction of L M1 ... L Mn
   end
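With a satisfaction graph represented as a list of models (here frozensets of literal strings, with the empty list playing the role of the empty cut), Paste is one line. A sketch under that assumed representation:

```python
def paste(lit, sat_graph):
    """Add lit to every model of a satisfaction graph.

    sat_graph is a list of models (frozensets of literals); the empty
    list stands for the empty cut, and pasting into it changes nothing.
    """
    return [model | {lit} for model in sat_graph]

print(paste("P", [frozenset(), frozenset({"~Q"})]))  # P added to each model
print(paste("P", []))                                # the empty cut stays []
```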

8 The Trans Routine and Model Graphs Basically, for any graph G, the Transform routine (Trans) transforms G into a satisfaction graph S, where S is a DNF equivalent of G. The Trans routine also makes sure that all disjuncts (which are conjunctions of literals) are consistent. Hence, S represents the various truth-value assignments (models) that can be used to satisfy G. If G is unsatisfiable, then there are no models for G, and hence S will be a disjunction of 0 disjuncts, i.e. the empty cut. An example follows on the next slide.
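The whole pipeline can be sketched in Python. Everything below is an illustrative reconstruction under assumed conventions, not the author's code: an area is a list, a nested list is a cut, `~` marks negation, and a satisfaction graph is returned as a list of models (frozensets), with `[]` standing for the empty cut:

```python
def complement(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def reduce_graph(area, lit):
    """Drop every occurrence of lit; replace its complement by the empty cut."""
    out = []
    for item in area:
        if isinstance(item, str):
            if item == lit:
                continue
            out.append([] if item == complement(lit) else item)
        else:
            out.append(reduce_graph(item, lit))
    return out

def paste(lit, sat_graph):
    """Add lit to every model of a satisfaction graph."""
    return [model | {lit} for model in sat_graph]

def transform(area):
    """Turn a graph into a satisfaction graph: a list of consistent models."""
    for lit in area:
        if isinstance(lit, str):                # a literal on the outer area
            rest = list(area)
            rest.remove(lit)
            return paste(lit, transform(reduce_graph(rest, lit)))
    if not area:                                # empty graph: one empty model
        return [frozenset()]
    first, rest = area[0], area[1:]             # area now holds only cuts
    if not first:                               # empty cut: no models
        return []
    if len(first) == 1 and isinstance(first[0], list):
        return transform(first[0] + rest)       # remove a double cut
    if len(first) == 1:                         # cut around a single literal
        return transform([complement(first[0])] + rest)
    models = []                                 # cut(G1 .. Gn): branch on each Gi
    for part in first:
        for model in transform([[part]] + rest):
            if model not in models:             # drop duplicate models
                models.append(model)
    return models

# Premises P and P->Q (written [P [Q]]), with [Q] added on the sheet:
print(transform(["P", ["P", ["Q"]], ["Q"]]))  # unsatisfiable: []
print(transform(["P", ["P", ["Q"]]]))         # the single model {P, Q}
```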

9 [Worked example: Trans applied step by step to a graph over the atoms V, W, and U, reducing it to a satisfaction graph.]

10 Final Routine for APG in EG Given any graphs Γ and Δ, if Γ |= Δ, the following routine systematically transforms Γ into Δ, using the inference rules from EG (cuts written as square brackets):

   Γ  ⇒DC  Γ [[ ]]  ⇒IN  Γ [[Δ] [ ]]  ⇒IT(2x)  Γ [[Δ] [Γ [Δ]]]  ⇒Trans  Γ [[Δ] [[ ]]]  ⇒DC  Γ Δ  ⇒E  Δ

11 Why the Routine Works The routine works because: 1. The Trans routine finds all models that satisfy a graph. So, if Γ |= Δ, then the graph for Γ and [Δ] (i.e. Γ ∧ ¬Δ) will be unsatisfiable, and hence the Trans routine will output an empty cut. 2. The Transform and Paste routines can be simulated by following EG rules. That is, reducing a graph with regard to some literal amounts to deiteration, pasting a literal back into the possible models amounts to iteration, and other than that you use Double Cut rules. It is interesting to note that the Trans routine follows very much the same path as the EGTT satisfiability routine (illustrated on the next 2 slides).
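Point 1 is the classical reduction of entailment to unsatisfiability: Γ |= Δ exactly when Γ ∧ ¬Δ has no model. A brute-force truth-table sketch, with formulas represented as hypothetical Python predicates over an assignment (an illustration of the principle, not of the EG routine itself):

```python
from itertools import product

def entails(premise, conclusion, atoms):
    """Gamma |= Delta  iff  Gamma and not-Delta has no satisfying assignment."""
    for values in product([True, False], repeat=len(atoms)):
        env = dict(zip(atoms, values))
        if premise(env) and not conclusion(env):
            return False                 # a model of Gamma and ~Delta: counterexample
    return True

# P, P->Q |= Q (modus ponens):
print(entails(lambda e: e["P"] and ((not e["P"]) or e["Q"]),
              lambda e: e["Q"], ["P", "Q"]))  # True
# P |= Q fails; P=True, Q=False is a counterexample:
print(entails(lambda e: e["P"], lambda e: e["Q"], ["P", "Q"]))  # False
```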

12 [Worked example: the EGTT satisfiability routine Sat applied to a graph over the atoms X and Y; the graph reduces to False, i.e. it is unsatisfiable.]

13 [Worked example: Trans applied to the same graph over X and Y, following the same reduction path as Sat.]


