# HG changeset patch # User wenzelm # Date 1451141054 -3600 # Node ID 2e48182cc82c85312016955054e052aa201f56ad # Parent ed47044ee6a6478a4d6ac442eb74d48566f9ce4c misc tuning and modernization; diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Basic_Logic.thy --- a/src/HOL/Isar_Examples/Basic_Logic.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Basic_Logic.thy Sat Dec 26 15:44:14 2015 +0100 @@ -1,5 +1,5 @@ (* Title: HOL/Isar_Examples/Basic_Logic.thy - Author: Markus Wenzel, TU Muenchen + Author: Makarius Basic propositional and quantifier reasoning. *) @@ -13,9 +13,11 @@ subsection \Pure backward reasoning\ -text \In order to get a first idea of how Isabelle/Isar proof documents may - look like, we consider the propositions \I\, \K\, and \S\. The following - (rather explicit) proofs should require little extra explanations.\ +text \ + In order to get a first idea of how Isabelle/Isar proof documents may look + like, we consider the propositions \I\, \K\, and \S\. The following (rather + explicit) proofs should require little extra explanations. +\ lemma I: "A \ A" proof @@ -50,54 +52,52 @@ qed qed -text \Isar provides several ways to fine-tune the reasoning, avoiding - excessive detail. Several abbreviated language elements are available, - enabling the writer to express proofs in a more concise way, even without - referring to any automated proof tools yet. - - First of all, proof by assumption may be abbreviated as a single dot.\ +text \ + Isar provides several ways to fine-tune the reasoning, avoiding excessive + detail. Several abbreviated language elements are available, enabling the + writer to express proofs in a more concise way, even without referring to + any automated proof tools yet. -lemma "A \ A" -proof - assume A - show A by fact+ -qed - -text \In fact, concluding any (sub-)proof already involves solving any - remaining goals by assumption\footnote{This is not a completely trivial - operation, as proof by assumption may involve full higher-order - unification.}. Thus we may skip the rather vacuous body of the above proof - as well.\ + Concluding any (sub-)proof already involves solving any remaining goals by + assumption\<^footnote>\This is not a completely trivial operation, as proof by + assumption may involve full higher-order unification.\. Thus we may skip the + rather vacuous body of the above proof. +\ lemma "A \ A" proof qed -text \Note that the \isacommand{proof} command refers to the \rule\ method - (without arguments) by default. Thus it implicitly applies a single rule, - as determined from the syntactic form of the statements involved. The - \isacommand{by} command abbreviates any proof with empty body, so the - proof may be further pruned.\ +text \ + Note that the \<^theory_text>\proof\ command refers to the \rule\ method (without + arguments) by default. Thus it implicitly applies a single rule, as + determined from the syntactic form of the statements involved. The \<^theory_text>\by\ + command abbreviates any proof with empty body, so the proof may be further + pruned. +\ lemma "A \ A" by rule -text \Proof by a single rule may be abbreviated as double-dot.\ +text \ + Proof by a single rule may be abbreviated as double-dot. +\ lemma "A \ A" .. 
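(* Supplementary sketch, not taken from the changeset above: it assumes only the
   standard impI/conjI introduction rules that the default "rule" step selects,
   and shows how the abbreviated elements just discussed combine on a single goal. *)

lemma "A \<longrightarrow> A \<and> A"
proof  (* initial proof step applies the default rule, here impI *)
  assume A
  then show "A \<and> A" ..  (* double-dot: single rule conjI, remaining goal by assumption *)
qed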
-text \Thus we have arrived at an adequate representation of the proof of a - tautology that holds by a single standard rule.\footnote{Apparently, the - rule here is implication introduction.} +text \ + Thus we have arrived at an adequate representation of the proof of a + tautology that holds by a single standard rule.\<^footnote>\Apparently, the + rule here is implication introduction.\ \<^medskip> Let us also reconsider \K\. Its statement is composed of iterated - connectives. Basic decomposition is by a single rule at a time, which is - why our first version above was by nesting two proofs. + connectives. Basic decomposition is by a single rule at a time, which is why + our first version above was by nesting two proofs. - The \intro\ proof method repeatedly decomposes a goal's - conclusion.\footnote{The dual method is \elim\, acting on a goal's - premises.}\ + The \intro\ proof method repeatedly decomposes a goal's conclusion.\<^footnote>\The + dual method is \elim\, acting on a goal's premises.\ +\ lemma "A \ B \ A" proof (intro impI) @@ -110,29 +110,32 @@ lemma "A \ B \ A" by (intro impI) -text \Just like \rule\, the \intro\ and \elim\ proof methods pick standard +text \ + Just like \rule\, the \intro\ and \elim\ proof methods pick standard structural rules, in case no explicit arguments are given. While implicit - rules are usually just fine for single rule application, this may go too - far with iteration. Thus in practice, \intro\ and \elim\ would be - typically restricted to certain structures by giving a few rules only, - e.g.\ \isacommand{proof}~\(intro impI allI)\ to strip implications and - universal quantifiers. + rules are usually just fine for single rule application, this may go too far + with iteration. Thus in practice, \intro\ and \elim\ would be typically + restricted to certain structures by giving a few rules only, e.g.\ \<^theory_text>\proof + (intro impI allI)\ to strip implications and universal quantifiers. Such well-tuned iterated decomposition of certain structures is the prime - application of \intro\ and \elim\. In contrast, terminal steps that solve - a goal completely are usually performed by actual automated proof methods - (such as \isacommand{by}~\blast\.\ + application of \intro\ and \elim\. In contrast, terminal steps that solve a + goal completely are usually performed by actual automated proof methods + (such as \<^theory_text>\by blast\. +\ subsection \Variations of backward vs.\ forward reasoning\ -text \Certainly, any proof may be performed in backward-style only. On the - other hand, small steps of reasoning are often more naturally expressed in +text \ + Certainly, any proof may be performed in backward-style only. On the other + hand, small steps of reasoning are often more naturally expressed in forward-style. Isar supports both backward and forward reasoning as a first-class concept. In order to demonstrate the difference, we consider several proofs of \A \ B \ B \ A\. - The first version is purely backward.\ + The first version is purely backward. +\ lemma "A \ B \ B \ A" proof @@ -144,12 +147,13 @@ qed qed -text \Above, the projection rules \conjunct1\ / \conjunct2\ had to be named - explicitly, since the goals \B\ and \A\ did not provide any structural - clue. This may be avoided using \isacommand{from} to focus on the \A \ B\ - assumption as the current facts, enabling the use of double-dot proofs. 
- Note that \isacommand{from} already does forward-chaining, involving the - \conjE\ rule here.\ +text \ + Above, the projection rules \conjunct1\ / \conjunct2\ had to be named + explicitly, since the goals \B\ and \A\ did not provide any structural clue. + This may be avoided using \<^theory_text>\from\ to focus on the \A \ B\ assumption as the + current facts, enabling the use of double-dot proofs. Note that \<^theory_text>\from\ + already does forward-chaining, involving the \conjE\ rule here. +\ lemma "A \ B \ B \ A" proof @@ -161,12 +165,14 @@ qed qed -text \In the next version, we move the forward step one level upwards. - Forward-chaining from the most recent facts is indicated by the - \isacommand{then} command. Thus the proof of \B \ A\ from \A \ B\ actually - becomes an elimination, rather than an introduction. The resulting proof - structure directly corresponds to that of the \conjE\ rule, including the - repeated goal proposition that is abbreviated as \?thesis\ below.\ +text \ + In the next version, we move the forward step one level upwards. + Forward-chaining from the most recent facts is indicated by the \<^theory_text>\then\ + command. Thus the proof of \B \ A\ from \A \ B\ actually becomes an + elimination, rather than an introduction. The resulting proof structure + directly corresponds to that of the \conjE\ rule, including the repeated + goal proposition that is abbreviated as \?thesis\ below. +\ lemma "A \ B \ B \ A" proof @@ -178,9 +184,11 @@ qed qed -text \In the subsequent version we flatten the structure of the main body by - doing forward reasoning all the time. Only the outermost decomposition - step is left as backward.\ +text \ + In the subsequent version we flatten the structure of the main body by doing + forward reasoning all the time. Only the outermost decomposition step is + left as backward. +\ lemma "A \ B \ B \ A" proof @@ -190,9 +198,11 @@ from \B\ \A\ show "B \ A" .. qed -text \We can still push forward-reasoning a bit further, even at the risk of - getting ridiculous. Note that we force the initial proof step to do - nothing here, by referring to the \-\ proof method.\ +text \ + We can still push forward-reasoning a bit further, even at the risk of + getting ridiculous. Note that we force the initial proof step to do nothing + here, by referring to the \-\ proof method. +\ lemma "A \ B \ B \ A" proof - @@ -208,22 +218,22 @@ text \ \<^medskip> With these examples we have shifted through a whole range from purely - backward to purely forward reasoning. Apparently, in the extreme ends we - get slightly ill-structured proofs, which also require much explicit - naming of either rules (backward) or local facts (forward). + backward to purely forward reasoning. Apparently, in the extreme ends we get + slightly ill-structured proofs, which also require much explicit naming of + either rules (backward) or local facts (forward). - The general lesson learned here is that good proof style would achieve - just the \<^emph>\right\ balance of top-down backward decomposition, and - bottom-up forward composition. In general, there is no single best way to - arrange some pieces of formal reasoning, of course. Depending on the - actual applications, the intended audience etc., rules (and methods) on - the one hand vs.\ facts on the other hand have to be emphasized in an - appropriate way. This requires the proof writer to develop good taste, and - some practice, of course. 
+ The general lesson learned here is that good proof style would achieve just + the \<^emph>\right\ balance of top-down backward decomposition, and bottom-up + forward composition. In general, there is no single best way to arrange some + pieces of formal reasoning, of course. Depending on the actual applications, + the intended audience etc., rules (and methods) on the one hand vs.\ facts + on the other hand have to be emphasized in an appropriate way. This requires + the proof writer to develop good taste, and some practice, of course. \<^medskip> - For our example the most appropriate way of reasoning is probably the - middle one, with conjunction introduction done after elimination.\ + For our example the most appropriate way of reasoning is probably the middle + one, with conjunction introduction done after elimination. +\ lemma "A \ B \ B \ A" proof @@ -239,15 +249,19 @@ subsection \A few examples from ``Introduction to Isabelle''\ -text \We rephrase some of the basic reasoning examples of @{cite - "isabelle-intro"}, using HOL rather than FOL.\ +text \ + We rephrase some of the basic reasoning examples of @{cite + "isabelle-intro"}, using HOL rather than FOL. +\ subsubsection \A propositional proof\ -text \We consider the proposition \P \ P \ P\. The proof below involves - forward-chaining from \P \ P\, followed by an explicit case-analysis on - the two \<^emph>\identical\ cases.\ +text \ + We consider the proposition \P \ P \ P\. The proof below involves + forward-chaining from \P \ P\, followed by an explicit case-analysis on the + two \<^emph>\identical\ cases. +\ lemma "P \ P \ P" proof @@ -260,9 +274,10 @@ qed qed -text \Case splits are \<^emph>\not\ hardwired into the Isar language as a - special feature. The \isacommand{next} command used to separate the cases - above is just a short form of managing block structure. +text \ + Case splits are \<^emph>\not\ hardwired into the Isar language as a special + feature. The \<^theory_text>\next\ command used to separate the cases above is just a + short form of managing block structure. \<^medskip> In general, applying proof methods may split up a goal into separate @@ -270,17 +285,17 @@ corresponding proof text typically mimics this by establishing results in appropriate contexts, separated by blocks. - In order to avoid too much explicit parentheses, the Isar system - implicitly opens an additional block for any new goal, the - \isacommand{next} statement then closes one block level, opening a new - one. The resulting behaviour is what one would expect from separating - cases, only that it is more flexible. E.g.\ an induction base case (which - does not introduce local assumptions) would \<^emph>\not\ require - \isacommand{next} to separate the subsequent step case. + In order to avoid too much explicit parentheses, the Isar system implicitly + opens an additional block for any new goal, the \<^theory_text>\next\ statement then + closes one block level, opening a new one. The resulting behaviour is what + one would expect from separating cases, only that it is more flexible. E.g.\ + an induction base case (which does not introduce local assumptions) would + \<^emph>\not\ require \<^theory_text>\next\ to separate the subsequent step case. \<^medskip> In our example the situation is even simpler, since the two cases actually - coincide. Consequently the proof may be rephrased as follows.\ + coincide. Consequently the proof may be rephrased as follows. 
+\ lemma "P \ P \ P" proof @@ -307,16 +322,17 @@ subsubsection \A quantifier proof\ -text \To illustrate quantifier reasoning, let us prove +text \ + To illustrate quantifier reasoning, let us prove \(\x. P (f x)) \ (\y. P y)\. Informally, this holds because any \a\ with \P (f a)\ may be taken as a witness for the second existential statement. The first proof is rather verbose, exhibiting quite a lot of (redundant) - detail. It gives explicit rules, even with some instantiation. - Furthermore, we encounter two new language elements: the \isacommand{fix} - command augments the context by some new ``arbitrary, but fixed'' element; - the \isacommand{is} annotation binds term abbreviations by higher-order - pattern matching.\ + detail. It gives explicit rules, even with some instantiation. Furthermore, + we encounter two new language elements: the \<^theory_text>\fix\ command augments the + context by some new ``arbitrary, but fixed'' element; the \<^theory_text>\is\ annotation + binds term abbreviations by higher-order pattern matching. +\ lemma "(\x. P (f x)) \ (\y. P y)" proof @@ -329,11 +345,13 @@ qed qed -text \While explicit rule instantiation may occasionally improve readability - of certain aspects of reasoning, it is usually quite redundant. Above, the - basic proof outline gives already enough structural clues for the system - to infer both the rules and their instances (by higher-order unification). - Thus we may as well prune the text as follows.\ +text \ + While explicit rule instantiation may occasionally improve readability of + certain aspects of reasoning, it is usually quite redundant. Above, the + basic proof outline gives already enough structural clues for the system to + infer both the rules and their instances (by higher-order unification). Thus + we may as well prune the text as follows. +\ lemma "(\x. P (f x)) \ (\y. P y)" proof @@ -346,9 +364,11 @@ qed qed -text \Explicit \\\-elimination as seen above can become quite cumbersome in - practice. The derived Isar language element ``\isakeyword{obtain}'' - provides a more handsome way to do generalized existence reasoning.\ +text \ + Explicit \\\-elimination as seen above can become quite cumbersome in + practice. The derived Isar language element ``\<^theory_text>\obtain\'' provides a more + handsome way to do generalized existence reasoning. +\ lemma "(\x. P (f x)) \ (\y. P y)" proof @@ -357,19 +377,22 @@ then show "\y. P y" .. qed -text \Technically, \isakeyword{obtain} is similar to \isakeyword{fix} and - \isakeyword{assume} together with a soundness proof of the elimination - involved. Thus it behaves similar to any other forward proof element. Also - note that due to the nature of general existence reasoning involved here, - any result exported from the context of an \isakeyword{obtain} statement - may \<^emph>\not\ refer to the parameters introduced there.\ +text \ + Technically, \<^theory_text>\obtain\ is similar to \<^theory_text>\fix\ and \<^theory_text>\assume\ together with a + soundness proof of the elimination involved. Thus it behaves similar to any + other forward proof element. Also note that due to the nature of general + existence reasoning involved here, any result exported from the context of + an \<^theory_text>\obtain\ statement may \<^emph>\not\ refer to the parameters introduced there. +\ subsubsection \Deriving rules in Isabelle\ -text \We derive the conjunction elimination rule from the corresponding +text \ + We derive the conjunction elimination rule from the corresponding projections. 
The proof is quite straight-forward, since Isabelle/Isar - supports non-atomic goals and assumptions fully transparently.\ + supports non-atomic goals and assumptions fully transparently. +\ theorem conjE: "A \ B \ (A \ B \ C) \ C" proof - diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Drinker.thy --- a/src/HOL/Isar_Examples/Drinker.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Drinker.thy Sat Dec 26 15:44:14 2015 +0100 @@ -8,11 +8,12 @@ imports Main begin -text \Here is another example of classical reasoning: the Drinker's - Principle says that for some person, if he is drunk, everybody else is - drunk! +text \ + Here is another example of classical reasoning: the Drinker's Principle says + that for some person, if he is drunk, everybody else is drunk! - We first prove a classical part of de-Morgan's law.\ + We first prove a classical part of de-Morgan's law. +\ lemma de_Morgan: assumes "\ (\x. P x)" diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Expr_Compiler.thy --- a/src/HOL/Isar_Examples/Expr_Compiler.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Expr_Compiler.thy Sat Dec 26 15:44:14 2015 +0100 @@ -1,5 +1,5 @@ (* Title: HOL/Isar_Examples/Expr_Compiler.thy - Author: Markus Wenzel, TU Muenchen + Author: Makarius Correctness of a simple expression/stack-machine compiler. *) @@ -10,32 +10,40 @@ imports Main begin -text \This is a (rather trivial) example of program verification. We model a +text \ + This is a (rather trivial) example of program verification. We model a compiler for translating expressions to stack machine instructions, and - prove its correctness wrt.\ some evaluation semantics.\ + prove its correctness wrt.\ some evaluation semantics. +\ subsection \Binary operations\ -text \Binary operations are just functions over some type of values. This is - both for abstract syntax and semantics, i.e.\ we use a ``shallow - embedding'' here.\ +text \ + Binary operations are just functions over some type of values. This is both + for abstract syntax and semantics, i.e.\ we use a ``shallow embedding'' + here. +\ type_synonym 'val binop = "'val \ 'val \ 'val" subsection \Expressions\ -text \The language of expressions is defined as an inductive type, - consisting of variables, constants, and binary operations on expressions.\ +text \ + The language of expressions is defined as an inductive type, consisting of + variables, constants, and binary operations on expressions. +\ datatype (dead 'adr, dead 'val) expr = Variable 'adr | Constant 'val | Binop "'val binop" "('adr, 'val) expr" "('adr, 'val) expr" -text \Evaluation (wrt.\ some environment of variable assignments) is defined - by primitive recursion over the structure of expressions.\ +text \ + Evaluation (wrt.\ some environment of variable assignments) is defined by + primitive recursion over the structure of expressions. +\ primrec eval :: "('adr, 'val) expr \ ('adr \ 'val) \ 'val" where @@ -46,15 +54,19 @@ subsection \Machine\ -text \Next we model a simple stack machine, with three instructions.\ +text \ + Next we model a simple stack machine, with three instructions. +\ datatype (dead 'adr, dead 'val) instr = Const 'val | Load 'adr | Apply "'val binop" -text \Execution of a list of stack machine instructions is easily defined as - follows.\ +text \ + Execution of a list of stack machine instructions is easily defined as + follows. 
+\ primrec exec :: "(('adr, 'val) instr) list \ 'val list \ ('adr \ 'val) \ 'val list" where @@ -72,8 +84,10 @@ subsection \Compiler\ -text \We are ready to define the compilation function of expressions to - lists of stack machine instructions.\ +text \ + We are ready to define the compilation function of expressions to lists of + stack machine instructions. +\ primrec compile :: "('adr, 'val) expr \ (('adr, 'val) instr) list" where @@ -82,8 +96,10 @@ | "compile (Binop f e1 e2) = compile e2 @ compile e1 @ [Apply f]" -text \The main result of this development is the correctness theorem for - \compile\. We first establish a lemma about \exec\ and list append.\ +text \ + The main result of this development is the correctness theorem for + \compile\. We first establish a lemma about \exec\ and list append. +\ lemma exec_append: "exec (xs @ ys) stack env = @@ -128,8 +144,8 @@ In the proofs above, the \simp\ method does quite a lot of work behind the scenes (mostly ``functional program execution''). Subsequently, the same reasoning is elaborated in detail --- at most one recursive function - definition is used at a time. Thus we get a better idea of what is - actually going on. + definition is used at a time. Thus we get a better idea of what is actually + going on. \ lemma exec_append': diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Fibonacci.thy --- a/src/HOL/Isar_Examples/Fibonacci.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Fibonacci.thy Sat Dec 26 15:44:14 2015 +0100 @@ -18,9 +18,8 @@ imports "../Number_Theory/Primes" begin -text_raw \\footnote{Isar version by Gertrud Bauer. Original tactic - script by Larry Paulson. A few proofs of laws taken from - @{cite "Concrete-Math"}.}\ +text_raw \\<^footnote>\Isar version by Gertrud Bauer. Original tactic script by Larry + Paulson. A few proofs of laws taken from @{cite "Concrete-Math"}.\\ declare One_nat_def [simp] diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Group.thy --- a/src/HOL/Isar_Examples/Group.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Group.thy Sat Dec 26 15:44:14 2015 +0100 @@ -1,5 +1,5 @@ (* Title: HOL/Isar_Examples/Group.thy - Author: Markus Wenzel, TU Muenchen + Author: Makarius *) section \Basic group theory\ @@ -10,17 +10,21 @@ subsection \Groups and calculational reasoning\ -text \Groups over signature \(* :: \ \ \ \ \, 1 :: \, inverse :: \ \ \)\ - are defined as an axiomatic type class as follows. Note that the parent - class \times\ is provided by the basic HOL theory.\ +text \ + Groups over signature \(* :: \ \ \ \ \, 1 :: \, inverse :: \ \ \)\ are + defined as an axiomatic type class as follows. Note that the parent class + \times\ is provided by the basic HOL theory. +\ class group = times + one + inverse + assumes group_assoc: "(x * y) * z = x * (y * z)" and group_left_one: "1 * x = x" and group_left_inverse: "inverse x * x = 1" -text \The group axioms only state the properties of left one and inverse, - the right versions may be derived as follows.\ +text \ + The group axioms only state the properties of left one and inverse, the + right versions may be derived as follows. +\ theorem (in group) group_right_inverse: "x * inverse x = 1" proof - @@ -43,8 +47,10 @@ finally show ?thesis . qed -text \With \group_right_inverse\ already available, \group_right_one\ - \label{thm:group-right-one} is now established much easier.\ +text \ + With \group_right_inverse\ already available, \group_right_one\ + is now established much easier. 
+\ theorem (in group) group_right_one: "x * 1 = x" proof - @@ -63,23 +69,23 @@ \<^medskip> The calculational proof style above follows typical presentations given in any introductory course on algebra. The basic technique is to form a - transitive chain of equations, which in turn are established by - simplifying with appropriate rules. The low-level logical details of - equational reasoning are left implicit. + transitive chain of equations, which in turn are established by simplifying + with appropriate rules. The low-level logical details of equational + reasoning are left implicit. Note that ``\\\'' is just a special term variable that is bound - automatically to the argument\footnote{The argument of a curried infix - expression happens to be its right-hand side.} of the last fact achieved - by any local assumption or proven statement. In contrast to \?thesis\, the - ``\\\'' variable is bound \<^emph>\after\ the proof is finished. + automatically to the argument\<^footnote>\The argument of a curried infix expression + happens to be its right-hand side.\ of the last fact achieved by any local + assumption or proven statement. In contrast to \?thesis\, the ``\\\'' + variable is bound \<^emph>\after\ the proof is finished. - There are only two separate Isar language elements for calculational - proofs: ``\isakeyword{also}'' for initial or intermediate calculational - steps, and ``\isakeyword{finally}'' for exhibiting the result of a - calculation. These constructs are not hardwired into Isabelle/Isar, but - defined on top of the basic Isar/VM interpreter. Expanding the - \isakeyword{also} and \isakeyword{finally} derived language elements, - calculations may be simulated by hand as demonstrated below.\ + There are only two separate Isar language elements for calculational proofs: + ``\<^theory_text>\also\'' for initial or intermediate calculational steps, and + ``\<^theory_text>\finally\'' for exhibiting the result of a calculation. These constructs + are not hardwired into Isabelle/Isar, but defined on top of the basic + Isar/VM interpreter. Expanding the \<^theory_text>\also\ and \<^theory_text>\finally\ derived language + elements, calculations may be simulated by hand as demonstrated below. +\ theorem (in group) "x * 1 = x" proof - @@ -112,30 +118,35 @@ show ?thesis . qed -text \Note that this scheme of calculations is not restricted to plain +text \ + Note that this scheme of calculations is not restricted to plain transitivity. Rules like anti-symmetry, or even forward and backward - substitution work as well. For the actual implementation of - \isacommand{also} and \isacommand{finally}, Isabelle/Isar maintains - separate context information of ``transitivity'' rules. Rule selection - takes place automatically by higher-order unification.\ + substitution work as well. For the actual implementation of \<^theory_text>\also\ and + \<^theory_text>\finally\, Isabelle/Isar maintains separate context information of + ``transitivity'' rules. Rule selection takes place automatically by + higher-order unification. +\ subsection \Groups as monoids\ -text \Monoids over signature \(* :: \ \ \ \ \, 1 :: \)\ are defined like - this.\ +text \ + Monoids over signature \(* :: \ \ \ \ \, 1 :: \)\ are defined like this. +\ class monoid = times + one + assumes monoid_assoc: "(x * y) * z = x * (y * z)" and monoid_left_one: "1 * x = x" and monoid_right_one: "x * 1 = x" -text \Groups are \<^emph>\not\ yet monoids directly from the definition. 
For - monoids, \right_one\ had to be included as an axiom, but for groups both - \right_one\ and \right_inverse\ are derivable from the other axioms. With - \group_right_one\ derived as a theorem of group theory (see - page~\pageref{thm:group-right-one}), we may still instantiate - \group \ monoid\ properly as follows.\ +text \ + Groups are \<^emph>\not\ yet monoids directly from the definition. For monoids, + \right_one\ had to be included as an axiom, but for groups both \right_one\ + and \right_inverse\ are derivable from the other axioms. With + \group_right_one\ derived as a theorem of group theory (see @{thm + group_right_one}), we may still instantiate \group \ monoid\ properly as + follows. +\ instance group \ monoid by intro_classes @@ -143,18 +154,21 @@ rule group_left_one, rule group_right_one) -text \The \isacommand{instance} command actually is a version of - \isacommand{theorem}, setting up a goal that reflects the intended class - relation (or type constructor arity). Thus any Isar proof language element - may be involved to establish this statement. When concluding the proof, - the result is transformed into the intended type signature extension - behind the scenes.\ +text \ + The \<^theory_text>\instance\ command actually is a version of \<^theory_text>\theorem\, setting up a + goal that reflects the intended class relation (or type constructor arity). + Thus any Isar proof language element may be involved to establish this + statement. When concluding the proof, the result is transformed into the + intended type signature extension behind the scenes. +\ subsection \More theorems of group theory\ -text \The one element is already uniquely determined by preserving - an \<^emph>\arbitrary\ group element.\ +text \ + The one element is already uniquely determined by preserving an \<^emph>\arbitrary\ + group element. +\ theorem (in group) group_one_equality: assumes eq: "e * x = x" @@ -173,7 +187,9 @@ finally show ?thesis . qed -text \Likewise, the inverse is already determined by the cancel property.\ +text \ + Likewise, the inverse is already determined by the cancel property. +\ theorem (in group) group_inverse_equality: assumes eq: "x' * x = 1" @@ -192,7 +208,9 @@ finally show ?thesis . qed -text \The inverse operation has some further characteristic properties.\ +text \ + The inverse operation has some further characteristic properties. +\ theorem (in group) group_inverse_times: "inverse (x * y) = inverse y * inverse x" proof (rule group_inverse_equality) diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Hoare.thy --- a/src/HOL/Isar_Examples/Hoare.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Hoare.thy Sat Dec 26 15:44:14 2015 +0100 @@ -1,5 +1,5 @@ (* Title: HOL/Isar_Examples/Hoare.thy - Author: Markus Wenzel, TU Muenchen + Author: Makarius A formulation of Hoare logic suitable for Isar. *) @@ -12,10 +12,12 @@ subsection \Abstract syntax and semantics\ -text \The following abstract syntax and semantics of Hoare Logic over - \<^verbatim>\WHILE\ programs closely follows the existing tradition in Isabelle/HOL - of formalizing the presentation given in @{cite \\S6\ "Winskel:1993"}. See - also @{file "~~/src/HOL/Hoare"} and @{cite "Nipkow:1998:Winskel"}.\ +text \ + The following abstract syntax and semantics of Hoare Logic over \<^verbatim>\WHILE\ + programs closely follows the existing tradition in Isabelle/HOL of + formalizing the presentation given in @{cite \\S6\ "Winskel:1993"}. 
See also + @{file "~~/src/HOL/Hoare"} and @{cite "Nipkow:1998:Winskel"}. +\ type_synonym 'a bexp = "'a set" type_synonym 'a assn = "'a set" @@ -59,15 +61,17 @@ subsection \Primitive Hoare rules\ -text \From the semantics defined above, we derive the standard set of - primitive Hoare rules; e.g.\ see @{cite \\S6\ "Winskel:1993"}. Usually, - variant forms of these rules are applied in actual proof, see also - \S\ref{sec:hoare-isar} and \S\ref{sec:hoare-vcg}. +text \ + From the semantics defined above, we derive the standard set of primitive + Hoare rules; e.g.\ see @{cite \\S6\ "Winskel:1993"}. Usually, variant forms + of these rules are applied in actual proof, see also \S\ref{sec:hoare-isar} + and \S\ref{sec:hoare-vcg}. \<^medskip> The \basic\ rule represents any kind of atomic access to the state space. This subsumes the common rules of \skip\ and \assign\, as formulated in - \S\ref{sec:hoare-isar}.\ + \S\ref{sec:hoare-isar}. +\ theorem basic: "\ {s. f s \ P} (Basic f) P" proof @@ -78,8 +82,10 @@ with s show "s' \ P" by simp qed -text \The rules for sequential commands and semantic consequences are - established in a straight forward manner as follows.\ +text \ + The rules for sequential commands and semantic consequences are established + in a straight forward manner as follows. +\ theorem seq: "\ P c1 Q \ \ Q c2 R \ \ P (c1; c2) R" proof @@ -104,9 +110,10 @@ with QQ' show "s' \ Q'" .. qed -text \The rule for conditional commands is directly reflected by the - corresponding semantics; in the proof we just have to look closely which - cases apply.\ +text \ + The rule for conditional commands is directly reflected by the corresponding + semantics; in the proof we just have to look closely which cases apply. +\ theorem cond: assumes case_b: "\ (P \ b) c1 Q" @@ -134,11 +141,13 @@ qed qed -text \The \while\ rule is slightly less trivial --- it is the only one based - on recursion, which is expressed in the semantics by a Kleene-style least +text \ + The \while\ rule is slightly less trivial --- it is the only one based on + recursion, which is expressed in the semantics by a Kleene-style least fixed-point construction. The auxiliary statement below, which is by induction on the number of iterations is the main point to be proven; the - rest is by routine application of the semantics of \<^verbatim>\WHILE\.\ + rest is by routine application of the semantics of \<^verbatim>\WHILE\. +\ theorem while: assumes body: "\ (P \ b) c P" @@ -164,13 +173,14 @@ subsection \Concrete syntax for assertions\ -text \We now introduce concrete syntax for describing commands (with - embedded expressions) and assertions. The basic technique is that of - semantic ``quote-antiquote''. A \<^emph>\quotation\ is a syntactic entity - delimited by an implicit abstraction, say over the state space. An - \<^emph>\antiquotation\ is a marked expression within a quotation that refers the - implicit argument; a typical antiquotation would select (or even update) - components from the state. +text \ + We now introduce concrete syntax for describing commands (with embedded + expressions) and assertions. The basic technique is that of semantic + ``quote-antiquote''. A \<^emph>\quotation\ is a syntactic entity delimited by an + implicit abstraction, say over the state space. An \<^emph>\antiquotation\ is a + marked expression within a quotation that refers the implicit argument; a + typical antiquotation would select (or even update) components from the + state. 
We will see some examples later in the concrete rules and applications. @@ -182,7 +192,8 @@ concrete syntactic representation of our Hoare language, the actual ``ML drivers'' is quite involved. Just note that the we re-use the basic quote/antiquote translations as already defined in Isabelle/Pure (see @{ML - Syntax_Trans.quote_tr}, and @{ML Syntax_Trans.quote_tr'},).\ + Syntax_Trans.quote_tr}, and @{ML Syntax_Trans.quote_tr'},). +\ syntax "_quote" :: "'b \ ('a \ 'b)" @@ -211,9 +222,11 @@ in [(@{syntax_const "_quote"}, K quote_tr)] end \ -text \As usual in Isabelle syntax translations, the part for printing is - more complicated --- we cannot express parts as macro rules as above. - Don't look here, unless you have to do similar things for yourself.\ +text \ + As usual in Isabelle syntax translations, the part for printing is more + complicated --- we cannot express parts as macro rules as above. Don't look + here, unless you have to do similar things for yourself. +\ print_translation \ let @@ -242,14 +255,16 @@ subsection \Rules for single-step proof \label{sec:hoare-isar}\ -text \We are now ready to introduce a set of Hoare rules to be used in - single-step structured proofs in Isabelle/Isar. We refer to the concrete - syntax introduce above. +text \ + We are now ready to introduce a set of Hoare rules to be used in single-step + structured proofs in Isabelle/Isar. We refer to the concrete syntax + introduce above. \<^medskip> Assertions of Hoare Logic may be manipulated in calculational proofs, with the inclusion expressed in terms of sets or predicates. Reversed order is - supported as well.\ + supported as well. +\ lemma [trans]: "\ P c Q \ P' \ P \ \ P' c Q" by (unfold Valid_def) blast @@ -276,10 +291,11 @@ by (simp add: Valid_def) -text \Identity and basic assignments.\footnote{The \hoare\ method introduced - in \S\ref{sec:hoare-vcg} is able to provide proper instances for any - number of basic assignments, without producing additional verification - conditions.}\ +text \ + Identity and basic assignments.\<^footnote>\The \hoare\ method introduced in + \S\ref{sec:hoare-vcg} is able to provide proper instances for any number of + basic assignments, without producing additional verification conditions.\ +\ lemma skip [intro?]: "\ P SKIP P" proof - @@ -290,20 +306,22 @@ lemma assign: "\ P [\a/\x::'a] \x := \a P" by (rule basic) -text \Note that above formulation of assignment corresponds to our - preferred way to model state spaces, using (extensible) record types in - HOL @{cite "Naraschewski-Wenzel:1998:HOOL"}. For any record field \x\, - Isabelle/HOL provides a functions \x\ (selector) and \x_update\ (update). - Above, there is only a place-holder appearing for the latter kind of - function: due to concrete syntax \\x := \a\ also contains - \x_update\.\footnote{Note that due to the external nature of HOL record - fields, we could not even state a general theorem relating selector and - update functions (if this were required here); this would only work for - any particular instance of record fields introduced so far.} +text \ + Note that above formulation of assignment corresponds to our preferred way + to model state spaces, using (extensible) record types in HOL @{cite + "Naraschewski-Wenzel:1998:HOOL"}. For any record field \x\, Isabelle/HOL + provides a functions \x\ (selector) and \x_update\ (update). 
Above, there is + only a place-holder appearing for the latter kind of function: due to + concrete syntax \\x := \a\ also contains \x_update\.\<^footnote>\Note that due to the + external nature of HOL record fields, we could not even state a general + theorem relating selector and update functions (if this were required here); + this would only work for any particular instance of record fields introduced + so far.\ \<^medskip> - Sequential composition --- normalizing with associativity achieves proper - of chunks of code verified separately.\ + Sequential composition --- normalizing with associativity achieves proper of + chunks of code verified separately. +\ lemmas [trans, intro?] = seq @@ -342,15 +360,17 @@ subsection \Verification conditions \label{sec:hoare-vcg}\ -text \We now load the \<^emph>\original\ ML file for proof scripts and tactic - definition for the Hoare Verification Condition Generator (see @{file +text \ + We now load the \<^emph>\original\ ML file for proof scripts and tactic definition + for the Hoare Verification Condition Generator (see @{file "~~/src/HOL/Hoare/"}). As far as we are concerned here, the result is a proof method \hoare\, which may be applied to a Hoare Logic assertion to - extract purely logical verification conditions. It is important to note - that the method requires \<^verbatim>\WHILE\ loops to be fully annotated with - invariants beforehand. Furthermore, only \<^emph>\concrete\ pieces of code are - handled --- the underlying tactic fails ungracefully if supplied with - meta-variables or parameters, for example.\ + extract purely logical verification conditions. It is important to note that + the method requires \<^verbatim>\WHILE\ loops to be fully annotated with invariants + beforehand. Furthermore, only \<^emph>\concrete\ pieces of code are handled --- the + underlying tactic fails ungracefully if supplied with meta-variables or + parameters, for example. +\ lemma SkipRule: "p \ q \ Valid p (Basic id) q" by (auto simp add: Valid_def) diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Hoare_Ex.thy --- a/src/HOL/Isar_Examples/Hoare_Ex.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Hoare_Ex.thy Sat Dec 26 15:44:14 2015 +0100 @@ -6,9 +6,11 @@ subsection \State spaces\ -text \First of all we provide a store of program variables that occur in any - of the programs considered later. Slightly unexpected things may happen - when attempting to work with undeclared variables.\ +text \ + First of all we provide a store of program variables that occur in any of + the programs considered later. Slightly unexpected things may happen when + attempting to work with undeclared variables. +\ record vars = I :: nat @@ -16,28 +18,33 @@ N :: nat S :: nat -text \While all of our variables happen to have the same type, nothing would +text \ + While all of our variables happen to have the same type, nothing would prevent us from working with many-sorted programs as well, or even - polymorphic ones. Also note that Isabelle/HOL's extensible record types - even provides simple means to extend the state space later.\ + polymorphic ones. Also note that Isabelle/HOL's extensible record types even + provides simple means to extend the state space later. 
+\ subsection \Basic examples\ -text \We look at few trivialities involving assignment and sequential - composition, in order to get an idea of how to work with our formulation - of Hoare Logic.\ +text \ + We look at few trivialities involving assignment and sequential composition, + in order to get an idea of how to work with our formulation of Hoare Logic. -text \Using the basic \assign\ rule directly is a bit - cumbersome.\ + \<^medskip> + Using the basic \assign\ rule directly is a bit cumbersome. +\ lemma "\ \\(N_update (\_. (2 * \N))) \ \\N = 10\\ \N := 2 * \N \\N = 10\" by (rule assign) -text \Certainly we want the state modification already done, e.g.\ by +text \ + Certainly we want the state modification already done, e.g.\ by simplification. The \hoare\ method performs the basic state update for us; we may apply the Simplifier afterwards to achieve ``obvious'' consequences - as well.\ + as well. +\ lemma "\ \True\ \N := 10 \\N = 10\" by hoare @@ -66,10 +73,12 @@ \\M = b \ \N = a\" by hoare simp -text \It is important to note that statements like the following one can - only be proven for each individual program variable. Due to the - extra-logical nature of record fields, we cannot formulate a theorem - relating record selectors and updates schematically.\ +text \ + It is important to note that statements like the following one can only be + proven for each individual program variable. Due to the extra-logical nature + of record fields, we cannot formulate a theorem relating record selectors + and updates schematically. +\ lemma "\ \\N = a\ \N := \N \\N = a\" by hoare @@ -83,9 +92,11 @@ oops -text \In the following assignments we make use of the consequence rule in - order to achieve the intended precondition. Certainly, the \hoare\ method - is able to handle this case, too.\ +text \ + In the following assignments we make use of the consequence rule in order to + achieve the intended precondition. Certainly, the \hoare\ method is able to + handle this case, too. +\ lemma "\ \\M = \N\ \M := \M + 1 \\M \ \N\" proof - @@ -113,10 +124,12 @@ subsection \Multiplication by addition\ -text \We now do some basic examples of actual \<^verbatim>\WHILE\ programs. This one is - a loop for calculating the product of two natural numbers, by iterated - addition. We first give detailed structured proof based on single-step - Hoare rules.\ +text \ + We now do some basic examples of actual \<^verbatim>\WHILE\ programs. This one is a + loop for calculating the product of two natural numbers, by iterated + addition. We first give detailed structured proof based on single-step Hoare + rules. +\ lemma "\ \\M = 0 \ \S = 0\ @@ -140,10 +153,12 @@ finally show ?thesis . qed -text \The subsequent version of the proof applies the \hoare\ method to - reduce the Hoare statement to a purely logical problem that can be solved - fully automatically. Note that we have to specify the \<^verbatim>\WHILE\ loop - invariant in the original statement.\ +text \ + The subsequent version of the proof applies the \hoare\ method to reduce the + Hoare statement to a purely logical problem that can be solved fully + automatically. Note that we have to specify the \<^verbatim>\WHILE\ loop invariant in + the original statement. +\ lemma "\ \\M = 0 \ \S = 0\ @@ -156,15 +171,16 @@ subsection \Summing natural numbers\ -text \We verify an imperative program to sum natural numbers up to a given - limit. First some functional definition for proper specification of the - problem. 
+text \ + We verify an imperative program to sum natural numbers up to a given limit. + First some functional definition for proper specification of the problem. \<^medskip> The following proof is quite explicit in the individual steps taken, with the \hoare\ method only applied locally to take care of assignment and sequential composition. Note that we express intermediate proof obligation - in pure logic, without referring to the state space.\ + in pure logic, without referring to the state space. +\ theorem "\ \True\ @@ -202,8 +218,10 @@ finally show ?thesis . qed -text \The next version uses the \hoare\ method, while still explaining the - resulting proof obligations in an abstract, structured manner.\ +text \ + The next version uses the \hoare\ method, while still explaining the + resulting proof obligations in an abstract, structured manner. +\ theorem "\ \True\ @@ -228,8 +246,10 @@ qed qed -text \Certainly, this proof may be done fully automatic as well, provided - that the invariant is given beforehand.\ +text \ + Certainly, this proof may be done fully automatic as well, provided that the + invariant is given beforehand. +\ theorem "\ \True\ @@ -246,8 +266,10 @@ subsection \Time\ -text \A simple embedding of time in Hoare logic: function \timeit\ inserts - an extra variable to keep track of the elapsed time.\ +text \ + A simple embedding of time in Hoare logic: function \timeit\ inserts an + extra variable to keep track of the elapsed time. +\ record tstate = time :: nat diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Knaster_Tarski.thy --- a/src/HOL/Isar_Examples/Knaster_Tarski.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Knaster_Tarski.thy Sat Dec 26 15:44:14 2015 +0100 @@ -1,5 +1,5 @@ (* Title: HOL/Isar_Examples/Knaster_Tarski.thy - Author: Markus Wenzel, TU Muenchen + Author: Makarius Typical textbook proof example. *) @@ -13,28 +13,30 @@ subsection \Prose version\ -text \According to the textbook @{cite \pages 93--94\ "davey-priestley"}, - the Knaster-Tarski fixpoint theorem is as follows.\footnote{We have - dualized the argument, and tuned the notation a little bit.} +text \ + According to the textbook @{cite \pages 93--94\ "davey-priestley"}, the + Knaster-Tarski fixpoint theorem is as follows.\<^footnote>\We have dualized the + argument, and tuned the notation a little bit.\ \<^bold>\The Knaster-Tarski Fixpoint Theorem.\ Let \L\ be a complete lattice and - \f: L \ L\ an order-preserving map. Then \\{x \ L | f(x) \ x}\ is a - fixpoint of \f\. + \f: L \ L\ an order-preserving map. Then \\{x \ L | f(x) \ x}\ is a fixpoint + of \f\. - \<^bold>\Proof.\ Let \H = {x \ L | f(x) \ x}\ and \a = \H\. For all \x \ H\ we - have \a \ x\, so \f(a) \ f(x) \ x\. Thus \f(a)\ is a lower bound of \H\, - whence \f(a) \ a\. We now use this inequality to prove the reverse one (!) - and thereby complete the proof that \a\ is a fixpoint. Since \f\ is + \<^bold>\Proof.\ Let \H = {x \ L | f(x) \ x}\ and \a = \H\. For all \x \ H\ we have + \a \ x\, so \f(a) \ f(x) \ x\. Thus \f(a)\ is a lower bound of \H\, whence + \f(a) \ a\. We now use this inequality to prove the reverse one (!) and + thereby complete the proof that \a\ is a fixpoint. Since \f\ is order-preserving, \f(f(a)) \ f(a)\. This says \f(a) \ H\, so \a \ f(a)\.\ subsection \Formal versions\ -text \The Isar proof below closely follows the original presentation. - Virtually all of the prose narration has been rephrased in terms of formal - Isar language elements. 
Just as many textbook-style proofs, there is a - strong bias towards forward proof, and several bends in the course of - reasoning.\ +text \ + The Isar proof below closely follows the original presentation. Virtually + all of the prose narration has been rephrased in terms of formal Isar + language elements. Just as many textbook-style proofs, there is a strong + bias towards forward proof, and several bends in the course of reasoning. +\ theorem Knaster_Tarski: fixes f :: "'a::complete_lattice \ 'a" @@ -64,15 +66,17 @@ qed qed -text \Above we have used several advanced Isar language elements, such as - explicit block structure and weak assumptions. Thus we have mimicked the - particular way of reasoning of the original text. +text \ + Above we have used several advanced Isar language elements, such as explicit + block structure and weak assumptions. Thus we have mimicked the particular + way of reasoning of the original text. In the subsequent version the order of reasoning is changed to achieve structured top-down decomposition of the problem at the outer level, while only the inner steps of reasoning are done in a forward manner. We are certainly more at ease here, requiring only the most basic features of the - Isar language.\ + Isar language. +\ theorem Knaster_Tarski': fixes f :: "'a::complete_lattice \ 'a" diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Mutilated_Checkerboard.thy --- a/src/HOL/Isar_Examples/Mutilated_Checkerboard.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Mutilated_Checkerboard.thy Sat Dec 26 15:44:14 2015 +0100 @@ -9,8 +9,10 @@ imports Main begin -text \The Mutilated Checker Board Problem, formalized inductively. See @{cite - "paulson-mutilated-board"} for the original tactic script version.\ +text \ + The Mutilated Checker Board Problem, formalized inductively. See @{cite + "paulson-mutilated-board"} for the original tactic script version. +\ subsection \Tilings\ diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Peirce.thy --- a/src/HOL/Isar_Examples/Peirce.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Peirce.thy Sat Dec 26 15:44:14 2015 +0100 @@ -1,5 +1,5 @@ (* Title: HOL/Isar_Examples/Peirce.thy - Author: Markus Wenzel, TU Muenchen + Author: Makarius *) section \Peirce's Law\ @@ -8,15 +8,16 @@ imports Main begin -text \We consider Peirce's Law: \((A \ B) \ A) \ A\. This is an inherently - non-intuitionistic statement, so its proof will certainly involve some - form of classical contradiction. +text \ + We consider Peirce's Law: \((A \ B) \ A) \ A\. This is an inherently + non-intuitionistic statement, so its proof will certainly involve some form + of classical contradiction. The first proof is again a well-balanced combination of plain backward and forward reasoning. The actual classical step is where the negated goal may be introduced as additional assumption. This eventually leads to a - contradiction.\footnote{The rule involved there is negation elimination; - it holds in intuitionistic logic as well.}\ + contradiction.\<^footnote>\The rule involved there is negation elimination; it holds in + intuitionistic logic as well.\\ theorem "((A \ B) \ A) \ A" proof @@ -33,17 +34,19 @@ qed qed -text \In the subsequent version the reasoning is rearranged by means of - ``weak assumptions'' (as introduced by \isacommand{presume}). Before - assuming the negated goal \\ A\, its intended consequence \A \ B\ is put - into place in order to solve the main problem. 
Nevertheless, we do not get - anything for free, but have to establish \A \ B\ later on. The overall - effect is that of a logical \<^emph>\cut\. +text \ + In the subsequent version the reasoning is rearranged by means of ``weak + assumptions'' (as introduced by \<^theory_text>\presume\). Before assuming the negated + goal \\ A\, its intended consequence \A \ B\ is put into place in order to + solve the main problem. Nevertheless, we do not get anything for free, but + have to establish \A \ B\ later on. The overall effect is that of a logical + \<^emph>\cut\. - Technically speaking, whenever some goal is solved by \isacommand{show} in - the context of weak assumptions then the latter give rise to new subgoals, - which may be established separately. In contrast, strong assumptions (as - introduced by \isacommand{assume}) are solved immediately.\ + Technically speaking, whenever some goal is solved by \<^theory_text>\show\ in the context + of weak assumptions then the latter give rise to new subgoals, which may be + established separately. In contrast, strong assumptions (as introduced by + \<^theory_text>\assume\) are solved immediately. +\ theorem "((A \ B) \ A) \ A" proof @@ -62,20 +65,22 @@ qed qed -text \Note that the goals stemming from weak assumptions may be even left - until qed time, where they get eventually solved ``by assumption'' as - well. In that case there is really no fundamental difference between the - two kinds of assumptions, apart from the order of reducing the individual - parts of the proof configuration. +text \ + Note that the goals stemming from weak assumptions may be even left until + qed time, where they get eventually solved ``by assumption'' as well. In + that case there is really no fundamental difference between the two kinds of + assumptions, apart from the order of reducing the individual parts of the + proof configuration. - Nevertheless, the ``strong'' mode of plain assumptions is quite important - in practice to achieve robustness of proof text interpretation. By forcing - both the conclusion \<^emph>\and\ the assumptions to unify with the pending goal - to be solved, goal selection becomes quite deterministic. For example, - decomposition with rules of the ``case-analysis'' type usually gives rise - to several goals that only differ in there local contexts. With strong + Nevertheless, the ``strong'' mode of plain assumptions is quite important in + practice to achieve robustness of proof text interpretation. By forcing both + the conclusion \<^emph>\and\ the assumptions to unify with the pending goal to be + solved, goal selection becomes quite deterministic. For example, + decomposition with rules of the ``case-analysis'' type usually gives rise to + several goals that only differ in there local contexts. With strong assumptions these may be still solved in any order in a predictable way, - while weak ones would quickly lead to great confusion, eventually - demanding even some backtracking.\ + while weak ones would quickly lead to great confusion, eventually demanding + even some backtracking. +\ end diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Puzzle.thy --- a/src/HOL/Isar_Examples/Puzzle.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Puzzle.thy Sat Dec 26 15:44:14 2015 +0100 @@ -4,12 +4,14 @@ imports Main begin -text_raw \\footnote{A question from ``Bundeswettbewerb Mathematik''. 
- Original pen-and-paper proof due to Herbert Ehler; Isabelle tactic - script by Tobias Nipkow.}\ +text_raw \\<^footnote>\A question from ``Bundeswettbewerb Mathematik''. Original + pen-and-paper proof due to Herbert Ehler; Isabelle tactic script by Tobias + Nipkow.\\ -text \\<^bold>\Problem.\ Given some function \f: \ \ \\ such that - \f (f n) < f (Suc n)\ for all \n\. Demonstrate that \f\ is the identity.\ +text \ + \<^bold>\Problem.\ Given some function \f: \ \ \\ such that \f (f n) < f (Suc n)\ + for all \n\. Demonstrate that \f\ is the identity. +\ theorem assumes f_ax: "\n. f (f n) < f (Suc n)" diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/Summation.thy --- a/src/HOL/Isar_Examples/Summation.thy Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/Summation.thy Sat Dec 26 15:44:14 2015 +0100 @@ -1,5 +1,5 @@ (* Title: HOL/Isar_Examples/Summation.thy - Author: Markus Wenzel + Author: Makarius *) section \Summing natural numbers\ @@ -8,17 +8,19 @@ imports Main begin -text \Subsequently, we prove some summation laws of natural numbers - (including odds, squares, and cubes). These examples demonstrate how plain - natural deduction (including induction) may be combined with calculational - proof.\ +text \ + Subsequently, we prove some summation laws of natural numbers (including + odds, squares, and cubes). These examples demonstrate how plain natural + deduction (including induction) may be combined with calculational proof. +\ subsection \Summation laws\ -text \The sum of natural numbers \0 + \ + n\ equals \n \ (n + 1)/2\. - Avoiding formal reasoning about division we prove this equation multiplied - by \2\.\ +text \ + The sum of natural numbers \0 + \ + n\ equals \n \ (n + 1)/2\. Avoiding + formal reasoning about division we prove this equation multiplied by \2\. +\ theorem sum_of_naturals: "2 * (\i::nat=0..n. i) = n * (n + 1)" @@ -35,39 +37,38 @@ by simp qed -text \The above proof is a typical instance of mathematical induction. The - main statement is viewed as some \?P n\ that is split by the induction - method into base case \?P 0\, and step case \?P n \ ?P (Suc n)\ for - arbitrary \n\. +text \ + The above proof is a typical instance of mathematical induction. The main + statement is viewed as some \?P n\ that is split by the induction method + into base case \?P 0\, and step case \?P n \ ?P (Suc n)\ for arbitrary \n\. The step case is established by a short calculation in forward manner. Starting from the left-hand side \?S (n + 1)\ of the thesis, the final result is achieved by transformations involving basic arithmetic reasoning - (using the Simplifier). The main point is where the induction hypothesis - \?S n = n \ (n + 1)\ is introduced in order to replace a certain subterm. - So the ``transitivity'' rule involved here is actual \<^emph>\substitution\. Also - note how the occurrence of ``\dots'' in the subsequent step documents the - position where the right-hand side of the hypothesis got filled in. + (using the Simplifier). The main point is where the induction hypothesis \?S + n = n \ (n + 1)\ is introduced in order to replace a certain subterm. So the + ``transitivity'' rule involved here is actual \<^emph>\substitution\. Also note how + the occurrence of ``\dots'' in the subsequent step documents the position + where the right-hand side of the hypothesis got filled in. \<^medskip> A further notable point here is integration of calculations with plain natural deduction. This works so well in Isar for two reasons. 
- \<^enum> Facts involved in \isakeyword{also}~/ \isakeyword{finally} - calculational chains may be just anything. There is nothing special - about \isakeyword{have}, so the natural deduction element - \isakeyword{assume} works just as well. + \<^enum> Facts involved in \<^theory_text>\also\~/ \<^theory_text>\finally\ calculational chains may be just + anything. There is nothing special about \<^theory_text>\have\, so the natural deduction + element \<^theory_text>\assume\ works just as well. \<^enum> There are two \<^emph>\separate\ primitives for building natural deduction - contexts: \isakeyword{fix}~\x\ and \isakeyword{assume}~\A\. Thus it is - possible to start reasoning with some new ``arbitrary, but fixed'' - elements before bringing in the actual assumption. In contrast, natural - deduction is occasionally formalized with basic context elements of the - form \x:A\ instead. + contexts: \<^theory_text>\fix x\ and \<^theory_text>\assume A\. Thus it is possible to start reasoning + with some new ``arbitrary, but fixed'' elements before bringing in the + actual assumption. In contrast, natural deduction is occasionally + formalized with basic context elements of the form \x:A\ instead. \<^medskip> We derive further summation laws for odds, squares, and cubes as follows. - The basic technique of induction plus calculation is the same as before.\ + The basic technique of induction plus calculation is the same as before. +\ theorem sum_of_odds: "(\i::nat=0..Subsequently we require some additional tweaking of Isabelle built-in - arithmetic simplifications, such as bringing in distributivity by hand.\ +text \ + Subsequently we require some additional tweaking of Isabelle built-in + arithmetic simplifications, such as bringing in distributivity by hand. +\ lemmas distrib = add_mult_distrib add_mult_distrib2 @@ -123,15 +126,17 @@ by simp qed -text \Note that in contrast to older traditions of tactical proof - scripts, the structured proof applies induction on the original, - unsimplified statement. This allows to state the induction cases robustly - and conveniently. Simplification (or other automated) methods are then - applied in terminal position to solve certain sub-problems completely. +text \ + Note that in contrast to older traditions of tactical proof scripts, the + structured proof applies induction on the original, unsimplified statement. + This allows to state the induction cases robustly and conveniently. + Simplification (or other automated) methods are then applied in terminal + position to solve certain sub-problems completely. As a general rule of good proof style, automatic methods such as \simp\ or - \auto\ should normally be never used as initial proof methods with a - nested sub-proof to address the automatically produced situation, but only - as terminal ones to solve sub-problems.\ + \auto\ should normally be never used as initial proof methods with a nested + sub-proof to address the automatically produced situation, but only as + terminal ones to solve sub-problems. +\ end diff -r ed47044ee6a6 -r 2e48182cc82c src/HOL/Isar_Examples/document/root.tex --- a/src/HOL/Isar_Examples/document/root.tex Sat Dec 26 15:03:41 2015 +0100 +++ b/src/HOL/Isar_Examples/document/root.tex Sat Dec 26 15:44:14 2015 +0100 @@ -16,7 +16,7 @@ \begin{document} \title{Miscellaneous Isabelle/Isar examples for Higher-Order Logic} -\author{Markus Wenzel \\[2ex] +\author{Makarius Wenzel \\[2ex] With contributions by Gertrud Bauer and Tobias Nipkow} \maketitle