improved presentation;
author wenzelm
date Sat, 30 Oct 1999 20:20:48 +0200
changeset 7982 d534b897ce39
parent 7981 5120a2a15d06
child 7983 d823fdcc0645
src/HOL/Isar_examples/BasicLogic.thy
src/HOL/Isar_examples/Cantor.thy
src/HOL/Isar_examples/ExprCompiler.thy
src/HOL/Isar_examples/Group.thy
src/HOL/Isar_examples/KnasterTarski.thy
src/HOL/Isar_examples/MultisetOrder.thy
src/HOL/Isar_examples/Peirce.thy
src/HOL/Isar_examples/Summation.thy
src/HOL/Isar_examples/W_correct.thy
src/HOL/Isar_examples/document/style.tex
--- a/src/HOL/Isar_examples/BasicLogic.thy	Sat Oct 30 20:13:16 1999 +0200
+++ b/src/HOL/Isar_examples/BasicLogic.thy	Sat Oct 30 20:20:48 1999 +0200
@@ -5,7 +5,7 @@
 Basic propositional and quantifier reasoning.
 *)
 
-header {* Basic reasoning *};
+header {* Basic logical reasoning *};
 
 theory BasicLogic = Main:;
 
@@ -70,9 +70,9 @@
 text {*
  In fact, concluding any (sub-)proof already involves solving any
  remaining goals by assumption\footnote{This is not a completely
- trivial operation, as proof by assumption involves full higher-order
- unification.}.  Thus we may skip the rather vacuous body of the above
- proof as well.
+ trivial operation, as proof by assumption may involve full
+ higher-order unification.}.  Thus we may skip the rather vacuous body
+ of the above proof as well.
 *};
 
 lemma "A --> A";
@@ -99,7 +99,7 @@
 text {*
  Thus we have arrived at an adequate representation of the proof of a
  tautology that holds by a single standard rule.\footnote{Apparently,
- the rule is implication introduction.}
+ the rule here is implication introduction.}
 *};
 
 text {*
@@ -129,7 +129,7 @@
  Just like $\idt{rule}$, the $\idt{intro}$ and $\idt{elim}$ proof
  methods pick standard structural rules, in case no explicit arguments
  are given.  While implicit rules are usually just fine for single
- rule application, this may go too far in iteration.  Thus in
+ rule application, this may go too far with iteration.  Thus in
  practice, $\idt{intro}$ and $\idt{elim}$ would be typically
  restricted to certain structures by giving a few rules only, e.g.\
  \isacommand{proof}~($\idt{intro}$~\name{impI}~\name{allI}) to strip
@@ -168,11 +168,12 @@
 
 text {*
  Above, the $\idt{conjunct}_{1/2}$ projection rules had to be named
- explicitly, since the goals did not provide any structural clue.
- This may be avoided using \isacommand{from} to focus on $\idt{prems}$
- (i.e.\ the $A \conj B$ assumption) as the current facts, enabling the
- use of double-dot proofs.  Note that \isacommand{from} already
- does forward-chaining, involving the \name{conjE} rule.
+ explicitly, since the goals $B$ and $A$ did not provide any
+ structural clue.  This may be avoided using \isacommand{from} to
+ focus on $\idt{prems}$ (i.e.\ the $A \conj B$ assumption) as the
+ current facts, enabling the use of double-dot proofs.  Note that
+ \isacommand{from} already does forward-chaining, involving the
+ \name{conjE} rule here.
 *};
 
 lemma "A & B --> B & A";
@@ -222,7 +223,7 @@
 text {*
  We can still push forward reasoning a bit further, even at the risk
  of getting ridiculous.  Note that we force the initial proof step to
- do nothing, by referring to the ``-'' proof method.
+ do nothing here, by referring to the ``-'' proof method.
 *};
 
 lemma "A & B --> B & A";
@@ -245,7 +246,7 @@
 
  The general lesson learned here is that good proof style would
  achieve just the \emph{right} balance of top-down backward
- decomposition, and bottom-up forward composition.  In practice, there
+ decomposition, and bottom-up forward composition.  In general, there
  is no single best way to arrange some pieces of formal reasoning, of
  course.  Depending on the actual applications, the intended audience
  etc., rules (and methods) on the one hand vs.\ facts on the other
@@ -278,7 +279,7 @@
 
 text {*
  We rephrase some of the basic reasoning examples of
- \cite{isabelle-intro} (using HOL rather than FOL).
+ \cite{isabelle-intro}, using HOL rather than FOL.
 *};
 
 subsubsection {* A propositional proof *};
@@ -315,8 +316,8 @@
  In order to avoid too much explicit parentheses, the Isar system
  implicitly opens an additional block for any new goal, the
  \isacommand{next} statement then closes one block level, opening a
- new one.  The resulting behavior is what one might expect from
- separating cases, only that it is more flexible.  E.g. an induction
+ new one.  The resulting behavior is what one would expect from
+ separating cases, only that it is more flexible.  E.g.\ an induction
  base case (which does not introduce local assumptions) would
  \emph{not} require \isacommand{next} to separate the subsequent step
  case.
@@ -381,8 +382,8 @@
 qed;
 
 text {*
- While explicit rule instantiation may occasionally help to improve
- the readability of certain aspects of reasoning, it is usually quite
+ While explicit rule instantiation may occasionally improve
+ readability of certain aspects of reasoning, it is usually quite
  redundant.  Above, the basic proof outline gives already enough
  structural clues for the system to infer both the rules and their
  instances (by higher-order unification).  Thus we may as well prune
@@ -404,17 +405,18 @@
 subsubsection {* Deriving rules in Isabelle *};
 
 text {*
- We derive the conjunction elimination rule from the projections.  The
- proof is quite straight-forward, since Isabelle/Isar supports
- non-atomic goals and assumptions fully transparently.
+ We derive the conjunction elimination rule from the corresponding
+ projections.  The proof is quite straightforward, since
+ Isabelle/Isar supports non-atomic goals and assumptions fully
+ transparently.
 *};
 
 theorem conjE: "A & B ==> (A ==> B ==> C) ==> C";
 proof -;
   assume "A & B";
-  assume ab_c: "A ==> B ==> C";
+  assume r: "A ==> B ==> C";
   show C;
-  proof (rule ab_c);
+  proof (rule r);
     show A; by (rule conjunct1);
     show B; by (rule conjunct2);
   qed;
@@ -425,7 +427,7 @@
  different way.  The tactic script as given in \cite{isabelle-intro}
  for the same example of \name{conjE} depends on the primitive
  \texttt{goal} command to decompose the rule into premises and
- conclusion.  The proper result would then emerge by discharging of
+ conclusion.  The actual result would then emerge by discharging
  the context at \texttt{qed} time.
 *};
 
--- a/src/HOL/Isar_examples/Cantor.thy	Sat Oct 30 20:13:16 1999 +0200
+++ b/src/HOL/Isar_examples/Cantor.thy	Sat Oct 30 20:20:48 1999 +0200
@@ -30,7 +30,7 @@
  with the innermost reasoning expressed quite naively.
 *};
 
-theorem "EX S. S ~: range(f :: 'a => 'a set)";
+theorem "EX S. S ~: range (f :: 'a => 'a set)";
 proof;
   let ?S = "{x. x ~: f x}";
   show "?S ~: range f";
@@ -69,7 +69,7 @@
  introduced \emph{before} its corresponding \isacommand{show}.}
 *};
 
-theorem "EX S. S ~: range(f :: 'a => 'a set)";
+theorem "EX S. S ~: range (f :: 'a => 'a set)";
 proof;
   let ?S = "{x. x ~: f x}";
   show "?S ~: range f";
@@ -95,22 +95,22 @@
 
 text {*
  How much creativity is required?  As it happens, Isabelle can prove
- this theorem automatically.  The default context of the Isabelle's
- classical prover contains rules for most of the constructs of HOL's
- set theory.  We must augment it with \name{equalityCE} to break up
- set equalities, and then apply best-first search.  Depth-first search
- would diverge, but best-first search successfully navigates through
- the large search space.
+ this theorem automatically.  The context of Isabelle's classical
+ prover contains rules for most of the constructs of HOL's set theory.
+ We must augment it with \name{equalityCE} to break up set equalities,
+ and then apply best-first search.  Depth-first search would diverge,
+ but best-first search successfully navigates through the large search
+ space.
 *};
 
-theorem "EX S. S ~: range(f :: 'a => 'a set)";
+theorem "EX S. S ~: range (f :: 'a => 'a set)";
   by (best elim: equalityCE);
 
 text {*
  While this establishes the same theorem internally, we do not get any
  idea of how the proof actually works.  There is currently no way to
  transform internal system-level representations of Isabelle proofs
- back into Isar documents.  Writing intelligible proof documents
+ back into Isar text.  Writing intelligible proof documents
  really is a creative process, after all.
 *};
 
--- a/src/HOL/Isar_examples/ExprCompiler.thy	Sat Oct 30 20:13:16 1999 +0200
+++ b/src/HOL/Isar_examples/ExprCompiler.thy	Sat Oct 30 20:20:48 1999 +0200
@@ -20,7 +20,7 @@
 
 text {*
  Binary operations are just functions over some type of values.  This
- is both for syntax and semantics, i.e.\ we use a ``shallow
+ is both for abstract syntax and semantics, i.e.\ we use a ``shallow
  embedding'' here.
 *};
 
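Illustration, not part of the changeset: the ``shallow embedding'' mentioned
above simply identifies binary operations with HOL functions over the value
type, so no separate syntactic category of operator names is introduced.
A minimal sketch in the style of ExprCompiler.thy; the constant "add" and its
definition are hypothetical and only serve to show the idea:

  types
    'val binop = "'val => 'val => 'val";

  consts
    add :: "nat binop";
  defs
    add_def: "add == %x y. x + y";
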
--- a/src/HOL/Isar_examples/Group.thy	Sat Oct 30 20:13:16 1999 +0200
+++ b/src/HOL/Isar_examples/Group.thy	Sat Oct 30 20:20:48 1999 +0200
@@ -53,8 +53,9 @@
 qed;
 
 text {*
- With \name{group-right-inverse} already at our disposal,
- \name{group-right-unit} is now obtained much easier.
+ With \name{group-right-inverse} already available,
+ \name{group-right-unit}\label{thm:group-right-unit} is now
+ established much more easily.
 *};
 
 theorem group_right_unit: "x * one = (x::'a::group)";
@@ -75,14 +76,14 @@
  presentations given in any introductory course on algebra.  The basic
  technique is to form a transitive chain of equations, which in turn
  are established by simplifying with appropriate rules.  The low-level
- logical parts of equational reasoning are left implicit.
+ logical details of equational reasoning are left implicit.
 
- Note that ``$\dots$'' is just a special term variable that happens to
- be bound automatically to the argument\footnote{The argument of a
- curried infix expression happens to be its right-hand side.} of the
- last fact achieved by any local assumption or proven statement.  In
- contrast to $\var{thesis}$, the ``$\dots$'' variable is bound
- \emph{after} the proof is finished.
+ Note that ``$\dots$'' is just a special term variable that is bound
+ automatically to the argument\footnote{The argument of a curried
+ infix expression happens to be its right-hand side.} of the last fact
+ achieved by any local assumption or proven statement.  In contrast to
+ $\var{thesis}$, the ``$\dots$'' variable is bound \emph{after} the
+ proof is finished, though.
 
  There are only two separate Isar language elements for calculational
  proofs: ``\isakeyword{also}'' for initial or intermediate
@@ -90,8 +91,8 @@
  result of a calculation.  These constructs are not hardwired into
  Isabelle/Isar, but defined on top of the basic Isar/VM interpreter.
  Expanding the \isakeyword{also} and \isakeyword{finally} derived
- language elements, calculations may be simulated as demonstrated
- below.
+ language elements, calculations may be simulated by hand as
+ demonstrated below.
 *};
 
 theorem "x * one = (x::'a::group)";
@@ -128,10 +129,10 @@
 text {*
  Note that this scheme of calculations is not restricted to plain
  transitivity.  Rules like anti-symmetry, or even forward and backward
- substitution work as well.  For the actual \isacommand{also} and
- \isacommand{finally} commands, Isabelle/Isar maintains separate
- context information of ``transitivity'' rules.  Rule selection takes
- place automatically by higher-order unification.
+ substitution work as well.  For the actual implementation of
+ \isacommand{also} and \isacommand{finally}, Isabelle/Isar maintains
+ separate context information of ``transitivity'' rules.  Rule
+ selection takes place automatically by higher-order unification.
 *};
 
 
@@ -150,10 +151,11 @@
 text {*
  Groups are \emph{not} yet monoids directly from the definition.  For
  monoids, \name{right-unit} had to be included as an axiom, but for
- groups both \name{right-unit} and \name{right-inverse} are
- derivable from the other axioms.  With \name{group-right-unit}
- derived as a theorem of group theory (see above), we may still
- instantiate $\idt{group} \subset \idt{monoid}$ properly as follows.
+ groups both \name{right-unit} and \name{right-inverse} are derivable
+ from the other axioms.  With \name{group-right-unit} derived as a
+ theorem of group theory (see page~\pageref{thm:group-right-unit}), we
+ may still instantiate $\idt{group} \subset \idt{monoid}$ properly as
+ follows.
 *};
 
 instance group < monoid;
@@ -167,7 +169,7 @@
  \isacommand{theorem}, setting up a goal that reflects the intended
  class relation (or type constructor arity).  Thus any Isar proof
  language element may be involved to establish this statement.  When
- concluding the proof, the result is transformed into the original
+ concluding the proof, the result is transformed into the intended
  type signature extension behind the scenes.
 *};
 
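Illustration, not part of the changeset: the remark above about simulating
\isakeyword{also} and \isakeyword{finally} by hand may be pictured roughly as
follows.  This is only a sketch; the lemma, the use of the HOL rule "trans",
and the explicit "calculation" name are assumptions of the illustration
rather than text taken from Group.thy:

  lemma "(a::nat) + 0 + 0 = a";
  proof -;
    have "a + 0 + 0 = a + 0"; by simp;
    note calculation = this;
    have "... = a"; by simp;
    note calculation = trans [OF calculation this];
    from calculation; show ?thesis; .;
  qed;
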
--- a/src/HOL/Isar_examples/KnasterTarski.thy	Sat Oct 30 20:13:16 1999 +0200
+++ b/src/HOL/Isar_examples/KnasterTarski.thy	Sat Oct 30 20:20:48 1999 +0200
@@ -37,7 +37,8 @@
  The Isar proof below closely follows the original presentation.
  Virtually all of the prose narration has been rephrased in terms of
  formal Isar language elements.  Just as many textbook-style proofs,
- there is a strong bias towards forward reasoning.
+ there is a strong bias towards forward proof, and several bends
+ in the course of reasoning.
 *};
 
 theorem KnasterTarski: "mono f ==> EX a::'a set. f a = a";
@@ -72,11 +73,11 @@
  explicit block structure and weak assumptions.  Thus we have mimicked
  the particular way of reasoning of the original text.
 
- In the subsequent version of the proof the order of reasoning is
- changed to achieve structured top-down decomposition of the problem
- at the outer level, while the small inner steps of reasoning or done
- in a forward manner.  Note that this requires only the most basic
- features of the Isar language, we are certainly more at ease here.
+ In the subsequent version the order of reasoning is changed to
+ achieve structured top-down decomposition of the problem at the outer
+ level, while only the inner steps of reasoning are done in a forward
+ manner.  We are certainly more at ease here, requiring only the most
+ basic features of the Isar language.
 *};
 
 theorem KnasterTarski': "mono f ==> EX a::'a set. f a = a";
--- a/src/HOL/Isar_examples/MultisetOrder.thy	Sat Oct 30 20:13:16 1999 +0200
+++ b/src/HOL/Isar_examples/MultisetOrder.thy	Sat Oct 30 20:20:48 1999 +0200
@@ -10,7 +10,7 @@
 theory MultisetOrder = Multiset:;
 
 text_raw {*
- \footnote{Original tactic script by Tobias Nipkow (see also
+ \footnote{Original tactic script by Tobias Nipkow (see
  \url{http://isabelle.in.tum.de/library/HOL/Induct/Multiset.html}),
  based on a pen-and-paper proof due to Wilfried Buchholz.}
 *};
@@ -22,8 +22,8 @@
     (EX K. (ALL b. elem K b --> (b, a) : r) & N = M0 + K)"
   (concl is "?case1 (mult1 r) | ?case2");
 proof (unfold mult1_def);
-  let ?r = "%K a. ALL b. elem K b --> (b, a) : r";
-  let ?R = "%N M. EX a M0 K. M = M0 + {#a#} & N = M0 + K & ?r K a";
+  let ?r = "\<lambda>K a. ALL b. elem K b --> (b, a) : r";
+  let ?R = "\<lambda>N M. EX a M0 K. M = M0 + {#a#} & N = M0 + K & ?r K a";
   let ?case1 = "?case1 {(N, M). ?R N M}";
 
   assume "(N, M0 + {#a#}) : {(N, M). ?R N M}";
@@ -61,7 +61,6 @@
 proof;
   let ?R = "mult1 r";
   let ?W = "acc ?R";
-
   {{;
     fix M M0 a;
     assume M0: "M0 : ?W"
--- a/src/HOL/Isar_examples/Peirce.thy	Sat Oct 30 20:13:16 1999 +0200
+++ b/src/HOL/Isar_examples/Peirce.thy	Sat Oct 30 20:20:48 1999 +0200
@@ -75,15 +75,15 @@
  individual parts of the proof configuration.
 
  Nevertheless, the ``strong'' mode of plain assumptions is quite
- important in practice to achieve robustness of proof document
+ important in practice to achieve robustness of proof text
  interpretation.  By forcing both the conclusion \emph{and} the
  assumptions to unify with the pending goal to be solved, goal
  selection becomes quite deterministic.  For example, decomposition
- with ``case-analysis'' type rules usually give rise to several goals
- that only differ in there local contexts.  With strong assumptions
- these may be still solved in any order in a predictable way, while
- weak ones would quickly lead to great confusion, eventually demanding
- even some backtracking.
+ with rules of the ``case-analysis'' type usually gives rise to
+ several goals that only differ in there local contexts.  With strong
+ assumptions these may be still solved in any order in a predictable
+ way, while weak ones would quickly lead to great confusion,
+ eventually demanding even some backtracking.
 *};
 
 end;
--- a/src/HOL/Isar_examples/Summation.thy	Sat Oct 30 20:13:16 1999 +0200
+++ b/src/HOL/Isar_examples/Summation.thy	Sat Oct 30 20:20:48 1999 +0200
@@ -16,7 +16,7 @@
 
 text {*
  Subsequently, we prove some summation laws of natural numbers
- (including odds, squares and cubes).  These examples demonstrate how
+ (including odds, squares, and cubes).  These examples demonstrate how
  plain natural deduction (including induction) may be combined with
  calculational proof.
 *};
@@ -26,25 +26,25 @@
 
 text {*
   The binder operator $\idt{sum} :: (\idt{nat} \To \idt{nat}) \To
- \idt{nat} \To \idt{nat}$ formalizes summation from $0$ up to $k$
- (excluding the bound).
+ \idt{nat} \To \idt{nat}$ formalizes summation of natural numbers
+ indexed from $0$ up to $k$ (excluding the bound):
  \[
  \sum\limits_{i < k} f(i) = \idt{sum} \ap (\lam i f \ap i) \ap k
  \]
 *};
 
 consts
-  sum   :: "[nat => nat, nat] => nat";
+  sum :: "[nat => nat, nat] => nat";
 
 primrec
   "sum f 0 = 0"
   "sum f (Suc n) = f n + sum f n";
 
 syntax
-  "_SUM" :: "idt => nat => nat => nat"
+  "_SUM" :: "[idt, nat, nat] => nat"
     ("SUM _ < _. _" [0, 0, 10] 10);
 translations
-  "SUM i < k. b" == "sum (%i. b) k";
+  "SUM i < k. b" == "sum (\<lambda>i. b) k";
 
 
 subsection {* Summation laws *};
@@ -69,8 +69,8 @@
 
 text {*
  The sum of natural numbers $0 + \cdots + n$ equals $n \times (n +
- 1)/2$.  In order to avoid formal reasoning about division, we just
- show $2 \times \Sigma_{i < n} i = n \times (n + 1)$.
+ 1)/2$.  Avoiding formal reasoning about division, we prove this
+ equation multiplied by $2$.
 *};
 
 theorem sum_of_naturals:
@@ -90,23 +90,22 @@
  The above proof is a typical instance of mathematical induction.  The
  main statement is viewed as some $\var{P} \ap n$ that is split by the
  induction method into base case $\var{P} \ap 0$, and step case
- $\var{P} \ap n \Impl \var{P} \ap (\idt{Suc} \ap n)$ for any $n$.
+ $\var{P} \ap n \Impl \var{P} \ap (\idt{Suc} \ap n)$ for arbitrary $n$.
 
  The step case is established by a short calculation in forward
  manner.  Starting from the left-hand side $\var{S} \ap (n + 1)$ of
- the thesis, the final result is achieved by basic transformations
- involving arithmetic reasoning (using the Simplifier).  The main
- point is where the induction hypothesis $\var{S} \ap n = n \times (n
- + 1)$ is introduced in order to replace a certain subterm.  So the
+ the thesis, the final result is achieved by transformations involving
+ basic arithmetic reasoning (using the Simplifier).  The main point is
+ where the induction hypothesis $\var{S} \ap n = n \times (n + 1)$ is
+ introduced in order to replace a certain subterm.  So the
  ``transitivity'' rule involved here is actual \emph{substitution}.
  Also note how the occurrence of ``\dots'' in the subsequent step
- documents the position where the right-hand side of the hypotheses
+ documents the position where the right-hand side of the hypothesis
  got filled in.
 
  \medskip A further notable point here is integration of calculations
- with plain natural deduction.  This works works quite well in Isar
- for two reasons.
-
+ with plain natural deduction.  This works so well in Isar for two
+ reasons.
  \begin{enumerate}
 
  \item Facts involved in \isakeyword{also}~/ \isakeyword{finally}
@@ -116,19 +115,18 @@
 
  \item There are two \emph{separate} primitives for building natural
  deduction contexts: \isakeyword{fix}~$x$ and \isakeyword{assume}~$A$.
- Thus it is possible to start reasoning with new ``arbitrary, but
- fixed'' elements before bringing in the actual assumptions.
- Occasionally, natural deduction is formalized with basic context
- elements of the form $x:A$; this would rule out mixing with
- calculations as done here.
+ Thus it is possible to start reasoning with some new ``arbitrary, but
+ fixed'' elements before bringing in the actual assumption.  In
+ contrast, natural deduction is occasionally formalized with basic
+ context elements of the form $x:A$ instead.
 
  \end{enumerate}
 *};
 
 text {*
- \medskip We derive further summation laws for odds, squares, cubes as
- follows.  The basic technique of induction plus calculation is the
- same.
+ \medskip We derive further summation laws for odds, squares, and
+ cubes as follows.  The basic technique of induction plus calculation
+ is the same as before.
 *};
 
 theorem sum_of_odds:
@@ -175,20 +173,20 @@
 text {*
  Comparing these examples with the tactic script version
  \url{http://isabelle.in.tum.de/library/HOL/ex/NatSum.html}, we note
- an important difference how of induction vs.\ simplification is
+ an important difference of how induction vs.\ simplification is
  applied.  While \cite[\S10]{isabelle-ref} advises for these examples
  that ``induction should not be applied until the goal is in the
  simplest form'' this would be a very bad idea in our setting.
 
  Simplification normalizes all arithmetic expressions involved,
- producing huge intermediate goals.  Applying induction afterwards,
- the Isar document would then have to reflect the emerging
- configuration by appropriate the sub-proofs.  This would result in
- badly structured, low-level technical reasoning, without any good
- idea of the actual point.
+ producing huge intermediate goals.  When applying induction
+ afterwards, the Isar proof text would have to reflect the emerging
+ configuration by appropriate sub-proofs.  This would result in badly
+ structured, low-level technical reasoning, without any good idea of
+ the actual point.
 
  \medskip As a general rule of good proof style, automatic methods
- such as $\idt{simp}$ or $\idt{auto}$ should normally never used as
+ such as $\idt{simp}$ or $\idt{auto}$ should normally never be used as
  initial proof methods, but only as terminal ones, solving certain
  goals completely.
 *};
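Illustration, not part of the changeset: the guideline above (automatic
methods only as terminal proof methods) may be pictured by a minimal sketch
in which the structure is set up by an explicit initial step and simp merely
closes the remaining leaf goals.  The lemma is hypothetical, chosen only for
brevity:

  lemma "(x::nat) + 0 = x & 0 + x = x";
  proof;
    show "x + 0 = x"; by simp;
    show "0 + x = x"; by simp;
  qed;
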
--- a/src/HOL/Isar_examples/W_correct.thy	Sat Oct 30 20:13:16 1999 +0200
+++ b/src/HOL/Isar_examples/W_correct.thy	Sat Oct 30 20:20:48 1999 +0200
@@ -39,7 +39,7 @@
     AppI: "[| a |- e1 :: t2 -> t1; a |- e2 :: t2 |]
               ==> a |- App e1 e2 :: t1";
 
-text {* Type assigment is close wrt.\ substitution. *};
+text {* Type assignment is closed wrt.\ substitution. *};
 
 lemma has_type_subst_closed: "a |- e :: t ==> $s a |- e :: $s t";
 proof -;
@@ -79,10 +79,10 @@
 
 primrec
   "W (Var i) a n =
-      (if i < length a then Ok(id_subst, a ! i, n) else Fail)"
+      (if i < length a then Ok (id_subst, a ! i, n) else Fail)"
   "W (Abs e) a n =
       ((s, t, m) := W e (TVar n # a) (Suc n);
-       Ok(s, (s n) -> t, m))"
+       Ok (s, (s n) -> t, m))"
   "W (App e1 e2) a n =
       ((s1, t1, m1) := W e1 a n;
        (s2, t2, m2) := W e2 ($s1 a) m1;
@@ -92,9 +92,13 @@
 
 subsection {* Correctness theorem *};
 
+text_raw {* \begin{comment} *};
+
 (* FIXME proper split att/mod *)
 ML_setup {* Addsplits [split_bind]; *};
 
+text_raw {* \end{comment} *};
+
 theorem W_correct: "W e a n = Ok (s, t, m) ==> $ s a |- e :: t";
 proof -;
   assume W_ok: "W e a n = Ok (s, t, m)";
--- a/src/HOL/Isar_examples/document/style.tex	Sat Oct 30 20:13:16 1999 +0200
+++ b/src/HOL/Isar_examples/document/style.tex	Sat Oct 30 20:20:48 1999 +0200
@@ -2,7 +2,7 @@
 %% $Id$
 
 \documentclass[11pt,a4paper]{article}
-\usepackage{comment,proof,isabelle,pdfsetup}
+\usepackage{comment,proof,isabelle,isabellesym,pdfsetup}
 
 \renewcommand{\isamarkupheader}[1]{\section{#1}}