*** empty log message ***
authornipkow
Wed, 13 Dec 2000 09:39:53 +0100
changeset 10654 458068404143
parent 10653 55f33da63366
child 10655 ddd33e0f4935
*** empty log message ***
doc-src/TutorialI/Advanced/Partial.thy
doc-src/TutorialI/Advanced/ROOT.ML
doc-src/TutorialI/Advanced/WFrec.thy
doc-src/TutorialI/Advanced/advanced.tex
doc-src/TutorialI/Advanced/document/Partial.tex
doc-src/TutorialI/Advanced/document/WFrec.tex
doc-src/TutorialI/CTL/CTL.thy
doc-src/TutorialI/CTL/document/CTL.tex
doc-src/TutorialI/Inductive/Even.tex
doc-src/TutorialI/IsaMakefile
doc-src/TutorialI/Misc/AdvancedInd.thy
doc-src/TutorialI/Misc/document/natsum.tex
doc-src/TutorialI/Misc/natsum.thy
doc-src/TutorialI/Misc/simp.thy
doc-src/TutorialI/Recdef/Nested2.thy
doc-src/TutorialI/Recdef/document/Nested2.tex
doc-src/TutorialI/Recdef/document/examples.tex
doc-src/TutorialI/Recdef/document/termination.tex
doc-src/TutorialI/Recdef/examples.thy
doc-src/TutorialI/Recdef/termination.thy
doc-src/TutorialI/ToyList/ToyList.thy
doc-src/TutorialI/ToyList/document/ToyList.tex
doc-src/TutorialI/Types/Axioms.thy
doc-src/TutorialI/Types/Overloading2.thy
doc-src/TutorialI/Types/Pairs.thy
doc-src/TutorialI/Types/Typedef.thy
doc-src/TutorialI/Types/document/Axioms.tex
doc-src/TutorialI/Types/document/Overloading2.tex
doc-src/TutorialI/Types/document/Pairs.tex
doc-src/TutorialI/Types/document/Typedef.tex
doc-src/TutorialI/Types/numerics.tex
doc-src/TutorialI/appendix.tex
doc-src/TutorialI/fp.tex
doc-src/TutorialI/todo.tobias
doc-src/TutorialI/tutorial.tex
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/doc-src/TutorialI/Advanced/Partial.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -0,0 +1,213 @@
+(*<*)theory Partial = While_Combinator:(*>*)
+
+text{*\noindent
+Throughout the tutorial we have emphasized the fact that all functions
+in HOL are total. Hence we cannot hope to define truly partial
+functions. The best we can do are functions that are
+\emph{underdefined}\index{underdefined function}:
+for certain arguments we only know that a result
+exists, but we don't know what it is. When defining functions that are
+normally considered partial, underdefinedness turns out to be a very
+reasonable alternative.
+
+We have already seen an instance of underdefinedness by means of
+non-exhaustive pattern matching: the definition of @{term last} in
+\S\ref{sec:recdef-examples}. The same is allowed for \isacommand{primrec}
+*}
+
+consts hd :: "'a list \<Rightarrow> 'a"
+primrec "hd (x#xs) = x"
+
+text{*\noindent
+although it generates a warning.
+
+Even ordinary definitions allow underdefinedness, this time by means of
+preconditions:
+*}
+
+constdefs minus :: "nat \<Rightarrow> nat \<Rightarrow> nat"
+"n \<le> m \<Longrightarrow> minus m n \<equiv> m - n"
+
+text{*
+The rest of this section is devoted to the question of how to define
+partial recursive functions by other means than non-exhaustive pattern
+matching.
+*}
+
+subsubsection{*Guarded recursion*}
+
+text{* Neither \isacommand{primrec} nor \isacommand{recdef} allows us to
+prefix an equation with a condition in the way ordinary definitions do
+(see @{term minus} above). Instead we have to move the condition over
+to the right-hand side of the equation. Given a partial function $f$
+that should satisfy the recursion equation $f(x) = t$ over its domain
+$dom(f)$, we turn this into the \isacommand{recdef}
+@{prop[display]"f(x) = (if x \<in> dom(f) then t else arbitrary)"}
+where @{term arbitrary} is a predeclared constant of type @{typ 'a}
+which has no definition. Thus we know nothing about its value,
+which is ideal for specifying underdefined functions on top of it.
+
+As a simple example we define division on @{typ nat}:
+*}
+
+consts divi :: "nat \<times> nat \<Rightarrow> nat"
+recdef divi "measure(\<lambda>(m,n). m)"
+  "divi(m,n) = (if n = 0 then arbitrary else
+                if m < n then 0 else divi(m-n,n)+1)"
+
+text{*\noindent Of course we could also have defined
+@{term"divi(m,0)"} to be some specific number, for example 0. The
+latter option is chosen for the predefined @{text div} function, which
+simplifies proofs at the expense of moving further away from the
+standard mathematical division function.
+
+As a more substantial example we consider the problem of searching a graph.
+For simplicity our graph is given by a function (@{term f}) of
+type @{typ"'a \<Rightarrow> 'a"} which
+maps each node to its successor, and the task is to find the end of a chain,
+i.e.\ a node pointing to itself. Here is a first attempt:
+@{prop[display]"find(f,x) = (if f x = x then x else find(f, f x))"}
+This may be viewed as a fixed point finder or as one half of the well known
+\emph{Union-Find} algorithm.
+The snag is that it may not terminate if @{term f} has nontrivial cycles.
+Phrased differently, the relation
+*}
+
+constdefs step1 :: "('a \<Rightarrow> 'a) \<Rightarrow> ('a \<times> 'a)set"
+  "step1 f \<equiv> {(y,x). y = f x \<and> y \<noteq> x}"
+
+text{*\noindent
+must be well-founded. Thus we make the following definition:
+*}
+
+consts find :: "('a \<Rightarrow> 'a) \<times> 'a \<Rightarrow> 'a"
+recdef find "same_fst (\<lambda>f. wf(step1 f)) step1"
+  "find(f,x) = (if wf(step1 f)
+                then if f x = x then x else find(f, f x)
+                else arbitrary)"
+(hints recdef_simp:same_fst_def step1_def)
+
+text{*\noindent
+The recursion equation itself should be clear enough: it is our aborted
+first attempt augmented with a check that there are no non-trivial loops.
+
+What complicates the termination proof is that the argument of
+@{term find} is a pair. To express the required well-founded relation
+we employ the predefined combinator @{term same_fst} of type
+@{text[display]"('a \<Rightarrow> bool) \<Rightarrow> ('a \<Rightarrow> ('b\<times>'b)set) \<Rightarrow> (('a\<times>'b) \<times> ('a\<times>'b))set"}
+defined as
+@{thm[display]same_fst_def[no_vars]}
+This combinator is designed for recursive functions on pairs where the first
+component of the argument is passed unchanged to all recursive
+calls. Given a constraint on the first component and a relation on the second
+component, @{term same_fst} builds the required relation on pairs.
+The theorem @{thm[display]wf_same_fst[no_vars]}
+is known to the well-foundedness prover of \isacommand{recdef}.
+Thus well-foundedness of the given relation is immediate.
+Furthermore, each recursive call descends along the given relation:
+the first argument stays unchanged and the second one descends along
+@{term"step1 f"}. The proof merely requires unfolding of some definitions.
+
+Normally you will then derive the following conditional variant from
+the unconditional recursion equation
+*}
+
+lemma [simp]:
+  "wf(step1 f) \<Longrightarrow> find(f,x) = (if f x = x then x else find(f, f x))"
+by simp
+
+text{*\noindent and then disable the original recursion equation:*}
+
+declare find.simps[simp del]
+
+text{*
+We can reason about such underdefined functions just as we do about any other
+recursive function. Here is a simple example of recursion induction:
+*}
+
+lemma "wf(step1 f) \<longrightarrow> f(find(f,x)) = find(f,x)"
+apply(induct_tac f x rule:find.induct);
+apply simp
+done
+
+subsubsection{*The {\tt\slshape while} combinator*}
+
+text{*If the recursive function happens to be tail recursive, its
+definition becomes a triviality if based on the predefined \isaindexbold{while}
+combinator.  The latter lives in the library theory
+\isa{While_Combinator}, which is not part of @{text Main} but needs to
+be included explicitly among the ancestor theories.
+
+Constant @{term while} is of type @{text"('a \<Rightarrow> bool) \<Rightarrow> ('a \<Rightarrow> 'a) \<Rightarrow> 'a"}
+and satisfies the recursion equation @{thm[display]while_unfold[no_vars]}
+That is, @{term"while b c s"} is equivalent to the imperative program
+\begin{verbatim}
+     x := s; while b(x) do x := c(x); return x
+\end{verbatim}
+In general, @{term s} will be a tuple (better still: a record). As an example
+consider the tail recursive variant of function @{term find} above:
+*}
+
+constdefs find2 :: "('a \<Rightarrow> 'a) \<Rightarrow> 'a \<Rightarrow> 'a"
+  "find2 f x \<equiv>
+   fst(while (\<lambda>(x,x'). x' \<noteq> x) (\<lambda>(x,x'). (x',f x')) (x,f x))"
+
+text{*\noindent
+The loop operates on two ``local variables'' @{term x} and @{term x'}
+containing the ``current'' and the ``next'' value of function @{term f}.
+They are initialized with the global @{term x} and @{term"f x"}. At the
+end @{term fst} selects the local @{term x}.
+
+This looks like we can define at least tail recursive functions
+without bothering about termination after all. But there is no free
+lunch: when proving properties of functions defined by @{term while},
+termination rears its ugly head again. Here is
+@{thm[source]while_rule}, the well known proof rule for total
+correctness of loops expressed with @{term while}:
+@{thm[display,margin=50]while_rule[no_vars]} @{term P} needs to be
+true of the initial state @{term s} and invariant under @{term c}
+(premises 1 and 2). The post-condition @{term Q} must become true when
+leaving the loop (premise 3). And each loop iteration must descend
+along a well-founded relation @{term r} (premises 4 and 5).
+
+Let us now prove that @{term find2} does indeed find a fixed point. Instead
+of induction we apply the above while rule, suitably instantiated.
+Only the final premise of @{thm[source]while_rule} is left unproved
+by @{text auto} but falls to @{text simp}:
+*}
+
+lemma lem: "\<lbrakk> wf(step1 f); x' = f x \<rbrakk> \<Longrightarrow> \<exists>y y'.
+   while (\<lambda>(x,x'). x' \<noteq> x) (\<lambda>(x,x'). (x',f x')) (x,x') = (y,y') \<and>
+   y' = y \<and> f y = y"
+apply(rule_tac P = "\<lambda>(x,x'). x' = f x" and
+               r = "inv_image (step1 f) fst" in while_rule);
+apply auto
+apply(simp add:inv_image_def step1_def)
+done
+
+text{*
+The theorem itself is a simple consequence of this lemma:
+*}
+
+theorem "wf(step1 f) \<Longrightarrow> f(find2 f x) = find2 f x"
+apply(drule_tac x = x in lem)
+apply(auto simp add:find2_def)
+done
+
+text{* Let us conclude this section on partial functions with a
+discussion of the merits of the @{term while} combinator. We have
+already seen that the advantage (if it is one) of not having to
+provide a termination argument when defining a function via @{term
+while} merely puts off the evil hour. On top of that, tail recursive
+functions tend to be more complicated to reason about. So why use
+@{term while} at all? The only reason is executability: the recursion
+equation for @{term while} is a directly executable functional
+program. This is in stark contrast to guarded recursion as introduced
+above, which requires an explicit test @{prop"x \<in> dom f"} in the
+function body.  Unless @{term dom} is trivial, this leads to a
+definition which is either not at all executable or prohibitively
+expensive. Thus, if you are aiming for an efficiently executable definition
+of a partial function, you are likely to need @{term while}.
+*}
+
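+text{*
+To illustrate, here is a sketch (nothing more) of how Euclid's algorithm
+could be phrased as a directly executable @{term while} program. The name
+@{text gcdw} is purely illustrative and no properties of it are proved here:
+*}
+
+(* loop on the pair (m,n): while n is nonzero, replace (m,n) by (n, m mod n);
+   at the end the first component of the pair holds the result *)
+constdefs gcdw :: "nat \<Rightarrow> nat \<Rightarrow> nat"
+  "gcdw m n \<equiv> fst(while (\<lambda>(m,n). n \<noteq> 0) (\<lambda>(m,n). (n, m mod n)) (m,n))"
+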
+(*<*)end(*>*)
--- a/doc-src/TutorialI/Advanced/ROOT.ML	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Advanced/ROOT.ML	Wed Dec 13 09:39:53 2000 +0100
@@ -1,3 +1,4 @@
 use "../settings.ML";
 use_thy "simp";
 use_thy "WFrec";
+use_thy "Partial";
--- a/doc-src/TutorialI/Advanced/WFrec.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Advanced/WFrec.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -5,7 +5,7 @@
 functions. Sometimes this can be quite inconvenient or even
 impossible. Fortunately, \isacommand{recdef} supports much more
 general definitions. For example, termination of Ackermann's function
-can be shown by means of the lexicographic product @{text"<*lex*>"}:
+can be shown by means of the \rmindex{lexicographic product} @{text"<*lex*>"}:
 *}
 
 consts ack :: "nat\<times>nat \<Rightarrow> nat";
@@ -80,6 +80,7 @@
 
 text{*\noindent
 and should have appended the following hint to our above definition:
+\indexbold{*recdef_wf}
 *}
 (*<*)
 consts g :: "nat \<Rightarrow> nat"
@@ -87,5 +88,5 @@
 "g 0 = 0"
 "g (Suc n) = g n"
 (*>*)
-(hints recdef_wf add: wf_id)
+(hints recdef_wf: wf_id)
 (*<*)end(*>*)
--- a/doc-src/TutorialI/Advanced/advanced.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Advanced/advanced.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -13,9 +13,10 @@
 \section{Advanced forms of recursion}
 \index{*recdef|(}
 
-The purpose of this section is to introduce advanced forms of \isacommand{recdef}. It
-covers two topics: how to define recursive function over nested recursive datatypes
-and how to establish termination by means other than measure functions.
+The purpose of this section is to introduce advanced forms of
+\isacommand{recdef}: how to define recursive functions over nested recursive
+datatypes, how to establish termination by means other than measure functions,
+and how to deal with partial functions.
 
 If, after reading this section, you feel that the definition of recursive
 functions is overly and maybe unnecessarily complicated by the requirement of
@@ -32,12 +33,17 @@
 \input{Recdef/document/Nested0.tex}
 \input{Recdef/document/Nested1.tex}
 \input{Recdef/document/Nested2.tex}
-\index{*recdef|)}
 
 \subsection{Beyond measure}
 \label{sec:beyond-measure}
 \input{Advanced/document/WFrec.tex}
 
+\subsection{Partial functions}
+\index{partial function}
+\input{Advanced/document/Partial.tex}
+
+\index{*recdef|)}
+
 \section{Advanced induction techniques}
 \label{sec:advanced-ind}
 \index{induction|(}
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/doc-src/TutorialI/Advanced/document/Partial.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -0,0 +1,225 @@
+%
+\begin{isabellebody}%
+\def\isabellecontext{Partial}%
+%
+\begin{isamarkuptext}%
+\noindent
+Throughout the tutorial we have emphasized the fact that all functions
+in HOL are total. Hence we cannot hope to define truly partial
+functions. The best we can do are functions that are
+\emph{underdefined}\index{underdefined function}:
+for certain arguments we only know that a result
+exists, but we don't know what it is. When defining functions that are
+normally considered partial, underdefinedness turns out to be a very
+reasonable alternative.
+
+We have already seen an instance of underdefinedness by means of
+non-exhaustive pattern matching: the definition of \isa{last} in
+\S\ref{sec:recdef-examples}. The same is allowed for \isacommand{primrec}%
+\end{isamarkuptext}%
+\isacommand{consts}\ hd\ {\isacharcolon}{\isacharcolon}\ {\isachardoublequote}{\isacharprime}a\ list\ {\isasymRightarrow}\ {\isacharprime}a{\isachardoublequote}\isanewline
+\isacommand{primrec}\ {\isachardoublequote}hd\ {\isacharparenleft}x{\isacharhash}xs{\isacharparenright}\ {\isacharequal}\ x{\isachardoublequote}%
+\begin{isamarkuptext}%
+\noindent
+although it generates a warning.
+
+Even ordinary definitions allow underdefinedness, this time by means of
+preconditions:%
+\end{isamarkuptext}%
+\isacommand{constdefs}\ minus\ {\isacharcolon}{\isacharcolon}\ {\isachardoublequote}nat\ {\isasymRightarrow}\ nat\ {\isasymRightarrow}\ nat{\isachardoublequote}\isanewline
+{\isachardoublequote}n\ {\isasymle}\ m\ {\isasymLongrightarrow}\ minus\ m\ n\ {\isasymequiv}\ m\ {\isacharminus}\ n{\isachardoublequote}%
+\begin{isamarkuptext}%
+The rest of this section is devoted to the question of how to define
+partial recursive functions by other means than non-exhaustive pattern
+matching.%
+\end{isamarkuptext}%
+%
+\isamarkupsubsubsection{Guarded recursion%
+}
+%
+\begin{isamarkuptext}%
+Neither \isacommand{primrec} nor \isacommand{recdef} allows us to
+prefix an equation with a condition in the way ordinary definitions do
+(see \isa{minus} above). Instead we have to move the condition over
+to the right-hand side of the equation. Given a partial function $f$
+that should satisfy the recursion equation $f(x) = t$ over its domain
+$dom(f)$, we turn this into the \isacommand{recdef}
+\begin{isabelle}%
+\ \ \ \ \ f\ x\ {\isacharequal}\ {\isacharparenleft}if\ x\ {\isasymin}\ dom\ f\ then\ t\ else\ arbitrary{\isacharparenright}%
+\end{isabelle}
+where \isa{arbitrary} is a predeclared constant of type \isa{{\isacharprime}a}
+which has no definition. Thus we know nothing about its value,
+which is ideal for specifying underdefined functions on top of it.
+
+As a simple example we define division on \isa{nat}:%
+\end{isamarkuptext}%
+\isacommand{consts}\ divi\ {\isacharcolon}{\isacharcolon}\ {\isachardoublequote}nat\ {\isasymtimes}\ nat\ {\isasymRightarrow}\ nat{\isachardoublequote}\isanewline
+\isacommand{recdef}\ divi\ {\isachardoublequote}measure{\isacharparenleft}{\isasymlambda}{\isacharparenleft}m{\isacharcomma}n{\isacharparenright}{\isachardot}\ m{\isacharparenright}{\isachardoublequote}\isanewline
+\ \ {\isachardoublequote}divi{\isacharparenleft}m{\isacharcomma}n{\isacharparenright}\ {\isacharequal}\ {\isacharparenleft}if\ n\ {\isacharequal}\ {\isadigit{0}}\ then\ arbitrary\ else\isanewline
+\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ if\ m\ {\isacharless}\ n\ then\ {\isadigit{0}}\ else\ divi{\isacharparenleft}m{\isacharminus}n{\isacharcomma}n{\isacharparenright}{\isacharplus}{\isadigit{1}}{\isacharparenright}{\isachardoublequote}%
+\begin{isamarkuptext}%
+\noindent Of course we could also have defined
+\isa{divi\ {\isacharparenleft}m{\isacharcomma}\ {\isadigit{0}}{\isacharparenright}} to be some specific number, for example 0. The
+latter option is chosen for the predefined \isa{div} function, which
+simplifies proofs at the expense of moving further away from the
+standard mathematical division function.
+
+As a more substantial example we consider the problem of searching a graph.
+For simplicity our graph is given by a function (\isa{f}) of
+type \isa{{\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}a} which
+maps each node to its successor, and the task is to find the end of a chain,
+i.e.\ a node pointing to itself. Here is a first attempt:
+\begin{isabelle}%
+\ \ \ \ \ find\ {\isacharparenleft}f{\isacharcomma}\ x{\isacharparenright}\ {\isacharequal}\ {\isacharparenleft}if\ f\ x\ {\isacharequal}\ x\ then\ x\ else\ find\ {\isacharparenleft}f{\isacharcomma}\ f\ x{\isacharparenright}{\isacharparenright}%
+\end{isabelle}
+This may be viewed as a fixed point finder or as one half of the well known
+\emph{Union-Find} algorithm.
+The snag is that it may not terminate if \isa{f} has nontrivial cycles.
+Phrased differently, the relation%
+\end{isamarkuptext}%
+\isacommand{constdefs}\ step{\isadigit{1}}\ {\isacharcolon}{\isacharcolon}\ {\isachardoublequote}{\isacharparenleft}{\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}a{\isacharparenright}\ {\isasymRightarrow}\ {\isacharparenleft}{\isacharprime}a\ {\isasymtimes}\ {\isacharprime}a{\isacharparenright}set{\isachardoublequote}\isanewline
+\ \ {\isachardoublequote}step{\isadigit{1}}\ f\ {\isasymequiv}\ {\isacharbraceleft}{\isacharparenleft}y{\isacharcomma}x{\isacharparenright}{\isachardot}\ y\ {\isacharequal}\ f\ x\ {\isasymand}\ y\ {\isasymnoteq}\ x{\isacharbraceright}{\isachardoublequote}%
+\begin{isamarkuptext}%
+\noindent
+must be well-founded. Thus we make the following definition:%
+\end{isamarkuptext}%
+\isacommand{consts}\ find\ {\isacharcolon}{\isacharcolon}\ {\isachardoublequote}{\isacharparenleft}{\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}a{\isacharparenright}\ {\isasymtimes}\ {\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}a{\isachardoublequote}\isanewline
+\isacommand{recdef}\ find\ {\isachardoublequote}same{\isacharunderscore}fst\ {\isacharparenleft}{\isasymlambda}f{\isachardot}\ wf{\isacharparenleft}step{\isadigit{1}}\ f{\isacharparenright}{\isacharparenright}\ step{\isadigit{1}}{\isachardoublequote}\isanewline
+\ \ {\isachardoublequote}find{\isacharparenleft}f{\isacharcomma}x{\isacharparenright}\ {\isacharequal}\ {\isacharparenleft}if\ wf{\isacharparenleft}step{\isadigit{1}}\ f{\isacharparenright}\isanewline
+\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ then\ if\ f\ x\ {\isacharequal}\ x\ then\ x\ else\ find{\isacharparenleft}f{\isacharcomma}\ f\ x{\isacharparenright}\isanewline
+\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ else\ arbitrary{\isacharparenright}{\isachardoublequote}\isanewline
+{\isacharparenleft}\isakeyword{hints}\ recdef{\isacharunderscore}simp{\isacharcolon}same{\isacharunderscore}fst{\isacharunderscore}def\ step{\isadigit{1}}{\isacharunderscore}def{\isacharparenright}%
+\begin{isamarkuptext}%
+\noindent
+The recursion equation itself should be clear enough: it is our aborted
+first attempt augmented with a check that there are no non-trivial loops.
+
+What complicates the termination proof is that the argument of
+\isa{find} is a pair. To express the required well-founded relation
+we employ the predefined combinator \isa{same{\isacharunderscore}fst} of type
+\begin{isabelle}%
+\ \ \ \ \ {\isacharparenleft}{\isacharprime}a\ {\isasymRightarrow}\ bool{\isacharparenright}\ {\isasymRightarrow}\ {\isacharparenleft}{\isacharprime}a\ {\isasymRightarrow}\ {\isacharparenleft}{\isacharprime}b{\isasymtimes}{\isacharprime}b{\isacharparenright}set{\isacharparenright}\ {\isasymRightarrow}\ {\isacharparenleft}{\isacharparenleft}{\isacharprime}a{\isasymtimes}{\isacharprime}b{\isacharparenright}\ {\isasymtimes}\ {\isacharparenleft}{\isacharprime}a{\isasymtimes}{\isacharprime}b{\isacharparenright}{\isacharparenright}set%
+\end{isabelle}
+defined as
+\begin{isabelle}%
+\ \ \ \ \ same{\isacharunderscore}fst\ P\ R\ {\isasymequiv}\ {\isacharbraceleft}{\isacharparenleft}{\isacharparenleft}x{\isacharprime}{\isacharcomma}\ y{\isacharprime}{\isacharparenright}{\isacharcomma}\ x{\isacharcomma}\ y{\isacharparenright}{\isachardot}\ x{\isacharprime}\ {\isacharequal}\ x\ {\isasymand}\ P\ x\ {\isasymand}\ {\isacharparenleft}y{\isacharprime}{\isacharcomma}\ y{\isacharparenright}\ {\isasymin}\ R\ x{\isacharbraceright}%
+\end{isabelle}
+This combinator is designed for recursive functions on pairs where the first
+component of the argument is passed unchanged to all recursive
+calls. Given a constraint on the first component and a relation on the second
+component, \isa{same{\isacharunderscore}fst} builds the required relation on pairs.
+The theorem \begin{isabelle}%
+\ \ \ \ \ {\isacharparenleft}{\isasymAnd}x{\isachardot}\ P\ x\ {\isasymLongrightarrow}\ wf\ {\isacharparenleft}R\ x{\isacharparenright}{\isacharparenright}\ {\isasymLongrightarrow}\ wf\ {\isacharparenleft}same{\isacharunderscore}fst\ P\ R{\isacharparenright}%
+\end{isabelle}
+is known to the well-foundedness prover of \isacommand{recdef}.
+Thus well-foundedness of the given relation is immediate.
+Furthermore, each recursive call descends along the given relation:
+the first argument stays unchanged and the second one descends along
+\isa{step{\isadigit{1}}\ f}. The proof merely requires unfolding of some definitions.
+
+Normally you will then derive the following conditional variant from
+the unconditional recursion equation%
+\end{isamarkuptext}%
+\isacommand{lemma}\ {\isacharbrackleft}simp{\isacharbrackright}{\isacharcolon}\isanewline
+\ \ {\isachardoublequote}wf{\isacharparenleft}step{\isadigit{1}}\ f{\isacharparenright}\ {\isasymLongrightarrow}\ find{\isacharparenleft}f{\isacharcomma}x{\isacharparenright}\ {\isacharequal}\ {\isacharparenleft}if\ f\ x\ {\isacharequal}\ x\ then\ x\ else\ find{\isacharparenleft}f{\isacharcomma}\ f\ x{\isacharparenright}{\isacharparenright}{\isachardoublequote}\isanewline
+\isacommand{by}\ simp%
+\begin{isamarkuptext}%
+\noindent and then disable the original recursion equation:%
+\end{isamarkuptext}%
+\isacommand{declare}\ find{\isachardot}simps{\isacharbrackleft}simp\ del{\isacharbrackright}%
+\begin{isamarkuptext}%
+We can reason about such underdefined functions just as we do about any other
+recursive function. Here is a simple example of recursion induction:%
+\end{isamarkuptext}%
+\isacommand{lemma}\ {\isachardoublequote}wf{\isacharparenleft}step{\isadigit{1}}\ f{\isacharparenright}\ {\isasymlongrightarrow}\ f{\isacharparenleft}find{\isacharparenleft}f{\isacharcomma}x{\isacharparenright}{\isacharparenright}\ {\isacharequal}\ find{\isacharparenleft}f{\isacharcomma}x{\isacharparenright}{\isachardoublequote}\isanewline
+\isacommand{apply}{\isacharparenleft}induct{\isacharunderscore}tac\ f\ x\ rule{\isacharcolon}find{\isachardot}induct{\isacharparenright}\isanewline
+\isacommand{apply}\ simp\isanewline
+\isacommand{done}%
+\isamarkupsubsubsection{The {\tt\slshape while} combinator%
+}
+%
+\begin{isamarkuptext}%
+If the recursive function happens to be tail recursive, its
+definition becomes a triviality if based on the predefined \isaindexbold{while}
+combinator.  The latter lives in the library theory
+\isa{While_Combinator}, which is not part of \isa{Main} but needs to
+be included explicitly among the ancestor theories.
+
+Constant \isa{while} is of type \isa{{\isacharparenleft}{\isacharprime}a\ {\isasymRightarrow}\ bool{\isacharparenright}\ {\isasymRightarrow}\ {\isacharparenleft}{\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}a{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a}
+and satisfies the recursion equation \begin{isabelle}%
+\ \ \ \ \ while\ b\ c\ s\ {\isacharequal}\ {\isacharparenleft}if\ b\ s\ then\ while\ b\ c\ {\isacharparenleft}c\ s{\isacharparenright}\ else\ s{\isacharparenright}%
+\end{isabelle}
+That is, \isa{while\ b\ c\ s} is equivalent to the imperative program
+\begin{verbatim}
+     x := s; while b(x) do x := c(x); return x
+\end{verbatim}
+In general, \isa{s} will be a tuple (better still: a record). As an example
+consider the tail recursive variant of function \isa{find} above:%
+\end{isamarkuptext}%
+\isacommand{constdefs}\ find{\isadigit{2}}\ {\isacharcolon}{\isacharcolon}\ {\isachardoublequote}{\isacharparenleft}{\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}a{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}a{\isachardoublequote}\isanewline
+\ \ {\isachardoublequote}find{\isadigit{2}}\ f\ x\ {\isasymequiv}\isanewline
+\ \ \ fst{\isacharparenleft}while\ {\isacharparenleft}{\isasymlambda}{\isacharparenleft}x{\isacharcomma}x{\isacharprime}{\isacharparenright}{\isachardot}\ x{\isacharprime}\ {\isasymnoteq}\ x{\isacharparenright}\ {\isacharparenleft}{\isasymlambda}{\isacharparenleft}x{\isacharcomma}x{\isacharprime}{\isacharparenright}{\isachardot}\ {\isacharparenleft}x{\isacharprime}{\isacharcomma}f\ x{\isacharprime}{\isacharparenright}{\isacharparenright}\ {\isacharparenleft}x{\isacharcomma}f\ x{\isacharparenright}{\isacharparenright}{\isachardoublequote}%
+\begin{isamarkuptext}%
+\noindent
+The loop operates on two ``local variables'' \isa{x} and \isa{x{\isacharprime}}
+containing the ``current'' and the ``next'' value of function \isa{f}.
+They are initialized with the global \isa{x} and \isa{f\ x}. At the
+end \isa{fst} selects the local \isa{x}.
+
+This looks like we can define at least tail recursive functions
+without bothering about termination after all. But there is no free
+lunch: when proving properties of functions defined by \isa{while},
+termination rears its ugly head again. Here is
+\isa{while{\isacharunderscore}rule}, the well known proof rule for total
+correctness of loops expressed with \isa{while}:
+\begin{isabelle}%
+\ \ \ \ \ {\isasymlbrakk}P\ s{\isacharsemicolon}\ {\isasymAnd}s{\isachardot}\ {\isasymlbrakk}P\ s{\isacharsemicolon}\ b\ s{\isasymrbrakk}\ {\isasymLongrightarrow}\ P\ {\isacharparenleft}c\ s{\isacharparenright}{\isacharsemicolon}\isanewline
+\ \ \ \ \ \ \ \ {\isasymAnd}s{\isachardot}\ {\isasymlbrakk}P\ s{\isacharsemicolon}\ {\isasymnot}\ b\ s{\isasymrbrakk}\ {\isasymLongrightarrow}\ Q\ s{\isacharsemicolon}\ wf\ r{\isacharsemicolon}\isanewline
+\ \ \ \ \ \ \ \ {\isasymAnd}s{\isachardot}\ {\isasymlbrakk}P\ s{\isacharsemicolon}\ b\ s{\isasymrbrakk}\ {\isasymLongrightarrow}\ {\isacharparenleft}c\ s{\isacharcomma}\ s{\isacharparenright}\ {\isasymin}\ r{\isasymrbrakk}\isanewline
+\ \ \ \ \ {\isasymLongrightarrow}\ Q\ {\isacharparenleft}while\ b\ c\ s{\isacharparenright}%
+\end{isabelle} \isa{P} needs to be
+true of the initial state \isa{s} and invariant under \isa{c}
+(premises 1 and 2). The post-condition \isa{Q} must become true when
+leaving the loop (premise 3). And each loop iteration must descend
+along a well-founded relation \isa{r} (premises 4 and 5).
+
+Let us now prove that \isa{find{\isadigit{2}}} does indeed find a fixed point. Instead
+of induction we apply the above while rule, suitably instantiated.
+Only the final premise of \isa{while{\isacharunderscore}rule} is left unproved
+by \isa{auto} but falls to \isa{simp}:%
+\end{isamarkuptext}%
+\isacommand{lemma}\ lem{\isacharcolon}\ {\isachardoublequote}{\isasymlbrakk}\ wf{\isacharparenleft}step{\isadigit{1}}\ f{\isacharparenright}{\isacharsemicolon}\ x{\isacharprime}\ {\isacharequal}\ f\ x\ {\isasymrbrakk}\ {\isasymLongrightarrow}\ {\isasymexists}y\ y{\isacharprime}{\isachardot}\isanewline
+\ \ \ while\ {\isacharparenleft}{\isasymlambda}{\isacharparenleft}x{\isacharcomma}x{\isacharprime}{\isacharparenright}{\isachardot}\ x{\isacharprime}\ {\isasymnoteq}\ x{\isacharparenright}\ {\isacharparenleft}{\isasymlambda}{\isacharparenleft}x{\isacharcomma}x{\isacharprime}{\isacharparenright}{\isachardot}\ {\isacharparenleft}x{\isacharprime}{\isacharcomma}f\ x{\isacharprime}{\isacharparenright}{\isacharparenright}\ {\isacharparenleft}x{\isacharcomma}x{\isacharprime}{\isacharparenright}\ {\isacharequal}\ {\isacharparenleft}y{\isacharcomma}y{\isacharprime}{\isacharparenright}\ {\isasymand}\isanewline
+\ \ \ y{\isacharprime}\ {\isacharequal}\ y\ {\isasymand}\ f\ y\ {\isacharequal}\ y{\isachardoublequote}\isanewline
+\isacommand{apply}{\isacharparenleft}rule{\isacharunderscore}tac\ P\ {\isacharequal}\ {\isachardoublequote}{\isasymlambda}{\isacharparenleft}x{\isacharcomma}x{\isacharprime}{\isacharparenright}{\isachardot}\ x{\isacharprime}\ {\isacharequal}\ f\ x{\isachardoublequote}\ \isakeyword{and}\isanewline
+\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ r\ {\isacharequal}\ {\isachardoublequote}inv{\isacharunderscore}image\ {\isacharparenleft}step{\isadigit{1}}\ f{\isacharparenright}\ fst{\isachardoublequote}\ \isakeyword{in}\ while{\isacharunderscore}rule{\isacharparenright}\isanewline
+\isacommand{apply}\ auto\isanewline
+\isacommand{apply}{\isacharparenleft}simp\ add{\isacharcolon}inv{\isacharunderscore}image{\isacharunderscore}def\ step{\isadigit{1}}{\isacharunderscore}def{\isacharparenright}\isanewline
+\isacommand{done}%
+\begin{isamarkuptext}%
+The theorem itself is a simple consequence of this lemma:%
+\end{isamarkuptext}%
+\isacommand{theorem}\ {\isachardoublequote}wf{\isacharparenleft}step{\isadigit{1}}\ f{\isacharparenright}\ {\isasymLongrightarrow}\ f{\isacharparenleft}find{\isadigit{2}}\ f\ x{\isacharparenright}\ {\isacharequal}\ find{\isadigit{2}}\ f\ x{\isachardoublequote}\isanewline
+\isacommand{apply}{\isacharparenleft}drule{\isacharunderscore}tac\ x\ {\isacharequal}\ x\ \isakeyword{in}\ lem{\isacharparenright}\isanewline
+\isacommand{apply}{\isacharparenleft}auto\ simp\ add{\isacharcolon}find{\isadigit{2}}{\isacharunderscore}def{\isacharparenright}\isanewline
+\isacommand{done}%
+\begin{isamarkuptext}%
+Let us conclude this section on partial functions with a
+discussion of the merits of the \isa{while} combinator. We have
+already seen that the advantage (if it is one) of not having to
+provide a termination argument when defining a function via \isa{while} merely puts off the evil hour. On top of that, tail recursive
+functions tend to be more complicated to reason about. So why use
+\isa{while} at all? The only reason is executability: the recursion
+equation for \isa{while} is a directly executable functional
+program. This is in stark contrast to guarded recursion as introduced
+above, which requires an explicit test \isa{x\ {\isasymin}\ dom\ f} in the
+function body.  Unless \isa{dom} is trivial, this leads to a
+definition which is either not at all executable or prohibitively
+expensive. Thus, if you are aiming for an efficiently executable definition
+of a partial function, you are likely to need \isa{while}.%
+\end{isamarkuptext}%
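+%
+\begin{isamarkuptext}%
+To illustrate, here is a sketch (nothing more) of how Euclid's algorithm
+could be phrased as a directly executable \isa{while} program. The name
+\isa{gcdw} is purely illustrative and no properties of it are proved here:%
+\end{isamarkuptext}%
+\isacommand{constdefs}\ gcdw\ {\isacharcolon}{\isacharcolon}\ {\isachardoublequote}nat\ {\isasymRightarrow}\ nat\ {\isasymRightarrow}\ nat{\isachardoublequote}\isanewline
+\ \ {\isachardoublequote}gcdw\ m\ n\ {\isasymequiv}\ fst{\isacharparenleft}while\ {\isacharparenleft}{\isasymlambda}{\isacharparenleft}m{\isacharcomma}n{\isacharparenright}{\isachardot}\ n\ {\isasymnoteq}\ {\isadigit{0}}{\isacharparenright}\ {\isacharparenleft}{\isasymlambda}{\isacharparenleft}m{\isacharcomma}n{\isacharparenright}{\isachardot}\ {\isacharparenleft}n{\isacharcomma}\ m\ mod\ n{\isacharparenright}{\isacharparenright}\ {\isacharparenleft}m{\isacharcomma}n{\isacharparenright}{\isacharparenright}{\isachardoublequote}%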
+\end{isabellebody}%
+%%% Local Variables:
+%%% mode: latex
+%%% TeX-master: "root"
+%%% End:
--- a/doc-src/TutorialI/Advanced/document/WFrec.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Advanced/document/WFrec.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -8,7 +8,7 @@
 functions. Sometimes this can be quite inconvenient or even
 impossible. Fortunately, \isacommand{recdef} supports much more
 general definitions. For example, termination of Ackermann's function
-can be shown by means of the lexicographic product \isa{{\isacharless}{\isacharasterisk}lex{\isacharasterisk}{\isachargreater}}:%
+can be shown by means of the \rmindex{lexicographic product} \isa{{\isacharless}{\isacharasterisk}lex{\isacharasterisk}{\isachargreater}}:%
 \end{isamarkuptext}%
 \isacommand{consts}\ ack\ {\isacharcolon}{\isacharcolon}\ {\isachardoublequote}nat{\isasymtimes}nat\ {\isasymRightarrow}\ nat{\isachardoublequote}\isanewline
 \isacommand{recdef}\ ack\ {\isachardoublequote}measure{\isacharparenleft}{\isasymlambda}m{\isachardot}\ m{\isacharparenright}\ {\isacharless}{\isacharasterisk}lex{\isacharasterisk}{\isachargreater}\ measure{\isacharparenleft}{\isasymlambda}n{\isachardot}\ n{\isacharparenright}{\isachardoublequote}\isanewline
@@ -76,9 +76,10 @@
 \isacommand{by}\ simp%
 \begin{isamarkuptext}%
 \noindent
-and should have appended the following hint to our above definition:%
+and should have appended the following hint to our above definition:
+\indexbold{*recdef_wf}%
 \end{isamarkuptext}%
-{\isacharparenleft}\isakeyword{hints}\ recdef{\isacharunderscore}wf\ add{\isacharcolon}\ wf{\isacharunderscore}id{\isacharparenright}\end{isabellebody}%
+{\isacharparenleft}\isakeyword{hints}\ recdef{\isacharunderscore}wf{\isacharcolon}\ wf{\isacharunderscore}id{\isacharparenright}\end{isabellebody}%
 %%% Local Variables:
 %%% mode: latex
 %%% TeX-master: "root"
--- a/doc-src/TutorialI/CTL/CTL.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/CTL/CTL.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -182,7 +182,7 @@
 text{*\noindent
 Element @{term"n+1"} on this path is some arbitrary successor
 @{term t} of element @{term n} such that @{term"P t"} holds.  Remember that @{text"SOME t. R t"}
-is some arbitrary but fixed @{term t} such that @{prop"R t"} holds (see \S\ref{sec-SOME}). Of
+is some arbitrary but fixed @{term t} such that @{prop"R t"} holds (see \S\ref{sec:SOME}). Of
 course, such a @{term t} may in general not exist, but that is of no
 concern to us since we will only use @{term path} in such cases where a
 suitable @{term t} does exist.
--- a/doc-src/TutorialI/CTL/document/CTL.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/CTL/document/CTL.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -139,7 +139,7 @@
 \noindent
 Element \isa{n\ {\isacharplus}\ {\isadigit{1}}} on this path is some arbitrary successor
 \isa{t} of element \isa{n} such that \isa{P\ t} holds.  Remember that \isa{SOME\ t{\isachardot}\ R\ t}
-is some arbitrary but fixed \isa{t} such that \isa{R\ t} holds (see \S\ref{sec-SOME}). Of
+is some arbitrary but fixed \isa{t} such that \isa{R\ t} holds (see \S\ref{sec:SOME}). Of
 course, such a \isa{t} may in general not exist, but that is of no
 concern to us since we will only use \isa{path} in such cases where a
 suitable \isa{t} does exist.
--- a/doc-src/TutorialI/Inductive/Even.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Inductive/Even.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -179,8 +179,10 @@
 \ 2.\ \isasymAnd na.\ \isasymlbrakk na\ \isasymin \ even;\ n\ \isasymin \ even\isasymrbrakk \ \isasymLongrightarrow \ n\ \isasymin \ even%
 \end{isabelle}
 The first one is hopeless.  In general, rule inductions involving
-non-trivial terms will probably go wrong.  The solution is easy provided
-we have the necessary inverses, here subtraction:
+non-trivial terms will probably go wrong. How to deal with such situations
+in general is described in {\S}\ref{sec:ind-var-in-prems} below.
+In the current case the solution is easy because
+we have the necessary inverse, subtraction:
 \begin{isabelle}
 \isacommand{lemma}\ even_imp_even_minus_2:\ "n\ \isasymin \ even\ \isasymLongrightarrow \ n-\#2\ \isasymin \ even"\isanewline
 \isacommand{apply}\ (erule\ even.induct)\isanewline
--- a/doc-src/TutorialI/IsaMakefile	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/IsaMakefile	Wed Dec 13 09:39:53 2000 +0100
@@ -23,6 +23,9 @@
 	@cd $(SRC)/HOL; $(ISATOOL) make HOL
 
 styles:
+	@rm -f isabelle.sty
+	@rm -f isabellesym.sty
+	@rm -f pdfsetup.sty
 	@$(ISATOOL) latex -o sty >/dev/null
 	@rm -f pdfsetup.sty
 	@rm -f */document/isabelle.sty
@@ -98,7 +101,8 @@
 
 HOL-Advanced: HOL $(LOG)/HOL-Advanced.gz
 
-$(LOG)/HOL-Advanced.gz: $(OUT)/HOL Advanced/simp.thy Advanced/ROOT.ML Advanced/WFrec.thy
+$(LOG)/HOL-Advanced.gz: $(OUT)/HOL Advanced/simp.thy Advanced/ROOT.ML Advanced/WFrec.thy \
+	Advanced/Partial.thy
 	@$(ISATOOL) usedir -i true -d dvi -D document $(OUT)/HOL Advanced
 	@rm -f tutorial.dvi
 
--- a/doc-src/TutorialI/Misc/AdvancedInd.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Misc/AdvancedInd.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -290,8 +290,5 @@
 example theorem @{thm[source]nat_less_induct} can be viewed (and derived) as
 a special case of @{thm[source]wf_induct} where @{term r} is @{text"<"} on
 @{typ nat}. The details can be found in the HOL library.
-*};
-
-(*<*)
-end
-(*>*)
+*}
+(*<*)end(*>*)
--- a/doc-src/TutorialI/Misc/document/natsum.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Misc/document/natsum.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -29,7 +29,8 @@
 \isaindexbold{div}, \isaindexbold{mod}, \isaindexbold{min} and
 \isaindexbold{max} are predefined, as are the relations
 \indexboldpos{\isasymle}{$HOL2arithrel} and
-\ttindexboldpos{<}{$HOL2arithrel}. There is even a least number operation
+\ttindexboldpos{<}{$HOL2arithrel}. As usual, \isa{m\ {\isacharminus}\ n\ {\isacharequal}\ {\isadigit{0}}} if
+\isa{m\ {\isacharless}\ n}. There is even a least number operation
 \isaindexbold{LEAST}. For example, \isa{{\isacharparenleft}LEAST\ n{\isachardot}\ {\isadigit{1}}\ {\isacharless}\ n{\isacharparenright}\ {\isacharequal}\ {\isadigit{2}}}, although
 Isabelle does not prove this completely automatically. Note that \isa{{\isadigit{1}}}
 and \isa{{\isadigit{2}}} are available as abbreviations for the corresponding
@@ -69,7 +70,9 @@
 (which attacks the first subgoal). It proves arithmetic goals involving the
 usual logical connectives (\isa{{\isasymnot}}, \isa{{\isasymand}}, \isa{{\isasymor}},
 \isa{{\isasymlongrightarrow}}), the relations \isa{{\isasymle}} and \isa{{\isacharless}}, and the operations
-\isa{{\isacharplus}}, \isa{{\isacharminus}}, \isa{min} and \isa{max}. For example,%
+\isa{{\isacharplus}}, \isa{{\isacharminus}}, \isa{min} and \isa{max}. Technically, this is
+known as the class of (quantifier-free) \bfindex{linear arithmetic} formulae.
+For example,%
 \end{isamarkuptext}%
 \isacommand{lemma}\ {\isachardoublequote}min\ i\ {\isacharparenleft}max\ j\ {\isacharparenleft}k{\isacharasterisk}k{\isacharparenright}{\isacharparenright}\ {\isacharequal}\ max\ {\isacharparenleft}min\ {\isacharparenleft}k{\isacharasterisk}k{\isacharparenright}\ i{\isacharparenright}\ {\isacharparenleft}min\ i\ {\isacharparenleft}j{\isacharcolon}{\isacharcolon}nat{\isacharparenright}{\isacharparenright}{\isachardoublequote}\isanewline
 \isacommand{apply}{\isacharparenleft}arith{\isacharparenright}%
@@ -88,8 +91,8 @@
   of \ttindexboldpos{-}{$HOL2arithfun}, \isaindexbold{min} and
   \isaindexbold{max} because they are first eliminated by case distinctions.
 
-  \isa{arith} is incomplete even for the restricted class of formulae
-  described above (known as ``linear arithmetic''). If divisibility plays a
+  \isa{arith} is incomplete even for the restricted class of
+  linear arithmetic formulae. If divisibility plays a
   role, it may fail to prove a valid formula, for example
   \isa{m\ {\isacharplus}\ m\ {\isasymnoteq}\ n\ {\isacharplus}\ n\ {\isacharplus}\ {\isadigit{1}}}. Fortunately, such examples are rare in practice.
 \end{warn}%
--- a/doc-src/TutorialI/Misc/natsum.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Misc/natsum.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -27,7 +27,8 @@
 \isaindexbold{div}, \isaindexbold{mod}, \isaindexbold{min} and
 \isaindexbold{max} are predefined, as are the relations
 \indexboldpos{\isasymle}{$HOL2arithrel} and
-\ttindexboldpos{<}{$HOL2arithrel}. There is even a least number operation
+\ttindexboldpos{<}{$HOL2arithrel}. As usual, @{prop"m-n = 0"} if
+@{prop"m<n"}. There is even a least number operation
 \isaindexbold{LEAST}. For example, @{prop"(LEAST n. 1 < n) = 2"}, although
 Isabelle does not prove this completely automatically. Note that @{term 1}
 and @{term 2} are available as abbreviations for the corresponding
@@ -69,7 +70,9 @@
 (which attacks the first subgoal). It proves arithmetic goals involving the
 usual logical connectives (@{text"\<not>"}, @{text"\<and>"}, @{text"\<or>"},
 @{text"\<longrightarrow>"}), the relations @{text"\<le>"} and @{text"<"}, and the operations
-@{text"+"}, @{text"-"}, @{term min} and @{term max}. For example,
+@{text"+"}, @{text"-"}, @{term min} and @{term max}. Technically, this is
+known as the class of (quantifier-free) \bfindex{linear arithmetic} formulae.
+For example,
 *}
 
 lemma "min i (max j (k*k)) = max (min (k*k) i) (min i (j::nat))";
@@ -92,8 +95,8 @@
   of \ttindexboldpos{-}{$HOL2arithfun}, \isaindexbold{min} and
   \isaindexbold{max} because they are first eliminated by case distinctions.
 
-  \isa{arith} is incomplete even for the restricted class of formulae
-  described above (known as ``linear arithmetic''). If divisibility plays a
+  \isa{arith} is incomplete even for the restricted class of
+  linear arithmetic formulae. If divisibility plays a
   role, it may fail to prove a valid formula, for example
   @{prop"m+m \<noteq> n+n+1"}. Fortunately, such examples are rare in practice.
 \end{warn}
--- a/doc-src/TutorialI/Misc/simp.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Misc/simp.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -227,7 +227,7 @@
 
 subsubsection{*Automatic case splits*}
 
-text{*\indexbold{case splits}\index{*split|(}
+text{*\indexbold{case splits}\index{*split (method, attr.)|(}
 Goals containing @{text"if"}-expressions are usually proved by case
 distinction on the condition of the @{text"if"}. For example the goal
 *}
@@ -235,40 +235,39 @@
 lemma "\<forall>xs. if xs = [] then rev xs = [] else rev xs \<noteq> []";
 
 txt{*\noindent
-can be split by a degenerate form of simplification
+can be split by a special method @{text split}:
 *}
 
-apply(simp only: split: split_if);
+apply(split split_if)
 
 txt{*\noindent
 @{subgoals[display,indent=0]}
-where no simplification rules are included (@{text"only:"} is followed by the
-empty list of theorems) but the rule \isaindexbold{split_if} for
-splitting @{text"if"}s is added (via the modifier @{text"split:"}). Because
+where \isaindexbold{split_if} is a theorem that expresses splitting of
+@{text"if"}s. Because
 case-splitting on @{text"if"}s is almost always the right proof strategy, the
 simplifier performs it automatically. Try \isacommand{apply}@{text"(simp)"}
 on the initial goal above.
 
 This splitting idea generalizes from @{text"if"} to \isaindex{case}:
-*}(*<*)oops;(*>*)
-
+*}(*<*)by simp(*>*)
 lemma "(case xs of [] \<Rightarrow> zs | y#ys \<Rightarrow> y#(ys@zs)) = xs@zs";
-apply(simp only: split: list.split);
+apply(split list.split);
 
 txt{*
 @{subgoals[display,indent=0]}
 In contrast to @{text"if"}-expressions, the simplifier does not split
 @{text"case"}-expressions by default because this can lead to nontermination
-in case of recursive datatypes. Again, if the @{text"only:"} modifier is
-dropped, the above goal is solved,
+in case of recursive datatypes. Therefore the simplifier has a modifier
+@{text split} for adding further splitting rules explicitly. This means the
+above lemma can be proved in one step by
 *}
 (*<*)oops;
 lemma "(case xs of [] \<Rightarrow> zs | y#ys \<Rightarrow> y#(ys@zs)) = xs@zs";
 (*>*)
 apply(simp split: list.split);
 (*<*)done(*>*)
-text{*\noindent%
-which \isacommand{apply}@{text"(simp)"} alone will not do.
+text{*\noindent
+whereas \isacommand{apply}@{text"(simp)"} alone will not succeed.
 
 In general, every datatype $t$ comes with a theorem
 $t$@{text".split"} which can be declared to be a \bfindex{split rule} either
@@ -294,14 +293,19 @@
 declare list.split [split del]
 
 text{*
+In polished proofs the @{text split} method is rarely used on its own
+but always as part of the simplifier. However, if a goal contains
+multiple splittable constructs, the @{text split} method can be
+helpful in selectively exploring the effects of splitting.
+
 The above split rules intentionally only affect the conclusion of a
 subgoal.  If you want to split an @{text"if"} or @{text"case"}-expression in
 the assumptions, you have to apply @{thm[source]split_if_asm} or
 $t$@{text".split_asm"}:
 *}
 
-lemma "if xs = [] then ys ~= [] else ys = [] ==> xs @ ys ~= []"
-apply(simp only: split: split_if_asm);
+lemma "if xs = [] then ys \<noteq> [] else ys = [] \<Longrightarrow> xs @ ys \<noteq> []"
+apply(split split_if_asm)
 
 txt{*\noindent
 In contrast to splitting the conclusion, this actually creates two
@@ -318,15 +322,12 @@
   same is true for \isaindex{case}-expressions: only the selector is
   simplified at first, until either the expression reduces to one of the
   cases or it is split.
-\end{warn}
-
-\index{*split|)}
+\end{warn}\index{*split (method, attr.)|)}
 *}
 (*<*)
 by(simp_all)
 (*>*)
 
-
 subsubsection{*Arithmetic*}
 
 text{*\index{arithmetic}
--- a/doc-src/TutorialI/Recdef/Nested2.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Recdef/Nested2.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -80,7 +80,7 @@
 
 text{*\noindent
 or declare them globally
-by giving them the @{text recdef_cong} attribute as in
+by giving them the \isaindexbold{recdef_cong} attribute as in
 *}
 
 declare map_cong[recdef_cong];
--- a/doc-src/TutorialI/Recdef/document/Nested2.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Recdef/document/Nested2.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -75,7 +75,7 @@
 \begin{isamarkuptext}%
 \noindent
 or declare them globally
-by giving them the \isa{recdef{\isacharunderscore}cong} attribute as in%
+by giving them the \isaindexbold{recdef_cong} attribute as in%
 \end{isamarkuptext}%
 \isacommand{declare}\ map{\isacharunderscore}cong{\isacharbrackleft}recdef{\isacharunderscore}cong{\isacharbrackright}%
 \begin{isamarkuptext}%
--- a/doc-src/TutorialI/Recdef/document/examples.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Recdef/document/examples.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -33,6 +33,8 @@
 \noindent
 This time the measure is the length of the list, which decreases with the
 recursive call; the first component of the argument tuple is irrelevant.
+The details of tupled $\lambda$-abstractions \isa{{\isasymlambda}{\isacharparenleft}x\isactrlsub {\isadigit{1}}{\isacharcomma}{\isasymdots}{\isacharcomma}x\isactrlsub n{\isacharparenright}} are
+explained in \S\ref{sec:products}, but for now your intuition is all you need.
 
 Pattern matching need not be exhaustive:%
 \end{isamarkuptext}%
--- a/doc-src/TutorialI/Recdef/document/termination.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Recdef/document/termination.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -39,7 +39,7 @@
 \noindent
 Because \isacommand{recdef}'s termination prover involves simplification,
 we include with our second attempt the hint to use \isa{termi{\isacharunderscore}lem} as
-a simplification rule:%
+a simplification rule:\indexbold{*recdef_simp}%
 \end{isamarkuptext}%
 \isacommand{consts}\ g\ {\isacharcolon}{\isacharcolon}\ {\isachardoublequote}nat{\isasymtimes}nat\ {\isasymRightarrow}\ nat{\isachardoublequote}\isanewline
 \isacommand{recdef}\ g\ {\isachardoublequote}measure{\isacharparenleft}{\isasymlambda}{\isacharparenleft}x{\isacharcomma}y{\isacharparenright}{\isachardot}\ x{\isacharminus}y{\isacharparenright}{\isachardoublequote}\isanewline
--- a/doc-src/TutorialI/Recdef/examples.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Recdef/examples.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -35,6 +35,8 @@
 text{*\noindent
 This time the measure is the length of the list, which decreases with the
 recursive call; the first component of the argument tuple is irrelevant.
+The details of tupled $\lambda$-abstractions @{text"\<lambda>(x\<^sub>1,\<dots>,x\<^sub>n)"} are
+explained in \S\ref{sec:products}, but for now your intuition is all you need.
 
 Pattern matching need not be exhaustive:
 *}
--- a/doc-src/TutorialI/Recdef/termination.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Recdef/termination.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -42,7 +42,7 @@
 text{*\noindent
 Because \isacommand{recdef}'s termination prover involves simplification,
 we include with our second attempt the hint to use @{thm[source]termi_lem} as
-a simplification rule:
+a simplification rule:\indexbold{*recdef_simp}
 *}
 
 consts g :: "nat\<times>nat \<Rightarrow> nat";
--- a/doc-src/TutorialI/ToyList/ToyList.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/ToyList/ToyList.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -69,7 +69,8 @@
 \isacommand{primrec}\index{*primrec} indicates that the recursion is of a
 particularly primitive kind where each recursive call peels off a datatype
 constructor from one of the arguments.  Thus the
-recursion always terminates, i.e.\ the function is \bfindex{total}.
+recursion always terminates, i.e.\ the function is \textbf{total}.
+\index{total function}
 
 The termination requirement is absolutely essential in HOL, a logic of total
 functions. If we were to drop it, inconsistencies would quickly arise: the
--- a/doc-src/TutorialI/ToyList/document/ToyList.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/ToyList/document/ToyList.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -67,7 +67,8 @@
 \isacommand{primrec}\index{*primrec} indicates that the recursion is of a
 particularly primitive kind where each recursive call peels off a datatype
 constructor from one of the arguments.  Thus the
-recursion always terminates, i.e.\ the function is \bfindex{total}.
+recursion always terminates, i.e.\ the function is \textbf{total}.
+\index{total function}
 
 The termination requirement is absolutely essential in HOL, a logic of total
 functions. If we were to drop it, inconsistencies would quickly arise: the
--- a/doc-src/TutorialI/Types/Axioms.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Types/Axioms.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -114,7 +114,7 @@
 Linear orders are an example of subclassing by construction, which is the most
 common case. It is also possible to prove additional subclass relationships
 later on, i.e.\ subclassing by proof. This is the topic of the following
-section.
+paragraph.
 *}
 
 subsubsection{*Strict orders*}
--- a/doc-src/TutorialI/Types/Overloading2.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Types/Overloading2.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -37,11 +37,11 @@
 @{text"*"} & @{text "('a::times) \<Rightarrow> 'a \<Rightarrow> 'a"} & (infixl 70) \\
 @{text"^"} & @{text "('a::power) \<Rightarrow> nat \<Rightarrow> 'a"} & (infixr 80) \\
 @{text"-"} & @{text "('a::minus) \<Rightarrow> 'a"} \\
-@{term abs} &  @{text "('a::minus) \<Rightarrow> 'a"} \\
+@{term abs} &  @{text "('a::minus) \<Rightarrow> 'a"} & $\mid~\mid$\\
 @{text"\<le>"} & @{text "('a::ord) \<Rightarrow> 'a \<Rightarrow> bool"} & (infixl 50) \\
 @{text"<"} & @{text "('a::ord) \<Rightarrow> 'a \<Rightarrow> bool"} & (infixl 50) \\
 @{term min} &  @{text "('a::ord) \<Rightarrow> 'a \<Rightarrow> 'a"} \\
-@{term max} &  @{text "('a::ord) \<Rightarrow> 'a \<Rightarrow> 'a"} \\
+@{term max} &  @{text "('a::ord) \<Rightarrow> 'a \<Rightarrow> 'a"}
 \end{tabular}
 \caption{Overloaded constants in HOL}
 \label{tab:overloading}
--- a/doc-src/TutorialI/Types/Pairs.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Types/Pairs.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -39,7 +39,8 @@
 different terms).
 
 Internally, @{term"%(x,y). t"} becomes @{text"split (\<lambda>x y. t)"}, where
-@{term split} is the uncurrying function of type @{text"('a \<Rightarrow> 'b
+@{term split}\indexbold{*split (constant)}
+is the uncurrying function of type @{text"('a \<Rightarrow> 'b
 \<Rightarrow> 'c) \<Rightarrow> 'a \<times> 'b \<Rightarrow> 'c"} defined as
 \begin{center}
 @{thm split_def}
@@ -78,7 +79,7 @@
 *}
 
 lemma "(\<lambda>(x,y).y) p = snd p"
-apply(simp only: split:split_split);
+apply(split split_split);
 
 txt{*
 @{subgoals[display,indent=0]}
@@ -191,10 +192,10 @@
 In case you would like to turn off this automatic splitting, just disable the
 responsible simplification rules:
 \begin{center}
-@{thm split_paired_All}
+@{thm split_paired_All[no_vars]}
 \hfill
 (@{thm[source]split_paired_All})\\
-@{thm split_paired_Ex}
+@{thm split_paired_Ex[no_vars]}
 \hfill
 (@{thm[source]split_paired_Ex})
 \end{center}
--- a/doc-src/TutorialI/Types/Typedef.thy	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Types/Typedef.thy	Wed Dec 13 09:39:53 2000 +0100
@@ -199,7 +199,7 @@
 @{prop"P B"} and @{prop"P C"}. First we prove the analogous proposition for the
 representation: *}
 
-lemma cases_lemma: "\<lbrakk> Q 0; Q 1; Q 2; n : three \<rbrakk> \<Longrightarrow>  Q(n::nat)";
+lemma cases_lemma: "\<lbrakk> Q 0; Q 1; Q 2; n : three \<rbrakk> \<Longrightarrow>  Q n";
 
 txt{*\noindent
 Expanding @{thm[source]three_def} yields the premise @{prop"n\<le>2"}. Repeated
--- a/doc-src/TutorialI/Types/document/Axioms.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Types/document/Axioms.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -118,7 +118,7 @@
 Linear orders are an example of subclassing by construction, which is the most
 common case. It is also possible to prove additional subclass relationships
 later on, i.e.\ subclassing by proof. This is the topic of the following
-section.%
+paragraph.%
 \end{isamarkuptext}%
 %
 \isamarkupsubsubsection{Strict orders%
--- a/doc-src/TutorialI/Types/document/Overloading2.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Types/document/Overloading2.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -39,11 +39,11 @@
 \isa{{\isacharasterisk}} & \isa{{\isacharparenleft}{\isacharprime}a{\isacharcolon}{\isacharcolon}times{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}a} & (infixl 70) \\
 \isa{{\isacharcircum}} & \isa{{\isacharparenleft}{\isacharprime}a{\isacharcolon}{\isacharcolon}power{\isacharparenright}\ {\isasymRightarrow}\ nat\ {\isasymRightarrow}\ {\isacharprime}a} & (infixr 80) \\
 \isa{{\isacharminus}} & \isa{{\isacharparenleft}{\isacharprime}a{\isacharcolon}{\isacharcolon}minus{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a} \\
-\isa{abs} &  \isa{{\isacharparenleft}{\isacharprime}a{\isacharcolon}{\isacharcolon}minus{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a} \\
+\isa{abs} &  \isa{{\isacharparenleft}{\isacharprime}a{\isacharcolon}{\isacharcolon}minus{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a} & $\mid~\mid$\\
 \isa{{\isasymle}} & \isa{{\isacharparenleft}{\isacharprime}a{\isacharcolon}{\isacharcolon}ord{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a\ {\isasymRightarrow}\ bool} & (infixl 50) \\
 \isa{{\isacharless}} & \isa{{\isacharparenleft}{\isacharprime}a{\isacharcolon}{\isacharcolon}ord{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a\ {\isasymRightarrow}\ bool} & (infixl 50) \\
 \isa{min} &  \isa{{\isacharparenleft}{\isacharprime}a{\isacharcolon}{\isacharcolon}ord{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}a} \\
-\isa{max} &  \isa{{\isacharparenleft}{\isacharprime}a{\isacharcolon}{\isacharcolon}ord{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}a} \\
+\isa{max} &  \isa{{\isacharparenleft}{\isacharprime}a{\isacharcolon}{\isacharcolon}ord{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}a}
 \end{tabular}
 \caption{Overloaded constants in HOL}
 \label{tab:overloading}
--- a/doc-src/TutorialI/Types/document/Pairs.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Types/document/Pairs.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -44,7 +44,8 @@
 different terms).
 
 Internally, \isa{{\isasymlambda}{\isacharparenleft}x{\isacharcomma}\ y{\isacharparenright}{\isachardot}\ t} becomes \isa{split\ {\isacharparenleft}{\isasymlambda}x\ y{\isachardot}\ t{\isacharparenright}}, where
-\isa{split} is the uncurrying function of type \isa{{\isacharparenleft}{\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}b\ {\isasymRightarrow}\ {\isacharprime}c{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a\ {\isasymtimes}\ {\isacharprime}b\ {\isasymRightarrow}\ {\isacharprime}c} defined as
+\isa{split}\indexbold{*split (constant)}
+is the uncurrying function of type \isa{{\isacharparenleft}{\isacharprime}a\ {\isasymRightarrow}\ {\isacharprime}b\ {\isasymRightarrow}\ {\isacharprime}c{\isacharparenright}\ {\isasymRightarrow}\ {\isacharprime}a\ {\isasymtimes}\ {\isacharprime}b\ {\isasymRightarrow}\ {\isacharprime}c} defined as
 \begin{center}
 \isa{split\ {\isasymequiv}\ {\isasymlambda}c\ p{\isachardot}\ c\ {\isacharparenleft}fst\ p{\isacharparenright}\ {\isacharparenleft}snd\ p{\isacharparenright}}
 \hfill(\isa{split{\isacharunderscore}def})
@@ -80,7 +81,7 @@
 rule \isa{split{\isacharunderscore}split} replaces \isa{p} by a pair:%
 \end{isamarkuptext}%
 \isacommand{lemma}\ {\isachardoublequote}{\isacharparenleft}{\isasymlambda}{\isacharparenleft}x{\isacharcomma}y{\isacharparenright}{\isachardot}y{\isacharparenright}\ p\ {\isacharequal}\ snd\ p{\isachardoublequote}\isanewline
-\isacommand{apply}{\isacharparenleft}simp\ only{\isacharcolon}\ split{\isacharcolon}split{\isacharunderscore}split{\isacharparenright}%
+\isacommand{apply}{\isacharparenleft}split\ split{\isacharunderscore}split{\isacharparenright}%
 \begin{isamarkuptxt}%
 \begin{isabelle}%
 \ {\isadigit{1}}{\isachardot}\ {\isasymforall}x\ y{\isachardot}\ p\ {\isacharequal}\ {\isacharparenleft}x{\isacharcomma}\ y{\isacharparenright}\ {\isasymlongrightarrow}\ y\ {\isacharequal}\ snd\ p%
@@ -184,10 +185,10 @@
 In case you would like to turn off this automatic splitting, just disable the
 responsible simplification rules:
 \begin{center}
-\isa{{\isacharparenleft}{\isasymforall}x{\isachardot}\ {\isacharquery}P\ x{\isacharparenright}\ {\isacharequal}\ {\isacharparenleft}{\isasymforall}a\ b{\isachardot}\ {\isacharquery}P\ {\isacharparenleft}a{\isacharcomma}\ b{\isacharparenright}{\isacharparenright}}
+\isa{{\isacharparenleft}{\isasymforall}x{\isachardot}\ P\ x{\isacharparenright}\ {\isacharequal}\ {\isacharparenleft}{\isasymforall}a\ b{\isachardot}\ P\ {\isacharparenleft}a{\isacharcomma}\ b{\isacharparenright}{\isacharparenright}}
 \hfill
 (\isa{split{\isacharunderscore}paired{\isacharunderscore}All})\\
-\isa{{\isacharparenleft}{\isasymexists}x{\isachardot}\ {\isacharquery}P\ x{\isacharparenright}\ {\isacharequal}\ {\isacharparenleft}{\isasymexists}a\ b{\isachardot}\ {\isacharquery}P\ {\isacharparenleft}a{\isacharcomma}\ b{\isacharparenright}{\isacharparenright}}
+\isa{{\isacharparenleft}{\isasymexists}x{\isachardot}\ P\ x{\isacharparenright}\ {\isacharequal}\ {\isacharparenleft}{\isasymexists}a\ b{\isachardot}\ P\ {\isacharparenleft}a{\isacharcomma}\ b{\isacharparenright}{\isacharparenright}}
 \hfill
 (\isa{split{\isacharunderscore}paired{\isacharunderscore}Ex})
 \end{center}%
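
The Pairs.tex hunks above all revolve around the split combinator: its definition split_def, the case-splitting rule split_split (now invoked via the split method rather than simp only), and the simp rules split_paired_All and split_paired_Ex that may be disabled. A minimal sketch of how these fit together in a theory file, assuming the X-symbol syntax of the tutorial theories; the second lemma restates the one from the hunk, the first is an extra illustration of split_def:

  lemma "split f (a, b) = f a b"
  apply(simp add: split_def)   (* unfolds split \<equiv> \<lambda>c p. c (fst p) (snd p); fst/snd of a pair simplify *)
  done

  lemma "(\<lambda>(x, y). y) p = snd p"
  apply(split split_split)     (* replaces p by an explicit pair *)
  apply(simp)                  (* discharges \<forall>x y. p = (x, y) \<longrightarrow> y = snd p *)
  done

To switch off the automatic splitting of paired variables one would remove the two rules named above from the simpset, e.g. apply(simp del: split_paired_All split_paired_Ex).
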
--- a/doc-src/TutorialI/Types/document/Typedef.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Types/document/Typedef.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -198,7 +198,7 @@
 \isa{P\ B} and \isa{P\ C}. First we prove the analogous proposition for the
 representation:%
 \end{isamarkuptext}%
-\isacommand{lemma}\ cases{\isacharunderscore}lemma{\isacharcolon}\ {\isachardoublequote}{\isasymlbrakk}\ Q\ {\isadigit{0}}{\isacharsemicolon}\ Q\ {\isadigit{1}}{\isacharsemicolon}\ Q\ {\isadigit{2}}{\isacharsemicolon}\ n\ {\isacharcolon}\ three\ {\isasymrbrakk}\ {\isasymLongrightarrow}\ \ Q{\isacharparenleft}n{\isacharcolon}{\isacharcolon}nat{\isacharparenright}{\isachardoublequote}%
+\isacommand{lemma}\ cases{\isacharunderscore}lemma{\isacharcolon}\ {\isachardoublequote}{\isasymlbrakk}\ Q\ {\isadigit{0}}{\isacharsemicolon}\ Q\ {\isadigit{1}}{\isacharsemicolon}\ Q\ {\isadigit{2}}{\isacharsemicolon}\ n\ {\isacharcolon}\ three\ {\isasymrbrakk}\ {\isasymLongrightarrow}\ \ Q\ n{\isachardoublequote}%
 \begin{isamarkuptxt}%
 \noindent
 Expanding \isa{three{\isacharunderscore}def} yields the premise \isa{n\ {\isasymle}\ {\isadigit{2}}}. Repeated
--- a/doc-src/TutorialI/Types/numerics.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/Types/numerics.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -243,7 +243,7 @@
 operators.  The name \isa{zadd_ac} refers to the associativity and
 commutativity theorems for integer addition, while \isa{zmult_ac} has the
 analogous theorems for multiplication.  The prefix~\isa{z} in many theorem
-names recalls the use of $\Bbb{Z}$ to denote the set of integers.
+names recalls the use of $Z$ to denote the set of integers.
 
 For division and remainder, the treatment of negative divisors follows
 traditional mathematical practice: the sign of the remainder follows that
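
The zadd_ac and zmult_ac collections mentioned in this hunk are typically handed to the simplifier when a goal needs integer sums or products reordered by ordered (AC) rewriting. A minimal sketch under that assumption; only the collection names quoted above are taken from the text:

  lemma "(i::int) + (j + k) = j + (k + i)"
  apply(simp add: zadd_ac)    (* associativity and commutativity of + on int *)
  done

  lemma "(i::int) * j * k = k * (j * i)"
  apply(simp add: zmult_ac)   (* the analogous rules for * *)
  done
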
--- a/doc-src/TutorialI/appendix.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/appendix.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -96,34 +96,43 @@
 \label{fig:ascii}
 \end{figure}\indexbold{ASCII symbols}
 
-
 \begin{figure}[htbp]
 \begin{center}
-\begin{tabular}{|lllll|}
+\begin{tabular}{|lllllllll|}
 \hline
 \texttt{ALL} &
+\texttt{BIT} &
+\texttt{CHR} &
+\texttt{EX} &
+\texttt{GOAL} &
+\texttt{INT} &
+\texttt{Int} &
+\texttt{LEAST} &
+\texttt{O} \\
+\texttt{OFCLASS} &
+\texttt{PI} &
+\texttt{PROP} &
+\texttt{SIGMA} &
+\texttt{SOME} &
+\texttt{TYPE} &
+\texttt{UN} &
+\texttt{Un} &\\
 \texttt{case} &
+\texttt{choose} &
 \texttt{div} &
 \texttt{dvd} &
-\texttt{else} \\
-\texttt{EX} &
+\texttt{else} &
+\texttt{funcset} &
 \texttt{if} &
 \texttt{in} &
-\texttt{INT} &
-\texttt{Int} \\
-\texttt{LEAST} &
+\texttt{lam} \\
 \texttt{let} &
+\texttt{mem} &
 \texttt{mod} &
-\texttt{O} &
-\texttt{o} \\
+\texttt{o} &
 \texttt{of} &
 \texttt{op} &
-\texttt{PROP} &
-\texttt{SIGMA} &
-\texttt{then} \\
-\texttt{Times} &
-\texttt{UN} &
-\texttt{Un} &&\\
+\texttt{then}&&\\
 \hline
 \end{tabular}
 \end{center}
--- a/doc-src/TutorialI/fp.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/fp.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -206,7 +206,7 @@
 such that $C$ is a constructor of the datatype $t$ and all recursive calls of
 $f$ in $r$ are of the form $f \, \dots \, y@i \, \dots$ for some $i$. Thus
 Isabelle immediately sees that $f$ terminates because one (fixed!) argument
-becomes smaller with every recursive call. There must be exactly one equation
+becomes smaller with every recursive call. There must be at most one equation
 for each constructor.  Their order is immaterial.
 A more general method for defining total recursive functions is introduced in
 {\S}\ref{sec:recdef}.
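
A minimal example of the equation format this paragraph describes, assuming HOL's predefined 'a list with constructors [] and #; the recursion is on the first, fixed argument, and there is one equation per constructor:

  consts app :: "'a list \<Rightarrow> 'a list \<Rightarrow> 'a list"
  primrec
    "app []     ys = ys"
    "app (x#xs) ys = x # app xs ys"   (* the only recursive call, app xs ys, has a smaller first argument *)
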
@@ -472,7 +472,7 @@
 more advanced termination proofs are discussed in {\S}\ref{sec:beyond-measure}.
 
 \subsection{Defining recursive functions}
-
+\label{sec:recdef-examples}
 \input{Recdef/document/examples.tex}
 
 \subsection{Proving termination}
--- a/doc-src/TutorialI/todo.tobias	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/todo.tobias	Wed Dec 13 09:39:53 2000 +0100
@@ -3,8 +3,6 @@
 
 Relation: comp -> composition
 
-replace "simp only split" by "split_tac".
-
 Add map_cong?? (upto 10% slower)
 
 Recdef: Get rid of function name in header.
@@ -14,7 +12,7 @@
 -> new example in Recdef/termination
 
 a tactic for replacing a specific occurrence:
-apply(substitute [2] thm)
+apply(subst [2] thm)
 
 it would be nice if @term could deal with ?-vars.
 then a number of (unchecked!) @texts could be converted to @terms.
@@ -37,16 +35,16 @@
 Minor fixes in the tutorial
 ===========================
 
+adjust type of ^ in tab:overloading
+
 explanation of term "contrapositive"/contraposition in Rules?
 Index the notion and maybe the rules contrapos_xy
 
-Even: forward ref from problem with "Suc(Suc n) : even" to general solution in
-AdvancedInd section.
+If Advanced and Types chapters are swapped:
+Make forward ref from ... = True in Axioms to preprocessor section.
 
 get rid of use_thy in tutorial?
 
-Explain typographic conventions?
-
 Orderings on numbers (with hint that it is overloaded):
 bounded quantifiers ALL x<y, <=.
 
@@ -78,6 +76,7 @@
 ==============================================
 
 Tacticals: , ? +
+Note: + is used in typedef section!
 
 Mention that simp etc (big step tactics) insist on change?
 
@@ -87,9 +86,6 @@
 A list of further useful commands (rules? tricks?)
 prefer, defer, print_simpset (-> print_simps?)
 
-An overview of the automatic methods: simp, auto, fast, blast, force,
-clarify, clarsimp (intro, elim?)
-
 Advanced Ind expects rule_format incl (no_asm) (which it currently explains!)
 Where explained? Should go into a separate section as Inductive needs it as
 well.
@@ -124,6 +120,15 @@
 Ind. sets: define ABC inductively and prove
 ABC = {rep A n @ rep B n @ rep C n. True}
 
+Partial recursive functions / Nontermination:
+
+Exercise: ?! f. !i. f i = if i=0 then 1 else i*f(i-1)
+(What about sum? Is there one, a unique one?)
+
+Exercise
+Better(?) sum i = fst(while (%(s,i). i=0) (%(s,i). (s+i,i-1)) (0,i))
+Prove 0 <= i ==> sum i = i*(i+1) via while-rule
+
 Possible examples/case studies
 ==============================
 
@@ -145,6 +150,12 @@
 
 Sorting with comp-parameter and with type class (<)
 
+Recdef:more example proofs:
+ if-normalization with measure function,
+ nested if-normalization,
+ quicksort
+ Trie?
+
 New book by Bird?
 
 Steps Towards Mechanizing Program Transformations Using PVS by N. Shankar,
@@ -166,60 +177,7 @@
 
 A look at the library?
 Map.
-If WF is discussed, make a link to it from AdvancedInd.
 
 Prototyping?
 
 ==============================================================
-Recdef:
-
-nested recursion
-more example proofs:
- if-normalization with measure function,
- nested if-normalization,
- quicksort
- Trie?
-a case study?
-
-----------
-
-Partial rekursive functions / Nontermination
-
-What appears to be the problem:
-axiom f n = f n + 1
-lemma False
-apply(cut_facts_tac axiom, simp).
-
-1. Guarded recursion
-
-Scheme:
-f x = if $x \in dom(f)$ then ... else arbitrary
-
-Example: sum/fact: int -> int (for no good reason because we have nat)
-
-Exercise: ?! f. !i. f i = if i=0 then 1 else i*f(i-1)
-(What about sum? Is there one, a unique one?)
-
-[Alternative: include argument that is counted down
- f x n = if n=0 then None else ...
-Refer to Boyer and Moore]
-
-More complex: same_fst
-
-chase(f,x) = if wf{(f x,x) . f x ~= x}
-             then if f x = x then x else chase(f,f x)
-             else arb
-
-Prove wf ==> f(chase(f,x)) = chase(f,x)
-
-2. While / Tail recursion
-
-chase f x = fst(while (%(x,fx). x=fx) (%(x,fx). (fx,f fx)) (x,f x))
-
-==> unfold eqn for chase? Prove fixpoint property?
-
-Better(?) sum i = fst(while (%(s,i). i=0) (%(s,i). (s+i,i-1)) (0,i))
-Prove 0 <= i ==> sum i = i*(i+1) via while-rule
-
-Mention prototyping?
-==============================================================
--- a/doc-src/TutorialI/tutorial.tex	Wed Dec 13 09:32:55 2000 +0100
+++ b/doc-src/TutorialI/tutorial.tex	Wed Dec 13 09:39:53 2000 +0100
@@ -45,8 +45,10 @@
 \newcommand{\isaindexbold}[1]{\isa{#1}\index{*#1|bold}}
 \newcommand{\isaindex}[1]{\isa{#1}\index{*#1}}
 
+\index{termination|see{total function}}
 \index{product type|see{pair}}
 \index{tuple|see{pair}}
+\index{*<*lex*>|see{lexicographic product}}
 
 \underscoreoff