doc-src/TutorialI/Advanced/Partial.thy
changeset 10654 458068404143
child 10885 90695f46440b
     1.1 --- /dev/null	Thu Jan 01 00:00:00 1970 +0000
     1.2 +++ b/doc-src/TutorialI/Advanced/Partial.thy	Wed Dec 13 09:39:53 2000 +0100
     1.3 @@ -0,0 +1,213 @@
     1.4 +(*<*)theory Partial = While_Combinator:(*>*)
     1.5 +
     1.6 +text{*\noindent
     1.7 +Throughout the tutorial we have emphasized the fact that all functions
     1.8 +in HOL are total. Hence we cannot hope to define truly partial
     1.9 +functions. The best we can do is define functions that are
    1.10 +\emph{underdefined}\index{underdefined function}:
    1.11 +for certain arguments we only know that a result
    1.12 +exists, but we don't know what it is. When defining functions that are
    1.13 +normally considered partial, underdefinedness turns out to be a very
    1.14 +reasonable alternative.
    1.15 +
    1.16 +We have already seen an instance of underdefinedness by means of
    1.17 +non-exhaustive pattern matching: the definition of @{term last} in
    1.18 +\S\ref{sec:recdef-examples}. The same is allowed for \isacommand{primrec}
    1.19 +*}
    1.20 +
    1.21 +consts hd :: "'a list \<Rightarrow> 'a"
    1.22 +primrec "hd (x#xs) = x"
    1.23 +
    1.24 +text{*\noindent
    1.25 +although it generates a warning.
    1.26 +
    1.27 +Even ordinary definitions allow underdefinedness, this time by means of
    1.28 +preconditions:
    1.29 +*}
    1.30 +
    1.31 +constdefs minus :: "nat \<Rightarrow> nat \<Rightarrow> nat"
    1.32 +"n \<le> m \<Longrightarrow> minus m n \<equiv> m - n"
    1.33 +
    1.34 +text{*
    1.35 +The rest of this section is devoted to the question of how to define
     1.36 +partial recursive functions by means other than non-exhaustive pattern
    1.37 +matching.
    1.38 +*}
    1.39 +
    1.40 +subsubsection{*Guarded recursion*}
    1.41 +
     1.42 +text{* Neither \isacommand{primrec} nor \isacommand{recdef} allows us to
     1.43 +prefix an equation with a condition in the way ordinary definitions do
    1.44 +(see @{term minus} above). Instead we have to move the condition over
    1.45 +to the right-hand side of the equation. Given a partial function $f$
    1.46 +that should satisfy the recursion equation $f(x) = t$ over its domain
    1.47 +$dom(f)$, we turn this into the \isacommand{recdef}
    1.48 +@{prop[display]"f(x) = (if x \<in> dom(f) then t else arbitrary)"}
    1.49 +where @{term arbitrary} is a predeclared constant of type @{typ 'a}
    1.50 +which has no definition. Thus we know nothing about its value,
    1.51 +which is ideal for specifying underdefined functions on top of it.
    1.52 +
    1.53 +As a simple example we define division on @{typ nat}:
    1.54 +*}
    1.55 +
    1.56 +consts divi :: "nat \<times> nat \<Rightarrow> nat"
    1.57 +recdef divi "measure(\<lambda>(m,n). m)"
    1.58 +  "divi(m,n) = (if n = 0 then arbitrary else
    1.59 +                if m < n then 0 else divi(m-n,n)+1)"
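For intuition, the recursion scheme of @{term divi} can be mimicked executably. The following Python function is a hypothetical sketch, not part of the theory: it follows the same repeated-subtraction recursion, but where the Isabelle definition returns the unspecified constant @{term arbitrary} for a zero divisor, the sketch raises an exception instead.

```python
def divi(m, n):
    # Division by repeated subtraction, mirroring the recdef for divi.
    # The Isabelle version returns the unspecified constant 'arbitrary'
    # when n = 0; this executable sketch raises an exception instead.
    if n == 0:
        raise ValueError("divi(m, 0) is underdefined")
    if m < n:
        return 0
    return divi(m - n, n) + 1
```

For example, divi(7, 2) computes 3, as the measure (the first component m) strictly decreases in each recursive call.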
    1.60 +
    1.61 +text{*\noindent Of course we could also have defined
    1.62 +@{term"divi(m,0)"} to be some specific number, for example 0. The
    1.63 +latter option is chosen for the predefined @{text div} function, which
    1.64 +simplifies proofs at the expense of moving further away from the
     1.65 +standard mathematical division function.
    1.66 +
    1.67 +As a more substantial example we consider the problem of searching a graph.
    1.68 +For simplicity our graph is given by a function (@{term f}) of
    1.69 +type @{typ"'a \<Rightarrow> 'a"} which
    1.70 +maps each node to its successor, and the task is to find the end of a chain,
    1.71 +i.e.\ a node pointing to itself. Here is a first attempt:
    1.72 +@{prop[display]"find(f,x) = (if f x = x then x else find(f, f x))"}
     1.73 +This may be viewed as a fixed-point finder or as one half of the well-known
    1.74 +\emph{Union-Find} algorithm.
    1.75 +The snag is that it may not terminate if @{term f} has nontrivial cycles.
    1.76 +Phrased differently, the relation
    1.77 +*}
    1.78 +
    1.79 +constdefs step1 :: "('a \<Rightarrow> 'a) \<Rightarrow> ('a \<times> 'a)set"
    1.80 +  "step1 f \<equiv> {(y,x). y = f x \<and> y \<noteq> x}"
    1.81 +
    1.82 +text{*\noindent
    1.83 +must be well-founded. Thus we make the following definition:
    1.84 +*}
    1.85 +
    1.86 +consts find :: "('a \<Rightarrow> 'a) \<times> 'a \<Rightarrow> 'a"
    1.87 +recdef find "same_fst (\<lambda>f. wf(step1 f)) step1"
    1.88 +  "find(f,x) = (if wf(step1 f)
    1.89 +                then if f x = x then x else find(f, f x)
    1.90 +                else arbitrary)"
    1.91 +(hints recdef_simp:same_fst_def step1_def)
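For intuition only, here is a hypothetical executable Python counterpart of @{term find}. Well-foundedness of @{term"step1 f"} is not decidable in general, so this sketch approximates the guard by detecting a revisited node along the chain actually followed, which witnesses a nontrivial cycle:

```python
def find(f, x):
    # Follow f until a fixed point f(y) = y is reached.
    # The Isabelle definition guards the recursion with wf(step1 f);
    # here that guard is approximated by detecting a revisited node,
    # i.e. a nontrivial cycle along the chain starting from x.
    seen = set()
    while f(x) != x:
        if x in seen:
            raise ValueError("nontrivial cycle: step1 f not well-founded here")
        seen.add(x)
        x = f(x)
    return x
```

On a chain that ends in a self-loop the sketch returns that fixed point, so f(find(f, x)) == find(f, x) holds, matching the recursion induction lemma proved below.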
    1.92 +
    1.93 +text{*\noindent
    1.94 +The recursion equation itself should be clear enough: it is our aborted
    1.95 +first attempt augmented with a check that there are no non-trivial loops.
    1.96 +
    1.97 +What complicates the termination proof is that the argument of
    1.98 +@{term find} is a pair. To express the required well-founded relation
    1.99 +we employ the predefined combinator @{term same_fst} of type
   1.100 +@{text[display]"('a \<Rightarrow> bool) \<Rightarrow> ('a \<Rightarrow> ('b\<times>'b)set) \<Rightarrow> (('a\<times>'b) \<times> ('a\<times>'b))set"}
   1.101 +defined as
   1.102 +@{thm[display]same_fst_def[no_vars]}
   1.103 +This combinator is designed for recursive functions on pairs where the first
   1.104 +component of the argument is passed unchanged to all recursive
   1.105 +calls. Given a constraint on the first component and a relation on the second
   1.106 +component, @{term same_fst} builds the required relation on pairs.
   1.107 +The theorem @{thm[display]wf_same_fst[no_vars]}
   1.108 +is known to the well-foundedness prover of \isacommand{recdef}.
   1.109 +Thus well-foundedness of the given relation is immediate.
   1.110 +Furthermore, each recursive call descends along the given relation:
   1.111 +the first argument stays unchanged and the second one descends along
   1.112 +@{term"step1 f"}. The proof merely requires unfolding of some definitions.
   1.113 +
     1.114 +Normally you will then derive the following conditional variant of (and from)
     1.115 +the recursion equation:
   1.116 +*}
   1.117 +
   1.118 +lemma [simp]:
   1.119 +  "wf(step1 f) \<Longrightarrow> find(f,x) = (if f x = x then x else find(f, f x))"
   1.120 +by simp
   1.121 +
   1.122 +text{*\noindent and then disable the original recursion equation:*}
   1.123 +
   1.124 +declare find.simps[simp del]
   1.125 +
   1.126 +text{*
     1.127 +We can reason about such underdefined functions just as we would about any other
     1.128 +recursive function. Here is a simple example of recursion induction:
   1.129 +*}
   1.130 +
   1.131 +lemma "wf(step1 f) \<longrightarrow> f(find(f,x)) = find(f,x)"
   1.132 +apply(induct_tac f x rule:find.induct);
   1.133 +apply simp
   1.134 +done
   1.135 +
   1.136 +subsubsection{*The {\tt\slshape while} combinator*}
   1.137 +
   1.138 +text{*If the recursive function happens to be tail recursive, its
   1.139 +definition becomes a triviality if based on the predefined \isaindexbold{while}
   1.140 +combinator.  The latter lives in the library theory
   1.141 +\isa{While_Combinator}, which is not part of @{text Main} but needs to
   1.142 +be included explicitly among the ancestor theories.
   1.143 +
   1.144 +Constant @{term while} is of type @{text"('a \<Rightarrow> bool) \<Rightarrow> ('a \<Rightarrow> 'a) \<Rightarrow> 'a"}
   1.145 +and satisfies the recursion equation @{thm[display]while_unfold[no_vars]}
   1.146 +That is, @{term"while b c s"} is equivalent to the imperative program
   1.147 +\begin{verbatim}
   1.148 +     x := s; while b(x) do x := c(x); return x
   1.149 +\end{verbatim}
   1.150 +In general, @{term s} will be a tuple (better still: a record). As an example
   1.151 +consider the tail recursive variant of function @{term find} above:
   1.152 +*}
   1.153 +
   1.154 +constdefs find2 :: "('a \<Rightarrow> 'a) \<Rightarrow> 'a \<Rightarrow> 'a"
   1.155 +  "find2 f x \<equiv>
   1.156 +   fst(while (\<lambda>(x,x'). x' \<noteq> x) (\<lambda>(x,x'). (x',f x')) (x,f x))"
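To see the computational content, both the @{term while} combinator and @{term find2} can be transcribed into a Python sketch (hypothetical names, termination assumed, just as the recursion equation for @{term while} only unfolds productively for terminating loops):

```python
def while_comb(b, c, s):
    # The while combinator: iterate c while the test b holds, starting from s.
    while b(s):
        s = c(s)
    return s

def find2(f, x):
    # Tail-recursive fixed-point search, mirroring the constdefs for find2:
    # the state is the pair (current, next); stop when the two coincide,
    # then project out the first component.
    return while_comb(lambda p: p[1] != p[0],
                      lambda p: (p[1], f(p[1])),
                      (x, f(x)))[0]
```

If x is already a fixed point, the loop body is never entered and x is returned immediately.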
   1.157 +
   1.158 +text{*\noindent
   1.159 +The loop operates on two ``local variables'' @{term x} and @{term x'}
   1.160 +containing the ``current'' and the ``next'' value of function @{term f}.
     1.161 +They are initialized with the global @{term x} and @{term"f x"}. At the
   1.162 +end @{term fst} selects the local @{term x}.
   1.163 +
     1.164 +It looks as if we could define at least tail-recursive functions
     1.165 +without worrying about termination after all. But there is no free
   1.166 +lunch: when proving properties of functions defined by @{term while},
   1.167 +termination rears its ugly head again. Here is
   1.168 +@{thm[source]while_rule}, the well known proof rule for total
   1.169 +correctness of loops expressed with @{term while}:
   1.170 +@{thm[display,margin=50]while_rule[no_vars]} @{term P} needs to be
   1.171 +true of the initial state @{term s} and invariant under @{term c}
     1.172 +(premises 1 and 2). The post-condition @{term Q} must become true when
   1.173 +leaving the loop (premise 3). And each loop iteration must descend
   1.174 +along a well-founded relation @{term r} (premises 4 and 5).
   1.175 +
   1.176 +Let us now prove that @{term find2} does indeed find a fixed point. Instead
   1.177 +of induction we apply the above while rule, suitably instantiated.
   1.178 +Only the final premise of @{thm[source]while_rule} is left unproved
   1.179 +by @{text auto} but falls to @{text simp}:
   1.180 +*}
   1.181 +
   1.182 +lemma lem: "\<lbrakk> wf(step1 f); x' = f x \<rbrakk> \<Longrightarrow> \<exists>y y'.
   1.183 +   while (\<lambda>(x,x'). x' \<noteq> x) (\<lambda>(x,x'). (x',f x')) (x,x') = (y,y') \<and>
   1.184 +   y' = y \<and> f y = y"
   1.185 +apply(rule_tac P = "\<lambda>(x,x'). x' = f x" and
   1.186 +               r = "inv_image (step1 f) fst" in while_rule);
   1.187 +apply auto
   1.188 +apply(simp add:inv_image_def step1_def)
   1.189 +done
   1.190 +
   1.191 +text{*
   1.192 +The theorem itself is a simple consequence of this lemma:
   1.193 +*}
   1.194 +
   1.195 +theorem "wf(step1 f) \<Longrightarrow> f(find2 f x) = find2 f x"
   1.196 +apply(drule_tac x = x in lem)
   1.197 +apply(auto simp add:find2_def)
   1.198 +done
   1.199 +
   1.200 +text{* Let us conclude this section on partial functions by a
   1.201 +discussion of the merits of the @{term while} combinator. We have
   1.202 +already seen that the advantage (if it is one) of not having to
     1.203 +provide a termination argument when defining a function via @{term
   1.204 +while} merely puts off the evil hour. On top of that, tail recursive
   1.205 +functions tend to be more complicated to reason about. So why use
   1.206 +@{term while} at all? The only reason is executability: the recursion
   1.207 +equation for @{term while} is a directly executable functional
   1.208 +program. This is in stark contrast to guarded recursion as introduced
   1.209 +above which requires an explicit test @{prop"x \<in> dom f"} in the
   1.210 +function body.  Unless @{term dom} is trivial, this leads to a
   1.211 +definition which is either not at all executable or prohibitively
   1.212 +expensive. Thus, if you are aiming for an efficiently executable definition
   1.213 +of a partial function, you are likely to need @{term while}.
   1.214 +*}
   1.215 +
   1.216 +(*<*)end(*>*)