src/Doc/Tutorial/Advanced/Partial.thy
(*<*)theory Partial imports While_Combinator begin(*>*)

text{*\noindent Throughout this tutorial, we have emphasized
that all functions in HOL are total.  We cannot hope to define
truly partial functions, but must make them total.  A straightforward
method is to lift the result type of the function from $\tau$ to
$\tau$~@{text option} (see \ref{sec:option}), where @{term None} is
returned if the function is applied to an argument not in its
domain. Function @{term assoc} in \S\ref{sec:Trie} is a simple example.
We do not pursue this scheme further because it should be clear
how it works. Its main drawback is that the result of such a lifted
function has to be unpacked before it can be processed further.
Its main advantage is that you can distinguish whether the
function was applied to an argument in its domain or not.
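For instance, a lifted version of the head function (a hypothetical
example, not developed further in this tutorial) could be declared as
\begin{verbatim}
consts hd_opt :: "'a list => 'a option"
primrec
  "hd_opt []     = None"
  "hd_opt (x#xs) = Some x"
\end{verbatim}
Every caller must then discriminate between @{term None} and @{term Some}
before the result can be used.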
       
If you do not need to make this distinction, for example because the
function is never used outside its domain, it is easier to work with
\emph{underdefined}\index{functions!underdefined} functions: for
certain arguments we only know that a result exists, but we do not
know what it is. When defining functions that are normally considered
partial, underdefinedness turns out to be a very reasonable
alternative.

We have already seen an instance of underdefinedness by means of
non-exhaustive pattern matching: the definition of @{term last} in
\S\ref{sec:fun}. The same is allowed for \isacommand{primrec}
*}

consts hd :: "'a list \<Rightarrow> 'a"
primrec "hd (x#xs) = x"

text{*\noindent
although it generates a warning.
Even ordinary definitions allow underdefinedness, this time by means of
preconditions:
*}

definition subtract :: "nat \<Rightarrow> nat \<Rightarrow> nat" where
"n \<le> m \<Longrightarrow> subtract m n \<equiv> m - n"
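
text{*\noindent
As a quick sanity check (not part of the original text, and assuming the
defining equation is available under its default name @{text subtract_def}),
the definition behaves as expected whenever the precondition holds:
*}

lemma "subtract m 0 = m"
by (simp add: subtract_def)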
       

text{*
The rest of this section is devoted to the question of how to define
partial recursive functions by means other than non-exhaustive pattern
matching.
*}

subsubsection{*Guarded Recursion*}

text{*
\index{recursion!guarded}%
Neither \isacommand{primrec} nor \isacommand{recdef} allows us to
prefix an equation with a condition in the way ordinary definitions do
(see @{const subtract} above). Instead we have to move the condition over
to the right-hand side of the equation. Given a partial function $f$
that should satisfy the recursion equation $f(x) = t$ over its domain
$dom(f)$, we turn this into the \isacommand{recdef}
@{prop[display]"f(x) = (if x \<in> dom(f) then t else arbitrary)"}
where @{term arbitrary} is a predeclared constant of type @{typ 'a}
which has no definition. Thus we know nothing about its value,
which makes it an ideal basis for specifying underdefined functions.

As a simple example we define division on @{typ nat}:
*}

consts divi :: "nat \<times> nat \<Rightarrow> nat"
recdef divi "measure(\<lambda>(m,n). m)"
  "divi(m,0) = arbitrary"
  "divi(m,n) = (if m < n then 0 else divi(m-n,n)+1)"

text{*\noindent Of course we could also have defined
@{term"divi(m,0)"} to be some specific number, for example 0. The
latter option is chosen for the predefined @{text div} function (so that,
for instance, @{text"m div 0 = 0"} holds for natural numbers), which
simplifies proofs at the expense of deviating from the
standard mathematical division function.

As a more substantial example we consider the problem of searching a graph.
For simplicity our graph is given by a function @{term f} of
type @{typ"'a \<Rightarrow> 'a"} which
maps each node to its successor; the graph has out-degree 1.
The task is to find the end of a chain, modelled by a node pointing to
itself. Here is a first attempt:
@{prop[display]"find(f,x) = (if f x = x then x else find(f, f x))"}
This may be viewed as a fixed point finder or as the second half of the
well-known \emph{Union-Find} algorithm.
The snag is that it may not terminate if @{term f} has non-trivial cycles.
Phrased differently, the relation
*}

definition step1 :: "('a \<Rightarrow> 'a) \<Rightarrow> ('a \<times> 'a)set" where
  "step1 f \<equiv> {(y,x). y = f x \<and> y \<noteq> x}"

text{*\noindent
must be well-founded. Thus we make the following definition:
*}

consts find :: "('a \<Rightarrow> 'a) \<times> 'a \<Rightarrow> 'a"
recdef find "same_fst (\<lambda>f. wf(step1 f)) step1"
  "find(f,x) = (if wf(step1 f)
                then if f x = x then x else find(f, f x)
                else arbitrary)"
(hints recdef_simp: step1_def)

text{*\noindent
The recursion equation itself should be clear enough: it is our aborted
first attempt augmented with a check that there are no non-trivial cycles.
To express the required well-founded relation we employ the
predefined combinator @{term same_fst} of type
@{text[display]"('a \<Rightarrow> bool) \<Rightarrow> ('a \<Rightarrow> ('b\<times>'b)set) \<Rightarrow> (('a\<times>'b) \<times> ('a\<times>'b))set"}
defined as
@{thm[display]same_fst_def[no_vars]}
This combinator is designed for
recursive functions on pairs where the first component of the argument is
passed unchanged to all recursive calls. Given a constraint on the first
component and a relation on the second component, @{term same_fst} builds the
required relation on pairs.  The theorem
@{thm[display]wf_same_fst[no_vars]}
is known to the well-foundedness prover of \isacommand{recdef}.  Thus
well-foundedness of the relation given to \isacommand{recdef} is immediate.
Furthermore, each recursive call descends along that relation: the first
argument stays unchanged and the second one descends along @{term"step1 f"}.
The proof requires unfolding the definition of @{const step1},
as specified in the \isacommand{hints} above.

Normally you will then derive the following conditional variant from
the recursion equation:
*}

lemma [simp]:
  "wf(step1 f) \<Longrightarrow> find(f,x) = (if f x = x then x else find(f, f x))"
by simp

text{*\noindent Then you should disable the original recursion equation:*}

declare find.simps[simp del]

text{*
Reasoning about such underdefined functions is like that for other
recursive functions.  Here is a simple example of recursion induction:
*}

lemma "wf(step1 f) \<longrightarrow> f(find(f,x)) = find(f,x)"
apply(induct_tac f x rule: find.induct);
apply simp
done

subsubsection{*The {\tt\slshape while} Combinator*}

text{*If the recursive function happens to be tail recursive, its
definition becomes trivial when based on the predefined \cdx{while}
combinator.  The latter lives in the Library theory \thydx{While_Combinator}.
% which is not part of {text Main} but needs to
% be included explicitly among the ancestor theories.

Constant @{term while} is of type @{text"('a \<Rightarrow> bool) \<Rightarrow> ('a \<Rightarrow> 'a) \<Rightarrow> 'a"}
and satisfies the recursion equation @{thm[display]while_unfold[no_vars]}
That is, @{term"while b c s"} is equivalent to the imperative program
\begin{verbatim}
     x := s; while b(x) do x := c(x); return x
\end{verbatim}
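For instance, @{text"while (\<lambda>n::nat. n < 5) Suc 0"} describes a loop
that increments a counter from @{text 0} until it reaches @{text 5},
which is then returned (a small illustrative instance, not taken from
the original text).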
       
In general, @{term s} will be a tuple or record.  As an example
consider the following @{term while}-based version of function @{const find}:
*}

definition find2 :: "('a \<Rightarrow> 'a) \<Rightarrow> 'a \<Rightarrow> 'a" where
  "find2 f x \<equiv>
   fst(while (\<lambda>(x,x'). x' \<noteq> x) (\<lambda>(x,x'). (x',f x')) (x,f x))"

text{*\noindent
The loop operates on two ``local variables'' @{term x} and @{term x'}
containing the ``current'' and the ``next'' value of function @{term f}.
They are initialized with the global @{term x} and @{term"f x"}. At the
end @{term fst} selects the local @{term x}.
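To see the loop in action, consider a hypothetical @{term f} with
@{text"f a = b"}, @{text"f b = b"} and @{text"a \<noteq> b"}: starting from
@{text a}, the local state moves from @{text"(a, b)"} to @{text"(b, b)"},
at which point the test fails and @{term fst} returns the fixed point @{text b}.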
       

Although the definition of tail-recursive functions via @{term while} avoids
termination proofs, there is no free lunch. When proving properties of
functions defined by @{term while}, termination rears its ugly head
again. Here is \tdx{while_rule}, the well-known proof rule for total
correctness of loops expressed with @{term while}:
@{thm[display,margin=50]while_rule[no_vars]} @{term P} needs to be true of
the initial state @{term s} and invariant under @{term c} (premises 1
and~2). The post-condition @{term Q} must become true when leaving the loop
(premise~3). And each loop iteration must descend along a well-founded
relation @{term r} (premises 4 and~5).

Let us now prove that @{const find2} does indeed find a fixed point. Instead
of induction we apply the above while rule, suitably instantiated.
Only the final premise of @{thm[source]while_rule} is left unproved
by @{text auto}; it falls to @{text simp}:
*}

lemma lem: "wf(step1 f) \<Longrightarrow>
  \<exists>y. while (\<lambda>(x,x'). x' \<noteq> x) (\<lambda>(x,x'). (x',f x')) (x,f x) = (y,y) \<and>
       f y = y"
apply(rule_tac P = "\<lambda>(x,x'). x' = f x" and
               r = "inv_image (step1 f) fst" in while_rule);
apply auto
apply(simp add: inv_image_def step1_def)
done

text{*
The theorem itself is a simple consequence of this lemma:
*}

theorem "wf(step1 f) \<Longrightarrow> f(find2 f x) = find2 f x"
apply(drule_tac x = x in lem)
apply(auto simp add: find2_def)
done

text{* Let us conclude this section on partial functions with a
discussion of the merits of the @{term while} combinator. We have
already seen that the advantage of not having to provide a termination
argument when defining a function via @{term while} merely puts off the
evil hour. On top of that, tail-recursive functions tend to be more
complicated to reason about. So why use @{term while} at all? The only
reason is executability: the recursion equation for @{term while} is a
directly executable functional program. This is in stark contrast to
guarded recursion as introduced above, which requires an explicit test
@{prop"x \<in> dom f"} in the function body.  Unless @{term dom} is
trivial, this leads to a definition that is impossible to execute or
prohibitively slow. Thus, if you are aiming for an efficiently
executable definition of a partial function, you are likely to need
@{term while}.
*}
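
text{*\noindent
As a purely illustrative check of this executability claim (not part of
the original text, and assuming the code generator setup for @{term while}
provided by the @{text While_Combinator} theory), a terminating instance
of @{const find2} can be evaluated directly:
*}

value "find2 (\<lambda>n::nat. if n < 3 then Suc n else n) 0"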
       

(*<*)end(*>*)