--- a/etc/isar-keywords.el Thu Sep 12 20:53:25 2013 +0200
+++ b/etc/isar-keywords.el Thu Sep 12 20:54:26 2013 +0200
@@ -319,7 +319,6 @@
"constant"
"constrains"
"datatypes"
- "defaults"
"defines"
"file"
"fixes"
--- a/src/Doc/Datatypes/Datatypes.thy Thu Sep 12 20:53:25 2013 +0200
+++ b/src/Doc/Datatypes/Datatypes.thy Thu Sep 12 20:54:26 2013 +0200
@@ -75,7 +75,7 @@
infinitely many direct subtrees.
To use the package, it is necessary to import the @{theory BNF} theory, which
-can be precompiled into the @{text "HOL-BNF"} image. The following commands show
+can be precompiled into the \texttt{HOL-BNF} image. The following commands show
how to launch jEdit/PIDE with the image loaded and how to build the image
without launching jEdit:
*}
@@ -94,7 +94,9 @@
\footnote{If the @{text quick_and_dirty} option is enabled, some of the
internal constructions and most of the internal proof obligations are skipped.}
The package's metatheory is described in a pair of papers
-\cite{traytel-et-al-2012,blanchette-et-al-wit}.
+\cite{traytel-et-al-2012,blanchette-et-al-wit}. The central notion is that of a
+\emph{bounded natural functor} (BNF)---a well-behaved type constructor for which
+nested (co)recursion is supported.
This tutorial is organized as follows:
@@ -116,13 +118,15 @@
@{command primcorec} command.
\item Section \ref{sec:registering-bounded-natural-functors}, ``Registering
-Bounded Natural Functors,'' explains how to set up the package to allow nested
-recursion through custom well-behaved type constructors.
+Bounded Natural Functors,'' explains how to use the @{command bnf} command
+to register arbitrary type constructors as BNFs.
-\item Section \ref{sec:generating-free-constructor-theorems}, ``Generating Free
-Constructor Theorems,'' explains how to derive convenience theorems for free
-constructors using the @{command wrap_free_constructors} command, as performed
-internally by @{command datatype_new} and @{command codatatype}.
+\item Section
+\ref{sec:generating-destructors-and-theorems-for-free-constructors},
+``Generating Destructors and Theorems for Free Constructors,'' explains how to
+use the command @{command wrap_free_constructors} to derive destructor constants
+and theorems for freely generated types, as performed internally by @{command
+datatype_new} and @{command codatatype}.
\item Section \ref{sec:standard-ml-interface}, ``Standard ML Interface,''
describes the package's programmatic interface.
@@ -163,7 +167,6 @@
for its appearance. If you have ideas regarding material that should be
included, please let the authors know.
\end{framed}
-
*}
@@ -253,17 +256,17 @@
text {*
\noindent
-Nonatomic types must be enclosed in double quotes on the right-hand side of the
-equal sign, as is customary in Isabelle.
+Occurrences of nonatomic types on the right-hand side of the equal sign must be
+enclosed in double quotes, as is customary in Isabelle.
*}
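+
+text {*
+\noindent
+For instance, in the hypothetical definition below, the recursive occurrences
+of @{text "'a tree"} are nonatomic and hence quoted, whereas the atomic
+@{typ 'a} is not:
+*}
+
+ datatype_new 'a tree = Leaf | Node 'a "'a tree" "'a tree"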
subsubsection {* Mutual Recursion *}
text {*
-\emph{Mutually recursive} types are introduced simultaneously and may refer to each
-other. The example below introduces a pair of types for even and odd natural
-numbers:
+\emph{Mutually recursive} types are introduced simultaneously and may refer to
+each other. The example below introduces a pair of types for even and odd
+natural numbers:
*}
datatype_new enat = EZero | ESuc onat
@@ -321,6 +324,12 @@
@{text 'a\<^sub>m}. These type arguments are called \emph{live}; the remaining
type arguments are called \emph{dead}. In @{typ "'a \<Rightarrow> 'b"} and
@{typ "('a, 'b) fn"}, the type variable @{typ 'a} is dead and @{typ 'b} is live.
+
+Type constructors must be registered as bounded natural functors (BNFs) to have
+live arguments. This is done automatically for datatypes and codatatypes
+introduced by the @{command datatype_new} and @{command codatatype} commands.
+Section~\ref{sec:registering-bounded-natural-functors} explains how to register
+arbitrary type constructors as BNFs.
*}
@@ -399,9 +408,9 @@
discriminator associated with @{const Cons} is simply
@{term "\<lambda>xs. \<not> null xs"}.
-The @{text "defaults"} keyword following the @{const Nil} constructor specifies
-a default value for selectors associated with other constructors. Here, it is
-used to ensure that the tail of the empty list is itself (instead of being left
+The @{text defaults} clause following the @{const Nil} constructor specifies a
+default value for selectors associated with other constructors. Here, it is used
+to ensure that the tail of the empty list is itself (instead of being left
unspecified).
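+As a concrete sketch, such a clause might read @{text "(defaults tl: Nil)"},
+making @{term "tl []"} equal to @{term "[]"}.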
Because @{const Nil} is a nullary constructor, it is also possible to use
@@ -418,8 +427,9 @@
(*<*)
end
(*>*)
- datatype_new ('a, 'b) prod (infixr "*" 20) =
- Pair 'a 'b
+ datatype_new ('a, 'b) prod (infixr "*" 20) = Pair 'a 'b
+
+text {* \blankline *}
datatype_new (set: 'a) list (map: map rel: list_all2) =
null: Nil ("[]")
@@ -432,6 +442,8 @@
syntax "_list" :: "args \<Rightarrow> 'a list" ("[(_)]")
+text {* \blankline *}
+
translations
"[x, xs]" == "x # [xs]"
"[x]" == "x # []"
@@ -502,7 +514,8 @@
\noindent
The main constituents of a constructor specification are the name of the
constructor and the list of its argument types. An optional discriminator name
-can be supplied at the front to override the default name (@{text t.un_Cji}).
+can be supplied at the front to override the default name
+(@{text t.is_C\<^sub>j}).
@{rail "
@{syntax_def ctor_arg}: type | '(' name ':' type ')'
@@ -513,8 +526,8 @@
\noindent
In addition to the type of a constructor argument, it is possible to specify a
name for the corresponding selector to override the default name
-(@{text t.un_C}$_{ij}$). The same selector names can be reused for several
-constructors as long as they have the same type.
+(@{text un_C\<^sub>ji}). The same selector names can be reused for several
+constructors as long as they share the same type.
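+For example, in a hypothetical specification with the constructors
+@{text "A (lab: 'a)"} and @{text "B (lab: 'a)"}, both reuse the selector
+name @{text lab} at the same type.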
@{rail "
@{syntax_def dt_sel_defaults}: '(' @'defaults' (name ':' term +) ')'
@@ -542,16 +555,19 @@
and destructors that can be derived for any freely generated type. Internally,
the derivation is performed by @{command wrap_free_constructors}.
-\item The \emph{inductive theorems} are properties of datatypes that rely on
+\item The \emph{functorial theorems} are properties of datatypes related to
+their BNF nature.
+
+\item The \emph{inductive theorems} are properties of datatypes related to
their inductive nature.
+
\end{itemize}
\noindent
The full list of named theorems can be obtained as usual by entering the
command \keyw{print\_theorems} immediately after the datatype definition.
This list normally excludes low-level theorems that reveal internal
-constructions. To make these accessible, add the following line to the top of your
-theory file:
+constructions. To make these accessible, add the line
*}
declare [[bnf_note_all]]
@@ -559,6 +575,10 @@
declare [[bnf_note_all = false]]
(*>*)
+text {*
+\noindent
+to the top of the theory file.
+*}
subsubsection {* Free Constructor Theorems *}
@@ -570,134 +590,157 @@
The first subgroup of properties is concerned with the constructors.
They are listed below for @{typ "'a list"}:
+\begin{indentblock}
\begin{description}
-\item[@{text "t."}\hthm{inject} @{text "[iff, induct_simp]"}:] ~ \\
+\item[@{text "t."}\hthm{inject} @{text "[iff, induct_simp]"}\upshape:] ~ \\
@{thm list.inject[no_vars]}
-\item[@{text "t."}\hthm{distinct} @{text "[simp, induct_simp]"}:] ~ \\
+\item[@{text "t."}\hthm{distinct} @{text "[simp, induct_simp]"}\upshape:] ~ \\
@{thm list.distinct(1)[no_vars]} \\
@{thm list.distinct(2)[no_vars]}
-\item[@{text "t."}\hthm{exhaust} @{text "[cases t, case_names C\<^sub>1 \<dots> C\<^sub>n]"}:] ~ \\
+\item[@{text "t."}\hthm{exhaust} @{text "[cases t, case_names C\<^sub>1 \<dots> C\<^sub>n]"}\upshape:] ~ \\
@{thm list.exhaust[no_vars]}
-\item[@{text "t."}\hthm{nchotomy}:] ~ \\
+\item[@{text "t."}\hthm{nchotomy}\upshape:] ~ \\
@{thm list.nchotomy[no_vars]}
\end{description}
+\end{indentblock}
\noindent
The next subgroup is concerned with the case combinator:
+\begin{indentblock}
\begin{description}
-\item[@{text "t."}\hthm{case} @{text "[simp]"}:] ~ \\
+\item[@{text "t."}\hthm{case} @{text "[simp]"}\upshape:] ~ \\
@{thm list.case(1)[no_vars]} \\
@{thm list.case(2)[no_vars]}
-\item[@{text "t."}\hthm{case\_cong}:] ~ \\
+\item[@{text "t."}\hthm{case\_cong}\upshape:] ~ \\
@{thm list.case_cong[no_vars]}
-\item[@{text "t."}\hthm{weak\_case\_cong} @{text "[cong]"}:] ~ \\
+\item[@{text "t."}\hthm{weak\_case\_cong} @{text "[cong]"}\upshape:] ~ \\
@{thm list.weak_case_cong[no_vars]}
-\item[@{text "t."}\hthm{split}:] ~ \\
+\item[@{text "t."}\hthm{split}\upshape:] ~ \\
@{thm list.split[no_vars]}
-\item[@{text "t."}\hthm{split\_asm}:] ~ \\
+\item[@{text "t."}\hthm{split\_asm}\upshape:] ~ \\
@{thm list.split_asm[no_vars]}
\item[@{text "t."}\hthm{splits} = @{text "split split_asm"}]
\end{description}
+\end{indentblock}
\noindent
The third and last subgroup revolves around discriminators and selectors:
+\begin{indentblock}
\begin{description}
-\item[@{text "t."}\hthm{discs} @{text "[simp]"}:] ~ \\
+\item[@{text "t."}\hthm{discs} @{text "[simp]"}\upshape:] ~ \\
@{thm list.discs(1)[no_vars]} \\
@{thm list.discs(2)[no_vars]}
-\item[@{text "t."}\hthm{sels} @{text "[simp]"}:] ~ \\
+\item[@{text "t."}\hthm{sels} @{text "[simp]"}\upshape:] ~ \\
@{thm list.sels(1)[no_vars]} \\
@{thm list.sels(2)[no_vars]}
-\item[@{text "t."}\hthm{collapse} @{text "[simp]"}:] ~ \\
+\item[@{text "t."}\hthm{collapse} @{text "[simp]"}\upshape:] ~ \\
@{thm list.collapse(1)[no_vars]} \\
@{thm list.collapse(2)[no_vars]}
-\item[@{text "t."}\hthm{disc\_exclude}:] ~ \\
+\item[@{text "t."}\hthm{disc\_exclude}\upshape:] ~ \\
These properties are missing for @{typ "'a list"} because there is only one
proper discriminator. Had the datatype been introduced with a second
discriminator called @{const is_Cons}, they would have read thusly: \\[\jot]
@{prop "null list \<Longrightarrow> \<not> is_Cons list"} \\
@{prop "is_Cons list \<Longrightarrow> \<not> null list"}
-\item[@{text "t."}\hthm{disc\_exhaust} @{text "[case_names C\<^sub>1 \<dots> C\<^sub>n]"}:] ~ \\
+\item[@{text "t."}\hthm{disc\_exhaust} @{text "[case_names C\<^sub>1 \<dots> C\<^sub>n]"}\upshape:] ~ \\
@{thm list.disc_exhaust[no_vars]}
-\item[@{text "t."}\hthm{expand}:] ~ \\
+\item[@{text "t."}\hthm{expand}\upshape:] ~ \\
@{thm list.expand[no_vars]}
-\item[@{text "t."}\hthm{case\_conv}:] ~ \\
+\item[@{text "t."}\hthm{case\_conv}\upshape:] ~ \\
@{thm list.case_conv[no_vars]}
\end{description}
+\end{indentblock}
+*}
+
+
+subsubsection {* Functorial Theorems *}
+
+text {*
+The BNF-related theorems are listed below:
+
+\begin{indentblock}
+\begin{description}
+
+\item[@{text "t."}\hthm{sets} @{text "[code]"}\upshape:] ~ \\
+@{thm list.sets(1)[no_vars]} \\
+@{thm list.sets(2)[no_vars]}
+
+\item[@{text "t."}\hthm{map} @{text "[code]"}\upshape:] ~ \\
+@{thm list.map(1)[no_vars]} \\
+@{thm list.map(2)[no_vars]}
+
+\item[@{text "t."}\hthm{rel\_inject} @{text "[code]"}\upshape:] ~ \\
+@{thm list.rel_inject(1)[no_vars]} \\
+@{thm list.rel_inject(2)[no_vars]}
+
+\item[@{text "t."}\hthm{rel\_distinct} @{text "[code]"}\upshape:] ~ \\
+@{thm list.rel_distinct(1)[no_vars]} \\
+@{thm list.rel_distinct(2)[no_vars]}
+
+\end{description}
+\end{indentblock}
*}
subsubsection {* Inductive Theorems *}
text {*
+The inductive theorems are listed below:
+\begin{indentblock}
\begin{description}
-\item[@{text "t."}\hthm{induct} @{text "[induct t, case_names C\<^sub>1 \<dots> C\<^sub>n]"}:] ~ \\
+\item[@{text "t."}\hthm{induct} @{text "[induct t, case_names C\<^sub>1 \<dots> C\<^sub>n]"}\upshape:] ~ \\
@{thm list.induct[no_vars]}
-\item[@{text "t\<^sub>1_\<dots>_t\<^sub>m."}\hthm{induct}: @{text "[case_names C\<^sub>1 \<dots> C\<^sub>n]"}:] ~ \\
+\item[@{text "t\<^sub>1_\<dots>_t\<^sub>m."}\hthm{induct} @{text "[case_names C\<^sub>1 \<dots> C\<^sub>n]"}\upshape:] ~ \\
Given $m > 1$ mutually recursive datatypes, this induction rule can be used to
prove $m$ properties simultaneously.
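+For example, for the mutually recursive types @{text enat} and @{text onat}
+introduced above, the rule is called @{text enat_onat.induct}.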
-\item[@{text "t."}\hthm{fold} @{text "[code]"}:] ~ \\
+\item[@{text "t."}\hthm{fold} @{text "[code]"}\upshape:] ~ \\
@{thm list.fold(1)[no_vars]} \\
@{thm list.fold(2)[no_vars]}
-\item[@{text "t."}\hthm{rec} @{text "[code]"}:] ~ \\
+\item[@{text "t."}\hthm{rec} @{text "[code]"}\upshape:] ~ \\
@{thm list.rec(1)[no_vars]} \\
@{thm list.rec(2)[no_vars]}
-\item[@{text "t."}\hthm{sets}: @{text "[code]"}] ~ \\
-@{thm list.sets(1)[no_vars]} \\
-@{thm list.sets(2)[no_vars]}
-
-\item[@{text "t."}\hthm{map}: @{text "[code]"}] ~ \\
-@{thm list.map(1)[no_vars]} \\
-@{thm list.map(2)[no_vars]}
-
-\item[@{text "t."}\hthm{rel\_inject} @{text "[code]"}:] ~ \\
-@{thm list.rel_inject(1)[no_vars]} \\
-@{thm list.rel_inject(2)[no_vars]}
-
-\item[@{text "t."}\hthm{rel\_distinct} @{text "[code]"}:] ~ \\
-@{thm list.rel_distinct(1)[no_vars]} \\
-@{thm list.rel_distinct(2)[no_vars]}
-
\end{description}
+\end{indentblock}
\noindent
For convenience, @{command datatype_new} also provides the following collection:
+\begin{indentblock}
\begin{description}
\item[@{text "t."}\hthm{simps} = @{text t.inject} @{text t.distinct} @{text t.case} @{text t.rec} @{text t.fold} @{text t.map} @{text t.rel_inject}] ~ \\
@{text t.rel_distinct} @{text t.sets}
\end{description}
-
+\end{indentblock}
*}
@@ -1170,7 +1213,6 @@
\label{ssec:bnf-syntax} *}
text {*
-
@{rail "
@@{command_def bnf} target? (name ':')? term \\
term_list term term_list term?
@@ -1181,8 +1223,8 @@
options: no_discs_sels rep_compat
*}
-section {* Generating Free Constructor Theorems
- \label{sec:generating-free-constructor-theorems} *}
+section {* Generating Destructors and Theorems for Free Constructors
+ \label{sec:generating-destructors-and-theorems-for-free-constructors} *}
text {*
This section explains how to derive convenience theorems for free constructors,
@@ -1293,6 +1335,7 @@
* no recursion through unused arguments (unlike with the old package)
+* in a locale, cannot use locally fixed types (because of limitation in typedef)?
*}
--- a/src/Doc/Datatypes/document/root.tex Thu Sep 12 20:53:25 2013 +0200
+++ b/src/Doc/Datatypes/document/root.tex Thu Sep 12 20:54:26 2013 +0200
@@ -13,10 +13,17 @@
\usepackage{railsetup}
\usepackage{framed}
+\setcounter{secnumdepth}{3}
+\setcounter{tocdepth}{3}
+
\newbox\boxA
\setbox\boxA=\hbox{\ }
\parindent=4\wd\boxA
+\newcommand\blankline{\vskip-.5\baselineskip}
+
+\newenvironment{indentblock}{\list{}{}\item[]}{\endlist}
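+% Usage sketch: \begin{indentblock}...\end{indentblock} indents its contents
+% like a quotation; the Datatypes chapter wraps its theorem lists in it.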
+
\newcommand{\keyw}[1]{\isacommand{#1}}
\newcommand{\synt}[1]{\textit{#1}}
\newcommand{\hthm}[1]{\textbf{\textit{#1}}}
--- a/src/HOL/BNF/BNF_Def.thy Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/BNF/BNF_Def.thy Thu Sep 12 20:54:26 2013 +0200
@@ -89,6 +89,9 @@
lemma eq_OOI: "R = op = \<Longrightarrow> R = R OO R"
by auto
+lemma OO_Grp_alt: "(Grp A f)^--1 OO Grp A g = (\<lambda>x y. \<exists>z. z \<in> A \<and> f z = x \<and> g z = y)"
+ unfolding Grp_def by auto
+
lemma Grp_UNIV_id: "f = id \<Longrightarrow> (Grp UNIV f)^--1 OO Grp UNIV f = Grp UNIV f"
unfolding Grp_def by auto
@@ -110,10 +113,6 @@
lemma Collect_split_Grp_inD: "z \<in> Collect (split (Grp A f)) \<Longrightarrow> fst z \<in> A"
unfolding Grp_def o_def by auto
-lemma wpull_Grp:
-"wpull (Collect (split (Grp A f))) A (f ` A) f id fst snd"
-unfolding wpull_def Grp_def by auto
-
definition "pick_middlep P Q a c = (SOME b. P a b \<and> Q b c)"
lemma pick_middlep:
--- a/src/HOL/BNF/BNF_FP_Base.thy Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/BNF/BNF_FP_Base.thy Thu Sep 12 20:54:26 2013 +0200
@@ -11,8 +11,6 @@
theory BNF_FP_Base
imports BNF_Comp BNF_Ctr_Sugar
-keywords
- "defaults"
begin
lemma mp_conj: "(P \<longrightarrow> Q) \<and> R \<Longrightarrow> P \<Longrightarrow> R \<and> Q"
--- a/src/HOL/BNF/BNF_Util.thy Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/BNF/BNF_Util.thy Thu Sep 12 20:54:26 2013 +0200
@@ -47,16 +47,9 @@
lemma bijI: "\<lbrakk>\<And>x y. (f x = f y) = (x = y); \<And>y. \<exists>x. y = f x\<rbrakk> \<Longrightarrow> bij f"
unfolding bij_def inj_on_def by auto blast
-lemma pair_mem_Collect_split:
-"(\<lambda>x y. (x, y) \<in> {(x, y). P x y}) = P"
-by simp
-
lemma Collect_pair_mem_eq: "{(x, y). (x, y) \<in> R} = R"
by simp
-lemma Collect_fst_snd_mem_eq: "{p. (fst p, snd p) \<in> A} = A"
-by simp
-
(* Operator: *)
definition "Gr A f = {(a, f a) | a. a \<in> A}"
--- a/src/HOL/BNF/Tools/bnf_def.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/BNF/Tools/bnf_def.ML Thu Sep 12 20:54:26 2013 +0200
@@ -770,7 +770,7 @@
val bnf_wit_As = map (apsnd (mk_bnf_t As')) bnf_wits;
val pre_names_lthy = lthy;
- val (((((((((((((((((((((((((fs, gs), hs), x), y), (z, z')), zs), ys), As),
+ val ((((((((((((((((((((((((fs, gs), hs), x), y), zs), ys), As),
As_copy), Xs), B1s), B2s), f1s), f2s), e1s), e2s), p1s), p2s), bs), (Rs, Rs')), Rs_copy), Ss),
transfer_domRs), transfer_ranRs), names_lthy) = pre_names_lthy
|> mk_Frees "f" (map2 (curry op -->) As' Bs')
@@ -778,7 +778,6 @@
||>> mk_Frees "h" (map2 (curry op -->) As' Ts)
||>> yield_singleton (mk_Frees "x") CA'
||>> yield_singleton (mk_Frees "y") CB'
- ||>> yield_singleton (apfst (op ~~) oo mk_Frees' "z") CRs'
||>> mk_Frees "z" As'
||>> mk_Frees "y" Bs'
||>> mk_Frees "A" (map HOLogic.mk_setT As')
@@ -1093,7 +1092,8 @@
val map_wppull = Lazy.lazy mk_map_wppull;
- val rel_OO_Grps = no_refl [#rel_OO_Grp axioms];
+ val rel_OO_Grp = #rel_OO_Grp axioms;
+ val rel_OO_Grps = no_refl [rel_OO_Grp];
fun mk_rel_Grp () =
let
@@ -1182,23 +1182,7 @@
val rel_OO = Lazy.lazy mk_rel_OO;
- fun mk_in_rel () =
- let
- val bnf_in = mk_in setRs (map (mk_bnf_t RTs) bnf_sets) CRs';
- val map1 = Term.list_comb (mk_bnf_map RTs As', map fst_const RTs);
- val map2 = Term.list_comb (mk_bnf_map RTs Bs', map snd_const RTs);
- val map_fst_eq = HOLogic.mk_eq (map1 $ z, x);
- val map_snd_eq = HOLogic.mk_eq (map2 $ z, y);
- val lhs = Term.list_comb (rel, Rs) $ x $ y;
- val rhs =
- HOLogic.mk_exists (fst z', snd z', HOLogic.mk_conj (HOLogic.mk_mem (z, bnf_in),
- HOLogic.mk_conj (map_fst_eq, map_snd_eq)));
- val goal =
- fold_rev Logic.all (x :: y :: Rs) (mk_Trueprop_eq (lhs, rhs));
- in
- Goal.prove_sorry lthy [] [] goal (mk_in_rel_tac (the_single rel_OO_Grps))
- |> Thm.close_derivation
- end;
+ fun mk_in_rel () = trans OF [rel_OO_Grp, @{thm OO_Grp_alt}] RS @{thm predicate2_eqD};
val in_rel = Lazy.lazy mk_in_rel;
--- a/src/HOL/BNF/Tools/bnf_def_tactics.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/BNF/Tools/bnf_def_tactics.ML Thu Sep 12 20:54:26 2013 +0200
@@ -21,7 +21,6 @@
val mk_rel_eq_tac: int -> thm -> thm -> thm -> tactic
val mk_rel_OO_tac: thm list -> thm -> thm -> thm -> thm -> thm list ->
{prems: thm list, context: Proof.context} -> tactic
- val mk_in_rel_tac: thm -> {prems: 'a, context: Proof.context} -> tactic
val mk_rel_conversep_tac: thm -> thm -> tactic
val mk_rel_conversep_le_tac: thm list -> thm -> thm -> thm -> thm list ->
{prems: thm list, context: Proof.context} -> tactic
@@ -209,13 +208,6 @@
rtac (map_comp0 RS sym), atac, atac]) [@{thm fst_fstOp}, @{thm snd_sndOp}])] 1
end;
-fun mk_in_rel_tac rel_OO_Gr {context = ctxt, prems = _} =
- EVERY' [rtac (rel_OO_Gr RS fun_cong RS fun_cong RS trans), rtac iffI,
- REPEAT_DETERM o eresolve_tac [@{thm GrpE}, @{thm relcomppE}, @{thm conversepE}],
- hyp_subst_tac ctxt, rtac exI, rtac conjI, atac, rtac conjI, rtac refl, rtac refl,
- REPEAT_DETERM o eresolve_tac [exE, conjE], rtac @{thm relcomppI}, rtac @{thm conversepI},
- etac @{thm GrpI}, atac, etac @{thm GrpI}, atac] 1;
-
fun mk_rel_mono_strong_tac in_rel set_map0s {context = ctxt, prems = _} =
if null set_map0s then atac 1
else
@@ -230,16 +222,18 @@
{context = ctxt, prems = _} =
let
val n = length set_maps;
+ val in_tac = if n = 0 then rtac UNIV_I else
+ rtac CollectI THEN' CONJ_WRAP' (fn thm =>
+ etac (thm RS
+ @{thm ord_eq_le_trans[OF _ subset_trans[OF image_mono convol_image_vimage2p]]}))
+ set_maps;
in
REPEAT_DETERM_N n (HEADGOAL (rtac @{thm fun_relI})) THEN
unfold_thms_tac ctxt @{thms fun_rel_iff_leq_vimage2p} THEN
HEADGOAL (EVERY' [rtac @{thm order_trans}, rtac rel_mono, REPEAT_DETERM_N n o atac,
rtac @{thm predicate2I}, dtac (in_rel RS iffD1),
REPEAT_DETERM o eresolve_tac [exE, CollectE, conjE], hyp_subst_tac ctxt,
- rtac @{thm vimage2pI}, rtac (in_rel RS iffD2), rtac exI, rtac conjI, rtac CollectI,
- CONJ_WRAP' (fn thm =>
- etac (thm RS @{thm ord_eq_le_trans[OF _ subset_trans[OF image_mono convol_image_vimage2p]]}))
- set_maps,
+ rtac @{thm vimage2pI}, rtac (in_rel RS iffD2), rtac exI, rtac conjI, in_tac,
rtac conjI,
EVERY' (map (fn convol =>
rtac (box_equals OF [map_cong0, map_comp RS sym, map_comp RS sym]) THEN'
--- a/src/HOL/BNF/Tools/bnf_fp_def_sugar.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/BNF/Tools/bnf_fp_def_sugar.ML Thu Sep 12 20:54:26 2013 +0200
@@ -542,9 +542,12 @@
let
val thy = Proof_Context.theory_of lthy0;
+ val maybe_conceal_def_binding = Thm.def_binding
+ #> Config.get lthy0 bnf_note_all = false ? Binding.conceal;
+
val ((csts, defs), (lthy', lthy)) = lthy0
|> apfst split_list o fold_map (fn (b, spec) =>
- Specification.definition (SOME (b, NONE, NoSyn), ((Thm.def_binding b, []), spec))
+ Specification.definition (SOME (b, NONE, NoSyn), ((maybe_conceal_def_binding b, []), spec))
#>> apsnd snd) binding_specs
||> `Local_Theory.restore;
@@ -1250,9 +1253,12 @@
map3 (fn k => fn xs => fn tuple_x => fold_rev Term.lambda xs (ctor $
mk_InN_balanced ctr_sum_prod_T n tuple_x k)) ks xss tuple_xs;
+ val maybe_conceal_def_binding = Thm.def_binding
+ #> Config.get no_defs_lthy bnf_note_all = false ? Binding.conceal;
+
val ((raw_ctrs, raw_ctr_defs), (lthy', lthy)) = no_defs_lthy
|> apfst split_list o fold_map3 (fn b => fn mx => fn rhs =>
- Local_Theory.define ((b, mx), ((Thm.def_binding b, []), rhs)) #>> apsnd snd)
+ Local_Theory.define ((b, mx), ((maybe_conceal_def_binding b, []), rhs)) #>> apsnd snd)
ctr_bindings ctr_mixfixes ctr_rhss
||> `Local_Theory.restore;
@@ -1538,7 +1544,7 @@
(Parse.typ >> pair Binding.empty);
val parse_defaults =
- @{keyword "("} |-- @{keyword "defaults"} |-- Scan.repeat parse_bound_term --| @{keyword ")"};
+ @{keyword "("} |-- Parse.reserved "defaults" |-- Scan.repeat parse_bound_term --| @{keyword ")"};
val parse_type_arg_constrained =
Parse.type_ident -- Scan.option (@{keyword "::"} |-- Parse.!!! Parse.sort);
@@ -1554,8 +1560,6 @@
val no_map_rel = (Binding.empty, Binding.empty);
-(* "map" and "rel" are purposedly not registered as keywords, because they are short and nice names
- that we don't want them to be highlighted everywhere. *)
fun extract_map_rel ("map", b) = apfst (K b)
| extract_map_rel ("rel", b) = apsnd (K b)
| extract_map_rel (s, _) = error ("Unknown label " ^ quote s ^ " (expected \"map\" or \"rel\")");
--- a/src/HOL/BNF/Tools/bnf_fp_def_sugar_tactics.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/BNF/Tools/bnf_fp_def_sugar_tactics.ML Thu Sep 12 20:54:26 2013 +0200
@@ -152,7 +152,7 @@
full_simp_tac
(ss_only (@{thm prod.inject} :: no_refl discs @ rel_eqs @ more_simp_thms) ctxt) THEN'
REPEAT o etac conjE THEN_MAYBE' REPEAT o hyp_subst_tac ctxt THEN' REPEAT o rtac conjI THEN'
- REPEAT o rtac refl);
+ REPEAT o (rtac refl ORELSE' atac));
fun mk_coinduct_distinct_ctrs_tac ctxt discs discs' =
hyp_subst_tac ctxt THEN' REPEAT o etac conjE THEN'
--- a/src/HOL/BNF/Tools/bnf_fp_util.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/BNF/Tools/bnf_fp_util.ML Thu Sep 12 20:54:26 2013 +0200
@@ -587,13 +587,15 @@
in
Binding.prefix_name rawN
#> fold_rev (fn (s, mand) => Binding.qualify mand s) (qs @ [(n, true)])
+ #> Binding.conceal
end;
val ((bnfs, (deadss, livess)), (unfold_set, lthy)) = apfst (apsnd split_list o split_list)
(fold_map2 (fn b => bnf_of_typ Smart_Inline (raw_qualify b) fp_sort Xs) bs rhsXs
(empty_unfolds, lthy));
- fun norm_qualify i = Binding.qualify true (Binding.name_of (nth bs (Int.max (0, i - 1))));
+ fun norm_qualify i = Binding.qualify true (Binding.name_of (nth bs (Int.max (0, i - 1))))
+ #> Binding.conceal;
val Ass = map (map dest_TFree) livess;
val resDs = fold (subtract (op =)) Ass resBs;
@@ -606,7 +608,8 @@
val Dss = map3 (append oo map o nth) livess kill_poss deadss;
- val pre_qualify = Binding.qualify false o Binding.name_of;
+ fun pre_qualify b = Binding.qualify false (Binding.name_of b)
+ #> Config.get lthy' bnf_note_all = false ? Binding.conceal;
val ((pre_bnfs, deadss), lthy'') =
fold_map3 (fn b => seal_bnf (pre_qualify b) unfold_set' (Binding.prefix_name preN b))
--- a/src/HOL/BNF/Tools/bnf_gfp.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/BNF/Tools/bnf_gfp.ML Thu Sep 12 20:54:26 2013 +0200
@@ -66,13 +66,17 @@
val ks = 1 upto n;
val m = live - n; (*passive, if 0 don't generate a new BNF*)
val ls = 1 upto m;
+
+ val note_all = Config.get lthy bnf_note_all;
val b_names = map Binding.name_of bs;
- val common_name = mk_common_name b_names;
- val b = Binding.name common_name;
- val internal_b = Binding.prefix true common_name b;
- fun qualify_bs internal = map2 (Binding.prefix internal) b_names bs;
- val internal_bs = qualify_bs true;
- val external_bs = qualify_bs false;
+ val b_name = mk_common_name b_names;
+ val b = Binding.name b_name;
+ val mk_internal_b = Binding.name #> Binding.prefix true b_name #> Binding.conceal;
+ fun mk_internal_bs name =
+ map (fn b =>
+ Binding.prefix true b_name (Binding.suffix_name ("_" ^ name) b) |> Binding.conceal) bs;
+ val external_bs = map2 (Binding.prefix false) b_names bs
+ |> note_all = false ? map Binding.conceal;
(* TODO: check if m, n, etc., are sane *)
@@ -297,7 +301,7 @@
(* coalgebra *)
- val coalg_bind = Binding.suffix_name ("_" ^ coN ^ algN) internal_b;
+ val coalg_bind = mk_internal_b (coN ^ algN);
val coalg_name = Binding.name_of coalg_bind;
val coalg_def_bind = (Thm.def_binding coalg_bind, []);
@@ -373,7 +377,7 @@
(* morphism *)
- val mor_bind = Binding.suffix_name ("_" ^ morN) internal_b;
+ val mor_bind = mk_internal_b morN;
val mor_name = Binding.name_of mor_bind;
val mor_def_bind = (Thm.def_binding mor_bind, []);
@@ -518,8 +522,7 @@
val timer = time (timer "Morphism definition & thms");
- fun hset_rec_bind j = internal_b
- |> Binding.suffix_name ("_" ^ hset_recN ^ (if m = 1 then "" else string_of_int j)) ;
+ fun hset_rec_bind j = mk_internal_b (hset_recN ^ (if m = 1 then "" else string_of_int j));
val hset_rec_name = Binding.name_of o hset_rec_bind;
val hset_rec_def_bind = rpair [] o Thm.def_binding o hset_rec_bind;
@@ -573,8 +576,8 @@
val hset_rec_0ss' = transpose hset_rec_0ss;
val hset_rec_Sucss' = transpose hset_rec_Sucss;
- fun hset_bind i j = nth internal_bs (i - 1)
- |> Binding.suffix_name ("_" ^ hsetN ^ (if m = 1 then "" else string_of_int j));
+ fun hset_binds j = mk_internal_bs (hsetN ^ (if m = 1 then "" else string_of_int j));
+ fun hset_bind i j = nth (hset_binds j) (i - 1);
val hset_name = Binding.name_of oo hset_bind;
val hset_def_bind = rpair [] o Thm.def_binding oo hset_bind;
@@ -741,7 +744,7 @@
(* bisimulation *)
- val bis_bind = Binding.suffix_name ("_" ^ bisN) internal_b;
+ val bis_bind = mk_internal_b bisN;
val bis_name = Binding.name_of bis_bind;
val bis_def_bind = (Thm.def_binding bis_bind, []);
@@ -885,7 +888,8 @@
(* largest self-bisimulation *)
- fun lsbis_bind i = nth internal_bs (i - 1) |> Binding.suffix_name ("_" ^ lsbisN);
+ val lsbis_binds = mk_internal_bs lsbisN;
+ fun lsbis_bind i = nth lsbis_binds (i - 1);
val lsbis_name = Binding.name_of o lsbis_bind;
val lsbis_def_bind = rpair [] o Thm.def_binding o lsbis_bind;
@@ -970,8 +974,7 @@
then (lthy, sum_bd, sum_bdT, bd_card_order, bd_Cinfinite, bd_Card_order, set_bdss)
else
let
- val sbdT_bind =
- Binding.qualify false (Binding.name_of b) (Binding.suffix_name ("_" ^ sum_bdTN) b);
+ val sbdT_bind = mk_internal_b sum_bdTN;
val ((sbdT_name, (sbdT_glob_info, sbdT_loc_info)), lthy) =
typedef (sbdT_bind, dead_params, NoSyn)
@@ -980,7 +983,7 @@
val sbdT = Type (sbdT_name, dead_params');
val Abs_sbdT = Const (#Abs_name sbdT_glob_info, sum_bdT --> sbdT);
- val sbd_bind = Binding.suffix_name ("_" ^ sum_bdN) internal_b;
+ val sbd_bind = mk_internal_b sum_bdN;
val sbd_name = Binding.name_of sbd_bind;
val sbd_def_bind = (Thm.def_binding sbd_bind, []);
@@ -1076,7 +1079,8 @@
(* tree coalgebra *)
- fun isNode_bind i = nth internal_bs (i - 1) |> Binding.suffix_name ("_" ^ isNodeN);
+ val isNode_binds = mk_internal_bs isNodeN;
+ fun isNode_bind i = nth isNode_binds (i - 1);
val isNode_name = Binding.name_of o isNode_bind;
val isNode_def_bind = rpair [] o Thm.def_binding o isNode_bind;
@@ -1135,7 +1139,8 @@
Library.foldr1 HOLogic.mk_conj [empty, Field, prefCl, tree, undef]
end;
- fun carT_bind i = nth internal_bs (i - 1) |> Binding.suffix_name ("_" ^ carTN);
+ val carT_binds = mk_internal_bs carTN;
+ fun carT_bind i = nth carT_binds (i - 1);
val carT_name = Binding.name_of o carT_bind;
val carT_def_bind = rpair [] o Thm.def_binding o carT_bind;
@@ -1167,7 +1172,8 @@
(Const (nth carTs (i - 1),
Library.foldr (op -->) (map fastype_of As, HOLogic.mk_setT treeT)), As);
- fun strT_bind i = nth internal_bs (i - 1) |> Binding.suffix_name ("_" ^ strTN);
+ val strT_binds = mk_internal_bs strTN;
+ fun strT_bind i = nth strT_binds (i - 1);
val strT_name = Binding.name_of o strT_bind;
val strT_def_bind = rpair [] o Thm.def_binding o strT_bind;
@@ -1228,7 +1234,7 @@
val to_sbd_thmss = mk_to_sbd_thmss @{thm toCard};
val from_to_sbd_thmss = mk_to_sbd_thmss @{thm fromCard_toCard};
- val Lev_bind = Binding.suffix_name ("_" ^ LevN) internal_b;
+ val Lev_bind = mk_internal_b LevN;
val Lev_name = Binding.name_of Lev_bind;
val Lev_def_bind = rpair [] (Thm.def_binding Lev_bind);
@@ -1282,7 +1288,7 @@
val Lev_0s = flat (mk_rec_simps n @{thm nat_rec_0} [Lev_def]);
val Lev_Sucs = flat (mk_rec_simps n @{thm nat_rec_Suc} [Lev_def]);
- val rv_bind = Binding.suffix_name ("_" ^ rvN) internal_b;
+ val rv_bind = mk_internal_b rvN;
val rv_name = Binding.name_of rv_bind;
val rv_def_bind = rpair [] (Thm.def_binding rv_bind);
@@ -1328,7 +1334,8 @@
val rv_Nils = flat (mk_rec_simps n @{thm list_rec_Nil} [rv_def]);
val rv_Conss = flat (mk_rec_simps n @{thm list_rec_Cons} [rv_def]);
- fun beh_bind i = nth internal_bs (i - 1) |> Binding.suffix_name ("_" ^ behN);
+ val beh_binds = mk_internal_bs behN;
+ fun beh_bind i = nth beh_binds (i - 1);
val beh_name = Binding.name_of o beh_bind;
val beh_def_bind = rpair [] o Thm.def_binding o beh_bind;
@@ -1636,7 +1643,7 @@
val ((T_names, (T_glob_infos, T_loc_infos)), lthy) =
lthy
|> fold_map4 (fn b => fn mx => fn car_final => fn in_car_final =>
- typedef (b, params, mx) car_final NONE
+ typedef (Binding.conceal b, params, mx) car_final NONE
(EVERY' [rtac exI, rtac in_car_final] 1)) bs mixfixes car_finals in_car_final_thms
|>> apsnd split_list o split_list;
@@ -1692,7 +1699,7 @@
fun dtor_bind i = nth external_bs (i - 1) |> Binding.suffix_name ("_" ^ dtorN);
val dtor_name = Binding.name_of o dtor_bind;
- val dtor_def_bind = rpair [] o Thm.def_binding o dtor_bind;
+ val dtor_def_bind = rpair [] o Binding.conceal o Thm.def_binding o dtor_bind;
fun dtor_spec i rep str map_FT dtorT Jz Jz' =
let
@@ -1744,7 +1751,7 @@
fun unfold_bind i = nth external_bs (i - 1) |> Binding.suffix_name ("_" ^ dtor_unfoldN);
val unfold_name = Binding.name_of o unfold_bind;
- val unfold_def_bind = rpair [] o Thm.def_binding o unfold_bind;
+ val unfold_def_bind = rpair [] o Binding.conceal o Thm.def_binding o unfold_bind;
fun unfold_spec i T AT abs f z z' =
let
@@ -1865,7 +1872,7 @@
fun ctor_bind i = nth external_bs (i - 1) |> Binding.suffix_name ("_" ^ ctorN);
val ctor_name = Binding.name_of o ctor_bind;
- val ctor_def_bind = rpair [] o Thm.def_binding o ctor_bind;
+ val ctor_def_bind = rpair [] o Binding.conceal o Thm.def_binding o ctor_bind;
fun ctor_spec i ctorT =
let
@@ -1936,7 +1943,7 @@
fun corec_bind i = nth external_bs (i - 1) |> Binding.suffix_name ("_" ^ dtor_corecN);
val corec_name = Binding.name_of o corec_bind;
- val corec_def_bind = rpair [] o Thm.def_binding o corec_bind;
+ val corec_def_bind = rpair [] o Binding.conceal o Thm.def_binding o corec_bind;
val corec_strs =
map3 (fn dtor => fn sum_s => fn mapx =>
@@ -2096,11 +2103,11 @@
(*register new codatatypes as BNFs*)
val (timer, Jbnfs, (folded_dtor_map_o_thms, folded_dtor_map_thms), folded_dtor_set_thmss',
- dtor_set_induct_thms, dtor_Jrel_thms, lthy) =
+ dtor_set_induct_thms, dtor_Jrel_thms, Jbnf_notes, lthy) =
if m = 0 then
(timer, replicate n DEADID_bnf,
map_split (`(mk_pointfree lthy)) (map2 mk_dtor_map_DEADID_thm dtor_inject_thms map_ids),
- replicate n [], [], map2 mk_dtor_Jrel_DEADID_thm dtor_inject_thms bnfs, lthy)
+ replicate n [], [], map2 mk_dtor_Jrel_DEADID_thm dtor_inject_thms bnfs, [], lthy)
else let
val fTs = map2 (curry op -->) passiveAs passiveBs;
val gTs = map2 (curry op -->) passiveBs passiveCs;
@@ -2734,8 +2741,7 @@
bs thmss)
in
(timer, Jbnfs, (folded_dtor_map_o_thms, folded_dtor_map_thms), folded_dtor_set_thmss',
- dtor_set_induct_thms, dtor_Jrel_thms,
- lthy |> Local_Theory.notes (Jbnf_common_notes @ Jbnf_notes) |> snd)
+ dtor_set_induct_thms, dtor_Jrel_thms, Jbnf_common_notes @ Jbnf_notes, lthy)
end;
val dtor_unfold_o_map_thms = mk_xtor_un_fold_o_map_thms Greatest_FP false m
@@ -2883,7 +2889,11 @@
|> maps (fn (thmN, thmss) =>
map2 (fn b => fn thms =>
((Binding.qualify true (Binding.name_of b) (Binding.name thmN), []), [(thms, [])]))
- bs thmss)
+ bs thmss);
+
+ (*FIXME: once the package exports all the necessary high-level characteristic theorems,
+ those should not only be concealed but rather not noted at all*)
+ val maybe_conceal_notes = note_all = false ? map (apfst (apfst Binding.conceal));
in
timer;
({Ts = Ts, bnfs = Jbnfs, ctors = ctors, dtors = dtors,
@@ -2896,24 +2906,18 @@
xtor_co_iter_thmss = transpose [dtor_unfold_thms, dtor_corec_thms],
xtor_co_iter_o_map_thmss = transpose [dtor_unfold_o_map_thms, dtor_corec_o_map_thms],
rel_xtor_co_induct_thm = Jrel_coinduct_thm},
- lthy |> Local_Theory.notes (common_notes @ notes) |> snd)
+ lthy |> Local_Theory.notes (maybe_conceal_notes (common_notes @ notes @ Jbnf_notes)) |> snd)
end;
val _ =
Outer_Syntax.local_theory @{command_spec "codatatype"} "define BNF-based coinductive datatypes"
(parse_co_datatype_cmd Greatest_FP construct_gfp);
-local
-
val option_parser = Parse.group (fn () => "option") (Parse.reserved "sequential" >> K true);
-in
-
val _ = Outer_Syntax.local_theory_to_proof @{command_spec "primcorec"}
"define primitive corecursive functions"
((Scan.optional (@{keyword "("} |-- Parse.!!! option_parser --| @{keyword ")"}) false) --
(Parse.fixes -- Parse_Spec.where_alt_specs) >> uncurry add_primcorec_cmd);
-
-end
end;
--- a/src/HOL/BNF/Tools/bnf_lfp.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/BNF/Tools/bnf_lfp.ML Thu Sep 12 20:54:26 2013 +0200
@@ -36,13 +36,17 @@
val n = length bnfs; (*active*)
val ks = 1 upto n;
val m = live - n; (*passive, if 0 don't generate a new BNF*)
+
+ val note_all = Config.get lthy bnf_note_all;
val b_names = map Binding.name_of bs;
- val common_name = mk_common_name b_names;
- val b = Binding.name common_name;
- val internal_b = Binding.prefix true common_name b;
- fun qualify_bs internal = map2 (Binding.prefix internal) b_names bs;
- val internal_bs = qualify_bs true;
- val external_bs = qualify_bs false;
+ val b_name = mk_common_name b_names;
+ val b = Binding.name b_name;
+ val mk_internal_b = Binding.name #> Binding.prefix true b_name #> Binding.conceal;
+ fun mk_internal_bs name =
+ map (fn b =>
+ Binding.prefix true b_name (Binding.suffix_name ("_" ^ name) b) |> Binding.conceal) bs;
+ val external_bs = map2 (Binding.prefix false) b_names bs
+ |> note_all = false ? map Binding.conceal;
(* TODO: check if m, n, etc., are sane *)
@@ -238,7 +242,7 @@
(* algebra *)
- val alg_bind = Binding.suffix_name ("_" ^ algN) internal_b;
+ val alg_bind = mk_internal_b algN;
val alg_name = Binding.name_of alg_bind;
val alg_def_bind = (Thm.def_binding alg_bind, []);
@@ -325,7 +329,7 @@
(* morphism *)
- val mor_bind = Binding.suffix_name ("_" ^ morN) internal_b;
+ val mor_bind = mk_internal_b morN;
val mor_name = Binding.name_of mor_bind;
val mor_def_bind = (Thm.def_binding mor_bind, []);
@@ -712,8 +716,9 @@
val timer = time (timer "min_algs definition & thms");
- fun min_alg_bind i = nth internal_bs (i - 1) |> Binding.suffix_name ("_" ^ min_algN);
- val min_alg_name = Binding.name_of o min_alg_bind;
+ val min_alg_binds = mk_internal_bs min_algN;
+ fun min_alg_bind i = nth min_alg_binds (i - 1);
+ fun min_alg_name i = Binding.name_of (min_alg_bind i);
val min_alg_def_bind = rpair [] o Thm.def_binding o min_alg_bind;
fun min_alg_spec i =
@@ -791,7 +796,7 @@
val timer = time (timer "Minimal algebra definition & thms");
val II_repT = HOLogic.mk_prodT (HOLogic.mk_tupleT II_BTs, HOLogic.mk_tupleT II_sTs);
- val IIT_bind = Binding.suffix_name ("_" ^ IITN) b;
+ val IIT_bind = mk_internal_b IITN;
val ((IIT_name, (IIT_glob_info, IIT_loc_info)), lthy) =
typedef (IIT_bind, params, NoSyn)
@@ -824,7 +829,8 @@
val select_Bs = map (mk_nthN n (HOLogic.mk_fst (Rep_IIT $ iidx))) ks;
val select_ss = map (mk_nthN n (HOLogic.mk_snd (Rep_IIT $ iidx))) ks;
- fun str_init_bind i = nth internal_bs (i - 1) |> Binding.suffix_name ("_" ^ str_initN);
+ val str_init_binds = mk_internal_bs str_initN;
+ fun str_init_bind i = nth str_init_binds (i - 1);
val str_init_name = Binding.name_of o str_init_bind;
val str_init_def_bind = rpair [] o Thm.def_binding o str_init_bind;
@@ -953,7 +959,8 @@
val ((T_names, (T_glob_infos, T_loc_infos)), lthy) =
lthy
- |> fold_map3 (fn b => fn mx => fn car_init => typedef (b, params, mx) car_init NONE
+ |> fold_map3 (fn b => fn mx => fn car_init =>
+ typedef (Binding.conceal b, params, mx) car_init NONE
(EVERY' [rtac ssubst, rtac @{thm ex_in_conv}, resolve_tac alg_not_empty_thms,
rtac alg_init_thm] 1)) bs mixfixes car_inits
|>> apsnd split_list o split_list;
@@ -1016,7 +1023,7 @@
fun ctor_bind i = nth external_bs (i - 1) |> Binding.suffix_name ("_" ^ ctorN);
val ctor_name = Binding.name_of o ctor_bind;
- val ctor_def_bind = rpair [] o Thm.def_binding o ctor_bind;
+ val ctor_def_bind = rpair [] o Binding.conceal o Thm.def_binding o ctor_bind;
fun ctor_spec i abs str map_FT_init x x' =
let
@@ -1075,7 +1082,7 @@
fun fold_bind i = nth external_bs (i - 1) |> Binding.suffix_name ("_" ^ ctor_foldN);
val fold_name = Binding.name_of o fold_bind;
- val fold_def_bind = rpair [] o Thm.def_binding o fold_bind;
+ val fold_def_bind = rpair [] o Binding.conceal o Thm.def_binding o fold_bind;
fun fold_spec i T AT =
let
@@ -1165,7 +1172,7 @@
fun dtor_bind i = nth external_bs (i - 1) |> Binding.suffix_name ("_" ^ dtorN);
val dtor_name = Binding.name_of o dtor_bind;
- val dtor_def_bind = rpair [] o Thm.def_binding o dtor_bind;
+ val dtor_def_bind = rpair [] o Binding.conceal o Thm.def_binding o dtor_bind;
fun dtor_spec i FT T =
let
@@ -1238,7 +1245,7 @@
fun rec_bind i = nth external_bs (i - 1) |> Binding.suffix_name ("_" ^ ctor_recN);
val rec_name = Binding.name_of o rec_bind;
- val rec_def_bind = rpair [] o Thm.def_binding o rec_bind;
+ val rec_def_bind = rpair [] o Binding.conceal o Thm.def_binding o rec_bind;
val rec_strs =
map3 (fn ctor => fn prod_s => fn mapx =>
@@ -1405,11 +1412,11 @@
(*register new datatypes as BNFs*)
val (timer, Ibnfs, (folded_ctor_map_o_thms, folded_ctor_map_thms), folded_ctor_set_thmss',
- ctor_Irel_thms, lthy) =
+ ctor_Irel_thms, Ibnf_notes, lthy) =
if m = 0 then
(timer, replicate n DEADID_bnf,
map_split (`(mk_pointfree lthy)) (map2 mk_ctor_map_DEADID_thm ctor_inject_thms map_ids),
- replicate n [], map2 mk_ctor_Irel_DEADID_thm ctor_inject_thms bnfs, lthy)
+ replicate n [], map2 mk_ctor_Irel_DEADID_thm ctor_inject_thms bnfs, [], lthy)
else let
val fTs = map2 (curry op -->) passiveAs passiveBs;
val f1Ts = map2 (curry op -->) passiveAs passiveYs;
@@ -1809,7 +1816,7 @@
bs thmss)
in
(timer, Ibnfs, (folded_ctor_map_o_thms, folded_ctor_map_thms), folded_ctor_set_thmss',
- ctor_Irel_thms, lthy |> Local_Theory.notes (Ibnf_common_notes @ Ibnf_notes) |> snd)
+ ctor_Irel_thms, Ibnf_common_notes @ Ibnf_notes, lthy)
end;
val ctor_fold_o_map_thms = mk_xtor_un_fold_o_map_thms Least_FP false m ctor_fold_unique_thm
@@ -1858,7 +1865,11 @@
|> maps (fn (thmN, thmss) =>
map2 (fn b => fn thms =>
((Binding.qualify true (Binding.name_of b) (Binding.name thmN), []), [(thms, [])]))
- bs thmss)
+ bs thmss);
+
+ (*FIXME: once the package exports all the necessary high-level characteristic theorems,
+ those should not only be concealed but rather not noted at all*)
+ val maybe_conceal_notes = note_all = false ? map (apfst (apfst Binding.conceal));
in
timer;
({Ts = Ts, bnfs = Ibnfs, ctors = ctors, dtors = dtors, xtor_co_iterss = transpose [folds, recs],
@@ -1869,7 +1880,7 @@
xtor_co_iter_thmss = transpose [ctor_fold_thms, ctor_rec_thms],
xtor_co_iter_o_map_thmss = transpose [ctor_fold_o_map_thms, ctor_rec_o_map_thms],
rel_xtor_co_induct_thm = Irel_induct_thm},
- lthy |> Local_Theory.notes (common_notes @ notes) |> snd)
+ lthy |> Local_Theory.notes (maybe_conceal_notes (common_notes @ notes @ Ibnf_notes)) |> snd)
end;
val _ =
--- a/src/HOL/BNF/Tools/bnf_tactics.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/BNF/Tools/bnf_tactics.ML Thu Sep 12 20:54:26 2013 +0200
@@ -102,7 +102,7 @@
rtac (unfold_thms ctxt (IJrel_defs @ IJsrel_defs @
@{thms Collect_pair_mem_eq mem_Collect_eq fst_conv snd_conv}) dtor_srel RS trans) 1 THEN
unfold_thms_tac ctxt (srel_def ::
- @{thms Collect_fst_snd_mem_eq mem_Collect_eq pair_mem_Collect_split fst_conv snd_conv
+ @{thms pair_collapse Collect_mem_eq mem_Collect_eq prod.cases fst_conv snd_conv
split_conv}) THEN
rtac refl 1;
--- a/src/HOL/Mirabelle/Tools/mirabelle_sledgehammer_filter.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Mirabelle/Tools/mirabelle_sledgehammer_filter.ML Thu Sep 12 20:54:26 2013 +0200
@@ -12,11 +12,11 @@
fun extract_relevance_fudge args
{local_const_multiplier, worse_irrel_freq, higher_order_irrel_weight,
- abs_rel_weight, abs_irrel_weight, skolem_irrel_weight,
- theory_const_rel_weight, theory_const_irrel_weight,
- chained_const_irrel_weight, intro_bonus, elim_bonus, simp_bonus,
- local_bonus, assum_bonus, chained_bonus, max_imperfect, max_imperfect_exp,
- threshold_divisor, ridiculous_threshold} =
+ abs_rel_weight, abs_irrel_weight, theory_const_rel_weight,
+ theory_const_irrel_weight, chained_const_irrel_weight, intro_bonus,
+ elim_bonus, simp_bonus, local_bonus, assum_bonus, chained_bonus,
+ max_imperfect, max_imperfect_exp, threshold_divisor,
+ ridiculous_threshold} =
{local_const_multiplier =
get args "local_const_multiplier" local_const_multiplier,
worse_irrel_freq = get args "worse_irrel_freq" worse_irrel_freq,
@@ -24,7 +24,6 @@
get args "higher_order_irrel_weight" higher_order_irrel_weight,
abs_rel_weight = get args "abs_rel_weight" abs_rel_weight,
abs_irrel_weight = get args "abs_irrel_weight" abs_irrel_weight,
- skolem_irrel_weight = get args "skolem_irrel_weight" skolem_irrel_weight,
theory_const_rel_weight =
get args "theory_const_rel_weight" theory_const_rel_weight,
theory_const_irrel_weight =
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/src/HOL/Tools/Sledgehammer/MaSh/src/ExpandFeatures.py Thu Sep 12 20:54:26 2013 +0200
@@ -0,0 +1,162 @@
+'''
+Created on Aug 21, 2013
+
+@author: daniel
+'''
+
+from math import log
+from gensim import corpora, models, similarities
+
+class ExpandFeatures(object):
+
+ def __init__(self,dicts):
+ self.dicts = dicts
+ self.featureMap = {}
+ self.alpha = 0.1
+ self.featureCounts = {}
+ self.counter = 0
+ self.corpus = []
+ self.LSIModel = models.lsimodel.LsiModel(self.corpus,num_topics=500)
+
+ def initialize(self,dicts):
+ self.dicts = dicts
+ IS = open(dicts.accFile,'r')
+ for line in IS:
+ line = line.split(':')
+ name = line[0]
+ #print 'name',name
+ nameId = dicts.nameIdDict[name]
+ features = dicts.featureDict[nameId]
+ dependencies = dicts.dependenciesDict[nameId]
+ x = [self.dicts.idNameDict[d] for d in dependencies]
+ #print x
+ self.update(features, dependencies)
+ self.corpus.append([(x,1) for x in features.keys()])
+ IS.close()
+ print 'x'
+ #self.LSIModel = models.lsimodel.LsiModel(self.corpus,num_topics=500)
+ print self.LSIModel
+ print 'y'
+
+ def update(self,features,dependencies):
+ self.counter += 1
+ self.corpus.append([(x,1) for x in features.keys()])
+ self.LSIModel.add_documents([[(x,1) for x in features.keys()]])
+ """
+ for f in features.iterkeys():
+ try:
+ self.featureCounts[f] += 1
+ except:
+ self.featureCounts[f] = 1
+ if self.featureCounts[f] > 100:
+ continue
+ try:
+ self.featureMap[f] = self.featureMap[f].intersection(features.keys())
+ except:
+ self.featureMap[f] = set(features.keys())
+ #print 'fOld',len(fMap),self.featureCounts[f],len(dependencies)
+
+ for d in dependencies[1:]:
+ #print 'dep',self.dicts.idNameDict[d]
+ dFeatures = self.dicts.featureDict[d]
+ for df in dFeatures.iterkeys():
+ if self.featureCounts.has_key(df):
+ if self.featureCounts[df] > 20:
+ continue
+ else:
+ print df
+ try:
+ fMap[df] += self.alpha * (1.0 - fMap[df])
+ except:
+ fMap[df] = self.alpha
+ """
+ #print 'fNew',len(fMap)
+
+ def expand(self,features):
+ #print self.corpus[:50]
+ #print corpus
+ #tfidfmodel = models.TfidfModel(self.corpus, normalize=True)
+ #print features.keys()
+ #tfidfcorpus = [tfidfmodel[x] for x in self.corpus]
+ #newFeatures = LSI[[(x,1) for x in features.keys()]]
+ newFeatures = self.LSIModel[[(x,1) for x in features.keys()]]
+ print features
+ print newFeatures
+ #print newFeatures
+
+ """
+ newFeatures = dict(features)
+ for f in features.keys():
+ try:
+ fC = self.featureCounts[f]
+ except:
+ fC = 0.5
+ newFeatures[f] = log(float(8+self.counter) / fC)
+ #nrOfFeatures = float(len(features))
+ addedCount = 0
+ alpha = 0.2
+ #"""
+
+ """
+ consideredFeatures = []
+ while len(newFeatures) < 30:
+ #alpha = alpha * 0.5
+ minF = None
+ minFrequence = 1000000
+ for f in newFeatures.iterkeys():
+ if f in consideredFeatures:
+ continue
+ try:
+ if self.featureCounts[f] < minFrequence:
+ minF = f
+ except:
+ pass
+ if minF == None:
+ break
+ # Expand minimal feature
+ consideredFeatures.append(minF)
+ for expF in self.featureMap[minF]:
+ if not newFeatures.has_key(expF):
+ fC = self.featureCounts[minF]
+ newFeatures[expF] = alpha*log(float(8+self.counter) / fC)
+ #print features, newFeatures
+ #"""
+ """
+ for f in features.iterkeys():
+ try:
+ self.featureCounts[f] += 1
+ except:
+ self.featureCounts[f] = 0
+ if self.featureCounts[f] > 10:
+ continue
+ addedCount += 1
+ try:
+ fmap = self.featureMap[f]
+ except:
+ self.featureMap[f] = {}
+ fmap = {}
+ for nf,nv in fmap.iteritems():
+ try:
+ newFeatures[nf] += nv
+ except:
+ newFeatures[nf] = nv
+ if addedCount > 0:
+ for f,w in newFeatures.iteritems():
+ newFeatures[f] = float(w)/addedCount
+ #"""
+ """
+ deleteF = []
+ for f,w in newFeatures.iteritems():
+ if w < 0.1:
+ deleteF.append(f)
+ for f in deleteF:
+ del newFeatures[f]
+ """
+ #print 'fold',len(features)
+ #print 'fnew',len(newFeatures)
+ return dict(newFeatures)
+
+if __name__ == "__main__":
+ pass
+
+
\ No newline at end of file
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/src/HOL/Tools/Sledgehammer/MaSh/src/KNN.py Thu Sep 12 20:54:26 2013 +0200
@@ -0,0 +1,99 @@
+'''
+Created on Aug 21, 2013
+
+@author: daniel
+'''
+
+from cPickle import dump,load
+from numpy import array
+from math import sqrt,log
+
+def cosine(f1,f2):
+ f1Norm = 0.0
+ for f in f1.keys():
+ f1Norm += f1[f] * f1[f]
+ #assert f1Norm = sum(map(lambda x,y: x*y,f1.itervalues(),f1.itervalues()))
+ f1Norm = sqrt(f1Norm)
+
+ f2Norm = 0.0
+ for f in f2.keys():
+ f2Norm += f2[f] * f2[f]
+ f2Norm = sqrt(f2Norm)
+
+ dotProduct = 0.0
+ featureIntersection = set(f1.keys()) & set(f2.keys())
+ for f in featureIntersection:
+ dotProduct += f1[f] * f2[f]
+ cosine = dotProduct / (f1Norm * f2Norm)
+ return 1.0 - cosine
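+
+# Illustrative sanity check (hypothetical): identical feature vectors have
+# distance 0.0, e.g. cosine({1: 1.0, 2: 2.0}, {1: 1.0, 2: 2.0}) == 0.0.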
+
+def euclidean(f1,f2):
+ diffSum = 0.0
+ featureUnion = set(f1.keys()) | set(f2.keys())
+ for f in featureUnion:
+ try:
+ f1Val = f1[f]
+ except:
+ f1Val = 0.0
+ try:
+ f2Val = f2[f]
+ except:
+ f2Val = 0.0
+ diff = f1Val - f2Val
+ diffSum += diff * diff
+ #if f in f1.keys():
+ # diffSum += log(2+self.pointCount/self.featureCounts[f]) * diff * diff
+ #else:
+ # diffSum += diff * diff
+ #print diffSum,f1,f2
+ return diffSum
+
+class KNN(object):
+ '''
+ A basic KNN ranker.
+ '''
+
+ def __init__(self,dicts,metric=cosine):
+ '''
+ Constructor
+ '''
+ self.points = dicts.featureDict
+ self.metric = metric
+
+ def initializeModel(self,_trainData,_dicts):
+ """
+ Build basic model from training data.
+ """
+ pass
+
+ def update(self,dataPoint,features,dependencies):
+ assert self.points[dataPoint] == features
+
+ def overwrite(self,problemId,newDependencies,dicts):
+ # Taken care of by dicts
+ pass
+
+ def delete(self,dataPoint,features,dependencies):
+ # Taken care of by dicts
+ pass
+
+ def predict(self,features,accessibles,dicts):
+ predictions = map(lambda x: self.metric(features,self.points[x]),accessibles)
+ predictions = array(predictions)
+ perm = predictions.argsort()
+ return array(accessibles)[perm],predictions[perm]
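+
+ # Usage sketch (hypothetical): suggested, dists = knn.predict(features,
+ # accessibles, dicts) returns the accessible facts sorted by increasing
+ # distance, i.e. best suggestions first.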
+
+ def save(self,fileName):
+ OStream = open(fileName, 'wb')
+ dump((self.points,self.metric),OStream)
+ OStream.close()
+
+ def load(self,fileName):
+ OStream = open(fileName, 'rb')
+ self.points,self.metric = load(OStream)
+ OStream.close()
+
+if __name__ == '__main__':
+ pass
+
+
\ No newline at end of file
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/src/HOL/Tools/Sledgehammer/MaSh/src/KNNs.py Thu Sep 12 20:54:26 2013 +0200
@@ -0,0 +1,105 @@
+'''
+Created on Aug 21, 2013
+
+@author: daniel
+'''
+
+from math import log
+from KNN import KNN,cosine
+from numpy import array
+
+class KNNAdaptPointFeatures(KNN):
+
+ def __init__(self,dicts,metric=cosine,alpha = 0.05):
+ self.points = dicts.featureDict
+ self.metric = self.euclidean
+ self.alpha = alpha
+ self.count = 0
+ self.featureCount = {}
+
+ def initializeModel(self,trainData,dicts):
+ """
+ Build basic model from training data.
+ """
+ IS = open(dicts.accFile,'r')
+ for line in IS:
+ line = line.split(':')
+ name = line[0]
+ nameId = dicts.nameIdDict[name]
+ features = dicts.featureDict[nameId]
+ dependencies = dicts.dependenciesDict[nameId]
+ self.update(nameId, features, dependencies)
+ IS.close()
+
+ def update(self,dataPoint,features,dependencies):
+ self.count += 1
+ for f in features.iterkeys():
+ try:
+ self.featureCount[f] += 1
+ except:
+ self.featureCount[f] = 1
+ for d in dependencies:
+ dFeatures = self.points[d]
+ featureUnion = set(dFeatures.keys()) | set(features.keys())
+ for f in featureUnion:
+ try:
+ pVal = features[f]
+ except:
+ pVal = 0.0
+ try:
+ dVal = dFeatures[f]
+ except:
+ dVal = 0.0
+ newDVal = dVal + self.alpha * (pVal - dVal)
+ dFeatures[f] = newDVal
+
+ def euclidean(self,f1,f2):
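+ # Descriptive sketch: every differing feature contributes diff*diff; a
+ # feature occurring in the query f1 adds an IDF-like
+ # log(2+count/featureCount[f])*diff*diff on top, all others a second
+ # plain diff*diff, so rare query features weigh more.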
+ diffSum = 0.0
+ f1Set = set(f1.keys())
+ featureUnion = f1Set | set(f2.keys())
+ for f in featureUnion:
+ if not self.featureCount.has_key(f):
+ continue
+ if self.featureCount[f] == 1:
+ continue
+ try:
+ f1Val = f1[f]
+ except:
+ f1Val = 0.0
+ try:
+ f2Val = f2[f]
+ except:
+ f2Val = 0.0
+ diff = f1Val - f2Val
+ diffSum += diff * diff
+ if f in f1Set:
+ diffSum += log(2+self.count/self.featureCount[f]) * diff * diff
+ else:
+ diffSum += diff * diff
+ #print diffSum,f1,f2
+ return diffSum
+
+class KNNUrban(KNN):
+ def __init__(self,dicts,metric=cosine,nrOfNeighbours = 40):
+ self.points = dicts.featureDict
+ self.metric = metric
+ self.nrOfNeighbours = nrOfNeighbours # Ignored at the moment
+
+ def predict(self,features,accessibles,dicts):
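+ # Score propagation (sketch): each accessible fact's distance is
+ # subtracted from the scores of its dependencies before ranking
+ # (lower score = suggested earlier).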
+ predictions = map(lambda x: self.metric(features,self.points[x]),accessibles)
+ pDict = dict(zip(accessibles,predictions))
+ for a,p in zip(accessibles,predictions):
+ aDeps = dicts.dependenciesDict[a]
+ for d in aDeps:
+ pDict[d] -= p
+ predictions = []
+ names = []
+ for n,p in pDict.items():
+ predictions.append(p)
+ names.append(n)
+ predictions = array(predictions)
+ perm = predictions.argsort()
+ return array(names)[perm],predictions[perm]
+
+
+
\ No newline at end of file
--- a/src/HOL/Tools/Sledgehammer/MaSh/src/dictionaries.py Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Tools/Sledgehammer/MaSh/src/dictionaries.py Thu Sep 12 20:54:26 2013 +0200
@@ -1,18 +1,13 @@
# Title: HOL/Tools/Sledgehammer/MaSh/src/dictionaries.py
# Author: Daniel Kuehlwein, ICIS, Radboud University Nijmegen
-# Copyright 2012
+# Copyright 2012-2013
#
# Persistent dictionaries: accessibility, dependencies, and features.
-'''
-Created on Jul 12, 2012
-
-@author: daniel
-'''
-
+import logging,sys
from os.path import join
from Queue import Queue
-from readData import create_accessible_dict,create_dependencies_dict,create_feature_dict
+from readData import create_accessible_dict,create_dependencies_dict
from cPickle import load,dump
class Dictionaries(object):
@@ -32,20 +27,17 @@
self.dependenciesDict = {}
self.accessibleDict = {}
self.expandedAccessibles = {}
- # For SInE features
- self.useSine = False
- self.featureCountDict = {}
- self.triggerFeaturesDict = {}
- self.featureTriggeredFormulasDict = {}
+ self.accFile = ''
self.changed = True
"""
Init functions. nameIdDict, idNameDict, featureIdDict, articleDict get filled!
"""
- def init_featureDict(self,featureFile,sineFeatures):
- self.featureDict,self.maxNameId,self.maxFeatureId,self.featureCountDict,self.triggerFeaturesDict,self.featureTriggeredFormulasDict =\
- create_feature_dict(self.nameIdDict,self.idNameDict,self.maxNameId,self.featureIdDict,self.maxFeatureId,self.featureCountDict,\
- self.triggerFeaturesDict,self.featureTriggeredFormulasDict,sineFeatures,featureFile)
+ def init_featureDict(self,featureFile):
+ self.create_feature_dict(featureFile)
+ #self.featureDict,self.maxNameId,self.maxFeatureId,self.featureCountDict,self.triggerFeaturesDict,self.featureTriggeredFormulasDict =\
+ # create_feature_dict(self.nameIdDict,self.idNameDict,self.maxNameId,self.featureIdDict,self.maxFeatureId,self.featureCountDict,\
+ # self.triggerFeaturesDict,self.featureTriggeredFormulasDict,sineFeatures,featureFile)
def init_dependenciesDict(self,depFile):
self.dependenciesDict = create_dependencies_dict(self.nameIdDict,depFile)
def init_accessibleDict(self,accFile):
@@ -54,16 +46,37 @@
def init_all(self,args):
self.featureFileName = 'mash_features'
self.accFileName = 'mash_accessibility'
- self.useSine = args.sineFeatures
featureFile = join(args.inputDir,self.featureFileName)
depFile = join(args.inputDir,args.depFile)
- accFile = join(args.inputDir,self.accFileName)
- self.init_featureDict(featureFile,self.useSine)
- self.init_accessibleDict(accFile)
+ self.accFile = join(args.inputDir,self.accFileName)
+ self.init_featureDict(featureFile)
+ self.init_accessibleDict(self.accFile)
self.init_dependenciesDict(depFile)
self.expandedAccessibles = {}
self.changed = True
+ def create_feature_dict(self,inputFile):
+ logger = logging.getLogger('create_feature_dict')
+ self.featureDict = {}
+ IS = open(inputFile,'r')
+ for line in IS:
+ line = line.split(':')
+ name = line[0]
+ # Name Id
+ if self.nameIdDict.has_key(name):
+ logger.warning('%s appears twice in the feature file. Aborting.',name)
+ sys.exit(-1)
+ else:
+ self.nameIdDict[name] = self.maxNameId
+ self.idNameDict[self.maxNameId] = name
+ nameId = self.maxNameId
+ self.maxNameId += 1
+ features = self.get_features(line)
+ # Store results
+ self.featureDict[nameId] = features
+ IS.close()
+ return
+
def get_name_id(self,name):
"""
Return the Id for a name.
@@ -82,27 +95,23 @@
def add_feature(self,featureName):
if not self.featureIdDict.has_key(featureName):
self.featureIdDict[featureName] = self.maxFeatureId
- if self.useSine:
- self.featureCountDict[self.maxFeatureId] = 0
self.maxFeatureId += 1
self.changed = True
fId = self.featureIdDict[featureName]
- if self.useSine:
- self.featureCountDict[fId] += 1
return fId
def get_features(self,line):
- # Feature Ids
featureNames = [f.strip() for f in line[1].split()]
- features = []
+ features = {}
for fn in featureNames:
tmp = fn.split('=')
- weight = 1.0
+ weight = 1.0
if len(tmp) == 2:
fn = tmp[0]
weight = float(tmp[1])
fId = self.add_feature(tmp[0])
- features.append((fId,weight))
+ features[fId] = weight
+ #features[fId] = 1.0 ###
return features
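+ # For example (sketch), a feature field "f1 f2=0.5" yields
+ # {id(f1): 1.0, id(f2): 0.5}, where id(.) is the identifier that
+ # add_feature assigns to each feature name.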
def expand_accessibles(self,acc):
@@ -142,16 +151,6 @@
self.accessibleDict[nameId] = unExpAcc
features = self.get_features(line)
self.featureDict[nameId] = features
- if self.useSine:
- # SInE Features
- minFeatureCount = min([self.featureCountDict[f] for f,_w in features])
- triggerFeatures = [f for f,_w in features if self.featureCountDict[f] == minFeatureCount]
- self.triggerFeaturesDict[nameId] = triggerFeatures
- for f in triggerFeatures:
- if self.featureTriggeredFormulasDict.has_key(f):
- self.featureTriggeredFormulasDict[f].append(nameId)
- else:
- self.featureTriggeredFormulasDict[f] = [nameId]
self.dependenciesDict[nameId] = [self.nameIdDict[d.strip()] for d in line[2].split()]
self.changed = True
return nameId
@@ -219,14 +218,12 @@
if self.changed:
dictsStream = open(fileName, 'wb')
dump((self.accessibleDict,self.dependenciesDict,self.expandedAccessibles,self.featureDict,\
- self.featureIdDict,self.idNameDict,self.maxFeatureId,self.maxNameId,self.nameIdDict,\
- self.featureCountDict,self.triggerFeaturesDict,self.featureTriggeredFormulasDict,self.useSine),dictsStream)
+ self.featureIdDict,self.idNameDict,self.maxFeatureId,self.maxNameId,self.nameIdDict),dictsStream)
self.changed = False
dictsStream.close()
def load(self,fileName):
dictsStream = open(fileName, 'rb')
self.accessibleDict,self.dependenciesDict,self.expandedAccessibles,self.featureDict,\
- self.featureIdDict,self.idNameDict,self.maxFeatureId,self.maxNameId,self.nameIdDict,\
- self.featureCountDict,self.triggerFeaturesDict,self.featureTriggeredFormulasDict,self.useSine = load(dictsStream)
+ self.featureIdDict,self.idNameDict,self.maxFeatureId,self.maxNameId,self.nameIdDict = load(dictsStream)
self.changed = False
dictsStream.close()
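
The dictionaries.py changes above replace the list-of-pairs feature representation with a dictionary from feature ids to weights, which makes membership tests and weight lookups in the classifiers constant-time. A minimal standalone sketch of the new parsing step, under the name:feat1 feat2=0.5 ... input format used by the patch (the helper name and signature are illustrative, not part of the patch):

    # Parse one feature-file line into {featureId: weight}, assigning fresh
    # ids to unseen feature names, as get_features now does.
    def parse_feature_line(line, featureIdDict, maxFeatureId):
        name, rest = line.split(':', 1)
        features = {}
        for token in rest.split():
            parts = token.split('=')
            weight = float(parts[1]) if len(parts) == 2 else 1.0
            if parts[0] not in featureIdDict:
                featureIdDict[parts[0]] = maxFeatureId
                maxFeatureId += 1
            features[featureIdDict[parts[0]]] = weight
        return name, features, maxFeatureId
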
--- a/src/HOL/Tools/Sledgehammer/MaSh/src/mash.py Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Tools/Sledgehammer/MaSh/src/mash.py Thu Sep 12 20:54:26 2013 +0200
@@ -41,7 +41,7 @@
path = dirname(realpath(__file__))
spawnDaemon(os.path.join(path,'server.py'))
serverIsUp=False
- for _i in range(10):
+ for _i in range(20):
# Test if server is up
try:
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
@@ -118,11 +118,12 @@
OS.write('%s\n' % received)
OS.close()
IS.close()
+
+ # Statistics
+ if args.statistics:
+ received = communicate('avgStats',args.host,args.port)
+ logger.info(received)
- # Statistics
- if args.statistics:
- received = communicate('avgStats',args.host,args.port)
- logger.info(received)
if args.saveModels:
communicate('save',args.host,args.port)
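
The change above doubles the number of connection attempts, giving the freshly spawned daemon more time to come up. The underlying polling idiom, as a self-contained sketch (host, port, and the retry parameters are illustrative):

    import socket, time

    # Poll until the daemon accepts TCP connections or the attempts run out.
    def wait_for_server(host, port, attempts=20, delay=0.5):
        for _attempt in range(attempts):
            try:
                sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
                sock.connect((host, port))
                sock.close()
                return True
            except socket.error:
                time.sleep(delay)
        return False
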
--- a/src/HOL/Tools/Sledgehammer/MaSh/src/mashTest.py Thu Sep 12 20:53:25 2013 +0200
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,12 +0,0 @@
-'''
-Created on Aug 20, 2013
-
-@author: daniel
-'''
-from mash import mash
-
-if __name__ == "__main__":
- args = ['--statistics', '--init', '--inputDir', '../data/20130118/Jinja', '--log', '../tmp/auth.log', '--modelFile', '../tmp/m0', '--dictsFile', '../tmp/d0','--NBDefaultPriorWeight', '20.0', '--NBDefVal', '-15.0', '--NBPosWeight', '10.0']
- mash(args)
- args = ['-i', '../data/20130118/Jinja/mash_commands', '-p', '../tmp/auth.pred0', '--statistics', '--cutOff', '500', '--log', '../tmp/auth.log','--modelFile', '../tmp/m0', '--dictsFile', '../tmp/d0']
- mash(args)
\ No newline at end of file
--- a/src/HOL/Tools/Sledgehammer/MaSh/src/parameters.py Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Tools/Sledgehammer/MaSh/src/parameters.py Thu Sep 12 20:54:26 2013 +0200
@@ -22,14 +22,13 @@
parser.add_argument('--depFile', default='mash_dependencies',
help='Name of the file with the premise dependencies. The file must be in inputDir. Default = mash_dependencies')
- parser.add_argument('--algorithm',default='nb',help="Which learning algorithm is used. nb = Naive Bayes,predef=predefined. Default=nb.")
+ parser.add_argument('--algorithm',default='nb',help="Which learning algorithm is used. nb = Naive Bayes, KNN = k-nearest neighbors, predef = predefined. Default=nb.")
+ parser.add_argument('--predef',help="File containing the predefined suggestions. Only used when algorithm = predef.")
# NB Parameters
parser.add_argument('--NBDefaultPriorWeight',default=20.0,help="Initializes classifiers with value * p |- p. Default=20.0.",type=float)
parser.add_argument('--NBDefVal',default=-15.0,help="Default value for unknown features. Default=-15.0.",type=float)
parser.add_argument('--NBPosWeight',default=10.0,help="Weight value for positive features. Default=10.0.",type=float)
- # TODO: Rename to sineFeatures
- parser.add_argument('--sineFeatures',default=False,action='store_true',help="Uses a SInE like prior for premise lvl predictions. Default=False.")
- parser.add_argument('--sineWeight',default=0.5,help="How much the SInE prior is weighted. Default=0.5.",type=float)
+ parser.add_argument('--expandFeatures',default=False,action='store_true',help="Learning-based feature expansion. Default=False.")
parser.add_argument('--statistics',default=False,action='store_true',help="Create and show statistics for the top CUTOFF predictions.\
WARNING: This will make the program a lot slower! Default=False.")
--- a/src/HOL/Tools/Sledgehammer/MaSh/src/readData.py Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Tools/Sledgehammer/MaSh/src/readData.py Thu Sep 12 20:54:26 2013 +0200
@@ -14,55 +14,6 @@
import sys,logging
-def create_feature_dict(nameIdDict,idNameDict,maxNameId,featureIdDict,maxFeatureId,featureCountDict,\
- triggerFeaturesDict,featureTriggeredFormulasDict,sineFeatures,inputFile):
- logger = logging.getLogger('create_feature_dict')
- featureDict = {}
- IS = open(inputFile,'r')
- for line in IS:
- line = line.split(':')
- name = line[0]
- # Name Id
- if nameIdDict.has_key(name):
- logger.warning('%s appears twice in the feature file. Aborting.',name)
- sys.exit(-1)
- else:
- nameIdDict[name] = maxNameId
- idNameDict[maxNameId] = name
- nameId = maxNameId
- maxNameId += 1
- # Feature Ids
- featureNames = [f.strip() for f in line[1].split()]
- features = []
- minFeatureCount = 9999999
- for fn in featureNames:
- weight = 1.0
- tmp = fn.split('=')
- if len(tmp) == 2:
- fn = tmp[0]
- weight = float(tmp[1])
- if not featureIdDict.has_key(fn):
- featureIdDict[fn] = maxFeatureId
- featureCountDict[maxFeatureId] = 0
- maxFeatureId += 1
- fId = featureIdDict[fn]
- features.append((fId,weight))
- if sineFeatures:
- featureCountDict[fId] += 1
- minFeatureCount = min(minFeatureCount,featureCountDict[fId])
- # Store results
- featureDict[nameId] = features
- if sineFeatures:
- triggerFeatures = [f for f,_w in features if featureCountDict[f] == minFeatureCount]
- triggerFeaturesDict[nameId] = triggerFeatures
- for f in triggerFeatures:
- if featureTriggeredFormulasDict.has_key(f):
- featureTriggeredFormulasDict[f].append(nameId)
- else:
- featureTriggeredFormulasDict[f] = [nameId]
- IS.close()
- return featureDict,maxNameId,maxFeatureId,featureCountDict,triggerFeaturesDict,featureTriggeredFormulasDict
-
def create_dependencies_dict(nameIdDict,inputFile):
logger = logging.getLogger('create_dependencies_dict')
dependenciesDict = {}
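
With the feature parsing moved into Dictionaries, readData.py retains only the dependency parsing. A standalone sketch of that step, assuming the name:dep1 dep2 ... line format suggested by the surrounding code (the helper name is illustrative):

    # Map each statement id to the ids of its dependencies.
    def parse_dependencies(lines, nameIdDict):
        dependenciesDict = {}
        for line in lines:
            name, deps = line.split(':', 1)
            dependenciesDict[nameIdDict[name.strip()]] = \
                [nameIdDict[d.strip()] for d in deps.split()]
        return dependenciesDict
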
--- a/src/HOL/Tools/Sledgehammer/MaSh/src/server.py Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Tools/Sledgehammer/MaSh/src/server.py Thu Sep 12 20:54:26 2013 +0200
@@ -7,10 +7,15 @@
import SocketServer,os,string,logging
from multiprocessing import Manager
+from threading import Timer
from time import time
from dictionaries import Dictionaries
from parameters import init_parser
from sparseNaiveBayes import sparseNBClassifier
+from KNN import KNN,euclidean
+from KNNs import KNNAdaptPointFeatures,KNNUrban
+from predefined import Predefined
+#from ExpandFeatures import ExpandFeatures
from stats import Statistics
@@ -19,6 +24,21 @@
SocketServer.ThreadingTCPServer.__init__(self,*args, **kwargs)
self.manager = Manager()
self.lock = Manager().Lock()
+ self.idle_timeout = 28800.0 # 8 hours in seconds
+ self.idle_timer = Timer(self.idle_timeout, self.shutdown)
+ self.idle_timer.start()
+
+ def save(self):
+ # Save Models
+ self.model.save(self.args.modelFile)
+ self.dicts.save(self.args.dictsFile)
+ if not self.args.saveStats == None:
+ statsFile = os.path.join(self.args.outputDir,self.args.saveStats)
+ self.stats.save(statsFile)
+
+ def save_and_shutdown(self):
+ self.save()
+ self.shutdown()
class MaShHandler(SocketServer.BaseRequestHandler):
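
The Timer-based idle shutdown introduced above is re-armed on every request (see the handle() hunk further down): cancel the pending timer, create a fresh one, start it. The pattern, distilled into a self-contained sketch (the class name is hypothetical):

    from threading import Timer

    # One-shot watchdog: the callback fires only after a full timeout
    # without an intervening reset().
    class IdleWatchdog(object):
        def __init__(self, timeout, callback):
            self.timeout = timeout
            self.callback = callback
            self.timer = Timer(timeout, callback)
            self.timer.start()

        def reset(self):
            self.timer.cancel()
            self.timer = Timer(self.timeout, self.callback)
            self.timer.start()
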
@@ -28,25 +48,32 @@
else:
argv = argv.split(';')
self.server.args = init_parser(argv)
- # Pick model
- if self.server.args.algorithm == 'nb':
- self.server.model = sparseNBClassifier(self.server.args.NBDefaultPriorWeight,self.server.args.NBPosWeight,self.server.args.NBDefVal)
- else: # Default case
- self.server.model = sparseNBClassifier(self.server.args.NBDefaultPriorWeight,self.server.args.NBPosWeight,self.server.args.NBDefVal)
# Load all data
- # TODO: rewrite dicts for concurrency and without sine
self.server.dicts = Dictionaries()
if os.path.isfile(self.server.args.dictsFile):
self.server.dicts.load(self.server.args.dictsFile)
elif self.server.args.init:
self.server.dicts.init_all(self.server.args)
+ # Pick model
+ if self.server.args.algorithm == 'nb':
+ self.server.model = sparseNBClassifier(self.server.args.NBDefaultPriorWeight,self.server.args.NBPosWeight,self.server.args.NBDefVal)
+ elif self.server.args.algorithm == 'KNN':
+ #self.server.model = KNN(self.server.dicts)
+ self.server.model = KNNAdaptPointFeatures(self.server.dicts)
+ elif self.server.args.algorithm == 'predef':
+ self.server.model = Predefined(self.server.args.predef)
+ else: # Default case
+ self.server.model = sparseNBClassifier(self.server.args.NBDefaultPriorWeight,self.server.args.NBPosWeight,self.server.args.NBDefVal)
+# if self.server.args.expandFeatures:
+# self.server.expandFeatures = ExpandFeatures(self.server.dicts)
+# self.server.expandFeatures.initialize(self.server.dicts)
# Create Model
if os.path.isfile(self.server.args.modelFile):
self.server.model.load(self.server.args.modelFile)
elif self.server.args.init:
trainData = self.server.dicts.featureDict.keys()
self.server.model.initializeModel(trainData,self.server.dicts)
-
+
if self.server.args.statistics:
self.server.stats = Statistics(self.server.args.cutOff)
self.server.statementCounter = 1
@@ -77,6 +104,8 @@
self.server.logger.debug('Poor predictions: %s',bp)
self.server.statementCounter += 1
+# if self.server.args.expandFeatures:
+# self.server.expandFeatures.update(self.server.dicts.featureDict[problemId],self.server.dicts.dependenciesDict[problemId])
# Update Dependencies, p proves p
self.server.dicts.dependenciesDict[problemId] = [problemId]+self.server.dicts.dependenciesDict[problemId]
self.server.model.update(problemId,self.server.dicts.featureDict[problemId],self.server.dicts.dependenciesDict[problemId])
@@ -92,22 +121,25 @@
self.server.computeStats = True
if self.server.args.algorithm == 'predef':
return
- name,features,accessibles,hints,numberOfPredictions = self.server.dicts.parse_problem(self.data)
+ name,features,accessibles,hints,numberOfPredictions = self.server.dicts.parse_problem(self.data)
if numberOfPredictions == None:
numberOfPredictions = self.server.args.numberOfPredictions
if not hints == []:
self.server.model.update('hints',features,hints)
-
+# if self.server.args.expandFeatures:
+# features = self.server.expandFeatures.expand(features)
# Create predictions
self.server.logger.debug('Starting computation for line %s',self.server.callCounter)
- predictionsFeatures = features
- self.server.predictions,predictionValues = self.server.model.predict(predictionsFeatures,accessibles,self.server.dicts)
+
+ self.server.predictions,predictionValues = self.server.model.predict(features,accessibles,self.server.dicts)
assert len(self.server.predictions) == len(predictionValues)
self.server.logger.debug('Time needed: '+str(round(time()-self.startTime,2)))
# Output
predictionNames = [str(self.server.dicts.idNameDict[p]) for p in self.server.predictions[:numberOfPredictions]]
- predictionValues = [str(x) for x in predictionValues[:numberOfPredictions]]
+ #predictionValues = [str(x) for x in predictionValues[:numberOfPredictions]]
+ #predictionsStringList = ['%s=%s' % (predictionNames[i],predictionValues[i]) for i in range(len(predictionNames))]
+ #predictionsString = string.join(predictionsStringList,' ')
predictionsString = string.join(predictionNames,' ')
outString = '%s: %s' % (name,predictionsString)
self.request.sendall(outString)
@@ -115,27 +147,18 @@
def shutdown(self,saveModels=True):
self.request.sendall('Shutting down server.')
if saveModels:
- self.save()
+ self.server.save()
self.server.shutdown()
- def save(self):
- # Save Models
- self.server.model.save(self.server.args.modelFile)
- self.server.dicts.save(self.server.args.dictsFile)
- if not self.server.args.saveStats == None:
- statsFile = os.path.join(self.server.args.outputDir,self.server.args.saveStats)
- self.server.stats.save(statsFile)
-
def handle(self):
# self.request is the TCP socket connected to the client
self.data = self.request.recv(4194304).strip()
self.server.lock.acquire()
- #print "{} wrote:".format(self.client_address[0])
self.startTime = time()
if self.data == 'shutdown':
self.shutdown()
elif self.data == 'save':
- self.save()
+ self.server.save()
elif self.data.startswith('i'):
self.init(self.data[2:])
elif self.data.startswith('!'):
@@ -153,15 +176,16 @@
else:
            self.request.sendall('Unspecified input format: \n%s' % self.data)
self.server.callCounter += 1
+ # Update idle shutdown timer
+ self.server.idle_timer.cancel()
+ self.server.idle_timer = Timer(self.server.idle_timeout, self.server.save_and_shutdown)
+ self.server.idle_timer.start()
self.server.lock.release()
if __name__ == "__main__":
HOST, PORT = "localhost", 9255
- #print 'Started Server'
- # Create the server, binding to localhost on port 9999
SocketServer.TCPServer.allow_reuse_address = True
server = ThreadingTCPServer((HOST, PORT), MaShHandler)
- #server = SocketServer.TCPServer((HOST, PORT), MaShHandler)
# Activate the server; this will keep running until you
# interrupt the program with Ctrl-C
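
The init hunk above moves model selection after the dictionaries are loaded, since the instance-based KNN models are constructed from them; naive Bayes remains the default. The dispatch, restated as a plain function (a paraphrase of the patched branch, with imports as in the patch; not a literal excerpt):

    from sparseNaiveBayes import sparseNBClassifier
    from KNNs import KNNAdaptPointFeatures
    from predefined import Predefined

    def pick_model(args, dicts):
        if args.algorithm == 'KNN':
            return KNNAdaptPointFeatures(dicts)  # needs the loaded dicts
        elif args.algorithm == 'predef':
            return Predefined(args.predef)
        else:  # 'nb' and the default case coincide
            return sparseNBClassifier(args.NBDefaultPriorWeight,
                                      args.NBPosWeight, args.NBDefVal)
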
--- a/src/HOL/Tools/Sledgehammer/MaSh/src/sparseNaiveBayes.py Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Tools/Sledgehammer/MaSh/src/sparseNaiveBayes.py Thu Sep 12 20:54:26 2013 +0200
@@ -36,7 +36,7 @@
dFeatureCounts = {}
# Add p proves p with weight self.defaultPriorWeight
if not self.defaultPriorWeight == 0:
- for f,_w in dicts.featureDict[d]:
+ for f in dicts.featureDict[d].iterkeys():
dFeatureCounts[f] = self.defaultPriorWeight
self.counts[d] = [self.defaultPriorWeight,dFeatureCounts]
@@ -44,7 +44,7 @@
for dep in keyDeps:
self.counts[dep][0] += 1
depFeatures = dicts.featureDict[key]
- for f,_w in depFeatures:
+ for f in depFeatures.iterkeys():
if self.counts[dep][1].has_key(f):
self.counts[dep][1][f] += 1
else:
@@ -59,12 +59,12 @@
dFeatureCounts = {}
# Give p |- p a higher weight
if not self.defaultPriorWeight == 0:
- for f,_w in features:
+ for f in features.iterkeys():
dFeatureCounts[f] = self.defaultPriorWeight
self.counts[dataPoint] = [self.defaultPriorWeight,dFeatureCounts]
for dep in dependencies:
self.counts[dep][0] += 1
- for f,_w in features:
+ for f in features.iterkeys():
if self.counts[dep][1].has_key(f):
self.counts[dep][1][f] += 1
else:
@@ -97,12 +97,14 @@
"""
tau = 0.05 # Jasmin, change value here
predictions = []
+ #observedFeatures = [f for f,_w in features]
+ observedFeatures = features.keys()
for a in accessibles:
posA = self.counts[a][0]
fA = set(self.counts[a][1].keys())
fWeightsA = self.counts[a][1]
resultA = log(posA)
- for f,w in features:
+ for f,w in features.iteritems():
# DEBUG
#w = 1.0
if f in fA:
@@ -114,9 +116,10 @@
else:
resultA += w*self.defVal
if not tau == 0.0:
- observedFeatures = [f for f,_w in features]
missingFeatures = list(fA.difference(observedFeatures))
- sumOfWeights = sum([log(float(fWeightsA[x])/posA) for x in missingFeatures])
+ #sumOfWeights = sum([log(float(fWeightsA[x])/posA) for x in missingFeatures]) # slower
+ sumOfWeights = sum([log(float(fWeightsA[x])) for x in missingFeatures]) - log(posA) * len(missingFeatures) #DEFAULT
+ #sumOfWeights = sum([log(float(fWeightsA[x])/self.totalFeatureCounts[x]) for x in missingFeatures]) - log(posA) * len(missingFeatures)
resultA -= tau * sumOfWeights
predictions.append(resultA)
predictions = array(predictions)
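
The rewritten sumOfWeights line relies on a logarithm identity: dividing each weight by posA inside the sum is the same as subtracting log(posA) once per missing feature, which hoists the division out of the loop. A quick check of the equivalence:

    from math import log

    # sum(log(w/p)) == sum(log(w)) - len(ws) * log(p)
    def slow(ws, p):
        return sum(log(float(w) / p) for w in ws)

    def fast(ws, p):
        return sum(log(float(w)) for w in ws) - log(p) * len(ws)

    assert abs(slow([2.0, 3.0, 5.0], 7.0) - fast([2.0, 3.0, 5.0], 7.0)) < 1e-12
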
--- a/src/HOL/Tools/Sledgehammer/MaSh/src/stats.py Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Tools/Sledgehammer/MaSh/src/stats.py Thu Sep 12 20:54:26 2013 +0200
@@ -97,7 +97,7 @@
badPreds.append(dep)
recall100 = len(predictions)+1
positives+=1
- self.logger.debug('Dependencies missing for %s in accessibles! Estimating Statistics.',\
+ self.logger.debug('Dependencies missing for %s in cutoff predictions! Estimating Statistics.',\
string.join([str(dep) for dep in missing],','))
if positives == 0 or negatives == 0:
@@ -113,7 +113,7 @@
self.badPreds = badPreds
self.avgAvailable += available
self.avgDepNr += depNr
- self.logger.info('Statement: %s: AUC: %s \t Needed: %s \t Recall100: %s \t Available: %s \t cutOff:%s',\
+ self.logger.info('Statement: %s: AUC: %s \t Needed: %s \t Recall100: %s \t Available: %s \t cutOff: %s',\
statementCounter,round(100*auc,2),depNr,recall100,available,self.cutOff)
def printAvg(self):
@@ -135,7 +135,7 @@
else:
         medianrecall100 = float(sorted(self.recall100Median)[nrDataPoints/2 - 1] + sorted(self.recall100Median)[nrDataPoints/2])/2
- returnString = 'avgAUC: %s \t medianAUC: %s \t avgDepNr: %s \t avgRecall100: %s \t medianRecall100: %s \t cutOff:%s' %\
+ returnString = 'avgAUC: %s \t medianAUC: %s \t avgDepNr: %s \t avgRecall100: %s \t medianRecall100: %s \t cutOff: %s' %\
(round(100*self.avgAUC/self.problems,2),round(100*medianAUC,2),round(self.avgDepNr/self.problems,2),round(self.avgRecall100/self.problems,2),round(medianrecall100,2),self.cutOff)
self.logger.info(returnString)
return returnString
@@ -143,44 +143,6 @@
"""
self.logger.info('avgAUC: %s \t medianAUC: %s \t avgDepNr: %s \t avgRecall100: %s \t medianRecall100: %s \t cutOff:%s', \
round(100*self.avgAUC/self.problems,2),round(100*medianAUC,2),round(self.avgDepNr/self.problems,2),round(self.avgRecall100/self.problems,2),round(medianrecall100,2),self.cutOff)
-
- #try:
- #if True:
- if False:
- from matplotlib.pyplot import plot,figure,show,xlabel,ylabel,axis,hist
- avgRecall = [float(x)/self.problems for x in self.recallData]
- figure('Recall')
- plot(range(self.cutOff),avgRecall)
- ylabel('Average Recall')
- xlabel('Highest ranked premises')
- axis([0,self.cutOff,0.0,1.0])
- figure('100%Recall')
- plot(range(self.cutOff),self.recall100Data)
- ylabel('100%Recall')
- xlabel('Highest ranked premises')
- axis([0,self.cutOff,0,self.problems])
- figure('AUC Histogram')
- hist(self.aucData,bins=100)
- ylabel('Problems')
- xlabel('AUC')
- maxCount = max(self.premiseOccurenceCounter.values())
- minCount = min(self.premiseOccurenceCounter.values())
- figure('Dependency Occurances')
- hist(self.premiseOccurenceCounter.values(),bins=range(minCount,maxCount+2),align = 'left')
- #ylabel('Occurences')
- xlabel('Number of Times a Dependency Occurs')
- figure('Dependency Appearance in Problems after Introduction.')
- hist(self.depAppearances,bins=50)
- figure('Dependency Appearance in Problems after Introduction in Percent.')
- xAxis = range(max(self.depAppearances)+1)
- yAxis = [0] * (max(self.depAppearances)+1)
- for val in self.depAppearances:
- yAxis[val] += 1
- yAxis = [float(x)/len(self.firstDepAppearance.keys()) for x in yAxis]
- plot(xAxis,yAxis)
- show()
- #except:
- # self.logger.warning('Matplotlib module missing. Skipping graphs.')
"""
def save(self,fileName):
--- a/src/HOL/Tools/Sledgehammer/sledgehammer_mash.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Tools/Sledgehammer/sledgehammer_mash.ML Thu Sep 12 20:54:26 2013 +0200
@@ -165,12 +165,12 @@
val model_dir = File.shell_path (mash_model_dir ())
val core = "--inputFile " ^ cmd_file ^ " --predictions " ^ sugg_file
val command =
- "cd \"$ISABELLE_SLEDGEHAMMER_MASH\"/src; " ^
- "./mash.py --quiet" ^
- " --outputDir " ^ model_dir ^
- " --modelFile=" ^ model_dir ^ "/model.pickle" ^
- " --dictsFile=" ^ model_dir ^ "/dict.pickle" ^
- " --log " ^ log_file ^ " " ^ core ^
+ "cd \"$ISABELLE_SLEDGEHAMMER_MASH\"/src; \
+ \PYTHONDONTWRITEBYTECODE=y ./mash.py --quiet\
+ \ --outputDir " ^ model_dir ^
+ " --modelFile=" ^ model_dir ^ "/model.pickle\
+ \ --dictsFile=" ^ model_dir ^ "/dict.pickle\
+ \ --log " ^ log_file ^ " " ^ core ^
(if extra_args = [] then "" else " " ^ space_implode " " extra_args) ^
" >& " ^ err_file ^
(if background then " &" else "")
@@ -218,9 +218,15 @@
Date.fmt ".%Y%m%d_%H%M%S__" (Date.fromTimeLocal (Time.now ())) ^
serial_string ()
+(* Avoid scientific notation *)
+fun safe_str_of_real r =
+ if r < 0.00001 then "0.00001"
+ else if r >= 1000000.0 then "1000000"
+ else Markup.print_real r
+
fun encode_feature (name, weight) =
encode_str name ^
- (if Real.== (weight, 1.0) then "" else "=" ^ Markup.print_real weight)
+ (if Real.== (weight, 1.0) then "" else "=" ^ safe_str_of_real weight)
val encode_features = map encode_feature #> space_implode " "
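
safe_str_of_real clamps weights to the interval [0.00001, 1000000] before printing, so the Python side never has to parse scientific notation. A Python rendering of the resulting wire format (the %.5f formatting is an assumption for the sketch; the ML code prints via Markup.print_real):

    # "name" when the weight is 1.0, otherwise "name=weight" with the
    # weight clamped; %f never produces scientific notation.
    def encode_feature(name, weight):
        if weight == 1.0:
            return name
        weight = min(max(weight, 0.00001), 1000000.0)
        return '%s=%.5f' % (name, weight)

    def encode_features(features):
        return ' '.join(encode_feature(n, w) for n, w in features)
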
@@ -445,7 +451,8 @@
end
-fun mash_unlearn ctxt ({overlord, ...} : params) = clear_state ctxt overlord
+fun mash_unlearn ctxt ({overlord, ...} : params) =
+ (clear_state ctxt overlord; Output.urgent_message "Reset MaSh.")
(*** Isabelle helpers ***)
@@ -944,14 +951,17 @@
fold (fn s => Symtab.map_default (s, 0) (Integer.add 1))
(Term.add_const_names t [])
-fun mash_suggested_facts ctxt ({overlord, ...} : params) prover max_facts hyp_ts
- concl_t facts =
+fun mash_suggested_facts ctxt ({debug, overlord, ...} : params) prover max_facts
+ hyp_ts concl_t facts =
let
val thy = Proof_Context.theory_of ctxt
+ val thy_name = Context.theory_name thy
val facts = facts |> sort (crude_thm_ord o pairself snd o swap)
val chained = facts |> filter (fn ((_, (scope, _)), _) => scope = Chained)
val num_facts = length facts
val const_tab = fold (add_const_counts o prop_of o snd) facts Symtab.empty
+ fun fact_has_right_theory (_, th) =
+ thy_name = Context.theory_name (theory_of_thm th)
fun chained_or_extra_features_of factor (((_, stature), th), weight) =
[prop_of th]
|> features_of ctxt prover (theory_of_thm th) num_facts const_tab stature
@@ -964,8 +974,8 @@
let
val parents = maximal_wrt_access_graph access_G facts
val goal_feats =
- features_of ctxt prover thy num_facts const_tab
- (Local, General) (concl_t :: hyp_ts)
+ features_of ctxt prover thy num_facts const_tab (Local, General)
+ (concl_t :: hyp_ts)
val chained_feats =
chained
|> map (rpair 1.0)
@@ -974,12 +984,14 @@
val extra_feats =
facts
|> take (Int.max (0, num_extra_feature_facts - length chained))
+ |> filter fact_has_right_theory
|> weight_facts_steeply
|> map (chained_or_extra_features_of extra_feature_factor)
|> rpair [] |-> fold (union (eq_fst (op =)))
val feats =
fold (union (eq_fst (op =))) [chained_feats, extra_feats]
goal_feats
+ |> debug ? sort (Real.compare o swap o pairself snd)
val hints =
chained |> filter (is_fact_in_graph access_G o snd)
|> map (nickname_of_thm o snd)
--- a/src/HOL/Tools/Sledgehammer/sledgehammer_mepo.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Tools/Sledgehammer/sledgehammer_mepo.ML Thu Sep 12 20:54:26 2013 +0200
@@ -15,7 +15,6 @@
val trace : bool Config.T
val pseudo_abs_name : string
- val pseudo_skolem_prefix : string
val mepo_suggested_facts :
Proof.context -> params -> string -> int -> relevance_fudge option
-> term list -> term -> raw_fact list -> fact list
@@ -30,12 +29,11 @@
open Sledgehammer_Provers
val trace =
- Attrib.setup_config_bool @{binding sledgehammer_filter_iter_trace} (K false)
+ Attrib.setup_config_bool @{binding sledgehammer_mepo_trace} (K false)
fun trace_msg ctxt msg = if Config.get ctxt trace then tracing (msg ()) else ()
val sledgehammer_prefix = "Sledgehammer" ^ Long_Name.separator
val pseudo_abs_name = sledgehammer_prefix ^ "abs"
-val pseudo_skolem_prefix = sledgehammer_prefix ^ "sko"
val theory_const_suffix = Long_Name.separator ^ " 1"
fun order_of_type (Type (@{type_name fun}, [T1, T2])) =
@@ -88,9 +86,7 @@
(* Add a pconstant to the table, but a [] entry means a standard
connective, which we ignore.*)
-fun add_pconst_to_table also_skolem (s, p) =
- if (not also_skolem andalso String.isPrefix pseudo_skolem_prefix s) then I
- else Symtab.map_default (s, [p]) (insert (op =) p)
+fun add_pconst_to_table (s, p) = Symtab.map_default (s, [p]) (insert (op =) p)
(* Set constants tend to pull in too many irrelevant facts. We limit the damage
by treating them more or less as if they were built-in but add their
@@ -98,20 +94,15 @@
val set_consts = [@{const_name Collect}, @{const_name Set.member}]
val set_thms = @{thms Collect_mem_eq mem_Collect_eq Collect_cong}
-fun add_pconsts_in_term thy is_built_in_const also_skolems pos =
+fun add_pconsts_in_term thy is_built_in_const =
let
- val flip = Option.map not
- (* We include free variables, as well as constants, to handle locales. For
- each quantifiers that must necessarily be skolemized by the automatic
- prover, we introduce a fresh constant to simulate the effect of
- Skolemization. *)
fun do_const const ext_arg (x as (s, _)) ts =
let val (built_in, ts) = is_built_in_const x ts in
if member (op =) set_consts s then
fold (do_term ext_arg) ts
else
(not built_in
- ? add_pconst_to_table also_skolems (rich_pconst thy const x))
+ ? add_pconst_to_table (rich_pconst thy const x))
#> fold (do_term false) ts
end
and do_term ext_arg t =
@@ -123,61 +114,45 @@
(* Since lambdas on the right-hand side of equalities are usually
extensionalized later by "abs_extensionalize_term", we don't
penalize them here. *)
- ? add_pconst_to_table true (pseudo_abs_name,
- PType (order_of_type T + 1, [])))
+ ? add_pconst_to_table (pseudo_abs_name,
+ PType (order_of_type T + 1, [])))
#> fold (do_term false) (t' :: ts)
| (_, ts) => fold (do_term false) ts
- fun do_quantifier will_surely_be_skolemized abs_T body_t =
- do_formula pos body_t
- #> (if also_skolems andalso will_surely_be_skolemized then
- add_pconst_to_table true (pseudo_skolem_prefix ^ serial_string (),
- PType (order_of_type abs_T, []))
- else
- I)
and do_term_or_formula ext_arg T =
- if T = HOLogic.boolT then do_formula NONE else do_term ext_arg
- and do_formula pos t =
+ if T = HOLogic.boolT then do_formula else do_term ext_arg
+ and do_formula t =
case t of
- Const (@{const_name all}, _) $ Abs (_, T, t') =>
- do_quantifier (pos = SOME false) T t'
- | @{const "==>"} $ t1 $ t2 =>
- do_formula (flip pos) t1 #> do_formula pos t2
+ Const (@{const_name all}, _) $ Abs (_, T, t') => do_formula t'
+ | @{const "==>"} $ t1 $ t2 => do_formula t1 #> do_formula t2
| Const (@{const_name "=="}, Type (_, [T, _])) $ t1 $ t2 =>
do_term_or_formula false T t1 #> do_term_or_formula true T t2
- | @{const Trueprop} $ t1 => do_formula pos t1
+ | @{const Trueprop} $ t1 => do_formula t1
| @{const False} => I
| @{const True} => I
- | @{const Not} $ t1 => do_formula (flip pos) t1
- | Const (@{const_name All}, _) $ Abs (_, T, t') =>
- do_quantifier (pos = SOME false) T t'
- | Const (@{const_name Ex}, _) $ Abs (_, T, t') =>
- do_quantifier (pos = SOME true) T t'
- | @{const HOL.conj} $ t1 $ t2 => fold (do_formula pos) [t1, t2]
- | @{const HOL.disj} $ t1 $ t2 => fold (do_formula pos) [t1, t2]
- | @{const HOL.implies} $ t1 $ t2 =>
- do_formula (flip pos) t1 #> do_formula pos t2
+ | @{const Not} $ t1 => do_formula t1
+ | Const (@{const_name All}, _) $ Abs (_, T, t') => do_formula t'
+ | Const (@{const_name Ex}, _) $ Abs (_, T, t') => do_formula t'
+ | @{const HOL.conj} $ t1 $ t2 => do_formula t1 #> do_formula t2
+ | @{const HOL.disj} $ t1 $ t2 => do_formula t1 #> do_formula t2
+ | @{const HOL.implies} $ t1 $ t2 => do_formula t1 #> do_formula t2
| Const (@{const_name HOL.eq}, Type (_, [T, _])) $ t1 $ t2 =>
do_term_or_formula false T t1 #> do_term_or_formula true T t2
| Const (@{const_name If}, Type (_, [_, Type (_, [T, _])]))
$ t1 $ t2 $ t3 =>
- do_formula NONE t1 #> fold (do_term_or_formula false T) [t2, t3]
- | Const (@{const_name Ex1}, _) $ Abs (_, T, t') =>
- do_quantifier (is_some pos) T t'
+ do_formula t1 #> fold (do_term_or_formula false T) [t2, t3]
+ | Const (@{const_name Ex1}, _) $ Abs (_, T, t') => do_formula t'
| Const (@{const_name Ball}, _) $ t1 $ Abs (_, T, t') =>
- do_quantifier (pos = SOME false) T
- (HOLogic.mk_imp (incr_boundvars 1 t1 $ Bound 0, t'))
+ do_formula (t1 $ Bound ~1) #> do_formula t'
| Const (@{const_name Bex}, _) $ t1 $ Abs (_, T, t') =>
- do_quantifier (pos = SOME true) T
- (HOLogic.mk_conj (incr_boundvars 1 t1 $ Bound 0, t'))
+ do_formula (t1 $ Bound ~1) #> do_formula t'
| (t0 as Const (_, @{typ bool})) $ t1 =>
- do_term false t0 #> do_formula pos t1 (* theory constant *)
+ do_term false t0 #> do_formula t1 (* theory constant *)
| _ => do_term false t
- in do_formula pos end
+ in do_formula end
fun pconsts_in_fact thy is_built_in_const t =
Symtab.fold (fn (s, pss) => fold (cons o pair s) pss)
- (Symtab.empty |> add_pconsts_in_term thy is_built_in_const true
- (SOME true) t) []
+ (Symtab.empty |> add_pconsts_in_term thy is_built_in_const t) []
(* Inserts a dummy "constant" referring to the theory name, so that relevance
takes the given theory into account. *)
@@ -263,13 +238,11 @@
if String.isSubstring "." s then 1.0 else local_const_multiplier
(* Computes a constant's weight, as determined by its frequency. *)
-fun generic_pconst_weight local_const_multiplier abs_weight skolem_weight
- theory_const_weight chained_const_weight weight_for f
- const_tab chained_const_tab (c as (s, PType (m, _))) =
+fun generic_pconst_weight local_const_multiplier abs_weight theory_const_weight
+ chained_const_weight weight_for f const_tab chained_const_tab
+ (c as (s, PType (m, _))) =
if s = pseudo_abs_name then
abs_weight
- else if String.isPrefix pseudo_skolem_prefix s then
- skolem_weight
else if String.isSuffix theory_const_suffix s then
theory_const_weight
else
@@ -284,19 +257,18 @@
fun rel_pconst_weight ({local_const_multiplier, abs_rel_weight,
theory_const_rel_weight, ...} : relevance_fudge)
const_tab =
- generic_pconst_weight local_const_multiplier abs_rel_weight 0.0
+ generic_pconst_weight local_const_multiplier abs_rel_weight
theory_const_rel_weight 0.0 rel_weight_for I const_tab
Symtab.empty
fun irrel_pconst_weight (fudge as {local_const_multiplier, abs_irrel_weight,
- skolem_irrel_weight,
theory_const_irrel_weight,
chained_const_irrel_weight, ...})
const_tab chained_const_tab =
generic_pconst_weight local_const_multiplier abs_irrel_weight
- skolem_irrel_weight theory_const_irrel_weight
- chained_const_irrel_weight (irrel_weight_for fudge) swap
- const_tab chained_const_tab
+ theory_const_irrel_weight chained_const_irrel_weight
+ (irrel_weight_for fudge) swap const_tab
+ chained_const_tab
fun stature_bonus ({intro_bonus, ...} : relevance_fudge) (_, Intro) =
intro_bonus
@@ -308,8 +280,7 @@
| stature_bonus _ _ = 0.0
fun is_odd_const_name s =
- s = pseudo_abs_name orelse String.isPrefix pseudo_skolem_prefix s orelse
- String.isSuffix theory_const_suffix s
+ s = pseudo_abs_name orelse String.isSuffix theory_const_suffix s
fun fact_weight fudge stature const_tab rel_const_tab chained_const_tab
fact_consts =
@@ -358,7 +329,7 @@
fun if_empty_replace_with_scope thy is_built_in_const facts sc tab =
if Symtab.is_empty tab then
Symtab.empty
- |> fold (add_pconsts_in_term thy is_built_in_const false (SOME false))
+ |> fold (add_pconsts_in_term thy is_built_in_const)
(map_filter (fn ((_, (sc', _)), th) =>
if sc' = sc then SOME (prop_of th) else NONE) facts)
else
@@ -397,15 +368,15 @@
let
val thy = Proof_Context.theory_of ctxt
val const_tab = fold (count_fact_consts thy fudge) facts Symtab.empty
- val add_pconsts = add_pconsts_in_term thy is_built_in_const false o SOME
+ val add_pconsts = add_pconsts_in_term thy is_built_in_const
val chained_ts =
facts |> map_filter (fn ((_, (Chained, _)), th) => SOME (prop_of th)
| _ => NONE)
- val chained_const_tab = Symtab.empty |> fold (add_pconsts true) chained_ts
+ val chained_const_tab = Symtab.empty |> fold add_pconsts chained_ts
val goal_const_tab =
Symtab.empty
- |> fold (add_pconsts true) hyp_ts
- |> add_pconsts false concl_t
+ |> fold add_pconsts hyp_ts
+ |> add_pconsts concl_t
|> (fn tab => if Symtab.is_empty tab then chained_const_tab else tab)
|> fold (if_empty_replace_with_scope thy is_built_in_const facts)
[Chained, Assum, Local]
@@ -428,7 +399,7 @@
take_most_relevant ctxt max_facts remaining_max fudge candidates
val sps = maps (snd o fst) accepts
val rel_const_tab' =
- rel_const_tab |> fold (add_pconst_to_table false) sps
+ rel_const_tab |> fold add_pconst_to_table sps
fun is_dirty (s, _) =
Symtab.lookup rel_const_tab' s <> Symtab.lookup rel_const_tab s
val (hopeful_rejects, hopeless_rejects) =
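
With the skolem case gone, the constant-weight dispatch in generic_pconst_weight reduces to three cases: the abstraction pseudo-constant, theory constants, and a frequency-based weight for everything else. A schematic Python paraphrase, omitting the chained-constant handling (weight_for stands for rel_weight_for/irrel_weight_for; const_tab maps constant names to occurrence counts; the two default literals mirror pseudo_abs_name and theory_const_suffix in the ML source above):

    def pconst_weight(name, abs_weight, theory_const_weight, weight_for, const_tab,
                      pseudo_abs_name='Sledgehammer.abs',
                      theory_const_suffix='. 1'):
        if name == pseudo_abs_name:
            return abs_weight
        elif name.endswith(theory_const_suffix):
            return theory_const_weight
        else:
            return weight_for(const_tab.get(name, 0))
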
--- a/src/HOL/Tools/Sledgehammer/sledgehammer_provers.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Tools/Sledgehammer/sledgehammer_provers.ML Thu Sep 12 20:54:26 2013 +0200
@@ -53,7 +53,6 @@
higher_order_irrel_weight : real,
abs_rel_weight : real,
abs_irrel_weight : real,
- skolem_irrel_weight : real,
theory_const_rel_weight : real,
theory_const_irrel_weight : real,
chained_const_irrel_weight : real,
@@ -276,7 +275,6 @@
higher_order_irrel_weight = 1.05,
abs_rel_weight = 0.5,
abs_irrel_weight = 2.0,
- skolem_irrel_weight = 0.05,
theory_const_rel_weight = 0.5,
theory_const_irrel_weight = 0.25,
chained_const_irrel_weight = 0.25,
@@ -298,7 +296,6 @@
higher_order_irrel_weight = #higher_order_irrel_weight atp_relevance_fudge,
abs_rel_weight = #abs_rel_weight atp_relevance_fudge,
abs_irrel_weight = #abs_irrel_weight atp_relevance_fudge,
- skolem_irrel_weight = #skolem_irrel_weight atp_relevance_fudge,
theory_const_rel_weight = #theory_const_rel_weight atp_relevance_fudge,
theory_const_irrel_weight = #theory_const_irrel_weight atp_relevance_fudge,
chained_const_irrel_weight = #chained_const_irrel_weight atp_relevance_fudge,
@@ -371,7 +368,6 @@
higher_order_irrel_weight : real,
abs_rel_weight : real,
abs_irrel_weight : real,
- skolem_irrel_weight : real,
theory_const_rel_weight : real,
theory_const_irrel_weight : real,
chained_const_irrel_weight : real,
--- a/src/HOL/Tools/Sledgehammer/sledgehammer_run.ML Thu Sep 12 20:53:25 2013 +0200
+++ b/src/HOL/Tools/Sledgehammer/sledgehammer_run.ML Thu Sep 12 20:54:26 2013 +0200
@@ -246,7 +246,8 @@
factss = factss}
fun learn prover =
mash_learn_proof ctxt params prover (prop_of goal) all_facts
- val launch = launch_prover params mode output_result minimize_command only learn
+ val launch =
+ launch_prover params mode output_result minimize_command only learn
in
if mode = Auto_Try then
(unknownN, state)