--- a/ANNOUNCE Mon Jun 07 13:42:38 2010 +0200
+++ b/ANNOUNCE Tue Jun 08 07:45:39 2010 +0200
@@ -9,23 +9,24 @@
* Explicit proof terms for type class reasoning.
-* Authentic syntax for *all* logical entities (type classes, type
-constructors, term constants): provides simple and robust
-correspondence between formal entities and concrete syntax.
+* Robust treatment of concrete syntax for different logical entities;
+mixfix syntax in local proof context.
-* HOL: Package for constructing quotient types.
+* Type declarations and notation within local theory context.
+
+* HOL: package for quotient types.
-* HOL: Code generation now with simple concept for abstract
-datatypes obeying invariants; applications for typical data structures
-(e.g. search trees) can be found in the library.
+* HOL code generation: simple concept for abstract datatypes obeying
+invariants (e.g. red-black trees).
-* HOL: New development of the Reals using Cauchy Sequences.
+* HOL: new development of the Reals using Cauchy Sequences.
-* HOL: Reorganization of abstract algebra type class hierarchy.
+* HOL: reorganization of abstract algebra type class hierarchy.
-* Commands 'types', 'typedecl' and 'typedef' now work within a local theory
-context -- without introducing dependencies on parameters or
-assumptions.
+* HOL: substantial reorganization of main library and related tools.
+
+* HOLCF: reorganization of 'domain' package.
+
You may get Isabelle2009-2 from the following mirror sites:
--- a/CONTRIBUTORS Mon Jun 07 13:42:38 2010 +0200
+++ b/CONTRIBUTORS Tue Jun 08 07:45:39 2010 +0200
@@ -3,6 +3,10 @@
who is listed as an author in one of the source files of this Isabelle
distribution.
+Contributions to this Isabelle version
+--------------------------------------
+
+
Contributions to Isabelle2009-2
--------------------------------------
--- a/NEWS Mon Jun 07 13:42:38 2010 +0200
+++ b/NEWS Tue Jun 08 07:45:39 2010 +0200
@@ -1,17 +1,16 @@
Isabelle NEWS -- history user-relevant changes
==============================================
+New in this Isabelle version
+----------------------------
+
+
+
New in Isabelle2009-2 (June 2010)
---------------------------------
*** General ***
-* Schematic theorem statements need to be explicitly markup as such,
-via commands 'schematic_lemma', 'schematic_theorem',
-'schematic_corollary'. Thus the relevance of the proof is made
-syntactically clear, which impacts performance in a parallel or
-asynchronous interactive environment. Minor INCOMPATIBILITY.
-
* Authentic syntax for *all* logical entities (type classes, type
constructors, term constants): provides simple and robust
correspondence between formal entities and concrete syntax. Within
@@ -69,8 +68,19 @@
'def', 'obtain' etc. or via the explicit 'write' command, which is
similar to the 'notation' command in theory specifications.
+* Discontinued unnamed infix syntax (legacy feature for many years) --
+need to specify constant name and syntax separately. Internal ML
+datatype constructors have been renamed from InfixName to Infix etc.
+Minor INCOMPATIBILITY.
+
+* Schematic theorem statements need to be explicitly marked up as such,
+via commands 'schematic_lemma', 'schematic_theorem',
+'schematic_corollary'. Thus the relevance of the proof is made
+syntactically clear, which impacts performance in a parallel or
+asynchronous interactive environment. Minor INCOMPATIBILITY.
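+
+A minimal sketch of the new markup (the lemma name "ex" is purely
+illustrative):
+
+  schematic_lemma ex: "?x = (0::nat)"
+    by (rule refl)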
+
* Use of cumulative prems via "!" in some proof methods has been
-discontinued (legacy feature).
+discontinued (old legacy feature).
* References 'trace_simp' and 'debug_simp' have been replaced by
configuration options stored in the context. Enabling tracing (the
@@ -92,20 +102,13 @@
*** Pure ***
-* Predicates of locales introduced by classes carry a mandatory
-"class" prefix. INCOMPATIBILITY.
-
-* Command 'code_reflect' allows to incorporate generated ML code into
-runtime environment; replaces immature code_datatype antiquotation.
-INCOMPATIBILITY.
-
-* Empty class specifications observe default sort. INCOMPATIBILITY.
-
-* Old 'axclass' command has been discontinued. Use 'class' instead.
-INCOMPATIBILITY.
-
-* Code generator: simple concept for abstract datatypes obeying
-invariants.
+* Proof terms record type-class reasoning explicitly, using the
+"unconstrain" operation internally. This eliminates all sort
+constraints from a theorem and proof, introducing explicit
+OFCLASS-premises. On the proof term level, this operation is
+automatically applied at theorem boundaries, such that closed proofs
+are always free of sort constraints. INCOMPATIBILITY for tools that
+inspect proof terms.
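+
+For tools that do inspect proof terms, the underlying operation is
+also available in ML as Thm.unconstrainT; a minimal sketch that can be
+applied to any theorem:
+
+  ML_val {* Thm.unconstrainT @{thm refl} *}
+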
* Local theory specifications may depend on extra type variables that
are not present in the result type -- arguments TYPE('a) :: 'a itself
@@ -113,18 +116,28 @@
definition unitary :: bool where "unitary = (ALL (x::'a) y. x = y)"
+* Predicates of locales introduced by classes carry a mandatory
+"class" prefix. INCOMPATIBILITY.
+
+* Vacuous class specifications observe default sort. INCOMPATIBILITY.
+
+* Old 'axclass' command has been discontinued. INCOMPATIBILITY, use
+'class' instead.
+
+* Command 'code_reflect' allows generated ML code to be incorporated
+into the runtime environment; it replaces the immature code_datatype
+antiquotation. INCOMPATIBILITY.
+
+* Code generator: simple concept for abstract datatypes obeying
+invariants.
+
* Code generator: details of internal data cache have no impact on the
user space functionality any longer.
-* Methods unfold_locales and intro_locales ignore non-locale subgoals.
-This is more appropriate for interpretations with 'where'.
+* Methods "unfold_locales" and "intro_locales" ignore non-locale
+subgoals. This is more appropriate for interpretations with 'where'.
INCOMPATIBILITY.
-* Discontinued unnamed infix syntax (legacy feature for many years) --
-need to specify constant name and syntax separately. Internal ML
-datatype constructors have been renamed from InfixName to Infix etc.
-Minor INCOMPATIBILITY.
-
* Command 'example_proof' opens an empty proof body. This allows to
experiment with Isar, without producing any persistent result.
@@ -139,24 +152,42 @@
* Command 'defaultsort' has been renamed to 'default_sort', it works
within a local theory context. Minor INCOMPATIBILITY.
-* Raw axioms/defs may no longer carry sort constraints, and raw defs
-may no longer carry premises. User-level specifications are
-transformed accordingly by Thm.add_axiom/add_def.
-
-* Proof terms: Type substitutions on proof constants now use canonical
-order of type variables. INCOMPATIBILITY for tools working with proof
-terms.
-
-* New operation Thm.unconstrainT eliminates all sort constraints from
-a theorem and proof, introducing explicit OFCLASS-premises. On the
-proof term level, this operation is automatically applied at PThm
-boundaries, such that closed proofs are always free of sort
-constraints. The old (axiomatic) unconstrain operation has been
-discontinued. INCOMPATIBILITY for tools working with proof terms.
-
*** HOL ***
+* Command 'typedef' now works within a local theory context -- without
+introducing dependencies on parameters or assumptions, which is not
+possible in Isabelle/Pure/HOL. Note that the logical environment may
+contain multiple interpretations of local typedefs (with different
+non-emptiness proofs), even in a global theory context.
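+
+A minimal sketch of a local-theory typedef (the locale name
+"container" and the closing proof are illustrative only):
+
+  locale container
+  begin
+
+  typedef flag = "{True}"
+    by auto
+
+  end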
+
+* New package for quotient types. Commands 'quotient_type' and
+'quotient_definition' may be used for defining types and constants by
+quotient constructions. An example is the type of integers created by
+quotienting pairs of natural numbers:
+
+ fun
+ intrel :: "(nat * nat) => (nat * nat) => bool"
+ where
+ "intrel (x, y) (u, v) = (x + v = u + y)"
+
+ quotient_type int = "nat × nat" / intrel
+ by (auto simp add: equivp_def expand_fun_eq)
+
+ quotient_definition
+ "0::int" is "(0::nat, 0::nat)"
+
+The method "lifting" can be used to lift theorems from the underlying
+"raw" type to the quotient type. The example
+src/HOL/Quotient_Examples/FSet.thy includes such a quotient
+construction and provides a reasoning infrastructure for finite sets.
+
+* Renamed Library/Quotient.thy to Library/Quotient_Type.thy to avoid
+clash with new theory Quotient in Main HOL.
+
+* Moved the SMT binding into the main HOL session, eliminating the
+separate HOL-SMT session.
+
* List membership infix mem operation is only an input abbreviation.
INCOMPATIBILITY.
@@ -177,8 +208,8 @@
instead. INCOMPATIBILITY.
* Dropped several real-specific versions of lemmas about floor and
-ceiling; use the generic lemmas from Archimedean_Field.thy instead.
-INCOMPATIBILITY.
+ceiling; use the generic lemmas from theory "Archimedean_Field"
+instead. INCOMPATIBILITY.
floor_number_of_eq ~> floor_number_of
le_floor_eq_number_of ~> number_of_le_floor
@@ -218,26 +249,19 @@
provides abstract red-black tree type which is backed by "RBT_Impl" as
implementation. INCOMPATIBILTY.
-* Command 'typedef' now works within a local theory context -- without
-introducing dependencies on parameters or assumptions, which is not
-possible in Isabelle/Pure/HOL. Note that the logical environment may
-contain multiple interpretations of local typedefs (with different
-non-emptiness proofs), even in a global theory context.
-
* Theory Library/Coinductive_List has been removed -- superseded by
AFP/thys/Coinductive.
* Theory PReal, including the type "preal" and related operations, has
been removed. INCOMPATIBILITY.
-* Split off theory Big_Operators containing setsum, setprod, Inf_fin,
-Sup_fin, Min, Max from theory Finite_Set. INCOMPATIBILITY.
+* Split off theory "Big_Operators" containing setsum, setprod,
+Inf_fin, Sup_fin, Min, Max from theory Finite_Set. INCOMPATIBILITY.
* Theory "Rational" renamed to "Rat", for consistency with "Nat",
"Int" etc. INCOMPATIBILITY.
-* Constant Rat.normalize needs to be qualified. Minor
-INCOMPATIBILITY.
+* Constant Rat.normalize needs to be qualified. INCOMPATIBILITY.
* New set of rules "ac_simps" provides combined assoc / commute
rewrites for all interpretations of the appropriate generic locales.
@@ -295,7 +319,7 @@
ordered_semiring_strict ~> linordered_semiring_strict
The following slightly odd type classes have been moved to a
- separate theory Library/Lattice_Algebras.thy:
+ separate theory Library/Lattice_Algebras:
lordered_ab_group_add ~> lattice_ab_group_add
lordered_ab_group_add_abs ~> lattice_ab_group_add_abs
@@ -335,18 +359,16 @@
INTER_fold_inter ~> INFI_fold_inf
UNION_fold_union ~> SUPR_fold_sup
-* Theory Complete_Lattice: lemmas top_def and bot_def have been
+* Theory "Complete_Lattice": lemmas top_def and bot_def have been
replaced by the more convenient lemmas Inf_empty and Sup_empty.
Dropped lemmas Inf_insert_simp and Sup_insert_simp, which are subsumed
by Inf_insert and Sup_insert. Lemmas Inf_UNIV and Sup_UNIV replace
former Inf_Univ and Sup_Univ. Lemmas inf_top_right and sup_bot_right
subsume inf_top and sup_bot respectively. INCOMPATIBILITY.
-* HOLogic.strip_psplit: types are returned in syntactic order, similar
-to other strip and tuple operations. INCOMPATIBILITY.
-
* Reorganized theory Multiset: swapped notation of pointwise and
multiset order:
+
- pointwise ordering is instance of class order with standard syntax
<= and <;
- multiset ordering has syntax <=# and <#; partial order properties
@@ -357,10 +379,13 @@
generation;
- use insert_DiffM2 [symmetric] instead of elem_imp_eq_diff_union,
if needed.
+
Renamed:
- multiset_eq_conv_count_eq -> multiset_ext_iff
- multi_count_ext -> multiset_ext
- diff_union_inverse2 -> diff_union_cancelR
+
+ multiset_eq_conv_count_eq ~> multiset_ext_iff
+ multi_count_ext ~> multiset_ext
+ diff_union_inverse2 ~> diff_union_cancelR
+
INCOMPATIBILITY.
* Theory Permutation: replaced local "remove" by List.remove1.
@@ -369,9 +394,6 @@
* Theory List: added transpose.
-* Renamed Library/Quotient.thy to Library/Quotient_Type.thy to avoid
-clash with new theory Quotient in Main HOL.
-
* Library/Nat_Bijection.thy is a collection of bijective functions
between nat and other types, which supersedes the older libraries
Library/Nat_Int_Bij.thy and HOLCF/NatIso.thy. INCOMPATIBILITY.
@@ -452,8 +474,6 @@
"sharing_depth", and "show_skolems" options. INCOMPATIBILITY.
- Removed "nitpick_intro" attribute. INCOMPATIBILITY.
-* Moved the SMT binding into the HOL image.
-
*** HOLCF ***
@@ -469,7 +489,7 @@
* Most definedness lemmas generated by the domain package (previously
of the form "x ~= UU ==> foo$x ~= UU") now have an if-and-only-if form
like "foo$x = UU <-> x = UU", which works better as a simp rule.
-Proof scripts that used definedness lemmas as intro rules may break,
+Proofs that used definedness lemmas as intro rules may break,
potential INCOMPATIBILITY.
* Induction and casedist rules generated by the domain package now
@@ -513,6 +533,38 @@
*** ML ***
+* Antiquotations for basic formal entities:
+
+ @{class NAME} -- type class
+ @{class_syntax NAME} -- syntax representation of the above
+
+ @{type_name NAME} -- logical type
+ @{type_abbrev NAME} -- type abbreviation
+ @{nonterminal NAME} -- type of concrete syntactic category
+ @{type_syntax NAME} -- syntax representation of any of the above
+
+ @{const_name NAME} -- logical constant (INCOMPATIBILITY)
+ @{const_abbrev NAME} -- abbreviated constant
+ @{const_syntax NAME} -- syntax representation of any of the above
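+
+A small sketch of how these can be referenced from ML text in a theory
+(using entities that exist in Main HOL):
+
+  ML_val {* (@{type_name bool}, @{const_name Trueprop}) *}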
+
+* Antiquotation @{syntax_const NAME} ensures that NAME refers to a raw
+syntax constant (cf. 'syntax' command).
+
+* Antiquotation @{make_string} inlines a function to print arbitrary
+values similar to the ML toplevel. The result is compiler dependent
+and may fall back on "?" in certain situations.
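+
+One possible use (writeln prints to the message channel):
+
+  ML_val {* writeln (@{make_string} ([1, 2, 3], "foo")) *}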
+
+* Diagnostic commands 'ML_val' and 'ML_command' may refer to
+antiquotations @{Isar.state} and @{Isar.goal}. This replaces impure
+Isar.state() and Isar.goal(), which belong to the old TTY loop and do
+not work with the asynchronous Isar document model.
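+
+For example, inside a proof one might inspect the current goal state
+(a sketch; this only works where a goal is actually present):
+
+  ML_val {* @{Isar.goal} *}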
+
+* Configuration options now admit dynamic default values, depending on
+the context or even global references.
+
+* SHA1.digest digests strings according to SHA-1 (see RFC 3174). It
+uses an efficient external library if available (for Poly/ML).
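+
+For instance:
+
+  ML_val {* SHA1.digest "abc" *}
+
+The result should correspond to the well-known SHA-1 digest of "abc",
+a9993e364706816aba3e25717850c26c9cd0d89d (modulo how the SHA1 module
+wraps its result).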
+
* Renamed some important ML structures, while keeping the old names
for some time as aliases within the structure Legacy:
@@ -541,32 +593,6 @@
Pretty.pp argument to merge, which is absent in the standard
Theory_Data version.
-* Antiquotations for basic formal entities:
-
- @{class NAME} -- type class
- @{class_syntax NAME} -- syntax representation of the above
-
- @{type_name NAME} -- logical type
- @{type_abbrev NAME} -- type abbreviation
- @{nonterminal NAME} -- type of concrete syntactic category
- @{type_syntax NAME} -- syntax representation of any of the above
-
- @{const_name NAME} -- logical constant (INCOMPATIBILITY)
- @{const_abbrev NAME} -- abbreviated constant
- @{const_syntax NAME} -- syntax representation of any of the above
-
-* Antiquotation @{syntax_const NAME} ensures that NAME refers to a raw
-syntax constant (cf. 'syntax' command).
-
-* Antiquotation @{make_string} inlines a function to print arbitrary
-values similar to the ML toplevel. The result is compiler dependent
-and may fall back on "?" in certain situations.
-
-* Diagnostic commands 'ML_val' and 'ML_command' may refer to
-antiquotations @{Isar.state} and @{Isar.goal}. This replaces impure
-Isar.state() and Isar.goal(), which belong to the old TTY loop and do
-not work with the asynchronous Isar document model.
-
* Sorts.certify_sort and derived "cert" operations for types and terms
no longer minimize sorts. Thus certification at the boundary of the
inference kernel becomes invariant under addition of class relations,
@@ -586,15 +612,17 @@
to emphasize that these only work in a global situation (which is
quite rare).
-* Configuration options now admit dynamic default values, depending on
-the context or even global references.
-
-* SHA1.digest digests strings according to SHA-1 (see RFC 3174). It
-uses an efficient external library if available (for Poly/ML).
-
* Curried take and drop in library.ML; negative length is interpreted
as infinity (as in chop). Subtle INCOMPATIBILITY.
+* Proof terms: type substitutions on proof constants now use canonical
+order of type variables. INCOMPATIBILITY for tools working with proof
+terms.
+
+* Raw axioms/defs may no longer carry sort constraints, and raw defs
+may no longer carry premises. User-level specifications are
+transformed accordingly by Thm.add_axiom/add_def.
+
*** System ***
@@ -3345,6 +3373,8 @@
* Real: proper support for ML code generation, including 'quickcheck'.
Reals are implemented as arbitrary precision rationals.
+* Real: new development using Cauchy Sequences.
+
* Hyperreal: Several constants that previously worked only for the
reals have been generalized, so they now work over arbitrary vector
spaces. Type annotations may need to be added in some cases; potential
--- a/src/HOL/Nominal/Examples/CK_Machine.thy Mon Jun 07 13:42:38 2010 +0200
+++ b/src/HOL/Nominal/Examples/CK_Machine.thy Tue Jun 08 07:45:39 2010 +0200
@@ -492,7 +492,7 @@
shows "(\<Delta>@\<Gamma>) \<turnstile> e[x::=e'] : T"
using a b
proof (nominal_induct \<Gamma>'\<equiv>"\<Delta>@[(x,T')]@\<Gamma>" e T avoiding: x e' \<Delta> rule: typing.strong_induct)
- case (t_VAR \<Gamma>' y T x e' \<Delta>)
+ case (t_VAR y T x e' \<Delta>)
then have a1: "valid (\<Delta>@[(x,T')]@\<Gamma>)"
and a2: "(y,T) \<in> set (\<Delta>@[(x,T')]@\<Gamma>)"
and a3: "\<Gamma> \<turnstile> e' : T'" by simp_all
--- a/src/HOL/Nominal/Examples/Nominal_Examples.thy Mon Jun 07 13:42:38 2010 +0200
+++ b/src/HOL/Nominal/Examples/Nominal_Examples.thy Tue Jun 08 07:45:39 2010 +0200
@@ -4,23 +4,25 @@
theory Nominal_Examples
imports
+ CK_Machine
CR
CR_Takahashi
Class3
Compile
+ Contexts
+ Crary
Fsub
Height
Lambda_mu
+ LocalWeakening
+ Pattern
SN
- Weakening
- Crary
SOS
- LocalWeakening
+ Standardization
Support
- Contexts
- Standardization
+ Type_Preservation
W
- Pattern
+ Weakening
begin
end
--- a/src/HOL/Nominal/Examples/Type_Preservation.thy Mon Jun 07 13:42:38 2010 +0200
+++ b/src/HOL/Nominal/Examples/Type_Preservation.thy Tue Jun 08 07:45:39 2010 +0200
@@ -142,7 +142,7 @@
shows "(\<Delta>@\<Gamma>) \<turnstile> e[x::=e'] : T"
using a b
proof (nominal_induct \<Gamma>'\<equiv>"\<Delta>@[(x,T')]@\<Gamma>" e T avoiding: x e' \<Delta> rule: typing.strong_induct)
- case (t_Var \<Gamma>' y T x e' \<Delta>)
+ case (t_Var y T x e' \<Delta>)
then have a1: "valid (\<Delta>@[(x,T')]@\<Gamma>)"
and a2: "(y,T) \<in> set (\<Delta>@[(x,T')]@\<Gamma>)"
and a3: "\<Gamma> \<turnstile> e' : T'" by simp_all
--- a/src/HOL/Tools/ATP_Manager/atp_systems.ML Mon Jun 07 13:42:38 2010 +0200
+++ b/src/HOL/Tools/ATP_Manager/atp_systems.ML Tue Jun 08 07:45:39 2010 +0200
@@ -239,14 +239,14 @@
fun generic_tptp_prover
(name, {home_var, executable, arguments, proof_delims, known_failures,
max_axiom_clauses, prefers_theory_relevant})
- (params as {debug, overlord, respect_no_atp, relevance_threshold,
- relevance_convergence, theory_relevant, defs_relevant,
- isar_proof, ...})
+ (params as {debug, overlord, full_types, respect_no_atp,
+ relevance_threshold, relevance_convergence, theory_relevant,
+ defs_relevant, isar_proof, ...})
minimize_command timeout =
generic_prover overlord
- (get_relevant_facts respect_no_atp relevance_threshold
- relevance_convergence defs_relevant max_axiom_clauses
- (the_default prefers_theory_relevant theory_relevant))
+ (relevant_facts full_types respect_no_atp relevance_threshold
+ relevance_convergence defs_relevant max_axiom_clauses
+ (the_default prefers_theory_relevant theory_relevant))
(prepare_clauses false)
(write_tptp_file (debug andalso overlord)) home_var
executable (arguments timeout) proof_delims known_failures name params
@@ -276,7 +276,7 @@
known_failures =
[(Unprovable, "Satisfiability detected"),
(Unprovable, "UNPROVABLE"),
- (OutOfResources, "CANNOT PROVE"),
+ (Unprovable, "CANNOT PROVE"),
(OutOfResources, "Refutation not found")],
max_axiom_clauses = 60,
prefers_theory_relevant = false}
@@ -314,13 +314,13 @@
fun generic_dfg_prover
(name, {home_var, executable, arguments, proof_delims, known_failures,
max_axiom_clauses, prefers_theory_relevant})
- (params as {overlord, respect_no_atp, relevance_threshold,
+ (params as {overlord, full_types, respect_no_atp, relevance_threshold,
relevance_convergence, theory_relevant, defs_relevant, ...})
minimize_command timeout =
generic_prover overlord
- (get_relevant_facts respect_no_atp relevance_threshold
- relevance_convergence defs_relevant max_axiom_clauses
- (the_default prefers_theory_relevant theory_relevant))
+ (relevant_facts full_types respect_no_atp relevance_threshold
+ relevance_convergence defs_relevant max_axiom_clauses
+ (the_default prefers_theory_relevant theory_relevant))
(prepare_clauses true) write_dfg_file home_var executable
(arguments timeout) proof_delims known_failures name params
minimize_command
--- a/src/HOL/Tools/Sledgehammer/sledgehammer_fact_filter.ML Mon Jun 07 13:42:38 2010 +0200
+++ b/src/HOL/Tools/Sledgehammer/sledgehammer_fact_filter.ML Tue Jun 08 07:45:39 2010 +0200
@@ -16,11 +16,13 @@
del: Facts.ref list,
only: bool}
+ val name_thms_pair_from_ref :
+ Proof.context -> thm list -> Facts.ref -> string * thm list
val tvar_classes_of_terms : term list -> string list
val tfree_classes_of_terms : term list -> string list
val type_consts_of_terms : theory -> term list -> string list
- val get_relevant_facts :
- bool -> real -> real -> bool -> int -> bool -> relevance_override
+ val relevant_facts :
+ bool -> bool -> real -> real -> bool -> int -> bool -> relevance_override
-> Proof.context * (thm list * 'a) -> thm list
-> (thm * (string * int)) list
val prepare_clauses :
@@ -253,36 +255,41 @@
end
end;
+fun cnf_for_facts ctxt =
+ let val thy = ProofContext.theory_of ctxt in
+ maps (maps (cnf_axiom thy) o ProofContext.get_fact ctxt)
+ end
+
fun relevant_clauses ctxt relevance_convergence defs_relevant max_new
- (relevance_override as {add, del, only}) thy ctab =
+ (relevance_override as {add, del, only}) ctab =
let
- val thms_for_facts =
- maps (maps (cnf_axiom thy) o ProofContext.get_fact ctxt)
- val add_thms = thms_for_facts add
- val del_thms = thms_for_facts del
- fun iter p rel_consts =
+ val thy = ProofContext.theory_of ctxt
+ val add_thms = cnf_for_facts ctxt add
+ val del_thms = cnf_for_facts ctxt del
+ fun iter threshold rel_consts =
let
fun relevant ([], _) [] = [] (* Nothing added this iteration *)
- | relevant (newpairs,rejects) [] =
+ | relevant (newpairs, rejects) [] =
let
val (newrels, more_rejects) = take_best max_new newpairs
val new_consts = maps #2 newrels
val rel_consts' = List.foldl add_const_typ_table rel_consts new_consts
- val newp = p + (1.0 - p) / relevance_convergence
+ val threshold =
+ threshold + (1.0 - threshold) / relevance_convergence
in
trace_msg (fn () => "relevant this iteration: " ^
Int.toString (length newrels));
- map #1 newrels @ iter newp rel_consts' (more_rejects @ rejects)
+ map #1 newrels @ iter threshold rel_consts'
+ (more_rejects @ rejects)
end
| relevant (newrels, rejects)
((ax as (clsthm as (thm, (name, n)), consts_typs)) :: axs) =
let
- val weight = if member Thm.eq_thm del_thms thm then 0.0
- else if member Thm.eq_thm add_thms thm then 1.0
- else if only then 0.0
+ val weight = if member Thm.eq_thm add_thms thm then 1.0
+ else if member Thm.eq_thm del_thms thm then 0.0
else clause_weight ctab rel_consts consts_typs
in
- if p <= weight orelse
+ if weight >= threshold orelse
(defs_relevant andalso defines thy (#1 clsthm) rel_consts) then
(trace_msg (fn () => name ^ " clause " ^ Int.toString n ^
" passes: " ^ Real.toString weight);
@@ -291,8 +298,8 @@
relevant (newrels, ax :: rejects) axs
end
in
- trace_msg (fn () => "relevant_clauses, current pass mark: " ^
- Real.toString p);
+ trace_msg (fn () => "relevant_clauses, current threshold: " ^
+ Real.toString threshold);
relevant ([], [])
end
in iter end
@@ -310,7 +317,7 @@
commas (Symtab.keys goal_const_tab))
val relevant =
relevant_clauses ctxt relevance_convergence defs_relevant max_new
- relevance_override thy const_tab relevance_threshold
+ relevance_override const_tab relevance_threshold
goal_const_tab
(map (pair_consts_typs_axiom theory_relevant thy)
axioms)
@@ -352,7 +359,7 @@
fun subtract_cls ax_clauses =
filter_out (Termtab.defined (mk_clause_table ax_clauses) o prop_of)
-fun all_valid_thms respect_no_atp ctxt chained_ths =
+fun all_name_thms_pairs respect_no_atp ctxt chained_ths =
let
val global_facts = PureThy.facts_of (ProofContext.theory_of ctxt);
val local_facts = ProofContext.facts_of ctxt;
@@ -369,7 +376,7 @@
val name1 = Facts.extern facts name;
val name2 = Name_Space.extern full_space name;
- val ths = filter_out bad_for_atp ths0;
+ val ths = filter_out is_theorem_bad_for_atps ths0;
in
if Facts.is_concealed facts name orelse
(respect_no_atp andalso is_package_def name) then
@@ -398,26 +405,32 @@
(* The single-name theorems go after the multiple-name ones, so that single
names are preferred when both are available. *)
-fun name_thm_pairs respect_no_atp ctxt chained_ths =
+fun name_thm_pairs respect_no_atp ctxt name_thms_pairs =
let
- val (mults, singles) =
- List.partition is_multi (all_valid_thms respect_no_atp ctxt chained_ths)
+ val (mults, singles) = List.partition is_multi name_thms_pairs
val ps = [] |> fold add_single_names singles
|> fold add_multi_names mults
in ps |> respect_no_atp ? filter_out (No_ATPs.member ctxt o snd) end;
-fun check_named ("", th) =
- (warning ("No name for theorem " ^ Display.string_of_thm_without_context th); false)
- | check_named _ = true;
+fun is_named ("", th) =
+ (warning ("No name for theorem " ^
+ Display.string_of_thm_without_context th); false)
+ | is_named _ = true
+fun checked_name_thm_pairs respect_no_atp ctxt =
+ name_thm_pairs respect_no_atp ctxt
+ #> tap (fn ps => trace_msg
+ (fn () => ("Considering " ^ Int.toString (length ps) ^
+ " theorems")))
+ #> filter is_named
-fun get_all_lemmas respect_no_atp ctxt chained_ths =
- let val included_thms =
- tap (fn ths => trace_msg
- (fn () => ("Including all " ^ Int.toString (length ths) ^ " theorems")))
- (name_thm_pairs respect_no_atp ctxt chained_ths)
- in
- filter check_named included_thms
- end;
+fun name_thms_pair_from_ref ctxt chained_ths xref =
+ let
+ val ths = ProofContext.get_fact ctxt xref
+ val name = Facts.string_of_ref xref
+ |> forall (member Thm.eq_thm chained_ths) ths
+ ? prefix chained_prefix
+ in (name, ths) end
+
(***************************************************************)
(* Type Classes Present in the Axiom or Conjecture Clauses *)
@@ -463,26 +476,29 @@
fun restrict_to_logic thy true cls = filter (Meson.is_fol_term thy o prop_of o fst) cls
| restrict_to_logic thy false cls = cls;
-(**** Predicates to detect unwanted clauses (prolific or likely to cause unsoundness) ****)
+(**** Predicates to detect unwanted clauses (prolific or likely to cause
+ unsoundness) ****)
(** Too general means, positive equality literal with a variable X as one operand,
when X does not occur properly in the other operand. This rules out clearly
inconsistent clauses such as V=a|V=b, though it by no means guarantees soundness. **)
-fun occurs ix =
- let fun occ(Var (jx,_)) = (ix=jx)
- | occ(t1$t2) = occ t1 orelse occ t2
- | occ(Abs(_,_,t)) = occ t
- | occ _ = false
- in occ end;
+fun var_occurs_in_term ix =
+ let
+ fun aux (Var (jx, _)) = (ix = jx)
+ | aux (t1 $ t2) = aux t1 orelse aux t2
+ | aux (Abs (_, _, t)) = aux t
+ | aux _ = false
+ in aux end
-fun is_recordtype T = not (null (Record.dest_recTs T));
+fun is_record_type T = not (null (Record.dest_recTs T))
(*Unwanted equalities include
(1) those between a variable that does not properly occur in the second operand,
(2) those between a variable and a record, since these seem to be prolific "cases" thms
*)
-fun too_general_eqterms (Var (ix,T), t) = not (occurs ix t) orelse is_recordtype T
+fun too_general_eqterms (Var (ix,T), t) =
+ not (var_occurs_in_term ix t) orelse is_record_type T
| too_general_eqterms _ = false;
fun too_general_equality (Const (@{const_name "op ="}, _) $ x $ y) =
@@ -492,40 +508,56 @@
fun has_typed_var tycons = exists_subterm
(fn Var (_, Type (a, _)) => member (op =) tycons a | _ => false);
-(*Clauses are forbidden to contain variables of these types. The typical reason is that
- they lead to unsoundness. Note that "unit" satisfies numerous equations like ?X=().
- The resulting clause will have no type constraint, yielding false proofs. Even "bool"
- leads to many unsound proofs, though (obviously) only for higher-order problems.*)
-val unwanted_types = [@{type_name unit}, @{type_name bool}];
+(* Clauses are forbidden to contain variables of these types. The typical reason
+ is that they lead to unsoundness. Note that "unit" satisfies numerous
+ equations like "?x = ()". The resulting clause will have no type constraint,
+ yielding false proofs. Even "bool" leads to many unsound proofs, though only
+ for higher-order problems. *)
+val dangerous_types = [@{type_name unit}, @{type_name bool}];
-fun unwanted t =
- t = @{prop True} orelse has_typed_var unwanted_types t orelse
- forall too_general_equality (HOLogic.disjuncts (strip_Trueprop t));
+(* Clauses containing variables of type "unit" or "bool" or of the form
+ "?x = A | ?x = B | ?x = C" are likely to lead to unsound proofs if types are
+ omitted. *)
+fun is_dangerous_term _ @{prop True} = true
+ | is_dangerous_term full_types t =
+ not full_types andalso
+ (has_typed_var dangerous_types t orelse
+ forall too_general_equality (HOLogic.disjuncts (strip_Trueprop t)))
-(*Clauses containing variables of type "unit" or "bool" are unlikely to be useful and
- likely to lead to unsound proofs.*)
-fun remove_unwanted_clauses cls = filter (not o unwanted o prop_of o fst) cls;
+fun is_dangerous_theorem full_types add_thms thm =
+ not (member Thm.eq_thm add_thms thm) andalso
+ is_dangerous_term full_types (prop_of thm)
+
+fun remove_dangerous_clauses full_types add_thms =
+ filter_out (is_dangerous_theorem full_types add_thms o fst)
fun is_first_order thy = forall (Meson.is_fol_term thy) o map prop_of
-fun get_relevant_facts respect_no_atp relevance_threshold relevance_convergence
- defs_relevant max_new theory_relevant
- (relevance_override as {add, only, ...})
- (ctxt, (chained_ths, _)) goal_cls =
+fun relevant_facts full_types respect_no_atp relevance_threshold
+ relevance_convergence defs_relevant max_new theory_relevant
+ (relevance_override as {add, del, only})
+ (ctxt, (chained_ths, _)) goal_cls =
if (only andalso null add) orelse relevance_threshold > 1.0 then
[]
else
let
val thy = ProofContext.theory_of ctxt
+ val add_thms = cnf_for_facts ctxt add
+ val has_override = not (null add) orelse not (null del)
val is_FO = is_first_order thy goal_cls
- val included_cls = get_all_lemmas respect_no_atp ctxt chained_ths
- |> cnf_rules_pairs thy |> make_unique
+ val axioms =
+ checked_name_thm_pairs respect_no_atp ctxt
+ (if only then map (name_thms_pair_from_ref ctxt chained_ths) add
+ else all_name_thms_pairs respect_no_atp ctxt chained_ths)
+ |> cnf_rules_pairs thy
+ |> not has_override ? make_unique
|> restrict_to_logic thy is_FO
- |> remove_unwanted_clauses
+ |> not only ? remove_dangerous_clauses full_types add_thms
in
relevance_filter ctxt relevance_threshold relevance_convergence
defs_relevant max_new theory_relevant relevance_override
- thy included_cls (map prop_of goal_cls)
+ thy axioms (map prop_of goal_cls)
+ |> has_override ? make_unique
end
(* prepare for passing to writer,
--- a/src/HOL/Tools/Sledgehammer/sledgehammer_fact_preprocessor.ML Mon Jun 07 13:42:38 2010 +0200
+++ b/src/HOL/Tools/Sledgehammer/sledgehammer_fact_preprocessor.ML Tue Jun 08 07:45:39 2010 +0200
@@ -14,11 +14,12 @@
val skolem_infix: string
val cnf_axiom: theory -> thm -> thm list
val multi_base_blacklist: string list
- val bad_for_atp: thm -> bool
+ val is_theorem_bad_for_atps: thm -> bool
val type_has_topsort: typ -> bool
- val cnf_rules_pairs: theory -> (string * thm) list -> (thm * (string * int)) list
- val suppress_endtheory: bool Unsynchronized.ref
- (*for emergency use where endtheory causes problems*)
+ val cnf_rules_pairs:
+ theory -> (string * thm) list -> (thm * (string * int)) list
+ val use_skolem_cache: bool Unsynchronized.ref
+ (* for emergency use where the Skolem cache causes problems *)
val strip_subgoal : thm -> int -> (string * typ) list * term list * term
val neg_clausify: thm -> thm list
val neg_conjecture_clauses:
@@ -86,9 +87,9 @@
val rhs_consts = fold_aterms (fn Const c => insert (op =) c | _ => I) rhs []
in fold (#2 #> Term.fold_atyps add_new_TFrees) rhs_consts [] end;
-(*Traverse a theorem, declaring Skolem function definitions. String s is the suggested
- prefix for the Skolem constant.*)
-fun declare_skofuns s th =
+(* Traverse a theorem, declaring Skolem function definitions. String "s" is the
+ suggested prefix for the Skolem constants. *)
+fun declare_skolem_funs s th thy =
let
val nref = Unsynchronized.ref 0 (* FIXME ??? *)
fun dec_sko (Const (@{const_name Ex}, _) $ (xtp as Abs (s', T, p))) (axs, thy) =
@@ -103,13 +104,13 @@
val cT = extraTs ---> Ts ---> T
val rhs = list_abs_free (map dest_Free args, HOLogic.choice_const T $ xtp)
(*Forms a lambda-abstraction over the formal parameters*)
- val (c, thy') =
+ val (c, thy) =
Sign.declare_const ((Binding.conceal (Binding.name cname), cT), NoSyn) thy
val cdef = cname ^ "_def"
- val ((_, ax), thy'') =
- Thm.add_def true false (Binding.name cdef, Logic.mk_equals (c, rhs)) thy'
+ val ((_, ax), thy) =
+ Thm.add_def true false (Binding.name cdef, Logic.mk_equals (c, rhs)) thy
val ax' = Drule.export_without_context ax
- in dec_sko (subst_bound (list_comb (c, args), p)) (ax' :: axs, thy'') end
+ in dec_sko (subst_bound (list_comb (c, args), p)) (ax' :: axs, thy) end
| dec_sko (Const (@{const_name All}, _) $ (Abs (a, T, p))) thx =
(*Universal quant: insert a free variable into body and continue*)
let val fname = Name.variant (OldTerm.add_term_names (p, [])) a
@@ -117,11 +118,11 @@
| dec_sko (@{const "op &"} $ p $ q) thx = dec_sko q (dec_sko p thx)
| dec_sko (@{const "op |"} $ p $ q) thx = dec_sko q (dec_sko p thx)
| dec_sko (@{const Trueprop} $ p) thx = dec_sko p thx
- | dec_sko t thx = thx (*Do nothing otherwise*)
- in fn thy => dec_sko (Thm.prop_of th) ([], thy) end;
+ | dec_sko t thx = thx
+ in dec_sko (prop_of th) ([], thy) end
(*Traverse a theorem, accumulating Skolem function definitions.*)
-fun assume_skofuns s th =
+fun assume_skolem_funs s th =
let val sko_count = Unsynchronized.ref 0 (* FIXME ??? *)
fun dec_sko (Const (@{const_name Ex}, _) $ (xtp as Abs (s', T, p))) defs =
(*Existential: declare a Skolem function, then insert into body and continue*)
@@ -135,9 +136,7 @@
HOLogic.choice_const T $ xtp)
(*Forms a lambda-abstraction over the formal parameters*)
val def = Logic.mk_equals (c, rhs)
- in dec_sko (subst_bound (list_comb(c,args), p))
- (def :: defs)
- end
+ in dec_sko (subst_bound (list_comb(c,args), p)) (def :: defs) end
| dec_sko (Const (@{const_name All},_) $ Abs (a, T, p)) defs =
(*Universal quant: insert a free variable into body and continue*)
let val fname = Name.variant (OldTerm.add_term_names (p,[])) a
@@ -219,41 +218,44 @@
val crator = cterm_of thy rator
val abs_B' = cterm_instantiate [(f_B,crator),(g_B,crand)] @{thm abs_B}
val (_,rhs) = Thm.dest_equals (cprop_of abs_B')
- in
- Thm.transitive abs_B' (Conv.arg_conv abstract rhs)
- end
+ in Thm.transitive abs_B' (Conv.arg_conv abstract rhs) end
else makeK()
- | _ => error "abstract: Bad term"
+ | _ => raise Fail "abstract: Bad term"
end;
-(*Traverse a theorem, declaring abstraction function definitions. String s is the suggested
- prefix for the constants.*)
-fun combinators_aux ct =
- if lambda_free (term_of ct) then Thm.reflexive ct
+(* Traverse a theorem, replacing lambda-abstractions with combinators. *)
+fun do_introduce_combinators ct =
+ if lambda_free (term_of ct) then
+ Thm.reflexive ct
+ else case term_of ct of
+ Abs _ =>
+ let
+ val (cv, cta) = Thm.dest_abs NONE ct
+ val (v, _) = dest_Free (term_of cv)
+ val u_th = do_introduce_combinators cta
+ val cu = Thm.rhs_of u_th
+ val comb_eq = abstract (Thm.cabs cv cu)
+ in Thm.transitive (Thm.abstract_rule v cv u_th) comb_eq end
+ | _ $ _ =>
+ let val (ct1, ct2) = Thm.dest_comb ct in
+ Thm.combination (do_introduce_combinators ct1)
+ (do_introduce_combinators ct2)
+ end
+
+fun introduce_combinators th =
+ if lambda_free (prop_of th) then
+ th
else
- case term_of ct of
- Abs _ =>
- let val (cv, cta) = Thm.dest_abs NONE ct
- val (v, _) = dest_Free (term_of cv)
- val u_th = combinators_aux cta
- val cu = Thm.rhs_of u_th
- val comb_eq = abstract (Thm.cabs cv cu)
- in Thm.transitive (Thm.abstract_rule v cv u_th) comb_eq end
- | _ $ _ =>
- let val (ct1, ct2) = Thm.dest_comb ct
- in Thm.combination (combinators_aux ct1) (combinators_aux ct2) end;
-
-fun combinators th =
- if lambda_free (prop_of th) then th
- else
- let val th = Drule.eta_contraction_rule th
- val eqth = combinators_aux (cprop_of th)
- in Thm.equal_elim eqth th end
- handle THM (msg,_,_) =>
- (warning (cat_lines
- ["Error in the combinator translation of " ^ Display.string_of_thm_without_context th,
- " Exception message: " ^ msg]);
- TrueI); (*A type variable of sort {} will cause make abstraction fail.*)
+ let
+ val th = Drule.eta_contraction_rule th
+ val eqth = do_introduce_combinators (cprop_of th)
+ in Thm.equal_elim eqth th end
+ handle THM (msg, _, _) =>
+ (warning ("Error in the combinator translation of " ^
+ Display.string_of_thm_without_context th ^
+ "\nException message: " ^ msg ^ ".");
+ (* A type variable of sort "{}" will make abstraction fail. *)
+ TrueI)
(*cterms are used throughout for efficiency*)
val cTrueprop = Thm.cterm_of @{theory HOL} HOLogic.Trueprop;
@@ -302,37 +304,43 @@
(*Generate Skolem functions for a theorem supplied in nnf*)
fun assume_skolem_of_def s th =
- map (skolem_of_def o Thm.assume o (cterm_of (theory_of_thm th))) (assume_skofuns s th);
+ map (skolem_of_def o Thm.assume o cterm_of (theory_of_thm th))
+ (assume_skolem_funs s th)
-(*** Blacklisting (FIXME: duplicated in "Sledgehammer_Fact_Filter"?) ***)
+(*** Blacklisting (more in "Sledgehammer_Fact_Filter") ***)
-val max_lambda_nesting = 3;
+val max_lambda_nesting = 3
-fun excessive_lambdas (f$t, k) = excessive_lambdas (f,k) orelse excessive_lambdas (t,k)
- | excessive_lambdas (Abs(_,_,t), k) = k=0 orelse excessive_lambdas (t,k-1)
- | excessive_lambdas _ = false;
+fun term_has_too_many_lambdas max (t1 $ t2) =
+ exists (term_has_too_many_lambdas max) [t1, t2]
+ | term_has_too_many_lambdas max (Abs (_, _, t)) =
+ max = 0 orelse term_has_too_many_lambdas (max - 1) t
+ | term_has_too_many_lambdas _ _ = false
-fun is_formula_type T = (T = HOLogic.boolT orelse T = propT);
+fun is_formula_type T = (T = HOLogic.boolT orelse T = propT)
-(*Don't count nested lambdas at the level of formulas, as they are quantifiers*)
-fun excessive_lambdas_fm Ts (Abs(_,T,t)) = excessive_lambdas_fm (T::Ts) t
- | excessive_lambdas_fm Ts t =
- if is_formula_type (fastype_of1 (Ts, t))
- then exists (excessive_lambdas_fm Ts) (#2 (strip_comb t))
- else excessive_lambdas (t, max_lambda_nesting);
+(* Don't count nested lambdas at the level of formulas, since they are
+ quantifiers. *)
+fun formula_has_too_many_lambdas Ts (Abs (_, T, t)) =
+ formula_has_too_many_lambdas (T :: Ts) t
+ | formula_has_too_many_lambdas Ts t =
+ if is_formula_type (fastype_of1 (Ts, t)) then
+ exists (formula_has_too_many_lambdas Ts) (#2 (strip_comb t))
+ else
+ term_has_too_many_lambdas max_lambda_nesting t
-(*The max apply_depth of any metis call in Metis_Examples (on 31-10-2007) was 11.*)
-val max_apply_depth = 15;
+(* The max apply depth of any "metis" call in "Metis_Examples" (on 31-10-2007)
+ was 11. *)
+val max_apply_depth = 15
-fun apply_depth (f$t) = Int.max (apply_depth f, apply_depth t + 1)
- | apply_depth (Abs(_,_,t)) = apply_depth t
- | apply_depth _ = 0;
+fun apply_depth (f $ t) = Int.max (apply_depth f, apply_depth t + 1)
+ | apply_depth (Abs (_, _, t)) = apply_depth t
+ | apply_depth _ = 0
-fun too_complex t =
- apply_depth t > max_apply_depth orelse
- Meson.too_many_clauses NONE t orelse
- excessive_lambdas_fm [] t;
+fun is_formula_too_complex t =
+ apply_depth t > max_apply_depth orelse Meson.too_many_clauses NONE t orelse
+ formula_has_too_many_lambdas [] t
fun is_strange_thm th =
case head_of (concl_of th) of
@@ -340,10 +348,11 @@
a <> @{const_name "=="})
| _ => false;
-fun bad_for_atp th =
- too_complex (prop_of th)
- orelse exists_type type_has_topsort (prop_of th)
- orelse is_strange_thm th;
+fun is_theorem_bad_for_atps thm =
+ let val t = prop_of thm in
+ is_formula_too_complex t orelse exists_type type_has_topsort t orelse
+ is_strange_thm thm
+ end
(* FIXME: put other record thms here, or declare as "no_atp" *)
val multi_base_blacklist =
@@ -356,14 +365,21 @@
(*Skolemize a named theorem, with Skolem functions as additional premises.*)
fun skolem_thm (s, th) =
- if member (op =) multi_base_blacklist (Long_Name.base_name s) orelse bad_for_atp th then []
+ if member (op =) multi_base_blacklist (Long_Name.base_name s) orelse
+ is_theorem_bad_for_atps th then
+ []
else
let
val ctxt0 = Variable.global_thm_context th
- val (nnfth, ctxt1) = to_nnf th ctxt0
- val (cnfs, ctxt2) = Meson.make_cnf (assume_skolem_of_def s nnfth) nnfth ctxt1
- in cnfs |> map combinators |> Variable.export ctxt2 ctxt0 |> Meson.finish_cnf end
- handle THM _ => [];
+ val (nnfth, ctxt) = to_nnf th ctxt0
+ val defs = assume_skolem_of_def s nnfth
+ val (cnfs, ctxt) = Meson.make_cnf defs nnfth ctxt
+ in
+ cnfs |> map introduce_combinators
+ |> Variable.export ctxt ctxt0
+ |> Meson.finish_cnf
+ end
+ handle THM _ => []
(*The cache prevents repeated clausification of a theorem, and also repeated declaration of
Skolem functions.*)
@@ -413,19 +429,19 @@
fun skolem_def (name, th) thy =
let val ctxt0 = Variable.global_thm_context th in
- (case try (to_nnf th) ctxt0 of
+ case try (to_nnf th) ctxt0 of
NONE => (NONE, thy)
- | SOME (nnfth, ctxt1) =>
- let val (defs, thy') = declare_skofuns (flatten_name name) nnfth thy
- in (SOME (th, ctxt0, ctxt1, nnfth, defs), thy') end)
+ | SOME (nnfth, ctxt) =>
+ let val (defs, thy') = declare_skolem_funs (flatten_name name) nnfth thy
+ in (SOME (th, ctxt0, ctxt, nnfth, defs), thy') end
end;
-fun skolem_cnfs (th, ctxt0, ctxt1, nnfth, defs) =
+fun skolem_cnfs (th, ctxt0, ctxt, nnfth, defs) =
let
- val (cnfs, ctxt2) = Meson.make_cnf (map skolem_of_def defs) nnfth ctxt1;
+ val (cnfs, ctxt) = Meson.make_cnf (map skolem_of_def defs) nnfth ctxt;
val cnfs' = cnfs
- |> map combinators
- |> Variable.export ctxt2 ctxt0
+ |> map introduce_combinators
+ |> Variable.export ctxt ctxt0
|> Meson.finish_cnf
|> map Thm.close_derivation;
in (th, cnfs') end;
@@ -441,8 +457,10 @@
val new_thms = (new_facts, []) |-> fold (fn (name, ths) =>
if member (op =) multi_base_blacklist (Long_Name.base_name name) then I
else fold_index (fn (i, th) =>
- if bad_for_atp th orelse is_some (lookup_cache thy th) then I
- else cons (name ^ "_" ^ string_of_int (i + 1), Thm.transfer thy th)) ths);
+ if is_theorem_bad_for_atps th orelse is_some (lookup_cache thy th) then
+ I
+ else
+ cons (name ^ "_" ^ string_of_int (i + 1), Thm.transfer thy th)) ths)
in
if null new_facts then NONE
else
@@ -457,11 +475,10 @@
end;
-val suppress_endtheory = Unsynchronized.ref false
+val use_skolem_cache = Unsynchronized.ref true
fun clause_cache_endtheory thy =
- if ! suppress_endtheory then NONE
- else saturate_skolem_cache thy;
+ if !use_skolem_cache then saturate_skolem_cache thy else NONE
(*The cache can be kept smaller by inspecting the prop of each thm. Can ignore all that are
@@ -481,7 +498,10 @@
EVERY' [rtac ccontr, Object_Logic.atomize_prems_tac, Meson.skolemize_tac ctxt]
val neg_clausify =
- single #> Meson.make_clauses_unsorted #> map combinators #> Meson.finish_cnf
+ single
+ #> Meson.make_clauses_unsorted
+ #> map introduce_combinators
+ #> Meson.finish_cnf
fun neg_conjecture_clauses ctxt st0 n =
let
@@ -509,7 +529,7 @@
(** setup **)
val setup =
- perhaps saturate_skolem_cache #>
- Theory.at_end clause_cache_endtheory;
+ perhaps saturate_skolem_cache
+ #> Theory.at_end clause_cache_endtheory
end;
--- a/src/HOL/Tools/Sledgehammer/sledgehammer_isar.ML Mon Jun 07 13:42:38 2010 +0200
+++ b/src/HOL/Tools/Sledgehammer/sledgehammer_isar.ML Tue Jun 08 07:45:39 2010 +0200
@@ -19,6 +19,7 @@
struct
open Sledgehammer_Util
+open Sledgehammer_Fact_Filter
open Sledgehammer_Fact_Preprocessor
open ATP_Manager
open ATP_Systems
@@ -238,19 +239,12 @@
state) atps
in () end
-fun minimize override_params i frefs state =
+fun minimize override_params i refs state =
let
val thy = Proof.theory_of state
val ctxt = Proof.context_of state
val chained_ths = #facts (Proof.goal state)
- fun theorems_from_ref ctxt fref =
- let
- val ths = ProofContext.get_fact ctxt fref
- val name = Facts.string_of_ref fref
- |> forall (member Thm.eq_thm chained_ths) ths
- ? prefix chained_prefix
- in (name, ths) end
- val name_thms_pairs = map (theorems_from_ref ctxt) frefs
+ val name_thms_pairs = map (name_thms_pair_from_ref ctxt chained_ths) refs
in
case subgoal_count state of
0 => priority "No subgoal!"
--- a/src/Tools/jEdit/makedist Mon Jun 07 13:42:38 2010 +0200
+++ b/src/Tools/jEdit/makedist Tue Jun 08 07:45:39 2010 +0200
@@ -11,7 +11,7 @@
## diagnostics
-JEDIT_HOME="/home/isajedit/jedit-orig/4.3pre17"
+JEDIT_HOME=""
function usage()
{
@@ -20,7 +20,6 @@
echo
echo " Options are:"
echo " -j DIR specify original jEdit distribution"
- echo " (default: $JEDIT_HOME)"
echo
echo " Produce Isabelle/jEdit distribution from Netbeans build"
echo " in $THIS/dist"
@@ -66,12 +65,13 @@
# target name
+[ -z "$JEDIT_HOME" ] && fail "Unknown JEDIT_HOME"
+
VERSION=$(basename "$JEDIT_HOME")
JEDIT="jedit-${VERSION}"
rm -rf "$JEDIT" jedit
mkdir "$JEDIT"
-ln -s "$JEDIT" jedit
# copy stuff
@@ -95,5 +95,4 @@
# build archive
echo "${JEDIT}.tar.gz"
-tar czf "${JEDIT}.tar.gz" "$JEDIT" jedit
-ln -sf "${JEDIT}.tar.gz" jedit.tar.gz
+tar czf "${JEDIT}.tar.gz" "$JEDIT"