Lukas Stevens' more general fold functions for maps
author nipkow
Tue, 08 Jun 2021 17:01:32 +0200
changeset 73833 ae2f8144b60d
parent 73832 9db620f007fa (current diff)
parent 73831 5153fad491f3 (diff)
child 73844 8a9fd2ffb81d
Lukas Stevens' more general fold functions for maps
--- a/NEWS	Tue Jun 01 19:46:34 2021 +0200
+++ b/NEWS	Tue Jun 08 17:01:32 2021 +0200
@@ -34,6 +34,12 @@
 
 *** Document preparation ***
 
+* More predefined symbols: \<interleave> \<sslash> (package "stmaryrd"), \<checkmark> \<crossmark> (LaTeX package
+"pifont").
+
+* High-quality blackboard-bold symbols from font "txmia" (LaTeX package
+"pxfonts"): \<bbbA>\<bool>\<complex>\<bbbD>\<bbbE>\<bbbF>\<bbbG>\<bbbH>\<bbbI>\<bbbJ>\<bbbK>\<bbbL>\<bbbM>\<nat>\<bbbO>\<bbbP>\<rat>\<real>\<bbbS>\<bbbT>\<bbbU>\<bbbV>\<bbbW>\<bbbX>\<bbbY>\<int>.
+
 * Document antiquotations for ML text have been refined: "def" and "ref"
 variants support index entries, e.g. @{ML} (no entry) vs. @{ML_def}
 (bold entry) vs. @{ML_ref} (regular entry); @{ML_type} supports explicit
@@ -99,14 +105,14 @@
 
 *** HOL ***
 
-* Theory Multiset: dedicated predicate "multiset" is gone, use
-explict expression instead.  Minor INCOMPATIBILITY.
-
-* Theory Multiset: consolidated abbreviations Mempty, Melem, not_Melem
-to empty_mset, member_mset, not_member_mset respectively.  Minor
-INCOMPATIBILITY.
-
-* Theory Multiset: consolidated operation and fact names:
+* Theory "HOL-Library.Multiset": dedicated predicate "multiset" is gone,
+use an explicit expression instead. Minor INCOMPATIBILITY.
+
+* Theory "HOL-Library.Multiset": consolidated abbreviations Mempty,
+Melem, not_Melem to empty_mset, member_mset, not_member_mset
+respectively. Minor INCOMPATIBILITY.
+
+* Theory "HOL-Library.Multiset": consolidated operation and fact names:
     inf_subset_mset ~> inter_mset
     sup_subset_mset ~> union_mset
     multiset_inter_def ~> inter_mset_def
@@ -114,51 +120,53 @@
     multiset_inter_count ~> count_inter_mset
     sup_subset_mset_count ~> count_union_mset
 
-* Theory Multiset: syntax precendence for membership operations has been
-adjusted to match the corresponding precendences on sets.  Rare
-INCOMPATIBILITY.
-
-* HOL-Analysis/HOL-Probability: indexed products of discrete
-distributions, negative binomial distribution, Hoeffding's inequality,
-Chernoff bounds, Cauchy–Schwarz inequality for nn_integral, and some
-more small lemmas. Some theorems that were stated awkwardly before were
-corrected. Minor INCOMPATIBILITY.
+* Theory "HOL-Library.Multiset": syntax precendence for membership
+operations has been adjusted to match the corresponding precendences on
+sets. Rare INCOMPATIBILITY.
+
+* Session "HOL-Analysis" and "HOL-Probability": indexed products of
+discrete distributions, negative binomial distribution, Hoeffding's
+inequality, Chernoff bounds, Cauchy–Schwarz inequality for nn_integral,
+and some more small lemmas. Some theorems that were stated awkwardly
+before were corrected. Minor INCOMPATIBILITY.
 
 * Theorems "antisym" and "eq_iff" in class "order" have been renamed to
 "order.antisym" and "order.eq_iff", to coexist locally with "antisym"
-and "eq_iff" from locale "ordering".  INCOMPATIBILITY: significant
+and "eq_iff" from locale "ordering". INCOMPATIBILITY: significant
 potential for change can be avoided if interpretations of type class
 "order" are replaced or augmented by interpretations of locale
 "ordering".
 
-* Theorem "swap_def" now is always qualified as "Fun.swap_def".  Minor
+* Theorem "swap_def" now is always qualified as "Fun.swap_def". Minor
 INCOMPATIBILITY; note that for most applications less elementary lemmas
 exists.
 
-* Dedicated session HOL-Combinatorics.  INCOMPATIBILITY: theories
+* Theory "HOL-Library.Permutation" has been renamed to the more specific
+"HOL-Library.List_Permutation". Note that most notions from that theory
+are already present in theory "HOL-Combinatorics.Permutations".
+INCOMPATIBILITY.
+
+* Dedicated session "HOL-Combinatorics". INCOMPATIBILITY: theories
 "Permutations", "List_Permutation" (formerly "Permutation"), "Stirling",
 "Multiset_Permutations", "Perm" have been moved there from session
 HOL-Library.
 
-* Theory "Permutation" in HOL-Library has been renamed to the more
-specific "List_Permutation".  Note that most notions from that
-theory are already present in theory "Permutations".  INCOMPATIBILITY.
-
-* Lemma "permutes_induct" has been given stronger
-hypotheses and named premises.  INCOMPATIBILITY.
-
-* Theory "Transposition" in HOL-Combinatorics provides elementary
-swap operation "transpose".
-
-* Combinator "Fun.swap" resolved into a mere input abbreviation
-in separate theory "Transposition" in HOL-Combinatorics.
-INCOMPATIBILITY.
+* Theory "HOL-Combinatorics.Transposition" provides elementary swap
+operation "transpose".
+
+* Lemma "permutes_induct" has been given stronger hypotheses and named
+premises. INCOMPATIBILITY.
+
+* Combinator "Fun.swap" resolved into a mere input abbreviation in
+separate theory "Transposition" in HOL-Combinatorics. INCOMPATIBILITY.
 
 * Bit operations set_bit, unset_bit and flip_bit are now class
-operations.  INCOMPATIBILITY.
-
-* Abbreviation "max_word" has been moved to session Word_Lib in the AFP.
-See there further the changelog in theory Guide.  INCOMPATIBILITY.
+operations. INCOMPATIBILITY.
+
+* Abbreviation "max_word" has been moved to session Word_Lib in the AFP,
+as have the constants "shiftl1", "shiftr1", "sshiftr1", "bshiftr1",
+"setBit", "clearBit". See the changelog in theory Guide there for
+further details. INCOMPATIBILITY.
 
 
 *** ML ***
@@ -230,9 +238,15 @@
 *** System ***
 
 * System option "system_log" specifies an optional log file for internal
-messages produced by Output.system_message in Isabelle/ML; "-" refers to
-console progress of the build job. This works for "isabelle build" or
-any derivative of it.
+messages produced by Output.system_message in Isabelle/ML; the value
+"true" refers to console progress of the build job. This works for
+"isabelle build" or any derivative of it.
+
+* System options of type string may be set to "true" using the short
+notation for options of type bool, e.g. "isabelle build -o system_log".
+
+* System option "document=true" is an alias for "document=pdf" and thus
+can be used in the short form. E.g. "isabelle build -o document".
 
 * Command-line tool "isabelle version" supports repository archives
 (without full .hg directory). More options.
@@ -989,12 +1003,6 @@
 
 *** Document preparation ***
 
-* More predefined symbols: \<interleave> \<sslash> (package "stmaryrd"), \<checkmark> \<crossmark> (package
-"pifont").
-
-* High-quality blackboard-bold symbols from font "txmia" (package
-"pxfonts"): \<bbbA>\<bool>\<complex>\<bbbD>\<bbbE>\<bbbF>\<bbbG>\<bbbH>\<bbbI>\<bbbJ>\<bbbK>\<bbbL>\<bbbM>\<nat>\<bbbO>\<bbbP>\<rat>\<real>\<bbbS>\<bbbT>\<bbbU>\<bbbV>\<bbbW>\<bbbX>\<bbbY>\<int>.
-
 * Document markers are formal comments of the form \<^marker>\<open>marker_body\<close> that
 are stripped from document output: the effect is to modify the semantic
 presentation context or to emit markup to the PIDE document. Some
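
To illustrate the short option notation announced in the NEWS entries above, the following two build invocations would be equivalent; the session name "HOL-Library" serves only as an example:

  isabelle build -o document -o system_log HOL-Library
  isabelle build -o document=pdf -o system_log=true HOL-Library
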
--- a/etc/options	Tue Jun 01 19:46:34 2021 +0200
+++ b/etc/options	Tue Jun 08 17:01:32 2021 +0200
@@ -6,7 +6,7 @@
   -- "generate theory browser information"
 
 option document : string = ""
-  -- "build document in given format: pdf, dvi, false"
+  -- "build PDF document (enabled for \"pdf\" or \"true\", disabled for \"\" or \"false\")"
 option document_output : string = ""
   -- "document output directory"
 option document_variants : string = "document"
@@ -91,8 +91,6 @@
   -- "level of parallel proof checking: 0, 1, 2"
 option parallel_subproofs_threshold : real = 0.01
   -- "lower bound of timing estimate for forked nested proofs (seconds)"
-option parallel_presentation : bool = true
-  -- "parallel theory presentation"
 
 option command_timing_threshold : real = 0.1
   -- "default threshold for persistent command timing (seconds)"
@@ -134,7 +132,7 @@
   -- "ML profiling (possible values: time, allocations)"
 
 option system_log : string = ""
-  -- "output for system messages (log file or \"-\" for console progress)"
+  -- "output for system messages (log file or \"true\" for console progress)"
 
 option system_heaps : bool = false
   -- "store session heaps in $ISABELLE_HEAPS_SYSTEM, not $ISABELLE_HEAPS"
@@ -307,10 +305,10 @@
 
 section "Phabricator"
 
-option phabricator_version_arcanist : string = "7af9846f994a8d0a1fc89af996e3ddd81f01765e"
+option phabricator_version_arcanist : string = "cdae0ac68f1fed138323fa3dbb299ef3b287723c"
   -- "repository version for arcanist"
 
-option phabricator_version_phabricator : string = "2afedad61c5181bb4f832cea27b9b59df19f3fd5"
+option phabricator_version_phabricator : string = "4f70fcffa8a5393e210d64f237ffdaa256256d6a"
   -- "repository version for phabricator"
 
 
--- a/src/Doc/System/Presentation.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Doc/System/Presentation.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -47,9 +47,9 @@
   reported by the above verbose invocation of the build process.
 
   Many Isabelle sessions (such as \<^session>\<open>HOL-Library\<close> in
-  \<^dir>\<open>~~/src/HOL/Library\<close>) also provide printable documents in PDF. These are
+  \<^dir>\<open>~~/src/HOL/Library\<close>) also provide theory documents in PDF. These are
   prepared automatically as well if enabled like this:
-  @{verbatim [display] \<open>isabelle build -o browser_info -o document=pdf -v -c HOL-Library\<close>}
+  @{verbatim [display] \<open>isabelle build -o browser_info -o document -v -c HOL-Library\<close>}
 
   Enabling both browser info and document preparation simultaneously causes an
   appropriate ``document'' link to be included in the HTML index. Documents
--- a/src/Doc/System/Sessions.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Doc/System/Sessions.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -201,8 +201,9 @@
     info, see also \secref{sec:info}.
 
     \<^item> @{system_option_def "document"} controls document output for a
-    particular session or theory; \<^verbatim>\<open>document=pdf\<close> means enabled,
-    \<^verbatim>\<open>document=false\<close> means disabled (especially for particular theories).
+    particular session or theory; \<^verbatim>\<open>document=pdf\<close> or \<^verbatim>\<open>document=true\<close> means
+    enabled, \<^verbatim>\<open>document=""\<close> or \<^verbatim>\<open>document=false\<close> means disabled (especially
+    for particular theories).
 
     \<^item> @{system_option_def "document_output"} specifies an alternative
     directory for generated output of the document preparation system; the
@@ -278,7 +279,8 @@
 
     \<^item> @{system_option_def system_log} specifies an optional log file for
     low-level messages produced by \<^ML>\<open>Output.system_message\<close> in
-    Isabelle/ML; ``\<^verbatim>\<open>-\<close>'' refers to console progress of the build job.
+    Isabelle/ML; the value ``\<^verbatim>\<open>true\<close>'' refers to console progress of the build
+    job.
 
     \<^item> @{system_option_def "system_heaps"} determines the directories for
     session heap images: \<^path>\<open>$ISABELLE_HEAPS\<close> is the user directory and
@@ -410,11 +412,12 @@
   The build process depends on additional options
   (\secref{sec:system-options}) that are passed to the prover eventually. The
   settings variable @{setting_ref ISABELLE_BUILD_OPTIONS} allows to provide
-  additional defaults, e.g.\ \<^verbatim>\<open>ISABELLE_BUILD_OPTIONS="document=pdf threads=4"\<close>.
-  Moreover, the environment of system build options may be augmented on the
-  command line via \<^verbatim>\<open>-o\<close>~\<open>name\<close>\<^verbatim>\<open>=\<close>\<open>value\<close> or \<^verbatim>\<open>-o\<close>~\<open>name\<close>, which abbreviates
-  \<^verbatim>\<open>-o\<close>~\<open>name\<close>\<^verbatim>\<open>=true\<close> for Boolean options. Multiple occurrences of \<^verbatim>\<open>-o\<close> on
-  the command-line are applied in the given order.
+  additional defaults, e.g.\ \<^verbatim>\<open>ISABELLE_BUILD_OPTIONS="document=pdf
+  threads=4"\<close>. Moreover, the environment of system build options may be
+  augmented on the command line via \<^verbatim>\<open>-o\<close>~\<open>name\<close>\<^verbatim>\<open>=\<close>\<open>value\<close> or \<^verbatim>\<open>-o\<close>~\<open>name\<close>,
+  which abbreviates \<^verbatim>\<open>-o\<close>~\<open>name\<close>\<^verbatim>\<open>=true\<close> for Boolean or string options.
+  Multiple occurrences of \<^verbatim>\<open>-o\<close> on the command-line are applied in the given
+  order.
 
   \<^medskip>
   Option \<^verbatim>\<open>-P\<close> enables PDF/HTML presentation in the given directory, where
@@ -496,7 +499,7 @@
 
   \<^smallskip>
   Build all sessions with HTML browser info and PDF document preparation:
-  @{verbatim [display] \<open>isabelle build -a -o browser_info -o document=pdf\<close>}
+  @{verbatim [display] \<open>isabelle build -a -o browser_info -o document\<close>}
 
   \<^smallskip>
   Build all sessions with a maximum of 8 parallel prover processes and 4
--- a/src/HOL/Analysis/Convex_Euclidean_Space.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Analysis/Convex_Euclidean_Space.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -1658,6 +1658,31 @@
     unfolding islimpt_def by blast
 qed
 
+lemma islimpt_Ioc [simp]:
+  fixes a :: real
+  assumes "a<b" 
+  shows "x islimpt {a<..b} \<longleftrightarrow> x \<in> {a..b}"  (is "?lhs = ?rhs")
+proof
+  show "?lhs \<Longrightarrow> ?rhs"
+    by (metis assms closed_atLeastAtMost closed_limpt closure_greaterThanAtMost closure_subset islimpt_subset)
+next
+  assume ?rhs
+  then have "x \<in> closure {a<..<b}"
+    using assms closure_greaterThanLessThan by blast
+  then show ?lhs
+    by (metis (no_types) Diff_empty Diff_insert0 interior_lessThanAtMost interior_limit_point interior_subset islimpt_in_closure islimpt_subset)
+qed
+
+lemma islimpt_Ico [simp]:
+  fixes a :: real
+  assumes "a<b" shows "x islimpt {a..<b} \<longleftrightarrow> x \<in> {a..b}"
+  by (metis assms closure_atLeastLessThan closure_greaterThanAtMost islimpt_Ioc limpt_of_closure) 
+
+lemma islimpt_Icc [simp]:
+  fixes a :: real
+  assumes "a<b" shows "x islimpt {a..b} \<longleftrightarrow> x \<in> {a..b}"
+  by (metis assms closure_atLeastLessThan islimpt_Ico limpt_of_closure)
+
 lemma connected_imp_perfect_aff_dim:
      "\<lbrakk>connected S; aff_dim S \<noteq> 0; a \<in> S\<rbrakk> \<Longrightarrow> a islimpt S"
   using aff_dim_sing connected_imp_perfect by blast
--- a/src/HOL/Analysis/Derivative.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Analysis/Derivative.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -3281,6 +3281,10 @@
   unfolding C1_differentiable_on_eq
   by (rule continuous_intros | simp add: continuous_at_imp_continuous_on differentiable_imp_continuous_within)+
 
+lemma C1_differentiable_on_of_real [derivative_intros]: "of_real C1_differentiable_on S"
+  unfolding C1_differentiable_on_def
+  by (smt (verit, del_insts) DERIV_ident UNIV_I continuous_on_const has_vector_derivative_of_real has_vector_derivative_transform)
+
 
 definition\<^marker>\<open>tag important\<close> piecewise_C1_differentiable_on
            (infixr "piecewise'_C1'_differentiable'_on" 50)
@@ -3298,7 +3302,7 @@
            C1_differentiable_on_def differentiable_def has_vector_derivative_def
            intro: has_derivative_at_withinI)
 
-lemma piecewise_C1_differentiable_compose:
+lemma piecewise_C1_differentiable_compose [derivative_intros]:
   assumes fg: "f piecewise_C1_differentiable_on S" "g piecewise_C1_differentiable_on (f ` S)" and fin: "\<And>x. finite (S \<inter> f-`{x})"
   shows "(g \<circ> f) piecewise_C1_differentiable_on S"
 proof -
@@ -3334,7 +3338,7 @@
   unfolding C1_differentiable_on_eq continuous_on_eq_continuous_within
   using differentiable_at_withinI differentiable_imp_continuous_within by blast
 
-lemma C1_differentiable_on_empty [iff]: "f C1_differentiable_on {}"
+lemma C1_differentiable_on_empty [iff,derivative_intros]: "f C1_differentiable_on {}"
   unfolding C1_differentiable_on_def
   by auto
 
@@ -3356,7 +3360,7 @@
     done
 qed
 
-lemma piecewise_C1_differentiable_cases:
+lemma piecewise_C1_differentiable_cases [derivative_intros]:
   fixes c::real
   assumes "f piecewise_C1_differentiable_on {a..c}"
           "g piecewise_C1_differentiable_on {c..b}"
@@ -3444,12 +3448,21 @@
     by (simp add: piecewise_C1_differentiable_on_def)
 qed
 
-lemma piecewise_C1_differentiable_neg:
+lemma piecewise_C1_differentiable_const [derivative_intros]:
+  "(\<lambda>x. c) piecewise_C1_differentiable_on S"
+  by (simp add: C1_differentiable_imp_piecewise)
+
+lemma piecewise_C1_differentiable_scaleR [derivative_intros]:
+    "\<lbrakk>f piecewise_C1_differentiable_on S\<rbrakk>
+     \<Longrightarrow> (\<lambda>x. c *\<^sub>R f x) piecewise_C1_differentiable_on S"
+  by (force simp add: piecewise_C1_differentiable_on_def continuous_on_scaleR)
+
+lemma piecewise_C1_differentiable_neg [derivative_intros]:
     "f piecewise_C1_differentiable_on S \<Longrightarrow> (\<lambda>x. -(f x)) piecewise_C1_differentiable_on S"
   unfolding piecewise_C1_differentiable_on_def
   by (auto intro!: continuous_on_minus C1_differentiable_on_minus)
 
-lemma piecewise_C1_differentiable_add:
+lemma piecewise_C1_differentiable_add [derivative_intros]:
   assumes "f piecewise_C1_differentiable_on i"
           "g piecewise_C1_differentiable_on i"
     shows "(\<lambda>x. f x + g x) piecewise_C1_differentiable_on i"
@@ -3466,10 +3479,26 @@
     by (auto simp: piecewise_C1_differentiable_on_def continuous_on_add)
 qed
 
-lemma piecewise_C1_differentiable_diff:
+lemma piecewise_C1_differentiable_diff [derivative_intros]:
     "\<lbrakk>f piecewise_C1_differentiable_on S;  g piecewise_C1_differentiable_on S\<rbrakk>
      \<Longrightarrow> (\<lambda>x. f x - g x) piecewise_C1_differentiable_on S"
   unfolding diff_conv_add_uminus
   by (metis piecewise_C1_differentiable_add piecewise_C1_differentiable_neg)
 
+lemma piecewise_C1_differentiable_cmult_right [derivative_intros]:
+  fixes c::complex
+  shows "f piecewise_C1_differentiable_on S
+     \<Longrightarrow> (\<lambda>x. f x * c) piecewise_C1_differentiable_on S"
+  by (force simp: piecewise_C1_differentiable_on_def continuous_on_mult_right)
+
+lemma piecewise_C1_differentiable_cmult_left [derivative_intros]:
+  fixes c::complex
+  shows "f piecewise_C1_differentiable_on S
+     \<Longrightarrow> (\<lambda>x. c * f x) piecewise_C1_differentiable_on S"
+  using piecewise_C1_differentiable_cmult_right [of f S c] by (simp add: mult.commute)
+
+lemma piecewise_C1_differentiable_on_of_real [derivative_intros]: 
+  "of_real piecewise_C1_differentiable_on S"
+  by (simp add: C1_differentiable_imp_piecewise C1_differentiable_on_of_real)
+
 end
--- a/src/HOL/Analysis/Linear_Algebra.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Analysis/Linear_Algebra.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -688,6 +688,9 @@
   shows "linear f \<Longrightarrow> f differentiable net"
   by (metis linear_imp_has_derivative differentiable_def)
 
+lemma of_real_differentiable [simp,derivative_intros]: "of_real differentiable F"
+  by (simp add: bounded_linear_imp_differentiable bounded_linear_of_real)
+
 
 subsection\<^marker>\<open>tag unimportant\<close> \<open>We continue\<close>
 
@@ -1056,7 +1059,7 @@
 lemma norm_le_infnorm:
   fixes x :: "'a::euclidean_space"
   shows "norm x \<le> sqrt DIM('a) * infnorm x"
-  unfolding norm_eq_sqrt_inner id_def 
+  unfolding norm_eq_sqrt_inner id_def
 proof (rule real_le_lsqrt[OF inner_ge_zero])
   show "sqrt DIM('a) * infnorm x \<ge> 0"
     by (simp add: zero_le_mult_iff infnorm_pos_le)
@@ -1085,7 +1088,7 @@
   (is "?lhs \<longleftrightarrow> ?rhs")
 proof (cases "x=0")
   case True
-  then show ?thesis 
+  then show ?thesis
     by auto
 next
   case False
@@ -1095,9 +1098,9 @@
         norm x * (norm y * (y \<bullet> x) - norm x * norm y * norm y) =  0)"
     using False unfolding inner_simps
     by (auto simp add: power2_norm_eq_inner[symmetric] power2_eq_square inner_commute field_simps)
-  also have "\<dots> \<longleftrightarrow> (2 * norm x * norm y * (norm x * norm y - x \<bullet> y) = 0)" 
+  also have "\<dots> \<longleftrightarrow> (2 * norm x * norm y * (norm x * norm y - x \<bullet> y) = 0)"
     using False  by (simp add: field_simps inner_commute)
-  also have "\<dots> \<longleftrightarrow> ?lhs" 
+  also have "\<dots> \<longleftrightarrow> ?lhs"
     using False by auto
   finally show ?thesis by metis
 qed
@@ -1125,7 +1128,7 @@
   shows "norm (x + y) = norm x + norm y \<longleftrightarrow> norm x *\<^sub>R y = norm y *\<^sub>R x"
 proof (cases "x = 0 \<or> y = 0")
   case True
-  then show ?thesis 
+  then show ?thesis
     by force
 next
   case False
@@ -1206,7 +1209,7 @@
     by (auto simp: insert_commute)
 next
   case False
-  show ?thesis 
+  show ?thesis
   proof
     assume h: "?lhs"
     then obtain u where u: "\<forall> x\<in> {0,x,y}. \<forall>y\<in> {0,x,y}. \<exists>c. x - y = c *\<^sub>R u"
@@ -1250,7 +1253,7 @@
   proof
     assume "\<bar>x \<bullet> y\<bar> = norm x * norm y"
     then show "collinear {0, x, y}"
-      unfolding norm_cauchy_schwarz_abs_eq collinear_lemma 
+      unfolding norm_cauchy_schwarz_abs_eq collinear_lemma
       by (meson eq_vector_fraction_iff nnz)
   next
     assume "collinear {0, x, y}"
--- a/src/HOL/Analysis/Path_Connected.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Analysis/Path_Connected.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -207,6 +207,9 @@
   unfolding pathstart_def reversepath_def pathfinish_def
   by auto
 
+lemma reversepath_o: "reversepath g = g \<circ> (-)1"
+  by (auto simp: reversepath_def)
+
 lemma pathstart_join[simp]: "pathstart (g1 +++ g2) = pathstart g1"
   unfolding pathstart_def joinpaths_def pathfinish_def
   by auto
--- a/src/HOL/Complex_Analysis/Complex_Singularities.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Complex_Analysis/Complex_Singularities.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -18,6 +18,11 @@
   shows "is_pole g b"
   using is_pole_cong assms by auto
 
+lemma is_pole_shift_iff:
+  fixes f :: "('a::real_normed_vector \<Rightarrow> 'b::real_normed_vector)"
+  shows "is_pole (f \<circ> (+) d) a = is_pole f (a + d)"
+  by (metis add_diff_cancel_right' filterlim_shift_iff is_pole_def)
+
 lemma is_pole_tendsto:
   fixes f::"('a::topological_space \<Rightarrow> 'b::real_normed_div_algebra)"
   shows "is_pole f x \<Longrightarrow> ((inverse o f) \<longlongrightarrow> 0) (at x)"
--- a/src/HOL/Complex_Analysis/Contour_Integration.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Complex_Analysis/Contour_Integration.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -68,8 +68,8 @@
 text\<open>Show that we can forget about the localized derivative.\<close>
 
 lemma has_integral_localized_vector_derivative:
-    "((\<lambda>x. f (g x) * vector_derivative g (at x within {a..b})) has_integral i) {a..b} \<longleftrightarrow>
-     ((\<lambda>x. f (g x) * vector_derivative g (at x)) has_integral i) {a..b}"
+    "((\<lambda>x. f (g x) * vector_derivative p (at x within {a..b})) has_integral i) {a..b} \<longleftrightarrow>
+     ((\<lambda>x. f (g x) * vector_derivative p (at x)) has_integral i) {a..b}"
 proof -
   have *: "{a..b} - {a,b} = interior {a..b}"
     by (simp add: atLeastAtMost_diff_ends)
@@ -78,8 +78,8 @@
 qed
 
 lemma integrable_on_localized_vector_derivative:
-    "(\<lambda>x. f (g x) * vector_derivative g (at x within {a..b})) integrable_on {a..b} \<longleftrightarrow>
-     (\<lambda>x. f (g x) * vector_derivative g (at x)) integrable_on {a..b}"
+    "(\<lambda>x. f (g x) * vector_derivative p (at x within {a..b})) integrable_on {a..b} \<longleftrightarrow>
+     (\<lambda>x. f (g x) * vector_derivative p (at x)) integrable_on {a..b}"
   by (simp add: integrable_on_def has_integral_localized_vector_derivative)
 
 lemma has_contour_integral:
--- a/src/HOL/Deriv.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Deriv.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -708,6 +708,23 @@
     (\<lambda>x. f x * g x) differentiable (at x within s)"
   unfolding differentiable_def by (blast intro: has_derivative_mult)
 
+lemma differentiable_cmult_left_iff [simp]:
+  fixes c::"'a::real_normed_field" 
+  shows "(\<lambda>t. c * q t) differentiable at t \<longleftrightarrow> c = 0 \<or> (\<lambda>t. q t) differentiable at t" (is "?lhs = ?rhs")
+proof
+  assume L: ?lhs
+  {assume "c \<noteq> 0"
+    then have "q differentiable at t"
+      using differentiable_mult [OF differentiable_const L, of concl: "1/c"] by auto
+  } then show ?rhs
+    by auto
+qed auto
+
+lemma differentiable_cmult_right_iff [simp]:
+  fixes c::"'a::real_normed_field" 
+  shows "(\<lambda>t. q t * c) differentiable at t \<longleftrightarrow> c = 0 \<or> (\<lambda>t. q t) differentiable at t" (is "?lhs = ?rhs")
+  by (simp add: mult.commute flip: differentiable_cmult_left_iff)
+
 lemma differentiable_inverse [simp, derivative_intros]:
   fixes f :: "'a::real_normed_vector \<Rightarrow> 'b::real_normed_field"
   shows "f differentiable (at x within s) \<Longrightarrow> f x \<noteq> 0 \<Longrightarrow>
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/src/HOL/Examples/Sqrt.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -0,0 +1,104 @@
+(*  Title:      HOL/Examples/Sqrt.thy
+    Author:     Makarius
+    Author:     Tobias Nipkow, TU Muenchen
+*)
+
+section \<open>Square roots of primes are irrational\<close>
+
+theory Sqrt
+  imports Complex_Main "HOL-Computational_Algebra.Primes"
+begin
+
+text \<open>
+  The square root of any prime number (including 2) is irrational.
+\<close>
+
+theorem sqrt_prime_irrational:
+  fixes p :: nat
+  assumes "prime p"
+  shows "sqrt p \<notin> \<rat>"
+proof
+  from \<open>prime p\<close> have p: "p > 1" by (rule prime_gt_1_nat)
+  assume "sqrt p \<in> \<rat>"
+  then obtain m n :: nat
+    where n: "n \<noteq> 0"
+      and sqrt_rat: "\<bar>sqrt p\<bar> = m / n"
+      and "coprime m n" by (rule Rats_abs_nat_div_natE)
+  have eq: "m\<^sup>2 = p * n\<^sup>2"
+  proof -
+    from n and sqrt_rat have "m = \<bar>sqrt p\<bar> * n" by simp
+    then have "m\<^sup>2 = (sqrt p)\<^sup>2 * n\<^sup>2" by (simp add: power_mult_distrib)
+    also have "(sqrt p)\<^sup>2 = p" by simp
+    also have "\<dots> * n\<^sup>2 = p * n\<^sup>2" by simp
+    finally show ?thesis by linarith
+  qed
+  have "p dvd m \<and> p dvd n"
+  proof
+    from eq have "p dvd m\<^sup>2" ..
+    with \<open>prime p\<close> show "p dvd m" by (rule prime_dvd_power)
+    then obtain k where "m = p * k" ..
+    with eq have "p * n\<^sup>2 = p\<^sup>2 * k\<^sup>2" by algebra
+    with p have "n\<^sup>2 = p * k\<^sup>2" by (simp add: power2_eq_square)
+    then have "p dvd n\<^sup>2" ..
+    with \<open>prime p\<close> show "p dvd n" by (rule prime_dvd_power)
+  qed
+  then have "p dvd gcd m n" by simp
+  with \<open>coprime m n\<close> have "p = 1" by simp
+  with p show False by simp
+qed
+
+corollary sqrt_2_not_rat: "sqrt 2 \<notin> \<rat>"
+  using sqrt_prime_irrational [of 2] by simp
+
+text \<open>
+  Here is an alternative version of the main proof, using mostly linear
+  forward-reasoning. While this results in less top-down structure, it is
+  probably closer to proofs seen in mathematics.
+\<close>
+
+theorem
+  fixes p :: nat
+  assumes "prime p"
+  shows "sqrt p \<notin> \<rat>"
+proof
+  from \<open>prime p\<close> have p: "p > 1" by (rule prime_gt_1_nat)
+  assume "sqrt p \<in> \<rat>"
+  then obtain m n :: nat
+    where n: "n \<noteq> 0"
+      and sqrt_rat: "\<bar>sqrt p\<bar> = m / n"
+      and "coprime m n" by (rule Rats_abs_nat_div_natE)
+  from n and sqrt_rat have "m = \<bar>sqrt p\<bar> * n" by simp
+  then have "m\<^sup>2 = (sqrt p)\<^sup>2 * n\<^sup>2" by (auto simp add: power2_eq_square)
+  also have "(sqrt p)\<^sup>2 = p" by simp
+  also have "\<dots> * n\<^sup>2 = p * n\<^sup>2" by simp
+  finally have eq: "m\<^sup>2 = p * n\<^sup>2" by linarith
+  then have "p dvd m\<^sup>2" ..
+  with \<open>prime p\<close> have dvd_m: "p dvd m" by (rule prime_dvd_power)
+  then obtain k where "m = p * k" ..
+  with eq have "p * n\<^sup>2 = p\<^sup>2 * k\<^sup>2" by algebra
+  with p have "n\<^sup>2 = p * k\<^sup>2" by (simp add: power2_eq_square)
+  then have "p dvd n\<^sup>2" ..
+  with \<open>prime p\<close> have "p dvd n" by (rule prime_dvd_power)
+  with dvd_m have "p dvd gcd m n" by (rule gcd_greatest)
+  with \<open>coprime m n\<close> have "p = 1" by simp
+  with p show False by simp
+qed
+
+
+text \<open>
+  Another old chestnut, which is a consequence of the irrationality of
+  \<^term>\<open>sqrt 2\<close>.
+\<close>
+
+lemma "\<exists>a b::real. a \<notin> \<rat> \<and> b \<notin> \<rat> \<and> a powr b \<in> \<rat>" (is "\<exists>a b. ?P a b")
+proof (cases "sqrt 2 powr sqrt 2 \<in> \<rat>")
+  case True
+  with sqrt_2_not_rat have "?P (sqrt 2) (sqrt 2)" by simp
+  then show ?thesis by blast
+next
+  case False
+  with sqrt_2_not_rat powr_powr have "?P (sqrt 2 powr sqrt 2) (sqrt 2)" by simp
+  then show ?thesis by blast
+qed
+
+end
--- a/src/HOL/Library/Bit_Operations.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Library/Bit_Operations.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -9,6 +9,11 @@
     "HOL-Library.Boolean_Algebra"
 begin
 
+lemma bit_numeral_int_iff [bit_simps]: \<comment> \<open>TODO: move\<close>
+  \<open>bit (numeral m :: int) n \<longleftrightarrow> bit (numeral m :: nat) n\<close>
+  using bit_of_nat_iff_bit [of \<open>numeral m\<close> n] by simp
+
+
 subsection \<open>Bit operations\<close>
 
 class semiring_bit_operations = semiring_bit_shifts +
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/src/HOL/Library/Lexord.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -0,0 +1,208 @@
+section \<open>Lexicographic orderings\<close>
+
+theory Lexord
+  imports Main
+begin
+
+subsection \<open>The preorder case\<close>
+
+locale lex_preordering = preordering
+begin
+
+inductive lex_less :: \<open>'a list \<Rightarrow> 'a list \<Rightarrow> bool\<close>  (infix \<open>[\<^bold><]\<close> 50) 
+where
+  Nil: \<open>[] [\<^bold><] y # ys\<close>
+| Cons: \<open>x \<^bold>< y \<Longrightarrow> x # xs [\<^bold><] y # ys\<close>
+| Cons_eq: \<open>x \<^bold>\<le> y \<Longrightarrow> y \<^bold>\<le> x \<Longrightarrow> xs [\<^bold><] ys \<Longrightarrow> x # xs [\<^bold><] y # ys\<close>
+
+inductive lex_less_eq :: \<open>'a list \<Rightarrow> 'a list \<Rightarrow> bool\<close>  (infix \<open>[\<^bold>\<le>]\<close> 50)
+where
+  Nil: \<open>[] [\<^bold>\<le>] ys\<close>
+| Cons: \<open>x \<^bold>< y \<Longrightarrow> x # xs [\<^bold>\<le>] y # ys\<close>
+| Cons_eq: \<open>x \<^bold>\<le> y \<Longrightarrow> y \<^bold>\<le> x \<Longrightarrow> xs [\<^bold>\<le>] ys \<Longrightarrow> x # xs [\<^bold>\<le>] y # ys\<close>
+
+lemma lex_less_simps [simp]:
+  \<open>[] [\<^bold><] y # ys\<close>
+  \<open>\<not> xs [\<^bold><] []\<close>
+  \<open>x # xs [\<^bold><] y # ys \<longleftrightarrow> x \<^bold>< y \<or> x \<^bold>\<le> y \<and> y \<^bold>\<le> x \<and> xs [\<^bold><] ys\<close>
+  by (auto intro: lex_less.intros elim: lex_less.cases)
+
+lemma lex_less_eq_simps [simp]:
+  \<open>[] [\<^bold>\<le>] ys\<close>
+  \<open>\<not> x # xs [\<^bold>\<le>] []\<close>
+  \<open>x # xs [\<^bold>\<le>] y # ys \<longleftrightarrow> x \<^bold>< y \<or> x \<^bold>\<le> y \<and> y \<^bold>\<le> x \<and> xs [\<^bold>\<le>] ys\<close>
+  by (auto intro: lex_less_eq.intros elim: lex_less_eq.cases)
+
+lemma lex_less_code [code]:
+  \<open>[] [\<^bold><] y # ys \<longleftrightarrow> True\<close>
+  \<open>xs [\<^bold><] [] \<longleftrightarrow> False\<close>
+  \<open>x # xs [\<^bold><] y # ys \<longleftrightarrow> x \<^bold>< y \<or> x \<^bold>\<le> y \<and> y \<^bold>\<le> x \<and> xs [\<^bold><] ys\<close>
+  by simp_all
+
+lemma lex_less_eq_code [code]:
+  \<open>[] [\<^bold>\<le>] ys \<longleftrightarrow> True\<close>
+  \<open>x # xs [\<^bold>\<le>] [] \<longleftrightarrow> False\<close>
+  \<open>x # xs [\<^bold>\<le>] y # ys \<longleftrightarrow> x \<^bold>< y \<or> x \<^bold>\<le> y \<and> y \<^bold>\<le> x \<and> xs [\<^bold>\<le>] ys\<close>
+  by simp_all
+
+lemma preordering:
+  \<open>preordering ([\<^bold>\<le>]) ([\<^bold><])\<close>
+proof
+  fix xs ys zs
+  show \<open>xs [\<^bold>\<le>] xs\<close>
+    by (induction xs) (simp_all add: refl)
+  show \<open>xs [\<^bold>\<le>] zs\<close> if \<open>xs [\<^bold>\<le>] ys\<close> \<open>ys [\<^bold>\<le>] zs\<close>
+  using that proof (induction arbitrary: zs)
+    case (Nil ys)
+    then show ?case by simp
+  next
+    case (Cons x y xs ys)
+    then show ?case
+      by (cases zs) (auto dest: strict_trans strict_trans2)
+  next
+    case (Cons_eq x y xs ys)
+    then show ?case
+      by (cases zs) (auto dest: strict_trans1 intro: trans)
+  qed
+  show \<open>xs [\<^bold><] ys \<longleftrightarrow> xs [\<^bold>\<le>] ys \<and> \<not> ys [\<^bold>\<le>] xs\<close> (is \<open>?P \<longleftrightarrow> ?Q\<close>)
+  proof
+    assume ?P
+    then have \<open>xs [\<^bold>\<le>] ys\<close>
+      by induction simp_all
+    moreover have \<open>\<not> ys [\<^bold>\<le>] xs\<close>
+      using \<open>?P\<close>
+      by induction (simp_all, simp_all add: strict_iff_not asym)
+    ultimately show ?Q ..
+  next
+    assume ?Q
+    then have \<open>xs [\<^bold>\<le>] ys\<close> \<open>\<not> ys [\<^bold>\<le>] xs\<close>
+      by auto
+    then show ?P
+    proof induction
+      case (Nil ys)
+      then show ?case
+        by (cases ys) simp_all
+    next
+      case (Cons x y xs ys)
+      then show ?case
+        by simp
+    next
+      case (Cons_eq x y xs ys)
+      then show ?case
+        by simp
+    qed
+  qed
+qed
+
+interpretation lex: preordering \<open>([\<^bold>\<le>])\<close> \<open>([\<^bold><])\<close>
+  by (fact preordering)
+
+end
+
+
+subsection \<open>The order case\<close>
+
+locale lex_ordering = lex_preordering + ordering
+begin
+
+interpretation lex: preordering \<open>([\<^bold>\<le>])\<close> \<open>([\<^bold><])\<close>
+  by (fact preordering)
+
+lemma less_lex_Cons_iff [simp]:
+  \<open>x # xs [\<^bold><] y # ys \<longleftrightarrow> x \<^bold>< y \<or> x = y \<and> xs [\<^bold><] ys\<close>
+  by (auto intro: refl antisym)
+
+lemma less_eq_lex_Cons_iff [simp]:
+  \<open>x # xs [\<^bold>\<le>] y # ys \<longleftrightarrow> x \<^bold>< y \<or> x = y \<and> xs [\<^bold>\<le>] ys\<close>
+  by (auto intro: refl antisym)
+
+lemma ordering:
+  \<open>ordering ([\<^bold>\<le>]) ([\<^bold><])\<close>
+proof
+  fix xs ys
+  show *: \<open>xs = ys\<close> if \<open>xs [\<^bold>\<le>] ys\<close> \<open>ys [\<^bold>\<le>] xs\<close>
+  using that proof induction
+  case (Nil ys)
+    then show ?case by (cases ys) simp
+  next
+    case (Cons x y xs ys)
+    then show ?case by (auto dest: asym intro: antisym)
+      (simp add: strict_iff_not)
+  next
+    case (Cons_eq x y xs ys)
+    then show ?case by (auto intro: antisym)
+      (simp add: strict_iff_not)
+  qed
+  show \<open>xs [\<^bold><] ys \<longleftrightarrow> xs [\<^bold>\<le>] ys \<and> xs \<noteq> ys\<close>
+    by (auto simp add: lex.strict_iff_not dest: *)
+qed
+
+interpretation lex: ordering \<open>([\<^bold>\<le>])\<close> \<open>([\<^bold><])\<close>
+  by (fact ordering)
+
+end
+
+
+subsection \<open>Canonical instance\<close>
+
+instantiation list :: (preorder) preorder
+begin
+
+global_interpretation lex: lex_preordering \<open>(\<le>) :: 'a::preorder \<Rightarrow> 'a \<Rightarrow> bool\<close> \<open>(<) :: 'a \<Rightarrow> 'a \<Rightarrow> bool\<close>
+  defines less_eq_list = lex.lex_less_eq
+    and less_list = lex.lex_less ..
+
+instance
+  by (rule class.preorder.of_class.intro, rule preordering_preorderI, fact lex.preordering)
+
+end
+
+global_interpretation lex: lex_ordering \<open>(\<le>) :: 'a::order \<Rightarrow> 'a \<Rightarrow> bool\<close> \<open>(<) :: 'a \<Rightarrow> 'a \<Rightarrow> bool\<close>
+  rewrites \<open>lex_preordering.lex_less_eq (\<le>) (<) = ((\<le>) :: 'a list \<Rightarrow> 'a list \<Rightarrow> bool)\<close>
+    and \<open>lex_preordering.lex_less (\<le>) (<) = ((<) :: 'a list \<Rightarrow> 'a list \<Rightarrow> bool)\<close>
+proof -
+  interpret lex_ordering \<open>(\<le>) :: 'a \<Rightarrow> 'a \<Rightarrow> bool\<close> \<open>(<) :: 'a \<Rightarrow> 'a \<Rightarrow> bool\<close> ..
+  show \<open>lex_ordering ((\<le>)  :: 'a \<Rightarrow> 'a \<Rightarrow> bool) (<)\<close>
+    by (fact lex_ordering_axioms)
+  show \<open>lex_preordering.lex_less_eq (\<le>) (<) = (\<le>)\<close>
+    by (simp add: less_eq_list_def)
+  show \<open>lex_preordering.lex_less (\<le>) (<) = (<)\<close>
+    by (simp add: less_list_def)
+qed
+
+instance list :: (order) order
+  by (rule class.order.of_class.intro, rule ordering_orderI, fact lex.ordering)
+
+export_code \<open>(\<le>) :: _ list \<Rightarrow> _ list \<Rightarrow> bool\<close> \<open>(<) :: _ list \<Rightarrow> _ list \<Rightarrow> bool\<close> in Haskell
+
+
+subsection \<open>Non-canonical instance\<close>
+
+context comm_monoid_mult
+begin
+
+definition dvd_strict :: \<open>'a \<Rightarrow> 'a \<Rightarrow> bool\<close>
+  where \<open>dvd_strict a b \<longleftrightarrow> a dvd b \<and> \<not> b dvd a\<close>
+
+end
+
+global_interpretation dvd: lex_preordering \<open>(dvd) :: 'a::comm_monoid_mult \<Rightarrow> 'a \<Rightarrow> bool\<close> dvd_strict
+  defines lex_dvd = dvd.lex_less_eq
+    and lex_dvd_strict = dvd.lex_less
+  apply (rule lex_preordering.intro)
+  apply standard
+    apply (auto simp add: dvd_strict_def)
+  done
+
+print_theorems
+
+global_interpretation lex_dvd: preordering lex_dvd lex_dvd_strict
+  by (fact dvd.preordering)
+
+definition \<open>example = lex_dvd [(4::int), - 7, 8] [- 8, 13, 5]\<close>
+
+export_code example in Haskell
+
+value example
+
+end
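
A minimal usage sketch of the canonical instance above (not part of Lexord.thy; the concrete values are chosen only for illustration). Since code equations are provided for the list order, an evaluation such as

  value "[1::int, 2] < [1, 3]"

would be expected to return True: the first elements are equal, so the comparison is decided by 2 < 3.
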
--- a/src/HOL/Library/Word.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Library/Word.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -1802,40 +1802,6 @@
   for w :: \<open>'a::len word\<close>
   by transfer (auto simp add: bit_signed_take_bit_iff le_less min_def)
 
-lift_definition shiftl1 :: \<open>'a::len word \<Rightarrow> 'a word\<close>
-  is \<open>(*) 2\<close>
-  by (auto simp add: take_bit_eq_mod intro: mod_mult_cong)
-
-lemma shiftl1_eq:
-  \<open>shiftl1 w = word_of_int (2 * uint w)\<close>
-  by transfer (simp add: take_bit_eq_mod mod_simps)
-
-lemma shiftl1_eq_mult_2:
-  \<open>shiftl1 = (*) 2\<close>
-  by (rule ext, transfer) simp
-
-lemma bit_shiftl1_iff [bit_simps]:
-  \<open>bit (shiftl1 w) n \<longleftrightarrow> 0 < n \<and> n < LENGTH('a) \<and> bit w (n - 1)\<close>
-    for w :: \<open>'a::len word\<close>
-  by (simp add: shiftl1_eq_mult_2 bit_double_iff not_le) (simp add: ac_simps)
-
-lift_definition shiftr1 :: \<open>'a::len word \<Rightarrow> 'a word\<close>
-  \<comment> \<open>shift right as unsigned or as signed, ie logical or arithmetic\<close>
-  is \<open>\<lambda>k. take_bit LENGTH('a) k div 2\<close> 
-  by simp
-
-lemma shiftr1_eq_div_2:
-  \<open>shiftr1 w = w div 2\<close>
-  by transfer simp
-
-lemma bit_shiftr1_iff [bit_simps]:
-  \<open>bit (shiftr1 w) n \<longleftrightarrow> bit w (Suc n)\<close>
-  by transfer (auto simp flip: bit_Suc simp add: bit_take_bit_iff)
-
-lemma shiftr1_eq:
-  \<open>shiftr1 w = word_of_int (uint w div 2)\<close>
-  by transfer simp
-
 lemma bit_word_iff_drop_bit_and [code]:
   \<open>bit a n \<longleftrightarrow> drop_bit n a AND 1 = 1\<close> for a :: \<open>'a::len word\<close>
   by (simp add: bit_iff_odd_drop_bit odd_iff_mod_2_eq_one and_one_eq)
@@ -1847,32 +1813,6 @@
     and word_xor_def: "(a::'a word) XOR b = word_of_int (uint a XOR uint b)"
   by (transfer, simp add: take_bit_not_take_bit)+
 
-lift_definition setBit :: \<open>'a::len word \<Rightarrow> nat \<Rightarrow> 'a word\<close>
-  is \<open>\<lambda>k n. set_bit n k\<close>
-  by (simp add: take_bit_set_bit_eq)
-
-lemma set_Bit_eq:
-  \<open>setBit w n = set_bit n w\<close>
-  by transfer simp
-
-lemma bit_setBit_iff [bit_simps]:
-  \<open>bit (setBit w m) n \<longleftrightarrow> (m = n \<and> n < LENGTH('a) \<or> bit w n)\<close>
-  for w :: \<open>'a::len word\<close>
-  by transfer (auto simp add: bit_set_bit_iff)
-
-lift_definition clearBit :: \<open>'a::len word \<Rightarrow> nat \<Rightarrow> 'a word\<close>
-  is \<open>\<lambda>k n. unset_bit n k\<close>
-  by (simp add: take_bit_unset_bit_eq)
-
-lemma clear_Bit_eq:
-  \<open>clearBit w n = unset_bit n w\<close>
-  by transfer simp
-
-lemma bit_clearBit_iff [bit_simps]:
-  \<open>bit (clearBit w m) n \<longleftrightarrow> m \<noteq> n \<and> bit w n\<close>
-  for w :: \<open>'a::len word\<close>
-  by transfer (auto simp add: bit_unset_bit_iff)
-
 definition even_word :: \<open>'a::len word \<Rightarrow> bool\<close>
   where [code_abbrev]: \<open>even_word = even\<close>
 
@@ -1962,6 +1902,14 @@
   for w :: \<open>'a::len word\<close>
   by transfer simp
 
+lemma signed_drop_bit_of_0 [simp]:
+  \<open>signed_drop_bit n 0 = 0\<close>
+  by transfer simp
+
+lemma signed_drop_bit_of_minus_1 [simp]:
+  \<open>signed_drop_bit n (- 1) = - 1\<close>
+  by transfer simp
+
 lemma signed_drop_bit_signed_drop_bit [simp]:
   \<open>signed_drop_bit m (signed_drop_bit n w) = signed_drop_bit (m + n) w\<close>
   for w :: \<open>'a::len word\<close>
@@ -1989,22 +1937,6 @@
     by (auto simp add: bit_sint_iff bit_drop_bit_eq bit_signed_drop_bit_iff dest: bit_imp_le_length)
 qed auto
 
-lift_definition sshiftr1 :: \<open>'a::len word \<Rightarrow> 'a word\<close>
-  is \<open>\<lambda>k. take_bit LENGTH('a) (signed_take_bit (LENGTH('a) - Suc 0) k div 2)\<close>
-  by (simp flip: signed_take_bit_decr_length_iff)
-
-lift_definition bshiftr1 :: \<open>bool \<Rightarrow> 'a::len word \<Rightarrow> 'a word\<close>
-  is \<open>\<lambda>b k. take_bit LENGTH('a) k div 2 + of_bool b * 2 ^ (LENGTH('a) - Suc 0)\<close>
-  by (fact arg_cong)
-
-lemma sshiftr1_eq_signed_drop_bit_Suc_0:
-  \<open>sshiftr1 = signed_drop_bit (Suc 0)\<close>
-  by (rule ext) (transfer, simp add: drop_bit_Suc)
-
-lemma sshiftr1_eq:
-  \<open>sshiftr1 w = word_of_int (sint w div 2)\<close>
-  by transfer simp
-
 
 subsection \<open>Rotation\<close>
 
@@ -3605,92 +3537,8 @@
   by (induct n) (simp_all add: wi_hom_syms)
 
 
-subsection \<open>Shifting, Rotating, and Splitting Words\<close>
-
-lemma shiftl1_wi [simp]: "shiftl1 (word_of_int w) = word_of_int (2 * w)"
-  by transfer simp
-
-lemma shiftl1_numeral [simp]: "shiftl1 (numeral w) = numeral (Num.Bit0 w)"
-  unfolding word_numeral_alt shiftl1_wi by simp
-
-lemma shiftl1_neg_numeral [simp]: "shiftl1 (- numeral w) = - numeral (Num.Bit0 w)"
-  unfolding word_neg_numeral_alt shiftl1_wi by simp
-
-lemma shiftl1_0 [simp] : "shiftl1 0 = 0"
-  by transfer simp
-
-lemma shiftl1_def_u: "shiftl1 w = word_of_int (2 * uint w)"
-  by (fact shiftl1_eq)
-
-lemma shiftl1_def_s: "shiftl1 w = word_of_int (2 * sint w)"
-  by (simp add: shiftl1_def_u wi_hom_syms)
-
-lemma shiftr1_0 [simp]: "shiftr1 0 = 0"
-  by transfer simp
-
-lemma sshiftr1_0 [simp]: "sshiftr1 0 = 0"
-  by transfer simp
-
-lemma sshiftr1_n1 [simp]: "sshiftr1 (- 1) = - 1"
-  by transfer simp
-
-text \<open>
-  see paper page 10, (1), (2), \<open>shiftr1_def\<close> is of the form of (1),
-  where \<open>f\<close> (ie \<open>_ div 2\<close>) takes normal arguments to normal results,
-  thus we get (2) from (1)
-\<close>
-
-lemma uint_shiftr1: "uint (shiftr1 w) = uint w div 2"
-  using drop_bit_eq_div [of 1 \<open>uint w\<close>, symmetric]
-  by transfer (simp add: drop_bit_take_bit min_def)
-
-lemma bit_sshiftr1_iff [bit_simps]:
-  \<open>bit (sshiftr1 w) n \<longleftrightarrow> bit w (if n = LENGTH('a) - 1 then LENGTH('a) - 1 else Suc n)\<close>
-  for w :: \<open>'a::len word\<close>
-  apply transfer
-  by (auto simp add: bit_take_bit_iff bit_signed_take_bit_iff min_def le_Suc_eq simp flip: bit_Suc)
-
-lemma shiftr1_div_2: "uint (shiftr1 w) = uint w div 2"
-  by (fact uint_shiftr1)
-
-lemma sshiftr1_div_2: "sint (sshiftr1 w) = sint w div 2"
-  using sint_signed_drop_bit_eq [of 1 w]
-  by (simp add: drop_bit_Suc sshiftr1_eq_signed_drop_bit_Suc_0) 
-
-lemma bit_bshiftr1_iff [bit_simps]:
-  \<open>bit (bshiftr1 b w) n \<longleftrightarrow> b \<and> n = LENGTH('a) - 1 \<or> bit w (Suc n)\<close>
-  for w :: \<open>'a::len word\<close>
-  apply transfer
-    apply (subst disjunctive_add)
-   apply (auto simp add: bit_take_bit_iff bit_or_iff bit_exp_iff simp flip: bit_Suc)
-  done
-
-
 subsubsection \<open>shift functions in terms of lists of bools\<close>
 
-lemma shiftl1_rev: "shiftl1 w = word_reverse (shiftr1 (word_reverse w))"
-  by (intro bit_word_eqI) (auto simp add: bit_shiftl1_iff bit_word_reverse_iff bit_shiftr1_iff Suc_diff_Suc)
-
-\<comment> \<open>note -- the following results use \<open>'a::len word < number_ring\<close>\<close>
-
-lemma shiftl1_2t: "shiftl1 w = 2 * w"
-  for w :: "'a::len word"
-  by (simp add: shiftl1_eq wi_hom_mult [symmetric])
-
-lemma shiftl1_p: "shiftl1 w = w + w"
-  for w :: "'a::len word"
-  by (simp add: shiftl1_2t)
-
-lemma shiftr1_bintr [simp]:
-  "(shiftr1 (numeral w) :: 'a::len word) =
-    word_of_int (take_bit LENGTH('a) (numeral w) div 2)"
-  by transfer simp
-
-lemma sshiftr1_sbintr [simp]:
-  "(sshiftr1 (numeral w) :: 'a::len word) =
-    word_of_int (signed_take_bit (LENGTH('a) - 1) (numeral w) div 2)"
-  by transfer simp
-
 text \<open>TODO: rules for \<^term>\<open>- (numeral n)\<close>\<close>
 
 lemma drop_bit_word_numeral [simp]:
@@ -4173,9 +4021,6 @@
 lemma uint_lt_0 [simp]: "uint x < 0 = False"
   by (simp add: linorder_not_less)
 
-lemma shiftr1_1 [simp]: "shiftr1 (1::'a::len word) = 0"
-  by transfer simp
-
 lemma word_less_1 [simp]: "x < 1 \<longleftrightarrow> x = 0"
   for x :: "'a::len word"
   by (simp add: word_less_nat_alt unat_0_iff)
--- a/src/HOL/Limits.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Limits.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -11,6 +11,20 @@
   imports Real_Vector_Spaces
 begin
 
+text \<open>Lemmas related to shifting/scaling\<close>
+lemma range_add [simp]:
+  fixes a::"'a::group_add" shows "range ((+) a) = UNIV"
+  by (metis add_minus_cancel surjI)
+
+lemma range_diff [simp]:
+  fixes a::"'a::group_add" shows "range ((-) a) = UNIV"
+  by (metis (full_types) add_minus_cancel diff_minus_eq_add surj_def)
+
+lemma range_mult [simp]:
+  fixes a::"real" shows "range ((*) a) = (if a=0 then {0} else UNIV)"
+  by (simp add: surj_def) (meson dvdE dvd_field_iff)
+
+
 subsection \<open>Filter going to infinity norm\<close>
 
 definition at_infinity :: "'a::real_normed_vector filter"
@@ -1461,6 +1475,28 @@
   for a d :: "real"
   by (simp add: filter_eq_iff eventually_filtermap eventually_at_filter filtermap_nhds_shift[symmetric])
 
+lemma filterlim_shift:
+  fixes d :: "'a::real_normed_vector"
+  assumes "filterlim f F (at a)"
+  shows "filterlim (f \<circ> (+) d) F (at (a - d))"
+  unfolding filterlim_iff
+proof (intro strip)
+  fix P
+  assume "eventually P F"
+  then have "\<forall>\<^sub>F x in filtermap (\<lambda>y. y - d) (at a). P (f (d + x))"
+    using assms by (force simp add: filterlim_iff eventually_filtermap)
+  then show "(\<forall>\<^sub>F x in at (a - d). P ((f \<circ> (+) d) x))"
+    by (force simp add: filtermap_at_shift)
+qed
+
+lemma filterlim_shift_iff:
+  fixes d :: "'a::real_normed_vector"
+  shows "filterlim (f \<circ> (+) d) F (at (a - d)) = filterlim f F (at a)"   (is "?lhs = ?rhs")
+proof
+  assume L: ?lhs show ?rhs
+    using filterlim_shift [OF L, of "-d"] by (simp add: filterlim_iff)
+qed (metis filterlim_shift)
+
 lemma at_right_to_0: "at_right a = filtermap (\<lambda>x. x + a) (at_right 0)"
   for a :: real
   using filtermap_at_right_shift[of "-a" 0] by simp
--- a/src/HOL/Orderings.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Orderings.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -279,6 +279,16 @@
 
 end
 
+lemma preordering_preorderI:
+  \<open>class.preorder (\<^bold>\<le>) (\<^bold><)\<close> if \<open>preordering (\<^bold>\<le>) (\<^bold><)\<close>
+    for less_eq (infix \<open>\<^bold>\<le>\<close> 50) and less (infix \<open>\<^bold><\<close> 50)
+proof -
+  from that interpret preordering \<open>(\<^bold>\<le>)\<close> \<open>(\<^bold><)\<close> .
+  show ?thesis
+    by standard (auto simp add: strict_iff_not refl intro: trans)
+qed
+
+
 
 subsection \<open>Partial orders\<close>
 
@@ -409,12 +419,12 @@
 qed
 
 lemma order_strictI:
-  fixes less (infix "\<sqsubset>" 50)
-    and less_eq (infix "\<sqsubseteq>" 50)
-  assumes "\<And>a b. a \<sqsubseteq> b \<longleftrightarrow> a \<sqsubset> b \<or> a = b"
-    assumes "\<And>a b. a \<sqsubset> b \<Longrightarrow> \<not> b \<sqsubset> a"
-  assumes "\<And>a. \<not> a \<sqsubset> a"
-  assumes "\<And>a b c. a \<sqsubset> b \<Longrightarrow> b \<sqsubset> c \<Longrightarrow> a \<sqsubset> c"
+  fixes less (infix "\<^bold><" 50)
+    and less_eq (infix "\<^bold>\<le>" 50)
+  assumes "\<And>a b. a \<^bold>\<le> b \<longleftrightarrow> a \<^bold>< b \<or> a = b"
+    assumes "\<And>a b. a \<^bold>< b \<Longrightarrow> \<not> b \<^bold>< a"
+  assumes "\<And>a. \<not> a \<^bold>< a"
+  assumes "\<And>a b c. a \<^bold>< b \<Longrightarrow> b \<^bold>< c \<Longrightarrow> a \<^bold>< c"
   shows "class.order less_eq less"
   by (rule ordering_orderI) (rule ordering_strictI, (fact assms)+)
 
--- a/src/HOL/ROOT	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/ROOT	Tue Jun 08 17:01:32 2021 +0200
@@ -20,7 +20,7 @@
     Notable Examples in Isabelle/HOL.
   "
   sessions
-    "HOL-Library"
+    "HOL-Computational_Algebra"
   theories
     Adhoc_Overloading_Examples
     Ackermann
@@ -36,6 +36,7 @@
     Peirce
     Records
     Seq
+    Sqrt
   document_files
     "root.bib"
     "root.tex"
@@ -706,7 +707,6 @@
     Sketch_and_Explore
     Sorting_Algorithms_Examples
     Specifications_with_bundle_mixins
-    Sqrt
     Sqrt_Script
     Sudoku
     Sum_of_Powers
@@ -939,7 +939,8 @@
 
 session "HOL-Mirabelle-ex" in "Tools/Mirabelle" = HOL +
   description "Some arbitrary small test case for Mirabelle."
-  options [timeout = 60, mirabelle_actions = "arith"]
+  options [timeout = 60,
+    mirabelle_theories = "HOL-Analysis.Inner_Product", mirabelle_actions = "arith"]
   sessions
     "HOL-Analysis"
   theories
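
For illustration, a hypothetical way to run this test session with the Mirabelle options overridden on the command line (a sketch; the values are arbitrary, and "mirabelle_stride" is the new option defined in src/HOL/Tools/etc/options below):

  isabelle build -o mirabelle_timeout=10 -o mirabelle_stride=2 HOL-Mirabelle-ex
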
--- a/src/HOL/Tools/Mirabelle/mirabelle.ML	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Tools/Mirabelle/mirabelle.ML	Tue Jun 08 17:01:32 2021 +0200
@@ -147,40 +147,57 @@
     fun check_pos range = check_line range o Position.line_of;
   in check_pos o get_theory end;
 
+fun check_session qualifier thy_name (_: Position.T) =
+  Resources.theory_qualifier thy_name = qualifier;
+
 
 (* presentation hook *)
 
 val whitelist = ["apply", "by", "proof"];
 
 val _ =
-  Theory.setup (Thy_Info.add_presentation (fn context => fn thy =>
+  Build.add_hook (fn qualifier => fn loaded_theories =>
     let
-      val {options, adjust_pos, segments, ...} = context;
-      val mirabelle_timeout = Options.seconds options \<^system_option>\<open>mirabelle_timeout\<close>;
-      val mirabelle_actions = Options.string options \<^system_option>\<open>mirabelle_actions\<close>;
-      val mirabelle_theories = Options.string options \<^system_option>\<open>mirabelle_theories\<close>;
+      val mirabelle_timeout = Options.default_seconds \<^system_option>\<open>mirabelle_timeout\<close>;
+      val mirabelle_stride = Options.default_int \<^system_option>\<open>mirabelle_stride\<close>;
+      val mirabelle_actions = Options.default_string \<^system_option>\<open>mirabelle_actions\<close>;
+      val mirabelle_theories = Options.default_string \<^system_option>\<open>mirabelle_theories\<close>;
 
       val actions =
         (case read_actions mirabelle_actions of
           SOME actions => actions
         | NONE => error ("Failed to parse mirabelle_actions: " ^ quote mirabelle_actions));
-      val check = check_theories (space_explode "," mirabelle_theories);
-      val commands =
-        segments |> map_filter (fn {command = tr, prev_state = st, state = st', ...} =>
-          let
-            val name = Toplevel.name_of tr;
-            val pos = adjust_pos (Toplevel.pos_of tr);
-          in
-            if can (Proof.assert_backward o Toplevel.proof_of) st andalso
-              member (op =) whitelist name andalso
-              check (Context.theory_long_name thy) pos
-            then SOME {name = name, pos = pos, pre = Toplevel.proof_of st, post = st'}
-            else NONE
-          end);
+      val check =
+        if mirabelle_theories = "" then check_session qualifier
+        else check_theories (space_explode "," mirabelle_theories);
 
-      fun apply (i, (name, arguments)) () =
-        apply_action (i + 1) name arguments mirabelle_timeout commands thy;
-    in if null commands then () else fold_index apply actions () end));
+      fun theory_commands (thy, segments) =
+        let
+          val commands = segments
+            |> map_index (fn (n, {command = tr, prev_state = st, state = st', ...}) =>
+              if n mod mirabelle_stride = 0 then
+                let
+                  val name = Toplevel.name_of tr;
+                  val pos = Toplevel.pos_of tr;
+                in
+                  if Context.proper_subthy (\<^theory>, thy) andalso
+                    can (Proof.assert_backward o Toplevel.proof_of) st andalso
+                    member (op =) whitelist name andalso
+                    check (Context.theory_long_name thy) pos
+                  then SOME {name = name, pos = pos, pre = Toplevel.proof_of st, post = st'}
+                  else NONE
+                end
+              else NONE)
+            |> map_filter I;
+        in if null commands then NONE else SOME (thy, commands) end;
+
+      fun app_actions (thy, commands) =
+        (actions, ()) |-> fold_index (fn (index, (name, arguments)) => fn () =>
+          apply_action (index + 1) name arguments mirabelle_timeout commands thy);
+    in
+      if null actions then ()
+      else List.app app_actions (map_filter theory_commands loaded_theories)
+    end);
 
 
 (* Mirabelle utility functions *)
--- a/src/HOL/Tools/Mirabelle/mirabelle.scala	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Tools/Mirabelle/mirabelle.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -102,7 +102,7 @@
 
     if (build_results0.ok) {
       val build_options =
-        options + "timeout_build=false" + "parallel_presentation=false" +
+        options + "timeout_build=false" +
           ("mirabelle_actions=" + actions.mkString(";")) +
           ("mirabelle_theories=" + theories.mkString(","))
 
@@ -161,7 +161,6 @@
     var select_dirs: List[Path] = Nil
     var numa_shuffling = false
     var output_dir = default_output_dir
-    var requirements = false
     var theories: List[String] = Nil
     var exclude_session_groups: List[String] = Nil
     var all_sessions = false
@@ -172,6 +171,7 @@
     var verbose = false
     var exclude_sessions: List[String] = Nil
 
+    val default_stride = options.int("mirabelle_stride")
     val default_timeout = options.seconds("mirabelle_timeout")
 
     val getopts = Getopts("""
@@ -182,8 +182,7 @@
     -B NAME      include session NAME and all descendants
     -D DIR       include session directory and select its sessions
     -N           cyclic shuffling of NUMA CPU nodes (performance tuning)
-    -O DIR       output directory for log files (default: """ + default_output_dir + """,
-    -R           refer to requirements of selected sessions
+    -O DIR       output directory for log files (default: """ + default_output_dir + """)
     -T THEORY    theory restriction: NAME or NAME[LINE:END_LINE]
     -X NAME      exclude sessions from group NAME and all descendants
     -a           select all sessions
@@ -191,6 +190,7 @@
     -g NAME      select session group NAME
     -j INT       maximum number of parallel jobs (default 1)
     -o OPTION    override Isabelle system OPTION (via NAME=VAL or NAME)
+    -s INT       run actions on every nth goal (default """ + default_stride + """)
     -t SECONDS   timeout for each action (default """ + default_timeout + """)
     -v           verbose
     -x NAME      exclude session NAME and all descendants
@@ -213,7 +213,6 @@
       "D:" -> (arg => select_dirs = select_dirs ::: List(Path.explode(arg))),
       "N" -> (_ => numa_shuffling = true),
       "O:" -> (arg => output_dir = Path.explode(arg)),
-      "R" -> (_ => requirements = true),
       "T:" -> (arg => theories = theories ::: List(arg)),
       "X:" -> (arg => exclude_session_groups = exclude_session_groups ::: List(arg)),
       "a" -> (_ => all_sessions = true),
@@ -221,6 +220,7 @@
       "g:" -> (arg => session_groups = session_groups ::: List(arg)),
       "j:" -> (arg => max_jobs = Value.Int.parse(arg)),
       "o:" -> (arg => options = options + arg),
+      "s:" -> (arg => options = options + ("mirabelle_stride=" + arg)),
       "t:" -> (arg => options = options + ("mirabelle_timeout=" + arg)),
       "v" -> (_ => verbose = true),
       "x:" -> (arg => exclude_sessions = exclude_sessions ::: List(arg)))
@@ -241,7 +241,6 @@
         mirabelle(options, actions, output_dir,
           theories = theories,
           selection = Sessions.Selection(
-            requirements = requirements,
             all_sessions = all_sessions,
             base_sessions = base_sessions,
             exclude_session_groups = exclude_session_groups,
--- a/src/HOL/Tools/Mirabelle/mirabelle_sledgehammer.ML	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Tools/Mirabelle/mirabelle_sledgehammer.ML	Tue Jun 08 17:01:32 2021 +0200
@@ -36,7 +36,6 @@
 val sliceK = "slice" (*=BOOL: allow sledgehammer-level strategy-scheduling*)
 val smt_proofsK = "smt_proofs" (*=BOOL: enable SMT proof generation*)
 val strictK = "strict" (*=BOOL: run in strict mode*)
-val strideK = "stride" (*=NUM: run every nth goal*)
 val term_orderK = "term_order" (*=STRING: term order (in E)*)
 val type_encK = "type_enc" (*=STRING: type encoding scheme*)
 val uncurried_aliasesK = "uncurried_aliases" (*=SMART_BOOL: use fresh function names to alias curried applications*)
@@ -51,7 +50,6 @@
 val fact_filter_default = "smart"
 val type_enc_default = "smart"
 val strict_default = "false"
-val stride_default = 1
 val max_facts_default = "smart"
 val slice_default = "true"
 val max_calls_default = 10000000
@@ -615,7 +613,6 @@
 
 (* crude hack *)
 val num_sledgehammer_calls = Unsynchronized.ref 0
-val remaining_stride = Unsynchronized.ref stride_default
 
 val _ =
   Theory.setup (Mirabelle.theory_action \<^binding>\<open>sledgehammer\<close>
@@ -627,7 +624,6 @@
         val data = Unsynchronized.ref empty_data
         val change_data = Unsynchronized.change data
 
-        val stride = Mirabelle.get_int_argument args (strideK, stride_default)
         val max_calls = Mirabelle.get_int_argument args (max_callsK, max_calls_default)
         val check_trivial = Mirabelle.get_bool_argument args (check_trivialK, check_trivial_default)
 
@@ -638,13 +634,8 @@
               val goal = Thm.major_prem_of (#goal (Proof.goal st))
               val log =
                 if can Logic.dest_conjunction goal orelse can Logic.dest_equals goal then []
-                else if !remaining_stride > 1 then
-                  (* We still have some steps to do *)
-                  (Unsynchronized.dec remaining_stride; ["Skipping because of stride"])
                 else
-                  (* This was the last step, now run the action *)
                   let
-                    val _ = remaining_stride := stride
                     val _ = Unsynchronized.inc num_sledgehammer_calls
                   in
                     if !num_sledgehammer_calls > max_calls then
--- a/src/HOL/Tools/etc/options	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/Tools/etc/options	Tue Jun 08 17:01:32 2021 +0200
@@ -53,6 +53,9 @@
 option mirabelle_timeout : real = 30
   -- "default timeout for Mirabelle actions"
 
+option mirabelle_stride : int = 1
+  -- "default stride for running Mirabelle actions on every nth goal"
+
 option mirabelle_actions : string = ""
   -- "Mirabelle actions (outer syntax, separated by semicolons)"
 
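A minimal Scala sketch of the new mirabelle_stride option in use; all identifiers
(Options.init, the string-spec "+" operator, and the option names) appear in the
hunks of this changeset, only the concrete values 10 and 60 are illustrative:

  // enable Mirabelle actions on every 10th goal with a 60s timeout per action,
  // equivalent to the command-line flags "-s 10 -t 60" of the mirabelle.scala tool above
  val mirabelle_options: Options =
    Options.init() + "mirabelle_stride=10" + "mirabelle_timeout=60"
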
--- a/src/HOL/ex/Sqrt.thy	Tue Jun 01 19:46:34 2021 +0200
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,104 +0,0 @@
-(*  Title:      HOL/ex/Sqrt.thy
-    Author:     Markus Wenzel, Tobias Nipkow, TU Muenchen
-*)
-
-section \<open>Square roots of primes are irrational\<close>
-
-theory Sqrt
-imports Complex_Main "HOL-Computational_Algebra.Primes"
-begin
-
-text \<open>The square root of any prime number (including 2) is irrational.\<close>
-
-theorem sqrt_prime_irrational:
-  assumes "prime (p::nat)"
-  shows "sqrt p \<notin> \<rat>"
-proof
-  from \<open>prime p\<close> have p: "1 < p" by (simp add: prime_nat_iff)
-  assume "sqrt p \<in> \<rat>"
-  then obtain m n :: nat where
-      n: "n \<noteq> 0" and sqrt_rat: "\<bar>sqrt p\<bar> = m / n"
-    and "coprime m n" by (rule Rats_abs_nat_div_natE)
-  have eq: "m\<^sup>2 = p * n\<^sup>2"
-  proof -
-    from n and sqrt_rat have "m = \<bar>sqrt p\<bar> * n" by simp
-    then have "m\<^sup>2 = (sqrt p)\<^sup>2 * n\<^sup>2"
-      by (auto simp add: power2_eq_square)
-    also have "(sqrt p)\<^sup>2 = p" by simp
-    also have "\<dots> * n\<^sup>2 = p * n\<^sup>2" by simp
-    finally show ?thesis using of_nat_eq_iff by blast
-  qed
-  have "p dvd m \<and> p dvd n"
-  proof
-    from eq have "p dvd m\<^sup>2" ..
-    with \<open>prime p\<close> show "p dvd m" by (rule prime_dvd_power_nat)
-    then obtain k where "m = p * k" ..
-    with eq have "p * n\<^sup>2 = p\<^sup>2 * k\<^sup>2" by (auto simp add: power2_eq_square ac_simps)
-    with p have "n\<^sup>2 = p * k\<^sup>2" by (simp add: power2_eq_square)
-    then have "p dvd n\<^sup>2" ..
-    with \<open>prime p\<close> show "p dvd n" by (rule prime_dvd_power_nat)
-  qed
-  then have "p dvd gcd m n" by simp
-  with \<open>coprime m n\<close> have "p = 1" by simp
-  with p show False by simp
-qed
-
-corollary sqrt_2_not_rat: "sqrt 2 \<notin> \<rat>"
-  using sqrt_prime_irrational[of 2] by simp
-
-
-subsection \<open>Variations\<close>
-
-text \<open>
-  Here is an alternative version of the main proof, using mostly
-  linear forward-reasoning.  While this results in less top-down
-  structure, it is probably closer to proofs seen in mathematics.
-\<close>
-
-theorem
-  assumes "prime (p::nat)"
-  shows "sqrt p \<notin> \<rat>"
-proof
-  from \<open>prime p\<close> have p: "1 < p" by (simp add: prime_nat_iff)
-  assume "sqrt p \<in> \<rat>"
-  then obtain m n :: nat where
-      n: "n \<noteq> 0" and sqrt_rat: "\<bar>sqrt p\<bar> = m / n"
-    and "coprime m n" by (rule Rats_abs_nat_div_natE)
-  from n and sqrt_rat have "m = \<bar>sqrt p\<bar> * n" by simp
-  then have "m\<^sup>2 = (sqrt p)\<^sup>2 * n\<^sup>2"
-    by (auto simp add: power2_eq_square)
-  also have "(sqrt p)\<^sup>2 = p" by simp
-  also have "\<dots> * n\<^sup>2 = p * n\<^sup>2" by simp
-  finally have eq: "m\<^sup>2 = p * n\<^sup>2" using of_nat_eq_iff by blast
-  then have "p dvd m\<^sup>2" ..
-  with \<open>prime p\<close> have dvd_m: "p dvd m" by (rule prime_dvd_power_nat)
-  then obtain k where "m = p * k" ..
-  with eq have "p * n\<^sup>2 = p\<^sup>2 * k\<^sup>2" by (auto simp add: power2_eq_square ac_simps)
-  with p have "n\<^sup>2 = p * k\<^sup>2" by (simp add: power2_eq_square)
-  then have "p dvd n\<^sup>2" ..
-  with \<open>prime p\<close> have "p dvd n" by (rule prime_dvd_power_nat)
-  with dvd_m have "p dvd gcd m n" by (rule gcd_greatest)
-  with \<open>coprime m n\<close> have "p = 1" by simp
-  with p show False by simp
-qed
-
-
-text \<open>Another old chestnut, which is a consequence of the irrationality of 2.\<close>
-
-lemma "\<exists>a b::real. a \<notin> \<rat> \<and> b \<notin> \<rat> \<and> a powr b \<in> \<rat>" (is "\<exists>a b. ?P a b")
-proof cases
-  assume "sqrt 2 powr sqrt 2 \<in> \<rat>"
-  then have "?P (sqrt 2) (sqrt 2)"
-    by (metis sqrt_2_not_rat)
-  then show ?thesis by blast
-next
-  assume 1: "sqrt 2 powr sqrt 2 \<notin> \<rat>"
-  have "(sqrt 2 powr sqrt 2) powr sqrt 2 = 2"
-    using powr_realpow [of _ 2]
-    by (simp add: powr_powr power2_eq_square [symmetric])
-  then have "?P (sqrt 2 powr sqrt 2) (sqrt 2)"
-    by (metis 1 Rats_number_of sqrt_2_not_rat)
-  then show ?thesis by blast
-qed
-
-end
--- a/src/HOL/ex/Sqrt_Script.thy	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/HOL/ex/Sqrt_Script.thy	Tue Jun 08 17:01:32 2021 +0200
@@ -5,14 +5,14 @@
 
 section \<open>Square roots of primes are irrational (script version)\<close>
 
-theory Sqrt_Script
-imports Complex_Main "HOL-Computational_Algebra.Primes"
-begin
+text \<open>
+  Contrast this linear Isabelle/Isar script with the more mathematical version
+  in \<^file>\<open>~~/src/HOL/Examples/Sqrt.thy\<close> by Makarius Wenzel.
+\<close>
 
-text \<open>
-  \medskip Contrast this linear Isabelle/Isar script with Markus
-  Wenzel's more mathematical version.
-\<close>
+theory Sqrt_Script
+  imports Complex_Main "HOL-Computational_Algebra.Primes"
+begin
 
 subsection \<open>Preliminaries\<close>
 
--- a/src/Pure/Admin/components.scala	Tue Jun 01 19:46:34 2021 +0200
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,356 +0,0 @@
-/*  Title:      Pure/Admin/components.scala
-    Author:     Makarius
-
-Isabelle system components.
-*/
-
-package isabelle
-
-
-import java.io.{File => JFile}
-
-
-object Components
-{
-  /* archive name */
-
-  object Archive
-  {
-    val suffix: String = ".tar.gz"
-
-    def apply(name: String): String =
-      if (name == "") error("Bad component name: " + quote(name))
-      else name + suffix
-
-    def unapply(archive: String): Option[String] =
-    {
-      for {
-        name0 <- Library.try_unsuffix(suffix, archive)
-        name <- proper_string(name0)
-      } yield name
-    }
-
-    def get_name(archive: String): String =
-      unapply(archive) getOrElse
-        error("Bad component archive name (expecting .tar.gz): " + quote(archive))
-  }
-
-
-  /* component collections */
-
-  def default_component_repository: String =
-    Isabelle_System.getenv("ISABELLE_COMPONENT_REPOSITORY")
-
-  val default_components_base: Path = Path.explode("$ISABELLE_COMPONENTS_BASE")
-
-  def admin(dir: Path): Path = dir + Path.explode("Admin/components")
-
-  def contrib(dir: Path = Path.current, name: String = ""): Path =
-    dir + Path.explode("contrib") + Path.explode(name)
-
-  def unpack(dir: Path, archive: Path, progress: Progress = new Progress): String =
-  {
-    val name = Archive.get_name(archive.file_name)
-    progress.echo("Unpacking " + name)
-    Isabelle_System.gnutar("-xzf " + File.bash_path(archive), dir = dir).check
-    name
-  }
-
-  def resolve(base_dir: Path, names: List[String],
-    target_dir: Option[Path] = None,
-    copy_dir: Option[Path] = None,
-    progress: Progress = new Progress): Unit =
-  {
-    Isabelle_System.make_directory(base_dir)
-    for (name <- names) {
-      val archive_name = Archive(name)
-      val archive = base_dir + Path.explode(archive_name)
-      if (!archive.is_file) {
-        val remote = Components.default_component_repository + "/" + archive_name
-        Isabelle_System.download_file(remote, archive, progress = progress)
-      }
-      for (dir <- copy_dir) {
-        Isabelle_System.make_directory(dir)
-        Isabelle_System.copy_file(archive, dir)
-      }
-      unpack(target_dir getOrElse base_dir, archive, progress = progress)
-    }
-  }
-
-  private val platforms_family: Map[Platform.Family.Value, Set[String]] =
-    Map(
-      Platform.Family.linux_arm -> Set("arm64-linux", "arm64_32-linux"),
-      Platform.Family.linux -> Set("x86_64-linux", "x86_64_32-linux"),
-      Platform.Family.macos ->
-        Set("arm64-darwin", "arm64_32-darwin", "x86_64-darwin", "x86_64_32-darwin"),
-      Platform.Family.windows ->
-        Set("x86_64-cygwin", "x86_64-windows", "x86_64_32-windows", "x86-windows"))
-
-  private val platforms_all: Set[String] =
-    Set("x86-linux", "x86-cygwin") ++ platforms_family.iterator.flatMap(_._2)
-
-  def purge(dir: Path, platform: Platform.Family.Value): Unit =
-  {
-    val purge_set = platforms_all -- platforms_family(platform)
-
-    File.find_files(dir.file,
-      (file: JFile) => file.isDirectory && purge_set(file.getName),
-      include_dirs = true).foreach(Isabelle_System.rm_tree)
-  }
-
-
-  /* component directory content */
-
-  def settings(dir: Path = Path.current): Path = dir + Path.explode("etc/settings")
-  def components(dir: Path = Path.current): Path = dir + Path.explode("etc/components")
-
-  def check_dir(dir: Path): Boolean =
-    settings(dir).is_file || components(dir).is_file
-
-  def read_components(dir: Path): List[String] =
-    split_lines(File.read(components(dir))).filter(_.nonEmpty)
-
-  def write_components(dir: Path, lines: List[String]): Unit =
-    File.write(components(dir), terminate_lines(lines))
-
-
-  /* component repository content */
-
-  val components_sha1: Path = Path.explode("~~/Admin/components/components.sha1")
-
-  sealed case class SHA1_Digest(sha1: String, file_name: String)
-  {
-    override def toString: String = sha1 + "  " + file_name
-  }
-
-  def read_components_sha1(lines: List[String] = Nil): List[SHA1_Digest] =
-    (proper_list(lines) getOrElse split_lines(File.read(components_sha1))).flatMap(line =>
-      Word.explode(line) match {
-        case Nil => None
-        case List(sha1, name) => Some(SHA1_Digest(sha1, name))
-        case _ => error("Bad components.sha1 entry: " + quote(line))
-      })
-
-  def write_components_sha1(entries: List[SHA1_Digest]): Unit =
-    File.write(components_sha1, entries.sortBy(_.file_name).mkString("", "\n", "\n"))
-
-
-  /** manage user components **/
-
-  val components_path = Path.explode("$ISABELLE_HOME_USER/etc/components")
-
-  def read_components(): List[String] =
-    if (components_path.is_file) Library.trim_split_lines(File.read(components_path))
-    else Nil
-
-  def write_components(lines: List[String]): Unit =
-  {
-    Isabelle_System.make_directory(components_path.dir)
-    File.write(components_path, Library.terminate_lines(lines))
-  }
-
-  def update_components(add: Boolean, path0: Path, progress: Progress = new Progress): Unit =
-  {
-    val path = path0.expand.absolute
-    if (!(path + Path.explode("etc/settings")).is_file &&
-        !(path + Path.explode("etc/components")).is_file) error("Bad component directory: " + path)
-
-    val lines1 = read_components()
-    val lines2 =
-      lines1.filter(line =>
-        line.isEmpty || line.startsWith("#") || !File.eq(Path.explode(line), path))
-    val lines3 = if (add) lines2 ::: List(path.implode) else lines2
-
-    if (lines1 != lines3) write_components(lines3)
-
-    val prefix = if (lines1 == lines3) "Unchanged" else if (add) "Added" else "Removed"
-    progress.echo(prefix + " component " + path)
-  }
-
-
-  /* main entry point */
-
-  def main(args: Array[String]): Unit =
-  {
-    Command_Line.tool {
-      for (arg <- args) {
-        val add =
-          if (arg.startsWith("+")) true
-          else if (arg.startsWith("-")) false
-          else error("Bad argument: " + quote(arg))
-        val path = Path.explode(arg.substring(1))
-        update_components(add, path, progress = new Console_Progress)
-      }
-    }
-  }
-
-
-  /** build and publish components **/
-
-  def build_components(
-    options: Options,
-    components: List[Path],
-    progress: Progress = new Progress,
-    publish: Boolean = false,
-    force: Boolean = false,
-    update_components_sha1: Boolean = false): Unit =
-  {
-    val archives: List[Path] =
-      for (path <- components) yield {
-        path.file_name match {
-          case Archive(_) => path
-          case name =>
-            if (!path.is_dir) error("Bad component directory: " + path)
-            else if (!check_dir(path)) {
-              error("Malformed component directory: " + path +
-                "\n  (requires " + settings() + " or " + Components.components() + ")")
-            }
-            else {
-              val component_path = path.expand
-              val archive_dir = component_path.dir
-              val archive_name = Archive(name)
-
-              val archive = archive_dir + Path.explode(archive_name)
-              if (archive.is_file && !force) {
-                error("Component archive already exists: " + archive)
-              }
-
-              progress.echo("Packaging " + archive_name)
-              Isabelle_System.gnutar("-czf " + File.bash_path(archive) + " " + Bash.string(name),
-                dir = archive_dir).check
-
-              archive
-            }
-        }
-      }
-
-    if ((publish && archives.nonEmpty) || update_components_sha1) {
-      options.string("isabelle_components_server") match {
-        case SSH.Target(user, host) =>
-          using(SSH.open_session(options, host = host, user = user))(ssh =>
-          {
-            val components_dir = Path.explode(options.string("isabelle_components_dir"))
-            val contrib_dir = Path.explode(options.string("isabelle_components_contrib_dir"))
-
-            for (dir <- List(components_dir, contrib_dir) if !ssh.is_dir(dir)) {
-              error("Bad remote directory: " + dir)
-            }
-
-            if (publish) {
-              for (archive <- archives) {
-                val archive_name = archive.file_name
-                val name = Archive.get_name(archive_name)
-                val remote_component = components_dir + archive.base
-                val remote_contrib = contrib_dir + Path.explode(name)
-
-                // component archive
-                if (ssh.is_file(remote_component) && !force) {
-                  error("Remote component archive already exists: " + remote_component)
-                }
-                progress.echo("Uploading " + archive_name)
-                ssh.write_file(remote_component, archive)
-
-                // contrib directory
-                val is_standard_component =
-                  Isabelle_System.with_tmp_dir("component")(tmp_dir =>
-                  {
-                    Isabelle_System.gnutar("-xzf " + File.bash_path(archive), dir = tmp_dir).check
-                    check_dir(tmp_dir + Path.explode(name))
-                  })
-                if (is_standard_component) {
-                  if (ssh.is_dir(remote_contrib)) {
-                    if (force) ssh.rm_tree(remote_contrib)
-                    else error("Remote component directory already exists: " + remote_contrib)
-                  }
-                  progress.echo("Unpacking remote " + archive_name)
-                  ssh.execute("tar -C " + ssh.bash_path(contrib_dir) + " -xzf " +
-                    ssh.bash_path(remote_component)).check
-                }
-                else {
-                  progress.echo_warning("No unpacking of non-standard component: " + archive_name)
-                }
-              }
-            }
-
-            // remote SHA1 digests
-            if (update_components_sha1) {
-              val lines =
-                for {
-                  entry <- ssh.read_dir(components_dir)
-                  if entry.is_file && entry.name.endsWith(Archive.suffix)
-                }
-                yield {
-                  progress.echo("Digesting remote " + entry.name)
-                  ssh.execute("cd " + ssh.bash_path(components_dir) +
-                    "; sha1sum " + Bash.string(entry.name)).check.out
-                }
-              write_components_sha1(read_components_sha1(lines))
-            }
-          })
-        case s => error("Bad isabelle_components_server: " + quote(s))
-      }
-    }
-
-    // local SHA1 digests
-    {
-      val new_entries =
-        for (archive <- archives)
-        yield {
-          val file_name = archive.file_name
-          progress.echo("Digesting local " + file_name)
-          val sha1 = SHA1.digest(archive).rep
-          SHA1_Digest(sha1, file_name)
-        }
-      val new_names = new_entries.map(_.file_name).toSet
-
-      write_components_sha1(
-        new_entries :::
-        read_components_sha1().filterNot(entry => new_names.contains(entry.file_name)))
-    }
-  }
-
-
-  /* Isabelle tool wrapper */
-
-  private val relevant_options =
-    List("isabelle_components_server", "isabelle_components_dir", "isabelle_components_contrib_dir")
-
-  val isabelle_tool =
-    Isabelle_Tool("build_components", "build and publish Isabelle components",
-      Scala_Project.here, args =>
-    {
-      var publish = false
-      var update_components_sha1 = false
-      var force = false
-      var options = Options.init()
-
-      def show_options: String =
-        cat_lines(relevant_options.map(name => options.options(name).print))
-
-      val getopts = Getopts("""
-Usage: isabelle build_components [OPTIONS] ARCHIVES... DIRS...
-
-  Options are:
-    -P           publish on SSH server (see options below)
-    -f           force: overwrite existing component archives and directories
-    -o OPTION    override Isabelle system OPTION (via NAME=VAL or NAME)
-    -u           update all SHA1 keys in Isabelle repository Admin/components
-
-  Build and publish Isabelle components as .tar.gz archives on SSH server,
-  depending on system options:
-
-""" + Library.indent_lines(2, show_options) + "\n",
-        "P" -> (_ => publish = true),
-        "f" -> (_ => force = true),
-        "o:" -> (arg => options = options + arg),
-        "u" -> (_ => update_components_sha1 = true))
-
-      val more_args = getopts(args)
-      if (more_args.isEmpty && !update_components_sha1) getopts.usage()
-
-      val progress = new Console_Progress
-
-      build_components(options, more_args.map(Path.explode), progress = progress,
-        publish = publish, force = force, update_components_sha1 = update_components_sha1)
-    })
-}
--- a/src/Pure/PIDE/headless.scala	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Pure/PIDE/headless.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -574,7 +574,7 @@
       val session = new Session(session_base_info.session, options, resources)
 
       progress.echo("Starting session " + session_base_info.session + " ...")
-      Isabelle_Process(session, options, session_base_info.sessions_structure, store,
+      Isabelle_Process.start(session, options, session_base_info.sessions_structure, store,
         logic = session_base_info.session, modes = print_mode).await_startup()
 
       session
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/src/Pure/System/components.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -0,0 +1,361 @@
+/*  Title:      Pure/System/components.scala
+    Author:     Makarius
+
+Isabelle system components.
+*/
+
+package isabelle
+
+
+import java.io.{File => JFile}
+
+
+object Components
+{
+  /* archive name */
+
+  object Archive
+  {
+    val suffix: String = ".tar.gz"
+
+    def apply(name: String): String =
+      if (name == "") error("Bad component name: " + quote(name))
+      else name + suffix
+
+    def unapply(archive: String): Option[String] =
+    {
+      for {
+        name0 <- Library.try_unsuffix(suffix, archive)
+        name <- proper_string(name0)
+      } yield name
+    }
+
+    def get_name(archive: String): String =
+      unapply(archive) getOrElse
+        error("Bad component archive name (expecting .tar.gz): " + quote(archive))
+  }
+
+
+  /* component collections */
+
+  def default_component_repository: String =
+    Isabelle_System.getenv("ISABELLE_COMPONENT_REPOSITORY")
+
+  val default_components_base: Path = Path.explode("$ISABELLE_COMPONENTS_BASE")
+
+  def admin(dir: Path): Path = dir + Path.explode("Admin/components")
+
+  def contrib(dir: Path = Path.current, name: String = ""): Path =
+    dir + Path.explode("contrib") + Path.explode(name)
+
+  def unpack(dir: Path, archive: Path, progress: Progress = new Progress): String =
+  {
+    val name = Archive.get_name(archive.file_name)
+    progress.echo("Unpacking " + name)
+    Isabelle_System.gnutar("-xzf " + File.bash_path(archive), dir = dir).check
+    name
+  }
+
+  def resolve(base_dir: Path, names: List[String],
+    target_dir: Option[Path] = None,
+    copy_dir: Option[Path] = None,
+    progress: Progress = new Progress): Unit =
+  {
+    Isabelle_System.make_directory(base_dir)
+    for (name <- names) {
+      val archive_name = Archive(name)
+      val archive = base_dir + Path.explode(archive_name)
+      if (!archive.is_file) {
+        val remote = Components.default_component_repository + "/" + archive_name
+        Isabelle_System.download_file(remote, archive, progress = progress)
+      }
+      for (dir <- copy_dir) {
+        Isabelle_System.make_directory(dir)
+        Isabelle_System.copy_file(archive, dir)
+      }
+      unpack(target_dir getOrElse base_dir, archive, progress = progress)
+    }
+  }
+
+  private val platforms_family: Map[Platform.Family.Value, Set[String]] =
+    Map(
+      Platform.Family.linux_arm -> Set("arm64-linux", "arm64_32-linux"),
+      Platform.Family.linux -> Set("x86_64-linux", "x86_64_32-linux"),
+      Platform.Family.macos ->
+        Set("arm64-darwin", "arm64_32-darwin", "x86_64-darwin", "x86_64_32-darwin"),
+      Platform.Family.windows ->
+        Set("x86_64-cygwin", "x86_64-windows", "x86_64_32-windows", "x86-windows"))
+
+  private val platforms_all: Set[String] =
+    Set("x86-linux", "x86-cygwin") ++ platforms_family.iterator.flatMap(_._2)
+
+  def purge(dir: Path, platform: Platform.Family.Value): Unit =
+  {
+    val purge_set = platforms_all -- platforms_family(platform)
+
+    File.find_files(dir.file,
+      (file: JFile) => file.isDirectory && purge_set(file.getName),
+      include_dirs = true).foreach(Isabelle_System.rm_tree)
+  }
+
+
+  /* component directories */
+
+  def directories(): List[Path] =
+    Path.split(Isabelle_System.getenv_strict("ISABELLE_COMPONENTS"))
+
+
+  /* component directory content */
+
+  def settings(dir: Path = Path.current): Path = dir + Path.explode("etc/settings")
+  def components(dir: Path = Path.current): Path = dir + Path.explode("etc/components")
+
+  def check_dir(dir: Path): Boolean =
+    settings(dir).is_file || components(dir).is_file
+
+  def read_components(dir: Path): List[String] =
+    split_lines(File.read(components(dir))).filter(_.nonEmpty)
+
+  def write_components(dir: Path, lines: List[String]): Unit =
+    File.write(components(dir), terminate_lines(lines))
+
+
+  /* component repository content */
+
+  val components_sha1: Path = Path.explode("~~/Admin/components/components.sha1")
+
+  sealed case class SHA1_Digest(sha1: String, file_name: String)
+  {
+    override def toString: String = sha1 + "  " + file_name
+  }
+
+  def read_components_sha1(lines: List[String] = Nil): List[SHA1_Digest] =
+    (proper_list(lines) getOrElse split_lines(File.read(components_sha1))).flatMap(line =>
+      Word.explode(line) match {
+        case Nil => None
+        case List(sha1, name) => Some(SHA1_Digest(sha1, name))
+        case _ => error("Bad components.sha1 entry: " + quote(line))
+      })
+
+  def write_components_sha1(entries: List[SHA1_Digest]): Unit =
+    File.write(components_sha1, entries.sortBy(_.file_name).mkString("", "\n", "\n"))
+
+
+  /** manage user components **/
+
+  val components_path = Path.explode("$ISABELLE_HOME_USER/etc/components")
+
+  def read_components(): List[String] =
+    if (components_path.is_file) Library.trim_split_lines(File.read(components_path))
+    else Nil
+
+  def write_components(lines: List[String]): Unit =
+  {
+    Isabelle_System.make_directory(components_path.dir)
+    File.write(components_path, Library.terminate_lines(lines))
+  }
+
+  def update_components(add: Boolean, path0: Path, progress: Progress = new Progress): Unit =
+  {
+    val path = path0.expand.absolute
+    if (!check_dir(path) && !Sessions.is_session_dir(path)) error("Bad component directory: " + path)
+
+    val lines1 = read_components()
+    val lines2 =
+      lines1.filter(line =>
+        line.isEmpty || line.startsWith("#") || !File.eq(Path.explode(line), path))
+    val lines3 = if (add) lines2 ::: List(path.implode) else lines2
+
+    if (lines1 != lines3) write_components(lines3)
+
+    val prefix = if (lines1 == lines3) "Unchanged" else if (add) "Added" else "Removed"
+    progress.echo(prefix + " component " + path)
+  }
+
+
+  /* main entry point */
+
+  def main(args: Array[String]): Unit =
+  {
+    Command_Line.tool {
+      for (arg <- args) {
+        val add =
+          if (arg.startsWith("+")) true
+          else if (arg.startsWith("-")) false
+          else error("Bad argument: " + quote(arg))
+        val path = Path.explode(arg.substring(1))
+        update_components(add, path, progress = new Console_Progress)
+      }
+    }
+  }
+
+
+  /** build and publish components **/
+
+  def build_components(
+    options: Options,
+    components: List[Path],
+    progress: Progress = new Progress,
+    publish: Boolean = false,
+    force: Boolean = false,
+    update_components_sha1: Boolean = false): Unit =
+  {
+    val archives: List[Path] =
+      for (path <- components) yield {
+        path.file_name match {
+          case Archive(_) => path
+          case name =>
+            if (!path.is_dir) error("Bad component directory: " + path)
+            else if (!check_dir(path)) {
+              error("Malformed component directory: " + path +
+                "\n  (requires " + settings() + " or " + Components.components() + ")")
+            }
+            else {
+              val component_path = path.expand
+              val archive_dir = component_path.dir
+              val archive_name = Archive(name)
+
+              val archive = archive_dir + Path.explode(archive_name)
+              if (archive.is_file && !force) {
+                error("Component archive already exists: " + archive)
+              }
+
+              progress.echo("Packaging " + archive_name)
+              Isabelle_System.gnutar("-czf " + File.bash_path(archive) + " " + Bash.string(name),
+                dir = archive_dir).check
+
+              archive
+            }
+        }
+      }
+
+    if ((publish && archives.nonEmpty) || update_components_sha1) {
+      options.string("isabelle_components_server") match {
+        case SSH.Target(user, host) =>
+          using(SSH.open_session(options, host = host, user = user))(ssh =>
+          {
+            val components_dir = Path.explode(options.string("isabelle_components_dir"))
+            val contrib_dir = Path.explode(options.string("isabelle_components_contrib_dir"))
+
+            for (dir <- List(components_dir, contrib_dir) if !ssh.is_dir(dir)) {
+              error("Bad remote directory: " + dir)
+            }
+
+            if (publish) {
+              for (archive <- archives) {
+                val archive_name = archive.file_name
+                val name = Archive.get_name(archive_name)
+                val remote_component = components_dir + archive.base
+                val remote_contrib = contrib_dir + Path.explode(name)
+
+                // component archive
+                if (ssh.is_file(remote_component) && !force) {
+                  error("Remote component archive already exists: " + remote_component)
+                }
+                progress.echo("Uploading " + archive_name)
+                ssh.write_file(remote_component, archive)
+
+                // contrib directory
+                val is_standard_component =
+                  Isabelle_System.with_tmp_dir("component")(tmp_dir =>
+                  {
+                    Isabelle_System.gnutar("-xzf " + File.bash_path(archive), dir = tmp_dir).check
+                    check_dir(tmp_dir + Path.explode(name))
+                  })
+                if (is_standard_component) {
+                  if (ssh.is_dir(remote_contrib)) {
+                    if (force) ssh.rm_tree(remote_contrib)
+                    else error("Remote component directory already exists: " + remote_contrib)
+                  }
+                  progress.echo("Unpacking remote " + archive_name)
+                  ssh.execute("tar -C " + ssh.bash_path(contrib_dir) + " -xzf " +
+                    ssh.bash_path(remote_component)).check
+                }
+                else {
+                  progress.echo_warning("No unpacking of non-standard component: " + archive_name)
+                }
+              }
+            }
+
+            // remote SHA1 digests
+            if (update_components_sha1) {
+              val lines =
+                for {
+                  entry <- ssh.read_dir(components_dir)
+                  if entry.is_file && entry.name.endsWith(Archive.suffix)
+                }
+                yield {
+                  progress.echo("Digesting remote " + entry.name)
+                  ssh.execute("cd " + ssh.bash_path(components_dir) +
+                    "; sha1sum " + Bash.string(entry.name)).check.out
+                }
+              write_components_sha1(read_components_sha1(lines))
+            }
+          })
+        case s => error("Bad isabelle_components_server: " + quote(s))
+      }
+    }
+
+    // local SHA1 digests
+    {
+      val new_entries =
+        for (archive <- archives)
+        yield {
+          val file_name = archive.file_name
+          progress.echo("Digesting local " + file_name)
+          val sha1 = SHA1.digest(archive).rep
+          SHA1_Digest(sha1, file_name)
+        }
+      val new_names = new_entries.map(_.file_name).toSet
+
+      write_components_sha1(
+        new_entries :::
+        read_components_sha1().filterNot(entry => new_names.contains(entry.file_name)))
+    }
+  }
+
+
+  /* Isabelle tool wrapper */
+
+  private val relevant_options =
+    List("isabelle_components_server", "isabelle_components_dir", "isabelle_components_contrib_dir")
+
+  val isabelle_tool =
+    Isabelle_Tool("build_components", "build and publish Isabelle components",
+      Scala_Project.here, args =>
+    {
+      var publish = false
+      var update_components_sha1 = false
+      var force = false
+      var options = Options.init()
+
+      def show_options: String =
+        cat_lines(relevant_options.map(name => options.options(name).print))
+
+      val getopts = Getopts("""
+Usage: isabelle build_components [OPTIONS] ARCHIVES... DIRS...
+
+  Options are:
+    -P           publish on SSH server (see options below)
+    -f           force: overwrite existing component archives and directories
+    -o OPTION    override Isabelle system OPTION (via NAME=VAL or NAME)
+    -u           update all SHA1 keys in Isabelle repository Admin/components
+
+  Build and publish Isabelle components as .tar.gz archives on SSH server,
+  depending on system options:
+
+""" + Library.indent_lines(2, show_options) + "\n",
+        "P" -> (_ => publish = true),
+        "f" -> (_ => force = true),
+        "o:" -> (arg => options = options + arg),
+        "u" -> (_ => update_components_sha1 = true))
+
+      val more_args = getopts(args)
+      if (more_args.isEmpty && !update_components_sha1) getopts.usage()
+
+      val progress = new Console_Progress
+
+      build_components(options, more_args.map(Path.explode), progress = progress,
+        publish = publish, force = force, update_components_sha1 = update_components_sha1)
+    })
+}
--- a/src/Pure/System/isabelle_process.scala	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Pure/System/isabelle_process.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -12,7 +12,7 @@
 
 object Isabelle_Process
 {
-  def apply(
+  def start(
     session: Session,
     options: Options,
     sessions_structure: Sessions.Structure,
@@ -37,13 +37,16 @@
           modes = modes, cwd = cwd, env = env)
       }
       catch { case exn @ ERROR(_) => channel.shutdown(); throw exn }
-    process.stdin.close()
 
-    new Isabelle_Process(session, channel, process)
+    val isabelle_process = new Isabelle_Process(session, process)
+    process.stdin.close()
+    session.start(receiver => new Prover(receiver, session.cache, channel, process))
+
+    isabelle_process
   }
 }
 
-class Isabelle_Process private(session: Session, channel: System_Channel, process: Bash.Process)
+class Isabelle_Process private(session: Session, process: Bash.Process)
 {
   private val startup = Future.promise[String]
   private val terminated = Future.promise[Process_Result]
@@ -62,8 +65,6 @@
       case _ =>
     }
 
-  session.start(receiver => new Prover(receiver, session.cache, channel, process))
-
   def await_startup(): Isabelle_Process =
     startup.join match {
       case "" => this
--- a/src/Pure/System/isabelle_system.scala	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Pure/System/isabelle_system.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -581,12 +581,6 @@
   def admin(): Boolean = Path.explode("~~/Admin").is_dir
 
 
-  /* components */
-
-  def components(): List[Path] =
-    Path.split(getenv_strict("ISABELLE_COMPONENTS"))
-
-
   /* default logic */
 
   def default_logic(args: String*): String =
--- a/src/Pure/System/options.scala	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Pure/System/options.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -134,7 +134,7 @@
   {
     var options = empty
     for {
-      dir <- Isabelle_System.components()
+      dir <- Components.directories()
       file = dir + OPTIONS if file.is_file
     } { options = Parser.parse_file(options, file.implode, File.read(file)) }
     opts.foldLeft(Options.Parser.parse_prefs(options, prefs))(_ + _)
@@ -349,7 +349,7 @@
     val opt = check_name(name)
     opt_value match {
       case Some(value) => this + (name, value)
-      case None if opt.typ == Options.Bool => this + (name, "true")
+      case None if opt.typ == Options.Bool || opt.typ == Options.String => this + (name, "true")
       case None => error("Missing value for option " + quote(name) + " : " + opt.typ.print)
     }
   }
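A small sketch of the effect of this relaxation, assuming the standard string-spec
variant of Options."+" used by the getopts handlers earlier in this changeset
(that "document" is a string option is an assumption, not shown in these hunks):

  // a bare name is now accepted for string options as well: "system_log" alone
  // behaves like system_log=true (mapped to Logger.make(progress) in the
  // build.scala hunk below), and "document" alone like document=true
  // (accepted next to "pdf" in the sessions.scala hunk below)
  val opts: Options = Options.init() + "system_log" + "document"
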
--- a/src/Pure/Thy/sessions.scala	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Pure/Thy/sessions.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -510,7 +510,7 @@
     def document_enabled: Boolean =
       options.string("document") match {
         case "" | "false" => false
-        case "pdf" => true
+        case "pdf" | "true" => true
         case doc => error("Bad document specification " + quote(doc))
       }
 
@@ -1014,7 +1014,7 @@
 
   /* load sessions from certain directories */
 
-  private def is_session_dir(dir: Path): Boolean =
+  def is_session_dir(dir: Path): Boolean =
     (dir + ROOT).is_file || (dir + ROOTS).is_file
 
   private def check_session_dir(dir: Path): Path =
@@ -1023,7 +1023,7 @@
 
   def directories(dirs: List[Path], select_dirs: List[Path]): List[(Boolean, Path)] =
   {
-    val default_dirs = Isabelle_System.components().filter(is_session_dir)
+    val default_dirs = Components.directories().filter(is_session_dir)
     for { (select, dir) <- (default_dirs ::: dirs).map((false, _)) ::: select_dirs.map((true, _)) }
     yield (select, dir.canonical)
   }
--- a/src/Pure/Thy/thy_info.ML	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Pure/Thy/thy_info.ML	Tue Jun 08 17:01:32 2021 +0200
@@ -18,7 +18,8 @@
   val get_theory: string -> theory
   val master_directory: string -> Path.T
   val remove_thy: string -> unit
-  val use_theories: Options.T -> string -> (string * Position.T) list -> unit
+  val use_theories: Options.T -> string -> (string * Position.T) list ->
+    (theory * Document_Output.segment list) list
   val use_thy: string -> unit
   val script_thy: Position.T -> string -> theory -> theory
   val register_thy: theory -> unit
@@ -47,19 +48,13 @@
   fun merge data : T = Library.merge (eq_snd op =) data;
 );
 
-fun sequential_presentation options =
-  not (Options.bool options \<^system_option>\<open>parallel_presentation\<close>);
-
 fun apply_presentation (context: presentation_context) thy =
-  Par_List.map'
-    {name = "apply_presentation", sequential = sequential_presentation (#options context)}
-    (fn (f, _) => f context thy) (Presentation.get thy)
-  |> ignore;
+  ignore (Par_List.map (fn (f, _) => f context thy) (Presentation.get thy));
 
 fun add_presentation f = Presentation.map (cons (f, stamp ()));
 
 val _ =
-  Theory.setup (add_presentation (fn {options, file_pos, segments, ...} => fn thy =>
+  Theory.setup (add_presentation (fn {options, segments, ...} => fn thy =>
     if exists (Toplevel.is_skipped_proof o #state) segments then ()
     else
       let
@@ -186,85 +181,80 @@
 (* scheduling loader tasks *)
 
 datatype result =
-  Result of {theory: theory, exec_id: Document_ID.exec,
-    present: unit -> unit, commit: unit -> unit, weight: int};
+  Result of
+   {theory: theory,
+    exec_id: Document_ID.exec,
+    present: unit -> presentation_context option,
+    commit: unit -> unit};
 
 fun theory_result theory =
-  Result {theory = theory, exec_id = Document_ID.none, present = I, commit = I, weight = 0};
+  Result
+   {theory = theory,
+    exec_id = Document_ID.none,
+    present = K NONE,
+    commit = I};
 
 fun result_theory (Result {theory, ...}) = theory;
-fun result_present (Result {present, ...}) = present;
 fun result_commit (Result {commit, ...}) = commit;
-fun result_ord (Result {weight = i, ...}, Result {weight = j, ...}) = int_ord (j, i);
-
-fun join_theory (Result {theory, exec_id, ...}) =
-  let
-    val _ = Execution.join [exec_id];
-    val res = Exn.capture Thm.consolidate_theory theory;
-    val exns = maps Task_Queue.group_status (Execution.peek exec_id);
-  in res :: map Exn.Exn exns end;
 
 datatype task =
   Task of string list * (theory list -> result) |
   Finished of theory;
 
-fun task_finished (Task _) = false
-  | task_finished (Finished _) = true;
+local
 
-fun task_parents deps (parents: string list) = map (the o AList.lookup (op =) deps) parents;
+fun consolidate_theory (Exn.Exn exn) = [Exn.Exn exn]
+  | consolidate_theory (Exn.Res (Result {theory, exec_id, ...})) =
+      let
+        val _ = Execution.join [exec_id];
+        val res = Exn.capture Thm.consolidate_theory theory;
+        val exns = maps Task_Queue.group_status (Execution.peek exec_id);
+      in res :: map Exn.Exn exns end;
 
-val schedule_seq =
-  String_Graph.schedule (fn deps => fn (_, task) =>
-    (case task of
-      Task (parents, body) =>
-        let
-          val result = body (task_parents deps parents);
-          val _ = Par_Exn.release_all (join_theory result);
-          val _ = result_present result ();
-          val _ = result_commit result ();
-        in result_theory result end
-    | Finished thy => thy)) #> ignore;
+fun present_theory (Exn.Exn exn) = [Exn.Exn exn]
+  | present_theory (Exn.Res (Result {theory, present, ...})) =
+      (case present () of
+        NONE => []
+      | SOME (context as {segments, ...}) =>
+          [Exn.capture (fn () => (apply_presentation context theory; (theory, segments))) ()]);
+
+in
 
-val schedule_futures = Thread_Attributes.uninterruptible (fn _ => fn tasks =>
+val schedule_theories = Thread_Attributes.uninterruptible (fn _ => fn tasks =>
   let
+    fun join_parents deps name parents =
+      (case map #1 (filter (not o can Future.join o #2) deps) of
+        [] => map (result_theory o Future.join o the o AList.lookup (op =) deps) parents
+      | bad =>
+          error ("Failed to load theory " ^ quote name ^ " (unresolved " ^ commas_quote bad ^ ")"));
+
     val futures = tasks
       |> String_Graph.schedule (fn deps => fn (name, task) =>
         (case task of
           Task (parents, body) =>
-            (singleton o Future.forks)
-              {name = "theory:" ^ name, group = NONE,
-                deps = map (Future.task_of o #2) deps, pri = 0, interrupts = true}
-              (fn () =>
-                (case filter (not o can Future.join o #2) deps of
-                  [] => body (map (result_theory o Future.join) (task_parents deps parents))
-                | bad =>
-                    error
-                      ("Failed to load theory " ^ quote name ^
-                        " (unresolved " ^ commas_quote (map #1 bad) ^ ")")))
+            if Multithreading.max_threads () > 1 then
+              (singleton o Future.forks)
+                {name = "theory:" ^ name, group = NONE,
+                  deps = map (Future.task_of o #2) deps, pri = 0, interrupts = true}
+                (fn () => body (join_parents deps name parents))
+            else
+              Future.value_result (Exn.capture (fn () => body (join_parents deps name parents)) ())
         | Finished theory => Future.value (theory_result theory)));
 
-    val results1 = futures
-      |> maps (fn future =>
-          (case Future.join_result future of
-            Exn.Res result => join_theory result
-          | Exn.Exn exn => [Exn.Exn exn]));
+    val results1 = futures |> maps (consolidate_theory o Future.join_result);
 
-    val results2 = futures
-      |> map_filter (Exn.get_res o Future.join_result)
-      |> sort result_ord
-      |> Par_List.map'
-          {name = "present", sequential = sequential_presentation (Options.default ())}
-          (fn result => Exn.capture (result_present result) ());
+    val present_results = futures |> maps (present_theory o Future.join_result);
+    val results2 = (map o Exn.map_res) (K ()) present_results;
 
-    (* FIXME more precise commit order (!?) *)
     val results3 = futures
       |> map (fn future => Exn.capture (fn () => result_commit (Future.join future) ()) ());
 
-    (* FIXME avoid global Execution.reset (!??) *)
     val results4 = map Exn.Exn (maps Task_Queue.group_status (Execution.reset ()));
 
     val _ = Par_Exn.release_all (results1 @ results2 @ results3 @ results4);
-  in () end);
+  in Par_Exn.release_all present_results end);
+
+end;
 
 
 (* eval theory *)
@@ -306,16 +296,14 @@
 
     val (results, thy) = cond_timeit true ("theory " ^ quote name) excursion;
 
-    fun present () =
+    fun present () : presentation_context =
       let
         val segments =
           (spans ~~ maps Toplevel.join_results results) |> map (fn (span, (st, tr, st')) =>
             {span = span, prev_state = st, command = tr, state = st'});
-        val context: presentation_context =
-          {options = options, file_pos = text_pos, adjust_pos = I, segments = segments};
-      in apply_presentation context thy end;
+      in {options = options, file_pos = text_pos, adjust_pos = I, segments = segments} end;
 
-  in (thy, present, size text) end;
+  in (thy, present) end;
 
 
 (* require_thy -- checking database entries wrt. the file-system *)
@@ -352,7 +340,7 @@
     val timing_start = Timing.start ();
 
     val header = Thy_Header.make (name, put_id header_pos) imports keywords;
-    val (theory, present, weight) =
+    val (theory, present) =
       eval_thy options dir header text_pos text
         (if name = Context.PureN then [Context.the_global_context ()] else parents);
 
@@ -361,9 +349,7 @@
     val _ = Output.try_protocol_message (timing_props @ Markup.timing_properties timing_result) []
 
     fun commit () = update_thy deps theory;
-  in
-    Result {theory = theory, exec_id = exec_id, present = present, commit = commit, weight = weight}
-  end;
+  in Result {theory = theory, exec_id = exec_id, present = SOME o present, commit = commit} end;
 
 fun check_thy_deps dir name =
   (case lookup_deps name of
@@ -383,6 +369,9 @@
             | SOME theory => Resources.loaded_files_current theory);
       in (current, deps', theory_pos', imports', keywords') end);
 
+fun task_finished (Task _) = false
+  | task_finished (Finished _) = true;
+
 in
 
 fun require_thys options initiators qualifier dir strs tasks =
@@ -430,11 +419,10 @@
 (* use theories *)
 
 fun use_theories options qualifier imports =
-  let val (_, tasks) = require_thys options [] qualifier Path.current imports String_Graph.empty
-  in if Multithreading.max_threads () > 1 then schedule_futures tasks else schedule_seq tasks end;
+  schedule_theories (#2 (require_thys options [] qualifier Path.current imports String_Graph.empty));
 
 fun use_thy name =
-  use_theories (Options.default ()) Resources.default_qualifier [(name, Position.none)];
+  ignore (use_theories (Options.default ()) Resources.default_qualifier [(name, Position.none)]);
 
 
 (* toplevel scripting -- without maintaining database *)
--- a/src/Pure/Tools/build.ML	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Pure/Tools/build.ML	Tue Jun 08 17:01:32 2021 +0200
@@ -4,7 +4,13 @@
 Build Isabelle sessions.
 *)
 
-structure Build: sig end =
+signature BUILD =
+sig
+  type hook = string -> (theory * Document_Output.segment list) list -> unit
+  val add_hook: hook -> unit
+end;
+
+structure Build: BUILD =
 struct
 
 (* session timing *)
@@ -23,28 +29,43 @@
 
 (* build theories *)
 
+type hook = string -> (theory * Document_Output.segment list) list -> unit;
+
+local
+  val hooks = Synchronized.var "Build.hooks" ([]: hook list);
+in
+
+fun add_hook hook = Synchronized.change hooks (cons hook);
+
+fun apply_hooks qualifier loaded_theories =
+  Synchronized.value hooks
+  |> List.app (fn hook => hook qualifier loaded_theories);
+
+end;
+
+
 fun build_theories qualifier (options, thys) =
   let
     val condition = space_explode "," (Options.string options "condition");
     val conds = filter_out (can getenv_strict) condition;
-  in
-    if null conds then
-      (Options.set_default options;
-        Isabelle_Process.init_options ();
-        Future.fork I;
-        (Thy_Info.use_theories options qualifier
-        |>
-          (case Options.string options "profiling" of
-            "" => I
-          | "time" => profile_time
-          | "allocations" => profile_allocations
-          | bad => error ("Bad profiling option: " ^ quote bad))
-        |> Unsynchronized.setmp print_mode
-            (space_explode "," (Options.string options "print_mode") @ print_mode_value ())) thys)
-    else
-      Output.physical_stderr ("Skipping theories " ^ commas_quote (map #1 thys) ^
-        " (undefined " ^ commas conds ^ ")\n")
-  end;
+    val loaded_theories =
+      if null conds then
+        (Options.set_default options;
+          Isabelle_Process.init_options ();
+          Future.fork I;
+          (Thy_Info.use_theories options qualifier
+          |>
+            (case Options.string options "profiling" of
+              "" => I
+            | "time" => profile_time
+            | "allocations" => profile_allocations
+            | bad => error ("Bad profiling option: " ^ quote bad))
+          |> Unsynchronized.setmp print_mode
+              (space_explode "," (Options.string options "print_mode") @ print_mode_value ())) thys)
+      else
+       (Output.physical_stderr ("Skipping theories " ^ commas_quote (map #1 thys) ^
+          " (undefined " ^ commas conds ^ ")\n"); [])
+  in apply_hooks qualifier loaded_theories end;
 
 
 (* build session *)
--- a/src/Pure/Tools/build.scala	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Pure/Tools/build.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -183,7 +183,8 @@
     no_build: Boolean = false,
     soft_build: Boolean = false,
     verbose: Boolean = false,
-    export_files: Boolean = false): Results =
+    export_files: Boolean = false,
+    session_setup: (String, Session) => Unit = (_, _) => ()): Results =
   {
     val build_options =
       options +
@@ -295,7 +296,7 @@
     val log =
       build_options.string("system_log") match {
         case "" => No_Logger
-        case "-" => Logger.make(progress)
+        case "true" => Logger.make(progress)
         case log_file => Logger.make(Some(Path.explode(log_file)))
       }
 
@@ -425,7 +426,7 @@
                   val numa_node = numa_nodes.next(used_node)
                   val job =
                     new Build_Job(progress, session_name, info, deps, store, do_store,
-                      verbose, log, numa_node, queue.command_timings(session_name))
+                      log, session_setup, numa_node, queue.command_timings(session_name))
                   loop(pending, running + (session_name -> (ancestor_heaps, job)), results)
                 }
                 else {
--- a/src/Pure/Tools/build_job.scala	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Pure/Tools/build_job.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -202,8 +202,8 @@
   deps: Sessions.Deps,
   store: Sessions.Store,
   do_store: Boolean,
-  verbose: Boolean,
   log: Logger,
+  session_setup: (String, Session) => Unit,
   val numa_node: Option[Int],
   command_timings0: List[Properties.T])
 {
@@ -411,10 +411,12 @@
           case _ =>
         }
 
+      session_setup(session_name, session)
+
       val eval_main = Command_Line.ML_tool("Isabelle_Process.init_build ()" :: eval_store)
 
       val process =
-        Isabelle_Process(session, options, sessions_structure, store,
+        Isabelle_Process.start(session, options, sessions_structure, store,
           logic = parent, raw_ml_system = is_pure,
           use_prelude = use_prelude, eval_main = eval_main,
           cwd = info.dir.file, env = env)
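A rough sketch of a session_setup callback as now accepted by Build.build (see the
build.scala hunk above) and invoked here just before Isabelle_Process.start; the
Session.Consumer and all_messages names are assumptions about the existing PIDE
Session API, not something introduced by this changeset:

  // runs once per prover session of a build job: attach a consumer that sees
  // every prover message of that session (here it merely discards them)
  val example_session_setup: (String, Session) => Unit =
    (session_name, session) =>
      session.all_messages +=
        Session.Consumer[Prover.Message]("setup_" + session_name)(_ => ())
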
--- a/src/Pure/build-jars	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Pure/build-jars	Tue Jun 08 17:01:32 2021 +0200
@@ -34,7 +34,6 @@
   src/Pure/Admin/build_zipperposition.scala
   src/Pure/Admin/check_sources.scala
   src/Pure/Admin/ci_profile.scala
-  src/Pure/Admin/components.scala
   src/Pure/Admin/isabelle_cronjob.scala
   src/Pure/Admin/isabelle_devel.scala
   src/Pure/Admin/jenkins.scala
@@ -132,6 +131,7 @@
   src/Pure/ROOT.scala
   src/Pure/System/bash.scala
   src/Pure/System/command_line.scala
+  src/Pure/System/components.scala
   src/Pure/System/cygwin.scala
   src/Pure/System/executable.scala
   src/Pure/System/getopts.scala
--- a/src/Tools/VSCode/src/language_server.scala	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Tools/VSCode/src/language_server.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -306,8 +306,8 @@
       dynamic_output.init()
 
       try {
-        Isabelle_Process(session, options, base_info.sessions_structure, Sessions.store(options),
-          modes = modes, logic = base_info.session).await_startup()
+        Isabelle_Process.start(session, options, base_info.sessions_structure,
+          Sessions.store(options), modes = modes, logic = base_info.session).await_startup()
         reply_ok("Welcome to Isabelle/" + base_info.session + Isabelle_System.isabelle_heading())
       }
       catch { case ERROR(msg) => reply_error(msg) }
--- a/src/Tools/jEdit/src/isabelle.scala	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Tools/jEdit/src/isabelle.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -199,6 +199,7 @@
       if (continuous_checking != b) {
         PIDE.options.bool(CONTINUOUS_CHECKING) = b
         PIDE.session.update_options(PIDE.options.value)
+        PIDE.plugin.deps_changed()
       }
     }
 
--- a/src/Tools/jEdit/src/jedit_sessions.scala	Tue Jun 01 19:46:34 2021 +0200
+++ b/src/Tools/jEdit/src/jedit_sessions.scala	Tue Jun 08 17:01:32 2021 +0200
@@ -143,7 +143,7 @@
 
     session.phase_changed += PIDE.plugin.session_phase_changed
 
-    Isabelle_Process(session, options, sessions_structure, store,
+    Isabelle_Process.start(session, options, sessions_structure, store,
       logic = PIDE.resources.session_name,
       modes =
         (space_explode(',', options.string("jedit_print_mode")) :::