merged
author Angeliki KoutsoukouArgyraki <ak2110@cam.ac.uk>
Mon, 03 Sep 2018 13:32:29 +0100
changeset 68891 3cdde2ea2667
parent 68890 725d5ed56563 (current diff)
parent 68889 d9c051e9da2b (diff)
child 68893 58bf801d679a
--- a/NEWS	Mon Sep 03 13:28:52 2018 +0100
+++ b/NEWS	Mon Sep 03 13:32:29 2018 +0100
@@ -14,6 +14,12 @@
 specified in seconds; a negative value means it is disabled (default).
 
 
+*** Isar ***
+
+* More robust treatment of structural errors: begin/end blocks take
+precedence over goal/proof.
+
+
 *** HOL ***
 
 * Simplified syntax setup for big operators under image. In rare
@@ -22,6 +28,9 @@
 SUPREMUM, UNION, INTER should now rarely occur in output and are just
 retained as migration auxiliary. INCOMPATIBILITY.
 
+* Sledgehammer: The URL for SystemOnTPTP, which is used by remote
+provers, has been updated.
+
 
 *** ML ***
 
@@ -49,6 +58,14 @@
 observes the official standard).
 
 
+*** System ***
+
+* Isabelle Server message "use_theories" terminates more robustly in the
+presence of structurally broken sources: full consolidation of theories
+is no longer required.
+
+
+
 New in Isabelle2018 (August 2018)
 ---------------------------------
 
--- a/src/Doc/System/Server.thy	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Doc/System/Server.thy	Mon Sep 03 13:32:29 2018 +0100
@@ -516,14 +516,21 @@
   session-qualified theory name (e.g.\ \<^verbatim>\<open>HOL-ex.Seq\<close>).
 
   \<^item> \<^bold>\<open>type\<close> \<open>node_status = {ok: bool, total: int, unprocessed: int, running:
-  int, warned: int, failed: int, finished: int, initialized: bool,
-  consolidated: bool}\<close> represents a formal theory node status of the PIDE
-  document model. Fields \<open>total\<close>, \<open>unprocessed\<close>, \<open>running\<close>, \<open>warned\<close>,
-  \<open>failed\<close>, \<open>finished\<close> account for individual commands within a theory node;
-  \<open>ok\<close> is an abstraction for \<open>failed = 0\<close>. The \<open>initialized\<close> flag indicates
-  that the initial \<^theory_text>\<open>theory\<close> command has been processed. The \<open>consolidated\<close>
-  flag indicates whether the outermost theory command structure has finished
-  (or failed) and the final \<^theory_text>\<open>end\<close> command has been checked.
+  int, warned: int, failed: int, finished: int, canceled: bool, consolidated:
+  bool}\<close> represents a formal theory node status of the PIDE document model as
+  follows.
+
+    \<^item> Fields \<open>total\<close>, \<open>unprocessed\<close>, \<open>running\<close>, \<open>warned\<close>, \<open>failed\<close>, \<open>finished\<close>
+      account for individual commands within a theory node; \<open>ok\<close> is an
+      abstraction for \<open>failed = 0\<close>.
+
+    \<^item> The \<open>canceled\<close> flag tells if some command in the theory has been
+      spontaneously canceled (by an Interrupt exception that could also
+      indicate resource problems).
+
+    \<^item> The \<open>consolidated\<close> flag indicates whether the outermost theory command
+    structure has finished (or failed) and the final \<^theory_text>\<open>end\<close> command has been
+    checked.
 \<close>
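
For illustration, a node_status record of the shape documented above might be rendered in the server's JSON output roughly as follows; this is only a sketch with hypothetical counts and an informal JSON layout, while the field names are those of the type given above:

    {"ok": true, "total": 25, "unprocessed": 0, "running": 0, "warned": 2,
     "failed": 0, "finished": 23, "canceled": false, "consolidated": true}

Here "ok" is true precisely because "failed" is 0, no command was spontaneously canceled, and the theory node has been fully consolidated.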
 
 
@@ -916,10 +923,9 @@
   \<^medskip>
   The \<^verbatim>\<open>use_theories\<close> command updates the identified session by adding the
   current version of theory files to it, while dependencies are resolved
-  implicitly. The command succeeds eventually, when all theories have been
-  \<^emph>\<open>consolidated\<close> in the sense the formal \<open>node_status\<close>
-  (\secref{sec:json-types}): the outermost command structure has finished (or
-  failed) and the final \<^theory_text>\<open>end\<close> command of each theory has been checked.
+  implicitly. The command succeeds eventually, when all theories have status
+  \<^emph>\<open>terminated\<close> or \<^emph>\<open>consolidated\<close> in the sense of \<open>node_status\<close>
+  (\secref{sec:json-types}).
 
   Already used theories persist in the session until purged explicitly
   (\secref{sec:command-purge-theories}). This also means that repeated
@@ -935,14 +941,10 @@
   represented as plain text in UTF-8 encoding, which means the string needs to
   be decoded as in \<^verbatim>\<open>java.util.Base64.getDecoder.decode(String)\<close>.
 
-  \<^medskip> Due to asynchronous nature of PIDE document processing, structurally
-  broken theories never reach the \<open>consolidated\<close> state: consequently
-  \<^verbatim>\<open>use_theories\<close> will wait forever. The status is checked every \<open>check_delay\<close>
-  seconds, and bounded by \<open>check_limit\<close> attempts (default: 0, i.e.\
-  unbounded). A \<open>check_limit > 0\<close> effectively specifies a timeout of
-  \<open>check_delay \<times> check_limit\<close> seconds; it needs to be greater than the system
-  option @{system_option editor_consolidate_delay} to give PIDE processing a
-  chance to consolidate document nodes in the first place.
+  \<^medskip> The status of PIDE processing is checked every \<open>check_delay\<close> seconds, and
+  bounded by \<open>check_limit\<close> attempts (default: 0, i.e.\ unbounded). A
+  \<open>check_limit > 0\<close> effectively specifies a timeout of \<open>check_delay \<times>
+  check_limit\<close> seconds.
 
   \<^medskip> A non-negative \<open>nodes_status_delay\<close> enables continuous notifications of
   kind \<open>nodes_status\<close>, with a field of name and type \<open>nodes_status\<close>. The time
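
As a hypothetical sketch of the check_delay/check_limit behaviour described above: a client that wants "use_theories" to give up after roughly one minute could pass explicit values, e.g.

    use_theories {"session_id": "...", "theories": ["Draft.Test"],
      "check_delay": 0.5, "check_limit": 120}

Here the status is polled every 0.5 seconds for at most 120 attempts, i.e. an effective timeout of 0.5 * 120 = 60 seconds. The session id and the theory name "Draft.Test" are placeholders; only the parameter names check_delay and check_limit are taken from the description above.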
--- a/src/HOL/Complete_Lattices.thy	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/HOL/Complete_Lattices.thy	Mon Sep 03 13:32:29 2018 +0100
@@ -236,6 +236,9 @@
 lemma INF_mono: "(\<And>m. m \<in> B \<Longrightarrow> \<exists>n\<in>A. f n \<le> g m) \<Longrightarrow> (\<Sqinter>n\<in>A. f n) \<le> (\<Sqinter>n\<in>B. g n)"
   using Inf_mono [of "g ` B" "f ` A"] by auto
 
+lemma INF_mono': "(\<And>x. f x \<le> g x) \<Longrightarrow> (INF x:A. f x) \<le> (INF x:A. g x)"
+  by (rule INF_mono) auto
+
 lemma Sup_mono:
   assumes "\<And>a. a \<in> A \<Longrightarrow> \<exists>b\<in>B. a \<le> b"
   shows "\<Squnion>A \<le> \<Squnion>B"
@@ -249,6 +252,9 @@
 lemma SUP_mono: "(\<And>n. n \<in> A \<Longrightarrow> \<exists>m\<in>B. f n \<le> g m) \<Longrightarrow> (\<Squnion>n\<in>A. f n) \<le> (\<Squnion>n\<in>B. g n)"
   using Sup_mono [of "f ` A" "g ` B"] by auto
 
+lemma SUP_mono': "(\<And>x. f x \<le> g x) \<Longrightarrow> (SUP x:A. f x) \<le> (SUP x:A. g x)"
+  by (rule SUP_mono) auto
+
 lemma INF_superset_mono: "B \<subseteq> A \<Longrightarrow> (\<And>x. x \<in> B \<Longrightarrow> f x \<le> g x) \<Longrightarrow> (\<Sqinter>x\<in>A. f x) \<le> (\<Sqinter>x\<in>B. g x)"
   \<comment> \<open>The last inclusion is POSITIVE!\<close>
   by (blast intro: INF_mono dest: subsetD)
--- a/src/HOL/Filter.thy	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/HOL/Filter.thy	Mon Sep 03 13:32:29 2018 +0100
@@ -522,6 +522,33 @@
 lemma eventually_INF1: "i \<in> I \<Longrightarrow> eventually P (F i) \<Longrightarrow> eventually P (\<Sqinter>i\<in>I. F i)"
   using filter_leD[OF INF_lower] .
 
+lemma eventually_INF_finite:
+  assumes "finite A"
+  shows   "eventually P (INF x:A. F x) \<longleftrightarrow>
+             (\<exists>Q. (\<forall>x\<in>A. eventually (Q x) (F x)) \<and> (\<forall>y. (\<forall>x\<in>A. Q x y) \<longrightarrow> P y))" 
+  using assms
+proof (induction arbitrary: P rule: finite_induct)
+  case (insert a A P)
+  from insert.hyps have [simp]: "x \<noteq> a" if "x \<in> A" for x
+    using that by auto
+  have "eventually P (INF x:insert a A. F x) \<longleftrightarrow>
+          (\<exists>Q R S. eventually Q (F a) \<and> (( (\<forall>x\<in>A. eventually (S x) (F x)) \<and>
+            (\<forall>y. (\<forall>x\<in>A. S x y) \<longrightarrow> R y)) \<and> (\<forall>x. Q x \<and> R x \<longrightarrow> P x)))"
+    unfolding ex_simps by (simp add: eventually_inf insert.IH)
+  also have "\<dots> \<longleftrightarrow> (\<exists>Q. (\<forall>x\<in>insert a A. eventually (Q x) (F x)) \<and>
+                           (\<forall>y. (\<forall>x\<in>insert a A. Q x y) \<longrightarrow> P y))"
+  proof (safe, goal_cases)
+    case (1 Q R S)
+    thus ?case using 1 by (intro exI[of _ "S(a := Q)"]) auto
+  next
+    case (2 Q)
+    show ?case
+      by (rule exI[of _ "Q a"], rule exI[of _ "\<lambda>y. \<forall>x\<in>A. Q x y"],
+          rule exI[of _ "Q(a := (\<lambda>_. True))"]) (use 2 in auto)
+  qed
+  finally show ?case .
+qed auto
+
 subsubsection \<open>Map function for filters\<close>
 
 definition filtermap :: "('a \<Rightarrow> 'b) \<Rightarrow> 'a filter \<Rightarrow> 'b filter"
@@ -653,6 +680,33 @@
   "filtercomap f (\<Squnion>b\<in>B. F b) \<ge> (\<Squnion>b\<in>B. filtercomap f (F b))"
   by (intro SUP_least filtercomap_mono SUP_upper)
 
+lemma filtermap_le_iff_le_filtercomap: "filtermap f F \<le> G \<longleftrightarrow> F \<le> filtercomap f G"
+  unfolding le_filter_def eventually_filtermap eventually_filtercomap
+  using eventually_mono by auto
+
+lemma filtercomap_neq_bot:
+  assumes "\<And>P. eventually P F \<Longrightarrow> \<exists>x. P (f x)"
+  shows   "filtercomap f F \<noteq> bot"
+  using assms by (auto simp: trivial_limit_def eventually_filtercomap)
+
+lemma filtercomap_neq_bot_surj:
+  assumes "F \<noteq> bot" and "surj f"
+  shows   "filtercomap f F \<noteq> bot"
+proof (rule filtercomap_neq_bot)
+  fix P assume *: "eventually P F"
+  show "\<exists>x. P (f x)"
+  proof (rule ccontr)
+    assume **: "\<not>(\<exists>x. P (f x))"
+    from * have "eventually (\<lambda>_. False) F"
+    proof eventually_elim
+      case (elim x)
+      from \<open>surj f\<close> obtain y where "x = f y" by auto
+      with elim and ** show False by auto
+    qed
+    with assms show False by (simp add: trivial_limit_def)
+  qed
+qed
+
 lemma eventually_filtercomapI [intro]:
   assumes "eventually P F"
   shows   "eventually (\<lambda>x. P (f x)) (filtercomap f F)"
@@ -858,6 +912,74 @@
 lemma filtermap_sequentually_ne_bot: "filtermap f sequentially \<noteq> bot"
   by (simp add: filtermap_bot_iff)
 
+subsection \<open>Increasing finite subsets\<close>
+
+definition finite_subsets_at_top where
+  "finite_subsets_at_top A = (INF X:{X. finite X \<and> X \<subseteq> A}. principal {Y. finite Y \<and> X \<subseteq> Y \<and> Y \<subseteq> A})"
+
+lemma eventually_finite_subsets_at_top:
+  "eventually P (finite_subsets_at_top A) \<longleftrightarrow>
+     (\<exists>X. finite X \<and> X \<subseteq> A \<and> (\<forall>Y. finite Y \<and> X \<subseteq> Y \<and> Y \<subseteq> A \<longrightarrow> P Y))"
+  unfolding finite_subsets_at_top_def
+proof (subst eventually_INF_base, goal_cases)
+  show "{X. finite X \<and> X \<subseteq> A} \<noteq> {}" by auto
+next
+  case (2 B C)
+  thus ?case by (intro bexI[of _ "B \<union> C"]) auto
+qed (simp_all add: eventually_principal)
+
+lemma eventually_finite_subsets_at_top_weakI [intro]:
+  assumes "\<And>X. finite X \<Longrightarrow> X \<subseteq> A \<Longrightarrow> P X"
+  shows   "eventually P (finite_subsets_at_top A)"
+proof -
+  have "eventually (\<lambda>X. finite X \<and> X \<subseteq> A) (finite_subsets_at_top A)"
+    by (auto simp: eventually_finite_subsets_at_top)
+  thus ?thesis by eventually_elim (use assms in auto)
+qed
+
+lemma finite_subsets_at_top_neq_bot [simp]: "finite_subsets_at_top A \<noteq> bot"
+proof -
+  have "\<not>eventually (\<lambda>x. False) (finite_subsets_at_top A)"
+    by (auto simp: eventually_finite_subsets_at_top)
+  thus ?thesis by auto
+qed
+
+lemma filtermap_image_finite_subsets_at_top:
+  assumes "inj_on f A"
+  shows   "filtermap ((`) f) (finite_subsets_at_top A) = finite_subsets_at_top (f ` A)"
+  unfolding filter_eq_iff eventually_filtermap
+proof (safe, goal_cases)
+  case (1 P)
+  then obtain X where X: "finite X" "X \<subseteq> A" "\<And>Y. finite Y \<Longrightarrow> X \<subseteq> Y \<Longrightarrow> Y \<subseteq> A \<Longrightarrow> P (f ` Y)"
+    unfolding eventually_finite_subsets_at_top by force
+  show ?case unfolding eventually_finite_subsets_at_top eventually_filtermap
+  proof (rule exI[of _ "f ` X"], intro conjI allI impI, goal_cases)
+    case (3 Y)
+    with assms and X(1,2) have "P (f ` (f -` Y \<inter> A))" using X(1,2)
+      by (intro X(3) finite_vimage_IntI) auto
+    also have "f ` (f -` Y \<inter> A) = Y" using assms 3 by blast
+    finally show ?case .
+  qed (insert assms X(1,2), auto intro!: finite_vimage_IntI)
+next
+  case (2 P)
+  then obtain X where X: "finite X" "X \<subseteq> f ` A" "\<And>Y. finite Y \<Longrightarrow> X \<subseteq> Y \<Longrightarrow> Y \<subseteq> f ` A \<Longrightarrow> P Y"
+    unfolding eventually_finite_subsets_at_top by force
+  show ?case unfolding eventually_finite_subsets_at_top eventually_filtermap
+  proof (rule exI[of _ "f -` X \<inter> A"], intro conjI allI impI, goal_cases)
+    case (3 Y)
+    with X(1,2) and assms show ?case by (intro X(3)) force+
+  qed (insert assms X(1), auto intro!: finite_vimage_IntI)
+qed
+
+lemma eventually_finite_subsets_at_top_finite:
+  assumes "finite A"
+  shows   "eventually P (finite_subsets_at_top A) \<longleftrightarrow> P A"
+  unfolding eventually_finite_subsets_at_top using assms by force
+
+lemma finite_subsets_at_top_finite: "finite A \<Longrightarrow> finite_subsets_at_top A = principal {A}"
+  by (auto simp: filter_eq_iff eventually_finite_subsets_at_top_finite eventually_principal)
+
+
 subsection \<open>The cofinite filter\<close>
 
 definition "cofinite = Abs_filter (\<lambda>P. finite {x. \<not> P x})"
@@ -1208,6 +1330,9 @@
   "(LIM x F. f x :> principal S) \<longleftrightarrow> (eventually (\<lambda>x. f x \<in> S) F)"
   unfolding filterlim_def eventually_filtermap le_principal ..
 
+lemma filterlim_filtercomap [intro]: "filterlim f F (filtercomap f F)"
+  unfolding filterlim_def by (rule filtermap_filtercomap)
+
 lemma filterlim_inf:
   "(LIM x F1. f x :> inf F2 F3) \<longleftrightarrow> ((LIM x F1. f x :> F2) \<and> (LIM x F1. f x :> F3))"
   unfolding filterlim_def by simp
@@ -1220,6 +1345,15 @@
   "(\<And>m. m \<in> J \<Longrightarrow> \<exists>i\<in>I. filtermap f (F i) \<le> G m) \<Longrightarrow> LIM x (\<Sqinter>i\<in>I. F i). f x :> (\<Sqinter>j\<in>J. G j)"
   unfolding filterlim_def by (rule order_trans[OF filtermap_INF INF_mono])
 
+lemma filterlim_INF': "x \<in> A \<Longrightarrow> filterlim f F (G x) \<Longrightarrow> filterlim f F (INF x:A. G x)"
+  unfolding filterlim_def by (rule order.trans[OF filtermap_mono[OF INF_lower]])
+
+lemma filterlim_filtercomap_iff: "filterlim f (filtercomap g G) F \<longleftrightarrow> filterlim (g \<circ> f) G F"
+  by (simp add: filterlim_def filtermap_le_iff_le_filtercomap filtercomap_filtercomap o_def)
+
+lemma filterlim_iff_le_filtercomap: "filterlim f F G \<longleftrightarrow> G \<le> filtercomap f F"
+  by (simp add: filterlim_def filtermap_le_iff_le_filtercomap)
+
 lemma filterlim_base:
   "(\<And>m x. m \<in> J \<Longrightarrow> i m \<in> I) \<Longrightarrow> (\<And>m x. m \<in> J \<Longrightarrow> x \<in> F (i m) \<Longrightarrow> f x \<in> G m) \<Longrightarrow>
     LIM x (\<Sqinter>i\<in>I. principal (F i)). f x :> (\<Sqinter>j\<in>J. principal (G j))"
@@ -1347,9 +1481,51 @@
   shows "(LIM x F. f x :> at_bot) \<longleftrightarrow> (\<forall>Z<c. eventually (\<lambda>x. Z \<ge> f x) F)"
   by (metis filterlim_at_bot filterlim_at_bot_le lt_ex order_le_less_trans)
     
-lemma filterlim_filtercomap [intro]: "filterlim f F (filtercomap f F)"
-  unfolding filterlim_def by (rule filtermap_filtercomap)
+lemma filterlim_finite_subsets_at_top:
+  "filterlim f (finite_subsets_at_top A) F \<longleftrightarrow>
+     (\<forall>X. finite X \<and> X \<subseteq> A \<longrightarrow> eventually (\<lambda>y. finite (f y) \<and> X \<subseteq> f y \<and> f y \<subseteq> A) F)"
+  (is "?lhs = ?rhs")
+proof 
+  assume ?lhs
+  thus ?rhs
+  proof (safe, goal_cases)
+    case (1 X)
+    hence *: "(\<forall>\<^sub>F x in F. P (f x))" if "eventually P (finite_subsets_at_top A)" for P
+      using that by (auto simp: filterlim_def le_filter_def eventually_filtermap)
+    have "\<forall>\<^sub>F Y in finite_subsets_at_top A. finite Y \<and> X \<subseteq> Y \<and> Y \<subseteq> A"
+      using 1 unfolding eventually_finite_subsets_at_top by force
+    thus ?case by (intro *) auto
+  qed
+next
+  assume rhs: ?rhs
+  show ?lhs unfolding filterlim_def le_filter_def eventually_finite_subsets_at_top
+  proof (safe, goal_cases)
+    case (1 P X)
+    with rhs have "\<forall>\<^sub>F y in F. finite (f y) \<and> X \<subseteq> f y \<and> f y \<subseteq> A" by auto
+    thus "eventually P (filtermap f F)" unfolding eventually_filtermap
+      by eventually_elim (insert 1, auto)
+  qed
+qed
 
+lemma filterlim_atMost_at_top:
+  "filterlim (\<lambda>n. {..n}) (finite_subsets_at_top (UNIV :: nat set)) at_top"
+  unfolding filterlim_finite_subsets_at_top
+proof (safe, goal_cases)
+  case (1 X)
+  then obtain n where n: "X \<subseteq> {..n}" by (auto simp: finite_nat_set_iff_bounded_le)
+  show ?case using eventually_ge_at_top[of n]
+    by eventually_elim (insert n, auto)
+qed
+
+lemma filterlim_lessThan_at_top:
+  "filterlim (\<lambda>n. {..<n}) (finite_subsets_at_top (UNIV :: nat set)) at_top"
+  unfolding filterlim_finite_subsets_at_top
+proof (safe, goal_cases)
+  case (1 X)
+  then obtain n where n: "X \<subseteq> {..<n}" by (auto simp: finite_nat_set_iff_bounded)
+  show ?case using eventually_ge_at_top[of n]
+    by eventually_elim (insert n, auto)
+qed
 
 subsection \<open>Setup @{typ "'a filter"} for lifting and transfer\<close>
 
--- a/src/HOL/Library/Countable_Set.thy	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/HOL/Library/Countable_Set.thy	Mon Sep 03 13:32:29 2018 +0100
@@ -69,6 +69,11 @@
   then show thesis ..
 qed
 
+lemma countable_infiniteE':
+  assumes "countable A" "infinite A"
+  obtains g where "bij_betw g (UNIV :: nat set) A"
+  using bij_betw_inv[OF countableE_infinite[OF assms]] that by blast
+
 lemma countable_enum_cases:
   assumes "countable S"
   obtains (finite) f :: "'a \<Rightarrow> nat" where "finite S" "bij_betw f S {..<card S}"
--- a/src/HOL/Library/Liminf_Limsup.thy	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/HOL/Library/Liminf_Limsup.thy	Mon Sep 03 13:32:29 2018 +0100
@@ -71,6 +71,11 @@
   shows "(INF i : A. INF j : B. f i j) = (INF p : A \<times> B. f (fst p) (snd p))"
   by (rule antisym) (auto intro!: INF_greatest INF_lower2)
 
+lemma INF_Sigma:
+  fixes f :: "_ \<Rightarrow> _ \<Rightarrow> _ :: complete_lattice"
+  shows "(INF i : A. INF j : B i. f i j) = (INF p : Sigma A B. f (fst p) (snd p))"
+  by (rule antisym) (auto intro!: INF_greatest INF_lower2)
+
 subsubsection \<open>\<open>Liminf\<close> and \<open>Limsup\<close>\<close>
 
 definition Liminf :: "'a filter \<Rightarrow> ('a \<Rightarrow> 'b) \<Rightarrow> 'b :: complete_lattice" where
@@ -202,7 +207,7 @@
 lemma le_Limsup:
   assumes F: "F \<noteq> bot" and x: "\<forall>\<^sub>F x in F. l \<le> f x"
   shows "l \<le> Limsup F f"
-  using F Liminf_bounded Liminf_le_Limsup order.trans x by blast
+  using F Liminf_bounded[of l f F] Liminf_le_Limsup[of F f] order.trans x by blast
 
 lemma Liminf_le:
   assumes F: "F \<noteq> bot" and x: "\<forall>\<^sub>F x in F. f x \<le> l"
--- a/src/HOL/Limits.thy	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/HOL/Limits.thy	Mon Sep 03 13:32:29 2018 +0100
@@ -876,6 +876,91 @@
     and tendsto_add[OF tendsto_const[of "-c"], of "\<lambda>x. c + f x" "c + d"] by auto
 
 
+class topological_monoid_mult = topological_semigroup_mult + monoid_mult
+class topological_comm_monoid_mult = topological_monoid_mult + comm_monoid_mult
+
+lemma tendsto_power_strong [tendsto_intros]:
+  fixes f :: "_ \<Rightarrow> 'b :: topological_monoid_mult"
+  assumes "(f \<longlongrightarrow> a) F" "(g \<longlongrightarrow> b) F"
+  shows   "((\<lambda>x. f x ^ g x) \<longlongrightarrow> a ^ b) F"
+proof -
+  have "((\<lambda>x. f x ^ b) \<longlongrightarrow> a ^ b) F"
+    by (induction b) (auto intro: tendsto_intros assms)
+  also from assms(2) have "eventually (\<lambda>x. g x = b) F"
+    by (simp add: nhds_discrete filterlim_principal)
+  hence "eventually (\<lambda>x. f x ^ b = f x ^ g x) F"
+    by eventually_elim simp
+  hence "((\<lambda>x. f x ^ b) \<longlongrightarrow> a ^ b) F \<longleftrightarrow> ((\<lambda>x. f x ^ g x) \<longlongrightarrow> a ^ b) F"
+    by (intro filterlim_cong refl)
+  finally show ?thesis .
+qed
+
+lemma continuous_mult' [continuous_intros]:
+  fixes f g :: "_ \<Rightarrow> 'b::topological_semigroup_mult"
+  shows "continuous F f \<Longrightarrow> continuous F g \<Longrightarrow> continuous F (\<lambda>x. f x * g x)"
+  unfolding continuous_def by (rule tendsto_mult)
+
+lemma continuous_power' [continuous_intros]:
+  fixes f :: "_ \<Rightarrow> 'b::topological_monoid_mult"
+  shows "continuous F f \<Longrightarrow> continuous F g \<Longrightarrow> continuous F (\<lambda>x. f x ^ g x)"
+  unfolding continuous_def by (rule tendsto_power_strong) auto
+
+lemma continuous_on_mult' [continuous_intros]:
+  fixes f g :: "_ \<Rightarrow> 'b::topological_semigroup_mult"
+  shows "continuous_on A f \<Longrightarrow> continuous_on A g \<Longrightarrow> continuous_on A (\<lambda>x. f x * g x)"
+  unfolding continuous_on_def by (auto intro: tendsto_mult)
+
+lemma continuous_on_power' [continuous_intros]:
+  fixes f :: "_ \<Rightarrow> 'b::topological_monoid_mult"
+  shows "continuous_on A f \<Longrightarrow> continuous_on A g \<Longrightarrow> continuous_on A (\<lambda>x. f x ^ g x)"
+  unfolding continuous_on_def by (auto intro: tendsto_power_strong)
+
+lemma tendsto_mult_one:
+  fixes f g :: "_ \<Rightarrow> 'b::topological_monoid_mult"
+  shows "(f \<longlongrightarrow> 1) F \<Longrightarrow> (g \<longlongrightarrow> 1) F \<Longrightarrow> ((\<lambda>x. f x * g x) \<longlongrightarrow> 1) F"
+  by (drule (1) tendsto_mult) simp
+
+lemma tendsto_prod' [tendsto_intros]:
+  fixes f :: "'a \<Rightarrow> 'b \<Rightarrow> 'c::topological_comm_monoid_mult"
+  shows "(\<And>i. i \<in> I \<Longrightarrow> (f i \<longlongrightarrow> a i) F) \<Longrightarrow> ((\<lambda>x. \<Prod>i\<in>I. f i x) \<longlongrightarrow> (\<Prod>i\<in>I. a i)) F"
+  by (induct I rule: infinite_finite_induct) (simp_all add: tendsto_mult)
+
+lemma tendsto_one_prod':
+  fixes f :: "'a \<Rightarrow> 'b \<Rightarrow> 'c::topological_comm_monoid_mult"
+  assumes "\<And>i. i \<in> I \<Longrightarrow> ((\<lambda>x. f x i) \<longlongrightarrow> 1) F"
+  shows "((\<lambda>i. prod (f i) I) \<longlongrightarrow> 1) F"
+  using tendsto_prod' [of I "\<lambda>x y. f y x" "\<lambda>x. 1"] assms by simp
+
+lemma continuous_prod' [continuous_intros]:
+  fixes f :: "'a \<Rightarrow> 'b::t2_space \<Rightarrow> 'c::topological_comm_monoid_mult"
+  shows "(\<And>i. i \<in> I \<Longrightarrow> continuous F (f i)) \<Longrightarrow> continuous F (\<lambda>x. \<Prod>i\<in>I. f i x)"
+  unfolding continuous_def by (rule tendsto_prod')
+
+lemma continuous_on_prod' [continuous_intros]:
+  fixes f :: "'a \<Rightarrow> 'b::topological_space \<Rightarrow> 'c::topological_comm_monoid_mult"
+  shows "(\<And>i. i \<in> I \<Longrightarrow> continuous_on S (f i)) \<Longrightarrow> continuous_on S (\<lambda>x. \<Prod>i\<in>I. f i x)"
+  unfolding continuous_on_def by (auto intro: tendsto_prod')
+
+instance nat :: topological_comm_monoid_mult
+  by standard
+    (simp add: nhds_discrete principal_prod_principal filterlim_principal eventually_principal)
+
+instance int :: topological_comm_monoid_mult
+  by standard
+    (simp add: nhds_discrete principal_prod_principal filterlim_principal eventually_principal)
+
+class comm_real_normed_algebra_1 = real_normed_algebra_1 + comm_monoid_mult
+
+context real_normed_field
+begin
+
+subclass comm_real_normed_algebra_1
+proof
+  from norm_mult[of "1 :: 'a" 1] show "norm 1 = 1" by simp 
+qed (simp_all add: norm_mult)
+
+end
+
 subsubsection \<open>Inverse and division\<close>
 
 lemma (in bounded_bilinear) Zfun_prod_Bfun:
--- a/src/HOL/Series.thy	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/HOL/Series.thy	Mon Sep 03 13:32:29 2018 +0100
@@ -335,6 +335,18 @@
 
 end
 
+lemma sums_If_finite_set':
+  fixes f g :: "nat \<Rightarrow> 'a::{t2_space,topological_ab_group_add}"
+  assumes "g sums S" and "finite A" and "S' = S + (\<Sum>n\<in>A. f n - g n)"
+  shows   "(\<lambda>n. if n \<in> A then f n else g n) sums S'"
+proof -
+  have "(\<lambda>n. g n + (if n \<in> A then f n - g n else 0)) sums (S + (\<Sum>n\<in>A. f n - g n))"
+    by (intro sums_add assms sums_If_finite_set)
+  also have "(\<lambda>n. g n + (if n \<in> A then f n - g n else 0)) = (\<lambda>n. if n \<in> A then f n else g n)"
+    by (simp add: fun_eq_iff)
+  finally show ?thesis using assms by simp
+qed
+
 subsection \<open>Infinite summability on real normed vector spaces\<close>
 
 context
--- a/src/HOL/Tools/ATP/scripts/remote_atp	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/HOL/Tools/ATP/scripts/remote_atp	Mon Sep 03 13:32:29 2018 +0100
@@ -12,7 +12,7 @@
 use LWP;
 
 my $SystemOnTPTPFormReplyURL =
-  "http://www.cs.miami.edu/~tptp/cgi-bin/SystemOnTPTPFormReply";
+  "http://www.tptp.org/cgi-bin/SystemOnTPTPFormReply";
 
 # default parameters
 my %URLParameters = (
--- a/src/HOL/Topological_Spaces.thy	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/HOL/Topological_Spaces.thy	Mon Sep 03 13:32:29 2018 +0100
@@ -512,6 +512,10 @@
 lemma (in discrete_topology) at_discrete: "at x within S = bot"
   unfolding at_within_def nhds_discrete by simp
 
+lemma (in discrete_topology) tendsto_discrete:
+  "filterlim (f :: 'b \<Rightarrow> 'a) (nhds y) F \<longleftrightarrow> eventually (\<lambda>x. f x = y) F"
+  by (auto simp: nhds_discrete filterlim_principal)
+
 lemma (in topological_space) at_within_eq:
   "at x within s = (INF S:{S. open S \<and> x \<in> S}. principal (S \<inter> s - {x}))"
   unfolding nhds_def at_within_def
@@ -881,6 +885,36 @@
   shows "((\<lambda>x. a) \<longlongrightarrow> b) F \<longleftrightarrow> a = b"
   by (auto intro!: tendsto_unique [OF assms tendsto_const])
 
+lemma Lim_in_closed_set:
+  assumes "closed S" "eventually (\<lambda>x. f(x) \<in> S) F" "F \<noteq> bot" "(f \<longlongrightarrow> l) F"
+  shows "l \<in> S"
+proof (rule ccontr)
+  assume "l \<notin> S"
+  with \<open>closed S\<close> have "open (- S)" "l \<in> - S"
+    by (simp_all add: open_Compl)
+  with assms(4) have "eventually (\<lambda>x. f x \<in> - S) F"
+    by (rule topological_tendstoD)
+  with assms(2) have "eventually (\<lambda>x. False) F"
+    by (rule eventually_elim2) simp
+  with assms(3) show "False"
+    by (simp add: eventually_False)
+qed
+
+lemma (in t3_space) nhds_closed:
+  assumes "x \<in> A" and "open A"
+  shows   "\<exists>A'. x \<in> A' \<and> closed A' \<and> A' \<subseteq> A \<and> eventually (\<lambda>y. y \<in> A') (nhds x)"
+proof -
+  from assms have "\<exists>U V. open U \<and> open V \<and> x \<in> U \<and> - A \<subseteq> V \<and> U \<inter> V = {}"
+    by (intro t3_space) auto
+  then obtain U V where UV: "open U" "open V" "x \<in> U" "-A \<subseteq> V" "U \<inter> V = {}"
+    by auto
+  have "eventually (\<lambda>y. y \<in> U) (nhds x)"
+    using \<open>open U\<close> and \<open>x \<in> U\<close> by (intro eventually_nhds_in_open)
+  hence "eventually (\<lambda>y. y \<in> -V) (nhds x)"
+    by eventually_elim (use UV in auto)
+  with UV show ?thesis by (intro exI[of _ "-V"]) auto
+qed
+
 lemma (in order_topology) increasing_tendsto:
   assumes bdd: "eventually (\<lambda>n. f n \<le> l) F"
     and en: "\<And>x. x < l \<Longrightarrow> eventually (\<lambda>n. x < f n) F"
@@ -2082,6 +2116,9 @@
   by (auto simp: at_within_Icc_at_right at_within_Icc_at_left continuous_on_def
       dest: bspec[where x=a] bspec[where x=b])
 
+lemma continuous_on_discrete [simp, continuous_intros]:
+  "continuous_on A (f :: 'a :: discrete_topology \<Rightarrow> _)"
+  by (auto simp: continuous_on_def at_discrete)
 
 subsubsection \<open>Continuity at a point\<close>
 
@@ -2125,6 +2162,10 @@
   "continuous_on s f \<longleftrightarrow> (\<forall>x\<in>s. continuous (at x within s) f)"
   unfolding continuous_on_def continuous_within ..
 
+lemma continuous_discrete [simp, continuous_intros]:
+  "continuous (at x within A) (f :: 'a :: discrete_topology \<Rightarrow> _)"
+  by (auto simp: continuous_def at_discrete)
+
 abbreviation isCont :: "('a::t2_space \<Rightarrow> 'b::topological_space) \<Rightarrow> 'a \<Rightarrow> bool"
   where "isCont f a \<equiv> continuous (at a) f"
 
--- a/src/Pure/Concurrent/lazy.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Concurrent/lazy.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -15,7 +15,8 @@
   val peek: 'a lazy -> 'a Exn.result option
   val is_running: 'a lazy -> bool
   val is_finished: 'a lazy -> bool
-  val force_result: 'a lazy -> 'a Exn.result
+  val finished_result: 'a lazy -> 'a Exn.result
+  val force_result: {strict: bool} -> 'a lazy -> 'a Exn.result
   val force: 'a lazy -> 'a
   val force_value: 'a lazy -> 'a lazy
   val map: ('a -> 'b) -> 'a lazy -> 'b lazy
@@ -59,18 +60,28 @@
         Expr _ => false
       | Result res => not (Future.is_finished res));
 
+fun is_finished_future res =
+  Future.is_finished res andalso not (Exn.is_interrupt_exn (Future.join_result res));
+
 fun is_finished (Value _) = true
   | is_finished (Lazy var) =
       (case Synchronized.value var of
         Expr _ => false
-      | Result res =>
-          Future.is_finished res andalso not (Exn.is_interrupt_exn (Future.join_result res)));
+      | Result res => is_finished_future res);
+
+fun finished_result (Value a) = Exn.Res a
+  | finished_result (Lazy var) =
+      let fun fail () = Exn.Exn (Fail "Unfinished lazy") in
+        (case Synchronized.value var of
+          Expr _ => fail ()
+        | Result res => if is_finished_future res then Future.join_result res else fail ())
+      end;
 
 
 (* force result *)
 
-fun force_result (Value a) = Exn.Res a
-  | force_result (Lazy var) =
+fun force_result _ (Value a) = Exn.Res a
+  | force_result {strict} (Lazy var) =
       (case peek (Lazy var) of
         SOME res => res
       | NONE =>
@@ -93,19 +104,22 @@
                     val res0 = Exn.capture (restore_attributes e) ();
                     val _ = Exn.capture (fn () => Future.fulfill_result x res0) ();
                     val res = Future.join_result x;
-                    (*semantic race: some other threads might see the same
-                      interrupt, until there is a fresh start*)
                     val _ =
-                      if Exn.is_interrupt_exn res
-                      then Synchronized.change var (fn _ => Expr (name, e))
-                      else Synchronized.assign var (Result (Future.value_result res));
+                      if not (Exn.is_interrupt_exn res) then
+                        Synchronized.assign var (Result (Future.value_result res))
+                      else if strict then
+                        Synchronized.assign var
+                          (Result (Future.value_result (Exn.Exn (Fail "Interrupt"))))
+                      (*semantic race: some other threads might see the same
+                        interrupt, until there is a fresh start*)
+                      else Synchronized.change var (fn _ => Expr (name, e));
                   in res end
               | NONE => Exn.capture (restore_attributes (fn () => Future.join x)) ())
             end) ());
 
 end;
 
-fun force x = Exn.release (force_result x);
+fun force x = Exn.release (force_result {strict = false} x);
 
 fun force_value x = if is_value x then x else value (force x);
 
@@ -122,7 +136,8 @@
 fun consolidate xs =
   let
     val unfinished =
-      xs |> map_filter (fn x => if is_finished x then NONE else SOME (fn () => force_result x));
+      xs |> map_filter (fn x =>
+        if is_finished x then NONE else SOME (fn () => force_result {strict = false} x));
     val _ =
       if Future.relevant unfinished then
         ignore (Future.forked_results {name = "Lazy.consolidate", deps = []} unfinished)
@@ -133,7 +148,7 @@
 (* future evaluation *)
 
 fun future params x =
-  if is_finished x then Future.value_result (force_result x)
+  if is_finished x then Future.value_result (force_result {strict = false} x)
   else (singleton o Future.forks) params (fn () => force x);
 
 
--- a/src/Pure/General/position.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/General/position.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -29,6 +29,7 @@
   val id_only: string -> T
   val get_id: T -> string option
   val put_id: string -> T -> T
+  val copy_id: T -> T -> T
   val id_properties_of: T -> Properties.T
   val parse_id: T -> int option
   val adjust_offsets: (int -> int option) -> T -> T
@@ -142,6 +143,7 @@
 
 fun get_id (Pos (_, props)) = Properties.get props Markup.idN;
 fun put_id id (Pos (count, props)) = Pos (count, norm_props (Properties.put (Markup.idN, id) props));
+fun copy_id pos = (case get_id pos of NONE => I | SOME id => put_id id);
 
 fun parse_id pos = Option.map Value.parse_int (get_id pos);
 
@@ -151,14 +153,16 @@
   | NONE => []);
 
 fun adjust_offsets adjust (pos as Pos (_, props)) =
-  (case parse_id pos of
-    SOME id =>
-      (case adjust id of
-        SOME offset =>
-          let val Pos (count, _) = advance_offsets offset pos
-          in Pos (count, Properties.remove Markup.idN props) end
-      | NONE => pos)
-  | NONE => pos);
+  if is_none (file_of pos) then
+    (case parse_id pos of
+      SOME id =>
+        (case adjust id of
+          SOME offset =>
+            let val Pos (count, _) = advance_offsets offset pos
+            in Pos (count, Properties.remove Markup.idN props) end
+        | NONE => pos)
+    | NONE => pos)
+  else pos;
 
 
 (* markup properties *)
--- a/src/Pure/Isar/class.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Isar/class.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -229,9 +229,13 @@
 
 fun activate_defs class thms thy =
   (case Element.eq_morphism thy thms of
-    SOME eq_morph => fold (fn cls => fn thy =>
-      Context.theory_map (Locale.amend_registration (cls, base_morphism thy cls)
-        (eq_morph, true) (export_morphism thy cls)) thy) (heritage thy [class]) thy
+    SOME eq_morph =>
+      fold (fn cls => fn thy =>
+        Context.theory_map
+          (Locale.amend_registration
+            {dep = (cls, base_morphism thy cls),
+              mixin = SOME (eq_morph, true),
+              export = export_morphism thy cls}) thy) (heritage thy [class]) thy
   | NONE => thy);
 
 fun register_operation class (c, t) input_only thy =
@@ -484,10 +488,13 @@
     val diff_sort = Sign.complete_sort thy [sup]
       |> subtract (op =) (Sign.complete_sort thy [sub])
       |> filter (is_class thy);
-    fun add_dependency some_wit = case some_dep_morph of
-        SOME dep_morph => Generic_Target.locale_dependency sub
-          (sup, dep_morph $> Element.satisfy_morphism (the_list some_witn)) NONE export
-      | NONE => I;
+    fun add_dependency some_wit (* FIXME unused!? *) =
+      (case some_dep_morph of
+        SOME dep_morph =>
+          Generic_Target.locale_dependency sub
+            {dep = (sup, dep_morph $> Element.satisfy_morphism (the_list some_witn)),
+              mixin = NONE, export = export}
+      | NONE => I);
   in
     lthy
     |> Local_Theory.raw_theory
--- a/src/Pure/Isar/class_declaration.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Isar/class_declaration.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -328,8 +328,10 @@
     |-> (fn (param_map, params, assm_axiom) =>
        `(fn thy => calculate thy class sups base_sort param_map assm_axiom)
     #-> (fn (base_morph, eq_morph, export_morph, some_axiom, some_assm_intro, of_class) =>
-       Context.theory_map (Locale.add_registration (class, base_morph)
-         (Option.map (rpair true) eq_morph) export_morph)
+       Locale.add_registration_theory
+         {dep = (class, base_morph),
+           mixin = Option.map (rpair true) eq_morph,
+           export = export_morph}
     #> Class.register class sups params base_sort base_morph export_morph some_axiom some_assm_intro of_class
     #> Global_Theory.store_thm (prefix (Binding.qualified_name (class ^ ".of_class.intro")), of_class)))
     |> snd
--- a/src/Pure/Isar/expression.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Isar/expression.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -531,12 +531,8 @@
 local
 
 fun props_of thy (name, morph) =
-  let
-    val (asm, defs) = Locale.specification_of thy name;
-  in
-    (case asm of NONE => defs | SOME asm => asm :: defs)
-    |> map (Morphism.term morph)
-  end;
+  let val (asm, defs) = Locale.specification_of thy name
+  in map (Morphism.term morph) (the_list asm @ defs) end;
 
 fun prep_goal_expression prep expression ctxt =
   let
--- a/src/Pure/Isar/generic_target.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Isar/generic_target.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -52,8 +52,7 @@
   val theory_abbrev: Syntax.mode -> (binding * mixfix) * term ->
     local_theory -> (term * term) * local_theory
   val theory_declaration: declaration -> local_theory -> local_theory
-  val theory_registration: string * morphism -> (morphism * bool) option -> morphism ->
-    local_theory -> local_theory
+  val theory_registration: Locale.registration -> local_theory -> local_theory
 
   (*locale target primitives*)
   val locale_target_notes: string -> string -> Attrib.fact list -> Attrib.fact list ->
@@ -71,8 +70,7 @@
     local_theory -> local_theory
   val locale_const: string -> Syntax.mode -> (binding * mixfix) * term ->
     local_theory -> local_theory
-  val locale_dependency: string -> string * morphism -> (morphism * bool) option -> morphism ->
-    local_theory -> local_theory
+  val locale_dependency: string -> Locale.registration -> local_theory -> local_theory
 
   (*initialisation*)
   val init: {background_naming: Name_Space.naming, setup: theory -> Proof.context,
@@ -372,7 +370,7 @@
   background_declaration decl #> standard_declaration (K true) decl;
 
 val theory_registration =
-  Local_Theory.raw_theory o Context.theory_map ooo Locale.add_registration;
+  Local_Theory.raw_theory o Locale.add_registration_theory;
 
 
 
@@ -406,9 +404,9 @@
   locale_target_const locale (K true) prmode ((b, mx), rhs)
   #> standard_const (fn (this, other) => other <> 0 andalso this <> other) prmode ((b, mx), rhs);
 
-fun locale_dependency locale dep_morph mixin export =
-  (Local_Theory.raw_theory ooo Locale.add_dependency locale) dep_morph mixin export
-  #> Locale.activate_fragment_nonbrittle dep_morph mixin export;
+fun locale_dependency locale registration =
+  Local_Theory.raw_theory (Locale.add_dependency locale registration)
+  #> Locale.add_registration_local_theory registration;
 
 
 (** locale abbreviations **)
--- a/src/Pure/Isar/interpretation.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Isar/interpretation.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -97,30 +97,30 @@
 
 local
 
-fun meta_rewrite eqns ctxt =
+fun abs_def_rule eqns ctxt =
   (map (Local_Defs.abs_def_rule ctxt) (maps snd eqns), ctxt);
 
-fun note_eqns_register pos note activate deps eqnss witss def_eqns thms export export' ctxt =
+fun note_eqns_register note add_registration
+    deps eqnss witss def_eqns thms export export' ctxt =
   let
-    val thmss = unflat ((map o map) fst eqnss) thms;
-    val factss =
-      map2 (map2 (fn b => fn eq => (b, [([Morphism.thm export eq], [])]))) ((map o map) fst eqnss) thmss;
-    val (eqnss', ctxt') = fold_map (fn facts => note Thm.theoremK facts #-> meta_rewrite) factss ctxt;
+    val factss = thms
+      |> unflat ((map o map) #1 eqnss)
+      |> map2 (map2 (fn b => fn eq => (b, [([Morphism.thm export eq], [])]))) ((map o map) #1 eqnss);
+    val (eqnss', ctxt') =
+      fold_map (fn facts => note Thm.theoremK facts #-> abs_def_rule) factss ctxt;
     val defs = (Binding.empty_atts, [(map (Morphism.thm (export' $> export)) def_eqns, [])]);
-    val (eqns', ctxt'') = ctxt'
-      |> note Thm.theoremK [defs]
-      |-> meta_rewrite;
-    val dep_morphs =
-      map2 (fn (dep, morph) => fn wits =>
-        let val morph' = morph
-          $> Element.satisfy_morphism (map (Element.transform_witness export') wits)
-          $> Morphism.binding_morphism "position" (Binding.set_pos pos)
-        in (dep, morph') end) deps witss;
-    fun activate' (dep_morph, eqns) ctxt =
-      activate dep_morph
-        (Option.map (rpair true) (Element.eq_morphism (Proof_Context.theory_of ctxt) (eqns @ eqns')))
-        export ctxt;
-  in ctxt'' |> fold activate' (dep_morphs ~~ eqnss') end;
+    val (eqns', ctxt'') = ctxt' |> note Thm.theoremK [defs] |-> abs_def_rule;
+    val deps' =
+      (deps ~~ witss) |> map (fn ((dep, morph), wits) =>
+        (dep, morph $> Element.satisfy_morphism (map (Element.transform_witness export') wits)));
+    fun register (dep, eqns) ctxt =
+      ctxt |> add_registration
+        {dep = dep,
+          mixin =
+            Option.map (rpair true)
+              (Element.eq_morphism (Proof_Context.theory_of ctxt) (eqns @ eqns')),
+          export = export};
+  in ctxt'' |> fold register (deps' ~~ eqnss') end;
 
 in
 
@@ -129,9 +129,8 @@
   let
     val (((propss, eq_propss, deps, eqnss, export, export'), def_eqns), goal_ctxt) =
       prep_interpretation expression raw_defs initial_ctxt;
-    val pos = Position.thread_data ();
     fun after_qed witss eqns =
-      note_eqns_register pos note add_registration deps eqnss witss def_eqns eqns export export';
+      note_eqns_register note add_registration deps eqnss witss def_eqns eqns export export';
   in setup_proof after_qed propss (flat eq_propss) goal_ctxt end;
 
 end;
@@ -143,23 +142,16 @@
 
 local
 
+fun setup_proof state after_qed propss eqns goal_ctxt =
+  Element.witness_local_proof_eqs
+    (fn witss => fn eqns => Proof.map_context (after_qed witss eqns) #> Proof.reset_facts)
+    "interpret" propss eqns goal_ctxt state;
+
 fun gen_interpret prep_interpretation expression state =
-  let
-    val _ = Proof.assert_forward_or_chain state;
-    fun lift_after_qed after_qed witss eqns =
-      Proof.map_context (after_qed witss eqns) #> Proof.reset_facts;
-    fun setup_proof after_qed propss eqns goal_ctxt =
-      Element.witness_local_proof_eqs (lift_after_qed after_qed) "interpret"
-        propss eqns goal_ctxt state;
-    fun add_registration reg mixin export ctxt = ctxt
-      |> Proof_Context.set_stmt false
-      |> Context.proof_map (Locale.add_registration reg mixin export)
-      |> Proof_Context.restore_stmt ctxt;
-  in
-    Proof.context_of state
-    |> generic_interpretation prep_interpretation setup_proof
-      Attrib.local_notes add_registration expression []
-  end;
+  Proof.assert_forward_or_chain state
+  |> Proof.context_of
+  |> generic_interpretation prep_interpretation (setup_proof state)
+    Attrib.local_notes Locale.add_registration_proof expression [];
 
 in
 
--- a/src/Pure/Isar/local_theory.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Isar/local_theory.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -14,6 +14,11 @@
   type fact = binding * thms;
 end;
 
+structure Locale =
+struct
+  type registration = {dep: string * morphism, mixin: (morphism * bool) option, export: morphism};
+end;
+
 signature LOCAL_THEORY =
 sig
   type operations
@@ -54,10 +59,8 @@
   val abbrev: Syntax.mode -> (binding * mixfix) * term -> local_theory ->
     (term * term) * local_theory
   val declaration: {syntax: bool, pervasive: bool} -> declaration -> local_theory -> local_theory
-  val theory_registration: string * morphism -> (morphism * bool) option -> morphism ->
-    local_theory -> local_theory
-  val locale_dependency: string * morphism -> (morphism * bool) option -> morphism ->
-    local_theory -> local_theory
+  val theory_registration: Locale.registration -> local_theory -> local_theory
+  val locale_dependency: Locale.registration -> local_theory -> local_theory
   val pretty: local_theory -> Pretty.T list
   val add_thms_dynamic: binding * (Context.generic -> thm list) -> local_theory -> local_theory
   val set_defsort: sort -> local_theory -> local_theory
@@ -91,10 +94,8 @@
   abbrev: Syntax.mode -> (binding * mixfix) * term -> local_theory ->
     (term * term) * local_theory,
   declaration: {syntax: bool, pervasive: bool} -> declaration -> local_theory -> local_theory,
-  theory_registration: string * morphism -> (morphism * bool) option -> morphism ->
-     local_theory -> local_theory,
-  locale_dependency: string * morphism -> (morphism * bool) option -> morphism ->
-     local_theory -> local_theory,
+  theory_registration: Locale.registration -> local_theory -> local_theory,
+  locale_dependency: Locale.registration -> local_theory -> local_theory,
   pretty: local_theory -> Pretty.T list};
 
 type lthy =
@@ -250,7 +251,7 @@
 fun target_morphism lthy = standard_morphism lthy (target_of lthy);
 
 fun propagate_ml_env (context as Context.Proof lthy) =
-      let val inherit = ML_Env.inherit context in
+      let val inherit = ML_Env.inherit [context] in
         lthy
         |> background_theory (Context.theory_map inherit)
         |> map_contexts (K (Context.proof_map inherit))
@@ -276,10 +277,10 @@
 val define_internal = operation2 #define true;
 val notes_kind = operation2 #notes;
 val declaration = operation2 #declaration;
-fun theory_registration dep_morph mixin export =
-  assert_bottom #> operation (fn ops => #theory_registration ops dep_morph mixin export);
-fun locale_dependency dep_morph mixin export =
-  assert_bottom #> operation (fn ops => #locale_dependency ops dep_morph mixin export);
+fun theory_registration registration =
+  assert_bottom #> operation (fn ops => #theory_registration ops registration);
+fun locale_dependency registration =
+  assert_bottom #> operation (fn ops => #locale_dependency ops registration);
 
 
 (* theorems *)
--- a/src/Pure/Isar/locale.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Isar/locale.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -42,6 +42,7 @@
     declaration list ->
     (string * Attrib.fact list) list ->
     (string * morphism) list -> theory -> theory
+  val locale_space: theory -> Name_Space.T
   val intern: theory -> xstring -> string
   val check: theory -> xstring * Position.T -> string
   val extern: theory -> string -> xstring
@@ -53,6 +54,14 @@
   val axioms_of: theory -> string -> thm list
   val instance_of: theory -> string -> morphism -> term list
   val specification_of: theory -> string -> term option * term list
+  val hyp_spec_of: theory -> string -> Element.context_i list
+
+  type content =
+   {type_params: (string * sort) list,
+    params: ((string * typ) * mixfix) list,
+    asm: term option,
+    defs: term list}
+  val dest_locales: theory -> (string * content) list
 
   (* Storing results *)
   val add_facts: string -> string -> Attrib.fact list -> Proof.context -> Proof.context
@@ -73,20 +82,17 @@
   val intro_locales_tac: bool -> Proof.context -> thm list -> tactic
 
   (* Registrations and dependencies *)
-  val add_registration: string * morphism -> (morphism * bool) option ->
-    morphism -> Context.generic -> Context.generic
-  val activate_fragment: string * morphism -> (morphism * bool) option -> morphism ->
-    local_theory -> local_theory
-  val activate_fragment_nonbrittle: string * morphism -> (morphism * bool) option -> morphism ->
-    local_theory -> local_theory
-  val amend_registration: string * morphism -> morphism * bool ->
-    morphism -> Context.generic -> Context.generic
+  type registration = {dep: string * morphism, mixin: (morphism * bool) option, export: morphism}
+  val amend_registration: registration -> Context.generic -> Context.generic
+  val add_registration: registration -> Context.generic -> Context.generic
+  val add_registration_theory: registration -> theory -> theory
+  val add_registration_proof: registration -> Proof.context -> Proof.context
+  val add_registration_local_theory: registration -> local_theory -> local_theory
+  val activate_fragment: registration -> local_theory -> local_theory
   val registrations_of: Context.generic -> string -> (string * morphism) list
-  val add_dependency: string -> string * morphism -> (morphism * bool) option ->
-    morphism -> theory -> theory
+  val add_dependency: string -> registration -> theory -> theory
 
   (* Diagnostic *)
-  val hyp_spec_of: theory -> string -> Element.context_i list
   val print_locales: bool -> theory -> unit
   val print_locale: theory -> bool -> xstring * Position.T -> unit
   val print_registrations: Proof.context -> xstring * Position.T -> unit
@@ -209,6 +215,20 @@
   Locales.map o Name_Space.map_table_entry name o map_locale o apsnd;
 
 
+(* content *)
+
+type content =
+ {type_params: (string * sort) list,
+  params: ((string * typ) * mixfix) list,
+  asm: term option,
+  defs: term list};
+
+fun dest_locales thy =
+  (Locales.get thy, []) |-> Name_Space.fold_table
+    (fn (name, Loc {parameters = (type_params, params), spec = (asm, defs), ...}) =>
+      cons (name, {type_params = type_params, params = params, asm = asm, defs = defs}));
+
+
 (** Primitive operations **)
 
 fun params_of thy = snd o #parameters o the_locale thy;
@@ -222,8 +242,9 @@
 
 fun specification_of thy = #spec o the_locale thy;
 
-fun dependencies_of thy name = the_locale thy name |>
-  #dependencies;
+fun hyp_spec_of thy = #hyp_spec o the_locale thy;
+
+fun dependencies_of thy = #dependencies o the_locale thy;
 
 fun mixins_of thy name serial = the_locale thy name |>
   #mixins |> lookup_mixins serial;
@@ -516,60 +537,68 @@
 
 (*** Add and extend registrations ***)
 
-fun amend_registration (name, morph) mixin export context =
-  let
-    val thy = Context.theory_of context;
-    val ctxt = Context.proof_of context;
+type registration = Locale.registration;
+
+fun amend_registration {mixin = NONE, ...} context = context
+  | amend_registration {dep = (name, morph), mixin = SOME mixin, export} context =
+      let
+        val thy = Context.theory_of context;
+        val ctxt = Context.proof_of context;
 
-    val regs = Registrations.get context |> fst;
-    val base = instance_of thy name (morph $> export);
-    val serial' =
-      (case Idtab.lookup regs (name, base) of
-        NONE =>
-          error ("No interpretation of locale " ^ quote (markup_name ctxt name) ^
-            " with\nparameter instantiation " ^
-            space_implode " " (map (quote o Syntax.string_of_term_global thy) base) ^
-            " available")
-      | SOME (_, serial') => serial');
-  in
-    add_mixin serial' mixin context
-  end;
+        val regs = Registrations.get context |> fst;
+        val base = instance_of thy name (morph $> export);
+        val serial' =
+          (case Idtab.lookup regs (name, base) of
+            NONE =>
+              error ("No interpretation of locale " ^ quote (markup_name ctxt name) ^
+                " with\nparameter instantiation " ^
+                space_implode " " (map (quote o Syntax.string_of_term_global thy) base) ^
+                " available")
+          | SOME (_, serial') => serial');
+      in
+        add_mixin serial' mixin context
+      end;
 
 (* Note that a registration that would be subsumed by an existing one will not be
    generated, and it will not be possible to amend it. *)
 
-fun add_registration (name, base_morph) mixin export context =
+fun add_registration {dep = (name, base_morph), mixin, export} context =
   let
     val thy = Context.theory_of context;
-    val mix = (case mixin of NONE => Morphism.identity | SOME (mix, _) => mix);
-    val morph = base_morph $> mix;
-    val inst = instance_of thy name morph;
+    val pos_morph = Morphism.binding_morphism "position" (Binding.set_pos (Position.thread_data ()));
+    val mix_morph = (case mixin of NONE => base_morph | SOME (mix, _) => base_morph $> mix);
+    val inst = instance_of thy name mix_morph;
     val idents = Idents.get context;
   in
-    if redundant_ident thy idents (name, inst)
-    then context  (* FIXME amend mixins? *)
+    if redundant_ident thy idents (name, inst) then context  (* FIXME amend mixins? *)
     else
       (idents, context)
       (* add new registrations with inherited mixins *)
-      |> roundup thy (add_reg thy export) export (name, morph)
-      |> snd
+      |> roundup thy (add_reg thy export) export (name, mix_morph) |> #2
       (* add mixin *)
-      |> (case mixin of NONE => I | SOME mixin => amend_registration (name, morph) mixin export)
+      |> amend_registration {dep = (name, mix_morph), mixin = mixin, export = export}
       (* activate import hierarchy as far as not already active *)
-      |> activate_facts (SOME export) (name, morph)
+      |> activate_facts (SOME export) (name, mix_morph $> pos_morph)
+  end;
+
+val add_registration_theory = Context.theory_map o add_registration;
+
+fun add_registration_proof registration ctxt = ctxt
+  |> Proof_Context.set_stmt false
+  |> Context.proof_map (add_registration registration)
+  |> Proof_Context.restore_stmt ctxt;
+
+fun add_registration_local_theory registration lthy =
+  let val n = Local_Theory.level lthy in
+    lthy |> Local_Theory.map_contexts (fn level =>
+      level = n - 1 ? Context.proof_map (add_registration registration))
   end;
 
 
 (* locale fragments within local theory *)
 
-fun activate_fragment_nonbrittle dep_morph mixin export lthy =
-  let val n = Local_Theory.level lthy in
-    lthy |> Local_Theory.map_contexts (fn level =>
-      level = n - 1 ? Context.proof_map (add_registration dep_morph mixin export))
-  end;
-
-fun activate_fragment dep_morph mixin export =
-  Local_Theory.mark_brittle #> activate_fragment_nonbrittle dep_morph mixin export;
+fun activate_fragment registration =
+  Local_Theory.mark_brittle #> add_registration_local_theory registration;
 
 
 
@@ -590,7 +619,7 @@
   end;
 *)
 
-fun add_dependency loc (name, morph) mixin export thy =
+fun add_dependency loc {dep = (name, morph), mixin, export} thy =
   let
     val serial' = serial ();
     val thy' = thy |>
@@ -603,7 +632,7 @@
         (registrations_of context' loc) (Idents.get context', []);
   in
     thy'
-    |> fold_rev (fn dep => Context.theory_map (add_registration dep NONE export)) regs
+    |> fold_rev (fn dep => add_registration_theory {dep = dep, mixin = NONE, export = export}) regs
   end;
 
 
@@ -687,8 +716,6 @@
 
 (*** diagnostic commands and interfaces ***)
 
-fun hyp_spec_of thy = #hyp_spec o the_locale thy;
-
 fun print_locales verbose thy =
   Pretty.block
     (Pretty.breaks
--- a/src/Pure/Isar/proof.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Isar/proof.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -33,7 +33,6 @@
   val enter_forward: state -> state
   val enter_chain: state -> state
   val enter_backward: state -> state
-  val has_bottom_goal: state -> bool
   val using_facts: thm list -> state -> state
   val pretty_state: state -> Pretty.T list
   val refine: Method.text -> state -> state Seq.result Seq.seq
@@ -95,6 +94,8 @@
   val end_block: state -> state
   val begin_notepad: context -> state
   val end_notepad: state -> context
+  val is_notepad: state -> bool
+  val reset_notepad: state -> state
   val proof: Method.text_range option -> state -> state Seq.result Seq.seq
   val defer: int -> state -> state
   val prefer: int -> state -> state
@@ -237,7 +238,7 @@
 fun map_contexts f = map_all (fn (ctxt, facts, mode, goal) => (f ctxt, facts, mode, goal));
 
 fun propagate_ml_env state = map_contexts
-  (Context.proof_map (ML_Env.inherit (Context.Proof (context_of state)))) state;
+  (Context.proof_map (ML_Env.inherit [Context.Proof (context_of state)])) state;
 
 
 (* facts *)
@@ -326,17 +327,6 @@
 val before_qed = #before_qed o #2 o current_goal;
 
 
-(* bottom goal *)
-
-fun has_bottom_goal (State stack) =
-  let
-    fun bottom [Node {goal = SOME _, ...}, Node {goal = NONE, ...}] = true
-      | bottom [Node {goal, ...}] = is_some goal
-      | bottom [] = false
-      | bottom (_ :: rest) = bottom rest;
-  in bottom (op :: (Stack.dest stack)) end;
-
-
 (* nested goal *)
 
 fun map_goal f (State stack) =
@@ -904,6 +894,20 @@
   #> close_block
   #> context_of;
 
+fun get_notepad_context (State stack) =
+  let
+    fun escape [Node {goal = SOME _, ...}, Node {goal = NONE, ...}] = NONE
+      | escape [Node {goal = SOME _, ...}] = NONE
+      | escape [Node {goal = NONE, context = ctxt, ...}] = SOME ctxt
+      | escape [] = NONE
+      | escape (_ :: rest) = escape rest;
+  in escape (op :: (Stack.dest stack)) end;
+
+val is_notepad = is_some o get_notepad_context;
+
+fun reset_notepad state =
+  begin_notepad (the_default (context_of state) (get_notepad_context state));
+
 
 (* sub-proofs *)
 
--- a/src/Pure/Isar/toplevel.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Isar/toplevel.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -65,7 +65,7 @@
     (local_theory -> Proof.state) -> transition -> transition
   val theory_to_proof: (theory -> Proof.state) -> transition -> transition
   val end_proof: (bool -> Proof.state -> Proof.context) -> transition -> transition
-  val forget_proof: bool -> transition -> transition
+  val forget_proof: transition -> transition
   val proofs': (bool -> Proof.state -> Proof.state Seq.result Seq.seq) -> transition -> transition
   val proof': (bool -> Proof.state -> Proof.state) -> transition -> transition
   val proofs: (Proof.state -> Proof.state Seq.result Seq.seq) -> transition -> transition
@@ -84,6 +84,7 @@
   val command_exception: bool -> transition -> state -> state
   val reset_theory: state -> state option
   val reset_proof: state -> state option
+  val reset_notepad: state -> state option
   type result
   val join_results: result -> (transition * state) list
   val element_result: Keyword.keywords -> transition Thy_Element.element -> state -> result * state
@@ -170,8 +171,7 @@
   | is_end_theory _ = false;
 
 fun end_theory _ (State (NONE, SOME (Theory (Context.Theory thy, _)))) = thy
-  | end_theory pos (State (NONE, _)) = error ("Bad theory" ^ Position.here pos)
-  | end_theory pos (State (SOME _, _)) = error ("Unfinished theory" ^ Position.here pos);
+  | end_theory pos _ = error ("Malformed theory" ^ Position.here pos);
 
 
 (* presentation context *)
@@ -518,10 +518,10 @@
 
 end;
 
-fun forget_proof strict = transaction (fn _ =>
+val forget_proof = transaction (fn _ =>
   (fn Proof (prf, (_, orig_gthy)) =>
-        if strict andalso not (Proof.has_bottom_goal (Proof_Node.current prf))
-        then raise UNDEF else Theory (orig_gthy, NONE)
+        if Proof.is_notepad (Proof_Node.current prf) then raise UNDEF
+        else Theory (orig_gthy, NONE)
     | Skipped_Proof (_, (_, orig_gthy)) => Theory (orig_gthy, NONE)
     | _ => raise UNDEF));
 
@@ -635,7 +635,7 @@
 
 in
 
-val reset_theory = reset_state is_theory (forget_proof false);
+val reset_theory = reset_state is_theory forget_proof;
 
 val reset_proof =
   reset_state is_proof
@@ -643,6 +643,14 @@
       (fn Theory (gthy, _) => Skipped_Proof (0, (gthy, gthy))
         | _ => raise UNDEF)));
 
+val reset_notepad =
+  reset_state
+    (fn st =>
+      (case try proof_of st of
+        SOME state => not (Proof.is_notepad state) orelse can Proof.end_notepad state
+      | NONE => true))
+    (proof Proof.reset_notepad);
+
 end;
 
 
--- a/src/Pure/ML/ml_context.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/ML/ml_context.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -168,7 +168,7 @@
         (Option.map (Context.Proof o Context_Position.set_visible false) env_ctxt)
         (fn () =>
           (ML_Compiler.eval non_verbose Position.none env; Context.get_generic_context ())) ()
-      |> (fn NONE => () | SOME context' => Context.>> (ML_Env.inherit context'));
+      |> (fn NONE => () | SOME context' => Context.>> (ML_Env.inherit [context']));
 
     (*eval body*)
     val _ = ML_Compiler.eval flags pos body;
--- a/src/Pure/ML/ml_env.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/ML/ml_env.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -15,7 +15,7 @@
   val ML_read_global: bool Config.T
   val ML_write_global_raw: Config.raw
   val ML_write_global: bool Config.T
-  val inherit: Context.generic -> Context.generic -> Context.generic
+  val inherit: Context.generic list -> Context.generic -> Context.generic
   type operations =
    {read_source: Input.source -> ML_Lex.token Antiquote.antiquote list,
     explode_token: ML_Lex.token -> char list}
@@ -101,20 +101,31 @@
  {read_source: Input.source -> ML_Lex.token Antiquote.antiquote list,
   explode_token: ML_Lex.token -> char list};
 
+local
+
 type env = tables * operations;
+type data = env Name_Space.table * (bool Unsynchronized.ref * Position.T) Inttab.table;
+
+val empty_data: data = (Name_Space.empty_table "ML_environment", Inttab.empty);
+
+fun merge_data ((envs1, breakpoints1), (envs2, breakpoints2)) : data =
+  ((envs1, envs2) |> Name_Space.join_tables
+    (K (fn ((tables1, ops), (tables2, _)) => (merge_tables (tables1, tables2), ops))),
+   Inttab.merge (K true) (breakpoints1, breakpoints2));
+
+in
 
 structure Data = Generic_Data
 (
-  type T = env Name_Space.table * (bool Unsynchronized.ref * Position.T) Inttab.table;
-  val empty: T = (Name_Space.empty_table "ML_environment", Inttab.empty);
+  type T = data;
+  val empty = empty_data;
   val extend = I;
-  fun merge ((envs1, breakpoints1), (envs2, breakpoints2)) : T =
-    ((envs1, envs2) |> Name_Space.join_tables
-      (K (fn ((tables1, ops), (tables2, _)) => (merge_tables (tables1, tables2), ops))),
-     Inttab.merge (K true) (breakpoints1, breakpoints2));
+  val merge = merge_data;
 );
 
-val inherit = Data.put o Data.get;
+fun inherit contexts = Data.put (Library.foldl1 merge_data (map Data.get contexts));
+
+end;
 
 val get_env = Name_Space.get o #1 o Data.get;
 val get_tables = #1 oo get_env;
--- a/src/Pure/PIDE/command.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/PIDE/command.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -70,7 +70,8 @@
     val text = File.read full_path;
     val lines = split_lines text;
     val digest = SHA1.digest text;
-  in {src_path = src_path, lines = lines, digest = digest, pos = Path.position full_path} end
+    val file_pos = Position.copy_id pos (Path.position full_path);
+  in {src_path = src_path, lines = lines, digest = digest, pos = file_pos} end
   handle ERROR msg => error (msg ^ Position.here pos);
 
 val read_file = read_file_node "";
@@ -179,7 +180,7 @@
 fun eval_finished (Eval {eval_process, ...}) = Lazy.is_finished eval_process;
 
 fun eval_result (Eval {eval_process, ...}) =
-  task_context (Future.worker_subgroup ()) Lazy.force eval_process;
+  Exn.release (Lazy.finished_result eval_process);
 
 val eval_result_command = #command o eval_result;
 val eval_result_state = #state o eval_result;
@@ -192,6 +193,10 @@
     val res =
       if Keyword.is_theory_body keywords name then Toplevel.reset_theory st0
       else if Keyword.is_proof keywords name then Toplevel.reset_proof st0
+      else if Keyword.is_theory_end keywords name then
+        (case Toplevel.reset_notepad st0 of
+          NONE => Toplevel.reset_theory st0
+        | some => some)
       else NONE;
   in
     (case res of
@@ -259,10 +264,14 @@
         let
           val _ = status tr Markup.failed;
           val _ = status tr Markup.finished;
-          val _ = if null errs then (report tr (Markup.bad ()); Exn.interrupt ()) else ();
+          val _ = if null errs then (status tr Markup.canceled; Exn.interrupt ()) else ();
         in {failed = true, command = tr, state = st} end
     | SOME st' =>
         let
+          val _ =
+            if Keyword.is_theory_end keywords (Toplevel.name_of tr) andalso
+              can (Toplevel.end_theory Position.none) st'
+            then status tr Markup.finalized else ();
           val _ = status tr Markup.finished;
         in {failed = false, command = tr, state = st'} end)
   end;
@@ -282,7 +291,8 @@
               read keywords thy master_dir init blobs_info span |> Toplevel.exec_id exec_id) ();
       in eval_state keywords span tr eval_state0 end;
   in
-    Eval {command_id = command_id, exec_id = exec_id, eval_process = Lazy.lazy_name "Command.eval" process}
+    Eval {command_id = command_id, exec_id = exec_id,
+      eval_process = Lazy.lazy_name "Command.eval" process}
   end;
 
 end;
@@ -442,7 +452,7 @@
 fun run_process execution_id exec_id process =
   let val group = Future.worker_subgroup () in
     if Execution.running execution_id exec_id [group] then
-      ignore (task_context group Lazy.force_result process)
+      ignore (task_context group Lazy.force_result {strict = true} process)
     else ()
   end;
 
--- a/src/Pure/PIDE/document.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/PIDE/document.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -514,28 +514,28 @@
           (fn deps => fn (name, node) =>
             if Symtab.defined required name orelse visible_node node orelse pending_result node then
               let
-                fun body () =
-                 (Execution.worker_task_active true name;
-                  if forall finished_import deps then
-                    iterate_entries (fn (_, opt_exec) => fn () =>
-                      (case opt_exec of
-                        SOME exec =>
-                          if Execution.is_running execution_id
-                          then SOME (Command.exec execution_id exec)
-                          else NONE
-                      | NONE => NONE)) node ()
-                  else ();
-                  Execution.worker_task_active false name)
-                  handle exn =>
-                   (Output.system_message (Runtime.exn_message exn);
-                    Execution.worker_task_active false name;
-                    Exn.reraise exn);
                 val future =
                   (singleton o Future.forks)
                    {name = "theory:" ^ name,
                     group = SOME (Future.new_group NONE),
                     deps = delay :: Execution.active_tasks name @ maps (the_list o #2 o #2) deps,
-                    pri = 0, interrupts = false} body;
+                    pri = 0, interrupts = false}
+                  (fn () =>
+                   (Execution.worker_task_active true name;
+                    if forall finished_import deps then
+                      iterate_entries (fn (_, opt_exec) => fn () =>
+                        (case opt_exec of
+                          SOME exec =>
+                            if Execution.is_running execution_id
+                            then SOME (Command.exec execution_id exec)
+                            else NONE
+                        | NONE => NONE)) node ()
+                    else ();
+                    Execution.worker_task_active false name)
+                      handle exn =>
+                       (Output.system_message (Runtime.exn_message exn);
+                        Execution.worker_task_active false name;
+                        Exn.reraise exn));
               in (node, SOME (Future.task_of future)) end
             else (node, NONE));
       val execution' =
@@ -574,9 +574,12 @@
     val header = read_header node span;
     val imports = #imports header;
 
-    fun maybe_end_theory pos st =
-      SOME (Toplevel.end_theory pos st)
-        handle ERROR msg => (Output.error_message msg; NONE);
+    fun maybe_eval_result eval = Command.eval_result_state eval
+      handle Fail _ => Toplevel.toplevel;
+
+    fun maybe_end_theory pos st = SOME (Toplevel.end_theory pos st)
+      handle ERROR msg => (Output.error_message msg; NONE);
+
     val parents_reports =
       imports |> map_filter (fn (import, pos) =>
         (case Thy_Info.lookup_theory import of
@@ -584,7 +587,7 @@
             maybe_end_theory pos
               (case get_result (snd (the (AList.lookup (op =) deps import))) of
                 NONE => Toplevel.toplevel
-              | SOME (_, eval) => Command.eval_result_state eval)
+              | SOME (_, eval) => maybe_eval_result eval)
         | some => some)
         |> Option.map (fn thy => (thy, (pos, Theory.get_markup thy))));
 
--- a/src/Pure/PIDE/document.scala	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/PIDE/document.scala	Mon Sep 03 13:32:29 2018 +0100
@@ -706,14 +706,16 @@
     def the_dynamic_state(id: Document_ID.Exec): Command.State = execs.getOrElse(id, fail)
     def the_assignment(version: Version): State.Assignment = assignments.getOrElse(version.id, fail)
 
+    def lookup_id(id: Document_ID.Generic): Option[Command.State] =
+      commands.get(id) orElse execs.get(id)
+
     private def self_id(st: Command.State)(id: Document_ID.Generic): Boolean =
       id == st.command.id ||
       (execs.get(id) match { case Some(st1) => st1.command.id == st.command.id case None => false })
 
     private def other_id(id: Document_ID.Generic)
       : Option[(Symbol.Text_Chunk.Id, Symbol.Text_Chunk)] =
-      (execs.get(id) orElse commands.get(id)).map(st =>
-        ((Symbol.Text_Chunk.Id(st.command.id), st.command.chunk)))
+      lookup_id(id).map(st => ((Symbol.Text_Chunk.Id(st.command.id), st.command.chunk)))
 
     private def redirection(st: Command.State): Graph[Document_ID.Command, Unit] =
       (commands_redirection /: st.markups.redirection_iterator)({ case (graph, id) =>
@@ -1077,7 +1079,7 @@
         /* find command */
 
         def find_command(id: Document_ID.Generic): Option[(Node, Command)] =
-          state.commands.get(id) orElse state.execs.get(id) match {
+          state.lookup_id(id) match {
             case None => None
             case Some(st) =>
               val command = st.command
--- a/src/Pure/PIDE/document_status.scala	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/PIDE/document_status.scala	Mon Sep 03 13:32:29 2018 +0100
@@ -15,7 +15,7 @@
   {
     val proper_elements: Markup.Elements =
       Markup.Elements(Markup.ACCEPTED, Markup.FORKED, Markup.JOINED, Markup.RUNNING,
-        Markup.FINISHED, Markup.FAILED)
+        Markup.FINISHED, Markup.FAILED, Markup.CANCELED)
 
     val liberal_elements: Markup.Elements =
       proper_elements + Markup.WARNING + Markup.LEGACY + Markup.ERROR
@@ -26,6 +26,8 @@
       var accepted = false
       var warned = false
       var failed = false
+      var canceled = false
+      var finalized = false
       var forks = 0
       var runs = 0
       for (markup <- markup_iterator) {
@@ -37,10 +39,20 @@
           case Markup.FINISHED => runs -= 1
           case Markup.WARNING | Markup.LEGACY => warned = true
           case Markup.FAILED | Markup.ERROR => failed = true
+          case Markup.CANCELED => canceled = true
+          case Markup.FINALIZED => finalized = true
           case _ =>
         }
       }
-      Command_Status(touched, accepted, warned, failed, forks, runs)
+      Command_Status(
+        touched = touched,
+        accepted = accepted,
+        warned = warned,
+        failed = failed,
+        canceled = canceled,
+        finalized = finalized,
+        forks = forks,
+        runs = runs)
     }
 
     val empty = make(Iterator.empty)
@@ -58,23 +70,30 @@
     private val accepted: Boolean,
     private val warned: Boolean,
     private val failed: Boolean,
+    private val canceled: Boolean,
+    private val finalized: Boolean,
     forks: Int,
     runs: Int)
   {
     def + (that: Command_Status): Command_Status =
       Command_Status(
-        touched || that.touched,
-        accepted || that.accepted,
-        warned || that.warned,
-        failed || that.failed,
-        forks + that.forks,
-        runs + that.runs)
+        touched = touched || that.touched,
+        accepted = accepted || that.accepted,
+        warned = warned || that.warned,
+        failed = failed || that.failed,
+        canceled = canceled || that.canceled,
+        finalized = finalized || that.finalized,
+        forks = forks + that.forks,
+        runs = runs + that.runs)
 
     def is_unprocessed: Boolean = accepted && !failed && (!touched || (forks != 0 && runs == 0))
     def is_running: Boolean = runs != 0
     def is_warned: Boolean = warned
     def is_failed: Boolean = failed
     def is_finished: Boolean = !failed && touched && forks == 0 && runs == 0
+    def is_canceled: Boolean = canceled
+    def is_finalized: Boolean = finalized
+    def is_terminated: Boolean = canceled || touched && forks == 0 && runs == 0
   }
 
 
@@ -94,6 +113,9 @@
       var warned = 0
       var failed = 0
       var finished = 0
+      var canceled = false
+      var terminated = false
+      var finalized = false
       for (command <- node.commands.iterator) {
         val states = state.command_states(version, command)
         val status = Command_Status.merge(states.iterator.map(_.document_status))
@@ -103,17 +125,39 @@
         else if (status.is_warned) warned += 1
         else if (status.is_finished) finished += 1
         else unprocessed += 1
+
+        if (status.is_canceled) canceled = true
+        if (status.is_terminated) terminated = true
+        if (status.is_finalized) finalized = true
       }
       val initialized = state.node_initialized(version, name)
       val consolidated = state.node_consolidated(version, name)
 
-      Node_Status(unprocessed, running, warned, failed, finished, initialized, consolidated)
+      Node_Status(
+        unprocessed = unprocessed,
+        running = running,
+        warned = warned,
+        failed = failed,
+        finished = finished,
+        canceled = canceled,
+        terminated = terminated,
+        initialized = initialized,
+        finalized = finalized,
+        consolidated = consolidated)
     }
   }
 
   sealed case class Node_Status(
-    unprocessed: Int, running: Int, warned: Int, failed: Int, finished: Int,
-    initialized: Boolean, consolidated: Boolean)
+    unprocessed: Int,
+    running: Int,
+    warned: Int,
+    failed: Int,
+    finished: Int,
+    canceled: Boolean,
+    terminated: Boolean,
+    initialized: Boolean,
+    finalized: Boolean,
+    consolidated: Boolean)
   {
     def ok: Boolean = failed == 0
     def total: Int = unprocessed + running + warned + failed + finished
@@ -121,7 +165,7 @@
     def json: JSON.Object.T =
       JSON.Object("ok" -> ok, "total" -> total, "unprocessed" -> unprocessed,
         "running" -> running, "warned" -> warned, "failed" -> failed, "finished" -> finished,
-        "initialized" -> initialized, "consolidated" -> consolidated)
+        "canceled" -> canceled, "consolidated" -> consolidated)
   }
 
 
@@ -179,6 +223,12 @@
     def apply(name: Document.Node.Name): Node_Status = rep(name)
     def get(name: Document.Node.Name): Option[Node_Status] = rep.get(name)
 
+    def quasi_consolidated(name: Document.Node.Name): Boolean =
+      rep.get(name) match {
+        case Some(st) => !st.finalized && st.terminated
+        case None => false
+      }
+
     def overall_node_status(name: Document.Node.Name): Overall_Node_Status.Value =
       rep.get(name) match {
         case Some(st) if st.consolidated =>
@@ -223,14 +273,17 @@
       var ok = 0
       var failed = 0
       var pending = 0
+      var canceled = 0
       for (name <- rep.keysIterator) {
         overall_node_status(name) match {
           case Overall_Node_Status.ok => ok += 1
           case Overall_Node_Status.failed => failed += 1
           case Overall_Node_Status.pending => pending += 1
         }
+        if (apply(name).canceled) canceled += 1
       }
-      "Nodes_Status(ok = " + ok + ", failed = " + failed + ", pending = " + pending + ")"
+      "Nodes_Status(ok = " + ok + ", failed = " + failed + ", pending = " + pending +
+        ", canceled = " + canceled + ")"
     }
   }
 }
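
For orientation only, here is a minimal sketch of the object produced by the `json` method of `Node_Status` above, written as a plain Scala value. The field names follow that method; the concrete numbers are invented for illustration and are not taken from this changeset.

```scala
// Illustrative only: the node_status object rendered as a plain Scala Map,
// mirroring the fields produced by Node_Status.json above. Values are made up,
// but kept consistent: total = unprocessed + running + warned + failed + finished,
// and ok holds exactly when failed == 0.
val nodeStatusExample: Map[String, Any] =
  Map(
    "ok" -> true,
    "total" -> 42,
    "unprocessed" -> 0,
    "running" -> 0,
    "warned" -> 3,
    "failed" -> 0,
    "finished" -> 39,
    "canceled" -> false,
    "consolidated" -> true)
```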
--- a/src/Pure/PIDE/execution.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/PIDE/execution.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -168,18 +168,19 @@
                 Exn.capture (Future.interruptible_task e) ()
                 |> Future.identify_result pos
                 |> Exn.map_exn Runtime.thread_context;
+              val errors =
+                Exn.capture (fn () =>
+                  (case result of
+                    Exn.Exn exn =>
+                     (status task [Markup.failed];
+                      status task [Markup.finished];
+                      Output.report [Markup.markup_only (Markup.bad ())];
+                      if exec_id = 0 then ()
+                      else List.app (Future.error_message pos) (Runtime.exn_messages exn))
+                  | Exn.Res _ =>
+                      status task [Markup.finished])) ();
               val _ = status task [Markup.joined];
-              val _ =
-                (case result of
-                  Exn.Exn exn =>
-                   (status task [Markup.failed];
-                    status task [Markup.finished];
-                    Output.report [Markup.markup_only (Markup.bad ())];
-                    if exec_id = 0 then ()
-                    else List.app (Future.error_message pos) (Runtime.exn_messages exn))
-                | Exn.Res _ =>
-                    status task [Markup.finished])
-            in Exn.release result end);
+            in Exn.release errors; Exn.release result end);
 
       val _ = status (Future.task_of future) [Markup.forked];
     in future end)) ();
--- a/src/Pure/PIDE/markup.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/PIDE/markup.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -165,7 +165,9 @@
   val runningN: string val running: T
   val finishedN: string val finished: T
   val failedN: string val failed: T
+  val canceledN: string val canceled: T
   val initializedN: string val initialized: T
+  val finalizedN: string val finalized: T
   val consolidatedN: string val consolidated: T
   val exec_idN: string
   val initN: string
@@ -577,8 +579,9 @@
 val (runningN, running) = markup_elem "running";
 val (finishedN, finished) = markup_elem "finished";
 val (failedN, failed) = markup_elem "failed";
-
+val (canceledN, canceled) = markup_elem "canceled";
 val (initializedN, initialized) = markup_elem "initialized";
+val (finalizedN, finalized) = markup_elem "finalized";
 val (consolidatedN, consolidated) = markup_elem "consolidated";
 
 
--- a/src/Pure/PIDE/markup.scala	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/PIDE/markup.scala	Mon Sep 03 13:32:29 2018 +0100
@@ -424,8 +424,9 @@
   val RUNNING = "running"
   val FINISHED = "finished"
   val FAILED = "failed"
-
+  val CANCELED = "canceled"
   val INITIALIZED = "initialized"
+  val FINALIZED = "finalized"
   val CONSOLIDATED = "consolidated"
 
 
--- a/src/Pure/PIDE/rendering.scala	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/PIDE/rendering.scala	Mon Sep 03 13:32:29 2018 +0100
@@ -19,7 +19,7 @@
   object Color extends Enumeration
   {
     // background
-    val unprocessed1, running1, bad, intensify, entity, active, active_result,
+    val unprocessed1, running1, canceled, bad, intensify, entity, active, active_result,
       markdown_bullet1, markdown_bullet2, markdown_bullet3, markdown_bullet4 = Value
     val background_colors = values
 
@@ -432,6 +432,7 @@
             val status = Document_Status.Command_Status.make(markups.iterator)
             if (status.is_unprocessed) Some(Rendering.Color.unprocessed1)
             else if (status.is_running) Some(Rendering.Color.running1)
+            else if (status.is_canceled) Some(Rendering.Color.canceled)
             else opt_color
           case (_, opt_color) => opt_color
         })
--- a/src/Pure/Pure.thy	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Pure.thy	Mon Sep 03 13:32:29 2018 +0100
@@ -916,7 +916,7 @@
 
 val _ =
   Outer_Syntax.command \<^command_keyword>\<open>oops\<close> "forget proof"
-    (Scan.succeed (Toplevel.forget_proof true));
+    (Scan.succeed Toplevel.forget_proof);
 
 in end\<close>
 
--- a/src/Pure/System/progress.scala	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/System/progress.scala	Mon Sep 03 13:32:29 2018 +0100
@@ -21,7 +21,7 @@
   def echo(msg: String) {}
   def echo_if(cond: Boolean, msg: String) { if (cond) echo(msg) }
   def theory(session: String, theory: String) {}
-  def nodes_status(nodes_status: List[(Document.Node.Name, Document_Status.Node_Status)]) {}
+  def nodes_status(nodes_status: Document_Status.Nodes_Status, names: List[Document.Node.Name]) {}
 
   def echo_warning(msg: String) { echo(Output.warning_text(msg)) }
   def echo_error_message(msg: String) { echo(Output.error_message_text(msg)) }
--- a/src/Pure/Thy/export_theory.ML	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Thy/export_theory.ML	Mon Sep 03 13:32:29 2018 +0100
@@ -87,7 +87,6 @@
                     XML.Elem (entity_markup space name, body))))
           |> sort (int_ord o apply2 #1) |> map #2
         end;
-
       in if null elems then () else export_body thy export_name elems end;
 
 
@@ -196,6 +195,25 @@
     val _ = if null classrel then () else export_body thy "classrel" (encode_classrel classrel);
     val _ = if null arities then () else export_body thy "arities" (encode_arities arities);
 
+
+    (* locales *)
+
+    fun encode_locale ({type_params, params, asm, defs}: Locale.content) =
+      let
+        val args = map #1 params;
+        val typargs = type_params @ rev (fold Term.add_tfrees (map Free args @ the_list asm) []);
+        val encode =
+          let open XML.Encode Term_XML.Encode in
+            pair (list (pair string sort))
+              (pair (list (pair string typ))
+                (pair (option term) (list term)))
+          end;
+      in encode (typargs, (args, (asm, defs))) end;
+
+    val _ =
+      export_entities "locales" (fn _ => SOME o encode_locale) Locale.locale_space
+        (Locale.dest_locales thy);
+
   in () end);
 
 end;
--- a/src/Pure/Thy/export_theory.scala	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Thy/export_theory.scala	Mon Sep 03 13:32:29 2018 +0100
@@ -30,9 +30,10 @@
     axioms: Boolean = true,
     facts: Boolean = true,
     classes: Boolean = true,
-    typedefs: Boolean = true,
+    locales: Boolean = true,
     classrel: Boolean = true,
     arities: Boolean = true,
+    typedefs: Boolean = true,
     cache: Term.Cache = Term.make_cache()): Session =
   {
     val thys =
@@ -42,7 +43,8 @@
           Export.read_theory_names(db, session_name).map((theory_name: String) =>
             read_theory(Export.Provider.database(db, session_name, theory_name),
               session_name, theory_name, types = types, consts = consts,
-              axioms = axioms, facts = facts, classes = classes, typedefs = typedefs,
+              axioms = axioms, facts = facts, classes = classes, locales = locales,
+              classrel = classrel, arities = arities, typedefs = typedefs,
               cache = Some(cache)))
         }
       })
@@ -69,9 +71,10 @@
     axioms: List[Fact_Single],
     facts: List[Fact_Multi],
     classes: List[Class],
-    typedefs: List[Typedef],
+    locales: List[Locale],
     classrel: List[Classrel],
-    arities: List[Arity])
+    arities: List[Arity],
+    typedefs: List[Typedef])
   {
     override def toString: String = name
 
@@ -81,7 +84,8 @@
         consts.iterator.map(_.entity.serial) ++
         axioms.iterator.map(_.entity.serial) ++
         facts.iterator.map(_.entity.serial) ++
-        classes.iterator.map(_.entity.serial)
+        classes.iterator.map(_.entity.serial) ++
+        locales.iterator.map(_.entity.serial)
 
     def cache(cache: Term.Cache): Theory =
       Theory(cache.string(name),
@@ -91,13 +95,14 @@
         axioms.map(_.cache(cache)),
         facts.map(_.cache(cache)),
         classes.map(_.cache(cache)),
-        typedefs.map(_.cache(cache)),
+        locales.map(_.cache(cache)),
         classrel.map(_.cache(cache)),
-        arities.map(_.cache(cache)))
+        arities.map(_.cache(cache)),
+        typedefs.map(_.cache(cache)))
   }
 
   def empty_theory(name: String): Theory =
-    Theory(name, Nil, Nil, Nil, Nil, Nil, Nil, Nil, Nil, Nil)
+    Theory(name, Nil, Nil, Nil, Nil, Nil, Nil, Nil, Nil, Nil, Nil)
 
   def read_theory(provider: Export.Provider, session_name: String, theory_name: String,
     types: Boolean = true,
@@ -105,9 +110,10 @@
     axioms: Boolean = true,
     facts: Boolean = true,
     classes: Boolean = true,
-    typedefs: Boolean = true,
+    locales: Boolean = true,
     classrel: Boolean = true,
     arities: Boolean = true,
+    typedefs: Boolean = true,
     cache: Option[Term.Cache] = None): Theory =
   {
     val parents =
@@ -124,9 +130,10 @@
         if (axioms) read_axioms(provider) else Nil,
         if (facts) read_facts(provider) else Nil,
         if (classes) read_classes(provider) else Nil,
-        if (typedefs) read_typedefs(provider) else Nil,
+        if (locales) read_locales(provider) else Nil,
         if (classrel) read_classrel(provider) else Nil,
-        if (arities) read_arities(provider) else Nil)
+        if (arities) read_arities(provider) else Nil,
+        if (typedefs) read_typedefs(provider) else Nil)
     if (cache.isDefined) theory.cache(cache.get) else theory
   }
 
@@ -154,6 +161,7 @@
     val AXIOM = Value("axiom")
     val FACT = Value("fact")
     val CLASS = Value("class")
+    val LOCALE = Value("locale")
   }
 
   sealed case class Entity(
@@ -316,6 +324,36 @@
       })
 
 
+  /* locales */
+
+  sealed case class Locale(
+    entity: Entity, typargs: List[(String, Term.Sort)], args: List[(String, Term.Typ)],
+      asm: Option[Term.Term], defs: List[Term.Term])
+  {
+    def cache(cache: Term.Cache): Locale =
+      Locale(entity.cache(cache),
+        typargs.map({ case (name, sort) => (cache.string(name), cache.sort(sort)) }),
+        args.map({ case (name, typ) => (cache.string(name), cache.typ(typ)) }),
+        asm.map(cache.term(_)),
+        defs.map(cache.term(_)))
+  }
+
+  def read_locales(provider: Export.Provider): List[Locale] =
+    provider.uncompressed_yxml(export_prefix + "locales").map((tree: XML.Tree) =>
+      {
+        val (entity, body) = decode_entity(Kind.LOCALE, tree)
+        val (typargs, (args, (asm, defs))) =
+        {
+          import XML.Decode._
+          import Term_XML.Decode._
+          pair(list(pair(string, sort)),
+            pair(list(pair(string, typ)),
+              pair(option(term), list(term))))(body)
+        }
+        Locale(entity, typargs, args, asm, defs)
+      })
+
+
   /* sort algebra */
 
   sealed case class Classrel(class_name: String, super_names: List[String])
--- a/src/Pure/Thy/thy_resources.scala	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Thy/thy_resources.scala	Mon Sep 03 13:32:29 2018 +0100
@@ -111,6 +111,8 @@
           master_dir = proper_string(master_dir) getOrElse tmp_dir_name, progress = progress)
       val dep_theories_set = dep_theories.toSet
 
+      val nodes_status_update = Synchronized(Document_Status.Nodes_Status.empty_update)
+
       val result = Future.promise[Theories_Result]
 
       def check_state(beyond_limit: Boolean = false)
@@ -118,7 +120,11 @@
         val state = session.current_state()
         state.stable_tip_version match {
           case Some(version) =>
-            if (beyond_limit || dep_theories.forall(state.node_consolidated(version, _))) {
+            if (beyond_limit ||
+                dep_theories.forall(name =>
+                  state.node_consolidated(version, name) ||
+                  nodes_status_update.value._1.quasi_consolidated(name)))
+            {
               val nodes =
                 for (name <- dep_theories)
                 yield (name -> Document_Status.Node_Status.make(state, version, name))
@@ -144,12 +150,10 @@
 
       val theories_progress = Synchronized(Set.empty[Document.Node.Name])
 
-      val nodes_status_update = Synchronized(Document_Status.Nodes_Status.empty_update)
-
       val delay_nodes_status =
         Standard_Thread.delay_first(nodes_status_delay max Time.zero) {
           val (nodes_status, names) = nodes_status_update.value
-          progress.nodes_status(names.map(name => (name -> nodes_status(name))))
+          progress.nodes_status(nodes_status, names)
         }
 
       val consumer =
@@ -160,19 +164,19 @@
               val state = snapshot.state
               val version = snapshot.version
 
-              if (nodes_status_delay >= Time.zero) {
-                nodes_status_update.change(
-                  { case upd @ (nodes_status, _) =>
-                      val domain =
-                        if (nodes_status.is_empty) dep_theories_set
-                        else changed.nodes.iterator.filter(dep_theories_set).toSet
-                      val upd1 =
-                        nodes_status.update(resources.session_base, state, version,
-                          domain = Some(domain), trim = changed.assignment).getOrElse(upd)
-                      if (upd == upd1) upd
-                      else { delay_nodes_status.invoke; upd1 }
-                  })
-              }
+              nodes_status_update.change(
+                { case upd @ (nodes_status, _) =>
+                    val domain =
+                      if (nodes_status.is_empty) dep_theories_set
+                      else changed.nodes.iterator.filter(dep_theories_set).toSet
+                    val upd1 =
+                      nodes_status.update(resources.session_base, state, version,
+                        domain = Some(domain), trim = changed.assignment).getOrElse(upd)
+                    if (nodes_status_delay >= Time.zero && upd != upd1)
+                      delay_nodes_status.invoke
+
+                    upd1
+                })
 
               val check_theories =
                 (for {
--- a/src/Pure/Tools/dump.scala	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Tools/dump.scala	Mon Sep 03 13:32:29 2018 +0100
@@ -136,7 +136,7 @@
     else {
       for {
         (name, status) <- theories_result.nodes if !status.ok
-        (tree, pos) <- theories_result.snapshot(name).messages if Protocol.is_error(tree)
+        (tree, _) <- theories_result.snapshot(name).messages if Protocol.is_error(tree)
       } progress.echo_error_message(XML.content(Pretty.formatted(List(tree))))
 
       session_result.copy(rc = session_result.rc max 1)
--- a/src/Pure/Tools/server.scala	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Pure/Tools/server.scala	Mon Sep 03 13:32:29 2018 +0100
@@ -274,11 +274,9 @@
         (List("session" -> session, "theory" -> theory) ::: more.toList):_*)
 
     override def nodes_status(
-      nodes_status: List[(Document.Node.Name, Document_Status.Node_Status)])
+      nodes_status: Document_Status.Nodes_Status, names: List[Document.Node.Name])
     {
-      val json =
-        for ((name, status) <- nodes_status)
-        yield name.json + ("status" -> status.json)
+      val json = names.map(name => name.json + ("status" -> nodes_status(name).json))
       context.notify(JSON.Object(Markup.KIND -> Markup.NODES_STATUS, Markup.NODES_STATUS -> json))
     }
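
A rough, hedged sketch of one such notification as plain Scala data: the `"kind"` string and the per-name fields `"node_name"`/`"theory_name"` are assumptions about `Markup.NODES_STATUS` and `Document.Node.Name.json`, not confirmed by this changeset, while the nested `"status"` entry mirrors `Node_Status.json`; all concrete values are invented.

```scala
// Hypothetical shape of a single NODES_STATUS notification, as plain Scala Maps.
// The kind string and the node_name/theory_name field names are assumptions;
// the "status" object mirrors Node_Status.json. All concrete values are invented.
val notificationExample: Map[String, Any] =
  Map(
    "kind" -> "nodes_status",          // assumed value of Markup.NODES_STATUS
    "nodes_status" ->
      List(
        Map(
          "node_name" -> "/tmp/Test.thy",   // assumed field name
          "theory_name" -> "Draft.Test",    // assumed field name
          "status" -> Map(
            "ok" -> true, "total" -> 7, "unprocessed" -> 0, "running" -> 0,
            "warned" -> 0, "failed" -> 0, "finished" -> 7,
            "canceled" -> false, "consolidated" -> true))))
```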
 
--- a/src/Tools/VSCode/extension/README.md	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Tools/VSCode/extension/README.md	Mon Sep 03 13:32:29 2018 +0100
@@ -1,15 +1,14 @@
 # Isabelle Prover IDE support
 
 This extension connects VSCode to the Isabelle Prover IDE infrastructure: it
-requires Isabelle2018.
+requires a repository version of Isabelle.
 
 The implementation is centered around the VSCode Language Server protocol, but
 with many add-ons that are specific to VSCode and Isabelle/PIDE.
 
 See also:
 
-  * <https://isabelle.in.tum.de/website-Isabelle2018>
-  * <https://isabelle.in.tum.de/repos/isabelle/file/Isabelle2018/src/Tools/VSCode>
+  * <https://isabelle.in.tum.de/repos/isabelle/file/tip/src/Tools/VSCode>
   * <https://github.com/Microsoft/language-server-protocol>
 
 
@@ -59,8 +58,7 @@
 
 ### Isabelle/VSCode Installation
 
-  * Download Isabelle2018 from <https://isabelle.in.tum.de/website-Isabelle2018>
-    or any of its mirror sites.
+  * Download a recent Isabelle development snapshot from <http://isabelle.in.tum.de/devel/release_snapshot>.
 
   * Unpack and run the main Isabelle/jEdit application as usual, to ensure that
   the logic image is built properly and Isabelle works as expected.
@@ -69,7 +67,7 @@
 
   * Open the VSCode *Extensions* view and install the following:
 
-      + *Isabelle2018* (needs to fit to the underlying Isabelle release).
+      + *Isabelle*.
 
       + *Prettify Symbols Mode* (important for display of Isabelle symbols).
 
@@ -90,17 +88,17 @@
 
       + Linux:
         ```
-        "isabelle.home": "/home/makarius/Isabelle2018"
+        "isabelle.home": "/home/makarius/Isabelle"
         ```
 
       + Mac OS X:
         ```
-        "isabelle.home": "/Users/makarius/Isabelle.app/Isabelle2018"
+        "isabelle.home": "/Users/makarius/Isabelle.app/Isabelle"
         ```
 
       + Windows:
         ```
-        "isabelle.home": "C:\\Users\\makarius\\Isabelle2018"
+        "isabelle.home": "C:\\Users\\makarius\\Isabelle"
         ```
 
   * Restart the VSCode application to ensure that all extensions are properly
--- a/src/Tools/VSCode/extension/package.json	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Tools/VSCode/extension/package.json	Mon Sep 03 13:32:29 2018 +0100
@@ -1,6 +1,6 @@
 {
-    "name": "Isabelle2018",
-    "displayName": "Isabelle2018",
+    "name": "isabelle",
+    "displayName": "Isabelle",
     "description": "Isabelle Prover IDE",
     "keywords": [
         "theorem prover",
@@ -10,7 +10,7 @@
         "document preparation"
     ],
     "icon": "isabelle.png",
-    "version": "1.1.0",
+    "version": "1.1.1",
     "publisher": "makarius",
     "license": "MIT",
     "repository": {
@@ -250,6 +250,14 @@
                     "type": "string",
                     "default": "rgba(255, 160, 160, 0.40)"
                 },
+                "isabelle.canceled_light_color": {
+                    "type": "string",
+                    "default": "rgba(255, 106, 106, 0.40)"
+                },
+                "isabelle.canceled_dark_color": {
+                    "type": "string",
+                    "default": "rgba(255, 106, 106, 0.40)"
+                },
                 "isabelle.bad_light_color": {
                     "type": "string",
                     "default": "rgba(255, 106, 106, 0.40)"
--- a/src/Tools/VSCode/extension/src/decorations.ts	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Tools/VSCode/extension/src/decorations.ts	Mon Sep 03 13:32:29 2018 +0100
@@ -13,6 +13,7 @@
 const background_colors = [
   "unprocessed1",
   "running1",
+  "canceled",
   "bad",
   "intensify",
   "quoted",
--- a/src/Tools/jEdit/etc/options	Mon Sep 03 13:28:52 2018 +0100
+++ b/src/Tools/jEdit/etc/options	Mon Sep 03 13:32:29 2018 +0100
@@ -95,6 +95,7 @@
 option error_message_color : string = "FFC1C1FF"
 option spell_checker_color : string = "0000FFFF"
 option bad_color : string = "FF6A6A64"
+option canceled_color : string = "FF6A6A64"
 option intensify_color : string = "FFCC6664"
 option entity_color : string = "CCD9FF80"
 option entity_ref_color : string = "800080FF"