are content with this, you can quit reading here. Anyway, in order to customise
and adapt the code generator, it is necessary to gain some understanding of
how it works.

\begin{figure}[h]
\begin{tikzpicture}[x = 4.2cm, y = 1cm]
  \tikzstyle entity=[rounded corners, draw, thick, color = black, fill = white];
  \tikzstyle process=[ellipse, draw, thick, color = green, fill = white];
  \tikzstyle process_arrow=[->, semithick, color = green];
  \node (HOL) at (0, 4) [style=entity] {@{text "Isabelle/HOL"} theory};
  \node (eqn) at (2, 2) [style=entity] {code equations};
  \node (iml) at (2, 0) [style=entity] {intermediate language};
  \node (seri) at (1, 0) [style=process] {serialisation};
  \node (SML) at (0, 3) [style=entity] {@{text SML}};
  \node (OCaml) at (0, 2) [style=entity] {@{text OCaml}};
  \node (further) at (0, 1) [style=entity] {@{text "\<dots>"}};
  \node (Haskell) at (0, 0) [style=entity] {@{text Haskell}};
  \draw [style=process_arrow] (HOL) .. controls (2, 4) ..
    node [style=process, near start] {selection}
    node [style=process, near end] {preprocessing}
    (eqn);
  \draw [style=process_arrow] (eqn) -- node (transl) [style=process] {translation} (iml);
  \draw [style=process_arrow] (iml) -- (seri);
  \draw [style=process_arrow] (seri) -- (SML);
  \draw [style=process_arrow] (seri) -- (OCaml);
  \draw [style=process_arrow, dashed] (seri) -- (further);
  \draw [style=process_arrow] (seri) -- (Haskell);
\end{tikzpicture}
\caption{Code generator architecture}
\label{fig:arch}
\end{figure}

The code generator employs a notion of executability
the input
of the next in the chain, see diagram \ref{fig:arch}:

\begin{itemize}

\item The starting point is a collection of raw code equations in a
  theory; due to proof irrelevance it does not matter where they
  stem from, but typically they are either descendants of
  specification tools or explicit proofs by the user.

\item Before these raw code equations are processed further, they
  can be subjected to theorem transformations; the purpose of this
  \qn{preprocessing} is to turn theorems representing non- or badly
  executable specifications into equivalent but executable
  counterparts. The \qn{preprocessor} is an interface which makes
  the full expressiveness of ML-based theorem transformations
  available to code generation; motivating examples are shown
  below, see \secref{sec:preproc}. The result of the preprocessing
  step is a structured collection of code equations.

\item These code equations are \qn{translated} to a program in an
  abstract intermediate language. Think of it as a kind of
  \qt{Mini-Haskell} with four \qn{statements}: @{text data}
  (for datatypes), @{text fun} (stemming from code equations),
  also @{text class} and @{text inst} (for type classes); a sketch
  of how such statements may look in a target language is shown
  after this list.

\item Finally, the abstract program is \qn{serialised} into concrete
  source code of a target language. This step only produces
  concrete syntax but does not change the program in essence; all
  conceptual transformations occur in the translation step.

\end{itemize}
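
\noindent To give a concrete impression of the last two steps, the
following sketch shows one hypothetical instance of each of the four
statement kinds, rendered in plain @{text Haskell} syntax; the names
(@{text Nat}, @{text plus}, @{text Monoid0}) are invented for
illustration and are not taken from an actual serialisation:

\begin{verbatim}
module Sketch where

-- "data": a datatype declaration
data Nat = Zero | Suc Nat

-- "fun": defining equations for a constant
plus :: Nat -> Nat -> Nat
plus Zero    n = n
plus (Suc m) n = Suc (plus m n)

-- "class": a type class with its operations
class Monoid0 a where
  neutral :: a
  mult    :: a -> a -> a

-- "inst": a class instance
instance Monoid0 Nat where
  neutral = Zero
  mult    = plus
\end{verbatim}

\noindent A serialiser for @{text SML} or @{text OCaml} prints the
same abstract statements in the concrete syntax of that language
instead; as stated above, no conceptual transformation happens in
this step.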

\noindent From these steps, only the last two are carried out outside the
logic; by keeping this layer as thin as possible, the amount of code to
trust is