2 changes: 1 addition & 1 deletion docs/Generics/README.md
@@ -48,7 +48,7 @@ pdflatex generics

## Reading the PDF

- The book makes use of internal hyperlinks so it is is best to use PDF reader with support for PDF bookmarks and back/forward history:
+ The book makes use of internal hyperlinks so it is best to use PDF reader with support for PDF bookmarks and back/forward history:

- Preview.app on macOS fits the bill; you can add Back/Forward buttons to the toolbar with **View** > **Customize Toolbar**.
- [Skim.app](https://skim-app.sourceforge.io) is a BSD-licensed open source PDF reader for macOS.
2 changes: 1 addition & 1 deletion docs/Generics/chapters/archetypes.tex
@@ -679,7 +679,7 @@ \section{The Archetype Builder}\label{archetype builder}
This kind of data structure first appeared in the early history of compilers in the implementation of ``\texttt{EQUIV} statements'' for declaring that two symbols ought to share a storage location. These statements thus define an equivalence relation on symbols, and the performance gains from a careful implementation were noted in~\cite{improvedequivalence}. A survey of later techniques appears in~\cite{unionfindsurvey}.
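The "shared storage" bookkeeping described above is essentially union-find. Here is a minimal, self-contained sketch in Swift, under the assumption that symbols are numbered integers; the archetype builder's actual equivalence classes carry much more information (nested types, conformances, and so on).

```swift
// A minimal union-find: `find` follows parent links with path compression,
// and `union` merges the equivalence classes of two symbols.
struct UnionFind {
    private var parent: [Int]

    init(count: Int) {
        parent = Array(0..<count)
    }

    mutating func find(_ x: Int) -> Int {
        if parent[x] != x {
            parent[x] = find(parent[x])  // path compression
        }
        return parent[x]
    }

    mutating func union(_ x: Int, _ y: Int) {
        let rootX = find(x), rootY = find(y)
        if rootX != rootY { parent[rootY] = rootX }
    }
}

// An EQUIV-style declaration that symbols 0 and 2 share a storage location:
var classes = UnionFind(count: 4)
classes.union(0, 2)
// classes.find(0) == classes.find(2)
```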

\paragraph{Lazy expansion.}
- From a theoretical point of view, the archetype builder's approach amounts to exhausive enumeration of all \index{derived requiremen!enumerationt}derived requirements and \index{valid type parameter!enumeration}valid type parameters of a generic signature, made slightly more efficient by the choice of data structure (the asymmetry in the handling of member types in \AlgRef{archetype builder merge} means we skip parts of the search space that would yield nothing new).
+ From a theoretical point of view, the archetype builder's approach amounts to exhaustive enumeration of all \index{derived requirement!enumeration}derived requirements and \index{valid type parameter!enumeration}valid type parameters of a generic signature, made slightly more efficient by the choice of data structure (the asymmetry in the handling of member types in \AlgRef{archetype builder merge} means we skip parts of the search space that would yield nothing new).

The eager expansion model survived the introduction of protocol \texttt{where} clauses in Swift~4 \cite{se0142}, and thus associated requirements, with only relatively minor changes. The introduction of recursive conformances in Swift~4.1~\cite{se0157} necessitated a larger overhaul. Once the type parameter graph becomes infinite, the eager conformance requirement expansion of \AlgRef{archetype builder expand} no longer makes sense. The \texttt{ArchetypeBuilder} was renamed to the \IndexDefinition{GenericSignatureBuilder@\texttt{GenericSignatureBuilder}}\texttt{GenericSignatureBuilder} as part of a re-design where the recursive expansion was now performed as needed, within the lookup of \AlgRef{archetype builder lookup} itself \cite{implrecursive}.

4 changes: 2 additions & 2 deletions docs/Generics/chapters/building-generic-signatures.tex
@@ -768,7 +768,7 @@ \section{Well-Formed Requirements}\label{generic signature validity}
\ReflexStep{2}{\rT.Indices}{3}
\end{gather*}
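Assuming the requirement in play is T: Collection (the steps shown are only the tail of the derivation), the conclusion is that T.Indices is a valid type parameter. In Swift source this is just a dependent member type usable in any such generic context; a small, hedged illustration not taken from the book:

```swift
// Because the generic signature contains the requirement T: Collection,
// T.Indices is a valid type parameter and can be used as a type here.
func allIndices<T: Collection>(_ collection: T) -> T.Indices {
    return collection.indices
}
```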

- We now state the general result. We need this for the proof of \ThmRef{valid theorem}, and also later in \SecRef{conformance paths exist}, when we show that every derived conformance requirement has a derivation of a certrain form.
+ We now state the general result. We need this for the proof of \ThmRef{valid theorem}, and also later in \SecRef{conformance paths exist}, when we show that every derived conformance requirement has a derivation of a certain form.

\begin{lemma}[Formal substitution]\label{subst lemma}
Let $G$ be an arbitrary generic signature. Suppose that $G\vdash\tT$ and $G\vdash\TP$ for some type parameter \tT\ and protocol \tP. Then if we take a valid type parameter or derived requirement of~$\GP$ and replace \tSelf\ with \tT\ throughout, we get a valid type parameter or derived requirement of~$G$, just rooted in \tT. That is:
@@ -1312,7 +1312,7 @@ \section{Requirement Minimization}\label{minimal requirements}
We now state the general definition.

\begin{definition}\label{conflicting req def}
- Let $G$ be a \index{well-formed generic signature}well-formed generic signature. If $G$ has a pair of \index{derived requirement!conflicts}derived requirements $R_1$~and~$R_2$ where $R_1\otimes\Sigma$ and $R_2\otimes\Sigma$ cannot both be \index{satisfied requirement!conflicts}satisfied by the same substitution map~$\Sigma$, then $R_1$~and~$R_2$ define a pair of \IndexDefinition{conflicting requirement}\emph{confliciting requirements}. A generic signature $G$ is \emph{conflict-free} if it does not have any pairs of conflicting requirements. The pairs of derived requirements that can lead to conflicts are enumerated below:
+ Let $G$ be a \index{well-formed generic signature}well-formed generic signature. If $G$ has a pair of \index{derived requirement!conflicts}derived requirements $R_1$~and~$R_2$ where $R_1\otimes\Sigma$ and $R_2\otimes\Sigma$ cannot both be \index{satisfied requirement!conflicts}satisfied by the same substitution map~$\Sigma$, then $R_1$~and~$R_2$ define a pair of \IndexDefinition{conflicting requirement}\emph{conflicting requirements}. A generic signature $G$ is \emph{conflict-free} if it does not have any pairs of conflicting requirements. The pairs of derived requirements that can lead to conflicts are enumerated below:
\begin{enumerate}
\item For two concrete \index{same-type requirement!conflicts}same-type requirements $\SameReq{T}{$\tX_1$}$ and $\SameReq{T}{$\tX_2$}$, we desugar the ``combined'' requirement $\SameReq{$\tX_1$}{$\tX_2$}$, as we already saw. Here and every remaining case below, desugaring will either detect a conflict, or produce a simpler list of requirements that can replace one of the two original requirements.
\item For a concrete same-type requirement $\TX$ and superclass requirement $\TC$, we desugar $\ConfReq{X}{C}$, which can be satisfied only if~\tX\ is a class type that is also a subclass of~\tC.
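As a hedged source-level illustration of the first case in the list above (the declarations here are hypothetical, not the book's): two concrete same-type requirements on the same type parameter that no substitution map can satisfy together, which desugaring exposes as a conflict.

```swift
protocol P {
    associatedtype A
}

// Rejected by the compiler: desugaring the "combined" requirement
// Int == String shows that no substitution for T.A satisfies both.
func f<T: P>(_: T) where T.A == Int, T.A == String {}
```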
6 changes: 3 additions & 3 deletions docs/Generics/chapters/completion.tex
@@ -387,7 +387,7 @@ \chapter{Completion}\label{completion}
\end{enumerate}
\end{algorithm}

- We can now describe the main loop of the Knuth-Bendix completion procedure, which repeatedly finds and resolves critical pairs until no more non-trivial critical pairs remain. This process might not terminate, and we might find ourselves discovering new critical pairs and adding new rules to resolve them, forever. To prevent an infinite loop in the case of failure, we implement a termination check; if we think we've done too much work already, we give up on constructing a covergent rewriting system. We already mentioned the \textbf{left-simplified}, \textbf{right-simplified}, and \textbf{substitution-simplified} flags a few times; they are set by the rule simplification passes, with the first two described in \SecRef{rule reduction} and the third one in \SecRef{subst simplification}. These passes are invoked at appropriate times in the main loop below.
+ We can now describe the main loop of the Knuth-Bendix completion procedure, which repeatedly finds and resolves critical pairs until no more non-trivial critical pairs remain. This process might not terminate, and we might find ourselves discovering new critical pairs and adding new rules to resolve them, forever. To prevent an infinite loop in the case of failure, we implement a termination check; if we think we've done too much work already, we give up on constructing a convergent rewriting system. We already mentioned the \textbf{left-simplified}, \textbf{right-simplified}, and \textbf{substitution-simplified} flags a few times; they are set by the rule simplification passes, with the first two described in \SecRef{rule reduction} and the third one in \SecRef{subst simplification}. These passes are invoked at appropriate times in the main loop below.
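To make reduction and critical pairs concrete, here is a toy string-rewriting sketch; it is deliberately unrelated to the compiler's term representation, and the two rules are hypothetical.

```swift
import Foundation

// The two rules overlap on the word "bab": rewriting with the first rule
// yields "cb", rewriting with the second yields "bd". That is a non-trivial
// critical pair, and completion would resolve it by adding a rule relating
// the two normal forms, oriented by the reduction order.
let rules: [(lhs: String, rhs: String)] = [("ba", "c"), ("ab", "d")]

// Apply the first rule whose left-hand side occurs in the word, or return
// nil if the word is already in normal form.
func reduceOnce(_ word: String) -> String? {
    for rule in rules {
        if let range = word.range(of: rule.lhs) {
            return word.replacingCharacters(in: range, with: rule.rhs)
        }
    }
    return nil
}

// Rewrite until no rule applies.
func reduce(_ word: String) -> String {
    var current = word
    while let next = reduceOnce(current) { current = next }
    return current
}

// With the rule order above, reduce("bab") == "cb"; swapping the rules gives
// "bd". A convergent rewriting system is one where the result never depends
// on such choices.
```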

\begin{algorithm}[Knuth-Bendix completion procedure]\label{knuthbendix} Takes a list of rewrite rules as input. Records new rewrite rules, as well as rewrite loops, and returns success or failure. On success, the rewrite rules define a convergent rewriting system. Failure indicates we got a non-orientable rewrite rule, or we triggered the termination check.
\begin{enumerate}
@@ -1233,7 +1233,7 @@ \section{More Critical Pairs}\label{more critical pairs}
{\assocsym{S}{C}\cdot(\ProtoConfInv{\assocsym{S}{C}}{S})}
\end{center}

- If the identity conformance rule had not been part of that initial set, this last critical pair would \emph{define} the identity conformance rule, and we would obtain the same rewrite system in the end. We don't need to explicitly add the identity conformance rule, after all. There is a practical consideration though. By making this rule part of the initial set, and cruicially, marking it \index{permanent rule}\textbf{permanent}, we remove it from consideration in the rewrite system minimization algorithm. This cuts out a lot of unnecessary work.
+ If the identity conformance rule had not been part of that initial set, this last critical pair would \emph{define} the identity conformance rule, and we would obtain the same rewrite system in the end. We don't need to explicitly add the identity conformance rule, after all. There is a practical consideration though. By making this rule part of the initial set, and crucially, marking it \index{permanent rule}\textbf{permanent}, we remove it from consideration in the rewrite system minimization algorithm. This cuts out a lot of unnecessary work.

\smallskip

@@ -1523,7 +1523,7 @@ \section{Recursive Conformances}\label{recursive conformances redux}
\end{tabular}
\end{center}

- \ListingRef{rewrite system q} shows the convegent rewrite system for \tQ, which imports rules from protocol~\texttt{N}. Completion discovers the following critical pairs; we will be content to just summarize because they are similar to previous examples:
+ \ListingRef{rewrite system q} shows the convergent rewrite system for \tQ, which imports rules from protocol~\texttt{N}. Completion discovers the following critical pairs; we will be content to just summarize because they are similar to previous examples:
\begin{itemize}
\item Rule (6) overlaps with (7) on $\pQ\cdot\pQ\cdot\nA$. We define rule (\CRule{10}) (this general principle was established in \ExRef{proto assoc rule}).
\item Rule (8) overlaps with (2) on $\pQ\cdot\protosym{N}\cdot\nA$. We define rule (\CRule{11}) (\ExRef{protocol inheritance completion example}).
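For orientation, the general shape of a recursive conformance looks like this in source; these are not the chapter's actual definitions of N and Q, which appear earlier in the book.

```swift
// The associated type A must itself conform to N, so the type parameter
// graph rooted at any T: N is infinite: T.A, T.A.A, T.A.A.A, ...
protocol N {
    associatedtype A: N
}

// A hypothetical refinement whose where clause strengthens the requirement
// on A, so its rewrite system imports rules from N.
protocol Q: N where A: Q {}
```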
2 changes: 1 addition & 1 deletion docs/Generics/chapters/conformance-paths.tex
@@ -143,7 +143,7 @@ \chapter{Conformance Paths}\label{conformance paths}
\qquad {} = \Proto{Sequence} \otimes \texttt{Substring}\\
\qquad {} = \SubstringSequence
\end{gather*}
- However, we wish to recover $\SubstringSequence$ from $\Sigma$ alone, without resorting to a \index{global conformance lookup}\emph{global} conformance lookup, as we did in the above calculation. We don't have a lot of options. In fact, pretty much the \emph{only} thing local confomance lookup can do is take one of the \index{root conformance}root conformances stored in the substitution map, and proceed to apply \index{associated conformance projection}associated conformance projection operations.
+ However, we wish to recover $\SubstringSequence$ from $\Sigma$ alone, without resorting to a \index{global conformance lookup}\emph{global} conformance lookup, as we did in the above calculation. We don't have a lot of options. In fact, pretty much the \emph{only} thing local conformance lookup can do is take one of the \index{root conformance}root conformances stored in the substitution map, and proceed to apply \index{associated conformance projection}associated conformance projection operations.
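A hedged Swift-level rendering of this kind of situation (the chapter's exact signature and substitution may differ): with a generic signature containing only T: Collection, and T := String, using T.SubSequence as a Sequence requires the conformance of Substring to Sequence, which local conformance lookup must recover from the stored root conformance rather than by a global lookup.

```swift
func firstOfFirstChunk<T: Collection>(_ items: T) -> T.Element? {
    let chunk: T.SubSequence = items.prefix(1)
    // Iterating `chunk` uses the conformance T.SubSequence: Sequence. At the
    // call site below, it must be recovered from the root conformance
    // String: Collection by associated conformance projection.
    for element in chunk { return element }
    return nil
}

let first = firstOfFirstChunk("hello")  // T := String, T.SubSequence == Substring
```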

Our generic signature $G$ only has one explicit conformance requirement, so we start with the lone root conformance of $\Sigma$:
\[
4 changes: 2 additions & 2 deletions docs/Generics/chapters/conformances.tex
@@ -64,7 +64,7 @@ \chapter{Conformances}\label{conformances}
\draw [arrow] (SquareShape) -- (Square);
\end{tikzpicture}
\end{center}
- We denote this inherited conformance by $\ConfReq{Square}{Shape}$. It behaves identically to the superclass conformance $\ConfReq{Polygon}{Shape}$, except if ask for its \index{conforming type!inherited conformance}conforming type, we get back \texttt{Square} instead of \texttt{Polygon}. In \ExRef{inherited specialized conf}, we will see that more complex behavors can manifest when the superclass declaration has generic parameters.
+ We denote this inherited conformance by $\ConfReq{Square}{Shape}$. It behaves identically to the superclass conformance $\ConfReq{Polygon}{Shape}$, except if ask for its \index{conforming type!inherited conformance}conforming type, we get back \texttt{Square} instead of \texttt{Polygon}. In \ExRef{inherited specialized conf}, we will see that more complex behaviors can manifest when the superclass declaration has generic parameters.
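In source, the hierarchy under discussion looks roughly like this (a hedged sketch; the book's declarations include members omitted here):

```swift
protocol Shape {}

// Polygon declares the conformance; Square only inherits from Polygon.
class Polygon: Shape {}
class Square: Polygon {}

// Square is usable wherever a Shape is expected, via the inherited
// conformance, whose conforming type is Square rather than Polygon.
let shape: any Shape = Square()
```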

\section{Conformance Lookup}\label{conformance lookup}

@@ -320,7 +320,7 @@ \section{Type Witnesses}\label{type witnesses}
func play(_: String) {}
}
\end{Verbatim}
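The excerpt assumes a protocol along the following lines (a hedged reconstruction; the chapter's actual declaration may differ). Cat witnesses play(_:) with a String parameter, so associated type inference deduces Toy == String, as the next paragraph explains.

```swift
// A hedged reconstruction of the protocol the example conforms to.
protocol Pet {
    associatedtype Toy
    func play(_: Toy)
}
```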
- Associated type inference \index{synthesized declaration}synthesizes a type alias member for each inferred associated type, so we can refer to \texttt{Cat.Toy} elsewhere in the program, as if we explictly declared the type alias \texttt{Toy} as a member of \texttt{Cat}. Now, consider \texttt{Dog}, which witnesses \texttt{Pet}'s associated type with a generic parameter named \texttt{Toy}. This is Case~3:
+ Associated type inference \index{synthesized declaration}synthesizes a type alias member for each inferred associated type, so we can refer to \texttt{Cat.Toy} elsewhere in the program, as if we explicitly declared the type alias \texttt{Toy} as a member of \texttt{Cat}. Now, consider \texttt{Dog}, which witnesses \texttt{Pet}'s associated type with a generic parameter named \texttt{Toy}. This is Case~3:
\begin{Verbatim}
struct Dog<Toy>: Pet {
// synthesized: typealias Toy = Toy
2 changes: 1 addition & 1 deletion docs/Generics/generics.tex
@@ -241,7 +241,7 @@
\\*(Cited on pages #2.)%
\fi }
- % Request evalator requests
+ % Request evaluator requests
\newcommand{\Request}[1]{\textbf{#1}}
% Generic parameters