diff --git a/README.md b/README.md index 01f755e..5e0da9c 100644 --- a/README.md +++ b/README.md @@ -13,11 +13,12 @@ contains two documents: [![Download README](images/vectors/gitlab/Download-README-critical.svg)](https://collaborating.tuhh.de/alex/latex-git-cookbook/-/jobs/artifacts/master/raw/README.pdf?job=build_pandoc) - **The README covers git and Continuous Delivery, using Docker.** + **The README covers git and Continuous Delivery.** 2. A [LaTeX document](cookbook.tex), usable as a cookbook (different "recipes" to achieve various things in LaTeX) and also as a template. - The LaTeX Cookbook PDF covers LaTeX-specific topics. + The LaTeX Cookbook PDF covers LaTeX-specific topics and instruction on compiling the + LaTeX source using Docker. It is also available for download: [![Download PDF](images/vectors/gitlab/Download-Cookbook-informational.svg)](https://collaborating.tuhh.de/alex/latex-git-cookbook/-/jobs/artifacts/master/raw/cookbook.pdf?job=build_latex) @@ -203,7 +204,7 @@ pretty git GUI image from [here](https://unsplash.com/photos/842ofHC6MaI): ![git GUI](images/bitmaps/readme/yancy-min-842ofHC6MaI-unsplash.jpg) -## Git(Lab) and Continuous Delivery +### Git(Lab) and Continuous Delivery GitLab is a platform to host git repositories. Projects on there can serve as git remotes. @@ -251,122 +252,6 @@ The advantages are: which compiles the PDF and offers it for download afterwards. The last part could be called *Continuous Deployment*, albeit a very basic version. -### Docker - -Docker is a tool providing so-called *containers* (though wrong, think of them as -light-weight virtual machines). -These containers provide isolated, well-defined environments for applications to run in. -They are created from executing corresponding Docker *images*. -These images are in turn generated using a script-like list of instructions, -so-called [*Dockerfiles*](https://docs.docker.com/engine/reference/builder/). - -In summary: - -1. 
a **`Dockerfile` text document** is created, containing instructions on how the image should - look like (like what stuff to install, what to copy where, ...). - - As a baseline, these instructions often rely on a Debian distribution. - As such, all the usual Debian/Linux tools can be accessed, like `bash`. - - An (unrelated) [example Dockerfile](https://github.com/alexpovel/random_python/blob/master/music-converter/Dockerfile) - can look like: - - ```dockerfile - # Dockerfile - - # Get the latest Debian Slim with Python installed - FROM python:slim - - # Update the Debian package repositories and install a Debian package. - # Agree to installation automatically (`-y`)! - # This is required because Dockerfiles need to run without user interaction. - RUN apt-get update && apt-get install -y ffmpeg - - # Copy a file from the building host into the image - COPY requirements.txt . - - # Run some shell command, as you would in a normal sh/bash environment. - # This is a Python-specific command to install Python packages according to some - # requirements. - RUN pip install -r requirements.txt - - # Copy more stuff! - COPY music-converter/ music-converter/ - - # This will be the command the image executes if run. - # It runs this command as a process and terminates as soon as the process ends - # (successfully or otherwise). - # Docker is not like a virtual machine: it is intended to run *one* process, then - # die. If you need to run it again, just create a new container (instance of a - # Docker image). Treat containers as *cattle*, not as a *pet*. The - # container-recreation process is light-weight, fast and the way to go. - # - # Of course, this does not stop anyone from running one *long-running* process - # (as in infinity, `while True`-style). This is still a good use-case for Docker - # (as are most things!). An example for this is a webserver. 
- ENTRYPOINT [ "python", "-m", "music-converter", "/in", "--destination", "/out" ] - ``` - - The Dockerfile this project uses for LaTeX stuff is - [here](https://github.com/alexpovel/latex-extras-docker/blob/master/Dockerfile). - It is not as simple, so not as suited for an example. - -2. The **image** is then built accordingly, resulting in a large-ish file that contains an - executable environment. - For example, if we install a comprehensive `TeXLive` distribution, the image can be - more than 2 GB in size. - - This Docker image can be distributed. - In fact, there is [Docker Hub](https://hub.docker.com/), which exists for just this - purpose. - If you just instruct to run an image called e.g. `alexpovel/latex`, without - specifying a full URL to somewhere, Docker will look on the Hub for an image of that - name (and find it [here](https://hub.docker.com/r/alexpovel/latex)). - All participants of a project can pull images from there, and everyone will - be on the same page (alternatively, you can build the image from the Dockerfile). - - For example, the LaTeX environment for this project requires a whole bunch of - setting-up. - This can take hours to read up upon, understand, explain, implement and getting to - run. - In some cases, it will be **impossible** if some required part of a project conflicts - with a pre-existing condition on your machine. - For example, project *A* requires `perl` in version `6.9.0`, but project *B* requires - version `4.2.0`. - This is what Docker is all about: **isolation**. - Whatever is present on your system does not matter, only the Docker image/container - contents are relevant. - - If project member *X* has version `6.6.6` and they proclaim - ["*works for me*"](https://web.archive.org/web/20200928142058/https://events.ccc.de/2016/11/22/hello-this-is-33c3-works-for-me/), - but you have `1.3.37` and it *doesn't*, and you both cannot change your versions for - some other reason... tough luck. - This is what Docker is for. 
- - Further, if you for example specify `FROM python:3.8.6` as your *base image*, aka - provided a so-called *tag* of `3.8.6`, it will be that tag in ten years' time still. - As such, you nailed the version your process takes place in and requires. - Once set up, this will run on virtually any computer running Docker, be it your - laptop now or whatever your machine is in ten years. - This is especially important for the reproducibility of research. -3. Once the image is created, it can be run, **creating a container**. - We can then enter the container and use it like a pretty normal (usually Linux) - machine, for example to compile our `tex` files. - Other, single commands can also be executed. - For example, to compile `cookbook.tex` in PowerShell when the `alexpovel/latex` image - is available after [installing Docker](https://docs.docker.com/docker-for-windows/install/) - and getting the image (`docker pull alexpovel/latex` -- if you don't run this beforehand, - it will be downloaded automatically when it is missing), run: - - ```powershell - docker run --rm --volume ${PWD}:/tex alexpovel/latex - ``` - - Done! - For more info, especially the details of the above command, see - [here](https://github.com/alexpovel/latex-extras-docker/blob/master/README.md#quick-intro). - **For this to work, you do not have to have anything installed on your machine, only Docker**. - One concrete workflow to employ this chain is to have a Dockerfile repository on GitHub, [like this one](https://github.com/alexpovel/latex-extras-docker). GitHub then integrates with [DockerHub](https://hub.docker.com/). @@ -384,41 +269,7 @@ Refer to the (that Dockerfile is used to compile this very README to PDF via `pandoc`) for more details. -#### Installed packages - -For more information on the LaTeX packages mentioned -[in the Dockerfile repository](https://github.com/alexpovel/latex-extras-docker), -refer to the accompanying LaTeX cookbook. 
- -#### Equivalent Windows Install - -To get the same, or at least a very similar environment running on Windows, -the elements can be installed individually: - -1. [MiKTeX](https://miktex.org/download); for a closer match to the Docker, install - [TeXLive](https://www.tug.org/texlive/windows.html) instead: - for a LaTeX distribution with the `lualatex`, `biber`, `bib2gls`, `latexmk` etc. - programs, as well as all LaTeX packages. -2. [Java Runtime Environment](https://www.java.com/en/download/): - for [`bib2gls`](https://ctan.org/pkg/bib2gls), which is in turn used by - [`glossaries-extra`](https://ctan.org/pkg/glossaries-extra). -3. [InkScape](https://inkscape.org/release): - for the [`svg`](https://ctan.org/pkg/svg) package to convert SVGs automatically - (absolutely none of the `PDF/PDF_TEX` nonsense anymore!) -4. [gnuplot](https://sourceforge.net/projects/gnuplot/files/latest/download): - for [`pgfplots`](https://ctan.org/pkg/pgfplots) to generate contour plots. -5. [Perl](http://strawberryperl.com/): - for [`latexmk`](https://mg.readthedocs.io/latexmk.html) to work - -(see how annoying, manual and laborious this list is? ... use [Docker](#docker)!) - -These are required to compile the LaTeX document. -If InkScape and gnuplot ask to put their respective binaries into the `$PATH` -environment variable, hit yes. -If they do not, add the path yourself to the directory containing the binaries -(`.exe`) in `Edit environment variables for your account -> Path -> Edit... -> New`. - -### Enable Runner for the project +#### Enable Runner for the project To build anything, we need someone to build *for* us. GitLab calls these build-servers *runners*. @@ -428,7 +279,7 @@ Enable it (him? her?) for the project on the GitLab project homepage: `Settings -> CI/CD -> Runners -> Enable Shared Runners`. Otherwise, the build process will get 'stuck' indefinitely. 
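Once a runner is available, it simply executes whatever the project's `.gitlab-ci.yml` file prescribes.
As a rough sketch of such a file (the `build_latex` job name matches the PDF download badge at the top of this README, but the rest is simplified and not necessarily the repository's actual configuration):

```yaml
# Hypothetical, minimal .gitlab-ci.yml sketch; the repository's real file may differ.
build_latex:
  # Run the job inside the prepared LaTeX Docker image.
  image: alexpovel/latex
  script:
    # latexmk detects and runs the required compilation steps by itself.
    - latexmk
  artifacts:
    paths:
      # Keep the compiled PDF so it can be downloaded afterwards.
      - cookbook.pdf
```

The runner pulls the named image, runs the `script` inside it, and stores the listed artifacts for download.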
-### Add git info to PDF metadata +#### Add git info to PDF metadata After retrieving a built PDF, it might get lost in the nether. That is, the downloader loses track of what commit it belongs to, or even what release. @@ -512,7 +363,7 @@ then turned into macros/control sequences (`\newcommand`) of a given name using `token.set_macro( csname, content)`, see the [LuaTeX reference](http://mirrors.ctan.org/systems/doc/luatex/luatex.pdf) on all the included Lua functionality provided by LuaTeX, on top of regular Lua. -### Add PDF Download Button +#### Add PDF Download Button On the top of the project page, we can add *badges*. That's how GitLab calls the small, clickable buttons. diff --git a/chapters/frontmatter/preface.tex b/chapters/frontmatter/preface.tex index b917233..55cd868 100644 --- a/chapters/frontmatter/preface.tex +++ b/chapters/frontmatter/preface.tex @@ -59,239 +59,9 @@ duplicating that source code so it can be read in the printed output directly is just another vector for errors to creep in. -\section*{Beginning Prerequisites} - -Since this document will probably still be read by people new to \LaTeX{}, there -will be a short description on how to get started below. -\LaTeX{} requires three things: -\begin{enumerate} - \item of course, a source text file, ending in \texttt{.tex}. - A minimal example is: - - \begin{minted}[linenos=false]{latex} -\documentclass{scrartcl} -\begin{document} - Hello World! -\end{document} - \end{minted} - Note the usage of \texttt{scrartcl} over the standard \texttt{article}. - This is a \ctanpackage{koma-script} document class. - There are only \href{https://tex.stackexchange.com/a/73146/120853}{two reasons} - why you would not use that package, both of which usually do not apply. 
- \ctanpackage{koma-script} replaces the conventional \texttt{documentclass} - like so: - - \begin{tabular}{ - @{} - l - @{ \textrightarrow{} } - l - @{} - } - \texttt{article} & \texttt{scrartcl} \\ - \texttt{report} & \texttt{scrreprt} \\ - \texttt{book} & \texttt{scrbook} - \end{tabular} - - There is also a \texttt{letter} class, making writing formal letters a walk - in the park. - \ctanpackage{komascript} will provide you with a very large host of tools - and settings that work very well indeed. - Those combine most options you would otherwise get from other packages into - one convenient class. - Included are, among others: - \begin{itemize} - \item \ctanpackage{scrlayer-scrpage} instead of \ctanpackage{fancyhdr} for - page styles, headers, footers \iecfeg{etc.}, - \item \ctanpackage{tocbasic} to modify the Table of Contents and other lists, - and - \item \ctanpackage{scrlttr2} for typesetting letters. - \end{itemize} - If the document loads a \ctanpackage{koma-script} document class, these - packages are already available. - However, the \ctanpackage{koma-script} defaults are also great - (like using \texttt{a4paper} over \texttt{legal}), - so you can also get started without dealing with options or packages at all. - Just use it everywhere and profit. - A viable alternative is \ctanpackage{memoir}, though I never used that. - \item a \emph{distribution}, which are the compilers, packages and other - goodies like fonts: - \begin{itemize} - \item \emph{Compilers} translate high\-/level source code (see the - first point) to a different \enquote{language}. - In our case, the other language is \abb{portable_document_format} - source code. - It is not human\-/readable and mostly gibberish, but a - \abb{portable_document_format} viewer takes care of that. - \item \emph{Packages} are bundles of ready\-/made functionalities for - \LaTeX{}. - There are packages for basically everything. 
- The \href{https://ctan.org/}{\abb{comprehensive_tex_archive_network}}, - a \emph{package repository}, contains basically all of them. - \end{itemize} - UNIX-based operating systems do well with - \href{https://www.tug.org/texlive/}{TeXLive}, - which is available as a package for most distributions. - It is also available for Windows. - It has a yearly release schedule. - So there might be \href{https://tex.stackexchange.com/a/476742/120853}{bugs} - that do not get fixed for a whole while. - Nevertheless, I can recommend it. - - Another viable alternative is \href{https://miktex.org/}{MiKTeX}. - It has a rolling release model, aka updates to packages are published - whenever they are deemed ready. - MiKTeX's \abb{graphical_user_interface} (the \emph{MiKTeX Console}) is pretty - polished and usable, see \cref{fig:miktex_gui}. - - \begin{figure} - \ffigbox[\FBwidth]{% - \caption[% - % Do not use regular glossaries-commands in captions (or - % section headers etc.): if they occur in the LoC/LoT etc., - % they will be expanded there already, which is unwanted. - MiKTeX \glsfmtlong{abb.graphical_user_interface} on Windows% - ]{% - MiKTeX \abb{graphical_user_interface} on Windows% - }% - \label{fig:miktex_gui}% - }{% - % Having issued \graphicspath globally, we do not have to specify - % the full path here. Not even a file extension is necessary. - \includegraphics[width=0.8\textwidth]{miktex_gui} - } - \end{figure} - - You should hit that juicy \emph{Check for updates} at least yearly, rather - biannually. - \LaTeX{} is a slow world, in which files from the previous millennium might - very well still compile and look fine. - However, a very large share of errors are caused by out\-/of\-/date packages. - For example, if your \LaTeX{} distribution is ancient (anything older than, - say, three years), and you then compile a new file that installs a new - package, you suddenly have that package in its latest version, alongside - all the old packages. 
- That will not go well long. - \item of course, an \emph{editor}. - - Here, you are free to do whatever you want. - I recommend \href{https://code.visualstudio.com/}{Visual Studio Code}, using its - \href{https://marketplace.visualstudio.com/items?itemName=James-Yu.latex-workshop}% - {\LaTeX{} Workshop} extension, which provides syntax highlighting, shortcuts - and many other useful things. - VSCode is among the most state\-/of\-/the\-/art editors currently available. - Being usable for \LaTeX{} is just a nice \enquote{side\-/effect} we can take - advantage of. - - For a more conventional, complete \abb{integrated_development_environment}, - try \href{https://www.texstudio.org/}{TeXStudio}. - Like VSCode, it is also - \href{https://github.com/texstudio-org/texstudio}{open source}. - TeXStudio will cater to \SI{99}{\percent} of your \LaTeX{} needs. - - If you like to live dangerously, you can even write your \LaTeX{} in Notepad. - Vim is not mentioned here because its users will probably have skipped this - section\dots{} -\end{enumerate} - -\subsection*{Compiling this document} - -This document leverages many more advanced and demanding packages. -In this context, \emph{demanding} means that they sometimes need outside tools, -since \hologo{LuaLaTeX} and/or plain \TeX{} would have been insufficient for their -implementations. -This is most prevalent in packages that need sorting functionalities. -Therefore, compiling this document is a bit more involved than simply calling -\texttt{pdflatex}, the currently most common way of compiling \LaTeX{}. -To make it work for everyone, a -\href{https://hub.docker.com/repository/docker/alexpovel/latex}{Docker image} -is provided. -It includes all the tools required. -For more info, see the -\href{https://collaborating.tuhh.de/alex/latex-git-cookbook/-/blob/master/README.md}{README} -for the repository. 
- -Outside tools are required for -\begin{enumerate} - \item \ctanpackage{glossaries-extra}: requires the outside tool - \ctanpackage{bib2gls}, see \cref{ch:bib2gls}, which in turn requires - a Java Runtime Environment. - \item \ctanpackage{biblatex}: requires the outside tool \ctanpackage{biber}, - see \cref{ch:bibliography_rationale}. - \item \ctanpackage{svg}: requires the outside program - \href{https://inkscape.org/}{InkScape}. - Examples for that package's main command, \verb|\includesvg|, are - \cref{fig:wide_caption,fig:tighter_caption,fig:multiple_floats,fig:sidecap,fig:inside_float}. - \item Using \texttt{contour gnuplot} commands with \ctanpackage{pgfplots}, see - \cref{fig:mollier_diagram}, requires - \href{http://www.gnuplot.info/download.html}{gnuplot}. -\end{enumerate} - -\paragraph{Required Installations} -If you have a proper \LaTeX{} distribution, \texttt{bib2gls} and \texttt{biber} -will already be available as commands and ready to be invoked. -The latter means that they are on your \texttt{\$PATH}, \iecfeg{i.e.}\ the path to -the respective binaries are part of the \texttt{\$PATH} environment variable. -For info on what else to install, refer to the -\href{https://collaborating.tuhh.de/alex/latex-git-cookbook#installed-packages}{README section} -dealing with this. - -To compile the entire document from scratch, install those things manually. -To never worry about installing and keeping things in order manually, use the provided -Docker image! - -\paragraph{Compilation} -With the prerequisites done, the compilation itself can start. -For it, call: -\begin{enumerate} - \item \texttt{lualatex} - \item \texttt{biber} - \item \texttt{bib2gls} - \item \texttt{lualatex} - \item \texttt{lualatex} -\end{enumerate} -This should get all the references right and set up the bibliography as well as the -glossaries, which in turn fills out any missing (shown as \texttt{??}) entries. -I say \emph{should} because I never actually ran this chain myself. 
-Instead, there is the \textbf{\texttt{latexmk}} tool to ease all this painful labour. -\texttt{latexmk} automates \LaTeX{} compilation by detecting all the required -steps and then running them as often as required. -It requires \textbf{Perl}. -Linux users will already have it available, Windows users may grab -\href{http://strawberryperl.com/}{Strawberry Perl}. - -Once that is done, the entire document can be compiled by simply calling -\texttt{latexmk}. -You do not even have to provide a \texttt{*.tex} file argument. -By default, \texttt{latexmk} will simply compile all found \texttt{*.tex} files. -The core ingredient to this magic process is the \texttt{.latexmkrc} configuration file. -You can find it in the repository root directory. -It is tailored to this document and does not need to be touched if the compilation -process itself has not changed. -It also contains some more insights to the entire process. - -\texttt{latexmk} is great because it figures out most things by itself and enjoys -wide\-/spread acceptance and adoption. -If it does not figure out everything from the get\-/go, it is easily customized, -like for this document. -As such, chances are your \LaTeX{} editor either supports or outright relies on it -already. - -Having walked through all this manually, hopefully using the prepared Docker image -instead makes more sense now. -Taking a look at \texttt{.gitlab-ci.yml} in the project root, we can see how easy it -can be: -run Docker container, call \texttt{latexmk}. -That is it! -It is guaranteed to work for everyone, because the Docker container (that is, the -virtual build environment) will be identical for all users. -It is independent of local \LaTeX{} installations and all their quirks. -It will continue to work forever (the underlying software versions are well\-/constrained) -and the generated output will be identical across the board. 
-
-If all of this is embedded into a pipeline on GitLab, your documents are built whenever
-you \texttt{git push} to the remote server (or whenever you configure it to).
-It does not get simpler; the downside is of course the lengthier setup, but all of that
-is explained in the
-\href{https://collaborating.tuhh.de/alex/latex-git-cookbook/-/blob/master/README.md}{README}.
-Also, the repository itself is a live demonstration where everything is set up already!
+\paragraph{Source Repository}
+The source repository for this document is at
+\begin{center}
+    \url{https://collaborating.tuhh.de/alex/latex-git-cookbook} .
+\end{center}
+Any references to the \enquote{source} or \enquote{repository} refer to that project.
diff --git a/chapters/mainmatter.tex b/chapters/mainmatter.tex
index 6443f3c..b0c515b 100644
--- a/chapters/mainmatter.tex
+++ b/chapters/mainmatter.tex
@@ -1,5 +1,6 @@
 \mainmatter
 
+\subimport{mainmatter/}{usage}
 \subimport{mainmatter/}{base-features}
 \subimport{mainmatter/}{floats}
 \subimport{mainmatter/}{code-listings}
diff --git a/chapters/mainmatter/usage.tex b/chapters/mainmatter/usage.tex
new file mode 100644
index 0000000..1739c00
--- /dev/null
+++ b/chapters/mainmatter/usage.tex
@@ -0,0 +1,299 @@
+\chapter{Usage}
+
+This document \enquote{requires}%
+\footnote{
+    It does not \emph{technically} require Docker, but I hope to convince you that
+    the non\-/Docker, manual way is the way of the dodo and a big no\-/no.
+}
+\href{https://www.docker.com/}{Docker}.
+On a high level, Docker allows you to prepare specific software bundles, tailor\-/made for
+whatever application the bundle author desires.
+Other users can then use these software bundles for their own projects.
+The bundles are well\-/defined and can be distributed and run very easily.
+The more complex and demanding the required software, the higher the benefit of using such bundles.
+The driving design principle and primary use case for these bundles is \emph{isolation}.
+Whatever you do with the bundles, it happens in isolation from your host system.
+This allows a user to have, for example, arbitrary Python versions at their disposal,
+whereas a local system installation can only ever offer a single version.
+
+As \LaTeX{} documents go, the one at hand is pretty complex.
+This is owed to the many packages and outside tools it uses.
+Outside tooling (programs other than \LaTeX{} called from within \LaTeX{}) is quite
+prevalent in \LaTeX{}, since \LaTeX{} itself is so limited.%
+\footnote{
+    For example, when writing Python, you would not call Perl or JavaScript from within it,
+    because whatever they can do, Python can.
+    The same analogy does \emph{not} hold for \LaTeX{}: base \LaTeX{} can do surprisingly
+    little, even though \TeX{} is technically Turing\-/complete.
+}
+Let us look at the outside tooling used for this document.
+
+\section{Outside tools and special packages}
+
+This section highlights the pain points of \emph{not} using Docker.
+If you are already familiar and need no convincing, skip to \cref{ch:using-docker}.
+For this document, outside tools are required for:
+\begin{enumerate}
+    \item \ctanpackage{glossaries-extra}: requires the outside tool
+        \ctanpackage{bib2gls}, see \cref{ch:bib2gls}, which in turn requires
+        a Java Runtime Environment.
+    \item \ctanpackage{biblatex}: requires the outside tool \ctanpackage{biber},
+        see \cref{ch:bibliography_rationale}.
+    \item \ctanpackage{svg}: requires the outside program
+        \href{https://inkscape.org/}{InkScape}.
+        Examples for that package's main command, \verb|\includesvg|, are
+        \cref{fig:wide_caption,fig:tighter_caption,fig:multiple_floats,fig:sidecap,fig:inside_float}.
+    \item Using \texttt{contour gnuplot} commands with \ctanpackage{pgfplots}, see
+        \cref{fig:mollier_diagram}, requires
+        \href{http://www.gnuplot.info/download.html}{gnuplot}.
+    \item Syntax highlighting for source code, see \cref{ch:code-listings},
+        is done through \ctanpackage{minted}.
+        It requires \href{https://www.python.org/}{Python} (built into virtually all Linux
+        distributions but needs to be installed on Windows) with its
+        \href{https://pypi.org/project/Pygments/}{pygments} package installed.
+        That package does the heavy lifting and \ctanpackage{minted} inserts the result
+        into your \LaTeX{} document.
+\end{enumerate}
+
+If you have a proper \LaTeX{} distribution, \texttt{bib2gls} and \texttt{biber}
+will already be available as commands and ready to be invoked.
+The latter means that they are on your \texttt{\$PATH}, \iecfeg{i.e.}\ the paths to
+the respective binaries are part of the \texttt{PATH} environment variable.
+\emph{Everything else, you have to install manually}.
+
+You will also have to keep everything updated by hand.
+What if one of the dependencies of this document conflicts with something you have already
+installed on your system?
+What if that conflict is not resolvable?
+You would be unable to use this document.
+This is where Docker comes into play.
+You outsource all this setting\-/up to someone who already bundled all that stuff together.
+Docker calls these \enquote{bundles} \emph{images}.
+There is a Docker image \emph{tailor\-/made} for this document:
+\begin{center}
+    \href{https://hub.docker.com/r/alexpovel/latex}{alexpovel/latex}.
+\end{center}
+It is guaranteed to function correctly for this document, since the author maintains both
+in parallel.
+There will never be a mismatch.
+If maintenance ceases, it will cease for both components at the same time, hence it will
+still continue to work, just not get updated anymore.
+
+\section{Using Docker}
+\label{ch:using-docker}
+
+Instead of all of the above, \textbf{only one installation is required}: Docker.
+You can get it from
+\begin{center}
+    \url{https://docs.docker.com/get-docker/} .
+\end{center}
+\emph{You don't even need a \LaTeX{} distribution}.
+All you need is an editor to edit the files in, see \cref{ch:editor}.
+Once you want to compile, open a terminal, navigate to your directory and run +\begin{minted}[linenos=false]{powershell} + docker run --rm --volume ${PWD}:/tex alexpovel/latex +\end{minted} +for PowerShell or +\begin{minted}[linenos=false]{shell} + docker run --rm --volume $(pwd):/tex alexpovel/latex +\end{minted} +for bash. +That's it! + +The command consists of: +\begin{itemize} + \item The \texttt{--rm} option, which removes the run container after you are done + (containers are \enquote{instances} of images). + This is generally desired, since containers are ephemeral and should be treated as such. + Do not keep containers around, simply create new ones! + If you need adjustments, adjust the image, not the container, then create new containers + from that adjusted image. + \item The \texttt{--volume} option makes the current directory (\texttt{pwd}) + available \emph{inside} the running container, at the \texttt{/tex} destination path. + The \LaTeX{} process needs your files to work with. + Without this option, the container would be \enquote{naked}, with no way of + accessing your files. + \item The final argument to \texttt{docker run} is the image to be run, in this case + \texttt{alexpovel/latex}, which will look for that name on + \href{https://hub.docker.com/}{DockerHub}. +\end{itemize} + +Ideally, the command can be registered as the \enquote{compilation} command of your +editor. +That way, you just hit the compile button and will be using Docker in the background, +with no changes to your workflow. + +\subsection{Compilation steps} + +You are probably used to running \texttt{pdflatex} or similar on your source files, +as many times as needed. +So where does that step happen in the above \texttt{docker} command? + +The Docker approach uses the \textbf{\texttt{latexmk}} tool to ease all the painful labour +of running chains of \texttt{pdflatex}, \texttt{biber} \iecfeg{etc}.\ manually. 
+\texttt{latexmk} automates \LaTeX{} compilation by detecting all the required
+steps and then running them as often as required.
+It requires \textbf{Perl}.
+Linux users will already have it available; Windows users may grab
+\href{http://strawberryperl.com/}{Strawberry Perl}.
+As such, this document's processing pipeline \emph{as a whole} requires Perl,
+although compilation itself does not technically require it.
+
+Once Perl is installed (of course, the Docker image already contains it),
+the entire document can be compiled by \textbf{simply calling \texttt{latexmk}}.
+You do not even have to provide a \texttt{*.tex} file argument.
+By default, \texttt{latexmk} will simply compile all found \texttt{*.tex} files.
+The core ingredient to this magic process is the \texttt{.latexmkrc} configuration file.
+You can find it in the repository root directory.
+It is tailored to this document and does not need to be touched if the compilation
+process itself has not changed.
+It also contains some more insights into the entire process.
+
+\texttt{latexmk} is great because it figures out most things by itself and enjoys
+wide\-/spread acceptance and adoption.
+If it does not figure out everything from the get\-/go, it is easily customized,
+like for this document.
+
+Having walked through all this manually, hopefully using the prepared Docker image
+instead makes more sense now.
+It is guaranteed to work for everyone, because the Docker container (that is, the
+virtual build environment) will be identical for all users.
+It is independent of local \LaTeX{} installations and all their quirks.
+As such, it simply and forever does away with the entire, huge class of
+\begin{displayquote}[Everyone, at some point]
+    But it works on my machine!
+\end{displayquote}
+Good riddance to that.
+
+If all of this is embedded into a pipeline on GitLab, your documents are built whenever
+you \texttt{git push} to the remote server (or whenever you configure it to).
+It does not get simpler; the downside is of course the lengthier setup, but all of that
+is explained in the
+\href{https://collaborating.tuhh.de/alex/latex-git-cookbook/-/blob/master/README.md}{README}.
+Also, the repository itself is a live demonstration where everything is set up already!
+
+\subsection{More on Docker}
+
+You do not need to know the entire chain of how Docker images are created and run.
+Merely consuming the final image grants all the benefits with little effort.
+However, the process is not complex:
+\begin{enumerate}
+    \item A \textbf{\texttt{Dockerfile} text document} is created, containing instructions
+    on what the image should look like (what software to install, what to copy where, \dots{}).
+
+    As a baseline, these instructions often rely on a Debian distribution.
+    As such, all the usual Debian/Linux tools can be accessed, like \texttt{bash}.
+
+    An (unrelated)
+    \href{https://github.com/alexpovel/random_python/blob/master/music-converter/Dockerfile}{example Dockerfile}
+    can look like:
+
+    \begin{minted}{dockerfile}
+# Get the latest Debian Slim with Python installed
+FROM python:slim
+
+# Update the Debian package repositories and install a Debian package.
+# Agree to installation automatically (`-y`)!
+# This is required because Dockerfiles need to run without user interaction.
+RUN apt-get update && apt-get install -y ffmpeg
+
+# Copy a file from the building host into the image
+COPY requirements.txt .
+
+# Run some shell command, as you would in a normal sh/bash environment.
+# This is a Python-specific command to install Python packages according to some
+# requirements.
+RUN pip install -r requirements.txt
+
+# Copy more stuff!
+COPY music-converter/ music-converter/
+
+# This will be the command the image executes if run.
+# It runs this command as a process and terminates as soon as the process ends
+# (successfully or otherwise).
+# Docker is not like a virtual machine: it is intended to run *one* process, then
+# die. If you need to run it again, just create a new container (instance of a
+# Docker image). Treat containers as *cattle*, not as *pets*. The
+# container-recreation process is light-weight, fast, and the way to go.
+#
+# Of course, this does not stop anyone from running one *long-running* process
+# (as in infinity, `while True`-style). This is still a good use-case for Docker
+# (as are most things!). An example of this is a webserver.
+ENTRYPOINT [ "python", "-m", "music-converter", "/in", "--destination", "/out" ]
+    \end{minted}
+
+    The Dockerfile this project uses for its \LaTeX{} tooling can be found at:
+    \begin{center}
+        \url{https://github.com/alexpovel/latex-extras-docker/blob/master/Dockerfile}
+    \end{center}
+    It is not as simple, and hence less suited as an example.
+    Its length gives you an idea of the setup required to compile this \LaTeX{} document.
+    All of that complexity is of no concern to you when using Docker!
+    Of course, such an image also works for much simpler documents.
+
+    If you require custom additions, you can always inherit from existing base images:
+    \begin{minted}{dockerfile}
+FROM alexpovel/latex
+
+# ... Your stuff goes here ...
+    \end{minted}
+    \item The \textbf{image} is then built according to the \texttt{Dockerfile} instructions,
+    resulting in a large\-/ish file that contains an executable environment.
+    For example, if we install a comprehensive TeX Live distribution, the image can be
+    more than \SI{2}{\giga\byte} in size.
+    Note that you will never interact with that \enquote{file} directly.
+    Docker manages it for you, and all interaction occurs through the \texttt{docker} command.
+
+    The Docker image can be distributed.
+    If you instruct Docker to run an image called \iecfeg{e.g.} \texttt{alexpovel/latex}, without
+    specifying a full URL to some registry, Docker will look on its Hub for an image of that
+    name (and find it \href{https://hub.docker.com/r/alexpovel/latex}{here}).
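+
+    For example, fetching the image explicitly is a single command.
+    This is a sketch: the \texttt{latest} tag is Docker's default and is assumed to exist here.
+    \begin{minted}{shell}
+# Download the image from Docker Hub (the default registry):
+docker pull alexpovel/latex
+
+# Equivalent, with registry and default tag spelled out:
+docker pull docker.io/alexpovel/latex:latest
+    \end{minted}
+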
+    Anyone can pull (public) images from there, and everyone will
+    be on the same page (alternatively, you can build the image from the Dockerfile yourself).
+
+    For example, as stated, the \LaTeX{} environment for this project requires a whole bunch
+    of setting\-/up.
+    This can take more than an afternoon to read up on, understand, implement, and get running.
+    In some cases, it will be impossible if some required part of a project conflicts
+    with a pre\-/existing condition on your machine.
+    For example, suppose project \emph{A} requires Perl in version \texttt{6.9.0},
+    but project \emph{B} requires version \texttt{4.2.0}.
+    This is what Docker is all about: isolation.
+    Whatever is present on your system does not matter, only the Docker image/container
+    contents are relevant.
+
+    Further, if you for example specify \mintinline{dockerfile}{FROM python:3.8.6}
+    as your base image, that is, provide a so\-/called \emph{tag} of \texttt{3.8.6},
+    it will still refer to that exact version in ten years' time.
+    As such, you have pinned the version your process runs in and requires.
+    Once set up, this will run on virtually any computer running Docker, be it your
+    laptop now or whatever your machine is in ten years.
+    This is especially important for the reproducibility of research.
+    \item Once the image is created, it can be run, \textbf{creating a container}.
+    We can then enter the container and use it like a fairly normal (usually Linux)
+    machine, for example to compile our \LaTeX{} files.
+    Single, one\-/off commands can also be executed.
+
+    The proper way is to run one container \emph{per process}.
+    If that process (\iecfeg{e.g.}\ \texttt{latexmk}) finishes, the container exits.
+    A new process then requires a new container.
+\end{enumerate}
+
+\subsection{Editor}
+\label{ch:editor}
+
+You are free to use whatever editor you want.
+However, a garbage editor can substantially hamstring your work.
+For example, please do not use Notepad++.
+It is a fantastic little program, but unsuitable for serious, longer work.
+
+The author uses and dearly recommends \href{https://code.visualstudio.com/}{Visual Studio Code},
+together with its
+\href{https://marketplace.visualstudio.com/items?itemName=James-Yu.latex-workshop}{\LaTeX{} Workshop}
+extension, which provides syntax highlighting, shortcuts, and many other useful things.
+VSCode is among the most modern editors currently available.
+Being usable for \LaTeX{} is just a nice \enquote{side\-/effect} we can take
+advantage of.
+It is open\-/source and therefore also has a privacy\-/respecting fork, \emph{VSCodium}.
+
+For a more conventional, complete \abb{integrated_development_environment},
+try \href{https://www.texstudio.org/}{TeXStudio}.
+Like VSCode, it is also
+\href{https://github.com/texstudio-org/texstudio}{open source}.
+TeXStudio will cater to \SI{99}{\percent} of your \LaTeX{} needs, but to nothing else (Markdown, \dots{}).
diff --git a/cookbook.cls b/cookbook.cls
index e4685a2..b4ea825 100644
--- a/cookbook.cls
+++ b/cookbook.cls
@@ -1627,6 +1627,8 @@
     math-micro=\text{µ},% https://tex.stackexchange.com/a/54915/120853
     text-micro=µ,
     per-mode=symbol,
+    %
+    binary-units, % Allow "GB" for gigabyte etc., which isn't strictly SI but useful
 }%
 % Second setup step, with locales. See also
 % https://tex.stackexchange.com/a/46979/120853