Remember, a few hours of trial and error can save you several minutes of looking at the README.
This repo contains a LaTeX document, usable as a cookbook (different "recipes" to achieve various things in LaTeX) and also as a template. The resulting PDF covers LaTeX-specific topics and instructions on compiling the LaTeX source using Docker. It is available for download.
This very README is also made available for download as a PDF, converted from Markdown using pandoc with the Eisvogel template.
Many nights were lost over issues involving GitLab CI/CD, but also plain LaTeX. Here is a non-exhaustive list --- a bit like a gallery of failures --- of the most common ones. Hopefully, it spares you some despair.
- You run into an error similar to:

  ```text
  ! Package pgfplots Error: Sorry, the requested column number '1' in table 'dat.csv' does not exist!? Please verify you used the correct index 0 <= i < N..
  ```

  This can happen if you use `pgfplotstable` for plotting from tabular data, be it inline or from an outside file. If you use `matlab2tikz`, you might also run into the above error, since it potentially uses inline tables.

  In the class file, there is a line reading:

  ```latex
  \pgfplotstableset{col sep=comma}% If ALL files/tables are comma-separated
  ```

  This is a global default for all tables, assuming that they are comma-separated. The default separator is whitespace, which `matlab2tikz` uses, hence it breaks. You can override the column separator either in the above global option, or manually for each plot:

  ```latex
  % See https://tex.stackexchange.com/a/251245/120853
  \addplot3[surf] table [col sep=comma] {dat.csv};
  ```
- When using the package `fontspec` (or its derivative `unicode-math`), compilation fails with:

  ```text
  ! error: (type 2): cannot find file ''
  ! ==> Fatal error occurred, no output PDF file produced!
  ```

  It is possible that the font cache got corrupted after moving fonts around. For example, if previously all fonts were in a flat `./fonts/` subdirectory of your document root, and you then decided to sort them into `./fonts/sans/` etc., the luatex cache will still point to the old locations.

  See here and also, similarly, here for a solution: delete the `.lua` and `.luc` files of the fonts in question from `luatex-cache/generic/fonts/`. For MiKTeX 2.9 on Windows 10, this cache was found in `%USERPROFILE%\AppData\Local\MiKTeX\2.9\luatex-cache`.
- The error is, or is similar to:

  ```text
  ! Undefined control sequence. l.52 \glsxtr@r
  ```

  with an `*.aux` file mentioned in the error message as well. Here, an auxiliary file got corrupted in an unsuccessful run and simply needs to be deleted. Do this manually or using `latexmk -c`.
- Concerning `glossaries-extra`:
  - Using `\glssetcategoryattribute{<category>}{indexonlyfirst}{true}`: for all items in `<category>`, this is meant to add only the very first reference to the printed glossary. If that first reference occurs within a float, this breaks, and nothing shows up in the `##2` column.

    The way the document was set up, most symbols are currently affected. However, in an actual document, it is highly unlikely that you will reference symbols (with `\gls{<symbol>}`) for the first time in floating objects. Therefore, this is probably not a realistic issue.
  - In conjunction with `subimport`: that package introduces a neat structure for keeping subdirectories and doing nested imports of `*.tex` files. But it might not be worth it, since it breaks many referencing functionalities in, for example, TeXStudio.

    More importantly, it seems to cause `glossaries-extra` to no longer recognize which references have occurred. We currently call `selection=all` in `\GlsXtrLoadResources` to load everything found in the respective `*.bib` file, regardless of whether it has actually been referenced at some point (using `\gls{}` etc.). This is a bit as if `biblatex` did not recognize cite commands and we just pulled every single item from the bibliography file. Some people use gigantic `*.bib` files, shared among their projects. If suddenly every entry showed up in the printed document despite not being referenced (be it a glossary or a bibliography item), chaos would ensue.
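To see the separator interplay in isolation, here is a minimal sketch (not taken from the repo) of an inline comma-separated table that only plots because `col sep=comma` is given; with the default whitespace separator, it would raise exactly the "requested column number ... does not exist" error:

```latex
% Minimal sketch, assuming a reasonably recent pgfplots; not part of the repo.
\documentclass{standalone}
\usepackage{pgfplots}
\pgfplotsset{compat=1.17}
\begin{document}
\begin{tikzpicture}
  \begin{axis}[xlabel={$t$}, ylabel={$y$}]
    % 'row sep=\\' makes inline rows explicit, which is the robust
    % way to combine inline tables with a non-whitespace col sep.
    \addplot table [col sep=comma, row sep=\\, x=t, y=y] {%
      t,y\\
      0,0\\
      1,1\\
      2,4\\
    };
  \end{axis}
\end{tikzpicture}
\end{document}
```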
The following are valid not only for LaTeX files, but for most text-based source files:

- For the love of God, use `UTF-8` (and only `UTF-8`, not anything 'higher') for text encoding. Stop using `Windows 1252`, `Latin` etc. Existing files can easily be converted to UTF-8 without much danger of regression (i.e., introducing errors).
- Put each sentence, or even each part of a sentence, and each instruction onto its own line. This is very important to `diff` files properly, aka `git diff`. Generally, keep lines short.
- In a similar vein, use indentation appropriately. Indent using 4 spaces. There are schools of thought that advocate two spaces, or one tab. Ultimately, that does not really matter: 'four spaces' just seems to generally win the fight for a common coding style, bringing us to the next point.
- Be consistent. Even if you pull your own custom stuff, at least be consistent in doing so. This makes things predictable; the code will be easier to read and also more easily changed programmatically. GNU/Linux, and by extension Windows using the Windows Subsystem for Linux, has a very wide range of tools that make search, search-and-replace, and various other operations on plain-text files easy. The same is true for similar tools in IDEs. However, if the text is scattered and the style is mangled and fragmented into various sub-styles, this becomes very hard. For example, one person might use `$<math>$` for inline LaTeX math, another the (preferred) `\(<math>\)` style. Suddenly, you would have to search for both versions to find all inline math. So stay consistent. If you work on pre-existing documents, use the established style. If you change it, change it fully, not just for newly added work.
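As a small illustration of the 'one sentence per line' and consistency points, a LaTeX paragraph might be laid out like this (the content itself is purely illustrative):

```latex
% One sentence (or clause) per line: a change to one sentence
% shows up in 'git diff' as a change to exactly one line.
% Also, consistently prefer \(...\) over $...$ for inline math.
The experiment was repeated three times.
Each run used the same seed,
but a different sample size.
The result scales as \(E = m c^2\).
```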
We use beautiful, capable fonts based on TeX Gyre for high-quality typesetting: particularly TeX Gyre Pagella, an open and free clone of Palatino (by Hermann Zapf). It comes with an accompanying math font, TeX Gyre Pagella Math. This is extremely important: only with two matching (or, as in our case, essentially identical) fonts will a document like this one look good. If math and text fonts don't mix well, the result will look terrible. If the math font is not highly capable and does not provide all the glyphs we need, it will also look terrible.
There aren't very many high-quality free fonts with a math companion available,
see this compilation.
A similar compilation of 'nice' fonts is found here;
however, that one is specifically for `pdflatex`, not `lualatex`.
Of all the available ones, Pagella was chosen.
This can be changed with relative ease in the package options for the responsible package, `unicode-math`.
If the new font does not provide at least the same set of features, parts of the document might break!
There is also a list of symbols defined by `unicode-math` for reference.
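Swapping the font pair might look like the following sketch; the replacement names are merely examples of fonts that ship with a math companion, not what the repo actually uses:

```latex
% Sketch only, not taken from the class file.
\usepackage{unicode-math}
\setmainfont{TeX Gyre Pagella}       % current text font
\setmathfont{TeX Gyre Pagella Math}  % matching math companion
% Possible replacement pair (check that all required glyphs exist!):
% \setmainfont{STIX Two Text}
% \setmathfont{STIX Two Math}
```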
`unicode-math` builds on top of `fontspec`: it loads and then extends it.
`fontspec` is a package for `lualatex` and `xelatex` designed to allow the use of outside, system fonts (as opposed to fonts that ship with LaTeX distributions/packages).
These can be system-installed fonts, but we bring our own, and they reside in this directory.
This ensures that everyone can compile this repository/document, as long as this subdirectory is intact.
It is also OS-agnostic (with system-installed fonts, calling them by name can get really out of hand; with plain filenames, we know exactly what to call them).
Some fonts ship with regular LaTeX distributions and are available without being explicitly installed by the user; the TeX Gyre fonts are an example.
These need neither be included here nor installed: they ship with the LaTeX installation (TeX Live).
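Loading bundled fonts by filename, rather than by system name, might look like this sketch (the directory layout and filenames are illustrative, not necessarily the repo's actual ones):

```latex
% Sketch: fontspec resolves these files relative to the project,
% so compilation works on any OS as long as ./fonts/ is intact.
\setmainfont{texgyrepagella}[
  Path           = ./fonts/serif/,
  Extension      = .otf,
  UprightFont    = *-regular,
  ItalicFont     = *-italic,
  BoldFont       = *-bold,
  BoldItalicFont = *-bolditalic,
]
```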
The distinct advantage of `unicode-math` over its parent `fontspec` is the added support for a math font (`\setmathfont`).
Note that `unicode-math` requires `lua(la)tex` or `xe(la)tex`; you cannot compile this document with `pdf(la)tex`.
This also means that the packages `inputenc` and `fontenc` are not needed.
In fact, they are, as far as I know, incompatible with this setup and may break the document.
They used to be employed to allow advanced encodings for both the plain-text source as well as the output PDF.
This allowed the use of umlauts etc., but that's a relic of a distant past.
In fact, if we wanted to, we could input Unicode characters directly into the source code, e.g. `\(α = 2\)` over `\(\alpha = 2\)` (we don't do this because we use an entirely different system in the form of `glossaries-extra`).
Now, it remains to choose between `lualatex` and `xelatex` (`luatex` and `xetex` exist too, but output to DVI, which we don't care for nowadays; `lualatex` and `xelatex` output to PDF).
The choice falls to `lualatex` and is quite easy:
- We make use of the `contour` package to print characters (of any color) with a thick contour, aka background, around them (in any color as well). Of course, the most obvious use is black text with a white contour. Text set like that remains legible on various colored or otherwise obstructed (for example by plot grids) page backgrounds. For this purpose, there is the command `\ctrw{<text>}`, which is used a lot. The `contour` capabilities don't care about math mode, font weight, size, shape... it all just works.

  Finally, the kicker here is that it does not work in `xelatex`. I tried this and this and this and lastly this, and eventually failed horribly.
- Further, `pdflatex`, with its roots in probably the 80s, still to this day has strict and low memory limits. For more advanced computations, it will complain and claim to be out of memory (the good old `TeX capacity exceeded, sorry`), when in reality the host computer has gigabytes of RAM to spare. This led to TikZ externalization (the `external` library, enabled via `\tikzexternalize`), which puts TikZ pictures into their own compile jobs. That way, each job is much smaller and can succeed within the memory limits. Another way around the limitation is to manually increase the various memory limits to arbitrarily high values. That this is laborious, hacky and, well, arbitrary is quickly apparent.

  Even if we used externalization, it would truncate the features we can make use of. Since it exports the TikZ environments to outside PDFs and then inputs those, labels/references/links cannot really be supported. I noticed this when I wanted equation references as a legend (whether that is a good idea is another question ...). So externalization, with its many caveats (despite it being a brilliant solution to a problem we shouldn't even be having anymore nowadays), is off the table.

  `lualatex` simply solves all these concerns: it allocates memory as required. I don't know to what degree it does this (whether it can fill up a modern machine's memory), but I can compile our entire document, with countless TikZ environments and demanding plots (with high `samples=` values), without memory issues. As far as I know, `xelatex` doesn't do this, so forget about it.
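For reference, a contour setup along these lines (the `\ctrw` definition here is a guess at a plausible implementation, not necessarily the document's actual one):

```latex
\usepackage{contour}
\contourlength{0.08em}% thickness of the outline
% Hypothetical helper: white outline around its argument.
\newcommand*{\ctrw}[1]{\contour{white}{#1}}
% Usage, e.g. in a TikZ node sitting on a busy plot background:
% \node at (axis cs:1,1) {\ctrw{peak}};
```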
These two reasons are easily enough to choose `lualatex` over `xelatex`.
I don't have more reasons anyway; beyond these points, the two programs are quite similar.