# daly/axiom

Axiom is a free, open source computer algebra system.
Goal: Proving Axiom Sane

\index{Scott, Dana S.}
\begin{chunk}{axiom.bib}
@article{Scot93,
author = "Scott, Dana S.",
title = {{A Type-Theoretical Alternative to ISWIM, CUCH, OWHY}},
journal = "Theoretical Computer Science",
volume = "121",
number = "1-2",
year = "1993",
pages = "411-440",
abstract =
"The paper (first written in 1969 and circulated privately) concerns
the definition, axiomatization, and applications of the hereditarily
monotone and continuous functionals generated from the integers and
the Booleans (plus “undefined” elements). The system is formulated as
a typed system of combinators (or as a typed λ-calculus) with a
recursion operator (the least fixed-point operator), and its proof
rules are contrasted to a certain extent with those of the untyped
λ-calculus. For publication (1993), a new preface has been added, and
many bibliographical references and comments in footnotes have been
appended.",
paper = "Scot93.pdf",
keywords = "printed"
}

\end{chunk}

\index{Winkler, Franz}
\begin{chunk}{axiom.bib}
@inproceedings{Wink85,
author = "Winkler, Franz",
title = {{Reducing the Complexity of the Knuth-Bendix Completion
Algorithm: A ``Unification'' of Different Approaches}},
booktitle = "European Conference on Computer Algebra (EUROCAL 85)",
pages = "378-389",
publisher = "Springer",
year = "1985",
isbn = "978-3-540-15983-4",
abstract =
"The Knuth-Bendix completion procedure for rewrite rule systems is
of wide applicability in symbolic and algebraic computation.
Attempts to reduce the complexity of this completion algorithm are
reported in the literature. Already in their seminal 1967 paper
D.E. Knuth and P.B. Bendix have suggested to keep all the rules
interreduced during the execution of the algorithm. G. Huet has
presented a version of the completion algorithm in which every
rewrite rule is kept in reduced form with respect to all the other
rules in the system. Borrowing an idea of Buchberger's for the
completion of bases of polynomial ideals the author has proposed
in 1983 a criterion for detecting ``unnecessary'' critical
pairs. If a critical pair is recognized as unnecessary then one
need not apply the costly process of computing normal forms to
it. It has been unclear whether these approaches can be
combined. We demonstrate that it is possible to keep all the
rewrite rules interreduced and still use a criterion for
eliminating unnecessary critical pairs.",
paper = "Wink85.pdf",
keywords = "printed"
}

\end{chunk}

\index{Kandri-Rody, Abdelilah}
\index{Kapur, Deepak}
\index{Winkler, Franz}
\begin{chunk}{axiom.bib}
@inproceedings{Kand89,
author = "Kandri-Rody, Abdelilah and Kapur, Deepak and Winkler, Franz",
title = {{Knuth-Bendix Procedure and Buchberger Algorithm - A Synthesis}},
booktitle = "ISSAC'89",
publisher = "ACM Press",
pages = "55-67",
year = "1989",
isbn = "0-89791-325-6",
abstract =
"The Knuth-Bendix procedure for the completion of a rewrite rule
system and the Buchberger algorithm for computing a Groebner basis
of a polynomial ideal are very similar in two respects: they both
start with an arbitrary specification of an algebraic structure
(axioms for an equational theory and a basis for a polynomial
ideal, respectively) which is transformed to a very special
specification of this algebraic structure (a complete rewrite rule
system and a Groebner basis of the polynomial ideal, respectively).
This special specification allows one to decide many problems concerning
the given algebraic structure. Moreover, both algorithms achieve
their goals by employing the same basic concepts: formation of
critical pairs and completion.

Although the two methods are obviously related, the exact nature
of this relation remains to be clarified. Based on previous work
we show how the Knuth-Bendix procedure and the Buchberger algorithm
can be seen as special cases of a more general completion procedure.",
paper = "Kand89.pdf",
keywords = "printed"
}

\end{chunk}

\index{Winkler, Franz}
\begin{chunk}{axiom.bib}
@incollection{Wink92,
author = "Winkler, Franz",
title = {{Computer Algebra}},
booktitle = "Encyclopedia of Physical Science and Technology",
volume = "4",
year = "1992",
paper = "Wink92.pdf",
keywords = "printed"
}

\end{chunk}

\index{Winkler, Franz}
\begin{chunk}{axiom.bib}
@inproceedings{Wink95,
author = "Winkler, Franz",
title = {{Computer Algebra - Problems and Developments}},
booktitle = "Computers in Industry 2 (SCAFI'92)",
pages = "1-17",
year = "1995",
isbn = "978-0471955290",
paper = "Wink95.pdf"
}

\end{chunk}

\index{Winkler, Franz}
\begin{chunk}{axiom.bib}
@techreport{Wink99,
author = "Winkler, Franz",
title = {{Advances and Problems in Algebraic Computation}},
type = "technical report",
number = "99-49",
year = "1999",
institution = "RISC, Linz",
abstract =
"In the last years there has been dramatic progress in all areas
of algebraic computation. Many working mathematicians have access
to and actually use algebraic software systems for their research.
No end of this development is in sight. We report on some active
topics in algebraic computation, namely the theory of integration
and symbolic solution of systems of differential equations, the
solution of systems of algebraic equations, the factorization of
polynomials, and the design and analysis of algebraic curves and
surfaces.",
paper = "Wink99.pdf",
keywords = "printed"
}

\end{chunk}

\index{Winkler, Franz}
\begin{chunk}{axiom.bib}
@inproceedings{Wink06,
author = "Winkler, Franz",
title = {{Computer Algebra and Geometry - Some Interactions}},
booktitle = "Proc. of Coll. on Constructive Algebra and System
Theory (CAST)",
year = "2006",
publisher = "unknown",
isbn = "90-6984-477-x",
pages = "127-138",
abstract =
"Algebraic curves and surfaces have been studied intensively in
algebraic geometry for centuries. Thus, there exists a huge amount
of theoretical knowledge about these geometric objects. Recently,
algebraic curves and surfaces play an important and ever
increasing role in computer aided geometric design, computer
vision, computer aided manufacturing, coding theory, and
cryptography, just to name a few application areas. Consequently,
theoretical results need to be adapted to practical needs. We need
efficient algorithms for generating, representing, manipulating,
analyzing, rendering algebraic curves and surfaces. Exact computer
algebraic methods can be employed effectively for dealing with
these geometric problems.",
paper = "Wink06.pdf"
}

\end{chunk}

\index{Davies, Rowan}
\index{Pfenning, Frank}
\begin{chunk}{axiom.bib}
@article{Davi01,
author = "Davies, Rowan and Pfenning, Frank",
title = {{A Modal Analysis of Staged Computation}},
journal = "J. ACM",
volume = "48",
number = "3",
pages = "555-604",
year = "2001",
abstract =
"We show that a type system based on the intuitionistic modal logic S4
provides an expressive framework for specifying and analyzing
computation stages in the context of typed $\lambda$-calculi and
functional languages. We directly demonstrate the sense in which our
$\lambda_e^{\rightarrow\Box}$-calculus captures staging, and also give
a conservative embedding of Nielson and Nielson's two-level functional
language in our functional language Mini-ML${}^\Box$, thus proving that
binding-time correctness is equivalent to modal correctness on this
fragment. In addition, Mini-ML${}^\Box$ can also express immediate
evaluation and sharing of code across multiple stages, thus supporting
run-time code generation as well as partial evaluation.",
paper = "Davi01.pdf",
keywords = "printed"
}

\end{chunk}

\index{Pfenning, Frank}
\index{Davies, Rowan}
\begin{chunk}{axiom.bib}
@article{Pfen01,
author = "Pfenning, Frank and Davies, Rowan",
title = {{A Judgemental Reconstruction of Modal Logic}},
journal = "Mathematical Structures in Computer Science",
volume = "11",
pages = "511-540",
year = "2001",
abstract =
"We reconsider the foundations of modal logic, following Martin-L\"of's
methodology of distinguishing judgments from propositions. We give
constructive meaning explanations for necessity and possibility which
yield a simple and uniform system of natural deduction for
intuitionistic modal logic which does not exhibit anomalies found in
other proposals. We also give a new presentation of lax logic and find
that the lax modality is already expressible using possibility and
necessity. Through a computational interpretation of proofs in modal
logic we further obtain a new formulation of Moggi's monadic
metalanguage.",
paper = "Pfen01.pdf",
keywords = "printed"
}

\end{chunk}

\index{Lamport, Leslie}
\begin{chunk}{axiom.bib}
@misc{Lamp03,
author = "Lamport, Leslie",
title = {{The Future of Computing: Logic or Biology}},
link = "\url{https://lamport.azurewebsites.net/pubs/future-of-computing.pdf}",
year = "2003",
paper = "Lamp03.pdf",
keywords = "printed"
}

\end{chunk}

\index{Glymour, Clark}
\index{Serafin, Luke}
\begin{chunk}{axiom.bib}
@misc{Glym15,
author = "Glymour, Clark and Serafin, Luke",
title = {{Mathematical Metaphysics}},
year = "2015",
paper = "Glym15.pdf",
keywords = "printed"
}

\end{chunk}

\index{Reynolds, John C.}
\begin{chunk}{axiom.bib}
@article{Reyn93,
author = "Reynolds, John C.",
title = {{The Discoveries of Continuations}},
journal = "Lisp and Symbolic Computation",
volume = "6",
pages = "233-248",
year = "1993",
abstract =
"We give a brief account of the discoveries of continuations and
related concepts.",
paper = "Reyn93.pdf",
keywords = "printed"
}

\end{chunk}

\index{Oles, Frank Joseph}
\begin{chunk}{axiom.bib}
@phdthesis{Oles82,
author = "Oles, Frank Joseph",
title = {{A Category-Theoretic Approach to the Semantics of
Programming Languages}},
school = "Syracuse University",
year = "1982",
abstract =
"Here we create a framework for handling the semantics of fully
typed programming languages with imperative features, higher-order
ALGOL-like procedures, block structure, and implicit
conversions. Our approach involves introduction of a new family of
programming languages, the {\sl coercive typed $\lambda$-calculi},
denoted by $L$ in the body of the dissertation. By appropriately
choosing the linguistic constants (i.e. generators) of $L$, we can
view phrases of variants of ALGOL as syntactically sugared phrases
of $L$.

This dissertation breaks into three parts. In the first part,
consisting of the first chapter, we supply basic definitions and
motivate the idea that functor categories arise naturally in the
explanation of block structure and stack discipline. The second
part, consisting of the next three chapters, is dedicated to the
general theory of the semantics of the coercive typed
$\lambda$-calculus; the interplay between posets, algebras, and
Cartesian closed categories is particularly intense here. The
remaining four chapters make up the final part, in which we apply
the general theory to give both direct and continuation semantics
for desugared variants of ALGOL. To do so, it is necessary to show
certain functor categories are Cartesian closed and to describe a
category $\Sigma$ of store shapes. An interesting novelty in the
presentation of continuation semantics is the view that commands
form a procedural, rather than a primitive, phrase type.",
paper = "Oles82.pdf"
}

\end{chunk}

\index{Abdali, S. Kamal}
\index{Winkler, Franz}
\begin{chunk}{axiom.bib}
@techreport{Abda81,
author = "Abdali, S. Kamal and Winkler, Franz",
title = {{A Lambda-Calculus Model for Generating Verification Conditions}},
type = "technical report",
institution = "Rensselaer Polytechnic Institute",
number = "CS-8104",
year = "1981",
abstract =
"A lambda-calculus-based method is developed for the automatic
generation of verification conditions. A programming language is
specified in which inductive assertions associated with a program
are incorporated within the body of the program by means of assert
and maintain-while statements. This programming language includes
the following features: assignments, conditionals, loops,
compounds, and ALGOL-type block structure. A model is developed which
consists of rules to translate each statement in this programming
language into the lambda-calculus. The model is such that the
lambda-expression translation of any program reduces to a list
(tuple) of lambda-expression representations of all verification
conditions of the program. A proof of this property is given.",
paper = "Abda81.pdf",
keywords = "printed"
}

\end{chunk}

\index{Abdali, S. Kamal}
\begin{chunk}{axiom.bib}
@book{Abda73,
author = "Abdali, S. Kamal",
title = {{A Simple Lambda-Calculus Model of Programming Languages}},
isbn = "978-1332196029",
publisher = "Forgotten Books",
year = "1973",
abstract =
"We present a simple correspondence between a large subset of
ALGOL 60 language and lambda-calculus. With the aid of this
correspondence, a program can be translated into a single
lambda-expression. In general, the representation of a program is
specified by means of a system of simultaneous conversion
relations among the representations of the program
constituents. High-level programming language features are treated
directly, not in terms of the representations of machine-level
operations. The model includes input-output in such a way that
when the representation of a (convergent) program is applied to
the input item representations, the resulting combination reduces
to a tuple of the representations of the output items. This model
does not introduce any imperative notions into the calculus; the
descriptive programming constructs, such as expressions, and the
imperative ones, such as assignment and jumps, are translated
alike into pure lambda expressions. The applicability of the model
to the problems of proving program equivalence and correctness is
illustrated by means of simple examples.",
paper = "Abda73.pdf"
}

\end{chunk}

\index{Abdali, S. Kamal}
\begin{chunk}{axiom.bib}
@article{Abda74,
author = "Abdali, S. Kamal",
title = {{A Lambda-Calculus Model of Programming Languages -
I. Simple Constructs}},
journal = "J. of Computer Languages",
volume = "1",
pages = "287-301",
year = "1974",
abstract =
"A simple correspondence is presented between the elementary
constructs of programming languages and the lambda-calculus. The
correspondence is obtained by using intuitive, functional
interpretation of programming language constructs, completely
avoiding the notions of machine memory and address. In particular,
the concepts of program variable and assignments are accounted for
in terms of the concepts of mathematical variable and
substitution, respectively. Lambda-expression representations are
given of assignments, conditional and compound statements,
input-output, and blocks. Algol 60 is used as the illustrative
language.",
paper = "Abda74.pdf",
keywords = "printed"
}

\end{chunk}

\index{Abdali, S. Kamal}
\begin{chunk}{axiom.bib}
@article{Abda74a,
author = "Abdali, S. Kamal",
title = {{A Lambda-Calculus Model of Programming Languages -
II. Jumps and Procedures}},
journal = "J. of Computer Languages",
volume = "1",
pages = "303-320",
year = "1974",
abstract =
"The correspondence between programming languages and the
lambda-calculus presented in Part I of the paper is extended there
to include iteration statements, jumps, and procedures. Programs
containing loops are represented by lambda-expressions whose
components are specified recursively by means of systems of
simultaneous conversion relations. The representation of
call-by-name and side-effects in a program is accomplished without
any resort to the concepts of memory and address by making use of a
number of variables in addition to those declared by the programs;
operations can be simulated by pure substitution. The
applicability of the model to the problems of proving program
correctness and equivalence is demonstrated by means of examples.",
paper = "Abda74a.pdf",
keywords = "printed"
}

\end{chunk}

\index{Abdali, S. Kamal}
\begin{chunk}{axiom.bib}
@phdthesis{Abda74b,
author = "Abdali, S. Kamal",
title = {{A Combinatory Logic Model of Programming Languages}},
school = "University of Wisconsin",
year = "1974",
abstract =
"A simple correspondence is presented between a large subset of the
ALGOL 60 language and the combinatory logic.  With the aid of this
correspondence, a program can be translated into a single combinatory
object.  The combinatory object representing a program is specified,
in general, by means of a system of reduction relations among the
representations of the program constituents.  This object denotes, in
terms of the combinatory logic, the function that the program is
intended to compute.

The model has been derived by using intuitive, functional
interpretations of the constructs of programming languages, completely
avoiding the notions of machine command and address.  In particular,
the concepts of program variable, assignment, and procedure have been
accounted for in terms of the concepts of mathematical variable,
substitution, and function, respectively.

High-level programming language features are represented in the
combinatory logic directly, not in terms of the representations of
machine-level operations.  Input-output is treated in such a
manner that when the representation of a program is applied to the
representations of the input items, the resulting combination reduces
to a tuple of the representations of the output items.

The applicability of the model to the problems of
proving program equivalence and correctness is illustrated
by means of examples.",
paper = "Abda74b.pdf"
}

\end{chunk}

\index{Krall, Andreas}
\begin{chunk}{axiom.bib}
@misc{Kralxx,
author = "Krall, Andreas",
title = {{Implementation Techniques for Prolog}},
year = "unknown",
abstract =
"This paper is a short survey about currently used implementation
techniques for Prolog. It gives an introduction to unification and
resolution in Prolog and presents the memory model and a basic
execution model. These models are expanded to the Vienna Abstract
Machine (VAM) with its two versions, the VAM${}_{2p}$ and the
VAM${}_{1p}$, and the most famous abstract machine, the Warren
Abstract Machine (WAM). The continuation passing style model of
Prolog, binary Prolog, leads to the BinWAM. Abstract
interpretation can be applied to gather information about a
program. This information is used in the generation of very
specialized machine code and in optimizations like clause indexing
and instruction scheduling on each kind of abstract machine.",
paper = "Kralxx.pdf",
keywords = "printed"
}

\end{chunk}

\index{Abadi, Martin}
\index{Cardelli, Luca}
\index{Curien, Pierre-Louis}
\index{Levy, Jean-Jacques}
\begin{chunk}{axiom.bib}
@inproceedings{Abad90,
author = "Abadi, Martin and Cardelli, Luca and Curien, Pierre-Louis
and Levy, Jean-Jacques",
title = {{Explicit Substitutions}},
booktitle = "Symp. of Principles of Programming Languages",
publisher = "ACM",
year = "1990",
abstract =
"The $\lambda\sigma$-calculus is a refinement of the
$\lambda$-calculus where substitutions are manipulated
explicitly. The $\lambda\sigma$-calculus provides a setting for
studying the theory of substitutions, with pleasant mathematical
properties. It is also a useful bridge between the classical
$\lambda$-calculus and concrete implementations.",
keywords = "printed"
}

\end{chunk}

\index{Beeson, Michael}
\begin{chunk}{axiom.bib}
@article{Bees80a,
author = "Beeson, Michael",
title = {{Formalizing Constructive Mathematics: Why and How?}},
journal = "Lecture Notes in Mathematics",
volume = "873",
pages = "146-190",
year = "1980",
paper = "Bees80a.pdf"
}

\end{chunk}

\index{Goodman, Nicolas D.}
\begin{chunk}{axiom.bib}
@article{Good80,
author = "Goodman, Nicolas D.",
title = {{Reflections on Bishop's Philosophy of Mathematics}},
journal = "Lecture Notes in Mathematics",
volume = "873",
pages = "135-145",
year = "1980",
paper = "Good80.pdf",
keywords = "printed"
}

\end{chunk}

\index{Rosenkranz, Markus}
\begin{chunk}{axiom.bib}
@book{Rose07,
author = "Rosenkranz, Markus",
title = {{Gr\"obner Bases in Symbolic Analysis}},
publisher = "Walter de Gruyter, Berlin Germany",
year = "2007",
isbn = "978-3-11-019323-7",
keywords = "axiomref, TPDref"
}

\end{chunk}

\index{Daly, Timothy}
\begin{chunk}{axiom.bib}
@misc{Daly06,
author = "Daly, Timothy",
title = {{Literate Programming}},
year = "2006",
keywords = "axiomref, TPDref"
}

\end{chunk}

\index{Prengel, Alex}
\begin{chunk}{axiom.bib}
@misc{Pren16,
author = "Prengel, Alex",
title = {{MIT axiom-math_v8.14}},
year = "2016",
keywords = "axiomref, TPDref"
}

\end{chunk}

\index{Jenks, Richard D.}
\begin{chunk}{axiom.bib}
@article{Jenk11,
author = "Jenks, Richard D.",
title = {{The 2011 Richard D. Jenks Memorial Prize}},
journal = "ACM Communications in Computer Algebra",
volume = "45",
number = "4",
year = "2011",
abstract =
"The 2011 Richard D. Jenks Memorial Prize for Excellence in
Software Engineering Applied to Computer Algebra was presented by
members of the Prize Selection Committee and its chair, Erich
Kaltofen, at ISSAC and SNC in San Jose, CA, on June 9, 2011 to the
Maple Project at the University of Waterloo.",
paper = "Jenk11.pdf",
keywords = "axiomref"
}

\end{chunk}

\index{Jenks, Richard D.}
\begin{chunk}{axiom.bib}
@article{Jenk13,
author = "Jenks, Richard D.",
title = {{The 2013 Richard D. Jenks Memorial Prize}},
journal = "ACM Communications in Computer Algebra",
volume = "47",
number = "4",
year = "2013",
abstract =
"The 2013 Richard D. Jenks Memorial Prize for Excellence in
Software Engineering Applied to Computer Algebra was announced by
members of the Prize Selection Committee, Mark Giesbrecht
representing its chair Erich Kaltofen, at ISSAC in Boston, MA, on
June 28, 2013 to have been awarded to Professor William Arthur
Stein of the Sage Project at the University of Washington.",
paper = "Jenk13.pdf",
keywords = "axiomref"
}

\end{chunk}

\index{Jenks, Richard D.}
\begin{chunk}{axiom.bib}
@article{Jenk15,
author = "Jenks, Richard D.",
title = {{The 2015 Richard D. Jenks Memorial Prize}},
journal = "ACM Communications in Computer Algebra",
volume = "49",
number = "4",
year = "2015",
abstract =
"The 2015 Richard D. Jenks Memorial Prize was awarded on October
30, 2015 at the Fields Institute in Toronto during the Major
Thematic Program on Computer Algebra to Professor Victor Shoup for
NTL: A Library for doing Number Theory.",
paper = "Jenk15.pdf",
keywords = "axiomref"
}

\end{chunk}

\index{Jenks, Richard D.}
\begin{chunk}{axiom.bib}
@article{Jenk17,
author = "Jenks, Richard D.",
title = {{The 2017 Richard D. Jenks Memorial Prize}},
journal = "ACM Communications in Computer Algebra",
volume = "51",
number = "4",
year = "2017",
abstract =
"The 2017 Richard D. Jenks Memorial Prize for Excellence in
Software Engineering Applied to Computer Algebra has been awarded
to Stephen Wolfram for Wolfram-Alpha and Mathematica.",
paper = "Jenk17.pdf",
keywords = "axiomref"
}

\end{chunk}

\index{Lazard, Daniel}
\begin{chunk}{axiom.bib}
@article{Laza01,
author = "Lazard, Daniel",
title = {{Solving Systems of Algebraic Equations}},
journal = "ACM SIGSAM Bulletin",
volume = "35",
number = "3",
year = "2001",
abstract =
"Let $f_1,\ldots,f_k$ be $k$ multivariate polynomials which have
a finite number of common zeros in the algebraic closure of the ground
field, counting the common zeros at infinity. An algorithm is
given and proved which reduces the computations of these zeros to
the resolution of a single univariate equation whose degree is the
number of common zeros. This algorithm gives the whole algebraic
and geometric structure of the set of zeros (multiplicities,
conjugate zeros,...). When all the polynomials have the same degree,
the complexity of this algorithm is polynomial relative to the generic
number of solutions.",
paper = "Laza01.pdf"
}

\end{chunk}

\index{Abramov, S.A.}
\begin{chunk}{axiom.bib}
@article{Abra00,
author = "Abramov, S.A.",
title = {{A Note on the Number of Division Steps in the Euclidean
Algorithm}},
journal = "ACM SIGSAM",
volume = "34",
number = "4",
year = "2000",
paper = "Abra00.pdf",
keywords = "printed"
}

\end{chunk}
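The Abra00 note concerns how many division steps the Euclidean algorithm performs. A minimal sketch of counting those steps (the function name `gcd_steps` is ours, not from the paper; the Fibonacci worst case is the classical bound from Lam\'e's theorem):

```python
def gcd_steps(a, b):
    """Run the Euclidean algorithm on (a, b), returning the gcd and
    the number of division (remainder) steps performed."""
    steps = 0
    while b != 0:
        a, b = b, a % b   # one division step
        steps += 1
    return a, steps

# Consecutive Fibonacci numbers F_11 = 89, F_10 = 55 are the classic
# worst case: every quotient is 1, so no step makes fast progress.
print(gcd_steps(89, 55))   # (1, 9)
```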

\index{Essex, Christopher}
\index{Davison, Matt}
\index{Schulzky, Christian}
\begin{chunk}{axiom.bib}
@article{Esse00,
author = "Essex, Christopher and Davison, Matt and Schulzky, Christian",
title = {{Numerical Monsters}},
journal = "ACM SIGSAM",
volume = "34",
number = "4",
year = "2000",
abstract =
"When the results of certain computer calculations are shown to be not
simply incorrect but dramatically incorrect, we have a powerful reason
to be cautious about all computer-based calculations. In this paper we
present a 'Rogue's Gallery' of simple calculations whose correct
solutions are obvious to humans but whose numerical solutions are
incorrect in pathological ways. We call these calculations, which can
be guaranteed to wreak numerical mayhem across both software
packages and hardware platforms, 'Numerical Monsters'. Our monsters
can be used to provide deep insights into how computer calculations
fail, and we use them to engender appreciation for the subject of
numerical analysis in our students. Although these monsters are based
on well-understood numerical pathologies, even experienced numerical
analysts will find surprises in their behaviour and can use the
lessons they bring to become even better masters of their tools.",
paper = "Esse00.pdf"
}

\end{chunk}
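The Esse00 abstract describes simple calculations whose floating-point answers are dramatically wrong. One classic specimen of this kind (our illustrative example, not necessarily one from the paper) is catastrophic cancellation in $(1-\cos x)/x^2$, which should approach $1/2$ for small $x$:

```python
import math

x = 1.0e-8
# In double precision cos(1e-8) rounds to exactly 1.0, so the
# naive form collapses to 0 instead of the true limit 0.5.
naive = (1 - math.cos(x)) / x**2

# The half-angle rewrite 2*(sin(x/2)/x)**2 is algebraically
# identical but avoids subtracting nearly equal quantities.
stable = 2 * (math.sin(x / 2) / x) ** 2

print(naive, stable)   # 0.0 0.5
```

The two formulas agree as exact mathematics; only the rounded arithmetic distinguishes them, which is exactly the kind of cross-platform "monster" the paper catalogs.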

\index{Buchberger, Bruno}
\begin{chunk}{axiom.bib}
@article{Buch02,
author = "Buchberger, Bruno",
title = {{Computer Algebra: The End of Mathematics?}},
journal = "ACM SIGSAM",
volume = "36",
number = "1",
year = "2002",
abstract =
"Mathematical software systems, such as Mathematica, Maple, Derive,
and so on, are substantially based on enormous advances in the area of
mathematics known as Computer Algebra or Symbolic Mathematics. In
fact, everything taught in high school and in the first semesters of a
university mathematical education, is available in these systems 'at
the touch of the button'. Will mathematics become unnecessary because
of this? In the three sections of this essay, I answer this question
for non-mathematicians, for mathematicians and for (future) students
of mathematics.",
paper = "Buch02.pdf"
}

\end{chunk}

\index{Barnett, Michael P.}
\begin{chunk}{axiom.bib}
@article{Barn08,
author = "Barnett, Michael P.",
title = {{Reasoning in Symbolic Computation}},
journal = "ACM Communications in Symbolic Algebra",
volume = "42",
number = "1",
year = "2008",
abstract =
"I discuss notations for some styles of mathematical reasoning that
include analogy. These notations extend the conventions of the
mathematica package mathscape that I reported recently in the
Journal of Symbolic Computation. The paper introduces the reasoning
objects that I call 'precursors' and 'consequences lists'.",
paper = "Barn08.pdf",
keywords = "printed"
}

\end{chunk}

\index{Lichtblau, Daniel}
\begin{chunk}{axiom.bib}
@article{Lich11,
author = "Lichtblau, Daniel",
title = {{Symbolic Definite (and Indefinite) Integration: Methods and
Open Issues}},
journal = "ACM Comm. in Computer Algebra",
volume = "45",
number = "1",
year = "2011",
abstract = "
The computation of definite integrals presents one with a variety of
choices. There are various methods such as Newton-Leibniz or Slater's
convolution method. There are questions such as whether to split or
merge sums, how to search for singularities on the path of
integration, when to issue conditional results, how to assess
(possibly conditional) convergence, and more. These various
considerations moreover interact with one another in a multitude of
ways. Herein we discuss these various issues and illustrate
with examples.",
paper = "Lich11.pdf"
}

\end{chunk}

\index{Zengler, Christoph}
\index{Kubler, Andreas}
\index{Kuchlin, Wolfgang}
\begin{chunk}{axiom.bib}
@article{Zeng11,
author = "Zengler, Christoph and Kubler, Andreas and Kuchlin, Wolfgang",
title = {{New Approaches to Boolean Quantifier Elimination}},
journal = "ACM Comm. in Computer Algebra",
volume = "45",
number = "2",
year = "2011",
abstract =
"We present four different approaches for existential Boolean
quantifier elimination, based on model enumeration, resolution,
knowledge compilation with projection, and substitution. We point out
possible applications in the area of verification and we present
preliminary benchmark results of the different approaches.",
paper = "Zeng11.pdf",
keywords = "printed"
}

\end{chunk}
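The substitution-based approach mentioned in the Zeng11 abstract amounts to Shannon expansion: $\exists x.\,\varphi \equiv \varphi[x{:=}0] \vee \varphi[x{:=}1]$. A minimal sketch, representing formulas as Python predicates over an assignment dict (all names are ours, not from the paper):

```python
def eliminate_exists(phi, var):
    """Existential Boolean quantifier elimination by substitution
    (Shannon expansion): exists v. phi  ==  phi[v:=0] or phi[v:=1]."""
    return lambda env: phi({**env, var: False}) or phi({**env, var: True})

# phi(x, y, z) = (x or y) and (not x or z)
phi = lambda e: (e["x"] or e["y"]) and (not e["x"] or e["z"])
psi = eliminate_exists(phi, "x")   # psi depends only on y and z

print(psi({"y": False, "z": True}))    # True  (take x = True)
print(psi({"y": False, "z": False}))   # False (no choice of x works)
```

Each elimination doubles the formula, which is why the paper weighs this naive substitution scheme against model enumeration, resolution, and knowledge compilation.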

\index{Stoutemyer, David R.}
\begin{chunk}{axiom.bib}
@article{Stou11,
author = "Stoutemyer, David R.",
title = {{Ways to Implement Computer Algebra Compactly}},
journal = "ACM Comm. in Computer Algebra",
volume = "45",
number = "4",
year = "2011",
abstract =
"Computer algebra had to be implemented compactly to fit on early
personal computers and hand-held calculators. Compact implementation
is still important for portable hand-held devices. Also, compact
implementation increases comprehensibility while decreasing
development and maintenance time and cost, regardless of the
platform. This
article describes several ways to achieve compact implementations,
including:
\begin{itemize}
\item Exploit evaluation followed by interpolation to avoid
implementing a parser, such as in PicoMath
\item Use contiguous storage as an expression stack to avoid garbage
collection and pointer-space overhead, such as in Calculus Demon
and TI-Math-Engine
\item Use various techniques for saving code space for linked-storage
representation of expressions and functions, such as muMath and Derive
\end{itemize}",
paper = "Stou11.pdf",
keywords = "axiomref"
}

\end{chunk}

\index{Roche, Daniel S.}
\begin{chunk}{axiom.bib}
@article{Roch11,
author = "Roche, Daniel S.",
title = {{Chunky and Equal-Spaced Polynomial Multiplication}},
journal = "J. Symbolic Computation",
volume = "46",
pages = "791-806",
year = "2011",
abstract =
"Finding the product of two polynomials is an essential and basic
problem in computer algebra. While most previous results have
focused on the worst-case complexity, we instead employ the
technique of adaptive analysis to give an improvement in many
``easy'' cases. We present two adaptive measures and methods
for polynomial multiplication, and also show how to effectively
combine them to gain both advantages. One useful feature of these
algorithms is that they essentially provide a gradient between
existing ``sparse'' and ``dense'' methods. We prove that these
approaches provide significant improvements in many cases but
in the worst case are still comparable to the fastest existing
algorithms.",
paper = "Roch11.pdf"
}

\end{chunk}
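Roch11 positions its adaptive algorithms as a gradient between the standard dense and sparse representations. For contrast, here are minimal sketches of those two baselines (not the paper's chunky or equal-spaced methods):

```python
def mul_dense(a, b):
    """Schoolbook product of dense coefficient lists:
    cost O(len(a) * len(b)), regardless of how many terms are zero."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def mul_sparse(a, b):
    """Product of {exponent: coefficient} dicts: cost scales with the
    number of nonzero terms, independent of the degree."""
    out = {}
    for i, ai in a.items():
        for j, bj in b.items():
            out[i + j] = out.get(i + j, 0) + ai * bj
    return out

print(mul_dense([1, 1], [1, 1]))   # [1, 2, 1], i.e. (1 + x)^2
print(mul_sparse({0: 1, 1000: 1}, {0: 1, 1000: 1}))
```

For $(1 + x^{1000})^2$ the sparse product touches 4 term pairs while a dense product would scan roughly $10^6$ coefficient pairs; the paper's adaptive measures choose between such behaviors per input.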

\index{Bailey, David H.}
\index{Borwein, Jonathan M.}
\index{Kaiser, Alexander D.}
\begin{chunk}{axiom.bib}
@article{Bail14,
author = "Bailey, David H. and Borwein, Jonathan M. and
Kaiser, Alexander D.",
title = {{Automated Simplification of Large Symbolic Expressions}},
journal = "J. Symbolic Computation",
volume = "60",
pages = "120-136",
year = "2014",
abstract =
"We present a set of algorithms for automated simplification of
symbolic constants of the form
$\sum_i\alpha_i x_i$ with $\alpha_i$ rational and $x_i$
complex. The included algorithms, called SimplifySum and
implemented in Mathematica, remove redundant terms, attempt to make
terms and the full expression real, and remove terms using repeated
application of the multipair PSLQ integer relation detection
algorithm. Also included are facilities for making substitutions
according to user-specified identities. We illustrate this toolset by
giving some real-world examples of its usage, including one, for
instance, where the tool reduced a symbolic expression of
approximately 100 000 characters in size enough to enable manual
manipulation to one with just four simple terms.",
paper = "Bail14.pdf"
}

\end{chunk}

\index{Bailey, David H.}
\index{Borwein, Jonathan M.}
\begin{chunk}{axiom.bib}
@article{Bail11,
author = "Bailey, David H. and Borwein, Jonathan M.",
title = {{High-precision Numerical Integration: Progress and Challenges}},
journal = "J. Symbolic Computation",
volume = "46",
pages = "741-754",
year = "2011",
abstract =
"One of the most fruitful advances in the field of experimental
mathematics has been the development of practical methods for very
high-precision numerical integration, a quest initiated by Keith
Geddes and other researchers in the 1980s and 1990s. These
techniques, when coupled with equally powerful integer relation
detection methods, have resulted in the analytic evaluation of many
integrals that previously were beyond the realm of symbolic
techniques. This paper presents a survey of the current state of the art
in this area (including results by the present authors and others),
mentions some new results, and then sketches what challenges lie
ahead.",
paper = "Bail11.pdf"
}

\end{chunk}

\index{Huang, Zongyan}
\index{England, Matthew}
\index{Wilson, David}
\index{Davenport, James H.}
\index{Paulson, Lawrence C.}
\begin{chunk}{axiom.bib}
@article{Huan14,
author = "Huang, Zongyan and England, Matthew and Wilson, David and
Davenport, James H. and Paulson, Lawrence C.",
title = {{A Comparison of Three Heuristics to Choose the Variable
Ordering for Cylindrical Algebraic Decomposition}},
journal = "ACM Comm. in Computer Algebra",
volume = "48",
number = "3",
year = "2014",
abstract =
"Cylindrical algebraic decomposition (CAD) is a key tool for problems
in real algebraic geometry and beyond. When using CAD there is often a
choice over the variable ordering to use, with some problems
infeasible in one ordering but simple in another. Here we discuss a
recent experiment comparing three heuristics for making this choice on
thousands of examples.",
paper = "Huan14.pdf"
}

\end{chunk}

\index{Hamada, Tatsuyoshi}
\begin{chunk}{axiom.bib}
@article{Hama14,
author = "Hamada, Tatsuyoshi",
title = {{MathLibre: Personalizable Computer Environment for
Mathematical Research}},
journal = "ACM Comm. in Computer Algebra",
volume = "48",
number = "3",
year = "2014",
abstract =
"MathLibre is a project to archive free mathematical software and free
mathematical documents and offer them on Live Linux system. MathLibre
Project is the direct descendant of KNOPPIX/Math Project. It provides
a desktop for mathematicians that can be set up easily and quickly.",
paper = "Hama14.pdf"
}

\end{chunk}

\index{Fateman, Richard}
\begin{chunk}{axiom.bib}
@article{Fate14a,
author = "Fateman, Richard",
title = {{Algorithm Differentiation in Lisp: ADIL}},
journal = "ACM Comm. in Computer Algebra",
volume = "48",
number = "3",
year = "2014",
abstract =
"Algorithm differentiation (AD) is a technique used to transform a
program $F$ computing a numerical function of one argument $F(x)$ into
another program $G(p)$ that returns a pair,
$\langle F (p), F^\prime(p)\rangle$ where by $F^\prime(p)$
we mean the derivative of $F$ with respect to its argument $x$,
evaluated at $x = p$. That is, we have a program AD that takes as input
a program, and returns another: $G := AD(F)$. Over the years AD
programs have been developed to allow $F$ to be expressed in some
specialized variant of a popular programming language $L$ (FORTRAN, C,
Matlab, Python) and where $G$ is delivered in that language $L$ or some
other. Alternatively, executing $F(p)$ in some environment will deliver
$\langle F^\prime(p), F(p)\rangle$ directly. AD tools have also
been incorporated in computer algebra systems (CAS) such as
Maple. A CAS is hardly necessary for the task of writing the AD
program, since the main requirement is a set of tools for
manipulation of an internal (typically tree) form of a program.
In Lisp, a normal program is already in this form and so the AD
program in Lisp (ADIL), the target $F$ and the product $G$ can
all be expressed compactly in Lisp. In spite of the brevity and
extensibility of the ADIL program, we can provide
features which are unsupported in other AD programs. In particular,
recursive functions are easily accommodated. Our perspective here is
to point out that for scientists who write programs in Lisp or any
language that can be converted to Lisp, AD is easily at hand.",
paper = "Fate14a.pdf"
}

\end{chunk}
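The pair-returning transformation described in the abstract can be mimicked in a few lines of Python using operator overloading (dual numbers) rather than Fateman's Lisp source transformation; the names `Dual` and `AD` are ours, chosen to echo the abstract's $G := AD(F)$:

```python
class Dual:
    """Carries the pair <F(p), F'(p)>: a value and its derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def AD(F):
    """G := AD(F); G(p) returns the pair (F(p), F'(p))."""
    def G(p):
        r = F(Dual(p, 1.0))   # seed dx/dx = 1
        return (r.val, r.dot)
    return G

# recursive functions are accommodated, as the abstract stresses
def power(x, n):
    return 1 if n == 0 else x * power(x, n - 1)

G = AD(lambda x: power(x, 3))
```

Here `G(2.0)` returns `(8.0, 12.0)`, i.e. $x^3$ and $3x^2$ at $x = 2$.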

\index{England, M.}
\index{Cheb-Terrab, E.}
\index{Davenport, J.H.}
\index{Wilson, D.}
\begin{chunk}{axiom.bib}
@article{Engl14c,
author = "England, M. and Cheb-Terrab, E. and Bradford, R. and
Davenport, J.H. and Wilson, D.",
title = {{Branch Cuts in MAPLE 17}},
journal = "ACM Comm. in Computer Algebra",
volume = "48",
number = "1",
year = "2014",
abstract =
"Accurate and comprehensible knowledge about the position of branch
cuts is essential for correctly working with multi-valued functions,
such as the square root and logarithm. We discuss the new tools in
Maple 17 for calculating and visualising the branch cuts of such
functions, and others built up from them. The cuts are described in an
intuitive and accurate form, offering substantial improvement on the
descriptions previously available.",
paper = "Engl14c.pdf"
}

\end{chunk}

\index{Stoutemyer, David R.}
\begin{chunk}{axiom.bib}
@article{Stou13,
author = "Stoutemyer, David R.",
title = {{A Computer Algebra User Interface Manifesto}},
journal = "ACM Communications in Computer Algebra",
volume = "47",
number = "4",
year = "2013",
abstract =
"Many computer algebra systems have more than 1000 built-in functions.
This article describes a proposed interactive general-purpose wizard for organizing
optional transformations and allowing easy fine grain control over the
form of the result – even by amateurs. This wizard integrates ideas
including:
\begin{itemize}
\item flexible subexpression selection;
\item complete control over the ordering of variables and
commutative operands, with well-chosen defaults;
\item interleaving the choice of successively less main
variables with applicable function choices to provide detailed
control without incurring a combinatorial number of applicable
alternatives at any one level;
\item quick applicability tests to reduce the listing of
inapplicable transformations;
\item using an organizing principle to order the alternatives;
\item  labeling quickly-computed alternatives in dialog boxes
with a preview of their results, using ellipsis elisions if
necessary;
\item allowing the user to retreat from a sequence of choices
to explore other branches of the tree of alternatives – or to
return quickly to branches already visited;
\item allowing the user to accumulate more than one of the
alternative forms;
\item integrating direct manipulation into the wizard; and
\item supporting not only the usual input-result pair mode, but
also the useful alternative derivational and in situ replacement
modes in a unified window.
\end{itemize}",
paper = "Stou13.pdf",
keywords = "printed"
}

\end{chunk}

\index{Sorenson, Jonathan}
\begin{chunk}{axiom.bib}
@article{Sore94,
author = "Sorenson, Jonathan",
title = {{Two Fast GCD Algorithms}},
journal = "Journal of Algorithms",
volume = "16",
pages = "110-144",
year = "1994",
abstract =
"We present two new algorithms for computing greatest common
divisors: the right- and left-shift $k$-ary GCD algorithms. These
algorithms are generalizations of the binary and left-shift binary
algorithms. Interestingly, both new algorithms have sequential
versions that are practical and efficient and parallel versions
that rival the best previous parallel GCD algorithms. We show that
sequential versions of both algorithms take $O(n^2/log~n)$ bit
operations in the worst case to compute the GCD of two n-bit
integers. This compares favorably to the Euclidean and binary
algorithms. In multiple-precision versions of these GCD algorithms,
we found that both
$k$-ary algorithms are faster than the Euclidean and binary
algorithms on inputs of 100 to 1000 decimal digits in length. We
show that parallel versions of both algorithms match the
complexity of the best previous parallel GCD algorithm due to Chor
and Goldreich. Specifically, if $log~n\le k \le 2^n$ and $k$ is a
power of two, then both algorithms run in $O(n/log~n+log^2~n)$
time using a number of processors bounded by a polynomial in $n$
and $k$ on a common CRCW PRAM. We also discuss extended versions
of both algorithms.",
paper = "Sore94.pdf",
keywords = "printed"
}

\end{chunk}
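For reference, the binary algorithm that the right- and left-shift $k$-ary algorithms generalize can be sketched as follows (a standard textbook version in Python, not Sorenson's $k$-ary method):

```python
def binary_gcd(u, v):
    """Binary GCD: uses only shifts and subtractions, no division."""
    if u == 0:
        return v
    if v == 0:
        return u
    # gcd(2u', 2v') = 2 gcd(u', v'): factor out common powers of two
    shift = 0
    while (u | v) & 1 == 0:
        u >>= 1
        v >>= 1
        shift += 1
    while u & 1 == 0:          # gcd(2u', v) = gcd(u', v) for odd v
        u >>= 1
    while v != 0:
        while v & 1 == 0:
            v >>= 1
        if u > v:
            u, v = v, u        # keep u <= v
        v -= u                 # gcd(u, v) = gcd(u, v - u); v - u is even
    return u << shift
```

E.g. `binary_gcd(48, 36)` yields 12.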

\index{Weber, Kenneth}
\index{Trevisan, Vilmar}
\index{Martins, Luiz Felipe}
\begin{chunk}{axiom.bib}
@article{Webe05a,
author = "Weber, Kenneth and Trevisan, Vilmar and Martins, Luiz Felipe",
title = {{A Modular Integer GCD Algorithm}},
journal = "J. of Algorithms",
volume = "54",
year = "2005",
pages = "152-167",
abstract =
"This paper describes the first algorithm to compute the greatest
common divisor (GCD) of two $n$-bit integers using a modular
representation for intermediate values $U$, $V$ and also for the
result. It is based on a reduction step, similar to one used in
the accelerated algorithm when $U$ and $V$ are close to the same
size, that replaces $U$ by $(U-bV)/p$, where $p$ is one of the
prime moduli and $b$ is the unique integer in the interval
$(-p/2,p/2)$ such that $b\equiv UV^{-1} (mod~p)$. When the
algorithm is executed on a bit common CRCW PRAM with
$O(n~log~n~log~log~log~n)$ processors, it takes $O(n)$ time in
the worst case. A heuristic model of the average case yields
$O(n/log~n)$ time on the same number of processors.",
paper = "Webe05a.pdf",
keywords = "printed"
}

\end{chunk}
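The reduction step quoted in the abstract is concrete enough to sketch; the function name is ours, and this shows only the single step $U \leftarrow (U-bV)/p$, not the full modular GCD algorithm:

```python
def modular_reduction_step(U, V, p):
    """One reduction step as described in the abstract: replace U by
    (U - b*V)/p, where b is the representative of U * V^{-1} (mod p)
    in (-p/2, p/2).  Assumes p is an odd prime not dividing V, so V
    is invertible mod p.  (pow(V, -1, p) needs Python 3.8+.)"""
    b = (U * pow(V, -1, p)) % p
    if b > p // 2:
        b -= p                      # move b into the interval (-p/2, p/2)
    assert (U - b * V) % p == 0     # divisible by p by choice of b
    return (U - b * V) // p
```

For example, with $U=13$, $V=7$, $p=5$ we get $b=-1$ and the step returns $(13+7)/5 = 4$.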

\index{Schonhage, A.}
\begin{chunk}{axiom.bib}
@article{Scho88,
author = "Schonhage, A.",
title = {{Probabilistic Computation of Integer Polynomial GCDs}},
journal = "J. of Algorithms",
volume = "9",
pages = "365-371",
year = "1988",
abstract =
"We describe a probabilistic algorithm for the computation of the
gcd of two univariate integer polynomials of degrees $\le n$ with
their $l^1$-norms being bounded by $2^h$ and estimate its expected
running time by a worst-case bound of
$O(n(n+h)^{1+\epsilon})$.",
paper = "Scho88.pdf",
keywords = "printed"
}

\end{chunk}

\index{LeGuin, Ursula K.}
\begin{chunk}{axiom.bib}
@book{Legu76,
author = "LeGuin, Ursula K.",
title = {{The Left Hand of Darkness}},
publisher = "Penguin Random House",
year = "1976",
isbn = "978-1-101-66539-8"
}

\end{chunk}

\index{Shulman, Michael}
\begin{chunk}{axiom.bib}
@misc{Shul18,
author = "Shulman, Michael",
title = {{Linear Logic for Constructive Mathematics}},
year = "2018",
abstract =
"We show that numerous distinctive concepts of constructive
mathematics arise automatically from an interpretation of 'linear
higher-order logic' into intuitionistic higher-order logic via a Chu
construction.  This includes apartness relations, complemented
subsets, anti-subgroups and anti-ideals, strict and non-strict order
pairs, cut-valued metrics, and apartness spaces. We also explain the
constructive bifurcation of classical concepts using the choice
between multiplicative and additive linear connectives. Linear logic
thus systematically 'constructivizes' classical definitions and deals
automatically with the resulting bookkeeping, and could potentially
be used directly as a basis for constructive mathematics in place
of intuitionistic logic.",
paper = "Shul18.pdf",
keywords = "printed"
}

\end{chunk}

\index{Khan, Muhammad Taimoor}
\begin{chunk}{axiom.bib}
@techreport{Khan13,
author = "Khan, Muhammad Taimoor",
title = {{On the Formal Verification of Maple Programs}},
year = "2013",
type = "technical report",
number = "13-07",
institution = "RISC Linz",
abstract =
"In this paper, we present an example-based demonstration of our
recent results on the formal verification of programs written in the
computer algebra language (Mini)Maple (a slightly modified subset of
Maple).  The main goal of this work is to develop a verification
framework for behavioral analysis of MiniMaple programs.  For
verification, we translate an annotated MiniMaple program into the
language Why3ML of the intermediate verification tool Why3 developed
at LRI, France.  We generate verification conditions by the
corresponding component of Why3 and later prove the correctness of
these conditions with the various automatic and interactive theorem
provers supported by the Why3 back-end.  We have used the verification
framework to verify some parts of the main test example of our
verification framework, the Maple package DifferenceDifferential
developed at our institute to compute bivariate difference-differential
polynomials using relative Gröbner bases.",
paper = "Khan13.pdf",
keywords = "printed"
}

\end{chunk}

\index{Daly, Timothy}
\begin{chunk}{axiom.bib}
@misc{Daly03,
author = "Daly, Timothy",
title = {{Axiom Website: http://axiom-developer.org}},
keywords = "axiomref",
year = "2003"
}

\end{chunk}

\index{Dijkstra, Edsger W.}
\begin{chunk}{axiom.bib}
@article{Dijk75,
author = "Dijkstra, Edsger W.",
title = {{Guarded Commands, Nondeterminacy and Formal Derivation
of Programs}},
journal = "Comm. of the ACM",
volume = "18",
number = "8",
year = "1975",
pages = "453-457",
abstract =
"So-called 'guarded commands' are introduced as a building block
for alternative and repetitive constructs that allow
nondeterministic program components for which at least the
activity evoked, but possibly even the final state, is not
necessarily uniquely determined by the initial state. For the
formal derivation of programs expressed in terms of these
constructs, a calculus will be shown.",
paper = "Dijk75.pdf",
keywords = "printed"
}

\end{chunk}
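Dijkstra's repetitive construct `do ... od` can be simulated directly; the gcd-by-subtraction system below is in the style of the paper's examples, with our own helper names (`do_od`, `gcd_rules`):

```python
import random

def do_od(state, guarded_commands):
    """Repetitive construct: while any guard holds, nondeterministically
    pick one enabled command and apply it; terminate when all guards fail."""
    while True:
        enabled = [cmd for guard, cmd in guarded_commands if guard(state)]
        if not enabled:
            return state                     # all guards false: od reached
        state = random.choice(enabled)(state)  # nondeterministic choice

# do  x > y -> x := x - y  []  y > x -> y := y - x  od
gcd_rules = [
    (lambda s: s[0] > s[1], lambda s: (s[0] - s[1], s[1])),
    (lambda s: s[1] > s[0], lambda s: (s[0], s[1] - s[0])),
]
```

Running `do_od((12, 30), gcd_rules)` terminates with `(6, 6)`: whichever guards fire along the way, the final state is determined here, illustrating that nondeterminism in the activity evoked need not mean nondeterminism in the result.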

\index{van Emden, M.H.}
\index{Kowalski, R.A.}
\begin{chunk}{axiom.bib}
@article{Emde76,
author = "van Emden, M.H. and Kowalski, R.A.",
title = {{The Semantics of Predicate Logic as a Programming Language}},
journal = "J. Association for Computing Machinery",
volume = "23",
number = "4",
year = "1976",
pages = "733-742",
abstract =
"Sentences in first-order predicate logic can be usefully interpreted
as programs. In this paper the operational and fixpoint semantics of
predicate logic programs are defined, and the connections with the
proof theory and model theory of logic are investigated. It is
concluded that operational semantics is a part of proof theory and
that fixpoint semantics is a special case of model-theoretic
semantics.",
paper = "Emde76.pdf",
keywords = "printed"
}

\end{chunk}

\index{Tarski, Alfred}
\begin{chunk}{axiom.bib}
@article{Tars69,
author = "Tarski, Alfred",
title = {{Truth and Proof}},
journal = "Scientific American",
volume = "220",
number = "6",
pages = "63-77",
year = "1969",
paper = "Tars69.pdf",
keywords = "printed"
}

\end{chunk}

\index{Apt, Krzysztof R.}
\index{Bezem, Marc}
\begin{chunk}{axiom.bib}
@incollection{Aptx99,
author = "Apt, Krzysztof R. and Bezem, Marc",
title = {{Formulas as Programs}},
booktitle = "The Logic Programming Paradigm",
year = "1999",
publisher = "Springer Berlin Heidelberg",
pages = "75-107",
isbn = "978-3-642-60085-2",
abstract =
"We provide here a computational interpretation of first-order
logic based on a constructive interpretation of satisfiability
w.r.t. a fixed but arbitrary interpretation. In this approach the
formulas themselves are programs. This contrasts with the
so-called formulas as types approach in which the proofs of the
formulas are typed terms that can be taken as programs. This view
of computing is inspired by logic programming and constraint logic
programming but differs from them in a number of crucial aspects.

Formulas as programs is argued to yield a realistic approach to
programming that has been realized in the implemented programming
language Alma-0 (Apt, Brunekreef, Partington and Schaerf, 1998)
that combines the advantages of imperative and logic
programming. The work here reported can also be used to reason
about the correctness of non-recursive Alma-0 programs that do not
include destructive assignment.",
paper = "Aptx99.pdf",
keywords = "printed"
}

\end{chunk}

\index{Mazur, Barry}
\begin{chunk}{axiom.bib}
@misc{Mazu07,
author = "Mazur, Barry",
title = {{When is One Thing Equal to Some Other Thing?}},
year = "2007",
paper = "Mazu07.pdf"
}

\end{chunk}

\index{Kant, Immanuel}
\begin{chunk}{axiom.bib}
@book{Kant98,
author = "Kant, Immanuel",
title = {{The Critique of Pure Reason}},
publisher = "Cambridge University Press",
year = "1998",
isbn = "0-521-35402-1",
paper = "Kant98.pdf"
}

\end{chunk}

\index{Smith, A.}
\begin{chunk}{axiom.bib}
@techreport{Smit90,
author = "Smith, A.",
title = {{The Knuth-Bendix Completion Algorithm and its Specification in Z}},
type = "technical report",
institution = "Ministry of Defence, RSRE Malvern WORCS",
year = "1990",
number = "RSRE Memorandum 4323",
abstract =
"Proving that something is a consequence of a set of axioms can be
very difficult. The Knuth-Bendix completion algorithm attempts to
automate this process when the axioms are equations. The algorithm
is bound up in the area of term rewriting, and so this memorandum
contains an introduction to the theory of term rewriting, followed
by an overview of the algorithm. Finally a formal specification of
the algorithm is given using the language Z.",
paper = "Smit90.pdf",
keywords = "printed"
}

\end{chunk}

\index{Dick, A.J.J.}
\begin{chunk}{axiom.bib}
@techreport{Dick88,
author = "Dick, A.J.J.",
title = {{Automated Equational Reasoning and the Knuth-Bendix Algorithm:
An Informal Introduction}},
year = "1988",
type = "technical report",
institution = "Rutherford Appleton Laboratory",
number = "RAL-88-043",
paper = "Dick88.pdf",
keywords = "printed"
}

\end{chunk}

\index{Dick, Jeremy}
\begin{chunk}{axiom.bib}
@article{Dick91,
author = "Dick, Jeremy",
title = {{An Introduction to Knuth-Bendix Completion}},
journal = "The Computer Journal",
volume = "34",
number = "1",
year = "1991",
abstract =
"An informal introduction is given to the underlying concepts of
term rewriting. Topics covered are Knuth-Bendix completion,
completion modulo equations, unfailing completion and theorem
proving by completion.",
paper = "Dick91.pdf",
keywords = "printed"
}

\end{chunk}
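The basic operation underlying completion, rewriting a term to normal form with respect to a set of rules, can be sketched for string rewriting (a toy illustration of term rewriting, not Knuth-Bendix completion itself; the names are ours):

```python
def normal_form(term, rules, limit=1000):
    """Repeatedly apply string rewrite rules l -> r until no rule
    applies.  Termination is assumed, cf. Dershowitz's survey; the
    limit guards against a non-terminating rule set."""
    for _ in range(limit):
        for lhs, rhs in rules:
            if lhs in term:
                term = term.replace(lhs, rhs, 1)  # rewrite one redex
                break
        else:
            return term          # no rule applies: term is in normal form
    raise RuntimeError("rewrite limit exceeded; system may not terminate")

# toy identity-element rules: 'e' acts as a unit on either side
rules = [("ae", "a"), ("ea", "a")]
```

Here `normal_form("eae", rules)` rewrites to `"a"`. Completion enters when two rules overlap and distinct normal forms become possible; the completion procedure adds rules until normal forms are unique.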

\index{Knuth, Donald E.}
\index{Bendix, Peter B.}
\begin{chunk}{axiom.bib}
@incollection{Knut70,
author = "Knuth, Donald E. and Bendix, Peter B.",
title = {{Simple Word Problems in Universal Algebras}},
booktitle = "Computational Problems in Abstract Algebra",
editor = "Leech, John",
publisher = "Pergamon Press",
isbn = "08-012975-7",
year = "1970",
pages = "263-298",
paper = "Knut70.pdf",
keywords = "printed"
}

\end{chunk}

\index{Leech, John}
\begin{chunk}{axiom.bib}
@book{Leec70,
author = "Leech, John",
title = {{Computational Problems in Abstract Algebra}},
publisher = "Pergamon Press",
year = "1970",
isbn = "08-012975-7",
paper = "Leec70.pdf"
}

\end{chunk}

\index{Dershowitz, Nachum}
\begin{chunk}{axiom.bib}
@article{Ders87,
author = "Dershowitz, Nachum",
title = {{Termination of Rewriting}},
journal = "J. Symbolic Computation",
volume = "3",
year = "1987",
pages = "69-116",
abstract =
"This survey describes methods for proving that systems of rewrite
rules are terminating programs.  We illustrate the use in termination
proofs of various kinds of orderings on terms, including polynomial
interpretations and path orderings. The effect of restrictions, such
as linearity, on the form of rules is also considered.  In general,
however, termination is an undecidable property of rewrite systems.",
paper = "Ders87.pdf",
keywords = "printed"
}

\end{chunk}

\index{Atkey, Robert}
\begin{chunk}{axiom.bib}
@misc{Atke18,
author = "Atkey, Robert",
title = {{Parameterised Notions of Computation}},
year = "2018",
abstract =
"Moggi’s Computational Monads and Power et al.’s equivalent notion of
Freyd category have captured a large range of computational effects
present in programming languages. Examples include non-termination,
non-determinism, exceptions, continuations, side-effects and
input/output. We present generalisations of both computational monads
and Freyd categories, which we call parameterised monads and
parameterised Freyd categories, that also capture computational
effects with parameters. Examples of such are composable
continuations, side-effects where the type of the state varies and
input/output where the range of inputs and outputs varies. By also
considering structured parameterisation, we extend the range of
effects to cover separated side-effects and multiple independent
streams of I/O. We also present two typed $\lambda$-calculi that
soundly and completely model our categorical definitions — with and
without symmetric monoidal parameterisation — and act as prototypical
languages with parameterised effects.",
paper = "Atke18.pdf",
keywords = "printed"
}

\end{chunk}

\index{Bozman, Gerald}
\index{Buco, William}
\index{Daly, Timothy}
\index{Tetzlaff, William}
\begin{chunk}{axiom.bib}
@article{Bozm84,
author = "Bozman, Gerald and Buco, William and Daly, Timothy and
Tetzlaff, William",
title = {{Analysis of Free Storage Algorithms - Revisited}},
journal = "IBM Systems Journal",
volume = "23",
number = "1",
year = "1984",
abstract =
"Most research in free-storage management has centered around
strategies that search a linked list and strategies that partition
storage into predetermined sizes. Such algorithms are analyzed in
terms of CPU efficiency and storage efficiency. The subject of this
study is the free-storage management in the Virtual Machine/System
Product (VM/SP) system control program. As a part of this study,
simulations were done of established, and proposed, dynamic storage
algorithms for the VM/SP operating system. Empirical evidence is given
that simplifying statistical assumptions about the distribution of
interarrival times and holding times has high predictive
ability. Algorithms such as first-fit, modified first-fit, and
best-fit are found to be CPU-inefficient. Buddy systems are found to
be very fast but suffer from a high degree of internal
fragmentation. A form of extended subpooling is shown to be as fast as
buddy systems with improved storage efficiency. This algorithm was
implemented for VM/SP, and then measured. Results for this algorithm
are given for several production VM/SP systems.",
paper = "Bozm84.pdf"
}

\end{chunk}
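The first-fit strategy that the paper measures (and finds CPU-inefficient, since it scans a linked list on every request) can be sketched as follows; representing free holes as `(address, size)` pairs is our simplification:

```python
def first_fit(free_list, request):
    """Scan the free list and carve the request out of the first
    sufficiently large hole.  Returns (address, updated free list),
    or (None, free list) when no hole is large enough."""
    for i, (addr, size) in enumerate(free_list):
        if size >= request:
            remainder = free_list[:i] + free_list[i + 1:]
            if size > request:
                # split the hole, keeping the unused tail on the list
                remainder.insert(i, (addr + request, size - request))
            return addr, remainder
    return None, free_list
```

E.g. `first_fit([(0, 8), (16, 32)], 10)` skips the 8-byte hole and returns address 16, leaving `[(0, 8), (26, 22)]`. The linear scan is what makes the CPU cost grow with free-list length, motivating the subpooling scheme the paper implements.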

"What matters the most is what you do for free" -- John Gorka

"...even more important, for the progress of mathematics in the computer
age, is the beaver, who will build the needed infrastructure of computer
mathematics, that would eventually enable us to solve many outstanding
open problems, and many new ones. Consequently, the developers of computer
algebra systems, and creators of algorithms, are even more important than
both birds and frogs." --Doron Zeilberger
(http://www.math.rutgers.edu/~zeilberg/Opinion95.html)

Listening to computer scientists argue, it seems that the standard of
proof is "I've had two beers, and there is this anecdote about a tribe
in New Guinea from one of Scott Berkun's books that seems to be applicable."

The debate is hindered by low standards of proof.

-- Greg Wilson "What We Actually Know About Software Development"
https://vimeo.com/9270320

When you teach, you do something useful.
When you do research, most days you don't.
-- June Huh IAS Princeton

You've unpacked the Axiom source code to some directory. In this
document we'll call that directory /home/me/axiom. Note that the path
cannot contain spaces.

=======================================================================
================= MAKING AXIOM ========================================
=======================================================================

Axiom builds a system-specific version based on a string we'll call
the SYSNAME. Currently recognized SYSNAME strings can be found on the
Axiom website at:

Replace SYSNAME below with the likely name of your system.

Assuming that place is /home/me/axiom, then:

cd /home/me/axiom                         << where you unpacked the sources
export AXIOM=/home/me/axiom/mnt/SYSNAME   << which axiom to build
export PATH=$AXIOM/bin:$PATH
make                                      << build the system

A recent cause of likely build failures is SELinux.
See the faq file for possible solutions.

When the make completes you'll have an executable called

$AXIOM/bin/axiom

=======================================================================
================= INSTALLING AXIOM ====================================
=======================================================================

You can install Axiom on your system by typing (as root):

make install

This will put Axiom into /usr/local/axiom and the axiom command in
/usr/local/bin/axiom

You can change these defaults to anything thus:

make INSTALL=/home/me/myaxiom COMMAND=/home/me/bin/myaxiom install

=====================================================================

Documentation can be found at various places in the system or on the
Axiom website: <http://axiom.axiom-developer.org>

There is a book (available on Amazon.com):
Jenks, Richard D. and Sutor, Robert S.
"Axiom, The Scientific Computation System"
Springer-Verlag, NY, 1992, ISBN 0-387-97855-0

The book is automatically built as part of the make and lives in:
(yourpath)/axiom/mnt/(sysname)/doc/bookvol0.pdf

In general every directory will contain a Makefile.dvi file. These
files document facts about how Axiom is built. The directory
mnt/linux/doc will contain .dvi files as they are written.

Axiom is free and open source software. It is copyrighted code that
is released under the Modified BSD license. Much debate about this
topic has already been archived on the axiom-legal and
axiom-developer mailing lists. The mail archives are available at the
Axiom website:

http://savannah.gnu.org/projects/axiom

For the purposes of copyright, Axiom is to be considered "Joint Work".
Specifically:

"A joint work is a work prepared by two or more individuals, with the
intention that their separate contributions be merged into a single
work. A joint author can also be an organization or a corporation
under the definition of "work for hire." A person who has merely
contributed ideas without actually documenting those ideas generally
cannot be considered an author.
Authors own the work jointly and equally, unless the authors make an
agreement otherwise. Each joint author has the right to exercise any
or all of the exclusive rights inherent in the joint work. Each
author may:

* Grant third parties permission to use the work on a nonexclusive
  basis without the consent of the other joint authors
* Transfer his or her entire ownership interest to another person
  without the other joint author's consent
* Update the work for his or her own purpose

Additionally, each joint author must account to the other joint
authors for any profits received from licensing their joint work."

(http://copyright.universityofcalifornia.edu/ownership/joint-works.html)

Questions and comments should be sent to: axiom-developer@nongnu.org

Tim Daly
daly@axiom-developer.org

=====================================================================
Philosophy

Scratchpad was a large, general purpose computer algebra system that
was originally developed by IBM under the direction of Richard Jenks.
The project started in 1971 and evolved slowly. Barry Trager was key
to the technical direction of the project. Scratchpad developed over
a 20 year stretch and was basically considered as a research platform
for developing new ideas in computational mathematics. In the 1990s,
as IBM's fortunes slid, the Scratchpad project was renamed to Axiom,
sold to the Numerical Algorithms Group (NAG) in England and became a
commercial system.

As part of the Scratchpad project at IBM in Yorktown I worked on all
aspects of the system and eventually helped transfer the product to
NAG. For a variety of reasons it never became a financial success and
NAG withdrew it from the market in October, 2001.

NAG agreed to release Axiom as free software. The basic motivation
was that Axiom represents something different from other programs in
a lot of ways. Primarily because of its foundation in mathematics the
Axiom system will potentially be useful 30 years from now.
In its current state it represents about 30 years and 300 man-years
of research work. To strive to keep such a large collection of
knowledge alive seems a worthwhile goal.

However, keeping Axiom alive means more than just taking the source
code and dumping it onto a public server. There are a lot of things
about the system that need to change if it is going to survive and
thrive for the next 30 years.

The system is complex and difficult to build. There are few people
who know how it is structured and why it is structured that way.
Somehow it needs to be documented deeply so others can contribute.

The mathematics is difficult. Unlike other free software you can't
just reach for the old saying ``read the source code''. The source
code is plain, clear and about as close to the mathematical theory as
is practical. Unfortunately the mathematical theory is enshrined in
some research library where few people will have access. Somehow this
must change. The research work, the mathematics, the published
papers, and the source code have all got to be kept together for the
next generation to read, understand and modify.

The mathematics is narrow and very focused. This was due to the fact
that, while Axiom is a great research platform, we only had a limited
number of visitors at IBM Research. So there is very little in the
way of, say, infinite group theory in Axiom. We can add it. Doing so
will show up shortcomings in the system. For example, how do you
represent an infinite object? There are many possible representations
and they depend on your goals. The system will have to change,
expand, and, hopefully, become cleaner as more thought is applied.
Scratchpad changed continuously while it was being used for research
and we expect Axiom to do the same.

The language (spad) is designed to let you write algorithms that are
very close to the mathematics.
However, the algorithms as presented in the current system have never
been shown or proven (an important distinction) to be correct. It is
vital that we undertake the huge effort of verifying and validating
the code. How else can we trust the results and of what use is a
system this complex without trust? Somehow we have to extend the
system to integrate program proof techniques. That is, we have to
make computational mathematics hold to the same standard as the rest
of mathematics.

All of which seems to integrate into a requirement for better
documentation. The key change which developers of Axiom will find
with this version is that the documentation is primary and the code
is secondary. Taking direction from Knuth and Dijkstra the system is
now in a literate programming style. The hope is that the next
generation of developers and users will be able to understand,
maintain and extend the system gracefully. And that eventually papers
submitted to journals (an Axiom Journal?) will be easily imported
into the system with their running code made available automatically.

There is no guarantee that this attempt to change the culture of
computational mathematicians is going to succeed. But it is our firm
belief that current systems have reached a complexity plateau and we
need to find new techniques to push the envelope. In general, we need
to consider changes to the system with a 30 year horizon rather than
the current write-ship-debug mentality of software development.

This is, after all, mathematics, the queen of the sciences. It
deserves all of the time, talent and attention we can bring to bear
on the subject.

Tim Daly -- September 3, 2002

=====================================================================
Contributions

All of these people have, in some way or other, been associated with
Scratchpad and Axiom. If you contribute, add your name. The names are
in alphabetical order as we make no attempt to quantify the relative
merit of the contributions.

In books/bookvol5.pamphlet is a variable called credits which
contains this list.
Typing )credits at the Axiom command prompt will prettyprint the list.

"An alphabetical listing of contributors to AXIOM:"
"Michael Albaugh Cyril Alberga Roy Adler"
"Christian Aistleitner Richard Anderson George Andrews"
"S.J. Atkins Jeremy Avigad Henry Baker"
"Martin Baker Stephen Balzac Yurij Baransky"
"David R. Barton Thomas Baruchel Gerald Baumgartner"
"Gilbert Baumslag Michael Becker Nelson H. F. Beebe"
"Jay Belanger David Bindel Fred Blair"
"Vladimir Bondarenko Mark Botch Raoul Bourquin"
"Alexandre Bouyer Karen Braman Wolfgang Brehm"
"Peter A. Broadbery Martin Brock Manuel Bronstein"
"Christopher Brown Stephen Buchwald Florian Bundschuh"
"Luanne Burns William Burge Ralph Byers"
"Quentin Carpent Pierre Casteran Robert Cavines"
"Bruce Char Ondrej Certik Tzu-Yi Chen"
"Bobby Cheng Cheekai Chin David V. Chudnovsky"
"Gregory V. Chudnovsky Mark Clements James Cloos"
"Jia Zhao Cong Josh Cohen Christophe Conil"
"Don Coppersmith George Corliss Robert Corless"
"Gary Cornell Meino Cramer Jeremy Du Croz"
"David Cyganski Nathaniel Daly Timothy Daly Sr."
"Timothy Daly Jr. James H. Davenport David Day"
"James Demmel Didier Deshommes Michael Dewar"
"Inderjit Dhillon Jack Dongarra Jean Della Dora"
"Gabriel Dos Reis Claire DiCrescendo Sam Dooley"
"Zlatko Drmac Lionel Ducos Iain Duff"
"Lee Duhem Martin Dunstan Brian Dupee"
"Dominique Duval Robert Edwards Heow Eide-Goodman"
"Lars Erickson Mark Fahey Richard Fateman"
"Bertfried Fauser Stuart Feldman John Fletcher"
"Brian Ford Albrecht Fortenbacher George Frances"
"Constantine Frangos Timothy Freeman Korrinn Fu"
"Marc Gaetano Rudiger Gebauer Van de Geijn"
"Kathy Gerber Patricia Gianni Gustavo Goertkin"
"Samantha Goldrich Holger Gollan Teresa Gomez-Diaz"
"Laureano Gonzalez-Vega Stephen Gortler Johannes Grabmeier"
"Matt Grayson Klaus Ebbe Grue James Griesmer"
"Vladimir Grinberg Oswald Gschnitzer Ming Gu"
"Jocelyn Guidry Gaetan Hache Steve Hague"
"Satoshi Hamaguchi Sven Hammarling Mike Hansen"
"Richard Hanson Richard Harke Bill Hart"
"Vilya Harvey Martin Hassner Arthur S. Hathaway"
"Dan Hatton Waldek Hebisch Karl Hegbloom"
"Ralf Hemmecke Henderson Antoine Hersen"
"Nicholas J. Higham Hoon Hong Roger House"
"Gernot Hueber Pietro Iglio Alejandro Jakubi"
"Richard Jenks Bo Kagstrom William Kahan"
"Kyriakos Kalorkoti Kai Kaminski Grant Keady"
"Wilfrid Kendall Tony Kennedy David Kincaid"
"Keshav Kini Ted Kosan Paul Kosinski"
"Igor Kozachenko Fred Krogh Klaus Kusche"
"Bernhard Kutzler Tim Lahey Larry Lambe"
"Kaj Laurson Charles Lawson George L. Legendre"
"Franz Lehner Frederic Lehobey Michel Levaud"
"Howard Levy J. Lewis Ren-Cang Li"
"Rudiger Loos Craig Lucas Michael Lucks"
"Richard Luczak Camm Maguire Francois Maltey"
"Osni Marques Alasdair McAndrew Bob McElrath"
"Michael McGettrick Edi Meier Ian Meikle"
"David Mentre Victor S. Miller Gerard Milmeister"
"Mohammed Mobarak H. Michael Moeller Michael Monagan"
"Marc Moreno-Maza Scott Morrison Joel Moses"
"Mark Murray William Naylor Patrice Naudin"
"C. Andrew Neff John Nelder Godfrey Nolan"
"Arthur Norman Jinzhong Niu Michael O'Connor"
"Summat Oemrawsingh Kostas Oikonomou Humberto Ortiz-Zuazaga"
"Julian A. Padget Bill Page David Parnas"
"Susan Pelzel Michel Petitot Didier Pinchon"
"Ayal Pinkus Frederick H. Pitts Frank Pfenning"
"Jose Alfredo Portes E. Quintana-Orti Gregorio Quintana-Orti"
"Beresford Parlett A. Petitet Andre Platzer"
"Peter Poromaas Claude Quitte Arthur C. Ralfs"
"Norman Ramsey Anatoly Raportirenko Guilherme Reis"
"Huan Ren Albert D. Rich Michael Richardson"
"Jason Riedy Renaud Rioboo Jean Rivlin"
"Nicolas Robidoux Simon Robinson Raymond Rogers"
"Michael Rothstein Martin Rubey Jeff Rutter"
"Philip Santas Alfred Scheerhorn William Schelter"
"Gerhard Schneider Martin Schoenert Marshall Schor"
"Frithjof Schulze Fritz Schwarz Steven Segletes"
"V. Sima Nick Simicich William Sit"
"Elena Smirnova Jacob Nyffeler Smith Matthieu Sozeau"
"Ken Stanley Jonathan Steinbach Fabio Stumbo"
"Christine Sundaresan Klaus Sutner Robert Sutor"
"Moss E. Sweedler Eugene Surowitz Max Tegmark"
"T. Doug Telford James Thatcher Laurent Thery"
"Balbir Thomas Mike Thomas Dylan Thurston"
"Francoise Tisseur Steve Toleque Raymond Toy"
"Barry Trager Themos T. Tsikas Gregory Vanuxem"
"Kresimir Veselic Christof Voemel Bernhard Wall"
"Stephen Watt Andreas Weber Jaap Weel"
"Juergen Weiss M. Weller Mark Wegman"
"James Wen Thorsten Werther Michael Wester"
"R. Clint Whaley James T. Wheeler John M. Wiley"
"Berhard Will Clifton J. Williamson Stephen Wilson"
"Shmuel Winograd Robert Wisbauer Sandra Wityak"
"Waldemar Wiwianka Knut Wolf Yanyang Xiao"
"Liu Xiaojun Clifford Yapp David Yun"
"Qian Yun Vadim Zhytnikov Richard Zippel"
"Evelyn Zoernack Bruno Zuercher Dan Zwillinger"

Pervasive Literate Programming

I think David Diamond said it best (Datamation, June 1976, pg 134):

  The fellow who designed it is working far away.
  The spec's not been updated for many a livelong day.
  The guy who implemented it is promoted up the line.
  And some of the enhancements didn't match to the design.
  They haven't kept the flowcharts, the manual's a mess.
  And most of what you need to know, you'll simply have to guess.

and with respect to Axiom:

  The research that it's based on is no longer to be had.
  And the theory that it's based on has changed by just a tad.
  If we keep it all together then at least there is a hope.
  That the people who maintain it will have a chance to cope.

To quote Fred Brooks, "The Mythical Man-Month":

"A basic principle of data processing teaches the folly of trying to maintain independent files in synchronization... Yet our practice in programming documentation violates our own teaching. We typically attempt to maintain a machine-readable form of a program and an independent set of human-readable documentation, consisting of prose and flowcharts... The solution, I think, is to merge the files, to incorporate the documentation in the source program."

"A common fallacy is to assume authors of incomprehensible code will somehow be able to express themselves lucidly and clearly in comments." -- Kevlin Henney

"A programmer who cannot explain their ideas clearly in natural language is incapable of writing readable code." -- Tim Daly

As you can already see from this document, the whole of the Axiom effort is structured around literate programs. Every directory has a Makefile.pamphlet file which explains the details of that directory. The whole source tree hangs from the Makefile tree. (Some of the Makefile.pamphlet files contain only text if executable code is not needed.) Every source file is embedded in a pamphlet file.

Which begs the question: ``What is a pamphlet file?''. Basically it is a tex document with some additional markup tags that surround source code. Pamphlet files are intended to document one particular subject. Pamphlet files can later be combined into ``booklet'' files, as one would embed chapters into books.

Which begs the question: ``Why bother with pamphlet files?''.
Clearly you didn't read the philosophy rant above. In more detail, there have been two traditional methods of documenting source code. The first is to sprinkle random comments into the code. This suffers from the problem that the comments assume you already understand the purpose of the code and why an enlightened comment like ``This oughta work'' is perfectly clear and compelling. The second method is to write a document as a separate file. Such documents get written half-heartedly because the lack of source code allows you to skip over explaining ugly implementation details (where all of the real confusion lies). This form of documentation never gets updated and gradually becomes uninteresting history.

Pamphlet files overcome neither of these limitations if you don't make the effort to do it right. Ask yourself the question ``What would Knuth do?'' or ``Will this be clear 30 years from now?''.

Which begs the question: ``Why go to all this trouble?''. Because you're having a conversation with people who are far removed from you in time, space, and understanding. Because someone else will have to maintain your code. Because you are part of a community of mathematicians who hold you to high standards. Because if you can't explain it clearly, maybe YOU don't understand it, or it isn't as clear as you think it is.

Let's imagine that we'd like to receive a pamphlet file from a colleague. It contains a new theory and a spiffy new algorithm. We'd like to be able to put the pamphlet file into the system and have everything magically happen to integrate the new pamphlet into the system. What would that imply? Well, let's assume that the pamphlet file has certain required sections. We'd like to get more than the technical paper and the code. We'd also like to see the help information, test cases, example code, cross-references to other pamphlets that would get automatically included, have the proof available and automatically checkable, etc.
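To make this concrete, here is a minimal sketch of what such a pamphlet file might look like. The file name, chunk name, and Lisp code are hypothetical illustrations, not part of the Axiom sources; the chunk markup follows the same \begin{chunk} convention used elsewhere in this book. TeX prose surrounds a named chunk of source code, so one file carries both the explanation and the implementation:

```latex
\documentclass{article}
\begin{document}
\section{A spiffy new algorithm}
Here we explain the theory, cite the relevant papers, and present
the test cases.  The implementation follows as a named chunk:
\begin{chunk}{spiffy.lisp}
;; increment x -- a placeholder for the real algorithm
(defun spiffy (x) (+ x 1))
\end{chunk}
A tool extracts the chunk to build the system; the very same file
is typeset to produce the documentation.
\end{document}
```

The build machinery pulls the code out of the chunk, while the typeset document remains the primary artifact that people read.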
In the best of all possible worlds we have a front-end system that knows nothing except how to deconstruct and integrate properly formed pamphlet files. If this were true we could be sure that all of the mathematics is documented and external to the system. There are no ``rabbits'' (as Dijkstra called surprises or special knowledge) that we pull out of our hat. Conceptually, given an underlying Lisp system, it is clear we can build such a system.

The General Directory Structure

The top-level directory structure contains 7 directories which are explained in detail below. Three of the directories (license, zips, and lsp) are not part of the essential core of Axiom. The other four directories (src, int, obj, and mnt) comprise the system build environment. Each directory has a specific purpose. Let's look at the essential directories first.

$AXIOM
  lsp
    GCL
  src
    include
    lib
    scripts
  int
    doc
  obj
    tmp
    noweb
    SYS=(linux,...)
  mnt
    SYS=(linux,...)
  zips

The src directory consists of human-written, system-independent
code. You can copy this directory (and the top-level makefiles) and
have a complete build system. Nothing in this directory is ever
changed by the Makefiles and it can be mounted in a read-only fashion
during the build process. Example files would be the lisp source
files.

The int directory consists of machine-generated, system-independent
code. Consider this directory as a cache. Nothing in this directory is
necessary for a clean system build but once the build completes the
information in this directory can significantly shorten rebuilds.
Since this information is system-independent we can use the cache no
matter what architecture we target. Example files would be the dvi
files generated from the tex sources.

The obj directory consists of machine-generated, system-dependent
code. This directory is "scratch" space for the compiler and other
tools. Nothing in this directory is useful once the system is
built. Example files would be the .o files from the C compiler.

The mnt directory consists of machine-generated, system-dependent code
that will comprise the "shipped system". You can copy this directory
and have a complete, running Axiom system. If the end user will see it
or need it in any way it belongs here. Executables are generally built
in obj and moved here. Example files would be the final executable
images, the input files, etc.

The four directories above make it possible to do a system build for
one system (say, a Linux system), which will fill in the int
subdirectory. Then you can NFS-mount the src and int directories
read-only on a Solaris machine and do a Solaris system build. The
original Axiom could build across many different architectures,
compilers, and operating systems.
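As a small runnable sketch of this separation (the directory names come from the text above; the commands are illustrative, not part of the actual build), src can be created once and kept read-only while the machine-generated directories remain writable:

```shell
# Create the four build-environment directories described above.
mkdir -p axiom/src axiom/int axiom/obj axiom/mnt

# src is human-written and never modified by the build, so it can be
# marked (or NFS-mounted) read-only without affecting the build:
chmod a-w axiom/src

# int, obj, and mnt stay writable for the build machinery.
ls axiom
```

The same read-only src and int trees could then be shared across machines of different architectures, with each machine writing only to its own obj and mnt.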

The license directory

One copy of each license is included in this distribution. Source
files contain a line that refers to one of these license files. Some
people are of the belief that including the full license text in every
source file magically strengthens the license but this is not so.
Imagine including the full text of the copyright at the beginning of
every section of a book.

The license terms cover all of the files released as part of this
distribution except as noted in particular files. Copyright
information that might have shown up in source files that were
released from NAG is also collected here and noted at the top of the
files it covers.

The zips directory

The zips directory contains particular distributions of
network-available software that is known to work with this release of
Axiom. Newer versions may work and, if so, should be added to this
directory. The makefiles that handle these files will unpack them into
the correct locations in the src directory tree. These files exist to
increase the stability of the distribution so we can guarantee that
the code works. We encourage testing the latest distributions so that
we can remain at the leading edge and give feedback to the
individual package developers if problems arise.

The lsp directory

Axiom lives on top of Common Lisp, specifically Gnu Common Lisp (GCL).
This directory holds the GCL system on which Axiom is built.

Steps to build Axiom

The Initial Distribution files

The initial distribution contains several top-level files. These are:

1) Makefile.pamphlet
   This is the noweb source for the Makefile file. All changes to
   the Makefile should occur here.

3) Makefile
   This is the actual Makefile that will create Axiom.

In general the distribution will contain the pamphlet files for each
source file and the source file itself. Modifications should be made
and explained in the pamphlet files. The document command should be
run to rebuild the source file and the dvi file.

Steps in the build process

The sequence of steps necessary to build a clean Axiom is simply:

  export AXIOM=(path-including-this-directory)/mnt/SYSNAME
  export PATH=$AXIOM/bin:$PATH
  make

If this fails, check the FAQ for possible problems and their fixes.