Present: Hugo Herbelin, Bruno Barras, Benjamin Werner, Julien Forest, Pierre Letouzey, Andreas Abel, Stéphane Glondu, Stéphane Lescuyer, Arnaud Spiwack, Jean-Marc Notin, Elie Soubiran, Vincent Siles, Vincent Gross, Yann Régis-Gianas (apologies for missing names)


Program

  • 'FSets' by P. Letouzey
  • 'Better aliasing of modules' by E. Soubiran
  • 'Eta' by H. Herbelin
  • Version 8.3

Version 8.3

  • Hugo: We are on time (i.e. not that late).

    Branching to 8.3 in two weeks, then 3 months to make it stable: mid-December for the release?

  • Arnaud's work: introduction of a new proof engine in trunk after branching.

    A lot of cleaning is needed. Not entirely finished (some contributions and the declarative mode are not working yet). About a 10% loss of efficiency.

  • For 8.3, a lot of bugs to be fixed.

    Hugo: Coq's development is a collaborative work!

    Do not hesitate to contribute. This is an opportunity to learn the code.


FSets

  • Overall view:

    • There are two versions of FSets in preparation (one based on modules, the other based on type classes).

    • The modular version of FSets will publish raw data structures and invariants using two separate inductives.

    • A set of functors implements the previous interface.

    • The use of the "include" mechanism enables a lot of factorization (more to come); a minimal illustration follows.
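
      A minimal illustration of signature factorization via "include"
      (illustrative names, not necessarily the stdlib's):

      Module Type Typ. Parameter t : Type. End Typ.
      Module Type HasEq. Include Typ. Parameter eq : t -> t -> Prop. End HasEq.
      Module Type HasLt. Include HasEq. Parameter lt : t -> t -> Prop. End HasLt.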

    • Map is not done yet.

  • Presentation details of Pierre's work to refactor the standard library:

    • He uses "Equivalence" implemented by type classes.

    • A type class "StrictOrder" is introduced.

    • Now, "compare" is pure. It does not convey terms of type Prop anymore.

    • A predicate "CompareSpec" is defined.

    • Some specifications have been grouped thanks to the use of <-> in order to factorize the two lemmas for the -> and <- directions. This is possible thanks to the recent improvements of the rewriting tactics.
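
      For instance, a single iff statement replaces the pair of -> and <-
      lemmas (a sketch in the style of what became MSets; names illustrative):

      Parameter add_spec :
        forall s x y, In y (add x s) <-> E.eq y x \/ In y s.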

    • Regarding backward compatibility, here is the new architecture of FSets:

      • MSets "Modular Sets", to provide a smooth evolution of client code.
      • CSets will be "Type class Sets". => Only clients that used the low-level data structures will break.
    • A new "Order Sig" : a small specification + a new generic tactic "order" (prove formulae using saturation over ordering relation).

    • About the "physical" separation of program code and program invariants, a wrapper is provided to transform an interface in this new style into an interface based on sum types.

      Parameter IsOk : t -> Prop.
      Class Ok (s : t) : Prop := { ok : IsOk s }.
      Parameter isok : t -> bool.
      Declare Instance isok_OK s `(isok s = true) : Ok s | 10.

      => S. Lescuyer : why "10" ? We should be able to choose the priority at instanciation, not at definition.

    • Modular definitions of Max and Min based on a general module type called OrderedTypeFull. => We obtain the usual operations and properties over Min and Max for free. => Dual properties are proven only once.
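
      Concretely, something along these lines (a sketch; the stdlib outcome
      of this work is GenericMinMax, whose statements differ slightly):

      Require Import Orders.
      Module Type HasMax (Import O : OrderedTypeFull).
        Parameter max : t -> t -> t.
        Parameter max_l : forall x y, le y x -> eq (max x y) x.
        Parameter max_r : forall x y, le x y -> eq (max x y) y.
      End HasMax.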

    • A module dedicated to integers and a wrapper for preexisting relations.

    • Numbers can be improved a lot with the new module system and type classes.

    • Some changes in Zmin: not exactly the same statements as in 8.2.

    • Objective: a coherent set of properties for the standard library.


B. Werner

  • Discussion started using an example:

    set = { l : list A | no_repetition l }
    add : set -> A -> set
    remove : ...
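
    In Coq syntax, the sketch reads as follows (self-contained; add_ok is an
    assumed, purely illustrative lemma, since the real invariant-preservation
    proof needs a freshness side condition):

    Section Sketch.
      Variable A : Type.
      Variable no_repetition : list A -> Prop.
      Definition set := { l : list A | no_repetition l }.
      Variable add_ok : forall a l, no_repetition l -> no_repetition (a :: l).
      (* The invariant proof travels with every operation: *)
      Definition add (s : set) (a : A) : set :=
        exist _ (a :: proj1_sig s) (add_ok a _ (proj2_sig s)).
    End Sketch.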
    
  • Context: joint work with Chantal Keller:

    • A large body of lemmas has been imported from Isabelle/HOL.
    • Logical properties are carried around => loss of efficiency.
  • They are looking for a solution.

    => Maybe Pierre's one, which separates data structures and invariants.

  • Two other solutions (the first is sketched below):

    => Stop normalization of the logical part (using a "fun () => ..." thunk).
    => Integrate proof-irrelevance in Coq.
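
    A sketch of the thunk trick (illustrative; the point is that weak-head
    reduction never goes under the extra lambda, so the stored proof is
    never normalized):

    Section Thunk.
      Variable no_repetition : list nat -> Prop.
      (* The invariant is stored behind a dummy abstraction: *)
      Definition set := { l : list nat | unit -> no_repetition l }.
      (* A proof p is then packed as (fun _ : unit => p). *)
    End Thunk.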

  • Lescuyer reports an evaluation that is twice as fast when he uses a purely computational implementation instead of sum types.


Module system (Elie Soubiran): sharing constraints

  • 3 constructions. Let P be a path.

    • Module M := P
    • Module M. Include P. End M.
    • S with Module M := P (where S is a signature)
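
    A small self-contained example of the third construction (names are
    illustrative):

      Module Type T. Parameter t : Type. End T.
      Module P. Definition t := nat. End P.
      Module Type S. Declare Module M : T. End S.
      (* The sharing constraint specializes S so that M is exactly P: *)
      Module Type S' := S with Module M := P.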
  • Standard definition of strengthening by explicit substitution.

  • How to handle non-logical components like hints, tactics and notations?

    => Elie implements a quotient of the name space using canonical names that are computed thanks to the sharing of prefixes. The canonical name of a module is the oldest of its names.

    => Names are now implemented using a pair of an external name (used by the programmer) and a canonical name which is used internally.

    => Equalities and ordering over these new implementations of names have been defined.
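
    A toy model of such names (written in Coq for illustration; the actual
    kernel code is OCaml and the precise fields differ):

      Require Import String.
      (* A name pairs what the user wrote with its canonical form: *)
      Record name := { user : string ; canonical : string }.
      (* Equality (and ordering) only inspect the canonical component: *)
      Definition name_eq (n1 n2 : name) : Prop := canonical n1 = canonical n2.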


Eta (notes by Hugo)

Hugo recalls some facts about CIC and discusses how eta (for implication) could be added, or not,

  • to the current syntax-directed implementation
  • to the standard declarative presentation of CIC
  • to the domain-free declarative presentation of CIC
  • to a typed presentation of CIC

and what consequences this addition would have on the confidence in Coq.

  • Coq implements a syntax-directed version of CIC (let's call it CIC_sd); subtyping and conversion in CIC_sd are syntax-directed in the sense that they are implemented by reduction of the two terms to compare to weak-head normal form, followed by a recursive comparison of the subterms when the two terms have the same head constructor. Adding eta to CIC_sd is easy: it suffices to insert an eta-expansion on the fly every time a comparison between a whnf which is a lambda and a whnf which is not a lambda is encountered (we don't need to change subtyping since subtyping involves only types, and whnfs that are lambdas are not types). As for CIC_sd without eta, the result of the conversion test, and actually even the inner wh reduction algorithm, are correct only if the original terms are well-typed in some sort (from which, by subject reduction, we know that all subconversion problems are typed in the same type). In particular, to ensure this typing invariant is preserved, because of dependencies, conversion on subterms of wh applications has to be done from left to right.
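
    Schematically, the added comparison step amounts to the following rule (a
    sketch; whnf side conditions and the symmetric case are elided):

      \frac{\Gamma, x:A \vdash t \equiv u\,x}{\Gamma \vdash \lambda x{:}A.\, t \equiv u}
      \quad (u \text{ in whnf and not a lambda})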

  • The standard declarative presentation of CIC is defined from the reflexive-symmetric-transitive closure of untyped reduction (let's call it CIC_decl). Because CIC is "full" (the product of types in any two sorts exists), it is a standard result that CIC_decl and CIC_sd are equivalent. However, adding eta to CIC_decl is problematic:

    • if eta is added as an (untyped) expansion, we lose any control over the termination of this expansion;

    • if eta is added as an (untyped) reduction, we have Nederpelt's counterexample to untyped confluence (λx:A. (λy:B. f y y) x reduces both to λx:A. f x x and to λy:B. f y y);

    • moreover, with eta as an (untyped) reduction, this forces subtyping to be contravariant (since subtyping implies that f : Type2 -> Type2 ⊢ λx:Type1. f x : Type1 -> Type3, and to get subject reduction for eta, one would need Type2 -> Type2 to be a subtype of Type1 -> Type3);

  • Moving to a domain-free presentation of CIC_decl (let's call it CIC_decl^df) would solve the problem of confluence of eta-reduction but would still require contravariance. Note that in the absence of eta, CIC_decl^df is known to be equivalent to CIC_decl because it is normalising, and hence to CIC_sd because CIC is full, see Benjamin's PhD. We can reasonably conjecture that (shortcutting the CIC_decl step) this correspondence scales to eta.

  • We have two kinds of model for the CIC.

  • There is (virtually) the normalisation model one would obtain by extending the models of inductive strong elimination by Benjamin and of universes by Alexandre to a model of the full CIC. These models are good for proving normalisation but not for validating standard axioms such as extensionality and classical logic + axiom of unique choice.

  • There is the set-theoretic model Benjamin and Gyesik are working on, which neither proves normalisation nor supports contravariant subtyping, but validates extensionality, classical logic and unique choice. If one wants this model (and I think we want it because it is the only one that ensures the compatibility of Coq with the classical axiom of choice), one has to forget about contravariant subtyping and about the domain-free presentation CIC_decl^df of the CIC. Also, so as to support the subtyping rule Prop <= Type of CIC, the set-theoretic model relies on a variant of CIC with typed conversion. If we do not want to renounce Prop <= Type, which Coq already supports, we therefore need to look at how to add eta to the CIC with typed conversion.

  • Let's call CIC_decl^typed the variant of CIC with typed conversion (aka judgemental equality). Adding eta either as an expansion or reduction rule seems easy there. Benjamin suggests that by adapting Goguen's metatheory of UTT, we could be able to show that CIC_decl^typed and CIC_decl are equivalent.

Summary:

  • As far as eta is concerned, only two presentations of CIC are relevant:

    • the syntax-directed CIC_sd that Coq implements,
    • the presentation with typed conversion CIC_decl^typed which, in particular, serves as support for the set-theoretical model.
  • Both presentations are conjectured equivalent in the absence of eta, first because CIC is full (this is the easy part of the correspondence) and secondly because it is normalising and Goguen's work (or an NbE-style proof) would scale (this is the hard part of the correspondence, which amounts to building a model). Once formally established for beta, one can reasonably expect the correspondence to scale to eta as well.

  • Since, in order to support Prop <= Type, the set-theoretical model relies on CIC_decl^typed, the implementation is not fundamentally further from its model with eta than it is with only beta.

  • Removing Prop <= Type from the CIC (so that the set-theoretical model only requires CIC_decl) does not help for supporting eta, because adding eta on top of CIC_decl would require contravariant subtyping, which the state of the art is not able to provide with the set-theoretical model.

  • Hugo's claim is then that adding eta to Coq would not significantly modify the current conjectured steps on which the system is grounded, unless we not only renounce adding eta but also remove Prop <= Type. From the research point of view, going towards eta and towards a better metatheory of CIC_decl^typed also seems to be the most challenging direction.

Extra discussions

  • Use of "Parameter" instead of "Axiom" in specification (ie Module Type).
  • Lescuyer + P. Letouzey : easier to be aware of a development's axioms.
  • Herbelin : maybe a new keyword "Specification"?
  • a lot of delta reductions are introduced by type classes