
API for reduction-insensitive constr traversal #46

Open
JasonGross opened this issue Aug 27, 2020 · 2 comments

Comments

@JasonGross
Member

JasonGross commented Aug 27, 2020

I'm reviewing coq/coq#12218, and seeing a lot of code that I believe will be wrong in the presence of casts, let-ins, and reducible matches (if true then ... else ...) in surprising places (like at top-level in the types of constructors). Coq also has a number of performance issues in the presence of nested let ... in ...s, e.g., around computation of implicit arguments.

This suggests to me that Coq should have a term-traversal API which is insensitive to some sorts of reduction, much like EConstr is insensitive to evar-normalization. Ideally it would not incur overhead: it should add let binders to the context rather than substituting them, and it could, perhaps, perform beta-reduction by binding the argument in a let binder and going under the binder. It should also be much more lightweight than EConstr. For example, it might just be a kind function which updates the local environment and returns both the context and the kind of the term, modulo whatever reductions you ask it to be modulo.

Said another way, this would be an API for traversing fully reduced terms (or perhaps βιζ-normal terms) without having to perform expensive substitutions in parts of the terms that you don't care about.
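To make the shape of such an API concrete, here is a minimal sketch in Python over a toy term language. The `Var`/`Lam`/`App`/`LetIn` constructors and the `kind` function are stand-ins invented for illustration, not Coq's actual `Constr` API; a real implementation would work over de Bruijn-indexed terms and a proper `rel_context`.

```python
from dataclasses import dataclass
from typing import Any

# Toy term language (named variables for readability; a real
# implementation would use de Bruijn indices).
@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: Any

@dataclass(frozen=True)
class App:
    fn: Any
    arg: Any

@dataclass(frozen=True)
class LetIn:
    name: str
    value: Any
    body: Any

def kind(term, ctx=()):
    """Return (head, ctx): the outermost constructor of `term` modulo
    zeta and beta, plus the context of let bindings accumulated on the
    way down.  Nothing is ever substituted: a let-in extends the
    context, and a beta-redex binds its argument as a let instead of
    substituting it into the body."""
    while True:
        if isinstance(term, LetIn):
            # zeta: record the binding, descend into the body
            ctx = ctx + ((term.name, term.value),)
            term = term.body
        elif isinstance(term, App) and isinstance(term.fn, Lam):
            # beta without substitution:
            #   (fun x => b) a  ~~>  let x := a in b
            # (fresh names would be needed here to avoid capture)
            ctx = ctx + ((term.fn.param, term.arg),)
            term = term.fn.body
        else:
            return term, ctx
```

Each `kind` step is constant-time: the context grows instead of the term, so nested let-ins and beta-redexes cost nothing in the parts of the term you never inspect.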

I haven't turned this into an actual CEP because I don't have a good enough sense of this, but I'd be interested in comments and/or starting a discussion here.

cc @ppedrot @andres-erbsen

@andres-erbsen

My understanding has been that there is no interest in supporting terms whose whnf is large, or in being careful with reduction more generally. If I'm wrong, I'd be happy to participate in the design.

@ppedrot
Member

ppedrot commented Aug 27, 2020

@JasonGross I agree it'd be nice, but I don't have the slightest idea how to do that efficiently.


3 participants