
Followup of #17993, fixes #18239: anomaly on triggering conditions for simpl on encapsulated mutual fixpoints #18243

Conversation

@herbelin (Member) commented Nov 1, 2023

We ensure that there are enough arguments in the original applied constant, in addition to ensuring that there are enough arguments in the constant that immediately surrounds a global mutual fixpoint.

In addition to fixing bug #18239, which was introduced by #17993, this also fixes a similar bug that predates #17993.

Fixes #18239

  • Added / updated test-suite.
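
For readers unfamiliar with the issue, here is a minimal sketch of the kind of configuration involved: a global mutual fixpoint encapsulated in a surrounding constant, on which simpl is called with and without enough arguments. The names (even, odd, even_alias) and the script are illustrative only; this is not the regression test added by the PR, and it is not claimed to reproduce the #18239 anomaly.

```coq
(* A global mutual fixpoint... *)
Fixpoint even (n : nat) : bool :=
  match n with
  | O => true
  | S m => odd m
  end
with odd (n : nat) : bool :=
  match n with
  | O => false
  | S m => even m
  end.

(* ...encapsulated in a constant that immediately surrounds it. *)
Definition even_alias := even.

(* With enough arguments, simpl may unfold through the alias down to
   the mutual fixpoint and compute. *)
Goal even_alias 2 = true.
Proof. simpl. reflexivity. Qed.

(* With too few arguments, simpl should leave the term alone instead
   of unfolding it (the #18239 anomaly arose in a situation of this
   general shape, though not necessarily from this exact script). *)
Goal even_alias = even.
Proof. simpl. reflexivity. Qed.
```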

@herbelin added the kind: fix (This fixes a bug or incorrect documentation.) and part: reduction strategies (The Strategy command for defining reduction strategies.) labels on Nov 1, 2023
@herbelin added this to the 8.19+rc1 milestone on Nov 1, 2023
@herbelin closed this on Nov 1, 2023
@coqbot-app bot removed this from the 8.19+rc1 milestone on Nov 1, 2023
@herbelin reopened this on Nov 1, 2023
@coqbot-app bot added the needs: full CI label (The latest GitLab pipeline that ran was a light CI. Say "@coqbot run full ci" to get a full CI.) on Nov 1, 2023
@herbelin marked this pull request as ready for review on November 1, 2023, 18:51
@herbelin requested a review from a team as a code owner on November 1, 2023, 18:51
Review thread on pretyping/tacred.ml (outdated, resolved)
@SkySkimmer (Contributor)

@coqbot run full ci

@coqbot-app bot removed the needs: full CI label on Nov 1, 2023
…ons for simpl.

We ensure that there are enough arguments in the original applied
constant, in addition to ensuring enough arguments in the constant
that immediately surrounds a global mutual fixpoint.

This incidentally fixes a pre-coq#17993 bug.
@herbelin force-pushed the master+fix18239-followup-17993-simpl-triggering-condition branch from 8e65e45 to b789f40 on November 2, 2023, 10:14
@coqbot-app bot added the needs: full CI label on Nov 2, 2023
@herbelin (Member, Author) commented Nov 2, 2023

@SkySkimmer: I renamed the OCaml variables in a way that I hope is clearer. I added new tests, checked the result of simpl, and documented the behavior (using a match goal makes the tests harder to read, but I did not want to rely too heavily on the output test-suite directory). I added two items to the changelog. If this is OK for you, we can restart the full CI (and check that it is still green).
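
As an aside, the match goal style of test mentioned above checks the goal produced by simpl directly in the script, instead of comparing against a recorded output file. A minimal sketch of the pattern (with an illustrative statement, not the tests actually added in this PR) could look like this:

```coq
Goal forall n, 0 + S n = S n.
Proof.
  intros n.
  simpl.
  (* Fail loudly if simpl did not leave the goal in the expected shape. *)
  match goal with
  | |- S n = S n => idtac
  | |- _ => fail "unexpected result of simpl"
  end.
  reflexivity.
Qed.
```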

@SkySkimmer (Contributor)

@coqbot run full ci

@coqbot-app bot removed the needs: full CI label on Nov 2, 2023
@SkySkimmer self-assigned this on Nov 2, 2023
@SkySkimmer added this to the 8.19+rc1 milestone on Nov 2, 2023
@SkySkimmer (Contributor)

I tried fiat-crypto locally; Curves/Weierstrass/AffineProofs worked fine, with src/Curves/Weierstrass/AffineProofs.vo (real: 216.29, user: 373.41, sys: 1.75, mem: 2086324 ko).
Let's rerun and see.

@SkySkimmer (Contributor)

@coqbot bench

@SkySkimmer (Contributor)

I retried twice and all three jobs failed the same way: Curves/Weierstrass/AffineProofs fails and seems to take a long time to do so, e.g. https://gitlab.inria.fr/coq/coq/-/jobs/3551339:

COQC src/Curves/Weierstrass/AffineProofs.v
Finished transaction in 1.519 secs (1.385u,0.127s) (successful)
Command terminated by signal 9
src/Curves/Weierstrass/AffineProofs.vo (real: 3379.13, user: 6.29, sys: 74.80, mem: 1275872 ko)

cc @JasonGross @andres-erbsen

@SkySkimmer (Contributor)

Also, why is real so much greater than user + sys?

@SkySkimmer (Contributor)

It also failed on master in a similar way: https://gitlab.inria.fr/coq/coq/-/jobs/3555681

@coqbot-app (bot) commented Nov 3, 2023

🏁 Bench results:

┌──────────────────────────────┬─────────────────────────┬───────────────────────────────────────┬───────────────────────────────────────┬─────────────────────────┐
│                              │      user time [s]      │              CPU cycles               │           CPU instructions            │  max resident mem [KB]  │
│                              │                         │                                       │                                       │                         │
│         package_name         │   NEW      OLD    PDIFF │      NEW             OLD        PDIFF │      NEW             OLD        PDIFF │   NEW      OLD    PDIFF │
├──────────────────────────────┼─────────────────────────┼───────────────────────────────────────┼───────────────────────────────────────┼─────────────────────────┤
│       coq-mathcomp-ssreflect │  183.67   186.39  -1.46 │   828872427243    841283223048  -1.48 │  1472443403440   1472404768174   0.00 │ 1287748  1289508  -0.14 │
│         coq-mathcomp-algebra │  472.44   478.74  -1.32 │  2128961048016   2156841134485  -1.29 │  3971845005753   3972217581836  -0.01 │ 1377736  1399252  -1.54 │
│        coq-mathcomp-fingroup │   38.15    38.49  -0.88 │   173035969042    174392706841  -0.78 │   264343659817    264318494300   0.01 │  558996   559188  -0.03 │
│      coq-metacoq-safechecker │  377.16   380.25  -0.81 │  1715333139146   1729089954894  -0.80 │  2862226825875   2862072179750   0.01 │ 1867208  1868084  -0.05 │
│           coq-mathcomp-field │  218.38   219.91  -0.70 │   982777731334    990009732053  -0.73 │  1758113792404   1758238137987  -0.01 │ 1505792  1505756   0.00 │
│          coq-category-theory │  778.55   783.55  -0.64 │  3531258729865   3555087712053  -0.67 │  6153927465433   6153893157673   0.00 │  884048   883924   0.01 │
│                   coq-stdlib │  373.10   374.99  -0.50 │  1558187669947   1562973562741  -0.31 │  1389264469313   1389205280913   0.00 │  765896   753544   1.64 │
│                coq-fiat-core │   60.92    61.19  -0.44 │   258842844683    258015323893   0.32 │   381083316455    381116334868  -0.01 │  494412   494412   0.00 │
│       coq-mathcomp-character │  135.29   135.84  -0.40 │   612086262118    614660070133  -0.42 │  1001594409269   1001548129128   0.00 │ 1245824  1245844  -0.00 │
│                coq-fourcolor │ 1551.39  1557.55  -0.40 │  6853430207666   6871391454652  -0.26 │ 12476839538461  12476971961780  -0.00 │ 2333428  2329876   0.15 │
│       coq-mathcomp-odd-order │  872.81   875.91  -0.35 │  3971112444997   3987362550116  -0.41 │  6736173311395   6736205857134  -0.00 │ 1630516  1630780  -0.02 │
│  coq-rewriter-perf-SuperFast │  755.47   757.15  -0.22 │  3363989182228   3368583017197  -0.14 │  5749196418182   5751325130347  -0.04 │ 1408688  1408536   0.01 │
│                     coq-core │  117.99   118.20  -0.18 │   462291260305    459455669149   0.62 │   494246129266    494307643707  -0.01 │  367928   367464   0.13 │
│        coq-mathcomp-solvable │  144.38   144.58  -0.14 │   653249606083    654118628310  -0.13 │  1056252423955   1056249407020   0.00 │  931668   928976   0.29 │
│                     coq-corn │  805.48   806.58  -0.14 │  3625686046392   3631545678339  -0.16 │  5629697335182   5629221646910   0.01 │  780104   779880   0.03 │
│   coq-performance-tests-lite │  734.57   735.24  -0.09 │  3276169877696   3280277606098  -0.13 │  5795550623643   5794287788773   0.02 │ 1690056  1690992  -0.06 │
│            coq-iris-examples │  471.15   471.48  -0.07 │  2121569174384   2123075674566  -0.07 │  3244999515513   3244779080810   0.01 │ 1073636  1073772  -0.01 │
│             coq-math-classes │   89.03    89.07  -0.04 │   397534865895    398064162186  -0.13 │   556173097845    556089849188   0.01 │  524780   525080  -0.06 │
│ coq-fiat-crypto-with-bedrock │ 6499.66  6497.07   0.04 │ 29264689368844  29242693287373   0.08 │ 53711146755635  53716138721998  -0.01 │ 2308268  2308504  -0.01 │
│                     coq-hott │  158.52   158.36   0.10 │   704523232204    704692667676  -0.02 │  1113311630290   1113473299399  -0.01 │  514308   513932   0.07 │
│          coq-metacoq-erasure │  360.39   359.76   0.18 │  1619956933279   1617446430854   0.16 │  2550188027873   2550157003522   0.00 │ 1889400  1889164   0.01 │
│                coq-equations │   11.35    11.33   0.18 │    48295245833     48427562908  -0.27 │    76412642301     76419445841  -0.01 │  417316   417504  -0.05 │
│                 coq-rewriter │  377.69   376.78   0.24 │  1699569676305   1695845553458   0.22 │  2767842241569   2768006070569  -0.01 │ 1231608  1232228  -0.05 │
│               coq-verdi-raft │  601.31   599.75   0.26 │  2726522035854   2719057360166   0.27 │  4209151330296   4209741963611  -0.01 │  807608   809784  -0.27 │
│                  coq-bignums │   29.31    29.23   0.27 │   131455495610    131497783371  -0.03 │   189104433999    189126288129  -0.01 │  490556   488760   0.37 │
│               coq-coquelicot │   39.08    38.96   0.31 │   173169862532    172088254127   0.63 │   241817767860    241669042046   0.06 │  793184   795724  -0.32 │
│                 coq-compcert │  286.08   285.05   0.36 │  1278570020282   1275214246852   0.26 │  1962858350767   1962629584540   0.01 │ 1144700  1145012  -0.03 │
│         coq-metacoq-template │  153.87   153.24   0.41 │   675538046728    673331747353   0.33 │  1052251282545   1052292144001  -0.00 │ 1177064  1180684  -0.31 │
│                    coq-color │  237.93   236.95   0.41 │  1058761323845   1054748376763   0.38 │  1542266003035   1542257140343   0.00 │ 1125896  1126512  -0.05 │
│        coq-engine-bench-lite │  161.97   161.28   0.43 │   688800476609    684826087326   0.58 │  1277278959667   1278904732951  -0.13 │ 1265060  1265116  -0.00 │
│                  coq-unimath │ 1589.42  1581.36   0.51 │  7188632653073   7152806841452   0.50 │ 13475726451730  13476887365066  -0.01 │ 1321932  1322116  -0.01 │
│                  coq-coqutil │   41.82    41.58   0.58 │   184646400647    183525480583   0.61 │   265495089167    265464878538   0.01 │  562964   561552   0.25 │
│                 coq-bedrock2 │  375.28   373.08   0.59 │  1694058454250   1683779808813   0.61 │  3257033819106   3257116397655  -0.00 │  871004   870464   0.06 │
│                    coq-verdi │   49.51    49.18   0.67 │   222035173268    220425413886   0.73 │   338229200374    337798289098   0.13 │  523228   529424  -1.17 │
│                      coq-vst │  862.65   856.75   0.69 │  3876787849090   3850761844183   0.68 │  6388643587362   6388224325250   0.01 │ 2257488  2257452   0.00 │
│            coq-metacoq-pcuic │  631.14   626.52   0.74 │  2834774113242   2811985939374   0.81 │  4137528577789   4137224861710   0.01 │ 2263420  2261036   0.11 │
│             coq-fiat-parsers │  336.73   333.71   0.90 │  1484468426820   1468560811740   1.08 │  2466655136492   2467450406596  -0.03 │ 2428180  2426676   0.06 │
│                 coq-coqprime │   48.69    48.18   1.06 │   217376668687    214811606595   1.19 │   330496345259    330381982474   0.03 │  778728   781032  -0.29 │
│     coq-metacoq-translations │   18.62    18.24   2.08 │    81220898832     79713041339   1.89 │   129623941751    129634956989  -0.01 │  837100   837288  -0.02 │
└──────────────────────────────┴─────────────────────────┴───────────────────────────────────────┴───────────────────────────────────────┴─────────────────────────┘

INFO: failed to install coq-perennial

🐢 Top 25 slow downs
┌────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                         TOP 25 SLOW DOWNS                                                          │
│                                                                                                                                    │
│   OLD       NEW      DIFF   %DIFF   Ln                    FILE                                                                     │
├────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│  40.2370   42.2890  2.0520  5.10%   224  coq-performance-tests-lite/PerformanceExperiments/rewrite_lift_lets_map.v.html            │
│  41.3580   43.3830  2.0250  4.90%   236  coq-rewriter-perf-SuperFast/src/Rewriter/Rewriter/Examples/PerfTesting/LiftLetsMap.v.html │
│  41.6540   43.4500  1.7960  4.31%   236  coq-rewriter/src/Rewriter/Rewriter/Examples/PerfTesting/LiftLetsMap.v.html                │
│  49.7160   51.4440  1.7280  3.48%   365  coq-mathcomp-odd-order/theories/PFsection4.v.html                                         │
│  41.3120   42.6930  1.3810  3.34%   368  coq-mathcomp-odd-order/theories/PFsection4.v.html                                         │
│  62.7730   63.9000  1.1270  1.80%   139  coq-fiat-parsers/src/Parsers/Refinement/SharpenedJSON.v.html                              │
│ 211.6500  212.6680  1.0180  0.48%   141  coq-fiat-crypto-with-bedrock/src/UnsaturatedSolinasHeuristics/Tests.v.html                │
│  24.3110   25.1150  0.8040  3.31%    20  coq-fiat-crypto-with-bedrock/src/Spec/Test/X25519.v.html                                  │
│  24.3180   25.1090  0.7910  3.25%    25  coq-fiat-crypto-with-bedrock/src/Spec/Test/X25519.v.html                                  │
│  19.0320   19.6470  0.6150  3.23%    28  coq-fiat-crypto-with-bedrock/src/Spec/Test/X25519.v.html                                  │
│ 121.1760  121.7520  0.5760  0.48%   155  coq-fiat-crypto-with-bedrock/src/UnsaturatedSolinasHeuristics/Tests.v.html                │
│  34.6070   35.0830  0.4760  1.38%   548  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/MMIO.v.html          │
│  28.3320   28.7390  0.4070  1.44%   147  coq-vst/veric/expr_lemmas4.v.html                                                         │
│  10.8320   11.2140  0.3820  3.53%    55  coq-category-theory/Construction/Comma/Natural/Transformation.v.html                      │
│  12.7020   13.0840  0.3820  3.01%    15  coq-fiat-crypto-with-bedrock/src/Spec/Test/X25519.v.html                                  │
│  45.7750   46.1490  0.3740  0.82%   558  coq-bedrock2/bedrock2/src/bedrock2Examples/insertionsort.v.html                           │
│  10.2910   10.6590  0.3680  3.58%   496  coq-rewriter/src/Rewriter/Rewriter/Wf.v.html                                              │
│ 113.3430  113.7070  0.3640  0.32%   999  coq-performance-tests-lite/src/fiat_crypto_via_setoid_rewrite_standalone.v.html           │
│  13.0830   13.4320  0.3490  2.67%    55  coq-fiat-crypto-with-bedrock/src/Bedrock/End2End/X25519/Field25519.v.html                 │
│  16.4700   16.8190  0.3490  2.12%   957  coq-unimath/UniMath/CategoryTheory/Monoidal/Examples/DisplayedCartesianMonoidal.v.html    │
│  20.8700   21.1970  0.3270  1.57%    49  coq-fiat-crypto-with-bedrock/src/Curves/Weierstrass/AffineProofs.v.html                   │
│  51.7610   52.0740  0.3130  0.60%    50  coq-fiat-crypto-with-bedrock/src/Bedrock/Field/Synthesis/Examples/p224_64_new.v.html      │
│ 142.8150  143.1250  0.3100  0.22%  1190  coq-unimath/UniMath/CategoryTheory/GrothendieckConstruction/IsPullback.v.html             │
│  27.4860   27.7830  0.2970  1.08%   194  coq-vst/veric/expr_lemmas4.v.html                                                         │
│ 126.3220  126.6090  0.2870  0.23%    22  coq-fiat-crypto-with-bedrock/src/Rewriter/Passes/ArithWithCasts.v.html                    │
└────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
🐇 Top 25 speed ups
┌─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                                  TOP 25 SPEED UPS                                                                   │
│                                                                                                                                                     │
│   OLD       NEW      DIFF     %DIFF    Ln                      FILE                                                                                 │
├─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│  54.9890   53.7650  -1.2240   -2.23%   915  coq-fiat-crypto-with-bedrock/src/Bedrock/End2End/X25519/GarageDoor.v.html                               │
│ 114.5260  113.6070  -0.9190   -0.80%    48  coq-fiat-crypto-with-bedrock/src/Curves/Weierstrass/AffineProofs.v.html                                 │
│  90.9920   90.2280  -0.7640   -0.84%   103  coq-fiat-crypto-with-bedrock/src/Arithmetic/BarrettReduction.v.html                                     │
│  18.1080   17.4120  -0.6960   -3.84%  3269  coq-fiat-crypto-with-bedrock/src/Assembly/WithBedrock/Proofs.v.html                                     │
│  22.5970   21.9760  -0.6210   -2.75%   219  coq-fiat-crypto-with-bedrock/src/Curves/EdwardsMontgomery.v.html                                        │
│   2.3690    1.7780  -0.5910  -24.95%   209  coq-stdlib/setoid_ring/Ncring_tac.v.html                                                                │
│  23.1320   22.5440  -0.5880   -2.54%   352  coq-fiat-crypto-with-bedrock/src/Curves/Montgomery/XZProofs.v.html                                      │
│  28.2970   27.7810  -0.5160   -1.82%    12  coq-fourcolor/theories/job299to302.v.html                                                               │
│  28.4030   28.0070  -0.3960   -1.39%    12  coq-fourcolor/theories/job618to622.v.html                                                               │
│  28.3050   27.9140  -0.3910   -1.38%    12  coq-fourcolor/theories/job279to282.v.html                                                               │
│  19.5460   19.1680  -0.3780   -1.93%    40  coq-fiat-crypto-with-bedrock/src/PushButtonSynthesis/SolinasReductionReificationCache.v.html            │
│   8.0860    7.7170  -0.3690   -4.56%  2854  coq-fiat-crypto-with-bedrock/src/Assembly/EquivalenceProofs.v.html                                      │
│  23.3150   22.9590  -0.3560   -1.53%    12  coq-fourcolor/theories/job283to286.v.html                                                               │
│  14.6710   14.3250  -0.3460   -2.36%   315  coq-fiat-crypto-with-bedrock/src/Bedrock/Group/ScalarMult/MontgomeryLadder.v.html                       │
│  12.0170   11.6860  -0.3310   -2.75%   258  coq-fiat-crypto-with-bedrock/src/Curves/Montgomery/XZProofs.v.html                                      │
│  20.6810   20.3500  -0.3310   -1.60%  1073  coq-metacoq-safechecker/safechecker/theories/PCUICSafeReduce.v.html                                     │
│ 161.8300  161.5050  -0.3250   -0.20%   233  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/deps/riscv-coq/src/riscv/Proofs/DecodeByExtension.v.html │
│  34.2650   33.9460  -0.3190   -0.93%    12  coq-fourcolor/theories/job165to189.v.html                                                               │
│  40.2520   39.9360  -0.3160   -0.79%   835  coq-fiat-crypto-with-bedrock/src/Fancy/Compiler.v.html                                                  │
│  35.1840   34.8730  -0.3110   -0.88%    12  coq-fourcolor/theories/job254to270.v.html                                                               │
│   9.2280    8.9420  -0.2860   -3.10%   149  coq-category-theory/Structure/Monoid.v.html                                                             │
│  19.7460   19.4610  -0.2850   -1.44%    12  coq-fourcolor/theories/job550to553.v.html                                                               │
│  14.7740   14.4890  -0.2850   -1.93%   212  coq-fiat-crypto-with-bedrock/src/Curves/Weierstrass/Projective.v.html                                   │
│  30.3860   30.1090  -0.2770   -0.91%    12  coq-fourcolor/theories/job287to290.v.html                                                               │
│  10.8700   10.5950  -0.2750   -2.53%   497  coq-mathcomp-algebra/mathcomp/algebra/ssrnum.v.html                                                     │
└─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘

@SkySkimmer (Contributor)

Got it to pass after another retry; given the failures on master, I guess it's an unrelated problem.

@coqbot merge now

@coqbot-app bot merged commit 4e66de6 into coq:master on Nov 4, 2023 (10 of 11 checks passed)
Labels
kind: fix (This fixes a bug or incorrect documentation.), part: reduction strategies (The Strategy command for defining reduction strategies.)
Development

Successfully merging this pull request may close these issues:
Error: Anomaly "Not an unfoldable reference."
2 participants