
Fix #5351: Intuition does not work with universe polymorphism. #8905

Merged
merged 3 commits into coq:master from ppedrot:fix-5351 on Mar 18, 2023

Conversation

ppedrot
Member

@ppedrot ppedrot commented Nov 5, 2018

Instead of using non-linear pattern-matching in tauto, we use a tauto-specific conversion function so as to add the missing universe constraints. We cannot use constr_eq as the current implementation of non-linear matching uses conversion rather than syntactic equality.
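For illustration, a minimal sketch of the two styles of rule (not the actual tauto source; hypothesis names are made up, and is_conv stands for the tauto-internal convertibility test introduced here, visible in the review excerpts below):

    (* Before: a non-linear match. ?X1 occurs twice, so the matcher itself
       decides when the two occurrences count as equal, and it currently does
       so with a conversion check that drops the universe constraints it infers. *)
    match goal with
    | id0 : ?X1 -> ?X2, id1 : ?X1 |- _ =>
        assert X2; [ exact (id0 id1) | clear id0 ]
    end.

    (* After: a linear match plus an explicit convertibility test, so the
       universe constraints produced while comparing X1 and X3 are recorded. *)
    match goal with
    | id0 : ?X1 -> ?X2, id1 : ?X3 |- _ =>
        is_conv X1 X3;
        assert X2; [ exact (id0 id1) | clear id0 ]
    end.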

Fixes / closes #4721.
Fixes / closes #5351.

  • Added / updated test-suite

@ppedrot ppedrot added the kind: fix This fixes a bug or incorrect documentation. label Nov 5, 2018
@ppedrot ppedrot added this to the 8.9.0 milestone Nov 5, 2018
@coqbot coqbot added the needs: rebase Should be rebased on the latest master to solve conflicts or have a newer CI run. label Nov 5, 2018
@coqbot coqbot removed the needs: rebase Should be rebased on the latest master to solve conflicts or have a newer CI run. label Nov 5, 2018
@SkySkimmer
Contributor

Why not expose Tactics.convert as a regular tactic instead of making it tauto specific?

@ppedrot
Member Author

ppedrot commented Nov 5, 2018

I am not super fond of globally exporting tactics. I know this is the current model but this should be pondered seriously.

@ppedrot
Member Author

ppedrot commented Nov 5, 2018

Another advantage is that this PR is backportable as-is.

@ppedrot
Member Author

ppedrot commented Nov 5, 2018

Failures are spurious. I'll rebase this PR when the fiat fix lands.

@ppedrot
Member Author

ppedrot commented Nov 6, 2018

Benchmarks show an observable slowdown on some developments:

┌────────────────────────┬─────────────────────────┬─────────────────────────────────────────────┬─────────────────────────────────────────────┬───────────────────────────────┬────────────────────────────┐
│                        │      user time [s]      │                 CPU cycles                  │              CPU instructions               │     max resident mem [KB]     │         mem faults         │
│                        │                         │                                             │                                             │                               │                            │
│           package_name │     NEW     OLD PDIFF   │               NEW               OLD PDIFF   │               NEW               OLD PDIFF   │        NEW        OLD PDIFF   │     NEW     OLD    PDIFF   │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│            coq-bignums │   99.56   99.85 -0.29 % │      275601866624      275730120878 -0.05 % │      397266468927      397563173721 -0.07 % │     536792     537208 -0.08 % │     281      43  +553.49 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│     coq-mathcomp-field │  259.94  260.60 -0.25 % │      722864183063      723165366301 -0.04 % │     1095463419243     1095660982631 -0.02 % │     755076     756380 -0.17 % │       3      13   -76.92 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│    coq-formal-topology │   59.56   59.71 -0.25 % │      162946154896      163316440275 -0.23 % │      247452869847      247669714821 -0.09 % │     477540     477576 -0.01 % │       0       0     +nan % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│               coq-hott │  342.92  343.64 -0.21 % │      949089994803      949364669453 -0.03 % │     1445378545505     1445690903038 -0.02 % │     600296     600436 -0.02 % │     443       0     +nan % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│ coq-mathcomp-character │  280.76  281.31 -0.20 % │      780595911578      781889049013 -0.17 % │     1121614464239     1121815119977 -0.02 % │    1090792    1090824 -0.00 % │      37     117   -68.38 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│              coq-flocq │  595.82  596.48 -0.11 % │     1657556658154     1660128447256 -0.15 % │     2412343510398     2409381620365 +0.12 % │    1766500    1766476 +0.00 % │     591     593    -0.34 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│ coq-mathcomp-odd-order │ 1383.88 1384.53 -0.05 % │     3855260246750     3857671082624 -0.06 % │     6474120477642     6473972166857 +0.00 % │    1359156    1359300 -0.01 % │     245     193   +26.94 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│               coq-corn │ 1532.94 1533.08 -0.01 % │     4256695149556     4257318679089 -0.01 % │     6263329405502     6261598624227 +0.03 % │     833848     834024 -0.02 % │      44      44    +0.00 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│  coq-mathcomp-solvable │  216.98  216.96 +0.01 % │      602254348868      602377032582 -0.02 % │      852112715634      852237498684 -0.01 % │     861228     839304 +2.61 % │      12       1 +1100.00 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│   coq-mathcomp-algebra │  185.19  185.11 +0.04 % │      513905465886      514187882658 -0.05 % │      699536525183      699424441175 +0.02 % │     638900     639008 -0.02 % │     317      24 +1220.83 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│ coq-mathcomp-ssreflect │   67.03   66.98 +0.07 % │      184214474389      184898304003 -0.37 % │      263550074536      263748425027 -0.08 % │     530004     530248 -0.05 % │      15     109   -86.24 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│                coq-vst │ 1625.76 1624.51 +0.08 % │     4519814183987     4513764739916 +0.13 % │     6195754609276     6191017875167 +0.08 % │    2215148    2215028 +0.01 % │     914     932    -1.93 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│        coq-fiat-crypto │ 6370.14 6361.06 +0.14 % │    17716854128857    17688956617813 +0.16 % │    28930846068344    28904984753673 +0.09 % │    2470952    2455076 +0.65 % │    1714    1340   +27.91 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│  coq-mathcomp-fingroup │   83.84   83.68 +0.19 % │      231928408652      231631018790 +0.13 % │      343515292772      343627671079 -0.03 % │     583652     583780 -0.02 % │       6     162   -96.30 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│       coq-math-classes │  252.30  251.65 +0.26 % │      694269358563      693127048830 +0.16 % │      907559514462      906385268746 +0.13 % │     523648     523608 +0.01 % │      91      82   +10.98 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│              coq-color │  738.09  734.76 +0.45 % │     2045652336668     2036938037003 +0.43 % │     2466045011241     2455065980967 +0.45 % │    1365188    1369680 -0.33 % │     406     469   -13.43 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│       coq-fiat-parsers │  720.22  716.77 +0.48 % │     1999100992610     1989949462315 +0.46 % │     3037958063738     3030535869415 +0.24 % │    2889828    2910124 -0.70 % │     517     754   -31.43 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│           coq-compcert │  905.30  888.73 +1.86 % │     2509438489593     2463702050515 +1.86 % │     3597359047517     3550641154113 +1.32 % │    1326516    1326656 -0.01 % │     262     392   -33.16 % │
├────────────────────────┼─────────────────────────┼─────────────────────────────────────────────┼─────────────────────────────────────────────┼───────────────────────────────┼────────────────────────────┤
│             coq-geocoq │ 2075.01 2017.07 +2.87 % │     5764283679126     5601972462976 +2.90 % │     8286535420180     8109680021275 +2.18 % │    1221320    1239824 -1.49 % │     614     198  +210.10 % │
└────────────────────────┴─────────────────────────┴─────────────────────────────────────────────┴─────────────────────────────────────────────┴───────────────────────────────┴────────────────────────────┘

This is not good... And it also forecasts a bigger slowdown if we change Ltac non-linear variable matching to universe-unifying conversion.

@ejgallego
Member

if we change Ltac non-linear variable matching to universe-unifying conversion.

Maybe this is a good use case for an attribute on the match?

@ppedrot
Member Author

ppedrot commented Nov 6, 2018

Maybe this is a good use case for an attribute on the match?

Ltac match is already quite complex in its own right; it has quite a lot of combinatorics: multi / lazy / plain, with or without reverse... I was more in favour of deprecating non-linear matching, as there are too many possible interpretations, i.e. about as many as there are notions of equality on terms.

(Also, this is Ltac, so there are no attributes in there. Yet.)
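For reference, a runnable toy example of that combinatorics (unrelated to tauto):

    Goal True -> True -> True.
    Proof.
      intros H1 H2.
      (* plain match: tries a branch, backtracks to other matches on failure *)
      match goal with h : True |- _ => idtac "match picked" h end.
      (* lazymatch: commits to the first branch whose pattern matches *)
      lazymatch goal with h : True |- _ => idtac "lazymatch picked" h end.
      (* multimatch: every successful match is kept as a backtracking point *)
      multimatch goal with h : True |- _ => idtac "multimatch picked" h end.
      (* reverse: hypotheses are scanned in the opposite order *)
      match reverse goal with h : True |- _ => idtac "reverse picked" h end.
      (* non-linear pattern: ?A occurs twice; how its two instances are
         compared (syntactically or up to conversion) is the question at hand *)
      match goal with h : ?A, h' : ?A |- _ => idtac "non-linear" h h' end.
      exact I.
    Qed.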

@ppedrot ppedrot removed the needs: benchmarking Performance testing is required. label Nov 6, 2018
@silene silene modified the milestones: 8.9.0, 8.10+beta1 Nov 8, 2018
@mattam82 mattam82 self-assigned this Nov 16, 2018
(* generalize (id0 id1); intro; clear id0 does not work
(see Marco Maggiesi's BZ#301)
so we instead use Assert and exact. *)
is_conv X1 X3;
Member

I believe you're hitting this because you replaced a syntactic equality test with a conversion (the strategy differs between constr_matching and tactic_matching). I guess you could recover compatibility and performance by also exporting a constr_eq and using it in the axioms case above.

Member Author

I don't think so: all of the equalities here are performed by conversion because they live in tactic_matching. Actually, I first tried using syntactic equality and it broke in the stdlib because some part of the code was relying on conversion.

Member

Ah right, we call extended_matches on the conclusion and use conversion in tactic_matching to relate the two instances of the meta. So the "specific" case using constr_eq is only for non-linear occurrences in the goal, if I'm not mistaken.

Member Author

Also for non-linear occurrences within the same hypothesis, but indeed.
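To make the distinction concrete, a toy example (unrelated to tauto) of the two places a repeated metavariable can occur:

    Goal forall (P Q : Prop), P -> (P -> Q) -> Q.
    Proof.
      intros P Q p pq.
      (* ?A shared between two hypothesis patterns: the matcher itself has to
         compare the two instances, which is where constr_eq versus conversion
         makes an observable difference. *)
      match goal with h : ?A, h' : ?A -> ?B |- _ => assert B by exact (h' h) end.
      (* ?A shared between a hypothesis pattern and the conclusion: the
         conclusion goes through extended_matches and the two instances are
         related by conversion in tactic_matching, as discussed above. *)
      match goal with h : ?A |- ?A => exact h end.
    Qed.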

@ppedrot
Member Author

ppedrot commented Nov 17, 2018

@mattam82 The reason why it is slower is really that we perform universe unification instead of mere syntactic equality during conversion. It's common knowledge that UState.process_universe_constraints is horribly slow.

@mattam82
Member

Ah, I didn't realize you were calling universe unification here and not just conversion with inference of constraints, i.e. just adding a bit of inference compared to what Reductionops.is_conv in tactic_matching is doing. Did you try calling Reduction.infer_conv instead, which avoids the mumbo-jumbo with algebraics/flexibles etc?

@ppedrot
Member Author

ppedrot commented Nov 17, 2018

@mattam82 If we do that, don't we just sidestep the bug? Like, what's the reason we use the universe-unifying conversion elsewhere and don't go directly for Reduction.infer_conv? Can't we hit a case where this is needed?

@SkySkimmer
Contributor

Tactics.convert is just a wrapper around infer_conv.

@ppedrot
Member Author

ppedrot commented Nov 17, 2018

So there is no way out?

@SkySkimmer
Contributor

SkySkimmer commented Nov 17, 2018 via email

@ppedrot
Member Author

ppedrot commented Nov 17, 2018

According to my experiments this was really due to the call to universe processing, so I guess we're stuck. What should we do?

(* generalize (id0 id1); intro; clear id0 does not work
(see Marco Maggiesi's BZ#301)
so we instead use Assert and exact. *)
is_conv X1 X3;
assert X2; [exact (id0 id1) | clear id0]
Contributor

What happens if we don't check convertibility and just let it fail when it tries to exact id0 id1?

Member Author

Probably going to be worse performance-wise because now you call full-blown unification. Fast failure matters a lot.
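In other words, the alternative being asked about would drop the is_conv line from the snippet quoted above and keep only (hypothetical sketch):

    (* A mismatch between X1 and X3 now only surfaces inside exact, i.e. after
       a full unification attempt while type-checking (id0 id1). *)
    assert X2; [ exact (id0 id1) | clear id0 ]

so the failure path goes through unification rather than a cheap conversion test, which is why the explicit pre-check is kept for fast failure.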

@mattam82
Member

It's a bit tricky indeed: Tactics.convert wraps Reductionops.infer_conv, which calls universe unification, while Reduction.infer_conv has a simpler universe inference (the one used by module subtyping). The main difference between the two is that Reductionops calls process_universe_constraints directly, so it can handle algebraic and flexible universe variables that appear during typechecking, keeping universe equality information in the UState structure, while Reduction.infer_conv does not care about this and just generates constraints. However, to put these constraints in the Evd you would have to go through process_universe_constraints again, so I guess there would be no difference. The only way to improve performance would be to optimize process_universe_constraints / the UState to avoid maintaining the substitution separately from the graph. I don't immediately see how to do that while keeping the algebraic universe support, but it can maybe be done: one needs to be able to instantiate a universe variable l with an algebraic universe u.

@github-actions github-actions bot added the needs: rebase Should be rebased on the latest master to solve conflicts or have a newer CI run. label Nov 24, 2022
@coqbot-app coqbot-app bot added needs: full CI The latest GitLab pipeline that ran was a light CI. Say "@coqbot run full ci" to get a full CI. and removed needs: rebase Should be rebased on the latest master to solve conflicts or have a newer CI run. labels Dec 15, 2022
@ppedrot
Member Author

ppedrot commented Dec 21, 2022

@coqbot bench

@coqbot-app
Contributor

coqbot-app bot commented Dec 22, 2022

🏁 Bench results:

┌──────────────────────────────┬─────────────────────────┬───────────────────────────────────────┬───────────────────────────────────────┬─────────────────────────┐
│                              │      user time [s]      │              CPU cycles               │           CPU instructions            │  max resident mem [KB]  │
│                              │                         │                                       │                                       │                         │
│         package_name         │   NEW      OLD    PDIFF │      NEW             OLD        PDIFF │      NEW             OLD        PDIFF │   NEW      OLD    PDIFF │
├──────────────────────────────┼─────────────────────────┼───────────────────────────────────────┼───────────────────────────────────────┼─────────────────────────┤
│        coq-mathcomp-fingroup │   22.09    22.26  -0.76 │   100725123441    101369504557  -0.64 │   148899746552    148852400126   0.03 │  497476   496836   0.13 │
│             coq-math-classes │   92.04    92.68  -0.69 │   417193404567    419684881538  -0.59 │   595977904860    594992462997   0.17 │  511880   511112   0.15 │
│      coq-metacoq-safechecker │  212.61   213.97  -0.64 │   968378432772    974197600594  -0.60 │  1461486761212   1460358455736   0.08 │ 1419676  1433100  -0.94 │
│                 coq-coqprime │   44.73    44.95  -0.49 │   203075256925    203140636059  -0.03 │   309409390348    309247866071   0.05 │  765912   765708   0.03 │
│       coq-mathcomp-character │   66.42    66.70  -0.42 │   303366361434    304551769828  -0.39 │   464929879859    464804428372   0.03 │  744072   743516   0.07 │
│                coq-fiat-core │   59.02    59.21  -0.32 │   253835349145    253862763427  -0.01 │   370109160308    368682123383   0.39 │  488048   490960  -0.59 │
│        coq-mathcomp-solvable │   80.32    80.52  -0.25 │   366379036259    367232655634  -0.23 │   566894534258    566658127371   0.04 │  808860   809856  -0.12 │
│                     coq-hott │  160.85   161.20  -0.22 │   728082022262    729897481762  -0.25 │  1152506210986   1152509156281  -0.00 │  573240   574220  -0.17 │
│            coq-iris-examples │  482.12   482.95  -0.17 │  2188957298809   2192546837562  -0.16 │  3332664694603   3332669385231  -0.00 │ 1191344  1189088   0.19 │
│                  coq-unimath │ 1266.92  1268.24  -0.10 │  5775024041194   5783324183842  -0.14 │ 10720959033627  10721232701242  -0.00 │ 1633956  1636796  -0.17 │
│                  coq-coqutil │   35.51    35.53  -0.06 │   158871209142    158903885111  -0.02 │   227719840433    227274810162   0.20 │  557884   558000  -0.02 │
│         coq-metacoq-template │  133.30   133.31  -0.01 │   598269746354    598579734038  -0.05 │   981336212017    980306998555   0.10 │ 1281844  1281628   0.02 │
│                  coq-bignums │   28.20    28.20   0.00 │   128401047796    128309305679   0.07 │   183662899780    183706460322  -0.02 │  479436   479040   0.08 │
│       coq-mathcomp-odd-order │  422.32   422.29   0.01 │  1931530003595   1931060189820   0.02 │  3229024229187   3229321924360  -0.01 │ 1551500  1553528  -0.13 │
│                 coq-bedrock2 │  384.62   384.39   0.06 │  1757029684801   1757175388835  -0.01 │  3410781394810   3408052960834   0.08 │  937472   937704  -0.02 │
│                    coq-verdi │   47.85    47.82   0.06 │   217044244547    216259341125   0.36 │   332337780989    331532773097   0.24 │  526820   528036  -0.23 │
│        coq-engine-bench-lite │  161.23   161.09   0.09 │   693053729172    693297262770  -0.04 │  1301317317348   1301779838865  -0.04 │ 1206488  1206612  -0.01 │
│                     coq-corn │  808.00   807.05   0.12 │  3676528510548   3673365182954   0.09 │  5783643390708   5781600230898   0.04 │  833692   835388  -0.20 │
│                coq-fourcolor │ 1484.86  1482.76   0.14 │  6772279680677   6761996296851   0.15 │ 12187094150004  12187489012390  -0.00 │ 1423460  1423868  -0.03 │
│  coq-rewriter-perf-SuperFast │  726.16   724.67   0.21 │  3305785827848   3300858262093   0.15 │  5743948841872   5730983412280   0.23 │ 1306900  1307124  -0.02 │
│   coq-performance-tests-lite │  754.48   752.76   0.23 │  3418982668261   3414094435804   0.14 │  6062085574780   6060657217214   0.02 │ 1665448  1665584  -0.01 │
│                     coq-core │  108.82   108.55   0.25 │   457216055568    458224734759  -0.22 │   473026850226    472722125708   0.06 │  286584   285708   0.31 │
│             coq-fiat-parsers │  334.00   333.00   0.30 │  1500799673901   1498167365098   0.18 │  2507344452477   2500700576734   0.27 │ 3461732  3467320  -0.16 │
│          coq-category-theory │  766.48   763.66   0.37 │  3500763054960   3488527642405   0.35 │  5968836536187   5948380248262   0.34 │  911616   907564   0.45 │
│          coq-metacoq-erasure │  208.75   207.96   0.38 │   945981707353    943013581443   0.31 │  1506737110358   1507278960452  -0.04 │ 1438888  1443304  -0.31 │
│           coq-mathcomp-field │   93.17    92.74   0.46 │   425128621373    423437327390   0.40 │   704114276324    704154479181  -0.01 │  934156   934212  -0.01 │
│                coq-perennial │ 5513.69  5487.57   0.48 │ 25132611603933  25017204740251   0.46 │ 41431330480942  41421010855158   0.02 │ 2543948  2556560  -0.49 │
│         coq-mathcomp-algebra │   62.40    61.98   0.68 │   284277410937    283171647901   0.39 │   397171572611    397112062970   0.01 │  572080   575380  -0.57 │
│                 coq-rewriter │  355.61   353.13   0.70 │  1620994659083   1608382349070   0.78 │  2702131943273   2688285771521   0.52 │ 1211892  1211896  -0.00 │
│                    coq-color │  227.92   226.33   0.70 │  1032905676220   1023721882457   0.90 │  1502469091815   1492235003660   0.69 │ 1180548  1167376   1.13 │
│ coq-fiat-crypto-with-bedrock │ 6272.59  6221.64   0.82 │ 28521223082886  28293544296239   0.80 │ 52416737798681  52183897958756   0.45 │ 2828532  2822384   0.22 │
│       coq-mathcomp-ssreflect │   26.63    26.41   0.83 │   120820994734    120116442212   0.59 │   155582567394    155629008790  -0.03 │  566000   568196  -0.39 │
│            coq-metacoq-pcuic │  565.11   559.36   1.03 │  2570854442403   2545489355790   1.00 │  3760234032706   3745025523386   0.41 │ 1889400  1889048   0.02 │
│               coq-verdi-raft │  582.21   575.78   1.12 │  2654338474021   2628199869611   0.99 │  4180279598967   4153703291994   0.64 │  926844   927100  -0.03 │
│     coq-metacoq-translations │   14.68    14.46   1.52 │    65475296868     64893416739   0.90 │   105372507177    105237299410   0.13 │  742428   743720  -0.17 │
│               coq-coquelicot │   36.98    36.35   1.73 │   165488775687    162440901673   1.88 │   225939575197    223121400260   1.26 │  780616   801500  -2.61 │
│                 coq-compcert │  293.58   288.26   1.85 │  1331558319054   1307100424719   1.87 │  2031458657094   1996021368936   1.78 │ 1123488  1120596   0.26 │
│                   coq-stdlib │  431.41   421.54   2.34 │  1852315057345   1811516757091   2.25 │  1484588152161   1469390320535   1.03 │  714304   707136   1.01 │
│                   coq-geocoq │  707.77   681.30   3.89 │  3224185171735   3103964122927   3.87 │  5153156642794   5017347092123   2.71 │  997716   983776   1.42 │
└──────────────────────────────┴─────────────────────────┴───────────────────────────────────────┴───────────────────────────────────────┴─────────────────────────┘

INFO: failed to install coq-vst

🐢 Top 25 slow downs
┌────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                                 TOP 25 SLOW DOWNS                                                                  │
│                                                                                                                                                    │
│   OLD       NEW      DIFF    %DIFF    Ln                      FILE                                                                                 │
├────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│   3.4350   10.6760  7.2410  210.80%   469  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html                 │
│   3.3100   10.4110  7.1010  214.53%   435  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html                 │
│  26.7580   31.7220  4.9640   18.55%    67  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/deps/riscv-coq/src/riscv/Proofs/VerifyDecode.v.html      │
│   2.2930    5.8200  3.5270  153.82%   246  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html                 │
│   1.4720    4.7740  3.3020  224.32%   472  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html                 │
│   1.4110    4.5560  3.1450  222.89%   438  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html                 │
│   9.9220   12.5910  2.6690   26.90%   275  coq-category-theory/Theory/Metacategory.v.html                                                          │
│   2.3220    4.2790  1.9570   84.28%   193  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html                 │
│   0.9080    2.5660  1.6580  182.60%   194  coq-geocoq/Elements/OriginalProofs/proposition_46.v.html                                                │
│ 127.0690  128.7250  1.6560    1.30%    22  coq-fiat-crypto-with-bedrock/src/Rewriter/Passes/ArithWithCasts.v.html                                  │
│   6.3810    8.0020  1.6210   25.40%   420  coq-rewriter-perf-SuperFast/src/Rewriter/Rewriter/Wf.v.html                                             │
│   6.3600    7.9440  1.5840   24.91%   420  coq-rewriter/src/Rewriter/Rewriter/Wf.v.html                                                            │
│   0.9780    2.4780  1.5000  153.37%   207  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html                 │
│   0.7530    2.1080  1.3550  179.95%   325  coq-geocoq/Elements/OriginalProofs/proposition_42.v.html                                                │
│   0.6750    1.9790  1.3040  193.19%   411  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html                 │
│   9.0590   10.2870  1.2280   13.56%  1150  coq-fiat-crypto-with-bedrock/src/Assembly/WithBedrock/SymbolicProofs.v.html                             │
│   6.5700    7.3230  0.7530   11.46%  1719  coq-perennial/src/program_proof/wal/recovery_proof.v.html                                               │
│   0.4840    1.1850  0.7010  144.83%   310  coq-geocoq/Elements/OriginalProofs/proposition_42.v.html                                                │
│   0.6430    1.3430  0.7000  108.86%   177  coq-geocoq/Elements/OriginalProofs/proposition_45.v.html                                                │
│ 163.6160  164.2800  0.6640    0.41%   232  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/deps/riscv-coq/src/riscv/Proofs/DecodeByExtension.v.html │
│  40.4070   41.0430  0.6360    1.57%   222  coq-performance-tests-lite/PerformanceExperiments/rewrite_lift_lets_map.v.html                          │
│  31.2890   31.8880  0.5990    1.91%    10  coq-fourcolor/theories/job563to588.v.html                                                               │
│   0.3800    0.9610  0.5810  152.89%   163  coq-geocoq/Elements/OriginalProofs/proposition_30.v.html                                                │
│   0.3800    0.9520  0.5720  150.53%   222  coq-geocoq/Elements/OriginalProofs/proposition_30.v.html                                                │
│   0.3760    0.9460  0.5700  151.60%   159  coq-geocoq/Elements/OriginalProofs/proposition_30.v.html                                                │
└────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘

🐇 Top 25 speed ups
┌───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                               TOP 25 SPEED UPS                                                                │
│                                                                                                                                               │
│   OLD       NEW      DIFF     %DIFF     Ln                     FILE                                                                           │
├───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│  41.7650   41.1560  -0.6090    -1.46%   235  coq-rewriter-perf-SuperFast/src/Rewriter/Rewriter/Examples/PerfTesting/LiftLetsMap.v.html        │
│  26.0670   25.6560  -0.4110    -1.58%    10  coq-fourcolor/theories/job495to498.v.html                                                        │
│  21.7880   21.3840  -0.4040    -1.85%    10  coq-fourcolor/theories/job507to510.v.html                                                        │
│   2.4920    2.1270  -0.3650   -14.65%  1775  coq-perennial/src/program_proof/wal/recovery_proof.v.html                                        │
│  23.5940   23.2760  -0.3180    -1.35%    10  coq-fourcolor/theories/job542to545.v.html                                                        │
│  62.5580   62.2620  -0.2960    -0.47%   137  coq-fiat-parsers/src/Parsers/Refinement/SharpenedJSON.v.html                                     │
│ 128.1360  127.8630  -0.2730    -0.21%   692  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html    │
│   0.2860    0.0140  -0.2720   -95.10%   253  coq-fiat-crypto-with-bedrock/src/Assembly/WithBedrock/SymbolicProofs.v.html                      │
│  23.6280   23.3650  -0.2630    -1.11%    10  coq-fourcolor/theories/job511to516.v.html                                                        │
│  17.3940   17.1450  -0.2490    -1.43%   962  coq-unimath/UniMath/CategoryTheory/Monoidal/DisplayedCartesianMonoidalCategoriesWhiskered.v.html │
│  22.6570   22.4350  -0.2220    -0.98%    23  coq-fiat-crypto-with-bedrock/src/Rewriter/Passes/Arith.v.html                                    │
│  35.2060   34.9850  -0.2210    -0.63%   445  coq-unimath/UniMath/SyntheticHomotopyTheory/Circle2.v.html                                       │
│   0.2690    0.0670  -0.2020   -75.09%   260  coq-metacoq-pcuic/pcuic/theories/PCUICContextReduction.v.html                                    │
│  10.0850    9.8890  -0.1960    -1.94%    78  coq-rewriter-perf-SuperFast/src/Rewriter/Rewriter/Examples/PerfTesting/LiftLetsMap.v.html        │
│ 123.3340  123.1380  -0.1960    -0.16%    47  coq-fiat-crypto-with-bedrock/src/Curves/Weierstrass/AffineProofs.v.html                          │
│  22.2490   22.0560  -0.1930    -0.87%    10  coq-fourcolor/theories/job546to549.v.html                                                        │
│   0.2050    0.0150  -0.1900   -92.68%   341  coq-metacoq-safechecker/safechecker/theories/PCUICTypeChecker.v.html                             │
│  23.1170   22.9290  -0.1880    -0.81%    10  coq-fourcolor/theories/job307to310.v.html                                                        │
│   0.2010    0.0160  -0.1850   -92.04%   413  coq-metacoq-safechecker/safechecker/theories/PCUICTypeChecker.v.html                             │
│   0.1850    0.0010  -0.1840   -99.46%   496  coq-metacoq-safechecker/safechecker/theories/PCUICTypeChecker.v.html                             │
│   2.3200    2.1360  -0.1840    -7.93%    30  coq-fiat-crypto-with-bedrock/src/Assembly/Parse/TestAsm.v.html                                   │
│   0.2060    0.0230  -0.1830   -88.83%   487  coq-color/Term/Lambda/LCompClos.v.html                                                           │
│ 127.8680  127.6870  -0.1810    -0.14%   693  coq-bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html                                      │
│  27.7710   27.5920  -0.1790    -0.64%    10  coq-fourcolor/theories/job223to226.v.html                                                        │
│   0.1730    0.0000  -0.1730  -100.00%   189  coq-metacoq-pcuic/pcuic/theories/PCUICSpine.v.html                                               │
└───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘

@ppedrot ppedrot added the request: full CI Use this label when you want your next push to trigger a full CI. label Mar 14, 2023
@coqbot-app coqbot-app bot removed request: full CI Use this label when you want your next push to trigger a full CI. needs: full CI The latest GitLab pipeline that ran was a light CI. Say "@coqbot run full ci" to get a full CI. labels Mar 14, 2023
@ppedrot
Member Author

ppedrot commented Mar 14, 2023

@coqbot bench

@coqbot-app
Contributor

coqbot-app bot commented Mar 15, 2023

🏁 Bench results:

┌──────────────────────────────┬─────────────────────────┬───────────────────────────────────────┬───────────────────────────────────────┬─────────────────────────┐
│                              │      user time [s]      │              CPU cycles               │           CPU instructions            │  max resident mem [KB]  │
│                              │                         │                                       │                                       │                         │
│         package_name         │   NEW      OLD    PDIFF │      NEW             OLD        PDIFF │      NEW             OLD        PDIFF │   NEW      OLD    PDIFF │
├──────────────────────────────┼─────────────────────────┼───────────────────────────────────────┼───────────────────────────────────────┼─────────────────────────┤
│               coq-coquelicot │   36.69    37.12  -1.16 │   162185878132    163354469960  -0.72 │   224266828537    225887879747  -0.72 │  768916   793200  -3.06 │
│               coq-verdi-raft │  560.52   566.85  -1.12 │  2538541918466   2567688369079  -1.14 │  3998691625801   4042407961233  -1.08 │  817028   816288   0.09 │
│                  coq-bignums │   28.10    28.40  -1.06 │   126161505845    127343851863  -0.93 │   183202077335    183292760904  -0.05 │  486508   487476  -0.20 │
│     coq-metacoq-translations │   14.73    14.88  -1.01 │    65001438077     64937915637   0.10 │   105548032161    105210935209   0.32 │  751472   750384   0.14 │
│                 coq-compcert │  287.91   290.78  -0.99 │  1287615047978   1299510201103  -0.92 │  1976811061595   1987831233223  -0.55 │ 1112296  1112724  -0.04 │
│  coq-rewriter-perf-SuperFast │  725.56   730.53  -0.68 │  3231462198508   3254215893480  -0.70 │  5665083505845   5677366969808  -0.22 │ 1425064  1307616   8.98 │
│            coq-metacoq-pcuic │  554.81   558.46  -0.65 │  2497095032968   2512928972969  -0.63 │  3708187678537   3712219849553  -0.11 │ 1902716  1900352   0.12 │
│                    coq-color │  228.38   229.86  -0.64 │  1017292854067   1021397588589  -0.40 │  1486541255781   1489350214051  -0.19 │ 1153816  1153892  -0.01 │
│             coq-fiat-parsers │  331.49   333.49  -0.60 │  1458418366922   1467518211894  -0.62 │  2451603064882   2453276744414  -0.07 │ 2730868  2731036  -0.01 │
│                 coq-rewriter │  350.70   352.61  -0.54 │  1578669672383   1587516374798  -0.56 │  2646900380464   2662453195726  -0.58 │ 1332532  1332596  -0.00 │
│         coq-metacoq-template │  131.81   132.33  -0.39 │   584501461843    586640894439  -0.36 │   957445241419    959070307859  -0.17 │ 1294836  1294500   0.03 │
│                 coq-coqprime │   46.05    46.20  -0.32 │   203951858074    205123165012  -0.57 │   312930015858    312949117779  -0.01 │  777304   773996   0.43 │
│        coq-engine-bench-lite │  156.79   157.30  -0.32 │   665881923652    668376469289  -0.37 │  1236029575440   1240866477654  -0.39 │ 1103836  1065448   3.60 │
│          coq-category-theory │  722.99   724.96  -0.27 │  3277484314519   3287178542444  -0.29 │  5667054561917   5667202864732  -0.00 │  863488   928632  -7.02 │
│        coq-mathcomp-solvable │   77.18    77.39  -0.27 │   348390843288    348857676220  -0.13 │   540071128300    540216318364  -0.03 │  772844   773504  -0.09 │
│                coq-fourcolor │ 1518.31  1521.08  -0.18 │  6700109640321   6706693718460  -0.10 │ 12165573120315  12163650364361   0.02 │ 1423296  1423232   0.00 │
│                     coq-core │  112.59   112.79  -0.18 │   446992033372    445828794659   0.26 │   479810250911    479169473955   0.13 │  289496   287148   0.82 │
│       coq-mathcomp-character │   63.71    63.81  -0.16 │   289148041500    288834371695   0.11 │   444000367621    443913799286   0.02 │  700920   701444  -0.07 │
│ coq-fiat-crypto-with-bedrock │ 6151.14  6158.16  -0.11 │ 27642276638130  27673376494479  -0.11 │ 51201283217519  51228147367067  -0.05 │ 2423488  2407428   0.67 │
│                     coq-corn │  791.82   792.52  -0.09 │  3567622717406   3569325442398  -0.05 │  5572731344770   5572505502932   0.00 │  773080   770504   0.33 │
│       coq-mathcomp-ssreflect │   27.57    27.59  -0.07 │   123224454369    123467638639  -0.20 │   160794649707    160757496754   0.02 │  573716   576872  -0.55 │
│                   coq-stdlib │  413.72   413.70   0.00 │  1748314760505   1744800109456   0.20 │  1453818986744   1456169684574  -0.16 │  652032   658284  -0.95 │
│                  coq-unimath │ 1372.52  1371.86   0.05 │  6200500667239   6198099257529   0.04 │ 11653130106250  11653774817560  -0.01 │ 1552736  1549072   0.24 │
│                      coq-vst │  875.68   875.21   0.05 │  3938671981613   3936115527179   0.06 │  6531806995843   6526531721154   0.08 │ 2159868  2159780   0.00 │
│          coq-metacoq-erasure │  203.73   203.54   0.09 │   911730091601    912511895714  -0.09 │  1472419309644   1475173993583  -0.19 │ 1472108  1487788  -1.05 │
│                    coq-verdi │   47.65    47.60   0.11 │   213161036506    212589957384   0.27 │   326139023506    326370027779  -0.07 │  525020   526752  -0.33 │
│      coq-metacoq-safechecker │  209.93   209.70   0.11 │   948098775453    947395692940   0.07 │  1443192179146   1443738068689  -0.04 │ 1408644  1407252   0.10 │
│       coq-mathcomp-odd-order │  408.60   408.15   0.11 │  1856434577084   1855051549022   0.07 │  3112022230437   3113201978772  -0.04 │ 1570036  1570232  -0.01 │
│                   coq-geocoq │  607.58   606.84   0.12 │  2738911111667   2735988670539   0.11 │  4318051249825   4328585888683  -0.24 │  902800   902860  -0.01 │
│         coq-mathcomp-algebra │   62.06    61.98   0.13 │   280803117240    279933659367   0.31 │   391473978982    391518139071  -0.01 │  577672   577264   0.07 │
│                     coq-hott │  152.91   152.65   0.17 │   682746911148    680681734017   0.30 │  1086776296291   1086906716353  -0.01 │  627328   625644   0.27 │
│                coq-fiat-core │   60.66    60.54   0.20 │   255554164252    255881512633  -0.13 │   376492310762    376680996649  -0.05 │  489884   491424  -0.31 │
│   coq-performance-tests-lite │  758.86   757.19   0.22 │  3378380891936   3370168663021   0.24 │  6011313021498   6010298861014   0.02 │ 1665516  1665548  -0.00 │
│                 coq-bedrock2 │  312.51   311.78   0.23 │  1408218187723   1404520319781   0.26 │  2785821813585   2785960440279  -0.00 │  877976   877660   0.04 │
│                coq-perennial │ 5286.17  5263.98   0.42 │ 23892193471048  23791851952376   0.42 │ 39438671328550  39419246398440   0.05 │ 1956880  1962484  -0.29 │
│                  coq-coqutil │   37.78    37.62   0.43 │   166685761992    166266675959   0.25 │   240162944038    240106294729   0.02 │  558112   558060   0.01 │
│             coq-math-classes │   86.16    85.79   0.43 │   384194105135    382733377757   0.38 │   537488896373    538094997942  -0.11 │  519632   518356   0.25 │
│        coq-mathcomp-fingroup │   21.95    21.85   0.46 │    98746931093     98216337602   0.54 │   145241088703    145206645494   0.02 │  493844   492540   0.26 │
│           coq-mathcomp-field │   78.59    78.21   0.49 │   355471379363    353933375196   0.43 │   581400538755    581242816190   0.03 │  903460   903532  -0.01 │
└──────────────────────────────┴─────────────────────────┴───────────────────────────────────────┴───────────────────────────────────────┴─────────────────────────┘

INFO: failed to install coq-iris-examples

🐢 Top 25 slow downs
┌──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                                  TOP 25 SLOW DOWNS                                                                   │
│                                                                                                                                                      │
│   OLD       NEW      DIFF     %DIFF     Ln                      FILE                                                                                 │
├──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│  20.5100   23.2260  2.7160     13.24%  1354  coq-perennial/src/program_proof/wal/installer_proof.v.html                                              │
│ 157.4790  158.8310  1.3520      0.86%   232  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/deps/riscv-coq/src/riscv/Proofs/DecodeByExtension.v.html │
│  80.3550   81.1670  0.8120      1.01%   617  coq-bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html                                             │
│  40.6840   41.3270  0.6430      1.58%    84  coq-fiat-crypto-with-bedrock/src/Curves/Montgomery/AffineProofs.v.html                                  │
│   3.4220    3.8790  0.4570     13.35%   631  coq-perennial/src/program_proof/wal/installer_proof.v.html                                              │
│ 126.9790  127.3970  0.4180      0.33%    22  coq-fiat-crypto-with-bedrock/src/Rewriter/Passes/ArithWithCasts.v.html                                  │
│   1.5980    1.9650  0.3670     22.97%  1294  coq-perennial/src/program_proof/aof/proof.v.html                                                        │
│  22.1120   22.4590  0.3470      1.57%    10  coq-fourcolor/theories/job490to494.v.html                                                               │
│  17.3910   17.7190  0.3280      1.89%   874  coq-perennial/src/program_proof/simple/setattr.v.html                                                   │
│   0.4600    0.7800  0.3200     69.57%  1903  coq-perennial/src/program_proof/wal/recovery_proof.v.html                                               │
│  25.9050   26.2240  0.3190      1.23%  2292  coq-perennial/src/goose_lang/logical_reln_fund.v.html                                                   │
│  28.1630   28.4620  0.2990      1.06%    10  coq-fourcolor/theories/job291to294.v.html                                                               │
│   0.3250    0.6230  0.2980     91.69%   228  coq-fiat-crypto-with-bedrock/src/PushButtonSynthesis/SolinasReduction.v.html                            │
│  17.7670   18.0520  0.2850      1.60%     6  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/deps/riscv-coq/src/riscv/Proofs/DecodeEncodeCSR.v.html   │
│ 114.9880  115.2730  0.2850      0.25%    47  coq-fiat-crypto-with-bedrock/src/Curves/Weierstrass/AffineProofs.v.html                                 │
│   5.7870    6.0630  0.2760      4.77%    20  coq-fiat-crypto-with-bedrock/src/Language/IdentifiersGENERATEDProofs.v.html                             │
│  13.3240   13.5730  0.2490      1.87%   185  coq-perennial/src/goose_lang/interpreter/disk_interpreter.v.html                                        │
│   0.5520    0.7980  0.2460     44.57%   945  coq-vst/veric/binop_lemmas2.v.html                                                                      │
│  25.0430   25.2820  0.2390      0.95%   371  coq-unimath/UniMath/CategoryTheory/GrothendieckConstruction/IsPullback.v.html                           │
│  26.2640   26.4940  0.2300      0.88%    10  coq-fourcolor/theories/job495to498.v.html                                                               │
│  20.4760   20.7050  0.2290      1.12%   812  coq-perennial/src/program_proof/wal/logger_proof.v.html                                                 │
│  20.8010   21.0260  0.2250      1.08%  2060  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlatToRiscvFunctions.v.html        │
│   0.0010    0.2240  0.2230  22300.00%   483  coq-fiat-crypto-with-bedrock/src/Assembly/WithBedrock/SymbolicProofs.v.html                             │
│   0.7530    0.9760  0.2230     29.61%    26  coq-fiat-crypto-with-bedrock/src/PushButtonSynthesis/DettmanMultiplication.v.html                       │
│  22.0640   22.2830  0.2190      0.99%     6  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/deps/riscv-coq/src/riscv/Proofs/DecodeEncodeI.v.html     │
└──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘

🐇 Top 25 speed ups
┌─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                              TOP 25 SPEED UPS                                                               │
│                                                                                                                                             │
│   OLD      NEW     DIFF     %DIFF    Ln                     FILE                                                                            │
├─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│  6.3830   5.1070  -1.2760  -19.99%   420  coq-rewriter-perf-SuperFast/src/Rewriter/Rewriter/Wf.v.html                                       │
│  6.3790   5.1330  -1.2460  -19.53%   420  coq-rewriter/src/Rewriter/Rewriter/Wf.v.html                                                      │
│  8.6020   7.8910  -0.7110   -8.27%  1150  coq-fiat-crypto-with-bedrock/src/Assembly/WithBedrock/SymbolicProofs.v.html                       │
│ 29.1710  28.5340  -0.6370   -2.18%    10  coq-fourcolor/theories/job499to502.v.html                                                         │
│ 35.1390  34.5190  -0.6200   -1.76%    10  coq-fourcolor/theories/job001to106.v.html                                                         │
│  3.5070   2.9130  -0.5940  -16.94%   443  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html           │
│ 55.2470  54.6910  -0.5560   -1.01%   911  coq-fiat-crypto-with-bedrock/src/Bedrock/End2End/X25519/GarageDoor.v.html                         │
│  3.5530   2.9980  -0.5550  -15.62%   477  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html           │
│ 33.6260  33.1030  -0.5230   -1.56%    10  coq-fourcolor/theories/job589to610.v.html                                                         │
│ 41.7890  41.3360  -0.4530   -1.08%   235  coq-rewriter-perf-SuperFast/src/Rewriter/Rewriter/Examples/PerfTesting/LiftLetsMap.v.html         │
│ 80.5380  80.0960  -0.4420   -0.55%   617  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html     │
│  2.4640   2.0270  -0.4370  -17.74%   254  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html           │
│  5.0500   4.6140  -0.4360   -8.63%  1341  coq-perennial/src/program_proof/aof/proof.v.html                                                  │
│  8.2880   7.8560  -0.4320   -5.21%  1669  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlatToRiscvFunctions.v.html  │
│ 19.9310  19.5240  -0.4070   -2.04%    10  coq-fourcolor/theories/job623to633.v.html                                                         │
│ 28.3860  28.0200  -0.3660   -1.29%    78  coq-rewriter-perf-SuperFast/src/Rewriter/Rewriter/Examples/PerfTesting/SieveOfEratosthenes.v.html │
│ 62.6510  62.3020  -0.3490   -0.56%   137  coq-fiat-parsers/src/Parsers/Refinement/SharpenedJSON.v.html                                      │
│  0.4440   0.0950  -0.3490  -78.60%  1314  coq-perennial/src/program_proof/aof/proof.v.html                                                  │
│  4.4710   4.1240  -0.3470   -7.76%   129  coq-category-theory/Functor/Strong/Product.v.html                                                 │
│ 38.9130  38.5670  -0.3460   -0.89%   833  coq-fiat-crypto-with-bedrock/src/Fancy/Compiler.v.html                                            │
│  4.1110   3.7780  -0.3330   -8.10%   444  coq-perennial/src/program_proof/wal/circ_proof_crash.v.html                                       │
│ 34.6910  34.3690  -0.3220   -0.93%    10  coq-fourcolor/theories/job165to189.v.html                                                         │
│  1.5500   1.2360  -0.3140  -20.26%   480  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlattenExpr.v.html           │
│  1.5940   1.2820  -0.3120  -19.57%  1869  coq-perennial/src/program_proof/wal/recovery_proof.v.html                                         │
│ 30.9980  30.7030  -0.2950   -0.95%  1447  coq-unimath/UniMath/CategoryTheory/EnrichedCats/Examples/KleisliEnriched.v.html                   │
└─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘

@ppedrot
Member Author

ppedrot commented Mar 15, 2023

My last trick to reduce the slowdown seems to have worked: I prevent generating gazillions of case analyses by hardcoding the behaviour of the corresponding match goal rule in the exposed OCaml tactic.
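Schematically, the per-pair enumeration previously performed by match goal (the rule with is_conv sketched in the PR description) is replaced by a single call to an OCaml-implemented tactic that walks the hypotheses and performs the convertibility test itself; the tactic name below is purely illustrative:

    (* Hypothetical sketch: one built-in tactic instead of one match-goal
       branch per candidate pair of hypotheses. The OCaml side scans the
       context, checks convertibility of the argument types, and only then
       performs the assert/exact/clear step, so no trivial Ltac branches
       are generated. *)
    apply_arrow_hyp.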

Instead of using non-linear pattern-matching in tauto, we use
a tauto-specific conversion function so as to add the missing universe
constraints. We cannot use constr_eq as the current implementation of
non-linear matching uses conversion rather than syntactic equality.
Instead of doing it in the branch after having pattern-matched on types
that look like functions, we perform the match directly within the built-in
primitive. Hopefully this should cut many trivial branches in match goal.
@ppedrot ppedrot added request: full CI Use this label when you want your next push to trigger a full CI. and removed needs: progress Work in progress: awaiting action from the author. labels Mar 15, 2023
@coqbot-app coqbot-app bot removed the request: full CI Use this label when you want your next push to trigger a full CI. label Mar 15, 2023
@ppedrot ppedrot marked this pull request as ready for review March 15, 2023 10:44
@SkySkimmer SkySkimmer assigned SkySkimmer and unassigned mattam82 Mar 18, 2023
@SkySkimmer SkySkimmer added this to the 8.18+rc1 milestone Mar 18, 2023
@SkySkimmer
Contributor

@coqbot merge now

@coqbot-app coqbot-app bot merged commit f795833 into coq:master Mar 18, 2023
@ppedrot ppedrot deleted the fix-5351 branch March 19, 2023 10:24