
Reorder Z.euclidean_division_equations_cleanup #18818

Merged
merged 1 commit into coq:master from faster-zify on Mar 21, 2024

Conversation


@JasonGross JasonGross commented Mar 18, 2024

We first destruct, then clear, then specialize. The reasoning is:

  • `destruct` before `clear`, because `clear` will make little difference to `destruct`, while `destruct` may expose more opportunities for clearing
  • `clear` between `destruct` and `specialize`, so that we don't waste time modifying the goal (via `specialize`) with hypotheses we would want to clear anyway

Fixes #18770

  • Added changelog
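As an illustrative sketch only (this is not the actual stdlib code; the tactic name `cleanup_sketch` is made up, and the two patterns are abbreviated versions of ones appearing in the diff), the ordering described above looks roughly like:

```coq
Require Import ZArith.
Open Scope Z_scope.

(* Hypothetical sketch: the real tactic is Z.euclidean_division_equations_cleanup. *)
Ltac cleanup_sketch :=
  (* 1. destruct first: splitting 0 <= x < y conjunctions may expose clearable hyps *)
  repeat match goal with
         | [ H : 0 <= ?x < _ |- _ ] => destruct H
         end;
  (* 2. then clear: drop redundant hypotheses before paying for specialize on them *)
  repeat match goal with
         | [ H : ?A -> ?x = ?y -> _, H' : ?y < ?x |- _ ] => clear H
         end.
(* 3. finally, specialize the surviving implications (elided in this sketch). *)
```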

@JasonGross JasonGross added kind: fix This fixes a bug or incorrect documentation. kind: performance Improvements to performance and efficiency. part: standard library The standard library stdlib. part: micromega The lia, nia, lra, nra and psatz tactics. Also the legacy omega tactic. labels Mar 18, 2024
@JasonGross JasonGross requested a review from a team as a code owner March 18, 2024 21:37
@coqbot-app coqbot-app bot added the needs: full CI The latest GitLab pipeline that ran was a light CI. Say "@coqbot run full ci" to get a full CI. label Mar 18, 2024
@JasonGross

@coqbot bench


coqbot-app bot commented Mar 18, 2024

Got more than one checkSuite.

@JasonGross

@coqbot bench

1 similar comment
@JasonGross

@coqbot bench


coqbot-app bot commented Mar 19, 2024

🏁 Bench results:

┌─────────────────────────────────────┬─────────────────────────┬───────────────────────────────────────┬───────────────────────────────────────┬─────────────────────────┐
│                                     │      user time [s]      │              CPU cycles               │           CPU instructions            │  max resident mem [KB]  │
│                                     │                         │                                       │                                       │                         │
│            package_name             │   NEW      OLD    PDIFF │      NEW             OLD        PDIFF │      NEW             OLD        PDIFF │   NEW      OLD    PDIFF │
├─────────────────────────────────────┼─────────────────────────┼───────────────────────────────────────┼───────────────────────────────────────┼─────────────────────────┤
│          coq-performance-tests-lite │  708.43   746.97  -5.16 │  3171347541587   3349671348973  -5.32 │  5620800918285   5871604873884  -4.27 │ 1688528  1688324   0.01 │
│                        coq-bedrock2 │  381.26   394.97  -3.47 │  1722179864945   1785419606694  -3.54 │  3343623897210   3467948666772  -3.58 │  940692   937552   0.33 │
│                           coq-color │  239.50   242.17  -1.10 │  1068178757698   1078805828558  -0.99 │  1572645408007   1572557553648   0.01 │ 1234416  1237924  -0.28 │
│                  coq-mathcomp-field │  142.07   143.24  -0.82 │   641482118960    646826537303  -0.83 │  1063761786412   1063708845574   0.00 │ 1354512  1357144  -0.19 │
│                    coq-math-classes │   86.50    87.08  -0.67 │   386598422416    388618396167  -0.52 │   534989406317    534957894056   0.01 │  516436   514496   0.38 │
│            coq-metacoq-translations │   17.13    17.24  -0.64 │    75255742721     75693350359  -0.58 │   124094754667    124086290979   0.01 │  848276   846200   0.25 │
│                         coq-coqutil │   43.12    43.34  -0.51 │   188983324702    189247983371  -0.14 │   270915529690    271321374726  -0.15 │  568220   567608   0.11 │
│                   coq-metacoq-pcuic │  793.28   797.30  -0.50 │  3579423385679   3596798482942  -0.48 │  5393034390709   5393227932111  -0.00 │ 2365320  2363716   0.07 │
│                      coq-verdi-raft │  589.36   592.20  -0.48 │  2672849901002   2684740579980  -0.44 │  4215499118331   4215565936244  -0.00 │  861980   862136  -0.02 │
│                         coq-unimath │ 2414.66  2423.99  -0.38 │ 10945612164133  10988489181733  -0.39 │ 21678373172460  21679204161542  -0.00 │ 1279432  1279596  -0.01 │
│                             coq-vst │  878.64   881.72  -0.35 │  3967143754426   3981972971316  -0.37 │  6723919478637   6723894974343   0.00 │ 2273916  2271248   0.12 │
│                 coq-category-theory │  763.66   766.27  -0.34 │  3459625112433   3471785360270  -0.35 │  5790494062019   5790074699636   0.01 │  975620   975604   0.00 │
│                            coq-hott │  155.99   156.51  -0.33 │   691598082857    691434186715   0.02 │  1097952198154   1097764586246   0.02 │  548112   547964   0.03 │
│                coq-mathcomp-algebra │  280.40   281.33  -0.33 │  1268053840641   1272139694031  -0.32 │  2104849220632   2104848909079   0.00 │ 1353764  1354776  -0.07 │
│                       coq-fourcolor │ 1361.26  1365.53  -0.31 │  6186944642148   6204603054932  -0.28 │ 12150240283265  12150582099894  -0.00 │ 2127612  2127660  -0.00 │
│        coq-fiat-crypto-with-bedrock │ 5881.62  5897.49  -0.27 │ 26633568573168  26708954817382  -0.28 │ 48607814021785  48784011391040  -0.36 │ 3246136  3245980   0.00 │
│                    coq-fiat-parsers │  314.88   315.60  -0.23 │  1395742376461   1396509149909  -0.05 │  2433852140895   2433823134649   0.00 │ 2838560  2838488   0.00 │
│                      coq-coquelicot │   39.91    40.00  -0.23 │   176963812823    176811618499   0.09 │   250086230932    250014759209   0.03 │  855260   852820   0.29 │
│                 coq-metacoq-erasure │  509.12   509.70  -0.11 │  2305540336038   2307883899531  -0.10 │  3599379042097   3599287156221   0.00 │ 1793004  1802532  -0.53 │
│                        coq-compcert │  284.08   284.29  -0.07 │  1275649142496   1278397318437  -0.21 │  1947887712607   1947810548890   0.00 │ 1105376  1105644  -0.02 │
│                           coq-verdi │   50.15    50.15   0.00 │   224406706787    224887035595  -0.21 │   344567271684    344568022375  -0.00 │  529680   530788  -0.21 │
│                        coq-rewriter │  402.69   402.54   0.04 │  1814287237093   1813475413083   0.04 │  3016831859644   3016897315532  -0.00 │ 1481032  1482976  -0.13 │
│                            coq-core │  130.10   130.02   0.06 │   504808845489    504398321790   0.08 │   533897032949    533899413042  -0.00 │  443764   444944  -0.27 │
│                                 coq │  728.80   728.28   0.07 │  3061240939380   3060454256148   0.03 │  5443405958392   5454558538412  -0.20 │ 2283188  2317300  -1.47 │
│                            coq-corn │  735.00   734.30   0.10 │  3321473409889   3316402177307   0.15 │  5164792187767   5164965730143  -0.00 │  773492   773340   0.02 │
│                coq-metacoq-template │  149.61   149.34   0.18 │   658232380644    658745670572  -0.08 │  1030600070391   1030526376298   0.01 │ 1168828  1166580   0.19 │
│ coq-neural-net-interp-computed-lite │  296.73   296.04   0.23 │  1354315158979   1349232619983   0.38 │  3420897164330   3421012254759  -0.00 │ 1109176  1109364  -0.02 │
│               coq-engine-bench-lite │  157.38   156.91   0.30 │   660257421862    658494719465   0.27 │  1226423771283   1223330498103   0.25 │ 1206820  1207104  -0.02 │
│              coq-mathcomp-ssreflect │  113.22   112.88   0.30 │   511252145218    510968033861   0.06 │   804161864704    804272256895  -0.01 │ 1364704  1364820  -0.01 │
│                          coq-stdlib │  371.74   370.51   0.33 │  1564638066154   1561424549325   0.21 │  1339001000906   1338949423331   0.00 │  713904   715324  -0.20 │
│             coq-metacoq-safechecker │  421.18   419.74   0.34 │  1922614269354   1915393684381   0.38 │  3203155724983   3203297950860  -0.00 │ 1862112  1861416   0.04 │
│               coq-mathcomp-fingroup │   31.10    30.99   0.35 │   141104228340    140817702423   0.20 │   210773186182    210743528014   0.01 │  564868   568048  -0.56 │
│                        coq-coqprime │   48.91    48.72   0.39 │   217449133328    217835418279  -0.18 │   332684384211    332670034862   0.00 │  788940   789392  -0.06 │
│         coq-rewriter-perf-SuperFast │  807.97   804.30   0.46 │  3602521796601   3590502945629   0.33 │  6246525928064   6248347003022  -0.03 │ 1593768  1591252   0.16 │
│               coq-mathcomp-solvable │  119.82   119.16   0.55 │   546280010281    542479718271   0.70 │   860042916848    860050127406  -0.00 │  858180   858724  -0.06 │
│                       coq-fiat-core │   61.31    60.96   0.57 │   253105321130    252937334068   0.07 │   371013834189    371059926306  -0.01 │  480052   482000  -0.40 │
│              coq-mathcomp-odd-order │  780.43   774.88   0.72 │  3562573062492   3536909075500   0.73 │  6032277231759   6032180873250   0.00 │ 1613260  1613348  -0.01 │
│                   coq-iris-examples │  468.79   465.19   0.77 │  2121445138452   2106563884875   0.71 │  3258818485492   3258853534140  -0.00 │ 1110532  1111724  -0.11 │
│                         coq-bignums │   30.16    29.92   0.80 │   136084526358    134747775619   0.99 │   193429396444    193441162342  -0.01 │  480376   481984  -0.33 │
│              coq-mathcomp-character │  106.06   105.12   0.89 │   481639161073    478069514348   0.75 │   753190978283    753222654975  -0.00 │ 1027640  1027912  -0.03 │
│                       coq-equations │    7.76     7.56   2.65 │    31179416252     31248742885  -0.22 │    50509046347     50518076396  -0.02 │  388672   388644   0.01 │
└─────────────────────────────────────┴─────────────────────────┴───────────────────────────────────────┴───────────────────────────────────────┴─────────────────────────┘

🐢 Top 25 slow downs
┌──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                              TOP 25 SLOW DOWNS                                                               │
│                                                                                                                                              │
│   OLD       NEW      DIFF    %DIFF     Ln                     FILE                                                                           │
├──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│  94.1990   95.5150  1.3160     1.40%   103  coq-fiat-crypto-with-bedrock/src/Arithmetic/BarrettReduction.v.html                              │
│ 254.7720  255.8970  1.1250     0.44%     8  coq-neural-net-interp-computed-lite/theories/MaxOfTwoNumbersSimpler/Computed/AllLogits.v.html    │
│  16.9140   17.5100  0.5960     3.52%   607  coq-mathcomp-odd-order/theories/PFsection9.v.html                                                │
│   0.9320    1.4430  0.5110    54.83%   854  coq-stdlib/FSets/FMapAVL.v.html                                                                  │
│  63.1260   63.5550  0.4290     0.68%   609  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html    │
│  81.8340   82.2310  0.3970     0.49%    48  coq-fiat-crypto-with-bedrock/src/Curves/Weierstrass/AffineProofs.v.html                          │
│  24.3530   24.7290  0.3760     1.54%    85  coq-fiat-crypto-with-bedrock/src/Curves/Montgomery/AffineProofs.v.html                           │
│   4.7850    5.1590  0.3740     7.82%  1258  coq-mathcomp-odd-order/theories/PFsection9.v.html                                                │
│  41.7960   42.1660  0.3700     0.89%   139  coq-fiat-parsers/src/Parsers/Refinement/SharpenedJSON.v.html                                     │
│  17.5660   17.9310  0.3650     2.08%    31  coq-engine-bench-lite/coq/PerformanceDemos/pattern.v.html                                        │
│  33.8170   34.1430  0.3260     0.96%  1333  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlatToRiscvFunctions.v.html │
│  10.2480   10.5550  0.3070     3.00%   279  coq-category-theory/Theory/Metacategory.v.html                                                   │
│   0.7860    1.0570  0.2710    34.48%   816  coq-stdlib/MSets/MSetRBT.v.html                                                                  │
│   0.8380    1.0910  0.2530    30.19%   384  coq-stdlib/MSets/MSetAVL.v.html                                                                  │
│   0.0060    0.2380  0.2320  3866.67%   522  coq-fiat-crypto-with-bedrock/src/Bedrock/Field/Synthesis/New/UnsaturatedSolinas.v.html           │
│   0.0390    0.2680  0.2290   587.18%   376  coq-fiat-crypto-with-bedrock/src/Bedrock/Field/Synthesis/New/UnsaturatedSolinas.v.html           │
│  27.6390   27.8680  0.2290     0.83%  1933  coq-fiat-crypto-with-bedrock/src/Bedrock/End2End/RupicolaCrypto/ChaCha20.v.html                  │
│  10.5960   10.8060  0.2100     1.98%   186  coq-unimath/UniMath/AlgebraicTheories/RepresentationTheorem.v.html                               │
│  97.6890   97.8980  0.2090     0.21%   376  coq-unimath/UniMath/ModelCategories/Generated/LNWFSMonoidalStructure.v.html                      │
│   8.5330    8.7410  0.2080     2.44%    35  coq-fiat-crypto-with-bedrock/rupicola/src/Rupicola/Examples/Net/IPChecksum/IPChecksum.v.html     │
│  10.2830   10.4870  0.2040     1.98%   496  coq-rewriter-perf-SuperFast/src/Rewriter/Rewriter/Wf.v.html                                      │
│   1.9960    2.1920  0.1960     9.82%  1088  coq-category-theory/Functor/Construction/Product/Monoidal.v.html                                 │
│  78.9440   79.1380  0.1940     0.25%   365  coq-mathcomp-odd-order/theories/PFsection4.v.html                                                │
│  11.5560   11.7480  0.1920     1.66%   410  coq-verdi-raft/theories/RaftProofs/LeaderLogsLogMatchingProof.v.html                             │
│  46.3510   46.5370  0.1860     0.40%   915  coq-fiat-crypto-with-bedrock/src/Bedrock/End2End/X25519/GarageDoor.v.html                        │
└──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
🐇 Top 25 speed ups
┌─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                              TOP 25 SPEED UPS                                                               │
│                                                                                                                                             │
│   OLD       NEW       DIFF     %DIFF    Ln                     FILE                                                                         │
├─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│  35.1060    1.0330  -34.0730  -97.06%    21  coq-performance-tests-lite/src/zify_large_context.v.html                                       │
│   4.6800    0.2570   -4.4230  -94.51%    17  coq-performance-tests-lite/src/zify_large_context.v.html                                       │
│  75.5430   73.4100   -2.1330   -2.82%   905  coq-unimath/UniMath/ModelCategories/Generated/LNWFSCocomplete.v.html                           │
│   1.5890    0.6150   -0.9740  -61.30%  1080  coq-bedrock2/bedrock2/src/bedrock2/unzify.v.html                                               │
│   1.5860    0.6180   -0.9680  -61.03%  1080  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2/unzify.v.html             │
│  44.6920   43.8870   -0.8050   -1.80%   827  coq-vst/veric/binop_lemmas4.v.html                                                             │
│   2.4290    1.7910   -0.6380  -26.27%   736  coq-stdlib/Reals/Cauchy/ConstructiveCauchyReals.v.html                                         │
│   1.6090    1.0470   -0.5620  -34.93%   323  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/memmove.v.html    │
│   1.6080    1.0500   -0.5580  -34.70%   323  coq-bedrock2/bedrock2/src/bedrock2Examples/memmove.v.html                                      │
│  12.2610   11.7220   -0.5390   -4.40%   827  coq-fiat-crypto-with-bedrock/src/Bedrock/End2End/X25519/GarageDoor.v.html                      │
│   2.7450    2.2210   -0.5240  -19.09%   481  coq-bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html                                    │
│   2.8230    2.3080   -0.5150  -18.24%   480  coq-bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html                                    │
│   2.8180    2.3090   -0.5090  -18.06%   480  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html  │
│   2.7250    2.2250   -0.5000  -18.35%   481  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html  │
│   0.8100    0.3240   -0.4860  -60.00%   102  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/uint128_32.v.html │
│   1.2240    0.7500   -0.4740  -38.73%   887  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/LAN9250.v.html    │
│   2.5990    2.1290   -0.4700  -18.08%   483  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html  │
│   0.7960    0.3270   -0.4690  -58.92%   102  coq-bedrock2/bedrock2/src/bedrock2Examples/uint128_32.v.html                                   │
│   1.2240    0.7560   -0.4680  -38.24%   887  coq-bedrock2/bedrock2/src/bedrock2Examples/LAN9250.v.html                                      │
│   2.5670    2.1000   -0.4670  -18.19%   482  coq-bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html                                    │
│   2.5670    2.1050   -0.4620  -18.00%   482  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html  │
│   2.5960    2.1450   -0.4510  -17.37%   483  coq-bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html                                    │
│ 154.7050  154.2680   -0.4370   -0.28%  1190  coq-unimath/UniMath/CategoryTheory/GrothendieckConstruction/IsPullback.v.html                  │
│  10.0400    9.6090   -0.4310   -4.29%   828  coq-fiat-crypto-with-bedrock/src/Bedrock/End2End/X25519/GarageDoor.v.html                      │
│   9.3320    8.9170   -0.4150   -4.45%   829  coq-fiat-crypto-with-bedrock/src/Bedrock/End2End/X25519/GarageDoor.v.html                      │
└─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘

@JasonGross

The

94.1990 95.5150 1.3160 1.40% 103 coq-fiat-crypto-with-bedrock/src/Arithmetic/BarrettReduction.v.html
line might be real, but it is marginal, and the other top slowdowns seem like noise.
(But also, surely

 destruct q_nice_strong with (b:=b) (k:=k) (m:=mu) (offset:=1) (a:=x) (n:=M) as [cond Hcond];
   eauto using Z.lt_gt with zarith.

was not taking 95 seconds when we first wrote that line, and something is going wrong?)

Everything else looks good to me. @coqbot run full ci

@coqbot-app coqbot-app bot removed the needs: full CI The latest GitLab pipeline that ran was a light CI. Say "@coqbot run full ci" to get a full CI. label Mar 19, 2024
@andres-erbsen andres-erbsen self-assigned this Mar 20, 2024
Review comment on this diff hunk (the trailing `repeat` is the line under discussion):

    | [ H : ?A -> ?x = ?y -> _, H' : ?y < ?x |- _ ] => clear H
    | [ H : 0 <= ?x < _ |- _ ] => destruct H
    end.
    repeat
Contributor

Is this `repeat` here because `specialize` in the old version could actually trigger more `clear`s later, whereas in the new version there is no such opportunity without this `repeat`?

Member Author

Not just more `clear` but also more `destruct`, yes. I'm not 100% sure that it gets used in practice, but I expect the final iteration to be relatively cheap to run, and I think it's better not to have to manually track whether or not `specialize` can reveal more opportunities for `clear` and `destruct`.
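A toy example (not from the PR; the goal and hypothesis names are invented) of why `specialize` can feed a later pass: specializing an implication can produce a fresh `0 <= _ < _` conjunction that only a second `destruct` pass can split, which is what the trailing `repeat` covers:

```coq
Require Import ZArith.
Open Scope Z_scope.

Goal forall a b c : Z, 0 <= a -> (0 <= a -> 0 <= b < c) -> b < c.
Proof.
  intros a b c Ha H.
  specialize (H Ha).       (* H is now 0 <= b < c: a fresh destruct target *)
  destruct H as [Hb Hbc].  (* only possible after the specialize step *)
  exact Hbc.
Qed.
```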

@andres-erbsen

@coqbot merge now


coqbot-app bot commented Mar 20, 2024

@andres-erbsen: You cannot merge this PR because:

  • No milestone has been set.

@JasonGross JasonGross added this to the 8.20+rc1 milestone Mar 20, 2024
@JasonGross

@andres-erbsen I've set a milestone, so this can now be merged

@andres-erbsen

@coqbot merge now

@coqbot-app coqbot-app bot merged commit d6bad76 into coq:master Mar 21, 2024
7 checks passed
@JasonGross JasonGross deleted the faster-zify branch March 21, 2024 06:13
Development

Successfully merging this pull request may close these issues.

#17984 can cause significant slowdown when the context contains many assumptions of the form 0 <= ... < ...