[9.x] Adds Benchmark::measure ⏱ #44252

Conversation
This is awesome @nunomaduro 🔥👏. I can't wait to use this feature to measure and compare DB queries in terms of the time taken to retrieve data.

This is perfect when trying to show Laravel newcomers the difference between those database calls. Whether it's 0.8ms or 51.6ms, you don't really feel the difference. But showing the measured numbers side by side makes it tangible.

This is amazing. I've done similar things in the past but always had to code it again.
Discussed with @nunomaduro. I want to keep this a bit simpler and lighter for now. Added here: b4293d7. The API is essentially the same:

```php
dd(Benchmark::measure([
    'scenario one' => fn () => sleep(1),
    'scenario two' => fn () => sleep(2),
], iterations: 2));

Benchmark::dd([
    'scenario one' => fn () => sleep(1),
    'scenario two' => fn () => sleep(2),
], iterations: 2);

dd(Benchmark::measure(function () {
    sleep(random_int(1, 3));
}, iterations: 2));

Benchmark::dd(function () {
    sleep(random_int(1, 3));
});
```

Can expand on it later if necessary.
There could be an extended version as a package, e.g. laravel-benchmark.
…On Fri, Sep 23, 2022, 18:19 Taylor Otwell wrote: Closed #44252.
Yeah, like: `composer require lib/laravel-benchmark --dev`
This pull request introduces a new
Benchmark::measure
facade that allows you to quickly measure and compare the performance of Laravel code. Here is an example: using the following code, and if the application is running in the console, Artisan will output the following content:
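The original snippet is not preserved in this transcript. As a sketch, mirroring the API shape shown earlier in this thread, the console example presumably looked something like:

```php
use Illuminate\Support\Benchmark;

// Compare two scenarios; each closure is executed repeatedly and its
// average elapsed time (in milliseconds) is reported next to its label.
Benchmark::dd([
    'scenario one' => fn () => sleep(1),
    'scenario two' => fn () => sleep(2),
]);
```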
As you can see, this report shows the elapsed real time, and the unit of time is milliseconds. In addition, ten "repeats" were used by default, meaning that both of the given callbacks were executed ten times, and the value on the right is the average elapsed time of those ten executions.
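Conceptually, the averaging works along these lines (a sketch under the assumption of a high-resolution monotonic clock, not the framework's actual implementation):

```php
<?php

// Sketch: run a callback $iterations times and return the average
// elapsed real time in milliseconds. Not Laravel's actual code.
function averageElapsedMs(callable $callback, int $iterations = 10): float
{
    $totalNanoseconds = 0;

    for ($i = 0; $i < $iterations; $i++) {
        $start = hrtime(true);            // monotonic clock, in nanoseconds
        $callback();
        $totalNanoseconds += hrtime(true) - $start;
    }

    return $totalNanoseconds / $iterations / 1_000_000; // ns -> ms
}
```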
Also, the fastest callback gets highlighted in green. And the content on the left contains a description based on the given closure's code.
Next, besides passing a list of callbacks, you can also pass a single callback. And of course, you can equally specify the number of desired repetitions:
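The route snippet is not preserved in this transcript. A minimal sketch, reusing the call shape shown earlier in the thread (the route path is invented here for illustration):

```php
use Illuminate\Support\Benchmark;
use Illuminate\Support\Facades\Route;

// Hypothetical route used for illustration: a single callback, where
// measure() returns the average elapsed time in milliseconds over
// the requested number of iterations.
Route::get('/benchmark', function () {
    return Benchmark::measure(fn () => sleep(1), iterations: 5);
});
```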
When visiting this route, the output will be rendered in the browser like so:
As future scope, it would be interesting to provide certain methods, in special places of the framework, so we can easily benchmark routes, commands, and other framework resources without having to put the entire code of those resources within a closure for the
Benchmark::measure
method. In concrete terms, this would mean something like this:
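The concrete example is not preserved in this transcript. Purely as a hypothetical illustration of the idea (these methods do not exist; the names are invented here and are not part of this PR):

```php
// Hypothetical API, invented for illustration only -- not part of this PR.
$routeMs = Benchmark::route('GET', '/users');   // benchmark a whole route
$commandMs = Benchmark::command('inspire');     // benchmark an Artisan command
```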