
[5.9] Add PHPUnit bootstrap file to allow execution of console commands before a test run #5050

Open · wants to merge 2 commits into base: develop

Conversation

@timacdonald commented Jun 30, 2019

This PR introduces a PHPUnit bootstrap file that caches the config before the test suite runs for the first time. In my benchmarks this reduces the boot time of the Kernel during a test run by ~50%, without any additional work by the developer.

This will only really have an impact on a larger test suite, but personally I think it is a worthwhile thing to consider.

The benchmark test suite ended up with the following durations:

  • Without caching: 9.15s
  • With caching: 4.2s

Caching the config before a test run is good for a couple of reasons:

  1. It's faster.
  2. It more closely reflects what you use in production.

If, for example, you cache your routes in production but try to deploy a sneaky closure route, your app is going to blow up when you go to cache the routes during deployment. Having this in place will help you catch that kind of thing during testing, either locally or during CI (you could already catch this in CI if you are calling the cache command before running the tests).
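
(For anyone unfamiliar with the failure mode: route:cache cannot serialise closure-based routes, so a route as simple as this breaks the command - the URI here is just an example:)

// routes/web.php
Route::get('/sneaky', function () {
    return 'route:cache cannot serialise this closure';
});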

The PHPUnit bootstrap file will cache the config for the test environment before the first test is run. Because it is in user land it is 100% opt-in: if you don't want it, don't use it. One handy thing about running the cache command in a PHPUnit bootstrap file (as opposed to, say, a Composer script) is that it will already have the environment variables from the phpunit.xml file loaded, so we don't need to read those ourselves.
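
For context, the relevant bit of phpunit.xml is its <php> section - the variable names and values below are just the usual skeleton-style examples rather than anything added by this PR; the point above is simply that these are already in place when the cache command runs:

<php>
    <!-- Illustrative examples; a real app defines its own values -->
    <env name="APP_ENV" value="testing"/>
    <env name="CACHE_DRIVER" value="array"/>
    <env name="SESSION_DRIVER" value="array"/>
</php>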

Notes

  • As this doesn't require any changes to the framework, it is totally possible to just package this up on its own, which I'm more than happy to do.

  • This could instead be kept in core, with just the phpunit.xml changes made here.

  • It is possible we could see similar results by caching routes and events, but I haven't tested this yet. I plan to extend the benchmark repo I set up to also include some feature (HTTP) tests so we can see what kind of impact that might have on a test suite as well.

This PR is a simpler approach to my first run at this: laravel/framework#28998


@timacdonald changed the title from "Add PHPUnit listener to cache config (Alternative approach)" to "Add PHPUnit listener to cache config" on Jun 30, 2019

@timacdonald changed the base branch from master to develop on Jun 30, 2019

@timacdonald force-pushed the timacdonald:config_caching_v2 branch from d6c5682 to ce36ce8 on Jun 30, 2019

@timacdonald (Author) commented Jun 30, 2019

@mfn as per your comment on my first PR - would love you to give this version a spin on your suite of 6,000+ tests.

And to answer your question: there is no need to manually do any config caching / busting. This version caches to a dedicated cache.phpunit.php file (or whichever path is specified in the phpunit.xml) - see the sketch below.
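
(For reference, pointing the framework at a dedicated cache file from phpunit.xml can be done via the config cache path environment variable - the variable name and path below are examples and worth double-checking against your framework version:)

<php>
    <!-- Example only: use a phpunit-specific config cache file -->
    <env name="APP_CONFIG_CACHE" value="bootstrap/cache/config.phpunit.php"/>
</php>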

@GrahamCampbell changed the title from "Add PHPUnit listener to cache config" to "[5.9] Add PHPUnit listener to cache config" on Jun 30, 2019

@mfn commented Jun 30, 2019

I tried it and realized I'm on 5.8 (obviously), so I had to replace Env::get with env().

But unfortunately, as I thought, it's not measurable with my suite. I can also only run it in a timely manner on CI (in parallel) in the cloud, and we all know they don't give you the prime resources for doing that.

Sorry for getting your hopes up.

@driesvints (Member) commented Jul 1, 2019

Can't we move the listener to the foundation component in Illuminate\Foundation\Testing? It feels weird to include it in the skeleton.

@timacdonald (Author) commented Jul 1, 2019

@driesvints I think that would probably be a good idea, yeah (I mentioned this in my initial comment). I'll send a PR to laravel/framework if you are keen to move forward with this as a thing in core - I just don't wanna spam you with PRs to manage.

@mfn commented Jul 1, 2019

The only bad thing with the Listener is, exactly as you said, the framework boot problem.

Not too long ago I used a dedicated listener for a Laravel project, and it would not have worked together with yours because I too had to boot the framework there.

Eventually I found a different solution for the same problem without using a listener.

Do you think it's possible without that?

@timacdonald (Author) commented Jul 1, 2019

@mfn could you shed light on why your listener would have conflicted with this one?

This listener does boot the framework - but the thing that takes the longest during booting is collating all the config. This listener caches the config on first run, meaning that all subsequent boots are ~50% quicker (as per my benchmarking).

I would think that means your listener would actually become faster because when it boots the framework it doesn't have to collate all the config on the fly. Or am I missing something?

@mfn commented Jul 1, 2019

The listener was, as in your case, a special one which also booted the framework.

But it just came to my mind that I'm not using it anymore, and I think hardly anyone does, so I guess you can just disregard my comment ;)


@timacdonald force-pushed the timacdonald:config_caching_v2 branch from 51da23a to 071efc0 on Jul 1, 2019

@timacdonald (Author) commented Jul 1, 2019

@mfn no worries. If you do think of any specific scenarios where this could conflict (with or without another listener), please do mention it. I'd be happy to run some tests and see whether any adjustments are needed.

@timacdonald force-pushed the timacdonald:config_caching_v2 branch from c7e47a5 to 980adb5 on Jul 3, 2019

@timacdonald (Author) commented Jul 3, 2019

I've just pushed a new commit to this. You can now specify arbitrary commands to run before the test suite, rather than locking it down to config caching only.

It makes the listener more generic so devs can define the commands to run before the test suite runs for the first time.

<listeners>
    <listener class="Illuminate\Foundation\Testing\Listener">
        <arguments>
            <!-- Define artisan commands to execute before the test suite runs -->
            <string>config:cache</string>
            <string>route:cache</string>
            <string>event:cache</string>
            <string>some-3rd-party-package:cache-stuff</string>
        </arguments>
    </listener>
</listeners>

@m1guelpf (Contributor) commented Jul 3, 2019

Isn't Listener a really general name? I'd rather have a CacheForTestsListener or another name that better describes what this is doing.

@timacdonald (Author) commented Jul 3, 2019

Agreed. I think PreRunListener or SetupListener or something along those lines would be better. I’ve been more focused on the implementation than those details up until now so if anyone has any other suggestions, please do share.

I also figured Taylor, if he wants this, would know what he’d want to call it as well.

@timacdonald (Author) commented Jul 3, 2019

...although if this is the Laravel listener, I don't think it would be the worst thing to call it Listener. Just like the TestCase is called TestCase instead of BootsFrameworkTestCase.

But I don’t have strong opinions on the naming either way TBH

@timacdonald changed the title from "[5.9] Add PHPUnit listener to cache config" to "[5.9] Add PHPUnit listener to execute console commands before a test run" on Jul 3, 2019

@timacdonald force-pushed the timacdonald:config_caching_v2 branch from 980adb5 to 4474d2a on Jul 4, 2019

@timacdonald (Author) commented Jul 4, 2019

Someone just suggested this could be done in a bootstrap file...and I think I was too deep down the rabbit hole to realise that is probably the better way to go compared to a listener.

Just ship a tests/bootstrap.php or bootstrap/tests.php file, reference it in the phpunit.xml bootstrap attribute (wired up as below), and then you can provide classic Laravel-style documentation as guidance as well. Again, it is all in user land, so the dev is in full control of whether they want to take advantage of it.
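
Wiring it up is just PHPUnit's standard bootstrap attribute (shown here with the tests/bootstrap.php option; everything else in phpunit.xml stays as it is):

<phpunit bootstrap="tests/bootstrap.php" colors="true">
    <!-- testsuites, filters and the <php> section are unchanged -->
</phpunit>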

Pretending these comments were not terrible... it could be something like this:

<?php

use Illuminate\Contracts\Console\Kernel;

require_once __DIR__.'/../vendor/autoload.php';

/*
|--------------------------------------------------------------------------
| Bootstrap the testing environment
|--------------------------------------------------------------------------
|
| You have the option to specify console commands that will execute before your
| test suite is run. Caching config, routes, & events may improve performance
| and bring your testing environment closer to production.
|
*/

$commands = [
    'config:cache',
    'route:cache',
    'event:cache',
];

$app = require __DIR__.'/../bootstrap/app.php';

$console = tap($app->make(Kernel::class))->bootstrap();

foreach ($commands as $command) {
    $console->call($command);
}

@taylorotwell (Member) commented Jul 5, 2019

Yeah I think that makes decent sense probably.

@timacdonald force-pushed the timacdonald:config_caching_v2 branch from 4474d2a to a8e9017 on Jul 5, 2019

@timacdonald (Author) commented Jul 5, 2019

I've just pushed changes to implement the bootstrap file instead of the listener.

I've commented out the route:cache command as the routes that come in the skeleton are not cacheable. Commenting it out means that the 2 tests in the skeleton still pass out of the box.

@driesvints (Member) commented Jul 5, 2019

My two cents: I think the file would live better under tests/bootstrap.php since that's a common location in most PHP apps I've seen so far.

I also believe the current bootstrap/app.php file would be better placed as ./bootstrap.php and the cache directory as a top-level directory or underneath storage but that's a different topic :)

@timacdonald (Author) commented Jul 7, 2019

We've actually found in our initial feature test benchmarks that caching the routes can actually slow down the test suite. Looks like it is due to the performance of unserialize. Our initial benchmarks were against a test app with 1,000 routes. I'm going to run some more tests against different route counts through the week. Looks like most apps, according to this Twitter poll, have far fewer routes.

Gonna do some more digging into this throughout the week and will report back regarding route caching.

Happy to close this and re-open once I have a better overall understanding of whether route caching is a worthwhile thing to include in the bootstrapping process.

@m1guelpf (Contributor) commented Jul 7, 2019

Now that I think about it, this would also break tests which register routes at runtime (I think that's a common approach for testing middleware).

@timacdonald (Author) commented Jul 7, 2019

@m1guelpf runtime-defined routes continue to work if the routes are cached, e.g. the following test passes with routes cached before the test run:

public function test_runtime_routing()
{
    Route::get('my-test-route', function () {
        return 'expected';
    });

    $response = $this->get('my-test-route');

    $this->assertSame('expected', $response->content());
}

Let me know if this isn't what you are referring to though. I might have misunderstood.

@driesvints (Member) commented Jul 8, 2019

@timacdonald can you move the bootstrap file to tests/bootstrap.php?

@garygreen (Contributor) commented Jul 9, 2019

We've actually found in our initial feature test benchmarks that caching the routes can actually slow down the test suite. Looks like it is due to the performance of unserialize. Our initial benchmarks were against a test app with 1,000 routes.

That's also been my experience with route caching - there's a huge memory overhead when route caching is enabled and yet no discernible improvement in speed. It's worth noting the route caching feature was added 4 years ago, so maybe improvements in PHP have changed things, or possibly the benchmarking when route caching was added was flawed at the time. You're not the only one to have noticed this; I personally think route caching should be deprecated because it's really a mis-sold feature. If there's any strong evidence it's actually effective then maybe it can be "undeprecated" 😁

@timacdonald force-pushed the timacdonald:config_caching_v2 branch from a8e9017 to 56960ed on Jul 9, 2019

@timacdonald (Author) commented Jul 9, 2019

@driesvints that's done for ya mate. Sorry I didn't jump on that when you requested it earlier.

@garygreen that is interesting for sure. I guess I can't really comment on what it is like in production, as I'm not taking OPcache into account - this is focused on the local testing aspect. Perhaps we can put together some arguments for and against it in an issue over on laravel/ideas and get some stats / benchmarks to provide to Taylor so he can make an informed decision about it. Perhaps we deploy this benchmark suite to a production environment and measure the impact of routes with and without caching. You can generate the desired number of unique web and API routes by calling (new Tests\Builder)->loadRoutes(500) (note this generates 500 web and 500 API routes). I don't have time right now to do this (as I'm also looking at some other testing performance improvements), but will add it to my to-do list if no one else jumps on it.

@timacdonald changed the title from "[5.9] Add PHPUnit listener to execute console commands before a test run" to "[5.9] Add PHPUnit bootstrap file to allow execution of console commands before a test run" on Jul 9, 2019

@garygreen (Contributor) commented Jul 9, 2019

I guess I can't really comment on what it is like in production, as I'm not taking OPcache into account - this is focused on the local testing aspect.

I've been testing with OPcache enabled locally and it's exactly the same - route caching has a negative impact on performance and a higher memory footprint for requests. It would definitely be good to have more test runs from various developers/machines to build a strong case that route caching should be deprecated.

@timacdonald (Author) commented Jul 12, 2019

Although this PR isn't only about improved performance, I just wanted to drop in some performance reports I've had from people giving this a spin.

I've heard from about 8 people who are seeing anywhere from 0% to 25% speed increases in their test suites. I'm personally seeing around a 5% speed boost on one of my projects and 15% on another. I'm guessing that most people will only see a small boost - maybe 5% at most.

I think any real world speed improvements are going to vary wildly based on the kind of tests that make up your suite.

I'm not a Docker person, but I believe Docker (maybe just the Mac version?) doesn't have great file access performance, so I think the higher end speed boosts are potentially coming from Docker users. Yet to confirm that though.

Not saying this is a good sample - but wanted to drop it in here to help any discussion.

@taylorotwell (Member) commented Jul 12, 2019

So what's the consensus on this one? Do we still think it's worth merging?

@hanspagel commented Jul 12, 2019

Down from 2.25 to 2.00 for 400 tests in docker. 👌

@garygreen (Contributor) commented Jul 12, 2019

Just providing my results. This is tested on Windows. Haven't got a huge number of test classes, quite a few assertions though. This PR doesn't really make any difference for me.

Without PR: [screenshot of test run results]

With PR: [screenshot of test run results]

@garygreen (Contributor) commented Jul 12, 2019

I suspect results may vary wildly with this depending on how many configs and events you've got, how much data is in them, etc. I've personally gone through and commented out all the settings/options I don't need and try to keep configs to a minimum. Also, I don't use any events in my application.

@timacdonald care to share how many configs you have, what's in them etc? What's the file output size of your cached config? Mine is:

Cached config size: 16KB
Cached events size: 1KB


@timacdonald (Author) commented Jul 13, 2019

@taylorotwell I think it would be good to get some more people reporting their results. I'm gonna try and get a few more people to come in and let us know if this is having any real world impact.

@hanspagel that's a nice little boost 🎉

@garygreen I'm surprised to see this PR make the test suite run slower. Although I'm sure you are all over this, would you mind just confirming you did the following steps? The key things are changing the <server> variables to <env> variables and making sure your suite runs without any existing config cache files.

I'm dropping these steps in here so I can try and get some more results reported.

Steps to test this PR on Laravel 5.8

Record how long your suite currently runs...

  1. Clear out any existing files in bootstrap/cache directory.
  2. Run the test suite 3 times and average the result.

Record how long your suite runs with this PR...

  1. Add the bootstrap file to tests/bootstrap.php as is, i.e. don't cache the routes.
  2. Add these changes to your existing phpunit.xml file.
  3. Change the added <server> variables to <env> variables (see the sketch after these steps).
  4. Clear out any existing files in bootstrap/cache directory.
  5. Run the test suite 3 times and average the result.
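
To make step 3 concrete, this is roughly what changes in the <php> section of phpunit.xml (APP_ENV is just the usual skeleton example; a real file will have more entries):

<php>
    <!-- Before: 5.8 skeleton default -->
    <server name="APP_ENV" value="testing"/>

    <!-- After: what step 3 asks for -->
    <env name="APP_ENV" value="testing"/>
</php>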

@jasonlbeggs commented Jul 13, 2019

I tested it in two different apps. One is pretty large and has a ton of migrations. One is pretty small and only has a few migrations. Here are my statistics from following the steps in each app:

App 1

(Tests: 792, Assertions: 2010, Migrations: 256)

Before: 8.43 minutes
After Average: 8.31 minutes

App 2

(Tests: 102, Assertions: 259, Migrations: 11)

Before: 5.01 seconds
After Average: 2.77 seconds

@stefanzweifel commented Jul 13, 2019

Ran your PR in a larger app. Like Jason's, it has a ton of migrations (167).
It's also just a subset of the entire test suite (we usually only run the entire suite with paratest).

Here are the stats:

Tests: 220, Assertions: 922.

Control (clear bootstrap/cache on each run)

==> avg. 1.77 Minutes

  1. run: Time: 1.81 minutes, Memory: 102.50 MB
  2. run: Time: 1.91 minutes, Memory: 102.50 MB
  3. run: Time: 1.59 minutes, Memory: 102.50 MB

Tim's PR

==> avg. 1.5933 Minutes

  1. run: Time: 1.62 minutes, Memory: 106.50 MB
  2. run: Time: 1.55 minutes, Memory: 106.50 MB
  3. run: Time: 1.61 minutes, Memory: 106.50 MB

I've also run the suite multiple times with Laravel's default setup and didn't clear bootstrap/cache. Even then, Tim's PR brought some improvements.

==> avg. 1.62 Minutes

  1. run: Time: 1.64 minutes, Memory: 102.50 MB
  2. run: Time: 1.73 minutes, Memory: 102.50 MB
  3. run: Time: 1.55 minutes, Memory: 102.50 MB
  4. run: Time: 1.56 minutes, Memory: 102.50 MB

@edvordo commented Jul 13, 2019

Wanted to chip in

383 tests, 1586 assertions, 131 migrations

  • pre-PR runs
    Time: 3.03 minutes, Memory: 62.50 MB
    Time: 3.29 minutes, Memory: 62.50 MB
    Time: 3.04 minutes, Memory: 66.50 MB
    Average: 3.12 minutes
  • post-PR runs
    Time: 2.92 minutes, Memory: 68.50 MB
    Time: 3.01 minutes, Memory: 66.50 MB
    Time: 2.89 minutes, Memory: 66.50 MB
    Average: 2.94 minutes

And (just for fun) without running migrations
  • pre-PR runs
    Time: 43.13 seconds, Memory: 58.50 MB
    Time: 41.91 seconds, Memory: 58.50 MB
    Time: 42.07 seconds, Memory: 58.50 MB
    Average: 42.37 seconds
  • post-PR runs
    Time: 43.38 seconds, Memory: 64.50 MB
    Time: 41.51 seconds, Memory: 64.50 MB
    Time: 41.63 seconds, Memory: 64.50 MB
    Average: 42.17 seconds

@garygreen (Contributor) commented Jul 14, 2019

Although I'm sure you are all over this, would you mind just confirming you did the following steps? The key things are changing the <server> variables to <env> variables and making sure your suite runs without any existing config cache files.

Yes, I have changed to env vars in phpunit.xml and ensured it properly created the cached files in bootstrap/cache, then ran the tests. I don't think anything else is needed?

And (just for fun) without running migrations [...PR makes no difference to speed]

This mirrors my test results as well - I don't run any migrations during my tests, though I do hit a database that's already been migrated. So maybe the speed difference is related to migrations?

I'm not overly familiar with how Laravel handles migrating the database before tests; it's often felt slow whenever I've used the built-in traits, so I tend to just do manual setup/teardown of the DB for speed. The only thing I use is DatabaseTransactions.

@narwy (Contributor) commented Jul 14, 2019

I don't agree with /tests/bootstrap.php; I feel like it would be out of place given the current structure / naming convention of the files. /bootstrap/app.php & /bootstrap/tests.php makes more sense - it's a folder specifically for bootstrap files. If we start to put them everywhere it can become messy. Good idea though, and the results look promising.

@timacdonald (Author) commented Jul 15, 2019

@garygreen Nah, you got it all mate. I figured you knew what was going on, but just wanted to double check to make sure we were on the same page. I guess this is the nature of a micro-optimisation like this - it'll benefit some test suites and not others. Bummer it didn't make any improvement on yours.

I also use DatabaseTransactions on my test suites and did see some nice improvements. I also work with a DB that is already migrated before the test run.

I think the sales pitch for this PR is:

  1. Some test suites will see a nice (and sometimes significant) improvement; others (and probably most) will not, depending on the makeup of the suite.
  2. It brings your test suite closer to production, and encourages devs not to use the env helper, since with the config cached the helper will already return null (see the sketch after this list).
  3. It gives people less of a reason to say that "Laravel" is slowing down the tests.
  4. It gives developers a place from the get go to bootstrap their test suite - if needed.
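
To illustrate point 2, a rough sketch - the config key and variable name are hypothetical, and this assumes PAYMENTS_TOKEN is only defined in .env (values set via <env> in phpunit.xml are real environment variables, so those would still resolve):

// config/services.php would contain: 'payments' => ['token' => env('PAYMENTS_TOKEN')]

// In a test, once the bootstrap file has cached the config:
$viaConfig = config('services.payments.token'); // resolves from the cached config
$viaEnv = env('PAYMENTS_TOKEN');                // null - .env is not loaded when the config is cached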

But...it isn't a game changer and never was going to be. Not including it in the framework isn't gonna be an issue - but I think it is a "nice to have" myself.

@elvispt commented Jul 15, 2019

Here are my results (small Laravel 5.8.28 app) for my test suite which includes migrations:
Pre-PR:

  • Time: 21.41 seconds, Memory: 104.25 MB
  • Time: 21.59 seconds, Memory: 106.25 MB
  • Time: 21.82 seconds, Memory: 104.25 MB
    Average: 21.60 seconds

Post-PR:

  • Time: 21.99 seconds, Memory: 96.25 MB
  • Time: 21.3 seconds, Memory: 100.25 MB
  • Time: 21.75 seconds, Memory: 98.25 MB
    Average: 21.68 seconds

I did notice an improvement in memory, but the times are basically the same.

@adrianb93 commented Jul 15, 2019

App 1:

This is a new app with only unit tests (we've not written any routes yet.)

(Tests: 186, Assertions: 555)
Before Average: 45.9 seconds (46.86; 45.92; 44.92;)
After Average: 44.92 seconds (47.94; 42.93; 43.89;)

App 2:

This is a super slow test suite and needs test improvements. I ran this suite 3 times: once to warm up, once without the cache, and once with the cache. I didn't do a three-run average for this because I don't have hours to spend on it.

(Tests: 346, Assertions: 882)

Before: 23.34 minutes
After: 22.52 minutes

Thoughts:

  • Config caching doesn't make the test suite any slower.
  • Config caching may improve your test speeds. For a portion of app 2, I saw huge performance improvements on the fast tests of the suite.
  • Regardless of performance, being able to test that your app works under config cache is definitely a good thing.
  • For those who write tests, it may guide people away from using env() calls outside of config files. From my experience, most Laravel devs find env() more convenient because they can skip the step of adding something to their config files. They end up treating .env as a config file.
  • I love the tests/bootstrap.php file. I've already added my own code to it for some before and after test suite setups and cleanups. It's a nice home to do your own userland test suite improvements.

By the way, if anybody wants to run something after the suite finishes, add this to the end of the bootstrap file:

// Will run once the entire test suite has finished
register_shutdown_function(function () use ($console) {
    // Figured this is a helpful example for those on a Laravel version below 5.8
    $console->call('config:clear');
});

@driesvints (Member) commented Jul 15, 2019

Judging from the average of the results I think it's best not to include this in the framework as it's not applicable to everyone and the optimizations aren't too big. Maybe it's best to detail this in a blog post and note when this would be applicable.
