Mutations should be skipped based on individual time, not total time #1965
I have to look into the code, but as far as I recall the skipping logic follows the timeout logic. Let me get back after reconfirming.
It looks like we are tracking the full test suite time instead of timings for individual tests, at line 106 here: `infection/src/TestFramework/Coverage/JUnit/JUnitTestExecutionInfoAdder.php`, lines 97 to 108 in 154822a.
So this isn't a problem with the feature itself, but rather a problem with the underlying data aggregation.
At this moment I don't feel particularly good about investing time into fixing this issue: the timeout feature exists to limit the time Infection spends running tests, so there should be no harm in adjusting (bumping) the timeout when needed to run more tests. But I can see how it can be frustrating. Furthermore, I still stand by my recommendation from this comment to tag integration tests accordingly.
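For context, the timeout being discussed is the `timeout` setting in Infection's configuration, measured in seconds per mutant process. A minimal configuration sketch with a bumped value (directory names are assumptions for illustration):

```json
{
    "source": {
        "directories": ["src"]
    },
    "timeout": 30,
    "logs": {
        "text": "infection.log"
    }
}
```

Raising `timeout` is the workaround described above: it lets slower (summed) covering-test runs complete instead of being skipped.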
Personally I have to agree with @danepowell: I find the current behaviour quite unintuitive. If I configure a timeout for a test (regardless of the framework/language), I would expect it to apply to the execution of a single test (unless the config hints at something else, of course). But personal bias aside, we're not in just any language or framework: we're in PHP, and arguably the de facto test framework is PHPUnit. And there, this is how the timeout works:

`CounterTest.php`:

```php
<?php

declare(strict_types=1);

namespace App\Tests;

use PHPUnit\Framework\Attributes\DataProvider;
use PHPUnit\Framework\Attributes\Small;
use PHPUnit\Framework\TestCase;

use function range;
use function sleep;

#[Small]
final class CounterTest extends TestCase
{
    #[DataProvider('inputProvider')]
    public function test_smaller_than_small(mixed $input): void
    {
        $this->addToAssertionCount(1);
    }

    #[DataProvider('inputProvider')]
    public function test_bigger_than_small(mixed $input): void
    {
        sleep(2);
        $this->addToAssertionCount(1);
    }

    public static function inputProvider(): iterable
    {
        foreach (range(1, 10) as $i) {
            yield [$i];
        }
    }
}
```

Provided you enforce the time limit, each test is timed individually: `test_bigger_than_small` exceeds the limit for a small test, while `test_smaller_than_small` passes even though the suite as a whole takes far longer. I think the current timeout is not devoid of sense or use, but it is not what I would expect.
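For reference, PHPUnit enforces those per-test time limits through its XML configuration (the `pcntl` extension must be available for the limits to actually interrupt tests). A minimal sketch; paths and suite names are assumptions:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<phpunit bootstrap="vendor/autoload.php"
         enforceTimeLimit="true"
         timeoutForSmallTests="1">
    <testsuites>
        <testsuite name="default">
            <directory>tests</directory>
        </testsuite>
    </testsuites>
</phpunit>
```

With `enforceTimeLimit="true"`, a test marked `#[Small]` that runs longer than `timeoutForSmallTests` seconds is reported as risky/failed on its own, independently of the rest of the suite.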
As I said on Discord here, I also think this feature shouldn't sum the times of all tests covering the mutated line to decide whether a mutant should be skipped, but should instead check each test's time individually. In the explanation in our docs (https://infection.github.io/2020/08/18/whats-new-in-0.17.0/#Skip-S-mutations-that-are-over-specified-time-limit) we give an example with one integration test which takes more than %timeout%, not with the total time of all tests; that's why I think it's confusing at the moment. By the way, there are 2 places where the timeout is used:
1. `infection/src/Process/Factory/MutantProcessFactory.php`, lines 59 to 68 in 154822a
2. `infection/src/Process/Runner/MutationTestingRunner.php`, lines 102 to 106 in 154822a
In this issue, we are discussing point 2. Here are my thoughts on how I think it should work instead, given the per-test timings we have.
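The difference between the two behaviours can be sketched as follows. This is a simplified illustration with hypothetical helper names, not Infection's actual API:

```php
<?php

declare(strict_types=1);

/**
 * Current behaviour (simplified): skip the mutant if the SUM of the
 * covering tests' times exceeds the timeout.
 *
 * @param list<float> $testTimes seconds per covering test
 */
function skipBySum(array $testTimes, float $timeout): bool
{
    return array_sum($testTimes) > $timeout;
}

/**
 * Proposed behaviour (simplified): skip only if the SLOWEST individual
 * covering test exceeds the timeout.
 *
 * @param list<float> $testTimes seconds per covering test
 */
function skipByMax(array $testTimes, float $timeout): bool
{
    return max($testTimes) > $timeout;
}

// The scenario from this issue: 300 covering tests of 0.1 s each,
// with a 10 s timeout.
$times = array_fill(0, 300, 0.1);

var_dump(skipBySum($times, 10.0)); // sum is ~30 s, so the mutant is skipped today
var_dump(skipByMax($times, 10.0)); // slowest test is 0.1 s, so it would run
```

Under the per-test rule, only mutants covered by a test that is individually slower than the timeout would be skipped, which matches the example in the 0.17.0 release notes.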
This is the functional impact of this issue: if my tests take 100 ms, my timeout should be close to that to avoid wasting considerable time on actual mutation-induced timeouts. But I can't drop the timeout if doing so just leads to all tests being skipped. I'll look into the `coversNothing` annotation, but I don't think that's going to be a satisfying workaround. I know my "unit" tests are actually integration tests, but that's unfortunately all that's supported by the Symfony Console CommandTester.
What would define unit vs. integration there is whether you instantiate the command/application yourself, or you get one from the kernel or something with externally configured services. Otherwise, testing the command directly is not effective: the command runner is the application, and it enriches the command definition.
In JUnit there are timings for each and every test. So we could calculate the exact time it requires to run a set of tests and use that. There should be no need for heuristics and guesswork:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
  <testsuite name="phpunit.xml.dist" tests="2276" assertions="2489" errors="0" failures="0" skipped="0" time="5.511347">
    <testsuite name="Main" tests="2276" assertions="2489" errors="0" failures="0" skipped="0" time="5.511347">
      <testsuite name="Tests\Pipeline\ChunkTest" file="tests/ChunkTest.php" tests="25" assertions="25" errors="0" failures="0" skipped="0" time="0.083669">
        <testsuite name="Tests\Pipeline\ChunkTest::testChunk" tests="24" assertions="24" errors="0" failures="0" skipped="0" time="0.081661">
          <testcase name="testChunk with data set #23" file="tests/ChunkTest.php" line="77" class="Tests\Pipeline\ChunkTest" classname="Tests.Pipeline.ChunkTest" assertions="1" time="0.027784"/>
          <testcase name="testChunk with data set #13" file="tests/ChunkTest.php" line="77" class="Tests\Pipeline\ChunkTest" classname="Tests.Pipeline.ChunkTest" assertions="1" time="0.002334"/>
          <testcase name="testChunk with data set #9" file="tests/ChunkTest.php" line="77" class="Tests\Pipeline\ChunkTest" classname="Tests.Pipeline.ChunkTest" assertions="1" time="0.002521"/>
          <testcase name="testChunk with data set #19" file="tests/ChunkTest.php" line="77" class="Tests\Pipeline\ChunkTest" classname="Tests.Pipeline.ChunkTest" assertions="1" time="0.002292"/>
        </testsuite>
      </testsuite>
    </testsuite>
  </testsuite>
</testsuites>
```

Note the per-`testcase` `time` attribute. (I guess it wasn't the case before, and at one point we had to use test suite-wide timings as only these were available.)
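Extracting those per-test timings is straightforward. A minimal sketch with SimpleXML, as a simplified stand-in for what the JUnit report parsing could feed into the skip decision (not Infection's actual code):

```php
<?php

declare(strict_types=1);

/**
 * Parse a JUnit report and return seconds per individual test case,
 * keyed by "ClassName::test name". Simplified illustration only.
 *
 * @return array<string, float>
 */
function perTestTimings(string $junitXml): array
{
    $xml = new SimpleXMLElement($junitXml);

    $timings = [];
    // XPath matches every <testcase> element regardless of nesting depth.
    foreach ($xml->xpath('//testcase') as $testCase) {
        $key = (string) $testCase['class'] . '::' . (string) $testCase['name'];
        $timings[$key] = (float) $testCase['time'];
    }

    return $timings;
}

$report = <<<'XML'
<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
  <testsuite name="Main" tests="2" time="0.030118">
    <testcase name="testChunk with data set #23" class="Tests\Pipeline\ChunkTest" time="0.027784"/>
    <testcase name="testChunk with data set #13" class="Tests\Pipeline\ChunkTest" time="0.002334"/>
  </testsuite>
</testsuites>
XML;

$timings = perTestTimings($report);

// The slowest single covering test, not the suite total,
// could then drive the skip decision.
echo max($timings), "\n";
```

With data like this available, the skip logic could compare the slowest individual covering test against the timeout instead of the aggregated suite time.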
Is your feature request related to a problem? Please describe.
Mutations for a given line of code seem to be skipped based on the total amount of time taken for all tests that cover that line. In my case, I have a few lines of code that are covered by nearly every test case; so while each test takes 100 ms, mutations are skipped entirely for those lines because 100 ms x 300 tests = 30 s, which is greater than the timeout.
Of course I can just increase the timeout as a workaround, but this is still not ideal behavior.
Describe the solution you'd like
Mutations should only be skipped based on the time required (maybe the maximum time required?) to run a single test case per line.
Additional context
As discussed in Discord: https://discord.com/channels/767914934016802818/771020567343136831/1241042229473316934
CC @sanmai who authored this code