
Limit information for failed tests #917

Closed
LEdiodes opened this issue Oct 17, 2017 · 17 comments · Fixed by #1975

Comments

@LEdiodes

Is there a way to suppress the errors?

For example, instead of:

[-] Kaseya service [Running on VFOOT] 48ms
Expected: {Running}
But was: {}
331: it "Kaseya service [Running on $Computer]" { (Get-service -name *vrt* -ComputerName $Computer).Status | should Be Running }
at , C:\projects\Onboarding\Pester-ServerOnboarding.ps1: line 331

I would like a simple:

[-] Kaseya service [Running on VFOOT] 48ms

or:

[-] Kaseya service [Running on VFOOT] 48ms
Expected: {Running}
But was: {}

Thanks in advance.

@it-praktyk
Contributor

As far as I know, it's not currently possible.

@it-praktyk it-praktyk changed the title Suppress failed tests Limit information for failed tests Oct 17, 2017
@nohwnd
Member

nohwnd commented Nov 11, 2017

You are right, there is no option for that. We can add it once we have options, but wouldn't you lose valuable information? Or is your usage different from testing code, for example running infrastructure tests?

@codykonior
Contributor

I think that extra verbose output is a bug I've just written up in #991

@nohwnd nohwnd self-assigned this Feb 20, 2018
@nohwnd nohwnd added this to the New runtime milestone Dec 16, 2018
@nohwnd nohwnd modified the milestones: New runtime, 5.x Mar 24, 2019
@StephenFerrero

Adding a +1 for this: we are running infrastructure tests, so the additional trace info is not desired.
I'd like the option to see something like:

[-] Policy infrastructure setting should be enabled 3ms (2ms|2ms)   
    Expected $true, but got $false.

@nohwnd
Member

nohwnd commented May 7, 2021

Could be an option for Output/Should. I am open to PRs; it should be simple.

@nohwnd nohwnd removed their assignment May 14, 2021
@ArmaanMcleod
Contributor

ArmaanMcleod commented Jun 2, 2021

@nohwnd I'd be happy to take this on. This is something my customers have also asked for, since they run many Azure infra tests and only want to see the simple Expected/But was: {} view for failed tests.

I'm thinking we could add an option to PesterConfiguration for this, either in the Should or Output properties. I would have to document this change as well. Thoughts?

@nohwnd
Member

nohwnd commented Jun 3, 2021

Yup. As I commented above, I think it should go into the Should / Output config. I am more inclined to put it on Should, because it's specific to Should. There is one more option on Debug that also affects this: ShowFullErrors, which filters the error stack trace down to non-internal code. I am not saying we should move it, but the naming should be similar.

So probably Should.ShowStackTrace ?

I also think Should prints the first line of the stack trace into the message itself. Not sure if this would be another option? Imho it's very useful info to know where the error happened, even when not looking at the whole stack trace.
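A sketch of how the proposed option might be used, assuming a boolean Should.ShowStackTrace is added to the Pester 5 configuration object (the option name is only a proposal at this point and does not exist yet):

```powershell
# Hypothetical usage of the proposed option (Pester 5 configuration object).
# Should.ShowStackTrace is a proposed name, not an existing setting.
$config = New-PesterConfiguration
$config.Run.Path = './tests'
$config.Should.ShowStackTrace = $false   # hide the stack trace for failed tests
Invoke-Pester -Configuration $config
```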

@ArmaanMcleod
Contributor

ArmaanMcleod commented Jun 5, 2021

@nohwnd Thanks for the feedback.

Inspecting the Write-ErrorToScreen function: https://github.com/pester/Pester/blob/main/src/functions/Output.ps1#L770, DisplayErrorMessage contains the first line of the stacktrace, which matches what you said.

I also agree that even if Should.ShowStackTrace is set to $false, this line should still be included, since it is very useful to see where the test failed. I don't think it would be a good idea to remove this completely from the output.

I could add a switch flag to the above function to not include DisplayStackTrace and ScriptStackTrace in the error, which should take away most of the stack trace output. This could be an improvement at least.
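To illustrate the shape of that change, here is a toy sketch — not Pester's actual internals, and the function and parameter names are illustrative only:

```powershell
# Toy sketch of a switch that suppresses the stack-trace fields.
# Not Pester's real Write-ErrorToScreen; all names are illustrative.
function Write-ErrorToScreenSketch {
    param (
        [hashtable] $ErrorInfo,
        [switch] $NoStackTrace
    )
    # Always print the error message (which may carry the first trace line).
    Write-Host $ErrorInfo.DisplayErrorMessage
    if (-not $NoStackTrace) {
        # Only print the trace fields when the switch is not set.
        Write-Host $ErrorInfo.DisplayStackTrace
        Write-Host $ErrorInfo.ScriptStackTrace
    }
}
```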

@nohwnd
Member

nohwnd commented Jun 5, 2021

Yup, do it. I am still torn on whether the option should be added to the Should or Output section of the config object, because we should still keep all the info on the run object IMHO, just not print it to screen. On the other hand, I want to avoid having a ton of options for customizing the output. @fflaten any opinions?

@fflaten
Collaborator

fflaten commented Jun 5, 2021

OP didn't want any stacktrace included; however, I personally agree with you and think the first line is a vital part of the error, just for the hyperlink (file:line) alone. If it's decided that it should stay, then this option would only enable/disable printing DisplayStackTrace. The full ScriptStackTrace would stay intact in the ErrorRecord (in the result object) in case we need to post-process or dump it to an error log. 👍

Discovery and Setup exceptions might be the deciding factor here. Should it hide DisplayStackTrace on those as well? If so I'd say Output-section as it's a runtime-wide decision.

Example of setup-error:
[screenshot: console output for a setup error]
DisplayErrorMessage above actually doesn't include the first line. That's probably a bug on its own for Block+Container errors. We should be consistent about what belongs in DisplayErrorMessage vs DisplayStackTrace throughout the module 🙂

@nohwnd
Member

nohwnd commented Jun 6, 2021

Agree on both points.

@StephenFerrero

StephenFerrero commented Jun 6, 2021

For our specific use case it would be helpful to have minimal output. The person running tests in this case does not need to know the test file location, just whether the test passed or failed. In our case the test name includes an identifier which can be used to look up further info.

A streamlined output of just +/- lines would be ideal.

This may be too specific of a use case, but I wonder if others using Pester for infrastructure testing would agree.

Take a look at this output from steampipe for a similar idea:

[screenshot: steampipe check output]

@ArmaanMcleod
Contributor

ArmaanMcleod commented Jun 7, 2021

Thanks @StephenFerrero.

@nohwnd and @fflaten with the above in mind should we introduce a new option?

These are the two use cases I'm seeing:

  • User wants to hide the stack trace but still see where the error happened. Output.ShowStackTrace = $false is all that is needed.
  • User wants to hide the stack trace and remove any info on where the error happened, similar to what @StephenFerrero suggested above. This would mean setting Output.ShowStackTrace = $false plus some other option like Output.ShowErrorLocation = $false. This seems to be the most flexible way to customize the feedback output for failed cases.

Let me know what you guys think 😄.

@fflaten
Collaborator

fflaten commented Jun 7, 2021

I don't read these as the same issue/request. OP wants the error message with no or minimal stacktrace, so whether the first line of the stacktrace should be part of the error message is relevant here. If it has to go, then I'd avoid another option for the error line and instead move that line to DisplayStackTrace, letting the ShowStackTrace option be an enum that accepts None, First/Simple/Minimal, or All.

@StephenFerrero describes showing only context/describe/it names in color. That would effectively be Detailed output with ShowErrorMessage (which doesn't exist) and ShowStackTrace set to false.
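The enum idea could look roughly like this (option name and values are illustrative sketches of the suggestion above, not a shipped API at the time of this comment):

```powershell
# Sketch of an enum-style stack-trace option (name and values illustrative).
$config = New-PesterConfiguration
$config.Output.StackTraceVerbosity = 'FirstLine'   # e.g. None / FirstLine / Full
Invoke-Pester -Configuration $config
```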

@ArmaanMcleod
Contributor

ArmaanMcleod commented Jun 7, 2021

@fflaten I like your thinking 👍 . I was considering this as the best way forward, since having an extra option seems a little too much.

If we move the first trace line to DisplayStackTrace, we can achieve the following quite simply with just ShowStackTrace:

StackTrace enabled:

[screenshot: failure output with stack trace]

StackTrace disabled:

[screenshot: failure output without stack trace]

Perhaps an improvement could be to use verbosity settings for more fine-grained control over what output is shown, like you suggested. I'm wondering if I should cover this in a separate issue? It might need some more design decisions, since we'd need to agree on what verbosity those enums would cover. I'd be happy to open an issue for more discussion around this 😃

@nohwnd
Member

nohwnd commented Jun 7, 2021

What @StephenFerrero describes would have to be implemented in a step after a file has run (or after a top-level block has run).

Look at "1 Account Settings" in the screenshot: it knows how many tests there are (we know that from Discovery), but it also knows how many tests failed. We only know that after the whole block has run. (We could jump back on the screen and rewrite it, but that brings a ton of new problems.)

Producing this output from the result object is easy to do externally from Pester, and quite trivial to implement. You just iterate over blocks and implement the [=======] output (or [----+++] output, or something alike).

You can also add additional info that Pester does not have by default by adding annotations to tests and blocks in the form of a single-item -ForEach, e.g.:

Describe 'Passcode Requirement' {
    It 'Minimum passcode length is bigger than 0' -ForEach @{ Severity = 'High' } {
        # your test
    }
}

Then you would check whether the block has any failed test with Severity High, print the yellow "HIGH", and also print the alarm for that test.

All in all this is subjective output that follows different principles than the current Pester output, and implementing it would mean spending a lot of time making the formatting work on different screen sizes, handling different severities, and so on.

It's not complicated to implement, but it's not something that should be built into Pester.
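The external post-processing described above could look roughly like this, assuming Pester 5, where Invoke-Pester -PassThru returns a run object exposing a flat Tests collection (property names are taken from that object; treat this as a sketch):

```powershell
# Sketch: build a minimal custom +/- summary outside Pester from the run object.
# Assumes Pester 5; Tests, Result, and ExpandedName are properties of its run object.
$run = Invoke-Pester -Path './tests' -PassThru -Output None

foreach ($test in $run.Tests) {
    # One line per test, pass/fail marker plus the expanded test name.
    $mark = if ($test.Result -eq 'Passed') { '[+]' } else { '[-]' }
    '{0} {1}' -f $mark, $test.ExpandedName
}
```

Annotations such as the Severity example above would be reachable through each test's -ForEach data, so the same loop could be extended to highlight high-severity failures.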

@StephenFerrero

Sorry, I think I complicated the request with that screenshot. I'm not looking for Pester to implement all the extra highlighting and graphical representation of failed tests.

I'm just looking for the test output we have today sans stack trace, exactly what is shown in @ArmaanMcleod's screenshot labeled "StackTrace disabled".

Understood, though, if this is too specific. We can absolutely implement our own output display using the Pester results object.
