
Add {{path}} as option for messages in rulesets #572

Closed
kylesykes opened this issue Sep 18, 2019 · 11 comments · Fixed by #615

@kylesykes
Contributor

User story.
As a writer of rulesets against a large OA3 spec that is split across many files and resolved before Spectral runs, I want the ruleset message to be able to reference the {{path}} that triggered the error, to help guide me to where the issue is in the unresolved files (where line numbers are useless).

Is your feature request related to a problem?
We use Spectral as a CLI tool in a CI pipeline that triggers on PR changes to a very hefty spec; the spec is resolved first, and the resolved output is then linted. We currently have over 300 routes, and if someone changes a large majority of them (for example, adding OAuth scopes to security), we can validate that something was actually added to all of them with the following ruleset:

security-zero-or-one-oauth-scope:
    description: Every OAuth2 secured endpoint defines scopes
    given: $..paths.*[?( @property != 'parameters' )].security[0]
    severity: error
    message: 'Errored at {{property}}... Error message is: {{error}}'
    then:
      field: oauth2
      function: length
      functionOptions:
        min: 1

However, if the validation fails, we only see the line number and the property name (in our case, oauth2). There might be hundreds of those to sift through manually in the unresolved files, because the line number is meaningless when looking at the unresolved OA3 files (see the screenshot below).

Describe the solution you'd like
If it's easy to add given the context, it would be awesome to have the option to dump out the path that caused the error.

So a message of:
message: 'Errored at {{path}}... Error message is: {{error}}'
could produce output like:
Errored at paths["/v1/errorCodes"].get.security[0].oauth2 ... Error message is: min length is 1
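
Dropped into the rule from above, the whole thing would read something like this (just a sketch of the proposal; {{path}} is not an existing placeholder at the time of writing):

security-zero-or-one-oauth-scope:
    description: Every OAuth2 secured endpoint defines scopes
    given: $..paths.*[?( @property != 'parameters' )].security[0]
    severity: error
    message: 'Errored at {{path}}... Error message is: {{error}}'
    then:
      field: oauth2
      function: length
      functionOptions:
        min: 1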

We did experiment with running Spectral against the unresolved OA3 files, but it didn't detect things correctly. I'm not sure what sort of resolving is going on under the hood, but if we're simply doing something incorrectly when running Spectral against the unresolved spec, pointing that out would also be a great solution (since it would identify the offending file, where the line number may be useful depending on how things were resolved).

We are open to any suggestions to help devs easily identify where they've broken a rule! Thanks!

Additional context
Sample error currently:
[screenshot]

Experimenting with Spectral on an unresolved file produces a bunch of errors that fail to identify path params in this particular file:
[screenshot]

@P0lip
Contributor

P0lip commented Sep 19, 2019

@kylesykes
Thank you for your suggestion. The idea sounds good to me!

I'm not sure what sort of resolving is going on under the hood,

By default, all Spectral rules are executed against the resolved document; however, you can change that.
Each rule can be run against the unresolved document if you wish:

security-zero-or-one-oauth-scope:
    description: Every OAuth2 secured endpoint defines scopes
    given: $..paths.*[?( @property != 'parameters' )].security[0]
    severity: error
    resolved: false # this is true by default
    message: 'Errored at {{property}}... Error message is: {{error}}'
    then:
      field: oauth2
      function: length
      functionOptions:
        min: 1

@philsturgeon
Contributor

The JSON Path should be in the result, but maybe having it in stylish is too unstylish. Can we have a more verbose output format that makes this easier to understand? The JUnit output (#478) we just added can do this, right?

Maybe this JUnit feature would actually take care of this issue instead of adding the path to the other outputs?

We also have an HTML output (#389), which would be another verbose format that could help solve this issue.

If either of these solutions helps, let's close this one down.

@kylesykes
Contributor Author

Actually, if the JUnit output contained the more detailed path information (necessary in our case, and more than what the CLI output currently shows), that would be a far better solution, since it would integrate better with Gitlab's CI.

I'd be happy to close this down as soon as I can validate that something like the path could be included; otherwise, I can easily shift this from "add it to the CLI" to "add the path information to the JUnit output" if it's not there currently (because I see a lot of value in that, especially for people who have a large number of unresolved files that others are editing). I saw the JUnit PR was merged into develop a few days ago, so I tried the develop-tagged docker image to preview it, but something else fishy is going on (see the screenshot below). Unless I'm using the formatter incorrectly?

Thank you for your time and effort. We've been blown away by the quality of your tools compared to others, and by the progress you all make on them, and we will absolutely be putting Spectral through its paces for this project.

[screenshot]

@P0lip
Contributor

P0lip commented Sep 24, 2019

Actually, if the JUnit output contained the more detailed path information (necessary in our case, and more than what the CLI output currently shows), that would be a far better solution, since it would integrate better with Gitlab's CI.

If storing that information in the JUnit output is enough, I'm all for doing it that way.

I saw the JUnit PR was merged into develop a few days ago, so I tried the develop-tagged docker image to preview it, but something else fishy is going on (see the screenshot below). Unless I'm using the formatter incorrectly?

@kylesykes We haven't released it yet. :) The 4.2.0 release containing the JUnit formatter should be out soon, though!

@philsturgeon
Contributor

Added in JUnit! #588

Play around with develop once it's merged and let us know if there is any trouble. :)

@kylesykes
Contributor Author

Pulled the docker image this morning and it worked like a charm! Thank you for all your hard work. It seems we're catching you all at the sweet spot where you're actively developing what we're consuming, so I hope I'm not slamming you all with things far afield from your planned roadmap.

Noticed a couple of things you should know about with the JUnit output:

  1. [Bug] Any rules that use a custom function have the classname property on testcase set as undefined, which splits those tests into a different test suite entirely. The bottom collection of tests all use custom functions, whereas the ones above do not. Probably not intended?
    [screenshot]

  2. [Request/Advice?] When playing around with visualizing the output in various viewers, I can't figure out a way to get the path property to display (I'm assuming that's a nonstandard property in JUnit). I'm not at all familiar with JUnit, so I've just been furiously googling this morning to get up to speed. The upshot is that most JUnit viewers (such as Gitlab's) won't expose that extra path information to users, which might defeat the intent of adding it. I'm not sure whether this is just user error on my part and a misunderstanding of how best to utilize JUnit, so I'd be glad for some advice in that area.

At minimum, users could look at the raw JUnit XML to find the information, but I would imagine the point of adding it to the JUnit output was so other tools could pick it up and display it. Not sure what a solution looks like here, but I wanted to call it out since it's a newly implemented feature.

Here are some screenshots from various viewers we were experimenting with outside of Gitlab:
junit2html:
[screenshot]

junit-viewer:
[screenshot]

@P0lip
Contributor

P0lip commented Sep 27, 2019

@kylesykes

[Bug] Any rules that use a custom function have the classname property on testcase set as undefined, which splits those tests into a different test suite entirely. The bottom collection of tests all use custom functions, whereas the ones above do not. Probably not intended?

That's interesting. Do your custom functions provide any path? If so, I've got a hunch: the path those functions provide does not exist in the document.
Is that the case?

Regarding the second request, I'll verify it and get back to you.

@kylesykes
Contributor Author

Actually, I think I goofed on my initial guess at the issue. That particular rule doesn't use a custom function (I have another, almost identically named rule that does, however).

I have no idea what's happening now, or what's special about this rule versus the others that are working fine, so here are some examples.

Here's a rule with almost the same JSONPath syntax that is put into the right testsuite (and it uses a custom function, showing that's not the issue). The regex is just looking for paths that do not end in } (trying to find "collection"-type endpoints, i.e. /pets versus /pets/{petId}).

pagination-endpoints-paginated:
    description: All collection endpoints have the pagination tag
    given: $.paths[?( @property.match('^((?!.*\}$).)*$') )].get
    severity: error
    recommended: true
    then:
        field: tags
        function: pagination_exists

And here's the one that's throwing things in the undefined test suite:

pagination-responses-have-x-next-token:
    description: All collection endpoints have the X-Next-Token parameter in responses
    given: $.paths[?( @property.match('^((?!.*\}$).)*$') )].get.responses['200'].headers
    severity: error
    recommended: true
    then:
        - field: X-Next-Token
          function: truthy

The one difference I can spot is that the bottom rule traverses a path that gets resolved internally (.responses['200'] in the spec is just a $ref to a response object, and I'm checking the headers inside it), whereas the top one doesn't have that issue (the tags are an inline array).

If it's at all helpful, here's a barebones skeleton of what I'm working with (the actual spec is huge and way more complicated):

{
    "paths": {
        "/agreements": {
            "get": {
                "description": "Get some Agreements",
                "responses": {
                    "200": {
                        "$ref": "#/components/responses/GetAgreementsOk"
                    },
                    "default": {
                        "$ref": "#/components/responses/Error"
                    }
                },
                "summary": "List agreements",
                "tags": [
                    "agreements",
                    "pagination"
                ]
            }
        }
    },
    "responses": {
        "GetAgreementsOk": {
            "description": "Successful operation",
            "headers": {
                "X-Http-Request-Id": {
                    "$ref": "#/components/headers/X-Http-Request-Id"
                },
                "X-Next-Token": {
                    "$ref": "#/components/headers/X-Next-Token"
                }
            }
        }
    }
}

Fun fact: did you know JSONPath-Plus doesn't support the =~ regexp operator in JSONPath filters? I didn't, until struggling with it for two days 🤦‍♂
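
For anyone who hits the same wall, here's the contrast (the first form is the regexp-operator syntax some other JSONPath implementations support, which JSONPath-Plus does not; the second is the plain-JavaScript form the rules above use, since JSONPath-Plus evaluates JavaScript in filter expressions):

    # not supported by JSONPath-Plus (the =~ regexp filter operator)
    given: $.paths[?( @property =~ /^((?!.*\}$).)*$/ )].get
    # works: a JavaScript expression in the filter
    given: $.paths[?( @property.match('^((?!.*\}$).)*$') )].get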

@P0lip
Contributor

P0lip commented Sep 29, 2019

@kylesykes
Thanks for the sample rulesets and the portion of your spec. They helped me validate my hunch; I will fix the issue.

EDIT:
I've already got a fix. Going to open a PR tomorrow, or on Tuesday at the latest.

@kylesykes
Contributor Author

For those who are curious, it works perfectly. We use Gitlab and Spectral runs during PRs, and now (after #607 landed) Gitlab parses the JUnit output perfectly, pointing to exactly where the problem is in the larger unresolved spec, which line numbers alone couldn't do:

[screenshot]
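
For anyone wiring up the same thing, our CI job is shaped roughly like this (a sketch only; the image tag, file names, and flag spellings here are illustrative rather than copied verbatim from our pipeline):

spectral-lint:
  image:
    name: stoplight/spectral
    entrypoint: [""]
  script:
    # lint the externally resolved spec and write a JUnit report
    - spectral lint resolved-openapi.json --ruleset .spectral.yml --format junit --output spectral-junit.xml
  artifacts:
    reports:
      junit: spectral-junit.xml   # Gitlab parses this and surfaces failures on the MR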

Thank you all very much!

@P0lip
Contributor

P0lip commented Oct 3, 2019

@kylesykes great to hear that! Thanks again for all the awesome feedback you provided. ❤️
