Support URLs in lint --rules #41

Merged
merged 4 commits into from
Apr 20, 2018

Conversation

@dangoosby (Contributor) commented Apr 3, 2018

Speccy can now retrieve rules from a provided URL.

speccy lint test/samples/petstore.yaml --rules https://gist.githubusercontent.com/philsturgeon/b5295d402f2dc94b86aa32eddb1fa100/raw/06eec564a2f8c9b972930981c165ebd7b00b8322/rules.json

@coveralls commented Apr 3, 2018

Coverage increased (+1.6%) to 65.881% when pulling f160602 on rules-url into 17b01cb on master.

@philsturgeon (Contributor) left a comment

This looks great! Did you get a chance to add the test we were talking about? Using express to serve up a rules file so we can test that the loading actually works?

speccy.js Outdated
@@ -23,6 +23,7 @@ program
.command('lint <file-or-url>')
.option('-q, --quiet', 'reduce verbosity')
.option('-r, --rules [ruleFile]', 'Provide multiple rules files', collect, [])
// .option('-u, --url [ruleURL]', 'Provide multiple rules urls', collect, [])

👋

lib/loader.js Outdated
@@ -86,7 +95,7 @@ function deepMergeRules(ruleNickname, skipRules, rules = []) {
return rules;
}

const loadRules = (loadFiles, skipRules = []) => {
const loadRules = (loadFiles, skipRules = [], url = []) => {

Can probably get rid of this

@@ -9598,6 +9603,14 @@
}
}
},
"wget": {

Oops this is still in there

@@ -35,6 +35,11 @@ describe('loader.js', () => {
]
};

it('accepts url rules', () => {
const url = loader.loadRules(['https://raw.githubusercontent.com/wework/speccy/master/rules/default.json']);

My concern about this test is that it goes out to the actual internet. If GitHub is down, the test suite will fail. If a contributor is offline, the test suite will fail.

To get around this, most languages/frameworks have some sort of "web mock" library. In Ruby it's literally called webmock, and in Node there's an awesome one called nock.

We could maybe do it with express, but nock might be a tad lighter, and would avoid any awkward firewall-related issues that could come from actually running a server locally.

lib/linter.js Outdated
const lint = async (objectName, object, options = {}) => {

console.log('now linting', object);
console.log('activeRules', activeRules);

Problem 1: This is outputting empty in tests using this loadRuleFiles logic.

http://github.com/wework/speccy/blob/8e94f39bb4caa49ae6d3cf3f43fb408a4978f31c/test/linter.test.js#L21

package.json Outdated
@@ -6,7 +6,7 @@
"speccy": "./speccy.js"
},
"scripts": {
"test": "nyc --reporter=html --reporter=text mocha",

Don't let this change sneak in!

@philsturgeon (Contributor) commented Apr 11, 2018

So, this change ended up involving something @MikeRalphson pointed out: the loader knew too much about the shape of linter rules. A lot of the code is becoming async in this PR, which should make a few things better, but it is making testing hell.

All tests that use the helper lintAndExpectErrors(rule, input, ['exactly-two-things']); are getting stupid errors like this:

(node:62894) UnhandledPromiseRejectionWarning: AssertionError: expected Array [ 'exactly-two-things' ] to be Array [ 'exactly-two-things' ]
    at Assertion.fail (/Users/psturgeon/src/speccy/node_modules/should/cjs/should.js:275:17)
    at Assertion.value (/Users/psturgeon/src/speccy/node_modules/should/cjs/should.js:356:19)
    at /Users/psturgeon/src/speccy/node_modules/should/cjs/should.js:387:24
    at <anonymous>
    at process._tickCallback (internal/process/next_tick.js:188:7)
(node:62894) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 28)

Is runLinter(object, test.input).should.be.eventually.equal(test.expectedRuleErrors); the fiend causing this? I don't want to complicate all the tests by returning a promise and making them then it up; it's easy to get false positives that way.

lib/linter.js Outdated

const initialize = (options = {}) => {
activeRules = {};
if (options.skip instanceof Array) {


Can use Array.isArray here (same as line 39)
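As background on that nit: Array.isArray is generally preferred over instanceof Array because it also recognizes arrays created in other realms and rejects array-likes. A quick illustration:

```javascript
const skip = ['rule-a'];
console.log(skip instanceof Array); // true
console.log(Array.isArray(skip));   // true

// An array-like object is not an array; Array.isArray rejects it.
console.log(Array.isArray({ length: 1, 0: 'rule-a' })); // false
```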

@philsturgeon philsturgeon force-pushed the rules-url branch 5 times, most recently from c0d1e7c to 5ba4ad5 Compare April 12, 2018 18:40
lib/loader.js Outdated
@@ -25,6 +31,20 @@ function readFileAsync(filename, encoding) {
});
}

const fetchUrl = url => {
return fetch(url).then(response => {
@maniator commented Apr 12, 2018

Since you are already using async/await you can make this a little less "thenny":

const fetchUrl = async (url) => {
    const response = await fetch(url);

    if (response.ok) {
        try {
            return await response.json();
        } catch (error) {
            return Promise.reject(new ReadError('Invalid JSON: ' + error.message));
        }
    }
    if (response.status === 404) {
        return Promise.reject(new OpenError('Page not found: ' + url));
    }

    return Promise.reject(new NetworkError('HTTP error: ' + response.status)); 
};

I am not entirely sure whether returning Promise.reject is the best approach, or whether just throwing an error would be better.
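For what it's worth, from the caller's perspective the two are equivalent inside an async function: both produce a rejected promise. A quick illustration (error messages invented):

```javascript
// Both functions reject with the same error as far as callers can tell.
const viaReject = async () => Promise.reject(new Error('boom'));
const viaThrow = async () => { throw new Error('boom'); };

viaReject().catch(error => console.log('reject:', error.message));
viaThrow().catch(error => console.log('throw:', error.message));
// prints "reject: boom" and "throw: boom"
```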


Awesome, thank you. This is better. I wrote it like this when I was failing to understand async/await, but I think I got it now. :)


Happy to help ^_^

@philsturgeon philsturgeon added this to the 0.6 milestone Apr 13, 2018
@philsturgeon philsturgeon added the enhancement New feature or request label Apr 13, 2018
@philsturgeon philsturgeon dismissed their stale review April 13, 2018 13:13

Ended up getting involved with the code, cannot review now.

@philsturgeon (Contributor)

This PR is ready! Tested the command out manually and the tests cover everything else.

lib/loader.js Outdated
loadedRules = loadedRules.concat(deepMergeRules(files[f], skipRules));
}
return loadedRules;
const promises = files.map(file => recursivelyLoadRuleFiles(file, [], { verbose }));

Gah, ok, so loading URLs does not actually work it seems. It loads the file, but I don't think the promises ever get resolved. In my debugging I added these console lines:

    console.log('Got some promises', promises);
    const foo = await Promise.all(promises);
    console.log('Did they resolve?', foo);

The output with --verbose is...

GET https://gist.githubusercontent.com/philsturgeon/b5295d402f2dc94b86aa32eddb1fa100/raw/1f1c3516294c8c8ebe294aea8b56c7116bce28c0/rules.json
Got some promises [ Promise { <pending> } ]
Specification is valid, with 0 lint errors

So it seems to junk out somewhere in the Promise.all... What the heck is that about?


The death is happening mid-fetchUrl: this code outputs "got here https://gist.githubusercontent.com/philsturgeon/b5295d402f2dc94b86aa32eddb1fa100/raw/06eec564a2f8c9b972930981c165ebd7b00b8322/rules.json" but never outputs the data or an error...


const fetchUrl = async (url) => {
    console.log('got here', url);

    fetch(url).then(data => {
        console.log('data', data)
    }, err => {
        console.log('err', err)
    });


You may be missing some wrapping try/catch mechanisms in your code.
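One likely culprit in the debug snippet above: the fetch(url).then(...) promise is never returned or awaited, so fetchUrl resolves to undefined immediately while the request is still in flight. A runnable sketch of the difference, with a simulated fetch standing in for node-fetch:

```javascript
// Simulated fetch: resolves after a short delay, like a real request.
const fakeFetch = url =>
    new Promise(resolve => setTimeout(() => resolve({ url, ok: true }), 10));

// Buggy shape: fires the request but resolves to undefined right away.
const fetchUrlBroken = async (url) => {
    fakeFetch(url).then(data => console.log('data', data.url));
};

// Fixed shape: awaiting keeps the caller attached to the response.
const fetchUrlFixed = async (url) => {
    const response = await fakeFetch(url);
    return response;
};

fetchUrlBroken('https://example.com/rules.json')
    .then(result => console.log('broken resolved with', result));
// prints "broken resolved with undefined" before the "data ..." line arrives
```

That matches the symptom above: the request goes out (the GET is logged) but the surrounding Promise.all settles with nothing useful.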

lib/loader.js Outdated

let data;
if (file && file.startsWith('http')) {
data = await fetchUrl(file);


Remember to wrap this in a try/catch, as it can now throw errors
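A sketch of that wrapping; fetchUrl here is a hypothetical stand-in that always fails, just to exercise the error path:

```javascript
// Hypothetical failing fetchUrl, standing in for the real loader function.
const fetchUrl = async () => { throw new Error('HTTP error: 500'); };

const loadRuleData = async (file) => {
    if (file && file.startsWith('http')) {
        try {
            return await fetchUrl(file);
        } catch (error) {
            // Re-throw with context so the caller sees which file failed.
            throw new Error(`Failed to load rules from ${file}: ${error.message}`);
        }
    }
    return null;
};

loadRuleData('http://example.com/rules.json')
    .catch(error => console.log(error.message));
// prints "Failed to load rules from http://example.com/rules.json: HTTP error: 500"
```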

@philsturgeon philsturgeon force-pushed the rules-url branch 2 times, most recently from 25864fe to 6be79f3 Compare April 17, 2018 19:29
@philsturgeon (Contributor)

Done! Ready, actually got it working thanks to help from a million people.

@philsturgeon philsturgeon changed the title added fetch rules from url functionality and test Support URLs in lint --rules Apr 17, 2018
const lintObject = (objectName, object, options) => {
const results = options.linter(objectName, object, options);

// Update results

This method is mostly there to avoid needing to refactor validate.js to look for the new return value. In the future I'd like to continue this work, but I'm waiting for the dust to settle on openapi-kit before I worry about any of that. Let's accept this slight oddity and move on.

linter.lint('something', input, options);
const ruleErrors = options.lintResults.map(result => result.rule.name);
ruleErrors.should.deepEqual(expectedErrors);
linter.initialize();

New API for linter is easier to work with thanks to awesome advice from @MikeRalphson.

@philsturgeon (Contributor)

@maniator how is it looking? Need some thumbs!

Phil Sturgeon and others added 4 commits April 20, 2018 12:26
This makes the loader a bit less aware of internal linter workings, and theoretically speeds things up a little bit.

Feedback from @maniator
@philsturgeon philsturgeon merged commit 1ffa57a into master Apr 20, 2018
@philsturgeon philsturgeon deleted the rules-url branch April 20, 2018 17:26
@philsturgeon (Contributor)

Thank you @maniator @qwertypants @dangoosby for your help on this one!

@maniator

Happy to help @philsturgeon :-D

@maniator

@philsturgeon I don't often look @ github alerts :-( (slack is my friend...)

const result = await asyncMap(files, file => recursivelyLoadRuleFiles(file, [], { verbose }));
const flatten = [].concat(...result);
// Unique copy of the array
return [...(new Set(flatten))];


Not sure the new Set is needed here

/shrug
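Context on that nit: spreading into a Set deduplicates by reference, so it only removes rules when the exact same object appears twice (e.g. two rule files that were merged from the same loaded object); two structurally identical rule objects both survive. For example:

```javascript
const ruleA = { name: 'no-script-tags' };
const ruleB = { name: 'no-script-tags' }; // same contents, different object

const flattened = [ruleA, ruleA, ruleB];
const unique = [...new Set(flattened)];

// The repeated ruleA reference is dropped, but ruleB is kept.
console.log(unique.length); // prints 2
```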

Labels: enhancement (New feature or request)
5 participants