As mentioned in PR #151, I think more tests would be useful. That PR added some simple unit tests to get the ball rolling; this one adds a simple integration test, using supertest to make requests to I/O Docs and nock to mock out a remote service.
The test does a lot of faffing about moving config.json around - if #146 gets merged, we could look at avoiding that.
Add initial simple integration test
I'd rather see this done with mocks/stubs than by running a server for testing. Have you tried http://sinonjs.org/ before?
Thanks for the quick feedback!
These tests aren't actually running a standalone server. I/O Docs itself isn't started as a separate process: supertest takes the app object directly and binds it to an ephemeral port for each request, so there's nothing to deploy or manage. The remote API isn't being run either: nock intercepts the HTTP requests made by I/O Docs and services them with mocked responses, in-process.
I'd envisaged these tests as integration tests, which is why the only thing mocked is the remote API. I completely agree that there should also be some unit tests that test components in greater isolation: see PR #151 for that, which does indeed use sinon, as you suggest.
It's quite difficult to write unit tests like that at the moment, though: there's a little refactoring needed to carve things up into testable chunks of code. That's where having a suite of integration tests like this would come in handy (to prove that those refactorings are safe).
Does that make this seem a bit more palatable?
I like the approach of mocking the remote APIs for integration tests. At the same time, I don't see any need to test the remote services themselves; it seems like overkill to write tests that test the mocks of services. I'm going to close this one, but if you object feel free to let me know. It might be interesting to write a mock service that allows us to exercise all of iodocs, and then continue with the mocha tests against those portions of iodocs. I would also hesitate to bring on more testing or mocking frameworks than you already introduced in your last PR :)
I'm going to close this one but if you object feel free to let me know
I object! Or, at least, I'd like to take a minute to explain why I think there's some value here. :)
I don't see any need to test the remote services themselves, it seems like overkill to write tests to test the mocks of services
Yes, I agree - but I don't think that's quite what's happening here (or at least, not the only thing happening here). The spec in this PR tests that iodocs passes a request and response along to a mocked service correctly - i.e. that a request made from a client (e.g. a browser in the real world, or supertest in our tests) to /processReq (an iodocs endpoint) causes iodocs to correctly interpret a configuration file, make a request to a server (which we've mocked), and return the response (e.g. a simple JSON object) to the client.
This spec is very simple - a proof of concept, really - but I'd imagined writing a good many more tests (to cover different HTTP methods, different content types, different aspects of configuration, and so on).
I would also hesitate to bring on more testing or mocking frameworks
So, we could get away without supertest (it's really just a convenience for creating requests to the iodocs app), but nock is pretty essential (to mock the service in-process, and to keep the suite snappy to run).
It might be interesting to write a mock service that allows us to exercise all of iodocs
Honestly, it's been a while since I looked at iodocs, but from memory I think it would be tricky to cover everything with only one such service - we might need a few, in order to cover mutually exclusive functionality. But, yes, having mock services so we can exercise the whole of iodocs is exactly what I tried to achieve here.
and then continue with the mocha tests against those portions of iodocs
Agreed. As I said above, more unit tests would be good. Unless things have changed since last year (and they may have done; I haven't kept up with the codebase), I think they'll be tricky to write without some refactoring, and having some integration tests will help give confidence in those refactorings. My usual preference would be to keep the integration tests relatively few in number, just enough to cover the mainline cases, and then address quirky edge cases with unit tests once that's possible.
All that said, I'm no longer actively using iodocs, so I'm unlikely to undertake the above. If you don't think that's a sensible road to take (and nobody else volunteers), it's probably best to keep this PR closed.
Thanks @rowanhill this is a great conversation starter and I think you have some valid points that warrant more investigation. I don't want to reopen this PR exactly but the discussion is definitely something I'd like to see continue. IO Docs is in need of a good testing methodology and your input is valuable to the discussion moving forward. Created #215 to continue this thread.