D4.8: Facilities for running notebooks as verification tests #98

Closed · 4 tasks done
minrk opened this issue Sep 8, 2015 · 28 comments

@minrk (Contributor) commented Sep 8, 2015

The Jupyter Notebook is a web application that enables the creation and sharing of executable documents containing live code, equations, visualizations, and explanatory text. Thanks to its modular design, Jupyter can be used with any computational system that provides a so-called Jupyter kernel, implementing the Jupyter messaging protocol to communicate with the notebook. OpenDreamKit therefore promotes the Jupyter notebook as the user interface of choice, in particular because it is well suited to building modular, web-based Virtual Research Environments.

This deliverable aims at enabling testing of Jupyter notebooks, with a good balance of convenience and configurability to address the range of possible ways to validate notebooks. Testing is integral to ODK's goal of enabling reproducible practices in computational mathematics and science, and this work enables validating notebooks as documentation and communication products, extending the scope of testing beyond traditional software.

Accomplishments:

  • Developed the nbval package for testing notebooks;
  • Allowed multiple testing modes, ranging from lax error-checking to strict output comparison (see the example invocations below);
  • Enabled normalizing output for comparison of transient values such as memory addresses and dates;
  • Integrated D4.6 (nbdime, #95: "Tools for collaborating on notebooks via version-control") for displaying changes between notebooks when they differ.
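
As a minimal sketch of the two ends of that spectrum, assuming nbval is installed as a pytest plugin (the notebook path is illustrative):

```
# Strict mode: re-execute the notebook and compare each cell's fresh
# output against the output stored in the file.
pytest --nbval docs/tutorial.ipynb

# Lax mode: only check that every cell executes without raising an
# error; stored outputs are not compared.
pytest --nbval-lax docs/tutorial.ipynb
```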
minrk added this to the D4.9 milestone on Sep 8, 2015
minrk self-assigned this on Nov 3, 2015
@martinal commented:

This will build on top of the diff tools developed as part of #95.

@serge-sans-paille (Contributor) commented:

Does this mean we could rerun a notebook, verify that it still works, and verify that the output of each cell is the same as it used to be?

In that case it sounds like a great feature to me!

@martinal commented:

That would be the general idea, yes, and also showing diffs of the output when it changes.

Note that there are already some tools out there for running notebooks as tests; I haven't looked into them much yet, as my focus right now is on diff and merge.
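
The nbdime integration listed in the accomplishments above is exposed as a pytest option; a hedged sketch, assuming a version of nbval that ships the integration:

```
# Render failing output comparisons as rich notebook diffs via nbdime
pytest --nbval docs/tutorial.ipynb --nbdime
```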

@fangohr (Member) commented Dec 18, 2015

One of the testing tools is this: https://github.com/computationalmodelling/pytest_validate_nb

A use case for us (and the reason to develop it) is to re-execute documentation and tutorials and to check that they still run: it is a common problem that you sit down and write examples and tutorials at some point, but fail to update them as interfaces change. By running those notebooks as tests, we get the additional testing for free.

Dealing with particular kinds of text output (times, dates, memory addresses, etc.) needs additional attention, as they can change from run to run.
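
nbval addresses this with user-supplied sanitization rules: regular expressions that rewrite both the stored and the freshly generated output before they are compared. A sketch, with an illustrative file name and patterns:

```
# sanitize.cfg -- each section defines one regex/replacement pair,
# applied to outputs on both sides before comparison.
[memory]
regex: 0x[0-9a-fA-F]+
replace: MEMORY_ADDRESS

[date]
regex: \d{4}-\d{2}-\d{2}
replace: DATE
```

The file is passed on the command line, e.g. `pytest --nbval docs/tutorial.ipynb --sanitize-with sanitize.cfg`.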

@martinal commented:

That is a very common problem, yes. Are you using this for any large-ish projects? Is it stable and/or used by others outside your team? Either way, it should serve as a base, or at least as inspiration, later on.

@fangohr (Member) commented Dec 18, 2015

No-ish. We have used this technology for a large (internal) project, and it worked well (until an IPython upgrade broke our homegrown scripts and we failed to fix them). The code in this repository is intended to be a replacement for those scripts, and is increasingly used for new projects. I don't know of any other users. As you say, it may serve as inspiration only; and if it ends up being used for OpenDreamKit, that's also great.

nthiery modified the milestones: Month 18: 2017-02-28, D4.9 on Mar 22, 2016
@takluyver (Member) commented:

pytest_validate_nb has now been renamed to nbval, and our group is using it in fidimag (like this).
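
The wiring is typically just a pytest invocation over the project's notebooks in the test script or CI configuration; a hypothetical sketch (the paths are invented, not fidimag's actual layout):

```
# Run all documentation notebooks as part of the regular test suite
pytest --nbval doc/ipynb/*.ipynb

# Individual cells with inherently unstable output can opt out by
# starting with the marker comment  # NBVAL_IGNORE_OUTPUT
```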

@martinal commented:

@takluyver I think it makes sense to build on nbval rather than creating a new tool. What is missing from nbval to cover this deliverable? I'll look into it more sometime soon.

@bpilorget (Contributor) commented:

@minrk (WP leader) and @kohlhase (Lead Beneficiary): this deliverable is due in February 2017.

@fangohr (Member) commented Nov 28, 2016

@martinal We have gathered a wish list of features for nbval at https://github.com/computationalmodelling/nbval/issues

@kohlhase (Member) commented Dec 16, 2016 via email

@minrk (Contributor, Author) commented Dec 16, 2016

@kohlhase it certainly is. Unassigned, and assigned myself.

bpilorget changed the title from "D4.9: Facilities for running notebooks as verification tests" to "D4.8: Facilities for running notebooks as verification tests" on Dec 16, 2016
@bpilorget (Contributor) commented:

@kohlhase is the lead for "Facilities for running notebooks as verification tests", which was numbered D4.9 here whereas it is D4.8 in the grant; hence the confusion. I have corrected it.

@nthiery (Contributor) commented Feb 6, 2017

Dear M18 deliverable leaders,

Just a reminder that reports are due mid-February, to buy us some time for proofreading, feedback, and final submission before February 28. See our README for details on the process.

In practice, I'll be offline February 12-19, and the week right after will be pretty busy. It would therefore be helpful if a first draft could be available sometime this week, so that I can get a head start on reviewing it.

Thanks in advance!

@minrk (Contributor, Author) commented Feb 9, 2017

@nthiery @fangohr @takluyver I've pushed a draft of the D4.8 report, if you'd like to have a look or propose more content. In particular, I'd like to know whom I should add to the authors list.

@fangohr (Member) commented Feb 10, 2017

Hi Min, many thanks for putting this together. I'll try to read, extend, and give feedback soon. Regarding authors: we had many people involved in the initial development, but they probably don't need to be listed on the deliverable report.

@nthiery (Contributor) commented Feb 10, 2017 via email

@nthiery (Contributor) commented Feb 10, 2017

For the description of what Jupyter notebooks are, you can e.g. copy-paste the one from D4.4's issue description (#93).

@minrk (Contributor, Author) commented Feb 10, 2017

@nthiery thanks, I've updated the GitHub description.

@fangohr the guidelines suggest that the authors should include only the ODK participants on the report, since it's a mostly internal thing. I'm just not sure which contributors from your side are on that list.

@takluyver (Member) commented:

I just made a few minor changes.

@nthiery (Contributor) commented Feb 23, 2017

Hi @minrk, @takluyver, @fangohr,
Is there anything else you are planning for the report, or shall I just do a last pass of proofreading tomorrow and submit?

@minrk (Contributor, Author) commented Feb 24, 2017

Almost there. I am going through a checklist, but will be done by the end of the day.

@minrk (Contributor, Author) commented Feb 24, 2017

@nthiery I believe this one is ready to go.

@fangohr (Member) commented Feb 24, 2017

Nothing further from me to contribute: making sure we have the items from the checklist should complete this; thank you @minrk. @nthiery I already updated the report some time ago, taking into account your feedback from 14 days ago.

[The only remaining consideration would be to include the documentation in the report itself, but if we link to its URL, that should work just as well. And I should have mentioned it earlier anyway.] I won't have any time to contribute further in the next few days, so please proceed without me from here.

@fangohr (Member) commented Feb 24, 2017

I am very pleased with the tool :)

@nthiery (Contributor) commented Feb 27, 2017

I did some minor edits, and added nbval's home page and documentation as appendices to the report (f31abea). About to submit!

@nthiery (Contributor) commented Feb 27, 2017

By the way: a few suggestions about nbval itself (not needed for the report):

  • Better advertise the documentation notebook in the README (I missed it);
  • Possibly advertise ODK's report there: it contains relevant design and discussion material! Alternatively, that material could be moved to nbval's repo at some point;
  • The documentation notebook could have a more informative title (maybe "nbval's documentation").

@nthiery (Contributor) commented Feb 27, 2017

Submitted! Thanks everyone for all the cool work!
