
Support for custom expectations? #22

Closed
mllg opened this issue Jul 18, 2019 · 19 comments
Labels: enhancement (New feature or request)

Comments

mllg commented Jul 18, 2019

My package checkmate extends testthat with more than 40 expect_* functions, and I'm looking into doing something similar for tinytest. I would need the ability to write custom expectations, but AFAICT this is currently not possible.

Are there any plans to support custom expectations, or to customize the message (testthat has label and info arguments for that)?

markvanderloo (Owner) commented Jul 18, 2019

Indeed, tinytest is currently not extensible :-). But it is a good idea to make it so. I haven't given it a lot of thought, but I think it should be doable without much trouble. I'll put it on the list!

markvanderloo (Owner) commented Jul 19, 2019

Hi Michael,

So I figured out how to do it. If I implement it, the API would look something like this:

  1. You write expect_ functions. These must return a tinytest object, which can be created as follows:

tinytest::tinytest(result, call, diff, short)

where
- result is TRUE or FALSE (pass or fail),
- call is the call to the expectation function (usually sys.call(sys.parent(1)) will do),
- diff is a character scalar: a longer description in the case of FALSE, otherwise NA_character_,
- short is one of "data" when the data differs, "attr" when an attribute differs, or "xcpt" when an expected exception is not thrown.

Tinytest will take care of adding file names and line numbers when necessary.

  2. You notify tinytest of the new functions by setting an option. Probably something like

options(tt.extensions = list(checkmate = c("expect_hihi", "expect_haha")))

This can be left to the user, or you do it in your .onLoad; that way tinytest can pick it up when a user does library(checkmate) in a test file. (A minimal sketch of both steps follows after this list.)
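For illustration, a minimal sketch of both steps under the API proposed above. The expectation expect_positive and its diff message are made up for this example; the constructor arguments and the tt.extensions option are taken from this comment and may differ from what eventually ships.

# Step 1 (hypothetical): a custom expectation returning a tinytest object.
expect_positive <- function(x) {
  result <- is.numeric(x) && all(x > 0)
  tinytest::tinytest(
    result = result,
    call   = sys.call(sys.parent(1)),  # the call as written in the test file
    diff   = if (result) NA_character_ else "Not all elements are > 0",
    short  = if (result) NA_character_ else "data"
  )
}

# Step 2 (hypothetical): tell tinytest about it, e.g. from .onLoad().
options(tt.extensions = list(checkmate = c("expect_positive")))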

This means we are not compatible with testthat (I never aimed for it to be), so it would be some work on your side. If that's not a problem for you, I can put it in the next version.

Let me know what you think.

Edit: I have not really tested this, and I'm not sure if/how it works when functions from your package use other functions internal to your package. I have to experiment with that as well before I know whether it really works.

Edit again: OK, it can be made to work. It takes a bit of detailed work, so probably not in the next release.

mllg (Author) commented Jul 20, 2019

Your approach looks good to me. I'll have a closer look next week and report back if everything works out.

As both unit test frameworks use the same naming scheme for expectations (expect_*), I see three possibilities to support tinytest:

  1. Stick with expect_* for testthat, and try to find another prefix for tinytest (I don't like this option).
  2. Detect the framework at runtime by parsing the DESCRIPTION for either testthat or tinytest (a rough sketch follows after this list).
  3. Detect the framework at runtime by querying an environment variable or option. It is not entirely clear to me where this should be set so that the variable is available (a) during R CMD check but also (b) after loading the package with devtools::load_all().
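For illustration, a rough sketch of option 2. The helper detect_framework is hypothetical (not part of checkmate) and assumes the DESCRIPTION of the package under test is readable from the given path.

# Hypothetical helper: guess the active framework from the DESCRIPTION
# of the package under test (option 2 above).
detect_framework <- function(path = ".") {
  fields <- read.dcf(file.path(path, "DESCRIPTION"),
                     fields = c("Suggests", "Imports"))
  deps <- paste(fields[1, ], collapse = ", ")
  if (grepl("\\btinytest\\b", deps)) "tinytest"
  else if (grepl("\\btestthat\\b", deps)) "testthat"
  else NA_character_
}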

Do you see any other options?

tdeenes commented Jul 20, 2019

When I read @markvanderloo's post yesterday, I thought I would suggest what you describe as option 3 (using an environment variable and an option, where the environment variable takes precedence). However, there is a fourth option: remove all unit-test-framework-dependent functions from checkmate, and create deliberately tiny packages which extend a given unit test package. E.g., you could introduce checkmate.testthat and checkmate.tinytest.

mllg (Author) commented Jul 20, 2019

Removing the expectations from checkmate would break quite a few reverse dependencies, so the transition would take some months. Might still be worth it, though.

tdeenes commented Jul 20, 2019

The big gains would be that 1) checkmate itself would be totally independent of the unit testing frameworks, 2) it would be pretty easy to create extension packages as new test frameworks arise, and 3) all expectation functions could have the same names in each extension package without any further machinery.

As for the transition: I find the data.table-way pretty compelling.

markvanderloo (Owner) commented:

@mllg I need to test it, but I think I can set a variable for you within R. Tinytest runs all expressions in a test file in a fresh environment created with new.env(). I can create a variable there that you can poll with exists() from your expect_ functions.
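As a purely hypothetical sketch of such a poll from inside an expect_ function; the marker name TINYTEST_ACTIVE is invented here and is not something tinytest actually sets.

# Hypothetical: look for a marker variable in the environment the test
# file is evaluated in (and its parents) to detect a tinytest run.
running_under_tinytest <- function() {
  exists("TINYTEST_ACTIVE", envir = parent.frame(2), inherits = TRUE)
}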

markvanderloo (Owner) commented:
Added in 31ead00 and b04df1c. Needs testing and probably still contains bugs, but I've verified that it does not break anything existing, including reverse suggests.

I would be grateful if you could kick the tyres a little with something simple (if not, no problem of course).

You can install the current dev version as follows (note: I do not put .Rd files in the repo, as they are generated by roxygen2 anyway):

git clone https://github.com/markvanderloo/tinytest
cd tinytest
make install

The API is described in

?register_tinytest_extension
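As a rough sketch of how an extension package might hook into this from its .onLoad(); the expectation names are the placeholders used earlier in this thread, and the exact arguments should be checked against the help page above.

# Hypothetical .onLoad() of an extension package, registering its
# expect_* functions with tinytest if tinytest is available.
.onLoad <- function(libname, pkgname) {
  if (requireNamespace("tinytest", quietly = TRUE)) {
    tinytest::register_tinytest_extension(
      pkgname,
      c("expect_hihi", "expect_haha")  # placeholder expectation names
    )
  }
}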

mllg (Author) commented Jul 25, 2019

unregister_tinytest_extension() is not working yet. L is not defined.

mllg (Author) commented Jul 25, 2019

I decided that having separate packages for the expectations is the better solution in the long run. I've transferred the expect_ functions from checkmate to checkmate.testthat (https://github.com/mllg/checkmate.testthat). My plan is to upload the package to CRAN soon, notify all package maintainers that they need to transition to the new package for the expectations, and then upload a version of checkmate without the expectations.

I created a similar package for tinytest here: https://github.com/mllg/checkmate.tinytest.
This still needs testing; it is a first proof of concept. My main concern is that I might have screwed up the variable name lookup. Maybe you can have a quick look?

markvanderloo (Owner) commented:
Fixed unregister_tinytest_extension() in daf014d. Also added it to the test suite.

I'm going to build a small extension package that I will use for testing but it will also serve as an example. Will not get to that until next week.

Had a quick look at your package; it seems to register everything right. As said, I have not actually run anything with an extension package yet, but it should work (famous last words :-) ).

Also, currently the API does not overwrite existing functions -- functions loaded earlier take precedence. That should change to 'functions added later take precedence' (like in R).

markvanderloo (Owner) commented Aug 9, 2019

Now done in 8664bf9. Some notes:

  • Unregistering is no longer necessary, as we cannot unload packages anyway.
    • However, people can create multiple test files in /test, which R CMD check will run in separate R sessions.
  • Extension packages now overwrite functions loaded earlier.
  • There are minimal example packages. See here and here.

Full spec in ?register_tinytest_extension.
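For illustration, a test file using an extension package might look roughly like this; the file name is made up, while checkmate.tinytest and expect_numeric are the package and expectation discussed earlier in this thread.

# Hypothetical test file, e.g. inst/tinytest/test_example.R.
library(checkmate.tinytest)   # loading the extension registers its expect_* functions

expect_true(1 + 1 == 2)       # built-in tinytest expectation
expect_numeric(runif(3))      # custom expectation from the extension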

Closing this issue now.

markvanderloo added the enhancement label Aug 9, 2019
markvanderloo (Owner) commented:
By the way, as a token of my gratitude for your suggestion and feedback, you have been immortalized in the NEWS file ;-).

mllg (Author) commented Aug 19, 2019

Everything seems to work now. Thanks!

mllg (Author) commented Aug 20, 2019

Well, I discovered some issues now:

  1. It seems to me that you are not running any tests at all in https://github.com/markvanderloo/uses.tinytest.extension/tree/master/pkg? There are no test files, and you are testing a package "pkg" when it should be testing "uses.tinytest.extension": https://github.com/markvanderloo/uses.tinytest.extension/blob/master/pkg/tests/tinytest.R#L3

  2. I get reports for failed custom expectations, but successful expectations are not recorded. An example is here https://github.com/mllg/checkmate.tinytest where I would expect to see 5 successful tests, but only 3 are reported.

markvanderloo reopened this Aug 21, 2019
markvanderloo (Owner) commented:
I have solved issue 1 in markvanderloo/uses.tinytest.extension@291ab9e and markvanderloo/uses.tinytest.extension@ec5462b. Running make test works fine now.

Regarding issue 2, the link points to your repo, not to an example. One thing to keep in mind is that passing tests are recorded but not printed by default. You can print passes by saving the test output, say out <- test_all(), and then print(out, passes=TRUE). Just a wild guess; if you are already aware of it, my apologies.
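That is, run from the package root (both calls as described above):

# Save the results and print passing expectations as well as failures.
out <- tinytest::test_all()
print(out, passes = TRUE)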

Cheers for reporting, as always.

mllg (Author) commented Aug 21, 2019

Regarding issue 2, the link points to your repo, not to an example. One thing to keep in mind is that passing tests are recorded but not printed by default.

Sorry, the link was indeed not helpful. I meant to point here: https://github.com/mllg/checkmate.tinytest/blob/master/inst/tinytest/test_cm.R. I'm unsure how to produce a more isolated example of the following issue:

  • 1st call to tinytest::test_all() in the package root dir: "6 tests OK"
  • 2nd and subsequent calls to tinytest::test_all() in the package root dir: "3 tests OK"
  • 1st call to tinytest::run_test_dir("inst/tinytest") in the package root dir: "6 tests OK"
  • 2nd and subsequent calls to tinytest::run_test_dir("inst/tinytest") in the package root dir: "3 tests OK"
  • All calls to tinytest::test_package("checkmate.tinytest"): "3 tests OK"

Note that I have 5 tests in the referenced file: 3 calling expect_true() and 2 calling a custom expectation from checkmate (expect_numeric()). I would expect "5 tests OK" every time, right?

markvanderloo (Owner) commented:
Thanks, fixed in 5592c36. There was some really subtle stuff going on with (lazy) evaluation.

markvanderloo (Owner) commented:
This is part of version 1.0.0 on CRAN, so closing now.
