Tests for shapley.R and explanation.R #128

Merged
merged 9 commits into master from martin/shapr_explain_tests on Nov 4, 2019

Conversation

martinju
Member

Both tests rely on an xgboost model (not fitted in the test to ensure consistency across different xgboost versions) and a small subset of the Boston data.

The tests check whether the complete output from shapr and explain matches the previously stored reference output.

For the explain function, the tests cover the 18 different examples from test-refactor.R. All examples are tested in the same call, but the error messages specify which example failed, for easier debugging.
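For reference, a minimal sketch of the snapshot-test pattern described above. The file paths, argument values and object names are illustrative assumptions, not the actual test code in this PR:

```r
library(testthat)
library(xgboost)
library(shapr)

test_that("explain() output is unchanged", {
  # Pre-fitted xgboost model and a small Boston subset are read from file
  # (not refitted here), so results stay consistent across xgboost versions.
  model <- readRDS("model_objects/xgboost_model_object.rds")
  x_train <- readRDS("model_objects/x_train.rds")
  x_test <- readRDS("model_objects/x_test.rds")

  explainer <- shapr(x_train, model)
  explanation <- explain(
    x_test,
    explainer,
    approach = "empirical",
    prediction_zero = 0  # illustrative value
  )

  # Writes the reference file on the first run, compares against it afterwards.
  expect_known_value(explanation, file = "test_objects/explanation_explain_obj_list.rds")
})
```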

@nikolase90
Collaborator

Smooth!

@nikolase90
Collaborator

Could you add a script, i.e. inst/script/test_data.R, that generates all the files? It could be nice to have if we at some point need to regenerate them.

@martinju
Member Author

martinju commented Nov 1, 2019

Sure. expect_known_value writes the objects to file if the file doesn't exist, so perhaps all that is needed is a script for generating the xgboost model?

@nikolase90
Collaborator

Ah, ok. So if we wanted to update the results at some point we would just delete the file that is currently there, right? And then the tests would write a new file?

@martinju
Member Author

martinju commented Nov 1, 2019

Exactly. If there is no file, the test will pass but give a warning. I will make a script that creates the model, though.
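A rough sketch of what such a model-generating script (e.g. the suggested inst/script/test_data.R) could look like. The feature subset, xgboost parameters and output path are assumptions for illustration:

```r
# Sketch: refit and store the xgboost model used by the tests.
library(MASS)     # provides the Boston data
library(xgboost)

# Assumed feature subset and response; the actual script may differ.
x_var <- c("lstat", "rm", "dis", "indus")
y_var <- "medv"

x_train <- as.matrix(Boston[, x_var])
y_train <- Boston[, y_var]

model <- xgboost(
  data = x_train,
  label = y_train,
  nrounds = 20,
  verbose = FALSE
)

saveRDS(model, "tests/testthat/model_objects/xgboost_model_object.rds")
```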

Renames to ensure shapley.R test runs before explanation.R since the latter depends on the object stored by the former.
@nikolase90
Collaborator

nikolase90 commented Nov 1, 2019

Question:
I pulled the branch, installed the package, and ran devtools::test(). Everything is OK.

Then I deleted explanation_explain_obj_list.rds, did the same steps as above, and the tests fail. Any idea why?

@nikolase90
Collaborator

@martinju Do you want me to take a look?

@martinju
Member Author

martinju commented Nov 1, 2019

Ok, this has to be a versioning issue. I upgraded to the CRAN version of xgboost (0.90.2), and the tests now fail when predicting with xgboost (i.e. a completely different issue).
@nikolase90 Maybe you can try to pull, delete the xgboost and test objects, recreate the xgboost object, and then run devtools::test() twice and push the new files? I have a feeling that might just work.
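The suggested regeneration workflow, as a hedged sketch (all paths are assumptions about where the objects live):

```r
# Delete the stored xgboost model and the reference test objects ...
unlink("tests/testthat/model_objects/xgboost_model_object.rds")
unlink(list.files("tests/testthat/test_objects", full.names = TRUE))

# ... recreate the xgboost model ...
source("inst/script/test_data.R")

# ... then run the tests twice: the first run rewrites the missing reference
# files (with warnings), the second run should pass cleanly.
devtools::test()
devtools::test()
```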

@nikolase90
Collaborator

Sure!

@martinju
Member Author

martinju commented Nov 1, 2019

And, yes, feel free to try to debug on your end if that doesn't do the trick.

@martinju
Member Author

martinju commented Nov 1, 2019

Woohoo, test passed!

@martinju merged commit 38fdad4 into master on Nov 4, 2019
@martinju deleted the martin/shapr_explain_tests branch on Nov 4, 2019 at 11:49
@nikolase90 mentioned this pull request on Nov 5, 2019