
Fix all lints #327

Merged: merged 14 commits into easystats:main on Jan 24, 2023
Conversation

@rempsyc (Sponsor Member) commented Jan 19, 2023

Fixes all lints (easystats/easystats#334)

@rempsyc rempsyc removed the request for review from IndrajeetPatil January 19, 2023 23:34
@codecov-commenter commented Jan 20, 2023

Codecov Report

❗ No coverage uploaded for pull request base (main@d721298).
The diff coverage is 66.25%.

❗ Current head 010a242 differs from pull request most recent head e1f6a25. Consider uploading reports for the commit e1f6a25 to get more accurate results.

```
@@           Coverage Diff           @@
##             main     #327   +/-   ##
=======================================
  Coverage        ?   58.85%
=======================================
  Files           ?       46
  Lines           ?     3145
  Branches        ?        0
=======================================
  Hits            ?     1851
  Misses          ?     1294
  Partials        ?        0
```
| Impacted Files | Coverage Δ |
|---|---|
| R/report.brmsfit.R | 0.00% <0.00%> (ø) |
| R/report.default.R | 0.00% <0.00%> (ø) |
| R/report.stanreg.R | 92.95% <ø> (ø) |
| R/report_misc.R | 0.00% <0.00%> (ø) |
| R/report_text.R | 94.44% <ø> (ø) |
| R/utils_error_message.R | 0.00% <0.00%> (ø) |
| R/utils_grouped_df.R | 0.00% <0.00%> (ø) |
| R/report.data.frame.R | 3.00% <6.66%> (ø) |
| R/report.numeric.R | 59.35% <58.82%> (ø) |
| R/report.bayesfactor_models.R | 75.98% <60.00%> (ø) |

... and 11 more


@IndrajeetPatil (Member) left a comment

That was a lot of work. Thank you so much, @rempsyc!

I am just awaiting a few clarifications in my comments.

Resolved review threads:
- tests/testthat/helper.R
- R/report.default.R
- R/report_parameters.R
- R/report_participants.R (outdated)
- R/report_participants.R (outdated)
If we're not sure whether `x` is of length 1, we should also wrap it into `!all(x %in% names(data))`
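To illustrate the review note above, here is a minimal sketch (with hypothetical `x` and `data`, not the actual report code) of why the `all()` wrapper matters: `%in%` is vectorized, so the negated test returns one logical per element of `x`, while `if ()` needs a single value.

```r
# Hypothetical example: `%in%` returns one logical per element of `x`,
# so a bare `!x %in% names(data)` is only safe when `x` has length 1.
data <- data.frame(a = 1, b = 2)
x <- c("a", "z")

x %in% names(data)
#> [1]  TRUE FALSE

# Collapsing with all() gives the single condition `if ()` expects:
if (!all(x %in% names(data))) {
  warning("Some elements of `x` are not columns of `data`.")
}
#> Warning: Some elements of `x` are not columns of `data`.
```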

Co-authored-by: Indrajeet Patil <patilindrajeet.science@gmail.com>
@strengejacke (Member)

Sorry, my last one should have been a comment, not a commit.

@rempsyc (Sponsor Member, Author) commented Jan 20, 2023

What's the point of Codecov Report if it doesn't work correctly? Or am I doing something wrong?

@rempsyc (Sponsor Member, Author) commented Jan 20, 2023

Any idea why vapply is failing but only on ubuntu-latest (oldrel-3)?

@IndrajeetPatil (Member)

> What's the point of Codecov Report if it doesn't work correctly? Or am I doing something wrong?

It's working as expected. We run our code coverage on Ubuntu machines, but we now run tests only on Windows machines, so on Ubuntu the code coverage is 0.

I'll change the workflow to use Windows instead.

> Any idea why vapply is failing but only on ubuntu-latest (oldrel-3)?

Must be related to stringsAsFactors somehow?
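A hedged guess at the failure mode (a sketch, not the actual report code): before R 4.0.0, data.frame() defaulted to stringsAsFactors = TRUE, so on the oldrel runners character columns arrive as factors and can break a vapply() template written for character columns.

```r
# On R < 4.0.0 (or with the old default made explicit), character
# columns become factors:
df <- data.frame(g = c("a", "b"), stringsAsFactors = TRUE)

# vapply() strictly enforces its FUN.VALUE template; factors are stored
# as integers, so a character template fails (error text approximate):
vapply(df, function(col) col[1], FUN.VALUE = character(1))
#> Error in vapply(...): values must be type 'character',
#>   but FUN(X[[1]]) result is type 'integer'
```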

IndrajeetPatil and others added 2 commits January 20, 2023 16:20
Co-authored-by: Indrajeet Patil <patilindrajeet.science@gmail.com>
@IndrajeetPatil (Member)

Code coverage is about 60%

@rempsyc (Sponsor Member, Author) commented Jan 20, 2023

> Must be related to stringsAsFactors somehow?

Thanks Indra, that was it precisely. One of the examples (grouped df) was causing the issue when stringsAsFactors = TRUE. I changed the example.
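For illustration, this is the kind of change that makes such an example version-proof (hypothetical; the actual edit may differ):

```r
# Setting stringsAsFactors explicitly keeps the example identical on
# R < 4.0.0 and R >= 4.0.0:
df <- data.frame(g = c("a", "b"), value = 1:2, stringsAsFactors = FALSE)
str(df$g)
#>  chr [1:2] "a" "b"
```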

@rempsyc (Sponsor Member, Author) commented Jan 23, 2023

There is a dependency error on older R versions:

```
Error:
! error in pak subprocess
Caused by error:
! Could not solve package dependencies:
* deps::.: Can't install dependency ivreg
* ivreg: Can't install dependency car (>= 3.0-9)
* car: Can't install dependency pbkrtest (>= 0.4-4)
* pbkrtest: Needs R >= 4.1.0
```

Technically we could skip the ivreg examples on older R versions, right? But won't pak try to install the suggested dependency and fail anyway? How does that normally work?
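For reference, the usual way to make an example or test tolerate a missing suggested package is to guard it (a sketch, assuming ivreg stays in Suggests; this does not stop pak from trying to install Suggests during workflow setup, which is the separate issue above):

```r
# In examples: only run the ivreg-dependent code when the package is
# actually available.
if (requireNamespace("ivreg", quietly = TRUE)) {
  # ivreg example code would go here
}

# In tests, testthat offers the same guard as a one-liner:
# testthat::skip_if_not_installed("ivreg")
```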

@IndrajeetPatil (Member)

All of these issues are already fixed in the workflows. If you push a new commit, you shouldn't see them anymore.

@IndrajeetPatil (Member)

@rempsyc I think some snapshots have changed again due to changes in effectsize. Can you please update them? Thanks. I don't have access to a Windows machine right now.

@IndrajeetPatil (Member)

The snapshot tests are still failing, but I am going to merge this anyway.

Looks like snapshots are mismatched because of warnings.

@IndrajeetPatil IndrajeetPatil merged commit 345e82f into easystats:main Jan 24, 2023
@rempsyc (Sponsor Member, Author) commented Jan 24, 2023

There's something weird going on. The following produces no warning in the console, in a reprex, using testthat::test_file("tests/testthat/test-report.htest-t-test.R"), or via devtools::test():

```r
packageVersion("report")
#> [1] '0.5.5.4'
packageVersion("effectsize")
#> [1] '0.8.2.6'

sleep2 <- reshape(sleep, direction = "wide", idvar = "ID", timevar = "group")
x <- report::report(t.test(sleep2$extra.1, sleep2$extra.2, paired = TRUE))
```

Created on 2023-01-24 with reprex v2.0.2

But when running the tests via RStudio's "Run Tests" (on the active file), RStudio's devtools::test() (Ctrl+Shift+E), or the GHA workflows, the "Unable to retrieve data from htest object" warning appears... This warning shouldn't appear anyway, because we are not using the formula method here. I updated effectsize to the latest GitHub version too.

@rempsyc (Sponsor Member, Author) commented Jan 24, 2023

Also, you can see from my last commit that my snapshots actually removed the warnings. Yet in the snapshot errors, it looks like old is first and new is second, but the warning appears first, as if the snapshots were not actually updated by my commit:

```
old vs new
  "Code"
  "  report(t.test(df$x, df$y, paired = TRUE))"
- "Output"
- "  Effect sizes were labelled following Cohen's (1988) recommendations."
- "  "
- "  The Paired t-test testing the difference between df$x and df$y (mean difference"
- "  = 0.43) suggests that the effect is positive, statistically significant, and"
- "  large (difference = 0.43, 95% CI [0.10, 0.76], t(8) = 3.04, p = 0.016; Cohen's"
- "  d = 1.01, 95% CI [0.18, 1.81])"
+ "Warning <simpleWarning>"
+ "  Unable to retrieve data from htest object."
+ "    Returning an approximate effect size using t_to_d()."
+ "Output"
+ "  Effect sizes were labelled following Cohen's (1988) recommendations."
+ "  "
+ "  The Paired t-test testing the difference between df$x and df$y (mean difference"
+ "  = 0.43) suggests that the effect is positive, statistically significant, and"
+ "  large (difference = 0.43, 95% CI [0.10, 0.76], t(8) = 3.04, p = 0.016; Cohen's"
+ "  d = 1.07, 95% CI [0.19, 1.92])"
```

What am I missing here?

@IndrajeetPatil (Member)

I am fairly certain this has to do with how insight::get_data() works in interactive versus non-interactive contexts.

I think a way to reconcile these differences is by using suppressWarnings() in these tests.
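A minimal sketch of that workaround, reusing the reprex above (the expectation shown is illustrative; the real tests use snapshots):

```r
test_that("report works for paired t-test", {
  sleep2 <- reshape(sleep, direction = "wide", idvar = "ID", timevar = "group")
  # Suppress the context-dependent get_data() warning so interactive and
  # non-interactive runs produce the same output:
  x <- suppressWarnings(
    report::report(t.test(sleep2$extra.1, sleep2$extra.2, paired = TRUE))
  )
  expect_true(inherits(x, "report"))  # assumed class, for illustration
})
```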

@rempsyc (Sponsor Member, Author) commented Jan 24, 2023

Sure, I can suppress these warnings. But first I would like to understand why, in non-interactive contexts, get_data() is not able to retrieve the data the way it does in interactive contexts. It seems like it should not emit any warning at all here, since we are not using the formula method...

@rempsyc (Sponsor Member, Author) commented Jan 24, 2023

Or, according to the GHA error, perhaps it is actually behaving correctly and not emitting the warning. But then it should not show the old snapshots as having those warnings, as they do not actually appear in the current snapshots: https://github.com/easystats/report/blob/main/tests/testthat/_snaps/windows/report.htest-t-test.md

@mattansb (Member)

> But when running the tests via RStudio's "Run Tests" (on the active file), RStudio's devtools::test() (Ctrl+Shift+E), or the GHA workflows, the "Unable to retrieve data from htest object" warning appears...

I would try superassigning the data (with <<-); this is what I do in effectsize:

https://github.com/easystats/effectsize/blob/282747e7eac8394cc19e58627a18ec4ea55dce95/tests/testthat/test-effectsize.R#L4-L10
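Applied to the reprex above, the pattern would look roughly like this (a sketch; the linked effectsize test is the authoritative reference):

```r
test_that("report works for paired t-test", {
  # Superassign so the data lands in the global environment, where
  # insight::get_data() can still find it when the htest stores no call:
  sleep2 <<- reshape(sleep, direction = "wide", idvar = "ID", timevar = "group")
  expect_snapshot(
    report::report(t.test(sleep2$extra.1, sleep2$extra.2, paired = TRUE))
  )
})
```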

@rempsyc (Sponsor Member, Author) commented Jan 25, 2023

@mattansb that seems to have done the trick, thanks! :D

@IndrajeetPatil (Member) commented Jan 25, 2023

There is no perfect solution here. Super-assignment creates objects in the parent environment, which breaks encapsulation for tests. This can lead to order effects, which are hard to debug if tests start failing.

If you are not sure what I mean, try the following: create two test files.

File: tests/testthat/test-test1.R

```r
test_that("test-1", {
  x <<- 4
  expect_equal(x, 4)
})
```

File: tests/testthat/test-test2.R

```r
test_that("test-2", {
  expect_equal(x, 4)
})
```

And then run devtools::test(). test-2 will pass (even though x is not even defined in that file)!
But if you go to the test-2 file and run tests on the active file, the tests will fail.
The super-assignment has introduced dependencies between test files here, which is never a good idea.

This is just to give some context as to why we shouldn't be using <<- unless and until it's absolutely necessary.
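If a test really must superassign, one hedged mitigation is to clean the object up again so it cannot leak into other test files (assuming the withr package is available, as it is in a testthat setup):

```r
test_that("test-1", {
  x <<- 4
  # Remove the global again when this test finishes, so test-2 cannot
  # silently depend on it:
  withr::defer(rm(x, envir = globalenv()))
  expect_equal(x, 4)
})
```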

@mattansb (Member)

Absolutely, but in this case it solves the scoping issue that insight::get_data() suffers from for htest objects (which don't store a call, WHY???)

@IndrajeetPatil (Member)

Definitely. I think you and I are on the same page here. Just wanted to give some context for Remi. Super-assignments can be seductive, so I like disillusioning people a bit :)

I have already merged his PR that uses super-assignments to fix the tests.

> htest objects (which don't store a call, WHY???)

Is this something we can request be implemented in R itself? I've never made a feature request on the R mailing list, though, so I'm not sure how that works.

@mattansb (Member)

> Is this something we can request be implemented in R itself? I've never made a feature request on the R mailing list, though, so I'm not sure how that works.

I think we've discussed this in the past. @bwiernik?
