
submission: awardFindR #432

Closed
10 of 27 tasks
mccallc opened this issue Mar 5, 2021 · 26 comments

@mccallc

mccallc commented Mar 5, 2021

Submitting Author: Michael McCall (@mccallc)
Other Authors: Sebastian Karcher (@adam3smith)
Repository: https://github.com/PESData/awardFindR
Version submitted: 0.1.0
Editor: @annakrystalli
Reviewers: @karawoo, @zambujo

Due date for @karawoo: 2021-06-01

Due date for @zambujo: 2021-06-01
Archive: TBD
Version accepted: TBD


  • Paste the full DESCRIPTION file inside a code block below:
Package: awardFindR
Type: Package
Title: QDR awardFindR
Version: 0.1.0
Authors@R: c(person("Michael", "McCall", email = "mimccall@syr.edu",
                  role = c("aut", "cre"), comment = c(ORCID="0000-0002-4668-4212")),
                  person("Sebastian", "Karcher", email = "karcher@u.northwestern.edu",
                  role = c("ctb"), comment = c(ORCID = "0000-0001-8249-7388")),
                  person("Qualitative Data", "Repository", email = "qdr@syr.edu",
                  role = c("cph")),
                  person("Sloan", "Foundation", email = "technology@sloan.org",
                  role = c("fnd")))
Author: Michael C. McCall
Maintainer: Michael McCall <mimccall@syr.edu>
Description: Queries a number of scientific awards databases.
    Collects relevant results based on keyword and date parameters,
    returns list of projects that fit those criteria as a data frame.
    Sources include: Arnold Ventures, Carnegie Corp, Federal RePORTER,
    Gates Foundation, MacArthur Foundation, Mellon Foundation, NEH, NIH,
    NSF, Open Philanthropy, Open Society Foundations, Rockefeller Foundation,
    Russell Sage Foundation, Robert Wood Johnson Foundation, 
    Sloan Foundation, Social Science Research Council, John Templeton Foundation,
    USASpending.gov
URL: https://github.com/PESData/awardFindR
BugReports: https://github.com/PESData/awardFindR/issues
License: MIT + file LICENSE
Encoding: UTF-8
LazyData: true
Imports:
    rvest,
    xml2,
    readr,
    httr
RoxygenNote: 7.1.1
Suggests: 
    knitr,
    rmarkdown,
    testthat (>= 3.0.0),
    vcr,
    covr
VignetteBuilder: knitr
Depends: 
    R (>= 2.10)
Config/testthat/edition: 3

Scope

  • Please indicate which category or categories from our package fit policies this package falls under: (Please check an appropriate box below. If you are unsure, we suggest you make a pre-submission inquiry.):

    • data retrieval
    • data extraction
    • data munging
    • data deposition
    • workflow automation
    • version control
    • citation management and bibliometrics
    • scientific software wrappers
    • field and lab reproducibility tools
    • database software bindings
    • geospatial data
    • text analysis
  • Explain how and why the package falls under these categories (briefly, 1-2 sentences):

Acts as an interface to web-based APIs and search engines.

  • Who is the target audience and what are scientific applications of this package?

Anyone interested in trends in academic funding, or researchers looking for a suitable funder based on previously funded research. The package can be used for scientometrics generally, or for more narrow, field-specific lines of inquiry.

The target audience is primarily US-based researchers. Governmental sources of research funding are all reasonably well covered through the combination of NSF, NEH, NIH, Federal RePORTER, and USAspending. Private funders are harder to cover comprehensively, but the sources included are the most prominent in the social sciences.

Nothing that is functionally similar. Some source-specific packages exist, but none that aggregate across sources.

nihexporter provides search functionality for the NIH grant database, but is limited to an existing scrape of NIH data from 2016. awardsBot dynamically scrapes NSF awards, but lacks search functionality and is primarily designed for tracking and emailing PIs regarding their obligations. fedreporter is not actively maintained and only offers search functionality for an antiquated and limited version of the Federal RePORTER API we implement here.

This package offers a unified, simple search functionality across these federal grant databases, and can include many private funders as well that are not covered by any existing packages.

Yes

  • If you made a pre-submission enquiry, please paste the link to the corresponding issue, forum post, or other discussion, or @tag the editor you contacted.

Technical checks

Confirm each of the following by checking the box.

This package:

Publication options

  • Do you intend for this package to go on CRAN?

  • Do you intend for this package to go on Bioconductor?

  • Do you wish to submit an Applications Article about your package to Methods in Ecology and Evolution? If so:

MEE Options
  • The package is novel and will be of interest to the broad readership of the journal.
  • The manuscript describing the package is no longer than 3000 words.
  • You intend to archive the code for the package in a long-term repository which meets the requirements of the journal (see MEE's Policy on Publishing Code)
  • (Scope: Do consider MEE's Aims and Scope for your manuscript. We make no guarantee that your manuscript will be within MEE scope.)
  • (Although not required, we strongly recommend having a full manuscript prepared when you submit here.)
  • (Please do not submit your package separately to Methods in Ecology and Evolution)

Code of conduct

@melvidoni
Contributor

Hello @mccallc, thanks for submitting. We were discussing this submission with the Associate Editors, but we need more information.

In particular, we want to know "how much this package solves the problem". Say, does it cover all possible sources? The top-10 sources? The most prestigious sources? Also, it would be nice to know not just that other packages cover some sources, but more precisely what proportion of the covered sources are already provided by other packages. If all sources are already considered, maybe the benefit of this package is that it might provide a consistent interface across sources?

Please, edit your answer on the submission post, and let me know once it is done.

@mccallc
Author

mccallc commented Mar 10, 2021

Hello @melvidoni, thank you so much for your consideration. I believe I have now addressed your questions in the original post.

@melvidoni
Contributor

Thanks @mccallc. @annakrystalli will be your Handling Editor. She will run the initial checks soon.

@annakrystalli
Contributor

@ropensci-review-bot assign @annakrystalli as editor

@ropensci-review-bot
Collaborator

Assigned! @annakrystalli is now the editor

@annakrystalli
Contributor

Editor checks:

  • Documentation: The package has sufficient documentation available online (README, pkgdown docs) to allow for an assessment of functionality and scope without installing the package.
  • Fit: The package meets criteria for fit and overlap
  • Automated tests: Package has a testing suite and is tested via a CI service.
  • License: The package has a CRAN or OSI accepted license
  • Repository: The repository link resolves correctly

Editor comments

Dear @mccallc

Thanks for submitting this super interesting package! I've now completed initial editor checks and there are a few issues to sort out before we proceed.

goodpractice

There are a number of issues returned from running goodpractice::gp()

It is good practice to

  ✖ avoid long code lines, it is bad for readability.
    Also, many people prefer editor windows that are about 80
    characters wide. Try make your lines shorter than 80
    characters

    R/arnold.R:10:1
    R/arnold.R:20:1
    R/carnegie.R:34:1
    R/carnegie.R:35:1
    R/fedreporter.R:5:1
    ... and 74 more lines

  ✖ avoid sapply(), it is not type safe. It might
    return a vector, or a list, depending on the input data.
    Consider using vapply() instead.

    R/carnegie.R:25:13
    R/main.R:89:26
    R/main.R:90:17
    R/rockefeller.R:21:18

  ✖ fix this R CMD check NOTE: Malformed Description
    field: should contain one or more complete sentences.

  ✖ fix this R CMD check NOTE: Namespace in Imports
    field not imported from: ‘readr’. All declared Imports
    should be used.

  ✖ fix this R CMD check ERROR: Running examples in
    ‘awardFindR-Ex.R’ failed. The error most likely occurred in:

    > ### Name: gates_get
    > ### Title: Query awards from the Bill & Melinda Gates Foundation
    > ### Aliases: gates_get
    >
    > ### ** Examples
    >
    > gates <- gates_get("qualitative", "2018-01-01", "2020-01-01")
    POST https://www.gatesfoundation.org/services/gfo/search.ashx ... Not Found (HTTP 404).
    Warning in request(url, "post", payload) : Not Found (HTTP 404).
    Error: $ operator is invalid for atomic vectors
    Execution halted

  ✖ fix this R CMD check ERROR: checking tests ... Running ‘testthat.R’ ERROR.
    Running the tests in ‘tests/testthat.R’ failed. Last 13 lines of output:

    11. └─awardFindR:::FUN(X[[i]], ...)
    12. └─awardFindR:::request(paste0(query_url, "&offset=", offset), "get")
    13. └─httr::GET(url)
    14. └─httr:::request_perform(req, hu$handle$handle)
    15. └─httr:::perform_callback("request", req = req)
    16. └─webmockr:::callback(...)
    17. └─webmockr::HttrAdapter$new()$handle_request(req)
    18. └─private$request_handler(req)$handle()
    19. └─eval(parse(text = req_type_fun))(self$request)
    20. └─err$run()
    21. └─self$construct_message()
    [ FAIL 1 | WARN 0 | SKIP 0 | PASS 14 ]
    Error: Test failures
    Execution halted

  ✖ avoid 'T' and 'F', as they are just variables which
    are set to the logicals 'TRUE' and 'FALSE' by default, but
    are not reserved words and hence can be overwritten by the
    user. Hence, one should always use 'TRUE' and 'FALSE' for
    the logicals.

    R/arnold.R:NA:NA
    R/arnold.R:NA:NA
    R/arnold.R:NA:NA
    R/carnegie.R:NA:NA
    R/carnegie.R:NA:NA
    ... and 7 more lines
Tests

A bit more detail on the single failing test when running devtools::test() (also turning up in devtools::check())

Error (test-full.R:2:3): Search that should return all empty
Error: 

================================================================================
An HTTP request has been made that vcr does not know how to handle:
GET https://api.nsf.gov/services/v1/awards.json?keyword="foobar"&cfdaNumber=47.041,47.050,47.049,47.078,47.083,47.079,47.070,47.074,47.076,47.075&printFields=id,date,startDate,expDate,title,awardeeName,piFirstName,piLastName,piEmail,cfdaNumber,fundsObligatedAmt,fundProgramName&dateStart=01/01/2019&dateEnd=03/22/2021&offset=1
vcr is currently using the following cassette:
  - ../fixtures/empty.yml
    - record_mode: once
    - match_requests_on: method, uri
Run `vcr::vcr_last_error()` for more verbose errors
If you're not sure what to do, open an issue https://github.com/ropensci/vcr/issues
& see https://books.ropensci.org/http-testing
================================================================================


Backtrace:
  1. base::suppressMessages(...) test-full.R:2:2
  5. awardFindR::awardFindR("foobar") test-full.R:3:4
  9. awardFindR::.nsf_standardize(keywords, from_date, to_date)
 10. base::lapply(keywords, nsf_get, from_date, to_date) /Users/Anna/Documents/workflows/rOpenSci/editorials/awardFindR/R/nsf.R:49:2
 11. awardFindR:::FUN(X[[i]], ...)
 12. awardFindR::request(paste0(query_url, "&offset=", offset), "get") /Users/Anna/Documents/workflows/rOpenSci/editorials/awardFindR/R/nsf.R:24:2
 13. httr::GET(url) /Users/Anna/Documents/workflows/rOpenSci/editorials/awardFindR/R/utils.R:30:4
 14. httr:::request_perform(req, hu$handle$handle)
 15. httr:::perform_callback("request", req = req)
 16. webmockr:::callback(...)
 17. webmockr::HttrAdapter$new()$handle_request(req)
 18. private$request_handler(req)$handle()
 19. eval(parse(text = req_type_fun))(self$request)
 20. err$run()
 21. self$construct_message()
──────────────────────────────────────────────────────────────────────────
✓ |   1       | http [0.3 s]                                              
✓ |   1       | neh [1.7 s]                                               
✓ |   1       | ophil [0.2 s]                                             
✓ |   1       | osociety [0.4 s]                                          
✓ |   2       | rockefeller [0.4 s]                                       
✓ |   1       | ssrc [0.9 s]                                              
✓ |   1       | templeton [3.7 s]                                         
✓ |   1       | usaspend [0.2 s]                                          

══ Results ═══════════════════════════════════════════════════════════════
Duration: 89.6 s

Test failure means I can't run covr::package_coverage() to get a breakdown of test coverage.

Spellcheck

Only a single typo thrown up by devtools::spell_check()

    Philanthrophy   README.md:23
    

Documentation question

In your docs, you only demonstrate the use of the awardFindR() function, yet in the reference section the list of exported functions is much longer, including functions like request() that seem to be more back-end utilities. It's a good time to review your exported functions to make sure the reviewers can focus on the functions that are relevant to the end user. Also, for the functions you do decide to export, while it's not necessary to introduce every single one in the docs (that's what examples and the reference section are for, after all!), it would be good to mention in the docs that other exported functions exist, with an example of how a user might typically use one of them.


@mccallc, could you also please add the rOpenSci under review badge to your README?

[![](https://badges.ropensci.org/432_status.svg)](https://github.com/ropensci/software-review/issues/432)

@mccallc
Author

mccallc commented Apr 2, 2021

Hello @annakrystalli,

I believe I have addressed each of your points regarding good practice, tests, spellcheck, documentation, and adding the rOpenSci badge. These are all reflected in the latest commit to the repository.

goodpractice::gp() still returns a few lines missing tests, though these are mainly fringe cases of error handling that can reasonably be expected to work. Over 99% of the code is covered. If a full 100% is needed, please let me know.

@mccallc mccallc closed this as completed Apr 2, 2021
@mccallc mccallc reopened this Apr 2, 2021
@mccallc
Author

mccallc commented May 10, 2021

Hello @annakrystalli,

Hope I'm not bothering, but I'm not sure if I messed up any automatic processes when I accidentally closed this ticket prematurely. I believe I addressed your previous comments, and I wanted to indicate I'm still interested in continuing the review process.

@annakrystalli
Contributor

Hello @mccallc ! Thanks for confirming and for addressing the initial comments. 99% coverage is absolutely fine.

I'll start looking for reviewers.

@annakrystalli
Contributor

@ropensci-review-bot seeking reviewers

@ropensci-review-bot
Collaborator

Please add this badge to the README of your package repository:

[![Status at rOpenSci Software Peer Review](https://badges.ropensci.org/432_status.svg)](https://github.com/ropensci/software-review/issues/432)

Furthermore, if your package does not have a NEWS.md file yet, please create one to capture the changes made during the review process. See https://devguide.ropensci.org/releasing.html#news

@annakrystalli
Contributor

@ropensci-review-bot add @karawoo to reviewers

@ropensci-review-bot
Collaborator

@karawoo added to the reviewers list. Review due date is 2021-06-01. Thanks @karawoo for accepting to review! Please refer to our reviewer guide.

@annakrystalli
Contributor

@ropensci-review-bot add @zambujo to reviewers

@ropensci-review-bot
Collaborator

@zambujo added to the reviewers list. Review due date is 2021-06-01. Thanks @zambujo for accepting to review! Please refer to our reviewer guide.

@annakrystalli
Contributor

Dear @karawoo & @zambujo, just a reminder that the deadline for review is approaching. If you have any questions, please feel free to reach out!

@karawoo

karawoo commented May 29, 2021

Package Review

Please check off boxes as applicable, and elaborate in comments below. Your review is not limited to these topics, as described in the reviewer guide.

  • Briefly describe any working relationship you have (had) with the package authors.
  • As the reviewer I confirm that there are no conflicts of interest for me to review this work (If you are unsure whether you are in conflict, please speak to your editor before starting your review).

Documentation

The package includes all the following forms of documentation:

  • A statement of need clearly stating problems the software is designed to solve and its target audience in README
  • Installation instructions: for the development version of package and any non-standard dependencies in README
  • Vignette(s) demonstrating major functionality that runs successfully locally
  • Function Documentation: for all exported functions
  • Examples (that run successfully locally) for all exported functions -- one example threw an error (see details below)
  • Community guidelines including contribution guidelines in the README or CONTRIBUTING, and DESCRIPTION with URL, BugReports and Maintainer (which may be autogenerated via Authors@R).

Functionality

  • Installation: Installation succeeds as documented.
  • Functionality: Any functional claims of the software have been confirmed.
  • Performance: Any performance claims of the software have been confirmed.
  • Automated tests: Unit tests cover essential functions of the package
    and a reasonable range of inputs and conditions. All tests pass on the local machine.
  • Packaging guidelines: The package conforms to the rOpenSci packaging guidelines

Estimated hours spent reviewing: 3

  • Should the author(s) deem it appropriate, I agree to be acknowledged as a package reviewer ("rev" role) in the package DESCRIPTION file.

Review Comments

This is a really nice and well put together package. It was very easy to get started with it and the functions are quite intuitive. I thought the contributing guide was also particularly well written.

While testing the examples I found that one example threw an error, and a few others didn't error but returned different output than I expected:

library("awardFindR")

## Throws an error
specific <- awardFindR(keywords=c("ethnography", "case studies"))
#> ... [removed long curl output here] ...
#> Error in data.frame(grantee = info[2], date, amount, program = info[4],  : 
#>   arguments imply differing number of rows: 1, 0

## Returns NULL
(arnold <- arnold_get("qualitative data", 2016, 2017))
#> POST https://pyj9b8sltv-dsn.algolia.net/1/indexes/*/queries?x-algolia-agent=Algolia%20for%20JavaScript%20(4.5.1)%3B%20Browser%20(lite)%3B%20instantsearch.js%20(4.8.3)%3B%20JS%20Helper%20(3.2.2)&x-algolia-api-key=bec9ead5977b11ae383f4df272c2f106&x-algolia-application-id=PYJ9B8SLTV ... OK (HTTP 200).
#> NULL

## Succeeds with a warning
sloan <- sloan_get(c("qualitative data", "case studies"), 2018, 2020)
#> GET https://sloan.org/grants-database?dynamic=1&order_by=approved_at&order_by_direction=desc&limit=3000 ... OK (HTTP 200).
#> Warning in grepl(keyword, descriptions, ignore.case = TRUE): argument 'pattern'
#> has length > 1 and only the first element will be used

## Returns a list instead of a data frame
nsf <- nsf_get("ethnography", "2020-01-01", "2020-02-01")
#> GET https://api.nsf.gov/services/v1/awards.json?keyword="ethnography"&printFields=id,date,startDate,expDate,title,awardeeName,piFirstName,piLastName,piEmail,cfdaNumber,fundsObligatedAmt,fundProgramName,abstractText,awardeeCounty&dateStart=01/01/2020&dateEnd=02/01/2020&offset=1 ... OK (HTTP 200).
class(nsf)
#> [1] "list"

Created on 2021-05-28 by the reprex package (v0.3.0.9001)

I noticed that awardFindR() sets the stringsAsFactors option to FALSE, and this setting persists for the rest of the user's R session. While FALSE is now the default as of R 4.0.0, I think it would be more polite to set the option within the function and restore the user's original setting when the function exits:

opt <- options(stringsAsFactors = FALSE)
on.exit(options(opt))

In general the functions in this package are quite consistent; however, some functions (neh_get(), for example) provide a message when no results are found, whereas others do not. Personally I find the messages helpful (more so than the HTTP requests that are printed).

Other minor comments

A few thoughts/ideas that I'll leave up to the package author whether they want to incorporate:

  • Speaking of the HTTP request messages, I think these are probably sometimes useful, but there may also be times when they are not wanted. While a user could wrap their function call in suppressMessages(), another option might be to add a verbose argument to the functions that shows all the requests if TRUE and hides them if FALSE. This is just an idea though -- ultimately that's a design decision for the author.

  • In gates_get() the arguments from_year and to_year are defined as "Year integer to begin search", but the examples in the documentation show character strings in YYYY-MM-DD format. It looks like gates_get() can take full dates, so you might want to rename these to from_date/to_date, and could use @inheritParams to inherit the documentation from another function with the same arguments.

  • While testing out awardFindR() I accidentally included a typo in my sources and entered them like this c("nsf, fedreporter") (all as one string). This was obviously user error, but a quick and easy way to return a more helpful error message could be to add

    sources <- match.arg(sources, several.ok = TRUE)

    near the top of the function. Then my typo would have created the following error message:

    Error in match.arg(sources, several.ok = TRUE) : 
      'arg' should be one of "arnold", "carnegie", "fedreporter", "gates", "macarthur", "mellon", "neh", "nih", "nsf", "ophil", "osociety", "rockefeller", "rsf", "rwjf", "sloan", "ssrc", "templeton", "usaspend"
    

    instead of

    Error in awardFindR("baikal", sources = c("nsf, fedreporter"), from_date = "2010-01-01",  : 
      exists(paste0(".", source, "_standardize")) is not TRUE
    

    Up to the author if they want to include this, just a suggestion to help out clumsy-typing users like me :)
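For what it's worth, the verbose idea mentioned above could look something like the following sketch. The function and URL here are purely illustrative, not the package's actual API:

```r
## Illustrative only: gate request messages behind a verbose flag.
## fetch_example() and its URL are hypothetical stand-ins.
fetch_example <- function(keyword, verbose = FALSE) {
  url <- paste0("https://example.org/api?keyword=", utils::URLencode(keyword))
  if (verbose) message("GET ", url)  # printed only when requested
  url  # a real implementation would perform the request and parse the response
}

fetch_example("ethnography")                  # silent
fetch_example("ethnography", verbose = TRUE)  # prints the GET line
```

Callers who want the old chatty behavior can opt back in with verbose = TRUE, and nobody needs suppressMessages().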

@zambujo

zambujo commented Jun 1, 2021

Package Review

  • As the reviewer I confirm that there are no conflicts of interest for me to review this work (If you are unsure whether you are in conflict, please speak to your editor before starting your review).

Documentation

The package includes all the following forms of documentation:

  • A statement of need clearly stating problems the software is designed to solve and its target audience in README
  • Installation instructions: for the development version of package and any non-standard dependencies in README
  • Vignette(s) demonstrating major functionality that runs successfully locally
  • Function Documentation: for all exported functions
  • Examples (that run successfully locally) for all exported functions
  • Community guidelines including contribution guidelines in the README or CONTRIBUTING, and DESCRIPTION with URL, BugReports and Maintainer (which may be autogenerated via Authors@R).

Functionality

  • Installation: Installation succeeds as documented.
  • Functionality: Any functional claims of the software have been confirmed.
  • Performance: Any performance claims of the software have been confirmed.
  • Automated tests: Unit tests cover essential functions of the package
    and a reasonable range of inputs and conditions. All tests pass on the local machine.
  • Packaging guidelines: The package conforms to the rOpenSci packaging guidelines

Estimated hours spent reviewing: 3

  • Should the author(s) deem it appropriate, I agree to be acknowledged as a package reviewer ("rev" role) in the package DESCRIPTION file.

Review Comments

{awardFindR} is a well-organised, well-tested package providing a common interface for accessing scattered data on grants and awards across a series of APIs and websites.

I could not find any issues other than those pointed out in the previous reviews. I will nevertheless add a few minor points, mostly about design and styling, whose implementation is not required and is left to the consideration of the authors.

  • awardFindR is both the name of the package and the name of the main function. While this is not an issue, I think it is nice when ?awardFindR returns the help page for the package as a whole, in which case I would consider renaming the main function;

  • given the package structure, it could make sense to use Roxygen's inheritance functionality @inheritParams to document parameters only once in a common/generic function (in main.R, for instance);

  • I have the impression that it is common to name functions with the verb prefixing the object, since that follows the logic of S3 method dispatch (e.g. print(), print.lm()); so I was wondering whether it would not be more natural to reverse the function names (e.g. neh_get() would become get_neh());

  • I am not used to seeing the installation instructions towards the end of the README.md; I usually expect to find them at the top of the file, right after a very brief description of the package.

  • I could not find a use for read_csv() in utils.R (nor, accordingly, the need to import readr in the DESCRIPTION);
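As a minimal illustration of the @inheritParams suggestion above (the function names here are hypothetical, not the package's): parameters are documented once on one function, and other functions with the same arguments inherit that text.

```r
## Illustrative sketch of roxygen2's @inheritParams.
## get_source_a() and get_source_b() are hypothetical stand-ins.

#' Query awards from one source
#' @param keyword Search term, as a single string.
#' @param from_date Beginning of the search window ("YYYY-MM-DD").
#' @param to_date End of the search window ("YYYY-MM-DD").
get_source_a <- function(keyword, from_date, to_date) NULL

#' Query awards from another source
#' @inheritParams get_source_a
get_source_b <- function(keyword, from_date, to_date) NULL
```

When roxygen2 builds the .Rd files, get_source_b() picks up the keyword, from_date, and to_date descriptions from get_source_a(), so the wording stays consistent across all the *_get() functions and only needs updating in one place.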

@annakrystalli
Contributor

Thank you @karawoo & @zambujo for your thoughtful and timely reviews!

Over to you @mccallc 😊. Any questions, just let me know.

@mccallc
Author

mccallc commented Jun 14, 2021

Hello!

Thank you very much @karawoo and @zambujo for your reviews and extensive comments. I really appreciate them. I've implemented changes that should address each individual comment, including:

Based on review from @karawoo:

  • Adding a verbose option to all functions to print HTTP messages, defaulting to FALSE (don't print the messages).
  • Fixing documentation for functions like gates_get() through the use of roxygen2's @inheritParams
  • Implementing match.arg() for argument validation
  • Including output messages when there are no results from a source
  • Fixing all bugs that caused problems in the examples
  • Removing the global setting of stringsAsFactors entirely, which doesn't seem to cause any problems in oldrel versions of R.

Based on review from @zambujo:

  • Using roxygen2's @inheritParams to reduce repetition in documentation and harmonize it between functions
  • Moving installation instructions to the top of the README
  • Getting rid of the read_csv() usage (the readr import itself stays, since removing it causes problems with httr::content(), which needs it)
  • Changing function names to verb_object format: awardFindR() is now search_awards(), nsf_get() is now get_nsf(), etc.

Again, thank you all so much. These comments really helped push this package in the right direction.

Let me know what's next in the process, @annakrystalli !

@annakrystalli
Contributor

Thanks for the response @mccallc !

@karawoo & @zambujo, please let us know if you feel the responses to your comments address the issues you raised or whether you have further comments.

@karawoo

karawoo commented Jun 15, 2021

All of my comments have been addressed and I don't have any more to add. Nice work!

@zambujo

zambujo commented Jun 16, 2021

All comments have been addressed from my side as well. No further comments other than: well done! And apologies for the delayed reply.

@annakrystalli
Contributor

Thank you again @karawoo and @zambujo for your reviews and @mccallc for submitting and the subsequent work on the package! 🎉

@annakrystalli
Contributor

@ropensci-review-bot approve

@ropensci-review-bot
Collaborator

Approved! Thanks @mccallc for submitting and @karawoo, @zambujo for your reviews! 😁

To-dos:

  • Transfer the repo to rOpenSci's "ropensci" GitHub organization under "Settings" in your repo. I have invited you to a team that should allow you to do so. You'll be made admin once you do.
  • Fix all links to the GitHub repo to point to the repo under the ropensci organization.
  • Delete your current code of conduct file if you had one since rOpenSci's default one will apply, see https://devguide.ropensci.org/collaboration.html#coc-file
  • If you already had a pkgdown website and are ok relying only on rOpenSci central docs building and branding,
    • deactivate the automatic deployment you might have set up
    • remove styling tweaks from your pkgdown config but keep that config file
    • replace the whole current pkgdown website with a redirecting page
    • replace your package docs URL with https://docs.ropensci.org/package_name
    • In addition, in your DESCRIPTION file, include the docs link in the URL field alongside the link to the GitHub repository, e.g.: URL: https://docs.ropensci.org/foobar (website) https://github.com/ropensci/foobar
  • Fix any links in badges for CI and coverage to point to the ropensci URL. We no longer transfer Appveyor projects to ropensci Appveyor account so after transfer of your repo to rOpenSci's "ropensci" GitHub organization the badge should be [![AppVeyor Build Status](https://ci.appveyor.com/api/projects/status/github/ropensci/pkgname?branch=master&svg=true)](https://ci.appveyor.com/project/individualaccount/pkgname). If Appveyor does not pick up new commits after transfer, you might need to delete and re-create the Appveyor project. (Repo transfers are smoother with GitHub Actions)
  • Please check you updated the package version post-review version updated and that you documented all changes in NEWS.md
  • We're starting to roll out software metadata files to all ropensci packages via the Codemeta initiative, see https://github.com/ropensci/codemetar/#codemetar for how to include it in your package, after installing the package - should be easy as running codemetar::write_codemeta() in the root of your package.

Should you want to acknowledge your reviewers in your package DESCRIPTION, you can do so by making them "rev"-type contributors in the Authors@R field (with their consent). More info on this here.

Welcome aboard! We'd love to host a post about your package - either a short introduction to it with an example for a technical audience or a longer post with some narrative about its development or something you learned, and an example of its use for a broader readership. If you are interested, consult the blog guide, and tag @stefaniebutland in your reply. She will get in touch about timing and can answer any questions.

We've put together an online book with our best practices and tips; this chapter starts the third section, which covers guidance for after onboarding. Please tell us what could be improved; the corresponding repo is here.

Last but not least, you can volunteer as a reviewer via filling a short form.
