
rrricanes #118

Closed
timtrice opened this Issue Jun 1, 2017 · 84 comments


timtrice commented Jun 1, 2017

Summary

  • What does this package do? (explain in 50 words or less):
    Scrapes and parses tropical cyclone advisories and GIS data from the National Hurricane Center. Can access real-time or archived data for storms since 1998.

  • Paste the full DESCRIPTION file inside a code block below:

Package: rrricanes
Type: Package
Title: Web scraper for real-time and archived advisory products for Atlantic and 
    east Pacific hurricanes and tropical storms
Version: 0.1.0.1
Authors@R: person("Tim", "Trice", email = "tim.trice@gmail.com",
                  role = c("aut", "cre"))
Depends:
  R (>= 3.3.3)
Description: Get archived data of past and current hurricanes and tropical 
  storms for the Atlantic and eastern Pacific oceans. Data is available for 
  storms since 1998.
URL: http://www.timtrice.net
BugReports: https://github.com/timtrice/rrricanes/issues
License: MIT + file LICENSE
LazyData: TRUE
Imports:
  data.table (>= 1.9.6), 
  dplyr (>= 0.5.0), 
  httr (>= 1.2.1), 
  lubridate (>= 1.6.0), 
  magrittr (>= 1.5), 
  purrr (>= 0.2.2.2), 
  readr (>= 1.1.0), 
  rvest (>= 0.3.2), 
  stringr (>= 1.2.0), 
  tibble (>= 1.3.1), 
  tidyr (>= 0.6.2), 
  xml2 (>= 1.1.1)
Suggests:
  devtools, 
  knitr, 
  rmarkdown,
  testthat
RoxygenNote: 6.0.1
VignetteBuilder: knitr
  • URL for the package (the development repository, not a stylized html page):
    https://github.com/timtrice/rrricanes

  • Please indicate which category or categories from our package fit policies this package falls under and why? (e.g., data retrieval, reproducibility. If you are unsure, we suggest you make a pre-submission inquiry.):
    data extraction, data visualization, data munging, data packages

  • Who is the target audience?
    Those with an interest in studying tropical meteorology, tracking storm developments and comparing the forecast abilities of the National Hurricane Center storm-to-storm, year-to-year.

  • Are there other R packages that accomplish the same thing? If so, what is different about yours?
    I have searched (I believe) extensively and have not found one. To my knowledge the only other hurricane package on CRAN or GitHub is HURDAT - and I wrote that one, too :)

Requirements

R 3.3.3
Most tidyverse packages.
rnaturalearthdata and rnaturalearthhires will be required for release 0.1.1.

Confirm each of the following by checking the box. This package:

  • has a CRAN and OSI accepted license.
  • contains a README with instructions for installing the development version.
  • does not violate the Terms of Service of any service it interacts with.
  • includes documentation with examples for all functions.
  • contains a vignette with examples of its essential functions and uses.
  • has a test suite.
  • has continuous integration, including reporting of test coverage, using services such as Travis CI, Coveralls and/or CodeCov.
  • I agree to abide by ROpenSci's Code of Conduct during the review process and in maintaining my package should it be accepted.

Publication options

  • Do you intend for this package to go on CRAN?
  • Do you wish to automatically submit to the Journal of Open Source Software? If so:
    • The package contains a paper.md with a high-level description in the package root or in inst/.
    • The package is deposited in a long-term repository with the DOI:
    • (Do not submit your package separately to JOSS)

Detail

  • Does R CMD check (or devtools::check()) succeed? Paste and describe any errors or warnings:
    In release 0.1.0.1, only 2 notes about global variables. This has largely been resolved in unreleased 0.1.1.

  • Does the package conform to rOpenSci packaging guidelines? Please describe any exceptions:
    I used a CONTRIBUTORS.md file instead of CODE_OF_CONDUCT.md.
    I use @keywords internal for internal functions.

  • If this is a resubmission following rejection, please explain the change in circumstances:

  • If possible, please provide recommendations of reviewers - those with experience with similar packages and/or likely users of your package - and their GitHub user names:

maelle commented Jun 8, 2017

Thanks for your submission @timtrice ! 😸

I have two questions:

  • I see you have Appveyor CI set up, but in the logs I see R CMD check runs with the options --no-tests --no-manual --no-build-vignettes. Could you try a build on Appveyor that doesn't skip the tests and that builds the vignettes? I get errors when doing it on my Windows PC (R 3.3.1, though; I'll try again on another PC later, but maybe the R version is not the issue, hence the Appveyor suggestion 😉)

Tests results

> devtools::test()
Loading rrricanes
Loading required package: testthat

Attaching package: 'testthat'

The following object is masked from 'package:dplyr':

    matches

Testing rrricanes
Test base functions.: .........................
Storm Discussions (discus): ...........................12
|====================================================================================|100% ~0 s remaining     ....
Forecast/Advisory Products (fstadv): 3
Test getting storm data.: .
|====================================================================================|100% ~0 s remaining     ..
Position Estimates (posest): .................
|====================================================================================|100% ~0 s remaining     4
|====================================================================================|100% ~0 s remaining     .....5
Scrapers: .................
Update (update): .................
|====================================================================================|100% ~0 s remaining     .

Failed -------------------------------------------------------------------------------------------------------
1. Error: Test discus() (@test_discus.R#54) ------------------------------------------------------------------
character string is not in a standard unambiguous format
1: discus(link = url) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/tests/testthat/test_discus.R:54
2: scrape_header(contents, ret = "date") at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/discus.R:71
3: scrape_date(header) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/scrapers.R:234
4: as.POSIXct(x, tz = "Etc/GMT+4") at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/scrapers.R:162
5: as.POSIXct.default(x, tz = "Etc/GMT+4")
6: as.POSIXct(as.POSIXlt(x, tz, ...), tz, ...)
7: as.POSIXlt(x, tz, ...)
8: as.POSIXlt.character(x, tz, ...)
9: stop("character string is not in a standard unambiguous format")

2. Error: Test get_discus() (@test_discus.R#71) --------------------------------------------------------------
character string is not in a standard unambiguous format
1: get_discus(link = url) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/tests/testthat/test_discus.R:71
2: purrr::map(filter_discus(products), discus) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/discus.R:41
3: .f(.x[[i]], ...)
4: scrape_header(contents, ret = "date") at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/discus.R:71
5: scrape_date(header) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/scrapers.R:234
6: as.POSIXct(x, tz = "Etc/GMT+4") at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/scrapers.R:162
7: as.POSIXct.default(x, tz = "Etc/GMT+4")
8: as.POSIXct(as.POSIXlt(x, tz, ...), tz, ...)
9: as.POSIXlt(x, tz, ...)
10: as.POSIXlt.character(x, tz, ...)
11: stop("character string is not in a standard unambiguous format")

3. Error: (unknown) (@test_fstadv.R#8) -----------------------------------------------------------------------
character string is not in a standard unambiguous format
1: get_fstadv(link = url.al011998) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/tests/testthat/test_fstadv.R:8
2: purrr::map(filter_fstadv(products), fstadv) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/fstadv.R:157
3: .f(.x[[i]], ...)
4: scrape_header(contents, ret = "date") at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/fstadv.R:210
5: scrape_date(header) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/scrapers.R:234
6: as.POSIXct(x, tz = "UTC") at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/scrapers.R:152
7: as.POSIXct.default(x, tz = "UTC")
8: as.POSIXct(as.POSIXlt(x, tz, ...), tz, ...)
9: as.POSIXlt(x, tz, ...)
10: as.POSIXlt.character(x, tz, ...)
11: stop("character string is not in a standard unambiguous format")

4. Error: (unknown) (@test_prblty.R#9) -----------------------------------------------------------------------
character string is not in a standard unambiguous format
1: get_prblty(al1998[1]) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/tests/testthat/test_prblty.R:9
2: purrr::map(filter_prblty(products), prblty) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/prblty.R:25
3: .f(.x[[i]], ...)
4: scrape_header(contents, ret = "date") at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/prblty.R:52
5: scrape_date(header) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/scrapers.R:234
6: as.POSIXct(x, tz = "Etc/GMT+4") at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/scrapers.R:156
7: as.POSIXct.default(x, tz = "Etc/GMT+4")
8: as.POSIXct(as.POSIXlt(x, tz, ...), tz, ...)
9: as.POSIXlt(x, tz, ...)
10: as.POSIXlt.character(x, tz, ...)
11: stop("character string is not in a standard unambiguous format")

5. Error: 1998, Tropical Storm Alex, Advisory 26 (@test_public.R#38) -----------------------------------------
character string is not in a standard unambiguous format
1: expect_identical(public(al.1998.alex.products.public[25]) %>% dplyr::select(Status) %>% dplyr::first(), "Tropical Disturbance") at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/tests/testthat/test_public.R:38
2: identical(object, expected)
3: public(al.1998.alex.products.public[25]) %>% dplyr::select(Status) %>% dplyr::first()
4: eval(lhs, parent, parent)
5: eval(expr, envir, enclos)
6: public(al.1998.alex.products.public[25])
7: scrape_header(contents, ret = "date") at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/public.R:70
8: scrape_date(header) at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/scrapers.R:234
9: as.POSIXct(x, tz = "Etc/GMT+4") at C:\Users\msalmon.ISGLOBAL\Documents\rrricanes/R/scrapers.R:156
10: as.POSIXct.default(x, tz = "Etc/GMT+4")
11: as.POSIXct(as.POSIXlt(x, tz, ...), tz, ...)
12: as.POSIXlt(x, tz, ...)
13: as.POSIXlt.character(x, tz, ...)
14: stop("character string is not in a standard unambiguous format")

DONE =========================================================================================================
Don't worry, you'll get it.
  • In the following (uncommented) code, why do you use assignments to the global environment?
    s <- sapply(x, function(z, n = names, l = link, m = msg){
        f <- paste("get", z, sep = "_")
        res <- do.call(f, args = list("link" = l, "msg" = m))
        if (!is.null(n[[z]])) {
            assign(n[z][[1]], res, envir = .GlobalEnv)
        } else {
            assign(z, res, envir = .GlobalEnv)
        }
    })
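For reference, a common alternative is to collect the results into a named list and return it, letting the caller decide where to store them. A minimal, self-contained sketch (the get_* stubs here stand in for the package's real scrapers and are not its actual API):

```r
# Sketch only: collect results into a named list instead of assigning
# into .GlobalEnv. The get_* stubs below stand in for the package's
# real scraper functions.
get_discus <- function(link) paste("discus from", link)
get_fstadv <- function(link) paste("fstadv from", link)

get_products <- function(products, link) {
  res <- lapply(products, function(z) {
    # build the function name ("get_discus", "get_fstadv", ...) and call it
    do.call(paste("get", z, sep = "_"), list(link = link))
  })
  names(res) <- products
  res  # the caller decides where (and whether) to store the results
}

ds <- get_products(c("discus", "fstadv"), "http://example.com")
```

`ds$discus` then retrieves a single product without touching the caller's workspace.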
timtrice commented Jun 8, 2017

@maelle , thank you for taking the time to review my project!

Regarding the code in question, it has been replaced with the following:

    ds <- purrr::map(products, .f = function(x) {
        sprintf("get_%s", x) %>%
            purrr::invoke_map(.x = list(link = link)) %>%
            purrr::flatten_df()})
    names(ds) <- products
    return(ds)

This is in branch develop-0.2.0. I believe I have replaced all of the assign calls; they no longer show up in R CMD check (0 notes).

Regarding the tests: I apologize for their not being enabled. Earlier this week I had to disable them because the NHC archives were down for some annual archive pages. That seems to have been resolved now. Tests have been added to both Travis and Appveyor.

A side note to that last point: I have noticed there are timeouts when accessing the NHC archives. This has been verified on numerous occasions during Travis and Appveyor test runs. I have a feature request pending to add handling for this. Additionally, in 0.2.0, load_storm_data has been added; it accesses some already-processed datasets through the GitHub repo rrricanesdata. So I'm trying to develop workarounds to ensure users have multiple ways to get quick access to the data they need.

maelle commented Jun 8, 2017

@timtrice thanks, nice work! For the review (and my editor checks :-)), will master be updated?

It's great to have already-processed datasets, but I see that repo has no licensing information. I'm not at all experienced in licensing issues, but I guess it would be nice to at least document where the data comes from, etc.

maelle commented Jun 8, 2017

Now that I could check the package (on the right R version) here is something more formalized.

Editor checks:

  • [x] Fit: The package meets criteria for fit and overlap
  • [x] Automated tests: Package has a testing suite and is tested via Travis-CI or another CI service.
  • [x] License: The package has a CRAN or OSI accepted license
  • [x] Repository: The repository link resolves correctly
  • Archive (JOSS only, may be post-review): The repository DOI resolves correctly
  • Version (JOSS only, may be post-review): Does the release version given match the GitHub release (v1.0.0)?

Editor comments

  • Note that lazyeval should soon be replaced by rlang, so it might be worth having a look at https://github.com/tidyverse/rlang

  • 2 long lines identified by goodpractice::gp()

── GP rrricanes ───────────────────────────────────────────────────────────────────────────────────────────────

It is good practice to

  ✖ write unit tests for all functions, and all package code in general.
    93% of code lines are covered by test cases.

    R/base.R:74:NA
    R/base.R:75:NA
    R/base.R:76:NA
    R/base.R:77:NA
    R/base.R:78:NA
    ... and 64 more lines

  ✖ avoid long code lines, it is bad for readability. Also, many people
    prefer editor windows that are about 80 characters wide. Try make your
    lines shorter than 80 characters

    R\base.R:84:1
    R\scrapers.R:83:1
    tests\testthat\test_scrapers.R:46:1

  • Typos identified via devtools::spell_check()
currentlby      get_fstadv.Rd:18
dataframej      build_archive_df.Rd:20

Reviewers: @jsta @robinsones
Due date: 2017-07-10

timtrice commented Jun 8, 2017

@maelle, thank you for the compliment! Very much appreciated.

master will be updated with develop-0.2.0 when I finish the GIS products. They could probably be released now, but I'm trying to think things over and make sure there isn't a better way to handle them (consolidating functions, for example). Plus, one of the products is in raster format, so I'm looking into whether it can even be viewed with ggplot and, if not, what the alternatives are.

In hindsight I should have incorporated the bug fixes into master. It was one of those things where I had already made new changes and started hammering out bugs - before I knew it I had lost track of where I was. This is the first project in which I'm trying to incorporate R package rules and GitHub best practices along with what I know needs to be done. Some of it (as in your example above) is just a sloppy "get it working" attempt before I get back to making the code cleaner.

Regarding rrricanesdata, I can certainly add a license and the script that adds new datasets and updates existing ones as needed. I'll add a README as well today.

maelle commented Jun 8, 2017

@timtrice sorry to have posted so many comments in a row. Should we put the review on hold until you've merged everything with master? This way you'd get a review on the final product. :-)

timtrice commented Jun 8, 2017

@maelle ,

No problem regarding the comments.

Let me push the fixes to 0.1.2; I'll hold back the GIS products until I can determine the best direction. I'll post an update later today or tomorrow morning when it's complete.

The tests do take forever. This is because of the scraping, particularly of the forecast/advisory products. There are minor variations in some of the advisories, so I have to make sure that a regex pattern change here doesn't affect other advisories. That is another issue for which I'm trying to determine better methods.

Edit: to be clear those notes should no longer exist in this new release.

Thank you

maelle commented Jun 8, 2017

Ok so I'll re-do the editor checks once you've updated this thread! :-)

maelle commented Jun 8, 2017

@timtrice I added the results of goodpractice::gp() in the comment with the other editor checks. Please have a look before the next version; goodpractice gives some good suggestions (and some you might prefer to ignore, which is fine).

timtrice commented Jun 9, 2017

I have released 0.1.2 into the master and develop branches.

Local checks and tests pass, and Travis has passed, but I'm having issues on Appveyor: it fails two or three tests in because of timeouts. I hoped to have resolved this by adding functionality to reattempt data collection on timeout, but apparently it isn't enough, at least for testing. I'm not sure what else I can do to resolve those issues. I'll keep re-building the commit if it continues to fail; hopefully it'll catch a good run and finally go through.

(Edit to note all tests in Appveyor finally passed)

Thank you again for taking the time to review my project and I look forward to hearing back from the community about it.

Tim

maelle commented Jun 10, 2017

@timtrice awesome, and could you identify why they previously failed?


@maelle


maelle Jun 10, 2017

Member

@timtrice Best loading message ever! 😁 I updated the editor checks.

I'm now looking for reviewers.


@timtrice


timtrice Jun 10, 2017

@maelle ,

In these instances (which are very common), when the tests are checked or the vignettes are built, the functions scrape the National Hurricane Center archives. Very often a timeout is reached, though I have also gotten other odd errors I haven't researched yet.

In Release 0.1.2, I wrapped the initial extraction function in a check:

get_url_contents <- function(link) {
    # Retry the connection up to rrricanes.http_attempts times (capped at 5),
    # each attempt with a timeout of rrricanes.http_timeout seconds
    max_attempts <- getOption("rrricanes.http_attempts")
    if (max_attempts > 5) max_attempts <- 5
    safe_GET <- purrr::safely(httr::GET)
    for (i in seq_len(max_attempts)) {
        contents <- safe_GET(url = link,
                             httr::timeout(getOption("rrricanes.http_timeout")))
        if (!is.null(contents$result))
            return(xml2::read_html(x = contents$result))
    }
    stop(contents$error$message, call. = TRUE)
}

So, there are two options the user can set: rrricanes.http_attempts, which defaults to 3, and rrricanes.http_timeout, which defaults to 1. Attempts cannot be more than 5 (an arbitrary number). The tests are run with these options set to 5 and 1, respectively. Now I'm thinking I should have added a pause in there, as well...
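For what it's worth, a minimal sketch of what that pause might look like, building on the function above. This is hypothetical: the `backoff_wait` helper and the option defaults written inline here are my own invention, not part of rrricanes.

```r
# Hypothetical helper: grow the pause with each failed attempt
backoff_wait <- function(attempt, base = 2) {
  base ^ attempt  # 2, 4, 8, ... seconds
}

get_url_contents <- function(link) {
  # Cap attempts at 5, as in the package function above
  max_attempts <- min(getOption("rrricanes.http_attempts", 3), 5)
  safe_GET <- purrr::safely(httr::GET)
  for (i in seq_len(max_attempts)) {
    contents <- safe_GET(url = link,
                         httr::timeout(getOption("rrricanes.http_timeout", 1)))
    if (!is.null(contents$result))
      return(xml2::read_html(x = contents$result))
    # Pause before retrying, but not after the final failed attempt
    if (i < max_attempts) Sys.sleep(backoff_wait(i))
  }
  stop(contents$error$message, call. = TRUE)
}
```

`httr::RETRY()` does something similar (attempts plus exponential backoff) and might be worth a look as a drop-in alternative.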

The constant issues with the archives are why I built load_storm_data, which I would now consider the primary option. That function ignores the NHC archives and goes straight to rrricanesdata. This puts more responsibility on me to make sure those archives are up-to-date, and I'm looking at methods to automate this. However, I also emphasize within the project that this function may not return the most up-to-date data and to use the other functions as a fallback.

I'm open to any alternatives or ideas you and others may have. It is very annoying and frustrating, especially with Appveyor, which may get through the first four tests successfully but fail on the last, in which case I have to rebuild the entire thing.

I am also looking at implementing data.world in some way to make it easier. As the plans are to add GIS data, reconnaissance data and others, I know I need to develop a good solid data plan going into future releases.

Thank you,

Tim Trice


@maelle


maelle Jun 10, 2017

Member

@timtrice In this code I make the waiting time a bit longer with each attempt. I should have used purrr::safely as well. 😉

Have you told the owner of the website you're scraping about your package by the way? It'd also be good to know how ok it is for you to have a copy of their data.


@timtrice


timtrice Jun 10, 2017

Thanks @maelle, I'll take a look at it. I'm trying a simple Sys.sleep right now.

Here is the latest example of a fail in Travis: https://travis-ci.org/timtrice/rrricanes/jobs/241505560

Quitting from lines 175-179 (getting_started.Rmd) 
Error: processing vignette 'getting_started.Rmd' failed with diagnostics:
Timeout was reached
Execution halted
The command "R CMD build  ." failed and exited with 1 during .
Your build has been stopped.

Just constant...
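One way to keep a flaky network from killing the vignette build entirely is to gate the network-dependent chunks behind an opt-in switch, so CI only knits prose and pre-computed output. A minimal sketch; the environment variable name here (RRRICANES_BUILD_VIGNETTES) is my own invention, not something the package defines:

```r
# Hypothetical setup chunk at the top of getting_started.Rmd:
# evaluate network-dependent chunks only when explicitly requested,
# so Travis/Appveyor builds don't die on NHC timeouts.
knitr::opts_chunk$set(
  eval = identical(Sys.getenv("RRRICANES_BUILD_VIGNETTES"), "true")
)
```

Locally you'd set `Sys.setenv(RRRICANES_BUILD_VIGNETTES = "true")` before building; on CI the chunks are simply skipped.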

The NHC webmaster is aware, yes. Additionally, the data is considered public domain. The National Weather Service states:

Use of NOAA/NWS Data and Products

The information on National Weather Service (NWS) Web pages are in the public domain, 
unless specifically noted otherwise, and may be used without charge for any lawful purpose 
so long as you do not: 1) claim it is your own (e.g., by claiming copyright for NWS 
information -- see below), 2) use it in a manner that implies an endorsement or affiliation 
with NOAA/NWS, or 3) modify its content and then present it as official government material. 
You also cannot present information of your own in a way that makes it appear to be official 
government information.

I've tried to document thoroughly where the data originates and, to the best of my knowledge, have made no implicit or explicit claims the data is mine or that it is restricted.


@maelle


maelle Jun 10, 2017

Member

Regarding the licensing, I'd suggest we wait to see whether the reviewers (whom I'm currently looking for) have suggestions, and if needed we can ask the community at large what's best practice. :-)


@mpadge mpadge referenced this issue Jun 10, 2017

Closed

httr timout #72

@timtrice


timtrice Jun 11, 2017

Apologies if I keep moving the goal posts, so to speak, but I found errors in some of the tidy functions and had to rush a patch. Release 0.1.3 has been pushed to master.

https://github.com/timtrice/rrricanes/releases/tag/v0.1.3


@maelle


maelle Jun 15, 2017

Member

One reviewer found, seeking another one. @timtrice


@timtrice


timtrice Jun 15, 2017

Can someone advise how I should handle new releases? I expect to be able to push a beta release of 0.2.0 today into master but I want to make sure this will not cause any issues with the reviewing stage.


@jsta


jsta Jul 18, 2017

Contributor

I took a look. Seems good to me. I did see, however, that the man pages need to be rebuilt from the inline roxygen.


@timtrice


timtrice Jul 18, 2017

@jsta, thank you. I corrected the package title and updated the documentation.


@maelle


maelle Jul 23, 2017

Member

Looking at both packages one last time before approving, sorry for the delay.

  • In rrricanesdata README "complimentary dataset" I think you mean "complementary" 😉

  • In that README too, I think the "Getting started" part comes a bit late, I'd suggest moving it before "Datasets".

  • When running R CMD check on rrricanesdata I get checking DESCRIPTION meta-information ... WARNING Non-standard license specification: CC0-1.0 + file LICENSE Standardizable: FALSE, but if I remember correctly this license was the best one that was advised to you, right?

  • It is the first time ever I get this message when running goodpractice::gp() on a package (rrricanesdata again): ♥ Mmhm! Majestic package! Keep up the mind-blowing work! 👏

  • Does rrricanesdata really "require an active internet connection as data is extracted from online sources."?

  • Regarding the "rev" role: although I had successfully submitted a package to CRAN with this role in the past, CRAN maintainers are now having an internal discussion to decide whether they'll accept such roles in the future, so please hold off a bit on the CRAN submission, sorry about that. Oh, and I see you added me; I didn't review the rrricanes package, just did the editor checks, which was less work. 😉

Despite these few remarks the 2 packages are approved! 🎉 Thanks a lot to the reviewers @jsta and @robinsones and to you @timtrice !

To-dos

  • Transfer the repos to the rOpenSci organization under "Settings" in your repo. I have invited you to 2 teams that should allow you to do so.
  • Add the peer-review badge to the README. [![](http://badges.ropensci.org/118_status.svg)](https://github.com/ropensci/onboarding/issues/118) It will only show "peer reviewed" once I've closed this issue.
  • Fix any links in DESCRIPTION, and in badges for CI and coverage to point to the ropensci URL. (we'll turn on the services on our end.)
@timtrice


timtrice Jul 23, 2017

@maelle,

Very glad to hear the news~

I have removed you as a reviewer to rrricanes.

I have also made some modifications to rrricanesdata, including removing "Getting Started", rewriting "Installation", and relocating "Data".

I will wait to push that to master until you provide the README links.

Once I push to master I'll transfer both repos.

Regarding the license, I never really got a consensus on this. The discussion, if I recall correctly, centered more around if it mattered as it was not a CRAN package. But, there wasn't a definitive, "This is/is not the correct license for this type of package".

Regarding rrricanes to CRAN; does it have to be submitted at any point soon? I feel the GIS documentation can be improved and it is next on my list before I push a minor release. I need to document this procedure better (one of my weaknesses) and I'm not sure if the way I'm handling shapefiles is the "proper" way.


@maelle


maelle Jul 23, 2017

Member

@timtrice nice work!

  • I hope to provide the badge link in the next days. There were some issues with the existing badges (I actually experienced them for a package of mine) so there's work going on.

  • No you don't need to submit to CRAN any point soon. You can get some feedback about this documentation/code via rOpenSci forum/Slack, where some people are experienced with shapefiles.


@timtrice


timtrice Jul 23, 2017

One other question for clarity; am I transferring the drat repo as well? Or just rrricanesdata?


@maelle


maelle Jul 23, 2017

Member

Oh good question! @sckott I have no drat experience, could you please answer the question above (and @timtrice can provide more info about the package he distributes via drat)? Thanks 😸


@maelle


maelle Jul 25, 2017

Member

Link to badge added! [![](http://badges.ropensci.org/118_status.svg)](https://github.com/ropensci/onboarding/issues/118)


@timtrice


timtrice Jul 25, 2017

@maelle ,

For the most part I'm done. I need to update the pkgdown docs for rrricanes but I have to do that on another machine; will do so tonight.

But I can't change the URL in the description for each repo. I guess I'm not an admin for my repo anymore? It still points to the old repo address for the pkgdown documentation, which I need to update.


@maelle


maelle Jul 26, 2017

Member

Just made you admin of both repos, sorry about that


@sckott


sckott Jul 26, 2017

Member

Carl and I discussed the drat question. We have our own, as you may know, and once these two pkgs are in the ropensci org they will be included in our drat, so I'd say don't move the drat repo.


@maelle


maelle Jul 26, 2017

Member

Thanks @sckott (and Carl)!


@maelle maelle closed this Jul 27, 2017

@maelle


maelle Jul 27, 2017

Member

@timtrice I forgot to ask whether you'd like to write a blog post about your package for the rOpenSci blog, either a short-form intro to it (https://ropensci.org/tech-notes/) or a long-form post with more narrative about its development (https://ropensci.org/blog/). Let me know if you are interested.


@timtrice


timtrice Jul 27, 2017

@maelle , that'd be fantastic, yes!


@maelle


maelle Aug 3, 2017

Member

@timtrice if you submit the package to CRAN before R 3.5, please build it on R-devel in order not to get any note because of the "rev" role. :-)


@timtrice


timtrice Aug 19, 2017

@maelle ,

I've added the introducing_rrricanes gist for review, per your previous comment. If I'm on the wrong track with the purpose of this, please let me know. Or if you have additional suggestions, of course, I'm all for it.

Thank you for your help!


@maelle


maelle Aug 20, 2017

Member

You mean for the blog post? Also poking @stefaniebutland then. And reading it now, many thanks!


@maelle


maelle Aug 20, 2017

Member

Nice to see this text being written! Some comments:

  • I think it'd be nice to start with why you created this package, to give it a personal touch before the technical stuff? Well, if it's not intended as a technical note, of course.

  • streamling -> streaming, it's own "get" function -> its own "get" function

  • Maybe add even more examples of visualizations in the post? And maybe comment on the existing one, what does it show and mean? Could you include the code used for generating it?

  • In "Optional products are Strike Probabilities (since deprecated and replaced by Wind Speed Probabilities), Position Estimates (since rolled into Tropical Updates)." what does "since" refer to?

  • When saying contributions are welcome, maybe give more details about what sort of contributions you expect? E.g. bug reports, feature requests, or PRs. When I wrote about my own package ropenaq Stefanie encouraged me to open issues labelled "Beginner" and "Intermediate", see here (yep no one volunteered but don't be disheartened by this!)


@stefaniebutland


stefaniebutland Aug 23, 2017

@timtrice Great to hear you're interested in contributing a blog post about rrricanes and thanks for starting out with the introducing_rrricanes gist. Here are some practical details. I'll give some feedback on the draft in a separate comment below.

As Maëlle mentioned, you could do either a short post for the Developer Blog that has a technical focus, or do a post for the main blog whose audience is broader.

Main blog post would include more narrative about your motivation for creating the package, unmet need, how-to use, good to end with a thank you to package reviewers with links to their GitHub or Twitter, point readers to issues and what you think is next to improve the package and invite people to open or address an issue etc.

Deadlines:

  • Developer blog post can come any time.
  • For main blog we post about once a week. Next open slots are late September. We ask for a draft submitted as a pull request a week in advance.

Practical instructions:

Which type of post are you thinking of?
Did I miss anything?


@stefaniebutland


stefaniebutland Aug 23, 2017

Some comments on introducing_rrricanes to add to Maëlle's:

  • "archived" in "...current and archive tropical cyclone data..."
  • If for the main blog, definitely good to add a short narrative intro about your motivation etc as noted in my comment above.
  • maybe create another heading for the section on installation so it doesn't fall under "What is rrricanes". Here's a good main blog post example where sections are well laid out with headings
  • echoing @maelle's comment "Maybe add even more examples of visualizations in the post? And maybe comment on the existing one, what does it show and mean? Could you include the code used for generating it?" This would add a lot of value and help with the narrative so people can see for themselves what they have to gain by trying out the package.

I hope those comments help. Thank you again for doing this. Please tag me here or ping me on slack with any questions or to look at next version.


@timtrice


timtrice Sep 2, 2017

@maelle , @stefaniebutland ,

I've added additional visuals to the next version and moved the document to its own repo:

https://github.com/timtrice/introducing_rrricanes

The markdown file seems to be complete w/ images and what-not.

This may be a bit longer than what was intended but I really wanted to cover the big ideas behind the package and try to explain them under the assumption that anyone using it would be unfamiliar with what the different data meant. I hope that I've accomplished this (even if a bit too thoroughly).

Look forward to any input you both may have to offer.


@stefaniebutland


stefaniebutland Sep 5, 2017

Thank you for the update @timtrice. I'll have another look early this week.
fyi, @maelle has limited internet for now so she might not respond (yet)


@stefaniebutland


stefaniebutland Sep 6, 2017

Hi @timtrice
This looks ready to set a date as an rOpenSci blog post. Please send a pull request according to instructions here: https://github.com/ropensci/roweb/blob/master/.github/CONTRIBUTING.md and we will do final review on the pull request.

  • I propose posting on Tues Sep 19, so please submit pull request by Tues Sep 12.
  • For tags (see CONTRIBUTING.md above for format), I suggest: R, onboarding, community (since this pkg is a community contribution not written by rOpenSci staff), package, software, review, package_name; package-topics (copied from onboarding review topic labels)
  • just say this in intro and people will understand, and be prepared

This may be a bit longer than what was intended but I really wanted to cover the big ideas behind the package and try to explain them under the assumption that anyone using it would be unfamiliar with what the different data meant.

  • Perhaps give a brief outline of what people will find in the post or outline examples you will show so that people are motivated to read through
  • define NHC archives first time it's noted (I think you did in second mention)
  • For gis_advisory(key = key), could you list just a few of the datasets to save space and reader attention, since the full list is no more informative? Same for gis <- gis_latest(); maybe you can identify other outputs that could be shortened.
  • love this: "You do not need to submit code in order to be listed as a contributor. ..."

timtrice commented Sep 11, 2017

Hello @stefaniebutland , @maelle ,

I wanted to give you both an update on this. I finished the post last week and was ready to submit to roweb. However, the NHC website is currently down for GIS products, making it impossible to render the Rmd file.

They are aware of the issue and, I've been told, are trying to correct it. But I have no idea when that will be. Basically, I'm on hold for now.


stefaniebutland commented Sep 11, 2017

Thanks @timtrice for the update! While your post is scheduled to go up Sep 19, I can easily move that date. Just ping me when you have submitted to roweb.


timtrice referenced this issue Sep 14, 2017

Merged

rrricanes #348

timtrice commented Sep 14, 2017

@stefaniebutland @maelle

I was finally able to render the Rmd file and submit the pull request as referenced above.

