
gstream

Serving Gulf Stream datasets directly from R.

Requirements

R v4.1+

rlang

dplyr

sf

Installation

Use the remotes package to install directly from GitHub.

remotes::install_github("BigelowLab/gstream")

Data from US Navy

Archived data

NOAA’s Ocean Prediction Center provides an FTP server (https://ftp.opc.ncep.noaa.gov/grids/experimental/GStream) for downloads by year. We have downloaded these and repackaged them into spatial format files, which are included with the gstream package. NOAA also provides daily updates.

suppressPackageStartupMessages({
  library(sf)
  library(dplyr)
  library(gstream)
  library(rnaturalearth)
})

x = read_usn() |>
  dplyr::glimpse()
## Rows: 3,909
## Columns: 3
## $ date     <date> 2010-01-22, 2010-01-22, 2010-01-25, 2010-01-25, 2010-01-27, …
## $ wall     <chr> "north", "south", "north", "south", "north", "south", "north"…
## $ geometry <MULTIPOINT [°]> MULTIPOINT ((-80.2 25), (-8..., MULTIPOINT ((-77.5…

This reads in all of the data stored with the package. We can then do a simple plot of all of the locations.

bb = sf::st_bbox(x)
coast = rnaturalearth::ne_coastline(scale = "medium", returnclass = "sf")

plot(x['wall'], pch = 3, axes = TRUE, reset = FALSE)
plot(sf::st_geometry(coast), add = TRUE)

Downloading daily updates and configuration

Note that you don’t need to create the configuration file if you are not downloading data.

The daily data is hosted by NOAA’s Ocean Prediction Center. In particular, they post the US Navy’s daily Gulf Stream point data for the north and south walls, which can be downloaded. We provide a mechanism for storing the URL of the daily data and the path where you want to store the downloads, along with a simple script for downloading. The configuration can be stored anywhere, but by default we look for it in ~/.gstream.

cfg = read_configuration()
cfg
## $usn
## $usn$rawpath
## [1] "/Users/ben/Dropbox/data/gstream/usn/raw"
## 
## $usn$dailyuri
## [1] "https://ocean.weather.gov/gulf_stream_latest.txt"
## 
## $usn$ftpuri
## [1] "https://ftp.opc.ncep.noaa.gov/grids/experimental/GStream"

Obviously, you will want to modify the rawpath to suit your own needs. We then set up a cron job to run the daily download at 6 PM local time.

# gstream data
0 18 * * * /usr/local/bin/Rscript /Users/ben/Library/CloudStorage/Dropbox/code/projects/gsi/inst/scripts/usn_daily_download.R >> /dev/null 2>&1
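
The download script itself can be quite small. A minimal hand-rolled equivalent is sketched below using only the configuration accessor shown above plus base R; the packaged script inst/scripts/usn_daily_download.R is the authoritative version.

suppressPackageStartupMessages(library(gstream))
cfg = read_configuration()
# name the raw file by download date and fetch the latest wall points
dest = file.path(cfg$usn$rawpath, format(Sys.Date(), "gulf_stream_%Y-%m-%d.txt"))
download.file(cfg$usn$dailyuri, destfile = dest, quiet = TRUE)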

Ordering USN data

The USN data is not ordered; that is, the points for a given day do not follow the order of a polyline.

d = dplyr::filter(x, date == as.Date("2020-12-19"), wall == "north")
plot(sf::st_geometry(d), type = "l", axes = TRUE)

With thanks to Dewey Dunnington, we can reorder them into a single LINESTRING.

d = dplyr::filter(x, date == as.Date("2020-01-03"), wall == "north")
do = order_usn(d)
plot(sf::st_geometry(d), type = "l", axes = TRUE, reset = FALSE)
plot(sf::st_geometry(do), type = "l", add = TRUE, col = "orange")
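
One reason the ordering matters: once the points form a proper LINESTRING, line-based measures become meaningful. For example (a hypothetical usage, assuming the result of order_usn() is an sf or sfc object):

# geodesic length of the ordered north wall, in meters; the same measure
# on the unordered MULTIPOINT would not be meaningful
sf::st_length(do)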
