---
title: "Polygon preparation"
author: "Robert Schlegel"
date: "2019-05-23"
output: workflowr::wflow_html
editor_options:
  chunk_output_type: console
---
```{r global_options, include = FALSE}
knitr::opts_chunk$set(fig.width = 8, fig.align = 'center',
                      echo = TRUE, warning = FALSE, message = FALSE,
                      eval = TRUE, tidy = FALSE)
```
## Introduction
This markdown file contains all of the code used to prepare the polygons that define the different regions in the Northwest Atlantic. The SST pixels within each region will then be spatially averaged to create a single time series per region. This is done so that the MHW detection algorithm may be run on these individual time series, giving a general representation of the SST in each region, rather than running the algorithm on each pixel individually, which would introduce a host of problems.
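As a quick sketch of what that averaging step will look like (the real implementation lives in the SST preparation vignette; the `sst_pixels` dataframe and its column names here are hypothetical):
```{r sst-average-sketch, eval=FALSE}
# A minimal sketch of the per-region spatial averaging; 'sst_pixels' is a
# hypothetical long-format dataframe with columns: region, t (date), temp
library(dplyr)

sst_regional <- sst_pixels %>%
  group_by(region, t) %>%                    # one group per region per day
  summarise(temp = mean(temp, na.rm = TRUE)) # spatial mean over all pixels
# The MHW detection algorithm would then be run once per regional time series
```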
```{r libraries}
# Packages used in this vignette
library(tidyverse) # Base suite of functions
library(R.matlab) # For dealing with MATLAB files
library(marmap) # For bathymetry
library(maptools) # Contour tools
library(rgeos) # For intersections
```
## Coastal region polygons
The first step in this analysis is to broadly define the coastal regions based on previous research into thermally relevant boundaries. We have chosen to use the paper by Richaud et al. (2016) to do this (https://www.sciencedirect.com/science/article/pii/S0278434316303181#f0010). Being the kind-hearted man that he is, Benjamin forwarded us the polygons (Richaud et al. 2016; Figure 2) from this work as a MATLAB file, so we must first open that up and convert it to an R format for further use.
```{r mat-R, eval=FALSE}
# Load the file
NWA_polygons <- R.matlab::readMat("data/boundaries.mat")

# Remove index list items and attributes
NWA_polygons[grepl("[.]", names(NWA_polygons))] <- NULL
# attributes(NWA_polygons) <- NULL

# Function for neatly converting list items into a dataframe
# vec <- NWA_polygons[1]
mat_col <- function(vec){
  df <- as.data.frame(vec)
  df$region <- substr(colnames(df)[1], 2, nchar(colnames(df)[1]))
  colnames(df)[1] <- strtrim(colnames(df)[1], 1)
  df <- df[c(2,1)]
  return(df)
}

# Create multiple smaller data.frames
coords_1 <- cbind(mat_col(NWA_polygons[1]), mat_col(NWA_polygons[2])[2])
coords_2 <- cbind(mat_col(NWA_polygons[3]), mat_col(NWA_polygons[4])[2])
coords_3 <- cbind(mat_col(NWA_polygons[5]), mat_col(NWA_polygons[6])[2])
coords_4 <- cbind(mat_col(NWA_polygons[7]), mat_col(NWA_polygons[8])[2])
coords_5 <- cbind(mat_col(NWA_polygons[9]), mat_col(NWA_polygons[10])[2])
coords_6 <- cbind(mat_col(NWA_polygons[11]), mat_col(NWA_polygons[12])[2])

# Combine them into one full dataframe and save
NWA_coords <- rbind(coords_1, coords_2, coords_3, coords_4, coords_5, coords_6)
colnames(NWA_coords) <- c("region", "lon", "lat")
saveRDS(NWA_coords, "data/NWA_coords.Rda")
```
With our polygons now switched over from MATLAB to R, we want to visualise them to ensure that everything has gone smoothly.
```{r poly-vis}
# Load polygon coordinates
NWA_coords <- readRDS("data/NWA_coords.Rda")

# The base map
map_base <- ggplot2::fortify(maps::map(fill = TRUE, col = "grey80", plot = FALSE)) %>%
  dplyr::rename(lon = long) %>%
  mutate(group = ifelse(lon > 180, group+9999, group),
         lon = ifelse(lon > 180, lon-360, lon)) %>%
  select(-region, -subregion)

# Quick map
NWA_coords_plot <- ggplot(data = NWA_coords, aes(x = lon, y = lat)) +
  geom_polygon(data = map_base, aes(group = group), show.legend = F) +
  geom_polygon(aes(colour = region, fill = region), size = 1.5, alpha = 0.2) +
  coord_cartesian(xlim = c(min(NWA_coords$lon)-1, max(NWA_coords$lon)+1),
                  ylim = c(min(NWA_coords$lat)-1, max(NWA_coords$lat)+1)) +
  labs(x = NULL, y = NULL, colour = "Region", fill = "Region") +
  theme(legend.position = "bottom")
# ggsave(NWA_coords_plot, filename = "output/NWA_coords_plot.pdf", height = 5, width = 6)

# Visualise
NWA_coords_plot
```
The region abbreviations are: "gm" for the Gulf of Maine, "gsl" for the Gulf of St. Lawrence, "ls" for the Labrador Shelf, "mab" for the Mid-Atlantic Bight, "nfs" for the Newfoundland Shelf, and "ss" for the Scotian Shelf.
Before we move on, we'll do a bit of housekeeping to establish a consistent study area for this project based on our polygons. We'll simply extend the study area by 1 degree of longitude and latitude beyond the furthest edges of the polygons, as seen in the figure above.
```{r study-area-coords, eval=FALSE}
# Set the min/max lon/lat values
lon_min <- round(min(NWA_coords$lon)-1, 2)
lon_max <- round(max(NWA_coords$lon)+1, 2)
lat_min <- round(min(NWA_coords$lat)-1, 2)
lat_max <- round(max(NWA_coords$lat)+1, 2)

# Combine and save
NWA_corners <- c(lon_min, lon_max, lat_min, lat_max)
saveRDS(NWA_corners, file = "data/NWA_corners.Rda")
```
### Cabot Strait
It was decided that because we are interested in the geography of the regions, and not just their temperature regimes, the Cabot Strait needed to be defined apart from the Gulf of St. Lawrence region. To do this we will simply snip the "gsl" polygon into two pieces at its narrowest point.
```{r cabot-strait-1}
# Extract the gsl region only
gsl_sub <- NWA_coords[NWA_coords$region == "gsl",]

# Add a simple integer column for ease of plotting
gsl_sub$row_count <- 1:nrow(gsl_sub)
ggplot(data = gsl_sub, aes(x = lon, y = lat)) +
  geom_polygon(aes(fill = region)) +
  geom_label(aes(label = row_count))
```
It appears from the crude figure above that we should pinch the polygon off into two separate shapes at rows 6 and 10.
```{r cabot-strait-2}
# Create the smaller gsl polygon
gsl_new <- NWA_coords[NWA_coords$region == "gsl",] %>%
  slice(-c(7:9))

# Create the new cbs (Cabot Strait) polygon
cbs <- NWA_coords[NWA_coords$region == "gsl",] %>%
  slice(6:10) %>%
  mutate(region = "cbs")

# Attach the new polygons to the original polygons
NWA_coords_cabot <- NWA_coords %>%
  filter(region != "gsl") %>%
  rbind(., gsl_new, cbs)
saveRDS(NWA_coords_cabot, "data/NWA_coords_cabot.Rda")

# Plot the new areas to ensure everything worked
NWA_coords_cabot_plot <- ggplot(data = NWA_coords_cabot, aes(x = lon, y = lat)) +
  geom_polygon(data = map_base, aes(group = group), show.legend = F) +
  geom_polygon(aes(colour = region, fill = region), size = 1.5, alpha = 0.2) +
  coord_cartesian(xlim = c(min(NWA_coords$lon)-1, max(NWA_coords$lon)+1),
                  ylim = c(min(NWA_coords$lat)-1, max(NWA_coords$lat)+1)) +
  labs(x = NULL, y = NULL, colour = "Region", fill = "Region") +
  theme(legend.position = "bottom")
# ggsave(NWA_coords_cabot_plot, filename = "output/NWA_coords_cabot_plot.pdf", height = 5, width = 6)

# Visualise
NWA_coords_cabot_plot
```
Everything is looking good, but we may want to divide the Gulf of Maine (gm) into two polygons as well. This would place the Bay of Fundy in its own region. For now, however, we will move on to the next step, which is dividing the current polygons by bathymetry.
## Bathymetry polygons
These regions are an excellent start, but because the aim of this research project is to determine the primary drivers of MHWs in the Northwest Atlantic, we want to subset the regions not only by their thermal characteristics, but also by their geophysical characteristics, which in this instance are most immediately available as bathymetry. The rationale for this is that surface forcing as a driver of anomalously warm seawater is likely to be more prevalent in shallow waters than in shelf waters or deeper. To this end we will divide the regions into three sub-regions each: 0 -- 50 m depth, 51 -- 200 m depth, and 201+ m depth.
<!-- Unless otherwise specified we will not be looking at waters further than the continental shelf. This is because the usefulness of this work will largely be constrained by its applications to coastal ecosystems and fisheries. Therefore we are less interested in understanding the drivers of synoptic scale and greater open ocean events. -->
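As a rough sketch of that depth banding (using the `NWA_bathy_lowres` dataframe created in the next section; the band labels here are just placeholders, not a final naming convention):
```{r depth-band-sketch, eval=FALSE}
# A minimal sketch of assigning each bathymetry pixel to a depth band;
# the band labels are hypothetical placeholders
NWA_bathy_banded <- NWA_bathy_lowres %>%
  mutate(depth_band = cut(-depth, breaks = c(0, 50, 200, Inf),
                          labels = c("0-50 m", "51-200 m", "201+ m"),
                          include.lowest = TRUE))
```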
### Downloading data
Before we can divide up our polygons by bathymetry, we must first download said bathymetry data. We will use the NOAA data here, as it is (to my knowledge) the most convenient bathymetry data available directly through R.
```{r bathy-download, eval=FALSE}
# Download NOAA bathy data at the highest available resolution
NWA_bathy_hires <- as.xyz(getNOAA.bathy(lon1 = NWA_corners[1], lon2 = NWA_corners[2],
                                        lat1 = NWA_corners[3], lat2 = NWA_corners[4],
                                        resolution = 1))
colnames(NWA_bathy_hires) <- c("lon", "lat", "depth")
NWA_bathy_hires <- NWA_bathy_hires %>%
  filter(depth <= 0) %>%
  mutate(lon = round(lon, 4),
         lat = round(lat, 4))
saveRDS(NWA_bathy_hires, file = "data/NWA_bathy_hires.Rda")

# Download NOAA bathy data at a coarser resolution
NWA_bathy_lowres <- as.xyz(getNOAA.bathy(lon1 = NWA_corners[1], lon2 = NWA_corners[2],
                                         lat1 = NWA_corners[3], lat2 = NWA_corners[4],
                                         resolution = 6))
colnames(NWA_bathy_lowres) <- c("lon", "lat", "depth")
NWA_bathy_lowres <- NWA_bathy_lowres %>%
  filter(depth <= 0) %>%
  mutate(lon = round(lon, 4),
         lat = round(lat, 4))
saveRDS(NWA_bathy_lowres, file = "data/NWA_bathy_lowres.Rda")
```
With our high-res bathymetry data downloaded, we will now go about finding the 50 and 200 metre isobaths. These lines will be used to define the edges of our sub-regions.
```{r isobath-hires-plot}
# Load hires bathymetry data and NWA corners
NWA_bathy_hires <- readRDS("data/NWA_bathy_hires.Rda")
NWA_corners <- readRDS("data/NWA_corners.Rda")

# Plot to see what we've got
bathy_hires_plot <- ggplot(NWA_coords_cabot, aes(x = lon, y = lat)) +
  geom_polygon(data = map_base, aes(group = group), show.legend = F) +
  geom_polygon(aes(group = region, fill = region, colour = region), alpha = 0.2) +
  geom_contour(data = NWA_bathy_hires, aes(z = depth),
               breaks = c(-50), size = c(0.3), colour = "grey70") +
  geom_contour(data = NWA_bathy_hires, aes(z = depth),
               breaks = c(-200), size = c(0.3), colour = "grey30") +
  coord_cartesian(xlim = NWA_corners[1:2],
                  ylim = NWA_corners[3:4]) +
  labs(x = NULL, y = NULL)

# Visualise
bathy_hires_plot
```
The resolution of the figure above is fantastic, but perhaps a bit more than we need, as the goal is to create functional polygons for dividing our regions. The level of detail present in the figure above will likely lead to issues later on. Let's look at the lowres bathymetry data and see if that works better.
```{r isobath-lowres-plot}
# Load lowres bathymetry data
NWA_bathy_lowres <- readRDS("data/NWA_bathy_lowres.Rda")

# Plot to see what we've got
bathy_lowres_plot <- ggplot(NWA_coords_cabot, aes(x = lon, y = lat)) +
  geom_polygon(data = map_base, aes(group = group), show.legend = F) +
  geom_polygon(aes(group = region, fill = region, colour = region), alpha = 0.2) +
  geom_contour(data = NWA_bathy_lowres, aes(z = depth),
               breaks = c(-50), size = c(0.3), colour = "grey70") +
  geom_contour(data = NWA_bathy_lowres, aes(z = depth),
               breaks = c(-200), size = c(0.3), colour = "grey30") +
  coord_cartesian(xlim = NWA_corners[1:2],
                  ylim = NWA_corners[3:4]) +
  labs(x = NULL, y = NULL)

# Visualise
bathy_lowres_plot
```
The above plot shows the NOAA bathymetry data downloaded at a resolution of 6. I tried these data at all resolutions from 2 -- 8, and visually this looks to be the best balance, removing unnecessary convolutions without losing the broader features.
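For reference, a sketch of how one might loop over the candidate resolutions to make that visual comparison (illustrative only, not the exact code used):
```{r res-compare-sketch, eval=FALSE}
# A sketch of comparing candidate resolutions visually; illustrative only
for (res in c(2, 4, 6, 8)) {
  bathy_res <- as.xyz(getNOAA.bathy(lon1 = NWA_corners[1], lon2 = NWA_corners[2],
                                    lat1 = NWA_corners[3], lat2 = NWA_corners[4],
                                    resolution = res))
  colnames(bathy_res) <- c("lon", "lat", "depth")
  # Plot the two isobaths at each resolution to judge the detail trade-off
  print(ggplot(filter(bathy_res, depth <= 0), aes(x = lon, y = lat)) +
          geom_contour(aes(z = depth), breaks = c(-50, -200), size = 0.3) +
          labs(title = paste0("Resolution: ", res)))
}
```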
### Isobath contour based polygons
With a decent resolution picked out for our bathymetry data, we now need to create polygons from the desired isobaths. These will then be used as custom bounding boxes with which we will filter and combine the SST pixels for the next vignette.
```{r isobath-1}
# Spread bathy dataframe to a matrix to play nice with shapefiles
NWA_bathy_matrix <- NWA_bathy_lowres %>%
  mutate(depth = round(depth, -1)) %>%
  reshape2::acast(lon~lat, value.var = "depth")

# Generate contours
cont <- contourLines(x = as.numeric(row.names(NWA_bathy_matrix)),
                     y = as.numeric(colnames(NWA_bathy_matrix)),
                     z = NWA_bathy_matrix, levels = c(-50, -200))

# Create a SpatialLines dataframe from the contours
cont_lines <- ContourLines2SLDF(cont, proj4string = CRS("+proj=longlat +datum=WGS84"))

# Have a look
plot(cont_lines)
```
With the isobath contours now converted to a `spatial` class object, we will need to do the same for our study area polygons so they can play nice with one another.
```{r isobath-2}
# First convert the dataframe of different areas to a list
poly_list <- split(NWA_coords_cabot, NWA_coords_cabot$region)

# We then only want lon/lat in the list, not the region names
poly_list <- lapply(poly_list, function(x) { x["region"] <- NULL; x })

# Convert coords to polygon type
poly_poly <- sapply(poly_list, Polygon)

# Add ID variables back in
poly_poly <- lapply(seq_along(poly_poly), function(i) Polygons(list(poly_poly[[i]]),
                                                               ID = names(poly_list)[i]))

# Create SpatialPolygons object
spatial_poly <- SpatialPolygons(poly_poly, proj4string = CRS("+proj=longlat +datum=WGS84"))

# Clip the contour lines to the study area polygons
spatial_poly_lines <- gIntersection(cont_lines, spatial_poly)

# Visualise
plot(spatial_poly)
lines(spatial_poly_lines)
```
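Before moving on, here is a sketch of how these `SpatialPolygons` might be used to assign SST pixels to regions (the `sst_pixels` dataframe is again hypothetical; the real filtering happens in the next vignette):
```{r pixel-assign-sketch, eval=FALSE}
# A sketch of assigning hypothetical SST pixels to regions with sp::over()
library(sp)

pixel_points <- SpatialPoints(sst_pixels[c("lon", "lat")],
                              proj4string = CRS("+proj=longlat +datum=WGS84"))

# over() returns, for each point, the index of the polygon it falls in (NA if none)
poly_index <- over(pixel_points, spatial_poly)
sst_pixels$region <- sapply(spatial_poly@polygons, slot, "ID")[poly_index]
```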
After going through all of this shapefile nonsense, it dawned on me that I could much more easily have just downloaded the bathymetry data and then done a nearest-neighbour search within each region for the SST pixels, clumping them based on which of the depth categories they fall within. A sketch of that alternative follows below.
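Something like the following could do it (the `FNN` package and the `sst_pixels` dataframe are assumptions for the sake of illustration):
```{r knn-sketch, eval=FALSE}
# A sketch of the nearest-neighbour alternative: match each hypothetical SST
# pixel to its nearest bathymetry point, then bin it by depth category
library(FNN)

nn <- get.knnx(data = as.matrix(NWA_bathy_lowres[c("lon", "lat")]),
               query = as.matrix(sst_pixels[c("lon", "lat")]), k = 1)
sst_pixels$depth <- NWA_bathy_lowres$depth[nn$nn.index[, 1]]
sst_pixels$depth_band <- cut(-sst_pixels$depth, breaks = c(0, 50, 200, Inf),
                             labels = c("0-50 m", "51-200 m", "201+ m"),
                             include.lowest = TRUE)
```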
We will now go about creating SST time series for each of the depth sub-regions. This work is continued in the [SST preparation](https://robwschlegel.github.io/MHWNWA/sst-prep.html) vignette.