hurricane_harvey_prelims is an analysis project that has collected text products from the National Weather Service offices in Brownsville, Corpus Christi, Austin/San Antonio, and Houston, Texas, and Lake Charles and New Orleans, Louisiana. The project extracts the information in these products into tidy datasets.
The project consists of a report detailing the observations collected for rainfall, wind, and tornadic activity, along with an explanation of how the data were obtained, cleaned, and formatted.
The project uses the workflowr package as a means of organization, but workflowr is not required to run the code.
This project relies on text products retrieved at a specific point in time. The sites collected are:
Brownsville, TX (BRO)
Corpus Christi, TX (CRP)
Austin/San Antonio, TX (EWX)
Houston, TX (HGX)
Lake Charles, LA (LCH)
New Orleans, LA (LIX)
The links above may be updated at any time with new information on unrelated storm activity. Therefore, the products were downloaded at the original time of writing; the raw products are saved in the ./data directory by the script ./code/01_download_data.R.
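A minimal sketch of this download step in R, using a placeholder URL pattern (the real product URLs live in ./code/01_download_data.R):

```r
# Sketch of the download step. The URL below is a placeholder, not the
# actual NWS product location used by ./code/01_download_data.R.
dir.create("data", showWarnings = FALSE)

sites <- c("BRO", "CRP", "EWX", "HGX", "LCH", "LIX")
dests <- character(0)
for (site in sites) {
  url  <- paste0("https://example.invalid/products/", site, ".txt")  # placeholder
  dest <- file.path("data", paste0(site, ".txt"))
  dests <- c(dests, dest)
  # download.file(url, dest)  # commented out: the placeholder URL is not live
}
```

Saving a local snapshot this way pins the analysis to the products as they existed at the time of writing, even if the source pages change later.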
The script that performs the data extractions and transformations is ./code/02_parse_text.R.
This script reads the downloaded text files, examines each section of each file, and, using regex patterns, collects the information into its appropriate dataset:
- rain_df - Rainfall observations
- slp_df - Sea Level Pressure observations
- tor_df - Tornado observations
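The extraction step above can be sketched in R against a made-up observation line format; the actual patterns and section handling are in ./code/02_parse_text.R:

```r
# Hypothetical rainfall lines, loosely in the style of an NWS text product.
obs_lines <- c(
  "CORPUS CHRISTI      10.25 IN",
  "VICTORIA             8.50 IN"
)

# Capture a location name and a decimal rainfall amount in inches.
pattern <- "^([A-Z ]+?)\\s{2,}(\\d+\\.\\d+) IN$"
m <- regmatches(obs_lines, regexec(pattern, obs_lines))

# Assemble the matches into a tidy data frame, one row per observation.
rain_df <- data.frame(
  location = trimws(vapply(m, `[`, character(1), 2)),
  rain_in  = as.numeric(vapply(m, `[`, character(1), 3))
)
```

Each dataset then holds one observation per row with typed columns, which is what makes the downstream report straightforward to build.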
- R 3.5.2 - The R Project for Statistical Computing
Please read Contributing for details on the code of conduct.