Python SQLAlchemy - Climate Analysis & API

Background

I decided to treat myself to a long holiday vacation in Honolulu, Hawaii! To help with my trip planning, I needed to do some climate analysis of the area. The following outlines what I did.

Step 1 - Climate Analysis and Exploration

To begin, I used Python and SQLAlchemy to do basic climate analysis and data exploration of the climate database. All of the following analyses were completed using SQLAlchemy ORM queries, Pandas, and Matplotlib.

  • Chose a start date and end date for my trip, making sure the vacation range was approximately 3-15 days total.

  • Used SQLAlchemy create_engine to connect to my SQLite database.

  • Used SQLAlchemy automap_base() to reflect my tables into classes and saved a reference to those classes called Station and Measurement (a minimal sketch of this setup follows below).
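
A minimal sketch of this setup, assuming the SQLite file is named hawaii.sqlite and the reflected tables are named station and measurement (both names are assumptions, not confirmed by the repository file listing):

    # Database setup sketch; file name and table names are assumptions.
    from sqlalchemy import create_engine
    from sqlalchemy.ext.automap import automap_base
    from sqlalchemy.orm import Session

    # Connect to the SQLite database
    engine = create_engine("sqlite:///hawaii.sqlite")

    # Reflect the existing tables into mapped classes
    # (on older SQLAlchemy versions: Base.prepare(engine, reflect=True))
    Base = automap_base()
    Base.prepare(autoload_with=engine)

    # Save references to the reflected classes
    Station = Base.classes.station
    Measurement = Base.classes.measurement

    # Create a session to run ORM queries
    session = Session(engine)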

Precipitation Analysis

  • Designed a query to retrieve the last 12 months of precipitation data.

  • Selected only the date and prcp values.

  • Loaded the query results into a Pandas DataFrame and set the index to the date column.

  • Sorted the DataFrame values by date.

  • Plotted the results using the DataFrame plot method.

  • Used Pandas to print the summary statistics for the precipitation data.
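
A rough sketch of the precipitation analysis, reusing session and Measurement from the setup sketch above and assuming dates are stored as "YYYY-MM-DD" strings:

    import datetime as dt

    import matplotlib.pyplot as plt
    import pandas as pd
    from sqlalchemy import func

    # Find the most recent date in the data, then go back one year
    latest = session.query(func.max(Measurement.date)).scalar()
    one_year_ago = dt.datetime.strptime(latest, "%Y-%m-%d").date() - dt.timedelta(days=365)

    # Last 12 months of precipitation data: date and prcp only
    results = (
        session.query(Measurement.date, Measurement.prcp)
        .filter(Measurement.date >= one_year_ago.isoformat())
        .all()
    )

    # Load into a DataFrame, index by date, sort, plot, and summarize
    prcp_df = (
        pd.DataFrame(results, columns=["date", "prcp"])
        .set_index("date")
        .sort_index()
    )
    prcp_df.plot(rot=90)
    plt.ylabel("Precipitation (inches)")
    plt.tight_layout()
    plt.show()
    print(prcp_df.describe())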

Station Analysis

  • Designed a query to calculate the total number of stations.

  • Designed a query to find the most active stations.

    • Listed the stations and observation counts in descending order.

    • Discovered which station has the highest number of observations.

  • Designed a query to retrieve the last 12 months of temperature observation data (TOBS).

    • Filtered by the station with the highest number of observations.

    • Plotted the results as a histogram with bins=12.
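
A rough sketch of the station analysis, reusing session, Station, Measurement, and one_year_ago from the sketches above; the station/tobs column names follow the queries described here, and anything else is illustrative:

    import matplotlib.pyplot as plt
    from sqlalchemy import func

    # Total number of stations
    station_count = session.query(func.count(Station.station)).scalar()
    print(f"Total stations: {station_count}")

    # Most active stations: station id and observation count, in descending order
    active_stations = (
        session.query(Measurement.station, func.count(Measurement.tobs))
        .group_by(Measurement.station)
        .order_by(func.count(Measurement.tobs).desc())
        .all()
    )
    most_active = active_stations[0][0]

    # Last 12 months of temperature observations for the most active station
    # (one_year_ago is reused from the precipitation sketch)
    temps = [
        row[0]
        for row in session.query(Measurement.tobs)
        .filter(Measurement.station == most_active)
        .filter(Measurement.date >= one_year_ago.isoformat())
        .all()
    ]

    # Plot as a histogram with 12 bins
    plt.hist(temps, bins=12, label="tobs")
    plt.xlabel("Temperature (F)")
    plt.ylabel("Frequency")
    plt.legend()
    plt.show()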

Step 2 - Climate App

Now that I had completed my initial analysis, I designed a Flask API based on the queries that I had just developed.

  • Used Flask to create my routes.
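
A minimal sketch of the Flask app skeleton, with the home route listing the available routes (the module layout and variable names are assumptions):

    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def home():
        """Home page: list all available routes."""
        return (
            "Available routes:<br/>"
            "/api/v1.0/precipitation<br/>"
            "/api/v1.0/stations<br/>"
            "/api/v1.0/tobs<br/>"
            "/api/v1.0/&lt;start&gt;<br/>"
            "/api/v1.0/&lt;start&gt;/&lt;end&gt;"
        )

    if __name__ == "__main__":
        app.run(debug=True)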

Routes

  • /

    • Home page.

    • Listed all routes that are available.

  • /api/v1.0/precipitation

    • Converted the query results to a dictionary using date as the key and prcp as the value.

    • Returned the JSON representation of the dictionary.

  • /api/v1.0/stations

    • Returned a JSON list of stations from the dataset.

  • /api/v1.0/tobs

    • Queried the dates and temperature observations of the most active station for the last year of data.

    • Returned a JSON list of temperature observations (TOBS) for the previous year.

  • /api/v1.0/<start> and /api/v1.0/<start>/<end>

    • Returned a JSON list of the minimum temperature, the average temperature, and the maximum temperature for a given start or start-end range.

    • When given the start only, calculated TMIN, TAVG, and TMAX for all dates greater than or equal to the start date.

    • When given the start and the end date, calculated TMIN, TAVG, and TMAX for dates between the start and end date, inclusive.
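
A rough sketch of these route handlers, assuming the app, engine, Station, and Measurement objects from the earlier sketches; it illustrates the queries described above rather than reproducing the exact code in this repository:

    import datetime as dt

    from flask import jsonify
    from sqlalchemy import func
    from sqlalchemy.orm import Session

    @app.route("/api/v1.0/precipitation")
    def precipitation():
        """Return date -> prcp pairs for the last 12 months as JSON."""
        session = Session(engine)
        latest = session.query(func.max(Measurement.date)).scalar()
        one_year_ago = dt.datetime.strptime(latest, "%Y-%m-%d").date() - dt.timedelta(days=365)
        rows = (
            session.query(Measurement.date, Measurement.prcp)
            .filter(Measurement.date >= one_year_ago.isoformat())
            .all()
        )
        session.close()
        return jsonify({date: prcp for date, prcp in rows})

    @app.route("/api/v1.0/stations")
    def stations():
        """Return a JSON list of station identifiers."""
        session = Session(engine)
        rows = session.query(Station.station).all()
        session.close()
        return jsonify([row[0] for row in rows])

    @app.route("/api/v1.0/tobs")
    def tobs():
        """Return the last year of observations for the most active station."""
        session = Session(engine)
        most_active = (
            session.query(Measurement.station)
            .group_by(Measurement.station)
            .order_by(func.count(Measurement.tobs).desc())
            .first()[0]
        )
        latest = session.query(func.max(Measurement.date)).scalar()
        one_year_ago = dt.datetime.strptime(latest, "%Y-%m-%d").date() - dt.timedelta(days=365)
        rows = (
            session.query(Measurement.tobs)
            .filter(Measurement.station == most_active)
            .filter(Measurement.date >= one_year_ago.isoformat())
            .all()
        )
        session.close()
        return jsonify([row[0] for row in rows])

    @app.route("/api/v1.0/<start>")
    @app.route("/api/v1.0/<start>/<end>")
    def temp_stats(start, end=None):
        """Return TMIN, TAVG, TMAX for dates >= start (and <= end, if given)."""
        session = Session(engine)
        query = session.query(
            func.min(Measurement.tobs),
            func.avg(Measurement.tobs),
            func.max(Measurement.tobs),
        ).filter(Measurement.date >= start)
        if end is not None:
            query = query.filter(Measurement.date <= end)
        tmin, tavg, tmax = query.one()
        session.close()
        return jsonify({"TMIN": tmin, "TAVG": tavg, "TMAX": tmax})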

About

Used SQLAlchemy to connect to a SQLite database of climate data for Hawaii and analyze it in Python, then created an API to return results based on given parameters.
