Add US-SEC parser for Seminole Electric Cooperative in Florida (#1902)
* Start on US-FL SEC parser, just solar

background in #1713. example output:

```python
[{
  'zoneKey': 'US-SEC',
  'datetime': '2019-07-02T11:00:00Z',
  'production': {'solar': 0.0125},
  'source': 'apps.seminole.coop',
 }, {
  'zoneKey': 'US-SEC',
  'datetime': '2019-07-02T12:00:00Z',
  'production': {'solar': 0.1269},
  'source': 'apps.seminole.coop',
 },
 ...
]
```

* EIA bug fix: extract ['series'][0]['data'] even with target_datetime
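
For clarity, here's a minimal sketch of the fixed helper. It mirrors the `parsers/EIA.py` hunk further down; `Series` is the EIA client the module already uses (the `from eiapy import Series` import is an assumption — check the module's existing imports), and `parser` is `dateutil.parser`. The point of the fix is that `['series'][0]['data']` is now extracted once, after the `if`/`else`, so the `target_datetime` branch no longer hands back the unparsed response.

```python
import requests
from dateutil import parser
from eiapy import Series  # assumed import; the real module already imports its EIA client


def _fetch_production_or_consumption(zone_key, series_id, session=None,
                                     target_datetime=None, logger=None):
    """Fetches production or consumption data for the given EIA series_id."""
    s = session or requests.Session()
    series = Series(series_id=series_id, session=s)

    if target_datetime:
        raw_data = series.last_from(24, end=target_datetime)
    else:
        # Get the last 24 hours available.
        raw_data = series.last(24)

    # Drill into the response here, for both branches. Previously the
    # target_datetime branch skipped the ['series'][0]['data'] lookup.
    return [{
        'datetime': parser.parse(datapoint[0]),
        'value': datapoint[1],
        'source': 'eia.org',
    } for datapoint in raw_data['series'][0]['data']]
```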

* US-SEC: fetch production from EIA

for #1713
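
Concretely, this adds a `PRODUCTION` series mapping and a thin `EIA.fetch_production` wrapper around the helper sketched above. A rough usage sketch (the output shape follows the diff; the numeric value is made up):

```python
from parsers import EIA

# In parsers/EIA.py, 'US-SEC' maps to the hourly EIA series 'EBA.SEC-ALL.NG.H'.
datapoints = EIA.fetch_production('US-SEC')

# Each entry covers one hour, roughly:
#   {'datetime': datetime(2019, 7, 2, 11, 0, tzinfo=tzutc()),
#    'value': 1001.0,    # illustrative, presumably MW
#    'source': 'eia.org'}
```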

* finish US-SEC: merge gas/coal from EIA with solar

for #1713

uses `ENTSOE.merge_production_outputs()`. There isn't much precedent for parsers reusing each other's code, so I've also added a TODO to move that function to `lib.utils`.
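
A hedged usage sketch of the combined parser (assuming, as the `pprint` call in `main()` suggests, that `merge_production_outputs` returns a list of hourly datapoints; the merged-datapoint values are illustrative, not real data):

```python
from parsers import US_SEC

# fetch_production() relabels the EIA gas+coal series as 'unknown', fetches
# solar from apps.seminole.coop, and sums the two per hour via
# ENTSOE.merge_production_outputs.
for datapoint in US_SEC.fetch_production('US-SEC'):
    print(datapoint)

# One merged hourly datapoint might look like:
#   {'zoneKey': 'US-SEC',
#    'datetime': datetime(2019, 7, 2, 11, 0, tzinfo=timezone.utc),
#    'production': {'unknown': 1001.0, 'solar': 0.0125},
#    'storage': {},
#    'source': 'eia.org, apps.seminole.coop'}
```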

* add US-SEC (Florida) to README and capacity zones

* add details to US-SEC and EIA._fetch_production_or_consumption docstrings

* add US-SEC geometries based on Florida counties

Florida county GeoJSON downloaded from https://geodata.myflorida.com/datasets/swfwmd::florida-counties

Counties chosen from https://www.seminole-electric.com/members/ and the pages it links to, then lightly hand-edited.

TODO: figure out why topogen.sh re-downloads the Florida GeoJSON file on every run, even though the download is wrapped in an `if [ ! -e ... ]` guard just like the NACIS zip downloads.

* US-SEC: add US-FL counties raw GeoJSON to third_party_maps

downloaded from https://geodata.myflorida.com/datasets/swfwmd::florida-counties on 2019-08-05.

source URL: https://opendata.arcgis.com/datasets/4abd0a3669204df2bc3a57066d217959_4.geojson

* US-SEC: merge Florida counties and drop internal borders
snarfed authored and corradio committed Aug 9, 2019
1 parent c371a60 commit 16c0a4d
Showing 8 changed files with 152 additions and 17 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -180,6 +180,7 @@ Real-time electricity data is obtained using [parsers](https://github.com/tmrowc
- Southwest Power Pool: [SPP](https://marketplace.spp.org/pages/generation-mix)
- Southwest Variable Energy Resource Initiative: [SVERI](https://sveri.energy.arizona.edu/#generation-by-fuel-type)
- Texas: [ERCOT](http://www.ercot.com/content/cdr/html/real_time_system_conditions.html)
- Seminole Electric Cooperative (Florida): [EIA](https://www.eia.gov/opendata/qb.php?category=2122629&sdid=EBA.SEC-ALL.NG.H), [SEC](https://www.seminole-electric.com/facilities/generation/)
- Uruguay: [UTE](http://www.ute.com.uy/SgePublico/ConsPotenciaGeneracionArbolXFuente.aspx)
&nbsp;</details>

24 changes: 24 additions & 0 deletions config/zones.json
@@ -3735,6 +3735,30 @@
},
"timezone": null
},
"US-SEC": {
"bounding_box": [
[
-87.63333333,
24.45
],
[
-80.03333333,
31.0
]
],
"capacity": {
"unknown": 2110,
"solar": 2
},
"contributors": [
"https://github.com/snarfed"
],
"flag_file_name": "us.png",
"parsers": {
"production": "US_SEC.fetch_production"
},
"timezone": "US/Eastern"
},
"US-TX": {
"_comment": "http://www.ercot.com/content/wcm/lists/172484/ERCOT_Quick_Facts_01.17.19.pdf lists wind capacity and utility scale solar and percentages, unknown capacity calculated from those figures",
"bounding_box": [
28 changes: 23 additions & 5 deletions parsers/EIA.py
@@ -19,6 +19,10 @@
'US-IPC': 'EBA.IPCO-ALL.DF.H'
}

PRODUCTION = {
'US-SEC': 'EBA.SEC-ALL.NG.H'
}

EXCHANGES = {
'MX-BC->US-CA': 'EBA.CISO-CFE.ID.H',
'US-BPA->US-IPC': 'EBA.BPAT-IPCO.ID.H',
@@ -32,16 +36,29 @@

def fetch_consumption_forecast(zone_key, session=None, target_datetime=None,
logger=None):
return _fetch_production_or_consumption(
zone_key, DAY_AHEAD[zone_key], session=session,
target_datetime=target_datetime, logger=logger)


def fetch_production(zone_key, session=None, target_datetime=None,
logger=None):
return _fetch_production_or_consumption(
zone_key, PRODUCTION[zone_key], session=session,
target_datetime=target_datetime, logger=logger)


series_id = DAY_AHEAD[zone_key]
def _fetch_production_or_consumption(zone_key, series_id, session=None,
target_datetime=None, logger=None):
"""Fetches production or consumption forecast, determined by series_id."""
s = session or requests.Session()
forecast_series = Series(series_id=series_id, session=s)
series = Series(series_id=series_id, session=s)

if target_datetime:
raw_data = forecast_series.last_from(24, end=target_datetime)
raw_data = series.last_from(24, end=target_datetime)
else:
# Get the last 24 hours available.
raw_data = forecast_series.last(24)['series'][0]['data']
raw_data = series.last(24)

# UTC timestamp with no offset returned.

@@ -50,7 +67,7 @@ def fetch_consumption_forecast(zone_key, session=None, target_datetime=None,
'datetime': parser.parse(datapoint[0]),
'value': datapoint[1],
'source': 'eia.org',
} for datapoint in raw_data]
} for datapoint in raw_data['series'][0]['data']]


def fetch_exchange(zone_key1, zone_key2, session=None, target_datetime=None, logger=None):
@@ -102,4 +119,5 @@ def fetch_exchange(zone_key1, zone_key2, session=None, target_datetime=None, log
"Main method, never used by the Electricity Map backend, but handy for testing."

print(fetch_consumption_forecast('US-NY'))
print(fetch_production('US-SEC'))
print(fetch_exchange('MX-BC', 'US-CA'))
2 changes: 2 additions & 0 deletions parsers/ENTSOE.py
@@ -857,6 +857,8 @@ def fetch_production(zone_key, session=None, target_datetime=None,
}


# TODO: generalize and move to lib.utils so other parsers can reuse it. (it's
# currently used by US_SEC.)
def merge_production_outputs(parser_outputs, merge_zone_key, merge_source=None):
"""
Given multiple parser outputs, sum the production and storage
64 changes: 64 additions & 0 deletions parsers/US_SEC.py
@@ -0,0 +1,64 @@
#!/usr/bin/env python3

"""Parser for the Seminole Electric Cooperative in Florida, USA.
Combines hourly gas and coal production data from EIA with hourly solar generation from http://apps.seminole.coop/db/cs/. Both generally lag a day or so behind, and sometimes as much as 10 days behind.
https://www.seminole-electric.com/facilities/generation/ lists two 650MW coal-fired plants, one 810MW gas plant, and a small 2.2MW solar farm. However, EIA combines coal and gas generation data, so we report those together as unknown.
https://github.com/tmrowco/electricitymap-contrib/issues/1713
"""

import logging
from datetime import timezone

import pandas as pd

from parsers import EIA
from parsers.ENTSOE import merge_production_outputs


def fetch_production(zone_key='US-SEC', session=None, target_datetime=None,
                     logger=logging.getLogger(__name__)):
    unknown = fetch_unknown(zone_key=zone_key, session=session,
                            target_datetime=target_datetime, logger=logger)
    solar = fetch_solar(session=session, logger=logger)
    return merge_production_outputs((unknown, solar), zone_key,
                                    merge_source='eia.org, apps.seminole.coop')


def fetch_unknown(zone_key='US-SEC', session=None, target_datetime=None,
                  logger=logging.getLogger(__name__)):
    data = EIA.fetch_production(zone_key=zone_key, session=session,
                                target_datetime=target_datetime, logger=logger)
    for hour in data:
        hour.update({
            'production': {'unknown': hour.pop('value')},
            'storage': {},  # required by merge_production_outputs
        })

    return data


def fetch_solar(session=None, logger=logging.getLogger(__name__)):
    url = 'http://apps.seminole.coop/db/cs/render.ashx?ItemPath=/Applications/Solar+Dashboard/Cooperative+Solar+-+Data&Format=EXCEL&rptHDInterval=Week&rptHDOffset=0'
    df = pd.read_excel(url, sheet_name='Hourly', skiprows=[0])

    return [{
        'zoneKey': 'US-SEC',
        'datetime': row['Date/Time (UTC)'].to_pydatetime().replace(tzinfo=timezone.utc),
        'production': {'solar': row['kW'] / 1000.0},
        'storage': {},  # required by merge_production_outputs
        'source': 'apps.seminole.coop',
    } for _, row in df.iterrows()]


def main():
    """Main method, not used by the ElectricityMap backend, just for testing."""
    import pprint
    print('fetch_production() ->')
    pprint.pprint(fetch_production())


if __name__ == '__main__':
    main()
47 changes: 36 additions & 11 deletions web/generate-geometries.js
@@ -15,7 +15,7 @@ function readNDJSON(path) {

const countryGeos = readNDJSON('./build/tmp_countries.json');
const stateGeos = readNDJSON('./build/tmp_states.json');
const thirpartyGeos = readNDJSON('./build/tmp_thirdparty.json').concat([
const thirdpartyGeos = readNDJSON('./build/tmp_thirdparty.json').concat([
require('./third_party_maps/DK-DK2-without-BHM.json'),
require('./third_party_maps/NO-NO1.json'),
require('./third_party_maps/NO-NO2.json'),
@@ -47,9 +47,11 @@ const thirpartyGeos = readNDJSON('./build/tmp_thirdparty.json').concat([
JSON.parse(fs.readFileSync('./third_party_maps/US-HI-MO.geojson')),
JSON.parse(fs.readFileSync('./third_party_maps/US-HI-NI.geojson')),
JSON.parse(fs.readFileSync('./third_party_maps/US-HI-OA.geojson')),
]);
]).concat(
JSON.parse(fs.readFileSync('./third_party_maps/US-FL.geojson')).features,
);

const allGeos = countryGeos.concat(stateGeos, thirpartyGeos);
const allGeos = countryGeos.concat(stateGeos, thirdpartyGeos);

function geomerge() {
// Convert both into multipolygon
@@ -82,28 +84,36 @@ function hascMatch(properties, hasc) {
);
}

function equals(obj, prop, val) {
return obj && prop in obj && obj[prop] === val;
}

function getCountry(countryId) {
return geomerge(...allGeos.filter(d => d.id === countryId));
return geomerge(...allGeos.filter(d => equals(d, 'id', countryId)));
}
function getByPropertiesId(zoneId) {
return geomerge(...allGeos.filter(d => d.properties.id === zoneId));
return geomerge(...allGeos.filter(d => equals(d.properties, 'id', zoneId)));
}
function getSubUnit(subid) {
return geomerge(...allGeos.filter(d => d.properties.subid === subid));
return geomerge(...allGeos.filter(d => equals(d.properties, 'subid', subid)));
}
function getState(countryId, code_hasc, use_maybe=false) {
return geomerge(...allGeos.filter(d =>
d.id === countryId && (use_maybe && hascMatch(d.properties, code_hasc) || d.properties.code_hasc === code_hasc)));
equals(d, 'id', countryId) && 'code_hasc' in d.properties &&
(use_maybe && hascMatch(d.properties, code_hasc) || d.properties.code_hasc === code_hasc)));
}
function getStateByFips(countryId, fips) {
return geomerge(...allGeos.filter(d =>
d.id === countryId && d.properties.fips === fips));
equals(d, 'id', countryId) && equals(d.properties, 'fips', fips)));
}
function getStateByAdm1(adm1_code) {
return geomerge(...allGeos.filter(d => d.properties.adm1_code === adm1_code));
return geomerge(...allGeos.filter(d => equals(d.properties, 'adm1_code', adm1_code)));
}
function getByRegionCod(region_cod) {
return geomerge(...allGeos.filter(d => d.properties.region_cod === region_cod));
return geomerge(...allGeos.filter(d => equals(d.properties, 'region_cod', region_cod)));
}
function getCounty(county_name) {
return geomerge(...allGeos.filter(d => equals(d.properties, 'COUNTYNAME', county_name)));
}
function getStates(countryId, code_hascs, use_maybe) {
return geomerge(...code_hascs.map(d => getState(countryId, d, use_maybe)));
@@ -117,6 +127,9 @@ function getCountries(countryIds) {
function getSubUnits(ids) {
return geomerge(...ids.map(getSubUnit));
}
function getCounties(names) {
return geomerge(...names.map(getCounty));
}

const zoneDefinitions = [
// Map between "zones" iso_a2 and adm0_a3 in order to support XX, GB etc..
@@ -595,6 +608,7 @@ const zoneDefinitions = [
// { zoneName: 'US-RI', countryId: 'USA', stateId: 'US.RI', type: 'state' },
{ zoneName: 'US-SC', countryId: 'USA', stateId: 'US.SC', type: 'state' },
// { zoneName: 'US-SD', countryId: 'USA', stateId: 'US.SD', type: 'state' },
{ zoneName: 'US-SEC', type: 'county', counties: ['ALACHUA', 'BAKER', 'BRADFORD', 'CITRUS', 'CLAY', 'COLUMBIA', 'DESOTO', 'DIXIE', 'GADSDEN', 'GILCHRIST', 'GLADES', 'HAMILTON', 'HARDEE', 'HENDRY', 'HERNANDO', 'HIGHLANDS', 'JEFFERSON', 'LAFAYETTE', 'LAKE', 'LEON', 'LEVY', 'LIBERTY', 'MADISON', 'MANATEE', 'OKEECHOBEE', 'OSCEOLA', 'PASCO', 'PUTNAM', 'SARASOTA', 'SUMTER', 'SUWANNEE', 'TAYLOR', 'UNION', 'WAKULLA']},
{ zoneName: 'US-SPP', type: 'states', countryId: 'USA', states: [
'US.KS', 'US.NE','US.OK', 'US.ND', 'US.SD']},
{ zoneName: 'US-SVERI', type: 'states', countryId: 'USA', states: ['US.AZ', 'US.NM']},
@@ -666,6 +680,9 @@ const getDataForZone = (zone, mergeStates) => {
return getByRegionCod(zone.region_cod);
}
}
else if (zone.type === 'county') {
return getCounties(zone.counties);
}
else{
console.warn(`unknown type "${zone.type}" for zone`, zone.zoneName);
}
@@ -750,9 +767,17 @@ zoneFeatures = toListOfFeatures(zoneFeaturesInline);
// Write unsimplified list of geojson, without state merges
fs.writeFileSync('public/dist/zonegeometries.json', zoneFeatures.map(JSON.stringify).join('\n'));

// Simplify all countries
// Convert to TopoJSON
const topojson = require('topojson');
let topo = topojson.topology(zones);

// merge contiguous Florida counties in US-SEC so that we only see the outer
// region boundary line(s), not the interior county boundary lines.
// Example: https://bl.ocks.org/mbostock/5416405
// Background: https://github.com/tmrowco/electricitymap-contrib/issues/1713#issuecomment-517704023
topo.objects['US-SEC'] = topojson.mergeArcs(topo, [topo.objects['US-SEC']]);

// Simplify all countries
topo = topojson.presimplify(topo);
topo = topojson.simplify(topo, 0.01);

2 changes: 1 addition & 1 deletion web/src/world.json

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions web/third_party_maps/US-FL.geojson

Large diffs are not rendered by default.
