diff --git a/.buildinfo b/.buildinfo
new file mode 100644
index 0000000..8cbe87f
--- /dev/null
+++ b/.buildinfo
@@ -0,0 +1,4 @@
+# Sphinx build info version 1
+# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
+config: 6a89bab28ebd95128323eaf1dc13ed46
+tags: 645f666f9bcd5a90fca523b33c5a78b7
diff --git a/.nojekyll b/.nojekyll
new file mode 100644
index 0000000..e69de29
diff --git a/_generated/lasso.CubeTransit.html b/_generated/lasso.CubeTransit.html
new file mode 100644
index 0000000..901bda7
--- /dev/null
+++ b/_generated/lasso.CubeTransit.html
@@ -0,0 +1,687 @@
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ + + + +
+
+
+
+ +
+

lasso.CubeTransit

+
+
+class lasso.CubeTransit(parameters={})[source]
+

Bases: object

+

Class for storing information about transit defined in Cube line files.

+

Has the capability to:

+
+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
+

Typical usage example:

+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+lines
+

list of strings representing unique line names in +the cube network.

+
+
Type
+

list

+
+
+
+ +
+
+line_properties
+

dictionary of line properties keyed by line name. Property +values are stored in a dictionary by property name. These +properties are directly read from the cube line files and haven’t +been translated to standard transit values.

+
+
Type
+

dict

+
+
+
+ +
+
+shapes
+

dictionary of shapes keyed by line name. Shapes are stored as a pandas DataFrame of nodes with the following columns:

+
+
    +
  • ‘node_id’ (int): positive integer of node id

  • +
  • ‘node’ (int): node number, with negative indicating a non-stop

  • +
  • ‘stop’ (boolean): indicates if it is a stop

  • +
  • ‘order’ (int): order within this shape

  • +
+
+
+
Type
+

dict

+
+
+
+ +
+
+program_type
+

Either PT or TRNBLD

+
+
Type
+

str

+
+
+
+ +
+
+parameters
+

Parameters instance that will be applied to this instance which +includes information about time periods and variables.

+
+
Type
+

Parameters

+
+
+
+ +
+
+source_list
+

List of cube line file sources that have been read and added.

+
+
Type
+

list

+
+
+
+ +
+
+diff_dict
+
+
Type
+

dict

+
+
+
+ +
+
+__init__(parameters={})[source]
+

Constructor for CubeTransit

+

parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters

+
+ +

Methods


__init__([parameters])

Constructor for CubeTransit

add_additional_time_periods(…)

Copies a route to another cube time period with appropriate values for time-period-specific properties.

add_cube(transit_source)

Reads a .lin file and adds it to existing TransitNetwork instance.

build_route_name([route_id, time_period, …])

Create a route name by concatenating route, time period, agency, and direction

calculate_start_end_times(line_properties_dict)

Calculate the start and end times of the property change. WARNING: does not handle non-contiguous time periods.

create_add_route_card_dict(line)

Creates a project card change formatted dictionary for adding a route based on the information in self.route_properties for the line.

create_delete_route_card_dict(line, …)

Creates a project card change formatted dictionary for deleting a line.

create_from_cube(transit_source)

Reads a cube .lin file and stores as TransitNetwork object.

create_update_route_card_dict(line, …)

Creates a project card change formatted dictionary for updating the line.

cube_properties_to_standard_properties(…)

Converts cube style properties to standard properties.

evaluate_differences(base_transit)

    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
+

evaluate_route_property_differences(…[, …])

Checks if any values have been updated or added for a specific route and creates project card entries for each.

evaluate_route_shape_changes(shape_build, …)

Compares two route shapes and returns a list of changes suitable for a project card.

get_time_period_numbers_from_cube_properties(…)

Finds properties that are associated with time periods and returns the numbers in them.

unpack_route_name(line_name)

Unpacks route name into direction, route, agency, and time period info

+
+
+add_additional_time_periods(new_time_period_number, orig_line_name)[source]
+

Copies a route to another cube time period with appropriate +values for time-period-specific properties.

+
+
New properties are stored under the new name in:
    +
  • self.shapes

  • +
  • self.line_properties

  • +
+
+
+
+
Parameters
+
    +
  • new_time_period_number (int) – cube time period number

  • +
  • orig_line_name (str) – name of the originating line, from which +the new line will copy its properties.

  • +
+
+
Returns
+

Line name with new time period.

+
+
Return type
+

str

+
+
+
+ +
+
+add_cube(transit_source)[source]
+

Reads a .lin file and adds it to existing TransitNetwork instance.

+
+
Parameters
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
Return type
+

None

+
+
+
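For illustration, a minimal sketch of combining several line files into one CubeTransit instance; the directory and file paths here are hypothetical.

    tn = CubeTransit.create_from_cube("examples/cube/base")   # hypothetical directory of .lin files
    tn.add_cube("examples/cube/extra_routes.lin")             # hypothetical additional .lin file
    print(tn.lines)  # unique line names now include both sources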
+ +
+
+static build_route_name(route_id='', time_period='', agency_id=0, direction_id=1)[source]
+

Create a route name by concatenating route, time period, agency, and direction

+
+
Parameters
+
    +
  • route_id – i.e. 452-111

  • +
  • time_period – i.e. pk

  • +
  • direction_id – i.e. 1

  • +
  • agency_id – i.e. 0

  • +
+
+
Returns
+

constructed line_name i.e. “0_452-111_452_pk1”

+
+
Return type
+

str

+
+
+
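A short sketch of calling this static helper with the example values from the parameter list above:

    line_name = CubeTransit.build_route_name(
        route_id="452-111", time_period="pk", agency_id=0, direction_id=1
    )
    # per the docstring, this is expected to produce something like "0_452-111_452_pk1"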
+ +
+
+calculate_start_end_times(line_properties_dict)[source]
+

Calculate the start and end times of the property change. WARNING: does not handle non-contiguous time periods.

+
+
Parameters
+

line_properties_dict – dictionary of cube-flavor properties for a transit line

+
+
+
+ +
+
+create_add_route_card_dict(line)[source]
+

Creates a project card change formatted dictionary for adding +a route based on the information in self.route_properties for +the line.

+
+
Parameters
+

line – name of line that is being updated

+
+
Returns
+

A project card change-formatted dictionary for the route addition.

+
+
+
+ +
+
+create_delete_route_card_dict(line, base_transit_line_properties_dict)[source]
+

Creates a project card change formatted dictionary for deleting a line.

+
+
Parameters
+
    +
  • line – name of line that is being deleted

  • +
  • base_transit_line_properties_dict – dictionary of cube-style +attribute values in order to find time periods and +start and end times.

  • +
+
+
Returns
+

A project card change-formatted dictionary for the route deletion.

+
+
+
+ +
+
+static create_from_cube(transit_source)[source]
+

Reads a cube .lin file and stores as TransitNetwork object.

+
+
Parameters
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
Returns
+

A CubeTransit object created from the transit_source.

+
+
+
+ +
+
+create_update_route_card_dict(line, updated_properties_dict)[source]
+

Creates a project card change formatted dictionary for updating +the line.

+
+
Parameters
+
    +
  • line – name of line that is being updated

  • +
  • updated_properties_dict – dictionary of attributes to update as +‘property’: <property name>, +‘set’: <new property value>

  • +
+
+
Returns
+

A project card change-formatted dictionary for the attribute update.

+
+
+
+ +
+
+static cube_properties_to_standard_properties(cube_properties_dict)[source]
+

Converts cube style properties to standard properties.

+

This is most pertinent to time-period specific variables like headway, and variables that have standard units like headway, which is minutes in cube and seconds in standard format.

+
+
Parameters
+

cube_properties_dict – <cube style property name> : <property value>

+
+
Returns
+

A list of dictionaries with values for “property”: <standard style property name>, “set”: <property value with correct units>

+
+
Return type
+

list

+
+
+
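A hedged sketch of the minutes-to-seconds conversion described above; the cube property name HEADWAY[1] and the standard property name in the comment are assumptions, not taken from the source.

    cube_props = {"HEADWAY[1]": 10}  # assumed cube-style property: a 10-minute headway
    standard = CubeTransit.cube_properties_to_standard_properties(cube_props)
    # per the Returns description, expect a list of dicts such as
    # [{"property": "headway_secs", "set": 600}]  (property name and units are assumptions)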
+ +
+
+evaluate_differences(base_transit)[source]
+
    +
  1. Identifies what routes need to be updated, deleted, or added

  2. For routes being added or updated, identifies if the time periods have changed or if there are multiples, and makes duplicate lines if so

  3. Creates project card dictionaries for each change.
+
+
Parameters
+

base_transit (CubeTransit) – an instance of this class for the base condition

+
+
Returns
+

A list of dictionaries containing project card changes +required to evaluate the differences between the base network +and this transit network instance.

+
+
+
+ +
+
+evaluate_route_property_differences(properties_build, properties_base, time_period_number, absolute=True, validate_base=False)[source]
+

Checks if any values have been updated or added for a specific +route and creates project card entries for each.

+
+
Parameters
+
    +
  • properties_build – ::<property_name>: <property_value>

  • +
  • properties_base – ::<property_name>: <property_value>

  • +
  • time_period_number – time period to evaluate

  • +
  • absolute – if True, will use set command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway

  • +
  • validate_base – if True, will add the existing line in the project card

  • +
+
+
Returns
+

+
a list of dictionary values suitable for writing to a project card

{
    ‘property’: <property_name>,
    ‘set’: <set value>,
    ‘change’: <change from existing value>,
    ‘existing’: <existing value to check>,
}

+
+
+

+
+
Return type
+

transit_change_list (list)

+
+
+
+ +
+
+evaluate_route_shape_changes(shape_build, shape_base)[source]
+

Compares two route shapes and returns a list of changes suitable for a project card.

+
+
Parameters
+
    +
  • shape_build – DataFrame of the build-version of the route shape.

  • +
  • shape_base – DataFrame of the base-version of the route shape.

  • +
+
+
Returns
+

List of shape changes formatted as a project card-change dictionary.

+
+
+
+ +
+
+static get_time_period_numbers_from_cube_properties(properties_list)[source]
+

Finds properties that are associated with time periods and returns the numbers in them.

+
+
Parameters
+

properties_list (list) – list of all properties.

+
+
Returns
+

list of strings of the time period numbers found

+
+
+
+ +
+
+static unpack_route_name(line_name)[source]
+

Unpacks route name into direction, route, agency, and time period info

+
+
Parameters
+

line_name (str) – i.e. “0_452-111_452_pk1”

+
+
Returns
+

452-111
time_period (str): i.e. pk
direction_id (str): i.e. 1
agency_id (str): i.e. 0

+
+
Return type
+

route_id (str)

+
+
+
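For illustration, round-tripping the example line name from the docstring; the exact return structure and order shown here are assumptions based on the Returns description.

    route_id, time_period, direction_id, agency_id = CubeTransit.unpack_route_name("0_452-111_452_pk1")
    # expected per the docstring: route_id "452-111", time_period "pk", direction_id "1", agency_id "0"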
+ +
+ +
+ + +
+ +
+ + +
+
+ +
+ +
\ No newline at end of file
diff --git a/_generated/lasso.ModelRoadwayNetwork.html b/_generated/lasso.ModelRoadwayNetwork.html
new file mode 100644
index 0000000..967fc98
--- /dev/null
+++ b/_generated/lasso.ModelRoadwayNetwork.html
@@ -0,0 +1,1628 @@
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ + + + +
+
+
+
+ +
+

lasso.ModelRoadwayNetwork

+
+
+class lasso.ModelRoadwayNetwork(nodes, links, shapes, parameters={})[source]
+

Bases: network_wrangler.roadwaynetwork.RoadwayNetwork

+

Subclass of network_wrangler class RoadwayNetwork

+

A representation of the physical roadway network and its properties.

+
+
+__init__(nodes, links, shapes, parameters={})[source]
+

Constructor

+
+
Parameters
+
    +
  • nodes – geodataframe of nodes

  • +
  • links – dataframe of links

  • +
  • shapes – geodataframe of shapes

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
+
+ +

Methods


__init__(nodes, links, shapes[, parameters])

Constructor

add_counts([network_variable, …])

Adds count variable.

add_new_roadway_feature_change(links, nodes)

add the new roadway features defined in the project card.

add_variable_using_shst_reference([…])

Join network links with source data, via SHST API node match result.

addition_map(links, nodes)

Shows which links and nodes are added to the roadway network

apply(project_card_dictionary)

Wrapper method to apply a project to a roadway network.

apply_managed_lane_feature_change(link_idx, …)

Apply the managed lane feature changes to the roadway network

apply_python_calculation(pycode[, in_place])

Changes roadway network object by executing pycode.

apply_roadway_feature_change(link_idx, …)

Changes the roadway attributes for the selected features based on the project card information passed

assess_connectivity([mode, …])

Returns a network graph and list of disconnected subgraphs as described by a list of their member nodes.

build_selection_key(selection_dict)

Selections are stored by a key combining the query and the A and B ids.

calculate_area_type([area_type_shape, …])

Calculates area type variable.

calculate_assign_group_and_roadway_class([…])

Calculates assignment group and roadway class variables.

calculate_centroidconnect([…])

Calculates centroid connector variable.

calculate_county([county_shape, …])

Calculates county variable.

calculate_distance([network_variable, …])

calculate link distance in miles

calculate_hov([network_variable, …])

Calculates hov variable.

calculate_mpo([county_network_variable, …])

Calculates mpo variable.

convert_int([int_col_names])

Convert integer columns

create_ML_variable([network_variable, overwrite])

Creates ML lanes placeholder for project to write out ML changes

create_calculated_variables()

Creates calculated roadway variables.

create_dummy_connector_links(ml_df)

create dummy connector links between the general purpose and managed lanes

create_hov_corridor_variable([…])

Creates hov corridor placeholder for project to write out corridor changes

create_managed_lane_network([in_place])

Create a roadway network with managed lanes links separated out.

create_managed_variable([network_variable, …])

Creates placeholder for project to write out managed lane changes

dataframe_to_fixed_width(df)

Convert dataframe to fixed width format, geometry column will not be transformed.

delete_roadway_feature_change(links, nodes)

delete the roadway features defined in the project card.

deletion_map(links, nodes)

Shows which links and nodes are deleted from the roadway network

fill_na()

Fill na values from create_managed_lane_network()

from_RoadwayNetwork(roadway_network_object)

RoadwayNetwork to ModelRoadwayNetwork

get_attribute(links_df, join_key, …)

Gets attribute from source data using SHST match result.

get_managed_lane_node_ids(nodes_list)

get_modal_graph(links_df, nodes_df[, mode])

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

get_modal_links_nodes(links_df, nodes_df[, …])

Returns nodes and link dataframes for specific mode.

get_property_by_time_period_and_group(property)

Return a series for the properties with a specific group or time period.

is_network_connected([mode, links_df, nodes_df])

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

network_connection_plot(G, …)

Plot a graph to check for network connection.

orig_dest_nodes_foreign_key(selection[, …])

Returns the foreign key id (whatever is used in the u and v variables in the links file) for the AB nodes as a tuple.

ox_graph(nodes_df, links_df)

create an osmnx-flavored network graph

read(link_file, node_file, shape_file[, …])

Reads in links and nodes network standard.

read_match_result(path)

Reads the shst geojson match returns.

rename_variables_for_dbf(input_df[, …])

Rename attributes for DBF/SHP, make sure length within 10 chars.

roadway_net_to_gdf(roadway_net)

Turn the roadway network into a GeoDataFrame :param roadway_net: the roadway network to export

roadway_standard_to_met_council_network([…])

Rename and format roadway attributes to be consistent with what metcouncil’s model is expecting.

select_roadway_features(selection[, …])

Selects roadway features that satisfy selection criteria

selection_has_unique_link_id(selection_dict)

+
param selection_dictionary
+

Dictionary representation of selection

+
+
+

selection_map(selected_link_idx[, A, B, …])

Shows which links are selected for roadway property change or parallel managed lanes category of roadway projects.

split_properties_by_time_period_and_category([…])

Splits properties by time period, assuming a variable structure of

validate_link_schema(link_file[, …])

Validate roadway network data link schema and output a boolean

validate_node_schema(node_file[, …])

Validate roadway network data node schema and output a boolean

validate_object_types(nodes, links, shapes)

Determines if the roadway network is being built with the right object types.

validate_properties(properties[, …])

If there are change or existing commands, make sure that the property exists in the network.

validate_selection(selection)

Evaluate whether the selection dictionary contains the minimum required values.

validate_shape_schema(shape_file[, …])

Validate roadway network data shape schema and output a boolean

validate_uniqueness()

Confirms that the unique identifiers are met.

write([path, filename])

Writes a network in the roadway network standard

write_roadway_as_fixedwidth([…])

Writes out fixed width file.

write_roadway_as_shp([…])

Write out dbf/shp for cube.

+

Attributes


CALCULATED_VALUES

CRS

KEEP_SAME_ATTRIBUTES_ML_AND_GP

LINK_FOREIGN_KEY

MANAGED_LANES_LINK_ID_SCALAR

MANAGED_LANES_NODE_ID_SCALAR

MANAGED_LANES_REQUIRED_ATTRIBUTES

MANAGED_LANES_SCALAR

MAX_SEARCH_BREADTH

MODES_TO_NETWORK_LINK_VARIABLES

MODES_TO_NETWORK_NODE_VARIABLES

NODE_FOREIGN_KEY

SEARCH_BREADTH

SELECTION_REQUIRES

SP_WEIGHT_FACTOR

UNIQUE_LINK_KEY

UNIQUE_MODEL_LINK_IDENTIFIERS

UNIQUE_NODE_IDENTIFIERS

UNIQUE_NODE_KEY

UNIQUE_SHAPE_KEY

+
+
+add_counts(network_variable='AADT', mndot_count_shst_data=None, widot_count_shst_data=None, mndot_count_variable_shp=None, widot_count_variable_shp=None)[source]
+

Adds count variable.

+

join the network with count node data, via SHST API node match result

+
+
Parameters
+
    +
  • network_variable (str) – Name of the variable that should be written to. Default to “AADT”.

  • +
  • mndot_count_shst_data (str) – File path to MNDOT count location SHST API node match result.

  • +
  • widot_count_shst_data (str) – File path to WIDOT count location SHST API node match result.

  • +
  • mndot_count_variable_shp (str) – File path to MNDOT count location geodatabase.

  • +
  • widot_count_variable_shp (str) – File path to WIDOT count location geodatabase.

  • +
+
+
Returns
+

None

+
+
+
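A hedged example of attaching counts, assuming net is an existing ModelRoadwayNetwork; the file paths are hypothetical placeholders.

    net.add_counts(
        network_variable="AADT",
        mndot_count_shst_data="data/mn_count_ShSt_API_match.csv",  # hypothetical path
        widot_count_shst_data="data/wi_count_ShSt_API_match.csv",  # hypothetical path
    )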
+ +
+
+add_new_roadway_feature_change(links, nodes)
+

Adds the new roadway features defined in the project card. New shapes are also added for the new roadway links.

+
+
Parameters
+
    +
  • links – list of dictionaries

  • +
  • nodes – list of dictionaries

  • +
+
+
+

returns: None

+
+
Return type
+

None

+
+
+
+ +
+
+add_variable_using_shst_reference(var_shst_csvdata=None, shst_csv_variable=None, network_variable=None, network_var_type=<class 'int'>, overwrite=False)[source]
+

Join network links with source data, via SHST API node match result.

+
+
Parameters
+
    +
  • var_shst_csvdata (str) – File path to SHST API return.

  • +
  • shst_csv_variable (str) – Variable name in the source data.

  • +
  • network_variable (str) – Name of the variable that should be written to.

  • +
  • network_var_type – Variable type in the written network.

  • +
  • overwrite (bool) – True if overwriting existing variable. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+addition_map(links, nodes)
+

Shows which links and nodes are added to the roadway network

+
+ +
+
+apply(project_card_dictionary)
+

Wrapper method to apply a project to a roadway network.

+
+
Parameters
+

project_card_dictionary – dict +a dictionary of the project card object

+
+
+
+ +
+
+apply_managed_lane_feature_change(link_idx, properties, in_place=True)
+

Apply the managed lane feature changes to the roadway network

+
+
Parameters
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean to indicate whether to update self or return a new roadway network object

  • +
+
+
+
+
Return type
+

Union(None, RoadwayNetwork)

+
+
+
+ +
+
+apply_python_calculation(pycode, in_place=True)
+

Changes roadway network object by executing pycode.

+
+
Parameters
+
    +
  • pycode – python code which changes values in the roadway network object

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
Return type
+

Union(None, RoadwayNetwork)

+
+
+
+ +
+
+apply_roadway_feature_change(link_idx, properties, in_place=True)
+

Changes the roadway attributes for the selected features based on the +project card information passed

+
+
Parameters
+
    +
  • link_idx – list: indices of all links to apply change to

  • +
  • properties – list of dictionaries: roadway properties to change

  • +
  • in_place – boolean: update self or return a new roadway network object

  • +
+
+
Return type
+

Union(None, RoadwayNetwork)

+
+
+
+ +
+
+assess_connectivity(mode='', ignore_end_nodes=True, links_df=None, nodes_df=None)
+

Returns a network graph and list of disconnected subgraphs +as described by a list of their member nodes.

+
+
Parameters
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • ignore_end_nodes – if True, ignores stray singleton nodes

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+
Returns: Tuple of

Network Graph (osmnx flavored networkX DiGraph)
List of disconnected subgraphs described by the list of their member nodes (as described by their model_node_id)

+
+
+
+
+ +
+
+build_selection_key(selection_dict)
+

Selections are stored by a key combining the query and the A and B ids. +This method combines the two for you based on the selection dictionary.

+
+
Parameters
+

selection_dictionary – Selection Dictionary

+
+
+

Returns: Tuple serving as the selection key.

+
+
Return type
+

tuple

+
+
+
+ +
+
+calculate_area_type(area_type_shape=None, area_type_shape_variable=None, network_variable='area_type', area_type_codes_dict=None, downtown_area_type_shape=None, downtown_area_type=None, overwrite=False)[source]
+

Calculates area type variable.

+

This uses the centroid of the geometry field to determine which area it should be labeled. This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters
+
    +
  • area_type_shape (str) – The File path to area geodatabase.

  • +
  • area_type_shape_variable (str) – The variable name of area type in the area geodatabase.

  • +
  • network_variable (str) – The variable name of area type in network standard. Default to “area_type”.

  • +
  • area_type_codes_dict – The dictionary to map input area_type_shape_variable to network_variable

  • +
  • downtown_area_type_shape – The file path to the downtown area type boundary.

  • +
  • downtown_area_type (int) – Integer value of downtown area type

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_assign_group_and_roadway_class(assign_group_variable_name='assign_group', road_class_variable_name='roadway_class', mrcc_roadway_class_shape=None, mrcc_shst_data=None, mrcc_roadway_class_variable_shp=None, mrcc_assgngrp_dict=None, widot_roadway_class_shape=None, widot_shst_data=None, widot_roadway_class_variable_shp=None, widot_assgngrp_dict=None, osm_assgngrp_dict=None, overwrite=False)[source]
+

Calculates assignment group and roadway class variables.

+

Assignment Group is used in MetCouncil’s traffic assignment to segment the volume/delay curves. The original source for Minnesota is the MRCC data’s “route system”, which is a roadway class. For Wisconsin, it is from the Wisconsin DOT database, which has a variable called “roadway category”.

+

There is a crosswalk between the MRCC Route System and Wisconsin DOT –> Met Council Assignment group

+

This method joins the network with mrcc and widot roadway data by shst js matcher returns

+
+
Parameters
+
    +
  • assign_group_variable_name (str) – Name of the variable assignment group should be written to. Default to “assign_group”.

  • +
  • road_class_variable_name (str) – Name of the variable roadway class should be written to. Default to “roadway_class”.

  • +
  • mrcc_roadway_class_shape (str) – File path to the MRCC route system geodatabase.

  • +
  • mrcc_shst_data (str) – File path to the MRCC SHST match return.

  • +
  • mrcc_roadway_class_variable_shp (str) – Name of the variable where MRCC route system are stored.

  • +
  • mrcc_assgngrp_dict (dict) – Dictionary to map MRCC route system variable to assignment group.

  • +
  • widot_roadway_class_shape (str) – File path to the WIDOT roadway category geodatabase.

  • +
  • widot_shst_data (str) – File path to the WIDOT SHST match return.

  • +
  • widot_roadway_class_variable_shp (str) – Name of the variable where WIDOT roadway category are stored.

  • +
  • widot_assgngrp_dict (dict) – Dictionary to map WIDOT roadway category variable to assignment group.

  • +
  • osm_assgngrp_dict (dict) – Dictionary to map OSM roadway class to assignment group.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_centroidconnect(network_variable='centroidconnect', highest_taz_number=None, as_integer=True, overwrite=False, lanes_variable='lanes', number_of_lanes=1)[source]
+

Calculates centroid connector variable.

+
+
Parameters
+
    +
  • network_variable (str) – Variable that should be written to in the network. Default to “centroidconnect”

  • +
  • highest_taz_number (int) – the max TAZ number in the network.

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
  • lanes_variable (str) – Variable that stores the number of lanes. Default to “lanes”.

  • +
  • number_of_lanes (int) – Number of lanes for centroid connectors. Default to 1.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_county(county_shape=None, county_shape_variable=None, network_variable='county', county_codes_dict=None, overwrite=False)[source]
+

Calculates county variable.

+

This uses the centroid of the geometry field to determine which county it should be labeled. This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters
+
    +
  • county_shape (str) – The File path to county geodatabase.

  • +
  • county_shape_variable (str) – The variable name of county in the county geodatabase.

  • +
  • network_variable (str) – The variable name of county in network standard. Default to “county”.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_distance(network_variable='distance', centroidconnect_only=True, overwrite=False)[source]
+

calculate link distance in miles

+
+
Parameters
+
    +
  • centroidconnect_only (Bool) – True if calculating distance for centroidconnectors only. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_hov(network_variable='HOV', as_integer=True, overwrite=False)[source]
+

Calculates hov variable.

+
+
Parameters
+
    +
  • network_variable (str) – Variable that should be written to in the network. Default to “HOV”

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_mpo(county_network_variable='county', network_variable='mpo', as_integer=True, mpo_counties=None, overwrite=False)[source]
+

Calculates mpo variable.

+
+
Parameters
+
    +
  • county_variable (str) – Name of the variable where the county names are stored. Default to “county”.

  • +
  • network_variable (str) – Name of the variable that should be written to. Default to “mpo”.

  • +
  • as_integer (bool) – If true, will convert true/false to 1/0s.

  • +
  • mpo_counties (list) – List of county names that are within mpo region.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+convert_int(int_col_names=[])[source]
+

Convert integer columns

+
+ +
+
+create_ML_variable(network_variable='ML_lanes', overwrite=False)[source]
+

Creates ML lanes placeholder for project to write out ML changes

+

ML lanes default to 0; ML info comes from the cube LOG file and is stored in project cards

+
+
Parameters
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns
+

None

+
+
+
+ +
+
+create_calculated_variables()[source]
+

Creates calculated roadway variables.

+
+
Parameters
+

None

+
+
+
+ +
+ +

create dummy connector links between the general purpose and managed lanes

+
+
Parameters
+
    +
  • gp_df – GeoDataFrame: dataframe of general purpose links (where managed lane also exists)

  • +
  • ml_df – GeoDataFrame: dataframe of corresponding managed lane links

  • +
+
+
+
+ +
+
+create_hov_corridor_variable(network_variable='segment_id', overwrite=False)[source]
+

Creates hov corridor placeholder for project to write out corridor changes

+

hov corridor id defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns
+

None

+
+
+
+ +
+
+create_managed_lane_network(in_place=False)
+

Create a roadway network with managed lane links separated out. Add new parallel managed lane links, access/egress links, and add shapes corresponding to the new links.

+
+
Parameters
+

in_place – update self or return a new roadway network object

+
+
+

returns: A RoadwayNetwork instance

+
+
Return type
+

RoadwayNetwork

+
+
+
+ +
+
+create_managed_variable(network_variable='managed', overwrite=False)[source]
+

Creates placeholder for project to write out managed lane changes

+

managed defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns
+

None

+
+
+
+ +
+
+static dataframe_to_fixed_width(df)[source]
+

Convert dataframe to fixed width format, geometry column will not be transformed.

+
+
Parameters
+

df (pandas DataFrame) –

+
+
Returns
+

dataframe with fixed width for each column. +dict: dictionary with columns names as keys, column width as values.

+
+
Return type
+

pandas dataframe

+
+
+
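A small sketch of the two return values described above; the tuple order is assumed from the Returns description.

    fixed_df, max_width_dict = ModelRoadwayNetwork.dataframe_to_fixed_width(net.links_df)
    # fixed_df: values padded to a fixed width per column (geometry left untouched)
    # max_width_dict: {column name: column width}, used when writing the header/width files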
+ +
+
+delete_roadway_feature_change(links, nodes, ignore_missing=True)
+

Deletes the roadway features defined in the project card. Valid links and nodes defined in the project get deleted, and shapes corresponding to the deleted links are also deleted.

+
+
Parameters
+
    +
  • links – dict: list of dictionaries

  • +
  • nodes – dict: list of dictionaries

  • +
  • ignore_missing – bool: if True, will only warn about links/nodes that are missing from the network but specified to “delete” in the project card; if False, will fail.

  • +
+
+
Return type
+

None

+
+
+
+ +
+
+deletion_map(links, nodes)
+

Shows which links and nodes are deleted from the roadway network

+
+ +
+
+fill_na()[source]
+

Fill na values from create_managed_lane_network()

+
+ +
+
+static from_RoadwayNetwork(roadway_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters
+
    +
  • roadway_network_object (RoadwayNetwork) –

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
Returns
+

ModelRoadwayNetwork

+
+
+
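A sketch of promoting an existing network_wrangler RoadwayNetwork to a ModelRoadwayNetwork; the parameter override shown is illustrative.

    # roadway_net is an existing network_wrangler RoadwayNetwork instance
    model_net = ModelRoadwayNetwork.from_RoadwayNetwork(
        roadway_network_object=roadway_net,
        parameters={"output_epsg": 26915},  # optional overrides; this key is illustrative
    )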
+ +
+
+static get_attribute(links_df, join_key, source_shst_ref_df, source_gdf, field_name)[source]
+

Gets attribute from source data using SHST match result.

+
+
Parameters
+
    +
  • links_df (dataframe) – The network dataframe that new attribute should be written to.

  • +
  • join_key (str) – SHST ID variable name used to join source data with network dataframe.

  • +
  • source_shst_ref_df (str) – File path to source data SHST match result.

  • +
  • source_gdf (str) – File path to source data.

  • +
  • field_name (str) – Name of the attribute to get from source data.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+static get_managed_lane_node_ids(nodes_list)
+
+ +
+
+static get_modal_graph(links_df, nodes_df, mode=None)
+

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
+
+
+

Returns: networkx: osmnx: DiGraph of network

+
+ +
+ +

Returns nodes and link dataframes for specific mode.

+
+
Parameters
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • modes – list of the modes of the network to be kept, must be in drive,`transit`,`rail`,`bus`, +walk, bike. For example, if bike and walk are selected, both bike and walk links will be kept.

  • +
+
+
+

Returns: tuple of DataFrames for links, nodes filtered by mode

+

Note: links with walk access are not marked as having walk access. Issue discussed in https://github.com/wsp-sag/network_wrangler/issues/145.
modal_nodes_df = nodes_df[nodes_df[mode_node_variable] == 1]

+
+ +
+
+get_property_by_time_period_and_group(property, time_period=None, category=None)
+

Return a series for the properties with a specific group or time period.

+
+
Parameters
+
    +
  • property (str) – the variable that you want from network

  • +
  • time_period (list(str)) – the time period that you are querying for +i.e. [‘16:00’, ‘19:00’]

  • +
  • category (str or list(str)(Optional)) –

    the group category +i.e. “sov”

    +

    or

    +

    list of group categories in order of search, i.e. +[“hov3”,”hov2”]

    +

  • +
+
+
Returns
+

+
+
Return type
+

pandas series

+
+
+
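For example, pulling the AM values of a property (a hedged sketch; the property and category names follow the Parameters documentation elsewhere in these docs):

    am_lanes = net.get_property_by_time_period_and_group(
        "lanes", time_period=["6:00", "9:00"]
    )
    am_sov_price = net.get_property_by_time_period_and_group(
        "price", time_period=["6:00", "9:00"], category=["sov", "default"]
    )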
+ +
+
+is_network_connected(mode=None, links_df=None, nodes_df=None)
+

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters
+
    +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
+
+
+

Returns: boolean

+
+ +
+
+static network_connection_plot(G, disconnected_subgraph_nodes)
+

Plot a graph to check for network connection.

+
+
Parameters
+
    +
  • G – OSMNX flavored networkX graph.

  • +
  • disconnected_subgraph_nodes – List of disconnected subgraphs described by the list of their +member nodes (as described by their model_node_id).

  • +
+
+
+

returns: fig, ax : tuple

+
+ +
+
+orig_dest_nodes_foreign_key(selection, node_foreign_key='')
+

Returns the foreign key id (whatever is used in the u and v +variables in the links file) for the AB nodes as a tuple.

+
+
Parameters
+
    +
  • selection – selection dictionary with A and B keys

  • +
  • node_foreign_key – variable name for whatever is used by the u and v variables in the links_df file. If nothing is specified, assumes whatever the default is.

  • +
+
+
+

Returns: tuple of (A_id, B_id)

+
+
Return type
+

tuple

+
+
+
+ +
+
+static ox_graph(nodes_df, links_df)
+

create an osmnx-flavored network graph

+

osmnx doesn’t like values that are arrays, so remove the variables +that have arrays. osmnx also requires that certain variables +be filled in, so do that too.

+
+
Parameters
+
    +
  • nodes_df – GeoDataFrame of nodes

  • +
  • link_df – GeoDataFrame of links

  • +
+
+
+

Returns: a networkx multidigraph

+
+ +
+
+static read(link_file, node_file, shape_file, fast=False, recalculate_calculated_variables=False, recalculate_distance=False, parameters={})[source]
+

Reads in links and nodes network standard.

+
+
Parameters
+
    +
  • link_file (str) – File path to link json.

  • +
  • node_file (str) – File path to node geojson.

  • +
  • shape_file (str) – File path to link true shape geojson

  • +
  • fast (bool) – boolean that will skip validation to speed up read time.

  • +
  • recalculate_calculated_variables (bool) – calculates fields from spatial joins, etc.

  • +
  • recalculate_distance (bool) – re-calculates distance.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
Returns
+

ModelRoadwayNetwork

+
+
+
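A minimal sketch of reading a standard network into a ModelRoadwayNetwork; the file names are hypothetical.

    net = ModelRoadwayNetwork.read(
        link_file="network/link.json",      # hypothetical path
        node_file="network/node.geojson",   # hypothetical path
        shape_file="network/shape.geojson", # hypothetical path
        fast=True,                          # skip validation for a quicker read
    )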
+ +
+
+static read_match_result(path)[source]
+

Reads the shst geojson match returns.

+

Returns shst dataframe.

+

Reading lots of same type of file and concatenating them into a single DataFrame.

+
+
Parameters
+

path (str) – File path to SHST match results.

+
+
Returns
+

geopandas geodataframe

+
+
Return type
+

geodataframe

+
+
+

##todo: not sure why we need this, but it should be in utilities, not this class

+
+ +
+
+rename_variables_for_dbf(input_df, variable_crosswalk=None, output_variables=None, convert_geometry_to_xy=False)[source]
+

Rename attributes for DBF/SHP, make sure length within 10 chars.

+
+
Parameters
+
    +
  • input_df (dataframe) – Network standard DataFrame.

  • +
  • variable_crosswalk (str) – File path to variable name crosswalk from network standard to DBF names.

  • +
  • output_variables (list) – List of strings for DBF variables.

  • +
  • convert_geometry_to_xy (bool) – True if converting node geometry to X/Y

  • +
+
+
Returns
+

dataframe

+
+
+
+ +
+
+static roadway_net_to_gdf(roadway_net)
+

Turn the roadway network into a GeoDataFrame +:param roadway_net: the roadway network to export

+

returns: shapes dataframe

+
+
Return type
+

GeoDataFrame

+
+
+
+ +
+
+roadway_standard_to_met_council_network(output_epsg=None)[source]
+

Rename and format roadway attributes to be consistent with what metcouncil’s model is expecting.

+
+
Parameters
+

output_epsg (int) – epsg number of output network.

+
+
Returns
+

None

+
+
+
+ +
+
+select_roadway_features(selection, search_mode='drive', force_search=False)
+

Selects roadway features that satisfy selection criteria

+
+
Example usage:
+
net.select_roadway_features(
+
selection = [ {

# a match condition for the from node using osm,
# shared streets, or model node number
‘from’: {‘osm_model_link_id’: ‘1234’},
# a match for the to-node..
‘to’: {‘shstid’: ‘4321’},
# a regex or match for facility condition
# could be # of lanes, facility type, etc.
‘facility’: {‘name’:’Main St’},
}, … ])

+
+
+
+
+
+
+
+
Parameters
+

selection – dictionary with keys for:
    A - from node
    B - to node
    link - which includes at least a variable for name

+
+
+

Returns: a list of node foreign IDs on shortest path

+
+
Return type
+

GeoDataFrame

+
+
+
+ +
+ +
+
Parameters
+

selection_dictionary – Dictionary representation of selection +of roadway features, containing a “link” key.

+
+
+
+
Returns: A boolean indicating if the selection dictionary contains

a required unique link id.

+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+selection_map(selected_link_idx, A=None, B=None, candidate_link_idx=[])
+

Shows which links are selected for roadway property change or parallel +managed lanes category of roadway projects.

+
+
Parameters
+
    +
  • selected_links_idx – list of selected link indices

  • +
  • candidate_links_idx – optional list of candidate link indices to also include in map

  • +
  • A – optional foreign key of starting node of a route selection

  • +
  • B – optional foreign key of ending node of a route selection

  • +
+
+
+
+ +
+
+split_properties_by_time_period_and_category(properties_to_split=None)[source]
+

Splits properties by time period, assuming a variable structure of

+
+
Parameters
+

properties_to_split

dict: dictionary of output variable prefix mapped to the source variable and what to stratify it by, e.g.

{
    ‘trn_priority’: {‘v’: ‘trn_priority’, ‘times_periods’: {“AM”: (“6:00”, “9:00”), “PM”: (“16:00”, “19:00”)}},
    ‘ttime_assert’: {‘v’: ‘ttime_assert’, ‘times_periods’: {“AM”: (“6:00”, “9:00”), “PM”: (“16:00”, “19:00”)}},
    ‘lanes’: {‘v’: ‘lanes’, ‘times_periods’: {“AM”: (“6:00”, “9:00”), “PM”: (“16:00”, “19:00”)}},
    ‘ML_lanes’: {‘v’: ‘ML_lanes’, ‘times_periods’: {“AM”: (“6:00”, “9:00”), “PM”: (“16:00”, “19:00”)}},
    ‘price’: {‘v’: ‘price’, ‘times_periods’: {“AM”: (“6:00”, “9:00”), “PM”: (“16:00”, “19:00”)},
              ‘categories’: {“sov”: [“sov”, “default”], “hov2”: [“hov2”, “default”, “sov”]}},
    ‘access’: {‘v’: ‘access’, ‘times_periods’: {“AM”: (“6:00”, “9:00”), “PM”: (“16:00”, “19:00”)}},
}

+

+
+
+
+ +
+ +

Validate roadway network data link schema and output a boolean

+
+ +
+
+static validate_node_schema(node_file, schema_location='roadway_network_node.json')
+

Validate roadway network data node schema and output a boolean

+
+ +
+
+static validate_object_types(nodes, links, shapes)
+

Determines if the roadway network is being built with the right object types. +Does not validate schemas.

+
+
Parameters
+
    +
  • nodes – nodes geodataframe

  • +
  • links – link geodataframe

  • +
  • shapes – shape geodataframe

  • +
+
+
+

Returns: boolean

+
+ +
+
+validate_properties(properties, ignore_existing=False, require_existing_for_change=False)
+

If there are change or existing commands, make sure that the property exists in the network.

+
+
Parameters
+
    +
  • properties – properties dictionary to be evaluated

  • +
  • ignore_existing – If True, will only warn about properties +that specify an “existing” value. If False, will fail.

  • +
  • require_existing_for_change – If True, will fail if there isn’t a specified value in the project card for existing when a change is specified.

  • +
+
+
+

Returns: boolean value as to whether the properties dictionary is valid.

+
+
Return type
+

bool

+
+
+
+ +
+
+validate_selection(selection)
+

Evaluate whether the selection dictionary contains the minimum required values.

+
+
Parameters
+

selection – selection dictionary to be evaluated

+
+
+

Returns: boolean value as to whether the selection dictionary is valid.

+
+
Return type
+

Bool

+
+
+
+ +
+
+static validate_shape_schema(shape_file, schema_location='roadway_network_shape.json')
+

Validate roadway network data shape schema and output a boolean

+
+ +
+
+validate_uniqueness()
+

Confirms that the unique identifiers are met.

+
+
Return type
+

Bool

+
+
+
+ +
+
+write(path='.', filename=None)
+

Writes a network in the roadway network standard

+
+
Parameters
+
    +
  • path – the path where the output will be saved

  • +
  • filename – the name prefix of the roadway files that will be generated

  • +
+
+
Return type
+

None

+
+
+
+ +
+
+write_roadway_as_fixedwidth(node_output_variables=None, link_output_variables=None, output_link_txt=None, output_node_txt=None, output_link_header_width_txt=None, output_node_header_width_txt=None, output_cube_network_script=None, drive_only=False)[source]
+

Writes out fixed width file.

+

This function does:
1. write out link and node fixed width data files for cube.
2. write out header and width correspondence.
3. write out cube network building script with header and width specification.

+
+
Parameters
+
    +
  • node_output_variables (list) – list of node variable names.

  • +
  • link_output_variables (list) – list of link variable names.

  • +
  • output_link_txt (str) – File path to output link database.

  • +
  • output_node_txt (str) – File path to output node database.

  • +
  • output_link_header_width_txt (str) – File path to link column width records.

  • +
  • output_node_header_width_txt (str) – File path to node column width records.

  • +
  • output_cube_network_script (str) – File path to CUBE network building script.

  • +
  • drive_only (bool) – If True, only writes drive nodes and links

  • +
+
+
Returns
+

None

+
+
+
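A hedged sketch of writing the fixed-width outputs; the output paths are hypothetical, and when they are omitted the defaults listed in the Parameters class appear to be used.

    net.write_roadway_as_fixedwidth(
        output_link_txt="scratch/links.txt",    # hypothetical path
        output_node_txt="scratch/nodes.txt",    # hypothetical path
        output_link_header_width_txt="scratch/links_header_width.txt",
        output_node_header_width_txt="scratch/nodes_header_width.txt",
        output_cube_network_script="scratch/make_network.s",
        drive_only=False,
    )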
+ +
+
+write_roadway_as_shp(node_output_variables=None, link_output_variables=None, data_to_csv=True, data_to_dbf=False, output_link_shp=None, output_node_shp=None, output_link_csv=None, output_node_csv=None)[source]
+

Write out dbf/shp for cube. Write out csv in addition to shp with full length variable names.

+
+
Parameters
+
    +
  • node_output_variables (list) – List of strings for node output variables.

  • +
  • link_output_variables (list) – List of strings for link output variables.

  • +
  • data_to_csv (bool) – True if write network in csv format.

  • +
  • data_to_dbf (bool) – True if write network in dbf/shp format.

  • +
  • output_link_shp (str) – File path to output link dbf/shp.

  • +
  • output_node_shp (str) – File path to output node dbf/shp.

  • +
  • output_link_csv (str) – File path to output link csv.

  • +
  • output_node_csv (str) – File path to output node csv.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'assign_group', 'centroidconnect']
+
+ +
+
+CRS = 4326
+
+ +
+
+KEEP_SAME_ATTRIBUTES_ML_AND_GP = ['distance', 'bike_access', 'drive_access', 'transit_access', 'walk_access', 'maxspeed', 'name', 'oneway', 'ref', 'roadway', 'length', 'segment_id']
+
+ +
+ +
+ +
+ +
+ +
+
+MANAGED_LANES_NODE_ID_SCALAR = 500000
+
+ +
+
+MANAGED_LANES_REQUIRED_ATTRIBUTES = ['A', 'B', 'model_link_id', 'locationReferences']
+
+ +
+
+MANAGED_LANES_SCALAR = 500000
+
+ +
+
+MAX_SEARCH_BREADTH = 10
+
+ +
+ +
+ +
+
+MODES_TO_NETWORK_NODE_VARIABLES = {'bike': ['bike_node'], 'bus': ['bus_only', 'drive_node'], 'drive': ['drive_node'], 'rail': ['rail_only', 'drive_node'], 'transit': ['bus_only', 'rail_only', 'drive_node'], 'walk': ['walk_node']}
+
+ +
+
+NODE_FOREIGN_KEY = 'model_node_id'
+
+ +
+
+SEARCH_BREADTH = 5
+
+ +
+
+SELECTION_REQUIRES = ['link']
+
+ +
+
+SP_WEIGHT_FACTOR = 100
+
+ +
+ +
+ +
+ +
+ +
+
+UNIQUE_NODE_IDENTIFIERS = ['model_node_id']
+
+ +
+
+UNIQUE_NODE_KEY = 'model_node_id'
+
+ +
+
+UNIQUE_SHAPE_KEY = 'shape_id'
+
+ +
+ +
+ + +
+ +
+ + +
+
+ +
+ +
\ No newline at end of file
diff --git a/_generated/lasso.Parameters.html b/_generated/lasso.Parameters.html
new file mode 100644
index 0000000..ca2a0fc
--- /dev/null
+++ b/_generated/lasso.Parameters.html
@@ -0,0 +1,630 @@
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ + + + +
+
+
+
+ +
+

lasso.Parameters

+
+
+class lasso.Parameters(**kwargs)[source]
+

Bases: object

+

A class representing all the parameters defining the networks +including time of day, categories, etc.

+

Parameters can be set at runtime by initializing a parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use the default parameters listed in this class.
##TODO potentially split this between several classes.

+
+
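For example, a minimal sketch of overriding a couple of the defaults listed below at runtime; attributes that are not passed keep their defaults, and the override values shown are illustrative.

    parameters = Parameters(
        output_epsg=26915,
        cube_time_periods={"1": "AM", "2": "MD", "3": "PM", "4": "NT"},  # illustrative override
    )
    # methods documented in these pages accept either this instance or a plain settings dict
    net = ModelRoadwayNetwork.read(
        link_file="network/link.json",      # hypothetical path
        node_file="network/node.geojson",   # hypothetical path
        shape_file="network/shape.geojson", # hypothetical path
        parameters=parameters,
    )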
Attr:
+
time_period_to_time (dict): Maps time period abbreviations used in

Cube to time of days used on gtfs and highway network standard +Default:

+
{
+    "AM": ("6:00", "9:00"),
+    "MD": ("9:00", "16:00"),
+    "PM": ("16:00", "19:00"),
+    "NT": ("19:00", "6:00"),
+}
+
+
+
+
cube_time_periods (dict): Maps cube time period numbers used in

transit line files to the time period abbreviations in time_period_to_time +dictionary. +Default:

+
{"1": "AM", "2": "MD"}
+
+
+
+
categories (dict): Maps demand category abbreviations to a list of

network categories they are allowed to use. +Default:

+
{
+    # suffix, source (in order of search)
+    "sov": ["sov", "default"],
+    "hov2": ["hov2", "default", "sov"],
+    "hov3": ["hov3", "hov2", "default", "sov"],
+    "truck": ["trk", "sov", "default"],
+}
+
+
+
+
properties_to_split (dict): Dictionary mapping variables in standard

roadway network to categories and time periods that need to be +split out in final model network to get variables like LANES_AM. +Default:

+
{
+    "trn_priority": {
+        "v": "trn_priority",
+        "time_periods": self.time_periods_to_time,
+    },
+    "ttime_assert": {
+        "v": "ttime_assert",
+        "time_periods": self.time_periods_to_time,
+    },
+    "lanes": {"v": "lanes", "time_periods": self.time_periods_to_time},
+    "price": {
+        "v": "price",
+        "time_periods": self.time_periods_to_time,
+        "categories": self.categories,
+    },
+    "access": {"v": "access", "time_periods": self.time_periods_to_time},
+}
+
+
+
+
county_shape (str): File location of shapefile defining counties.

Default:

+
r"metcouncil_data/county/cb_2017_us_county_5m.shp"
+
+
+
+
county_variable_shp (str): Property defining the county name in

the county_shape file. +Default:

+
NAME
+
+
+
+
lanes_lookup_file (str): Lookup table of number of lanes for different data sources.

Default:

+
r"metcouncil_data/lookups/lanes.csv"
+
+
+
+
centroid_connect_lanes (int): Number of lanes for centroid connectors.

Default:

+
1
+
+
+
+
mpo_counties (list): list of county names within MPO boundary.

Default:

+
[
+    "ANOKA",
+    "DAKOTA",
+    "HENNEPIN",
+    "RAMSEY",
+    "SCOTT",
+    "WASHINGTON",
+    "CARVER",
+]
+
+
+
+
taz_shape (str):

Default:

+
r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp"
+
+
+
+
taz_data (str):

Default:

+
??
+
+
+
+
highest_taz_number (int): highest TAZ number in order to define

centroid connectors. +Default:

+
3100
+
+
+
+
output_variables (list): list of variables to output in final model

network. +Default:

+
[
+    "model_link_id",
+    "A",
+    "B",
+    "shstGeometryId",
+    "distance",
+    "roadway",
+    "name",
+    "roadway_class",
+    "bike_access",
+    "transit_access",
+    "walk_access",
+    "drive_access",
+    "truck_access",
+    "trn_priority_AM",
+    "trn_priority_MD",
+    "trn_priority_PM",
+    "trn_priority_NT",
+    "ttime_assert_AM",
+    "ttime_assert_MD",
+    "ttime_assert_PM",
+    "ttime_assert_NT",
+    "lanes_AM",
+    "lanes_MD",
+    "lanes_PM",
+    "lanes_NT",
+    "price_sov_AM",
+    "price_hov2_AM",
+    "price_hov3_AM",
+    "price_truck_AM",
+    "price_sov_MD",
+    "price_hov2_MD",
+    "price_hov3_MD",
+    "price_truck_MD",
+    "price_sov_PM",
+    "price_hov2_PM",
+    "price_hov3_PM",
+    "price_truck_PM",
+    "price_sov_NT",
+    "price_hov2_NT",
+    "price_hov3_NT",
+    "price_truck_NT",
+    "roadway_class_idx",
+    "assign_group",
+    "access_AM",
+    "access_MD",
+    "access_PM",
+    "access_NT",
+    "mpo",
+    "area_type",
+    "county",
+    "centroidconnect",
+    "AADT",
+    "count_year",
+    "count_AM",
+    "count_MD",
+    "count_PM",
+    "count_NT",
+    "count_daily",
+    "model_node_id",
+    "N",
+    "osm_node_id",
+    "bike_node",
+    "transit_node",
+    "walk_node",
+    "drive_node",
+    "geometry",
+    "X",
+    "Y",
+]
+
+
+
+
area_type_shape (str): Location of shapefile defining area type.

Default:

+
r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp"
+
+
+
+
area_type_variable_shp (str): property in area_type_shape with area

type in it. +Default:

+
"COMDES2040"
+
+
+
+
area_type_code_dict (dict): Mapping of the area_type_variable_shp to

the area type code used in the MetCouncil cube network. +Default:

+
{
+    23: 4,  # urban center
+    24: 3,
+    25: 2,
+    35: 2,
+    36: 1,
+    41: 1,
+    51: 1,
+    52: 1,
+    53: 1,
+    60: 1,
+}
+
+
+
+
downtown_area_type_shape (str): Location of shapefile defining downtown area type.

Default:

+
r"metcouncil_data/area_type/downtownzones_TAZ.shp"
+
+
+
+
downtown_area_type (int): Area type integer for downtown.

Default:

+
5
+
+
+
+
mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property

associated with roadway class. Default:

+
r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp"
+
+
+
+
mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp

associated with roadway class. Default:

+
"ROUTE_SYS"
+
+
+
+
widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property

associated with roadway class. Default:

+
r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp"
+
+
+
+
widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape

associated with roadway class.Default:

+
"RDWY_CTGY_"
+
+
+
+
mndot_count_shape (str): Shapefile of MnDOT links with a property

associated with counts. Default:

+
r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp"
+
+
+
+
mndot_count_variable_shp (str): The property in mndot_count_shape

associated with counts. Default:

+
"AADT_mn"
+
+
+
+
widot_count_shape (str): Shapefile of Wisconsin DOT links with a property

associated with counts. Default:

+
r"metcouncil_data/Wisconsin_Lanes_Counts_Median/TRADAS_(counts).shp"
+
+
+
+
widot_count_variable_shp (str): The property in widot_count_shape

associated with counts. Default:

+
"AADT_wi"
+
+
+
+
mrcc_shst_data (str): MnDOT MRCC to Shared Streets crosswalk. Default:
+
::

r”metcouncil_data/mrcc/mrcc.out.matched.csv”

+
+
+
+
widot_shst_data (str): WisconsinDOT to Shared Streets crosswalk.Default:
+
::

r”metcouncil_data/Wisconsin_Lanes_Counts_Median/widot.out.matched.geojson”

+
+
+
+
mndot_count_shst_data (str): MetCouncil count data with ShST Default:
+
::

r”metcouncil_data/count_mn/mn_count_ShSt_API_match.csv”

+
+
+
+
widot_count_shst_data (str): WisconsinDOT count data with ShST Default:
+
::

r”metcouncil_data/Wisconsin_Lanes_Counts_Median/wi_count_ShSt_API_match.csv”,

+
+
+
+
mrcc_assgngrp_dict (str): Mapping between MRCC ROUTE_SYS variable

and assignment group. Default:

+
"lookups/mrcc_route_sys_asgngrp_crosswalk.csv"
+
+
+
+
widot_assgngrp_dict (dict): Mapping between Wisconsin DOT RDWY_CTGY_

variable and assignment group. Default:

+
"lookups/widot_ctgy_asgngrp_crosswalk.csv"
+
+
+
+
osm_assgngrp_dict (dict): Mapping between OSM Roadway variable

and assignment group. Default:

+
"lookups/osm_highway_asgngrp_crosswalk.csv"
+
+
+
+
roadway_class_dict (str): Mapping between assignment group and

roadway class. Default:

+
"lookups/asgngrp_rc_num_crosswalk.csv"
+
+
+
+
output_epsg (int): EPSG type of geographic projection for output

shapefiles. Default:

+
26915
+
+
+
+
net_to_dbf (str): Lookup of network variables to DBF compliant

lengths. Default:

+
"examples/settings/net_to_dbf.csv"
+
+
+
+
output_link_shp (str): Output shapefile for roadway links. Default:
+
::

r”tests/scratch/links.shp”

+
+
+
+
output_node_shp (str): Output shapefile for roadway nodes. Default:
+
::

r”tests/scratch/nodes.shp”

+
+
+
+
output_link_csv (str): Output csv for roadway links. Default:
+
::

r”tests/scratch/links.csv”

+
+
+
+
output_node_csv (str): Output csv for roadway nodes. Default:
+
::

r”tests/scratch/nodes.csv”

+
+
+
+
output_link_txt (str): Output fixed format txt for roadway links. Default:
+
::

r"tests/scratch/links.txt"

+
+
+
+
output_node_txt (str): Output fixed format txt for roadway nodes. Default:
+
::

r"tests/scratch/nodes.txt"

+
+
+
+
output_link_header_width_txt (str): Header for the fixed-format txt for roadway links. Default:
+
::

r"tests/scratch/links_header_width.txt"

+
+
+
+
output_node_header_width_txt (str): Header for the fixed-format txt for roadway nodes. Default:
+
::

r"tests/scratch/nodes_header_width.txt"

+
+
+
+
output_cube_network_script (str): Cube script for importing

fixed-format roadway network. Default:

+
r"tests/scratch/make_complete_network_from_fixed_width_file.s
+
+
+
+
+
+
+
+
+__init__(**kwargs)[source]
+

Time period and category splitting info

+
+ +
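Because the constructor accepts arbitrary keyword arguments and applies them over the defaults, any attribute documented above can be overridden at construction time. A minimal sketch with illustrative values:

from lasso.parameters import Parameters

# lasso_base_dir is a hypothetical path; it (or one of its parent directories)
# must contain the metcouncil_data folder.
params = Parameters(
    lasso_base_dir=r"C:/projects/lasso",
    output_epsg=26915,
    downtown_area_type=5,
)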

Methods

+ ++++ + + + + + +

__init__(**kwargs)

Time period and category splitting info

+
+
+output_epsg
+

EPSG code of the geographic projection used for output shapefiles.

+
+ +
+
+properties_to_split
+

Dictionary mapping variables in the standard roadway network to the categories and time periods that need to be split out in the final model network (e.g. LANES_AM).

+
+ +
+ +
+ + +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_generated/lasso.Project.html b/_generated/lasso.Project.html new file mode 100644 index 0000000..56f865d --- /dev/null +++ b/_generated/lasso.Project.html @@ -0,0 +1,525 @@ + + + + + + + + + + lasso.Project — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ + + + +
+
+
+
+ +
+

lasso.Project

+
+
+class lasso.Project(roadway_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, build_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Bases: object

+

A single or set of changes to the roadway or transit system.

+

Compares a base and a build transit network or a base and build +highway network and produces project cards.

+

Typical usage example:

+
test_project = Project.create_project(
+    base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+test_project.evaluate_changes()
+test_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+)
+
+
+
+
+DEFAULT_PROJECT_NAME
+

a class-level constant that defines what +the project name will be if none is set.

+
+ +
+
+STATIC_VALUES
+

a class-level constant which defines values that +are not evaluated when assessing changes.

+
+ +
+
+card_data
+

{"project": <project_name>, "changes": <list of change dicts>}

+
+
Type
+

dict

+
+
+
+ +
+
+roadway_changes
+

pandas dataframe of CUBE roadway changes.

+
+
Type
+

DataFrame

+
+
+
+ +
+
+transit_changes
+
+
Type
+

CubeTransit

+
+
+
+ +
+
+base_roadway_network
+
+
Type
+

RoadwayNetwork

+
+
+
+ +
+
+base_transit_network
+
+
Type
+

CubeTransit

+
+
+
+ +
+
+build_transit_network
+
+
Type
+

CubeTransit

+
+
+
+ +
+
+project_name
+

name of the project, set to DEFAULT_PROJECT_NAME if not provided

+
+
Type
+

str

+
+
+
+ +
+
+parameters
+

an instance of the Parameters class which sets a bunch of parameters

+
+ +
+
+__init__(roadway_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, build_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Project constructor.

+
+
Parameters
+
    +
  • roadway_changes – dataframe of roadway changes read from a log file

  • +
  • transit_changes

  • +
  • base_roadway_network – RoadwayNetwork instance for base case

  • +
  • base_transit_network – CubeTransit instance for base transit network

  • +
  • build_transit_network – CubeTransit instance for build transit network

  • +
  • project_name – name of the project

  • +
  • evaluate – defaults to False; if True, will create card data

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
+

returns: instance of Project

+
+ +

Methods

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([roadway_changes, transit_changes, …])

Project constructor.

add_highway_changes([…])

Evaluates changes from the log file based on the base highway object and adds entries into the self.card_data dictionary.

add_transit_changes()

Evaluates changes between base and build transit objects and adds entries into the self.card_data dictionary.

create_project([roadway_log_file, …])

Constructor for a Project instance.

determine_roadway_network_changes_compatability(…)

Checks that any links or nodes that are changed exist in the base roadway network.

evaluate_changes()

Determines which changes should be evaluated, initiates self.card_data to be an aggregation of transit and highway changes.

read_logfile(logfilename)

Reads a Cube log file and returns a dataframe of roadway_changes

write_project_card([filename])

Writes project cards.

+

Attributes

+ ++++ + + + + + + + + + + + +

CALCULATED_VALUES

DEFAULT_PROJECT_NAME

STATIC_VALUES

+
+
+add_highway_changes(limit_variables_to_existing_network=False)[source]
+

Evaluates changes from the log file based on the base highway object and +adds entries into the self.card_data dictionary.

+
+
Parameters
+

limit_variables_to_existing_network (bool) – True to restrict changes to variables that already exist in the base network (no ad-hoc variables). Defaults to False.

+
+
+
+ +
+
+add_transit_changes()[source]
+

Evaluates changes between base and build transit objects and +adds entries into the self.card_data dictionary.

+
+ +
+
+static create_project(roadway_log_file=None, roadway_shp_file=None, roadway_csv_file=None, base_roadway_dir=None, base_transit_source=None, build_transit_source=None, roadway_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, build_transit_network=None, project_name=None, recalculate_calculated_variables=False, recalculate_distance=False, parameters={})[source]
+

Constructor for a Project instance.

+
+
Parameters
+
    +
  • roadway_log_file – File path to the Cube log file to consume, or a list of logfile paths.

  • +
  • roadway_shp_file – File path to the shapefile of roadway changes to consume.

  • +
  • roadway_csv_file – File path to the csv file of roadway changes to consume.

  • +
  • base_roadway_dir – Folder path to base roadway network.

  • +
  • base_transit_source – Folder path to base transit network or cube line file string.

  • +
  • base_transit_file – File path to base transit network.

  • +
  • build_transit_source – Folder path to build transit network or cube line file string.

  • +
  • build_transit_file – File path to build transit network.

  • +
  • roadway_changes – pandas dataframe of CUBE roadway changes.

  • +
  • transit_changes – build transit changes.

  • +
  • base_roadway_network – Base roadway network object.

  • +
  • base_transit_network – Base transit network object.

  • +
  • build_transit_network – Build transit network object.

  • +
  • project_name – If not provided, will default to the roadway_log_file filename if provided (or the first filename if a list is provided)

  • +
  • recalculate_calculated_variables – if True when reading in a base network, recalculates variables such as area type. This only needs to be True if you are creating project cards that change the calculated variables.

  • +
  • recalculate_distance – recalculate the distance variable. This only needs to be true if you are creating project cards that change the distance.

  • +
  • parameters – dictionary of parameters

  • +
+
+
Returns
+

A Project instance.

+
+
+
+ +
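As a complement to the transit example at the top of this page, a sketch of the roadway-log workflow; the log file name and BASE_ROADWAY_DIR are illustrative, and the base roadway directory is expected to contain link.json, node.geojson, and shape.geojson:

test_roadway_project = Project.create_project(
    roadway_log_file=os.path.join(CUBE_DIR, "roadway_changes.log"),
    base_roadway_dir=BASE_ROADWAY_DIR,
    project_name="example_roadway_project",
)
test_roadway_project.write_project_card(
    os.path.join(SCRATCH_DIR, "example_roadway_project.yml")
)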
+
+static determine_roadway_network_changes_compatability(base_roadway_network, roadway_changes, parameters)[source]
+

Checks that any links or nodes that are changed exist in the base roadway network.

+
+ +
+
+evaluate_changes()[source]
+

Determines which changes should be evaluated, initiates +self.card_data to be an aggregation of transit and highway changes.

+
+ +
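When a Project is built directly (rather than through create_project, which evaluates on construction), the card data can be generated and written explicitly; log_df and base_net below are illustrative objects:

project = Project(
    roadway_changes=log_df,         # e.g. the output of Project.read_logfile()
    base_roadway_network=base_net,  # a base RoadwayNetwork instance
)
project.evaluate_changes()
project.write_project_card("example_project.yml")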
+
+static read_logfile(logfilename)[source]
+

Reads a Cube log file and returns a dataframe of roadway_changes

+
+
Parameters
+

logfilename (str or list[str]) – File path to CUBE logfile or list of logfile paths.

+
+
Returns
+

A DataFrame representation of the log file.

+
+
Return type
+

DataFrame

+
+
+
+ +
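A quick sketch of calling it directly (the file names are hypothetical); passing a list concatenates the logs into a single DataFrame:

log_df = Project.read_logfile("phase1_changes.log")
log_df = Project.read_logfile(["phase1_changes.log", "phase2_changes.log"])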
+
+write_project_card(filename=None)[source]
+

Writes project cards.

+
+
Parameters
+

filename (str) – File path to output .yml

+
+
Returns
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'assign_group', 'centroidconnect']
+
+ +
+
+DEFAULT_PROJECT_NAME = 'USER TO define'
+
+ +
+
+STATIC_VALUES = ['model_link_id', 'area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_generated/lasso.StandardTransit.html b/_generated/lasso.StandardTransit.html new file mode 100644 index 0000000..afb251c --- /dev/null +++ b/_generated/lasso.StandardTransit.html @@ -0,0 +1,498 @@ + + + + + + + + + + lasso.StandardTransit — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ + + + +
+
+
+
+ +
+

lasso.StandardTransit

+
+
+class lasso.StandardTransit(ptg_feed, parameters={})[source]
+

Bases: object

+

Holds a standard transit feed as a Partridge object and contains +methods to manipulate and translate the GTFS data to MetCouncil’s +Cube Line files.

+

Typical usage example:

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+feed
+

Partridge Feed object containing read-only access to GTFS feed

+
+ +
+
+parameters
+

Parameters instance containing information +about time periods and variables.

+
+
Type
+

Parameters

+
+
+
+ +
+
+__init__(ptg_feed, parameters={})[source]
+
+
Parameters
+
    +
  • ptg_feed – partridge feed object

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters

  • +
+
+
+
+ +

Methods

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(ptg_feed[, parameters])

+
param ptg_feed
+

partridge feed object

+
+
+

calculate_cube_mode(row)

Assigns a cube mode number by following logic.

cube_format(row)

Creates a string representing the route in cube line file notation.

fromTransitNetwork(transit_network_object[, …])

Converts a TransitNetwork instance to a StandardTransit instance.

read_gtfs(gtfs_feed_dir[, parameters])

Reads GTFS files from a directory and returns a StandardTransit instance.

route_properties_gtfs_to_cube(self)

Prepare gtfs for cube lin file.

shape_gtfs_to_cube(row)

Creates a list of nodes for the route in the appropriate cube format.

time_to_cube_time_period(start_time_secs[, …])

Converts seconds from midnight to the cube time period.

write_as_cube_lin([outpath])

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties.

+
+
+calculate_cube_mode(row)[source]
+

Assigns a cube mode number by following logic.

+

For rail, uses GTFS route_type variable: +https://developers.google.com/transit/gtfs/reference

+
+
::

# route_type : cube_mode
route_type_to_cube_mode = {
    0: 8,  # Tram, Streetcar, Light rail
    3: 0,  # Bus; further disaggregated for cube
    2: 9,  # Rail
}

+
+
+
+

For buses, uses route id numbers and route name to find +express and suburban buses as follows:

+
+
::
+
if not cube_mode:
    if 'express' in row['LONGNAME'].lower():
        cube_mode = 7  # Express
    elif int(row['route_id'].split("-")[0]) > 99:
        cube_mode = 6  # Suburban Local
    else:
        cube_mode = 5  # Urban Local

+
+
+
+
+
+
+
+
Parameters
+

row – A DataFrame row with route_type, route_long_name, and route_id

+
+
Returns
+

cube mode number

+
+
Return type
+

int

+
+
+
+ +
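In practice this is applied row-wise to a trip table, for example (trip_df is an illustrative DataFrame carrying the documented route_type, route long name, and route_id fields; storing the result in a MODE column mirrors what route_properties_gtfs_to_cube does):

trip_df["MODE"] = trip_df.apply(cube_transit_net.calculate_cube_mode, axis=1)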
+
+cube_format(row)[source]
+

Creates a string representing the route in cube line file notation.

+
+
Parameters
+

row – row of a DataFrame representing a cube-formatted trip, with the Attributes +trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR

+
+
Returns
+

string representation of route in cube line file notation

+
+
+
+ +
+
+static fromTransitNetwork(transit_network_object, parameters={})[source]
+

Converts a TransitNetwork instance to a StandardTransit instance.

+
+
Parameters
+
    +
  • transit_network_object – Reference to an instance of TransitNetwork.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns
+

StandardTransit

+
+
+
+ +
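A sketch of converting an in-memory network_wrangler TransitNetwork (my_transit_net is an illustrative variable name):

std_transit = StandardTransit.fromTransitNetwork(my_transit_net)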
+
+static read_gtfs(gtfs_feed_dir, parameters={})[source]
+

Reads GTFS files from a directory and returns a StandardTransit +instance.

+
+
Parameters
+
    +
  • gtfs_feed_dir – location of the GTFS files

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns
+

StandardTransit instance

+
+
+
+ +
+
+static route_properties_gtfs_to_cube(self)[source]
+

Prepare gtfs for cube lin file.

+

Does the following operations:

1. Combines route, frequency, trip, and shape information
2. Converts time of day to time periods
3. Calculates cube route name from gtfs route name and properties
4. Assigns a cube-appropriate mode number
5. Assigns a cube-appropriate operator number

+
+
Returns
+

+
DataFrame of trips with cube-appropriate values for:
    +
  • NAME

  • +
  • ONEWAY

  • +
  • OPERATOR

  • +
  • MODE

  • +
  • HEADWAY

  • +
+
+
+

+
+
Return type
+

trip_df (DataFrame)

+
+
+
+ +
+
+shape_gtfs_to_cube(row)[source]
+

Creates a list of nodes for the route in the appropriate cube format.

+
+
Parameters
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a string representation of the node list for a route in cube format.

+
+
+
+ +
+
+time_to_cube_time_period(start_time_secs, as_str=True, verbose=False)[source]
+

Converts seconds from midnight to the cube time period.

+
+
Parameters
+
    +
  • start_time_secs – start time for transit trip in seconds +from midnight

  • +
  • as_str – if True, returns the time period as a string, +otherwise returns a numeric time period

  • +
+
+
Returns
+

+
this_tp_num: if as_str is False, returns the numeric time period

this_tp: if as_str is True, returns the Cube time period name abbreviation

+
+
+

+
+
Return type
+

this_tp_num

+
+
+
+ +
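For example, converting 7:00 AM, i.e. 25200 seconds after midnight (the exact labels returned depend on the cube time period settings in Parameters):

tp_name = cube_transit_net.time_to_cube_time_period(25200, as_str=True)   # time period abbreviation
tp_num = cube_transit_net.time_to_cube_time_period(25200, as_str=False)   # numeric time period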
+
+write_as_cube_lin(outpath=None)[source]
+

Writes the gtfs feed as a cube line file after +converting gtfs properties to MetCouncil cube properties.

+
+
Parameters
+

outpath – File location for output cube line file.

+
+
+
+ +
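Putting the pieces together with non-default parameters (the directory and file names here are illustrative):

std_transit = StandardTransit.read_gtfs(
    "data/gtfs_feed",                          # hypothetical GTFS folder
    parameters={"lasso_base_dir": LASSO_DIR},  # LASSO_DIR is an illustrative path
)
std_transit.write_as_cube_lin("output/transit.lin")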
+ +
+ + +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_generated/lasso.logger.html b/_generated/lasso.logger.html new file mode 100644 index 0000000..c909ef4 --- /dev/null +++ b/_generated/lasso.logger.html @@ -0,0 +1,246 @@ + + + + + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ + + + +
+
+
+
+ +
+

lasso.logger

+

Functions

+ ++++ + + + + + +

setupLogging(infoLogFilename, debugLogFilename)

Sets up the logger.

+
+
+lasso.logger.setupLogging(infoLogFilename, debugLogFilename, logToConsole=True)[source]
+

Sets up the logger. The infoLog is terse, just gives the bare minimum of details +so the network composition will be clear later. +The debuglog is very noisy, for debugging.

+

Pass None for either filename to skip that log. Also writes everything to the console if logToConsole is True.

+
+ +
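A minimal usage sketch (the log file names are hypothetical):

from lasso.logger import WranglerLogger, setupLogging

setupLogging("lasso_info.log", "lasso_debug.log", logToConsole=True)
WranglerLogger.info("Starting network build")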
+ + +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_generated/lasso.util.html b/_generated/lasso.util.html new file mode 100644 index 0000000..f2d5a24 --- /dev/null +++ b/_generated/lasso.util.html @@ -0,0 +1,301 @@ + + + + + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ + + + +
+
+
+
+ +
+

lasso.util

+

Functions

+ ++++ + + + + + + + + + + + + + + +

column_name_to_parts(c[, parameters])

get_shared_streets_intersection_hash(lat, long)

Calculated per:

hhmmss_to_datetime(hhmmss_str)

Creates a datetime time object from a string of hh:mm:ss

secs_to_datetime(secs)

Creates a datetime time object from seconds from midnight

+
+
+lasso.util.column_name_to_parts(c, parameters=None)[source]
+
+ +
+
+lasso.util.get_shared_streets_intersection_hash(lat, long, osm_node_id=None)[source]
+
+
Calculated per:

https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565

+
+
Expected in/out
+
in:  -93.0965985, 44.952112199999995, osm_node_id = 954734870

out: 69f13f881649cb21ee3b359730790bb9

+
+
+
+
+
+ +
+
+lasso.util.hhmmss_to_datetime(hhmmss_str)[source]
+

Creates a datetime time object from a string of hh:mm:ss

+
+
Parameters
+

hhmmss_str – string of hh:mm:ss

+
+
Returns
+

datetime.time object representing time

+
+
Return type
+

dt

+
+
+
+ +
+
+lasso.util.secs_to_datetime(secs)[source]
+

Creates a datetime time object from seconds from midnight

+
+
Parameters
+

secs – seconds from midnight

+
+
Returns
+

datetime.time object representing time

+
+
Return type
+

dt

+
+
+
+ +
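Quick illustrative calls (the values are arbitrary):

from lasso.util import hhmmss_to_datetime, secs_to_datetime

hhmmss_to_datetime("06:30:00")  # -> a datetime.time for 6:30 AM
secs_to_datetime(23400)         # 23400 seconds after midnight, also 6:30 AM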
+ + +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_modules/index.html b/_modules/index.html new file mode 100644 index 0000000..c01ada3 --- /dev/null +++ b/_modules/index.html @@ -0,0 +1,207 @@ + + + + + + + + + + Overview: module code — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ +
Overview: module code
+
+ + + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_modules/lasso/logger.html b/_modules/lasso/logger.html new file mode 100644 index 0000000..bbdb030 --- /dev/null +++ b/_modules/lasso/logger.html @@ -0,0 +1,248 @@ + + + + + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ +
+ + +
+
+
+
+ +

Source code for lasso.logger

+import logging
+
+__all__ = ["WranglerLogger", "setupLogging"]
+
+
+# for all the Wrangler logging needs!
+WranglerLogger = logging.getLogger("WranglerLogger")
+
+
+
[docs]def setupLogging(infoLogFilename, debugLogFilename, logToConsole=True): + """ Sets up the logger. The infoLog is terse, just gives the bare minimum of details + so the network composition will be clear later. + The debuglog is very noisy, for debugging. + + Pass none to either. + Spews it all out to console too, if logToConsole is true. + """ + # clear handlers if any exist already + WranglerLogger.handlers = [] + + # create a logger + WranglerLogger.setLevel(logging.DEBUG) + + if infoLogFilename: + infologhandler = logging.StreamHandler(open(infoLogFilename, "w")) + infologhandler.setLevel(logging.INFO) + infologhandler.setFormatter( + logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s") + ) + WranglerLogger.addHandler(infologhandler) + + if debugLogFilename: + debugloghandler = logging.StreamHandler(open(debugLogFilename, "w")) + debugloghandler.setLevel(logging.DEBUG) + debugloghandler.setFormatter( + logging.Formatter("%(asctime)s %(levelname)s %(message)s", "%Y-%m-%d %H:%M") + ) + WranglerLogger.addHandler(debugloghandler) + + if logToConsole: + consolehandler = logging.StreamHandler() + consolehandler.setLevel(logging.DEBUG) + consolehandler.setFormatter( + logging.Formatter("%(name)-12s: %(levelname)-8s %(message)s") + ) + WranglerLogger.addHandler(consolehandler)
+
+ +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_modules/lasso/parameters.html b/_modules/lasso/parameters.html new file mode 100644 index 0000000..ed4f935 --- /dev/null +++ b/_modules/lasso/parameters.html @@ -0,0 +1,943 @@ + + + + + + + + + + lasso.parameters — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ +
+ + +
+
+
+
+ +

Source code for lasso.parameters

+import os
+from .logger import WranglerLogger
+
+
+def get_base_dir(lasso_base_dir=os.getcwd()):
+    d = lasso_base_dir
+    for i in range(3):
+        if "metcouncil_data" in os.listdir(d):
+            WranglerLogger.info("Lasso base directory set as: {}".format(d))
+            return d
+        d = os.path.dirname(d)
+
+    msg = "Cannot find Lasso base directory from {}, please input using keyword in parameters: `lasso_base_dir =` ".format(
+        lasso_base_dir
+    )
+    WranglerLogger.error(msg)
+    raise (ValueError(msg))
+
+
+
[docs]class Parameters: + """A class representing all the parameters defining the networks + including time of day, categories, etc. + + Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + .. highlight:: python + ##TODO potentially split this between several classes. + + Attr: + time_period_to_time (dict): Maps time period abbreviations used in + Cube to time of days used on gtfs and highway network standard + Default: + :: + { + "AM": ("6:00", "9:00"), + "MD": ("9:00", "16:00"), + "PM": ("16:00", "19:00"), + "NT": ("19:00", "6:00"), + } + cube_time_periods (dict): Maps cube time period numbers used in + transit line files to the time period abbreviations in time_period_to_time + dictionary. + Default: + :: + {"1": "AM", "2": "MD"} + categories (dict): Maps demand category abbreviations to a list of + network categories they are allowed to use. + Default: + :: + { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + properties_to_split (dict): Dictionary mapping variables in standard + roadway network to categories and time periods that need to be + split out in final model network to get variables like LANES_AM. + Default: + :: + { + "trn_priority": { + "v": "trn_priority", + "time_periods": self.time_periods_to_time, + }, + "ttime_assert": { + "v": "ttime_assert", + "time_periods": self.time_periods_to_time, + }, + "lanes": {"v": "lanes", "time_periods": self.time_periods_to_time}, + "price": { + "v": "price", + "time_periods": self.time_periods_to_time, + "categories": self.categories, + }, + "access": {"v": "access", "time_periods": self.time_periods_to_time}, + } + county_shape (str): File location of shapefile defining counties. + Default: + :: + r"metcouncil_data/county/cb_2017_us_county_5m.shp" + + county_variable_shp (str): Property defining the county n ame in + the county_shape file. + Default: + :: + NAME + lanes_lookup_file (str): Lookup table of number of lanes for different data sources. + Default: + :: + r"metcouncil_data/lookups/lanes.csv" + centroid_connect_lanes (int): Number of lanes for centroid connectors. + Default: + :: + 1 + mpo_counties (list): list of county names within MPO boundary. + Default: + :: + [ + "ANOKA", + "DAKOTA", + "HENNEPIN", + "RAMSEY", + "SCOTT", + "WASHINGTON", + "CARVER", + ] + + taz_shape (str): + Default: + :: + r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp" + taz_data (str): + Default: + :: + ?? + highest_taz_number (int): highest TAZ number in order to define + centroid connectors. + Default: + :: + 3100 + output_variables (list): list of variables to output in final model + network. 
+ Default: + :: + [ + "model_link_id", + "A", + "B", + "shstGeometryId", + "distance", + "roadway", + "name", + "roadway_class", + "bike_access", + "transit_access", + "walk_access", + "drive_access", + "truck_access", + "trn_priority_AM", + "trn_priority_MD", + "trn_priority_PM", + "trn_priority_NT", + "ttime_assert_AM", + "ttime_assert_MD", + "ttime_assert_PM", + "ttime_assert_NT", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "price_sov_AM", + "price_hov2_AM", + "price_hov3_AM", + "price_truck_AM", + "price_sov_MD", + "price_hov2_MD", + "price_hov3_MD", + "price_truck_MD", + "price_sov_PM", + "price_hov2_PM", + "price_hov3_PM", + "price_truck_PM", + "price_sov_NT", + "price_hov2_NT", + "price_hov3_NT", + "price_truck_NT", + "roadway_class_idx", + "assign_group", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "mpo", + "area_type", + "county", + "centroidconnect", + "AADT", + "count_year", + "count_AM", + "count_MD", + "count_PM", + "count_NT", + "count_daily", + "model_node_id", + "N", + "osm_node_id", + "bike_node", + "transit_node", + "walk_node", + "drive_node", + "geometry", + "X", + "Y", + ] + area_type_shape (str): Location of shapefile defining area type. + Default: + :: + r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp" + area_type_variable_shp (str): property in area_type_shape with area + type in it. + Default: + :: + "COMDES2040" + area_type_code_dict (dict): Mapping of the area_type_variable_shp to + the area type code used in the MetCouncil cube network. + Default: + :: + { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + downtown_area_type_shape (str): Location of shapefile defining downtown area type. + Default: + :: + r"metcouncil_data/area_type/downtownzones_TAZ.shp" + downtown_area_type (int): Area type integer for downtown. + Default: + :: + 5 + mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp" + mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp + associated with roadway class. Default: + :: + "ROUTE_SYS" + widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp" + widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape + associated with roadway class.Default: + :: + "RDWY_CTGY_" + mndot_count_shape (str): Shapefile of MnDOT links with a property + associated with counts. Default: + :: + r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp" + mndot_count_variable_shp (str): The property in mndot_count_shape + associated with counts. Default: + :: + "AADT_mn" + widot_count_shape (str): Shapefile of Wisconsin DOT links with a property + associated with counts. Default:Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/TRADAS_(counts).shp" + widot_count_variable_shp (str): The property in widot_count_shape + associated with counts. Default: + :: + "AADT_wi" + mrcc_shst_data (str): MnDOT MRCC to Shared Streets crosswalk. 
Default: + :: + r"metcouncil_data/mrcc/mrcc.out.matched.csv" + widot_shst_data (str): WisconsinDOT to Shared Streets crosswalk.Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/widot.out.matched.geojson" + mndot_count_shst_data (str): MetCouncil count data with ShST Default: + :: + r"metcouncil_data/count_mn/mn_count_ShSt_API_match.csv" + widot_count_shst_data (str): WisconsinDOT count data with ShST Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/wi_count_ShSt_API_match.csv", + mrcc_assgngrp_dict (str): Mapping beetween MRCC ROUTE_SYS variable + and assignment group. Default: + :: + "lookups/mrcc_route_sys_asgngrp_crosswalk.csv" + widot_assgngrp_dict (dict): Mapping beetween Wisconsin DOT RDWY_CTGY_ + variable and assignment group. Default: + :: + "lookups/widot_ctgy_asgngrp_crosswalk.csv" + osm_assgngrp_dict (dict): Mapping between OSM Roadway variable + and assignment group. Default: + :: + "lookups/osm_highway_asgngrp_crosswalk.csv" + roadway_class_dict (str): Mapping between assignment group and + roadway class. Default: + :: + "lookups/asgngrp_rc_num_crosswalk.csv" + output_epsg (int): EPSG type of geographic projection for output + shapefiles. Default: + :: + 26915 + net_to_dbf (str): Lookup of network variables to DBF compliant + lengths. Default: + :: + "examples/settings/net_to_dbf.csv" + output_link_shp (str): Output shapefile for roadway links. Default: + :: + r"tests/scratch/links.shp" + output_node_shp (str): Output shapefile for roadway nodes. Default: + :: + r"tests/scratch/nodes.shp" + output_link_csv (str): Output csv for roadway links. Default: + :: + r"tests/scratch/links.csv" + output_node_csv (str): Output csv for roadway nodes. Default: + :: + r"tests/scratch/nodes.csv" + output_link_txt (str): Output fixed format txt for roadway links. Default: + :: + r"tests/scratch/links.txt" + output_node_txt (str): Output fixed format txt for roadway nodes. Default: + :: + r"tests/scratch/nodes.txt" + output_link_header_width_txt (str): Header for txt roadway links. Default: + :: + r"tests/scratch/links_header_width.txt" + output_node_header_width_txt (str): Header for txt for roadway Nodes. Default: + :: + r"tests/scratch/nodes_header_width.txt" + output_cube_network_script (str): Cube script for importing + fixed-format roadway network. Default: + :: + r"tests/scratch/make_complete_network_from_fixed_width_file.s + + + + """ + +
[docs] def __init__(self, **kwargs): + """ + Time period and category splitting info + """ + + if "time_periods_to_time" in kwargs: + self.time_periods_to_time = kwargs.get("time_periods_to_time") + else: + self.time_period_to_time = { + "AM": ("6:00", "9:00"), ##TODO FILL IN with real numbers + "MD": ("9:00", "16:00"), + "PM": ("16:00", "19:00"), + "NT": ("19:00", "6:00"), + } + + self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7} + + self.route_type_mode_dict = {0: 8, 2: 9} + + self.cube_time_periods = {"1": "AM", "2": "MD"} + self.cube_time_periods_name = {"AM": "pk", "MD": "op"} + + if "categories" in kwargs: + self.categories = kwargs.get("categories") + else: + self.categories = { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + + # prefix, source variable, categories + self.properties_to_split = { + "trn_priority": { + "v": "trn_priority", + "time_periods": self.time_period_to_time, + }, + "ttime_assert": { + "v": "ttime_assert", + "time_periods": self.time_period_to_time, + }, + "lanes": {"v": "lanes", "time_periods": self.time_period_to_time}, + "ML_lanes": {"v": "ML_lanes", "time_periods": self.time_period_to_time}, + "price": { + "v": "price", + "time_periods": self.time_period_to_time, + "categories": self.categories, + }, + "access": {"v": "access", "time_periods": self.time_period_to_time}, + } + + """ + Details for calculating the county based on the centroid of the link. + The COUNTY_VARIABLE should be the name of a field in shapefile. + """ + if "lasso_base_dir" in kwargs: + self.base_dir = get_base_dir(lasso_base_dir=kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + + if "data_file_location" in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "metcouncil_data") + + if "settings_location" in kwargs: + self.settings_location = kwargs.get("settings_location") + else: + self.settings_location = os.path.join(self.base_dir, "examples", "settings") + + if "scratch_location" in kwargs: + self.scratch_location = kwargs.get("scratch_location") + else: + self.scratch_location = os.path.join(self.base_dir, "tests", "scratch") + + ### COUNTIES + + self.county_shape = os.path.join( + self.data_file_location, "county", "cb_2017_us_county_5m.shp" + ) + self.county_variable_shp = "NAME" + + self.county_code_dict = { + "Anoka": 1, + "Carver": 2, + "Dakota": 3, + "Hennepin": 4, + "Ramsey": 5, + "Scott": 6, + "Washington": 7, + "external": 10, + "Chisago": 11, + "Goodhue": 12, + "Isanti": 13, + "Le Sueur": 14, + "McLeod": 15, + "Pierce": 16, + "Polk": 17, + "Rice": 18, + "Sherburne": 19, + "Sibley": 20, + "St. 
Croix": 21, + "Wright": 22, + } + + self.mpo_counties = [ + 1, + 3, + 4, + 5, + 6, + 7, + 2, + ] + + ### Lanes + self.lanes_lookup_file = os.path.join( + self.data_file_location, "lookups", "lanes.csv" + ) + + ### TAZS + + self.taz_shape = os.path.join( + self.data_file_location, "TAZ", "TAZOfficialWCurrentForecasts.shp" + ) + self.taz_data = None + self.highest_taz_number = 3100 + + ### AREA TYPE + self.area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "ThriveMSP2040CommunityDesignation.shp", + ) + self.area_type_variable_shp = "COMDES2040" + # area type map from raw data to model category + + # source https://metrocouncil.org/Planning/Publications-And-Resources/Thrive-MSP-2040-Plan-(1)/7_ThriveMSP2040_LandUsePoliciesbyCD.aspx + # urban center + # urban + # suburban + # suburban edge + # emerging suburban edge + # rural center + # diversified rural + # rural residential + # agricultural + self.area_type_code_dict = { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + + self.downtown_area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "downtownzones_TAZ.shp", + ) + + self.downtown_area_type = int(5) + + self.centroid_connect_lanes = int(1) + + self.osm_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_asgngrp_crosswalk.csv" + ) + self.mrcc_roadway_class_shape = os.path.join( + self.data_file_location, "mrcc", "trans_mrcc_centerlines.shp" + ) + + self.mrcc_roadway_class_variable_shp = "ROUTE_SYS" + + self.mrcc_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "mrcc_route_sys_asgngrp_crosswalk.csv" + ) + + self.mrcc_shst_data = os.path.join( + self.data_file_location, "mrcc", "mrcc.out.matched.csv" + ) + + self.widot_roadway_class_shape = os.path.join( + self.data_file_location, "Wisconsin_Lanes_Counts_Median", "WISLR.shp" + ) + + self.widot_roadway_class_variable_shp = "RDWY_CTGY_" + + self.widot_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "widot_ctgy_asgngrp_crosswalk.csv" + ) + + self.widot_shst_data = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "widot.out.matched.geojson", + ) + + self.roadway_class_dict = os.path.join( + self.data_file_location, "lookups", "asgngrp_rc_num_crosswalk.csv" + ) + + self.mndot_count_shape = os.path.join( + self.data_file_location, "count_mn", "AADT_2017_Count_Locations.shp" + ) + + self.mndot_count_shst_data = os.path.join( + self.data_file_location, "count_mn", "mn_count_ShSt_API_match.csv" + ) + + self.mndot_count_variable_shp = "AADT_mn" + + self.widot_county_shape = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "TRADAS_(counts).shp", + ) + + self.widot_count_shst_data = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "wi_count_ShSt_API_match.csv", + ) + + self.widot_count_variable_shp = "AADT_wi" + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + self.log_to_net_crosswalk = os.path.join( + self.settings_location, "log_to_net.csv" + ) + + self.output_variables = [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + "shape_id", + "distance", + "roadway", + "name", + "roadway_class", + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "trn_priority_AM", + "trn_priority_MD", + "trn_priority_PM", + "trn_priority_NT", + "ttime_assert_AM", + "ttime_assert_MD", + "ttime_assert_PM", + "ttime_assert_NT", + 
"lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "price_sov_AM", + "price_hov2_AM", + "price_hov3_AM", + "price_truck_AM", + "price_sov_MD", + "price_hov2_MD", + "price_hov3_MD", + "price_truck_MD", + "price_sov_PM", + "price_hov2_PM", + "price_hov3_PM", + "price_truck_PM", + "price_sov_NT", + "price_hov2_NT", + "price_hov3_NT", + "price_truck_NT", + "roadway_class_idx", + "assign_group", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "mpo", + "area_type", + "county", + "centroidconnect", + #'mrcc_id', + "AADT", + "count_year", + "count_AM", + "count_MD", + "count_PM", + "count_NT", + "count_daily", + "model_node_id", + "N", + "osm_node_id", + "bike_node", + "transit_node", + "walk_node", + "drive_node", + "geometry", + "X", + "Y", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_NT", + "segment_id", + "managed", + "bus_only", + "rail_only", + "bike_facility", + "mrcc_id", + "ROUTE_SYS", #mrcc functional class + ] + + self.output_link_shp = os.path.join(self.scratch_location, "links.shp") + self.output_node_shp = os.path.join(self.scratch_location, "nodes.shp") + self.output_link_csv = os.path.join(self.scratch_location, "links.csv") + self.output_node_csv = os.path.join(self.scratch_location, "nodes.csv") + self.output_link_txt = os.path.join(self.scratch_location, "links.txt") + self.output_node_txt = os.path.join(self.scratch_location, "nodes.txt") + self.output_link_header_width_txt = os.path.join( + self.scratch_location, "links_header_width.txt" + ) + self.output_node_header_width_txt = os.path.join( + self.scratch_location, "nodes_header_width.txt" + ) + self.output_cube_network_script = os.path.join( + self.scratch_location, "make_complete_network_from_fixed_width_file.s" + ) + self.output_dir = os.path.join(self.scratch_location) + self.output_epsg = 26915 + + """ + Create all the possible headway variable combinations based on the cube time periods setting + """ + self.time_period_properties_list = [ + p + "[" + str(t) + "]" + for p in ["HEADWAY", "FREQ"] + for t in self.cube_time_periods.keys() + ] + + self.int_col = [ + "model_link_id", + "model_node_id", + "A", + "B", + # "lanes", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "roadway_class", + "assign_group", + "county", + "area_type", + "trn_priority", + "AADT", + "count_AM", + "count_MD", + "count_PM", + "count_NT", + "count_daily", + "centroidconnect", + "bike_facility", + "drive_access", + "walk_access", + "bike_access", + "truck_access", + "drive_node", + "walk_node", + "bike_node", + "transit_node", + # "ML_lanes", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_NT", + "segment_id", + "managed", + "bus_only", + "rail_only", + "mrcc_id", + ] + + self.float_col = ["distance", "ttime_assert", "price", "X", "Y"] + + self.string_col = [ + "osm_node_id", + "name", + "roadway", + "shstGeometryId", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "ROUTE_SYS", + ] + + self.__dict__.update(kwargs)
+
+ +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_modules/lasso/project.html b/_modules/lasso/project.html new file mode 100644 index 0000000..2ab9ab7 --- /dev/null +++ b/_modules/lasso/project.html @@ -0,0 +1,1022 @@ + + + + + + + + + + lasso.project — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ +
+ + +
+
+
+
+ +

Source code for lasso.project

+import json
+import os
+import re
+from typing import Any, Dict, Optional, Union, List
+from csv import reader
+
+import pandas as pd
+from pandas import DataFrame
+import geopandas as gpd
+
+from network_wrangler import ProjectCard
+from network_wrangler import RoadwayNetwork
+
+from .transit import CubeTransit, StandardTransit
+from .logger import WranglerLogger
+from .parameters import Parameters
+from .roadway import ModelRoadwayNetwork
+from .util import column_name_to_parts
+
+
+
[docs]class Project(object): + """A single or set of changes to the roadway or transit system. + + Compares a base and a build transit network or a base and build + highway network and produces project cards. + + .. highlight:: python + + Typical usage example: + :: + test_project = Project.create_project( + base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + test_project.evaluate_changes() + test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + + Attributes: + DEFAULT_PROJECT_NAME: a class-level constant that defines what + the project name will be if none is set. + STATIC_VALUES: a class-level constant which defines values that + are not evaluated when assessing changes. + card_data (dict): {"project": <project_name>, "changes": <list of change dicts>} + roadway_changes (DataFrame): pandas dataframe of CUBE roadway changes. + transit_changes (CubeTransit): + base_roadway_network (RoadwayNetwork): + base_transit_network (CubeTransit): + build_transit_network (CubeTransit): + project_name (str): name of the project, set to DEFAULT_PROJECT_NAME if not provided + parameters: an instance of the Parameters class which sets a bunch of parameters + """ + + DEFAULT_PROJECT_NAME = "USER TO define" + + STATIC_VALUES = [ + "model_link_id", + "area_type", + "county", + # "assign_group", + "centroidconnect", + ] + CALCULATED_VALUES = [ + "area_type", + "county", + "assign_group", + "centroidconnect", + ] + +
[docs] def __init__( + self, + roadway_changes: Optional[DataFrame] = None, + transit_changes: Optional[CubeTransit] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[CubeTransit] = None, + build_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = "", + evaluate: Optional[bool] = False, + parameters: Union[dict, Parameters] = {}, + ): + """ + ProjectCard constructor. + + args: + roadway_changes: dataframe of roadway changes read from a log file + transit_changes: + base_roadway_network: RoadwayNetwork instance for base case + base_transit_network: CubeTransit instance for base transit network + build_transit_network: CubeTransit instance for build transit network + project_name: name of the project + evaluate: defaults to false, but if true, will create card data + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + returns: instance of ProjectCard + """ + self.card_data = Dict[str, Dict[str, Any]] + + self.roadway_changes = roadway_changes + self.base_roadway_network = base_roadway_network + self.base_transit_network = base_transit_network + self.build_transit_network = build_transit_network + self.transit_changes = transit_changes + self.project_name = ( + project_name if project_name else Project.DEFAULT_PROJECT_NAME + ) + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + if base_roadway_network != None: + self.determine_roadway_network_changes_compatability( + self.base_roadway_network, self.roadway_changes, self.parameters + ) + + if evaluate: + self.evaluate_changes()
+ +
[docs] def write_project_card(self, filename: str = None): + """ + Writes project cards. + + Args: + filename (str): File path to output .yml + + Returns: + None + """ + ProjectCard(self.card_data).write(filename=filename)
+ +
[docs] @staticmethod + def create_project( + roadway_log_file: Union[str, List[str], None] = None, + roadway_shp_file: Optional[str] = None, + roadway_csv_file: Optional[str] = None, + base_roadway_dir: Optional[str] = None, + base_transit_source: Optional[str] = None, + build_transit_source: Optional[str] = None, + roadway_changes: Optional[DataFrame] = None, + transit_changes: Optional[CubeTransit] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[CubeTransit] = None, + build_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = None, + recalculate_calculated_variables: Optional[bool] = False, + recalculate_distance: Optional[bool] = False, + parameters: Optional[dict] = {}, + ): + """ + Constructor for a Project instance. + + Args: + roadway_log_file: File path to consuming logfile or a list of logfile paths. + roadway_shp_file: File path to consuming shape file for roadway changes. + roadway_csv_file: File path to consuming csv file for roadway changes. + base_roadway_dir: Folder path to base roadway network. + base_transit_source: Folder path to base transit network or cube line file string. + base_transit_file: File path to base transit network. + build_transit_source: Folder path to build transit network or cube line file string. + build_transit_file: File path to build transit network. + roadway_changes: pandas dataframe of CUBE roadway changes. + transit_changes: build transit changes. + base_roadway_network: Base roadway network object. + base_transit_network: Base transit network object. + build_transit_network: Build transit network object. + project_name: If not provided, will default to the roadway_log_file filename if provided (or the first filename if a list is provided) + recalculate_calculated_variables: if reading in a base network, if this is true it will recalculate variables such as area type, etc. This only needs to be true if you are creating project cards that are changing the calculated variables. + recalculate_distance: recalculate the distance variable. This only needs to be true if you are creating project cards that change the distance. + parameters: dictionary of parameters + Returns: + A Project instance. + """ + + if base_transit_source and base_transit_network: + msg = "Method takes only one of 'base_transit_source' and 'base_transit_network' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_transit_source: + base_transit_network = CubeTransit.create_from_cube(base_transit_source) + WranglerLogger.debug( + "Base network has {} lines".format(len(base_transit_network.lines)) + ) + if len(base_transit_network.lines) <= 10: + WranglerLogger.debug( + "Base network lines: {}".format( + "\n - ".join(base_transit_network.lines) + ) + ) + elif base_transit_network: + pass + else: + msg = "No base transit network." 
+ WranglerLogger.info(msg) + base_transit_network = None + + if build_transit_source and transit_changes: + msg = "Method takes only one of 'build_transit_source' and 'transit_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if build_transit_source: + WranglerLogger.debug("build") + build_transit_network = CubeTransit.create_from_cube(build_transit_source) + WranglerLogger.debug( + "Build network has {} lines".format(len(build_transit_network.lines)) + ) + if len(build_transit_network.lines) <= 10: + WranglerLogger.debug( + "Build network lines: {}".format( + "\n - ".join(build_transit_network.lines) + ) + ) + elif transit_changes: + pass + else: + msg = "No transit changes given or processed." + WranglerLogger.info(msg) + transit_changes = None + + if roadway_log_file and roadway_changes: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and roadway_changes: + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_csv_file and roadway_changes: + msg = "Method takes only one of 'roadway_csv_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_shp_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_shp_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and not project_name: + if type(roadway_log_file) == list: + project_name = os.path.splitext(os.path.basename(roadway_log_file[0]))[0] + WranglerLogger.info("No Project Name - Using name of first log file in list") + else: + project_name = os.path.splitext(os.path.basename(roadway_log_file))[0] + WranglerLogger.info("No Project Name - Using name of log file") + if roadway_log_file: + roadway_changes = Project.read_logfile(roadway_log_file) + elif roadway_shp_file: + roadway_changes = gpd.read_file(roadway_shp_file) + roadway_changes = DataFrame(roadway_changes.drop("geometry", axis=1)) + roadway_changes["model_node_id"] = 0 + elif roadway_csv_file: + roadway_changes = pd.read_csv(roadway_csv_file) + roadway_changes["model_node_id"] = 0 + elif roadway_changes: + pass + else: + msg = "No roadway changes given or processed." + WranglerLogger.info(msg) + roadway_changes = pd.DataFrame({}) + + if base_roadway_network and base_roadway_dir: + msg = "Method takes only one of 'base_roadway_network' and 'base_roadway_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_roadway_dir: + base_roadway_network = ModelRoadwayNetwork.read( + os.path.join(base_roadway_dir, "link.json"), + os.path.join(base_roadway_dir, "node.geojson"), + os.path.join(base_roadway_dir, "shape.geojson"), + fast=True, + recalculate_calculated_variables=recalculate_calculated_variables, + recalculate_distance=recalculate_distance, + parameters=parameters, + ) + base_roadway_network.split_properties_by_time_period_and_category() + elif base_roadway_network: + pass + else: + msg = "No base roadway network." 
+ WranglerLogger.info(msg) + base_roadway_network = None + + project = Project( + roadway_changes=roadway_changes, + transit_changes=transit_changes, + base_roadway_network=base_roadway_network, + base_transit_network=base_transit_network, + build_transit_network=build_transit_network, + evaluate=True, + project_name=project_name, + parameters=parameters, + ) + + return project
+ +
[docs] @staticmethod + def read_logfile(logfilename: Union[str, List[str]]) -> DataFrame: + """ + Reads a Cube log file and returns a dataframe of roadway_changes + + Args: + logfilename (str or list[str]): File path to CUBE logfile or list of logfile paths. + + Returns: + A DataFrame reprsentation of the log file. + """ + if type(logfilename) == str: + logfilename = [logfilename] + + link_df = pd.DataFrame() + node_df = pd.DataFrame() + + for file in logfilename: + WranglerLogger.info("Reading logfile: {}".format(file)) + with open(file) as f: + _content = f.readlines() + + _node_lines = [x.strip().replace(";",",") for x in _content if x.startswith("N")] + WranglerLogger.debug("node lines: {}".format(_node_lines)) + _link_lines = [x.strip().replace(";",",") for x in _content if x.startswith("L")] + WranglerLogger.debug("link lines: {}".format(_link_lines)) + + _nodecol = ["OBJECT", "OPERATION", "GROUP"] + _node_lines[0].split(",")[1:] + WranglerLogger.debug("Node Cols: {}".format(_nodecol)) + _linkcol = ["OBJECT", "OPERATION", "GROUP"] + _link_lines[0].split(",")[1:] + WranglerLogger.debug("Link Cols: {}".format(_linkcol)) + + _node_df = pd.DataFrame([x.split(",") for x in _node_lines[1:]],columns = _nodecol) + WranglerLogger.debug("Node DF: {}".format(_node_df)) + _link_df = pd.DataFrame([x.split(",") for x in _link_lines[1:]],columns = _linkcol) + WranglerLogger.debug("Link DF: {}".format(_link_df)) + + node_df = pd.concat([node_df,_node_df]) + link_df = pd.concat([link_df,_link_df]) + + log_df = pd.concat([link_df, node_df], ignore_index=True, sort=False) + + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + log_df.columns = [c.split("[")[0] for c in log_df.columns] + + WranglerLogger.info( + "Processed {} Node lines and {} Link lines".format( + node_df.shape[0], link_df.shape[0] + ) + ) + + return log_df
+ +
[docs] @staticmethod + def determine_roadway_network_changes_compatability( + base_roadway_network: ModelRoadwayNetwork, + roadway_changes: DataFrame, + parameters: Parameters, + ): + """ + Checks to see that any links or nodes that change exist in base roadway network. + """ + WranglerLogger.info( + "Evaluating compatibility between roadway network changes and base network. Not evaluating deletions." + ) + + # CUBE log file saves all variable names in upper cases, need to convert them to be same as network + log_to_net_df = pd.read_csv(parameters.log_to_net_crosswalk) + log_to_net_dict = dict(zip(log_to_net_df["log"], log_to_net_df["net"])) + + dbf_to_net_df = pd.read_csv(parameters.net_to_dbf_crosswalk) + dbf_to_net_dict = dict(zip(dbf_to_net_df["dbf"], dbf_to_net_df["net"])) + + roadway_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_changes.rename(columns=dbf_to_net_dict, inplace=True) + + # for links "L" that change "C", + # find locations where there isn't a base roadway link + + link_changes_df = roadway_changes[ + (roadway_changes.OBJECT == "L") & (roadway_changes.OPERATION == "C") + ] + + link_merge_df = pd.merge( + link_changes_df[["A", "B"]].astype(str), + base_roadway_network.links_df[["A", "B", "model_link_id"]].astype(str), + how="left", + on=["A", "B"], + ) + + missing_links = link_merge_df.loc[link_merge_df["model_link_id"].isna()] + + if missing_links.shape[0]: + msg = "Network missing the following AB links:\n{}".format(missing_links) + WranglerLogger.error(msg) + raise ValueError(msg) + + # for links "N" that change "C", + # find locations where there isn't a base roadway node + + node_changes_df = roadway_changes[ + (roadway_changes.OBJECT == "N") & (roadway_changes.OPERATION == "C") + ] + node_merge_df = pd.merge( + node_changes_df[["model_node_id"]], + base_roadway_network.nodes_df[["model_node_id", "geometry"]], + how="left", + on=["model_node_id"], + ) + missing_nodes = node_merge_df.loc[node_merge_df["geometry"].isna()] + if missing_nodes.shape[0]: + msg = "Network missing the following nodes:\n{}".format(missing_nodes) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] def evaluate_changes(self): + """ + Determines which changes should be evaluated, initiates + self.card_data to be an aggregation of transit and highway changes. + """ + highway_change_list = [] + transit_change_list = [] + + WranglerLogger.info("Evaluating project changes.") + + if not self.roadway_changes.empty: + highway_change_list = self.add_highway_changes() + + if (self.transit_changes is not None) or ( + self.base_transit_network is not None + and self.build_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + self.card_data = { + "project": self.project_name, + "changes": transit_change_list + highway_change_list, + }
+ +
[docs] def add_transit_changes(self): + """ + Evaluates changes between base and build transit objects and + adds entries into the self.card_data dictionary. + """ + + transit_change_list = self.build_transit_network.evaluate_differences( + self.base_transit_network + ) + + return transit_change_list
+ +
[docs] def add_highway_changes(self, limit_variables_to_existing_network=False): + """ + Evaluates changes from the log file based on the base highway object and + adds entries into the self.card_data dictionary. + + Args: + limit_variables_to_existing_network (bool): True if no ad-hoc variables. Default to False. + """ + + for c in self.parameters.string_col: + if c in self.roadway_changes.columns: + self.roadway_changes[c] = self.roadway_changes[c].str.lstrip(" ") + + ## if worth it, could also add some functionality to network wrangler itself. + node_changes_df = self.roadway_changes[ + self.roadway_changes.OBJECT == "N" + ].copy() + + link_changes_df = self.roadway_changes[ + self.roadway_changes.OBJECT == "L" + ].copy() + + def _final_op(x): + if x.OPERATION_history[-1] == "D": + if "A" in x.OPERATION_history[:-1]: + return "N" + else: + return "D" + elif x.OPERATION_history[-1] == "A": + if "D" in x.OPERATION_history[:-1]: + return "C" + else: + return "A" + else: + if "A" in x.OPERATION_history[:-1]: + return "A" + else: + return "C" + + def _process_deletions(link_changes_df): + """ + + """ + WranglerLogger.debug("Processing link deletions") + + cube_delete_df = link_changes_df[link_changes_df.OPERATION_final == "D"] + if len(cube_delete_df) > 0: + links_to_delete = cube_delete_df["model_link_id"].tolist() + delete_link_dict = { + "category": "Roadway Deletion", + "links": {"model_link_id": links_to_delete}, + } + WranglerLogger.debug("{} Links Deleted.".format(len(links_to_delete))) + else: + delete_link_dict = None + WranglerLogger.debug("No link deletions processed") + + return delete_link_dict + + def _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ): + """ + + """ + WranglerLogger.debug("Processing link additions") + cube_add_df = link_changes_df[link_changes_df.OPERATION_final == "A"] + if not cube_add_df.shape[1]: + WranglerLogger.debug("No link additions processed") + return {} + + if limit_variables_to_existing_network: + add_col = [ + c + for c in cube_add_df.columns + if c in self.base_roadway_network.links_df.columns + ] + else: + add_col = [ + c + for c in cube_add_df.columns if c not in ["OPERATION_final"] + ] + # can leave out "OPERATION_final" from writing out, is there a reason to write it out? + + add_link_properties = cube_add_df[add_col].to_dict("records") + + # WranglerLogger.debug("Add Link Properties: {}".format(add_link_properties)) + WranglerLogger.debug("{} Links Added".format(len(add_link_properties))) + + return {"category": "Add New Roadway", "links": add_link_properties} + + def _process_node_additions(node_add_df): + """ + + """ + WranglerLogger.debug("Processing node additions") + + if not node_add_df.shape[1]: + WranglerLogger.debug("No node additions processed") + return [] + + add_nodes_dict_list = node_add_df.drop(["OPERATION_final"], axis=1).to_dict( + "records" + ) + WranglerLogger.debug("{} Nodes Added".format(len(add_nodes_dict_list))) + + return add_nodes_dict_list + + def _process_single_link_change(change_row, changeable_col): + """ + + """ + + # 1. Find associated base year network values + base_df = self.base_roadway_network.links_df[ + (self.base_roadway_network.links_df["A"] == change_row.A) + & (self.base_roadway_network.links_df["B"] == change_row.B) + ] + + if not base_df.shape[0]: + msg = "No match found in network for AB combination: ({},{}). 
Incompatible base network.".format( + change_row.A, change_row.B + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + elif base_df.shape[0] > 1: + WranglerLogger.warning( + "Found more than one match in base network for AB combination: ({},{}). Selecting first one to operate on but AB should be unique to network.".format( + row.A, row.B + ) + ) + + base_row = base_df.iloc[0] + # WranglerLogger.debug("Properties with changes: {}".format(changeable_col)) + + # 2. find columns that changed (enough) + changed_col = [] + for col in changeable_col: + WranglerLogger.debug("Assessing Column: {}".format(col)) + # if it is the same as before, or a static value, don't process as a change + if str(change_row[col]) == str(base_row[col]): + continue + if (col == "roadway_class") & (change_row[col] == 0): + continue + # only look at distance if it has significantly changed + if col == "distance": + if ( + abs( + (change_row[col] - float(base_row[col])) + / base_row[col].astype(float) + ) + > 0.01 + ): + changed_col.append(col) + else: + continue + else: + changed_col.append(col) + + WranglerLogger.debug( + "Properties with changes that will be processed: {}".format(changed_col) + ) + + if not changed_col: + return pd.DataFrame() + + # 3. Iterate through columns with changed values and structure the changes as expected in project card + property_dict_list = [] + processed_properties = [] + for c in changed_col: + # WranglerLogger.debug("Processing Column: {}".format(c)) + ( + p_base_name, + p_time_period, + p_category, + managed_lane, + ) = column_name_to_parts(c) + + _d = { + "existing": base_row[c], + "set": change_row[c], + } + if c in Project.CALCULATED_VALUES: + _d = { + "set": change_row[c], + } + if p_time_period: + _d["time"] = list( + self.parameters.time_period_to_time[p_time_period] + ) + if p_category: + _d["category"] = p_category + + # iterate through existing properties that have been changed and see if you should just add + if p_base_name in processed_properties: + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + processed_p["timeofday"] += [_d] + elif p_time_period: + property_dict = {"property": p_base_name, "timeofday": [_d]} + processed_properties.append(p_base_name) + property_dict_list.append(property_dict) + else: + _d["property"] = c + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + + card_df = pd.DataFrame( + { + "properties": pd.Series([property_dict_list]), + "model_link_id": pd.Series(base_row["model_link_id"]), + } + ) + + # WranglerLogger.debug('single change card_df:\n {}'.format(card_df)) + + return card_df + + def _process_link_changes(link_changes_df, changeable_col): + """ + + """ + cube_change_df = link_changes_df[link_changes_df.OPERATION_final == "C"] + if not cube_change_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + change_link_dict_df = pd.DataFrame(columns=["properties", "model_link_id"]) + + for index, row in cube_change_df.iterrows(): + card_df = _process_single_link_change(row, changeable_col) + + change_link_dict_df = pd.concat( + [change_link_dict_df, card_df], ignore_index=True, sort=False + ) + + if not change_link_dict_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + # WranglerLogger.debug('change_link_dict_df Unaggregated:\n {}'.format(change_link_dict_df)) + + # Have to change to string so that it is a hashable type for the aggregation + change_link_dict_df["properties"] = change_link_dict_df[ + "properties" + 
].astype(str) + # Group the changes that are the same + change_link_dict_df = ( + change_link_dict_df.groupby("properties")[["model_link_id"]] + .agg(lambda x: list(x)) + .reset_index() + ) + # WranglerLogger.debug('change_link_dict_df Aggregated:\n {}'.format(change_link_dict_df)) + + # Reformat model link id to correct "facility" format + change_link_dict_df["facility"] = change_link_dict_df.apply( + lambda x: {"link": [{"model_link_id": x.model_link_id}]}, axis=1 + ) + + # WranglerLogger.debug('change_link_dict_df 3: {}'.format(change_link_dict_df)) + change_link_dict_df["properties"] = change_link_dict_df["properties"].apply( + lambda x: json.loads( + x.replace("'\"", "'").replace("\"'", "'").replace("'", '"') + ) + ) + + change_link_dict_df["category"] = "Roadway Property Change" + + change_link_dict_list = change_link_dict_df[ + ["category", "facility", "properties"] + ].to_dict("record") + + WranglerLogger.debug( + "{} Changes Processed".format(len(change_link_dict_list)) + ) + return change_link_dict_list + + def _consolidate_actions(log, base, key_list): + log_df = log.copy() + # will be changed if to allow new variables being added/changed that are not in base network + changeable_col = [x for x in log_df.columns if x in base.columns] + + for x in changeable_col: + log_df[x] = log_df[x].astype(base[x].dtype) + + action_history_df = ( + log_df.groupby(key_list)["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("OPERATION_history") + .reset_index() + ) + + log_df = pd.merge(log_df, action_history_df, on=key_list, how="left") + log_df.drop_duplicates(subset=key_list, keep="last", inplace=True) + log_df["OPERATION_final"] = log_df.apply(lambda x: _final_op(x), axis=1) + return log_df[changeable_col + ["OPERATION_final"]] + + if len(link_changes_df) != 0: + link_changes_df = _consolidate_actions( + link_changes_df, self.base_roadway_network.links_df, ["A", "B"] + ) + + if len(node_changes_df) != 0: + node_changes_df = _consolidate_actions( + node_changes_df, self.base_roadway_network.nodes_df, ["model_node_id"] + ) + + # print error message for node change and node deletion + if ( + len(node_changes_df[node_changes_df.OPERATION_final.isin(["C", "D"])]) + > 0 + ): + msg = "NODE changes and deletions are not allowed!" + WranglerLogger.error(msg) + raise ValueError(msg) + node_add_df = node_changes_df[node_changes_df.OPERATION_final == "A"] + else: + node_add_df = pd.DataFrame() + + # process deletions + delete_link_dict = _process_deletions(link_changes_df) + + # process additions + add_link_dict = _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ) + add_link_dict["nodes"] = _process_node_additions(node_add_df) + + # process changes + WranglerLogger.debug("Processing changes") + WranglerLogger.debug(link_changes_df) + changeable_col = list( + ( + set(link_changes_df.columns) + & set(self.base_roadway_network.links_df.columns) + ) + - set(Project.STATIC_VALUES) + ) + + cols_in_changes_not_in_net = list( + set(link_changes_df.columns) + - set(self.base_roadway_network.links_df.columns) + ) + + if cols_in_changes_not_in_net: + WranglerLogger.warning( + "The following attributes are specified in the changes but do not exist in the base network: {}".format( + cols_in_changes_not_in_net + ) + ) + + change_link_dict_list = _process_link_changes(link_changes_df, changeable_col) + + # combine together + + highway_change_list = list( + filter(None, [delete_link_dict] + [add_link_dict] + change_link_dict_list) + ) + + return highway_change_list
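For orientation, a minimal sketch of how the change list returned above might be persisted. The write_highway_change_card helper, the file names, and the YAML layout are hypothetical (not part of lasso); project is assumed to be an instance of this class with its base network and log-file changes loaded.

import yaml  # assumes PyYAML is installed

def write_highway_change_card(change_list, project_name, out_path="highway_changes.yml"):
    """Wrap a list of change dictionaries in a simple card-style payload and write it to YAML."""
    # numpy scalars in the change dictionaries may need casting to native Python types first
    card = {"project": project_name, "changes": change_list}
    with open(out_path, "w") as f:
        yaml.safe_dump(card, f, default_flow_style=False, sort_keys=False)

# usage (hypothetical):
# write_highway_change_card(project.add_highway_changes(), "Example Log File Changes")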
\ No newline at end of file diff --git a/_modules/lasso/roadway.html b/_modules/lasso/roadway.html new file mode 100644 index 0000000..9c8eb50 --- /dev/null +++ b/_modules/lasso/roadway.html @@ -0,0 +1,2338 @@ + lasso.roadway — lasso documentation
Source code for lasso.roadway

+import copy
+import glob
+import os
+from typing import Optional, Union
+
+import geopandas as gpd
+import pandas as pd
+
+from geopandas import GeoDataFrame
+from pandas import DataFrame
+import numpy as np
+
+from network_wrangler import RoadwayNetwork
+from .parameters import Parameters
+from .logger import WranglerLogger
+
+
+
[docs]class ModelRoadwayNetwork(RoadwayNetwork): + """ + Subclass of network_wrangler class :ref:`RoadwayNetwork <network_wrangler:RoadwayNetwork>` + + A representation of the physical roadway network and its properties. + """ + + CALCULATED_VALUES = [ + "area_type", + "county", + "assign_group", + "centroidconnect", + ] + +
[docs] def __init__( + self, + nodes: GeoDataFrame, + links: DataFrame, + shapes: GeoDataFrame, + parameters: Union[Parameters, dict] = {}, + ): + """ + Constructor + + Args: + nodes: geodataframe of nodes + links: dataframe of links + shapes: geodataframe of shapes + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + """ + super().__init__(nodes, links, shapes) + + # will have to change if want to alter them + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.links_metcouncil_df = None + self.nodes_metcouncil_df = None
+ # self.shapes_metcouncil_df = None + ##todo also write to file + # WranglerLogger.debug("Used PARAMS\n", '\n'.join(['{}: {}'.format(k,v) for k,v in self.parameters.__dict__.items()])) + +
[docs] @staticmethod + def read( + link_file: str, + node_file: str, + shape_file: str, + fast: bool = False, + recalculate_calculated_variables=False, + recalculate_distance=False, + parameters: Union[dict, Parameters] = {}, + ): + """ + Reads in links and nodes network standard. + + Args: + link_file (str): File path to link json. + node_file (str): File path to node geojson. + shape_file (str): File path to link true shape geojson + fast (bool): boolean that will skip validation to speed up read time. + recalculate_calculated_variables (bool): calculates fields from spatial joins, etc. + recalculate_distance (bool): re-calculates distance. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + Returns: + ModelRoadwayNetwork + """ + # road_net = super().read(link_file, node_file, shape_file, fast=fast) + road_net = RoadwayNetwork.read(link_file, node_file, shape_file, fast=fast) + + m_road_net = ModelRoadwayNetwork( + road_net.nodes_df, + road_net.links_df, + road_net.shapes_df, + parameters=parameters, + ) + + if recalculate_calculated_variables: + m_road_net.create_calculated_variables() + if recalculate_distance: + m_road_net.calculate_distance(overwrite=True) + + m_road_net.fill_na() + # this method is making period values as string "NaN", need to revise. + m_road_net.split_properties_by_time_period_and_category() + for c in m_road_net.links_df.columns: + m_road_net.links_df[c] = m_road_net.links_df[c].replace("NaN", np.nan) + m_road_net.convert_int() + + return m_road_net
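A usage sketch for read(); the file paths are hypothetical placeholders for a network-standard link/node/shape set.

from lasso.roadway import ModelRoadwayNetwork

net = ModelRoadwayNetwork.read(
    link_file="examples/stpaul/link.json",      # hypothetical standard-network files
    node_file="examples/stpaul/node.geojson",
    shape_file="examples/stpaul/shape.geojson",
    fast=True,                                  # skip validation for a quicker read
)
print(net.links_df.shape, net.nodes_df.shape)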
+ +
[docs] @staticmethod + def from_RoadwayNetwork( + roadway_network_object, parameters: Union[dict, Parameters] = {} + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + roadway_network_object (RoadwayNetwork). + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + Returns: + ModelRoadwayNetwork + """ + return ModelRoadwayNetwork( + roadway_network_object.nodes_df, + roadway_network_object.links_df, + roadway_network_object.shapes_df, + parameters=parameters, + )
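Equivalently, an already-loaded network_wrangler network can be wrapped; the paths below are the same hypothetical files as above.

from network_wrangler import RoadwayNetwork
from lasso.roadway import ModelRoadwayNetwork

road_net = RoadwayNetwork.read(
    link_file="examples/stpaul/link.json",
    node_file="examples/stpaul/node.geojson",
    shape_file="examples/stpaul/shape.geojson",
    fast=True,
)
net = ModelRoadwayNetwork.from_RoadwayNetwork(road_net)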
+ +
[docs] def split_properties_by_time_period_and_category(self, properties_to_split=None): + """ + Splits properties by time period, assuming a variable structure of + + Args: + properties_to_split: dict + dictionary of output variable prefix mapped to the source variable and what to stratify it by + e.g. + { + 'trn_priority' : {'v':'trn_priority', 'times_periods':{"AM": ("6:00", "9:00"),"PM": ("16:00", "19:00")}}, + 'ttime_assert' : {'v':'ttime_assert', 'times_periods':{"AM": ("6:00", "9:00"),"PM": ("16:00", "19:00")}}, + 'lanes' : {'v':'lanes', 'times_periods':{"AM": ("6:00", "9:00"),"PM": ("16:00", "19:00")}}, + 'ML_lanes' : {'v':'ML_lanes', 'times_periods':{"AM": ("6:00", "9:00"),"PM": ("16:00", "19:00")}}, + 'price' : {'v':'price', 'times_periods':{"AM": ("6:00", "9:00"),"PM": ("16:00", "19:00")}},'categories': {"sov": ["sov", "default"],"hov2": ["hov2", "default", "sov"]}}, + 'access' : {'v':'access', 'times_periods':{"AM": ("6:00", "9:00"),"PM": ("16:00", "19:00")}}, + } + + """ + import itertools + + if properties_to_split == None: + properties_to_split = self.parameters.properties_to_split + + for out_var, params in properties_to_split.items(): + if params["v"] not in self.links_df.columns: + WranglerLogger.warning( + "Specified variable to split: {} not in network variables: {}. Returning 0.".format( + params["v"], str(self.links_df.columns) + ) + ) + if params.get("time_periods") and params.get("categories"): + + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + time_suffix + "_" + category_suffix + ] = 0 + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[out_var + "_" + time_suffix] = 0 + elif params.get("time_periods") and params.get("categories"): + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + category_suffix + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=params["categories"][category_suffix], + time_period=params["time_periods"][time_suffix], + ) + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[ + out_var + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=None, + time_period=params["time_periods"][time_suffix], + ) + else: + raise ValueError( + "Shoudn't have a category without a time period: {}".format(params) + )
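A sketch of the expected input, assuming the links table already carries a lanes column and that net is a ModelRoadwayNetwork instance as above; the time-period windows are illustrative.

properties_to_split = {
    "lanes": {
        "v": "lanes",
        "time_periods": {"AM": ("6:00", "9:00"), "PM": ("16:00", "19:00")},
    },
}
net.split_properties_by_time_period_and_category(properties_to_split)
# adds links_df columns "lanes_AM" and "lanes_PM", filled from the
# time-of-day values via get_property_by_time_period_and_group()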
+ +
[docs] def create_calculated_variables(self): + """ + Creates calculated roadway variables. + + Args: + None + """ + WranglerLogger.info("Creating calculated roadway variables.") + self.calculate_area_type() + self.calculate_county() + self.calculate_centroidconnect() + self.calculate_mpo() + self.calculate_assign_group_and_roadway_class() + self.add_counts() + self.create_ML_variable() + self.create_hov_corridor_variable() + self.create_managed_variable()
+ +
[docs] def calculate_county( + self, + county_shape=None, + county_shape_variable=None, + network_variable="county", + county_codes_dict=None, + overwrite=False, + ): + """ + Calculates county variable. + + This uses the centroid of the geometry field to determine which county it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + county_shape (str): The File path to county geodatabase. + county_shape_variable (str): The variable name of county in county geodadabase. + network_variable (str): The variable name of county in network standard. Default to "county". + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing County Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "County Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + county_shape = county_shape if county_shape else self.parameters.county_shape + + county_shape_variable = ( + county_shape_variable + if county_shape_variable + else self.parameters.county_variable_shp + ) + + WranglerLogger.info( + "Adding roadway network variable for county using a spatial join with: {}".format( + county_shape + ) + ) + + county_codes_dict = ( + county_codes_dict if county_codes_dict else self.parameters.county_code_dict + ) + if not county_codes_dict: + msg = "No county codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + county_gdf = gpd.read_file(county_shape) + county_gdf = county_gdf.to_crs(epsg=RoadwayNetwork.CRS) + joined_gdf = gpd.sjoin(centroids_gdf, county_gdf, how="left", op="intersects") + + joined_gdf[county_shape_variable] = ( + joined_gdf[county_shape_variable] + .map(county_codes_dict) + .fillna(10) + .astype(int) + ) + + self.links_df[network_variable] = joined_gdf[county_shape_variable] + + WranglerLogger.info( + "Finished Calculating county variable: {}".format(network_variable) + )
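A hedged usage sketch; the shapefile path, attribute name, and name-to-code mapping are hypothetical stand-ins for what normally comes from Parameters.

net.calculate_county(
    county_shape="data/county_boundaries.shp",   # hypothetical county polygon layer
    county_shape_variable="CO_NAME",             # hypothetical name attribute in that layer
    county_codes_dict={"Hennepin": 1, "Ramsey": 2, "Dakota": 3},
    overwrite=True,
)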
+ +
[docs] def calculate_area_type( + self, + area_type_shape=None, + area_type_shape_variable=None, + network_variable="area_type", + area_type_codes_dict=None, + downtown_area_type_shape=None, + downtown_area_type=None, + overwrite=False, + ): + """ + Calculates area type variable. + + This uses the centroid of the geometry field to determine which area it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + area_type_shape (str): The File path to area geodatabase. + area_type_shape_variable (str): The variable name of area type in area geodadabase. + network_variable (str): The variable name of area type in network standard. Default to "area_type". + area_type_codes_dict: The dictionary to map input area_type_shape_variable to network_variable + downtown_area_type_shape: The file path to the downtown area type boundary. + downtown_area_type (int): Integer value of downtown area type + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Area Type Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Area Type Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Area Type from Spatial Data and adding as roadway network variable: {}".format( + network_variable + ) + ) + + """ + Verify inputs + """ + + area_type_shape = ( + area_type_shape if area_type_shape else self.parameters.area_type_shape + ) + + if not area_type_shape: + msg = "No area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(area_type_shape): + msg = "File not found for area type shape: {}".format(area_type_shape) + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_shape_variable = ( + area_type_shape_variable + if area_type_shape_variable + else self.parameters.area_type_variable_shp + ) + + if not area_type_shape_variable: + msg = "No area type shape varible specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_codes_dict = ( + area_type_codes_dict + if area_type_codes_dict + else self.parameters.area_type_code_dict + ) + if not area_type_codes_dict: + msg = "No area type codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type_shape = ( + downtown_area_type_shape if downtown_area_type_shape else self.parameters.downtown_area_type_shape + ) + + if not downtown_area_type_shape: + msg = "No downtown area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(downtown_area_type_shape): + msg = "File not found for downtown area type shape: {}".format(downtown_area_type_shape) + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type = ( + downtown_area_type + if downtown_area_type + else self.parameters.downtown_area_type + ) + if not downtown_area_type: + msg = "No downtown area type value specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + WranglerLogger.debug("Reading Area Type Shapefile {}".format(area_type_shape)) + area_type_gdf = gpd.read_file(area_type_shape) + area_type_gdf = area_type_gdf.to_crs(epsg=RoadwayNetwork.CRS) + + 
downtown_gdf = gpd.read_file(downtown_area_type_shape) + downtown_gdf = downtown_gdf.to_crs(epsg=RoadwayNetwork.CRS) + + joined_gdf = gpd.sjoin( + centroids_gdf, area_type_gdf, how="left", op="intersects" + ) + + joined_gdf[area_type_shape_variable] = ( + joined_gdf[area_type_shape_variable] + .map(area_type_codes_dict) + .fillna(1) + .astype(int) + ) + + WranglerLogger.debug("Area Type Codes Used: {}".format(area_type_codes_dict)) + + d_joined_gdf = gpd.sjoin( + centroids_gdf, downtown_gdf, how="left", op="intersects" + ) + + d_joined_gdf['downtown_area_type'] = ( + d_joined_gdf['Id'] + .fillna(-99) + .astype(int) + ) + + joined_gdf.loc[d_joined_gdf['downtown_area_type'] == 0, area_type_shape_variable] = downtown_area_type + + WranglerLogger.debug("Downtown Area Type used boundary file: {}".format(downtown_area_type_shape)) + + self.links_df[network_variable] = joined_gdf[area_type_shape_variable] + + WranglerLogger.info( + "Finished Calculating Area Type from Spatial Data into variable: {}".format( + network_variable + ) + )
+ +
[docs] def calculate_centroidconnect( + self, + network_variable="centroidconnect", + highest_taz_number=None, + as_integer=True, + overwrite=False, + lanes_variable="lanes", + number_of_lanes=1, + ): + """ + Calculates centroid connector variable. + + Args: + network_variable (str): Variable that should be written to in the network. Default to "centroidconnect" + highest_taz_number (int): the max TAZ number in the network. + as_integer (bool): If True, will convert true/false to 1/0s. Default to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + lanes_variable (str): Variable that stores the number of lanes. Default to "lanes". + number_of_lanes (int): Number of lanes for centroid connectors. Default to 1. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Centroid Connector Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Centroid Connector Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Centroid Connector and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + highest_taz_number = ( + highest_taz_number + if highest_taz_number + else self.parameters.highest_taz_number + ) + + if not highest_taz_number: + msg = "No highest_TAZ number specified in method variable or in parameters" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug( + "Calculating Centroid Connectors using highest TAZ number: {}".format( + highest_taz_number + ) + ) + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + number_of_lanes = ( + number_of_lanes if number_of_lanes else self.parameters.centroid_connect_lanes + ) + + """ + Start actual process + """ + self.links_df[network_variable] = False + + self.links_df.loc[ + (self.links_df["A"] <= highest_taz_number) + | (self.links_df["B"] <= highest_taz_number), + network_variable, + ] = True + + self.links_df[lanes_variable] = number_of_lanes + + if as_integer: + self.links_df[network_variable] = self.links_df[network_variable].astype( + int + ) + WranglerLogger.info( + "Finished calculating centroid connector variable: {}".format( + network_variable + ) + )
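Usage sketch; the TAZ cutoff is hypothetical and would normally come from Parameters.

net.calculate_centroidconnect(
    highest_taz_number=3100,   # hypothetical: links touching node ids at or below this are flagged
    number_of_lanes=1,
)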
+ +
[docs] def calculate_mpo( + self, + county_network_variable="county", + network_variable="mpo", + as_integer=True, + mpo_counties=None, + overwrite=False, + ): + """ + Calculates mpo variable. + + Args: + county_variable (str): Name of the variable where the county names are stored. Default to "county". + network_variable (str): Name of the variable that should be written to. Default to "mpo". + as_integer (bool): If true, will convert true/false to 1/0s. + mpo_counties (list): List of county names that are within mpo region. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating MPO as roadway network variable: {}".format(network_variable) + ) + """ + Verify inputs + """ + county_network_variable = ( + county_network_variable + if county_network_variable + else self.parameters.county_network_variable + ) + + if not county_network_variable: + msg = "No variable specified as containing 'county' in the network." + WranglerLogger.error(msg) + raise ValueError(msg) + if county_network_variable not in self.links_df.columns: + msg = "Specified county network variable: {} does not exist in network. Try running or debuging county calculation." + WranglerLogger.error(msg) + raise ValueError(msg) + + mpo_counties = mpo_counties if mpo_counties else self.parameters.mpo_counties + + if not mpo_counties: + msg = "No MPO Counties specified in method call or in parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("MPO Counties: {}".format(",".join(str(mpo_counties)))) + + """ + Start actual process + """ + + mpo = self.links_df[county_network_variable].isin(mpo_counties) + + if as_integer: + mpo = mpo.astype(int) + + self.links_df[network_variable] = mpo + + WranglerLogger.info( + "Finished calculating MPO variable: {}".format(network_variable) + )
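Usage sketch; the values are hypothetical county codes as produced by calculate_county().

net.calculate_mpo(
    mpo_counties=[1, 2, 3],   # hypothetical county codes inside the MPO boundary
)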
+ +
[docs] def calculate_assign_group_and_roadway_class( + self, + assign_group_variable_name="assign_group", + road_class_variable_name="roadway_class", + mrcc_roadway_class_shape=None, + mrcc_shst_data=None, + mrcc_roadway_class_variable_shp=None, + mrcc_assgngrp_dict=None, + widot_roadway_class_shape=None, + widot_shst_data=None, + widot_roadway_class_variable_shp=None, + widot_assgngrp_dict=None, + osm_assgngrp_dict=None, + overwrite=False, + ): + """ + Calculates assignment group and roadway class variables. + + Assignment Group is used in MetCouncil's traffic assignment to segment the volume/delay curves. + Original source is from the MRCC data for the Minnesota: "route system" which is a roadway class + For Wisconsin, it is from the Wisconsin DOT database, which has a variable called "roadway category" + + There is a crosswalk between the MRCC Route System and Wisconsin DOT --> Met Council Assignment group + + This method joins the network with mrcc and widot roadway data by shst js matcher returns + + Args: + assign_group_variable_name (str): Name of the variable assignment group should be written to. Default to "assign_group". + road_class_variable_name (str): Name of the variable roadway class should be written to. Default to "roadway_class". + mrcc_roadway_class_shape (str): File path to the MRCC route system geodatabase. + mrcc_shst_data (str): File path to the MRCC SHST match return. + mrcc_roadway_class_variable_shp (str): Name of the variable where MRCC route system are stored. + mrcc_assgngrp_dict (dict): Dictionary to map MRCC route system variable to assignment group. + widot_roadway_class_shape (str): File path to the WIDOT roadway category geodatabase. + widot_shst_data (str): File path to the WIDOT SHST match return. + widot_roadway_class_variable_shp (str): Name of the variable where WIDOT roadway category are stored. + widot_assgngrp_dict (dict): Dictionary to map WIDOT roadway category variable to assignment group. + osm_assgngrp_dict (dict): Dictionary to map OSM roadway class to assignment group. + + Return: + None + """ + + update_assign_group = False + update_roadway_class = False + + WranglerLogger.info( + "Calculating Assignment Group and Roadway Class as network variables: '{}' and '{}'".format( + assign_group_variable_name, road_class_variable_name, + ) + ) + + if assign_group_variable_name in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + assign_group_variable_name + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' updated for some links. Returning without overwriting for those links. Calculating for other links".format( + assign_group_variable_name + ) + ) + update_assign_group = True + + if road_class_variable_name in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + road_class_variable_name + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' updated for some links. Returning without overwriting for those links. 
Calculating for other links".format( + road_class_variable_name + ) + ) + update_roadway_class = True + + """ + Verify inputs + """ + mrcc_roadway_class_shape = ( + mrcc_roadway_class_shape + if mrcc_roadway_class_shape + else self.parameters.mrcc_roadway_class_shape + ) + if not mrcc_roadway_class_shape: + msg = "'mrcc_roadway_class_shape' not found in method or lasso parameters.".format( + mrcc_roadway_class_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(mrcc_roadway_class_shape): + msg = "'mrcc_roadway_class_shape' not found at following location: {}.".format( + mrcc_roadway_class_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + widot_roadway_class_shape = ( + widot_roadway_class_shape + if widot_roadway_class_shape + else self.parameters.widot_roadway_class_shape + ) + if not widot_roadway_class_shape: + msg = "'widot_roadway_class_shape' not found in method or lasso parameters.".format( + widot_roadway_class_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(widot_roadway_class_shape): + msg = "'widot_roadway_class_shape' not found at following location: {}.".format( + widot_roadway_class_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + mrcc_shst_data = ( + mrcc_shst_data if mrcc_shst_data else self.parameters.mrcc_shst_data + ) + if not mrcc_shst_data: + msg = "'mrcc_shst_data' not found in method or lasso parameters.".format( + mrcc_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(mrcc_shst_data): + msg = "'mrcc_shst_data' not found at following location: {}.".format( + mrcc_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + widot_shst_data = ( + widot_shst_data if widot_shst_data else self.parameters.widot_shst_data + ) + if not widot_shst_data: + msg = "'widot_shst_data' not found in method or lasso parameters.".format( + widot_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(widot_shst_data): + msg = "'widot_shst_data' not found at following location: {}.".format( + widot_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + mrcc_roadway_class_variable_shp = ( + mrcc_roadway_class_variable_shp + if mrcc_roadway_class_variable_shp + else self.parameters.mrcc_roadway_class_variable_shp + ) + if not mrcc_roadway_class_variable_shp: + msg = "'mrcc_roadway_class_variable_shp' not found in method or lasso parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + widot_roadway_class_variable_shp = ( + widot_roadway_class_variable_shp + if widot_roadway_class_variable_shp + else self.parameters.widot_roadway_class_variable_shp + ) + if not widot_roadway_class_variable_shp: + msg = "'widot_roadway_class_variable_shp' not found in method or lasso parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + mrcc_assgngrp_dict = ( + mrcc_assgngrp_dict + if mrcc_assgngrp_dict + else self.parameters.mrcc_assgngrp_dict + ) + if not mrcc_assgngrp_dict: + msg = "'mrcc_assgngrp_dict' not found in method or lasso parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + widot_assgngrp_dict = ( + widot_assgngrp_dict + if widot_assgngrp_dict + else self.parameters.widot_assgngrp_dict + ) + if not widot_assgngrp_dict: + msg = "'widot_assgngrp_dict' not found in method or lasso parameters." 
+ WranglerLogger.error(msg) + raise ValueError(msg) + + osm_assgngrp_dict = ( + osm_assgngrp_dict + if osm_assgngrp_dict + else self.parameters.osm_assgngrp_dict + ) + if not osm_assgngrp_dict: + msg = "'osm_assgngrp_dict' not found in method or lasso parameters.".format( + osm_assgngrp_dict + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + + WranglerLogger.debug("Calculating Centroid Connectors") + self.calculate_centroidconnect() + + WranglerLogger.debug( + "Reading MRCC Shapefile: {}".format(mrcc_roadway_class_shape) + ) + mrcc_gdf = gpd.read_file(mrcc_roadway_class_shape) + WranglerLogger.debug("MRCC GDF Columns\n{}".format(mrcc_gdf.columns)) + #'LINK_ID', 'ROUTE_SYS', 'ST_CONCAT', 'geometry' + mrcc_gdf["LINK_ID"] = range(1, 1 + len(mrcc_gdf)) + # returns shstreets dataframe with geometry ID, pp_link_id (which is the LINK_ID) + + # shstReferenceId,shstGeometryId,pp_link_id + mrcc_shst_ref_df = pd.read_csv(mrcc_shst_data) + WranglerLogger.debug( + "mrcc shst ref df columns\n{}".format(mrcc_shst_ref_df.columns) + ) + + widot_gdf = gpd.read_file(widot_roadway_class_shape) + widot_gdf["LINK_ID"] = range(1, 1 + len(widot_gdf)) + WranglerLogger.debug("WiDOT GDF Columns\n{}".format(widot_gdf.columns)) + widot_shst_ref_df = ModelRoadwayNetwork.read_match_result(widot_shst_data) + WranglerLogger.debug( + "widot shst ref df columns".format(widot_shst_ref_df.columns) + ) + # join MRCC geodataframe with MRCC shared street return to get MRCC route_sys and shared street geometry id + # + # get route_sys from MRCC + # end up with OSM data with MRCC attributes + join_gdf = ModelRoadwayNetwork.get_attribute( + self.links_df, + "shstGeometryId", + mrcc_shst_ref_df, + mrcc_gdf, + mrcc_roadway_class_variable_shp, + ) + + # for exporting mrcc_id + if "mrcc_id" in self.links_df.columns: + join_gdf.drop(["source_link_id"], axis=1, inplace=True) + else: + join_gdf.rename(columns={"source_link_id" : "mrcc_id"}, inplace=True) + + join_gdf = ModelRoadwayNetwork.get_attribute( + join_gdf, + "shstGeometryId", + widot_shst_ref_df, + widot_gdf, + widot_roadway_class_variable_shp, + ) + + osm_asgngrp_crosswalk_df = pd.read_csv(osm_assgngrp_dict) + mrcc_asgngrp_crosswalk_df = pd.read_csv( + mrcc_assgngrp_dict, dtype={mrcc_roadway_class_variable_shp: str} + ) + widot_asgngrp_crosswak_df = pd.read_csv(widot_assgngrp_dict) + + join_gdf = pd.merge( + join_gdf, + osm_asgngrp_crosswalk_df.rename( + columns={ + "assign_group": "assignment_group_osm", + "roadway_class": "roadway_class_osm" + } + ), + how="left", + on="roadway", + ) + + join_gdf = pd.merge( + join_gdf, + mrcc_asgngrp_crosswalk_df.rename( + columns={ + "assign_group": "assignment_group_mrcc", + "roadway_class": "roadway_class_mrcc" + } + ), + how="left", + on=mrcc_roadway_class_variable_shp, + ) + + join_gdf = pd.merge( + join_gdf, + widot_asgngrp_crosswak_df.rename( + columns={ + "assign_group": "assignment_group_widot", + "roadway_class": "roadway_class_widot" + } + ), + how="left", + on=widot_roadway_class_variable_shp, + ) + + def _set_asgngrp(x): + try: + if x.centroidconnect == 1: + return 9 + elif x.bus_only == 1: + return 98 + elif x.rail_only == 1: + return 100 + elif x.drive_access == 0: + return 101 + elif x.assignment_group_mrcc > 0: + return int(x.assignment_group_mrcc) + elif x.assignment_group_widot > 0: + return int(x.assignment_group_widot) + else: + return int(x.assignment_group_osm) + except: + return 0 + + join_gdf[assign_group_variable_name] = join_gdf.apply(lambda x: _set_asgngrp(x), axis=1) + + 
def _set_roadway_class(x): + try: + if x.centroidconnect == 1: + return 99 + elif x.bus_only == 1: + return 50 + elif x.rail_only == 1: + return 100 + elif x.drive_access == 0: + return 101 + elif x.roadway_class_mrcc > 0: + return int(x.roadway_class_mrcc) + elif x.roadway_class_widot > 0: + return int(x.roadway_class_widot) + else: + return int(x.roadway_class_osm) + except: + return 0 + + join_gdf[road_class_variable_name] = join_gdf.apply(lambda x: _set_roadway_class(x), axis=1) + + if "mrcc_id" in self.links_df.columns: + columns_from_source = ["model_link_id"] + else: + columns_from_source=[ + "model_link_id", + "mrcc_id", + mrcc_roadway_class_variable_shp, + widot_roadway_class_variable_shp, + ] + + if update_assign_group: + join_gdf.rename( + columns={ + assign_group_variable_name: assign_group_variable_name + "_cal" + }, + inplace=True + ) + self.links_df = pd.merge( + self.links_df, + join_gdf[columns_from_source + [assign_group_variable_name + "_cal"]], + how="left", + on="model_link_id", + ) + self.links_df[assign_group_variable_name] = np.where( + self.links_df[assign_group_variable_name] > 0, + self.links_df[assign_group_variable_name], + self.links_df[assign_group_variable_name + "_cal"], + ) + self.links_df.drop(assign_group_variable_name + "_cal", axis=1, inplace=True) + else: + self.links_df = pd.merge( + self.links_df, + join_gdf[columns_from_source + [assign_group_variable_name]], + how = "left", + on = "model_link_id", + ) + + if "mrcc_id" in self.links_df.columns: + columns_from_source = ["model_link_id"] + else: + columns_from_source = [ + "model_link_id", + "mrcc_id", + mrcc_roadway_class_variable_shp, + widot_roadway_class_variable_shp, + ] + + + if update_roadway_class: + join_gdf.rename( + columns={ + road_class_variable_name: road_class_variable_name + "_cal" + }, + inplace=True + ) + self.links_df = pd.merge( + self.links_df, + join_gdf[columns_from_source + [road_class_variable_name + "_cal"]], + how="left", + on="model_link_id", + ) + self.links_df[road_class_variable_name] = np.where( + self.links_df[road_class_variable_name] > 0, + self.links_df[road_class_variable_name], + self.links_df[road_class_variable_name + "_cal"], + ) + self.links_df.drop(road_class_variable_name + "_cal", axis=1, inplace=True) + else: + self.links_df = pd.merge( + self.links_df, + join_gdf[columns_from_source + [road_class_variable_name]], + how="left", + on="model_link_id", + ) + + WranglerLogger.info( + "Finished calculating assignment group variable {} and roadway class variable {}".format( + assign_group_variable_name, road_class_variable_name, + ) + )
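A usage sketch; every path and attribute name below is a hypothetical placeholder for the MRCC, WIDOT, and OSM lookup inputs that normally come from Parameters.

net.calculate_assign_group_and_roadway_class(
    mrcc_roadway_class_shape="data/mrcc/centerlines.shp",
    mrcc_shst_data="data/mrcc/mrcc_shst_match.csv",
    mrcc_roadway_class_variable_shp="ROUTE_SYS",
    mrcc_assgngrp_dict="data/lookups/mrcc_asgngrp.csv",
    widot_roadway_class_shape="data/widot/roadway_category.shp",
    widot_shst_data="data/widot/widot_shst_match.geojson",
    widot_roadway_class_variable_shp="RDWY_CTGY",
    widot_assgngrp_dict="data/lookups/widot_asgngrp.csv",
    osm_assgngrp_dict="data/lookups/osm_asgngrp.csv",
)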
+ +
[docs] def add_variable_using_shst_reference( + self, + var_shst_csvdata=None, + shst_csv_variable=None, + network_variable=None, + network_var_type=int, + overwrite=False, + ): + """ + Join network links with source data, via SHST API node match result. + + Args: + var_shst_csvdata (str): File path to SHST API return. + shst_csv_variable (str): Variable name in the source data. + network_variable (str): Name of the variable that should be written to. + network_var_type : Variable type in the written network. + overwrite (bool): True is overwriting existing variable. Default to False. + + Returns: + None + + """ + WranglerLogger.info( + "Adding Variable {} using Shared Streets Reference from {}".format( + network_variable, var_shst_csvdata + ) + ) + + var_shst_df = pd.read_csv(var_shst_csvdata) + + if "shstReferenceId" not in var_shst_df.columns: + msg = "'shstReferenceId' required but not found in {}".format(var_shst_data) + WranglerLogger.error(msg) + raise ValueError(msg) + + if shst_csv_variable not in var_shst_df.columns: + msg = "{} required but not found in {}".format( + shst_csv_variable, var_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + join_gdf = pd.merge( + self.links_df, + var_shst_df[["shstReferenceId", shst_csv_variable]], + how="left", + on="shstReferenceId", + ) + + join_gdf[shst_csv_variable].fillna(0, inplace=True) + + if network_variable in self.links_df.columns and not overwrite: + join_gdf.loc[join_gdf[network_variable] > 0, network_variable] = join_gdf[ + shst_csv_variable + ].astype(network_var_type) + else: + join_gdf[network_variable] = join_gdf[shst_csv_variable].astype( + network_var_type + ) + + self.links_df[network_variable] = join_gdf[network_variable] + + WranglerLogger.info( + "Added variable: {} using Shared Streets Reference".format(network_variable) + )
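Usage sketch; the CSV path and column name are hypothetical SHST match outputs.

net.add_variable_using_shst_reference(
    var_shst_csvdata="data/aadt_shst_match.csv",   # hypothetical SHST API match return
    shst_csv_variable="AADT_2017",                 # hypothetical column in that csv
    network_variable="AADT",
    network_var_type=int,
    overwrite=True,
)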
+ +
[docs] def add_counts( + self, + network_variable="AADT", + mndot_count_shst_data=None, + widot_count_shst_data=None, + mndot_count_variable_shp=None, + widot_count_variable_shp=None, + ): + + """ + Adds count variable. + + join the network with count node data, via SHST API node match result + + Args: + network_variable (str): Name of the variable that should be written to. Default to "AADT". + mndot_count_shst_data (str): File path to MNDOT count location SHST API node match result. + widot_count_shst_data (str): File path to WIDOT count location SHST API node match result. + mndot_count_variable_shp (str): File path to MNDOT count location geodatabase. + widot_count_variable_shp (str): File path to WIDOT count location geodatabase. + + Returns: + None + """ + + WranglerLogger.info("Adding Counts") + + """ + Verify inputs + """ + + mndot_count_shst_data = ( + mndot_count_shst_data + if mndot_count_shst_data + else self.parameters.mndot_count_shst_data + ) + widot_count_shst_data = ( + widot_count_shst_data + if widot_count_shst_data + else self.parameters.widot_count_shst_data + ) + mndot_count_variable_shp = ( + mndot_count_variable_shp + if mndot_count_variable_shp + else self.parameters.mndot_count_variable_shp + ) + widot_count_variable_shp = ( + widot_count_variable_shp + if widot_count_variable_shp + else self.parameters.widot_count_variable_shp + ) + + for varname, var in { + "mndot_count_shst_data": mndot_count_shst_data, + "widot_count_shst_data": widot_count_shst_data, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(var): + msg = "{}' not found at following location: {}.".format(varname, var) + WranglerLogger.error(msg) + raise ValueError(msg) + + for varname, var in { + "mndot_count_variable_shp": mndot_count_variable_shp, + "widot_count_variable_shp": widot_count_variable_shp, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + WranglerLogger.debug( + "Adding MNDOT Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + mndot_count_shst_data, mndot_count_variable_shp, network_variable + ) + ) + # Add Minnesota Counts + self.add_variable_using_shst_reference( + var_shst_csvdata=mndot_count_shst_data, + shst_csv_variable=mndot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=True, + ) + WranglerLogger.debug( + "Adding WiDot Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + widot_count_shst_data, widot_count_variable_shp, network_variable + ) + ) + # Add Wisconsin Counts, but don't overwrite Minnesota + self.add_variable_using_shst_reference( + var_shst_csvdata=widot_count_shst_data, + shst_csv_variable=widot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=False, + ) + + self.links_df["count_AM"] = self.links_df[network_variable] / 4 + self.links_df["count_MD"] = self.links_df[network_variable] / 4 + self.links_df["count_PM"] = self.links_df[network_variable] / 4 + self.links_df["count_NT"] = self.links_df[network_variable] / 4 + + self.links_df["count_daily"] = self.links_df[network_variable] + self.links_df["count_year"] = 2017 + + WranglerLogger.info( + "Finished adding counts variable: {}".format(network_variable) + )
+ +
[docs] @staticmethod + def read_match_result(path): + """ + Reads SHST geojson match returns. + + Globs the given path, reads each matching file, and concatenates them + into a single GeoDataFrame. + + Args: + path (str): File path (glob pattern) to SHST match results. + + Returns: + geodataframe: geopandas geodataframe + + ##todo: this is a utility and should live outside this class + """ + refId_gdf = DataFrame() + refid_file = glob.glob(path) + for i in refid_file: + new = gpd.read_file(i) + refId_gdf = pd.concat([refId_gdf, new], ignore_index=True, sort=False) + return refId_gdf
+ +
[docs] @staticmethod + def get_attribute( + links_df, + join_key, # either "shstReferenceId", or "shstGeometryId", tests showed the latter gave better coverage + source_shst_ref_df, # source shst refId + source_gdf, # source dataframe + field_name, # , # targetted attribute from source + ): + """ + Gets attribute from source data using SHST match result. + + Args: + links_df (dataframe): The network dataframe that new attribute should be written to. + join_key (str): SHST ID variable name used to join source data with network dataframe. + source_shst_ref_df (str): File path to source data SHST match result. + source_gdf (str): File path to source data. + field_name (str): Name of the attribute to get from source data. + + Returns: + None + """ + # join based on shared streets geometry ID + # pp_link_id is shared streets match return + # source_ink_id is mrcc + WranglerLogger.debug( + "source ShSt rename_variables_for_dbf columns\n{}".format( + source_shst_ref_df.columns + ) + ) + WranglerLogger.debug("source gdf columns\n{}".format(source_gdf.columns)) + # end up with OSM network with the MRCC Link ID + # could also do with route_sys...would that be quicker? + join_refId_df = pd.merge( + links_df, + source_shst_ref_df[[join_key, "pp_link_id", "score"]].rename( + columns={"pp_link_id": "source_link_id", "score": "source_score"} + ), + how="left", + on=join_key, + ) + + # joined with MRCC dataframe to get route_sys + + join_refId_df = pd.merge( + join_refId_df, + source_gdf[["LINK_ID", field_name]].rename( + columns={"LINK_ID": "source_link_id"} + ), + how="left", + on="source_link_id", + ) + + # drop duplicated records with same field value + + join_refId_df.drop_duplicates( + subset=["model_link_id", "shstReferenceId", field_name], inplace=True + ) + + # more than one match, take the best score + + join_refId_df.sort_values( + by=["model_link_id", "source_score"], + ascending=True, + na_position="first", + inplace=True, + ) + + join_refId_df.drop_duplicates( + subset=["model_link_id"], keep="last", inplace=True + ) + + # self.links_df[field_name] = join_refId_df[field_name] + + return join_refId_df[links_df.columns.tolist() + [field_name, "source_link_id"]]
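A toy illustration of the two-step join (all values are made up); because get_attribute is a staticmethod it can be exercised with small DataFrames directly.

import pandas as pd
from lasso.roadway import ModelRoadwayNetwork

links_df = pd.DataFrame(
    {"model_link_id": [1], "shstReferenceId": ["ref-1"], "shstGeometryId": ["geom-1"]}
)
shst_ref_df = pd.DataFrame(
    {"shstGeometryId": ["geom-1"], "pp_link_id": [100], "score": [0.9]}
)
source_gdf = pd.DataFrame({"LINK_ID": [100], "ROUTE_SYS": ["01"]})

out_df = ModelRoadwayNetwork.get_attribute(
    links_df, "shstGeometryId", shst_ref_df, source_gdf, "ROUTE_SYS"
)
# out_df keeps the original link columns and adds "ROUTE_SYS" and "source_link_id"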
+ +
[docs] def calculate_hov( + self, network_variable="HOV", as_integer=True, overwrite=False, + ): + """ + Calculates hov variable. + + Args: + network_variable (str): Variable that should be written to in the network. Default to "HOV" + as_integer (bool): If True, will convert true/false to 1/0s. Defauly to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "hov Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating hov and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + self.links_df[network_variable] = 0 + + self.links_df.loc[ + (self.links_df["assign_group"] == 8) | (self.links_df["access"] == "hov"), + network_variable, + ] = 100 + + if as_integer: + self.links_df[network_variable] = self.links_df[network_variable].astype( + int + ) + WranglerLogger.info( + "Finished calculating hov variable: {}".format(network_variable) + )
+ +
[docs] def create_ML_variable( + self, network_variable="ML_lanes", overwrite=False, + ): + """ + Created ML lanes placeholder for project to write out ML changes + + ML lanes default to 0, ML info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing ML Variable '{}' already in network".format( + network_variable + ) + ) + self.links_df[network_variable] = int(0) + else: + WranglerLogger.info( + "ML Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + WranglerLogger.info( + "Finished creating ML lanes variable: {}".format(network_variable) + )
+ +
[docs] def create_hov_corridor_variable( + self, network_variable="segment_id", overwrite=False, + ): + """ + Created hov corridor placeholder for project to write out corridor changes + + hov corridor id default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov corridor Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Hov corridor Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating hov corridor variable: {}".format(network_variable) + )
+ +
[docs] def create_managed_variable( + self, network_variable="managed", overwrite=False, + ): + """ + Created placeholder for project to write out managed + + managed default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing managed Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Managed Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating managed variable: {}".format(network_variable) + )
+ +
[docs] def calculate_distance( + self, network_variable="distance", centroidconnect_only=True, overwrite=False + ): + """ + calculate link distance in miles + + Args: + centroidconnect_only (Bool): True if calculating distance for centroidconnectors only. Default to True. + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing distance Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Distance Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + if "centroidconnect" not in self.links_df: + msg = "No variable specified for centroid connector, calculating centroidconnect first" + WranglerLogger.info(msg) + self.calculate_centroidconnect() + + """ + Start actual process + """ + + temp_links_gdf = self.links_df.copy() + temp_links_gdf.crs = "EPSG:4326" + temp_links_gdf = temp_links_gdf.to_crs(epsg=26915) + + if centroidconnect_only: + WranglerLogger.info( + "Calculating {} for centroid connectors".format(network_variable) + ) + temp_links_gdf[network_variable] = np.where( + temp_links_gdf.centroidconnect == 1, + temp_links_gdf.geometry.length / 1609.34, + temp_links_gdf[network_variable], + ) + else: + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + + self.links_df[network_variable] = temp_links_gdf[network_variable]
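Usage sketch for recomputing link length in miles for the whole network.

net.calculate_distance(centroidconnect_only=False, overwrite=True)
# lengths are measured in EPSG:26915 (meters) and divided by 1609.34 to give miles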
+ +
[docs] def convert_int(self, int_col_names=[]): + """ + Convert integer columns + """ + + WranglerLogger.info("Converting variable type to MetCouncil standard") + + if not int_col_names: + int_col_names = self.parameters.int_col + + ##Why are we doing this? + # int_col_names.remove("lanes") + + for c in list(set(self.links_df.columns) & set(int_col_names)): + try: + self.links_df[c] = self.links_df[c].replace(np.nan, 0) + self.links_df[c] = self.links_df[c].astype(int) + except: + self.links_df[c] = self.links_df[c].astype(float) + self.links_df[c] = self.links_df[c].astype(int) + + for c in list(set(self.nodes_df.columns) & set(int_col_names)): + self.nodes_df[c] = self.nodes_df[c].astype(int)
+ +
[docs] def fill_na(self): + """ + Fill na values from create_managed_lane_network() + """ + + WranglerLogger.info("Filling nan for network from network wrangler") + + num_col = self.parameters.int_col + self.parameters.float_col + + for x in list(self.links_df.columns): + if x in num_col: + self.links_df[x].fillna(0, inplace=True) + self.links_df[x] = self.links_df[x].apply( + lambda k: 0 if k in [np.nan, "", float("nan"), "NaN"] else k + ) + + else: + self.links_df[x].fillna("", inplace=True) + + for x in list(self.nodes_df.columns): + if x in num_col: + self.nodes_df[x].fillna(0, inplace=True) + else: + self.nodes_df[x].fillna("", inplace=True)
+ +
[docs] def roadway_standard_to_met_council_network(self, output_epsg=None): + """ + Rename and format roadway attributes to be consistent with what metcouncil's model is expecting. + + Args: + output_epsg (int): epsg number of output network. + + Returns: + None + """ + + WranglerLogger.info( + "Renaming roadway attributes to be consistent with what metcouncil's model is expecting" + ) + + """ + Verify inputs + """ + + output_epsg = output_epsg if output_epsg else self.parameters.output_epsg + + """ + Start actual process + """ + if "managed" in self.links_df.columns: + WranglerLogger.info("Creating managed lane network.") + self.create_managed_lane_network(in_place=True) + + # when ML and assign_group projects are applied together, assign_group is filled as "" by wrangler for ML links + for c in ModelRoadwayNetwork.CALCULATED_VALUES: + if c in self.links_df.columns and c in self.parameters.int_col: + self.links_df[c] = self.links_df[c].replace("", 0) + else: + WranglerLogger.info("Didn't detect managed lanes in network.") + + self.create_calculated_variables() + self.calculate_distance(overwrite=True) + + self.fill_na() + # no method to calculate price yet, will be hard coded in project card + WranglerLogger.info("Splitting variables by time period and category") + self.split_properties_by_time_period_and_category() + self.convert_int() + + self.links_metcouncil_df = self.links_df.copy() + self.nodes_metcouncil_df = self.nodes_df.copy() + + self.links_metcouncil_df = pd.merge( + self.links_metcouncil_df.drop("geometry", axis = 1), # drop the stick geometry in links_df + self.shapes_df[["shape_id", "geometry"]], + how = "left", + on = "shape_id" + ) + + self.links_metcouncil_df.crs = "EPSG:4326" + self.nodes_metcouncil_df.crs = "EPSG:4326" + WranglerLogger.info("Setting Coordinate Reference System to EPSG 26915") + self.links_metcouncil_df = self.links_metcouncil_df.to_crs(epsg=26915) + self.nodes_metcouncil_df = self.nodes_metcouncil_df.to_crs(epsg=26915) + + self.nodes_metcouncil_df["X"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.x + ) + self.nodes_metcouncil_df["Y"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.y + ) + + # CUBE expect node id to be N + self.nodes_metcouncil_df.rename(columns={"model_node_id": "N"}, inplace=True)
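Usage sketch; after this call the *_metcouncil_df tables hold the renamed, split, and re-projected data that the write methods consume.

net.roadway_standard_to_met_council_network()
# net.links_metcouncil_df / net.nodes_metcouncil_df are now in EPSG:26915,
# with node X/Y columns populated and the node id renamed to "N" for Cube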
+ +
[docs] def rename_variables_for_dbf( + self, + input_df, + variable_crosswalk: str = None, + output_variables: list = None, + convert_geometry_to_xy=False, + ): + """ + Rename attributes for DBF/SHP, make sure length within 10 chars. + + Args: + input_df (dataframe): Network standard DataFrame. + variable_crosswalk (str): File path to variable name crosswalk from network standard to DBF names. + output_variables (list): List of strings for DBF variables. + convert_geometry_to_xy (bool): True if converting node geometry to X/Y + + Returns: + dataframe + + """ + WranglerLogger.info("Renaming variables so that they are DBF-safe") + + """ + Verify inputs + """ + + variable_crosswalk = ( + variable_crosswalk + if variable_crosswalk + else self.parameters.net_to_dbf_crosswalk + ) + + output_variables = ( + output_variables if output_variables else self.parameters.output_variables + ) + + """ + Start actual process + """ + + crosswalk_df = pd.read_csv(variable_crosswalk) + WranglerLogger.debug( + "Variable crosswalk: {} \n {}".format(variable_crosswalk, crosswalk_df) + ) + net_to_dbf_dict = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"])) + + dbf_name_list = [] + + dbf_df = copy.deepcopy(input_df) + + # only write out variables that we specify + # if variable is specified in the crosswalk, rename it to that variable + for c in dbf_df.columns: + if c in output_variables: + try: + dbf_df.rename(columns={c: net_to_dbf_dict[c]}, inplace=True) + dbf_name_list += [net_to_dbf_dict[c]] + except: + dbf_name_list += [c] + + if "geometry" in dbf_df.columns: + if str(dbf_df["geometry"].iloc[0].geom_type) == "Point": + dbf_df["X"] = dbf_df.geometry.apply(lambda g: g.x) + dbf_df["Y"] = dbf_df.geometry.apply(lambda g: g.y) + dbf_name_list += ["X", "Y"] + + WranglerLogger.debug("DBF Variables: {}".format(",".join(dbf_name_list))) + + return dbf_df[dbf_name_list]
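Usage sketch, assuming roadway_standard_to_met_council_network() has already been run; the crosswalk CSV is hypothetical and needs "net" and "dbf" columns mapping standard names to 10-character DBF names.

links_dbf_df = net.rename_variables_for_dbf(
    net.links_metcouncil_df,
    variable_crosswalk="data/net_to_dbf.csv",   # hypothetical, e.g. model_link_id -> MODEL_ID
    output_variables=["model_link_id", "assign_group", "geometry"],
)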
+ +
[docs] def write_roadway_as_shp( + self, + node_output_variables: list = None, + link_output_variables: list = None, + data_to_csv: bool = True, + data_to_dbf: bool = False, + output_link_shp: str = None, + output_node_shp: str = None, + output_link_csv: str = None, + output_node_csv: str = None, + ): + """ + Write out dbf/shp for cube. Write out csv in addition to shp with full length variable names. + + Args: + node_output_variables (list): List of strings for node output variables. + link_output_variables (list): List of strings for link output variables. + data_to_csv (bool): True if write network in csv format. + data_to_dbf (bool): True if write network in dbf/shp format. + output_link_shp (str): File path to output link dbf/shp. + output_node_shp (str): File path to output node dbf/shp. + output_link_csv (str): File path to output link csv. + output_node_csv (str): File path to output node csv. + + Returns: + None + """ + + WranglerLogger.info("Writing Network as Shapefile") + WranglerLogger.debug( + "Output Variables: \n - {}".format( + "\n - ".join(self.parameters.output_variables) + ) + ) + + """ + Verify inputs + """ + + if self.nodes_metcouncil_df is None: + self.roadway_standard_to_met_council_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_metcouncil_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_metcouncil_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_metcouncil_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_metcouncil_df.columns + if c in self.parameters.output_variables + ] + ) + + # unless specified that all the data goes to the DBF, only output A and B + dbf_link_output_variables = ( + link_output_variables if data_to_dbf else ["A", "B", "shape_id", "geometry"] + ) + + output_link_shp = ( + output_link_shp if output_link_shp else self.parameters.output_link_shp + ) + + output_node_shp = ( + output_node_shp if output_node_shp else self.parameters.output_node_shp + ) + + output_link_csv = ( + output_link_csv if output_link_csv else self.parameters.output_link_csv + ) + + output_node_csv = ( + output_node_csv if output_node_csv else self.parameters.output_node_csv + ) + + """ + Start Process + """ + + WranglerLogger.info("Renaming DBF Node Variables") + nodes_dbf_df = self.rename_variables_for_dbf( + self.nodes_metcouncil_df, output_variables=node_output_variables + ) + WranglerLogger.info("Renaming DBF Link Variables") + links_dbf_df = self.rename_variables_for_dbf( + self.links_metcouncil_df, output_variables=dbf_link_output_variables + ) + + links_dbf_df = gpd.GeoDataFrame(links_dbf_df, geometry = links_dbf_df["geometry"]) + + WranglerLogger.info("Writing Node Shapes:\n - {}".format(output_node_shp)) + nodes_dbf_df.to_file(output_node_shp) + WranglerLogger.info("Writing Link Shapes:\n - {}".format(output_link_shp)) + links_dbf_df.to_file(output_link_shp) + + if data_to_csv: + WranglerLogger.info( + "Writing Network Data to CSVs:\n - {}\n - {}".format( + output_link_csv, output_node_csv + ) + ) + self.links_metcouncil_df[link_output_variables].to_csv( + output_link_csv, index=False + ) + self.nodes_metcouncil_df[node_output_variables].to_csv( + output_node_csv, index=False + )
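A hedged usage sketch of the shapefile export; when the path arguments are omitted they fall back to the corresponding entries on the Parameters instance, so only the flags are shown here.

# Writes node and link shapefiles plus, optionally, full-variable-name CSVs.
# With data_to_dbf=False the link DBF only carries A, B, shape_id and geometry;
# the remaining attributes go to the CSVs.
model_net.write_roadway_as_shp(
    data_to_csv=True,
    data_to_dbf=False,
)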
+ + # this should be moved to util +
[docs] @staticmethod + def dataframe_to_fixed_width(df): + """ + Convert dataframe to fixed width format, geometry column will not be transformed. + + Args: + df (pandas DataFrame). + + Returns: + pandas dataframe: dataframe with fixed width for each column. + dict: dictionary with columns names as keys, column width as values. + """ + WranglerLogger.info("Starting fixed width conversion") + + # get the max length for each variable column + max_width_dict = dict( + [ + (v, df[v].apply(lambda r: len(str(r)) if r != None else 0).max()) + for v in df.columns.values + if v != "geometry" + ] + ) + + fw_df = df.drop("geometry", axis=1).copy() + for c in fw_df.columns: + fw_df[c] = fw_df[c].apply(lambda x: str(x)) + fw_df['pad'] = fw_df[c].apply(lambda x: " " * (max_width_dict[c] - len(x))) + fw_df[c] = fw_df.apply(lambda x: x['pad'] + x[c], axis=1) + + return fw_df, max_width_dict
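Because this is a static helper it can be exercised on a toy frame. A small self-contained sketch follows; the import path and the column names are assumptions for illustration, and note that the method expects a geometry column, which it drops before padding.

import pandas as pd
from shapely.geometry import Point

from lasso import ModelRoadwayNetwork  # assumed package-level export

df = pd.DataFrame(
    {
        "N": [1, 22, 333],
        "COUNTY": ["Hennepin", "Ramsey", "Anoka"],
        "geometry": [Point(0, 0), Point(1, 1), Point(2, 2)],  # dropped internally
    }
)

fw_df, width_dict = ModelRoadwayNetwork.dataframe_to_fixed_width(df)

print(width_dict)           # {'N': 3, 'COUNTY': 8} -- max string length per column
print(fw_df["N"].tolist())  # ['  1', ' 22', '333'] -- values padded to the column width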
+ +
[docs] def write_roadway_as_fixedwidth( + self, + node_output_variables: list = None, + link_output_variables: list = None, + output_link_txt: str = None, + output_node_txt: str = None, + output_link_header_width_txt: str = None, + output_node_header_width_txt: str = None, + output_cube_network_script: str = None, + drive_only: bool = False, + ): + """ + Writes out fixed width file. + + This function does: + 1. write out link and node fixed width data files for cube. + 2. write out header and width correspondence. + 3. write out cube network building script with header and width specification. + + Args: + node_output_variables (list): list of node variable names. + link_output_variables (list): list of link variable names. + output_link_txt (str): File path to output link database. + output_node_txt (str): File path to output node database. + output_link_header_width_txt (str): File path to link column width records. + output_node_header_width_txt (str): File path to node column width records. + output_cube_network_script (str): File path to CUBE network building script. + drive_only (bool): If True, only writes drive nodes and links + + Returns: + None + + """ + + """ + Verify inputs + """ + + if self.nodes_metcouncil_df is None: + self.roadway_standard_to_met_council_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_metcouncil_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_metcouncil_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_metcouncil_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_metcouncil_df.columns + if c in self.parameters.output_variables + ] + ) + + output_link_txt = ( + output_link_txt if output_link_txt else self.parameters.output_link_txt + ) + + output_node_txt = ( + output_node_txt if output_node_txt else self.parameters.output_node_txt + ) + + output_link_header_width_txt = ( + output_link_header_width_txt + if output_link_header_width_txt + else self.parameters.output_link_header_width_txt + ) + + output_node_header_width_txt = ( + output_node_header_width_txt + if output_node_header_width_txt + else self.parameters.output_node_header_width_txt + ) + + output_cube_network_script = ( + output_cube_network_script + if output_cube_network_script + else self.parameters.output_cube_network_script + ) + + """ + Start Process + """ + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_metcouncil_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df["drive_access"] == 1] + + WranglerLogger.info("Writing out link database") + + link_ff_df.to_csv(output_link_txt, sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out link header and width ----") + link_max_width_df = DataFrame( + list(link_max_width_dict.items()), columns=["header", "width"] + ) + link_max_width_df.to_csv(output_link_header_width_txt, index=False) + + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_metcouncil_df[node_output_variables] + ) + WranglerLogger.info("Writing out node database") + + if drive_only: + node_ff_df = node_ff_df.loc[node_ff_df["drive_node"] == 1] + + node_ff_df.to_csv(output_node_txt, sep=";", 
index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out node header and width") + node_max_width_df = DataFrame( + list(node_max_width_dict.items()), columns=["header", "width"] + ) + node_max_width_df.to_csv(output_node_header_width_txt, index=False) + + # write out cube script + s = 'RUN PGM = NETWORK MSG = "Read in network from fixed width file" \n' + s += "FILEI LINKI[1] = %LINK_DATA_PATH%," + start_pos = 1 + for i in range(len(link_max_width_df)): + s += " VAR=" + link_max_width_df.header.iloc[i] + + if ( + self.links_metcouncil_df.dtypes.loc[link_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(link_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(link_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += link_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += "\n" + s += "FILEI NODEI[1] = %NODE_DATA_PATH%," + start_pos = 1 + for i in range(len(node_max_width_df)): + s += " VAR=" + node_max_width_df.header.iloc[i] + + if ( + self.nodes_metcouncil_df.dtypes.loc[node_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(node_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(node_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += node_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += "\n" + s += 'FILEO NETO = "%SCENARIO_DIR%/complete_network.net" \n\n ZONES = %zones% \n\n' + s += "ROADWAY = LTRIM(TRIM(ROADWAY)) \n" + s += "NAME = LTRIM(TRIM(NAME)) \n" + s += "\n \nENDRUN" + + with open(output_cube_network_script, "w") as f: + f.write(s)
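A hedged sketch of the fixed-width export; all file names below are illustrative. The link and node data files are written semicolon-separated without headers, the companion CSVs record each column's header and width, and the generated script reads both back into Cube through the %LINK_DATA_PATH% and %NODE_DATA_PATH% tokens.

model_net.write_roadway_as_fixedwidth(
    output_link_txt="links.txt",
    output_node_txt="nodes.txt",
    output_link_header_width_txt="links_header_width.csv",
    output_node_header_width_txt="nodes_header_width.csv",
    output_cube_network_script="make_complete_network.s",
    drive_only=False,  # set True to keep only drive_access links / drive nodes
)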
+
+ +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_modules/lasso/transit.html b/_modules/lasso/transit.html new file mode 100644 index 0000000..6f9e7c7 --- /dev/null +++ b/_modules/lasso/transit.html @@ -0,0 +1,1521 @@ + + + + + + + + + + lasso.transit — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ +
    + +

+ + +
+
+
+
+ +

Source code for lasso.transit

+"""Transit-related classes to parse, compare, and write standard and cube transit files.
+
+  Typical usage example:
+
+    tn = CubeTransit.create_from_cube(CUBE_DIR)
+    transit_change_list = tn.evaluate_differences(base_transit_network)
+
+    cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+    cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+"""
+import os
+import copy
+import csv
+import datetime, time
+from typing import Any, Dict, Optional, Union
+
+from lark import Lark, Transformer, v_args
+from pandas import DataFrame
+
+import pandas as pd
+import partridge as ptg
+
+from network_wrangler import TransitNetwork
+
+from .logger import WranglerLogger
+from .parameters import Parameters
+
+
+
[docs]class CubeTransit(object): + """ Class for storing information about transit defined in Cube line + files. + + Has the capability to: + + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + + .. highlight:: python + + Typical usage example: + :: + tn = CubeTransit.create_from_cube(CUBE_DIR) + transit_change_list = tn.evaluate_differences(base_transit_network) + + Attributes: + lines (list): list of strings representing unique line names in + the cube network. + line_properties (dict): dictionary of line properties keyed by line name. Property + values are stored in a dictionary by property name. These + properties are directly read from the cube line files and haven't + been translated to standard transit values. + shapes (dict): dictionary of shapes + keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns: + - 'node_id' (int): positive integer of node id + - 'node' (int): node number, with negative indicating a non-stop + - 'stop' (boolean): indicates if it is a stop + - 'order' (int): order within this shape + program_type (str): Either PT or TRNBLD + parameters (Parameters): + Parameters instance that will be applied to this instance which + includes information about time periods and variables. + source_list (list): + List of cube line file sources that have been read and added. + diff_dict (dict): + """ + +
[docs] def __init__(self, parameters: Union[Parameters, dict] = {}): + """ + Constructor for CubeTransit + + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + print("Creating a new Cube Transit instance") + WranglerLogger.debug("Creating a new Cube Transit instance") + + self.lines = [] + + self.line_properties = {} + self.shapes = {} + + self.program_type = None + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.source_list = [] + + self.diff_dict = Dict[str, Any]
+ +
[docs] def add_cube(self, transit_source: str) -> None: + """Reads a .lin file and adds it to existing TransitNetwork instance. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + """ + + """ + Figure out what kind of transit source it is + """ + + parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, debug="debug", parser="lalr") + + if "NAME=" in transit_source: + WranglerLogger.debug("reading transit source as string") + self.source_list.append("input_str") + parse_tree = parser.parse(transit_source) + elif os.path.isfile(transit_source): + print("reading: {}".format(transit_source)) + with open(transit_source) as file: + WranglerLogger.debug( + "reading transit source: {}".format(transit_source) + ) + self.source_list.append(transit_source) + parse_tree = parser.parse(file.read()) + elif os.path.isdir(transit_source): + import glob + + for lin_file in glob.glob(os.path.join(transit_source, "*.LIN")): + self.add_cube(lin_file) + return + else: + msg= "{} not a valid transit line string, directory, or file" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("finished parsing cube line file") + #WranglerLogger.debug("--Parse Tree--\n {}".format(parse_tree.pretty())) + transformed_tree_data = CubeTransformer().transform(parse_tree) + #WranglerLogger.debug("--Transformed Parse Tree--\n {}".format(transformed_tree_data)) + + _line_data = transformed_tree_data['lines'] + + line_properties_dict = { + k: v["line_properties"] for k, v in _line_data.items() + } + line_shapes_dict = { + k: v["line_shape"] for k, v in _line_data.items() + } + new_lines = list(line_properties_dict.keys()) + """ + Before adding lines, check to see if any are overlapping with existing ones in the network + """ + + overlapping_lines = set(new_lines) & set(self.lines) + if overlapping_lines: + msg = "Overlapping lines found when adding from {}. \nSource files:\n{}\n{} Overlapping Lines of {} total new lines.\n-->{}".format( + transit_source, + "\n - ".join(self.source_list), + len(new_lines), + len(overlapping_lines), + overlapping_lines, + ) + print(msg) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.program_type = transformed_tree_data.get("program_type",None) + + self.lines += new_lines + self.line_properties.update(line_properties_dict) + self.shapes.update(line_shapes_dict) + + WranglerLogger.debug("Added lines to CubeTransit: \n".format(new_lines))
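A hedged sketch of building up a CubeTransit instance from several sources; the paths are illustrative. A source may be a single .lin file, a directory of .LIN files, or an in-memory line string containing "NAME=".

tn = CubeTransit()
tn.add_cube("base_cube/transit_orig.lin")  # a single line file
tn.add_cube("project_cube/")               # every *.LIN file in a directory

# Line names must be unique across sources; overlaps raise a ValueError.
print(tn.lines)        # unique line names parsed so far
print(tn.source_list)  # files (or "input_str") that have been read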
+ +
[docs] @staticmethod + def create_from_cube(transit_source: str): + """ + Reads a cube .lin file and stores as TransitNetwork object. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + Returns: + A ::CubeTransit object created from the transit_source. + """ + + tn = CubeTransit() + tn.add_cube(transit_source) + + return tn
+ +
[docs] def evaluate_differences(self, base_transit): + """ + 1. Identifies what routes need to be updated, deleted, or added + 2. For routes being added or updated, identify if the time periods + have changed or if there are multiples, and make duplicate lines if so + 3. Create project card dictionaries for each change. + + Args: + base_transit (CubeTransit): an instance of this class for the base condition + + Returns: + A list of dictionaries containing project card changes + required to evaluate the differences between the base network + and this transit network instance. + """ + transit_change_list = [] + + """ + Identify what needs to be evaluated + """ + lines_to_update = [l for l in self.lines if l in base_transit.lines] + lines_to_delete = [l for l in base_transit.lines if l not in self.lines] + lines_to_add = [l for l in self.lines if l not in base_transit.lines] + + project_card_changes = [] + + """ + Evaluate Property Updates + """ + + for line in lines_to_update: + WranglerLogger.debug( + "Finding differences in time periods for: {}".format(line) + ) + + """ + Find any additional time periods that might need to add or delete. + """ + base_cube_time_period_numbers = CubeTransit.get_time_period_numbers_from_cube_properties( + base_transit.line_properties[line] + ) + + try: + assert len(base_cube_time_period_numbers) == 1 + except: + msg = "Base network line {} should only have one time period per route, but {} found".format( + line, base_cube_time_period_numbers + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + base_cube_time_period_number = base_cube_time_period_numbers[0] + + build_cube_time_period_numbers = CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + + time_periods_to_add = [ + tp + for tp in build_cube_time_period_numbers + if tp not in base_cube_time_period_numbers + ] + + for tp in time_periods_to_add: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + time_periods_to_delete = [ + tp + for tp in base_cube_time_period_numbers + if tp not in build_cube_time_period_numbers + ] + + for tp in time_periods_to_delete: + lines_to_delete.append(line) + + WranglerLogger.debug("Evaluating differences in: {}".format(line)) + updated_properties = self.evaluate_route_property_differences( + self.line_properties[line], + base_transit.line_properties[line], + base_cube_time_period_number, + ) + updated_shapes = self.evaluate_route_shape_changes( + self.shapes[line], base_transit.shapes[line] + ) + if updated_properties: + update_prop_card_dict = self.create_update_route_card_dict( + line, updated_properties + ) + project_card_changes.append(update_prop_card_dict) + + if updated_shapes: + update_shape_card_dict = self.create_update_route_card_dict( + line, updated_shapes + ) + project_card_changes.append(update_shape_card_dict) + + """ + Evaluate Deletions + """ + for line in lines_to_delete: + delete_card_dict = self.create_delete_route_card_dict( + line, base_transit.line_properties[line] + ) + project_card_changes.append(delete_card_dict) + + """ + Evaluate Additions + + First assess if need to add multiple routes if there are multiple time periods + """ + for line in lines_to_add: + time_period_numbers = CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + if len(time_period_numbers) > 1: + for tp in time_period_numbers[1:]: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + for line in lines_to_add: + add_card_dict = 
self.create_add_route_card_dict(line) + project_card_changes.append(add_card_dict) + + return project_card_changes
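A hedged end-to-end sketch of the comparison workflow; the directory names are illustrative. The build network is compared against the base, and each resulting entry is a project-card-style change dictionary.

base_tn = CubeTransit.create_from_cube("base_cube_dir")
build_tn = CubeTransit.create_from_cube("build_cube_dir")

changes = build_tn.evaluate_differences(base_tn)

for change in changes:
    # Categories produced here: "Transit Service Property Change",
    # "Delete Transit Service", "New Transit Service".
    print(change["category"], change["facility"]["route_id"])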
+ +
[docs] def add_additional_time_periods( + self, new_time_period_number: int, orig_line_name: str + ) -> str: + """ + Copies a route to another cube time period with appropriate + values for time-period-specific properties. + + New properties are stored under the new name in: + - ::self.shapes + - ::self.line_properties + + Args: + new_time_period_number (int): cube time period number + orig_line_name(str): name of the originating line, from which + the new line will copy its properties. + + Returns: + Line name with new time period. + """ + WranglerLogger.debug( + "adding time periods {} to line {}".format( + new_time_period_number, orig_line_name + ) + ) + + ( + route_id, + _init_time_period, + agency_id, + direction_id, + ) = CubeTransit.unpack_route_name(orig_line_name) + new_time_period_name = self.parameters.cube_time_periods[new_time_period_number] + new_tp_line_name = CubeTransit.build_route_name( + route_id=route_id, + time_period=new_time_period_name, + agency_id=agency_id, + direction_id=direction_id, + ) + + try: + assert new_tp_line_name not in self.lines + except: + msg = "Trying to add a new time period {} to line {}, but constructed name {} is already in line list.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + WrangerLogger.error(msg) + raise ValueError(msg) + + # copy to a new line and add it to list of lines to add + self.line_properties[new_tp_line_name] = copy.deepcopy( + self.line_properties[orig_line_name] + ) + self.shapes[new_tp_line_name] = copy.deepcopy(self.shapes[orig_line_name]) + self.line_properties[new_tp_line_name]["NAME"] = new_tp_line_name + + """ + Remove entries that aren't for this time period from the new line's properties list. + """ + this_time_period_properties_list = [ + p + "[" + str(new_time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + self.line_properties[new_tp_line_name].pop(k, None) + + """ + Remove entries for time period from the original line's properties list. + """ + for k in this_time_period_properties_list: + self.line_properties[orig_line_name].pop(k, None) + + """ + Add new line to list of lines to add. + """ + WranglerLogger.debug( + "Adding new time period {} for line {} as {}.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + ) + return new_tp_line_name
+ +
[docs] def create_update_route_card_dict(self, line: str, updated_properties_dict: dict): + """ + Creates a project card change formatted dictionary for updating + the line. + + Args: + line: name of line that is being updated + updated_properties_dict: dictionary of attributes to update as + 'property': <property name>, + 'set': <new property value> + + Returns: + A project card change-formatted dictionary for the attribute update. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + "properties": updated_properties_dict, + } + WranglerLogger.debug( + "Updating {} route to changes:\n{}".format(line, str(update_card_dict)) + ) + + return update_card_dict
+ +
[docs] def create_delete_route_card_dict( + self, line: str, base_transit_line_properties_dict: dict + ): + """ + Creates a project card change formatted dictionary for deleting a line. + + Args: + line: name of line that is being deleted + base_transit_line_properties_dict: dictionary of cube-style + attribute values in order to find time periods and + start and end times. + + Returns: + A project card change-formatted dictionary for the route deletion. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + base_transit_line_properties_dict + ) + + delete_card_dict = { + "category": "Delete Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + } + WranglerLogger.debug( + "Deleting {} route to changes:\n{}".format(line, delete_card_dict) + ) + + return delete_card_dict
+ +
[docs] def create_add_route_card_dict(self, line: str): + """ + Creates a project card change formatted dictionary for adding + a route based on the information in self.route_properties for + the line. + + Args: + line: name of line that is being updated + + Returns: + A project card change-formatted dictionary for the route addition. + """ + start_time_str, end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + standard_properties = self.cube_properties_to_standard_properties( + self.line_properties[line] + ) + + routing_properties = { + "property": "routing", + "set": self.shapes[line]["node"].tolist(), + } + + add_card_dict = { + "category": "New Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": start_time_str, + "end_time": end_time_str, + "agency_id": int(line.strip('"')[0]), + }, + "properties": standard_properties + [routing_properties], + } + + WranglerLogger.debug( + "Adding {} route to changes:\n{}".format(line, add_card_dict) + ) + return add_card_dict
+ +
[docs] @staticmethod + def get_time_period_numbers_from_cube_properties(properties_list: list): + """ + Finds properties that are associated with time periods and the + returns the numbers in them. + + Args: + properties_list (list): list of all properties. + + Returns: + list of strings of the time period numbers found + """ + time_periods_list = [] + for p in properties_list: + if ("[" not in p) or ("]" not in p): + continue + tp_num = p.split("[")[1][0] + if tp_num and tp_num not in time_periods_list: + time_periods_list.append(tp_num) + return time_periods_list
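The time period index is read straight out of the bracketed property names, as in this small sketch:

CubeTransit.get_time_period_numbers_from_cube_properties(
    ["NAME", "HEADWAY[1]", "FREQ[2]", "MODE"]
)
# -> ['1', '2']  (returned as strings)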
+ +
[docs] @staticmethod + def build_route_name( + route_id: str = "", + time_period: str = "", + agency_id: str = 0, + direction_id: str = 1, + ) -> str: + """ + Create a route name by contatenating route, time period, agency, and direction + + Args: + route_id: i.e. 452-111 + time_period: i.e. pk + direction_id: i.e. 1 + agency_id: i.e. 0 + + Returns: + constructed line_name i.e. "0_452-111_452_pk1" + """ + + return ( + str(agency_id) + + "_" + + str(route_id) + + "_" + + str(route_id.split("-")[0]) + + "_" + + str(time_period) + + str(direction_id) + )
+ +
[docs] @staticmethod + def unpack_route_name(line_name: str): + """ + Unpacks route name into direction, route, agency, and time period info + + Args: + line_name (str): i.e. "0_452-111_452_pk1" + + Returns: + route_id (str): 452-111 + time_period (str): i.e. pk + direction_id (str) : i.e. 1 + agency_id (str) : i.e. 0 + """ + + line_name = line_name.strip('"') + + agency_id, route_id, _rtid, _tp_direction = line_name.split("_") + time_period = _tp_direction[0:-1] + direction_id = _tp_direction[-1] + + return route_id, time_period, agency_id, direction_id
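For well-formed names the two helpers round-trip, for example:

name = CubeTransit.build_route_name(
    route_id="452-111", time_period="pk", agency_id=0, direction_id=1
)
# name == "0_452-111_452_pk1"

route_id, time_period, agency_id, direction_id = CubeTransit.unpack_route_name(name)
# -> ("452-111", "pk", "0", "1")  (all returned as strings)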
+ +
[docs] def calculate_start_end_times(self, line_properties_dict: dict): + """ + Calculate the start and end times of the property change + WARNING: Doesn't take care of discongruous time periods!!!! + + Args: + line_properties_dict: dictionary of cube-flavor properties for a transit line + """ + start_time_m = 24 * 60 + end_time_m = 0 * 60 + + WranglerLogger.debug( + "parameters.time_period_properties_list: {}".format( + self.parameters.time_period_properties_list + ) + ) + current_cube_time_period_numbers = CubeTransit.get_time_period_numbers_from_cube_properties( + line_properties_dict + ) + + WranglerLogger.debug( + "current_cube_time_period_numbers:{}".format( + current_cube_time_period_numbers + ) + ) + + for tp in current_cube_time_period_numbers: + time_period_name = self.parameters.cube_time_periods[tp] + WranglerLogger.debug("time_period_name:{}".format(time_period_name)) + _start_time, _end_time = self.parameters.time_period_to_time[ + time_period_name + ] + + # change from "HH:MM" to integer # of seconds + _start_time_m = (int(_start_time.split(":")[0]) * 60) + int( + _start_time.split(":")[1] + ) + _end_time_m = (int(_end_time.split(":")[0]) * 60) + int( + _end_time.split(":")[1] + ) + + # find bounding start and end times + if _start_time_m < start_time_m: + start_time_m = _start_time_m + if _end_time_m > end_time_m: + end_time_m = _end_time_m + + if start_time_m > end_time_m: + msg = "Start time ({}) is after end time ({})".format( + start_time_m, end_time_m + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + start_time_str = "{:02d}:{:02d}".format(*divmod(start_time_m, 60)) + end_time_str = "{:02d}:{:02d}".format(*divmod(end_time_m, 60)) + return start_time_str, end_time_str
+ +
[docs] @staticmethod + def cube_properties_to_standard_properties(cube_properties_dict: dict) -> list: + """ + Converts cube style properties to standard properties. + + This is most pertinent to time-period specific variables like headway, + and varibles that have stnadard units like headway, which is minutes + in cube and seconds in standard format. + + Args: + cube_properties_dict: <cube style property name> : <property value> + + Returns: + A list of dictionaries with values for `"property": <standard + style property name>, "set" : <property value with correct units>` + + """ + standard_properties_list = [] + for k, v in cube_properties_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + change_item["set"] = v * 60 + else: + change_item["property"] = k + change_item["set"] = v + standard_properties_list.append(change_item) + + return standard_properties_list
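Headway-type properties are renamed to headway_secs and converted from minutes to seconds; everything else is passed through as a set-value change, for example:

CubeTransit.cube_properties_to_standard_properties(
    {"HEADWAY[1]": 10, "MODE": 5}
)
# -> [{'property': 'headway_secs', 'set': 600},
#     {'property': 'MODE', 'set': 5}]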
+ +
[docs] def evaluate_route_property_differences( + self, + properties_build: dict, + properties_base: dict, + time_period_number: str, + absolute: bool = True, + validate_base: bool = False, + ): + """ + Checks if any values have been updated or added for a specific + route and creates project card entries for each. + + Args: + properties_build: ::<property_name>: <property_value> + properties_base: ::<property_name>: <property_value> + time_period_number: time period to evaluate + absolute: if True, will use `set` command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway + validate_base: if True, will add the `existing` line in the project card + + Returns: + transit_change_list (list): a list of dictionary values suitable for writing to a project card + `{ + 'property': <property_name>, + 'set': <set value>, + 'change': <change from existing value>, + 'existing': <existing value to check>, + }` + + """ + + # Remove time period specific values for things that aren't part of the time period in question + this_time_period_properties_list = [ + p + "[" + str(time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + properties_build.pop(k, None) + properties_base.pop(k, None) + + difference_dict = dict( + set(properties_build.items()) ^ set(properties_base.items()) + ) + + # Iterate through properties list to build difference project card list + + properties_list = [] + for k, v in difference_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + + if absolute: + change_item["set"] = ( + v * 60 + ) # project cards are in secs, cube is in minutes + else: + change_item["change"] = ( + properties_build[k] - properties_base[k] + ) * 60 + if validate_base or not absolute: + change_item["existing"] = properties_base[k] * 60 + else: + change_item["property"] = k + change_item["set"] = v + if validate_base: + change_item["existing"] = properties_base[k] + + properties_list.append(change_item) + WranglerLogger.debug( + "Evaluated Route Changes: \n {})".format( + "\n".join(map(str, properties_list)) + ) + ) + return properties_list
+ +
[docs] def evaluate_route_shape_changes( + self, shape_build: DataFrame, shape_base: DataFrame + ): + """ + Compares two route shapes and constructs returns list of changes + suitable for a project card. + + Args: + shape_build: DataFrame of the build-version of the route shape. + shape_base: dDataFrame of the base-version of the route shape. + + Returns: + List of shape changes formatted as a project card-change dictionary. + + """ + + if shape_build.node.equals(shape_base.node): + return None + + shape_change_list = [] + + base_node_list = shape_build.node.tolist() + build_node_list = shape_base.node.tolist() + + sort_len = max(len(base_node_list), len(build_node_list)) + + start_pos = None + end_pos = None + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + start_pos = i - 1 + break + if base_node_list[i] != build_node_list[i]: + start_pos = i + break + else: + continue + + j = -1 + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + end_pos = j + 1 + break + if base_node_list[j] != build_node_list[j]: + end_pos = j + break + else: + j -= 1 + + if start_pos or end_pos: + existing = base_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + set = build_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + + shape_change_list.append( + {"property": "routing", "existing": existing, "set": set} + ) + + return shape_change_list
+ + +
[docs]class StandardTransit(object): + """Holds a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's + Cube Line files. + + .. highlight:: python + Typical usage example: + :: + cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) + cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) + + Attributes: + feed: Partridge Feed object containing read-only access to GTFS feed + parameters (Parameters): Parameters instance containing information + about time periods and variables. + """ + +
[docs] def __init__(self, ptg_feed, parameters: Union[Parameters, dict] = {}): + """ + + Args: + ptg_feed: partridge feed object + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + self.feed = ptg_feed + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] @staticmethod + def fromTransitNetwork( + transit_network_object: TransitNetwork, parameters: Union[Parameters, dict] = {} + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + transit_network_object: Reference to an instance of TransitNetwork. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit + """ + return StandardTransit(transit_network_object.feed, parameters=parameters)
+ +
[docs] @staticmethod + def read_gtfs(gtfs_feed_dir: str, parameters: Union[Parameters, dict] = {}): + """ + Reads GTFS files from a directory and returns a StandardTransit + instance. + + Args: + gtfs_feed_dir: location of the GTFS files + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit instance + """ + return StandardTransit(ptg.load_feed(gtfs_feed_dir), parameters=parameters)
+ +
[docs] def write_as_cube_lin(self, outpath: str = None): + """ + Writes the gtfs feed as a cube line file after + converting gtfs properties to MetCouncil cube properties. + + Args: + outpath: File location for output cube line file. + + """ + if not outpath: + outpath = os.path.join(self.parameters.scratch_location, "outtransit.lin") + trip_cube_df = self.route_properties_gtfs_to_cube(self) + + trip_cube_df["LIN"] = trip_cube_df.apply(self.cube_format, axis=1) + + l = trip_cube_df["LIN"].tolist() + + with open(outpath, "w") as f: + f.write("\n".join(l))
+ +
[docs] @staticmethod + def route_properties_gtfs_to_cube(self): + """ + Prepare gtfs for cube lin file. + + Does the following operations: + 1. Combines route, frequency, trip, and shape information + 2. Converts time of day to time periods + 3. Calculates cube route name from gtfs route name and properties + 4. Assigns a cube-appropriate mode number + 5. Assigns a cube-appropriate operator number + + Returns: + trip_df (DataFrame): DataFrame of trips with cube-appropriate values for: + - NAME + - ONEWAY + - OPERATOR + - MODE + - HEADWAY + """ + WranglerLogger.info( + "Converting GTFS Standard Properties to MetCouncil's Cube Standard" + ) + metro_operator_dict = { + "0": 3, + "1": 3, + "2": 3, + "3": 4, + "4": 2, + "5": 5, + "6": 8, + "7": 1, + "8": 1, + "9": 10, + "10": 3, + "11": 9, + "12": 3, + "13": 4, + "14": 4, + "15": 3, + } + + shape_df = self.feed.shapes.copy() + trip_df = self.feed.trips.copy() + + """ + Add information from: routes, frequencies, and routetype to trips_df + """ + trip_df = pd.merge(trip_df, self.feed.routes, how="left", on="route_id") + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + inv_cube_time_periods_map = { + v: k for k, v in self.parameters.cube_time_periods.items() + } + trip_df["tod_num"] = trip_df.tod_name.map(inv_cube_time_periods_map) + trip_df["tod_name"] = trip_df.tod_name.map( + self.parameters.cube_time_periods_name + ) + + trip_df["NAME"] = trip_df.apply( + lambda x: x.agency_id + + "_" + + x.route_id + + "_" + + x.route_short_name + + "_" + + x.tod_name + + str(x.direction_id), + axis=1, + ) + + trip_df["LONGNAME"] = trip_df["route_long_name"] + trip_df["HEADWAY"] = (trip_df["headway_secs"] / 60).astype(int) + trip_df["MODE"] = trip_df.apply(self.calculate_cube_mode, axis=1) + trip_df["ONEWAY"] = "T" + trip_df["OPERATOR"] = trip_df["agency_id"].map(metro_operator_dict) + + return trip_df
+ +
[docs] def calculate_cube_mode(self, row) -> int: + """ + Assigns a cube mode number by following logic. + + For rail, uses GTFS route_type variable: + https://developers.google.com/transit/gtfs/reference + + :: + # route_type : cube_mode + route_type_to_cube_mode = {0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9} # Rail + + For buses, uses route id numbers and route name to find + express and suburban buses as follows: + + :: + if not cube_mode: + if 'express' in row['LONGNAME'].lower(): + cube_mode = 7 # Express + elif int(row['route_id'].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + Args: + row: A DataFrame row with route_type, route_long_name, and route_id + + Returns: + cube mode number + """ + # route_type : cube_mode + route_type_to_cube_mode = { + 0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9, + } # Rail + + cube_mode = route_type_to_cube_mode[row["route_type"]] + + if not cube_mode: + if "express" in row["route_long_name"].lower(): + cube_mode = 7 # Express + elif int(row["route_id"].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + return cube_mode
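A small sketch of the bus disaggregation path, assuming cube_transit_net is an existing StandardTransit instance and the row values are illustrative:

import pandas as pd

row = pd.Series(
    {"route_type": 3, "route_long_name": "Express - Downtown", "route_id": "250-111"}
)
cube_transit_net.calculate_cube_mode(row)  # -> 7 (express bus)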
+ +
[docs] def time_to_cube_time_period( + self, start_time_secs: int, as_str: bool = True, verbose: bool = False + ): + """ + Converts seconds from midnight to the cube time period. + + Args: + start_time_secs: start time for transit trip in seconds + from midnight + as_str: if True, returns the time period as a string, + otherwise returns a numeric time period + + Returns: + this_tp_num: if as_str is False, returns the numeric + time period + this_tp: if as_str is True, returns the Cube time period + name abbreviation + """ + from .util import hhmmss_to_datetime, secs_to_datetime + + # set initial time as the time that spans midnight + + start_time_dt = secs_to_datetime(start_time_secs) + + # set initial time as the time that spans midnight + this_tp = "NA" + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + _dt_end_time = hhmmss_to_datetime(_end_time) + if _dt_start_time > _dt_end_time: + this_tp = tp_name + break + + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + if start_time_dt >= _dt_start_time: + this_time = _dt_start_time + this_tp = tp_name + + if verbose: + WranglerLogger.debug( + "Finding Cube Time Period from Start Time: \ + \n - start_time_sec: {} \ + \n - start_time_dt: {} \ + \n - this_tp: {}".format( + start_time_secs, start_time_dt, this_tp + ) + ) + + if as_str: + return this_tp + + name_to_num = {v: k for k, v in self.parameters.cube_time_periods.items} + this_tp_num = name_to_num.get(this_tp) + + if not this_tp_num: + msg = "Cannot find time period number in {} for time period name: {}".format( + name_to_num, this_tp + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + return this_tp_num
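With as_str=True (the default) the method returns the period's name abbreviation. A hedged sketch for a 7:30 AM start, assuming the default time period definitions on the Parameters instance:

start_secs = 7 * 3600 + 30 * 60                          # 07:30:00 as seconds from midnight
cube_transit_net.time_to_cube_time_period(start_secs)    # e.g. "AM"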
+ +
[docs] def shape_gtfs_to_cube(self, row): + """ + Creates a list of nodes that for the route in appropriate + cube format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a string representation of the node list + for a route in cube format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == row.shape_id] + + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + # node list + node_list_str = "" + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + node_list_str += "\n %s" % (trip_node_list[nodeIdx]) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + return node_list_str
+ +
[docs] def cube_format(self, row): + """ + Creates a string represnting the route in cube line file notation. + + Args: + row: row of a DataFrame representing a cube-formatted trip, with the Attributes + trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR + + Returns: + string representation of route in cube line file notation + """ + + s = '\nLINE NAME="{}",'.format(row.NAME) + s += '\n LONGNAME="{}",'.format(row.LONGNAME) + s += "\n HEADWAY[{}]={},".format(row.tod_num, row.HEADWAY) + s += "\n MODE={},".format(row.MODE) + s += "\n ONEWAY={},".format(row.ONEWAY) + s += "\n OPERATOR={},".format(row.OPERATOR) + s += "\n NODES={}".format(self.shape_gtfs_to_cube(row)) + + return s
+ + +class CubeTransformer(Transformer): + """A lark-parsing Transformer which transforms the parse-tree to + a dictionary. + + .. highlight:: python + Typical usage example: + :: + transformed_tree_data = CubeTransformer().transform(parse_tree) + + Attributes: + line_order (int): a dynamic counter to hold the order of the nodes within + a route shape + lines_list (list): a list of the line names + """ + + def __init__(self): + self.line_order = 0 + self.lines_list = [] + + def lines(self, line): + # WranglerLogger.debug("lines: \n {}".format(line)) + + # This MUST be a tuple because it returns to start in the tree + lines = {k: v for k, v in line} + return ("lines", lines) + + @v_args(inline=True) + def program_type_line(self, PROGRAM_TYPE, whitespace=None): + # WranglerLogger.debug("program_type_line:{}".format(PROGRAM_TYPE)) + self.program_type = PROGRAM_TYPE.value + + # This MUST be a tuple because it returns to start in the tree + return ("program_type", PROGRAM_TYPE.value) + + @v_args(inline=True) + def line(self, lin_attributes, nodes): + # WranglerLogger.debug("line...attributes:\n {}".format(lin_attributes)) + # WranglerLogger.debug("line...nodes:\n {}".format(nodes)) + lin_name = lin_attributes["NAME"] + + self.line_order = 0 + # WranglerLogger.debug("parsing: {}".format(lin_name)) + + return (lin_name, {"line_properties": lin_attributes, "line_shape": nodes}) + + @v_args(inline=True) + def lin_attributes(self, *lin_attr): + lin_attr = {k: v for (k, v) in lin_attr} + # WranglerLogger.debug("lin_attributes: {}".format(lin_attr)) + return lin_attr + + @v_args(inline=True) + def lin_attr(self, lin_attr_name, attr_value, SEMICOLON_COMMENT=None): + # WranglerLogger.debug("lin_attr {}: {}".format(lin_attr_name, attr_value)) + return lin_attr_name, attr_value + + def lin_attr_name(self, args): + attr_name = args[0].value.upper() + # WranglerLogger.debug(".......args {}".format(args)) + if attr_name in ["USERA", "FREQ", "HEADWAY"]: + attr_name = attr_name + "[" + str(args[2]) + "]" + return attr_name + + def attr_value(self, attr_value): + try: + return int(attr_value[0].value) + except: + return attr_value[0].value + + def nodes(self, lin_node): + lin_node = DataFrame(lin_node) + # WranglerLogger.debug("nodes:\n {}".format(lin_node)) + + return lin_node + + @v_args(inline=True) + def lin_node(self, NODE_NUM, SEMICOLON_COMMENT=None, *lin_nodeattr): + self.line_order += 1 + n = int(NODE_NUM.value) + return {"node_id": abs(n), "node": n, "stop": n > 0, "order": self.line_order} + + start = dict + + +TRANSIT_LINE_FILE_GRAMMAR = r""" + +start : program_type_line? lines +WHITESPACE : /[ \t\r\n]/+ +STRING : /("(?!"").*?(?<!\\)(\\\\)*?"|'(?!'').*?(?<!\\)(\\\\)*?')/i +SEMICOLON_COMMENT : /;[^\n]*/ +BOOLEAN : "T"i | "F"i +program_type_line : ";;<<" PROGRAM_TYPE ">><<LINE>>;;" WHITESPACE? +PROGRAM_TYPE : "PT" | "TRNBUILD" + +lines : line* +line : "LINE" lin_attributes nodes + +lin_attributes : lin_attr+ +lin_attr : lin_attr_name "=" attr_value "," SEMICOLON_COMMENT* +TIME_PERIOD : "1".."5" +!lin_attr_name : "allstops"i + | "color"i + | ("freq"i "[" TIME_PERIOD "]") + | ("headway"i "[" TIME_PERIOD "]") + | "mode"i + | "name"i + | "oneway"i + | "owner"i + | "runtime"i + | "timefac"i + | "xyspeed"i + | "longname"i + | "shortname"i + | ("usera"i TIME_PERIOD) + | ("usern2"i) + | "circular"i + | "vehicletype"i + | "operator"i + | "faresystem"i + +attr_value : BOOLEAN | STRING | SIGNED_INT + +nodes : lin_node+ +lin_node : ("N" | "NODES")? "="? NODE_NUM ","? SEMICOLON_COMMENT? 
lin_nodeattr* +NODE_NUM : SIGNED_INT +lin_nodeattr : lin_nodeattr_name "=" attr_value ","? SEMICOLON_COMMENT* +!lin_nodeattr_name : "access_c"i + | "access"i + | "delay"i + | "xyspeed"i + | "timefac"i + | "nntime"i + | "time"i + +operator : SEMICOLON_COMMENT* "OPERATOR" opmode_attr* SEMICOLON_COMMENT* +mode : SEMICOLON_COMMENT* "MODE" opmode_attr* SEMICOLON_COMMENT* +opmode_attr : ( (opmode_attr_name "=" attr_value) ","? ) +opmode_attr_name : "number" | "name" | "longname" + +%import common.SIGNED_INT +%import common.WS +%ignore WS + +""" +
+ +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_modules/lasso/util.html b/_modules/lasso/util.html new file mode 100644 index 0000000..46a227b --- /dev/null +++ b/_modules/lasso/util.html @@ -0,0 +1,288 @@ + + + + + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ +
    + +
+ + +
+
+
+
+ +

Source code for lasso.util

+
[docs]def get_shared_streets_intersection_hash(lat, long, osm_node_id=None): + """ + Calculated per: + https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565 + Expected in/out + -93.0965985, 44.952112199999995 osm_node_id = 954734870 + 69f13f881649cb21ee3b359730790bb9 + + """ + import hashlib + + # longitude first, then latitude, following the sharedstreets reference + message = "Intersection {0:.5f} {1:.5f}".format(long, lat) + if osm_node_id: + message += " {}".format(osm_node_id) + unhashed = message.encode("utf-8") + hash = hashlib.md5(unhashed).hexdigest() + return hash
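A small sketch using the in/out pair documented in the docstring above (longitude -93.0965985, latitude 44.952112199999995); the import path assumes the function is used from lasso.util as documented on this page.

from lasso.util import get_shared_streets_intersection_hash

get_shared_streets_intersection_hash(44.952112199999995, -93.0965985, 954734870)
# expected per the docstring above: "69f13f881649cb21ee3b359730790bb9"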
+ + +
[docs]def hhmmss_to_datetime(hhmmss_str: str): + """ + Creates a datetime time object from a string of hh:mm:ss + + Args: + hhmmss_str: string of hh:mm:ss + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = datetime.time(*[int(i) for i in hhmmss_str.split(":")]) + + return dt
+ + +
[docs]def secs_to_datetime(secs: int): + """ + Creates a datetime time object from a seconds from midnight + + Args: + secs: seconds from midnight + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = (datetime.datetime.min + datetime.timedelta(seconds=secs)).time() + + return dt
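Both helpers return datetime.time objects, for example:

from lasso.util import hhmmss_to_datetime, secs_to_datetime

hhmmss_to_datetime("06:30:00")      # datetime.time(6, 30)
secs_to_datetime(6 * 3600 + 1800)   # datetime.time(6, 30)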
+ + +
[docs]def column_name_to_parts(c, parameters=None): + """ + Split a composite column name (e.g. "lanes_AM") into its parts. + + Args: + c: composite column name built from a base property plus optional + time period and category suffixes, with an "ML" prefix for managed lanes. + parameters: Parameters instance; a default Parameters() is created if None. + + Returns: + base_name, time_period, category, managed + """ + + if not parameters: + from .parameters import Parameters + + parameters = Parameters() + + if c[0:2] == "ML": + managed = True + else: + managed = False + + time_period = None + category = None + + if c.split("_")[0] not in parameters.properties_to_split.keys(): + return c, None, None, managed + + tps = parameters.time_period_to_time.keys() + cats = parameters.categories.keys() + + if c.split("_")[-1] in tps: + time_period = c.split("_")[-1] + base_name = c.split(time_period)[-2][:-1] + if c.split("_")[-2] in cats: + category = c.split("_")[-2] + base_name = c.split(category)[-2][:-1] + elif c.split("_")[-1] in cats: + category = c.split("_")[-1] + base_name = c.split(category)[-2][:-1] + else: + msg = "Can't split property correctly: {}".format(c) + WranglerLogger.error(msg) + base_name = c  # fall back to the original name so the return below does not fail + + return base_name, time_period, category, managed
+
+ +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_modules/network_wrangler/roadwaynetwork.html b/_modules/network_wrangler/roadwaynetwork.html new file mode 100644 index 0000000..db4cd88 --- /dev/null +++ b/_modules/network_wrangler/roadwaynetwork.html @@ -0,0 +1,2712 @@ + + + + + + + + + + network_wrangler.roadwaynetwork — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ +
    + +
+ + +
+
+
+
+ +

Source code for network_wrangler.roadwaynetwork

+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+
+from __future__ import annotations
+
+import os
+import sys
+import copy
+import numbers
+from random import randint
+
+import folium
+import pandas as pd
+import geopandas as gpd
+import json
+import networkx as nx
+import numpy as np
+import osmnx as ox
+
+from geopandas.geodataframe import GeoDataFrame
+
+from pandas.core.frame import DataFrame
+
+from jsonschema import validate
+from jsonschema.exceptions import ValidationError
+from jsonschema.exceptions import SchemaError
+
+from shapely.geometry import Point, LineString
+
+from .logger import WranglerLogger
+from .projectcard import ProjectCard
+from .utils import point_df_to_geojson, link_df_to_json, parse_time_spans
+from .utils import offset_location_reference, haversine_distance, create_unique_shape_id
+from .utils import create_location_reference_from_nodes, create_line_string
+
+
+class RoadwayNetwork(object):
+    """
+    Representation of a Roadway Network.
+
+    .. highlight:: python
+
+    Typical usage example:
+    ::
+
+        net = RoadwayNetwork.read(
+            link_file=MY_LINK_FILE,
+            node_file=MY_NODE_FILE,
+            shape_file=MY_SHAPE_FILE,
+        )
+        my_selection = {
+            "link": [{"name": ["I 35E"]}],
+            "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+            "B": {"osm_node_id": "2564047368"},
+        }
+        net.select_roadway_features(my_selection)
+
+        my_change = [
+            {
+                'property': 'lanes',
+                'existing': 1,
+                'set': 2,
+             },
+             {
+                'property': 'drive_access',
+                'set': 0,
+              },
+        ]
+
+        my_net.apply_roadway_feature_change(
+            my_net.select_roadway_features(my_selection),
+            my_change
+        )
+
+        ml_net = net.create_managed_lane_network(in_place=False)
+        ml_net.is_network_connected(mode="drive"))
+        _, disconnected_nodes = ml_net.assess_connectivity(mode="walk", ignore_end_nodes=True)
+        ml_net.write(filename=my_out_prefix, path=my_dir)
+
+    Attributes:
+        nodes_df (GeoDataFrame): node data
+
+        links_df (GeoDataFrame): link data, including start and end
+            nodes and associated shape
+
+        shapes_df (GeoDataFrame): detailed shape data
+
+        selections (dict): dictionary storing selections in case they are made repeatedly
+
+        CRS (str): coordinate reference system in PROJ4 format.
+            See https://proj.org/operations/projections/index.html#
+
+        ESPG (int): integer representing coordinate system https://epsg.io/
+
+        NODE_FOREIGN_KEY (str): column in `nodes_df` associated with the
+            LINK_FOREIGN_KEY
+
+        LINK_FOREIGN_KEY (list(str)): list of columns in `link_df` that
+            represent the NODE_FOREIGN_KEY
+
+        UNIQUE_LINK_KEY (str): column that is a unique key for links
+
+        UNIQUE_NODE_KEY (str): column that is a unique key for nodes
+
+        UNIQUE_SHAPE_KEY (str): column that is a unique shape key
+
+        UNIQUE_MODEL_LINK_IDENTIFIERS (list(str)): list of all unique
+            identifiers for links, including the UNIQUE_LINK_KEY
+
+        UNIQUE_NODE_IDENTIFIERS (list(str)): list of all unique identifiers
+            for nodes, including the UNIQUE_NODE_KEY
+
+        SELECTION_REQUIRES (list(str))): required attributes in the selection
+            if a unique identifier is not used
+
+        SEARCH_BREADTH (int): initial number of links from name-based
+            selection that are traveresed before trying another shortest
+            path when searching for paths between A and B node
+
+        MAX_SEARCH_BREADTH (int): maximum number of links traversed between
+            links that match the searched name when searching for paths
+            between A and B node
+
+        SP_WEIGHT_FACTOR (Union(int, float)): penalty assigned for each
+            degree of distance between a link and a link with the searched-for
+            name when searching for paths between A and B node
+
+        MANAGED_LANES_TO_NODE_ID_SCALAR (int): scalar value added to
+            the general purpose lanes' `model_node_id` when creating
+            an associated node for a parallel managed lane
+
+        MANAGED_LANES_TO_LINK_ID_SCALAR (int): scalar value added to
+            the general purpose lanes' `model_link_id` when creating
+            an associated link for a parallel managed lane
+
+        MANAGED_LANES_REQUIRED_ATTRIBUTES (list(str)): list of attributes
+            that must be provided in managed lanes
+
+        KEEP_SAME_ATTRIBUTES_ML_AND_GP (list(str)): list of attributes
+            to copy from a general purpose lane to managed lane
+    """
+
+    # CRS = "+proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs"
+    CRS = 4326  # "EPSG:4326"
+
+    NODE_FOREIGN_KEY = "model_node_id"
+    LINK_FOREIGN_KEY = ["A", "B"]
+
+    SEARCH_BREADTH = 5
+    MAX_SEARCH_BREADTH = 10
+    SP_WEIGHT_FACTOR = 100
+    MANAGED_LANES_NODE_ID_SCALAR = 500000
+    MANAGED_LANES_LINK_ID_SCALAR = 1000000
+
+    SELECTION_REQUIRES = ["link"]
+
+    UNIQUE_LINK_KEY = "model_link_id"
+    UNIQUE_NODE_KEY = "model_node_id"
+    UNIQUE_MODEL_LINK_IDENTIFIERS = ["model_link_id"]
+    UNIQUE_NODE_IDENTIFIERS = ["model_node_id"]
+
+    UNIQUE_SHAPE_KEY = "shape_id"
+
+    MANAGED_LANES_REQUIRED_ATTRIBUTES = [
+        "A",
+        "B",
+        "model_link_id",
+        "locationReferences",
+    ]
+
+    KEEP_SAME_ATTRIBUTES_ML_AND_GP = [
+        "distance",
+        "bike_access",
+        "drive_access",
+        "transit_access",
+        "walk_access",
+        "maxspeed",
+        "name",
+        "oneway",
+        "ref",
+        "roadway",
+        "length",
+        "segment_id",
+    ]
+
+    MANAGED_LANES_SCALAR = 500000
+
+    MODES_TO_NETWORK_LINK_VARIABLES = {
+        "drive": ["drive_access"],
+        "bus": ["bus_only", "drive_access"],
+        "rail": ["rail_only"],
+        "transit": ["bus_only", "rail_only", "drive_access"],
+        "walk": ["walk_access"],
+        "bike": ["bike_access"],
+    }
+
+    MODES_TO_NETWORK_NODE_VARIABLES = {
+        "drive": ["drive_node"],
+        "rail": ["rail_only", "drive_node"],
+        "bus": ["bus_only", "drive_node"],
+        "transit": ["bus_only", "rail_only", "drive_node"],
+        "walk": ["walk_node"],
+        "bike": ["bike_node"],
+    }
+
+    def __init__(self, nodes: GeoDataFrame, links: GeoDataFrame, shapes: GeoDataFrame):
+        """
+        Constructor
+        """
+
+        if not RoadwayNetwork.validate_object_types(nodes, links, shapes):
+            sys.exit("RoadwayNetwork: Invalid constructor data type")
+
+        self.nodes_df = nodes
+        self.links_df = links
+        self.shapes_df = shapes
+
+        self.link_file = None
+        self.node_file = None
+        self.shape_file = None
+
+        # Add non-required fields if they aren't there.
+        # for field, default_value in RoadwayNetwork.OPTIONAL_FIELDS:
+        #    if field not in self.links_df.columns:
+        #        self.links_df[field] = default_value
+        if not self.validate_uniqueness():
+            raise ValueError("IDs in network not unique")
+        self.selections = {}
+
+    @staticmethod
+    def read(
+        link_file: str, node_file: str, shape_file: str, fast: bool = True
+    ) -> RoadwayNetwork:
+        """
+        Reads a network from the roadway network standard
+        Validates that it conforms to the schema
+
+        args:
+            link_file: full path to the link file
+            node_file: full path to the node file
+            shape_file: full path to the shape file
+            fast: boolean that will skip validation to speed up read time
+
+        Returns: a RoadwayNetwork instance
+
+        .. todo:: Turn off fast=True as default
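+
+        Example usage (a sketch; file paths are illustrative):
+            net = RoadwayNetwork.read(
+                link_file="data/link.json",
+                node_file="data/node.geojson",
+                shape_file="data/shape.geojson",
+                fast=True,
+            )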
+        """
+
+        WranglerLogger.info(
+            "Reading from following files:\n-{}\n-{}\n-{}.".format(
+                link_file, node_file, shape_file
+            )
+        )
+
+        """
+        Validate Input
+        """
+
+        if not os.path.exists(link_file):
+            msg = "Link file doesn't exist at: {}".format(link_file)
+            WranglerLogger.error(msg)
+            raise ValueError(msg)
+
+        if not os.path.exists(node_file):
+            msg = "Node file doesn't exist at: {}".format(node_file)
+            WranglerLogger.error(msg)
+            raise ValueError(msg)
+
+        if not os.path.exists(shape_file):
+            msg = "Shape file doesn't exist at: {}".format(shape_file)
+            WranglerLogger.error(msg)
+            raise ValueError(msg)
+
+        if not fast:
+            if not (
+                RoadwayNetwork.validate_node_schema(node_file)
+                and RoadwayNetwork.validate_link_schema(link_file)
+                and RoadwayNetwork.validate_shape_schema(shape_file)
+            ):
+
+                sys.exit("RoadwayNetwork: Data doesn't conform to schema")
+
+        with open(link_file) as f:
+            link_json = json.load(f)
+
+        link_properties = pd.DataFrame(link_json)
+        link_geometries = [
+            create_line_string(g["locationReferences"]) for g in link_json
+        ]
+        links_df = gpd.GeoDataFrame(link_properties, geometry=link_geometries)
+        links_df.crs = RoadwayNetwork.CRS
+        # coerce types for boolean columns, which might not contain a 1 and are
+        # therefore not read in as booleans
+        bool_columns = [
+            "rail_only",
+            "bus_only",
+            "drive_access",
+            "bike_access",
+            "walk_access",
+            "truck_access",
+        ]
+        for bc in list(set(bool_columns) & set(links_df.columns)):
+            links_df[bc] = links_df[bc].astype(bool)
+
+        shapes_df = gpd.read_file(shape_file)
+        shapes_df.dropna(subset=["geometry", "id"], inplace=True)
+        shapes_df.crs = RoadwayNetwork.CRS
+
+        # geopandas uses fiona OGR drivers, which don't let you have
+        # a list as a property type. Therefore, node properties must be read
+        # separately into a vanilla dataframe and then converted to geopandas
+
+        with open(node_file) as f:
+            node_geojson = json.load(f)
+
+        node_properties = pd.DataFrame(
+            [g["properties"] for g in node_geojson["features"]]
+        )
+        node_geometries = [
+            Point(g["geometry"]["coordinates"]) for g in node_geojson["features"]
+        ]
+
+        nodes_df = gpd.GeoDataFrame(node_properties, geometry=node_geometries)
+
+        nodes_df.gdf_name = "network_nodes"
+
+        # set a copy of the foreign key to be the index so that the
+        # variable itself remains queryable
+        nodes_df[RoadwayNetwork.NODE_FOREIGN_KEY + "_idx"] = nodes_df[
+            RoadwayNetwork.NODE_FOREIGN_KEY
+        ]
+        nodes_df.set_index(RoadwayNetwork.NODE_FOREIGN_KEY + "_idx", inplace=True)
+
+        nodes_df.crs = RoadwayNetwork.CRS
+        nodes_df["X"] = nodes_df["geometry"].apply(lambda g: g.x)
+        nodes_df["Y"] = nodes_df["geometry"].apply(lambda g: g.y)
+
+        WranglerLogger.info("Read %s links from %s" % (len(links_df), link_file))
+        WranglerLogger.info("Read %s nodes from %s" % (len(nodes_df), node_file))
+        WranglerLogger.info("Read %s shapes from %s" % (len(shapes_df), shape_file))
+
+        roadway_network = RoadwayNetwork(
+            nodes=nodes_df, links=links_df, shapes=shapes_df
+        )
+
+        roadway_network.link_file = link_file
+        roadway_network.node_file = node_file
+        roadway_network.shape_file = shape_file
+
+        return roadway_network
+
+    def write(self, path: str = ".", filename: str = None) -> None:
+        """
+        Writes a network in the roadway network standard
+
+        args:
+            path: the path where the output will be saved
+            filename: the name prefix of the roadway files that will be generated
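+
+        Example usage (a sketch; the path and prefix are illustrative):
+            net.write(path="output", filename="scenario_01")
+            # writes output/scenario_01_link.json, output/scenario_01_node.geojson,
+            # and output/scenario_01_shape.geojson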
+        """
+
+        if not os.path.exists(path):
+            WranglerLogger.debug("\nPath [%s] doesn't exist; creating." % path)
+            os.mkdir(path)
+
+        if filename:
+            links_file = os.path.join(path, filename + "_" + "link.json")
+            nodes_file = os.path.join(path, filename + "_" + "node.geojson")
+            shapes_file = os.path.join(path, filename + "_" + "shape.geojson")
+        else:
+            links_file = os.path.join(path, "link.json")
+            nodes_file = os.path.join(path, "node.geojson")
+            shapes_file = os.path.join(path, "shape.geojson")
+
+        link_property_columns = self.links_df.columns.values.tolist()
+        link_property_columns.remove("geometry")
+        links_json = link_df_to_json(self.links_df, link_property_columns)
+        with open(links_file, "w") as f:
+            json.dump(links_json, f)
+
+        # geopandas won't let you write to geojson because
+        # it uses fiona, which doesn't accept a list as one of the properties,
+        # so the df needs to be converted to geojson manually first
+        property_columns = self.nodes_df.columns.values.tolist()
+        property_columns.remove("geometry")
+
+        nodes_geojson = point_df_to_geojson(self.nodes_df, property_columns)
+
+        with open(nodes_file, "w") as f:
+            json.dump(nodes_geojson, f)
+
+        self.shapes_df.to_file(shapes_file, driver="GeoJSON")
+
+    @staticmethod
+    def roadway_net_to_gdf(roadway_net: RoadwayNetwork) -> gpd.GeoDataFrame:
+        """
+        Turn the roadway network into a GeoDataFrame
+        args:
+            roadway_net: the roadway network to export
+
+        returns: shapes dataframe
+
+        .. todo:: Make this much more sophisticated, for example attach link info to shapes
+        """
+        return roadway_net.shapes_df
+
+    def validate_uniqueness(self) -> bool:
+        """
+        Confirms that the unique identifier columns exist and contain unique values.
+        """
+        valid = True
+        for c in RoadwayNetwork.UNIQUE_MODEL_LINK_IDENTIFIERS:
+            if c not in self.links_df.columns:
+                valid = False
+                msg = "Network doesn't contain unique link identifier: {}".format(c)
+                WranglerLogger.error(msg)
+            if not self.links_df[c].is_unique:
+                valid = False
+                msg = "Unique identifier {} is not unique in network links".format(c)
+                WranglerLogger.error(msg)
+        for c in RoadwayNetwork.LINK_FOREIGN_KEY:
+            if c not in self.links_df.columns:
+                valid = False
+                msg = "Network doesn't contain link foreign key identifier: {}".format(
+                    c
+                )
+                WranglerLogger.error(msg)
+        link_foreign_key = self.links_df[RoadwayNetwork.LINK_FOREIGN_KEY].apply(
+            lambda x: "-".join(x.map(str)), axis=1
+        )
+        if not link_foreign_key.is_unique:
+            valid = False
+            msg = "Foreign key: {} is not unique in network links".format(
+                RoadwayNetwork.LINK_FOREIGN_KEY
+            )
+            WranglerLogger.error(msg)
+        for c in RoadwayNetwork.UNIQUE_NODE_IDENTIFIERS:
+            if c not in self.nodes_df.columns:
+                valid = False
+                msg = "Network doesn't contain unique node identifier: {}".format(c)
+                WranglerLogger.error(msg)
+            if not self.nodes_df[c].is_unique:
+                valid = False
+                msg = "Unique identifier {} is not unique in network nodes".format(c)
+                WranglerLogger.error(msg)
+        if RoadwayNetwork.NODE_FOREIGN_KEY not in self.nodes_df.columns:
+            valid = False
+            msg = "Network doesn't contain node foreign key identifier: {}".format(
+                RoadwayNetwork.NODE_FOREIGN_KEY
+            )
+            WranglerLogger.error(msg)
+        elif not self.nodes_df[RoadwayNetwork.NODE_FOREIGN_KEY].is_unique:
+            valid = False
+            msg = "Foreign key: {} is not unique in network nodes".format(
+                RoadwayNetwork.NODE_FOREIGN_KEY
+            )
+            WranglerLogger.error(msg)
+        if RoadwayNetwork.UNIQUE_SHAPE_KEY not in self.shapes_df.columns:
+            valid = False
+            msg = "Network doesn't contain unique shape id: {}".format(
+                RoadwayNetwork.UNIQUE_SHAPE_KEY
+            )
+            WranglerLogger.error(msg)
+        elif not self.shapes_df[RoadwayNetwork.UNIQUE_SHAPE_KEY].is_unique:
+            valid = False
+            msg = "Unique key: {} is not unique in network shapes".format(
+                RoadwayNetwork.UNIQUE_SHAPE_KEY
+            )
+            WranglerLogger.error(msg)
+        return valid
+
+    @staticmethod
+    def validate_object_types(
+        nodes: GeoDataFrame, links: GeoDataFrame, shapes: GeoDataFrame
+    ):
+        """
+        Determines if the roadway network is being built with the right object types.
+        Does not validate schemas.
+
+        Args:
+            nodes: nodes geodataframe
+            links: link geodataframe
+            shapes: shape geodataframe
+
+        Returns: boolean
+        """
+
+        errors = []
+
+        if not isinstance(nodes, GeoDataFrame):
+            error_message = "Incompatible nodes type:{}. Must provide a GeoDataFrame.  ".format(
+                type(nodes)
+            )
+            WranglerLogger.error(error_message)
+            errors.append(error_message)
+        if not isinstance(links, GeoDataFrame):
+            error_message = "Incompatible links type:{}. Must provide a GeoDataFrame.  ".format(
+                type(links)
+            )
+            WranglerLogger.error(error_message)
+            errors.append(error_message)
+        if not isinstance(shapes, GeoDataFrame):
+            error_message = "Incompatible shapes type:{}. Must provide a GeoDataFrame.  ".format(
+                type(shapes)
+            )
+            WranglerLogger.error(error_message)
+            errors.append(error_message)
+
+        if errors:
+            return False
+        return True
+
+    @staticmethod
+    def validate_node_schema(
+        node_file, schema_location: str = "roadway_network_node.json"
+    ):
+        """
+        Validate roadway network data node schema and output a boolean
+        """
+        if not os.path.exists(schema_location):
+            base_path = os.path.join(
+                os.path.dirname(os.path.realpath(__file__)), "schemas"
+            )
+            schema_location = os.path.join(base_path, schema_location)
+
+        with open(schema_location) as schema_json_file:
+            schema = json.load(schema_json_file)
+
+        with open(node_file) as node_json_file:
+            json_data = json.load(node_json_file)
+
+        try:
+            validate(json_data, schema)
+            return True
+
+        except ValidationError as exc:
+            WranglerLogger.error("Failed Node schema validation: Validation Error")
+            WranglerLogger.error("Node File Loc:{}".format(node_file))
+            WranglerLogger.error("Node Schema Loc:{}".format(schema_location))
+            WranglerLogger.error(exc.message)
+
+        except SchemaError as exc:
+            WranglerLogger.error("Invalid Node Schema")
+            WranglerLogger.error("Node Schema Loc:{}".format(schema_location))
+            WranglerLogger.error(json.dumps(exc.message, indent=2))
+
+        return False
+
+    @staticmethod
+    def validate_link_schema(
+        link_file, schema_location: str = "roadway_network_link.json"
+    ):
+        """
+        Validate roadway network data link schema and output a boolean
+        """
+
+        if not os.path.exists(schema_location):
+            base_path = os.path.join(
+                os.path.dirname(os.path.realpath(__file__)), "schemas"
+            )
+            schema_location = os.path.join(base_path, schema_location)
+
+        with open(schema_location) as schema_json_file:
+            schema = json.load(schema_json_file)
+
+        with open(link_file) as link_json_file:
+            json_data = json.load(link_json_file)
+
+        try:
+            validate(json_data, schema)
+            return True
+
+        except ValidationError as exc:
+            WranglerLogger.error("Failed Link schema validation: Validation Error")
+            WranglerLogger.error("Link File Loc:{}".format(link_file))
+            WranglerLogger.error("Path:{}".format(exc.path))
+            WranglerLogger.error(exc.message)
+
+        except SchemaError as exc:
+            WranglerLogger.error("Invalid Link Schema")
+            WranglerLogger.error("Link Schema Loc: {}".format(schema_location))
+            WranglerLogger.error(json.dumps(exc.message, indent=2))
+
+        return False
+
+    @staticmethod
+    def validate_shape_schema(
+        shape_file, schema_location: str = "roadway_network_shape.json"
+    ):
+        """
+        Validate roadway network data shape schema and output a boolean
+        """
+
+        if not os.path.exists(schema_location):
+            base_path = os.path.join(
+                os.path.dirname(os.path.realpath(__file__)), "schemas"
+            )
+            schema_location = os.path.join(base_path, schema_location)
+
+        with open(schema_location) as schema_json_file:
+            schema = json.load(schema_json_file)
+
+        with open(shape_file) as shape_json_file:
+            json_data = json.load(shape_json_file)
+
+        try:
+            validate(json_data, schema)
+            return True
+
+        except ValidationError as exc:
+            WranglerLogger.error("Failed Shape schema validation: Validation Error")
+            WranglerLogger.error("Shape File Loc:{}".format(shape_file))
+            WranglerLogger.error("Path:{}".format(exc.path))
+            WranglerLogger.error(exc.message)
+
+        except SchemaError as exc:
+            WranglerLogger.error("Invalid Shape Schema")
+            WranglerLogger.error("Shape Schema Loc: {}".format(schema_location))
+            WranglerLogger.error(json.dumps(exc.message, indent=2))
+
+        return False
+
+    def validate_selection(self, selection: dict) -> bool:
+        """
+        Evaluate whether the selection dictionary contains the
+        minimum required values.
+
+        Args:
+            selection: selection dictionary to be evaluated
+
+        Returns: boolean value as to whether the selection dictionary is valid.
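+
+        Example usage (a sketch; attribute names and values are illustrative):
+            net.validate_selection(
+                {
+                    "link": [{"name": ["Main St"]}],
+                    "A": {"osm_node_id": "1234"},
+                    "B": {"osm_node_id": "4321"},
+                }
+            )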
+        """
+        if not set(RoadwayNetwork.SELECTION_REQUIRES).issubset(selection):
+            err_msg = "Project Card Selection requires: {}".format(
+                ",".join(RoadwayNetwork.SELECTION_REQUIRES)
+            )
+            err_msg += ", but selection only contains: {}".format(",".join(selection))
+            WranglerLogger.error(err_msg)
+            raise KeyError(err_msg)
+
+        err = []
+        for l in selection["link"]:
+            for k, v in l.items():
+                if k not in self.links_df.columns:
+                    err.append(
+                        "{} specified in link selection but not an attribute in network\n".format(
+                            k
+                        )
+                    )
+        selection_keys = [k for l in selection["link"] for k, v in l.items()]
+        unique_link_id = bool(
+            set(RoadwayNetwork.UNIQUE_MODEL_LINK_IDENTIFIERS).intersection(
+                set(selection_keys)
+            )
+        )
+
+        if not unique_link_id:
+            for k, v in selection["A"].items():
+                if (
+                    k not in self.nodes_df.columns
+                    and k != RoadwayNetwork.NODE_FOREIGN_KEY
+                ):
+                    err.append(
+                        "{} specified in A node selection but not an attribute in network\n".format(
+                            k
+                        )
+                    )
+            for k, v in selection["B"].items():
+                if (
+                    k not in self.nodes_df.columns
+                    and k != RoadwayNetwork.NODE_FOREIGN_KEY
+                ):
+                    err.append(
+                        "{} specified in B node selection but not an attribute in network\n".format(
+                            k
+                        )
+                    )
+        if err:
+            WranglerLogger.error(
+                "ERROR: Selection variables in project card not found in network"
+            )
+            WranglerLogger.error("\n".join(err))
+            WranglerLogger.error(
+                "--existing node columns:{}".format(" ".join(self.nodes_df.columns))
+            )
+            WranglerLogger.error(
+                "--existing link columns:{}".format(" ".join(self.links_df.columns))
+            )
+            raise ValueError(
+                "Selection variables in project card not found in network"
+            )
+        else:
+            return True
+
+    def orig_dest_nodes_foreign_key(
+        self, selection: dict, node_foreign_key: str = ""
+    ) -> tuple:
+        """
+        Returns the foreign key id (whatever is used in the u and v
+        variables in the links file) for the A and B nodes as a tuple.
+
+        Args:
+            selection : selection dictionary with A and B keys
+            node_foreign_key: name of the column used by the u and v variables
+                in the links_df file. If nothing is specified, the class
+                default (RoadwayNetwork.NODE_FOREIGN_KEY) is used.
+
+        Returns: tuple of (A_id, B_id)
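+
+        Example usage (a sketch; column names and ids are illustrative):
+            A_id, B_id = net.orig_dest_nodes_foreign_key(
+                {"A": {"osm_node_id": "1234"}, "B": {"osm_node_id": "4321"}}
+            )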
+        """
+
+        if not node_foreign_key:
+            node_foreign_key = RoadwayNetwork.NODE_FOREIGN_KEY
+        if len(selection["A"]) > 1:
+            raise ("Selection A node dictionary should be of length 1")
+        if len(selection["B"]) > 1:
+            raise ("Selection B node dictionary should be of length 1")
+
+        A_node_key, A_id = next(iter(selection["A"].items()))
+        B_node_key, B_id = next(iter(selection["B"].items()))
+
+        if A_node_key != node_foreign_key:
+            A_id = self.nodes_df[self.nodes_df[A_node_key] == A_id][
+                node_foreign_key
+            ].values[0]
+        if B_node_key != node_foreign_key:
+            B_id = self.nodes_df[self.nodes_df[B_node_key] == B_id][
+                node_foreign_key
+            ].values[0]
+
+        return (A_id, B_id)
+
+    @staticmethod
+    def get_managed_lane_node_ids(nodes_list):
+        """Returns managed lane node ids by adding MANAGED_LANES_SCALAR to each general purpose node id."""
+        return [x + RoadwayNetwork.MANAGED_LANES_SCALAR for x in nodes_list]
+
+    @staticmethod
+    def ox_graph(nodes_df: GeoDataFrame, links_df: GeoDataFrame):
+        """
+        create an osmnx-flavored network graph
+
+        osmnx doesn't like values that are arrays, so remove the variables
+        that have arrays.  osmnx also requires that certain variables
+        be filled in, so do that too.
+
+        Args:
+            nodes_df : GeoDataFrame of nodes
+            links_df : GeoDataFrame of links
+
+        Returns: a networkx multidigraph
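+
+        Example usage (a sketch):
+            G = RoadwayNetwork.ox_graph(net.nodes_df, net.links_df)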
+        """
+        WranglerLogger.debug("starting ox_graph()")
+
+        graph_nodes = nodes_df.copy().drop(
+            ["inboundReferenceIds", "outboundReferenceIds"], axis=1
+        )
+
+        graph_nodes.gdf_name = "network_nodes"
+        WranglerLogger.debug("GRAPH NODES: {}".format(graph_nodes.columns))
+        graph_nodes["id"] = graph_nodes[RoadwayNetwork.NODE_FOREIGN_KEY]
+
+        graph_nodes["x"] = graph_nodes["X"]
+        graph_nodes["y"] = graph_nodes["Y"]
+
+        graph_links = links_df.copy().drop(
+            ["osm_link_id", "locationReferences"], axis=1
+        )
+
+        # have to change this over into u,v b/c this is what osm-nx is expecting
+        graph_links["u"] = graph_links[RoadwayNetwork.LINK_FOREIGN_KEY[0]]
+        graph_links["v"] = graph_links[RoadwayNetwork.LINK_FOREIGN_KEY[1]]
+        graph_links["id"] = graph_links[RoadwayNetwork.UNIQUE_LINK_KEY]
+        graph_links["key"] = graph_links[RoadwayNetwork.UNIQUE_LINK_KEY]
+
+        WranglerLogger.debug("starting ox.gdfs_to_graph()")
+        try:
+            G = ox.graph_from_gdfs(graph_nodes, graph_links)
+        except:
+            WranglerLogger.debug(
+                "Please upgrade your OSMNX package. For now, using depricated osmnx.gdfs_to_graph because osmnx.graph_from_gdfs not found"
+            )
+            G = ox.gdfs_to_graph(graph_nodes, graph_links)
+
+        WranglerLogger.debug("finished ox.gdfs_to_graph()")
+        return G
+
+    @staticmethod
+    def selection_has_unique_link_id(selection_dict: dict) -> bool:
+        """
+        Args:
+            selection_dictionary: Dictionary representation of selection
+                of roadway features, containing a "link" key.
+
+        Returns: A boolean indicating if the selection dictionary contains
+            a required unique link id.
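+
+        Example usage (a sketch; the id value is illustrative):
+            RoadwayNetwork.selection_has_unique_link_id(
+                {"link": [{"model_link_id": [123456]}]}
+            )  # True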
+        """
+        selection_keys = [k for l in selection_dict["link"] for k, v in l.items()]
+        return bool(
+            set(RoadwayNetwork.UNIQUE_MODEL_LINK_IDENTIFIERS).intersection(
+                set(selection_keys)
+            )
+        )
+
+    def build_selection_key(self, selection_dict: dict) -> tuple:
+        """
+        Selections are stored by a key combining the query and the A and B ids.
+        This method combines the two for you based on the selection dictionary.
+
+        Args:
+            selection_dictonary: Selection Dictionary
+
+        Returns: Tuple serving as the selection key.
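+
+        Example usage (a sketch; selection_dict is a project card style selection):
+            sel_key = net.build_selection_key(selection_dict)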
+
+        """
+        sel_query = ProjectCard.build_link_selection_query(
+            selection=selection_dict,
+            unique_model_link_identifiers=RoadwayNetwork.UNIQUE_MODEL_LINK_IDENTIFIERS,
+        )
+
+        if RoadwayNetwork.selection_has_unique_link_id(selection_dict):
+            return sel_query
+
+        A_id, B_id = self.orig_dest_nodes_foreign_key(selection_dict)
+        return (sel_query, A_id, B_id)
+
+    def select_roadway_features(
+        self, selection: dict, search_mode="drive", force_search=False
+    ) -> GeoDataFrame:
+        """
+        Selects roadway features that satisfy selection criteria
+
+        Example usage:
+            net.select_roadway_features(
+              selection = [ {
+                #   a match condition for the from node using osm,
+                #   shared streets, or model node number
+                'from': {'osm_model_link_id': '1234'},
+                #   a match for the to-node..
+                'to': {'shstid': '4321'},
+                #   a regex or match for facility condition
+                #   could be # of lanes, facility type, etc.
+                'facility': {'name':'Main St'},
+                }, ... ])
+
+        Args:
+            selection : dictionary with keys for:
+                 A - from node
+                 B - to node
+                 link - which includes at least a variable for `name`
+
+        Returns: a list of index values for the selected links in `links_df`
+        """
+        WranglerLogger.debug("validating selection")
+        self.validate_selection(selection)
+
+        # create a unique key for the selection so that we can cache it
+        sel_key = self.build_selection_key(selection)
+        WranglerLogger.debug("Selection Key: {}".format(sel_key))
+
+        # if this selection has been queried before, just return the
+        # previously selected links
+        if sel_key in self.selections and not force_search:
+            if self.selections[sel_key]["selection_found"]:
+                return self.selections[sel_key]["selected_links"].index.tolist()
+            else:
+                msg = "Selection previously queried but no selection found"
+                WranglerLogger.error(msg)
+                raise Exception(msg)
+        self.selections[sel_key] = {}
+        self.selections[sel_key]["selection_found"] = False
+
+        unique_model_link_identifer_in_selection = RoadwayNetwork.selection_has_unique_link_id(
+            selection
+        )
+        if not unique_model_link_identifer_in_selection:
+            A_id, B_id = self.orig_dest_nodes_foreign_key(selection)
+        # identify candidate links which match the initial query
+        # assign them as iteration = 0
+        # subsequent iterations that didn't match the query will be
+        # assigned a higher weight in the shortest path
+        WranglerLogger.debug("Building selection query")
+        # build a selection query based on the selection dictionary
+
+        sel_query = ProjectCard.build_link_selection_query(
+            selection=selection,
+            unique_model_link_identifiers=RoadwayNetwork.UNIQUE_MODEL_LINK_IDENTIFIERS,
+            mode=RoadwayNetwork.MODES_TO_NETWORK_LINK_VARIABLES[search_mode],
+        )
+        WranglerLogger.debug("Selecting features:\n{}".format(sel_query))
+
+        self.selections[sel_key]["candidate_links"] = self.links_df.query(
+            sel_query, engine="python"
+        )
+        WranglerLogger.debug("Completed query")
+        candidate_links = self.selections[sel_key][
+            "candidate_links"
+        ]  # b/c too long to keep that way
+
+        candidate_links["i"] = 0
+
+        if len(candidate_links.index) == 0 and unique_model_link_identifer_in_selection:
+            msg = "No links found based on unique link identifiers.\nSelection Failed."
+            WranglerLogger.error(msg)
+            raise Exception(msg)
+
+        if len(candidate_links.index) == 0:
+            WranglerLogger.debug(
+                "No candidate links in initial search.\nRetrying query using 'ref' instead of 'name'"
+            )
+            # if the query doesn't come back with something from 'name'
+            # try it again with 'ref' instead
+            selection_has_name_key = any("name" in d for d in selection["link"])
+
+            if not selection_has_name_key:
+                msg = "Not able to complete search using 'ref' instead of 'name' because 'name' not in search."
+                WranglerLogger.error(msg)
+                raise Exception(msg)
+
+            if not "ref" in self.links_df.columns:
+                msg = "Not able to complete search using 'ref' because 'ref' not in network."
+                WranglerLogger.error(msg)
+                raise Exception(msg)
+
+            WranglerLogger.debug("Trying selection query replacing 'name' with 'ref'")
+            sel_query = sel_query.replace("name", "ref")
+
+            self.selections[sel_key]["candidate_links"] = self.links_df.query(
+                sel_query, engine="python"
+            )
+            candidate_links = self.selections[sel_key]["candidate_links"]
+
+            candidate_links["i"] = 0
+
+            if len(candidate_links.index) == 0:
+                msg = "No candidate links in search using either 'name' or 'ref' in query.\nSelection Failed."
+                WranglerLogger.error(msg)
+                raise Exception(msg)
+
+        def _add_breadth(
+            candidate_links: GeoDataFrame, nodes: GeoDataFrame, links: GeoDataFrame, i: int
+        ):
+            """
+            Add outbound and inbound reference IDs to candidate links
+            from existing nodes
+
+            Args:
+            candidate_links : GeoDataFrame
+                df with the links from the previous iteration that we
+                want to add on to
+
+            nodes : GeoDataFrame
+                df of all nodes in the full network
+
+            links : GeoDataFrame
+                df of all links in the full network
+
+            i : int
+                iteration of adding breadth
+
+            Returns:
+                candidate_links : GeoDataFrame
+                    updated df with one more degree of added breadth
+
+                node_list_foreign_keys : list
+                    list of foreign key ids for nodes in the updated candidate links
+                    to test if the A and B nodes are in there.
+
+            ..todo:: Make unique ID for links in the settings
+            """
+            WranglerLogger.debug("-Adding Breadth-")
+
+            node_list_foreign_keys = list(
+                set(
+                    [
+                        i
+                        for fk in RoadwayNetwork.LINK_FOREIGN_KEY
+                        for i in list(candidate_links[fk])
+                    ]
+                )
+                # set(list(candidate_links["u"]) + list(candidate_links["v"]))
+            )
+            candidate_nodes = nodes.loc[node_list_foreign_keys]
+            WranglerLogger.debug("Candidate Nodes: {}".format(len(candidate_nodes)))
+            links_shstRefId_to_add = list(
+                set(
+                    sum(candidate_nodes["outboundReferenceIds"].tolist(), [])
+                    + sum(candidate_nodes["inboundReferenceIds"].tolist(), [])
+                )
+                - set(candidate_links["shstReferenceId"].tolist())
+                - set([""])
+            )
+            ##TODO make unique ID for links in the settings
+            # print("Link IDs to add: {}".format(links_shstRefId_to_add))
+            # print("Links: ", links_id_to_add)
+            links_to_add = links[links.shstReferenceId.isin(links_shstRefId_to_add)].copy()
+            # print("Adding Links:",links_to_add)
+            WranglerLogger.debug("Adding {} links.".format(links_to_add.shape[0]))
+            # tag the newly added links with the current iteration so they are
+            # assigned a higher weight in the shortest-path search
+            links_to_add["i"] = i
+            candidate_links = candidate_links.append(links_to_add)
+            node_list_foreign_keys = list(
+                set(
+                    [
+                        i
+                        for fk in RoadwayNetwork.LINK_FOREIGN_KEY
+                        for i in list(candidate_links[fk])
+                    ]
+                )
+                # set(list(candidate_links["u"]) + list(candidate_links["v"]))
+            )
+
+            return candidate_links, node_list_foreign_keys
+
+        def _shortest_path():
+            WranglerLogger.debug(
+                "_shortest_path(): calculating shortest path from graph"
+            )
+            candidate_links.loc[:, "weight"] = 1 + (
+                candidate_links["i"] * RoadwayNetwork.SP_WEIGHT_FACTOR
+            )
+
+            node_list_foreign_keys = list(
+                set(
+                    [
+                        i
+                        for fk in RoadwayNetwork.LINK_FOREIGN_KEY
+                        for i in list(candidate_links[fk])
+                    ]
+                )
+                # set(list(candidate_links["u"]) + list(candidate_links["v"]))
+            )
+
+            candidate_nodes = self.nodes_df.loc[node_list_foreign_keys]
+
+            WranglerLogger.debug("creating network graph")
+            G = RoadwayNetwork.ox_graph(candidate_nodes, candidate_links)
+            self.selections[sel_key]["graph"] = G
+            self.selections[sel_key]["candidate_links"] = candidate_links
+
+            try:
+                WranglerLogger.debug(
+                    "Calculating NX shortest path from A_id: {} to B_id: {}".format(
+                        A_id, B_id
+                    )
+                )
+                sp_route = nx.shortest_path(G, A_id, B_id, weight="weight")
+                WranglerLogger.debug("Shortest path successfully routed")
+
+            except nx.NetworkXNoPath:
+                return False
+
+            sp_links = candidate_links[
+                candidate_links["A"].isin(sp_route)
+                & candidate_links["B"].isin(sp_route)
+            ]
+
+            self.selections[sel_key]["route"] = sp_route
+            self.selections[sel_key]["links"] = sp_links
+
+            return True
+
+        if not unique_model_link_identifer_in_selection:
+            # find the node ids for the candidate links
+            WranglerLogger.debug("Not a unique ID selection, conduct search")
+            node_list_foreign_keys = list(
+                set(
+                    [
+                        i
+                        for fk in RoadwayNetwork.LINK_FOREIGN_KEY
+                        for i in list(candidate_links[fk])
+                    ]
+                )
+                # set(list(candidate_links["u"]) + list(candidate_links["v"]))
+            )
+            WranglerLogger.debug("Foreign key list: {}".format(node_list_foreign_keys))
+            i = 0
+
+            max_i = RoadwayNetwork.SEARCH_BREADTH
+
+            while (
+                A_id not in node_list_foreign_keys
+                and B_id not in node_list_foreign_keys
+                and i <= max_i
+            ):
+                WranglerLogger.debug(
+                    "Adding breadth, no shortest path. i: {}, Max i: {}".format(
+                        i, max_i
+                    )
+                )
+                i += 1
+                candidate_links, node_list_foreign_keys = _add_breadth(
+                    candidate_links, self.nodes_df, self.links_df, i
+                )
+            WranglerLogger.debug("calculating shortest path from graph")
+            sp_found = _shortest_path()
+            if not sp_found:
+                WranglerLogger.info(
+                    "No shortest path found with {}, trying greater breadth until SP found".format(
+                        i
+                    )
+                )
+            while not sp_found and i <= RoadwayNetwork.MAX_SEARCH_BREADTH:
+                WranglerLogger.debug(
+                    "Adding breadth, with shortest path iteration. i: {} Max i: {}".format(
+                        i, max_i
+                    )
+                )
+                i += 1
+                candidate_links, node_list_foreign_keys = _add_breadth(
+                    candidate_links, self.nodes_df, self.links_df, i
+                )
+                sp_found = _shortest_path()
+
+            if sp_found:
+                # reselect from the links in the shortest path, the ones with
+                # the desired values....ignoring name.
+                if len(selection["link"]) > 1:
+                    resel_query = ProjectCard.build_link_selection_query(
+                        selection=selection,
+                        unique_model_link_identifiers=RoadwayNetwork.UNIQUE_MODEL_LINK_IDENTIFIERS,
+                        mode=RoadwayNetwork.MODES_TO_NETWORK_LINK_VARIABLES[
+                            search_mode
+                        ],
+                        ignore=["name"],
+                    )
+                    WranglerLogger.info("Reselecting features:\n{}".format(resel_query))
+                    self.selections[sel_key]["selected_links"] = self.selections[
+                        sel_key
+                    ]["links"].query(resel_query, engine="python")
+                else:
+                    self.selections[sel_key]["selected_links"] = self.selections[
+                        sel_key
+                    ]["links"]
+
+                self.selections[sel_key]["selection_found"] = True
+                # return the list of selected link indices
+                return self.selections[sel_key]["selected_links"].index.tolist()
+            else:
+                WranglerLogger.error(
+                    "Couldn't find path from {} to {}".format(A_id, B_id)
+                )
+                raise ValueError(
+                    "Couldn't find path from {} to {}".format(A_id, B_id)
+                )
+        else:
+            # unique identifier exists and no need to go through big search
+            self.selections[sel_key]["selected_links"] = self.selections[sel_key][
+                "candidate_links"
+            ]
+            self.selections[sel_key]["selection_found"] = True
+
+            return self.selections[sel_key]["selected_links"].index.tolist()
+
+    def validate_properties(
+        self,
+        properties: dict,
+        ignore_existing: bool = False,
+        require_existing_for_change: bool = False,
+    ) -> bool:
+        """
+        If there are change or existing commands, make sure that the
+        property exists in the network.
+
+        Args:
+            properties : properties dictionary to be evaluated
+            ignore_existing: If True, will only warn about properties
+                that specify an "existing" value.  If False, will fail.
+            require_existing_for_change: If True, will fail if there isn't
+                a specified value in the project card for existing when a
+                change is specified.
+
+        Returns: boolean value as to whether the properties dictionary is valid.
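+
+        Example usage (a sketch; the property name and values are illustrative):
+            net.validate_properties(
+                [{"property": "lanes", "existing": 2, "change": 1}]
+            )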
+        """
+
+        validation_error_message = []
+
+        for p in properties:
+            if p["property"] not in self.links_df.columns:
+                if p.get("change"):
+                    validation_error_message.append(
+                        '"Change" is specified for attribute {}, but doesn\'t exist in base network\n'.format(
+                            p["property"]
+                        )
+                    )
+
+                if p.get("existing") and not ignore_existing:
+                    validation_error_message.append(
+                        '"Existing" is specified for attribute {}, but doesn\'t exist in base network\n'.format(
+                            p["property"]
+                        )
+                    )
+                elif p.get("existing"):
+                    WranglerLogger.warning(
+                        '"Existing" is specified for attribute {}, but doesn\'t exist in base network\n'.format(
+                            p["property"]
+                        )
+                    )
+
+            if p.get("change") and not p.get("existing"):
+                if require_existing_for_change:
+                    validation_error_message.append(
+                        '"Change" is specified for attribute {}, but there isn\'t a value for existing.\nTo proceed, run with the setting require_existing_for_change=False'.format(
+                            p["property"]
+                        )
+                    )
+                else:
+                    WranglerLogger.warning(
+                        '"Change" is specified for attribute {}, but there isn\'t a value for existing.\n'.format(
+                            p["property"]
+                        )
+                    )
+
+        if validation_error_message:
+            WranglerLogger.error(" ".join(validation_error_message))
+            raise ValueError(" ".join(validation_error_message))
+
+        return True
+
+    def apply(self, project_card_dictionary: dict):
+        """
+        Wrapper method to apply a project to a roadway network.
+
+        Args:
+            project_card_dictionary: dict
+              a dictionary of the project card object
+
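+        Example usage (a sketch; assumes a project card dictionary such as
+        one built from a ProjectCard instance's __dict__):
+            net.apply(project_card.__dict__)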
+        """
+
+        WranglerLogger.info(
+            "Applying Project to Roadway Network: {}".format(
+                project_card_dictionary["project"]
+            )
+        )
+
+        def _apply_individual_change(project_dictionary: dict):
+
+            if project_dictionary["category"].lower() == "roadway property change":
+                self.apply_roadway_feature_change(
+                    self.select_roadway_features(project_dictionary["facility"]),
+                    project_dictionary["properties"],
+                )
+            elif project_dictionary["category"].lower() == "parallel managed lanes":
+                self.apply_managed_lane_feature_change(
+                    self.select_roadway_features(project_dictionary["facility"]),
+                    project_dictionary["properties"],
+                )
+            elif project_dictionary["category"].lower() == "add new roadway":
+                self.add_new_roadway_feature_change(
+                    project_dictionary.get("links"), project_dictionary.get("nodes")
+                )
+            elif project_dictionary["category"].lower() == "roadway deletion":
+                self.delete_roadway_feature_change(
+                    project_dictionary.get("links"), project_dictionary.get("nodes")
+                )
+            elif project_dictionary["category"].lower() == "calculated roadway":
+                self.apply_python_calculation(
+                    project_dictionary['pycode']
+                )
+            else:
+                raise ValueError(
+                    "Invalid Project Card category: {}".format(
+                        project_dictionary["category"]
+                    )
+                )
+
+        if project_card_dictionary.get("changes"):
+            for project_dictionary in project_card_dictionary["changes"]:
+                _apply_individual_change(project_dictionary)
+        else:
+            _apply_individual_change(project_card_dictionary)
+
+    def apply_python_calculation(
+        self, pycode: str, in_place: bool = True
+    ) -> Union[None, RoadwayNetwork]:
+        """
+        Changes roadway network object by executing pycode.
+
+        Args:
+            pycode: python code which changes values in the roadway network object
+            in_place: update self or return a new roadway network object
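+
+        Example usage (a sketch; the column names and values are illustrative):
+            net.apply_python_calculation(
+                "self.links_df.loc[self.links_df['lanes'] == 4, 'lanes'] = 3"
+            )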
+        """
+        exec(pycode)
+
+    def apply_roadway_feature_change(
+        self, link_idx: list, properties: dict, in_place: bool = True
+    ) -> Union[None, RoadwayNetwork]:
+        """
+        Changes the roadway attributes for the selected features based on the
+        project card information passed
+
+        Args:
+            link_idx : list
+                indices of all links to apply change to
+            properties : list of dictionaries
+                roadway properties to change
+            in_place: boolean
+                update self or return a new roadway network object
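+
+        Example usage (a sketch; assumes `card` is a project card dictionary and
+        the property names and values are illustrative):
+            net.apply_roadway_feature_change(
+                link_idx=net.select_roadway_features(card["facility"]),
+                properties=[{"property": "lanes", "existing": 2, "set": 3}],
+            )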
+        """
+
+        # check that, if there are change or existing commands, the property
+        #   exists in the network
+        # if there is a set command, add that property to network
+        self.validate_properties(properties)
+
+        for i, p in enumerate(properties):
+            attribute = p["property"]
+
+            # if project card specifies an existing value in the network
+            #   check and see if the existing value in the network matches
+            if p.get("existing"):
+                network_values = self.links_df.loc[link_idx, attribute].tolist()
+                if not set(network_values).issubset([p.get("existing")]):
+                    WranglerLogger.warning(
+                        "Existing value defined for {} in project card does "
+                        "not match the value in the roadway network for the "
+                        "selected links".format(attribute)
+                    )
+
+            if in_place:
+                if "set" in p.keys():
+                    self.links_df.loc[link_idx, attribute] = p["set"]
+                else:
+                    self.links_df.loc[link_idx, attribute] = (
+                        self.links_df.loc[link_idx, attribute] + p["change"]
+                    )
+            else:
+                if i == 0:
+                    updated_network = copy.deepcopy(self)
+
+                if "set" in p.keys():
+                    updated_network.links_df.loc[link_idx, attribute] = p["set"]
+                else:
+                    updated_network.links_df.loc[link_idx, attribute] = (
+                        updated_network.links_df.loc[link_idx, attribute] + p["change"]
+                    )
+
+                if i == len(properties) - 1:
+                    return updated_network
+
+    def apply_managed_lane_feature_change(
+        self, link_idx: list, properties: dict, in_place: bool = True
+    ) -> Union[None, RoadwayNetwork]:
+        """
+        Apply the managed lane feature changes to the roadway network
+
+        Args:
+            link_idx : list of indices of all links to apply change to
+            properties : list of dictionaries of roadway properties to change
+            in_place: boolean to indicate whether to update self or return
+                a new roadway network object
+
+        .. todo:: decide on connectors info when they are more specific in project card
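+
+        Example usage (a sketch; assumes `selected_link_indices` comes from
+        select_roadway_features and the property names/values are illustrative):
+            net.apply_managed_lane_feature_change(
+                link_idx=selected_link_indices,
+                properties=[{"property": "ML_lanes", "set": 1}],
+            )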
+        """
+
+        # add ML flag
+        if "managed" in self.links_df.columns:
+            self.links_df.loc[link_idx, "managed"] = 1
+        else:
+            self.links_df["managed"] = 0
+            self.links_df.loc[link_idx, "managed"] = 1
+
+        for i, p in enumerate(properties):
+            attribute = p["property"]
+
+            if "group" in p.keys():
+                attr_value = {}
+                attr_value["default"] = p["set"]
+                attr_value["timeofday"] = []
+                for g in p["group"]:
+                    category = g["category"]
+                    for tod in g["timeofday"]:
+                        attr_value["timeofday"].append(
+                            {
+                                "category": category,
+                                "time": parse_time_spans(tod["time"]),
+                                "value": tod["set"],
+                            }
+                        )
+
+            elif "timeofday" in p.keys():
+                attr_value = {}
+                attr_value["default"] = p["set"]
+                attr_value["timeofday"] = []
+                for tod in p["timeofday"]:
+                    attr_value["timeofday"].append(
+                        {"time": parse_time_spans(tod["time"]), "value": tod["set"]}
+                    )
+
+            elif "set" in p.keys():
+                attr_value = p["set"]
+
+            else:
+                attr_value = ""
+
+            # TODO: decide on connectors info when they are more specific in project card
+            if attribute == "ML_ACCESS" and attr_value == "all":
+                attr_value = 1
+
+            if attribute == "ML_EGRESS" and attr_value == "all":
+                attr_value = 1
+
+            if in_place:
+                if attribute in self.links_df.columns and not isinstance(
+                    attr_value, numbers.Number
+                ):
+                    # if the attribute already exists
+                    # and the attr value we are trying to set is not numeric
+                    # then change the attribute type to object
+                    self.links_df[attribute] = self.links_df[attribute].astype(object)
+
+                if attribute not in self.links_df.columns:
+                    # if it is a new attribute then initiate with NaN values
+                    self.links_df[attribute] = "NaN"
+
+                for idx in link_idx:
+                    self.links_df.at[idx, attribute] = attr_value
+
+            else:
+                if i == 0:
+                    updated_network = copy.deepcopy(self)
+
+                if attribute in self.links_df.columns and not isinstance(
+                    attr_value, numbers.Number
+                ):
+                    # if the attribute already exists
+                    # and the attr value we are trying to set is not numeric
+                    # then change the attribute type to object
+                    updated_network.links_df[attribute] = updated_network.links_df[
+                        attribute
+                    ].astype(object)
+
+                if attribute not in updated_network.links_df.columns:
+                    # if it is a new attribute then initiate with NaN values
+                    updated_network.links_df[attribute] = "NaN"
+
+                for idx in link_idx:
+                    updated_network.links_df.at[idx, attribute] = attr_value
+
+                if i == len(properties) - 1:
+                    return updated_network
+
+    def add_new_roadway_feature_change(self, links: dict, nodes: dict) -> None:
+        """
+        add the new roadway features defined in the project card.
+        new shapes are also added for the new roadway links.
+
+        args:
+            links : list of dictionaries
+            nodes : list of dictionaries
+
+        returns: None
+
+        .. todo:: validate links and nodes dictionary
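+
+        Example usage (a sketch; ids and attribute values are illustrative, and
+        the A and B nodes must already exist in the network):
+            net.add_new_roadway_feature_change(
+                links=[{"A": 100, "B": 200, "model_link_id": 900001, "lanes": 1}],
+                nodes=None,
+            )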
+        """
+
+        def _add_dict_to_df(df, new_dict):
+            df_column_names = df.columns
+            new_row_to_add = {}
+
+            # add the fields from project card that are in the network
+            for property in df_column_names:
+                if property in new_dict.keys():
+                    if df[property].dtype == np.float64:
+                        value = pd.to_numeric(new_dict[property], downcast="float")
+                    elif df[property].dtype == np.int64:
+                        value = pd.to_numeric(new_dict[property], downcast="integer")
+                    else:
+                        value = str(new_dict[property])
+                else:
+                    value = ""
+
+                new_row_to_add[property] = value
+
+            # add the fields from project card that are NOT in the network
+            for key, value in new_dict.items():
+                if key not in df_column_names:
+                    new_row_to_add[key] = new_dict[key]
+
+            out_df = df.append(new_row_to_add, ignore_index=True)
+            return out_df
+
+        if nodes is not None:
+            for node in nodes:
+                if node.get(RoadwayNetwork.NODE_FOREIGN_KEY) is None:
+                    msg = "New link to add doesn't contain link foreign key identifier: {}".format(
+                        RoadwayNetwork.NODE_FOREIGN_KEY
+                    )
+                    WranglerLogger.error(msg)
+                    raise ValueError(msg)
+
+                node_query = (
+                    RoadwayNetwork.UNIQUE_NODE_KEY
+                    + " == "
+                    + str(node[RoadwayNetwork.NODE_FOREIGN_KEY])
+                )
+                if not self.nodes_df.query(node_query, engine="python").empty:
+                    msg = "Node with id = {} already exist in the network".format(
+                        node[RoadwayNetwork.NODE_FOREIGN_KEY]
+                    )
+                    WranglerLogger.error(msg)
+                    raise ValueError(msg)
+
+            for node in nodes:
+                self.nodes_df = _add_dict_to_df(self.nodes_df, node)
+
+        if links is not None:
+            for link in links:
+                for key in RoadwayNetwork.LINK_FOREIGN_KEY:
+                    if link.get(key) is None:
+                        msg = "New link to add doesn't contain link foreign key identifier: {}".format(
+                            key
+                        )
+                        WranglerLogger.error(msg)
+                        raise ValueError(msg)
+
+                ab_query = "A == " + str(link["A"]) + " and B == " + str(link["B"])
+
+                if not self.links_df.query(ab_query, engine="python").empty:
+                    msg = "Link with A = {} and B = {} already exist in the network".format(
+                        link["A"], link["B"]
+                    )
+                    WranglerLogger.error(msg)
+                    raise ValueError(msg)
+
+                if self.nodes_df[
+                    self.nodes_df[RoadwayNetwork.UNIQUE_NODE_KEY] == link["A"]
+                ].empty:
+                    msg = "New link to add has A node = {} but the node does not exist in the network".format(
+                        link["A"]
+                    )
+                    WranglerLogger.error(msg)
+                    raise ValueError(msg)
+
+                if self.nodes_df[
+                    self.nodes_df[RoadwayNetwork.UNIQUE_NODE_KEY] == link["B"]
+                ].empty:
+                    msg = "New link to add has B node = {} but the node does not exist in the network".format(
+                        link["B"]
+                    )
+                    WranglerLogger.error(msg)
+                    raise ValueError(msg)
+
+            for link in links:
+                link["new_link"] = 1
+                self.links_df = _add_dict_to_df(self.links_df, link)
+
+            # add location reference and geometry for new links
+            self.links_df["locationReferences"] = self.links_df.apply(
+                lambda x: create_location_reference_from_nodes(
+                    self.nodes_df[
+                        self.nodes_df[RoadwayNetwork.NODE_FOREIGN_KEY] == x["A"]
+                    ].squeeze(),
+                    self.nodes_df[
+                        self.nodes_df[RoadwayNetwork.NODE_FOREIGN_KEY] == x["B"]
+                    ].squeeze(),
+                )
+                if x["new_link"] == 1
+                else x["locationReferences"],
+                axis=1,
+            )
+            self.links_df["geometry"] = self.links_df.apply(
+                lambda x: create_line_string(x["locationReferences"])
+                if x["new_link"] == 1
+                else x["geometry"],
+                axis=1,
+            )
+
+            self.links_df[RoadwayNetwork.UNIQUE_SHAPE_KEY] = self.links_df.apply(
+                lambda x: create_unique_shape_id(x["geometry"])
+                if x["new_link"] == 1
+                else x[RoadwayNetwork.UNIQUE_SHAPE_KEY],
+                axis=1,
+            )
+
+            # add new shapes
+            added_links = self.links_df[self.links_df["new_link"] == 1]
+
+            added_shapes_df = pd.DataFrame({"geometry": added_links["geometry"]})
+            added_shapes_df[RoadwayNetwork.UNIQUE_SHAPE_KEY] = added_shapes_df[
+                "geometry"
+            ].apply(lambda x: create_unique_shape_id(x))
+            self.shapes_df = self.shapes_df.append(added_shapes_df)
+
+            self.links_df.drop(["new_link"], axis=1, inplace=True)
+
+    def delete_roadway_feature_change(
+        self, links: dict, nodes: dict, ignore_missing=True
+    ) -> None:
+        """
+        delete the roadway features defined in the project card.
+        valid links and nodes defined in the project card get deleted
+        and shapes corresponding to the deleted links are also deleted.
+
+        Args:
+            links : dict
+                list of dictionaries
+            nodes : dict
+                list of dictionaries
+            ignore_missing: bool
+                If True, will only warn about links/nodes that are missing from
+                network but specified to "delete" in project card
+                If False, will fail.
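+
+        Example usage (a sketch; ids are illustrative):
+            net.delete_roadway_feature_change(
+                links={"model_link_id": [123456]},
+                nodes={"model_node_id": [654321]},
+            )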
+        """
+
+        missing_error_message = []
+
+        if links is not None:
+            shapes_to_delete = []
+            for key, val in links.items():
+                missing_links = [v for v in val if v not in self.links_df[key].tolist()]
+                if missing_links:
+                    message = "Links with {} = {} do not exist in the network\n".format(
+                        key, missing_links
+                    )
+                    if ignore_missing:
+                        WranglerLogger.warning(message)
+                    else:
+                        missing_error_message.append(message)
+
+                deleted_links = self.links_df[self.links_df[key].isin(val)]
+                shapes_to_delete.extend(
+                    deleted_links[RoadwayNetwork.UNIQUE_SHAPE_KEY].tolist()
+                )
+
+                self.links_df.drop(
+                    self.links_df.index[self.links_df[key].isin(val)], inplace=True
+                )
+
+            self.shapes_df.drop(
+                self.shapes_df.index[
+                    self.shapes_df[RoadwayNetwork.UNIQUE_SHAPE_KEY].isin(
+                        shapes_to_delete
+                    )
+                ],
+                inplace=True,
+            )
+
+        if nodes is not None:
+            for key, val in nodes.items():
+                missing_nodes = [v for v in val if v not in self.nodes_df[key].tolist()]
+                if missing_nodes:
+                    message = "Nodes with {} = {} do not exist in the network\n".format(
+                        key, missing_nodes
+                    )
+                    if ignore_missing:
+                        WranglerLogger.warning(message)
+                    else:
+                        missing_error_message.append(message)
+
+                self.nodes_df = self.nodes_df[~self.nodes_df[key].isin(val)]
+
+        if missing_error_message:
+            WranglerLogger.error(" ".join(missing_error_message))
+            raise ValueError()
+
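+    # Minimal usage sketch (illustrative only, not called by the library): assuming
+    # `net` is a loaded RoadwayNetwork and the listed id values exist in the network:
+    #
+    #     net.delete_roadway_feature_change(
+    #         links={"model_link_id": [123, 456]},
+    #         nodes={"model_node_id": [78901]},
+    #         ignore_missing=True,
+    #     )
+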
+    def get_property_by_time_period_and_group(
+        self, property, time_period=None, category=None
+    ):
+        """
+        Return a series for the property with a specific group or time period.
+
+        Args:
+            property: str
+                the link variable that you want from the network
+            time_period: list(str)
+                the time period that you are querying for,
+                e.g. ['16:00', '19:00']
+            category: str or list(str) (Optional)
+                the group category, e.g. "sov", or a list of group categories
+                in order of search, e.g. ["hov3", "hov2"]
+
+        Returns:
+            pandas Series of the property values
+        """
+
+        def _get_property(
+            v,
+            time_spans=None,
+            category=None,
+            return_partial_match: bool = False,
+            partial_match_minutes: int = 60,
+        ):
+            """
+
+            .. todo:: return the time period with the largest overlap
+
+            """
+
+            if category and not time_spans:
+                msg = "Shouldn't have a category group without time spans"
+                WranglerLogger.error(msg)
+                raise ValueError(msg)
+
+            # simple case
+            if type(v) in (int, float, str):
+                return v
+
+            if not category:
+                category = ["default"]
+            elif isinstance(category, str):
+                category = [category]
+            search_cats = [c.lower() for c in category]
+
+            # if no time or group specified, but it is a complex link situation
+            if not time_spans:
+                if "default" in v.keys():
+                    return v["default"]
+                else:
+                    WranglerLogger.debug("variable: {}".format(v))
+                    msg = "Variable {} is more complex in network than query".format(v)
+                    WranglerLogger.error(msg)
+                    raise ValueError(msg)
+
+            if v.get("timeofday"):
+                categories = []
+                for tg in v["timeofday"]:
+                    if (time_spans[0] >= tg["time"][0]) and (
+                        time_spans[1] <= tg["time"][1]
+                    ):
+                        if tg.get("category"):
+                            categories += tg["category"]
+                            for c in search_cats:
+                                if c in tg["category"]:
+                                    return tg["value"]
+                        else:
+                            return tg["value"]
+
+                    # if there isn't a fully matched time period, see if there is an overlapping one
+                    # right now just return the first overlapping ones
+                    # TODO return the time period with the largest overlap
+
+                    if (
+                        (time_spans[0] >= tg["time"][0])
+                        and (time_spans[0] <= tg["time"][1])
+                    ) or (
+                        (time_spans[1] >= tg["time"][0])
+                        and (time_spans[1] <= tg["time"][1])
+                    ):
+                        overlap_minutes = max(
+                            0,
+                            min(tg["time"][1], time_spans[1])
+                            - max(time_spans[0], tg["time"][0]),
+                        )
+                        if not return_partial_match and overlap_minutes > 0:
+                            WranglerLogger.debug(
+                                "Couldn't find time period consistent with {}, but found a partial match: {}. Consider allowing partial matches using 'return_partial_match' keyword or updating query.".format(
+                                    time_spans, tg["time"]
+                                )
+                            )
+                        elif (
+                            overlap_minutes < partial_match_minutes
+                            and overlap_minutes > 0
+                        ):
+                            WranglerLogger.debug(
+                                "Time period: {} overlapped less than the minimum number of minutes ({}<{}) to be considered a match with time period in network: {}.".format(
+                                    time_spans,
+                                    overlap_minutes,
+                                    partial_match_minutes,
+                                    tg["time"],
+                                )
+                            )
+                        elif overlap_minutes > 0:
+                            WranglerLogger.debug(
+                                "Returning a partial time period match. Time period: {} overlapped the minimum number of minutes ({}>={}) to be considered a match with time period in network: {}.".format(
+                                    time_spans,
+                                    overlap_minutes,
+                                    partial_match_minutes,
+                                    tg["time"],
+                                )
+                            )
+                            if tg.get("category"):
+                                categories += tg["category"]
+                                for c in search_cats:
+                                    if c in tg["category"]:
+                                        return tg["value"]
+                            else:
+                                return tg["value"]
+
+                if "default" in v.keys():
+                    return v["default"]
+                else:
+                    msg = "Can't find default; must specify a category in: {}".format(
+                        str(categories)
+                    )
+                    WranglerLogger.error(msg)
+                    raise ValueError(msg)
+
+        time_spans = parse_time_spans(time_period)
+
+        return self.links_df[property].apply(
+            _get_property, time_spans=time_spans, category=category
+        )
+
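+    # Minimal usage sketch (illustrative only): assuming `net` is a loaded
+    # RoadwayNetwork whose "lanes" property is stored with time-of-day/category
+    # sub-values, query the PM-period SOV lanes as a pandas Series:
+    #
+    #     pm_sov_lanes = net.get_property_by_time_period_and_group(
+    #         "lanes", time_period=["16:00", "19:00"], category="sov"
+    #     )
+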
+    @staticmethod
+    def create_dummy_connector_links(gp_df: GeoDataFrame, ml_df: GeoDataFrame):
+        """
+        Create dummy connector links between the general purpose and managed lanes.
+
+        Args:
+            gp_df : GeoDataFrame
+                dataframe of general purpose links (where a managed lane also exists)
+            ml_df : GeoDataFrame
+                dataframe of corresponding managed lane links
+
+        Returns:
+            tuple of (access_df, egress_df) dataframes of dummy connector links
+        """
+
+        gp_ml_links_df = pd.concat(
+            [gp_df, ml_df.add_prefix("ML_")], axis=1, join="inner"
+        )
+
+        access_df = gp_df.iloc[0:0, :].copy()
+        egress_df = gp_df.iloc[0:0, :].copy()
+
+        def _get_connector_references(ref_1: list, ref_2: list, type: str):
+            if type == "access":
+                out_location_reference = [
+                    {"sequence": 1, "point": ref_1[0]["point"]},
+                    {"sequence": 2, "point": ref_2[0]["point"]},
+                ]
+
+            if type == "egress":
+                out_location_reference = [
+                    {"sequence": 1, "point": ref_2[1]["point"]},
+                    {"sequence": 2, "point": ref_1[1]["point"]},
+                ]
+            return out_location_reference
+
+        for index, row in gp_ml_links_df.iterrows():
+            access_row = {}
+            access_row["A"] = row["A"]
+            access_row["B"] = row["ML_A"]
+            access_row["lanes"] = 1
+            access_row["model_link_id"] = (
+                row["model_link_id"] + row["ML_model_link_id"] + 1
+            )
+            access_row["access"] = row["ML_access"]
+            access_row["drive_access"] = row["drive_access"]
+            access_row["locationReferences"] = _get_connector_references(
+                row["locationReferences"], row["ML_locationReferences"], "access"
+            )
+            access_row["distance"] = haversine_distance(
+                access_row["locationReferences"][0]["point"],
+                access_row["locationReferences"][1]["point"],
+            )
+            access_row["roadway"] = "ml_access"
+            access_row["name"] = "Access Dummy " + row["name"]
+            # ref is not a *required* attribute, so make conditional:
+            if "ref" in gp_ml_links_df.columns:
+                access_row["ref"] = row["ref"]
+            else:
+                access_row["ref"] = ""
+            access_df = access_df.append(access_row, ignore_index=True)
+
+            egress_row = {}
+            egress_row["A"] = row["ML_B"]
+            egress_row["B"] = row["B"]
+            egress_row["lanes"] = 1
+            egress_row["model_link_id"] = (
+                row["model_link_id"] + row["ML_model_link_id"] + 2
+            )
+            egress_row["access"] = row["ML_access"]
+            egress_row["drive_access"] = row["drive_access"]
+            egress_row["locationReferences"] = _get_connector_references(
+                row["locationReferences"], row["ML_locationReferences"], "egress"
+            )
+            egress_row["distance"] = haversine_distance(
+                egress_row["locationReferences"][0]["point"],
+                egress_row["locationReferences"][1]["point"],
+            )
+            egress_row["roadway"] = "ml_egress"
+            egress_row["name"] = "Egress Dummy " + row["name"]
+            # ref is not a *required* attribute, so make conditional:
+            if "ref" in gp_ml_links_df.columns:
+                egress_row["ref"] = row["ref"]
+            else:
+                egress_row["ref"] = ""
+            egress_df = egress_df.append(egress_row, ignore_index=True)
+
+        return (access_df, egress_df)
+
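+    # Minimal usage sketch (illustrative only): assuming `gp_links_df` and `ml_links_df`
+    # are row-aligned GeoDataFrames of general purpose links and their managed lane
+    # counterparts (as built in create_managed_lane_network below):
+    #
+    #     access_df, egress_df = RoadwayNetwork.create_dummy_connector_links(
+    #         gp_links_df, ml_links_df
+    #     )
+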
+    def create_managed_lane_network(self, in_place: bool = False) -> RoadwayNetwork:
+        """
+        Create a roadway network with managed lanes links separated out.
+        Add new parallel managed lane links, access/egress links,
+        and add shapes corresponding to the new links
+
+        Args:
+            in_place: update self or return a new roadway network object
+
+        Returns: A RoadwayNetwork instance
+
+        .. todo:: make this a more rigorous test
+        """
+
+        WranglerLogger.info("Creating network with duplicated managed lanes")
+
+        if "ml_access" in self.links_df["roadway"].tolist():
+            msg = "managed lane access links already exist in network; shouldn't be running create managed lane network. Returning network as-is."
+            WranglerLogger.error(msg)
+            if in_place:
+                return
+            else:
+                return copy.deepcopy(self)
+
+        link_attributes = self.links_df.columns.values.tolist()
+
+        ml_attributes = [i for i in link_attributes if i.startswith("ML_")]
+
+        # non_ml_links are links in the network where there is no managed lane.
+        # gp_links are the gp lanes and ml_links are ml lanes respectively for the ML roadways.
+
+        non_ml_links_df = self.links_df[self.links_df["managed"] == 0]
+        non_ml_links_df = non_ml_links_df.drop(ml_attributes, axis=1)
+
+        ml_links_df = self.links_df[self.links_df["managed"] == 1]
+        gp_links_df = ml_links_df.drop(ml_attributes, axis=1)
+
+        for attr in link_attributes:
+            if attr == "name":
+                ml_links_df["name"] = "Managed Lane " + gp_links_df["name"]
+            elif attr in ml_attributes and attr not in ["ML_ACCESS", "ML_EGRESS"]:
+                gp_attr = attr.split("_", 1)[1]
+                ml_links_df.loc[:, gp_attr] = ml_links_df[attr]
+
+            if (
+                attr not in RoadwayNetwork.KEEP_SAME_ATTRIBUTES_ML_AND_GP
+                and attr not in RoadwayNetwork.MANAGED_LANES_REQUIRED_ATTRIBUTES
+            ):
+                ml_links_df[attr] = ""
+
+        ml_links_df = ml_links_df.drop(ml_attributes, axis=1)
+
+        ml_links_df["managed"] = 1
+        gp_links_df["managed"] = 0
+
+        def _update_location_reference(location_reference: list):
+            out_location_reference = copy.deepcopy(location_reference)
+            out_location_reference[0]["point"] = offset_lat_lon(
+                out_location_reference[0]["point"]
+            )
+            out_location_reference[1]["point"] = offset_lat_lon(
+                out_location_reference[1]["point"]
+            )
+            return out_location_reference
+
+        ml_links_df["A"] = (
+            ml_links_df["A"] + RoadwayNetwork.MANAGED_LANES_NODE_ID_SCALAR
+        )
+        ml_links_df["B"] = (
+            ml_links_df["B"] + RoadwayNetwork.MANAGED_LANES_NODE_ID_SCALAR
+        )
+        ml_links_df[RoadwayNetwork.UNIQUE_LINK_KEY] = (
+            ml_links_df[RoadwayNetwork.UNIQUE_LINK_KEY]
+            + RoadwayNetwork.MANAGED_LANES_LINK_ID_SCALAR
+        )
+        ml_links_df["locationReferences"] = ml_links_df["locationReferences"].apply(
+            lambda x: offset_location_reference(x)
+        )
+        ml_links_df["geometry"] = ml_links_df["locationReferences"].apply(
+            lambda x: create_line_string(x)
+        )
+        ml_links_df[RoadwayNetwork.UNIQUE_SHAPE_KEY] = ml_links_df["geometry"].apply(
+            lambda x: create_unique_shape_id(x)
+        )
+
+        access_links_df, egress_links_df = RoadwayNetwork.create_dummy_connector_links(
+            gp_links_df, ml_links_df
+        )
+        access_links_df["geometry"] = access_links_df["locationReferences"].apply(
+            lambda x: create_line_string(x)
+        )
+        egress_links_df["geometry"] = egress_links_df["locationReferences"].apply(
+            lambda x: create_line_string(x)
+        )
+        access_links_df[RoadwayNetwork.UNIQUE_SHAPE_KEY] = access_links_df[
+            "geometry"
+        ].apply(lambda x: create_unique_shape_id(x))
+        egress_links_df[RoadwayNetwork.UNIQUE_SHAPE_KEY] = egress_links_df[
+            "geometry"
+        ].apply(lambda x: create_unique_shape_id(x))
+
+        out_links_df = gp_links_df.append(ml_links_df)
+        out_links_df = out_links_df.append(access_links_df)
+        out_links_df = out_links_df.append(egress_links_df)
+        out_links_df = out_links_df.append(non_ml_links_df)
+
+        # only the ml_links_df has the new nodes added
+        added_a_nodes = ml_links_df["A"]
+        added_b_nodes = ml_links_df["B"]
+
+        out_nodes_df = self.nodes_df
+
+        for a_node in added_a_nodes:
+            out_nodes_df = out_nodes_df.append(
+                {
+                    "model_node_id": a_node,
+                    "geometry": Point(
+                        out_links_df[out_links_df["A"] == a_node].iloc[0][
+                            "locationReferences"
+                        ][0]["point"]
+                    ),
+                    "drive_node": 1,
+                },
+                ignore_index=True,
+            )
+
+        for b_node in added_b_nodes:
+            if b_node not in out_nodes_df["model_node_id"].tolist():
+                out_nodes_df = out_nodes_df.append(
+                    {
+                        "model_node_id": b_node,
+                        "geometry": Point(
+                            out_links_df[out_links_df["B"] == b_node].iloc[0][
+                                "locationReferences"
+                            ][1]["point"]
+                        ),
+                        "drive_node": 1,
+                    },
+                    ignore_index=True,
+                )
+
+        out_nodes_df["X"] = out_nodes_df["geometry"].apply(lambda g: g.x)
+        out_nodes_df["Y"] = out_nodes_df["geometry"].apply(lambda g: g.y)
+
+        out_shapes_df = self.shapes_df
+
+        # managed lanes, access and egress connectors are new geometry
+        new_shapes_df = pd.DataFrame(
+            {
+                "geometry": ml_links_df["geometry"]
+                .append(access_links_df["geometry"])
+                .append(egress_links_df["geometry"])
+            }
+        )
+        new_shapes_df[RoadwayNetwork.UNIQUE_SHAPE_KEY] = new_shapes_df[
+            "geometry"
+        ].apply(lambda x: create_unique_shape_id(x))
+        out_shapes_df = out_shapes_df.append(new_shapes_df)
+
+        out_links_df = out_links_df.reset_index()
+        out_nodes_df = out_nodes_df.reset_index()
+        out_shapes_df = out_shapes_df.reset_index()
+
+        if in_place:
+            self.links_df = out_links_df
+            self.nodes_df = out_nodes_df
+            self.shapes_df = out_shapes_df
+        else:
+            out_network = copy.deepcopy(self)
+            out_network.links_df = out_links_df
+            out_network.nodes_df = out_nodes_df
+            out_network.shapes_df = out_shapes_df
+            return out_network
+
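+    # Minimal usage sketch (illustrative only): assuming `net` is a loaded
+    # RoadwayNetwork with managed lane (ML_*) attributes applied:
+    #
+    #     ml_net = net.create_managed_lane_network(in_place=False)
+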
+    @staticmethod
+    def get_modal_links_nodes(
+        links_df: DataFrame, nodes_df: DataFrame, modes: List[str] = None
+    ) -> tuple:
+        """Returns nodes and link dataframes for specific mode.
+
+        Args:
+            links_df: DataFrame of standard network links
+            nodes_df: DataFrame of standard network nodes
+            modes: list of the modes of the network to be kept, must be in `drive`,`transit`,`rail`,`bus`,
+                `walk`, `bike`. For example, if bike and walk are selected, both bike and walk links will be kept.
+
+        Returns: tuple of DataFrames for links, nodes filtered by mode
+
+        .. todo:: Right now we don't filter the nodes because transit-only
+            links with walk access are not marked as having walk access.
+            Issue discussed in https://github.com/wsp-sag/network_wrangler/issues/145
+        """
+        for mode in modes:
+            if mode not in RoadwayNetwork.MODES_TO_NETWORK_LINK_VARIABLES.keys():
+                msg = "mode value should be one of {}, got {}".format(
+                    list(RoadwayNetwork.MODES_TO_NETWORK_LINK_VARIABLES.keys()), mode,
+                )
+                WranglerLogger.error(msg)
+                raise ValueError(msg)
+
+        mode_link_variables = list(
+            {
+                variable
+                for mode in modes
+                for variable in RoadwayNetwork.MODES_TO_NETWORK_LINK_VARIABLES[mode]
+            }
+        )
+        mode_node_variables = list(
+            {
+                variable
+                for mode in modes
+                for variable in RoadwayNetwork.MODES_TO_NETWORK_NODE_VARIABLES[mode]
+            }
+        )
+
+        if not set(mode_link_variables).issubset(set(links_df.columns)):
+            msg = "{} not in provided links_df list of columns. Available columns are: \n {}".format(
+                set(mode_link_variables) - set(links_df.columns), links_df.columns
+            )
+            WranglerLogger.error(msg)
+
+        if not set(mode_node_variables).issubset(set(nodes_df.columns)):
+            msg = "{} not in provided nodes_df list of columns. Available columns are: \n {}".format(
+                set(mode_node_variables) - set(nodes_df.columns), nodes_df.columns
+            )
+            WranglerLogger.error(msg)
+
+        modal_links_df = links_df.loc[links_df[mode_link_variables].any(axis=1)]
+
+        ##TODO right now we don't filter the nodes because transit-only
+        # links with walk access are not marked as having walk access
+        # Issue discussed in https://github.com/wsp-sag/network_wrangler/issues/145
+        # modal_nodes_df = nodes_df[nodes_df[mode_node_variable] == 1]
+        modal_nodes_df = nodes_df
+
+        return modal_links_df, modal_nodes_df
+
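+    # Minimal usage sketch (illustrative only): assuming `net` is a loaded
+    # RoadwayNetwork, keep only the walk-accessible links (nodes are currently
+    # returned unfiltered, per the todo above):
+    #
+    #     walk_links_df, walk_nodes_df = RoadwayNetwork.get_modal_links_nodes(
+    #         net.links_df, net.nodes_df, modes=["walk"]
+    #     )
+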
+    @staticmethod
+    def get_modal_graph(links_df: DataFrame, nodes_df: DataFrame, mode: str = None):
+        """Creates a graph of the network for a specific mode.
+
+        Args:
+            links_df: DataFrame of standard network links
+            nodes_df: DataFrame of standard network nodes
+            mode: mode of the network, one of `drive`,`transit`,
+                `walk`, `bike`
+
+        Returns: osmnx-flavored networkX DiGraph of the network for the given mode
+        """
+        if mode not in RoadwayNetwork.MODES_TO_NETWORK_LINK_VARIABLES.keys():
+            msg = "mode value should be one of {}.".format(
+                list(RoadwayNetwork.MODES_TO_NETWORK_LINK_VARIABLES.keys())
+            )
+            WranglerLogger.error(msg)
+            raise ValueError(msg)
+
+        _links_df, _nodes_df = RoadwayNetwork.get_modal_links_nodes(
+            links_df, nodes_df, modes=[mode],
+        )
+        G = RoadwayNetwork.ox_graph(_nodes_df, _links_df)
+
+        return G
+
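+    # Minimal usage sketch (illustrative only): assuming `net` is a loaded
+    # RoadwayNetwork, build a drive-mode graph:
+    #
+    #     drive_graph = RoadwayNetwork.get_modal_graph(
+    #         net.links_df, net.nodes_df, mode="drive"
+    #     )
+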
+    def is_network_connected(
+        self, mode: str = None, links_df: DataFrame = None, nodes_df: DataFrame = None
+    ):
+        """
+        Determines if the network graph is "strongly" connected
+        A graph is strongly connected if each vertex is reachable from every other vertex.
+
+        Args:
+            mode:  mode of the network, one of `drive`,`transit`,
+                `walk`, `bike`
+            links_df: DataFrame of standard network links
+            nodes_df: DataFrame of standard network nodes
+
+        Returns: boolean
+
+        .. todo:: Consider caching graphs if they take a long time.
+        """
+
+        _nodes_df = nodes_df if nodes_df is not None else self.nodes_df
+        _links_df = links_df if links_df is not None else self.links_df
+
+        if mode:
+            _links_df, _nodes_df = RoadwayNetwork.get_modal_links_nodes(
+                _links_df, _nodes_df, modes=[mode],
+            )
+        else:
+            WranglerLogger.info(
+                "Assessing connectivity without a mode specified. "
+                "This may have limited value in interpretation. "
+                "To add mode specificity, add the keyword `mode=` when calling this method."
+            )
+
+        # TODO: consider caching graphs if they start to take forever
+        #      and we are calling them more than once.
+        G = RoadwayNetwork.ox_graph(_nodes_df, _links_df)
+        is_connected = nx.is_strongly_connected(G)
+
+        return is_connected
+
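+    # Minimal usage sketch (illustrative only): assuming `net` is a loaded
+    # RoadwayNetwork:
+    #
+    #     if not net.is_network_connected(mode="drive"):
+    #         WranglerLogger.warning("Drive network is not strongly connected")
+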
+    def assess_connectivity(
+        self,
+        mode: str = "",
+        ignore_end_nodes: bool = True,
+        links_df: DataFrame = None,
+        nodes_df: DataFrame = None,
+    ):
+        """Returns a network graph and list of disconnected subgraphs
+        as described by a list of their member nodes.
+
+        Args:
+            mode: mode of the network, one of `drive`,`transit`,
+                `walk`, `bike`
+            ignore_end_nodes: if True, ignores stray singleton nodes
+            links_df: if specified, will assess connectivity of this
+                links list rather than self.links_df
+            nodes_df: if specified, will assess connectivity of this
+                nodes list rather than self.nodes_df
+
+        Returns: Tuple of
+            Network Graph (osmnx flavored networkX DiGraph)
+            List of disconnected subgraphs described by the list of their
+                member nodes (as described by their `model_node_id`)
+        """
+        _nodes_df = nodes_df if nodes_df is not None else self.nodes_df
+        _links_df = links_df if links_df is not None else self.links_df
+
+        if mode:
+            _links_df, _nodes_df = RoadwayNetwork.get_modal_links_nodes(
+                _links_df, _nodes_df, modes=[mode],
+            )
+        else:
+            WranglerLogger.info(
+                "Assessing connectivity without a mode specified. "
+                "This may have limited value in interpretation. "
+                "To add mode specificity, add the keyword `mode=` when calling this method."
+            )
+
+        G = RoadwayNetwork.ox_graph(_nodes_df, _links_df)
+        sub_graphs = [
+            s
+            for s in sorted(
+                (G.subgraph(c) for c in nx.strongly_connected_components(G)),
+                key=len,
+                reverse=True,
+            )
+        ]
+
+        sub_graph_nodes = [
+            list(s)
+            for s in sorted(nx.strongly_connected_components(G), key=len, reverse=True)
+        ]
+
+        # sorted on decreasing length, dropping the main sub-graph
+        disconnected_sub_graph_nodes = sub_graph_nodes[1:]
+
+        # dropping the sub-graphs with only 1 node
+        if ignore_end_nodes:
+            disconnected_sub_graph_nodes = [
+                list(s) for s in disconnected_sub_graph_nodes if len(s) > 1
+            ]
+
+        WranglerLogger.info(
+            "Disconnected subnetworks for mode = {}, listed by {}:\n{}".format(
+                mode,
+                RoadwayNetwork.NODE_FOREIGN_KEY,
+                "\n".join(list(map(str, disconnected_sub_graph_nodes))),
+            )
+        )
+        return G, disconnected_sub_graph_nodes
+
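+    # Minimal usage sketch (illustrative only): assuming `net` is a loaded
+    # RoadwayNetwork, get the walk-mode graph and its disconnected subgraph nodes:
+    #
+    #     G, disconnected_nodes = net.assess_connectivity(
+    #         mode="walk", ignore_end_nodes=True
+    #     )
+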
+    @staticmethod
+    def network_connection_plot(G, disconnected_subgraph_nodes: list):
+        """Plot a graph to check for network connection.
+
+        Args:
+            G: OSMNX flavored networkX graph.
+            disconnected_subgraph_nodes: List of disconnected subgraphs described by the list of their
+                member nodes (as described by their `model_node_id`).
+
+        Returns: tuple of (fig, ax)
+        """
+
+        colors = []
+        for i in range(len(disconnected_subgraph_nodes)):
+            colors.append("#%06X" % randint(0, 0xFFFFFF))
+
+        fig, ax = ox.plot_graph(
+            G,
+            figsize=(16, 16),
+            show=False,
+            close=True,
+            edge_color="black",
+            edge_alpha=0.1,
+            node_color="black",
+            node_alpha=0.5,
+            node_size=10,
+        )
+        for i, nodes in enumerate(disconnected_subgraph_nodes):
+            for n in nodes:
+                ax.scatter(G.nodes[n]["X"], G.nodes[n]["Y"], c=colors[i], s=100)
+
+        return fig, ax
+
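+    # Minimal usage sketch (illustrative only), continuing from assess_connectivity
+    # above; the returned `fig` is assumed to be a matplotlib Figure:
+    #
+    #     fig, ax = RoadwayNetwork.network_connection_plot(G, disconnected_nodes)
+    #     fig.savefig("connectivity.png")
+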
+    def selection_map(
+        self,
+        selected_link_idx: list,
+        A: Optional[Any] = None,
+        B: Optional[Any] = None,
+        candidate_link_idx: Optional[List] = [],
+    ):
+        """
+        Shows which links are selected for roadway property change or parallel
+        managed lanes category of roadway projects.
+
+        Args:
+            selected_link_idx: list of selected link indices
+            A: optional foreign key of starting node of a route selection
+            B: optional foreign key of ending node of a route selection
+            candidate_link_idx: optional list of candidate link indices to also include in map
+        """
+        WranglerLogger.debug(
+            "Selected Links: {}\nCandidate Links: {}\n".format(
+                selected_link_idx, candidate_link_idx
+            )
+        )
+
+        graph_link_idx = list(set(selected_link_idx + candidate_link_idx))
+        graph_links = self.links_df.loc[graph_link_idx]
+
+        node_list_foreign_keys = list(
+            set(
+                [
+                    i
+                    for fk in RoadwayNetwork.LINK_FOREIGN_KEY
+                    for i in list(graph_links[fk])
+                ]
+            )
+        )
+
+        graph_nodes = self.nodes_df.loc[node_list_foreign_keys]
+
+        G = RoadwayNetwork.ox_graph(graph_nodes, graph_links)
+
+        # base map plot with whole graph
+        m = ox.plot_graph_folium(
+            G, edge_color=None, tiles="cartodbpositron", width="300px", height="250px"
+        )
+
+        # plot selection
+        selected_links = self.links_df.loc[selected_link_idx]
+
+        for _, row in selected_links.iterrows():
+            pl = ox.folium._make_folium_polyline(
+                edge=row, edge_color="blue", edge_width=5, edge_opacity=0.8
+            )
+            pl.add_to(m)
+
+        # if have A and B node add them to base map
+        def _folium_node(node_row, color="white", icon=""):
+            node_marker = folium.Marker(
+                location=[node_row["Y"], node_row["X"]],
+                icon=folium.Icon(icon=icon, color=color),
+            )
+            return node_marker
+
+        if A:
+            _folium_node(
+                self.nodes_df[self.nodes_df[RoadwayNetwork.NODE_FOREIGN_KEY] == A],
+                color="green",
+                icon="play",
+            ).add_to(m)
+
+        if B:
+            _folium_node(
+                self.nodes_df[self.nodes_df[RoadwayNetwork.NODE_FOREIGN_KEY] == B],
+                color="red",
+                icon="star",
+            ).add_to(m)
+
+        return m
+
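+    # Minimal usage sketch (illustrative only): assuming `selected_idx` came from
+    # net.select_roadway_features(...) and `a_node_id` / `b_node_id` are node foreign
+    # keys from that selection; the returned folium map can be saved to html:
+    #
+    #     m = net.selection_map(selected_idx, A=a_node_id, B=b_node_id)
+    #     m.save("selection_map.html")
+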
+    def deletion_map(self, links: dict, nodes: dict):
+        """
+        Shows which links and nodes are deleted from the roadway network
+        """
+
+        missing_error_message = []
+
+        if links is not None:
+            for key, val in links.items():
+                deleted_links = self.links_df[self.links_df[key].isin(val)]
+
+                node_list_foreign_keys = list(
+                    set(
+                        [
+                            i
+                            for fk in RoadwayNetwork.LINK_FOREIGN_KEY
+                            for i in list(deleted_links[fk])
+                        ]
+                    )
+                )
+                candidate_nodes = self.nodes_df.loc[node_list_foreign_keys]
+        else:
+            deleted_links = None
+
+        if nodes is not None:
+            for key, val in nodes.items():
+                deleted_nodes = self.nodes_df[self.nodes_df[key].isin(val)]
+        else:
+            deleted_nodes = None
+
+        G = RoadwayNetwork.ox_graph(candidate_nodes, deleted_links)
+
+        m = ox.plot_graph_folium(G, edge_color="red", tiles="cartodbpositron")
+
+        def _folium_node(node, color="white", icon=""):
+            node_circle = folium.Circle(
+                location=[node["Y"], node["X"]],
+                radius=2,
+                fill=True,
+                color=color,
+                fill_opacity=0.8,
+            )
+            return node_circle
+
+        if deleted_nodes is not None:
+            for _, row in deleted_nodes.iterrows():
+                _folium_node(row, color="red").add_to(m)
+
+        return m
+
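+    # Minimal usage sketch (illustrative only): assuming `net` is a loaded
+    # RoadwayNetwork and the links dictionary matches existing links:
+    #
+    #     m = net.deletion_map(links={"model_link_id": [123, 456]}, nodes=None)
+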
+    def addition_map(self, links: dict, nodes: dict):
+        """
+        Shows which links and nodes are added to the roadway network
+        """
+
+        if links is not None:
+            link_ids = []
+            for link in links:
+                link_ids.append(link.get(RoadwayNetwork.UNIQUE_LINK_KEY))
+
+            added_links = self.links_df[
+                self.links_df[RoadwayNetwork.UNIQUE_LINK_KEY].isin(link_ids)
+            ]
+            node_list_foreign_keys = list(
+                set(
+                    [
+                        i
+                        for fk in RoadwayNetwork.LINK_FOREIGN_KEY
+                        for i in list(added_links[fk])
+                    ]
+                )
+            )
+            try:
+                candidate_nodes = self.nodes_df.loc[node_list_foreign_keys]
+            except Exception:
+                WranglerLogger.warning(
+                    "Could not find nodes for added links; skipping addition map."
+                )
+                return None
+
+        if nodes is not None:
+            node_ids = []
+            for node in nodes:
+                node_ids.append(node.get(RoadwayNetwork.UNIQUE_NODE_KEY))
+
+            added_nodes = self.nodes_df[
+                self.nodes_df[RoadwayNetwork.UNIQUE_NODE_KEY].isin(node_ids)
+            ]
+        else:
+            added_nodes = None
+
+        G = RoadwayNetwork.ox_graph(candidate_nodes, added_links)
+
+        m = ox.plot_graph_folium(G, edge_color="green", tiles="cartodbpositron")
+
+        def _folium_node(node, color="white", icon=""):
+            node_circle = folium.Circle(
+                location=[node["Y"], node["X"]],
+                radius=2,
+                fill=True,
+                color=color,
+                fill_opacity=0.8,
+            )
+            return node_circle
+
+        if added_nodes is not None:
+            for _, row in added_nodes.iterrows():
+                _folium_node(row, color="green").add_to(m)
+
+        return m
+
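+    # Minimal usage sketch (illustrative only): assuming `links` and `nodes` are the
+    # same lists of dictionaries that were passed to add_new_roadway_feature_change:
+    #
+    #     m = net.addition_map(links, nodes)
+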
+ +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/_sources/_generated/lasso.CubeTransit.rst.txt b/_sources/_generated/lasso.CubeTransit.rst.txt new file mode 100644 index 0000000..e24b49e --- /dev/null +++ b/_sources/_generated/lasso.CubeTransit.rst.txt @@ -0,0 +1,36 @@ +lasso.CubeTransit +================= + +.. currentmodule:: lasso + +.. autoclass:: CubeTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~CubeTransit.__init__ + ~CubeTransit.add_additional_time_periods + ~CubeTransit.add_cube + ~CubeTransit.build_route_name + ~CubeTransit.calculate_start_end_times + ~CubeTransit.create_add_route_card_dict + ~CubeTransit.create_delete_route_card_dict + ~CubeTransit.create_from_cube + ~CubeTransit.create_update_route_card_dict + ~CubeTransit.cube_properties_to_standard_properties + ~CubeTransit.evaluate_differences + ~CubeTransit.evaluate_route_property_differences + ~CubeTransit.evaluate_route_shape_changes + ~CubeTransit.get_time_period_numbers_from_cube_properties + ~CubeTransit.unpack_route_name + + + + + + \ No newline at end of file diff --git a/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt b/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt new file mode 100644 index 0000000..01d7dba --- /dev/null +++ b/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt @@ -0,0 +1,104 @@ +lasso.ModelRoadwayNetwork +========================= + +.. currentmodule:: lasso + +.. autoclass:: ModelRoadwayNetwork + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~ModelRoadwayNetwork.__init__ + ~ModelRoadwayNetwork.add_counts + ~ModelRoadwayNetwork.add_new_roadway_feature_change + ~ModelRoadwayNetwork.add_variable_using_shst_reference + ~ModelRoadwayNetwork.addition_map + ~ModelRoadwayNetwork.apply + ~ModelRoadwayNetwork.apply_managed_lane_feature_change + ~ModelRoadwayNetwork.apply_python_calculation + ~ModelRoadwayNetwork.apply_roadway_feature_change + ~ModelRoadwayNetwork.assess_connectivity + ~ModelRoadwayNetwork.build_selection_key + ~ModelRoadwayNetwork.calculate_area_type + ~ModelRoadwayNetwork.calculate_assign_group_and_roadway_class + ~ModelRoadwayNetwork.calculate_centroidconnect + ~ModelRoadwayNetwork.calculate_county + ~ModelRoadwayNetwork.calculate_distance + ~ModelRoadwayNetwork.calculate_hov + ~ModelRoadwayNetwork.calculate_mpo + ~ModelRoadwayNetwork.convert_int + ~ModelRoadwayNetwork.create_ML_variable + ~ModelRoadwayNetwork.create_calculated_variables + ~ModelRoadwayNetwork.create_dummy_connector_links + ~ModelRoadwayNetwork.create_hov_corridor_variable + ~ModelRoadwayNetwork.create_managed_lane_network + ~ModelRoadwayNetwork.create_managed_variable + ~ModelRoadwayNetwork.dataframe_to_fixed_width + ~ModelRoadwayNetwork.delete_roadway_feature_change + ~ModelRoadwayNetwork.deletion_map + ~ModelRoadwayNetwork.fill_na + ~ModelRoadwayNetwork.from_RoadwayNetwork + ~ModelRoadwayNetwork.get_attribute + ~ModelRoadwayNetwork.get_managed_lane_node_ids + ~ModelRoadwayNetwork.get_modal_graph + ~ModelRoadwayNetwork.get_modal_links_nodes + ~ModelRoadwayNetwork.get_property_by_time_period_and_group + ~ModelRoadwayNetwork.is_network_connected + ~ModelRoadwayNetwork.network_connection_plot + ~ModelRoadwayNetwork.orig_dest_nodes_foreign_key + ~ModelRoadwayNetwork.ox_graph + ~ModelRoadwayNetwork.read + ~ModelRoadwayNetwork.read_match_result + ~ModelRoadwayNetwork.rename_variables_for_dbf + ~ModelRoadwayNetwork.roadway_net_to_gdf + ~ModelRoadwayNetwork.roadway_standard_to_met_council_network + 
~ModelRoadwayNetwork.select_roadway_features + ~ModelRoadwayNetwork.selection_has_unique_link_id + ~ModelRoadwayNetwork.selection_map + ~ModelRoadwayNetwork.split_properties_by_time_period_and_category + ~ModelRoadwayNetwork.validate_link_schema + ~ModelRoadwayNetwork.validate_node_schema + ~ModelRoadwayNetwork.validate_object_types + ~ModelRoadwayNetwork.validate_properties + ~ModelRoadwayNetwork.validate_selection + ~ModelRoadwayNetwork.validate_shape_schema + ~ModelRoadwayNetwork.validate_uniqueness + ~ModelRoadwayNetwork.write + ~ModelRoadwayNetwork.write_roadway_as_fixedwidth + ~ModelRoadwayNetwork.write_roadway_as_shp + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~ModelRoadwayNetwork.CALCULATED_VALUES + ~ModelRoadwayNetwork.CRS + ~ModelRoadwayNetwork.KEEP_SAME_ATTRIBUTES_ML_AND_GP + ~ModelRoadwayNetwork.LINK_FOREIGN_KEY + ~ModelRoadwayNetwork.MANAGED_LANES_LINK_ID_SCALAR + ~ModelRoadwayNetwork.MANAGED_LANES_NODE_ID_SCALAR + ~ModelRoadwayNetwork.MANAGED_LANES_REQUIRED_ATTRIBUTES + ~ModelRoadwayNetwork.MANAGED_LANES_SCALAR + ~ModelRoadwayNetwork.MAX_SEARCH_BREADTH + ~ModelRoadwayNetwork.MODES_TO_NETWORK_LINK_VARIABLES + ~ModelRoadwayNetwork.MODES_TO_NETWORK_NODE_VARIABLES + ~ModelRoadwayNetwork.NODE_FOREIGN_KEY + ~ModelRoadwayNetwork.SEARCH_BREADTH + ~ModelRoadwayNetwork.SELECTION_REQUIRES + ~ModelRoadwayNetwork.SP_WEIGHT_FACTOR + ~ModelRoadwayNetwork.UNIQUE_LINK_KEY + ~ModelRoadwayNetwork.UNIQUE_MODEL_LINK_IDENTIFIERS + ~ModelRoadwayNetwork.UNIQUE_NODE_IDENTIFIERS + ~ModelRoadwayNetwork.UNIQUE_NODE_KEY + ~ModelRoadwayNetwork.UNIQUE_SHAPE_KEY + + \ No newline at end of file diff --git a/_sources/_generated/lasso.Parameters.rst.txt b/_sources/_generated/lasso.Parameters.rst.txt new file mode 100644 index 0000000..ebc5881 --- /dev/null +++ b/_sources/_generated/lasso.Parameters.rst.txt @@ -0,0 +1,22 @@ +lasso.Parameters +================ + +.. currentmodule:: lasso + +.. autoclass:: Parameters + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Parameters.__init__ + + + + + + \ No newline at end of file diff --git a/_sources/_generated/lasso.Project.rst.txt b/_sources/_generated/lasso.Project.rst.txt new file mode 100644 index 0000000..29cfb13 --- /dev/null +++ b/_sources/_generated/lasso.Project.rst.txt @@ -0,0 +1,37 @@ +lasso.Project +============= + +.. currentmodule:: lasso + +.. autoclass:: Project + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Project.__init__ + ~Project.add_highway_changes + ~Project.add_transit_changes + ~Project.create_project + ~Project.determine_roadway_network_changes_compatability + ~Project.evaluate_changes + ~Project.read_logfile + ~Project.write_project_card + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Project.CALCULATED_VALUES + ~Project.DEFAULT_PROJECT_NAME + ~Project.STATIC_VALUES + + \ No newline at end of file diff --git a/_sources/_generated/lasso.StandardTransit.rst.txt b/_sources/_generated/lasso.StandardTransit.rst.txt new file mode 100644 index 0000000..62d2d61 --- /dev/null +++ b/_sources/_generated/lasso.StandardTransit.rst.txt @@ -0,0 +1,30 @@ +lasso.StandardTransit +===================== + +.. currentmodule:: lasso + +.. autoclass:: StandardTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. 
autosummary:: + + ~StandardTransit.__init__ + ~StandardTransit.calculate_cube_mode + ~StandardTransit.cube_format + ~StandardTransit.fromTransitNetwork + ~StandardTransit.read_gtfs + ~StandardTransit.route_properties_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_cube + ~StandardTransit.time_to_cube_time_period + ~StandardTransit.write_as_cube_lin + + + + + + \ No newline at end of file diff --git a/_sources/_generated/lasso.logger.rst.txt b/_sources/_generated/lasso.logger.rst.txt new file mode 100644 index 0000000..2054273 --- /dev/null +++ b/_sources/_generated/lasso.logger.rst.txt @@ -0,0 +1,29 @@ +lasso.logger +============ + +.. automodule:: lasso.logger + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + setupLogging + + + + + + + + + + + + + diff --git a/_sources/_generated/lasso.util.rst.txt b/_sources/_generated/lasso.util.rst.txt new file mode 100644 index 0000000..799dc53 --- /dev/null +++ b/_sources/_generated/lasso.util.rst.txt @@ -0,0 +1,32 @@ +lasso.util +========== + +.. automodule:: lasso.util + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + column_name_to_parts + get_shared_streets_intersection_hash + hhmmss_to_datetime + secs_to_datetime + + + + + + + + + + + + + diff --git a/_sources/autodoc.rst.txt b/_sources/autodoc.rst.txt new file mode 100644 index 0000000..7e48d58 --- /dev/null +++ b/_sources/autodoc.rst.txt @@ -0,0 +1,29 @@ +Lasso Classes and Functions +==================================== + +.. automodule:: lasso + :no-members: + :no-undoc-members: + :no-inherited-members: + :no-show-inheritance: + + +Base Classes +-------------- +.. autosummary:: + :toctree: _generated + :nosignatures: + + CubeTransit + StandardTransit + ModelRoadwayNetwork + Project + Parameters + +Utils and Functions +-------------------- +.. autosummary:: + :toctree: _generated + + util + logger diff --git a/_sources/index.rst.txt b/_sources/index.rst.txt new file mode 100644 index 0000000..1255d4e --- /dev/null +++ b/_sources/index.rst.txt @@ -0,0 +1,35 @@ +.. lasso documentation master file, created by + sphinx-quickstart on Thu Dec 5 15:43:28 2019. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. + +Welcome to lasso's documentation! +================================= + +This package of utilities is a wrapper around the +[network_wrangler](http://github.com/wsp-sag/network_wrangler) package +for MetCouncil. It aims to have the following functionality: +1. parse Cube log files and base highway networks and create ProjectCards + for Network Wrangler +2. parse two Cube transit line files and create ProjectCards for NetworkWrangler +3. refine Network Wrangler highway networks to contain specific variables and + settings for Metropolitan Council and export them to a format that can + be read in by Citilab's Cube software. + +.. 
toctree:: + :maxdepth: 3 + :caption: Contents: + + starting + setup + running + autodoc + + + +Indices and tables +================== + +* :ref:`genindex` +* :ref:`modindex` +* :ref:`search` diff --git a/_sources/running.md.txt b/_sources/running.md.txt new file mode 100644 index 0000000..e139dc8 --- /dev/null +++ b/_sources/running.md.txt @@ -0,0 +1,12 @@ +# Running Lasso + +## Create project files + + +## Create a scenario + + +## Exporting networks + + +## Auditing and Reporting diff --git a/_sources/setup.md.txt b/_sources/setup.md.txt new file mode 100644 index 0000000..e77d463 --- /dev/null +++ b/_sources/setup.md.txt @@ -0,0 +1,9 @@ +# Setup + +### Projects + +### Parameters + +### Settings + +### Additional Data Files diff --git a/_sources/starting.md.txt b/_sources/starting.md.txt new file mode 100644 index 0000000..769ff09 --- /dev/null +++ b/_sources/starting.md.txt @@ -0,0 +1,288 @@ +# Starting Out + +## Installation + +If you are managing multiple python versions, we suggest using [`virtualenv`](https://virtualenv.pypa.io/en/latest/) or [`conda`](https://conda.io/en/latest/) virtual environments. + +Example using a conda environment (recommended) and using the package manager [pip](https://pip.pypa.io/en/stable/) to install Lasso from the source on GitHub. + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/Lasso@master +``` + +Lasso will install `network_wrangler` from the [PyPi](https://pypi.org/project/network-wrangler/) repository because it is included in Lasso's `requirements.txt`. + +#### Bleeding Edge +If you want to install a more up-to-date or development version of network wrangler and lasso , you can do so by installing it from the `develop` branch of + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/network_wrangler@develop +pip install git+https://github.com/wsp-sag/Lasso@develop +``` + +#### From Clone +If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in [editable mode](https://pip.pypa.io/en/stable/reference/pip_install/?highlight=editable#editable-installs). + +**if you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!** + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas osmnx -n +conda activate +git clone https://github.com/wsp-sag/Lasso +git clone https://github.com/wsp-sag/network_wrangler +cd network_wrangler +pip install -e . +cd .. +cd Lasso +pip install -e . +``` + +Notes: + +1. The -e installs it in editable mode. +2. If you are not part of the project team and want to contribute code bxack to the project, please fork before you clone and then add the original repository to your upstream origin list per [these directions on github](https://help.github.com/en/articles/fork-a-repo). +3. if you wanted to install from a specific tag/version number or branch, replace `@master` with `@` or `@tag` +4. 
If you want to make use of frequent developer updates for network wrangler as well, you can also install it from clone by copying the instructions for cloning and installing Lasso for Network Wrangler + +If you are going to be doing Lasso development, we also recommend: + - a good IDE such as [Atom](http://atom.io), VS Code, Sublime Text, etc. + with Python syntax highlighting turned on. + - [GitHub Desktop](https://desktop.github.com/) to locally update your clones + +## Brief Intro + +Lasso is a 'wrapper' around the [Network Wrangler](http://wsp-sag.github.io/network_wrangler) utility. + +Both Lasso and NetworkWrangler are built around the following data schemas: + - [`roadway network`], which is based on a mashup of Open Street Map and [Shared Streets](http://sharedstreets.io). In Network Wrangler these are read in from three json files reprsenting: links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category. + - [`transit network`], which is based on a frequency-based implementation of the csv-based GTFS; and + - [`project card`], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml. + +In addition, Lasso utilizes the following data schemas: + + - [`MetCouncil Model Roadway Network Schema`], which adds data fields to the `roadway network` schema that MetCouncil uses in their travel model including breaking out data fields by time period. + - [`MetCouncil Model Transit Network Schema`], which uses the Cube PublicTransport format, and + - [`Cube Log Files`], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler. + - [`Cube public transport line files`], which define a set of transit lines in the cube software. + +### Components +Network Wrangler has the following atomic parts: + + - _RoadwayNetwork_ object, which represents the `roadway network` data as GeoDataFrames; + - _TransitNetwork_ object, which represents the `transit network` data as DataFrames; + - _ProjectCard_ object, which represents the data of the `project card`. Project cards identify the infrastructure that is changing (a selection) and defines the changes; or contains information about a new facility to be constructed or a new service to be run.; + - _Scenario_ object, which consist of at least a RoadwayNetwork, and +TransitNetwork. Scenarios can be based on or tiered from other scenarios. +Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network. + +In addition, Lasso has the following atomic parts: + + - _Project_ object, creates project cards from one of the following: a base and a build transit network in cube format, a base and build highway network, or a base highway network and a Cube log file. + - _ModelRoadwayNetwork_ object is a subclass of `RoadwayNetwork` and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube. + - _StandardTransit_, an object for holding a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files. + - _CubeTransit_, an object for storing information about transit defined in `Cube public transport line files` + . 
Has the capability to parse cube line file properties and shapes into python dictionaries and compare line files and represent changes as Project Card dictionaries. + - _Parameters_, A class representing all the parameters defining the networks + including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + +#### RoadwayNetwork + +Reads, writes, queries and and manipulates roadway network data, which +is mainly stored in the GeoDataFrames `links_df`, `nodes_df`, and `shapes_df`. + +```python +net = RoadwayNetwork.read( + link_file=MY_LINK_FILE, + node_file=MY_NODE_FILE, + shape_file=MY_SHAPE_FILE, + ) +my_selection = { + "link": [{"name": ["I 35E"]}], + "A": {"osm_node_id": "961117623"}, # start searching for segments at A + "B": {"osm_node_id": "2564047368"}, +} +net.select_roadway_features(my_selection) + +my_change = [ + { + 'property': 'lanes', + 'existing': 1, + 'set': 2, + }, + { + 'property': 'drive_access', + 'set': 0, + }, +] + +my_net.apply_roadway_feature_change( + my_net.select_roadway_features(my_selection), + my_change +) + +ml_net = net.create_managed_lane_network(in_place=False) + +ml_net.is_network_connected(mode="drive")) + +_, disconnected_nodes = ml_net.assess_connectivity( + mode="walk", + ignore_end_nodes=True +) +ml_net.write(filename=my_out_prefix, path=my_dir) +``` +#### TransitNetwork + +#### ProjectCard + +#### Scenario + +Manages sets of project cards and tiering from a base scenario/set of networks. + +```python + +my_base_scenario = { + "road_net": RoadwayNetwork.read( + link_file=STPAUL_LINK_FILE, + node_file=STPAUL_NODE_FILE, + shape_file=STPAUL_SHAPE_FILE, + fast=True, + ), + "transit_net": TransitNetwork.read(STPAUL_DIR), +} + +card_filenames = [ + "3_multiple_roadway_attribute_change.yml", + "multiple_changes.yml", + "4_simple_managed_lane.yml", +] + +project_card_directory = os.path.join(STPAUL_DIR, "project_cards") + +project_cards_list = [ + ProjectCard.read(os.path.join(project_card_directory, filename), validate=False) + for filename in card_filenames +] + +my_scenario = Scenario.create_scenario( + base_scenario=my_base_scenario, + project_cards_list=project_cards_list, +) +my_scenario.check_scenario_requisites() + +my_scenario.apply_all_projects() + +my_scenario.scenario_summary() +``` + +#### Project +Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network; + +```python + +test_project = Project.create_project( + base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + +test_project.evaluate_changes() + +test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + +``` + +#### ModelRoadwayNetwork +A subclass of network_wrangler's RoadwayNetwork +class which additional understanding about how to translate and write the +network out to the MetCouncil Roadway Network schema. + +```Python + +net = ModelRoadwayNetwork.read( + link_file=STPAUL_LINK_FILE, + node_file=STPAUL_NODE_FILE, + shape_file=STPAUL_SHAPE_FILE, + fast=True, + ) + +net.write_roadway_as_fixedwidth() + +``` + +#### StandardTransit +Translates the standard GTFS data to MetCouncil's Cube Line files. 
+ +```Python +cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) +cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) +``` + +#### CubeTransit +Used by the project class and has the capability to: + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + +```python +tn = CubeTransit.create_from_cube(CUBE_DIR) +transit_change_list = tn.evaluate_differences(base_transit_network) +``` + +#### Parameters +Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary. + +```Python +# read parameters from a yaml configuration file +# could also provide as a key/value pair +with open(config_file) as f: + my_config = yaml.safe_load(f) + +# provide parameters at instantiation of ModelRoadwayNetwork +model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork( + my_scenario.road_net, parameters=my_config.get("my_parameters", {}) + ) +# network written with direction from the parameters given +model_road_net.write_roadway_as_shp() + +``` + +### Typical Workflow + +Workflows in Lasso and Network Wrangler typically accomplish one of two goals: +1. Create Project Cards to document network changes as a result of either transit or roadway projects. +2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network. + +#### Project Cards from Transit LIN Files + + +#### Project Cards from Cube LOG Files + + +#### Model Network Files for a Scenario + + + +## Running Quickstart Jupyter Notebooks + +To learn basic lasso functionality, please refer to the following jupyter notebooks in the `/notebooks` directory: + + - `Lasso Project Card Creation Quickstart.ipynb` + - `Lasso Scenario Creation Quickstart.ipynb` + + Jupyter notebooks can be started by activating the lasso conda environment and typing `jupyter notebook`: + + ```bash + conda activate + jupyter notebook + ``` diff --git a/_static/basic.css b/_static/basic.css new file mode 100644 index 0000000..24bc73e --- /dev/null +++ b/_static/basic.css @@ -0,0 +1,855 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2020 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. 
+ * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 230px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li div.context { + color: #888; + margin: 2px 0 0 30px; + text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + 
+div.body { + min-width: 450px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} + +a.brackets:before, +span.brackets > a:before{ + content: "["; +} + +a.brackets:after, +span.brackets > a:after { + content: "]"; +} + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} + +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ + +div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + border-left: 0; + border-right: 0; + border-bottom: 1px solid #aaa; +} + +table.footnote td, table.footnote th { + border: 0 !important; +} + +th { + text-align: left; + 
padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} + +dl.footnote > dt, +dl.citation > dt { + float: left; + margin-right: 0.5em; +} + +dl.footnote > dd, +dl.citation > dd { + margin-bottom: 0em; +} + +dl.footnote > dd:after, +dl.citation > dd:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; + word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} + +dl.field-list > dt:after { + content: ":"; +} + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: 
oblique; +} + +.classifier:before { + font-style: normal; + margin: 0.5em; + content: ":"; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + +td.linenos pre { + border: 0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +div.doctest > div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.descname { + background-color: transparent; + font-weight: bold; + font-size: 1.2em; +} + +code.descclassname { + background-color: transparent; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: -1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + +span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + +@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/_static/css/badge_only.css b/_static/css/badge_only.css new file mode 100644 index 0000000..e380325 --- /dev/null +++ b/_static/css/badge_only.css @@ -0,0 +1 @@ +.fa:before{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-style:normal;font-weight:400;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) 
format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#FontAwesome) format("svg")}.fa:before{font-family:FontAwesome;font-style:normal;font-weight:400;line-height:1}.fa:before,a .fa{text-decoration:inherit}.fa:before,a .fa,li .fa{display:inline-block}li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before,.icon-book:before{content:"\f02d"}.fa-caret-down:before,.icon-caret-down:before{content:"\f0d7"}.fa-caret-up:before,.icon-caret-up:before{content:"\f0d8"}.fa-caret-left:before,.icon-caret-left:before{content:"\f0d9"}.fa-caret-right:before,.icon-caret-right:before{content:"\f0da"}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60}.rst-versions .rst-current-version:after{clear:both;content:"";display:block}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}} \ No newline at end of file diff --git a/_static/css/fonts/Roboto-Slab-Bold.woff b/_static/css/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and b/_static/css/fonts/Roboto-Slab-Bold.woff differ diff --git a/_static/css/fonts/Roboto-Slab-Bold.woff2 b/_static/css/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/_static/css/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/_static/css/fonts/Roboto-Slab-Regular.woff b/_static/css/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/_static/css/fonts/Roboto-Slab-Regular.woff differ diff --git a/_static/css/fonts/Roboto-Slab-Regular.woff2 b/_static/css/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary 
files /dev/null and b/_static/css/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/_static/css/fonts/fontawesome-webfont.eot b/_static/css/fonts/fontawesome-webfont.eot new file mode 100644 index 0000000..e9f60ca Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.eot differ diff --git a/_static/css/fonts/fontawesome-webfont.svg b/_static/css/fonts/fontawesome-webfont.svg new file mode 100644 index 0000000..855c845 --- /dev/null +++ b/_static/css/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/_static/css/fonts/fontawesome-webfont.ttf b/_static/css/fonts/fontawesome-webfont.ttf new file mode 100644 index 0000000..35acda2 Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.ttf differ diff --git a/_static/css/fonts/fontawesome-webfont.woff b/_static/css/fonts/fontawesome-webfont.woff new file mode 100644 index 0000000..400014a Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.woff differ diff --git a/_static/css/fonts/fontawesome-webfont.woff2 b/_static/css/fonts/fontawesome-webfont.woff2 new file mode 100644 index 0000000..4d13fc6 Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.woff2 differ diff --git a/_static/css/fonts/lato-bold-italic.woff b/_static/css/fonts/lato-bold-italic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/_static/css/fonts/lato-bold-italic.woff differ diff --git a/_static/css/fonts/lato-bold-italic.woff2 b/_static/css/fonts/lato-bold-italic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/_static/css/fonts/lato-bold-italic.woff2 differ diff --git a/_static/css/fonts/lato-bold.woff b/_static/css/fonts/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/_static/css/fonts/lato-bold.woff differ diff --git a/_static/css/fonts/lato-bold.woff2 b/_static/css/fonts/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and 
b/_static/css/fonts/lato-bold.woff2 differ diff --git a/_static/css/fonts/lato-normal-italic.woff b/_static/css/fonts/lato-normal-italic.woff new file mode 100644 index 0000000..76114bc Binary files /dev/null and b/_static/css/fonts/lato-normal-italic.woff differ diff --git a/_static/css/fonts/lato-normal-italic.woff2 b/_static/css/fonts/lato-normal-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/_static/css/fonts/lato-normal-italic.woff2 differ diff --git a/_static/css/fonts/lato-normal.woff b/_static/css/fonts/lato-normal.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and b/_static/css/fonts/lato-normal.woff differ diff --git a/_static/css/fonts/lato-normal.woff2 b/_static/css/fonts/lato-normal.woff2 new file mode 100644 index 0000000..3bf9843 Binary files /dev/null and b/_static/css/fonts/lato-normal.woff2 differ diff --git a/_static/css/theme.css b/_static/css/theme.css new file mode 100644 index 0000000..8cd4f10 --- /dev/null +++ b/_static/css/theme.css @@ -0,0 +1,4 @@ +html{box-sizing:border-box}*,:after,:before{box-sizing:inherit}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}[hidden],audio:not([controls]){display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:active,a:hover{outline:0}abbr[title]{border-bottom:1px dotted}b,strong{font-weight:700}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;text-decoration:none}ins,mark{color:#000}mark{background:#ff0;font-style:italic;font-weight:700}.rst-content code,.rst-content tt,code,kbd,pre,samp{font-family:monospace,serif;_font-family:courier new,monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:after,q:before{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sup{top:-.5em}sub{bottom:-.25em}dl,ol,ul{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure,form{margin:0}label{cursor:pointer}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type=button],input[type=reset],input[type=submit]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type=search]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}textarea{resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:.2em 0;background:#ccc;color:#000;padding:.2em 0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir br{display:none}.hidden{display:none!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media 
print{body,html,section{background:none!important}*{box-shadow:none!important;text-shadow:none!important;filter:none!important;-ms-filter:none!important}a,a:visited{text-decoration:underline}.ir a:after,a[href^="#"]:after,a[href^="javascript:"]:after{content:""}blockquote,pre{page-break-inside:avoid}thead{display:table-header-group}img,tr{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}.rst-content .toctree-wrapper>p.caption,h2,h3,p{orphans:3;widows:3}.rst-content .toctree-wrapper>p.caption,h2,h3{page-break-after:avoid}}.btn,.fa:before,.icon:before,.rst-content .admonition,.rst-content .admonition-title:before,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .code-block-caption .headerlink:before,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-alert,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a,.wy-menu-vertical li.current>a span.toctree-expand:before,.wy-menu-vertical li.on a,.wy-menu-vertical li.on a span.toctree-expand:before,.wy-menu-vertical li span.toctree-expand:before,.wy-nav-top a,.wy-side-nav-search .wy-dropdown>a,.wy-side-nav-search>a,input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week],select,textarea{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}/*! 
+ * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome + * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License) + */@font-face{font-family:FontAwesome;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713);src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix&v=4.7.0) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#fontawesomeregular) format("svg");font-weight:400;font-style:normal}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-menu-vertical li.current>a span.toctree-expand,.wy-menu-vertical li.on a span.toctree-expand,.wy-menu-vertical li span.toctree-expand{display:inline-block;font:normal normal normal 14px/1 FontAwesome;font-size:inherit;text-rendering:auto;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.fa-lg{font-size:1.33333em;line-height:.75em;vertical-align:-15%}.fa-2x{font-size:2em}.fa-3x{font-size:3em}.fa-4x{font-size:4em}.fa-5x{font-size:5em}.fa-fw{width:1.28571em;text-align:center}.fa-ul{padding-left:0;margin-left:2.14286em;list-style-type:none}.fa-ul>li{position:relative}.fa-li{position:absolute;left:-2.14286em;width:2.14286em;top:.14286em;text-align:center}.fa-li.fa-lg{left:-1.85714em}.fa-border{padding:.2em .25em .15em;border:.08em solid #eee;border-radius:.1em}.fa-pull-left{float:left}.fa-pull-right{float:right}.fa-pull-left.icon,.fa.fa-pull-left,.rst-content .code-block-caption .fa-pull-left.headerlink,.rst-content .fa-pull-left.admonition-title,.rst-content code.download span.fa-pull-left:first-child,.rst-content dl dt .fa-pull-left.headerlink,.rst-content h1 .fa-pull-left.headerlink,.rst-content h2 .fa-pull-left.headerlink,.rst-content h3 .fa-pull-left.headerlink,.rst-content h4 .fa-pull-left.headerlink,.rst-content h5 .fa-pull-left.headerlink,.rst-content h6 .fa-pull-left.headerlink,.rst-content p.caption .fa-pull-left.headerlink,.rst-content table>caption .fa-pull-left.headerlink,.rst-content tt.download span.fa-pull-left:first-child,.wy-menu-vertical li.current>a span.fa-pull-left.toctree-expand,.wy-menu-vertical li.on a span.fa-pull-left.toctree-expand,.wy-menu-vertical li span.fa-pull-left.toctree-expand{margin-right:.3em}.fa-pull-right.icon,.fa.fa-pull-right,.rst-content .code-block-caption .fa-pull-right.headerlink,.rst-content .fa-pull-right.admonition-title,.rst-content code.download span.fa-pull-right:first-child,.rst-content dl dt .fa-pull-right.headerlink,.rst-content h1 .fa-pull-right.headerlink,.rst-content h2 .fa-pull-right.headerlink,.rst-content h3 .fa-pull-right.headerlink,.rst-content h4 .fa-pull-right.headerlink,.rst-content h5 .fa-pull-right.headerlink,.rst-content h6 .fa-pull-right.headerlink,.rst-content p.caption .fa-pull-right.headerlink,.rst-content table>caption .fa-pull-right.headerlink,.rst-content tt.download 
span.fa-pull-right:first-child,.wy-menu-vertical li.current>a span.fa-pull-right.toctree-expand,.wy-menu-vertical li.on a span.fa-pull-right.toctree-expand,.wy-menu-vertical li span.fa-pull-right.toctree-expand{margin-left:.3em}.pull-right{float:right}.pull-left{float:left}.fa.pull-left,.pull-left.icon,.rst-content .code-block-caption .pull-left.headerlink,.rst-content .pull-left.admonition-title,.rst-content code.download span.pull-left:first-child,.rst-content dl dt .pull-left.headerlink,.rst-content h1 .pull-left.headerlink,.rst-content h2 .pull-left.headerlink,.rst-content h3 .pull-left.headerlink,.rst-content h4 .pull-left.headerlink,.rst-content h5 .pull-left.headerlink,.rst-content h6 .pull-left.headerlink,.rst-content p.caption .pull-left.headerlink,.rst-content table>caption .pull-left.headerlink,.rst-content tt.download span.pull-left:first-child,.wy-menu-vertical li.current>a span.pull-left.toctree-expand,.wy-menu-vertical li.on a span.pull-left.toctree-expand,.wy-menu-vertical li span.pull-left.toctree-expand{margin-right:.3em}.fa.pull-right,.pull-right.icon,.rst-content .code-block-caption .pull-right.headerlink,.rst-content .pull-right.admonition-title,.rst-content code.download span.pull-right:first-child,.rst-content dl dt .pull-right.headerlink,.rst-content h1 .pull-right.headerlink,.rst-content h2 .pull-right.headerlink,.rst-content h3 .pull-right.headerlink,.rst-content h4 .pull-right.headerlink,.rst-content h5 .pull-right.headerlink,.rst-content h6 .pull-right.headerlink,.rst-content p.caption .pull-right.headerlink,.rst-content table>caption .pull-right.headerlink,.rst-content tt.download span.pull-right:first-child,.wy-menu-vertical li.current>a span.pull-right.toctree-expand,.wy-menu-vertical li.on a span.pull-right.toctree-expand,.wy-menu-vertical li span.pull-right.toctree-expand{margin-left:.3em}.fa-spin{-webkit-animation:fa-spin 2s linear infinite;animation:fa-spin 2s linear infinite}.fa-pulse{-webkit-animation:fa-spin 1s steps(8) infinite;animation:fa-spin 1s steps(8) infinite}@-webkit-keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}.fa-rotate-90{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=1)";-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.fa-rotate-180{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2)";-webkit-transform:rotate(180deg);-ms-transform:rotate(180deg);transform:rotate(180deg)}.fa-rotate-270{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=3)";-webkit-transform:rotate(270deg);-ms-transform:rotate(270deg);transform:rotate(270deg)}.fa-flip-horizontal{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=0, mirror=1)";-webkit-transform:scaleX(-1);-ms-transform:scaleX(-1);transform:scaleX(-1)}.fa-flip-vertical{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2, mirror=1)";-webkit-transform:scaleY(-1);-ms-transform:scaleY(-1);transform:scaleY(-1)}:root .fa-flip-horizontal,:root .fa-flip-vertical,:root .fa-rotate-90,:root .fa-rotate-180,:root 
.fa-rotate-270{filter:none}.fa-stack{position:relative;display:inline-block;width:2em;height:2em;line-height:2em;vertical-align:middle}.fa-stack-1x,.fa-stack-2x{position:absolute;left:0;width:100%;text-align:center}.fa-stack-1x{line-height:inherit}.fa-stack-2x{font-size:2em}.fa-inverse{color:#fff}.fa-glass:before{content:""}.fa-music:before{content:""}.fa-search:before,.icon-search:before{content:""}.fa-envelope-o:before{content:""}.fa-heart:before{content:""}.fa-star:before{content:""}.fa-star-o:before{content:""}.fa-user:before{content:""}.fa-film:before{content:""}.fa-th-large:before{content:""}.fa-th:before{content:""}.fa-th-list:before{content:""}.fa-check:before{content:""}.fa-close:before,.fa-remove:before,.fa-times:before{content:""}.fa-search-plus:before{content:""}.fa-search-minus:before{content:""}.fa-power-off:before{content:""}.fa-signal:before{content:""}.fa-cog:before,.fa-gear:before{content:""}.fa-trash-o:before{content:""}.fa-home:before,.icon-home:before{content:""}.fa-file-o:before{content:""}.fa-clock-o:before{content:""}.fa-road:before{content:""}.fa-download:before,.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{content:""}.fa-arrow-circle-o-down:before{content:""}.fa-arrow-circle-o-up:before{content:""}.fa-inbox:before{content:""}.fa-play-circle-o:before{content:""}.fa-repeat:before,.fa-rotate-right:before{content:""}.fa-refresh:before{content:""}.fa-list-alt:before{content:""}.fa-lock:before{content:""}.fa-flag:before{content:""}.fa-headphones:before{content:""}.fa-volume-off:before{content:""}.fa-volume-down:before{content:""}.fa-volume-up:before{content:""}.fa-qrcode:before{content:""}.fa-barcode:before{content:""}.fa-tag:before{content:""}.fa-tags:before{content:""}.fa-book:before,.icon-book:before{content:""}.fa-bookmark:before{content:""}.fa-print:before{content:""}.fa-camera:before{content:""}.fa-font:before{content:""}.fa-bold:before{content:""}.fa-italic:before{content:""}.fa-text-height:before{content:""}.fa-text-width:before{content:""}.fa-align-left:before{content:""}.fa-align-center:before{content:""}.fa-align-right:before{content:""}.fa-align-justify:before{content:""}.fa-list:before{content:""}.fa-dedent:before,.fa-outdent:before{content:""}.fa-indent:before{content:""}.fa-video-camera:before{content:""}.fa-image:before,.fa-photo:before,.fa-picture-o:before{content:""}.fa-pencil:before{content:""}.fa-map-marker:before{content:""}.fa-adjust:before{content:""}.fa-tint:before{content:""}.fa-edit:before,.fa-pencil-square-o:before{content:""}.fa-share-square-o:before{content:""}.fa-check-square-o:before{content:""}.fa-arrows:before{content:""}.fa-step-backward:before{content:""}.fa-fast-backward:before{content:""}.fa-backward:before{content:""}.fa-play:before{content:""}.fa-pause:before{content:""}.fa-stop:before{content:""}.fa-forward:before{content:""}.fa-fast-forward:before{content:""}.fa-step-forward:before{content:""}.fa-eject:before{content:""}.fa-chevron-left:before{content:""}.fa-chevron-right:before{content:""}.fa-plus-circle:before{content:""}.fa-minus-circle:before{content:""}.fa-times-circle:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before{content:""}.fa-check-circle:before,.wy-inline-validate.wy-inline-validate-success 
.wy-input-context:before{content:""}.fa-question-circle:before{content:""}.fa-info-circle:before{content:""}.fa-crosshairs:before{content:""}.fa-times-circle-o:before{content:""}.fa-check-circle-o:before{content:""}.fa-ban:before{content:""}.fa-arrow-left:before{content:""}.fa-arrow-right:before{content:""}.fa-arrow-up:before{content:""}.fa-arrow-down:before{content:""}.fa-mail-forward:before,.fa-share:before{content:""}.fa-expand:before{content:""}.fa-compress:before{content:""}.fa-plus:before{content:""}.fa-minus:before{content:""}.fa-asterisk:before{content:""}.fa-exclamation-circle:before,.rst-content .admonition-title:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before{content:""}.fa-gift:before{content:""}.fa-leaf:before{content:""}.fa-fire:before,.icon-fire:before{content:""}.fa-eye:before{content:""}.fa-eye-slash:before{content:""}.fa-exclamation-triangle:before,.fa-warning:before{content:""}.fa-plane:before{content:""}.fa-calendar:before{content:""}.fa-random:before{content:""}.fa-comment:before{content:""}.fa-magnet:before{content:""}.fa-chevron-up:before{content:""}.fa-chevron-down:before{content:""}.fa-retweet:before{content:""}.fa-shopping-cart:before{content:""}.fa-folder:before{content:""}.fa-folder-open:before{content:""}.fa-arrows-v:before{content:""}.fa-arrows-h:before{content:""}.fa-bar-chart-o:before,.fa-bar-chart:before{content:""}.fa-twitter-square:before{content:""}.fa-facebook-square:before{content:""}.fa-camera-retro:before{content:""}.fa-key:before{content:""}.fa-cogs:before,.fa-gears:before{content:""}.fa-comments:before{content:""}.fa-thumbs-o-up:before{content:""}.fa-thumbs-o-down:before{content:""}.fa-star-half:before{content:""}.fa-heart-o:before{content:""}.fa-sign-out:before{content:""}.fa-linkedin-square:before{content:""}.fa-thumb-tack:before{content:""}.fa-external-link:before{content:""}.fa-sign-in:before{content:""}.fa-trophy:before{content:""}.fa-github-square:before{content:""}.fa-upload:before{content:""}.fa-lemon-o:before{content:""}.fa-phone:before{content:""}.fa-square-o:before{content:""}.fa-bookmark-o:before{content:""}.fa-phone-square:before{content:""}.fa-twitter:before{content:""}.fa-facebook-f:before,.fa-facebook:before{content:""}.fa-github:before,.icon-github:before{content:""}.fa-unlock:before{content:""}.fa-credit-card:before{content:""}.fa-feed:before,.fa-rss:before{content:""}.fa-hdd-o:before{content:""}.fa-bullhorn:before{content:""}.fa-bell:before{content:""}.fa-certificate:before{content:""}.fa-hand-o-right:before{content:""}.fa-hand-o-left:before{content:""}.fa-hand-o-up:before{content:""}.fa-hand-o-down:before{content:""}.fa-arrow-circle-left:before,.icon-circle-arrow-left:before{content:""}.fa-arrow-circle-right:before,.icon-circle-arrow-right:before{content:""}.fa-arrow-circle-up:before{content:""}.fa-arrow-circle-down:before{content:""}.fa-globe:before{content:""}.fa-wrench:before{content:""}.fa-tasks:before{content:""}.fa-filter:before{content:""}.fa-briefcase:before{content:""}.fa-arrows-alt:before{content:""}.fa-group:before,.fa-users:before{content:""}.fa-chain:before,.fa-link:before,.icon-link:before{content:""}.fa-cloud:before{content:""}.fa-flask:before{content:""}.fa-cut:before,.fa-scissors:before{content:""}.fa-copy:before,.fa-files-o:before{content:""}.fa-paperclip:before{content:""}.fa-floppy-o:before,.fa-save:before{content:""}.fa
-square:before{content:""}.fa-bars:before,.fa-navicon:before,.fa-reorder:before{content:""}.fa-list-ul:before{content:""}.fa-list-ol:before{content:""}.fa-strikethrough:before{content:""}.fa-underline:before{content:""}.fa-table:before{content:""}.fa-magic:before{content:""}.fa-truck:before{content:""}.fa-pinterest:before{content:""}.fa-pinterest-square:before{content:""}.fa-google-plus-square:before{content:""}.fa-google-plus:before{content:""}.fa-money:before{content:""}.fa-caret-down:before,.icon-caret-down:before,.wy-dropdown .caret:before{content:""}.fa-caret-up:before{content:""}.fa-caret-left:before{content:""}.fa-caret-right:before{content:""}.fa-columns:before{content:""}.fa-sort:before,.fa-unsorted:before{content:""}.fa-sort-desc:before,.fa-sort-down:before{content:""}.fa-sort-asc:before,.fa-sort-up:before{content:""}.fa-envelope:before{content:""}.fa-linkedin:before{content:""}.fa-rotate-left:before,.fa-undo:before{content:""}.fa-gavel:before,.fa-legal:before{content:""}.fa-dashboard:before,.fa-tachometer:before{content:""}.fa-comment-o:before{content:""}.fa-comments-o:before{content:""}.fa-bolt:before,.fa-flash:before{content:""}.fa-sitemap:before{content:""}.fa-umbrella:before{content:""}.fa-clipboard:before,.fa-paste:before{content:""}.fa-lightbulb-o:before{content:""}.fa-exchange:before{content:""}.fa-cloud-download:before{content:""}.fa-cloud-upload:before{content:""}.fa-user-md:before{content:""}.fa-stethoscope:before{content:""}.fa-suitcase:before{content:""}.fa-bell-o:before{content:""}.fa-coffee:before{content:""}.fa-cutlery:before{content:""}.fa-file-text-o:before{content:""}.fa-building-o:before{content:""}.fa-hospital-o:before{content:""}.fa-ambulance:before{content:""}.fa-medkit:before{content:""}.fa-fighter-jet:before{content:""}.fa-beer:before{content:""}.fa-h-square:before{content:""}.fa-plus-square:before{content:""}.fa-angle-double-left:before{content:""}.fa-angle-double-right:before{content:""}.fa-angle-double-up:before{content:""}.fa-angle-double-down:before{content:""}.fa-angle-left:before{content:""}.fa-angle-right:before{content:""}.fa-angle-up:before{content:""}.fa-angle-down:before{content:""}.fa-desktop:before{content:""}.fa-laptop:before{content:""}.fa-tablet:before{content:""}.fa-mobile-phone:before,.fa-mobile:before{content:""}.fa-circle-o:before{content:""}.fa-quote-left:before{content:""}.fa-quote-right:before{content:""}.fa-spinner:before{content:""}.fa-circle:before{content:""}.fa-mail-reply:before,.fa-reply:before{content:""}.fa-github-alt:before{content:""}.fa-folder-o:before{content:""}.fa-folder-open-o:before{content:""}.fa-smile-o:before{content:""}.fa-frown-o:before{content:""}.fa-meh-o:before{content:""}.fa-gamepad:before{content:""}.fa-keyboard-o:before{content:""}.fa-flag-o:before{content:""}.fa-flag-checkered:before{content:""}.fa-terminal:before{content:""}.fa-code:before{content:""}.fa-mail-reply-all:before,.fa-reply-all:before{content:""}.fa-star-half-empty:before,.fa-star-half-full:before,.fa-star-half-o:before{content:""}.fa-location-arrow:before{content:""}.fa-crop:before{content:""}.fa-code-fork:before{content:""}.fa-chain-broken:before,.fa-unlink:before{content:""}.fa-question:before{content:""}.fa-info:before{content:""}.fa-exclamation:before{content:""}.fa-superscript:before{content:""}.fa-subscript:before{content:""}.fa-eraser:before{content:""}.fa-puzzle-piece:before{content:""}.fa-microphone:before{content:""}.fa-microphone-slash:
before{content:""}.fa-shield:before{content:""}.fa-calendar-o:before{content:""}.fa-fire-extinguisher:before{content:""}.fa-rocket:before{content:""}.fa-maxcdn:before{content:""}.fa-chevron-circle-left:before{content:""}.fa-chevron-circle-right:before{content:""}.fa-chevron-circle-up:before{content:""}.fa-chevron-circle-down:before{content:""}.fa-html5:before{content:""}.fa-css3:before{content:""}.fa-anchor:before{content:""}.fa-unlock-alt:before{content:""}.fa-bullseye:before{content:""}.fa-ellipsis-h:before{content:""}.fa-ellipsis-v:before{content:""}.fa-rss-square:before{content:""}.fa-play-circle:before{content:""}.fa-ticket:before{content:""}.fa-minus-square:before{content:""}.fa-minus-square-o:before,.wy-menu-vertical li.current>a span.toctree-expand:before,.wy-menu-vertical li.on a span.toctree-expand:before{content:""}.fa-level-up:before{content:""}.fa-level-down:before{content:""}.fa-check-square:before{content:""}.fa-pencil-square:before{content:""}.fa-external-link-square:before{content:""}.fa-share-square:before{content:""}.fa-compass:before{content:""}.fa-caret-square-o-down:before,.fa-toggle-down:before{content:""}.fa-caret-square-o-up:before,.fa-toggle-up:before{content:""}.fa-caret-square-o-right:before,.fa-toggle-right:before{content:""}.fa-eur:before,.fa-euro:before{content:""}.fa-gbp:before{content:""}.fa-dollar:before,.fa-usd:before{content:""}.fa-inr:before,.fa-rupee:before{content:""}.fa-cny:before,.fa-jpy:before,.fa-rmb:before,.fa-yen:before{content:""}.fa-rouble:before,.fa-rub:before,.fa-ruble:before{content:""}.fa-krw:before,.fa-won:before{content:""}.fa-bitcoin:before,.fa-btc:before{content:""}.fa-file:before{content:""}.fa-file-text:before{content:""}.fa-sort-alpha-asc:before{content:""}.fa-sort-alpha-desc:before{content:""}.fa-sort-amount-asc:before{content:""}.fa-sort-amount-desc:before{content:""}.fa-sort-numeric-asc:before{content:""}.fa-sort-numeric-desc:before{content:""}.fa-thumbs-up:before{content:""}.fa-thumbs-down:before{content:""}.fa-youtube-square:before{content:""}.fa-youtube:before{content:""}.fa-xing:before{content:""}.fa-xing-square:before{content:""}.fa-youtube-play:before{content:""}.fa-dropbox:before{content:""}.fa-stack-overflow:before{content:""}.fa-instagram:before{content:""}.fa-flickr:before{content:""}.fa-adn:before{content:""}.fa-bitbucket:before,.icon-bitbucket:before{content:""}.fa-bitbucket-square:before{content:""}.fa-tumblr:before{content:""}.fa-tumblr-square:before{content:""}.fa-long-arrow-down:before{content:""}.fa-long-arrow-up:before{content:""}.fa-long-arrow-left:before{content:""}.fa-long-arrow-right:before{content:""}.fa-apple:before{content:""}.fa-windows:before{content:""}.fa-android:before{content:""}.fa-linux:before{content:""}.fa-dribbble:before{content:""}.fa-skype:before{content:""}.fa-foursquare:before{content:""}.fa-trello:before{content:""}.fa-female:before{content:""}.fa-male:before{content:""}.fa-gittip:before,.fa-gratipay:before{content:""}.fa-sun-o:before{content:""}.fa-moon-o:before{content:""}.fa-archive:before{content:""}.fa-bug:before{content:""}.fa-vk:before{content:""}.fa-weibo:before{content:""}.fa-renren:before{content:""}.fa-pagelines:before{content:""}.fa-stack-exchange:before{content:""}.fa-arrow-circle-o-right:before{content:""}.fa-arrow-circle-o-left:before{content:""}.fa-caret-square-o-left:before,.fa-toggle-left:before{content:""}.fa-dot-circle-o:before{content:""}.fa-wheelchair:before{content:""}.fa-vime
o-square:before{content:""}.fa-try:before,.fa-turkish-lira:before{content:""}.fa-plus-square-o:before,.wy-menu-vertical li span.toctree-expand:before{content:""}.fa-space-shuttle:before{content:""}.fa-slack:before{content:""}.fa-envelope-square:before{content:""}.fa-wordpress:before{content:""}.fa-openid:before{content:""}.fa-bank:before,.fa-institution:before,.fa-university:before{content:""}.fa-graduation-cap:before,.fa-mortar-board:before{content:""}.fa-yahoo:before{content:""}.fa-google:before{content:""}.fa-reddit:before{content:""}.fa-reddit-square:before{content:""}.fa-stumbleupon-circle:before{content:""}.fa-stumbleupon:before{content:""}.fa-delicious:before{content:""}.fa-digg:before{content:""}.fa-pied-piper-pp:before{content:""}.fa-pied-piper-alt:before{content:""}.fa-drupal:before{content:""}.fa-joomla:before{content:""}.fa-language:before{content:""}.fa-fax:before{content:""}.fa-building:before{content:""}.fa-child:before{content:""}.fa-paw:before{content:""}.fa-spoon:before{content:""}.fa-cube:before{content:""}.fa-cubes:before{content:""}.fa-behance:before{content:""}.fa-behance-square:before{content:""}.fa-steam:before{content:""}.fa-steam-square:before{content:""}.fa-recycle:before{content:""}.fa-automobile:before,.fa-car:before{content:""}.fa-cab:before,.fa-taxi:before{content:""}.fa-tree:before{content:""}.fa-spotify:before{content:""}.fa-deviantart:before{content:""}.fa-soundcloud:before{content:""}.fa-database:before{content:""}.fa-file-pdf-o:before{content:""}.fa-file-word-o:before{content:""}.fa-file-excel-o:before{content:""}.fa-file-powerpoint-o:before{content:""}.fa-file-image-o:before,.fa-file-photo-o:before,.fa-file-picture-o:before{content:""}.fa-file-archive-o:before,.fa-file-zip-o:before{content:""}.fa-file-audio-o:before,.fa-file-sound-o:before{content:""}.fa-file-movie-o:before,.fa-file-video-o:before{content:""}.fa-file-code-o:before{content:""}.fa-vine:before{content:""}.fa-codepen:before{content:""}.fa-jsfiddle:before{content:""}.fa-life-bouy:before,.fa-life-buoy:before,.fa-life-ring:before,.fa-life-saver:before,.fa-support:before{content:""}.fa-circle-o-notch:before{content:""}.fa-ra:before,.fa-rebel:before,.fa-resistance:before{content:""}.fa-empire:before,.fa-ge:before{content:""}.fa-git-square:before{content:""}.fa-git:before{content:""}.fa-hacker-news:before,.fa-y-combinator-square:before,.fa-yc-square:before{content:""}.fa-tencent-weibo:before{content:""}.fa-qq:before{content:""}.fa-wechat:before,.fa-weixin:before{content:""}.fa-paper-plane:before,.fa-send:before{content:""}.fa-paper-plane-o:before,.fa-send-o:before{content:""}.fa-history:before{content:""}.fa-circle-thin:before{content:""}.fa-header:before{content:""}.fa-paragraph:before{content:""}.fa-sliders:before{content:""}.fa-share-alt:before{content:""}.fa-share-alt-square:before{content:""}.fa-bomb:before{content:""}.fa-futbol-o:before,.fa-soccer-ball-o:before{content:""}.fa-tty:before{content:""}.fa-binoculars:before{content:""}.fa-plug:before{content:""}.fa-slideshare:before{content:""}.fa-twitch:before{content:""}.fa-yelp:before{content:""}.fa-newspaper-o:before{content:""}.fa-wifi:before{content:""}.fa-calculator:before{content:""}.fa-paypal:before{content:""}.fa-google-wallet:before{content:""}.fa-cc-visa:before{content:""}.fa-cc-mastercard:before{content:""}.fa-cc-discover:before{content:""}.fa-cc-amex:before{content:""}.fa-cc-paypal:before{content:""}.fa-cc-stripe:before{content:""}.fa-bell-sl
ash:before{content:""}.fa-bell-slash-o:before{content:""}.fa-trash:before{content:""}.fa-copyright:before{content:""}.fa-at:before{content:""}.fa-eyedropper:before{content:""}.fa-paint-brush:before{content:""}.fa-birthday-cake:before{content:""}.fa-area-chart:before{content:""}.fa-pie-chart:before{content:""}.fa-line-chart:before{content:""}.fa-lastfm:before{content:""}.fa-lastfm-square:before{content:""}.fa-toggle-off:before{content:""}.fa-toggle-on:before{content:""}.fa-bicycle:before{content:""}.fa-bus:before{content:""}.fa-ioxhost:before{content:""}.fa-angellist:before{content:""}.fa-cc:before{content:""}.fa-ils:before,.fa-shekel:before,.fa-sheqel:before{content:""}.fa-meanpath:before{content:""}.fa-buysellads:before{content:""}.fa-connectdevelop:before{content:""}.fa-dashcube:before{content:""}.fa-forumbee:before{content:""}.fa-leanpub:before{content:""}.fa-sellsy:before{content:""}.fa-shirtsinbulk:before{content:""}.fa-simplybuilt:before{content:""}.fa-skyatlas:before{content:""}.fa-cart-plus:before{content:""}.fa-cart-arrow-down:before{content:""}.fa-diamond:before{content:""}.fa-ship:before{content:""}.fa-user-secret:before{content:""}.fa-motorcycle:before{content:""}.fa-street-view:before{content:""}.fa-heartbeat:before{content:""}.fa-venus:before{content:""}.fa-mars:before{content:""}.fa-mercury:before{content:""}.fa-intersex:before,.fa-transgender:before{content:""}.fa-transgender-alt:before{content:""}.fa-venus-double:before{content:""}.fa-mars-double:before{content:""}.fa-venus-mars:before{content:""}.fa-mars-stroke:before{content:""}.fa-mars-stroke-v:before{content:""}.fa-mars-stroke-h:before{content:""}.fa-neuter:before{content:""}.fa-genderless:before{content:""}.fa-facebook-official:before{content:""}.fa-pinterest-p:before{content:""}.fa-whatsapp:before{content:""}.fa-server:before{content:""}.fa-user-plus:before{content:""}.fa-user-times:before{content:""}.fa-bed:before,.fa-hotel:before{content:""}.fa-viacoin:before{content:""}.fa-train:before{content:""}.fa-subway:before{content:""}.fa-medium:before{content:""}.fa-y-combinator:before,.fa-yc:before{content:""}.fa-optin-monster:before{content:""}.fa-opencart:before{content:""}.fa-expeditedssl:before{content:""}.fa-battery-4:before,.fa-battery-full:before,.fa-battery:before{content:""}.fa-battery-3:before,.fa-battery-three-quarters:before{content:""}.fa-battery-2:before,.fa-battery-half:before{content:""}.fa-battery-1:before,.fa-battery-quarter:before{content:""}.fa-battery-0:before,.fa-battery-empty:before{content:""}.fa-mouse-pointer:before{content:""}.fa-i-cursor:before{content:""}.fa-object-group:before{content:""}.fa-object-ungroup:before{content:""}.fa-sticky-note:before{content:""}.fa-sticky-note-o:before{content:""}.fa-cc-jcb:before{content:""}.fa-cc-diners-club:before{content:""}.fa-clone:before{content:""}.fa-balance-scale:before{content:""}.fa-hourglass-o:before{content:""}.fa-hourglass-1:before,.fa-hourglass-start:before{content:""}.fa-hourglass-2:before,.fa-hourglass-half:before{content:""}.fa-hourglass-3:before,.fa-hourglass-end:before{content:""}.fa-hourglass:before{content:""}.fa-hand-grab-o:before,.fa-hand-rock-o:before{content:""}.fa-hand-paper-o:before,.fa-hand-stop-o:before{content:""}.fa-hand-scissors-o:before{content:""}.fa-hand-lizard-o:before{content:""}.fa-hand-spock-o:before{content:""}.fa-hand-pointer-o:before{content:""}.fa-hand-peace-o:before{content:""}.fa-trademark:before{content:""}.fa-registered:bef
ore{content:""}.fa-creative-commons:before{content:""}.fa-gg:before{content:""}.fa-gg-circle:before{content:""}.fa-tripadvisor:before{content:""}.fa-odnoklassniki:before{content:""}.fa-odnoklassniki-square:before{content:""}.fa-get-pocket:before{content:""}.fa-wikipedia-w:before{content:""}.fa-safari:before{content:""}.fa-chrome:before{content:""}.fa-firefox:before{content:""}.fa-opera:before{content:""}.fa-internet-explorer:before{content:""}.fa-television:before,.fa-tv:before{content:""}.fa-contao:before{content:""}.fa-500px:before{content:""}.fa-amazon:before{content:""}.fa-calendar-plus-o:before{content:""}.fa-calendar-minus-o:before{content:""}.fa-calendar-times-o:before{content:""}.fa-calendar-check-o:before{content:""}.fa-industry:before{content:""}.fa-map-pin:before{content:""}.fa-map-signs:before{content:""}.fa-map-o:before{content:""}.fa-map:before{content:""}.fa-commenting:before{content:""}.fa-commenting-o:before{content:""}.fa-houzz:before{content:""}.fa-vimeo:before{content:""}.fa-black-tie:before{content:""}.fa-fonticons:before{content:""}.fa-reddit-alien:before{content:""}.fa-edge:before{content:""}.fa-credit-card-alt:before{content:""}.fa-codiepie:before{content:""}.fa-modx:before{content:""}.fa-fort-awesome:before{content:""}.fa-usb:before{content:""}.fa-product-hunt:before{content:""}.fa-mixcloud:before{content:""}.fa-scribd:before{content:""}.fa-pause-circle:before{content:""}.fa-pause-circle-o:before{content:""}.fa-stop-circle:before{content:""}.fa-stop-circle-o:before{content:""}.fa-shopping-bag:before{content:""}.fa-shopping-basket:before{content:""}.fa-hashtag:before{content:""}.fa-bluetooth:before{content:""}.fa-bluetooth-b:before{content:""}.fa-percent:before{content:""}.fa-gitlab:before,.icon-gitlab:before{content:""}.fa-wpbeginner:before{content:""}.fa-wpforms:before{content:""}.fa-envira:before{content:""}.fa-universal-access:before{content:""}.fa-wheelchair-alt:before{content:""}.fa-question-circle-o:before{content:""}.fa-blind:before{content:""}.fa-audio-description:before{content:""}.fa-volume-control-phone:before{content:""}.fa-braille:before{content:""}.fa-assistive-listening-systems:before{content:""}.fa-american-sign-language-interpreting:before,.fa-asl-interpreting:before{content:""}.fa-deaf:before,.fa-deafness:before,.fa-hard-of-hearing:before{content:""}.fa-glide:before{content:""}.fa-glide-g:before{content:""}.fa-sign-language:before,.fa-signing:before{content:""}.fa-low-vision:before{content:""}.fa-viadeo:before{content:""}.fa-viadeo-square:before{content:""}.fa-snapchat:before{content:""}.fa-snapchat-ghost:before{content:""}.fa-snapchat-square:before{content:""}.fa-pied-piper:before{content:""}.fa-first-order:before{content:""}.fa-yoast:before{content:""}.fa-themeisle:before{content:""}.fa-google-plus-circle:before,.fa-google-plus-official:before{content:""}.fa-fa:before,.fa-font-awesome:before{content:""}.fa-handshake-o:before{content:""}.fa-envelope-open:before{content:""}.fa-envelope-open-o:before{content:""}.fa-linode:before{content:""}.fa-address-book:before{content:""}.fa-address-book-o:before{content:""}.fa-address-card:before,.fa-vcard:before{content:""}.fa-address-card-o:before,.fa-vcard-o:before{content:""}.fa-user-circle:before{content:""}.fa-user-circle-o:before{content:""}.fa-user-o:before{content:""}.fa-id-badge:before{content:""}.fa-drivers-license:before,.fa-id-card:before{content:""}.fa-drivers-license-o:before,.fa-id-card-o:before{content
:""}.fa-quora:before{content:""}.fa-free-code-camp:before{content:""}.fa-telegram:before{content:""}.fa-thermometer-4:before,.fa-thermometer-full:before,.fa-thermometer:before{content:""}.fa-thermometer-3:before,.fa-thermometer-three-quarters:before{content:""}.fa-thermometer-2:before,.fa-thermometer-half:before{content:""}.fa-thermometer-1:before,.fa-thermometer-quarter:before{content:""}.fa-thermometer-0:before,.fa-thermometer-empty:before{content:""}.fa-shower:before{content:""}.fa-bath:before,.fa-bathtub:before,.fa-s15:before{content:""}.fa-podcast:before{content:""}.fa-window-maximize:before{content:""}.fa-window-minimize:before{content:""}.fa-window-restore:before{content:""}.fa-times-rectangle:before,.fa-window-close:before{content:""}.fa-times-rectangle-o:before,.fa-window-close-o:before{content:""}.fa-bandcamp:before{content:""}.fa-grav:before{content:""}.fa-etsy:before{content:""}.fa-imdb:before{content:""}.fa-ravelry:before{content:""}.fa-eercast:before{content:""}.fa-microchip:before{content:""}.fa-snowflake-o:before{content:""}.fa-superpowers:before{content:""}.fa-wpexplorer:before{content:""}.fa-meetup:before{content:""}.sr-only{position:absolute;width:1px;height:1px;padding:0;margin:-1px;overflow:hidden;clip:rect(0,0,0,0);border:0}.sr-only-focusable:active,.sr-only-focusable:focus{position:static;width:auto;height:auto;margin:0;overflow:visible;clip:auto}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-dropdown .caret,.wy-inline-validate.wy-inline-validate-danger .wy-input-context,.wy-inline-validate.wy-inline-validate-info .wy-input-context,.wy-inline-validate.wy-inline-validate-success .wy-input-context,.wy-inline-validate.wy-inline-validate-warning .wy-input-context,.wy-menu-vertical li.current>a span.toctree-expand,.wy-menu-vertical li.on a span.toctree-expand,.wy-menu-vertical li span.toctree-expand{font-family:inherit}.fa:before,.icon:before,.rst-content .admonition-title:before,.rst-content .code-block-caption .headerlink:before,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a span.toctree-expand:before,.wy-menu-vertical li.on a span.toctree-expand:before,.wy-menu-vertical li span.toctree-expand:before{font-family:FontAwesome;display:inline-block;font-style:normal;font-weight:400;line-height:1;text-decoration:inherit}.rst-content .code-block-caption a .headerlink,.rst-content a .admonition-title,.rst-content code.download a 
span:first-child,.rst-content dl dt a .headerlink,.rst-content h1 a .headerlink,.rst-content h2 a .headerlink,.rst-content h3 a .headerlink,.rst-content h4 a .headerlink,.rst-content h5 a .headerlink,.rst-content h6 a .headerlink,.rst-content p.caption a .headerlink,.rst-content table>caption a .headerlink,.rst-content tt.download a span:first-child,.wy-menu-vertical li.current>a span.toctree-expand,.wy-menu-vertical li.on a span.toctree-expand,.wy-menu-vertical li a span.toctree-expand,a .fa,a .icon,a .rst-content .admonition-title,a .rst-content .code-block-caption .headerlink,a .rst-content code.download span:first-child,a .rst-content dl dt .headerlink,a .rst-content h1 .headerlink,a .rst-content h2 .headerlink,a .rst-content h3 .headerlink,a .rst-content h4 .headerlink,a .rst-content h5 .headerlink,a .rst-content h6 .headerlink,a .rst-content p.caption .headerlink,a .rst-content table>caption .headerlink,a .rst-content tt.download span:first-child,a .wy-menu-vertical li span.toctree-expand{display:inline-block;text-decoration:inherit}.btn .fa,.btn .icon,.btn .rst-content .admonition-title,.btn .rst-content .code-block-caption .headerlink,.btn .rst-content code.download span:first-child,.btn .rst-content dl dt .headerlink,.btn .rst-content h1 .headerlink,.btn .rst-content h2 .headerlink,.btn .rst-content h3 .headerlink,.btn .rst-content h4 .headerlink,.btn .rst-content h5 .headerlink,.btn .rst-content h6 .headerlink,.btn .rst-content p.caption .headerlink,.btn .rst-content table>caption .headerlink,.btn .rst-content tt.download span:first-child,.btn .wy-menu-vertical li.current>a span.toctree-expand,.btn .wy-menu-vertical li.on a span.toctree-expand,.btn .wy-menu-vertical li span.toctree-expand,.nav .fa,.nav .icon,.nav .rst-content .admonition-title,.nav .rst-content .code-block-caption .headerlink,.nav .rst-content code.download span:first-child,.nav .rst-content dl dt .headerlink,.nav .rst-content h1 .headerlink,.nav .rst-content h2 .headerlink,.nav .rst-content h3 .headerlink,.nav .rst-content h4 .headerlink,.nav .rst-content h5 .headerlink,.nav .rst-content h6 .headerlink,.nav .rst-content p.caption .headerlink,.nav .rst-content table>caption .headerlink,.nav .rst-content tt.download span:first-child,.nav .wy-menu-vertical li.current>a span.toctree-expand,.nav .wy-menu-vertical li.on a span.toctree-expand,.nav .wy-menu-vertical li span.toctree-expand,.rst-content .btn .admonition-title,.rst-content .code-block-caption .btn .headerlink,.rst-content .code-block-caption .nav .headerlink,.rst-content .nav .admonition-title,.rst-content code.download .btn span:first-child,.rst-content code.download .nav span:first-child,.rst-content dl dt .btn .headerlink,.rst-content dl dt .nav .headerlink,.rst-content h1 .btn .headerlink,.rst-content h1 .nav .headerlink,.rst-content h2 .btn .headerlink,.rst-content h2 .nav .headerlink,.rst-content h3 .btn .headerlink,.rst-content h3 .nav .headerlink,.rst-content h4 .btn .headerlink,.rst-content h4 .nav .headerlink,.rst-content h5 .btn .headerlink,.rst-content h5 .nav .headerlink,.rst-content h6 .btn .headerlink,.rst-content h6 .nav .headerlink,.rst-content p.caption .btn .headerlink,.rst-content p.caption .nav .headerlink,.rst-content table>caption .btn .headerlink,.rst-content table>caption .nav .headerlink,.rst-content tt.download .btn span:first-child,.rst-content tt.download .nav span:first-child,.wy-menu-vertical li .btn span.toctree-expand,.wy-menu-vertical li.current>a .btn span.toctree-expand,.wy-menu-vertical li.current>a .nav 
span.toctree-expand,.wy-menu-vertical li .nav span.toctree-expand,.wy-menu-vertical li.on a .btn span.toctree-expand,.wy-menu-vertical li.on a .nav span.toctree-expand{display:inline}.btn .fa-large.icon,.btn .fa.fa-large,.btn .rst-content .code-block-caption .fa-large.headerlink,.btn .rst-content .fa-large.admonition-title,.btn .rst-content code.download span.fa-large:first-child,.btn .rst-content dl dt .fa-large.headerlink,.btn .rst-content h1 .fa-large.headerlink,.btn .rst-content h2 .fa-large.headerlink,.btn .rst-content h3 .fa-large.headerlink,.btn .rst-content h4 .fa-large.headerlink,.btn .rst-content h5 .fa-large.headerlink,.btn .rst-content h6 .fa-large.headerlink,.btn .rst-content p.caption .fa-large.headerlink,.btn .rst-content table>caption .fa-large.headerlink,.btn .rst-content tt.download span.fa-large:first-child,.btn .wy-menu-vertical li span.fa-large.toctree-expand,.nav .fa-large.icon,.nav .fa.fa-large,.nav .rst-content .code-block-caption .fa-large.headerlink,.nav .rst-content .fa-large.admonition-title,.nav .rst-content code.download span.fa-large:first-child,.nav .rst-content dl dt .fa-large.headerlink,.nav .rst-content h1 .fa-large.headerlink,.nav .rst-content h2 .fa-large.headerlink,.nav .rst-content h3 .fa-large.headerlink,.nav .rst-content h4 .fa-large.headerlink,.nav .rst-content h5 .fa-large.headerlink,.nav .rst-content h6 .fa-large.headerlink,.nav .rst-content p.caption .fa-large.headerlink,.nav .rst-content table>caption .fa-large.headerlink,.nav .rst-content tt.download span.fa-large:first-child,.nav .wy-menu-vertical li span.fa-large.toctree-expand,.rst-content .btn .fa-large.admonition-title,.rst-content .code-block-caption .btn .fa-large.headerlink,.rst-content .code-block-caption .nav .fa-large.headerlink,.rst-content .nav .fa-large.admonition-title,.rst-content code.download .btn span.fa-large:first-child,.rst-content code.download .nav span.fa-large:first-child,.rst-content dl dt .btn .fa-large.headerlink,.rst-content dl dt .nav .fa-large.headerlink,.rst-content h1 .btn .fa-large.headerlink,.rst-content h1 .nav .fa-large.headerlink,.rst-content h2 .btn .fa-large.headerlink,.rst-content h2 .nav .fa-large.headerlink,.rst-content h3 .btn .fa-large.headerlink,.rst-content h3 .nav .fa-large.headerlink,.rst-content h4 .btn .fa-large.headerlink,.rst-content h4 .nav .fa-large.headerlink,.rst-content h5 .btn .fa-large.headerlink,.rst-content h5 .nav .fa-large.headerlink,.rst-content h6 .btn .fa-large.headerlink,.rst-content h6 .nav .fa-large.headerlink,.rst-content p.caption .btn .fa-large.headerlink,.rst-content p.caption .nav .fa-large.headerlink,.rst-content table>caption .btn .fa-large.headerlink,.rst-content table>caption .nav .fa-large.headerlink,.rst-content tt.download .btn span.fa-large:first-child,.rst-content tt.download .nav span.fa-large:first-child,.wy-menu-vertical li .btn span.fa-large.toctree-expand,.wy-menu-vertical li .nav span.fa-large.toctree-expand{line-height:.9em}.btn .fa-spin.icon,.btn .fa.fa-spin,.btn .rst-content .code-block-caption .fa-spin.headerlink,.btn .rst-content .fa-spin.admonition-title,.btn .rst-content code.download span.fa-spin:first-child,.btn .rst-content dl dt .fa-spin.headerlink,.btn .rst-content h1 .fa-spin.headerlink,.btn .rst-content h2 .fa-spin.headerlink,.btn .rst-content h3 .fa-spin.headerlink,.btn .rst-content h4 .fa-spin.headerlink,.btn .rst-content h5 .fa-spin.headerlink,.btn .rst-content h6 .fa-spin.headerlink,.btn .rst-content p.caption .fa-spin.headerlink,.btn .rst-content table>caption .fa-spin.headerlink,.btn 
.rst-content tt.download span.fa-spin:first-child,.btn .wy-menu-vertical li span.fa-spin.toctree-expand,.nav .fa-spin.icon,.nav .fa.fa-spin,.nav .rst-content .code-block-caption .fa-spin.headerlink,.nav .rst-content .fa-spin.admonition-title,.nav .rst-content code.download span.fa-spin:first-child,.nav .rst-content dl dt .fa-spin.headerlink,.nav .rst-content h1 .fa-spin.headerlink,.nav .rst-content h2 .fa-spin.headerlink,.nav .rst-content h3 .fa-spin.headerlink,.nav .rst-content h4 .fa-spin.headerlink,.nav .rst-content h5 .fa-spin.headerlink,.nav .rst-content h6 .fa-spin.headerlink,.nav .rst-content p.caption .fa-spin.headerlink,.nav .rst-content table>caption .fa-spin.headerlink,.nav .rst-content tt.download span.fa-spin:first-child,.nav .wy-menu-vertical li span.fa-spin.toctree-expand,.rst-content .btn .fa-spin.admonition-title,.rst-content .code-block-caption .btn .fa-spin.headerlink,.rst-content .code-block-caption .nav .fa-spin.headerlink,.rst-content .nav .fa-spin.admonition-title,.rst-content code.download .btn span.fa-spin:first-child,.rst-content code.download .nav span.fa-spin:first-child,.rst-content dl dt .btn .fa-spin.headerlink,.rst-content dl dt .nav .fa-spin.headerlink,.rst-content h1 .btn .fa-spin.headerlink,.rst-content h1 .nav .fa-spin.headerlink,.rst-content h2 .btn .fa-spin.headerlink,.rst-content h2 .nav .fa-spin.headerlink,.rst-content h3 .btn .fa-spin.headerlink,.rst-content h3 .nav .fa-spin.headerlink,.rst-content h4 .btn .fa-spin.headerlink,.rst-content h4 .nav .fa-spin.headerlink,.rst-content h5 .btn .fa-spin.headerlink,.rst-content h5 .nav .fa-spin.headerlink,.rst-content h6 .btn .fa-spin.headerlink,.rst-content h6 .nav .fa-spin.headerlink,.rst-content p.caption .btn .fa-spin.headerlink,.rst-content p.caption .nav .fa-spin.headerlink,.rst-content table>caption .btn .fa-spin.headerlink,.rst-content table>caption .nav .fa-spin.headerlink,.rst-content tt.download .btn span.fa-spin:first-child,.rst-content tt.download .nav span.fa-spin:first-child,.wy-menu-vertical li .btn span.fa-spin.toctree-expand,.wy-menu-vertical li .nav span.fa-spin.toctree-expand{display:inline-block}.btn.fa:before,.btn.icon:before,.rst-content .btn.admonition-title:before,.rst-content .code-block-caption .btn.headerlink:before,.rst-content code.download span.btn:first-child:before,.rst-content dl dt .btn.headerlink:before,.rst-content h1 .btn.headerlink:before,.rst-content h2 .btn.headerlink:before,.rst-content h3 .btn.headerlink:before,.rst-content h4 .btn.headerlink:before,.rst-content h5 .btn.headerlink:before,.rst-content h6 .btn.headerlink:before,.rst-content p.caption .btn.headerlink:before,.rst-content table>caption .btn.headerlink:before,.rst-content tt.download span.btn:first-child:before,.wy-menu-vertical li span.btn.toctree-expand:before{opacity:.5;-webkit-transition:opacity .05s ease-in;-moz-transition:opacity .05s ease-in;transition:opacity .05s ease-in}.btn.fa:hover:before,.btn.icon:hover:before,.rst-content .btn.admonition-title:hover:before,.rst-content .code-block-caption .btn.headerlink:hover:before,.rst-content code.download span.btn:first-child:hover:before,.rst-content dl dt .btn.headerlink:hover:before,.rst-content h1 .btn.headerlink:hover:before,.rst-content h2 .btn.headerlink:hover:before,.rst-content h3 .btn.headerlink:hover:before,.rst-content h4 .btn.headerlink:hover:before,.rst-content h5 .btn.headerlink:hover:before,.rst-content h6 .btn.headerlink:hover:before,.rst-content p.caption .btn.headerlink:hover:before,.rst-content table>caption 
.btn.headerlink:hover:before,.rst-content tt.download span.btn:first-child:hover:before,.wy-menu-vertical li span.btn.toctree-expand:hover:before{opacity:1}.btn-mini .fa:before,.btn-mini .icon:before,.btn-mini .rst-content .admonition-title:before,.btn-mini .rst-content .code-block-caption .headerlink:before,.btn-mini .rst-content code.download span:first-child:before,.btn-mini .rst-content dl dt .headerlink:before,.btn-mini .rst-content h1 .headerlink:before,.btn-mini .rst-content h2 .headerlink:before,.btn-mini .rst-content h3 .headerlink:before,.btn-mini .rst-content h4 .headerlink:before,.btn-mini .rst-content h5 .headerlink:before,.btn-mini .rst-content h6 .headerlink:before,.btn-mini .rst-content p.caption .headerlink:before,.btn-mini .rst-content table>caption .headerlink:before,.btn-mini .rst-content tt.download span:first-child:before,.btn-mini .wy-menu-vertical li span.toctree-expand:before,.rst-content .btn-mini .admonition-title:before,.rst-content .code-block-caption .btn-mini .headerlink:before,.rst-content code.download .btn-mini span:first-child:before,.rst-content dl dt .btn-mini .headerlink:before,.rst-content h1 .btn-mini .headerlink:before,.rst-content h2 .btn-mini .headerlink:before,.rst-content h3 .btn-mini .headerlink:before,.rst-content h4 .btn-mini .headerlink:before,.rst-content h5 .btn-mini .headerlink:before,.rst-content h6 .btn-mini .headerlink:before,.rst-content p.caption .btn-mini .headerlink:before,.rst-content table>caption .btn-mini .headerlink:before,.rst-content tt.download .btn-mini span:first-child:before,.wy-menu-vertical li .btn-mini span.toctree-expand:before{font-size:14px;vertical-align:-15%}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.wy-alert{padding:12px;line-height:24px;margin-bottom:24px;background:#e7f2fa}.rst-content .admonition-title,.wy-alert-title{font-weight:700;display:block;color:#fff;background:#6ab0de;padding:6px 12px;margin:-12px -12px 12px}.rst-content .danger,.rst-content .error,.rst-content .wy-alert-danger.admonition,.rst-content .wy-alert-danger.admonition-todo,.rst-content .wy-alert-danger.attention,.rst-content .wy-alert-danger.caution,.rst-content .wy-alert-danger.hint,.rst-content .wy-alert-danger.important,.rst-content .wy-alert-danger.note,.rst-content .wy-alert-danger.seealso,.rst-content .wy-alert-danger.tip,.rst-content .wy-alert-danger.warning,.wy-alert.wy-alert-danger{background:#fdf3f2}.rst-content .danger .admonition-title,.rst-content .danger .wy-alert-title,.rst-content .error .admonition-title,.rst-content .error .wy-alert-title,.rst-content .wy-alert-danger.admonition-todo .admonition-title,.rst-content .wy-alert-danger.admonition-todo .wy-alert-title,.rst-content .wy-alert-danger.admonition .admonition-title,.rst-content .wy-alert-danger.admonition .wy-alert-title,.rst-content .wy-alert-danger.attention .admonition-title,.rst-content .wy-alert-danger.attention .wy-alert-title,.rst-content .wy-alert-danger.caution .admonition-title,.rst-content .wy-alert-danger.caution .wy-alert-title,.rst-content .wy-alert-danger.hint .admonition-title,.rst-content .wy-alert-danger.hint .wy-alert-title,.rst-content .wy-alert-danger.important .admonition-title,.rst-content .wy-alert-danger.important .wy-alert-title,.rst-content .wy-alert-danger.note .admonition-title,.rst-content .wy-alert-danger.note 
.wy-alert-title,.rst-content .wy-alert-danger.seealso .admonition-title,.rst-content .wy-alert-danger.seealso .wy-alert-title,.rst-content .wy-alert-danger.tip .admonition-title,.rst-content .wy-alert-danger.tip .wy-alert-title,.rst-content .wy-alert-danger.warning .admonition-title,.rst-content .wy-alert-danger.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-danger .admonition-title,.wy-alert.wy-alert-danger .rst-content .admonition-title,.wy-alert.wy-alert-danger .wy-alert-title{background:#f29f97}.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .warning,.rst-content .wy-alert-warning.admonition,.rst-content .wy-alert-warning.danger,.rst-content .wy-alert-warning.error,.rst-content .wy-alert-warning.hint,.rst-content .wy-alert-warning.important,.rst-content .wy-alert-warning.note,.rst-content .wy-alert-warning.seealso,.rst-content .wy-alert-warning.tip,.wy-alert.wy-alert-warning{background:#ffedcc}.rst-content .admonition-todo .admonition-title,.rst-content .admonition-todo .wy-alert-title,.rst-content .attention .admonition-title,.rst-content .attention .wy-alert-title,.rst-content .caution .admonition-title,.rst-content .caution .wy-alert-title,.rst-content .warning .admonition-title,.rst-content .warning .wy-alert-title,.rst-content .wy-alert-warning.admonition .admonition-title,.rst-content .wy-alert-warning.admonition .wy-alert-title,.rst-content .wy-alert-warning.danger .admonition-title,.rst-content .wy-alert-warning.danger .wy-alert-title,.rst-content .wy-alert-warning.error .admonition-title,.rst-content .wy-alert-warning.error .wy-alert-title,.rst-content .wy-alert-warning.hint .admonition-title,.rst-content .wy-alert-warning.hint .wy-alert-title,.rst-content .wy-alert-warning.important .admonition-title,.rst-content .wy-alert-warning.important .wy-alert-title,.rst-content .wy-alert-warning.note .admonition-title,.rst-content .wy-alert-warning.note .wy-alert-title,.rst-content .wy-alert-warning.seealso .admonition-title,.rst-content .wy-alert-warning.seealso .wy-alert-title,.rst-content .wy-alert-warning.tip .admonition-title,.rst-content .wy-alert-warning.tip .wy-alert-title,.rst-content .wy-alert.wy-alert-warning .admonition-title,.wy-alert.wy-alert-warning .rst-content .admonition-title,.wy-alert.wy-alert-warning .wy-alert-title{background:#f0b37e}.rst-content .note,.rst-content .seealso,.rst-content .wy-alert-info.admonition,.rst-content .wy-alert-info.admonition-todo,.rst-content .wy-alert-info.attention,.rst-content .wy-alert-info.caution,.rst-content .wy-alert-info.danger,.rst-content .wy-alert-info.error,.rst-content .wy-alert-info.hint,.rst-content .wy-alert-info.important,.rst-content .wy-alert-info.tip,.rst-content .wy-alert-info.warning,.wy-alert.wy-alert-info{background:#e7f2fa}.rst-content .note .admonition-title,.rst-content .note .wy-alert-title,.rst-content .seealso .admonition-title,.rst-content .seealso .wy-alert-title,.rst-content .wy-alert-info.admonition-todo .admonition-title,.rst-content .wy-alert-info.admonition-todo .wy-alert-title,.rst-content .wy-alert-info.admonition .admonition-title,.rst-content .wy-alert-info.admonition .wy-alert-title,.rst-content .wy-alert-info.attention .admonition-title,.rst-content .wy-alert-info.attention .wy-alert-title,.rst-content .wy-alert-info.caution .admonition-title,.rst-content .wy-alert-info.caution .wy-alert-title,.rst-content .wy-alert-info.danger .admonition-title,.rst-content .wy-alert-info.danger .wy-alert-title,.rst-content .wy-alert-info.error 
.admonition-title,.rst-content .wy-alert-info.error .wy-alert-title,.rst-content .wy-alert-info.hint .admonition-title,.rst-content .wy-alert-info.hint .wy-alert-title,.rst-content .wy-alert-info.important .admonition-title,.rst-content .wy-alert-info.important .wy-alert-title,.rst-content .wy-alert-info.tip .admonition-title,.rst-content .wy-alert-info.tip .wy-alert-title,.rst-content .wy-alert-info.warning .admonition-title,.rst-content .wy-alert-info.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-info .admonition-title,.wy-alert.wy-alert-info .rst-content .admonition-title,.wy-alert.wy-alert-info .wy-alert-title{background:#6ab0de}.rst-content .hint,.rst-content .important,.rst-content .tip,.rst-content .wy-alert-success.admonition,.rst-content .wy-alert-success.admonition-todo,.rst-content .wy-alert-success.attention,.rst-content .wy-alert-success.caution,.rst-content .wy-alert-success.danger,.rst-content .wy-alert-success.error,.rst-content .wy-alert-success.note,.rst-content .wy-alert-success.seealso,.rst-content .wy-alert-success.warning,.wy-alert.wy-alert-success{background:#dbfaf4}.rst-content .hint .admonition-title,.rst-content .hint .wy-alert-title,.rst-content .important .admonition-title,.rst-content .important .wy-alert-title,.rst-content .tip .admonition-title,.rst-content .tip .wy-alert-title,.rst-content .wy-alert-success.admonition-todo .admonition-title,.rst-content .wy-alert-success.admonition-todo .wy-alert-title,.rst-content .wy-alert-success.admonition .admonition-title,.rst-content .wy-alert-success.admonition .wy-alert-title,.rst-content .wy-alert-success.attention .admonition-title,.rst-content .wy-alert-success.attention .wy-alert-title,.rst-content .wy-alert-success.caution .admonition-title,.rst-content .wy-alert-success.caution .wy-alert-title,.rst-content .wy-alert-success.danger .admonition-title,.rst-content .wy-alert-success.danger .wy-alert-title,.rst-content .wy-alert-success.error .admonition-title,.rst-content .wy-alert-success.error .wy-alert-title,.rst-content .wy-alert-success.note .admonition-title,.rst-content .wy-alert-success.note .wy-alert-title,.rst-content .wy-alert-success.seealso .admonition-title,.rst-content .wy-alert-success.seealso .wy-alert-title,.rst-content .wy-alert-success.warning .admonition-title,.rst-content .wy-alert-success.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-success .admonition-title,.wy-alert.wy-alert-success .rst-content .admonition-title,.wy-alert.wy-alert-success .wy-alert-title{background:#1abc9c}.rst-content .wy-alert-neutral.admonition,.rst-content .wy-alert-neutral.admonition-todo,.rst-content .wy-alert-neutral.attention,.rst-content .wy-alert-neutral.caution,.rst-content .wy-alert-neutral.danger,.rst-content .wy-alert-neutral.error,.rst-content .wy-alert-neutral.hint,.rst-content .wy-alert-neutral.important,.rst-content .wy-alert-neutral.note,.rst-content .wy-alert-neutral.seealso,.rst-content .wy-alert-neutral.tip,.rst-content .wy-alert-neutral.warning,.wy-alert.wy-alert-neutral{background:#f3f6f6}.rst-content .wy-alert-neutral.admonition-todo .admonition-title,.rst-content .wy-alert-neutral.admonition-todo .wy-alert-title,.rst-content .wy-alert-neutral.admonition .admonition-title,.rst-content .wy-alert-neutral.admonition .wy-alert-title,.rst-content .wy-alert-neutral.attention .admonition-title,.rst-content .wy-alert-neutral.attention .wy-alert-title,.rst-content .wy-alert-neutral.caution .admonition-title,.rst-content .wy-alert-neutral.caution .wy-alert-title,.rst-content 
.wy-alert-neutral.danger .admonition-title,.rst-content .wy-alert-neutral.danger .wy-alert-title,.rst-content .wy-alert-neutral.error .admonition-title,.rst-content .wy-alert-neutral.error .wy-alert-title,.rst-content .wy-alert-neutral.hint .admonition-title,.rst-content .wy-alert-neutral.hint .wy-alert-title,.rst-content .wy-alert-neutral.important .admonition-title,.rst-content .wy-alert-neutral.important .wy-alert-title,.rst-content .wy-alert-neutral.note .admonition-title,.rst-content .wy-alert-neutral.note .wy-alert-title,.rst-content .wy-alert-neutral.seealso .admonition-title,.rst-content .wy-alert-neutral.seealso .wy-alert-title,.rst-content .wy-alert-neutral.tip .admonition-title,.rst-content .wy-alert-neutral.tip .wy-alert-title,.rst-content .wy-alert-neutral.warning .admonition-title,.rst-content .wy-alert-neutral.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-neutral .admonition-title,.wy-alert.wy-alert-neutral .rst-content .admonition-title,.wy-alert.wy-alert-neutral .wy-alert-title{color:#404040;background:#e1e4e5}.rst-content .wy-alert-neutral.admonition-todo a,.rst-content .wy-alert-neutral.admonition a,.rst-content .wy-alert-neutral.attention a,.rst-content .wy-alert-neutral.caution a,.rst-content .wy-alert-neutral.danger a,.rst-content .wy-alert-neutral.error a,.rst-content .wy-alert-neutral.hint a,.rst-content .wy-alert-neutral.important a,.rst-content .wy-alert-neutral.note a,.rst-content .wy-alert-neutral.seealso a,.rst-content .wy-alert-neutral.tip a,.rst-content .wy-alert-neutral.warning a,.wy-alert.wy-alert-neutral a{color:#2980b9}.rst-content .admonition-todo p:last-child,.rst-content .admonition p:last-child,.rst-content .attention p:last-child,.rst-content .caution p:last-child,.rst-content .danger p:last-child,.rst-content .error p:last-child,.rst-content .hint p:last-child,.rst-content .important p:last-child,.rst-content .note p:last-child,.rst-content .seealso p:last-child,.rst-content .tip p:last-child,.rst-content .warning p:last-child,.wy-alert p:last-child{margin-bottom:0}.wy-tray-container{position:fixed;bottom:0;left:0;z-index:600}.wy-tray-container li{display:block;width:300px;background:transparent;color:#fff;text-align:center;box-shadow:0 5px 5px 0 rgba(0,0,0,.1);padding:0 24px;min-width:20%;opacity:0;height:0;line-height:56px;overflow:hidden;-webkit-transition:all .3s ease-in;-moz-transition:all .3s ease-in;transition:all .3s ease-in}.wy-tray-container li.wy-tray-item-success{background:#27ae60}.wy-tray-container li.wy-tray-item-info{background:#2980b9}.wy-tray-container li.wy-tray-item-warning{background:#e67e22}.wy-tray-container li.wy-tray-item-danger{background:#e74c3c}.wy-tray-container li.on{opacity:1;height:56px}@media screen and (max-width:768px){.wy-tray-container{bottom:auto;top:0;width:100%}.wy-tray-container li{width:100%}}button{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle;cursor:pointer;line-height:normal;-webkit-appearance:button;*overflow:visible}button::-moz-focus-inner,input::-moz-focus-inner{border:0;padding:0}button[disabled]{cursor:default}.btn{display:inline-block;border-radius:2px;line-height:normal;white-space:nowrap;text-align:center;cursor:pointer;font-size:100%;padding:6px 12px 8px;color:#fff;border:1px solid rgba(0,0,0,.1);background-color:#27ae60;text-decoration:none;font-weight:400;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;box-shadow:inset 0 1px 2px -1px hsla(0,0%,100%,.5),inset 0 -2px 0 0 
rgba(0,0,0,.1);outline-none:false;vertical-align:middle;*display:inline;zoom:1;-webkit-user-drag:none;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none;-webkit-transition:all .1s linear;-moz-transition:all .1s linear;transition:all .1s linear}.btn-hover{background:#2e8ece;color:#fff}.btn:hover{background:#2cc36b;color:#fff}.btn:focus{background:#2cc36b;outline:0}.btn:active{box-shadow:inset 0 -1px 0 0 rgba(0,0,0,.05),inset 0 2px 0 0 rgba(0,0,0,.1);padding:8px 12px 6px}.btn:visited{color:#fff}.btn-disabled,.btn-disabled:active,.btn-disabled:focus,.btn-disabled:hover,.btn:disabled{background-image:none;filter:progid:DXImageTransform.Microsoft.gradient(enabled = false);filter:alpha(opacity=40);opacity:.4;cursor:not-allowed;box-shadow:none}.btn::-moz-focus-inner{padding:0;border:0}.btn-small{font-size:80%}.btn-info{background-color:#2980b9!important}.btn-info:hover{background-color:#2e8ece!important}.btn-neutral{background-color:#f3f6f6!important;color:#404040!important}.btn-neutral:hover{background-color:#e5ebeb!important;color:#404040}.btn-neutral:visited{color:#404040!important}.btn-success{background-color:#27ae60!important}.btn-success:hover{background-color:#295!important}.btn-danger{background-color:#e74c3c!important}.btn-danger:hover{background-color:#ea6153!important}.btn-warning{background-color:#e67e22!important}.btn-warning:hover{background-color:#e98b39!important}.btn-invert{background-color:#222}.btn-invert:hover{background-color:#2f2f2f!important}.btn-link{background-color:transparent!important;color:#2980b9;box-shadow:none;border-color:transparent!important}.btn-link:active,.btn-link:hover{background-color:transparent!important;color:#409ad5!important;box-shadow:none}.btn-link:visited{color:#9b59b6}.wy-btn-group .btn,.wy-control .btn{vertical-align:middle}.wy-btn-group{margin-bottom:24px;*zoom:1}.wy-btn-group:after,.wy-btn-group:before{display:table;content:""}.wy-btn-group:after{clear:both}.wy-dropdown{position:relative;display:inline-block}.wy-dropdown-active .wy-dropdown-menu{display:block}.wy-dropdown-menu{position:absolute;left:0;display:none;float:left;top:100%;min-width:100%;background:#fcfcfc;z-index:100;border:1px solid #cfd7dd;box-shadow:0 2px 2px 0 rgba(0,0,0,.1);padding:12px}.wy-dropdown-menu>dd>a{display:block;clear:both;color:#404040;white-space:nowrap;font-size:90%;padding:0 12px;cursor:pointer}.wy-dropdown-menu>dd>a:hover{background:#2980b9;color:#fff}.wy-dropdown-menu>dd.divider{border-top:1px solid #cfd7dd;margin:6px 0}.wy-dropdown-menu>dd.search{padding-bottom:12px}.wy-dropdown-menu>dd.search input[type=search]{width:100%}.wy-dropdown-menu>dd.call-to-action{background:#e3e3e3;text-transform:uppercase;font-weight:500;font-size:80%}.wy-dropdown-menu>dd.call-to-action:hover{background:#e3e3e3}.wy-dropdown-menu>dd.call-to-action .btn{color:#fff}.wy-dropdown.wy-dropdown-up .wy-dropdown-menu{bottom:100%;top:auto;left:auto;right:0}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu{background:#fcfcfc;margin-top:2px}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu a{padding:6px 12px}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu a:hover{background:#2980b9;color:#fff}.wy-dropdown.wy-dropdown-left .wy-dropdown-menu{right:0;left:auto;text-align:right}.wy-dropdown-arrow:before{content:" ";border-bottom:5px solid #f5f5f5;border-left:5px solid transparent;border-right:5px solid transparent;position:absolute;display:block;top:-4px;left:50%;margin-left:-3px}.wy-dropdown-arrow.wy-dropdown-arrow-left:before{left:11px}.wy-form-stacked 
select{display:block}.wy-form-aligned .wy-help-inline,.wy-form-aligned input,.wy-form-aligned label,.wy-form-aligned select,.wy-form-aligned textarea{display:inline-block;*display:inline;*zoom:1;vertical-align:middle}.wy-form-aligned .wy-control-group>label{display:inline-block;vertical-align:middle;width:10em;margin:6px 12px 0 0;float:left}.wy-form-aligned .wy-control{float:left}.wy-form-aligned .wy-control label{display:block}.wy-form-aligned .wy-control select{margin-top:6px}fieldset{margin:0}fieldset,legend{border:0;padding:0}legend{width:100%;white-space:normal;margin-bottom:24px;font-size:150%;*margin-left:-7px}label,legend{display:block}label{margin:0 0 .3125em;color:#333;font-size:90%}input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}.wy-control-group{margin-bottom:24px;max-width:1200px;margin-left:auto;margin-right:auto;*zoom:1}.wy-control-group:after,.wy-control-group:before{display:table;content:""}.wy-control-group:after{clear:both}.wy-control-group.wy-control-group-required>label:after{content:" *";color:#e74c3c}.wy-control-group .wy-form-full,.wy-control-group .wy-form-halves,.wy-control-group .wy-form-thirds{padding-bottom:12px}.wy-control-group .wy-form-full input[type=color],.wy-control-group .wy-form-full input[type=date],.wy-control-group .wy-form-full input[type=datetime-local],.wy-control-group .wy-form-full input[type=datetime],.wy-control-group .wy-form-full input[type=email],.wy-control-group .wy-form-full input[type=month],.wy-control-group .wy-form-full input[type=number],.wy-control-group .wy-form-full input[type=password],.wy-control-group .wy-form-full input[type=search],.wy-control-group .wy-form-full input[type=tel],.wy-control-group .wy-form-full input[type=text],.wy-control-group .wy-form-full input[type=time],.wy-control-group .wy-form-full input[type=url],.wy-control-group .wy-form-full input[type=week],.wy-control-group .wy-form-full select,.wy-control-group .wy-form-halves input[type=color],.wy-control-group .wy-form-halves input[type=date],.wy-control-group .wy-form-halves input[type=datetime-local],.wy-control-group .wy-form-halves input[type=datetime],.wy-control-group .wy-form-halves input[type=email],.wy-control-group .wy-form-halves input[type=month],.wy-control-group .wy-form-halves input[type=number],.wy-control-group .wy-form-halves input[type=password],.wy-control-group .wy-form-halves input[type=search],.wy-control-group .wy-form-halves input[type=tel],.wy-control-group .wy-form-halves input[type=text],.wy-control-group .wy-form-halves input[type=time],.wy-control-group .wy-form-halves input[type=url],.wy-control-group .wy-form-halves input[type=week],.wy-control-group .wy-form-halves select,.wy-control-group .wy-form-thirds input[type=color],.wy-control-group .wy-form-thirds input[type=date],.wy-control-group .wy-form-thirds input[type=datetime-local],.wy-control-group .wy-form-thirds input[type=datetime],.wy-control-group .wy-form-thirds input[type=email],.wy-control-group .wy-form-thirds input[type=month],.wy-control-group .wy-form-thirds input[type=number],.wy-control-group .wy-form-thirds input[type=password],.wy-control-group .wy-form-thirds input[type=search],.wy-control-group .wy-form-thirds input[type=tel],.wy-control-group .wy-form-thirds input[type=text],.wy-control-group .wy-form-thirds input[type=time],.wy-control-group .wy-form-thirds input[type=url],.wy-control-group .wy-form-thirds input[type=week],.wy-control-group .wy-form-thirds select{width:100%}.wy-control-group 
.wy-form-full{float:left;display:block;width:100%;margin-right:0}.wy-control-group .wy-form-full:last-child{margin-right:0}.wy-control-group .wy-form-halves{float:left;display:block;margin-right:2.35765%;width:48.82117%}.wy-control-group .wy-form-halves:last-child,.wy-control-group .wy-form-halves:nth-of-type(2n){margin-right:0}.wy-control-group .wy-form-halves:nth-of-type(odd){clear:left}.wy-control-group .wy-form-thirds{float:left;display:block;margin-right:2.35765%;width:31.76157%}.wy-control-group .wy-form-thirds:last-child,.wy-control-group .wy-form-thirds:nth-of-type(3n){margin-right:0}.wy-control-group .wy-form-thirds:nth-of-type(3n+1){clear:left}.wy-control-group.wy-control-group-no-input .wy-control,.wy-control-no-input{margin:6px 0 0;font-size:90%}.wy-control-no-input{display:inline-block}.wy-control-group.fluid-input input[type=color],.wy-control-group.fluid-input input[type=date],.wy-control-group.fluid-input input[type=datetime-local],.wy-control-group.fluid-input input[type=datetime],.wy-control-group.fluid-input input[type=email],.wy-control-group.fluid-input input[type=month],.wy-control-group.fluid-input input[type=number],.wy-control-group.fluid-input input[type=password],.wy-control-group.fluid-input input[type=search],.wy-control-group.fluid-input input[type=tel],.wy-control-group.fluid-input input[type=text],.wy-control-group.fluid-input input[type=time],.wy-control-group.fluid-input input[type=url],.wy-control-group.fluid-input input[type=week]{width:100%}.wy-form-message-inline{padding-left:.3em;color:#666;font-size:90%}.wy-form-message{display:block;color:#999;font-size:70%;margin-top:.3125em;font-style:italic}.wy-form-message p{font-size:inherit;font-style:italic;margin-bottom:6px}.wy-form-message p:last-child{margin-bottom:0}input{line-height:normal}input[type=button],input[type=reset],input[type=submit]{-webkit-appearance:button;cursor:pointer;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;*overflow:visible}input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week]{-webkit-appearance:none;padding:6px;display:inline-block;border:1px solid #ccc;font-size:80%;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;box-shadow:inset 0 1px 3px #ddd;border-radius:0;-webkit-transition:border .3s linear;-moz-transition:border .3s linear;transition:border .3s linear}input[type=datetime-local]{padding:.34375em .625em}input[disabled]{cursor:default}input[type=checkbox],input[type=radio]{padding:0;margin-right:.3125em;*height:13px;*width:13px}input[type=checkbox],input[type=radio],input[type=search]{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}input[type=search]::-webkit-search-cancel-button,input[type=search]::-webkit-search-decoration{-webkit-appearance:none}input[type=color]:focus,input[type=date]:focus,input[type=datetime-local]:focus,input[type=datetime]:focus,input[type=email]:focus,input[type=month]:focus,input[type=number]:focus,input[type=password]:focus,input[type=search]:focus,input[type=tel]:focus,input[type=text]:focus,input[type=time]:focus,input[type=url]:focus,input[type=week]:focus{outline:0;outline:thin dotted\9;border-color:#333}input.no-focus:focus{border-color:#ccc!important}input[type=checkbox]:focus,input[type=file]:focus,input[type=radio]:focus{outline:thin dotted #333;outline:1px auto 
#129fea}input[type=color][disabled],input[type=date][disabled],input[type=datetime-local][disabled],input[type=datetime][disabled],input[type=email][disabled],input[type=month][disabled],input[type=number][disabled],input[type=password][disabled],input[type=search][disabled],input[type=tel][disabled],input[type=text][disabled],input[type=time][disabled],input[type=url][disabled],input[type=week][disabled]{cursor:not-allowed;background-color:#fafafa}input:focus:invalid,select:focus:invalid,textarea:focus:invalid{color:#e74c3c;border:1px solid #e74c3c}input:focus:invalid:focus,select:focus:invalid:focus,textarea:focus:invalid:focus{border-color:#e74c3c}input[type=checkbox]:focus:invalid:focus,input[type=file]:focus:invalid:focus,input[type=radio]:focus:invalid:focus{outline-color:#e74c3c}input.wy-input-large{padding:12px;font-size:100%}textarea{overflow:auto;vertical-align:top;width:100%;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif}select,textarea{padding:.5em .625em;display:inline-block;border:1px solid #ccc;font-size:80%;box-shadow:inset 0 1px 3px #ddd;-webkit-transition:border .3s linear;-moz-transition:border .3s linear;transition:border .3s linear}select{border:1px solid #ccc;background-color:#fff}select[multiple]{height:auto}select:focus,textarea:focus{outline:0}input[readonly],select[disabled],select[readonly],textarea[disabled],textarea[readonly]{cursor:not-allowed;background-color:#fafafa}input[type=checkbox][disabled],input[type=radio][disabled]{cursor:not-allowed}.wy-checkbox,.wy-radio{margin:6px 0;color:#404040;display:block}.wy-checkbox input,.wy-radio input{vertical-align:baseline}.wy-form-message-inline{display:inline-block;*display:inline;*zoom:1;vertical-align:middle}.wy-input-prefix,.wy-input-suffix{white-space:nowrap;padding:6px}.wy-input-prefix .wy-input-context,.wy-input-suffix .wy-input-context{line-height:27px;padding:0 8px;display:inline-block;font-size:80%;background-color:#f3f6f6;border:1px solid #ccc;color:#999}.wy-input-suffix .wy-input-context{border-left:0}.wy-input-prefix .wy-input-context{border-right:0}.wy-switch{position:relative;display:block;height:24px;margin-top:12px;cursor:pointer}.wy-switch:before{left:0;top:0;width:36px;height:12px;background:#ccc}.wy-switch:after,.wy-switch:before{position:absolute;content:"";display:block;border-radius:4px;-webkit-transition:all .2s ease-in-out;-moz-transition:all .2s ease-in-out;transition:all .2s ease-in-out}.wy-switch:after{width:18px;height:18px;background:#999;left:-3px;top:-3px}.wy-switch span{position:absolute;left:48px;display:block;font-size:12px;color:#ccc;line-height:1}.wy-switch.active:before{background:#1e8449}.wy-switch.active:after{left:24px;background:#27ae60}.wy-switch.disabled{cursor:not-allowed;opacity:.8}.wy-control-group.wy-control-group-error .wy-form-message,.wy-control-group.wy-control-group-error>label{color:#e74c3c}.wy-control-group.wy-control-group-error input[type=color],.wy-control-group.wy-control-group-error input[type=date],.wy-control-group.wy-control-group-error input[type=datetime-local],.wy-control-group.wy-control-group-error input[type=datetime],.wy-control-group.wy-control-group-error input[type=email],.wy-control-group.wy-control-group-error input[type=month],.wy-control-group.wy-control-group-error input[type=number],.wy-control-group.wy-control-group-error input[type=password],.wy-control-group.wy-control-group-error input[type=search],.wy-control-group.wy-control-group-error input[type=tel],.wy-control-group.wy-control-group-error 
input[type=text],.wy-control-group.wy-control-group-error input[type=time],.wy-control-group.wy-control-group-error input[type=url],.wy-control-group.wy-control-group-error input[type=week],.wy-control-group.wy-control-group-error textarea{border:1px solid #e74c3c}.wy-inline-validate{white-space:nowrap}.wy-inline-validate .wy-input-context{padding:.5em .625em;display:inline-block;font-size:80%}.wy-inline-validate.wy-inline-validate-success .wy-input-context{color:#27ae60}.wy-inline-validate.wy-inline-validate-danger .wy-input-context{color:#e74c3c}.wy-inline-validate.wy-inline-validate-warning .wy-input-context{color:#e67e22}.wy-inline-validate.wy-inline-validate-info .wy-input-context{color:#2980b9}.rotate-90{-webkit-transform:rotate(90deg);-moz-transform:rotate(90deg);-ms-transform:rotate(90deg);-o-transform:rotate(90deg);transform:rotate(90deg)}.rotate-180{-webkit-transform:rotate(180deg);-moz-transform:rotate(180deg);-ms-transform:rotate(180deg);-o-transform:rotate(180deg);transform:rotate(180deg)}.rotate-270{-webkit-transform:rotate(270deg);-moz-transform:rotate(270deg);-ms-transform:rotate(270deg);-o-transform:rotate(270deg);transform:rotate(270deg)}.mirror{-webkit-transform:scaleX(-1);-moz-transform:scaleX(-1);-ms-transform:scaleX(-1);-o-transform:scaleX(-1);transform:scaleX(-1)}.mirror.rotate-90{-webkit-transform:scaleX(-1) rotate(90deg);-moz-transform:scaleX(-1) rotate(90deg);-ms-transform:scaleX(-1) rotate(90deg);-o-transform:scaleX(-1) rotate(90deg);transform:scaleX(-1) rotate(90deg)}.mirror.rotate-180{-webkit-transform:scaleX(-1) rotate(180deg);-moz-transform:scaleX(-1) rotate(180deg);-ms-transform:scaleX(-1) rotate(180deg);-o-transform:scaleX(-1) rotate(180deg);transform:scaleX(-1) rotate(180deg)}.mirror.rotate-270{-webkit-transform:scaleX(-1) rotate(270deg);-moz-transform:scaleX(-1) rotate(270deg);-ms-transform:scaleX(-1) rotate(270deg);-o-transform:scaleX(-1) rotate(270deg);transform:scaleX(-1) rotate(270deg)}@media only screen and (max-width:480px){.wy-form button[type=submit]{margin:.7em 0 0}.wy-form input[type=color],.wy-form input[type=date],.wy-form input[type=datetime-local],.wy-form input[type=datetime],.wy-form input[type=email],.wy-form input[type=month],.wy-form input[type=number],.wy-form input[type=password],.wy-form input[type=search],.wy-form input[type=tel],.wy-form input[type=text],.wy-form input[type=time],.wy-form input[type=url],.wy-form input[type=week],.wy-form label{margin-bottom:.3em;display:block}.wy-form input[type=color],.wy-form input[type=date],.wy-form input[type=datetime-local],.wy-form input[type=datetime],.wy-form input[type=email],.wy-form input[type=month],.wy-form input[type=number],.wy-form input[type=password],.wy-form input[type=search],.wy-form input[type=tel],.wy-form input[type=time],.wy-form input[type=url],.wy-form input[type=week]{margin-bottom:0}.wy-form-aligned .wy-control-group label{margin-bottom:.3em;text-align:left;display:block;width:100%}.wy-form-aligned .wy-control{margin:1.5em 0 0}.wy-form-message,.wy-form-message-inline,.wy-form .wy-help-inline{display:block;font-size:80%;padding:6px 0}}@media screen and (max-width:768px){.tablet-hide{display:none}}@media screen and (max-width:480px){.mobile-hide{display:none}}.float-left{float:left}.float-right{float:right}.full-width{width:100%}.rst-content table.docutils,.rst-content table.field-list,.wy-table{border-collapse:collapse;border-spacing:0;empty-cells:show;margin-bottom:24px}.rst-content table.docutils caption,.rst-content table.field-list caption,.wy-table 
caption{color:#000;font:italic 85%/1 arial,sans-serif;padding:1em 0;text-align:center}.rst-content table.docutils td,.rst-content table.docutils th,.rst-content table.field-list td,.rst-content table.field-list th,.wy-table td,.wy-table th{font-size:90%;margin:0;overflow:visible;padding:8px 16px}.rst-content table.docutils td:first-child,.rst-content table.docutils th:first-child,.rst-content table.field-list td:first-child,.rst-content table.field-list th:first-child,.wy-table td:first-child,.wy-table th:first-child{border-left-width:0}.rst-content table.docutils thead,.rst-content table.field-list thead,.wy-table thead{color:#000;text-align:left;vertical-align:bottom;white-space:nowrap}.rst-content table.docutils thead th,.rst-content table.field-list thead th,.wy-table thead th{font-weight:700;border-bottom:2px solid #e1e4e5}.rst-content table.docutils td,.rst-content table.field-list td,.wy-table td{background-color:transparent;vertical-align:middle}.rst-content table.docutils td p,.rst-content table.field-list td p,.wy-table td p{line-height:18px}.rst-content table.docutils td p:last-child,.rst-content table.field-list td p:last-child,.wy-table td p:last-child{margin-bottom:0}.rst-content table.docutils .wy-table-cell-min,.rst-content table.field-list .wy-table-cell-min,.wy-table .wy-table-cell-min{width:1%;padding-right:0}.rst-content table.docutils .wy-table-cell-min input[type=checkbox],.rst-content table.field-list .wy-table-cell-min input[type=checkbox],.wy-table .wy-table-cell-min input[type=checkbox]{margin:0}.wy-table-secondary{color:grey;font-size:90%}.wy-table-tertiary{color:grey;font-size:80%}.rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td,.wy-table-backed,.wy-table-odd td,.wy-table-striped tr:nth-child(2n-1) td{background-color:#f3f6f6}.rst-content table.docutils,.wy-table-bordered-all{border:1px solid #e1e4e5}.rst-content table.docutils td,.wy-table-bordered-all td{border-bottom:1px solid #e1e4e5;border-left:1px solid #e1e4e5}.rst-content table.docutils tbody>tr:last-child td,.wy-table-bordered-all tbody>tr:last-child td{border-bottom-width:0}.wy-table-bordered{border:1px solid #e1e4e5}.wy-table-bordered-rows td{border-bottom:1px solid #e1e4e5}.wy-table-bordered-rows tbody>tr:last-child td{border-bottom-width:0}.wy-table-horizontal td,.wy-table-horizontal th{border-width:0 0 1px;border-bottom:1px solid #e1e4e5}.wy-table-horizontal tbody>tr:last-child td{border-bottom-width:0}.wy-table-responsive{margin-bottom:24px;max-width:100%;overflow:auto}.wy-table-responsive table{margin-bottom:0!important}.wy-table-responsive table td,.wy-table-responsive table th{white-space:nowrap}a{color:#2980b9;text-decoration:none;cursor:pointer}a:hover{color:#3091d1}a:visited{color:#9b59b6}html{height:100%}body,html{overflow-x:hidden}body{font-family:Lato,proxima-nova,Helvetica 
Neue,Arial,sans-serif;font-weight:400;color:#404040;min-height:100%;background:#edf0f2}.wy-text-left{text-align:left}.wy-text-center{text-align:center}.wy-text-right{text-align:right}.wy-text-large{font-size:120%}.wy-text-normal{font-size:100%}.wy-text-small,small{font-size:80%}.wy-text-strike{text-decoration:line-through}.wy-text-warning{color:#e67e22!important}a.wy-text-warning:hover{color:#eb9950!important}.wy-text-info{color:#2980b9!important}a.wy-text-info:hover{color:#409ad5!important}.wy-text-success{color:#27ae60!important}a.wy-text-success:hover{color:#36d278!important}.wy-text-danger{color:#e74c3c!important}a.wy-text-danger:hover{color:#ed7669!important}.wy-text-neutral{color:#404040!important}a.wy-text-neutral:hover{color:#595959!important}.rst-content .toctree-wrapper>p.caption,h1,h2,h3,h4,h5,h6,legend{margin-top:0;font-weight:700;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif}p{line-height:24px;font-size:16px;margin:0 0 24px}h1{font-size:175%}.rst-content .toctree-wrapper>p.caption,h2{font-size:150%}h3{font-size:125%}h4{font-size:115%}h5{font-size:110%}h6{font-size:100%}hr{display:block;height:1px;border:0;border-top:1px solid #e1e4e5;margin:24px 0;padding:0}.rst-content code,.rst-content tt,code{white-space:nowrap;max-width:100%;background:#fff;border:1px solid #e1e4e5;font-size:75%;padding:0 5px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#e74c3c;overflow-x:auto}.rst-content tt.code-large,code.code-large{font-size:90%}.rst-content .section ul,.rst-content .toctree-wrapper ul,.wy-plain-list-disc,article ul{list-style:disc;line-height:24px;margin-bottom:24px}.rst-content .section ul li,.rst-content .toctree-wrapper ul li,.wy-plain-list-disc li,article ul li{list-style:disc;margin-left:24px}.rst-content .section ul li p:last-child,.rst-content .section ul li ul,.rst-content .toctree-wrapper ul li p:last-child,.rst-content .toctree-wrapper ul li ul,.wy-plain-list-disc li p:last-child,.wy-plain-list-disc li ul,article ul li p:last-child,article ul li ul{margin-bottom:0}.rst-content .section ul li li,.rst-content .toctree-wrapper ul li li,.wy-plain-list-disc li li,article ul li li{list-style:circle}.rst-content .section ul li li li,.rst-content .toctree-wrapper ul li li li,.wy-plain-list-disc li li li,article ul li li li{list-style:square}.rst-content .section ul li ol li,.rst-content .toctree-wrapper ul li ol li,.wy-plain-list-disc li ol li,article ul li ol li{list-style:decimal}.rst-content .section ol,.rst-content ol.arabic,.wy-plain-list-decimal,article ol{list-style:decimal;line-height:24px;margin-bottom:24px}.rst-content .section ol li,.rst-content ol.arabic li,.wy-plain-list-decimal li,article ol li{list-style:decimal;margin-left:24px}.rst-content .section ol li p:last-child,.rst-content .section ol li ul,.rst-content ol.arabic li p:last-child,.rst-content ol.arabic li ul,.wy-plain-list-decimal li p:last-child,.wy-plain-list-decimal li ul,article ol li p:last-child,article ol li ul{margin-bottom:0}.rst-content .section ol li ul li,.rst-content ol.arabic li ul li,.wy-plain-list-decimal li ul li,article ol li ul li{list-style:disc}.wy-breadcrumbs{*zoom:1}.wy-breadcrumbs:after,.wy-breadcrumbs:before{display:table;content:""}.wy-breadcrumbs:after{clear:both}.wy-breadcrumbs li{display:inline-block}.wy-breadcrumbs li.wy-breadcrumbs-aside{float:right}.wy-breadcrumbs li a{display:inline-block;padding:5px}.wy-breadcrumbs li a:first-child{padding-left:0}.rst-content .wy-breadcrumbs li tt,.wy-breadcrumbs li 
.rst-content tt,.wy-breadcrumbs li code{padding:5px;border:none;background:none}.rst-content .wy-breadcrumbs li tt.literal,.wy-breadcrumbs li .rst-content tt.literal,.wy-breadcrumbs li code.literal{color:#404040}.wy-breadcrumbs-extra{margin-bottom:0;color:#b3b3b3;font-size:80%;display:inline-block}@media screen and (max-width:480px){.wy-breadcrumbs-extra,.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}@media print{.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}html{font-size:16px}.wy-affix{position:fixed;top:1.618em}.wy-menu a:hover{text-decoration:none}.wy-menu-horiz{*zoom:1}.wy-menu-horiz:after,.wy-menu-horiz:before{display:table;content:""}.wy-menu-horiz:after{clear:both}.wy-menu-horiz li,.wy-menu-horiz ul{display:inline-block}.wy-menu-horiz li:hover{background:hsla(0,0%,100%,.1)}.wy-menu-horiz li.divide-left{border-left:1px solid #404040}.wy-menu-horiz li.divide-right{border-right:1px solid #404040}.wy-menu-horiz a{height:32px;display:inline-block;line-height:32px;padding:0 16px}.wy-menu-vertical{width:300px}.wy-menu-vertical header,.wy-menu-vertical p.caption{color:#55a5d9;height:32px;line-height:32px;padding:0 1.618em;margin:12px 0 0;display:block;font-weight:700;text-transform:uppercase;font-size:85%;white-space:nowrap}.wy-menu-vertical ul{margin-bottom:0}.wy-menu-vertical li.divide-top{border-top:1px solid #404040}.wy-menu-vertical li.divide-bottom{border-bottom:1px solid #404040}.wy-menu-vertical li.current{background:#e3e3e3}.wy-menu-vertical li.current a{color:grey;border-right:1px solid #c9c9c9;padding:.4045em 2.427em}.wy-menu-vertical li.current a:hover{background:#d6d6d6}.rst-content .wy-menu-vertical li tt,.wy-menu-vertical li .rst-content tt,.wy-menu-vertical li code{border:none;background:inherit;color:inherit;padding-left:0;padding-right:0}.wy-menu-vertical li span.toctree-expand{display:block;float:left;margin-left:-1.2em;font-size:.8em;line-height:1.6em;color:#4d4d4d}.wy-menu-vertical li.current>a,.wy-menu-vertical li.on a{color:#404040;font-weight:700;position:relative;background:#fcfcfc;border:none;padding:.4045em 1.618em}.wy-menu-vertical li.current>a:hover,.wy-menu-vertical li.on a:hover{background:#fcfcfc}.wy-menu-vertical li.current>a:hover span.toctree-expand,.wy-menu-vertical li.on a:hover span.toctree-expand{color:grey}.wy-menu-vertical li.current>a span.toctree-expand,.wy-menu-vertical li.on a span.toctree-expand{display:block;font-size:.8em;line-height:1.6em;color:#333}.wy-menu-vertical li.toctree-l1.current>a{border-bottom:1px solid #c9c9c9;border-top:1px solid #c9c9c9}.wy-menu-vertical .toctree-l1.current .toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .toctree-l11>ul{display:none}.wy-menu-vertical .toctree-l1.current .current.toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .current.toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .current.toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .current.toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .current.toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .current.toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .current.toctree-l8>ul,.wy-menu-vertical 
.toctree-l8.current .current.toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .current.toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .current.toctree-l11>ul{display:block}.wy-menu-vertical li.toctree-l3,.wy-menu-vertical li.toctree-l4{font-size:.9em}.wy-menu-vertical li.toctree-l2 a,.wy-menu-vertical li.toctree-l3 a,.wy-menu-vertical li.toctree-l4 a,.wy-menu-vertical li.toctree-l5 a,.wy-menu-vertical li.toctree-l6 a,.wy-menu-vertical li.toctree-l7 a,.wy-menu-vertical li.toctree-l8 a,.wy-menu-vertical li.toctree-l9 a,.wy-menu-vertical li.toctree-l10 a{color:#404040}.wy-menu-vertical li.toctree-l2 a:hover span.toctree-expand,.wy-menu-vertical li.toctree-l3 a:hover span.toctree-expand,.wy-menu-vertical li.toctree-l4 a:hover span.toctree-expand,.wy-menu-vertical li.toctree-l5 a:hover span.toctree-expand,.wy-menu-vertical li.toctree-l6 a:hover span.toctree-expand,.wy-menu-vertical li.toctree-l7 a:hover span.toctree-expand,.wy-menu-vertical li.toctree-l8 a:hover span.toctree-expand,.wy-menu-vertical li.toctree-l9 a:hover span.toctree-expand,.wy-menu-vertical li.toctree-l10 a:hover span.toctree-expand{color:grey}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a,.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a,.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a,.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a,.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a,.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a,.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a,.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{display:block}.wy-menu-vertical li.toctree-l2.current>a{padding:.4045em 2.427em}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,.wy-menu-vertical li.toctree-l3.current>a{padding:.4045em 4.045em}.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a,.wy-menu-vertical li.toctree-l4.current>a{padding:.4045em 5.663em}.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a,.wy-menu-vertical li.toctree-l5.current>a{padding:.4045em 7.281em}.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a,.wy-menu-vertical li.toctree-l6.current>a{padding:.4045em 8.899em}.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a,.wy-menu-vertical li.toctree-l7.current>a{padding:.4045em 10.517em}.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a,.wy-menu-vertical li.toctree-l8.current>a{padding:.4045em 12.135em}.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a,.wy-menu-vertical li.toctree-l9.current>a{padding:.4045em 13.753em}.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a,.wy-menu-vertical li.toctree-l10.current>a{padding:.4045em 15.371em}.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{padding:.4045em 16.989em}.wy-menu-vertical li.toctree-l2.current>a,.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{background:#c9c9c9}.wy-menu-vertical li.toctree-l2 span.toctree-expand{color:#a3a3a3}.wy-menu-vertical li.toctree-l3.current>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{background:#bdbdbd}.wy-menu-vertical li.toctree-l3 span.toctree-expand{color:#969696}.wy-menu-vertical li.current ul{display:block}.wy-menu-vertical li ul{margin-bottom:0;display:none}.wy-menu-vertical li ul li a{margin-bottom:0;color:#d9d9d9;font-weight:400}.wy-menu-vertical a{line-height:18px;padding:.4045em 1.618em;display:block;position:relative;font-size:90%;color:#d9d9d9}.wy-menu-vertical a:hover{background-color:#4e4a4a;cursor:pointer}.wy-menu-vertical a:hover 
span.toctree-expand{color:#d9d9d9}.wy-menu-vertical a:active{background-color:#2980b9;cursor:pointer;color:#fff}.wy-menu-vertical a:active span.toctree-expand{color:#fff}.wy-side-nav-search{display:block;width:300px;padding:.809em;margin-bottom:.809em;z-index:200;background-color:#2980b9;text-align:center;color:#fcfcfc}.wy-side-nav-search input[type=text]{width:100%;border-radius:50px;padding:6px 12px;border-color:#2472a4}.wy-side-nav-search img{display:block;margin:auto auto .809em;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-side-nav-search .wy-dropdown>a,.wy-side-nav-search>a{color:#fcfcfc;font-size:100%;font-weight:700;display:inline-block;padding:4px 6px;margin-bottom:.809em}.wy-side-nav-search .wy-dropdown>a:hover,.wy-side-nav-search>a:hover{background:hsla(0,0%,100%,.1)}.wy-side-nav-search .wy-dropdown>a img.logo,.wy-side-nav-search>a img.logo{display:block;margin:0 auto;height:auto;width:auto;border-radius:0;max-width:100%;background:transparent}.wy-side-nav-search .wy-dropdown>a.icon img.logo,.wy-side-nav-search>a.icon img.logo{margin-top:.85em}.wy-side-nav-search>div.version{margin-top:-.4045em;margin-bottom:.809em;font-weight:400;color:hsla(0,0%,100%,.3)}.wy-nav .wy-menu-vertical header{color:#2980b9}.wy-nav .wy-menu-vertical a{color:#b3b3b3}.wy-nav .wy-menu-vertical a:hover{background-color:#2980b9;color:#fff}[data-menu-wrap]{-webkit-transition:all .2s ease-in;-moz-transition:all .2s ease-in;transition:all .2s ease-in;position:absolute;opacity:1;width:100%;opacity:0}[data-menu-wrap].move-center{left:0;right:auto;opacity:1}[data-menu-wrap].move-left{right:auto;left:-100%;opacity:0}[data-menu-wrap].move-right{right:-100%;left:auto;opacity:0}.wy-body-for-nav{background:#fcfcfc}.wy-grid-for-nav{position:absolute;width:100%;height:100%}.wy-nav-side{position:fixed;top:0;bottom:0;left:0;padding-bottom:2em;width:300px;overflow-x:hidden;overflow-y:hidden;min-height:100%;color:#9b9b9b;background:#343131;z-index:200}.wy-side-scroll{width:320px;position:relative;overflow-x:hidden;overflow-y:scroll;height:100%}.wy-nav-top{display:none;background:#2980b9;color:#fff;padding:.4045em .809em;position:relative;line-height:50px;text-align:center;font-size:100%;*zoom:1}.wy-nav-top:after,.wy-nav-top:before{display:table;content:""}.wy-nav-top:after{clear:both}.wy-nav-top a{color:#fff;font-weight:700}.wy-nav-top img{margin-right:12px;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-nav-top i{font-size:30px;float:left;cursor:pointer;padding-top:inherit}.wy-nav-content-wrap{margin-left:300px;background:#fcfcfc;min-height:100%}.wy-nav-content{padding:1.618em 3.236em;height:100%;max-width:800px;margin:auto}.wy-body-mask{position:fixed;width:100%;height:100%;background:rgba(0,0,0,.2);display:none;z-index:499}.wy-body-mask.on{display:block}footer{color:grey}footer p{margin-bottom:12px}.rst-content footer span.commit tt,footer span.commit .rst-content tt,footer span.commit code{padding:0;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:1em;background:none;border:none;color:grey}.rst-footer-buttons{*zoom:1}.rst-footer-buttons:after,.rst-footer-buttons:before{width:100%;display:table;content:""}.rst-footer-buttons:after{clear:both}.rst-breadcrumbs-buttons{margin-top:12px;*zoom:1}.rst-breadcrumbs-buttons:after,.rst-breadcrumbs-buttons:before{display:table;content:""}.rst-breadcrumbs-buttons:after{clear:both}#search-results .search li{margin-bottom:24px;border-bottom:1px solid 
#e1e4e5;padding-bottom:24px}#search-results .search li:first-child{border-top:1px solid #e1e4e5;padding-top:24px}#search-results .search li a{font-size:120%;margin-bottom:12px;display:inline-block}#search-results .context{color:grey;font-size:90%}.genindextable li>ul{margin-left:24px}@media screen and (max-width:768px){.wy-body-for-nav{background:#fcfcfc}.wy-nav-top{display:block}.wy-nav-side{left:-300px}.wy-nav-side.shift{width:85%;left:0}.wy-menu.wy-menu-vertical,.wy-side-nav-search,.wy-side-scroll{width:auto}.wy-nav-content-wrap{margin-left:0}.wy-nav-content-wrap .wy-nav-content{padding:1.618em}.wy-nav-content-wrap.shift{position:fixed;min-width:100%;left:85%;top:0;height:100%;overflow:hidden}}@media screen and (min-width:1100px){.wy-nav-content-wrap{background:rgba(0,0,0,.05)}.wy-nav-content{margin:0;background:#fcfcfc}}@media print{.rst-versions,.wy-nav-side,footer{display:none}.wy-nav-content-wrap{margin-left:0}}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60;*zoom:1}.rst-versions .rst-current-version:after,.rst-versions .rst-current-version:before{display:table;content:""}.rst-versions .rst-current-version:after{clear:both}.rst-content .code-block-caption .rst-versions .rst-current-version .headerlink,.rst-content .rst-versions .rst-current-version .admonition-title,.rst-content code.download .rst-versions .rst-current-version span:first-child,.rst-content dl dt .rst-versions .rst-current-version .headerlink,.rst-content h1 .rst-versions .rst-current-version .headerlink,.rst-content h2 .rst-versions .rst-current-version .headerlink,.rst-content h3 .rst-versions .rst-current-version .headerlink,.rst-content h4 .rst-versions .rst-current-version .headerlink,.rst-content h5 .rst-versions .rst-current-version .headerlink,.rst-content h6 .rst-versions .rst-current-version .headerlink,.rst-content p.caption .rst-versions .rst-current-version .headerlink,.rst-content table>caption .rst-versions .rst-current-version .headerlink,.rst-content tt.download .rst-versions .rst-current-version span:first-child,.rst-versions .rst-current-version .fa,.rst-versions .rst-current-version .icon,.rst-versions .rst-current-version .rst-content .admonition-title,.rst-versions .rst-current-version .rst-content .code-block-caption .headerlink,.rst-versions .rst-current-version .rst-content code.download span:first-child,.rst-versions .rst-current-version .rst-content dl dt .headerlink,.rst-versions .rst-current-version .rst-content h1 .headerlink,.rst-versions .rst-current-version .rst-content h2 .headerlink,.rst-versions .rst-current-version .rst-content h3 .headerlink,.rst-versions .rst-current-version .rst-content h4 .headerlink,.rst-versions .rst-current-version .rst-content h5 .headerlink,.rst-versions .rst-current-version .rst-content h6 .headerlink,.rst-versions .rst-current-version .rst-content p.caption .headerlink,.rst-versions .rst-current-version .rst-content table>caption .headerlink,.rst-versions .rst-current-version .rst-content tt.download span:first-child,.rst-versions .rst-current-version .wy-menu-vertical li span.toctree-expand,.wy-menu-vertical li .rst-versions .rst-current-version span.toctree-expand{color:#fcfcfc}.rst-versions 
.rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}}.rst-content img{max-width:100%;height:auto}.rst-content div.figure{margin-bottom:24px}.rst-content div.figure p.caption{font-style:italic}.rst-content div.figure p:last-child.caption{margin-bottom:0}.rst-content div.figure.align-center{text-align:center}.rst-content .section>a>img,.rst-content .section>img{margin-bottom:24px}.rst-content abbr[title]{text-decoration:none}.rst-content.style-external-links a.reference.external:after{font-family:FontAwesome;content:"\f08e";color:#b3b3b3;vertical-align:super;font-size:60%;margin:0 .2em}.rst-content blockquote{margin-left:24px;line-height:24px;margin-bottom:24px}.rst-content pre.literal-block{white-space:pre;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;display:block;overflow:auto}.rst-content div[class^=highlight],.rst-content pre.literal-block{border:1px solid #e1e4e5;overflow-x:auto;margin:1px 0 24px}.rst-content div[class^=highlight] div[class^=highlight],.rst-content pre.literal-block div[class^=highlight]{padding:0;border:none;margin:0}.rst-content div[class^=highlight] td.code{width:100%}.rst-content .linenodiv pre{border-right:1px solid #e6e9ea;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;user-select:none;pointer-events:none}.rst-content div[class^=highlight] pre{white-space:pre;margin:0;padding:12px;display:block;overflow:auto}.rst-content div[class^=highlight] pre .hll{display:block;margin:0 -12px;padding:0 12px}.rst-content .linenodiv pre,.rst-content div[class^=highlight] pre,.rst-content pre.literal-block{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:12px;line-height:1.4}.rst-content div.highlight .gp{user-select:none;pointer-events:none}.rst-content .code-block-caption{font-style:italic;font-size:85%;line-height:1;padding:1em 0;text-align:center}@media print{.rst-content .codeblock,.rst-content div[class^=highlight],.rst-content div[class^=highlight] pre{white-space:pre-wrap}}.rst-content .admonition,.rst-content .admonition-todo,.rst-content 
.attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning{clear:both}.rst-content .admonition-todo .last,.rst-content .admonition-todo>:last-child,.rst-content .admonition .last,.rst-content .admonition>:last-child,.rst-content .attention .last,.rst-content .attention>:last-child,.rst-content .caution .last,.rst-content .caution>:last-child,.rst-content .danger .last,.rst-content .danger>:last-child,.rst-content .error .last,.rst-content .error>:last-child,.rst-content .hint .last,.rst-content .hint>:last-child,.rst-content .important .last,.rst-content .important>:last-child,.rst-content .note .last,.rst-content .note>:last-child,.rst-content .seealso .last,.rst-content .seealso>:last-child,.rst-content .tip .last,.rst-content .tip>:last-child,.rst-content .warning .last,.rst-content .warning>:last-child{margin-bottom:0}.rst-content .admonition-title:before{margin-right:4px}.rst-content .admonition table{border-color:rgba(0,0,0,.1)}.rst-content .admonition table td,.rst-content .admonition table th{background:transparent!important;border-color:rgba(0,0,0,.1)!important}.rst-content .section ol.loweralpha,.rst-content .section ol.loweralpha>li{list-style:lower-alpha}.rst-content .section ol.upperalpha,.rst-content .section ol.upperalpha>li{list-style:upper-alpha}.rst-content .section ol li>*,.rst-content .section ul li>*{margin-top:12px;margin-bottom:12px}.rst-content .section ol li>:first-child,.rst-content .section ul li>:first-child{margin-top:0}.rst-content .section ol li>p,.rst-content .section ol li>p:last-child,.rst-content .section ul li>p,.rst-content .section ul li>p:last-child{margin-bottom:12px}.rst-content .section ol li>p:only-child,.rst-content .section ol li>p:only-child:last-child,.rst-content .section ul li>p:only-child,.rst-content .section ul li>p:only-child:last-child{margin-bottom:0}.rst-content .section ol li>ol,.rst-content .section ol li>ul,.rst-content .section ul li>ol,.rst-content .section ul li>ul{margin-bottom:12px}.rst-content .section ol.simple li>*,.rst-content .section ol.simple li ol,.rst-content .section ol.simple li ul,.rst-content .section ul.simple li>*,.rst-content .section ul.simple li ol,.rst-content .section ul.simple li ul{margin-top:0;margin-bottom:0}.rst-content .line-block{margin-left:0;margin-bottom:24px;line-height:24px}.rst-content .line-block .line-block{margin-left:24px;margin-bottom:0}.rst-content .topic-title{font-weight:700;margin-bottom:12px}.rst-content .toc-backref{color:#404040}.rst-content .align-right{float:right;margin:0 0 24px 24px}.rst-content .align-left{float:left;margin:0 24px 24px 0}.rst-content .align-center{margin:auto}.rst-content .align-center:not(table){display:block}.rst-content .code-block-caption .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content table>caption .headerlink{visibility:hidden;font-size:14px}.rst-content .code-block-caption .headerlink:after,.rst-content .toctree-wrapper>p.caption .headerlink:after,.rst-content dl dt .headerlink:after,.rst-content h1 .headerlink:after,.rst-content h2 .headerlink:after,.rst-content h3 .headerlink:after,.rst-content h4 .headerlink:after,.rst-content h5 
.headerlink:after,.rst-content h6 .headerlink:after,.rst-content p.caption .headerlink:after,.rst-content table>caption .headerlink:after{content:"\f0c1";font-family:FontAwesome}.rst-content .code-block-caption:hover .headerlink:after,.rst-content .toctree-wrapper>p.caption:hover .headerlink:after,.rst-content dl dt:hover .headerlink:after,.rst-content h1:hover .headerlink:after,.rst-content h2:hover .headerlink:after,.rst-content h3:hover .headerlink:after,.rst-content h4:hover .headerlink:after,.rst-content h5:hover .headerlink:after,.rst-content h6:hover .headerlink:after,.rst-content p.caption:hover .headerlink:after,.rst-content table>caption:hover .headerlink:after{visibility:visible}.rst-content table>caption .headerlink:after{font-size:12px}.rst-content .centered{text-align:center}.rst-content .sidebar{float:right;width:40%;display:block;margin:0 0 24px 24px;padding:24px;background:#f3f6f6;border:1px solid #e1e4e5}.rst-content .sidebar dl,.rst-content .sidebar p,.rst-content .sidebar ul{font-size:90%}.rst-content .sidebar .last,.rst-content .sidebar>:last-child{margin-bottom:0}.rst-content .sidebar .sidebar-title{display:block;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif;font-weight:700;background:#e1e4e5;padding:6px 12px;margin:-24px -24px 24px;font-size:100%}.rst-content .highlighted{background:#f1c40f;box-shadow:0 0 0 2px #f1c40f;display:inline;font-weight:700}.rst-content .citation-reference,.rst-content .footnote-reference{vertical-align:baseline;position:relative;top:-.4em;line-height:0;font-size:90%}.rst-content .hlist{width:100%}html.writer-html4 .rst-content table.docutils.citation,html.writer-html4 .rst-content table.docutils.footnote{background:none;border:none}html.writer-html4 .rst-content table.docutils.citation td,html.writer-html4 .rst-content table.docutils.citation tr,html.writer-html4 .rst-content table.docutils.footnote td,html.writer-html4 .rst-content table.docutils.footnote tr{border:none;background-color:transparent!important;white-space:normal}html.writer-html4 .rst-content table.docutils.citation td.label,html.writer-html4 .rst-content table.docutils.footnote td.label{padding-left:0;padding-right:0;vertical-align:top}html.writer-html5 .rst-content dl dt span.classifier:before{content:" : "}html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{display:grid;grid-template-columns:max-content auto}html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{padding-left:1rem}html.writer-html5 .rst-content dl.field-list>dt:after,html.writer-html5 .rst-content dl.footnote>dt:after{content:":"}html.writer-html5 .rst-content dl.field-list>dd,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dd,html.writer-html5 .rst-content dl.footnote>dt{margin-bottom:0}html.writer-html5 .rst-content dl.footnote{font-size:.9rem}html.writer-html5 .rst-content dl.footnote>dt{margin:0 .5rem .5rem 0;line-height:1.2rem;word-break:break-all;font-weight:400}html.writer-html5 .rst-content dl.footnote>dt>span.brackets{margin-right:.5rem}html.writer-html5 .rst-content dl.footnote>dt>span.brackets:before{content:"["}html.writer-html5 .rst-content dl.footnote>dt>span.brackets:after{content:"]"}html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref{font-style:italic}html.writer-html5 .rst-content dl.footnote>dd{margin:0 0 .5rem;line-height:1.2rem}html.writer-html5 .rst-content dl.footnote>dd p,html.writer-html5 .rst-content dl.option-list 
kbd{font-size:.9rem}.rst-content table.docutils.footnote,html.writer-html4 .rst-content table.docutils.citation,html.writer-html5 .rst-content dl.footnote{color:grey}.rst-content table.docutils.footnote code,.rst-content table.docutils.footnote tt,html.writer-html4 .rst-content table.docutils.citation code,html.writer-html4 .rst-content table.docutils.citation tt,html.writer-html5 .rst-content dl.footnote code,html.writer-html5 .rst-content dl.footnote tt{color:#555}.rst-content .wy-table-responsive.citation,.rst-content .wy-table-responsive.footnote{margin-bottom:0}.rst-content .wy-table-responsive.citation+:not(.citation),.rst-content .wy-table-responsive.footnote+:not(.footnote){margin-top:24px}.rst-content .wy-table-responsive.citation:last-child,.rst-content .wy-table-responsive.footnote:last-child{margin-bottom:24px}.rst-content table.docutils th{border-color:#e1e4e5}html.writer-html5 .rst-content table.docutils th{border:1px solid #e1e4e5}html.writer-html5 .rst-content table.docutils td>p,html.writer-html5 .rst-content table.docutils th>p{line-height:1rem;margin-bottom:0;font-size:.9rem}.rst-content table.docutils td .last,.rst-content table.docutils td .last>:last-child{margin-bottom:0}.rst-content table.field-list,.rst-content table.field-list td{border:none}.rst-content table.field-list td p{font-size:inherit;line-height:inherit}.rst-content table.field-list td>strong{display:inline-block}.rst-content table.field-list .field-name{padding-right:10px;text-align:left;white-space:nowrap}.rst-content table.field-list .field-body{text-align:left}.rst-content code,.rst-content tt{color:#000;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;padding:2px 5px}.rst-content code big,.rst-content code em,.rst-content tt big,.rst-content tt em{font-size:100%!important;line-height:normal}.rst-content code.literal,.rst-content tt.literal{color:#e74c3c}.rst-content code.xref,.rst-content tt.xref,a .rst-content code,a .rst-content tt{font-weight:700;color:#404040}.rst-content kbd,.rst-content pre,.rst-content samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace}.rst-content a code,.rst-content a tt{color:#2980b9}.rst-content dl{margin-bottom:24px}.rst-content dl dt{font-weight:700;margin-bottom:12px}.rst-content dl ol,.rst-content dl p,.rst-content dl table,.rst-content dl ul{margin-bottom:12px}.rst-content dl dd{margin:0 0 12px 24px;line-height:24px}html.writer-html4 .rst-content dl:not(.docutils),html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple){margin-bottom:24px}html.writer-html4 .rst-content dl:not(.docutils)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple)>dt{display:table;margin:6px 0;font-size:90%;line-height:normal;background:#e7f2fa;color:#2980b9;border-top:3px solid #6ab0de;padding:6px;position:relative}html.writer-html4 .rst-content dl:not(.docutils)>dt:before,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple)>dt:before{color:#6ab0de}html.writer-html4 .rst-content dl:not(.docutils)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.field-list)>dt,html.writer-html5 .rst-content 
dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) dl:not(.field-list)>dt{margin-bottom:6px;border:none;border-left:3px solid #ccc;background:#f0f0f0;color:#555}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.field-list)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) dl:not(.field-list)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code,html.writer-html4 .rst-content dl:not(.docutils) tt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) code,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) tt{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel{border:1px solid 
#7fbbe3;background:#e7f2fa;font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content .sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} \ No newline at end of file diff --git a/_static/doctools.js b/_static/doctools.js new file mode 100644 index 0000000..daccd20 --- /dev/null +++ b/_static/doctools.js @@ -0,0 +1,315 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Sphinx JavaScript utilities for all documentation. + * + * :copyright: Copyright 2007-2020 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +/** + * select a different prefix for underscore + */ +$u = _.noConflict(); + +/** + * make the code below compatible with browsers without + * an installed firebug like debugger +if (!window.console || !console.firebug) { + var names = ["log", "debug", "info", "warn", "error", "assert", "dir", + "dirxml", "group", "groupEnd", "time", "timeEnd", "count", "trace", + "profile", "profileEnd"]; + window.console = {}; + for (var i = 0; i < names.length; ++i) + window.console[names[i]] = function() {}; +} + */ + +/** + * small helper function to urldecode strings + */ +jQuery.urldecode = function(x) { + return decodeURIComponent(x).replace(/\+/g, ' '); +}; + +/** + * small helper function to urlencode strings + */ +jQuery.urlencode = encodeURIComponent; + +/** + * This function returns the parsed url parameters of the + * current request. Multiple values per key are supported, + * it will always return arrays of strings for the value parts. 
+ */ +jQuery.getQueryParameters = function(s) { + if (typeof s === 'undefined') + s = document.location.search; + var parts = s.substr(s.indexOf('?') + 1).split('&'); + var result = {}; + for (var i = 0; i < parts.length; i++) { + var tmp = parts[i].split('=', 2); + var key = jQuery.urldecode(tmp[0]); + var value = jQuery.urldecode(tmp[1]); + if (key in result) + result[key].push(value); + else + result[key] = [value]; + } + return result; +}; + +/** + * highlight a given string on a jquery object by wrapping it in + * span elements with the given class name. + */ +jQuery.fn.highlightText = function(text, className) { + function highlight(node, addItems) { + if (node.nodeType === 3) { + var val = node.nodeValue; + var pos = val.toLowerCase().indexOf(text); + if (pos >= 0 && + !jQuery(node.parentNode).hasClass(className) && + !jQuery(node.parentNode).hasClass("nohighlight")) { + var span; + var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.className = className; + } + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + node.parentNode.insertBefore(span, node.parentNode.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling)); + node.nodeValue = val.substr(0, pos); + if (isInSVG) { + var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect"); + var bbox = node.parentElement.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute('class', className); + addItems.push({ + "parent": node.parentNode, + "target": rect}); + } + } + } + else if (!jQuery(node).is("button, select, textarea")) { + jQuery.each(node.childNodes, function() { + highlight(this, addItems); + }); + } + } + var addItems = []; + var result = this.each(function() { + highlight(this, addItems); + }); + for (var i = 0; i < addItems.length; ++i) { + jQuery(addItems[i].parent).before(addItems[i].target); + } + return result; +}; + +/* + * backward compatibility for jQuery.browser + * This will be supported until firefox bug is fixed. + */ +if (!jQuery.browser) { + jQuery.uaMatch = function(ua) { + ua = ua.toLowerCase(); + + var match = /(chrome)[ \/]([\w.]+)/.exec(ua) || + /(webkit)[ \/]([\w.]+)/.exec(ua) || + /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) || + /(msie) ([\w.]+)/.exec(ua) || + ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) || + []; + + return { + browser: match[ 1 ] || "", + version: match[ 2 ] || "0" + }; + }; + jQuery.browser = {}; + jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true; +} + +/** + * Small JavaScript module for the documentation. + */ +var Documentation = { + + init : function() { + this.fixFirefoxAnchorBug(); + this.highlightSearchWords(); + this.initIndexTable(); + if (DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) { + this.initOnKeyListeners(); + } + }, + + /** + * i18n support + */ + TRANSLATIONS : {}, + PLURAL_EXPR : function(n) { return n === 1 ? 
0 : 1; }, + LOCALE : 'unknown', + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext : function(string) { + var translated = Documentation.TRANSLATIONS[string]; + if (typeof translated === 'undefined') + return string; + return (typeof translated === 'string') ? translated : translated[0]; + }, + + ngettext : function(singular, plural, n) { + var translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated === 'undefined') + return (n == 1) ? singular : plural; + return translated[Documentation.PLURALEXPR(n)]; + }, + + addTranslations : function(catalog) { + for (var key in catalog.messages) + this.TRANSLATIONS[key] = catalog.messages[key]; + this.PLURAL_EXPR = new Function('n', 'return +(' + catalog.plural_expr + ')'); + this.LOCALE = catalog.locale; + }, + + /** + * add context elements like header anchor links + */ + addContextElements : function() { + $('div[id] > :header:first').each(function() { + $('\u00B6'). + attr('href', '#' + this.id). + attr('title', _('Permalink to this headline')). + appendTo(this); + }); + $('dt[id]').each(function() { + $('\u00B6'). + attr('href', '#' + this.id). + attr('title', _('Permalink to this definition')). + appendTo(this); + }); + }, + + /** + * workaround a firefox stupidity + * see: https://bugzilla.mozilla.org/show_bug.cgi?id=645075 + */ + fixFirefoxAnchorBug : function() { + if (document.location.hash && $.browser.mozilla) + window.setTimeout(function() { + document.location.href += ''; + }, 10); + }, + + /** + * highlight the search words provided in the url in the text + */ + highlightSearchWords : function() { + var params = $.getQueryParameters(); + var terms = (params.highlight) ? params.highlight[0].split(/\s+/) : []; + if (terms.length) { + var body = $('div.body'); + if (!body.length) { + body = $('body'); + } + window.setTimeout(function() { + $.each(terms, function() { + body.highlightText(this.toLowerCase(), 'highlighted'); + }); + }, 10); + $('') + .appendTo($('#searchbox')); + } + }, + + /** + * init the domain index toggle buttons + */ + initIndexTable : function() { + var togglers = $('img.toggler').click(function() { + var src = $(this).attr('src'); + var idnum = $(this).attr('id').substr(7); + $('tr.cg-' + idnum).toggle(); + if (src.substr(-9) === 'minus.png') + $(this).attr('src', src.substr(0, src.length-9) + 'plus.png'); + else + $(this).attr('src', src.substr(0, src.length-8) + 'minus.png'); + }).css('display', ''); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) { + togglers.click(); + } + }, + + /** + * helper function to hide the search marks again + */ + hideSearchWords : function() { + $('#searchbox .highlight-link').fadeOut(300); + $('span.highlighted').removeClass('highlighted'); + }, + + /** + * make the url absolute + */ + makeURL : function(relativeURL) { + return DOCUMENTATION_OPTIONS.URL_ROOT + '/' + relativeURL; + }, + + /** + * get the current relative url + */ + getCurrentURL : function() { + var path = document.location.pathname; + var parts = path.split(/\//); + $.each(DOCUMENTATION_OPTIONS.URL_ROOT.split(/\//), function() { + if (this === '..') + parts.pop(); + }); + var url = parts.join('/'); + return path.substring(url.lastIndexOf('/') + 1, path.length - 1); + }, + + initOnKeyListeners: function() { + $(document).keydown(function(event) { + var activeElementType = document.activeElement.tagName; + // don't navigate when in search box or textarea + if (activeElementType !== 'TEXTAREA' && 
activeElementType !== 'INPUT' && activeElementType !== 'SELECT' + && !event.altKey && !event.ctrlKey && !event.metaKey && !event.shiftKey) { + switch (event.keyCode) { + case 37: // left + var prevHref = $('link[rel="prev"]').prop('href'); + if (prevHref) { + window.location.href = prevHref; + return false; + } + case 39: // right + var nextHref = $('link[rel="next"]').prop('href'); + if (nextHref) { + window.location.href = nextHref; + return false; + } + } + } + }); + } +}; + +// quick alias for translations +_ = Documentation.gettext; + +$(document).ready(function() { + Documentation.init(); +}); diff --git a/_static/documentation_options.js b/_static/documentation_options.js new file mode 100644 index 0000000..2fa8c97 --- /dev/null +++ b/_static/documentation_options.js @@ -0,0 +1,12 @@ +var DOCUMENTATION_OPTIONS = { + URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'), + VERSION: '', + LANGUAGE: 'None', + COLLAPSE_INDEX: false, + BUILDER: 'html', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false +}; \ No newline at end of file diff --git a/_static/file.png b/_static/file.png new file mode 100644 index 0000000..a858a41 Binary files /dev/null and b/_static/file.png differ diff --git a/_static/fonts/FontAwesome.otf b/_static/fonts/FontAwesome.otf new file mode 100644 index 0000000..401ec0f Binary files /dev/null and b/_static/fonts/FontAwesome.otf differ diff --git a/_static/fonts/Lato/lato-bold.eot b/_static/fonts/Lato/lato-bold.eot new file mode 100644 index 0000000..3361183 Binary files /dev/null and b/_static/fonts/Lato/lato-bold.eot differ diff --git a/_static/fonts/Lato/lato-bold.ttf b/_static/fonts/Lato/lato-bold.ttf new file mode 100644 index 0000000..29f691d Binary files /dev/null and b/_static/fonts/Lato/lato-bold.ttf differ diff --git a/_static/fonts/Lato/lato-bold.woff b/_static/fonts/Lato/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/_static/fonts/Lato/lato-bold.woff differ diff --git a/_static/fonts/Lato/lato-bold.woff2 b/_static/fonts/Lato/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and b/_static/fonts/Lato/lato-bold.woff2 differ diff --git a/_static/fonts/Lato/lato-bolditalic.eot b/_static/fonts/Lato/lato-bolditalic.eot new file mode 100644 index 0000000..3d41549 Binary files /dev/null and b/_static/fonts/Lato/lato-bolditalic.eot differ diff --git a/_static/fonts/Lato/lato-bolditalic.ttf b/_static/fonts/Lato/lato-bolditalic.ttf new file mode 100644 index 0000000..f402040 Binary files /dev/null and b/_static/fonts/Lato/lato-bolditalic.ttf differ diff --git a/_static/fonts/Lato/lato-bolditalic.woff b/_static/fonts/Lato/lato-bolditalic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/_static/fonts/Lato/lato-bolditalic.woff differ diff --git a/_static/fonts/Lato/lato-bolditalic.woff2 b/_static/fonts/Lato/lato-bolditalic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/_static/fonts/Lato/lato-bolditalic.woff2 differ diff --git a/_static/fonts/Lato/lato-italic.eot b/_static/fonts/Lato/lato-italic.eot new file mode 100644 index 0000000..3f82642 Binary files /dev/null and b/_static/fonts/Lato/lato-italic.eot differ diff --git a/_static/fonts/Lato/lato-italic.ttf b/_static/fonts/Lato/lato-italic.ttf new file mode 100644 index 0000000..b4bfc9b Binary files /dev/null and b/_static/fonts/Lato/lato-italic.ttf differ diff --git 
a/_static/fonts/Lato/lato-italic.woff b/_static/fonts/Lato/lato-italic.woff new file mode 100644 index 0000000..76114bc Binary files /dev/null and b/_static/fonts/Lato/lato-italic.woff differ diff --git a/_static/fonts/Lato/lato-italic.woff2 b/_static/fonts/Lato/lato-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/_static/fonts/Lato/lato-italic.woff2 differ diff --git a/_static/fonts/Lato/lato-regular.eot b/_static/fonts/Lato/lato-regular.eot new file mode 100644 index 0000000..11e3f2a Binary files /dev/null and b/_static/fonts/Lato/lato-regular.eot differ diff --git a/_static/fonts/Lato/lato-regular.ttf b/_static/fonts/Lato/lato-regular.ttf new file mode 100644 index 0000000..74decd9 Binary files /dev/null and b/_static/fonts/Lato/lato-regular.ttf differ diff --git a/_static/fonts/Lato/lato-regular.woff b/_static/fonts/Lato/lato-regular.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and b/_static/fonts/Lato/lato-regular.woff differ diff --git a/_static/fonts/Lato/lato-regular.woff2 b/_static/fonts/Lato/lato-regular.woff2 new file mode 100644 index 0000000..3bf9843 Binary files /dev/null and b/_static/fonts/Lato/lato-regular.woff2 differ diff --git a/_static/fonts/Roboto-Slab-Bold.woff b/_static/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and b/_static/fonts/Roboto-Slab-Bold.woff differ diff --git a/_static/fonts/Roboto-Slab-Bold.woff2 b/_static/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/_static/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/_static/fonts/Roboto-Slab-Light.woff b/_static/fonts/Roboto-Slab-Light.woff new file mode 100644 index 0000000..337d287 Binary files /dev/null and b/_static/fonts/Roboto-Slab-Light.woff differ diff --git a/_static/fonts/Roboto-Slab-Light.woff2 b/_static/fonts/Roboto-Slab-Light.woff2 new file mode 100644 index 0000000..20398af Binary files /dev/null and b/_static/fonts/Roboto-Slab-Light.woff2 differ diff --git a/_static/fonts/Roboto-Slab-Regular.woff b/_static/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/_static/fonts/Roboto-Slab-Regular.woff differ diff --git a/_static/fonts/Roboto-Slab-Regular.woff2 b/_static/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary files /dev/null and b/_static/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/_static/fonts/Roboto-Slab-Thin.woff b/_static/fonts/Roboto-Slab-Thin.woff new file mode 100644 index 0000000..6b30ea6 Binary files /dev/null and b/_static/fonts/Roboto-Slab-Thin.woff differ diff --git a/_static/fonts/Roboto-Slab-Thin.woff2 b/_static/fonts/Roboto-Slab-Thin.woff2 new file mode 100644 index 0000000..328f5bb Binary files /dev/null and b/_static/fonts/Roboto-Slab-Thin.woff2 differ diff --git a/_static/fonts/RobotoSlab/roboto-slab-v7-bold.eot b/_static/fonts/RobotoSlab/roboto-slab-v7-bold.eot new file mode 100644 index 0000000..79dc8ef Binary files /dev/null and b/_static/fonts/RobotoSlab/roboto-slab-v7-bold.eot differ diff --git a/_static/fonts/RobotoSlab/roboto-slab-v7-bold.ttf b/_static/fonts/RobotoSlab/roboto-slab-v7-bold.ttf new file mode 100644 index 0000000..df5d1df Binary files /dev/null and b/_static/fonts/RobotoSlab/roboto-slab-v7-bold.ttf differ diff --git a/_static/fonts/RobotoSlab/roboto-slab-v7-bold.woff b/_static/fonts/RobotoSlab/roboto-slab-v7-bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and 
b/_static/fonts/RobotoSlab/roboto-slab-v7-bold.woff differ diff --git a/_static/fonts/RobotoSlab/roboto-slab-v7-bold.woff2 b/_static/fonts/RobotoSlab/roboto-slab-v7-bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/_static/fonts/RobotoSlab/roboto-slab-v7-bold.woff2 differ diff --git a/_static/fonts/RobotoSlab/roboto-slab-v7-regular.eot b/_static/fonts/RobotoSlab/roboto-slab-v7-regular.eot new file mode 100644 index 0000000..2f7ca78 Binary files /dev/null and b/_static/fonts/RobotoSlab/roboto-slab-v7-regular.eot differ diff --git a/_static/fonts/RobotoSlab/roboto-slab-v7-regular.ttf b/_static/fonts/RobotoSlab/roboto-slab-v7-regular.ttf new file mode 100644 index 0000000..eb52a79 Binary files /dev/null and b/_static/fonts/RobotoSlab/roboto-slab-v7-regular.ttf differ diff --git a/_static/fonts/RobotoSlab/roboto-slab-v7-regular.woff b/_static/fonts/RobotoSlab/roboto-slab-v7-regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/_static/fonts/RobotoSlab/roboto-slab-v7-regular.woff differ diff --git a/_static/fonts/RobotoSlab/roboto-slab-v7-regular.woff2 b/_static/fonts/RobotoSlab/roboto-slab-v7-regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary files /dev/null and b/_static/fonts/RobotoSlab/roboto-slab-v7-regular.woff2 differ diff --git a/_static/fonts/fontawesome-webfont.eot b/_static/fonts/fontawesome-webfont.eot new file mode 100644 index 0000000..e9f60ca Binary files /dev/null and b/_static/fonts/fontawesome-webfont.eot differ diff --git a/_static/fonts/fontawesome-webfont.svg b/_static/fonts/fontawesome-webfont.svg new file mode 100644 index 0000000..855c845 --- /dev/null +++ b/_static/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. 
+ [2,671-line SVG font file; glyph outline markup stripped during extraction and omitted here] diff --git a/_static/fonts/fontawesome-webfont.ttf b/_static/fonts/fontawesome-webfont.ttf new file mode 100644 index 0000000..35acda2 Binary files /dev/null and b/_static/fonts/fontawesome-webfont.ttf differ diff --git a/_static/fonts/fontawesome-webfont.woff b/_static/fonts/fontawesome-webfont.woff new file mode 100644 index 0000000..400014a Binary files /dev/null and b/_static/fonts/fontawesome-webfont.woff differ diff --git a/_static/fonts/fontawesome-webfont.woff2 b/_static/fonts/fontawesome-webfont.woff2 new file mode 100644 index 0000000..4d13fc6 Binary files /dev/null and b/_static/fonts/fontawesome-webfont.woff2 differ diff --git a/_static/fonts/lato-bold-italic.woff b/_static/fonts/lato-bold-italic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/_static/fonts/lato-bold-italic.woff differ diff --git a/_static/fonts/lato-bold-italic.woff2 b/_static/fonts/lato-bold-italic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/_static/fonts/lato-bold-italic.woff2 differ diff --git a/_static/fonts/lato-bold.woff b/_static/fonts/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/_static/fonts/lato-bold.woff differ diff --git a/_static/fonts/lato-bold.woff2 b/_static/fonts/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and b/_static/fonts/lato-bold.woff2 differ diff --git a/_static/fonts/lato-normal-italic.woff b/_static/fonts/lato-normal-italic.woff new file mode 100644 index 0000000..76114bc Binary files /dev/null and b/_static/fonts/lato-normal-italic.woff differ diff --git a/_static/fonts/lato-normal-italic.woff2 b/_static/fonts/lato-normal-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/_static/fonts/lato-normal-italic.woff2 differ diff --git a/_static/fonts/lato-normal.woff b/_static/fonts/lato-normal.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and b/_static/fonts/lato-normal.woff differ diff --git a/_static/fonts/lato-normal.woff2 b/_static/fonts/lato-normal.woff2 new file mode 100644 index
0000000..3bf9843 Binary files /dev/null and b/_static/fonts/lato-normal.woff2 differ diff --git a/_static/graphviz.css b/_static/graphviz.css new file mode 100644 index 0000000..8ab69e0 --- /dev/null +++ b/_static/graphviz.css @@ -0,0 +1,19 @@ +/* + * graphviz.css + * ~~~~~~~~~~~~ + * + * Sphinx stylesheet -- graphviz extension. + * + * :copyright: Copyright 2007-2020 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +img.graphviz { + border: 0; + max-width: 100%; +} + +object.graphviz { + max-width: 100%; +} diff --git a/_static/jquery-3.5.1.js b/_static/jquery-3.5.1.js new file mode 100644 index 0000000..5093733 --- /dev/null +++ b/_static/jquery-3.5.1.js @@ -0,0 +1,10872 @@ +/*! + * jQuery JavaScript Library v3.5.1 + * https://jquery.com/ + * + * Includes Sizzle.js + * https://sizzlejs.com/ + * + * Copyright JS Foundation and other contributors + * Released under the MIT license + * https://jquery.org/license + * + * Date: 2020-05-04T22:49Z + */ +( function( global, factory ) { + + "use strict"; + + if ( typeof module === "object" && typeof module.exports === "object" ) { + + // For CommonJS and CommonJS-like environments where a proper `window` + // is present, execute the factory and get jQuery. + // For environments that do not have a `window` with a `document` + // (such as Node.js), expose a factory as module.exports. + // This accentuates the need for the creation of a real `window`. + // e.g. var jQuery = require("jquery")(window); + // See ticket #14549 for more info. + module.exports = global.document ? + factory( global, true ) : + function( w ) { + if ( !w.document ) { + throw new Error( "jQuery requires a window with a document" ); + } + return factory( w ); + }; + } else { + factory( global ); + } + +// Pass this if window is not defined yet +} )( typeof window !== "undefined" ? window : this, function( window, noGlobal ) { + +// Edge <= 12 - 13+, Firefox <=18 - 45+, IE 10 - 11, Safari 5.1 - 9+, iOS 6 - 9.1 +// throw exceptions when non-strict code (e.g., ASP.NET 4.5) accesses strict mode +// arguments.callee.caller (trac-13335). But as of jQuery 3.0 (2016), strict mode should be common +// enough that all such attempts are guarded in a try block. +"use strict"; + +var arr = []; + +var getProto = Object.getPrototypeOf; + +var slice = arr.slice; + +var flat = arr.flat ? function( array ) { + return arr.flat.call( array ); +} : function( array ) { + return arr.concat.apply( [], array ); +}; + + +var push = arr.push; + +var indexOf = arr.indexOf; + +var class2type = {}; + +var toString = class2type.toString; + +var hasOwn = class2type.hasOwnProperty; + +var fnToString = hasOwn.toString; + +var ObjectFunctionString = fnToString.call( Object ); + +var support = {}; + +var isFunction = function isFunction( obj ) { + + // Support: Chrome <=57, Firefox <=52 + // In some browsers, typeof returns "function" for HTML elements + // (i.e., `typeof document.createElement( "object" ) === "function"`). + // We don't want to classify *any* DOM node as a function. 
+ return typeof obj === "function" && typeof obj.nodeType !== "number"; + }; + + +var isWindow = function isWindow( obj ) { + return obj != null && obj === obj.window; + }; + + +var document = window.document; + + + + var preservedScriptAttributes = { + type: true, + src: true, + nonce: true, + noModule: true + }; + + function DOMEval( code, node, doc ) { + doc = doc || document; + + var i, val, + script = doc.createElement( "script" ); + + script.text = code; + if ( node ) { + for ( i in preservedScriptAttributes ) { + + // Support: Firefox 64+, Edge 18+ + // Some browsers don't support the "nonce" property on scripts. + // On the other hand, just using `getAttribute` is not enough as + // the `nonce` attribute is reset to an empty string whenever it + // becomes browsing-context connected. + // See https://github.com/whatwg/html/issues/2369 + // See https://html.spec.whatwg.org/#nonce-attributes + // The `node.getAttribute` check was added for the sake of + // `jQuery.globalEval` so that it can fake a nonce-containing node + // via an object. + val = node[ i ] || node.getAttribute && node.getAttribute( i ); + if ( val ) { + script.setAttribute( i, val ); + } + } + } + doc.head.appendChild( script ).parentNode.removeChild( script ); + } + + +function toType( obj ) { + if ( obj == null ) { + return obj + ""; + } + + // Support: Android <=2.3 only (functionish RegExp) + return typeof obj === "object" || typeof obj === "function" ? + class2type[ toString.call( obj ) ] || "object" : + typeof obj; +} +/* global Symbol */ +// Defining this global in .eslintrc.json would create a danger of using the global +// unguarded in another place, it seems safer to define global only for this module + + + +var + version = "3.5.1", + + // Define a local copy of jQuery + jQuery = function( selector, context ) { + + // The jQuery object is actually just the init constructor 'enhanced' + // Need init if jQuery is called (just allow error to be thrown if not included) + return new jQuery.fn.init( selector, context ); + }; + +jQuery.fn = jQuery.prototype = { + + // The current version of jQuery being used + jquery: version, + + constructor: jQuery, + + // The default length of a jQuery object is 0 + length: 0, + + toArray: function() { + return slice.call( this ); + }, + + // Get the Nth element in the matched element set OR + // Get the whole matched element set as a clean array + get: function( num ) { + + // Return all the elements in a clean array + if ( num == null ) { + return slice.call( this ); + } + + // Return just the one element from the set + return num < 0 ? this[ num + this.length ] : this[ num ]; + }, + + // Take an array of elements and push it onto the stack + // (returning the new matched element set) + pushStack: function( elems ) { + + // Build a new jQuery matched element set + var ret = jQuery.merge( this.constructor(), elems ); + + // Add the old object onto the stack (as a reference) + ret.prevObject = this; + + // Return the newly-formed element set + return ret; + }, + + // Execute a callback for every element in the matched set. 
+ each: function( callback ) { + return jQuery.each( this, callback ); + }, + + map: function( callback ) { + return this.pushStack( jQuery.map( this, function( elem, i ) { + return callback.call( elem, i, elem ); + } ) ); + }, + + slice: function() { + return this.pushStack( slice.apply( this, arguments ) ); + }, + + first: function() { + return this.eq( 0 ); + }, + + last: function() { + return this.eq( -1 ); + }, + + even: function() { + return this.pushStack( jQuery.grep( this, function( _elem, i ) { + return ( i + 1 ) % 2; + } ) ); + }, + + odd: function() { + return this.pushStack( jQuery.grep( this, function( _elem, i ) { + return i % 2; + } ) ); + }, + + eq: function( i ) { + var len = this.length, + j = +i + ( i < 0 ? len : 0 ); + return this.pushStack( j >= 0 && j < len ? [ this[ j ] ] : [] ); + }, + + end: function() { + return this.prevObject || this.constructor(); + }, + + // For internal use only. + // Behaves like an Array's method, not like a jQuery method. + push: push, + sort: arr.sort, + splice: arr.splice +}; + +jQuery.extend = jQuery.fn.extend = function() { + var options, name, src, copy, copyIsArray, clone, + target = arguments[ 0 ] || {}, + i = 1, + length = arguments.length, + deep = false; + + // Handle a deep copy situation + if ( typeof target === "boolean" ) { + deep = target; + + // Skip the boolean and the target + target = arguments[ i ] || {}; + i++; + } + + // Handle case when target is a string or something (possible in deep copy) + if ( typeof target !== "object" && !isFunction( target ) ) { + target = {}; + } + + // Extend jQuery itself if only one argument is passed + if ( i === length ) { + target = this; + i--; + } + + for ( ; i < length; i++ ) { + + // Only deal with non-null/undefined values + if ( ( options = arguments[ i ] ) != null ) { + + // Extend the base object + for ( name in options ) { + copy = options[ name ]; + + // Prevent Object.prototype pollution + // Prevent never-ending loop + if ( name === "__proto__" || target === copy ) { + continue; + } + + // Recurse if we're merging plain objects or arrays + if ( deep && copy && ( jQuery.isPlainObject( copy ) || + ( copyIsArray = Array.isArray( copy ) ) ) ) { + src = target[ name ]; + + // Ensure proper type for the source value + if ( copyIsArray && !Array.isArray( src ) ) { + clone = []; + } else if ( !copyIsArray && !jQuery.isPlainObject( src ) ) { + clone = {}; + } else { + clone = src; + } + copyIsArray = false; + + // Never move original objects, clone them + target[ name ] = jQuery.extend( deep, clone, copy ); + + // Don't bring in undefined values + } else if ( copy !== undefined ) { + target[ name ] = copy; + } + } + } + } + + // Return the modified object + return target; +}; + +jQuery.extend( { + + // Unique for each copy of jQuery on the page + expando: "jQuery" + ( version + Math.random() ).replace( /\D/g, "" ), + + // Assume jQuery is ready without the ready module + isReady: true, + + error: function( msg ) { + throw new Error( msg ); + }, + + noop: function() {}, + + isPlainObject: function( obj ) { + var proto, Ctor; + + // Detect obvious negatives + // Use toString instead of jQuery.type to catch host objects + if ( !obj || toString.call( obj ) !== "[object Object]" ) { + return false; + } + + proto = getProto( obj ); + + // Objects with no prototype (e.g., `Object.create( null )`) are plain + if ( !proto ) { + return true; + } + + // Objects with prototype are plain iff they were constructed by a global Object function + Ctor = hasOwn.call( proto, "constructor" ) && 
proto.constructor; + return typeof Ctor === "function" && fnToString.call( Ctor ) === ObjectFunctionString; + }, + + isEmptyObject: function( obj ) { + var name; + + for ( name in obj ) { + return false; + } + return true; + }, + + // Evaluates a script in a provided context; falls back to the global one + // if not specified. + globalEval: function( code, options, doc ) { + DOMEval( code, { nonce: options && options.nonce }, doc ); + }, + + each: function( obj, callback ) { + var length, i = 0; + + if ( isArrayLike( obj ) ) { + length = obj.length; + for ( ; i < length; i++ ) { + if ( callback.call( obj[ i ], i, obj[ i ] ) === false ) { + break; + } + } + } else { + for ( i in obj ) { + if ( callback.call( obj[ i ], i, obj[ i ] ) === false ) { + break; + } + } + } + + return obj; + }, + + // results is for internal usage only + makeArray: function( arr, results ) { + var ret = results || []; + + if ( arr != null ) { + if ( isArrayLike( Object( arr ) ) ) { + jQuery.merge( ret, + typeof arr === "string" ? + [ arr ] : arr + ); + } else { + push.call( ret, arr ); + } + } + + return ret; + }, + + inArray: function( elem, arr, i ) { + return arr == null ? -1 : indexOf.call( arr, elem, i ); + }, + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + merge: function( first, second ) { + var len = +second.length, + j = 0, + i = first.length; + + for ( ; j < len; j++ ) { + first[ i++ ] = second[ j ]; + } + + first.length = i; + + return first; + }, + + grep: function( elems, callback, invert ) { + var callbackInverse, + matches = [], + i = 0, + length = elems.length, + callbackExpect = !invert; + + // Go through the array, only saving the items + // that pass the validator function + for ( ; i < length; i++ ) { + callbackInverse = !callback( elems[ i ], i ); + if ( callbackInverse !== callbackExpect ) { + matches.push( elems[ i ] ); + } + } + + return matches; + }, + + // arg is for internal usage only + map: function( elems, callback, arg ) { + var length, value, + i = 0, + ret = []; + + // Go through the array, translating each of the items to their new values + if ( isArrayLike( elems ) ) { + length = elems.length; + for ( ; i < length; i++ ) { + value = callback( elems[ i ], i, arg ); + + if ( value != null ) { + ret.push( value ); + } + } + + // Go through every key on the object, + } else { + for ( i in elems ) { + value = callback( elems[ i ], i, arg ); + + if ( value != null ) { + ret.push( value ); + } + } + } + + // Flatten any nested arrays + return flat( ret ); + }, + + // A global GUID counter for objects + guid: 1, + + // jQuery.support is not used in Core but other projects attach their + // properties to it so it needs to exist. 
+ support: support +} ); + +if ( typeof Symbol === "function" ) { + jQuery.fn[ Symbol.iterator ] = arr[ Symbol.iterator ]; +} + +// Populate the class2type map +jQuery.each( "Boolean Number String Function Array Date RegExp Object Error Symbol".split( " " ), +function( _i, name ) { + class2type[ "[object " + name + "]" ] = name.toLowerCase(); +} ); + +function isArrayLike( obj ) { + + // Support: real iOS 8.2 only (not reproducible in simulator) + // `in` check used to prevent JIT error (gh-2145) + // hasOwn isn't used here due to false negatives + // regarding Nodelist length in IE + var length = !!obj && "length" in obj && obj.length, + type = toType( obj ); + + if ( isFunction( obj ) || isWindow( obj ) ) { + return false; + } + + return type === "array" || length === 0 || + typeof length === "number" && length > 0 && ( length - 1 ) in obj; +} +var Sizzle = +/*! + * Sizzle CSS Selector Engine v2.3.5 + * https://sizzlejs.com/ + * + * Copyright JS Foundation and other contributors + * Released under the MIT license + * https://js.foundation/ + * + * Date: 2020-03-14 + */ +( function( window ) { +var i, + support, + Expr, + getText, + isXML, + tokenize, + compile, + select, + outermostContext, + sortInput, + hasDuplicate, + + // Local document vars + setDocument, + document, + docElem, + documentIsHTML, + rbuggyQSA, + rbuggyMatches, + matches, + contains, + + // Instance-specific data + expando = "sizzle" + 1 * new Date(), + preferredDoc = window.document, + dirruns = 0, + done = 0, + classCache = createCache(), + tokenCache = createCache(), + compilerCache = createCache(), + nonnativeSelectorCache = createCache(), + sortOrder = function( a, b ) { + if ( a === b ) { + hasDuplicate = true; + } + return 0; + }, + + // Instance methods + hasOwn = ( {} ).hasOwnProperty, + arr = [], + pop = arr.pop, + pushNative = arr.push, + push = arr.push, + slice = arr.slice, + + // Use a stripped-down indexOf as it's faster than native + // https://jsperf.com/thor-indexof-vs-for/5 + indexOf = function( list, elem ) { + var i = 0, + len = list.length; + for ( ; i < len; i++ ) { + if ( list[ i ] === elem ) { + return i; + } + } + return -1; + }, + + booleans = "checked|selected|async|autofocus|autoplay|controls|defer|disabled|hidden|" + + "ismap|loop|multiple|open|readonly|required|scoped", + + // Regular expressions + + // http://www.w3.org/TR/css3-selectors/#whitespace + whitespace = "[\\x20\\t\\r\\n\\f]", + + // https://www.w3.org/TR/css-syntax-3/#ident-token-diagram + identifier = "(?:\\\\[\\da-fA-F]{1,6}" + whitespace + + "?|\\\\[^\\r\\n\\f]|[\\w-]|[^\0-\\x7f])+", + + // Attribute selectors: http://www.w3.org/TR/selectors/#attribute-selectors + attributes = "\\[" + whitespace + "*(" + identifier + ")(?:" + whitespace + + + // Operator (capture 2) + "*([*^$|!~]?=)" + whitespace + + + // "Attribute values must be CSS identifiers [capture 5] + // or strings [capture 3 or capture 4]" + "*(?:'((?:\\\\.|[^\\\\'])*)'|\"((?:\\\\.|[^\\\\\"])*)\"|(" + identifier + "))|)" + + whitespace + "*\\]", + + pseudos = ":(" + identifier + ")(?:\\((" + + + // To reduce the number of selectors needing tokenize in the preFilter, prefer arguments: + // 1. quoted (capture 3; capture 4 or capture 5) + "('((?:\\\\.|[^\\\\'])*)'|\"((?:\\\\.|[^\\\\\"])*)\")|" + + + // 2. simple (capture 6) + "((?:\\\\.|[^\\\\()[\\]]|" + attributes + ")*)|" + + + // 3. 
anything else (capture 2) + ".*" + + ")\\)|)", + + // Leading and non-escaped trailing whitespace, capturing some non-whitespace characters preceding the latter + rwhitespace = new RegExp( whitespace + "+", "g" ), + rtrim = new RegExp( "^" + whitespace + "+|((?:^|[^\\\\])(?:\\\\.)*)" + + whitespace + "+$", "g" ), + + rcomma = new RegExp( "^" + whitespace + "*," + whitespace + "*" ), + rcombinators = new RegExp( "^" + whitespace + "*([>+~]|" + whitespace + ")" + whitespace + + "*" ), + rdescend = new RegExp( whitespace + "|>" ), + + rpseudo = new RegExp( pseudos ), + ridentifier = new RegExp( "^" + identifier + "$" ), + + matchExpr = { + "ID": new RegExp( "^#(" + identifier + ")" ), + "CLASS": new RegExp( "^\\.(" + identifier + ")" ), + "TAG": new RegExp( "^(" + identifier + "|[*])" ), + "ATTR": new RegExp( "^" + attributes ), + "PSEUDO": new RegExp( "^" + pseudos ), + "CHILD": new RegExp( "^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\(" + + whitespace + "*(even|odd|(([+-]|)(\\d*)n|)" + whitespace + "*(?:([+-]|)" + + whitespace + "*(\\d+)|))" + whitespace + "*\\)|)", "i" ), + "bool": new RegExp( "^(?:" + booleans + ")$", "i" ), + + // For use in libraries implementing .is() + // We use this for POS matching in `select` + "needsContext": new RegExp( "^" + whitespace + + "*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\(" + whitespace + + "*((?:-\\d)?\\d*)" + whitespace + "*\\)|)(?=[^-]|$)", "i" ) + }, + + rhtml = /HTML$/i, + rinputs = /^(?:input|select|textarea|button)$/i, + rheader = /^h\d$/i, + + rnative = /^[^{]+\{\s*\[native \w/, + + // Easily-parseable/retrievable ID or TAG or CLASS selectors + rquickExpr = /^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/, + + rsibling = /[+~]/, + + // CSS escapes + // http://www.w3.org/TR/CSS21/syndata.html#escaped-characters + runescape = new RegExp( "\\\\[\\da-fA-F]{1,6}" + whitespace + "?|\\\\([^\\r\\n\\f])", "g" ), + funescape = function( escape, nonHex ) { + var high = "0x" + escape.slice( 1 ) - 0x10000; + + return nonHex ? + + // Strip the backslash prefix from a non-hex escape sequence + nonHex : + + // Replace a hexadecimal escape sequence with the encoded Unicode code point + // Support: IE <=11+ + // For values outside the Basic Multilingual Plane (BMP), manually construct a + // surrogate pair + high < 0 ? 
+ String.fromCharCode( high + 0x10000 ) : + String.fromCharCode( high >> 10 | 0xD800, high & 0x3FF | 0xDC00 ); + }, + + // CSS string/identifier serialization + // https://drafts.csswg.org/cssom/#common-serializing-idioms + rcssescape = /([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g, + fcssescape = function( ch, asCodePoint ) { + if ( asCodePoint ) { + + // U+0000 NULL becomes U+FFFD REPLACEMENT CHARACTER + if ( ch === "\0" ) { + return "\uFFFD"; + } + + // Control characters and (dependent upon position) numbers get escaped as code points + return ch.slice( 0, -1 ) + "\\" + + ch.charCodeAt( ch.length - 1 ).toString( 16 ) + " "; + } + + // Other potentially-special ASCII characters get backslash-escaped + return "\\" + ch; + }, + + // Used for iframes + // See setDocument() + // Removing the function wrapper causes a "Permission Denied" + // error in IE + unloadHandler = function() { + setDocument(); + }, + + inDisabledFieldset = addCombinator( + function( elem ) { + return elem.disabled === true && elem.nodeName.toLowerCase() === "fieldset"; + }, + { dir: "parentNode", next: "legend" } + ); + +// Optimize for push.apply( _, NodeList ) +try { + push.apply( + ( arr = slice.call( preferredDoc.childNodes ) ), + preferredDoc.childNodes + ); + + // Support: Android<4.0 + // Detect silently failing push.apply + // eslint-disable-next-line no-unused-expressions + arr[ preferredDoc.childNodes.length ].nodeType; +} catch ( e ) { + push = { apply: arr.length ? + + // Leverage slice if possible + function( target, els ) { + pushNative.apply( target, slice.call( els ) ); + } : + + // Support: IE<9 + // Otherwise append directly + function( target, els ) { + var j = target.length, + i = 0; + + // Can't trust NodeList.length + while ( ( target[ j++ ] = els[ i++ ] ) ) {} + target.length = j - 1; + } + }; +} + +function Sizzle( selector, context, results, seed ) { + var m, i, elem, nid, match, groups, newSelector, + newContext = context && context.ownerDocument, + + // nodeType defaults to 9, since context defaults to document + nodeType = context ? 
context.nodeType : 9; + + results = results || []; + + // Return early from calls with invalid selector or context + if ( typeof selector !== "string" || !selector || + nodeType !== 1 && nodeType !== 9 && nodeType !== 11 ) { + + return results; + } + + // Try to shortcut find operations (as opposed to filters) in HTML documents + if ( !seed ) { + setDocument( context ); + context = context || document; + + if ( documentIsHTML ) { + + // If the selector is sufficiently simple, try using a "get*By*" DOM method + // (excepting DocumentFragment context, where the methods don't exist) + if ( nodeType !== 11 && ( match = rquickExpr.exec( selector ) ) ) { + + // ID selector + if ( ( m = match[ 1 ] ) ) { + + // Document context + if ( nodeType === 9 ) { + if ( ( elem = context.getElementById( m ) ) ) { + + // Support: IE, Opera, Webkit + // TODO: identify versions + // getElementById can match elements by name instead of ID + if ( elem.id === m ) { + results.push( elem ); + return results; + } + } else { + return results; + } + + // Element context + } else { + + // Support: IE, Opera, Webkit + // TODO: identify versions + // getElementById can match elements by name instead of ID + if ( newContext && ( elem = newContext.getElementById( m ) ) && + contains( context, elem ) && + elem.id === m ) { + + results.push( elem ); + return results; + } + } + + // Type selector + } else if ( match[ 2 ] ) { + push.apply( results, context.getElementsByTagName( selector ) ); + return results; + + // Class selector + } else if ( ( m = match[ 3 ] ) && support.getElementsByClassName && + context.getElementsByClassName ) { + + push.apply( results, context.getElementsByClassName( m ) ); + return results; + } + } + + // Take advantage of querySelectorAll + if ( support.qsa && + !nonnativeSelectorCache[ selector + " " ] && + ( !rbuggyQSA || !rbuggyQSA.test( selector ) ) && + + // Support: IE 8 only + // Exclude object elements + ( nodeType !== 1 || context.nodeName.toLowerCase() !== "object" ) ) { + + newSelector = selector; + newContext = context; + + // qSA considers elements outside a scoping root when evaluating child or + // descendant combinators, which is not what we want. + // In such cases, we work around the behavior by prefixing every selector in the + // list with an ID selector referencing the scope context. + // The technique has to be used as well when a leading combinator is used + // as such selectors are not recognized by querySelectorAll. + // Thanks to Andrew Dupont for this technique. + if ( nodeType === 1 && + ( rdescend.test( selector ) || rcombinators.test( selector ) ) ) { + + // Expand context for sibling selectors + newContext = rsibling.test( selector ) && testContext( context.parentNode ) || + context; + + // We can use :scope instead of the ID hack if the browser + // supports it & if we're not changing the context. + if ( newContext !== context || !support.scope ) { + + // Capture the context ID, setting it first if necessary + if ( ( nid = context.getAttribute( "id" ) ) ) { + nid = nid.replace( rcssescape, fcssescape ); + } else { + context.setAttribute( "id", ( nid = expando ) ); + } + } + + // Prefix every selector in the list + groups = tokenize( selector ); + i = groups.length; + while ( i-- ) { + groups[ i ] = ( nid ? 
"#" + nid : ":scope" ) + " " + + toSelector( groups[ i ] ); + } + newSelector = groups.join( "," ); + } + + try { + push.apply( results, + newContext.querySelectorAll( newSelector ) + ); + return results; + } catch ( qsaError ) { + nonnativeSelectorCache( selector, true ); + } finally { + if ( nid === expando ) { + context.removeAttribute( "id" ); + } + } + } + } + } + + // All others + return select( selector.replace( rtrim, "$1" ), context, results, seed ); +} + +/** + * Create key-value caches of limited size + * @returns {function(string, object)} Returns the Object data after storing it on itself with + * property name the (space-suffixed) string and (if the cache is larger than Expr.cacheLength) + * deleting the oldest entry + */ +function createCache() { + var keys = []; + + function cache( key, value ) { + + // Use (key + " ") to avoid collision with native prototype properties (see Issue #157) + if ( keys.push( key + " " ) > Expr.cacheLength ) { + + // Only keep the most recent entries + delete cache[ keys.shift() ]; + } + return ( cache[ key + " " ] = value ); + } + return cache; +} + +/** + * Mark a function for special use by Sizzle + * @param {Function} fn The function to mark + */ +function markFunction( fn ) { + fn[ expando ] = true; + return fn; +} + +/** + * Support testing using an element + * @param {Function} fn Passed the created element and returns a boolean result + */ +function assert( fn ) { + var el = document.createElement( "fieldset" ); + + try { + return !!fn( el ); + } catch ( e ) { + return false; + } finally { + + // Remove from its parent by default + if ( el.parentNode ) { + el.parentNode.removeChild( el ); + } + + // release memory in IE + el = null; + } +} + +/** + * Adds the same handler for all of the specified attrs + * @param {String} attrs Pipe-separated list of attributes + * @param {Function} handler The method that will be applied + */ +function addHandle( attrs, handler ) { + var arr = attrs.split( "|" ), + i = arr.length; + + while ( i-- ) { + Expr.attrHandle[ arr[ i ] ] = handler; + } +} + +/** + * Checks document order of two siblings + * @param {Element} a + * @param {Element} b + * @returns {Number} Returns less than 0 if a precedes b, greater than 0 if a follows b + */ +function siblingCheck( a, b ) { + var cur = b && a, + diff = cur && a.nodeType === 1 && b.nodeType === 1 && + a.sourceIndex - b.sourceIndex; + + // Use IE sourceIndex if available on both nodes + if ( diff ) { + return diff; + } + + // Check if b follows a + if ( cur ) { + while ( ( cur = cur.nextSibling ) ) { + if ( cur === b ) { + return -1; + } + } + } + + return a ? 
1 : -1; +} + +/** + * Returns a function to use in pseudos for input types + * @param {String} type + */ +function createInputPseudo( type ) { + return function( elem ) { + var name = elem.nodeName.toLowerCase(); + return name === "input" && elem.type === type; + }; +} + +/** + * Returns a function to use in pseudos for buttons + * @param {String} type + */ +function createButtonPseudo( type ) { + return function( elem ) { + var name = elem.nodeName.toLowerCase(); + return ( name === "input" || name === "button" ) && elem.type === type; + }; +} + +/** + * Returns a function to use in pseudos for :enabled/:disabled + * @param {Boolean} disabled true for :disabled; false for :enabled + */ +function createDisabledPseudo( disabled ) { + + // Known :disabled false positives: fieldset[disabled] > legend:nth-of-type(n+2) :can-disable + return function( elem ) { + + // Only certain elements can match :enabled or :disabled + // https://html.spec.whatwg.org/multipage/scripting.html#selector-enabled + // https://html.spec.whatwg.org/multipage/scripting.html#selector-disabled + if ( "form" in elem ) { + + // Check for inherited disabledness on relevant non-disabled elements: + // * listed form-associated elements in a disabled fieldset + // https://html.spec.whatwg.org/multipage/forms.html#category-listed + // https://html.spec.whatwg.org/multipage/forms.html#concept-fe-disabled + // * option elements in a disabled optgroup + // https://html.spec.whatwg.org/multipage/forms.html#concept-option-disabled + // All such elements have a "form" property. + if ( elem.parentNode && elem.disabled === false ) { + + // Option elements defer to a parent optgroup if present + if ( "label" in elem ) { + if ( "label" in elem.parentNode ) { + return elem.parentNode.disabled === disabled; + } else { + return elem.disabled === disabled; + } + } + + // Support: IE 6 - 11 + // Use the isDisabled shortcut property to check for disabled fieldset ancestors + return elem.isDisabled === disabled || + + // Where there is no isDisabled, check manually + /* jshint -W018 */ + elem.isDisabled !== !disabled && + inDisabledFieldset( elem ) === disabled; + } + + return elem.disabled === disabled; + + // Try to winnow out elements that can't be disabled before trusting the disabled property. + // Some victims get caught in our net (label, legend, menu, track), but it shouldn't + // even exist on them, let alone have a boolean value. 
+ } else if ( "label" in elem ) { + return elem.disabled === disabled; + } + + // Remaining elements are neither :enabled nor :disabled + return false; + }; +} + +/** + * Returns a function to use in pseudos for positionals + * @param {Function} fn + */ +function createPositionalPseudo( fn ) { + return markFunction( function( argument ) { + argument = +argument; + return markFunction( function( seed, matches ) { + var j, + matchIndexes = fn( [], seed.length, argument ), + i = matchIndexes.length; + + // Match elements found at the specified indexes + while ( i-- ) { + if ( seed[ ( j = matchIndexes[ i ] ) ] ) { + seed[ j ] = !( matches[ j ] = seed[ j ] ); + } + } + } ); + } ); +} + +/** + * Checks a node for validity as a Sizzle context + * @param {Element|Object=} context + * @returns {Element|Object|Boolean} The input node if acceptable, otherwise a falsy value + */ +function testContext( context ) { + return context && typeof context.getElementsByTagName !== "undefined" && context; +} + +// Expose support vars for convenience +support = Sizzle.support = {}; + +/** + * Detects XML nodes + * @param {Element|Object} elem An element or a document + * @returns {Boolean} True iff elem is a non-HTML XML node + */ +isXML = Sizzle.isXML = function( elem ) { + var namespace = elem.namespaceURI, + docElem = ( elem.ownerDocument || elem ).documentElement; + + // Support: IE <=8 + // Assume HTML when documentElement doesn't yet exist, such as inside loading iframes + // https://bugs.jquery.com/ticket/4833 + return !rhtml.test( namespace || docElem && docElem.nodeName || "HTML" ); +}; + +/** + * Sets document-related variables once based on the current document + * @param {Element|Object} [doc] An element or document object to use to set the document + * @returns {Object} Returns the current document + */ +setDocument = Sizzle.setDocument = function( node ) { + var hasCompare, subWindow, + doc = node ? node.ownerDocument || node : preferredDoc; + + // Return early if doc is invalid or already selected + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( doc == document || doc.nodeType !== 9 || !doc.documentElement ) { + return document; + } + + // Update global variables + document = doc; + docElem = document.documentElement; + documentIsHTML = !isXML( document ); + + // Support: IE 9 - 11+, Edge 12 - 18+ + // Accessing iframe documents after unload throws "permission denied" errors (jQuery #13936) + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( preferredDoc != document && + ( subWindow = document.defaultView ) && subWindow.top !== subWindow ) { + + // Support: IE 11, Edge + if ( subWindow.addEventListener ) { + subWindow.addEventListener( "unload", unloadHandler, false ); + + // Support: IE 9 - 10 only + } else if ( subWindow.attachEvent ) { + subWindow.attachEvent( "onunload", unloadHandler ); + } + } + + // Support: IE 8 - 11+, Edge 12 - 18+, Chrome <=16 - 25 only, Firefox <=3.6 - 31 only, + // Safari 4 - 5 only, Opera <=11.6 - 12.x only + // IE/Edge & older browsers don't support the :scope pseudo-class. + // Support: Safari 6.0 only + // Safari 6.0 supports :scope but it's an alias of :root there. 
+ support.scope = assert( function( el ) { + docElem.appendChild( el ).appendChild( document.createElement( "div" ) ); + return typeof el.querySelectorAll !== "undefined" && + !el.querySelectorAll( ":scope fieldset div" ).length; + } ); + + /* Attributes + ---------------------------------------------------------------------- */ + + // Support: IE<8 + // Verify that getAttribute really returns attributes and not properties + // (excepting IE8 booleans) + support.attributes = assert( function( el ) { + el.className = "i"; + return !el.getAttribute( "className" ); + } ); + + /* getElement(s)By* + ---------------------------------------------------------------------- */ + + // Check if getElementsByTagName("*") returns only elements + support.getElementsByTagName = assert( function( el ) { + el.appendChild( document.createComment( "" ) ); + return !el.getElementsByTagName( "*" ).length; + } ); + + // Support: IE<9 + support.getElementsByClassName = rnative.test( document.getElementsByClassName ); + + // Support: IE<10 + // Check if getElementById returns elements by name + // The broken getElementById methods don't pick up programmatically-set names, + // so use a roundabout getElementsByName test + support.getById = assert( function( el ) { + docElem.appendChild( el ).id = expando; + return !document.getElementsByName || !document.getElementsByName( expando ).length; + } ); + + // ID filter and find + if ( support.getById ) { + Expr.filter[ "ID" ] = function( id ) { + var attrId = id.replace( runescape, funescape ); + return function( elem ) { + return elem.getAttribute( "id" ) === attrId; + }; + }; + Expr.find[ "ID" ] = function( id, context ) { + if ( typeof context.getElementById !== "undefined" && documentIsHTML ) { + var elem = context.getElementById( id ); + return elem ? [ elem ] : []; + } + }; + } else { + Expr.filter[ "ID" ] = function( id ) { + var attrId = id.replace( runescape, funescape ); + return function( elem ) { + var node = typeof elem.getAttributeNode !== "undefined" && + elem.getAttributeNode( "id" ); + return node && node.value === attrId; + }; + }; + + // Support: IE 6 - 7 only + // getElementById is not reliable as a find shortcut + Expr.find[ "ID" ] = function( id, context ) { + if ( typeof context.getElementById !== "undefined" && documentIsHTML ) { + var node, i, elems, + elem = context.getElementById( id ); + + if ( elem ) { + + // Verify the id attribute + node = elem.getAttributeNode( "id" ); + if ( node && node.value === id ) { + return [ elem ]; + } + + // Fall back on getElementsByName + elems = context.getElementsByName( id ); + i = 0; + while ( ( elem = elems[ i++ ] ) ) { + node = elem.getAttributeNode( "id" ); + if ( node && node.value === id ) { + return [ elem ]; + } + } + } + + return []; + } + }; + } + + // Tag + Expr.find[ "TAG" ] = support.getElementsByTagName ? 
+ function( tag, context ) {
+ if ( typeof context.getElementsByTagName !== "undefined" ) {
+ return context.getElementsByTagName( tag );
+
+ // DocumentFragment nodes don't have gEBTN
+ } else if ( support.qsa ) {
+ return context.querySelectorAll( tag );
+ }
+ } :
+
+ function( tag, context ) {
+ var elem,
+ tmp = [],
+ i = 0,
+
+ // By happy coincidence, a (broken) gEBTN appears on DocumentFragment nodes too
+ results = context.getElementsByTagName( tag );
+
+ // Filter out possible comments
+ if ( tag === "*" ) {
+ while ( ( elem = results[ i++ ] ) ) {
+ if ( elem.nodeType === 1 ) {
+ tmp.push( elem );
+ }
+ }
+
+ return tmp;
+ }
+ return results;
+ };
+
+ // Class
+ Expr.find[ "CLASS" ] = support.getElementsByClassName && function( className, context ) {
+ if ( typeof context.getElementsByClassName !== "undefined" && documentIsHTML ) {
+ return context.getElementsByClassName( className );
+ }
+ };
+
+ /* QSA/matchesSelector
+ ---------------------------------------------------------------------- */
+
+ // QSA and matchesSelector support
+
+ // matchesSelector(:active) reports false when true (IE9/Opera 11.5)
+ rbuggyMatches = [];
+
+ // qSa(:focus) reports false when true (Chrome 21)
+ // We allow this because of a bug in IE8/9 that throws an error
+ // whenever `document.activeElement` is accessed on an iframe
+ // So, we allow :focus to pass through QSA all the time to avoid the IE error
+ // See https://bugs.jquery.com/ticket/13378
+ rbuggyQSA = [];
+
+ if ( ( support.qsa = rnative.test( document.querySelectorAll ) ) ) {
+
+ // Build QSA regex
+ // Regex strategy adopted from Diego Perini
+ assert( function( el ) {
+
+ var input;
+
+ // Select is set to empty string on purpose
+ // This is to test IE's treatment of not explicitly
+ // setting a boolean content attribute,
+ // since its presence should be enough
+ // https://bugs.jquery.com/ticket/12359
+ docElem.appendChild( el ).innerHTML = "<a id='" + expando + "'></a>" +
+ "<select id='" + expando + "-\r\\' msallowcapture=''>" +
+ "<option selected=''></option></select>";
+
+ // Support: IE8, Opera 11-12.16
+ // Nothing should be selected when empty strings follow ^= or $= or *=
+ // The test attribute must be unknown in Opera but "safe" for WinRT
+ // https://msdn.microsoft.com/en-us/library/ie/hh465388.aspx#attribute_section
+ if ( el.querySelectorAll( "[msallowcapture^='']" ).length ) {
+ rbuggyQSA.push( "[*^$]=" + whitespace + "*(?:''|\"\")" );
+ }
+
+ // Support: IE8
+ // Boolean attributes and "value" are not treated correctly
+ if ( !el.querySelectorAll( "[selected]" ).length ) {
+ rbuggyQSA.push( "\\[" + whitespace + "*(?:value|" + booleans + ")" );
+ }
+
+ // Support: Chrome<29, Android<4.4, Safari<7.0+, iOS<7.0+, PhantomJS<1.9.8+
+ if ( !el.querySelectorAll( "[id~=" + expando + "-]" ).length ) {
+ rbuggyQSA.push( "~=" );
+ }
+
+ // Support: IE 11+, Edge 15 - 18+
+ // IE 11/Edge don't find elements on a `[name='']` query in some cases.
+ // Adding a temporary attribute to the document before the selection works
+ // around the issue.
+ // Interestingly, IE 10 & older don't seem to have the issue.
+ input = document.createElement( "input" );
+ input.setAttribute( "name", "" );
+ el.appendChild( input );
+ if ( !el.querySelectorAll( "[name='']" ).length ) {
+ rbuggyQSA.push( "\\[" + whitespace + "*name" + whitespace + "*=" +
+ whitespace + "*(?:''|\"\")" );
+ }
+
+ // Webkit/Opera - :checked should return selected option elements
+ // http://www.w3.org/TR/2011/REC-css3-selectors-20110929/#checked
+ // IE8 throws error here and will not see later tests
+ if ( !el.querySelectorAll( ":checked" ).length ) {
+ rbuggyQSA.push( ":checked" );
+ }
+
+ // Support: Safari 8+, iOS 8+
+ // https://bugs.webkit.org/show_bug.cgi?id=136851
+ // In-page `selector#id sibling-combinator selector` fails
+ if ( !el.querySelectorAll( "a#" + expando + "+*" ).length ) {
+ rbuggyQSA.push( ".#.+[+~]" );
+ }
+
+ // Support: Firefox <=3.6 - 5 only
+ // Old Firefox doesn't throw on a badly-escaped identifier.
+ el.querySelectorAll( "\\\f" );
+ rbuggyQSA.push( "[\\r\\n\\f]" );
+ } );
+
+ assert( function( el ) {
+ el.innerHTML = "<a href='' disabled='disabled'></a>" +
+ "<select disabled='disabled'><option/></select>";
+
+ // Support: Windows 8 Native Apps
+ // The type and name attributes are restricted during .innerHTML assignment
+ var input = document.createElement( "input" );
+ input.setAttribute( "type", "hidden" );
+ el.appendChild( input ).setAttribute( "name", "D" );
+
+ // Support: IE8
+ // Enforce case-sensitivity of name attribute
+ if ( el.querySelectorAll( "[name=d]" ).length ) {
+ rbuggyQSA.push( "name" + whitespace + "*[*^$|!~]?=" );
+ }
+
+ // FF 3.5 - :enabled/:disabled and hidden elements (hidden elements are still enabled)
+ // IE8 throws error here and will not see later tests
+ if ( el.querySelectorAll( ":enabled" ).length !== 2 ) {
+ rbuggyQSA.push( ":enabled", ":disabled" );
+ }
+
+ // Support: IE9-11+
+ // IE's :disabled selector does not pick up the children of disabled fieldsets
+ docElem.appendChild( el ).disabled = true;
+ if ( el.querySelectorAll( ":disabled" ).length !== 2 ) {
+ rbuggyQSA.push( ":enabled", ":disabled" );
+ }
+
+ // Support: Opera 10 - 11 only
+ // Opera 10-11 does not throw on post-comma invalid pseudos
+ el.querySelectorAll( "*,:x" );
+ rbuggyQSA.push( ",.*:" );
+ } );
+ }
+
+ if ( ( support.matchesSelector = rnative.test( ( matches = docElem.matches ||
+ docElem.webkitMatchesSelector ||
+ docElem.mozMatchesSelector ||
+ docElem.oMatchesSelector ||
+ docElem.msMatchesSelector ) ) ) ) {
+
+ assert( function( el ) {
+
+ // Check to see if it's possible to do matchesSelector
+ // on a disconnected node (IE 9)
+ support.disconnectedMatch = matches.call( el, "*" );
+
+ // This should fail with an exception
+ // Gecko does not error, returns false instead
+ matches.call( el, "[s!='']:x" );
+ rbuggyMatches.push( "!=", pseudos );
+ } );
+ }
+
+ rbuggyQSA = rbuggyQSA.length && new RegExp( rbuggyQSA.join( "|" ) );
+ rbuggyMatches = rbuggyMatches.length && new RegExp( rbuggyMatches.join( "|" ) );
+
+ /* Contains
+ ---------------------------------------------------------------------- */
+ hasCompare = rnative.test( docElem.compareDocumentPosition );
+
+ // Element contains another
+ // Purposefully self-exclusive
+ // As in, an element does not contain itself
+ contains = hasCompare || rnative.test( docElem.contains ) ?
+ function( a, b ) {
+ var adown = a.nodeType === 9 ? a.documentElement : a,
+ bup = b && b.parentNode;
+ return a === bup || !!( bup && bup.nodeType === 1 && (
+ adown.contains ?
+ adown.contains( bup ) : + a.compareDocumentPosition && a.compareDocumentPosition( bup ) & 16 + ) ); + } : + function( a, b ) { + if ( b ) { + while ( ( b = b.parentNode ) ) { + if ( b === a ) { + return true; + } + } + } + return false; + }; + + /* Sorting + ---------------------------------------------------------------------- */ + + // Document order sorting + sortOrder = hasCompare ? + function( a, b ) { + + // Flag for duplicate removal + if ( a === b ) { + hasDuplicate = true; + return 0; + } + + // Sort on method existence if only one input has compareDocumentPosition + var compare = !a.compareDocumentPosition - !b.compareDocumentPosition; + if ( compare ) { + return compare; + } + + // Calculate position if both inputs belong to the same document + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + compare = ( a.ownerDocument || a ) == ( b.ownerDocument || b ) ? + a.compareDocumentPosition( b ) : + + // Otherwise we know they are disconnected + 1; + + // Disconnected nodes + if ( compare & 1 || + ( !support.sortDetached && b.compareDocumentPosition( a ) === compare ) ) { + + // Choose the first element that is related to our preferred document + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( a == document || a.ownerDocument == preferredDoc && + contains( preferredDoc, a ) ) { + return -1; + } + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( b == document || b.ownerDocument == preferredDoc && + contains( preferredDoc, b ) ) { + return 1; + } + + // Maintain original order + return sortInput ? + ( indexOf( sortInput, a ) - indexOf( sortInput, b ) ) : + 0; + } + + return compare & 4 ? -1 : 1; + } : + function( a, b ) { + + // Exit early if the nodes are identical + if ( a === b ) { + hasDuplicate = true; + return 0; + } + + var cur, + i = 0, + aup = a.parentNode, + bup = b.parentNode, + ap = [ a ], + bp = [ b ]; + + // Parentless nodes are either documents or disconnected + if ( !aup || !bup ) { + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + /* eslint-disable eqeqeq */ + return a == document ? -1 : + b == document ? 1 : + /* eslint-enable eqeqeq */ + aup ? -1 : + bup ? 1 : + sortInput ? + ( indexOf( sortInput, a ) - indexOf( sortInput, b ) ) : + 0; + + // If the nodes are siblings, we can do a quick check + } else if ( aup === bup ) { + return siblingCheck( a, b ); + } + + // Otherwise we need full lists of their ancestors for comparison + cur = a; + while ( ( cur = cur.parentNode ) ) { + ap.unshift( cur ); + } + cur = b; + while ( ( cur = cur.parentNode ) ) { + bp.unshift( cur ); + } + + // Walk down the tree looking for a discrepancy + while ( ap[ i ] === bp[ i ] ) { + i++; + } + + return i ? + + // Do a sibling check if the nodes have a common ancestor + siblingCheck( ap[ i ], bp[ i ] ) : + + // Otherwise nodes in our document sort first + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. 
+ /* eslint-disable eqeqeq */ + ap[ i ] == preferredDoc ? -1 : + bp[ i ] == preferredDoc ? 1 : + /* eslint-enable eqeqeq */ + 0; + }; + + return document; +}; + +Sizzle.matches = function( expr, elements ) { + return Sizzle( expr, null, null, elements ); +}; + +Sizzle.matchesSelector = function( elem, expr ) { + setDocument( elem ); + + if ( support.matchesSelector && documentIsHTML && + !nonnativeSelectorCache[ expr + " " ] && + ( !rbuggyMatches || !rbuggyMatches.test( expr ) ) && + ( !rbuggyQSA || !rbuggyQSA.test( expr ) ) ) { + + try { + var ret = matches.call( elem, expr ); + + // IE 9's matchesSelector returns false on disconnected nodes + if ( ret || support.disconnectedMatch || + + // As well, disconnected nodes are said to be in a document + // fragment in IE 9 + elem.document && elem.document.nodeType !== 11 ) { + return ret; + } + } catch ( e ) { + nonnativeSelectorCache( expr, true ); + } + } + + return Sizzle( expr, document, null, [ elem ] ).length > 0; +}; + +Sizzle.contains = function( context, elem ) { + + // Set document vars if needed + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( ( context.ownerDocument || context ) != document ) { + setDocument( context ); + } + return contains( context, elem ); +}; + +Sizzle.attr = function( elem, name ) { + + // Set document vars if needed + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( ( elem.ownerDocument || elem ) != document ) { + setDocument( elem ); + } + + var fn = Expr.attrHandle[ name.toLowerCase() ], + + // Don't get fooled by Object.prototype properties (jQuery #13807) + val = fn && hasOwn.call( Expr.attrHandle, name.toLowerCase() ) ? + fn( elem, name, !documentIsHTML ) : + undefined; + + return val !== undefined ? + val : + support.attributes || !documentIsHTML ? + elem.getAttribute( name ) : + ( val = elem.getAttributeNode( name ) ) && val.specified ? 
+ val.value : + null; +}; + +Sizzle.escape = function( sel ) { + return ( sel + "" ).replace( rcssescape, fcssescape ); +}; + +Sizzle.error = function( msg ) { + throw new Error( "Syntax error, unrecognized expression: " + msg ); +}; + +/** + * Document sorting and removing duplicates + * @param {ArrayLike} results + */ +Sizzle.uniqueSort = function( results ) { + var elem, + duplicates = [], + j = 0, + i = 0; + + // Unless we *know* we can detect duplicates, assume their presence + hasDuplicate = !support.detectDuplicates; + sortInput = !support.sortStable && results.slice( 0 ); + results.sort( sortOrder ); + + if ( hasDuplicate ) { + while ( ( elem = results[ i++ ] ) ) { + if ( elem === results[ i ] ) { + j = duplicates.push( i ); + } + } + while ( j-- ) { + results.splice( duplicates[ j ], 1 ); + } + } + + // Clear input after sorting to release objects + // See https://github.com/jquery/sizzle/pull/225 + sortInput = null; + + return results; +}; + +/** + * Utility function for retrieving the text value of an array of DOM nodes + * @param {Array|Element} elem + */ +getText = Sizzle.getText = function( elem ) { + var node, + ret = "", + i = 0, + nodeType = elem.nodeType; + + if ( !nodeType ) { + + // If no nodeType, this is expected to be an array + while ( ( node = elem[ i++ ] ) ) { + + // Do not traverse comment nodes + ret += getText( node ); + } + } else if ( nodeType === 1 || nodeType === 9 || nodeType === 11 ) { + + // Use textContent for elements + // innerText usage removed for consistency of new lines (jQuery #11153) + if ( typeof elem.textContent === "string" ) { + return elem.textContent; + } else { + + // Traverse its children + for ( elem = elem.firstChild; elem; elem = elem.nextSibling ) { + ret += getText( elem ); + } + } + } else if ( nodeType === 3 || nodeType === 4 ) { + return elem.nodeValue; + } + + // Do not include comment or processing instruction nodes + + return ret; +}; + +Expr = Sizzle.selectors = { + + // Can be adjusted by the user + cacheLength: 50, + + createPseudo: markFunction, + + match: matchExpr, + + attrHandle: {}, + + find: {}, + + relative: { + ">": { dir: "parentNode", first: true }, + " ": { dir: "parentNode" }, + "+": { dir: "previousSibling", first: true }, + "~": { dir: "previousSibling" } + }, + + preFilter: { + "ATTR": function( match ) { + match[ 1 ] = match[ 1 ].replace( runescape, funescape ); + + // Move the given value to match[3] whether quoted or unquoted + match[ 3 ] = ( match[ 3 ] || match[ 4 ] || + match[ 5 ] || "" ).replace( runescape, funescape ); + + if ( match[ 2 ] === "~=" ) { + match[ 3 ] = " " + match[ 3 ] + " "; + } + + return match.slice( 0, 4 ); + }, + + "CHILD": function( match ) { + + /* matches from matchExpr["CHILD"] + 1 type (only|nth|...) + 2 what (child|of-type) + 3 argument (even|odd|\d*|\d*n([+-]\d+)?|...) + 4 xn-component of xn+y argument ([+-]?\d*n|) + 5 sign of xn-component + 6 x of xn-component + 7 sign of y-component + 8 y of y-component + */ + match[ 1 ] = match[ 1 ].toLowerCase(); + + if ( match[ 1 ].slice( 0, 3 ) === "nth" ) { + + // nth-* requires argument + if ( !match[ 3 ] ) { + Sizzle.error( match[ 0 ] ); + } + + // numeric x and y parameters for Expr.filter.CHILD + // remember that false/true cast respectively to 0/1 + match[ 4 ] = +( match[ 4 ] ? 
+ match[ 5 ] + ( match[ 6 ] || 1 ) : + 2 * ( match[ 3 ] === "even" || match[ 3 ] === "odd" ) ); + match[ 5 ] = +( ( match[ 7 ] + match[ 8 ] ) || match[ 3 ] === "odd" ); + + // other types prohibit arguments + } else if ( match[ 3 ] ) { + Sizzle.error( match[ 0 ] ); + } + + return match; + }, + + "PSEUDO": function( match ) { + var excess, + unquoted = !match[ 6 ] && match[ 2 ]; + + if ( matchExpr[ "CHILD" ].test( match[ 0 ] ) ) { + return null; + } + + // Accept quoted arguments as-is + if ( match[ 3 ] ) { + match[ 2 ] = match[ 4 ] || match[ 5 ] || ""; + + // Strip excess characters from unquoted arguments + } else if ( unquoted && rpseudo.test( unquoted ) && + + // Get excess from tokenize (recursively) + ( excess = tokenize( unquoted, true ) ) && + + // advance to the next closing parenthesis + ( excess = unquoted.indexOf( ")", unquoted.length - excess ) - unquoted.length ) ) { + + // excess is a negative index + match[ 0 ] = match[ 0 ].slice( 0, excess ); + match[ 2 ] = unquoted.slice( 0, excess ); + } + + // Return only captures needed by the pseudo filter method (type and argument) + return match.slice( 0, 3 ); + } + }, + + filter: { + + "TAG": function( nodeNameSelector ) { + var nodeName = nodeNameSelector.replace( runescape, funescape ).toLowerCase(); + return nodeNameSelector === "*" ? + function() { + return true; + } : + function( elem ) { + return elem.nodeName && elem.nodeName.toLowerCase() === nodeName; + }; + }, + + "CLASS": function( className ) { + var pattern = classCache[ className + " " ]; + + return pattern || + ( pattern = new RegExp( "(^|" + whitespace + + ")" + className + "(" + whitespace + "|$)" ) ) && classCache( + className, function( elem ) { + return pattern.test( + typeof elem.className === "string" && elem.className || + typeof elem.getAttribute !== "undefined" && + elem.getAttribute( "class" ) || + "" + ); + } ); + }, + + "ATTR": function( name, operator, check ) { + return function( elem ) { + var result = Sizzle.attr( elem, name ); + + if ( result == null ) { + return operator === "!="; + } + if ( !operator ) { + return true; + } + + result += ""; + + /* eslint-disable max-len */ + + return operator === "=" ? result === check : + operator === "!=" ? result !== check : + operator === "^=" ? check && result.indexOf( check ) === 0 : + operator === "*=" ? check && result.indexOf( check ) > -1 : + operator === "$=" ? check && result.slice( -check.length ) === check : + operator === "~=" ? ( " " + result.replace( rwhitespace, " " ) + " " ).indexOf( check ) > -1 : + operator === "|=" ? result === check || result.slice( 0, check.length + 1 ) === check + "-" : + false; + /* eslint-enable max-len */ + + }; + }, + + "CHILD": function( type, what, _argument, first, last ) { + var simple = type.slice( 0, 3 ) !== "nth", + forward = type.slice( -4 ) !== "last", + ofType = what === "of-type"; + + return first === 1 && last === 0 ? + + // Shortcut for :nth-*(n) + function( elem ) { + return !!elem.parentNode; + } : + + function( elem, _context, xml ) { + var cache, uniqueCache, outerCache, node, nodeIndex, start, + dir = simple !== forward ? "nextSibling" : "previousSibling", + parent = elem.parentNode, + name = ofType && elem.nodeName.toLowerCase(), + useCache = !xml && !ofType, + diff = false; + + if ( parent ) { + + // :(first|last|only)-(child|of-type) + if ( simple ) { + while ( dir ) { + node = elem; + while ( ( node = node[ dir ] ) ) { + if ( ofType ? 
+ node.nodeName.toLowerCase() === name : + node.nodeType === 1 ) { + + return false; + } + } + + // Reverse direction for :only-* (if we haven't yet done so) + start = dir = type === "only" && !start && "nextSibling"; + } + return true; + } + + start = [ forward ? parent.firstChild : parent.lastChild ]; + + // non-xml :nth-child(...) stores cache data on `parent` + if ( forward && useCache ) { + + // Seek `elem` from a previously-cached index + + // ...in a gzip-friendly way + node = parent; + outerCache = node[ expando ] || ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + cache = uniqueCache[ type ] || []; + nodeIndex = cache[ 0 ] === dirruns && cache[ 1 ]; + diff = nodeIndex && cache[ 2 ]; + node = nodeIndex && parent.childNodes[ nodeIndex ]; + + while ( ( node = ++nodeIndex && node && node[ dir ] || + + // Fallback to seeking `elem` from the start + ( diff = nodeIndex = 0 ) || start.pop() ) ) { + + // When found, cache indexes on `parent` and break + if ( node.nodeType === 1 && ++diff && node === elem ) { + uniqueCache[ type ] = [ dirruns, nodeIndex, diff ]; + break; + } + } + + } else { + + // Use previously-cached element index if available + if ( useCache ) { + + // ...in a gzip-friendly way + node = elem; + outerCache = node[ expando ] || ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + cache = uniqueCache[ type ] || []; + nodeIndex = cache[ 0 ] === dirruns && cache[ 1 ]; + diff = nodeIndex; + } + + // xml :nth-child(...) + // or :nth-last-child(...) or :nth(-last)?-of-type(...) + if ( diff === false ) { + + // Use the same loop as above to seek `elem` from the start + while ( ( node = ++nodeIndex && node && node[ dir ] || + ( diff = nodeIndex = 0 ) || start.pop() ) ) { + + if ( ( ofType ? + node.nodeName.toLowerCase() === name : + node.nodeType === 1 ) && + ++diff ) { + + // Cache the index of each encountered element + if ( useCache ) { + outerCache = node[ expando ] || + ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + uniqueCache[ type ] = [ dirruns, diff ]; + } + + if ( node === elem ) { + break; + } + } + } + } + } + + // Incorporate the offset, then check against cycle size + diff -= last; + return diff === first || ( diff % first === 0 && diff / first >= 0 ); + } + }; + }, + + "PSEUDO": function( pseudo, argument ) { + + // pseudo-class names are case-insensitive + // http://www.w3.org/TR/selectors/#pseudo-classes + // Prioritize by case sensitivity in case custom pseudos are added with uppercase letters + // Remember that setFilters inherits from pseudos + var args, + fn = Expr.pseudos[ pseudo ] || Expr.setFilters[ pseudo.toLowerCase() ] || + Sizzle.error( "unsupported pseudo: " + pseudo ); + + // The user may use createPseudo to indicate that + // arguments are needed to create the filter function + // just as Sizzle does + if ( fn[ expando ] ) { + return fn( argument ); + } + + // But maintain support for old signatures + if ( fn.length > 1 ) { + args = [ pseudo, pseudo, "", argument ]; + return Expr.setFilters.hasOwnProperty( pseudo.toLowerCase() ) ? 
+ markFunction( function( seed, matches ) { + var idx, + matched = fn( seed, argument ), + i = matched.length; + while ( i-- ) { + idx = indexOf( seed, matched[ i ] ); + seed[ idx ] = !( matches[ idx ] = matched[ i ] ); + } + } ) : + function( elem ) { + return fn( elem, 0, args ); + }; + } + + return fn; + } + }, + + pseudos: { + + // Potentially complex pseudos + "not": markFunction( function( selector ) { + + // Trim the selector passed to compile + // to avoid treating leading and trailing + // spaces as combinators + var input = [], + results = [], + matcher = compile( selector.replace( rtrim, "$1" ) ); + + return matcher[ expando ] ? + markFunction( function( seed, matches, _context, xml ) { + var elem, + unmatched = matcher( seed, null, xml, [] ), + i = seed.length; + + // Match elements unmatched by `matcher` + while ( i-- ) { + if ( ( elem = unmatched[ i ] ) ) { + seed[ i ] = !( matches[ i ] = elem ); + } + } + } ) : + function( elem, _context, xml ) { + input[ 0 ] = elem; + matcher( input, null, xml, results ); + + // Don't keep the element (issue #299) + input[ 0 ] = null; + return !results.pop(); + }; + } ), + + "has": markFunction( function( selector ) { + return function( elem ) { + return Sizzle( selector, elem ).length > 0; + }; + } ), + + "contains": markFunction( function( text ) { + text = text.replace( runescape, funescape ); + return function( elem ) { + return ( elem.textContent || getText( elem ) ).indexOf( text ) > -1; + }; + } ), + + // "Whether an element is represented by a :lang() selector + // is based solely on the element's language value + // being equal to the identifier C, + // or beginning with the identifier C immediately followed by "-". + // The matching of C against the element's language value is performed case-insensitively. + // The identifier C does not have to be a valid language name." + // http://www.w3.org/TR/selectors/#lang-pseudo + "lang": markFunction( function( lang ) { + + // lang value must be a valid identifier + if ( !ridentifier.test( lang || "" ) ) { + Sizzle.error( "unsupported lang: " + lang ); + } + lang = lang.replace( runescape, funescape ).toLowerCase(); + return function( elem ) { + var elemLang; + do { + if ( ( elemLang = documentIsHTML ? 
+ elem.lang : + elem.getAttribute( "xml:lang" ) || elem.getAttribute( "lang" ) ) ) { + + elemLang = elemLang.toLowerCase(); + return elemLang === lang || elemLang.indexOf( lang + "-" ) === 0; + } + } while ( ( elem = elem.parentNode ) && elem.nodeType === 1 ); + return false; + }; + } ), + + // Miscellaneous + "target": function( elem ) { + var hash = window.location && window.location.hash; + return hash && hash.slice( 1 ) === elem.id; + }, + + "root": function( elem ) { + return elem === docElem; + }, + + "focus": function( elem ) { + return elem === document.activeElement && + ( !document.hasFocus || document.hasFocus() ) && + !!( elem.type || elem.href || ~elem.tabIndex ); + }, + + // Boolean properties + "enabled": createDisabledPseudo( false ), + "disabled": createDisabledPseudo( true ), + + "checked": function( elem ) { + + // In CSS3, :checked should return both checked and selected elements + // http://www.w3.org/TR/2011/REC-css3-selectors-20110929/#checked + var nodeName = elem.nodeName.toLowerCase(); + return ( nodeName === "input" && !!elem.checked ) || + ( nodeName === "option" && !!elem.selected ); + }, + + "selected": function( elem ) { + + // Accessing this property makes selected-by-default + // options in Safari work properly + if ( elem.parentNode ) { + // eslint-disable-next-line no-unused-expressions + elem.parentNode.selectedIndex; + } + + return elem.selected === true; + }, + + // Contents + "empty": function( elem ) { + + // http://www.w3.org/TR/selectors/#empty-pseudo + // :empty is negated by element (1) or content nodes (text: 3; cdata: 4; entity ref: 5), + // but not by others (comment: 8; processing instruction: 7; etc.) + // nodeType < 6 works because attributes (2) do not appear as children + for ( elem = elem.firstChild; elem; elem = elem.nextSibling ) { + if ( elem.nodeType < 6 ) { + return false; + } + } + return true; + }, + + "parent": function( elem ) { + return !Expr.pseudos[ "empty" ]( elem ); + }, + + // Element/input types + "header": function( elem ) { + return rheader.test( elem.nodeName ); + }, + + "input": function( elem ) { + return rinputs.test( elem.nodeName ); + }, + + "button": function( elem ) { + var name = elem.nodeName.toLowerCase(); + return name === "input" && elem.type === "button" || name === "button"; + }, + + "text": function( elem ) { + var attr; + return elem.nodeName.toLowerCase() === "input" && + elem.type === "text" && + + // Support: IE<8 + // New HTML5 attribute values (e.g., "search") appear with elem.type === "text" + ( ( attr = elem.getAttribute( "type" ) ) == null || + attr.toLowerCase() === "text" ); + }, + + // Position-in-collection + "first": createPositionalPseudo( function() { + return [ 0 ]; + } ), + + "last": createPositionalPseudo( function( _matchIndexes, length ) { + return [ length - 1 ]; + } ), + + "eq": createPositionalPseudo( function( _matchIndexes, length, argument ) { + return [ argument < 0 ? argument + length : argument ]; + } ), + + "even": createPositionalPseudo( function( matchIndexes, length ) { + var i = 0; + for ( ; i < length; i += 2 ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "odd": createPositionalPseudo( function( matchIndexes, length ) { + var i = 1; + for ( ; i < length; i += 2 ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "lt": createPositionalPseudo( function( matchIndexes, length, argument ) { + var i = argument < 0 ? + argument + length : + argument > length ? 
+ length : + argument; + for ( ; --i >= 0; ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "gt": createPositionalPseudo( function( matchIndexes, length, argument ) { + var i = argument < 0 ? argument + length : argument; + for ( ; ++i < length; ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ) + } +}; + +Expr.pseudos[ "nth" ] = Expr.pseudos[ "eq" ]; + +// Add button/input type pseudos +for ( i in { radio: true, checkbox: true, file: true, password: true, image: true } ) { + Expr.pseudos[ i ] = createInputPseudo( i ); +} +for ( i in { submit: true, reset: true } ) { + Expr.pseudos[ i ] = createButtonPseudo( i ); +} + +// Easy API for creating new setFilters +function setFilters() {} +setFilters.prototype = Expr.filters = Expr.pseudos; +Expr.setFilters = new setFilters(); + +tokenize = Sizzle.tokenize = function( selector, parseOnly ) { + var matched, match, tokens, type, + soFar, groups, preFilters, + cached = tokenCache[ selector + " " ]; + + if ( cached ) { + return parseOnly ? 0 : cached.slice( 0 ); + } + + soFar = selector; + groups = []; + preFilters = Expr.preFilter; + + while ( soFar ) { + + // Comma and first run + if ( !matched || ( match = rcomma.exec( soFar ) ) ) { + if ( match ) { + + // Don't consume trailing commas as valid + soFar = soFar.slice( match[ 0 ].length ) || soFar; + } + groups.push( ( tokens = [] ) ); + } + + matched = false; + + // Combinators + if ( ( match = rcombinators.exec( soFar ) ) ) { + matched = match.shift(); + tokens.push( { + value: matched, + + // Cast descendant combinators to space + type: match[ 0 ].replace( rtrim, " " ) + } ); + soFar = soFar.slice( matched.length ); + } + + // Filters + for ( type in Expr.filter ) { + if ( ( match = matchExpr[ type ].exec( soFar ) ) && ( !preFilters[ type ] || + ( match = preFilters[ type ]( match ) ) ) ) { + matched = match.shift(); + tokens.push( { + value: matched, + type: type, + matches: match + } ); + soFar = soFar.slice( matched.length ); + } + } + + if ( !matched ) { + break; + } + } + + // Return the length of the invalid excess + // if we're just parsing + // Otherwise, throw an error or return tokens + return parseOnly ? + soFar.length : + soFar ? + Sizzle.error( selector ) : + + // Cache the tokens + tokenCache( selector, groups ).slice( 0 ); +}; + +function toSelector( tokens ) { + var i = 0, + len = tokens.length, + selector = ""; + for ( ; i < len; i++ ) { + selector += tokens[ i ].value; + } + return selector; +} + +function addCombinator( matcher, combinator, base ) { + var dir = combinator.dir, + skip = combinator.next, + key = skip || dir, + checkNonElements = base && key === "parentNode", + doneName = done++; + + return combinator.first ? 
+ + // Check against closest ancestor/preceding element + function( elem, context, xml ) { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + return matcher( elem, context, xml ); + } + } + return false; + } : + + // Check against all ancestor/preceding elements + function( elem, context, xml ) { + var oldCache, uniqueCache, outerCache, + newCache = [ dirruns, doneName ]; + + // We can't set arbitrary data on XML nodes, so they don't benefit from combinator caching + if ( xml ) { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + if ( matcher( elem, context, xml ) ) { + return true; + } + } + } + } else { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + outerCache = elem[ expando ] || ( elem[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ elem.uniqueID ] || + ( outerCache[ elem.uniqueID ] = {} ); + + if ( skip && skip === elem.nodeName.toLowerCase() ) { + elem = elem[ dir ] || elem; + } else if ( ( oldCache = uniqueCache[ key ] ) && + oldCache[ 0 ] === dirruns && oldCache[ 1 ] === doneName ) { + + // Assign to newCache so results back-propagate to previous elements + return ( newCache[ 2 ] = oldCache[ 2 ] ); + } else { + + // Reuse newcache so results back-propagate to previous elements + uniqueCache[ key ] = newCache; + + // A match means we're done; a fail means we have to keep checking + if ( ( newCache[ 2 ] = matcher( elem, context, xml ) ) ) { + return true; + } + } + } + } + } + return false; + }; +} + +function elementMatcher( matchers ) { + return matchers.length > 1 ? + function( elem, context, xml ) { + var i = matchers.length; + while ( i-- ) { + if ( !matchers[ i ]( elem, context, xml ) ) { + return false; + } + } + return true; + } : + matchers[ 0 ]; +} + +function multipleContexts( selector, contexts, results ) { + var i = 0, + len = contexts.length; + for ( ; i < len; i++ ) { + Sizzle( selector, contexts[ i ], results ); + } + return results; +} + +function condense( unmatched, map, filter, context, xml ) { + var elem, + newUnmatched = [], + i = 0, + len = unmatched.length, + mapped = map != null; + + for ( ; i < len; i++ ) { + if ( ( elem = unmatched[ i ] ) ) { + if ( !filter || filter( elem, context, xml ) ) { + newUnmatched.push( elem ); + if ( mapped ) { + map.push( i ); + } + } + } + } + + return newUnmatched; +} + +function setMatcher( preFilter, selector, matcher, postFilter, postFinder, postSelector ) { + if ( postFilter && !postFilter[ expando ] ) { + postFilter = setMatcher( postFilter ); + } + if ( postFinder && !postFinder[ expando ] ) { + postFinder = setMatcher( postFinder, postSelector ); + } + return markFunction( function( seed, results, context, xml ) { + var temp, i, elem, + preMap = [], + postMap = [], + preexisting = results.length, + + // Get initial elements from seed or context + elems = seed || multipleContexts( + selector || "*", + context.nodeType ? [ context ] : context, + [] + ), + + // Prefilter to get matcher input, preserving a map for seed-results synchronization + matcherIn = preFilter && ( seed || !selector ) ? + condense( elems, preMap, preFilter, context, xml ) : + elems, + + matcherOut = matcher ? + + // If we have a postFinder, or filtered seed, or non-seed postFilter or preexisting results, + postFinder || ( seed ? preFilter : preexisting || postFilter ) ? 
+ + // ...intermediate processing is necessary + [] : + + // ...otherwise use results directly + results : + matcherIn; + + // Find primary matches + if ( matcher ) { + matcher( matcherIn, matcherOut, context, xml ); + } + + // Apply postFilter + if ( postFilter ) { + temp = condense( matcherOut, postMap ); + postFilter( temp, [], context, xml ); + + // Un-match failing elements by moving them back to matcherIn + i = temp.length; + while ( i-- ) { + if ( ( elem = temp[ i ] ) ) { + matcherOut[ postMap[ i ] ] = !( matcherIn[ postMap[ i ] ] = elem ); + } + } + } + + if ( seed ) { + if ( postFinder || preFilter ) { + if ( postFinder ) { + + // Get the final matcherOut by condensing this intermediate into postFinder contexts + temp = []; + i = matcherOut.length; + while ( i-- ) { + if ( ( elem = matcherOut[ i ] ) ) { + + // Restore matcherIn since elem is not yet a final match + temp.push( ( matcherIn[ i ] = elem ) ); + } + } + postFinder( null, ( matcherOut = [] ), temp, xml ); + } + + // Move matched elements from seed to results to keep them synchronized + i = matcherOut.length; + while ( i-- ) { + if ( ( elem = matcherOut[ i ] ) && + ( temp = postFinder ? indexOf( seed, elem ) : preMap[ i ] ) > -1 ) { + + seed[ temp ] = !( results[ temp ] = elem ); + } + } + } + + // Add elements to results, through postFinder if defined + } else { + matcherOut = condense( + matcherOut === results ? + matcherOut.splice( preexisting, matcherOut.length ) : + matcherOut + ); + if ( postFinder ) { + postFinder( null, results, matcherOut, xml ); + } else { + push.apply( results, matcherOut ); + } + } + } ); +} + +function matcherFromTokens( tokens ) { + var checkContext, matcher, j, + len = tokens.length, + leadingRelative = Expr.relative[ tokens[ 0 ].type ], + implicitRelative = leadingRelative || Expr.relative[ " " ], + i = leadingRelative ? 1 : 0, + + // The foundational matcher ensures that elements are reachable from top-level context(s) + matchContext = addCombinator( function( elem ) { + return elem === checkContext; + }, implicitRelative, true ), + matchAnyContext = addCombinator( function( elem ) { + return indexOf( checkContext, elem ) > -1; + }, implicitRelative, true ), + matchers = [ function( elem, context, xml ) { + var ret = ( !leadingRelative && ( xml || context !== outermostContext ) ) || ( + ( checkContext = context ).nodeType ? + matchContext( elem, context, xml ) : + matchAnyContext( elem, context, xml ) ); + + // Avoid hanging onto element (issue #299) + checkContext = null; + return ret; + } ]; + + for ( ; i < len; i++ ) { + if ( ( matcher = Expr.relative[ tokens[ i ].type ] ) ) { + matchers = [ addCombinator( elementMatcher( matchers ), matcher ) ]; + } else { + matcher = Expr.filter[ tokens[ i ].type ].apply( null, tokens[ i ].matches ); + + // Return special upon seeing a positional matcher + if ( matcher[ expando ] ) { + + // Find the next relative operator (if any) for proper handling + j = ++i; + for ( ; j < len; j++ ) { + if ( Expr.relative[ tokens[ j ].type ] ) { + break; + } + } + return setMatcher( + i > 1 && elementMatcher( matchers ), + i > 1 && toSelector( + + // If the preceding token was a descendant combinator, insert an implicit any-element `*` + tokens + .slice( 0, i - 1 ) + .concat( { value: tokens[ i - 2 ].type === " " ? 
"*" : "" } ) + ).replace( rtrim, "$1" ), + matcher, + i < j && matcherFromTokens( tokens.slice( i, j ) ), + j < len && matcherFromTokens( ( tokens = tokens.slice( j ) ) ), + j < len && toSelector( tokens ) + ); + } + matchers.push( matcher ); + } + } + + return elementMatcher( matchers ); +} + +function matcherFromGroupMatchers( elementMatchers, setMatchers ) { + var bySet = setMatchers.length > 0, + byElement = elementMatchers.length > 0, + superMatcher = function( seed, context, xml, results, outermost ) { + var elem, j, matcher, + matchedCount = 0, + i = "0", + unmatched = seed && [], + setMatched = [], + contextBackup = outermostContext, + + // We must always have either seed elements or outermost context + elems = seed || byElement && Expr.find[ "TAG" ]( "*", outermost ), + + // Use integer dirruns iff this is the outermost matcher + dirrunsUnique = ( dirruns += contextBackup == null ? 1 : Math.random() || 0.1 ), + len = elems.length; + + if ( outermost ) { + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + outermostContext = context == document || context || outermost; + } + + // Add elements passing elementMatchers directly to results + // Support: IE<9, Safari + // Tolerate NodeList properties (IE: "length"; Safari: ) matching elements by id + for ( ; i !== len && ( elem = elems[ i ] ) != null; i++ ) { + if ( byElement && elem ) { + j = 0; + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( !context && elem.ownerDocument != document ) { + setDocument( elem ); + xml = !documentIsHTML; + } + while ( ( matcher = elementMatchers[ j++ ] ) ) { + if ( matcher( elem, context || document, xml ) ) { + results.push( elem ); + break; + } + } + if ( outermost ) { + dirruns = dirrunsUnique; + } + } + + // Track unmatched elements for set filters + if ( bySet ) { + + // They will have gone through all possible matchers + if ( ( elem = !matcher && elem ) ) { + matchedCount--; + } + + // Lengthen the array for every element, matched or not + if ( seed ) { + unmatched.push( elem ); + } + } + } + + // `i` is now the count of elements visited above, and adding it to `matchedCount` + // makes the latter nonnegative. + matchedCount += i; + + // Apply set filters to unmatched elements + // NOTE: This can be skipped if there are no unmatched elements (i.e., `matchedCount` + // equals `i`), unless we didn't visit _any_ elements in the above loop because we have + // no element matchers and no seed. + // Incrementing an initially-string "0" `i` allows `i` to remain a string only in that + // case, which will result in a "00" `matchedCount` that differs from `i` but is also + // numerically zero. 
+ if ( bySet && i !== matchedCount ) { + j = 0; + while ( ( matcher = setMatchers[ j++ ] ) ) { + matcher( unmatched, setMatched, context, xml ); + } + + if ( seed ) { + + // Reintegrate element matches to eliminate the need for sorting + if ( matchedCount > 0 ) { + while ( i-- ) { + if ( !( unmatched[ i ] || setMatched[ i ] ) ) { + setMatched[ i ] = pop.call( results ); + } + } + } + + // Discard index placeholder values to get only actual matches + setMatched = condense( setMatched ); + } + + // Add matches to results + push.apply( results, setMatched ); + + // Seedless set matches succeeding multiple successful matchers stipulate sorting + if ( outermost && !seed && setMatched.length > 0 && + ( matchedCount + setMatchers.length ) > 1 ) { + + Sizzle.uniqueSort( results ); + } + } + + // Override manipulation of globals by nested matchers + if ( outermost ) { + dirruns = dirrunsUnique; + outermostContext = contextBackup; + } + + return unmatched; + }; + + return bySet ? + markFunction( superMatcher ) : + superMatcher; +} + +compile = Sizzle.compile = function( selector, match /* Internal Use Only */ ) { + var i, + setMatchers = [], + elementMatchers = [], + cached = compilerCache[ selector + " " ]; + + if ( !cached ) { + + // Generate a function of recursive functions that can be used to check each element + if ( !match ) { + match = tokenize( selector ); + } + i = match.length; + while ( i-- ) { + cached = matcherFromTokens( match[ i ] ); + if ( cached[ expando ] ) { + setMatchers.push( cached ); + } else { + elementMatchers.push( cached ); + } + } + + // Cache the compiled function + cached = compilerCache( + selector, + matcherFromGroupMatchers( elementMatchers, setMatchers ) + ); + + // Save selector and tokenization + cached.selector = selector; + } + return cached; +}; + +/** + * A low-level selection function that works with Sizzle's compiled + * selector functions + * @param {String|Function} selector A selector or a pre-compiled + * selector function built with Sizzle.compile + * @param {Element} context + * @param {Array} [results] + * @param {Array} [seed] A set of elements to match against + */ +select = Sizzle.select = function( selector, context, results, seed ) { + var i, tokens, token, type, find, + compiled = typeof selector === "function" && selector, + match = !seed && tokenize( ( selector = compiled.selector || selector ) ); + + results = results || []; + + // Try to minimize operations if there is only one selector in the list and no seed + // (the latter of which guarantees us context) + if ( match.length === 1 ) { + + // Reduce context if the leading compound selector is an ID + tokens = match[ 0 ] = match[ 0 ].slice( 0 ); + if ( tokens.length > 2 && ( token = tokens[ 0 ] ).type === "ID" && + context.nodeType === 9 && documentIsHTML && Expr.relative[ tokens[ 1 ].type ] ) { + + context = ( Expr.find[ "ID" ]( token.matches[ 0 ] + .replace( runescape, funescape ), context ) || [] )[ 0 ]; + if ( !context ) { + return results; + + // Precompiled matchers will still verify ancestry, so step up a level + } else if ( compiled ) { + context = context.parentNode; + } + + selector = selector.slice( tokens.shift().value.length ); + } + + // Fetch a seed set for right-to-left matching + i = matchExpr[ "needsContext" ].test( selector ) ? 
0 : tokens.length; + while ( i-- ) { + token = tokens[ i ]; + + // Abort if we hit a combinator + if ( Expr.relative[ ( type = token.type ) ] ) { + break; + } + if ( ( find = Expr.find[ type ] ) ) { + + // Search, expanding context for leading sibling combinators + if ( ( seed = find( + token.matches[ 0 ].replace( runescape, funescape ), + rsibling.test( tokens[ 0 ].type ) && testContext( context.parentNode ) || + context + ) ) ) { + + // If seed is empty or no tokens remain, we can return early + tokens.splice( i, 1 ); + selector = seed.length && toSelector( tokens ); + if ( !selector ) { + push.apply( results, seed ); + return results; + } + + break; + } + } + } + } + + // Compile and execute a filtering function if one is not provided + // Provide `match` to avoid retokenization if we modified the selector above + ( compiled || compile( selector, match ) )( + seed, + context, + !documentIsHTML, + results, + !context || rsibling.test( selector ) && testContext( context.parentNode ) || context + ); + return results; +}; + +// One-time assignments + +// Sort stability +support.sortStable = expando.split( "" ).sort( sortOrder ).join( "" ) === expando; + +// Support: Chrome 14-35+ +// Always assume duplicates if they aren't passed to the comparison function +support.detectDuplicates = !!hasDuplicate; + +// Initialize against the default document +setDocument(); + +// Support: Webkit<537.32 - Safari 6.0.3/Chrome 25 (fixed in Chrome 27) +// Detached nodes confoundingly follow *each other* +support.sortDetached = assert( function( el ) { + + // Should return 1, but returns 4 (following) + return el.compareDocumentPosition( document.createElement( "fieldset" ) ) & 1; +} ); + +// Support: IE<8 +// Prevent attribute/property "interpolation" +// https://msdn.microsoft.com/en-us/library/ms536429%28VS.85%29.aspx +if ( !assert( function( el ) { + el.innerHTML = ""; + return el.firstChild.getAttribute( "href" ) === "#"; +} ) ) { + addHandle( "type|href|height|width", function( elem, name, isXML ) { + if ( !isXML ) { + return elem.getAttribute( name, name.toLowerCase() === "type" ? 1 : 2 ); + } + } ); +} + +// Support: IE<9 +// Use defaultValue in place of getAttribute("value") +if ( !support.attributes || !assert( function( el ) { + el.innerHTML = ""; + el.firstChild.setAttribute( "value", "" ); + return el.firstChild.getAttribute( "value" ) === ""; +} ) ) { + addHandle( "value", function( elem, _name, isXML ) { + if ( !isXML && elem.nodeName.toLowerCase() === "input" ) { + return elem.defaultValue; + } + } ); +} + +// Support: IE<9 +// Use getAttributeNode to fetch booleans when getAttribute lies +if ( !assert( function( el ) { + return el.getAttribute( "disabled" ) == null; +} ) ) { + addHandle( booleans, function( elem, name, isXML ) { + var val; + if ( !isXML ) { + return elem[ name ] === true ? name.toLowerCase() : + ( val = elem.getAttributeNode( name ) ) && val.specified ? 
+ val.value : + null; + } + } ); +} + +return Sizzle; + +} )( window ); + + + +jQuery.find = Sizzle; +jQuery.expr = Sizzle.selectors; + +// Deprecated +jQuery.expr[ ":" ] = jQuery.expr.pseudos; +jQuery.uniqueSort = jQuery.unique = Sizzle.uniqueSort; +jQuery.text = Sizzle.getText; +jQuery.isXMLDoc = Sizzle.isXML; +jQuery.contains = Sizzle.contains; +jQuery.escapeSelector = Sizzle.escape; + + + + +var dir = function( elem, dir, until ) { + var matched = [], + truncate = until !== undefined; + + while ( ( elem = elem[ dir ] ) && elem.nodeType !== 9 ) { + if ( elem.nodeType === 1 ) { + if ( truncate && jQuery( elem ).is( until ) ) { + break; + } + matched.push( elem ); + } + } + return matched; +}; + + +var siblings = function( n, elem ) { + var matched = []; + + for ( ; n; n = n.nextSibling ) { + if ( n.nodeType === 1 && n !== elem ) { + matched.push( n ); + } + } + + return matched; +}; + + +var rneedsContext = jQuery.expr.match.needsContext; + + + +function nodeName( elem, name ) { + + return elem.nodeName && elem.nodeName.toLowerCase() === name.toLowerCase(); + +}; +var rsingleTag = ( /^<([a-z][^\/\0>:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i ); + + + +// Implement the identical functionality for filter and not +function winnow( elements, qualifier, not ) { + if ( isFunction( qualifier ) ) { + return jQuery.grep( elements, function( elem, i ) { + return !!qualifier.call( elem, i, elem ) !== not; + } ); + } + + // Single element + if ( qualifier.nodeType ) { + return jQuery.grep( elements, function( elem ) { + return ( elem === qualifier ) !== not; + } ); + } + + // Arraylike of elements (jQuery, arguments, Array) + if ( typeof qualifier !== "string" ) { + return jQuery.grep( elements, function( elem ) { + return ( indexOf.call( qualifier, elem ) > -1 ) !== not; + } ); + } + + // Filtered directly for both simple and complex selectors + return jQuery.filter( qualifier, elements, not ); +} + +jQuery.filter = function( expr, elems, not ) { + var elem = elems[ 0 ]; + + if ( not ) { + expr = ":not(" + expr + ")"; + } + + if ( elems.length === 1 && elem.nodeType === 1 ) { + return jQuery.find.matchesSelector( elem, expr ) ? [ elem ] : []; + } + + return jQuery.find.matches( expr, jQuery.grep( elems, function( elem ) { + return elem.nodeType === 1; + } ) ); +}; + +jQuery.fn.extend( { + find: function( selector ) { + var i, ret, + len = this.length, + self = this; + + if ( typeof selector !== "string" ) { + return this.pushStack( jQuery( selector ).filter( function() { + for ( i = 0; i < len; i++ ) { + if ( jQuery.contains( self[ i ], this ) ) { + return true; + } + } + } ) ); + } + + ret = this.pushStack( [] ); + + for ( i = 0; i < len; i++ ) { + jQuery.find( selector, self[ i ], ret ); + } + + return len > 1 ? jQuery.uniqueSort( ret ) : ret; + }, + filter: function( selector ) { + return this.pushStack( winnow( this, selector || [], false ) ); + }, + not: function( selector ) { + return this.pushStack( winnow( this, selector || [], true ) ); + }, + is: function( selector ) { + return !!winnow( + this, + + // If this is a positional/relative selector, check membership in the returned set + // so $("p:first").is("p:last") won't return true for a doc with two "p". + typeof selector === "string" && rneedsContext.test( selector ) ? 
+ jQuery( selector ) : + selector || [], + false + ).length; + } +} ); + + +// Initialize a jQuery object + + +// A central reference to the root jQuery(document) +var rootjQuery, + + // A simple way to check for HTML strings + // Prioritize #id over to avoid XSS via location.hash (#9521) + // Strict HTML recognition (#11290: must start with <) + // Shortcut simple #id case for speed + rquickExpr = /^(?:\s*(<[\w\W]+>)[^>]*|#([\w-]+))$/, + + init = jQuery.fn.init = function( selector, context, root ) { + var match, elem; + + // HANDLE: $(""), $(null), $(undefined), $(false) + if ( !selector ) { + return this; + } + + // Method init() accepts an alternate rootjQuery + // so migrate can support jQuery.sub (gh-2101) + root = root || rootjQuery; + + // Handle HTML strings + if ( typeof selector === "string" ) { + if ( selector[ 0 ] === "<" && + selector[ selector.length - 1 ] === ">" && + selector.length >= 3 ) { + + // Assume that strings that start and end with <> are HTML and skip the regex check + match = [ null, selector, null ]; + + } else { + match = rquickExpr.exec( selector ); + } + + // Match html or make sure no context is specified for #id + if ( match && ( match[ 1 ] || !context ) ) { + + // HANDLE: $(html) -> $(array) + if ( match[ 1 ] ) { + context = context instanceof jQuery ? context[ 0 ] : context; + + // Option to run scripts is true for back-compat + // Intentionally let the error be thrown if parseHTML is not present + jQuery.merge( this, jQuery.parseHTML( + match[ 1 ], + context && context.nodeType ? context.ownerDocument || context : document, + true + ) ); + + // HANDLE: $(html, props) + if ( rsingleTag.test( match[ 1 ] ) && jQuery.isPlainObject( context ) ) { + for ( match in context ) { + + // Properties of context are called as methods if possible + if ( isFunction( this[ match ] ) ) { + this[ match ]( context[ match ] ); + + // ...and otherwise set as attributes + } else { + this.attr( match, context[ match ] ); + } + } + } + + return this; + + // HANDLE: $(#id) + } else { + elem = document.getElementById( match[ 2 ] ); + + if ( elem ) { + + // Inject the element directly into the jQuery object + this[ 0 ] = elem; + this.length = 1; + } + return this; + } + + // HANDLE: $(expr, $(...)) + } else if ( !context || context.jquery ) { + return ( context || root ).find( selector ); + + // HANDLE: $(expr, context) + // (which is just equivalent to: $(context).find(expr) + } else { + return this.constructor( context ).find( selector ); + } + + // HANDLE: $(DOMElement) + } else if ( selector.nodeType ) { + this[ 0 ] = selector; + this.length = 1; + return this; + + // HANDLE: $(function) + // Shortcut for document ready + } else if ( isFunction( selector ) ) { + return root.ready !== undefined ? 
+ root.ready( selector ) : + + // Execute immediately if ready is not present + selector( jQuery ); + } + + return jQuery.makeArray( selector, this ); + }; + +// Give the init function the jQuery prototype for later instantiation +init.prototype = jQuery.fn; + +// Initialize central reference +rootjQuery = jQuery( document ); + + +var rparentsprev = /^(?:parents|prev(?:Until|All))/, + + // Methods guaranteed to produce a unique set when starting from a unique set + guaranteedUnique = { + children: true, + contents: true, + next: true, + prev: true + }; + +jQuery.fn.extend( { + has: function( target ) { + var targets = jQuery( target, this ), + l = targets.length; + + return this.filter( function() { + var i = 0; + for ( ; i < l; i++ ) { + if ( jQuery.contains( this, targets[ i ] ) ) { + return true; + } + } + } ); + }, + + closest: function( selectors, context ) { + var cur, + i = 0, + l = this.length, + matched = [], + targets = typeof selectors !== "string" && jQuery( selectors ); + + // Positional selectors never match, since there's no _selection_ context + if ( !rneedsContext.test( selectors ) ) { + for ( ; i < l; i++ ) { + for ( cur = this[ i ]; cur && cur !== context; cur = cur.parentNode ) { + + // Always skip document fragments + if ( cur.nodeType < 11 && ( targets ? + targets.index( cur ) > -1 : + + // Don't pass non-elements to Sizzle + cur.nodeType === 1 && + jQuery.find.matchesSelector( cur, selectors ) ) ) { + + matched.push( cur ); + break; + } + } + } + } + + return this.pushStack( matched.length > 1 ? jQuery.uniqueSort( matched ) : matched ); + }, + + // Determine the position of an element within the set + index: function( elem ) { + + // No argument, return index in parent + if ( !elem ) { + return ( this[ 0 ] && this[ 0 ].parentNode ) ? this.first().prevAll().length : -1; + } + + // Index in selector + if ( typeof elem === "string" ) { + return indexOf.call( jQuery( elem ), this[ 0 ] ); + } + + // Locate the position of the desired element + return indexOf.call( this, + + // If it receives a jQuery object, the first element is used + elem.jquery ? elem[ 0 ] : elem + ); + }, + + add: function( selector, context ) { + return this.pushStack( + jQuery.uniqueSort( + jQuery.merge( this.get(), jQuery( selector, context ) ) + ) + ); + }, + + addBack: function( selector ) { + return this.add( selector == null ? + this.prevObject : this.prevObject.filter( selector ) + ); + } +} ); + +function sibling( cur, dir ) { + while ( ( cur = cur[ dir ] ) && cur.nodeType !== 1 ) {} + return cur; +} + +jQuery.each( { + parent: function( elem ) { + var parent = elem.parentNode; + return parent && parent.nodeType !== 11 ? 
parent : null; + }, + parents: function( elem ) { + return dir( elem, "parentNode" ); + }, + parentsUntil: function( elem, _i, until ) { + return dir( elem, "parentNode", until ); + }, + next: function( elem ) { + return sibling( elem, "nextSibling" ); + }, + prev: function( elem ) { + return sibling( elem, "previousSibling" ); + }, + nextAll: function( elem ) { + return dir( elem, "nextSibling" ); + }, + prevAll: function( elem ) { + return dir( elem, "previousSibling" ); + }, + nextUntil: function( elem, _i, until ) { + return dir( elem, "nextSibling", until ); + }, + prevUntil: function( elem, _i, until ) { + return dir( elem, "previousSibling", until ); + }, + siblings: function( elem ) { + return siblings( ( elem.parentNode || {} ).firstChild, elem ); + }, + children: function( elem ) { + return siblings( elem.firstChild ); + }, + contents: function( elem ) { + if ( elem.contentDocument != null && + + // Support: IE 11+ + // elements with no `data` attribute has an object + // `contentDocument` with a `null` prototype. + getProto( elem.contentDocument ) ) { + + return elem.contentDocument; + } + + // Support: IE 9 - 11 only, iOS 7 only, Android Browser <=4.3 only + // Treat the template element as a regular one in browsers that + // don't support it. + if ( nodeName( elem, "template" ) ) { + elem = elem.content || elem; + } + + return jQuery.merge( [], elem.childNodes ); + } +}, function( name, fn ) { + jQuery.fn[ name ] = function( until, selector ) { + var matched = jQuery.map( this, fn, until ); + + if ( name.slice( -5 ) !== "Until" ) { + selector = until; + } + + if ( selector && typeof selector === "string" ) { + matched = jQuery.filter( selector, matched ); + } + + if ( this.length > 1 ) { + + // Remove duplicates + if ( !guaranteedUnique[ name ] ) { + jQuery.uniqueSort( matched ); + } + + // Reverse order for parents* and prev-derivatives + if ( rparentsprev.test( name ) ) { + matched.reverse(); + } + } + + return this.pushStack( matched ); + }; +} ); +var rnothtmlwhite = ( /[^\x20\t\r\n\f]+/g ); + + + +// Convert String-formatted options into Object-formatted ones +function createOptions( options ) { + var object = {}; + jQuery.each( options.match( rnothtmlwhite ) || [], function( _, flag ) { + object[ flag ] = true; + } ); + return object; +} + +/* + * Create a callback list using the following parameters: + * + * options: an optional list of space-separated options that will change how + * the callback list behaves or a more traditional option object + * + * By default a callback list will act like an event callback list and can be + * "fired" multiple times. + * + * Possible options: + * + * once: will ensure the callback list can only be fired once (like a Deferred) + * + * memory: will keep track of previous values and will call any callback added + * after the list has been fired right away with the latest "memorized" + * values (like a Deferred) + * + * unique: will ensure a callback can only be added once (no duplicate in the list) + * + * stopOnFalse: interrupt callings when a callback returns false + * + */ +jQuery.Callbacks = function( options ) { + + // Convert options from String-formatted to Object-formatted if needed + // (we check in cache first) + options = typeof options === "string" ? 
+ createOptions( options ) : + jQuery.extend( {}, options ); + + var // Flag to know if list is currently firing + firing, + + // Last fire value for non-forgettable lists + memory, + + // Flag to know if list was already fired + fired, + + // Flag to prevent firing + locked, + + // Actual callback list + list = [], + + // Queue of execution data for repeatable lists + queue = [], + + // Index of currently firing callback (modified by add/remove as needed) + firingIndex = -1, + + // Fire callbacks + fire = function() { + + // Enforce single-firing + locked = locked || options.once; + + // Execute callbacks for all pending executions, + // respecting firingIndex overrides and runtime changes + fired = firing = true; + for ( ; queue.length; firingIndex = -1 ) { + memory = queue.shift(); + while ( ++firingIndex < list.length ) { + + // Run callback and check for early termination + if ( list[ firingIndex ].apply( memory[ 0 ], memory[ 1 ] ) === false && + options.stopOnFalse ) { + + // Jump to end and forget the data so .add doesn't re-fire + firingIndex = list.length; + memory = false; + } + } + } + + // Forget the data if we're done with it + if ( !options.memory ) { + memory = false; + } + + firing = false; + + // Clean up if we're done firing for good + if ( locked ) { + + // Keep an empty list if we have data for future add calls + if ( memory ) { + list = []; + + // Otherwise, this object is spent + } else { + list = ""; + } + } + }, + + // Actual Callbacks object + self = { + + // Add a callback or a collection of callbacks to the list + add: function() { + if ( list ) { + + // If we have memory from a past run, we should fire after adding + if ( memory && !firing ) { + firingIndex = list.length - 1; + queue.push( memory ); + } + + ( function add( args ) { + jQuery.each( args, function( _, arg ) { + if ( isFunction( arg ) ) { + if ( !options.unique || !self.has( arg ) ) { + list.push( arg ); + } + } else if ( arg && arg.length && toType( arg ) !== "string" ) { + + // Inspect recursively + add( arg ); + } + } ); + } )( arguments ); + + if ( memory && !firing ) { + fire(); + } + } + return this; + }, + + // Remove a callback from the list + remove: function() { + jQuery.each( arguments, function( _, arg ) { + var index; + while ( ( index = jQuery.inArray( arg, list, index ) ) > -1 ) { + list.splice( index, 1 ); + + // Handle firing indexes + if ( index <= firingIndex ) { + firingIndex--; + } + } + } ); + return this; + }, + + // Check if a given callback is in the list. + // If no argument is given, return whether or not list has callbacks attached. + has: function( fn ) { + return fn ? + jQuery.inArray( fn, list ) > -1 : + list.length > 0; + }, + + // Remove all callbacks from the list + empty: function() { + if ( list ) { + list = []; + } + return this; + }, + + // Disable .fire and .add + // Abort any current/pending executions + // Clear all callbacks and values + disable: function() { + locked = queue = []; + list = memory = ""; + return this; + }, + disabled: function() { + return !list; + }, + + // Disable .fire + // Also disable .add unless we have memory (since it would have no effect) + // Abort any pending executions + lock: function() { + locked = queue = []; + if ( !memory && !firing ) { + list = memory = ""; + } + return this; + }, + locked: function() { + return !!locked; + }, + + // Call all callbacks with the given context and arguments + fireWith: function( context, args ) { + if ( !locked ) { + args = args || []; + args = [ context, args.slice ? 
args.slice() : args ]; + queue.push( args ); + if ( !firing ) { + fire(); + } + } + return this; + }, + + // Call all the callbacks with the given arguments + fire: function() { + self.fireWith( this, arguments ); + return this; + }, + + // To know if the callbacks have already been called at least once + fired: function() { + return !!fired; + } + }; + + return self; +}; + + +function Identity( v ) { + return v; +} +function Thrower( ex ) { + throw ex; +} + +function adoptValue( value, resolve, reject, noValue ) { + var method; + + try { + + // Check for promise aspect first to privilege synchronous behavior + if ( value && isFunction( ( method = value.promise ) ) ) { + method.call( value ).done( resolve ).fail( reject ); + + // Other thenables + } else if ( value && isFunction( ( method = value.then ) ) ) { + method.call( value, resolve, reject ); + + // Other non-thenables + } else { + + // Control `resolve` arguments by letting Array#slice cast boolean `noValue` to integer: + // * false: [ value ].slice( 0 ) => resolve( value ) + // * true: [ value ].slice( 1 ) => resolve() + resolve.apply( undefined, [ value ].slice( noValue ) ); + } + + // For Promises/A+, convert exceptions into rejections + // Since jQuery.when doesn't unwrap thenables, we can skip the extra checks appearing in + // Deferred#then to conditionally suppress rejection. + } catch ( value ) { + + // Support: Android 4.0 only + // Strict mode functions invoked without .call/.apply get global-object context + reject.apply( undefined, [ value ] ); + } +} + +jQuery.extend( { + + Deferred: function( func ) { + var tuples = [ + + // action, add listener, callbacks, + // ... .then handlers, argument index, [final state] + [ "notify", "progress", jQuery.Callbacks( "memory" ), + jQuery.Callbacks( "memory" ), 2 ], + [ "resolve", "done", jQuery.Callbacks( "once memory" ), + jQuery.Callbacks( "once memory" ), 0, "resolved" ], + [ "reject", "fail", jQuery.Callbacks( "once memory" ), + jQuery.Callbacks( "once memory" ), 1, "rejected" ] + ], + state = "pending", + promise = { + state: function() { + return state; + }, + always: function() { + deferred.done( arguments ).fail( arguments ); + return this; + }, + "catch": function( fn ) { + return promise.then( null, fn ); + }, + + // Keep pipe for back-compat + pipe: function( /* fnDone, fnFail, fnProgress */ ) { + var fns = arguments; + + return jQuery.Deferred( function( newDefer ) { + jQuery.each( tuples, function( _i, tuple ) { + + // Map tuples (progress, done, fail) to arguments (done, fail, progress) + var fn = isFunction( fns[ tuple[ 4 ] ] ) && fns[ tuple[ 4 ] ]; + + // deferred.progress(function() { bind to newDefer or newDefer.notify }) + // deferred.done(function() { bind to newDefer or newDefer.resolve }) + // deferred.fail(function() { bind to newDefer or newDefer.reject }) + deferred[ tuple[ 1 ] ]( function() { + var returned = fn && fn.apply( this, arguments ); + if ( returned && isFunction( returned.promise ) ) { + returned.promise() + .progress( newDefer.notify ) + .done( newDefer.resolve ) + .fail( newDefer.reject ); + } else { + newDefer[ tuple[ 0 ] + "With" ]( + this, + fn ? 
[ returned ] : arguments + ); + } + } ); + } ); + fns = null; + } ).promise(); + }, + then: function( onFulfilled, onRejected, onProgress ) { + var maxDepth = 0; + function resolve( depth, deferred, handler, special ) { + return function() { + var that = this, + args = arguments, + mightThrow = function() { + var returned, then; + + // Support: Promises/A+ section 2.3.3.3.3 + // https://promisesaplus.com/#point-59 + // Ignore double-resolution attempts + if ( depth < maxDepth ) { + return; + } + + returned = handler.apply( that, args ); + + // Support: Promises/A+ section 2.3.1 + // https://promisesaplus.com/#point-48 + if ( returned === deferred.promise() ) { + throw new TypeError( "Thenable self-resolution" ); + } + + // Support: Promises/A+ sections 2.3.3.1, 3.5 + // https://promisesaplus.com/#point-54 + // https://promisesaplus.com/#point-75 + // Retrieve `then` only once + then = returned && + + // Support: Promises/A+ section 2.3.4 + // https://promisesaplus.com/#point-64 + // Only check objects and functions for thenability + ( typeof returned === "object" || + typeof returned === "function" ) && + returned.then; + + // Handle a returned thenable + if ( isFunction( then ) ) { + + // Special processors (notify) just wait for resolution + if ( special ) { + then.call( + returned, + resolve( maxDepth, deferred, Identity, special ), + resolve( maxDepth, deferred, Thrower, special ) + ); + + // Normal processors (resolve) also hook into progress + } else { + + // ...and disregard older resolution values + maxDepth++; + + then.call( + returned, + resolve( maxDepth, deferred, Identity, special ), + resolve( maxDepth, deferred, Thrower, special ), + resolve( maxDepth, deferred, Identity, + deferred.notifyWith ) + ); + } + + // Handle all other returned values + } else { + + // Only substitute handlers pass on context + // and multiple values (non-spec behavior) + if ( handler !== Identity ) { + that = undefined; + args = [ returned ]; + } + + // Process the value(s) + // Default process is resolve + ( special || deferred.resolveWith )( that, args ); + } + }, + + // Only normal processors (resolve) catch and reject exceptions + process = special ? + mightThrow : + function() { + try { + mightThrow(); + } catch ( e ) { + + if ( jQuery.Deferred.exceptionHook ) { + jQuery.Deferred.exceptionHook( e, + process.stackTrace ); + } + + // Support: Promises/A+ section 2.3.3.3.4.1 + // https://promisesaplus.com/#point-61 + // Ignore post-resolution exceptions + if ( depth + 1 >= maxDepth ) { + + // Only substitute handlers pass on context + // and multiple values (non-spec behavior) + if ( handler !== Thrower ) { + that = undefined; + args = [ e ]; + } + + deferred.rejectWith( that, args ); + } + } + }; + + // Support: Promises/A+ section 2.3.3.3.1 + // https://promisesaplus.com/#point-57 + // Re-resolve promises immediately to dodge false rejection from + // subsequent errors + if ( depth ) { + process(); + } else { + + // Call an optional hook to record the stack, in case of exception + // since it's otherwise lost when execution goes async + if ( jQuery.Deferred.getStackHook ) { + process.stackTrace = jQuery.Deferred.getStackHook(); + } + window.setTimeout( process ); + } + }; + } + + return jQuery.Deferred( function( newDefer ) { + + // progress_handlers.add( ... ) + tuples[ 0 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onProgress ) ? + onProgress : + Identity, + newDefer.notifyWith + ) + ); + + // fulfilled_handlers.add( ... 
) + tuples[ 1 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onFulfilled ) ? + onFulfilled : + Identity + ) + ); + + // rejected_handlers.add( ... ) + tuples[ 2 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onRejected ) ? + onRejected : + Thrower + ) + ); + } ).promise(); + }, + + // Get a promise for this deferred + // If obj is provided, the promise aspect is added to the object + promise: function( obj ) { + return obj != null ? jQuery.extend( obj, promise ) : promise; + } + }, + deferred = {}; + + // Add list-specific methods + jQuery.each( tuples, function( i, tuple ) { + var list = tuple[ 2 ], + stateString = tuple[ 5 ]; + + // promise.progress = list.add + // promise.done = list.add + // promise.fail = list.add + promise[ tuple[ 1 ] ] = list.add; + + // Handle state + if ( stateString ) { + list.add( + function() { + + // state = "resolved" (i.e., fulfilled) + // state = "rejected" + state = stateString; + }, + + // rejected_callbacks.disable + // fulfilled_callbacks.disable + tuples[ 3 - i ][ 2 ].disable, + + // rejected_handlers.disable + // fulfilled_handlers.disable + tuples[ 3 - i ][ 3 ].disable, + + // progress_callbacks.lock + tuples[ 0 ][ 2 ].lock, + + // progress_handlers.lock + tuples[ 0 ][ 3 ].lock + ); + } + + // progress_handlers.fire + // fulfilled_handlers.fire + // rejected_handlers.fire + list.add( tuple[ 3 ].fire ); + + // deferred.notify = function() { deferred.notifyWith(...) } + // deferred.resolve = function() { deferred.resolveWith(...) } + // deferred.reject = function() { deferred.rejectWith(...) } + deferred[ tuple[ 0 ] ] = function() { + deferred[ tuple[ 0 ] + "With" ]( this === deferred ? undefined : this, arguments ); + return this; + }; + + // deferred.notifyWith = list.fireWith + // deferred.resolveWith = list.fireWith + // deferred.rejectWith = list.fireWith + deferred[ tuple[ 0 ] + "With" ] = list.fireWith; + } ); + + // Make the deferred a promise + promise.promise( deferred ); + + // Call given func if any + if ( func ) { + func.call( deferred, deferred ); + } + + // All done! + return deferred; + }, + + // Deferred helper + when: function( singleValue ) { + var + + // count of uncompleted subordinates + remaining = arguments.length, + + // count of unprocessed arguments + i = remaining, + + // subordinate fulfillment data + resolveContexts = Array( i ), + resolveValues = slice.call( arguments ), + + // the master Deferred + master = jQuery.Deferred(), + + // subordinate callback factory + updateFunc = function( i ) { + return function( value ) { + resolveContexts[ i ] = this; + resolveValues[ i ] = arguments.length > 1 ? slice.call( arguments ) : value; + if ( !( --remaining ) ) { + master.resolveWith( resolveContexts, resolveValues ); + } + }; + }; + + // Single- and empty arguments are adopted like Promise.resolve + if ( remaining <= 1 ) { + adoptValue( singleValue, master.done( updateFunc( i ) ).resolve, master.reject, + !remaining ); + + // Use .then() to unwrap secondary thenables (cf. gh-3000) + if ( master.state() === "pending" || + isFunction( resolveValues[ i ] && resolveValues[ i ].then ) ) { + + return master.then(); + } + } + + // Multiple arguments are aggregated like Promise.all array elements + while ( i-- ) { + adoptValue( resolveValues[ i ], updateFunc( i ), master.reject ); + } + + return master.promise(); + } +} ); + + +// These usually indicate a programmer mistake during development, +// warn about them ASAP rather than swallowing them by default. 
+var rerrorNames = /^(Eval|Internal|Range|Reference|Syntax|Type|URI)Error$/; + +jQuery.Deferred.exceptionHook = function( error, stack ) { + + // Support: IE 8 - 9 only + // Console exists when dev tools are open, which can happen at any time + if ( window.console && window.console.warn && error && rerrorNames.test( error.name ) ) { + window.console.warn( "jQuery.Deferred exception: " + error.message, error.stack, stack ); + } +}; + + + + +jQuery.readyException = function( error ) { + window.setTimeout( function() { + throw error; + } ); +}; + + + + +// The deferred used on DOM ready +var readyList = jQuery.Deferred(); + +jQuery.fn.ready = function( fn ) { + + readyList + .then( fn ) + + // Wrap jQuery.readyException in a function so that the lookup + // happens at the time of error handling instead of callback + // registration. + .catch( function( error ) { + jQuery.readyException( error ); + } ); + + return this; +}; + +jQuery.extend( { + + // Is the DOM ready to be used? Set to true once it occurs. + isReady: false, + + // A counter to track how many items to wait for before + // the ready event fires. See #6781 + readyWait: 1, + + // Handle when the DOM is ready + ready: function( wait ) { + + // Abort if there are pending holds or we're already ready + if ( wait === true ? --jQuery.readyWait : jQuery.isReady ) { + return; + } + + // Remember that the DOM is ready + jQuery.isReady = true; + + // If a normal DOM Ready event fired, decrement, and wait if need be + if ( wait !== true && --jQuery.readyWait > 0 ) { + return; + } + + // If there are functions bound, to execute + readyList.resolveWith( document, [ jQuery ] ); + } +} ); + +jQuery.ready.then = readyList.then; + +// The ready event handler and self cleanup method +function completed() { + document.removeEventListener( "DOMContentLoaded", completed ); + window.removeEventListener( "load", completed ); + jQuery.ready(); +} + +// Catch cases where $(document).ready() is called +// after the browser event has already occurred. +// Support: IE <=9 - 10 only +// Older IE sometimes signals "interactive" too soon +if ( document.readyState === "complete" || + ( document.readyState !== "loading" && !document.documentElement.doScroll ) ) { + + // Handle it asynchronously to allow scripts the opportunity to delay ready + window.setTimeout( jQuery.ready ); + +} else { + + // Use the handy event callback + document.addEventListener( "DOMContentLoaded", completed ); + + // A fallback to window.onload, that will always work + window.addEventListener( "load", completed ); +} + + + + +// Multifunctional method to get and set values of a collection +// The value/s can optionally be executed if it's a function +var access = function( elems, fn, key, value, chainable, emptyGet, raw ) { + var i = 0, + len = elems.length, + bulk = key == null; + + // Sets many values + if ( toType( key ) === "object" ) { + chainable = true; + for ( i in key ) { + access( elems, fn, i, key[ i ], true, emptyGet, raw ); + } + + // Sets one value + } else if ( value !== undefined ) { + chainable = true; + + if ( !isFunction( value ) ) { + raw = true; + } + + if ( bulk ) { + + // Bulk operations run against the entire set + if ( raw ) { + fn.call( elems, value ); + fn = null; + + // ...except when executing function values + } else { + bulk = fn; + fn = function( elem, _key, value ) { + return bulk.call( jQuery( elem ), value ); + }; + } + } + + if ( fn ) { + for ( ; i < len; i++ ) { + fn( + elems[ i ], key, raw ? 
+ value : + value.call( elems[ i ], i, fn( elems[ i ], key ) ) + ); + } + } + } + + if ( chainable ) { + return elems; + } + + // Gets + if ( bulk ) { + return fn.call( elems ); + } + + return len ? fn( elems[ 0 ], key ) : emptyGet; +}; + + +// Matches dashed string for camelizing +var rmsPrefix = /^-ms-/, + rdashAlpha = /-([a-z])/g; + +// Used by camelCase as callback to replace() +function fcamelCase( _all, letter ) { + return letter.toUpperCase(); +} + +// Convert dashed to camelCase; used by the css and data modules +// Support: IE <=9 - 11, Edge 12 - 15 +// Microsoft forgot to hump their vendor prefix (#9572) +function camelCase( string ) { + return string.replace( rmsPrefix, "ms-" ).replace( rdashAlpha, fcamelCase ); +} +var acceptData = function( owner ) { + + // Accepts only: + // - Node + // - Node.ELEMENT_NODE + // - Node.DOCUMENT_NODE + // - Object + // - Any + return owner.nodeType === 1 || owner.nodeType === 9 || !( +owner.nodeType ); +}; + + + + +function Data() { + this.expando = jQuery.expando + Data.uid++; +} + +Data.uid = 1; + +Data.prototype = { + + cache: function( owner ) { + + // Check if the owner object already has a cache + var value = owner[ this.expando ]; + + // If not, create one + if ( !value ) { + value = {}; + + // We can accept data for non-element nodes in modern browsers, + // but we should not, see #8335. + // Always return an empty object. + if ( acceptData( owner ) ) { + + // If it is a node unlikely to be stringify-ed or looped over + // use plain assignment + if ( owner.nodeType ) { + owner[ this.expando ] = value; + + // Otherwise secure it in a non-enumerable property + // configurable must be true to allow the property to be + // deleted when data is removed + } else { + Object.defineProperty( owner, this.expando, { + value: value, + configurable: true + } ); + } + } + } + + return value; + }, + set: function( owner, data, value ) { + var prop, + cache = this.cache( owner ); + + // Handle: [ owner, key, value ] args + // Always use camelCase key (gh-2257) + if ( typeof data === "string" ) { + cache[ camelCase( data ) ] = value; + + // Handle: [ owner, { properties } ] args + } else { + + // Copy the properties one-by-one to the cache object + for ( prop in data ) { + cache[ camelCase( prop ) ] = data[ prop ]; + } + } + return cache; + }, + get: function( owner, key ) { + return key === undefined ? + this.cache( owner ) : + + // Always use camelCase key (gh-2257) + owner[ this.expando ] && owner[ this.expando ][ camelCase( key ) ]; + }, + access: function( owner, key, value ) { + + // In cases where either: + // + // 1. No key was specified + // 2. A string key was specified, but no value provided + // + // Take the "read" path and allow the get method to determine + // which value to return, respectively either: + // + // 1. The entire cache object + // 2. The data stored at the key + // + if ( key === undefined || + ( ( key && typeof key === "string" ) && value === undefined ) ) { + + return this.get( owner, key ); + } + + // When the key is not a string, or both a key and value + // are specified, set or extend (existing objects) with either: + // + // 1. An object of properties + // 2. A key and value + // + this.set( owner, key, value ); + + // Since the "set" path can have two possible entry points + // return the expected data based on which path was taken[*] + return value !== undefined ? 
value : key; + }, + remove: function( owner, key ) { + var i, + cache = owner[ this.expando ]; + + if ( cache === undefined ) { + return; + } + + if ( key !== undefined ) { + + // Support array or space separated string of keys + if ( Array.isArray( key ) ) { + + // If key is an array of keys... + // We always set camelCase keys, so remove that. + key = key.map( camelCase ); + } else { + key = camelCase( key ); + + // If a key with the spaces exists, use it. + // Otherwise, create an array by matching non-whitespace + key = key in cache ? + [ key ] : + ( key.match( rnothtmlwhite ) || [] ); + } + + i = key.length; + + while ( i-- ) { + delete cache[ key[ i ] ]; + } + } + + // Remove the expando if there's no more data + if ( key === undefined || jQuery.isEmptyObject( cache ) ) { + + // Support: Chrome <=35 - 45 + // Webkit & Blink performance suffers when deleting properties + // from DOM nodes, so set to undefined instead + // https://bugs.chromium.org/p/chromium/issues/detail?id=378607 (bug restricted) + if ( owner.nodeType ) { + owner[ this.expando ] = undefined; + } else { + delete owner[ this.expando ]; + } + } + }, + hasData: function( owner ) { + var cache = owner[ this.expando ]; + return cache !== undefined && !jQuery.isEmptyObject( cache ); + } +}; +var dataPriv = new Data(); + +var dataUser = new Data(); + + + +// Implementation Summary +// +// 1. Enforce API surface and semantic compatibility with 1.9.x branch +// 2. Improve the module's maintainability by reducing the storage +// paths to a single mechanism. +// 3. Use the same single mechanism to support "private" and "user" data. +// 4. _Never_ expose "private" data to user code (TODO: Drop _data, _removeData) +// 5. Avoid exposing implementation details on user objects (eg. expando properties) +// 6. Provide a clear path for implementation upgrade to WeakMap in 2014 + +var rbrace = /^(?:\{[\w\W]*\}|\[[\w\W]*\])$/, + rmultiDash = /[A-Z]/g; + +function getData( data ) { + if ( data === "true" ) { + return true; + } + + if ( data === "false" ) { + return false; + } + + if ( data === "null" ) { + return null; + } + + // Only convert to a number if it doesn't change the string + if ( data === +data + "" ) { + return +data; + } + + if ( rbrace.test( data ) ) { + return JSON.parse( data ); + } + + return data; +} + +function dataAttr( elem, key, data ) { + var name; + + // If nothing was found internally, try to fetch any + // data from the HTML5 data-* attribute + if ( data === undefined && elem.nodeType === 1 ) { + name = "data-" + key.replace( rmultiDash, "-$&" ).toLowerCase(); + data = elem.getAttribute( name ); + + if ( typeof data === "string" ) { + try { + data = getData( data ); + } catch ( e ) {} + + // Make sure we set the data so it isn't changed later + dataUser.set( elem, key, data ); + } else { + data = undefined; + } + } + return data; +} + +jQuery.extend( { + hasData: function( elem ) { + return dataUser.hasData( elem ) || dataPriv.hasData( elem ); + }, + + data: function( elem, name, data ) { + return dataUser.access( elem, name, data ); + }, + + removeData: function( elem, name ) { + dataUser.remove( elem, name ); + }, + + // TODO: Now that all calls to _data and _removeData have been replaced + // with direct calls to dataPriv methods, these can be deprecated. 
+ _data: function( elem, name, data ) { + return dataPriv.access( elem, name, data ); + }, + + _removeData: function( elem, name ) { + dataPriv.remove( elem, name ); + } +} ); + +jQuery.fn.extend( { + data: function( key, value ) { + var i, name, data, + elem = this[ 0 ], + attrs = elem && elem.attributes; + + // Gets all values + if ( key === undefined ) { + if ( this.length ) { + data = dataUser.get( elem ); + + if ( elem.nodeType === 1 && !dataPriv.get( elem, "hasDataAttrs" ) ) { + i = attrs.length; + while ( i-- ) { + + // Support: IE 11 only + // The attrs elements can be null (#14894) + if ( attrs[ i ] ) { + name = attrs[ i ].name; + if ( name.indexOf( "data-" ) === 0 ) { + name = camelCase( name.slice( 5 ) ); + dataAttr( elem, name, data[ name ] ); + } + } + } + dataPriv.set( elem, "hasDataAttrs", true ); + } + } + + return data; + } + + // Sets multiple values + if ( typeof key === "object" ) { + return this.each( function() { + dataUser.set( this, key ); + } ); + } + + return access( this, function( value ) { + var data; + + // The calling jQuery object (element matches) is not empty + // (and therefore has an element appears at this[ 0 ]) and the + // `value` parameter was not undefined. An empty jQuery object + // will result in `undefined` for elem = this[ 0 ] which will + // throw an exception if an attempt to read a data cache is made. + if ( elem && value === undefined ) { + + // Attempt to get data from the cache + // The key will always be camelCased in Data + data = dataUser.get( elem, key ); + if ( data !== undefined ) { + return data; + } + + // Attempt to "discover" the data in + // HTML5 custom data-* attrs + data = dataAttr( elem, key ); + if ( data !== undefined ) { + return data; + } + + // We tried really hard, but the data doesn't exist. + return; + } + + // Set the data... 
+ this.each( function() { + + // We always store the camelCased key + dataUser.set( this, key, value ); + } ); + }, null, value, arguments.length > 1, null, true ); + }, + + removeData: function( key ) { + return this.each( function() { + dataUser.remove( this, key ); + } ); + } +} ); + + +jQuery.extend( { + queue: function( elem, type, data ) { + var queue; + + if ( elem ) { + type = ( type || "fx" ) + "queue"; + queue = dataPriv.get( elem, type ); + + // Speed up dequeue by getting out quickly if this is just a lookup + if ( data ) { + if ( !queue || Array.isArray( data ) ) { + queue = dataPriv.access( elem, type, jQuery.makeArray( data ) ); + } else { + queue.push( data ); + } + } + return queue || []; + } + }, + + dequeue: function( elem, type ) { + type = type || "fx"; + + var queue = jQuery.queue( elem, type ), + startLength = queue.length, + fn = queue.shift(), + hooks = jQuery._queueHooks( elem, type ), + next = function() { + jQuery.dequeue( elem, type ); + }; + + // If the fx queue is dequeued, always remove the progress sentinel + if ( fn === "inprogress" ) { + fn = queue.shift(); + startLength--; + } + + if ( fn ) { + + // Add a progress sentinel to prevent the fx queue from being + // automatically dequeued + if ( type === "fx" ) { + queue.unshift( "inprogress" ); + } + + // Clear up the last queue stop function + delete hooks.stop; + fn.call( elem, next, hooks ); + } + + if ( !startLength && hooks ) { + hooks.empty.fire(); + } + }, + + // Not public - generate a queueHooks object, or return the current one + _queueHooks: function( elem, type ) { + var key = type + "queueHooks"; + return dataPriv.get( elem, key ) || dataPriv.access( elem, key, { + empty: jQuery.Callbacks( "once memory" ).add( function() { + dataPriv.remove( elem, [ type + "queue", key ] ); + } ) + } ); + } +} ); + +jQuery.fn.extend( { + queue: function( type, data ) { + var setter = 2; + + if ( typeof type !== "string" ) { + data = type; + type = "fx"; + setter--; + } + + if ( arguments.length < setter ) { + return jQuery.queue( this[ 0 ], type ); + } + + return data === undefined ? 
+ this : + this.each( function() { + var queue = jQuery.queue( this, type, data ); + + // Ensure a hooks for this queue + jQuery._queueHooks( this, type ); + + if ( type === "fx" && queue[ 0 ] !== "inprogress" ) { + jQuery.dequeue( this, type ); + } + } ); + }, + dequeue: function( type ) { + return this.each( function() { + jQuery.dequeue( this, type ); + } ); + }, + clearQueue: function( type ) { + return this.queue( type || "fx", [] ); + }, + + // Get a promise resolved when queues of a certain type + // are emptied (fx is the type by default) + promise: function( type, obj ) { + var tmp, + count = 1, + defer = jQuery.Deferred(), + elements = this, + i = this.length, + resolve = function() { + if ( !( --count ) ) { + defer.resolveWith( elements, [ elements ] ); + } + }; + + if ( typeof type !== "string" ) { + obj = type; + type = undefined; + } + type = type || "fx"; + + while ( i-- ) { + tmp = dataPriv.get( elements[ i ], type + "queueHooks" ); + if ( tmp && tmp.empty ) { + count++; + tmp.empty.add( resolve ); + } + } + resolve(); + return defer.promise( obj ); + } +} ); +var pnum = ( /[+-]?(?:\d*\.|)\d+(?:[eE][+-]?\d+|)/ ).source; + +var rcssNum = new RegExp( "^(?:([+-])=|)(" + pnum + ")([a-z%]*)$", "i" ); + + +var cssExpand = [ "Top", "Right", "Bottom", "Left" ]; + +var documentElement = document.documentElement; + + + + var isAttached = function( elem ) { + return jQuery.contains( elem.ownerDocument, elem ); + }, + composed = { composed: true }; + + // Support: IE 9 - 11+, Edge 12 - 18+, iOS 10.0 - 10.2 only + // Check attachment across shadow DOM boundaries when possible (gh-3504) + // Support: iOS 10.0-10.2 only + // Early iOS 10 versions support `attachShadow` but not `getRootNode`, + // leading to errors. We need to check for `getRootNode`. + if ( documentElement.getRootNode ) { + isAttached = function( elem ) { + return jQuery.contains( elem.ownerDocument, elem ) || + elem.getRootNode( composed ) === elem.ownerDocument; + }; + } +var isHiddenWithinTree = function( elem, el ) { + + // isHiddenWithinTree might be called from jQuery#filter function; + // in that case, element will be second argument + elem = el || elem; + + // Inline style trumps all + return elem.style.display === "none" || + elem.style.display === "" && + + // Otherwise, check computed style + // Support: Firefox <=43 - 45 + // Disconnected elements can have computed display: none, so first confirm that elem is + // in the document. + isAttached( elem ) && + + jQuery.css( elem, "display" ) === "none"; + }; + + + +function adjustCSS( elem, prop, valueParts, tween ) { + var adjusted, scale, + maxIterations = 20, + currentValue = tween ? + function() { + return tween.cur(); + } : + function() { + return jQuery.css( elem, prop, "" ); + }, + initial = currentValue(), + unit = valueParts && valueParts[ 3 ] || ( jQuery.cssNumber[ prop ] ? 
"" : "px" ), + + // Starting value computation is required for potential unit mismatches + initialInUnit = elem.nodeType && + ( jQuery.cssNumber[ prop ] || unit !== "px" && +initial ) && + rcssNum.exec( jQuery.css( elem, prop ) ); + + if ( initialInUnit && initialInUnit[ 3 ] !== unit ) { + + // Support: Firefox <=54 + // Halve the iteration target value to prevent interference from CSS upper bounds (gh-2144) + initial = initial / 2; + + // Trust units reported by jQuery.css + unit = unit || initialInUnit[ 3 ]; + + // Iteratively approximate from a nonzero starting point + initialInUnit = +initial || 1; + + while ( maxIterations-- ) { + + // Evaluate and update our best guess (doubling guesses that zero out). + // Finish if the scale equals or crosses 1 (making the old*new product non-positive). + jQuery.style( elem, prop, initialInUnit + unit ); + if ( ( 1 - scale ) * ( 1 - ( scale = currentValue() / initial || 0.5 ) ) <= 0 ) { + maxIterations = 0; + } + initialInUnit = initialInUnit / scale; + + } + + initialInUnit = initialInUnit * 2; + jQuery.style( elem, prop, initialInUnit + unit ); + + // Make sure we update the tween properties later on + valueParts = valueParts || []; + } + + if ( valueParts ) { + initialInUnit = +initialInUnit || +initial || 0; + + // Apply relative offset (+=/-=) if specified + adjusted = valueParts[ 1 ] ? + initialInUnit + ( valueParts[ 1 ] + 1 ) * valueParts[ 2 ] : + +valueParts[ 2 ]; + if ( tween ) { + tween.unit = unit; + tween.start = initialInUnit; + tween.end = adjusted; + } + } + return adjusted; +} + + +var defaultDisplayMap = {}; + +function getDefaultDisplay( elem ) { + var temp, + doc = elem.ownerDocument, + nodeName = elem.nodeName, + display = defaultDisplayMap[ nodeName ]; + + if ( display ) { + return display; + } + + temp = doc.body.appendChild( doc.createElement( nodeName ) ); + display = jQuery.css( temp, "display" ); + + temp.parentNode.removeChild( temp ); + + if ( display === "none" ) { + display = "block"; + } + defaultDisplayMap[ nodeName ] = display; + + return display; +} + +function showHide( elements, show ) { + var display, elem, + values = [], + index = 0, + length = elements.length; + + // Determine new display value for elements that need to change + for ( ; index < length; index++ ) { + elem = elements[ index ]; + if ( !elem.style ) { + continue; + } + + display = elem.style.display; + if ( show ) { + + // Since we force visibility upon cascade-hidden elements, an immediate (and slow) + // check is required in this first loop unless we have a nonempty display value (either + // inline or about-to-be-restored) + if ( display === "none" ) { + values[ index ] = dataPriv.get( elem, "display" ) || null; + if ( !values[ index ] ) { + elem.style.display = ""; + } + } + if ( elem.style.display === "" && isHiddenWithinTree( elem ) ) { + values[ index ] = getDefaultDisplay( elem ); + } + } else { + if ( display !== "none" ) { + values[ index ] = "none"; + + // Remember what we're overwriting + dataPriv.set( elem, "display", display ); + } + } + } + + // Set the display of the elements in a second loop to avoid constant reflow + for ( index = 0; index < length; index++ ) { + if ( values[ index ] != null ) { + elements[ index ].style.display = values[ index ]; + } + } + + return elements; +} + +jQuery.fn.extend( { + show: function() { + return showHide( this, true ); + }, + hide: function() { + return showHide( this ); + }, + toggle: function( state ) { + if ( typeof state === "boolean" ) { + return state ? 
this.show() : this.hide();
+ }
+
+ return this.each( function() {
+ if ( isHiddenWithinTree( this ) ) {
+ jQuery( this ).show();
+ } else {
+ jQuery( this ).hide();
+ }
+ } );
+ }
+} );
+var rcheckableType = ( /^(?:checkbox|radio)$/i );
+
+var rtagName = ( /<([a-z][^\/\0>\x20\t\r\n\f]*)/i );
+
+var rscriptType = ( /^$|^module$|\/(?:java|ecma)script/i );
+
+
+
+( function() {
+ var fragment = document.createDocumentFragment(),
+ div = fragment.appendChild( document.createElement( "div" ) ),
+ input = document.createElement( "input" );
+
+ // Support: Android 4.0 - 4.3 only
+ // Check state lost if the name is set (#11217)
+ // Support: Windows Web Apps (WWA)
+ // `name` and `type` must use .setAttribute for WWA (#14901)
+ input.setAttribute( "type", "radio" );
+ input.setAttribute( "checked", "checked" );
+ input.setAttribute( "name", "t" );
+
+ div.appendChild( input );
+
+ // Support: Android <=4.1 only
+ // Older WebKit doesn't clone checked state correctly in fragments
+ support.checkClone = div.cloneNode( true ).cloneNode( true ).lastChild.checked;
+
+ // Support: IE <=11 only
+ // Make sure textarea (and checkbox) defaultValue is properly cloned
+ div.innerHTML = "<textarea>x</textarea>";
+ support.noCloneChecked = !!div.cloneNode( true ).lastChild.defaultValue;
+
+ // Support: IE <=9 only
+ // IE <=9 replaces <option> elements with empty ones when inserted via innerHTML
+ div.innerHTML = "<option></option>";
+ support.option = !!div.lastChild;
+} )();
+
+
+// We have to close these tags to support XHTML (#13200)
+var wrapMap = {
+
+ // XHTML parsers do not magically insert elements in the
+ // same way that tag soup parsers do. So we cannot shorten
+ // this by omitting <tbody> or other required elements.
+ thead: [ 1, "<table>", "
" ], + col: [ 2, "", "
" ], + tr: [ 2, "", "
" ], + td: [ 3, "", "
" ], + + _default: [ 0, "", "" ] +}; + +wrapMap.tbody = wrapMap.tfoot = wrapMap.colgroup = wrapMap.caption = wrapMap.thead; +wrapMap.th = wrapMap.td; + +// Support: IE <=9 only +if ( !support.option ) { + wrapMap.optgroup = wrapMap.option = [ 1, "" ]; +} + + +function getAll( context, tag ) { + + // Support: IE <=9 - 11 only + // Use typeof to avoid zero-argument method invocation on host objects (#15151) + var ret; + + if ( typeof context.getElementsByTagName !== "undefined" ) { + ret = context.getElementsByTagName( tag || "*" ); + + } else if ( typeof context.querySelectorAll !== "undefined" ) { + ret = context.querySelectorAll( tag || "*" ); + + } else { + ret = []; + } + + if ( tag === undefined || tag && nodeName( context, tag ) ) { + return jQuery.merge( [ context ], ret ); + } + + return ret; +} + + +// Mark scripts as having already been evaluated +function setGlobalEval( elems, refElements ) { + var i = 0, + l = elems.length; + + for ( ; i < l; i++ ) { + dataPriv.set( + elems[ i ], + "globalEval", + !refElements || dataPriv.get( refElements[ i ], "globalEval" ) + ); + } +} + + +var rhtml = /<|&#?\w+;/; + +function buildFragment( elems, context, scripts, selection, ignored ) { + var elem, tmp, tag, wrap, attached, j, + fragment = context.createDocumentFragment(), + nodes = [], + i = 0, + l = elems.length; + + for ( ; i < l; i++ ) { + elem = elems[ i ]; + + if ( elem || elem === 0 ) { + + // Add nodes directly + if ( toType( elem ) === "object" ) { + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( nodes, elem.nodeType ? [ elem ] : elem ); + + // Convert non-html into a text node + } else if ( !rhtml.test( elem ) ) { + nodes.push( context.createTextNode( elem ) ); + + // Convert html into DOM nodes + } else { + tmp = tmp || fragment.appendChild( context.createElement( "div" ) ); + + // Deserialize a standard representation + tag = ( rtagName.exec( elem ) || [ "", "" ] )[ 1 ].toLowerCase(); + wrap = wrapMap[ tag ] || wrapMap._default; + tmp.innerHTML = wrap[ 1 ] + jQuery.htmlPrefilter( elem ) + wrap[ 2 ]; + + // Descend through wrappers to the right content + j = wrap[ 0 ]; + while ( j-- ) { + tmp = tmp.lastChild; + } + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( nodes, tmp.childNodes ); + + // Remember the top-level container + tmp = fragment.firstChild; + + // Ensure the created nodes are orphaned (#12392) + tmp.textContent = ""; + } + } + } + + // Remove wrapper from fragment + fragment.textContent = ""; + + i = 0; + while ( ( elem = nodes[ i++ ] ) ) { + + // Skip elements already in the context collection (trac-4087) + if ( selection && jQuery.inArray( elem, selection ) > -1 ) { + if ( ignored ) { + ignored.push( elem ); + } + continue; + } + + attached = isAttached( elem ); + + // Append to fragment + tmp = getAll( fragment.appendChild( elem ), "script" ); + + // Preserve script evaluation history + if ( attached ) { + setGlobalEval( tmp ); + } + + // Capture executables + if ( scripts ) { + j = 0; + while ( ( elem = tmp[ j++ ] ) ) { + if ( rscriptType.test( elem.type || "" ) ) { + scripts.push( elem ); + } + } + } + } + + return fragment; +} + + +var + rkeyEvent = /^key/, + rmouseEvent = /^(?:mouse|pointer|contextmenu|drag|drop)|click/, + rtypenamespace = /^([^.]*)(?:\.(.+)|)/; + +function returnTrue() { + return true; +} + +function returnFalse() { + return false; +} + +// Support: IE <=9 - 11+ +// focus() and blur() are 
asynchronous, except when they are no-op. +// So expect focus to be synchronous when the element is already active, +// and blur to be synchronous when the element is not already active. +// (focus and blur are always synchronous in other supported browsers, +// this just defines when we can count on it). +function expectSync( elem, type ) { + return ( elem === safeActiveElement() ) === ( type === "focus" ); +} + +// Support: IE <=9 only +// Accessing document.activeElement can throw unexpectedly +// https://bugs.jquery.com/ticket/13393 +function safeActiveElement() { + try { + return document.activeElement; + } catch ( err ) { } +} + +function on( elem, types, selector, data, fn, one ) { + var origFn, type; + + // Types can be a map of types/handlers + if ( typeof types === "object" ) { + + // ( types-Object, selector, data ) + if ( typeof selector !== "string" ) { + + // ( types-Object, data ) + data = data || selector; + selector = undefined; + } + for ( type in types ) { + on( elem, type, selector, data, types[ type ], one ); + } + return elem; + } + + if ( data == null && fn == null ) { + + // ( types, fn ) + fn = selector; + data = selector = undefined; + } else if ( fn == null ) { + if ( typeof selector === "string" ) { + + // ( types, selector, fn ) + fn = data; + data = undefined; + } else { + + // ( types, data, fn ) + fn = data; + data = selector; + selector = undefined; + } + } + if ( fn === false ) { + fn = returnFalse; + } else if ( !fn ) { + return elem; + } + + if ( one === 1 ) { + origFn = fn; + fn = function( event ) { + + // Can use an empty set, since event contains the info + jQuery().off( event ); + return origFn.apply( this, arguments ); + }; + + // Use same guid so caller can remove using origFn + fn.guid = origFn.guid || ( origFn.guid = jQuery.guid++ ); + } + return elem.each( function() { + jQuery.event.add( this, types, fn, data, selector ); + } ); +} + +/* + * Helper functions for managing events -- not part of the public interface. + * Props to Dean Edwards' addEvent library for many of the ideas. + */ +jQuery.event = { + + global: {}, + + add: function( elem, types, handler, data, selector ) { + + var handleObjIn, eventHandle, tmp, + events, t, handleObj, + special, handlers, type, namespaces, origType, + elemData = dataPriv.get( elem ); + + // Only attach events to objects that accept data + if ( !acceptData( elem ) ) { + return; + } + + // Caller can pass in an object of custom data in lieu of the handler + if ( handler.handler ) { + handleObjIn = handler; + handler = handleObjIn.handler; + selector = handleObjIn.selector; + } + + // Ensure that invalid selectors throw exceptions at attach time + // Evaluate against documentElement in case elem is a non-element node (e.g., document) + if ( selector ) { + jQuery.find.matchesSelector( documentElement, selector ); + } + + // Make sure that the handler has a unique ID, used to find/remove it later + if ( !handler.guid ) { + handler.guid = jQuery.guid++; + } + + // Init the element's event structure and main handler, if this is the first + if ( !( events = elemData.events ) ) { + events = elemData.events = Object.create( null ); + } + if ( !( eventHandle = elemData.handle ) ) { + eventHandle = elemData.handle = function( e ) { + + // Discard the second event of a jQuery.event.trigger() and + // when an event is called after a page has unloaded + return typeof jQuery !== "undefined" && jQuery.event.triggered !== e.type ? 
+ jQuery.event.dispatch.apply( elem, arguments ) : undefined; + }; + } + + // Handle multiple events separated by a space + types = ( types || "" ).match( rnothtmlwhite ) || [ "" ]; + t = types.length; + while ( t-- ) { + tmp = rtypenamespace.exec( types[ t ] ) || []; + type = origType = tmp[ 1 ]; + namespaces = ( tmp[ 2 ] || "" ).split( "." ).sort(); + + // There *must* be a type, no attaching namespace-only handlers + if ( !type ) { + continue; + } + + // If event changes its type, use the special event handlers for the changed type + special = jQuery.event.special[ type ] || {}; + + // If selector defined, determine special event api type, otherwise given type + type = ( selector ? special.delegateType : special.bindType ) || type; + + // Update special based on newly reset type + special = jQuery.event.special[ type ] || {}; + + // handleObj is passed to all event handlers + handleObj = jQuery.extend( { + type: type, + origType: origType, + data: data, + handler: handler, + guid: handler.guid, + selector: selector, + needsContext: selector && jQuery.expr.match.needsContext.test( selector ), + namespace: namespaces.join( "." ) + }, handleObjIn ); + + // Init the event handler queue if we're the first + if ( !( handlers = events[ type ] ) ) { + handlers = events[ type ] = []; + handlers.delegateCount = 0; + + // Only use addEventListener if the special events handler returns false + if ( !special.setup || + special.setup.call( elem, data, namespaces, eventHandle ) === false ) { + + if ( elem.addEventListener ) { + elem.addEventListener( type, eventHandle ); + } + } + } + + if ( special.add ) { + special.add.call( elem, handleObj ); + + if ( !handleObj.handler.guid ) { + handleObj.handler.guid = handler.guid; + } + } + + // Add to the element's handler list, delegates in front + if ( selector ) { + handlers.splice( handlers.delegateCount++, 0, handleObj ); + } else { + handlers.push( handleObj ); + } + + // Keep track of which events have ever been used, for event optimization + jQuery.event.global[ type ] = true; + } + + }, + + // Detach an event or set of events from an element + remove: function( elem, types, handler, selector, mappedTypes ) { + + var j, origCount, tmp, + events, t, handleObj, + special, handlers, type, namespaces, origType, + elemData = dataPriv.hasData( elem ) && dataPriv.get( elem ); + + if ( !elemData || !( events = elemData.events ) ) { + return; + } + + // Once for each type.namespace in types; type may be omitted + types = ( types || "" ).match( rnothtmlwhite ) || [ "" ]; + t = types.length; + while ( t-- ) { + tmp = rtypenamespace.exec( types[ t ] ) || []; + type = origType = tmp[ 1 ]; + namespaces = ( tmp[ 2 ] || "" ).split( "." ).sort(); + + // Unbind all events (on this namespace, if provided) for the element + if ( !type ) { + for ( type in events ) { + jQuery.event.remove( elem, type + types[ t ], handler, selector, true ); + } + continue; + } + + special = jQuery.event.special[ type ] || {}; + type = ( selector ? 
special.delegateType : special.bindType ) || type; + handlers = events[ type ] || []; + tmp = tmp[ 2 ] && + new RegExp( "(^|\\.)" + namespaces.join( "\\.(?:.*\\.|)" ) + "(\\.|$)" ); + + // Remove matching events + origCount = j = handlers.length; + while ( j-- ) { + handleObj = handlers[ j ]; + + if ( ( mappedTypes || origType === handleObj.origType ) && + ( !handler || handler.guid === handleObj.guid ) && + ( !tmp || tmp.test( handleObj.namespace ) ) && + ( !selector || selector === handleObj.selector || + selector === "**" && handleObj.selector ) ) { + handlers.splice( j, 1 ); + + if ( handleObj.selector ) { + handlers.delegateCount--; + } + if ( special.remove ) { + special.remove.call( elem, handleObj ); + } + } + } + + // Remove generic event handler if we removed something and no more handlers exist + // (avoids potential for endless recursion during removal of special event handlers) + if ( origCount && !handlers.length ) { + if ( !special.teardown || + special.teardown.call( elem, namespaces, elemData.handle ) === false ) { + + jQuery.removeEvent( elem, type, elemData.handle ); + } + + delete events[ type ]; + } + } + + // Remove data and the expando if it's no longer used + if ( jQuery.isEmptyObject( events ) ) { + dataPriv.remove( elem, "handle events" ); + } + }, + + dispatch: function( nativeEvent ) { + + var i, j, ret, matched, handleObj, handlerQueue, + args = new Array( arguments.length ), + + // Make a writable jQuery.Event from the native event object + event = jQuery.event.fix( nativeEvent ), + + handlers = ( + dataPriv.get( this, "events" ) || Object.create( null ) + )[ event.type ] || [], + special = jQuery.event.special[ event.type ] || {}; + + // Use the fix-ed jQuery.Event rather than the (read-only) native event + args[ 0 ] = event; + + for ( i = 1; i < arguments.length; i++ ) { + args[ i ] = arguments[ i ]; + } + + event.delegateTarget = this; + + // Call the preDispatch hook for the mapped type, and let it bail if desired + if ( special.preDispatch && special.preDispatch.call( this, event ) === false ) { + return; + } + + // Determine handlers + handlerQueue = jQuery.event.handlers.call( this, event, handlers ); + + // Run delegates first; they may want to stop propagation beneath us + i = 0; + while ( ( matched = handlerQueue[ i++ ] ) && !event.isPropagationStopped() ) { + event.currentTarget = matched.elem; + + j = 0; + while ( ( handleObj = matched.handlers[ j++ ] ) && + !event.isImmediatePropagationStopped() ) { + + // If the event is namespaced, then each handler is only invoked if it is + // specially universal or its namespaces are a superset of the event's. 
+ if ( !event.rnamespace || handleObj.namespace === false || + event.rnamespace.test( handleObj.namespace ) ) { + + event.handleObj = handleObj; + event.data = handleObj.data; + + ret = ( ( jQuery.event.special[ handleObj.origType ] || {} ).handle || + handleObj.handler ).apply( matched.elem, args ); + + if ( ret !== undefined ) { + if ( ( event.result = ret ) === false ) { + event.preventDefault(); + event.stopPropagation(); + } + } + } + } + } + + // Call the postDispatch hook for the mapped type + if ( special.postDispatch ) { + special.postDispatch.call( this, event ); + } + + return event.result; + }, + + handlers: function( event, handlers ) { + var i, handleObj, sel, matchedHandlers, matchedSelectors, + handlerQueue = [], + delegateCount = handlers.delegateCount, + cur = event.target; + + // Find delegate handlers + if ( delegateCount && + + // Support: IE <=9 + // Black-hole SVG instance trees (trac-13180) + cur.nodeType && + + // Support: Firefox <=42 + // Suppress spec-violating clicks indicating a non-primary pointer button (trac-3861) + // https://www.w3.org/TR/DOM-Level-3-Events/#event-type-click + // Support: IE 11 only + // ...but not arrow key "clicks" of radio inputs, which can have `button` -1 (gh-2343) + !( event.type === "click" && event.button >= 1 ) ) { + + for ( ; cur !== this; cur = cur.parentNode || this ) { + + // Don't check non-elements (#13208) + // Don't process clicks on disabled elements (#6911, #8165, #11382, #11764) + if ( cur.nodeType === 1 && !( event.type === "click" && cur.disabled === true ) ) { + matchedHandlers = []; + matchedSelectors = {}; + for ( i = 0; i < delegateCount; i++ ) { + handleObj = handlers[ i ]; + + // Don't conflict with Object.prototype properties (#13203) + sel = handleObj.selector + " "; + + if ( matchedSelectors[ sel ] === undefined ) { + matchedSelectors[ sel ] = handleObj.needsContext ? + jQuery( sel, this ).index( cur ) > -1 : + jQuery.find( sel, this, null, [ cur ] ).length; + } + if ( matchedSelectors[ sel ] ) { + matchedHandlers.push( handleObj ); + } + } + if ( matchedHandlers.length ) { + handlerQueue.push( { elem: cur, handlers: matchedHandlers } ); + } + } + } + } + + // Add the remaining (directly-bound) handlers + cur = this; + if ( delegateCount < handlers.length ) { + handlerQueue.push( { elem: cur, handlers: handlers.slice( delegateCount ) } ); + } + + return handlerQueue; + }, + + addProp: function( name, hook ) { + Object.defineProperty( jQuery.Event.prototype, name, { + enumerable: true, + configurable: true, + + get: isFunction( hook ) ? + function() { + if ( this.originalEvent ) { + return hook( this.originalEvent ); + } + } : + function() { + if ( this.originalEvent ) { + return this.originalEvent[ name ]; + } + }, + + set: function( value ) { + Object.defineProperty( this, name, { + enumerable: true, + configurable: true, + writable: true, + value: value + } ); + } + } ); + }, + + fix: function( originalEvent ) { + return originalEvent[ jQuery.expando ] ? + originalEvent : + new jQuery.Event( originalEvent ); + }, + + special: { + load: { + + // Prevent triggered image.load events from bubbling to window.load + noBubble: true + }, + click: { + + // Utilize native event to ensure correct state for checkable inputs + setup: function( data ) { + + // For mutual compressibility with _default, replace `this` access with a local var. + // `|| data` is dead code meant only to preserve the variable through minification. 
+ var el = this || data; + + // Claim the first handler + if ( rcheckableType.test( el.type ) && + el.click && nodeName( el, "input" ) ) { + + // dataPriv.set( el, "click", ... ) + leverageNative( el, "click", returnTrue ); + } + + // Return false to allow normal processing in the caller + return false; + }, + trigger: function( data ) { + + // For mutual compressibility with _default, replace `this` access with a local var. + // `|| data` is dead code meant only to preserve the variable through minification. + var el = this || data; + + // Force setup before triggering a click + if ( rcheckableType.test( el.type ) && + el.click && nodeName( el, "input" ) ) { + + leverageNative( el, "click" ); + } + + // Return non-false to allow normal event-path propagation + return true; + }, + + // For cross-browser consistency, suppress native .click() on links + // Also prevent it if we're currently inside a leveraged native-event stack + _default: function( event ) { + var target = event.target; + return rcheckableType.test( target.type ) && + target.click && nodeName( target, "input" ) && + dataPriv.get( target, "click" ) || + nodeName( target, "a" ); + } + }, + + beforeunload: { + postDispatch: function( event ) { + + // Support: Firefox 20+ + // Firefox doesn't alert if the returnValue field is not set. + if ( event.result !== undefined && event.originalEvent ) { + event.originalEvent.returnValue = event.result; + } + } + } + } +}; + +// Ensure the presence of an event listener that handles manually-triggered +// synthetic events by interrupting progress until reinvoked in response to +// *native* events that it fires directly, ensuring that state changes have +// already occurred before other listeners are invoked. +function leverageNative( el, type, expectSync ) { + + // Missing expectSync indicates a trigger call, which must force setup through jQuery.event.add + if ( !expectSync ) { + if ( dataPriv.get( el, type ) === undefined ) { + jQuery.event.add( el, type, returnTrue ); + } + return; + } + + // Register the controller as a special universal handler for all event namespaces + dataPriv.set( el, type, false ); + jQuery.event.add( el, type, { + namespace: false, + handler: function( event ) { + var notAsync, result, + saved = dataPriv.get( this, type ); + + if ( ( event.isTrigger & 1 ) && this[ type ] ) { + + // Interrupt processing of the outer synthetic .trigger()ed event + // Saved data should be false in such cases, but might be a leftover capture object + // from an async native handler (gh-4350) + if ( !saved.length ) { + + // Store arguments for use when handling the inner native event + // There will always be at least one argument (an event object), so this array + // will not be confused with a leftover capture object. 
+ saved = slice.call( arguments ); + dataPriv.set( this, type, saved ); + + // Trigger the native event and capture its result + // Support: IE <=9 - 11+ + // focus() and blur() are asynchronous + notAsync = expectSync( this, type ); + this[ type ](); + result = dataPriv.get( this, type ); + if ( saved !== result || notAsync ) { + dataPriv.set( this, type, false ); + } else { + result = {}; + } + if ( saved !== result ) { + + // Cancel the outer synthetic event + event.stopImmediatePropagation(); + event.preventDefault(); + return result.value; + } + + // If this is an inner synthetic event for an event with a bubbling surrogate + // (focus or blur), assume that the surrogate already propagated from triggering the + // native event and prevent that from happening again here. + // This technically gets the ordering wrong w.r.t. to `.trigger()` (in which the + // bubbling surrogate propagates *after* the non-bubbling base), but that seems + // less bad than duplication. + } else if ( ( jQuery.event.special[ type ] || {} ).delegateType ) { + event.stopPropagation(); + } + + // If this is a native event triggered above, everything is now in order + // Fire an inner synthetic event with the original arguments + } else if ( saved.length ) { + + // ...and capture the result + dataPriv.set( this, type, { + value: jQuery.event.trigger( + + // Support: IE <=9 - 11+ + // Extend with the prototype to reset the above stopImmediatePropagation() + jQuery.extend( saved[ 0 ], jQuery.Event.prototype ), + saved.slice( 1 ), + this + ) + } ); + + // Abort handling of the native event + event.stopImmediatePropagation(); + } + } + } ); +} + +jQuery.removeEvent = function( elem, type, handle ) { + + // This "if" is needed for plain objects + if ( elem.removeEventListener ) { + elem.removeEventListener( type, handle ); + } +}; + +jQuery.Event = function( src, props ) { + + // Allow instantiation without the 'new' keyword + if ( !( this instanceof jQuery.Event ) ) { + return new jQuery.Event( src, props ); + } + + // Event object + if ( src && src.type ) { + this.originalEvent = src; + this.type = src.type; + + // Events bubbling up the document may have been marked as prevented + // by a handler lower down the tree; reflect the correct value. + this.isDefaultPrevented = src.defaultPrevented || + src.defaultPrevented === undefined && + + // Support: Android <=2.3 only + src.returnValue === false ? + returnTrue : + returnFalse; + + // Create target properties + // Support: Safari <=6 - 7 only + // Target should not be a text node (#504, #13143) + this.target = ( src.target && src.target.nodeType === 3 ) ? 
+ src.target.parentNode : + src.target; + + this.currentTarget = src.currentTarget; + this.relatedTarget = src.relatedTarget; + + // Event type + } else { + this.type = src; + } + + // Put explicitly provided properties onto the event object + if ( props ) { + jQuery.extend( this, props ); + } + + // Create a timestamp if incoming event doesn't have one + this.timeStamp = src && src.timeStamp || Date.now(); + + // Mark it as fixed + this[ jQuery.expando ] = true; +}; + +// jQuery.Event is based on DOM3 Events as specified by the ECMAScript Language Binding +// https://www.w3.org/TR/2003/WD-DOM-Level-3-Events-20030331/ecma-script-binding.html +jQuery.Event.prototype = { + constructor: jQuery.Event, + isDefaultPrevented: returnFalse, + isPropagationStopped: returnFalse, + isImmediatePropagationStopped: returnFalse, + isSimulated: false, + + preventDefault: function() { + var e = this.originalEvent; + + this.isDefaultPrevented = returnTrue; + + if ( e && !this.isSimulated ) { + e.preventDefault(); + } + }, + stopPropagation: function() { + var e = this.originalEvent; + + this.isPropagationStopped = returnTrue; + + if ( e && !this.isSimulated ) { + e.stopPropagation(); + } + }, + stopImmediatePropagation: function() { + var e = this.originalEvent; + + this.isImmediatePropagationStopped = returnTrue; + + if ( e && !this.isSimulated ) { + e.stopImmediatePropagation(); + } + + this.stopPropagation(); + } +}; + +// Includes all common event props including KeyEvent and MouseEvent specific props +jQuery.each( { + altKey: true, + bubbles: true, + cancelable: true, + changedTouches: true, + ctrlKey: true, + detail: true, + eventPhase: true, + metaKey: true, + pageX: true, + pageY: true, + shiftKey: true, + view: true, + "char": true, + code: true, + charCode: true, + key: true, + keyCode: true, + button: true, + buttons: true, + clientX: true, + clientY: true, + offsetX: true, + offsetY: true, + pointerId: true, + pointerType: true, + screenX: true, + screenY: true, + targetTouches: true, + toElement: true, + touches: true, + + which: function( event ) { + var button = event.button; + + // Add which for key events + if ( event.which == null && rkeyEvent.test( event.type ) ) { + return event.charCode != null ? event.charCode : event.keyCode; + } + + // Add which for click: 1 === left; 2 === middle; 3 === right + if ( !event.which && button !== undefined && rmouseEvent.test( event.type ) ) { + if ( button & 1 ) { + return 1; + } + + if ( button & 2 ) { + return 3; + } + + if ( button & 4 ) { + return 2; + } + + return 0; + } + + return event.which; + } +}, jQuery.event.addProp ); + +jQuery.each( { focus: "focusin", blur: "focusout" }, function( type, delegateType ) { + jQuery.event.special[ type ] = { + + // Utilize native event if possible so blur/focus sequence is correct + setup: function() { + + // Claim the first handler + // dataPriv.set( this, "focus", ... ) + // dataPriv.set( this, "blur", ... ) + leverageNative( this, type, expectSync ); + + // Return false to allow normal processing in the caller + return false; + }, + trigger: function() { + + // Force setup before trigger + leverageNative( this, type ); + + // Return non-false to allow normal event-path propagation + return true; + }, + + delegateType: delegateType + }; +} ); + +// Create mouseenter/leave events using mouseover/out and event-time checks +// so that event delegation works in jQuery. 
+ +// Do the same for pointerenter/pointerleave and pointerover/pointerout +// +// Support: Safari 7 only +// Safari sends mouseenter too often; see: +// https://bugs.chromium.org/p/chromium/issues/detail?id=470258 +// for the description of the bug (it existed in older Chrome versions as well). +jQuery.each( { + mouseenter: "mouseover", + mouseleave: "mouseout", + pointerenter: "pointerover", + pointerleave: "pointerout" +}, function( orig, fix ) { + jQuery.event.special[ orig ] = { + delegateType: fix, + bindType: fix, + + handle: function( event ) { + var ret, + target = this, + related = event.relatedTarget, + handleObj = event.handleObj; + + // For mouseenter/leave call the handler if related is outside the target. + // NB: No relatedTarget if the mouse left/entered the browser window + if ( !related || ( related !== target && !jQuery.contains( target, related ) ) ) { + event.type = handleObj.origType; + ret = handleObj.handler.apply( this, arguments ); + event.type = fix; + } + return ret; + } + }; +} ); + +jQuery.fn.extend( { + + on: function( types, selector, data, fn ) { + return on( this, types, selector, data, fn ); + }, + one: function( types, selector, data, fn ) { + return on( this, types, selector, data, fn, 1 ); + }, + off: function( types, selector, fn ) { + var handleObj, type; + if ( types && types.preventDefault && types.handleObj ) { + + // ( event ) dispatched jQuery.Event + handleObj = types.handleObj; + jQuery( types.delegateTarget ).off( + handleObj.namespace ? + handleObj.origType + "." + handleObj.namespace : + handleObj.origType, + handleObj.selector, + handleObj.handler + ); + return this; + } + if ( typeof types === "object" ) { + + // ( types-object [, selector] ) + for ( type in types ) { + this.off( type, selector, types[ type ] ); + } + return this; + } + if ( selector === false || typeof selector === "function" ) { + + // ( types [, fn] ) + fn = selector; + selector = undefined; + } + if ( fn === false ) { + fn = returnFalse; + } + return this.each( function() { + jQuery.event.remove( this, types, fn, selector ); + } ); + } +} ); + + +var + + // Support: IE <=10 - 11, Edge 12 - 13 only + // In IE/Edge using regex groups here causes severe slowdowns. + // See https://connect.microsoft.com/IE/feedback/details/1736512/ + rnoInnerhtml = /<script|<style|<link/i, + + // checked="checked" or checked + rchecked = /checked\s*(?:[^=]|=\s*.checked.)/i, + + rcleanScript = /^\s*<!(?:\[CDATA\[|--)|(?:\]\]|--)>\s*$/g; + +// Prefer a tbody over its parent table for containing new rows +function manipulationTarget( elem, content ) { + if ( nodeName( elem, "table" ) && + nodeName( content.nodeType !== 11 ? content : content.firstChild, "tr" ) ) { + + return jQuery( elem ).children( "tbody" )[ 0 ] || elem; + } + + return elem; +} + +// Replace/restore the type attribute of script elements for safe DOM manipulation +function disableScript( elem ) { + elem.type = ( elem.getAttribute( "type" ) !== null ) + "/" + elem.type; + return elem; +} +function restoreScript( elem ) { + if ( ( elem.type || "" ).slice( 0, 5 ) === "true/" ) { + elem.type = elem.type.slice( 5 ); + } else { + elem.removeAttribute( "type" ); + } + + return elem; +} + +function cloneCopyEvent( src, dest ) { + var i, l, type, pdataOld, udataOld, udataCur, events; + + if ( dest.nodeType !== 1 ) { + return; + } + + // 1. Copy private data: events, handlers, etc. 
+ if ( dataPriv.hasData( src ) ) { + pdataOld = dataPriv.get( src ); + events = pdataOld.events; + + if ( events ) { + dataPriv.remove( dest, "handle events" ); + + for ( type in events ) { + for ( i = 0, l = events[ type ].length; i < l; i++ ) { + jQuery.event.add( dest, type, events[ type ][ i ] ); + } + } + } + } + + // 2. Copy user data + if ( dataUser.hasData( src ) ) { + udataOld = dataUser.access( src ); + udataCur = jQuery.extend( {}, udataOld ); + + dataUser.set( dest, udataCur ); + } +} + +// Fix IE bugs, see support tests +function fixInput( src, dest ) { + var nodeName = dest.nodeName.toLowerCase(); + + // Fails to persist the checked state of a cloned checkbox or radio button. + if ( nodeName === "input" && rcheckableType.test( src.type ) ) { + dest.checked = src.checked; + + // Fails to return the selected option to the default selected state when cloning options + } else if ( nodeName === "input" || nodeName === "textarea" ) { + dest.defaultValue = src.defaultValue; + } +} + +function domManip( collection, args, callback, ignored ) { + + // Flatten any nested arrays + args = flat( args ); + + var fragment, first, scripts, hasScripts, node, doc, + i = 0, + l = collection.length, + iNoClone = l - 1, + value = args[ 0 ], + valueIsFunction = isFunction( value ); + + // We can't cloneNode fragments that contain checked, in WebKit + if ( valueIsFunction || + ( l > 1 && typeof value === "string" && + !support.checkClone && rchecked.test( value ) ) ) { + return collection.each( function( index ) { + var self = collection.eq( index ); + if ( valueIsFunction ) { + args[ 0 ] = value.call( this, index, self.html() ); + } + domManip( self, args, callback, ignored ); + } ); + } + + if ( l ) { + fragment = buildFragment( args, collection[ 0 ].ownerDocument, false, collection, ignored ); + first = fragment.firstChild; + + if ( fragment.childNodes.length === 1 ) { + fragment = first; + } + + // Require either new content or an interest in ignored elements to invoke the callback + if ( first || ignored ) { + scripts = jQuery.map( getAll( fragment, "script" ), disableScript ); + hasScripts = scripts.length; + + // Use the original fragment for the last item + // instead of the first because it can end up + // being emptied incorrectly in certain situations (#8070). 
+ for ( ; i < l; i++ ) { + node = fragment; + + if ( i !== iNoClone ) { + node = jQuery.clone( node, true, true ); + + // Keep references to cloned scripts for later restoration + if ( hasScripts ) { + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( scripts, getAll( node, "script" ) ); + } + } + + callback.call( collection[ i ], node, i ); + } + + if ( hasScripts ) { + doc = scripts[ scripts.length - 1 ].ownerDocument; + + // Reenable scripts + jQuery.map( scripts, restoreScript ); + + // Evaluate executable scripts on first document insertion + for ( i = 0; i < hasScripts; i++ ) { + node = scripts[ i ]; + if ( rscriptType.test( node.type || "" ) && + !dataPriv.access( node, "globalEval" ) && + jQuery.contains( doc, node ) ) { + + if ( node.src && ( node.type || "" ).toLowerCase() !== "module" ) { + + // Optional AJAX dependency, but won't run scripts if not present + if ( jQuery._evalUrl && !node.noModule ) { + jQuery._evalUrl( node.src, { + nonce: node.nonce || node.getAttribute( "nonce" ) + }, doc ); + } + } else { + DOMEval( node.textContent.replace( rcleanScript, "" ), node, doc ); + } + } + } + } + } + } + + return collection; +} + +function remove( elem, selector, keepData ) { + var node, + nodes = selector ? jQuery.filter( selector, elem ) : elem, + i = 0; + + for ( ; ( node = nodes[ i ] ) != null; i++ ) { + if ( !keepData && node.nodeType === 1 ) { + jQuery.cleanData( getAll( node ) ); + } + + if ( node.parentNode ) { + if ( keepData && isAttached( node ) ) { + setGlobalEval( getAll( node, "script" ) ); + } + node.parentNode.removeChild( node ); + } + } + + return elem; +} + +jQuery.extend( { + htmlPrefilter: function( html ) { + return html; + }, + + clone: function( elem, dataAndEvents, deepDataAndEvents ) { + var i, l, srcElements, destElements, + clone = elem.cloneNode( true ), + inPage = isAttached( elem ); + + // Fix IE cloning issues + if ( !support.noCloneChecked && ( elem.nodeType === 1 || elem.nodeType === 11 ) && + !jQuery.isXMLDoc( elem ) ) { + + // We eschew Sizzle here for performance reasons: https://jsperf.com/getall-vs-sizzle/2 + destElements = getAll( clone ); + srcElements = getAll( elem ); + + for ( i = 0, l = srcElements.length; i < l; i++ ) { + fixInput( srcElements[ i ], destElements[ i ] ); + } + } + + // Copy the events from the original to the clone + if ( dataAndEvents ) { + if ( deepDataAndEvents ) { + srcElements = srcElements || getAll( elem ); + destElements = destElements || getAll( clone ); + + for ( i = 0, l = srcElements.length; i < l; i++ ) { + cloneCopyEvent( srcElements[ i ], destElements[ i ] ); + } + } else { + cloneCopyEvent( elem, clone ); + } + } + + // Preserve script evaluation history + destElements = getAll( clone, "script" ); + if ( destElements.length > 0 ) { + setGlobalEval( destElements, !inPage && getAll( elem, "script" ) ); + } + + // Return the cloned set + return clone; + }, + + cleanData: function( elems ) { + var data, elem, type, + special = jQuery.event.special, + i = 0; + + for ( ; ( elem = elems[ i ] ) !== undefined; i++ ) { + if ( acceptData( elem ) ) { + if ( ( data = elem[ dataPriv.expando ] ) ) { + if ( data.events ) { + for ( type in data.events ) { + if ( special[ type ] ) { + jQuery.event.remove( elem, type ); + + // This is a shortcut to avoid jQuery.event.remove's overhead + } else { + jQuery.removeEvent( elem, type, data.handle ); + } + } + } + + // Support: Chrome <=35 - 45+ + // Assign undefined instead of using delete, see Data#remove 
+ elem[ dataPriv.expando ] = undefined; + } + if ( elem[ dataUser.expando ] ) { + + // Support: Chrome <=35 - 45+ + // Assign undefined instead of using delete, see Data#remove + elem[ dataUser.expando ] = undefined; + } + } + } + } +} ); + +jQuery.fn.extend( { + detach: function( selector ) { + return remove( this, selector, true ); + }, + + remove: function( selector ) { + return remove( this, selector ); + }, + + text: function( value ) { + return access( this, function( value ) { + return value === undefined ? + jQuery.text( this ) : + this.empty().each( function() { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + this.textContent = value; + } + } ); + }, null, value, arguments.length ); + }, + + append: function() { + return domManip( this, arguments, function( elem ) { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + var target = manipulationTarget( this, elem ); + target.appendChild( elem ); + } + } ); + }, + + prepend: function() { + return domManip( this, arguments, function( elem ) { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + var target = manipulationTarget( this, elem ); + target.insertBefore( elem, target.firstChild ); + } + } ); + }, + + before: function() { + return domManip( this, arguments, function( elem ) { + if ( this.parentNode ) { + this.parentNode.insertBefore( elem, this ); + } + } ); + }, + + after: function() { + return domManip( this, arguments, function( elem ) { + if ( this.parentNode ) { + this.parentNode.insertBefore( elem, this.nextSibling ); + } + } ); + }, + + empty: function() { + var elem, + i = 0; + + for ( ; ( elem = this[ i ] ) != null; i++ ) { + if ( elem.nodeType === 1 ) { + + // Prevent memory leaks + jQuery.cleanData( getAll( elem, false ) ); + + // Remove any remaining nodes + elem.textContent = ""; + } + } + + return this; + }, + + clone: function( dataAndEvents, deepDataAndEvents ) { + dataAndEvents = dataAndEvents == null ? false : dataAndEvents; + deepDataAndEvents = deepDataAndEvents == null ? 
dataAndEvents : deepDataAndEvents; + + return this.map( function() { + return jQuery.clone( this, dataAndEvents, deepDataAndEvents ); + } ); + }, + + html: function( value ) { + return access( this, function( value ) { + var elem = this[ 0 ] || {}, + i = 0, + l = this.length; + + if ( value === undefined && elem.nodeType === 1 ) { + return elem.innerHTML; + } + + // See if we can take a shortcut and just use innerHTML + if ( typeof value === "string" && !rnoInnerhtml.test( value ) && + !wrapMap[ ( rtagName.exec( value ) || [ "", "" ] )[ 1 ].toLowerCase() ] ) { + + value = jQuery.htmlPrefilter( value ); + + try { + for ( ; i < l; i++ ) { + elem = this[ i ] || {}; + + // Remove element nodes and prevent memory leaks + if ( elem.nodeType === 1 ) { + jQuery.cleanData( getAll( elem, false ) ); + elem.innerHTML = value; + } + } + + elem = 0; + + // If using innerHTML throws an exception, use the fallback method + } catch ( e ) {} + } + + if ( elem ) { + this.empty().append( value ); + } + }, null, value, arguments.length ); + }, + + replaceWith: function() { + var ignored = []; + + // Make the changes, replacing each non-ignored context element with the new content + return domManip( this, arguments, function( elem ) { + var parent = this.parentNode; + + if ( jQuery.inArray( this, ignored ) < 0 ) { + jQuery.cleanData( getAll( this ) ); + if ( parent ) { + parent.replaceChild( elem, this ); + } + } + + // Force callback invocation + }, ignored ); + } +} ); + +jQuery.each( { + appendTo: "append", + prependTo: "prepend", + insertBefore: "before", + insertAfter: "after", + replaceAll: "replaceWith" +}, function( name, original ) { + jQuery.fn[ name ] = function( selector ) { + var elems, + ret = [], + insert = jQuery( selector ), + last = insert.length - 1, + i = 0; + + for ( ; i <= last; i++ ) { + elems = i === last ? this : this.clone( true ); + jQuery( insert[ i ] )[ original ]( elems ); + + // Support: Android <=4.0 only, PhantomJS 1 only + // .get() because push.apply(_, arraylike) throws on ancient WebKit + push.apply( ret, elems.get() ); + } + + return this.pushStack( ret ); + }; +} ); +var rnumnonpx = new RegExp( "^(" + pnum + ")(?!px)[a-z%]+$", "i" ); + +var getStyles = function( elem ) { + + // Support: IE <=11 only, Firefox <=30 (#15098, #14150) + // IE throws on elements created in popups + // FF meanwhile throws on frame elements through "defaultView.getComputedStyle" + var view = elem.ownerDocument.defaultView; + + if ( !view || !view.opener ) { + view = window; + } + + return view.getComputedStyle( elem ); + }; + +var swap = function( elem, options, callback ) { + var ret, name, + old = {}; + + // Remember the old values, and insert the new ones + for ( name in options ) { + old[ name ] = elem.style[ name ]; + elem.style[ name ] = options[ name ]; + } + + ret = callback.call( elem ); + + // Revert the old values + for ( name in options ) { + elem.style[ name ] = old[ name ]; + } + + return ret; +}; + + +var rboxStyle = new RegExp( cssExpand.join( "|" ), "i" ); + + + +( function() { + + // Executing both pixelPosition & boxSizingReliable tests require only one layout + // so they're executed at the same time to save the second computation. 
+ function computeStyleTests() { + + // This is a singleton, we need to execute it only once + if ( !div ) { + return; + } + + container.style.cssText = "position:absolute;left:-11111px;width:60px;" + + "margin-top:1px;padding:0;border:0"; + div.style.cssText = + "position:relative;display:block;box-sizing:border-box;overflow:scroll;" + + "margin:auto;border:1px;padding:1px;" + + "width:60%;top:1%"; + documentElement.appendChild( container ).appendChild( div ); + + var divStyle = window.getComputedStyle( div ); + pixelPositionVal = divStyle.top !== "1%"; + + // Support: Android 4.0 - 4.3 only, Firefox <=3 - 44 + reliableMarginLeftVal = roundPixelMeasures( divStyle.marginLeft ) === 12; + + // Support: Android 4.0 - 4.3 only, Safari <=9.1 - 10.1, iOS <=7.0 - 9.3 + // Some styles come back with percentage values, even though they shouldn't + div.style.right = "60%"; + pixelBoxStylesVal = roundPixelMeasures( divStyle.right ) === 36; + + // Support: IE 9 - 11 only + // Detect misreporting of content dimensions for box-sizing:border-box elements + boxSizingReliableVal = roundPixelMeasures( divStyle.width ) === 36; + + // Support: IE 9 only + // Detect overflow:scroll screwiness (gh-3699) + // Support: Chrome <=64 + // Don't get tricked when zoom affects offsetWidth (gh-4029) + div.style.position = "absolute"; + scrollboxSizeVal = roundPixelMeasures( div.offsetWidth / 3 ) === 12; + + documentElement.removeChild( container ); + + // Nullify the div so it wouldn't be stored in the memory and + // it will also be a sign that checks already performed + div = null; + } + + function roundPixelMeasures( measure ) { + return Math.round( parseFloat( measure ) ); + } + + var pixelPositionVal, boxSizingReliableVal, scrollboxSizeVal, pixelBoxStylesVal, + reliableTrDimensionsVal, reliableMarginLeftVal, + container = document.createElement( "div" ), + div = document.createElement( "div" ); + + // Finish early in limited (non-browser) environments + if ( !div.style ) { + return; + } + + // Support: IE <=9 - 11 only + // Style of cloned element affects source element cloned (#8908) + div.style.backgroundClip = "content-box"; + div.cloneNode( true ).style.backgroundClip = ""; + support.clearCloneStyle = div.style.backgroundClip === "content-box"; + + jQuery.extend( support, { + boxSizingReliable: function() { + computeStyleTests(); + return boxSizingReliableVal; + }, + pixelBoxStyles: function() { + computeStyleTests(); + return pixelBoxStylesVal; + }, + pixelPosition: function() { + computeStyleTests(); + return pixelPositionVal; + }, + reliableMarginLeft: function() { + computeStyleTests(); + return reliableMarginLeftVal; + }, + scrollboxSize: function() { + computeStyleTests(); + return scrollboxSizeVal; + }, + + // Support: IE 9 - 11+, Edge 15 - 18+ + // IE/Edge misreport `getComputedStyle` of table rows with width/height + // set in CSS while `offset*` properties report correct values. + // Behavior in IE 9 is more subtle than in newer versions & it passes + // some versions of this test; make sure not to make it pass there! 
+ reliableTrDimensions: function() { + var table, tr, trChild, trStyle; + if ( reliableTrDimensionsVal == null ) { + table = document.createElement( "table" ); + tr = document.createElement( "tr" ); + trChild = document.createElement( "div" ); + + table.style.cssText = "position:absolute;left:-11111px"; + tr.style.height = "1px"; + trChild.style.height = "9px"; + + documentElement + .appendChild( table ) + .appendChild( tr ) + .appendChild( trChild ); + + trStyle = window.getComputedStyle( tr ); + reliableTrDimensionsVal = parseInt( trStyle.height ) > 3; + + documentElement.removeChild( table ); + } + return reliableTrDimensionsVal; + } + } ); +} )(); + + +function curCSS( elem, name, computed ) { + var width, minWidth, maxWidth, ret, + + // Support: Firefox 51+ + // Retrieving style before computed somehow + // fixes an issue with getting wrong values + // on detached elements + style = elem.style; + + computed = computed || getStyles( elem ); + + // getPropertyValue is needed for: + // .css('filter') (IE 9 only, #12537) + // .css('--customProperty) (#3144) + if ( computed ) { + ret = computed.getPropertyValue( name ) || computed[ name ]; + + if ( ret === "" && !isAttached( elem ) ) { + ret = jQuery.style( elem, name ); + } + + // A tribute to the "awesome hack by Dean Edwards" + // Android Browser returns percentage for some values, + // but width seems to be reliably pixels. + // This is against the CSSOM draft spec: + // https://drafts.csswg.org/cssom/#resolved-values + if ( !support.pixelBoxStyles() && rnumnonpx.test( ret ) && rboxStyle.test( name ) ) { + + // Remember the original values + width = style.width; + minWidth = style.minWidth; + maxWidth = style.maxWidth; + + // Put in the new values to get a computed value out + style.minWidth = style.maxWidth = style.width = ret; + ret = computed.width; + + // Revert the changed values + style.width = width; + style.minWidth = minWidth; + style.maxWidth = maxWidth; + } + } + + return ret !== undefined ? + + // Support: IE <=9 - 11 only + // IE returns zIndex value as an integer. + ret + "" : + ret; +} + + +function addGetHookIf( conditionFn, hookFn ) { + + // Define the hook, we'll check on the first run if it's really needed. + return { + get: function() { + if ( conditionFn() ) { + + // Hook not needed (or it's not possible to use it due + // to missing dependency), remove it. + delete this.get; + return; + } + + // Hook needed; redefine it so that the support test is not executed again. 
+ return ( this.get = hookFn ).apply( this, arguments ); + } + }; +} + + +var cssPrefixes = [ "Webkit", "Moz", "ms" ], + emptyStyle = document.createElement( "div" ).style, + vendorProps = {}; + +// Return a vendor-prefixed property or undefined +function vendorPropName( name ) { + + // Check for vendor prefixed names + var capName = name[ 0 ].toUpperCase() + name.slice( 1 ), + i = cssPrefixes.length; + + while ( i-- ) { + name = cssPrefixes[ i ] + capName; + if ( name in emptyStyle ) { + return name; + } + } +} + +// Return a potentially-mapped jQuery.cssProps or vendor prefixed property +function finalPropName( name ) { + var final = jQuery.cssProps[ name ] || vendorProps[ name ]; + + if ( final ) { + return final; + } + if ( name in emptyStyle ) { + return name; + } + return vendorProps[ name ] = vendorPropName( name ) || name; +} + + +var + + // Swappable if display is none or starts with table + // except "table", "table-cell", or "table-caption" + // See here for display values: https://developer.mozilla.org/en-US/docs/CSS/display + rdisplayswap = /^(none|table(?!-c[ea]).+)/, + rcustomProp = /^--/, + cssShow = { position: "absolute", visibility: "hidden", display: "block" }, + cssNormalTransform = { + letterSpacing: "0", + fontWeight: "400" + }; + +function setPositiveNumber( _elem, value, subtract ) { + + // Any relative (+/-) values have already been + // normalized at this point + var matches = rcssNum.exec( value ); + return matches ? + + // Guard against undefined "subtract", e.g., when used as in cssHooks + Math.max( 0, matches[ 2 ] - ( subtract || 0 ) ) + ( matches[ 3 ] || "px" ) : + value; +} + +function boxModelAdjustment( elem, dimension, box, isBorderBox, styles, computedVal ) { + var i = dimension === "width" ? 1 : 0, + extra = 0, + delta = 0; + + // Adjustment may not be necessary + if ( box === ( isBorderBox ? 
"border" : "content" ) ) { + return 0; + } + + for ( ; i < 4; i += 2 ) { + + // Both box models exclude margin + if ( box === "margin" ) { + delta += jQuery.css( elem, box + cssExpand[ i ], true, styles ); + } + + // If we get here with a content-box, we're seeking "padding" or "border" or "margin" + if ( !isBorderBox ) { + + // Add padding + delta += jQuery.css( elem, "padding" + cssExpand[ i ], true, styles ); + + // For "border" or "margin", add border + if ( box !== "padding" ) { + delta += jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + + // But still keep track of it otherwise + } else { + extra += jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + } + + // If we get here with a border-box (content + padding + border), we're seeking "content" or + // "padding" or "margin" + } else { + + // For "content", subtract padding + if ( box === "content" ) { + delta -= jQuery.css( elem, "padding" + cssExpand[ i ], true, styles ); + } + + // For "content" or "padding", subtract border + if ( box !== "margin" ) { + delta -= jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + } + } + } + + // Account for positive content-box scroll gutter when requested by providing computedVal + if ( !isBorderBox && computedVal >= 0 ) { + + // offsetWidth/offsetHeight is a rounded sum of content, padding, scroll gutter, and border + // Assuming integer scroll gutter, subtract the rest and round down + delta += Math.max( 0, Math.ceil( + elem[ "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ) ] - + computedVal - + delta - + extra - + 0.5 + + // If offsetWidth/offsetHeight is unknown, then we can't determine content-box scroll gutter + // Use an explicit zero to avoid NaN (gh-3964) + ) ) || 0; + } + + return delta; +} + +function getWidthOrHeight( elem, dimension, extra ) { + + // Start with computed style + var styles = getStyles( elem ), + + // To avoid forcing a reflow, only fetch boxSizing if we need it (gh-4322). + // Fake content-box until we know it's needed to know the true value. + boxSizingNeeded = !support.boxSizingReliable() || extra, + isBorderBox = boxSizingNeeded && + jQuery.css( elem, "boxSizing", false, styles ) === "border-box", + valueIsBorderBox = isBorderBox, + + val = curCSS( elem, dimension, styles ), + offsetProp = "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ); + + // Support: Firefox <=54 + // Return a confounding non-pixel value or feign ignorance, as appropriate. + if ( rnumnonpx.test( val ) ) { + if ( !extra ) { + return val; + } + val = "auto"; + } + + + // Support: IE 9 - 11 only + // Use offsetWidth/offsetHeight for when box sizing is unreliable. + // In those cases, the computed value can be trusted to be border-box. + if ( ( !support.boxSizingReliable() && isBorderBox || + + // Support: IE 10 - 11+, Edge 15 - 18+ + // IE/Edge misreport `getComputedStyle` of table rows with width/height + // set in CSS while `offset*` properties report correct values. + // Interestingly, in some cases IE 9 doesn't suffer from this issue. 
+ !support.reliableTrDimensions() && nodeName( elem, "tr" ) || + + // Fall back to offsetWidth/offsetHeight when value is "auto" + // This happens for inline elements with no explicit setting (gh-3571) + val === "auto" || + + // Support: Android <=4.1 - 4.3 only + // Also use offsetWidth/offsetHeight for misreported inline dimensions (gh-3602) + !parseFloat( val ) && jQuery.css( elem, "display", false, styles ) === "inline" ) && + + // Make sure the element is visible & connected + elem.getClientRects().length ) { + + isBorderBox = jQuery.css( elem, "boxSizing", false, styles ) === "border-box"; + + // Where available, offsetWidth/offsetHeight approximate border box dimensions. + // Where not available (e.g., SVG), assume unreliable box-sizing and interpret the + // retrieved value as a content box dimension. + valueIsBorderBox = offsetProp in elem; + if ( valueIsBorderBox ) { + val = elem[ offsetProp ]; + } + } + + // Normalize "" and auto + val = parseFloat( val ) || 0; + + // Adjust for the element's box model + return ( val + + boxModelAdjustment( + elem, + dimension, + extra || ( isBorderBox ? "border" : "content" ), + valueIsBorderBox, + styles, + + // Provide the current computed size to request scroll gutter calculation (gh-3589) + val + ) + ) + "px"; +} + +jQuery.extend( { + + // Add in style property hooks for overriding the default + // behavior of getting and setting a style property + cssHooks: { + opacity: { + get: function( elem, computed ) { + if ( computed ) { + + // We should always get a number back from opacity + var ret = curCSS( elem, "opacity" ); + return ret === "" ? "1" : ret; + } + } + } + }, + + // Don't automatically add "px" to these possibly-unitless properties + cssNumber: { + "animationIterationCount": true, + "columnCount": true, + "fillOpacity": true, + "flexGrow": true, + "flexShrink": true, + "fontWeight": true, + "gridArea": true, + "gridColumn": true, + "gridColumnEnd": true, + "gridColumnStart": true, + "gridRow": true, + "gridRowEnd": true, + "gridRowStart": true, + "lineHeight": true, + "opacity": true, + "order": true, + "orphans": true, + "widows": true, + "zIndex": true, + "zoom": true + }, + + // Add in properties whose names you wish to fix before + // setting or getting the value + cssProps: {}, + + // Get and set the style property on a DOM Node + style: function( elem, name, value, extra ) { + + // Don't set styles on text and comment nodes + if ( !elem || elem.nodeType === 3 || elem.nodeType === 8 || !elem.style ) { + return; + } + + // Make sure that we're working with the right name + var ret, type, hooks, + origName = camelCase( name ), + isCustomProp = rcustomProp.test( name ), + style = elem.style; + + // Make sure that we're working with the right name. We don't + // want to query the value if it is a CSS custom property + // since they are user-defined. 
+ if ( !isCustomProp ) { + name = finalPropName( origName ); + } + + // Gets hook for the prefixed version, then unprefixed version + hooks = jQuery.cssHooks[ name ] || jQuery.cssHooks[ origName ]; + + // Check if we're setting a value + if ( value !== undefined ) { + type = typeof value; + + // Convert "+=" or "-=" to relative numbers (#7345) + if ( type === "string" && ( ret = rcssNum.exec( value ) ) && ret[ 1 ] ) { + value = adjustCSS( elem, name, ret ); + + // Fixes bug #9237 + type = "number"; + } + + // Make sure that null and NaN values aren't set (#7116) + if ( value == null || value !== value ) { + return; + } + + // If a number was passed in, add the unit (except for certain CSS properties) + // The isCustomProp check can be removed in jQuery 4.0 when we only auto-append + // "px" to a few hardcoded values. + if ( type === "number" && !isCustomProp ) { + value += ret && ret[ 3 ] || ( jQuery.cssNumber[ origName ] ? "" : "px" ); + } + + // background-* props affect original clone's values + if ( !support.clearCloneStyle && value === "" && name.indexOf( "background" ) === 0 ) { + style[ name ] = "inherit"; + } + + // If a hook was provided, use that value, otherwise just set the specified value + if ( !hooks || !( "set" in hooks ) || + ( value = hooks.set( elem, value, extra ) ) !== undefined ) { + + if ( isCustomProp ) { + style.setProperty( name, value ); + } else { + style[ name ] = value; + } + } + + } else { + + // If a hook was provided get the non-computed value from there + if ( hooks && "get" in hooks && + ( ret = hooks.get( elem, false, extra ) ) !== undefined ) { + + return ret; + } + + // Otherwise just get the value from the style object + return style[ name ]; + } + }, + + css: function( elem, name, extra, styles ) { + var val, num, hooks, + origName = camelCase( name ), + isCustomProp = rcustomProp.test( name ); + + // Make sure that we're working with the right name. We don't + // want to modify the value if it is a CSS custom property + // since they are user-defined. + if ( !isCustomProp ) { + name = finalPropName( origName ); + } + + // Try prefixed name followed by the unprefixed name + hooks = jQuery.cssHooks[ name ] || jQuery.cssHooks[ origName ]; + + // If a hook was provided get the computed value from there + if ( hooks && "get" in hooks ) { + val = hooks.get( elem, true, extra ); + } + + // Otherwise, if a way to get the computed value exists, use that + if ( val === undefined ) { + val = curCSS( elem, name, styles ); + } + + // Convert "normal" to computed value + if ( val === "normal" && name in cssNormalTransform ) { + val = cssNormalTransform[ name ]; + } + + // Make numeric if forced or a qualifier was provided and val looks numeric + if ( extra === "" || extra ) { + num = parseFloat( val ); + return extra === true || isFinite( num ) ? num || 0 : val; + } + + return val; + } +} ); + +jQuery.each( [ "height", "width" ], function( _i, dimension ) { + jQuery.cssHooks[ dimension ] = { + get: function( elem, computed, extra ) { + if ( computed ) { + + // Certain elements can have dimension info if we invisibly show them + // but it must have a current display style that would benefit + return rdisplayswap.test( jQuery.css( elem, "display" ) ) && + + // Support: Safari 8+ + // Table columns in Safari have non-zero offsetWidth & zero + // getBoundingClientRect().width unless display is changed. + // Support: IE <=11 only + // Running getBoundingClientRect on a disconnected node + // in IE throws an error. 
+ ( !elem.getClientRects().length || !elem.getBoundingClientRect().width ) ? + swap( elem, cssShow, function() { + return getWidthOrHeight( elem, dimension, extra ); + } ) : + getWidthOrHeight( elem, dimension, extra ); + } + }, + + set: function( elem, value, extra ) { + var matches, + styles = getStyles( elem ), + + // Only read styles.position if the test has a chance to fail + // to avoid forcing a reflow. + scrollboxSizeBuggy = !support.scrollboxSize() && + styles.position === "absolute", + + // To avoid forcing a reflow, only fetch boxSizing if we need it (gh-3991) + boxSizingNeeded = scrollboxSizeBuggy || extra, + isBorderBox = boxSizingNeeded && + jQuery.css( elem, "boxSizing", false, styles ) === "border-box", + subtract = extra ? + boxModelAdjustment( + elem, + dimension, + extra, + isBorderBox, + styles + ) : + 0; + + // Account for unreliable border-box dimensions by comparing offset* to computed and + // faking a content-box to get border and padding (gh-3699) + if ( isBorderBox && scrollboxSizeBuggy ) { + subtract -= Math.ceil( + elem[ "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ) ] - + parseFloat( styles[ dimension ] ) - + boxModelAdjustment( elem, dimension, "border", false, styles ) - + 0.5 + ); + } + + // Convert to pixels if value adjustment is needed + if ( subtract && ( matches = rcssNum.exec( value ) ) && + ( matches[ 3 ] || "px" ) !== "px" ) { + + elem.style[ dimension ] = value; + value = jQuery.css( elem, dimension ); + } + + return setPositiveNumber( elem, value, subtract ); + } + }; +} ); + +jQuery.cssHooks.marginLeft = addGetHookIf( support.reliableMarginLeft, + function( elem, computed ) { + if ( computed ) { + return ( parseFloat( curCSS( elem, "marginLeft" ) ) || + elem.getBoundingClientRect().left - + swap( elem, { marginLeft: 0 }, function() { + return elem.getBoundingClientRect().left; + } ) + ) + "px"; + } + } +); + +// These hooks are used by animate to expand properties +jQuery.each( { + margin: "", + padding: "", + border: "Width" +}, function( prefix, suffix ) { + jQuery.cssHooks[ prefix + suffix ] = { + expand: function( value ) { + var i = 0, + expanded = {}, + + // Assumes a single number if not a string + parts = typeof value === "string" ? value.split( " " ) : [ value ]; + + for ( ; i < 4; i++ ) { + expanded[ prefix + cssExpand[ i ] + suffix ] = + parts[ i ] || parts[ i - 2 ] || parts[ 0 ]; + } + + return expanded; + } + }; + + if ( prefix !== "margin" ) { + jQuery.cssHooks[ prefix + suffix ].set = setPositiveNumber; + } +} ); + +jQuery.fn.extend( { + css: function( name, value ) { + return access( this, function( elem, name, value ) { + var styles, len, + map = {}, + i = 0; + + if ( Array.isArray( name ) ) { + styles = getStyles( elem ); + len = name.length; + + for ( ; i < len; i++ ) { + map[ name[ i ] ] = jQuery.css( elem, name[ i ], false, styles ); + } + + return map; + } + + return value !== undefined ? + jQuery.style( elem, name, value ) : + jQuery.css( elem, name ); + }, name, value, arguments.length > 1 ); + } +} ); + + +function Tween( elem, options, prop, end, easing ) { + return new Tween.prototype.init( elem, options, prop, end, easing ); +} +jQuery.Tween = Tween; + +Tween.prototype = { + constructor: Tween, + init: function( elem, options, prop, end, easing, unit ) { + this.elem = elem; + this.prop = prop; + this.easing = easing || jQuery.easing._default; + this.options = options; + this.start = this.now = this.cur(); + this.end = end; + this.unit = unit || ( jQuery.cssNumber[ prop ] ? 
"" : "px" ); + }, + cur: function() { + var hooks = Tween.propHooks[ this.prop ]; + + return hooks && hooks.get ? + hooks.get( this ) : + Tween.propHooks._default.get( this ); + }, + run: function( percent ) { + var eased, + hooks = Tween.propHooks[ this.prop ]; + + if ( this.options.duration ) { + this.pos = eased = jQuery.easing[ this.easing ]( + percent, this.options.duration * percent, 0, 1, this.options.duration + ); + } else { + this.pos = eased = percent; + } + this.now = ( this.end - this.start ) * eased + this.start; + + if ( this.options.step ) { + this.options.step.call( this.elem, this.now, this ); + } + + if ( hooks && hooks.set ) { + hooks.set( this ); + } else { + Tween.propHooks._default.set( this ); + } + return this; + } +}; + +Tween.prototype.init.prototype = Tween.prototype; + +Tween.propHooks = { + _default: { + get: function( tween ) { + var result; + + // Use a property on the element directly when it is not a DOM element, + // or when there is no matching style property that exists. + if ( tween.elem.nodeType !== 1 || + tween.elem[ tween.prop ] != null && tween.elem.style[ tween.prop ] == null ) { + return tween.elem[ tween.prop ]; + } + + // Passing an empty string as a 3rd parameter to .css will automatically + // attempt a parseFloat and fallback to a string if the parse fails. + // Simple values such as "10px" are parsed to Float; + // complex values such as "rotate(1rad)" are returned as-is. + result = jQuery.css( tween.elem, tween.prop, "" ); + + // Empty strings, null, undefined and "auto" are converted to 0. + return !result || result === "auto" ? 0 : result; + }, + set: function( tween ) { + + // Use step hook for back compat. + // Use cssHook if its there. + // Use .style if available and use plain properties where available. + if ( jQuery.fx.step[ tween.prop ] ) { + jQuery.fx.step[ tween.prop ]( tween ); + } else if ( tween.elem.nodeType === 1 && ( + jQuery.cssHooks[ tween.prop ] || + tween.elem.style[ finalPropName( tween.prop ) ] != null ) ) { + jQuery.style( tween.elem, tween.prop, tween.now + tween.unit ); + } else { + tween.elem[ tween.prop ] = tween.now; + } + } + } +}; + +// Support: IE <=9 only +// Panic based approach to setting things on disconnected nodes +Tween.propHooks.scrollTop = Tween.propHooks.scrollLeft = { + set: function( tween ) { + if ( tween.elem.nodeType && tween.elem.parentNode ) { + tween.elem[ tween.prop ] = tween.now; + } + } +}; + +jQuery.easing = { + linear: function( p ) { + return p; + }, + swing: function( p ) { + return 0.5 - Math.cos( p * Math.PI ) / 2; + }, + _default: "swing" +}; + +jQuery.fx = Tween.prototype.init; + +// Back compat <1.8 extension point +jQuery.fx.step = {}; + + + + +var + fxNow, inProgress, + rfxtypes = /^(?:toggle|show|hide)$/, + rrun = /queueHooks$/; + +function schedule() { + if ( inProgress ) { + if ( document.hidden === false && window.requestAnimationFrame ) { + window.requestAnimationFrame( schedule ); + } else { + window.setTimeout( schedule, jQuery.fx.interval ); + } + + jQuery.fx.tick(); + } +} + +// Animations created synchronously will run synchronously +function createFxNow() { + window.setTimeout( function() { + fxNow = undefined; + } ); + return ( fxNow = Date.now() ); +} + +// Generate parameters to create a standard animation +function genFx( type, includeWidth ) { + var which, + i = 0, + attrs = { height: type }; + + // If we include width, step value is 1 to do all cssExpand values, + // otherwise step value is 2 to skip over Left and Right + includeWidth = includeWidth ? 
1 : 0; + for ( ; i < 4; i += 2 - includeWidth ) { + which = cssExpand[ i ]; + attrs[ "margin" + which ] = attrs[ "padding" + which ] = type; + } + + if ( includeWidth ) { + attrs.opacity = attrs.width = type; + } + + return attrs; +} + +function createTween( value, prop, animation ) { + var tween, + collection = ( Animation.tweeners[ prop ] || [] ).concat( Animation.tweeners[ "*" ] ), + index = 0, + length = collection.length; + for ( ; index < length; index++ ) { + if ( ( tween = collection[ index ].call( animation, prop, value ) ) ) { + + // We're done with this property + return tween; + } + } +} + +function defaultPrefilter( elem, props, opts ) { + var prop, value, toggle, hooks, oldfire, propTween, restoreDisplay, display, + isBox = "width" in props || "height" in props, + anim = this, + orig = {}, + style = elem.style, + hidden = elem.nodeType && isHiddenWithinTree( elem ), + dataShow = dataPriv.get( elem, "fxshow" ); + + // Queue-skipping animations hijack the fx hooks + if ( !opts.queue ) { + hooks = jQuery._queueHooks( elem, "fx" ); + if ( hooks.unqueued == null ) { + hooks.unqueued = 0; + oldfire = hooks.empty.fire; + hooks.empty.fire = function() { + if ( !hooks.unqueued ) { + oldfire(); + } + }; + } + hooks.unqueued++; + + anim.always( function() { + + // Ensure the complete handler is called before this completes + anim.always( function() { + hooks.unqueued--; + if ( !jQuery.queue( elem, "fx" ).length ) { + hooks.empty.fire(); + } + } ); + } ); + } + + // Detect show/hide animations + for ( prop in props ) { + value = props[ prop ]; + if ( rfxtypes.test( value ) ) { + delete props[ prop ]; + toggle = toggle || value === "toggle"; + if ( value === ( hidden ? "hide" : "show" ) ) { + + // Pretend to be hidden if this is a "show" and + // there is still data from a stopped show/hide + if ( value === "show" && dataShow && dataShow[ prop ] !== undefined ) { + hidden = true; + + // Ignore all other no-op show/hide data + } else { + continue; + } + } + orig[ prop ] = dataShow && dataShow[ prop ] || jQuery.style( elem, prop ); + } + } + + // Bail out if this is a no-op like .hide().hide() + propTween = !jQuery.isEmptyObject( props ); + if ( !propTween && jQuery.isEmptyObject( orig ) ) { + return; + } + + // Restrict "overflow" and "display" styles during box animations + if ( isBox && elem.nodeType === 1 ) { + + // Support: IE <=9 - 11, Edge 12 - 15 + // Record all 3 overflow attributes because IE does not infer the shorthand + // from identically-valued overflowX and overflowY and Edge just mirrors + // the overflowX value there. 
+ opts.overflow = [ style.overflow, style.overflowX, style.overflowY ]; + + // Identify a display type, preferring old show/hide data over the CSS cascade + restoreDisplay = dataShow && dataShow.display; + if ( restoreDisplay == null ) { + restoreDisplay = dataPriv.get( elem, "display" ); + } + display = jQuery.css( elem, "display" ); + if ( display === "none" ) { + if ( restoreDisplay ) { + display = restoreDisplay; + } else { + + // Get nonempty value(s) by temporarily forcing visibility + showHide( [ elem ], true ); + restoreDisplay = elem.style.display || restoreDisplay; + display = jQuery.css( elem, "display" ); + showHide( [ elem ] ); + } + } + + // Animate inline elements as inline-block + if ( display === "inline" || display === "inline-block" && restoreDisplay != null ) { + if ( jQuery.css( elem, "float" ) === "none" ) { + + // Restore the original display value at the end of pure show/hide animations + if ( !propTween ) { + anim.done( function() { + style.display = restoreDisplay; + } ); + if ( restoreDisplay == null ) { + display = style.display; + restoreDisplay = display === "none" ? "" : display; + } + } + style.display = "inline-block"; + } + } + } + + if ( opts.overflow ) { + style.overflow = "hidden"; + anim.always( function() { + style.overflow = opts.overflow[ 0 ]; + style.overflowX = opts.overflow[ 1 ]; + style.overflowY = opts.overflow[ 2 ]; + } ); + } + + // Implement show/hide animations + propTween = false; + for ( prop in orig ) { + + // General show/hide setup for this element animation + if ( !propTween ) { + if ( dataShow ) { + if ( "hidden" in dataShow ) { + hidden = dataShow.hidden; + } + } else { + dataShow = dataPriv.access( elem, "fxshow", { display: restoreDisplay } ); + } + + // Store hidden/visible for toggle so `.stop().toggle()` "reverses" + if ( toggle ) { + dataShow.hidden = !hidden; + } + + // Show elements before animating them + if ( hidden ) { + showHide( [ elem ], true ); + } + + /* eslint-disable no-loop-func */ + + anim.done( function() { + + /* eslint-enable no-loop-func */ + + // The final step of a "hide" animation is actually hiding the element + if ( !hidden ) { + showHide( [ elem ] ); + } + dataPriv.remove( elem, "fxshow" ); + for ( prop in orig ) { + jQuery.style( elem, prop, orig[ prop ] ); + } + } ); + } + + // Per-property setup + propTween = createTween( hidden ? dataShow[ prop ] : 0, prop, anim ); + if ( !( prop in dataShow ) ) { + dataShow[ prop ] = propTween.start; + if ( hidden ) { + propTween.end = propTween.start; + propTween.start = 0; + } + } + } +} + +function propFilter( props, specialEasing ) { + var index, name, easing, value, hooks; + + // camelCase, specialEasing and expand cssHook pass + for ( index in props ) { + name = camelCase( index ); + easing = specialEasing[ name ]; + value = props[ index ]; + if ( Array.isArray( value ) ) { + easing = value[ 1 ]; + value = props[ index ] = value[ 0 ]; + } + + if ( index !== name ) { + props[ name ] = value; + delete props[ index ]; + } + + hooks = jQuery.cssHooks[ name ]; + if ( hooks && "expand" in hooks ) { + value = hooks.expand( value ); + delete props[ name ]; + + // Not quite $.extend, this won't overwrite existing keys. 
+ // Reusing 'index' because we have the correct "name" + for ( index in value ) { + if ( !( index in props ) ) { + props[ index ] = value[ index ]; + specialEasing[ index ] = easing; + } + } + } else { + specialEasing[ name ] = easing; + } + } +} + +function Animation( elem, properties, options ) { + var result, + stopped, + index = 0, + length = Animation.prefilters.length, + deferred = jQuery.Deferred().always( function() { + + // Don't match elem in the :animated selector + delete tick.elem; + } ), + tick = function() { + if ( stopped ) { + return false; + } + var currentTime = fxNow || createFxNow(), + remaining = Math.max( 0, animation.startTime + animation.duration - currentTime ), + + // Support: Android 2.3 only + // Archaic crash bug won't allow us to use `1 - ( 0.5 || 0 )` (#12497) + temp = remaining / animation.duration || 0, + percent = 1 - temp, + index = 0, + length = animation.tweens.length; + + for ( ; index < length; index++ ) { + animation.tweens[ index ].run( percent ); + } + + deferred.notifyWith( elem, [ animation, percent, remaining ] ); + + // If there's more to do, yield + if ( percent < 1 && length ) { + return remaining; + } + + // If this was an empty animation, synthesize a final progress notification + if ( !length ) { + deferred.notifyWith( elem, [ animation, 1, 0 ] ); + } + + // Resolve the animation and report its conclusion + deferred.resolveWith( elem, [ animation ] ); + return false; + }, + animation = deferred.promise( { + elem: elem, + props: jQuery.extend( {}, properties ), + opts: jQuery.extend( true, { + specialEasing: {}, + easing: jQuery.easing._default + }, options ), + originalProperties: properties, + originalOptions: options, + startTime: fxNow || createFxNow(), + duration: options.duration, + tweens: [], + createTween: function( prop, end ) { + var tween = jQuery.Tween( elem, animation.opts, prop, end, + animation.opts.specialEasing[ prop ] || animation.opts.easing ); + animation.tweens.push( tween ); + return tween; + }, + stop: function( gotoEnd ) { + var index = 0, + + // If we are going to the end, we want to run all the tweens + // otherwise we skip this part + length = gotoEnd ? 
animation.tweens.length : 0; + if ( stopped ) { + return this; + } + stopped = true; + for ( ; index < length; index++ ) { + animation.tweens[ index ].run( 1 ); + } + + // Resolve when we played the last frame; otherwise, reject + if ( gotoEnd ) { + deferred.notifyWith( elem, [ animation, 1, 0 ] ); + deferred.resolveWith( elem, [ animation, gotoEnd ] ); + } else { + deferred.rejectWith( elem, [ animation, gotoEnd ] ); + } + return this; + } + } ), + props = animation.props; + + propFilter( props, animation.opts.specialEasing ); + + for ( ; index < length; index++ ) { + result = Animation.prefilters[ index ].call( animation, elem, props, animation.opts ); + if ( result ) { + if ( isFunction( result.stop ) ) { + jQuery._queueHooks( animation.elem, animation.opts.queue ).stop = + result.stop.bind( result ); + } + return result; + } + } + + jQuery.map( props, createTween, animation ); + + if ( isFunction( animation.opts.start ) ) { + animation.opts.start.call( elem, animation ); + } + + // Attach callbacks from options + animation + .progress( animation.opts.progress ) + .done( animation.opts.done, animation.opts.complete ) + .fail( animation.opts.fail ) + .always( animation.opts.always ); + + jQuery.fx.timer( + jQuery.extend( tick, { + elem: elem, + anim: animation, + queue: animation.opts.queue + } ) + ); + + return animation; +} + +jQuery.Animation = jQuery.extend( Animation, { + + tweeners: { + "*": [ function( prop, value ) { + var tween = this.createTween( prop, value ); + adjustCSS( tween.elem, prop, rcssNum.exec( value ), tween ); + return tween; + } ] + }, + + tweener: function( props, callback ) { + if ( isFunction( props ) ) { + callback = props; + props = [ "*" ]; + } else { + props = props.match( rnothtmlwhite ); + } + + var prop, + index = 0, + length = props.length; + + for ( ; index < length; index++ ) { + prop = props[ index ]; + Animation.tweeners[ prop ] = Animation.tweeners[ prop ] || []; + Animation.tweeners[ prop ].unshift( callback ); + } + }, + + prefilters: [ defaultPrefilter ], + + prefilter: function( callback, prepend ) { + if ( prepend ) { + Animation.prefilters.unshift( callback ); + } else { + Animation.prefilters.push( callback ); + } + } +} ); + +jQuery.speed = function( speed, easing, fn ) { + var opt = speed && typeof speed === "object" ? 
jQuery.extend( {}, speed ) : { + complete: fn || !fn && easing || + isFunction( speed ) && speed, + duration: speed, + easing: fn && easing || easing && !isFunction( easing ) && easing + }; + + // Go to the end state if fx are off + if ( jQuery.fx.off ) { + opt.duration = 0; + + } else { + if ( typeof opt.duration !== "number" ) { + if ( opt.duration in jQuery.fx.speeds ) { + opt.duration = jQuery.fx.speeds[ opt.duration ]; + + } else { + opt.duration = jQuery.fx.speeds._default; + } + } + } + + // Normalize opt.queue - true/undefined/null -> "fx" + if ( opt.queue == null || opt.queue === true ) { + opt.queue = "fx"; + } + + // Queueing + opt.old = opt.complete; + + opt.complete = function() { + if ( isFunction( opt.old ) ) { + opt.old.call( this ); + } + + if ( opt.queue ) { + jQuery.dequeue( this, opt.queue ); + } + }; + + return opt; +}; + +jQuery.fn.extend( { + fadeTo: function( speed, to, easing, callback ) { + + // Show any hidden elements after setting opacity to 0 + return this.filter( isHiddenWithinTree ).css( "opacity", 0 ).show() + + // Animate to the value specified + .end().animate( { opacity: to }, speed, easing, callback ); + }, + animate: function( prop, speed, easing, callback ) { + var empty = jQuery.isEmptyObject( prop ), + optall = jQuery.speed( speed, easing, callback ), + doAnimation = function() { + + // Operate on a copy of prop so per-property easing won't be lost + var anim = Animation( this, jQuery.extend( {}, prop ), optall ); + + // Empty animations, or finishing resolves immediately + if ( empty || dataPriv.get( this, "finish" ) ) { + anim.stop( true ); + } + }; + doAnimation.finish = doAnimation; + + return empty || optall.queue === false ? + this.each( doAnimation ) : + this.queue( optall.queue, doAnimation ); + }, + stop: function( type, clearQueue, gotoEnd ) { + var stopQueue = function( hooks ) { + var stop = hooks.stop; + delete hooks.stop; + stop( gotoEnd ); + }; + + if ( typeof type !== "string" ) { + gotoEnd = clearQueue; + clearQueue = type; + type = undefined; + } + if ( clearQueue ) { + this.queue( type || "fx", [] ); + } + + return this.each( function() { + var dequeue = true, + index = type != null && type + "queueHooks", + timers = jQuery.timers, + data = dataPriv.get( this ); + + if ( index ) { + if ( data[ index ] && data[ index ].stop ) { + stopQueue( data[ index ] ); + } + } else { + for ( index in data ) { + if ( data[ index ] && data[ index ].stop && rrun.test( index ) ) { + stopQueue( data[ index ] ); + } + } + } + + for ( index = timers.length; index--; ) { + if ( timers[ index ].elem === this && + ( type == null || timers[ index ].queue === type ) ) { + + timers[ index ].anim.stop( gotoEnd ); + dequeue = false; + timers.splice( index, 1 ); + } + } + + // Start the next in the queue if the last step wasn't forced. + // Timers currently will call their complete callbacks, which + // will dequeue but only if they were gotoEnd. + if ( dequeue || !gotoEnd ) { + jQuery.dequeue( this, type ); + } + } ); + }, + finish: function( type ) { + if ( type !== false ) { + type = type || "fx"; + } + return this.each( function() { + var index, + data = dataPriv.get( this ), + queue = data[ type + "queue" ], + hooks = data[ type + "queueHooks" ], + timers = jQuery.timers, + length = queue ? 
queue.length : 0; + + // Enable finishing flag on private data + data.finish = true; + + // Empty the queue first + jQuery.queue( this, type, [] ); + + if ( hooks && hooks.stop ) { + hooks.stop.call( this, true ); + } + + // Look for any active animations, and finish them + for ( index = timers.length; index--; ) { + if ( timers[ index ].elem === this && timers[ index ].queue === type ) { + timers[ index ].anim.stop( true ); + timers.splice( index, 1 ); + } + } + + // Look for any animations in the old queue and finish them + for ( index = 0; index < length; index++ ) { + if ( queue[ index ] && queue[ index ].finish ) { + queue[ index ].finish.call( this ); + } + } + + // Turn off finishing flag + delete data.finish; + } ); + } +} ); + +jQuery.each( [ "toggle", "show", "hide" ], function( _i, name ) { + var cssFn = jQuery.fn[ name ]; + jQuery.fn[ name ] = function( speed, easing, callback ) { + return speed == null || typeof speed === "boolean" ? + cssFn.apply( this, arguments ) : + this.animate( genFx( name, true ), speed, easing, callback ); + }; +} ); + +// Generate shortcuts for custom animations +jQuery.each( { + slideDown: genFx( "show" ), + slideUp: genFx( "hide" ), + slideToggle: genFx( "toggle" ), + fadeIn: { opacity: "show" }, + fadeOut: { opacity: "hide" }, + fadeToggle: { opacity: "toggle" } +}, function( name, props ) { + jQuery.fn[ name ] = function( speed, easing, callback ) { + return this.animate( props, speed, easing, callback ); + }; +} ); + +jQuery.timers = []; +jQuery.fx.tick = function() { + var timer, + i = 0, + timers = jQuery.timers; + + fxNow = Date.now(); + + for ( ; i < timers.length; i++ ) { + timer = timers[ i ]; + + // Run the timer and safely remove it when done (allowing for external removal) + if ( !timer() && timers[ i ] === timer ) { + timers.splice( i--, 1 ); + } + } + + if ( !timers.length ) { + jQuery.fx.stop(); + } + fxNow = undefined; +}; + +jQuery.fx.timer = function( timer ) { + jQuery.timers.push( timer ); + jQuery.fx.start(); +}; + +jQuery.fx.interval = 13; +jQuery.fx.start = function() { + if ( inProgress ) { + return; + } + + inProgress = true; + schedule(); +}; + +jQuery.fx.stop = function() { + inProgress = null; +}; + +jQuery.fx.speeds = { + slow: 600, + fast: 200, + + // Default speed + _default: 400 +}; + + +// Based off of the plugin by Clint Helfers, with permission. +// https://web.archive.org/web/20100324014747/http://blindsignals.com/index.php/2009/07/jquery-delay/ +jQuery.fn.delay = function( time, type ) { + time = jQuery.fx ? 
jQuery.fx.speeds[ time ] || time : time; + type = type || "fx"; + + return this.queue( type, function( next, hooks ) { + var timeout = window.setTimeout( next, time ); + hooks.stop = function() { + window.clearTimeout( timeout ); + }; + } ); +}; + + +( function() { + var input = document.createElement( "input" ), + select = document.createElement( "select" ), + opt = select.appendChild( document.createElement( "option" ) ); + + input.type = "checkbox"; + + // Support: Android <=4.3 only + // Default value for a checkbox should be "on" + support.checkOn = input.value !== ""; + + // Support: IE <=11 only + // Must access selectedIndex to make default options select + support.optSelected = opt.selected; + + // Support: IE <=11 only + // An input loses its value after becoming a radio + input = document.createElement( "input" ); + input.value = "t"; + input.type = "radio"; + support.radioValue = input.value === "t"; +} )(); + + +var boolHook, + attrHandle = jQuery.expr.attrHandle; + +jQuery.fn.extend( { + attr: function( name, value ) { + return access( this, jQuery.attr, name, value, arguments.length > 1 ); + }, + + removeAttr: function( name ) { + return this.each( function() { + jQuery.removeAttr( this, name ); + } ); + } +} ); + +jQuery.extend( { + attr: function( elem, name, value ) { + var ret, hooks, + nType = elem.nodeType; + + // Don't get/set attributes on text, comment and attribute nodes + if ( nType === 3 || nType === 8 || nType === 2 ) { + return; + } + + // Fallback to prop when attributes are not supported + if ( typeof elem.getAttribute === "undefined" ) { + return jQuery.prop( elem, name, value ); + } + + // Attribute hooks are determined by the lowercase version + // Grab necessary hook if one is defined + if ( nType !== 1 || !jQuery.isXMLDoc( elem ) ) { + hooks = jQuery.attrHooks[ name.toLowerCase() ] || + ( jQuery.expr.match.bool.test( name ) ? boolHook : undefined ); + } + + if ( value !== undefined ) { + if ( value === null ) { + jQuery.removeAttr( elem, name ); + return; + } + + if ( hooks && "set" in hooks && + ( ret = hooks.set( elem, value, name ) ) !== undefined ) { + return ret; + } + + elem.setAttribute( name, value + "" ); + return value; + } + + if ( hooks && "get" in hooks && ( ret = hooks.get( elem, name ) ) !== null ) { + return ret; + } + + ret = jQuery.find.attr( elem, name ); + + // Non-existent attributes return null, we normalize to undefined + return ret == null ? 
undefined : ret; + }, + + attrHooks: { + type: { + set: function( elem, value ) { + if ( !support.radioValue && value === "radio" && + nodeName( elem, "input" ) ) { + var val = elem.value; + elem.setAttribute( "type", value ); + if ( val ) { + elem.value = val; + } + return value; + } + } + } + }, + + removeAttr: function( elem, value ) { + var name, + i = 0, + + // Attribute names can contain non-HTML whitespace characters + // https://html.spec.whatwg.org/multipage/syntax.html#attributes-2 + attrNames = value && value.match( rnothtmlwhite ); + + if ( attrNames && elem.nodeType === 1 ) { + while ( ( name = attrNames[ i++ ] ) ) { + elem.removeAttribute( name ); + } + } + } +} ); + +// Hooks for boolean attributes +boolHook = { + set: function( elem, value, name ) { + if ( value === false ) { + + // Remove boolean attributes when set to false + jQuery.removeAttr( elem, name ); + } else { + elem.setAttribute( name, name ); + } + return name; + } +}; + +jQuery.each( jQuery.expr.match.bool.source.match( /\w+/g ), function( _i, name ) { + var getter = attrHandle[ name ] || jQuery.find.attr; + + attrHandle[ name ] = function( elem, name, isXML ) { + var ret, handle, + lowercaseName = name.toLowerCase(); + + if ( !isXML ) { + + // Avoid an infinite loop by temporarily removing this function from the getter + handle = attrHandle[ lowercaseName ]; + attrHandle[ lowercaseName ] = ret; + ret = getter( elem, name, isXML ) != null ? + lowercaseName : + null; + attrHandle[ lowercaseName ] = handle; + } + return ret; + }; +} ); + + + + +var rfocusable = /^(?:input|select|textarea|button)$/i, + rclickable = /^(?:a|area)$/i; + +jQuery.fn.extend( { + prop: function( name, value ) { + return access( this, jQuery.prop, name, value, arguments.length > 1 ); + }, + + removeProp: function( name ) { + return this.each( function() { + delete this[ jQuery.propFix[ name ] || name ]; + } ); + } +} ); + +jQuery.extend( { + prop: function( elem, name, value ) { + var ret, hooks, + nType = elem.nodeType; + + // Don't get/set properties on text, comment and attribute nodes + if ( nType === 3 || nType === 8 || nType === 2 ) { + return; + } + + if ( nType !== 1 || !jQuery.isXMLDoc( elem ) ) { + + // Fix name and attach hooks + name = jQuery.propFix[ name ] || name; + hooks = jQuery.propHooks[ name ]; + } + + if ( value !== undefined ) { + if ( hooks && "set" in hooks && + ( ret = hooks.set( elem, value, name ) ) !== undefined ) { + return ret; + } + + return ( elem[ name ] = value ); + } + + if ( hooks && "get" in hooks && ( ret = hooks.get( elem, name ) ) !== null ) { + return ret; + } + + return elem[ name ]; + }, + + propHooks: { + tabIndex: { + get: function( elem ) { + + // Support: IE <=9 - 11 only + // elem.tabIndex doesn't always return the + // correct value when it hasn't been explicitly set + // https://web.archive.org/web/20141116233347/http://fluidproject.org/blog/2008/01/09/getting-setting-and-removing-tabindex-values-with-javascript/ + // Use proper attribute retrieval(#12072) + var tabindex = jQuery.find.attr( elem, "tabindex" ); + + if ( tabindex ) { + return parseInt( tabindex, 10 ); + } + + if ( + rfocusable.test( elem.nodeName ) || + rclickable.test( elem.nodeName ) && + elem.href + ) { + return 0; + } + + return -1; + } + } + }, + + propFix: { + "for": "htmlFor", + "class": "className" + } +} ); + +// Support: IE <=11 only +// Accessing the selectedIndex property +// forces the browser to respect setting selected +// on the option +// The getter ensures a default option is selected +// when in an 
optgroup +// eslint rule "no-unused-expressions" is disabled for this code +// since it considers such accessions noop +if ( !support.optSelected ) { + jQuery.propHooks.selected = { + get: function( elem ) { + + /* eslint no-unused-expressions: "off" */ + + var parent = elem.parentNode; + if ( parent && parent.parentNode ) { + parent.parentNode.selectedIndex; + } + return null; + }, + set: function( elem ) { + + /* eslint no-unused-expressions: "off" */ + + var parent = elem.parentNode; + if ( parent ) { + parent.selectedIndex; + + if ( parent.parentNode ) { + parent.parentNode.selectedIndex; + } + } + } + }; +} + +jQuery.each( [ + "tabIndex", + "readOnly", + "maxLength", + "cellSpacing", + "cellPadding", + "rowSpan", + "colSpan", + "useMap", + "frameBorder", + "contentEditable" +], function() { + jQuery.propFix[ this.toLowerCase() ] = this; +} ); + + + + + // Strip and collapse whitespace according to HTML spec + // https://infra.spec.whatwg.org/#strip-and-collapse-ascii-whitespace + function stripAndCollapse( value ) { + var tokens = value.match( rnothtmlwhite ) || []; + return tokens.join( " " ); + } + + +function getClass( elem ) { + return elem.getAttribute && elem.getAttribute( "class" ) || ""; +} + +function classesToArray( value ) { + if ( Array.isArray( value ) ) { + return value; + } + if ( typeof value === "string" ) { + return value.match( rnothtmlwhite ) || []; + } + return []; +} + +jQuery.fn.extend( { + addClass: function( value ) { + var classes, elem, cur, curValue, clazz, j, finalValue, + i = 0; + + if ( isFunction( value ) ) { + return this.each( function( j ) { + jQuery( this ).addClass( value.call( this, j, getClass( this ) ) ); + } ); + } + + classes = classesToArray( value ); + + if ( classes.length ) { + while ( ( elem = this[ i++ ] ) ) { + curValue = getClass( elem ); + cur = elem.nodeType === 1 && ( " " + stripAndCollapse( curValue ) + " " ); + + if ( cur ) { + j = 0; + while ( ( clazz = classes[ j++ ] ) ) { + if ( cur.indexOf( " " + clazz + " " ) < 0 ) { + cur += clazz + " "; + } + } + + // Only assign if different to avoid unneeded rendering. + finalValue = stripAndCollapse( cur ); + if ( curValue !== finalValue ) { + elem.setAttribute( "class", finalValue ); + } + } + } + } + + return this; + }, + + removeClass: function( value ) { + var classes, elem, cur, curValue, clazz, j, finalValue, + i = 0; + + if ( isFunction( value ) ) { + return this.each( function( j ) { + jQuery( this ).removeClass( value.call( this, j, getClass( this ) ) ); + } ); + } + + if ( !arguments.length ) { + return this.attr( "class", "" ); + } + + classes = classesToArray( value ); + + if ( classes.length ) { + while ( ( elem = this[ i++ ] ) ) { + curValue = getClass( elem ); + + // This expression is here for better compressibility (see addClass) + cur = elem.nodeType === 1 && ( " " + stripAndCollapse( curValue ) + " " ); + + if ( cur ) { + j = 0; + while ( ( clazz = classes[ j++ ] ) ) { + + // Remove *all* instances + while ( cur.indexOf( " " + clazz + " " ) > -1 ) { + cur = cur.replace( " " + clazz + " ", " " ); + } + } + + // Only assign if different to avoid unneeded rendering. + finalValue = stripAndCollapse( cur ); + if ( curValue !== finalValue ) { + elem.setAttribute( "class", finalValue ); + } + } + } + } + + return this; + }, + + toggleClass: function( value, stateVal ) { + var type = typeof value, + isValidValue = type === "string" || Array.isArray( value ); + + if ( typeof stateVal === "boolean" && isValidValue ) { + return stateVal ? 
this.addClass( value ) : this.removeClass( value ); + } + + if ( isFunction( value ) ) { + return this.each( function( i ) { + jQuery( this ).toggleClass( + value.call( this, i, getClass( this ), stateVal ), + stateVal + ); + } ); + } + + return this.each( function() { + var className, i, self, classNames; + + if ( isValidValue ) { + + // Toggle individual class names + i = 0; + self = jQuery( this ); + classNames = classesToArray( value ); + + while ( ( className = classNames[ i++ ] ) ) { + + // Check each className given, space separated list + if ( self.hasClass( className ) ) { + self.removeClass( className ); + } else { + self.addClass( className ); + } + } + + // Toggle whole class name + } else if ( value === undefined || type === "boolean" ) { + className = getClass( this ); + if ( className ) { + + // Store className if set + dataPriv.set( this, "__className__", className ); + } + + // If the element has a class name or if we're passed `false`, + // then remove the whole classname (if there was one, the above saved it). + // Otherwise bring back whatever was previously saved (if anything), + // falling back to the empty string if nothing was stored. + if ( this.setAttribute ) { + this.setAttribute( "class", + className || value === false ? + "" : + dataPriv.get( this, "__className__" ) || "" + ); + } + } + } ); + }, + + hasClass: function( selector ) { + var className, elem, + i = 0; + + className = " " + selector + " "; + while ( ( elem = this[ i++ ] ) ) { + if ( elem.nodeType === 1 && + ( " " + stripAndCollapse( getClass( elem ) ) + " " ).indexOf( className ) > -1 ) { + return true; + } + } + + return false; + } +} ); + + + + +var rreturn = /\r/g; + +jQuery.fn.extend( { + val: function( value ) { + var hooks, ret, valueIsFunction, + elem = this[ 0 ]; + + if ( !arguments.length ) { + if ( elem ) { + hooks = jQuery.valHooks[ elem.type ] || + jQuery.valHooks[ elem.nodeName.toLowerCase() ]; + + if ( hooks && + "get" in hooks && + ( ret = hooks.get( elem, "value" ) ) !== undefined + ) { + return ret; + } + + ret = elem.value; + + // Handle most common string cases + if ( typeof ret === "string" ) { + return ret.replace( rreturn, "" ); + } + + // Handle cases where value is null/undef or number + return ret == null ? "" : ret; + } + + return; + } + + valueIsFunction = isFunction( value ); + + return this.each( function( i ) { + var val; + + if ( this.nodeType !== 1 ) { + return; + } + + if ( valueIsFunction ) { + val = value.call( this, i, jQuery( this ).val() ); + } else { + val = value; + } + + // Treat null/undefined as ""; convert numbers to string + if ( val == null ) { + val = ""; + + } else if ( typeof val === "number" ) { + val += ""; + + } else if ( Array.isArray( val ) ) { + val = jQuery.map( val, function( value ) { + return value == null ? "" : value + ""; + } ); + } + + hooks = jQuery.valHooks[ this.type ] || jQuery.valHooks[ this.nodeName.toLowerCase() ]; + + // If set returns undefined, fall back to normal setting + if ( !hooks || !( "set" in hooks ) || hooks.set( this, val, "value" ) === undefined ) { + this.value = val; + } + } ); + } +} ); + +jQuery.extend( { + valHooks: { + option: { + get: function( elem ) { + + var val = jQuery.find.attr( elem, "value" ); + return val != null ? 
+ val : + + // Support: IE <=10 - 11 only + // option.text throws exceptions (#14686, #14858) + // Strip and collapse whitespace + // https://html.spec.whatwg.org/#strip-and-collapse-whitespace + stripAndCollapse( jQuery.text( elem ) ); + } + }, + select: { + get: function( elem ) { + var value, option, i, + options = elem.options, + index = elem.selectedIndex, + one = elem.type === "select-one", + values = one ? null : [], + max = one ? index + 1 : options.length; + + if ( index < 0 ) { + i = max; + + } else { + i = one ? index : 0; + } + + // Loop through all the selected options + for ( ; i < max; i++ ) { + option = options[ i ]; + + // Support: IE <=9 only + // IE8-9 doesn't update selected after form reset (#2551) + if ( ( option.selected || i === index ) && + + // Don't return options that are disabled or in a disabled optgroup + !option.disabled && + ( !option.parentNode.disabled || + !nodeName( option.parentNode, "optgroup" ) ) ) { + + // Get the specific value for the option + value = jQuery( option ).val(); + + // We don't need an array for one selects + if ( one ) { + return value; + } + + // Multi-Selects return an array + values.push( value ); + } + } + + return values; + }, + + set: function( elem, value ) { + var optionSet, option, + options = elem.options, + values = jQuery.makeArray( value ), + i = options.length; + + while ( i-- ) { + option = options[ i ]; + + /* eslint-disable no-cond-assign */ + + if ( option.selected = + jQuery.inArray( jQuery.valHooks.option.get( option ), values ) > -1 + ) { + optionSet = true; + } + + /* eslint-enable no-cond-assign */ + } + + // Force browsers to behave consistently when non-matching value is set + if ( !optionSet ) { + elem.selectedIndex = -1; + } + return values; + } + } + } +} ); + +// Radios and checkboxes getter/setter +jQuery.each( [ "radio", "checkbox" ], function() { + jQuery.valHooks[ this ] = { + set: function( elem, value ) { + if ( Array.isArray( value ) ) { + return ( elem.checked = jQuery.inArray( jQuery( elem ).val(), value ) > -1 ); + } + } + }; + if ( !support.checkOn ) { + jQuery.valHooks[ this ].get = function( elem ) { + return elem.getAttribute( "value" ) === null ? "on" : elem.value; + }; + } +} ); + + + + +// Return jQuery for attributes-only inclusion + + +support.focusin = "onfocusin" in window; + + +var rfocusMorph = /^(?:focusinfocus|focusoutblur)$/, + stopPropagationCallback = function( e ) { + e.stopPropagation(); + }; + +jQuery.extend( jQuery.event, { + + trigger: function( event, data, elem, onlyHandlers ) { + + var i, cur, tmp, bubbleType, ontype, handle, special, lastElement, + eventPath = [ elem || document ], + type = hasOwn.call( event, "type" ) ? event.type : event, + namespaces = hasOwn.call( event, "namespace" ) ? event.namespace.split( "." ) : []; + + cur = lastElement = tmp = elem = elem || document; + + // Don't do events on text and comment nodes + if ( elem.nodeType === 3 || elem.nodeType === 8 ) { + return; + } + + // focus/blur morphs to focusin/out; ensure we're not firing them right now + if ( rfocusMorph.test( type + jQuery.event.triggered ) ) { + return; + } + + if ( type.indexOf( "." ) > -1 ) { + + // Namespaced trigger; create a regexp to match event type in handle() + namespaces = type.split( "." ); + type = namespaces.shift(); + namespaces.sort(); + } + ontype = type.indexOf( ":" ) < 0 && "on" + type; + + // Caller can pass in a jQuery.Event object, Object, or just an event type string + event = event[ jQuery.expando ] ? 
+ event : + new jQuery.Event( type, typeof event === "object" && event ); + + // Trigger bitmask: & 1 for native handlers; & 2 for jQuery (always true) + event.isTrigger = onlyHandlers ? 2 : 3; + event.namespace = namespaces.join( "." ); + event.rnamespace = event.namespace ? + new RegExp( "(^|\\.)" + namespaces.join( "\\.(?:.*\\.|)" ) + "(\\.|$)" ) : + null; + + // Clean up the event in case it is being reused + event.result = undefined; + if ( !event.target ) { + event.target = elem; + } + + // Clone any incoming data and prepend the event, creating the handler arg list + data = data == null ? + [ event ] : + jQuery.makeArray( data, [ event ] ); + + // Allow special events to draw outside the lines + special = jQuery.event.special[ type ] || {}; + if ( !onlyHandlers && special.trigger && special.trigger.apply( elem, data ) === false ) { + return; + } + + // Determine event propagation path in advance, per W3C events spec (#9951) + // Bubble up to document, then to window; watch for a global ownerDocument var (#9724) + if ( !onlyHandlers && !special.noBubble && !isWindow( elem ) ) { + + bubbleType = special.delegateType || type; + if ( !rfocusMorph.test( bubbleType + type ) ) { + cur = cur.parentNode; + } + for ( ; cur; cur = cur.parentNode ) { + eventPath.push( cur ); + tmp = cur; + } + + // Only add window if we got to document (e.g., not plain obj or detached DOM) + if ( tmp === ( elem.ownerDocument || document ) ) { + eventPath.push( tmp.defaultView || tmp.parentWindow || window ); + } + } + + // Fire handlers on the event path + i = 0; + while ( ( cur = eventPath[ i++ ] ) && !event.isPropagationStopped() ) { + lastElement = cur; + event.type = i > 1 ? + bubbleType : + special.bindType || type; + + // jQuery handler + handle = ( + dataPriv.get( cur, "events" ) || Object.create( null ) + )[ event.type ] && + dataPriv.get( cur, "handle" ); + if ( handle ) { + handle.apply( cur, data ); + } + + // Native handler + handle = ontype && cur[ ontype ]; + if ( handle && handle.apply && acceptData( cur ) ) { + event.result = handle.apply( cur, data ); + if ( event.result === false ) { + event.preventDefault(); + } + } + } + event.type = type; + + // If nobody prevented the default action, do it now + if ( !onlyHandlers && !event.isDefaultPrevented() ) { + + if ( ( !special._default || + special._default.apply( eventPath.pop(), data ) === false ) && + acceptData( elem ) ) { + + // Call a native DOM method on the target with the same name as the event. 
+ // Don't do default actions on window, that's where global variables be (#6170) + if ( ontype && isFunction( elem[ type ] ) && !isWindow( elem ) ) { + + // Don't re-trigger an onFOO event when we call its FOO() method + tmp = elem[ ontype ]; + + if ( tmp ) { + elem[ ontype ] = null; + } + + // Prevent re-triggering of the same event, since we already bubbled it above + jQuery.event.triggered = type; + + if ( event.isPropagationStopped() ) { + lastElement.addEventListener( type, stopPropagationCallback ); + } + + elem[ type ](); + + if ( event.isPropagationStopped() ) { + lastElement.removeEventListener( type, stopPropagationCallback ); + } + + jQuery.event.triggered = undefined; + + if ( tmp ) { + elem[ ontype ] = tmp; + } + } + } + } + + return event.result; + }, + + // Piggyback on a donor event to simulate a different one + // Used only for `focus(in | out)` events + simulate: function( type, elem, event ) { + var e = jQuery.extend( + new jQuery.Event(), + event, + { + type: type, + isSimulated: true + } + ); + + jQuery.event.trigger( e, null, elem ); + } + +} ); + +jQuery.fn.extend( { + + trigger: function( type, data ) { + return this.each( function() { + jQuery.event.trigger( type, data, this ); + } ); + }, + triggerHandler: function( type, data ) { + var elem = this[ 0 ]; + if ( elem ) { + return jQuery.event.trigger( type, data, elem, true ); + } + } +} ); + + +// Support: Firefox <=44 +// Firefox doesn't have focus(in | out) events +// Related ticket - https://bugzilla.mozilla.org/show_bug.cgi?id=687787 +// +// Support: Chrome <=48 - 49, Safari <=9.0 - 9.1 +// focus(in | out) events fire after focus & blur events, +// which is spec violation - http://www.w3.org/TR/DOM-Level-3-Events/#events-focusevent-event-order +// Related ticket - https://bugs.chromium.org/p/chromium/issues/detail?id=449857 +if ( !support.focusin ) { + jQuery.each( { focus: "focusin", blur: "focusout" }, function( orig, fix ) { + + // Attach a single capturing handler on the document while someone wants focusin/focusout + var handler = function( event ) { + jQuery.event.simulate( fix, event.target, jQuery.event.fix( event ) ); + }; + + jQuery.event.special[ fix ] = { + setup: function() { + + // Handle: regular nodes (via `this.ownerDocument`), window + // (via `this.document`) & document (via `this`). + var doc = this.ownerDocument || this.document || this, + attaches = dataPriv.access( doc, fix ); + + if ( !attaches ) { + doc.addEventListener( orig, handler, true ); + } + dataPriv.access( doc, fix, ( attaches || 0 ) + 1 ); + }, + teardown: function() { + var doc = this.ownerDocument || this.document || this, + attaches = dataPriv.access( doc, fix ) - 1; + + if ( !attaches ) { + doc.removeEventListener( orig, handler, true ); + dataPriv.remove( doc, fix ); + + } else { + dataPriv.access( doc, fix, attaches ); + } + } + }; + } ); +} +var location = window.location; + +var nonce = { guid: Date.now() }; + +var rquery = ( /\?/ ); + + + +// Cross-browser xml parsing +jQuery.parseXML = function( data ) { + var xml; + if ( !data || typeof data !== "string" ) { + return null; + } + + // Support: IE 9 - 11 only + // IE throws on parseFromString with invalid input. 
+ try { + xml = ( new window.DOMParser() ).parseFromString( data, "text/xml" ); + } catch ( e ) { + xml = undefined; + } + + if ( !xml || xml.getElementsByTagName( "parsererror" ).length ) { + jQuery.error( "Invalid XML: " + data ); + } + return xml; +}; + + +var + rbracket = /\[\]$/, + rCRLF = /\r?\n/g, + rsubmitterTypes = /^(?:submit|button|image|reset|file)$/i, + rsubmittable = /^(?:input|select|textarea|keygen)/i; + +function buildParams( prefix, obj, traditional, add ) { + var name; + + if ( Array.isArray( obj ) ) { + + // Serialize array item. + jQuery.each( obj, function( i, v ) { + if ( traditional || rbracket.test( prefix ) ) { + + // Treat each array item as a scalar. + add( prefix, v ); + + } else { + + // Item is non-scalar (array or object), encode its numeric index. + buildParams( + prefix + "[" + ( typeof v === "object" && v != null ? i : "" ) + "]", + v, + traditional, + add + ); + } + } ); + + } else if ( !traditional && toType( obj ) === "object" ) { + + // Serialize object item. + for ( name in obj ) { + buildParams( prefix + "[" + name + "]", obj[ name ], traditional, add ); + } + + } else { + + // Serialize scalar item. + add( prefix, obj ); + } +} + +// Serialize an array of form elements or a set of +// key/values into a query string +jQuery.param = function( a, traditional ) { + var prefix, + s = [], + add = function( key, valueOrFunction ) { + + // If value is a function, invoke it and use its return value + var value = isFunction( valueOrFunction ) ? + valueOrFunction() : + valueOrFunction; + + s[ s.length ] = encodeURIComponent( key ) + "=" + + encodeURIComponent( value == null ? "" : value ); + }; + + if ( a == null ) { + return ""; + } + + // If an array was passed in, assume that it is an array of form elements. + if ( Array.isArray( a ) || ( a.jquery && !jQuery.isPlainObject( a ) ) ) { + + // Serialize the form elements + jQuery.each( a, function() { + add( this.name, this.value ); + } ); + + } else { + + // If traditional, encode the "old" way (the way 1.3.2 or older + // did it), otherwise encode params recursively. + for ( prefix in a ) { + buildParams( prefix, a[ prefix ], traditional, add ); + } + } + + // Return the resulting serialization + return s.join( "&" ); +}; + +jQuery.fn.extend( { + serialize: function() { + return jQuery.param( this.serializeArray() ); + }, + serializeArray: function() { + return this.map( function() { + + // Can add propHook for "elements" to filter or add form elements + var elements = jQuery.prop( this, "elements" ); + return elements ? 
jQuery.makeArray( elements ) : this; + } ) + .filter( function() { + var type = this.type; + + // Use .is( ":disabled" ) so that fieldset[disabled] works + return this.name && !jQuery( this ).is( ":disabled" ) && + rsubmittable.test( this.nodeName ) && !rsubmitterTypes.test( type ) && + ( this.checked || !rcheckableType.test( type ) ); + } ) + .map( function( _i, elem ) { + var val = jQuery( this ).val(); + + if ( val == null ) { + return null; + } + + if ( Array.isArray( val ) ) { + return jQuery.map( val, function( val ) { + return { name: elem.name, value: val.replace( rCRLF, "\r\n" ) }; + } ); + } + + return { name: elem.name, value: val.replace( rCRLF, "\r\n" ) }; + } ).get(); + } +} ); + + +var + r20 = /%20/g, + rhash = /#.*$/, + rantiCache = /([?&])_=[^&]*/, + rheaders = /^(.*?):[ \t]*([^\r\n]*)$/mg, + + // #7653, #8125, #8152: local protocol detection + rlocalProtocol = /^(?:about|app|app-storage|.+-extension|file|res|widget):$/, + rnoContent = /^(?:GET|HEAD)$/, + rprotocol = /^\/\//, + + /* Prefilters + * 1) They are useful to introduce custom dataTypes (see ajax/jsonp.js for an example) + * 2) These are called: + * - BEFORE asking for a transport + * - AFTER param serialization (s.data is a string if s.processData is true) + * 3) key is the dataType + * 4) the catchall symbol "*" can be used + * 5) execution will start with transport dataType and THEN continue down to "*" if needed + */ + prefilters = {}, + + /* Transports bindings + * 1) key is the dataType + * 2) the catchall symbol "*" can be used + * 3) selection will start with transport dataType and THEN go to "*" if needed + */ + transports = {}, + + // Avoid comment-prolog char sequence (#10098); must appease lint and evade compression + allTypes = "*/".concat( "*" ), + + // Anchor tag for parsing the document origin + originAnchor = document.createElement( "a" ); + originAnchor.href = location.href; + +// Base "constructor" for jQuery.ajaxPrefilter and jQuery.ajaxTransport +function addToPrefiltersOrTransports( structure ) { + + // dataTypeExpression is optional and defaults to "*" + return function( dataTypeExpression, func ) { + + if ( typeof dataTypeExpression !== "string" ) { + func = dataTypeExpression; + dataTypeExpression = "*"; + } + + var dataType, + i = 0, + dataTypes = dataTypeExpression.toLowerCase().match( rnothtmlwhite ) || []; + + if ( isFunction( func ) ) { + + // For each dataType in the dataTypeExpression + while ( ( dataType = dataTypes[ i++ ] ) ) { + + // Prepend if requested + if ( dataType[ 0 ] === "+" ) { + dataType = dataType.slice( 1 ) || "*"; + ( structure[ dataType ] = structure[ dataType ] || [] ).unshift( func ); + + // Otherwise append + } else { + ( structure[ dataType ] = structure[ dataType ] || [] ).push( func ); + } + } + } + }; +} + +// Base inspection function for prefilters and transports +function inspectPrefiltersOrTransports( structure, options, originalOptions, jqXHR ) { + + var inspected = {}, + seekingTransport = ( structure === transports ); + + function inspect( dataType ) { + var selected; + inspected[ dataType ] = true; + jQuery.each( structure[ dataType ] || [], function( _, prefilterOrFactory ) { + var dataTypeOrTransport = prefilterOrFactory( options, originalOptions, jqXHR ); + if ( typeof dataTypeOrTransport === "string" && + !seekingTransport && !inspected[ dataTypeOrTransport ] ) { + + options.dataTypes.unshift( dataTypeOrTransport ); + inspect( dataTypeOrTransport ); + return false; + } else if ( seekingTransport ) { + return !( selected = dataTypeOrTransport ); + } 
+ } ); + return selected; + } + + return inspect( options.dataTypes[ 0 ] ) || !inspected[ "*" ] && inspect( "*" ); +} + +// A special extend for ajax options +// that takes "flat" options (not to be deep extended) +// Fixes #9887 +function ajaxExtend( target, src ) { + var key, deep, + flatOptions = jQuery.ajaxSettings.flatOptions || {}; + + for ( key in src ) { + if ( src[ key ] !== undefined ) { + ( flatOptions[ key ] ? target : ( deep || ( deep = {} ) ) )[ key ] = src[ key ]; + } + } + if ( deep ) { + jQuery.extend( true, target, deep ); + } + + return target; +} + +/* Handles responses to an ajax request: + * - finds the right dataType (mediates between content-type and expected dataType) + * - returns the corresponding response + */ +function ajaxHandleResponses( s, jqXHR, responses ) { + + var ct, type, finalDataType, firstDataType, + contents = s.contents, + dataTypes = s.dataTypes; + + // Remove auto dataType and get content-type in the process + while ( dataTypes[ 0 ] === "*" ) { + dataTypes.shift(); + if ( ct === undefined ) { + ct = s.mimeType || jqXHR.getResponseHeader( "Content-Type" ); + } + } + + // Check if we're dealing with a known content-type + if ( ct ) { + for ( type in contents ) { + if ( contents[ type ] && contents[ type ].test( ct ) ) { + dataTypes.unshift( type ); + break; + } + } + } + + // Check to see if we have a response for the expected dataType + if ( dataTypes[ 0 ] in responses ) { + finalDataType = dataTypes[ 0 ]; + } else { + + // Try convertible dataTypes + for ( type in responses ) { + if ( !dataTypes[ 0 ] || s.converters[ type + " " + dataTypes[ 0 ] ] ) { + finalDataType = type; + break; + } + if ( !firstDataType ) { + firstDataType = type; + } + } + + // Or just use first one + finalDataType = finalDataType || firstDataType; + } + + // If we found a dataType + // We add the dataType to the list if needed + // and return the corresponding response + if ( finalDataType ) { + if ( finalDataType !== dataTypes[ 0 ] ) { + dataTypes.unshift( finalDataType ); + } + return responses[ finalDataType ]; + } +} + +/* Chain conversions given the request and the original response + * Also sets the responseXXX fields on the jqXHR instance + */ +function ajaxConvert( s, response, jqXHR, isSuccess ) { + var conv2, current, conv, tmp, prev, + converters = {}, + + // Work with a copy of dataTypes in case we need to modify it for conversion + dataTypes = s.dataTypes.slice(); + + // Create converters map with lowercased keys + if ( dataTypes[ 1 ] ) { + for ( conv in s.converters ) { + converters[ conv.toLowerCase() ] = s.converters[ conv ]; + } + } + + current = dataTypes.shift(); + + // Convert to each sequential dataType + while ( current ) { + + if ( s.responseFields[ current ] ) { + jqXHR[ s.responseFields[ current ] ] = response; + } + + // Apply the dataFilter if provided + if ( !prev && isSuccess && s.dataFilter ) { + response = s.dataFilter( response, s.dataType ); + } + + prev = current; + current = dataTypes.shift(); + + if ( current ) { + + // There's only work to do if current dataType is non-auto + if ( current === "*" ) { + + current = prev; + + // Convert response if prev dataType is non-auto and differs from current + } else if ( prev !== "*" && prev !== current ) { + + // Seek a direct converter + conv = converters[ prev + " " + current ] || converters[ "* " + current ]; + + // If none found, seek a pair + if ( !conv ) { + for ( conv2 in converters ) { + + // If conv2 outputs current + tmp = conv2.split( " " ); + if ( tmp[ 1 ] === current ) { + + // If 
prev can be converted to accepted input + conv = converters[ prev + " " + tmp[ 0 ] ] || + converters[ "* " + tmp[ 0 ] ]; + if ( conv ) { + + // Condense equivalence converters + if ( conv === true ) { + conv = converters[ conv2 ]; + + // Otherwise, insert the intermediate dataType + } else if ( converters[ conv2 ] !== true ) { + current = tmp[ 0 ]; + dataTypes.unshift( tmp[ 1 ] ); + } + break; + } + } + } + } + + // Apply converter (if not an equivalence) + if ( conv !== true ) { + + // Unless errors are allowed to bubble, catch and return them + if ( conv && s.throws ) { + response = conv( response ); + } else { + try { + response = conv( response ); + } catch ( e ) { + return { + state: "parsererror", + error: conv ? e : "No conversion from " + prev + " to " + current + }; + } + } + } + } + } + } + + return { state: "success", data: response }; +} + +jQuery.extend( { + + // Counter for holding the number of active queries + active: 0, + + // Last-Modified header cache for next request + lastModified: {}, + etag: {}, + + ajaxSettings: { + url: location.href, + type: "GET", + isLocal: rlocalProtocol.test( location.protocol ), + global: true, + processData: true, + async: true, + contentType: "application/x-www-form-urlencoded; charset=UTF-8", + + /* + timeout: 0, + data: null, + dataType: null, + username: null, + password: null, + cache: null, + throws: false, + traditional: false, + headers: {}, + */ + + accepts: { + "*": allTypes, + text: "text/plain", + html: "text/html", + xml: "application/xml, text/xml", + json: "application/json, text/javascript" + }, + + contents: { + xml: /\bxml\b/, + html: /\bhtml/, + json: /\bjson\b/ + }, + + responseFields: { + xml: "responseXML", + text: "responseText", + json: "responseJSON" + }, + + // Data converters + // Keys separate source (or catchall "*") and destination types with a single space + converters: { + + // Convert anything to text + "* text": String, + + // Text to html (true = no transformation) + "text html": true, + + // Evaluate text as a json expression + "text json": JSON.parse, + + // Parse text as xml + "text xml": jQuery.parseXML + }, + + // For options that shouldn't be deep extended: + // you can add your own custom options here if + // and when you create one that shouldn't be + // deep extended (see ajaxExtend) + flatOptions: { + url: true, + context: true + } + }, + + // Creates a full fledged settings object into target + // with both ajaxSettings and settings fields. + // If target is omitted, writes into ajaxSettings. + ajaxSetup: function( target, settings ) { + return settings ? 
+ + // Building a settings object + ajaxExtend( ajaxExtend( target, jQuery.ajaxSettings ), settings ) : + + // Extending ajaxSettings + ajaxExtend( jQuery.ajaxSettings, target ); + }, + + ajaxPrefilter: addToPrefiltersOrTransports( prefilters ), + ajaxTransport: addToPrefiltersOrTransports( transports ), + + // Main method + ajax: function( url, options ) { + + // If url is an object, simulate pre-1.5 signature + if ( typeof url === "object" ) { + options = url; + url = undefined; + } + + // Force options to be an object + options = options || {}; + + var transport, + + // URL without anti-cache param + cacheURL, + + // Response headers + responseHeadersString, + responseHeaders, + + // timeout handle + timeoutTimer, + + // Url cleanup var + urlAnchor, + + // Request state (becomes false upon send and true upon completion) + completed, + + // To know if global events are to be dispatched + fireGlobals, + + // Loop variable + i, + + // uncached part of the url + uncached, + + // Create the final options object + s = jQuery.ajaxSetup( {}, options ), + + // Callbacks context + callbackContext = s.context || s, + + // Context for global events is callbackContext if it is a DOM node or jQuery collection + globalEventContext = s.context && + ( callbackContext.nodeType || callbackContext.jquery ) ? + jQuery( callbackContext ) : + jQuery.event, + + // Deferreds + deferred = jQuery.Deferred(), + completeDeferred = jQuery.Callbacks( "once memory" ), + + // Status-dependent callbacks + statusCode = s.statusCode || {}, + + // Headers (they are sent all at once) + requestHeaders = {}, + requestHeadersNames = {}, + + // Default abort message + strAbort = "canceled", + + // Fake xhr + jqXHR = { + readyState: 0, + + // Builds headers hashtable if needed + getResponseHeader: function( key ) { + var match; + if ( completed ) { + if ( !responseHeaders ) { + responseHeaders = {}; + while ( ( match = rheaders.exec( responseHeadersString ) ) ) { + responseHeaders[ match[ 1 ].toLowerCase() + " " ] = + ( responseHeaders[ match[ 1 ].toLowerCase() + " " ] || [] ) + .concat( match[ 2 ] ); + } + } + match = responseHeaders[ key.toLowerCase() + " " ]; + } + return match == null ? null : match.join( ", " ); + }, + + // Raw string + getAllResponseHeaders: function() { + return completed ? 
responseHeadersString : null; + }, + + // Caches the header + setRequestHeader: function( name, value ) { + if ( completed == null ) { + name = requestHeadersNames[ name.toLowerCase() ] = + requestHeadersNames[ name.toLowerCase() ] || name; + requestHeaders[ name ] = value; + } + return this; + }, + + // Overrides response content-type header + overrideMimeType: function( type ) { + if ( completed == null ) { + s.mimeType = type; + } + return this; + }, + + // Status-dependent callbacks + statusCode: function( map ) { + var code; + if ( map ) { + if ( completed ) { + + // Execute the appropriate callbacks + jqXHR.always( map[ jqXHR.status ] ); + } else { + + // Lazy-add the new callbacks in a way that preserves old ones + for ( code in map ) { + statusCode[ code ] = [ statusCode[ code ], map[ code ] ]; + } + } + } + return this; + }, + + // Cancel the request + abort: function( statusText ) { + var finalText = statusText || strAbort; + if ( transport ) { + transport.abort( finalText ); + } + done( 0, finalText ); + return this; + } + }; + + // Attach deferreds + deferred.promise( jqXHR ); + + // Add protocol if not provided (prefilters might expect it) + // Handle falsy url in the settings object (#10093: consistency with old signature) + // We also use the url parameter if available + s.url = ( ( url || s.url || location.href ) + "" ) + .replace( rprotocol, location.protocol + "//" ); + + // Alias method option to type as per ticket #12004 + s.type = options.method || options.type || s.method || s.type; + + // Extract dataTypes list + s.dataTypes = ( s.dataType || "*" ).toLowerCase().match( rnothtmlwhite ) || [ "" ]; + + // A cross-domain request is in order when the origin doesn't match the current origin. + if ( s.crossDomain == null ) { + urlAnchor = document.createElement( "a" ); + + // Support: IE <=8 - 11, Edge 12 - 15 + // IE throws exception on accessing the href property if url is malformed, + // e.g. 
http://example.com:80x/ + try { + urlAnchor.href = s.url; + + // Support: IE <=8 - 11 only + // Anchor's host property isn't correctly set when s.url is relative + urlAnchor.href = urlAnchor.href; + s.crossDomain = originAnchor.protocol + "//" + originAnchor.host !== + urlAnchor.protocol + "//" + urlAnchor.host; + } catch ( e ) { + + // If there is an error parsing the URL, assume it is crossDomain, + // it can be rejected by the transport if it is invalid + s.crossDomain = true; + } + } + + // Convert data if not already a string + if ( s.data && s.processData && typeof s.data !== "string" ) { + s.data = jQuery.param( s.data, s.traditional ); + } + + // Apply prefilters + inspectPrefiltersOrTransports( prefilters, s, options, jqXHR ); + + // If request was aborted inside a prefilter, stop there + if ( completed ) { + return jqXHR; + } + + // We can fire global events as of now if asked to + // Don't fire events if jQuery.event is undefined in an AMD-usage scenario (#15118) + fireGlobals = jQuery.event && s.global; + + // Watch for a new set of requests + if ( fireGlobals && jQuery.active++ === 0 ) { + jQuery.event.trigger( "ajaxStart" ); + } + + // Uppercase the type + s.type = s.type.toUpperCase(); + + // Determine if request has content + s.hasContent = !rnoContent.test( s.type ); + + // Save the URL in case we're toying with the If-Modified-Since + // and/or If-None-Match header later on + // Remove hash to simplify url manipulation + cacheURL = s.url.replace( rhash, "" ); + + // More options handling for requests with no content + if ( !s.hasContent ) { + + // Remember the hash so we can put it back + uncached = s.url.slice( cacheURL.length ); + + // If data is available and should be processed, append data to url + if ( s.data && ( s.processData || typeof s.data === "string" ) ) { + cacheURL += ( rquery.test( cacheURL ) ? "&" : "?" ) + s.data; + + // #9682: remove data so that it's not used in an eventual retry + delete s.data; + } + + // Add or update anti-cache param if needed + if ( s.cache === false ) { + cacheURL = cacheURL.replace( rantiCache, "$1" ); + uncached = ( rquery.test( cacheURL ) ? "&" : "?" ) + "_=" + ( nonce.guid++ ) + + uncached; + } + + // Put hash and anti-cache on the URL that will be requested (gh-1732) + s.url = cacheURL + uncached; + + // Change '%20' to '+' if this is encoded form body content (gh-2658) + } else if ( s.data && s.processData && + ( s.contentType || "" ).indexOf( "application/x-www-form-urlencoded" ) === 0 ) { + s.data = s.data.replace( r20, "+" ); + } + + // Set the If-Modified-Since and/or If-None-Match header, if in ifModified mode. + if ( s.ifModified ) { + if ( jQuery.lastModified[ cacheURL ] ) { + jqXHR.setRequestHeader( "If-Modified-Since", jQuery.lastModified[ cacheURL ] ); + } + if ( jQuery.etag[ cacheURL ] ) { + jqXHR.setRequestHeader( "If-None-Match", jQuery.etag[ cacheURL ] ); + } + } + + // Set the correct header, if data is being sent + if ( s.data && s.hasContent && s.contentType !== false || options.contentType ) { + jqXHR.setRequestHeader( "Content-Type", s.contentType ); + } + + // Set the Accepts header for the server, depending on the dataType + jqXHR.setRequestHeader( + "Accept", + s.dataTypes[ 0 ] && s.accepts[ s.dataTypes[ 0 ] ] ? + s.accepts[ s.dataTypes[ 0 ] ] + + ( s.dataTypes[ 0 ] !== "*" ? 
", " + allTypes + "; q=0.01" : "" ) : + s.accepts[ "*" ] + ); + + // Check for headers option + for ( i in s.headers ) { + jqXHR.setRequestHeader( i, s.headers[ i ] ); + } + + // Allow custom headers/mimetypes and early abort + if ( s.beforeSend && + ( s.beforeSend.call( callbackContext, jqXHR, s ) === false || completed ) ) { + + // Abort if not done already and return + return jqXHR.abort(); + } + + // Aborting is no longer a cancellation + strAbort = "abort"; + + // Install callbacks on deferreds + completeDeferred.add( s.complete ); + jqXHR.done( s.success ); + jqXHR.fail( s.error ); + + // Get transport + transport = inspectPrefiltersOrTransports( transports, s, options, jqXHR ); + + // If no transport, we auto-abort + if ( !transport ) { + done( -1, "No Transport" ); + } else { + jqXHR.readyState = 1; + + // Send global event + if ( fireGlobals ) { + globalEventContext.trigger( "ajaxSend", [ jqXHR, s ] ); + } + + // If request was aborted inside ajaxSend, stop there + if ( completed ) { + return jqXHR; + } + + // Timeout + if ( s.async && s.timeout > 0 ) { + timeoutTimer = window.setTimeout( function() { + jqXHR.abort( "timeout" ); + }, s.timeout ); + } + + try { + completed = false; + transport.send( requestHeaders, done ); + } catch ( e ) { + + // Rethrow post-completion exceptions + if ( completed ) { + throw e; + } + + // Propagate others as results + done( -1, e ); + } + } + + // Callback for when everything is done + function done( status, nativeStatusText, responses, headers ) { + var isSuccess, success, error, response, modified, + statusText = nativeStatusText; + + // Ignore repeat invocations + if ( completed ) { + return; + } + + completed = true; + + // Clear timeout if it exists + if ( timeoutTimer ) { + window.clearTimeout( timeoutTimer ); + } + + // Dereference transport for early garbage collection + // (no matter how long the jqXHR object will be used) + transport = undefined; + + // Cache response headers + responseHeadersString = headers || ""; + + // Set readyState + jqXHR.readyState = status > 0 ? 4 : 0; + + // Determine if successful + isSuccess = status >= 200 && status < 300 || status === 304; + + // Get response data + if ( responses ) { + response = ajaxHandleResponses( s, jqXHR, responses ); + } + + // Use a noop converter for missing script + if ( !isSuccess && jQuery.inArray( "script", s.dataTypes ) > -1 ) { + s.converters[ "text script" ] = function() {}; + } + + // Convert no matter what (that way responseXXX fields are always set) + response = ajaxConvert( s, response, jqXHR, isSuccess ); + + // If successful, handle type chaining + if ( isSuccess ) { + + // Set the If-Modified-Since and/or If-None-Match header, if in ifModified mode. 
+ if ( s.ifModified ) { + modified = jqXHR.getResponseHeader( "Last-Modified" ); + if ( modified ) { + jQuery.lastModified[ cacheURL ] = modified; + } + modified = jqXHR.getResponseHeader( "etag" ); + if ( modified ) { + jQuery.etag[ cacheURL ] = modified; + } + } + + // if no content + if ( status === 204 || s.type === "HEAD" ) { + statusText = "nocontent"; + + // if not modified + } else if ( status === 304 ) { + statusText = "notmodified"; + + // If we have data, let's convert it + } else { + statusText = response.state; + success = response.data; + error = response.error; + isSuccess = !error; + } + } else { + + // Extract error from statusText and normalize for non-aborts + error = statusText; + if ( status || !statusText ) { + statusText = "error"; + if ( status < 0 ) { + status = 0; + } + } + } + + // Set data for the fake xhr object + jqXHR.status = status; + jqXHR.statusText = ( nativeStatusText || statusText ) + ""; + + // Success/Error + if ( isSuccess ) { + deferred.resolveWith( callbackContext, [ success, statusText, jqXHR ] ); + } else { + deferred.rejectWith( callbackContext, [ jqXHR, statusText, error ] ); + } + + // Status-dependent callbacks + jqXHR.statusCode( statusCode ); + statusCode = undefined; + + if ( fireGlobals ) { + globalEventContext.trigger( isSuccess ? "ajaxSuccess" : "ajaxError", + [ jqXHR, s, isSuccess ? success : error ] ); + } + + // Complete + completeDeferred.fireWith( callbackContext, [ jqXHR, statusText ] ); + + if ( fireGlobals ) { + globalEventContext.trigger( "ajaxComplete", [ jqXHR, s ] ); + + // Handle the global AJAX counter + if ( !( --jQuery.active ) ) { + jQuery.event.trigger( "ajaxStop" ); + } + } + } + + return jqXHR; + }, + + getJSON: function( url, data, callback ) { + return jQuery.get( url, data, callback, "json" ); + }, + + getScript: function( url, callback ) { + return jQuery.get( url, undefined, callback, "script" ); + } +} ); + +jQuery.each( [ "get", "post" ], function( _i, method ) { + jQuery[ method ] = function( url, data, callback, type ) { + + // Shift arguments if data argument was omitted + if ( isFunction( data ) ) { + type = type || callback; + callback = data; + data = undefined; + } + + // The url can be an options object (which then must have .url) + return jQuery.ajax( jQuery.extend( { + url: url, + type: method, + dataType: type, + data: data, + success: callback + }, jQuery.isPlainObject( url ) && url ) ); + }; +} ); + +jQuery.ajaxPrefilter( function( s ) { + var i; + for ( i in s.headers ) { + if ( i.toLowerCase() === "content-type" ) { + s.contentType = s.headers[ i ] || ""; + } + } +} ); + + +jQuery._evalUrl = function( url, options, doc ) { + return jQuery.ajax( { + url: url, + + // Make this explicit, since user can override this through ajaxSetup (#11264) + type: "GET", + dataType: "script", + cache: true, + async: false, + global: false, + + // Only evaluate the response if it is successful (gh-4126) + // dataFilter is not invoked for failure responses, so using it instead + // of the default converter is kludgy but it works. 
+ converters: { + "text script": function() {} + }, + dataFilter: function( response ) { + jQuery.globalEval( response, options, doc ); + } + } ); +}; + + +jQuery.fn.extend( { + wrapAll: function( html ) { + var wrap; + + if ( this[ 0 ] ) { + if ( isFunction( html ) ) { + html = html.call( this[ 0 ] ); + } + + // The elements to wrap the target around + wrap = jQuery( html, this[ 0 ].ownerDocument ).eq( 0 ).clone( true ); + + if ( this[ 0 ].parentNode ) { + wrap.insertBefore( this[ 0 ] ); + } + + wrap.map( function() { + var elem = this; + + while ( elem.firstElementChild ) { + elem = elem.firstElementChild; + } + + return elem; + } ).append( this ); + } + + return this; + }, + + wrapInner: function( html ) { + if ( isFunction( html ) ) { + return this.each( function( i ) { + jQuery( this ).wrapInner( html.call( this, i ) ); + } ); + } + + return this.each( function() { + var self = jQuery( this ), + contents = self.contents(); + + if ( contents.length ) { + contents.wrapAll( html ); + + } else { + self.append( html ); + } + } ); + }, + + wrap: function( html ) { + var htmlIsFunction = isFunction( html ); + + return this.each( function( i ) { + jQuery( this ).wrapAll( htmlIsFunction ? html.call( this, i ) : html ); + } ); + }, + + unwrap: function( selector ) { + this.parent( selector ).not( "body" ).each( function() { + jQuery( this ).replaceWith( this.childNodes ); + } ); + return this; + } +} ); + + +jQuery.expr.pseudos.hidden = function( elem ) { + return !jQuery.expr.pseudos.visible( elem ); +}; +jQuery.expr.pseudos.visible = function( elem ) { + return !!( elem.offsetWidth || elem.offsetHeight || elem.getClientRects().length ); +}; + + + + +jQuery.ajaxSettings.xhr = function() { + try { + return new window.XMLHttpRequest(); + } catch ( e ) {} +}; + +var xhrSuccessStatus = { + + // File protocol always yields status code 0, assume 200 + 0: 200, + + // Support: IE <=9 only + // #1450: sometimes IE returns 1223 when it should be 204 + 1223: 204 + }, + xhrSupported = jQuery.ajaxSettings.xhr(); + +support.cors = !!xhrSupported && ( "withCredentials" in xhrSupported ); +support.ajax = xhrSupported = !!xhrSupported; + +jQuery.ajaxTransport( function( options ) { + var callback, errorCallback; + + // Cross domain only allowed if supported through XMLHttpRequest + if ( support.cors || xhrSupported && !options.crossDomain ) { + return { + send: function( headers, complete ) { + var i, + xhr = options.xhr(); + + xhr.open( + options.type, + options.url, + options.async, + options.username, + options.password + ); + + // Apply custom fields if provided + if ( options.xhrFields ) { + for ( i in options.xhrFields ) { + xhr[ i ] = options.xhrFields[ i ]; + } + } + + // Override mime type if needed + if ( options.mimeType && xhr.overrideMimeType ) { + xhr.overrideMimeType( options.mimeType ); + } + + // X-Requested-With header + // For cross-domain requests, seeing as conditions for a preflight are + // akin to a jigsaw puzzle, we simply never set it to be sure. + // (it can always be set on a per-request basis or even using ajaxSetup) + // For same-domain requests, won't change header if already provided. 
+ if ( !options.crossDomain && !headers[ "X-Requested-With" ] ) { + headers[ "X-Requested-With" ] = "XMLHttpRequest"; + } + + // Set headers + for ( i in headers ) { + xhr.setRequestHeader( i, headers[ i ] ); + } + + // Callback + callback = function( type ) { + return function() { + if ( callback ) { + callback = errorCallback = xhr.onload = + xhr.onerror = xhr.onabort = xhr.ontimeout = + xhr.onreadystatechange = null; + + if ( type === "abort" ) { + xhr.abort(); + } else if ( type === "error" ) { + + // Support: IE <=9 only + // On a manual native abort, IE9 throws + // errors on any property access that is not readyState + if ( typeof xhr.status !== "number" ) { + complete( 0, "error" ); + } else { + complete( + + // File: protocol always yields status 0; see #8605, #14207 + xhr.status, + xhr.statusText + ); + } + } else { + complete( + xhrSuccessStatus[ xhr.status ] || xhr.status, + xhr.statusText, + + // Support: IE <=9 only + // IE9 has no XHR2 but throws on binary (trac-11426) + // For XHR2 non-text, let the caller handle it (gh-2498) + ( xhr.responseType || "text" ) !== "text" || + typeof xhr.responseText !== "string" ? + { binary: xhr.response } : + { text: xhr.responseText }, + xhr.getAllResponseHeaders() + ); + } + } + }; + }; + + // Listen to events + xhr.onload = callback(); + errorCallback = xhr.onerror = xhr.ontimeout = callback( "error" ); + + // Support: IE 9 only + // Use onreadystatechange to replace onabort + // to handle uncaught aborts + if ( xhr.onabort !== undefined ) { + xhr.onabort = errorCallback; + } else { + xhr.onreadystatechange = function() { + + // Check readyState before timeout as it changes + if ( xhr.readyState === 4 ) { + + // Allow onerror to be called first, + // but that will not handle a native abort + // Also, save errorCallback to a variable + // as xhr.onerror cannot be accessed + window.setTimeout( function() { + if ( callback ) { + errorCallback(); + } + } ); + } + }; + } + + // Create the abort callback + callback = callback( "abort" ); + + try { + + // Do send the request (this may raise an exception) + xhr.send( options.hasContent && options.data || null ); + } catch ( e ) { + + // #14683: Only rethrow if this hasn't been notified as an error yet + if ( callback ) { + throw e; + } + } + }, + + abort: function() { + if ( callback ) { + callback(); + } + } + }; + } +} ); + + + + +// Prevent auto-execution of scripts when no explicit dataType was provided (See gh-2432) +jQuery.ajaxPrefilter( function( s ) { + if ( s.crossDomain ) { + s.contents.script = false; + } +} ); + +// Install script dataType +jQuery.ajaxSetup( { + accepts: { + script: "text/javascript, application/javascript, " + + "application/ecmascript, application/x-ecmascript" + }, + contents: { + script: /\b(?:java|ecma)script\b/ + }, + converters: { + "text script": function( text ) { + jQuery.globalEval( text ); + return text; + } + } +} ); + +// Handle cache's special case and crossDomain +jQuery.ajaxPrefilter( "script", function( s ) { + if ( s.cache === undefined ) { + s.cache = false; + } + if ( s.crossDomain ) { + s.type = "GET"; + } +} ); + +// Bind script tag hack transport +jQuery.ajaxTransport( "script", function( s ) { + + // This transport only deals with cross domain or forced-by-attrs requests + if ( s.crossDomain || s.scriptAttrs ) { + var script, callback; + return { + send: function( _, complete ) { + script = jQuery( " + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ + + + +
+
+
+
+ +
+

Lasso Classes and Functions

+
+

Base Classes

+ ++++ + + + + + + + + + + + + + + + + + +

CubeTransit

Class for storing information about transit defined in Cube line files.

StandardTransit

Holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil’s Cube Line files.

ModelRoadwayNetwork

Subclass of network_wrangler class RoadwayNetwork

Project

A single or set of changes to the roadway or transit system.

Parameters

A class representing all the parameters defining the networks including time of day, categories, etc.

+
+
+

Utils and Functions

+ ++++ + + + + + + + + +

util

logger

+
+
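As a quick orientation, the base classes and utilities listed above can be imported directly from the package. A minimal sketch, assuming a standard lasso installation (the exact import path for the utility modules is an assumption):

from lasso import CubeTransit, StandardTransit, ModelRoadwayNetwork, Project, Parameters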
+ + +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/branch/bart/.buildinfo b/branch/bart/.buildinfo new file mode 100644 index 0000000..927a054 --- /dev/null +++ b/branch/bart/.buildinfo @@ -0,0 +1,4 @@ +# Sphinx build info version 1 +# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. +config: ac324329a2923a8190303012a3e3d43e +tags: d77d1c0d9ca2f4c8421862c7c5a0d620 diff --git a/branch/bart/_generated/lasso.CubeTransit/index.html b/branch/bart/_generated/lasso.CubeTransit/index.html new file mode 100644 index 0000000..a509c51 --- /dev/null +++ b/branch/bart/_generated/lasso.CubeTransit/index.html @@ -0,0 +1,570 @@ + + + + + + + lasso.CubeTransit — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.CubeTransit

+
+
+class lasso.CubeTransit(parameters={})[source]
+

Bases: object

+

Class for storing information about transit defined in Cube line +files.

+

Has the capability to:

+
+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
+

Typical usage example:

+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+lines
+

list of strings representing unique line names in +the cube network.

+
+
Type
+

list

+
+
+
+ +
+
+line_properties
+

dictionary of line properties keyed by line name. Property +values are stored in a dictionary by property name. These +properties are directly read from the cube line files and haven’t +been translated to standard transit values.

+
+
Type
+

dict

+
+
+
+ +
+
+shapes
+

dictionary of shapes +keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns:

+
+
    +
  • ‘node_id’ (int): positive integer of node id

  • +
  • ‘node’ (int): node number, with negative indicating a non-stop

  • +
  • ‘stop’ (boolean): indicates if it is a stop

  • +
  • ‘order’ (int): order within this shape

  • +
+
+
+
Type
+

dict

+
+
+
+ +
+
+program_type
+

Either PT or TRNBLD

+
+
Type
+

str

+
+
+
+ +
+
+parameters
+

Parameters instance that will be applied to this instance which +includes information about time periods and variables.

+
+
Type
+

Parameters

+
+
+
+ +
+
+source_list
+

List of cube line file sources that have been read and added.

+
+
Type
+

list

+
+
+
+ +
+
+diff_dict
+
+
Type
+

dict

+
+
+
+ +
+
+__init__(parameters={})[source]
+

Constructor for CubeTransit

+

parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters

+
+ +

Methods

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([parameters])

Constructor for CubeTransit

add_additional_time_periods(...)

Copies a route to another cube time period with appropriate values for time-period-specific properties.

add_cube(transit_source)

Reads a .lin file and adds it to existing TransitNetwork instance.

build_route_name([route_id, time_period, ...])

Create a route name by concatenating route, time period, agency, and direction

calculate_start_end_times(line_properties_dict)

Calculate the start and end times of the property change. WARNING: Does not handle non-contiguous time periods.

create_add_route_card_dict(line)

Creates a project card change formatted dictionary for adding a route based on the information in self.route_properties for the line.

create_delete_route_card_dict(line, ...)

Creates a project card change formatted dictionary for deleting a line.

create_from_cube(transit_source[, parameters])

Reads a cube .lin file and stores as TransitNetwork object.

create_update_route_card_dict(line, ...)

Creates a project card change formatted dictionary for updating the line.

cube_properties_to_standard_properties(...)

Converts cube style properties to standard properties.

evaluate_differences(base_transit)

    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
+

evaluate_route_property_differences(...[, ...])

Checks if any values have been updated or added for a specific route and creates project card entries for each.

evaluate_route_shape_changes(shape_build, ...)

Compares two route shapes and constructs returns list of changes suitable for a project card.

get_time_period_numbers_from_cube_properties(...)

Finds properties that are associated with time periods and returns the time period numbers in them.

unpack_route_name(line_name)

Unpacks route name into direction, route, agency, and time period info

+
+
+add_additional_time_periods(new_time_period_number, orig_line_name)[source]
+

Copies a route to another cube time period with appropriate +values for time-period-specific properties.

+
+
New properties are stored under the new name in:
    +
  • ::self.shapes

  • +
  • ::self.line_properties

  • +
+
+
+
+
Parameters
+
    +
  • new_time_period_number (int) – cube time period number

  • +
  • orig_line_name (str) – name of the originating line, from which +the new line will copy its properties.

  • +
+
+
Returns
+

Line name with new time period.

+
+
+
+ +
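A minimal usage sketch; the time period number and line name below are illustrative, and tn is assumed to be a CubeTransit instance created with create_from_cube:

new_line_name = tn.add_additional_time_periods(
    new_time_period_number=2,              # hypothetical cube time period number
    orig_line_name="0_452-111_452_pk1",    # hypothetical originating line name
)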
+
+add_cube(transit_source)[source]
+

Reads a .lin file and adds it to existing TransitNetwork instance.

+
+
Parameters
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
+
+ +
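For example (the file path is a placeholder; tn is assumed to be an existing CubeTransit instance):

tn.add_cube("transit.lin")   # path to a cube line file, or a directory containing line files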
+
+static build_route_name(route_id='', time_period='', agency_id=0, direction_id=1)[source]
+

Create a route name by concatenating route, time period, agency, and direction

+
+
Parameters
+
    +
  • route_id – i.e. 452-111

  • +
  • time_period – i.e. pk

  • +
  • direction_id – i.e. 1

  • +
  • agency_id – i.e. 0

  • +
+
+
Returns
+

constructed line_name i.e. “0_452-111_452_pk1”

+
+
+
+ +
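A sketch of the construction, using the example values from the parameter list above:

from lasso import CubeTransit

line_name = CubeTransit.build_route_name(
    route_id="452-111", time_period="pk", agency_id=0, direction_id=1
)
# expected result per the docstring: "0_452-111_452_pk1"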
+
+calculate_start_end_times(line_properties_dict)[source]
+

Calculate the start and end times of the property change. WARNING: Does not handle non-contiguous time periods.

+
+
Parameters
+

line_properties_dict – dictionary of cube-flavor properties for a transit line

+
+
+
+ +
+
+create_add_route_card_dict(line)[source]
+

Creates a project card change formatted dictionary for adding +a route based on the information in self.route_properties for +the line.

+
+
Parameters
+

line – name of line that is being updated

+
+
Returns
+

A project card change-formatted dictionary for the route addition.

+
+
+
+ +
+
+create_delete_route_card_dict(line, base_transit_line_properties_dict)[source]
+

Creates a project card change formatted dictionary for deleting a line.

+
+
Parameters
+
    +
  • line – name of line that is being deleted

  • +
  • base_transit_line_properties_dict – dictionary of cube-style +attribute values in order to find time periods and +start and end times.

  • +
+
+
Returns
+

A project card change-formatted dictionary for the route deletion.

+
+
+
+ +
+
+static create_from_cube(transit_source, parameters={})[source]
+

Reads a cube .lin file and stores as TransitNetwork object.

+
+
Parameters
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
Returns
+

A ::CubeTransit object created from the transit_source.

+
+
+
+ +
+
+create_update_route_card_dict(line, updated_properties_dict)[source]
+

Creates a project card change formatted dictionary for updating +the line.

+
+
Parameters
+
    +
  • line – name of line that is being updated

  • +
  • updated_properties_dict – dictionary of attributes to update as +‘property’: <property name>, +‘set’: <new property value>

  • +
+
+
Returns
+

A project card change-formatted dictionary for the attribute update.

+
+
+
+ +
+
+static cube_properties_to_standard_properties(cube_properties_dict)[source]
+

Converts cube style properties to standard properties.

+

This is most pertinent to time-period-specific variables like headway, and to variables that have standard units, such as headway, which is stored in minutes in Cube and in seconds in the standard format.

+
+
Parameters
+

cube_properties_dict – <cube style property name> : <property value>

+
+
Returns
+

+
A list of dictionaries with values for 'property': <standard style property name>, 'set': <property value with correct units>

+
+
+

+
+
Return type
+

list

+
+
+
+ +
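A hedged illustration of the intent only; the cube-style property name and the resulting standard key below are hypothetical, not the exact lasso values:

cube_props = {"HEADWAY[1]": 10}   # hypothetical cube-style property, in minutes
standard_props = CubeTransit.cube_properties_to_standard_properties(cube_props)
# conceptually something like [{"property": "headway_secs", "set": 600}]  (illustrative only)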
+
+evaluate_differences(base_transit)[source]
+
    +
  1. Identifies what routes need to be updated, deleted, or added

  2. For routes being added or updated, identifies if the time periods have changed or if there are multiples, and makes duplicate lines if so

  3. Creates project card dictionaries for each change.
+
+
Parameters
+

base_transit (CubeTransit) – an instance of this class for the base condition

+
+
Returns
+

A list of dictionaries containing project card changes +required to evaluate the differences between the base network +and this transit network instance.

+
+
+
+ +
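For example, mirroring the typical usage shown at the top of this page (the directory names are placeholders):

base_tn = CubeTransit.create_from_cube(BASE_CUBE_DIR)     # base condition
build_tn = CubeTransit.create_from_cube(BUILD_CUBE_DIR)   # build condition
transit_change_list = build_tn.evaluate_differences(base_tn)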
+
+evaluate_route_property_differences(properties_build, properties_base, time_period_number, absolute=True, validate_base=False)[source]
+

Checks if any values have been updated or added for a specific +route and creates project card entries for each.

+
+
Parameters
+
    +
  • properties_build – ::<property_name>: <property_value>

  • +
  • properties_base – ::<property_name>: <property_value>

  • +
  • time_period_number – time period to evaluate

  • +
  • absolute – if True, will use set command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway

  • +
  • validate_base – if True, will add the existing line in the project card

  • +
+
+
Returns
+

+
a list of dictionary values suitable for writing to a project card

{'property': <property_name>, 'set': <set value>, 'change': <change from existing value>, 'existing': <existing value to check>}

+
+
+

+
+
Return type
+

transit_change_list (list)

+
+
+
+ +
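A minimal sketch of the call shape; the property dictionaries are hypothetical and tn is assumed to be a CubeTransit instance:

properties_base = {"HEADWAY[1]": 10}    # hypothetical cube-flavor properties
properties_build = {"HEADWAY[1]": 15}
changes = tn.evaluate_route_property_differences(
    properties_build, properties_base, time_period_number=1
)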
+
+static evaluate_route_shape_changes(shape_build, shape_base)[source]
+

Compares two route shapes and constructs returns list of changes +suitable for a project card.

+
+
Parameters
+
    +
  • shape_build – DataFrame of the build-version of the route shape.

  • +
  • shape_base – DataFrame of the base-version of the route shape.

  • +
+
+
Returns
+

List of shape changes formatted as a project card-change dictionary.

+
+
+
+ +
+
+static get_time_period_numbers_from_cube_properties(properties_list)[source]
+

Finds properties that are associated with time periods and returns the time period numbers in them.

+
+
Parameters
+

properties_list (list) – list of all properties.

+
+
Returns
+

list of strings of the time period numbers found

+
+
+
+ +
+
+static unpack_route_name(line_name)[source]
+

Unpacks route name into direction, route, agency, and time period info

+
+
Parameters
+

line_name (str) – i.e. “0_452-111_452_pk1”

+
+
Returns
+

route_id (str): i.e. 452-111; time_period (str): i.e. pk; direction_id (str): i.e. 1; agency_id (str): i.e. 0

+
+
Return type
+

route_id (str)

+
+
+
+ +
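For example, assuming the values come back in the order listed in the Returns section above:

route_id, time_period, direction_id, agency_id = CubeTransit.unpack_route_name(
    "0_452-111_452_pk1"
)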
+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_generated/lasso.ModelRoadwayNetwork/index.html b/branch/bart/_generated/lasso.ModelRoadwayNetwork/index.html new file mode 100644 index 0000000..8e16e0e --- /dev/null +++ b/branch/bart/_generated/lasso.ModelRoadwayNetwork/index.html @@ -0,0 +1,1588 @@ + + + + + + + lasso.ModelRoadwayNetwork — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.ModelRoadwayNetwork

+
+
+class lasso.ModelRoadwayNetwork(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Bases: RoadwayNetwork

+

Subclass of network_wrangler class RoadwayNetwork

+

A representation of the physical roadway network and its properties.

+
+
+__init__(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Constructor

+
+
Parameters
+
    +
  • nodes – geodataframe of nodes

  • +
  • links – dataframe of links

  • +
  • shapes – geodataframe of shapes

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. +If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
+
+ +
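Instead of calling the constructor directly, a network can also be built from files with the static read() method documented below. A minimal sketch with placeholder file names:

from lasso import ModelRoadwayNetwork

model_net = ModelRoadwayNetwork.read(
    link_filename="link.json",        # placeholder paths
    node_filename="node.geojson",
    shape_filename="shape.geojson",
    fast=True,                        # skip validation to speed up read time
)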

Methods

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(nodes, links, shapes[, parameters])

Constructor

add_counts([network_variable, ...])

Adds count variable.

add_incident_link_data_to_nodes([links_df, ...])

Add data from links going to/from nodes to node.

add_new_roadway_feature_change(links, nodes)

add the new roadway features defined in the project card.

add_variable_using_shst_reference([...])

Join network links with source data, via SHST API node match result.

addition_map(links, nodes)

Shows which links and nodes are added to the roadway network

apply(project_card_dictionary)

Wrapper method to apply a project to a roadway network.

apply_managed_lane_feature_change(link_idx, ...)

Apply the managed lane feature changes to the roadway network

apply_python_calculation(pycode[, in_place])

Changes roadway network object by executing pycode.

apply_roadway_feature_change(link_idx, ...)

Changes the roadway attributes for the selected features based on the project card information passed

assess_connectivity([mode, ...])

Returns a network graph and list of disconnected subgraphs as described by a list of their member nodes.

build_selection_key(selection_dict)

Selections are stored by a key combining the query and the A and B ids.

calculate_area_type([area_type_shape, ...])

#MC Calculates area type variable.

calculate_centroidconnect(parameters[, ...])

Calculates centroid connector variable.

calculate_county([county_shape, ...])

#MC Calculates county variable.

calculate_distance([network_variable, ...])

calculate link distance in miles

calculate_mpo([county_network_variable, ...])

Calculates mpo variable.

calculate_use([network_variable, ...])

Calculates use variable.

convert_int([int_col_names])

Convert integer columns

create_ML_variable([network_variable, overwrite])

Created ML lanes placeholder for project to write out ML changes

create_calculated_variables()

Creates calculated roadway variables.

create_dummy_connector_links(ml_df[, ...])

create dummy connector links between the general purpose and managed lanes

create_hov_corridor_variable([...])

Created hov corridor placeholder for project to write out corridor changes

create_managed_lane_network([...])

Create a roadway network with managed lanes links separated out.

create_managed_variable([network_variable, ...])

Created placeholder for project to write out managed

dataframe_to_fixed_width(df)

Convert dataframe to fixed width format, geometry column will not be transformed.

delete_roadway_feature_change(links, nodes)

delete the roadway features defined in the project card.

deletion_map(links, nodes)

Shows which links and nodes are deleted from the roadway network

fill_na()

Fill na values from create_managed_lane_network()

from_RoadwayNetwork(roadway_network_object)

RoadwayNetwork to ModelRoadwayNetwork

get_attribute(links_df, join_key, ...)

Gets attribute from source data using SHST match result.

get_managed_lane_node_ids(nodes_list[, scalar])

Transform a list of node IDS by a scalar.

get_modal_graph(links_df, nodes_df[, mode, ...])

Determines if the network graph is "strongly" connected A graph is strongly connected if each vertex is reachable from every other vertex.

get_modal_links_nodes(links_df, nodes_df[, ...])

Returns nodes and link dataframes for specific mode.

get_property_by_time_period_and_group(prop)

Return a series for the properties with a specific group or time period.

identify_segment(O_id, D_id[, ...])

+
param endpoints
+

list of length of two unique keys of nodes making up endpoints of segment

+
+
+

identify_segment_endpoints([mode, links_df, ...])

+
param mode
+

list of modes of the network, one of drive,`transit`,

+
+
+

is_network_connected([mode, links_df, nodes_df])

Determines if the network graph is "strongly" connected A graph is strongly connected if each vertex is reachable from every other vertex.

load_transform_network(node_filename, ...[, ...])

Reads roadway network files from disk and transforms them into GeoDataFrames.

network_connection_plot(G, ...)

Plot a graph to check for network connection.

orig_dest_nodes_foreign_key(selection[, ...])

Returns the foreign key id (whatever is used in the u and v variables in the links file) for the AB nodes as a tuple.

ox_graph(nodes_df, links_df[, ...])

create an osmnx-flavored network graph

path_search(candidate_links_df, O_id, D_id)

+
param candidate_links
+

selection of links geodataframe with links likely to be part of path

+
+
+

read(link_filename, node_filename, ...[, ...])

Reads in links and nodes network standard.

read_match_result(path)

Reads the shst geojson match returns.

rename_variables_for_dbf(input_df[, ...])

Rename attributes for DBF/SHP, make sure length within 10 chars.

roadway_net_to_gdf(roadway_net)

Turn the roadway network into a GeoDataFrame; roadway_net is the roadway network to export

roadway_standard_to_met_council_network([...])

Rename and format roadway attributes to be consistent with what metcouncil's model is expecting.

select_roadway_features(selection[, ...])

Selects roadway features that satisfy selection criteria

selection_has_unique_link_id(selection_dict)

+
param selection_dictionary
+

Dictionary representation of selection

+
+
+

selection_map(selected_link_idx[, A, B, ...])

Shows which links are selected for roadway property change or parallel managed lanes category of roadway projects.

shortest_path(graph_links_df, O_id, D_id[, ...])

+
param graph_links_df
+

+
+

split_properties_by_time_period_and_category([...])

Splits properties by time period, assuming a variable structure of

update_distance([links_df, use_shapes, ...])

Calculate link distance in specified units to network variable using either straight line distance or (if specified) shape distance if available.

validate_link_schema(link_filename[, ...])

Validate roadway network data link schema and output a boolean

validate_node_schema(node_file[, ...])

Validate roadway network data node schema and output a boolean

validate_properties(properties[, ...])

If there are change or existing commands, make sure that the property exists in the network.

validate_selection(selection[, ...])

Evaluate whether the selection dictionary contains the minimum required values.

validate_shape_schema(shape_file[, ...])

Validate roadway network data shape schema and output a boolean

validate_uniqueness()

Confirms that the unique identifiers are met.

write([path, filename])

Writes a network in the roadway network standard

write_roadway_as_fixedwidth(output_dir[, ...])

Writes out fixed width file.

write_roadway_as_shp(output_dir[, ...])

Write out dbf/shp/gpkg for cube.

+

Attributes

+ ++++ + + + + + +

CALCULATED_VALUES

+
+
+add_counts(network_variable='AADT', mndot_count_shst_data=None, widot_count_shst_data=None, mndot_count_variable_shp=None, widot_count_variable_shp=None)[source]
+

Adds count variable. #MC Joins the network with count node data via the SHST API node match result.

+
+
Parameters
+
    +
  • network_variable (str) – Name of the variable that should be written to. Default to “AADT”.

  • +
  • mndot_count_shst_data (str) – File path to MNDOT count location SHST API node match result.

  • +
  • widot_count_shst_data (str) – File path to WIDOT count location SHST API node match result.

  • +
  • mndot_count_variable_shp (str) – File path to MNDOT count location geodatabase.

  • +
  • widot_count_variable_shp (str) – File path to WIDOT count location geodatabase.

  • +
+
+
Returns
+

None

+
+
+
+ +
+ +

Add data from links going to/from nodes to node.

+
+
Parameters
+
    +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
  • link_variables – list of columns in links dataframe to add to incident nodes

  • +
+
+
Returns
+

nodes DataFrame with link data where length is N*number of links going in/out

+
+
Return type
+

DataFrame

+
+
+
+ +
+
+add_new_roadway_feature_change(links, nodes)
+

add the new roadway features defined in the project card. +new shapes are also added for the new roadway links.

+
+
Parameters
+
    +
  • links – list of dictionaries

  • +
  • nodes – list of dictionaries

  • +
+
+
+

returns: None

+
+
Return type
+

None

+
+
+
+ +
+
+add_variable_using_shst_reference(var_shst_csvdata=None, shst_csv_variable=None, network_variable=None, network_var_type=<class 'int'>, overwrite=False)[source]
+

Join network links with source data, via SHST API node match result.

+
+
Parameters
+
    +
  • var_shst_csvdata (str) – File path to SHST API return.

  • +
  • shst_csv_variable (str) – Variable name in the source data.

  • +
  • network_variable (str) – Name of the variable that should be written to.

  • +
  • network_var_type – Variable type in the written network.

  • +
  • overwrite (bool) – True is overwriting existing variable. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+addition_map(links, nodes)
+

Shows which links and nodes are added to the roadway network

+
+ +
+
+apply(project_card_dictionary)
+

Wrapper method to apply a project to a roadway network.

+
+
Parameters
+

project_card_dictionary – dict +a dictionary of the project card object

+
+
+
+ +
+
+apply_managed_lane_feature_change(link_idx, properties, in_place=True)
+

Apply the managed lane feature changes to the roadway network

+
+
Parameters
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean to indicate whether to update self or return +a new roadway network object

  • +
+
+
+
+ +
+
+apply_python_calculation(pycode, in_place=True)
+

Changes roadway network object by executing pycode.

+
+
Parameters
+
    +
  • pycode – python code which changes values in the roadway network object

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+
+ +
+
+apply_roadway_feature_change(link_idx, properties, in_place=True)
+

Changes the roadway attributes for the selected features based on the +project card information passed

+
+
Parameters
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean +update self or return a new roadway network object

  • +
+
+
+
+ +
+
+assess_connectivity(mode='', ignore_end_nodes=True, links_df=None, nodes_df=None)
+

Returns a network graph and list of disconnected subgraphs +as described by a list of their member nodes.

+
+
Parameters
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • ignore_end_nodes – if True, ignores stray singleton nodes

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+
Returns: Tuple of

Network Graph (osmnx flavored networkX DiGraph) +List of disconnected subgraphs described by the list of their

+
+

member nodes (as described by their model_node_id)

+
+
+
+
+ +
+
+build_selection_key(selection_dict)
+

Selections are stored by a key combining the query and the A and B ids. +This method combines the two for you based on the selection dictionary.

+
+
Parameters
+

selection_dictonary – Selection Dictionary

+
+
+

Returns: Tuple serving as the selection key.

+
+
Return type
+

tuple

+
+
+
+ +
+
+calculate_area_type(area_type_shape=None, area_type_shape_variable=None, network_variable='area_type', area_type_codes_dict=None, downtown_area_type_shape=None, downtown_area_type=None, overwrite=False)[source]
+

#MC +Calculates area type variable.

+

This uses the centroid of the geometry field to determine which area it should be labeled. This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters
+
    +
  • area_type_shape (str) – The File path to area geodatabase.

  • +
  • area_type_shape_variable (str) – The variable name of area type in area geodatabase.

  • +
  • network_variable (str) – The variable name of area type in network standard. Default to “area_type”.

  • +
  • area_type_codes_dict – The dictionary to map input area_type_shape_variable to network_variable

  • +
  • downtown_area_type_shape – The file path to the downtown area type boundary.

  • +
  • downtown_area_type (int) – Integer value of downtown area type

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_centroidconnect(parameters, network_variable='centroidconnect', highest_taz_number=None, as_integer=True, overwrite=False)[source]
+

Calculates centroid connector variable.

+
+
Parameters
+
    +
  • parameters (Parameters) – A Lasso Parameters, which stores input files.

  • +
  • network_variable (str) – Variable that should be written to in the network. Default to “centroidconnect”

  • +
  • highest_taz_number (int) – the max TAZ number in the network.

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

RoadwayNetwork

+
+
+
+ +
+
+calculate_county(county_shape=None, county_shape_variable=None, network_variable='county', county_codes_dict=None, overwrite=False)[source]
+

#MC +Calculates county variable.

+

This uses the centroid of the geometry field to determine which county it should be labeled. This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters
+
    +
  • county_shape (str) – The File path to county geodatabase.

  • +
  • county_shape_variable (str) – The variable name of county in county geodatabase.

  • +
  • network_variable (str) – The variable name of county in network standard. Default to “county”.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_distance(network_variable='distance', centroidconnect_only=False, overwrite=False)[source]
+

calculate link distance in miles

+
+
Parameters
+
    +
  • centroidconnect_only (Bool) – True if calculating distance for centroidconnectors only. Default to False.

  • +
  • overwrite (Bool) – True if overwriting existing variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_mpo(county_network_variable='county', network_variable='mpo', as_integer=True, mpo_counties=None, overwrite=False)[source]
+

Calculates mpo variable. #MC
county_variable (str): Name of the variable where the county names are stored. Default to “county”.
network_variable (str): Name of the variable that should be written to. Default to “mpo”.
as_integer (bool): If true, will convert true/false to 1/0s.
mpo_counties (list): List of county names that are within mpo region.
overwrite (Bool): True if overwriting existing county variable in network. Default to False.

+
+
Returns
+

None

+
+
+
+ +
+
+calculate_use(network_variable='use', as_integer=True, overwrite=False)[source]
+

Calculates use variable.

+
+
Parameters
+
    +
  • network_variable (str) – Variable that should be written to in the network. Default to “use”

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+convert_int(int_col_names=[])[source]
+

Convert integer columns

+
+ +
+
+create_ML_variable(network_variable='ML_lanes', overwrite=False)[source]
+

Created ML lanes placeholder for project to write out ML changes

+

ML lanes default to 0; ML info comes from the cube LOG file and is stored in project cards

+
+
Parameters
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns
+

None

+
+
+
+ +
+
+create_calculated_variables()[source]
+

Creates calculated roadway variables.

+
+
Parameters
+

None

+
+
+
+ +
+ +

create dummy connector links between the general purpose and managed lanes

+
+
Parameters
+
    +
  • gp_df – GeoDataFrame +dataframe of general purpose links (where managed lane also exists)

  • +
  • ml_df – GeoDataFrame +dataframe of corresponding managed lane links,

  • +
  • access_lanes – int +number of lanes in access dummy link

  • +
  • egress_lanes – int +number of lanes in egress dummy link

  • +
  • access_roadway – str roadway type for access dummy link

  • +
  • egress_roadway – str +roadway type for egress dummy link

  • +
  • access_name_prefix – str +prefix for access dummy link name

  • +
  • egress_name_prefix – str +prefix for egress dummy link name

  • +
+
+
+
+ +
+
+create_hov_corridor_variable(network_variable='segment_id', overwrite=False)[source]
+

Created hov corridor placeholder for project to write out corridor changes

+

hov corridor id defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns
+

None

+
+
+
+ +
+
+create_managed_lane_network(keep_same_attributes_ml_and_gp=None, keep_additional_attributes_ml_and_gp=[], managed_lanes_required_attributes=[], managed_lanes_node_id_scalar=None, managed_lanes_link_id_scalar=None, in_place=False)
+

Create a roadway network with managed lanes links separated out. +Add new parallel managed lane links, access/egress links, +and add shapes corresponding to the new links

+
+
Parameters
+
    +
  • keep_same_attributes_ml_and_gp – list of attributes to copy from general purpose +lane to managed lane. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to KEEP_SAME_ATTRIBUTES_ML_AND_GP.

  • +
  • keep_additional_attributes_ml_and_gp – list of additional attributes to add. This is useful +if you want to leave the default attributes and then ALSO some others.

  • +
  • managed_lanes_required_attributes – list of attributes that are required to be specified +in new managed lanes. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_REQUIRED_ATTRIBUTES.

  • +
  • managed_lanes_node_id_scalar – integer value added to original node IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_NODE_ID_SCALAR.

  • +
  • managed_lanes_link_id_scalar – integer value added to original link IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_LINK_ID_SCALAR.

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+

returns: A RoadwayNetwork instance

+
+
Return type
+

RoadwayNetwork

+
+
+
+ +
+
+create_managed_variable(network_variable='managed', overwrite=False)[source]
+

Created placeholder for project to write out managed

+

managed defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns
+

None

+
+
+
+ +
+
+static dataframe_to_fixed_width(df)[source]
+

Convert dataframe to fixed width format, geometry column will not be transformed.

+
+
Parameters
+

df (pandas DataFrame) –

+
+
Returns
+

dataframe with fixed width for each column. +dict: dictionary with columns names as keys, column width as values.

+
+
Return type
+

pandas dataframe

+
+
+
+ +
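A small sketch; the column names are illustrative, and per the Returns section the method appears to yield both the reformatted dataframe and a dict of column widths:

import pandas as pd

df = pd.DataFrame({"model_link_id": [1, 22], "lanes": [2, 3]})   # placeholder columns
fixed_width_df, column_widths = ModelRoadwayNetwork.dataframe_to_fixed_width(df)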
+
+delete_roadway_feature_change(links, nodes, ignore_missing=True)
+

delete the roadway features defined in the project card. +valid links and nodes defined in the project gets deleted +and shapes corresponding to the deleted links are also deleted.

+
+
Parameters
+
    +
  • links – dict +list of dictionaries

  • +
  • nodes – dict +list of dictionaries

  • +
  • ignore_missing – bool +If True, will only warn about links/nodes that are missing from +network but specified to “delete” in project card +If False, will fail.

  • +
+
+
Return type
+

None

+
+
+
+ +
+
+deletion_map(links, nodes)
+

Shows which links and nodes are deleted from the roadway network

+
+ +
+
+fill_na()[source]
+

Fill na values from create_managed_lane_network()

+
+ +
+
+static from_RoadwayNetwork(roadway_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters
+
    +
  • roadway_network_object (RoadwayNetwork) –

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
Returns
+

ModelRoadwayNetwork

+
+
+
+ +
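For example (roadway_net is assumed to be an existing network_wrangler RoadwayNetwork instance):

model_net = ModelRoadwayNetwork.from_RoadwayNetwork(roadway_net, parameters={})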
+
+static get_attribute(links_df, join_key, source_shst_ref_df, source_gdf, field_name)[source]
+

Gets attribute from source data using SHST match result.

+
+
Parameters
+
    +
  • links_df (dataframe) – The network dataframe that new attribute should be written to.

  • +
  • join_key (str) – SHST ID variable name used to join source data with network dataframe.

  • +
  • source_shst_ref_df (str) – File path to source data SHST match result.

  • +
  • source_gdf (str) – File path to source data.

  • +
  • field_name (str) – Name of the attribute to get from source data.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+static get_managed_lane_node_ids(nodes_list, scalar=4500000)
+

Transform a list of node IDs by a scalar. TODO #237: what if node ids are not a number?

+
+
Parameters
+
    +
  • nodes_list – list of integers

  • +
  • scalar – value to add to node IDs

  • +
+
+
+

Returns: list of integers

+
+ +
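Since the scalar is simply added to each node ID, a quick sketch:

ml_node_ids = ModelRoadwayNetwork.get_managed_lane_node_ids([1, 2, 3], scalar=4500000)
# -> [4500001, 4500002, 4500003]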
+
+static get_modal_graph(links_df, nodes_df, mode=None, modes_to_network_link_variables={'bike': ['bike_access'], 'bus': ['bus_only', 'drive_access'], 'drive': ['drive_access'], 'rail': ['rail_only'], 'transit': ['bus_only', 'rail_only', 'drive_access'], 'walk': ['walk_access']})
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: networkx: osmnx: DiGraph of network

+
+ +
+ +

Returns nodes and link dataframes for specific mode.

+
+
Parameters
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • modes – list of the modes of the network to be kept, must be in drive,`transit`,`rail`,`bus`, +walk, bike. For example, if bike and walk are selected, both bike and walk links will be kept.

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: tuple of DataFrames for links, nodes filtered by mode

+

Note: links with walk access are not marked as having walk access; issue discussed in https://github.com/wsp-sag/network_wrangler/issues/145 (e.g. modal_nodes_df = nodes_df[nodes_df[mode_node_variable] == 1])

+
+ +
+
+get_property_by_time_period_and_group(prop, time_period=None, category=None, default_return=None)
+

Return a series for the properties with a specific group or time period.

+
+
Parameters
+
    +
  • prop (str) – the variable that you want from network

  • +
  • time_period (list(str)) – the time period that you are querying for +i.e. [‘16:00’, ‘19:00’]

  • +
  • category (str or list(str)(Optional)) –

    the group category +i.e. “sov”

    +

    or

    +

    list of group categories in order of search, i.e. +[“hov3”,”hov2”]

    +

  • +
  • default_return (what to return if variable or time period not found. Default is None.) –

  • +
+
+
Return type
+

pandas series

+
+
+
+ +
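An illustrative call using the example values from the parameter list above; the property name "lanes" is a placeholder, and model_net is assumed to be a ModelRoadwayNetwork instance:

pm_sov_lanes = model_net.get_property_by_time_period_and_group(
    "lanes",                            # placeholder network property
    time_period=["16:00", "19:00"],
    category="sov",
)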
+
+identify_segment(O_id, D_id, selection_dict={}, mode=None, nodes_df=None, links_df=None)
+
+
Parameters
+
    +
  • endpoints – list of length of two unique keys of nodes making up endpoints of segment

  • +
  • selection_dict – dictionary of link variables to select candidate links from, otherwise will create a graph of ALL links which will be both a RAM hog and could result in odd shortest paths.

  • +
  • segment_variables – list of variables to keep

  • +
+
+
+
+ +
+
+identify_segment_endpoints(mode='', links_df=None, nodes_df=None, min_connecting_links=10, min_distance=None, max_link_deviation=2)
+
+
Parameters
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+ +
+
+is_network_connected(mode=None, links_df=None, nodes_df=None)
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters
+
    +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
+
+
+

Returns: boolean

+
+ +
+
+static load_transform_network(node_filename, link_filename, shape_filename, crs=4326, node_foreign_key='model_node_id', validate_schema=True, **kwargs)
+

Reads roadway network files from disk and transforms them into GeoDataFrames.

+
+
Parameters
+
    +
  • node_filename – file name for nodes.

  • +
  • link_filename – file name for links.

  • +
  • shape_filename – file name for shapes.

  • +
  • crs – coordinate reference system. Defaults to value in CRS.

  • +
  • node_foreign_key – variable linking the node table to the link table. Defaults +to NODE_FOREIGN_KEY.

  • +
  • validate_schema – boolean indicating if network should be validated to schema.

  • +
+
+
+

returns: tuple of GeodataFrames nodes_df, links_df, shapes_df

+
+
Return type
+

tuple

+
+
+
+ +
+
+static network_connection_plot(G, disconnected_subgraph_nodes)
+

Plot a graph to check for network connection.

+
+
Parameters
+
    +
  • G – OSMNX flavored networkX graph.

  • +
  • disconnected_subgraph_nodes – List of disconnected subgraphs described by the list of their +member nodes (as described by their model_node_id).

  • +
+
+
+

returns: fig, ax : tuple

+
+ +
+
+orig_dest_nodes_foreign_key(selection, node_foreign_key='')
+

Returns the foreign key id (whatever is used in the u and v +variables in the links file) for the AB nodes as a tuple.

+
+
Parameters
+
    +
  • selection – selection dictionary with A and B keys

  • +
  • node_foreign_key – variable name for whatever is used by the u and v variables in the links_df file. If nothing is specified, assumes the default.

  • +
+
+
+

Returns: tuple of (A_id, B_id)

+
+
Return type
+

tuple

+
+
+
+ +
+
+static ox_graph(nodes_df, links_df, node_foreign_key='model_node_id', link_foreign_key=['A', 'B'], unique_link_key='model_link_id')
+

create an osmnx-flavored network graph

+

osmnx doesn’t like values that are arrays, so remove the variables +that have arrays. osmnx also requires that certain variables +be filled in, so do that too.

+
+
Parameters
+
    +
  • nodes_df – GeoDataFrame of nodes

  • +
  • link_df – GeoDataFrame of links

  • +
  • node_foreign_key – field referenced in link_foreign_key

  • +
  • link_foreign_key – list of attributes that define the link start and end nodes to the node foreign key

  • +
  • unique_link_key – primary key for links

  • +
+
+
+

Returns: a networkx multidigraph
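As a minimal sketch (net, with its nodes_df and links_df attributes, is assumed to already be loaded; the key names are the defaults shown in the signature):

# Sketch: build an osmnx-flavored graph from the standard node/link GeoDataFrames.
G = net.ox_graph(
    net.nodes_df,
    net.links_df,
    node_foreign_key="model_node_id",
    link_foreign_key=["A", "B"],
    unique_link_key="model_link_id",
)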

+
+ +
+ +
+
Parameters
+
    +
  • candidate_links – selection of links geodataframe with links likely to be part of path

  • +
  • O_id – origin node foreign key ID

  • +
  • D_id – destination node foreign key ID

  • +
  • weight_column – column to use for weight of shortest path. Defaults to “i” (iteration)

  • +
  • weight_factor – optional weight to multiply the weight column by when finding the shortest path

  • +
  • search_breadth

  • +
+
+
+

Returns

+
+ +
+
+static read(link_filename, node_filename, shape_filename, fast=False, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Reads in links and nodes network standard.

+
+
Parameters
+
    +
  • link_filename (str) – File path to link json.

  • +
  • node_filename (str) – File path to node geojson.

  • +
  • shape_filename (str) – File path to link true shape geojson

  • +
  • fast (bool) – boolean that will skip validation to speed up read time.

  • +
  • recalculate_calculated_variables (bool) – calculates fields from spatial joins, etc.

  • +
  • recalculate_distance (bool) – re-calculates distance.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns
+

ModelRoadwayNetwork
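A hedged usage sketch; the class is assumed to be importable from lasso as ModelRoadwayNetwork (the return type named above), and the file names are placeholders:

from lasso import ModelRoadwayNetwork  # import path assumed, not confirmed by this page

model_net = ModelRoadwayNetwork.read(
    link_filename="link.json",      # placeholder paths
    node_filename="node.geojson",
    shape_filename="shape.geojson",
    fast=True,                      # skip validation for a quicker read
)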

+
+
+
+ +
+
+static read_match_result(path)[source]
+

Reads the shst geojson match returns.

+

Returns shst dataframe.

+

Reads many files of the same type and concatenates them into a single DataFrame.

+
+
Parameters
+

path (str) – File path to SHST match results.

+
+
Returns
+

geopandas geodataframe

+
+
Return type
+

geodataframe

+
+
+

TODO: not sure why we need this; it should live in utilities, not in this class.

+
+ +
+
+rename_variables_for_dbf(input_df, variable_crosswalk=None, output_variables=None, convert_geometry_to_xy=False)[source]
+

Rename attributes for DBF/SHP, make sure length within 10 chars.

+
+
Parameters
+
    +
  • input_df (dataframe) – Network standard DataFrame.

  • +
  • variable_crosswalk (str) – File path to variable name crosswalk from network standard to DBF names.

  • +
  • output_variables (list) – List of strings for DBF variables.

  • +
  • convert_geometry_to_xy (bool) – True if converting node geometry to X/Y

  • +
+
+
Returns
+

dataframe

+
+
+
+ +
+
+static roadway_net_to_gdf(roadway_net)
+

Turn the roadway network into a GeoDataFrame +:param roadway_net: the roadway network to export

+

returns: shapes dataframe

+
+
Return type
+

GeoDataFrame

+
+
+
+ +
+
+roadway_standard_to_met_council_network(output_epsg=None)[source]
+

Rename and format roadway attributes to be consistent with what MetCouncil's model is expecting. #MC :param output_epsg: EPSG number of the output network. :type output_epsg: int

+
+
Returns
+

None

+
+
+
+ +
+
+select_roadway_features(selection, search_mode='drive', force_search=False, sp_weight_factor=None)
+

Selects roadway features that satisfy selection criteria

+
+
Example usage:
+
net.select_roadway_features(
    selection=[{
        # a match condition for the from node using osm,
        # shared streets, or model node number
        'from': {'osm_model_link_id': '1234'},
        # a match for the to-node
        'to': {'shstid': '4321'},
        # a regex or match for facility condition
        # could be # of lanes, facility type, etc.
        'facility': {'name': 'Main St'},
    }, ...])

+
+
+
+
+
+
+
+
Parameters
+
    +
  • selection – dictionary with keys for: +A - from node +B - to node +link - which includes at least a variable for name

  • +
  • search_mode – mode which you are searching for; defaults to “drive”

  • +
  • force_search – boolean directing method to perform search even if one +with same selection dict is stored from a previous search.

  • +
  • sp_weight_factor – multiplier used to discourage shortest paths which meander from the original search returned from the name or ref query. If not set here, will default to the value of sp_weight_factor in the RoadwayNetwork instance. If not set there, will default to SP_WEIGHT_FACTOR.

  • +
+
+
+

Returns: a list of link IDs in selection

+
+
Return type
+

GeoDataFrame
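A complete-call sketch following the parameter description above (net is assumed to be loaded; the node ids and street name are made up):

# Sketch: select drive links named "Main St" between two nodes (values illustrative).
sel = {
    "link": [{"name": ["Main St"]}],
    "A": {"osm_node_id": "1234"},
    "B": {"osm_node_id": "4321"},
}
selected_link_ids = net.select_roadway_features(sel, search_mode="drive")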

+
+
+
+ +
+ +
+
Parameters
+

selection_dictionary – Dictionary representation of selection +of roadway features, containing a “link” key.

+
+
+
+
Returns: A boolean indicating if the selection dictionary contains

a unique identifier for links.

+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+selection_map(selected_link_idx, A=None, B=None, candidate_link_idx=[])
+

Shows which links are selected for roadway property change or parallel +managed lanes category of roadway projects.

+
+
Parameters
+
    +
  • selected_link_idx – list of selected link indices

  • +
  • candidate_link_idx – optional list of candidate link indices to also include in the map

  • +
  • A – optional foreign key of starting node of a route selection

  • +
  • B – optional foreign key of ending node of a route selection

  • +
+
+
+
+ +
+
+shortest_path(graph_links_df, O_id, D_id, nodes_df=None, weight_column='i', weight_factor=100)
+
+
Parameters
+
    +
  • graph_links_df

  • +
  • O_id – foreign key for start node

  • +
  • D_id – foreign key for end node

  • +
  • nodes_df – optional nodes df, otherwise will use network instance

  • +
  • weight_column – column to use as a weight, defaults to “i”

  • +
  • weight_factor – any additional weighting to multiply the weight column by, defaults to SP_WEIGHT_FACTOR

  • +
+
+
+

Returns: tuple with length of four +- Boolean if shortest path found +- nx Directed graph of graph links +- route of shortest path nodes as List +- links in shortest path selected from links_df

+
+
Return type
+

tuple
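A usage sketch of the return tuple (candidate_links_df, O, and D are assumed to exist; the weight settings mirror the defaults described above):

# Sketch: shortest path between two nodes over a candidate set of links.
found, G, route_nodes, route_links_df = net.shortest_path(
    candidate_links_df,
    O,                  # origin node foreign key
    D,                  # destination node foreign key
    weight_column="i",
    weight_factor=100,
)
if found:
    print(f"Path uses {len(route_links_df)} links")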

+
+
+
+ +
+
+split_properties_by_time_period_and_category(properties_to_split=None)[source]
+

Splits properties by time period, assuming a variable structure of the form shown below.

+
+
Parameters
+

properties_to_split

dict: dictionary of output variable prefix mapped to the source variable and what to stratify it by, e.g.

{
    'lanes': {'v': 'lanes', 'time_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    'ML_lanes': {'v': 'ML_lanes', 'time_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    'use': {'v': 'use', 'time_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
}
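A usage sketch mirroring the structure above (net is assumed to be loaded; the time windows are illustrative):

# Sketch: create lanes_AM / lanes_PM style variables from the base "lanes" variable.
net.split_properties_by_time_period_and_category(
    {
        "lanes": {
            "v": "lanes",
            "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
        },
        "ML_lanes": {
            "v": "ML_lanes",
            "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
        },
    }
)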

+

+
+
+
+ +
+
+update_distance(links_df=None, use_shapes=False, units='miles', network_variable='distance', overwrite=True, inplace=True)
+

Calculate link distance in specified units to network variable using either straight line +distance or (if specified) shape distance if available.

+
+
Parameters
+
    +
  • links_df – Links GeoDataFrame. Useful if want to update a portion of network links +(i.e. only centroid connectors). If not provided, will use entire self.links_df.

  • +
  • use_shapes – if True, will add length information from self.shapes_df rather than crow-fly. +If no corresponding shape found in self.shapes_df, will default to crow-fly.

  • +
  • units – units to use. Defaults to the standard unit of miles. Available units: “meters”, “miles”.

  • +
  • network_variable – variable to store link distance in. Defaults to “distance”.

  • +
  • overwrite – Defaults to True and will overwrite all existing calculated distances. +False will only update NaNs.

  • +
  • inplace – updates self.links_df

  • +
+
+
Returns
+

links_df with updated distance
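A short sketch of a typical call (net assumed loaded; argument values follow the descriptions above):

# Sketch: fill in missing distances in miles, preferring true shape lengths.
net.update_distance(
    use_shapes=True,             # fall back to crow-fly where no shape exists
    units="miles",
    network_variable="distance",
    overwrite=False,             # only update NaN values
)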

+
+
+
+ +
+ +

Validate roadway network data link schema and output a boolean

+
+ +
+
+static validate_node_schema(node_file, schema_location='roadway_network_node.json')
+

Validate roadway network data node schema and output a boolean

+
+ +
+
+validate_properties(properties, ignore_existing=False, require_existing_for_change=False)
+

If there are change or existing commands, make sure that the property exists in the network.

+
+
Parameters
+
    +
  • properties – properties dictionary to be evaluated

  • +
  • ignore_existing – If True, will only warn about properties +that specify an “existing” value. If False, will fail.

  • +
  • require_existing_for_change – If True, will fail if there isn’t +a specified value in theproject card for existing when a +change is specified.

  • +
+
+
+

Returns: boolean value as to whether the properties dictionary is valid.

+
+
Return type
+

bool

+
+
+
+ +
+
+validate_selection(selection, selection_requires=['link'])
+

Evaluate whether the selection dictionary contains the minimum required values.

+
+
Parameters
+

selection – selection dictionary to be evaluated

+
+
+

Returns: boolean value as to whether the selection dictionary is valid.

+
+
Return type
+

bool

+
+
+
+ +
+
+static validate_shape_schema(shape_file, schema_location='roadway_network_shape.json')
+

Validate roadway network data shape schema and output a boolean

+
+ +
+
+validate_uniqueness()
+

Confirms that the unique identifiers are met.

+
+
Return type
+

bool

+
+
+
+ +
+
+write(path='.', filename=None)
+

Writes a network in the roadway network standard

+
+
Parameters
+
    +
  • path – the path where the output will be saved

  • +
  • filename – the name prefix of the roadway files that will be generated

  • +
+
+
Return type
+

None

+
+
+
+ +
+
+write_roadway_as_fixedwidth(output_dir, node_output_variables=None, link_output_variables=None, output_link_txt=None, output_node_txt=None, output_link_header_width_txt=None, output_node_header_width_txt=None, output_cube_network_script=None, drive_only=False)[source]
+

Writes out fixed width file.

+

This function does: +1. write out link and node fixed width data files for cube. +2. write out header and width correspondence. +3. write out cube network building script with header and width specification.

+
+
Parameters
+
    +
  • output_dir (str) – File path to where links, nodes and script will be written and run

  • +
  • node_output_variables (list) – list of node variable names.

  • +
  • link_output_variables (list) – list of link variable names.

  • +
  • output_link_txt (str) – File name of output link database (within output_dir)

  • +
  • output_node_txt (str) – File name of output node database (within output_dir)

  • +
  • output_link_header_width_txt (str) – File name of link column width records (within output_dir)

  • +
  • output_node_header_width_txt (str) – File name of node column width records (within output_dir)

  • +
  • output_cube_network_script (str) – File name of CUBE network building script (within output_dir)

  • +
  • drive_only (bool) – If True, only writes drive nodes and links

  • +
+
+
Returns
+

None
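A usage sketch with placeholder file names (net assumed loaded; all outputs land inside output_dir per the parameter notes above):

# Sketch: write fixed-width link/node files plus the Cube build script.
net.write_roadway_as_fixedwidth(
    output_dir="tests/scratch",   # placeholder directory
    output_link_txt="links.txt",
    output_node_txt="nodes.txt",
    output_link_header_width_txt="links_header_width.txt",
    output_node_header_width_txt="nodes_header_width.txt",
    output_cube_network_script="make_complete_network_from_fixed_width_file.s",
    drive_only=False,
)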

+
+
+
+ +
+
+write_roadway_as_shp(output_dir, node_output_variables=None, link_output_variables=None, data_to_csv=True, data_to_dbf=False, output_link_shp=None, output_node_shp=None, output_link_csv=None, output_node_csv=None, output_gpkg=None, output_link_gpkg_layer=None, output_node_gpkg_layer=None, output_gpkg_link_filter=None)[source]
+

Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names.

+
+
Parameters
+
    +
  • output_dir (str) – File path to directory

  • +
  • node_output_variables (list) – List of strings for node output variables.

  • +
  • link_output_variables (list) – List of strings for link output variables.

  • +
  • data_to_csv (bool) – True if write network in csv format.

  • +
  • data_to_dbf (bool) – True if write network in dbf/shp format.

  • +
  • output_link_shp (str) – File name to output link dbf/shp.

  • +
  • output_node_shp (str) – File name of output node dbf/shp.

  • +
  • output_link_csv (str) – File name to output link csv.

  • +
  • output_node_csv (str) – File name to output node csv.

  • +
  • output_gpkg (str) – File name to output GeoPackage.

  • +
  • output_link_gpkg_layer (str) – Layer name within output_gpkg to output links.

  • +
  • output_node_gpkg_layer (str) – Layer name within output_gpkg to output nodes.

  • +
  • output_gpkg_link_filter (str) – Optional column name used to output additional link subset layers

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_generated/lasso.Parameters/index.html b/branch/bart/_generated/lasso.Parameters/index.html new file mode 100644 index 0000000..fcbab55 --- /dev/null +++ b/branch/bart/_generated/lasso.Parameters/index.html @@ -0,0 +1,563 @@ + + + + + + + lasso.Parameters — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.Parameters

+
+
+class lasso.Parameters(**kwargs)[source]
+

Bases: object

+

A class representing all the parameters defining the networks +including time of day, categories, etc.

+

Parameters can be set at runtime by initializing a parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use the default parameters listed in this class.

+
+
Attr:
+
time_period_to_time (dict): Maps time period abbreviations used in

Cube to time of days used on gtfs and highway network standard +Default:

+
{
+    "EA": ("3:00", "6:00"),
+    "AM": ("6:00, "10:00"),
+    "MD": ("10:00", "15:00"),
+    "PM": ("15:00", "19:00"),
+    "EV": ("19:00", "3:00"),
+}
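As a minimal sketch of the runtime override described above (the shortened AM window is purely illustrative):

from lasso import Parameters

params = Parameters(
    time_period_to_time={
        "EA": ("3:00", "6:00"),
        "AM": ("6:00", "9:00"),   # illustrative override of the default AM window
        "MD": ("9:00", "15:00"),
        "PM": ("15:00", "19:00"),
        "EV": ("19:00", "3:00"),
    }
)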
+
+
+
+
cube_time_periods (dict): Maps cube time period numbers used in

transit line files to the time period abbreviations in time_period_to_time +dictionary. +Default:

+
{"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"}
+
+
+
+
categories (dict): Maps demand category abbreviations to a list of

network categories they are allowed to use. +Default:

+
{
+    # suffix, source (in order of search)
+    "sov": ["sov", "default"],
+    "hov2": ["hov2", "default", "sov"],
+    "hov3": ["hov3", "hov2", "default", "sov"],
+    "truck": ["trk", "sov", "default"],
+}
+
+
+
+
properties_to_split (dict): Dictionary mapping variables in standard

roadway network to categories and time periods that need to be +split out in final model network to get variables like LANES_AM. +Default:

+
{
+    "lanes": {
+        "v": "lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "ML_lanes": {
+        "v": "ML_lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "use": {
+        "v": "use",
+        "time_periods": self.time_periods_to_time
+    },
+}
+
+
+
+
county_shape (str): File location of shapefile defining counties.

Default:

+
r"metcouncil_data/county/cb_2017_us_county_5m.shp"
+
+
+
+
county_variable_shp (str): Property defining the county name in

the county_shape file. +Default:

+
NAME
+
+
+
+
lanes_lookup_file (str): Lookup table of number of lanes for different data sources.

Default:

+
r"metcouncil_data/lookups/lanes.csv"
+
+
+
+
centroid_connect_lanes (int): Number of lanes for centroid connectors.

Default:

+
1
+
+
+
+
mpo_counties (list): list of county names within MPO boundary.

Default:

+
[
+    "ANOKA",
+    "DAKOTA",
+    "HENNEPIN",
+    "RAMSEY",
+    "SCOTT",
+    "WASHINGTON",
+    "CARVER",
+]
+
+
+
+
taz_shape (str):

Default:

+
r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp"
+
+
+
+
taz_data (str):

Default:

+
??
+
+
+
+
highest_taz_number (int): highest TAZ number in order to define

centroid connectors. +Default:

+
3100
+
+
+
+
output_variables (list): list of variables to output in final model

network. +Default:

+
[
+    "model_link_id",
+    "link_id",
+    "A",
+    "B",
+    "shstGeometryId",
+    "distance",
+    "roadway",
+    "name",
+    "roadway_class",
+    "bike_access",
+    "walk_access",
+    "drive_access",
+    "truck_access",
+    "trn_priority_EA",
+    "trn_priority_AM",
+    "trn_priority_MD",
+    "trn_priority_PM",
+    "trn_priority_EV",
+    "ttime_assert_EA",
+    "ttime_assert_AM",
+    "ttime_assert_MD",
+    "ttime_assert_PM",
+    "ttime_assert_EV",
+    "lanes_EA",
+    "lanes_AM",
+    "lanes_MD",
+    "lanes_PM",
+    "lanes_EV",
+    "price_sov_EA",
+    "price_hov2_EA",
+    "price_hov3_EA",
+    "price_truck_EA",
+    "price_sov_AM",
+    "price_hov2_AM",
+    "price_hov3_AM",
+    "price_truck_AM",
+    "price_sov_MD",
+    "price_hov2_MD",
+    "price_hov3_MD",
+    "price_truck_MD",
+    "price_sov_PM",
+    "price_hov2_PM",
+    "price_hov3_PM",
+    "price_truck_PM",
+    "price_sov_EV",
+    "price_hov2_EV",
+    "price_hov3_EV",
+    "price_truck_EV",
+    "roadway_class_idx",
+    "facility_type",
+    "county",
+    "centroidconnect",
+    "model_node_id",
+    "N",
+    "osm_node_id",
+    "bike_node",
+    "transit_node",
+    "walk_node",
+    "drive_node",
+    "geometry",
+    "X",
+    "Y",
+    "ML_lanes_EA",
+    "ML_lanes_AM",
+    "ML_lanes_MD",
+    "ML_lanes_PM",
+    "ML_lanes_EV",
+    "segment_id",
+    "managed",
+    "bus_only",
+    "rail_only"
+]
+
+
+
+
osm_facility_type_dict (dict): Mapping between OSM Roadway variable

and facility type. Default:

+
+
area_type_shape (str): Location of shapefile defining area type.

Default:

+
r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp"
+
+
+
+
area_type_variable_shp (str): property in area_type_shape with area

type in it. +Default:

+
"COMDES2040"
+
+
+
+
area_type_code_dict (dict): Mapping of the area_type_variable_shp to

the area type code used in the MetCouncil cube network. +Default:

+
{
+    23: 4,  # urban center
+    24: 3,
+    25: 2,
+    35: 2,
+    36: 1,
+    41: 1,
+    51: 1,
+    52: 1,
+    53: 1,
+    60: 1,
+}
+
+
+
+
downtown_area_type_shape (str): Location of shapefile defining downtown area type.

Default:

+
r"metcouncil_data/area_type/downtownzones_TAZ.shp"
+
+
+
+
downtown_area_type (int): Area type integer for downtown.

Default:

+
5
+
+
+
+
mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property

associated with roadway class. Default:

+
r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp"
+
+
+
+
mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp

associated with roadway class. Default:

+
"ROUTE_SYS"
+
+
+
+
widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property

associated with roadway class. Default:

+
r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp"
+
+
+
+
widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape

associated with roadway class. Default:

+
"RDWY_CTGY_"
+
+
+
+
mndot_count_shape (str): Shapefile of MnDOT links with a property

associated with counts. Default:

+
r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp"
+
+
+
+
mndot_count_variable_shp (str): The property in mndot_count_shape

associated with counts. Default:

+
+
::

“lookups/osm_highway_facility_type_crosswalk.csv”

+
+
+
+
legacy_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from Legacy TM2 network. Default:

+
"lookups/legacy_tm2_attributes.csv"
+
+
+
+
osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId

from OSM. Default:

+
"lookups/osm_lanes_attributes.csv"
+
+
+
+
tam_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from TAM TM2 network. Default:

+
"lookups/tam_tm2_attributes.csv"
+
+
+
+
tom_tom_attributes (str): CSV file of link attributes by

shStReferenceId from TomTom network. Default:

+
"lookups/tomtom_attributes.csv"
+
+
+
+
sfcta_attributes (str): CSV file of link attributes by

shStReferenceId from SFCTA network. Default:

+
"lookups/sfcta_attributes.csv"
+
+
+
+
output_epsg (int): EPSG type of geographic projection for output

shapefiles. Default:

+
102646
+
+
+
+
output_link_shp (str): Output shapefile for roadway links. Default:
+
::

r”tests/scratch/links.shp”

+
+
+
+
output_node_shp (str): Output shapefile for roadway nodes. Default:
+
::

r”tests/scratch/nodes.shp”

+
+
+
+
output_link_csv (str): Output csv for roadway links. Default:
+
::

r”tests/scratch/links.csv”

+
+
+
+
output_node_csv (str): Output csv for roadway nodes. Default:
+
::

r”tests/scratch/nodes.csv”

+
+
+
+
output_link_txt (str): Output fixed format txt for roadway links. Default:
+
::

r”tests/scratch/links.txt”

+
+
+
+
output_node_txt (str): Output fixed format txt for roadway nodes. Default:
+
::

r”tests/scratch/nodes.txt”

+
+
+
+
output_link_header_width_txt (str): Header for txt roadway links. Default:
+
::

r”tests/scratch/links_header_width.txt”

+
+
+
+
output_node_header_width_txt (str): Header for txt for roadway Nodes. Default:
+
::

r”tests/scratch/nodes_header_width.txt”

+
+
+
+
output_cube_network_script (str): Cube script for importing

fixed-format roadway network. Default:

+
r"tests/scratch/make_complete_network_from_fixed_width_file.s
+
+
+
+
zones (int): Number of travel analysis zones in the network. Default:
+
::

3061

+
+
+
+
+
+
+
+
+__init__(**kwargs)[source]
+

Time period and category splitting info

+
+ +

Methods

+ ++++ + + + + + +

__init__(**kwargs)

Time period and category splitting info

+

Attributes

+ ++++ + + + + + + + + + + + + + + +

cube_time_periods

#MC self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

properties_to_split

Details for calculating the county based on the centroid of the link.

county_link_range_dict

self.county_code_dict = {

zones

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+
self.county_code_dict = {
    "Anoka": 1, "Carver": 2, "Dakota": 3, "Hennepin": 4, "Ramsey": 5,
    "Scott": 6, "Washington": 7, "external": 10, "Chisago": 11,
    "Goodhue": 12, "Isanti": 13, "Le Sueur": 14, "McLeod": 15,
    "Pierce": 16, "Polk": 17, "Rice": 18, "Sherburne": 19,
    "Sibley": 20, "St. Croix": 21, "Wright": 22,
}

+
+ +
+
+cube_time_periods
+

#MC
self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

+

self.route_type_mode_dict = {0: 8, 2: 9}

+

self.cube_time_periods = {"1": "AM", "2": "MD"}
self.cube_time_periods_name = {"AM": "pk", "MD": "op"}

+
+ +
+
+properties_to_split
+

Details for calculating the county based on the centroid of the link. The NAME variable should be the name of a field in the shapefile.

+
+ +
+
+zones
+

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+ +
+ + +
+
+
+ +
+ +
+


+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_generated/lasso.Project/index.html b/branch/bart/_generated/lasso.Project/index.html new file mode 100644 index 0000000..493b62c --- /dev/null +++ b/branch/bart/_generated/lasso.Project/index.html @@ -0,0 +1,524 @@ + + + + + + + lasso.Project — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.Project

+
+
+class lasso.Project(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Bases: object

+

A single or set of changes to the roadway or transit system.

+

Compares a base and a build transit network or a base and build +highway network and produces project cards.

+

Typical usage example:

+
test_project = Project.create_project(
+    base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+test_project.evaluate_changes()
+test_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+)
+
+
+
+
+DEFAULT_PROJECT_NAME
+

a class-level constant that defines what +the project name will be if none is set.

+
+ +
+
+STATIC_VALUES
+

a class-level constant which defines values that +are not evaluated when assessing changes.

+
+ +
+
+card_data
+

{“project”: <project_name>, “changes”: <list of change dicts>}

+
+
Type
+

dict

+
+
+
+ +
+ +

pandas dataframe of CUBE roadway link changes.

+
+
Type
+

DataFrame

+
+
+
+ +
+
+roadway_node_changes
+

pandas dataframe of CUBE roadway node changes.

+
+
Type
+

DataFrame

+
+
+
+ +
+
+transit_changes
+
+
Type
+

CubeTransit

+
+
+
+ +
+
+base_roadway_network
+
+
Type
+

RoadwayNetwork

+
+
+
+ +
+
+base_cube_transit_network
+
+
Type
+

CubeTransit

+
+
+
+ +
+
+build_cube_transit_network
+
+
Type
+

CubeTransit

+
+
+
+ +
+
+project_name
+

name of the project, set to DEFAULT_PROJECT_NAME if not provided

+
+
Type
+

str

+
+
+
+ +
+
+parameters
+

an instance of the Parameters class which sets a bunch of parameters

+
+ +
+
+__init__(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

ProjectCard constructor.

+
+
Parameters
+
    +
  • roadway_link_changes – dataframe of roadway changes read from a log file

  • +
  • roadway_node_changes – dataframe of roadway changes read from a log file

  • +
  • transit_changes – dataframe of transit changes read from a log file

  • +
  • base_roadway_network – RoadwayNetwork instance for base case

  • +
  • base_transit_network – StandardTransit instance for base case

  • +
  • base_cube_transit_network – CubeTransit instance for base transit network

  • +
  • build_cube_transit_network – CubeTransit instance for build transit network

  • +
  • project_name – name of the project

  • +
  • evaluate – defaults to false, but if true, will create card data

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
+

returns: instance of ProjectCard

+
+ +

Methods

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([roadway_link_changes, ...])

ProjectCard constructor.

add_highway_changes([...])

Evaluates changes from the log file based on the base highway object and adds entries into the self.card_data dictionary.

add_transit_changes()

Evaluates changes between base and build transit objects and adds entries into the self.card_data dictionary.

create_project([roadway_log_file, ...])

Constructor for a Project instance.

determine_roadway_network_changes_compatibility(...)

Checks to see that any links or nodes that change exist in base roadway network.

emme_id_to_wrangler_id(emme_link_change_df, ...)

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

emme_name_to_wrangler_name(...)

rename emme names to wrangler names using crosswalk file

evaluate_changes()

Determines which changes should be evaluated, initiates self.card_data to be an aggregation of transit and highway changes.

get_object_from_network_build_command()

determine the network build object is node or link

get_operation_from_network_build_command()

determine the network build object action type

read_logfile(logfilename)

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

read_network_build_file(networkbuildfilename)

Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

write_project_card([filename])

Writes project cards.

+

Attributes

+ ++++ + + + + + + + + + + + +

CALCULATED_VALUES

DEFAULT_PROJECT_NAME

STATIC_VALUES

+
+
+add_highway_changes(limit_variables_to_existing_network=False)[source]
+

Evaluates changes from the log file based on the base highway object and +adds entries into the self.card_data dictionary.

+
+
Parameters
+

limit_variables_to_existing_network (bool) – True if no ad-hoc variables. Defaults to False.

+
+
+
+ +
+
+add_transit_changes()[source]
+

Evaluates changes between base and build transit objects and +adds entries into the self.card_data dictionary.

+
+ +
+
+static create_project(roadway_log_file=None, roadway_shp_file=None, roadway_csv_file=None, network_build_file=None, emme_node_id_crosswalk_file=None, emme_name_crosswalk_file=None, base_roadway_dir=None, base_transit_dir=None, base_cube_transit_source=None, build_cube_transit_source=None, roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name=None, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Constructor for a Project instance.

+
+
Parameters
+
    +
  • roadway_log_file – File path to consuming logfile or a list of logfile paths.

  • +
  • roadway_shp_file – File path to consuming shape file for roadway changes.

  • +
  • roadway_csv_file – File path to consuming csv file for roadway changes.

  • +
  • network_build_file – File path to consuming EMME network build for network changes.

  • +
  • base_roadway_dir – Folder path to base roadway network.

  • +
  • base_transit_dir – Folder path to base transit network.

  • +
  • base_cube_transit_source – Folder path to base transit network or cube line file string.

  • +
  • base_cube_transit_file – File path to base transit network.

  • +
  • build_cube_transit_source – Folder path to build transit network or cube line file string.

  • +
  • build_cube_transit_file – File path to build transit network.

  • +
  • roadway_link_changes – pandas dataframe of CUBE roadway link changes.

  • +
  • roadway_node_changes – pandas dataframe of CUBE roadway node changes.

  • +
  • transit_changes – build transit changes.

  • +
  • base_roadway_network – Base roadway network object.

  • +
  • base_cube_transit_network – Base cube transit network object.

  • +
  • build_cube_transit_network – Build cube transit network object.

  • +
  • project_name – If not provided, will default to the roadway_log_file filename if +provided (or the first filename if a list is provided)

  • +
  • recalculate_calculated_variables – if reading in a base network, if this is true it +will recalculate variables such as area type, etc. This only needs to be true +if you are creating project cards that are changing the calculated variables.

  • +
  • recalculate_distance – recalculate the distance variable. This only needs to be +true if you are creating project cards that change the distance.

  • +
  • parameters – dictionary of parameters

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in +the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables +in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns
+

A Project instance.
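A hedged sketch of building project cards from a Cube log file; CUBE_DIR, BASE_ROADWAY_DIR, and SCRATCH_DIR are assumed constants (as in the typical usage example above) and the log file name is a placeholder:

import os
from lasso import Project

project = Project.create_project(
    roadway_log_file=os.path.join(CUBE_DIR, "example_roadway_change.log"),  # placeholder name
    base_roadway_dir=BASE_ROADWAY_DIR,
    project_name="example roadway project",
)
project.write_project_card(os.path.join(SCRATCH_DIR, "example_roadway_project.yml"))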

+
+
+
+ +
+
+static determine_roadway_network_changes_compatibility(base_roadway_network, roadway_link_changes, roadway_node_changes, parameters)[source]
+

Checks to see that any links or nodes that change exist in base roadway network.

+
+ +
+
+static emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file)[source]
+

Rewrites EMME ids as wrangler ids, using the EMME-wrangler id crosswalk located in the database folder.

+
+ +
+
+static emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file)[source]
+

Renames EMME names to wrangler names using the crosswalk file.

+
+ +
+
+evaluate_changes()[source]
+

Determines which changes should be evaluated, initiates +self.card_data to be an aggregation of transit and highway changes.

+
+ +
+
+get_object_from_network_build_command()[source]
+

determine the network build object is node or link

+
+
Parameters
+

row – network build command history dataframe

+
+
Returns
+

‘N’ for node, ‘L’ for link

+
+
+
+ +
+
+get_operation_from_network_build_command()[source]
+

determine the network build object action type

+
+
Parameters
+

row – network build command history dataframe

+
+
Returns
+

‘A’, ‘C’, ‘D’

+
+
+
+ +
+
+static read_logfile(logfilename)[source]
+

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters
+

logfilename (str or list[str]) – File path to CUBE logfile or list of logfile paths.

+
+
Returns
+

A DataFrame representation of the log file.

+
+
+
+ +
+
+static read_network_build_file(networkbuildfilename)[source]
+

Reads an EMME network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters
+

networkbuildfilename (str or list[str]) – File path to an EMME network build file or a list of network build file paths.

+
+
Returns
+

A DataFrame representation of the network build file

+
+
+
+ +
+
+write_project_card(filename=None)[source]
+

Writes project cards.

+
+
Parameters
+

filename (str) – File path to output .yml

+
+
Returns
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'assign_group', 'centroidconnect']
+
+ +
+
+DEFAULT_PROJECT_NAME = 'USER TO define'
+
+ +
+
+STATIC_VALUES = ['model_link_id', 'area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+
+ +
+ +
+


+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_generated/lasso.StandardTransit/index.html b/branch/bart/_generated/lasso.StandardTransit/index.html new file mode 100644 index 0000000..884dc90 --- /dev/null +++ b/branch/bart/_generated/lasso.StandardTransit/index.html @@ -0,0 +1,452 @@ + + + + + + + lasso.StandardTransit — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.StandardTransit

+
+
+class lasso.StandardTransit(ptg_feed, parameters={})[source]
+

Bases: object

+

Holds a standard transit feed as a Partridge object and contains +methods to manipulate and translate the GTFS data to MetCouncil’s +Cube Line files.

+

Typical usage example:

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+feed
+

Partridge Feed object containing read-only access to GTFS feed

+
+ +
+
+parameters
+

Parameters instance containing information +about time periods and variables.

+
+
Type
+

Parameters

+
+
+
+ +
+
+__init__(ptg_feed, parameters={})[source]
+
+
Parameters
+
    +
  • ptg_feed – partridge feed object

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters

  • +
+
+
+
+ +

Methods

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(ptg_feed[, parameters])

+
param ptg_feed
+

partridge feed object

+
+
+

calculate_cube_mode(row)

Assigns a cube mode number by following logic.

cube_format(row)

Creates a string representing the route in cube line file notation. #MC :param row: row of a DataFrame representing a cube-formatted trip, with the Attributes trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR.

evaluate_differences(transit_changes)

Compare changes from the transit_changes dataframe with the standard transit network returns the project card changes in dictionary format

fromTransitNetwork(transit_network_object[, ...])

RoadwayNetwork to ModelRoadwayNetwork

read_gtfs(gtfs_feed_dir[, parameters])

Reads GTFS files from a directory and returns a StandardTransit instance.

route_properties_gtfs_to_cube(self)

Prepare gtfs for cube lin file.

shape_gtfs_to_cube(row[, add_nntime])

Creates a list of nodes that for the route in appropriate cube format.

shape_gtfs_to_dict_list(trip_id, shape_id, ...)

This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of stepping through the routed nodes and corresponding them with shape nodes.

shape_gtfs_to_emme(trip_row)

Creates transit segment for the trips in appropriate emme format.

time_to_cube_time_period(start_time_secs[, ...])

Converts seconds from midnight to the cube time period.

write_as_cube_lin([outpath])

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties.

+
+
+calculate_cube_mode(row)[source]
+

Assigns a cube mode number by the following logic. #MC For rail, uses the GTFS route_type variable: https://developers.google.com/transit/gtfs/reference

+
+
::

# route_type : cube_mode
route_type_to_cube_mode = {
    0: 8,  # Tram, Streetcar, Light rail
    3: 0,  # Bus; further disaggregated for cube
    2: 9,  # Rail
}

+
+
+
+

For buses, uses route id numbers and route name to find +express and suburban buses as follows:

+
+
::
+
if not cube_mode:
    if 'express' in row['LONGNAME'].lower():
        cube_mode = 7  # Express
    elif int(row['route_id'].split("-")[0]) > 99:
        cube_mode = 6  # Suburban Local
    else:
        cube_mode = 5  # Urban Local

+
+
+
+
+
+
+
+
Parameters
+

row – A DataFrame row with route_type, route_long_name, and route_id

+
+
Returns
+

cube mode number

+
+
+
+ +
+
+cube_format(row)[source]
+

Creates a string representing the route in cube line file notation. #MC :param row: row of a DataFrame representing a cube-formatted trip, with the Attributes

+
+

trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR

+
+
+
Returns
+

string representation of route in cube line file notation

+
+
+
+ +
+
+evaluate_differences(transit_changes)[source]
+

Compare changes from the transit_changes dataframe with the standard transit network +returns the project card changes in dictionary format

+
+ +
+
+static fromTransitNetwork(transit_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters
+
    +
  • transit_network_object – Reference to an instance of TransitNetwork.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns
+

StandardTransit

+
+
+
+ +
+
+static read_gtfs(gtfs_feed_dir, parameters={})[source]
+

Reads GTFS files from a directory and returns a StandardTransit +instance.

+
+
Parameters
+
    +
  • gtfs_feed_dir – location of the GTFS files

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns
+

StandardTransit instance

+
+
+
+ +
+
+static route_properties_gtfs_to_cube(self)[source]
+

Prepare gtfs for cube lin file. +#MC +Does the following operations: +1. Combines route, frequency, trip, and shape information +2. Converts time of day to time periods +3. Calculates cube route name from gtfs route name and properties +4. Assigns a cube-appropriate mode number +5. Assigns a cube-appropriate operator number

+
+
Returns
+

+
DataFrame of trips with cube-appropriate values for:
    +
  • NAME

  • +
  • ONEWAY

  • +
  • OPERATOR

  • +
  • MODE

  • +
  • HEADWAY

  • +
+
+
+

+
+
Return type
+

trip_df (DataFrame)

+
+
+
+ +
+
+shape_gtfs_to_cube(row, add_nntime=False)[source]
+

Creates a list of nodes that for the route in appropriate +cube format.

+
+
Parameters
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a string representation of the node list

for a route in cube format.

+
+
+
+ +
+
+shape_gtfs_to_dict_list(trip_id, shape_id, add_nntime)[source]
+

This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of +stepping through the routed nodes and corresponding them with shape nodes.

+

TODO: eliminate this necessity by tagging the stop nodes in the shapes to begin with when +the transit routing on the roadway network is first performed.

+

As such, I’m copying the code from StandardTransit.shape_gtfs_to_cube() with minimal modifications.

+
+
Parameters
+
    +
  • trip_id – trip_id of the trip in question

  • +
  • shape_id – shape_id of the trip in question

  • +
+
+
Returns
+

trip_id, shape_id, shape_pt_sequence, shape_mode_node_id, is_stop, access, stop_sequence

+
+
Return type
+

list of dict records with columns

+
+
+
+ +
+
+shape_gtfs_to_emme(trip_row)[source]
+

Creates transit segment for the trips in appropriate +emme format.

+
+
Parameters
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a dataframe representation of the transit segment

for a trip in emme format.

+
+
+
+ +
+
+time_to_cube_time_period(start_time_secs, as_str=True, verbose=False)[source]
+

Converts seconds from midnight to the cube time period.

+
+
Parameters
+
    +
  • start_time_secs – start time for transit trip in seconds +from midnight

  • +
  • as_str – if True, returns the time period as a string, +otherwise returns a numeric time period

  • +
+
+
Returns
+

+
if as_str is False, returns the numeric

time period

+
+
this_tp: if as_str is True, returns the Cube time period

name abbreviation

+
+
+

+
+
Return type
+

this_tp_num
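A small sketch (std_transit is an assumed StandardTransit instance; the mapping to "AM" assumes the default time_period_to_time settings):

# Sketch: 8:30 AM as seconds from midnight falls in the AM period by default.
secs = 8 * 3600 + 30 * 60
period = std_transit.time_to_cube_time_period(secs, as_str=True)    # e.g. "AM"
period_num = std_transit.time_to_cube_time_period(secs, as_str=False)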

+
+
+
+ +
+
+write_as_cube_lin(outpath=None)[source]
+

Writes the gtfs feed as a cube line file after +converting gtfs properties to MetCouncil cube properties. +#MC +:param outpath: File location for output cube line file.

+
+ +
+ +
+ + +
+
+
+ +
+ +
+


+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_generated/lasso.logger/index.html b/branch/bart/_generated/lasso.logger/index.html new file mode 100644 index 0000000..1b07a08 --- /dev/null +++ b/branch/bart/_generated/lasso.logger/index.html @@ -0,0 +1,142 @@ + + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.logger

+

Functions

+ ++++ + + + + + +

setupLogging(infoLogFilename, debugLogFilename)

Sets up the logger.

+
+
+lasso.logger.setupLogging(infoLogFilename, debugLogFilename, logToConsole=True)[source]
+

Sets up the logger. The infoLog is terse, just gives the bare minimum of details +so the network composition will be clear later. +The debuglog is very noisy, for debugging.

+

Pass None for either filename to skip that log. Spews it all out to the console too, if logToConsole is true.
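A usage sketch with placeholder file names:

from lasso.logger import setupLogging

setupLogging(
    infoLogFilename="lasso_info.log",    # terse log; placeholder name
    debugLogFilename="lasso_debug.log",  # noisy debug log; placeholder name
    logToConsole=True,
)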

+
+ +
+ + +
+
+
+ +
+ +
+


+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_generated/lasso.util/index.html b/branch/bart/_generated/lasso.util/index.html new file mode 100644 index 0000000..b30fac8 --- /dev/null +++ b/branch/bart/_generated/lasso.util/index.html @@ -0,0 +1,1456 @@ + + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.util

+

Functions

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + +

column_name_to_parts(c[, parameters])

create_locationreference(node, link)

geodesic_point_buffer(lat, lon, meters)

creates circular buffer polygon for node

get_shared_streets_intersection_hash(lat, long)

Calculated per:

hhmmss_to_datetime(hhmmss_str)

Creates a datetime time object from a string of hh:mm:ss

secs_to_datetime(secs)

Creates a datetime time object from a seconds from midnight

shorten_name(name)

+
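A short sketch of the time helpers listed above (argument values are illustrative; return types follow the one-line descriptions in the table):

from lasso.util import hhmmss_to_datetime, secs_to_datetime

t1 = hhmmss_to_datetime("06:30:00")        # datetime time object for 6:30 AM
t2 = secs_to_datetime(6 * 3600 + 30 * 60)  # same instant from seconds past midnight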
+
+class lasso.util.Point(*args)[source]
+

Bases: BaseGeometry

+

A zero dimensional feature

+

A point has zero length and zero area.

+
+
+x, y, z
+

Coordinate values

+
+
Type
+

float

+
+
+
+ +

Example

+
>>> p = Point(1.0, -1.0)
+>>> print(p)
+POINT (1 -1)
+>>> p.y
+-1.0
+>>> p.x
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.0 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+array_interface()[source]
+

Provide the Numpy array protocol.

+
+ +
+
+buffer(distance, resolution=16, quadsegs=None, cap_style=1, join_style=1, mitre_limit=5.0, single_sided=False)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quadsegs (int, optional) – Sets the number of line segments used to approximate an +angle fillet. Note: the use of a quadsegs parameter is +deprecated and will be gone from the next major release.

  • +
  • cap_style (int, optional) – The styles of caps are: CAP_STYLE.round (1), CAP_STYLE.flat +(2), and CAP_STYLE.square (3).

  • +
  • join_style (int, optional) – The styles of joins between offset segments are: +JOIN_STYLE.round (1), JOIN_STYLE.mitre (2), and +JOIN_STYLE.bevel (3).

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
+
+
Return type
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+>>> g.buffer(1.0).area        # 16-gon approx of a unit radius circle
+3.1365484905459...
+>>> g.buffer(1.0, 128).area   # 128-gon approximation
+3.141513801144...
+>>> round(g.buffer(1.0, 3).area, 10)  # triangle approximation
+3.0
+>>> list(g.buffer(1.0, cap_style=CAP_STYLE.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=CAP_STYLE.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other)
+

Returns the difference of the geometries

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+empty(val=93892948296208)
+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
  • equality (This method considers coordinate) –

  • +
  • requires (which) –

  • +
  • components (coordinates to be equal and in the same order for all) –

  • +
  • geometry. (of a) –

  • +
  • two (Because of this it is possible for "equals()" to be True for) –

  • +
  • False. (geometries and "equals_exact()" to be) –

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+
+ +
+
+intersection(other)
+

Returns the intersection of the geometries

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely.wkt import loads
+>>> p = loads("MULTILINESTRING((0 0, 1 1), (3 3, 2 2))")
+>>> p.normalize().wkt
+'MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))'
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG circle element for the Point geometry.

+
+
Parameters
+
    +
  • scale_factor (float) – Multiplication factor for the SVG circle diameter. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other)
+

Returns the symmetric difference of the geometries +(Shapely geometry)

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other)
+

Returns the union of the geometries (Shapely geometry)

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property array_interface_base
+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a convex hull, more or less.

+

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property ctypes
+

Return ctypes buffer

+
+ +
+
+property envelope
+

A figure that envelops the geometry

+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+impl = <GEOSImpl object: GEOS C API version (1, 13, 0)>
+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the general minimum bounding rectangle of +the geometry. Can possibly be rotated. If the convex hull +of the object is a degenerate (line or point) this same degenerate +is returned.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property x
+

Return x coordinate.

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+

Example

+
>>> x, y = Point(0, 0).xy
+>>> list(x)
+[0.0]
+>>> list(y)
+[0.0]
+
+
+
+ +
+
+property y
+

Return y coordinate.

+
+ +
+
+property z
+

Return z coordinate.

+
+ +
+ +
+
+class lasso.util.Polygon(shell=None, holes=None)[source]
+

Bases: BaseGeometry

+

A two-dimensional figure bounded by a linear ring

+

A polygon has a non-zero area. It may have one or more negative-space +“holes” which are also bounded by linear rings. If any rings cross each +other, the feature is invalid and operations on it may fail.

+
+
+exterior
+

The ring which bounds the positive space of the polygon.

+
+
Type
+

LinearRing

+
+
+
+ +
+
+interiors
+

A sequence of rings which bound all existing holes.

+
+
Type
+

sequence

+
+
+
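A doctest-style sketch of constructing a polygon with one hole and reading the two attributes above (illustrative; it assumes this class behaves like Shapely’s Polygon, whose API it documents):

>>> from lasso.util import Polygon
>>> shell = [(0, 0), (10, 0), (10, 10), (0, 10)]
>>> hole = [(2, 2), (4, 2), (4, 4), (2, 4)]
>>> poly = Polygon(shell=shell, holes=[hole])
>>> poly.area                  # 100 outer minus 4 for the hole
96.0
>>> poly.exterior.is_ring
True
>>> len(poly.interiors)
1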
+ +
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.0 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+buffer(distance, resolution=16, quadsegs=None, cap_style=1, join_style=1, mitre_limit=5.0, single_sided=False)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quadsegs (int, optional) – Sets the number of line segments used to approximate an +angle fillet. Note: the use of a quadsegs parameter is +deprecated and will be gone from the next major release.

  • +
  • cap_style (int, optional) – The styles of caps are: CAP_STYLE.round (1), CAP_STYLE.flat +(2), and CAP_STYLE.square (3).

  • +
  • join_style (int, optional) – The styles of joins between offset segments are: +JOIN_STYLE.round (1), JOIN_STYLE.mitre (2), and +JOIN_STYLE.bevel (3).

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_sided (bool, optional) –

    The side used is determined by the sign of the buffer distance: a positive distance indicates the left-hand side and a negative distance indicates the right-hand side.

    +

    The single-sided buffer of point geometries is the same as the regular buffer. The End Cap Style for single-sided buffers is always ignored, and forced to the equivalent of CAP_FLAT.

    +

  • +
+
+
Return type
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+>>> g.buffer(1.0).area        # 16-gon approx of a unit radius circle
+3.1365484905459...
+>>> g.buffer(1.0, 128).area   # 128-gon approximation
+3.141513801144...
+>>> round(g.buffer(1.0, 3).area, 10)  # triangle approximation
+3.0
+>>> list(g.buffer(1.0, cap_style=CAP_STYLE.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=CAP_STYLE.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other)
+

Returns the difference of the geometries

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+empty(val=93892948296208)
+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +

This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry. Because of this it is possible for “equals()” to be True for two geometries and “equals_exact()” to be False.
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+classmethod from_bounds(xmin, ymin, xmax, ymax)[source]
+

Construct a Polygon() from spatial bounds.
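A minimal doctest-style sketch (illustrative only):

>>> from lasso.util import Polygon
>>> Polygon.from_bounds(0.0, 0.0, 2.0, 1.0).bounds
(0.0, 0.0, 2.0, 1.0)
>>> Polygon.from_bounds(0.0, 0.0, 2.0, 1.0).area
2.0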

+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless Hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+
+ +
+
+intersection(other)
+

Returns the intersection of the geometries

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely.wkt import loads
+>>> p = loads("MULTILINESTRING((0 0, 1 1), (3 3, 2 2))")
+>>> p.normalize().wkt
+'MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))'
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG path element for the Polygon geometry.

+
+
Parameters
+
    +
  • scale_factor (float) – Multiplication factor for the SVG stroke-width. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other)
+

Returns the symmetric difference of the geometries +(Shapely geometry)

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other)
+

Returns the union of the geometries (Shapely geometry)

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property array_interface_base
+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a convex hull, more or less.

+

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property ctypes
+

Return ctypes buffer

+
+ +
+
+property envelope
+

A figure that envelops the geometry

+
+ +
+
+property exterior
+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+impl = <GEOSImpl object: GEOS C API version (1, 13, 0)>
+
+ +
+
+property interiors
+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the general minimum bounding rectangle of +the geometry. Can possibly be rotated. If the convex hull +of the object is a degenerate (line or point) this same degenerate +is returned.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+
+ +
+ +
+
+class lasso.util.partial[source]
+

Bases: object

+

partial(func, *args, **keywords) - new function with partial application +of the given arguments and keywords.

+
+
+args
+

tuple of arguments to future partial calls

+
+ +
+
+func
+

function object to use in future partial calls

+
+ +
+
+keywords
+

dictionary of keyword arguments to future partial calls

+
+ +
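A minimal usage sketch of this re-exported partial (illustrative; nothing here is specific to lasso):

>>> from lasso.util import partial
>>> def multiply(x, y):
...     return x * y
...
>>> double = partial(multiply, 2)
>>> double(7)
14
>>> double.args, double.keywords
((2,), {})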
+ +
+
+lasso.util.column_name_to_parts(c, parameters=None)[source]
+
+ +
+
+lasso.util.create_locationreference(node, link)[source]
+
+ +
+
+lasso.util.geodesic_point_buffer(lat, lon, meters)[source]
+

Creates a circular buffer polygon around a node.

+
+
Parameters
+
    +
  • lat – node lat

  • +
  • lon – node lon

  • +
  • meters – buffer distance, radius of circle

  • +
+
+
Returns
+

Polygon

+
+
+
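A hedged usage sketch (the coordinates and radius are arbitrary placeholders; it assumes the return value is a Shapely Polygon, as the Returns section above states):

>>> from lasso.util import geodesic_point_buffer
>>> buf = geodesic_point_buffer(44.95, -93.09, 100)   # ~100 m circle around a node
>>> buf.is_valid
True
>>> buf.contains(buf.centroid)
True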
+ +
+
+lasso.util.get_shared_streets_intersection_hash(lat, long, osm_node_id=None)[source]
+
+
Calculated per:

https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565

+

Expected in/out:

in: -93.0965985, 44.952112199999995, osm_node_id = 954734870

out: 69f13f881649cb21ee3b359730790bb9

+
+
+
+
+
+ +
+
+lasso.util.hhmmss_to_datetime(hhmmss_str)[source]
+

Creates a datetime time object from a string of hh:mm:ss

+
+
Parameters
+

hhmmss_str – string of hh:mm:ss

+
+
Returns
+

datetime.time object representing time

+
+
Return type
+

datetime.time

+
+
+
+ +
+
+lasso.util.secs_to_datetime(secs)[source]
+

Creates a datetime time object from seconds from midnight

+
+
Parameters
+

secs – seconds from midnight

+
+
Returns
+

datetime.time object representing time

+
+
Return type
+

datetime.time

+
+
+
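Hedged sketches of the two time helpers above; the expected values follow the documented behaviour, though the exact repr may differ:

>>> from lasso.util import hhmmss_to_datetime, secs_to_datetime
>>> hhmmss_to_datetime("06:30:00")        # -> datetime.time(6, 30)
>>> secs_to_datetime(6 * 3600 + 30 * 60)  # -> datetime.time(6, 30)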
+ +
+
+lasso.util.shorten_name(name)[source]
+
+ +
+
+lasso.util.transform(func, geom)[source]
+

Applies func to all coordinates of geom and returns a new +geometry of the same type from the transformed coordinates.

+

func maps x, y, and optionally z to output xp, yp, zp. The input +parameters may iterable types like lists or arrays or single values. +The output shall be of the same type. Scalars in, scalars out. +Lists in, lists out.

+

For example, here is an identity function applicable to both types +of input.

+
+
+
def id_func(x, y, z=None):
    return tuple(filter(None, [x, y, z]))

+

g2 = transform(id_func, g1)

+
+

Using pyproj >= 2.1, this example will accurately project Shapely geometries:

+
+

import pyproj

wgs84 = pyproj.CRS("EPSG:4326")
utm = pyproj.CRS("EPSG:32618")

project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform

g2 = transform(project, g1)

+
+

Note that the always_xy kwarg is required here as Shapely geometries only support +X,Y coordinate ordering.

+

Lambda expressions such as the one in

+
+

g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1)

+
+

also satisfy the requirements for func.

+
+ +
+
+lasso.util.unidecode(string, errors='ignore', replace_str='?')
+

Transliterate a Unicode object into an ASCII string

+
>>> unidecode("北亰")
+"Bei Jing "
+
+
+

This function first tries to convert the string using ASCII codec. +If it fails (because of non-ASCII characters), it falls back to +transliteration using the character tables.

+

This is approx. five times faster if the string only contains ASCII +characters, but slightly slower than unicode_expect_nonascii if +non-ASCII characters are present.

+

errors specifies what to do with characters that have not been +found in replacement tables. The default is ‘ignore’ which ignores +the character. ‘strict’ raises an UnidecodeError. ‘replace’ +substitutes the character with replace_str (default is ‘?’). +‘preserve’ keeps the original character.

+

Note that if ‘preserve’ is used the returned string might not be +ASCII!

+
+
Return type
+

str

+
+
+
+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_modules/functools/index.html b/branch/bart/_modules/functools/index.html new file mode 100644 index 0000000..f323701 --- /dev/null +++ b/branch/bart/_modules/functools/index.html @@ -0,0 +1,1078 @@ + + + + + + functools — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for functools

+"""functools.py - Tools for working with functions and callable objects
+"""
+# Python module wrapper for _functools C module
+# to allow utilities written in Python to be added
+# to the functools module.
+# Written by Nick Coghlan <ncoghlan at gmail.com>,
+# Raymond Hettinger <python at rcn.com>,
+# and Łukasz Langa <lukasz at langa.pl>.
+#   Copyright (C) 2006-2013 Python Software Foundation.
+# See C source code for _functools credits/copyright
+
+__all__ = ['update_wrapper', 'wraps', 'WRAPPER_ASSIGNMENTS', 'WRAPPER_UPDATES',
+           'total_ordering', 'cmp_to_key', 'lru_cache', 'reduce', 'partial',
+           'partialmethod', 'singledispatch', 'singledispatchmethod',
+           "cached_property"]
+
+from abc import get_cache_token
+from collections import namedtuple
+# import types, weakref  # Deferred to single_dispatch()
+from reprlib import recursive_repr
+from _thread import RLock
+
+
+################################################################################
+### update_wrapper() and wraps() decorator
+################################################################################
+
+# update_wrapper() and wraps() are tools to help write
+# wrapper functions that can handle naive introspection
+
+WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__',
+                       '__annotations__')
+WRAPPER_UPDATES = ('__dict__',)
+def update_wrapper(wrapper,
+                   wrapped,
+                   assigned = WRAPPER_ASSIGNMENTS,
+                   updated = WRAPPER_UPDATES):
+    """Update a wrapper function to look like the wrapped function
+
+       wrapper is the function to be updated
+       wrapped is the original function
+       assigned is a tuple naming the attributes assigned directly
+       from the wrapped function to the wrapper function (defaults to
+       functools.WRAPPER_ASSIGNMENTS)
+       updated is a tuple naming the attributes of the wrapper that
+       are updated with the corresponding attribute from the wrapped
+       function (defaults to functools.WRAPPER_UPDATES)
+    """
+    for attr in assigned:
+        try:
+            value = getattr(wrapped, attr)
+        except AttributeError:
+            pass
+        else:
+            setattr(wrapper, attr, value)
+    for attr in updated:
+        getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
+    # Issue #17482: set __wrapped__ last so we don't inadvertently copy it
+    # from the wrapped function when updating __dict__
+    wrapper.__wrapped__ = wrapped
+    # Return the wrapper so this can be used as a decorator via partial()
+    return wrapper
+
+def wraps(wrapped,
+          assigned = WRAPPER_ASSIGNMENTS,
+          updated = WRAPPER_UPDATES):
+    """Decorator factory to apply update_wrapper() to a wrapper function
+
+       Returns a decorator that invokes update_wrapper() with the decorated
+       function as the wrapper argument and the arguments to wraps() as the
+       remaining arguments. Default arguments are as for update_wrapper().
+       This is a convenience function to simplify applying partial() to
+       update_wrapper().
+    """
+    return partial(update_wrapper, wrapped=wrapped,
+                   assigned=assigned, updated=updated)
+
+
+################################################################################
+### total_ordering class decorator
+################################################################################
+
+# The total ordering functions all invoke the root magic method directly
+# rather than using the corresponding operator.  This avoids possible
+# infinite recursion that could occur when the operator dispatch logic
+# detects a NotImplemented result and then calls a reflected method.
+
+def _gt_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a < b) and (a != b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _le_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (a < b) or (a == b).'
+    op_result = self.__lt__(other)
+    return op_result or self == other
+
+def _ge_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a < b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _ge_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a <= b) or (a == b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _lt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (a <= b) and (a != b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _gt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a <= b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _lt_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a > b) and (a != b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _ge_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (a > b) or (a == b).'
+    op_result = self.__gt__(other)
+    return op_result or self == other
+
+def _le_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a > b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _le_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a >= b) or (a == b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _gt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (a >= b) and (a != b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _lt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a >= b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+_convert = {
+    '__lt__': [('__gt__', _gt_from_lt),
+               ('__le__', _le_from_lt),
+               ('__ge__', _ge_from_lt)],
+    '__le__': [('__ge__', _ge_from_le),
+               ('__lt__', _lt_from_le),
+               ('__gt__', _gt_from_le)],
+    '__gt__': [('__lt__', _lt_from_gt),
+               ('__ge__', _ge_from_gt),
+               ('__le__', _le_from_gt)],
+    '__ge__': [('__le__', _le_from_ge),
+               ('__gt__', _gt_from_ge),
+               ('__lt__', _lt_from_ge)]
+}
+
+def total_ordering(cls):
+    """Class decorator that fills in missing ordering methods"""
+    # Find user-defined comparisons (not those inherited from object).
+    roots = {op for op in _convert if getattr(cls, op, None) is not getattr(object, op, None)}
+    if not roots:
+        raise ValueError('must define at least one ordering operation: < > <= >=')
+    root = max(roots)       # prefer __lt__ to __le__ to __gt__ to __ge__
+    for opname, opfunc in _convert[root]:
+        if opname not in roots:
+            opfunc.__name__ = opname
+            setattr(cls, opname, opfunc)
+    return cls
+
+
+################################################################################
+### cmp_to_key() function converter
+################################################################################
+
+def cmp_to_key(mycmp):
+    """Convert a cmp= function into a key= function"""
+    class K(object):
+        __slots__ = ['obj']
+        def __init__(self, obj):
+            self.obj = obj
+        def __lt__(self, other):
+            return mycmp(self.obj, other.obj) < 0
+        def __gt__(self, other):
+            return mycmp(self.obj, other.obj) > 0
+        def __eq__(self, other):
+            return mycmp(self.obj, other.obj) == 0
+        def __le__(self, other):
+            return mycmp(self.obj, other.obj) <= 0
+        def __ge__(self, other):
+            return mycmp(self.obj, other.obj) >= 0
+        __hash__ = None
+    return K
+
+try:
+    from _functools import cmp_to_key
+except ImportError:
+    pass
+
+
+################################################################################
+### reduce() sequence to a single item
+################################################################################
+
+_initial_missing = object()
+
+def reduce(function, sequence, initial=_initial_missing):
+    """
+    reduce(function, sequence[, initial]) -> value
+
+    Apply a function of two arguments cumulatively to the items of a sequence,
+    from left to right, so as to reduce the sequence to a single value.
+    For example, reduce(lambda x, y: x+y, [1, 2, 3, 4, 5]) calculates
+    ((((1+2)+3)+4)+5).  If initial is present, it is placed before the items
+    of the sequence in the calculation, and serves as a default when the
+    sequence is empty.
+    """
+
+    it = iter(sequence)
+
+    if initial is _initial_missing:
+        try:
+            value = next(it)
+        except StopIteration:
+            raise TypeError("reduce() of empty sequence with no initial value") from None
+    else:
+        value = initial
+
+    for element in it:
+        value = function(value, element)
+
+    return value
+
+try:
+    from _functools import reduce
+except ImportError:
+    pass
+
+
+################################################################################
+### partial() argument application
+################################################################################
+
+# Purely functional, no descriptor behaviour
+
[docs]class partial: + """New function with partial application of the given arguments + and keywords. + """ + + __slots__ = "func", "args", "keywords", "__dict__", "__weakref__" + + def __new__(cls, func, /, *args, **keywords): + if not callable(func): + raise TypeError("the first argument must be callable") + + if hasattr(func, "func"): + args = func.args + args + keywords = {**func.keywords, **keywords} + func = func.func + + self = super(partial, cls).__new__(cls) + + self.func = func + self.args = args + self.keywords = keywords + return self + + def __call__(self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(*self.args, *args, **keywords) + + @recursive_repr() + def __repr__(self): + qualname = type(self).__qualname__ + args = [repr(self.func)] + args.extend(repr(x) for x in self.args) + args.extend(f"{k}={v!r}" for (k, v) in self.keywords.items()) + if type(self).__module__ == "functools": + return f"functools.{qualname}({', '.join(args)})" + return f"{qualname}({', '.join(args)})" + + def __reduce__(self): + return type(self), (self.func,), (self.func, self.args, + self.keywords or None, self.__dict__ or None) + + def __setstate__(self, state): + if not isinstance(state, tuple): + raise TypeError("argument to __setstate__ must be a tuple") + if len(state) != 4: + raise TypeError(f"expected 4 items in state, got {len(state)}") + func, args, kwds, namespace = state + if (not callable(func) or not isinstance(args, tuple) or + (kwds is not None and not isinstance(kwds, dict)) or + (namespace is not None and not isinstance(namespace, dict))): + raise TypeError("invalid partial state") + + args = tuple(args) # just in case it's a subclass + if kwds is None: + kwds = {} + elif type(kwds) is not dict: # XXX does it need to be *exactly* dict? + kwds = dict(kwds) + if namespace is None: + namespace = {} + + self.__dict__ = namespace + self.func = func + self.args = args + self.keywords = kwds
+ +try: + from _functools import partial +except ImportError: + pass + +# Descriptor version +class partialmethod(object): + """Method descriptor with partial application of the given arguments + and keywords. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(*args, **keywords): + if len(args) >= 2: + self, func, *args = args + elif not args: + raise TypeError("descriptor '__init__' of partialmethod " + "needs an argument") + elif 'func' in keywords: + func = keywords.pop('func') + self, *args = args + import warnings + warnings.warn("Passing 'func' as keyword argument is deprecated", + DeprecationWarning, stacklevel=2) + else: + raise TypeError("type 'partialmethod' takes at least one argument, " + "got %d" % (len(args)-1)) + args = tuple(args) + + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError("{!r} is not callable or a descriptor" + .format(func)) + + # func could be a descriptor like classmethod which isn't callable, + # so we can't inherit from partial (it verifies func is callable) + if isinstance(func, partialmethod): + # flattening is mandatory in order to place cls/self before all + # other arguments + # it's also more efficient since only one function will be called + self.func = func.func + self.args = func.args + args + self.keywords = {**func.keywords, **keywords} + else: + self.func = func + self.args = args + self.keywords = keywords + __init__.__text_signature__ = '($self, func, /, *args, **keywords)' + + def __repr__(self): + args = ", ".join(map(repr, self.args)) + keywords = ", ".join("{}={!r}".format(k, v) + for k, v in self.keywords.items()) + format_string = "{module}.{cls}({func}, {args}, {keywords})" + return format_string.format(module=self.__class__.__module__, + cls=self.__class__.__qualname__, + func=self.func, + args=args, + keywords=keywords) + + def _make_unbound_method(self): + def _method(cls_or_self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(cls_or_self, *self.args, *args, **keywords) + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method._partialmethod = self + return _method + + def __get__(self, obj, cls=None): + get = getattr(self.func, "__get__", None) + result = None + if get is not None: + new_func = get(obj, cls) + if new_func is not self.func: + # Assume __get__ returning something new indicates the + # creation of an appropriate callable + result = partial(new_func, *self.args, **self.keywords) + try: + result.__self__ = new_func.__self__ + except AttributeError: + pass + if result is None: + # If the underlying descriptor didn't do anything, treat this + # like an instance method + result = self._make_unbound_method().__get__(obj, cls) + return result + + @property + def __isabstractmethod__(self): + return getattr(self.func, "__isabstractmethod__", False) + +# Helper functions + +def _unwrap_partial(func): + while isinstance(func, partial): + func = func.func + return func + +################################################################################ +### LRU Cache function decorator +################################################################################ + +_CacheInfo = namedtuple("CacheInfo", ["hits", "misses", "maxsize", "currsize"]) + +class _HashedSeq(list): + """ This class guarantees that hash() will be called no more than once + per element. This is important because the lru_cache() will hash + the key multiple times on a cache miss. 
+ + """ + + __slots__ = 'hashvalue' + + def __init__(self, tup, hash=hash): + self[:] = tup + self.hashvalue = hash(tup) + + def __hash__(self): + return self.hashvalue + +def _make_key(args, kwds, typed, + kwd_mark = (object(),), + fasttypes = {int, str}, + tuple=tuple, type=type, len=len): + """Make a cache key from optionally typed positional and keyword arguments + + The key is constructed in a way that is flat as possible rather than + as a nested structure that would take more memory. + + If there is only a single argument and its data type is known to cache + its hash value, then that argument is returned without a wrapper. This + saves space and improves lookup speed. + + """ + # All of code below relies on kwds preserving the order input by the user. + # Formerly, we sorted() the kwds before looping. The new way is *much* + # faster; however, it means that f(x=1, y=2) will now be treated as a + # distinct call from f(y=2, x=1) which will be cached separately. + key = args + if kwds: + key += kwd_mark + for item in kwds.items(): + key += item + if typed: + key += tuple(type(v) for v in args) + if kwds: + key += tuple(type(v) for v in kwds.values()) + elif len(key) == 1 and type(key[0]) in fasttypes: + return key[0] + return _HashedSeq(key) + +def lru_cache(maxsize=128, typed=False): + """Least-recently-used cache decorator. + + If *maxsize* is set to None, the LRU features are disabled and the cache + can grow without bound. + + If *typed* is True, arguments of different types will be cached separately. + For example, f(3.0) and f(3) will be treated as distinct calls with + distinct results. + + Arguments to the cached function must be hashable. + + View the cache statistics named tuple (hits, misses, maxsize, currsize) + with f.cache_info(). Clear the cache and statistics with f.cache_clear(). + Access the underlying function with f.__wrapped__. + + See: http://en.wikipedia.org/wiki/Cache_replacement_policies#Least_recently_used_(LRU) + + """ + + # Users should only access the lru_cache through its public API: + # cache_info, cache_clear, and f.__wrapped__ + # The internals of the lru_cache are encapsulated for thread safety and + # to allow the implementation to change (including a possible C version). 
+ + if isinstance(maxsize, int): + # Negative maxsize is treated as 0 + if maxsize < 0: + maxsize = 0 + elif callable(maxsize) and isinstance(typed, bool): + # The user_function was passed in directly via the maxsize argument + user_function, maxsize = maxsize, 128 + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + elif maxsize is not None: + raise TypeError( + 'Expected first argument to be an integer, a callable, or None') + + def decorating_function(user_function): + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + + return decorating_function + +def _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo): + # Constants shared by all lru cache instances: + sentinel = object() # unique object used to signal cache misses + make_key = _make_key # build a key from the function arguments + PREV, NEXT, KEY, RESULT = 0, 1, 2, 3 # names for the link fields + + cache = {} + hits = misses = 0 + full = False + cache_get = cache.get # bound method to lookup a key or return None + cache_len = cache.__len__ # get cache size without calling len() + lock = RLock() # because linkedlist updates aren't threadsafe + root = [] # root of the circular doubly linked list + root[:] = [root, root, None, None] # initialize by pointing to self + + if maxsize == 0: + + def wrapper(*args, **kwds): + # No caching -- just a statistics update + nonlocal misses + misses += 1 + result = user_function(*args, **kwds) + return result + + elif maxsize is None: + + def wrapper(*args, **kwds): + # Simple caching without ordering or size limit + nonlocal hits, misses + key = make_key(args, kwds, typed) + result = cache_get(key, sentinel) + if result is not sentinel: + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + cache[key] = result + return result + + else: + + def wrapper(*args, **kwds): + # Size limited caching that tracks accesses by recency + nonlocal root, hits, misses, full + key = make_key(args, kwds, typed) + with lock: + link = cache_get(key) + if link is not None: + # Move the link to the front of the circular queue + link_prev, link_next, _key, result = link + link_prev[NEXT] = link_next + link_next[PREV] = link_prev + last = root[PREV] + last[NEXT] = root[PREV] = link + link[PREV] = last + link[NEXT] = root + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + with lock: + if key in cache: + # Getting here means that this same key was added to the + # cache while the lock was released. Since the link + # update is already done, we need only return the + # computed result and update the count of misses. + pass + elif full: + # Use the old root to store the new key and result. + oldroot = root + oldroot[KEY] = key + oldroot[RESULT] = result + # Empty the oldest link and make it the new root. + # Keep a reference to the old key and old result to + # prevent their ref counts from going to zero during the + # update. That will prevent potentially arbitrary object + # clean-up code (i.e. __del__) from running while we're + # still adjusting the links. + root = oldroot[NEXT] + oldkey = root[KEY] + oldresult = root[RESULT] + root[KEY] = root[RESULT] = None + # Now update the cache dictionary. + del cache[oldkey] + # Save the potentially reentrant cache[key] assignment + # for last, after the root and links have been put in + # a consistent state. 
+ cache[key] = oldroot + else: + # Put result in a new link at the front of the queue. + last = root[PREV] + link = [last, root, key, result] + last[NEXT] = root[PREV] = cache[key] = link + # Use the cache_len bound method instead of the len() function + # which could potentially be wrapped in an lru_cache itself. + full = (cache_len() >= maxsize) + return result + + def cache_info(): + """Report cache statistics""" + with lock: + return _CacheInfo(hits, misses, maxsize, cache_len()) + + def cache_clear(): + """Clear the cache and cache statistics""" + nonlocal hits, misses, full + with lock: + cache.clear() + root[:] = [root, root, None, None] + hits = misses = 0 + full = False + + wrapper.cache_info = cache_info + wrapper.cache_clear = cache_clear + return wrapper + +try: + from _functools import _lru_cache_wrapper +except ImportError: + pass + + +################################################################################ +### singledispatch() - single-dispatch generic function decorator +################################################################################ + +def _c3_merge(sequences): + """Merges MROs in *sequences* to a single MRO using the C3 algorithm. + + Adapted from http://www.python.org/download/releases/2.3/mro/. + + """ + result = [] + while True: + sequences = [s for s in sequences if s] # purge empty sequences + if not sequences: + return result + for s1 in sequences: # find merge candidates among seq heads + candidate = s1[0] + for s2 in sequences: + if candidate in s2[1:]: + candidate = None + break # reject the current head, it appears later + else: + break + if candidate is None: + raise RuntimeError("Inconsistent hierarchy") + result.append(candidate) + # remove the chosen candidate + for seq in sequences: + if seq[0] == candidate: + del seq[0] + +def _c3_mro(cls, abcs=None): + """Computes the method resolution order using extended C3 linearization. + + If no *abcs* are given, the algorithm works exactly like the built-in C3 + linearization used for method resolution. + + If given, *abcs* is a list of abstract base classes that should be inserted + into the resulting MRO. Unrelated ABCs are ignored and don't end up in the + result. The algorithm inserts ABCs where their functionality is introduced, + i.e. issubclass(cls, abc) returns True for the class itself but returns + False for all its direct base classes. Implicit ABCs for a given class + (either registered or inferred from the presence of a special method like + __len__) are inserted directly after the last ABC explicitly listed in the + MRO of said class. If two implicit ABCs end up next to each other in the + resulting MRO, their ordering depends on the order of types in *abcs*. + + """ + for i, base in enumerate(reversed(cls.__bases__)): + if hasattr(base, '__abstractmethods__'): + boundary = len(cls.__bases__) - i + break # Bases up to the last explicit ABC are considered first. + else: + boundary = 0 + abcs = list(abcs) if abcs else [] + explicit_bases = list(cls.__bases__[:boundary]) + abstract_bases = [] + other_bases = list(cls.__bases__[boundary:]) + for base in abcs: + if issubclass(cls, base) and not any( + issubclass(b, base) for b in cls.__bases__ + ): + # If *cls* is the class that introduces behaviour described by + # an ABC *base*, insert said ABC to its MRO. 
+ abstract_bases.append(base) + for base in abstract_bases: + abcs.remove(base) + explicit_c3_mros = [_c3_mro(base, abcs=abcs) for base in explicit_bases] + abstract_c3_mros = [_c3_mro(base, abcs=abcs) for base in abstract_bases] + other_c3_mros = [_c3_mro(base, abcs=abcs) for base in other_bases] + return _c3_merge( + [[cls]] + + explicit_c3_mros + abstract_c3_mros + other_c3_mros + + [explicit_bases] + [abstract_bases] + [other_bases] + ) + +def _compose_mro(cls, types): + """Calculates the method resolution order for a given class *cls*. + + Includes relevant abstract base classes (with their respective bases) from + the *types* iterable. Uses a modified C3 linearization algorithm. + + """ + bases = set(cls.__mro__) + # Remove entries which are already present in the __mro__ or unrelated. + def is_related(typ): + return (typ not in bases and hasattr(typ, '__mro__') + and issubclass(cls, typ)) + types = [n for n in types if is_related(n)] + # Remove entries which are strict bases of other entries (they will end up + # in the MRO anyway. + def is_strict_base(typ): + for other in types: + if typ != other and typ in other.__mro__: + return True + return False + types = [n for n in types if not is_strict_base(n)] + # Subclasses of the ABCs in *types* which are also implemented by + # *cls* can be used to stabilize ABC ordering. + type_set = set(types) + mro = [] + for typ in types: + found = [] + for sub in typ.__subclasses__(): + if sub not in bases and issubclass(cls, sub): + found.append([s for s in sub.__mro__ if s in type_set]) + if not found: + mro.append(typ) + continue + # Favor subclasses with the biggest number of useful bases + found.sort(key=len, reverse=True) + for sub in found: + for subcls in sub: + if subcls not in mro: + mro.append(subcls) + return _c3_mro(cls, abcs=mro) + +def _find_impl(cls, registry): + """Returns the best matching implementation from *registry* for type *cls*. + + Where there is no registered implementation for a specific type, its method + resolution order is used to find a more generic implementation. + + Note: if *registry* does not contain an implementation for the base + *object* type, this function may return None. + + """ + mro = _compose_mro(cls, registry.keys()) + match = None + for t in mro: + if match is not None: + # If *match* is an implicit ABC but there is another unrelated, + # equally matching implicit ABC, refuse the temptation to guess. + if (t in registry and t not in cls.__mro__ + and match not in cls.__mro__ + and not issubclass(match, t)): + raise RuntimeError("Ambiguous dispatch: {} or {}".format( + match, t)) + break + if t in registry: + match = t + return registry.get(match) + +def singledispatch(func): + """Single-dispatch generic function decorator. + + Transforms a function into a generic function, which can have different + behaviours depending upon the type of its first argument. The decorated + function acts as the default implementation, and additional + implementations can be registered using the register() attribute of the + generic function. + """ + # There are many programs that use functools without singledispatch, so we + # trade-off making singledispatch marginally slower for the benefit of + # making start-up of such applications slightly faster. 
+ import types, weakref + + registry = {} + dispatch_cache = weakref.WeakKeyDictionary() + cache_token = None + + def dispatch(cls): + """generic_func.dispatch(cls) -> <function implementation> + + Runs the dispatch algorithm to return the best available implementation + for the given *cls* registered on *generic_func*. + + """ + nonlocal cache_token + if cache_token is not None: + current_token = get_cache_token() + if cache_token != current_token: + dispatch_cache.clear() + cache_token = current_token + try: + impl = dispatch_cache[cls] + except KeyError: + try: + impl = registry[cls] + except KeyError: + impl = _find_impl(cls, registry) + dispatch_cache[cls] = impl + return impl + + def register(cls, func=None): + """generic_func.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_func*. + + """ + nonlocal cache_token + if func is None: + if isinstance(cls, type): + return lambda f: register(cls, f) + ann = getattr(cls, '__annotations__', {}) + if not ann: + raise TypeError( + f"Invalid first argument to `register()`: {cls!r}. " + f"Use either `@register(some_class)` or plain `@register` " + f"on an annotated function." + ) + func = cls + + # only import typing if annotation parsing is necessary + from typing import get_type_hints + argname, cls = next(iter(get_type_hints(func).items())) + if not isinstance(cls, type): + raise TypeError( + f"Invalid annotation for {argname!r}. " + f"{cls!r} is not a class." + ) + registry[cls] = func + if cache_token is None and hasattr(cls, '__abstractmethods__'): + cache_token = get_cache_token() + dispatch_cache.clear() + return func + + def wrapper(*args, **kw): + if not args: + raise TypeError(f'{funcname} requires at least ' + '1 positional argument') + + return dispatch(args[0].__class__)(*args, **kw) + + funcname = getattr(func, '__name__', 'singledispatch function') + registry[object] = func + wrapper.register = register + wrapper.dispatch = dispatch + wrapper.registry = types.MappingProxyType(registry) + wrapper._clear_cache = dispatch_cache.clear + update_wrapper(wrapper, func) + return wrapper + + +# Descriptor version +class singledispatchmethod: + """Single-dispatch generic method descriptor. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(self, func): + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError(f"{func!r} is not callable or a descriptor") + + self.dispatcher = singledispatch(func) + self.func = func + + def register(self, cls, method=None): + """generic_method.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_method*. 
+ """ + return self.dispatcher.register(cls, func=method) + + def __get__(self, obj, cls=None): + def _method(*args, **kwargs): + method = self.dispatcher.dispatch(args[0].__class__) + return method.__get__(obj, cls)(*args, **kwargs) + + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method.register = self.register + update_wrapper(_method, self.func) + return _method + + @property + def __isabstractmethod__(self): + return getattr(self.func, '__isabstractmethod__', False) + + +################################################################################ +### cached_property() - computed once per instance, cached as attribute +################################################################################ + +_NOT_FOUND = object() + + +class cached_property: + def __init__(self, func): + self.func = func + self.attrname = None + self.__doc__ = func.__doc__ + self.lock = RLock() + + def __set_name__(self, owner, name): + if self.attrname is None: + self.attrname = name + elif name != self.attrname: + raise TypeError( + "Cannot assign the same cached_property to two different names " + f"({self.attrname!r} and {name!r})." + ) + + def __get__(self, instance, owner=None): + if instance is None: + return self + if self.attrname is None: + raise TypeError( + "Cannot use cached_property instance without calling __set_name__ on it.") + try: + cache = instance.__dict__ + except AttributeError: # not all objects have __dict__ (e.g. class defines slots) + msg = ( + f"No '__dict__' attribute on {type(instance).__name__!r} " + f"instance to cache {self.attrname!r} property." + ) + raise TypeError(msg) from None + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + with self.lock: + # check if another thread filled cache while we awaited lock + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + val = self.func(instance) + try: + cache[self.attrname] = val + except TypeError: + msg = ( + f"The '__dict__' attribute on {type(instance).__name__!r} instance " + f"does not support item assignment for caching {self.attrname!r} property." + ) + raise TypeError(msg) from None + return val +
+ +
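For orientation, a short usage sketch of the lru_cache decorator defined in this module (standard functools behaviour, not lasso-specific):

>>> from functools import lru_cache
>>> @lru_cache(maxsize=None)
... def fib(n):
...     return n if n < 2 else fib(n - 1) + fib(n - 2)
...
>>> fib(16)
987
>>> fib.cache_info().misses   # each argument 0..16 computed exactly once
17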
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_modules/index.html b/branch/bart/_modules/index.html new file mode 100644 index 0000000..8ac1e74 --- /dev/null +++ b/branch/bart/_modules/index.html @@ -0,0 +1,111 @@ + + + + + + Overview: module code — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+ +
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_modules/lasso/logger/index.html b/branch/bart/_modules/lasso/logger/index.html new file mode 100644 index 0000000..78c1286 --- /dev/null +++ b/branch/bart/_modules/lasso/logger/index.html @@ -0,0 +1,148 @@ + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.logger

+import logging
+
+__all__ = ["WranglerLogger", "setupLogging"]
+
+
+# for all the Wrangler logging needs!
+WranglerLogger = logging.getLogger("WranglerLogger")
+
+
+
[docs]def setupLogging(infoLogFilename, debugLogFilename, logToConsole=True): + """Sets up the logger. The infoLog is terse, just gives the bare minimum of details + so the network composition will be clear later. + The debuglog is very noisy, for debugging. + + Pass none to either. + Spews it all out to console too, if logToConsole is true. + """ + # clear handlers if any exist already + WranglerLogger.handlers = [] + + # create a logger + WranglerLogger.setLevel(logging.DEBUG) + + if infoLogFilename: + infologhandler = logging.StreamHandler(open(infoLogFilename, "w")) + infologhandler.setLevel(logging.INFO) + infologhandler.setFormatter( + logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s") + ) + WranglerLogger.addHandler(infologhandler) + + if debugLogFilename: + debugloghandler = logging.StreamHandler(open(debugLogFilename, "w")) + debugloghandler.setLevel(logging.DEBUG) + debugloghandler.setFormatter( + logging.Formatter("%(asctime)s %(levelname)s %(message)s", "%Y-%m-%d %H:%M") + ) + WranglerLogger.addHandler(debugloghandler) + + if logToConsole: + consolehandler = logging.StreamHandler() + consolehandler.setLevel(logging.DEBUG) + consolehandler.setFormatter( + logging.Formatter("%(name)-12s: %(levelname)-8s %(message)s") + ) + WranglerLogger.addHandler(consolehandler)
+
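A short usage sketch of the logger defined above; the file names are illustrative placeholders:

from lasso.logger import WranglerLogger, setupLogging

# Terse composition log to one file, noisy debug log to another, and echo everything to console.
setupLogging("lasso_info.log", "lasso_debug.log", logToConsole=True)
WranglerLogger.info("lasso logging configured")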
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_modules/lasso/parameters/index.html b/branch/bart/_modules/lasso/parameters/index.html new file mode 100644 index 0000000..2e3028f --- /dev/null +++ b/branch/bart/_modules/lasso/parameters/index.html @@ -0,0 +1,1029 @@ + + + + + + lasso.parameters — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.parameters

+import os
+from .logger import WranglerLogger
+
+
+from pyproj import CRS
+
+
+def get_base_dir(lasso_base_dir=os.getcwd()):
+    d = lasso_base_dir
+    for i in range(3):
+        if "metcouncil_data" in os.listdir(d):
+
+            WranglerLogger.info("Lasso base directory set as: {}".format(d))
+            return d
+        d = os.path.dirname(d)
+
+    msg = "Cannot find Lasso base directory from {}, please input using keyword in parameters: `lasso_base_dir =` ".format(
+        lasso_base_dir
+    )
+    WranglerLogger.error(msg)
+    raise (ValueError(msg))
+
+
+
[docs]class Parameters: + """A class representing all the parameters defining the networks + including time of day, categories, etc. + + Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + .. highlight:: python + + Attr: + time_period_to_time (dict): Maps time period abbreviations used in + Cube to time of days used on gtfs and highway network standard + Default: + :: + { + "EA": ("3:00", "6:00"), + "AM": ("6:00, "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + cube_time_periods (dict): Maps cube time period numbers used in + transit line files to the time period abbreviations in time_period_to_time + dictionary. + Default: + :: + {"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"} + categories (dict): Maps demand category abbreviations to a list of + network categories they are allowed to use. + Default: + :: + { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + properties_to_split (dict): Dictionary mapping variables in standard + roadway network to categories and time periods that need to be + split out in final model network to get variables like LANES_AM. + Default: + :: + { + "lanes": { + "v": "lanes", + "time_periods": self.time_periods_to_time + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_periods_to_time + }, + "use": { + "v": "use", + "time_periods": self.time_periods_to_time + }, + } + + county_shape (str): File location of shapefile defining counties. + Default: + :: + r"metcouncil_data/county/cb_2017_us_county_5m.shp" + + county_variable_shp (str): Property defining the county n ame in + the county_shape file. + Default: + :: + NAME + lanes_lookup_file (str): Lookup table of number of lanes for different data sources. + Default: + :: + r"metcouncil_data/lookups/lanes.csv" + centroid_connect_lanes (int): Number of lanes for centroid connectors. + Default: + :: + 1 + mpo_counties (list): list of county names within MPO boundary. + Default: + :: + [ + "ANOKA", + "DAKOTA", + "HENNEPIN", + "RAMSEY", + "SCOTT", + "WASHINGTON", + "CARVER", + ] + + taz_shape (str): + Default: + :: + r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp" + taz_data (str): + Default: + :: + ?? + highest_taz_number (int): highest TAZ number in order to define + centroid connectors. + Default: + :: + 3100 + + output_variables (list): list of variables to output in final model + network. 
+ Default: + :: + [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + "distance", + "roadway", + "name", + "roadway_class", + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "trn_priority_EA", + "trn_priority_AM", + "trn_priority_MD", + "trn_priority_PM", + "trn_priority_EV", + "ttime_assert_EA", + "ttime_assert_AM", + "ttime_assert_MD", + "ttime_assert_PM", + "ttime_assert_EV", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "price_sov_EA", + "price_hov2_EA", + "price_hov3_EA", + "price_truck_EA", + "price_sov_AM", + "price_hov2_AM", + "price_hov3_AM", + "price_truck_AM", + "price_sov_MD", + "price_hov2_MD", + "price_hov3_MD", + "price_truck_MD", + "price_sov_PM", + "price_hov2_PM", + "price_hov3_PM", + "price_truck_PM", + "price_sov_EV", + "price_hov2_EV", + "price_hov3_EV", + "price_truck_EV", + "roadway_class_idx", + "facility_type", + "county", + "centroidconnect", + "model_node_id", + "N", + "osm_node_id", + "bike_node", + "transit_node", + "walk_node", + "drive_node", + "geometry", + "X", + "Y", + "ML_lanes_EA", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "segment_id", + "managed", + "bus_only", + "rail_only" + ] + + osm_facility_type_dict (dict): Mapping between OSM Roadway variable + and facility type. Default: + + area_type_shape (str): Location of shapefile defining area type. + Default: + :: + r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp" + area_type_variable_shp (str): property in area_type_shape with area + type in it. + Default: + :: + "COMDES2040" + area_type_code_dict (dict): Mapping of the area_type_variable_shp to + the area type code used in the MetCouncil cube network. + Default: + :: + { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + downtown_area_type_shape (str): Location of shapefile defining downtown area type. + Default: + :: + r"metcouncil_data/area_type/downtownzones_TAZ.shp" + downtown_area_type (int): Area type integer for downtown. + Default: + :: + 5 + mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp" + mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp + associated with roadway class. Default: + :: + "ROUTE_SYS" + widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp" + widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape + associated with roadway class.Default: + :: + "RDWY_CTGY_" + mndot_count_shape (str): Shapefile of MnDOT links with a property + associated with counts. Default: + :: + r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp" + mndot_count_variable_shp (str): The property in mndot_count_shape + associated with counts. Default: + + :: + "lookups/osm_highway_facility_type_crosswalk.csv" + legacy_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from Legacy TM2 network. Default: + :: + "lookups/legacy_tm2_attributes.csv" + osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId + from OSM. Default: + :: + "lookups/osm_lanes_attributes.csv" + tam_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from TAM TM2 network. 
Default: + :: + "lookups/tam_tm2_attributes.csv" + tom_tom_attributes (str): CSV file of link attributes by + shStReferenceId from TomTom network. Default: + :: + "lookups/tomtom_attributes.csv" + sfcta_attributes (str): CSV file of link attributes by + shStReferenceId from SFCTA network. Default: + :: + "lookups/sfcta_attributes.csv" + output_epsg (int): EPSG type of geographic projection for output + shapefiles. Default: + :: + 102646 + output_link_shp (str): Output shapefile for roadway links. Default: + :: + r"tests/scratch/links.shp" + output_node_shp (str): Output shapefile for roadway nodes. Default: + :: + r"tests/scratch/nodes.shp" + output_link_csv (str): Output csv for roadway links. Default: + :: + r"tests/scratch/links.csv" + output_node_csv (str): Output csv for roadway nodes. Default: + :: + r"tests/scratch/nodes.csv" + output_link_txt (str): Output fixed format txt for roadway links. Default: + :: + r"tests/scratch/links.txt" + output_node_txt (str): Output fixed format txt for roadway nodes. Default: + :: + r"tests/scratch/nodes.txt" + output_link_header_width_txt (str): Header for txt roadway links. Default: + :: + r"tests/scratch/links_header_width.txt" + output_node_header_width_txt (str): Header for txt for roadway Nodes. Default: + :: + r"tests/scratch/nodes_header_width.txt" + output_cube_network_script (str): Cube script for importing + fixed-format roadway network. Default: + :: + r"tests/scratch/make_complete_network_from_fixed_width_file.s + + zones (int): Number of travel analysis zones in the network. Default: + :: + 3061 + """ + +
[docs] def __init__(self, **kwargs): + """ + Time period and category splitting info + """ + if "time_periods_to_time" in kwargs: + self.time_periods_to_time = kwargs.get("time_periods_to_time") + else: + self.time_period_to_time = { + "EA": ("3:00", "6:00"), + "AM": ("6:00", "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + + #MTC + self.cube_time_periods = { + "1": "EA", + "2": "AM", + "3": "MD", + "4": "PM", + "5": "EV", + } + + """ + #MC + self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7} + + self.route_type_mode_dict = {0: 8, 2: 9} + + self.cube_time_periods = {"1": "AM", "2": "MD"} + self.cube_time_periods_name = {"AM": "pk", "MD": "op"} + """ + if "categories" in kwargs: + self.categories = kwargs.get("categories") + else: + self.categories = { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + + # prefix, source variable, categories + self.properties_to_split = { + "lanes": { + "v": "lanes", + "time_periods": self.time_period_to_time, + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_period_to_time, + }, + "useclass": { + "v": "useclass", + "time_periods": self.time_period_to_time, + }, + } + + """ + Details for calculating the county based on the centroid of the link. + The NAME varible should be the name of a field in shapefile. + """ + #MTC + if 'lasso_base_dir' in kwargs: + self.base_dir = get_base_dir(lasso_base_dir = kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + + if 'data_file_location' in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "mtc_data") + + #MC + if "lasso_base_dir" in kwargs: + self.base_dir = get_base_dir(lasso_base_dir=kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + """ + if "data_file_location" in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "metcouncil_data") + """ + + #-------- + if "settings_location" in kwargs: + self.settings_location = kwargs.get("settings_location") + else: + self.settings_location = os.path.join(self.base_dir, "examples", "settings") + + if "scratch_location" in kwargs: + self.scratch_location = kwargs.get("scratch_location") + else: + self.scratch_location = os.path.join(self.base_dir, "tests", "scratch") + + ### COUNTIES + + self.county_shape = os.path.join( + self.data_file_location, "county", "county.shp" + ) + self.county_variable_shp = "NAME" + + #MTC + self.county_code_dict = { + 'San Francisco':1, + 'San Mateo':2, + 'Santa Clara':3, + 'Alameda':4, + 'Contra Costa':5, + 'Solano':6, + 'Napa':7, + 'Sonoma':8, + 'Marin':9, + 'External':10, + } + + self.county_centroid_range_dict = { + 'San Francisco':range(1,100000), + 'San Mateo':range(100001,200000), + 'Santa Clara':range(200001,300000), + 'Alameda':range(300001,400000), + 'Contra Costa':range(400001,500000), + 'Solano':range(500001,600000), + 'Napa':range(600001,700000), + 'Sonoma':range(700001,800000), + 'Marin':range(800001,900000), + 'External':range(900001,1000000) + } + + self.county_node_range_dict = { + 'San Francisco':range(1000000,1500000), + 'San Mateo':range(1500000,2000000), + 'Santa Clara':range(2000000,2500000), + 'Alameda':range(2500000,3000000), + 'Contra Costa':range(3000000,3500000), + 
'Solano':range(3500000,4000000), + 'Napa':range(4000000,4500000), + 'Sonoma':range(4500000,5000000), + 'Marin':range(5000000,5500000), + } + + self.county_hov_node_range_dict = { + 'San Francisco':range(5500000,6000000), + 'San Mateo':range(6000000,6500000), + 'Santa Clara':range(6500000,7000000), + 'Alameda':range(7000000,7500000), + 'Contra Costa':range(7500000,8000000), + 'Solano':range(8000000,8500000), + 'Napa':range(8500000,9000000), + 'Sonoma':range(9000000,9500000), + 'Marin':range(9500000,10000000), + } + + self.county_link_range_dict = { + 'San Francisco':range(1,1000000), + 'San Mateo':range(1000000,2000000), + 'Santa Clara':range(2000000,3000000), + 'Alameda':range(3000000,4000000), + 'Contra Costa':range(4000000,5000000), + 'Solano':range(5000000,6000000), + 'Napa':range(6000000,7000000), + 'Sonoma':range(7000000,8000000), + 'Marin':range(8000000,9000000) + } + + #MC + """ + self.county_code_dict = { + "Anoka": 1, + "Carver": 2, + "Dakota": 3, + "Hennepin": 4, + "Ramsey": 5, + "Scott": 6, + "Washington": 7, + "external": 10, + "Chisago": 11, + "Goodhue": 12, + "Isanti": 13, + "Le Sueur": 14, + "McLeod": 15, + "Pierce": 16, + "Polk": 17, + "Rice": 18, + "Sherburne": 19, + "Sibley": 20, + "St. Croix": 21, + "Wright": 22, + } + """ + + self.mpo_counties = [ + 1, + 3, + 4, + 5, + 6, + 7, + 8, + 9 + ] + + self.taz_N_list = list(range(1, 10000)) + list(range(100001, 110000)) + list(range(200001, 210000)) + list(range(300001, 310000))\ + + list(range(400001, 410000)) + list(range(500001, 510000)) + list(range(600001, 610000)) + list(range(700001, 710000))\ + + list(range(800001, 810000)) + list(range(900001, 1000000)) + + self.maz_N_list = list(range(10001, 90000)) + list(range(110001, 190000)) + list(range(210001, 290000)) + list(range(310001, 390000))\ + + list(range(410001, 490000)) + list(range(510001, 590000)) + list(range(610001, 690000)) + list(range(710001, 790000))\ + + list(range(810001, 890000)) + + self.tap_N_list = list(range(90001, 99999)) + list(range(190001, 199999)) + list(range(290001, 299999)) + list(range(390001, 399999))\ + + list(range(490001, 499999)) + list(range(590001, 599999)) + list(range(690001, 699999)) + list(range(790001, 799999))\ + + list(range(890001, 899999)) + + self.tap_N_start = { + "San Francisco" : 90001, + "San Mateo" : 190001, + "Santa Clara" : 290001, + "Alameda" : 390001, + "Contra Costa" : 490001, + "Solano" : 590001, + "Napa" : 690001, + "Sonoma" : 790001, + "Marin" : 890001 + } + + #MTC + self.osm_facility_type_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_facility_type_crosswalk.csv" + ) + #MC + ### Lanes + self.lanes_lookup_file = os.path.join( + self.data_file_location, "lookups", "lanes.csv" + ) + + ### TAZS + + self.taz_shape = os.path.join( + self.data_file_location, "TAZ", "TAZOfficialWCurrentForecasts.shp" + ) + ###### + #MTC + self.osm_lanes_attributes = os.path.join( + self.data_file_location, "lookups", "osm_lanes_attributes.csv" + ) + + self.legacy_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "legacy_tm2_attributes.csv" + ) + + self.assignable_analysis = os.path.join( + self.data_file_location, "lookups", "assignable_analysis_links.csv" + ) + ### + ### AREA TYPE - MC + self.area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "ThriveMSP2040CommunityDesignation.shp", + ) + self.area_type_variable_shp = "COMDES2040" + # area type map from raw data to model category + + # source 
https://metrocouncil.org/Planning/Publications-And-Resources/Thrive-MSP-2040-Plan-(1)/7_ThriveMSP2040_LandUsePoliciesbyCD.aspx + # urban center + # urban + # suburban + # suburban edge + # emerging suburban edge + # rural center + # diversified rural + # rural residential + # agricultural + self.area_type_code_dict = { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + + self.downtown_area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "downtownzones_TAZ.shp", + ) + + self.downtown_area_type = int(5) + + self.centroid_connect_lanes = int(1) + + self.osm_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_asgngrp_crosswalk.csv" + ) + self.mrcc_roadway_class_shape = os.path.join( + self.data_file_location, "mrcc", "trans_mrcc_centerlines.shp" + ) + #### + ###MTC + self.tam_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "tam_tm2_attributes.csv" + ) + + self.sfcta_attributes = os.path.join( + self.data_file_location, "lookups", "sfcta_attributes.csv" + ) + + self.tomtom_attributes = os.path.join( + self.data_file_location, "lookups", "tomtom_attributes.csv" + ) + + self.pems_attributes = os.path.join( + self.data_file_location, "lookups", "pems_attributes.csv" + ) + + self.centroid_file = os.path.join( + self.data_file_location, "centroid", "centroid_node.pickle" + ) + #### + ###MC + self.widot_shst_data = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "widot.out.matched.geojson", + ) + #### + + self.centroid_connector_link_file = os.path.join( + self.data_file_location, "centroid", "cc_link.pickle" + ) + + self.centroid_connector_shape_file = os.path.join( + self.data_file_location, "centroid", "cc_shape.pickle" + ) + + self.tap_file = os.path.join( + self.data_file_location, "tap", "tap_node.pickle" + ) + + self.tap_connector_link_file = os.path.join( + self.data_file_location, "tap", "tap_link.pickle" + ) + + self.tap_connector_shape_file = os.path.join( + self.data_file_location, "tap", "tap_shape.pickle" + ) + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + ###MTC + self.log_to_net_crosswalk = os.path.join(self.settings_location, "log_to_net.csv") + + self.emme_name_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "emme_attribute_names.csv" + ) + #### + #MC + self.mndot_count_variable_shp = "AADT_mn" + + self.widot_county_shape = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "TRADAS_(counts).shp", + ) + ### + ###MTC + self.mode_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "gtfs_to_tm2_mode_crosswalk.csv" + ) + + self.veh_cap_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "transitSeatCap.csv" + ) + + self.faresystem_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "faresystem_crosswalk.txt" + ) + + # https://app.asana.com/0/12291104512575/1200287255197808/f + self.fare_2015_to_2010_deflator = 0.927 + #### + #MC + self.widot_count_variable_shp = "AADT_wi" + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + self.log_to_net_crosswalk = os.path.join( + self.settings_location, "log_to_net.csv" + ) + + self.subregion_boundary_file = os.path.join( + self.data_file_location, 'emme', 'subregion_boundary_for_active_modes.shp' + ) + + self.subregion_boundary_id_variable = 'subregion' + #### + + self.output_variables = [ + 
"model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + #MTC + 'name', + "distance", + #"roadway", + #"name", + #MC + #"shape_id", + #"distance", + #"roadway", + #"name", + #"roadway_class", + #### + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "county", + "model_node_id", + "N", + "osm_node_id", + "geometry", + "X", + "Y", + "segment_id", + "managed", + "bus_only", + "rail_only", + #MTC + "assignable", + "cntype", + "useclass_AM", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "useclass_EA", + "transit", + "tollbooth", + "tollseg", + "ft", + "tap_drive", + "tollbooth", + "tollseg", + "farezone", + "tap_id", + #### + #MC + "bike_facility", + "mrcc_id", + "ROUTE_SYS", # mrcc functional class + ] + + self.output_link_shp = os.path.join(self.scratch_location, "links.shp") + self.output_node_shp = os.path.join(self.scratch_location, "nodes.shp") + self.output_link_csv = os.path.join(self.scratch_location, "links.csv") + self.output_node_csv = os.path.join(self.scratch_location, "nodes.csv") + self.output_link_txt = os.path.join(self.scratch_location, "links.txt") + self.output_node_txt = os.path.join(self.scratch_location, "nodes.txt") + self.output_link_header_width_txt = os.path.join( + self.scratch_location, "links_header_width.txt" + ) + self.output_node_header_width_txt = os.path.join( + self.scratch_location, "nodes_header_width.txt" + ) + self.output_cube_network_script = os.path.join( + self.scratch_location, "make_complete_network_from_fixed_width_file.s" + ) + self.output_dir = os.path.join(self.scratch_location) + self.output_proj = CRS("epsg:2875") + self.output_proj4 = '+proj=lcc +lat_0=32.1666666666667 +lon_0=-116.25 +lat_1=33.8833333333333 +lat_2=32.7833333333333 +x_0=2000000.0001016 +y_0=500000.0001016 +ellps=GRS80 +towgs84=-0.991,1.9072,0.5129,-1.25033e-07,-4.6785e-08,-5.6529e-08,0 +units=us-ft +no_defs +type=crs' + self.prj_file = os.path.join(self.data_file_location, 'projection', '2875.prj') + self.wkt_projection = 'PROJCS["NAD83(HARN) / California zone 6 (ftUS)",GEOGCS["NAD83(HARN)",DATUM["NAD83_High_Accuracy_Reference_Network",SPHEROID["GRS 1980",6378137,298.257222101],TOWGS84[-0.991,1.9072,0.5129,-1.25033E-07,-4.6785E-08,-5.6529E-08,0]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4152"]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["latitude_of_origin",32.1666666666667],PARAMETER["central_meridian",-116.25],PARAMETER["standard_parallel_1",33.8833333333333],PARAMETER["standard_parallel_2",32.7833333333333],PARAMETER["false_easting",6561666.667],PARAMETER["false_northing",1640416.667],UNIT["US survey foot",0.304800609601219],AXIS["Easting",EAST],AXIS["Northing",NORTH],AUTHORITY["EPSG","2875"]]' + + self.fare_matrix_output_variables = ["faresystem", "origin_farezone", "destination_farezone", "price"] + + self.zones = 4756 + """ + Create all the possible headway variable combinations based on the cube time periods setting + """ + self.time_period_properties_list = [ + p + "[" + str(t) + "]" + for p in ["HEADWAY", "FREQ"] + for t in self.cube_time_periods.keys() + ] + + self.int_col = [ + "model_link_id", + "model_node_id", + "A", + "B", + #MTC + #"county", + ### + #MC + # "lanes", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "roadway_class", + "assign_group", + #"county", + "area_type", + "trn_priority", + "AADT", + "count_AM", + "count_MD", + "count_PM", + "count_NT", + "count_daily", + 
"centroidconnect", + "bike_facility", + #### + "drive_access", + "walk_access", + "bike_access", + "truck_access", + #MTC + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "ML_lanes_EA", + ### + #MC + "drive_node", + "walk_node", + "bike_node", + "transit_node", + # "ML_lanes", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_NT", + #### + "segment_id", + "managed", + "bus_only", + "rail_only", + "transit", + ##MTC + "ft", + "assignable", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EA", + "lanes_EV", + "useclass_AM", + "useclass_EA", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "tollseg", + "tollbooth", + "farezone", + "tap_id", + ] + + self.float_col = [ + "distance", + "price", + "X", + "Y" + "mrcc_id", + ] + + self.float_col = ["distance", "ttime_assert", "price", "X", "Y"] + + self.string_col = [ + "osm_node_id", + "name", + "roadway", + "shstGeometryId", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "ROUTE_SYS", + ] + + self.zones = 3061 + + self.__dict__.update(kwargs)
+
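A brief, illustrative construction of Parameters with two keyword overrides; the paths are hypothetical, and any keyword not handled explicitly is still attached to the instance by the final self.__dict__.update(kwargs):

from lasso.parameters import Parameters

parameters = Parameters(
    lasso_base_dir="/path/to/lasso",        # hypothetical; must have metcouncil_data nearby
    scratch_location="/tmp/lasso_scratch",  # hypothetical output location
)
print(parameters.output_link_csv)           # /tmp/lasso_scratch/links.csv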
\ No newline at end of file
diff --git a/branch/bart/_modules/lasso/project/index.html b/branch/bart/_modules/lasso/project/index.html
new file mode 100644
index 0000000..a44028b
--- /dev/null
+++ b/branch/bart/_modules/lasso/project/index.html
@@ -0,0 +1,1505 @@
+lasso.project — lasso documentation

Source code for lasso.project

+import json
+import os
+import re
+from typing import Any, Dict, Optional, Union, List
+from csv import reader
+
+from pandas.core import base
+
+import numpy as np
+import pandas as pd
+from pandas import DataFrame
+import geopandas as gpd
+
+from network_wrangler import ProjectCard
+from network_wrangler import RoadwayNetwork
+
+from .transit import CubeTransit, StandardTransit
+from .logger import WranglerLogger
+from .parameters import Parameters
+from .roadway import ModelRoadwayNetwork
+from .util import column_name_to_parts
+
+
+
[docs]class Project(object): + """A single or set of changes to the roadway or transit system. + + Compares a base and a build transit network or a base and build + highway network and produces project cards. + + .. highlight:: python + + Typical usage example: + :: + test_project = Project.create_project( + base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + test_project.evaluate_changes() + test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + + Attributes: + DEFAULT_PROJECT_NAME: a class-level constant that defines what + the project name will be if none is set. + STATIC_VALUES: a class-level constant which defines values that + are not evaluated when assessing changes. + card_data (dict): {"project": <project_name>, "changes": <list of change dicts>} + roadway_link_changes (DataFrame): pandas dataframe of CUBE roadway link changes. + roadway_node_changes (DataFrame): pandas dataframe of CUBE roadway node changes. + transit_changes (CubeTransit): + base_roadway_network (RoadwayNetwork): + base_cube_transit_network (CubeTransit): + build_cube_transit_network (CubeTransit): + project_name (str): name of the project, set to DEFAULT_PROJECT_NAME if not provided + parameters: an instance of the Parameters class which sets a bunch of parameters + """ + + DEFAULT_PROJECT_NAME = "USER TO define" + + STATIC_VALUES = [ + "model_link_id", + "area_type", + "county", + # "assign_group", + "centroidconnect", + ] + CALCULATED_VALUES = [ + "area_type", + "county", + "assign_group", + "centroidconnect", + ] + +
[docs] def __init__( + self, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[DataFrame] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = "", + evaluate: Optional[bool] = False, + parameters: Union[dict, Parameters] = {}, + ): + """ + ProjectCard constructor. + + args: + roadway_link_changes: dataframe of roadway changes read from a log file + roadway_node_changes: dataframe of roadway changes read from a log file + transit_changes: dataframe of transit changes read from a log file + base_roadway_network: RoadwayNetwork instance for base case + base_transit_network: StandardTransit instance for base case + base_cube_transit_network: CubeTransit instance for base transit network + build_cube_transit_network: CubeTransit instance for build transit network + project_name: name of the project + evaluate: defaults to false, but if true, will create card data + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + returns: instance of ProjectCard + """ + self.card_data = Dict[str, Dict[str, Any]] + + self.roadway_link_changes = roadway_link_changes + self.roadway_node_changes = roadway_node_changes + self.base_roadway_network = base_roadway_network + self.base_transit_network = base_transit_network + self.base_cube_transit_network = base_cube_transit_network + self.build_cube_transit_network = build_cube_transit_network + self.transit_changes = transit_changes + self.project_name = ( + project_name if project_name else Project.DEFAULT_PROJECT_NAME + ) + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + if base_roadway_network != None: + self.determine_roadway_network_changes_compatibility( + self.base_roadway_network, + self.roadway_link_changes, + self.roadway_node_changes, + self.parameters + ) + + if evaluate: + self.evaluate_changes()
+ +
[docs] def write_project_card(self, filename: str = None): + """ + Writes project cards. + + Args: + filename (str): File path to output .yml + + Returns: + None + """ + ProjectCard(self.card_data).write(out_filename=filename)
+ +
[docs] @staticmethod + def create_project( + roadway_log_file: Union[str, List[str], None] = None, + roadway_shp_file: Optional[str] = None, + roadway_csv_file: Optional[str] = None, + network_build_file: Optional[str] = None, + emme_node_id_crosswalk_file: Optional[str] = None, + emme_name_crosswalk_file: Optional[str] = None, + base_roadway_dir: Optional[str] = None, + base_transit_dir: Optional[str] = None, + base_cube_transit_source: Optional[str] = None, + build_cube_transit_source: Optional[str] = None, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[CubeTransit] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = None, + recalculate_calculated_variables: Optional[bool] = False, + recalculate_distance: Optional[bool] = False, + parameters: Optional[dict] = {}, + **kwargs, + ): + """ + Constructor for a Project instance. + + Args: + roadway_log_file: File path to consuming logfile or a list of logfile paths. + roadway_shp_file: File path to consuming shape file for roadway changes. + roadway_csv_file: File path to consuming csv file for roadway changes. + network_build_file: File path to consuming EMME network build for network changes. + base_roadway_dir: Folder path to base roadway network. + base_transit_dir: Folder path to base transit network. + base_cube_transit_source: Folder path to base transit network or cube line file string. + base_cube_transit_file: File path to base transit network. + build_cube_transit_source: Folder path to build transit network or cube line file string. + build_cube_transit_file: File path to build transit network. + roadway_link_changes: pandas dataframe of CUBE roadway link changes. + roadway_node_changes: pandas dataframe of CUBE roadway node changes. + transit_changes: build transit changes. + base_roadway_network: Base roadway network object. + base_cube_transit_network: Base cube transit network object. + build_cube_transit_network: Build cube transit network object. + project_name: If not provided, will default to the roadway_log_file filename if + provided (or the first filename if a list is provided) + recalculate_calculated_variables: if reading in a base network, if this is true it + will recalculate variables such as area type, etc. This only needs to be true + if you are creating project cards that are changing the calculated variables. + recalculate_distance: recalculate the distance variable. This only needs to be + true if you are creating project cards that change the distance. + parameters: dictionary of parameters + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in + the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables + in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. 
+ managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + + Returns: + A Project instance. + """ + + if base_cube_transit_source and base_cube_transit_network: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_cube_transit_network' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_cube_transit_source: + base_cube_transit_network = CubeTransit.create_from_cube(base_cube_transit_source, parameters) + WranglerLogger.debug( + "Base network has {} lines".format(len(base_cube_transit_network.lines)) + ) + if len(base_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Base network lines: {}".format( + "\n - ".join(base_cube_transit_network.lines) + ) + ) + elif base_cube_transit_network: + pass + else: + msg = "No base cube transit network." + WranglerLogger.info(msg) + base_cube_transit_network = None + + if build_cube_transit_source and transit_changes: + msg = "Method takes only one of 'build_cube_transit_source' and 'transit_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if build_cube_transit_source: + WranglerLogger.debug("build") + build_cube_transit_network = CubeTransit.create_from_cube(build_cube_transit_source, parameters) + WranglerLogger.debug( + "Build network has {} lines".format(len(build_cube_transit_network.lines)) + ) + if len(build_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Build network lines: {}".format( + "\n - ".join(build_cube_transit_network.lines) + ) + ) + elif transit_changes: + pass + else: + msg = "No cube transit changes given or processed." 
+ WranglerLogger.info(msg) + transit_changes = None + + if roadway_log_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_log_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_csv_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_csv_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_shp_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_shp_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and not project_name: + if type(roadway_log_file) == list: + project_name = os.path.splitext(os.path.basename(roadway_log_file[0]))[ + 0 + ] + WranglerLogger.info( + "No Project Name - Using name of first log file in list" + ) + else: + project_name = os.path.splitext(os.path.basename(roadway_log_file))[0] + WranglerLogger.info("No Project Name - Using name of log file") + if network_build_file and not project_name: + if type(network_build_file) == list: + with open(network_build_file[0]) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info( + "No Project Name - Using metadata of first network build file in list" + ) + else: + with open(network_build_file) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info("No Project Name - Using metadata of network build file") + if roadway_log_file: + roadway_link_changes, roadway_node_changes = Project.read_logfile(roadway_log_file) + elif roadway_shp_file: + roadway_changes = gpd.read_file(roadway_shp_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_link_changes = DataFrame(roadway_link_changes.drop("geometry", axis=1)) + roadway_node_changes = DataFrame(roadway_node_changes.drop("geometry", axis=1)) + roadway_node_changes["model_node_id"] = 0 + elif roadway_csv_file: + roadway_changes = pd.read_csv(roadway_csv_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_node_changes["model_node_id"] = 0 + elif network_build_file: + roadway_link_changes, roadway_node_changes, transit_changes = Project.read_network_build_file(network_build_file) + if emme_node_id_crosswalk_file: + # get wrangler IDs from emme element_id + roadway_link_changes, roadway_node_changes, transit_changes = Project.emme_id_to_wrangler_id( + 
roadway_link_changes, + roadway_node_changes, + transit_changes, + emme_node_id_crosswalk_file + ) + else: + msg = "User needs to specify emme node id crosswalk file using emme_node_id_crosswalk_file = " + WranglerLogger.error(msg) + raise ValueError(msg) + # rename emme attributes to wrangler attributes + if emme_name_crosswalk_file is None: + emme_name_crosswalk_file = parameters.emme_name_crosswalk_file + roadway_link_changes, roadway_node_changes = Project.emme_name_to_wrangler_name( + roadway_link_changes, + roadway_node_changes, + emme_name_crosswalk_file + ) + elif roadway_link_changes: + pass + elif roadway_node_changes: + pass + else: + msg = "No roadway changes given or processed." + WranglerLogger.info(msg) + roadway_link_changes = pd.DataFrame({}) + roadway_node_changes = pd.DataFrame({}) + + if base_roadway_network and base_roadway_dir: + msg = "Method takes only one of 'base_roadway_network' and 'base_roadway_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_roadway_dir: + base_roadway_network = ModelRoadwayNetwork.read( + os.path.join(base_roadway_dir, "link.json"), + os.path.join(base_roadway_dir, "node.geojson"), + os.path.join(base_roadway_dir, "shape.geojson"), + fast=True, + recalculate_calculated_variables=recalculate_calculated_variables, + recalculate_distance=recalculate_distance, + parameters=parameters, + **kwargs, + ) + base_roadway_network.split_properties_by_time_period_and_category() + elif base_roadway_network: + base_roadway_network.split_properties_by_time_period_and_category() + else: + msg = "No base roadway network." + WranglerLogger.info(msg) + base_roadway_network = None + + if base_cube_transit_source and base_transit_dir: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_transit_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_transit_dir: + base_transit_network = StandardTransit.read_gtfs( + gtfs_feed_dir=base_transit_dir, + parameters=parameters + ) + else: + msg = "No base transit network." + WranglerLogger.info(msg) + base_transit_network = None + + project = Project( + roadway_link_changes=roadway_link_changes, + roadway_node_changes=roadway_node_changes, + transit_changes=transit_changes, + base_roadway_network=base_roadway_network, + base_transit_network=base_transit_network, + base_cube_transit_network=base_cube_transit_network, + build_cube_transit_network=build_cube_transit_network, + evaluate=True, + project_name=project_name, + parameters=parameters, + ) + + return project
+ +
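A sketch of the typical create_project flow for a Cube log file, assuming hypothetical input locations; it simply mirrors the constructor arguments documented above:

import os
from lasso.project import Project

project = Project.create_project(
    roadway_log_file=os.path.join("examples", "cube", "roadway_change.log"),  # hypothetical
    base_roadway_dir=os.path.join("examples", "base_network"),                # hypothetical
    parameters={"lasso_base_dir": "/path/to/lasso"},                          # hypothetical
)
project.write_project_card(os.path.join("scratch", "roadway_change.yml"))     # hypothetical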
[docs] @staticmethod + def read_logfile(logfilename: Union[str, List[str]]): + """ + Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + logfilename (str or list[str]): File path to CUBE logfile or list of logfile paths. + + Returns: + A DataFrame reprsentation of the log file. + """ + if type(logfilename) == str: + logfilename = [logfilename] + + link_df = pd.DataFrame() + node_df = pd.DataFrame() + + for file in logfilename: + WranglerLogger.info("Reading logfile: {}".format(file)) + with open(file) as f: + _content = f.readlines() + + _node_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("N") + ] + WranglerLogger.debug("node lines: {}".format(_node_lines)) + _link_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("L") + ] + WranglerLogger.debug("link lines: {}".format(_link_lines)) + + _nodecol = ["OBJECT", "OPERATION", "GROUP"] + _node_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Node Cols: {}".format(_nodecol)) + _linkcol = ["OBJECT", "OPERATION", "GROUP"] + _link_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Link Cols: {}".format(_linkcol)) + + def split_log(x): + return list(reader([x], delimiter=',', quotechar='"'))[0] + + _node_df = pd.DataFrame([split_log(x) for x in _node_lines[1:]],columns = _nodecol) + WranglerLogger.debug("Node DF: {}".format(_node_df)) + _link_df = pd.DataFrame([split_log(x) for x in _link_lines[1:]],columns = _linkcol) + WranglerLogger.debug("Link DF: {}".format(_link_df)) + + node_df = pd.concat([node_df, _node_df]) + link_df = pd.concat([link_df, _link_df]) + + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + link_df.columns = [c.split("[")[0] for c in link_df.columns] + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + node_df.columns = [c.split("[")[0] for c in node_df.columns] + + if len(link_df) > 0: + # create operation history + action_history_df = ( + link_df.groupby(['A', 'B'])["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + link_df = pd.merge(link_df, action_history_df, on=['A', 'B'], how="left") + + if len(node_df) > 0: + action_history_df = ( + node_df.groupby('N')["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + node_df = pd.merge(node_df, action_history_df, on='N', how="left") + + WranglerLogger.info( + "Processed {} Node lines and {} Link lines".format( + node_df.shape[0], link_df.shape[0] + ) + ) + + return link_df, node_df
+ +
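read_logfile can also be called on its own; a small example (hypothetical file name) showing the two DataFrames it returns:

from lasso.project import Project

link_changes_df, node_changes_df = Project.read_logfile("roadway_changes.log")  # hypothetical
# "operation_final" summarizes each link's history as A (add), C (change),
# D (delete) or N (added then deleted).
print(link_changes_df["operation_final"].value_counts())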
[docs] @staticmethod + def read_network_build_file(networkbuildfilename: Union[str, List[str]]): + """ + Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + networkbuildfilename (str or list[str]): File path to emme nework build file or list of network build file paths. + + Returns: + A DataFrame representation of the network build file + """ + if type(networkbuildfilename) == str: + networkbuildfilename = [networkbuildfilename] + + _link_command_history_df = DataFrame() + _node_command_history_df = DataFrame() + _transit_command_history_df = DataFrame() + + for file in networkbuildfilename: + WranglerLogger.info("Reading network build file: {}".format(file)) + with open(file) as f: + _content = json.load(f) + + _command_history = _content.get('command_history') + + # loop through all the commands + for command in _command_history: + if command.get('command') == 'set_attribute': + element_id = command.get('parameters').get('element_ids') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + _command_df[command.get('parameters').get('attribute_name')] = command.get('parameters').get('value') + + if command.get('command') in ['create_link', 'create_node']: + if command.get('command') == 'create_link': + element_id = command.get('results').get('changes').get('added').get('LINK') + if command.get('command') == 'create_node': + element_id = command.get('results').get('changes').get('added').get('NODE') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + for attribute_name, attribute_value in command.get('parameters').get('attributes').items(): + _command_df[attribute_name] = attribute_value + + if command.get('command') == 'delete_link': + element_id = command.get('results').get('changes').get('removed').get('LINK') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + if command.get('command') == 'modify_transit_line': + element_id = command.get('parameters').get('line_id') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : pd.Series(element_id), + 'object' : pd.Series(object), + 'operation' : pd.Series(operation) + } + ) + + _command_df['new_itinerary'] = [command.get('parameters').get('new_itinerary')] + + if ('L' in _command_df['object'].unique()): + _link_command_history_df = _link_command_history_df.append( + _command_df[_command_df['object'] == 'L'], + sort = False, + ignore_index = True + ) + + if ('N' in _command_df['object'].unique()): + _node_command_history_df = _node_command_history_df.append( + _command_df[_command_df['object'] == 'N'], + sort = False, + ignore_index = True + ) + + if ( + ('TRANSIT_LINE' in _command_df['object'].unique()) | + ('TRANSIT_STOP' in _command_df['object'].unique()) | + ('TRANSIT_SHAPE' in 
_command_df['object'].unique()) + ): + _transit_command_history_df = _transit_command_history_df.append( + _command_df[_command_df['object'].isin(['TRANSIT_LINE', 'TRANSIT_STOP', 'TRANSIT_SHAPE'])], + sort = False, + ignore_index = True + ) + + if len(_link_command_history_df) > 0: + # create operation history + link_action_history_df = ( + _link_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + link_action_history_df["operation_final"] = link_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + # get the last none null value for each element + # consolidate elements to single record + def get_last_valid(series): + if len(series.dropna()) > 0: + return series.dropna().iloc[-1] + else: + return np.nan + + #_command_history_df = _command_history_df.groupby(['element_id']).apply(get_last_valid).reset_index() + _link_command_history_df = _link_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _link_command_history_df = pd.merge(_link_command_history_df, link_action_history_df, on='element_id', how="left") + + if len(_node_command_history_df) > 0: + # create node operation history + node_action_history_df = ( + _node_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + node_action_history_df["operation_final"] = node_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + _node_command_history_df = _node_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _node_command_history_df = pd.merge(_node_command_history_df, node_action_history_df, on='element_id', how="left") + + WranglerLogger.info( + "Processed {} link element commands, {} node element commands".format( + _link_command_history_df.shape[0], + _node_command_history_df.shape[0] + ) + ) + + return _link_command_history_df, _node_command_history_df, _transit_command_history_df
+ +
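A minimal sketch, inferred from the parser above rather than an official schema, of the network build JSON that read_network_build_file consumes; the ids and attribute name are made up:

example_network_build = {
    "metadata": {"project_title": "Example", "date": "2022-01-01", "comments": "demo"},
    "command_history": [
        {
            "command": "set_attribute",
            "parameters": {
                "element_type": "LINK",
                "element_ids": ["1001-1002"],  # hypothetical emme link id "A-B"
                "attribute_name": "@lanes",    # hypothetical attribute
                "value": 2,
            },
        },
    ],
}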
[docs] @staticmethod + def emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file): + """ + rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder + """ + WranglerLogger.info('Reading emme node id crosswalk file from {}'.format(emme_node_id_crosswalk_file)) + emme_node_id_crosswalk_df = pd.read_csv(emme_node_id_crosswalk_file) + emme_node_id_dict = dict(zip(emme_node_id_crosswalk_df['emme_node_id'], emme_node_id_crosswalk_df['model_node_id'])) + + # get node changes + if len(emme_node_change_df) > 0: + emme_node_change_df['emme_id'] = emme_node_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + + # get new emme nodes + new_emme_node_id_list = [ + n for n in emme_node_change_df['emme_id'] if n not in emme_node_id_crosswalk_df['emme_node_id'] + ] + WranglerLogger.info('New emme node id list {}'.format(new_emme_node_id_list)) + new_wrangler_node = emme_node_id_crosswalk_df['model_node_id'].max() + + # add crosswalk for new emme nodes + for new_emme_node in new_emme_node_id_list: + new_wrangler_node = new_wrangler_node + 1 + emme_node_id_dict.update({new_emme_node : new_wrangler_node}) + + # for nodes update model_node_id + emme_node_change_df['model_node_id'] = emme_node_change_df['emme_id'].map(emme_node_id_dict).fillna(0) + + if len(emme_link_change_df) > 0: + emme_link_change_df['A'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + emme_link_change_df['B'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[-1])) + # for links update A,B nodes + emme_link_change_df['A'] = emme_link_change_df['A'].map(emme_node_id_dict) + emme_link_change_df['B'] = emme_link_change_df['B'].map(emme_node_id_dict) + + if len(emme_transit_changes_df) > 0: + emme_transit_changes_df['i_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-3] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + emme_transit_changes_df['j_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-2] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + # update i,j nodes + emme_transit_changes_df['i_node'] = emme_transit_changes_df[ + 'i_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + emme_transit_changes_df['j_node'] = emme_transit_changes_df[ + 'j_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + + # update routing nodes + emme_transit_changes_df['new_itinerary'] = emme_transit_changes_df.apply( + lambda x: [emme_node_id_dict.get(n) for n in x['new_itinerary']] if x['object'] == 'TRANSIT_SHAPE' else 0, + axis = 1 + ) + + return emme_link_change_df, emme_node_change_df, emme_transit_changes_df
+ +
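An illustrative emme-to-wrangler node id crosswalk; the column names are the ones emme_id_to_wrangler_id reads, while the id values are hypothetical:

import io
import pandas as pd

node_crosswalk_csv = io.StringIO(
    "emme_node_id,model_node_id\n"
    "1001,3000001\n"
    "1002,3000002\n"
)
print(pd.read_csv(node_crosswalk_csv))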
[docs] def get_object_from_network_build_command(row): + """ + determine the network build object is node or link + + Args: + row: network build command history dataframe + + Returns: + 'N' for node, 'L' for link + """ + + if row.get('command') == 'create_link': + return 'L' + + if row.get('command') == 'create_node': + return 'N' + + if row.get('command') == 'delete_link': + return 'L' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'L' + if row.get('parameters').get('element_type') == 'NODE': + return 'N' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'TRANSIT_LINE' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'TRANSIT_STOP' + + if row.get('command') == 'modify_transit_line': + return 'TRANSIT_SHAPE'
+ +
[docs] def get_operation_from_network_build_command(row): + """ + determine the network build object action type + + Args: + row: network build command history dataframe + + Returns: + 'A', 'C', 'D' + """ + + if row.get('command') == 'create_link': + return 'A' + + if row.get('command') == 'create_node': + return 'A' + + if row.get('command') == 'delete_link': + return 'D' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'C' + if row.get('parameters').get('element_type') == 'NODE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'C' + + if row.get('command') == 'modify_transit_line': + return 'C'
+ +
[docs] @staticmethod + def emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file): + """ + rename emme names to wrangler names using crosswalk file + """ + + WranglerLogger.info('Reading emme attribute name crosswalk file {}'.format(emme_name_crosswalk_file)) + emme_name_crosswalk_df = pd.read_csv(emme_name_crosswalk_file) + emme_name_crosswalk_dict = dict(zip(emme_name_crosswalk_df['emme_name'], emme_name_crosswalk_df['wrangler_name'])) + + # drop columns we don't need from emme to avoid confusion + ignore_columns = [ + c for c in emme_link_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'A', 'B'] + ] + WranglerLogger.info('Ignoring link changes in {}'.format(ignore_columns)) + emme_link_change_df = emme_link_change_df.drop(ignore_columns, axis = 1) + + ignore_columns = [ + c for c in emme_node_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'model_node_id'] + ] + WranglerLogger.info('Ignoring node changes in {}'.format(ignore_columns)) + emme_node_change_df = emme_node_change_df.drop(ignore_columns, axis = 1) + + # rename emme name to wrangler name + emme_link_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + emme_node_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + + return emme_link_change_df, emme_node_change_df
+ +
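A small sketch of the renaming step, using a hypothetical two-entry crosswalk dictionary in place of the full CSV:

import pandas as pd

name_crosswalk = {"@lanes": "lanes", "@useclass": "useclass"}  # hypothetical entries
link_changes = pd.DataFrame(
    {"A": [1], "B": [2], "@lanes": [3], "operation_final": ["C"]}
)
link_changes = link_changes.rename(columns=name_crosswalk)
print(link_changes.columns.tolist())  # ['A', 'B', 'lanes', 'operation_final']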
[docs] @staticmethod + def determine_roadway_network_changes_compatibility( + base_roadway_network: ModelRoadwayNetwork, + roadway_link_changes: DataFrame, + roadway_node_changes: DataFrame, + parameters: Parameters, + ): + """ + Checks to see that any links or nodes that change exist in base roadway network. + """ + WranglerLogger.info( + "Evaluating compatibility between roadway network changes and base network. Not evaluating deletions." + ) + + # CUBE log file saves all variable names in upper cases, need to convert them to be same as network + log_to_net_df = pd.read_csv(parameters.log_to_net_crosswalk) + log_to_net_dict = dict(zip(log_to_net_df["log"], log_to_net_df["net"])) + + dbf_to_net_df = pd.read_csv(parameters.net_to_dbf_crosswalk) + dbf_to_net_dict = dict(zip(dbf_to_net_df["dbf"], dbf_to_net_df["net"])) + + for c in roadway_link_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_link_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_link_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_link_changes.rename(columns=dbf_to_net_dict, inplace=True) + + for c in roadway_node_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_node_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_node_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_node_changes.rename(columns=dbf_to_net_dict, inplace=True) + + # for links "L" that change "C", + # find locations where there isn't a base roadway link + if len(roadway_link_changes) > 0: + link_changes_df = roadway_link_changes[ + roadway_link_changes["operation_final"] == "C" + ].copy() + + link_merge_df = pd.merge( + link_changes_df[["A", "B"]].astype(str), + base_roadway_network.links_df[["A", "B", "model_link_id"]].astype(str), + how="left", + on=["A", "B"], + ) + + missing_links = link_merge_df.loc[link_merge_df["model_link_id"].isna()] + + if missing_links.shape[0]: + msg = "Network missing the following AB links:\n{}".format(missing_links) + WranglerLogger.error(msg) + raise ValueError(msg) + + # for links "N" that change "C", + # find locations where there isn't a base roadway node + if len(roadway_node_changes) > 0: + node_changes_df = roadway_node_changes[ + roadway_node_changes["operation_final"] == "C" + ].copy() + + node_merge_df = pd.merge( + node_changes_df[["model_node_id"]], + base_roadway_network.nodes_df[["model_node_id", "geometry"]], + how="left", + on=["model_node_id"], + ) + missing_nodes = node_merge_df.loc[node_merge_df["geometry"].isna()] + if missing_nodes.shape[0]: + msg = "Network missing the following nodes:\n{}".format(missing_nodes) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] def evaluate_changes(self): + """ + Determines which changes should be evaluated, initiates + self.card_data to be an aggregation of transit and highway changes. + """ + highway_change_list = [] + transit_change_list = [] + + WranglerLogger.info("Evaluating project changes.") + + if (not self.roadway_link_changes.empty) | (not self.roadway_node_changes.empty): + highway_change_list = self.add_highway_changes() + + if (not self.transit_changes.empty) or ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + self.card_data = { + "project": self.project_name, + "changes": transit_change_list + highway_change_list, + }
+ +
[docs] def add_transit_changes(self): + """ + Evaluates changes between base and build transit objects and + adds entries into the self.card_data dictionary. + """ + if self.build_cube_transit_network: + transit_change_list = self.build_cube_transit_network.evaluate_differences( + self.base_cube_transit_network + ) + elif self.base_transit_network: + transit_change_list = self.base_transit_network.evaluate_differences( + self.transit_changes + ) + return transit_change_list
+ + @staticmethod + def _final_op(x): + if x["operation_history"][-1] == "D": + if "A" in x["operation_history"][:-1]: + return "N" + else: + return "D" + elif x["operation_history"][-1] == "A": + if "D" in x["operation_history"][:-1]: + return "C" + else: + return "A" + else: + if "A" in x["operation_history"][:-1]: + return "A" + else: + return "C" + +
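Worked examples of how _final_op collapses an operation history; the plain dicts stand in for log-file rows:

from lasso.project import Project

# added then deleted within the same log: net no-op
print(Project._final_op({"operation_history": ["A", "C", "D"]}))  # "N"
# deleted then re-added: treated as a change to the existing link
print(Project._final_op({"operation_history": ["D", "A"]}))       # "C"
# brand new link that was later edited: still an addition
print(Project._final_op({"operation_history": ["A", "C"]}))       # "A"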
[docs] def add_highway_changes(self, limit_variables_to_existing_network=False): + """ + Evaluates changes from the log file based on the base highway object and + adds entries into the self.card_data dictionary. + + Args: + limit_variables_to_existing_network (bool): True if no ad-hoc variables. Default to False. + """ + + for c in self.parameters.string_col: + if c in self.roadway_link_changes.columns: + self.roadway_link_changes[c] = self.roadway_link_changes[c].str.lstrip(" ") + if c in self.roadway_node_changes.columns: + self.roadway_node_changes[c] = self.roadway_node_changes[c].str.lstrip(" ") + + ## if worth it, could also add some functionality to network wrangler itself. + node_changes_df = self.roadway_node_changes.copy() + + link_changes_df = self.roadway_link_changes.copy() + + def _process_deletions(link_changes_df): + """ + create deletion section in project card + """ + WranglerLogger.debug("Processing link deletions") + + cube_delete_df = link_changes_df[link_changes_df["operation_final"] == "D"].copy() + + # make sure columns has the same type as base network + cube_delete_df['A'] = cube_delete_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_delete_df['B'] = cube_delete_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_delete_df.columns: + cube_delete_df.drop(['model_link_id'], axis = 1, inplace = True) + + cube_delete_df = pd.merge( + cube_delete_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if len(cube_delete_df) > 0: + links_to_delete = cube_delete_df["model_link_id"].tolist() + delete_link_dict = { + "category": "Roadway Deletion", + "links": {"model_link_id": links_to_delete}, + } + WranglerLogger.debug("{} Links Deleted.".format(len(links_to_delete))) + else: + delete_link_dict = None + WranglerLogger.debug("No link deletions processed") + + return delete_link_dict + + def _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ): + """""" + WranglerLogger.debug("Processing link additions") + cube_add_df = link_changes_df[link_changes_df["operation_final"] == "A"] + + if len(cube_add_df) == 0: + WranglerLogger.debug("No link additions processed") + return {} + + if limit_variables_to_existing_network: + add_col = [ + c + for c in cube_add_df.columns + if c in self.base_roadway_network.links_df.columns + ] + else: + add_col = [ + c for c in cube_add_df.columns if c not in ["operation_final"] + ] + # can leave out "operation_final" from writing out, is there a reason to write it out? 
+ + for x in add_col: + cube_add_df[x] = cube_add_df[x].astype(self.base_roadway_network.links_df[x].dtype) + + add_link_properties = cube_add_df[add_col].to_dict("records") + + # WranglerLogger.debug("Add Link Properties: {}".format(add_link_properties)) + WranglerLogger.debug("{} Links Added".format(len(add_link_properties))) + + return {"category": "Add New Roadway", "links": add_link_properties} + + def _process_node_additions(node_add_df): + """""" + WranglerLogger.debug("Processing node additions") + + if len(node_add_df) == 0: + WranglerLogger.debug("No node additions processed") + return [] + + node_add_df = node_add_df.drop(["operation_final"], axis=1) + + for x in node_add_df.columns: + node_add_df[x] = node_add_df[x].astype(self.base_roadway_network.nodes_df[x].dtype) + + add_nodes_dict_list = node_add_df.to_dict( + "records" + ) + WranglerLogger.debug("{} Nodes Added".format(len(add_nodes_dict_list))) + + return add_nodes_dict_list + + def _process_single_link_change(change_row, changeable_col): + """""" + + # 1. Find associated base year network values + base_df = self.base_roadway_network.links_df[ + (self.base_roadway_network.links_df["A"] == int(change_row.A)) + & (self.base_roadway_network.links_df["B"] == int(change_row.B)) + ] + + if not base_df.shape[0]: + msg = "No match found in network for AB combination: ({},{}). Incompatible base network.".format( + change_row.A, change_row.B + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + elif base_df.shape[0] > 1: + WranglerLogger.warning( + "Found more than one match in base network for AB combination: ({},{}). Selecting first one to operate on but AB should be unique to network.".format( + change_row.A, change_row.B + ) + ) + + base_row = base_df.iloc[0] + # WranglerLogger.debug("Properties with changes: {}".format(changeable_col)) + + # 2. find columns that changed (enough) + changed_col = [] + for col in changeable_col: + WranglerLogger.debug("Assessing Column: {}".format(col)) + # if it is the same as before, or a static value, don't process as a change + if str(change_row[col]).strip('"\'') == str(base_row[col]).strip('"\''): + continue + # if it is NaN or None, don't process as a change + if (change_row[col] != change_row[col]) | (change_row[col] is None): + continue + if (col == "roadway_class") & (change_row[col] == 0): + continue + # only look at distance if it has significantly changed + if col == "distance": + if ( + abs( + (change_row[col] - float(base_row[col])) + / base_row[col].astype(float) + ) + > 0.01 + ): + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + else: + continue + else: + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + + WranglerLogger.debug( + "Properties with changes that will be processed: {}".format(changed_col) + ) + + if not changed_col: + return pd.DataFrame() + + # 3. Iterate through columns with changed values and structure the changes as expected in project card + property_dict_list = [] + processed_properties = [] + + # check if it's a manged lane change + for c in changed_col: + if c.startswith("ML_"): + # TODO ML project card skeleton + msg = "Detected managed lane changes, please create managed lane project card!" 
+ WranglerLogger.error(msg) + raise ValueError(msg) + return + + # regular roadway property change + for c in changed_col: + # WranglerLogger.debug("Processing Column: {}".format(c)) + ( + p_base_name, + p_time_period, + p_category, + managed_lane, + ) = column_name_to_parts(c, self.parameters) + + _d = { + "existing": base_row[c], + "set": change_row[c], + } + if c in Project.CALCULATED_VALUES: + _d = { + "set": change_row[c], + } + if p_time_period: + if managed_lane == 1: + _d["time"] = list( + self.parameters.time_period_to_time[p_time_period] + ) + if p_category: + _d["category"] = p_category + + # iterate through existing properties that have been changed and see if you should just add + if (p_base_name in processed_properties) & (managed_lane == 1): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + processed_p["timeofday"] += [_d] + elif (p_base_name in processed_properties) & (managed_lane == 0): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + if processed_p["set"] != change_row[c]: + msg = "Detected different changes for split-property variables on regular roadway links: " + msg += "conflicting \"{}\" values \"{}\", \"{}\"".format(p_base_name, processed_p["set"], change_row[c]) + WranglerLogger.error(msg) + raise ValueError(msg) + elif p_time_period: + if managed_lane == 1: + property_dict = {"property": p_base_name, "timeofday": [_d]} + processed_properties.append(p_base_name) + property_dict_list.append(property_dict) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + + card_df = pd.DataFrame( + { + "properties": pd.Series([property_dict_list]), + "model_link_id": pd.Series(base_row["model_link_id"]), + } + ) + + # WranglerLogger.debug('single change card_df:\n {}'.format(card_df)) + + return card_df + + def _process_link_changes(link_changes_df, changeable_col): + """""" + cube_change_df = link_changes_df[link_changes_df["operation_final"] == "C"].copy() + + # make sure columns has the same type as base network + cube_change_df['A'] = cube_change_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_change_df['B'] = cube_change_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_change_df.columns: + cube_change_df.drop('model_link_id', axis = 1, inplace = True) + + cube_change_df = pd.merge( + cube_change_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if not cube_change_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + change_link_dict_df = pd.DataFrame(columns=["properties", "model_link_id"]) + + for index, row in cube_change_df.iterrows(): + card_df = _process_single_link_change(row, changeable_col) + + change_link_dict_df = pd.concat( + [change_link_dict_df, card_df], ignore_index=True, sort=False + ) + + if not change_link_dict_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + # WranglerLogger.debug('change_link_dict_df Unaggregated:\n {}'.format(change_link_dict_df)) + + # Have to change to string so that it is a hashable type for the aggregation + change_link_dict_df["properties"] = change_link_dict_df[ + "properties" + ].astype(str) + # Group the changes that are the same + change_link_dict_df = ( + 
change_link_dict_df.groupby("properties")[["model_link_id"]] + .agg(lambda x: list(x)) + .reset_index() + ) + # WranglerLogger.debug('change_link_dict_df Aggregated:\n {}'.format(change_link_dict_df)) + + # Reformat model link id to correct "facility" format + change_link_dict_df["facility"] = change_link_dict_df.apply( + lambda x: {"link": [{"model_link_id": x.model_link_id}]}, axis=1 + ) + + # WranglerLogger.debug('change_link_dict_df 3: {}'.format(change_link_dict_df)) + change_link_dict_df["properties"] = change_link_dict_df["properties"].apply( + lambda x: json.loads( + x.replace("'\"", "'").replace("\"'", "'").replace("'", '"') + ) + ) + + change_link_dict_df["category"] = "Roadway Property Change" + + change_link_dict_list = change_link_dict_df[ + ["category", "facility", "properties"] + ].to_dict("record") + + WranglerLogger.debug( + "{} Changes Processed".format(len(change_link_dict_list)) + ) + return change_link_dict_list + + def _consolidate_actions(log, base, key_list): + log_df = log.copy() + # will be changed if to allow new variables being added/changed that are not in base network + changeable_col = [x for x in log_df.columns if x in base.columns] + #print(log_df) + #for x in changeable_col: + # print(x) + #log_df[x] = log_df[x].astype(base[x].dtype) + + if 'operation_final' not in log_df.columns: + action_history_df = ( + log_df.groupby(key_list)["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + + log_df = pd.merge(log_df, action_history_df, on=key_list, how="left") + log_df.drop_duplicates(subset=key_list, keep="last", inplace=True) + log_df["operation_final"] = log_df.apply(lambda x: Project._final_op(x), axis=1) + + return log_df[changeable_col + ["operation_final"]] + + delete_link_dict = None + add_link_dict = None + change_link_dict_list = [] + + delete_link_dict = None + add_link_dict = None + change_link_dict_list = [] + + if len(link_changes_df) != 0: + link_changes_df = _consolidate_actions( + link_changes_df, self.base_roadway_network.links_df, ["A", "B"] + ) + + # process deletions + delete_link_dict = _process_deletions(link_changes_df) + + # process additions + add_link_dict = _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ) + + # process changes + WranglerLogger.debug("Processing changes") + WranglerLogger.debug(link_changes_df) + changeable_col = list( + ( + set(link_changes_df.columns) + & set(self.base_roadway_network.links_df.columns) + ) + - set(Project.STATIC_VALUES) + ) + + cols_in_changes_not_in_net = list( + set(link_changes_df.columns) + - set(self.base_roadway_network.links_df.columns) + ) + + if cols_in_changes_not_in_net: + WranglerLogger.warning( + "The following attributes are specified in the changes but do not exist in the base network: {}".format( + cols_in_changes_not_in_net + ) + ) + + change_link_dict_list = _process_link_changes(link_changes_df, changeable_col) + + if len(node_changes_df) != 0: + node_changes_df = _consolidate_actions( + node_changes_df, self.base_roadway_network.nodes_df, ["model_node_id"] + ) + + # print error message for node change and node deletion + if ( + len(node_changes_df[node_changes_df["operation_final"].isin(["C", "D"])]) + > 0 + ): + msg = "NODE changes and deletions are not allowed!" 
+                WranglerLogger.warning(msg)
+                #raise ValueError(msg)
+            node_add_df = node_changes_df[node_changes_df["operation_final"] == "A"]
+
+            if add_link_dict:
+                add_link_dict["nodes"] = _process_node_additions(node_add_df)
+            else:
+                add_link_dict = {"category": "Add New Roadway", "nodes": _process_node_additions(node_add_df)}
+
+        # combine the deletion, addition, and change dictionaries together
+
+        highway_change_list = list(
+            filter(None, [delete_link_dict] + [add_link_dict] + change_link_dict_list)
+        )
+
+        return highway_change_list
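# Illustrative placeholders showing the three change-dictionary shapes that
# add_highway_changes() can emit; all ids and property values below are made up.
example_highway_change_list = [
    {  # produced by _process_deletions
        "category": "Roadway Deletion",
        "links": {"model_link_id": [12345, 12346]},
    },
    {  # produced by _process_link_additions / _process_node_additions
        "category": "Add New Roadway",
        "links": [{"A": 101, "B": 102, "lanes": 2}],
        "nodes": [{"model_node_id": 101, "X": -93.1, "Y": 44.9}],
    },
    {  # produced by _process_link_changes
        "category": "Roadway Property Change",
        "facility": {"link": [{"model_link_id": [12347]}]},
        "properties": [{"property": "lanes", "existing": 2, "set": 3}],
    },
]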
+
+ +
+
+
+ +
+ +
+


+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_modules/lasso/roadway/index.html b/branch/bart/_modules/lasso/roadway/index.html new file mode 100644 index 0000000..ebfca34 --- /dev/null +++ b/branch/bart/_modules/lasso/roadway/index.html @@ -0,0 +1,2060 @@ + + + + + + lasso.roadway — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.roadway

+import copy
+import glob
+import os
+from typing import Optional, Union
+
+import geopandas as gpd
+import pandas as pd
+
+from geopandas import GeoDataFrame
+from pandas import DataFrame
+import numpy as np
+
+from network_wrangler import RoadwayNetwork
+from .parameters import Parameters
+from .logger import WranglerLogger
+
+
+
[docs]class ModelRoadwayNetwork(RoadwayNetwork): + """ + Subclass of network_wrangler class :ref:`RoadwayNetwork <network_wrangler:RoadwayNetwork>` + + A representation of the physical roadway network and its properties. + """ + + CALCULATED_VALUES = [ + "area_type", + "county", + "centroidconnect", + ] + +
[docs] def __init__( + self, + nodes: GeoDataFrame, + links: DataFrame, + shapes: GeoDataFrame, + parameters: Union[Parameters, dict] = {}, + **kwargs, + ): + """ + Constructor + + Args: + nodes: geodataframe of nodes + links: dataframe of links + shapes: geodataframe of shapes + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. + If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + """ + super().__init__(nodes, links, shapes, **kwargs) + + # will have to change if want to alter them + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.links_metcouncil_df = None + self.nodes_metcouncil_df = None + + self.fill_na() + self.convert_int()
+ # self.shapes_metcouncil_df = None + ##todo also write to file + # WranglerLogger.debug("Used PARAMS\n", '\n'.join(['{}: {}'.format(k,v) for k,v in self.parameters.__dict__.items()])) + +
[docs] @staticmethod + def read( + link_filename: str, + node_filename: str, + shape_filename: str, + fast: bool = False, + recalculate_calculated_variables=False, + recalculate_distance=False, + parameters: Union[dict, Parameters] = {}, + **kwargs, + ): + """ + Reads in links and nodes network standard. + + Args: + link_filename (str): File path to link json. + node_filename (str): File path to node geojson. + shape_filename (str): File path to link true shape geojson + fast (bool): boolean that will skip validation to speed up read time. + recalculate_calculated_variables (bool): calculates fields from spatial joins, etc. + recalculate_distance (bool): re-calculates distance. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + Returns: + ModelRoadwayNetwork + """ + + nodes_df, links_df, shapes_df = RoadwayNetwork.load_transform_network( + node_filename, + link_filename, + shape_filename, + validate_schema=not fast, + **kwargs, + ) + + m_road_net = ModelRoadwayNetwork( + nodes_df, + links_df, + shapes_df, + parameters=parameters, + **kwargs, + ) + + if recalculate_calculated_variables: + m_road_net.create_calculated_variables() + if recalculate_distance: + m_road_net.calculate_distance(overwrite=True) + + m_road_net.fill_na() + # this method is making period values as string "NaN", need to revise. + m_road_net.split_properties_by_time_period_and_category() + for c in m_road_net.links_df.columns: + m_road_net.links_df[c] = m_road_net.links_df[c].replace("NaN", np.nan) + m_road_net.convert_int() + + return m_road_net
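# A hedged usage sketch for ModelRoadwayNetwork.read(); the import assumes the
# class is exported at the package level, and the three file paths are
# placeholders for a network-standard link json, node geojson, and shape geojson.
from lasso import ModelRoadwayNetwork

model_net = ModelRoadwayNetwork.read(
    link_filename="path/to/link.json",       # placeholder
    node_filename="path/to/node.geojson",    # placeholder
    shape_filename="path/to/shape.geojson",  # placeholder
    fast=True,        # skip schema validation to speed up the read
    parameters={},    # empty dict falls back to default lasso Parameters
)
print(len(model_net.links_df), "links;", len(model_net.nodes_df), "nodes")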
+ +
[docs] @staticmethod + def from_RoadwayNetwork( + roadway_network_object, + parameters: Union[dict, Parameters] = {}, + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + roadway_network_object (RoadwayNetwork). + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + Returns: + ModelRoadwayNetwork + """ + + additional_params_dict = { + k: v + for k, v in roadway_network_object.__dict__.items() + if k not in ["nodes_df", "links_df", "shapes_df", "parameters"] + } + + return ModelRoadwayNetwork( + roadway_network_object.nodes_df, + roadway_network_object.links_df, + roadway_network_object.shapes_df, + parameters=parameters, + **additional_params_dict, + )
+ +
[docs] def split_properties_by_time_period_and_category(self, properties_to_split=None):
+        """
+        Splits properties by time period (and optionally category), creating
+        columns named <prefix>_<time_suffix>, e.g. lanes_AM.
+
+        Args:
+            properties_to_split: dict
+                dictionary of output variable prefix mapped to the source variable and what to stratify it by
+                e.g.
+                {
+                    'lanes' : {'v':'lanes', 'time_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}},
+                    'ML_lanes' : {'v':'ML_lanes', 'time_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}},
+                    'use' : {'v':'use', 'time_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}},
+                }
+
+        """
+        import itertools
+
+        if properties_to_split is None:
+            properties_to_split = self.parameters.properties_to_split
+
+        for out_var, params in properties_to_split.items():
+            if params["v"] not in self.links_df.columns:
+                WranglerLogger.warning(
+                    "Specified variable to split: {} not in network variables: {}. Returning 0.".format(
+                        params["v"], str(self.links_df.columns)
+                    )
+                )
+                # source variable is missing: create the split columns filled with 0
+                if params.get("time_periods") and params.get("categories"):
+                    for time_suffix, category_suffix in itertools.product(
+                        params["time_periods"], params["categories"]
+                    ):
+                        self.links_df[
+                            out_var + "_" + time_suffix + "_" + category_suffix
+                        ] = 0
+                elif params.get("time_periods"):
+                    for time_suffix in params["time_periods"]:
+                        self.links_df[out_var + "_" + time_suffix] = 0
+            # source variable is present: split it by time period (and category)
+            elif params.get("time_periods") and params.get("categories"):
+                for time_suffix, category_suffix in itertools.product(
+                    params["time_periods"], params["categories"]
+                ):
+                    self.links_df[
+                        out_var + "_" + category_suffix + "_" + time_suffix
+                    ] = self.get_property_by_time_period_and_group(
+                        params["v"],
+                        category=params["categories"][category_suffix],
+                        time_period=params["time_periods"][time_suffix],
+                    )
+            elif params.get("time_periods"):
+                for time_suffix in params["time_periods"]:
+                    self.links_df[
+                        out_var + "_" + time_suffix
+                    ] = self.get_property_by_time_period_and_group(
+                        params["v"],
+                        category=None,
+                        time_period=params["time_periods"][time_suffix],
+                    )
+            else:
+                raise ValueError(
+                    "Shouldn't have a category without a time period: {}".format(params)
+                )
+ +
[docs] def create_calculated_variables(self): + """ + Creates calculated roadway variables. + + Args: + None + """ + WranglerLogger.info("Creating calculated roadway variables.") + + #MTC + self.create_ML_variable() + #/MTC + #MC + self.calculate_area_type() + self.calculate_county() + self.calculate_mpo() + self.add_counts() + self.create_ML_variable() + self.create_hov_corridor_variable() + self.create_managed_variable()
+ #/MC + +
[docs] def calculate_county( + self, + county_shape=None, + county_shape_variable=None, + network_variable="county", + county_codes_dict=None, + overwrite=False, + ): + """ + #MC + Calculates county variable. + + This uses the centroid of the geometry field to determine which county it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + county_shape (str): The File path to county geodatabase. + county_shape_variable (str): The variable name of county in county geodadabase. + network_variable (str): The variable name of county in network standard. Default to "county". + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing County Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "County Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + county_shape = county_shape if county_shape else self.parameters.county_shape + + county_shape_variable = ( + county_shape_variable + if county_shape_variable + else self.parameters.county_variable_shp + ) + + WranglerLogger.info( + "Adding roadway network variable for county using a spatial join with: {}".format( + county_shape + ) + ) + + county_codes_dict = ( + county_codes_dict if county_codes_dict else self.parameters.county_code_dict + ) + if not county_codes_dict: + msg = "No county codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + county_gdf = gpd.read_file(county_shape) + county_gdf = county_gdf.to_crs(epsg=self.crs) + joined_gdf = gpd.sjoin(centroids_gdf, county_gdf, how="left", op="intersects") + + joined_gdf[county_shape_variable] = ( + joined_gdf[county_shape_variable] + .map(county_codes_dict) + .fillna(10) + .astype(int) + ) + + self.links_df[network_variable] = joined_gdf[county_shape_variable] + + WranglerLogger.info( + "Finished Calculating county variable: {}".format(network_variable) + )
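# A toy, self-contained example of the centroid / spatial-join / code-mapping
# pattern used by calculate_county() above.  The two counties, three links, and
# the column name CO_NAME are invented; newer geopandas spells the join keyword
# predicate= where the code above uses the older op=.
import geopandas as gpd
from shapely.geometry import LineString, box

links = gpd.GeoDataFrame(
    {"model_link_id": [1, 2, 3]},
    geometry=[
        LineString([(0.1, 0.1), (0.4, 0.4)]),
        LineString([(1.2, 0.2), (1.6, 0.6)]),
        LineString([(5.0, 5.0), (5.5, 5.5)]),  # outside both toy counties
    ],
    crs="EPSG:4326",
)
counties = gpd.GeoDataFrame(
    {"CO_NAME": ["Hennepin", "Ramsey"]},
    geometry=[box(0, 0, 1, 1), box(1, 0, 2, 1)],
    crs="EPSG:4326",
)
county_codes = {"Hennepin": 1, "Ramsey": 2}

centroids = links.copy()
centroids["geometry"] = centroids["geometry"].centroid
joined = gpd.sjoin(centroids, counties, how="left", predicate="intersects")
links["county"] = joined["CO_NAME"].map(county_codes).fillna(10).astype(int)
print(links[["model_link_id", "county"]])  # the third link falls back to code 10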
+ +
[docs] def calculate_area_type( + self, + area_type_shape=None, + area_type_shape_variable=None, + network_variable="area_type", + area_type_codes_dict=None, + downtown_area_type_shape=None, + downtown_area_type=None, + overwrite=False, + ): + """ + #MC + Calculates area type variable. + + This uses the centroid of the geometry field to determine which area it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + area_type_shape (str): The File path to area geodatabase. + area_type_shape_variable (str): The variable name of area type in area geodadabase. + network_variable (str): The variable name of area type in network standard. Default to "area_type". + area_type_codes_dict: The dictionary to map input area_type_shape_variable to network_variable + downtown_area_type_shape: The file path to the downtown area type boundary. + downtown_area_type (int): Integer value of downtown area type + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Area Type Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Area Type Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Area Type from Spatial Data and adding as roadway network variable: {}".format( + network_variable + ) + ) + + """ + Verify inputs + """ + + area_type_shape = ( + area_type_shape if area_type_shape else self.parameters.area_type_shape + ) + + if not area_type_shape: + msg = "No area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(area_type_shape): + msg = "File not found for area type shape: {}".format(area_type_shape) + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_shape_variable = ( + area_type_shape_variable + if area_type_shape_variable + else self.parameters.area_type_variable_shp + ) + + if not area_type_shape_variable: + msg = "No area type shape varible specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_codes_dict = ( + area_type_codes_dict + if area_type_codes_dict + else self.parameters.area_type_code_dict + ) + if not area_type_codes_dict: + msg = "No area type codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type_shape = ( + downtown_area_type_shape + if downtown_area_type_shape + else self.parameters.downtown_area_type_shape + ) + + if not downtown_area_type_shape: + msg = "No downtown area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(downtown_area_type_shape): + msg = "File not found for downtown area type shape: {}".format( + downtown_area_type_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type = ( + downtown_area_type + if downtown_area_type + else self.parameters.downtown_area_type + ) + if not downtown_area_type: + msg = "No downtown area type value specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + WranglerLogger.debug("Reading Area Type Shapefile {}".format(area_type_shape)) + area_type_gdf = gpd.read_file(area_type_shape) + area_type_gdf = area_type_gdf.to_crs(epsg=self.crs) 
+ + downtown_gdf = gpd.read_file(downtown_area_type_shape) + downtown_gdf = downtown_gdf.to_crs(epsg=self.crs) + + joined_gdf = gpd.sjoin( + centroids_gdf, area_type_gdf, how="left", op="intersects" + ) + + joined_gdf[area_type_shape_variable] = ( + joined_gdf[area_type_shape_variable] + .map(area_type_codes_dict) + .fillna(1) + .astype(int) + ) + + WranglerLogger.debug("Area Type Codes Used: {}".format(area_type_codes_dict)) + + d_joined_gdf = gpd.sjoin( + centroids_gdf, downtown_gdf, how="left", op="intersects" + ) + + d_joined_gdf["downtown_area_type"] = d_joined_gdf["Id"].fillna(-99).astype(int) + + joined_gdf.loc[ + d_joined_gdf["downtown_area_type"] == 0, area_type_shape_variable + ] = downtown_area_type + + WranglerLogger.debug( + "Downtown Area Type used boundary file: {}".format(downtown_area_type_shape) + ) + + self.links_df[network_variable] = joined_gdf[area_type_shape_variable] + + WranglerLogger.info( + "Finished Calculating Area Type from Spatial Data into variable: {}".format( + network_variable + ) + )
+ +
[docs] def calculate_mpo( + self, + county_network_variable="county", + network_variable="mpo", + as_integer=True, + mpo_counties=None, + overwrite=False, + ): + """ + Calculates mpo variable. + #MC + Args: + county_variable (str): Name of the variable where the county names are stored. Default to "county". + network_variable (str): Name of the variable that should be written to. Default to "mpo". + as_integer (bool): If true, will convert true/false to 1/0s. + mpo_counties (list): List of county names that are within mpo region. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating MPO as roadway network variable: {}".format(network_variable) + ) + """ + Verify inputs + """ + county_network_variable = ( + county_network_variable + if county_network_variable + else self.parameters.county_network_variable + ) + + if not county_network_variable: + msg = "No variable specified as containing 'county' in the network." + WranglerLogger.error(msg) + raise ValueError(msg) + if county_network_variable not in self.links_df.columns: + msg = "Specified county network variable: {} does not exist in network. Try running or debuging county calculation." + WranglerLogger.error(msg) + raise ValueError(msg) + + mpo_counties = mpo_counties if mpo_counties else self.parameters.mpo_counties + + if not mpo_counties: + msg = "No MPO Counties specified in method call or in parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("MPO Counties: {}".format(",".join(str(mpo_counties)))) + + """ + Start actual process + """ + + mpo = self.links_df[county_network_variable].isin(mpo_counties) + + if as_integer: + mpo = mpo.astype(int) + + self.links_df[network_variable] = mpo + + WranglerLogger.info( + "Finished calculating MPO variable: {}".format(network_variable) + )
+ +
[docs] def add_variable_using_shst_reference( + self, + var_shst_csvdata=None, + shst_csv_variable=None, + network_variable=None, + network_var_type=int, + overwrite=False, + ): + """ + Join network links with source data, via SHST API node match result. + + Args: + var_shst_csvdata (str): File path to SHST API return. + shst_csv_variable (str): Variable name in the source data. + network_variable (str): Name of the variable that should be written to. + network_var_type : Variable type in the written network. + overwrite (bool): True is overwriting existing variable. Default to False. + + Returns: + None + + """ + WranglerLogger.info( + "Adding Variable {} using Shared Streets Reference from {}".format( + network_variable, var_shst_csvdata + ) + ) + + var_shst_df = pd.read_csv(var_shst_csvdata) + # there are aadt = 0 in the counts, drop them + var_shst_df = var_shst_df[var_shst_df[shst_csv_variable] > 0].copy() + # count station to shared street match - there are many-to-one matches, keep just one match + var_shst_df.drop_duplicates(subset = ["shstReferenceId"], inplace = True) + + if "shstReferenceId" not in var_shst_df.columns: + msg = "'shstReferenceId' required but not found in {}".format(var_shst_data) + WranglerLogger.error(msg) + raise ValueError(msg) + + if shst_csv_variable not in var_shst_df.columns: + msg = "{} required but not found in {}".format( + shst_csv_variable, var_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + join_gdf = pd.merge( + self.links_df, + var_shst_df[["shstReferenceId", shst_csv_variable]], + how="left", + on="shstReferenceId", + ) + + join_gdf[shst_csv_variable].fillna(0, inplace=True) + + if network_variable in self.links_df.columns and not overwrite: + join_gdf.loc[join_gdf[network_variable] == 0, network_variable] = join_gdf[ + shst_csv_variable + ].astype(network_var_type) + else: + join_gdf[network_variable] = join_gdf[shst_csv_variable].astype( + network_var_type + ) + + self.links_df[network_variable] = join_gdf[network_variable] + + # MN and WI counts are vehicles using the segment in both directions, no directional counts + # we will make sure both direction has the same daily AADT + dir_link_count_df = self.links_df[ + (self.links_df[network_variable] > 0) & + (self.links_df["drive_access"] == 1) + ][["A", "B", network_variable]].copy() + reverse_dir_link_count_df = dir_link_count_df.rename(columns = {"A":"B", "B":"A"}).copy() + + link_count_df = pd.concat( + [dir_link_count_df, reverse_dir_link_count_df], + sort = False, + ignore_index = True + ) + link_count_df.drop_duplicates(subset = ["A", "B"], inplace = True) + + self.links_df = pd.merge( + self.links_df.drop(network_variable, axis = 1), + link_count_df[["A", "B", network_variable]], + how = "left", + on = ["A", "B"] + ) + self.links_df[network_variable].fillna(0, inplace = True) + WranglerLogger.info( + "Added variable: {} using Shared Streets Reference".format(network_variable) + )
+ +
[docs] def add_counts( + self, + network_variable="AADT", + mndot_count_shst_data=None, + widot_count_shst_data=None, + mndot_count_variable_shp=None, + widot_count_variable_shp=None, + ): + + """ + Adds count variable. + #MC + join the network with count node data, via SHST API node match result + + Args: + network_variable (str): Name of the variable that should be written to. Default to "AADT". + mndot_count_shst_data (str): File path to MNDOT count location SHST API node match result. + widot_count_shst_data (str): File path to WIDOT count location SHST API node match result. + mndot_count_variable_shp (str): File path to MNDOT count location geodatabase. + widot_count_variable_shp (str): File path to WIDOT count location geodatabase. + + Returns: + None + """ + + WranglerLogger.info("Adding Counts") + + """ + Verify inputs + """ + + mndot_count_shst_data = ( + mndot_count_shst_data + if mndot_count_shst_data + else self.parameters.mndot_count_shst_data + ) + widot_count_shst_data = ( + widot_count_shst_data + if widot_count_shst_data + else self.parameters.widot_count_shst_data + ) + mndot_count_variable_shp = ( + mndot_count_variable_shp + if mndot_count_variable_shp + else self.parameters.mndot_count_variable_shp + ) + widot_count_variable_shp = ( + widot_count_variable_shp + if widot_count_variable_shp + else self.parameters.widot_count_variable_shp + ) + + for varname, var in { + "mndot_count_shst_data": mndot_count_shst_data, + "widot_count_shst_data": widot_count_shst_data, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(var): + msg = "{}' not found at following location: {}.".format(varname, var) + WranglerLogger.error(msg) + raise ValueError(msg) + + for varname, var in { + "mndot_count_variable_shp": mndot_count_variable_shp, + "widot_count_variable_shp": widot_count_variable_shp, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + WranglerLogger.debug( + "Adding MNDOT Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + mndot_count_shst_data, mndot_count_variable_shp, network_variable + ) + ) + # Add Minnesota Counts + self.add_variable_using_shst_reference( + var_shst_csvdata=mndot_count_shst_data, + shst_csv_variable=mndot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=True, + ) + WranglerLogger.debug( + "Adding WiDot Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + widot_count_shst_data, widot_count_variable_shp, network_variable + ) + ) + # Add Wisconsin Counts, but don't overwrite Minnesota + self.add_variable_using_shst_reference( + var_shst_csvdata=widot_count_shst_data, + shst_csv_variable=widot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=False, + ) + + self.links_df["count_AM"] = self.links_df[network_variable] / 4 + self.links_df["count_MD"] = self.links_df[network_variable] / 4 + self.links_df["count_PM"] = self.links_df[network_variable] / 4 + self.links_df["count_NT"] = self.links_df[network_variable] / 4 + + self.links_df["count_daily"] = self.links_df[network_variable] + # add COUNTYEAR + self.links_df["count_year"] = 2017 + + WranglerLogger.info( + "Finished adding counts variable: {}".format(network_variable) + )
+ +
[docs] @staticmethod + def read_match_result(path): + """ + Reads the shst geojson match returns. + + Returns shst dataframe. + + Reading lots of same type of file and concatenating them into a single DataFrame. + + Args: + path (str): File path to SHST match results. + + Returns: + geodataframe: geopandas geodataframe + + ##todo + not sure why we need, but should be in utilities not this class + """ + refId_gdf = DataFrame() + refid_file = glob.glob(path) + for i in refid_file: + new = gpd.read_file(i) + refId_gdf = pd.concat([refId_gdf, new], ignore_index=True, sort=False) + return refId_gdf
+ +
[docs] @staticmethod + def get_attribute( + links_df, + join_key, # either "shstReferenceId", or "shstGeometryId", tests showed the latter gave better coverage + source_shst_ref_df, # source shst refId + source_gdf, # source dataframe + field_name, # , # targetted attribute from source + ): + """ + Gets attribute from source data using SHST match result. + + Args: + links_df (dataframe): The network dataframe that new attribute should be written to. + join_key (str): SHST ID variable name used to join source data with network dataframe. + source_shst_ref_df (str): File path to source data SHST match result. + source_gdf (str): File path to source data. + field_name (str): Name of the attribute to get from source data. + + Returns: + None + """ + # join based on shared streets geometry ID + # pp_link_id is shared streets match return + # source_ink_id is mrcc + WranglerLogger.debug( + "source ShSt rename_variables_for_dbf columns\n{}".format( + source_shst_ref_df.columns + ) + ) + WranglerLogger.debug("source gdf columns\n{}".format(source_gdf.columns)) + # end up with OSM network with the MRCC Link ID + # could also do with route_sys...would that be quicker? + join_refId_df = pd.merge( + links_df, + source_shst_ref_df[[join_key, "pp_link_id", "score"]].rename( + columns={"pp_link_id": "source_link_id", "score": "source_score"} + ), + how="left", + on=join_key, + ) + + # joined with MRCC dataframe to get route_sys + + join_refId_df = pd.merge( + join_refId_df, + source_gdf[["LINK_ID", field_name]].rename( + columns={"LINK_ID": "source_link_id"} + ), + how="left", + on="source_link_id", + ) + + # drop duplicated records with same field value + + join_refId_df.drop_duplicates( + subset=["model_link_id", "shstReferenceId", field_name], inplace=True + ) + + # more than one match, take the best score + + join_refId_df.sort_values( + by=["model_link_id", "source_score"], + ascending=True, + na_position="first", + inplace=True, + ) + + join_refId_df.drop_duplicates( + subset=["model_link_id"], keep="last", inplace=True + ) + + # self.links_df[field_name] = join_refId_df[field_name] + + return join_refId_df[links_df.columns.tolist() + [field_name, "source_link_id"]]
+ +
[docs] def calculate_use( + self, + network_variable="use", + as_integer=True, + overwrite=False, + ): + """ + Calculates use variable. + + Args: + network_variable (str): Variable that should be written to in the network. Default to "use" + as_integer (bool): If True, will convert true/false to 1/0s. Defauly to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "'use' Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating hov and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + #MTC + self.links_df[network_variable] = int(1) + #/MTC + + self.links_df[network_variable] = 0 + + self.links_df.loc[ + (self.links_df["assign_group"] == 8) | (self.links_df["access"] == "hov"), + network_variable, + ] = 100 + #/MC + + + if as_integer: + self.links_df[network_variable] = self.links_df[network_variable].astype( + int + ) + WranglerLogger.info( + "Finished calculating hov variable: {}".format(network_variable) + )
+ +
[docs] def create_ML_variable( + self, + network_variable="ML_lanes", + overwrite=False, + ): + """ + Created ML lanes placeholder for project to write out ML changes + + ML lanes default to 0, ML info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing ML Variable '{}' already in network".format( + network_variable + ) + ) + self.links_df[network_variable] = int(0) + else: + WranglerLogger.info( + "ML Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + WranglerLogger.info( + "Finished creating ML lanes variable: {}".format(network_variable) + )
+ +
[docs] def create_hov_corridor_variable( + self, + network_variable="segment_id", + overwrite=False, + ): + """ + Created hov corridor placeholder for project to write out corridor changes + + hov corridor id default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov corridor Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Hov corridor Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating hov corridor variable: {}".format(network_variable) + )
+ +
[docs] def create_managed_variable( + self, + network_variable="managed", + overwrite=False, + ): + """ + Created placeholder for project to write out managed + + managed default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing managed Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Managed Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating managed variable: {}".format(network_variable) + )
+ +
[docs] def calculate_centroidconnect( + self, + parameters, + network_variable="centroidconnect", + highest_taz_number=None, + as_integer=True, + overwrite=False, + ): + """ + Calculates centroid connector variable. + + Args: + parameters (Parameters): A Lasso Parameters, which stores input files. + network_variable (str): Variable that should be written to in the network. Default to "centroidconnect" + highest_taz_number (int): the max TAZ number in the network. + as_integer (bool): If True, will convert true/false to 1/0s. Default to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + Returns: + RoadwayNetwork + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Centroid Connector Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Centroid Connector Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Centroid Connector and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + highest_taz_number = ( + highest_taz_number if highest_taz_number else parameters.highest_taz_number + ) + + if not highest_taz_number: + msg = "No highest_TAZ number specified in method variable or in parameters" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug( + "Calculating Centroid Connectors using highest TAZ number: {}".format( + highest_taz_number + ) + ) + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + self.links_df[network_variable] = False + + self.links_df.loc[ + (self.links_df["A"] <= highest_taz_number) + | (self.links_df["B"] <= highest_taz_number), + network_variable, + ] = True + + if as_integer: + self.links_df[network_variable] = self.links_df[ + network_variable + ].astype(int) + WranglerLogger.info( + "Finished calculating centroid connector variable: {}".format(network_variable) + )
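# A toy illustration of the centroid-connector rule above: a link is flagged
# when either endpoint id is at or below the highest TAZ number.  The threshold
# and node ids are invented.
import pandas as pd

highest_taz_number = 3100
links = pd.DataFrame({"A": [150, 4000, 5000], "B": [4000, 5000, 3000]})
links["centroidconnect"] = (
    (links["A"] <= highest_taz_number) | (links["B"] <= highest_taz_number)
).astype(int)
print(links)  # rows 0 and 2 are centroid connectors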
+ + +
[docs] def calculate_distance( + self, network_variable="distance", centroidconnect_only=False, overwrite=False + ): + """ + calculate link distance in miles + + Args: + centroidconnect_only (Bool): True if calculating distance for centroidconnectors only. Default to False. + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing distance Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Distance Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + #MC + if ("centroidconnect" not in self.links_df) & ("taz" not in self.links_df.roadway.unique()): + if centroidconnect_only: + msg = "No variable specified for centroid connector, calculating centroidconnect first" + WranglerLogger.error(msg) + raise ValueError(msg) + #/MC + """ + Start actual process + """ + + temp_links_gdf = self.links_df.copy() + temp_links_gdf.crs = "EPSG:4326" + temp_links_gdf = temp_links_gdf.to_crs(epsg=26915) + + #MTC + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MTC + #MC + if centroidconnect_only: + WranglerLogger.info( + "Calculating {} for centroid connectors".format(network_variable) + ) + temp_links_gdf[network_variable] = np.where( + temp_links_gdf.centroidconnect == 1, + temp_links_gdf.geometry.length / 1609.34, + temp_links_gdf[network_variable], + ) + else: + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MC + + self.links_df[network_variable] = temp_links_gdf[network_variable]
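# A toy illustration of the distance calculation above: reproject to a metric
# CRS (UTM zone 15N, EPSG:26915) and convert metres to miles.  The coordinates
# are invented and span roughly half a mile.
import geopandas as gpd
from shapely.geometry import LineString

links = gpd.GeoDataFrame(
    {"model_link_id": [1]},
    geometry=[LineString([(-93.265, 44.977), (-93.255, 44.977)])],
    crs="EPSG:4326",
)
links["distance"] = links.to_crs(epsg=26915).geometry.length / 1609.34  # metres -> miles
print(links["distance"])  # approximately 0.49 miles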
+ +
[docs] def convert_int(self, int_col_names=[]):
+        """
+        Convert integer columns to integer type, filling missing and blank values with 0.
+        """
+
+        #MTC
+        WranglerLogger.info(
+            "Converting variable type to mtc standard"
+        )
+
+        int_col_names = self.parameters.int_col
+        #/MTC
+        #MC
+        """
+        WranglerLogger.info("Converting variable type to MetCouncil standard")
+
+        if not int_col_names:
+            int_col_names = self.parameters.int_col
+        #/MC
+        """
+        ##Why are we doing this?
+        # int_col_names.remove("lanes")
+
+        for c in list(set(self.links_df.columns) & set(int_col_names)):
+            # replace NaN and blank strings with zero before casting to integer
+            self.links_df[c] = self.links_df[c].replace(np.nan, 0)
+            self.links_df[c] = self.links_df[c].replace("", 0)
+            try:
+                self.links_df[c] = self.links_df[c].astype(int)
+            except ValueError:
+                try:
+                    self.links_df[c] = self.links_df[c].astype(float)
+                    self.links_df[c] = self.links_df[c].astype(int)
+                except Exception:
+                    msg = f"Could not convert column {c} to integer."
+                    WranglerLogger.error(msg)
+                    raise ValueError(msg)
+
+        for c in list(set(self.nodes_df.columns) & set(int_col_names)):
+            self.nodes_df[c] = self.nodes_df[c].replace("", 0)
+            self.nodes_df[c] = self.nodes_df[c].astype(int)
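# A toy illustration of the coercion performed by convert_int(): NaNs and blank
# strings become 0, then the column is cast to int (via float when the values
# are decimal strings).  The column names and values are invented.
import numpy as np
import pandas as pd

links = pd.DataFrame({"lanes": [2, np.nan, ""], "AADT": ["1200.0", np.nan, 450]})
for c in ["lanes", "AADT"]:
    links[c] = links[c].replace(np.nan, 0).replace("", 0)
    try:
        links[c] = links[c].astype(int)
    except ValueError:
        links[c] = links[c].astype(float).astype(int)
print(links.dtypes)  # both columns end up as integer dtype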
+ +
[docs] def fill_na(self): + """ + Fill na values from create_managed_lane_network() + """ + + WranglerLogger.info("Filling nan for network from network wrangler") + + num_col = self.parameters.int_col + self.parameters.float_col + + for x in list(self.links_df.columns): + if x in num_col: + self.links_df[x].fillna(0, inplace=True) + self.links_df[x] = self.links_df[x].apply( + lambda k: 0 if k in [np.nan, "", float("nan"), "NaN"] else k + ) + + else: + self.links_df[x].fillna("", inplace=True) + + for x in list(self.nodes_df.columns): + if x in num_col: + self.nodes_df[x].fillna(0, inplace=True) + else: + self.nodes_df[x].fillna("", inplace=True)
+ + +
[docs] def roadway_standard_to_met_council_network(self, output_epsg=None): + """ + Rename and format roadway attributes to be consistent with what metcouncil's model is expecting. + #MC + Args: + output_epsg (int): epsg number of output network. + + Returns: + None + """ + + WranglerLogger.info( + "Renaming roadway attributes to be consistent with what metcouncil's model is expecting" + ) + + """ + Verify inputs + """ + + output_epsg = output_epsg if output_epsg else self.parameters.output_epsg + + """ + Start actual process + """ + if "managed" in self.links_df.columns: + WranglerLogger.info("Creating managed lane network.") + self.create_managed_lane_network(in_place=True) + + # when ML and assign_group projects are applied together, assign_group is filled as "" by wrangler for ML links + for c in ModelRoadwayNetwork.CALCULATED_VALUES: + if c in self.links_df.columns and c in self.parameters.int_col: + self.links_df[c] = self.links_df[c].replace("", 0) + else: + WranglerLogger.info("Didn't detect managed lanes in network.") + + self.calculate_centroidconnect(self.parameters) + self.create_calculated_variables() + self.calculate_distance(overwrite=True) + + self.fill_na() + # no method to calculate price yet, will be hard coded in project card + WranglerLogger.info("Splitting variables by time period and category") + self.split_properties_by_time_period_and_category() + self.convert_int() + + self.links_metcouncil_df = self.links_df.copy() + self.nodes_metcouncil_df = self.nodes_df.copy() + + self.links_metcouncil_df = pd.merge( + self.links_metcouncil_df.drop( + "geometry", axis=1 + ), # drop the stick geometry in links_df + self.shapes_df[["shape_id", "geometry"]], + how="left", + on="shape_id", + ) + + self.links_metcouncil_df.crs = "EPSG:4326" + self.nodes_metcouncil_df.crs = "EPSG:4326" + WranglerLogger.info("Setting Coordinate Reference System to EPSG 26915") + self.links_metcouncil_df = self.links_metcouncil_df.to_crs(epsg=26915) + self.nodes_metcouncil_df = self.nodes_metcouncil_df.to_crs(epsg=26915) + + self.nodes_metcouncil_df["X"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.x + ) + self.nodes_metcouncil_df["Y"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.y + ) + + # CUBE expect node id to be N + self.nodes_metcouncil_df.rename(columns={"model_node_id": "N"}, inplace=True)
+ +
[docs] def rename_variables_for_dbf( + self, + input_df, + variable_crosswalk: str = None, + output_variables: list = None, + convert_geometry_to_xy=False, + ): + """ + Rename attributes for DBF/SHP, make sure length within 10 chars. + + Args: + input_df (dataframe): Network standard DataFrame. + variable_crosswalk (str): File path to variable name crosswalk from network standard to DBF names. + output_variables (list): List of strings for DBF variables. + convert_geometry_to_xy (bool): True if converting node geometry to X/Y + + Returns: + dataframe + + """ + WranglerLogger.info("Renaming variables so that they are DBF-safe") + + """ + Verify inputs + """ + + variable_crosswalk = ( + variable_crosswalk + if variable_crosswalk + else self.parameters.net_to_dbf_crosswalk + ) + + output_variables = ( + output_variables if output_variables else self.parameters.output_variables + ) + + """ + Start actual process + """ + + crosswalk_df = pd.read_csv(variable_crosswalk) + WranglerLogger.debug( + "Variable crosswalk: {} \n {}".format(variable_crosswalk, crosswalk_df) + ) + net_to_dbf_dict = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"])) + + dbf_name_list = [] + + dbf_df = copy.deepcopy(input_df) + + # only write out variables that we specify + # if variable is specified in the crosswalk, rename it to that variable + for c in dbf_df.columns: + if c in output_variables: + try: + dbf_df.rename(columns={c: net_to_dbf_dict[c]}, inplace=True) + dbf_name_list += [net_to_dbf_dict[c]] + except: + dbf_name_list += [c] + + if "geometry" in dbf_df.columns: + if str(dbf_df["geometry"].iloc[0].geom_type) == "Point": + dbf_df["X"] = dbf_df.geometry.apply(lambda g: g.x) + dbf_df["Y"] = dbf_df.geometry.apply(lambda g: g.y) + dbf_name_list += ["X", "Y"] + + WranglerLogger.debug("DBF Variables: {}".format(",".join(dbf_name_list))) + + return dbf_df[dbf_name_list]
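# A toy illustration of the DBF-safe renaming above: a crosswalk maps network
# variable names to names of at most 10 characters before writing a shapefile.
# The crosswalk entries below stand in for the net-to-dbf crosswalk csv.
import pandas as pd

net_to_dbf_dict = {"model_link_id": "link_id", "shstGeometryId": "shstGeomId"}
output_variables = ["model_link_id", "shstGeometryId", "lanes"]

links = pd.DataFrame(
    {"model_link_id": [1], "shstGeometryId": ["abc123"], "lanes": [2], "extra": [9]}
)
dbf_df = links[[c for c in links.columns if c in output_variables]].rename(
    columns=net_to_dbf_dict
)
print(list(dbf_df.columns))  # ['link_id', 'shstGeomId', 'lanes']; 'extra' is dropped
assert all(len(c) <= 10 for c in dbf_df.columns)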
+ +
[docs] def write_roadway_as_shp( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + data_to_csv: bool = True, + data_to_dbf: bool = False, + output_link_shp: str = None, + output_node_shp: str = None, + output_link_csv: str = None, + output_node_csv: str = None, + output_gpkg: str = None, + output_link_gpkg_layer: str = None, + output_node_gpkg_layer: str = None, + output_gpkg_link_filter: str = None + ): + """ + Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names. + + Args: + output_dir (str): File path to directory + node_output_variables (list): List of strings for node output variables. + link_output_variables (list): List of strings for link output variables. + data_to_csv (bool): True if write network in csv format. + data_to_dbf (bool): True if write network in dbf/shp format. + output_link_shp (str): File name to output link dbf/shp. + output_node_shp (str): File name of output node dbf/shp. + output_link_csv (str): File name to output link csv. + output_node_csv (str): File name to output node csv. + output_gpkg (str): File name to output GeoPackage. + output_link_gpkg_layer (str): Layer name within output_gpkg to output links. + output_node_gpkg_layer (str): Layer name within output_gpkg to output links. + output_gpkg_link_filter (str): Optional column name to additional output link subset layers + + Returns: + None + """ + + WranglerLogger.info("Writing Network as Shapefile") + WranglerLogger.debug( + "Output Variables: \n - {}".format( + "\n - ".join(self.parameters.output_variables) + ) + ) + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_met_council_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + # unless specified that all the data goes to the DBF, only output A and B + dbf_link_output_variables = ( + #MTC + link_output_variables if link_output_variables else ["A", "B", "geometry"] + #MC + #link_output_variables if data_to_dbf else ["A", "B", "shape_id", "geometry"] + ) + + # Removing code to set this to versions from parameters + # User can use these as arg + + """ + Start Process + """ + # rename these to short only for shapefile option + if output_node_shp: + WranglerLogger.info("Renaming DBF Node Variables") + nodes_dbf_df = self.rename_variables_for_dbf(self.nodes_mtc_df, output_variables=node_output_variables) + else: + WranglerLogger.debug("nodes_mtc_df columns: {}".format(list(self.nodes_mtc_df.columns))) + nodes_dbf_df = self.nodes_mtc_df[node_output_variables] + + if output_link_shp: + WranglerLogger.info("Renaming DBF Link Variables") + links_dbf_df = self.rename_variables_for_dbf(self.links_mtc_df, output_variables=dbf_link_output_variables) + else: + WranglerLogger.debug("links_mtc_df columns: {}".format(list(self.links_mtc_df.columns))) + links_dbf_df = self.links_mtc_df[dbf_link_output_variables] + + links_dbf_df = gpd.GeoDataFrame(links_dbf_df, 
geometry=links_dbf_df["geometry"]) + + # temp debug + WranglerLogger.debug("links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))]:\n{}".format( + links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))] + )) + + if output_node_shp: + WranglerLogger.info("Writing Node Shapes: {}".format(os.path.join(output_dir, output_node_shp))) + nodes_dbf_df.to_file(os.path.join(output_dir, output_node_shp)) + + if output_gpkg and output_node_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Node Layer {}".format(os.path.join(output_dir, output_gpkg), output_node_gpkg_layer)) + nodes_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_node_gpkg_layer, driver="GPKG") + + if output_link_shp: + WranglerLogger.info("Writing Link Shapes: {}".format(os.path.join(output_dir, output_link_shp))) + links_dbf_df.to_file(os.path.join(output_dir, output_link_shp)) + + # debug test + link_schema = { + "properties": { + "A" : "int:8", + "B" : "int:8", + "model_link_id" : "int:10", + "shstGeometryId": "str:32", + "name" : "str:84", + "ft" : "int:2", + "assignable" : "int:18", + "cntype" : "str:80", + "distance" : "float", + "county" : "str:15", + "bike_access" : "int:2", + "drive_access" : "int:2", + "walk_access" : "int:2", + "rail_only" : "int:2", + "bus_only" : "int:2", + "transit" : "int:2", + "managed" : "int:2", + "tollbooth" : "int:2", + "tollseg" : "int:2", + "segment_id" : "int:4", + "lanes_EA" : "int:2", + "heuristic_num" : "int:2", + "lanes_AM" : "int:2", + "lanes_MD" : "int:2", + "lanes_PM" : "int:2", + "lanes_EV" : "int:2", + "useclass_EA" : "int:2", + "useclass_AM" : "int:2", + "useclass_MD" : "int:2", + "useclass_PM" : "int:2", + "useclass_EV" : "int:2" + }, + "geometry": "LineString" + } + if output_gpkg and output_link_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Link Layer {}".format(os.path.join(output_dir, output_gpkg), output_link_gpkg_layer)) + links_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_link_gpkg_layer, schema=link_schema, driver="GPKG") + + # output additional link layers if filter column is specified + # e.g. if county-subsets are output + if output_gpkg_link_filter: + link_value_counts = links_dbf_df[output_gpkg_link_filter].value_counts() + for filter_val,filter_count in link_value_counts.items(): + gpkg_layer_name = "{}_{}".format(output_link_gpkg_layer, filter_val) + gpkg_layer_name = gpkg_layer_name.replace(" ","_") + WranglerLogger.info("Writing GeoPackage {} with Link Layer {} for {} rows".format( + os.path.join(output_dir, output_gpkg), gpkg_layer_name, filter_count)) + links_dbf_df.loc[ links_dbf_df[output_gpkg_link_filter]==filter_val ].to_file( + os.path.join(output_dir, output_gpkg), layer=gpkg_layer_name, schema=link_schema, driver="GPKG") + + + + + if data_to_csv: + WranglerLogger.info( + "Writing Network Data to CSVs:\n - {}\n - {}".format( + output_link_csv, output_node_csv + ) + ) + self.links_mtc_df[link_output_variables].to_csv( + output_link_csv, index=False + ) + self.nodes_mtc_df[node_output_variables].to_csv( + output_node_csv, index=False + )
+ + + # this should be moved to util +
[docs] @staticmethod + def dataframe_to_fixed_width(df): + """ + Convert dataframe to fixed width format, geometry column will not be transformed. + + Args: + df (pandas DataFrame). + + Returns: + pandas dataframe: dataframe with fixed width for each column. + dict: dictionary with columns names as keys, column width as values. + """ + WranglerLogger.info("Starting fixed width conversion") + + # get the max length for each variable column + max_width_dict = dict( + [ + (v, df[v].apply(lambda r: len(str(r)) if r != None else 0).max()) + for v in df.columns.values + if v != "geometry" + ] + ) + + fw_df = df.drop("geometry", axis=1).copy() + for c in fw_df.columns: + fw_df[c] = fw_df[c].apply(lambda x: str(x)) + fw_df["pad"] = fw_df[c].apply(lambda x: " " * (max_width_dict[c] - len(x))) + fw_df[c] = fw_df.apply(lambda x: x["pad"] + x[c], axis=1) + + return fw_df, max_width_dict
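The padding logic above can be illustrated standalone; the toy DataFrame below is made up, and pandas' str.rjust reproduces the same right-justify-to-longest-value behavior.

import pandas as pd

df = pd.DataFrame({"A": [1, 23, 456], "name": ["Oak", "Main St", "I-80"], "geometry": [None] * 3})

# width of the longest stringified value per column, skipping geometry (as above)
widths = {c: int(df[c].astype(str).str.len().max()) for c in df.columns if c != "geometry"}
fixed = df.drop(columns="geometry").astype(str)
for c in fixed.columns:
    fixed[c] = fixed[c].str.rjust(widths[c])  # left-pad each value to the column width

print(widths)  # {'A': 3, 'name': 7}
print(fixed.to_string(index=False, header=False))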
+ +
[docs] def write_roadway_as_fixedwidth( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + output_link_txt: str = None, + output_node_txt: str = None, + output_link_header_width_txt: str = None, + output_node_header_width_txt: str = None, + output_cube_network_script: str = None, + drive_only: bool = False, + ): + """ + Writes out fixed width file. + + This function does: + 1. write out link and node fixed width data files for cube. + 2. write out header and width correspondence. + 3. write out cube network building script with header and width specification. + + Args: + output_dir (str): File path to where links, nodes and script will be written and run + node_output_variables (list): list of node variable names. + link_output_variables (list): list of link variable names. + output_link_txt (str): File name of output link database (within output_dir) + output_node_txt (str): File name of output node database (within output_dir) + output_link_header_width_txt (str): File name of link column width records (within output_dir) + output_node_header_width_txt (str): File name of node column width records (within output_dir) + output_cube_network_script (str): File name of CUBE network building script (within output_dir) + drive_only (bool): If True, only writes drive nodes and links + + Returns: + None + + """ + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_mtc_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + output_link_txt = ( + output_link_txt if output_link_txt else self.parameters.output_link_txt + ) + + output_node_txt = ( + output_node_txt if output_node_txt else self.parameters.output_node_txt + ) + + output_link_header_width_txt = ( + output_link_header_width_txt + if output_link_header_width_txt + else self.parameters.output_link_header_width_txt + ) + + output_node_header_width_txt = ( + output_node_header_width_txt + if output_node_header_width_txt + else self.parameters.output_node_header_width_txt + ) + + output_cube_network_script = ( + output_cube_network_script + if output_cube_network_script + else self.parameters.output_cube_network_script + ) + + """ + Start Process + """ + #MTC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_mtc_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df['drive_access'] == 1] + #/MTC + """ + #MC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_metcouncil_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df["drive_access"] == 1] + #/MC + """ + WranglerLogger.info("Writing out link database") + + link_ff_df.to_csv(os.path.join(output_dir, output_link_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out link header and width ----") + link_max_width_df = DataFrame( + 
list(link_max_width_dict.items()), columns=["header", "width"] + ) + link_max_width_df.to_csv(os.path.join(output_dir, output_link_header_width_txt), index=False) + + #MTC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_mtc_df[node_output_variables] + ) + #/MTC + """ + #MC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_metcouncil_df[node_output_variables] + ) + #/MC + """ + WranglerLogger.info("Writing out node database") + + if drive_only: + node_ff_df = node_ff_df.loc[node_ff_df["drive_node"] == 1] + + + node_ff_df.to_csv(os.path.join(output_dir, output_node_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out node header and width") + node_max_width_df = DataFrame( + list(node_max_width_dict.items()), columns=["header", "width"] + ) + node_max_width_df.to_csv(os.path.join(output_dir, output_node_header_width_txt), index=False) + + # write out cube script + s = 'RUN PGM = NETWORK MSG = "Read in network from fixed width file" \n' + s += 'FILEI LINKI[1] = "{}",'.format(output_link_txt) + start_pos = 1 + for i in range(len(link_max_width_df)): + s += " VAR=" + link_max_width_df.header.iloc[i] + + if ( + self.links_mtc_df.dtypes.loc[link_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(link_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(link_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += link_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += "\n" + s += 'FILEI NODEI[1] = "{}",'.format(output_node_txt) + start_pos = 1 + for i in range(len(node_max_width_df)): + s += " VAR=" + node_max_width_df.header.iloc[i] + + if ( + self.nodes_mtc_df.dtypes.loc[node_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(node_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(node_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += node_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += '\n' + s += 'FILEO NETO = "complete_network.net"\n\n' + s += ' ZONES = {}\n\n'.format(self.parameters.zones) + s += '; Trim leading whitespace from string variables\n' + # todo: The below should be built above based on columns that are strings + s += ' phase=NODEMERGE\n' + s += ' county = LTRIM(county)\n' + s += ' endphase\n' + s += ' phase=LINKMERGE\n' + s += ' name = LTRIM(name)\n' + s += ' county = LTRIM(county)\n' + s += ' cntype = LTRIM(cntype)\n' + s += ' endphase\n' + s += '\nENDRUN\n' + + with open(os.path.join(output_dir, output_cube_network_script), "w") as f: + f.write(s) + + # run the cube script to create the cube network + import subprocess + env = copy.copy(os.environ) + cube_cmd = '"C:\\Program Files\\Citilabs\\CubeVoyager\\runtpp.exe" {}'.format(output_cube_network_script) + try: + WranglerLogger.info("Running [{}] in cwd [{}]".format(cube_cmd, output_dir)) + ret = subprocess.run(cube_cmd, cwd=output_dir, capture_output=True, check=True) + + WranglerLogger.info("return code: {}".format(ret.returncode)) + + for line in ret.stdout.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stdout: {}".format(line)) + + for line in ret.stderr.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stderr: {}".format(line)) + + except Exception as e: + WranglerLogger.error(e)
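A small sketch of the VAR/BEG/LEN bookkeeping used when the Cube script is assembled above, with a made-up two-column layout (a 3-character numeric A and a 7-character string name); the +1 accounts for the ";" separator written between fields in the data file.

columns = [("A", 3, False), ("name", 7, True)]  # (header, width, is_string) -- illustrative values

s = 'FILEI LINKI[1] = "links.txt",'
start_pos = 1
for header, width, is_string in columns:
    s += " VAR=" + header
    if is_string:
        s += "(C{})".format(width)          # string columns get a (C<width>) type tag
    s += ", BEG={}, LEN={},".format(start_pos, width)
    start_pos += width + 1                  # advance past this field and its ";" separator
s = s[:-1]                                  # drop the trailing comma, as above
print(s)
# FILEI LINKI[1] = "links.txt", VAR=A, BEG=1, LEN=3, VAR=name(C7), BEG=5, LEN=7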
+
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_modules/lasso/transit/index.html b/branch/bart/_modules/lasso/transit/index.html new file mode 100644 index 0000000..37ad7b9 --- /dev/null +++ b/branch/bart/_modules/lasso/transit/index.html @@ -0,0 +1,2052 @@ + + + + + + lasso.transit — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.transit

+"""Transit-related classes to parse, compare, and write standard and cube transit files.
+
+  Typical usage example:
+
+    tn = CubeTransit.create_from_cube(CUBE_DIR)
+    transit_change_list = tn.evaluate_differences(base_transit_network)
+
+    cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+    cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+"""
+import os
+import copy
+import csv
+import datetime, time
+from typing import Any, Dict, Optional, Union
+
+from lark import Lark, Transformer, v_args
+from pandas import DataFrame
+
+import pandas as pd
+import partridge as ptg
+import numpy as np
+
+from network_wrangler import TransitNetwork
+
+from .logger import WranglerLogger
+from .parameters import Parameters
+
+
[docs]class CubeTransit(object): + """Class for storing information about transit defined in Cube line + files. + + Has the capability to: + + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + + .. highlight:: python + + Typical usage example: + :: + tn = CubeTransit.create_from_cube(CUBE_DIR) + transit_change_list = tn.evaluate_differences(base_transit_network) + + Attributes: + lines (list): list of strings representing unique line names in + the cube network. + line_properties (dict): dictionary of line properties keyed by line name. Property + values are stored in a dictionary by property name. These + properties are directly read from the cube line files and haven't + been translated to standard transit values. + shapes (dict): dictionary of shapes + keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns: + - 'node_id' (int): positive integer of node id + - 'node' (int): node number, with negative indicating a non-stop + - 'stop' (boolean): indicates if it is a stop + - 'order' (int): order within this shape + program_type (str): Either PT or TRNBLD + parameters (Parameters): + Parameters instance that will be applied to this instance which + includes information about time periods and variables. + source_list (list): + List of cube line file sources that have been read and added. + diff_dict (dict): + """ + +
[docs] def __init__(self, parameters: Union[Parameters, dict] = {}): + """ + Constructor for CubeTransit + + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + WranglerLogger.debug("Creating a new Cube Transit instance") + + self.lines = [] + + self.line_properties = {} + self.shapes = {} + + self.program_type = None + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type: {}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.source_list = [] + + self.diff_dict: Dict[str, Any] = {}
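A brief construction sketch; the valid override keys are whatever the Parameters class accepts and are not shown here.

from lasso.parameters import Parameters
from lasso.transit import CubeTransit

tn = CubeTransit()                         # defaults, equivalent to Parameters(**{})
tn = CubeTransit(parameters=Parameters())  # or pass an existing Parameters instance
# a plain dict of overrides is also accepted and forwarded to Parameters(**overrides)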
+ +
[docs] def add_cube(self, transit_source: str): + """Reads a .lin file and adds it to existing TransitNetwork instance. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + """ + + """ + Figure out what kind of transit source it is + """ + + parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, debug="debug", parser="lalr") + + if "NAME=" in transit_source: + WranglerLogger.debug("reading transit source as string") + self.source_list.append("input_str") + parse_tree = parser.parse(transit_source) + elif os.path.isfile(transit_source): + print("reading: {}".format(transit_source)) + with open(transit_source) as file: + WranglerLogger.debug( + "reading transit source: {}".format(transit_source) + ) + self.source_list.append(transit_source) + parse_tree = parser.parse(file.read()) + elif os.path.isdir(transit_source): + import glob + + for lin_file in glob.glob(os.path.join(transit_source, "*.LIN")): + self.add_cube(lin_file) + return + else: + msg = "{} not a valid transit line string, directory, or file" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("finished parsing cube line file") + # WranglerLogger.debug("--Parse Tree--\n {}".format(parse_tree.pretty())) + transformed_tree_data = CubeTransformer().transform(parse_tree) + # WranglerLogger.debug("--Transformed Parse Tree--\n {}".format(transformed_tree_data)) + + _line_data = transformed_tree_data["lines"] + + line_properties_dict = {k: v["line_properties"] for k, v in _line_data.items()} + line_shapes_dict = {k: v["line_shape"] for k, v in _line_data.items()} + new_lines = list(line_properties_dict.keys()) + """ + Before adding lines, check to see if any are overlapping with existing ones in the network + """ + + overlapping_lines = set(new_lines) & set(self.lines) + if overlapping_lines: + msg = "Overlapping lines found when adding from {}. \nSource files:\n{}\n{} Overlapping Lines of {} total new lines.\n-->{}".format( + transit_source, + "\n - ".join(self.source_list), + len(new_lines), + len(overlapping_lines), + overlapping_lines, + ) + print(msg) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.program_type = transformed_tree_data.get("program_type", None) + + self.lines += new_lines + self.line_properties.update(line_properties_dict) + self.shapes.update(line_shapes_dict) + + WranglerLogger.debug("Added lines to CubeTransit: \n".format(new_lines))
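A short sketch of the three input forms add_cube dispatches on; the directory name is a placeholder.

from lasso.transit import CubeTransit

tn = CubeTransit()
# 1. a string containing "NAME=" is parsed directly as line-file content
# 2. a path to an existing .lin file is read and parsed
# 3. a directory is globbed for *.LIN files, each added in turn
tn.add_cube("cube_lines_dir")   # placeholder directory of .LIN files
print(tn.lines)                 # unique line names accumulated so far
print(tn.source_list)           # the sources that have been read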
+ +
[docs] @staticmethod + def create_from_cube(transit_source: str, parameters: Optional[dict] = {}): + """ + Reads a cube .lin file and stores as TransitNetwork object. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + Returns: + A ::CubeTransit object created from the transit_source. + """ + + tn = CubeTransit(parameters) + tn.add_cube(transit_source) + + return tn
+ +
[docs] def evaluate_differences(self, base_transit): + """ + 1. Identifies what routes need to be updated, deleted, or added + 2. For routes being added or updated, identify if the time periods + have changed or if there are multiples, and make duplicate lines if so + 3. Create project card dictionaries for each change. + + Args: + base_transit (CubeTransit): an instance of this class for the base condition + + Returns: + A list of dictionaries containing project card changes + required to evaluate the differences between the base network + and this transit network instance. + """ + transit_change_list = [] + + """ + Identify what needs to be evaluated + """ + lines_to_update = [l for l in self.lines if l in base_transit.lines] + lines_to_delete = [l for l in base_transit.lines if l not in self.lines] + lines_to_add = [l for l in self.lines if l not in base_transit.lines] + + project_card_changes = [] + + """ + Evaluate Property Updates + """ + + for line in lines_to_update: + WranglerLogger.debug( + "Finding differences in time periods for: {}".format(line) + ) + + """ + Find any additional time periods that might need to add or delete. + """ + base_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + base_transit.line_properties[line] + ) + ) + + try: + assert len(base_cube_time_period_numbers) == 1 + except: + msg = "Base network line {} should only have one time period per route, but {} found".format( + line, base_cube_time_period_numbers + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + base_cube_time_period_number = base_cube_time_period_numbers[0] + + build_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + + time_periods_to_add = [ + tp + for tp in build_cube_time_period_numbers + if tp not in base_cube_time_period_numbers + ] + + for tp in time_periods_to_add: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + time_periods_to_delete = [ + tp + for tp in base_cube_time_period_numbers + if tp not in build_cube_time_period_numbers + ] + + for tp in time_periods_to_delete: + lines_to_delete.append(line) + + WranglerLogger.debug("Evaluating differences in: {}".format(line)) + updated_properties = self.evaluate_route_property_differences( + self.line_properties[line], + base_transit.line_properties[line], + base_cube_time_period_number, + ) + updated_shapes = CubeTransit.evaluate_route_shape_changes( + self.shapes[line].node, base_transit.shapes[line].node + ) + if updated_properties: + update_prop_card_dict = self.create_update_route_card_dict( + line, updated_properties + ) + project_card_changes.append(update_prop_card_dict) + + if updated_shapes: + update_shape_card_dict = self.create_update_route_card_dict( + line, updated_shapes + ) + project_card_changes.append(update_shape_card_dict) + + """ + Evaluate Deletions + """ + for line in lines_to_delete: + delete_card_dict = self.create_delete_route_card_dict( + line, base_transit.line_properties[line] + ) + project_card_changes.append(delete_card_dict) + + """ + Evaluate Additions + + First assess if need to add multiple routes if there are multiple time periods + """ + for line in lines_to_add: + time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + if len(time_period_numbers) > 1: + for tp in time_period_numbers[1:]: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + for line in lines_to_add: + 
add_card_dict = self.create_add_route_card_dict(line) + project_card_changes.append(add_card_dict) + + return project_card_changes
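A usage sketch; CUBE_BASE_DIR and CUBE_BUILD_DIR are placeholder paths, and the keys printed below come from the project-card dictionaries built by the create_*_card_dict helpers that follow.

from lasso.transit import CubeTransit

base_tn = CubeTransit.create_from_cube("CUBE_BASE_DIR")
build_tn = CubeTransit.create_from_cube("CUBE_BUILD_DIR")

for change in build_tn.evaluate_differences(base_tn):
    # each entry has "category" and "facility"; update/add entries also carry "properties"
    print(change["category"], change["facility"].get("route_id"))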
+ +
[docs] def add_additional_time_periods( + self, new_time_period_number: int, orig_line_name: str + ): + """ + Copies a route to another cube time period with appropriate + values for time-period-specific properties. + + New properties are stored under the new name in: + - ::self.shapes + - ::self.line_properties + + Args: + new_time_period_number (int): cube time period number + orig_line_name(str): name of the originating line, from which + the new line will copy its properties. + + Returns: + Line name with new time period. + """ + WranglerLogger.debug( + "adding time periods {} to line {}".format( + new_time_period_number, orig_line_name + ) + ) + + ( + route_id, + _init_time_period, + agency_id, + direction_id, + ) = CubeTransit.unpack_route_name(orig_line_name) + new_time_period_name = self.parameters.cube_time_periods[new_time_period_number] + new_tp_line_name = CubeTransit.build_route_name( + route_id=route_id, + time_period=new_time_period_name, + agency_id=agency_id, + direction_id=direction_id, + ) + + try: + assert new_tp_line_name not in self.lines + except: + msg = "Trying to add a new time period {} to line {}, but constructed name {} is already in line list.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + WrangerLogger.error(msg) + raise ValueError(msg) + + # copy to a new line and add it to list of lines to add + self.line_properties[new_tp_line_name] = copy.deepcopy( + self.line_properties[orig_line_name] + ) + self.shapes[new_tp_line_name] = copy.deepcopy(self.shapes[orig_line_name]) + self.line_properties[new_tp_line_name]["NAME"] = new_tp_line_name + + """ + Remove entries that aren't for this time period from the new line's properties list. + """ + this_time_period_properties_list = [ + p + "[" + str(new_time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + self.line_properties[new_tp_line_name].pop(k, None) + + """ + Remove entries for time period from the original line's properties list. + """ + for k in this_time_period_properties_list: + self.line_properties[orig_line_name].pop(k, None) + + """ + Add new line to list of lines to add. + """ + WranglerLogger.debug( + "Adding new time period {} for line {} as {}.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + ) + return new_tp_line_name
+ +
[docs] def create_update_route_card_dict(self, line: str, updated_properties_dict: dict): + """ + Creates a project card change formatted dictionary for updating + the line. + + Args: + line: name of line that is being updated + updated_properties_dict: dictionary of attributes to update as + 'property': <property name>, + 'set': <new property value> + + Returns: + A project card change-formatted dictionary for the attribute update. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.split("_")[-2].strip("d\"")), + "shape_id": line.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + "properties": updated_properties_dict, + } + WranglerLogger.debug( + "Updating {} route to changes:\n{}".format(line, str(update_card_dict)) + ) + + return update_card_dict
+ +
[docs] def create_delete_route_card_dict( + self, line: str, base_transit_line_properties_dict: dict + ): + """ + Creates a project card change formatted dictionary for deleting a line. + + Args: + line: name of line that is being deleted + base_transit_line_properties_dict: dictionary of cube-style + attribute values in order to find time periods and + start and end times. + + Returns: + A project card change-formatted dictionary for the route deletion. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + base_transit_line_properties_dict + ) + + delete_card_dict = { + "category": "Delete Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + } + WranglerLogger.debug( + "Deleting {} route to changes:\n{}".format(line, delete_card_dict) + ) + + return delete_card_dict
+ +
[docs] def create_add_route_card_dict(self, line: str): + """ + Creates a project card change formatted dictionary for adding + a route based on the information in self.route_properties for + the line. + + Args: + line: name of line that is being updated + + Returns: + A project card change-formatted dictionary for the route addition. + """ + start_time_str, end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + standard_properties = self.cube_properties_to_standard_properties( + self.line_properties[line] + ) + + routing_properties = { + "property": "routing", + "set": self.shapes[line]["node"].tolist(), + } + + add_card_dict = { + "category": "New Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('_')[-2]), + "start_time": start_time_str, + "end_time": end_time_str, + "agency_id": line.strip('_')[0], + }, + "properties": standard_properties + [routing_properties], + } + + WranglerLogger.debug( + "Adding {} route to changes:\n{}".format(line, add_card_dict) + ) + return add_card_dict
+ +
[docs] @staticmethod + def get_time_period_numbers_from_cube_properties(properties_list: list): + """ + Finds properties that are associated with time periods and + returns the numbers in them. + + Args: + properties_list (list): list of all properties. + + Returns: + list of strings of the time period numbers found + """ + time_periods_list = [] + for p in properties_list: + if ("[" not in p) or ("]" not in p): + continue + tp_num = p.split("[")[1][0] + if tp_num and tp_num not in time_periods_list: + time_periods_list.append(tp_num) + return time_periods_list
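For example (the property names are illustrative):

from lasso.transit import CubeTransit

props = ["NAME", "HEADWAY[1]", "FREQ[2]", "MODE"]
print(CubeTransit.get_time_period_numbers_from_cube_properties(props))  # ['1', '2']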
+ +
[docs] @staticmethod + def build_route_name( + route_id: str = "", + time_period: str = "", + agency_id: str = 0, + direction_id: str = 1, + ): + """ + Create a route name by contatenating route, time period, agency, and direction + + Args: + route_id: i.e. 452-111 + time_period: i.e. pk + direction_id: i.e. 1 + agency_id: i.e. 0 + + Returns: + constructed line_name i.e. "0_452-111_452_pk1" + """ + + return ( + str(agency_id) + + "_" + + str(route_id) + + "_" + + str(route_id.split("-")[0]) + + "_" + + str(time_period) + + str(direction_id) + )
+ +
[docs] @staticmethod + def unpack_route_name(line_name: str): + """ + Unpacks route name into direction, route, agency, and time period info + + Args: + line_name (str): i.e. "0_452-111_452_pk1" + + Returns: + route_id (str): 452-111 + time_period (str): i.e. pk + direction_id (str) : i.e. 1 + agency_id (str) : i.e. 0 + """ + + line_name = line_name.strip('"') + + agency_id, route_id, _rtid, _tp_direction = line_name.split("_") + time_period = _tp_direction[0:-1] + direction_id = _tp_direction[-1] + + return route_id, time_period, agency_id, direction_id
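The two helpers are inverses of one another, for example:

from lasso.transit import CubeTransit

name = CubeTransit.build_route_name(route_id="452-111", time_period="pk", agency_id=0, direction_id=1)
print(name)  # 0_452-111_452_pk1

route_id, time_period, agency_id, direction_id = CubeTransit.unpack_route_name(name)
print(route_id, time_period, agency_id, direction_id)  # 452-111 pk 0 1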
+ +
[docs] def calculate_start_end_times(self, line_properties_dict: dict): + """ + Calculate the start and end times of the property change + WARNING: Doesn't take care of discongruous time periods!!!! + + Args: + line_properties_dict: dictionary of cube-flavor properties for a transit line + """ + start_time_m = 24 * 60 + end_time_m = 0 * 60 + + WranglerLogger.debug( + "parameters.time_period_properties_list: {}".format( + self.parameters.time_period_properties_list + ) + ) + current_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + line_properties_dict + ) + ) + + WranglerLogger.debug( + "current_cube_time_period_numbers:{}".format( + current_cube_time_period_numbers + ) + ) + + for tp in current_cube_time_period_numbers: + time_period_name = self.parameters.cube_time_periods[tp] + WranglerLogger.debug("time_period_name:{}".format(time_period_name)) + _start_time, _end_time = self.parameters.time_period_to_time[ + time_period_name + ] + + # change from "HH:MM" to integer # of seconds + _start_time_m = (int(_start_time.split(":")[0]) * 60) + int( + _start_time.split(":")[1] + ) + _end_time_m = (int(_end_time.split(":")[0]) * 60) + int( + _end_time.split(":")[1] + ) + + # find bounding start and end times + if _start_time_m < start_time_m: + start_time_m = _start_time_m + if _end_time_m > end_time_m: + end_time_m = _end_time_m + + if start_time_m > end_time_m: + msg = "Start time ({}) is after end time ({})".format( + start_time_m, end_time_m + ) + #WranglerLogger.error(msg) + #raise ValueError(msg) + + start_time_str = "{:02d}:{:02d}".format(*divmod(start_time_m, 60)) + end_time_str = "{:02d}:{:02d}".format(*divmod(end_time_m, 60)) + return start_time_str, end_time_str
+ +
[docs] @staticmethod + def cube_properties_to_standard_properties(cube_properties_dict: dict): + """ + Converts cube-style properties to standard properties. + + This is most pertinent to time-period-specific variables such as headway, + and to variables that have standard units, like headway, which is in minutes + in cube and in seconds in standard format. + + Args: + cube_properties_dict: <cube style property name> : <property value> + + Returns: + A list of dictionaries with values for `"property": <standard + style property name>, "set" : <property value with correct units>` + + """ + standard_properties_list = [] + for k, v in cube_properties_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + change_item["set"] = v * 60 + else: + change_item["property"] = k + change_item["set"] = v + standard_properties_list.append(change_item) + + return standard_properties_list
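For example:

from lasso.transit import CubeTransit

cube_props = {"HEADWAY[1]": 10, "MODE": 5, "ONEWAY": "T"}
print(CubeTransit.cube_properties_to_standard_properties(cube_props))
# [{'property': 'headway_secs', 'set': 600}, {'property': 'MODE', 'set': 5}, {'property': 'ONEWAY', 'set': 'T'}]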
+ +
[docs] def evaluate_route_property_differences( + self, + properties_build: dict, + properties_base: dict, + time_period_number: str, + absolute: bool = True, + validate_base: bool = False, + ): + """ + Checks if any values have been updated or added for a specific + route and creates project card entries for each. + + Args: + properties_build: ::<property_name>: <property_value> + properties_base: ::<property_name>: <property_value> + time_period_number: time period to evaluate + absolute: if True, will use `set` command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway + validate_base: if True, will add the `existing` line in the project card + + Returns: + transit_change_list (list): a list of dictionary values suitable for writing to a project card + `{ + 'property': <property_name>, + 'set': <set value>, + 'change': <change from existing value>, + 'existing': <existing value to check>, + }` + + """ + + # Remove time period specific values for things that aren't part of the time period in question + this_time_period_properties_list = [ + p + "[" + str(time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + properties_build.pop(k, None) + properties_base.pop(k, None) + + difference_dict = dict( + set(properties_build.items()) ^ set(properties_base.items()) + ) + + # Iterate through properties list to build difference project card list + + properties_list = [] + for k, v in difference_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + + if absolute: + change_item["set"] = ( + v * 60 + ) # project cards are in secs, cube is in minutes + else: + change_item["change"] = ( + properties_build[k] - properties_base[k] + ) * 60 + if validate_base or not absolute: + change_item["existing"] = properties_base[k] * 60 + else: + change_item["property"] = k + change_item["set"] = v + if validate_base: + change_item["existing"] = properties_base[k] + + properties_list.append(change_item) + WranglerLogger.debug( + "Evaluated Route Changes: \n {})".format( + "\n".join(map(str, properties_list)) + ) + ) + return properties_list
+ +
[docs] @staticmethod + def evaluate_route_shape_changes( + shape_build: DataFrame, shape_base: DataFrame + ): + """ + Compares two route shapes and returns a list of changes + suitable for a project card. + + Args: + shape_build: DataFrame of the build-version of the route shape. + shape_base: DataFrame of the base-version of the route shape. + + Returns: + List of shape changes formatted as a project card-change dictionary. + + """ + + if shape_build.equals(shape_base): + return None + + shape_change_list = [] + + base_node_list = shape_base.tolist() + build_node_list = shape_build.tolist() + + sort_len = max(len(base_node_list), len(build_node_list)) + + start_pos = None + end_pos = None + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + start_pos = i - 1 + break + if base_node_list[i] != build_node_list[i]: + start_pos = i + break + else: + continue + + j = -1 + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + end_pos = j + 1 + break + if base_node_list[j] != build_node_list[j]: + end_pos = j + break + else: + j -= 1 + + if start_pos or end_pos: + existing = base_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + set = build_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + + shape_change_list.append( + {"property": "routing", "existing": existing, "set": set} + ) + + return shape_change_list
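A small sketch of the comparison on two made-up node sequences; with this implementation the differing run is returned padded with a couple of unchanged anchor nodes on each side.

import pandas as pd
from lasso.transit import CubeTransit

base = pd.Series([1, 2, 3, 4, 5, 6])
build = pd.Series([1, 2, 3, 7, 8, 5, 6])  # nodes 7 and 8 replace node 4

print(CubeTransit.evaluate_route_shape_changes(shape_build=build, shape_base=base))
# [{'property': 'routing', 'existing': [2, 3, 4, 5], 'set': [2, 3, 7, 8, 5]}]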
+ + +
[docs]class StandardTransit(object): + """Holds a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's + Cube Line files. + + .. highlight:: python + Typical usage example: + :: + cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) + cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) + + Attributes: + feed: Partridge Feed object containing read-only access to GTFS feed + parameters (Parameters): Parameters instance containing information + about time periods and variables. + """ + +
[docs] def __init__(self, ptg_feed, parameters: Union[Parameters, dict] = {}): + """ + + Args: + ptg_feed: partridge feed object + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + self.feed = ptg_feed + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] @staticmethod + def fromTransitNetwork( + transit_network_object: TransitNetwork, parameters: Union[Parameters, dict] = {} + ): + """ + Creates a StandardTransit instance from a network_wrangler TransitNetwork. + + Args: + transit_network_object: Reference to an instance of TransitNetwork. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided, will + use default parameters. + + Returns: + StandardTransit + """ + return StandardTransit(transit_network_object.feed, parameters=parameters)
+ +
[docs] @staticmethod + def read_gtfs(gtfs_feed_dir: str, parameters: Union[Parameters, dict] = {}): + """ + Reads GTFS files from a directory and returns a StandardTransit + instance. + + Args: + gtfs_feed_dir: location of the GTFS files + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit instance + """ + return StandardTransit(ptg.load_feed(gtfs_feed_dir), parameters=parameters)
+ +
[docs] def write_as_cube_lin(self, outpath: str = None): + """ + Writes the gtfs feed as a cube line file after + converting gtfs properties to MetCouncil cube properties. + #MC + Args: + outpath: File location for output cube line file. + + """ + if not outpath: + outpath = os.path.join(self.parameters.scratch_location, "outtransit.lin") + trip_cube_df = self.route_properties_gtfs_to_cube(self) + + trip_cube_df["LIN"] = trip_cube_df.apply(self.cube_format, axis=1) + + l = trip_cube_df["LIN"].tolist() + l = [";;<<PT>><<LINE>>;;"] + l + + with open(outpath, "w") as f: + f.write("\n".join(l))
+ +
[docs] @staticmethod + def route_properties_gtfs_to_cube(self): + """ + Prepare gtfs for cube lin file. + #MC + Does the following operations: + 1. Combines route, frequency, trip, and shape information + 2. Converts time of day to time periods + 3. Calculates cube route name from gtfs route name and properties + 4. Assigns a cube-appropriate mode number + 5. Assigns a cube-appropriate operator number + + Returns: + trip_df (DataFrame): DataFrame of trips with cube-appropriate values for: + - NAME + - ONEWAY + - OPERATOR + - MODE + - HEADWAY + """ + WranglerLogger.info( + "Converting GTFS Standard Properties to MetCouncil's Cube Standard" + ) + metro_operator_dict = { + "0": 3, + "1": 3, + "2": 3, + "3": 4, + "4": 2, + "5": 5, + "6": 8, + "7": 1, + "8": 1, + "9": 10, + "10": 3, + "11": 9, + "12": 3, + "13": 4, + "14": 4, + "15": 3, + } + + shape_df = self.feed.shapes.copy() + trip_df = self.feed.trips.copy() + + """ + Add information from: routes, frequencies, and routetype to trips_df + """ + trip_df = pd.merge(trip_df, self.feed.routes, how="left", on="route_id") + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + inv_cube_time_periods_map = { + v: k for k, v in self.parameters.cube_time_periods.items() + } + trip_df["tod_num"] = trip_df.tod_name.map(inv_cube_time_periods_map) + trip_df["tod_name"] = trip_df.tod_name.map( + self.parameters.cube_time_periods_name + ) + + # add shape_id to name when N most common pattern is used for routes*tod*direction + trip_df["shp_id"] = trip_df.groupby(["route_id", "tod_name", "direction_id"]).cumcount() + trip_df["shp_id"] = trip_df["shp_id"].astype(str) + trip_df["shp_id"] = "shp" + trip_df["shp_id"] + + trip_df["route_short_name"] = trip_df["route_short_name"].str.replace("-", "_").str.replace(" ", ".").str.replace(",", "_").str.slice(stop = 50) + + trip_df["route_long_name"] = trip_df["route_long_name"].str.replace(",", "_").str.slice(stop = 50) + + trip_df["NAME"] = trip_df.apply( + lambda x: x.agency_id + + "_" + + x.route_id + + "_" + + x.tod_name + + "_" + + "d" + + str(x.direction_id) + + "_s" + + x.shape_id, + axis=1, + ) + + # CUBE max string length + trip_df["NAME"] = trip_df["NAME"].str.slice(stop = 28) + + trip_df["LONGNAME"] = trip_df["route_long_name"] + # CUBE max string length + trip_df["LONGNAME"] = trip_df["LONGNAME"].str.slice(stop = 30) + + trip_df["HEADWAY"] = (trip_df["headway_secs"] / 60).astype(int) + trip_df["MODE"] = trip_df.apply(self.calculate_cube_mode, axis=1) + trip_df["ONEWAY"] = "T" + trip_df["OPERATOR"] = trip_df["agency_id"].map(metro_operator_dict) + trip_df["SHORTNAME"] = trip_df["route_short_name"].str.slice(stop = 30) + + return trip_df
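The resulting line NAME follows an agency_route_tod_dDIRECTION_sSHAPE pattern and is truncated to Cube's 28-character limit; a sketch with made-up field values:

agency_id, route_id, tod_name, direction_id, shape_id = "SF", "14R", "AM", 0, "shp1"
name = "{}_{}_{}_d{}_s{}".format(agency_id, route_id, tod_name, direction_id, shape_id)[:28]
print(name)  # SF_14R_AM_d0_sshp1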
+ +
[docs] def calculate_cube_mode(self, row): + """ + Assigns a cube mode number by following logic. + #MC + For rail, uses GTFS route_type variable: + https://developers.google.com/transit/gtfs/reference + + :: + # route_type : cube_mode + route_type_to_cube_mode = {0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9} # Rail + + For buses, uses route id numbers and route name to find + express and suburban buses as follows: + + :: + if not cube_mode: + if 'express' in row['LONGNAME'].lower(): + cube_mode = 7 # Express + elif int(row['route_id'].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + Args: + row: A DataFrame row with route_type, route_long_name, and route_id + + Returns: + cube mode number + """ + # route_type : cube_mode + route_type_to_cube_mode = { + 0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9, + } # Rail + + cube_mode = route_type_to_cube_mode[row["route_type"]] + + if not cube_mode: + if "express" in row["route_long_name"].lower(): + cube_mode = 7 # Express + elif int(row["route_id"].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + return cube_mode
+ +
[docs] def time_to_cube_time_period( + self, start_time_secs: int, as_str: bool = True, verbose: bool = False + ): + """ + Converts seconds from midnight to the cube time period. + + Args: + start_time_secs: start time for transit trip in seconds + from midnight + as_str: if True, returns the time period as a string, + otherwise returns a numeric time period + + Returns: + this_tp_num: if as_str is False, returns the numeric + time period + this_tp: if as_str is True, returns the Cube time period + name abbreviation + """ + from .util import hhmmss_to_datetime, secs_to_datetime + + # set initial time as the time that spans midnight + + start_time_dt = secs_to_datetime(start_time_secs) + + # set initial time as the time that spans midnight + this_tp = "NA" + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + _dt_end_time = hhmmss_to_datetime(_end_time) + if _dt_start_time > _dt_end_time: + this_tp = tp_name + break + + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + if start_time_dt >= _dt_start_time: + this_time = _dt_start_time + this_tp = tp_name + + if verbose: + WranglerLogger.debug( + "Finding Cube Time Period from Start Time: \ + \n - start_time_sec: {} \ + \n - start_time_dt: {} \ + \n - this_tp: {}".format( + start_time_secs, start_time_dt, this_tp + ) + ) + + if as_str: + return this_tp + + name_to_num = {v: k for k, v in self.parameters.cube_time_periods.items()} + this_tp_num = name_to_num.get(this_tp) + + if not this_tp_num: + msg = ( + "Cannot find time period number in {} for time period name: {}".format( + name_to_num, this_tp + ) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + return this_tp_num
+ +
[docs] def shape_gtfs_to_dict_list(self, trip_id: str, shape_id: str, add_nntime: bool): + """ + This is a copy of StandardTransit.shape_gtfs_to_cube() because we need the same logic of + stepping through the routed nodes and corresponding them with shape nodes. + + TODO: eliminate this necessity by tagging the stop nodes in the shapes to begin with when + the transit routing on the roadway network is first performed. + + As such, I'm copying the code from StandardTransit.shape_gtfs_to_cube() with minimal modifications. + + Args: + trip_id of the trip in question + shape_id of the trip in question + Returns: + list of dict records with columns: + trip_id + shape_id + shape_pt_sequence + shape_mode_node_id + is_stop + access + stop_sequence + """ + # get the stop times for this route + # https://developers.google.com/transit/gtfs/reference#stop_timestxt + trip_stop_times_df = self.feed.stop_times.loc[ self.feed.stop_times.trip_id == trip_id, + ['trip_id','arrival_time','departure_time','stop_id','stop_sequence','pickup_type','drop_off_type']].copy() + trip_stop_times_df.sort_values(by='stop_sequence', inplace=True) + trip_stop_times_df.reset_index(drop=True, inplace=True) + # print("trip_stop_times_df:\n{}".format(trip_stop_times_df)) + # print("trip_stop_times_df.dtypes:\n{}".format(trip_stop_times_df.dtypes)) + # trip_stop_times_df: + # trip_id arrival_time departure_time stop_id stop_sequence pickup_type drop_off_type + # 0 10007 0 0 7781 1 0 NaN + # 1 10007 120 120 7845 2 0 NaN + # 2 10007 300 300 7790 3 0 NaN + # 3 10007 360 360 7854 4 0 NaN + # 4 10007 390 390 7951 5 0 NaN + # 5 10007 720 720 7950 6 0 NaN + # 6 10007 810 810 7850 7 0 NaN + # 7 10007 855 855 7945 8 0 NaN + # 8 10007 900 900 7803 9 0 NaN + # 9 10007 930 930 7941 10 0 NaN + # trip_stop_times_df.dtypes: + # trip_id object + # arrival_time object + # departure_time object + # stop_id object + # stop_sequence int64 + # pickup_type object + # drop_off_type object + + # get the shapes for this route + # https://developers.google.com/transit/gtfs/reference#shapestxt + trip_node_df = self.feed.shapes.loc[self.feed.shapes.shape_id == shape_id].copy() + trip_node_df.sort_values(by="shape_pt_sequence", inplace = True) + trip_node_df.reset_index(drop=True, inplace=True) + # print("trip_node_df.head(20):\n{}".format(trip_node_df.head(20))) + # print("trip_node_df.dtypes:\n{}".format(trip_node_df.dtypes)) + # trip_node_df: + # shape_id shape_pt_sequence shape_osm_node_id shape_shst_node_id shape_model_node_id shape_pt_lat shape_pt_lon + # 0 696 1 1429334016 35cb440c505534e8aedbd3a286b70eab 2139625 NaN NaN + # 1 696 2 444242480 39e263722d5849b3c732b48734671400 2164862 NaN NaN + # 2 696 3 5686705779 4c41c608c35f457079fd673bce5556e5 2169898 NaN NaN + # 3 696 4 3695761874 d0f5b2173189bbb1b5dbaa78a004e8c4 2021876 NaN NaN + # 4 696 5 1433982749 60726971f0fb359a57e9d8df30bf384b 2002078 NaN NaN + # 5 696 6 1433982740 634c301424647d5883191edf522180e3 2156807 NaN NaN + # 6 696 7 4915736746 f03c3d7f1aa0358a91c165f53dac1e20 2145185 NaN NaN + # 7 696 8 65604864 68b8df24f1572d267ecf834107741393 2120788 NaN NaN + # 8 696 9 65604866 e412a013ad45af6649fa1b396f74c127 2066513 NaN NaN + # 9 696 10 956664242 657e1602aa8585383ed058f28f7811ed 2006476 NaN NaN + # 10 696 11 291642561 726b03cced023a6459d7333885927208 2133933 NaN NaN + # 11 696 12 291642583 709a0c00811f213f7476349a2c002003 2159991 NaN NaN + # 12 696 13 291642745 c5aaab62e0c78c34d93ee57795f06953 2165343 NaN NaN + # 13 696 14 5718664845 c7f1f4aa88887071a0d28154fc84604b 2007965 NaN NaN + # 14 696 
15 291642692 0ef007a79b391e8ba98daf4985f26f9b 2160569 NaN NaN + # 15 696 16 5718664843 2ce63288e77747abc3a4124f0e28efcf 2047955 NaN NaN + # 16 696 17 3485537279 ec0c8eb524f41072a9fd87ecfd45e15f 2169094 NaN NaN + # 17 696 18 5718664419 57ca23828db4adea39355a92fb0fc3ff 2082102 NaN NaN + # 18 696 19 5718664417 4aba41268ada1058ee58e99a84e28d37 2019974 NaN NaN + # 19 696 20 65545418 d4f815a2f6da6c95d2f032a3cd61020c 2025374 NaN NaN # trip_node_df.dtypes: + # shape_id object + # shape_pt_sequence int64 + # shape_osm_node_id object + # shape_shst_node_id object + # shape_model_node_id object + # shape_pt_lat object + # shape_pt_lon object + + # we only need: shape_id, shape_pt_sequence, shape_model_node_id + trip_node_df = trip_node_df[['shape_id','shape_pt_sequence','shape_model_node_id']] + + if 'trip_id' in self.feed.stops.columns: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on=['trip_id', "stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + # print("trip_stop_times_df:\n{}".format(trip_stop_times_df)) + # print("trip_stop_times_df.dtypes:\n{}".format(trip_stop_times_df.dtypes)) + # trip_stop_times_df.dtypes: + # trip_id object + # arrival_time object + # departure_time object + # stop_id object + # stop_sequence int64 + # pickup_type object + # drop_off_type object + # stop_name object + # stop_lat float64 + # stop_lon float64 + # zone_id object + # agency_raw_name object + # stop_code object + # location_type float64 + # parent_station object + # stop_desc object + # stop_url object + # stop_timezone object + # wheelchair_boarding float64 + # platform_code object + # position object + # direction object + # * used by routes object + # osm_node_id object + # shst_node_id object + # model_node_id object + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + # this is the same as shape_gtfs_to_cube but we'll build up a list of dicts with shape/stop information + shape_stop_dict_list = [] + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + # add this stop to shape_stop_df + node_dict = trip_node_df.iloc[nodeIdx].to_dict() + node_dict['trip_id' ] = trip_id + node_dict['is_stop' ] = True + node_dict['access' ] = access_v + node_dict['stop_sequence'] = stop_seq + shape_stop_dict_list.append(node_dict) + + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + # add this stop to shape_stop_df + node_dict = trip_node_df.iloc[nodeIdx].to_dict() + node_dict['trip_id'] = trip_id + node_dict['is_stop'] = False + shape_stop_dict_list.append(node_dict) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + # print("node_list_str: {}".format(node_list_str)) + return shape_stop_dict_list
+ +
[docs] def shape_gtfs_to_cube(self, row, add_nntime = False): + """ + Creates a list of nodes that for the route in appropriate + cube format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a string representation of the node list + for a route in cube format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + if 'trip_id' in self.feed.stops.columns: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on=['trip_id', "stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + return node_list_str
+ + +
[docs] def cube_format(self, row): + """ + Creates a string representing the route in cube line file notation. + #MC + Args: + row: row of a DataFrame representing a cube-formatted trip, with the attributes + trip_id, shape_id, NAME, LONGNAME, tod_num, HEADWAY, MODE, ONEWAY, OPERATOR, SHORTNAME + + Returns: + string representation of route in cube line file notation + """ + + s = '\nLINE NAME="{}",'.format(row.NAME) + s += '\n LONGNAME="{}",'.format(row.LONGNAME) + s += "\n HEADWAY[{}]={},".format(row.tod_num, row.HEADWAY) + s += "\n MODE={},".format(row.MODE) + s += "\n ONEWAY={},".format(row.ONEWAY) + s += "\n OPERATOR={},".format(row.OPERATOR) + s += '\n SHORTNAME="{}",'.format(row.SHORTNAME) + s += "\n NODES={}".format(self.shape_gtfs_to_cube(row)) + + return s
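Assembled with made-up values, the block produced by cube_format looks like the following (the node list normally comes from shape_gtfs_to_cube above):

line_block = (
    '\nLINE NAME="SF_14R_AM_d0_sshp1",'   # illustrative values throughout
    '\n LONGNAME="Mission Rapid",'
    "\n HEADWAY[2]=10,"
    "\n MODE=5,"
    "\n ONEWAY=T,"
    "\n OPERATOR=21,"
    '\n SHORTNAME="14R",'
    "\n NODES=\n 1001,\n -1002,\n 1003"
)
print(line_block)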
+ +
[docs] def shape_gtfs_to_emme(self, trip_row): + """ + Creates transit segment for the trips in appropriate + emme format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a dataframe representation of the transit segment + for a trip in emme format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == trip_row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == trip_row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # node list + stop_seq = 0 + nntimes = [] + allow_alightings=[] + allow_boardings=[] + stop_names=[] + + if trip_row.TM2_line_haul_name in ["Light rail", "Heavy rail", "Commuter rail", "Ferry service"]: + add_nntime = True + else: + add_nntime = False + + for nodeIdx in range(len(trip_node_list)): + + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + nntimes.append(nntime_v) + else: + nntimes.append(0) + + pickup_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"pickup_type"].iloc[0] + if pickup_type in [1, "1"]: + allow_alightings.append(0) + else: + allow_alightings.append(1) + + drop_off_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"drop_off_type"].iloc[0] + if drop_off_type in [1, "1"]: + allow_boardings.append(0) + else: + allow_boardings.append(1) + + stop_name = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"stop_name"].iloc[0] + stop_names.append(stop_name) + + else: + nntimes.append(0) + allow_alightings.append(0) + allow_boardings.append(0) + stop_names.append("") + + trip_node_df['time_minutes'] = nntimes + trip_node_df['allow_alightings'] = allow_alightings + trip_node_df['allow_boardings'] = allow_boardings + trip_node_df['stop_name'] = stop_names + trip_node_df['line_id'] = trip_row['line_id'] + trip_node_df['node_id'] = trip_node_df['shape_model_node_id'].astype(int) + trip_node_df['stop_order'] = trip_node_df['shape_pt_sequence'] + + return trip_node_df
+ +
[docs] def evaluate_differences(self, transit_changes): + """ + Compare changes from the transit_changes dataframe with the standard transit network + returns the project card changes in dictionary format + """ + + # simple properties change + trip_df = self.feed.trips.copy() + + mode_crosswalk = pd.read_csv(self.parameters.mode_crosswalk_file) + mode_crosswalk.drop_duplicates(subset = ["agency_raw_name", "route_type", "is_express_bus"], inplace = True) + + trip_df = pd.merge(trip_df, self.feed.routes.drop("agency_raw_name", axis = 1), how="left", on="route_id") + + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod"] = trip_df.start_time.apply(self.time_to_cube_time_period, as_str = False) + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + + trip_df["headway_minutes"] = (trip_df["headway_secs"] / 60).astype(int) + + trip_df = pd.merge(trip_df, self.feed.agency[["agency_name", "agency_raw_name", "agency_id"]], how = "left", on = ["agency_raw_name", "agency_id"]) + + # identify express bus + # moved this here from top since this StandardTransit shouldn't depend on mtc... + from .mtc import _is_express_bus + trip_df["is_express_bus"] = trip_df.apply(lambda x: _is_express_bus(x), axis = 1) + trip_df.drop("agency_name", axis = 1 , inplace = True) + + trip_df = pd.merge( + trip_df, + mode_crosswalk.drop("agency_id", axis = 1), + how = "left", + on = ["agency_raw_name", "route_type", "is_express_bus"] + ) + + trip_df["line_id"] = trip_df.apply( + lambda x: str(x.TM2_operator) + + "_" + + str(x.route_id) + + "_" + + x.tod_name + + "_" + + "d" + + str(int(x.direction_id)) + + "_s" + + x.shape_id, + axis=1, + ) + + trip_df["line_id"] = trip_df["line_id"].str.slice(stop = 28) + + project_card_changes = [] + + # lines updated + transit_changes['line_id'] = transit_changes.apply( + lambda x: '-'.join(x['element_id'].split('-')[:-3]) if + x['object'] == 'TRANSIT_STOP' else + x['element_id'], + axis = 1 + ) + + lines_updated_df = transit_changes[ + (transit_changes['operation'] == 'C') & + (transit_changes['line_id'].isin(trip_df['line_id'].tolist())) + ].copy() + + ######################### + # simple property changes + ######################### + + property_changes_df = lines_updated_df[ + lines_updated_df.object == 'TRANSIT_LINE' + ].copy() + + property_attribute_list = ['headway_secs'] + + for index, row in property_changes_df.iterrows(): + line_id = row['line_id'] + properties_list = [] + change_item = {} + for c in property_attribute_list: + existing_value = int(trip_df[ + trip_df['line_id'] == line_id + ][c].iloc[0]) + + change_item["existing"] = existing_value + + if c == 'headway_secs': + change_item["set"] = row['headway'] * 60 + else: + change_item["set"] = row[c] + + change_item["property"] = c + + properties_list.append(change_item) + + property_changes_df.loc[index, 'properties'] = properties_list + + ############### + # shape changes + ############### + + shape_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_SHAPE']) + ].copy() + + for index, row in shape_changes_df.iterrows(): + line_id = row.line_id + + # get base shape + trip_row = trip_df[trip_df.line_id == line_id].copy().squeeze() + + base_shape = self.shape_gtfs_to_emme( + trip_row=trip_row + ) + base_shape['shape_model_node_id'] = base_shape['shape_model_node_id'].astype(int) + + # get build shape + build_shape = row.new_itinerary + + updated_shapes = CubeTransit.evaluate_route_shape_changes( + shape_base = 
base_shape.shape_model_node_id, + shape_build = pd.Series(row.new_itinerary) + ) + updated_shapes[0]['property'] = 'shapes' + shape_changes_df.loc[index, 'properties'] = updated_shapes + + ############## + # stop changes + ############## + stop_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_STOP']) + ].copy() + + stop_attribute_list = ['allow_alightings', 'allow_boardings'] + + stop_changes_df = stop_changes_df.groupby( + ['line_id','i_node'] + )[stop_attribute_list].last().reset_index() + + stop_attribute_changes_df = pd.DataFrame() + + for attribute in stop_attribute_list: + + attribute_df = stop_changes_df.groupby( + ['line_id', attribute] + )['i_node'].apply(list).reset_index() + attribute_df['properties'] = attribute_df.apply( + lambda x: { + 'property' : attribute if x[attribute] == True else 'no_'+attribute.split('_')[-1], + 'set': x['i_node']}, + axis = 1 + ) + + stop_attribute_changes_df = pd.concat( + [stop_attribute_changes_df, + attribute_df[['line_id', 'properties']]], + sort = False, + ignore_index = True + ) + + ############## + # combine all transit changes + ############## + transit_changes_df = pd.concat( + [ + property_changes_df, + shape_changes_df, + stop_attribute_changes_df + ], + sort = False, + ignore_index = True + ) + + # groupby line_id + transit_changes_df = transit_changes_df.groupby( + ['line_id'] + )['properties'].apply(list).reset_index() + + # create change items by line_id + for index, row in transit_changes_df.iterrows(): + line_id = row['line_id'] + base_start_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[0] + + base_end_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[1] + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line_id.split("_")[1], + "direction_id": int(line_id.split("_")[-2].strip("d\"")), + "shape_id": line_id.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str + }, + "properties": row['properties'], + } + + project_card_changes.append(update_card_dict) + + return project_card_changes
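+
+# The line_id strings assembled above follow a fixed "operator_route_tod_dX_sY" pattern,
+# truncated to 28 characters. A small worked example with hypothetical values shows how
+# the pieces are put together and later recovered for the project card "facility" block:
+#
+#     line_id = "_".join(["30", "522", "AM", "d1", "s170052"])[:28]  # '30_522_AM_d1_s170052'
+#     route_id = line_id.split("_")[1]                       # '522'
+#     time_period = line_id.split("_")[2]                    # 'AM'
+#     direction_id = int(line_id.split("_")[-2].strip("d"))  # 1
+#     shape_id = line_id.split("_")[-1].strip("s")           # '170052'
+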
+ +class CubeTransformer(Transformer): + """A lark-parsing Transformer which transforms the parse-tree to + a dictionary. + + .. highlight:: python + Typical usage example: + :: + transformed_tree_data = CubeTransformer().transform(parse_tree) + + Attributes: + line_order (int): a dynamic counter to hold the order of the nodes within + a route shape + lines_list (list): a list of the line names + """ + + def __init__(self): + self.line_order = 0 + self.lines_list = [] + + def lines(self, line): + # WranglerLogger.debug("lines: \n {}".format(line)) + + # This MUST be a tuple because it returns to start in the tree + lines = {k: v for k, v in line} + return ("lines", lines) + + @v_args(inline=True) + def program_type_line(self, PROGRAM_TYPE, whitespace=None): + # WranglerLogger.debug("program_type_line:{}".format(PROGRAM_TYPE)) + self.program_type = PROGRAM_TYPE.value + + # This MUST be a tuple because it returns to start in the tree + return ("program_type", PROGRAM_TYPE.value) + + @v_args(inline=True) + def line(self, lin_attributes, nodes): + # WranglerLogger.debug("line...attributes:\n {}".format(lin_attributes)) + # WranglerLogger.debug("line...nodes:\n {}".format(nodes)) + lin_name = lin_attributes["NAME"] + + self.line_order = 0 + # WranglerLogger.debug("parsing: {}".format(lin_name)) + + return (lin_name, {"line_properties": lin_attributes, "line_shape": nodes}) + + @v_args(inline=True) + def lin_attributes(self, *lin_attr): + lin_attr = {k: v for (k, v) in lin_attr} + # WranglerLogger.debug("lin_attributes: {}".format(lin_attr)) + return lin_attr + + @v_args(inline=True) + def lin_attr(self, lin_attr_name, attr_value, SEMICOLON_COMMENT=None): + # WranglerLogger.debug("lin_attr {}: {}".format(lin_attr_name, attr_value)) + return lin_attr_name, attr_value + + def lin_attr_name(self, args): + attr_name = args[0].value.upper() + # WranglerLogger.debug(".......args {}".format(args)) + if attr_name in ["FREQ", "HEADWAY"]: + attr_name = attr_name + "[" + str(args[2]) + "]" + return attr_name + + def attr_value(self, attr_value): + try: + return int(attr_value[0].value) + except: + return attr_value[0].value + + def nodes(self, lin_node): + lin_node = DataFrame(lin_node) + # WranglerLogger.debug("nodes:\n {}".format(lin_node)) + + return lin_node + + @v_args(inline=True) + def lin_node(self, NODE_NUM, SEMICOLON_COMMENT=None, *lin_nodeattr): + self.line_order += 1 + n = int(NODE_NUM.value) + return {"node_id": abs(n), "node": n, "stop": n > 0, "order": self.line_order} + + start = dict + + +TRANSIT_LINE_FILE_GRAMMAR = r""" + +start : program_type_line? lines +WHITESPACE : /[ \t\r\n]/+ +STRING : /("(?!"").*?(?<!\\)(\\\\)*?"|'(?!'').*?(?<!\\)(\\\\)*?')/i +SEMICOLON_COMMENT : /;[^\n]*/ +BOOLEAN : "T"i | "F"i +program_type_line : ";;<<" PROGRAM_TYPE ">><<LINE>>;;" WHITESPACE? +PROGRAM_TYPE : "PT" | "TRNBUILD" + +lines : line* +line : "LINE" lin_attributes nodes + +lin_attributes : lin_attr+ +lin_attr : lin_attr_name "=" attr_value "," SEMICOLON_COMMENT* +TIME_PERIOD : "1".."5" +!lin_attr_name : "allstops"i + | "color"i + | ("freq"i "[" TIME_PERIOD "]") + | ("headway"i "[" TIME_PERIOD "]") + | "mode"i + | "name"i + | "oneway"i + | "owner"i + | "runtime"i + | "timefac"i + | "xyspeed"i + | "longname"i + | "shortname"i + | ("usera1"i) + | ("usera2"i) + | "circular"i + | "vehicletype"i + | "operator"i + | "faresystem"i + +attr_value : BOOLEAN | STRING | SIGNED_INT | FLOAT + +nodes : lin_node+ +lin_node : ("N" | "NODES")? "="? NODE_NUM ","? SEMICOLON_COMMENT? 
lin_nodeattr* +NODE_NUM : SIGNED_INT +lin_nodeattr : lin_nodeattr_name "=" attr_value ","? SEMICOLON_COMMENT* +!lin_nodeattr_name : "access_c"i + | "access"i + | "delay"i + | "xyspeed"i + | "timefac"i + | "nntime"i + | "time"i + +operator : SEMICOLON_COMMENT* "OPERATOR" opmode_attr* SEMICOLON_COMMENT* +mode : SEMICOLON_COMMENT* "MODE" opmode_attr* SEMICOLON_COMMENT* +opmode_attr : ( (opmode_attr_name "=" attr_value) ","? ) +opmode_attr_name : "number" | "name" | "longname" + +%import common.SIGNED_INT +%import common.FLOAT +%import common.WS +%ignore WS + +""" +
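+
+# Illustrative sketch (not part of the lasso API): how the grammar and CubeTransformer
+# above fit together. The helper name, the sample line text, and the parser="lalr"
+# choice are assumptions for this example.
+def _example_parse_cube_line():
+    from lark import Lark
+
+    sample_lin = (
+        ';;<<PT>><<LINE>>;;\n'
+        'LINE NAME="BLUE_AM", HEADWAY[1]=10, MODE=5, ONEWAY=T, N=1, -2, 3\n'
+    )
+    parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, parser="lalr")
+    parse_tree = parser.parse(sample_lin)
+    transformed_tree_data = CubeTransformer().transform(parse_tree)
+    # transformed_tree_data["program_type"] -> "PT"
+    # transformed_tree_data["lines"] -> dict keyed by line NAME (as written in the file),
+    #     each entry holding "line_properties" (dict) and "line_shape" (DataFrame)
+    return transformed_tree_data
+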
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_modules/lasso/util/index.html b/branch/bart/_modules/lasso/util/index.html new file mode 100644 index 0000000..ab15e7f --- /dev/null +++ b/branch/bart/_modules/lasso/util/index.html @@ -0,0 +1,251 @@ + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.util

+from functools import partial
+import pyproj
+from shapely.ops import transform
+from shapely.geometry import Point, Polygon
+import re
+from unidecode import unidecode
+
+# WranglerLogger is used by column_name_to_parts below
+from network_wrangler import WranglerLogger
+
+
[docs]def get_shared_streets_intersection_hash(lat, long, osm_node_id=None): + """ + Calculated per: + https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565 + Expected in/out + -93.0965985, 44.952112199999995 osm_node_id = 954734870 + 69f13f881649cb21ee3b359730790bb9 + + """ + import hashlib + + message = "Intersection {0:.5f} {1:.5f}".format(long, lat) + if osm_node_id: + message += " {}".format(osm_node_id) + unhashed = message.encode("utf-8") + hash = hashlib.md5(unhashed).hexdigest() + return hash
+ + +
[docs]def hhmmss_to_datetime(hhmmss_str: str): + """ + Creates a datetime time object from a string of hh:mm:ss + + Args: + hhmmss_str: string of hh:mm:ss + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = datetime.time(*[int(i) for i in hhmmss_str.split(":")]) + + return dt
+ + +
[docs]def secs_to_datetime(secs: int): + """ + Creates a datetime time object from a seconds from midnight + + Args: + secs: seconds from midnight + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = (datetime.datetime.min + datetime.timedelta(seconds=secs)).time() + + return dt
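+
+# A quick, self-contained check of the two time helpers above (illustrative values):
+#
+#     >>> hhmmss_to_datetime("06:30:00")
+#     datetime.time(6, 30)
+#     >>> secs_to_datetime(6 * 3600 + 30 * 60)
+#     datetime.time(6, 30)
+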
+ + +
[docs]def geodesic_point_buffer(lat, lon, meters): + """ + creates circular buffer polygon for node + + Args: + lat: node lat + lon: node lon + meters: buffer distance, radius of circle + Returns: + Polygon + """ + proj_wgs84 = pyproj.Proj('+proj=longlat +datum=WGS84') + # Azimuthal equidistant projection + aeqd_proj = '+proj=aeqd +lat_0={lat} +lon_0={lon} +x_0=0 +y_0=0' + project = partial( + pyproj.transform, + pyproj.Proj(aeqd_proj.format(lat=lat, lon=lon)), + proj_wgs84) + buf = Point(0, 0).buffer(meters) # distance in meters + return Polygon(transform(project, buf).exterior.coords[:])
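+
+# Illustrative use of geodesic_point_buffer: an approximate 100 m circle around an
+# arbitrary example point (the coordinates below are not taken from this project):
+#
+#     >>> circle = geodesic_point_buffer(lat=44.9778, lon=-93.2650, meters=100)
+#     >>> circle.is_valid
+#     True
+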
+ +
[docs]def create_locationreference(node, link): + node['X'] = node['geometry'].apply(lambda p: p.x) + node['Y'] = node['geometry'].apply(lambda p: p.y) + node['point'] = [list(xy) for xy in zip(node.X, node.Y)] + node_dict = dict(zip(node.model_node_id, node.point)) + + link['A_point'] = link['A'].map(node_dict) + link['B_point'] = link['B'].map(node_dict) + link['locationReferences'] = link.apply(lambda x: [{'sequence':1, + 'point': x['A_point'], + 'distanceToNextRef':x['length'], + 'bearing' : 0, + 'intersectionId':x['fromIntersectionId']}, + {'sequence':2, + 'point': x['B_point'], + 'intersectionId':x['toIntersectionId']}], + axis = 1)
+ +
[docs]def column_name_to_parts(c, parameters=None): + + if not parameters: + from .parameters import Parameters + + parameters = Parameters() + + if c[0:2] == "ML": + managed = True + else: + managed = False + + time_period = None + category = None + + if c.split("_")[0] not in parameters.properties_to_split.keys(): + return c, None, None, managed + + tps = parameters.time_period_to_time.keys() + cats = parameters.categories.keys() + + if c.split("_")[-1] in tps: + time_period = c.split("_")[-1] + base_name = c.split(time_period)[-2][:-1] + if c.split("_")[-2] in cats: + category = c.split("_")[-2] + base_name = c.split(category)[-2][:-1] + elif c.split("_")[-1] in cats: + category = c.split("_")[-1] + base_name = c.split(category)[-2][:-1] + else: + msg = "Can't split property correctly: {}".format(c) + WranglerLogger.error(msg) + + return base_name, time_period, category, managed
+ +
[docs]def shorten_name(name): + if type(name) == str: + name_list = name.split(',') + else: + name_list = name + name_list = [re.sub(r'\W+', ' ', c).replace('nan', '').strip(' ') for c in name_list] + + name_list = list(set(name_list)) + #name_list.remove('') + + name_new = ' '.join(name_list).strip(' ') + + # convert non english character to english + name_new = unidecode(name_new) + + return name_new
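+
+# shorten_name cleans and de-duplicates a comma-separated string (or list) of names,
+# stripping 'nan' placeholders, e.g. (illustrative):
+#
+#     >>> shorten_name("Main St,Main St,nan")
+#     'Main St'
+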
+
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_modules/shapely/geometry/point/index.html b/branch/bart/_modules/shapely/geometry/point/index.html new file mode 100644 index 0000000..33c9230 --- /dev/null +++ b/branch/bart/_modules/shapely/geometry/point/index.html @@ -0,0 +1,392 @@ + + + + + + shapely.geometry.point — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.point

+"""Points and related utilities
+"""
+
+from ctypes import c_double
+import warnings
+
+from shapely.coords import CoordinateSequence
+from shapely.errors import DimensionError, ShapelyDeprecationWarning
+from shapely.geos import lgeos
+from shapely.geometry.base import BaseGeometry, geos_geom_from_py
+from shapely.geometry.proxy import CachingGeometryProxy
+
+__all__ = ['Point', 'asPoint']
+
+
+
[docs]class Point(BaseGeometry): + """ + A zero dimensional feature + + A point has zero length and zero area. + + Attributes + ---------- + x, y, z : float + Coordinate values + + Example + ------- + >>> p = Point(1.0, -1.0) + >>> print(p) + POINT (1 -1) + >>> p.y + -1.0 + >>> p.x + 1.0 + """ + + def __init__(self, *args): + """ + Parameters + ---------- + There are 2 cases: + + 1) 1 parameter: this must satisfy the numpy array protocol. + 2) 2 or more parameters: x, y, z : float + Easting, northing, and elevation. + """ + BaseGeometry.__init__(self) + if len(args) > 0: + if len(args) == 1: + geom, n = geos_point_from_py(args[0]) + elif len(args) > 3: + raise TypeError( + "Point() takes at most 3 arguments ({} given)".format(len(args)) + ) + else: + geom, n = geos_point_from_py(tuple(args)) + self._set_geom(geom) + self._ndim = n + + # Coordinate getters and setters + + @property + def x(self): + """Return x coordinate.""" + return self.coords[0][0] + + @property + def y(self): + """Return y coordinate.""" + return self.coords[0][1] + + @property + def z(self): + """Return z coordinate.""" + if self._ndim != 3: + raise DimensionError("This point has no z coordinate.") + return self.coords[0][2] + + @property + def __geo_interface__(self): + return { + 'type': 'Point', + 'coordinates': self.coords[0] + } + +
[docs] def svg(self, scale_factor=1., fill_color=None, opacity=None): + """Returns SVG circle element for the Point geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG circle diameter. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return '<g />' + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + return ( + '<circle cx="{0.x}" cy="{0.y}" r="{1}" ' + 'stroke="#555555" stroke-width="{2}" fill="{3}" opacity="{4}" />' + ).format(self, 3. * scale_factor, 1. * scale_factor, fill_color, opacity)
+ + @property + def _ctypes(self): + if not self._ctypes_data: + array_type = c_double * self._ndim + array = array_type() + xy = self.coords[0] + array[0] = xy[0] + array[1] = xy[1] + if self._ndim == 3: + array[2] = xy[2] + self._ctypes_data = array + return self._ctypes_data + + def _array_interface(self): + """Provide the Numpy array protocol.""" + if self.is_empty: + ai = {'version': 3, 'typestr': '<f8', 'shape': (0,), 'data': (c_double * 0)()} + else: + ai = self._array_interface_base + ai.update({'shape': (self._ndim,)}) + return ai + +
[docs] def array_interface(self): + """Provide the Numpy array protocol.""" + warnings.warn( + "The 'array_interface' method is deprecated and will be removed " + "in Shapely 2.0.", + ShapelyDeprecationWarning, stacklevel=2) + return self._array_interface()
+ + @property + def __array_interface__(self): + warnings.warn( + "The array interface is deprecated and will no longer work in " + "Shapely 2.0. Convert the '.coords' to a numpy array instead.", + ShapelyDeprecationWarning, stacklevel=3) + return self._array_interface() + + @property + def bounds(self): + """Returns minimum bounding region (minx, miny, maxx, maxy)""" + try: + xy = self.coords[0] + except IndexError: + return () + return (xy[0], xy[1], xy[0], xy[1]) + + # Coordinate access + + def _get_coords(self): + """Access to geometry's coordinates (CoordinateSequence)""" + return CoordinateSequence(self) + + def _set_coords(self, *args): + warnings.warn( + "Setting the 'coords' to mutate a Geometry in place is deprecated," + " and will not be possible any more in Shapely 2.0", + ShapelyDeprecationWarning, stacklevel=2) + self._empty() + if len(args) == 1: + geom, n = geos_point_from_py(args[0]) + elif len(args) > 3: + raise TypeError("Point() takes at most 3 arguments ({} given)".format(len(args))) + else: + geom, n = geos_point_from_py(tuple(args)) + self._set_geom(geom) + self._ndim = n + + coords = property(_get_coords, _set_coords) + + @property + def xy(self): + """Separate arrays of X and Y coordinate values + + Example: + >>> x, y = Point(0, 0).xy + >>> list(x) + [0.0] + >>> list(y) + [0.0] + """ + return self.coords.xy
+ + +class PointAdapter(CachingGeometryProxy, Point): + + _other_owned = False + + def __init__(self, context): + warnings.warn( + "The proxy geometries (through the 'asShape()', 'asPoint()' or " + "'PointAdapter()' constructors) are deprecated and will be " + "removed in Shapely 2.0. Use the 'shape()' function or the " + "standard 'Point()' constructor instead.", + ShapelyDeprecationWarning, stacklevel=4) + self.context = context + self.factory = geos_point_from_py + + @property + def _ndim(self): + try: + # From array protocol + array = self.context.__array_interface__ + n = array['shape'][0] + assert n == 2 or n == 3 + return n + except AttributeError: + # Fall back on list + return len(self.context) + + @property + def __array_interface__(self): + """Provide the Numpy array protocol.""" + try: + return self.context.__array_interface__ + except AttributeError: + return self.array_interface() + + def _get_coords(self): + """Access to geometry's coordinates (CoordinateSequence)""" + return CoordinateSequence(self) + + def _set_coords(self, ob): + raise NotImplementedError("Adapters can not modify their sources") + + coords = property(_get_coords) + + +def asPoint(context): + """Adapt an object to the Point interface""" + return PointAdapter(context) + + +def geos_point_from_py(ob, update_geom=None, update_ndim=0): + """Create a GEOS geom from an object that is a Point, a coordinate sequence + or that provides the array interface. + + Returns the GEOS geometry and the number of its dimensions. + """ + if isinstance(ob, Point): + return geos_geom_from_py(ob) + + # Accept either (x, y) or [(x, y)] + if not hasattr(ob, '__getitem__'): # generators + ob = list(ob) + + if isinstance(ob[0], tuple): + coords = ob[0] + else: + coords = ob + n = len(coords) + dx = c_double(coords[0]) + dy = c_double(coords[1]) + dz = None + if n == 3: + dz = c_double(coords[2]) + + if update_geom: + cs = lgeos.GEOSGeom_getCoordSeq(update_geom) + if n != update_ndim: + raise ValueError( + "Wrong coordinate dimensions; this geometry has dimensions: " + "%d" % update_ndim) + else: + cs = lgeos.GEOSCoordSeq_create(1, n) + + # Because of a bug in the GEOS C API, always set X before Y + lgeos.GEOSCoordSeq_setX(cs, 0, dx) + lgeos.GEOSCoordSeq_setY(cs, 0, dy) + if n == 3: + lgeos.GEOSCoordSeq_setZ(cs, 0, dz) + + if update_geom: + return None + else: + return lgeos.GEOSGeom_createPoint(cs), n + + +def update_point_from_py(geom, ob): + geos_point_from_py(ob, geom._geom, geom._ndim) +
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_modules/shapely/geometry/polygon/index.html b/branch/bart/_modules/shapely/geometry/polygon/index.html new file mode 100644 index 0000000..632d94a --- /dev/null +++ b/branch/bart/_modules/shapely/geometry/polygon/index.html @@ -0,0 +1,672 @@ + + + + + + shapely.geometry.polygon — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.polygon

+"""Polygons and their linear ring components
+"""
+
+import sys
+import warnings
+
+from ctypes import c_void_p, cast, POINTER
+import weakref
+
+from shapely.algorithms.cga import signed_area
+from shapely.coords import CoordinateSequence
+from shapely.geos import lgeos
+from shapely.geometry.base import BaseGeometry, geos_geom_from_py
+from shapely.geometry.linestring import LineString, LineStringAdapter
+from shapely.geometry.point import Point
+from shapely.geometry.proxy import PolygonProxy
+from shapely.errors import TopologicalError, ShapelyDeprecationWarning
+
+
+__all__ = ['Polygon', 'asPolygon', 'LinearRing', 'asLinearRing']
+
+
+class LinearRing(LineString):
+    """
+    A closed one-dimensional feature comprising one or more line segments
+
+    A LinearRing that crosses itself or touches itself at a single point is
+    invalid and operations on it may fail.
+    """
+
+    def __init__(self, coordinates=None):
+        """
+        Parameters
+        ----------
+        coordinates : sequence
+            A sequence of (x, y [,z]) numeric coordinate pairs or triples.
+            Also can be a sequence of Point objects.
+
+        Rings are implicitly closed. There is no need to specify a final
+        coordinate pair identical to the first.
+
+        Example
+        -------
+        Construct a square ring.
+
+          >>> ring = LinearRing( ((0, 0), (0, 1), (1 ,1 ), (1 , 0)) )
+          >>> ring.is_closed
+          True
+          >>> ring.length
+          4.0
+        """
+        BaseGeometry.__init__(self)
+        if coordinates is not None:
+            ret = geos_linearring_from_py(coordinates)
+            if ret is not None:
+                geom, n = ret
+                self._set_geom(geom)
+                self._ndim = n
+
+    @property
+    def __geo_interface__(self):
+        return {
+            'type': 'LinearRing',
+            'coordinates': tuple(self.coords)
+            }
+
+    # Coordinate access
+
+    def _get_coords(self):
+        """Access to geometry's coordinates (CoordinateSequence)"""
+        return CoordinateSequence(self)
+
+    def _set_coords(self, coordinates):
+        warnings.warn(
+            "Setting the 'coords' to mutate a Geometry in place is deprecated,"
+            " and will not be possible any more in Shapely 2.0",
+            ShapelyDeprecationWarning, stacklevel=2)
+        self._empty()
+        ret = geos_linearring_from_py(coordinates)
+        if ret is not None:
+            geom, n = ret
+            self._set_geom(geom)
+            self._ndim = n
+
+    coords = property(_get_coords, _set_coords)
+
+    def __setstate__(self, state):
+        """WKB doesn't differentiate between LineString and LinearRing so we
+        need to move the coordinate sequence into the correct geometry type"""
+        super().__setstate__(state)
+        cs = lgeos.GEOSGeom_getCoordSeq(self.__geom__)
+        cs_clone = lgeos.GEOSCoordSeq_clone(cs)
+        lgeos.GEOSGeom_destroy(self.__geom__)
+        self.__geom__ = lgeos.GEOSGeom_createLinearRing(cs_clone)
+
+    @property
+    def is_ccw(self):
+        """True is the ring is oriented counter clock-wise"""
+        return bool(self.impl['is_ccw'](self))
+
+    @property
+    def is_simple(self):
+        """True if the geometry is simple, meaning that any self-intersections
+        are only at boundary points, else False"""
+        return LineString(self).is_simple
+
+
+class LinearRingAdapter(LineStringAdapter):
+
+    __p__ = None
+
+    def __init__(self, context):
+        warnings.warn(
+            "The proxy geometries (through the 'asShape()', 'asLinearRing()' or "
+            "'LinearRingAdapter()' constructors) are deprecated and will be "
+            "removed in Shapely 2.0. Use the 'shape()' function or the "
+            "standard 'LinearRing()' constructor instead.",
+            ShapelyDeprecationWarning, stacklevel=3)
+        self.context = context
+        self.factory = geos_linearring_from_py
+
+    @property
+    def __geo_interface__(self):
+        return {
+            'type': 'LinearRing',
+            'coordinates': tuple(self.coords)
+            }
+
+    def _get_coords(self):
+        """Access to geometry's coordinates (CoordinateSequence)"""
+        return CoordinateSequence(self)
+
+    coords = property(_get_coords)
+
+
+def asLinearRing(context):
+    """Adapt an object to the LinearRing interface"""
+    return LinearRingAdapter(context)
+
+
+class InteriorRingSequence:
+
+    _factory = None
+    _geom = None
+    __p__ = None
+    _ndim = None
+    _index = 0
+    _length = 0
+    __rings__ = None
+    _gtag = None
+
+    def __init__(self, parent):
+        self.__p__ = parent
+        self._geom = parent._geom
+        self._ndim = parent._ndim
+
+    def __iter__(self):
+        self._index = 0
+        self._length = self.__len__()
+        return self
+
+    def __next__(self):
+        if self._index < self._length:
+            ring = self._get_ring(self._index)
+            self._index += 1
+            return ring
+        else:
+            raise StopIteration
+
+    def __len__(self):
+        return lgeos.GEOSGetNumInteriorRings(self._geom)
+
+    def __getitem__(self, key):
+        m = self.__len__()
+        if isinstance(key, int):
+            if key + m < 0 or key >= m:
+                raise IndexError("index out of range")
+            if key < 0:
+                i = m + key
+            else:
+                i = key
+            return self._get_ring(i)
+        elif isinstance(key, slice):
+            res = []
+            start, stop, stride = key.indices(m)
+            for i in range(start, stop, stride):
+                res.append(self._get_ring(i))
+            return res
+        else:
+            raise TypeError("key must be an index or slice")
+
+    @property
+    def _longest(self):
+        max = 0
+        for g in iter(self):
+            l = len(g.coords)
+            if l > max:
+                max = l
+
+    def gtag(self):
+        return hash(repr(self.__p__))
+
+    def _get_ring(self, i):
+        gtag = self.gtag()
+        if gtag != self._gtag:
+            self.__rings__ = {}
+        if i not in self.__rings__:
+            g = lgeos.GEOSGetInteriorRingN(self._geom, i)
+            ring = LinearRing()
+            ring._set_geom(g)
+            ring.__p__ = self
+            ring._other_owned = True
+            ring._ndim = self._ndim
+            self.__rings__[i] = weakref.ref(ring)
+        return self.__rings__[i]()
+
+
+
[docs]class Polygon(BaseGeometry): + """ + A two-dimensional figure bounded by a linear ring + + A polygon has a non-zero area. It may have one or more negative-space + "holes" which are also bounded by linear rings. If any rings cross each + other, the feature is invalid and operations on it may fail. + + Attributes + ---------- + exterior : LinearRing + The ring which bounds the positive space of the polygon. + interiors : sequence + A sequence of rings which bound all existing holes. + """ + + _exterior = None + _interiors = [] + _ndim = 2 + + def __init__(self, shell=None, holes=None): + """ + Parameters + ---------- + shell : sequence + A sequence of (x, y [,z]) numeric coordinate pairs or triples. + Also can be a sequence of Point objects. + holes : sequence + A sequence of objects which satisfy the same requirements as the + shell parameters above + + Example + ------- + Create a square polygon with no holes + + >>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.)) + >>> polygon = Polygon(coords) + >>> polygon.area + 1.0 + """ + BaseGeometry.__init__(self) + + if shell is not None: + ret = geos_polygon_from_py(shell, holes) + if ret is not None: + geom, n = ret + self._set_geom(geom) + self._ndim = n + else: + self._empty() + + @property + def exterior(self): + if self.is_empty: + return LinearRing() + elif self._exterior is None or self._exterior() is None: + g = lgeos.GEOSGetExteriorRing(self._geom) + ring = LinearRing() + ring._set_geom(g) + ring.__p__ = self + ring._other_owned = True + ring._ndim = self._ndim + self._exterior = weakref.ref(ring) + return self._exterior() + + @property + def interiors(self): + if self.is_empty: + return [] + return InteriorRingSequence(self) + + def __eq__(self, other): + if not isinstance(other, Polygon): + return False + check_empty = (self.is_empty, other.is_empty) + if all(check_empty): + return True + elif any(check_empty): + return False + my_coords = [ + tuple(self.exterior.coords), + [tuple(interior.coords) for interior in self.interiors] + ] + other_coords = [ + tuple(other.exterior.coords), + [tuple(interior.coords) for interior in other.interiors] + ] + return my_coords == other_coords + + def __ne__(self, other): + return not self.__eq__(other) + + __hash__ = None + + @property + def _ctypes(self): + if not self._ctypes_data: + self._ctypes_data = self.exterior._ctypes + return self._ctypes_data + + @property + def __array_interface__(self): + raise AttributeError( + "A polygon does not itself provide the array interface. Its rings do.") + + def _get_coords(self): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not") + + def _set_coords(self, ob): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not") + + @property + def coords(self): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not") + + @property + def __geo_interface__(self): + if self.exterior == LinearRing(): + coords = [] + else: + coords = [tuple(self.exterior.coords)] + for hole in self.interiors: + coords.append(tuple(hole.coords)) + return { + 'type': 'Polygon', + 'coordinates': tuple(coords)} + +
[docs] def svg(self, scale_factor=1., fill_color=None, opacity=None): + """Returns SVG path element for the Polygon geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG stroke-width. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return '<g />' + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + exterior_coords = [ + ["{},{}".format(*c) for c in self.exterior.coords]] + interior_coords = [ + ["{},{}".format(*c) for c in interior.coords] + for interior in self.interiors] + path = " ".join([ + "M {} L {} z".format(coords[0], " L ".join(coords[1:])) + for coords in exterior_coords + interior_coords]) + return ( + '<path fill-rule="evenodd" fill="{2}" stroke="#555555" ' + 'stroke-width="{0}" opacity="{3}" d="{1}" />' + ).format(2. * scale_factor, path, fill_color, opacity)
+ +
[docs] @classmethod + def from_bounds(cls, xmin, ymin, xmax, ymax): + """Construct a `Polygon()` from spatial bounds.""" + return cls([ + (xmin, ymin), + (xmin, ymax), + (xmax, ymax), + (xmax, ymin)])
+ + +class PolygonAdapter(PolygonProxy, Polygon): + + def __init__(self, shell, holes=None): + warnings.warn( + "The proxy geometries (through the 'asShape()', 'asPolygon()' or " + "'PolygonAdapter()' constructors) are deprecated and will be " + "removed in Shapely 2.0. Use the 'shape()' function or the " + "standard 'Polygon()' constructor instead.", + ShapelyDeprecationWarning, stacklevel=4) + self.shell = shell + self.holes = holes + self.context = (shell, holes) + self.factory = geos_polygon_from_py + + @property + def _ndim(self): + try: + # From array protocol + array = self.shell.__array_interface__ + n = array['shape'][1] + assert n == 2 or n == 3 + return n + except AttributeError: + # Fall back on list + return len(self.shell[0]) + + +def asPolygon(shell, holes=None): + """Adapt objects to the Polygon interface""" + return PolygonAdapter(shell, holes) + + +def orient(polygon, sign=1.0): + s = float(sign) + rings = [] + ring = polygon.exterior + if signed_area(ring)/s >= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + for ring in polygon.interiors: + if signed_area(ring)/s <= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + return Polygon(rings[0], rings[1:]) + + +def geos_linearring_from_py(ob, update_geom=None, update_ndim=0): + # If a LinearRing is passed in, clone it and return + # If a valid LineString is passed in, clone the coord seq and return a + # LinearRing. + # + # NB: access to coordinates using the array protocol has been moved + # entirely to the speedups module. + + if isinstance(ob, LineString): + if type(ob) == LinearRing: + return geos_geom_from_py(ob) + elif not ob.is_valid: + raise TopologicalError("An input LineString must be valid.") + elif ob.is_closed and len(ob.coords) >= 4: + return geos_geom_from_py(ob, lgeos.GEOSGeom_createLinearRing) + else: + ob = list(ob.coords) + + try: + m = len(ob) + except TypeError: # generators + ob = list(ob) + m = len(ob) + + if m == 0: + return None + + def _coords(o): + if isinstance(o, Point): + return o.coords[0] + else: + return o + + n = len(_coords(ob[0])) + if m < 3: + raise ValueError( + "A LinearRing must have at least 3 coordinate tuples") + assert (n == 2 or n == 3) + + # Add closing coordinates if not provided + if ( + m == 3 + or _coords(ob[0])[0] != _coords(ob[-1])[0] + or _coords(ob[0])[1] != _coords(ob[-1])[1] + ): + M = m + 1 + else: + M = m + + # Create a coordinate sequence + if update_geom is not None: + if n != update_ndim: + raise ValueError( + "Coordinate dimensions mismatch: target geom has {} dims, " + "update geom has {} dims".format(n, update_ndim)) + cs = lgeos.GEOSGeom_getCoordSeq(update_geom) + else: + cs = lgeos.GEOSCoordSeq_create(M, n) + + # add to coordinate sequence + for i in range(m): + coords = _coords(ob[i]) + # Because of a bug in the GEOS C API, + # always set X before Y + lgeos.GEOSCoordSeq_setX(cs, i, coords[0]) + lgeos.GEOSCoordSeq_setY(cs, i, coords[1]) + if n == 3: + try: + lgeos.GEOSCoordSeq_setZ(cs, i, coords[2]) + except IndexError: + raise ValueError("Inconsistent coordinate dimensionality") + + # Add closing coordinates to sequence? 
+ if M > m: + coords = _coords(ob[0]) + # Because of a bug in the GEOS C API, + # always set X before Y + lgeos.GEOSCoordSeq_setX(cs, M-1, coords[0]) + lgeos.GEOSCoordSeq_setY(cs, M-1, coords[1]) + if n == 3: + lgeos.GEOSCoordSeq_setZ(cs, M-1, coords[2]) + + if update_geom is not None: + return None + else: + return lgeos.GEOSGeom_createLinearRing(cs), n + + +def update_linearring_from_py(geom, ob): + geos_linearring_from_py(ob, geom._geom, geom._ndim) + + +def geos_polygon_from_py(shell, holes=None): + + if shell is None: + return None + + if isinstance(shell, Polygon): + return geos_geom_from_py(shell) + + if shell is not None: + ret = geos_linearring_from_py(shell) + if ret is None: + return None + + geos_shell, ndim = ret + if holes is not None and len(holes) > 0: + ob = holes + L = len(ob) + exemplar = ob[0] + try: + N = len(exemplar[0]) + except TypeError: + N = exemplar._ndim + if not L >= 1: + raise ValueError("number of holes must be non zero") + if N not in (2, 3): + raise ValueError("insufficiant coordinate dimension") + + # Array of pointers to ring geometries + geos_holes = (c_void_p * L)() + + # add to coordinate sequence + for l in range(L): + geom, ndim = geos_linearring_from_py(ob[l]) + geos_holes[l] = cast(geom, c_void_p) + else: + geos_holes = POINTER(c_void_p)() + L = 0 + + return ( + lgeos.GEOSGeom_createPolygon( + c_void_p(geos_shell), geos_holes, L), ndim) +
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/_modules/shapely/ops/index.html b/branch/bart/_modules/shapely/ops/index.html new file mode 100644 index 0000000..fdfc16b --- /dev/null +++ b/branch/bart/_modules/shapely/ops/index.html @@ -0,0 +1,867 @@ + + + + + + shapely.ops — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.ops

+"""Support for various GEOS geometry operations
+"""
+
+from ctypes import byref, c_void_p, c_double
+from warnings import warn
+
+from shapely.errors import GeometryTypeError, ShapelyDeprecationWarning
+from shapely.prepared import prep
+from shapely.geos import lgeos
+from shapely.geometry.base import geom_factory, BaseGeometry, BaseMultipartGeometry
+from shapely.geometry import (
+    shape, Point, MultiPoint, LineString, MultiLineString, Polygon, GeometryCollection)
+from shapely.geometry.polygon import orient as orient_
+from shapely.algorithms.polylabel import polylabel
+
+
+__all__ = ['cascaded_union', 'linemerge', 'operator', 'polygonize',
+           'polygonize_full', 'transform', 'unary_union', 'triangulate',
+           'voronoi_diagram', 'split', 'nearest_points', 'validate', 'snap',
+           'shared_paths', 'clip_by_rect', 'orient', 'substring']
+
+
+class CollectionOperator:
+
+    def shapeup(self, ob):
+        if isinstance(ob, BaseGeometry):
+            return ob
+        else:
+            try:
+                return shape(ob)
+            except (ValueError, AttributeError):
+                return LineString(ob)
+
+    def polygonize(self, lines):
+        """Creates polygons from a source of lines
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.
+        """
+        source = getattr(lines, 'geoms', None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(l) for l in source]
+        geom_array_type = c_void_p * len(obs)
+        geom_array = geom_array_type()
+        for i, line in enumerate(obs):
+            geom_array[i] = line._geom
+        product = lgeos.GEOSPolygonize(byref(geom_array), len(obs))
+        collection = geom_factory(product)
+        for g in collection.geoms:
+            clone = lgeos.GEOSGeom_clone(g._geom)
+            g = geom_factory(clone)
+            g._other_owned = False
+            yield g
+
+    def polygonize_full(self, lines):
+        """Creates polygons from a source of lines, returning the polygons
+        and leftover geometries.
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.
+
+        Returns a tuple of objects: (polygons, dangles, cut edges, invalid ring
+        lines). Each are a geometry collection.
+
+        Dangles are edges which have one or both ends which are not incident on
+        another edge endpoint. Cut edges are connected at both ends but do not
+        form part of polygon. Invalid ring lines form rings which are invalid
+        (bowties, etc).
+        """
+        source = getattr(lines, 'geoms', None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(l) for l in source]
+        L = len(obs)
+        subs = (c_void_p * L)()
+        for i, g in enumerate(obs):
+            subs[i] = g._geom
+        collection = lgeos.GEOSGeom_createCollection(5, subs, L)
+        dangles = c_void_p()
+        cuts = c_void_p()
+        invalids = c_void_p()
+        product = lgeos.GEOSPolygonize_full(
+            collection, byref(dangles), byref(cuts), byref(invalids))
+        return (
+            geom_factory(product),
+            geom_factory(dangles),
+            geom_factory(cuts),
+            geom_factory(invalids)
+            )
+
+    def linemerge(self, lines):
+        """Merges all connected lines from a source
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.  Returns a
+        LineString or MultiLineString when lines are not contiguous.
+        """
+        source = None
+        if hasattr(lines, 'type') and lines.type == 'MultiLineString':
+            source = lines
+        elif hasattr(lines, 'geoms'):
+            # other Multi geometries
+            source = MultiLineString([ls.coords for ls in lines.geoms])
+        elif hasattr(lines, '__iter__'):
+            try:
+                source = MultiLineString([ls.coords for ls in lines])
+            except AttributeError:
+                source = MultiLineString(lines)
+        if source is None:
+            raise ValueError("Cannot linemerge %s" % lines)
+        result = lgeos.GEOSLineMerge(source._geom)
+        return geom_factory(result)
+
+    def cascaded_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        This function is deprecated, as it was superseded by
+        :meth:`unary_union`.
+        """
+        warn(
+            "The 'cascaded_union()' function is deprecated. "
+            "Use 'unary_union()' instead.",
+            ShapelyDeprecationWarning, stacklevel=2)
+        try:
+            if isinstance(geoms, BaseMultipartGeometry):
+                geoms = geoms.geoms
+            L = len(geoms)
+        except TypeError:
+            geoms = [geoms]
+            L = 1
+        subs = (c_void_p * L)()
+        for i, g in enumerate(geoms):
+            subs[i] = g._geom
+        collection = lgeos.GEOSGeom_createCollection(6, subs, L)
+        return geom_factory(lgeos.methods['cascaded_union'](collection))
+
+    def unary_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        This method replaces :meth:`cascaded_union` as the
+        preferred method for dissolving many polygons.
+        """
+        try:
+            if isinstance(geoms, BaseMultipartGeometry):
+                geoms = geoms.geoms
+            L = len(geoms)
+        except TypeError:
+            geoms = [geoms]
+            L = 1
+        subs = (c_void_p * L)()
+        for i, g in enumerate(geoms):
+            subs[i] = g._geom
+        collection = lgeos.GEOSGeom_createCollection(6, subs, L)
+        return geom_factory(lgeos.methods['unary_union'](collection))
+
+operator = CollectionOperator()
+polygonize = operator.polygonize
+polygonize_full = operator.polygonize_full
+linemerge = operator.linemerge
+cascaded_union = operator.cascaded_union
+unary_union = operator.unary_union
+
+
+def triangulate(geom, tolerance=0.0, edges=False):
+    """Creates the Delaunay triangulation and returns a list of geometries
+
+    The source may be any geometry type. All vertices of the geometry will be
+    used as the points of the triangulation.
+
+    From the GEOS documentation:
+    tolerance is the snapping tolerance used to improve the robustness of
+    the triangulation computation. A tolerance of 0.0 specifies that no
+    snapping will take place.
+
+    If edges is False, a list of Polygons (triangles) will be returned.
+    Otherwise the list of LineString edges is returned.
+
+    """
+    func = lgeos.methods['delaunay_triangulation']
+    gc = geom_factory(func(geom._geom, tolerance, int(edges)))
+    return [g for g in gc.geoms]
+
+
+def voronoi_diagram(geom, envelope=None, tolerance=0.0, edges=False):
+    """
+    Constructs a Voronoi Diagram [1] from the given geometry.
+    Returns a list of geometries.
+
+    Parameters
+    ----------
+    geom: geometry
+        the input geometry whose vertices will be used to calculate
+        the final diagram.
+    envelope: geometry, None
+        clipping envelope for the returned diagram, automatically
+        determined if None. The diagram will be clipped to the larger
+        of this envelope or an envelope surrounding the sites.
+    tolerance: float, 0.0
+        sets the snapping tolerance used to improve the robustness
+        of the computation. A tolerance of 0.0 specifies that no
+        snapping will take place.
+    edges: bool, False
+        If False, return regions as polygons. Else, return only
+        edges e.g. LineStrings.
+
+    GEOS documentation can be found at [2]
+
+    Returns
+    -------
+    GeometryCollection
+        geometries representing the Voronoi regions.
+
+    Notes
+    -----
+    The `tolerance` argument can be finicky and is known to cause the
+    algorithm to fail in several cases. If you're using `tolerance`
+    and getting a failure, try removing it. The test cases in
+    tests/test_voronoi_diagram.py show more details.
+
+
+    References
+    ----------
+    [1] https://en.wikipedia.org/wiki/Voronoi_diagram
+    [2] https://geos.osgeo.org/doxygen/geos__c_8h_source.html  (line 730)
+    """
+    func = lgeos.methods['voronoi_diagram']
+    envelope = envelope._geom if envelope else None
+    try:
+        result = geom_factory(func(geom._geom, envelope, tolerance, int(edges)))
+    except ValueError:
+        errstr = "Could not create Voronoi Diagram with the specified inputs."
+        if tolerance:
+            errstr += " Try running again with default tolerance value."
+        raise ValueError(errstr)
+
+    if result.type != 'GeometryCollection':
+        return GeometryCollection([result])
+    return result
+
+
+class ValidateOp:
+    def __call__(self, this):
+        return lgeos.GEOSisValidReason(this._geom)
+
+validate = ValidateOp()
+
+
+
[docs]def transform(func, geom): + """Applies `func` to all coordinates of `geom` and returns a new + geometry of the same type from the transformed coordinates. + + `func` maps x, y, and optionally z to output xp, yp, zp. The input + parameters may iterable types like lists or arrays or single values. + The output shall be of the same type. Scalars in, scalars out. + Lists in, lists out. + + For example, here is an identity function applicable to both types + of input. + + def id_func(x, y, z=None): + return tuple(filter(None, [x, y, z])) + + g2 = transform(id_func, g1) + + Using pyproj >= 2.1, this example will accurately project Shapely geometries: + + import pyproj + + wgs84 = pyproj.CRS('EPSG:4326') + utm = pyproj.CRS('EPSG:32618') + + project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform + + g2 = transform(project, g1) + + Note that the always_xy kwarg is required here as Shapely geometries only support + X,Y coordinate ordering. + + Lambda expressions such as the one in + + g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1) + + also satisfy the requirements for `func`. + """ + if geom.is_empty: + return geom + if geom.type in ('Point', 'LineString', 'LinearRing', 'Polygon'): + + # First we try to apply func to x, y, z sequences. When func is + # optimized for sequences, this is the fastest, though zipping + # the results up to go back into the geometry constructors adds + # extra cost. + try: + if geom.type in ('Point', 'LineString', 'LinearRing'): + return type(geom)(zip(*func(*zip(*geom.coords)))) + elif geom.type == 'Polygon': + shell = type(geom.exterior)( + zip(*func(*zip(*geom.exterior.coords)))) + holes = list(type(ring)(zip(*func(*zip(*ring.coords)))) + for ring in geom.interiors) + return type(geom)(shell, holes) + + # A func that assumes x, y, z are single values will likely raise a + # TypeError, in which case we'll try again. + except TypeError: + if geom.type in ('Point', 'LineString', 'LinearRing'): + return type(geom)([func(*c) for c in geom.coords]) + elif geom.type == 'Polygon': + shell = type(geom.exterior)( + [func(*c) for c in geom.exterior.coords]) + holes = list(type(ring)([func(*c) for c in ring.coords]) + for ring in geom.interiors) + return type(geom)(shell, holes) + + elif geom.type.startswith('Multi') or geom.type == 'GeometryCollection': + return type(geom)([transform(func, part) for part in geom.geoms]) + else: + raise GeometryTypeError('Type %r not recognized' % geom.type)
+ + +def nearest_points(g1, g2): + """Returns the calculated nearest points in the input geometries + + The points are returned in the same order as the input geometries. + """ + seq = lgeos.methods['nearest_points'](g1._geom, g2._geom) + if seq is None: + if g1.is_empty: + raise ValueError('The first input geometry is empty') + else: + raise ValueError('The second input geometry is empty') + + try: + x1 = c_double() + y1 = c_double() + x2 = c_double() + y2 = c_double() + lgeos.GEOSCoordSeq_getX(seq, 0, byref(x1)) + lgeos.GEOSCoordSeq_getY(seq, 0, byref(y1)) + lgeos.GEOSCoordSeq_getX(seq, 1, byref(x2)) + lgeos.GEOSCoordSeq_getY(seq, 1, byref(y2)) + finally: + lgeos._lgeos.GEOSCoordSeq_destroy(seq) + + p1 = Point(x1.value, y1.value) + p2 = Point(x2.value, y2.value) + return (p1, p2) + +def snap(g1, g2, tolerance): + """Snap one geometry to another with a given tolerance + + Vertices of the first geometry are snapped to vertices of the second + geometry. The resulting snapped geometry is returned. The input geometries + are not modified. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + tolerance : float + The snapping tolerance + + Example + ------- + >>> square = Polygon([(1,1), (2, 1), (2, 2), (1, 2), (1, 1)]) + >>> line = LineString([(0,0), (0.8, 0.8), (1.8, 0.95), (2.6, 0.5)]) + >>> result = snap(line, square, 0.5) + >>> result.wkt + 'LINESTRING (0 0, 1 1, 2 1, 2.6 0.5)' + """ + return(geom_factory(lgeos.methods['snap'](g1._geom, g2._geom, tolerance))) + +def shared_paths(g1, g2): + """Find paths shared between the two given lineal geometries + + Returns a GeometryCollection with two elements: + - First element is a MultiLineString containing shared paths with the + same direction for both inputs. + - Second element is a MultiLineString containing shared paths with the + opposite direction for the two inputs. 
+ + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + """ + if not isinstance(g1, LineString): + raise GeometryTypeError("First geometry must be a LineString") + if not isinstance(g2, LineString): + raise GeometryTypeError("Second geometry must be a LineString") + return(geom_factory(lgeos.methods['shared_paths'](g1._geom, g2._geom))) + + +class SplitOp: + + @staticmethod + def _split_polygon_with_line(poly, splitter): + """Split a Polygon with a LineString""" + if not isinstance(poly, Polygon): + raise GeometryTypeError("First argument must be a Polygon") + if not isinstance(splitter, LineString): + raise GeometryTypeError("Second argument must be a LineString") + + union = poly.boundary.union(splitter) + + # greatly improves split performance for big geometries with many + # holes (the following contains checks) with minimal overhead + # for common cases + poly = prep(poly) + + # some polygonized geometries may be holes, we do not want them + # that's why we test if the original polygon (poly) contains + # an inner point of polygonized geometry (pg) + return [pg for pg in polygonize(union) if poly.contains(pg.representative_point())] + + @staticmethod + def _split_line_with_line(line, splitter): + """Split a LineString with another (Multi)LineString or (Multi)Polygon""" + + # if splitter is a polygon, pick it's boundary + if splitter.type in ('Polygon', 'MultiPolygon'): + splitter = splitter.boundary + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, LineString) and not isinstance(splitter, MultiLineString): + raise GeometryTypeError("Second argument must be either a LineString or a MultiLineString") + + # | s\l | Interior | Boundary | Exterior | + # |----------|----------|----------|----------| + # | Interior | 0 or F | * | * | At least one of these two must be 0 + # | Boundary | 0 or F | * | * | So either '0********' or '[0F]**0*****' + # | Exterior | * | * | * | No overlapping interiors ('1********') + relation = splitter.relate(line) + if relation[0] == '1': + # The lines overlap at some segment (linear intersection of interiors) + raise ValueError('Input geometry segment overlaps with the splitter.') + elif relation[0] == '0' or relation[3] == '0': + # The splitter crosses or touches the line's interior --> return multilinestring from the split + return line.difference(splitter) + else: + # The splitter does not cross or touch the line's interior --> return collection with identity line + return [line] + + @staticmethod + def _split_line_with_point(line, splitter): + """Split a LineString with a Point""" + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, Point): + raise GeometryTypeError("Second argument must be a Point") + + # check if point is in the interior of the line + if not line.relate_pattern(splitter, '0********'): + # point not on line interior --> return collection with single identity line + # (REASONING: Returning a list with the input line reference and creating a + # GeometryCollection at the general split function prevents unnecessary copying + # of linestrings in multipoint splitting function) + return [line] + elif line.coords[0] == splitter.coords[0]: + # if line is a closed ring the previous test doesn't behave as desired + return [line] + + # point is on line, get the distance from the first point on line + distance_on_line = 
line.project(splitter) + coords = list(line.coords) + # split the line at the point and create two new lines + current_position = 0.0 + for i in range(len(coords)-1): + point1 = coords[i] + point2 = coords[i+1] + dx = point1[0] - point2[0] + dy = point1[1] - point2[1] + segment_length = (dx ** 2 + dy ** 2) ** 0.5 + current_position += segment_length + if distance_on_line == current_position: + # splitter is exactly on a vertex + return [ + LineString(coords[:i+2]), + LineString(coords[i+1:]) + ] + elif distance_on_line < current_position: + # splitter is between two vertices + return [ + LineString(coords[:i+1] + [splitter.coords[0]]), + LineString([splitter.coords[0]] + coords[i+1:]) + ] + return [line] + + + @staticmethod + def _split_line_with_multipoint(line, splitter): + """Split a LineString with a MultiPoint""" + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, MultiPoint): + raise GeometryTypeError("Second argument must be a MultiPoint") + + chunks = [line] + for pt in splitter.geoms: + new_chunks = [] + for chunk in filter(lambda x: not x.is_empty, chunks): + # add the newly split 2 lines or the same line if not split + new_chunks.extend(SplitOp._split_line_with_point(chunk, pt)) + chunks = new_chunks + + return chunks + + @staticmethod + def split(geom, splitter): + """ + Splits a geometry by another geometry and returns a collection of geometries. This function is the theoretical + opposite of the union of the split geometry parts. If the splitter does not split the geometry, a collection + with a single geometry equal to the input geometry is returned. + The function supports: + - Splitting a (Multi)LineString by a (Multi)Point or (Multi)LineString or (Multi)Polygon + - Splitting a (Multi)Polygon by a LineString + + It may be convenient to snap the splitter with low tolerance to the geometry. For example in the case + of splitting a line by a point, the point must be exactly on the line, for the line to be correctly split. + When splitting a line by a polygon, the boundary of the polygon is used for the operation. + When splitting a line by another line, a ValueError is raised if the two overlap at some segment. 
+ + Parameters + ---------- + geom : geometry + The geometry to be split + splitter : geometry + The geometry that will split the input geom + + Example + ------- + >>> pt = Point((1, 1)) + >>> line = LineString([(0,0), (2,2)]) + >>> result = split(line, pt) + >>> result.wkt + 'GEOMETRYCOLLECTION (LINESTRING (0 0, 1 1), LINESTRING (1 1, 2 2))' + """ + + if geom.type in ('MultiLineString', 'MultiPolygon'): + return GeometryCollection([i for part in geom.geoms for i in SplitOp.split(part, splitter).geoms]) + + elif geom.type == 'LineString': + if splitter.type in ('LineString', 'MultiLineString', 'Polygon', 'MultiPolygon'): + split_func = SplitOp._split_line_with_line + elif splitter.type in ('Point'): + split_func = SplitOp._split_line_with_point + elif splitter.type in ('MultiPoint'): + split_func = SplitOp._split_line_with_multipoint + else: + raise GeometryTypeError("Splitting a LineString with a %s is not supported" % splitter.type) + + elif geom.type == 'Polygon': + if splitter.type == 'LineString': + split_func = SplitOp._split_polygon_with_line + else: + raise GeometryTypeError("Splitting a Polygon with a %s is not supported" % splitter.type) + + else: + raise GeometryTypeError("Splitting %s geometry is not supported" % geom.type) + + return GeometryCollection(split_func(geom, splitter)) + +split = SplitOp.split + + +def substring(geom, start_dist, end_dist, normalized=False): + """Return a line segment between specified distances along a LineString + + Negative distance values are taken as measured in the reverse + direction from the end of the geometry. Out-of-range index + values are handled by clamping them to the valid range of values. + + If the start distance equals the end distance, a Point is returned. + + If the start distance is actually beyond the end distance, then the + reversed substring is returned such that the start distance is + at the first coordinate. + + Parameters + ---------- + geom : LineString + The geometry to get a substring of. + start_dist : float + The distance along `geom` of the start of the substring. + end_dist : float + The distance along `geom` of the end of the substring. + normalized : bool, False + Whether the distance parameters are interpreted as a + fraction of the geometry's length. + + Returns + ------- + Union[Point, LineString] + The substring between `start_dist` and `end_dist` or a Point + if they are at the same location. + + Raises + ------ + TypeError + If `geom` is not a LineString. + + Examples + -------- + >>> from shapely.geometry import LineString + >>> from shapely.ops import substring + >>> ls = LineString((i, 0) for i in range(6)) + >>> ls.wkt + 'LINESTRING (0 0, 1 0, 2 0, 3 0, 4 0, 5 0)' + >>> substring(ls, start_dist=1, end_dist=3).wkt + 'LINESTRING (1 0, 2 0, 3 0)' + >>> substring(ls, start_dist=3, end_dist=1).wkt + 'LINESTRING (3 0, 2 0, 1 0)' + >>> substring(ls, start_dist=1, end_dist=-3).wkt + 'LINESTRING (1 0, 2 0)' + >>> substring(ls, start_dist=0.2, end_dist=-0.6, normalized=True).wkt + 'LINESTRING (1 0, 2 0)' + + Returning a `Point` when `start_dist` and `end_dist` are at the + same location. + + >>> substring(ls, 2.5, -2.5).wkt + 'POINT (2.5 0)' + """ + + if not isinstance(geom, LineString): + raise GeometryTypeError("Can only calculate a substring of LineString geometries. A %s was provided." 
% geom.type) + + # Filter out cases in which to return a point + if start_dist == end_dist: + return geom.interpolate(start_dist, normalized) + elif not normalized and start_dist >= geom.length and end_dist >= geom.length: + return geom.interpolate(geom.length, normalized) + elif not normalized and -start_dist >= geom.length and -end_dist >= geom.length: + return geom.interpolate(0, normalized) + elif normalized and start_dist >= 1 and end_dist >= 1: + return geom.interpolate(1, normalized) + elif normalized and -start_dist >= 1 and -end_dist >= 1: + return geom.interpolate(0, normalized) + + if normalized: + start_dist *= geom.length + end_dist *= geom.length + + # Filter out cases where distances meet at a middle point from opposite ends. + if start_dist < 0 < end_dist and abs(start_dist) + end_dist == geom.length: + return geom.interpolate(end_dist) + elif end_dist < 0 < start_dist and abs(end_dist) + start_dist == geom.length: + return geom.interpolate(start_dist) + + start_point = geom.interpolate(start_dist) + end_point = geom.interpolate(end_dist) + + if start_dist < 0: + start_dist = geom.length + start_dist # Values may still be negative, + if end_dist < 0: # but only in the out-of-range + end_dist = geom.length + end_dist # sense, not the wrap-around sense. + + reverse = start_dist > end_dist + if reverse: + start_dist, end_dist = end_dist, start_dist + + if start_dist < 0: + start_dist = 0 # to avoid duplicating the first vertex + + if reverse: + vertex_list = [(end_point.x, end_point.y)] + else: + vertex_list = [(start_point.x, start_point.y)] + + coords = list(geom.coords) + current_distance = 0 + for p1, p2 in zip(coords, coords[1:]): + if start_dist < current_distance < end_dist: + vertex_list.append(p1) + elif current_distance >= end_dist: + break + + current_distance += ((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2) ** 0.5 + + if reverse: + vertex_list.append((start_point.x, start_point.y)) + # reverse direction result + vertex_list = reversed(vertex_list) + else: + vertex_list.append((end_point.x, end_point.y)) + + return LineString(vertex_list) + + +def clip_by_rect(geom, xmin, ymin, xmax, ymax): + """Returns the portion of a geometry within a rectangle + + The geometry is clipped in a fast but possibly dirty way. The output is + not guaranteed to be valid. No exceptions will be raised for topological + errors. + + Parameters + ---------- + geom : geometry + The geometry to be clipped + xmin : float + Minimum x value of the rectangle + ymin : float + Minimum y value of the rectangle + xmax : float + Maximum x value of the rectangle + ymax : float + Maximum y value of the rectangle + + Notes + ----- + Requires GEOS >= 3.5.0 + New in 1.7. + """ + if geom.is_empty: + return geom + result = geom_factory(lgeos.methods['clip_by_rect'](geom._geom, xmin, ymin, xmax, ymax)) + return result + + +def orient(geom, sign=1.0): + """A properly oriented copy of the given geometry. + + The signed area of the result will have the given sign. A sign of + 1.0 means that the coordinates of the product's exterior rings will + be oriented counter-clockwise. + + Parameters + ---------- + geom : Geometry + The original geometry. May be a Polygon, MultiPolygon, or + GeometryCollection. + sign : float, optional. + The sign of the result's signed area. 
+ + Returns + ------- + Geometry + + """ + if isinstance(geom, BaseMultipartGeometry): + return geom.__class__( + list( + map( + lambda geom: orient(geom, sign), + geom.geoms, + ) + ) + ) + if isinstance(geom, (Polygon,)): + return orient_(geom, sign) + return geom +
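A minimal usage sketch of the `split` and `substring` helpers listed above, assuming a Shapely 1.8-series install where both are importable from `shapely.ops`; the expected WKT outputs follow from the implementation and docstrings shown in the listing:

```python
# Minimal sketch of split() and substring(), assuming Shapely 1.8.x
# where both helpers are importable from shapely.ops.
from shapely.geometry import LineString, Point
from shapely.ops import split, substring

line = LineString([(0, 0), (4, 0)])

# Splitting at a point that lies exactly on the line yields a
# GeometryCollection of two LineStrings.
pieces = split(line, Point(1, 0))
print([g.wkt for g in pieces.geoms])
# ['LINESTRING (0 0, 1 0)', 'LINESTRING (1 0, 4 0)']

# A splitter that does not touch the line's interior returns the input
# line unchanged, wrapped in a single-member collection.
print(split(line, Point(1, 1)).geoms[0].wkt)
# 'LINESTRING (0 0, 4 0)'

# substring() clips by distance along the line; swapping the distances
# returns the reversed segment.
print(substring(line, 1, 3).wkt)   # LINESTRING (1 0, 3 0)
print(substring(line, 3, 1).wkt)   # LINESTRING (3 0, 1 0)
```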
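Likewise, a short sketch of `clip_by_rect` and `orient` as defined above (again assuming Shapely 1.8.x; `clip_by_rect` additionally requires GEOS >= 3.5, per its notes):

```python
# Minimal sketch of clip_by_rect() and orient(), assuming Shapely 1.8.x.
from shapely.geometry import Polygon
from shapely.ops import clip_by_rect, orient

square = Polygon([(0, 0), (2, 0), (2, 2), (0, 2)])

# Fast rectangular clip; output validity is not guaranteed.
clipped = clip_by_rect(square, 0.5, 0.5, 3.0, 3.0)
print(clipped.bounds)  # (0.5, 0.5, 2.0, 2.0)

# orient() returns a copy whose exterior ring winding matches the sign:
# +1.0 -> counter-clockwise, -1.0 -> clockwise.
cw_square = Polygon([(0, 0), (0, 2), (2, 2), (2, 0)])  # clockwise ring
print(cw_square.exterior.is_ccw)               # False
print(orient(cw_square, 1.0).exterior.is_ccw)  # True
```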
+ + + + \ No newline at end of file diff --git a/branch/bart/_sources/_generated/lasso.CubeTransit.rst.txt b/branch/bart/_sources/_generated/lasso.CubeTransit.rst.txt new file mode 100644 index 0000000..e24b49e --- /dev/null +++ b/branch/bart/_sources/_generated/lasso.CubeTransit.rst.txt @@ -0,0 +1,36 @@ +lasso.CubeTransit +================= + +.. currentmodule:: lasso + +.. autoclass:: CubeTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~CubeTransit.__init__ + ~CubeTransit.add_additional_time_periods + ~CubeTransit.add_cube + ~CubeTransit.build_route_name + ~CubeTransit.calculate_start_end_times + ~CubeTransit.create_add_route_card_dict + ~CubeTransit.create_delete_route_card_dict + ~CubeTransit.create_from_cube + ~CubeTransit.create_update_route_card_dict + ~CubeTransit.cube_properties_to_standard_properties + ~CubeTransit.evaluate_differences + ~CubeTransit.evaluate_route_property_differences + ~CubeTransit.evaluate_route_shape_changes + ~CubeTransit.get_time_period_numbers_from_cube_properties + ~CubeTransit.unpack_route_name + + + + + + \ No newline at end of file diff --git a/branch/bart/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt b/branch/bart/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt new file mode 100644 index 0000000..29190d8 --- /dev/null +++ b/branch/bart/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt @@ -0,0 +1,90 @@ +lasso.ModelRoadwayNetwork +========================= + +.. currentmodule:: lasso + +.. autoclass:: ModelRoadwayNetwork + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~ModelRoadwayNetwork.__init__ + ~ModelRoadwayNetwork.add_counts + ~ModelRoadwayNetwork.add_incident_link_data_to_nodes + ~ModelRoadwayNetwork.add_new_roadway_feature_change + ~ModelRoadwayNetwork.add_variable_using_shst_reference + ~ModelRoadwayNetwork.addition_map + ~ModelRoadwayNetwork.apply + ~ModelRoadwayNetwork.apply_managed_lane_feature_change + ~ModelRoadwayNetwork.apply_python_calculation + ~ModelRoadwayNetwork.apply_roadway_feature_change + ~ModelRoadwayNetwork.assess_connectivity + ~ModelRoadwayNetwork.build_selection_key + ~ModelRoadwayNetwork.calculate_area_type + ~ModelRoadwayNetwork.calculate_centroidconnect + ~ModelRoadwayNetwork.calculate_county + ~ModelRoadwayNetwork.calculate_distance + ~ModelRoadwayNetwork.calculate_mpo + ~ModelRoadwayNetwork.calculate_use + ~ModelRoadwayNetwork.convert_int + ~ModelRoadwayNetwork.create_ML_variable + ~ModelRoadwayNetwork.create_calculated_variables + ~ModelRoadwayNetwork.create_dummy_connector_links + ~ModelRoadwayNetwork.create_hov_corridor_variable + ~ModelRoadwayNetwork.create_managed_lane_network + ~ModelRoadwayNetwork.create_managed_variable + ~ModelRoadwayNetwork.dataframe_to_fixed_width + ~ModelRoadwayNetwork.delete_roadway_feature_change + ~ModelRoadwayNetwork.deletion_map + ~ModelRoadwayNetwork.fill_na + ~ModelRoadwayNetwork.from_RoadwayNetwork + ~ModelRoadwayNetwork.get_attribute + ~ModelRoadwayNetwork.get_managed_lane_node_ids + ~ModelRoadwayNetwork.get_modal_graph + ~ModelRoadwayNetwork.get_modal_links_nodes + ~ModelRoadwayNetwork.get_property_by_time_period_and_group + ~ModelRoadwayNetwork.identify_segment + ~ModelRoadwayNetwork.identify_segment_endpoints + ~ModelRoadwayNetwork.is_network_connected + ~ModelRoadwayNetwork.load_transform_network + ~ModelRoadwayNetwork.network_connection_plot + ~ModelRoadwayNetwork.orig_dest_nodes_foreign_key + ~ModelRoadwayNetwork.ox_graph + ~ModelRoadwayNetwork.path_search + ~ModelRoadwayNetwork.read 
+ ~ModelRoadwayNetwork.read_match_result + ~ModelRoadwayNetwork.rename_variables_for_dbf + ~ModelRoadwayNetwork.roadway_net_to_gdf + ~ModelRoadwayNetwork.roadway_standard_to_met_council_network + ~ModelRoadwayNetwork.select_roadway_features + ~ModelRoadwayNetwork.selection_has_unique_link_id + ~ModelRoadwayNetwork.selection_map + ~ModelRoadwayNetwork.shortest_path + ~ModelRoadwayNetwork.split_properties_by_time_period_and_category + ~ModelRoadwayNetwork.update_distance + ~ModelRoadwayNetwork.validate_link_schema + ~ModelRoadwayNetwork.validate_node_schema + ~ModelRoadwayNetwork.validate_properties + ~ModelRoadwayNetwork.validate_selection + ~ModelRoadwayNetwork.validate_shape_schema + ~ModelRoadwayNetwork.validate_uniqueness + ~ModelRoadwayNetwork.write + ~ModelRoadwayNetwork.write_roadway_as_fixedwidth + ~ModelRoadwayNetwork.write_roadway_as_shp + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~ModelRoadwayNetwork.CALCULATED_VALUES + + \ No newline at end of file diff --git a/branch/bart/_sources/_generated/lasso.Parameters.rst.txt b/branch/bart/_sources/_generated/lasso.Parameters.rst.txt new file mode 100644 index 0000000..28d2c86 --- /dev/null +++ b/branch/bart/_sources/_generated/lasso.Parameters.rst.txt @@ -0,0 +1,31 @@ +lasso.Parameters +================ + +.. currentmodule:: lasso + +.. autoclass:: Parameters + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Parameters.__init__ + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Parameters.cube_time_periods + ~Parameters.properties_to_split + ~Parameters.county_link_range_dict + ~Parameters.zones + + \ No newline at end of file diff --git a/branch/bart/_sources/_generated/lasso.Project.rst.txt b/branch/bart/_sources/_generated/lasso.Project.rst.txt new file mode 100644 index 0000000..863945b --- /dev/null +++ b/branch/bart/_sources/_generated/lasso.Project.rst.txt @@ -0,0 +1,42 @@ +lasso.Project +============= + +.. currentmodule:: lasso + +.. autoclass:: Project + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Project.__init__ + ~Project.add_highway_changes + ~Project.add_transit_changes + ~Project.create_project + ~Project.determine_roadway_network_changes_compatibility + ~Project.emme_id_to_wrangler_id + ~Project.emme_name_to_wrangler_name + ~Project.evaluate_changes + ~Project.get_object_from_network_build_command + ~Project.get_operation_from_network_build_command + ~Project.read_logfile + ~Project.read_network_build_file + ~Project.write_project_card + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Project.CALCULATED_VALUES + ~Project.DEFAULT_PROJECT_NAME + ~Project.STATIC_VALUES + + \ No newline at end of file diff --git a/branch/bart/_sources/_generated/lasso.StandardTransit.rst.txt b/branch/bart/_sources/_generated/lasso.StandardTransit.rst.txt new file mode 100644 index 0000000..4fae048 --- /dev/null +++ b/branch/bart/_sources/_generated/lasso.StandardTransit.rst.txt @@ -0,0 +1,33 @@ +lasso.StandardTransit +===================== + +.. currentmodule:: lasso + +.. autoclass:: StandardTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. 
autosummary:: + + ~StandardTransit.__init__ + ~StandardTransit.calculate_cube_mode + ~StandardTransit.cube_format + ~StandardTransit.evaluate_differences + ~StandardTransit.fromTransitNetwork + ~StandardTransit.read_gtfs + ~StandardTransit.route_properties_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_dict_list + ~StandardTransit.shape_gtfs_to_emme + ~StandardTransit.time_to_cube_time_period + ~StandardTransit.write_as_cube_lin + + + + + + \ No newline at end of file diff --git a/branch/bart/_sources/_generated/lasso.logger.rst.txt b/branch/bart/_sources/_generated/lasso.logger.rst.txt new file mode 100644 index 0000000..2054273 --- /dev/null +++ b/branch/bart/_sources/_generated/lasso.logger.rst.txt @@ -0,0 +1,29 @@ +lasso.logger +============ + +.. automodule:: lasso.logger + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + setupLogging + + + + + + + + + + + + + diff --git a/branch/bart/_sources/_generated/lasso.util.rst.txt b/branch/bart/_sources/_generated/lasso.util.rst.txt new file mode 100644 index 0000000..95fecf8 --- /dev/null +++ b/branch/bart/_sources/_generated/lasso.util.rst.txt @@ -0,0 +1,35 @@ +lasso.util +========== + +.. automodule:: lasso.util + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + column_name_to_parts + create_locationreference + geodesic_point_buffer + get_shared_streets_intersection_hash + hhmmss_to_datetime + secs_to_datetime + shorten_name + + + + + + + + + + + + + diff --git a/branch/bart/_sources/autodoc.rst.txt b/branch/bart/_sources/autodoc.rst.txt new file mode 100644 index 0000000..7e48d58 --- /dev/null +++ b/branch/bart/_sources/autodoc.rst.txt @@ -0,0 +1,29 @@ +Lasso Classes and Functions +==================================== + +.. automodule:: lasso + :no-members: + :no-undoc-members: + :no-inherited-members: + :no-show-inheritance: + + +Base Classes +-------------- +.. autosummary:: + :toctree: _generated + :nosignatures: + + CubeTransit + StandardTransit + ModelRoadwayNetwork + Project + Parameters + +Utils and Functions +-------------------- +.. autosummary:: + :toctree: _generated + + util + logger diff --git a/branch/bart/_sources/index.rst.txt b/branch/bart/_sources/index.rst.txt new file mode 100644 index 0000000..616853c --- /dev/null +++ b/branch/bart/_sources/index.rst.txt @@ -0,0 +1,36 @@ +.. lasso documentation master file, created by + sphinx-quickstart on Thu Dec 5 15:43:28 2019. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. + +Welcome to lasso's documentation! +================================= + +This package of utilities is a wrapper around the +`network_wrangler `_ package +for MetCouncil and MTC. It aims to have the following functionality: + +1. parse Cube log files and base highway networks and create ProjectCards + for Network Wrangler +2. parse two Cube transit line files and create ProjectCards for NetworkWrangler +3. refine Network Wrangler highway networks to contain specific variables and + settings for the respective agency and export them to a format that can + be read in by Citilab's Cube software. + +.. 
toctree:: + :maxdepth: 3 + :caption: Contents: + + starting + setup + running + autodoc + + + +Indices and tables +================== + +* :ref:`genindex` +* :ref:`modindex` +* :ref:`search` diff --git a/branch/bart/_sources/running.md.txt b/branch/bart/_sources/running.md.txt new file mode 100644 index 0000000..e139dc8 --- /dev/null +++ b/branch/bart/_sources/running.md.txt @@ -0,0 +1,12 @@ +# Running Lasso + +## Create project files + + +## Create a scenario + + +## Exporting networks + + +## Auditing and Reporting diff --git a/branch/bart/_sources/setup.md.txt b/branch/bart/_sources/setup.md.txt new file mode 100644 index 0000000..e77d463 --- /dev/null +++ b/branch/bart/_sources/setup.md.txt @@ -0,0 +1,9 @@ +# Setup + +### Projects + +### Parameters + +### Settings + +### Additional Data Files diff --git a/branch/bart/_sources/starting.md.txt b/branch/bart/_sources/starting.md.txt new file mode 100644 index 0000000..8886f95 --- /dev/null +++ b/branch/bart/_sources/starting.md.txt @@ -0,0 +1,292 @@ +# Starting Out + +## Installation + +If you are managing multiple python versions, we suggest using [`virtualenv`](https://virtualenv.pypa.io/en/latest/) or [`conda`](https://conda.io/en/latest/) virtual environments. + +Example using a conda environment (recommended) and using the package manager [pip](https://pip.pypa.io/en/stable/) to install Lasso from the source on GitHub. + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/Lasso@master +``` + +Lasso will install `network_wrangler` from the [PyPi](https://pypi.org/project/network-wrangler/) repository because it is included in Lasso's `requirements.txt`. + +#### Bleeding Edge +If you want to install a more up-to-date or development version of network wrangler and lasso , you can do so by installing it from the `develop` branch of + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/network_wrangler@develop +pip install git+https://github.com/wsp-sag/Lasso@develop +``` + +#### From Clone +If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in [editable mode](https://pip.pypa.io/en/stable/reference/pip_install/?highlight=editable#editable-installs). + +**if you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!** + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas osmnx -n +conda activate +git clone https://github.com/wsp-sag/Lasso +git clone https://github.com/wsp-sag/network_wrangler +cd network_wrangler +pip install -e . +cd .. +cd Lasso +pip install -e . +``` + +Notes: + +1. The -e installs it in editable mode. +2. If you are not part of the project team and want to contribute code bxack to the project, please fork before you clone and then add the original repository to your upstream origin list per [these directions on github](https://help.github.com/en/articles/fork-a-repo). +3. if you wanted to install from a specific tag/version number or branch, replace `@master` with `@` or `@tag` +4. 
If you want to make use of frequent developer updates for network wrangler as well, you can also install it from clone by copying the instructions for cloning and installing Lasso for Network Wrangler + +If you are going to be doing Lasso development, we also recommend: + - a good IDE such as [Atom](http://atom.io), VS Code, Sublime Text, etc. + with Python syntax highlighting turned on. + - [GitHub Desktop](https://desktop.github.com/) to locally update your clones + +## Brief Intro + +Lasso is a 'wrapper' around the [Network Wrangler](http://wsp-sag.github.io/network_wrangler) utility. + +Both Lasso and NetworkWrangler are built around the following data schemas: + - [`roadway network`], which is based on a mashup of Open Street Map and [Shared Streets](http://sharedstreets.io). In Network Wrangler these are read in from three json files reprsenting: links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category. + - [`transit network`], which is based on a frequency-based implementation of the csv-based GTFS; and + - [`project card`], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml. + +In addition, Lasso utilizes the following data schemas: + + - [`MetCouncil Model Roadway Network Schema`], which adds data fields to the `roadway network` schema that MetCouncil uses in their travel model including breaking out data fields by time period. + - [`MetCouncil Model Transit Network Schema`], which uses the Cube PublicTransport format, and + - [`Cube Log Files`], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler. + - [`Cube public transport line files`], which define a set of transit lines in the cube software. + +### Components +Network Wrangler has the following atomic parts: + + - _RoadwayNetwork_ object, which represents the `roadway network` data as GeoDataFrames; + - _TransitNetwork_ object, which represents the `transit network` data as DataFrames; + - _ProjectCard_ object, which represents the data of the `project card`. Project cards identify the infrastructure that is changing (a selection) and defines the changes; or contains information about a new facility to be constructed or a new service to be run.; + - _Scenario_ object, which consist of at least a RoadwayNetwork, and +TransitNetwork. Scenarios can be based on or tiered from other scenarios. +Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network. + +In addition, Lasso has the following atomic parts: + + - _Project_ object, creates project cards from one of the following: a base and a build transit network in cube format, a base and build highway network, or a base highway network and a Cube log file. + - _ModelRoadwayNetwork_ object is a subclass of `RoadwayNetwork` and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube. + - _StandardTransit_, an object for holding a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files. + - _CubeTransit_, an object for storing information about transit defined in `Cube public transport line files` + . 
Has the capability to parse cube line file properties and shapes into python dictionaries and compare line files and represent changes as Project Card dictionaries. + - _Parameters_, A class representing all the parameters defining the networks + including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + +#### RoadwayNetwork + +Reads, writes, queries and and manipulates roadway network data, which +is mainly stored in the GeoDataFrames `links_df`, `nodes_df`, and `shapes_df`. + +```python +net = RoadwayNetwork.read( + link_filename=MY_LINK_FILE, + node_filename=MY_NODE_FILE, + shape_filename=MY_SHAPE_FILE, + shape_foreign_key ='shape_id', + + ) +my_selection = { + "link": [{"name": ["I 35E"]}], + "A": {"osm_node_id": "961117623"}, # start searching for segments at A + "B": {"osm_node_id": "2564047368"}, +} +net.select_roadway_features(my_selection) + +my_change = [ + { + 'property': 'lanes', + 'existing': 1, + 'set': 2, + }, + { + 'property': 'drive_access', + 'set': 0, + }, +] + +my_net.apply_roadway_feature_change( + my_net.select_roadway_features(my_selection), + my_change +) + +ml_net = net.create_managed_lane_network(in_place=False) + +ml_net.is_network_connected(mode="drive")) + +_, disconnected_nodes = ml_net.assess_connectivity( + mode="walk", + ignore_end_nodes=True +) +ml_net.write(filename=my_out_prefix, path=my_dir) +``` +#### TransitNetwork + +#### ProjectCard + +#### Scenario + +Manages sets of project cards and tiering from a base scenario/set of networks. + +```python + +my_base_scenario = { + "road_net": RoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ), + "transit_net": TransitNetwork.read(STPAUL_DIR), +} + +card_filenames = [ + "3_multiple_roadway_attribute_change.yml", + "multiple_changes.yml", + "4_simple_managed_lane.yml", +] + +project_card_directory = os.path.join(STPAUL_DIR, "project_cards") + +project_cards_list = [ + ProjectCard.read(os.path.join(project_card_directory, filename), validate=False) + for filename in card_filenames +] + +my_scenario = Scenario.create_scenario( + base_scenario=my_base_scenario, + project_cards_list=project_cards_list, +) +my_scenario.check_scenario_requisites() + +my_scenario.apply_all_projects() + +my_scenario.scenario_summary() +``` + +#### Project +Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network; + +```python + +test_project = Project.create_project( + base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + +test_project.evaluate_changes() + +test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + +``` + +#### ModelRoadwayNetwork +A subclass of network_wrangler's RoadwayNetwork +class which additional understanding about how to translate and write the +network out to the MetCouncil Roadway Network schema. 
+ +```Python + +net = ModelRoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ) + +net.write_roadway_as_fixedwidth() + +``` + +#### StandardTransit +Translates the standard GTFS data to MetCouncil's Cube Line files. + +```Python +cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) +cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) +``` + +#### CubeTransit +Used by the project class and has the capability to: + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + +```python +tn = CubeTransit.create_from_cube(CUBE_DIR) +transit_change_list = tn.evaluate_differences(base_transit_network) +``` + +#### Parameters +Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary. + +```Python +# read parameters from a yaml configuration file +# could also provide as a key/value pair +with open(config_file) as f: + my_config = yaml.safe_load(f) + +# provide parameters at instantiation of ModelRoadwayNetwork +model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork( + my_scenario.road_net, parameters=my_config.get("my_parameters", {}) + ) +# network written with direction from the parameters given +model_road_net.write_roadway_as_shp() + +``` + +### Typical Workflow + +Workflows in Lasso and Network Wrangler typically accomplish one of two goals: +1. Create Project Cards to document network changes as a result of either transit or roadway projects. +2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network. + +#### Project Cards from Transit LIN Files + + +#### Project Cards from Cube LOG Files + + +#### Model Network Files for a Scenario + + + +## Running Quickstart Jupyter Notebooks + +To learn basic lasso functionality, please refer to the following jupyter notebooks in the `/notebooks` directory: + + - `Lasso Project Card Creation Quickstart.ipynb` + - `Lasso Scenario Creation Quickstart.ipynb` + + Jupyter notebooks can be started by activating the lasso conda environment and typing `jupyter notebook`: + + ```bash + conda activate + jupyter notebook + ``` diff --git a/branch/bart/_static/_sphinx_javascript_frameworks_compat.js b/branch/bart/_static/_sphinx_javascript_frameworks_compat.js new file mode 100644 index 0000000..8549469 --- /dev/null +++ b/branch/bart/_static/_sphinx_javascript_frameworks_compat.js @@ -0,0 +1,134 @@ +/* + * _sphinx_javascript_frameworks_compat.js + * ~~~~~~~~~~ + * + * Compatability shim for jQuery and underscores.js. + * + * WILL BE REMOVED IN Sphinx 6.0 + * xref RemovedInSphinx60Warning + * + */ + +/** + * select a different prefix for underscore + */ +$u = _.noConflict(); + + +/** + * small helper function to urldecode strings + * + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent#Decoding_query_parameters_from_a_URL + */ +jQuery.urldecode = function(x) { + if (!x) { + return x + } + return decodeURIComponent(x.replace(/\+/g, ' ')); +}; + +/** + * small helper function to urlencode strings + */ +jQuery.urlencode = encodeURIComponent; + +/** + * This function returns the parsed url parameters of the + * current request. Multiple values per key are supported, + * it will always return arrays of strings for the value parts. 
+ */ +jQuery.getQueryParameters = function(s) { + if (typeof s === 'undefined') + s = document.location.search; + var parts = s.substr(s.indexOf('?') + 1).split('&'); + var result = {}; + for (var i = 0; i < parts.length; i++) { + var tmp = parts[i].split('=', 2); + var key = jQuery.urldecode(tmp[0]); + var value = jQuery.urldecode(tmp[1]); + if (key in result) + result[key].push(value); + else + result[key] = [value]; + } + return result; +}; + +/** + * highlight a given string on a jquery object by wrapping it in + * span elements with the given class name. + */ +jQuery.fn.highlightText = function(text, className) { + function highlight(node, addItems) { + if (node.nodeType === 3) { + var val = node.nodeValue; + var pos = val.toLowerCase().indexOf(text); + if (pos >= 0 && + !jQuery(node.parentNode).hasClass(className) && + !jQuery(node.parentNode).hasClass("nohighlight")) { + var span; + var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.className = className; + } + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + node.parentNode.insertBefore(span, node.parentNode.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling)); + node.nodeValue = val.substr(0, pos); + if (isInSVG) { + var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect"); + var bbox = node.parentElement.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute('class', className); + addItems.push({ + "parent": node.parentNode, + "target": rect}); + } + } + } + else if (!jQuery(node).is("button, select, textarea")) { + jQuery.each(node.childNodes, function() { + highlight(this, addItems); + }); + } + } + var addItems = []; + var result = this.each(function() { + highlight(this, addItems); + }); + for (var i = 0; i < addItems.length; ++i) { + jQuery(addItems[i].parent).before(addItems[i].target); + } + return result; +}; + +/* + * backward compatibility for jQuery.browser + * This will be supported until firefox bug is fixed. + */ +if (!jQuery.browser) { + jQuery.uaMatch = function(ua) { + ua = ua.toLowerCase(); + + var match = /(chrome)[ \/]([\w.]+)/.exec(ua) || + /(webkit)[ \/]([\w.]+)/.exec(ua) || + /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) || + /(msie) ([\w.]+)/.exec(ua) || + ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) || + []; + + return { + browser: match[ 1 ] || "", + version: match[ 2 ] || "0" + }; + }; + jQuery.browser = {}; + jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true; +} diff --git a/branch/bart/_static/basic.css b/branch/bart/_static/basic.css new file mode 100644 index 0000000..eeb0519 --- /dev/null +++ b/branch/bart/_static/basic.css @@ -0,0 +1,899 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2022 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. 
+ * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 230px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li p.context { + color: #888; + margin: 2px 0 0 30px; + text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + 
+div.body { + min-width: 360px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} +a.brackets:before, +span.brackets > a:before{ + content: "["; +} + +a.brackets:after, +span.brackets > a:after { + content: "]"; +} + + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, figure.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, figure.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, figure.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, figure.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar, +aside.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ +div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +aside.sidebar > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +aside.sidebar::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + border-left: 0; + border-right: 
0; + border-bottom: 1px solid #aaa; +} + +th { + text-align: left; + padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure, figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption, figcaption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number, +figcaption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text, +figcaption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + +/* -- object description styles --------------------------------------------- */ + +.sig { + font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; +} + +.sig-name, code.descname { + background-color: transparent; + font-weight: bold; +} + +.sig-name { + font-size: 1.1em; +} + +code.descname { + font-size: 1.2em; +} + +.sig-prename, code.descclassname { + background-color: transparent; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.sig-param.n { + font-style: italic; +} + +/* C++ specific styling */ + +.sig-inline.c-texpr, +.sig-inline.cpp-texpr { + font-family: unset; +} + +.sig.c .k, .sig.c .kt, +.sig.cpp .k, .sig.cpp .kt { + color: #0033B3; +} + +.sig.c .m, +.sig.cpp .m { + color: #1750EB; +} + +.sig.c .s, .sig.c .sc, +.sig.cpp .s, .sig.cpp .sc { + color: #067D17; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} +dl.footnote > dt, +dl.citation > dt { + float: left; + margin-right: 0.5em; +} + +dl.footnote > dd, +dl.citation > dd { + margin-bottom: 0em; +} + +dl.footnote > dd:after, +dl.citation > dd:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; + word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} +dl.field-list > dt:after { + content: ":"; +} + + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 
15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: oblique; +} + +.classifier:before { + font-style: normal; + margin: 0 0.5em; + content: ":"; + display: inline-block; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; + white-space: nowrap; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + +td.linenos pre { + border: 0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +span.linenos, +div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; + -webkit-user-select: text; /* Safari fallback only */ + -webkit-user-select: none; /* Chrome/Safari */ + -moz-user-select: none; /* Firefox */ + -ms-user-select: none; /* IE10+ */ +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: -1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + +span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + 
+@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/branch/bart/_static/css/badge_only.css b/branch/bart/_static/css/badge_only.css new file mode 100644 index 0000000..c718cee --- /dev/null +++ b/branch/bart/_static/css/badge_only.css @@ -0,0 +1 @@ +.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-style:normal;font-weight:400;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#FontAwesome) format("svg")}.fa:before{font-family:FontAwesome;font-style:normal;font-weight:400;line-height:1}.fa:before,a .fa{text-decoration:inherit}.fa:before,a .fa,li .fa{display:inline-block}li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before,.icon-book:before{content:"\f02d"}.fa-caret-down:before,.icon-caret-down:before{content:"\f0d7"}.fa-caret-up:before,.icon-caret-up:before{content:"\f0d8"}.fa-caret-left:before,.icon-caret-left:before{content:"\f0d9"}.fa-caret-right:before,.icon-caret-right:before{content:"\f0da"}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60}.rst-versions .rst-current-version:after{clear:both;content:"";display:block}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 
6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}} \ No newline at end of file diff --git a/branch/bart/_static/css/fonts/Roboto-Slab-Bold.woff b/branch/bart/_static/css/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and b/branch/bart/_static/css/fonts/Roboto-Slab-Bold.woff differ diff --git a/branch/bart/_static/css/fonts/Roboto-Slab-Bold.woff2 b/branch/bart/_static/css/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/branch/bart/_static/css/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/branch/bart/_static/css/fonts/Roboto-Slab-Regular.woff b/branch/bart/_static/css/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/branch/bart/_static/css/fonts/Roboto-Slab-Regular.woff differ diff --git a/branch/bart/_static/css/fonts/Roboto-Slab-Regular.woff2 b/branch/bart/_static/css/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary files /dev/null and b/branch/bart/_static/css/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/branch/bart/_static/css/fonts/fontawesome-webfont.eot b/branch/bart/_static/css/fonts/fontawesome-webfont.eot new file mode 100644 index 0000000..e9f60ca Binary files /dev/null and b/branch/bart/_static/css/fonts/fontawesome-webfont.eot differ diff --git a/branch/bart/_static/css/fonts/fontawesome-webfont.svg b/branch/bart/_static/css/fonts/fontawesome-webfont.svg new file mode 100644 index 0000000..855c845 --- /dev/null +++ b/branch/bart/_static/css/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/branch/bart/_static/css/fonts/fontawesome-webfont.ttf b/branch/bart/_static/css/fonts/fontawesome-webfont.ttf new file mode 100644 index 0000000..35acda2 Binary files /dev/null and b/branch/bart/_static/css/fonts/fontawesome-webfont.ttf differ diff --git a/branch/bart/_static/css/fonts/fontawesome-webfont.woff b/branch/bart/_static/css/fonts/fontawesome-webfont.woff new file mode 100644 index 0000000..400014a Binary files /dev/null and b/branch/bart/_static/css/fonts/fontawesome-webfont.woff differ diff --git a/branch/bart/_static/css/fonts/fontawesome-webfont.woff2 b/branch/bart/_static/css/fonts/fontawesome-webfont.woff2 new file mode 100644 index 0000000..4d13fc6 Binary files /dev/null and b/branch/bart/_static/css/fonts/fontawesome-webfont.woff2 differ diff --git a/branch/bart/_static/css/fonts/lato-bold-italic.woff b/branch/bart/_static/css/fonts/lato-bold-italic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/branch/bart/_static/css/fonts/lato-bold-italic.woff differ diff --git a/branch/bart/_static/css/fonts/lato-bold-italic.woff2 b/branch/bart/_static/css/fonts/lato-bold-italic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/branch/bart/_static/css/fonts/lato-bold-italic.woff2 differ diff --git a/branch/bart/_static/css/fonts/lato-bold.woff b/branch/bart/_static/css/fonts/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/branch/bart/_static/css/fonts/lato-bold.woff differ diff --git a/branch/bart/_static/css/fonts/lato-bold.woff2 b/branch/bart/_static/css/fonts/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and b/branch/bart/_static/css/fonts/lato-bold.woff2 differ diff --git a/branch/bart/_static/css/fonts/lato-normal-italic.woff b/branch/bart/_static/css/fonts/lato-normal-italic.woff new file mode 100644 index 0000000..76114bc Binary files /dev/null and b/branch/bart/_static/css/fonts/lato-normal-italic.woff differ diff --git a/branch/bart/_static/css/fonts/lato-normal-italic.woff2 
b/branch/bart/_static/css/fonts/lato-normal-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/branch/bart/_static/css/fonts/lato-normal-italic.woff2 differ diff --git a/branch/bart/_static/css/fonts/lato-normal.woff b/branch/bart/_static/css/fonts/lato-normal.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and b/branch/bart/_static/css/fonts/lato-normal.woff differ diff --git a/branch/bart/_static/css/fonts/lato-normal.woff2 b/branch/bart/_static/css/fonts/lato-normal.woff2 new file mode 100644 index 0000000..3bf9843 Binary files /dev/null and b/branch/bart/_static/css/fonts/lato-normal.woff2 differ diff --git a/branch/bart/_static/css/theme.css b/branch/bart/_static/css/theme.css new file mode 100644 index 0000000..09a1af8 --- /dev/null +++ b/branch/bart/_static/css/theme.css @@ -0,0 +1,4 @@ +html{box-sizing:border-box}*,:after,:before{box-sizing:inherit}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}[hidden],audio:not([controls]){display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:active,a:hover{outline:0}abbr[title]{border-bottom:1px dotted}b,strong{font-weight:700}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;text-decoration:none}ins,mark{color:#000}mark{background:#ff0;font-style:italic;font-weight:700}.rst-content code,.rst-content tt,code,kbd,pre,samp{font-family:monospace,serif;_font-family:courier new,monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:after,q:before{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sup{top:-.5em}sub{bottom:-.25em}dl,ol,ul{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure,form{margin:0}label{cursor:pointer}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type=button],input[type=reset],input[type=submit]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type=search]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}textarea{resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:.2em 0;background:#ccc;color:#000;padding:.2em 0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir br{display:none}.hidden{display:none!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media print{body,html,section{background:none!important}*{box-shadow:none!important;text-shadow:none!important;filter:none!important;-ms-filter:none!important}a,a:visited{text-decoration:underline}.ir 
a:after,a[href^="#"]:after,a[href^="javascript:"]:after{content:""}blockquote,pre{page-break-inside:avoid}thead{display:table-header-group}img,tr{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}.rst-content .toctree-wrapper>p.caption,h2,h3,p{orphans:3;widows:3}.rst-content .toctree-wrapper>p.caption,h2,h3{page-break-after:avoid}}.btn,.fa:before,.icon:before,.rst-content .admonition,.rst-content .admonition-title:before,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .code-block-caption .headerlink:before,.rst-content .danger,.rst-content .eqno .headerlink:before,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-alert,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before,input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week],select,textarea{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}/*! 
+ * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome + * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License) + */@font-face{font-family:FontAwesome;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713);src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix&v=4.7.0) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#fontawesomeregular) format("svg");font-weight:400;font-style:normal}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{display:inline-block;font:normal normal normal 14px/1 FontAwesome;font-size:inherit;text-rendering:auto;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.fa-lg{font-size:1.33333em;line-height:.75em;vertical-align:-15%}.fa-2x{font-size:2em}.fa-3x{font-size:3em}.fa-4x{font-size:4em}.fa-5x{font-size:5em}.fa-fw{width:1.28571em;text-align:center}.fa-ul{padding-left:0;margin-left:2.14286em;list-style-type:none}.fa-ul>li{position:relative}.fa-li{position:absolute;left:-2.14286em;width:2.14286em;top:.14286em;text-align:center}.fa-li.fa-lg{left:-1.85714em}.fa-border{padding:.2em .25em .15em;border:.08em solid #eee;border-radius:.1em}.fa-pull-left{float:left}.fa-pull-right{float:right}.fa-pull-left.icon,.fa.fa-pull-left,.rst-content .code-block-caption .fa-pull-left.headerlink,.rst-content .eqno .fa-pull-left.headerlink,.rst-content .fa-pull-left.admonition-title,.rst-content code.download span.fa-pull-left:first-child,.rst-content dl dt .fa-pull-left.headerlink,.rst-content h1 .fa-pull-left.headerlink,.rst-content h2 .fa-pull-left.headerlink,.rst-content h3 .fa-pull-left.headerlink,.rst-content h4 .fa-pull-left.headerlink,.rst-content h5 .fa-pull-left.headerlink,.rst-content h6 .fa-pull-left.headerlink,.rst-content p .fa-pull-left.headerlink,.rst-content table>caption .fa-pull-left.headerlink,.rst-content tt.download span.fa-pull-left:first-child,.wy-menu-vertical li.current>a button.fa-pull-left.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-left.toctree-expand,.wy-menu-vertical li button.fa-pull-left.toctree-expand{margin-right:.3em}.fa-pull-right.icon,.fa.fa-pull-right,.rst-content .code-block-caption .fa-pull-right.headerlink,.rst-content .eqno .fa-pull-right.headerlink,.rst-content .fa-pull-right.admonition-title,.rst-content code.download span.fa-pull-right:first-child,.rst-content dl dt .fa-pull-right.headerlink,.rst-content h1 .fa-pull-right.headerlink,.rst-content h2 .fa-pull-right.headerlink,.rst-content h3 .fa-pull-right.headerlink,.rst-content h4 .fa-pull-right.headerlink,.rst-content h5 .fa-pull-right.headerlink,.rst-content h6 
.fa-pull-right.headerlink,.rst-content p .fa-pull-right.headerlink,.rst-content table>caption .fa-pull-right.headerlink,.rst-content tt.download span.fa-pull-right:first-child,.wy-menu-vertical li.current>a button.fa-pull-right.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-right.toctree-expand,.wy-menu-vertical li button.fa-pull-right.toctree-expand{margin-left:.3em}.pull-right{float:right}.pull-left{float:left}.fa.pull-left,.pull-left.icon,.rst-content .code-block-caption .pull-left.headerlink,.rst-content .eqno .pull-left.headerlink,.rst-content .pull-left.admonition-title,.rst-content code.download span.pull-left:first-child,.rst-content dl dt .pull-left.headerlink,.rst-content h1 .pull-left.headerlink,.rst-content h2 .pull-left.headerlink,.rst-content h3 .pull-left.headerlink,.rst-content h4 .pull-left.headerlink,.rst-content h5 .pull-left.headerlink,.rst-content h6 .pull-left.headerlink,.rst-content p .pull-left.headerlink,.rst-content table>caption .pull-left.headerlink,.rst-content tt.download span.pull-left:first-child,.wy-menu-vertical li.current>a button.pull-left.toctree-expand,.wy-menu-vertical li.on a button.pull-left.toctree-expand,.wy-menu-vertical li button.pull-left.toctree-expand{margin-right:.3em}.fa.pull-right,.pull-right.icon,.rst-content .code-block-caption .pull-right.headerlink,.rst-content .eqno .pull-right.headerlink,.rst-content .pull-right.admonition-title,.rst-content code.download span.pull-right:first-child,.rst-content dl dt .pull-right.headerlink,.rst-content h1 .pull-right.headerlink,.rst-content h2 .pull-right.headerlink,.rst-content h3 .pull-right.headerlink,.rst-content h4 .pull-right.headerlink,.rst-content h5 .pull-right.headerlink,.rst-content h6 .pull-right.headerlink,.rst-content p .pull-right.headerlink,.rst-content table>caption .pull-right.headerlink,.rst-content tt.download span.pull-right:first-child,.wy-menu-vertical li.current>a button.pull-right.toctree-expand,.wy-menu-vertical li.on a button.pull-right.toctree-expand,.wy-menu-vertical li button.pull-right.toctree-expand{margin-left:.3em}.fa-spin{-webkit-animation:fa-spin 2s linear infinite;animation:fa-spin 2s linear infinite}.fa-pulse{-webkit-animation:fa-spin 1s steps(8) infinite;animation:fa-spin 1s steps(8) infinite}@-webkit-keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}.fa-rotate-90{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=1)";-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.fa-rotate-180{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2)";-webkit-transform:rotate(180deg);-ms-transform:rotate(180deg);transform:rotate(180deg)}.fa-rotate-270{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=3)";-webkit-transform:rotate(270deg);-ms-transform:rotate(270deg);transform:rotate(270deg)}.fa-flip-horizontal{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=0, mirror=1)";-webkit-transform:scaleX(-1);-ms-transform:scaleX(-1);transform:scaleX(-1)}.fa-flip-vertical{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2, mirror=1)";-webkit-transform:scaleY(-1);-ms-transform:scaleY(-1);transform:scaleY(-1)}:root .fa-flip-horizontal,:root .fa-flip-vertical,:root .fa-rotate-90,:root .fa-rotate-180,:root 
.fa-rotate-270{filter:none}.fa-stack{position:relative;display:inline-block;width:2em;height:2em;line-height:2em;vertical-align:middle}.fa-stack-1x,.fa-stack-2x{position:absolute;left:0;width:100%;text-align:center}.fa-stack-1x{line-height:inherit}.fa-stack-2x{font-size:2em}.fa-inverse{color:#fff}.fa-glass:before{content:""}.fa-music:before{content:""}.fa-search:before,.icon-search:before{content:""}.fa-envelope-o:before{content:""}.fa-heart:before{content:""}.fa-star:before{content:""}.fa-star-o:before{content:""}.fa-user:before{content:""}.fa-film:before{content:""}.fa-th-large:before{content:""}.fa-th:before{content:""}.fa-th-list:before{content:""}.fa-check:before{content:""}.fa-close:before,.fa-remove:before,.fa-times:before{content:""}.fa-search-plus:before{content:""}.fa-search-minus:before{content:""}.fa-power-off:before{content:""}.fa-signal:before{content:""}.fa-cog:before,.fa-gear:before{content:""}.fa-trash-o:before{content:""}.fa-home:before,.icon-home:before{content:""}.fa-file-o:before{content:""}.fa-clock-o:before{content:""}.fa-road:before{content:""}.fa-download:before,.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{content:""}.fa-arrow-circle-o-down:before{content:""}.fa-arrow-circle-o-up:before{content:""}.fa-inbox:before{content:""}.fa-play-circle-o:before{content:""}.fa-repeat:before,.fa-rotate-right:before{content:""}.fa-refresh:before{content:""}.fa-list-alt:before{content:""}.fa-lock:before{content:""}.fa-flag:before{content:""}.fa-headphones:before{content:""}.fa-volume-off:before{content:""}.fa-volume-down:before{content:""}.fa-volume-up:before{content:""}.fa-qrcode:before{content:""}.fa-barcode:before{content:""}.fa-tag:before{content:""}.fa-tags:before{content:""}.fa-book:before,.icon-book:before{content:""}.fa-bookmark:before{content:""}.fa-print:before{content:""}.fa-camera:before{content:""}.fa-font:before{content:""}.fa-bold:before{content:""}.fa-italic:before{content:""}.fa-text-height:before{content:""}.fa-text-width:before{content:""}.fa-align-left:before{content:""}.fa-align-center:before{content:""}.fa-align-right:before{content:""}.fa-align-justify:before{content:""}.fa-list:before{content:""}.fa-dedent:before,.fa-outdent:before{content:""}.fa-indent:before{content:""}.fa-video-camera:before{content:""}.fa-image:before,.fa-photo:before,.fa-picture-o:before{content:""}.fa-pencil:before{content:""}.fa-map-marker:before{content:""}.fa-adjust:before{content:""}.fa-tint:before{content:""}.fa-edit:before,.fa-pencil-square-o:before{content:""}.fa-share-square-o:before{content:""}.fa-check-square-o:before{content:""}.fa-arrows:before{content:""}.fa-step-backward:before{content:""}.fa-fast-backward:before{content:""}.fa-backward:before{content:""}.fa-play:before{content:""}.fa-pause:before{content:""}.fa-stop:before{content:""}.fa-forward:before{content:""}.fa-fast-forward:before{content:""}.fa-step-forward:before{content:""}.fa-eject:before{content:""}.fa-chevron-left:before{content:""}.fa-chevron-right:before{content:""}.fa-plus-circle:before{content:""}.fa-minus-circle:before{content:""}.fa-times-circle:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before{content:""}.fa-check-circle:before,.wy-inline-validate.wy-inline-validate-success 
.wy-input-context:before{content:""}.fa-question-circle:before{content:""}.fa-info-circle:before{content:""}.fa-crosshairs:before{content:""}.fa-times-circle-o:before{content:""}.fa-check-circle-o:before{content:""}.fa-ban:before{content:""}.fa-arrow-left:before{content:""}.fa-arrow-right:before{content:""}.fa-arrow-up:before{content:""}.fa-arrow-down:before{content:""}.fa-mail-forward:before,.fa-share:before{content:""}.fa-expand:before{content:""}.fa-compress:before{content:""}.fa-plus:before{content:""}.fa-minus:before{content:""}.fa-asterisk:before{content:""}.fa-exclamation-circle:before,.rst-content .admonition-title:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before{content:""}.fa-gift:before{content:""}.fa-leaf:before{content:""}.fa-fire:before,.icon-fire:before{content:""}.fa-eye:before{content:""}.fa-eye-slash:before{content:""}.fa-exclamation-triangle:before,.fa-warning:before{content:""}.fa-plane:before{content:""}.fa-calendar:before{content:""}.fa-random:before{content:""}.fa-comment:before{content:""}.fa-magnet:before{content:""}.fa-chevron-up:before{content:""}.fa-chevron-down:before{content:""}.fa-retweet:before{content:""}.fa-shopping-cart:before{content:""}.fa-folder:before{content:""}.fa-folder-open:before{content:""}.fa-arrows-v:before{content:""}.fa-arrows-h:before{content:""}.fa-bar-chart-o:before,.fa-bar-chart:before{content:""}.fa-twitter-square:before{content:""}.fa-facebook-square:before{content:""}.fa-camera-retro:before{content:""}.fa-key:before{content:""}.fa-cogs:before,.fa-gears:before{content:""}.fa-comments:before{content:""}.fa-thumbs-o-up:before{content:""}.fa-thumbs-o-down:before{content:""}.fa-star-half:before{content:""}.fa-heart-o:before{content:""}.fa-sign-out:before{content:""}.fa-linkedin-square:before{content:""}.fa-thumb-tack:before{content:""}.fa-external-link:before{content:""}.fa-sign-in:before{content:""}.fa-trophy:before{content:""}.fa-github-square:before{content:""}.fa-upload:before{content:""}.fa-lemon-o:before{content:""}.fa-phone:before{content:""}.fa-square-o:before{content:""}.fa-bookmark-o:before{content:""}.fa-phone-square:before{content:""}.fa-twitter:before{content:""}.fa-facebook-f:before,.fa-facebook:before{content:""}.fa-github:before,.icon-github:before{content:""}.fa-unlock:before{content:""}.fa-credit-card:before{content:""}.fa-feed:before,.fa-rss:before{content:""}.fa-hdd-o:before{content:""}.fa-bullhorn:before{content:""}.fa-bell:before{content:""}.fa-certificate:before{content:""}.fa-hand-o-right:before{content:""}.fa-hand-o-left:before{content:""}.fa-hand-o-up:before{content:""}.fa-hand-o-down:before{content:""}.fa-arrow-circle-left:before,.icon-circle-arrow-left:before{content:""}.fa-arrow-circle-right:before,.icon-circle-arrow-right:before{content:""}.fa-arrow-circle-up:before{content:""}.fa-arrow-circle-down:before{content:""}.fa-globe:before{content:""}.fa-wrench:before{content:""}.fa-tasks:before{content:""}.fa-filter:before{content:""}.fa-briefcase:before{content:""}.fa-arrows-alt:before{content:""}.fa-group:before,.fa-users:before{content:""}.fa-chain:before,.fa-link:before,.icon-link:before{content:""}.fa-cloud:before{content:""}.fa-flask:before{content:""}.fa-cut:before,.fa-scissors:before{content:""}.fa-copy:before,.fa-files-o:before{content:""}.fa-paperclip:before{content:""}.fa-floppy-o:before,.fa-save:before{content:""}.fa
-square:before{content:""}.fa-bars:before,.fa-navicon:before,.fa-reorder:before{content:""}.fa-list-ul:before{content:""}.fa-list-ol:before{content:""}.fa-strikethrough:before{content:""}.fa-underline:before{content:""}.fa-table:before{content:""}.fa-magic:before{content:""}.fa-truck:before{content:""}.fa-pinterest:before{content:""}.fa-pinterest-square:before{content:""}.fa-google-plus-square:before{content:""}.fa-google-plus:before{content:""}.fa-money:before{content:""}.fa-caret-down:before,.icon-caret-down:before,.wy-dropdown .caret:before{content:""}.fa-caret-up:before{content:""}.fa-caret-left:before{content:""}.fa-caret-right:before{content:""}.fa-columns:before{content:""}.fa-sort:before,.fa-unsorted:before{content:""}.fa-sort-desc:before,.fa-sort-down:before{content:""}.fa-sort-asc:before,.fa-sort-up:before{content:""}.fa-envelope:before{content:""}.fa-linkedin:before{content:""}.fa-rotate-left:before,.fa-undo:before{content:""}.fa-gavel:before,.fa-legal:before{content:""}.fa-dashboard:before,.fa-tachometer:before{content:""}.fa-comment-o:before{content:""}.fa-comments-o:before{content:""}.fa-bolt:before,.fa-flash:before{content:""}.fa-sitemap:before{content:""}.fa-umbrella:before{content:""}.fa-clipboard:before,.fa-paste:before{content:""}.fa-lightbulb-o:before{content:""}.fa-exchange:before{content:""}.fa-cloud-download:before{content:""}.fa-cloud-upload:before{content:""}.fa-user-md:before{content:""}.fa-stethoscope:before{content:""}.fa-suitcase:before{content:""}.fa-bell-o:before{content:""}.fa-coffee:before{content:""}.fa-cutlery:before{content:""}.fa-file-text-o:before{content:""}.fa-building-o:before{content:""}.fa-hospital-o:before{content:""}.fa-ambulance:before{content:""}.fa-medkit:before{content:""}.fa-fighter-jet:before{content:""}.fa-beer:before{content:""}.fa-h-square:before{content:""}.fa-plus-square:before{content:""}.fa-angle-double-left:before{content:""}.fa-angle-double-right:before{content:""}.fa-angle-double-up:before{content:""}.fa-angle-double-down:before{content:""}.fa-angle-left:before{content:""}.fa-angle-right:before{content:""}.fa-angle-up:before{content:""}.fa-angle-down:before{content:""}.fa-desktop:before{content:""}.fa-laptop:before{content:""}.fa-tablet:before{content:""}.fa-mobile-phone:before,.fa-mobile:before{content:""}.fa-circle-o:before{content:""}.fa-quote-left:before{content:""}.fa-quote-right:before{content:""}.fa-spinner:before{content:""}.fa-circle:before{content:""}.fa-mail-reply:before,.fa-reply:before{content:""}.fa-github-alt:before{content:""}.fa-folder-o:before{content:""}.fa-folder-open-o:before{content:""}.fa-smile-o:before{content:""}.fa-frown-o:before{content:""}.fa-meh-o:before{content:""}.fa-gamepad:before{content:""}.fa-keyboard-o:before{content:""}.fa-flag-o:before{content:""}.fa-flag-checkered:before{content:""}.fa-terminal:before{content:""}.fa-code:before{content:""}.fa-mail-reply-all:before,.fa-reply-all:before{content:""}.fa-star-half-empty:before,.fa-star-half-full:before,.fa-star-half-o:before{content:""}.fa-location-arrow:before{content:""}.fa-crop:before{content:""}.fa-code-fork:before{content:""}.fa-chain-broken:before,.fa-unlink:before{content:""}.fa-question:before{content:""}.fa-info:before{content:""}.fa-exclamation:before{content:""}.fa-superscript:before{content:""}.fa-subscript:before{content:""}.fa-eraser:before{content:""}.fa-puzzle-piece:before{content:""}.fa-microphone:before{content:""}.fa-microphone-slash:
before{content:""}.fa-shield:before{content:""}.fa-calendar-o:before{content:""}.fa-fire-extinguisher:before{content:""}.fa-rocket:before{content:""}.fa-maxcdn:before{content:""}.fa-chevron-circle-left:before{content:""}.fa-chevron-circle-right:before{content:""}.fa-chevron-circle-up:before{content:""}.fa-chevron-circle-down:before{content:""}.fa-html5:before{content:""}.fa-css3:before{content:""}.fa-anchor:before{content:""}.fa-unlock-alt:before{content:""}.fa-bullseye:before{content:""}.fa-ellipsis-h:before{content:""}.fa-ellipsis-v:before{content:""}.fa-rss-square:before{content:""}.fa-play-circle:before{content:""}.fa-ticket:before{content:""}.fa-minus-square:before{content:""}.fa-minus-square-o:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before{content:""}.fa-level-up:before{content:""}.fa-level-down:before{content:""}.fa-check-square:before{content:""}.fa-pencil-square:before{content:""}.fa-external-link-square:before{content:""}.fa-share-square:before{content:""}.fa-compass:before{content:""}.fa-caret-square-o-down:before,.fa-toggle-down:before{content:""}.fa-caret-square-o-up:before,.fa-toggle-up:before{content:""}.fa-caret-square-o-right:before,.fa-toggle-right:before{content:""}.fa-eur:before,.fa-euro:before{content:""}.fa-gbp:before{content:""}.fa-dollar:before,.fa-usd:before{content:""}.fa-inr:before,.fa-rupee:before{content:""}.fa-cny:before,.fa-jpy:before,.fa-rmb:before,.fa-yen:before{content:""}.fa-rouble:before,.fa-rub:before,.fa-ruble:before{content:""}.fa-krw:before,.fa-won:before{content:""}.fa-bitcoin:before,.fa-btc:before{content:""}.fa-file:before{content:""}.fa-file-text:before{content:""}.fa-sort-alpha-asc:before{content:""}.fa-sort-alpha-desc:before{content:""}.fa-sort-amount-asc:before{content:""}.fa-sort-amount-desc:before{content:""}.fa-sort-numeric-asc:before{content:""}.fa-sort-numeric-desc:before{content:""}.fa-thumbs-up:before{content:""}.fa-thumbs-down:before{content:""}.fa-youtube-square:before{content:""}.fa-youtube:before{content:""}.fa-xing:before{content:""}.fa-xing-square:before{content:""}.fa-youtube-play:before{content:""}.fa-dropbox:before{content:""}.fa-stack-overflow:before{content:""}.fa-instagram:before{content:""}.fa-flickr:before{content:""}.fa-adn:before{content:""}.fa-bitbucket:before,.icon-bitbucket:before{content:""}.fa-bitbucket-square:before{content:""}.fa-tumblr:before{content:""}.fa-tumblr-square:before{content:""}.fa-long-arrow-down:before{content:""}.fa-long-arrow-up:before{content:""}.fa-long-arrow-left:before{content:""}.fa-long-arrow-right:before{content:""}.fa-apple:before{content:""}.fa-windows:before{content:""}.fa-android:before{content:""}.fa-linux:before{content:""}.fa-dribbble:before{content:""}.fa-skype:before{content:""}.fa-foursquare:before{content:""}.fa-trello:before{content:""}.fa-female:before{content:""}.fa-male:before{content:""}.fa-gittip:before,.fa-gratipay:before{content:""}.fa-sun-o:before{content:""}.fa-moon-o:before{content:""}.fa-archive:before{content:""}.fa-bug:before{content:""}.fa-vk:before{content:""}.fa-weibo:before{content:""}.fa-renren:before{content:""}.fa-pagelines:before{content:""}.fa-stack-exchange:before{content:""}.fa-arrow-circle-o-right:before{content:""}.fa-arrow-circle-o-left:before{content:""}.fa-caret-square-o-left:before,.fa-toggle-left:before{content:""}.fa-dot-circle-o:before{content:""}.fa-wheelchair:before{content:""}.fa-
vimeo-square:before{content:""}.fa-try:before,.fa-turkish-lira:before{content:""}.fa-plus-square-o:before,.wy-menu-vertical li button.toctree-expand:before{content:""}.fa-space-shuttle:before{content:""}.fa-slack:before{content:""}.fa-envelope-square:before{content:""}.fa-wordpress:before{content:""}.fa-openid:before{content:""}.fa-bank:before,.fa-institution:before,.fa-university:before{content:""}.fa-graduation-cap:before,.fa-mortar-board:before{content:""}.fa-yahoo:before{content:""}.fa-google:before{content:""}.fa-reddit:before{content:""}.fa-reddit-square:before{content:""}.fa-stumbleupon-circle:before{content:""}.fa-stumbleupon:before{content:""}.fa-delicious:before{content:""}.fa-digg:before{content:""}.fa-pied-piper-pp:before{content:""}.fa-pied-piper-alt:before{content:""}.fa-drupal:before{content:""}.fa-joomla:before{content:""}.fa-language:before{content:""}.fa-fax:before{content:""}.fa-building:before{content:""}.fa-child:before{content:""}.fa-paw:before{content:""}.fa-spoon:before{content:""}.fa-cube:before{content:""}.fa-cubes:before{content:""}.fa-behance:before{content:""}.fa-behance-square:before{content:""}.fa-steam:before{content:""}.fa-steam-square:before{content:""}.fa-recycle:before{content:""}.fa-automobile:before,.fa-car:before{content:""}.fa-cab:before,.fa-taxi:before{content:""}.fa-tree:before{content:""}.fa-spotify:before{content:""}.fa-deviantart:before{content:""}.fa-soundcloud:before{content:""}.fa-database:before{content:""}.fa-file-pdf-o:before{content:""}.fa-file-word-o:before{content:""}.fa-file-excel-o:before{content:""}.fa-file-powerpoint-o:before{content:""}.fa-file-image-o:before,.fa-file-photo-o:before,.fa-file-picture-o:before{content:""}.fa-file-archive-o:before,.fa-file-zip-o:before{content:""}.fa-file-audio-o:before,.fa-file-sound-o:before{content:""}.fa-file-movie-o:before,.fa-file-video-o:before{content:""}.fa-file-code-o:before{content:""}.fa-vine:before{content:""}.fa-codepen:before{content:""}.fa-jsfiddle:before{content:""}.fa-life-bouy:before,.fa-life-buoy:before,.fa-life-ring:before,.fa-life-saver:before,.fa-support:before{content:""}.fa-circle-o-notch:before{content:""}.fa-ra:before,.fa-rebel:before,.fa-resistance:before{content:""}.fa-empire:before,.fa-ge:before{content:""}.fa-git-square:before{content:""}.fa-git:before{content:""}.fa-hacker-news:before,.fa-y-combinator-square:before,.fa-yc-square:before{content:""}.fa-tencent-weibo:before{content:""}.fa-qq:before{content:""}.fa-wechat:before,.fa-weixin:before{content:""}.fa-paper-plane:before,.fa-send:before{content:""}.fa-paper-plane-o:before,.fa-send-o:before{content:""}.fa-history:before{content:""}.fa-circle-thin:before{content:""}.fa-header:before{content:""}.fa-paragraph:before{content:""}.fa-sliders:before{content:""}.fa-share-alt:before{content:""}.fa-share-alt-square:before{content:""}.fa-bomb:before{content:""}.fa-futbol-o:before,.fa-soccer-ball-o:before{content:""}.fa-tty:before{content:""}.fa-binoculars:before{content:""}.fa-plug:before{content:""}.fa-slideshare:before{content:""}.fa-twitch:before{content:""}.fa-yelp:before{content:""}.fa-newspaper-o:before{content:""}.fa-wifi:before{content:""}.fa-calculator:before{content:""}.fa-paypal:before{content:""}.fa-google-wallet:before{content:""}.fa-cc-visa:before{content:""}.fa-cc-mastercard:before{content:""}.fa-cc-discover:before{content:""}.fa-cc-amex:before{content:""}.fa-cc-paypal:before{content:""}.fa-cc-stripe:before{content:""}.fa-b
ell-slash:before{content:""}.fa-bell-slash-o:before{content:""}.fa-trash:before{content:""}.fa-copyright:before{content:""}.fa-at:before{content:""}.fa-eyedropper:before{content:""}.fa-paint-brush:before{content:""}.fa-birthday-cake:before{content:""}.fa-area-chart:before{content:""}.fa-pie-chart:before{content:""}.fa-line-chart:before{content:""}.fa-lastfm:before{content:""}.fa-lastfm-square:before{content:""}.fa-toggle-off:before{content:""}.fa-toggle-on:before{content:""}.fa-bicycle:before{content:""}.fa-bus:before{content:""}.fa-ioxhost:before{content:""}.fa-angellist:before{content:""}.fa-cc:before{content:""}.fa-ils:before,.fa-shekel:before,.fa-sheqel:before{content:""}.fa-meanpath:before{content:""}.fa-buysellads:before{content:""}.fa-connectdevelop:before{content:""}.fa-dashcube:before{content:""}.fa-forumbee:before{content:""}.fa-leanpub:before{content:""}.fa-sellsy:before{content:""}.fa-shirtsinbulk:before{content:""}.fa-simplybuilt:before{content:""}.fa-skyatlas:before{content:""}.fa-cart-plus:before{content:""}.fa-cart-arrow-down:before{content:""}.fa-diamond:before{content:""}.fa-ship:before{content:""}.fa-user-secret:before{content:""}.fa-motorcycle:before{content:""}.fa-street-view:before{content:""}.fa-heartbeat:before{content:""}.fa-venus:before{content:""}.fa-mars:before{content:""}.fa-mercury:before{content:""}.fa-intersex:before,.fa-transgender:before{content:""}.fa-transgender-alt:before{content:""}.fa-venus-double:before{content:""}.fa-mars-double:before{content:""}.fa-venus-mars:before{content:""}.fa-mars-stroke:before{content:""}.fa-mars-stroke-v:before{content:""}.fa-mars-stroke-h:before{content:""}.fa-neuter:before{content:""}.fa-genderless:before{content:""}.fa-facebook-official:before{content:""}.fa-pinterest-p:before{content:""}.fa-whatsapp:before{content:""}.fa-server:before{content:""}.fa-user-plus:before{content:""}.fa-user-times:before{content:""}.fa-bed:before,.fa-hotel:before{content:""}.fa-viacoin:before{content:""}.fa-train:before{content:""}.fa-subway:before{content:""}.fa-medium:before{content:""}.fa-y-combinator:before,.fa-yc:before{content:""}.fa-optin-monster:before{content:""}.fa-opencart:before{content:""}.fa-expeditedssl:before{content:""}.fa-battery-4:before,.fa-battery-full:before,.fa-battery:before{content:""}.fa-battery-3:before,.fa-battery-three-quarters:before{content:""}.fa-battery-2:before,.fa-battery-half:before{content:""}.fa-battery-1:before,.fa-battery-quarter:before{content:""}.fa-battery-0:before,.fa-battery-empty:before{content:""}.fa-mouse-pointer:before{content:""}.fa-i-cursor:before{content:""}.fa-object-group:before{content:""}.fa-object-ungroup:before{content:""}.fa-sticky-note:before{content:""}.fa-sticky-note-o:before{content:""}.fa-cc-jcb:before{content:""}.fa-cc-diners-club:before{content:""}.fa-clone:before{content:""}.fa-balance-scale:before{content:""}.fa-hourglass-o:before{content:""}.fa-hourglass-1:before,.fa-hourglass-start:before{content:""}.fa-hourglass-2:before,.fa-hourglass-half:before{content:""}.fa-hourglass-3:before,.fa-hourglass-end:before{content:""}.fa-hourglass:before{content:""}.fa-hand-grab-o:before,.fa-hand-rock-o:before{content:""}.fa-hand-paper-o:before,.fa-hand-stop-o:before{content:""}.fa-hand-scissors-o:before{content:""}.fa-hand-lizard-o:before{content:""}.fa-hand-spock-o:before{content:""}.fa-hand-pointer-o:before{content:""}.fa-hand-peace-o:before{content:""}.fa-trademark:before{content:""}.fa-register
ed:before{content:""}.fa-creative-commons:before{content:""}.fa-gg:before{content:""}.fa-gg-circle:before{content:""}.fa-tripadvisor:before{content:""}.fa-odnoklassniki:before{content:""}.fa-odnoklassniki-square:before{content:""}.fa-get-pocket:before{content:""}.fa-wikipedia-w:before{content:""}.fa-safari:before{content:""}.fa-chrome:before{content:""}.fa-firefox:before{content:""}.fa-opera:before{content:""}.fa-internet-explorer:before{content:""}.fa-television:before,.fa-tv:before{content:""}.fa-contao:before{content:""}.fa-500px:before{content:""}.fa-amazon:before{content:""}.fa-calendar-plus-o:before{content:""}.fa-calendar-minus-o:before{content:""}.fa-calendar-times-o:before{content:""}.fa-calendar-check-o:before{content:""}.fa-industry:before{content:""}.fa-map-pin:before{content:""}.fa-map-signs:before{content:""}.fa-map-o:before{content:""}.fa-map:before{content:""}.fa-commenting:before{content:""}.fa-commenting-o:before{content:""}.fa-houzz:before{content:""}.fa-vimeo:before{content:""}.fa-black-tie:before{content:""}.fa-fonticons:before{content:""}.fa-reddit-alien:before{content:""}.fa-edge:before{content:""}.fa-credit-card-alt:before{content:""}.fa-codiepie:before{content:""}.fa-modx:before{content:""}.fa-fort-awesome:before{content:""}.fa-usb:before{content:""}.fa-product-hunt:before{content:""}.fa-mixcloud:before{content:""}.fa-scribd:before{content:""}.fa-pause-circle:before{content:""}.fa-pause-circle-o:before{content:""}.fa-stop-circle:before{content:""}.fa-stop-circle-o:before{content:""}.fa-shopping-bag:before{content:""}.fa-shopping-basket:before{content:""}.fa-hashtag:before{content:""}.fa-bluetooth:before{content:""}.fa-bluetooth-b:before{content:""}.fa-percent:before{content:""}.fa-gitlab:before,.icon-gitlab:before{content:""}.fa-wpbeginner:before{content:""}.fa-wpforms:before{content:""}.fa-envira:before{content:""}.fa-universal-access:before{content:""}.fa-wheelchair-alt:before{content:""}.fa-question-circle-o:before{content:""}.fa-blind:before{content:""}.fa-audio-description:before{content:""}.fa-volume-control-phone:before{content:""}.fa-braille:before{content:""}.fa-assistive-listening-systems:before{content:""}.fa-american-sign-language-interpreting:before,.fa-asl-interpreting:before{content:""}.fa-deaf:before,.fa-deafness:before,.fa-hard-of-hearing:before{content:""}.fa-glide:before{content:""}.fa-glide-g:before{content:""}.fa-sign-language:before,.fa-signing:before{content:""}.fa-low-vision:before{content:""}.fa-viadeo:before{content:""}.fa-viadeo-square:before{content:""}.fa-snapchat:before{content:""}.fa-snapchat-ghost:before{content:""}.fa-snapchat-square:before{content:""}.fa-pied-piper:before{content:""}.fa-first-order:before{content:""}.fa-yoast:before{content:""}.fa-themeisle:before{content:""}.fa-google-plus-circle:before,.fa-google-plus-official:before{content:""}.fa-fa:before,.fa-font-awesome:before{content:""}.fa-handshake-o:before{content:""}.fa-envelope-open:before{content:""}.fa-envelope-open-o:before{content:""}.fa-linode:before{content:""}.fa-address-book:before{content:""}.fa-address-book-o:before{content:""}.fa-address-card:before,.fa-vcard:before{content:""}.fa-address-card-o:before,.fa-vcard-o:before{content:""}.fa-user-circle:before{content:""}.fa-user-circle-o:before{content:""}.fa-user-o:before{content:""}.fa-id-badge:before{content:""}.fa-drivers-license:before,.fa-id-card:before{content:""}.fa-drivers-license-o:before,.fa-id-card-o:before{c
ontent:""}.fa-quora:before{content:""}.fa-free-code-camp:before{content:""}.fa-telegram:before{content:""}.fa-thermometer-4:before,.fa-thermometer-full:before,.fa-thermometer:before{content:""}.fa-thermometer-3:before,.fa-thermometer-three-quarters:before{content:""}.fa-thermometer-2:before,.fa-thermometer-half:before{content:""}.fa-thermometer-1:before,.fa-thermometer-quarter:before{content:""}.fa-thermometer-0:before,.fa-thermometer-empty:before{content:""}.fa-shower:before{content:""}.fa-bath:before,.fa-bathtub:before,.fa-s15:before{content:""}.fa-podcast:before{content:""}.fa-window-maximize:before{content:""}.fa-window-minimize:before{content:""}.fa-window-restore:before{content:""}.fa-times-rectangle:before,.fa-window-close:before{content:""}.fa-times-rectangle-o:before,.fa-window-close-o:before{content:""}.fa-bandcamp:before{content:""}.fa-grav:before{content:""}.fa-etsy:before{content:""}.fa-imdb:before{content:""}.fa-ravelry:before{content:""}.fa-eercast:before{content:""}.fa-microchip:before{content:""}.fa-snowflake-o:before{content:""}.fa-superpowers:before{content:""}.fa-wpexplorer:before{content:""}.fa-meetup:before{content:""}.sr-only{position:absolute;width:1px;height:1px;padding:0;margin:-1px;overflow:hidden;clip:rect(0,0,0,0);border:0}.sr-only-focusable:active,.sr-only-focusable:focus{position:static;width:auto;height:auto;margin:0;overflow:visible;clip:auto}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-dropdown .caret,.wy-inline-validate.wy-inline-validate-danger .wy-input-context,.wy-inline-validate.wy-inline-validate-info .wy-input-context,.wy-inline-validate.wy-inline-validate-success .wy-input-context,.wy-inline-validate.wy-inline-validate-warning .wy-input-context,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{font-family:inherit}.fa:before,.icon:before,.rst-content .admonition-title:before,.rst-content .code-block-caption .headerlink:before,.rst-content .eqno .headerlink:before,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li 
button.toctree-expand:before{font-family:FontAwesome;display:inline-block;font-style:normal;font-weight:400;line-height:1;text-decoration:inherit}.rst-content .code-block-caption a .headerlink,.rst-content .eqno a .headerlink,.rst-content a .admonition-title,.rst-content code.download a span:first-child,.rst-content dl dt a .headerlink,.rst-content h1 a .headerlink,.rst-content h2 a .headerlink,.rst-content h3 a .headerlink,.rst-content h4 a .headerlink,.rst-content h5 a .headerlink,.rst-content h6 a .headerlink,.rst-content p.caption a .headerlink,.rst-content p a .headerlink,.rst-content table>caption a .headerlink,.rst-content tt.download a span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li a button.toctree-expand,a .fa,a .icon,a .rst-content .admonition-title,a .rst-content .code-block-caption .headerlink,a .rst-content .eqno .headerlink,a .rst-content code.download span:first-child,a .rst-content dl dt .headerlink,a .rst-content h1 .headerlink,a .rst-content h2 .headerlink,a .rst-content h3 .headerlink,a .rst-content h4 .headerlink,a .rst-content h5 .headerlink,a .rst-content h6 .headerlink,a .rst-content p.caption .headerlink,a .rst-content p .headerlink,a .rst-content table>caption .headerlink,a .rst-content tt.download span:first-child,a .wy-menu-vertical li button.toctree-expand{display:inline-block;text-decoration:inherit}.btn .fa,.btn .icon,.btn .rst-content .admonition-title,.btn .rst-content .code-block-caption .headerlink,.btn .rst-content .eqno .headerlink,.btn .rst-content code.download span:first-child,.btn .rst-content dl dt .headerlink,.btn .rst-content h1 .headerlink,.btn .rst-content h2 .headerlink,.btn .rst-content h3 .headerlink,.btn .rst-content h4 .headerlink,.btn .rst-content h5 .headerlink,.btn .rst-content h6 .headerlink,.btn .rst-content p .headerlink,.btn .rst-content table>caption .headerlink,.btn .rst-content tt.download span:first-child,.btn .wy-menu-vertical li.current>a button.toctree-expand,.btn .wy-menu-vertical li.on a button.toctree-expand,.btn .wy-menu-vertical li button.toctree-expand,.nav .fa,.nav .icon,.nav .rst-content .admonition-title,.nav .rst-content .code-block-caption .headerlink,.nav .rst-content .eqno .headerlink,.nav .rst-content code.download span:first-child,.nav .rst-content dl dt .headerlink,.nav .rst-content h1 .headerlink,.nav .rst-content h2 .headerlink,.nav .rst-content h3 .headerlink,.nav .rst-content h4 .headerlink,.nav .rst-content h5 .headerlink,.nav .rst-content h6 .headerlink,.nav .rst-content p .headerlink,.nav .rst-content table>caption .headerlink,.nav .rst-content tt.download span:first-child,.nav .wy-menu-vertical li.current>a button.toctree-expand,.nav .wy-menu-vertical li.on a button.toctree-expand,.nav .wy-menu-vertical li button.toctree-expand,.rst-content .btn .admonition-title,.rst-content .code-block-caption .btn .headerlink,.rst-content .code-block-caption .nav .headerlink,.rst-content .eqno .btn .headerlink,.rst-content .eqno .nav .headerlink,.rst-content .nav .admonition-title,.rst-content code.download .btn span:first-child,.rst-content code.download .nav span:first-child,.rst-content dl dt .btn .headerlink,.rst-content dl dt .nav .headerlink,.rst-content h1 .btn .headerlink,.rst-content h1 .nav .headerlink,.rst-content h2 .btn .headerlink,.rst-content h2 .nav .headerlink,.rst-content h3 .btn .headerlink,.rst-content h3 .nav .headerlink,.rst-content h4 .btn .headerlink,.rst-content h4 .nav .headerlink,.rst-content h5 .btn 
.headerlink,.rst-content h5 .nav .headerlink,.rst-content h6 .btn .headerlink,.rst-content h6 .nav .headerlink,.rst-content p .btn .headerlink,.rst-content p .nav .headerlink,.rst-content table>caption .btn .headerlink,.rst-content table>caption .nav .headerlink,.rst-content tt.download .btn span:first-child,.rst-content tt.download .nav span:first-child,.wy-menu-vertical li .btn button.toctree-expand,.wy-menu-vertical li.current>a .btn button.toctree-expand,.wy-menu-vertical li.current>a .nav button.toctree-expand,.wy-menu-vertical li .nav button.toctree-expand,.wy-menu-vertical li.on a .btn button.toctree-expand,.wy-menu-vertical li.on a .nav button.toctree-expand{display:inline}.btn .fa-large.icon,.btn .fa.fa-large,.btn .rst-content .code-block-caption .fa-large.headerlink,.btn .rst-content .eqno .fa-large.headerlink,.btn .rst-content .fa-large.admonition-title,.btn .rst-content code.download span.fa-large:first-child,.btn .rst-content dl dt .fa-large.headerlink,.btn .rst-content h1 .fa-large.headerlink,.btn .rst-content h2 .fa-large.headerlink,.btn .rst-content h3 .fa-large.headerlink,.btn .rst-content h4 .fa-large.headerlink,.btn .rst-content h5 .fa-large.headerlink,.btn .rst-content h6 .fa-large.headerlink,.btn .rst-content p .fa-large.headerlink,.btn .rst-content table>caption .fa-large.headerlink,.btn .rst-content tt.download span.fa-large:first-child,.btn .wy-menu-vertical li button.fa-large.toctree-expand,.nav .fa-large.icon,.nav .fa.fa-large,.nav .rst-content .code-block-caption .fa-large.headerlink,.nav .rst-content .eqno .fa-large.headerlink,.nav .rst-content .fa-large.admonition-title,.nav .rst-content code.download span.fa-large:first-child,.nav .rst-content dl dt .fa-large.headerlink,.nav .rst-content h1 .fa-large.headerlink,.nav .rst-content h2 .fa-large.headerlink,.nav .rst-content h3 .fa-large.headerlink,.nav .rst-content h4 .fa-large.headerlink,.nav .rst-content h5 .fa-large.headerlink,.nav .rst-content h6 .fa-large.headerlink,.nav .rst-content p .fa-large.headerlink,.nav .rst-content table>caption .fa-large.headerlink,.nav .rst-content tt.download span.fa-large:first-child,.nav .wy-menu-vertical li button.fa-large.toctree-expand,.rst-content .btn .fa-large.admonition-title,.rst-content .code-block-caption .btn .fa-large.headerlink,.rst-content .code-block-caption .nav .fa-large.headerlink,.rst-content .eqno .btn .fa-large.headerlink,.rst-content .eqno .nav .fa-large.headerlink,.rst-content .nav .fa-large.admonition-title,.rst-content code.download .btn span.fa-large:first-child,.rst-content code.download .nav span.fa-large:first-child,.rst-content dl dt .btn .fa-large.headerlink,.rst-content dl dt .nav .fa-large.headerlink,.rst-content h1 .btn .fa-large.headerlink,.rst-content h1 .nav .fa-large.headerlink,.rst-content h2 .btn .fa-large.headerlink,.rst-content h2 .nav .fa-large.headerlink,.rst-content h3 .btn .fa-large.headerlink,.rst-content h3 .nav .fa-large.headerlink,.rst-content h4 .btn .fa-large.headerlink,.rst-content h4 .nav .fa-large.headerlink,.rst-content h5 .btn .fa-large.headerlink,.rst-content h5 .nav .fa-large.headerlink,.rst-content h6 .btn .fa-large.headerlink,.rst-content h6 .nav .fa-large.headerlink,.rst-content p .btn .fa-large.headerlink,.rst-content p .nav .fa-large.headerlink,.rst-content table>caption .btn .fa-large.headerlink,.rst-content table>caption .nav .fa-large.headerlink,.rst-content tt.download .btn span.fa-large:first-child,.rst-content tt.download .nav span.fa-large:first-child,.wy-menu-vertical li .btn 
button.fa-large.toctree-expand,.wy-menu-vertical li .nav button.fa-large.toctree-expand{line-height:.9em}.btn .fa-spin.icon,.btn .fa.fa-spin,.btn .rst-content .code-block-caption .fa-spin.headerlink,.btn .rst-content .eqno .fa-spin.headerlink,.btn .rst-content .fa-spin.admonition-title,.btn .rst-content code.download span.fa-spin:first-child,.btn .rst-content dl dt .fa-spin.headerlink,.btn .rst-content h1 .fa-spin.headerlink,.btn .rst-content h2 .fa-spin.headerlink,.btn .rst-content h3 .fa-spin.headerlink,.btn .rst-content h4 .fa-spin.headerlink,.btn .rst-content h5 .fa-spin.headerlink,.btn .rst-content h6 .fa-spin.headerlink,.btn .rst-content p .fa-spin.headerlink,.btn .rst-content table>caption .fa-spin.headerlink,.btn .rst-content tt.download span.fa-spin:first-child,.btn .wy-menu-vertical li button.fa-spin.toctree-expand,.nav .fa-spin.icon,.nav .fa.fa-spin,.nav .rst-content .code-block-caption .fa-spin.headerlink,.nav .rst-content .eqno .fa-spin.headerlink,.nav .rst-content .fa-spin.admonition-title,.nav .rst-content code.download span.fa-spin:first-child,.nav .rst-content dl dt .fa-spin.headerlink,.nav .rst-content h1 .fa-spin.headerlink,.nav .rst-content h2 .fa-spin.headerlink,.nav .rst-content h3 .fa-spin.headerlink,.nav .rst-content h4 .fa-spin.headerlink,.nav .rst-content h5 .fa-spin.headerlink,.nav .rst-content h6 .fa-spin.headerlink,.nav .rst-content p .fa-spin.headerlink,.nav .rst-content table>caption .fa-spin.headerlink,.nav .rst-content tt.download span.fa-spin:first-child,.nav .wy-menu-vertical li button.fa-spin.toctree-expand,.rst-content .btn .fa-spin.admonition-title,.rst-content .code-block-caption .btn .fa-spin.headerlink,.rst-content .code-block-caption .nav .fa-spin.headerlink,.rst-content .eqno .btn .fa-spin.headerlink,.rst-content .eqno .nav .fa-spin.headerlink,.rst-content .nav .fa-spin.admonition-title,.rst-content code.download .btn span.fa-spin:first-child,.rst-content code.download .nav span.fa-spin:first-child,.rst-content dl dt .btn .fa-spin.headerlink,.rst-content dl dt .nav .fa-spin.headerlink,.rst-content h1 .btn .fa-spin.headerlink,.rst-content h1 .nav .fa-spin.headerlink,.rst-content h2 .btn .fa-spin.headerlink,.rst-content h2 .nav .fa-spin.headerlink,.rst-content h3 .btn .fa-spin.headerlink,.rst-content h3 .nav .fa-spin.headerlink,.rst-content h4 .btn .fa-spin.headerlink,.rst-content h4 .nav .fa-spin.headerlink,.rst-content h5 .btn .fa-spin.headerlink,.rst-content h5 .nav .fa-spin.headerlink,.rst-content h6 .btn .fa-spin.headerlink,.rst-content h6 .nav .fa-spin.headerlink,.rst-content p .btn .fa-spin.headerlink,.rst-content p .nav .fa-spin.headerlink,.rst-content table>caption .btn .fa-spin.headerlink,.rst-content table>caption .nav .fa-spin.headerlink,.rst-content tt.download .btn span.fa-spin:first-child,.rst-content tt.download .nav span.fa-spin:first-child,.wy-menu-vertical li .btn button.fa-spin.toctree-expand,.wy-menu-vertical li .nav button.fa-spin.toctree-expand{display:inline-block}.btn.fa:before,.btn.icon:before,.rst-content .btn.admonition-title:before,.rst-content .code-block-caption .btn.headerlink:before,.rst-content .eqno .btn.headerlink:before,.rst-content code.download span.btn:first-child:before,.rst-content dl dt .btn.headerlink:before,.rst-content h1 .btn.headerlink:before,.rst-content h2 .btn.headerlink:before,.rst-content h3 .btn.headerlink:before,.rst-content h4 .btn.headerlink:before,.rst-content h5 .btn.headerlink:before,.rst-content h6 .btn.headerlink:before,.rst-content p .btn.headerlink:before,.rst-content table>caption 
.btn.headerlink:before,.rst-content tt.download span.btn:first-child:before,.wy-menu-vertical li button.btn.toctree-expand:before{opacity:.5;-webkit-transition:opacity .05s ease-in;-moz-transition:opacity .05s ease-in;transition:opacity .05s ease-in}.btn.fa:hover:before,.btn.icon:hover:before,.rst-content .btn.admonition-title:hover:before,.rst-content .code-block-caption .btn.headerlink:hover:before,.rst-content .eqno .btn.headerlink:hover:before,.rst-content code.download span.btn:first-child:hover:before,.rst-content dl dt .btn.headerlink:hover:before,.rst-content h1 .btn.headerlink:hover:before,.rst-content h2 .btn.headerlink:hover:before,.rst-content h3 .btn.headerlink:hover:before,.rst-content h4 .btn.headerlink:hover:before,.rst-content h5 .btn.headerlink:hover:before,.rst-content h6 .btn.headerlink:hover:before,.rst-content p .btn.headerlink:hover:before,.rst-content table>caption .btn.headerlink:hover:before,.rst-content tt.download span.btn:first-child:hover:before,.wy-menu-vertical li button.btn.toctree-expand:hover:before{opacity:1}.btn-mini .fa:before,.btn-mini .icon:before,.btn-mini .rst-content .admonition-title:before,.btn-mini .rst-content .code-block-caption .headerlink:before,.btn-mini .rst-content .eqno .headerlink:before,.btn-mini .rst-content code.download span:first-child:before,.btn-mini .rst-content dl dt .headerlink:before,.btn-mini .rst-content h1 .headerlink:before,.btn-mini .rst-content h2 .headerlink:before,.btn-mini .rst-content h3 .headerlink:before,.btn-mini .rst-content h4 .headerlink:before,.btn-mini .rst-content h5 .headerlink:before,.btn-mini .rst-content h6 .headerlink:before,.btn-mini .rst-content p .headerlink:before,.btn-mini .rst-content table>caption .headerlink:before,.btn-mini .rst-content tt.download span:first-child:before,.btn-mini .wy-menu-vertical li button.toctree-expand:before,.rst-content .btn-mini .admonition-title:before,.rst-content .code-block-caption .btn-mini .headerlink:before,.rst-content .eqno .btn-mini .headerlink:before,.rst-content code.download .btn-mini span:first-child:before,.rst-content dl dt .btn-mini .headerlink:before,.rst-content h1 .btn-mini .headerlink:before,.rst-content h2 .btn-mini .headerlink:before,.rst-content h3 .btn-mini .headerlink:before,.rst-content h4 .btn-mini .headerlink:before,.rst-content h5 .btn-mini .headerlink:before,.rst-content h6 .btn-mini .headerlink:before,.rst-content p .btn-mini .headerlink:before,.rst-content table>caption .btn-mini .headerlink:before,.rst-content tt.download .btn-mini span:first-child:before,.wy-menu-vertical li .btn-mini button.toctree-expand:before{font-size:14px;vertical-align:-15%}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.wy-alert{padding:12px;line-height:24px;margin-bottom:24px;background:#e7f2fa}.rst-content .admonition-title,.wy-alert-title{font-weight:700;display:block;color:#fff;background:#6ab0de;padding:6px 12px;margin:-12px -12px 12px}.rst-content .danger,.rst-content .error,.rst-content .wy-alert-danger.admonition,.rst-content .wy-alert-danger.admonition-todo,.rst-content .wy-alert-danger.attention,.rst-content .wy-alert-danger.caution,.rst-content .wy-alert-danger.hint,.rst-content .wy-alert-danger.important,.rst-content .wy-alert-danger.note,.rst-content .wy-alert-danger.seealso,.rst-content .wy-alert-danger.tip,.rst-content 
.wy-alert-danger.warning,.wy-alert.wy-alert-danger{background:#fdf3f2}.rst-content .danger .admonition-title,.rst-content .danger .wy-alert-title,.rst-content .error .admonition-title,.rst-content .error .wy-alert-title,.rst-content .wy-alert-danger.admonition-todo .admonition-title,.rst-content .wy-alert-danger.admonition-todo .wy-alert-title,.rst-content .wy-alert-danger.admonition .admonition-title,.rst-content .wy-alert-danger.admonition .wy-alert-title,.rst-content .wy-alert-danger.attention .admonition-title,.rst-content .wy-alert-danger.attention .wy-alert-title,.rst-content .wy-alert-danger.caution .admonition-title,.rst-content .wy-alert-danger.caution .wy-alert-title,.rst-content .wy-alert-danger.hint .admonition-title,.rst-content .wy-alert-danger.hint .wy-alert-title,.rst-content .wy-alert-danger.important .admonition-title,.rst-content .wy-alert-danger.important .wy-alert-title,.rst-content .wy-alert-danger.note .admonition-title,.rst-content .wy-alert-danger.note .wy-alert-title,.rst-content .wy-alert-danger.seealso .admonition-title,.rst-content .wy-alert-danger.seealso .wy-alert-title,.rst-content .wy-alert-danger.tip .admonition-title,.rst-content .wy-alert-danger.tip .wy-alert-title,.rst-content .wy-alert-danger.warning .admonition-title,.rst-content .wy-alert-danger.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-danger .admonition-title,.wy-alert.wy-alert-danger .rst-content .admonition-title,.wy-alert.wy-alert-danger .wy-alert-title{background:#f29f97}.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .warning,.rst-content .wy-alert-warning.admonition,.rst-content .wy-alert-warning.danger,.rst-content .wy-alert-warning.error,.rst-content .wy-alert-warning.hint,.rst-content .wy-alert-warning.important,.rst-content .wy-alert-warning.note,.rst-content .wy-alert-warning.seealso,.rst-content .wy-alert-warning.tip,.wy-alert.wy-alert-warning{background:#ffedcc}.rst-content .admonition-todo .admonition-title,.rst-content .admonition-todo .wy-alert-title,.rst-content .attention .admonition-title,.rst-content .attention .wy-alert-title,.rst-content .caution .admonition-title,.rst-content .caution .wy-alert-title,.rst-content .warning .admonition-title,.rst-content .warning .wy-alert-title,.rst-content .wy-alert-warning.admonition .admonition-title,.rst-content .wy-alert-warning.admonition .wy-alert-title,.rst-content .wy-alert-warning.danger .admonition-title,.rst-content .wy-alert-warning.danger .wy-alert-title,.rst-content .wy-alert-warning.error .admonition-title,.rst-content .wy-alert-warning.error .wy-alert-title,.rst-content .wy-alert-warning.hint .admonition-title,.rst-content .wy-alert-warning.hint .wy-alert-title,.rst-content .wy-alert-warning.important .admonition-title,.rst-content .wy-alert-warning.important .wy-alert-title,.rst-content .wy-alert-warning.note .admonition-title,.rst-content .wy-alert-warning.note .wy-alert-title,.rst-content .wy-alert-warning.seealso .admonition-title,.rst-content .wy-alert-warning.seealso .wy-alert-title,.rst-content .wy-alert-warning.tip .admonition-title,.rst-content .wy-alert-warning.tip .wy-alert-title,.rst-content .wy-alert.wy-alert-warning .admonition-title,.wy-alert.wy-alert-warning .rst-content .admonition-title,.wy-alert.wy-alert-warning .wy-alert-title{background:#f0b37e}.rst-content .note,.rst-content .seealso,.rst-content .wy-alert-info.admonition,.rst-content .wy-alert-info.admonition-todo,.rst-content .wy-alert-info.attention,.rst-content .wy-alert-info.caution,.rst-content 
.wy-alert-info.danger,.rst-content .wy-alert-info.error,.rst-content .wy-alert-info.hint,.rst-content .wy-alert-info.important,.rst-content .wy-alert-info.tip,.rst-content .wy-alert-info.warning,.wy-alert.wy-alert-info{background:#e7f2fa}.rst-content .note .admonition-title,.rst-content .note .wy-alert-title,.rst-content .seealso .admonition-title,.rst-content .seealso .wy-alert-title,.rst-content .wy-alert-info.admonition-todo .admonition-title,.rst-content .wy-alert-info.admonition-todo .wy-alert-title,.rst-content .wy-alert-info.admonition .admonition-title,.rst-content .wy-alert-info.admonition .wy-alert-title,.rst-content .wy-alert-info.attention .admonition-title,.rst-content .wy-alert-info.attention .wy-alert-title,.rst-content .wy-alert-info.caution .admonition-title,.rst-content .wy-alert-info.caution .wy-alert-title,.rst-content .wy-alert-info.danger .admonition-title,.rst-content .wy-alert-info.danger .wy-alert-title,.rst-content .wy-alert-info.error .admonition-title,.rst-content .wy-alert-info.error .wy-alert-title,.rst-content .wy-alert-info.hint .admonition-title,.rst-content .wy-alert-info.hint .wy-alert-title,.rst-content .wy-alert-info.important .admonition-title,.rst-content .wy-alert-info.important .wy-alert-title,.rst-content .wy-alert-info.tip .admonition-title,.rst-content .wy-alert-info.tip .wy-alert-title,.rst-content .wy-alert-info.warning .admonition-title,.rst-content .wy-alert-info.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-info .admonition-title,.wy-alert.wy-alert-info .rst-content .admonition-title,.wy-alert.wy-alert-info .wy-alert-title{background:#6ab0de}.rst-content .hint,.rst-content .important,.rst-content .tip,.rst-content .wy-alert-success.admonition,.rst-content .wy-alert-success.admonition-todo,.rst-content .wy-alert-success.attention,.rst-content .wy-alert-success.caution,.rst-content .wy-alert-success.danger,.rst-content .wy-alert-success.error,.rst-content .wy-alert-success.note,.rst-content .wy-alert-success.seealso,.rst-content .wy-alert-success.warning,.wy-alert.wy-alert-success{background:#dbfaf4}.rst-content .hint .admonition-title,.rst-content .hint .wy-alert-title,.rst-content .important .admonition-title,.rst-content .important .wy-alert-title,.rst-content .tip .admonition-title,.rst-content .tip .wy-alert-title,.rst-content .wy-alert-success.admonition-todo .admonition-title,.rst-content .wy-alert-success.admonition-todo .wy-alert-title,.rst-content .wy-alert-success.admonition .admonition-title,.rst-content .wy-alert-success.admonition .wy-alert-title,.rst-content .wy-alert-success.attention .admonition-title,.rst-content .wy-alert-success.attention .wy-alert-title,.rst-content .wy-alert-success.caution .admonition-title,.rst-content .wy-alert-success.caution .wy-alert-title,.rst-content .wy-alert-success.danger .admonition-title,.rst-content .wy-alert-success.danger .wy-alert-title,.rst-content .wy-alert-success.error .admonition-title,.rst-content .wy-alert-success.error .wy-alert-title,.rst-content .wy-alert-success.note .admonition-title,.rst-content .wy-alert-success.note .wy-alert-title,.rst-content .wy-alert-success.seealso .admonition-title,.rst-content .wy-alert-success.seealso .wy-alert-title,.rst-content .wy-alert-success.warning .admonition-title,.rst-content .wy-alert-success.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-success .admonition-title,.wy-alert.wy-alert-success .rst-content .admonition-title,.wy-alert.wy-alert-success .wy-alert-title{background:#1abc9c}.rst-content 
.wy-alert-neutral.admonition,.rst-content .wy-alert-neutral.admonition-todo,.rst-content .wy-alert-neutral.attention,.rst-content .wy-alert-neutral.caution,.rst-content .wy-alert-neutral.danger,.rst-content .wy-alert-neutral.error,.rst-content .wy-alert-neutral.hint,.rst-content .wy-alert-neutral.important,.rst-content .wy-alert-neutral.note,.rst-content .wy-alert-neutral.seealso,.rst-content .wy-alert-neutral.tip,.rst-content .wy-alert-neutral.warning,.wy-alert.wy-alert-neutral{background:#f3f6f6}.rst-content .wy-alert-neutral.admonition-todo .admonition-title,.rst-content .wy-alert-neutral.admonition-todo .wy-alert-title,.rst-content .wy-alert-neutral.admonition .admonition-title,.rst-content .wy-alert-neutral.admonition .wy-alert-title,.rst-content .wy-alert-neutral.attention .admonition-title,.rst-content .wy-alert-neutral.attention .wy-alert-title,.rst-content .wy-alert-neutral.caution .admonition-title,.rst-content .wy-alert-neutral.caution .wy-alert-title,.rst-content .wy-alert-neutral.danger .admonition-title,.rst-content .wy-alert-neutral.danger .wy-alert-title,.rst-content .wy-alert-neutral.error .admonition-title,.rst-content .wy-alert-neutral.error .wy-alert-title,.rst-content .wy-alert-neutral.hint .admonition-title,.rst-content .wy-alert-neutral.hint .wy-alert-title,.rst-content .wy-alert-neutral.important .admonition-title,.rst-content .wy-alert-neutral.important .wy-alert-title,.rst-content .wy-alert-neutral.note .admonition-title,.rst-content .wy-alert-neutral.note .wy-alert-title,.rst-content .wy-alert-neutral.seealso .admonition-title,.rst-content .wy-alert-neutral.seealso .wy-alert-title,.rst-content .wy-alert-neutral.tip .admonition-title,.rst-content .wy-alert-neutral.tip .wy-alert-title,.rst-content .wy-alert-neutral.warning .admonition-title,.rst-content .wy-alert-neutral.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-neutral .admonition-title,.wy-alert.wy-alert-neutral .rst-content .admonition-title,.wy-alert.wy-alert-neutral .wy-alert-title{color:#404040;background:#e1e4e5}.rst-content .wy-alert-neutral.admonition-todo a,.rst-content .wy-alert-neutral.admonition a,.rst-content .wy-alert-neutral.attention a,.rst-content .wy-alert-neutral.caution a,.rst-content .wy-alert-neutral.danger a,.rst-content .wy-alert-neutral.error a,.rst-content .wy-alert-neutral.hint a,.rst-content .wy-alert-neutral.important a,.rst-content .wy-alert-neutral.note a,.rst-content .wy-alert-neutral.seealso a,.rst-content .wy-alert-neutral.tip a,.rst-content .wy-alert-neutral.warning a,.wy-alert.wy-alert-neutral a{color:#2980b9}.rst-content .admonition-todo p:last-child,.rst-content .admonition p:last-child,.rst-content .attention p:last-child,.rst-content .caution p:last-child,.rst-content .danger p:last-child,.rst-content .error p:last-child,.rst-content .hint p:last-child,.rst-content .important p:last-child,.rst-content .note p:last-child,.rst-content .seealso p:last-child,.rst-content .tip p:last-child,.rst-content .warning p:last-child,.wy-alert p:last-child{margin-bottom:0}.wy-tray-container{position:fixed;bottom:0;left:0;z-index:600}.wy-tray-container li{display:block;width:300px;background:transparent;color:#fff;text-align:center;box-shadow:0 5px 5px 0 rgba(0,0,0,.1);padding:0 24px;min-width:20%;opacity:0;height:0;line-height:56px;overflow:hidden;-webkit-transition:all .3s ease-in;-moz-transition:all .3s ease-in;transition:all .3s ease-in}.wy-tray-container li.wy-tray-item-success{background:#27ae60}.wy-tray-container 
li.wy-tray-item-info{background:#2980b9}.wy-tray-container li.wy-tray-item-warning{background:#e67e22}.wy-tray-container li.wy-tray-item-danger{background:#e74c3c}.wy-tray-container li.on{opacity:1;height:56px}@media screen and (max-width:768px){.wy-tray-container{bottom:auto;top:0;width:100%}.wy-tray-container li{width:100%}}button{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle;cursor:pointer;line-height:normal;-webkit-appearance:button;*overflow:visible}button::-moz-focus-inner,input::-moz-focus-inner{border:0;padding:0}button[disabled]{cursor:default}.btn{display:inline-block;border-radius:2px;line-height:normal;white-space:nowrap;text-align:center;cursor:pointer;font-size:100%;padding:6px 12px 8px;color:#fff;border:1px solid rgba(0,0,0,.1);background-color:#27ae60;text-decoration:none;font-weight:400;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;box-shadow:inset 0 1px 2px -1px hsla(0,0%,100%,.5),inset 0 -2px 0 0 rgba(0,0,0,.1);outline-none:false;vertical-align:middle;*display:inline;zoom:1;-webkit-user-drag:none;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none;-webkit-transition:all .1s linear;-moz-transition:all .1s linear;transition:all .1s linear}.btn-hover{background:#2e8ece;color:#fff}.btn:hover{background:#2cc36b;color:#fff}.btn:focus{background:#2cc36b;outline:0}.btn:active{box-shadow:inset 0 -1px 0 0 rgba(0,0,0,.05),inset 0 2px 0 0 rgba(0,0,0,.1);padding:8px 12px 6px}.btn:visited{color:#fff}.btn-disabled,.btn-disabled:active,.btn-disabled:focus,.btn-disabled:hover,.btn:disabled{background-image:none;filter:progid:DXImageTransform.Microsoft.gradient(enabled = false);filter:alpha(opacity=40);opacity:.4;cursor:not-allowed;box-shadow:none}.btn::-moz-focus-inner{padding:0;border:0}.btn-small{font-size:80%}.btn-info{background-color:#2980b9!important}.btn-info:hover{background-color:#2e8ece!important}.btn-neutral{background-color:#f3f6f6!important;color:#404040!important}.btn-neutral:hover{background-color:#e5ebeb!important;color:#404040}.btn-neutral:visited{color:#404040!important}.btn-success{background-color:#27ae60!important}.btn-success:hover{background-color:#295!important}.btn-danger{background-color:#e74c3c!important}.btn-danger:hover{background-color:#ea6153!important}.btn-warning{background-color:#e67e22!important}.btn-warning:hover{background-color:#e98b39!important}.btn-invert{background-color:#222}.btn-invert:hover{background-color:#2f2f2f!important}.btn-link{background-color:transparent!important;color:#2980b9;box-shadow:none;border-color:transparent!important}.btn-link:active,.btn-link:hover{background-color:transparent!important;color:#409ad5!important;box-shadow:none}.btn-link:visited{color:#9b59b6}.wy-btn-group .btn,.wy-control .btn{vertical-align:middle}.wy-btn-group{margin-bottom:24px;*zoom:1}.wy-btn-group:after,.wy-btn-group:before{display:table;content:""}.wy-btn-group:after{clear:both}.wy-dropdown{position:relative;display:inline-block}.wy-dropdown-active .wy-dropdown-menu{display:block}.wy-dropdown-menu{position:absolute;left:0;display:none;float:left;top:100%;min-width:100%;background:#fcfcfc;z-index:100;border:1px solid #cfd7dd;box-shadow:0 2px 2px 0 rgba(0,0,0,.1);padding:12px}.wy-dropdown-menu>dd>a{display:block;clear:both;color:#404040;white-space:nowrap;font-size:90%;padding:0 12px;cursor:pointer}.wy-dropdown-menu>dd>a:hover{background:#2980b9;color:#fff}.wy-dropdown-menu>dd.divider{border-top:1px solid #cfd7dd;margin:6px 
0}.wy-dropdown-menu>dd.search{padding-bottom:12px}.wy-dropdown-menu>dd.search input[type=search]{width:100%}.wy-dropdown-menu>dd.call-to-action{background:#e3e3e3;text-transform:uppercase;font-weight:500;font-size:80%}.wy-dropdown-menu>dd.call-to-action:hover{background:#e3e3e3}.wy-dropdown-menu>dd.call-to-action .btn{color:#fff}.wy-dropdown.wy-dropdown-up .wy-dropdown-menu{bottom:100%;top:auto;left:auto;right:0}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu{background:#fcfcfc;margin-top:2px}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu a{padding:6px 12px}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu a:hover{background:#2980b9;color:#fff}.wy-dropdown.wy-dropdown-left .wy-dropdown-menu{right:0;left:auto;text-align:right}.wy-dropdown-arrow:before{content:" ";border-bottom:5px solid #f5f5f5;border-left:5px solid transparent;border-right:5px solid transparent;position:absolute;display:block;top:-4px;left:50%;margin-left:-3px}.wy-dropdown-arrow.wy-dropdown-arrow-left:before{left:11px}.wy-form-stacked select{display:block}.wy-form-aligned .wy-help-inline,.wy-form-aligned input,.wy-form-aligned label,.wy-form-aligned select,.wy-form-aligned textarea{display:inline-block;*display:inline;*zoom:1;vertical-align:middle}.wy-form-aligned .wy-control-group>label{display:inline-block;vertical-align:middle;width:10em;margin:6px 12px 0 0;float:left}.wy-form-aligned .wy-control{float:left}.wy-form-aligned .wy-control label{display:block}.wy-form-aligned .wy-control select{margin-top:6px}fieldset{margin:0}fieldset,legend{border:0;padding:0}legend{width:100%;white-space:normal;margin-bottom:24px;font-size:150%;*margin-left:-7px}label,legend{display:block}label{margin:0 0 .3125em;color:#333;font-size:90%}input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}.wy-control-group{margin-bottom:24px;max-width:1200px;margin-left:auto;margin-right:auto;*zoom:1}.wy-control-group:after,.wy-control-group:before{display:table;content:""}.wy-control-group:after{clear:both}.wy-control-group.wy-control-group-required>label:after{content:" *";color:#e74c3c}.wy-control-group .wy-form-full,.wy-control-group .wy-form-halves,.wy-control-group .wy-form-thirds{padding-bottom:12px}.wy-control-group .wy-form-full input[type=color],.wy-control-group .wy-form-full input[type=date],.wy-control-group .wy-form-full input[type=datetime-local],.wy-control-group .wy-form-full input[type=datetime],.wy-control-group .wy-form-full input[type=email],.wy-control-group .wy-form-full input[type=month],.wy-control-group .wy-form-full input[type=number],.wy-control-group .wy-form-full input[type=password],.wy-control-group .wy-form-full input[type=search],.wy-control-group .wy-form-full input[type=tel],.wy-control-group .wy-form-full input[type=text],.wy-control-group .wy-form-full input[type=time],.wy-control-group .wy-form-full input[type=url],.wy-control-group .wy-form-full input[type=week],.wy-control-group .wy-form-full select,.wy-control-group .wy-form-halves input[type=color],.wy-control-group .wy-form-halves input[type=date],.wy-control-group .wy-form-halves input[type=datetime-local],.wy-control-group .wy-form-halves input[type=datetime],.wy-control-group .wy-form-halves input[type=email],.wy-control-group .wy-form-halves input[type=month],.wy-control-group .wy-form-halves input[type=number],.wy-control-group .wy-form-halves input[type=password],.wy-control-group .wy-form-halves input[type=search],.wy-control-group .wy-form-halves input[type=tel],.wy-control-group .wy-form-halves 
input[type=text],.wy-control-group .wy-form-halves input[type=time],.wy-control-group .wy-form-halves input[type=url],.wy-control-group .wy-form-halves input[type=week],.wy-control-group .wy-form-halves select,.wy-control-group .wy-form-thirds input[type=color],.wy-control-group .wy-form-thirds input[type=date],.wy-control-group .wy-form-thirds input[type=datetime-local],.wy-control-group .wy-form-thirds input[type=datetime],.wy-control-group .wy-form-thirds input[type=email],.wy-control-group .wy-form-thirds input[type=month],.wy-control-group .wy-form-thirds input[type=number],.wy-control-group .wy-form-thirds input[type=password],.wy-control-group .wy-form-thirds input[type=search],.wy-control-group .wy-form-thirds input[type=tel],.wy-control-group .wy-form-thirds input[type=text],.wy-control-group .wy-form-thirds input[type=time],.wy-control-group .wy-form-thirds input[type=url],.wy-control-group .wy-form-thirds input[type=week],.wy-control-group .wy-form-thirds select{width:100%}.wy-control-group .wy-form-full{float:left;display:block;width:100%;margin-right:0}.wy-control-group .wy-form-full:last-child{margin-right:0}.wy-control-group .wy-form-halves{float:left;display:block;margin-right:2.35765%;width:48.82117%}.wy-control-group .wy-form-halves:last-child,.wy-control-group .wy-form-halves:nth-of-type(2n){margin-right:0}.wy-control-group .wy-form-halves:nth-of-type(odd){clear:left}.wy-control-group .wy-form-thirds{float:left;display:block;margin-right:2.35765%;width:31.76157%}.wy-control-group .wy-form-thirds:last-child,.wy-control-group .wy-form-thirds:nth-of-type(3n){margin-right:0}.wy-control-group .wy-form-thirds:nth-of-type(3n+1){clear:left}.wy-control-group.wy-control-group-no-input .wy-control,.wy-control-no-input{margin:6px 0 0;font-size:90%}.wy-control-no-input{display:inline-block}.wy-control-group.fluid-input input[type=color],.wy-control-group.fluid-input input[type=date],.wy-control-group.fluid-input input[type=datetime-local],.wy-control-group.fluid-input input[type=datetime],.wy-control-group.fluid-input input[type=email],.wy-control-group.fluid-input input[type=month],.wy-control-group.fluid-input input[type=number],.wy-control-group.fluid-input input[type=password],.wy-control-group.fluid-input input[type=search],.wy-control-group.fluid-input input[type=tel],.wy-control-group.fluid-input input[type=text],.wy-control-group.fluid-input input[type=time],.wy-control-group.fluid-input input[type=url],.wy-control-group.fluid-input input[type=week]{width:100%}.wy-form-message-inline{padding-left:.3em;color:#666;font-size:90%}.wy-form-message{display:block;color:#999;font-size:70%;margin-top:.3125em;font-style:italic}.wy-form-message p{font-size:inherit;font-style:italic;margin-bottom:6px}.wy-form-message p:last-child{margin-bottom:0}input{line-height:normal}input[type=button],input[type=reset],input[type=submit]{-webkit-appearance:button;cursor:pointer;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;*overflow:visible}input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week]{-webkit-appearance:none;padding:6px;display:inline-block;border:1px solid #ccc;font-size:80%;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;box-shadow:inset 0 1px 3px #ddd;border-radius:0;-webkit-transition:border .3s linear;-moz-transition:border .3s linear;transition:border 
.3s linear}input[type=datetime-local]{padding:.34375em .625em}input[disabled]{cursor:default}input[type=checkbox],input[type=radio]{padding:0;margin-right:.3125em;*height:13px;*width:13px}input[type=checkbox],input[type=radio],input[type=search]{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}input[type=search]::-webkit-search-cancel-button,input[type=search]::-webkit-search-decoration{-webkit-appearance:none}input[type=color]:focus,input[type=date]:focus,input[type=datetime-local]:focus,input[type=datetime]:focus,input[type=email]:focus,input[type=month]:focus,input[type=number]:focus,input[type=password]:focus,input[type=search]:focus,input[type=tel]:focus,input[type=text]:focus,input[type=time]:focus,input[type=url]:focus,input[type=week]:focus{outline:0;outline:thin dotted\9;border-color:#333}input.no-focus:focus{border-color:#ccc!important}input[type=checkbox]:focus,input[type=file]:focus,input[type=radio]:focus{outline:thin dotted #333;outline:1px auto #129fea}input[type=color][disabled],input[type=date][disabled],input[type=datetime-local][disabled],input[type=datetime][disabled],input[type=email][disabled],input[type=month][disabled],input[type=number][disabled],input[type=password][disabled],input[type=search][disabled],input[type=tel][disabled],input[type=text][disabled],input[type=time][disabled],input[type=url][disabled],input[type=week][disabled]{cursor:not-allowed;background-color:#fafafa}input:focus:invalid,select:focus:invalid,textarea:focus:invalid{color:#e74c3c;border:1px solid #e74c3c}input:focus:invalid:focus,select:focus:invalid:focus,textarea:focus:invalid:focus{border-color:#e74c3c}input[type=checkbox]:focus:invalid:focus,input[type=file]:focus:invalid:focus,input[type=radio]:focus:invalid:focus{outline-color:#e74c3c}input.wy-input-large{padding:12px;font-size:100%}textarea{overflow:auto;vertical-align:top;width:100%;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif}select,textarea{padding:.5em .625em;display:inline-block;border:1px solid #ccc;font-size:80%;box-shadow:inset 0 1px 3px #ddd;-webkit-transition:border .3s linear;-moz-transition:border .3s linear;transition:border .3s linear}select{border:1px solid #ccc;background-color:#fff}select[multiple]{height:auto}select:focus,textarea:focus{outline:0}input[readonly],select[disabled],select[readonly],textarea[disabled],textarea[readonly]{cursor:not-allowed;background-color:#fafafa}input[type=checkbox][disabled],input[type=radio][disabled]{cursor:not-allowed}.wy-checkbox,.wy-radio{margin:6px 0;color:#404040;display:block}.wy-checkbox input,.wy-radio input{vertical-align:baseline}.wy-form-message-inline{display:inline-block;*display:inline;*zoom:1;vertical-align:middle}.wy-input-prefix,.wy-input-suffix{white-space:nowrap;padding:6px}.wy-input-prefix .wy-input-context,.wy-input-suffix .wy-input-context{line-height:27px;padding:0 8px;display:inline-block;font-size:80%;background-color:#f3f6f6;border:1px solid #ccc;color:#999}.wy-input-suffix .wy-input-context{border-left:0}.wy-input-prefix .wy-input-context{border-right:0}.wy-switch{position:relative;display:block;height:24px;margin-top:12px;cursor:pointer}.wy-switch:before{left:0;top:0;width:36px;height:12px;background:#ccc}.wy-switch:after,.wy-switch:before{position:absolute;content:"";display:block;border-radius:4px;-webkit-transition:all .2s ease-in-out;-moz-transition:all .2s ease-in-out;transition:all .2s ease-in-out}.wy-switch:after{width:18px;height:18px;background:#999;left:-3px;top:-3px}.wy-switch 
span{position:absolute;left:48px;display:block;font-size:12px;color:#ccc;line-height:1}.wy-switch.active:before{background:#1e8449}.wy-switch.active:after{left:24px;background:#27ae60}.wy-switch.disabled{cursor:not-allowed;opacity:.8}.wy-control-group.wy-control-group-error .wy-form-message,.wy-control-group.wy-control-group-error>label{color:#e74c3c}.wy-control-group.wy-control-group-error input[type=color],.wy-control-group.wy-control-group-error input[type=date],.wy-control-group.wy-control-group-error input[type=datetime-local],.wy-control-group.wy-control-group-error input[type=datetime],.wy-control-group.wy-control-group-error input[type=email],.wy-control-group.wy-control-group-error input[type=month],.wy-control-group.wy-control-group-error input[type=number],.wy-control-group.wy-control-group-error input[type=password],.wy-control-group.wy-control-group-error input[type=search],.wy-control-group.wy-control-group-error input[type=tel],.wy-control-group.wy-control-group-error input[type=text],.wy-control-group.wy-control-group-error input[type=time],.wy-control-group.wy-control-group-error input[type=url],.wy-control-group.wy-control-group-error input[type=week],.wy-control-group.wy-control-group-error textarea{border:1px solid #e74c3c}.wy-inline-validate{white-space:nowrap}.wy-inline-validate .wy-input-context{padding:.5em .625em;display:inline-block;font-size:80%}.wy-inline-validate.wy-inline-validate-success .wy-input-context{color:#27ae60}.wy-inline-validate.wy-inline-validate-danger .wy-input-context{color:#e74c3c}.wy-inline-validate.wy-inline-validate-warning .wy-input-context{color:#e67e22}.wy-inline-validate.wy-inline-validate-info .wy-input-context{color:#2980b9}.rotate-90{-webkit-transform:rotate(90deg);-moz-transform:rotate(90deg);-ms-transform:rotate(90deg);-o-transform:rotate(90deg);transform:rotate(90deg)}.rotate-180{-webkit-transform:rotate(180deg);-moz-transform:rotate(180deg);-ms-transform:rotate(180deg);-o-transform:rotate(180deg);transform:rotate(180deg)}.rotate-270{-webkit-transform:rotate(270deg);-moz-transform:rotate(270deg);-ms-transform:rotate(270deg);-o-transform:rotate(270deg);transform:rotate(270deg)}.mirror{-webkit-transform:scaleX(-1);-moz-transform:scaleX(-1);-ms-transform:scaleX(-1);-o-transform:scaleX(-1);transform:scaleX(-1)}.mirror.rotate-90{-webkit-transform:scaleX(-1) rotate(90deg);-moz-transform:scaleX(-1) rotate(90deg);-ms-transform:scaleX(-1) rotate(90deg);-o-transform:scaleX(-1) rotate(90deg);transform:scaleX(-1) rotate(90deg)}.mirror.rotate-180{-webkit-transform:scaleX(-1) rotate(180deg);-moz-transform:scaleX(-1) rotate(180deg);-ms-transform:scaleX(-1) rotate(180deg);-o-transform:scaleX(-1) rotate(180deg);transform:scaleX(-1) rotate(180deg)}.mirror.rotate-270{-webkit-transform:scaleX(-1) rotate(270deg);-moz-transform:scaleX(-1) rotate(270deg);-ms-transform:scaleX(-1) rotate(270deg);-o-transform:scaleX(-1) rotate(270deg);transform:scaleX(-1) rotate(270deg)}@media only screen and (max-width:480px){.wy-form button[type=submit]{margin:.7em 0 0}.wy-form input[type=color],.wy-form input[type=date],.wy-form input[type=datetime-local],.wy-form input[type=datetime],.wy-form input[type=email],.wy-form input[type=month],.wy-form input[type=number],.wy-form input[type=password],.wy-form input[type=search],.wy-form input[type=tel],.wy-form input[type=text],.wy-form input[type=time],.wy-form input[type=url],.wy-form input[type=week],.wy-form label{margin-bottom:.3em;display:block}.wy-form input[type=color],.wy-form input[type=date],.wy-form 
input[type=datetime-local],.wy-form input[type=datetime],.wy-form input[type=email],.wy-form input[type=month],.wy-form input[type=number],.wy-form input[type=password],.wy-form input[type=search],.wy-form input[type=tel],.wy-form input[type=time],.wy-form input[type=url],.wy-form input[type=week]{margin-bottom:0}.wy-form-aligned .wy-control-group label{margin-bottom:.3em;text-align:left;display:block;width:100%}.wy-form-aligned .wy-control{margin:1.5em 0 0}.wy-form-message,.wy-form-message-inline,.wy-form .wy-help-inline{display:block;font-size:80%;padding:6px 0}}@media screen and (max-width:768px){.tablet-hide{display:none}}@media screen and (max-width:480px){.mobile-hide{display:none}}.float-left{float:left}.float-right{float:right}.full-width{width:100%}.rst-content table.docutils,.rst-content table.field-list,.wy-table{border-collapse:collapse;border-spacing:0;empty-cells:show;margin-bottom:24px}.rst-content table.docutils caption,.rst-content table.field-list caption,.wy-table caption{color:#000;font:italic 85%/1 arial,sans-serif;padding:1em 0;text-align:center}.rst-content table.docutils td,.rst-content table.docutils th,.rst-content table.field-list td,.rst-content table.field-list th,.wy-table td,.wy-table th{font-size:90%;margin:0;overflow:visible;padding:8px 16px}.rst-content table.docutils td:first-child,.rst-content table.docutils th:first-child,.rst-content table.field-list td:first-child,.rst-content table.field-list th:first-child,.wy-table td:first-child,.wy-table th:first-child{border-left-width:0}.rst-content table.docutils thead,.rst-content table.field-list thead,.wy-table thead{color:#000;text-align:left;vertical-align:bottom;white-space:nowrap}.rst-content table.docutils thead th,.rst-content table.field-list thead th,.wy-table thead th{font-weight:700;border-bottom:2px solid #e1e4e5}.rst-content table.docutils td,.rst-content table.field-list td,.wy-table td{background-color:transparent;vertical-align:middle}.rst-content table.docutils td p,.rst-content table.field-list td p,.wy-table td p{line-height:18px}.rst-content table.docutils td p:last-child,.rst-content table.field-list td p:last-child,.wy-table td p:last-child{margin-bottom:0}.rst-content table.docutils .wy-table-cell-min,.rst-content table.field-list .wy-table-cell-min,.wy-table .wy-table-cell-min{width:1%;padding-right:0}.rst-content table.docutils .wy-table-cell-min input[type=checkbox],.rst-content table.field-list .wy-table-cell-min input[type=checkbox],.wy-table .wy-table-cell-min input[type=checkbox]{margin:0}.wy-table-secondary{color:grey;font-size:90%}.wy-table-tertiary{color:grey;font-size:80%}.rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td,.wy-table-backed,.wy-table-odd td,.wy-table-striped tr:nth-child(2n-1) td{background-color:#f3f6f6}.rst-content table.docutils,.wy-table-bordered-all{border:1px solid #e1e4e5}.rst-content table.docutils td,.wy-table-bordered-all td{border-bottom:1px solid #e1e4e5;border-left:1px solid #e1e4e5}.rst-content table.docutils tbody>tr:last-child td,.wy-table-bordered-all tbody>tr:last-child td{border-bottom-width:0}.wy-table-bordered{border:1px solid #e1e4e5}.wy-table-bordered-rows td{border-bottom:1px solid #e1e4e5}.wy-table-bordered-rows tbody>tr:last-child td{border-bottom-width:0}.wy-table-horizontal td,.wy-table-horizontal th{border-width:0 0 1px;border-bottom:1px solid #e1e4e5}.wy-table-horizontal tbody>tr:last-child td{border-bottom-width:0}.wy-table-responsive{margin-bottom:24px;max-width:100%;overflow:auto}.wy-table-responsive 
table{margin-bottom:0!important}.wy-table-responsive table td,.wy-table-responsive table th{white-space:nowrap}a{color:#2980b9;text-decoration:none;cursor:pointer}a:hover{color:#3091d1}a:visited{color:#9b59b6}html{height:100%}body,html{overflow-x:hidden}body{font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;font-weight:400;color:#404040;min-height:100%;background:#edf0f2}.wy-text-left{text-align:left}.wy-text-center{text-align:center}.wy-text-right{text-align:right}.wy-text-large{font-size:120%}.wy-text-normal{font-size:100%}.wy-text-small,small{font-size:80%}.wy-text-strike{text-decoration:line-through}.wy-text-warning{color:#e67e22!important}a.wy-text-warning:hover{color:#eb9950!important}.wy-text-info{color:#2980b9!important}a.wy-text-info:hover{color:#409ad5!important}.wy-text-success{color:#27ae60!important}a.wy-text-success:hover{color:#36d278!important}.wy-text-danger{color:#e74c3c!important}a.wy-text-danger:hover{color:#ed7669!important}.wy-text-neutral{color:#404040!important}a.wy-text-neutral:hover{color:#595959!important}.rst-content .toctree-wrapper>p.caption,h1,h2,h3,h4,h5,h6,legend{margin-top:0;font-weight:700;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif}p{line-height:24px;font-size:16px;margin:0 0 24px}h1{font-size:175%}.rst-content .toctree-wrapper>p.caption,h2{font-size:150%}h3{font-size:125%}h4{font-size:115%}h5{font-size:110%}h6{font-size:100%}hr{display:block;height:1px;border:0;border-top:1px solid #e1e4e5;margin:24px 0;padding:0}.rst-content code,.rst-content tt,code{white-space:nowrap;max-width:100%;background:#fff;border:1px solid #e1e4e5;font-size:75%;padding:0 5px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#e74c3c;overflow-x:auto}.rst-content tt.code-large,code.code-large{font-size:90%}.rst-content .section ul,.rst-content .toctree-wrapper ul,.rst-content section ul,.wy-plain-list-disc,article ul{list-style:disc;line-height:24px;margin-bottom:24px}.rst-content .section ul li,.rst-content .toctree-wrapper ul li,.rst-content section ul li,.wy-plain-list-disc li,article ul li{list-style:disc;margin-left:24px}.rst-content .section ul li p:last-child,.rst-content .section ul li ul,.rst-content .toctree-wrapper ul li p:last-child,.rst-content .toctree-wrapper ul li ul,.rst-content section ul li p:last-child,.rst-content section ul li ul,.wy-plain-list-disc li p:last-child,.wy-plain-list-disc li ul,article ul li p:last-child,article ul li ul{margin-bottom:0}.rst-content .section ul li li,.rst-content .toctree-wrapper ul li li,.rst-content section ul li li,.wy-plain-list-disc li li,article ul li li{list-style:circle}.rst-content .section ul li li li,.rst-content .toctree-wrapper ul li li li,.rst-content section ul li li li,.wy-plain-list-disc li li li,article ul li li li{list-style:square}.rst-content .section ul li ol li,.rst-content .toctree-wrapper ul li ol li,.rst-content section ul li ol li,.wy-plain-list-disc li ol li,article ul li ol li{list-style:decimal}.rst-content .section ol,.rst-content .section ol.arabic,.rst-content .toctree-wrapper ol,.rst-content .toctree-wrapper ol.arabic,.rst-content section ol,.rst-content section ol.arabic,.wy-plain-list-decimal,article ol{list-style:decimal;line-height:24px;margin-bottom:24px}.rst-content .section ol.arabic li,.rst-content .section ol li,.rst-content .toctree-wrapper ol.arabic li,.rst-content .toctree-wrapper ol li,.rst-content section ol.arabic li,.rst-content section ol li,.wy-plain-list-decimal li,article ol 
li{list-style:decimal;margin-left:24px}.rst-content .section ol.arabic li ul,.rst-content .section ol li p:last-child,.rst-content .section ol li ul,.rst-content .toctree-wrapper ol.arabic li ul,.rst-content .toctree-wrapper ol li p:last-child,.rst-content .toctree-wrapper ol li ul,.rst-content section ol.arabic li ul,.rst-content section ol li p:last-child,.rst-content section ol li ul,.wy-plain-list-decimal li p:last-child,.wy-plain-list-decimal li ul,article ol li p:last-child,article ol li ul{margin-bottom:0}.rst-content .section ol.arabic li ul li,.rst-content .section ol li ul li,.rst-content .toctree-wrapper ol.arabic li ul li,.rst-content .toctree-wrapper ol li ul li,.rst-content section ol.arabic li ul li,.rst-content section ol li ul li,.wy-plain-list-decimal li ul li,article ol li ul li{list-style:disc}.wy-breadcrumbs{*zoom:1}.wy-breadcrumbs:after,.wy-breadcrumbs:before{display:table;content:""}.wy-breadcrumbs:after{clear:both}.wy-breadcrumbs>li{display:inline-block;padding-top:5px}.wy-breadcrumbs>li.wy-breadcrumbs-aside{float:right}.rst-content .wy-breadcrumbs>li code,.rst-content .wy-breadcrumbs>li tt,.wy-breadcrumbs>li .rst-content tt,.wy-breadcrumbs>li code{all:inherit;color:inherit}.breadcrumb-item:before{content:"/";color:#bbb;font-size:13px;padding:0 6px 0 3px}.wy-breadcrumbs-extra{margin-bottom:0;color:#b3b3b3;font-size:80%;display:inline-block}@media screen and (max-width:480px){.wy-breadcrumbs-extra,.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}@media print{.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}html{font-size:16px}.wy-affix{position:fixed;top:1.618em}.wy-menu a:hover{text-decoration:none}.wy-menu-horiz{*zoom:1}.wy-menu-horiz:after,.wy-menu-horiz:before{display:table;content:""}.wy-menu-horiz:after{clear:both}.wy-menu-horiz li,.wy-menu-horiz ul{display:inline-block}.wy-menu-horiz li:hover{background:hsla(0,0%,100%,.1)}.wy-menu-horiz li.divide-left{border-left:1px solid #404040}.wy-menu-horiz li.divide-right{border-right:1px solid #404040}.wy-menu-horiz a{height:32px;display:inline-block;line-height:32px;padding:0 16px}.wy-menu-vertical{width:300px}.wy-menu-vertical header,.wy-menu-vertical p.caption{color:#55a5d9;height:32px;line-height:32px;padding:0 1.618em;margin:12px 0 0;display:block;font-weight:700;text-transform:uppercase;font-size:85%;white-space:nowrap}.wy-menu-vertical ul{margin-bottom:0}.wy-menu-vertical li.divide-top{border-top:1px solid #404040}.wy-menu-vertical li.divide-bottom{border-bottom:1px solid #404040}.wy-menu-vertical li.current{background:#e3e3e3}.wy-menu-vertical li.current a{color:grey;border-right:1px solid #c9c9c9;padding:.4045em 2.427em}.wy-menu-vertical li.current a:hover{background:#d6d6d6}.rst-content .wy-menu-vertical li tt,.wy-menu-vertical li .rst-content tt,.wy-menu-vertical li code{border:none;background:inherit;color:inherit;padding-left:0;padding-right:0}.wy-menu-vertical li button.toctree-expand{display:block;float:left;margin-left:-1.2em;line-height:18px;color:#4d4d4d;border:none;background:none;padding:0}.wy-menu-vertical li.current>a,.wy-menu-vertical li.on a{color:#404040;font-weight:700;position:relative;background:#fcfcfc;border:none;padding:.4045em 1.618em}.wy-menu-vertical li.current>a:hover,.wy-menu-vertical li.on a:hover{background:#fcfcfc}.wy-menu-vertical li.current>a:hover button.toctree-expand,.wy-menu-vertical li.on a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a 
button.toctree-expand{display:block;line-height:18px;color:#333}.wy-menu-vertical li.toctree-l1.current>a{border-bottom:1px solid #c9c9c9;border-top:1px solid #c9c9c9}.wy-menu-vertical .toctree-l1.current .toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .toctree-l11>ul{display:none}.wy-menu-vertical .toctree-l1.current .current.toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .current.toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .current.toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .current.toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .current.toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .current.toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .current.toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .current.toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .current.toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .current.toctree-l11>ul{display:block}.wy-menu-vertical li.toctree-l3,.wy-menu-vertical li.toctree-l4{font-size:.9em}.wy-menu-vertical li.toctree-l2 a,.wy-menu-vertical li.toctree-l3 a,.wy-menu-vertical li.toctree-l4 a,.wy-menu-vertical li.toctree-l5 a,.wy-menu-vertical li.toctree-l6 a,.wy-menu-vertical li.toctree-l7 a,.wy-menu-vertical li.toctree-l8 a,.wy-menu-vertical li.toctree-l9 a,.wy-menu-vertical li.toctree-l10 a{color:#404040}.wy-menu-vertical li.toctree-l2 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l3 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l4 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l5 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l6 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l7 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l8 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l9 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l10 a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a,.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a,.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a,.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a,.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a,.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a,.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a,.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{display:block}.wy-menu-vertical li.toctree-l2.current>a{padding:.4045em 2.427em}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{padding:.4045em 1.618em .4045em 4.045em}.wy-menu-vertical li.toctree-l3.current>a{padding:.4045em 4.045em}.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{padding:.4045em 1.618em .4045em 5.663em}.wy-menu-vertical li.toctree-l4.current>a{padding:.4045em 5.663em}.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a{padding:.4045em 1.618em .4045em 7.281em}.wy-menu-vertical li.toctree-l5.current>a{padding:.4045em 7.281em}.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a{padding:.4045em 1.618em .4045em 8.899em}.wy-menu-vertical li.toctree-l6.current>a{padding:.4045em 
8.899em}.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a{padding:.4045em 1.618em .4045em 10.517em}.wy-menu-vertical li.toctree-l7.current>a{padding:.4045em 10.517em}.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a{padding:.4045em 1.618em .4045em 12.135em}.wy-menu-vertical li.toctree-l8.current>a{padding:.4045em 12.135em}.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a{padding:.4045em 1.618em .4045em 13.753em}.wy-menu-vertical li.toctree-l9.current>a{padding:.4045em 13.753em}.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a{padding:.4045em 1.618em .4045em 15.371em}.wy-menu-vertical li.toctree-l10.current>a{padding:.4045em 15.371em}.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{padding:.4045em 1.618em .4045em 16.989em}.wy-menu-vertical li.toctree-l2.current>a,.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{background:#c9c9c9}.wy-menu-vertical li.toctree-l2 button.toctree-expand{color:#a3a3a3}.wy-menu-vertical li.toctree-l3.current>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{background:#bdbdbd}.wy-menu-vertical li.toctree-l3 button.toctree-expand{color:#969696}.wy-menu-vertical li.current ul{display:block}.wy-menu-vertical li ul{margin-bottom:0;display:none}.wy-menu-vertical li ul li a{margin-bottom:0;color:#d9d9d9;font-weight:400}.wy-menu-vertical a{line-height:18px;padding:.4045em 1.618em;display:block;position:relative;font-size:90%;color:#d9d9d9}.wy-menu-vertical a:hover{background-color:#4e4a4a;cursor:pointer}.wy-menu-vertical a:hover button.toctree-expand{color:#d9d9d9}.wy-menu-vertical a:active{background-color:#2980b9;cursor:pointer;color:#fff}.wy-menu-vertical a:active button.toctree-expand{color:#fff}.wy-side-nav-search{display:block;width:300px;padding:.809em;margin-bottom:.809em;z-index:200;background-color:#2980b9;text-align:center;color:#fcfcfc}.wy-side-nav-search input[type=text]{width:100%;border-radius:50px;padding:6px 12px;border-color:#2472a4}.wy-side-nav-search img{display:block;margin:auto auto .809em;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-side-nav-search .wy-dropdown>a,.wy-side-nav-search>a{color:#fcfcfc;font-size:100%;font-weight:700;display:inline-block;padding:4px 6px;margin-bottom:.809em;max-width:100%}.wy-side-nav-search .wy-dropdown>a:hover,.wy-side-nav-search>a:hover{background:hsla(0,0%,100%,.1)}.wy-side-nav-search .wy-dropdown>a img.logo,.wy-side-nav-search>a img.logo{display:block;margin:0 auto;height:auto;width:auto;border-radius:0;max-width:100%;background:transparent}.wy-side-nav-search .wy-dropdown>a.icon img.logo,.wy-side-nav-search>a.icon img.logo{margin-top:.85em}.wy-side-nav-search>div.version{margin-top:-.4045em;margin-bottom:.809em;font-weight:400;color:hsla(0,0%,100%,.3)}.wy-nav .wy-menu-vertical header{color:#2980b9}.wy-nav .wy-menu-vertical a{color:#b3b3b3}.wy-nav .wy-menu-vertical a:hover{background-color:#2980b9;color:#fff}[data-menu-wrap]{-webkit-transition:all .2s ease-in;-moz-transition:all .2s ease-in;transition:all .2s 
ease-in;position:absolute;opacity:1;width:100%;opacity:0}[data-menu-wrap].move-center{left:0;right:auto;opacity:1}[data-menu-wrap].move-left{right:auto;left:-100%;opacity:0}[data-menu-wrap].move-right{right:-100%;left:auto;opacity:0}.wy-body-for-nav{background:#fcfcfc}.wy-grid-for-nav{position:absolute;width:100%;height:100%}.wy-nav-side{position:fixed;top:0;bottom:0;left:0;padding-bottom:2em;width:300px;overflow-x:hidden;overflow-y:hidden;min-height:100%;color:#9b9b9b;background:#343131;z-index:200}.wy-side-scroll{width:320px;position:relative;overflow-x:hidden;overflow-y:scroll;height:100%}.wy-nav-top{display:none;background:#2980b9;color:#fff;padding:.4045em .809em;position:relative;line-height:50px;text-align:center;font-size:100%;*zoom:1}.wy-nav-top:after,.wy-nav-top:before{display:table;content:""}.wy-nav-top:after{clear:both}.wy-nav-top a{color:#fff;font-weight:700}.wy-nav-top img{margin-right:12px;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-nav-top i{font-size:30px;float:left;cursor:pointer;padding-top:inherit}.wy-nav-content-wrap{margin-left:300px;background:#fcfcfc;min-height:100%}.wy-nav-content{padding:1.618em 3.236em;height:100%;max-width:800px;margin:auto}.wy-body-mask{position:fixed;width:100%;height:100%;background:rgba(0,0,0,.2);display:none;z-index:499}.wy-body-mask.on{display:block}footer{color:grey}footer p{margin-bottom:12px}.rst-content footer span.commit tt,footer span.commit .rst-content tt,footer span.commit code{padding:0;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:1em;background:none;border:none;color:grey}.rst-footer-buttons{*zoom:1}.rst-footer-buttons:after,.rst-footer-buttons:before{width:100%;display:table;content:""}.rst-footer-buttons:after{clear:both}.rst-breadcrumbs-buttons{margin-top:12px;*zoom:1}.rst-breadcrumbs-buttons:after,.rst-breadcrumbs-buttons:before{display:table;content:""}.rst-breadcrumbs-buttons:after{clear:both}#search-results .search li{margin-bottom:24px;border-bottom:1px solid #e1e4e5;padding-bottom:24px}#search-results .search li:first-child{border-top:1px solid #e1e4e5;padding-top:24px}#search-results .search li a{font-size:120%;margin-bottom:12px;display:inline-block}#search-results .context{color:grey;font-size:90%}.genindextable li>ul{margin-left:24px}@media screen and (max-width:768px){.wy-body-for-nav{background:#fcfcfc}.wy-nav-top{display:block}.wy-nav-side{left:-300px}.wy-nav-side.shift{width:85%;left:0}.wy-menu.wy-menu-vertical,.wy-side-nav-search,.wy-side-scroll{width:auto}.wy-nav-content-wrap{margin-left:0}.wy-nav-content-wrap .wy-nav-content{padding:1.618em}.wy-nav-content-wrap.shift{position:fixed;min-width:100%;left:85%;top:0;height:100%;overflow:hidden}}@media screen and (min-width:1100px){.wy-nav-content-wrap{background:rgba(0,0,0,.05)}.wy-nav-content{margin:0;background:#fcfcfc}}@media print{.rst-versions,.wy-nav-side,footer{display:none}.wy-nav-content-wrap{margin-left:0}}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60;*zoom:1}.rst-versions .rst-current-version:after,.rst-versions .rst-current-version:before{display:table;content:""}.rst-versions 
.rst-current-version:after{clear:both}.rst-content .code-block-caption .rst-versions .rst-current-version .headerlink,.rst-content .eqno .rst-versions .rst-current-version .headerlink,.rst-content .rst-versions .rst-current-version .admonition-title,.rst-content code.download .rst-versions .rst-current-version span:first-child,.rst-content dl dt .rst-versions .rst-current-version .headerlink,.rst-content h1 .rst-versions .rst-current-version .headerlink,.rst-content h2 .rst-versions .rst-current-version .headerlink,.rst-content h3 .rst-versions .rst-current-version .headerlink,.rst-content h4 .rst-versions .rst-current-version .headerlink,.rst-content h5 .rst-versions .rst-current-version .headerlink,.rst-content h6 .rst-versions .rst-current-version .headerlink,.rst-content p .rst-versions .rst-current-version .headerlink,.rst-content table>caption .rst-versions .rst-current-version .headerlink,.rst-content tt.download .rst-versions .rst-current-version span:first-child,.rst-versions .rst-current-version .fa,.rst-versions .rst-current-version .icon,.rst-versions .rst-current-version .rst-content .admonition-title,.rst-versions .rst-current-version .rst-content .code-block-caption .headerlink,.rst-versions .rst-current-version .rst-content .eqno .headerlink,.rst-versions .rst-current-version .rst-content code.download span:first-child,.rst-versions .rst-current-version .rst-content dl dt .headerlink,.rst-versions .rst-current-version .rst-content h1 .headerlink,.rst-versions .rst-current-version .rst-content h2 .headerlink,.rst-versions .rst-current-version .rst-content h3 .headerlink,.rst-versions .rst-current-version .rst-content h4 .headerlink,.rst-versions .rst-current-version .rst-content h5 .headerlink,.rst-versions .rst-current-version .rst-content h6 .headerlink,.rst-versions .rst-current-version .rst-content p .headerlink,.rst-versions .rst-current-version .rst-content table>caption .headerlink,.rst-versions .rst-current-version .rst-content tt.download span:first-child,.rst-versions .rst-current-version .wy-menu-vertical li button.toctree-expand,.wy-menu-vertical li .rst-versions .rst-current-version button.toctree-expand{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and 
(max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}}.rst-content .toctree-wrapper>p.caption,.rst-content h1,.rst-content h2,.rst-content h3,.rst-content h4,.rst-content h5,.rst-content h6{margin-bottom:24px}.rst-content img{max-width:100%;height:auto}.rst-content div.figure,.rst-content figure{margin-bottom:24px}.rst-content div.figure .caption-text,.rst-content figure .caption-text{font-style:italic}.rst-content div.figure p:last-child.caption,.rst-content figure p:last-child.caption{margin-bottom:0}.rst-content div.figure.align-center,.rst-content figure.align-center{text-align:center}.rst-content .section>a>img,.rst-content .section>img,.rst-content section>a>img,.rst-content section>img{margin-bottom:24px}.rst-content abbr[title]{text-decoration:none}.rst-content.style-external-links a.reference.external:after{font-family:FontAwesome;content:"\f08e";color:#b3b3b3;vertical-align:super;font-size:60%;margin:0 .2em}.rst-content blockquote{margin-left:24px;line-height:24px;margin-bottom:24px}.rst-content pre.literal-block{white-space:pre;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;display:block;overflow:auto}.rst-content div[class^=highlight],.rst-content pre.literal-block{border:1px solid #e1e4e5;overflow-x:auto;margin:1px 0 24px}.rst-content div[class^=highlight] div[class^=highlight],.rst-content pre.literal-block div[class^=highlight]{padding:0;border:none;margin:0}.rst-content div[class^=highlight] td.code{width:100%}.rst-content .linenodiv pre{border-right:1px solid #e6e9ea;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;user-select:none;pointer-events:none}.rst-content div[class^=highlight] pre{white-space:pre;margin:0;padding:12px;display:block;overflow:auto}.rst-content div[class^=highlight] pre .hll{display:block;margin:0 -12px;padding:0 12px}.rst-content .linenodiv pre,.rst-content div[class^=highlight] pre,.rst-content pre.literal-block{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:12px;line-height:1.4}.rst-content div.highlight .gp,.rst-content div.highlight span.linenos{user-select:none;pointer-events:none}.rst-content div.highlight span.linenos{display:inline-block;padding-left:0;padding-right:12px;margin-right:12px;border-right:1px solid #e6e9ea}.rst-content .code-block-caption{font-style:italic;font-size:85%;line-height:1;padding:1em 0;text-align:center}@media print{.rst-content .codeblock,.rst-content div[class^=highlight],.rst-content div[class^=highlight] pre{white-space:pre-wrap}}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning{clear:both}.rst-content .admonition-todo .last,.rst-content .admonition-todo>:last-child,.rst-content .admonition .last,.rst-content .admonition>:last-child,.rst-content .attention .last,.rst-content .attention>:last-child,.rst-content .caution .last,.rst-content .caution>:last-child,.rst-content .danger .last,.rst-content .danger>:last-child,.rst-content .error .last,.rst-content .error>:last-child,.rst-content .hint .last,.rst-content .hint>:last-child,.rst-content .important .last,.rst-content .important>:last-child,.rst-content .note .last,.rst-content .note>:last-child,.rst-content .seealso 
.last,.rst-content .seealso>:last-child,.rst-content .tip .last,.rst-content .tip>:last-child,.rst-content .warning .last,.rst-content .warning>:last-child{margin-bottom:0}.rst-content .admonition-title:before{margin-right:4px}.rst-content .admonition table{border-color:rgba(0,0,0,.1)}.rst-content .admonition table td,.rst-content .admonition table th{background:transparent!important;border-color:rgba(0,0,0,.1)!important}.rst-content .section ol.loweralpha,.rst-content .section ol.loweralpha>li,.rst-content .toctree-wrapper ol.loweralpha,.rst-content .toctree-wrapper ol.loweralpha>li,.rst-content section ol.loweralpha,.rst-content section ol.loweralpha>li{list-style:lower-alpha}.rst-content .section ol.upperalpha,.rst-content .section ol.upperalpha>li,.rst-content .toctree-wrapper ol.upperalpha,.rst-content .toctree-wrapper ol.upperalpha>li,.rst-content section ol.upperalpha,.rst-content section ol.upperalpha>li{list-style:upper-alpha}.rst-content .section ol li>*,.rst-content .section ul li>*,.rst-content .toctree-wrapper ol li>*,.rst-content .toctree-wrapper ul li>*,.rst-content section ol li>*,.rst-content section ul li>*{margin-top:12px;margin-bottom:12px}.rst-content .section ol li>:first-child,.rst-content .section ul li>:first-child,.rst-content .toctree-wrapper ol li>:first-child,.rst-content .toctree-wrapper ul li>:first-child,.rst-content section ol li>:first-child,.rst-content section ul li>:first-child{margin-top:0}.rst-content .section ol li>p,.rst-content .section ol li>p:last-child,.rst-content .section ul li>p,.rst-content .section ul li>p:last-child,.rst-content .toctree-wrapper ol li>p,.rst-content .toctree-wrapper ol li>p:last-child,.rst-content .toctree-wrapper ul li>p,.rst-content .toctree-wrapper ul li>p:last-child,.rst-content section ol li>p,.rst-content section ol li>p:last-child,.rst-content section ul li>p,.rst-content section ul li>p:last-child{margin-bottom:12px}.rst-content .section ol li>p:only-child,.rst-content .section ol li>p:only-child:last-child,.rst-content .section ul li>p:only-child,.rst-content .section ul li>p:only-child:last-child,.rst-content .toctree-wrapper ol li>p:only-child,.rst-content .toctree-wrapper ol li>p:only-child:last-child,.rst-content .toctree-wrapper ul li>p:only-child,.rst-content .toctree-wrapper ul li>p:only-child:last-child,.rst-content section ol li>p:only-child,.rst-content section ol li>p:only-child:last-child,.rst-content section ul li>p:only-child,.rst-content section ul li>p:only-child:last-child{margin-bottom:0}.rst-content .section ol li>ol,.rst-content .section ol li>ul,.rst-content .section ul li>ol,.rst-content .section ul li>ul,.rst-content .toctree-wrapper ol li>ol,.rst-content .toctree-wrapper ol li>ul,.rst-content .toctree-wrapper ul li>ol,.rst-content .toctree-wrapper ul li>ul,.rst-content section ol li>ol,.rst-content section ol li>ul,.rst-content section ul li>ol,.rst-content section ul li>ul{margin-bottom:12px}.rst-content .section ol.simple li>*,.rst-content .section ol.simple li ol,.rst-content .section ol.simple li ul,.rst-content .section ul.simple li>*,.rst-content .section ul.simple li ol,.rst-content .section ul.simple li ul,.rst-content .toctree-wrapper ol.simple li>*,.rst-content .toctree-wrapper ol.simple li ol,.rst-content .toctree-wrapper ol.simple li ul,.rst-content .toctree-wrapper ul.simple li>*,.rst-content .toctree-wrapper ul.simple li ol,.rst-content .toctree-wrapper ul.simple li ul,.rst-content section ol.simple li>*,.rst-content section ol.simple li ol,.rst-content section ol.simple li 
ul,.rst-content section ul.simple li>*,.rst-content section ul.simple li ol,.rst-content section ul.simple li ul{margin-top:0;margin-bottom:0}.rst-content .line-block{margin-left:0;margin-bottom:24px;line-height:24px}.rst-content .line-block .line-block{margin-left:24px;margin-bottom:0}.rst-content .topic-title{font-weight:700;margin-bottom:12px}.rst-content .toc-backref{color:#404040}.rst-content .align-right{float:right;margin:0 0 24px 24px}.rst-content .align-left{float:left;margin:0 24px 24px 0}.rst-content .align-center{margin:auto}.rst-content .align-center:not(table){display:block}.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink{opacity:0;font-size:14px;font-family:FontAwesome;margin-left:.5em}.rst-content .code-block-caption .headerlink:focus,.rst-content .code-block-caption:hover .headerlink,.rst-content .eqno .headerlink:focus,.rst-content .eqno:hover .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink:focus,.rst-content .toctree-wrapper>p.caption:hover .headerlink,.rst-content dl dt .headerlink:focus,.rst-content dl dt:hover .headerlink,.rst-content h1 .headerlink:focus,.rst-content h1:hover .headerlink,.rst-content h2 .headerlink:focus,.rst-content h2:hover .headerlink,.rst-content h3 .headerlink:focus,.rst-content h3:hover .headerlink,.rst-content h4 .headerlink:focus,.rst-content h4:hover .headerlink,.rst-content h5 .headerlink:focus,.rst-content h5:hover .headerlink,.rst-content h6 .headerlink:focus,.rst-content h6:hover .headerlink,.rst-content p.caption .headerlink:focus,.rst-content p.caption:hover .headerlink,.rst-content p .headerlink:focus,.rst-content p:hover .headerlink,.rst-content table>caption .headerlink:focus,.rst-content table>caption:hover .headerlink{opacity:1}.rst-content p a{overflow-wrap:anywhere}.rst-content .wy-table td p,.rst-content .wy-table td ul,.rst-content .wy-table th p,.rst-content .wy-table th ul,.rst-content table.docutils td p,.rst-content table.docutils td ul,.rst-content table.docutils th p,.rst-content table.docutils th ul,.rst-content table.field-list td p,.rst-content table.field-list td ul,.rst-content table.field-list th p,.rst-content table.field-list th ul{font-size:inherit}.rst-content .btn:focus{outline:2px solid}.rst-content table>caption .headerlink:after{font-size:12px}.rst-content .centered{text-align:center}.rst-content .sidebar{float:right;width:40%;display:block;margin:0 0 24px 24px;padding:24px;background:#f3f6f6;border:1px solid #e1e4e5}.rst-content .sidebar dl,.rst-content .sidebar p,.rst-content .sidebar ul{font-size:90%}.rst-content .sidebar .last,.rst-content .sidebar>:last-child{margin-bottom:0}.rst-content .sidebar .sidebar-title{display:block;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif;font-weight:700;background:#e1e4e5;padding:6px 12px;margin:-24px -24px 24px;font-size:100%}.rst-content .highlighted{background:#f1c40f;box-shadow:0 0 0 2px #f1c40f;display:inline;font-weight:700}.rst-content .citation-reference,.rst-content .footnote-reference{vertical-align:baseline;position:relative;top:-.4em;line-height:0;font-size:90%}.rst-content .hlist{width:100%}.rst-content dl dt span.classifier:before{content:" : 
"}.rst-content dl dt span.classifier-delimiter{display:none!important}html.writer-html4 .rst-content table.docutils.citation,html.writer-html4 .rst-content table.docutils.footnote{background:none;border:none}html.writer-html4 .rst-content table.docutils.citation td,html.writer-html4 .rst-content table.docutils.citation tr,html.writer-html4 .rst-content table.docutils.footnote td,html.writer-html4 .rst-content table.docutils.footnote tr{border:none;background-color:transparent!important;white-space:normal}html.writer-html4 .rst-content table.docutils.citation td.label,html.writer-html4 .rst-content table.docutils.footnote td.label{padding-left:0;padding-right:0;vertical-align:top}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{display:grid;grid-template-columns:max-content auto}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{padding-left:1rem}html.writer-html5 .rst-content dl.citation>dt:after,html.writer-html5 .rst-content dl.field-list>dt:after,html.writer-html5 .rst-content dl.footnote>dt:after{content:":"}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dd,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dd,html.writer-html5 .rst-content dl.footnote>dt{margin-bottom:0}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{font-size:.9rem}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.footnote>dt{margin:0 .5rem .5rem 0;line-height:1.2rem;word-break:break-all;font-weight:400}html.writer-html5 .rst-content dl.citation>dt>span.brackets,html.writer-html5 .rst-content dl.footnote>dt>span.brackets{margin-right:.5rem}html.writer-html5 .rst-content dl.citation>dt>span.brackets:before,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:before{content:"["}html.writer-html5 .rst-content dl.citation>dt>span.brackets:after,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:after{content:"]"}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref{font-style:italic}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.footnote>dd{margin:0 0 .5rem;line-height:1.2rem}html.writer-html5 .rst-content dl.citation>dd p,html.writer-html5 .rst-content dl.footnote>dd p,html.writer-html5 .rst-content dl.option-list kbd{font-size:.9rem}.rst-content dl.citation,.rst-content table.docutils.footnote,html.writer-html4 .rst-content table.docutils.citation,html.writer-html5 .rst-content dl.footnote{color:grey}.rst-content dl.citation code,.rst-content dl.citation tt,.rst-content table.docutils.footnote code,.rst-content table.docutils.footnote tt,html.writer-html4 .rst-content table.docutils.citation code,html.writer-html4 .rst-content table.docutils.citation tt,html.writer-html5 .rst-content dl.footnote code,html.writer-html5 .rst-content dl.footnote tt{color:#555}.rst-content .wy-table-responsive.citation,.rst-content .wy-table-responsive.footnote{margin-bottom:0}.rst-content .wy-table-responsive.citation+:not(.citation),.rst-content .wy-table-responsive.footnote+:not(.footnote){margin-top:24px}.rst-content .wy-table-responsive.citation:last-child,.rst-content .wy-table-responsive.footnote:last-child{margin-bottom:24px}.rst-content table.docutils 
th{border-color:#e1e4e5}html.writer-html5 .rst-content table.docutils th{border:1px solid #e1e4e5}html.writer-html5 .rst-content table.docutils td>p,html.writer-html5 .rst-content table.docutils th>p{line-height:1rem;margin-bottom:0;font-size:.9rem}.rst-content table.docutils td .last,.rst-content table.docutils td .last>:last-child{margin-bottom:0}.rst-content table.field-list,.rst-content table.field-list td{border:none}.rst-content table.field-list td p{line-height:inherit}.rst-content table.field-list td>strong{display:inline-block}.rst-content table.field-list .field-name{padding-right:10px;text-align:left;white-space:nowrap}.rst-content table.field-list .field-body{text-align:left}.rst-content code,.rst-content tt{color:#000;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;padding:2px 5px}.rst-content code big,.rst-content code em,.rst-content tt big,.rst-content tt em{font-size:100%!important;line-height:normal}.rst-content code.literal,.rst-content tt.literal{color:#e74c3c;white-space:normal}.rst-content code.xref,.rst-content tt.xref,a .rst-content code,a .rst-content tt{font-weight:700;color:#404040;overflow-wrap:normal}.rst-content kbd,.rst-content pre,.rst-content samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace}.rst-content a code,.rst-content a tt{color:#2980b9}.rst-content dl{margin-bottom:24px}.rst-content dl dt{font-weight:700;margin-bottom:12px}.rst-content dl ol,.rst-content dl p,.rst-content dl table,.rst-content dl ul{margin-bottom:12px}.rst-content dl dd{margin:0 0 12px 24px;line-height:24px}.rst-content dl dd>ol:last-child,.rst-content dl dd>p:last-child,.rst-content dl dd>table:last-child,.rst-content dl dd>ul:last-child{margin-bottom:0}html.writer-html4 .rst-content dl:not(.docutils),html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple){margin-bottom:24px}html.writer-html4 .rst-content dl:not(.docutils)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{display:table;margin:6px 0;font-size:90%;line-height:normal;background:#e7f2fa;color:#2980b9;border-top:3px solid #6ab0de;padding:6px;position:relative}html.writer-html4 .rst-content dl:not(.docutils)>dt:before,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:before{color:#6ab0de}html.writer-html4 .rst-content dl:not(.docutils)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{margin-bottom:6px;border:none;border-left:3px solid #ccc;background:#f0f0f0;color:#555}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink,html.writer-html5 .rst-content 
dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px;max-width:100%}html.writer-html4 .rst-content dl:not(.docutils) .k,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .k{font-style:italic}html.writer-html4 .rst-content dl:not(.docutils) .descclassname,html.writer-html4 .rst-content dl:not(.docutils) .descname,html.writer-html4 .rst-content dl:not(.docutils) .sig-name,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .sig-name{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#000}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content 
code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel{border:1px solid #7fbbe3;background:#e7f2fa;font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>.kbd,.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>kbd{color:inherit;font-size:80%;background-color:#fff;border:1px solid #a6a6a6;border-radius:4px;box-shadow:0 2px grey;padding:2.4px 6px;margin:auto 0}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content .sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} \ No newline at end of file diff --git a/branch/bart/_static/doctools.js b/branch/bart/_static/doctools.js new file mode 100644 index 0000000..527b876 --- /dev/null +++ b/branch/bart/_static/doctools.js @@ -0,0 +1,156 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Base JavaScript utilities for all Sphinx HTML documentation. + * + * :copyright: Copyright 2007-2022 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +const BLACKLISTED_KEY_CONTROL_ELEMENTS = new Set([ + "TEXTAREA", + "INPUT", + "SELECT", + "BUTTON", +]); + +const _ready = (callback) => { + if (document.readyState !== "loading") { + callback(); + } else { + document.addEventListener("DOMContentLoaded", callback); + } +}; + +/** + * Small JavaScript module for the documentation. + */ +const Documentation = { + init: () => { + Documentation.initDomainIndexTable(); + Documentation.initOnKeyListeners(); + }, + + /** + * i18n support + */ + TRANSLATIONS: {}, + PLURAL_EXPR: (n) => (n === 1 ? 
0 : 1), + LOCALE: "unknown", + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext: (string) => { + const translated = Documentation.TRANSLATIONS[string]; + switch (typeof translated) { + case "undefined": + return string; // no translation + case "string": + return translated; // translation exists + default: + return translated[0]; // (singular, plural) translation tuple exists + } + }, + + ngettext: (singular, plural, n) => { + const translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated !== "undefined") + return translated[Documentation.PLURAL_EXPR(n)]; + return n === 1 ? singular : plural; + }, + + addTranslations: (catalog) => { + Object.assign(Documentation.TRANSLATIONS, catalog.messages); + Documentation.PLURAL_EXPR = new Function( + "n", + `return (${catalog.plural_expr})` + ); + Documentation.LOCALE = catalog.locale; + }, + + /** + * helper function to focus on search bar + */ + focusSearchBar: () => { + document.querySelectorAll("input[name=q]")[0]?.focus(); + }, + + /** + * Initialise the domain index toggle buttons + */ + initDomainIndexTable: () => { + const toggler = (el) => { + const idNumber = el.id.substr(7); + const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); + if (el.src.substr(-9) === "minus.png") { + el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; + toggledRows.forEach((el) => (el.style.display = "none")); + } else { + el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; + toggledRows.forEach((el) => (el.style.display = "")); + } + }; + + const togglerElements = document.querySelectorAll("img.toggler"); + togglerElements.forEach((el) => + el.addEventListener("click", (event) => toggler(event.currentTarget)) + ); + togglerElements.forEach((el) => (el.style.display = "")); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); + }, + + initOnKeyListeners: () => { + // only install a listener if it is really needed + if ( + !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && + !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS + ) + return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.altKey || event.ctrlKey || event.metaKey) return; + + if (!event.shiftKey) { + switch (event.key) { + case "ArrowLeft": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const prevLink = document.querySelector('link[rel="prev"]'); + if (prevLink && prevLink.href) { + window.location.href = prevLink.href; + event.preventDefault(); + } + break; + case "ArrowRight": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const nextLink = document.querySelector('link[rel="next"]'); + if (nextLink && nextLink.href) { + window.location.href = nextLink.href; + event.preventDefault(); + } + break; + } + } + + // some keyboard layouts may need Shift to get / + switch (event.key) { + case "/": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.focusSearchBar(); + event.preventDefault(); + } + }); + }, +}; + +// quick alias for translations +const _ = Documentation.gettext; + +_ready(Documentation.init); diff --git a/branch/bart/_static/documentation_options.js b/branch/bart/_static/documentation_options.js new file mode 100644 index 0000000..c066c69 --- /dev/null +++ b/branch/bart/_static/documentation_options.js @@ -0,0 +1,14 @@ +var 
DOCUMENTATION_OPTIONS = { + URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'), + VERSION: '', + LANGUAGE: 'en', + COLLAPSE_INDEX: false, + BUILDER: 'dirhtml', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false, + SHOW_SEARCH_SUMMARY: true, + ENABLE_SEARCH_SHORTCUTS: true, +}; \ No newline at end of file diff --git a/branch/bart/_static/file.png b/branch/bart/_static/file.png new file mode 100644 index 0000000..a858a41 Binary files /dev/null and b/branch/bart/_static/file.png differ diff --git a/branch/bart/_static/graphviz.css b/branch/bart/_static/graphviz.css new file mode 100644 index 0000000..19e7afd --- /dev/null +++ b/branch/bart/_static/graphviz.css @@ -0,0 +1,19 @@ +/* + * graphviz.css + * ~~~~~~~~~~~~ + * + * Sphinx stylesheet -- graphviz extension. + * + * :copyright: Copyright 2007-2022 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +img.graphviz { + border: 0; + max-width: 100%; +} + +object.graphviz { + max-width: 100%; +} diff --git a/branch/bart/_static/jquery-3.6.0.js b/branch/bart/_static/jquery-3.6.0.js new file mode 100644 index 0000000..fc6c299 --- /dev/null +++ b/branch/bart/_static/jquery-3.6.0.js @@ -0,0 +1,10881 @@ +/*! + * jQuery JavaScript Library v3.6.0 + * https://jquery.com/ + * + * Includes Sizzle.js + * https://sizzlejs.com/ + * + * Copyright OpenJS Foundation and other contributors + * Released under the MIT license + * https://jquery.org/license + * + * Date: 2021-03-02T17:08Z + */ +( function( global, factory ) { + + "use strict"; + + if ( typeof module === "object" && typeof module.exports === "object" ) { + + // For CommonJS and CommonJS-like environments where a proper `window` + // is present, execute the factory and get jQuery. + // For environments that do not have a `window` with a `document` + // (such as Node.js), expose a factory as module.exports. + // This accentuates the need for the creation of a real `window`. + // e.g. var jQuery = require("jquery")(window); + // See ticket #14549 for more info. + module.exports = global.document ? + factory( global, true ) : + function( w ) { + if ( !w.document ) { + throw new Error( "jQuery requires a window with a document" ); + } + return factory( w ); + }; + } else { + factory( global ); + } + +// Pass this if window is not defined yet +} )( typeof window !== "undefined" ? window : this, function( window, noGlobal ) { + +// Edge <= 12 - 13+, Firefox <=18 - 45+, IE 10 - 11, Safari 5.1 - 9+, iOS 6 - 9.1 +// throw exceptions when non-strict code (e.g., ASP.NET 4.5) accesses strict mode +// arguments.callee.caller (trac-13335). But as of jQuery 3.0 (2016), strict mode should be common +// enough that all such attempts are guarded in a try block. +"use strict"; + +var arr = []; + +var getProto = Object.getPrototypeOf; + +var slice = arr.slice; + +var flat = arr.flat ? 
function( array ) { + return arr.flat.call( array ); +} : function( array ) { + return arr.concat.apply( [], array ); +}; + + +var push = arr.push; + +var indexOf = arr.indexOf; + +var class2type = {}; + +var toString = class2type.toString; + +var hasOwn = class2type.hasOwnProperty; + +var fnToString = hasOwn.toString; + +var ObjectFunctionString = fnToString.call( Object ); + +var support = {}; + +var isFunction = function isFunction( obj ) { + + // Support: Chrome <=57, Firefox <=52 + // In some browsers, typeof returns "function" for HTML elements + // (i.e., `typeof document.createElement( "object" ) === "function"`). + // We don't want to classify *any* DOM node as a function. + // Support: QtWeb <=3.8.5, WebKit <=534.34, wkhtmltopdf tool <=0.12.5 + // Plus for old WebKit, typeof returns "function" for HTML collections + // (e.g., `typeof document.getElementsByTagName("div") === "function"`). (gh-4756) + return typeof obj === "function" && typeof obj.nodeType !== "number" && + typeof obj.item !== "function"; + }; + + +var isWindow = function isWindow( obj ) { + return obj != null && obj === obj.window; + }; + + +var document = window.document; + + + + var preservedScriptAttributes = { + type: true, + src: true, + nonce: true, + noModule: true + }; + + function DOMEval( code, node, doc ) { + doc = doc || document; + + var i, val, + script = doc.createElement( "script" ); + + script.text = code; + if ( node ) { + for ( i in preservedScriptAttributes ) { + + // Support: Firefox 64+, Edge 18+ + // Some browsers don't support the "nonce" property on scripts. + // On the other hand, just using `getAttribute` is not enough as + // the `nonce` attribute is reset to an empty string whenever it + // becomes browsing-context connected. + // See https://github.com/whatwg/html/issues/2369 + // See https://html.spec.whatwg.org/#nonce-attributes + // The `node.getAttribute` check was added for the sake of + // `jQuery.globalEval` so that it can fake a nonce-containing node + // via an object. + val = node[ i ] || node.getAttribute && node.getAttribute( i ); + if ( val ) { + script.setAttribute( i, val ); + } + } + } + doc.head.appendChild( script ).parentNode.removeChild( script ); + } + + +function toType( obj ) { + if ( obj == null ) { + return obj + ""; + } + + // Support: Android <=2.3 only (functionish RegExp) + return typeof obj === "object" || typeof obj === "function" ? + class2type[ toString.call( obj ) ] || "object" : + typeof obj; +} +/* global Symbol */ +// Defining this global in .eslintrc.json would create a danger of using the global +// unguarded in another place, it seems safer to define global only for this module + + + +var + version = "3.6.0", + + // Define a local copy of jQuery + jQuery = function( selector, context ) { + + // The jQuery object is actually just the init constructor 'enhanced' + // Need init if jQuery is called (just allow error to be thrown if not included) + return new jQuery.fn.init( selector, context ); + }; + +jQuery.fn = jQuery.prototype = { + + // The current version of jQuery being used + jquery: version, + + constructor: jQuery, + + // The default length of a jQuery object is 0 + length: 0, + + toArray: function() { + return slice.call( this ); + }, + + // Get the Nth element in the matched element set OR + // Get the whole matched element set as a clean array + get: function( num ) { + + // Return all the elements in a clean array + if ( num == null ) { + return slice.call( this ); + } + + // Return just the one element from the set + return num < 0 ? 
this[ num + this.length ] : this[ num ]; + }, + + // Take an array of elements and push it onto the stack + // (returning the new matched element set) + pushStack: function( elems ) { + + // Build a new jQuery matched element set + var ret = jQuery.merge( this.constructor(), elems ); + + // Add the old object onto the stack (as a reference) + ret.prevObject = this; + + // Return the newly-formed element set + return ret; + }, + + // Execute a callback for every element in the matched set. + each: function( callback ) { + return jQuery.each( this, callback ); + }, + + map: function( callback ) { + return this.pushStack( jQuery.map( this, function( elem, i ) { + return callback.call( elem, i, elem ); + } ) ); + }, + + slice: function() { + return this.pushStack( slice.apply( this, arguments ) ); + }, + + first: function() { + return this.eq( 0 ); + }, + + last: function() { + return this.eq( -1 ); + }, + + even: function() { + return this.pushStack( jQuery.grep( this, function( _elem, i ) { + return ( i + 1 ) % 2; + } ) ); + }, + + odd: function() { + return this.pushStack( jQuery.grep( this, function( _elem, i ) { + return i % 2; + } ) ); + }, + + eq: function( i ) { + var len = this.length, + j = +i + ( i < 0 ? len : 0 ); + return this.pushStack( j >= 0 && j < len ? [ this[ j ] ] : [] ); + }, + + end: function() { + return this.prevObject || this.constructor(); + }, + + // For internal use only. + // Behaves like an Array's method, not like a jQuery method. + push: push, + sort: arr.sort, + splice: arr.splice +}; + +jQuery.extend = jQuery.fn.extend = function() { + var options, name, src, copy, copyIsArray, clone, + target = arguments[ 0 ] || {}, + i = 1, + length = arguments.length, + deep = false; + + // Handle a deep copy situation + if ( typeof target === "boolean" ) { + deep = target; + + // Skip the boolean and the target + target = arguments[ i ] || {}; + i++; + } + + // Handle case when target is a string or something (possible in deep copy) + if ( typeof target !== "object" && !isFunction( target ) ) { + target = {}; + } + + // Extend jQuery itself if only one argument is passed + if ( i === length ) { + target = this; + i--; + } + + for ( ; i < length; i++ ) { + + // Only deal with non-null/undefined values + if ( ( options = arguments[ i ] ) != null ) { + + // Extend the base object + for ( name in options ) { + copy = options[ name ]; + + // Prevent Object.prototype pollution + // Prevent never-ending loop + if ( name === "__proto__" || target === copy ) { + continue; + } + + // Recurse if we're merging plain objects or arrays + if ( deep && copy && ( jQuery.isPlainObject( copy ) || + ( copyIsArray = Array.isArray( copy ) ) ) ) { + src = target[ name ]; + + // Ensure proper type for the source value + if ( copyIsArray && !Array.isArray( src ) ) { + clone = []; + } else if ( !copyIsArray && !jQuery.isPlainObject( src ) ) { + clone = {}; + } else { + clone = src; + } + copyIsArray = false; + + // Never move original objects, clone them + target[ name ] = jQuery.extend( deep, clone, copy ); + + // Don't bring in undefined values + } else if ( copy !== undefined ) { + target[ name ] = copy; + } + } + } + } + + // Return the modified object + return target; +}; + +jQuery.extend( { + + // Unique for each copy of jQuery on the page + expando: "jQuery" + ( version + Math.random() ).replace( /\D/g, "" ), + + // Assume jQuery is ready without the ready module + isReady: true, + + error: function( msg ) { + throw new Error( msg ); + }, + + noop: function() {}, + + isPlainObject: function( 
obj ) { + var proto, Ctor; + + // Detect obvious negatives + // Use toString instead of jQuery.type to catch host objects + if ( !obj || toString.call( obj ) !== "[object Object]" ) { + return false; + } + + proto = getProto( obj ); + + // Objects with no prototype (e.g., `Object.create( null )`) are plain + if ( !proto ) { + return true; + } + + // Objects with prototype are plain iff they were constructed by a global Object function + Ctor = hasOwn.call( proto, "constructor" ) && proto.constructor; + return typeof Ctor === "function" && fnToString.call( Ctor ) === ObjectFunctionString; + }, + + isEmptyObject: function( obj ) { + var name; + + for ( name in obj ) { + return false; + } + return true; + }, + + // Evaluates a script in a provided context; falls back to the global one + // if not specified. + globalEval: function( code, options, doc ) { + DOMEval( code, { nonce: options && options.nonce }, doc ); + }, + + each: function( obj, callback ) { + var length, i = 0; + + if ( isArrayLike( obj ) ) { + length = obj.length; + for ( ; i < length; i++ ) { + if ( callback.call( obj[ i ], i, obj[ i ] ) === false ) { + break; + } + } + } else { + for ( i in obj ) { + if ( callback.call( obj[ i ], i, obj[ i ] ) === false ) { + break; + } + } + } + + return obj; + }, + + // results is for internal usage only + makeArray: function( arr, results ) { + var ret = results || []; + + if ( arr != null ) { + if ( isArrayLike( Object( arr ) ) ) { + jQuery.merge( ret, + typeof arr === "string" ? + [ arr ] : arr + ); + } else { + push.call( ret, arr ); + } + } + + return ret; + }, + + inArray: function( elem, arr, i ) { + return arr == null ? -1 : indexOf.call( arr, elem, i ); + }, + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + merge: function( first, second ) { + var len = +second.length, + j = 0, + i = first.length; + + for ( ; j < len; j++ ) { + first[ i++ ] = second[ j ]; + } + + first.length = i; + + return first; + }, + + grep: function( elems, callback, invert ) { + var callbackInverse, + matches = [], + i = 0, + length = elems.length, + callbackExpect = !invert; + + // Go through the array, only saving the items + // that pass the validator function + for ( ; i < length; i++ ) { + callbackInverse = !callback( elems[ i ], i ); + if ( callbackInverse !== callbackExpect ) { + matches.push( elems[ i ] ); + } + } + + return matches; + }, + + // arg is for internal usage only + map: function( elems, callback, arg ) { + var length, value, + i = 0, + ret = []; + + // Go through the array, translating each of the items to their new values + if ( isArrayLike( elems ) ) { + length = elems.length; + for ( ; i < length; i++ ) { + value = callback( elems[ i ], i, arg ); + + if ( value != null ) { + ret.push( value ); + } + } + + // Go through every key on the object, + } else { + for ( i in elems ) { + value = callback( elems[ i ], i, arg ); + + if ( value != null ) { + ret.push( value ); + } + } + } + + // Flatten any nested arrays + return flat( ret ); + }, + + // A global GUID counter for objects + guid: 1, + + // jQuery.support is not used in Core but other projects attach their + // properties to it so it needs to exist. 
+ support: support +} ); + +if ( typeof Symbol === "function" ) { + jQuery.fn[ Symbol.iterator ] = arr[ Symbol.iterator ]; +} + +// Populate the class2type map +jQuery.each( "Boolean Number String Function Array Date RegExp Object Error Symbol".split( " " ), + function( _i, name ) { + class2type[ "[object " + name + "]" ] = name.toLowerCase(); + } ); + +function isArrayLike( obj ) { + + // Support: real iOS 8.2 only (not reproducible in simulator) + // `in` check used to prevent JIT error (gh-2145) + // hasOwn isn't used here due to false negatives + // regarding Nodelist length in IE + var length = !!obj && "length" in obj && obj.length, + type = toType( obj ); + + if ( isFunction( obj ) || isWindow( obj ) ) { + return false; + } + + return type === "array" || length === 0 || + typeof length === "number" && length > 0 && ( length - 1 ) in obj; +} +var Sizzle = +/*! + * Sizzle CSS Selector Engine v2.3.6 + * https://sizzlejs.com/ + * + * Copyright JS Foundation and other contributors + * Released under the MIT license + * https://js.foundation/ + * + * Date: 2021-02-16 + */ +( function( window ) { +var i, + support, + Expr, + getText, + isXML, + tokenize, + compile, + select, + outermostContext, + sortInput, + hasDuplicate, + + // Local document vars + setDocument, + document, + docElem, + documentIsHTML, + rbuggyQSA, + rbuggyMatches, + matches, + contains, + + // Instance-specific data + expando = "sizzle" + 1 * new Date(), + preferredDoc = window.document, + dirruns = 0, + done = 0, + classCache = createCache(), + tokenCache = createCache(), + compilerCache = createCache(), + nonnativeSelectorCache = createCache(), + sortOrder = function( a, b ) { + if ( a === b ) { + hasDuplicate = true; + } + return 0; + }, + + // Instance methods + hasOwn = ( {} ).hasOwnProperty, + arr = [], + pop = arr.pop, + pushNative = arr.push, + push = arr.push, + slice = arr.slice, + + // Use a stripped-down indexOf as it's faster than native + // https://jsperf.com/thor-indexof-vs-for/5 + indexOf = function( list, elem ) { + var i = 0, + len = list.length; + for ( ; i < len; i++ ) { + if ( list[ i ] === elem ) { + return i; + } + } + return -1; + }, + + booleans = "checked|selected|async|autofocus|autoplay|controls|defer|disabled|hidden|" + + "ismap|loop|multiple|open|readonly|required|scoped", + + // Regular expressions + + // http://www.w3.org/TR/css3-selectors/#whitespace + whitespace = "[\\x20\\t\\r\\n\\f]", + + // https://www.w3.org/TR/css-syntax-3/#ident-token-diagram + identifier = "(?:\\\\[\\da-fA-F]{1,6}" + whitespace + + "?|\\\\[^\\r\\n\\f]|[\\w-]|[^\0-\\x7f])+", + + // Attribute selectors: http://www.w3.org/TR/selectors/#attribute-selectors + attributes = "\\[" + whitespace + "*(" + identifier + ")(?:" + whitespace + + + // Operator (capture 2) + "*([*^$|!~]?=)" + whitespace + + + // "Attribute values must be CSS identifiers [capture 5] + // or strings [capture 3 or capture 4]" + "*(?:'((?:\\\\.|[^\\\\'])*)'|\"((?:\\\\.|[^\\\\\"])*)\"|(" + identifier + "))|)" + + whitespace + "*\\]", + + pseudos = ":(" + identifier + ")(?:\\((" + + + // To reduce the number of selectors needing tokenize in the preFilter, prefer arguments: + // 1. quoted (capture 3; capture 4 or capture 5) + "('((?:\\\\.|[^\\\\'])*)'|\"((?:\\\\.|[^\\\\\"])*)\")|" + + + // 2. simple (capture 6) + "((?:\\\\.|[^\\\\()[\\]]|" + attributes + ")*)|" + + + // 3. 
anything else (capture 2) + ".*" + + ")\\)|)", + + // Leading and non-escaped trailing whitespace, capturing some non-whitespace characters preceding the latter + rwhitespace = new RegExp( whitespace + "+", "g" ), + rtrim = new RegExp( "^" + whitespace + "+|((?:^|[^\\\\])(?:\\\\.)*)" + + whitespace + "+$", "g" ), + + rcomma = new RegExp( "^" + whitespace + "*," + whitespace + "*" ), + rcombinators = new RegExp( "^" + whitespace + "*([>+~]|" + whitespace + ")" + whitespace + + "*" ), + rdescend = new RegExp( whitespace + "|>" ), + + rpseudo = new RegExp( pseudos ), + ridentifier = new RegExp( "^" + identifier + "$" ), + + matchExpr = { + "ID": new RegExp( "^#(" + identifier + ")" ), + "CLASS": new RegExp( "^\\.(" + identifier + ")" ), + "TAG": new RegExp( "^(" + identifier + "|[*])" ), + "ATTR": new RegExp( "^" + attributes ), + "PSEUDO": new RegExp( "^" + pseudos ), + "CHILD": new RegExp( "^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\(" + + whitespace + "*(even|odd|(([+-]|)(\\d*)n|)" + whitespace + "*(?:([+-]|)" + + whitespace + "*(\\d+)|))" + whitespace + "*\\)|)", "i" ), + "bool": new RegExp( "^(?:" + booleans + ")$", "i" ), + + // For use in libraries implementing .is() + // We use this for POS matching in `select` + "needsContext": new RegExp( "^" + whitespace + + "*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\(" + whitespace + + "*((?:-\\d)?\\d*)" + whitespace + "*\\)|)(?=[^-]|$)", "i" ) + }, + + rhtml = /HTML$/i, + rinputs = /^(?:input|select|textarea|button)$/i, + rheader = /^h\d$/i, + + rnative = /^[^{]+\{\s*\[native \w/, + + // Easily-parseable/retrievable ID or TAG or CLASS selectors + rquickExpr = /^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/, + + rsibling = /[+~]/, + + // CSS escapes + // http://www.w3.org/TR/CSS21/syndata.html#escaped-characters + runescape = new RegExp( "\\\\[\\da-fA-F]{1,6}" + whitespace + "?|\\\\([^\\r\\n\\f])", "g" ), + funescape = function( escape, nonHex ) { + var high = "0x" + escape.slice( 1 ) - 0x10000; + + return nonHex ? + + // Strip the backslash prefix from a non-hex escape sequence + nonHex : + + // Replace a hexadecimal escape sequence with the encoded Unicode code point + // Support: IE <=11+ + // For values outside the Basic Multilingual Plane (BMP), manually construct a + // surrogate pair + high < 0 ? 
+ String.fromCharCode( high + 0x10000 ) : + String.fromCharCode( high >> 10 | 0xD800, high & 0x3FF | 0xDC00 ); + }, + + // CSS string/identifier serialization + // https://drafts.csswg.org/cssom/#common-serializing-idioms + rcssescape = /([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g, + fcssescape = function( ch, asCodePoint ) { + if ( asCodePoint ) { + + // U+0000 NULL becomes U+FFFD REPLACEMENT CHARACTER + if ( ch === "\0" ) { + return "\uFFFD"; + } + + // Control characters and (dependent upon position) numbers get escaped as code points + return ch.slice( 0, -1 ) + "\\" + + ch.charCodeAt( ch.length - 1 ).toString( 16 ) + " "; + } + + // Other potentially-special ASCII characters get backslash-escaped + return "\\" + ch; + }, + + // Used for iframes + // See setDocument() + // Removing the function wrapper causes a "Permission Denied" + // error in IE + unloadHandler = function() { + setDocument(); + }, + + inDisabledFieldset = addCombinator( + function( elem ) { + return elem.disabled === true && elem.nodeName.toLowerCase() === "fieldset"; + }, + { dir: "parentNode", next: "legend" } + ); + +// Optimize for push.apply( _, NodeList ) +try { + push.apply( + ( arr = slice.call( preferredDoc.childNodes ) ), + preferredDoc.childNodes + ); + + // Support: Android<4.0 + // Detect silently failing push.apply + // eslint-disable-next-line no-unused-expressions + arr[ preferredDoc.childNodes.length ].nodeType; +} catch ( e ) { + push = { apply: arr.length ? + + // Leverage slice if possible + function( target, els ) { + pushNative.apply( target, slice.call( els ) ); + } : + + // Support: IE<9 + // Otherwise append directly + function( target, els ) { + var j = target.length, + i = 0; + + // Can't trust NodeList.length + while ( ( target[ j++ ] = els[ i++ ] ) ) {} + target.length = j - 1; + } + }; +} + +function Sizzle( selector, context, results, seed ) { + var m, i, elem, nid, match, groups, newSelector, + newContext = context && context.ownerDocument, + + // nodeType defaults to 9, since context defaults to document + nodeType = context ? 
context.nodeType : 9; + + results = results || []; + + // Return early from calls with invalid selector or context + if ( typeof selector !== "string" || !selector || + nodeType !== 1 && nodeType !== 9 && nodeType !== 11 ) { + + return results; + } + + // Try to shortcut find operations (as opposed to filters) in HTML documents + if ( !seed ) { + setDocument( context ); + context = context || document; + + if ( documentIsHTML ) { + + // If the selector is sufficiently simple, try using a "get*By*" DOM method + // (excepting DocumentFragment context, where the methods don't exist) + if ( nodeType !== 11 && ( match = rquickExpr.exec( selector ) ) ) { + + // ID selector + if ( ( m = match[ 1 ] ) ) { + + // Document context + if ( nodeType === 9 ) { + if ( ( elem = context.getElementById( m ) ) ) { + + // Support: IE, Opera, Webkit + // TODO: identify versions + // getElementById can match elements by name instead of ID + if ( elem.id === m ) { + results.push( elem ); + return results; + } + } else { + return results; + } + + // Element context + } else { + + // Support: IE, Opera, Webkit + // TODO: identify versions + // getElementById can match elements by name instead of ID + if ( newContext && ( elem = newContext.getElementById( m ) ) && + contains( context, elem ) && + elem.id === m ) { + + results.push( elem ); + return results; + } + } + + // Type selector + } else if ( match[ 2 ] ) { + push.apply( results, context.getElementsByTagName( selector ) ); + return results; + + // Class selector + } else if ( ( m = match[ 3 ] ) && support.getElementsByClassName && + context.getElementsByClassName ) { + + push.apply( results, context.getElementsByClassName( m ) ); + return results; + } + } + + // Take advantage of querySelectorAll + if ( support.qsa && + !nonnativeSelectorCache[ selector + " " ] && + ( !rbuggyQSA || !rbuggyQSA.test( selector ) ) && + + // Support: IE 8 only + // Exclude object elements + ( nodeType !== 1 || context.nodeName.toLowerCase() !== "object" ) ) { + + newSelector = selector; + newContext = context; + + // qSA considers elements outside a scoping root when evaluating child or + // descendant combinators, which is not what we want. + // In such cases, we work around the behavior by prefixing every selector in the + // list with an ID selector referencing the scope context. + // The technique has to be used as well when a leading combinator is used + // as such selectors are not recognized by querySelectorAll. + // Thanks to Andrew Dupont for this technique. + if ( nodeType === 1 && + ( rdescend.test( selector ) || rcombinators.test( selector ) ) ) { + + // Expand context for sibling selectors + newContext = rsibling.test( selector ) && testContext( context.parentNode ) || + context; + + // We can use :scope instead of the ID hack if the browser + // supports it & if we're not changing the context. + if ( newContext !== context || !support.scope ) { + + // Capture the context ID, setting it first if necessary + if ( ( nid = context.getAttribute( "id" ) ) ) { + nid = nid.replace( rcssescape, fcssescape ); + } else { + context.setAttribute( "id", ( nid = expando ) ); + } + } + + // Prefix every selector in the list + groups = tokenize( selector ); + i = groups.length; + while ( i-- ) { + groups[ i ] = ( nid ? 
"#" + nid : ":scope" ) + " " + + toSelector( groups[ i ] ); + } + newSelector = groups.join( "," ); + } + + try { + push.apply( results, + newContext.querySelectorAll( newSelector ) + ); + return results; + } catch ( qsaError ) { + nonnativeSelectorCache( selector, true ); + } finally { + if ( nid === expando ) { + context.removeAttribute( "id" ); + } + } + } + } + } + + // All others + return select( selector.replace( rtrim, "$1" ), context, results, seed ); +} + +/** + * Create key-value caches of limited size + * @returns {function(string, object)} Returns the Object data after storing it on itself with + * property name the (space-suffixed) string and (if the cache is larger than Expr.cacheLength) + * deleting the oldest entry + */ +function createCache() { + var keys = []; + + function cache( key, value ) { + + // Use (key + " ") to avoid collision with native prototype properties (see Issue #157) + if ( keys.push( key + " " ) > Expr.cacheLength ) { + + // Only keep the most recent entries + delete cache[ keys.shift() ]; + } + return ( cache[ key + " " ] = value ); + } + return cache; +} + +/** + * Mark a function for special use by Sizzle + * @param {Function} fn The function to mark + */ +function markFunction( fn ) { + fn[ expando ] = true; + return fn; +} + +/** + * Support testing using an element + * @param {Function} fn Passed the created element and returns a boolean result + */ +function assert( fn ) { + var el = document.createElement( "fieldset" ); + + try { + return !!fn( el ); + } catch ( e ) { + return false; + } finally { + + // Remove from its parent by default + if ( el.parentNode ) { + el.parentNode.removeChild( el ); + } + + // release memory in IE + el = null; + } +} + +/** + * Adds the same handler for all of the specified attrs + * @param {String} attrs Pipe-separated list of attributes + * @param {Function} handler The method that will be applied + */ +function addHandle( attrs, handler ) { + var arr = attrs.split( "|" ), + i = arr.length; + + while ( i-- ) { + Expr.attrHandle[ arr[ i ] ] = handler; + } +} + +/** + * Checks document order of two siblings + * @param {Element} a + * @param {Element} b + * @returns {Number} Returns less than 0 if a precedes b, greater than 0 if a follows b + */ +function siblingCheck( a, b ) { + var cur = b && a, + diff = cur && a.nodeType === 1 && b.nodeType === 1 && + a.sourceIndex - b.sourceIndex; + + // Use IE sourceIndex if available on both nodes + if ( diff ) { + return diff; + } + + // Check if b follows a + if ( cur ) { + while ( ( cur = cur.nextSibling ) ) { + if ( cur === b ) { + return -1; + } + } + } + + return a ? 
1 : -1; +} + +/** + * Returns a function to use in pseudos for input types + * @param {String} type + */ +function createInputPseudo( type ) { + return function( elem ) { + var name = elem.nodeName.toLowerCase(); + return name === "input" && elem.type === type; + }; +} + +/** + * Returns a function to use in pseudos for buttons + * @param {String} type + */ +function createButtonPseudo( type ) { + return function( elem ) { + var name = elem.nodeName.toLowerCase(); + return ( name === "input" || name === "button" ) && elem.type === type; + }; +} + +/** + * Returns a function to use in pseudos for :enabled/:disabled + * @param {Boolean} disabled true for :disabled; false for :enabled + */ +function createDisabledPseudo( disabled ) { + + // Known :disabled false positives: fieldset[disabled] > legend:nth-of-type(n+2) :can-disable + return function( elem ) { + + // Only certain elements can match :enabled or :disabled + // https://html.spec.whatwg.org/multipage/scripting.html#selector-enabled + // https://html.spec.whatwg.org/multipage/scripting.html#selector-disabled + if ( "form" in elem ) { + + // Check for inherited disabledness on relevant non-disabled elements: + // * listed form-associated elements in a disabled fieldset + // https://html.spec.whatwg.org/multipage/forms.html#category-listed + // https://html.spec.whatwg.org/multipage/forms.html#concept-fe-disabled + // * option elements in a disabled optgroup + // https://html.spec.whatwg.org/multipage/forms.html#concept-option-disabled + // All such elements have a "form" property. + if ( elem.parentNode && elem.disabled === false ) { + + // Option elements defer to a parent optgroup if present + if ( "label" in elem ) { + if ( "label" in elem.parentNode ) { + return elem.parentNode.disabled === disabled; + } else { + return elem.disabled === disabled; + } + } + + // Support: IE 6 - 11 + // Use the isDisabled shortcut property to check for disabled fieldset ancestors + return elem.isDisabled === disabled || + + // Where there is no isDisabled, check manually + /* jshint -W018 */ + elem.isDisabled !== !disabled && + inDisabledFieldset( elem ) === disabled; + } + + return elem.disabled === disabled; + + // Try to winnow out elements that can't be disabled before trusting the disabled property. + // Some victims get caught in our net (label, legend, menu, track), but it shouldn't + // even exist on them, let alone have a boolean value. 
+ } else if ( "label" in elem ) { + return elem.disabled === disabled; + } + + // Remaining elements are neither :enabled nor :disabled + return false; + }; +} + +/** + * Returns a function to use in pseudos for positionals + * @param {Function} fn + */ +function createPositionalPseudo( fn ) { + return markFunction( function( argument ) { + argument = +argument; + return markFunction( function( seed, matches ) { + var j, + matchIndexes = fn( [], seed.length, argument ), + i = matchIndexes.length; + + // Match elements found at the specified indexes + while ( i-- ) { + if ( seed[ ( j = matchIndexes[ i ] ) ] ) { + seed[ j ] = !( matches[ j ] = seed[ j ] ); + } + } + } ); + } ); +} + +/** + * Checks a node for validity as a Sizzle context + * @param {Element|Object=} context + * @returns {Element|Object|Boolean} The input node if acceptable, otherwise a falsy value + */ +function testContext( context ) { + return context && typeof context.getElementsByTagName !== "undefined" && context; +} + +// Expose support vars for convenience +support = Sizzle.support = {}; + +/** + * Detects XML nodes + * @param {Element|Object} elem An element or a document + * @returns {Boolean} True iff elem is a non-HTML XML node + */ +isXML = Sizzle.isXML = function( elem ) { + var namespace = elem && elem.namespaceURI, + docElem = elem && ( elem.ownerDocument || elem ).documentElement; + + // Support: IE <=8 + // Assume HTML when documentElement doesn't yet exist, such as inside loading iframes + // https://bugs.jquery.com/ticket/4833 + return !rhtml.test( namespace || docElem && docElem.nodeName || "HTML" ); +}; + +/** + * Sets document-related variables once based on the current document + * @param {Element|Object} [doc] An element or document object to use to set the document + * @returns {Object} Returns the current document + */ +setDocument = Sizzle.setDocument = function( node ) { + var hasCompare, subWindow, + doc = node ? node.ownerDocument || node : preferredDoc; + + // Return early if doc is invalid or already selected + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( doc == document || doc.nodeType !== 9 || !doc.documentElement ) { + return document; + } + + // Update global variables + document = doc; + docElem = document.documentElement; + documentIsHTML = !isXML( document ); + + // Support: IE 9 - 11+, Edge 12 - 18+ + // Accessing iframe documents after unload throws "permission denied" errors (jQuery #13936) + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( preferredDoc != document && + ( subWindow = document.defaultView ) && subWindow.top !== subWindow ) { + + // Support: IE 11, Edge + if ( subWindow.addEventListener ) { + subWindow.addEventListener( "unload", unloadHandler, false ); + + // Support: IE 9 - 10 only + } else if ( subWindow.attachEvent ) { + subWindow.attachEvent( "onunload", unloadHandler ); + } + } + + // Support: IE 8 - 11+, Edge 12 - 18+, Chrome <=16 - 25 only, Firefox <=3.6 - 31 only, + // Safari 4 - 5 only, Opera <=11.6 - 12.x only + // IE/Edge & older browsers don't support the :scope pseudo-class. + // Support: Safari 6.0 only + // Safari 6.0 supports :scope but it's an alias of :root there. 
+ support.scope = assert( function( el ) { + docElem.appendChild( el ).appendChild( document.createElement( "div" ) ); + return typeof el.querySelectorAll !== "undefined" && + !el.querySelectorAll( ":scope fieldset div" ).length; + } ); + + /* Attributes + ---------------------------------------------------------------------- */ + + // Support: IE<8 + // Verify that getAttribute really returns attributes and not properties + // (excepting IE8 booleans) + support.attributes = assert( function( el ) { + el.className = "i"; + return !el.getAttribute( "className" ); + } ); + + /* getElement(s)By* + ---------------------------------------------------------------------- */ + + // Check if getElementsByTagName("*") returns only elements + support.getElementsByTagName = assert( function( el ) { + el.appendChild( document.createComment( "" ) ); + return !el.getElementsByTagName( "*" ).length; + } ); + + // Support: IE<9 + support.getElementsByClassName = rnative.test( document.getElementsByClassName ); + + // Support: IE<10 + // Check if getElementById returns elements by name + // The broken getElementById methods don't pick up programmatically-set names, + // so use a roundabout getElementsByName test + support.getById = assert( function( el ) { + docElem.appendChild( el ).id = expando; + return !document.getElementsByName || !document.getElementsByName( expando ).length; + } ); + + // ID filter and find + if ( support.getById ) { + Expr.filter[ "ID" ] = function( id ) { + var attrId = id.replace( runescape, funescape ); + return function( elem ) { + return elem.getAttribute( "id" ) === attrId; + }; + }; + Expr.find[ "ID" ] = function( id, context ) { + if ( typeof context.getElementById !== "undefined" && documentIsHTML ) { + var elem = context.getElementById( id ); + return elem ? [ elem ] : []; + } + }; + } else { + Expr.filter[ "ID" ] = function( id ) { + var attrId = id.replace( runescape, funescape ); + return function( elem ) { + var node = typeof elem.getAttributeNode !== "undefined" && + elem.getAttributeNode( "id" ); + return node && node.value === attrId; + }; + }; + + // Support: IE 6 - 7 only + // getElementById is not reliable as a find shortcut + Expr.find[ "ID" ] = function( id, context ) { + if ( typeof context.getElementById !== "undefined" && documentIsHTML ) { + var node, i, elems, + elem = context.getElementById( id ); + + if ( elem ) { + + // Verify the id attribute + node = elem.getAttributeNode( "id" ); + if ( node && node.value === id ) { + return [ elem ]; + } + + // Fall back on getElementsByName + elems = context.getElementsByName( id ); + i = 0; + while ( ( elem = elems[ i++ ] ) ) { + node = elem.getAttributeNode( "id" ); + if ( node && node.value === id ) { + return [ elem ]; + } + } + } + + return []; + } + }; + } + + // Tag + Expr.find[ "TAG" ] = support.getElementsByTagName ? 
+ function( tag, context ) { + if ( typeof context.getElementsByTagName !== "undefined" ) { + return context.getElementsByTagName( tag ); + + // DocumentFragment nodes don't have gEBTN + } else if ( support.qsa ) { + return context.querySelectorAll( tag ); + } + } : + + function( tag, context ) { + var elem, + tmp = [], + i = 0, + + // By happy coincidence, a (broken) gEBTN appears on DocumentFragment nodes too + results = context.getElementsByTagName( tag ); + + // Filter out possible comments + if ( tag === "*" ) { + while ( ( elem = results[ i++ ] ) ) { + if ( elem.nodeType === 1 ) { + tmp.push( elem ); + } + } + + return tmp; + } + return results; + }; + + // Class + Expr.find[ "CLASS" ] = support.getElementsByClassName && function( className, context ) { + if ( typeof context.getElementsByClassName !== "undefined" && documentIsHTML ) { + return context.getElementsByClassName( className ); + } + }; + + /* QSA/matchesSelector + ---------------------------------------------------------------------- */ + + // QSA and matchesSelector support + + // matchesSelector(:active) reports false when true (IE9/Opera 11.5) + rbuggyMatches = []; + + // qSa(:focus) reports false when true (Chrome 21) + // We allow this because of a bug in IE8/9 that throws an error + // whenever `document.activeElement` is accessed on an iframe + // So, we allow :focus to pass through QSA all the time to avoid the IE error + // See https://bugs.jquery.com/ticket/13378 + rbuggyQSA = []; + + if ( ( support.qsa = rnative.test( document.querySelectorAll ) ) ) { + + // Build QSA regex + // Regex strategy adopted from Diego Perini + assert( function( el ) { + + var input; + + // Select is set to empty string on purpose + // This is to test IE's treatment of not explicitly + // setting a boolean content attribute, + // since its presence should be enough + // https://bugs.jquery.com/ticket/12359 + docElem.appendChild( el ).innerHTML = "" + + ""; + + // Support: IE8, Opera 11-12.16 + // Nothing should be selected when empty strings follow ^= or $= or *= + // The test attribute must be unknown in Opera but "safe" for WinRT + // https://msdn.microsoft.com/en-us/library/ie/hh465388.aspx#attribute_section + if ( el.querySelectorAll( "[msallowcapture^='']" ).length ) { + rbuggyQSA.push( "[*^$]=" + whitespace + "*(?:''|\"\")" ); + } + + // Support: IE8 + // Boolean attributes and "value" are not treated correctly + if ( !el.querySelectorAll( "[selected]" ).length ) { + rbuggyQSA.push( "\\[" + whitespace + "*(?:value|" + booleans + ")" ); + } + + // Support: Chrome<29, Android<4.4, Safari<7.0+, iOS<7.0+, PhantomJS<1.9.8+ + if ( !el.querySelectorAll( "[id~=" + expando + "-]" ).length ) { + rbuggyQSA.push( "~=" ); + } + + // Support: IE 11+, Edge 15 - 18+ + // IE 11/Edge don't find elements on a `[name='']` query in some cases. + // Adding a temporary attribute to the document before the selection works + // around the issue. + // Interestingly, IE 10 & older don't seem to have the issue. 
+ input = document.createElement( "input" ); + input.setAttribute( "name", "" ); + el.appendChild( input ); + if ( !el.querySelectorAll( "[name='']" ).length ) { + rbuggyQSA.push( "\\[" + whitespace + "*name" + whitespace + "*=" + + whitespace + "*(?:''|\"\")" ); + } + + // Webkit/Opera - :checked should return selected option elements + // http://www.w3.org/TR/2011/REC-css3-selectors-20110929/#checked + // IE8 throws error here and will not see later tests + if ( !el.querySelectorAll( ":checked" ).length ) { + rbuggyQSA.push( ":checked" ); + } + + // Support: Safari 8+, iOS 8+ + // https://bugs.webkit.org/show_bug.cgi?id=136851 + // In-page `selector#id sibling-combinator selector` fails + if ( !el.querySelectorAll( "a#" + expando + "+*" ).length ) { + rbuggyQSA.push( ".#.+[+~]" ); + } + + // Support: Firefox <=3.6 - 5 only + // Old Firefox doesn't throw on a badly-escaped identifier. + el.querySelectorAll( "\\\f" ); + rbuggyQSA.push( "[\\r\\n\\f]" ); + } ); + + assert( function( el ) { + el.innerHTML = "" + + ""; + + // Support: Windows 8 Native Apps + // The type and name attributes are restricted during .innerHTML assignment + var input = document.createElement( "input" ); + input.setAttribute( "type", "hidden" ); + el.appendChild( input ).setAttribute( "name", "D" ); + + // Support: IE8 + // Enforce case-sensitivity of name attribute + if ( el.querySelectorAll( "[name=d]" ).length ) { + rbuggyQSA.push( "name" + whitespace + "*[*^$|!~]?=" ); + } + + // FF 3.5 - :enabled/:disabled and hidden elements (hidden elements are still enabled) + // IE8 throws error here and will not see later tests + if ( el.querySelectorAll( ":enabled" ).length !== 2 ) { + rbuggyQSA.push( ":enabled", ":disabled" ); + } + + // Support: IE9-11+ + // IE's :disabled selector does not pick up the children of disabled fieldsets + docElem.appendChild( el ).disabled = true; + if ( el.querySelectorAll( ":disabled" ).length !== 2 ) { + rbuggyQSA.push( ":enabled", ":disabled" ); + } + + // Support: Opera 10 - 11 only + // Opera 10-11 does not throw on post-comma invalid pseudos + el.querySelectorAll( "*,:x" ); + rbuggyQSA.push( ",.*:" ); + } ); + } + + if ( ( support.matchesSelector = rnative.test( ( matches = docElem.matches || + docElem.webkitMatchesSelector || + docElem.mozMatchesSelector || + docElem.oMatchesSelector || + docElem.msMatchesSelector ) ) ) ) { + + assert( function( el ) { + + // Check to see if it's possible to do matchesSelector + // on a disconnected node (IE 9) + support.disconnectedMatch = matches.call( el, "*" ); + + // This should fail with an exception + // Gecko does not error, returns false instead + matches.call( el, "[s!='']:x" ); + rbuggyMatches.push( "!=", pseudos ); + } ); + } + + rbuggyQSA = rbuggyQSA.length && new RegExp( rbuggyQSA.join( "|" ) ); + rbuggyMatches = rbuggyMatches.length && new RegExp( rbuggyMatches.join( "|" ) ); + + /* Contains + ---------------------------------------------------------------------- */ + hasCompare = rnative.test( docElem.compareDocumentPosition ); + + // Element contains another + // Purposefully self-exclusive + // As in, an element does not contain itself + contains = hasCompare || rnative.test( docElem.contains ) ? + function( a, b ) { + var adown = a.nodeType === 9 ? a.documentElement : a, + bup = b && b.parentNode; + return a === bup || !!( bup && bup.nodeType === 1 && ( + adown.contains ? 
+ adown.contains( bup ) : + a.compareDocumentPosition && a.compareDocumentPosition( bup ) & 16 + ) ); + } : + function( a, b ) { + if ( b ) { + while ( ( b = b.parentNode ) ) { + if ( b === a ) { + return true; + } + } + } + return false; + }; + + /* Sorting + ---------------------------------------------------------------------- */ + + // Document order sorting + sortOrder = hasCompare ? + function( a, b ) { + + // Flag for duplicate removal + if ( a === b ) { + hasDuplicate = true; + return 0; + } + + // Sort on method existence if only one input has compareDocumentPosition + var compare = !a.compareDocumentPosition - !b.compareDocumentPosition; + if ( compare ) { + return compare; + } + + // Calculate position if both inputs belong to the same document + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + compare = ( a.ownerDocument || a ) == ( b.ownerDocument || b ) ? + a.compareDocumentPosition( b ) : + + // Otherwise we know they are disconnected + 1; + + // Disconnected nodes + if ( compare & 1 || + ( !support.sortDetached && b.compareDocumentPosition( a ) === compare ) ) { + + // Choose the first element that is related to our preferred document + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( a == document || a.ownerDocument == preferredDoc && + contains( preferredDoc, a ) ) { + return -1; + } + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( b == document || b.ownerDocument == preferredDoc && + contains( preferredDoc, b ) ) { + return 1; + } + + // Maintain original order + return sortInput ? + ( indexOf( sortInput, a ) - indexOf( sortInput, b ) ) : + 0; + } + + return compare & 4 ? -1 : 1; + } : + function( a, b ) { + + // Exit early if the nodes are identical + if ( a === b ) { + hasDuplicate = true; + return 0; + } + + var cur, + i = 0, + aup = a.parentNode, + bup = b.parentNode, + ap = [ a ], + bp = [ b ]; + + // Parentless nodes are either documents or disconnected + if ( !aup || !bup ) { + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + /* eslint-disable eqeqeq */ + return a == document ? -1 : + b == document ? 1 : + /* eslint-enable eqeqeq */ + aup ? -1 : + bup ? 1 : + sortInput ? + ( indexOf( sortInput, a ) - indexOf( sortInput, b ) ) : + 0; + + // If the nodes are siblings, we can do a quick check + } else if ( aup === bup ) { + return siblingCheck( a, b ); + } + + // Otherwise we need full lists of their ancestors for comparison + cur = a; + while ( ( cur = cur.parentNode ) ) { + ap.unshift( cur ); + } + cur = b; + while ( ( cur = cur.parentNode ) ) { + bp.unshift( cur ); + } + + // Walk down the tree looking for a discrepancy + while ( ap[ i ] === bp[ i ] ) { + i++; + } + + return i ? + + // Do a sibling check if the nodes have a common ancestor + siblingCheck( ap[ i ], bp[ i ] ) : + + // Otherwise nodes in our document sort first + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. 
+ /* eslint-disable eqeqeq */ + ap[ i ] == preferredDoc ? -1 : + bp[ i ] == preferredDoc ? 1 : + /* eslint-enable eqeqeq */ + 0; + }; + + return document; +}; + +Sizzle.matches = function( expr, elements ) { + return Sizzle( expr, null, null, elements ); +}; + +Sizzle.matchesSelector = function( elem, expr ) { + setDocument( elem ); + + if ( support.matchesSelector && documentIsHTML && + !nonnativeSelectorCache[ expr + " " ] && + ( !rbuggyMatches || !rbuggyMatches.test( expr ) ) && + ( !rbuggyQSA || !rbuggyQSA.test( expr ) ) ) { + + try { + var ret = matches.call( elem, expr ); + + // IE 9's matchesSelector returns false on disconnected nodes + if ( ret || support.disconnectedMatch || + + // As well, disconnected nodes are said to be in a document + // fragment in IE 9 + elem.document && elem.document.nodeType !== 11 ) { + return ret; + } + } catch ( e ) { + nonnativeSelectorCache( expr, true ); + } + } + + return Sizzle( expr, document, null, [ elem ] ).length > 0; +}; + +Sizzle.contains = function( context, elem ) { + + // Set document vars if needed + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( ( context.ownerDocument || context ) != document ) { + setDocument( context ); + } + return contains( context, elem ); +}; + +Sizzle.attr = function( elem, name ) { + + // Set document vars if needed + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( ( elem.ownerDocument || elem ) != document ) { + setDocument( elem ); + } + + var fn = Expr.attrHandle[ name.toLowerCase() ], + + // Don't get fooled by Object.prototype properties (jQuery #13807) + val = fn && hasOwn.call( Expr.attrHandle, name.toLowerCase() ) ? + fn( elem, name, !documentIsHTML ) : + undefined; + + return val !== undefined ? + val : + support.attributes || !documentIsHTML ? + elem.getAttribute( name ) : + ( val = elem.getAttributeNode( name ) ) && val.specified ? 
+ val.value : + null; +}; + +Sizzle.escape = function( sel ) { + return ( sel + "" ).replace( rcssescape, fcssescape ); +}; + +Sizzle.error = function( msg ) { + throw new Error( "Syntax error, unrecognized expression: " + msg ); +}; + +/** + * Document sorting and removing duplicates + * @param {ArrayLike} results + */ +Sizzle.uniqueSort = function( results ) { + var elem, + duplicates = [], + j = 0, + i = 0; + + // Unless we *know* we can detect duplicates, assume their presence + hasDuplicate = !support.detectDuplicates; + sortInput = !support.sortStable && results.slice( 0 ); + results.sort( sortOrder ); + + if ( hasDuplicate ) { + while ( ( elem = results[ i++ ] ) ) { + if ( elem === results[ i ] ) { + j = duplicates.push( i ); + } + } + while ( j-- ) { + results.splice( duplicates[ j ], 1 ); + } + } + + // Clear input after sorting to release objects + // See https://github.com/jquery/sizzle/pull/225 + sortInput = null; + + return results; +}; + +/** + * Utility function for retrieving the text value of an array of DOM nodes + * @param {Array|Element} elem + */ +getText = Sizzle.getText = function( elem ) { + var node, + ret = "", + i = 0, + nodeType = elem.nodeType; + + if ( !nodeType ) { + + // If no nodeType, this is expected to be an array + while ( ( node = elem[ i++ ] ) ) { + + // Do not traverse comment nodes + ret += getText( node ); + } + } else if ( nodeType === 1 || nodeType === 9 || nodeType === 11 ) { + + // Use textContent for elements + // innerText usage removed for consistency of new lines (jQuery #11153) + if ( typeof elem.textContent === "string" ) { + return elem.textContent; + } else { + + // Traverse its children + for ( elem = elem.firstChild; elem; elem = elem.nextSibling ) { + ret += getText( elem ); + } + } + } else if ( nodeType === 3 || nodeType === 4 ) { + return elem.nodeValue; + } + + // Do not include comment or processing instruction nodes + + return ret; +}; + +Expr = Sizzle.selectors = { + + // Can be adjusted by the user + cacheLength: 50, + + createPseudo: markFunction, + + match: matchExpr, + + attrHandle: {}, + + find: {}, + + relative: { + ">": { dir: "parentNode", first: true }, + " ": { dir: "parentNode" }, + "+": { dir: "previousSibling", first: true }, + "~": { dir: "previousSibling" } + }, + + preFilter: { + "ATTR": function( match ) { + match[ 1 ] = match[ 1 ].replace( runescape, funescape ); + + // Move the given value to match[3] whether quoted or unquoted + match[ 3 ] = ( match[ 3 ] || match[ 4 ] || + match[ 5 ] || "" ).replace( runescape, funescape ); + + if ( match[ 2 ] === "~=" ) { + match[ 3 ] = " " + match[ 3 ] + " "; + } + + return match.slice( 0, 4 ); + }, + + "CHILD": function( match ) { + + /* matches from matchExpr["CHILD"] + 1 type (only|nth|...) + 2 what (child|of-type) + 3 argument (even|odd|\d*|\d*n([+-]\d+)?|...) + 4 xn-component of xn+y argument ([+-]?\d*n|) + 5 sign of xn-component + 6 x of xn-component + 7 sign of y-component + 8 y of y-component + */ + match[ 1 ] = match[ 1 ].toLowerCase(); + + if ( match[ 1 ].slice( 0, 3 ) === "nth" ) { + + // nth-* requires argument + if ( !match[ 3 ] ) { + Sizzle.error( match[ 0 ] ); + } + + // numeric x and y parameters for Expr.filter.CHILD + // remember that false/true cast respectively to 0/1 + match[ 4 ] = +( match[ 4 ] ? 
+ match[ 5 ] + ( match[ 6 ] || 1 ) : + 2 * ( match[ 3 ] === "even" || match[ 3 ] === "odd" ) ); + match[ 5 ] = +( ( match[ 7 ] + match[ 8 ] ) || match[ 3 ] === "odd" ); + + // other types prohibit arguments + } else if ( match[ 3 ] ) { + Sizzle.error( match[ 0 ] ); + } + + return match; + }, + + "PSEUDO": function( match ) { + var excess, + unquoted = !match[ 6 ] && match[ 2 ]; + + if ( matchExpr[ "CHILD" ].test( match[ 0 ] ) ) { + return null; + } + + // Accept quoted arguments as-is + if ( match[ 3 ] ) { + match[ 2 ] = match[ 4 ] || match[ 5 ] || ""; + + // Strip excess characters from unquoted arguments + } else if ( unquoted && rpseudo.test( unquoted ) && + + // Get excess from tokenize (recursively) + ( excess = tokenize( unquoted, true ) ) && + + // advance to the next closing parenthesis + ( excess = unquoted.indexOf( ")", unquoted.length - excess ) - unquoted.length ) ) { + + // excess is a negative index + match[ 0 ] = match[ 0 ].slice( 0, excess ); + match[ 2 ] = unquoted.slice( 0, excess ); + } + + // Return only captures needed by the pseudo filter method (type and argument) + return match.slice( 0, 3 ); + } + }, + + filter: { + + "TAG": function( nodeNameSelector ) { + var nodeName = nodeNameSelector.replace( runescape, funescape ).toLowerCase(); + return nodeNameSelector === "*" ? + function() { + return true; + } : + function( elem ) { + return elem.nodeName && elem.nodeName.toLowerCase() === nodeName; + }; + }, + + "CLASS": function( className ) { + var pattern = classCache[ className + " " ]; + + return pattern || + ( pattern = new RegExp( "(^|" + whitespace + + ")" + className + "(" + whitespace + "|$)" ) ) && classCache( + className, function( elem ) { + return pattern.test( + typeof elem.className === "string" && elem.className || + typeof elem.getAttribute !== "undefined" && + elem.getAttribute( "class" ) || + "" + ); + } ); + }, + + "ATTR": function( name, operator, check ) { + return function( elem ) { + var result = Sizzle.attr( elem, name ); + + if ( result == null ) { + return operator === "!="; + } + if ( !operator ) { + return true; + } + + result += ""; + + /* eslint-disable max-len */ + + return operator === "=" ? result === check : + operator === "!=" ? result !== check : + operator === "^=" ? check && result.indexOf( check ) === 0 : + operator === "*=" ? check && result.indexOf( check ) > -1 : + operator === "$=" ? check && result.slice( -check.length ) === check : + operator === "~=" ? ( " " + result.replace( rwhitespace, " " ) + " " ).indexOf( check ) > -1 : + operator === "|=" ? result === check || result.slice( 0, check.length + 1 ) === check + "-" : + false; + /* eslint-enable max-len */ + + }; + }, + + "CHILD": function( type, what, _argument, first, last ) { + var simple = type.slice( 0, 3 ) !== "nth", + forward = type.slice( -4 ) !== "last", + ofType = what === "of-type"; + + return first === 1 && last === 0 ? + + // Shortcut for :nth-*(n) + function( elem ) { + return !!elem.parentNode; + } : + + function( elem, _context, xml ) { + var cache, uniqueCache, outerCache, node, nodeIndex, start, + dir = simple !== forward ? "nextSibling" : "previousSibling", + parent = elem.parentNode, + name = ofType && elem.nodeName.toLowerCase(), + useCache = !xml && !ofType, + diff = false; + + if ( parent ) { + + // :(first|last|only)-(child|of-type) + if ( simple ) { + while ( dir ) { + node = elem; + while ( ( node = node[ dir ] ) ) { + if ( ofType ? 
+ node.nodeName.toLowerCase() === name : + node.nodeType === 1 ) { + + return false; + } + } + + // Reverse direction for :only-* (if we haven't yet done so) + start = dir = type === "only" && !start && "nextSibling"; + } + return true; + } + + start = [ forward ? parent.firstChild : parent.lastChild ]; + + // non-xml :nth-child(...) stores cache data on `parent` + if ( forward && useCache ) { + + // Seek `elem` from a previously-cached index + + // ...in a gzip-friendly way + node = parent; + outerCache = node[ expando ] || ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + cache = uniqueCache[ type ] || []; + nodeIndex = cache[ 0 ] === dirruns && cache[ 1 ]; + diff = nodeIndex && cache[ 2 ]; + node = nodeIndex && parent.childNodes[ nodeIndex ]; + + while ( ( node = ++nodeIndex && node && node[ dir ] || + + // Fallback to seeking `elem` from the start + ( diff = nodeIndex = 0 ) || start.pop() ) ) { + + // When found, cache indexes on `parent` and break + if ( node.nodeType === 1 && ++diff && node === elem ) { + uniqueCache[ type ] = [ dirruns, nodeIndex, diff ]; + break; + } + } + + } else { + + // Use previously-cached element index if available + if ( useCache ) { + + // ...in a gzip-friendly way + node = elem; + outerCache = node[ expando ] || ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + cache = uniqueCache[ type ] || []; + nodeIndex = cache[ 0 ] === dirruns && cache[ 1 ]; + diff = nodeIndex; + } + + // xml :nth-child(...) + // or :nth-last-child(...) or :nth(-last)?-of-type(...) + if ( diff === false ) { + + // Use the same loop as above to seek `elem` from the start + while ( ( node = ++nodeIndex && node && node[ dir ] || + ( diff = nodeIndex = 0 ) || start.pop() ) ) { + + if ( ( ofType ? + node.nodeName.toLowerCase() === name : + node.nodeType === 1 ) && + ++diff ) { + + // Cache the index of each encountered element + if ( useCache ) { + outerCache = node[ expando ] || + ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + uniqueCache[ type ] = [ dirruns, diff ]; + } + + if ( node === elem ) { + break; + } + } + } + } + } + + // Incorporate the offset, then check against cycle size + diff -= last; + return diff === first || ( diff % first === 0 && diff / first >= 0 ); + } + }; + }, + + "PSEUDO": function( pseudo, argument ) { + + // pseudo-class names are case-insensitive + // http://www.w3.org/TR/selectors/#pseudo-classes + // Prioritize by case sensitivity in case custom pseudos are added with uppercase letters + // Remember that setFilters inherits from pseudos + var args, + fn = Expr.pseudos[ pseudo ] || Expr.setFilters[ pseudo.toLowerCase() ] || + Sizzle.error( "unsupported pseudo: " + pseudo ); + + // The user may use createPseudo to indicate that + // arguments are needed to create the filter function + // just as Sizzle does + if ( fn[ expando ] ) { + return fn( argument ); + } + + // But maintain support for old signatures + if ( fn.length > 1 ) { + args = [ pseudo, pseudo, "", argument ]; + return Expr.setFilters.hasOwnProperty( pseudo.toLowerCase() ) ? 
+ markFunction( function( seed, matches ) { + var idx, + matched = fn( seed, argument ), + i = matched.length; + while ( i-- ) { + idx = indexOf( seed, matched[ i ] ); + seed[ idx ] = !( matches[ idx ] = matched[ i ] ); + } + } ) : + function( elem ) { + return fn( elem, 0, args ); + }; + } + + return fn; + } + }, + + pseudos: { + + // Potentially complex pseudos + "not": markFunction( function( selector ) { + + // Trim the selector passed to compile + // to avoid treating leading and trailing + // spaces as combinators + var input = [], + results = [], + matcher = compile( selector.replace( rtrim, "$1" ) ); + + return matcher[ expando ] ? + markFunction( function( seed, matches, _context, xml ) { + var elem, + unmatched = matcher( seed, null, xml, [] ), + i = seed.length; + + // Match elements unmatched by `matcher` + while ( i-- ) { + if ( ( elem = unmatched[ i ] ) ) { + seed[ i ] = !( matches[ i ] = elem ); + } + } + } ) : + function( elem, _context, xml ) { + input[ 0 ] = elem; + matcher( input, null, xml, results ); + + // Don't keep the element (issue #299) + input[ 0 ] = null; + return !results.pop(); + }; + } ), + + "has": markFunction( function( selector ) { + return function( elem ) { + return Sizzle( selector, elem ).length > 0; + }; + } ), + + "contains": markFunction( function( text ) { + text = text.replace( runescape, funescape ); + return function( elem ) { + return ( elem.textContent || getText( elem ) ).indexOf( text ) > -1; + }; + } ), + + // "Whether an element is represented by a :lang() selector + // is based solely on the element's language value + // being equal to the identifier C, + // or beginning with the identifier C immediately followed by "-". + // The matching of C against the element's language value is performed case-insensitively. + // The identifier C does not have to be a valid language name." + // http://www.w3.org/TR/selectors/#lang-pseudo + "lang": markFunction( function( lang ) { + + // lang value must be a valid identifier + if ( !ridentifier.test( lang || "" ) ) { + Sizzle.error( "unsupported lang: " + lang ); + } + lang = lang.replace( runescape, funescape ).toLowerCase(); + return function( elem ) { + var elemLang; + do { + if ( ( elemLang = documentIsHTML ? 
+ elem.lang : + elem.getAttribute( "xml:lang" ) || elem.getAttribute( "lang" ) ) ) { + + elemLang = elemLang.toLowerCase(); + return elemLang === lang || elemLang.indexOf( lang + "-" ) === 0; + } + } while ( ( elem = elem.parentNode ) && elem.nodeType === 1 ); + return false; + }; + } ), + + // Miscellaneous + "target": function( elem ) { + var hash = window.location && window.location.hash; + return hash && hash.slice( 1 ) === elem.id; + }, + + "root": function( elem ) { + return elem === docElem; + }, + + "focus": function( elem ) { + return elem === document.activeElement && + ( !document.hasFocus || document.hasFocus() ) && + !!( elem.type || elem.href || ~elem.tabIndex ); + }, + + // Boolean properties + "enabled": createDisabledPseudo( false ), + "disabled": createDisabledPseudo( true ), + + "checked": function( elem ) { + + // In CSS3, :checked should return both checked and selected elements + // http://www.w3.org/TR/2011/REC-css3-selectors-20110929/#checked + var nodeName = elem.nodeName.toLowerCase(); + return ( nodeName === "input" && !!elem.checked ) || + ( nodeName === "option" && !!elem.selected ); + }, + + "selected": function( elem ) { + + // Accessing this property makes selected-by-default + // options in Safari work properly + if ( elem.parentNode ) { + // eslint-disable-next-line no-unused-expressions + elem.parentNode.selectedIndex; + } + + return elem.selected === true; + }, + + // Contents + "empty": function( elem ) { + + // http://www.w3.org/TR/selectors/#empty-pseudo + // :empty is negated by element (1) or content nodes (text: 3; cdata: 4; entity ref: 5), + // but not by others (comment: 8; processing instruction: 7; etc.) + // nodeType < 6 works because attributes (2) do not appear as children + for ( elem = elem.firstChild; elem; elem = elem.nextSibling ) { + if ( elem.nodeType < 6 ) { + return false; + } + } + return true; + }, + + "parent": function( elem ) { + return !Expr.pseudos[ "empty" ]( elem ); + }, + + // Element/input types + "header": function( elem ) { + return rheader.test( elem.nodeName ); + }, + + "input": function( elem ) { + return rinputs.test( elem.nodeName ); + }, + + "button": function( elem ) { + var name = elem.nodeName.toLowerCase(); + return name === "input" && elem.type === "button" || name === "button"; + }, + + "text": function( elem ) { + var attr; + return elem.nodeName.toLowerCase() === "input" && + elem.type === "text" && + + // Support: IE<8 + // New HTML5 attribute values (e.g., "search") appear with elem.type === "text" + ( ( attr = elem.getAttribute( "type" ) ) == null || + attr.toLowerCase() === "text" ); + }, + + // Position-in-collection + "first": createPositionalPseudo( function() { + return [ 0 ]; + } ), + + "last": createPositionalPseudo( function( _matchIndexes, length ) { + return [ length - 1 ]; + } ), + + "eq": createPositionalPseudo( function( _matchIndexes, length, argument ) { + return [ argument < 0 ? argument + length : argument ]; + } ), + + "even": createPositionalPseudo( function( matchIndexes, length ) { + var i = 0; + for ( ; i < length; i += 2 ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "odd": createPositionalPseudo( function( matchIndexes, length ) { + var i = 1; + for ( ; i < length; i += 2 ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "lt": createPositionalPseudo( function( matchIndexes, length, argument ) { + var i = argument < 0 ? + argument + length : + argument > length ? 
+ length : + argument; + for ( ; --i >= 0; ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "gt": createPositionalPseudo( function( matchIndexes, length, argument ) { + var i = argument < 0 ? argument + length : argument; + for ( ; ++i < length; ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ) + } +}; + +Expr.pseudos[ "nth" ] = Expr.pseudos[ "eq" ]; + +// Add button/input type pseudos +for ( i in { radio: true, checkbox: true, file: true, password: true, image: true } ) { + Expr.pseudos[ i ] = createInputPseudo( i ); +} +for ( i in { submit: true, reset: true } ) { + Expr.pseudos[ i ] = createButtonPseudo( i ); +} + +// Easy API for creating new setFilters +function setFilters() {} +setFilters.prototype = Expr.filters = Expr.pseudos; +Expr.setFilters = new setFilters(); + +tokenize = Sizzle.tokenize = function( selector, parseOnly ) { + var matched, match, tokens, type, + soFar, groups, preFilters, + cached = tokenCache[ selector + " " ]; + + if ( cached ) { + return parseOnly ? 0 : cached.slice( 0 ); + } + + soFar = selector; + groups = []; + preFilters = Expr.preFilter; + + while ( soFar ) { + + // Comma and first run + if ( !matched || ( match = rcomma.exec( soFar ) ) ) { + if ( match ) { + + // Don't consume trailing commas as valid + soFar = soFar.slice( match[ 0 ].length ) || soFar; + } + groups.push( ( tokens = [] ) ); + } + + matched = false; + + // Combinators + if ( ( match = rcombinators.exec( soFar ) ) ) { + matched = match.shift(); + tokens.push( { + value: matched, + + // Cast descendant combinators to space + type: match[ 0 ].replace( rtrim, " " ) + } ); + soFar = soFar.slice( matched.length ); + } + + // Filters + for ( type in Expr.filter ) { + if ( ( match = matchExpr[ type ].exec( soFar ) ) && ( !preFilters[ type ] || + ( match = preFilters[ type ]( match ) ) ) ) { + matched = match.shift(); + tokens.push( { + value: matched, + type: type, + matches: match + } ); + soFar = soFar.slice( matched.length ); + } + } + + if ( !matched ) { + break; + } + } + + // Return the length of the invalid excess + // if we're just parsing + // Otherwise, throw an error or return tokens + return parseOnly ? + soFar.length : + soFar ? + Sizzle.error( selector ) : + + // Cache the tokens + tokenCache( selector, groups ).slice( 0 ); +}; + +function toSelector( tokens ) { + var i = 0, + len = tokens.length, + selector = ""; + for ( ; i < len; i++ ) { + selector += tokens[ i ].value; + } + return selector; +} + +function addCombinator( matcher, combinator, base ) { + var dir = combinator.dir, + skip = combinator.next, + key = skip || dir, + checkNonElements = base && key === "parentNode", + doneName = done++; + + return combinator.first ? 
+ + // Check against closest ancestor/preceding element + function( elem, context, xml ) { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + return matcher( elem, context, xml ); + } + } + return false; + } : + + // Check against all ancestor/preceding elements + function( elem, context, xml ) { + var oldCache, uniqueCache, outerCache, + newCache = [ dirruns, doneName ]; + + // We can't set arbitrary data on XML nodes, so they don't benefit from combinator caching + if ( xml ) { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + if ( matcher( elem, context, xml ) ) { + return true; + } + } + } + } else { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + outerCache = elem[ expando ] || ( elem[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ elem.uniqueID ] || + ( outerCache[ elem.uniqueID ] = {} ); + + if ( skip && skip === elem.nodeName.toLowerCase() ) { + elem = elem[ dir ] || elem; + } else if ( ( oldCache = uniqueCache[ key ] ) && + oldCache[ 0 ] === dirruns && oldCache[ 1 ] === doneName ) { + + // Assign to newCache so results back-propagate to previous elements + return ( newCache[ 2 ] = oldCache[ 2 ] ); + } else { + + // Reuse newcache so results back-propagate to previous elements + uniqueCache[ key ] = newCache; + + // A match means we're done; a fail means we have to keep checking + if ( ( newCache[ 2 ] = matcher( elem, context, xml ) ) ) { + return true; + } + } + } + } + } + return false; + }; +} + +function elementMatcher( matchers ) { + return matchers.length > 1 ? + function( elem, context, xml ) { + var i = matchers.length; + while ( i-- ) { + if ( !matchers[ i ]( elem, context, xml ) ) { + return false; + } + } + return true; + } : + matchers[ 0 ]; +} + +function multipleContexts( selector, contexts, results ) { + var i = 0, + len = contexts.length; + for ( ; i < len; i++ ) { + Sizzle( selector, contexts[ i ], results ); + } + return results; +} + +function condense( unmatched, map, filter, context, xml ) { + var elem, + newUnmatched = [], + i = 0, + len = unmatched.length, + mapped = map != null; + + for ( ; i < len; i++ ) { + if ( ( elem = unmatched[ i ] ) ) { + if ( !filter || filter( elem, context, xml ) ) { + newUnmatched.push( elem ); + if ( mapped ) { + map.push( i ); + } + } + } + } + + return newUnmatched; +} + +function setMatcher( preFilter, selector, matcher, postFilter, postFinder, postSelector ) { + if ( postFilter && !postFilter[ expando ] ) { + postFilter = setMatcher( postFilter ); + } + if ( postFinder && !postFinder[ expando ] ) { + postFinder = setMatcher( postFinder, postSelector ); + } + return markFunction( function( seed, results, context, xml ) { + var temp, i, elem, + preMap = [], + postMap = [], + preexisting = results.length, + + // Get initial elements from seed or context + elems = seed || multipleContexts( + selector || "*", + context.nodeType ? [ context ] : context, + [] + ), + + // Prefilter to get matcher input, preserving a map for seed-results synchronization + matcherIn = preFilter && ( seed || !selector ) ? + condense( elems, preMap, preFilter, context, xml ) : + elems, + + matcherOut = matcher ? + + // If we have a postFinder, or filtered seed, or non-seed postFilter or preexisting results, + postFinder || ( seed ? preFilter : preexisting || postFilter ) ? 
+ + // ...intermediate processing is necessary + [] : + + // ...otherwise use results directly + results : + matcherIn; + + // Find primary matches + if ( matcher ) { + matcher( matcherIn, matcherOut, context, xml ); + } + + // Apply postFilter + if ( postFilter ) { + temp = condense( matcherOut, postMap ); + postFilter( temp, [], context, xml ); + + // Un-match failing elements by moving them back to matcherIn + i = temp.length; + while ( i-- ) { + if ( ( elem = temp[ i ] ) ) { + matcherOut[ postMap[ i ] ] = !( matcherIn[ postMap[ i ] ] = elem ); + } + } + } + + if ( seed ) { + if ( postFinder || preFilter ) { + if ( postFinder ) { + + // Get the final matcherOut by condensing this intermediate into postFinder contexts + temp = []; + i = matcherOut.length; + while ( i-- ) { + if ( ( elem = matcherOut[ i ] ) ) { + + // Restore matcherIn since elem is not yet a final match + temp.push( ( matcherIn[ i ] = elem ) ); + } + } + postFinder( null, ( matcherOut = [] ), temp, xml ); + } + + // Move matched elements from seed to results to keep them synchronized + i = matcherOut.length; + while ( i-- ) { + if ( ( elem = matcherOut[ i ] ) && + ( temp = postFinder ? indexOf( seed, elem ) : preMap[ i ] ) > -1 ) { + + seed[ temp ] = !( results[ temp ] = elem ); + } + } + } + + // Add elements to results, through postFinder if defined + } else { + matcherOut = condense( + matcherOut === results ? + matcherOut.splice( preexisting, matcherOut.length ) : + matcherOut + ); + if ( postFinder ) { + postFinder( null, results, matcherOut, xml ); + } else { + push.apply( results, matcherOut ); + } + } + } ); +} + +function matcherFromTokens( tokens ) { + var checkContext, matcher, j, + len = tokens.length, + leadingRelative = Expr.relative[ tokens[ 0 ].type ], + implicitRelative = leadingRelative || Expr.relative[ " " ], + i = leadingRelative ? 1 : 0, + + // The foundational matcher ensures that elements are reachable from top-level context(s) + matchContext = addCombinator( function( elem ) { + return elem === checkContext; + }, implicitRelative, true ), + matchAnyContext = addCombinator( function( elem ) { + return indexOf( checkContext, elem ) > -1; + }, implicitRelative, true ), + matchers = [ function( elem, context, xml ) { + var ret = ( !leadingRelative && ( xml || context !== outermostContext ) ) || ( + ( checkContext = context ).nodeType ? + matchContext( elem, context, xml ) : + matchAnyContext( elem, context, xml ) ); + + // Avoid hanging onto element (issue #299) + checkContext = null; + return ret; + } ]; + + for ( ; i < len; i++ ) { + if ( ( matcher = Expr.relative[ tokens[ i ].type ] ) ) { + matchers = [ addCombinator( elementMatcher( matchers ), matcher ) ]; + } else { + matcher = Expr.filter[ tokens[ i ].type ].apply( null, tokens[ i ].matches ); + + // Return special upon seeing a positional matcher + if ( matcher[ expando ] ) { + + // Find the next relative operator (if any) for proper handling + j = ++i; + for ( ; j < len; j++ ) { + if ( Expr.relative[ tokens[ j ].type ] ) { + break; + } + } + return setMatcher( + i > 1 && elementMatcher( matchers ), + i > 1 && toSelector( + + // If the preceding token was a descendant combinator, insert an implicit any-element `*` + tokens + .slice( 0, i - 1 ) + .concat( { value: tokens[ i - 2 ].type === " " ? 
"*" : "" } ) + ).replace( rtrim, "$1" ), + matcher, + i < j && matcherFromTokens( tokens.slice( i, j ) ), + j < len && matcherFromTokens( ( tokens = tokens.slice( j ) ) ), + j < len && toSelector( tokens ) + ); + } + matchers.push( matcher ); + } + } + + return elementMatcher( matchers ); +} + +function matcherFromGroupMatchers( elementMatchers, setMatchers ) { + var bySet = setMatchers.length > 0, + byElement = elementMatchers.length > 0, + superMatcher = function( seed, context, xml, results, outermost ) { + var elem, j, matcher, + matchedCount = 0, + i = "0", + unmatched = seed && [], + setMatched = [], + contextBackup = outermostContext, + + // We must always have either seed elements or outermost context + elems = seed || byElement && Expr.find[ "TAG" ]( "*", outermost ), + + // Use integer dirruns iff this is the outermost matcher + dirrunsUnique = ( dirruns += contextBackup == null ? 1 : Math.random() || 0.1 ), + len = elems.length; + + if ( outermost ) { + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + outermostContext = context == document || context || outermost; + } + + // Add elements passing elementMatchers directly to results + // Support: IE<9, Safari + // Tolerate NodeList properties (IE: "length"; Safari: ) matching elements by id + for ( ; i !== len && ( elem = elems[ i ] ) != null; i++ ) { + if ( byElement && elem ) { + j = 0; + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( !context && elem.ownerDocument != document ) { + setDocument( elem ); + xml = !documentIsHTML; + } + while ( ( matcher = elementMatchers[ j++ ] ) ) { + if ( matcher( elem, context || document, xml ) ) { + results.push( elem ); + break; + } + } + if ( outermost ) { + dirruns = dirrunsUnique; + } + } + + // Track unmatched elements for set filters + if ( bySet ) { + + // They will have gone through all possible matchers + if ( ( elem = !matcher && elem ) ) { + matchedCount--; + } + + // Lengthen the array for every element, matched or not + if ( seed ) { + unmatched.push( elem ); + } + } + } + + // `i` is now the count of elements visited above, and adding it to `matchedCount` + // makes the latter nonnegative. + matchedCount += i; + + // Apply set filters to unmatched elements + // NOTE: This can be skipped if there are no unmatched elements (i.e., `matchedCount` + // equals `i`), unless we didn't visit _any_ elements in the above loop because we have + // no element matchers and no seed. + // Incrementing an initially-string "0" `i` allows `i` to remain a string only in that + // case, which will result in a "00" `matchedCount` that differs from `i` but is also + // numerically zero. 
+ if ( bySet && i !== matchedCount ) { + j = 0; + while ( ( matcher = setMatchers[ j++ ] ) ) { + matcher( unmatched, setMatched, context, xml ); + } + + if ( seed ) { + + // Reintegrate element matches to eliminate the need for sorting + if ( matchedCount > 0 ) { + while ( i-- ) { + if ( !( unmatched[ i ] || setMatched[ i ] ) ) { + setMatched[ i ] = pop.call( results ); + } + } + } + + // Discard index placeholder values to get only actual matches + setMatched = condense( setMatched ); + } + + // Add matches to results + push.apply( results, setMatched ); + + // Seedless set matches succeeding multiple successful matchers stipulate sorting + if ( outermost && !seed && setMatched.length > 0 && + ( matchedCount + setMatchers.length ) > 1 ) { + + Sizzle.uniqueSort( results ); + } + } + + // Override manipulation of globals by nested matchers + if ( outermost ) { + dirruns = dirrunsUnique; + outermostContext = contextBackup; + } + + return unmatched; + }; + + return bySet ? + markFunction( superMatcher ) : + superMatcher; +} + +compile = Sizzle.compile = function( selector, match /* Internal Use Only */ ) { + var i, + setMatchers = [], + elementMatchers = [], + cached = compilerCache[ selector + " " ]; + + if ( !cached ) { + + // Generate a function of recursive functions that can be used to check each element + if ( !match ) { + match = tokenize( selector ); + } + i = match.length; + while ( i-- ) { + cached = matcherFromTokens( match[ i ] ); + if ( cached[ expando ] ) { + setMatchers.push( cached ); + } else { + elementMatchers.push( cached ); + } + } + + // Cache the compiled function + cached = compilerCache( + selector, + matcherFromGroupMatchers( elementMatchers, setMatchers ) + ); + + // Save selector and tokenization + cached.selector = selector; + } + return cached; +}; + +/** + * A low-level selection function that works with Sizzle's compiled + * selector functions + * @param {String|Function} selector A selector or a pre-compiled + * selector function built with Sizzle.compile + * @param {Element} context + * @param {Array} [results] + * @param {Array} [seed] A set of elements to match against + */ +select = Sizzle.select = function( selector, context, results, seed ) { + var i, tokens, token, type, find, + compiled = typeof selector === "function" && selector, + match = !seed && tokenize( ( selector = compiled.selector || selector ) ); + + results = results || []; + + // Try to minimize operations if there is only one selector in the list and no seed + // (the latter of which guarantees us context) + if ( match.length === 1 ) { + + // Reduce context if the leading compound selector is an ID + tokens = match[ 0 ] = match[ 0 ].slice( 0 ); + if ( tokens.length > 2 && ( token = tokens[ 0 ] ).type === "ID" && + context.nodeType === 9 && documentIsHTML && Expr.relative[ tokens[ 1 ].type ] ) { + + context = ( Expr.find[ "ID" ]( token.matches[ 0 ] + .replace( runescape, funescape ), context ) || [] )[ 0 ]; + if ( !context ) { + return results; + + // Precompiled matchers will still verify ancestry, so step up a level + } else if ( compiled ) { + context = context.parentNode; + } + + selector = selector.slice( tokens.shift().value.length ); + } + + // Fetch a seed set for right-to-left matching + i = matchExpr[ "needsContext" ].test( selector ) ? 
0 : tokens.length;
+ while ( i-- ) {
+ token = tokens[ i ];
+
+ // Abort if we hit a combinator
+ if ( Expr.relative[ ( type = token.type ) ] ) {
+ break;
+ }
+ if ( ( find = Expr.find[ type ] ) ) {
+
+ // Search, expanding context for leading sibling combinators
+ if ( ( seed = find(
+ token.matches[ 0 ].replace( runescape, funescape ),
+ rsibling.test( tokens[ 0 ].type ) && testContext( context.parentNode ) ||
+ context
+ ) ) ) {
+
+ // If seed is empty or no tokens remain, we can return early
+ tokens.splice( i, 1 );
+ selector = seed.length && toSelector( tokens );
+ if ( !selector ) {
+ push.apply( results, seed );
+ return results;
+ }
+
+ break;
+ }
+ }
+ }
+ }
+
+ // Compile and execute a filtering function if one is not provided
+ // Provide `match` to avoid retokenization if we modified the selector above
+ ( compiled || compile( selector, match ) )(
+ seed,
+ context,
+ !documentIsHTML,
+ results,
+ !context || rsibling.test( selector ) && testContext( context.parentNode ) || context
+ );
+ return results;
+};
+
+// One-time assignments
+
+// Sort stability
+support.sortStable = expando.split( "" ).sort( sortOrder ).join( "" ) === expando;
+
+// Support: Chrome 14-35+
+// Always assume duplicates if they aren't passed to the comparison function
+support.detectDuplicates = !!hasDuplicate;
+
+// Initialize against the default document
+setDocument();
+
+// Support: Webkit<537.32 - Safari 6.0.3/Chrome 25 (fixed in Chrome 27)
+// Detached nodes confoundingly follow *each other*
+support.sortDetached = assert( function( el ) {
+
+ // Should return 1, but returns 4 (following)
+ return el.compareDocumentPosition( document.createElement( "fieldset" ) ) & 1;
+} );
+
+// Support: IE<8
+// Prevent attribute/property "interpolation"
+// https://msdn.microsoft.com/en-us/library/ms536429%28VS.85%29.aspx
+if ( !assert( function( el ) {
+ el.innerHTML = "<a href='#'></a>";
+ return el.firstChild.getAttribute( "href" ) === "#";
+} ) ) {
+ addHandle( "type|href|height|width", function( elem, name, isXML ) {
+ if ( !isXML ) {
+ return elem.getAttribute( name, name.toLowerCase() === "type" ? 1 : 2 );
+ }
+ } );
+}
+
+// Support: IE<9
+// Use defaultValue in place of getAttribute("value")
+if ( !support.attributes || !assert( function( el ) {
+ el.innerHTML = "<input/>";
+ el.firstChild.setAttribute( "value", "" );
+ return el.firstChild.getAttribute( "value" ) === "";
+} ) ) {
+ addHandle( "value", function( elem, _name, isXML ) {
+ if ( !isXML && elem.nodeName.toLowerCase() === "input" ) {
+ return elem.defaultValue;
+ }
+ } );
+}
+
+// Support: IE<9
+// Use getAttributeNode to fetch booleans when getAttribute lies
+if ( !assert( function( el ) {
+ return el.getAttribute( "disabled" ) == null;
+} ) ) {
+ addHandle( booleans, function( elem, name, isXML ) {
+ var val;
+ if ( !isXML ) {
+ return elem[ name ] === true ? name.toLowerCase() :
+ ( val = elem.getAttributeNode( name ) ) && val.specified ? 
+ val.value : + null; + } + } ); +} + +return Sizzle; + +} )( window ); + + + +jQuery.find = Sizzle; +jQuery.expr = Sizzle.selectors; + +// Deprecated +jQuery.expr[ ":" ] = jQuery.expr.pseudos; +jQuery.uniqueSort = jQuery.unique = Sizzle.uniqueSort; +jQuery.text = Sizzle.getText; +jQuery.isXMLDoc = Sizzle.isXML; +jQuery.contains = Sizzle.contains; +jQuery.escapeSelector = Sizzle.escape; + + + + +var dir = function( elem, dir, until ) { + var matched = [], + truncate = until !== undefined; + + while ( ( elem = elem[ dir ] ) && elem.nodeType !== 9 ) { + if ( elem.nodeType === 1 ) { + if ( truncate && jQuery( elem ).is( until ) ) { + break; + } + matched.push( elem ); + } + } + return matched; +}; + + +var siblings = function( n, elem ) { + var matched = []; + + for ( ; n; n = n.nextSibling ) { + if ( n.nodeType === 1 && n !== elem ) { + matched.push( n ); + } + } + + return matched; +}; + + +var rneedsContext = jQuery.expr.match.needsContext; + + + +function nodeName( elem, name ) { + + return elem.nodeName && elem.nodeName.toLowerCase() === name.toLowerCase(); + +} +var rsingleTag = ( /^<([a-z][^\/\0>:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i ); + + + +// Implement the identical functionality for filter and not +function winnow( elements, qualifier, not ) { + if ( isFunction( qualifier ) ) { + return jQuery.grep( elements, function( elem, i ) { + return !!qualifier.call( elem, i, elem ) !== not; + } ); + } + + // Single element + if ( qualifier.nodeType ) { + return jQuery.grep( elements, function( elem ) { + return ( elem === qualifier ) !== not; + } ); + } + + // Arraylike of elements (jQuery, arguments, Array) + if ( typeof qualifier !== "string" ) { + return jQuery.grep( elements, function( elem ) { + return ( indexOf.call( qualifier, elem ) > -1 ) !== not; + } ); + } + + // Filtered directly for both simple and complex selectors + return jQuery.filter( qualifier, elements, not ); +} + +jQuery.filter = function( expr, elems, not ) { + var elem = elems[ 0 ]; + + if ( not ) { + expr = ":not(" + expr + ")"; + } + + if ( elems.length === 1 && elem.nodeType === 1 ) { + return jQuery.find.matchesSelector( elem, expr ) ? [ elem ] : []; + } + + return jQuery.find.matches( expr, jQuery.grep( elems, function( elem ) { + return elem.nodeType === 1; + } ) ); +}; + +jQuery.fn.extend( { + find: function( selector ) { + var i, ret, + len = this.length, + self = this; + + if ( typeof selector !== "string" ) { + return this.pushStack( jQuery( selector ).filter( function() { + for ( i = 0; i < len; i++ ) { + if ( jQuery.contains( self[ i ], this ) ) { + return true; + } + } + } ) ); + } + + ret = this.pushStack( [] ); + + for ( i = 0; i < len; i++ ) { + jQuery.find( selector, self[ i ], ret ); + } + + return len > 1 ? jQuery.uniqueSort( ret ) : ret; + }, + filter: function( selector ) { + return this.pushStack( winnow( this, selector || [], false ) ); + }, + not: function( selector ) { + return this.pushStack( winnow( this, selector || [], true ) ); + }, + is: function( selector ) { + return !!winnow( + this, + + // If this is a positional/relative selector, check membership in the returned set + // so $("p:first").is("p:last") won't return true for a doc with two "p". + typeof selector === "string" && rneedsContext.test( selector ) ? 
+ jQuery( selector ) :
+ selector || [],
+ false
+ ).length;
+ }
+} );
+
+
+// Initialize a jQuery object
+
+
+// A central reference to the root jQuery(document)
+var rootjQuery,
+
+ // A simple way to check for HTML strings
+ // Prioritize #id over <tag> to avoid XSS via location.hash (#9521)
+ // Strict HTML recognition (#11290: must start with <)
+ // Shortcut simple #id case for speed
+ rquickExpr = /^(?:\s*(<[\w\W]+>)[^>]*|#([\w-]+))$/,
+
+ init = jQuery.fn.init = function( selector, context, root ) {
+ var match, elem;
+
+ // HANDLE: $(""), $(null), $(undefined), $(false)
+ if ( !selector ) {
+ return this;
+ }
+
+ // Method init() accepts an alternate rootjQuery
+ // so migrate can support jQuery.sub (gh-2101)
+ root = root || rootjQuery;
+
+ // Handle HTML strings
+ if ( typeof selector === "string" ) {
+ if ( selector[ 0 ] === "<" &&
+ selector[ selector.length - 1 ] === ">" &&
+ selector.length >= 3 ) {
+
+ // Assume that strings that start and end with <> are HTML and skip the regex check
+ match = [ null, selector, null ];
+
+ } else {
+ match = rquickExpr.exec( selector );
+ }
+
+ // Match html or make sure no context is specified for #id
+ if ( match && ( match[ 1 ] || !context ) ) {
+
+ // HANDLE: $(html) -> $(array)
+ if ( match[ 1 ] ) {
+ context = context instanceof jQuery ? context[ 0 ] : context;
+
+ // Option to run scripts is true for back-compat
+ // Intentionally let the error be thrown if parseHTML is not present
+ jQuery.merge( this, jQuery.parseHTML(
+ match[ 1 ],
+ context && context.nodeType ? context.ownerDocument || context : document,
+ true
+ ) );
+
+ // HANDLE: $(html, props)
+ if ( rsingleTag.test( match[ 1 ] ) && jQuery.isPlainObject( context ) ) {
+ for ( match in context ) {
+
+ // Properties of context are called as methods if possible
+ if ( isFunction( this[ match ] ) ) {
+ this[ match ]( context[ match ] );
+
+ // ...and otherwise set as attributes
+ } else {
+ this.attr( match, context[ match ] );
+ }
+ }
+ }
+
+ return this;
+
+ // HANDLE: $(#id)
+ } else {
+ elem = document.getElementById( match[ 2 ] );
+
+ if ( elem ) {
+
+ // Inject the element directly into the jQuery object
+ this[ 0 ] = elem;
+ this.length = 1;
+ }
+ return this;
+ }
+
+ // HANDLE: $(expr, $(...))
+ } else if ( !context || context.jquery ) {
+ return ( context || root ).find( selector );
+
+ // HANDLE: $(expr, context)
+ // (which is just equivalent to: $(context).find(expr)
+ } else {
+ return this.constructor( context ).find( selector );
+ }
+
+ // HANDLE: $(DOMElement)
+ } else if ( selector.nodeType ) {
+ this[ 0 ] = selector;
+ this.length = 1;
+ return this;
+
+ // HANDLE: $(function)
+ // Shortcut for document ready
+ } else if ( isFunction( selector ) ) {
+ return root.ready !== undefined ? 
+ root.ready( selector ) : + + // Execute immediately if ready is not present + selector( jQuery ); + } + + return jQuery.makeArray( selector, this ); + }; + +// Give the init function the jQuery prototype for later instantiation +init.prototype = jQuery.fn; + +// Initialize central reference +rootjQuery = jQuery( document ); + + +var rparentsprev = /^(?:parents|prev(?:Until|All))/, + + // Methods guaranteed to produce a unique set when starting from a unique set + guaranteedUnique = { + children: true, + contents: true, + next: true, + prev: true + }; + +jQuery.fn.extend( { + has: function( target ) { + var targets = jQuery( target, this ), + l = targets.length; + + return this.filter( function() { + var i = 0; + for ( ; i < l; i++ ) { + if ( jQuery.contains( this, targets[ i ] ) ) { + return true; + } + } + } ); + }, + + closest: function( selectors, context ) { + var cur, + i = 0, + l = this.length, + matched = [], + targets = typeof selectors !== "string" && jQuery( selectors ); + + // Positional selectors never match, since there's no _selection_ context + if ( !rneedsContext.test( selectors ) ) { + for ( ; i < l; i++ ) { + for ( cur = this[ i ]; cur && cur !== context; cur = cur.parentNode ) { + + // Always skip document fragments + if ( cur.nodeType < 11 && ( targets ? + targets.index( cur ) > -1 : + + // Don't pass non-elements to Sizzle + cur.nodeType === 1 && + jQuery.find.matchesSelector( cur, selectors ) ) ) { + + matched.push( cur ); + break; + } + } + } + } + + return this.pushStack( matched.length > 1 ? jQuery.uniqueSort( matched ) : matched ); + }, + + // Determine the position of an element within the set + index: function( elem ) { + + // No argument, return index in parent + if ( !elem ) { + return ( this[ 0 ] && this[ 0 ].parentNode ) ? this.first().prevAll().length : -1; + } + + // Index in selector + if ( typeof elem === "string" ) { + return indexOf.call( jQuery( elem ), this[ 0 ] ); + } + + // Locate the position of the desired element + return indexOf.call( this, + + // If it receives a jQuery object, the first element is used + elem.jquery ? elem[ 0 ] : elem + ); + }, + + add: function( selector, context ) { + return this.pushStack( + jQuery.uniqueSort( + jQuery.merge( this.get(), jQuery( selector, context ) ) + ) + ); + }, + + addBack: function( selector ) { + return this.add( selector == null ? + this.prevObject : this.prevObject.filter( selector ) + ); + } +} ); + +function sibling( cur, dir ) { + while ( ( cur = cur[ dir ] ) && cur.nodeType !== 1 ) {} + return cur; +} + +jQuery.each( { + parent: function( elem ) { + var parent = elem.parentNode; + return parent && parent.nodeType !== 11 ? 
parent : null; + }, + parents: function( elem ) { + return dir( elem, "parentNode" ); + }, + parentsUntil: function( elem, _i, until ) { + return dir( elem, "parentNode", until ); + }, + next: function( elem ) { + return sibling( elem, "nextSibling" ); + }, + prev: function( elem ) { + return sibling( elem, "previousSibling" ); + }, + nextAll: function( elem ) { + return dir( elem, "nextSibling" ); + }, + prevAll: function( elem ) { + return dir( elem, "previousSibling" ); + }, + nextUntil: function( elem, _i, until ) { + return dir( elem, "nextSibling", until ); + }, + prevUntil: function( elem, _i, until ) { + return dir( elem, "previousSibling", until ); + }, + siblings: function( elem ) { + return siblings( ( elem.parentNode || {} ).firstChild, elem ); + }, + children: function( elem ) { + return siblings( elem.firstChild ); + }, + contents: function( elem ) { + if ( elem.contentDocument != null && + + // Support: IE 11+ + // elements with no `data` attribute has an object + // `contentDocument` with a `null` prototype. + getProto( elem.contentDocument ) ) { + + return elem.contentDocument; + } + + // Support: IE 9 - 11 only, iOS 7 only, Android Browser <=4.3 only + // Treat the template element as a regular one in browsers that + // don't support it. + if ( nodeName( elem, "template" ) ) { + elem = elem.content || elem; + } + + return jQuery.merge( [], elem.childNodes ); + } +}, function( name, fn ) { + jQuery.fn[ name ] = function( until, selector ) { + var matched = jQuery.map( this, fn, until ); + + if ( name.slice( -5 ) !== "Until" ) { + selector = until; + } + + if ( selector && typeof selector === "string" ) { + matched = jQuery.filter( selector, matched ); + } + + if ( this.length > 1 ) { + + // Remove duplicates + if ( !guaranteedUnique[ name ] ) { + jQuery.uniqueSort( matched ); + } + + // Reverse order for parents* and prev-derivatives + if ( rparentsprev.test( name ) ) { + matched.reverse(); + } + } + + return this.pushStack( matched ); + }; +} ); +var rnothtmlwhite = ( /[^\x20\t\r\n\f]+/g ); + + + +// Convert String-formatted options into Object-formatted ones +function createOptions( options ) { + var object = {}; + jQuery.each( options.match( rnothtmlwhite ) || [], function( _, flag ) { + object[ flag ] = true; + } ); + return object; +} + +/* + * Create a callback list using the following parameters: + * + * options: an optional list of space-separated options that will change how + * the callback list behaves or a more traditional option object + * + * By default a callback list will act like an event callback list and can be + * "fired" multiple times. + * + * Possible options: + * + * once: will ensure the callback list can only be fired once (like a Deferred) + * + * memory: will keep track of previous values and will call any callback added + * after the list has been fired right away with the latest "memorized" + * values (like a Deferred) + * + * unique: will ensure a callback can only be added once (no duplicate in the list) + * + * stopOnFalse: interrupt callings when a callback returns false + * + */ +jQuery.Callbacks = function( options ) { + + // Convert options from String-formatted to Object-formatted if needed + // (we check in cache first) + options = typeof options === "string" ? 
+ createOptions( options ) : + jQuery.extend( {}, options ); + + var // Flag to know if list is currently firing + firing, + + // Last fire value for non-forgettable lists + memory, + + // Flag to know if list was already fired + fired, + + // Flag to prevent firing + locked, + + // Actual callback list + list = [], + + // Queue of execution data for repeatable lists + queue = [], + + // Index of currently firing callback (modified by add/remove as needed) + firingIndex = -1, + + // Fire callbacks + fire = function() { + + // Enforce single-firing + locked = locked || options.once; + + // Execute callbacks for all pending executions, + // respecting firingIndex overrides and runtime changes + fired = firing = true; + for ( ; queue.length; firingIndex = -1 ) { + memory = queue.shift(); + while ( ++firingIndex < list.length ) { + + // Run callback and check for early termination + if ( list[ firingIndex ].apply( memory[ 0 ], memory[ 1 ] ) === false && + options.stopOnFalse ) { + + // Jump to end and forget the data so .add doesn't re-fire + firingIndex = list.length; + memory = false; + } + } + } + + // Forget the data if we're done with it + if ( !options.memory ) { + memory = false; + } + + firing = false; + + // Clean up if we're done firing for good + if ( locked ) { + + // Keep an empty list if we have data for future add calls + if ( memory ) { + list = []; + + // Otherwise, this object is spent + } else { + list = ""; + } + } + }, + + // Actual Callbacks object + self = { + + // Add a callback or a collection of callbacks to the list + add: function() { + if ( list ) { + + // If we have memory from a past run, we should fire after adding + if ( memory && !firing ) { + firingIndex = list.length - 1; + queue.push( memory ); + } + + ( function add( args ) { + jQuery.each( args, function( _, arg ) { + if ( isFunction( arg ) ) { + if ( !options.unique || !self.has( arg ) ) { + list.push( arg ); + } + } else if ( arg && arg.length && toType( arg ) !== "string" ) { + + // Inspect recursively + add( arg ); + } + } ); + } )( arguments ); + + if ( memory && !firing ) { + fire(); + } + } + return this; + }, + + // Remove a callback from the list + remove: function() { + jQuery.each( arguments, function( _, arg ) { + var index; + while ( ( index = jQuery.inArray( arg, list, index ) ) > -1 ) { + list.splice( index, 1 ); + + // Handle firing indexes + if ( index <= firingIndex ) { + firingIndex--; + } + } + } ); + return this; + }, + + // Check if a given callback is in the list. + // If no argument is given, return whether or not list has callbacks attached. + has: function( fn ) { + return fn ? + jQuery.inArray( fn, list ) > -1 : + list.length > 0; + }, + + // Remove all callbacks from the list + empty: function() { + if ( list ) { + list = []; + } + return this; + }, + + // Disable .fire and .add + // Abort any current/pending executions + // Clear all callbacks and values + disable: function() { + locked = queue = []; + list = memory = ""; + return this; + }, + disabled: function() { + return !list; + }, + + // Disable .fire + // Also disable .add unless we have memory (since it would have no effect) + // Abort any pending executions + lock: function() { + locked = queue = []; + if ( !memory && !firing ) { + list = memory = ""; + } + return this; + }, + locked: function() { + return !!locked; + }, + + // Call all callbacks with the given context and arguments + fireWith: function( context, args ) { + if ( !locked ) { + args = args || []; + args = [ context, args.slice ? 
args.slice() : args ]; + queue.push( args ); + if ( !firing ) { + fire(); + } + } + return this; + }, + + // Call all the callbacks with the given arguments + fire: function() { + self.fireWith( this, arguments ); + return this; + }, + + // To know if the callbacks have already been called at least once + fired: function() { + return !!fired; + } + }; + + return self; +}; + + +function Identity( v ) { + return v; +} +function Thrower( ex ) { + throw ex; +} + +function adoptValue( value, resolve, reject, noValue ) { + var method; + + try { + + // Check for promise aspect first to privilege synchronous behavior + if ( value && isFunction( ( method = value.promise ) ) ) { + method.call( value ).done( resolve ).fail( reject ); + + // Other thenables + } else if ( value && isFunction( ( method = value.then ) ) ) { + method.call( value, resolve, reject ); + + // Other non-thenables + } else { + + // Control `resolve` arguments by letting Array#slice cast boolean `noValue` to integer: + // * false: [ value ].slice( 0 ) => resolve( value ) + // * true: [ value ].slice( 1 ) => resolve() + resolve.apply( undefined, [ value ].slice( noValue ) ); + } + + // For Promises/A+, convert exceptions into rejections + // Since jQuery.when doesn't unwrap thenables, we can skip the extra checks appearing in + // Deferred#then to conditionally suppress rejection. + } catch ( value ) { + + // Support: Android 4.0 only + // Strict mode functions invoked without .call/.apply get global-object context + reject.apply( undefined, [ value ] ); + } +} + +jQuery.extend( { + + Deferred: function( func ) { + var tuples = [ + + // action, add listener, callbacks, + // ... .then handlers, argument index, [final state] + [ "notify", "progress", jQuery.Callbacks( "memory" ), + jQuery.Callbacks( "memory" ), 2 ], + [ "resolve", "done", jQuery.Callbacks( "once memory" ), + jQuery.Callbacks( "once memory" ), 0, "resolved" ], + [ "reject", "fail", jQuery.Callbacks( "once memory" ), + jQuery.Callbacks( "once memory" ), 1, "rejected" ] + ], + state = "pending", + promise = { + state: function() { + return state; + }, + always: function() { + deferred.done( arguments ).fail( arguments ); + return this; + }, + "catch": function( fn ) { + return promise.then( null, fn ); + }, + + // Keep pipe for back-compat + pipe: function( /* fnDone, fnFail, fnProgress */ ) { + var fns = arguments; + + return jQuery.Deferred( function( newDefer ) { + jQuery.each( tuples, function( _i, tuple ) { + + // Map tuples (progress, done, fail) to arguments (done, fail, progress) + var fn = isFunction( fns[ tuple[ 4 ] ] ) && fns[ tuple[ 4 ] ]; + + // deferred.progress(function() { bind to newDefer or newDefer.notify }) + // deferred.done(function() { bind to newDefer or newDefer.resolve }) + // deferred.fail(function() { bind to newDefer or newDefer.reject }) + deferred[ tuple[ 1 ] ]( function() { + var returned = fn && fn.apply( this, arguments ); + if ( returned && isFunction( returned.promise ) ) { + returned.promise() + .progress( newDefer.notify ) + .done( newDefer.resolve ) + .fail( newDefer.reject ); + } else { + newDefer[ tuple[ 0 ] + "With" ]( + this, + fn ? 
[ returned ] : arguments + ); + } + } ); + } ); + fns = null; + } ).promise(); + }, + then: function( onFulfilled, onRejected, onProgress ) { + var maxDepth = 0; + function resolve( depth, deferred, handler, special ) { + return function() { + var that = this, + args = arguments, + mightThrow = function() { + var returned, then; + + // Support: Promises/A+ section 2.3.3.3.3 + // https://promisesaplus.com/#point-59 + // Ignore double-resolution attempts + if ( depth < maxDepth ) { + return; + } + + returned = handler.apply( that, args ); + + // Support: Promises/A+ section 2.3.1 + // https://promisesaplus.com/#point-48 + if ( returned === deferred.promise() ) { + throw new TypeError( "Thenable self-resolution" ); + } + + // Support: Promises/A+ sections 2.3.3.1, 3.5 + // https://promisesaplus.com/#point-54 + // https://promisesaplus.com/#point-75 + // Retrieve `then` only once + then = returned && + + // Support: Promises/A+ section 2.3.4 + // https://promisesaplus.com/#point-64 + // Only check objects and functions for thenability + ( typeof returned === "object" || + typeof returned === "function" ) && + returned.then; + + // Handle a returned thenable + if ( isFunction( then ) ) { + + // Special processors (notify) just wait for resolution + if ( special ) { + then.call( + returned, + resolve( maxDepth, deferred, Identity, special ), + resolve( maxDepth, deferred, Thrower, special ) + ); + + // Normal processors (resolve) also hook into progress + } else { + + // ...and disregard older resolution values + maxDepth++; + + then.call( + returned, + resolve( maxDepth, deferred, Identity, special ), + resolve( maxDepth, deferred, Thrower, special ), + resolve( maxDepth, deferred, Identity, + deferred.notifyWith ) + ); + } + + // Handle all other returned values + } else { + + // Only substitute handlers pass on context + // and multiple values (non-spec behavior) + if ( handler !== Identity ) { + that = undefined; + args = [ returned ]; + } + + // Process the value(s) + // Default process is resolve + ( special || deferred.resolveWith )( that, args ); + } + }, + + // Only normal processors (resolve) catch and reject exceptions + process = special ? + mightThrow : + function() { + try { + mightThrow(); + } catch ( e ) { + + if ( jQuery.Deferred.exceptionHook ) { + jQuery.Deferred.exceptionHook( e, + process.stackTrace ); + } + + // Support: Promises/A+ section 2.3.3.3.4.1 + // https://promisesaplus.com/#point-61 + // Ignore post-resolution exceptions + if ( depth + 1 >= maxDepth ) { + + // Only substitute handlers pass on context + // and multiple values (non-spec behavior) + if ( handler !== Thrower ) { + that = undefined; + args = [ e ]; + } + + deferred.rejectWith( that, args ); + } + } + }; + + // Support: Promises/A+ section 2.3.3.3.1 + // https://promisesaplus.com/#point-57 + // Re-resolve promises immediately to dodge false rejection from + // subsequent errors + if ( depth ) { + process(); + } else { + + // Call an optional hook to record the stack, in case of exception + // since it's otherwise lost when execution goes async + if ( jQuery.Deferred.getStackHook ) { + process.stackTrace = jQuery.Deferred.getStackHook(); + } + window.setTimeout( process ); + } + }; + } + + return jQuery.Deferred( function( newDefer ) { + + // progress_handlers.add( ... ) + tuples[ 0 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onProgress ) ? + onProgress : + Identity, + newDefer.notifyWith + ) + ); + + // fulfilled_handlers.add( ... 
) + tuples[ 1 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onFulfilled ) ? + onFulfilled : + Identity + ) + ); + + // rejected_handlers.add( ... ) + tuples[ 2 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onRejected ) ? + onRejected : + Thrower + ) + ); + } ).promise(); + }, + + // Get a promise for this deferred + // If obj is provided, the promise aspect is added to the object + promise: function( obj ) { + return obj != null ? jQuery.extend( obj, promise ) : promise; + } + }, + deferred = {}; + + // Add list-specific methods + jQuery.each( tuples, function( i, tuple ) { + var list = tuple[ 2 ], + stateString = tuple[ 5 ]; + + // promise.progress = list.add + // promise.done = list.add + // promise.fail = list.add + promise[ tuple[ 1 ] ] = list.add; + + // Handle state + if ( stateString ) { + list.add( + function() { + + // state = "resolved" (i.e., fulfilled) + // state = "rejected" + state = stateString; + }, + + // rejected_callbacks.disable + // fulfilled_callbacks.disable + tuples[ 3 - i ][ 2 ].disable, + + // rejected_handlers.disable + // fulfilled_handlers.disable + tuples[ 3 - i ][ 3 ].disable, + + // progress_callbacks.lock + tuples[ 0 ][ 2 ].lock, + + // progress_handlers.lock + tuples[ 0 ][ 3 ].lock + ); + } + + // progress_handlers.fire + // fulfilled_handlers.fire + // rejected_handlers.fire + list.add( tuple[ 3 ].fire ); + + // deferred.notify = function() { deferred.notifyWith(...) } + // deferred.resolve = function() { deferred.resolveWith(...) } + // deferred.reject = function() { deferred.rejectWith(...) } + deferred[ tuple[ 0 ] ] = function() { + deferred[ tuple[ 0 ] + "With" ]( this === deferred ? undefined : this, arguments ); + return this; + }; + + // deferred.notifyWith = list.fireWith + // deferred.resolveWith = list.fireWith + // deferred.rejectWith = list.fireWith + deferred[ tuple[ 0 ] + "With" ] = list.fireWith; + } ); + + // Make the deferred a promise + promise.promise( deferred ); + + // Call given func if any + if ( func ) { + func.call( deferred, deferred ); + } + + // All done! + return deferred; + }, + + // Deferred helper + when: function( singleValue ) { + var + + // count of uncompleted subordinates + remaining = arguments.length, + + // count of unprocessed arguments + i = remaining, + + // subordinate fulfillment data + resolveContexts = Array( i ), + resolveValues = slice.call( arguments ), + + // the primary Deferred + primary = jQuery.Deferred(), + + // subordinate callback factory + updateFunc = function( i ) { + return function( value ) { + resolveContexts[ i ] = this; + resolveValues[ i ] = arguments.length > 1 ? slice.call( arguments ) : value; + if ( !( --remaining ) ) { + primary.resolveWith( resolveContexts, resolveValues ); + } + }; + }; + + // Single- and empty arguments are adopted like Promise.resolve + if ( remaining <= 1 ) { + adoptValue( singleValue, primary.done( updateFunc( i ) ).resolve, primary.reject, + !remaining ); + + // Use .then() to unwrap secondary thenables (cf. gh-3000) + if ( primary.state() === "pending" || + isFunction( resolveValues[ i ] && resolveValues[ i ].then ) ) { + + return primary.then(); + } + } + + // Multiple arguments are aggregated like Promise.all array elements + while ( i-- ) { + adoptValue( resolveValues[ i ], updateFunc( i ), primary.reject ); + } + + return primary.promise(); + } +} ); + + +// These usually indicate a programmer mistake during development, +// warn about them ASAP rather than swallowing them by default. 
+var rerrorNames = /^(Eval|Internal|Range|Reference|Syntax|Type|URI)Error$/; + +jQuery.Deferred.exceptionHook = function( error, stack ) { + + // Support: IE 8 - 9 only + // Console exists when dev tools are open, which can happen at any time + if ( window.console && window.console.warn && error && rerrorNames.test( error.name ) ) { + window.console.warn( "jQuery.Deferred exception: " + error.message, error.stack, stack ); + } +}; + + + + +jQuery.readyException = function( error ) { + window.setTimeout( function() { + throw error; + } ); +}; + + + + +// The deferred used on DOM ready +var readyList = jQuery.Deferred(); + +jQuery.fn.ready = function( fn ) { + + readyList + .then( fn ) + + // Wrap jQuery.readyException in a function so that the lookup + // happens at the time of error handling instead of callback + // registration. + .catch( function( error ) { + jQuery.readyException( error ); + } ); + + return this; +}; + +jQuery.extend( { + + // Is the DOM ready to be used? Set to true once it occurs. + isReady: false, + + // A counter to track how many items to wait for before + // the ready event fires. See #6781 + readyWait: 1, + + // Handle when the DOM is ready + ready: function( wait ) { + + // Abort if there are pending holds or we're already ready + if ( wait === true ? --jQuery.readyWait : jQuery.isReady ) { + return; + } + + // Remember that the DOM is ready + jQuery.isReady = true; + + // If a normal DOM Ready event fired, decrement, and wait if need be + if ( wait !== true && --jQuery.readyWait > 0 ) { + return; + } + + // If there are functions bound, to execute + readyList.resolveWith( document, [ jQuery ] ); + } +} ); + +jQuery.ready.then = readyList.then; + +// The ready event handler and self cleanup method +function completed() { + document.removeEventListener( "DOMContentLoaded", completed ); + window.removeEventListener( "load", completed ); + jQuery.ready(); +} + +// Catch cases where $(document).ready() is called +// after the browser event has already occurred. +// Support: IE <=9 - 10 only +// Older IE sometimes signals "interactive" too soon +if ( document.readyState === "complete" || + ( document.readyState !== "loading" && !document.documentElement.doScroll ) ) { + + // Handle it asynchronously to allow scripts the opportunity to delay ready + window.setTimeout( jQuery.ready ); + +} else { + + // Use the handy event callback + document.addEventListener( "DOMContentLoaded", completed ); + + // A fallback to window.onload, that will always work + window.addEventListener( "load", completed ); +} + + + + +// Multifunctional method to get and set values of a collection +// The value/s can optionally be executed if it's a function +var access = function( elems, fn, key, value, chainable, emptyGet, raw ) { + var i = 0, + len = elems.length, + bulk = key == null; + + // Sets many values + if ( toType( key ) === "object" ) { + chainable = true; + for ( i in key ) { + access( elems, fn, i, key[ i ], true, emptyGet, raw ); + } + + // Sets one value + } else if ( value !== undefined ) { + chainable = true; + + if ( !isFunction( value ) ) { + raw = true; + } + + if ( bulk ) { + + // Bulk operations run against the entire set + if ( raw ) { + fn.call( elems, value ); + fn = null; + + // ...except when executing function values + } else { + bulk = fn; + fn = function( elem, _key, value ) { + return bulk.call( jQuery( elem ), value ); + }; + } + } + + if ( fn ) { + for ( ; i < len; i++ ) { + fn( + elems[ i ], key, raw ? 
+ value : + value.call( elems[ i ], i, fn( elems[ i ], key ) ) + ); + } + } + } + + if ( chainable ) { + return elems; + } + + // Gets + if ( bulk ) { + return fn.call( elems ); + } + + return len ? fn( elems[ 0 ], key ) : emptyGet; +}; + + +// Matches dashed string for camelizing +var rmsPrefix = /^-ms-/, + rdashAlpha = /-([a-z])/g; + +// Used by camelCase as callback to replace() +function fcamelCase( _all, letter ) { + return letter.toUpperCase(); +} + +// Convert dashed to camelCase; used by the css and data modules +// Support: IE <=9 - 11, Edge 12 - 15 +// Microsoft forgot to hump their vendor prefix (#9572) +function camelCase( string ) { + return string.replace( rmsPrefix, "ms-" ).replace( rdashAlpha, fcamelCase ); +} +var acceptData = function( owner ) { + + // Accepts only: + // - Node + // - Node.ELEMENT_NODE + // - Node.DOCUMENT_NODE + // - Object + // - Any + return owner.nodeType === 1 || owner.nodeType === 9 || !( +owner.nodeType ); +}; + + + + +function Data() { + this.expando = jQuery.expando + Data.uid++; +} + +Data.uid = 1; + +Data.prototype = { + + cache: function( owner ) { + + // Check if the owner object already has a cache + var value = owner[ this.expando ]; + + // If not, create one + if ( !value ) { + value = {}; + + // We can accept data for non-element nodes in modern browsers, + // but we should not, see #8335. + // Always return an empty object. + if ( acceptData( owner ) ) { + + // If it is a node unlikely to be stringify-ed or looped over + // use plain assignment + if ( owner.nodeType ) { + owner[ this.expando ] = value; + + // Otherwise secure it in a non-enumerable property + // configurable must be true to allow the property to be + // deleted when data is removed + } else { + Object.defineProperty( owner, this.expando, { + value: value, + configurable: true + } ); + } + } + } + + return value; + }, + set: function( owner, data, value ) { + var prop, + cache = this.cache( owner ); + + // Handle: [ owner, key, value ] args + // Always use camelCase key (gh-2257) + if ( typeof data === "string" ) { + cache[ camelCase( data ) ] = value; + + // Handle: [ owner, { properties } ] args + } else { + + // Copy the properties one-by-one to the cache object + for ( prop in data ) { + cache[ camelCase( prop ) ] = data[ prop ]; + } + } + return cache; + }, + get: function( owner, key ) { + return key === undefined ? + this.cache( owner ) : + + // Always use camelCase key (gh-2257) + owner[ this.expando ] && owner[ this.expando ][ camelCase( key ) ]; + }, + access: function( owner, key, value ) { + + // In cases where either: + // + // 1. No key was specified + // 2. A string key was specified, but no value provided + // + // Take the "read" path and allow the get method to determine + // which value to return, respectively either: + // + // 1. The entire cache object + // 2. The data stored at the key + // + if ( key === undefined || + ( ( key && typeof key === "string" ) && value === undefined ) ) { + + return this.get( owner, key ); + } + + // When the key is not a string, or both a key and value + // are specified, set or extend (existing objects) with either: + // + // 1. An object of properties + // 2. A key and value + // + this.set( owner, key, value ); + + // Since the "set" path can have two possible entry points + // return the expected data based on which path was taken[*] + return value !== undefined ? 
value : key; + }, + remove: function( owner, key ) { + var i, + cache = owner[ this.expando ]; + + if ( cache === undefined ) { + return; + } + + if ( key !== undefined ) { + + // Support array or space separated string of keys + if ( Array.isArray( key ) ) { + + // If key is an array of keys... + // We always set camelCase keys, so remove that. + key = key.map( camelCase ); + } else { + key = camelCase( key ); + + // If a key with the spaces exists, use it. + // Otherwise, create an array by matching non-whitespace + key = key in cache ? + [ key ] : + ( key.match( rnothtmlwhite ) || [] ); + } + + i = key.length; + + while ( i-- ) { + delete cache[ key[ i ] ]; + } + } + + // Remove the expando if there's no more data + if ( key === undefined || jQuery.isEmptyObject( cache ) ) { + + // Support: Chrome <=35 - 45 + // Webkit & Blink performance suffers when deleting properties + // from DOM nodes, so set to undefined instead + // https://bugs.chromium.org/p/chromium/issues/detail?id=378607 (bug restricted) + if ( owner.nodeType ) { + owner[ this.expando ] = undefined; + } else { + delete owner[ this.expando ]; + } + } + }, + hasData: function( owner ) { + var cache = owner[ this.expando ]; + return cache !== undefined && !jQuery.isEmptyObject( cache ); + } +}; +var dataPriv = new Data(); + +var dataUser = new Data(); + + + +// Implementation Summary +// +// 1. Enforce API surface and semantic compatibility with 1.9.x branch +// 2. Improve the module's maintainability by reducing the storage +// paths to a single mechanism. +// 3. Use the same single mechanism to support "private" and "user" data. +// 4. _Never_ expose "private" data to user code (TODO: Drop _data, _removeData) +// 5. Avoid exposing implementation details on user objects (eg. expando properties) +// 6. Provide a clear path for implementation upgrade to WeakMap in 2014 + +var rbrace = /^(?:\{[\w\W]*\}|\[[\w\W]*\])$/, + rmultiDash = /[A-Z]/g; + +function getData( data ) { + if ( data === "true" ) { + return true; + } + + if ( data === "false" ) { + return false; + } + + if ( data === "null" ) { + return null; + } + + // Only convert to a number if it doesn't change the string + if ( data === +data + "" ) { + return +data; + } + + if ( rbrace.test( data ) ) { + return JSON.parse( data ); + } + + return data; +} + +function dataAttr( elem, key, data ) { + var name; + + // If nothing was found internally, try to fetch any + // data from the HTML5 data-* attribute + if ( data === undefined && elem.nodeType === 1 ) { + name = "data-" + key.replace( rmultiDash, "-$&" ).toLowerCase(); + data = elem.getAttribute( name ); + + if ( typeof data === "string" ) { + try { + data = getData( data ); + } catch ( e ) {} + + // Make sure we set the data so it isn't changed later + dataUser.set( elem, key, data ); + } else { + data = undefined; + } + } + return data; +} + +jQuery.extend( { + hasData: function( elem ) { + return dataUser.hasData( elem ) || dataPriv.hasData( elem ); + }, + + data: function( elem, name, data ) { + return dataUser.access( elem, name, data ); + }, + + removeData: function( elem, name ) { + dataUser.remove( elem, name ); + }, + + // TODO: Now that all calls to _data and _removeData have been replaced + // with direct calls to dataPriv methods, these can be deprecated. 
+ _data: function( elem, name, data ) { + return dataPriv.access( elem, name, data ); + }, + + _removeData: function( elem, name ) { + dataPriv.remove( elem, name ); + } +} ); + +jQuery.fn.extend( { + data: function( key, value ) { + var i, name, data, + elem = this[ 0 ], + attrs = elem && elem.attributes; + + // Gets all values + if ( key === undefined ) { + if ( this.length ) { + data = dataUser.get( elem ); + + if ( elem.nodeType === 1 && !dataPriv.get( elem, "hasDataAttrs" ) ) { + i = attrs.length; + while ( i-- ) { + + // Support: IE 11 only + // The attrs elements can be null (#14894) + if ( attrs[ i ] ) { + name = attrs[ i ].name; + if ( name.indexOf( "data-" ) === 0 ) { + name = camelCase( name.slice( 5 ) ); + dataAttr( elem, name, data[ name ] ); + } + } + } + dataPriv.set( elem, "hasDataAttrs", true ); + } + } + + return data; + } + + // Sets multiple values + if ( typeof key === "object" ) { + return this.each( function() { + dataUser.set( this, key ); + } ); + } + + return access( this, function( value ) { + var data; + + // The calling jQuery object (element matches) is not empty + // (and therefore has an element appears at this[ 0 ]) and the + // `value` parameter was not undefined. An empty jQuery object + // will result in `undefined` for elem = this[ 0 ] which will + // throw an exception if an attempt to read a data cache is made. + if ( elem && value === undefined ) { + + // Attempt to get data from the cache + // The key will always be camelCased in Data + data = dataUser.get( elem, key ); + if ( data !== undefined ) { + return data; + } + + // Attempt to "discover" the data in + // HTML5 custom data-* attrs + data = dataAttr( elem, key ); + if ( data !== undefined ) { + return data; + } + + // We tried really hard, but the data doesn't exist. + return; + } + + // Set the data... 
+ this.each( function() { + + // We always store the camelCased key + dataUser.set( this, key, value ); + } ); + }, null, value, arguments.length > 1, null, true ); + }, + + removeData: function( key ) { + return this.each( function() { + dataUser.remove( this, key ); + } ); + } +} ); + + +jQuery.extend( { + queue: function( elem, type, data ) { + var queue; + + if ( elem ) { + type = ( type || "fx" ) + "queue"; + queue = dataPriv.get( elem, type ); + + // Speed up dequeue by getting out quickly if this is just a lookup + if ( data ) { + if ( !queue || Array.isArray( data ) ) { + queue = dataPriv.access( elem, type, jQuery.makeArray( data ) ); + } else { + queue.push( data ); + } + } + return queue || []; + } + }, + + dequeue: function( elem, type ) { + type = type || "fx"; + + var queue = jQuery.queue( elem, type ), + startLength = queue.length, + fn = queue.shift(), + hooks = jQuery._queueHooks( elem, type ), + next = function() { + jQuery.dequeue( elem, type ); + }; + + // If the fx queue is dequeued, always remove the progress sentinel + if ( fn === "inprogress" ) { + fn = queue.shift(); + startLength--; + } + + if ( fn ) { + + // Add a progress sentinel to prevent the fx queue from being + // automatically dequeued + if ( type === "fx" ) { + queue.unshift( "inprogress" ); + } + + // Clear up the last queue stop function + delete hooks.stop; + fn.call( elem, next, hooks ); + } + + if ( !startLength && hooks ) { + hooks.empty.fire(); + } + }, + + // Not public - generate a queueHooks object, or return the current one + _queueHooks: function( elem, type ) { + var key = type + "queueHooks"; + return dataPriv.get( elem, key ) || dataPriv.access( elem, key, { + empty: jQuery.Callbacks( "once memory" ).add( function() { + dataPriv.remove( elem, [ type + "queue", key ] ); + } ) + } ); + } +} ); + +jQuery.fn.extend( { + queue: function( type, data ) { + var setter = 2; + + if ( typeof type !== "string" ) { + data = type; + type = "fx"; + setter--; + } + + if ( arguments.length < setter ) { + return jQuery.queue( this[ 0 ], type ); + } + + return data === undefined ? 
+ this : + this.each( function() { + var queue = jQuery.queue( this, type, data ); + + // Ensure a hooks for this queue + jQuery._queueHooks( this, type ); + + if ( type === "fx" && queue[ 0 ] !== "inprogress" ) { + jQuery.dequeue( this, type ); + } + } ); + }, + dequeue: function( type ) { + return this.each( function() { + jQuery.dequeue( this, type ); + } ); + }, + clearQueue: function( type ) { + return this.queue( type || "fx", [] ); + }, + + // Get a promise resolved when queues of a certain type + // are emptied (fx is the type by default) + promise: function( type, obj ) { + var tmp, + count = 1, + defer = jQuery.Deferred(), + elements = this, + i = this.length, + resolve = function() { + if ( !( --count ) ) { + defer.resolveWith( elements, [ elements ] ); + } + }; + + if ( typeof type !== "string" ) { + obj = type; + type = undefined; + } + type = type || "fx"; + + while ( i-- ) { + tmp = dataPriv.get( elements[ i ], type + "queueHooks" ); + if ( tmp && tmp.empty ) { + count++; + tmp.empty.add( resolve ); + } + } + resolve(); + return defer.promise( obj ); + } +} ); +var pnum = ( /[+-]?(?:\d*\.|)\d+(?:[eE][+-]?\d+|)/ ).source; + +var rcssNum = new RegExp( "^(?:([+-])=|)(" + pnum + ")([a-z%]*)$", "i" ); + + +var cssExpand = [ "Top", "Right", "Bottom", "Left" ]; + +var documentElement = document.documentElement; + + + + var isAttached = function( elem ) { + return jQuery.contains( elem.ownerDocument, elem ); + }, + composed = { composed: true }; + + // Support: IE 9 - 11+, Edge 12 - 18+, iOS 10.0 - 10.2 only + // Check attachment across shadow DOM boundaries when possible (gh-3504) + // Support: iOS 10.0-10.2 only + // Early iOS 10 versions support `attachShadow` but not `getRootNode`, + // leading to errors. We need to check for `getRootNode`. + if ( documentElement.getRootNode ) { + isAttached = function( elem ) { + return jQuery.contains( elem.ownerDocument, elem ) || + elem.getRootNode( composed ) === elem.ownerDocument; + }; + } +var isHiddenWithinTree = function( elem, el ) { + + // isHiddenWithinTree might be called from jQuery#filter function; + // in that case, element will be second argument + elem = el || elem; + + // Inline style trumps all + return elem.style.display === "none" || + elem.style.display === "" && + + // Otherwise, check computed style + // Support: Firefox <=43 - 45 + // Disconnected elements can have computed display: none, so first confirm that elem is + // in the document. + isAttached( elem ) && + + jQuery.css( elem, "display" ) === "none"; + }; + + + +function adjustCSS( elem, prop, valueParts, tween ) { + var adjusted, scale, + maxIterations = 20, + currentValue = tween ? + function() { + return tween.cur(); + } : + function() { + return jQuery.css( elem, prop, "" ); + }, + initial = currentValue(), + unit = valueParts && valueParts[ 3 ] || ( jQuery.cssNumber[ prop ] ? 
"" : "px" ), + + // Starting value computation is required for potential unit mismatches + initialInUnit = elem.nodeType && + ( jQuery.cssNumber[ prop ] || unit !== "px" && +initial ) && + rcssNum.exec( jQuery.css( elem, prop ) ); + + if ( initialInUnit && initialInUnit[ 3 ] !== unit ) { + + // Support: Firefox <=54 + // Halve the iteration target value to prevent interference from CSS upper bounds (gh-2144) + initial = initial / 2; + + // Trust units reported by jQuery.css + unit = unit || initialInUnit[ 3 ]; + + // Iteratively approximate from a nonzero starting point + initialInUnit = +initial || 1; + + while ( maxIterations-- ) { + + // Evaluate and update our best guess (doubling guesses that zero out). + // Finish if the scale equals or crosses 1 (making the old*new product non-positive). + jQuery.style( elem, prop, initialInUnit + unit ); + if ( ( 1 - scale ) * ( 1 - ( scale = currentValue() / initial || 0.5 ) ) <= 0 ) { + maxIterations = 0; + } + initialInUnit = initialInUnit / scale; + + } + + initialInUnit = initialInUnit * 2; + jQuery.style( elem, prop, initialInUnit + unit ); + + // Make sure we update the tween properties later on + valueParts = valueParts || []; + } + + if ( valueParts ) { + initialInUnit = +initialInUnit || +initial || 0; + + // Apply relative offset (+=/-=) if specified + adjusted = valueParts[ 1 ] ? + initialInUnit + ( valueParts[ 1 ] + 1 ) * valueParts[ 2 ] : + +valueParts[ 2 ]; + if ( tween ) { + tween.unit = unit; + tween.start = initialInUnit; + tween.end = adjusted; + } + } + return adjusted; +} + + +var defaultDisplayMap = {}; + +function getDefaultDisplay( elem ) { + var temp, + doc = elem.ownerDocument, + nodeName = elem.nodeName, + display = defaultDisplayMap[ nodeName ]; + + if ( display ) { + return display; + } + + temp = doc.body.appendChild( doc.createElement( nodeName ) ); + display = jQuery.css( temp, "display" ); + + temp.parentNode.removeChild( temp ); + + if ( display === "none" ) { + display = "block"; + } + defaultDisplayMap[ nodeName ] = display; + + return display; +} + +function showHide( elements, show ) { + var display, elem, + values = [], + index = 0, + length = elements.length; + + // Determine new display value for elements that need to change + for ( ; index < length; index++ ) { + elem = elements[ index ]; + if ( !elem.style ) { + continue; + } + + display = elem.style.display; + if ( show ) { + + // Since we force visibility upon cascade-hidden elements, an immediate (and slow) + // check is required in this first loop unless we have a nonempty display value (either + // inline or about-to-be-restored) + if ( display === "none" ) { + values[ index ] = dataPriv.get( elem, "display" ) || null; + if ( !values[ index ] ) { + elem.style.display = ""; + } + } + if ( elem.style.display === "" && isHiddenWithinTree( elem ) ) { + values[ index ] = getDefaultDisplay( elem ); + } + } else { + if ( display !== "none" ) { + values[ index ] = "none"; + + // Remember what we're overwriting + dataPriv.set( elem, "display", display ); + } + } + } + + // Set the display of the elements in a second loop to avoid constant reflow + for ( index = 0; index < length; index++ ) { + if ( values[ index ] != null ) { + elements[ index ].style.display = values[ index ]; + } + } + + return elements; +} + +jQuery.fn.extend( { + show: function() { + return showHide( this, true ); + }, + hide: function() { + return showHide( this ); + }, + toggle: function( state ) { + if ( typeof state === "boolean" ) { + return state ? 
this.show() : this.hide(); + } + + return this.each( function() { + if ( isHiddenWithinTree( this ) ) { + jQuery( this ).show(); + } else { + jQuery( this ).hide(); + } + } ); + } +} ); +var rcheckableType = ( /^(?:checkbox|radio)$/i ); + +var rtagName = ( /<([a-z][^\/\0>\x20\t\r\n\f]*)/i ); + +var rscriptType = ( /^$|^module$|\/(?:java|ecma)script/i ); + + + +( function() { + var fragment = document.createDocumentFragment(), + div = fragment.appendChild( document.createElement( "div" ) ), + input = document.createElement( "input" ); + + // Support: Android 4.0 - 4.3 only + // Check state lost if the name is set (#11217) + // Support: Windows Web Apps (WWA) + // `name` and `type` must use .setAttribute for WWA (#14901) + input.setAttribute( "type", "radio" ); + input.setAttribute( "checked", "checked" ); + input.setAttribute( "name", "t" ); + + div.appendChild( input ); + + // Support: Android <=4.1 only + // Older WebKit doesn't clone checked state correctly in fragments + support.checkClone = div.cloneNode( true ).cloneNode( true ).lastChild.checked; + + // Support: IE <=11 only + // Make sure textarea (and checkbox) defaultValue is properly cloned + div.innerHTML = ""; + support.noCloneChecked = !!div.cloneNode( true ).lastChild.defaultValue; + + // Support: IE <=9 only + // IE <=9 replaces "; + support.option = !!div.lastChild; +} )(); + + +// We have to close these tags to support XHTML (#13200) +var wrapMap = { + + // XHTML parsers do not magically insert elements in the + // same way that tag soup parsers do. So we cannot shorten + // this by omitting or other required elements. + thead: [ 1, "", "
" ], + col: [ 2, "", "
" ], + tr: [ 2, "", "
" ], + td: [ 3, "", "
" ], + + _default: [ 0, "", "" ] +}; + +wrapMap.tbody = wrapMap.tfoot = wrapMap.colgroup = wrapMap.caption = wrapMap.thead; +wrapMap.th = wrapMap.td; + +// Support: IE <=9 only +if ( !support.option ) { + wrapMap.optgroup = wrapMap.option = [ 1, "" ]; +} + + +function getAll( context, tag ) { + + // Support: IE <=9 - 11 only + // Use typeof to avoid zero-argument method invocation on host objects (#15151) + var ret; + + if ( typeof context.getElementsByTagName !== "undefined" ) { + ret = context.getElementsByTagName( tag || "*" ); + + } else if ( typeof context.querySelectorAll !== "undefined" ) { + ret = context.querySelectorAll( tag || "*" ); + + } else { + ret = []; + } + + if ( tag === undefined || tag && nodeName( context, tag ) ) { + return jQuery.merge( [ context ], ret ); + } + + return ret; +} + + +// Mark scripts as having already been evaluated +function setGlobalEval( elems, refElements ) { + var i = 0, + l = elems.length; + + for ( ; i < l; i++ ) { + dataPriv.set( + elems[ i ], + "globalEval", + !refElements || dataPriv.get( refElements[ i ], "globalEval" ) + ); + } +} + + +var rhtml = /<|&#?\w+;/; + +function buildFragment( elems, context, scripts, selection, ignored ) { + var elem, tmp, tag, wrap, attached, j, + fragment = context.createDocumentFragment(), + nodes = [], + i = 0, + l = elems.length; + + for ( ; i < l; i++ ) { + elem = elems[ i ]; + + if ( elem || elem === 0 ) { + + // Add nodes directly + if ( toType( elem ) === "object" ) { + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( nodes, elem.nodeType ? [ elem ] : elem ); + + // Convert non-html into a text node + } else if ( !rhtml.test( elem ) ) { + nodes.push( context.createTextNode( elem ) ); + + // Convert html into DOM nodes + } else { + tmp = tmp || fragment.appendChild( context.createElement( "div" ) ); + + // Deserialize a standard representation + tag = ( rtagName.exec( elem ) || [ "", "" ] )[ 1 ].toLowerCase(); + wrap = wrapMap[ tag ] || wrapMap._default; + tmp.innerHTML = wrap[ 1 ] + jQuery.htmlPrefilter( elem ) + wrap[ 2 ]; + + // Descend through wrappers to the right content + j = wrap[ 0 ]; + while ( j-- ) { + tmp = tmp.lastChild; + } + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( nodes, tmp.childNodes ); + + // Remember the top-level container + tmp = fragment.firstChild; + + // Ensure the created nodes are orphaned (#12392) + tmp.textContent = ""; + } + } + } + + // Remove wrapper from fragment + fragment.textContent = ""; + + i = 0; + while ( ( elem = nodes[ i++ ] ) ) { + + // Skip elements already in the context collection (trac-4087) + if ( selection && jQuery.inArray( elem, selection ) > -1 ) { + if ( ignored ) { + ignored.push( elem ); + } + continue; + } + + attached = isAttached( elem ); + + // Append to fragment + tmp = getAll( fragment.appendChild( elem ), "script" ); + + // Preserve script evaluation history + if ( attached ) { + setGlobalEval( tmp ); + } + + // Capture executables + if ( scripts ) { + j = 0; + while ( ( elem = tmp[ j++ ] ) ) { + if ( rscriptType.test( elem.type || "" ) ) { + scripts.push( elem ); + } + } + } + } + + return fragment; +} + + +var rtypenamespace = /^([^.]*)(?:\.(.+)|)/; + +function returnTrue() { + return true; +} + +function returnFalse() { + return false; +} + +// Support: IE <=9 - 11+ +// focus() and blur() are asynchronous, except when they are no-op. 
+// So expect focus to be synchronous when the element is already active, +// and blur to be synchronous when the element is not already active. +// (focus and blur are always synchronous in other supported browsers, +// this just defines when we can count on it). +function expectSync( elem, type ) { + return ( elem === safeActiveElement() ) === ( type === "focus" ); +} + +// Support: IE <=9 only +// Accessing document.activeElement can throw unexpectedly +// https://bugs.jquery.com/ticket/13393 +function safeActiveElement() { + try { + return document.activeElement; + } catch ( err ) { } +} + +function on( elem, types, selector, data, fn, one ) { + var origFn, type; + + // Types can be a map of types/handlers + if ( typeof types === "object" ) { + + // ( types-Object, selector, data ) + if ( typeof selector !== "string" ) { + + // ( types-Object, data ) + data = data || selector; + selector = undefined; + } + for ( type in types ) { + on( elem, type, selector, data, types[ type ], one ); + } + return elem; + } + + if ( data == null && fn == null ) { + + // ( types, fn ) + fn = selector; + data = selector = undefined; + } else if ( fn == null ) { + if ( typeof selector === "string" ) { + + // ( types, selector, fn ) + fn = data; + data = undefined; + } else { + + // ( types, data, fn ) + fn = data; + data = selector; + selector = undefined; + } + } + if ( fn === false ) { + fn = returnFalse; + } else if ( !fn ) { + return elem; + } + + if ( one === 1 ) { + origFn = fn; + fn = function( event ) { + + // Can use an empty set, since event contains the info + jQuery().off( event ); + return origFn.apply( this, arguments ); + }; + + // Use same guid so caller can remove using origFn + fn.guid = origFn.guid || ( origFn.guid = jQuery.guid++ ); + } + return elem.each( function() { + jQuery.event.add( this, types, fn, data, selector ); + } ); +} + +/* + * Helper functions for managing events -- not part of the public interface. + * Props to Dean Edwards' addEvent library for many of the ideas. + */ +jQuery.event = { + + global: {}, + + add: function( elem, types, handler, data, selector ) { + + var handleObjIn, eventHandle, tmp, + events, t, handleObj, + special, handlers, type, namespaces, origType, + elemData = dataPriv.get( elem ); + + // Only attach events to objects that accept data + if ( !acceptData( elem ) ) { + return; + } + + // Caller can pass in an object of custom data in lieu of the handler + if ( handler.handler ) { + handleObjIn = handler; + handler = handleObjIn.handler; + selector = handleObjIn.selector; + } + + // Ensure that invalid selectors throw exceptions at attach time + // Evaluate against documentElement in case elem is a non-element node (e.g., document) + if ( selector ) { + jQuery.find.matchesSelector( documentElement, selector ); + } + + // Make sure that the handler has a unique ID, used to find/remove it later + if ( !handler.guid ) { + handler.guid = jQuery.guid++; + } + + // Init the element's event structure and main handler, if this is the first + if ( !( events = elemData.events ) ) { + events = elemData.events = Object.create( null ); + } + if ( !( eventHandle = elemData.handle ) ) { + eventHandle = elemData.handle = function( e ) { + + // Discard the second event of a jQuery.event.trigger() and + // when an event is called after a page has unloaded + return typeof jQuery !== "undefined" && jQuery.event.triggered !== e.type ? 
+ jQuery.event.dispatch.apply( elem, arguments ) : undefined; + }; + } + + // Handle multiple events separated by a space + types = ( types || "" ).match( rnothtmlwhite ) || [ "" ]; + t = types.length; + while ( t-- ) { + tmp = rtypenamespace.exec( types[ t ] ) || []; + type = origType = tmp[ 1 ]; + namespaces = ( tmp[ 2 ] || "" ).split( "." ).sort(); + + // There *must* be a type, no attaching namespace-only handlers + if ( !type ) { + continue; + } + + // If event changes its type, use the special event handlers for the changed type + special = jQuery.event.special[ type ] || {}; + + // If selector defined, determine special event api type, otherwise given type + type = ( selector ? special.delegateType : special.bindType ) || type; + + // Update special based on newly reset type + special = jQuery.event.special[ type ] || {}; + + // handleObj is passed to all event handlers + handleObj = jQuery.extend( { + type: type, + origType: origType, + data: data, + handler: handler, + guid: handler.guid, + selector: selector, + needsContext: selector && jQuery.expr.match.needsContext.test( selector ), + namespace: namespaces.join( "." ) + }, handleObjIn ); + + // Init the event handler queue if we're the first + if ( !( handlers = events[ type ] ) ) { + handlers = events[ type ] = []; + handlers.delegateCount = 0; + + // Only use addEventListener if the special events handler returns false + if ( !special.setup || + special.setup.call( elem, data, namespaces, eventHandle ) === false ) { + + if ( elem.addEventListener ) { + elem.addEventListener( type, eventHandle ); + } + } + } + + if ( special.add ) { + special.add.call( elem, handleObj ); + + if ( !handleObj.handler.guid ) { + handleObj.handler.guid = handler.guid; + } + } + + // Add to the element's handler list, delegates in front + if ( selector ) { + handlers.splice( handlers.delegateCount++, 0, handleObj ); + } else { + handlers.push( handleObj ); + } + + // Keep track of which events have ever been used, for event optimization + jQuery.event.global[ type ] = true; + } + + }, + + // Detach an event or set of events from an element + remove: function( elem, types, handler, selector, mappedTypes ) { + + var j, origCount, tmp, + events, t, handleObj, + special, handlers, type, namespaces, origType, + elemData = dataPriv.hasData( elem ) && dataPriv.get( elem ); + + if ( !elemData || !( events = elemData.events ) ) { + return; + } + + // Once for each type.namespace in types; type may be omitted + types = ( types || "" ).match( rnothtmlwhite ) || [ "" ]; + t = types.length; + while ( t-- ) { + tmp = rtypenamespace.exec( types[ t ] ) || []; + type = origType = tmp[ 1 ]; + namespaces = ( tmp[ 2 ] || "" ).split( "." ).sort(); + + // Unbind all events (on this namespace, if provided) for the element + if ( !type ) { + for ( type in events ) { + jQuery.event.remove( elem, type + types[ t ], handler, selector, true ); + } + continue; + } + + special = jQuery.event.special[ type ] || {}; + type = ( selector ? 
special.delegateType : special.bindType ) || type; + handlers = events[ type ] || []; + tmp = tmp[ 2 ] && + new RegExp( "(^|\\.)" + namespaces.join( "\\.(?:.*\\.|)" ) + "(\\.|$)" ); + + // Remove matching events + origCount = j = handlers.length; + while ( j-- ) { + handleObj = handlers[ j ]; + + if ( ( mappedTypes || origType === handleObj.origType ) && + ( !handler || handler.guid === handleObj.guid ) && + ( !tmp || tmp.test( handleObj.namespace ) ) && + ( !selector || selector === handleObj.selector || + selector === "**" && handleObj.selector ) ) { + handlers.splice( j, 1 ); + + if ( handleObj.selector ) { + handlers.delegateCount--; + } + if ( special.remove ) { + special.remove.call( elem, handleObj ); + } + } + } + + // Remove generic event handler if we removed something and no more handlers exist + // (avoids potential for endless recursion during removal of special event handlers) + if ( origCount && !handlers.length ) { + if ( !special.teardown || + special.teardown.call( elem, namespaces, elemData.handle ) === false ) { + + jQuery.removeEvent( elem, type, elemData.handle ); + } + + delete events[ type ]; + } + } + + // Remove data and the expando if it's no longer used + if ( jQuery.isEmptyObject( events ) ) { + dataPriv.remove( elem, "handle events" ); + } + }, + + dispatch: function( nativeEvent ) { + + var i, j, ret, matched, handleObj, handlerQueue, + args = new Array( arguments.length ), + + // Make a writable jQuery.Event from the native event object + event = jQuery.event.fix( nativeEvent ), + + handlers = ( + dataPriv.get( this, "events" ) || Object.create( null ) + )[ event.type ] || [], + special = jQuery.event.special[ event.type ] || {}; + + // Use the fix-ed jQuery.Event rather than the (read-only) native event + args[ 0 ] = event; + + for ( i = 1; i < arguments.length; i++ ) { + args[ i ] = arguments[ i ]; + } + + event.delegateTarget = this; + + // Call the preDispatch hook for the mapped type, and let it bail if desired + if ( special.preDispatch && special.preDispatch.call( this, event ) === false ) { + return; + } + + // Determine handlers + handlerQueue = jQuery.event.handlers.call( this, event, handlers ); + + // Run delegates first; they may want to stop propagation beneath us + i = 0; + while ( ( matched = handlerQueue[ i++ ] ) && !event.isPropagationStopped() ) { + event.currentTarget = matched.elem; + + j = 0; + while ( ( handleObj = matched.handlers[ j++ ] ) && + !event.isImmediatePropagationStopped() ) { + + // If the event is namespaced, then each handler is only invoked if it is + // specially universal or its namespaces are a superset of the event's. 
+ if ( !event.rnamespace || handleObj.namespace === false || + event.rnamespace.test( handleObj.namespace ) ) { + + event.handleObj = handleObj; + event.data = handleObj.data; + + ret = ( ( jQuery.event.special[ handleObj.origType ] || {} ).handle || + handleObj.handler ).apply( matched.elem, args ); + + if ( ret !== undefined ) { + if ( ( event.result = ret ) === false ) { + event.preventDefault(); + event.stopPropagation(); + } + } + } + } + } + + // Call the postDispatch hook for the mapped type + if ( special.postDispatch ) { + special.postDispatch.call( this, event ); + } + + return event.result; + }, + + handlers: function( event, handlers ) { + var i, handleObj, sel, matchedHandlers, matchedSelectors, + handlerQueue = [], + delegateCount = handlers.delegateCount, + cur = event.target; + + // Find delegate handlers + if ( delegateCount && + + // Support: IE <=9 + // Black-hole SVG instance trees (trac-13180) + cur.nodeType && + + // Support: Firefox <=42 + // Suppress spec-violating clicks indicating a non-primary pointer button (trac-3861) + // https://www.w3.org/TR/DOM-Level-3-Events/#event-type-click + // Support: IE 11 only + // ...but not arrow key "clicks" of radio inputs, which can have `button` -1 (gh-2343) + !( event.type === "click" && event.button >= 1 ) ) { + + for ( ; cur !== this; cur = cur.parentNode || this ) { + + // Don't check non-elements (#13208) + // Don't process clicks on disabled elements (#6911, #8165, #11382, #11764) + if ( cur.nodeType === 1 && !( event.type === "click" && cur.disabled === true ) ) { + matchedHandlers = []; + matchedSelectors = {}; + for ( i = 0; i < delegateCount; i++ ) { + handleObj = handlers[ i ]; + + // Don't conflict with Object.prototype properties (#13203) + sel = handleObj.selector + " "; + + if ( matchedSelectors[ sel ] === undefined ) { + matchedSelectors[ sel ] = handleObj.needsContext ? + jQuery( sel, this ).index( cur ) > -1 : + jQuery.find( sel, this, null, [ cur ] ).length; + } + if ( matchedSelectors[ sel ] ) { + matchedHandlers.push( handleObj ); + } + } + if ( matchedHandlers.length ) { + handlerQueue.push( { elem: cur, handlers: matchedHandlers } ); + } + } + } + } + + // Add the remaining (directly-bound) handlers + cur = this; + if ( delegateCount < handlers.length ) { + handlerQueue.push( { elem: cur, handlers: handlers.slice( delegateCount ) } ); + } + + return handlerQueue; + }, + + addProp: function( name, hook ) { + Object.defineProperty( jQuery.Event.prototype, name, { + enumerable: true, + configurable: true, + + get: isFunction( hook ) ? + function() { + if ( this.originalEvent ) { + return hook( this.originalEvent ); + } + } : + function() { + if ( this.originalEvent ) { + return this.originalEvent[ name ]; + } + }, + + set: function( value ) { + Object.defineProperty( this, name, { + enumerable: true, + configurable: true, + writable: true, + value: value + } ); + } + } ); + }, + + fix: function( originalEvent ) { + return originalEvent[ jQuery.expando ] ? + originalEvent : + new jQuery.Event( originalEvent ); + }, + + special: { + load: { + + // Prevent triggered image.load events from bubbling to window.load + noBubble: true + }, + click: { + + // Utilize native event to ensure correct state for checkable inputs + setup: function( data ) { + + // For mutual compressibility with _default, replace `this` access with a local var. + // `|| data` is dead code meant only to preserve the variable through minification. 
+ var el = this || data; + + // Claim the first handler + if ( rcheckableType.test( el.type ) && + el.click && nodeName( el, "input" ) ) { + + // dataPriv.set( el, "click", ... ) + leverageNative( el, "click", returnTrue ); + } + + // Return false to allow normal processing in the caller + return false; + }, + trigger: function( data ) { + + // For mutual compressibility with _default, replace `this` access with a local var. + // `|| data` is dead code meant only to preserve the variable through minification. + var el = this || data; + + // Force setup before triggering a click + if ( rcheckableType.test( el.type ) && + el.click && nodeName( el, "input" ) ) { + + leverageNative( el, "click" ); + } + + // Return non-false to allow normal event-path propagation + return true; + }, + + // For cross-browser consistency, suppress native .click() on links + // Also prevent it if we're currently inside a leveraged native-event stack + _default: function( event ) { + var target = event.target; + return rcheckableType.test( target.type ) && + target.click && nodeName( target, "input" ) && + dataPriv.get( target, "click" ) || + nodeName( target, "a" ); + } + }, + + beforeunload: { + postDispatch: function( event ) { + + // Support: Firefox 20+ + // Firefox doesn't alert if the returnValue field is not set. + if ( event.result !== undefined && event.originalEvent ) { + event.originalEvent.returnValue = event.result; + } + } + } + } +}; + +// Ensure the presence of an event listener that handles manually-triggered +// synthetic events by interrupting progress until reinvoked in response to +// *native* events that it fires directly, ensuring that state changes have +// already occurred before other listeners are invoked. +function leverageNative( el, type, expectSync ) { + + // Missing expectSync indicates a trigger call, which must force setup through jQuery.event.add + if ( !expectSync ) { + if ( dataPriv.get( el, type ) === undefined ) { + jQuery.event.add( el, type, returnTrue ); + } + return; + } + + // Register the controller as a special universal handler for all event namespaces + dataPriv.set( el, type, false ); + jQuery.event.add( el, type, { + namespace: false, + handler: function( event ) { + var notAsync, result, + saved = dataPriv.get( this, type ); + + if ( ( event.isTrigger & 1 ) && this[ type ] ) { + + // Interrupt processing of the outer synthetic .trigger()ed event + // Saved data should be false in such cases, but might be a leftover capture object + // from an async native handler (gh-4350) + if ( !saved.length ) { + + // Store arguments for use when handling the inner native event + // There will always be at least one argument (an event object), so this array + // will not be confused with a leftover capture object. + saved = slice.call( arguments ); + dataPriv.set( this, type, saved ); + + // Trigger the native event and capture its result + // Support: IE <=9 - 11+ + // focus() and blur() are asynchronous + notAsync = expectSync( this, type ); + this[ type ](); + result = dataPriv.get( this, type ); + if ( saved !== result || notAsync ) { + dataPriv.set( this, type, false ); + } else { + result = {}; + } + if ( saved !== result ) { + + // Cancel the outer synthetic event + event.stopImmediatePropagation(); + event.preventDefault(); + + // Support: Chrome 86+ + // In Chrome, if an element having a focusout handler is blurred by + // clicking outside of it, it invokes the handler synchronously. 
If + // that handler calls `.remove()` on the element, the data is cleared, + // leaving `result` undefined. We need to guard against this. + return result && result.value; + } + + // If this is an inner synthetic event for an event with a bubbling surrogate + // (focus or blur), assume that the surrogate already propagated from triggering the + // native event and prevent that from happening again here. + // This technically gets the ordering wrong w.r.t. to `.trigger()` (in which the + // bubbling surrogate propagates *after* the non-bubbling base), but that seems + // less bad than duplication. + } else if ( ( jQuery.event.special[ type ] || {} ).delegateType ) { + event.stopPropagation(); + } + + // If this is a native event triggered above, everything is now in order + // Fire an inner synthetic event with the original arguments + } else if ( saved.length ) { + + // ...and capture the result + dataPriv.set( this, type, { + value: jQuery.event.trigger( + + // Support: IE <=9 - 11+ + // Extend with the prototype to reset the above stopImmediatePropagation() + jQuery.extend( saved[ 0 ], jQuery.Event.prototype ), + saved.slice( 1 ), + this + ) + } ); + + // Abort handling of the native event + event.stopImmediatePropagation(); + } + } + } ); +} + +jQuery.removeEvent = function( elem, type, handle ) { + + // This "if" is needed for plain objects + if ( elem.removeEventListener ) { + elem.removeEventListener( type, handle ); + } +}; + +jQuery.Event = function( src, props ) { + + // Allow instantiation without the 'new' keyword + if ( !( this instanceof jQuery.Event ) ) { + return new jQuery.Event( src, props ); + } + + // Event object + if ( src && src.type ) { + this.originalEvent = src; + this.type = src.type; + + // Events bubbling up the document may have been marked as prevented + // by a handler lower down the tree; reflect the correct value. + this.isDefaultPrevented = src.defaultPrevented || + src.defaultPrevented === undefined && + + // Support: Android <=2.3 only + src.returnValue === false ? + returnTrue : + returnFalse; + + // Create target properties + // Support: Safari <=6 - 7 only + // Target should not be a text node (#504, #13143) + this.target = ( src.target && src.target.nodeType === 3 ) ? 
+ src.target.parentNode : + src.target; + + this.currentTarget = src.currentTarget; + this.relatedTarget = src.relatedTarget; + + // Event type + } else { + this.type = src; + } + + // Put explicitly provided properties onto the event object + if ( props ) { + jQuery.extend( this, props ); + } + + // Create a timestamp if incoming event doesn't have one + this.timeStamp = src && src.timeStamp || Date.now(); + + // Mark it as fixed + this[ jQuery.expando ] = true; +}; + +// jQuery.Event is based on DOM3 Events as specified by the ECMAScript Language Binding +// https://www.w3.org/TR/2003/WD-DOM-Level-3-Events-20030331/ecma-script-binding.html +jQuery.Event.prototype = { + constructor: jQuery.Event, + isDefaultPrevented: returnFalse, + isPropagationStopped: returnFalse, + isImmediatePropagationStopped: returnFalse, + isSimulated: false, + + preventDefault: function() { + var e = this.originalEvent; + + this.isDefaultPrevented = returnTrue; + + if ( e && !this.isSimulated ) { + e.preventDefault(); + } + }, + stopPropagation: function() { + var e = this.originalEvent; + + this.isPropagationStopped = returnTrue; + + if ( e && !this.isSimulated ) { + e.stopPropagation(); + } + }, + stopImmediatePropagation: function() { + var e = this.originalEvent; + + this.isImmediatePropagationStopped = returnTrue; + + if ( e && !this.isSimulated ) { + e.stopImmediatePropagation(); + } + + this.stopPropagation(); + } +}; + +// Includes all common event props including KeyEvent and MouseEvent specific props +jQuery.each( { + altKey: true, + bubbles: true, + cancelable: true, + changedTouches: true, + ctrlKey: true, + detail: true, + eventPhase: true, + metaKey: true, + pageX: true, + pageY: true, + shiftKey: true, + view: true, + "char": true, + code: true, + charCode: true, + key: true, + keyCode: true, + button: true, + buttons: true, + clientX: true, + clientY: true, + offsetX: true, + offsetY: true, + pointerId: true, + pointerType: true, + screenX: true, + screenY: true, + targetTouches: true, + toElement: true, + touches: true, + which: true +}, jQuery.event.addProp ); + +jQuery.each( { focus: "focusin", blur: "focusout" }, function( type, delegateType ) { + jQuery.event.special[ type ] = { + + // Utilize native event if possible so blur/focus sequence is correct + setup: function() { + + // Claim the first handler + // dataPriv.set( this, "focus", ... ) + // dataPriv.set( this, "blur", ... ) + leverageNative( this, type, expectSync ); + + // Return false to allow normal processing in the caller + return false; + }, + trigger: function() { + + // Force setup before trigger + leverageNative( this, type ); + + // Return non-false to allow normal event-path propagation + return true; + }, + + // Suppress native focus or blur as it's already being fired + // in leverageNative. + _default: function() { + return true; + }, + + delegateType: delegateType + }; +} ); + +// Create mouseenter/leave events using mouseover/out and event-time checks +// so that event delegation works in jQuery. +// Do the same for pointerenter/pointerleave and pointerover/pointerout +// +// Support: Safari 7 only +// Safari sends mouseenter too often; see: +// https://bugs.chromium.org/p/chromium/issues/detail?id=470258 +// for the description of the bug (it existed in older Chrome versions as well). 
+jQuery.each( { + mouseenter: "mouseover", + mouseleave: "mouseout", + pointerenter: "pointerover", + pointerleave: "pointerout" +}, function( orig, fix ) { + jQuery.event.special[ orig ] = { + delegateType: fix, + bindType: fix, + + handle: function( event ) { + var ret, + target = this, + related = event.relatedTarget, + handleObj = event.handleObj; + + // For mouseenter/leave call the handler if related is outside the target. + // NB: No relatedTarget if the mouse left/entered the browser window + if ( !related || ( related !== target && !jQuery.contains( target, related ) ) ) { + event.type = handleObj.origType; + ret = handleObj.handler.apply( this, arguments ); + event.type = fix; + } + return ret; + } + }; +} ); + +jQuery.fn.extend( { + + on: function( types, selector, data, fn ) { + return on( this, types, selector, data, fn ); + }, + one: function( types, selector, data, fn ) { + return on( this, types, selector, data, fn, 1 ); + }, + off: function( types, selector, fn ) { + var handleObj, type; + if ( types && types.preventDefault && types.handleObj ) { + + // ( event ) dispatched jQuery.Event + handleObj = types.handleObj; + jQuery( types.delegateTarget ).off( + handleObj.namespace ? + handleObj.origType + "." + handleObj.namespace : + handleObj.origType, + handleObj.selector, + handleObj.handler + ); + return this; + } + if ( typeof types === "object" ) { + + // ( types-object [, selector] ) + for ( type in types ) { + this.off( type, selector, types[ type ] ); + } + return this; + } + if ( selector === false || typeof selector === "function" ) { + + // ( types [, fn] ) + fn = selector; + selector = undefined; + } + if ( fn === false ) { + fn = returnFalse; + } + return this.each( function() { + jQuery.event.remove( this, types, fn, selector ); + } ); + } +} ); + + +var + + // Support: IE <=10 - 11, Edge 12 - 13 only + // In IE/Edge using regex groups here causes severe slowdowns. + // See https://connect.microsoft.com/IE/feedback/details/1736512/ + rnoInnerhtml = /\s*$/g; + +// Prefer a tbody over its parent table for containing new rows +function manipulationTarget( elem, content ) { + if ( nodeName( elem, "table" ) && + nodeName( content.nodeType !== 11 ? content : content.firstChild, "tr" ) ) { + + return jQuery( elem ).children( "tbody" )[ 0 ] || elem; + } + + return elem; +} + +// Replace/restore the type attribute of script elements for safe DOM manipulation +function disableScript( elem ) { + elem.type = ( elem.getAttribute( "type" ) !== null ) + "/" + elem.type; + return elem; +} +function restoreScript( elem ) { + if ( ( elem.type || "" ).slice( 0, 5 ) === "true/" ) { + elem.type = elem.type.slice( 5 ); + } else { + elem.removeAttribute( "type" ); + } + + return elem; +} + +function cloneCopyEvent( src, dest ) { + var i, l, type, pdataOld, udataOld, udataCur, events; + + if ( dest.nodeType !== 1 ) { + return; + } + + // 1. Copy private data: events, handlers, etc. + if ( dataPriv.hasData( src ) ) { + pdataOld = dataPriv.get( src ); + events = pdataOld.events; + + if ( events ) { + dataPriv.remove( dest, "handle events" ); + + for ( type in events ) { + for ( i = 0, l = events[ type ].length; i < l; i++ ) { + jQuery.event.add( dest, type, events[ type ][ i ] ); + } + } + } + } + + // 2. 
Copy user data + if ( dataUser.hasData( src ) ) { + udataOld = dataUser.access( src ); + udataCur = jQuery.extend( {}, udataOld ); + + dataUser.set( dest, udataCur ); + } +} + +// Fix IE bugs, see support tests +function fixInput( src, dest ) { + var nodeName = dest.nodeName.toLowerCase(); + + // Fails to persist the checked state of a cloned checkbox or radio button. + if ( nodeName === "input" && rcheckableType.test( src.type ) ) { + dest.checked = src.checked; + + // Fails to return the selected option to the default selected state when cloning options + } else if ( nodeName === "input" || nodeName === "textarea" ) { + dest.defaultValue = src.defaultValue; + } +} + +function domManip( collection, args, callback, ignored ) { + + // Flatten any nested arrays + args = flat( args ); + + var fragment, first, scripts, hasScripts, node, doc, + i = 0, + l = collection.length, + iNoClone = l - 1, + value = args[ 0 ], + valueIsFunction = isFunction( value ); + + // We can't cloneNode fragments that contain checked, in WebKit + if ( valueIsFunction || + ( l > 1 && typeof value === "string" && + !support.checkClone && rchecked.test( value ) ) ) { + return collection.each( function( index ) { + var self = collection.eq( index ); + if ( valueIsFunction ) { + args[ 0 ] = value.call( this, index, self.html() ); + } + domManip( self, args, callback, ignored ); + } ); + } + + if ( l ) { + fragment = buildFragment( args, collection[ 0 ].ownerDocument, false, collection, ignored ); + first = fragment.firstChild; + + if ( fragment.childNodes.length === 1 ) { + fragment = first; + } + + // Require either new content or an interest in ignored elements to invoke the callback + if ( first || ignored ) { + scripts = jQuery.map( getAll( fragment, "script" ), disableScript ); + hasScripts = scripts.length; + + // Use the original fragment for the last item + // instead of the first because it can end up + // being emptied incorrectly in certain situations (#8070). + for ( ; i < l; i++ ) { + node = fragment; + + if ( i !== iNoClone ) { + node = jQuery.clone( node, true, true ); + + // Keep references to cloned scripts for later restoration + if ( hasScripts ) { + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( scripts, getAll( node, "script" ) ); + } + } + + callback.call( collection[ i ], node, i ); + } + + if ( hasScripts ) { + doc = scripts[ scripts.length - 1 ].ownerDocument; + + // Reenable scripts + jQuery.map( scripts, restoreScript ); + + // Evaluate executable scripts on first document insertion + for ( i = 0; i < hasScripts; i++ ) { + node = scripts[ i ]; + if ( rscriptType.test( node.type || "" ) && + !dataPriv.access( node, "globalEval" ) && + jQuery.contains( doc, node ) ) { + + if ( node.src && ( node.type || "" ).toLowerCase() !== "module" ) { + + // Optional AJAX dependency, but won't run scripts if not present + if ( jQuery._evalUrl && !node.noModule ) { + jQuery._evalUrl( node.src, { + nonce: node.nonce || node.getAttribute( "nonce" ) + }, doc ); + } + } else { + DOMEval( node.textContent.replace( rcleanScript, "" ), node, doc ); + } + } + } + } + } + } + + return collection; +} + +function remove( elem, selector, keepData ) { + var node, + nodes = selector ? 
jQuery.filter( selector, elem ) : elem, + i = 0; + + for ( ; ( node = nodes[ i ] ) != null; i++ ) { + if ( !keepData && node.nodeType === 1 ) { + jQuery.cleanData( getAll( node ) ); + } + + if ( node.parentNode ) { + if ( keepData && isAttached( node ) ) { + setGlobalEval( getAll( node, "script" ) ); + } + node.parentNode.removeChild( node ); + } + } + + return elem; +} + +jQuery.extend( { + htmlPrefilter: function( html ) { + return html; + }, + + clone: function( elem, dataAndEvents, deepDataAndEvents ) { + var i, l, srcElements, destElements, + clone = elem.cloneNode( true ), + inPage = isAttached( elem ); + + // Fix IE cloning issues + if ( !support.noCloneChecked && ( elem.nodeType === 1 || elem.nodeType === 11 ) && + !jQuery.isXMLDoc( elem ) ) { + + // We eschew Sizzle here for performance reasons: https://jsperf.com/getall-vs-sizzle/2 + destElements = getAll( clone ); + srcElements = getAll( elem ); + + for ( i = 0, l = srcElements.length; i < l; i++ ) { + fixInput( srcElements[ i ], destElements[ i ] ); + } + } + + // Copy the events from the original to the clone + if ( dataAndEvents ) { + if ( deepDataAndEvents ) { + srcElements = srcElements || getAll( elem ); + destElements = destElements || getAll( clone ); + + for ( i = 0, l = srcElements.length; i < l; i++ ) { + cloneCopyEvent( srcElements[ i ], destElements[ i ] ); + } + } else { + cloneCopyEvent( elem, clone ); + } + } + + // Preserve script evaluation history + destElements = getAll( clone, "script" ); + if ( destElements.length > 0 ) { + setGlobalEval( destElements, !inPage && getAll( elem, "script" ) ); + } + + // Return the cloned set + return clone; + }, + + cleanData: function( elems ) { + var data, elem, type, + special = jQuery.event.special, + i = 0; + + for ( ; ( elem = elems[ i ] ) !== undefined; i++ ) { + if ( acceptData( elem ) ) { + if ( ( data = elem[ dataPriv.expando ] ) ) { + if ( data.events ) { + for ( type in data.events ) { + if ( special[ type ] ) { + jQuery.event.remove( elem, type ); + + // This is a shortcut to avoid jQuery.event.remove's overhead + } else { + jQuery.removeEvent( elem, type, data.handle ); + } + } + } + + // Support: Chrome <=35 - 45+ + // Assign undefined instead of using delete, see Data#remove + elem[ dataPriv.expando ] = undefined; + } + if ( elem[ dataUser.expando ] ) { + + // Support: Chrome <=35 - 45+ + // Assign undefined instead of using delete, see Data#remove + elem[ dataUser.expando ] = undefined; + } + } + } + } +} ); + +jQuery.fn.extend( { + detach: function( selector ) { + return remove( this, selector, true ); + }, + + remove: function( selector ) { + return remove( this, selector ); + }, + + text: function( value ) { + return access( this, function( value ) { + return value === undefined ? 
+ jQuery.text( this ) : + this.empty().each( function() { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + this.textContent = value; + } + } ); + }, null, value, arguments.length ); + }, + + append: function() { + return domManip( this, arguments, function( elem ) { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + var target = manipulationTarget( this, elem ); + target.appendChild( elem ); + } + } ); + }, + + prepend: function() { + return domManip( this, arguments, function( elem ) { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + var target = manipulationTarget( this, elem ); + target.insertBefore( elem, target.firstChild ); + } + } ); + }, + + before: function() { + return domManip( this, arguments, function( elem ) { + if ( this.parentNode ) { + this.parentNode.insertBefore( elem, this ); + } + } ); + }, + + after: function() { + return domManip( this, arguments, function( elem ) { + if ( this.parentNode ) { + this.parentNode.insertBefore( elem, this.nextSibling ); + } + } ); + }, + + empty: function() { + var elem, + i = 0; + + for ( ; ( elem = this[ i ] ) != null; i++ ) { + if ( elem.nodeType === 1 ) { + + // Prevent memory leaks + jQuery.cleanData( getAll( elem, false ) ); + + // Remove any remaining nodes + elem.textContent = ""; + } + } + + return this; + }, + + clone: function( dataAndEvents, deepDataAndEvents ) { + dataAndEvents = dataAndEvents == null ? false : dataAndEvents; + deepDataAndEvents = deepDataAndEvents == null ? dataAndEvents : deepDataAndEvents; + + return this.map( function() { + return jQuery.clone( this, dataAndEvents, deepDataAndEvents ); + } ); + }, + + html: function( value ) { + return access( this, function( value ) { + var elem = this[ 0 ] || {}, + i = 0, + l = this.length; + + if ( value === undefined && elem.nodeType === 1 ) { + return elem.innerHTML; + } + + // See if we can take a shortcut and just use innerHTML + if ( typeof value === "string" && !rnoInnerhtml.test( value ) && + !wrapMap[ ( rtagName.exec( value ) || [ "", "" ] )[ 1 ].toLowerCase() ] ) { + + value = jQuery.htmlPrefilter( value ); + + try { + for ( ; i < l; i++ ) { + elem = this[ i ] || {}; + + // Remove element nodes and prevent memory leaks + if ( elem.nodeType === 1 ) { + jQuery.cleanData( getAll( elem, false ) ); + elem.innerHTML = value; + } + } + + elem = 0; + + // If using innerHTML throws an exception, use the fallback method + } catch ( e ) {} + } + + if ( elem ) { + this.empty().append( value ); + } + }, null, value, arguments.length ); + }, + + replaceWith: function() { + var ignored = []; + + // Make the changes, replacing each non-ignored context element with the new content + return domManip( this, arguments, function( elem ) { + var parent = this.parentNode; + + if ( jQuery.inArray( this, ignored ) < 0 ) { + jQuery.cleanData( getAll( this ) ); + if ( parent ) { + parent.replaceChild( elem, this ); + } + } + + // Force callback invocation + }, ignored ); + } +} ); + +jQuery.each( { + appendTo: "append", + prependTo: "prepend", + insertBefore: "before", + insertAfter: "after", + replaceAll: "replaceWith" +}, function( name, original ) { + jQuery.fn[ name ] = function( selector ) { + var elems, + ret = [], + insert = jQuery( selector ), + last = insert.length - 1, + i = 0; + + for ( ; i <= last; i++ ) { + elems = i === last ? 
this : this.clone( true ); + jQuery( insert[ i ] )[ original ]( elems ); + + // Support: Android <=4.0 only, PhantomJS 1 only + // .get() because push.apply(_, arraylike) throws on ancient WebKit + push.apply( ret, elems.get() ); + } + + return this.pushStack( ret ); + }; +} ); +var rnumnonpx = new RegExp( "^(" + pnum + ")(?!px)[a-z%]+$", "i" ); + +var getStyles = function( elem ) { + + // Support: IE <=11 only, Firefox <=30 (#15098, #14150) + // IE throws on elements created in popups + // FF meanwhile throws on frame elements through "defaultView.getComputedStyle" + var view = elem.ownerDocument.defaultView; + + if ( !view || !view.opener ) { + view = window; + } + + return view.getComputedStyle( elem ); + }; + +var swap = function( elem, options, callback ) { + var ret, name, + old = {}; + + // Remember the old values, and insert the new ones + for ( name in options ) { + old[ name ] = elem.style[ name ]; + elem.style[ name ] = options[ name ]; + } + + ret = callback.call( elem ); + + // Revert the old values + for ( name in options ) { + elem.style[ name ] = old[ name ]; + } + + return ret; +}; + + +var rboxStyle = new RegExp( cssExpand.join( "|" ), "i" ); + + + +( function() { + + // Executing both pixelPosition & boxSizingReliable tests require only one layout + // so they're executed at the same time to save the second computation. + function computeStyleTests() { + + // This is a singleton, we need to execute it only once + if ( !div ) { + return; + } + + container.style.cssText = "position:absolute;left:-11111px;width:60px;" + + "margin-top:1px;padding:0;border:0"; + div.style.cssText = + "position:relative;display:block;box-sizing:border-box;overflow:scroll;" + + "margin:auto;border:1px;padding:1px;" + + "width:60%;top:1%"; + documentElement.appendChild( container ).appendChild( div ); + + var divStyle = window.getComputedStyle( div ); + pixelPositionVal = divStyle.top !== "1%"; + + // Support: Android 4.0 - 4.3 only, Firefox <=3 - 44 + reliableMarginLeftVal = roundPixelMeasures( divStyle.marginLeft ) === 12; + + // Support: Android 4.0 - 4.3 only, Safari <=9.1 - 10.1, iOS <=7.0 - 9.3 + // Some styles come back with percentage values, even though they shouldn't + div.style.right = "60%"; + pixelBoxStylesVal = roundPixelMeasures( divStyle.right ) === 36; + + // Support: IE 9 - 11 only + // Detect misreporting of content dimensions for box-sizing:border-box elements + boxSizingReliableVal = roundPixelMeasures( divStyle.width ) === 36; + + // Support: IE 9 only + // Detect overflow:scroll screwiness (gh-3699) + // Support: Chrome <=64 + // Don't get tricked when zoom affects offsetWidth (gh-4029) + div.style.position = "absolute"; + scrollboxSizeVal = roundPixelMeasures( div.offsetWidth / 3 ) === 12; + + documentElement.removeChild( container ); + + // Nullify the div so it wouldn't be stored in the memory and + // it will also be a sign that checks already performed + div = null; + } + + function roundPixelMeasures( measure ) { + return Math.round( parseFloat( measure ) ); + } + + var pixelPositionVal, boxSizingReliableVal, scrollboxSizeVal, pixelBoxStylesVal, + reliableTrDimensionsVal, reliableMarginLeftVal, + container = document.createElement( "div" ), + div = document.createElement( "div" ); + + // Finish early in limited (non-browser) environments + if ( !div.style ) { + return; + } + + // Support: IE <=9 - 11 only + // Style of cloned element affects source element cloned (#8908) + div.style.backgroundClip = "content-box"; + div.cloneNode( true ).style.backgroundClip = ""; + 
support.clearCloneStyle = div.style.backgroundClip === "content-box"; + + jQuery.extend( support, { + boxSizingReliable: function() { + computeStyleTests(); + return boxSizingReliableVal; + }, + pixelBoxStyles: function() { + computeStyleTests(); + return pixelBoxStylesVal; + }, + pixelPosition: function() { + computeStyleTests(); + return pixelPositionVal; + }, + reliableMarginLeft: function() { + computeStyleTests(); + return reliableMarginLeftVal; + }, + scrollboxSize: function() { + computeStyleTests(); + return scrollboxSizeVal; + }, + + // Support: IE 9 - 11+, Edge 15 - 18+ + // IE/Edge misreport `getComputedStyle` of table rows with width/height + // set in CSS while `offset*` properties report correct values. + // Behavior in IE 9 is more subtle than in newer versions & it passes + // some versions of this test; make sure not to make it pass there! + // + // Support: Firefox 70+ + // Only Firefox includes border widths + // in computed dimensions. (gh-4529) + reliableTrDimensions: function() { + var table, tr, trChild, trStyle; + if ( reliableTrDimensionsVal == null ) { + table = document.createElement( "table" ); + tr = document.createElement( "tr" ); + trChild = document.createElement( "div" ); + + table.style.cssText = "position:absolute;left:-11111px;border-collapse:separate"; + tr.style.cssText = "border:1px solid"; + + // Support: Chrome 86+ + // Height set through cssText does not get applied. + // Computed height then comes back as 0. + tr.style.height = "1px"; + trChild.style.height = "9px"; + + // Support: Android 8 Chrome 86+ + // In our bodyBackground.html iframe, + // display for all div elements is set to "inline", + // which causes a problem only in Android 8 Chrome 86. + // Ensuring the div is display: block + // gets around this issue. + trChild.style.display = "block"; + + documentElement + .appendChild( table ) + .appendChild( tr ) + .appendChild( trChild ); + + trStyle = window.getComputedStyle( tr ); + reliableTrDimensionsVal = ( parseInt( trStyle.height, 10 ) + + parseInt( trStyle.borderTopWidth, 10 ) + + parseInt( trStyle.borderBottomWidth, 10 ) ) === tr.offsetHeight; + + documentElement.removeChild( table ); + } + return reliableTrDimensionsVal; + } + } ); +} )(); + + +function curCSS( elem, name, computed ) { + var width, minWidth, maxWidth, ret, + + // Support: Firefox 51+ + // Retrieving style before computed somehow + // fixes an issue with getting wrong values + // on detached elements + style = elem.style; + + computed = computed || getStyles( elem ); + + // getPropertyValue is needed for: + // .css('filter') (IE 9 only, #12537) + // .css('--customProperty) (#3144) + if ( computed ) { + ret = computed.getPropertyValue( name ) || computed[ name ]; + + if ( ret === "" && !isAttached( elem ) ) { + ret = jQuery.style( elem, name ); + } + + // A tribute to the "awesome hack by Dean Edwards" + // Android Browser returns percentage for some values, + // but width seems to be reliably pixels. 
+ // This is against the CSSOM draft spec: + // https://drafts.csswg.org/cssom/#resolved-values + if ( !support.pixelBoxStyles() && rnumnonpx.test( ret ) && rboxStyle.test( name ) ) { + + // Remember the original values + width = style.width; + minWidth = style.minWidth; + maxWidth = style.maxWidth; + + // Put in the new values to get a computed value out + style.minWidth = style.maxWidth = style.width = ret; + ret = computed.width; + + // Revert the changed values + style.width = width; + style.minWidth = minWidth; + style.maxWidth = maxWidth; + } + } + + return ret !== undefined ? + + // Support: IE <=9 - 11 only + // IE returns zIndex value as an integer. + ret + "" : + ret; +} + + +function addGetHookIf( conditionFn, hookFn ) { + + // Define the hook, we'll check on the first run if it's really needed. + return { + get: function() { + if ( conditionFn() ) { + + // Hook not needed (or it's not possible to use it due + // to missing dependency), remove it. + delete this.get; + return; + } + + // Hook needed; redefine it so that the support test is not executed again. + return ( this.get = hookFn ).apply( this, arguments ); + } + }; +} + + +var cssPrefixes = [ "Webkit", "Moz", "ms" ], + emptyStyle = document.createElement( "div" ).style, + vendorProps = {}; + +// Return a vendor-prefixed property or undefined +function vendorPropName( name ) { + + // Check for vendor prefixed names + var capName = name[ 0 ].toUpperCase() + name.slice( 1 ), + i = cssPrefixes.length; + + while ( i-- ) { + name = cssPrefixes[ i ] + capName; + if ( name in emptyStyle ) { + return name; + } + } +} + +// Return a potentially-mapped jQuery.cssProps or vendor prefixed property +function finalPropName( name ) { + var final = jQuery.cssProps[ name ] || vendorProps[ name ]; + + if ( final ) { + return final; + } + if ( name in emptyStyle ) { + return name; + } + return vendorProps[ name ] = vendorPropName( name ) || name; +} + + +var + + // Swappable if display is none or starts with table + // except "table", "table-cell", or "table-caption" + // See here for display values: https://developer.mozilla.org/en-US/docs/CSS/display + rdisplayswap = /^(none|table(?!-c[ea]).+)/, + rcustomProp = /^--/, + cssShow = { position: "absolute", visibility: "hidden", display: "block" }, + cssNormalTransform = { + letterSpacing: "0", + fontWeight: "400" + }; + +function setPositiveNumber( _elem, value, subtract ) { + + // Any relative (+/-) values have already been + // normalized at this point + var matches = rcssNum.exec( value ); + return matches ? + + // Guard against undefined "subtract", e.g., when used as in cssHooks + Math.max( 0, matches[ 2 ] - ( subtract || 0 ) ) + ( matches[ 3 ] || "px" ) : + value; +} + +function boxModelAdjustment( elem, dimension, box, isBorderBox, styles, computedVal ) { + var i = dimension === "width" ? 1 : 0, + extra = 0, + delta = 0; + + // Adjustment may not be necessary + if ( box === ( isBorderBox ? 
"border" : "content" ) ) { + return 0; + } + + for ( ; i < 4; i += 2 ) { + + // Both box models exclude margin + if ( box === "margin" ) { + delta += jQuery.css( elem, box + cssExpand[ i ], true, styles ); + } + + // If we get here with a content-box, we're seeking "padding" or "border" or "margin" + if ( !isBorderBox ) { + + // Add padding + delta += jQuery.css( elem, "padding" + cssExpand[ i ], true, styles ); + + // For "border" or "margin", add border + if ( box !== "padding" ) { + delta += jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + + // But still keep track of it otherwise + } else { + extra += jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + } + + // If we get here with a border-box (content + padding + border), we're seeking "content" or + // "padding" or "margin" + } else { + + // For "content", subtract padding + if ( box === "content" ) { + delta -= jQuery.css( elem, "padding" + cssExpand[ i ], true, styles ); + } + + // For "content" or "padding", subtract border + if ( box !== "margin" ) { + delta -= jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + } + } + } + + // Account for positive content-box scroll gutter when requested by providing computedVal + if ( !isBorderBox && computedVal >= 0 ) { + + // offsetWidth/offsetHeight is a rounded sum of content, padding, scroll gutter, and border + // Assuming integer scroll gutter, subtract the rest and round down + delta += Math.max( 0, Math.ceil( + elem[ "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ) ] - + computedVal - + delta - + extra - + 0.5 + + // If offsetWidth/offsetHeight is unknown, then we can't determine content-box scroll gutter + // Use an explicit zero to avoid NaN (gh-3964) + ) ) || 0; + } + + return delta; +} + +function getWidthOrHeight( elem, dimension, extra ) { + + // Start with computed style + var styles = getStyles( elem ), + + // To avoid forcing a reflow, only fetch boxSizing if we need it (gh-4322). + // Fake content-box until we know it's needed to know the true value. + boxSizingNeeded = !support.boxSizingReliable() || extra, + isBorderBox = boxSizingNeeded && + jQuery.css( elem, "boxSizing", false, styles ) === "border-box", + valueIsBorderBox = isBorderBox, + + val = curCSS( elem, dimension, styles ), + offsetProp = "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ); + + // Support: Firefox <=54 + // Return a confounding non-pixel value or feign ignorance, as appropriate. + if ( rnumnonpx.test( val ) ) { + if ( !extra ) { + return val; + } + val = "auto"; + } + + + // Support: IE 9 - 11 only + // Use offsetWidth/offsetHeight for when box sizing is unreliable. + // In those cases, the computed value can be trusted to be border-box. + if ( ( !support.boxSizingReliable() && isBorderBox || + + // Support: IE 10 - 11+, Edge 15 - 18+ + // IE/Edge misreport `getComputedStyle` of table rows with width/height + // set in CSS while `offset*` properties report correct values. + // Interestingly, in some cases IE 9 doesn't suffer from this issue. 
+ !support.reliableTrDimensions() && nodeName( elem, "tr" ) || + + // Fall back to offsetWidth/offsetHeight when value is "auto" + // This happens for inline elements with no explicit setting (gh-3571) + val === "auto" || + + // Support: Android <=4.1 - 4.3 only + // Also use offsetWidth/offsetHeight for misreported inline dimensions (gh-3602) + !parseFloat( val ) && jQuery.css( elem, "display", false, styles ) === "inline" ) && + + // Make sure the element is visible & connected + elem.getClientRects().length ) { + + isBorderBox = jQuery.css( elem, "boxSizing", false, styles ) === "border-box"; + + // Where available, offsetWidth/offsetHeight approximate border box dimensions. + // Where not available (e.g., SVG), assume unreliable box-sizing and interpret the + // retrieved value as a content box dimension. + valueIsBorderBox = offsetProp in elem; + if ( valueIsBorderBox ) { + val = elem[ offsetProp ]; + } + } + + // Normalize "" and auto + val = parseFloat( val ) || 0; + + // Adjust for the element's box model + return ( val + + boxModelAdjustment( + elem, + dimension, + extra || ( isBorderBox ? "border" : "content" ), + valueIsBorderBox, + styles, + + // Provide the current computed size to request scroll gutter calculation (gh-3589) + val + ) + ) + "px"; +} + +jQuery.extend( { + + // Add in style property hooks for overriding the default + // behavior of getting and setting a style property + cssHooks: { + opacity: { + get: function( elem, computed ) { + if ( computed ) { + + // We should always get a number back from opacity + var ret = curCSS( elem, "opacity" ); + return ret === "" ? "1" : ret; + } + } + } + }, + + // Don't automatically add "px" to these possibly-unitless properties + cssNumber: { + "animationIterationCount": true, + "columnCount": true, + "fillOpacity": true, + "flexGrow": true, + "flexShrink": true, + "fontWeight": true, + "gridArea": true, + "gridColumn": true, + "gridColumnEnd": true, + "gridColumnStart": true, + "gridRow": true, + "gridRowEnd": true, + "gridRowStart": true, + "lineHeight": true, + "opacity": true, + "order": true, + "orphans": true, + "widows": true, + "zIndex": true, + "zoom": true + }, + + // Add in properties whose names you wish to fix before + // setting or getting the value + cssProps: {}, + + // Get and set the style property on a DOM Node + style: function( elem, name, value, extra ) { + + // Don't set styles on text and comment nodes + if ( !elem || elem.nodeType === 3 || elem.nodeType === 8 || !elem.style ) { + return; + } + + // Make sure that we're working with the right name + var ret, type, hooks, + origName = camelCase( name ), + isCustomProp = rcustomProp.test( name ), + style = elem.style; + + // Make sure that we're working with the right name. We don't + // want to query the value if it is a CSS custom property + // since they are user-defined. 
+ if ( !isCustomProp ) { + name = finalPropName( origName ); + } + + // Gets hook for the prefixed version, then unprefixed version + hooks = jQuery.cssHooks[ name ] || jQuery.cssHooks[ origName ]; + + // Check if we're setting a value + if ( value !== undefined ) { + type = typeof value; + + // Convert "+=" or "-=" to relative numbers (#7345) + if ( type === "string" && ( ret = rcssNum.exec( value ) ) && ret[ 1 ] ) { + value = adjustCSS( elem, name, ret ); + + // Fixes bug #9237 + type = "number"; + } + + // Make sure that null and NaN values aren't set (#7116) + if ( value == null || value !== value ) { + return; + } + + // If a number was passed in, add the unit (except for certain CSS properties) + // The isCustomProp check can be removed in jQuery 4.0 when we only auto-append + // "px" to a few hardcoded values. + if ( type === "number" && !isCustomProp ) { + value += ret && ret[ 3 ] || ( jQuery.cssNumber[ origName ] ? "" : "px" ); + } + + // background-* props affect original clone's values + if ( !support.clearCloneStyle && value === "" && name.indexOf( "background" ) === 0 ) { + style[ name ] = "inherit"; + } + + // If a hook was provided, use that value, otherwise just set the specified value + if ( !hooks || !( "set" in hooks ) || + ( value = hooks.set( elem, value, extra ) ) !== undefined ) { + + if ( isCustomProp ) { + style.setProperty( name, value ); + } else { + style[ name ] = value; + } + } + + } else { + + // If a hook was provided get the non-computed value from there + if ( hooks && "get" in hooks && + ( ret = hooks.get( elem, false, extra ) ) !== undefined ) { + + return ret; + } + + // Otherwise just get the value from the style object + return style[ name ]; + } + }, + + css: function( elem, name, extra, styles ) { + var val, num, hooks, + origName = camelCase( name ), + isCustomProp = rcustomProp.test( name ); + + // Make sure that we're working with the right name. We don't + // want to modify the value if it is a CSS custom property + // since they are user-defined. + if ( !isCustomProp ) { + name = finalPropName( origName ); + } + + // Try prefixed name followed by the unprefixed name + hooks = jQuery.cssHooks[ name ] || jQuery.cssHooks[ origName ]; + + // If a hook was provided get the computed value from there + if ( hooks && "get" in hooks ) { + val = hooks.get( elem, true, extra ); + } + + // Otherwise, if a way to get the computed value exists, use that + if ( val === undefined ) { + val = curCSS( elem, name, styles ); + } + + // Convert "normal" to computed value + if ( val === "normal" && name in cssNormalTransform ) { + val = cssNormalTransform[ name ]; + } + + // Make numeric if forced or a qualifier was provided and val looks numeric + if ( extra === "" || extra ) { + num = parseFloat( val ); + return extra === true || isFinite( num ) ? num || 0 : val; + } + + return val; + } +} ); + +jQuery.each( [ "height", "width" ], function( _i, dimension ) { + jQuery.cssHooks[ dimension ] = { + get: function( elem, computed, extra ) { + if ( computed ) { + + // Certain elements can have dimension info if we invisibly show them + // but it must have a current display style that would benefit + return rdisplayswap.test( jQuery.css( elem, "display" ) ) && + + // Support: Safari 8+ + // Table columns in Safari have non-zero offsetWidth & zero + // getBoundingClientRect().width unless display is changed. + // Support: IE <=11 only + // Running getBoundingClientRect on a disconnected node + // in IE throws an error. 
+ ( !elem.getClientRects().length || !elem.getBoundingClientRect().width ) ? + swap( elem, cssShow, function() { + return getWidthOrHeight( elem, dimension, extra ); + } ) : + getWidthOrHeight( elem, dimension, extra ); + } + }, + + set: function( elem, value, extra ) { + var matches, + styles = getStyles( elem ), + + // Only read styles.position if the test has a chance to fail + // to avoid forcing a reflow. + scrollboxSizeBuggy = !support.scrollboxSize() && + styles.position === "absolute", + + // To avoid forcing a reflow, only fetch boxSizing if we need it (gh-3991) + boxSizingNeeded = scrollboxSizeBuggy || extra, + isBorderBox = boxSizingNeeded && + jQuery.css( elem, "boxSizing", false, styles ) === "border-box", + subtract = extra ? + boxModelAdjustment( + elem, + dimension, + extra, + isBorderBox, + styles + ) : + 0; + + // Account for unreliable border-box dimensions by comparing offset* to computed and + // faking a content-box to get border and padding (gh-3699) + if ( isBorderBox && scrollboxSizeBuggy ) { + subtract -= Math.ceil( + elem[ "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ) ] - + parseFloat( styles[ dimension ] ) - + boxModelAdjustment( elem, dimension, "border", false, styles ) - + 0.5 + ); + } + + // Convert to pixels if value adjustment is needed + if ( subtract && ( matches = rcssNum.exec( value ) ) && + ( matches[ 3 ] || "px" ) !== "px" ) { + + elem.style[ dimension ] = value; + value = jQuery.css( elem, dimension ); + } + + return setPositiveNumber( elem, value, subtract ); + } + }; +} ); + +jQuery.cssHooks.marginLeft = addGetHookIf( support.reliableMarginLeft, + function( elem, computed ) { + if ( computed ) { + return ( parseFloat( curCSS( elem, "marginLeft" ) ) || + elem.getBoundingClientRect().left - + swap( elem, { marginLeft: 0 }, function() { + return elem.getBoundingClientRect().left; + } ) + ) + "px"; + } + } +); + +// These hooks are used by animate to expand properties +jQuery.each( { + margin: "", + padding: "", + border: "Width" +}, function( prefix, suffix ) { + jQuery.cssHooks[ prefix + suffix ] = { + expand: function( value ) { + var i = 0, + expanded = {}, + + // Assumes a single number if not a string + parts = typeof value === "string" ? value.split( " " ) : [ value ]; + + for ( ; i < 4; i++ ) { + expanded[ prefix + cssExpand[ i ] + suffix ] = + parts[ i ] || parts[ i - 2 ] || parts[ 0 ]; + } + + return expanded; + } + }; + + if ( prefix !== "margin" ) { + jQuery.cssHooks[ prefix + suffix ].set = setPositiveNumber; + } +} ); + +jQuery.fn.extend( { + css: function( name, value ) { + return access( this, function( elem, name, value ) { + var styles, len, + map = {}, + i = 0; + + if ( Array.isArray( name ) ) { + styles = getStyles( elem ); + len = name.length; + + for ( ; i < len; i++ ) { + map[ name[ i ] ] = jQuery.css( elem, name[ i ], false, styles ); + } + + return map; + } + + return value !== undefined ? + jQuery.style( elem, name, value ) : + jQuery.css( elem, name ); + }, name, value, arguments.length > 1 ); + } +} ); + + +function Tween( elem, options, prop, end, easing ) { + return new Tween.prototype.init( elem, options, prop, end, easing ); +} +jQuery.Tween = Tween; + +Tween.prototype = { + constructor: Tween, + init: function( elem, options, prop, end, easing, unit ) { + this.elem = elem; + this.prop = prop; + this.easing = easing || jQuery.easing._default; + this.options = options; + this.start = this.now = this.cur(); + this.end = end; + this.unit = unit || ( jQuery.cssNumber[ prop ] ? 
"" : "px" ); + }, + cur: function() { + var hooks = Tween.propHooks[ this.prop ]; + + return hooks && hooks.get ? + hooks.get( this ) : + Tween.propHooks._default.get( this ); + }, + run: function( percent ) { + var eased, + hooks = Tween.propHooks[ this.prop ]; + + if ( this.options.duration ) { + this.pos = eased = jQuery.easing[ this.easing ]( + percent, this.options.duration * percent, 0, 1, this.options.duration + ); + } else { + this.pos = eased = percent; + } + this.now = ( this.end - this.start ) * eased + this.start; + + if ( this.options.step ) { + this.options.step.call( this.elem, this.now, this ); + } + + if ( hooks && hooks.set ) { + hooks.set( this ); + } else { + Tween.propHooks._default.set( this ); + } + return this; + } +}; + +Tween.prototype.init.prototype = Tween.prototype; + +Tween.propHooks = { + _default: { + get: function( tween ) { + var result; + + // Use a property on the element directly when it is not a DOM element, + // or when there is no matching style property that exists. + if ( tween.elem.nodeType !== 1 || + tween.elem[ tween.prop ] != null && tween.elem.style[ tween.prop ] == null ) { + return tween.elem[ tween.prop ]; + } + + // Passing an empty string as a 3rd parameter to .css will automatically + // attempt a parseFloat and fallback to a string if the parse fails. + // Simple values such as "10px" are parsed to Float; + // complex values such as "rotate(1rad)" are returned as-is. + result = jQuery.css( tween.elem, tween.prop, "" ); + + // Empty strings, null, undefined and "auto" are converted to 0. + return !result || result === "auto" ? 0 : result; + }, + set: function( tween ) { + + // Use step hook for back compat. + // Use cssHook if its there. + // Use .style if available and use plain properties where available. + if ( jQuery.fx.step[ tween.prop ] ) { + jQuery.fx.step[ tween.prop ]( tween ); + } else if ( tween.elem.nodeType === 1 && ( + jQuery.cssHooks[ tween.prop ] || + tween.elem.style[ finalPropName( tween.prop ) ] != null ) ) { + jQuery.style( tween.elem, tween.prop, tween.now + tween.unit ); + } else { + tween.elem[ tween.prop ] = tween.now; + } + } + } +}; + +// Support: IE <=9 only +// Panic based approach to setting things on disconnected nodes +Tween.propHooks.scrollTop = Tween.propHooks.scrollLeft = { + set: function( tween ) { + if ( tween.elem.nodeType && tween.elem.parentNode ) { + tween.elem[ tween.prop ] = tween.now; + } + } +}; + +jQuery.easing = { + linear: function( p ) { + return p; + }, + swing: function( p ) { + return 0.5 - Math.cos( p * Math.PI ) / 2; + }, + _default: "swing" +}; + +jQuery.fx = Tween.prototype.init; + +// Back compat <1.8 extension point +jQuery.fx.step = {}; + + + + +var + fxNow, inProgress, + rfxtypes = /^(?:toggle|show|hide)$/, + rrun = /queueHooks$/; + +function schedule() { + if ( inProgress ) { + if ( document.hidden === false && window.requestAnimationFrame ) { + window.requestAnimationFrame( schedule ); + } else { + window.setTimeout( schedule, jQuery.fx.interval ); + } + + jQuery.fx.tick(); + } +} + +// Animations created synchronously will run synchronously +function createFxNow() { + window.setTimeout( function() { + fxNow = undefined; + } ); + return ( fxNow = Date.now() ); +} + +// Generate parameters to create a standard animation +function genFx( type, includeWidth ) { + var which, + i = 0, + attrs = { height: type }; + + // If we include width, step value is 1 to do all cssExpand values, + // otherwise step value is 2 to skip over Left and Right + includeWidth = includeWidth ? 
1 : 0; + for ( ; i < 4; i += 2 - includeWidth ) { + which = cssExpand[ i ]; + attrs[ "margin" + which ] = attrs[ "padding" + which ] = type; + } + + if ( includeWidth ) { + attrs.opacity = attrs.width = type; + } + + return attrs; +} + +function createTween( value, prop, animation ) { + var tween, + collection = ( Animation.tweeners[ prop ] || [] ).concat( Animation.tweeners[ "*" ] ), + index = 0, + length = collection.length; + for ( ; index < length; index++ ) { + if ( ( tween = collection[ index ].call( animation, prop, value ) ) ) { + + // We're done with this property + return tween; + } + } +} + +function defaultPrefilter( elem, props, opts ) { + var prop, value, toggle, hooks, oldfire, propTween, restoreDisplay, display, + isBox = "width" in props || "height" in props, + anim = this, + orig = {}, + style = elem.style, + hidden = elem.nodeType && isHiddenWithinTree( elem ), + dataShow = dataPriv.get( elem, "fxshow" ); + + // Queue-skipping animations hijack the fx hooks + if ( !opts.queue ) { + hooks = jQuery._queueHooks( elem, "fx" ); + if ( hooks.unqueued == null ) { + hooks.unqueued = 0; + oldfire = hooks.empty.fire; + hooks.empty.fire = function() { + if ( !hooks.unqueued ) { + oldfire(); + } + }; + } + hooks.unqueued++; + + anim.always( function() { + + // Ensure the complete handler is called before this completes + anim.always( function() { + hooks.unqueued--; + if ( !jQuery.queue( elem, "fx" ).length ) { + hooks.empty.fire(); + } + } ); + } ); + } + + // Detect show/hide animations + for ( prop in props ) { + value = props[ prop ]; + if ( rfxtypes.test( value ) ) { + delete props[ prop ]; + toggle = toggle || value === "toggle"; + if ( value === ( hidden ? "hide" : "show" ) ) { + + // Pretend to be hidden if this is a "show" and + // there is still data from a stopped show/hide + if ( value === "show" && dataShow && dataShow[ prop ] !== undefined ) { + hidden = true; + + // Ignore all other no-op show/hide data + } else { + continue; + } + } + orig[ prop ] = dataShow && dataShow[ prop ] || jQuery.style( elem, prop ); + } + } + + // Bail out if this is a no-op like .hide().hide() + propTween = !jQuery.isEmptyObject( props ); + if ( !propTween && jQuery.isEmptyObject( orig ) ) { + return; + } + + // Restrict "overflow" and "display" styles during box animations + if ( isBox && elem.nodeType === 1 ) { + + // Support: IE <=9 - 11, Edge 12 - 15 + // Record all 3 overflow attributes because IE does not infer the shorthand + // from identically-valued overflowX and overflowY and Edge just mirrors + // the overflowX value there. 
+ opts.overflow = [ style.overflow, style.overflowX, style.overflowY ]; + + // Identify a display type, preferring old show/hide data over the CSS cascade + restoreDisplay = dataShow && dataShow.display; + if ( restoreDisplay == null ) { + restoreDisplay = dataPriv.get( elem, "display" ); + } + display = jQuery.css( elem, "display" ); + if ( display === "none" ) { + if ( restoreDisplay ) { + display = restoreDisplay; + } else { + + // Get nonempty value(s) by temporarily forcing visibility + showHide( [ elem ], true ); + restoreDisplay = elem.style.display || restoreDisplay; + display = jQuery.css( elem, "display" ); + showHide( [ elem ] ); + } + } + + // Animate inline elements as inline-block + if ( display === "inline" || display === "inline-block" && restoreDisplay != null ) { + if ( jQuery.css( elem, "float" ) === "none" ) { + + // Restore the original display value at the end of pure show/hide animations + if ( !propTween ) { + anim.done( function() { + style.display = restoreDisplay; + } ); + if ( restoreDisplay == null ) { + display = style.display; + restoreDisplay = display === "none" ? "" : display; + } + } + style.display = "inline-block"; + } + } + } + + if ( opts.overflow ) { + style.overflow = "hidden"; + anim.always( function() { + style.overflow = opts.overflow[ 0 ]; + style.overflowX = opts.overflow[ 1 ]; + style.overflowY = opts.overflow[ 2 ]; + } ); + } + + // Implement show/hide animations + propTween = false; + for ( prop in orig ) { + + // General show/hide setup for this element animation + if ( !propTween ) { + if ( dataShow ) { + if ( "hidden" in dataShow ) { + hidden = dataShow.hidden; + } + } else { + dataShow = dataPriv.access( elem, "fxshow", { display: restoreDisplay } ); + } + + // Store hidden/visible for toggle so `.stop().toggle()` "reverses" + if ( toggle ) { + dataShow.hidden = !hidden; + } + + // Show elements before animating them + if ( hidden ) { + showHide( [ elem ], true ); + } + + /* eslint-disable no-loop-func */ + + anim.done( function() { + + /* eslint-enable no-loop-func */ + + // The final step of a "hide" animation is actually hiding the element + if ( !hidden ) { + showHide( [ elem ] ); + } + dataPriv.remove( elem, "fxshow" ); + for ( prop in orig ) { + jQuery.style( elem, prop, orig[ prop ] ); + } + } ); + } + + // Per-property setup + propTween = createTween( hidden ? dataShow[ prop ] : 0, prop, anim ); + if ( !( prop in dataShow ) ) { + dataShow[ prop ] = propTween.start; + if ( hidden ) { + propTween.end = propTween.start; + propTween.start = 0; + } + } + } +} + +function propFilter( props, specialEasing ) { + var index, name, easing, value, hooks; + + // camelCase, specialEasing and expand cssHook pass + for ( index in props ) { + name = camelCase( index ); + easing = specialEasing[ name ]; + value = props[ index ]; + if ( Array.isArray( value ) ) { + easing = value[ 1 ]; + value = props[ index ] = value[ 0 ]; + } + + if ( index !== name ) { + props[ name ] = value; + delete props[ index ]; + } + + hooks = jQuery.cssHooks[ name ]; + if ( hooks && "expand" in hooks ) { + value = hooks.expand( value ); + delete props[ name ]; + + // Not quite $.extend, this won't overwrite existing keys. 
+ // Reusing 'index' because we have the correct "name" + for ( index in value ) { + if ( !( index in props ) ) { + props[ index ] = value[ index ]; + specialEasing[ index ] = easing; + } + } + } else { + specialEasing[ name ] = easing; + } + } +} + +function Animation( elem, properties, options ) { + var result, + stopped, + index = 0, + length = Animation.prefilters.length, + deferred = jQuery.Deferred().always( function() { + + // Don't match elem in the :animated selector + delete tick.elem; + } ), + tick = function() { + if ( stopped ) { + return false; + } + var currentTime = fxNow || createFxNow(), + remaining = Math.max( 0, animation.startTime + animation.duration - currentTime ), + + // Support: Android 2.3 only + // Archaic crash bug won't allow us to use `1 - ( 0.5 || 0 )` (#12497) + temp = remaining / animation.duration || 0, + percent = 1 - temp, + index = 0, + length = animation.tweens.length; + + for ( ; index < length; index++ ) { + animation.tweens[ index ].run( percent ); + } + + deferred.notifyWith( elem, [ animation, percent, remaining ] ); + + // If there's more to do, yield + if ( percent < 1 && length ) { + return remaining; + } + + // If this was an empty animation, synthesize a final progress notification + if ( !length ) { + deferred.notifyWith( elem, [ animation, 1, 0 ] ); + } + + // Resolve the animation and report its conclusion + deferred.resolveWith( elem, [ animation ] ); + return false; + }, + animation = deferred.promise( { + elem: elem, + props: jQuery.extend( {}, properties ), + opts: jQuery.extend( true, { + specialEasing: {}, + easing: jQuery.easing._default + }, options ), + originalProperties: properties, + originalOptions: options, + startTime: fxNow || createFxNow(), + duration: options.duration, + tweens: [], + createTween: function( prop, end ) { + var tween = jQuery.Tween( elem, animation.opts, prop, end, + animation.opts.specialEasing[ prop ] || animation.opts.easing ); + animation.tweens.push( tween ); + return tween; + }, + stop: function( gotoEnd ) { + var index = 0, + + // If we are going to the end, we want to run all the tweens + // otherwise we skip this part + length = gotoEnd ? 
animation.tweens.length : 0; + if ( stopped ) { + return this; + } + stopped = true; + for ( ; index < length; index++ ) { + animation.tweens[ index ].run( 1 ); + } + + // Resolve when we played the last frame; otherwise, reject + if ( gotoEnd ) { + deferred.notifyWith( elem, [ animation, 1, 0 ] ); + deferred.resolveWith( elem, [ animation, gotoEnd ] ); + } else { + deferred.rejectWith( elem, [ animation, gotoEnd ] ); + } + return this; + } + } ), + props = animation.props; + + propFilter( props, animation.opts.specialEasing ); + + for ( ; index < length; index++ ) { + result = Animation.prefilters[ index ].call( animation, elem, props, animation.opts ); + if ( result ) { + if ( isFunction( result.stop ) ) { + jQuery._queueHooks( animation.elem, animation.opts.queue ).stop = + result.stop.bind( result ); + } + return result; + } + } + + jQuery.map( props, createTween, animation ); + + if ( isFunction( animation.opts.start ) ) { + animation.opts.start.call( elem, animation ); + } + + // Attach callbacks from options + animation + .progress( animation.opts.progress ) + .done( animation.opts.done, animation.opts.complete ) + .fail( animation.opts.fail ) + .always( animation.opts.always ); + + jQuery.fx.timer( + jQuery.extend( tick, { + elem: elem, + anim: animation, + queue: animation.opts.queue + } ) + ); + + return animation; +} + +jQuery.Animation = jQuery.extend( Animation, { + + tweeners: { + "*": [ function( prop, value ) { + var tween = this.createTween( prop, value ); + adjustCSS( tween.elem, prop, rcssNum.exec( value ), tween ); + return tween; + } ] + }, + + tweener: function( props, callback ) { + if ( isFunction( props ) ) { + callback = props; + props = [ "*" ]; + } else { + props = props.match( rnothtmlwhite ); + } + + var prop, + index = 0, + length = props.length; + + for ( ; index < length; index++ ) { + prop = props[ index ]; + Animation.tweeners[ prop ] = Animation.tweeners[ prop ] || []; + Animation.tweeners[ prop ].unshift( callback ); + } + }, + + prefilters: [ defaultPrefilter ], + + prefilter: function( callback, prepend ) { + if ( prepend ) { + Animation.prefilters.unshift( callback ); + } else { + Animation.prefilters.push( callback ); + } + } +} ); + +jQuery.speed = function( speed, easing, fn ) { + var opt = speed && typeof speed === "object" ? 
jQuery.extend( {}, speed ) : { + complete: fn || !fn && easing || + isFunction( speed ) && speed, + duration: speed, + easing: fn && easing || easing && !isFunction( easing ) && easing + }; + + // Go to the end state if fx are off + if ( jQuery.fx.off ) { + opt.duration = 0; + + } else { + if ( typeof opt.duration !== "number" ) { + if ( opt.duration in jQuery.fx.speeds ) { + opt.duration = jQuery.fx.speeds[ opt.duration ]; + + } else { + opt.duration = jQuery.fx.speeds._default; + } + } + } + + // Normalize opt.queue - true/undefined/null -> "fx" + if ( opt.queue == null || opt.queue === true ) { + opt.queue = "fx"; + } + + // Queueing + opt.old = opt.complete; + + opt.complete = function() { + if ( isFunction( opt.old ) ) { + opt.old.call( this ); + } + + if ( opt.queue ) { + jQuery.dequeue( this, opt.queue ); + } + }; + + return opt; +}; + +jQuery.fn.extend( { + fadeTo: function( speed, to, easing, callback ) { + + // Show any hidden elements after setting opacity to 0 + return this.filter( isHiddenWithinTree ).css( "opacity", 0 ).show() + + // Animate to the value specified + .end().animate( { opacity: to }, speed, easing, callback ); + }, + animate: function( prop, speed, easing, callback ) { + var empty = jQuery.isEmptyObject( prop ), + optall = jQuery.speed( speed, easing, callback ), + doAnimation = function() { + + // Operate on a copy of prop so per-property easing won't be lost + var anim = Animation( this, jQuery.extend( {}, prop ), optall ); + + // Empty animations, or finishing resolves immediately + if ( empty || dataPriv.get( this, "finish" ) ) { + anim.stop( true ); + } + }; + + doAnimation.finish = doAnimation; + + return empty || optall.queue === false ? + this.each( doAnimation ) : + this.queue( optall.queue, doAnimation ); + }, + stop: function( type, clearQueue, gotoEnd ) { + var stopQueue = function( hooks ) { + var stop = hooks.stop; + delete hooks.stop; + stop( gotoEnd ); + }; + + if ( typeof type !== "string" ) { + gotoEnd = clearQueue; + clearQueue = type; + type = undefined; + } + if ( clearQueue ) { + this.queue( type || "fx", [] ); + } + + return this.each( function() { + var dequeue = true, + index = type != null && type + "queueHooks", + timers = jQuery.timers, + data = dataPriv.get( this ); + + if ( index ) { + if ( data[ index ] && data[ index ].stop ) { + stopQueue( data[ index ] ); + } + } else { + for ( index in data ) { + if ( data[ index ] && data[ index ].stop && rrun.test( index ) ) { + stopQueue( data[ index ] ); + } + } + } + + for ( index = timers.length; index--; ) { + if ( timers[ index ].elem === this && + ( type == null || timers[ index ].queue === type ) ) { + + timers[ index ].anim.stop( gotoEnd ); + dequeue = false; + timers.splice( index, 1 ); + } + } + + // Start the next in the queue if the last step wasn't forced. + // Timers currently will call their complete callbacks, which + // will dequeue but only if they were gotoEnd. + if ( dequeue || !gotoEnd ) { + jQuery.dequeue( this, type ); + } + } ); + }, + finish: function( type ) { + if ( type !== false ) { + type = type || "fx"; + } + return this.each( function() { + var index, + data = dataPriv.get( this ), + queue = data[ type + "queue" ], + hooks = data[ type + "queueHooks" ], + timers = jQuery.timers, + length = queue ? 
queue.length : 0; + + // Enable finishing flag on private data + data.finish = true; + + // Empty the queue first + jQuery.queue( this, type, [] ); + + if ( hooks && hooks.stop ) { + hooks.stop.call( this, true ); + } + + // Look for any active animations, and finish them + for ( index = timers.length; index--; ) { + if ( timers[ index ].elem === this && timers[ index ].queue === type ) { + timers[ index ].anim.stop( true ); + timers.splice( index, 1 ); + } + } + + // Look for any animations in the old queue and finish them + for ( index = 0; index < length; index++ ) { + if ( queue[ index ] && queue[ index ].finish ) { + queue[ index ].finish.call( this ); + } + } + + // Turn off finishing flag + delete data.finish; + } ); + } +} ); + +jQuery.each( [ "toggle", "show", "hide" ], function( _i, name ) { + var cssFn = jQuery.fn[ name ]; + jQuery.fn[ name ] = function( speed, easing, callback ) { + return speed == null || typeof speed === "boolean" ? + cssFn.apply( this, arguments ) : + this.animate( genFx( name, true ), speed, easing, callback ); + }; +} ); + +// Generate shortcuts for custom animations +jQuery.each( { + slideDown: genFx( "show" ), + slideUp: genFx( "hide" ), + slideToggle: genFx( "toggle" ), + fadeIn: { opacity: "show" }, + fadeOut: { opacity: "hide" }, + fadeToggle: { opacity: "toggle" } +}, function( name, props ) { + jQuery.fn[ name ] = function( speed, easing, callback ) { + return this.animate( props, speed, easing, callback ); + }; +} ); + +jQuery.timers = []; +jQuery.fx.tick = function() { + var timer, + i = 0, + timers = jQuery.timers; + + fxNow = Date.now(); + + for ( ; i < timers.length; i++ ) { + timer = timers[ i ]; + + // Run the timer and safely remove it when done (allowing for external removal) + if ( !timer() && timers[ i ] === timer ) { + timers.splice( i--, 1 ); + } + } + + if ( !timers.length ) { + jQuery.fx.stop(); + } + fxNow = undefined; +}; + +jQuery.fx.timer = function( timer ) { + jQuery.timers.push( timer ); + jQuery.fx.start(); +}; + +jQuery.fx.interval = 13; +jQuery.fx.start = function() { + if ( inProgress ) { + return; + } + + inProgress = true; + schedule(); +}; + +jQuery.fx.stop = function() { + inProgress = null; +}; + +jQuery.fx.speeds = { + slow: 600, + fast: 200, + + // Default speed + _default: 400 +}; + + +// Based off of the plugin by Clint Helfers, with permission. +// https://web.archive.org/web/20100324014747/http://blindsignals.com/index.php/2009/07/jquery-delay/ +jQuery.fn.delay = function( time, type ) { + time = jQuery.fx ? 
jQuery.fx.speeds[ time ] || time : time; + type = type || "fx"; + + return this.queue( type, function( next, hooks ) { + var timeout = window.setTimeout( next, time ); + hooks.stop = function() { + window.clearTimeout( timeout ); + }; + } ); +}; + + +( function() { + var input = document.createElement( "input" ), + select = document.createElement( "select" ), + opt = select.appendChild( document.createElement( "option" ) ); + + input.type = "checkbox"; + + // Support: Android <=4.3 only + // Default value for a checkbox should be "on" + support.checkOn = input.value !== ""; + + // Support: IE <=11 only + // Must access selectedIndex to make default options select + support.optSelected = opt.selected; + + // Support: IE <=11 only + // An input loses its value after becoming a radio + input = document.createElement( "input" ); + input.value = "t"; + input.type = "radio"; + support.radioValue = input.value === "t"; +} )(); + + +var boolHook, + attrHandle = jQuery.expr.attrHandle; + +jQuery.fn.extend( { + attr: function( name, value ) { + return access( this, jQuery.attr, name, value, arguments.length > 1 ); + }, + + removeAttr: function( name ) { + return this.each( function() { + jQuery.removeAttr( this, name ); + } ); + } +} ); + +jQuery.extend( { + attr: function( elem, name, value ) { + var ret, hooks, + nType = elem.nodeType; + + // Don't get/set attributes on text, comment and attribute nodes + if ( nType === 3 || nType === 8 || nType === 2 ) { + return; + } + + // Fallback to prop when attributes are not supported + if ( typeof elem.getAttribute === "undefined" ) { + return jQuery.prop( elem, name, value ); + } + + // Attribute hooks are determined by the lowercase version + // Grab necessary hook if one is defined + if ( nType !== 1 || !jQuery.isXMLDoc( elem ) ) { + hooks = jQuery.attrHooks[ name.toLowerCase() ] || + ( jQuery.expr.match.bool.test( name ) ? boolHook : undefined ); + } + + if ( value !== undefined ) { + if ( value === null ) { + jQuery.removeAttr( elem, name ); + return; + } + + if ( hooks && "set" in hooks && + ( ret = hooks.set( elem, value, name ) ) !== undefined ) { + return ret; + } + + elem.setAttribute( name, value + "" ); + return value; + } + + if ( hooks && "get" in hooks && ( ret = hooks.get( elem, name ) ) !== null ) { + return ret; + } + + ret = jQuery.find.attr( elem, name ); + + // Non-existent attributes return null, we normalize to undefined + return ret == null ? 
undefined : ret; + }, + + attrHooks: { + type: { + set: function( elem, value ) { + if ( !support.radioValue && value === "radio" && + nodeName( elem, "input" ) ) { + var val = elem.value; + elem.setAttribute( "type", value ); + if ( val ) { + elem.value = val; + } + return value; + } + } + } + }, + + removeAttr: function( elem, value ) { + var name, + i = 0, + + // Attribute names can contain non-HTML whitespace characters + // https://html.spec.whatwg.org/multipage/syntax.html#attributes-2 + attrNames = value && value.match( rnothtmlwhite ); + + if ( attrNames && elem.nodeType === 1 ) { + while ( ( name = attrNames[ i++ ] ) ) { + elem.removeAttribute( name ); + } + } + } +} ); + +// Hooks for boolean attributes +boolHook = { + set: function( elem, value, name ) { + if ( value === false ) { + + // Remove boolean attributes when set to false + jQuery.removeAttr( elem, name ); + } else { + elem.setAttribute( name, name ); + } + return name; + } +}; + +jQuery.each( jQuery.expr.match.bool.source.match( /\w+/g ), function( _i, name ) { + var getter = attrHandle[ name ] || jQuery.find.attr; + + attrHandle[ name ] = function( elem, name, isXML ) { + var ret, handle, + lowercaseName = name.toLowerCase(); + + if ( !isXML ) { + + // Avoid an infinite loop by temporarily removing this function from the getter + handle = attrHandle[ lowercaseName ]; + attrHandle[ lowercaseName ] = ret; + ret = getter( elem, name, isXML ) != null ? + lowercaseName : + null; + attrHandle[ lowercaseName ] = handle; + } + return ret; + }; +} ); + + + + +var rfocusable = /^(?:input|select|textarea|button)$/i, + rclickable = /^(?:a|area)$/i; + +jQuery.fn.extend( { + prop: function( name, value ) { + return access( this, jQuery.prop, name, value, arguments.length > 1 ); + }, + + removeProp: function( name ) { + return this.each( function() { + delete this[ jQuery.propFix[ name ] || name ]; + } ); + } +} ); + +jQuery.extend( { + prop: function( elem, name, value ) { + var ret, hooks, + nType = elem.nodeType; + + // Don't get/set properties on text, comment and attribute nodes + if ( nType === 3 || nType === 8 || nType === 2 ) { + return; + } + + if ( nType !== 1 || !jQuery.isXMLDoc( elem ) ) { + + // Fix name and attach hooks + name = jQuery.propFix[ name ] || name; + hooks = jQuery.propHooks[ name ]; + } + + if ( value !== undefined ) { + if ( hooks && "set" in hooks && + ( ret = hooks.set( elem, value, name ) ) !== undefined ) { + return ret; + } + + return ( elem[ name ] = value ); + } + + if ( hooks && "get" in hooks && ( ret = hooks.get( elem, name ) ) !== null ) { + return ret; + } + + return elem[ name ]; + }, + + propHooks: { + tabIndex: { + get: function( elem ) { + + // Support: IE <=9 - 11 only + // elem.tabIndex doesn't always return the + // correct value when it hasn't been explicitly set + // https://web.archive.org/web/20141116233347/http://fluidproject.org/blog/2008/01/09/getting-setting-and-removing-tabindex-values-with-javascript/ + // Use proper attribute retrieval(#12072) + var tabindex = jQuery.find.attr( elem, "tabindex" ); + + if ( tabindex ) { + return parseInt( tabindex, 10 ); + } + + if ( + rfocusable.test( elem.nodeName ) || + rclickable.test( elem.nodeName ) && + elem.href + ) { + return 0; + } + + return -1; + } + } + }, + + propFix: { + "for": "htmlFor", + "class": "className" + } +} ); + +// Support: IE <=11 only +// Accessing the selectedIndex property +// forces the browser to respect setting selected +// on the option +// The getter ensures a default option is selected +// when in an 
optgroup +// eslint rule "no-unused-expressions" is disabled for this code +// since it considers such accessions noop +if ( !support.optSelected ) { + jQuery.propHooks.selected = { + get: function( elem ) { + + /* eslint no-unused-expressions: "off" */ + + var parent = elem.parentNode; + if ( parent && parent.parentNode ) { + parent.parentNode.selectedIndex; + } + return null; + }, + set: function( elem ) { + + /* eslint no-unused-expressions: "off" */ + + var parent = elem.parentNode; + if ( parent ) { + parent.selectedIndex; + + if ( parent.parentNode ) { + parent.parentNode.selectedIndex; + } + } + } + }; +} + +jQuery.each( [ + "tabIndex", + "readOnly", + "maxLength", + "cellSpacing", + "cellPadding", + "rowSpan", + "colSpan", + "useMap", + "frameBorder", + "contentEditable" +], function() { + jQuery.propFix[ this.toLowerCase() ] = this; +} ); + + + + + // Strip and collapse whitespace according to HTML spec + // https://infra.spec.whatwg.org/#strip-and-collapse-ascii-whitespace + function stripAndCollapse( value ) { + var tokens = value.match( rnothtmlwhite ) || []; + return tokens.join( " " ); + } + + +function getClass( elem ) { + return elem.getAttribute && elem.getAttribute( "class" ) || ""; +} + +function classesToArray( value ) { + if ( Array.isArray( value ) ) { + return value; + } + if ( typeof value === "string" ) { + return value.match( rnothtmlwhite ) || []; + } + return []; +} + +jQuery.fn.extend( { + addClass: function( value ) { + var classes, elem, cur, curValue, clazz, j, finalValue, + i = 0; + + if ( isFunction( value ) ) { + return this.each( function( j ) { + jQuery( this ).addClass( value.call( this, j, getClass( this ) ) ); + } ); + } + + classes = classesToArray( value ); + + if ( classes.length ) { + while ( ( elem = this[ i++ ] ) ) { + curValue = getClass( elem ); + cur = elem.nodeType === 1 && ( " " + stripAndCollapse( curValue ) + " " ); + + if ( cur ) { + j = 0; + while ( ( clazz = classes[ j++ ] ) ) { + if ( cur.indexOf( " " + clazz + " " ) < 0 ) { + cur += clazz + " "; + } + } + + // Only assign if different to avoid unneeded rendering. + finalValue = stripAndCollapse( cur ); + if ( curValue !== finalValue ) { + elem.setAttribute( "class", finalValue ); + } + } + } + } + + return this; + }, + + removeClass: function( value ) { + var classes, elem, cur, curValue, clazz, j, finalValue, + i = 0; + + if ( isFunction( value ) ) { + return this.each( function( j ) { + jQuery( this ).removeClass( value.call( this, j, getClass( this ) ) ); + } ); + } + + if ( !arguments.length ) { + return this.attr( "class", "" ); + } + + classes = classesToArray( value ); + + if ( classes.length ) { + while ( ( elem = this[ i++ ] ) ) { + curValue = getClass( elem ); + + // This expression is here for better compressibility (see addClass) + cur = elem.nodeType === 1 && ( " " + stripAndCollapse( curValue ) + " " ); + + if ( cur ) { + j = 0; + while ( ( clazz = classes[ j++ ] ) ) { + + // Remove *all* instances + while ( cur.indexOf( " " + clazz + " " ) > -1 ) { + cur = cur.replace( " " + clazz + " ", " " ); + } + } + + // Only assign if different to avoid unneeded rendering. + finalValue = stripAndCollapse( cur ); + if ( curValue !== finalValue ) { + elem.setAttribute( "class", finalValue ); + } + } + } + } + + return this; + }, + + toggleClass: function( value, stateVal ) { + var type = typeof value, + isValidValue = type === "string" || Array.isArray( value ); + + if ( typeof stateVal === "boolean" && isValidValue ) { + return stateVal ? 
this.addClass( value ) : this.removeClass( value ); + } + + if ( isFunction( value ) ) { + return this.each( function( i ) { + jQuery( this ).toggleClass( + value.call( this, i, getClass( this ), stateVal ), + stateVal + ); + } ); + } + + return this.each( function() { + var className, i, self, classNames; + + if ( isValidValue ) { + + // Toggle individual class names + i = 0; + self = jQuery( this ); + classNames = classesToArray( value ); + + while ( ( className = classNames[ i++ ] ) ) { + + // Check each className given, space separated list + if ( self.hasClass( className ) ) { + self.removeClass( className ); + } else { + self.addClass( className ); + } + } + + // Toggle whole class name + } else if ( value === undefined || type === "boolean" ) { + className = getClass( this ); + if ( className ) { + + // Store className if set + dataPriv.set( this, "__className__", className ); + } + + // If the element has a class name or if we're passed `false`, + // then remove the whole classname (if there was one, the above saved it). + // Otherwise bring back whatever was previously saved (if anything), + // falling back to the empty string if nothing was stored. + if ( this.setAttribute ) { + this.setAttribute( "class", + className || value === false ? + "" : + dataPriv.get( this, "__className__" ) || "" + ); + } + } + } ); + }, + + hasClass: function( selector ) { + var className, elem, + i = 0; + + className = " " + selector + " "; + while ( ( elem = this[ i++ ] ) ) { + if ( elem.nodeType === 1 && + ( " " + stripAndCollapse( getClass( elem ) ) + " " ).indexOf( className ) > -1 ) { + return true; + } + } + + return false; + } +} ); + + + + +var rreturn = /\r/g; + +jQuery.fn.extend( { + val: function( value ) { + var hooks, ret, valueIsFunction, + elem = this[ 0 ]; + + if ( !arguments.length ) { + if ( elem ) { + hooks = jQuery.valHooks[ elem.type ] || + jQuery.valHooks[ elem.nodeName.toLowerCase() ]; + + if ( hooks && + "get" in hooks && + ( ret = hooks.get( elem, "value" ) ) !== undefined + ) { + return ret; + } + + ret = elem.value; + + // Handle most common string cases + if ( typeof ret === "string" ) { + return ret.replace( rreturn, "" ); + } + + // Handle cases where value is null/undef or number + return ret == null ? "" : ret; + } + + return; + } + + valueIsFunction = isFunction( value ); + + return this.each( function( i ) { + var val; + + if ( this.nodeType !== 1 ) { + return; + } + + if ( valueIsFunction ) { + val = value.call( this, i, jQuery( this ).val() ); + } else { + val = value; + } + + // Treat null/undefined as ""; convert numbers to string + if ( val == null ) { + val = ""; + + } else if ( typeof val === "number" ) { + val += ""; + + } else if ( Array.isArray( val ) ) { + val = jQuery.map( val, function( value ) { + return value == null ? "" : value + ""; + } ); + } + + hooks = jQuery.valHooks[ this.type ] || jQuery.valHooks[ this.nodeName.toLowerCase() ]; + + // If set returns undefined, fall back to normal setting + if ( !hooks || !( "set" in hooks ) || hooks.set( this, val, "value" ) === undefined ) { + this.value = val; + } + } ); + } +} ); + +jQuery.extend( { + valHooks: { + option: { + get: function( elem ) { + + var val = jQuery.find.attr( elem, "value" ); + return val != null ? 
+ val : + + // Support: IE <=10 - 11 only + // option.text throws exceptions (#14686, #14858) + // Strip and collapse whitespace + // https://html.spec.whatwg.org/#strip-and-collapse-whitespace + stripAndCollapse( jQuery.text( elem ) ); + } + }, + select: { + get: function( elem ) { + var value, option, i, + options = elem.options, + index = elem.selectedIndex, + one = elem.type === "select-one", + values = one ? null : [], + max = one ? index + 1 : options.length; + + if ( index < 0 ) { + i = max; + + } else { + i = one ? index : 0; + } + + // Loop through all the selected options + for ( ; i < max; i++ ) { + option = options[ i ]; + + // Support: IE <=9 only + // IE8-9 doesn't update selected after form reset (#2551) + if ( ( option.selected || i === index ) && + + // Don't return options that are disabled or in a disabled optgroup + !option.disabled && + ( !option.parentNode.disabled || + !nodeName( option.parentNode, "optgroup" ) ) ) { + + // Get the specific value for the option + value = jQuery( option ).val(); + + // We don't need an array for one selects + if ( one ) { + return value; + } + + // Multi-Selects return an array + values.push( value ); + } + } + + return values; + }, + + set: function( elem, value ) { + var optionSet, option, + options = elem.options, + values = jQuery.makeArray( value ), + i = options.length; + + while ( i-- ) { + option = options[ i ]; + + /* eslint-disable no-cond-assign */ + + if ( option.selected = + jQuery.inArray( jQuery.valHooks.option.get( option ), values ) > -1 + ) { + optionSet = true; + } + + /* eslint-enable no-cond-assign */ + } + + // Force browsers to behave consistently when non-matching value is set + if ( !optionSet ) { + elem.selectedIndex = -1; + } + return values; + } + } + } +} ); + +// Radios and checkboxes getter/setter +jQuery.each( [ "radio", "checkbox" ], function() { + jQuery.valHooks[ this ] = { + set: function( elem, value ) { + if ( Array.isArray( value ) ) { + return ( elem.checked = jQuery.inArray( jQuery( elem ).val(), value ) > -1 ); + } + } + }; + if ( !support.checkOn ) { + jQuery.valHooks[ this ].get = function( elem ) { + return elem.getAttribute( "value" ) === null ? "on" : elem.value; + }; + } +} ); + + + + +// Return jQuery for attributes-only inclusion + + +support.focusin = "onfocusin" in window; + + +var rfocusMorph = /^(?:focusinfocus|focusoutblur)$/, + stopPropagationCallback = function( e ) { + e.stopPropagation(); + }; + +jQuery.extend( jQuery.event, { + + trigger: function( event, data, elem, onlyHandlers ) { + + var i, cur, tmp, bubbleType, ontype, handle, special, lastElement, + eventPath = [ elem || document ], + type = hasOwn.call( event, "type" ) ? event.type : event, + namespaces = hasOwn.call( event, "namespace" ) ? event.namespace.split( "." ) : []; + + cur = lastElement = tmp = elem = elem || document; + + // Don't do events on text and comment nodes + if ( elem.nodeType === 3 || elem.nodeType === 8 ) { + return; + } + + // focus/blur morphs to focusin/out; ensure we're not firing them right now + if ( rfocusMorph.test( type + jQuery.event.triggered ) ) { + return; + } + + if ( type.indexOf( "." ) > -1 ) { + + // Namespaced trigger; create a regexp to match event type in handle() + namespaces = type.split( "." ); + type = namespaces.shift(); + namespaces.sort(); + } + ontype = type.indexOf( ":" ) < 0 && "on" + type; + + // Caller can pass in a jQuery.Event object, Object, or just an event type string + event = event[ jQuery.expando ] ? 
+ event : + new jQuery.Event( type, typeof event === "object" && event ); + + // Trigger bitmask: & 1 for native handlers; & 2 for jQuery (always true) + event.isTrigger = onlyHandlers ? 2 : 3; + event.namespace = namespaces.join( "." ); + event.rnamespace = event.namespace ? + new RegExp( "(^|\\.)" + namespaces.join( "\\.(?:.*\\.|)" ) + "(\\.|$)" ) : + null; + + // Clean up the event in case it is being reused + event.result = undefined; + if ( !event.target ) { + event.target = elem; + } + + // Clone any incoming data and prepend the event, creating the handler arg list + data = data == null ? + [ event ] : + jQuery.makeArray( data, [ event ] ); + + // Allow special events to draw outside the lines + special = jQuery.event.special[ type ] || {}; + if ( !onlyHandlers && special.trigger && special.trigger.apply( elem, data ) === false ) { + return; + } + + // Determine event propagation path in advance, per W3C events spec (#9951) + // Bubble up to document, then to window; watch for a global ownerDocument var (#9724) + if ( !onlyHandlers && !special.noBubble && !isWindow( elem ) ) { + + bubbleType = special.delegateType || type; + if ( !rfocusMorph.test( bubbleType + type ) ) { + cur = cur.parentNode; + } + for ( ; cur; cur = cur.parentNode ) { + eventPath.push( cur ); + tmp = cur; + } + + // Only add window if we got to document (e.g., not plain obj or detached DOM) + if ( tmp === ( elem.ownerDocument || document ) ) { + eventPath.push( tmp.defaultView || tmp.parentWindow || window ); + } + } + + // Fire handlers on the event path + i = 0; + while ( ( cur = eventPath[ i++ ] ) && !event.isPropagationStopped() ) { + lastElement = cur; + event.type = i > 1 ? + bubbleType : + special.bindType || type; + + // jQuery handler + handle = ( dataPriv.get( cur, "events" ) || Object.create( null ) )[ event.type ] && + dataPriv.get( cur, "handle" ); + if ( handle ) { + handle.apply( cur, data ); + } + + // Native handler + handle = ontype && cur[ ontype ]; + if ( handle && handle.apply && acceptData( cur ) ) { + event.result = handle.apply( cur, data ); + if ( event.result === false ) { + event.preventDefault(); + } + } + } + event.type = type; + + // If nobody prevented the default action, do it now + if ( !onlyHandlers && !event.isDefaultPrevented() ) { + + if ( ( !special._default || + special._default.apply( eventPath.pop(), data ) === false ) && + acceptData( elem ) ) { + + // Call a native DOM method on the target with the same name as the event. 
+ // Don't do default actions on window, that's where global variables be (#6170) + if ( ontype && isFunction( elem[ type ] ) && !isWindow( elem ) ) { + + // Don't re-trigger an onFOO event when we call its FOO() method + tmp = elem[ ontype ]; + + if ( tmp ) { + elem[ ontype ] = null; + } + + // Prevent re-triggering of the same event, since we already bubbled it above + jQuery.event.triggered = type; + + if ( event.isPropagationStopped() ) { + lastElement.addEventListener( type, stopPropagationCallback ); + } + + elem[ type ](); + + if ( event.isPropagationStopped() ) { + lastElement.removeEventListener( type, stopPropagationCallback ); + } + + jQuery.event.triggered = undefined; + + if ( tmp ) { + elem[ ontype ] = tmp; + } + } + } + } + + return event.result; + }, + + // Piggyback on a donor event to simulate a different one + // Used only for `focus(in | out)` events + simulate: function( type, elem, event ) { + var e = jQuery.extend( + new jQuery.Event(), + event, + { + type: type, + isSimulated: true + } + ); + + jQuery.event.trigger( e, null, elem ); + } + +} ); + +jQuery.fn.extend( { + + trigger: function( type, data ) { + return this.each( function() { + jQuery.event.trigger( type, data, this ); + } ); + }, + triggerHandler: function( type, data ) { + var elem = this[ 0 ]; + if ( elem ) { + return jQuery.event.trigger( type, data, elem, true ); + } + } +} ); + + +// Support: Firefox <=44 +// Firefox doesn't have focus(in | out) events +// Related ticket - https://bugzilla.mozilla.org/show_bug.cgi?id=687787 +// +// Support: Chrome <=48 - 49, Safari <=9.0 - 9.1 +// focus(in | out) events fire after focus & blur events, +// which is spec violation - http://www.w3.org/TR/DOM-Level-3-Events/#events-focusevent-event-order +// Related ticket - https://bugs.chromium.org/p/chromium/issues/detail?id=449857 +if ( !support.focusin ) { + jQuery.each( { focus: "focusin", blur: "focusout" }, function( orig, fix ) { + + // Attach a single capturing handler on the document while someone wants focusin/focusout + var handler = function( event ) { + jQuery.event.simulate( fix, event.target, jQuery.event.fix( event ) ); + }; + + jQuery.event.special[ fix ] = { + setup: function() { + + // Handle: regular nodes (via `this.ownerDocument`), window + // (via `this.document`) & document (via `this`). + var doc = this.ownerDocument || this.document || this, + attaches = dataPriv.access( doc, fix ); + + if ( !attaches ) { + doc.addEventListener( orig, handler, true ); + } + dataPriv.access( doc, fix, ( attaches || 0 ) + 1 ); + }, + teardown: function() { + var doc = this.ownerDocument || this.document || this, + attaches = dataPriv.access( doc, fix ) - 1; + + if ( !attaches ) { + doc.removeEventListener( orig, handler, true ); + dataPriv.remove( doc, fix ); + + } else { + dataPriv.access( doc, fix, attaches ); + } + } + }; + } ); +} +var location = window.location; + +var nonce = { guid: Date.now() }; + +var rquery = ( /\?/ ); + + + +// Cross-browser xml parsing +jQuery.parseXML = function( data ) { + var xml, parserErrorElem; + if ( !data || typeof data !== "string" ) { + return null; + } + + // Support: IE 9 - 11 only + // IE throws on parseFromString with invalid input. + try { + xml = ( new window.DOMParser() ).parseFromString( data, "text/xml" ); + } catch ( e ) {} + + parserErrorElem = xml && xml.getElementsByTagName( "parsererror" )[ 0 ]; + if ( !xml || parserErrorElem ) { + jQuery.error( "Invalid XML: " + ( + parserErrorElem ? 
+ jQuery.map( parserErrorElem.childNodes, function( el ) { + return el.textContent; + } ).join( "\n" ) : + data + ) ); + } + return xml; +}; + + +var + rbracket = /\[\]$/, + rCRLF = /\r?\n/g, + rsubmitterTypes = /^(?:submit|button|image|reset|file)$/i, + rsubmittable = /^(?:input|select|textarea|keygen)/i; + +function buildParams( prefix, obj, traditional, add ) { + var name; + + if ( Array.isArray( obj ) ) { + + // Serialize array item. + jQuery.each( obj, function( i, v ) { + if ( traditional || rbracket.test( prefix ) ) { + + // Treat each array item as a scalar. + add( prefix, v ); + + } else { + + // Item is non-scalar (array or object), encode its numeric index. + buildParams( + prefix + "[" + ( typeof v === "object" && v != null ? i : "" ) + "]", + v, + traditional, + add + ); + } + } ); + + } else if ( !traditional && toType( obj ) === "object" ) { + + // Serialize object item. + for ( name in obj ) { + buildParams( prefix + "[" + name + "]", obj[ name ], traditional, add ); + } + + } else { + + // Serialize scalar item. + add( prefix, obj ); + } +} + +// Serialize an array of form elements or a set of +// key/values into a query string +jQuery.param = function( a, traditional ) { + var prefix, + s = [], + add = function( key, valueOrFunction ) { + + // If value is a function, invoke it and use its return value + var value = isFunction( valueOrFunction ) ? + valueOrFunction() : + valueOrFunction; + + s[ s.length ] = encodeURIComponent( key ) + "=" + + encodeURIComponent( value == null ? "" : value ); + }; + + if ( a == null ) { + return ""; + } + + // If an array was passed in, assume that it is an array of form elements. + if ( Array.isArray( a ) || ( a.jquery && !jQuery.isPlainObject( a ) ) ) { + + // Serialize the form elements + jQuery.each( a, function() { + add( this.name, this.value ); + } ); + + } else { + + // If traditional, encode the "old" way (the way 1.3.2 or older + // did it), otherwise encode params recursively. + for ( prefix in a ) { + buildParams( prefix, a[ prefix ], traditional, add ); + } + } + + // Return the resulting serialization + return s.join( "&" ); +}; + +jQuery.fn.extend( { + serialize: function() { + return jQuery.param( this.serializeArray() ); + }, + serializeArray: function() { + return this.map( function() { + + // Can add propHook for "elements" to filter or add form elements + var elements = jQuery.prop( this, "elements" ); + return elements ? 
jQuery.makeArray( elements ) : this; + } ).filter( function() { + var type = this.type; + + // Use .is( ":disabled" ) so that fieldset[disabled] works + return this.name && !jQuery( this ).is( ":disabled" ) && + rsubmittable.test( this.nodeName ) && !rsubmitterTypes.test( type ) && + ( this.checked || !rcheckableType.test( type ) ); + } ).map( function( _i, elem ) { + var val = jQuery( this ).val(); + + if ( val == null ) { + return null; + } + + if ( Array.isArray( val ) ) { + return jQuery.map( val, function( val ) { + return { name: elem.name, value: val.replace( rCRLF, "\r\n" ) }; + } ); + } + + return { name: elem.name, value: val.replace( rCRLF, "\r\n" ) }; + } ).get(); + } +} ); + + +var + r20 = /%20/g, + rhash = /#.*$/, + rantiCache = /([?&])_=[^&]*/, + rheaders = /^(.*?):[ \t]*([^\r\n]*)$/mg, + + // #7653, #8125, #8152: local protocol detection + rlocalProtocol = /^(?:about|app|app-storage|.+-extension|file|res|widget):$/, + rnoContent = /^(?:GET|HEAD)$/, + rprotocol = /^\/\//, + + /* Prefilters + * 1) They are useful to introduce custom dataTypes (see ajax/jsonp.js for an example) + * 2) These are called: + * - BEFORE asking for a transport + * - AFTER param serialization (s.data is a string if s.processData is true) + * 3) key is the dataType + * 4) the catchall symbol "*" can be used + * 5) execution will start with transport dataType and THEN continue down to "*" if needed + */ + prefilters = {}, + + /* Transports bindings + * 1) key is the dataType + * 2) the catchall symbol "*" can be used + * 3) selection will start with transport dataType and THEN go to "*" if needed + */ + transports = {}, + + // Avoid comment-prolog char sequence (#10098); must appease lint and evade compression + allTypes = "*/".concat( "*" ), + + // Anchor tag for parsing the document origin + originAnchor = document.createElement( "a" ); + +originAnchor.href = location.href; + +// Base "constructor" for jQuery.ajaxPrefilter and jQuery.ajaxTransport +function addToPrefiltersOrTransports( structure ) { + + // dataTypeExpression is optional and defaults to "*" + return function( dataTypeExpression, func ) { + + if ( typeof dataTypeExpression !== "string" ) { + func = dataTypeExpression; + dataTypeExpression = "*"; + } + + var dataType, + i = 0, + dataTypes = dataTypeExpression.toLowerCase().match( rnothtmlwhite ) || []; + + if ( isFunction( func ) ) { + + // For each dataType in the dataTypeExpression + while ( ( dataType = dataTypes[ i++ ] ) ) { + + // Prepend if requested + if ( dataType[ 0 ] === "+" ) { + dataType = dataType.slice( 1 ) || "*"; + ( structure[ dataType ] = structure[ dataType ] || [] ).unshift( func ); + + // Otherwise append + } else { + ( structure[ dataType ] = structure[ dataType ] || [] ).push( func ); + } + } + } + }; +} + +// Base inspection function for prefilters and transports +function inspectPrefiltersOrTransports( structure, options, originalOptions, jqXHR ) { + + var inspected = {}, + seekingTransport = ( structure === transports ); + + function inspect( dataType ) { + var selected; + inspected[ dataType ] = true; + jQuery.each( structure[ dataType ] || [], function( _, prefilterOrFactory ) { + var dataTypeOrTransport = prefilterOrFactory( options, originalOptions, jqXHR ); + if ( typeof dataTypeOrTransport === "string" && + !seekingTransport && !inspected[ dataTypeOrTransport ] ) { + + options.dataTypes.unshift( dataTypeOrTransport ); + inspect( dataTypeOrTransport ); + return false; + } else if ( seekingTransport ) { + return !( selected = dataTypeOrTransport ); + } + } 
); + return selected; + } + + return inspect( options.dataTypes[ 0 ] ) || !inspected[ "*" ] && inspect( "*" ); +} + +// A special extend for ajax options +// that takes "flat" options (not to be deep extended) +// Fixes #9887 +function ajaxExtend( target, src ) { + var key, deep, + flatOptions = jQuery.ajaxSettings.flatOptions || {}; + + for ( key in src ) { + if ( src[ key ] !== undefined ) { + ( flatOptions[ key ] ? target : ( deep || ( deep = {} ) ) )[ key ] = src[ key ]; + } + } + if ( deep ) { + jQuery.extend( true, target, deep ); + } + + return target; +} + +/* Handles responses to an ajax request: + * - finds the right dataType (mediates between content-type and expected dataType) + * - returns the corresponding response + */ +function ajaxHandleResponses( s, jqXHR, responses ) { + + var ct, type, finalDataType, firstDataType, + contents = s.contents, + dataTypes = s.dataTypes; + + // Remove auto dataType and get content-type in the process + while ( dataTypes[ 0 ] === "*" ) { + dataTypes.shift(); + if ( ct === undefined ) { + ct = s.mimeType || jqXHR.getResponseHeader( "Content-Type" ); + } + } + + // Check if we're dealing with a known content-type + if ( ct ) { + for ( type in contents ) { + if ( contents[ type ] && contents[ type ].test( ct ) ) { + dataTypes.unshift( type ); + break; + } + } + } + + // Check to see if we have a response for the expected dataType + if ( dataTypes[ 0 ] in responses ) { + finalDataType = dataTypes[ 0 ]; + } else { + + // Try convertible dataTypes + for ( type in responses ) { + if ( !dataTypes[ 0 ] || s.converters[ type + " " + dataTypes[ 0 ] ] ) { + finalDataType = type; + break; + } + if ( !firstDataType ) { + firstDataType = type; + } + } + + // Or just use first one + finalDataType = finalDataType || firstDataType; + } + + // If we found a dataType + // We add the dataType to the list if needed + // and return the corresponding response + if ( finalDataType ) { + if ( finalDataType !== dataTypes[ 0 ] ) { + dataTypes.unshift( finalDataType ); + } + return responses[ finalDataType ]; + } +} + +/* Chain conversions given the request and the original response + * Also sets the responseXXX fields on the jqXHR instance + */ +function ajaxConvert( s, response, jqXHR, isSuccess ) { + var conv2, current, conv, tmp, prev, + converters = {}, + + // Work with a copy of dataTypes in case we need to modify it for conversion + dataTypes = s.dataTypes.slice(); + + // Create converters map with lowercased keys + if ( dataTypes[ 1 ] ) { + for ( conv in s.converters ) { + converters[ conv.toLowerCase() ] = s.converters[ conv ]; + } + } + + current = dataTypes.shift(); + + // Convert to each sequential dataType + while ( current ) { + + if ( s.responseFields[ current ] ) { + jqXHR[ s.responseFields[ current ] ] = response; + } + + // Apply the dataFilter if provided + if ( !prev && isSuccess && s.dataFilter ) { + response = s.dataFilter( response, s.dataType ); + } + + prev = current; + current = dataTypes.shift(); + + if ( current ) { + + // There's only work to do if current dataType is non-auto + if ( current === "*" ) { + + current = prev; + + // Convert response if prev dataType is non-auto and differs from current + } else if ( prev !== "*" && prev !== current ) { + + // Seek a direct converter + conv = converters[ prev + " " + current ] || converters[ "* " + current ]; + + // If none found, seek a pair + if ( !conv ) { + for ( conv2 in converters ) { + + // If conv2 outputs current + tmp = conv2.split( " " ); + if ( tmp[ 1 ] === current ) { + + // If prev 
can be converted to accepted input + conv = converters[ prev + " " + tmp[ 0 ] ] || + converters[ "* " + tmp[ 0 ] ]; + if ( conv ) { + + // Condense equivalence converters + if ( conv === true ) { + conv = converters[ conv2 ]; + + // Otherwise, insert the intermediate dataType + } else if ( converters[ conv2 ] !== true ) { + current = tmp[ 0 ]; + dataTypes.unshift( tmp[ 1 ] ); + } + break; + } + } + } + } + + // Apply converter (if not an equivalence) + if ( conv !== true ) { + + // Unless errors are allowed to bubble, catch and return them + if ( conv && s.throws ) { + response = conv( response ); + } else { + try { + response = conv( response ); + } catch ( e ) { + return { + state: "parsererror", + error: conv ? e : "No conversion from " + prev + " to " + current + }; + } + } + } + } + } + } + + return { state: "success", data: response }; +} + +jQuery.extend( { + + // Counter for holding the number of active queries + active: 0, + + // Last-Modified header cache for next request + lastModified: {}, + etag: {}, + + ajaxSettings: { + url: location.href, + type: "GET", + isLocal: rlocalProtocol.test( location.protocol ), + global: true, + processData: true, + async: true, + contentType: "application/x-www-form-urlencoded; charset=UTF-8", + + /* + timeout: 0, + data: null, + dataType: null, + username: null, + password: null, + cache: null, + throws: false, + traditional: false, + headers: {}, + */ + + accepts: { + "*": allTypes, + text: "text/plain", + html: "text/html", + xml: "application/xml, text/xml", + json: "application/json, text/javascript" + }, + + contents: { + xml: /\bxml\b/, + html: /\bhtml/, + json: /\bjson\b/ + }, + + responseFields: { + xml: "responseXML", + text: "responseText", + json: "responseJSON" + }, + + // Data converters + // Keys separate source (or catchall "*") and destination types with a single space + converters: { + + // Convert anything to text + "* text": String, + + // Text to html (true = no transformation) + "text html": true, + + // Evaluate text as a json expression + "text json": JSON.parse, + + // Parse text as xml + "text xml": jQuery.parseXML + }, + + // For options that shouldn't be deep extended: + // you can add your own custom options here if + // and when you create one that shouldn't be + // deep extended (see ajaxExtend) + flatOptions: { + url: true, + context: true + } + }, + + // Creates a full fledged settings object into target + // with both ajaxSettings and settings fields. + // If target is omitted, writes into ajaxSettings. + ajaxSetup: function( target, settings ) { + return settings ? 
+ + // Building a settings object + ajaxExtend( ajaxExtend( target, jQuery.ajaxSettings ), settings ) : + + // Extending ajaxSettings + ajaxExtend( jQuery.ajaxSettings, target ); + }, + + ajaxPrefilter: addToPrefiltersOrTransports( prefilters ), + ajaxTransport: addToPrefiltersOrTransports( transports ), + + // Main method + ajax: function( url, options ) { + + // If url is an object, simulate pre-1.5 signature + if ( typeof url === "object" ) { + options = url; + url = undefined; + } + + // Force options to be an object + options = options || {}; + + var transport, + + // URL without anti-cache param + cacheURL, + + // Response headers + responseHeadersString, + responseHeaders, + + // timeout handle + timeoutTimer, + + // Url cleanup var + urlAnchor, + + // Request state (becomes false upon send and true upon completion) + completed, + + // To know if global events are to be dispatched + fireGlobals, + + // Loop variable + i, + + // uncached part of the url + uncached, + + // Create the final options object + s = jQuery.ajaxSetup( {}, options ), + + // Callbacks context + callbackContext = s.context || s, + + // Context for global events is callbackContext if it is a DOM node or jQuery collection + globalEventContext = s.context && + ( callbackContext.nodeType || callbackContext.jquery ) ? + jQuery( callbackContext ) : + jQuery.event, + + // Deferreds + deferred = jQuery.Deferred(), + completeDeferred = jQuery.Callbacks( "once memory" ), + + // Status-dependent callbacks + statusCode = s.statusCode || {}, + + // Headers (they are sent all at once) + requestHeaders = {}, + requestHeadersNames = {}, + + // Default abort message + strAbort = "canceled", + + // Fake xhr + jqXHR = { + readyState: 0, + + // Builds headers hashtable if needed + getResponseHeader: function( key ) { + var match; + if ( completed ) { + if ( !responseHeaders ) { + responseHeaders = {}; + while ( ( match = rheaders.exec( responseHeadersString ) ) ) { + responseHeaders[ match[ 1 ].toLowerCase() + " " ] = + ( responseHeaders[ match[ 1 ].toLowerCase() + " " ] || [] ) + .concat( match[ 2 ] ); + } + } + match = responseHeaders[ key.toLowerCase() + " " ]; + } + return match == null ? null : match.join( ", " ); + }, + + // Raw string + getAllResponseHeaders: function() { + return completed ? 
responseHeadersString : null; + }, + + // Caches the header + setRequestHeader: function( name, value ) { + if ( completed == null ) { + name = requestHeadersNames[ name.toLowerCase() ] = + requestHeadersNames[ name.toLowerCase() ] || name; + requestHeaders[ name ] = value; + } + return this; + }, + + // Overrides response content-type header + overrideMimeType: function( type ) { + if ( completed == null ) { + s.mimeType = type; + } + return this; + }, + + // Status-dependent callbacks + statusCode: function( map ) { + var code; + if ( map ) { + if ( completed ) { + + // Execute the appropriate callbacks + jqXHR.always( map[ jqXHR.status ] ); + } else { + + // Lazy-add the new callbacks in a way that preserves old ones + for ( code in map ) { + statusCode[ code ] = [ statusCode[ code ], map[ code ] ]; + } + } + } + return this; + }, + + // Cancel the request + abort: function( statusText ) { + var finalText = statusText || strAbort; + if ( transport ) { + transport.abort( finalText ); + } + done( 0, finalText ); + return this; + } + }; + + // Attach deferreds + deferred.promise( jqXHR ); + + // Add protocol if not provided (prefilters might expect it) + // Handle falsy url in the settings object (#10093: consistency with old signature) + // We also use the url parameter if available + s.url = ( ( url || s.url || location.href ) + "" ) + .replace( rprotocol, location.protocol + "//" ); + + // Alias method option to type as per ticket #12004 + s.type = options.method || options.type || s.method || s.type; + + // Extract dataTypes list + s.dataTypes = ( s.dataType || "*" ).toLowerCase().match( rnothtmlwhite ) || [ "" ]; + + // A cross-domain request is in order when the origin doesn't match the current origin. + if ( s.crossDomain == null ) { + urlAnchor = document.createElement( "a" ); + + // Support: IE <=8 - 11, Edge 12 - 15 + // IE throws exception on accessing the href property if url is malformed, + // e.g. 
http://example.com:80x/ + try { + urlAnchor.href = s.url; + + // Support: IE <=8 - 11 only + // Anchor's host property isn't correctly set when s.url is relative + urlAnchor.href = urlAnchor.href; + s.crossDomain = originAnchor.protocol + "//" + originAnchor.host !== + urlAnchor.protocol + "//" + urlAnchor.host; + } catch ( e ) { + + // If there is an error parsing the URL, assume it is crossDomain, + // it can be rejected by the transport if it is invalid + s.crossDomain = true; + } + } + + // Convert data if not already a string + if ( s.data && s.processData && typeof s.data !== "string" ) { + s.data = jQuery.param( s.data, s.traditional ); + } + + // Apply prefilters + inspectPrefiltersOrTransports( prefilters, s, options, jqXHR ); + + // If request was aborted inside a prefilter, stop there + if ( completed ) { + return jqXHR; + } + + // We can fire global events as of now if asked to + // Don't fire events if jQuery.event is undefined in an AMD-usage scenario (#15118) + fireGlobals = jQuery.event && s.global; + + // Watch for a new set of requests + if ( fireGlobals && jQuery.active++ === 0 ) { + jQuery.event.trigger( "ajaxStart" ); + } + + // Uppercase the type + s.type = s.type.toUpperCase(); + + // Determine if request has content + s.hasContent = !rnoContent.test( s.type ); + + // Save the URL in case we're toying with the If-Modified-Since + // and/or If-None-Match header later on + // Remove hash to simplify url manipulation + cacheURL = s.url.replace( rhash, "" ); + + // More options handling for requests with no content + if ( !s.hasContent ) { + + // Remember the hash so we can put it back + uncached = s.url.slice( cacheURL.length ); + + // If data is available and should be processed, append data to url + if ( s.data && ( s.processData || typeof s.data === "string" ) ) { + cacheURL += ( rquery.test( cacheURL ) ? "&" : "?" ) + s.data; + + // #9682: remove data so that it's not used in an eventual retry + delete s.data; + } + + // Add or update anti-cache param if needed + if ( s.cache === false ) { + cacheURL = cacheURL.replace( rantiCache, "$1" ); + uncached = ( rquery.test( cacheURL ) ? "&" : "?" ) + "_=" + ( nonce.guid++ ) + + uncached; + } + + // Put hash and anti-cache on the URL that will be requested (gh-1732) + s.url = cacheURL + uncached; + + // Change '%20' to '+' if this is encoded form body content (gh-2658) + } else if ( s.data && s.processData && + ( s.contentType || "" ).indexOf( "application/x-www-form-urlencoded" ) === 0 ) { + s.data = s.data.replace( r20, "+" ); + } + + // Set the If-Modified-Since and/or If-None-Match header, if in ifModified mode. + if ( s.ifModified ) { + if ( jQuery.lastModified[ cacheURL ] ) { + jqXHR.setRequestHeader( "If-Modified-Since", jQuery.lastModified[ cacheURL ] ); + } + if ( jQuery.etag[ cacheURL ] ) { + jqXHR.setRequestHeader( "If-None-Match", jQuery.etag[ cacheURL ] ); + } + } + + // Set the correct header, if data is being sent + if ( s.data && s.hasContent && s.contentType !== false || options.contentType ) { + jqXHR.setRequestHeader( "Content-Type", s.contentType ); + } + + // Set the Accepts header for the server, depending on the dataType + jqXHR.setRequestHeader( + "Accept", + s.dataTypes[ 0 ] && s.accepts[ s.dataTypes[ 0 ] ] ? + s.accepts[ s.dataTypes[ 0 ] ] + + ( s.dataTypes[ 0 ] !== "*" ? 
", " + allTypes + "; q=0.01" : "" ) : + s.accepts[ "*" ] + ); + + // Check for headers option + for ( i in s.headers ) { + jqXHR.setRequestHeader( i, s.headers[ i ] ); + } + + // Allow custom headers/mimetypes and early abort + if ( s.beforeSend && + ( s.beforeSend.call( callbackContext, jqXHR, s ) === false || completed ) ) { + + // Abort if not done already and return + return jqXHR.abort(); + } + + // Aborting is no longer a cancellation + strAbort = "abort"; + + // Install callbacks on deferreds + completeDeferred.add( s.complete ); + jqXHR.done( s.success ); + jqXHR.fail( s.error ); + + // Get transport + transport = inspectPrefiltersOrTransports( transports, s, options, jqXHR ); + + // If no transport, we auto-abort + if ( !transport ) { + done( -1, "No Transport" ); + } else { + jqXHR.readyState = 1; + + // Send global event + if ( fireGlobals ) { + globalEventContext.trigger( "ajaxSend", [ jqXHR, s ] ); + } + + // If request was aborted inside ajaxSend, stop there + if ( completed ) { + return jqXHR; + } + + // Timeout + if ( s.async && s.timeout > 0 ) { + timeoutTimer = window.setTimeout( function() { + jqXHR.abort( "timeout" ); + }, s.timeout ); + } + + try { + completed = false; + transport.send( requestHeaders, done ); + } catch ( e ) { + + // Rethrow post-completion exceptions + if ( completed ) { + throw e; + } + + // Propagate others as results + done( -1, e ); + } + } + + // Callback for when everything is done + function done( status, nativeStatusText, responses, headers ) { + var isSuccess, success, error, response, modified, + statusText = nativeStatusText; + + // Ignore repeat invocations + if ( completed ) { + return; + } + + completed = true; + + // Clear timeout if it exists + if ( timeoutTimer ) { + window.clearTimeout( timeoutTimer ); + } + + // Dereference transport for early garbage collection + // (no matter how long the jqXHR object will be used) + transport = undefined; + + // Cache response headers + responseHeadersString = headers || ""; + + // Set readyState + jqXHR.readyState = status > 0 ? 4 : 0; + + // Determine if successful + isSuccess = status >= 200 && status < 300 || status === 304; + + // Get response data + if ( responses ) { + response = ajaxHandleResponses( s, jqXHR, responses ); + } + + // Use a noop converter for missing script but not if jsonp + if ( !isSuccess && + jQuery.inArray( "script", s.dataTypes ) > -1 && + jQuery.inArray( "json", s.dataTypes ) < 0 ) { + s.converters[ "text script" ] = function() {}; + } + + // Convert no matter what (that way responseXXX fields are always set) + response = ajaxConvert( s, response, jqXHR, isSuccess ); + + // If successful, handle type chaining + if ( isSuccess ) { + + // Set the If-Modified-Since and/or If-None-Match header, if in ifModified mode. 
+ if ( s.ifModified ) { + modified = jqXHR.getResponseHeader( "Last-Modified" ); + if ( modified ) { + jQuery.lastModified[ cacheURL ] = modified; + } + modified = jqXHR.getResponseHeader( "etag" ); + if ( modified ) { + jQuery.etag[ cacheURL ] = modified; + } + } + + // if no content + if ( status === 204 || s.type === "HEAD" ) { + statusText = "nocontent"; + + // if not modified + } else if ( status === 304 ) { + statusText = "notmodified"; + + // If we have data, let's convert it + } else { + statusText = response.state; + success = response.data; + error = response.error; + isSuccess = !error; + } + } else { + + // Extract error from statusText and normalize for non-aborts + error = statusText; + if ( status || !statusText ) { + statusText = "error"; + if ( status < 0 ) { + status = 0; + } + } + } + + // Set data for the fake xhr object + jqXHR.status = status; + jqXHR.statusText = ( nativeStatusText || statusText ) + ""; + + // Success/Error + if ( isSuccess ) { + deferred.resolveWith( callbackContext, [ success, statusText, jqXHR ] ); + } else { + deferred.rejectWith( callbackContext, [ jqXHR, statusText, error ] ); + } + + // Status-dependent callbacks + jqXHR.statusCode( statusCode ); + statusCode = undefined; + + if ( fireGlobals ) { + globalEventContext.trigger( isSuccess ? "ajaxSuccess" : "ajaxError", + [ jqXHR, s, isSuccess ? success : error ] ); + } + + // Complete + completeDeferred.fireWith( callbackContext, [ jqXHR, statusText ] ); + + if ( fireGlobals ) { + globalEventContext.trigger( "ajaxComplete", [ jqXHR, s ] ); + + // Handle the global AJAX counter + if ( !( --jQuery.active ) ) { + jQuery.event.trigger( "ajaxStop" ); + } + } + } + + return jqXHR; + }, + + getJSON: function( url, data, callback ) { + return jQuery.get( url, data, callback, "json" ); + }, + + getScript: function( url, callback ) { + return jQuery.get( url, undefined, callback, "script" ); + } +} ); + +jQuery.each( [ "get", "post" ], function( _i, method ) { + jQuery[ method ] = function( url, data, callback, type ) { + + // Shift arguments if data argument was omitted + if ( isFunction( data ) ) { + type = type || callback; + callback = data; + data = undefined; + } + + // The url can be an options object (which then must have .url) + return jQuery.ajax( jQuery.extend( { + url: url, + type: method, + dataType: type, + data: data, + success: callback + }, jQuery.isPlainObject( url ) && url ) ); + }; +} ); + +jQuery.ajaxPrefilter( function( s ) { + var i; + for ( i in s.headers ) { + if ( i.toLowerCase() === "content-type" ) { + s.contentType = s.headers[ i ] || ""; + } + } +} ); + + +jQuery._evalUrl = function( url, options, doc ) { + return jQuery.ajax( { + url: url, + + // Make this explicit, since user can override this through ajaxSetup (#11264) + type: "GET", + dataType: "script", + cache: true, + async: false, + global: false, + + // Only evaluate the response if it is successful (gh-4126) + // dataFilter is not invoked for failure responses, so using it instead + // of the default converter is kludgy but it works. 
+ converters: { + "text script": function() {} + }, + dataFilter: function( response ) { + jQuery.globalEval( response, options, doc ); + } + } ); +}; + + +jQuery.fn.extend( { + wrapAll: function( html ) { + var wrap; + + if ( this[ 0 ] ) { + if ( isFunction( html ) ) { + html = html.call( this[ 0 ] ); + } + + // The elements to wrap the target around + wrap = jQuery( html, this[ 0 ].ownerDocument ).eq( 0 ).clone( true ); + + if ( this[ 0 ].parentNode ) { + wrap.insertBefore( this[ 0 ] ); + } + + wrap.map( function() { + var elem = this; + + while ( elem.firstElementChild ) { + elem = elem.firstElementChild; + } + + return elem; + } ).append( this ); + } + + return this; + }, + + wrapInner: function( html ) { + if ( isFunction( html ) ) { + return this.each( function( i ) { + jQuery( this ).wrapInner( html.call( this, i ) ); + } ); + } + + return this.each( function() { + var self = jQuery( this ), + contents = self.contents(); + + if ( contents.length ) { + contents.wrapAll( html ); + + } else { + self.append( html ); + } + } ); + }, + + wrap: function( html ) { + var htmlIsFunction = isFunction( html ); + + return this.each( function( i ) { + jQuery( this ).wrapAll( htmlIsFunction ? html.call( this, i ) : html ); + } ); + }, + + unwrap: function( selector ) { + this.parent( selector ).not( "body" ).each( function() { + jQuery( this ).replaceWith( this.childNodes ); + } ); + return this; + } +} ); + + +jQuery.expr.pseudos.hidden = function( elem ) { + return !jQuery.expr.pseudos.visible( elem ); +}; +jQuery.expr.pseudos.visible = function( elem ) { + return !!( elem.offsetWidth || elem.offsetHeight || elem.getClientRects().length ); +}; + + + + +jQuery.ajaxSettings.xhr = function() { + try { + return new window.XMLHttpRequest(); + } catch ( e ) {} +}; + +var xhrSuccessStatus = { + + // File protocol always yields status code 0, assume 200 + 0: 200, + + // Support: IE <=9 only + // #1450: sometimes IE returns 1223 when it should be 204 + 1223: 204 + }, + xhrSupported = jQuery.ajaxSettings.xhr(); + +support.cors = !!xhrSupported && ( "withCredentials" in xhrSupported ); +support.ajax = xhrSupported = !!xhrSupported; + +jQuery.ajaxTransport( function( options ) { + var callback, errorCallback; + + // Cross domain only allowed if supported through XMLHttpRequest + if ( support.cors || xhrSupported && !options.crossDomain ) { + return { + send: function( headers, complete ) { + var i, + xhr = options.xhr(); + + xhr.open( + options.type, + options.url, + options.async, + options.username, + options.password + ); + + // Apply custom fields if provided + if ( options.xhrFields ) { + for ( i in options.xhrFields ) { + xhr[ i ] = options.xhrFields[ i ]; + } + } + + // Override mime type if needed + if ( options.mimeType && xhr.overrideMimeType ) { + xhr.overrideMimeType( options.mimeType ); + } + + // X-Requested-With header + // For cross-domain requests, seeing as conditions for a preflight are + // akin to a jigsaw puzzle, we simply never set it to be sure. + // (it can always be set on a per-request basis or even using ajaxSetup) + // For same-domain requests, won't change header if already provided. 
+ if ( !options.crossDomain && !headers[ "X-Requested-With" ] ) { + headers[ "X-Requested-With" ] = "XMLHttpRequest"; + } + + // Set headers + for ( i in headers ) { + xhr.setRequestHeader( i, headers[ i ] ); + } + + // Callback + callback = function( type ) { + return function() { + if ( callback ) { + callback = errorCallback = xhr.onload = + xhr.onerror = xhr.onabort = xhr.ontimeout = + xhr.onreadystatechange = null; + + if ( type === "abort" ) { + xhr.abort(); + } else if ( type === "error" ) { + + // Support: IE <=9 only + // On a manual native abort, IE9 throws + // errors on any property access that is not readyState + if ( typeof xhr.status !== "number" ) { + complete( 0, "error" ); + } else { + complete( + + // File: protocol always yields status 0; see #8605, #14207 + xhr.status, + xhr.statusText + ); + } + } else { + complete( + xhrSuccessStatus[ xhr.status ] || xhr.status, + xhr.statusText, + + // Support: IE <=9 only + // IE9 has no XHR2 but throws on binary (trac-11426) + // For XHR2 non-text, let the caller handle it (gh-2498) + ( xhr.responseType || "text" ) !== "text" || + typeof xhr.responseText !== "string" ? + { binary: xhr.response } : + { text: xhr.responseText }, + xhr.getAllResponseHeaders() + ); + } + } + }; + }; + + // Listen to events + xhr.onload = callback(); + errorCallback = xhr.onerror = xhr.ontimeout = callback( "error" ); + + // Support: IE 9 only + // Use onreadystatechange to replace onabort + // to handle uncaught aborts + if ( xhr.onabort !== undefined ) { + xhr.onabort = errorCallback; + } else { + xhr.onreadystatechange = function() { + + // Check readyState before timeout as it changes + if ( xhr.readyState === 4 ) { + + // Allow onerror to be called first, + // but that will not handle a native abort + // Also, save errorCallback to a variable + // as xhr.onerror cannot be accessed + window.setTimeout( function() { + if ( callback ) { + errorCallback(); + } + } ); + } + }; + } + + // Create the abort callback + callback = callback( "abort" ); + + try { + + // Do send the request (this may raise an exception) + xhr.send( options.hasContent && options.data || null ); + } catch ( e ) { + + // #14683: Only rethrow if this hasn't been notified as an error yet + if ( callback ) { + throw e; + } + } + }, + + abort: function() { + if ( callback ) { + callback(); + } + } + }; + } +} ); + + + + +// Prevent auto-execution of scripts when no explicit dataType was provided (See gh-2432) +jQuery.ajaxPrefilter( function( s ) { + if ( s.crossDomain ) { + s.contents.script = false; + } +} ); + +// Install script dataType +jQuery.ajaxSetup( { + accepts: { + script: "text/javascript, application/javascript, " + + "application/ecmascript, application/x-ecmascript" + }, + contents: { + script: /\b(?:java|ecma)script\b/ + }, + converters: { + "text script": function( text ) { + jQuery.globalEval( text ); + return text; + } + } +} ); + +// Handle cache's special case and crossDomain +jQuery.ajaxPrefilter( "script", function( s ) { + if ( s.cache === undefined ) { + s.cache = false; + } + if ( s.crossDomain ) { + s.type = "GET"; + } +} ); + +// Bind script tag hack transport +jQuery.ajaxTransport( "script", function( s ) { + + // This transport only deals with cross domain or forced-by-attrs requests + if ( s.crossDomain || s.scriptAttrs ) { + var script, callback; + return { + send: function( _, complete ) { + script = jQuery( " + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Lasso Classes and Functions

+
+

Base Classes

+ ++++ + + + + + + + + + + + + + + + + + +

CubeTransit

Class for storing information about transit defined in Cube line files.

StandardTransit

Holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files.

ModelRoadwayNetwork

Subclass of the network_wrangler class RoadwayNetwork (see the sketch after this table).

Project

A single or set of changes to the roadway or transit system.

Parameters

A class representing all the parameters defining the networks, including time of day, categories, etc.

+
+
+
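The base classes above are typically used together: a Parameters instance carries the agency-specific settings, and ModelRoadwayNetwork wraps a standard roadway network so that model-specific variables can be calculated and exported. The sketch below is a minimal, hypothetical illustration of that flow: the file paths are placeholders, and the load_transform_network and create_managed_lane_network calls (and their keyword names) are inferred from the ModelRoadwayNetwork method summaries, so the exact signatures may differ in the installed version.

    from lasso import ModelRoadwayNetwork, Parameters

    # Agency settings; assumed here to be constructible with defaults.
    parameters = Parameters()

    # Placeholder file names -- substitute the agency's own standard network files.
    # load_transform_network is assumed to act as a constructor-style reader.
    model_net = ModelRoadwayNetwork.load_transform_network(
        node_filename="standard_node.geojson",
        link_filename="standard_link.json",
        shape_filename="standard_shape.geojson",
    )

    # Build the managed-lane representation on the same object (in place).
    model_net.create_managed_lane_network(in_place=True)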

Utils and Functions

+ ++++ + + + + + + + + +

util

logger

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/genindex/index.html b/branch/bart/genindex/index.html new file mode 100644 index 0000000..b643af3 --- /dev/null +++ b/branch/bart/genindex/index.html @@ -0,0 +1,1013 @@ + + + + + + Index — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Index

+ +
+ _ + | A + | B + | C + | D + | E + | F + | G + | H + | I + | K + | L + | M + | N + | O + | P + | R + | S + | T + | U + | V + | W + | X + | Y + | Z + +
+

_

+ + +
+ +

A

+ + + +
+ +

B

+ + + +
+ +

C

+ + + +
+ +

D

+ + + +
+ +

E

+ + + +
+ +

F

+ + + +
+ +

G

+ + + +
+ +

H

+ + + +
+ +

I

+ + + +
+ +

K

+ + +
+ +

L

+ + + +
    +
  • + lasso + +
  • +
  • + lasso.logger + +
  • +
  • + lasso.util + +
  • +
+ +

M

+ + + +
+ +

N

+ + + +
+ +

O

+ + + +
+ +

P

+ + + +
+ +

R

+ + + +
+ +

S

+ + + +
+ +

T

+ + + +
+ +

U

+ + + +
+ +

V

+ + + +
+ +

W

+ + + +
+ +

X

+ + + +
+ +

Y

+ + +
+ +

Z

+ + + +
+ + + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/index.html b/branch/bart/index.html new file mode 100644 index 0000000..ce413f3 --- /dev/null +++ b/branch/bart/index.html @@ -0,0 +1,176 @@ + + + + + + + Welcome to lasso’s documentation! — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Welcome to lasso’s documentation!

+

This package of utilities is a wrapper around the +network_wrangler package +for MetCouncil and MTC. It aims to have the following functionality:

+
    +
  1. parse Cube log files and base highway networks and create ProjectCards +for Network Wrangler

  2. +
  3. parse two Cube transit line files and create ProjectCards for Network Wrangler (see the sketch after this list)

  4. +
  5. refine Network Wrangler highway networks to contain specific variables and +settings for the respective agency and export them to a format that can +be read in by Citilabs’ Cube software.

  6. +
+ +
+
+
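As a hedged sketch of the Cube transit line comparison item above, the Project class from the class reference can compare two Cube transit line files and write the differences out as a project card. The create_project and write_project_card calls and their keyword argument names below are assumptions based on typical lasso usage rather than a confirmed signature, and the file paths are placeholders.

    from lasso import Project

    # Placeholder paths; the keyword names are illustrative assumptions.
    project = Project.create_project(
        base_cube_transit_source="base/transit.lin",
        build_cube_transit_source="build/transit.lin",
    )

    # Write the detected differences as a Network Wrangler project card.
    project.write_project_card("transit_changes.yml")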

Indices and tables

+ +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/objects.inv b/branch/bart/objects.inv new file mode 100644 index 0000000..a77a24d Binary files /dev/null and b/branch/bart/objects.inv differ diff --git a/branch/bart/py-modindex/index.html b/branch/bart/py-modindex/index.html new file mode 100644 index 0000000..c4f724c --- /dev/null +++ b/branch/bart/py-modindex/index.html @@ -0,0 +1,131 @@ + + + + + + Python Module Index — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Python Module Index

+ +
+ l +
+ + + + + + + + + + + + + +
 
+ l
+ lasso +
    + lasso.logger +
    + lasso.util +
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/running/index.html b/branch/bart/running/index.html new file mode 100644 index 0000000..17b055b --- /dev/null +++ b/branch/bart/running/index.html @@ -0,0 +1,128 @@ + + + + + + + Running Lasso — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Running Lasso

+
+

Create project files

+
+
+

Create a scenario

+
+
+

Exporting networks

+
+
+

Auditing and Reporting

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/search/index.html b/branch/bart/search/index.html new file mode 100644 index 0000000..f8180f5 --- /dev/null +++ b/branch/bart/search/index.html @@ -0,0 +1,121 @@ + + + + + + Search — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + + + +
+ +
+ +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + + + + + + \ No newline at end of file diff --git a/branch/bart/searchindex.js b/branch/bart/searchindex.js new file mode 100644 index 0000000..1bfcfd6 --- /dev/null +++ b/branch/bart/searchindex.js @@ -0,0 +1 @@ +Search.setIndex({"docnames": ["_generated/lasso.CubeTransit", "_generated/lasso.ModelRoadwayNetwork", "_generated/lasso.Parameters", "_generated/lasso.Project", "_generated/lasso.StandardTransit", "_generated/lasso.logger", "_generated/lasso.util", "autodoc", "index", "running", "setup", "starting"], "filenames": ["_generated/lasso.CubeTransit.rst", "_generated/lasso.ModelRoadwayNetwork.rst", "_generated/lasso.Parameters.rst", "_generated/lasso.Project.rst", "_generated/lasso.StandardTransit.rst", "_generated/lasso.logger.rst", "_generated/lasso.util.rst", "autodoc.rst", "index.rst", "running.md", "setup.md", "starting.md"], "titles": ["lasso.CubeTransit", "lasso.ModelRoadwayNetwork", "lasso.Parameters", "lasso.Project", "lasso.StandardTransit", "lasso.logger", "lasso.util", "Lasso Classes and Functions", "Welcome to lasso\u2019s documentation!", "Running Lasso", "Setup", "Starting Out"], "terms": {"class": [0, 1, 2, 3, 4, 6, 8, 11], "paramet": [0, 1, 3, 4, 6, 8], "sourc": [0, 1, 2, 3, 4, 5, 6, 11], "base": [0, 1, 2, 3, 4, 6, 8, 11], "object": [0, 1, 2, 3, 4, 6, 11], "store": [0, 1, 11], "inform": [0, 1, 4, 11], "about": [0, 1, 4, 11], "transit": [0, 1, 2, 3, 4, 8], "defin": [0, 1, 2, 3, 11], "cube": [0, 1, 2, 3, 4, 8], "line": [0, 1, 2, 3, 4, 6, 8, 11], "file": [0, 1, 2, 3, 4, 8], "ha": [0, 6, 11], "capabl": [0, 11], "pars": [0, 8, 11], "properti": [0, 1, 2, 4, 6, 11], "shape": [0, 1, 3, 4, 6, 11], "python": [0, 1, 2, 11], "dictionari": [0, 1, 2, 3, 4, 6, 11], "compar": [0, 3, 4, 11], "repres": [0, 2, 4, 6, 11], "chang": [0, 1, 3, 4, 11], "project": [0, 1, 2, 4, 6, 8], "card": [0, 1, 3, 4], "typic": [0, 3, 4, 6, 8], "usag": [0, 1, 3, 4], "exampl": [0, 1, 3, 4, 6, 11], "tn": [0, 11], "create_from_cub": [0, 11], "cube_dir": [0, 3, 11], "transit_change_list": [0, 11], "evaluate_differ": [0, 4, 11], "base_transit_network": [0, 3, 11], "list": [0, 1, 2, 3, 4, 6, 11], "string": [0, 1, 3, 4, 6], "uniqu": [0, 1, 3], "name": [0, 1, 2, 3, 4, 6, 11], "network": [0, 1, 2, 3, 4, 5, 8], "type": [0, 1, 2, 3, 4, 6, 11], "line_properti": 0, "kei": [0, 1, 3, 11], "valu": [0, 1, 3, 4, 6, 11], "ar": [0, 1, 2, 3, 6, 11], "These": 0, "directli": 0, "read": [0, 1, 3, 4, 8, 11], "from": [0, 1, 2, 3, 4, 6, 8], "haven": 0, "t": [0, 1, 6], "been": [0, 6], "translat": [0, 4, 11], "standard": [0, 1, 2, 4, 11], "dict": [0, 1, 2, 3, 4], "panda": [0, 1, 3], "datafram": [0, 1, 3, 4, 11], "node": [0, 1, 2, 3, 4, 6, 11], "follow": [0, 4, 8, 11], "column": [0, 1, 4], "node_id": 0, "int": [0, 1, 2, 3, 4, 6], "posit": [0, 6], "integ": [0, 1, 2], "id": [0, 1, 3, 4, 11], "number": [0, 1, 2, 3, 4, 6, 11], "neg": [0, 6], "indic": [0, 1, 6], "non": [0, 6], "stop": [0, 4], "boolean": [0, 1], "i": [0, 1, 3, 4, 5, 6, 8, 11], "order": [0, 1, 2, 6, 11], "within": [0, 1, 2, 6], "thi": [0, 1, 2, 3, 4, 6, 8, 11], "program_typ": 0, "either": [0, 1, 5, 11], "pt": 0, "trnbld": 0, "str": [0, 1, 2, 3, 6], "instanc": [0, 1, 2, 3, 4, 11], "appli": [0, 1, 6], "which": [0, 1, 3, 6, 11], "includ": [0, 1, 2, 11], "time": [0, 1, 2, 4, 6, 11], "period": [0, 1, 2, 4, 11], "variabl": [0, 1, 2, 3, 4, 8, 11], "source_list": 0, "have": [0, 1, 6, 8], "ad": [0, 1, 3, 11], "diff_dict": 0, "__init__": [0, 1, 2, 3, 4], "constructor": [0, 1, 3], "set": [0, 1, 2, 3, 4, 5, 6, 8, 11], "see": [0, 1, 3, 4], "an": [0, 1, 3, 4, 6, 11], "method": [0, 1, 
2, 3, 4, 6, 11], "add_additional_time_period": 0, "new_time_period_numb": 0, "orig_line_nam": 0, "copi": [0, 1, 3, 4, 11], "rout": [0, 1, 4], "anoth": 0, "appropri": [0, 4], "specif": [0, 1, 8, 11], "new": [0, 1, 6, 11], "under": 0, "self": [0, 1, 2, 3, 4, 6], "origin": [0, 1, 6, 11], "its": [0, 1], "return": [0, 1, 3, 4, 6], "add_cub": 0, "transit_sourc": 0, "lin": [0, 3, 4], "add": [0, 1, 3, 11], "exist": [0, 1, 3, 6, 11], "transitnetwork": [0, 4], "directori": [0, 1, 4, 11], "static": [0, 1, 3, 4], "build_route_nam": 0, "route_id": [0, 4], "time_period": [0, 1, 2], "agency_id": 0, "0": [0, 1, 2, 4, 6, 11], "direction_id": 0, "1": [0, 1, 2, 4, 6, 11], "creat": [0, 1, 2, 3, 4, 6, 8, 11], "contaten": 0, "agenc": [0, 8], "direct": [0, 1, 6, 11], "e": [0, 1, 11], "452": 0, "111": 0, "pk": [0, 2], "construct": [0, 6, 11], "line_nam": 0, "0_452": 0, "111_452_pk1": 0, "calculate_start_end_tim": 0, "line_properties_dict": 0, "calcul": [0, 1, 2, 3, 4, 6], "start": [0, 1, 4, 8], "end": [0, 1, 6], "warn": [0, 1], "doesn": [0, 1], "take": [0, 1], "care": 0, "discongru": 0, "flavor": [0, 1], "create_add_route_card_dict": 0, "format": [0, 1, 2, 4, 8, 11], "route_properti": 0, "being": 0, "updat": [0, 1, 11], "A": [0, 1, 2, 3, 4, 6, 11], "addit": [0, 1, 8, 11], "create_delete_route_card_dict": 0, "base_transit_line_properties_dict": 0, "delet": [0, 1], "style": [0, 6], "attribut": [0, 1, 2, 3, 4, 11], "find": [0, 1, 4], "create_update_route_card_dict": 0, "updated_properties_dict": 0, "cube_properties_to_standard_properti": 0, "cube_properties_dict": 0, "convert": [0, 1, 4, 6], "most": 0, "pertin": 0, "like": [0, 1, 2, 6], "headwai": [0, 2, 4], "varibl": [0, 2], "stnadard": 0, "unit": [0, 1, 6], "minut": 0, "second": [0, 4, 6], "correct": 0, "base_transit": 0, "identifi": [0, 1, 11], "what": [0, 1, 3, 6], "need": [0, 1, 2, 3, 4], "For": [0, 1, 4, 6], "multipl": [0, 1, 6, 11], "make": [0, 1, 11], "duplic": 0, "so": [0, 1, 5, 11], "each": [0, 1, 3, 6], "condit": [0, 1], "contain": [0, 1, 4, 6, 8, 11], "requir": [0, 1, 6, 11], "evalu": [0, 1, 3], "differ": [0, 2, 6], "between": [0, 1, 2, 3, 6], "evaluate_route_property_differ": 0, "properties_build": 0, "properties_bas": 0, "time_period_numb": 0, "absolut": [0, 6], "true": [0, 1, 3, 4, 5, 6, 11], "validate_bas": 0, "fals": [0, 1, 3, 4, 6, 11], "check": [0, 1, 3], "ani": [0, 1, 3, 6, 11], "entri": [0, 3], "property_nam": 0, "property_valu": 0, "us": [0, 1, 2, 3, 4, 6, 11], "command": [0, 1, 3], "rather": [0, 1], "than": [0, 1, 6], "If": [0, 1, 3, 4, 6, 11], "automat": 0, "note": [0, 6, 11], "onli": [0, 1, 3, 4, 6], "numer": [0, 4], "frequenc": [0, 4, 11], "suitabl": 0, "write": [0, 1, 3, 4, 11], "evaluate_route_shape_chang": 0, "shape_build": 0, "shape_bas": 0, "two": [0, 1, 6, 8, 11], "build": [0, 1, 3, 11], "version": [0, 6, 11], "ddatafram": 0, "get_time_period_numbers_from_cube_properti": 0, "properties_list": 0, "associ": [0, 2], "them": [0, 1, 4, 6, 8], "all": [0, 1, 2, 5, 6, 11], "found": [0, 1, 6], "unpack_route_nam": 0, "unpack": 0, "info": [0, 1, 2], "link": [1, 2, 3, 6, 11], "kwarg": [1, 2, 3, 6], "roadwaynetwork": [1, 3, 4], "subclass": [1, 11], "network_wrangl": [1, 8, 11], "represent": [1, 3, 4, 6], "physic": 1, "roadwai": [1, 2, 3, 4, 11], "geodatafram": [1, 11], "specifi": [1, 3, 6], "default": [1, 2, 3, 4, 6, 11], "cr": [1, 3, 6], "coordin": [1, 3, 6], "refer": [1, 3, 4, 6, 11], "system": [1, 3], "espg": [1, 3], "node_foreign_kei": [1, 3], "tabl": [1, 2, 3, 6], "link_foreign_kei": [1, 3], "foreign": [1, 3], "shape_foreign_kei": [1, 3, 11], 
"unique_link_id": [1, 3], "unique_node_id": [1, 3], "modes_to_network_link_vari": [1, 3], "map": [1, 2, 3, 6, 11], "mode": [1, 3, 4, 11], "modes_to_network_nodes_vari": [1, 3], "managed_lanes_node_id_scalar": [1, 3], "scalar": [1, 3, 6], "primari": [1, 3], "correspond": [1, 3, 4], "manag": [1, 2, 3, 11], "lane": [1, 2, 3, 11], "managed_lanes_link_id_scalar": [1, 3], "managed_lanes_required_attribut": [1, 3], "must": [1, 3], "keep_same_attributes_ml_and_gp": [1, 3], "parallel": [1, 3], "gener": [1, 3, 6], "purpos": [1, 3, 6], "add_count": 1, "network_vari": 1, "aadt": 1, "mndot_count_shst_data": 1, "none": [1, 3, 4, 5, 6], "widot_count_shst_data": 1, "mndot_count_variable_shp": [1, 2], "widot_count_variable_shp": 1, "count": [1, 2], "mc": [1, 2, 4], "join": [1, 3, 4, 6, 11], "data": [1, 2, 3, 4, 8, 11], "via": 1, "shst": 1, "api": [1, 6], "match": 1, "result": [1, 11], "should": [1, 2, 3, 6, 11], "written": [1, 11], "path": [1, 3, 4, 6, 11], "mndot": [1, 2], "locat": [1, 2, 3, 4], "widot": 1, "geodatabas": 1, "add_incident_link_data_to_nod": 1, "links_df": [1, 11], "nodes_df": [1, 11], "link_vari": 1, "unique_node_kei": 1, "model_node_id": [1, 2], "go": [1, 11], "assess": [1, 3], "connect": 1, "incid": 1, "where": 1, "length": [1, 6], "n": [1, 2, 3, 11], "out": [1, 2, 5, 6, 8], "add_new_roadway_feature_chang": 1, "featur": [1, 6], "also": [1, 6, 11], "valid": [1, 6, 11], "add_variable_using_shst_refer": 1, "var_shst_csvdata": 1, "shst_csv_variabl": 1, "network_var_typ": 1, "overwrit": 1, "bool": [1, 3, 6], "addition_map": 1, "show": 1, "project_card_dictionari": 1, "wrapper": [1, 8, 11], "apply_managed_lane_feature_chang": 1, "link_idx": 1, "in_plac": [1, 11], "lndice": 1, "whether": 1, "decid": 1, "connector": [1, 2], "when": [1, 3, 4, 6], "thei": [1, 2], "more": [1, 6, 11], "apply_python_calcul": 1, "pycod": 1, "execut": 1, "code": [1, 2, 4, 6, 11], "apply_roadway_feature_chang": [1, 11], "select": [1, 11], "pass": [1, 5], "assess_connect": [1, 11], "ignore_end_nod": [1, 11], "graph": 1, "disconnect": 1, "subgraph": 1, "describ": [1, 11], "member": [1, 6], "one": [1, 6, 11], "drive": [1, 11], "walk": [1, 11], "bike": 1, "ignor": [1, 6], "strai": 1, "singleton": 1, "tupl": [1, 6], "osmnx": [1, 11], "networkx": 1, "digraph": 1, "build_selection_kei": 1, "selection_dict": 1, "combin": [1, 2, 4, 6], "queri": [1, 11], "b": [1, 2, 11], "you": [1, 3, 11], "selection_dictonari": 1, "serv": 1, "calculate_area_typ": 1, "area_type_shap": [1, 2], "area_type_shape_vari": 1, "area_typ": [1, 2, 3], "area_type_codes_dict": 1, "downtown_area_type_shap": [1, 2], "downtown_area_typ": [1, 2], "area": [1, 2, 3, 6], "centroid": [1, 2, 6], "geometri": [1, 2, 6], "field": [1, 2, 11], "determin": [1, 3, 6], "label": 1, "isn": 1, "perfect": 1, "much": 1, "quicker": 1, "other": [1, 6, 11], "The": [1, 2, 5, 6, 11], "geodadabas": 1, "input": [1, 6], "downtown": [1, 2], "boundari": [1, 2, 6], "counti": [1, 2, 3], "calculate_centroidconnect": 1, "centroidconnect": [1, 2, 3], "highest_taz_numb": [1, 2], "as_integ": 1, "max": 1, "taz": [1, 2], "calculate_counti": 1, "county_shap": [1, 2], "county_shape_vari": 1, "county_codes_dict": 1, "calculate_dist": 1, "distanc": [1, 2, 3, 6], "centroidconnect_onli": 1, "mile": 1, "centroidconnector": 1, "calculate_mpo": 1, "county_network_vari": 1, "mpo": [1, 2], "mpo_counti": [1, 2], "param": [1, 4], "county_vari": 1, "region": [1, 6], "calculate_us": 1, "defauli": 1, "convert_int": 1, "int_col_nam": 1, "create_ml_vari": 1, "ml_lane": [1, 2], "ml": 1, "placehold": 1, "come": 1, 
"log": [1, 3, 8], "create_calculated_vari": 1, "create_dummy_connector_link": 1, "ml_df": 1, "access_lan": 1, "egress_lan": 1, "access_roadwai": 1, "ml_access": 1, "egress_roadwai": 1, "access_name_prefix": 1, "access": [1, 4, 6], "dummi": 1, "egress_name_prefix": 1, "egress": 1, "gp_df": 1, "roadai": 1, "prefix": 1, "create_hov_corridor_vari": 1, "segment_id": [1, 2], "hov": 1, "corridor": 1, "create_managed_lane_network": [1, 11], "keep_additional_attributes_ml_and_gp": 1, "separ": [1, 3, 6], "look": 1, "want": [1, 11], "leav": 1, "some": 1, "rigor": 1, "test": [1, 2, 6], "create_managed_vari": 1, "dataframe_to_fixed_width": 1, "df": 1, "fix": [1, 2], "width": [1, 6], "transform": [1, 6], "delete_roadway_feature_chang": 1, "ignore_miss": 1, "get": [1, 2, 6, 11], "miss": 1, "fail": [1, 6], "deletion_map": 1, "fill_na": 1, "fill": [1, 6], "na": 1, "from_roadwaynetwork": [1, 11], "roadway_network_object": 1, "get_attribut": 1, "join_kei": 1, "source_shst_ref_df": 1, "source_gdf": 1, "field_nam": 1, "get_managed_lane_node_id": 1, "nodes_list": 1, "4500000": 1, "237": 1, "get_modal_graph": 1, "bike_access": [1, 2], "bu": [1, 4], "bus_onli": [1, 2], "drive_access": [1, 2, 11], "rail": [1, 4], "rail_onli": [1, 2], "walk_access": [1, 2], "strongli": 1, "vertex": [1, 6], "reachabl": 1, "everi": 1, "get_modal_links_nod": 1, "kept": 1, "both": [1, 4, 6, 11], "filter": [1, 6], "right": [1, 6], "now": 1, "we": [1, 4, 11], "don": 1, "becaus": [1, 4, 6, 11], "mark": 1, "issu": 1, "discuss": 1, "http": [1, 4, 6, 11], "github": [1, 6, 11], "com": [1, 4, 6, 11], "wsp": [1, 11], "sag": [1, 11], "145": 1, "modal_nodes_df": 1, "mode_node_vari": 1, "get_property_by_time_period_and_group": 1, "prop": 1, "categori": [1, 2, 11], "default_return": 1, "seri": [1, 11], "group": 1, "16": [1, 2, 6], "00": [1, 2], "19": [1, 2], "option": [1, 6], "sov": [1, 2], "search": [1, 2, 8, 11], "hov3": [1, 2], "hov2": [1, 2], "identify_seg": 1, "o_id": 1, "d_id": 1, "endpoint": 1, "up": [1, 5, 11], "segment": [1, 4, 6, 11], "candid": 1, "otherwis": [1, 4, 6], "ram": 1, "hog": 1, "could": [1, 6, 11], "odd": 1, "shortest": 1, "segment_vari": 1, "keep": [1, 6], "identify_segment_endpoint": 1, "min_connecting_link": 1, "10": [1, 2, 6], "min_dist": 1, "max_link_devi": 1, "2": [1, 2, 4, 6, 11], "is_network_connect": [1, 11], "consid": [1, 6, 11], "cach": 1, "long": [1, 6], "load_transform_network": 1, "node_filenam": [1, 11], "link_filenam": [1, 11], "shape_filenam": [1, 11], "4326": [1, 6], "validate_schema": 1, "disk": 1, "schema": [1, 11], "shapes_df": [1, 11], "network_connection_plot": 1, "g": [1, 6], "disconnected_subgraph_nod": 1, "plot": 1, "fig": 1, "ax": 1, "orig_dest_nodes_foreign_kei": 1, "whatev": 1, "u": 1, "v": [1, 2, 11], "ab": 1, "noth": 1, "assum": 1, "a_id": 1, "b_id": 1, "ox_graph": 1, "unique_link_kei": 1, "model_link_id": [1, 2, 3], "arrai": [1, 6], "remov": [1, 6], "certain": 1, "do": [1, 6, 11], "too": [1, 5], "link_df": 1, "referenc": 1, "multidigraph": 1, "path_search": 1, "candidate_links_df": 1, "weight_column": 1, "weight_factor": 1, "search_breadth": 1, "5": [1, 2, 4, 6], "max_search_breadth": 1, "candidate_link": 1, "part": [1, 6, 11], "foreigh": 1, "destin": 1, "weight": 1, "iter": [1, 6], "multipli": 1, "fast": [1, 11], "recalculate_calculated_vari": [1, 3], "recalculate_dist": [1, 3], "json": [1, 11], "geojson": 1, "skip": 1, "speed": 1, "spatial": [1, 6], "etc": [1, 2, 3, 11], "re": 1, "read_match_result": 1, "lot": 1, "same": [1, 4, 6], "concaten": 1, "singl": [1, 3, 6], "geopanda": [1, 11], 
"sure": 1, "why": 1, "util": [1, 8, 11], "rename_variables_for_dbf": 1, "input_df": 1, "variable_crosswalk": 1, "output_vari": [1, 2], "convert_geometry_to_xi": 1, "renam": [1, 3], "dbf": 1, "shp": [1, 2], "char": 1, "crosswalk": [1, 3], "x": [1, 2, 6], "y": [1, 2, 6], "roadway_net_to_gdf": 1, "roadway_net": 1, "turn": [1, 11], "export": [1, 8, 11], "sophist": 1, "attach": 1, "roadway_standard_to_met_council_network": 1, "output_epsg": [1, 2], "consist": [1, 6, 11], "metcouncil": [1, 2, 4, 8, 11], "": [1, 2, 4, 6, 11], "model": [1, 2], "expect": [1, 6], "epsg": [1, 2, 6], "output": [1, 2, 3, 4, 6], "select_roadway_featur": [1, 11], "search_mod": 1, "force_search": 1, "sp_weight_factor": 1, "satisfi": [1, 6], "criteria": 1, "net": [1, 11], "osm": [1, 2], "share": [1, 11], "street": [1, 11], "osm_model_link_id": 1, "1234": 1, "shstid": 1, "4321": 1, "regex": 1, "facil": [1, 2, 11], "main": 1, "st": [1, 2], "least": [1, 11], "perform": [1, 4], "even": 1, "previou": 1, "discourag": 1, "meander": 1, "ref": 1, "here": [1, 6], "defaul": 1, "selection_has_unique_link_id": 1, "selection_dictionari": 1, "selection_map": 1, "selected_link_idx": 1, "candidate_link_idx": 1, "selected_links_idx": 1, "candidate_links_idx": 1, "shortest_path": 1, "graph_links_df": 1, "100": 1, "four": 1, "nx": 1, "split_properties_by_time_period_and_categori": 1, "properties_to_split": [1, 2], "split": [1, 2, 4], "structur": 1, "stratifi": 1, "times_period": 1, "am": [1, 2], "6": [1, 2, 4, 6], "pm": [1, 2], "15": [1, 2], "update_dist": 1, "use_shap": 1, "inplac": 1, "straight": 1, "avail": 1, "portion": 1, "provid": [1, 3, 4, 6, 11], "entir": 1, "crow": 1, "fly": 1, "meter": [1, 6], "nan": 1, "validate_link_schema": 1, "schema_loc": 1, "roadway_network_link": 1, "validate_node_schema": 1, "node_fil": 1, "roadway_network_nod": 1, "validate_properti": 1, "ignore_exist": 1, "require_existing_for_chang": 1, "theproject": 1, "dictonari": 1, "validate_select": 1, "selection_requir": 1, "whetther": 1, "minimum": [1, 5, 6], "validate_shape_schema": 1, "shape_fil": 1, "roadway_network_shap": 1, "validate_uniqu": 1, "confirm": 1, "met": 1, "filenam": [1, 3, 11], "were": 1, "save": 1, "write_roadway_as_fixedwidth": [1, 11], "output_dir": 1, "node_output_vari": 1, "link_output_vari": 1, "output_link_txt": [1, 2], "output_node_txt": [1, 2], "output_link_header_width_txt": [1, 2], "output_node_header_width_txt": [1, 2], "output_cube_network_script": [1, 2], "drive_onli": 1, "function": [1, 5, 6, 8, 11], "doe": [1, 4], "header": [1, 2], "3": [1, 2, 4, 6, 11], "script": [1, 2], "run": [1, 8], "databas": [1, 3], "record": [1, 4], "write_roadway_as_shp": [1, 11], "data_to_csv": 1, "data_to_dbf": 1, "output_link_shp": [1, 2], "output_node_shp": [1, 2], "output_link_csv": [1, 2], "output_node_csv": [1, 2], "output_gpkg": 1, "output_link_gpkg_lay": 1, "output_node_gpkg_lay": 1, "output_gpkg_link_filt": 1, "gpkg": 1, "csv": [1, 2, 3, 11], "full": 1, "geopackag": 1, "layer": [1, 11], "subset": 1, "calculated_valu": [1, 3], "dai": [2, 4, 11], "can": [2, 6, 8, 11], "runtim": [2, 11], "initi": [2, 3, 11], "keyword": [2, 6, 11], "argument": [2, 6, 11], "explicitli": [2, 11], "highlight": [2, 11], "attr": 2, "time_period_to_tim": 2, "abbrevi": [2, 4], "gtf": [2, 4, 11], "highwai": [2, 3, 8, 11], "ea": 2, "md": 2, "ev": 2, "cube_time_period": 2, "4": [2, 4, 6], "demand": 2, "allow": [2, 6], "suffix": 2, "truck": 2, "trk": 2, "final": 2, "lanes_am": 2, "time_periods_to_tim": 2, "shapefil": 2, "r": 2, "metcouncil_data": 2, "cb_2017_us_county_5m": 2, 
"county_variable_shp": 2, "lanes_lookup_fil": 2, "lookup": 2, "centroid_connect_lan": 2, "anoka": 2, "dakota": 2, "hennepin": 2, "ramsei": 2, "scott": 2, "washington": 2, "carver": 2, "taz_shap": 2, "tazofficialwcurrentforecast": 2, "taz_data": 2, "highest": 2, "3100": 2, "link_id": 2, "shstgeometryid": 2, "roadway_class": 2, "truck_access": 2, "trn_priority_ea": 2, "trn_priority_am": 2, "trn_priority_md": 2, "trn_priority_pm": 2, "trn_priority_ev": 2, "ttime_assert_ea": 2, "ttime_assert_am": 2, "ttime_assert_md": 2, "ttime_assert_pm": 2, "ttime_assert_ev": 2, "lanes_ea": 2, "lanes_md": 2, "lanes_pm": 2, "lanes_ev": 2, "price_sov_ea": 2, "price_hov2_ea": 2, "price_hov3_ea": 2, "price_truck_ea": 2, "price_sov_am": 2, "price_hov2_am": 2, "price_hov3_am": 2, "price_truck_am": 2, "price_sov_md": 2, "price_hov2_md": 2, "price_hov3_md": 2, "price_truck_md": 2, "price_sov_pm": 2, "price_hov2_pm": 2, "price_hov3_pm": 2, "price_truck_pm": 2, "price_sov_ev": 2, "price_hov2_ev": 2, "price_hov3_ev": 2, "price_truck_ev": 2, "roadway_class_idx": 2, "facility_typ": 2, "osm_node_id": [2, 6, 11], "bike_nod": 2, "transit_nod": 2, "walk_nod": 2, "drive_nod": 2, "ml_lanes_ea": 2, "ml_lanes_am": 2, "ml_lanes_md": 2, "ml_lanes_pm": 2, "ml_lanes_ev": 2, "osm_facility_type_dict": 2, "thrivemsp2040communitydesign": 2, "area_type_variable_shp": 2, "comdes2040": 2, "area_type_code_dict": 2, "23": 2, "urban": [2, 4], "center": [2, 6], "24": 2, "25": 2, "35": 2, "36": 2, "41": 2, "51": 2, "52": 2, "53": 2, "60": 2, "downtownzones_taz": 2, "mrcc_roadway_class_shap": 2, "mrcc": 2, "trans_mrcc_centerlin": 2, "mrcc_roadway_class_variable_shp": 2, "mrcc_roadway_class_shp": 2, "route_si": 2, "widot_roadway_class_shap": 2, "wisconsin": 2, "wisconsin_lanes_counts_median": 2, "wislr": 2, "widot_roadway_class_variable_shp": 2, "rdwy_ctgy_": 2, "mndot_count_shap": 2, "count_mn": 2, "aadt_2017_count_loc": 2, "osm_highway_facility_type_crosswalk": 2, "legacy_tm2_attribut": 2, "shstreferenceid": 2, "legaci": 2, "tm2": 2, "osm_lanes_attribut": 2, "tam_tm2_attribut": 2, "tam": 2, "tom_tom_attribut": 2, "tomtom": 2, "tomtom_attribut": 2, "sfcta_attribut": 2, "sfcta": 2, "geograph": 2, "102646": 2, "scratch": 2, "txt": [2, 11], "links_header_width": 2, "nodes_header_width": 2, "import": [2, 6], "make_complete_network_from_fixed_width_fil": 2, "zone": 2, "travel": [2, 11], "analysi": 2, "3061": 2, "county_link_range_dict": 2, "county_code_dict": 2, "7": [2, 4, 11], "extern": 2, "chisago": 2, "11": 2, "goodhu": 2, "12": 2, "isanti": 2, "13": [2, 6], "le": 2, "sueur": 2, "14": 2, "mcleod": 2, "pierc": 2, "polk": 2, "17": 2, "rice": 2, "18": 2, "sherburn": 2, "siblei": 2, "20": 2, "croix": 2, "21": 2, "wright": 2, "22": 2, "route_type_bus_mode_dict": 2, "urb": 2, "loc": 2, "sub": [2, 6], "express": [2, 4, 6], "route_type_mode_dict": 2, "8": [2, 4, 6], "9": [2, 4], "cube_time_periods_nam": 2, "op": 2, "detail": [2, 5], "possibl": [2, 6], "roadway_link_chang": 3, "roadway_node_chang": 3, "transit_chang": [3, 4], "base_roadway_network": 3, "base_cube_transit_network": 3, "build_cube_transit_network": 3, "project_nam": 3, "produc": [3, 6], "test_project": [3, 11], "create_project": [3, 11], "base_cube_transit_sourc": 3, "o": [3, 4, 11], "build_cube_transit_sourc": 3, "transit_route_shape_chang": [3, 11], "evaluate_chang": [3, 11], "write_project_card": [3, 11], "scratch_dir": [3, 11], "t_transit_shape_test": [3, 11], "yml": [3, 11], "default_project_nam": 3, "level": 3, "constant": 3, "static_valu": 3, "card_data": 3, "cubetransit": [3, 8], 
"bunch": 3, "projectcard": [3, 8], "case": 3, "standardtransit": [3, 8], "add_highway_chang": 3, "limit_variables_to_existing_network": 3, "hoc": [3, 11], "add_transit_chang": 3, "roadway_log_fil": 3, "roadway_shp_fil": 3, "roadway_csv_fil": 3, "network_build_fil": 3, "emme_node_id_crosswalk_fil": 3, "emme_name_crosswalk_fil": 3, "base_roadway_dir": 3, "base_transit_dir": [3, 4, 11], "consum": 3, "logfil": 3, "emm": [3, 4], "folder": 3, "base_cube_transit_fil": 3, "build_cube_transit_fil": 3, "first": [3, 4, 6], "recalcul": 3, "determine_roadway_network_changes_compat": 3, "emme_id_to_wrangler_id": 3, "emme_link_change_df": 3, "emme_node_change_df": 3, "emme_transit_changes_df": 3, "rewrit": 3, "wrangler": [3, 8, 11], "emme_name_to_wrangler_nam": 3, "aggreg": 3, "get_object_from_network_build_command": 3, "row": [3, 4], "histori": 3, "l": 3, "get_operation_from_network_build_command": 3, "action": 3, "c": [3, 6], "d": [3, 6], "read_logfil": 3, "logfilenam": 3, "reprsent": [3, 11], "read_network_build_fil": 3, "networkbuildfilenam": 3, "nework": 3, "assign_group": 3, "user": [3, 11], "TO": 3, "ptg_feed": 4, "hold": [4, 11], "feed": [4, 11], "partridg": [4, 11], "manipul": [4, 11], "cube_transit_net": [4, 11], "read_gtf": [4, 11], "write_as_cube_lin": [4, 11], "write_dir": [4, 11], "outfil": [4, 11], "calculate_cube_mod": 4, "assign": 4, "logic": 4, "route_typ": 4, "develop": [4, 11], "googl": 4, "cube_mod": 4, "route_type_to_cube_mod": 4, "tram": 4, "streetcar": 4, "light": 4, "further": 4, "disaggreg": 4, "buse": 4, "suburban": 4, "longnam": 4, "lower": [4, 6], "elif": 4, "99": 4, "local": [4, 11], "els": [4, 6], "route_long_nam": 4, "cube_format": 4, "represnt": 4, "notat": 4, "trip": 4, "trip_id": 4, "shape_id": [4, 11], "tod": 4, "onewai": 4, "oper": [4, 6], "fromtransitnetwork": 4, "transit_network_object": 4, "modelroadwaynetwork": [4, 8], "gtfs_feed_dir": 4, "route_properties_gtfs_to_cub": 4, "prepar": 4, "trip_df": 4, "shape_gtfs_to_cub": 4, "add_nntim": 4, "shape_gtfs_to_dict_list": 4, "step": 4, "through": 4, "todo": 4, "elimin": 4, "necess": 4, "tag": [4, 11], "begin": 4, "As": 4, "m": 4, "minim": 4, "modif": 4, "question": 4, "shape_pt_sequ": 4, "shape_mode_node_id": 4, "is_stop": 4, "stop_sequ": 4, "shape_gtfs_to_emm": 4, "trip_row": 4, "time_to_cube_time_period": 4, "start_time_sec": 4, "as_str": 4, "verbos": 4, "midnight": [4, 6], "this_tp": 4, "this_tp_num": 4, "outpath": 4, "after": 4, "setuplog": 5, "infologfilenam": 5, "debuglogfilenam": 5, "logtoconsol": 5, "infolog": 5, "ters": 5, "just": 5, "give": 5, "bare": 5, "composit": 5, "clear": 5, "later": 5, "debuglog": 5, "veri": [5, 6], "noisi": 5, "debug": 5, "spew": 5, "consol": 5, "point": 6, "arg": 6, "basegeometri": 6, "zero": 6, "dimension": 6, "z": 6, "float": 6, "p": 6, "print": 6, "almost_equ": 6, "decim": 6, "equal": 6, "place": 6, "deprec": 6, "sinc": 6, "confus": 6, "equals_exact": 6, "instead": 6, "approxim": 6, "compon": [6, 8], "linestr": 6, "1e": 6, "array_interfac": 6, "numpi": 6, "protocol": 6, "buffer": 6, "resolut": 6, "quadseg": 6, "cap_styl": 6, "join_styl": 6, "mitre_limit": 6, "single_sid": 6, "dilat": 6, "eros": 6, "small": 6, "mai": 6, "sometim": 6, "tidi": 6, "polygon": 6, "around": [6, 8, 11], "angl": 6, "fillet": 6, "gone": 6, "next": 6, "major": 6, "releas": 6, "cap": 6, "round": 6, "flat": 6, "squar": 6, "offset": 6, "mitr": 6, "bevel": 6, "limit": 6, "ratio": 6, "sharp": 6, "corner": 6, "meet": 6, "miter": 6, "extend": 6, "To": [6, 11], "prevent": 6, "unreason": 6, "control": 6, "maximum": 6, 
"exce": 6, "side": 6, "sign": 6, "left": 6, "hand": 6, "regular": 6, "alwai": 6, "forc": 6, "equival": 6, "cap_flat": 6, "strictli": 6, "wkt": 6, "load": 6, "gon": 6, "approx": 6, "radiu": 6, "circl": 6, "1365484905459": 6, "128": 6, "141513801144": 6, "triangl": 6, "exterior": 6, "coord": 6, "covered_bi": 6, "cover": 6, "cross": 6, "disjoint": 6, "unitless": 6, "empti": 6, "val": 6, "93892948296208": 6, "topolog": 6, "toler": 6, "comparison": 6, "geometrytyp": 6, "hausdorff_dist": 6, "hausdorff": 6, "interpol": 6, "normal": 6, "along": 6, "linear": 6, "taken": 6, "measur": 6, "revers": 6, "rang": 6, "index": [6, 8], "handl": 6, "clamp": 6, "interpret": 6, "fraction": 6, "intersect": 6, "form": 6, "canon": 6, "ring": 6, "multi": 6, "multilinestr": 6, "overlap": 6, "nearest": 6, "relat": 6, "de": 6, "9im": 6, "matrix": 6, "relate_pattern": 6, "pattern": 6, "relationship": 6, "representative_point": 6, "guarante": 6, "cheapli": 6, "simplifi": 6, "preserve_topologi": 6, "dougla": 6, "peucker": 6, "algorithm": 6, "unless": 6, "topologi": 6, "preserv": 6, "invalid": 6, "svg": 6, "scale_factor": 6, "fill_color": 6, "opac": 6, "element": 6, "factor": 6, "diamet": 6, "hex": 6, "color": 6, "66cc99": 6, "ff3333": 6, "symmetric_differ": 6, "symmetr": 6, "touch": 6, "union": 6, "array_interface_bas": 6, "dimens": 6, "bound": 6, "collect": 6, "null": 6, "minx": 6, "mini": 6, "maxx": 6, "maxi": 6, "geometr": 6, "convex_hul": 6, "convex": 6, "hull": 6, "less": 6, "three": [6, 11], "multipoint": 6, "triangular": 6, "imagin": 6, "elast": 6, "band": 6, "stretch": 6, "coordinatesequ": 6, "ctype": 6, "envelop": 6, "figur": 6, "geom_typ": 6, "has_z": 6, "sequenc": 6, "impl": 6, "geosimpl": 6, "geo": 6, "is_clos": 6, "close": 6, "applic": 6, "is_empti": 6, "is_r": 6, "is_simpl": 6, "simpl": 6, "mean": 6, "is_valid": 6, "definit": 6, "depend": 6, "minimum_clear": 6, "move": 6, "minimum_rotated_rectangl": 6, "rectangl": 6, "possibli": 6, "rotat": 6, "degener": 6, "wkb": 6, "wkb_hex": 6, "xy": 6, "shell": 6, "hole": 6, "It": [6, 8], "space": 6, "interior": 6, "classmethod": 6, "from_bound": 6, "xmin": 6, "ymin": 6, "xmax": 6, "ymax": 6, "stroke": 6, "partial": 6, "func": 6, "given": [6, 11], "futur": 6, "call": 6, "column_name_to_part": 6, "create_locationrefer": 6, "geodesic_point_buff": 6, "lat": 6, "lon": 6, "circular": 6, "get_shared_streets_intersection_hash": 6, "per": [6, 11], "sharedstreet": 6, "j": 6, "blob": 6, "0e6d7de0aee2e9ae3b007d1e45284b06cc241d02": 6, "src": 6, "l553": 6, "l565": 6, "93": 6, "0965985": 6, "44": 6, "952112199999995": 6, "954734870": 6, "69f13f881649cb21ee3b359730790bb9": 6, "hhmmss_to_datetim": 6, "hhmmss_str": 6, "datetim": 6, "hh": 6, "mm": 6, "ss": 6, "dt": 6, "secs_to_datetim": 6, "sec": 6, "shorten_nam": 6, "geom": 6, "xp": 6, "yp": 6, "zp": 6, "shall": 6, "ident": 6, "def": 6, "id_func": 6, "g2": 6, "g1": 6, "pyproj": 6, "accur": 6, "wgs84": 6, "utm": 6, "32618": 6, "from_cr": 6, "always_xi": 6, "support": 6, "lambda": 6, "unidecod": 6, "error": 6, "replace_str": 6, "transliter": 6, "unicod": 6, "ascii": 6, "\u5317\u4eb0": 6, "bei": 6, "jing": 6, "tri": 6, "codec": 6, "charact": 6, "fall": 6, "back": 6, "five": 6, "faster": 6, "slightli": 6, "slower": 6, "unicode_expect_nonascii": 6, "present": 6, "replac": [6, 11], "strict": 6, "rais": 6, "unidecodeerror": 6, "substitut": 6, "might": [6, 11], "packag": [8, 11], "mtc": 8, "aim": 8, "networkwrangl": [8, 11], "refin": 8, "respect": 8, "citilab": 8, "softwar": [8, 11], "instal": 8, "bleed": 8, "edg": 8, "clone": 8, "brief": 8, 
"intro": 8, "workflow": 8, "quickstart": 8, "jupyt": 8, "notebook": 8, "setup": 8, "scenario": 8, "audit": 8, "report": 8, "logger": 8, "modul": 8, "page": 8, "suggest": 11, "virtualenv": 11, "conda": 11, "virtual": 11, "environ": 11, "recommend": 11, "pip": 11, "lasso": 11, "config": 11, "channel": 11, "forg": 11, "rtree": 11, "my_lasso_environ": 11, "activ": 11, "git": 11, "master": 11, "pypi": 11, "repositori": 11, "date": 11, "branch": 11, "work": 11, "your": 11, "machin": 11, "edit": 11, "plan": 11, "well": 11, "cd": 11, "team": 11, "contribut": 11, "bxack": 11, "pleas": 11, "fork": 11, "befor": 11, "upstream": 11, "branchnam": 11, "frequent": 11, "instruct": 11, "good": 11, "atom": 11, "sublim": 11, "text": 11, "syntax": 11, "desktop": 11, "built": 11, "mashup": 11, "open": 11, "In": 11, "nest": 11, "span": 11, "implement": 11, "novel": 11, "break": 11, "publictransport": 11, "document": 11, "done": 11, "gui": 11, "public": 11, "transport": 11, "infrastructur": 11, "servic": 11, "tier": 11, "made": 11, "mainli": 11, "my_link_fil": 11, "my_node_fil": 11, "my_shape_fil": 11, "my_select": 11, "35e": 11, "961117623": 11, "2564047368": 11, "my_chang": 11, "my_net": 11, "ml_net": 11, "_": 11, "disconnected_nod": 11, "my_out_prefix": 11, "my_dir": 11, "my_base_scenario": 11, "road_net": 11, "stpaul_link_fil": 11, "stpaul_node_fil": 11, "stpaul_shape_fil": 11, "transit_net": 11, "stpaul_dir": 11, "card_filenam": 11, "3_multiple_roadway_attribute_chang": 11, "multiple_chang": 11, "4_simple_managed_lan": 11, "project_card_directori": 11, "project_card": 11, "project_cards_list": 11, "my_scenario": 11, "create_scenario": 11, "base_scenario": 11, "check_scenario_requisit": 11, "apply_all_project": 11, "scenario_summari": 11, "base_transit_sourc": 11, "build_transit_sourc": 11, "understand": 11, "how": 11, "overrid": 11, "those": 11, "instanti": 11, "yaml": 11, "configur": 11, "pair": 11, "config_fil": 11, "f": 11, "my_config": 11, "safe_load": 11, "model_road_net": 11, "my_paramet": 11, "accomplish": 11, "goal": 11, "top": 11, "learn": 11, "basic": 11, "creation": 11, "ipynb": 11}, "objects": {"": [[7, 0, 0, "-", "lasso"]], "lasso": [[0, 1, 1, "", "CubeTransit"], [1, 1, 1, "", "ModelRoadwayNetwork"], [2, 1, 1, "", "Parameters"], [3, 1, 1, "", "Project"], [4, 1, 1, "", "StandardTransit"], [5, 0, 0, "-", "logger"], [6, 0, 0, "-", "util"]], "lasso.CubeTransit": [[0, 2, 1, "", "__init__"], [0, 2, 1, "", "add_additional_time_periods"], [0, 2, 1, "", "add_cube"], [0, 2, 1, "", "build_route_name"], [0, 2, 1, "", "calculate_start_end_times"], [0, 2, 1, "", "create_add_route_card_dict"], [0, 2, 1, "", "create_delete_route_card_dict"], [0, 2, 1, "", "create_from_cube"], [0, 2, 1, "", "create_update_route_card_dict"], [0, 2, 1, "", "cube_properties_to_standard_properties"], [0, 3, 1, "", "diff_dict"], [0, 2, 1, "", "evaluate_differences"], [0, 2, 1, "", "evaluate_route_property_differences"], [0, 2, 1, "", "evaluate_route_shape_changes"], [0, 2, 1, "", "get_time_period_numbers_from_cube_properties"], [0, 3, 1, "", "line_properties"], [0, 3, 1, "", "lines"], [0, 3, 1, "", "parameters"], [0, 3, 1, "", "program_type"], [0, 3, 1, "", "shapes"], [0, 3, 1, "", "source_list"], [0, 2, 1, "", "unpack_route_name"]], "lasso.ModelRoadwayNetwork": [[1, 3, 1, "", "CALCULATED_VALUES"], [1, 2, 1, "", "__init__"], [1, 2, 1, "", "add_counts"], [1, 2, 1, "", "add_incident_link_data_to_nodes"], [1, 2, 1, "", "add_new_roadway_feature_change"], [1, 2, 1, "", "add_variable_using_shst_reference"], [1, 2, 1, "", "addition_map"], 
[1, 2, 1, "", "apply"], [1, 2, 1, "", "apply_managed_lane_feature_change"], [1, 2, 1, "", "apply_python_calculation"], [1, 2, 1, "", "apply_roadway_feature_change"], [1, 2, 1, "", "assess_connectivity"], [1, 2, 1, "", "build_selection_key"], [1, 2, 1, "", "calculate_area_type"], [1, 2, 1, "", "calculate_centroidconnect"], [1, 2, 1, "", "calculate_county"], [1, 2, 1, "", "calculate_distance"], [1, 2, 1, "", "calculate_mpo"], [1, 2, 1, "", "calculate_use"], [1, 2, 1, "", "convert_int"], [1, 2, 1, "", "create_ML_variable"], [1, 2, 1, "", "create_calculated_variables"], [1, 2, 1, "", "create_dummy_connector_links"], [1, 2, 1, "", "create_hov_corridor_variable"], [1, 2, 1, "", "create_managed_lane_network"], [1, 2, 1, "", "create_managed_variable"], [1, 2, 1, "", "dataframe_to_fixed_width"], [1, 2, 1, "", "delete_roadway_feature_change"], [1, 2, 1, "", "deletion_map"], [1, 2, 1, "", "fill_na"], [1, 2, 1, "", "from_RoadwayNetwork"], [1, 2, 1, "", "get_attribute"], [1, 2, 1, "", "get_managed_lane_node_ids"], [1, 2, 1, "", "get_modal_graph"], [1, 2, 1, "", "get_modal_links_nodes"], [1, 2, 1, "", "get_property_by_time_period_and_group"], [1, 2, 1, "", "identify_segment"], [1, 2, 1, "", "identify_segment_endpoints"], [1, 2, 1, "", "is_network_connected"], [1, 2, 1, "", "load_transform_network"], [1, 2, 1, "", "network_connection_plot"], [1, 2, 1, "", "orig_dest_nodes_foreign_key"], [1, 2, 1, "", "ox_graph"], [1, 2, 1, "", "path_search"], [1, 2, 1, "", "read"], [1, 2, 1, "", "read_match_result"], [1, 2, 1, "", "rename_variables_for_dbf"], [1, 2, 1, "", "roadway_net_to_gdf"], [1, 2, 1, "", "roadway_standard_to_met_council_network"], [1, 2, 1, "", "select_roadway_features"], [1, 2, 1, "", "selection_has_unique_link_id"], [1, 2, 1, "", "selection_map"], [1, 2, 1, "", "shortest_path"], [1, 2, 1, "", "split_properties_by_time_period_and_category"], [1, 2, 1, "", "update_distance"], [1, 2, 1, "", "validate_link_schema"], [1, 2, 1, "", "validate_node_schema"], [1, 2, 1, "", "validate_properties"], [1, 2, 1, "", "validate_selection"], [1, 2, 1, "", "validate_shape_schema"], [1, 2, 1, "", "validate_uniqueness"], [1, 2, 1, "", "write"], [1, 2, 1, "", "write_roadway_as_fixedwidth"], [1, 2, 1, "", "write_roadway_as_shp"]], "lasso.Parameters": [[2, 2, 1, "", "__init__"], [2, 3, 1, "", "county_link_range_dict"], [2, 3, 1, "", "cube_time_periods"], [2, 3, 1, "", "properties_to_split"], [2, 3, 1, "", "zones"]], "lasso.Project": [[3, 3, 1, "", "CALCULATED_VALUES"], [3, 3, 1, "id0", "DEFAULT_PROJECT_NAME"], [3, 3, 1, "id1", "STATIC_VALUES"], [3, 2, 1, "", "__init__"], [3, 2, 1, "", "add_highway_changes"], [3, 2, 1, "", "add_transit_changes"], [3, 3, 1, "", "base_cube_transit_network"], [3, 3, 1, "", "base_roadway_network"], [3, 3, 1, "", "build_cube_transit_network"], [3, 3, 1, "", "card_data"], [3, 2, 1, "", "create_project"], [3, 2, 1, "", "determine_roadway_network_changes_compatibility"], [3, 2, 1, "", "emme_id_to_wrangler_id"], [3, 2, 1, "", "emme_name_to_wrangler_name"], [3, 2, 1, "", "evaluate_changes"], [3, 2, 1, "", "get_object_from_network_build_command"], [3, 2, 1, "", "get_operation_from_network_build_command"], [3, 3, 1, "", "parameters"], [3, 3, 1, "", "project_name"], [3, 2, 1, "", "read_logfile"], [3, 2, 1, "", "read_network_build_file"], [3, 3, 1, "", "roadway_link_changes"], [3, 3, 1, "", "roadway_node_changes"], [3, 3, 1, "", "transit_changes"], [3, 2, 1, "", "write_project_card"]], "lasso.StandardTransit": [[4, 2, 1, "", "__init__"], [4, 2, 1, "", "calculate_cube_mode"], [4, 2, 1, "", 
"cube_format"], [4, 2, 1, "", "evaluate_differences"], [4, 3, 1, "", "feed"], [4, 2, 1, "", "fromTransitNetwork"], [4, 3, 1, "", "parameters"], [4, 2, 1, "", "read_gtfs"], [4, 2, 1, "", "route_properties_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_dict_list"], [4, 2, 1, "", "shape_gtfs_to_emme"], [4, 2, 1, "", "time_to_cube_time_period"], [4, 2, 1, "", "write_as_cube_lin"]], "lasso.logger": [[5, 4, 1, "", "setupLogging"]], "lasso.util": [[6, 1, 1, "", "Point"], [6, 1, 1, "", "Polygon"], [6, 4, 1, "", "column_name_to_parts"], [6, 4, 1, "", "create_locationreference"], [6, 4, 1, "", "geodesic_point_buffer"], [6, 4, 1, "", "get_shared_streets_intersection_hash"], [6, 4, 1, "", "hhmmss_to_datetime"], [6, 1, 1, "", "partial"], [6, 4, 1, "", "secs_to_datetime"], [6, 4, 1, "", "shorten_name"], [6, 4, 1, "", "transform"], [6, 4, 1, "", "unidecode"]], "lasso.util.Point": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 2, 1, "", "array_interface"], [6, 5, 1, "", "array_interface_base"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 5, 1, "", "ctypes"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "empty"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 3, 1, "", "impl"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "x"], [6, 5, 1, "", "xy"], [6, 5, 1, "", "y"], [6, 5, 1, "", "z"]], "lasso.util.Polygon": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "array_interface_base"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 5, 1, "", "ctypes"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "empty"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "id0", "exterior"], [6, 2, 1, "", "from_bounds"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 3, 1, "", "impl"], [6, 5, 1, "id1", "interiors"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, 
"", "is_valid"], [6, 5, 1, "", "length"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "xy"]], "lasso.util.partial": [[6, 3, 1, "", "args"], [6, 3, 1, "", "func"], [6, 3, 1, "", "keywords"]]}, "objtypes": {"0": "py:module", "1": "py:class", "2": "py:method", "3": "py:attribute", "4": "py:function", "5": "py:property"}, "objnames": {"0": ["py", "module", "Python module"], "1": ["py", "class", "Python class"], "2": ["py", "method", "Python method"], "3": ["py", "attribute", "Python attribute"], "4": ["py", "function", "Python function"], "5": ["py", "property", "Python property"]}, "titleterms": {"lasso": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], "cubetransit": [0, 11], "modelroadwaynetwork": [1, 11], "todo": 1, "paramet": [2, 10, 11], "project": [3, 9, 10, 11], "standardtransit": [4, 11], "logger": 5, "util": [6, 7], "class": 7, "function": 7, "base": 7, "welcom": 8, "": 8, "document": 8, "content": 8, "indic": 8, "tabl": 8, "run": [9, 11], "creat": 9, "file": [9, 10, 11], "scenario": [9, 11], "export": 9, "network": [9, 11], "audit": 9, "report": 9, "setup": 10, "set": 10, "addit": 10, "data": 10, "start": 11, "out": 11, "instal": 11, "bleed": 11, "edg": 11, "from": 11, "clone": 11, "brief": 11, "intro": 11, "compon": 11, "roadwaynetwork": 11, "transitnetwork": 11, "projectcard": 11, "typic": 11, "workflow": 11, "card": 11, "transit": 11, "lin": 11, "cube": 11, "log": 11, "model": 11, "quickstart": 11, "jupyt": 11, "notebook": 11}, "envversion": {"sphinx.domains.c": 2, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 8, "sphinx.domains.index": 1, "sphinx.domains.javascript": 2, "sphinx.domains.math": 2, "sphinx.domains.python": 3, "sphinx.domains.rst": 2, "sphinx.domains.std": 2, "sphinx.ext.intersphinx": 1, "sphinx.ext.todo": 2, "sphinx.ext.viewcode": 1, "sphinx": 57}, "alltitles": {"lasso.CubeTransit": [[0, "lasso-cubetransit"]], "lasso.ModelRoadwayNetwork": [[1, "lasso-modelroadwaynetwork"]], "Todo": [[1, "id1"], [1, "id2"], [1, "id3"], [1, "id4"], [1, "id5"], [1, "id6"]], "lasso.Parameters": [[2, "lasso-parameters"]], "lasso.Project": [[3, "lasso-project"]], "lasso.StandardTransit": [[4, "lasso-standardtransit"]], "lasso.logger": [[5, "module-lasso.logger"]], "lasso.util": [[6, "module-lasso.util"]], "Lasso Classes and Functions": [[7, "module-lasso"]], "Base Classes": [[7, "base-classes"]], "Utils and Functions": [[7, "utils-and-functions"]], "Welcome to lasso\u2019s documentation!": [[8, "welcome-to-lasso-s-documentation"]], "Contents:": [[8, null]], "Indices and tables": [[8, "indices-and-tables"]], "Running Lasso": [[9, "running-lasso"]], "Create project files": [[9, "create-project-files"]], "Create a scenario": [[9, "create-a-scenario"]], "Exporting networks": [[9, "exporting-networks"]], "Auditing and Reporting": [[9, "auditing-and-reporting"]], "Setup": [[10, "setup"]], "Projects": [[10, "projects"]], "Parameters": [[10, "parameters"], [11, "parameters"]], "Settings": [[10, "settings"]], "Additional Data Files": [[10, "additional-data-files"]], "Starting Out": [[11, 
"starting-out"]], "Installation": [[11, "installation"]], "Bleeding Edge": [[11, "bleeding-edge"]], "From Clone": [[11, "from-clone"]], "Brief Intro": [[11, "brief-intro"]], "Components": [[11, "components"]], "RoadwayNetwork": [[11, "roadwaynetwork"]], "TransitNetwork": [[11, "transitnetwork"]], "ProjectCard": [[11, "projectcard"]], "Scenario": [[11, "scenario"]], "Project": [[11, "project"]], "ModelRoadwayNetwork": [[11, "modelroadwaynetwork"]], "StandardTransit": [[11, "standardtransit"]], "CubeTransit": [[11, "cubetransit"]], "Typical Workflow": [[11, "typical-workflow"]], "Project Cards from Transit LIN Files": [[11, "project-cards-from-transit-lin-files"]], "Project Cards from Cube LOG Files": [[11, "project-cards-from-cube-log-files"]], "Model Network Files for a Scenario": [[11, "model-network-files-for-a-scenario"]], "Running Quickstart Jupyter Notebooks": [[11, "running-quickstart-jupyter-notebooks"]]}, "indexentries": {"cubetransit (class in lasso)": [[0, "lasso.CubeTransit"]], "__init__() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.__init__"]], "add_additional_time_periods() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_additional_time_periods"]], "add_cube() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_cube"]], "build_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.build_route_name"]], "calculate_start_end_times() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.calculate_start_end_times"]], "create_add_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_add_route_card_dict"]], "create_delete_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_delete_route_card_dict"]], "create_from_cube() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.create_from_cube"]], "create_update_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_update_route_card_dict"]], "cube_properties_to_standard_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.cube_properties_to_standard_properties"]], "diff_dict (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.diff_dict"]], "evaluate_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_differences"]], "evaluate_route_property_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_route_property_differences"]], "evaluate_route_shape_changes() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.evaluate_route_shape_changes"]], "get_time_period_numbers_from_cube_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.get_time_period_numbers_from_cube_properties"]], "line_properties (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.line_properties"]], "lines (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.lines"]], "parameters (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.parameters"]], "program_type (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.program_type"]], "shapes (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.shapes"]], "source_list (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.source_list"]], "unpack_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.unpack_route_name"]], "calculated_values (lasso.modelroadwaynetwork attribute)": [[1, "lasso.ModelRoadwayNetwork.CALCULATED_VALUES"]], "modelroadwaynetwork (class in lasso)": [[1, "lasso.ModelRoadwayNetwork"]], "__init__() (lasso.modelroadwaynetwork method)": [[1, 
"lasso.ModelRoadwayNetwork.__init__"]], "add_counts() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_counts"]], "add_incident_link_data_to_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.add_incident_link_data_to_nodes"]], "add_new_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_new_roadway_feature_change"]], "add_variable_using_shst_reference() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_variable_using_shst_reference"]], "addition_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.addition_map"]], "apply() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply"]], "apply_managed_lane_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_managed_lane_feature_change"]], "apply_python_calculation() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_python_calculation"]], "apply_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_roadway_feature_change"]], "assess_connectivity() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.assess_connectivity"]], "build_selection_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.build_selection_key"]], "calculate_area_type() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_area_type"]], "calculate_centroidconnect() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_centroidconnect"]], "calculate_county() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_county"]], "calculate_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_distance"]], "calculate_mpo() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_mpo"]], "calculate_use() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_use"]], "convert_int() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.convert_int"]], "create_ml_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_ML_variable"]], "create_calculated_variables() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_calculated_variables"]], "create_dummy_connector_links() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_dummy_connector_links"]], "create_hov_corridor_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_hov_corridor_variable"]], "create_managed_lane_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_lane_network"]], "create_managed_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_variable"]], "dataframe_to_fixed_width() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.dataframe_to_fixed_width"]], "delete_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.delete_roadway_feature_change"]], "deletion_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.deletion_map"]], "fill_na() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.fill_na"]], "from_roadwaynetwork() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.from_RoadwayNetwork"]], "get_attribute() 
(lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_attribute"]], "get_managed_lane_node_ids() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_managed_lane_node_ids"]], "get_modal_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_graph"]], "get_modal_links_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_links_nodes"]], "get_property_by_time_period_and_group() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.get_property_by_time_period_and_group"]], "identify_segment() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment"]], "identify_segment_endpoints() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment_endpoints"]], "is_network_connected() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.is_network_connected"]], "load_transform_network() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.load_transform_network"]], "network_connection_plot() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.network_connection_plot"]], "orig_dest_nodes_foreign_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.orig_dest_nodes_foreign_key"]], "ox_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.ox_graph"]], "path_search() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.path_search"]], "read() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read"]], "read_match_result() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read_match_result"]], "rename_variables_for_dbf() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.rename_variables_for_dbf"]], "roadway_net_to_gdf() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.roadway_net_to_gdf"]], "roadway_standard_to_met_council_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.roadway_standard_to_met_council_network"]], "select_roadway_features() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.select_roadway_features"]], "selection_has_unique_link_id() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_has_unique_link_id"]], "selection_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_map"]], "shortest_path() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.shortest_path"]], "split_properties_by_time_period_and_category() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.split_properties_by_time_period_and_category"]], "update_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.update_distance"]], "validate_link_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_link_schema"]], "validate_node_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_node_schema"]], "validate_properties() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_properties"]], "validate_selection() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_selection"]], "validate_shape_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_shape_schema"]], 
"validate_uniqueness() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_uniqueness"]], "write() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write"]], "write_roadway_as_fixedwidth() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_fixedwidth"]], "write_roadway_as_shp() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_shp"]], "parameters (class in lasso)": [[2, "lasso.Parameters"]], "__init__() (lasso.parameters method)": [[2, "lasso.Parameters.__init__"]], "county_link_range_dict (lasso.parameters attribute)": [[2, "lasso.Parameters.county_link_range_dict"]], "cube_time_periods (lasso.parameters attribute)": [[2, "lasso.Parameters.cube_time_periods"]], "properties_to_split (lasso.parameters attribute)": [[2, "lasso.Parameters.properties_to_split"]], "zones (lasso.parameters attribute)": [[2, "lasso.Parameters.zones"]], "calculated_values (lasso.project attribute)": [[3, "lasso.Project.CALCULATED_VALUES"]], "default_project_name (lasso.project attribute)": [[3, "id0"], [3, "lasso.Project.DEFAULT_PROJECT_NAME"]], "project (class in lasso)": [[3, "lasso.Project"]], "static_values (lasso.project attribute)": [[3, "id1"], [3, "lasso.Project.STATIC_VALUES"]], "__init__() (lasso.project method)": [[3, "lasso.Project.__init__"]], "add_highway_changes() (lasso.project method)": [[3, "lasso.Project.add_highway_changes"]], "add_transit_changes() (lasso.project method)": [[3, "lasso.Project.add_transit_changes"]], "base_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.base_cube_transit_network"]], "base_roadway_network (lasso.project attribute)": [[3, "lasso.Project.base_roadway_network"]], "build_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.build_cube_transit_network"]], "card_data (lasso.project attribute)": [[3, "lasso.Project.card_data"]], "create_project() (lasso.project static method)": [[3, "lasso.Project.create_project"]], "determine_roadway_network_changes_compatibility() (lasso.project static method)": [[3, "lasso.Project.determine_roadway_network_changes_compatibility"]], "emme_id_to_wrangler_id() (lasso.project static method)": [[3, "lasso.Project.emme_id_to_wrangler_id"]], "emme_name_to_wrangler_name() (lasso.project static method)": [[3, "lasso.Project.emme_name_to_wrangler_name"]], "evaluate_changes() (lasso.project method)": [[3, "lasso.Project.evaluate_changes"]], "get_object_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_object_from_network_build_command"]], "get_operation_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_operation_from_network_build_command"]], "parameters (lasso.project attribute)": [[3, "lasso.Project.parameters"]], "project_name (lasso.project attribute)": [[3, "lasso.Project.project_name"]], "read_logfile() (lasso.project static method)": [[3, "lasso.Project.read_logfile"]], "read_network_build_file() (lasso.project static method)": [[3, "lasso.Project.read_network_build_file"]], "roadway_link_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_link_changes"]], "roadway_node_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_node_changes"]], "transit_changes (lasso.project attribute)": [[3, "lasso.Project.transit_changes"]], "write_project_card() (lasso.project method)": [[3, "lasso.Project.write_project_card"]], "standardtransit (class in lasso)": [[4, "lasso.StandardTransit"]], "__init__() 
(lasso.standardtransit method)": [[4, "lasso.StandardTransit.__init__"]], "calculate_cube_mode() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.calculate_cube_mode"]], "cube_format() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.cube_format"]], "evaluate_differences() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.evaluate_differences"]], "feed (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.feed"]], "fromtransitnetwork() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.fromTransitNetwork"]], "parameters (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.parameters"]], "read_gtfs() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.read_gtfs"]], "route_properties_gtfs_to_cube() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.route_properties_gtfs_to_cube"]], "shape_gtfs_to_cube() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_cube"]], "shape_gtfs_to_dict_list() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_dict_list"]], "shape_gtfs_to_emme() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_emme"]], "time_to_cube_time_period() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.time_to_cube_time_period"]], "write_as_cube_lin() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.write_as_cube_lin"]], "lasso.logger": [[5, "module-lasso.logger"]], "module": [[5, "module-lasso.logger"], [6, "module-lasso.util"], [7, "module-lasso"]], "setuplogging() (in module lasso.logger)": [[5, "lasso.logger.setupLogging"]], "point (class in lasso.util)": [[6, "lasso.util.Point"]], "polygon (class in lasso.util)": [[6, "lasso.util.Polygon"]], "almost_equals() (lasso.util.point method)": [[6, "lasso.util.Point.almost_equals"]], "almost_equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.almost_equals"]], "area (lasso.util.point property)": [[6, "lasso.util.Point.area"]], "area (lasso.util.polygon property)": [[6, "lasso.util.Polygon.area"]], "args (lasso.util.partial attribute)": [[6, "lasso.util.partial.args"]], "array_interface() (lasso.util.point method)": [[6, "lasso.util.Point.array_interface"]], "array_interface_base (lasso.util.point property)": [[6, "lasso.util.Point.array_interface_base"]], "array_interface_base (lasso.util.polygon property)": [[6, "lasso.util.Polygon.array_interface_base"]], "boundary (lasso.util.point property)": [[6, "lasso.util.Point.boundary"]], "boundary (lasso.util.polygon property)": [[6, "lasso.util.Polygon.boundary"]], "bounds (lasso.util.point property)": [[6, "lasso.util.Point.bounds"]], "bounds (lasso.util.polygon property)": [[6, "lasso.util.Polygon.bounds"]], "buffer() (lasso.util.point method)": [[6, "lasso.util.Point.buffer"]], "buffer() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.buffer"]], "centroid (lasso.util.point property)": [[6, "lasso.util.Point.centroid"]], "centroid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.centroid"]], "column_name_to_parts() (in module lasso.util)": [[6, "lasso.util.column_name_to_parts"]], "contains() (lasso.util.point method)": [[6, "lasso.util.Point.contains"]], "contains() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains"]], "convex_hull (lasso.util.point property)": [[6, "lasso.util.Point.convex_hull"]], "convex_hull (lasso.util.polygon property)": [[6, "lasso.util.Polygon.convex_hull"]], "coords (lasso.util.point property)": [[6, 
"lasso.util.Point.coords"]], "coords (lasso.util.polygon property)": [[6, "lasso.util.Polygon.coords"]], "covered_by() (lasso.util.point method)": [[6, "lasso.util.Point.covered_by"]], "covered_by() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covered_by"]], "covers() (lasso.util.point method)": [[6, "lasso.util.Point.covers"]], "covers() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covers"]], "create_locationreference() (in module lasso.util)": [[6, "lasso.util.create_locationreference"]], "crosses() (lasso.util.point method)": [[6, "lasso.util.Point.crosses"]], "crosses() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.crosses"]], "ctypes (lasso.util.point property)": [[6, "lasso.util.Point.ctypes"]], "ctypes (lasso.util.polygon property)": [[6, "lasso.util.Polygon.ctypes"]], "difference() (lasso.util.point method)": [[6, "lasso.util.Point.difference"]], "difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.difference"]], "disjoint() (lasso.util.point method)": [[6, "lasso.util.Point.disjoint"]], "disjoint() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.disjoint"]], "distance() (lasso.util.point method)": [[6, "lasso.util.Point.distance"]], "distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.distance"]], "empty() (lasso.util.point method)": [[6, "lasso.util.Point.empty"]], "empty() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.empty"]], "envelope (lasso.util.point property)": [[6, "lasso.util.Point.envelope"]], "envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.envelope"]], "equals() (lasso.util.point method)": [[6, "lasso.util.Point.equals"]], "equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals"]], "equals_exact() (lasso.util.point method)": [[6, "lasso.util.Point.equals_exact"]], "equals_exact() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals_exact"]], "exterior (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.exterior"]], "exterior (lasso.util.polygon property)": [[6, "id0"]], "from_bounds() (lasso.util.polygon class method)": [[6, "lasso.util.Polygon.from_bounds"]], "func (lasso.util.partial attribute)": [[6, "lasso.util.partial.func"]], "geodesic_point_buffer() (in module lasso.util)": [[6, "lasso.util.geodesic_point_buffer"]], "geom_type (lasso.util.point property)": [[6, "lasso.util.Point.geom_type"]], "geom_type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.geom_type"]], "geometrytype() (lasso.util.point method)": [[6, "lasso.util.Point.geometryType"]], "geometrytype() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.geometryType"]], "get_shared_streets_intersection_hash() (in module lasso.util)": [[6, "lasso.util.get_shared_streets_intersection_hash"]], "has_z (lasso.util.point property)": [[6, "lasso.util.Point.has_z"]], "has_z (lasso.util.polygon property)": [[6, "lasso.util.Polygon.has_z"]], "hausdorff_distance() (lasso.util.point method)": [[6, "lasso.util.Point.hausdorff_distance"]], "hausdorff_distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.hausdorff_distance"]], "hhmmss_to_datetime() (in module lasso.util)": [[6, "lasso.util.hhmmss_to_datetime"]], "impl (lasso.util.point attribute)": [[6, "lasso.util.Point.impl"]], "impl (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.impl"]], "interiors (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.interiors"]], "interiors (lasso.util.polygon property)": [[6, "id1"]], "interpolate() (lasso.util.point method)": [[6, "lasso.util.Point.interpolate"]], 
"interpolate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.interpolate"]], "intersection() (lasso.util.point method)": [[6, "lasso.util.Point.intersection"]], "intersection() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersection"]], "intersects() (lasso.util.point method)": [[6, "lasso.util.Point.intersects"]], "intersects() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersects"]], "is_closed (lasso.util.point property)": [[6, "lasso.util.Point.is_closed"]], "is_closed (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_closed"]], "is_empty (lasso.util.point property)": [[6, "lasso.util.Point.is_empty"]], "is_empty (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_empty"]], "is_ring (lasso.util.point property)": [[6, "lasso.util.Point.is_ring"]], "is_ring (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_ring"]], "is_simple (lasso.util.point property)": [[6, "lasso.util.Point.is_simple"]], "is_simple (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_simple"]], "is_valid (lasso.util.point property)": [[6, "lasso.util.Point.is_valid"]], "is_valid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_valid"]], "keywords (lasso.util.partial attribute)": [[6, "lasso.util.partial.keywords"]], "lasso.util": [[6, "module-lasso.util"]], "length (lasso.util.point property)": [[6, "lasso.util.Point.length"]], "length (lasso.util.polygon property)": [[6, "lasso.util.Polygon.length"]], "minimum_clearance (lasso.util.point property)": [[6, "lasso.util.Point.minimum_clearance"]], "minimum_clearance (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_clearance"]], "minimum_rotated_rectangle (lasso.util.point property)": [[6, "lasso.util.Point.minimum_rotated_rectangle"]], "minimum_rotated_rectangle (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_rotated_rectangle"]], "normalize() (lasso.util.point method)": [[6, "lasso.util.Point.normalize"]], "normalize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.normalize"]], "overlaps() (lasso.util.point method)": [[6, "lasso.util.Point.overlaps"]], "overlaps() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.overlaps"]], "partial (class in lasso.util)": [[6, "lasso.util.partial"]], "project() (lasso.util.point method)": [[6, "lasso.util.Point.project"]], "project() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.project"]], "relate() (lasso.util.point method)": [[6, "lasso.util.Point.relate"]], "relate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate"]], "relate_pattern() (lasso.util.point method)": [[6, "lasso.util.Point.relate_pattern"]], "relate_pattern() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate_pattern"]], "representative_point() (lasso.util.point method)": [[6, "lasso.util.Point.representative_point"]], "representative_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.representative_point"]], "secs_to_datetime() (in module lasso.util)": [[6, "lasso.util.secs_to_datetime"]], "shorten_name() (in module lasso.util)": [[6, "lasso.util.shorten_name"]], "simplify() (lasso.util.point method)": [[6, "lasso.util.Point.simplify"]], "simplify() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.simplify"]], "svg() (lasso.util.point method)": [[6, "lasso.util.Point.svg"]], "svg() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.svg"]], "symmetric_difference() (lasso.util.point method)": [[6, "lasso.util.Point.symmetric_difference"]], "symmetric_difference() (lasso.util.polygon 
method)": [[6, "lasso.util.Polygon.symmetric_difference"]], "touches() (lasso.util.point method)": [[6, "lasso.util.Point.touches"]], "touches() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.touches"]], "transform() (in module lasso.util)": [[6, "lasso.util.transform"]], "type (lasso.util.point property)": [[6, "lasso.util.Point.type"]], "type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.type"]], "unidecode() (in module lasso.util)": [[6, "lasso.util.unidecode"]], "union() (lasso.util.point method)": [[6, "lasso.util.Point.union"]], "union() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.union"]], "within() (lasso.util.point method)": [[6, "lasso.util.Point.within"]], "within() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.within"]], "wkb (lasso.util.point property)": [[6, "lasso.util.Point.wkb"]], "wkb (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb"]], "wkb_hex (lasso.util.point property)": [[6, "lasso.util.Point.wkb_hex"]], "wkb_hex (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb_hex"]], "wkt (lasso.util.point property)": [[6, "lasso.util.Point.wkt"]], "wkt (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkt"]], "x (lasso.util.point property)": [[6, "lasso.util.Point.x"]], "xy (lasso.util.point property)": [[6, "lasso.util.Point.xy"]], "xy (lasso.util.polygon property)": [[6, "lasso.util.Polygon.xy"]], "y (lasso.util.point property)": [[6, "lasso.util.Point.y"]], "z (lasso.util.point property)": [[6, "lasso.util.Point.z"]], "lasso": [[7, "module-lasso"]]}}) \ No newline at end of file diff --git a/branch/bart/setup/index.html b/branch/bart/setup/index.html new file mode 100644 index 0000000..2ff636e --- /dev/null +++ b/branch/bart/setup/index.html @@ -0,0 +1,128 @@ + + + + + + + Setup — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Setup

+
+

Projects

+
+
+

Parameters

+
+
+

Settings

+
+
+

Additional Data Files

+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bart/starting/index.html b/branch/bart/starting/index.html new file mode 100644 index 0000000..a54a689 --- /dev/null +++ b/branch/bart/starting/index.html @@ -0,0 +1,429 @@ + + + + + + + Starting Out — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Starting Out

+
+

Installation

+

If you are managing multiple python versions, we suggest using virtualenv or conda virtual environments.

+

The following example uses a conda environment (recommended) and the package manager pip to install Lasso from the source on GitHub.

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/Lasso@master
+
+
+

Lasso will install network_wrangler from the PyPI repository because it is included in Lasso’s requirements.txt.

+
+

Bleeding Edge

+

If you want to install a more up-to-date or development version of network wrangler and lasso, you can do so by installing them from the develop branch of each repository on GitHub:

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/network_wrangler@develop
+pip install git+https://github.com/wsp-sag/Lasso@develop
+
+
+
+
+

From Clone

+

If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e flag installs it in editable mode.

+

If you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+git clone https://github.com/wsp-sag/Lasso
+git clone https://github.com/wsp-sag/network_wrangler
+cd network_wrangler
+pip install -e .
+cd ..
+cd Lasso
+pip install -e .
+
+
+

Notes:

+
    +
  1. The -e flag installs the package in editable mode.

  2. +
  3. If you are not part of the project team and want to contribute code back to the project, please fork before you clone and then add the original repository to your upstream origin list per these directions on GitHub.

  4. +
  5. If you want to install from a specific tag/version number or branch, replace @master with @<branchname> or @<tag>; see the example after this list.

  6. +
  7. If you want to make use of frequent developer updates for network wrangler as well, you can also install it from a clone by following the same clone-and-install instructions shown above for Lasso.

  8. +
+
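
For example, a sketch of installing from the develop branch instead of master; substitute any branch or tag name that exists in the repository:

+
pip install git+https://github.com/wsp-sag/Lasso@develop
+
+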

If you are going to be doing Lasso development, we also recommend:

+
    +
  • a good IDE such as Atom, VS Code, Sublime Text, etc. +with Python syntax highlighting turned on.

  • +
  • GitHub Desktop to locally update your clones

  • +
+
+
+
+

Brief Intro

+

Lasso is a ‘wrapper’ around the Network Wrangler utility.

+

Both Lasso and NetworkWrangler are built around the following data schemas:

+
    +
  • [roadway network], which is based on a mashup of Open Street Map and Shared Streets. In Network Wrangler these are read in from three json files representing: links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category.

  • +
  • [transit network], which is based on a frequency-based implementation of the csv-based GTFS; and

  • +
  • [project card], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml.

  • +
+

In addition, Lasso utilizes the following data schemas:

+
    +
  • [MetCouncil Model Roadway Network Schema], which adds data fields to the roadway network schema that MetCouncil uses in their travel model including breaking out data fields by time period.

  • +
  • [MetCouncil Model Transit Network Schema], which uses the Cube PublicTransport format, and

  • +
  • [Cube Log Files], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler.

  • +
  • [Cube public transport line files], which define a set of transit lines in the cube software.

  • +
+
+

Components

+

Network Wrangler has the following atomic parts:

+
    +
  • RoadwayNetwork object, which represents the roadway network data as GeoDataFrames;

  • +
  • TransitNetwork object, which represents the transit network data as DataFrames;

  • +
  • ProjectCard object, which represents the data of the project card. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run;

  • +
  • Scenario object, which consists of at least a RoadwayNetwork and a +TransitNetwork. Scenarios can be based on or tiered from other scenarios. +Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network.

  • +
+

In addition, Lasso has the following atomic parts:

+
    +
  • Project object, which creates project cards from one of the following: a base and a build transit network in cube format, a base and a build highway network, or a base highway network and a Cube log file.

  • +
  • ModelRoadwayNetwork object is a subclass of RoadwayNetwork and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube.

  • +
  • StandardTransit, an object for holding a standard transit feed as a Partridge object, which contains +methods to manipulate and translate the GTFS data to MetCouncil’s Cube Line files.

  • +
  • CubeTransit, an object for storing information about transit defined in Cube public transport line files. It has the capability to parse cube line file properties and shapes into python dictionaries, and to compare line files and represent changes as Project Card dictionaries.

  • +
  • Parameters, a class representing all the parameters defining the networks +including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance +with a keyword argument setting the attribute. Parameters that are +not explicitly set will use default parameters listed in this class.

  • +
+
+

RoadwayNetwork

+

Reads, writes, queries and manipulates roadway network data, which +is mainly stored in the GeoDataFrames links_df, nodes_df, and shapes_df.

+
net = RoadwayNetwork.read(
+        link_filename=MY_LINK_FILE,
+        node_filename=MY_NODE_FILE,
+        shape_filename=MY_SHAPE_FILE,
+        shape_foreign_key ='shape_id',
+        
+    )
+my_selection = {
+    "link": [{"name": ["I 35E"]}],
+    "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+    "B": {"osm_node_id": "2564047368"},
+}
+net.select_roadway_features(my_selection)
+
+my_change = [
+    {
+        'property': 'lanes',
+        'existing': 1,
+        'set': 2,
+     },
+     {
+        'property': 'drive_access',
+        'set': 0,
+      },
+]
+
+net.apply_roadway_feature_change(
+    net.select_roadway_features(my_selection),
+    my_change
+)
+
+ml_net = net.create_managed_lane_network(in_place=False)
+
+ml_net.is_network_connected(mode="drive")
+
+_, disconnected_nodes = ml_net.assess_connectivity(
+  mode="walk",
+  ignore_end_nodes=True
+)
+ml_net.write(filename=my_out_prefix, path=my_dir)
+
+
+
+
+

TransitNetwork

+
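
Represents the transit network data as DataFrames. A minimal sketch of reading a standard transit network, reusing STPAUL_DIR from the Scenario example below and net from the RoadwayNetwork example above:

+
transit_net = TransitNetwork.read(STPAUL_DIR)
+my_base_scenario = {"road_net": net, "transit_net": transit_net}
+
+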
+
+

ProjectCard

+
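
Represents the data of a project card. A minimal sketch of reading a card from the St. Paul example data used in the Scenario example below:

+
project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+card = ProjectCard.read(
+    os.path.join(project_card_directory, "4_simple_managed_lane.yml"),
+    validate=False,
+)
+
+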
+
+

Scenario

+

Manages sets of project cards and tiering from a base scenario/set of networks.

+

+my_base_scenario = {
+    "road_net": RoadwayNetwork.read(
+        link_filename=STPAUL_LINK_FILE,
+        node_filename=STPAUL_NODE_FILE,
+        shape_filename=STPAUL_SHAPE_FILE,
+        fast=True,
+        shape_foreign_key ='shape_id',
+    ),
+    "transit_net": TransitNetwork.read(STPAUL_DIR),
+}
+
+card_filenames = [
+    "3_multiple_roadway_attribute_change.yml",
+    "multiple_changes.yml",
+    "4_simple_managed_lane.yml",
+]
+
+project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+
+project_cards_list = [
+    ProjectCard.read(os.path.join(project_card_directory, filename), validate=False)
+    for filename in card_filenames
+]
+
+my_scenario = Scenario.create_scenario(
+  base_scenario=my_base_scenario,
+  project_cards_list=project_cards_list,
+)
+my_scenario.check_scenario_requisites()
+
+my_scenario.apply_all_projects()
+
+my_scenario.scenario_summary()
+
+
+
+
+

Project

+

Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network.

+

+test_project = Project.create_project(
+  base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+  build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+  )
+
+test_project.evaluate_changes()
+
+test_project.write_project_card(
+  os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+  )
+
+
+
+
+

ModelRoadwayNetwork

+

A subclass of network_wrangler’s RoadwayNetwork +class with additional understanding about how to translate and write the +network out to the MetCouncil Roadway Network schema.

+
net = ModelRoadwayNetwork.read(
+      link_filename=STPAUL_LINK_FILE,
+      node_filename=STPAUL_NODE_FILE,
+      shape_filename=STPAUL_SHAPE_FILE,
+      fast=True,
+      shape_foreign_key ='shape_id',
+  )
+
+net.write_roadway_as_fixedwidth()
+
+
+
+
+

StandardTransit

+

Translates the standard GTFS data to MetCouncil’s Cube Line files.

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+

CubeTransit

+

Used by the project class and has the capability to:

+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+

Parameters

+

Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary.

+
# read parameters from a yaml configuration  file
+# could also provide as a key/value pair
+with open(config_file) as f:
+      my_config = yaml.safe_load(f)
+
+# provide parameters at instantiation of ModelRoadwayNetwork
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+            my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+        )
+# network written with direction from the parameters given
+model_road_net.write_roadway_as_shp()
+
+
+
+
+
+

Typical Workflow

+

Workflows in Lasso and Network Wrangler typically accomplish one of two goals:

+
    +
  1. Create Project Cards to document network changes as a result of either transit or roadway projects.

  2. +
  3. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network.

  4. +
+
+

Project Cards from Transit LIN Files

+
+
+

Project Cards from Cube LOG Files

+
+
+

Model Network Files for a Scenario

+
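
A minimal sketch of this workflow, stitching together the pieces shown above: build a scenario from a base network and project cards, apply the projects, convert the result to a ModelRoadwayNetwork, and write it out for Cube. The names my_base_scenario, project_cards_list, and my_config are the ones defined in the Scenario and Parameters examples above.

+
my_scenario = Scenario.create_scenario(
+  base_scenario=my_base_scenario,
+  project_cards_list=project_cards_list,
+)
+my_scenario.apply_all_projects()
+
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+    my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+)
+model_road_net.write_roadway_as_fixedwidth()
+
+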
+
+
+
+

Running Quickstart Jupyter Notebooks

+

To learn basic lasso functionality, please refer to the following jupyter notebooks in the /notebooks directory:

+
    +
  • Lasso Project Card Creation Quickstart.ipynb

  • +
  • Lasso Scenario Creation Quickstart.ipynb

  • +
+

Jupyter notebooks can be started by activating the lasso conda environment and typing jupyter notebook:

+
conda activate <my_lasso_environment>
+jupyter notebook
+
+
+
+
+ + +
+
+
+ +
+ +
+

© Copyright 2019-2022 Metropolitan Council, Metropolitan Transportation Commission.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/.buildinfo b/branch/bicounty/.buildinfo new file mode 100644 index 0000000..15a527f --- /dev/null +++ b/branch/bicounty/.buildinfo @@ -0,0 +1,4 @@ +# Sphinx build info version 1 +# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. +config: b2b36939df5fcb4cba7778815ec5d6e4 +tags: d77d1c0d9ca2f4c8421862c7c5a0d620 diff --git a/branch/bicounty/_generated/lasso.CubeTransit/index.html b/branch/bicounty/_generated/lasso.CubeTransit/index.html new file mode 100644 index 0000000..c408cd0 --- /dev/null +++ b/branch/bicounty/_generated/lasso.CubeTransit/index.html @@ -0,0 +1,566 @@ + + + + + + + lasso.CubeTransit — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.CubeTransit

+
+
+class lasso.CubeTransit(parameters={})[source]
+

Bases: object

+

Class for storing information about transit defined in Cube line +files.

+

Has the capability to:

+
+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
+

Typical usage example:

+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+lines
+

list of strings representing unique line names in +the cube network.

+
+
Type
+

list

+
+
+
+ +
+
+line_properties
+

dictionary of line properties keyed by line name. Property +values are stored in a dictionary by property name. These +properties are directly read from the cube line files and haven’t +been translated to standard transit values.

+
+
Type
+

dict

+
+
+
+ +
+
+shapes
+

dictionary of shapes +keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns:

+
+
    +
  • ‘node_id’ (int): positive integer of node id

  • +
  • ‘node’ (int): node number, with negative indicating a non-stop

  • +
  • ‘stop’ (boolean): indicates if it is a stop

  • +
  • ‘order’ (int): order within this shape

  • +
+
+
+
Type
+

dict

+
+
+
+ +
+
+program_type
+

Either PT or TRNBLD

+
+
Type
+

str

+
+
+
+ +
+
+parameters
+

Parameters instance that will be applied to this instance which +includes information about time periods and variables.

+
+
Type
+

Parameters

+
+
+
+ +
+
+source_list
+

List of cube line file sources that have been read and added.

+
+
Type
+

list

+
+
+
+ +
+
+diff_dict
+
+
Type
+

dict

+
+
+
+ +
+
+__init__(parameters={})[source]
+

Constructor for CubeTransit

+

parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters

+
+ +

Methods

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([parameters])

Constructor for CubeTransit

add_additional_time_periods(...)

Copies a route to another cube time period with appropriate values for time-period-specific properties.

add_cube(transit_source)

Reads a .lin file and adds it to existing TransitNetwork instance.

build_route_name([route_id, time_period, ...])

Create a route name by concatenating route, time period, agency, and direction

calculate_start_end_times(line_properties_dict)

Calculate the start and end times of the property change WARNING: Doesn’t take care of discontinuous time periods!!!!

create_add_route_card_dict(line)

Creates a project card change formatted dictionary for adding a route based on the information in self.route_properties for the line.

create_delete_route_card_dict(line, ...)

Creates a project card change formatted dictionary for deleting a line.

create_from_cube(transit_source[, parameters])

Reads a cube .lin file and stores as TransitNetwork object.

create_update_route_card_dict(line, ...)

Creates a project card change formatted dictionary for updating the line.

cube_properties_to_standard_properties(...)

Converts cube style properties to standard properties.

evaluate_differences(base_transit)

    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
+

evaluate_route_property_differences(...[, ...])

Checks if any values have been updated or added for a specific route and creates project card entries for each.

evaluate_route_shape_changes(shape_build, ...)

Compares two route shapes and constructs and returns a list of changes suitable for a project card.

get_time_period_numbers_from_cube_properties(...)

Finds properties that are associated with time periods and then returns the numbers in them.

unpack_route_name(line_name)

Unpacks route name into direction, route, agency, and time period info

+
+
+add_additional_time_periods(new_time_period_number, orig_line_name)[source]
+

Copies a route to another cube time period with appropriate +values for time-period-specific properties.

+
+
New properties are stored under the new name in:
    +
  • ::self.shapes

  • +
  • ::self.line_properties

  • +
+
+
+
+
Parameters
+
    +
  • new_time_period_number (int) – cube time period number

  • +
  • orig_line_name (str) – name of the originating line, from which +the new line will copy its properties.

  • +
+
+
Returns
+

Line name with new time period.

+
+
+
+ +
+
+add_cube(transit_source)[source]
+

Reads a .lin file and adds it to existing TransitNetwork instance.

+
+
Parameters
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
+
+ +
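
A sketch of adding a single line file to an existing instance; CUBE_DIR and transit.LIN are the example paths used elsewhere in this documentation:

+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+tn.add_cube(os.path.join(CUBE_DIR, "transit.LIN"))
+
+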
+
+static build_route_name(route_id='', time_period='', agency_id=0, direction_id=1)[source]
+

Create a route name by concatenating route, time period, agency, and direction

+
+
Parameters
+
    +
  • route_id – i.e. 452-111

  • +
  • time_period – i.e. pk

  • +
  • direction_id – i.e. 1

  • +
  • agency_id – i.e. 0

  • +
+
+
Returns
+

constructed line_name i.e. “0_452-111_452_pk1”

+
+
+
+ +
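
A sketch of a call using the documented example values; the resulting string is the one given in the Returns example above:

+
name = CubeTransit.build_route_name(
+    route_id="452-111", time_period="pk", agency_id=0, direction_id=1
+)
+# "0_452-111_452_pk1"
+
+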
+
+calculate_start_end_times(line_properties_dict)[source]
+

Calculate the start and end times of the property change +WARNING: Doesn’t take care of discontinuous time periods!!!!

+
+
Parameters
+

line_properties_dict – dictionary of cube-flavor properties for a transit line

+
+
+
+ +
+
+create_add_route_card_dict(line)[source]
+

Creates a project card change formatted dictionary for adding +a route based on the information in self.route_properties for +the line.

+
+
Parameters
+

line – name of line that is being updated

+
+
Returns
+

A project card change-formatted dictionary for the route addition.

+
+
+
+ +
+
+create_delete_route_card_dict(line, base_transit_line_properties_dict)[source]
+

Creates a project card change formatted dictionary for deleting a line.

+
+
Parameters
+
    +
  • line – name of line that is being deleted

  • +
  • base_transit_line_properties_dict – dictionary of cube-style +attribute values in order to find time periods and +start and end times.

  • +
+
+
Returns
+

A project card change-formatted dictionary for the route deletion.

+
+
+
+ +
+
+static create_from_cube(transit_source, parameters={})[source]
+

Reads a cube .lin file and stores as TransitNetwork object.

+
+
Parameters
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
Returns
+

A ::CubeTransit object created from the transit_source.

+
+
+
+ +
+
+create_update_route_card_dict(line, updated_properties_dict)[source]
+

Creates a project card change formatted dictionary for updating +the line.

+
+
Parameters
+
    +
  • line – name of line that is being updated

  • +
  • updated_properties_dict – dictionary of attributes to update as +‘property’: <property name>, +‘set’: <new property value>

  • +
+
+
Returns
+

A project card change-formatted dictionary for the attribute update.

+
+
+
+ +
+
+static cube_properties_to_standard_properties(cube_properties_dict)[source]
+

Converts cube style properties to standard properties.

+

This is most pertinent to time-period specific variables like headway, +and variables that have standard units like headway, which is minutes +in cube and seconds in standard format.

+
+
Parameters
+

cube_properties_dict – <cube style property name> : <property value>

+
+
Returns
+

+
A list of dictionaries with values for “property”: <standard style property name> and “set”: <property value with correct units>.

+
+
+

+
+
Return type
+

list

+
+
+
+ +
+
+evaluate_differences(base_transit)[source]
+
    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
  3. +
    For routes being added or updated, identify if the time periods

    have changed or if there are multiples, and make duplicate lines if so

    +
    +
    +
  4. +
  5. Create project card dictionaries for each change.

  6. +
+
+
Parameters
+

base_transit (CubeTransit) – an instance of this class for the base condition

+
+
Returns
+

A list of dictionaries containing project card changes +required to evaluate the differences between the base network +and this transit network instance.

+
+
+
+ +
+
+evaluate_route_property_differences(properties_build, properties_base, time_period_number, absolute=True, validate_base=False)[source]
+

Checks if any values have been updated or added for a specific +route and creates project card entries for each.

+
+
Parameters
+
    +
  • properties_build – ::<property_name>: <property_value>

  • +
  • properties_base – ::<property_name>: <property_value>

  • +
  • time_period_number – time period to evaluate

  • +
  • absolute – if True, will use set command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway

  • +
  • validate_base – if True, will add the existing line in the project card

  • +
+
+
Returns
+

+
a list of dictionary values suitable for writing to a project card

{ +‘property’: <property_name>, +‘set’: <set value>, +‘change’: <change from existing value>, +‘existing’: <existing value to check>, +}

+
+
+

+
+
Return type
+

transit_change_list (list)

+
+
+
+ +
+
+static evaluate_route_shape_changes(shape_build, shape_base)[source]
+

Compares two route shapes and constructs and returns a list of changes +suitable for a project card.

+
+
Parameters
+
    +
  • shape_build – DataFrame of the build-version of the route shape.

  • +
  • shape_base – DataFrame of the base-version of the route shape.

  • +
+
+
Returns
+

List of shape changes formatted as a project card-change dictionary.

+
+
+
+ +
+
+static get_time_period_numbers_from_cube_properties(properties_list)[source]
+

Finds properties that are associated with time periods and then +returns the numbers in them.

+
+
Parameters
+

properties_list (list) – list of all properties.

+
+
Returns
+

list of strings of the time period numbers found

+
+
+
+ +
+
+static unpack_route_name(line_name)[source]
+

Unpacks route name into direction, route, agency, and time period info

+
+
Parameters
+

line_name (str) – i.e. “0_452-111_452_pk1”

+
+
Returns
+

452-111 +time_period (str): i.e. pk +direction_id (str) : i.e. 1 +agency_id (str) : i.e. 0

+
+
Return type
+

route_id (str)

+
+
+
+ +
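
A sketch of the inverse operation on the same example line name, assuming the return values are ordered as documented above:

+
route_id, time_period, direction_id, agency_id = CubeTransit.unpack_route_name(
+    "0_452-111_452_pk1"
+)
+
+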
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_generated/lasso.ModelRoadwayNetwork/index.html b/branch/bicounty/_generated/lasso.ModelRoadwayNetwork/index.html new file mode 100644 index 0000000..7d108a6 --- /dev/null +++ b/branch/bicounty/_generated/lasso.ModelRoadwayNetwork/index.html @@ -0,0 +1,1584 @@ + + + + + + + lasso.ModelRoadwayNetwork — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.ModelRoadwayNetwork

+
+
+class lasso.ModelRoadwayNetwork(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Bases: RoadwayNetwork

+

Subclass of network_wrangler class RoadwayNetwork

+

A representation of the physical roadway network and its properties.

+
+
+__init__(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Constructor

+
+
Parameters
+
    +
  • nodes – geodataframe of nodes

  • +
  • links – dataframe of links

  • +
  • shapes – geodataframe of shapes

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. +If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variables linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
+
+ +

Methods

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(nodes, links, shapes[, parameters])

Constructor

add_counts([network_variable, ...])

Adds count variable.

add_incident_link_data_to_nodes([links_df, ...])

Add data from links going to/from nodes to node.

add_new_roadway_feature_change(links, nodes)

add the new roadway features defined in the project card.

add_variable_using_shst_reference([...])

Join network links with source data, via SHST API node match result.

addition_map(links, nodes)

Shows which links and nodes are added to the roadway network

apply(project_card_dictionary)

Wrapper method to apply a project to a roadway network.

apply_managed_lane_feature_change(link_idx, ...)

Apply the managed lane feature changes to the roadway network

apply_python_calculation(pycode[, in_place])

Changes roadway network object by executing pycode.

apply_roadway_feature_change(link_idx, ...)

Changes the roadway attributes for the selected features based on the project card information passed

assess_connectivity([mode, ...])

Returns a network graph and list of disconnected subgraphs as described by a list of their member nodes.

build_selection_key(selection_dict)

Selections are stored by a key combining the query and the A and B ids.

calculate_area_type([area_type_shape, ...])

#MC Calculates area type variable.

calculate_centroidconnect(parameters[, ...])

Calculates centroid connector variable.

calculate_county([county_shape, ...])

#MC Calculates county variable.

calculate_distance([network_variable, ...])

calculate link distance in miles

calculate_mpo([county_network_variable, ...])

Calculates mpo variable.

calculate_use([network_variable, ...])

Calculates use variable.

convert_int([int_col_names])

Convert integer columns

create_ML_variable([network_variable, overwrite])

Created ML lanes placeholder for project to write out ML changes

create_calculated_variables()

Creates calculated roadway variables.

create_dummy_connector_links(ml_df[, ...])

create dummy connector links between the general purpose and managed lanes

create_hov_corridor_variable([...])

Created hov corridor placeholder for project to write out corridor changes

create_managed_lane_network([...])

Create a roadway network with managed lanes links separated out.

create_managed_variable([network_variable, ...])

Created placeholder for project to write out managed

dataframe_to_fixed_width(df)

Convert dataframe to fixed width format, geometry column will not be transformed.

delete_roadway_feature_change(links, nodes)

delete the roadway features defined in the project card.

deletion_map(links, nodes)

Shows which links and nodes are deleted from the roadway network

fill_na()

Fill na values from create_managed_lane_network()

from_RoadwayNetwork(roadway_network_object)

RoadwayNetwork to ModelRoadwayNetwork

get_attribute(links_df, join_key, ...)

Gets attribute from source data using SHST match result.

get_managed_lane_node_ids(nodes_list[, scalar])

Transform a list of node IDS by a scalar.

get_modal_graph(links_df, nodes_df[, mode, ...])

Determines if the network graph is "strongly" connected A graph is strongly connected if each vertex is reachable from every other vertex.

get_modal_links_nodes(links_df, nodes_df[, ...])

Returns nodes and link dataframes for specific mode.

get_property_by_time_period_and_group(prop)

Return a series for the properties with a specific group or time period.

identify_segment(O_id, D_id[, ...])

+
param endpoints
+

list of length of two unique keys of nodes making up endpoints of segment

+
+
+

identify_segment_endpoints([mode, links_df, ...])

+
param mode
+

list of modes of the network, one of drive,`transit`,

+
+
+

is_network_connected([mode, links_df, nodes_df])

Determines if the network graph is "strongly" connected A graph is strongly connected if each vertex is reachable from every other vertex.

load_transform_network(node_filename, ...[, ...])

Reads roadway network files from disk and transforms them into GeoDataFrames.

network_connection_plot(G, ...)

Plot a graph to check for network connection.

orig_dest_nodes_foreign_key(selection[, ...])

Returns the foreign key id (whatever is used in the u and v variables in the links file) for the AB nodes as a tuple.

ox_graph(nodes_df, links_df[, ...])

create an osmnx-flavored network graph

path_search(candidate_links_df, O_id, D_id)

+
param candidate_links
+

selection of links geodataframe with links likely to be part of path

+
+
+

read(link_filename, node_filename, ...[, ...])

Reads in links and nodes network standard.

read_match_result(path)

Reads the shst geojson match returns.

rename_variables_for_dbf(input_df[, ...])

Rename attributes for DBF/SHP, make sure length within 10 chars.

roadway_net_to_gdf(roadway_net)

Turn the roadway network into a GeoDataFrame :param roadway_net: the roadway network to export

roadway_standard_to_met_council_network([...])

Rename and format roadway attributes to be consistent with what metcouncil's model is expecting.

select_roadway_features(selection[, ...])

Selects roadway features that satisfy selection criteria

selection_has_unique_link_id(selection_dict)

+
param selection_dictionary
+

Dictionary representation of selection

+
+
+

selection_map(selected_link_idx[, A, B, ...])

Shows which links are selected for roadway property change or parallel managed lanes category of roadway projects.

shortest_path(graph_links_df, O_id, D_id[, ...])

+
param graph_links_df
+

+
+

split_properties_by_time_period_and_category([...])

Splits properties by time period, assuming a variable structure of

update_distance([links_df, use_shapes, ...])

Calculate link distance in specified units to network variable using either straight line distance or (if specified) shape distance if available.

validate_link_schema(link_filename[, ...])

Validate roadway network data link schema and output a boolean

validate_node_schema(node_file[, ...])

Validate roadway network data node schema and output a boolean

validate_properties(properties[, ...])

If there are change or existing commands, make sure that that property exists in the network.

validate_selection(selection[, ...])

Evaluate whether the selection dictionary contains the minimum required values.

validate_shape_schema(shape_file[, ...])

Validate roadway network data shape schema and output a boolean

validate_uniqueness()

Confirms that the unique identifiers are met.

write([path, filename])

Writes a network in the roadway network standard

write_roadway_as_fixedwidth(output_dir[, ...])

Writes out fixed width file.

write_roadway_as_shp(output_dir[, ...])

Write out dbf/shp/gpkg for cube.

+

Attributes

+ ++++ + + + + + +

CALCULATED_VALUES

+
+
+add_counts(network_variable='AADT', mndot_count_shst_data=None, widot_count_shst_data=None, mndot_count_variable_shp=None, widot_count_variable_shp=None)[source]
+

Adds count variable. +#MC +join the network with count node data, via SHST API node match result

+
+
Parameters
+
    +
  • network_variable (str) – Name of the variable that should be written to. Default to “AADT”.

  • +
  • mndot_count_shst_data (str) – File path to MNDOT count location SHST API node match result.

  • +
  • widot_count_shst_data (str) – File path to WIDOT count location SHST API node match result.

  • +
  • mndot_count_variable_shp (str) – File path to MNDOT count location geodatabase.

  • +
  • widot_count_variable_shp (str) – File path to WIDOT count location geodatabase.

  • +
+
+
Returns
+

None

+
+
+
+ +
+ +

Add data from links going to/from nodes to node.

+
+
Parameters
+
    +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
  • link_variables – list of columns in links dataframe to add to incident nodes

  • +
+
+
Returns
+

nodes DataFrame with link data where length is N*number of links going in/out

+
+
Return type
+

DataFrame

+
+
+
+ +
+
+add_new_roadway_feature_change(links, nodes)
+

add the new roadway features defined in the project card. +new shapes are also added for the new roadway links.

+
+
Parameters
+
    +
  • links – list of dictionaries

  • +
  • nodes – list of dictionaries

  • +
+
+
+

returns: None

+
+
Return type
+

None

+
+
+
+ +
+
+add_variable_using_shst_reference(var_shst_csvdata=None, shst_csv_variable=None, network_variable=None, network_var_type=<class 'int'>, overwrite=False)[source]
+

Join network links with source data, via SHST API node match result.

+
+
Parameters
+
    +
  • var_shst_csvdata (str) – File path to SHST API return.

  • +
  • shst_csv_variable (str) – Variable name in the source data.

  • +
  • network_variable (str) – Name of the variable that should be written to.

  • +
  • network_var_type – Variable type in the written network.

  • +
  • overwrite (bool) – True if overwriting existing variable. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+addition_map(links, nodes)
+

Shows which links and nodes are added to the roadway network

+
+ +
+
+apply(project_card_dictionary)
+

Wrapper method to apply a project to a roadway network.

+
+
Parameters
+

project_card_dictionary – dict +a dictionary of the project card object

+
+
+
+ +
+
+apply_managed_lane_feature_change(link_idx, properties, in_place=True)
+

Apply the managed lane feature changes to the roadway network

+
+
Parameters
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean to indicate whether to update self or return +a new roadway network object

  • +
+
+
+
+ +
+
+apply_python_calculation(pycode, in_place=True)
+

Changes roadway network object by executing pycode.

+
+
Parameters
+
    +
  • pycode – python code which changes values in the roadway network object

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+
+ +
+
+apply_roadway_feature_change(link_idx, properties, in_place=True)
+

Changes the roadway attributes for the selected features based on the +project card information passed

+
+
Parameters
+
    +
  • link_idx – list +indices of all links to apply change to

  • +
  • properties – list of dictionaries of +roadway properties to change

  • +
  • in_place – boolean +update self or return a new roadway network object

  • +
+
+
+
+ +
+
+assess_connectivity(mode='', ignore_end_nodes=True, links_df=None, nodes_df=None)
+

Returns a network graph and list of disconnected subgraphs +as described by a list of their member nodes.

+
+
Parameters
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • ignore_end_nodes – if True, ignores stray singleton nodes

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+
Returns: Tuple of

Network Graph (osmnx flavored networkX DiGraph) +List of disconnected subgraphs described by the list of their

+
+

member nodes (as described by their model_node_id)

+
+
+
+
+ +
+
+build_selection_key(selection_dict)
+

Selections are stored by a key combining the query and the A and B ids. +This method combines the two for you based on the selection dictionary.

+
+
Parameters
+

selection_dictionary – Selection Dictionary

+
+
+

Returns: Tuple serving as the selection key.

+
+
Return type
+

tuple

+
+
+
+ +
+
+calculate_area_type(area_type_shape=None, area_type_shape_variable=None, network_variable='area_type', area_type_codes_dict=None, downtown_area_type_shape=None, downtown_area_type=None, overwrite=False)[source]
+

#MC +Calculates area type variable.

+

This uses the centroid of the geometry field to determine which area it should be labeled. +This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters
+
    +
  • area_type_shape (str) – The File path to area geodatabase.

  • +
  • area_type_shape_variable (str) – The variable name of area type in area geodatabase.

  • +
  • network_variable (str) – The variable name of area type in network standard. Default to “area_type”.

  • +
  • area_type_codes_dict – The dictionary to map input area_type_shape_variable to network_variable

  • +
  • downtown_area_type_shape – The file path to the downtown area type boundary.

  • +
  • downtown_area_type (int) – Integer value of downtown area type

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_centroidconnect(parameters, network_variable='centroidconnect', highest_taz_number=None, as_integer=True, overwrite=False)[source]
+

Calculates centroid connector variable.

+
+
Parameters
+
    +
  • parameters (Parameters) – A Lasso Parameters, which stores input files.

  • +
  • network_variable (str) – Variable that should be written to in the network. Default to “centroidconnect”

  • +
  • highest_taz_number (int) – the max TAZ number in the network.

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

RoadwayNetwork

+
+
+
+ +
+
+calculate_county(county_shape=None, county_shape_variable=None, network_variable='county', county_codes_dict=None, overwrite=False)[source]
+

#MC +Calculates county variable.

+

This uses the centroid of the geometry field to determine which county it should be labeled. +This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters
+
    +
  • county_shape (str) – The File path to county geodatabase.

  • +
  • county_shape_variable (str) – The variable name of county in county geodatabase.

  • +
  • network_variable (str) – The variable name of county in network standard. Default to “county”.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_distance(network_variable='distance', centroidconnect_only=False, overwrite=False)[source]
+

calculate link distance in miles

+
+
Parameters
+
    +
  • centroidconnect_only (Bool) – True if calculating distance for centroidconnectors only. Default to False.

  • +
  • overwrite (Bool) – True if overwriting existing variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+calculate_mpo(county_network_variable='county', network_variable='mpo', as_integer=True, mpo_counties=None, overwrite=False)[source]
+

Calculates mpo variable.
+#MC
+Parameters:
+  • county_variable (str) – Name of the variable where the county names are stored. Default to “county”.
+  • network_variable (str) – Name of the variable that should be written to. Default to “mpo”.
+  • as_integer (bool) – If true, will convert true/false to 1/0s.
+  • mpo_counties (list) – List of county names that are within mpo region.
+  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

+
+
Returns
+

None

+
+
+
+ +
+
+calculate_use(network_variable='use', as_integer=True, overwrite=False)[source]
+

Calculates use variable.

+
+
Parameters
+
    +
  • network_variable (str) – Variable that should be written to in the network. Default to “use”

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+convert_int(int_col_names=[])[source]
+

Convert integer columns

+
+ +
+
+create_ML_variable(network_variable='ML_lanes', overwrite=False)[source]
+

Created ML lanes placeholder for project to write out ML changes

+

ML lanes default to 0; ML info comes from the cube LOG file and is stored in project cards

+
+
Parameters
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns
+

None

+
+
+
+ +
+
+create_calculated_variables()[source]
+

Creates calculated roadway variables.

+
+
Parameters
+

None

+
+
+
+ +
+ +

create dummy connector links between the general purpose and managed lanes

+
+
Parameters
+
    +
  • gp_df – GeoDataFrame +dataframe of general purpose links (where managed lane also exists)

  • +
  • ml_df – GeoDataFrame +dataframe of corresponding managed lane links,

  • +
  • access_lanes – int +number of lanes in access dummy link

  • +
  • egress_lanes – int +number of lanes in egress dummy link

  • +
  • access_roadway – str +roadway type for access dummy link

  • +
  • egress_roadway – str +roadway type for egress dummy link

  • +
  • access_name_prefix – str +prefix for access dummy link name

  • +
  • egress_name_prefix – str +prefix for egress dummy link name

  • +
+
+
+
+ +
+
+create_hov_corridor_variable(network_variable='segment_id', overwrite=False)[source]
+

Created hov corridor placeholder for project to write out corridor changes

+

hov corridor id defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns
+

None

+
+
+
+ +
+
+create_managed_lane_network(keep_same_attributes_ml_and_gp=None, keep_additional_attributes_ml_and_gp=[], managed_lanes_required_attributes=[], managed_lanes_node_id_scalar=None, managed_lanes_link_id_scalar=None, in_place=False)
+

Create a roadway network with managed lanes links separated out. +Add new parallel managed lane links, access/egress links, +and add shapes corresponding to the new links

+
+
Parameters
+
    +
  • keep_same_attributes_ml_and_gp – list of attributes to copy from general purpose +lane to managed lane. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to KEEP_SAME_ATTRIBUTES_ML_AND_GP.

  • +
  • keep_additional_attributes_ml_and_gp – list of additional attributes to add. This is useful +if you want to leave the default attributes and then ALSO some others.

  • +
  • managed_lanes_required_attributes – list of attributes that are required to be specified +in new managed lanes. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_REQUIRED_ATTRIBUTES.

  • +
  • managed_lanes_node_id_scalar – integer value added to original node IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_NODE_ID_SCALAR.

  • +
  • managed_lanes_link_id_scalar – integer value added to original link IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_LINK_ID_SCALAR.

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+

returns: A RoadwayNetwork instance

+
+
Return type
+

RoadwayNetwork

+
+
+
+ +
+
+create_managed_variable(network_variable='managed', overwrite=False)[source]
+

Created placeholder for project to write out managed

+

managed defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns
+

None

+
+
+
+ +
+
+static dataframe_to_fixed_width(df)[source]
+

Convert dataframe to fixed width format, geometry column will not be transformed.

+
+
Parameters
+

df (pandas DataFrame) –

+
+
Returns
+

dataframe with fixed width for each column. +dict: dictionary with columns names as keys, column width as values.

+
+
Return type
+

pandas dataframe

+
+
+
+ +
+
+delete_roadway_feature_change(links, nodes, ignore_missing=True)
+

delete the roadway features defined in the project card. +valid links and nodes defined in the project gets deleted +and shapes corresponding to the deleted links are also deleted.

+
+
Parameters
+
    +
  • links – dict +list of dictionaries

  • +
  • nodes – dict +list of dictionaries

  • +
  • ignore_missing – bool +If True, will only warn about links/nodes that are missing from +network but specified to “delete” in project card +If False, will fail.

  • +
+
+
Return type
+

None

+
+
+
+ +
+
+deletion_map(links, nodes)
+

Shows which links and nodes are deleted from the roadway network

+
+ +
+
+fill_na()[source]
+

Fill na values from create_managed_lane_network()

+
+ +
+
+static from_RoadwayNetwork(roadway_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters
+
    +
  • roadway_network_object (RoadwayNetwork) –

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
Returns
+

ModelRoadwayNetwork

+
+
+
+ +
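
A sketch mirroring the Parameters example on the Starting Out page; here my_scenario.road_net stands in for any existing RoadwayNetwork instance and my_parameters for a parameters dictionary:

+
model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+    my_scenario.road_net, parameters=my_parameters
+)
+
+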
+
+static get_attribute(links_df, join_key, source_shst_ref_df, source_gdf, field_name)[source]
+

Gets attribute from source data using SHST match result.

+
+
Parameters
+
    +
  • links_df (dataframe) – The network dataframe that new attribute should be written to.

  • +
  • join_key (str) – SHST ID variable name used to join source data with network dataframe.

  • +
  • source_shst_ref_df (str) – File path to source data SHST match result.

  • +
  • source_gdf (str) – File path to source data.

  • +
  • field_name (str) – Name of the attribute to get from source data.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+static get_managed_lane_node_ids(nodes_list, scalar=4500000)
+

Transform a list of node IDS by a scalar. +..todo #237 what if node ids are not a number?

+
+
Parameters
+
    +
  • nodes_list – list of integers

  • +
  • scalar – value to add to node IDs

  • +
+
+
+

Returns: list of integers

+
+ +
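
A sketch using the default scalar of 4500000 from the signature above:

+
ModelRoadwayNetwork.get_managed_lane_node_ids([10, 11, 12])
+# [4500010, 4500011, 4500012]
+
+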
+
+static get_modal_graph(links_df, nodes_df, mode=None, modes_to_network_link_variables={'bike': ['bike_access'], 'bus': ['bus_only', 'drive_access'], 'drive': ['drive_access'], 'rail': ['rail_only'], 'transit': ['bus_only', 'rail_only', 'drive_access'], 'walk': ['walk_access']})
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: networkx: osmnx: DiGraph of network

+
+ +
+ +

Returns nodes and link dataframes for specific mode.

+
+
Parameters
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • modes – list of the modes of the network to be kept, must be in drive,`transit`,`rail`,`bus`, +walk, bike. For example, if bike and walk are selected, both bike and walk links will be kept.

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: tuple of DataFrames for links, nodes filtered by mode

+

links with walk access are not marked as having walk access +Issue discussed in https://github.com/wsp-sag/network_wrangler/issues/145 +modal_nodes_df = nodes_df[nodes_df[mode_node_variable] == 1]

+
+ +
+
+get_property_by_time_period_and_group(prop, time_period=None, category=None, default_return=None)
+

Return a series for the properties with a specific group or time period.

+
+
Parameters
+
    +
  • prop (str) – the variable that you want from network

  • +
  • time_period (list(str)) – the time period that you are querying for +i.e. [‘16:00’, ‘19:00’]

  • +
  • category (str or list(str)(Optional)) –

    the group category +i.e. “sov”

    +

    or

    +

    list of group categories in order of search, i.e. +[“hov3”,”hov2”]

    +

  • +
  • default_return (what to return if variable or time period not found. Default is None.) –

  • +
+
+
Return type
+

pandas series

+
+
+
+ +
+
+identify_segment(O_id, D_id, selection_dict={}, mode=None, nodes_df=None, links_df=None)
+
+
Parameters
+
    +
  • endpoints – list of length of two unique keys of nodes making up endpoints of segment

  • +
  • selection_dict – dictionary of link variables to select candidate links from, otherwise will create a graph of ALL links which will be both a RAM hog and could result in odd shortest paths.

  • +
  • segment_variables – list of variables to keep

  • +
+
+
+
+ +
+
+identify_segment_endpoints(mode='', links_df=None, nodes_df=None, min_connecting_links=10, min_distance=None, max_link_deviation=2)
+
+
Parameters
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+ +
+
+is_network_connected(mode=None, links_df=None, nodes_df=None)
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters
+
    +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
+
+
+

Returns: boolean

+
+ +
+
+static load_transform_network(node_filename, link_filename, shape_filename, crs=4326, node_foreign_key='model_node_id', validate_schema=True, **kwargs)
+

Reads roadway network files from disk and transforms them into GeoDataFrames.

+
+
Parameters
+
    +
  • node_filename – file name for nodes.

  • +
  • link_filename – file name for links.

  • +
  • shape_filename – file name for shapes.

  • +
  • crs – coordinate reference system. Defaults to value in CRS.

  • +
  • node_foreign_key – variable linking the node table to the link table. Defaults +to NODE_FOREIGN_KEY.

  • +
  • validate_schema – boolean indicating if network should be validated to schema.

  • +
+
+
+

returns: tuple of GeodataFrames nodes_df, links_df, shapes_df

+
+
Return type
+

tuple

+
+
+
+ +
+
+static network_connection_plot(G, disconnected_subgraph_nodes)
+

Plot a graph to check for network connection.

+
+
Parameters
+
    +
  • G – OSMNX flavored networkX graph.

  • +
  • disconnected_subgraph_nodes – List of disconnected subgraphs described by the list of their +member nodes (as described by their model_node_id).

  • +
+
+
+

returns: fig, ax : tuple

+
+ +
+
+orig_dest_nodes_foreign_key(selection, node_foreign_key='')
+

Returns the foreign key id (whatever is used in the u and v +variables in the links file) for the AB nodes as a tuple.

+
+
Parameters
+
    +
  • selection – selection dictionary with A and B keys

  • +
  • node_foreign_key – variable name for whatever is used by the u and v variables +in the links_df file. If nothing is specified, assume whatever the default is.

  • +
+
+
+

Returns: tuple of (A_id, B_id)

+
+
Return type
+

tuple

+
+
+
+ +
+
+static ox_graph(nodes_df, links_df, node_foreign_key='model_node_id', link_foreign_key=['A', 'B'], unique_link_key='model_link_id')
+

create an osmnx-flavored network graph

+

osmnx doesn’t like values that are arrays, so remove the variables +that have arrays. osmnx also requires that certain variables +be filled in, so do that too.

+
+
Parameters
+
    +
  • nodes_df – GeoDataFrame of nodes

  • +
  • link_df – GeoDataFrame of links

  • +
  • node_foreign_key – field referenced in link_foreign_key

  • +
  • link_foreign_key – list of attributes that define the link start and end nodes to the node foreign key

  • +
  • unique_link_key – primary key for links

  • +
+
+
+

Returns: a networkx multidigraph

+
+ +
+ +
+
Parameters
+
    +
  • candidate_links – selection of links geodataframe with links likely to be part of path

  • +
  • O_id – origin node foreign key ID

  • +
  • D_id – destination node foreign key ID

  • +
  • weight_column – column to use for weight of shortest path. Defaults to “i” (iteration)

  • +
  • weight_factor – optional weight to multiply the weight column by when finding the shortest path

  • +
  • search_breadth

  • +
+
+
+

Returns

+
+ +
+
+static read(link_filename, node_filename, shape_filename, fast=False, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Reads in links and nodes in the network standard format.

+
+
Parameters
+
    +
  • link_filename (str) – File path to link json.

  • +
  • node_filename (str) – File path to node geojson.

  • +
  • shape_filename (str) – File path to link true shape geojson

  • +
  • fast (bool) – boolean that will skip validation to speed up read time.

  • +
  • recalculate_calculated_variables (bool) – calculates fields from spatial joins, etc.

  • +
  • recalculate_distance (bool) – re-calculates distance.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns
+

ModelRoadwayNetwork

+
+
+
+ +
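A hedged example of reading a standard network from disk; the file paths are placeholders and the ModelRoadwayNetwork class name is assumed from the documented return type.

from lasso import ModelRoadwayNetwork  # class name assumed

net = ModelRoadwayNetwork.read(
    link_filename="link.json",        # placeholder paths
    node_filename="node.geojson",
    shape_filename="shape.geojson",
    fast=True,                        # skip schema validation for speed
    parameters={},                    # fall back to default Parameters
)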
+
+static read_match_result(path)[source]
+

Reads the shst geojson match returns.

+

Returns shst dataframe.

+

Reads many files of the same type and concatenates them into a single DataFrame.

+
+
Parameters
+

path (str) – File path to SHST match results.

+
+
Returns
+

geopandas geodataframe

+
+
Return type
+

geodataframe

+
+
+

##todo: not sure why we need this, but it should be in utilities, not this class.

+
+ +
+
+rename_variables_for_dbf(input_df, variable_crosswalk=None, output_variables=None, convert_geometry_to_xy=False)[source]
+

Rename attributes for DBF/SHP, make sure length within 10 chars.

+
+
Parameters
+
    +
  • input_df (dataframe) – Network standard DataFrame.

  • +
  • variable_crosswalk (str) – File path to variable name crosswalk from network standard to DBF names.

  • +
  • output_variables (list) – List of strings for DBF variables.

  • +
  • convert_geometry_to_xy (bool) – True if converting node geometry to X/Y

  • +
+
+
Returns
+

dataframe

+
+
+
+ +
+
+static roadway_net_to_gdf(roadway_net)
+

Turn the roadway network into a GeoDataFrame +:param roadway_net: the roadway network to export

+

returns: shapes dataframe

+
+
Return type
+

GeoDataFrame

+
+
+
+ +
+
+roadway_standard_to_met_council_network(output_epsg=None)[source]
+

Rename and format roadway attributes to be consistent with what MetCouncil's model is expecting.
#MC
:param output_epsg: EPSG number of output network.
:type output_epsg: int

+
+
Returns
+

None

+
+
+
+ +
+
+select_roadway_features(selection, search_mode='drive', force_search=False, sp_weight_factor=None)
+

Selects roadway features that satisfy selection criteria

+
+
Example usage:
+
net.select_roadway_features(
    selection = [{
        # a match condition for the from node using osm,
        # shared streets, or model node number
        'from': {'osm_model_link_id': '1234'},
        # a match for the to-node
        'to': {'shstid': '4321'},
        # a regex or match for facility condition
        # could be # of lanes, facility type, etc.
        'facility': {'name': 'Main St'},
    }, ...]
)

+
+
+
+
+
+
+
+
Parameters
+
    +
  • selection – dictionary with keys for: +A - from node +B - to node +link - which includes at least a variable for name

  • +
  • search_mode – mode which you are searching for; defaults to “drive”

  • +
  • force_search – boolean directing method to perform search even if one +with same selection dict is stored from a previous search.

  • +
  • sp_weight_factor – multiple used to discourage shortest paths which meander from the original search returned from the name or ref query. If not set here, will default to the value of sp_weight_factor in the RoadwayNetwork instance. If not set there, will default to SP_WEIGHT_FACTOR.

  • +
+
+
+

Returns: a list of link IDs in selection

+
+
Return type
+

GeoDataFrame

+
+
+
+ +
+ +
+
Parameters
+

selection_dictionary – Dictionary representation of selection +of roadway features, containing a “link” key.

+
+
+
+
Returns: A boolean indicating if the selection dictionary contains

a unique identifier for links.

+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+selection_map(selected_link_idx, A=None, B=None, candidate_link_idx=[])
+

Shows which links are selected for roadway property change or parallel +managed lanes category of roadway projects.

+
+
Parameters
+
    +
  • selected_links_idx – list of selected link indices

  • +
  • candidate_links_idx – optional list of candidate link indices to also include in map

  • +
  • A – optional foreign key of starting node of a route selection

  • +
  • B – optional foreign key of ending node of a route selection

  • +
+
+
+
+ +
+
+shortest_path(graph_links_df, O_id, D_id, nodes_df=None, weight_column='i', weight_factor=100)
+
+
Parameters
+
    +
  • graph_links_df

  • +
  • O_id – foreign key for start node

  • +
  • D_id – foreign key for end node

  • +
  • nodes_df – optional nodes df, otherwise will use network instance

  • +
  • weight_column – column to use as a weight, defaults to “i”

  • +
  • weight_factor – any additional weighting to multiply the weight column by, defaults to SP_WEIGHT_FACTOR

  • +
+
+
+

Returns: tuple of length four:
- boolean indicating whether a shortest path was found
- nx directed graph of graph links
- route of shortest path nodes as a list
- links in the shortest path selected from links_df

+
+
Return type
+

tuple

+
+
+
+ +
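A hedged sketch of unpacking the return tuple; candidate_links_df, o_id and d_id are hypothetical inputs, and net is assumed to be an instance of this class.

# Sketch: candidate_links_df, o_id and d_id are hypothetical inputs.
found, sp_graph, route_nodes, route_links_df = net.shortest_path(
    candidate_links_df,
    o_id,
    d_id,
    weight_column="i",
    weight_factor=100,
)
if found:
    print(f"Shortest path visits {len(route_nodes)} nodes")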
+
+split_properties_by_time_period_and_category(properties_to_split=None)[source]
+

Splits properties by time period, assuming a variable structure of

+
+
Parameters
+

properties_to_split

dict: dictionary of output variable prefix mapped to the source variable and what to stratify it by, e.g.

{
    'lanes': {'v': 'lanes', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    'ML_lanes': {'v': 'ML_lanes', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    'use': {'v': 'use', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
}

+

+
+
+
+ +
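A hedged sketch of calling this method; the dictionary mirrors the docstring example above (the key spelling is kept exactly as shown there), and net is assumed to be an instance of this class.

# Sketch: mirrors the docstring example above.
net.split_properties_by_time_period_and_category(
    {
        "lanes": {
            "v": "lanes",
            "times_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
        },
        "ML_lanes": {
            "v": "ML_lanes",
            "times_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
        },
    }
)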
+
+update_distance(links_df=None, use_shapes=False, units='miles', network_variable='distance', overwrite=True, inplace=True)
+

Calculate link distance in specified units to network variable using either straight line +distance or (if specified) shape distance if available.

+
+
Parameters
+
    +
  • links_df – Links GeoDataFrame. Useful if want to update a portion of network links +(i.e. only centroid connectors). If not provided, will use entire self.links_df.

  • +
  • use_shapes – if True, will add length information from self.shapes_df rather than crow-fly. +If no corresponding shape found in self.shapes_df, will default to crow-fly.

  • +
  • units – units to use. Defaults to the standard unit of miles. Available units: “meters”, “miles”.

  • +
  • network_variable – variable to store link distance in. Defaults to “distance”.

  • +
  • overwrite – Defaults to True and will overwrite all existing calculated distances. +False will only update NaNs.

  • +
  • inplace – updates self.links_df

  • +
+
+
Returns
+

links_df with updated distance

+
+
+
+ +
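A minimal sketch, assuming net is an already-loaded instance of this class: recompute all link distances in miles, preferring true shapes where available.

# Prefer lengths from self.shapes_df, falling back to crow-fly where no shape exists.
net.update_distance(use_shapes=True, units="miles", overwrite=True, inplace=True)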
+ +

Validate roadway network data link schema and output a boolean

+
+ +
+
+static validate_node_schema(node_file, schema_location='roadway_network_node.json')
+

Validate roadway network data node schema and output a boolean

+
+ +
+
+validate_properties(properties, ignore_existing=False, require_existing_for_change=False)
+

If there are change or existing commands, make sure that the property exists in the network.

+
+
Parameters
+
    +
  • properties – properties dictionary to be evaluated

  • +
  • ignore_existing – If True, will only warn about properties +that specify an “existing” value. If False, will fail.

  • +
  • require_existing_for_change – If True, will fail if there isn’t a specified value in the project card for existing when a change is specified.

  • +
+
+
+

Returns: boolean value as to whether the properties dictionary is valid.

+
+
Return type
+

bool

+
+
+
+ +
+
+validate_selection(selection, selection_requires=['link'])
+

Evaluate whether the selection dictionary contains the minimum required values.

+
+
Parameters
+

selection – selection dictionary to be evaluated

+
+
+

Returns: boolean value as to whether the selection dictionary is valid.

+
+
Return type
+

bool

+
+
+
+ +
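A hedged sketch, assuming net is an instance of this class; the selection dictionary mirrors the select_roadway_features example earlier on this page.

# Sketch: the selection dict mirrors the select_roadway_features example above.
selection = {
    "A": {"osm_model_link_id": "1234"},
    "B": {"shstid": "4321"},
    "link": [{"name": "Main St"}],
}
is_valid = net.validate_selection(selection, selection_requires=["link"])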
+
+static validate_shape_schema(shape_file, schema_location='roadway_network_shape.json')
+

Validate roadway network data shape schema and output a boolean

+
+ +
+
+validate_uniqueness()
+

Confirms that the unique identifier requirements are met.

+
+
Return type
+

bool

+
+
+
+ +
+
+write(path='.', filename=None)
+

Writes a network in the roadway network standard

+
+
Parameters
+
    +
  • path – the path where the output will be saved

  • +
  • filename – the name prefix of the roadway files that will be generated

  • +
+
+
Return type
+

None

+
+
+
+ +
+
+write_roadway_as_fixedwidth(output_dir, node_output_variables=None, link_output_variables=None, output_link_txt=None, output_node_txt=None, output_link_header_width_txt=None, output_node_header_width_txt=None, output_cube_network_script=None, drive_only=False)[source]
+

Writes out fixed width file.

+

This function does:
1. write out link and node fixed width data files for cube.
2. write out header and width correspondence.
3. write out cube network building script with header and width specification.

+
+
Parameters
+
    +
  • output_dir (str) – File path to where links, nodes and script will be written and run

  • +
  • node_output_variables (list) – list of node variable names.

  • +
  • link_output_variables (list) – list of link variable names.

  • +
  • output_link_txt (str) – File name of output link database (within output_dir)

  • +
  • output_node_txt (str) – File name of output node database (within output_dir)

  • +
  • output_link_header_width_txt (str) – File name of link column width records (within output_dir)

  • +
  • output_node_header_width_txt (str) – File name of node column width records (within output_dir)

  • +
  • output_cube_network_script (str) – File name of CUBE network building script (within output_dir)

  • +
  • drive_only (bool) – If True, only writes drive nodes and links

  • +
+
+
Returns
+

None

+
+
+
+ +
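A hedged sketch of a call to this method, assuming net is an instance of this class; the directory and file names are placeholders, not values taken from the source.

# Sketch: writes Cube fixed-width files and a build script into a scratch folder.
net.write_roadway_as_fixedwidth(
    output_dir="tests/scratch",                    # placeholder directory
    output_link_txt="links.txt",
    output_node_txt="nodes.txt",
    output_link_header_width_txt="links_header_width.txt",
    output_node_header_width_txt="nodes_header_width.txt",
    output_cube_network_script="make_network.s",   # placeholder name
    drive_only=False,
)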
+
+write_roadway_as_shp(output_dir, node_output_variables=None, link_output_variables=None, data_to_csv=True, data_to_dbf=False, output_link_shp=None, output_node_shp=None, output_link_csv=None, output_node_csv=None, output_gpkg=None, output_link_gpkg_layer=None, output_node_gpkg_layer=None, output_gpkg_link_filter=None)[source]
+

Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names.

+
+
Parameters
+
    +
  • output_dir (str) – File path to directory

  • +
  • node_output_variables (list) – List of strings for node output variables.

  • +
  • link_output_variables (list) – List of strings for link output variables.

  • +
  • data_to_csv (bool) – True if write network in csv format.

  • +
  • data_to_dbf (bool) – True if write network in dbf/shp format.

  • +
  • output_link_shp (str) – File name to output link dbf/shp.

  • +
  • output_node_shp (str) – File name of output node dbf/shp.

  • +
  • output_link_csv (str) – File name to output link csv.

  • +
  • output_node_csv (str) – File name to output node csv.

  • +
  • output_gpkg (str) – File name to output GeoPackage.

  • +
  • output_link_gpkg_layer (str) – Layer name within output_gpkg to output links.

  • +
  • output_node_gpkg_layer (str) – Layer name within output_gpkg to output nodes.

  • +
  • output_gpkg_link_filter (str) – Optional column name used to output additional link subset layers.

  • +
+
+
Returns
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_generated/lasso.Parameters/index.html b/branch/bicounty/_generated/lasso.Parameters/index.html new file mode 100644 index 0000000..5c07783 --- /dev/null +++ b/branch/bicounty/_generated/lasso.Parameters/index.html @@ -0,0 +1,554 @@ + + + + + + + lasso.Parameters — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.Parameters

+
+
+class lasso.Parameters(**kwargs)[source]
+

Bases: object

+

A class representing all the parameters defining the networks +including time of day, categories, etc.

+

Parameters can be set at runtime by initializing a Parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use the default parameters listed in this class.
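For example (a hedged sketch; the keyword shown is one of the attributes listed below and its value is the documented default path):

from lasso import Parameters

# Override one default at runtime; unspecified attributes keep the class defaults.
parameters = Parameters(lanes_lookup_file="metcouncil_data/lookups/lanes.csv")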

+
+
Attr:
+
time_period_to_time (dict): Maps time period abbreviations used in

Cube to time of days used on gtfs and highway network standard +Default:

+
{
+    "EA": ("3:00", "6:00"),
+    "AM": ("6:00, "10:00"),
+    "MD": ("10:00", "15:00"),
+    "PM": ("15:00", "19:00"),
+    "EV": ("19:00", "3:00"),
+}
+
+
+
+
cube_time_periods (dict): Maps cube time period numbers used in

transit line files to the time period abbreviations in time_period_to_time +dictionary. +Default:

+
{"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"}
+
+
+
+
categories (dict): Maps demand category abbreviations to a list of

network categories they are allowed to use. +Default:

+
{
+    # suffix, source (in order of search)
+    "sov": ["sov", "default"],
+    "hov2": ["hov2", "default", "sov"],
+    "hov3": ["hov3", "hov2", "default", "sov"],
+    "truck": ["trk", "sov", "default"],
+}
+
+
+
+
properties_to_split (dict): Dictionary mapping variables in standard

roadway network to categories and time periods that need to be +split out in final model network to get variables like LANES_AM. +Default:

+
{
+    "lanes": {
+        "v": "lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "ML_lanes": {
+        "v": "ML_lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "use": {
+        "v": "use",
+        "time_periods": self.time_periods_to_time
+    },
+}
+
+
+
+
county_shape (str): File location of shapefile defining counties.

Default:

+
r"metcouncil_data/county/cb_2017_us_county_5m.shp"
+
+
+
+
county_variable_shp (str): Property defining the county name in

the county_shape file. +Default:

+
NAME
+
+
+
+
lanes_lookup_file (str): Lookup table of number of lanes for different data sources.

Default:

+
r"metcouncil_data/lookups/lanes.csv"
+
+
+
+
centroid_connect_lanes (int): Number of lanes for centroid connectors.

Default:

+
1
+
+
+
+
mpo_counties (list): list of county names within MPO boundary.

Default:

+
[
+    "ANOKA",
+    "DAKOTA",
+    "HENNEPIN",
+    "RAMSEY",
+    "SCOTT",
+    "WASHINGTON",
+    "CARVER",
+]
+
+
+
+
taz_shape (str):

Default:

+
r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp"
+
+
+
+
taz_data (str):

Default:

+
??
+
+
+
+
highest_taz_number (int): highest TAZ number in order to define

centroid connectors. +Default:

+
3100
+
+
+
+
output_variables (list): list of variables to output in final model

network. +Default:

+
[
+    "model_link_id",
+    "link_id",
+    "A",
+    "B",
+    "shstGeometryId",
+    "distance",
+    "roadway",
+    "name",
+    "roadway_class",
+    "bike_access",
+    "walk_access",
+    "drive_access",
+    "truck_access",
+    "trn_priority_EA",
+    "trn_priority_AM",
+    "trn_priority_MD",
+    "trn_priority_PM",
+    "trn_priority_EV",
+    "ttime_assert_EA",
+    "ttime_assert_AM",
+    "ttime_assert_MD",
+    "ttime_assert_PM",
+    "ttime_assert_EV",
+    "lanes_EA",
+    "lanes_AM",
+    "lanes_MD",
+    "lanes_PM",
+    "lanes_EV",
+    "price_sov_EA",
+    "price_hov2_EA",
+    "price_hov3_EA",
+    "price_truck_EA",
+    "price_sov_AM",
+    "price_hov2_AM",
+    "price_hov3_AM",
+    "price_truck_AM",
+    "price_sov_MD",
+    "price_hov2_MD",
+    "price_hov3_MD",
+    "price_truck_MD",
+    "price_sov_PM",
+    "price_hov2_PM",
+    "price_hov3_PM",
+    "price_truck_PM",
+    "price_sov_EV",
+    "price_hov2_EV",
+    "price_hov3_EV",
+    "price_truck_EV",
+    "roadway_class_idx",
+    "facility_type",
+    "county",
+    "centroidconnect",
+    "model_node_id",
+    "N",
+    "osm_node_id",
+    "bike_node",
+    "transit_node",
+    "walk_node",
+    "drive_node",
+    "geometry",
+    "X",
+    "Y",
+    "ML_lanes_EA",
+    "ML_lanes_AM",
+    "ML_lanes_MD",
+    "ML_lanes_PM",
+    "ML_lanes_EV",
+    "segment_id",
+    "managed",
+    "bus_only",
+    "rail_only"
+]
+
+
+
+
osm_facility_type_dict (dict): Mapping between OSM Roadway variable

and facility type. Default:

+
+
area_type_shape (str): Location of shapefile defining area type.

Default:

+
r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp"
+
+
+
+
area_type_variable_shp (str): property in area_type_shape with area

type in it. +Default:

+
"COMDES2040"
+
+
+
+
area_type_code_dict (dict): Mapping of the area_type_variable_shp to

the area type code used in the MetCouncil cube network. +Default:

+
{
+    23: 4,  # urban center
+    24: 3,
+    25: 2,
+    35: 2,
+    36: 1,
+    41: 1,
+    51: 1,
+    52: 1,
+    53: 1,
+    60: 1,
+}
+
+
+
+
downtown_area_type_shape (str): Location of shapefile defining downtown area type.

Default:

+
r"metcouncil_data/area_type/downtownzones_TAZ.shp"
+
+
+
+
downtown_area_type (int): Area type integer for downtown.

Default:

+
5
+
+
+
+
mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property

associated with roadway class. Default:

+
r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp"
+
+
+
+
mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp

associated with roadway class. Default:

+
"ROUTE_SYS"
+
+
+
+
widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property

associated with roadway class. Default:

+
r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp"
+
+
+
+
widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape

associated with roadway class.Default:

+
"RDWY_CTGY_"
+
+
+
+
mndot_count_shape (str): Shapefile of MnDOT links with a property

associated with counts. Default:

+
r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp"
+
+
+
+
mndot_count_variable_shp (str): The property in mndot_count_shape

associated with counts. Default:

+
+
::

“lookups/osm_highway_facility_type_crosswalk.csv”

+
+
+
+
legacy_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from Legacy TM2 network. Default:

+
"lookups/legacy_tm2_attributes.csv"
+
+
+
+
osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId

from OSM. Default:

+
"lookups/osm_lanes_attributes.csv"
+
+
+
+
tam_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from TAM TM2 network. Default:

+
"lookups/tam_tm2_attributes.csv"
+
+
+
+
tom_tom_attributes (str): CSV file of link attributes by

shStReferenceId from TomTom network. Default:

+
"lookups/tomtom_attributes.csv"
+
+
+
+
sfcta_attributes (str): CSV file of link attributes by

shStReferenceId from SFCTA network. Default:

+
"lookups/sfcta_attributes.csv"
+
+
+
+
output_epsg (int): EPSG type of geographic projection for output

shapefiles. Default:

+
102646
+
+
+
+
output_link_shp (str): Output shapefile for roadway links. Default:
+
::

r”tests/scratch/links.shp”

+
+
+
+
output_node_shp (str): Output shapefile for roadway nodes. Default:
+
::

r”tests/scratch/nodes.shp”

+
+
+
+
output_link_csv (str): Output csv for roadway links. Default:
+
::

r”tests/scratch/links.csv”

+
+
+
+
output_node_csv (str): Output csv for roadway nodes. Default:
+
::

r”tests/scratch/nodes.csv”

+
+
+
+
output_link_txt (str): Output fixed format txt for roadway links. Default:
+
::

r”tests/scratch/links.txt”

+
+
+
+
output_node_txt (str): Output fixed format txt for roadway nodes. Default:
+
::

r”tests/scratch/nodes.txt”

+
+
+
+
output_link_header_width_txt (str): Header for txt roadway links. Default:
+
::

r”tests/scratch/links_header_width.txt”

+
+
+
+
output_node_header_width_txt (str): Header for txt for roadway Nodes. Default:
+
::

r”tests/scratch/nodes_header_width.txt”

+
+
+
+
output_cube_network_script (str): Cube script for importing

fixed-format roadway network. Default:

+
r"tests/scratch/make_complete_network_from_fixed_width_file.s
+
+
+
+
+
+
+
+
+__init__(**kwargs)[source]
+

Time period and category splitting info

+
+ +

Methods

+ ++++ + + + + + +

__init__(**kwargs)

Time period and category splitting info

+

Attributes

+ ++++ + + + + + + + + + + + + + + +

cube_time_periods

#MC self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

properties_to_split

Details for calculating the county based on the centroid of the link.

county_link_range_dict

self.county_code_dict = {

zones

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+
self.county_code_dict = {
    "Anoka": 1,
    "Carver": 2,
    "Dakota": 3,
    "Hennepin": 4,
    "Ramsey": 5,
    "Scott": 6,
    "Washington": 7,
    "external": 10,
    "Chisago": 11,
    "Goodhue": 12,
    "Isanti": 13,
    "Le Sueur": 14,
    "McLeod": 15,
    "Pierce": 16,
    "Polk": 17,
    "Rice": 18,
    "Sherburne": 19,
    "Sibley": 20,
    "St. Croix": 21,
    "Wright": 22,
}

+
+ +
+
+cube_time_periods
+

#MC
self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

+

self.route_type_mode_dict = {0: 8, 2: 9}

+

self.cube_time_periods = {"1": "AM", "2": "MD"}
self.cube_time_periods_name = {"AM": "pk", "MD": "op"}

+
+ +
+
+properties_to_split
+

Details for calculating the county based on the centroid of the link. The NAME variable should be the name of a field in the shapefile.

+
+ +
+
+zones
+

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_generated/lasso.Project/index.html b/branch/bicounty/_generated/lasso.Project/index.html new file mode 100644 index 0000000..99606cc --- /dev/null +++ b/branch/bicounty/_generated/lasso.Project/index.html @@ -0,0 +1,520 @@ + + + + + + + lasso.Project — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.Project

+
+
+class lasso.Project(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Bases: object

+

A single or set of changes to the roadway or transit system.

+

Compares a base and a build transit network or a base and build +highway network and produces project cards.

+

Typical usage example:

+
test_project = Project.create_project(
+    base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+test_project.evaluate_changes()
+test_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+)
+
+
+
+
+DEFAULT_PROJECT_NAME
+

a class-level constant that defines what +the project name will be if none is set.

+
+ +
+
+STATIC_VALUES
+

a class-level constant which defines values that +are not evaluated when assessing changes.

+
+ +
+
+card_data
+

{“project”: <project_name>, “changes”: <list of change dicts>}

+
+
Type
+

dict

+
+
+
+ +
+
+roadway_link_changes
+

pandas dataframe of CUBE roadway link changes.

+
+
Type
+

DataFrame

+
+
+
+ +
+
+roadway_node_changes
+

pandas dataframe of CUBE roadway node changes.

+
+
Type
+

DataFrame

+
+
+
+ +
+
+transit_changes
+
+
Type
+

CubeTransit

+
+
+
+ +
+
+base_roadway_network
+
+
Type
+

RoadwayNetwork

+
+
+
+ +
+
+base_cube_transit_network
+
+
Type
+

CubeTransit

+
+
+
+ +
+
+build_cube_transit_network
+
+
Type
+

CubeTransit

+
+
+
+ +
+
+project_name
+

name of the project, set to DEFAULT_PROJECT_NAME if not provided

+
+
Type
+

str

+
+
+
+ +
+
+parameters
+

an instance of the Parameters class which sets a bunch of parameters

+
+ +
+
+__init__(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

ProjectCard constructor.

+
+
Parameters
+
    +
  • roadway_link_changes – dataframe of roadway changes read from a log file

  • +
  • roadway_node_changes – dataframe of roadway changes read from a log file

  • +
  • transit_changes – dataframe of transit changes read from a log file

  • +
  • base_roadway_network – RoadwayNetwork instance for base case

  • +
  • base_transit_network – StandardTransit instance for base case

  • +
  • base_cube_transit_network – CubeTransit instance for base transit network

  • +
  • build_cube_transit_network – CubeTransit instance for build transit network

  • +
  • project_name – name of the project

  • +
  • evaluate – defaults to false, but if true, will create card data

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
+

returns: instance of ProjectCard

+
+ +

Methods

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([roadway_link_changes, ...])

ProjectCard constructor.

add_highway_changes([...])

Evaluates changes from the log file based on the base highway object and adds entries into the self.card_data dictionary.

add_transit_changes()

Evaluates changes between base and build transit objects and adds entries into the self.card_data dictionary.

create_project([roadway_log_file, ...])

Constructor for a Project instance.

determine_roadway_network_changes_compatibility(...)

Checks to see that any links or nodes that change exist in base roadway network.

emme_id_to_wrangler_id(emme_link_change_df, ...)

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

emme_name_to_wrangler_name(...)

rename emme names to wrangler names using crosswalk file

evaluate_changes()

Determines which changes should be evaluated, initiates self.card_data to be an aggregation of transit and highway changes.

get_object_from_network_build_command()

determine the network build object is node or link

get_operation_from_network_build_command()

determine the network build object action type

read_logfile(logfilename)

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

read_network_build_file(networkbuildfilename)

Reads an emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

write_project_card([filename])

Writes project cards.

+

Attributes

+ ++++ + + + + + + + + + + + +

CALCULATED_VALUES

DEFAULT_PROJECT_NAME

STATIC_VALUES

+
+
+add_highway_changes(limit_variables_to_existing_network=False)[source]
+

Evaluates changes from the log file based on the base highway object and +adds entries into the self.card_data dictionary.

+
+
Parameters
+

limit_variables_to_existing_network (bool) – True if no ad-hoc variables. Default to False.

+
+
+
+ +
+
+add_transit_changes()[source]
+

Evaluates changes between base and build transit objects and +adds entries into the self.card_data dictionary.

+
+ +
+
+static create_project(roadway_log_file=None, roadway_shp_file=None, roadway_csv_file=None, network_build_file=None, emme_node_id_crosswalk_file=None, emme_name_crosswalk_file=None, base_roadway_dir=None, base_transit_dir=None, base_cube_transit_source=None, build_cube_transit_source=None, roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name=None, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Constructor for a Project instance.

+
+
Parameters
+
    +
  • roadway_log_file – File path to consuming logfile or a list of logfile paths.

  • +
  • roadway_shp_file – File path to consuming shape file for roadway changes.

  • +
  • roadway_csv_file – File path to consuming csv file for roadway changes.

  • +
  • network_build_file – File path to consuming EMME network build for network changes.

  • +
  • base_roadway_dir – Folder path to base roadway network.

  • +
  • base_transit_dir – Folder path to base transit network.

  • +
  • base_cube_transit_source – Folder path to base transit network or cube line file string.

  • +
  • base_cube_transit_file – File path to base transit network.

  • +
  • build_cube_transit_source – Folder path to build transit network or cube line file string.

  • +
  • build_cube_transit_file – File path to build transit network.

  • +
  • roadway_link_changes – pandas dataframe of CUBE roadway link changes.

  • +
  • roadway_node_changes – pandas dataframe of CUBE roadway node changes.

  • +
  • transit_changes – build transit changes.

  • +
  • base_roadway_network – Base roadway network object.

  • +
  • base_cube_transit_network – Base cube transit network object.

  • +
  • build_cube_transit_network – Build cube transit network object.

  • +
  • project_name – If not provided, will default to the roadway_log_file filename if +provided (or the first filename if a list is provided)

  • +
  • recalculate_calculated_variables – if reading in a base network, if this is true it +will recalculate variables such as area type, etc. This only needs to be true +if you are creating project cards that are changing the calculated variables.

  • +
  • recalculate_distance – recalculate the distance variable. This only needs to be +true if you are creating project cards that change the distance.

  • +
  • parameters – dictionary of parameters

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in +the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables +in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns
+

A Project instance.

+
+
+
+ +
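A hedged sketch of the log-file workflow, following the class-level usage example; the file and folder paths are placeholders.

from lasso import Project

# Sketch: build project cards from a Cube log file against a base network.
project = Project.create_project(
    roadway_log_file="roadway_changes.log",   # placeholder path
    base_roadway_dir="base_network/",         # placeholder path
    project_name="my_roadway_project",
)
project.evaluate_changes()
project.write_project_card("my_roadway_project.yml")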
+
+static determine_roadway_network_changes_compatibility(base_roadway_network, roadway_link_changes, roadway_node_changes, parameters)[source]
+

Checks to see that any links or nodes that change exist in base roadway network.

+
+ +
+
+static emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file)[source]
+

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

+
+ +
+
+static emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file)[source]
+

rename emme names to wrangler names using crosswalk file

+
+ +
+
+evaluate_changes()[source]
+

Determines which changes should be evaluated, initiates +self.card_data to be an aggregation of transit and highway changes.

+
+ +
+
+get_object_from_network_build_command()[source]
+

determine whether the network build object is a node or a link

+
+
Parameters
+

row – network build command history dataframe

+
+
Returns
+

‘N’ for node, ‘L’ for link

+
+
+
+ +
+
+get_operation_from_network_build_command()[source]
+

determine the network build object action type

+
+
Parameters
+

row – network build command history dataframe

+
+
Returns
+

‘A’, ‘C’, ‘D’

+
+
+
+ +
+
+static read_logfile(logfilename)[source]
+

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters
+

logfilename (str or list[str]) – File path to CUBE logfile or list of logfile paths.

+
+
Returns
+

A DataFrame representation of the log file.

+
+
+
+ +
+
+static read_network_build_file(networkbuildfilename)[source]
+

Reads an emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters
+

networkbuildfilename (str or list[str]) – File path to emme network build file or list of network build file paths.

+
+
Returns
+

A DataFrame representation of the network build file

+
+
+
+ +
+
+write_project_card(filename=None)[source]
+

Writes project cards.

+
+
Parameters
+

filename (str) – File path to output .yml

+
+
Returns
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'assign_group', 'centroidconnect']
+
+ +
+
+DEFAULT_PROJECT_NAME = 'USER TO define'
+
+ +
+
+STATIC_VALUES = ['model_link_id', 'area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_generated/lasso.StandardTransit/index.html b/branch/bicounty/_generated/lasso.StandardTransit/index.html new file mode 100644 index 0000000..caa6df0 --- /dev/null +++ b/branch/bicounty/_generated/lasso.StandardTransit/index.html @@ -0,0 +1,415 @@ + + + + + + + lasso.StandardTransit — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.StandardTransit

+
+
+class lasso.StandardTransit(ptg_feed, parameters={})[source]
+

Bases: object

+

Holds a standard transit feed as a Partridge object and contains +methods to manipulate and translate the GTFS data to MetCouncil’s +Cube Line files.

+

Typical usage example:

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+feed
+

Partridge Feed object containing read-only access to GTFS feed

+
+ +
+
+parameters
+

Parameters instance containing information +about time periods and variables.

+
+
Type
+

Parameters

+
+
+
+ +
+
+__init__(ptg_feed, parameters={})[source]
+
+
Parameters
+
    +
  • ptg_feed – partridge feed object

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters

  • +
+
+
+
+ +

Methods

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(ptg_feed[, parameters])

+
param ptg_feed
+

partridge feed object

+
+
+

calculate_cube_mode(row)

Assigns a cube mode number by following logic.

cube_format(row)

Creates a string representing the route in cube line file notation. #MC :param row: row of a DataFrame representing a cube-formatted trip, with the Attributes trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR.

evaluate_differences(transit_changes)

Compares changes from the transit_changes dataframe with the standard transit network and returns the project card changes in dictionary format

fromTransitNetwork(transit_network_object[, ...])

RoadwayNetwork to ModelRoadwayNetwork

read_gtfs(gtfs_feed_dir[, parameters])

Reads GTFS files from a directory and returns a StandardTransit instance.

route_properties_gtfs_to_cube(self)

Prepare gtfs for cube lin file.

shape_gtfs_to_cube(row[, add_nntime])

Creates a list of nodes for the route in the appropriate cube format.

shape_gtfs_to_emme(trip_row)

Creates transit segment for the trips in appropriate emme format.

time_to_cube_time_period(start_time_secs[, ...])

Converts seconds from midnight to the cube time period.

write_as_cube_lin([outpath])

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties.

+
+
+calculate_cube_mode(row)[source]
+

Assigns a cube mode number by the following logic.
#MC
For rail, uses the GTFS route_type variable:
https://developers.google.com/transit/gtfs/reference

+
+
::

# route_type : cube_mode
route_type_to_cube_mode = {
    0: 8,  # Tram, Streetcar, Light rail
    3: 0,  # Bus; further disaggregated for cube
    2: 9,  # Rail
}

+
+
+
+

For buses, uses route id numbers and route name to find +express and suburban buses as follows:

+
+
::
+
if not cube_mode:
    if 'express' in row['LONGNAME'].lower():
        cube_mode = 7  # Express
    elif int(row['route_id'].split("-")[0]) > 99:
        cube_mode = 6  # Suburban Local
    else:
        cube_mode = 5  # Urban Local

+
+
+
+
+
+
+
+
Parameters
+

row – A DataFrame row with route_type, route_long_name, and route_id

+
+
Returns
+

cube mode number

+
+
+
+ +
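A hedged sketch of one call, assuming cube_transit_net is a StandardTransit instance (as in the class-level usage example) and using a hypothetical trip row keyed the way the logic block above reads it.

import pandas as pd

# Hypothetical bus trip row: route_type 3 maps to 0 (bus), then "express"
# in the long name yields cube mode 7 per the logic above.
row = pd.Series({"route_type": 3, "LONGNAME": "Express - Downtown", "route_id": "94-75"})
cube_mode = cube_transit_net.calculate_cube_mode(row)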
+
+cube_format(row)[source]
+

Creates a string representing the route in cube line file notation.
#MC
:param row: row of a DataFrame representing a cube-formatted trip, with the Attributes

+
+

trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR

+
+
+
Returns
+

string representation of route in cube line file notation

+
+
+
+ +
+
+evaluate_differences(transit_changes)[source]
+

Compares changes from the transit_changes dataframe with the standard transit network and returns the project card changes in dictionary format.

+
+ +
+
+static fromTransitNetwork(transit_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters
+
    +
  • transit_network_object – Reference to an instance of TransitNetwork.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns
+

StandardTransit

+
+
+
+ +
+
+static read_gtfs(gtfs_feed_dir, parameters={})[source]
+

Reads GTFS files from a directory and returns a StandardTransit +instance.

+
+
Parameters
+
    +
  • gtfs_feed_dir – location of the GTFS files

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns
+

StandardTransit instance

+
+
+
+ +
+
+static route_properties_gtfs_to_cube(self)[source]
+

Prepare gtfs for cube lin file.
#MC
Does the following operations:
1. Combines route, frequency, trip, and shape information
2. Converts time of day to time periods
3. Calculates cube route name from gtfs route name and properties
4. Assigns a cube-appropriate mode number
5. Assigns a cube-appropriate operator number

+
+
Returns
+

+
DataFrame of trips with cube-appropriate values for:
    +
  • NAME

  • +
  • ONEWAY

  • +
  • OPERATOR

  • +
  • MODE

  • +
  • HEADWAY

  • +
+
+
+

+
+
Return type
+

trip_df (DataFrame)

+
+
+
+ +
+
+shape_gtfs_to_cube(row, add_nntime=False)[source]
+

Creates a list of nodes for the route in the appropriate cube format.

+
+
Parameters
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a string representation of the node list

for a route in cube format.

+
+
+
+ +
+
+shape_gtfs_to_emme(trip_row)[source]
+

Creates transit segments for the trips in the appropriate emme format.

+
+
Parameters
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a dataframe representation of the transit segment

for a trip in emme format.

+
+
+
+ +
+
+time_to_cube_time_period(start_time_secs, as_str=True, verbose=False)[source]
+

Converts seconds from midnight to the cube time period.

+
+
Parameters
+
    +
  • start_time_secs – start time for transit trip in seconds +from midnight

  • +
  • as_str – if True, returns the time period as a string, +otherwise returns a numeric time period

  • +
+
+
Returns
+

+
if as_str is False, returns the numeric

time period

+
+
this_tp: if as_str is True, returns the Cube time period

name abbreviation

+
+
+

+
+
Return type
+

this_tp_num

+
+
+
+ +
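A hedged sketch, assuming cube_transit_net is a StandardTransit instance and the default MetCouncil time periods shown in the Parameters documentation.

# Sketch: 8:30 AM expressed as seconds from midnight.
secs = 8 * 3600 + 30 * 60
period = cube_transit_net.time_to_cube_time_period(secs, as_str=True)
# With the default time periods (AM = 6:00-10:00) this returns the "AM" abbreviation.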
+
+write_as_cube_lin(outpath=None)[source]
+

Writes the gtfs feed as a cube line file after +converting gtfs properties to MetCouncil cube properties. +#MC +:param outpath: File location for output cube line file.

+
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_generated/lasso.logger/index.html b/branch/bicounty/_generated/lasso.logger/index.html new file mode 100644 index 0000000..9adf06d --- /dev/null +++ b/branch/bicounty/_generated/lasso.logger/index.html @@ -0,0 +1,138 @@ + + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.logger

+

Functions

+ ++++ + + + + + +

setupLogging(infoLogFilename, debugLogFilename)

Sets up the logger.

+
+
+lasso.logger.setupLogging(infoLogFilename, debugLogFilename, logToConsole=True)[source]
+

Sets up the logger. The infoLog is terse, just gives the bare minimum of details +so the network composition will be clear later. +The debuglog is very noisy, for debugging.

+

Pass None for either filename to skip that log. Spews it all out to the console too, if logToConsole is true.

+
+ +
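A minimal usage sketch; the log file names are placeholders.

from lasso.logger import setupLogging

# Write a terse info log and a verbose debug log, echoing everything to the console.
setupLogging("lasso_info.log", "lasso_debug.log", logToConsole=True)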
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_generated/lasso.util/index.html b/branch/bicounty/_generated/lasso.util/index.html new file mode 100644 index 0000000..5624528 --- /dev/null +++ b/branch/bicounty/_generated/lasso.util/index.html @@ -0,0 +1,1441 @@ + + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.util

+

Functions

+ ++++ + + + + + + + + + + + + + + + + + + + + + + + +

column_name_to_parts(c[, parameters])

create_locationreference(node, link)

geodesic_point_buffer(lat, lon, meters)

creates circular buffer polygon for node

get_shared_streets_intersection_hash(lat, long)

Calculated per:

hhmmss_to_datetime(hhmmss_str)

Creates a datetime time object from a string of hh:mm:ss

secs_to_datetime(secs)

Creates a datetime time object from a seconds from midnight

shorten_name(name)

+
+
+class lasso.util.Point(*args)[source]
+

Bases: BaseGeometry

+

A zero dimensional feature

+

A point has zero length and zero area.

+
+
+x, y, z
+

Coordinate values

+
+
Type
+

float

+
+
+
+ +

Example

+
>>> p = Point(1.0, -1.0)
+>>> print(p)
+POINT (1 -1)
+>>> p.y
+-1.0
+>>> p.x
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.0 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+array_interface()[source]
+

Provide the Numpy array protocol.

+
+ +
+
+buffer(distance, resolution=16, quadsegs=None, cap_style=1, join_style=1, mitre_limit=5.0, single_sided=False)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quadsegs (int, optional) – Sets the number of line segments used to approximate an +angle fillet. Note: the use of a quadsegs parameter is +deprecated and will be gone from the next major release.

  • +
  • cap_style (int, optional) – The styles of caps are: CAP_STYLE.round (1), CAP_STYLE.flat +(2), and CAP_STYLE.square (3).

  • +
  • join_style (int, optional) – The styles of joins between offset segments are: +JOIN_STYLE.round (1), JOIN_STYLE.mitre (2), and +JOIN_STYLE.bevel (3).

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
+
+
Return type
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+>>> g.buffer(1.0).area        # 16-gon approx of a unit radius circle
+3.1365484905459...
+>>> g.buffer(1.0, 128).area   # 128-gon approximation
+3.141513801144...
+>>> round(g.buffer(1.0, 3).area, 10)  # triangle approximation
+3.0
+>>> list(g.buffer(1.0, cap_style=CAP_STYLE.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=CAP_STYLE.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other)
+

Returns the difference of the geometries

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+empty(val=94210181747008)
+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry. Because of this it is possible for “equals()” to be True for two geometries and “equals_exact()” to be False.

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+
+ +
+
+intersection(other)
+

Returns the intersection of the geometries

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely.wkt import loads
+>>> p = loads("MULTILINESTRING((0 0, 1 1), (3 3, 2 2))")
+>>> p.normalize().wkt
+'MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))'
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG circle element for the Point geometry.

+
+
Parameters
+
    +
  • scale_factor (float) – Multiplication factor for the SVG circle diameter. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other)
+

Returns the symmetric difference of the geometries +(Shapely geometry)

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other)
+

Returns the union of the geometries (Shapely geometry)

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property array_interface_base
+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property ctypes
+

Return ctypes buffer

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+impl = <GEOSImpl object: GEOS C API version (1, 13, 0)>
+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the general minimum bounding rectangle of +the geometry. Can possibly be rotated. If the convex hull +of the object is a degenerate (line or point) this same degenerate +is returned.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property x
+

Return x coordinate.

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+

Example

+
>>> x, y = Point(0, 0).xy
+>>> list(x)
+[0.0]
+>>> list(y)
+[0.0]
+
+
+
+ +
+
+property y
+

Return y coordinate.

+
+ +
+
+property z
+

Return z coordinate.

+
+ +
+ +
+
+class lasso.util.Polygon(shell=None, holes=None)[source]
+

Bases: BaseGeometry

+

A two-dimensional figure bounded by a linear ring

+

A polygon has a non-zero area. It may have one or more negative-space +“holes” which are also bounded by linear rings. If any rings cross each +other, the feature is invalid and operations on it may fail.

+
+
+exterior
+

The ring which bounds the positive space of the polygon.

+
+
Type
+

LinearRing

+
+
+
+ +
+
+interiors
+

A sequence of rings which bound all existing holes.

+
+
Type
+

sequence

+
+
+
+ +
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.0 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+buffer(distance, resolution=16, quadsegs=None, cap_style=1, join_style=1, mitre_limit=5.0, single_sided=False)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quadsegs (int, optional) – Sets the number of line segments used to approximate an +angle fillet. Note: the use of a quadsegs parameter is +deprecated and will be gone from the next major release.

  • +
  • cap_style (int, optional) – The styles of caps are: CAP_STYLE.round (1), CAP_STYLE.flat +(2), and CAP_STYLE.square (3).

  • +
  • join_style (int, optional) – The styles of joins between offset segments are: +JOIN_STYLE.round (1), JOIN_STYLE.mitre (2), and +JOIN_STYLE.bevel (3).

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_sided (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
+
+
Return type
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+>>> g.buffer(1.0).area        # 16-gon approx of a unit radius circle
+3.1365484905459...
+>>> g.buffer(1.0, 128).area   # 128-gon approximation
+3.141513801144...
+>>> round(g.buffer(1.0, 3).area, 10)  # triangle approximation
+3.0
+>>> list(g.buffer(1.0, cap_style=CAP_STYLE.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=CAP_STYLE.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other)
+

Returns the difference of the geometries

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+empty(val=94210181747008)
+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
+
+
 
This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two geometries and “equals_exact()” to be False.
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type
+

bool

+
+
+
+ +
+
+classmethod from_bounds(xmin, ymin, xmax, ymax)[source]
+

Construct a Polygon() from spatial bounds.

+
+ +
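A minimal usage sketch (not part of the original docstring), assuming only what the one-line description states: the four values are the spatial bounds of the returned rectangle. The exact ring ordering is not asserted here.

>>> p = Polygon.from_bounds(0.0, 0.0, 2.0, 1.0)
>>> p.bounds
(0.0, 0.0, 2.0, 1.0)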
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+
+ +
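For illustration, a short doctest of the interpolate() behaviour described above, assuming standard Shapely semantics; the LineString import is added only to keep the snippet self-contained:

>>> from shapely.geometry import LineString
>>> list(LineString([(0, 0), (0, 10)]).interpolate(5).coords)
[(0.0, 5.0)]
>>> list(LineString([(0, 0), (0, 10)]).interpolate(0.5, normalized=True).coords)
[(0.0, 5.0)]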
+
+intersection(other)
+

Returns the intersection of the geometries

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely.wkt import loads
+>>> p = loads("MULTILINESTRING((0 0, 1 1), (3 3, 2 2))")
+>>> p.normalize().wkt
+'MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))'
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+
+ +
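For illustration, a short doctest of project() under standard Shapely semantics; the geometry values are arbitrary:

>>> from shapely.geometry import LineString, Point
>>> line = LineString([(0, 0), (0, 10)])
>>> line.project(Point(5, 5))
5.0
>>> line.project(Point(5, 5), normalized=True)
0.5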
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
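A small illustrative doctest, assuming standard Shapely semantics: the middle vertex deviates by only 0.1, which is under the 0.5 tolerance, so Douglas-Peucker drops it:

>>> from shapely.geometry import LineString
>>> line = LineString([(0, 0), (1, 0.1), (2, 0)])
>>> list(line.simplify(0.5).coords)
[(0.0, 0.0), (2.0, 0.0)]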
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG path element for the Polygon geometry.

+
+
Parameters
+
    +
  • scale_factor (float) – Multiplication factor for the SVG stroke-width. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other)
+

Returns the symmetric difference of the geometries +(Shapely geometry)

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other)
+

Returns the union of the geometries (Shapely geometry)

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property array_interface_base
+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a convex hull, more or less

+

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property ctypes
+

Return ctypes buffer

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property exterior
+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+impl = <GEOSImpl object: GEOS C API version (1, 13, 0)>
+
+ +
+
+property interiors
+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the general minimum bounding rectangle of +the geometry. Can possibly be rotated. If the convex hull +of the object is a degenerate (line or point) this same degenerate +is returned.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+
+ +
+ +
+
+class lasso.util.partial[source]
+

Bases: object

+

partial(func, *args, **keywords) - new function with partial application +of the given arguments and keywords.

+
+
+args
+

tuple of arguments to future partial calls

+
+ +
+
+func
+

function object to use in future partial calls

+
+ +
+
+keywords
+

dictionary of keyword arguments to future partial calls

+
+ +
+ +
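A minimal sketch of the attributes listed above, assuming lasso.util.partial is the standard functools.partial re-exported; the add function is a hypothetical example:

>>> def add(x, y):
...     return x + y
>>> add_five = partial(add, 5)
>>> add_five(3)
8
>>> add_five.func is add, add_five.args, add_five.keywords
(True, (5,), {})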
+
+lasso.util.column_name_to_parts(c, parameters=None)[source]
+
+ +
+
+lasso.util.create_locationreference(node, link)[source]
+
+ +
+
+lasso.util.geodesic_point_buffer(lat, lon, meters)[source]
+

Creates a circular buffer polygon around a node.

+
+
Parameters
+
    +
  • lat – node lat

  • +
  • lon – node lon

  • +
  • meters – buffer radius in meters (the radius of the circle)

  • +
+
+
Returns
+

Polygon

+
+
+
+ +
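A hedged usage sketch: the coordinates and radius are hypothetical, and only the documented behaviour (a circular buffer Polygon around the node) is assumed:

>>> buf = geodesic_point_buffer(44.95, -93.09, 100)
>>> buf.geom_type
'Polygon'
>>> buf.is_valid
True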
+
+lasso.util.get_shared_streets_intersection_hash(lat, long, osm_node_id=None)[source]
+
+
Calculated per:

https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565

+
+
Expected in/out
+
in: lon = -93.0965985, lat = 44.952112199999995, osm_node_id = 954734870

out: 69f13f881649cb21ee3b359730790bb9

+
+
+
+
+
+ +
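Restated as a doctest for convenience; this assumes the expected in/out listed above maps onto the signature as lat = 44.952112199999995 and long = -93.0965985:

>>> get_shared_streets_intersection_hash(
...     lat=44.952112199999995, long=-93.0965985, osm_node_id=954734870
... )
'69f13f881649cb21ee3b359730790bb9'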
+
+lasso.util.hhmmss_to_datetime(hhmmss_str)[source]
+

Creates a datetime time object from a string of hh:mm:ss

+
+
Parameters
+

hhmmss_str – string of hh:mm:ss

+
+
Returns
+

datetime.time object representing time

+
+
Return type
+

datetime.time

+
+
+
+ +
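A minimal sketch of the documented behaviour; the exact repr assumes the returned datetime.time carries no sub-second detail:

>>> hhmmss_to_datetime("06:30:00")
datetime.time(6, 30)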
+
+lasso.util.secs_to_datetime(secs)[source]
+

Creates a datetime time object from seconds from midnight

+
+
Parameters
+

secs – seconds from midnight

+
+
Returns
+

datetime.time object representing time

+
+
Return type
+

datetime.time

+
+
+
+ +
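A minimal sketch of the documented behaviour (23400 seconds after midnight is 6:30 AM); the exact repr assumes a plain datetime.time is returned:

>>> secs_to_datetime(23400)
datetime.time(6, 30)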
+
+lasso.util.shorten_name(name)[source]
+
+ +
+
+lasso.util.transform(func, geom)[source]
+

Applies func to all coordinates of geom and returns a new +geometry of the same type from the transformed coordinates.

+

func maps x, y, and optionally z to output xp, yp, zp. The input +parameters may iterable types like lists or arrays or single values. +The output shall be of the same type. Scalars in, scalars out. +Lists in, lists out.

+

For example, here is an identity function applicable to both types +of input.

+
+
+
def id_func(x, y, z=None):
    return tuple(filter(None, [x, y, z]))

g2 = transform(id_func, g1)
+
+
 
Using pyproj >= 2.1, this example will accurately project Shapely geometries:
 
+
+
import pyproj

wgs84 = pyproj.CRS('EPSG:4326')
utm = pyproj.CRS('EPSG:32618')

project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform

g2 = transform(project, g1)

+
+

Note that the always_xy kwarg is required here as Shapely geometries only support +X,Y coordinate ordering.

+

Lambda expressions such as the one in

+
+

g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1)

+
+

also satisfy the requirements for func.

+
+ +
+
+lasso.util.unidecode(string, errors='ignore', replace_str='?')
+

Transliterate a Unicode object into an ASCII string

+
>>> unidecode("北亰")
+"Bei Jing "
+
+
+

This function first tries to convert the string using ASCII codec. +If it fails (because of non-ASCII characters), it falls back to +transliteration using the character tables.

+

This is approx. five times faster if the string only contains ASCII +characters, but slightly slower than unicode_expect_nonascii if +non-ASCII characters are present.

+

errors specifies what to do with characters that have not been +found in replacement tables. The default is ‘ignore’ which ignores +the character. ‘strict’ raises an UnidecodeError. ‘replace’ +substitutes the character with replace_str (default is ‘?’). +‘preserve’ keeps the original character.

+

Note that if ‘preserve’ is used the returned string might not be +ASCII!

+
+
Return type
+

str

+
+
+
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_modules/functools/index.html b/branch/bicounty/_modules/functools/index.html new file mode 100644 index 0000000..97cee24 --- /dev/null +++ b/branch/bicounty/_modules/functools/index.html @@ -0,0 +1,1077 @@ + + + + + + functools — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for functools

+"""functools.py - Tools for working with functions and callable objects
+"""
+# Python module wrapper for _functools C module
+# to allow utilities written in Python to be added
+# to the functools module.
+# Written by Nick Coghlan <ncoghlan at gmail.com>,
+# Raymond Hettinger <python at rcn.com>,
+# and Łukasz Langa <lukasz at langa.pl>.
+#   Copyright (C) 2006-2013 Python Software Foundation.
+# See C source code for _functools credits/copyright
+
+__all__ = ['update_wrapper', 'wraps', 'WRAPPER_ASSIGNMENTS', 'WRAPPER_UPDATES',
+           'total_ordering', 'cmp_to_key', 'lru_cache', 'reduce', 'partial',
+           'partialmethod', 'singledispatch', 'singledispatchmethod',
+           "cached_property"]
+
+from abc import get_cache_token
+from collections import namedtuple
+# import types, weakref  # Deferred to single_dispatch()
+from reprlib import recursive_repr
+from _thread import RLock
+
+
+################################################################################
+### update_wrapper() and wraps() decorator
+################################################################################
+
+# update_wrapper() and wraps() are tools to help write
+# wrapper functions that can handle naive introspection
+
+WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__',
+                       '__annotations__')
+WRAPPER_UPDATES = ('__dict__',)
+def update_wrapper(wrapper,
+                   wrapped,
+                   assigned = WRAPPER_ASSIGNMENTS,
+                   updated = WRAPPER_UPDATES):
+    """Update a wrapper function to look like the wrapped function
+
+       wrapper is the function to be updated
+       wrapped is the original function
+       assigned is a tuple naming the attributes assigned directly
+       from the wrapped function to the wrapper function (defaults to
+       functools.WRAPPER_ASSIGNMENTS)
+       updated is a tuple naming the attributes of the wrapper that
+       are updated with the corresponding attribute from the wrapped
+       function (defaults to functools.WRAPPER_UPDATES)
+    """
+    for attr in assigned:
+        try:
+            value = getattr(wrapped, attr)
+        except AttributeError:
+            pass
+        else:
+            setattr(wrapper, attr, value)
+    for attr in updated:
+        getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
+    # Issue #17482: set __wrapped__ last so we don't inadvertently copy it
+    # from the wrapped function when updating __dict__
+    wrapper.__wrapped__ = wrapped
+    # Return the wrapper so this can be used as a decorator via partial()
+    return wrapper
+
+def wraps(wrapped,
+          assigned = WRAPPER_ASSIGNMENTS,
+          updated = WRAPPER_UPDATES):
+    """Decorator factory to apply update_wrapper() to a wrapper function
+
+       Returns a decorator that invokes update_wrapper() with the decorated
+       function as the wrapper argument and the arguments to wraps() as the
+       remaining arguments. Default arguments are as for update_wrapper().
+       This is a convenience function to simplify applying partial() to
+       update_wrapper().
+    """
+    return partial(update_wrapper, wrapped=wrapped,
+                   assigned=assigned, updated=updated)
+
+
+################################################################################
+### total_ordering class decorator
+################################################################################
+
+# The total ordering functions all invoke the root magic method directly
+# rather than using the corresponding operator.  This avoids possible
+# infinite recursion that could occur when the operator dispatch logic
+# detects a NotImplemented result and then calls a reflected method.
+
+def _gt_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a < b) and (a != b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _le_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (a < b) or (a == b).'
+    op_result = self.__lt__(other)
+    return op_result or self == other
+
+def _ge_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a < b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _ge_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a <= b) or (a == b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _lt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (a <= b) and (a != b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _gt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a <= b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _lt_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a > b) and (a != b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _ge_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (a > b) or (a == b).'
+    op_result = self.__gt__(other)
+    return op_result or self == other
+
+def _le_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a > b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _le_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a >= b) or (a == b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _gt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (a >= b) and (a != b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _lt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a >= b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+_convert = {
+    '__lt__': [('__gt__', _gt_from_lt),
+               ('__le__', _le_from_lt),
+               ('__ge__', _ge_from_lt)],
+    '__le__': [('__ge__', _ge_from_le),
+               ('__lt__', _lt_from_le),
+               ('__gt__', _gt_from_le)],
+    '__gt__': [('__lt__', _lt_from_gt),
+               ('__ge__', _ge_from_gt),
+               ('__le__', _le_from_gt)],
+    '__ge__': [('__le__', _le_from_ge),
+               ('__gt__', _gt_from_ge),
+               ('__lt__', _lt_from_ge)]
+}
+
+def total_ordering(cls):
+    """Class decorator that fills in missing ordering methods"""
+    # Find user-defined comparisons (not those inherited from object).
+    roots = {op for op in _convert if getattr(cls, op, None) is not getattr(object, op, None)}
+    if not roots:
+        raise ValueError('must define at least one ordering operation: < > <= >=')
+    root = max(roots)       # prefer __lt__ to __le__ to __gt__ to __ge__
+    for opname, opfunc in _convert[root]:
+        if opname not in roots:
+            opfunc.__name__ = opname
+            setattr(cls, opname, opfunc)
+    return cls
+
+
+################################################################################
+### cmp_to_key() function converter
+################################################################################
+
+def cmp_to_key(mycmp):
+    """Convert a cmp= function into a key= function"""
+    class K(object):
+        __slots__ = ['obj']
+        def __init__(self, obj):
+            self.obj = obj
+        def __lt__(self, other):
+            return mycmp(self.obj, other.obj) < 0
+        def __gt__(self, other):
+            return mycmp(self.obj, other.obj) > 0
+        def __eq__(self, other):
+            return mycmp(self.obj, other.obj) == 0
+        def __le__(self, other):
+            return mycmp(self.obj, other.obj) <= 0
+        def __ge__(self, other):
+            return mycmp(self.obj, other.obj) >= 0
+        __hash__ = None
+    return K
+
+try:
+    from _functools import cmp_to_key
+except ImportError:
+    pass
+
+
+################################################################################
+### reduce() sequence to a single item
+################################################################################
+
+_initial_missing = object()
+
+def reduce(function, sequence, initial=_initial_missing):
+    """
+    reduce(function, sequence[, initial]) -> value
+
+    Apply a function of two arguments cumulatively to the items of a sequence,
+    from left to right, so as to reduce the sequence to a single value.
+    For example, reduce(lambda x, y: x+y, [1, 2, 3, 4, 5]) calculates
+    ((((1+2)+3)+4)+5).  If initial is present, it is placed before the items
+    of the sequence in the calculation, and serves as a default when the
+    sequence is empty.
+    """
+
+    it = iter(sequence)
+
+    if initial is _initial_missing:
+        try:
+            value = next(it)
+        except StopIteration:
+            raise TypeError("reduce() of empty sequence with no initial value") from None
+    else:
+        value = initial
+
+    for element in it:
+        value = function(value, element)
+
+    return value
+
+try:
+    from _functools import reduce
+except ImportError:
+    pass
+
+
+################################################################################
+### partial() argument application
+################################################################################
+
+# Purely functional, no descriptor behaviour
+
[docs]class partial: + """New function with partial application of the given arguments + and keywords. + """ + + __slots__ = "func", "args", "keywords", "__dict__", "__weakref__" + + def __new__(cls, func, /, *args, **keywords): + if not callable(func): + raise TypeError("the first argument must be callable") + + if hasattr(func, "func"): + args = func.args + args + keywords = {**func.keywords, **keywords} + func = func.func + + self = super(partial, cls).__new__(cls) + + self.func = func + self.args = args + self.keywords = keywords + return self + + def __call__(self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(*self.args, *args, **keywords) + + @recursive_repr() + def __repr__(self): + qualname = type(self).__qualname__ + args = [repr(self.func)] + args.extend(repr(x) for x in self.args) + args.extend(f"{k}={v!r}" for (k, v) in self.keywords.items()) + if type(self).__module__ == "functools": + return f"functools.{qualname}({', '.join(args)})" + return f"{qualname}({', '.join(args)})" + + def __reduce__(self): + return type(self), (self.func,), (self.func, self.args, + self.keywords or None, self.__dict__ or None) + + def __setstate__(self, state): + if not isinstance(state, tuple): + raise TypeError("argument to __setstate__ must be a tuple") + if len(state) != 4: + raise TypeError(f"expected 4 items in state, got {len(state)}") + func, args, kwds, namespace = state + if (not callable(func) or not isinstance(args, tuple) or + (kwds is not None and not isinstance(kwds, dict)) or + (namespace is not None and not isinstance(namespace, dict))): + raise TypeError("invalid partial state") + + args = tuple(args) # just in case it's a subclass + if kwds is None: + kwds = {} + elif type(kwds) is not dict: # XXX does it need to be *exactly* dict? + kwds = dict(kwds) + if namespace is None: + namespace = {} + + self.__dict__ = namespace + self.func = func + self.args = args + self.keywords = kwds
+ +try: + from _functools import partial +except ImportError: + pass + +# Descriptor version +class partialmethod(object): + """Method descriptor with partial application of the given arguments + and keywords. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(*args, **keywords): + if len(args) >= 2: + self, func, *args = args + elif not args: + raise TypeError("descriptor '__init__' of partialmethod " + "needs an argument") + elif 'func' in keywords: + func = keywords.pop('func') + self, *args = args + import warnings + warnings.warn("Passing 'func' as keyword argument is deprecated", + DeprecationWarning, stacklevel=2) + else: + raise TypeError("type 'partialmethod' takes at least one argument, " + "got %d" % (len(args)-1)) + args = tuple(args) + + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError("{!r} is not callable or a descriptor" + .format(func)) + + # func could be a descriptor like classmethod which isn't callable, + # so we can't inherit from partial (it verifies func is callable) + if isinstance(func, partialmethod): + # flattening is mandatory in order to place cls/self before all + # other arguments + # it's also more efficient since only one function will be called + self.func = func.func + self.args = func.args + args + self.keywords = {**func.keywords, **keywords} + else: + self.func = func + self.args = args + self.keywords = keywords + __init__.__text_signature__ = '($self, func, /, *args, **keywords)' + + def __repr__(self): + args = ", ".join(map(repr, self.args)) + keywords = ", ".join("{}={!r}".format(k, v) + for k, v in self.keywords.items()) + format_string = "{module}.{cls}({func}, {args}, {keywords})" + return format_string.format(module=self.__class__.__module__, + cls=self.__class__.__qualname__, + func=self.func, + args=args, + keywords=keywords) + + def _make_unbound_method(self): + def _method(cls_or_self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(cls_or_self, *self.args, *args, **keywords) + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method._partialmethod = self + return _method + + def __get__(self, obj, cls=None): + get = getattr(self.func, "__get__", None) + result = None + if get is not None: + new_func = get(obj, cls) + if new_func is not self.func: + # Assume __get__ returning something new indicates the + # creation of an appropriate callable + result = partial(new_func, *self.args, **self.keywords) + try: + result.__self__ = new_func.__self__ + except AttributeError: + pass + if result is None: + # If the underlying descriptor didn't do anything, treat this + # like an instance method + result = self._make_unbound_method().__get__(obj, cls) + return result + + @property + def __isabstractmethod__(self): + return getattr(self.func, "__isabstractmethod__", False) + +# Helper functions + +def _unwrap_partial(func): + while isinstance(func, partial): + func = func.func + return func + +################################################################################ +### LRU Cache function decorator +################################################################################ + +_CacheInfo = namedtuple("CacheInfo", ["hits", "misses", "maxsize", "currsize"]) + +class _HashedSeq(list): + """ This class guarantees that hash() will be called no more than once + per element. This is important because the lru_cache() will hash + the key multiple times on a cache miss. 
+ + """ + + __slots__ = 'hashvalue' + + def __init__(self, tup, hash=hash): + self[:] = tup + self.hashvalue = hash(tup) + + def __hash__(self): + return self.hashvalue + +def _make_key(args, kwds, typed, + kwd_mark = (object(),), + fasttypes = {int, str}, + tuple=tuple, type=type, len=len): + """Make a cache key from optionally typed positional and keyword arguments + + The key is constructed in a way that is flat as possible rather than + as a nested structure that would take more memory. + + If there is only a single argument and its data type is known to cache + its hash value, then that argument is returned without a wrapper. This + saves space and improves lookup speed. + + """ + # All of code below relies on kwds preserving the order input by the user. + # Formerly, we sorted() the kwds before looping. The new way is *much* + # faster; however, it means that f(x=1, y=2) will now be treated as a + # distinct call from f(y=2, x=1) which will be cached separately. + key = args + if kwds: + key += kwd_mark + for item in kwds.items(): + key += item + if typed: + key += tuple(type(v) for v in args) + if kwds: + key += tuple(type(v) for v in kwds.values()) + elif len(key) == 1 and type(key[0]) in fasttypes: + return key[0] + return _HashedSeq(key) + +def lru_cache(maxsize=128, typed=False): + """Least-recently-used cache decorator. + + If *maxsize* is set to None, the LRU features are disabled and the cache + can grow without bound. + + If *typed* is True, arguments of different types will be cached separately. + For example, f(3.0) and f(3) will be treated as distinct calls with + distinct results. + + Arguments to the cached function must be hashable. + + View the cache statistics named tuple (hits, misses, maxsize, currsize) + with f.cache_info(). Clear the cache and statistics with f.cache_clear(). + Access the underlying function with f.__wrapped__. + + See: http://en.wikipedia.org/wiki/Cache_replacement_policies#Least_recently_used_(LRU) + + """ + + # Users should only access the lru_cache through its public API: + # cache_info, cache_clear, and f.__wrapped__ + # The internals of the lru_cache are encapsulated for thread safety and + # to allow the implementation to change (including a possible C version). 
+ + if isinstance(maxsize, int): + # Negative maxsize is treated as 0 + if maxsize < 0: + maxsize = 0 + elif callable(maxsize) and isinstance(typed, bool): + # The user_function was passed in directly via the maxsize argument + user_function, maxsize = maxsize, 128 + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + elif maxsize is not None: + raise TypeError( + 'Expected first argument to be an integer, a callable, or None') + + def decorating_function(user_function): + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + + return decorating_function + +def _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo): + # Constants shared by all lru cache instances: + sentinel = object() # unique object used to signal cache misses + make_key = _make_key # build a key from the function arguments + PREV, NEXT, KEY, RESULT = 0, 1, 2, 3 # names for the link fields + + cache = {} + hits = misses = 0 + full = False + cache_get = cache.get # bound method to lookup a key or return None + cache_len = cache.__len__ # get cache size without calling len() + lock = RLock() # because linkedlist updates aren't threadsafe + root = [] # root of the circular doubly linked list + root[:] = [root, root, None, None] # initialize by pointing to self + + if maxsize == 0: + + def wrapper(*args, **kwds): + # No caching -- just a statistics update + nonlocal misses + misses += 1 + result = user_function(*args, **kwds) + return result + + elif maxsize is None: + + def wrapper(*args, **kwds): + # Simple caching without ordering or size limit + nonlocal hits, misses + key = make_key(args, kwds, typed) + result = cache_get(key, sentinel) + if result is not sentinel: + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + cache[key] = result + return result + + else: + + def wrapper(*args, **kwds): + # Size limited caching that tracks accesses by recency + nonlocal root, hits, misses, full + key = make_key(args, kwds, typed) + with lock: + link = cache_get(key) + if link is not None: + # Move the link to the front of the circular queue + link_prev, link_next, _key, result = link + link_prev[NEXT] = link_next + link_next[PREV] = link_prev + last = root[PREV] + last[NEXT] = root[PREV] = link + link[PREV] = last + link[NEXT] = root + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + with lock: + if key in cache: + # Getting here means that this same key was added to the + # cache while the lock was released. Since the link + # update is already done, we need only return the + # computed result and update the count of misses. + pass + elif full: + # Use the old root to store the new key and result. + oldroot = root + oldroot[KEY] = key + oldroot[RESULT] = result + # Empty the oldest link and make it the new root. + # Keep a reference to the old key and old result to + # prevent their ref counts from going to zero during the + # update. That will prevent potentially arbitrary object + # clean-up code (i.e. __del__) from running while we're + # still adjusting the links. + root = oldroot[NEXT] + oldkey = root[KEY] + oldresult = root[RESULT] + root[KEY] = root[RESULT] = None + # Now update the cache dictionary. + del cache[oldkey] + # Save the potentially reentrant cache[key] assignment + # for last, after the root and links have been put in + # a consistent state. 
+ cache[key] = oldroot + else: + # Put result in a new link at the front of the queue. + last = root[PREV] + link = [last, root, key, result] + last[NEXT] = root[PREV] = cache[key] = link + # Use the cache_len bound method instead of the len() function + # which could potentially be wrapped in an lru_cache itself. + full = (cache_len() >= maxsize) + return result + + def cache_info(): + """Report cache statistics""" + with lock: + return _CacheInfo(hits, misses, maxsize, cache_len()) + + def cache_clear(): + """Clear the cache and cache statistics""" + nonlocal hits, misses, full + with lock: + cache.clear() + root[:] = [root, root, None, None] + hits = misses = 0 + full = False + + wrapper.cache_info = cache_info + wrapper.cache_clear = cache_clear + return wrapper + +try: + from _functools import _lru_cache_wrapper +except ImportError: + pass + + +################################################################################ +### singledispatch() - single-dispatch generic function decorator +################################################################################ + +def _c3_merge(sequences): + """Merges MROs in *sequences* to a single MRO using the C3 algorithm. + + Adapted from http://www.python.org/download/releases/2.3/mro/. + + """ + result = [] + while True: + sequences = [s for s in sequences if s] # purge empty sequences + if not sequences: + return result + for s1 in sequences: # find merge candidates among seq heads + candidate = s1[0] + for s2 in sequences: + if candidate in s2[1:]: + candidate = None + break # reject the current head, it appears later + else: + break + if candidate is None: + raise RuntimeError("Inconsistent hierarchy") + result.append(candidate) + # remove the chosen candidate + for seq in sequences: + if seq[0] == candidate: + del seq[0] + +def _c3_mro(cls, abcs=None): + """Computes the method resolution order using extended C3 linearization. + + If no *abcs* are given, the algorithm works exactly like the built-in C3 + linearization used for method resolution. + + If given, *abcs* is a list of abstract base classes that should be inserted + into the resulting MRO. Unrelated ABCs are ignored and don't end up in the + result. The algorithm inserts ABCs where their functionality is introduced, + i.e. issubclass(cls, abc) returns True for the class itself but returns + False for all its direct base classes. Implicit ABCs for a given class + (either registered or inferred from the presence of a special method like + __len__) are inserted directly after the last ABC explicitly listed in the + MRO of said class. If two implicit ABCs end up next to each other in the + resulting MRO, their ordering depends on the order of types in *abcs*. + + """ + for i, base in enumerate(reversed(cls.__bases__)): + if hasattr(base, '__abstractmethods__'): + boundary = len(cls.__bases__) - i + break # Bases up to the last explicit ABC are considered first. + else: + boundary = 0 + abcs = list(abcs) if abcs else [] + explicit_bases = list(cls.__bases__[:boundary]) + abstract_bases = [] + other_bases = list(cls.__bases__[boundary:]) + for base in abcs: + if issubclass(cls, base) and not any( + issubclass(b, base) for b in cls.__bases__ + ): + # If *cls* is the class that introduces behaviour described by + # an ABC *base*, insert said ABC to its MRO. 
+ abstract_bases.append(base) + for base in abstract_bases: + abcs.remove(base) + explicit_c3_mros = [_c3_mro(base, abcs=abcs) for base in explicit_bases] + abstract_c3_mros = [_c3_mro(base, abcs=abcs) for base in abstract_bases] + other_c3_mros = [_c3_mro(base, abcs=abcs) for base in other_bases] + return _c3_merge( + [[cls]] + + explicit_c3_mros + abstract_c3_mros + other_c3_mros + + [explicit_bases] + [abstract_bases] + [other_bases] + ) + +def _compose_mro(cls, types): + """Calculates the method resolution order for a given class *cls*. + + Includes relevant abstract base classes (with their respective bases) from + the *types* iterable. Uses a modified C3 linearization algorithm. + + """ + bases = set(cls.__mro__) + # Remove entries which are already present in the __mro__ or unrelated. + def is_related(typ): + return (typ not in bases and hasattr(typ, '__mro__') + and issubclass(cls, typ)) + types = [n for n in types if is_related(n)] + # Remove entries which are strict bases of other entries (they will end up + # in the MRO anyway. + def is_strict_base(typ): + for other in types: + if typ != other and typ in other.__mro__: + return True + return False + types = [n for n in types if not is_strict_base(n)] + # Subclasses of the ABCs in *types* which are also implemented by + # *cls* can be used to stabilize ABC ordering. + type_set = set(types) + mro = [] + for typ in types: + found = [] + for sub in typ.__subclasses__(): + if sub not in bases and issubclass(cls, sub): + found.append([s for s in sub.__mro__ if s in type_set]) + if not found: + mro.append(typ) + continue + # Favor subclasses with the biggest number of useful bases + found.sort(key=len, reverse=True) + for sub in found: + for subcls in sub: + if subcls not in mro: + mro.append(subcls) + return _c3_mro(cls, abcs=mro) + +def _find_impl(cls, registry): + """Returns the best matching implementation from *registry* for type *cls*. + + Where there is no registered implementation for a specific type, its method + resolution order is used to find a more generic implementation. + + Note: if *registry* does not contain an implementation for the base + *object* type, this function may return None. + + """ + mro = _compose_mro(cls, registry.keys()) + match = None + for t in mro: + if match is not None: + # If *match* is an implicit ABC but there is another unrelated, + # equally matching implicit ABC, refuse the temptation to guess. + if (t in registry and t not in cls.__mro__ + and match not in cls.__mro__ + and not issubclass(match, t)): + raise RuntimeError("Ambiguous dispatch: {} or {}".format( + match, t)) + break + if t in registry: + match = t + return registry.get(match) + +def singledispatch(func): + """Single-dispatch generic function decorator. + + Transforms a function into a generic function, which can have different + behaviours depending upon the type of its first argument. The decorated + function acts as the default implementation, and additional + implementations can be registered using the register() attribute of the + generic function. + """ + # There are many programs that use functools without singledispatch, so we + # trade-off making singledispatch marginally slower for the benefit of + # making start-up of such applications slightly faster. 
+ import types, weakref + + registry = {} + dispatch_cache = weakref.WeakKeyDictionary() + cache_token = None + + def dispatch(cls): + """generic_func.dispatch(cls) -> <function implementation> + + Runs the dispatch algorithm to return the best available implementation + for the given *cls* registered on *generic_func*. + + """ + nonlocal cache_token + if cache_token is not None: + current_token = get_cache_token() + if cache_token != current_token: + dispatch_cache.clear() + cache_token = current_token + try: + impl = dispatch_cache[cls] + except KeyError: + try: + impl = registry[cls] + except KeyError: + impl = _find_impl(cls, registry) + dispatch_cache[cls] = impl + return impl + + def register(cls, func=None): + """generic_func.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_func*. + + """ + nonlocal cache_token + if func is None: + if isinstance(cls, type): + return lambda f: register(cls, f) + ann = getattr(cls, '__annotations__', {}) + if not ann: + raise TypeError( + f"Invalid first argument to `register()`: {cls!r}. " + f"Use either `@register(some_class)` or plain `@register` " + f"on an annotated function." + ) + func = cls + + # only import typing if annotation parsing is necessary + from typing import get_type_hints + argname, cls = next(iter(get_type_hints(func).items())) + if not isinstance(cls, type): + raise TypeError( + f"Invalid annotation for {argname!r}. " + f"{cls!r} is not a class." + ) + registry[cls] = func + if cache_token is None and hasattr(cls, '__abstractmethods__'): + cache_token = get_cache_token() + dispatch_cache.clear() + return func + + def wrapper(*args, **kw): + if not args: + raise TypeError(f'{funcname} requires at least ' + '1 positional argument') + + return dispatch(args[0].__class__)(*args, **kw) + + funcname = getattr(func, '__name__', 'singledispatch function') + registry[object] = func + wrapper.register = register + wrapper.dispatch = dispatch + wrapper.registry = types.MappingProxyType(registry) + wrapper._clear_cache = dispatch_cache.clear + update_wrapper(wrapper, func) + return wrapper + + +# Descriptor version +class singledispatchmethod: + """Single-dispatch generic method descriptor. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(self, func): + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError(f"{func!r} is not callable or a descriptor") + + self.dispatcher = singledispatch(func) + self.func = func + + def register(self, cls, method=None): + """generic_method.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_method*. 
+ """ + return self.dispatcher.register(cls, func=method) + + def __get__(self, obj, cls=None): + def _method(*args, **kwargs): + method = self.dispatcher.dispatch(args[0].__class__) + return method.__get__(obj, cls)(*args, **kwargs) + + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method.register = self.register + update_wrapper(_method, self.func) + return _method + + @property + def __isabstractmethod__(self): + return getattr(self.func, '__isabstractmethod__', False) + + +################################################################################ +### cached_property() - computed once per instance, cached as attribute +################################################################################ + +_NOT_FOUND = object() + + +class cached_property: + def __init__(self, func): + self.func = func + self.attrname = None + self.__doc__ = func.__doc__ + self.lock = RLock() + + def __set_name__(self, owner, name): + if self.attrname is None: + self.attrname = name + elif name != self.attrname: + raise TypeError( + "Cannot assign the same cached_property to two different names " + f"({self.attrname!r} and {name!r})." + ) + + def __get__(self, instance, owner=None): + if instance is None: + return self + if self.attrname is None: + raise TypeError( + "Cannot use cached_property instance without calling __set_name__ on it.") + try: + cache = instance.__dict__ + except AttributeError: # not all objects have __dict__ (e.g. class defines slots) + msg = ( + f"No '__dict__' attribute on {type(instance).__name__!r} " + f"instance to cache {self.attrname!r} property." + ) + raise TypeError(msg) from None + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + with self.lock: + # check if another thread filled cache while we awaited lock + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + val = self.func(instance) + try: + cache[self.attrname] = val + except TypeError: + msg = ( + f"The '__dict__' attribute on {type(instance).__name__!r} instance " + f"does not support item assignment for caching {self.attrname!r} property." + ) + raise TypeError(msg) from None + return val +
+ +
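As a quick illustration of the lru_cache API documented in the source above (cache_info, cache_clear, __wrapped__); the hit/miss counts follow from the left-to-right recursion in this sketch:

>>> from functools import lru_cache
>>> @lru_cache(maxsize=32)
... def fib(n):
...     return n if n < 2 else fib(n - 1) + fib(n - 2)
>>> fib(10)
55
>>> fib.cache_info()
CacheInfo(hits=8, misses=11, maxsize=32, currsize=11)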
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_modules/index.html b/branch/bicounty/_modules/index.html new file mode 100644 index 0000000..9aaa3eb --- /dev/null +++ b/branch/bicounty/_modules/index.html @@ -0,0 +1,110 @@ + + + + + + Overview: module code — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • »
  • +
  • Overview: module code
  • +
  • +
  • +
+
+
+ +
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_modules/lasso/logger/index.html b/branch/bicounty/_modules/lasso/logger/index.html new file mode 100644 index 0000000..4a994f4 --- /dev/null +++ b/branch/bicounty/_modules/lasso/logger/index.html @@ -0,0 +1,147 @@ + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.logger

+import logging
+
+__all__ = ["WranglerLogger", "setupLogging"]
+
+
+# for all the Wrangler logging needs!
+WranglerLogger = logging.getLogger("WranglerLogger")
+
+
+
[docs]def setupLogging(infoLogFilename, debugLogFilename, logToConsole=True): + """Sets up the logger. The infoLog is terse, just gives the bare minimum of details + so the network composition will be clear later. + The debuglog is very noisy, for debugging. + + Pass none to either. + Spews it all out to console too, if logToConsole is true. + """ + # clear handlers if any exist already + WranglerLogger.handlers = [] + + # create a logger + WranglerLogger.setLevel(logging.DEBUG) + + if infoLogFilename: + infologhandler = logging.StreamHandler(open(infoLogFilename, "w")) + infologhandler.setLevel(logging.INFO) + infologhandler.setFormatter( + logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s") + ) + WranglerLogger.addHandler(infologhandler) + + if debugLogFilename: + debugloghandler = logging.StreamHandler(open(debugLogFilename, "w")) + debugloghandler.setLevel(logging.DEBUG) + debugloghandler.setFormatter( + logging.Formatter("%(asctime)s %(levelname)s %(message)s", "%Y-%m-%d %H:%M") + ) + WranglerLogger.addHandler(debugloghandler) + + if logToConsole: + consolehandler = logging.StreamHandler() + consolehandler.setLevel(logging.DEBUG) + consolehandler.setFormatter( + logging.Formatter("%(name)-12s: %(levelname)-8s %(message)s") + ) + WranglerLogger.addHandler(consolehandler)
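A hedged usage sketch of the API shown above; the log file names are hypothetical and no output is asserted (handlers write to the two log files and, with logToConsole=True, to the console):

>>> from lasso.logger import WranglerLogger, setupLogging
>>> setupLogging("build_info.log", "build_debug.log", logToConsole=True)
>>> WranglerLogger.info("Starting network build")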
+
+ +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_modules/lasso/parameters/index.html b/branch/bicounty/_modules/lasso/parameters/index.html new file mode 100644 index 0000000..42a4188 --- /dev/null +++ b/branch/bicounty/_modules/lasso/parameters/index.html @@ -0,0 +1,1041 @@ + + + + + + lasso.parameters — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.parameters

+import os
+from .logger import WranglerLogger
+
+
+from pyproj import CRS
+
+
+def get_base_dir(lasso_base_dir=os.getcwd()):
+    d = lasso_base_dir
+    for i in range(3):
+        if "metcouncil_data" in os.listdir(d):
+
+            WranglerLogger.info("Lasso base directory set as: {}".format(d))
+            return d
+        d = os.path.dirname(d)
+
+    msg = "Cannot find Lasso base directory from {}, please input using keyword in parameters: `lasso_base_dir =` ".format(
+        lasso_base_dir
+    )
+    WranglerLogger.error(msg)
+    raise (ValueError(msg))
+
+
+
[docs]class Parameters: + """A class representing all the parameters defining the networks + including time of day, categories, etc. + + Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + .. highlight:: python + + Attr: + time_period_to_time (dict): Maps time period abbreviations used in + Cube to time of days used on gtfs and highway network standard + Default: + :: + { + "EA": ("3:00", "6:00"), + "AM": ("6:00, "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + cube_time_periods (dict): Maps cube time period numbers used in + transit line files to the time period abbreviations in time_period_to_time + dictionary. + Default: + :: + {"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"} + categories (dict): Maps demand category abbreviations to a list of + network categories they are allowed to use. + Default: + :: + { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + properties_to_split (dict): Dictionary mapping variables in standard + roadway network to categories and time periods that need to be + split out in final model network to get variables like LANES_AM. + Default: + :: + { + "lanes": { + "v": "lanes", + "time_periods": self.time_periods_to_time + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_periods_to_time + }, + "use": { + "v": "use", + "time_periods": self.time_periods_to_time + }, + } + + county_shape (str): File location of shapefile defining counties. + Default: + :: + r"metcouncil_data/county/cb_2017_us_county_5m.shp" + + county_variable_shp (str): Property defining the county n ame in + the county_shape file. + Default: + :: + NAME + lanes_lookup_file (str): Lookup table of number of lanes for different data sources. + Default: + :: + r"metcouncil_data/lookups/lanes.csv" + centroid_connect_lanes (int): Number of lanes for centroid connectors. + Default: + :: + 1 + mpo_counties (list): list of county names within MPO boundary. + Default: + :: + [ + "ANOKA", + "DAKOTA", + "HENNEPIN", + "RAMSEY", + "SCOTT", + "WASHINGTON", + "CARVER", + ] + + taz_shape (str): + Default: + :: + r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp" + taz_data (str): + Default: + :: + ?? + highest_taz_number (int): highest TAZ number in order to define + centroid connectors. + Default: + :: + 3100 + + output_variables (list): list of variables to output in final model + network. 
+ Default: + :: + [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + "distance", + "roadway", + "name", + "roadway_class", + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "trn_priority_EA", + "trn_priority_AM", + "trn_priority_MD", + "trn_priority_PM", + "trn_priority_EV", + "ttime_assert_EA", + "ttime_assert_AM", + "ttime_assert_MD", + "ttime_assert_PM", + "ttime_assert_EV", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "price_sov_EA", + "price_hov2_EA", + "price_hov3_EA", + "price_truck_EA", + "price_sov_AM", + "price_hov2_AM", + "price_hov3_AM", + "price_truck_AM", + "price_sov_MD", + "price_hov2_MD", + "price_hov3_MD", + "price_truck_MD", + "price_sov_PM", + "price_hov2_PM", + "price_hov3_PM", + "price_truck_PM", + "price_sov_EV", + "price_hov2_EV", + "price_hov3_EV", + "price_truck_EV", + "roadway_class_idx", + "facility_type", + "county", + "centroidconnect", + "model_node_id", + "N", + "osm_node_id", + "bike_node", + "transit_node", + "walk_node", + "drive_node", + "geometry", + "X", + "Y", + "ML_lanes_EA", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "segment_id", + "managed", + "bus_only", + "rail_only" + ] + + osm_facility_type_dict (dict): Mapping between OSM Roadway variable + and facility type. Default: + + area_type_shape (str): Location of shapefile defining area type. + Default: + :: + r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp" + area_type_variable_shp (str): property in area_type_shape with area + type in it. + Default: + :: + "COMDES2040" + area_type_code_dict (dict): Mapping of the area_type_variable_shp to + the area type code used in the MetCouncil cube network. + Default: + :: + { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + downtown_area_type_shape (str): Location of shapefile defining downtown area type. + Default: + :: + r"metcouncil_data/area_type/downtownzones_TAZ.shp" + downtown_area_type (int): Area type integer for downtown. + Default: + :: + 5 + mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp" + mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp + associated with roadway class. Default: + :: + "ROUTE_SYS" + widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp" + widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape + associated with roadway class.Default: + :: + "RDWY_CTGY_" + mndot_count_shape (str): Shapefile of MnDOT links with a property + associated with counts. Default: + :: + r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp" + mndot_count_variable_shp (str): The property in mndot_count_shape + associated with counts. Default: + + :: + "lookups/osm_highway_facility_type_crosswalk.csv" + legacy_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from Legacy TM2 network. Default: + :: + "lookups/legacy_tm2_attributes.csv" + osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId + from OSM. Default: + :: + "lookups/osm_lanes_attributes.csv" + tam_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from TAM TM2 network. 
Default: + :: + "lookups/tam_tm2_attributes.csv" + tom_tom_attributes (str): CSV file of link attributes by + shStReferenceId from TomTom network. Default: + :: + "lookups/tomtom_attributes.csv" + sfcta_attributes (str): CSV file of link attributes by + shStReferenceId from SFCTA network. Default: + :: + "lookups/sfcta_attributes.csv" + output_epsg (int): EPSG type of geographic projection for output + shapefiles. Default: + :: + 102646 + output_link_shp (str): Output shapefile for roadway links. Default: + :: + r"tests/scratch/links.shp" + output_node_shp (str): Output shapefile for roadway nodes. Default: + :: + r"tests/scratch/nodes.shp" + output_link_csv (str): Output csv for roadway links. Default: + :: + r"tests/scratch/links.csv" + output_node_csv (str): Output csv for roadway nodes. Default: + :: + r"tests/scratch/nodes.csv" + output_link_txt (str): Output fixed format txt for roadway links. Default: + :: + r"tests/scratch/links.txt" + output_node_txt (str): Output fixed format txt for roadway nodes. Default: + :: + r"tests/scratch/nodes.txt" + output_link_header_width_txt (str): Header for txt roadway links. Default: + :: + r"tests/scratch/links_header_width.txt" + output_node_header_width_txt (str): Header for txt for roadway Nodes. Default: + :: + r"tests/scratch/nodes_header_width.txt" + output_cube_network_script (str): Cube script for importing + fixed-format roadway network. Default: + :: + r"tests/scratch/make_complete_network_from_fixed_width_file.s + + + + """ + +
[docs] def __init__(self, **kwargs): + """ + Time period and category splitting info + """ + if "time_periods_to_time" in kwargs: + self.time_periods_to_time = kwargs.get("time_periods_to_time") + else: + self.time_period_to_time = { + "EA": ("3:00", "6:00"), + "AM": ("6:00", "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + + #MTC + self.cube_time_periods = { + "1": "EA", + "2": "AM", + "3": "MD", + "4": "PM", + "5": "EV", + } + + """ + #MC + self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7} + + self.route_type_mode_dict = {0: 8, 2: 9} + + self.cube_time_periods = {"1": "AM", "2": "MD"} + self.cube_time_periods_name = {"AM": "pk", "MD": "op"} + """ + if "categories" in kwargs: + self.categories = kwargs.get("categories") + else: + self.categories = { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + + # prefix, source variable, categories + self.properties_to_split = { + "lanes": { + "v": "lanes", + "time_periods": self.time_period_to_time, + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_period_to_time, + }, + "useclass": { + "v": "useclass", + "time_periods": self.time_period_to_time, + }, + } + + """ + Details for calculating the county based on the centroid of the link. + The NAME varible should be the name of a field in shapefile. + """ + #MTC + if 'lasso_base_dir' in kwargs: + self.base_dir = get_base_dir(lasso_base_dir = kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + + if 'data_file_location' in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "mtc_data") + + #MC + if "lasso_base_dir" in kwargs: + self.base_dir = get_base_dir(lasso_base_dir=kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + """ + if "data_file_location" in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "metcouncil_data") + """ + + #-------- + if "settings_location" in kwargs: + self.settings_location = kwargs.get("settings_location") + else: + self.settings_location = os.path.join(self.base_dir, "examples", "settings") + + if "scratch_location" in kwargs: + self.scratch_location = kwargs.get("scratch_location") + else: + self.scratch_location = os.path.join(self.base_dir, "tests", "scratch") + + ### COUNTIES + + self.county_shape = os.path.join( + self.data_file_location, "county", "county.shp" + ) + self.county_variable_shp = "NAME" + + #MTC + self.county_code_dict = { + 'San Francisco':1, + 'San Mateo':2, + 'Santa Clara':3, + 'Alameda':4, + 'Contra Costa':5, + 'Solano':6, + 'Napa':7, + 'Sonoma':8, + 'Marin':9, + 'San Joaquin':11, + 'External':10, + } + + self.county_centroid_range_dict = { + 'San Francisco':range(1,100000), + 'San Mateo':range(100001,200000), + 'Santa Clara':range(200001,300000), + 'Alameda':range(300001,400000), + 'Contra Costa':range(400001,500000), + 'Solano':range(500001,600000), + 'Napa':range(600001,700000), + 'Sonoma':range(700001,800000), + 'Marin':range(800001,900000), + 'External':range(900001,1000000) + } + + self.county_node_range_dict = { + 'San Francisco':range(1000000,1500000), + 'San Mateo':range(1500000,2000000), + 'Santa Clara':range(2000000,2500000), + 'Alameda':range(2500000,3000000), + 'Contra Costa':range(3000000,3500000), + 
'Solano':range(3500000,4000000), + 'Napa':range(4000000,4500000), + 'Sonoma':range(4500000,5000000), + 'Marin':range(5000000,5500000), + } + + self.county_hov_node_range_dict = { + 'San Francisco':range(5500000,6000000), + 'San Mateo':range(6000000,6500000), + 'Santa Clara':range(6500000,7000000), + 'Alameda':range(7000000,7500000), + 'Contra Costa':range(7500000,8000000), + 'Solano':range(8000000,8500000), + 'Napa':range(8500000,9000000), + 'Sonoma':range(9000000,9500000), + 'Marin':range(9500000,10000000), + } + + self.county_link_range_dict = { + 'San Francisco':range(1,1000000), + 'San Mateo':range(1000000,2000000), + 'Santa Clara':range(2000000,3000000), + 'Alameda':range(3000000,4000000), + 'Contra Costa':range(4000000,5000000), + 'Solano':range(5000000,6000000), + 'Napa':range(6000000,7000000), + 'Sonoma':range(7000000,8000000), + 'Marin':range(8000000,9000000) + } + + #MC + """ + self.county_code_dict = { + "Anoka": 1, + "Carver": 2, + "Dakota": 3, + "Hennepin": 4, + "Ramsey": 5, + "Scott": 6, + "Washington": 7, + "external": 10, + "Chisago": 11, + "Goodhue": 12, + "Isanti": 13, + "Le Sueur": 14, + "McLeod": 15, + "Pierce": 16, + "Polk": 17, + "Rice": 18, + "Sherburne": 19, + "Sibley": 20, + "St. Croix": 21, + "Wright": 22, + } + """ + + self.mpo_counties = [ + 1, + 3, + 4, + 5, + 6, + 7, + 8, + 9 + ] + + self.taz_N_list = list(range(1, 10000)) + list(range(100001, 110000)) + list(range(200001, 210000)) + list(range(300001, 310000))\ + + list(range(400001, 410000)) + list(range(500001, 510000)) + list(range(600001, 610000)) + list(range(700001, 710000))\ + + list(range(800001, 810000)) + list(range(900001, 1000000)) + + self.maz_N_list = list(range(10001, 90000)) + list(range(110001, 190000)) + list(range(210001, 290000)) + list(range(310001, 390000))\ + + list(range(410001, 490000)) + list(range(510001, 590000)) + list(range(610001, 690000)) + list(range(710001, 790000))\ + + list(range(810001, 890000)) + + self.tap_N_list = list(range(90001, 99999)) + list(range(190001, 199999)) + list(range(290001, 299999)) + list(range(390001, 399999))\ + + list(range(490001, 499999)) + list(range(590001, 599999)) + list(range(690001, 699999)) + list(range(790001, 799999))\ + + list(range(890001, 899999)) + + self.tap_N_start = { + "San Francisco" : 90001, + "San Mateo" : 190001, + "Santa Clara" : 290001, + "Alameda" : 390001, + "Contra Costa" : 490001, + "Solano" : 590001, + "Napa" : 690001, + "Sonoma" : 790001, + "Marin" : 890001 + } + + #MTC + self.osm_facility_type_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_facility_type_crosswalk.csv" + ) + #MC + ### Lanes + self.lanes_lookup_file = os.path.join( + self.data_file_location, "lookups", "lanes.csv" + ) + + ### TAZS + + self.taz_shape = os.path.join( + self.data_file_location, "TAZ", "TAZOfficialWCurrentForecasts.shp" + ) + ###### + #MTC + self.osm_lanes_attributes = os.path.join( + self.data_file_location, "lookups", "osm_lanes_attributes.csv" + ) + + self.legacy_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "legacy_tm2_attributes.csv" + ) + + self.assignable_analysis = os.path.join( + self.data_file_location, "lookups", "assignable_analysis_links.csv" + ) + ### + ### AREA TYPE - MC + self.area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "ThriveMSP2040CommunityDesignation.shp", + ) + self.area_type_variable_shp = "COMDES2040" + # area type map from raw data to model category + + # source 
https://metrocouncil.org/Planning/Publications-And-Resources/Thrive-MSP-2040-Plan-(1)/7_ThriveMSP2040_LandUsePoliciesbyCD.aspx + # urban center + # urban + # suburban + # suburban edge + # emerging suburban edge + # rural center + # diversified rural + # rural residential + # agricultural + self.area_type_code_dict = { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + + self.downtown_area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "downtownzones_TAZ.shp", + ) + + self.downtown_area_type = int(5) + + self.centroid_connect_lanes = int(1) + + self.osm_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_asgngrp_crosswalk.csv" + ) + self.mrcc_roadway_class_shape = os.path.join( + self.data_file_location, "mrcc", "trans_mrcc_centerlines.shp" + ) + #### + ###MTC + self.tam_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "tam_tm2_attributes.csv" + ) + + self.sfcta_attributes = os.path.join( + self.data_file_location, "lookups", "sfcta_attributes.csv" + ) + + self.tomtom_attributes = os.path.join( + self.data_file_location, "lookups", "tomtom_attributes.csv" + ) + + self.pems_attributes = os.path.join( + self.data_file_location, "lookups", "pems_attributes.csv" + ) + + self.centroid_file = os.path.join( + self.data_file_location, "centroid", "centroid_node.pickle" + ) + #### + ###MC + self.widot_shst_data = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "widot.out.matched.geojson", + ) + #### + + self.centroid_connector_link_file = os.path.join( + self.data_file_location, "centroid", "cc_link.pickle" + ) + + self.centroid_connector_shape_file = os.path.join( + self.data_file_location, "centroid", "cc_shape.pickle" + ) + + self.tap_file = os.path.join( + self.data_file_location, "tap", "tap_node.pickle" + ) + + self.tap_connector_link_file = os.path.join( + self.data_file_location, "tap", "tap_link.pickle" + ) + + self.tap_connector_shape_file = os.path.join( + self.data_file_location, "tap", "tap_shape.pickle" + ) + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + ###MTC + self.log_to_net_crosswalk = os.path.join(self.settings_location, "log_to_net.csv") + + self.emme_name_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "emme_attribute_names.csv" + ) + #### + #MC + self.mndot_count_variable_shp = "AADT_mn" + + self.widot_county_shape = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "TRADAS_(counts).shp", + ) + ### + ###MTC + self.mode_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "gtfs_to_tm2_mode_crosswalk.csv" + ) + + self.veh_cap_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "transitSeatCap.csv" + ) + + self.faresystem_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "faresystem_crosswalk.txt" + ) + + # https://app.asana.com/0/12291104512575/1200287255197808/f + self.fare_2015_to_2010_deflator = 0.927 + #### + #MC + self.widot_count_variable_shp = "AADT_wi" + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + self.log_to_net_crosswalk = os.path.join( + self.settings_location, "log_to_net.csv" + ) + + self.subregion_boundary_file = os.path.join( + self.data_file_location, 'emme', 'subregion_boundary_for_active_modes.shp' + ) + + self.subregion_boundary_id_variable = 'subregion' + #### + + self.output_variables = [ + 
"model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + #MTC + 'name', + "distance", + #"roadway", + #"name", + #MC + #"shape_id", + #"distance", + #"roadway", + #"name", + #"roadway_class", + #### + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "county", + "model_node_id", + "N", + "osm_node_id", + "geometry", + "X", + "Y", + "segment_id", + "managed", + "bus_only", + "rail_only", + #MTC + "assignable", + "cntype", + "useclass_AM", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "useclass_EA", + "transit", + "tollbooth", + "tollseg", + "ft", + "tap_drive", + "tollbooth", + "tollseg", + "farezone", + "tap_id", + #### + #MC + "bike_facility", + "mrcc_id", + "ROUTE_SYS", # mrcc functional class + #### + #bi-county + "nmt2010", + "nmt2020", + ] + + self.output_link_shp = os.path.join(self.scratch_location, "links.shp") + self.output_node_shp = os.path.join(self.scratch_location, "nodes.shp") + self.output_link_csv = os.path.join(self.scratch_location, "links.csv") + self.output_node_csv = os.path.join(self.scratch_location, "nodes.csv") + self.output_link_txt = os.path.join(self.scratch_location, "links.txt") + self.output_node_txt = os.path.join(self.scratch_location, "nodes.txt") + self.output_link_header_width_txt = os.path.join( + self.scratch_location, "links_header_width.txt" + ) + self.output_node_header_width_txt = os.path.join( + self.scratch_location, "nodes_header_width.txt" + ) + self.output_cube_network_script = os.path.join( + self.scratch_location, "make_complete_network_from_fixed_width_file.s" + ) + self.output_dir = os.path.join(self.scratch_location) + self.output_proj = CRS("ESRI:102646") + self.output_proj4 = '+proj=lcc +lat_1=32.78333333333333 +lat_2=33.88333333333333 +lat_0=32.16666666666666 +lon_0=-116.25 +x_0=2000000 +y_0=500000.0000000002 +ellps=GRS80 +datum=NAD83 +to_meter=0.3048006096012192 +no_defs' + self.prj_file = os.path.join(self.data_file_location, 'projection', '102646.prj') + self.wkt_projection = 'PROJCS["NAD_1983_StatePlane_California_VI_FIPS_0406_Feet",GEOGCS["GCS_North_American_1983",DATUM["North_American_Datum_1983",SPHEROID["GRS_1980",6378137,298.257222101]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["False_Easting",6561666.666666666],PARAMETER["False_Northing",1640416.666666667],PARAMETER["Central_Meridian",-116.25],PARAMETER["Standard_Parallel_1",32.78333333333333],PARAMETER["Standard_Parallel_2",33.88333333333333],PARAMETER["Latitude_Of_Origin",32.16666666666666],UNIT["Foot_US",0.30480060960121924],AUTHORITY["EPSG","102646"]]' + + self.fare_matrix_output_variables = ["faresystem", "origin_farezone", "destination_farezone", "price"] + + self.zones = 4756 + """ + Create all the possible headway variable combinations based on the cube time periods setting + """ + self.time_period_properties_list = [ + p + "[" + str(t) + "]" + for p in ["HEADWAY", "FREQ"] + for t in self.cube_time_periods.keys() + ] + + self.int_col = [ + "model_link_id", + "model_node_id", + "A", + "B", + #MTC + #"county", + ### + #MC + # "lanes", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "roadway_class", + "assign_group", + #"county", + "area_type", + "trn_priority", + "AADT", + "count_AM", + "count_MD", + "count_PM", + "count_NT", + "count_daily", + "centroidconnect", + "bike_facility", + #### + "drive_access", + "walk_access", + "bike_access", + "truck_access", + #MTC + "ML_lanes_AM", + 
"ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "ML_lanes_EA", + ### + #MC + "drive_node", + "walk_node", + "bike_node", + "transit_node", + # "ML_lanes", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_NT", + #### + "segment_id", + "managed", + "bus_only", + "rail_only", + "transit", + ##MTC + "ft", + "assignable", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EA", + "lanes_EV", + "useclass_AM", + "useclass_EA", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "tollseg", + "tollbooth", + "farezone", + "tap_id", + #### + #bi-county + "nmt2010", + "nmt2020", + ] + + self.float_col = [ + "distance", + "price", + "X", + "Y" + "mrcc_id", + ] + + self.float_col = ["distance", "ttime_assert", "price", "X", "Y"] + + self.string_col = [ + "osm_node_id", + "name", + "roadway", + "shstGeometryId", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "ROUTE_SYS", + ] + + # pnr parameters + self.pnr_node_location = os.path.join( + self.data_file_location, "lookups", "pnr_stations.csv" + ) + + self.drive_buffer = 6 + + self.__dict__.update(kwargs)
\ No newline at end of file
diff --git a/branch/bicounty/_modules/lasso/project/index.html b/branch/bicounty/_modules/lasso/project/index.html new file mode 100644 index 0000000..b99a8cf --- /dev/null +++ b/branch/bicounty/_modules/lasso/project/index.html @@ -0,0 +1,1506 @@
+lasso.project — lasso documentation
Source code for lasso.project

+import json
+import os
+import re
+from typing import Any, Dict, Optional, Union, List
+from csv import reader
+
+from pandas.core import base
+
+import numpy as np
+import pandas as pd
+from pandas import DataFrame
+import geopandas as gpd
+
+from network_wrangler import ProjectCard
+from network_wrangler import RoadwayNetwork
+
+from .transit import CubeTransit, StandardTransit
+from .logger import WranglerLogger
+from .parameters import Parameters
+from .roadway import ModelRoadwayNetwork
+from .util import column_name_to_parts
+
+
+
[docs]class Project(object): + """A single or set of changes to the roadway or transit system. + + Compares a base and a build transit network or a base and build + highway network and produces project cards. + + .. highlight:: python + + Typical usage example: + :: + test_project = Project.create_project( + base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + test_project.evaluate_changes() + test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + + Attributes: + DEFAULT_PROJECT_NAME: a class-level constant that defines what + the project name will be if none is set. + STATIC_VALUES: a class-level constant which defines values that + are not evaluated when assessing changes. + card_data (dict): {"project": <project_name>, "changes": <list of change dicts>} + roadway_link_changes (DataFrame): pandas dataframe of CUBE roadway link changes. + roadway_node_changes (DataFrame): pandas dataframe of CUBE roadway node changes. + transit_changes (CubeTransit): + base_roadway_network (RoadwayNetwork): + base_cube_transit_network (CubeTransit): + build_cube_transit_network (CubeTransit): + project_name (str): name of the project, set to DEFAULT_PROJECT_NAME if not provided + parameters: an instance of the Parameters class which sets a bunch of parameters + """ + + DEFAULT_PROJECT_NAME = "USER TO define" + + STATIC_VALUES = [ + "model_link_id", + "area_type", + "county", + # "assign_group", + "centroidconnect", + ] + CALCULATED_VALUES = [ + "area_type", + "county", + "assign_group", + "centroidconnect", + ] + +
[docs] def __init__( + self, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[DataFrame] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = "", + evaluate: Optional[bool] = False, + parameters: Union[dict, Parameters] = {}, + ): + """ + ProjectCard constructor. + + args: + roadway_link_changes: dataframe of roadway changes read from a log file + roadway_node_changes: dataframe of roadway changes read from a log file + transit_changes: dataframe of transit changes read from a log file + base_roadway_network: RoadwayNetwork instance for base case + base_transit_network: StandardTransit instance for base case + base_cube_transit_network: CubeTransit instance for base transit network + build_cube_transit_network: CubeTransit instance for build transit network + project_name: name of the project + evaluate: defaults to false, but if true, will create card data + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + returns: instance of ProjectCard + """ + self.card_data = Dict[str, Dict[str, Any]] + + self.roadway_link_changes = roadway_link_changes + self.roadway_node_changes = roadway_node_changes + self.base_roadway_network = base_roadway_network + self.base_transit_network = base_transit_network + self.base_cube_transit_network = base_cube_transit_network + self.build_cube_transit_network = build_cube_transit_network + self.transit_changes = transit_changes + self.project_name = ( + project_name if project_name else Project.DEFAULT_PROJECT_NAME + ) + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + if base_roadway_network != None: + self.determine_roadway_network_changes_compatibility( + self.base_roadway_network, + self.roadway_link_changes, + self.roadway_node_changes, + self.parameters + ) + + if evaluate: + self.evaluate_changes()
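+
+    # Illustrative sketch (not part of the library source): "parameters" may be either a
+    # plain dict of settings or a Parameters instance; any other type raises a ValueError.
+    # The inputs below are hypothetical.
+    #
+    #   project = Project(
+    #       roadway_link_changes=link_df,    # e.g. from Project.read_logfile(...)
+    #       roadway_node_changes=node_df,
+    #       base_roadway_network=base_net,   # a RoadwayNetwork / ModelRoadwayNetwork instance
+    #       parameters={"lasso_base_dir": "/path/to/lasso"},
+    #       evaluate=True,                   # populates self.card_data on construction
+    #   )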
+ +
[docs] def write_project_card(self, filename: str = None): + """ + Writes project cards. + + Args: + filename (str): File path to output .yml + + Returns: + None + """ + ProjectCard(self.card_data).write(out_filename=filename)
+ +
[docs] @staticmethod + def create_project( + roadway_log_file: Union[str, List[str], None] = None, + roadway_shp_file: Optional[str] = None, + roadway_csv_file: Optional[str] = None, + network_build_file: Optional[str] = None, + emme_node_id_crosswalk_file: Optional[str] = None, + emme_name_crosswalk_file: Optional[str] = None, + base_roadway_dir: Optional[str] = None, + base_transit_dir: Optional[str] = None, + base_cube_transit_source: Optional[str] = None, + build_cube_transit_source: Optional[str] = None, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[CubeTransit] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = None, + recalculate_calculated_variables: Optional[bool] = False, + recalculate_distance: Optional[bool] = False, + parameters: Optional[dict] = {}, + **kwargs, + ): + """ + Constructor for a Project instance. + + Args: + roadway_log_file: File path to consuming logfile or a list of logfile paths. + roadway_shp_file: File path to consuming shape file for roadway changes. + roadway_csv_file: File path to consuming csv file for roadway changes. + network_build_file: File path to consuming EMME network build for network changes. + base_roadway_dir: Folder path to base roadway network. + base_transit_dir: Folder path to base transit network. + base_cube_transit_source: Folder path to base transit network or cube line file string. + base_cube_transit_file: File path to base transit network. + build_cube_transit_source: Folder path to build transit network or cube line file string. + build_cube_transit_file: File path to build transit network. + roadway_link_changes: pandas dataframe of CUBE roadway link changes. + roadway_node_changes: pandas dataframe of CUBE roadway node changes. + transit_changes: build transit changes. + base_roadway_network: Base roadway network object. + base_cube_transit_network: Base cube transit network object. + build_cube_transit_network: Build cube transit network object. + project_name: If not provided, will default to the roadway_log_file filename if + provided (or the first filename if a list is provided) + recalculate_calculated_variables: if reading in a base network, if this is true it + will recalculate variables such as area type, etc. This only needs to be true + if you are creating project cards that are changing the calculated variables. + recalculate_distance: recalculate the distance variable. This only needs to be + true if you are creating project cards that change the distance. + parameters: dictionary of parameters + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in + the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables + in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. 
+ managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + + Returns: + A Project instance. + """ + + if base_cube_transit_source and base_cube_transit_network: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_cube_transit_network' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_cube_transit_source: + base_cube_transit_network = CubeTransit.create_from_cube(base_cube_transit_source, parameters) + WranglerLogger.debug( + "Base network has {} lines".format(len(base_cube_transit_network.lines)) + ) + if len(base_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Base network lines: {}".format( + "\n - ".join(base_cube_transit_network.lines) + ) + ) + elif base_cube_transit_network: + pass + else: + msg = "No base cube transit network." + WranglerLogger.info(msg) + base_cube_transit_network = None + + if build_cube_transit_source and transit_changes: + msg = "Method takes only one of 'build_cube_transit_source' and 'transit_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if build_cube_transit_source: + WranglerLogger.debug("build") + build_cube_transit_network = CubeTransit.create_from_cube(build_cube_transit_source, parameters) + WranglerLogger.debug( + "Build network has {} lines".format(len(build_cube_transit_network.lines)) + ) + if len(build_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Build network lines: {}".format( + "\n - ".join(build_cube_transit_network.lines) + ) + ) + elif transit_changes: + pass + else: + msg = "No cube transit changes given or processed." 
+ WranglerLogger.info(msg) + transit_changes = None + + if roadway_log_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_log_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_csv_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_csv_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_shp_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_shp_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and not project_name: + if type(roadway_log_file) == list: + project_name = os.path.splitext(os.path.basename(roadway_log_file[0]))[ + 0 + ] + WranglerLogger.info( + "No Project Name - Using name of first log file in list" + ) + else: + project_name = os.path.splitext(os.path.basename(roadway_log_file))[0] + WranglerLogger.info("No Project Name - Using name of log file") + if network_build_file and not project_name: + if type(network_build_file) == list: + with open(network_build_file[0]) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info( + "No Project Name - Using metadata of first network build file in list" + ) + else: + with open(network_build_file) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info("No Project Name - Using metadata of network build file") + if roadway_log_file: + roadway_link_changes, roadway_node_changes = Project.read_logfile(roadway_log_file) + elif roadway_shp_file: + roadway_changes = gpd.read_file(roadway_shp_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_link_changes = DataFrame(roadway_link_changes.drop("geometry", axis=1)) + roadway_node_changes = DataFrame(roadway_node_changes.drop("geometry", axis=1)) + roadway_node_changes["model_node_id"] = 0 + elif roadway_csv_file: + roadway_changes = pd.read_csv(roadway_csv_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_node_changes["model_node_id"] = 0 + elif network_build_file: + roadway_link_changes, roadway_node_changes, transit_changes = Project.read_network_build_file(network_build_file) + if emme_node_id_crosswalk_file: + # get wrangler IDs from emme element_id + roadway_link_changes, roadway_node_changes, transit_changes = Project.emme_id_to_wrangler_id( + 
roadway_link_changes, + roadway_node_changes, + transit_changes, + emme_node_id_crosswalk_file + ) + else: + msg = "User needs to specify emme node id crosswalk file using emme_node_id_crosswalk_file = " + WranglerLogger.error(msg) + raise ValueError(msg) + # rename emme attributes to wrangler attributes + if emme_name_crosswalk_file is None: + emme_name_crosswalk_file = parameters.emme_name_crosswalk_file + roadway_link_changes, roadway_node_changes = Project.emme_name_to_wrangler_name( + roadway_link_changes, + roadway_node_changes, + emme_name_crosswalk_file + ) + elif roadway_link_changes: + pass + elif roadway_node_changes: + pass + else: + msg = "No roadway changes given or processed." + WranglerLogger.info(msg) + roadway_link_changes = pd.DataFrame({}) + roadway_node_changes = pd.DataFrame({}) + + if base_roadway_network and base_roadway_dir: + msg = "Method takes only one of 'base_roadway_network' and 'base_roadway_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_roadway_dir: + base_roadway_network = ModelRoadwayNetwork.read( + os.path.join(base_roadway_dir, "link.json"), + os.path.join(base_roadway_dir, "node.geojson"), + os.path.join(base_roadway_dir, "shape.geojson"), + fast=True, + recalculate_calculated_variables=recalculate_calculated_variables, + recalculate_distance=recalculate_distance, + parameters=parameters, + **kwargs, + ) + base_roadway_network.split_properties_by_time_period_and_category() + elif base_roadway_network: + base_roadway_network.split_properties_by_time_period_and_category() + else: + msg = "No base roadway network." + WranglerLogger.info(msg) + base_roadway_network = None + + if base_cube_transit_source and base_transit_dir: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_transit_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_transit_dir: + base_transit_network = StandardTransit.read_gtfs( + gtfs_feed_dir=base_transit_dir, + parameters=parameters + ) + else: + msg = "No base transit network." + WranglerLogger.info(msg) + base_transit_network = None + + project = Project( + roadway_link_changes=roadway_link_changes, + roadway_node_changes=roadway_node_changes, + transit_changes=transit_changes, + base_roadway_network=base_roadway_network, + base_transit_network=base_transit_network, + base_cube_transit_network=base_cube_transit_network, + build_cube_transit_network=build_cube_transit_network, + evaluate=True, + project_name=project_name, + parameters=parameters, + ) + + return project
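+
+    # Illustrative sketch (not part of the library source); CUBE_DIR, BASE_ROADWAY_DIR and
+    # SCRATCH_DIR are hypothetical directory constants:
+    #
+    #   project = Project.create_project(
+    #       roadway_log_file=os.path.join(CUBE_DIR, "roadway_changes.log"),
+    #       base_roadway_dir=BASE_ROADWAY_DIR,
+    #       parameters={"lasso_base_dir": "/path/to/lasso"},
+    #   )
+    #   project.write_project_card(os.path.join(SCRATCH_DIR, "roadway_changes.yml"))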
+ +
[docs] @staticmethod + def read_logfile(logfilename: Union[str, List[str]]): + """ + Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + logfilename (str or list[str]): File path to CUBE logfile or list of logfile paths. + + Returns: + A DataFrame reprsentation of the log file. + """ + if type(logfilename) == str: + logfilename = [logfilename] + + link_df = pd.DataFrame() + node_df = pd.DataFrame() + + for file in logfilename: + WranglerLogger.info("Reading logfile: {}".format(file)) + with open(file) as f: + _content = f.readlines() + + _node_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("N") + ] + WranglerLogger.debug("node lines: {}".format(_node_lines)) + _link_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("L") + ] + WranglerLogger.debug("link lines: {}".format(_link_lines)) + + _nodecol = ["OBJECT", "OPERATION", "GROUP"] + _node_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Node Cols: {}".format(_nodecol)) + _linkcol = ["OBJECT", "OPERATION", "GROUP"] + _link_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Link Cols: {}".format(_linkcol)) + + def split_log(x): + return list(reader([x], delimiter=',', quotechar='"'))[0] + + _node_df = pd.DataFrame([split_log(x) for x in _node_lines[1:]],columns = _nodecol) + WranglerLogger.debug("Node DF: {}".format(_node_df)) + _link_df = pd.DataFrame([split_log(x) for x in _link_lines[1:]],columns = _linkcol) + WranglerLogger.debug("Link DF: {}".format(_link_df)) + + node_df = pd.concat([node_df, _node_df]) + link_df = pd.concat([link_df, _link_df]) + + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + link_df.columns = [c.split("[")[0] for c in link_df.columns] + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + node_df.columns = [c.split("[")[0] for c in node_df.columns] + + if len(link_df) > 0: + # create operation history + action_history_df = ( + link_df.groupby(['A', 'B'])["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + link_df = pd.merge(link_df, action_history_df, on=['A', 'B'], how="left") + + if len(node_df) > 0: + action_history_df = ( + node_df.groupby('N')["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + node_df = pd.merge(node_df, action_history_df, on='N', how="left") + + WranglerLogger.info( + "Processed {} Node lines and {} Link lines".format( + node_df.shape[0], link_df.shape[0] + ) + ) + + return link_df, node_df
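+
+    # Illustrative sketch (not part of the library source); the log file path is hypothetical:
+    #
+    #   link_df, node_df = Project.read_logfile(
+    #       os.path.join(CUBE_DIR, "roadway_changes.log")
+    #   )
+    #   # Both frames carry "operation_history" and "operation_final" columns that
+    #   # consolidate repeated A/C/D edits on the same link (A, B) or node (N).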
+ +
[docs] @staticmethod + def read_network_build_file(networkbuildfilename: Union[str, List[str]]): + """ + Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + networkbuildfilename (str or list[str]): File path to emme nework build file or list of network build file paths. + + Returns: + A DataFrame representation of the network build file + """ + if type(networkbuildfilename) == str: + networkbuildfilename = [networkbuildfilename] + + _link_command_history_df = DataFrame() + _node_command_history_df = DataFrame() + _transit_command_history_df = DataFrame() + + for file in networkbuildfilename: + WranglerLogger.info("Reading network build file: {}".format(file)) + with open(file) as f: + _content = json.load(f) + + _command_history = _content.get('command_history') + + # loop through all the commands + for command in _command_history: + if command.get('command') == 'set_attribute': + element_id = command.get('parameters').get('element_ids') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + _command_df[command.get('parameters').get('attribute_name')] = command.get('parameters').get('value') + + if command.get('command') in ['create_link', 'create_node']: + if command.get('command') == 'create_link': + element_id = command.get('results').get('changes').get('added').get('LINK') + if command.get('command') == 'create_node': + element_id = command.get('results').get('changes').get('added').get('NODE') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + for attribute_name, attribute_value in command.get('parameters').get('attributes').items(): + _command_df[attribute_name] = attribute_value + + if command.get('command') == 'delete_link': + element_id = command.get('results').get('changes').get('removed').get('LINK') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + if command.get('command') == 'modify_transit_line': + element_id = command.get('parameters').get('line_id') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : pd.Series(element_id), + 'object' : pd.Series(object), + 'operation' : pd.Series(operation) + } + ) + + _command_df['new_itinerary'] = [command.get('parameters').get('new_itinerary')] + + if ('L' in _command_df['object'].unique()): + _link_command_history_df = _link_command_history_df.append( + _command_df[_command_df['object'] == 'L'], + sort = False, + ignore_index = True + ) + + if ('N' in _command_df['object'].unique()): + _node_command_history_df = _node_command_history_df.append( + _command_df[_command_df['object'] == 'N'], + sort = False, + ignore_index = True + ) + + if ( + ('TRANSIT_LINE' in _command_df['object'].unique()) | + ('TRANSIT_STOP' in _command_df['object'].unique()) | + ('TRANSIT_SHAPE' in 
_command_df['object'].unique()) + ): + _transit_command_history_df = _transit_command_history_df.append( + _command_df[_command_df['object'].isin(['TRANSIT_LINE', 'TRANSIT_STOP', 'TRANSIT_SHAPE'])], + sort = False, + ignore_index = True + ) + + if len(_link_command_history_df) > 0: + # create operation history + link_action_history_df = ( + _link_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + link_action_history_df["operation_final"] = link_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + # get the last none null value for each element + # consolidate elements to single record + def get_last_valid(series): + if len(series.dropna()) > 0: + return series.dropna().iloc[-1] + else: + return np.nan + + #_command_history_df = _command_history_df.groupby(['element_id']).apply(get_last_valid).reset_index() + _link_command_history_df = _link_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _link_command_history_df = pd.merge(_link_command_history_df, link_action_history_df, on='element_id', how="left") + + if len(_node_command_history_df) > 0: + # create node operation history + node_action_history_df = ( + _node_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + node_action_history_df["operation_final"] = node_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + _node_command_history_df = _node_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _node_command_history_df = pd.merge(_node_command_history_df, node_action_history_df, on='element_id', how="left") + + WranglerLogger.info( + "Processed {} link element commands, {} node element commands".format( + _link_command_history_df.shape[0], + _node_command_history_df.shape[0] + ) + ) + + return _link_command_history_df, _node_command_history_df, _transit_command_history_df
+ +
[docs] @staticmethod + def emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file): + """ + rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder + """ + WranglerLogger.info('Reading emme node id crosswalk file from {}'.format(emme_node_id_crosswalk_file)) + emme_node_id_crosswalk_df = pd.read_csv(emme_node_id_crosswalk_file) + emme_node_id_dict = dict(zip(emme_node_id_crosswalk_df['emme_node_id'], emme_node_id_crosswalk_df['model_node_id'])) + + # get node changes + if len(emme_node_change_df) > 0: + emme_node_change_df['emme_id'] = emme_node_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + + # get new emme nodes + new_emme_node_id_list = [ + n for n in emme_node_change_df['emme_id'] if n not in emme_node_id_crosswalk_df['emme_node_id'] + ] + WranglerLogger.info('New emme node id list {}'.format(new_emme_node_id_list)) + new_wrangler_node = emme_node_id_crosswalk_df['model_node_id'].max() + + # add crosswalk for new emme nodes + for new_emme_node in new_emme_node_id_list: + new_wrangler_node = new_wrangler_node + 1 + emme_node_id_dict.update({new_emme_node : new_wrangler_node}) + + # for nodes update model_node_id + emme_node_change_df['model_node_id'] = emme_node_change_df['emme_id'].map(emme_node_id_dict).fillna(0) + + if len(emme_link_change_df) > 0: + emme_link_change_df['A'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + emme_link_change_df['B'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[-1])) + # for links update A,B nodes + emme_link_change_df['A'] = emme_link_change_df['A'].map(emme_node_id_dict) + emme_link_change_df['B'] = emme_link_change_df['B'].map(emme_node_id_dict) + + if len(emme_transit_changes_df) > 0: + emme_transit_changes_df['i_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-3] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + emme_transit_changes_df['j_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-2] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + # update i,j nodes + emme_transit_changes_df['i_node'] = emme_transit_changes_df[ + 'i_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + emme_transit_changes_df['j_node'] = emme_transit_changes_df[ + 'j_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + + # update routing nodes + emme_transit_changes_df['new_itinerary'] = emme_transit_changes_df.apply( + lambda x: [emme_node_id_dict.get(n) for n in x['new_itinerary']] if x['object'] == 'TRANSIT_SHAPE' else 0, + axis = 1 + ) + + return emme_link_change_df, emme_node_change_df, emme_transit_changes_df
+ +
[docs] def get_object_from_network_build_command(row):
+        """
+        Determine whether a network build command operates on a link, a node, or a transit element.
+
+        Args:
+            row: a single command record (dict) from the network build file's command history
+
+        Returns:
+            'L' for link, 'N' for node, or one of 'TRANSIT_LINE', 'TRANSIT_STOP', 'TRANSIT_SHAPE' for transit elements
+        """
+
+        if row.get('command') == 'create_link':
+            return 'L'
+
+        if row.get('command') == 'create_node':
+            return 'N'
+
+        if row.get('command') == 'delete_link':
+            return 'L'
+
+        if row.get('command') == 'set_attribute':
+            if row.get('parameters').get('element_type') == 'LINK':
+                return 'L'
+            if row.get('parameters').get('element_type') == 'NODE':
+                return 'N'
+            if row.get('parameters').get('element_type') == 'TRANSIT_LINE':
+                return 'TRANSIT_LINE'
+            if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT':
+                return 'TRANSIT_STOP'
+
+        if row.get('command') == 'modify_transit_line':
+            return 'TRANSIT_SHAPE'
+ +
[docs] def get_operation_from_network_build_command(row):
+        """
+        Determine the operation type of a network build command.
+
+        Args:
+            row: a single command record (dict) from the network build file's command history
+
+        Returns:
+            'A' (add), 'C' (change), or 'D' (delete)
+        """
+
+        if row.get('command') == 'create_link':
+            return 'A'
+
+        if row.get('command') == 'create_node':
+            return 'A'
+
+        if row.get('command') == 'delete_link':
+            return 'D'
+
+        if row.get('command') == 'set_attribute':
+            if row.get('parameters').get('element_type') == 'LINK':
+                return 'C'
+            if row.get('parameters').get('element_type') == 'NODE':
+                return 'C'
+            if row.get('parameters').get('element_type') == 'TRANSIT_LINE':
+                return 'C'
+            if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT':
+                return 'C'
+
+        if row.get('command') == 'modify_transit_line':
+            return 'C'
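+
+    # Illustrative sketch (not part of the library source): a "set_attribute" command on a
+    # LINK element is classified as object 'L' with operation 'C'.
+    #
+    #   cmd = {"command": "set_attribute",
+    #          "parameters": {"element_type": "LINK", "element_ids": ["1-2"],
+    #                         "attribute_name": "lanes", "value": 3}}
+    #   Project.get_object_from_network_build_command(cmd)     # -> 'L'
+    #   Project.get_operation_from_network_build_command(cmd)  # -> 'C'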
+ +
[docs] @staticmethod + def emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file): + """ + rename emme names to wrangler names using crosswalk file + """ + + WranglerLogger.info('Reading emme attribute name crosswalk file {}'.format(emme_name_crosswalk_file)) + emme_name_crosswalk_df = pd.read_csv(emme_name_crosswalk_file) + emme_name_crosswalk_dict = dict(zip(emme_name_crosswalk_df['emme_name'], emme_name_crosswalk_df['wrangler_name'])) + + # drop columns we don't need from emme to avoid confusion + ignore_columns = [ + c for c in emme_link_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'A', 'B'] + ] + WranglerLogger.info('Ignoring link changes in {}'.format(ignore_columns)) + emme_link_change_df = emme_link_change_df.drop(ignore_columns, axis = 1) + + ignore_columns = [ + c for c in emme_node_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'model_node_id'] + ] + WranglerLogger.info('Ignoring node changes in {}'.format(ignore_columns)) + emme_node_change_df = emme_node_change_df.drop(ignore_columns, axis = 1) + + # rename emme name to wrangler name + emme_link_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + emme_node_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + + return emme_link_change_df, emme_node_change_df
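+
+    # Illustrative sketch (not part of the library source): the attribute name crosswalk CSV
+    # is expected to carry "emme_name" and "wrangler_name" columns; the rows are hypothetical.
+    #
+    #   emme_name,wrangler_name
+    #   num_lanes,lanes
+    #   @bus_only,bus_only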
+ +
[docs] @staticmethod + def determine_roadway_network_changes_compatibility( + base_roadway_network: ModelRoadwayNetwork, + roadway_link_changes: DataFrame, + roadway_node_changes: DataFrame, + parameters: Parameters, + ): + """ + Checks to see that any links or nodes that change exist in base roadway network. + """ + WranglerLogger.info( + "Evaluating compatibility between roadway network changes and base network. Not evaluating deletions." + ) + + # CUBE log file saves all variable names in upper cases, need to convert them to be same as network + log_to_net_df = pd.read_csv(parameters.log_to_net_crosswalk) + log_to_net_dict = dict(zip(log_to_net_df["log"], log_to_net_df["net"])) + + dbf_to_net_df = pd.read_csv(parameters.net_to_dbf_crosswalk) + dbf_to_net_dict = dict(zip(dbf_to_net_df["dbf"], dbf_to_net_df["net"])) + + for c in roadway_link_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_link_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_link_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_link_changes.rename(columns=dbf_to_net_dict, inplace=True) + + for c in roadway_node_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_node_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_node_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_node_changes.rename(columns=dbf_to_net_dict, inplace=True) + + # for links "L" that change "C", + # find locations where there isn't a base roadway link + if len(roadway_link_changes) > 0: + link_changes_df = roadway_link_changes[ + roadway_link_changes["operation_final"] == "C" + ].copy() + + link_merge_df = pd.merge( + link_changes_df[["A", "B"]].astype(str), + base_roadway_network.links_df[["A", "B", "model_link_id"]].astype(str), + how="left", + on=["A", "B"], + ) + + missing_links = link_merge_df.loc[link_merge_df["model_link_id"].isna()] + + if missing_links.shape[0]: + msg = "Network missing the following AB links:\n{}".format(missing_links) + WranglerLogger.error(msg) + raise ValueError(msg) + + # for links "N" that change "C", + # find locations where there isn't a base roadway node + if len(roadway_node_changes) > 0: + node_changes_df = roadway_node_changes[ + roadway_node_changes["operation_final"] == "C" + ].copy() + + node_merge_df = pd.merge( + node_changes_df[["model_node_id"]], + base_roadway_network.nodes_df[["model_node_id", "geometry"]], + how="left", + on=["model_node_id"], + ) + missing_nodes = node_merge_df.loc[node_merge_df["geometry"].isna()] + if missing_nodes.shape[0]: + msg = "Network missing the following nodes:\n{}".format(missing_nodes) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] def evaluate_changes(self): + """ + Determines which changes should be evaluated, initiates + self.card_data to be an aggregation of transit and highway changes. + """ + highway_change_list = [] + transit_change_list = [] + + WranglerLogger.info("Evaluating project changes.") + + if (not self.roadway_link_changes.empty) | (not self.roadway_node_changes.empty): + highway_change_list = self.add_highway_changes() + + if self.transit_changes is not None: + if (not self.transit_changes.empty) or ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + if ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + self.card_data = { + "project": self.project_name, + "changes": transit_change_list + highway_change_list, + }
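+
+    # Illustrative sketch (not part of the library source): after evaluate_changes() the
+    # card_data attribute has the shape consumed by write_project_card(), e.g.
+    #
+    #   {
+    #       "project": "roadway_changes",     # hypothetical project name
+    #       "changes": [<transit change dicts> + <highway change dicts>],
+    #   }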
+ +
[docs] def add_transit_changes(self): + """ + Evaluates changes between base and build transit objects and + adds entries into the self.card_data dictionary. + """ + if self.build_cube_transit_network: + transit_change_list = self.build_cube_transit_network.evaluate_differences( + self.base_cube_transit_network + ) + elif self.base_transit_network: + transit_change_list = self.base_transit_network.evaluate_differences( + self.transit_changes + ) + return transit_change_list
+ + @staticmethod + def _final_op(x): + if x["operation_history"][-1] == "D": + if "A" in x["operation_history"][:-1]: + return "N" + else: + return "D" + elif x["operation_history"][-1] == "A": + if "D" in x["operation_history"][:-1]: + return "C" + else: + return "A" + else: + if "A" in x["operation_history"][:-1]: + return "A" + else: + return "C" + +
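+
+    # Illustrative sketch (not part of the library source): _final_op() collapses an
+    # operation history into a single net operation, for example:
+    #
+    #   ["A", "C"] -> "A"   added then changed: still an addition
+    #   ["D", "A"] -> "C"   deleted then re-added: treated as a change
+    #   ["A", "D"] -> "N"   added then deleted: net no-op
+    #   ["C", "C"] -> "C"   changed repeatedly: a change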
[docs] def add_highway_changes(self, limit_variables_to_existing_network=False): + """ + Evaluates changes from the log file based on the base highway object and + adds entries into the self.card_data dictionary. + + Args: + limit_variables_to_existing_network (bool): True if no ad-hoc variables. Default to False. + """ + + for c in self.parameters.string_col: + if c in self.roadway_link_changes.columns: + self.roadway_link_changes[c] = self.roadway_link_changes[c].str.lstrip(" ") + if c in self.roadway_node_changes.columns: + self.roadway_node_changes[c] = self.roadway_node_changes[c].str.lstrip(" ") + + ## if worth it, could also add some functionality to network wrangler itself. + node_changes_df = self.roadway_node_changes.copy() + + link_changes_df = self.roadway_link_changes.copy() + + def _process_deletions(link_changes_df): + """ + create deletion section in project card + """ + WranglerLogger.debug("Processing link deletions") + + cube_delete_df = link_changes_df[link_changes_df["operation_final"] == "D"].copy() + + # make sure columns has the same type as base network + cube_delete_df['A'] = cube_delete_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_delete_df['B'] = cube_delete_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_delete_df.columns: + cube_delete_df.drop(['model_link_id'], axis = 1, inplace = True) + + cube_delete_df = pd.merge( + cube_delete_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if len(cube_delete_df) > 0: + links_to_delete = cube_delete_df["model_link_id"].tolist() + delete_link_dict = { + "category": "Roadway Deletion", + "links": {"model_link_id": links_to_delete}, + } + WranglerLogger.debug("{} Links Deleted.".format(len(links_to_delete))) + else: + delete_link_dict = None + WranglerLogger.debug("No link deletions processed") + + return delete_link_dict + + def _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ): + """""" + WranglerLogger.debug("Processing link additions") + cube_add_df = link_changes_df[link_changes_df["operation_final"] == "A"] + if len(cube_add_df) == 0: + WranglerLogger.debug("No link additions processed") + return {} + + if limit_variables_to_existing_network: + add_col = [ + c + for c in cube_add_df.columns + if c in self.base_roadway_network.links_df.columns + ] + else: + add_col = [ + c for c in cube_add_df.columns if c not in ["operation_final"] + ] + # can leave out "operation_final" from writing out, is there a reason to write it out? 
+ + for x in add_col: + cube_add_df[x] = cube_add_df[x].astype(self.base_roadway_network.links_df[x].dtype) + + add_link_properties = cube_add_df[add_col].to_dict("records") + + # WranglerLogger.debug("Add Link Properties: {}".format(add_link_properties)) + WranglerLogger.debug("{} Links Added".format(len(add_link_properties))) + + return {"category": "Add New Roadway", "links": add_link_properties} + + def _process_node_additions(node_add_df): + """""" + WranglerLogger.debug("Processing node additions") + + if len(node_add_df) == 0: + WranglerLogger.debug("No node additions processed") + return [] + + node_add_df = node_add_df.drop(["operation_final"], axis=1) + + for x in node_add_df.columns: + node_add_df[x] = node_add_df[x].astype(self.base_roadway_network.nodes_df[x].dtype) + + add_nodes_dict_list = node_add_df.to_dict( + "records" + ) + WranglerLogger.debug("{} Nodes Added".format(len(add_nodes_dict_list))) + + return add_nodes_dict_list + + def _process_single_link_change(change_row, changeable_col): + """""" + + # 1. Find associated base year network values + base_df = self.base_roadway_network.links_df[ + (self.base_roadway_network.links_df["A"] == int(change_row.A)) + & (self.base_roadway_network.links_df["B"] == int(change_row.B)) + ] + + if not base_df.shape[0]: + msg = "No match found in network for AB combination: ({},{}). Incompatible base network.".format( + change_row.A, change_row.B + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + elif base_df.shape[0] > 1: + WranglerLogger.warning( + "Found more than one match in base network for AB combination: ({},{}). Selecting first one to operate on but AB should be unique to network.".format( + change_row.A, change_row.B + ) + ) + + base_row = base_df.iloc[0] + # WranglerLogger.debug("Properties with changes: {}".format(changeable_col)) + + # 2. find columns that changed (enough) + changed_col = [] + for col in changeable_col: + WranglerLogger.debug("Assessing Column: {}".format(col)) + # if it is the same as before, or a static value, don't process as a change + if str(change_row[col]).strip('"\'') == str(base_row[col]).strip('"\''): + continue + # if it is NaN or None, don't process as a change + if (change_row[col] != change_row[col]) | (change_row[col] is None): + continue + if (col == "roadway_class") & (change_row[col] == 0): + continue + # only look at distance if it has significantly changed + if col == "distance": + if ( + abs( + (change_row[col] - float(base_row[col])) + / base_row[col].astype(float) + ) + > 0.01 + ): + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + else: + continue + else: + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + + WranglerLogger.debug( + "Properties with changes that will be processed: {}".format(changed_col) + ) + + if not changed_col: + return pd.DataFrame() + + # 3. Iterate through columns with changed values and structure the changes as expected in project card + property_dict_list = [] + processed_properties = [] + + # check if it's a manged lane change + for c in changed_col: + if c.startswith("ML_"): + # TODO ML project card skeleton + msg = "Detected managed lane changes, please create managed lane project card!" 
+ WranglerLogger.error(msg) + raise ValueError(msg) + return + + # regular roadway property change + for c in changed_col: + # WranglerLogger.debug("Processing Column: {}".format(c)) + ( + p_base_name, + p_time_period, + p_category, + managed_lane, + ) = column_name_to_parts(c, self.parameters) + + _d = { + "existing": base_row[c], + "set": change_row[c], + } + if c in Project.CALCULATED_VALUES: + _d = { + "set": change_row[c], + } + if p_time_period: + if managed_lane == 1: + _d["time"] = list( + self.parameters.time_period_to_time[p_time_period] + ) + if p_category: + _d["category"] = p_category + + # iterate through existing properties that have been changed and see if you should just add + if (p_base_name in processed_properties) & (managed_lane == 1): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + processed_p["timeofday"] += [_d] + elif (p_base_name in processed_properties) & (managed_lane == 0): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + if processed_p["set"] != change_row[c]: + msg = "Detected different changes for split-property variables on regular roadway links: " + msg += "conflicting \"{}\" values \"{}\", \"{}\"".format(p_base_name, processed_p["set"], change_row[c]) + WranglerLogger.error(msg) + raise ValueError(msg) + elif p_time_period: + if managed_lane == 1: + property_dict = {"property": p_base_name, "timeofday": [_d]} + processed_properties.append(p_base_name) + property_dict_list.append(property_dict) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + + card_df = pd.DataFrame( + { + "properties": pd.Series([property_dict_list]), + "model_link_id": pd.Series(base_row["model_link_id"]), + } + ) + + # WranglerLogger.debug('single change card_df:\n {}'.format(card_df)) + + return card_df + + def _process_link_changes(link_changes_df, changeable_col): + """""" + cube_change_df = link_changes_df[link_changes_df["operation_final"] == "C"].copy() + + # make sure columns has the same type as base network + cube_change_df['A'] = cube_change_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_change_df['B'] = cube_change_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_change_df.columns: + cube_change_df.drop('model_link_id', axis = 1, inplace = True) + + cube_change_df = pd.merge( + cube_change_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if not cube_change_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + change_link_dict_df = pd.DataFrame(columns=["properties", "model_link_id"]) + + for index, row in cube_change_df.iterrows(): + card_df = _process_single_link_change(row, changeable_col) + + change_link_dict_df = pd.concat( + [change_link_dict_df, card_df], ignore_index=True, sort=False + ) + + if not change_link_dict_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + # WranglerLogger.debug('change_link_dict_df Unaggregated:\n {}'.format(change_link_dict_df)) + + # Have to change to string so that it is a hashable type for the aggregation + change_link_dict_df["properties"] = change_link_dict_df[ + "properties" + ].astype(str) + # Group the changes that are the same + change_link_dict_df = ( + 
change_link_dict_df.groupby("properties")[["model_link_id"]] + .agg(lambda x: list(x)) + .reset_index() + ) + # WranglerLogger.debug('change_link_dict_df Aggregated:\n {}'.format(change_link_dict_df)) + + # Reformat model link id to correct "facility" format + change_link_dict_df["facility"] = change_link_dict_df.apply( + lambda x: {"link": [{"model_link_id": x.model_link_id}]}, axis=1 + ) + + # WranglerLogger.debug('change_link_dict_df 3: {}'.format(change_link_dict_df)) + change_link_dict_df["properties"] = change_link_dict_df["properties"].apply( + lambda x: json.loads( + x.replace("'\"", "'").replace("\"'", "'").replace("'", '"') + ) + ) + + change_link_dict_df["category"] = "Roadway Property Change" + + change_link_dict_list = change_link_dict_df[ + ["category", "facility", "properties"] + ].to_dict("record") + + WranglerLogger.debug( + "{} Changes Processed".format(len(change_link_dict_list)) + ) + return change_link_dict_list + + def _consolidate_actions(log, base, key_list): + log_df = log.copy() + # will be changed if to allow new variables being added/changed that are not in base network + changeable_col = [x for x in log_df.columns if x in base.columns] + #print(log_df) + #for x in changeable_col: + # print(x) + #log_df[x] = log_df[x].astype(base[x].dtype) + + if 'operation_final' not in log_df.columns: + action_history_df = ( + log_df.groupby(key_list)["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + + log_df = pd.merge(log_df, action_history_df, on=key_list, how="left") + log_df.drop_duplicates(subset=key_list, keep="last", inplace=True) + log_df["operation_final"] = log_df.apply(lambda x: Project._final_op(x), axis=1) + + return log_df[changeable_col + ["operation_final"]] + + delete_link_dict = None + add_link_dict = None + change_link_dict_list = [] + + if len(link_changes_df) != 0: + link_changes_df = _consolidate_actions( + link_changes_df, self.base_roadway_network.links_df, ["A", "B"] + ) + + # process deletions + delete_link_dict = _process_deletions(link_changes_df) + + # process additions + add_link_dict = _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ) + + # process changes + WranglerLogger.debug("Processing changes") + WranglerLogger.debug(link_changes_df) + changeable_col = list( + ( + set(link_changes_df.columns) + & set(self.base_roadway_network.links_df.columns) + ) + - set(Project.STATIC_VALUES) + ) + + cols_in_changes_not_in_net = list( + set(link_changes_df.columns) + - set(self.base_roadway_network.links_df.columns) + ) + + if cols_in_changes_not_in_net: + WranglerLogger.warning( + "The following attributes are specified in the changes but do not exist in the base network: {}".format( + cols_in_changes_not_in_net + ) + ) + + change_link_dict_list = _process_link_changes(link_changes_df, changeable_col) + + if len(node_changes_df) != 0: + node_changes_df = _consolidate_actions( + node_changes_df, self.base_roadway_network.nodes_df, ["model_node_id"] + ) + + # print error message for node change and node deletion + if ( + len(node_changes_df[node_changes_df["operation_final"].isin(["C", "D"])]) + > 0 + ): + msg = "NODE changes and deletions are not allowed!" 
+                WranglerLogger.warning(msg)
+                # raise ValueError(msg)
+            node_add_df = node_changes_df[node_changes_df["operation_final"] == "A"]
+
+            if add_link_dict:
+                add_link_dict["nodes"] = _process_node_additions(node_add_df)
+            else:
+                add_link_dict = {
+                    "category": "Add New Roadway",
+                    "nodes": _process_node_additions(node_add_df),
+                }
+
+        # combine the deletion, addition, and change dictionaries into one list,
+        # dropping any that are empty
+        highway_change_list = list(
+            filter(None, [delete_link_dict] + [add_link_dict] + change_link_dict_list)
+        )
+
+        return highway_change_list
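For orientation, the list returned above mixes the three kinds of dictionaries built by the helper functions. A minimal sketch of what one returned list might look like; the IDs, coordinates, and property values are made up for illustration:

# hypothetical example of a highway_change_list returned by add_highway_changes();
# model_link_id / model_node_id values and property values are illustrative only
example_highway_change_list = [
    {   # from _process_deletions
        "category": "Roadway Deletion",
        "links": {"model_link_id": [100123, 100124]},
    },
    {   # from _process_link_additions / _process_node_additions
        "category": "Add New Roadway",
        "links": [{"A": 1001, "B": 1002, "lanes": 2}],
        "nodes": [{"model_node_id": 1002, "X": -93.1, "Y": 44.9}],
    },
    {   # from _process_link_changes
        "category": "Roadway Property Change",
        "facility": {"link": [{"model_link_id": [100200]}]},
        "properties": [{"property": "lanes", "existing": 2, "set": 3}],
    },
]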
\ No newline at end of file diff --git a/branch/bicounty/_modules/lasso/roadway/index.html b/branch/bicounty/_modules/lasso/roadway/index.html new file mode 100644 index 0000000..946c04b --- /dev/null +++ b/branch/bicounty/_modules/lasso/roadway/index.html @@ -0,0 +1,2037 @@ +lasso.roadway — lasso documentation
Source code for lasso.roadway

+import copy
+import glob
+import os
+from typing import Optional, Union
+
+import geopandas as gpd
+import pandas as pd
+
+from geopandas import GeoDataFrame
+from pandas import DataFrame
+import numpy as np
+
+from network_wrangler import RoadwayNetwork
+from .parameters import Parameters
+from .logger import WranglerLogger
+
+
+
[docs]class ModelRoadwayNetwork(RoadwayNetwork): + """ + Subclass of network_wrangler class :ref:`RoadwayNetwork <network_wrangler:RoadwayNetwork>` + + A representation of the physical roadway network and its properties. + """ + + CALCULATED_VALUES = [ + "area_type", + "county", + "centroidconnect", + ] + +
[docs] def __init__( + self, + nodes: GeoDataFrame, + links: DataFrame, + shapes: GeoDataFrame, + parameters: Union[Parameters, dict] = {}, + **kwargs, + ): + """ + Constructor + + Args: + nodes: geodataframe of nodes + links: dataframe of links + shapes: geodataframe of shapes + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. + If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + """ + super().__init__(nodes, links, shapes, **kwargs) + + # will have to change if want to alter them + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.links_metcouncil_df = None + self.nodes_metcouncil_df = None + + self.fill_na() + self.convert_int()
+ # self.shapes_metcouncil_df = None + ##todo also write to file + # WranglerLogger.debug("Used PARAMS\n", '\n'.join(['{}: {}'.format(k,v) for k,v in self.parameters.__dict__.items()])) + +
[docs] @staticmethod + def read( + link_filename: str, + node_filename: str, + shape_filename: str, + fast: bool = False, + recalculate_calculated_variables=False, + recalculate_distance=False, + parameters: Union[dict, Parameters] = {}, + **kwargs, + ): + """ + Reads in links and nodes network standard. + + Args: + link_filename (str): File path to link json. + node_filename (str): File path to node geojson. + shape_filename (str): File path to link true shape geojson + fast (bool): boolean that will skip validation to speed up read time. + recalculate_calculated_variables (bool): calculates fields from spatial joins, etc. + recalculate_distance (bool): re-calculates distance. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + Returns: + ModelRoadwayNetwork + """ + + nodes_df, links_df, shapes_df = RoadwayNetwork.load_transform_network( + node_filename, + link_filename, + shape_filename, + validate_schema=not fast, + **kwargs, + ) + + m_road_net = ModelRoadwayNetwork( + nodes_df, + links_df, + shapes_df, + parameters=parameters, + **kwargs, + ) + + if recalculate_calculated_variables: + m_road_net.create_calculated_variables() + if recalculate_distance: + m_road_net.calculate_distance(overwrite=True) + + m_road_net.fill_na() + # this method is making period values as string "NaN", need to revise. + m_road_net.split_properties_by_time_period_and_category() + for c in m_road_net.links_df.columns: + m_road_net.links_df[c] = m_road_net.links_df[c].replace("NaN", np.nan) + m_road_net.convert_int() + + return m_road_net
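A short usage sketch of the read() constructor above; the file paths are placeholders for your own network files, not files shipped with lasso:

# hypothetical paths; substitute your own link/node/shape files
net = ModelRoadwayNetwork.read(
    link_filename="network/link.json",
    node_filename="network/node.geojson",
    shape_filename="network/shape.geojson",
    fast=True,          # skip schema validation for speed
    parameters={},      # fall back to default lasso Parameters
)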
+ +
[docs] @staticmethod + def from_RoadwayNetwork( + roadway_network_object, + parameters: Union[dict, Parameters] = {}, + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + roadway_network_object (RoadwayNetwork). + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + Returns: + ModelRoadwayNetwork + """ + + additional_params_dict = { + k: v + for k, v in roadway_network_object.__dict__.items() + if k not in ["nodes_df", "links_df", "shapes_df", "parameters"] + } + + return ModelRoadwayNetwork( + roadway_network_object.nodes_df, + roadway_network_object.links_df, + roadway_network_object.shapes_df, + parameters=parameters, + **additional_params_dict, + )
+ +
[docs]    def split_properties_by_time_period_and_category(self, properties_to_split=None):
+        """
+        Splits properties by time period (and optionally category), creating one
+        column per time period/category combination.
+
+        Args:
+            properties_to_split: dict
+                dictionary of output variable prefix mapped to the source variable
+                and what to stratify it by, e.g.
+                {
+                    'lanes': {'v': 'lanes', 'time_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
+                    'ML_lanes': {'v': 'ML_lanes', 'time_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
+                    'use': {'v': 'use', 'time_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
+                }
+
+        """
+        import itertools
+
+        if properties_to_split is None:
+            properties_to_split = self.parameters.properties_to_split
+
+        for out_var, params in properties_to_split.items():
+            if params["v"] not in self.links_df.columns:
+                WranglerLogger.warning(
+                    "Specified variable to split: {} not in network variables: {}. Returning 0.".format(
+                        params["v"], str(self.links_df.columns)
+                    )
+                )
+                # source variable is missing: create zero-filled columns and move on
+                if params.get("time_periods") and params.get("categories"):
+                    for time_suffix, category_suffix in itertools.product(
+                        params["time_periods"], params["categories"]
+                    ):
+                        self.links_df[out_var + "_" + time_suffix + "_" + category_suffix] = 0
+                elif params.get("time_periods"):
+                    for time_suffix in params["time_periods"]:
+                        self.links_df[out_var + "_" + time_suffix] = 0
+                continue
+
+            if params.get("time_periods") and params.get("categories"):
+                for time_suffix, category_suffix in itertools.product(
+                    params["time_periods"], params["categories"]
+                ):
+                    self.links_df[
+                        out_var + "_" + category_suffix + "_" + time_suffix
+                    ] = self.get_property_by_time_period_and_group(
+                        params["v"],
+                        category=params["categories"][category_suffix],
+                        time_period=params["time_periods"][time_suffix],
+                    )
+            elif params.get("time_periods"):
+                for time_suffix in params["time_periods"]:
+                    self.links_df[
+                        out_var + "_" + time_suffix
+                    ] = self.get_property_by_time_period_and_group(
+                        params["v"],
+                        category=None,
+                        time_period=params["time_periods"][time_suffix],
+                    )
+            else:
+                raise ValueError(
+                    "Shouldn't have a category without a time period: {}".format(params)
+                )
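To make the splitting behavior above concrete, here is a sketch of one properties_to_split entry and the columns it produces; the time-period labels follow the docstring example and are assumptions, not required values:

# one entry: split 'lanes' into one column per time period
properties_to_split = {
    "lanes": {
        "v": "lanes",
        "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
    }
}
net.split_properties_by_time_period_and_category(properties_to_split)
# links_df now carries 'lanes_AM' and 'lanes_PM', filled from
# get_property_by_time_period_and_group() (or 0 if 'lanes' is missing)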
+ +
[docs] def create_calculated_variables(self): + """ + Creates calculated roadway variables. + + Args: + None + """ + WranglerLogger.info("Creating calculated roadway variables.") + + #MTC + self.create_ML_variable() + #/MTC + #MC + self.calculate_area_type() + self.calculate_county() + self.calculate_mpo() + self.add_counts() + self.create_ML_variable() + self.create_hov_corridor_variable() + self.create_managed_variable()
+ #/MC + +
[docs] def calculate_county( + self, + county_shape=None, + county_shape_variable=None, + network_variable="county", + county_codes_dict=None, + overwrite=False, + ): + """ + #MC + Calculates county variable. + + This uses the centroid of the geometry field to determine which county it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + county_shape (str): The File path to county geodatabase. + county_shape_variable (str): The variable name of county in county geodadabase. + network_variable (str): The variable name of county in network standard. Default to "county". + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing County Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "County Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + county_shape = county_shape if county_shape else self.parameters.county_shape + + county_shape_variable = ( + county_shape_variable + if county_shape_variable + else self.parameters.county_variable_shp + ) + + WranglerLogger.info( + "Adding roadway network variable for county using a spatial join with: {}".format( + county_shape + ) + ) + + county_codes_dict = ( + county_codes_dict if county_codes_dict else self.parameters.county_code_dict + ) + if not county_codes_dict: + msg = "No county codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + county_gdf = gpd.read_file(county_shape) + county_gdf = county_gdf.to_crs(epsg=self.crs) + joined_gdf = gpd.sjoin(centroids_gdf, county_gdf, how="left", op="intersects") + + joined_gdf[county_shape_variable] = ( + joined_gdf[county_shape_variable] + .map(county_codes_dict) + .fillna(10) + .astype(int) + ) + + self.links_df[network_variable] = joined_gdf[county_shape_variable] + + WranglerLogger.info( + "Finished Calculating county variable: {}".format(network_variable) + )
+ +
[docs] def calculate_area_type( + self, + area_type_shape=None, + area_type_shape_variable=None, + network_variable="area_type", + area_type_codes_dict=None, + downtown_area_type_shape=None, + downtown_area_type=None, + overwrite=False, + ): + """ + #MC + Calculates area type variable. + + This uses the centroid of the geometry field to determine which area it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + area_type_shape (str): The File path to area geodatabase. + area_type_shape_variable (str): The variable name of area type in area geodadabase. + network_variable (str): The variable name of area type in network standard. Default to "area_type". + area_type_codes_dict: The dictionary to map input area_type_shape_variable to network_variable + downtown_area_type_shape: The file path to the downtown area type boundary. + downtown_area_type (int): Integer value of downtown area type + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Area Type Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Area Type Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Area Type from Spatial Data and adding as roadway network variable: {}".format( + network_variable + ) + ) + + """ + Verify inputs + """ + + area_type_shape = ( + area_type_shape if area_type_shape else self.parameters.area_type_shape + ) + + if not area_type_shape: + msg = "No area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(area_type_shape): + msg = "File not found for area type shape: {}".format(area_type_shape) + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_shape_variable = ( + area_type_shape_variable + if area_type_shape_variable + else self.parameters.area_type_variable_shp + ) + + if not area_type_shape_variable: + msg = "No area type shape varible specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_codes_dict = ( + area_type_codes_dict + if area_type_codes_dict + else self.parameters.area_type_code_dict + ) + if not area_type_codes_dict: + msg = "No area type codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type_shape = ( + downtown_area_type_shape + if downtown_area_type_shape + else self.parameters.downtown_area_type_shape + ) + + if not downtown_area_type_shape: + msg = "No downtown area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(downtown_area_type_shape): + msg = "File not found for downtown area type shape: {}".format( + downtown_area_type_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type = ( + downtown_area_type + if downtown_area_type + else self.parameters.downtown_area_type + ) + if not downtown_area_type: + msg = "No downtown area type value specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + WranglerLogger.debug("Reading Area Type Shapefile {}".format(area_type_shape)) + area_type_gdf = gpd.read_file(area_type_shape) + area_type_gdf = area_type_gdf.to_crs(epsg=self.crs) 
+ + downtown_gdf = gpd.read_file(downtown_area_type_shape) + downtown_gdf = downtown_gdf.to_crs(epsg=self.crs) + + joined_gdf = gpd.sjoin( + centroids_gdf, area_type_gdf, how="left", op="intersects" + ) + + joined_gdf[area_type_shape_variable] = ( + joined_gdf[area_type_shape_variable] + .map(area_type_codes_dict) + .fillna(1) + .astype(int) + ) + + WranglerLogger.debug("Area Type Codes Used: {}".format(area_type_codes_dict)) + + d_joined_gdf = gpd.sjoin( + centroids_gdf, downtown_gdf, how="left", op="intersects" + ) + + d_joined_gdf["downtown_area_type"] = d_joined_gdf["Id"].fillna(-99).astype(int) + + joined_gdf.loc[ + d_joined_gdf["downtown_area_type"] == 0, area_type_shape_variable + ] = downtown_area_type + + WranglerLogger.debug( + "Downtown Area Type used boundary file: {}".format(downtown_area_type_shape) + ) + + self.links_df[network_variable] = joined_gdf[area_type_shape_variable] + + WranglerLogger.info( + "Finished Calculating Area Type from Spatial Data into variable: {}".format( + network_variable + ) + )
+ +
[docs] def calculate_mpo( + self, + county_network_variable="county", + network_variable="mpo", + as_integer=True, + mpo_counties=None, + overwrite=False, + ): + """ + Calculates mpo variable. + #MC + Args: + county_variable (str): Name of the variable where the county names are stored. Default to "county". + network_variable (str): Name of the variable that should be written to. Default to "mpo". + as_integer (bool): If true, will convert true/false to 1/0s. + mpo_counties (list): List of county names that are within mpo region. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating MPO as roadway network variable: {}".format(network_variable) + ) + """ + Verify inputs + """ + county_network_variable = ( + county_network_variable + if county_network_variable + else self.parameters.county_network_variable + ) + + if not county_network_variable: + msg = "No variable specified as containing 'county' in the network." + WranglerLogger.error(msg) + raise ValueError(msg) + if county_network_variable not in self.links_df.columns: + msg = "Specified county network variable: {} does not exist in network. Try running or debuging county calculation." + WranglerLogger.error(msg) + raise ValueError(msg) + + mpo_counties = mpo_counties if mpo_counties else self.parameters.mpo_counties + + if not mpo_counties: + msg = "No MPO Counties specified in method call or in parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("MPO Counties: {}".format(",".join(str(mpo_counties)))) + + """ + Start actual process + """ + + mpo = self.links_df[county_network_variable].isin(mpo_counties) + + if as_integer: + mpo = mpo.astype(int) + + self.links_df[network_variable] = mpo + + WranglerLogger.info( + "Finished calculating MPO variable: {}".format(network_variable) + )
+ +
[docs]    def add_variable_using_shst_reference(
+        self,
+        var_shst_csvdata=None,
+        shst_csv_variable=None,
+        network_variable=None,
+        network_var_type=int,
+        overwrite=False,
+    ):
+        """
+        Join network links with source data, via SHST API node match result.
+
+        Args:
+            var_shst_csvdata (str): File path to SHST API return.
+            shst_csv_variable (str): Variable name in the source data.
+            network_variable (str): Name of the variable that should be written to.
+            network_var_type : Variable type in the written network.
+            overwrite (bool): True if overwriting existing variable. Default to False.
+
+        Returns:
+            None
+
+        """
+        WranglerLogger.info(
+            "Adding Variable {} using Shared Streets Reference from {}".format(
+                network_variable, var_shst_csvdata
+            )
+        )
+
+        var_shst_df = pd.read_csv(var_shst_csvdata)
+
+        if "shstReferenceId" not in var_shst_df.columns:
+            msg = "'shstReferenceId' required but not found in {}".format(var_shst_csvdata)
+            WranglerLogger.error(msg)
+            raise ValueError(msg)
+
+        if shst_csv_variable not in var_shst_df.columns:
+            msg = "{} required but not found in {}".format(
+                shst_csv_variable, var_shst_csvdata
+            )
+            WranglerLogger.error(msg)
+            raise ValueError(msg)
+
+        join_gdf = pd.merge(
+            self.links_df,
+            var_shst_df[["shstReferenceId", shst_csv_variable]],
+            how="left",
+            on="shstReferenceId",
+        )
+
+        join_gdf[shst_csv_variable].fillna(0, inplace=True)
+
+        if network_variable in self.links_df.columns and not overwrite:
+            join_gdf.loc[join_gdf[network_variable] > 0, network_variable] = join_gdf[
+                shst_csv_variable
+            ].astype(network_var_type)
+        else:
+            join_gdf[network_variable] = join_gdf[shst_csv_variable].astype(
+                network_var_type
+            )
+
+        self.links_df[network_variable] = join_gdf[network_variable]
+
+        WranglerLogger.info(
+            "Added variable: {} using Shared Streets Reference".format(network_variable)
+        )
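A usage sketch of the method above, joining a hypothetical CSV of AADT values (keyed by shstReferenceId) onto the network links; the file name and column name are placeholders:

# 'aadt_shst_match.csv' is a hypothetical SHST API match result with
# columns 'shstReferenceId' and 'aadt'
net.add_variable_using_shst_reference(
    var_shst_csvdata="aadt_shst_match.csv",
    shst_csv_variable="aadt",
    network_variable="AADT",
    network_var_type=int,
    overwrite=True,
)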
+ +
[docs] def add_counts( + self, + network_variable="AADT", + mndot_count_shst_data=None, + widot_count_shst_data=None, + mndot_count_variable_shp=None, + widot_count_variable_shp=None, + ): + + """ + Adds count variable. + #MC + join the network with count node data, via SHST API node match result + + Args: + network_variable (str): Name of the variable that should be written to. Default to "AADT". + mndot_count_shst_data (str): File path to MNDOT count location SHST API node match result. + widot_count_shst_data (str): File path to WIDOT count location SHST API node match result. + mndot_count_variable_shp (str): File path to MNDOT count location geodatabase. + widot_count_variable_shp (str): File path to WIDOT count location geodatabase. + + Returns: + None + """ + + WranglerLogger.info("Adding Counts") + + """ + Verify inputs + """ + + mndot_count_shst_data = ( + mndot_count_shst_data + if mndot_count_shst_data + else self.parameters.mndot_count_shst_data + ) + widot_count_shst_data = ( + widot_count_shst_data + if widot_count_shst_data + else self.parameters.widot_count_shst_data + ) + mndot_count_variable_shp = ( + mndot_count_variable_shp + if mndot_count_variable_shp + else self.parameters.mndot_count_variable_shp + ) + widot_count_variable_shp = ( + widot_count_variable_shp + if widot_count_variable_shp + else self.parameters.widot_count_variable_shp + ) + + for varname, var in { + "mndot_count_shst_data": mndot_count_shst_data, + "widot_count_shst_data": widot_count_shst_data, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(var): + msg = "{}' not found at following location: {}.".format(varname, var) + WranglerLogger.error(msg) + raise ValueError(msg) + + for varname, var in { + "mndot_count_variable_shp": mndot_count_variable_shp, + "widot_count_variable_shp": widot_count_variable_shp, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + WranglerLogger.debug( + "Adding MNDOT Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + mndot_count_shst_data, mndot_count_variable_shp, network_variable + ) + ) + # Add Minnesota Counts + self.add_variable_using_shst_reference( + var_shst_csvdata=mndot_count_shst_data, + shst_csv_variable=mndot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=True, + ) + WranglerLogger.debug( + "Adding WiDot Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + widot_count_shst_data, widot_count_variable_shp, network_variable + ) + ) + # Add Wisconsin Counts, but don't overwrite Minnesota + self.add_variable_using_shst_reference( + var_shst_csvdata=widot_count_shst_data, + shst_csv_variable=widot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=False, + ) + + self.links_df["count_AM"] = self.links_df[network_variable] / 4 + self.links_df["count_MD"] = self.links_df[network_variable] / 4 + self.links_df["count_PM"] = self.links_df[network_variable] / 4 + self.links_df["count_NT"] = self.links_df[network_variable] / 4 + + self.links_df["count_daily"] = self.links_df[network_variable] + self.links_df["count_year"] = 2017 + + WranglerLogger.info( + "Finished adding counts variable: {}".format(network_variable) + )
+ +
[docs] @staticmethod + def read_match_result(path): + """ + Reads the shst geojson match returns. + + Returns shst dataframe. + + Reading lots of same type of file and concatenating them into a single DataFrame. + + Args: + path (str): File path to SHST match results. + + Returns: + geodataframe: geopandas geodataframe + + ##todo + not sure why we need, but should be in utilities not this class + """ + refId_gdf = DataFrame() + refid_file = glob.glob(path) + for i in refid_file: + new = gpd.read_file(i) + refId_gdf = pd.concat([refId_gdf, new], ignore_index=True, sort=False) + return refId_gdf
+ +
[docs] @staticmethod + def get_attribute( + links_df, + join_key, # either "shstReferenceId", or "shstGeometryId", tests showed the latter gave better coverage + source_shst_ref_df, # source shst refId + source_gdf, # source dataframe + field_name, # , # targetted attribute from source + ): + """ + Gets attribute from source data using SHST match result. + + Args: + links_df (dataframe): The network dataframe that new attribute should be written to. + join_key (str): SHST ID variable name used to join source data with network dataframe. + source_shst_ref_df (str): File path to source data SHST match result. + source_gdf (str): File path to source data. + field_name (str): Name of the attribute to get from source data. + + Returns: + None + """ + # join based on shared streets geometry ID + # pp_link_id is shared streets match return + # source_ink_id is mrcc + WranglerLogger.debug( + "source ShSt rename_variables_for_dbf columns\n{}".format( + source_shst_ref_df.columns + ) + ) + WranglerLogger.debug("source gdf columns\n{}".format(source_gdf.columns)) + # end up with OSM network with the MRCC Link ID + # could also do with route_sys...would that be quicker? + join_refId_df = pd.merge( + links_df, + source_shst_ref_df[[join_key, "pp_link_id", "score"]].rename( + columns={"pp_link_id": "source_link_id", "score": "source_score"} + ), + how="left", + on=join_key, + ) + + # joined with MRCC dataframe to get route_sys + + join_refId_df = pd.merge( + join_refId_df, + source_gdf[["LINK_ID", field_name]].rename( + columns={"LINK_ID": "source_link_id"} + ), + how="left", + on="source_link_id", + ) + + # drop duplicated records with same field value + + join_refId_df.drop_duplicates( + subset=["model_link_id", "shstReferenceId", field_name], inplace=True + ) + + # more than one match, take the best score + + join_refId_df.sort_values( + by=["model_link_id", "source_score"], + ascending=True, + na_position="first", + inplace=True, + ) + + join_refId_df.drop_duplicates( + subset=["model_link_id"], keep="last", inplace=True + ) + + # self.links_df[field_name] = join_refId_df[field_name] + + return join_refId_df[links_df.columns.tolist() + [field_name, "source_link_id"]]
+ +
[docs] def calculate_use( + self, + network_variable="use", + as_integer=True, + overwrite=False, + ): + """ + Calculates use variable. + + Args: + network_variable (str): Variable that should be written to in the network. Default to "use" + as_integer (bool): If True, will convert true/false to 1/0s. Defauly to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "'use' Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating hov and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + #MTC + self.links_df[network_variable] = int(1) + #/MTC + + self.links_df[network_variable] = 0 + + self.links_df.loc[ + (self.links_df["assign_group"] == 8) | (self.links_df["access"] == "hov"), + network_variable, + ] = 100 + #/MC + + + if as_integer: + self.links_df[network_variable] = self.links_df[network_variable].astype( + int + ) + WranglerLogger.info( + "Finished calculating hov variable: {}".format(network_variable) + )
+ +
[docs] def create_ML_variable( + self, + network_variable="ML_lanes", + overwrite=False, + ): + """ + Created ML lanes placeholder for project to write out ML changes + + ML lanes default to 0, ML info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing ML Variable '{}' already in network".format( + network_variable + ) + ) + self.links_df[network_variable] = int(0) + else: + WranglerLogger.info( + "ML Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + WranglerLogger.info( + "Finished creating ML lanes variable: {}".format(network_variable) + )
+ +
[docs] def create_hov_corridor_variable( + self, + network_variable="segment_id", + overwrite=False, + ): + """ + Created hov corridor placeholder for project to write out corridor changes + + hov corridor id default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov corridor Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Hov corridor Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating hov corridor variable: {}".format(network_variable) + )
+ +
[docs] def create_managed_variable( + self, + network_variable="managed", + overwrite=False, + ): + """ + Created placeholder for project to write out managed + + managed default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing managed Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Managed Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating managed variable: {}".format(network_variable) + )
+ +
[docs] def calculate_centroidconnect( + self, + parameters, + network_variable="centroidconnect", + highest_taz_number=None, + as_integer=True, + overwrite=False, + ): + """ + Calculates centroid connector variable. + + Args: + parameters (Parameters): A Lasso Parameters, which stores input files. + network_variable (str): Variable that should be written to in the network. Default to "centroidconnect" + highest_taz_number (int): the max TAZ number in the network. + as_integer (bool): If True, will convert true/false to 1/0s. Default to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + Returns: + RoadwayNetwork + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Centroid Connector Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Centroid Connector Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Centroid Connector and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + highest_taz_number = ( + highest_taz_number if highest_taz_number else parameters.highest_taz_number + ) + + if not highest_taz_number: + msg = "No highest_TAZ number specified in method variable or in parameters" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug( + "Calculating Centroid Connectors using highest TAZ number: {}".format( + highest_taz_number + ) + ) + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + self.links_df[network_variable] = False + + self.links_df.loc[ + (self.links_df["A"] <= highest_taz_number) + | (self.links_df["B"] <= highest_taz_number), + network_variable, + ] = True + + if as_integer: + self.links_df[network_variable] = self.links_df[ + network_variable + ].astype(int) + WranglerLogger.info( + "Finished calculating centroid connector variable: {}".format(network_variable) + )
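The flagging rule above reduces to a simple vectorized comparison; a minimal standalone sketch with made-up node numbers, assuming TAZ centroids occupy the lowest node IDs:

import pandas as pd

links = pd.DataFrame({"A": [10, 3100, 9000], "B": [5000, 25, 9001]})
highest_taz_number = 3100  # illustrative value only
links["centroidconnect"] = (
    (links["A"] <= highest_taz_number) | (links["B"] <= highest_taz_number)
).astype(int)
# -> [1, 1, 0]: any link touching a node numbered at or below the
#    highest TAZ number is treated as a centroid connector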
+ + +
[docs] def calculate_distance( + self, network_variable="distance", centroidconnect_only=False, overwrite=False + ): + """ + calculate link distance in miles + + Args: + centroidconnect_only (Bool): True if calculating distance for centroidconnectors only. Default to False. + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing distance Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Distance Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + #MC + if ("centroidconnect" not in self.links_df) & ("taz" not in self.links_df.roadway.unique()): + if centroidconnect_only: + msg = "No variable specified for centroid connector, calculating centroidconnect first" + WranglerLogger.error(msg) + raise ValueError(msg) + #/MC + + """ + Start actual process + """ + + temp_links_gdf = self.links_df.copy() + temp_links_gdf.crs = "EPSG:4326" + temp_links_gdf = temp_links_gdf.to_crs(epsg=26915) + + #MTC + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MTC + #MC + if centroidconnect_only: + WranglerLogger.info( + "Calculating {} for centroid connectors".format(network_variable) + ) + temp_links_gdf[network_variable] = np.where( + temp_links_gdf.centroidconnect == 1, + temp_links_gdf.geometry.length / 1609.34, + temp_links_gdf[network_variable], + ) + else: + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MC + + self.links_df[network_variable] = temp_links_gdf[network_variable]
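The distance calculation above reprojects to a meter-based CRS (EPSG:26915, NAD83 / UTM zone 15N) and converts meters to miles; a minimal geopandas sketch of that conversion with a made-up two-point line:

import geopandas as gpd
from shapely.geometry import LineString

# one illustrative link in WGS84 (lon/lat)
gdf = gpd.GeoDataFrame(
    {"model_link_id": [1]},
    geometry=[LineString([(-93.27, 44.97), (-93.26, 44.97)])],
    crs="EPSG:4326",
)
gdf = gdf.to_crs(epsg=26915)                     # project to meters
gdf["distance"] = gdf.geometry.length / 1609.34  # meters -> miles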
+ +
[docs]    def convert_int(self, int_col_names=[]):
+        """
+        Convert integer columns to integer type, replacing NaN and blanks with zero.
+        """
+
+        #MTC
+        WranglerLogger.info(
+            "Converting variable type to mtc standard"
+        )
+
+        if not int_col_names:
+            int_col_names = self.parameters.int_col
+        #/MTC
+        #MC
+        """
+        WranglerLogger.info("Converting variable type to MetCouncil standard")
+
+        if not int_col_names:
+            int_col_names = self.parameters.int_col
+        #/MC
+        """
+        ##Why are we doing this?
+        # int_col_names.remove("lanes")
+
+        for c in list(set(self.links_df.columns) & set(int_col_names)):
+            # replace NaN and blanks with zero before casting to integer
+            self.links_df[c] = self.links_df[c].replace(np.nan, 0)
+            self.links_df[c] = self.links_df[c].replace("", 0)
+            try:
+                self.links_df[c] = self.links_df[c].astype(int)
+            except ValueError:
+                # values such as "1.0" need to go through float first
+                try:
+                    self.links_df[c] = self.links_df[c].astype(float)
+                    self.links_df[c] = self.links_df[c].astype(int)
+                except (ValueError, TypeError):
+                    msg = f"Could not convert column {c} to integer."
+                    WranglerLogger.error(msg)
+                    raise ValueError(msg)
+
+        for c in list(set(self.nodes_df.columns) & set(int_col_names)):
+            self.nodes_df[c] = self.nodes_df[c].replace("", 0)
+            self.nodes_df[c] = self.nodes_df[c].astype(int)
+ +
[docs] def fill_na(self): + """ + Fill na values from create_managed_lane_network() + """ + + WranglerLogger.info("Filling nan for network from network wrangler") + + num_col = self.parameters.int_col + self.parameters.float_col + + for x in list(self.links_df.columns): + if x in num_col: + self.links_df[x].fillna(0, inplace=True) + self.links_df[x] = self.links_df[x].apply( + lambda k: 0 if k in [np.nan, "", float("nan"), "NaN"] else k + ) + + else: + self.links_df[x].fillna("", inplace=True) + + for x in list(self.nodes_df.columns): + if x in num_col: + self.nodes_df[x].fillna(0, inplace=True) + else: + self.nodes_df[x].fillna("", inplace=True)
+ + +
[docs] def roadway_standard_to_met_council_network(self, output_epsg=None): + """ + Rename and format roadway attributes to be consistent with what metcouncil's model is expecting. + #MC + Args: + output_epsg (int): epsg number of output network. + + Returns: + None + """ + + WranglerLogger.info( + "Renaming roadway attributes to be consistent with what metcouncil's model is expecting" + ) + + """ + Verify inputs + """ + + output_epsg = output_epsg if output_epsg else self.parameters.output_epsg + + """ + Start actual process + """ + if "managed" in self.links_df.columns: + WranglerLogger.info("Creating managed lane network.") + self.create_managed_lane_network(in_place=True) + + # when ML and assign_group projects are applied together, assign_group is filled as "" by wrangler for ML links + for c in ModelRoadwayNetwork.CALCULATED_VALUES: + if c in self.links_df.columns and c in self.parameters.int_col: + self.links_df[c] = self.links_df[c].replace("", 0) + else: + WranglerLogger.info("Didn't detect managed lanes in network.") + + self.calculate_centroidconnect(self.parameters) + self.create_calculated_variables() + self.calculate_distance(overwrite=True) + + self.fill_na() + # no method to calculate price yet, will be hard coded in project card + WranglerLogger.info("Splitting variables by time period and category") + self.split_properties_by_time_period_and_category() + self.convert_int() + + self.links_metcouncil_df = self.links_df.copy() + self.nodes_metcouncil_df = self.nodes_df.copy() + + self.links_metcouncil_df = pd.merge( + self.links_metcouncil_df.drop( + "geometry", axis=1 + ), # drop the stick geometry in links_df + self.shapes_df[["shape_id", "geometry"]], + how="left", + on="shape_id", + ) + + self.links_metcouncil_df.crs = "EPSG:4326" + self.nodes_metcouncil_df.crs = "EPSG:4326" + WranglerLogger.info("Setting Coordinate Reference System to EPSG 26915") + self.links_metcouncil_df = self.links_metcouncil_df.to_crs(epsg=26915) + self.nodes_metcouncil_df = self.nodes_metcouncil_df.to_crs(epsg=26915) + + self.nodes_metcouncil_df["X"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.x + ) + self.nodes_metcouncil_df["Y"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.y + ) + + # CUBE expect node id to be N + self.nodes_metcouncil_df.rename(columns={"model_node_id": "N"}, inplace=True)
+ +
[docs] def rename_variables_for_dbf( + self, + input_df, + variable_crosswalk: str = None, + output_variables: list = None, + convert_geometry_to_xy=False, + ): + """ + Rename attributes for DBF/SHP, make sure length within 10 chars. + + Args: + input_df (dataframe): Network standard DataFrame. + variable_crosswalk (str): File path to variable name crosswalk from network standard to DBF names. + output_variables (list): List of strings for DBF variables. + convert_geometry_to_xy (bool): True if converting node geometry to X/Y + + Returns: + dataframe + + """ + WranglerLogger.info("Renaming variables so that they are DBF-safe") + + """ + Verify inputs + """ + + variable_crosswalk = ( + variable_crosswalk + if variable_crosswalk + else self.parameters.net_to_dbf_crosswalk + ) + + output_variables = ( + output_variables if output_variables else self.parameters.output_variables + ) + + """ + Start actual process + """ + + crosswalk_df = pd.read_csv(variable_crosswalk) + WranglerLogger.debug( + "Variable crosswalk: {} \n {}".format(variable_crosswalk, crosswalk_df) + ) + net_to_dbf_dict = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"])) + + dbf_name_list = [] + + dbf_df = copy.deepcopy(input_df) + + # only write out variables that we specify + # if variable is specified in the crosswalk, rename it to that variable + for c in dbf_df.columns: + if c in output_variables: + try: + dbf_df.rename(columns={c: net_to_dbf_dict[c]}, inplace=True) + dbf_name_list += [net_to_dbf_dict[c]] + except: + dbf_name_list += [c] + + if "geometry" in dbf_df.columns: + if str(dbf_df["geometry"].iloc[0].geom_type) == "Point": + dbf_df["X"] = dbf_df.geometry.apply(lambda g: g.x) + dbf_df["Y"] = dbf_df.geometry.apply(lambda g: g.y) + dbf_name_list += ["X", "Y"] + + WranglerLogger.debug("DBF Variables: {}".format(",".join(dbf_name_list))) + + return dbf_df[dbf_name_list]
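The crosswalk read above is a two-column CSV mapping network-standard names to DBF-safe (10 characters or fewer) names; an illustrative crosswalk and call follow. The file name, crosswalk rows, and output variables shown here are examples, not the crosswalk shipped with lasso:

# net_to_dbf.csv (hypothetical contents):
# net,dbf
# model_link_id,link_id
# shstGeometryId,shstGeomId
# drive_access,drive
links_dbf_df = net.rename_variables_for_dbf(
    net.links_df,
    variable_crosswalk="net_to_dbf.csv",
    output_variables=["model_link_id", "shstGeometryId", "drive_access", "geometry"],
)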
+ +
[docs] def write_roadway_as_shp( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + data_to_csv: bool = True, + data_to_dbf: bool = False, + output_link_shp: str = None, + output_node_shp: str = None, + output_link_csv: str = None, + output_node_csv: str = None, + output_gpkg: str = None, + output_link_gpkg_layer: str = None, + output_node_gpkg_layer: str = None, + output_gpkg_link_filter: str = None + ): + """ + Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names. + + Args: + output_dir (str): File path to directory + node_output_variables (list): List of strings for node output variables. + link_output_variables (list): List of strings for link output variables. + data_to_csv (bool): True if write network in csv format. + data_to_dbf (bool): True if write network in dbf/shp format. + output_link_shp (str): File name to output link dbf/shp. + output_node_shp (str): File name of output node dbf/shp. + output_link_csv (str): File name to output link csv. + output_node_csv (str): File name to output node csv. + output_gpkg (str): File name to output GeoPackage. + output_link_gpkg_layer (str): Layer name within output_gpkg to output links. + output_node_gpkg_layer (str): Layer name within output_gpkg to output links. + output_gpkg_link_filter (str): Optional column name to additional output link subset layers + + Returns: + None + """ + + WranglerLogger.info("Writing Network as Shapefile") + WranglerLogger.debug( + "Output Variables: \n - {}".format( + "\n - ".join(self.parameters.output_variables) + ) + ) + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_met_council_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + # unless specified that all the data goes to the DBF, only output A and B + dbf_link_output_variables = ( + #MTC + link_output_variables if link_output_variables else ["A", "B", "geometry"] + #MC + #link_output_variables if data_to_dbf else ["A", "B", "shape_id", "geometry"] + ) + + # Removing code to set this to versions from parameters + # User can use these as arg + + """ + Start Process + """ + # rename these to short only for shapefile option + if output_node_shp: + WranglerLogger.info("Renaming DBF Node Variables") + nodes_dbf_df = self.rename_variables_for_dbf(self.nodes_mtc_df, output_variables=node_output_variables) + else: + WranglerLogger.debug("nodes_mtc_df columns: {}".format(list(self.nodes_mtc_df.columns))) + nodes_dbf_df = self.nodes_mtc_df[node_output_variables] + + if output_link_shp: + WranglerLogger.info("Renaming DBF Link Variables") + links_dbf_df = self.rename_variables_for_dbf(self.links_mtc_df, output_variables=dbf_link_output_variables) + else: + WranglerLogger.debug("links_mtc_df columns: {}".format(list(self.links_mtc_df.columns))) + links_dbf_df = self.links_mtc_df[dbf_link_output_variables] + + links_dbf_df = gpd.GeoDataFrame(links_dbf_df, 
geometry=links_dbf_df["geometry"]) + + # temp debug + WranglerLogger.debug("links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))]:\n{}".format( + links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))] + )) + + if output_node_shp: + WranglerLogger.info("Writing Node Shapes: {}".format(os.path.join(output_dir, output_node_shp))) + nodes_dbf_df.to_file(os.path.join(output_dir, output_node_shp)) + + if output_gpkg and output_node_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Node Layer {}".format(os.path.join(output_dir, output_gpkg), output_node_gpkg_layer)) + nodes_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_node_gpkg_layer, driver="GPKG") + + if output_link_shp: + WranglerLogger.info("Writing Link Shapes: {}".format(os.path.join(output_dir, output_link_shp))) + links_dbf_df.to_file(os.path.join(output_dir, output_link_shp)) + + # debug test + link_schema = { + "properties": { + "A" : "int:8", + "B" : "int:8", + "model_link_id" : "int:10", + "shstGeometryId": "str:32", + "name" : "str:84", + "ft" : "int:2", + "assignable" : "int:18", + "cntype" : "str:80", + "distance" : "float", + "county" : "str:15", + "bike_access" : "int:2", + "drive_access" : "int:2", + "walk_access" : "int:2", + "rail_only" : "int:2", + "bus_only" : "int:2", + "transit" : "int:2", + "managed" : "int:2", + "tollbooth" : "int:2", + "tollseg" : "int:2", + "segment_id" : "int:4", + "lanes_EA" : "int:2", + "lanes_AM" : "int:2", + "lanes_MD" : "int:2", + "lanes_PM" : "int:2", + "lanes_EV" : "int:2", + "useclass_EA" : "int:2", + "useclass_AM" : "int:2", + "useclass_MD" : "int:2", + "useclass_PM" : "int:2", + "useclass_EV" : "int:2", + "nmt2010" : "int:2", + "nmt2020" : "int:2", + }, + "geometry": "LineString" + } + if output_gpkg and output_link_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Link Layer {}".format(os.path.join(output_dir, output_gpkg), output_link_gpkg_layer)) + links_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_link_gpkg_layer, schema=link_schema, driver="GPKG") + + # output additional link layers if filter column is specified + # e.g. if county-subsets are output + if output_gpkg_link_filter: + link_value_counts = links_dbf_df[output_gpkg_link_filter].value_counts() + for filter_val,filter_count in link_value_counts.items(): + gpkg_layer_name = "{}_{}".format(output_link_gpkg_layer, filter_val) + gpkg_layer_name = gpkg_layer_name.replace(" ","_") + WranglerLogger.info("Writing GeoPackage {} with Link Layer {} for {} rows".format( + os.path.join(output_dir, output_gpkg), gpkg_layer_name, filter_count)) + links_dbf_df.loc[ links_dbf_df[output_gpkg_link_filter]==filter_val ].to_file( + os.path.join(output_dir, output_gpkg), layer=gpkg_layer_name, schema=link_schema, driver="GPKG") + + + + + if data_to_csv: + WranglerLogger.info( + "Writing Network Data to CSVs:\n - {}\n - {}".format( + output_link_csv, output_node_csv + ) + ) + self.links_mtc_df[link_output_variables].to_csv( + output_link_csv, index=False + ) + self.nodes_mtc_df[node_output_variables].to_csv( + output_node_csv, index=False + )
+ + + # this should be moved to util +
[docs]    @staticmethod
+    def dataframe_to_fixed_width(df):
+        """
+        Convert dataframe to fixed width format; the geometry column is not transformed.
+
+        Args:
+            df (pandas DataFrame).
+
+        Returns:
+            pandas dataframe: dataframe with fixed width for each column.
+            dict: dictionary with column names as keys, column width as values.
+        """
+        WranglerLogger.info("Starting fixed width conversion")
+
+        # get the max length for each variable column
+        max_width_dict = dict(
+            [
+                (v, df[v].apply(lambda r: len(str(r)) if r is not None else 0).max())
+                for v in df.columns.values
+                if v != "geometry"
+            ]
+        )
+
+        fw_df = df.drop("geometry", axis=1).copy()
+        for c in fw_df.columns:
+            # left-pad every value with spaces up to the column's max width
+            fw_df[c] = fw_df[c].apply(lambda x: str(x))
+            fw_df[c] = fw_df[c].apply(lambda x: " " * (max_width_dict[c] - len(x)) + x)
+
+        return fw_df, max_width_dict
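A small worked example of the fixed-width helper above on a throwaway DataFrame; all values are illustrative:

import pandas as pd
from shapely.geometry import Point

df = pd.DataFrame({
    "N": [1, 250, 13],
    "county": ["Hennepin", "Ramsey", "Anoka"],
    "geometry": [Point(0, 0), Point(1, 1), Point(2, 2)],
})
fw_df, widths = ModelRoadwayNetwork.dataframe_to_fixed_width(df)
# widths == {"N": 3, "county": 8}; each value is left-padded with
# spaces to that width, e.g. "  1", "250", " 13"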
+ +
[docs] def write_roadway_as_fixedwidth( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + output_link_txt: str = None, + output_node_txt: str = None, + output_link_header_width_txt: str = None, + output_node_header_width_txt: str = None, + output_cube_network_script: str = None, + drive_only: bool = False, + ): + """ + Writes out fixed width file. + + This function does: + 1. write out link and node fixed width data files for cube. + 2. write out header and width correspondence. + 3. write out cube network building script with header and width specification. + + Args: + output_dir (str): File path to where links, nodes and script will be written and run + node_output_variables (list): list of node variable names. + link_output_variables (list): list of link variable names. + output_link_txt (str): File name of output link database (within output_dir) + output_node_txt (str): File name of output node database (within output_dir) + output_link_header_width_txt (str): File name of link column width records (within output_dir) + output_node_header_width_txt (str): File name of node column width records (within output_dir) + output_cube_network_script (str): File name of CUBE network building script (within output_dir) + drive_only (bool): If True, only writes drive nodes and links + + Returns: + None + + """ + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_mtc_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + output_link_txt = ( + output_link_txt if output_link_txt else self.parameters.output_link_txt + ) + + output_node_txt = ( + output_node_txt if output_node_txt else self.parameters.output_node_txt + ) + + output_link_header_width_txt = ( + output_link_header_width_txt + if output_link_header_width_txt + else self.parameters.output_link_header_width_txt + ) + + output_node_header_width_txt = ( + output_node_header_width_txt + if output_node_header_width_txt + else self.parameters.output_node_header_width_txt + ) + + output_cube_network_script = ( + output_cube_network_script + if output_cube_network_script + else self.parameters.output_cube_network_script + ) + + """ + Start Process + """ + #MTC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_mtc_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df['drive_access'] == 1] + #/MTC + """ + #MC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_metcouncil_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df["drive_access"] == 1] + #/MC + """ + WranglerLogger.info("Writing out link database") + + link_ff_df.to_csv(os.path.join(output_dir, output_link_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out link header and width ----") + link_max_width_df = DataFrame( + 
list(link_max_width_dict.items()), columns=["header", "width"] + ) + link_max_width_df.to_csv(os.path.join(output_dir, output_link_header_width_txt), index=False) + + #MTC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_mtc_df[node_output_variables] + ) + #/MTC + """ + #MC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_metcouncil_df[node_output_variables] + ) + #/MC + """ + WranglerLogger.info("Writing out node database") + + if drive_only: + node_ff_df = node_ff_df.loc[node_ff_df["drive_node"] == 1] + + + node_ff_df.to_csv(os.path.join(output_dir, output_node_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out node header and width") + node_max_width_df = DataFrame( + list(node_max_width_dict.items()), columns=["header", "width"] + ) + node_max_width_df.to_csv(os.path.join(output_dir, output_node_header_width_txt), index=False) + + # write out cube script + s = 'RUN PGM = NETWORK MSG = "Read in network from fixed width file" \n' + s += 'FILEI LINKI[1] = "{}",'.format(output_link_txt) + start_pos = 1 + for i in range(len(link_max_width_df)): + s += " VAR=" + link_max_width_df.header.iloc[i] + + if ( + self.links_mtc_df.dtypes.loc[link_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(link_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(link_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += link_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += "\n" + s += 'FILEI NODEI[1] = "{}",'.format(output_node_txt) + start_pos = 1 + for i in range(len(node_max_width_df)): + s += " VAR=" + node_max_width_df.header.iloc[i] + + if ( + self.nodes_mtc_df.dtypes.loc[node_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(node_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(node_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += node_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += '\n' + s += 'FILEO NETO = "complete_network.net"\n\n' + s += ' ZONES = {}\n\n'.format(self.parameters.zones) + s += '; Trim leading whitespace from string variables\n' + # todo: The below should be built above based on columns that are strings + s += ' phase=NODEMERGE\n' + s += ' county = LTRIM(county)\n' + s += ' endphase\n' + s += ' phase=LINKMERGE\n' + s += ' name = LTRIM(name)\n' + s += ' county = LTRIM(county)\n' + s += ' cntype = LTRIM(cntype)\n' + s += ' endphase\n' + s += '\nENDRUN\n' + + with open(os.path.join(output_dir, output_cube_network_script), "w") as f: + f.write(s) + + # run the cube script to create the cube network + import subprocess + env = copy.copy(os.environ) + cube_cmd = '"C:\\Program Files\\Citilabs\\CubeVoyager\\runtpp.exe" {}'.format(output_cube_network_script) + try: + WranglerLogger.info("Running [{}] in cwd [{}]".format(cube_cmd, output_dir)) + ret = subprocess.run(cube_cmd, cwd=output_dir, capture_output=True, check=True) + + WranglerLogger.info("return code: {}".format(ret.returncode)) + + for line in ret.stdout.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stdout: {}".format(line)) + + for line in ret.stderr.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stderr: {}".format(line)) + + except Exception as e: + WranglerLogger.error(e)
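To make the generated Cube script above easier to follow, here is a small, runnable sketch of how one FILEI record layout is assembled from the header/width table; the column names, widths, and file name are made up for illustration::

    import pandas as pd

    # hypothetical header/width records, like those written to output_link_header_width_txt
    link_max_width_df = pd.DataFrame(
        {"header": ["model_link_id", "name", "drive_access"], "width": [6, 20, 1]}
    )
    string_cols = {"name"}  # string columns get a (C<width>) type in the Cube script

    s = 'FILEI LINKI[1] = "links.txt",'
    start_pos = 1
    for _, row in link_max_width_df.iterrows():
        s += " VAR=" + row.header
        if row.header in string_cols:
            s += "(C{})".format(row.width)
        s += ", BEG={}, LEN={},".format(start_pos, row.width)
        start_pos += row.width + 1  # +1 for the ';' delimiter between fixed-width fields
    print(s[:-1])  # trim the trailing comma, as the method does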
+
+ +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_modules/lasso/transit/index.html b/branch/bicounty/_modules/lasso/transit/index.html new file mode 100644 index 0000000..3e4a10c --- /dev/null +++ b/branch/bicounty/_modules/lasso/transit/index.html @@ -0,0 +1,1846 @@ + + + + + + lasso.transit — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.transit

+"""Transit-related classes to parse, compare, and write standard and cube transit files.
+
+  Typical usage example:
+
+    tn = CubeTransit.create_from_cube(CUBE_DIR)
+    transit_change_list = tn.evaluate_differences(base_transit_network)
+
+    cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+    cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+"""
+import os
+import copy
+import csv
+import datetime, time
+from typing import Any, Dict, Optional, Union
+
+from lark import Lark, Transformer, v_args
+from pandas import DataFrame
+
+import pandas as pd
+import partridge as ptg
+import numpy as np
+
+from network_wrangler import TransitNetwork
+
+from .logger import WranglerLogger
+from .parameters import Parameters
+
+
[docs]class CubeTransit(object): + """Class for storing information about transit defined in Cube line + files. + + Has the capability to: + + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + + .. highlight:: python + + Typical usage example: + :: + tn = CubeTransit.create_from_cube(CUBE_DIR) + transit_change_list = tn.evaluate_differences(base_transit_network) + + Attributes: + lines (list): list of strings representing unique line names in + the cube network. + line_properties (dict): dictionary of line properties keyed by line name. Property + values are stored in a dictionary by property name. These + properties are directly read from the cube line files and haven't + been translated to standard transit values. + shapes (dict): dictionary of shapes + keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns: + - 'node_id' (int): positive integer of node id + - 'node' (int): node number, with negative indicating a non-stop + - 'stop' (boolean): indicates if it is a stop + - 'order' (int): order within this shape + program_type (str): Either PT or TRNBLD + parameters (Parameters): + Parameters instance that will be applied to this instance which + includes information about time periods and variables. + source_list (list): + List of cube line file sources that have been read and added. + diff_dict (dict): + """ + +
[docs] def __init__(self, parameters: Union[Parameters, dict] = {}): + """ + Constructor for CubeTransit + + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + WranglerLogger.debug("Creating a new Cube Transit instance") + + self.lines = [] + + self.line_properties = {} + self.shapes = {} + + self.program_type = None + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.source_list = [] + + self.diff_dict = Dict[str, Any]
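A hedged usage sketch: the constructor accepts either a plain settings dictionary (forwarded to Parameters(**dict)) or an existing Parameters instance; the lasso_base_dir keyword shown here is an assumed example setting, not a required one::

    from lasso import CubeTransit, Parameters  # assumes lasso exposes both at the package level

    tn_from_dict = CubeTransit(parameters={"lasso_base_dir": "/path/to/lasso"})   # assumed setting name
    tn_from_obj = CubeTransit(parameters=Parameters(lasso_base_dir="/path/to/lasso"))
    # any other type (e.g. a bare string) raises ValueError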
+ +
[docs] def add_cube(self, transit_source: str): + """Reads a .lin file and adds it to existing TransitNetwork instance. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + """ + + """ + Figure out what kind of transit source it is + """ + + parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, debug="debug", parser="lalr") + + if "NAME=" in transit_source: + WranglerLogger.debug("reading transit source as string") + self.source_list.append("input_str") + parse_tree = parser.parse(transit_source) + elif os.path.isfile(transit_source): + print("reading: {}".format(transit_source)) + with open(transit_source) as file: + WranglerLogger.debug( + "reading transit source: {}".format(transit_source) + ) + self.source_list.append(transit_source) + parse_tree = parser.parse(file.read()) + elif os.path.isdir(transit_source): + import glob + + for lin_file in glob.glob(os.path.join(transit_source, "*.LIN")): + self.add_cube(lin_file) + return + else: + msg = "{} not a valid transit line string, directory, or file" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("finished parsing cube line file") + # WranglerLogger.debug("--Parse Tree--\n {}".format(parse_tree.pretty())) + transformed_tree_data = CubeTransformer().transform(parse_tree) + # WranglerLogger.debug("--Transformed Parse Tree--\n {}".format(transformed_tree_data)) + + _line_data = transformed_tree_data["lines"] + + line_properties_dict = {k: v["line_properties"] for k, v in _line_data.items()} + line_shapes_dict = {k: v["line_shape"] for k, v in _line_data.items()} + new_lines = list(line_properties_dict.keys()) + """ + Before adding lines, check to see if any are overlapping with existing ones in the network + """ + + overlapping_lines = set(new_lines) & set(self.lines) + if overlapping_lines: + msg = "Overlapping lines found when adding from {}. \nSource files:\n{}\n{} Overlapping Lines of {} total new lines.\n-->{}".format( + transit_source, + "\n - ".join(self.source_list), + len(new_lines), + len(overlapping_lines), + overlapping_lines, + ) + print(msg) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.program_type = transformed_tree_data.get("program_type", None) + + self.lines += new_lines + self.line_properties.update(line_properties_dict) + self.shapes.update(line_shapes_dict) + + WranglerLogger.debug("Added lines to CubeTransit: \n".format(new_lines))
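A hedged sketch of the three accepted inputs, assuming lasso is importable and that the simple PT-style line below is accepted by the bundled TRANSIT_LINE_FILE_GRAMMAR (the line content is made up)::

    from lasso import CubeTransit

    lin_str = (
        ';;<<PT>><<LINE>>;;\n'
        'LINE NAME="0_452-111_452_pk1", MODE=5, HEADWAY[1]=10, ONEWAY=T,\n'
        ' N=38654, -38655, 38656\n'
    )

    tn = CubeTransit()
    tn.add_cube(lin_str)              # a string containing "NAME=" is parsed directly
    # tn.add_cube("route_452.LIN")    # a file path is read and parsed (hypothetical file)
    # tn.add_cube("cube_lin_dir/")    # a directory is globbed for *.LIN files (hypothetical dir)
    print(tn.lines, tn.program_type)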
+ +
[docs] @staticmethod + def create_from_cube(transit_source: str, parameters: Optional[dict] = {}): + """ + Reads a cube .lin file and stores as TransitNetwork object. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + Returns: + A ::CubeTransit object created from the transit_source. + """ + + tn = CubeTransit(parameters) + tn.add_cube(transit_source) + + return tn
+ +
[docs] def evaluate_differences(self, base_transit): + """ + 1. Identifies what routes need to be updated, deleted, or added + 2. For routes being added or updated, identify if the time periods + have changed or if there are multiples, and make duplicate lines if so + 3. Create project card dictionaries for each change. + + Args: + base_transit (CubeTransit): an instance of this class for the base condition + + Returns: + A list of dictionaries containing project card changes + required to evaluate the differences between the base network + and this transit network instance. + """ + transit_change_list = [] + + """ + Identify what needs to be evaluated + """ + lines_to_update = [l for l in self.lines if l in base_transit.lines] + lines_to_delete = [l for l in base_transit.lines if l not in self.lines] + lines_to_add = [l for l in self.lines if l not in base_transit.lines] + + project_card_changes = [] + + """ + Evaluate Property Updates + """ + + for line in lines_to_update: + WranglerLogger.debug( + "Finding differences in time periods for: {}".format(line) + ) + + """ + Find any additional time periods that might need to add or delete. + """ + base_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + base_transit.line_properties[line] + ) + ) + + try: + assert len(base_cube_time_period_numbers) == 1 + except: + msg = "Base network line {} should only have one time period per route, but {} found".format( + line, base_cube_time_period_numbers + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + base_cube_time_period_number = base_cube_time_period_numbers[0] + + build_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + + time_periods_to_add = [ + tp + for tp in build_cube_time_period_numbers + if tp not in base_cube_time_period_numbers + ] + + for tp in time_periods_to_add: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + time_periods_to_delete = [ + tp + for tp in base_cube_time_period_numbers + if tp not in build_cube_time_period_numbers + ] + + for tp in time_periods_to_delete: + lines_to_delete.append(line) + + WranglerLogger.debug("Evaluating differences in: {}".format(line)) + updated_properties = self.evaluate_route_property_differences( + self.line_properties[line], + base_transit.line_properties[line], + base_cube_time_period_number, + ) + updated_shapes = CubeTransit.evaluate_route_shape_changes( + self.shapes[line].node, base_transit.shapes[line].node + ) + if updated_properties: + update_prop_card_dict = self.create_update_route_card_dict( + line, updated_properties + ) + project_card_changes.append(update_prop_card_dict) + + if updated_shapes: + update_shape_card_dict = self.create_update_route_card_dict( + line, updated_shapes + ) + project_card_changes.append(update_shape_card_dict) + + """ + Evaluate Deletions + """ + for line in lines_to_delete: + delete_card_dict = self.create_delete_route_card_dict( + line, base_transit.line_properties[line] + ) + project_card_changes.append(delete_card_dict) + + """ + Evaluate Additions + + First assess if need to add multiple routes if there are multiple time periods + """ + for line in lines_to_add: + time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + if len(time_period_numbers) > 1: + for tp in time_period_numbers[1:]: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + for line in lines_to_add: + 
add_card_dict = self.create_add_route_card_dict(line) + project_card_changes.append(add_card_dict) + + return project_card_changes
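A hedged end-to-end sketch; the directories are hypothetical, and each returned item is a project-card-style dictionary with category, facility, and (for updates and additions) properties keys::

    from lasso import CubeTransit

    base_tn = CubeTransit.create_from_cube("base_cube_lin_dir")    # hypothetical directory
    build_tn = CubeTransit.create_from_cube("build_cube_lin_dir")  # hypothetical directory

    transit_change_list = build_tn.evaluate_differences(base_tn)
    for change in transit_change_list:
        print(change["category"], change["facility"])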
+ +
[docs] def add_additional_time_periods( + self, new_time_period_number: int, orig_line_name: str + ): + """ + Copies a route to another cube time period with appropriate + values for time-period-specific properties. + + New properties are stored under the new name in: + - ::self.shapes + - ::self.line_properties + + Args: + new_time_period_number (int): cube time period number + orig_line_name (str): name of the originating line, from which + the new line will copy its properties. + + Returns: + Line name with new time period. + """ + WranglerLogger.debug( + "adding time periods {} to line {}".format( + new_time_period_number, orig_line_name + ) + ) + + ( + route_id, + _init_time_period, + agency_id, + direction_id, + ) = CubeTransit.unpack_route_name(orig_line_name) + new_time_period_name = self.parameters.cube_time_periods[new_time_period_number] + new_tp_line_name = CubeTransit.build_route_name( + route_id=route_id, + time_period=new_time_period_name, + agency_id=agency_id, + direction_id=direction_id, + ) + + try: + assert new_tp_line_name not in self.lines + except: + msg = "Trying to add a new time period {} to line {}, but constructed name {} is already in line list.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + # copy to a new line and add it to list of lines to add + self.line_properties[new_tp_line_name] = copy.deepcopy( + self.line_properties[orig_line_name] + ) + self.shapes[new_tp_line_name] = copy.deepcopy(self.shapes[orig_line_name]) + self.line_properties[new_tp_line_name]["NAME"] = new_tp_line_name + + """ + Remove entries that aren't for this time period from the new line's properties list. + """ + this_time_period_properties_list = [ + p + "[" + str(new_time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + self.line_properties[new_tp_line_name].pop(k, None) + + """ + Remove entries for time period from the original line's properties list. + """ + for k in this_time_period_properties_list: + self.line_properties[orig_line_name].pop(k, None) + + """ + Add new line to list of lines to add. + """ + WranglerLogger.debug( + "Adding new time period {} for line {} as {}.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + ) + return new_tp_line_name

+ +
[docs] def create_update_route_card_dict(self, line: str, updated_properties_dict: dict): + """ + Creates a project card change formatted dictionary for updating + the line. + + Args: + line: name of line that is being updated + updated_properties_dict: dictionary of attributes to update as + 'property': <property name>, + 'set': <new property value> + + Returns: + A project card change-formatted dictionary for the attribute update. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.split("_")[-2].strip("d\"")), + "shape_id": line.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + "properties": updated_properties_dict, + } + WranglerLogger.debug( + "Updating {} route to changes:\n{}".format(line, str(update_card_dict)) + ) + + return update_card_dict
+ +
[docs] def create_delete_route_card_dict( + self, line: str, base_transit_line_properties_dict: dict + ): + """ + Creates a project card change formatted dictionary for deleting a line. + + Args: + line: name of line that is being deleted + base_transit_line_properties_dict: dictionary of cube-style + attribute values in order to find time periods and + start and end times. + + Returns: + A project card change-formatted dictionary for the route deletion. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + base_transit_line_properties_dict + ) + + delete_card_dict = { + "category": "Delete Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + } + WranglerLogger.debug( + "Deleting {} route to changes:\n{}".format(line, delete_card_dict) + ) + + return delete_card_dict
+ +
[docs] def create_add_route_card_dict(self, line: str): + """ + Creates a project card change formatted dictionary for adding + a route based on the information in self.route_properties for + the line. + + Args: + line: name of line that is being updated + + Returns: + A project card change-formatted dictionary for the route addition. + """ + start_time_str, end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + standard_properties = self.cube_properties_to_standard_properties( + self.line_properties[line] + ) + + routing_properties = { + "property": "routing", + "set": self.shapes[line]["node"].tolist(), + } + + add_card_dict = { + "category": "New Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('_')[-2]), + "start_time": start_time_str, + "end_time": end_time_str, + "agency_id": line.strip('_')[0], + }, + "properties": standard_properties + [routing_properties], + } + + WranglerLogger.debug( + "Adding {} route to changes:\n{}".format(line, add_card_dict) + ) + return add_card_dict
+ +
[docs] @staticmethod + def get_time_period_numbers_from_cube_properties(properties_list: list): + """ + Finds properties that are associated with time periods and the + returns the numbers in them. + + Args: + properties_list (list): list of all properties. + + Returns: + list of strings of the time period numbers found + """ + time_periods_list = [] + for p in properties_list: + if ("[" not in p) or ("]" not in p): + continue + tp_num = p.split("[")[1][0] + if tp_num and tp_num not in time_periods_list: + time_periods_list.append(tp_num) + return time_periods_list
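Because this is a static string inspection, it can be exercised directly; passing a line-properties dictionary iterates over its keys, so only bracketed property names contribute::

    from lasso import CubeTransit

    props = {"NAME": '"0_452-111_452_pk1"', "HEADWAY[1]": 10, "FREQ[2]": 15, "MODE": 5}
    print(CubeTransit.get_time_period_numbers_from_cube_properties(props))
    # ['1', '2'] -- the digits inside HEADWAY[1] and FREQ[2]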
+ +
[docs] @staticmethod + def build_route_name( + route_id: str = "", + time_period: str = "", + agency_id: str = 0, + direction_id: str = 1, + ): + """ + Create a route name by concatenating route, time period, agency, and direction + + Args: + route_id: i.e. 452-111 + time_period: i.e. pk + direction_id: i.e. 1 + agency_id: i.e. 0 + + Returns: + constructed line_name i.e. "0_452-111_452_pk1" + """ + + return ( + str(agency_id) + + "_" + + str(route_id) + + "_" + + str(route_id.split("-")[0]) + + "_" + + str(time_period) + + str(direction_id) + )
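For example, the values documented above produce the following name::

    from lasso import CubeTransit

    name = CubeTransit.build_route_name(
        route_id="452-111", time_period="pk", agency_id=0, direction_id=1
    )
    print(name)  # 0_452-111_452_pk1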
+ +
[docs] @staticmethod + def unpack_route_name(line_name: str): + """ + Unpacks route name into direction, route, agency, and time period info + + Args: + line_name (str): i.e. "0_452-111_452_pk1" + + Returns: + route_id (str): 452-111 + time_period (str): i.e. pk + direction_id (str) : i.e. 1 + agency_id (str) : i.e. 0 + """ + + line_name = line_name.strip('"') + + agency_id, route_id, _rtid, _tp_direction = line_name.split("_") + time_period = _tp_direction[0:-1] + direction_id = _tp_direction[-1] + + return route_id, time_period, agency_id, direction_id
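The inverse operation, using the same example name (surrounding quotes from the .lin file are stripped first)::

    from lasso import CubeTransit

    route_id, time_period, agency_id, direction_id = CubeTransit.unpack_route_name(
        '"0_452-111_452_pk1"'
    )
    print(route_id, time_period, agency_id, direction_id)  # 452-111 pk 0 1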
+ +
[docs] def calculate_start_end_times(self, line_properties_dict: dict): + """ + Calculate the start and end times of the property change + WARNING: Doesn't take care of discongruous time periods!!!! + + Args: + line_properties_dict: dictionary of cube-flavor properties for a transit line + """ + start_time_m = 24 * 60 + end_time_m = 0 * 60 + + WranglerLogger.debug( + "parameters.time_period_properties_list: {}".format( + self.parameters.time_period_properties_list + ) + ) + current_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + line_properties_dict + ) + ) + + WranglerLogger.debug( + "current_cube_time_period_numbers:{}".format( + current_cube_time_period_numbers + ) + ) + + for tp in current_cube_time_period_numbers: + time_period_name = self.parameters.cube_time_periods[tp] + WranglerLogger.debug("time_period_name:{}".format(time_period_name)) + _start_time, _end_time = self.parameters.time_period_to_time[ + time_period_name + ] + + # change from "HH:MM" to integer # of seconds + _start_time_m = (int(_start_time.split(":")[0]) * 60) + int( + _start_time.split(":")[1] + ) + _end_time_m = (int(_end_time.split(":")[0]) * 60) + int( + _end_time.split(":")[1] + ) + + # find bounding start and end times + if _start_time_m < start_time_m: + start_time_m = _start_time_m + if _end_time_m > end_time_m: + end_time_m = _end_time_m + + if start_time_m > end_time_m: + msg = "Start time ({}) is after end time ({})".format( + start_time_m, end_time_m + ) + #WranglerLogger.error(msg) + #raise ValueError(msg) + + start_time_str = "{:02d}:{:02d}".format(*divmod(start_time_m, 60)) + end_time_str = "{:02d}:{:02d}".format(*divmod(end_time_m, 60)) + return start_time_str, end_time_str
+ +
[docs] @staticmethod + def cube_properties_to_standard_properties(cube_properties_dict: dict): + """ + Converts cube style properties to standard properties. + + This is most pertinent to time-period-specific variables like headway, + and variables that have standard units, like headway, which is minutes + in cube and seconds in standard format. + + Args: + cube_properties_dict: <cube style property name> : <property value> + + Returns: + A list of dictionaries with values for `"property": <standard + style property name>, "set" : <property value with correct units>` + + """ + standard_properties_list = [] + for k, v in cube_properties_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + change_item["set"] = v * 60 + else: + change_item["property"] = k + change_item["set"] = v + standard_properties_list.append(change_item) + + return standard_properties_list
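Since it is a static method, the conversion can be checked in isolation; note the minutes-to-seconds conversion for headway::

    from lasso import CubeTransit

    print(CubeTransit.cube_properties_to_standard_properties({"HEADWAY[1]": 10, "MODE": 5}))
    # [{'property': 'headway_secs', 'set': 600}, {'property': 'MODE', 'set': 5}]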
+ +
[docs] def evaluate_route_property_differences( + self, + properties_build: dict, + properties_base: dict, + time_period_number: str, + absolute: bool = True, + validate_base: bool = False, + ): + """ + Checks if any values have been updated or added for a specific + route and creates project card entries for each. + + Args: + properties_build: ::<property_name>: <property_value> + properties_base: ::<property_name>: <property_value> + time_period_number: time period to evaluate + absolute: if True, will use `set` command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway + validate_base: if True, will add the `existing` line in the project card + + Returns: + transit_change_list (list): a list of dictionary values suitable for writing to a project card + `{ + 'property': <property_name>, + 'set': <set value>, + 'change': <change from existing value>, + 'existing': <existing value to check>, + }` + + """ + + # Remove time period specific values for things that aren't part of the time period in question + this_time_period_properties_list = [ + p + "[" + str(time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + properties_build.pop(k, None) + properties_base.pop(k, None) + + difference_dict = dict( + set(properties_build.items()) ^ set(properties_base.items()) + ) + + # Iterate through properties list to build difference project card list + + properties_list = [] + for k, v in difference_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + + if absolute: + change_item["set"] = ( + v * 60 + ) # project cards are in secs, cube is in minutes + else: + change_item["change"] = ( + properties_build[k] - properties_base[k] + ) * 60 + if validate_base or not absolute: + change_item["existing"] = properties_base[k] * 60 + else: + change_item["property"] = k + change_item["set"] = v + if validate_base: + change_item["existing"] = properties_base[k] + + properties_list.append(change_item) + WranglerLogger.debug( + "Evaluated Route Changes: \n {})".format( + "\n".join(map(str, properties_list)) + ) + ) + return properties_list
+ +
[docs] @staticmethod + def evaluate_route_shape_changes( + shape_build: DataFrame, shape_base: DataFrame + ): + """ + Compares two route shapes and constructs returns list of changes + suitable for a project card. + + Args: + shape_build: DataFrame of the build-version of the route shape. + shape_base: dDataFrame of the base-version of the route shape. + + Returns: + List of shape changes formatted as a project card-change dictionary. + + """ + + if shape_build.equals(shape_base): + return None + + shape_change_list = [] + + base_node_list = shape_base.tolist() + build_node_list = shape_build.tolist() + + sort_len = max(len(base_node_list), len(build_node_list)) + + start_pos = None + end_pos = None + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + start_pos = i - 1 + break + if base_node_list[i] != build_node_list[i]: + start_pos = i + break + else: + continue + + j = -1 + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + end_pos = j + 1 + break + if base_node_list[j] != build_node_list[j]: + end_pos = j + break + else: + j -= 1 + + if start_pos or end_pos: + existing = base_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + set = build_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + + shape_change_list.append( + {"property": "routing", "existing": existing, "set": set} + ) + + return shape_change_list
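A small, self-contained illustration with made-up node series; the result is a single routing change whose existing/set lists include a couple of unchanged anchor nodes on either side of the edit::

    import pandas as pd
    from lasso import CubeTransit

    base = pd.Series([1, 2, 3, 4, 5])
    build = pd.Series([1, 2, 9, 4, 5])   # node 3 replaced by node 9

    print(CubeTransit.evaluate_route_shape_changes(shape_build=build, shape_base=base))
    # [{'property': 'routing', 'existing': [1, 2, 3, 4], 'set': [1, 2, 9, 4]}]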
+ + +
[docs]class StandardTransit(object): + """Holds a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's + Cube Line files. + + .. highlight:: python + Typical usage example: + :: + cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) + cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) + + Attributes: + feed: Partridge Feed object containing read-only access to GTFS feed + parameters (Parameters): Parameters instance containing information + about time periods and variables. + """ + +
[docs] def __init__(self, ptg_feed, parameters: Union[Parameters, dict] = {}): + """ + + Args: + ptg_feed: partridge feed object + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + self.feed = ptg_feed + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] @staticmethod + def fromTransitNetwork( + transit_network_object: TransitNetwork, parameters: Union[Parameters, dict] = {} + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + transit_network_object: Reference to an instance of TransitNetwork. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit + """ + return StandardTransit(transit_network_object.feed, parameters=parameters)
+ +
[docs] @staticmethod + def read_gtfs(gtfs_feed_dir: str, parameters: Union[Parameters, dict] = {}): + """ + Reads GTFS files from a directory and returns a StandardTransit + instance. + + Args: + gtfs_feed_dir: location of the GTFS files + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit instance + """ + return StandardTransit(ptg.load_feed(gtfs_feed_dir), parameters=parameters)
+ +
[docs] def write_as_cube_lin(self, outpath: str = None): + """ + Writes the gtfs feed as a cube line file after + converting gtfs properties to MetCouncil cube properties. + #MC + Args: + outpath: File location for output cube line file. + + """ + if not outpath: + outpath = os.path.join(self.parameters.scratch_location, "outtransit.lin") + trip_cube_df = self.route_properties_gtfs_to_cube(self) + + trip_cube_df["LIN"] = trip_cube_df.apply(self.cube_format, axis=1) + + l = trip_cube_df["LIN"].tolist() + + with open(outpath, "w") as f: + f.write("\n".join(l))
+ +
[docs] @staticmethod + def route_properties_gtfs_to_cube(self): + """ + Prepare gtfs for cube lin file. + #MC + Does the following operations: + 1. Combines route, frequency, trip, and shape information + 2. Converts time of day to time periods + 3. Calculates cube route name from gtfs route name and properties + 4. Assigns a cube-appropriate mode number + 5. Assigns a cube-appropriate operator number + + Returns: + trip_df (DataFrame): DataFrame of trips with cube-appropriate values for: + - NAME + - ONEWAY + - OPERATOR + - MODE + - HEADWAY + """ + WranglerLogger.info( + "Converting GTFS Standard Properties to MetCouncil's Cube Standard" + ) + metro_operator_dict = { + "0": 3, + "1": 3, + "2": 3, + "3": 4, + "4": 2, + "5": 5, + "6": 8, + "7": 1, + "8": 1, + "9": 10, + "10": 3, + "11": 9, + "12": 3, + "13": 4, + "14": 4, + "15": 3, + } + + shape_df = self.feed.shapes.copy() + trip_df = self.feed.trips.copy() + + """ + Add information from: routes, frequencies, and routetype to trips_df + """ + trip_df = pd.merge(trip_df, self.feed.routes, how="left", on="route_id") + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + inv_cube_time_periods_map = { + v: k for k, v in self.parameters.cube_time_periods.items() + } + trip_df["tod_num"] = trip_df.tod_name.map(inv_cube_time_periods_map) + trip_df["tod_name"] = trip_df.tod_name.map( + self.parameters.cube_time_periods_name + ) + + trip_df["NAME"] = trip_df.apply( + lambda x: x.agency_id + + "_" + + x.route_id + + "_" + + x.route_short_name + + "_" + + x.tod_name + + str(x.direction_id), + axis=1, + ) + + trip_df["LONGNAME"] = trip_df["route_long_name"] + trip_df["HEADWAY"] = (trip_df["headway_secs"] / 60).astype(int) + trip_df["MODE"] = trip_df.apply(self.calculate_cube_mode, axis=1) + trip_df["ONEWAY"] = "T" + trip_df["OPERATOR"] = trip_df["agency_id"].map(metro_operator_dict) + + return trip_df
+ +
[docs] def calculate_cube_mode(self, row): + """ + Assigns a cube mode number by following logic. + #MC + For rail, uses GTFS route_type variable: + https://developers.google.com/transit/gtfs/reference + + :: + # route_type : cube_mode + route_type_to_cube_mode = {0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9} # Rail + + For buses, uses route id numbers and route name to find + express and suburban buses as follows: + + :: + if not cube_mode: + if 'express' in row['LONGNAME'].lower(): + cube_mode = 7 # Express + elif int(row['route_id'].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + Args: + row: A DataFrame row with route_type, route_long_name, and route_id + + Returns: + cube mode number + """ + # route_type : cube_mode + route_type_to_cube_mode = { + 0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9, + } # Rail + + cube_mode = route_type_to_cube_mode[row["route_type"]] + + if not cube_mode: + if "express" in row["route_long_name"].lower(): + cube_mode = 7 # Express + elif int(row["route_id"].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + return cube_mode
+ +
[docs] def time_to_cube_time_period( + self, start_time_secs: int, as_str: bool = True, verbose: bool = False + ): + """ + Converts seconds from midnight to the cube time period. + + Args: + start_time_secs: start time for transit trip in seconds + from midnight + as_str: if True, returns the time period as a string, + otherwise returns a numeric time period + + Returns: + this_tp_num: if as_str is False, returns the numeric + time period + this_tp: if as_str is True, returns the Cube time period + name abbreviation + """ + from .util import hhmmss_to_datetime, secs_to_datetime + + # set initial time as the time that spans midnight + + start_time_dt = secs_to_datetime(start_time_secs) + + # set initial time as the time that spans midnight + this_tp = "NA" + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + _dt_end_time = hhmmss_to_datetime(_end_time) + if _dt_start_time > _dt_end_time: + this_tp = tp_name + break + + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + if start_time_dt >= _dt_start_time: + this_time = _dt_start_time + this_tp = tp_name + + if verbose: + WranglerLogger.debug( + "Finding Cube Time Period from Start Time: \ + \n - start_time_sec: {} \ + \n - start_time_dt: {} \ + \n - this_tp: {}".format( + start_time_secs, start_time_dt, this_tp + ) + ) + + if as_str: + return this_tp + + name_to_num = {v: k for k, v in self.parameters.cube_time_periods.items()} + this_tp_num = name_to_num.get(this_tp) + + if not this_tp_num: + msg = ( + "Cannot find time period number in {} for time period name: {}".format( + name_to_num, this_tp + ) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + return this_tp_num
+ +
[docs] def shape_gtfs_to_cube(self, row, add_nntime = False): + """ + Creates a list of nodes that for the route in appropriate + cube format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a string representation of the node list + for a route in cube format. + + """ + agency_raw_name = row.agency_raw_name + shape_id = row.shape_id + trip_id = row.trip_id + + trip_stop_times_df = self.feed.stop_times.copy() + + if 'agency_raw_name' in trip_stop_times_df.columns: + trip_stop_times_df.drop('agency_raw_name', axis = 1, inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, + self.feed.trips[['trip_id', 'agency_raw_name']], + how = 'left', + on = ['trip_id'] + ) + + trip_stop_times_df = trip_stop_times_df[ + (trip_stop_times_df.trip_id == row.trip_id) & + (trip_stop_times_df.agency_raw_name == agency_raw_name) + ] + + trip_node_df = self.feed.shapes.copy() + if 'agency_raw_name' in trip_node_df.columns: + trip_node_df.drop('agency_raw_name', axis = 1, inplace = True) + + trip_node_df = pd.merge( + trip_node_df, + self.feed.trips[['shape_id', 'agency_raw_name']].drop_duplicates(), + how = 'left', + on = ['shape_id'] + ) + + trip_node_df = trip_node_df[ + (trip_node_df.shape_id == shape_id) & + (trip_node_df.agency_raw_name == agency_raw_name) + ] + + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + if 'trip_id' in self.feed.stops.columns: + stops_df = self.feed.stops.copy() + if agency_raw_name != 'sjrtd_2015_0127': + stops_df = stops_df[stops_df.agency_raw_name != 'sjrtd_2015_0127'] + trip_stop_times_df = pd.merge( + trip_stop_times_df, stops_df.drop('trip_id', axis = 1), how="left", on=["stop_id"] + ) + else: + stops_df = stops_df[stops_df.agency_raw_name == 'sjrtd_2015_0127'] + stops_df['trip_id'] = stops_df['trip_id'].astype(float).astype(int).astype(str) + trip_stop_times_df = pd.merge( + trip_stop_times_df, stops_df, how="left", on=['agency_raw_name', 'trip_id',"stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. 
VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + return node_list_str
+ + +
[docs] def cube_format(self, row): + """ + Creates a string representing the route in cube line file notation. + #MC + Args: + row: row of a DataFrame representing a cube-formatted trip, with the attributes + trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR + + Returns: + string representation of route in cube line file notation + """ + + s = '\nLINE NAME="{}",'.format(row.NAME) + s += '\n LONGNAME="{}",'.format(row.LONGNAME) + s += "\n HEADWAY[{}]={},".format(row.tod_num, row.HEADWAY) + s += "\n MODE={},".format(row.MODE) + s += "\n ONEWAY={},".format(row.ONEWAY) + s += "\n OPERATOR={},".format(row.OPERATOR) + s += "\n NODES={}".format(self.shape_gtfs_to_cube(row)) + + return s
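To show what the assembled text looks like, here is a runnable sketch with a made-up trip row and a hard-coded node list standing in for shape_gtfs_to_cube()::

    row = dict(NAME="0_452-111_452_pk1", LONGNAME="Express 452", tod_num=1,
               HEADWAY=10, MODE=7, ONEWAY="T", OPERATOR=3)
    nodes = "\n 38654,\n -38655,\n 38656"   # stands in for self.shape_gtfs_to_cube(row)

    s = '\nLINE NAME="{}",'.format(row["NAME"])
    s += '\n LONGNAME="{}",'.format(row["LONGNAME"])
    s += "\n HEADWAY[{}]={},".format(row["tod_num"], row["HEADWAY"])
    s += "\n MODE={},".format(row["MODE"])
    s += "\n ONEWAY={},".format(row["ONEWAY"])
    s += "\n OPERATOR={},".format(row["OPERATOR"])
    s += "\n NODES={}".format(nodes)
    print(s)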
+ +
[docs] def shape_gtfs_to_emme(self, trip_row): + """ + Creates transit segment for the trips in appropriate + emme format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a dataframe representation of the transit segment + for a trip in emme format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == trip_row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == trip_row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # node list + stop_seq = 0 + nntimes = [] + allow_alightings=[] + allow_boardings=[] + stop_names=[] + + if trip_row.TM2_line_haul_name in ["Light rail", "Heavy rail", "Commuter rail", "Ferry service"]: + add_nntime = True + else: + add_nntime = False + + for nodeIdx in range(len(trip_node_list)): + + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + nntimes.append(nntime_v) + else: + nntimes.append(0) + + pickup_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"pickup_type"].iloc[0] + if pickup_type in [1, "1"]: + allow_alightings.append(0) + else: + allow_alightings.append(1) + + drop_off_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"drop_off_type"].iloc[0] + if drop_off_type in [1, "1"]: + allow_boardings.append(0) + else: + allow_boardings.append(1) + + stop_name = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"stop_name"].iloc[0] + stop_names.append(stop_name) + + else: + nntimes.append(0) + allow_alightings.append(0) + allow_boardings.append(0) + stop_names.append("") + + trip_node_df['time_minutes'] = nntimes + trip_node_df['allow_alightings'] = allow_alightings + trip_node_df['allow_boardings'] = allow_boardings + trip_node_df['stop_name'] = stop_names + trip_node_df['line_id'] = trip_row['line_id'] + trip_node_df['node_id'] = trip_node_df['shape_model_node_id'].astype(int) + trip_node_df['stop_order'] = trip_node_df['shape_pt_sequence'] + + return trip_node_df
+ +
[docs] def evaluate_differences(self, transit_changes): + """ + Compare changes from the transit_changes dataframe with the standard transit network + returns the project card changes in dictionary format + """ + + # simple properties change + trip_df = self.feed.trips.copy() + + mode_crosswalk = pd.read_csv(self.parameters.mode_crosswalk_file) + mode_crosswalk.drop_duplicates(subset = ["agency_raw_name", "route_type", "is_express_bus"], inplace = True) + + trip_df = pd.merge(trip_df, self.feed.routes.drop("agency_raw_name", axis = 1), how="left", on="route_id") + + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod"] = trip_df.start_time.apply(self.time_to_cube_time_period, as_str = False) + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + + trip_df["headway_minutes"] = (trip_df["headway_secs"] / 60).astype(int) + + trip_df = pd.merge(trip_df, self.feed.agency[["agency_name", "agency_raw_name", "agency_id"]], how = "left", on = ["agency_raw_name", "agency_id"]) + + # identify express bus + from .mtc import _is_express_bus + trip_df["is_express_bus"] = trip_df.apply(lambda x: _is_express_bus(x), axis = 1) + trip_df.drop("agency_name", axis = 1 , inplace = True) + + trip_df = pd.merge( + trip_df, + mode_crosswalk.drop("agency_id", axis = 1), + how = "left", + on = ["agency_raw_name", "route_type", "is_express_bus"] + ) + + trip_df["line_id"] = trip_df.apply( + lambda x: str(x.TM2_operator) + + "_" + + str(x.route_id) + + "_" + + x.tod_name + + "_" + + "d" + + str(int(x.direction_id)) + + "_s" + + x.shape_id, + axis=1, + ) + + trip_df["line_id"] = trip_df["line_id"].str.slice(stop = 28) + + project_card_changes = [] + + # lines updated + transit_changes['line_id'] = transit_changes.apply( + lambda x: '-'.join(x['element_id'].split('-')[:-3]) if + x['object'] == 'TRANSIT_STOP' else + x['element_id'], + axis = 1 + ) + + lines_updated_df = transit_changes[ + (transit_changes['operation'] == 'C') & + (transit_changes['line_id'].isin(trip_df['line_id'].tolist())) + ].copy() + + ######################### + # simple property changes + ######################### + + property_changes_df = lines_updated_df[ + lines_updated_df.object == 'TRANSIT_LINE' + ].copy() + + property_attribute_list = ['headway_secs'] + + for index, row in property_changes_df.iterrows(): + line_id = row['line_id'] + properties_list = [] + change_item = {} + for c in property_attribute_list: + existing_value = int(trip_df[ + trip_df['line_id'] == line_id + ][c].iloc[0]) + + change_item["existing"] = existing_value + + if c == 'headway_secs': + change_item["set"] = row['headway'] * 60 + else: + change_item["set"] = row[c] + + change_item["property"] = c + + properties_list.append(change_item) + + property_changes_df.loc[index, 'properties'] = properties_list + + ############### + # shape changes + ############### + + shape_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_SHAPE']) + ].copy() + + for index, row in shape_changes_df.iterrows(): + line_id = row.line_id + + # get base shape + trip_row = trip_df[trip_df.line_id == line_id].copy().squeeze() + + base_shape = self.shape_gtfs_to_emme( + trip_row=trip_row + ) + base_shape['shape_model_node_id'] = base_shape['shape_model_node_id'].astype(int) + + # get build shape + build_shape = row.new_itinerary + + updated_shapes = CubeTransit.evaluate_route_shape_changes( + shape_base = base_shape.shape_model_node_id, + shape_build = pd.Series(row.new_itinerary) + ) + 
updated_shapes[0]['property'] = 'shapes' + shape_changes_df.loc[index, 'properties'] = updated_shapes + + ############## + # stop changes + ############## + stop_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_STOP']) + ].copy() + + stop_attribute_list = ['allow_alightings', 'allow_boardings'] + + stop_changes_df = stop_changes_df.groupby( + ['line_id','i_node'] + )[stop_attribute_list].last().reset_index() + + stop_attribute_changes_df = pd.DataFrame() + + for attribute in stop_attribute_list: + + attribute_df = stop_changes_df.groupby( + ['line_id', attribute] + )['i_node'].apply(list).reset_index() + attribute_df['properties'] = attribute_df.apply( + lambda x: { + 'property' : attribute if x[attribute] == True else 'no_'+attribute.split('_')[-1], + 'set': x['i_node']}, + axis = 1 + ) + + stop_attribute_changes_df = pd.concat( + [stop_attribute_changes_df, + attribute_df[['line_id', 'properties']]], + sort = False, + ignore_index = True + ) + + ############## + # combine all transit changes + ############## + transit_changes_df = pd.concat( + [ + property_changes_df, + shape_changes_df, + stop_attribute_changes_df + ], + sort = False, + ignore_index = True + ) + + # groupby line_id + transit_changes_df = transit_changes_df.groupby( + ['line_id'] + )['properties'].apply(list).reset_index() + + # create change items by line_id + for index, row in transit_changes_df.iterrows(): + line_id = row['line_id'] + base_start_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[0] + + base_end_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[1] + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line_id.split("_")[1], + "direction_id": int(line_id.split("_")[-2].strip("d\"")), + "shape_id": line_id.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str + }, + "properties": row['properties'], + } + + project_card_changes.append(update_card_dict) + + return project_card_changes
+ +class CubeTransformer(Transformer): + """A lark-parsing Transformer which transforms the parse-tree to + a dictionary. + + .. highlight:: python + Typical usage example: + :: + transformed_tree_data = CubeTransformer().transform(parse_tree) + + Attributes: + line_order (int): a dynamic counter to hold the order of the nodes within + a route shape + lines_list (list): a list of the line names + """ + + def __init__(self): + self.line_order = 0 + self.lines_list = [] + + def lines(self, line): + # WranglerLogger.debug("lines: \n {}".format(line)) + + # This MUST be a tuple because it returns to start in the tree + lines = {k: v for k, v in line} + return ("lines", lines) + + @v_args(inline=True) + def program_type_line(self, PROGRAM_TYPE, whitespace=None): + # WranglerLogger.debug("program_type_line:{}".format(PROGRAM_TYPE)) + self.program_type = PROGRAM_TYPE.value + + # This MUST be a tuple because it returns to start in the tree + return ("program_type", PROGRAM_TYPE.value) + + @v_args(inline=True) + def line(self, lin_attributes, nodes): + # WranglerLogger.debug("line...attributes:\n {}".format(lin_attributes)) + # WranglerLogger.debug("line...nodes:\n {}".format(nodes)) + lin_name = lin_attributes["NAME"] + + self.line_order = 0 + # WranglerLogger.debug("parsing: {}".format(lin_name)) + + return (lin_name, {"line_properties": lin_attributes, "line_shape": nodes}) + + @v_args(inline=True) + def lin_attributes(self, *lin_attr): + lin_attr = {k: v for (k, v) in lin_attr} + # WranglerLogger.debug("lin_attributes: {}".format(lin_attr)) + return lin_attr + + @v_args(inline=True) + def lin_attr(self, lin_attr_name, attr_value, SEMICOLON_COMMENT=None): + # WranglerLogger.debug("lin_attr {}: {}".format(lin_attr_name, attr_value)) + return lin_attr_name, attr_value + + def lin_attr_name(self, args): + attr_name = args[0].value.upper() + # WranglerLogger.debug(".......args {}".format(args)) + if attr_name in ["FREQ", "HEADWAY"]: + attr_name = attr_name + "[" + str(args[2]) + "]" + return attr_name + + def attr_value(self, attr_value): + try: + return int(attr_value[0].value) + except: + return attr_value[0].value + + def nodes(self, lin_node): + lin_node = DataFrame(lin_node) + # WranglerLogger.debug("nodes:\n {}".format(lin_node)) + + return lin_node + + @v_args(inline=True) + def lin_node(self, NODE_NUM, SEMICOLON_COMMENT=None, *lin_nodeattr): + self.line_order += 1 + n = int(NODE_NUM.value) + return {"node_id": abs(n), "node": n, "stop": n > 0, "order": self.line_order} + + start = dict + + +TRANSIT_LINE_FILE_GRAMMAR = r""" + +start : program_type_line? lines +WHITESPACE : /[ \t\r\n]/+ +STRING : /("(?!"").*?(?<!\\)(\\\\)*?"|'(?!'').*?(?<!\\)(\\\\)*?')/i +SEMICOLON_COMMENT : /;[^\n]*/ +BOOLEAN : "T"i | "F"i +program_type_line : ";;<<" PROGRAM_TYPE ">><<LINE>>;;" WHITESPACE? +PROGRAM_TYPE : "PT" | "TRNBUILD" + +lines : line* +line : "LINE" lin_attributes nodes + +lin_attributes : lin_attr+ +lin_attr : lin_attr_name "=" attr_value "," SEMICOLON_COMMENT* +TIME_PERIOD : "1".."5" +!lin_attr_name : "allstops"i + | "color"i + | ("freq"i "[" TIME_PERIOD "]") + | ("headway"i "[" TIME_PERIOD "]") + | "mode"i + | "name"i + | "oneway"i + | "owner"i + | "runtime"i + | "timefac"i + | "xyspeed"i + | "longname"i + | "shortname"i + | ("usera1"i) + | ("usera2"i) + | "circular"i + | "vehicletype"i + | "operator"i + | "faresystem"i + +attr_value : BOOLEAN | STRING | SIGNED_INT | FLOAT + +nodes : lin_node+ +lin_node : ("N" | "NODES")? "="? NODE_NUM ","? SEMICOLON_COMMENT? 
lin_nodeattr* +NODE_NUM : SIGNED_INT +lin_nodeattr : lin_nodeattr_name "=" attr_value ","? SEMICOLON_COMMENT* +!lin_nodeattr_name : "access_c"i + | "access"i + | "delay"i + | "xyspeed"i + | "timefac"i + | "nntime"i + | "time"i + +operator : SEMICOLON_COMMENT* "OPERATOR" opmode_attr* SEMICOLON_COMMENT* +mode : SEMICOLON_COMMENT* "MODE" opmode_attr* SEMICOLON_COMMENT* +opmode_attr : ( (opmode_attr_name "=" attr_value) ","? ) +opmode_attr_name : "number" | "name" | "longname" + +%import common.SIGNED_INT +%import common.FLOAT +%import common.WS +%ignore WS + +""" +
+ +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_modules/lasso/util/index.html b/branch/bicounty/_modules/lasso/util/index.html new file mode 100644 index 0000000..c53ef4c --- /dev/null +++ b/branch/bicounty/_modules/lasso/util/index.html @@ -0,0 +1,250 @@ + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.util

+from functools import partial
+import pyproj
+from shapely.ops import transform
+from shapely.geometry import Point, Polygon
+import re
+from unidecode import unidecode
+
+
[docs]def get_shared_streets_intersection_hash(lat, long, osm_node_id=None): + """ + Calculated per: + https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565 + Expected in/out + -93.0965985, 44.952112199999995 osm_node_id = 954734870 + 69f13f881649cb21ee3b359730790bb9 + + """ + import hashlib + + message = "Intersection {0:.5f} {0:.5f}".format(long, lat) + if osm_node_id: + message += " {}".format(osm_node_id) + unhashed = message.encode("utf-8") + hash = hashlib.md5(unhashed).hexdigest() + return hash
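A hedged usage sketch, assuming lasso.util is importable; the coordinates, node id, and expected hash are the ones quoted in the docstring above::

    from lasso.util import get_shared_streets_intersection_hash

    h = get_shared_streets_intersection_hash(
        44.952112199999995, -93.0965985, osm_node_id=954734870
    )
    print(h)  # the docstring above lists 69f13f881649cb21ee3b359730790bb9 for this intersection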
+ + +
[docs]def hhmmss_to_datetime(hhmmss_str: str): + """ + Creates a datetime time object from a string of hh:mm:ss + + Args: + hhmmss_str: string of hh:mm:ss + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = datetime.time(*[int(i) for i in hhmmss_str.split(":")]) + + return dt
+ + +
[docs]def secs_to_datetime(secs: int): + """ + Creates a datetime time object from a seconds from midnight + + Args: + secs: seconds from midnight + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = (datetime.datetime.min + datetime.timedelta(seconds=secs)).time() + + return dt
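Both converters return a datetime.time and can be checked directly (lasso.util assumed importable); "06:30:00" and 23,400 seconds describe the same clock time::

    from lasso.util import hhmmss_to_datetime, secs_to_datetime

    print(hhmmss_to_datetime("06:30:00"))        # 06:30:00
    print(secs_to_datetime(6 * 3600 + 30 * 60))  # 06:30:00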
+ + +
[docs]def geodesic_point_buffer(lat, lon, meters): + """ + creates circular buffer polygon for node + + Args: + lat: node lat + lon: node lon + meters: buffer distance, radius of circle + Returns: + Polygon + """ + proj_wgs84 = pyproj.Proj('+proj=longlat +datum=WGS84') + # Azimuthal equidistant projection + aeqd_proj = '+proj=aeqd +lat_0={lat} +lon_0={lon} +x_0=0 +y_0=0' + project = partial( + pyproj.transform, + pyproj.Proj(aeqd_proj.format(lat=lat, lon=lon)), + proj_wgs84) + buf = Point(0, 0).buffer(meters) # distance in meters + return Polygon(transform(project, buf).exterior.coords[:])
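A hedged call sketch; it assumes a pyproj version that still provides the legacy pyproj.transform/Proj API used above, and the coordinates and radius are arbitrary::

    from lasso.util import geodesic_point_buffer

    circle = geodesic_point_buffer(44.95, -93.09, meters=400)  # ~400 m buffer around a point
    print(circle.is_valid, len(circle.exterior.coords))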
+ +
[docs]def create_locationreference(node, link):
+    """
+    Builds a 'locationReferences' column on the link GeoDataFrame from the
+    geometry of each link's A and B nodes. Modifies node and link in place.
+    """
+    node['X'] = node['geometry'].apply(lambda p: p.x)
+    node['Y'] = node['geometry'].apply(lambda p: p.y)
+    node['point'] = [list(xy) for xy in zip(node.X, node.Y)]
+    node_dict = dict(zip(node.model_node_id, node.point))
+
+    link['A_point'] = link['A'].map(node_dict)
+    link['B_point'] = link['B'].map(node_dict)
+    link['locationReferences'] = link.apply(
+        lambda x: [
+            {'sequence': 1,
+             'point': x['A_point'],
+             'distanceToNextRef': x['length'],
+             'bearing': 0,
+             'intersectionId': x['fromIntersectionId']},
+            {'sequence': 2,
+             'point': x['B_point'],
+             'intersectionId': x['toIntersectionId']},
+        ],
+        axis=1)
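
A made-up, minimal sketch (editor-added) of calling create_locationreference; the two GeoDataFrames below are hypothetical and only carry the columns the function expects:

import geopandas as gpd
from shapely.geometry import LineString, Point

node = gpd.GeoDataFrame(
    {"model_node_id": [1, 2]},
    geometry=[Point(-93.10, 44.95), Point(-93.09, 44.96)],
)
link = gpd.GeoDataFrame(
    {
        "A": [1],
        "B": [2],
        "length": [1200.0],
        "fromIntersectionId": ["hypothetical-from-id"],
        "toIntersectionId": ["hypothetical-to-id"],
    },
    geometry=[LineString([(-93.10, 44.95), (-93.09, 44.96)])],
)

create_locationreference(node, link)  # adds 'locationReferences' to link in place
print(link["locationReferences"].iloc[0])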
+ +
[docs]def column_name_to_parts(c, parameters=None):
+    """
+    Splits a column name into its base name, time period, category, and
+    whether it is a managed-lane ("ML"-prefixed) property.
+
+    Returns:
+        base_name, time_period, category, managed
+    """
+    if not parameters:
+        from .parameters import Parameters
+
+        parameters = Parameters()
+
+    if c[0:2] == "ML":
+        managed = True
+    else:
+        managed = False
+
+    time_period = None
+    category = None
+
+    if c.split("_")[0] not in parameters.properties_to_split.keys():
+        return c, None, None, managed
+
+    tps = parameters.time_period_to_time.keys()
+    cats = parameters.categories.keys()
+
+    if c.split("_")[-1] in tps:
+        time_period = c.split("_")[-1]
+        base_name = c.split(time_period)[-2][:-1]
+        if c.split("_")[-2] in cats:
+            category = c.split("_")[-2]
+            base_name = c.split(category)[-2][:-1]
+    elif c.split("_")[-1] in cats:
+        category = c.split("_")[-1]
+        base_name = c.split(category)[-2][:-1]
+    else:
+        # Fall back to the full column name so the return below doesn't
+        # raise a NameError when the property can't be split.
+        base_name = c
+        msg = "Can't split property correctly: {}".format(c)
+        WranglerLogger.error(msg)
+
+    return base_name, time_period, category, managed
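
A hypothetical call (editor-added); the result depends on the properties_to_split, time periods, and categories configured in the Parameters instance, so the "lanes" property and "AM" time period below are assumptions:

base_name, time_period, category, managed = column_name_to_parts("lanes_AM")
print(base_name, time_period, category, managed)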
+ +
[docs]def shorten_name(name):
+    """
+    Cleans a street-name string (or list of names): strips non-word
+    characters and 'nan' values, removes duplicates, and transliterates
+    non-English characters to their closest ASCII equivalents.
+    """
+    if type(name) == str:
+        name_list = name.split(',')
+    else:
+        name_list = name
+    name_list = [re.sub(r'\W+', ' ', c).replace('nan', '').strip(' ') for c in name_list]
+
+    name_list = list(set(name_list))
+    # name_list.remove('')
+
+    name_new = ' '.join(name_list).strip(' ')
+
+    # convert non-English characters to English
+    name_new = unidecode(name_new)
+
+    return name_new
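
An illustrative call (editor-added); because duplicates are dropped through a set, the word order of the result can vary between runs:

print(shorten_name("7th St W, 7th St W, Fort Rd"))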
+
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_modules/shapely/geometry/point/index.html b/branch/bicounty/_modules/shapely/geometry/point/index.html new file mode 100644 index 0000000..0ae7d33 --- /dev/null +++ b/branch/bicounty/_modules/shapely/geometry/point/index.html @@ -0,0 +1,391 @@ + + + + + + shapely.geometry.point — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.point

+"""Points and related utilities
+"""
+
+from ctypes import c_double
+import warnings
+
+from shapely.coords import CoordinateSequence
+from shapely.errors import DimensionError, ShapelyDeprecationWarning
+from shapely.geos import lgeos
+from shapely.geometry.base import BaseGeometry, geos_geom_from_py
+from shapely.geometry.proxy import CachingGeometryProxy
+
+__all__ = ['Point', 'asPoint']
+
+
+
[docs]class Point(BaseGeometry): + """ + A zero dimensional feature + + A point has zero length and zero area. + + Attributes + ---------- + x, y, z : float + Coordinate values + + Example + ------- + >>> p = Point(1.0, -1.0) + >>> print(p) + POINT (1 -1) + >>> p.y + -1.0 + >>> p.x + 1.0 + """ + + def __init__(self, *args): + """ + Parameters + ---------- + There are 2 cases: + + 1) 1 parameter: this must satisfy the numpy array protocol. + 2) 2 or more parameters: x, y, z : float + Easting, northing, and elevation. + """ + BaseGeometry.__init__(self) + if len(args) > 0: + if len(args) == 1: + geom, n = geos_point_from_py(args[0]) + elif len(args) > 3: + raise TypeError( + "Point() takes at most 3 arguments ({} given)".format(len(args)) + ) + else: + geom, n = geos_point_from_py(tuple(args)) + self._set_geom(geom) + self._ndim = n + + # Coordinate getters and setters + + @property + def x(self): + """Return x coordinate.""" + return self.coords[0][0] + + @property + def y(self): + """Return y coordinate.""" + return self.coords[0][1] + + @property + def z(self): + """Return z coordinate.""" + if self._ndim != 3: + raise DimensionError("This point has no z coordinate.") + return self.coords[0][2] + + @property + def __geo_interface__(self): + return { + 'type': 'Point', + 'coordinates': self.coords[0] + } + +
[docs] def svg(self, scale_factor=1., fill_color=None, opacity=None): + """Returns SVG circle element for the Point geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG circle diameter. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return '<g />' + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + return ( + '<circle cx="{0.x}" cy="{0.y}" r="{1}" ' + 'stroke="#555555" stroke-width="{2}" fill="{3}" opacity="{4}" />' + ).format(self, 3. * scale_factor, 1. * scale_factor, fill_color, opacity)
+ + @property + def _ctypes(self): + if not self._ctypes_data: + array_type = c_double * self._ndim + array = array_type() + xy = self.coords[0] + array[0] = xy[0] + array[1] = xy[1] + if self._ndim == 3: + array[2] = xy[2] + self._ctypes_data = array + return self._ctypes_data + + def _array_interface(self): + """Provide the Numpy array protocol.""" + if self.is_empty: + ai = {'version': 3, 'typestr': '<f8', 'shape': (0,), 'data': (c_double * 0)()} + else: + ai = self._array_interface_base + ai.update({'shape': (self._ndim,)}) + return ai + +
[docs] def array_interface(self): + """Provide the Numpy array protocol.""" + warnings.warn( + "The 'array_interface' method is deprecated and will be removed " + "in Shapely 2.0.", + ShapelyDeprecationWarning, stacklevel=2) + return self._array_interface()
+ + @property + def __array_interface__(self): + warnings.warn( + "The array interface is deprecated and will no longer work in " + "Shapely 2.0. Convert the '.coords' to a numpy array instead.", + ShapelyDeprecationWarning, stacklevel=3) + return self._array_interface() + + @property + def bounds(self): + """Returns minimum bounding region (minx, miny, maxx, maxy)""" + try: + xy = self.coords[0] + except IndexError: + return () + return (xy[0], xy[1], xy[0], xy[1]) + + # Coordinate access + + def _get_coords(self): + """Access to geometry's coordinates (CoordinateSequence)""" + return CoordinateSequence(self) + + def _set_coords(self, *args): + warnings.warn( + "Setting the 'coords' to mutate a Geometry in place is deprecated," + " and will not be possible any more in Shapely 2.0", + ShapelyDeprecationWarning, stacklevel=2) + self._empty() + if len(args) == 1: + geom, n = geos_point_from_py(args[0]) + elif len(args) > 3: + raise TypeError("Point() takes at most 3 arguments ({} given)".format(len(args))) + else: + geom, n = geos_point_from_py(tuple(args)) + self._set_geom(geom) + self._ndim = n + + coords = property(_get_coords, _set_coords) + + @property + def xy(self): + """Separate arrays of X and Y coordinate values + + Example: + >>> x, y = Point(0, 0).xy + >>> list(x) + [0.0] + >>> list(y) + [0.0] + """ + return self.coords.xy
+ + +class PointAdapter(CachingGeometryProxy, Point): + + _other_owned = False + + def __init__(self, context): + warnings.warn( + "The proxy geometries (through the 'asShape()', 'asPoint()' or " + "'PointAdapter()' constructors) are deprecated and will be " + "removed in Shapely 2.0. Use the 'shape()' function or the " + "standard 'Point()' constructor instead.", + ShapelyDeprecationWarning, stacklevel=4) + self.context = context + self.factory = geos_point_from_py + + @property + def _ndim(self): + try: + # From array protocol + array = self.context.__array_interface__ + n = array['shape'][0] + assert n == 2 or n == 3 + return n + except AttributeError: + # Fall back on list + return len(self.context) + + @property + def __array_interface__(self): + """Provide the Numpy array protocol.""" + try: + return self.context.__array_interface__ + except AttributeError: + return self.array_interface() + + def _get_coords(self): + """Access to geometry's coordinates (CoordinateSequence)""" + return CoordinateSequence(self) + + def _set_coords(self, ob): + raise NotImplementedError("Adapters can not modify their sources") + + coords = property(_get_coords) + + +def asPoint(context): + """Adapt an object to the Point interface""" + return PointAdapter(context) + + +def geos_point_from_py(ob, update_geom=None, update_ndim=0): + """Create a GEOS geom from an object that is a Point, a coordinate sequence + or that provides the array interface. + + Returns the GEOS geometry and the number of its dimensions. + """ + if isinstance(ob, Point): + return geos_geom_from_py(ob) + + # Accept either (x, y) or [(x, y)] + if not hasattr(ob, '__getitem__'): # generators + ob = list(ob) + + if isinstance(ob[0], tuple): + coords = ob[0] + else: + coords = ob + n = len(coords) + dx = c_double(coords[0]) + dy = c_double(coords[1]) + dz = None + if n == 3: + dz = c_double(coords[2]) + + if update_geom: + cs = lgeos.GEOSGeom_getCoordSeq(update_geom) + if n != update_ndim: + raise ValueError( + "Wrong coordinate dimensions; this geometry has dimensions: " + "%d" % update_ndim) + else: + cs = lgeos.GEOSCoordSeq_create(1, n) + + # Because of a bug in the GEOS C API, always set X before Y + lgeos.GEOSCoordSeq_setX(cs, 0, dx) + lgeos.GEOSCoordSeq_setY(cs, 0, dy) + if n == 3: + lgeos.GEOSCoordSeq_setZ(cs, 0, dz) + + if update_geom: + return None + else: + return lgeos.GEOSGeom_createPoint(cs), n + + +def update_point_from_py(geom, ob): + geos_point_from_py(ob, geom._geom, geom._ndim) +
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_modules/shapely/geometry/polygon/index.html b/branch/bicounty/_modules/shapely/geometry/polygon/index.html new file mode 100644 index 0000000..a646e78 --- /dev/null +++ b/branch/bicounty/_modules/shapely/geometry/polygon/index.html @@ -0,0 +1,671 @@ + + + + + + shapely.geometry.polygon — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
+
+
+
+
+ +

Source code for shapely.geometry.polygon

+"""Polygons and their linear ring components
+"""
+
+import sys
+import warnings
+
+from ctypes import c_void_p, cast, POINTER
+import weakref
+
+from shapely.algorithms.cga import signed_area
+from shapely.coords import CoordinateSequence
+from shapely.geos import lgeos
+from shapely.geometry.base import BaseGeometry, geos_geom_from_py
+from shapely.geometry.linestring import LineString, LineStringAdapter
+from shapely.geometry.point import Point
+from shapely.geometry.proxy import PolygonProxy
+from shapely.errors import TopologicalError, ShapelyDeprecationWarning
+
+
+__all__ = ['Polygon', 'asPolygon', 'LinearRing', 'asLinearRing']
+
+
+class LinearRing(LineString):
+    """
+    A closed one-dimensional feature comprising one or more line segments
+
+    A LinearRing that crosses itself or touches itself at a single point is
+    invalid and operations on it may fail.
+    """
+
+    def __init__(self, coordinates=None):
+        """
+        Parameters
+        ----------
+        coordinates : sequence
+            A sequence of (x, y [,z]) numeric coordinate pairs or triples.
+            Also can be a sequence of Point objects.
+
+        Rings are implicitly closed. There is no need to specific a final
+        coordinate pair identical to the first.
+
+        Example
+        -------
+        Construct a square ring.
+
+          >>> ring = LinearRing( ((0, 0), (0, 1), (1 ,1 ), (1 , 0)) )
+          >>> ring.is_closed
+          True
+          >>> ring.length
+          4.0
+        """
+        BaseGeometry.__init__(self)
+        if coordinates is not None:
+            ret = geos_linearring_from_py(coordinates)
+            if ret is not None:
+                geom, n = ret
+                self._set_geom(geom)
+                self._ndim = n
+
+    @property
+    def __geo_interface__(self):
+        return {
+            'type': 'LinearRing',
+            'coordinates': tuple(self.coords)
+            }
+
+    # Coordinate access
+
+    def _get_coords(self):
+        """Access to geometry's coordinates (CoordinateSequence)"""
+        return CoordinateSequence(self)
+
+    def _set_coords(self, coordinates):
+        warnings.warn(
+            "Setting the 'coords' to mutate a Geometry in place is deprecated,"
+            " and will not be possible any more in Shapely 2.0",
+            ShapelyDeprecationWarning, stacklevel=2)
+        self._empty()
+        ret = geos_linearring_from_py(coordinates)
+        if ret is not None:
+            geom, n = ret
+            self._set_geom(geom)
+            self._ndim = n
+
+    coords = property(_get_coords, _set_coords)
+
+    def __setstate__(self, state):
+        """WKB doesn't differentiate between LineString and LinearRing so we
+        need to move the coordinate sequence into the correct geometry type"""
+        super().__setstate__(state)
+        cs = lgeos.GEOSGeom_getCoordSeq(self.__geom__)
+        cs_clone = lgeos.GEOSCoordSeq_clone(cs)
+        lgeos.GEOSGeom_destroy(self.__geom__)
+        self.__geom__ = lgeos.GEOSGeom_createLinearRing(cs_clone)
+
+    @property
+    def is_ccw(self):
+        """True is the ring is oriented counter clock-wise"""
+        return bool(self.impl['is_ccw'](self))
+
+    @property
+    def is_simple(self):
+        """True if the geometry is simple, meaning that any self-intersections
+        are only at boundary points, else False"""
+        return LineString(self).is_simple
+
+
+class LinearRingAdapter(LineStringAdapter):
+
+    __p__ = None
+
+    def __init__(self, context):
+        warnings.warn(
+            "The proxy geometries (through the 'asShape()', 'asLinearRing()' or "
+            "'LinearRingAdapter()' constructors) are deprecated and will be "
+            "removed in Shapely 2.0. Use the 'shape()' function or the "
+            "standard 'LinearRing()' constructor instead.",
+            ShapelyDeprecationWarning, stacklevel=3)
+        self.context = context
+        self.factory = geos_linearring_from_py
+
+    @property
+    def __geo_interface__(self):
+        return {
+            'type': 'LinearRing',
+            'coordinates': tuple(self.coords)
+            }
+
+    def _get_coords(self):
+        """Access to geometry's coordinates (CoordinateSequence)"""
+        return CoordinateSequence(self)
+
+    coords = property(_get_coords)
+
+
+def asLinearRing(context):
+    """Adapt an object to the LinearRing interface"""
+    return LinearRingAdapter(context)
+
+
+class InteriorRingSequence:
+
+    _factory = None
+    _geom = None
+    __p__ = None
+    _ndim = None
+    _index = 0
+    _length = 0
+    __rings__ = None
+    _gtag = None
+
+    def __init__(self, parent):
+        self.__p__ = parent
+        self._geom = parent._geom
+        self._ndim = parent._ndim
+
+    def __iter__(self):
+        self._index = 0
+        self._length = self.__len__()
+        return self
+
+    def __next__(self):
+        if self._index < self._length:
+            ring = self._get_ring(self._index)
+            self._index += 1
+            return ring
+        else:
+            raise StopIteration
+
+    def __len__(self):
+        return lgeos.GEOSGetNumInteriorRings(self._geom)
+
+    def __getitem__(self, key):
+        m = self.__len__()
+        if isinstance(key, int):
+            if key + m < 0 or key >= m:
+                raise IndexError("index out of range")
+            if key < 0:
+                i = m + key
+            else:
+                i = key
+            return self._get_ring(i)
+        elif isinstance(key, slice):
+            res = []
+            start, stop, stride = key.indices(m)
+            for i in range(start, stop, stride):
+                res.append(self._get_ring(i))
+            return res
+        else:
+            raise TypeError("key must be an index or slice")
+
+    @property
+    def _longest(self):
+        max = 0
+        for g in iter(self):
+            l = len(g.coords)
+            if l > max:
+                max = l
+
+    def gtag(self):
+        return hash(repr(self.__p__))
+
+    def _get_ring(self, i):
+        gtag = self.gtag()
+        if gtag != self._gtag:
+            self.__rings__ = {}
+        if i not in self.__rings__:
+            g = lgeos.GEOSGetInteriorRingN(self._geom, i)
+            ring = LinearRing()
+            ring._set_geom(g)
+            ring.__p__ = self
+            ring._other_owned = True
+            ring._ndim = self._ndim
+            self.__rings__[i] = weakref.ref(ring)
+        return self.__rings__[i]()
+
+
+
[docs]class Polygon(BaseGeometry): + """ + A two-dimensional figure bounded by a linear ring + + A polygon has a non-zero area. It may have one or more negative-space + "holes" which are also bounded by linear rings. If any rings cross each + other, the feature is invalid and operations on it may fail. + + Attributes + ---------- + exterior : LinearRing + The ring which bounds the positive space of the polygon. + interiors : sequence + A sequence of rings which bound all existing holes. + """ + + _exterior = None + _interiors = [] + _ndim = 2 + + def __init__(self, shell=None, holes=None): + """ + Parameters + ---------- + shell : sequence + A sequence of (x, y [,z]) numeric coordinate pairs or triples. + Also can be a sequence of Point objects. + holes : sequence + A sequence of objects which satisfy the same requirements as the + shell parameters above + + Example + ------- + Create a square polygon with no holes + + >>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.)) + >>> polygon = Polygon(coords) + >>> polygon.area + 1.0 + """ + BaseGeometry.__init__(self) + + if shell is not None: + ret = geos_polygon_from_py(shell, holes) + if ret is not None: + geom, n = ret + self._set_geom(geom) + self._ndim = n + else: + self._empty() + + @property + def exterior(self): + if self.is_empty: + return LinearRing() + elif self._exterior is None or self._exterior() is None: + g = lgeos.GEOSGetExteriorRing(self._geom) + ring = LinearRing() + ring._set_geom(g) + ring.__p__ = self + ring._other_owned = True + ring._ndim = self._ndim + self._exterior = weakref.ref(ring) + return self._exterior() + + @property + def interiors(self): + if self.is_empty: + return [] + return InteriorRingSequence(self) + + def __eq__(self, other): + if not isinstance(other, Polygon): + return False + check_empty = (self.is_empty, other.is_empty) + if all(check_empty): + return True + elif any(check_empty): + return False + my_coords = [ + tuple(self.exterior.coords), + [tuple(interior.coords) for interior in self.interiors] + ] + other_coords = [ + tuple(other.exterior.coords), + [tuple(interior.coords) for interior in other.interiors] + ] + return my_coords == other_coords + + def __ne__(self, other): + return not self.__eq__(other) + + __hash__ = None + + @property + def _ctypes(self): + if not self._ctypes_data: + self._ctypes_data = self.exterior._ctypes + return self._ctypes_data + + @property + def __array_interface__(self): + raise AttributeError( + "A polygon does not itself provide the array interface. Its rings do.") + + def _get_coords(self): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not") + + def _set_coords(self, ob): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not") + + @property + def coords(self): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not") + + @property + def __geo_interface__(self): + if self.exterior == LinearRing(): + coords = [] + else: + coords = [tuple(self.exterior.coords)] + for hole in self.interiors: + coords.append(tuple(hole.coords)) + return { + 'type': 'Polygon', + 'coordinates': tuple(coords)} + +
[docs] def svg(self, scale_factor=1., fill_color=None, opacity=None): + """Returns SVG path element for the Polygon geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG stroke-width. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return '<g />' + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + exterior_coords = [ + ["{},{}".format(*c) for c in self.exterior.coords]] + interior_coords = [ + ["{},{}".format(*c) for c in interior.coords] + for interior in self.interiors] + path = " ".join([ + "M {} L {} z".format(coords[0], " L ".join(coords[1:])) + for coords in exterior_coords + interior_coords]) + return ( + '<path fill-rule="evenodd" fill="{2}" stroke="#555555" ' + 'stroke-width="{0}" opacity="{3}" d="{1}" />' + ).format(2. * scale_factor, path, fill_color, opacity)
+ +
[docs] @classmethod + def from_bounds(cls, xmin, ymin, xmax, ymax): + """Construct a `Polygon()` from spatial bounds.""" + return cls([ + (xmin, ymin), + (xmin, ymax), + (xmax, ymax), + (xmax, ymin)])
+ + +class PolygonAdapter(PolygonProxy, Polygon): + + def __init__(self, shell, holes=None): + warnings.warn( + "The proxy geometries (through the 'asShape()', 'asPolygon()' or " + "'PolygonAdapter()' constructors) are deprecated and will be " + "removed in Shapely 2.0. Use the 'shape()' function or the " + "standard 'Polygon()' constructor instead.", + ShapelyDeprecationWarning, stacklevel=4) + self.shell = shell + self.holes = holes + self.context = (shell, holes) + self.factory = geos_polygon_from_py + + @property + def _ndim(self): + try: + # From array protocol + array = self.shell.__array_interface__ + n = array['shape'][1] + assert n == 2 or n == 3 + return n + except AttributeError: + # Fall back on list + return len(self.shell[0]) + + +def asPolygon(shell, holes=None): + """Adapt objects to the Polygon interface""" + return PolygonAdapter(shell, holes) + + +def orient(polygon, sign=1.0): + s = float(sign) + rings = [] + ring = polygon.exterior + if signed_area(ring)/s >= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + for ring in polygon.interiors: + if signed_area(ring)/s <= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + return Polygon(rings[0], rings[1:]) + + +def geos_linearring_from_py(ob, update_geom=None, update_ndim=0): + # If a LinearRing is passed in, clone it and return + # If a valid LineString is passed in, clone the coord seq and return a + # LinearRing. + # + # NB: access to coordinates using the array protocol has been moved + # entirely to the speedups module. + + if isinstance(ob, LineString): + if type(ob) == LinearRing: + return geos_geom_from_py(ob) + elif not ob.is_valid: + raise TopologicalError("An input LineString must be valid.") + elif ob.is_closed and len(ob.coords) >= 4: + return geos_geom_from_py(ob, lgeos.GEOSGeom_createLinearRing) + else: + ob = list(ob.coords) + + try: + m = len(ob) + except TypeError: # generators + ob = list(ob) + m = len(ob) + + if m == 0: + return None + + def _coords(o): + if isinstance(o, Point): + return o.coords[0] + else: + return o + + n = len(_coords(ob[0])) + if m < 3: + raise ValueError( + "A LinearRing must have at least 3 coordinate tuples") + assert (n == 2 or n == 3) + + # Add closing coordinates if not provided + if ( + m == 3 + or _coords(ob[0])[0] != _coords(ob[-1])[0] + or _coords(ob[0])[1] != _coords(ob[-1])[1] + ): + M = m + 1 + else: + M = m + + # Create a coordinate sequence + if update_geom is not None: + if n != update_ndim: + raise ValueError( + "Coordinate dimensions mismatch: target geom has {} dims, " + "update geom has {} dims".format(n, update_ndim)) + cs = lgeos.GEOSGeom_getCoordSeq(update_geom) + else: + cs = lgeos.GEOSCoordSeq_create(M, n) + + # add to coordinate sequence + for i in range(m): + coords = _coords(ob[i]) + # Because of a bug in the GEOS C API, + # always set X before Y + lgeos.GEOSCoordSeq_setX(cs, i, coords[0]) + lgeos.GEOSCoordSeq_setY(cs, i, coords[1]) + if n == 3: + try: + lgeos.GEOSCoordSeq_setZ(cs, i, coords[2]) + except IndexError: + raise ValueError("Inconsistent coordinate dimensionality") + + # Add closing coordinates to sequence? 
+ if M > m: + coords = _coords(ob[0]) + # Because of a bug in the GEOS C API, + # always set X before Y + lgeos.GEOSCoordSeq_setX(cs, M-1, coords[0]) + lgeos.GEOSCoordSeq_setY(cs, M-1, coords[1]) + if n == 3: + lgeos.GEOSCoordSeq_setZ(cs, M-1, coords[2]) + + if update_geom is not None: + return None + else: + return lgeos.GEOSGeom_createLinearRing(cs), n + + +def update_linearring_from_py(geom, ob): + geos_linearring_from_py(ob, geom._geom, geom._ndim) + + +def geos_polygon_from_py(shell, holes=None): + + if shell is None: + return None + + if isinstance(shell, Polygon): + return geos_geom_from_py(shell) + + if shell is not None: + ret = geos_linearring_from_py(shell) + if ret is None: + return None + + geos_shell, ndim = ret + if holes is not None and len(holes) > 0: + ob = holes + L = len(ob) + exemplar = ob[0] + try: + N = len(exemplar[0]) + except TypeError: + N = exemplar._ndim + if not L >= 1: + raise ValueError("number of holes must be non zero") + if N not in (2, 3): + raise ValueError("insufficiant coordinate dimension") + + # Array of pointers to ring geometries + geos_holes = (c_void_p * L)() + + # add to coordinate sequence + for l in range(L): + geom, ndim = geos_linearring_from_py(ob[l]) + geos_holes[l] = cast(geom, c_void_p) + else: + geos_holes = POINTER(c_void_p)() + L = 0 + + return ( + lgeos.GEOSGeom_createPolygon( + c_void_p(geos_shell), geos_holes, L), ndim) +
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_modules/shapely/ops/index.html b/branch/bicounty/_modules/shapely/ops/index.html new file mode 100644 index 0000000..6a69ec3 --- /dev/null +++ b/branch/bicounty/_modules/shapely/ops/index.html @@ -0,0 +1,866 @@ + + + + + + shapely.ops — lasso documentation + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.ops

+"""Support for various GEOS geometry operations
+"""
+
+from ctypes import byref, c_void_p, c_double
+from warnings import warn
+
+from shapely.errors import GeometryTypeError, ShapelyDeprecationWarning
+from shapely.prepared import prep
+from shapely.geos import lgeos
+from shapely.geometry.base import geom_factory, BaseGeometry, BaseMultipartGeometry
+from shapely.geometry import (
+    shape, Point, MultiPoint, LineString, MultiLineString, Polygon, GeometryCollection)
+from shapely.geometry.polygon import orient as orient_
+from shapely.algorithms.polylabel import polylabel
+
+
+__all__ = ['cascaded_union', 'linemerge', 'operator', 'polygonize',
+           'polygonize_full', 'transform', 'unary_union', 'triangulate',
+           'voronoi_diagram', 'split', 'nearest_points', 'validate', 'snap',
+           'shared_paths', 'clip_by_rect', 'orient', 'substring']
+
+
+class CollectionOperator:
+
+    def shapeup(self, ob):
+        if isinstance(ob, BaseGeometry):
+            return ob
+        else:
+            try:
+                return shape(ob)
+            except (ValueError, AttributeError):
+                return LineString(ob)
+
+    def polygonize(self, lines):
+        """Creates polygons from a source of lines
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.
+        """
+        source = getattr(lines, 'geoms', None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(l) for l in source]
+        geom_array_type = c_void_p * len(obs)
+        geom_array = geom_array_type()
+        for i, line in enumerate(obs):
+            geom_array[i] = line._geom
+        product = lgeos.GEOSPolygonize(byref(geom_array), len(obs))
+        collection = geom_factory(product)
+        for g in collection.geoms:
+            clone = lgeos.GEOSGeom_clone(g._geom)
+            g = geom_factory(clone)
+            g._other_owned = False
+            yield g
+
+    def polygonize_full(self, lines):
+        """Creates polygons from a source of lines, returning the polygons
+        and leftover geometries.
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.
+
+        Returns a tuple of objects: (polygons, dangles, cut edges, invalid ring
+        lines). Each are a geometry collection.
+
+        Dangles are edges which have one or both ends which are not incident on
+        another edge endpoint. Cut edges are connected at both ends but do not
+        form part of polygon. Invalid ring lines form rings which are invalid
+        (bowties, etc).
+        """
+        source = getattr(lines, 'geoms', None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(l) for l in source]
+        L = len(obs)
+        subs = (c_void_p * L)()
+        for i, g in enumerate(obs):
+            subs[i] = g._geom
+        collection = lgeos.GEOSGeom_createCollection(5, subs, L)
+        dangles = c_void_p()
+        cuts = c_void_p()
+        invalids = c_void_p()
+        product = lgeos.GEOSPolygonize_full(
+            collection, byref(dangles), byref(cuts), byref(invalids))
+        return (
+            geom_factory(product),
+            geom_factory(dangles),
+            geom_factory(cuts),
+            geom_factory(invalids)
+            )
+
+    def linemerge(self, lines):
+        """Merges all connected lines from a source
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.  Returns a
+        LineString or MultiLineString when lines are not contiguous.
+        """
+        source = None
+        if hasattr(lines, 'type') and lines.type == 'MultiLineString':
+            source = lines
+        elif hasattr(lines, 'geoms'):
+            # other Multi geometries
+            source = MultiLineString([ls.coords for ls in lines.geoms])
+        elif hasattr(lines, '__iter__'):
+            try:
+                source = MultiLineString([ls.coords for ls in lines])
+            except AttributeError:
+                source = MultiLineString(lines)
+        if source is None:
+            raise ValueError("Cannot linemerge %s" % lines)
+        result = lgeos.GEOSLineMerge(source._geom)
+        return geom_factory(result)
+
+    def cascaded_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        This function is deprecated, as it was superseded by
+        :meth:`unary_union`.
+        """
+        warn(
+            "The 'cascaded_union()' function is deprecated. "
+            "Use 'unary_union()' instead.",
+            ShapelyDeprecationWarning, stacklevel=2)
+        try:
+            if isinstance(geoms, BaseMultipartGeometry):
+                geoms = geoms.geoms
+            L = len(geoms)
+        except TypeError:
+            geoms = [geoms]
+            L = 1
+        subs = (c_void_p * L)()
+        for i, g in enumerate(geoms):
+            subs[i] = g._geom
+        collection = lgeos.GEOSGeom_createCollection(6, subs, L)
+        return geom_factory(lgeos.methods['cascaded_union'](collection))
+
+    def unary_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        This method replaces :meth:`cascaded_union` as the
+        preferred method for dissolving many polygons.
+        """
+        try:
+            if isinstance(geoms, BaseMultipartGeometry):
+                geoms = geoms.geoms
+            L = len(geoms)
+        except TypeError:
+            geoms = [geoms]
+            L = 1
+        subs = (c_void_p * L)()
+        for i, g in enumerate(geoms):
+            subs[i] = g._geom
+        collection = lgeos.GEOSGeom_createCollection(6, subs, L)
+        return geom_factory(lgeos.methods['unary_union'](collection))
+
+operator = CollectionOperator()
+polygonize = operator.polygonize
+polygonize_full = operator.polygonize_full
+linemerge = operator.linemerge
+cascaded_union = operator.cascaded_union
+unary_union = operator.unary_union
+
+
+def triangulate(geom, tolerance=0.0, edges=False):
+    """Creates the Delaunay triangulation and returns a list of geometries
+
+    The source may be any geometry type. All vertices of the geometry will be
+    used as the points of the triangulation.
+
+    From the GEOS documentation:
+    tolerance is the snapping tolerance used to improve the robustness of
+    the triangulation computation. A tolerance of 0.0 specifies that no
+    snapping will take place.
+
+    If edges is False, a list of Polygons (triangles) will be returned.
+    Otherwise the list of LineString edges is returned.
+
+    """
+    func = lgeos.methods['delaunay_triangulation']
+    gc = geom_factory(func(geom._geom, tolerance, int(edges)))
+    return [g for g in gc.geoms]
+
+
+def voronoi_diagram(geom, envelope=None, tolerance=0.0, edges=False):
+    """
+    Constructs a Voronoi Diagram [1] from the given geometry.
+    Returns a list of geometries.
+
+    Parameters
+    ----------
+    geom: geometry
+        the input geometry whose vertices will be used to calculate
+        the final diagram.
+    envelope: geometry, None
+        clipping envelope for the returned diagram, automatically
+        determined if None. The diagram will be clipped to the larger
+        of this envelope or an envelope surrounding the sites.
+    tolerance: float, 0.0
+        sets the snapping tolerance used to improve the robustness
+        of the computation. A tolerance of 0.0 specifies that no
+        snapping will take place.
+    edges: bool, False
+        If False, return regions as polygons. Else, return only
+        edges e.g. LineStrings.
+
+    GEOS documentation can be found at [2]
+
+    Returns
+    -------
+    GeometryCollection
+        geometries representing the Voronoi regions.
+
+    Notes
+    -----
+    The tolerance `argument` can be finicky and is known to cause the
+    algorithm to fail in several cases. If you're using `tolerance`
+    and getting a failure, try removing it. The test cases in
+    tests/test_voronoi_diagram.py show more details.
+
+
+    References
+    ----------
+    [1] https://en.wikipedia.org/wiki/Voronoi_diagram
+    [2] https://geos.osgeo.org/doxygen/geos__c_8h_source.html  (line 730)
+    """
+    func = lgeos.methods['voronoi_diagram']
+    envelope = envelope._geom if envelope else None
+    try:
+        result = geom_factory(func(geom._geom, envelope, tolerance, int(edges)))
+    except ValueError:
+        errstr = "Could not create Voronoi Diagram with the specified inputs."
+        if tolerance:
+            errstr += " Try running again with default tolerance value."
+        raise ValueError(errstr)
+
+    if result.type != 'GeometryCollection':
+        return GeometryCollection([result])
+    return result
+
+
+class ValidateOp:
+    def __call__(self, this):
+        return lgeos.GEOSisValidReason(this._geom)
+
+validate = ValidateOp()
+
+
+
[docs]def transform(func, geom): + """Applies `func` to all coordinates of `geom` and returns a new + geometry of the same type from the transformed coordinates. + + `func` maps x, y, and optionally z to output xp, yp, zp. The input + parameters may iterable types like lists or arrays or single values. + The output shall be of the same type. Scalars in, scalars out. + Lists in, lists out. + + For example, here is an identity function applicable to both types + of input. + + def id_func(x, y, z=None): + return tuple(filter(None, [x, y, z])) + + g2 = transform(id_func, g1) + + Using pyproj >= 2.1, this example will accurately project Shapely geometries: + + import pyproj + + wgs84 = pyproj.CRS('EPSG:4326') + utm = pyproj.CRS('EPSG:32618') + + project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform + + g2 = transform(project, g1) + + Note that the always_xy kwarg is required here as Shapely geometries only support + X,Y coordinate ordering. + + Lambda expressions such as the one in + + g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1) + + also satisfy the requirements for `func`. + """ + if geom.is_empty: + return geom + if geom.type in ('Point', 'LineString', 'LinearRing', 'Polygon'): + + # First we try to apply func to x, y, z sequences. When func is + # optimized for sequences, this is the fastest, though zipping + # the results up to go back into the geometry constructors adds + # extra cost. + try: + if geom.type in ('Point', 'LineString', 'LinearRing'): + return type(geom)(zip(*func(*zip(*geom.coords)))) + elif geom.type == 'Polygon': + shell = type(geom.exterior)( + zip(*func(*zip(*geom.exterior.coords)))) + holes = list(type(ring)(zip(*func(*zip(*ring.coords)))) + for ring in geom.interiors) + return type(geom)(shell, holes) + + # A func that assumes x, y, z are single values will likely raise a + # TypeError, in which case we'll try again. + except TypeError: + if geom.type in ('Point', 'LineString', 'LinearRing'): + return type(geom)([func(*c) for c in geom.coords]) + elif geom.type == 'Polygon': + shell = type(geom.exterior)( + [func(*c) for c in geom.exterior.coords]) + holes = list(type(ring)([func(*c) for c in ring.coords]) + for ring in geom.interiors) + return type(geom)(shell, holes) + + elif geom.type.startswith('Multi') or geom.type == 'GeometryCollection': + return type(geom)([transform(func, part) for part in geom.geoms]) + else: + raise GeometryTypeError('Type %r not recognized' % geom.type)
+ + +def nearest_points(g1, g2): + """Returns the calculated nearest points in the input geometries + + The points are returned in the same order as the input geometries. + """ + seq = lgeos.methods['nearest_points'](g1._geom, g2._geom) + if seq is None: + if g1.is_empty: + raise ValueError('The first input geometry is empty') + else: + raise ValueError('The second input geometry is empty') + + try: + x1 = c_double() + y1 = c_double() + x2 = c_double() + y2 = c_double() + lgeos.GEOSCoordSeq_getX(seq, 0, byref(x1)) + lgeos.GEOSCoordSeq_getY(seq, 0, byref(y1)) + lgeos.GEOSCoordSeq_getX(seq, 1, byref(x2)) + lgeos.GEOSCoordSeq_getY(seq, 1, byref(y2)) + finally: + lgeos._lgeos.GEOSCoordSeq_destroy(seq) + + p1 = Point(x1.value, y1.value) + p2 = Point(x2.value, y2.value) + return (p1, p2) + +def snap(g1, g2, tolerance): + """Snap one geometry to another with a given tolerance + + Vertices of the first geometry are snapped to vertices of the second + geometry. The resulting snapped geometry is returned. The input geometries + are not modified. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + tolerance : float + The snapping tolerance + + Example + ------- + >>> square = Polygon([(1,1), (2, 1), (2, 2), (1, 2), (1, 1)]) + >>> line = LineString([(0,0), (0.8, 0.8), (1.8, 0.95), (2.6, 0.5)]) + >>> result = snap(line, square, 0.5) + >>> result.wkt + 'LINESTRING (0 0, 1 1, 2 1, 2.6 0.5)' + """ + return(geom_factory(lgeos.methods['snap'](g1._geom, g2._geom, tolerance))) + +def shared_paths(g1, g2): + """Find paths shared between the two given lineal geometries + + Returns a GeometryCollection with two elements: + - First element is a MultiLineString containing shared paths with the + same direction for both inputs. + - Second element is a MultiLineString containing shared paths with the + opposite direction for the two inputs. 
+ + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + """ + if not isinstance(g1, LineString): + raise GeometryTypeError("First geometry must be a LineString") + if not isinstance(g2, LineString): + raise GeometryTypeError("Second geometry must be a LineString") + return(geom_factory(lgeos.methods['shared_paths'](g1._geom, g2._geom))) + + +class SplitOp: + + @staticmethod + def _split_polygon_with_line(poly, splitter): + """Split a Polygon with a LineString""" + if not isinstance(poly, Polygon): + raise GeometryTypeError("First argument must be a Polygon") + if not isinstance(splitter, LineString): + raise GeometryTypeError("Second argument must be a LineString") + + union = poly.boundary.union(splitter) + + # greatly improves split performance for big geometries with many + # holes (the following contains checks) with minimal overhead + # for common cases + poly = prep(poly) + + # some polygonized geometries may be holes, we do not want them + # that's why we test if the original polygon (poly) contains + # an inner point of polygonized geometry (pg) + return [pg for pg in polygonize(union) if poly.contains(pg.representative_point())] + + @staticmethod + def _split_line_with_line(line, splitter): + """Split a LineString with another (Multi)LineString or (Multi)Polygon""" + + # if splitter is a polygon, pick it's boundary + if splitter.type in ('Polygon', 'MultiPolygon'): + splitter = splitter.boundary + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, LineString) and not isinstance(splitter, MultiLineString): + raise GeometryTypeError("Second argument must be either a LineString or a MultiLineString") + + # | s\l | Interior | Boundary | Exterior | + # |----------|----------|----------|----------| + # | Interior | 0 or F | * | * | At least one of these two must be 0 + # | Boundary | 0 or F | * | * | So either '0********' or '[0F]**0*****' + # | Exterior | * | * | * | No overlapping interiors ('1********') + relation = splitter.relate(line) + if relation[0] == '1': + # The lines overlap at some segment (linear intersection of interiors) + raise ValueError('Input geometry segment overlaps with the splitter.') + elif relation[0] == '0' or relation[3] == '0': + # The splitter crosses or touches the line's interior --> return multilinestring from the split + return line.difference(splitter) + else: + # The splitter does not cross or touch the line's interior --> return collection with identity line + return [line] + + @staticmethod + def _split_line_with_point(line, splitter): + """Split a LineString with a Point""" + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, Point): + raise GeometryTypeError("Second argument must be a Point") + + # check if point is in the interior of the line + if not line.relate_pattern(splitter, '0********'): + # point not on line interior --> return collection with single identity line + # (REASONING: Returning a list with the input line reference and creating a + # GeometryCollection at the general split function prevents unnecessary copying + # of linestrings in multipoint splitting function) + return [line] + elif line.coords[0] == splitter.coords[0]: + # if line is a closed ring the previous test doesn't behave as desired + return [line] + + # point is on line, get the distance from the first point on line + distance_on_line = 
line.project(splitter) + coords = list(line.coords) + # split the line at the point and create two new lines + current_position = 0.0 + for i in range(len(coords)-1): + point1 = coords[i] + point2 = coords[i+1] + dx = point1[0] - point2[0] + dy = point1[1] - point2[1] + segment_length = (dx ** 2 + dy ** 2) ** 0.5 + current_position += segment_length + if distance_on_line == current_position: + # splitter is exactly on a vertex + return [ + LineString(coords[:i+2]), + LineString(coords[i+1:]) + ] + elif distance_on_line < current_position: + # splitter is between two vertices + return [ + LineString(coords[:i+1] + [splitter.coords[0]]), + LineString([splitter.coords[0]] + coords[i+1:]) + ] + return [line] + + + @staticmethod + def _split_line_with_multipoint(line, splitter): + """Split a LineString with a MultiPoint""" + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, MultiPoint): + raise GeometryTypeError("Second argument must be a MultiPoint") + + chunks = [line] + for pt in splitter.geoms: + new_chunks = [] + for chunk in filter(lambda x: not x.is_empty, chunks): + # add the newly split 2 lines or the same line if not split + new_chunks.extend(SplitOp._split_line_with_point(chunk, pt)) + chunks = new_chunks + + return chunks + + @staticmethod + def split(geom, splitter): + """ + Splits a geometry by another geometry and returns a collection of geometries. This function is the theoretical + opposite of the union of the split geometry parts. If the splitter does not split the geometry, a collection + with a single geometry equal to the input geometry is returned. + The function supports: + - Splitting a (Multi)LineString by a (Multi)Point or (Multi)LineString or (Multi)Polygon + - Splitting a (Multi)Polygon by a LineString + + It may be convenient to snap the splitter with low tolerance to the geometry. For example in the case + of splitting a line by a point, the point must be exactly on the line, for the line to be correctly split. + When splitting a line by a polygon, the boundary of the polygon is used for the operation. + When splitting a line by another line, a ValueError is raised if the two overlap at some segment. 
+ + Parameters + ---------- + geom : geometry + The geometry to be split + splitter : geometry + The geometry that will split the input geom + + Example + ------- + >>> pt = Point((1, 1)) + >>> line = LineString([(0,0), (2,2)]) + >>> result = split(line, pt) + >>> result.wkt + 'GEOMETRYCOLLECTION (LINESTRING (0 0, 1 1), LINESTRING (1 1, 2 2))' + """ + + if geom.type in ('MultiLineString', 'MultiPolygon'): + return GeometryCollection([i for part in geom.geoms for i in SplitOp.split(part, splitter).geoms]) + + elif geom.type == 'LineString': + if splitter.type in ('LineString', 'MultiLineString', 'Polygon', 'MultiPolygon'): + split_func = SplitOp._split_line_with_line + elif splitter.type in ('Point'): + split_func = SplitOp._split_line_with_point + elif splitter.type in ('MultiPoint'): + split_func = SplitOp._split_line_with_multipoint + else: + raise GeometryTypeError("Splitting a LineString with a %s is not supported" % splitter.type) + + elif geom.type == 'Polygon': + if splitter.type == 'LineString': + split_func = SplitOp._split_polygon_with_line + else: + raise GeometryTypeError("Splitting a Polygon with a %s is not supported" % splitter.type) + + else: + raise GeometryTypeError("Splitting %s geometry is not supported" % geom.type) + + return GeometryCollection(split_func(geom, splitter)) + +split = SplitOp.split + + +def substring(geom, start_dist, end_dist, normalized=False): + """Return a line segment between specified distances along a LineString + + Negative distance values are taken as measured in the reverse + direction from the end of the geometry. Out-of-range index + values are handled by clamping them to the valid range of values. + + If the start distance equals the end distance, a Point is returned. + + If the start distance is actually beyond the end distance, then the + reversed substring is returned such that the start distance is + at the first coordinate. + + Parameters + ---------- + geom : LineString + The geometry to get a substring of. + start_dist : float + The distance along `geom` of the start of the substring. + end_dist : float + The distance along `geom` of the end of the substring. + normalized : bool, False + Whether the distance parameters are interpreted as a + fraction of the geometry's length. + + Returns + ------- + Union[Point, LineString] + The substring between `start_dist` and `end_dist` or a Point + if they are at the same location. + + Raises + ------ + TypeError + If `geom` is not a LineString. + + Examples + -------- + >>> from shapely.geometry import LineString + >>> from shapely.ops import substring + >>> ls = LineString((i, 0) for i in range(6)) + >>> ls.wkt + 'LINESTRING (0 0, 1 0, 2 0, 3 0, 4 0, 5 0)' + >>> substring(ls, start_dist=1, end_dist=3).wkt + 'LINESTRING (1 0, 2 0, 3 0)' + >>> substring(ls, start_dist=3, end_dist=1).wkt + 'LINESTRING (3 0, 2 0, 1 0)' + >>> substring(ls, start_dist=1, end_dist=-3).wkt + 'LINESTRING (1 0, 2 0)' + >>> substring(ls, start_dist=0.2, end_dist=-0.6, normalized=True).wkt + 'LINESTRING (1 0, 2 0)' + + Returning a `Point` when `start_dist` and `end_dist` are at the + same location. + + >>> substring(ls, 2.5, -2.5).wkt + 'POINT (2.5 0)' + """ + + if not isinstance(geom, LineString): + raise GeometryTypeError("Can only calculate a substring of LineString geometries. A %s was provided." 
% geom.type) + + # Filter out cases in which to return a point + if start_dist == end_dist: + return geom.interpolate(start_dist, normalized) + elif not normalized and start_dist >= geom.length and end_dist >= geom.length: + return geom.interpolate(geom.length, normalized) + elif not normalized and -start_dist >= geom.length and -end_dist >= geom.length: + return geom.interpolate(0, normalized) + elif normalized and start_dist >= 1 and end_dist >= 1: + return geom.interpolate(1, normalized) + elif normalized and -start_dist >= 1 and -end_dist >= 1: + return geom.interpolate(0, normalized) + + if normalized: + start_dist *= geom.length + end_dist *= geom.length + + # Filter out cases where distances meet at a middle point from opposite ends. + if start_dist < 0 < end_dist and abs(start_dist) + end_dist == geom.length: + return geom.interpolate(end_dist) + elif end_dist < 0 < start_dist and abs(end_dist) + start_dist == geom.length: + return geom.interpolate(start_dist) + + start_point = geom.interpolate(start_dist) + end_point = geom.interpolate(end_dist) + + if start_dist < 0: + start_dist = geom.length + start_dist # Values may still be negative, + if end_dist < 0: # but only in the out-of-range + end_dist = geom.length + end_dist # sense, not the wrap-around sense. + + reverse = start_dist > end_dist + if reverse: + start_dist, end_dist = end_dist, start_dist + + if start_dist < 0: + start_dist = 0 # to avoid duplicating the first vertex + + if reverse: + vertex_list = [(end_point.x, end_point.y)] + else: + vertex_list = [(start_point.x, start_point.y)] + + coords = list(geom.coords) + current_distance = 0 + for p1, p2 in zip(coords, coords[1:]): + if start_dist < current_distance < end_dist: + vertex_list.append(p1) + elif current_distance >= end_dist: + break + + current_distance += ((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2) ** 0.5 + + if reverse: + vertex_list.append((start_point.x, start_point.y)) + # reverse direction result + vertex_list = reversed(vertex_list) + else: + vertex_list.append((end_point.x, end_point.y)) + + return LineString(vertex_list) + + +def clip_by_rect(geom, xmin, ymin, xmax, ymax): + """Returns the portion of a geometry within a rectangle + + The geometry is clipped in a fast but possibly dirty way. The output is + not guaranteed to be valid. No exceptions will be raised for topological + errors. + + Parameters + ---------- + geom : geometry + The geometry to be clipped + xmin : float + Minimum x value of the rectangle + ymin : float + Minimum y value of the rectangle + xmax : float + Maximum x value of the rectangle + ymax : float + Maximum y value of the rectangle + + Notes + ----- + Requires GEOS >= 3.5.0 + New in 1.7. + """ + if geom.is_empty: + return geom + result = geom_factory(lgeos.methods['clip_by_rect'](geom._geom, xmin, ymin, xmax, ymax)) + return result + + +def orient(geom, sign=1.0): + """A properly oriented copy of the given geometry. + + The signed area of the result will have the given sign. A sign of + 1.0 means that the coordinates of the product's exterior rings will + be oriented counter-clockwise. + + Parameters + ---------- + geom : Geometry + The original geometry. May be a Polygon, MultiPolygon, or + GeometryCollection. + sign : float, optional. + The sign of the result's signed area. 
+ + Returns + ------- + Geometry + + """ + if isinstance(geom, BaseMultipartGeometry): + return geom.__class__( + list( + map( + lambda geom: orient(geom, sign), + geom.geoms, + ) + ) + ) + if isinstance(geom, (Polygon,)): + return orient_(geom, sign) + return geom +
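
An editor-added example of the orient() helper at the end of the listing above (not part of the shapely source):

from shapely.geometry import Polygon
from shapely.ops import orient

cw_square = Polygon([(0, 0), (0, 1), (1, 1), (1, 0)])  # clockwise exterior ring
ccw_square = orient(cw_square, sign=1.0)
print(ccw_square.exterior.is_ccw)  # True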
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/_sources/_generated/lasso.CubeTransit.rst.txt b/branch/bicounty/_sources/_generated/lasso.CubeTransit.rst.txt new file mode 100644 index 0000000..e24b49e --- /dev/null +++ b/branch/bicounty/_sources/_generated/lasso.CubeTransit.rst.txt @@ -0,0 +1,36 @@ +lasso.CubeTransit +================= + +.. currentmodule:: lasso + +.. autoclass:: CubeTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~CubeTransit.__init__ + ~CubeTransit.add_additional_time_periods + ~CubeTransit.add_cube + ~CubeTransit.build_route_name + ~CubeTransit.calculate_start_end_times + ~CubeTransit.create_add_route_card_dict + ~CubeTransit.create_delete_route_card_dict + ~CubeTransit.create_from_cube + ~CubeTransit.create_update_route_card_dict + ~CubeTransit.cube_properties_to_standard_properties + ~CubeTransit.evaluate_differences + ~CubeTransit.evaluate_route_property_differences + ~CubeTransit.evaluate_route_shape_changes + ~CubeTransit.get_time_period_numbers_from_cube_properties + ~CubeTransit.unpack_route_name + + + + + + \ No newline at end of file diff --git a/branch/bicounty/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt b/branch/bicounty/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt new file mode 100644 index 0000000..29190d8 --- /dev/null +++ b/branch/bicounty/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt @@ -0,0 +1,90 @@ +lasso.ModelRoadwayNetwork +========================= + +.. currentmodule:: lasso + +.. autoclass:: ModelRoadwayNetwork + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~ModelRoadwayNetwork.__init__ + ~ModelRoadwayNetwork.add_counts + ~ModelRoadwayNetwork.add_incident_link_data_to_nodes + ~ModelRoadwayNetwork.add_new_roadway_feature_change + ~ModelRoadwayNetwork.add_variable_using_shst_reference + ~ModelRoadwayNetwork.addition_map + ~ModelRoadwayNetwork.apply + ~ModelRoadwayNetwork.apply_managed_lane_feature_change + ~ModelRoadwayNetwork.apply_python_calculation + ~ModelRoadwayNetwork.apply_roadway_feature_change + ~ModelRoadwayNetwork.assess_connectivity + ~ModelRoadwayNetwork.build_selection_key + ~ModelRoadwayNetwork.calculate_area_type + ~ModelRoadwayNetwork.calculate_centroidconnect + ~ModelRoadwayNetwork.calculate_county + ~ModelRoadwayNetwork.calculate_distance + ~ModelRoadwayNetwork.calculate_mpo + ~ModelRoadwayNetwork.calculate_use + ~ModelRoadwayNetwork.convert_int + ~ModelRoadwayNetwork.create_ML_variable + ~ModelRoadwayNetwork.create_calculated_variables + ~ModelRoadwayNetwork.create_dummy_connector_links + ~ModelRoadwayNetwork.create_hov_corridor_variable + ~ModelRoadwayNetwork.create_managed_lane_network + ~ModelRoadwayNetwork.create_managed_variable + ~ModelRoadwayNetwork.dataframe_to_fixed_width + ~ModelRoadwayNetwork.delete_roadway_feature_change + ~ModelRoadwayNetwork.deletion_map + ~ModelRoadwayNetwork.fill_na + ~ModelRoadwayNetwork.from_RoadwayNetwork + ~ModelRoadwayNetwork.get_attribute + ~ModelRoadwayNetwork.get_managed_lane_node_ids + ~ModelRoadwayNetwork.get_modal_graph + ~ModelRoadwayNetwork.get_modal_links_nodes + ~ModelRoadwayNetwork.get_property_by_time_period_and_group + ~ModelRoadwayNetwork.identify_segment + ~ModelRoadwayNetwork.identify_segment_endpoints + ~ModelRoadwayNetwork.is_network_connected + ~ModelRoadwayNetwork.load_transform_network + ~ModelRoadwayNetwork.network_connection_plot + ~ModelRoadwayNetwork.orig_dest_nodes_foreign_key + ~ModelRoadwayNetwork.ox_graph + ~ModelRoadwayNetwork.path_search + 
~ModelRoadwayNetwork.read + ~ModelRoadwayNetwork.read_match_result + ~ModelRoadwayNetwork.rename_variables_for_dbf + ~ModelRoadwayNetwork.roadway_net_to_gdf + ~ModelRoadwayNetwork.roadway_standard_to_met_council_network + ~ModelRoadwayNetwork.select_roadway_features + ~ModelRoadwayNetwork.selection_has_unique_link_id + ~ModelRoadwayNetwork.selection_map + ~ModelRoadwayNetwork.shortest_path + ~ModelRoadwayNetwork.split_properties_by_time_period_and_category + ~ModelRoadwayNetwork.update_distance + ~ModelRoadwayNetwork.validate_link_schema + ~ModelRoadwayNetwork.validate_node_schema + ~ModelRoadwayNetwork.validate_properties + ~ModelRoadwayNetwork.validate_selection + ~ModelRoadwayNetwork.validate_shape_schema + ~ModelRoadwayNetwork.validate_uniqueness + ~ModelRoadwayNetwork.write + ~ModelRoadwayNetwork.write_roadway_as_fixedwidth + ~ModelRoadwayNetwork.write_roadway_as_shp + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~ModelRoadwayNetwork.CALCULATED_VALUES + + \ No newline at end of file diff --git a/branch/bicounty/_sources/_generated/lasso.Parameters.rst.txt b/branch/bicounty/_sources/_generated/lasso.Parameters.rst.txt new file mode 100644 index 0000000..28d2c86 --- /dev/null +++ b/branch/bicounty/_sources/_generated/lasso.Parameters.rst.txt @@ -0,0 +1,31 @@ +lasso.Parameters +================ + +.. currentmodule:: lasso + +.. autoclass:: Parameters + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Parameters.__init__ + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Parameters.cube_time_periods + ~Parameters.properties_to_split + ~Parameters.county_link_range_dict + ~Parameters.zones + + \ No newline at end of file diff --git a/branch/bicounty/_sources/_generated/lasso.Project.rst.txt b/branch/bicounty/_sources/_generated/lasso.Project.rst.txt new file mode 100644 index 0000000..863945b --- /dev/null +++ b/branch/bicounty/_sources/_generated/lasso.Project.rst.txt @@ -0,0 +1,42 @@ +lasso.Project +============= + +.. currentmodule:: lasso + +.. autoclass:: Project + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Project.__init__ + ~Project.add_highway_changes + ~Project.add_transit_changes + ~Project.create_project + ~Project.determine_roadway_network_changes_compatibility + ~Project.emme_id_to_wrangler_id + ~Project.emme_name_to_wrangler_name + ~Project.evaluate_changes + ~Project.get_object_from_network_build_command + ~Project.get_operation_from_network_build_command + ~Project.read_logfile + ~Project.read_network_build_file + ~Project.write_project_card + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Project.CALCULATED_VALUES + ~Project.DEFAULT_PROJECT_NAME + ~Project.STATIC_VALUES + + \ No newline at end of file diff --git a/branch/bicounty/_sources/_generated/lasso.StandardTransit.rst.txt b/branch/bicounty/_sources/_generated/lasso.StandardTransit.rst.txt new file mode 100644 index 0000000..1175b4b --- /dev/null +++ b/branch/bicounty/_sources/_generated/lasso.StandardTransit.rst.txt @@ -0,0 +1,32 @@ +lasso.StandardTransit +===================== + +.. currentmodule:: lasso + +.. autoclass:: StandardTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. 
autosummary:: + + ~StandardTransit.__init__ + ~StandardTransit.calculate_cube_mode + ~StandardTransit.cube_format + ~StandardTransit.evaluate_differences + ~StandardTransit.fromTransitNetwork + ~StandardTransit.read_gtfs + ~StandardTransit.route_properties_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_emme + ~StandardTransit.time_to_cube_time_period + ~StandardTransit.write_as_cube_lin + + + + + + \ No newline at end of file diff --git a/branch/bicounty/_sources/_generated/lasso.logger.rst.txt b/branch/bicounty/_sources/_generated/lasso.logger.rst.txt new file mode 100644 index 0000000..2054273 --- /dev/null +++ b/branch/bicounty/_sources/_generated/lasso.logger.rst.txt @@ -0,0 +1,29 @@ +lasso.logger +============ + +.. automodule:: lasso.logger + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + setupLogging + + + + + + + + + + + + + diff --git a/branch/bicounty/_sources/_generated/lasso.util.rst.txt b/branch/bicounty/_sources/_generated/lasso.util.rst.txt new file mode 100644 index 0000000..95fecf8 --- /dev/null +++ b/branch/bicounty/_sources/_generated/lasso.util.rst.txt @@ -0,0 +1,35 @@ +lasso.util +========== + +.. automodule:: lasso.util + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + column_name_to_parts + create_locationreference + geodesic_point_buffer + get_shared_streets_intersection_hash + hhmmss_to_datetime + secs_to_datetime + shorten_name + + + + + + + + + + + + + diff --git a/branch/bicounty/_sources/autodoc.rst.txt b/branch/bicounty/_sources/autodoc.rst.txt new file mode 100644 index 0000000..7e48d58 --- /dev/null +++ b/branch/bicounty/_sources/autodoc.rst.txt @@ -0,0 +1,29 @@ +Lasso Classes and Functions +==================================== + +.. automodule:: lasso + :no-members: + :no-undoc-members: + :no-inherited-members: + :no-show-inheritance: + + +Base Classes +-------------- +.. autosummary:: + :toctree: _generated + :nosignatures: + + CubeTransit + StandardTransit + ModelRoadwayNetwork + Project + Parameters + +Utils and Functions +-------------------- +.. autosummary:: + :toctree: _generated + + util + logger diff --git a/branch/bicounty/_sources/index.rst.txt b/branch/bicounty/_sources/index.rst.txt new file mode 100644 index 0000000..1255d4e --- /dev/null +++ b/branch/bicounty/_sources/index.rst.txt @@ -0,0 +1,35 @@ +.. lasso documentation master file, created by + sphinx-quickstart on Thu Dec 5 15:43:28 2019. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. + +Welcome to lasso's documentation! +================================= + +This package of utilities is a wrapper around the +[network_wrangler](http://github.com/wsp-sag/network_wrangler) package +for MetCouncil. It aims to have the following functionality: +1. parse Cube log files and base highway networks and create ProjectCards + for Network Wrangler +2. parse two Cube transit line files and create ProjectCards for NetworkWrangler +3. refine Network Wrangler highway networks to contain specific variables and + settings for Metropolitan Council and export them to a format that can + be read in by Citilab's Cube software. + +.. 
toctree:: + :maxdepth: 3 + :caption: Contents: + + starting + setup + running + autodoc + + + +Indices and tables +================== + +* :ref:`genindex` +* :ref:`modindex` +* :ref:`search` diff --git a/branch/bicounty/_sources/running.md.txt b/branch/bicounty/_sources/running.md.txt new file mode 100644 index 0000000..e139dc8 --- /dev/null +++ b/branch/bicounty/_sources/running.md.txt @@ -0,0 +1,12 @@ +# Running Lasso + +## Create project files + + +## Create a scenario + + +## Exporting networks + + +## Auditing and Reporting diff --git a/branch/bicounty/_sources/setup.md.txt b/branch/bicounty/_sources/setup.md.txt new file mode 100644 index 0000000..e77d463 --- /dev/null +++ b/branch/bicounty/_sources/setup.md.txt @@ -0,0 +1,9 @@ +# Setup + +### Projects + +### Parameters + +### Settings + +### Additional Data Files diff --git a/branch/bicounty/_sources/starting.md.txt b/branch/bicounty/_sources/starting.md.txt new file mode 100644 index 0000000..8886f95 --- /dev/null +++ b/branch/bicounty/_sources/starting.md.txt @@ -0,0 +1,292 @@ +# Starting Out + +## Installation + +If you are managing multiple Python versions, we suggest using [`virtualenv`](https://virtualenv.pypa.io/en/latest/) or [`conda`](https://conda.io/en/latest/) virtual environments. + +The example below uses a conda environment (recommended) and the package manager [pip](https://pip.pypa.io/en/stable/) to install Lasso from the source on GitHub. + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n <my_lasso_environment> +conda activate <my_lasso_environment> +pip install git+https://github.com/wsp-sag/Lasso@master +``` + +Lasso will install `network_wrangler` from the [PyPI](https://pypi.org/project/network-wrangler/) repository because it is included in Lasso's `requirements.txt`. + +#### Bleeding Edge +If you want to install a more up-to-date or development version of network wrangler and lasso, you can do so by installing them from the `develop` branch of each repository: + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n <my_lasso_environment> +conda activate <my_lasso_environment> +pip install git+https://github.com/wsp-sag/network_wrangler@develop +pip install git+https://github.com/wsp-sag/Lasso@develop +``` + +#### From Clone +If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The `-e` flag will install it in [editable mode](https://pip.pypa.io/en/stable/reference/pip_install/?highlight=editable#editable-installs). + +**If you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!** + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment> +conda activate <my_lasso_environment> +git clone https://github.com/wsp-sag/Lasso +git clone https://github.com/wsp-sag/network_wrangler +cd network_wrangler +pip install -e . +cd .. +cd Lasso +pip install -e . +``` + +Notes: + +1. The `-e` flag installs the package in editable mode. +2. If you are not part of the project team and want to contribute code back to the project, please fork before you clone and then add the original repository to your upstream origin list per [these directions on github](https://help.github.com/en/articles/fork-a-repo). +3. If you want to install from a specific branch, tag, or version number, replace `@master` with `@<branch_name>` or `@<tag>`. +4. 
If you want to make use of frequent developer updates for network wrangler as well, you can also install it from a clone by following the same cloning and installation instructions for network wrangler as for Lasso above. + +If you are going to be doing Lasso development, we also recommend: + - a good IDE such as [Atom](http://atom.io), VS Code, Sublime Text, etc. + with Python syntax highlighting turned on. + - [GitHub Desktop](https://desktop.github.com/) to locally update your clones. + +## Brief Intro + +Lasso is a 'wrapper' around the [Network Wrangler](http://wsp-sag.github.io/network_wrangler) utility. + +Both Lasso and Network Wrangler are built around the following data schemas: + - [`roadway network`], which is based on a mashup of OpenStreetMap and [Shared Streets](http://sharedstreets.io). In Network Wrangler these are read in from three JSON files representing links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category. + - [`transit network`], which is based on a frequency-based implementation of the csv-based GTFS; and + - [`project card`], which is novel to Network Wrangler and stores information about network changes as a result of projects in YAML. + +In addition, Lasso utilizes the following data schemas: + + - [`MetCouncil Model Roadway Network Schema`], which adds data fields to the `roadway network` schema that MetCouncil uses in their travel model, including breaking out data fields by time period. + - [`MetCouncil Model Transit Network Schema`], which uses the Cube PublicTransport format, and + - [`Cube Log Files`], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by Network Wrangler. + - [`Cube public transport line files`], which define a set of transit lines in the Cube software. + +### Components +Network Wrangler has the following atomic parts: + + - _RoadwayNetwork_ object, which represents the `roadway network` data as GeoDataFrames; + - _TransitNetwork_ object, which represents the `transit network` data as DataFrames; + - _ProjectCard_ object, which represents the data of the `project card`. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run; + - _Scenario_ object, which consists of at least a RoadwayNetwork and a +TransitNetwork. Scenarios can be based on or tiered from other scenarios. +Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network. + +In addition, Lasso has the following atomic parts: + + - _Project_ object, which creates project cards from one of the following: a base and a build transit network in Cube format, a base and a build highway network, or a base highway network and a Cube log file. + - _ModelRoadwayNetwork_ object, a subclass of `RoadwayNetwork` that contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube. + - _StandardTransit_, an object for holding a standard transit feed as a Partridge object, containing + methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files. + - _CubeTransit_, an object for storing information about transit defined in `Cube public transport line files`.
It has the capability to parse cube line file properties and shapes into python dictionaries, compare line files, and represent changes as Project Card dictionaries. + - _Parameters_, a class representing all the parameters defining the networks, + including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use the default values listed in this class. + +#### RoadwayNetwork + +Reads, writes, queries, and manipulates roadway network data, which +is mainly stored in the GeoDataFrames `links_df`, `nodes_df`, and `shapes_df`. + +```python +net = RoadwayNetwork.read( + link_filename=MY_LINK_FILE, + node_filename=MY_NODE_FILE, + shape_filename=MY_SHAPE_FILE, + shape_foreign_key='shape_id', +) +my_selection = { + "link": [{"name": ["I 35E"]}], + "A": {"osm_node_id": "961117623"}, # start searching for segments at A + "B": {"osm_node_id": "2564047368"}, +} +net.select_roadway_features(my_selection) + +my_change = [ + { + 'property': 'lanes', + 'existing': 1, + 'set': 2, + }, + { + 'property': 'drive_access', + 'set': 0, + }, +] + +net.apply_roadway_feature_change( + net.select_roadway_features(my_selection), + my_change +) + +ml_net = net.create_managed_lane_network(in_place=False) + +ml_net.is_network_connected(mode="drive") + +_, disconnected_nodes = ml_net.assess_connectivity( + mode="walk", + ignore_end_nodes=True +) +ml_net.write(filename=my_out_prefix, path=my_dir) +``` +#### TransitNetwork + +#### ProjectCard + +#### Scenario + +Manages sets of project cards and tiering from a base scenario/set of networks. + +```python + +my_base_scenario = { + "road_net": RoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key='shape_id', + ), + "transit_net": TransitNetwork.read(STPAUL_DIR), +} + +card_filenames = [ + "3_multiple_roadway_attribute_change.yml", + "multiple_changes.yml", + "4_simple_managed_lane.yml", +] + +project_card_directory = os.path.join(STPAUL_DIR, "project_cards") + +project_cards_list = [ + ProjectCard.read(os.path.join(project_card_directory, filename), validate=False) + for filename in card_filenames +] + +my_scenario = Scenario.create_scenario( + base_scenario=my_base_scenario, + project_cards_list=project_cards_list, +) +my_scenario.check_scenario_requisites() + +my_scenario.apply_all_projects() + +my_scenario.scenario_summary() +``` + +#### Project +Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a Cube log file and a base network. + +```python + +test_project = Project.create_project( + base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + +test_project.evaluate_changes() + +test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + +``` + +#### ModelRoadwayNetwork +A subclass of network_wrangler's RoadwayNetwork +class with additional understanding about how to translate and write the +network out to the MetCouncil Roadway Network schema.
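+
+The translation can also start from a `RoadwayNetwork` that is already in memory, for example `my_scenario.road_net` from the Scenario example above. The sketch below is illustrative only: it strings together methods listed in this documentation (`from_RoadwayNetwork`, `create_calculated_variables`, `roadway_standard_to_met_council_network`, `write_roadway_as_shp`) with default parameters, and the exact arguments accepted by each method may differ.
+
+```python
+from lasso import ModelRoadwayNetwork
+
+# Illustrative sketch: wrap an existing network_wrangler RoadwayNetwork
+# (e.g. my_scenario.road_net from the Scenario example) as a ModelRoadwayNetwork.
+model_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+    my_scenario.road_net,  # a network_wrangler RoadwayNetwork
+    parameters={},         # defaults; pass a dict to override (see Parameters below)
+)
+
+# Add the MetCouncil-specific calculated variables, translate the network to the
+# MetCouncil schema, and write the result out as shapefiles.
+model_net.create_calculated_variables()
+model_net.roadway_standard_to_met_council_network()
+model_net.write_roadway_as_shp()
+```
+
+The network can also be read directly from the standard roadway network files and written out in fixed-width format, as shown below: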
+ +```Python + +net = ModelRoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ) + +net.write_roadway_as_fixedwidth() + +``` + +#### StandardTransit +Translates the standard GTFS data to MetCouncil's Cube Line files. + +```Python +cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) +cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) +``` + +#### CubeTransit +Used by the project class and has the capability to: + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + +```python +tn = CubeTransit.create_from_cube(CUBE_DIR) +transit_change_list = tn.evaluate_differences(base_transit_network) +``` + +#### Parameters +Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary. + +```Python +# read parameters from a yaml configuration file +# could also provide as a key/value pair +with open(config_file) as f: + my_config = yaml.safe_load(f) + +# provide parameters at instantiation of ModelRoadwayNetwork +model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork( + my_scenario.road_net, parameters=my_config.get("my_parameters", {}) + ) +# network written with direction from the parameters given +model_road_net.write_roadway_as_shp() + +``` + +### Typical Workflow + +Workflows in Lasso and Network Wrangler typically accomplish one of two goals: +1. Create Project Cards to document network changes as a result of either transit or roadway projects. +2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network. + +#### Project Cards from Transit LIN Files + + +#### Project Cards from Cube LOG Files + + +#### Model Network Files for a Scenario + + + +## Running Quickstart Jupyter Notebooks + +To learn basic lasso functionality, please refer to the following jupyter notebooks in the `/notebooks` directory: + + - `Lasso Project Card Creation Quickstart.ipynb` + - `Lasso Scenario Creation Quickstart.ipynb` + + Jupyter notebooks can be started by activating the lasso conda environment and typing `jupyter notebook`: + + ```bash + conda activate + jupyter notebook + ``` diff --git a/branch/bicounty/_static/_sphinx_javascript_frameworks_compat.js b/branch/bicounty/_static/_sphinx_javascript_frameworks_compat.js new file mode 100644 index 0000000..8549469 --- /dev/null +++ b/branch/bicounty/_static/_sphinx_javascript_frameworks_compat.js @@ -0,0 +1,134 @@ +/* + * _sphinx_javascript_frameworks_compat.js + * ~~~~~~~~~~ + * + * Compatability shim for jQuery and underscores.js. + * + * WILL BE REMOVED IN Sphinx 6.0 + * xref RemovedInSphinx60Warning + * + */ + +/** + * select a different prefix for underscore + */ +$u = _.noConflict(); + + +/** + * small helper function to urldecode strings + * + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent#Decoding_query_parameters_from_a_URL + */ +jQuery.urldecode = function(x) { + if (!x) { + return x + } + return decodeURIComponent(x.replace(/\+/g, ' ')); +}; + +/** + * small helper function to urlencode strings + */ +jQuery.urlencode = encodeURIComponent; + +/** + * This function returns the parsed url parameters of the + * current request. 
Multiple values per key are supported, + * it will always return arrays of strings for the value parts. + */ +jQuery.getQueryParameters = function(s) { + if (typeof s === 'undefined') + s = document.location.search; + var parts = s.substr(s.indexOf('?') + 1).split('&'); + var result = {}; + for (var i = 0; i < parts.length; i++) { + var tmp = parts[i].split('=', 2); + var key = jQuery.urldecode(tmp[0]); + var value = jQuery.urldecode(tmp[1]); + if (key in result) + result[key].push(value); + else + result[key] = [value]; + } + return result; +}; + +/** + * highlight a given string on a jquery object by wrapping it in + * span elements with the given class name. + */ +jQuery.fn.highlightText = function(text, className) { + function highlight(node, addItems) { + if (node.nodeType === 3) { + var val = node.nodeValue; + var pos = val.toLowerCase().indexOf(text); + if (pos >= 0 && + !jQuery(node.parentNode).hasClass(className) && + !jQuery(node.parentNode).hasClass("nohighlight")) { + var span; + var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.className = className; + } + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + node.parentNode.insertBefore(span, node.parentNode.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling)); + node.nodeValue = val.substr(0, pos); + if (isInSVG) { + var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect"); + var bbox = node.parentElement.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute('class', className); + addItems.push({ + "parent": node.parentNode, + "target": rect}); + } + } + } + else if (!jQuery(node).is("button, select, textarea")) { + jQuery.each(node.childNodes, function() { + highlight(this, addItems); + }); + } + } + var addItems = []; + var result = this.each(function() { + highlight(this, addItems); + }); + for (var i = 0; i < addItems.length; ++i) { + jQuery(addItems[i].parent).before(addItems[i].target); + } + return result; +}; + +/* + * backward compatibility for jQuery.browser + * This will be supported until firefox bug is fixed. + */ +if (!jQuery.browser) { + jQuery.uaMatch = function(ua) { + ua = ua.toLowerCase(); + + var match = /(chrome)[ \/]([\w.]+)/.exec(ua) || + /(webkit)[ \/]([\w.]+)/.exec(ua) || + /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) || + /(msie) ([\w.]+)/.exec(ua) || + ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) || + []; + + return { + browser: match[ 1 ] || "", + version: match[ 2 ] || "0" + }; + }; + jQuery.browser = {}; + jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true; +} diff --git a/branch/bicounty/_static/basic.css b/branch/bicounty/_static/basic.css new file mode 100644 index 0000000..eeb0519 --- /dev/null +++ b/branch/bicounty/_static/basic.css @@ -0,0 +1,899 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2022 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. 
+ * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 230px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li p.context { + color: #888; + margin: 2px 0 0 30px; + text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + 
+div.body { + min-width: 360px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} +a.brackets:before, +span.brackets > a:before{ + content: "["; +} + +a.brackets:after, +span.brackets > a:after { + content: "]"; +} + + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, figure.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, figure.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, figure.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, figure.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar, +aside.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ +div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +aside.sidebar > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +aside.sidebar::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + border-left: 0; + border-right: 
0; + border-bottom: 1px solid #aaa; +} + +th { + text-align: left; + padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure, figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption, figcaption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number, +figcaption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text, +figcaption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + +/* -- object description styles --------------------------------------------- */ + +.sig { + font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; +} + +.sig-name, code.descname { + background-color: transparent; + font-weight: bold; +} + +.sig-name { + font-size: 1.1em; +} + +code.descname { + font-size: 1.2em; +} + +.sig-prename, code.descclassname { + background-color: transparent; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.sig-param.n { + font-style: italic; +} + +/* C++ specific styling */ + +.sig-inline.c-texpr, +.sig-inline.cpp-texpr { + font-family: unset; +} + +.sig.c .k, .sig.c .kt, +.sig.cpp .k, .sig.cpp .kt { + color: #0033B3; +} + +.sig.c .m, +.sig.cpp .m { + color: #1750EB; +} + +.sig.c .s, .sig.c .sc, +.sig.cpp .s, .sig.cpp .sc { + color: #067D17; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} +dl.footnote > dt, +dl.citation > dt { + float: left; + margin-right: 0.5em; +} + +dl.footnote > dd, +dl.citation > dd { + margin-bottom: 0em; +} + +dl.footnote > dd:after, +dl.citation > dd:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; + word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} +dl.field-list > dt:after { + content: ":"; +} + + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 
15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: oblique; +} + +.classifier:before { + font-style: normal; + margin: 0 0.5em; + content: ":"; + display: inline-block; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; + white-space: nowrap; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + +td.linenos pre { + border: 0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +span.linenos, +div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; + -webkit-user-select: text; /* Safari fallback only */ + -webkit-user-select: none; /* Chrome/Safari */ + -moz-user-select: none; /* Firefox */ + -ms-user-select: none; /* IE10+ */ +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: -1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + +span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + 
+@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/branch/bicounty/_static/css/badge_only.css b/branch/bicounty/_static/css/badge_only.css new file mode 100644 index 0000000..e380325 --- /dev/null +++ b/branch/bicounty/_static/css/badge_only.css @@ -0,0 +1 @@ +.fa:before{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-style:normal;font-weight:400;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#FontAwesome) format("svg")}.fa:before{font-family:FontAwesome;font-style:normal;font-weight:400;line-height:1}.fa:before,a .fa{text-decoration:inherit}.fa:before,a .fa,li .fa{display:inline-block}li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before,.icon-book:before{content:"\f02d"}.fa-caret-down:before,.icon-caret-down:before{content:"\f0d7"}.fa-caret-up:before,.icon-caret-up:before{content:"\f0d8"}.fa-caret-left:before,.icon-caret-left:before{content:"\f0d9"}.fa-caret-right:before,.icon-caret-right:before{content:"\f0da"}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60}.rst-versions .rst-current-version:after{clear:both;content:"";display:block}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version 
.icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}} \ No newline at end of file diff --git a/branch/bicounty/_static/css/fonts/Roboto-Slab-Bold.woff b/branch/bicounty/_static/css/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and b/branch/bicounty/_static/css/fonts/Roboto-Slab-Bold.woff differ diff --git a/branch/bicounty/_static/css/fonts/Roboto-Slab-Bold.woff2 b/branch/bicounty/_static/css/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/branch/bicounty/_static/css/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/branch/bicounty/_static/css/fonts/Roboto-Slab-Regular.woff b/branch/bicounty/_static/css/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/branch/bicounty/_static/css/fonts/Roboto-Slab-Regular.woff differ diff --git a/branch/bicounty/_static/css/fonts/Roboto-Slab-Regular.woff2 b/branch/bicounty/_static/css/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary files /dev/null and b/branch/bicounty/_static/css/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/branch/bicounty/_static/css/fonts/fontawesome-webfont.eot b/branch/bicounty/_static/css/fonts/fontawesome-webfont.eot new file mode 100644 index 0000000..e9f60ca Binary files /dev/null and b/branch/bicounty/_static/css/fonts/fontawesome-webfont.eot differ diff --git a/branch/bicounty/_static/css/fonts/fontawesome-webfont.svg b/branch/bicounty/_static/css/fonts/fontawesome-webfont.svg new file mode 100644 index 0000000..855c845 --- /dev/null +++ b/branch/bicounty/_static/css/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/branch/bicounty/_static/css/fonts/fontawesome-webfont.ttf b/branch/bicounty/_static/css/fonts/fontawesome-webfont.ttf new file mode 100644 index 0000000..35acda2 Binary files /dev/null and b/branch/bicounty/_static/css/fonts/fontawesome-webfont.ttf differ diff --git a/branch/bicounty/_static/css/fonts/fontawesome-webfont.woff b/branch/bicounty/_static/css/fonts/fontawesome-webfont.woff new file mode 100644 index 0000000..400014a Binary files /dev/null and b/branch/bicounty/_static/css/fonts/fontawesome-webfont.woff differ diff --git a/branch/bicounty/_static/css/fonts/fontawesome-webfont.woff2 b/branch/bicounty/_static/css/fonts/fontawesome-webfont.woff2 new file mode 100644 index 0000000..4d13fc6 Binary files /dev/null and b/branch/bicounty/_static/css/fonts/fontawesome-webfont.woff2 differ diff --git a/branch/bicounty/_static/css/fonts/lato-bold-italic.woff b/branch/bicounty/_static/css/fonts/lato-bold-italic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/branch/bicounty/_static/css/fonts/lato-bold-italic.woff differ diff --git a/branch/bicounty/_static/css/fonts/lato-bold-italic.woff2 b/branch/bicounty/_static/css/fonts/lato-bold-italic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/branch/bicounty/_static/css/fonts/lato-bold-italic.woff2 differ diff --git a/branch/bicounty/_static/css/fonts/lato-bold.woff b/branch/bicounty/_static/css/fonts/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/branch/bicounty/_static/css/fonts/lato-bold.woff differ diff --git a/branch/bicounty/_static/css/fonts/lato-bold.woff2 b/branch/bicounty/_static/css/fonts/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and b/branch/bicounty/_static/css/fonts/lato-bold.woff2 differ diff --git a/branch/bicounty/_static/css/fonts/lato-normal-italic.woff b/branch/bicounty/_static/css/fonts/lato-normal-italic.woff new file mode 100644 index 0000000..76114bc Binary files /dev/null and b/branch/bicounty/_static/css/fonts/lato-normal-italic.woff differ diff --git 
a/branch/bicounty/_static/css/fonts/lato-normal-italic.woff2 b/branch/bicounty/_static/css/fonts/lato-normal-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/branch/bicounty/_static/css/fonts/lato-normal-italic.woff2 differ diff --git a/branch/bicounty/_static/css/fonts/lato-normal.woff b/branch/bicounty/_static/css/fonts/lato-normal.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and b/branch/bicounty/_static/css/fonts/lato-normal.woff differ diff --git a/branch/bicounty/_static/css/fonts/lato-normal.woff2 b/branch/bicounty/_static/css/fonts/lato-normal.woff2 new file mode 100644 index 0000000..3bf9843 Binary files /dev/null and b/branch/bicounty/_static/css/fonts/lato-normal.woff2 differ diff --git a/branch/bicounty/_static/css/theme.css b/branch/bicounty/_static/css/theme.css new file mode 100644 index 0000000..0d9ae7e --- /dev/null +++ b/branch/bicounty/_static/css/theme.css @@ -0,0 +1,4 @@ +html{box-sizing:border-box}*,:after,:before{box-sizing:inherit}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}[hidden],audio:not([controls]){display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:active,a:hover{outline:0}abbr[title]{border-bottom:1px dotted}b,strong{font-weight:700}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;text-decoration:none}ins,mark{color:#000}mark{background:#ff0;font-style:italic;font-weight:700}.rst-content code,.rst-content tt,code,kbd,pre,samp{font-family:monospace,serif;_font-family:courier new,monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:after,q:before{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sup{top:-.5em}sub{bottom:-.25em}dl,ol,ul{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure,form{margin:0}label{cursor:pointer}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type=button],input[type=reset],input[type=submit]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type=search]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}textarea{resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:.2em 0;background:#ccc;color:#000;padding:.2em 0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir br{display:none}.hidden{display:none!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media 
print{body,html,section{background:none!important}*{box-shadow:none!important;text-shadow:none!important;filter:none!important;-ms-filter:none!important}a,a:visited{text-decoration:underline}.ir a:after,a[href^="#"]:after,a[href^="javascript:"]:after{content:""}blockquote,pre{page-break-inside:avoid}thead{display:table-header-group}img,tr{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}.rst-content .toctree-wrapper>p.caption,h2,h3,p{orphans:3;widows:3}.rst-content .toctree-wrapper>p.caption,h2,h3{page-break-after:avoid}}.btn,.fa:before,.icon:before,.rst-content .admonition,.rst-content .admonition-title:before,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .code-block-caption .headerlink:before,.rst-content .danger,.rst-content .eqno .headerlink:before,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-alert,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before,.wy-nav-top a,.wy-side-nav-search .wy-dropdown>a,.wy-side-nav-search>a,input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week],select,textarea{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}/*! 
+ * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome + * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License) + */@font-face{font-family:FontAwesome;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713);src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix&v=4.7.0) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#fontawesomeregular) format("svg");font-weight:400;font-style:normal}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{display:inline-block;font:normal normal normal 14px/1 FontAwesome;font-size:inherit;text-rendering:auto;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.fa-lg{font-size:1.33333em;line-height:.75em;vertical-align:-15%}.fa-2x{font-size:2em}.fa-3x{font-size:3em}.fa-4x{font-size:4em}.fa-5x{font-size:5em}.fa-fw{width:1.28571em;text-align:center}.fa-ul{padding-left:0;margin-left:2.14286em;list-style-type:none}.fa-ul>li{position:relative}.fa-li{position:absolute;left:-2.14286em;width:2.14286em;top:.14286em;text-align:center}.fa-li.fa-lg{left:-1.85714em}.fa-border{padding:.2em .25em .15em;border:.08em solid #eee;border-radius:.1em}.fa-pull-left{float:left}.fa-pull-right{float:right}.fa-pull-left.icon,.fa.fa-pull-left,.rst-content .code-block-caption .fa-pull-left.headerlink,.rst-content .eqno .fa-pull-left.headerlink,.rst-content .fa-pull-left.admonition-title,.rst-content code.download span.fa-pull-left:first-child,.rst-content dl dt .fa-pull-left.headerlink,.rst-content h1 .fa-pull-left.headerlink,.rst-content h2 .fa-pull-left.headerlink,.rst-content h3 .fa-pull-left.headerlink,.rst-content h4 .fa-pull-left.headerlink,.rst-content h5 .fa-pull-left.headerlink,.rst-content h6 .fa-pull-left.headerlink,.rst-content p .fa-pull-left.headerlink,.rst-content table>caption .fa-pull-left.headerlink,.rst-content tt.download span.fa-pull-left:first-child,.wy-menu-vertical li.current>a button.fa-pull-left.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-left.toctree-expand,.wy-menu-vertical li button.fa-pull-left.toctree-expand{margin-right:.3em}.fa-pull-right.icon,.fa.fa-pull-right,.rst-content .code-block-caption .fa-pull-right.headerlink,.rst-content .eqno .fa-pull-right.headerlink,.rst-content .fa-pull-right.admonition-title,.rst-content code.download span.fa-pull-right:first-child,.rst-content dl dt .fa-pull-right.headerlink,.rst-content h1 .fa-pull-right.headerlink,.rst-content h2 .fa-pull-right.headerlink,.rst-content h3 .fa-pull-right.headerlink,.rst-content h4 .fa-pull-right.headerlink,.rst-content h5 .fa-pull-right.headerlink,.rst-content h6 
/* [minified Sphinx/Read the Docs theme CSS (Font Awesome icon rules, wy-* form, alert, button, and dropdown styles) omitted; the icon glyph values inside the content:"" rules were stripped during text extraction and are not recoverable here] */
input[type=datetime-local],.wy-form input[type=datetime],.wy-form input[type=email],.wy-form input[type=month],.wy-form input[type=number],.wy-form input[type=password],.wy-form input[type=search],.wy-form input[type=tel],.wy-form input[type=time],.wy-form input[type=url],.wy-form input[type=week]{margin-bottom:0}.wy-form-aligned .wy-control-group label{margin-bottom:.3em;text-align:left;display:block;width:100%}.wy-form-aligned .wy-control{margin:1.5em 0 0}.wy-form-message,.wy-form-message-inline,.wy-form .wy-help-inline{display:block;font-size:80%;padding:6px 0}}@media screen and (max-width:768px){.tablet-hide{display:none}}@media screen and (max-width:480px){.mobile-hide{display:none}}.float-left{float:left}.float-right{float:right}.full-width{width:100%}.rst-content table.docutils,.rst-content table.field-list,.wy-table{border-collapse:collapse;border-spacing:0;empty-cells:show;margin-bottom:24px}.rst-content table.docutils caption,.rst-content table.field-list caption,.wy-table caption{color:#000;font:italic 85%/1 arial,sans-serif;padding:1em 0;text-align:center}.rst-content table.docutils td,.rst-content table.docutils th,.rst-content table.field-list td,.rst-content table.field-list th,.wy-table td,.wy-table th{font-size:90%;margin:0;overflow:visible;padding:8px 16px}.rst-content table.docutils td:first-child,.rst-content table.docutils th:first-child,.rst-content table.field-list td:first-child,.rst-content table.field-list th:first-child,.wy-table td:first-child,.wy-table th:first-child{border-left-width:0}.rst-content table.docutils thead,.rst-content table.field-list thead,.wy-table thead{color:#000;text-align:left;vertical-align:bottom;white-space:nowrap}.rst-content table.docutils thead th,.rst-content table.field-list thead th,.wy-table thead th{font-weight:700;border-bottom:2px solid #e1e4e5}.rst-content table.docutils td,.rst-content table.field-list td,.wy-table td{background-color:transparent;vertical-align:middle}.rst-content table.docutils td p,.rst-content table.field-list td p,.wy-table td p{line-height:18px}.rst-content table.docutils td p:last-child,.rst-content table.field-list td p:last-child,.wy-table td p:last-child{margin-bottom:0}.rst-content table.docutils .wy-table-cell-min,.rst-content table.field-list .wy-table-cell-min,.wy-table .wy-table-cell-min{width:1%;padding-right:0}.rst-content table.docutils .wy-table-cell-min input[type=checkbox],.rst-content table.field-list .wy-table-cell-min input[type=checkbox],.wy-table .wy-table-cell-min input[type=checkbox]{margin:0}.wy-table-secondary{color:grey;font-size:90%}.wy-table-tertiary{color:grey;font-size:80%}.rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td,.wy-table-backed,.wy-table-odd td,.wy-table-striped tr:nth-child(2n-1) td{background-color:#f3f6f6}.rst-content table.docutils,.wy-table-bordered-all{border:1px solid #e1e4e5}.rst-content table.docutils td,.wy-table-bordered-all td{border-bottom:1px solid #e1e4e5;border-left:1px solid #e1e4e5}.rst-content table.docutils tbody>tr:last-child td,.wy-table-bordered-all tbody>tr:last-child td{border-bottom-width:0}.wy-table-bordered{border:1px solid #e1e4e5}.wy-table-bordered-rows td{border-bottom:1px solid #e1e4e5}.wy-table-bordered-rows tbody>tr:last-child td{border-bottom-width:0}.wy-table-horizontal td,.wy-table-horizontal th{border-width:0 0 1px;border-bottom:1px solid #e1e4e5}.wy-table-horizontal tbody>tr:last-child td{border-bottom-width:0}.wy-table-responsive{margin-bottom:24px;max-width:100%;overflow:auto}.wy-table-responsive 
table{margin-bottom:0!important}.wy-table-responsive table td,.wy-table-responsive table th{white-space:nowrap}a{color:#2980b9;text-decoration:none;cursor:pointer}a:hover{color:#3091d1}a:visited{color:#9b59b6}html{height:100%}body,html{overflow-x:hidden}body{font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;font-weight:400;color:#404040;min-height:100%;background:#edf0f2}.wy-text-left{text-align:left}.wy-text-center{text-align:center}.wy-text-right{text-align:right}.wy-text-large{font-size:120%}.wy-text-normal{font-size:100%}.wy-text-small,small{font-size:80%}.wy-text-strike{text-decoration:line-through}.wy-text-warning{color:#e67e22!important}a.wy-text-warning:hover{color:#eb9950!important}.wy-text-info{color:#2980b9!important}a.wy-text-info:hover{color:#409ad5!important}.wy-text-success{color:#27ae60!important}a.wy-text-success:hover{color:#36d278!important}.wy-text-danger{color:#e74c3c!important}a.wy-text-danger:hover{color:#ed7669!important}.wy-text-neutral{color:#404040!important}a.wy-text-neutral:hover{color:#595959!important}.rst-content .toctree-wrapper>p.caption,h1,h2,h3,h4,h5,h6,legend{margin-top:0;font-weight:700;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif}p{line-height:24px;font-size:16px;margin:0 0 24px}h1{font-size:175%}.rst-content .toctree-wrapper>p.caption,h2{font-size:150%}h3{font-size:125%}h4{font-size:115%}h5{font-size:110%}h6{font-size:100%}hr{display:block;height:1px;border:0;border-top:1px solid #e1e4e5;margin:24px 0;padding:0}.rst-content code,.rst-content tt,code{white-space:nowrap;max-width:100%;background:#fff;border:1px solid #e1e4e5;font-size:75%;padding:0 5px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#e74c3c;overflow-x:auto}.rst-content tt.code-large,code.code-large{font-size:90%}.rst-content .section ul,.rst-content .toctree-wrapper ul,.rst-content section ul,.wy-plain-list-disc,article ul{list-style:disc;line-height:24px;margin-bottom:24px}.rst-content .section ul li,.rst-content .toctree-wrapper ul li,.rst-content section ul li,.wy-plain-list-disc li,article ul li{list-style:disc;margin-left:24px}.rst-content .section ul li p:last-child,.rst-content .section ul li ul,.rst-content .toctree-wrapper ul li p:last-child,.rst-content .toctree-wrapper ul li ul,.rst-content section ul li p:last-child,.rst-content section ul li ul,.wy-plain-list-disc li p:last-child,.wy-plain-list-disc li ul,article ul li p:last-child,article ul li ul{margin-bottom:0}.rst-content .section ul li li,.rst-content .toctree-wrapper ul li li,.rst-content section ul li li,.wy-plain-list-disc li li,article ul li li{list-style:circle}.rst-content .section ul li li li,.rst-content .toctree-wrapper ul li li li,.rst-content section ul li li li,.wy-plain-list-disc li li li,article ul li li li{list-style:square}.rst-content .section ul li ol li,.rst-content .toctree-wrapper ul li ol li,.rst-content section ul li ol li,.wy-plain-list-disc li ol li,article ul li ol li{list-style:decimal}.rst-content .section ol,.rst-content .section ol.arabic,.rst-content .toctree-wrapper ol,.rst-content .toctree-wrapper ol.arabic,.rst-content section ol,.rst-content section ol.arabic,.wy-plain-list-decimal,article ol{list-style:decimal;line-height:24px;margin-bottom:24px}.rst-content .section ol.arabic li,.rst-content .section ol li,.rst-content .toctree-wrapper ol.arabic li,.rst-content .toctree-wrapper ol li,.rst-content section ol.arabic li,.rst-content section ol li,.wy-plain-list-decimal li,article ol 
li{list-style:decimal;margin-left:24px}.rst-content .section ol.arabic li ul,.rst-content .section ol li p:last-child,.rst-content .section ol li ul,.rst-content .toctree-wrapper ol.arabic li ul,.rst-content .toctree-wrapper ol li p:last-child,.rst-content .toctree-wrapper ol li ul,.rst-content section ol.arabic li ul,.rst-content section ol li p:last-child,.rst-content section ol li ul,.wy-plain-list-decimal li p:last-child,.wy-plain-list-decimal li ul,article ol li p:last-child,article ol li ul{margin-bottom:0}.rst-content .section ol.arabic li ul li,.rst-content .section ol li ul li,.rst-content .toctree-wrapper ol.arabic li ul li,.rst-content .toctree-wrapper ol li ul li,.rst-content section ol.arabic li ul li,.rst-content section ol li ul li,.wy-plain-list-decimal li ul li,article ol li ul li{list-style:disc}.wy-breadcrumbs{*zoom:1}.wy-breadcrumbs:after,.wy-breadcrumbs:before{display:table;content:""}.wy-breadcrumbs:after{clear:both}.wy-breadcrumbs li{display:inline-block}.wy-breadcrumbs li.wy-breadcrumbs-aside{float:right}.wy-breadcrumbs li a{display:inline-block;padding:5px}.wy-breadcrumbs li a:first-child{padding-left:0}.rst-content .wy-breadcrumbs li tt,.wy-breadcrumbs li .rst-content tt,.wy-breadcrumbs li code{padding:5px;border:none;background:none}.rst-content .wy-breadcrumbs li tt.literal,.wy-breadcrumbs li .rst-content tt.literal,.wy-breadcrumbs li code.literal{color:#404040}.wy-breadcrumbs-extra{margin-bottom:0;color:#b3b3b3;font-size:80%;display:inline-block}@media screen and (max-width:480px){.wy-breadcrumbs-extra,.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}@media print{.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}html{font-size:16px}.wy-affix{position:fixed;top:1.618em}.wy-menu a:hover{text-decoration:none}.wy-menu-horiz{*zoom:1}.wy-menu-horiz:after,.wy-menu-horiz:before{display:table;content:""}.wy-menu-horiz:after{clear:both}.wy-menu-horiz li,.wy-menu-horiz ul{display:inline-block}.wy-menu-horiz li:hover{background:hsla(0,0%,100%,.1)}.wy-menu-horiz li.divide-left{border-left:1px solid #404040}.wy-menu-horiz li.divide-right{border-right:1px solid #404040}.wy-menu-horiz a{height:32px;display:inline-block;line-height:32px;padding:0 16px}.wy-menu-vertical{width:300px}.wy-menu-vertical header,.wy-menu-vertical p.caption{color:#55a5d9;height:32px;line-height:32px;padding:0 1.618em;margin:12px 0 0;display:block;font-weight:700;text-transform:uppercase;font-size:85%;white-space:nowrap}.wy-menu-vertical ul{margin-bottom:0}.wy-menu-vertical li.divide-top{border-top:1px solid #404040}.wy-menu-vertical li.divide-bottom{border-bottom:1px solid #404040}.wy-menu-vertical li.current{background:#e3e3e3}.wy-menu-vertical li.current a{color:grey;border-right:1px solid #c9c9c9;padding:.4045em 2.427em}.wy-menu-vertical li.current a:hover{background:#d6d6d6}.rst-content .wy-menu-vertical li tt,.wy-menu-vertical li .rst-content tt,.wy-menu-vertical li code{border:none;background:inherit;color:inherit;padding-left:0;padding-right:0}.wy-menu-vertical li button.toctree-expand{display:block;float:left;margin-left:-1.2em;line-height:18px;color:#4d4d4d;border:none;background:none;padding:0}.wy-menu-vertical li.current>a,.wy-menu-vertical li.on a{color:#404040;font-weight:700;position:relative;background:#fcfcfc;border:none;padding:.4045em 1.618em}.wy-menu-vertical li.current>a:hover,.wy-menu-vertical li.on a:hover{background:#fcfcfc}.wy-menu-vertical li.current>a:hover button.toctree-expand,.wy-menu-vertical li.on a:hover button.toctree-expand{color:grey}.wy-menu-vertical 
li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand{display:block;line-height:18px;color:#333}.wy-menu-vertical li.toctree-l1.current>a{border-bottom:1px solid #c9c9c9;border-top:1px solid #c9c9c9}.wy-menu-vertical .toctree-l1.current .toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .toctree-l11>ul{display:none}.wy-menu-vertical .toctree-l1.current .current.toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .current.toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .current.toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .current.toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .current.toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .current.toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .current.toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .current.toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .current.toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .current.toctree-l11>ul{display:block}.wy-menu-vertical li.toctree-l3,.wy-menu-vertical li.toctree-l4{font-size:.9em}.wy-menu-vertical li.toctree-l2 a,.wy-menu-vertical li.toctree-l3 a,.wy-menu-vertical li.toctree-l4 a,.wy-menu-vertical li.toctree-l5 a,.wy-menu-vertical li.toctree-l6 a,.wy-menu-vertical li.toctree-l7 a,.wy-menu-vertical li.toctree-l8 a,.wy-menu-vertical li.toctree-l9 a,.wy-menu-vertical li.toctree-l10 a{color:#404040}.wy-menu-vertical li.toctree-l2 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l3 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l4 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l5 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l6 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l7 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l8 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l9 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l10 a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a,.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a,.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a,.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a,.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a,.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a,.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a,.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{display:block}.wy-menu-vertical li.toctree-l2.current>a{padding:.4045em 2.427em}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{padding:.4045em 1.618em .4045em 4.045em}.wy-menu-vertical li.toctree-l3.current>a{padding:.4045em 4.045em}.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{padding:.4045em 1.618em .4045em 5.663em}.wy-menu-vertical li.toctree-l4.current>a{padding:.4045em 5.663em}.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a{padding:.4045em 1.618em .4045em 7.281em}.wy-menu-vertical li.toctree-l5.current>a{padding:.4045em 7.281em}.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a{padding:.4045em 1.618em .4045em 
8.899em}.wy-menu-vertical li.toctree-l6.current>a{padding:.4045em 8.899em}.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a{padding:.4045em 1.618em .4045em 10.517em}.wy-menu-vertical li.toctree-l7.current>a{padding:.4045em 10.517em}.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a{padding:.4045em 1.618em .4045em 12.135em}.wy-menu-vertical li.toctree-l8.current>a{padding:.4045em 12.135em}.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a{padding:.4045em 1.618em .4045em 13.753em}.wy-menu-vertical li.toctree-l9.current>a{padding:.4045em 13.753em}.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a{padding:.4045em 1.618em .4045em 15.371em}.wy-menu-vertical li.toctree-l10.current>a{padding:.4045em 15.371em}.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{padding:.4045em 1.618em .4045em 16.989em}.wy-menu-vertical li.toctree-l2.current>a,.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{background:#c9c9c9}.wy-menu-vertical li.toctree-l2 button.toctree-expand{color:#a3a3a3}.wy-menu-vertical li.toctree-l3.current>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{background:#bdbdbd}.wy-menu-vertical li.toctree-l3 button.toctree-expand{color:#969696}.wy-menu-vertical li.current ul{display:block}.wy-menu-vertical li ul{margin-bottom:0;display:none}.wy-menu-vertical li ul li a{margin-bottom:0;color:#d9d9d9;font-weight:400}.wy-menu-vertical a{line-height:18px;padding:.4045em 1.618em;display:block;position:relative;font-size:90%;color:#d9d9d9}.wy-menu-vertical a:hover{background-color:#4e4a4a;cursor:pointer}.wy-menu-vertical a:hover button.toctree-expand{color:#d9d9d9}.wy-menu-vertical a:active{background-color:#2980b9;cursor:pointer;color:#fff}.wy-menu-vertical a:active button.toctree-expand{color:#fff}.wy-side-nav-search{display:block;width:300px;padding:.809em;margin-bottom:.809em;z-index:200;background-color:#2980b9;text-align:center;color:#fcfcfc}.wy-side-nav-search input[type=text]{width:100%;border-radius:50px;padding:6px 12px;border-color:#2472a4}.wy-side-nav-search img{display:block;margin:auto auto .809em;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-side-nav-search .wy-dropdown>a,.wy-side-nav-search>a{color:#fcfcfc;font-size:100%;font-weight:700;display:inline-block;padding:4px 6px;margin-bottom:.809em;max-width:100%}.wy-side-nav-search .wy-dropdown>a:hover,.wy-side-nav-search>a:hover{background:hsla(0,0%,100%,.1)}.wy-side-nav-search .wy-dropdown>a img.logo,.wy-side-nav-search>a img.logo{display:block;margin:0 auto;height:auto;width:auto;border-radius:0;max-width:100%;background:transparent}.wy-side-nav-search .wy-dropdown>a.icon img.logo,.wy-side-nav-search>a.icon img.logo{margin-top:.85em}.wy-side-nav-search>div.version{margin-top:-.4045em;margin-bottom:.809em;font-weight:400;color:hsla(0,0%,100%,.3)}.wy-nav .wy-menu-vertical header{color:#2980b9}.wy-nav .wy-menu-vertical a{color:#b3b3b3}.wy-nav .wy-menu-vertical a:hover{background-color:#2980b9;color:#fff}[data-menu-wrap]{-webkit-transition:all .2s ease-in;-moz-transition:all .2s ease-in;transition:all .2s 
ease-in;position:absolute;opacity:1;width:100%;opacity:0}[data-menu-wrap].move-center{left:0;right:auto;opacity:1}[data-menu-wrap].move-left{right:auto;left:-100%;opacity:0}[data-menu-wrap].move-right{right:-100%;left:auto;opacity:0}.wy-body-for-nav{background:#fcfcfc}.wy-grid-for-nav{position:absolute;width:100%;height:100%}.wy-nav-side{position:fixed;top:0;bottom:0;left:0;padding-bottom:2em;width:300px;overflow-x:hidden;overflow-y:hidden;min-height:100%;color:#9b9b9b;background:#343131;z-index:200}.wy-side-scroll{width:320px;position:relative;overflow-x:hidden;overflow-y:scroll;height:100%}.wy-nav-top{display:none;background:#2980b9;color:#fff;padding:.4045em .809em;position:relative;line-height:50px;text-align:center;font-size:100%;*zoom:1}.wy-nav-top:after,.wy-nav-top:before{display:table;content:""}.wy-nav-top:after{clear:both}.wy-nav-top a{color:#fff;font-weight:700}.wy-nav-top img{margin-right:12px;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-nav-top i{font-size:30px;float:left;cursor:pointer;padding-top:inherit}.wy-nav-content-wrap{margin-left:300px;background:#fcfcfc;min-height:100%}.wy-nav-content{padding:1.618em 3.236em;height:100%;max-width:800px;margin:auto}.wy-body-mask{position:fixed;width:100%;height:100%;background:rgba(0,0,0,.2);display:none;z-index:499}.wy-body-mask.on{display:block}footer{color:grey}footer p{margin-bottom:12px}.rst-content footer span.commit tt,footer span.commit .rst-content tt,footer span.commit code{padding:0;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:1em;background:none;border:none;color:grey}.rst-footer-buttons{*zoom:1}.rst-footer-buttons:after,.rst-footer-buttons:before{width:100%;display:table;content:""}.rst-footer-buttons:after{clear:both}.rst-breadcrumbs-buttons{margin-top:12px;*zoom:1}.rst-breadcrumbs-buttons:after,.rst-breadcrumbs-buttons:before{display:table;content:""}.rst-breadcrumbs-buttons:after{clear:both}#search-results .search li{margin-bottom:24px;border-bottom:1px solid #e1e4e5;padding-bottom:24px}#search-results .search li:first-child{border-top:1px solid #e1e4e5;padding-top:24px}#search-results .search li a{font-size:120%;margin-bottom:12px;display:inline-block}#search-results .context{color:grey;font-size:90%}.genindextable li>ul{margin-left:24px}@media screen and (max-width:768px){.wy-body-for-nav{background:#fcfcfc}.wy-nav-top{display:block}.wy-nav-side{left:-300px}.wy-nav-side.shift{width:85%;left:0}.wy-menu.wy-menu-vertical,.wy-side-nav-search,.wy-side-scroll{width:auto}.wy-nav-content-wrap{margin-left:0}.wy-nav-content-wrap .wy-nav-content{padding:1.618em}.wy-nav-content-wrap.shift{position:fixed;min-width:100%;left:85%;top:0;height:100%;overflow:hidden}}@media screen and (min-width:1100px){.wy-nav-content-wrap{background:rgba(0,0,0,.05)}.wy-nav-content{margin:0;background:#fcfcfc}}@media print{.rst-versions,.wy-nav-side,footer{display:none}.wy-nav-content-wrap{margin-left:0}}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60;*zoom:1}.rst-versions .rst-current-version:after,.rst-versions .rst-current-version:before{display:table;content:""}.rst-versions 
.rst-current-version:after{clear:both}.rst-content .code-block-caption .rst-versions .rst-current-version .headerlink,.rst-content .eqno .rst-versions .rst-current-version .headerlink,.rst-content .rst-versions .rst-current-version .admonition-title,.rst-content code.download .rst-versions .rst-current-version span:first-child,.rst-content dl dt .rst-versions .rst-current-version .headerlink,.rst-content h1 .rst-versions .rst-current-version .headerlink,.rst-content h2 .rst-versions .rst-current-version .headerlink,.rst-content h3 .rst-versions .rst-current-version .headerlink,.rst-content h4 .rst-versions .rst-current-version .headerlink,.rst-content h5 .rst-versions .rst-current-version .headerlink,.rst-content h6 .rst-versions .rst-current-version .headerlink,.rst-content p .rst-versions .rst-current-version .headerlink,.rst-content table>caption .rst-versions .rst-current-version .headerlink,.rst-content tt.download .rst-versions .rst-current-version span:first-child,.rst-versions .rst-current-version .fa,.rst-versions .rst-current-version .icon,.rst-versions .rst-current-version .rst-content .admonition-title,.rst-versions .rst-current-version .rst-content .code-block-caption .headerlink,.rst-versions .rst-current-version .rst-content .eqno .headerlink,.rst-versions .rst-current-version .rst-content code.download span:first-child,.rst-versions .rst-current-version .rst-content dl dt .headerlink,.rst-versions .rst-current-version .rst-content h1 .headerlink,.rst-versions .rst-current-version .rst-content h2 .headerlink,.rst-versions .rst-current-version .rst-content h3 .headerlink,.rst-versions .rst-current-version .rst-content h4 .headerlink,.rst-versions .rst-current-version .rst-content h5 .headerlink,.rst-versions .rst-current-version .rst-content h6 .headerlink,.rst-versions .rst-current-version .rst-content p .headerlink,.rst-versions .rst-current-version .rst-content table>caption .headerlink,.rst-versions .rst-current-version .rst-content tt.download span:first-child,.rst-versions .rst-current-version .wy-menu-vertical li button.toctree-expand,.wy-menu-vertical li .rst-versions .rst-current-version button.toctree-expand{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and 
(max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}}.rst-content .toctree-wrapper>p.caption,.rst-content h1,.rst-content h2,.rst-content h3,.rst-content h4,.rst-content h5,.rst-content h6{margin-bottom:24px}.rst-content img{max-width:100%;height:auto}.rst-content div.figure,.rst-content figure{margin-bottom:24px}.rst-content div.figure .caption-text,.rst-content figure .caption-text{font-style:italic}.rst-content div.figure p:last-child.caption,.rst-content figure p:last-child.caption{margin-bottom:0}.rst-content div.figure.align-center,.rst-content figure.align-center{text-align:center}.rst-content .section>a>img,.rst-content .section>img,.rst-content section>a>img,.rst-content section>img{margin-bottom:24px}.rst-content abbr[title]{text-decoration:none}.rst-content.style-external-links a.reference.external:after{font-family:FontAwesome;content:"\f08e";color:#b3b3b3;vertical-align:super;font-size:60%;margin:0 .2em}.rst-content blockquote{margin-left:24px;line-height:24px;margin-bottom:24px}.rst-content pre.literal-block{white-space:pre;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;display:block;overflow:auto}.rst-content div[class^=highlight],.rst-content pre.literal-block{border:1px solid #e1e4e5;overflow-x:auto;margin:1px 0 24px}.rst-content div[class^=highlight] div[class^=highlight],.rst-content pre.literal-block div[class^=highlight]{padding:0;border:none;margin:0}.rst-content div[class^=highlight] td.code{width:100%}.rst-content .linenodiv pre{border-right:1px solid #e6e9ea;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;user-select:none;pointer-events:none}.rst-content div[class^=highlight] pre{white-space:pre;margin:0;padding:12px;display:block;overflow:auto}.rst-content div[class^=highlight] pre .hll{display:block;margin:0 -12px;padding:0 12px}.rst-content .linenodiv pre,.rst-content div[class^=highlight] pre,.rst-content pre.literal-block{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:12px;line-height:1.4}.rst-content div.highlight .gp,.rst-content div.highlight span.linenos{user-select:none;pointer-events:none}.rst-content div.highlight span.linenos{display:inline-block;padding-left:0;padding-right:12px;margin-right:12px;border-right:1px solid #e6e9ea}.rst-content .code-block-caption{font-style:italic;font-size:85%;line-height:1;padding:1em 0;text-align:center}@media print{.rst-content .codeblock,.rst-content div[class^=highlight],.rst-content div[class^=highlight] pre{white-space:pre-wrap}}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning{clear:both}.rst-content .admonition-todo .last,.rst-content .admonition-todo>:last-child,.rst-content .admonition .last,.rst-content .admonition>:last-child,.rst-content .attention .last,.rst-content .attention>:last-child,.rst-content .caution .last,.rst-content .caution>:last-child,.rst-content .danger .last,.rst-content .danger>:last-child,.rst-content .error .last,.rst-content .error>:last-child,.rst-content .hint .last,.rst-content .hint>:last-child,.rst-content .important .last,.rst-content .important>:last-child,.rst-content .note .last,.rst-content .note>:last-child,.rst-content .seealso 
.last,.rst-content .seealso>:last-child,.rst-content .tip .last,.rst-content .tip>:last-child,.rst-content .warning .last,.rst-content .warning>:last-child{margin-bottom:0}.rst-content .admonition-title:before{margin-right:4px}.rst-content .admonition table{border-color:rgba(0,0,0,.1)}.rst-content .admonition table td,.rst-content .admonition table th{background:transparent!important;border-color:rgba(0,0,0,.1)!important}.rst-content .section ol.loweralpha,.rst-content .section ol.loweralpha>li,.rst-content .toctree-wrapper ol.loweralpha,.rst-content .toctree-wrapper ol.loweralpha>li,.rst-content section ol.loweralpha,.rst-content section ol.loweralpha>li{list-style:lower-alpha}.rst-content .section ol.upperalpha,.rst-content .section ol.upperalpha>li,.rst-content .toctree-wrapper ol.upperalpha,.rst-content .toctree-wrapper ol.upperalpha>li,.rst-content section ol.upperalpha,.rst-content section ol.upperalpha>li{list-style:upper-alpha}.rst-content .section ol li>*,.rst-content .section ul li>*,.rst-content .toctree-wrapper ol li>*,.rst-content .toctree-wrapper ul li>*,.rst-content section ol li>*,.rst-content section ul li>*{margin-top:12px;margin-bottom:12px}.rst-content .section ol li>:first-child,.rst-content .section ul li>:first-child,.rst-content .toctree-wrapper ol li>:first-child,.rst-content .toctree-wrapper ul li>:first-child,.rst-content section ol li>:first-child,.rst-content section ul li>:first-child{margin-top:0}.rst-content .section ol li>p,.rst-content .section ol li>p:last-child,.rst-content .section ul li>p,.rst-content .section ul li>p:last-child,.rst-content .toctree-wrapper ol li>p,.rst-content .toctree-wrapper ol li>p:last-child,.rst-content .toctree-wrapper ul li>p,.rst-content .toctree-wrapper ul li>p:last-child,.rst-content section ol li>p,.rst-content section ol li>p:last-child,.rst-content section ul li>p,.rst-content section ul li>p:last-child{margin-bottom:12px}.rst-content .section ol li>p:only-child,.rst-content .section ol li>p:only-child:last-child,.rst-content .section ul li>p:only-child,.rst-content .section ul li>p:only-child:last-child,.rst-content .toctree-wrapper ol li>p:only-child,.rst-content .toctree-wrapper ol li>p:only-child:last-child,.rst-content .toctree-wrapper ul li>p:only-child,.rst-content .toctree-wrapper ul li>p:only-child:last-child,.rst-content section ol li>p:only-child,.rst-content section ol li>p:only-child:last-child,.rst-content section ul li>p:only-child,.rst-content section ul li>p:only-child:last-child{margin-bottom:0}.rst-content .section ol li>ol,.rst-content .section ol li>ul,.rst-content .section ul li>ol,.rst-content .section ul li>ul,.rst-content .toctree-wrapper ol li>ol,.rst-content .toctree-wrapper ol li>ul,.rst-content .toctree-wrapper ul li>ol,.rst-content .toctree-wrapper ul li>ul,.rst-content section ol li>ol,.rst-content section ol li>ul,.rst-content section ul li>ol,.rst-content section ul li>ul{margin-bottom:12px}.rst-content .section ol.simple li>*,.rst-content .section ol.simple li ol,.rst-content .section ol.simple li ul,.rst-content .section ul.simple li>*,.rst-content .section ul.simple li ol,.rst-content .section ul.simple li ul,.rst-content .toctree-wrapper ol.simple li>*,.rst-content .toctree-wrapper ol.simple li ol,.rst-content .toctree-wrapper ol.simple li ul,.rst-content .toctree-wrapper ul.simple li>*,.rst-content .toctree-wrapper ul.simple li ol,.rst-content .toctree-wrapper ul.simple li ul,.rst-content section ol.simple li>*,.rst-content section ol.simple li ol,.rst-content section ol.simple li 
ul,.rst-content section ul.simple li>*,.rst-content section ul.simple li ol,.rst-content section ul.simple li ul{margin-top:0;margin-bottom:0}.rst-content .line-block{margin-left:0;margin-bottom:24px;line-height:24px}.rst-content .line-block .line-block{margin-left:24px;margin-bottom:0}.rst-content .topic-title{font-weight:700;margin-bottom:12px}.rst-content .toc-backref{color:#404040}.rst-content .align-right{float:right;margin:0 0 24px 24px}.rst-content .align-left{float:left;margin:0 24px 24px 0}.rst-content .align-center{margin:auto}.rst-content .align-center:not(table){display:block}.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink{opacity:0;font-size:14px;font-family:FontAwesome;margin-left:.5em}.rst-content .code-block-caption .headerlink:focus,.rst-content .code-block-caption:hover .headerlink,.rst-content .eqno .headerlink:focus,.rst-content .eqno:hover .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink:focus,.rst-content .toctree-wrapper>p.caption:hover .headerlink,.rst-content dl dt .headerlink:focus,.rst-content dl dt:hover .headerlink,.rst-content h1 .headerlink:focus,.rst-content h1:hover .headerlink,.rst-content h2 .headerlink:focus,.rst-content h2:hover .headerlink,.rst-content h3 .headerlink:focus,.rst-content h3:hover .headerlink,.rst-content h4 .headerlink:focus,.rst-content h4:hover .headerlink,.rst-content h5 .headerlink:focus,.rst-content h5:hover .headerlink,.rst-content h6 .headerlink:focus,.rst-content h6:hover .headerlink,.rst-content p.caption .headerlink:focus,.rst-content p.caption:hover .headerlink,.rst-content p .headerlink:focus,.rst-content p:hover .headerlink,.rst-content table>caption .headerlink:focus,.rst-content table>caption:hover .headerlink{opacity:1}.rst-content .btn:focus{outline:2px solid}.rst-content table>caption .headerlink:after{font-size:12px}.rst-content .centered{text-align:center}.rst-content .sidebar{float:right;width:40%;display:block;margin:0 0 24px 24px;padding:24px;background:#f3f6f6;border:1px solid #e1e4e5}.rst-content .sidebar dl,.rst-content .sidebar p,.rst-content .sidebar ul{font-size:90%}.rst-content .sidebar .last,.rst-content .sidebar>:last-child{margin-bottom:0}.rst-content .sidebar .sidebar-title{display:block;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif;font-weight:700;background:#e1e4e5;padding:6px 12px;margin:-24px -24px 24px;font-size:100%}.rst-content .highlighted{background:#f1c40f;box-shadow:0 0 0 2px #f1c40f;display:inline;font-weight:700}.rst-content .citation-reference,.rst-content .footnote-reference{vertical-align:baseline;position:relative;top:-.4em;line-height:0;font-size:90%}.rst-content .hlist{width:100%}.rst-content dl dt span.classifier:before{content:" : "}.rst-content dl dt span.classifier-delimiter{display:none!important}html.writer-html4 .rst-content table.docutils.citation,html.writer-html4 .rst-content table.docutils.footnote{background:none;border:none}html.writer-html4 .rst-content table.docutils.citation td,html.writer-html4 .rst-content table.docutils.citation tr,html.writer-html4 .rst-content table.docutils.footnote td,html.writer-html4 .rst-content table.docutils.footnote 
tr{border:none;background-color:transparent!important;white-space:normal}html.writer-html4 .rst-content table.docutils.citation td.label,html.writer-html4 .rst-content table.docutils.footnote td.label{padding-left:0;padding-right:0;vertical-align:top}html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{display:grid;grid-template-columns:max-content auto}html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{padding-left:1rem}html.writer-html5 .rst-content dl.field-list>dt:after,html.writer-html5 .rst-content dl.footnote>dt:after{content:":"}html.writer-html5 .rst-content dl.field-list>dd,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dd,html.writer-html5 .rst-content dl.footnote>dt{margin-bottom:0}html.writer-html5 .rst-content dl.footnote{font-size:.9rem}html.writer-html5 .rst-content dl.footnote>dt{margin:0 .5rem .5rem 0;line-height:1.2rem;word-break:break-all;font-weight:400}html.writer-html5 .rst-content dl.footnote>dt>span.brackets{margin-right:.5rem}html.writer-html5 .rst-content dl.footnote>dt>span.brackets:before{content:"["}html.writer-html5 .rst-content dl.footnote>dt>span.brackets:after{content:"]"}html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref{font-style:italic}html.writer-html5 .rst-content dl.footnote>dd{margin:0 0 .5rem;line-height:1.2rem}html.writer-html5 .rst-content dl.footnote>dd p,html.writer-html5 .rst-content dl.option-list kbd{font-size:.9rem}.rst-content table.docutils.footnote,html.writer-html4 .rst-content table.docutils.citation,html.writer-html5 .rst-content dl.footnote{color:grey}.rst-content table.docutils.footnote code,.rst-content table.docutils.footnote tt,html.writer-html4 .rst-content table.docutils.citation code,html.writer-html4 .rst-content table.docutils.citation tt,html.writer-html5 .rst-content dl.footnote code,html.writer-html5 .rst-content dl.footnote tt{color:#555}.rst-content .wy-table-responsive.citation,.rst-content .wy-table-responsive.footnote{margin-bottom:0}.rst-content .wy-table-responsive.citation+:not(.citation),.rst-content .wy-table-responsive.footnote+:not(.footnote){margin-top:24px}.rst-content .wy-table-responsive.citation:last-child,.rst-content .wy-table-responsive.footnote:last-child{margin-bottom:24px}.rst-content table.docutils th{border-color:#e1e4e5}html.writer-html5 .rst-content table.docutils th{border:1px solid #e1e4e5}html.writer-html5 .rst-content table.docutils td>p,html.writer-html5 .rst-content table.docutils th>p{line-height:1rem;margin-bottom:0;font-size:.9rem}.rst-content table.docutils td .last,.rst-content table.docutils td .last>:last-child{margin-bottom:0}.rst-content table.field-list,.rst-content table.field-list td{border:none}.rst-content table.field-list td p{font-size:inherit;line-height:inherit}.rst-content table.field-list td>strong{display:inline-block}.rst-content table.field-list .field-name{padding-right:10px;text-align:left;white-space:nowrap}.rst-content table.field-list .field-body{text-align:left}.rst-content code,.rst-content tt{color:#000;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;padding:2px 5px}.rst-content code big,.rst-content code em,.rst-content tt big,.rst-content tt em{font-size:100%!important;line-height:normal}.rst-content code.literal,.rst-content tt.literal{color:#e74c3c;white-space:normal}.rst-content code.xref,.rst-content tt.xref,a .rst-content code,a .rst-content 
tt{font-weight:700;color:#404040}.rst-content kbd,.rst-content pre,.rst-content samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace}.rst-content a code,.rst-content a tt{color:#2980b9}.rst-content dl{margin-bottom:24px}.rst-content dl dt{font-weight:700;margin-bottom:12px}.rst-content dl ol,.rst-content dl p,.rst-content dl table,.rst-content dl ul{margin-bottom:12px}.rst-content dl dd{margin:0 0 12px 24px;line-height:24px}html.writer-html4 .rst-content dl:not(.docutils),html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple){margin-bottom:24px}html.writer-html4 .rst-content dl:not(.docutils)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple)>dt{display:table;margin:6px 0;font-size:90%;line-height:normal;background:#e7f2fa;color:#2980b9;border-top:3px solid #6ab0de;padding:6px;position:relative}html.writer-html4 .rst-content dl:not(.docutils)>dt:before,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple)>dt:before{color:#6ab0de}html.writer-html4 .rst-content dl:not(.docutils)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.field-list)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) dl:not(.field-list)>dt{margin-bottom:6px;border:none;border-left:3px solid #ccc;background:#f0f0f0;color:#555}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.field-list)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) dl:not(.field-list)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) 
.optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px;max-width:100%}html.writer-html4 .rst-content dl:not(.docutils) .k,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) .k{font-style:italic}html.writer-html4 .rst-content dl:not(.docutils) .descclassname,html.writer-html4 .rst-content dl:not(.docutils) .descname,html.writer-html4 .rst-content dl:not(.docutils) .sig-name,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) .descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) .descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.glossary):not(.simple) .sig-name{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#000}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel{border:1px solid #7fbbe3;background:#e7f2fa;font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content .sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) 
format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} \ No newline at end of file diff --git a/branch/bicounty/_static/doctools.js b/branch/bicounty/_static/doctools.js new file mode 100644 index 0000000..c3db08d --- /dev/null +++ b/branch/bicounty/_static/doctools.js @@ -0,0 +1,264 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Base JavaScript utilities for all Sphinx HTML documentation. + * + * :copyright: Copyright 2007-2022 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +const _ready = (callback) => { + if (document.readyState !== "loading") { + callback(); + } else { + document.addEventListener("DOMContentLoaded", callback); + } +}; + +/** + * highlight a given string on a node by wrapping it in + * span elements with the given class name. + */ +const _highlight = (node, addItems, text, className) => { + if (node.nodeType === Node.TEXT_NODE) { + const val = node.nodeValue; + const parent = node.parentNode; + const pos = val.toLowerCase().indexOf(text); + if ( + pos >= 0 && + !parent.classList.contains(className) && + !parent.classList.contains("nohighlight") + ) { + let span; + + const closestNode = parent.closest("body, svg, foreignObject"); + const isInSVG = closestNode && closestNode.matches("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.classList.add(className); + } + + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + parent.insertBefore( + span, + parent.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling + ) + ); + node.nodeValue = val.substr(0, pos); + + if (isInSVG) { + const rect = document.createElementNS( + "http://www.w3.org/2000/svg", + "rect" + ); + const bbox = parent.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute("class", className); + addItems.push({ parent: parent, target: rect }); + } + } + } else if (node.matches && !node.matches("button, select, textarea")) { + node.childNodes.forEach((el) => _highlight(el, addItems, text, className)); + } +}; +const _highlightText = (thisNode, text, className) => { + let addItems = []; + _highlight(thisNode, addItems, text, className); + addItems.forEach((obj) => + obj.parent.insertAdjacentElement("beforebegin", obj.target) + ); +}; + +/** + * Small JavaScript module for the documentation. + */ +const Documentation = { + init: () => { + Documentation.highlightSearchWords(); + Documentation.initDomainIndexTable(); + Documentation.initOnKeyListeners(); + }, + + /** + * i18n support + */ + TRANSLATIONS: {}, + PLURAL_EXPR: (n) => (n === 1 ? 
0 : 1), + LOCALE: "unknown", + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext: (string) => { + const translated = Documentation.TRANSLATIONS[string]; + switch (typeof translated) { + case "undefined": + return string; // no translation + case "string": + return translated; // translation exists + default: + return translated[0]; // (singular, plural) translation tuple exists + } + }, + + ngettext: (singular, plural, n) => { + const translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated !== "undefined") + return translated[Documentation.PLURAL_EXPR(n)]; + return n === 1 ? singular : plural; + }, + + addTranslations: (catalog) => { + Object.assign(Documentation.TRANSLATIONS, catalog.messages); + Documentation.PLURAL_EXPR = new Function( + "n", + `return (${catalog.plural_expr})` + ); + Documentation.LOCALE = catalog.locale; + }, + + /** + * highlight the search words provided in the url in the text + */ + highlightSearchWords: () => { + const highlight = + new URLSearchParams(window.location.search).get("highlight") || ""; + const terms = highlight.toLowerCase().split(/\s+/).filter(x => x); + if (terms.length === 0) return; // nothing to do + + // There should never be more than one element matching "div.body" + const divBody = document.querySelectorAll("div.body"); + const body = divBody.length ? divBody[0] : document.querySelector("body"); + window.setTimeout(() => { + terms.forEach((term) => _highlightText(body, term, "highlighted")); + }, 10); + + const searchBox = document.getElementById("searchbox"); + if (searchBox === null) return; + searchBox.appendChild( + document + .createRange() + .createContextualFragment( + '" + ) + ); + }, + + /** + * helper function to hide the search marks again + */ + hideSearchWords: () => { + document + .querySelectorAll("#searchbox .highlight-link") + .forEach((el) => el.remove()); + document + .querySelectorAll("span.highlighted") + .forEach((el) => el.classList.remove("highlighted")); + const url = new URL(window.location); + url.searchParams.delete("highlight"); + window.history.replaceState({}, "", url); + }, + + /** + * helper function to focus on search bar + */ + focusSearchBar: () => { + document.querySelectorAll("input[name=q]")[0]?.focus(); + }, + + /** + * Initialise the domain index toggle buttons + */ + initDomainIndexTable: () => { + const toggler = (el) => { + const idNumber = el.id.substr(7); + const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); + if (el.src.substr(-9) === "minus.png") { + el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; + toggledRows.forEach((el) => (el.style.display = "none")); + } else { + el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; + toggledRows.forEach((el) => (el.style.display = "")); + } + }; + + const togglerElements = document.querySelectorAll("img.toggler"); + togglerElements.forEach((el) => + el.addEventListener("click", (event) => toggler(event.currentTarget)) + ); + togglerElements.forEach((el) => (el.style.display = "")); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); + }, + + initOnKeyListeners: () => { + // only install a listener if it is really needed + if ( + !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && + !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS + ) + return; + + const blacklistedElements = new Set([ + "TEXTAREA", + "INPUT", + "SELECT", + "BUTTON", + ]); + document.addEventListener("keydown", (event) => { 
+ if (blacklistedElements.has(document.activeElement.tagName)) return; // bail for input elements + if (event.altKey || event.ctrlKey || event.metaKey) return; // bail with special keys + + if (!event.shiftKey) { + switch (event.key) { + case "ArrowLeft": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const prevLink = document.querySelector('link[rel="prev"]'); + if (prevLink && prevLink.href) { + window.location.href = prevLink.href; + event.preventDefault(); + } + break; + case "ArrowRight": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const nextLink = document.querySelector('link[rel="next"]'); + if (nextLink && nextLink.href) { + window.location.href = nextLink.href; + event.preventDefault(); + } + break; + case "Escape": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.hideSearchWords(); + event.preventDefault(); + } + } + + // some keyboard layouts may need Shift to get / + switch (event.key) { + case "/": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.focusSearchBar(); + event.preventDefault(); + } + }); + }, +}; + +// quick alias for translations +const _ = Documentation.gettext; + +_ready(Documentation.init); diff --git a/branch/bicounty/_static/documentation_options.js b/branch/bicounty/_static/documentation_options.js new file mode 100644 index 0000000..c066c69 --- /dev/null +++ b/branch/bicounty/_static/documentation_options.js @@ -0,0 +1,14 @@ +var DOCUMENTATION_OPTIONS = { + URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'), + VERSION: '', + LANGUAGE: 'en', + COLLAPSE_INDEX: false, + BUILDER: 'dirhtml', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false, + SHOW_SEARCH_SUMMARY: true, + ENABLE_SEARCH_SHORTCUTS: true, +}; \ No newline at end of file diff --git a/branch/bicounty/_static/file.png b/branch/bicounty/_static/file.png new file mode 100644 index 0000000..a858a41 Binary files /dev/null and b/branch/bicounty/_static/file.png differ diff --git a/branch/bicounty/_static/graphviz.css b/branch/bicounty/_static/graphviz.css new file mode 100644 index 0000000..19e7afd --- /dev/null +++ b/branch/bicounty/_static/graphviz.css @@ -0,0 +1,19 @@ +/* + * graphviz.css + * ~~~~~~~~~~~~ + * + * Sphinx stylesheet -- graphviz extension. + * + * :copyright: Copyright 2007-2022 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +img.graphviz { + border: 0; + max-width: 100%; +} + +object.graphviz { + max-width: 100%; +} diff --git a/branch/bicounty/_static/jquery-3.6.0.js b/branch/bicounty/_static/jquery-3.6.0.js new file mode 100644 index 0000000..fc6c299 --- /dev/null +++ b/branch/bicounty/_static/jquery-3.6.0.js @@ -0,0 +1,10881 @@ +/*! + * jQuery JavaScript Library v3.6.0 + * https://jquery.com/ + * + * Includes Sizzle.js + * https://sizzlejs.com/ + * + * Copyright OpenJS Foundation and other contributors + * Released under the MIT license + * https://jquery.org/license + * + * Date: 2021-03-02T17:08Z + */ +( function( global, factory ) { + + "use strict"; + + if ( typeof module === "object" && typeof module.exports === "object" ) { + + // For CommonJS and CommonJS-like environments where a proper `window` + // is present, execute the factory and get jQuery. + // For environments that do not have a `window` with a `document` + // (such as Node.js), expose a factory as module.exports. 
+ // This accentuates the need for the creation of a real `window`. + // e.g. var jQuery = require("jquery")(window); + // See ticket #14549 for more info. + module.exports = global.document ? + factory( global, true ) : + function( w ) { + if ( !w.document ) { + throw new Error( "jQuery requires a window with a document" ); + } + return factory( w ); + }; + } else { + factory( global ); + } + +// Pass this if window is not defined yet +} )( typeof window !== "undefined" ? window : this, function( window, noGlobal ) { + +// Edge <= 12 - 13+, Firefox <=18 - 45+, IE 10 - 11, Safari 5.1 - 9+, iOS 6 - 9.1 +// throw exceptions when non-strict code (e.g., ASP.NET 4.5) accesses strict mode +// arguments.callee.caller (trac-13335). But as of jQuery 3.0 (2016), strict mode should be common +// enough that all such attempts are guarded in a try block. +"use strict"; + +var arr = []; + +var getProto = Object.getPrototypeOf; + +var slice = arr.slice; + +var flat = arr.flat ? function( array ) { + return arr.flat.call( array ); +} : function( array ) { + return arr.concat.apply( [], array ); +}; + + +var push = arr.push; + +var indexOf = arr.indexOf; + +var class2type = {}; + +var toString = class2type.toString; + +var hasOwn = class2type.hasOwnProperty; + +var fnToString = hasOwn.toString; + +var ObjectFunctionString = fnToString.call( Object ); + +var support = {}; + +var isFunction = function isFunction( obj ) { + + // Support: Chrome <=57, Firefox <=52 + // In some browsers, typeof returns "function" for HTML elements + // (i.e., `typeof document.createElement( "object" ) === "function"`). + // We don't want to classify *any* DOM node as a function. + // Support: QtWeb <=3.8.5, WebKit <=534.34, wkhtmltopdf tool <=0.12.5 + // Plus for old WebKit, typeof returns "function" for HTML collections + // (e.g., `typeof document.getElementsByTagName("div") === "function"`). (gh-4756) + return typeof obj === "function" && typeof obj.nodeType !== "number" && + typeof obj.item !== "function"; + }; + + +var isWindow = function isWindow( obj ) { + return obj != null && obj === obj.window; + }; + + +var document = window.document; + + + + var preservedScriptAttributes = { + type: true, + src: true, + nonce: true, + noModule: true + }; + + function DOMEval( code, node, doc ) { + doc = doc || document; + + var i, val, + script = doc.createElement( "script" ); + + script.text = code; + if ( node ) { + for ( i in preservedScriptAttributes ) { + + // Support: Firefox 64+, Edge 18+ + // Some browsers don't support the "nonce" property on scripts. + // On the other hand, just using `getAttribute` is not enough as + // the `nonce` attribute is reset to an empty string whenever it + // becomes browsing-context connected. + // See https://github.com/whatwg/html/issues/2369 + // See https://html.spec.whatwg.org/#nonce-attributes + // The `node.getAttribute` check was added for the sake of + // `jQuery.globalEval` so that it can fake a nonce-containing node + // via an object. + val = node[ i ] || node.getAttribute && node.getAttribute( i ); + if ( val ) { + script.setAttribute( i, val ); + } + } + } + doc.head.appendChild( script ).parentNode.removeChild( script ); + } + + +function toType( obj ) { + if ( obj == null ) { + return obj + ""; + } + + // Support: Android <=2.3 only (functionish RegExp) + return typeof obj === "object" || typeof obj === "function" ? 
+ class2type[ toString.call( obj ) ] || "object" : + typeof obj; +} +/* global Symbol */ +// Defining this global in .eslintrc.json would create a danger of using the global +// unguarded in another place, it seems safer to define global only for this module + + + +var + version = "3.6.0", + + // Define a local copy of jQuery + jQuery = function( selector, context ) { + + // The jQuery object is actually just the init constructor 'enhanced' + // Need init if jQuery is called (just allow error to be thrown if not included) + return new jQuery.fn.init( selector, context ); + }; + +jQuery.fn = jQuery.prototype = { + + // The current version of jQuery being used + jquery: version, + + constructor: jQuery, + + // The default length of a jQuery object is 0 + length: 0, + + toArray: function() { + return slice.call( this ); + }, + + // Get the Nth element in the matched element set OR + // Get the whole matched element set as a clean array + get: function( num ) { + + // Return all the elements in a clean array + if ( num == null ) { + return slice.call( this ); + } + + // Return just the one element from the set + return num < 0 ? this[ num + this.length ] : this[ num ]; + }, + + // Take an array of elements and push it onto the stack + // (returning the new matched element set) + pushStack: function( elems ) { + + // Build a new jQuery matched element set + var ret = jQuery.merge( this.constructor(), elems ); + + // Add the old object onto the stack (as a reference) + ret.prevObject = this; + + // Return the newly-formed element set + return ret; + }, + + // Execute a callback for every element in the matched set. + each: function( callback ) { + return jQuery.each( this, callback ); + }, + + map: function( callback ) { + return this.pushStack( jQuery.map( this, function( elem, i ) { + return callback.call( elem, i, elem ); + } ) ); + }, + + slice: function() { + return this.pushStack( slice.apply( this, arguments ) ); + }, + + first: function() { + return this.eq( 0 ); + }, + + last: function() { + return this.eq( -1 ); + }, + + even: function() { + return this.pushStack( jQuery.grep( this, function( _elem, i ) { + return ( i + 1 ) % 2; + } ) ); + }, + + odd: function() { + return this.pushStack( jQuery.grep( this, function( _elem, i ) { + return i % 2; + } ) ); + }, + + eq: function( i ) { + var len = this.length, + j = +i + ( i < 0 ? len : 0 ); + return this.pushStack( j >= 0 && j < len ? [ this[ j ] ] : [] ); + }, + + end: function() { + return this.prevObject || this.constructor(); + }, + + // For internal use only. + // Behaves like an Array's method, not like a jQuery method. 
+ push: push, + sort: arr.sort, + splice: arr.splice +}; + +jQuery.extend = jQuery.fn.extend = function() { + var options, name, src, copy, copyIsArray, clone, + target = arguments[ 0 ] || {}, + i = 1, + length = arguments.length, + deep = false; + + // Handle a deep copy situation + if ( typeof target === "boolean" ) { + deep = target; + + // Skip the boolean and the target + target = arguments[ i ] || {}; + i++; + } + + // Handle case when target is a string or something (possible in deep copy) + if ( typeof target !== "object" && !isFunction( target ) ) { + target = {}; + } + + // Extend jQuery itself if only one argument is passed + if ( i === length ) { + target = this; + i--; + } + + for ( ; i < length; i++ ) { + + // Only deal with non-null/undefined values + if ( ( options = arguments[ i ] ) != null ) { + + // Extend the base object + for ( name in options ) { + copy = options[ name ]; + + // Prevent Object.prototype pollution + // Prevent never-ending loop + if ( name === "__proto__" || target === copy ) { + continue; + } + + // Recurse if we're merging plain objects or arrays + if ( deep && copy && ( jQuery.isPlainObject( copy ) || + ( copyIsArray = Array.isArray( copy ) ) ) ) { + src = target[ name ]; + + // Ensure proper type for the source value + if ( copyIsArray && !Array.isArray( src ) ) { + clone = []; + } else if ( !copyIsArray && !jQuery.isPlainObject( src ) ) { + clone = {}; + } else { + clone = src; + } + copyIsArray = false; + + // Never move original objects, clone them + target[ name ] = jQuery.extend( deep, clone, copy ); + + // Don't bring in undefined values + } else if ( copy !== undefined ) { + target[ name ] = copy; + } + } + } + } + + // Return the modified object + return target; +}; + +jQuery.extend( { + + // Unique for each copy of jQuery on the page + expando: "jQuery" + ( version + Math.random() ).replace( /\D/g, "" ), + + // Assume jQuery is ready without the ready module + isReady: true, + + error: function( msg ) { + throw new Error( msg ); + }, + + noop: function() {}, + + isPlainObject: function( obj ) { + var proto, Ctor; + + // Detect obvious negatives + // Use toString instead of jQuery.type to catch host objects + if ( !obj || toString.call( obj ) !== "[object Object]" ) { + return false; + } + + proto = getProto( obj ); + + // Objects with no prototype (e.g., `Object.create( null )`) are plain + if ( !proto ) { + return true; + } + + // Objects with prototype are plain iff they were constructed by a global Object function + Ctor = hasOwn.call( proto, "constructor" ) && proto.constructor; + return typeof Ctor === "function" && fnToString.call( Ctor ) === ObjectFunctionString; + }, + + isEmptyObject: function( obj ) { + var name; + + for ( name in obj ) { + return false; + } + return true; + }, + + // Evaluates a script in a provided context; falls back to the global one + // if not specified. 
+ globalEval: function( code, options, doc ) { + DOMEval( code, { nonce: options && options.nonce }, doc ); + }, + + each: function( obj, callback ) { + var length, i = 0; + + if ( isArrayLike( obj ) ) { + length = obj.length; + for ( ; i < length; i++ ) { + if ( callback.call( obj[ i ], i, obj[ i ] ) === false ) { + break; + } + } + } else { + for ( i in obj ) { + if ( callback.call( obj[ i ], i, obj[ i ] ) === false ) { + break; + } + } + } + + return obj; + }, + + // results is for internal usage only + makeArray: function( arr, results ) { + var ret = results || []; + + if ( arr != null ) { + if ( isArrayLike( Object( arr ) ) ) { + jQuery.merge( ret, + typeof arr === "string" ? + [ arr ] : arr + ); + } else { + push.call( ret, arr ); + } + } + + return ret; + }, + + inArray: function( elem, arr, i ) { + return arr == null ? -1 : indexOf.call( arr, elem, i ); + }, + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + merge: function( first, second ) { + var len = +second.length, + j = 0, + i = first.length; + + for ( ; j < len; j++ ) { + first[ i++ ] = second[ j ]; + } + + first.length = i; + + return first; + }, + + grep: function( elems, callback, invert ) { + var callbackInverse, + matches = [], + i = 0, + length = elems.length, + callbackExpect = !invert; + + // Go through the array, only saving the items + // that pass the validator function + for ( ; i < length; i++ ) { + callbackInverse = !callback( elems[ i ], i ); + if ( callbackInverse !== callbackExpect ) { + matches.push( elems[ i ] ); + } + } + + return matches; + }, + + // arg is for internal usage only + map: function( elems, callback, arg ) { + var length, value, + i = 0, + ret = []; + + // Go through the array, translating each of the items to their new values + if ( isArrayLike( elems ) ) { + length = elems.length; + for ( ; i < length; i++ ) { + value = callback( elems[ i ], i, arg ); + + if ( value != null ) { + ret.push( value ); + } + } + + // Go through every key on the object, + } else { + for ( i in elems ) { + value = callback( elems[ i ], i, arg ); + + if ( value != null ) { + ret.push( value ); + } + } + } + + // Flatten any nested arrays + return flat( ret ); + }, + + // A global GUID counter for objects + guid: 1, + + // jQuery.support is not used in Core but other projects attach their + // properties to it so it needs to exist. + support: support +} ); + +if ( typeof Symbol === "function" ) { + jQuery.fn[ Symbol.iterator ] = arr[ Symbol.iterator ]; +} + +// Populate the class2type map +jQuery.each( "Boolean Number String Function Array Date RegExp Object Error Symbol".split( " " ), + function( _i, name ) { + class2type[ "[object " + name + "]" ] = name.toLowerCase(); + } ); + +function isArrayLike( obj ) { + + // Support: real iOS 8.2 only (not reproducible in simulator) + // `in` check used to prevent JIT error (gh-2145) + // hasOwn isn't used here due to false negatives + // regarding Nodelist length in IE + var length = !!obj && "length" in obj && obj.length, + type = toType( obj ); + + if ( isFunction( obj ) || isWindow( obj ) ) { + return false; + } + + return type === "array" || length === 0 || + typeof length === "number" && length > 0 && ( length - 1 ) in obj; +} +var Sizzle = +/*! 
+ * Sizzle CSS Selector Engine v2.3.6 + * https://sizzlejs.com/ + * + * Copyright JS Foundation and other contributors + * Released under the MIT license + * https://js.foundation/ + * + * Date: 2021-02-16 + */ +( function( window ) { +var i, + support, + Expr, + getText, + isXML, + tokenize, + compile, + select, + outermostContext, + sortInput, + hasDuplicate, + + // Local document vars + setDocument, + document, + docElem, + documentIsHTML, + rbuggyQSA, + rbuggyMatches, + matches, + contains, + + // Instance-specific data + expando = "sizzle" + 1 * new Date(), + preferredDoc = window.document, + dirruns = 0, + done = 0, + classCache = createCache(), + tokenCache = createCache(), + compilerCache = createCache(), + nonnativeSelectorCache = createCache(), + sortOrder = function( a, b ) { + if ( a === b ) { + hasDuplicate = true; + } + return 0; + }, + + // Instance methods + hasOwn = ( {} ).hasOwnProperty, + arr = [], + pop = arr.pop, + pushNative = arr.push, + push = arr.push, + slice = arr.slice, + + // Use a stripped-down indexOf as it's faster than native + // https://jsperf.com/thor-indexof-vs-for/5 + indexOf = function( list, elem ) { + var i = 0, + len = list.length; + for ( ; i < len; i++ ) { + if ( list[ i ] === elem ) { + return i; + } + } + return -1; + }, + + booleans = "checked|selected|async|autofocus|autoplay|controls|defer|disabled|hidden|" + + "ismap|loop|multiple|open|readonly|required|scoped", + + // Regular expressions + + // http://www.w3.org/TR/css3-selectors/#whitespace + whitespace = "[\\x20\\t\\r\\n\\f]", + + // https://www.w3.org/TR/css-syntax-3/#ident-token-diagram + identifier = "(?:\\\\[\\da-fA-F]{1,6}" + whitespace + + "?|\\\\[^\\r\\n\\f]|[\\w-]|[^\0-\\x7f])+", + + // Attribute selectors: http://www.w3.org/TR/selectors/#attribute-selectors + attributes = "\\[" + whitespace + "*(" + identifier + ")(?:" + whitespace + + + // Operator (capture 2) + "*([*^$|!~]?=)" + whitespace + + + // "Attribute values must be CSS identifiers [capture 5] + // or strings [capture 3 or capture 4]" + "*(?:'((?:\\\\.|[^\\\\'])*)'|\"((?:\\\\.|[^\\\\\"])*)\"|(" + identifier + "))|)" + + whitespace + "*\\]", + + pseudos = ":(" + identifier + ")(?:\\((" + + + // To reduce the number of selectors needing tokenize in the preFilter, prefer arguments: + // 1. quoted (capture 3; capture 4 or capture 5) + "('((?:\\\\.|[^\\\\'])*)'|\"((?:\\\\.|[^\\\\\"])*)\")|" + + + // 2. simple (capture 6) + "((?:\\\\.|[^\\\\()[\\]]|" + attributes + ")*)|" + + + // 3. 
anything else (capture 2) + ".*" + + ")\\)|)", + + // Leading and non-escaped trailing whitespace, capturing some non-whitespace characters preceding the latter + rwhitespace = new RegExp( whitespace + "+", "g" ), + rtrim = new RegExp( "^" + whitespace + "+|((?:^|[^\\\\])(?:\\\\.)*)" + + whitespace + "+$", "g" ), + + rcomma = new RegExp( "^" + whitespace + "*," + whitespace + "*" ), + rcombinators = new RegExp( "^" + whitespace + "*([>+~]|" + whitespace + ")" + whitespace + + "*" ), + rdescend = new RegExp( whitespace + "|>" ), + + rpseudo = new RegExp( pseudos ), + ridentifier = new RegExp( "^" + identifier + "$" ), + + matchExpr = { + "ID": new RegExp( "^#(" + identifier + ")" ), + "CLASS": new RegExp( "^\\.(" + identifier + ")" ), + "TAG": new RegExp( "^(" + identifier + "|[*])" ), + "ATTR": new RegExp( "^" + attributes ), + "PSEUDO": new RegExp( "^" + pseudos ), + "CHILD": new RegExp( "^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\(" + + whitespace + "*(even|odd|(([+-]|)(\\d*)n|)" + whitespace + "*(?:([+-]|)" + + whitespace + "*(\\d+)|))" + whitespace + "*\\)|)", "i" ), + "bool": new RegExp( "^(?:" + booleans + ")$", "i" ), + + // For use in libraries implementing .is() + // We use this for POS matching in `select` + "needsContext": new RegExp( "^" + whitespace + + "*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\(" + whitespace + + "*((?:-\\d)?\\d*)" + whitespace + "*\\)|)(?=[^-]|$)", "i" ) + }, + + rhtml = /HTML$/i, + rinputs = /^(?:input|select|textarea|button)$/i, + rheader = /^h\d$/i, + + rnative = /^[^{]+\{\s*\[native \w/, + + // Easily-parseable/retrievable ID or TAG or CLASS selectors + rquickExpr = /^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/, + + rsibling = /[+~]/, + + // CSS escapes + // http://www.w3.org/TR/CSS21/syndata.html#escaped-characters + runescape = new RegExp( "\\\\[\\da-fA-F]{1,6}" + whitespace + "?|\\\\([^\\r\\n\\f])", "g" ), + funescape = function( escape, nonHex ) { + var high = "0x" + escape.slice( 1 ) - 0x10000; + + return nonHex ? + + // Strip the backslash prefix from a non-hex escape sequence + nonHex : + + // Replace a hexadecimal escape sequence with the encoded Unicode code point + // Support: IE <=11+ + // For values outside the Basic Multilingual Plane (BMP), manually construct a + // surrogate pair + high < 0 ? 
+ String.fromCharCode( high + 0x10000 ) : + String.fromCharCode( high >> 10 | 0xD800, high & 0x3FF | 0xDC00 ); + }, + + // CSS string/identifier serialization + // https://drafts.csswg.org/cssom/#common-serializing-idioms + rcssescape = /([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g, + fcssescape = function( ch, asCodePoint ) { + if ( asCodePoint ) { + + // U+0000 NULL becomes U+FFFD REPLACEMENT CHARACTER + if ( ch === "\0" ) { + return "\uFFFD"; + } + + // Control characters and (dependent upon position) numbers get escaped as code points + return ch.slice( 0, -1 ) + "\\" + + ch.charCodeAt( ch.length - 1 ).toString( 16 ) + " "; + } + + // Other potentially-special ASCII characters get backslash-escaped + return "\\" + ch; + }, + + // Used for iframes + // See setDocument() + // Removing the function wrapper causes a "Permission Denied" + // error in IE + unloadHandler = function() { + setDocument(); + }, + + inDisabledFieldset = addCombinator( + function( elem ) { + return elem.disabled === true && elem.nodeName.toLowerCase() === "fieldset"; + }, + { dir: "parentNode", next: "legend" } + ); + +// Optimize for push.apply( _, NodeList ) +try { + push.apply( + ( arr = slice.call( preferredDoc.childNodes ) ), + preferredDoc.childNodes + ); + + // Support: Android<4.0 + // Detect silently failing push.apply + // eslint-disable-next-line no-unused-expressions + arr[ preferredDoc.childNodes.length ].nodeType; +} catch ( e ) { + push = { apply: arr.length ? + + // Leverage slice if possible + function( target, els ) { + pushNative.apply( target, slice.call( els ) ); + } : + + // Support: IE<9 + // Otherwise append directly + function( target, els ) { + var j = target.length, + i = 0; + + // Can't trust NodeList.length + while ( ( target[ j++ ] = els[ i++ ] ) ) {} + target.length = j - 1; + } + }; +} + +function Sizzle( selector, context, results, seed ) { + var m, i, elem, nid, match, groups, newSelector, + newContext = context && context.ownerDocument, + + // nodeType defaults to 9, since context defaults to document + nodeType = context ? 
context.nodeType : 9; + + results = results || []; + + // Return early from calls with invalid selector or context + if ( typeof selector !== "string" || !selector || + nodeType !== 1 && nodeType !== 9 && nodeType !== 11 ) { + + return results; + } + + // Try to shortcut find operations (as opposed to filters) in HTML documents + if ( !seed ) { + setDocument( context ); + context = context || document; + + if ( documentIsHTML ) { + + // If the selector is sufficiently simple, try using a "get*By*" DOM method + // (excepting DocumentFragment context, where the methods don't exist) + if ( nodeType !== 11 && ( match = rquickExpr.exec( selector ) ) ) { + + // ID selector + if ( ( m = match[ 1 ] ) ) { + + // Document context + if ( nodeType === 9 ) { + if ( ( elem = context.getElementById( m ) ) ) { + + // Support: IE, Opera, Webkit + // TODO: identify versions + // getElementById can match elements by name instead of ID + if ( elem.id === m ) { + results.push( elem ); + return results; + } + } else { + return results; + } + + // Element context + } else { + + // Support: IE, Opera, Webkit + // TODO: identify versions + // getElementById can match elements by name instead of ID + if ( newContext && ( elem = newContext.getElementById( m ) ) && + contains( context, elem ) && + elem.id === m ) { + + results.push( elem ); + return results; + } + } + + // Type selector + } else if ( match[ 2 ] ) { + push.apply( results, context.getElementsByTagName( selector ) ); + return results; + + // Class selector + } else if ( ( m = match[ 3 ] ) && support.getElementsByClassName && + context.getElementsByClassName ) { + + push.apply( results, context.getElementsByClassName( m ) ); + return results; + } + } + + // Take advantage of querySelectorAll + if ( support.qsa && + !nonnativeSelectorCache[ selector + " " ] && + ( !rbuggyQSA || !rbuggyQSA.test( selector ) ) && + + // Support: IE 8 only + // Exclude object elements + ( nodeType !== 1 || context.nodeName.toLowerCase() !== "object" ) ) { + + newSelector = selector; + newContext = context; + + // qSA considers elements outside a scoping root when evaluating child or + // descendant combinators, which is not what we want. + // In such cases, we work around the behavior by prefixing every selector in the + // list with an ID selector referencing the scope context. + // The technique has to be used as well when a leading combinator is used + // as such selectors are not recognized by querySelectorAll. + // Thanks to Andrew Dupont for this technique. + if ( nodeType === 1 && + ( rdescend.test( selector ) || rcombinators.test( selector ) ) ) { + + // Expand context for sibling selectors + newContext = rsibling.test( selector ) && testContext( context.parentNode ) || + context; + + // We can use :scope instead of the ID hack if the browser + // supports it & if we're not changing the context. + if ( newContext !== context || !support.scope ) { + + // Capture the context ID, setting it first if necessary + if ( ( nid = context.getAttribute( "id" ) ) ) { + nid = nid.replace( rcssescape, fcssescape ); + } else { + context.setAttribute( "id", ( nid = expando ) ); + } + } + + // Prefix every selector in the list + groups = tokenize( selector ); + i = groups.length; + while ( i-- ) { + groups[ i ] = ( nid ? 
"#" + nid : ":scope" ) + " " + + toSelector( groups[ i ] ); + } + newSelector = groups.join( "," ); + } + + try { + push.apply( results, + newContext.querySelectorAll( newSelector ) + ); + return results; + } catch ( qsaError ) { + nonnativeSelectorCache( selector, true ); + } finally { + if ( nid === expando ) { + context.removeAttribute( "id" ); + } + } + } + } + } + + // All others + return select( selector.replace( rtrim, "$1" ), context, results, seed ); +} + +/** + * Create key-value caches of limited size + * @returns {function(string, object)} Returns the Object data after storing it on itself with + * property name the (space-suffixed) string and (if the cache is larger than Expr.cacheLength) + * deleting the oldest entry + */ +function createCache() { + var keys = []; + + function cache( key, value ) { + + // Use (key + " ") to avoid collision with native prototype properties (see Issue #157) + if ( keys.push( key + " " ) > Expr.cacheLength ) { + + // Only keep the most recent entries + delete cache[ keys.shift() ]; + } + return ( cache[ key + " " ] = value ); + } + return cache; +} + +/** + * Mark a function for special use by Sizzle + * @param {Function} fn The function to mark + */ +function markFunction( fn ) { + fn[ expando ] = true; + return fn; +} + +/** + * Support testing using an element + * @param {Function} fn Passed the created element and returns a boolean result + */ +function assert( fn ) { + var el = document.createElement( "fieldset" ); + + try { + return !!fn( el ); + } catch ( e ) { + return false; + } finally { + + // Remove from its parent by default + if ( el.parentNode ) { + el.parentNode.removeChild( el ); + } + + // release memory in IE + el = null; + } +} + +/** + * Adds the same handler for all of the specified attrs + * @param {String} attrs Pipe-separated list of attributes + * @param {Function} handler The method that will be applied + */ +function addHandle( attrs, handler ) { + var arr = attrs.split( "|" ), + i = arr.length; + + while ( i-- ) { + Expr.attrHandle[ arr[ i ] ] = handler; + } +} + +/** + * Checks document order of two siblings + * @param {Element} a + * @param {Element} b + * @returns {Number} Returns less than 0 if a precedes b, greater than 0 if a follows b + */ +function siblingCheck( a, b ) { + var cur = b && a, + diff = cur && a.nodeType === 1 && b.nodeType === 1 && + a.sourceIndex - b.sourceIndex; + + // Use IE sourceIndex if available on both nodes + if ( diff ) { + return diff; + } + + // Check if b follows a + if ( cur ) { + while ( ( cur = cur.nextSibling ) ) { + if ( cur === b ) { + return -1; + } + } + } + + return a ? 
1 : -1; +} + +/** + * Returns a function to use in pseudos for input types + * @param {String} type + */ +function createInputPseudo( type ) { + return function( elem ) { + var name = elem.nodeName.toLowerCase(); + return name === "input" && elem.type === type; + }; +} + +/** + * Returns a function to use in pseudos for buttons + * @param {String} type + */ +function createButtonPseudo( type ) { + return function( elem ) { + var name = elem.nodeName.toLowerCase(); + return ( name === "input" || name === "button" ) && elem.type === type; + }; +} + +/** + * Returns a function to use in pseudos for :enabled/:disabled + * @param {Boolean} disabled true for :disabled; false for :enabled + */ +function createDisabledPseudo( disabled ) { + + // Known :disabled false positives: fieldset[disabled] > legend:nth-of-type(n+2) :can-disable + return function( elem ) { + + // Only certain elements can match :enabled or :disabled + // https://html.spec.whatwg.org/multipage/scripting.html#selector-enabled + // https://html.spec.whatwg.org/multipage/scripting.html#selector-disabled + if ( "form" in elem ) { + + // Check for inherited disabledness on relevant non-disabled elements: + // * listed form-associated elements in a disabled fieldset + // https://html.spec.whatwg.org/multipage/forms.html#category-listed + // https://html.spec.whatwg.org/multipage/forms.html#concept-fe-disabled + // * option elements in a disabled optgroup + // https://html.spec.whatwg.org/multipage/forms.html#concept-option-disabled + // All such elements have a "form" property. + if ( elem.parentNode && elem.disabled === false ) { + + // Option elements defer to a parent optgroup if present + if ( "label" in elem ) { + if ( "label" in elem.parentNode ) { + return elem.parentNode.disabled === disabled; + } else { + return elem.disabled === disabled; + } + } + + // Support: IE 6 - 11 + // Use the isDisabled shortcut property to check for disabled fieldset ancestors + return elem.isDisabled === disabled || + + // Where there is no isDisabled, check manually + /* jshint -W018 */ + elem.isDisabled !== !disabled && + inDisabledFieldset( elem ) === disabled; + } + + return elem.disabled === disabled; + + // Try to winnow out elements that can't be disabled before trusting the disabled property. + // Some victims get caught in our net (label, legend, menu, track), but it shouldn't + // even exist on them, let alone have a boolean value. 
+ } else if ( "label" in elem ) { + return elem.disabled === disabled; + } + + // Remaining elements are neither :enabled nor :disabled + return false; + }; +} + +/** + * Returns a function to use in pseudos for positionals + * @param {Function} fn + */ +function createPositionalPseudo( fn ) { + return markFunction( function( argument ) { + argument = +argument; + return markFunction( function( seed, matches ) { + var j, + matchIndexes = fn( [], seed.length, argument ), + i = matchIndexes.length; + + // Match elements found at the specified indexes + while ( i-- ) { + if ( seed[ ( j = matchIndexes[ i ] ) ] ) { + seed[ j ] = !( matches[ j ] = seed[ j ] ); + } + } + } ); + } ); +} + +/** + * Checks a node for validity as a Sizzle context + * @param {Element|Object=} context + * @returns {Element|Object|Boolean} The input node if acceptable, otherwise a falsy value + */ +function testContext( context ) { + return context && typeof context.getElementsByTagName !== "undefined" && context; +} + +// Expose support vars for convenience +support = Sizzle.support = {}; + +/** + * Detects XML nodes + * @param {Element|Object} elem An element or a document + * @returns {Boolean} True iff elem is a non-HTML XML node + */ +isXML = Sizzle.isXML = function( elem ) { + var namespace = elem && elem.namespaceURI, + docElem = elem && ( elem.ownerDocument || elem ).documentElement; + + // Support: IE <=8 + // Assume HTML when documentElement doesn't yet exist, such as inside loading iframes + // https://bugs.jquery.com/ticket/4833 + return !rhtml.test( namespace || docElem && docElem.nodeName || "HTML" ); +}; + +/** + * Sets document-related variables once based on the current document + * @param {Element|Object} [doc] An element or document object to use to set the document + * @returns {Object} Returns the current document + */ +setDocument = Sizzle.setDocument = function( node ) { + var hasCompare, subWindow, + doc = node ? node.ownerDocument || node : preferredDoc; + + // Return early if doc is invalid or already selected + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( doc == document || doc.nodeType !== 9 || !doc.documentElement ) { + return document; + } + + // Update global variables + document = doc; + docElem = document.documentElement; + documentIsHTML = !isXML( document ); + + // Support: IE 9 - 11+, Edge 12 - 18+ + // Accessing iframe documents after unload throws "permission denied" errors (jQuery #13936) + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( preferredDoc != document && + ( subWindow = document.defaultView ) && subWindow.top !== subWindow ) { + + // Support: IE 11, Edge + if ( subWindow.addEventListener ) { + subWindow.addEventListener( "unload", unloadHandler, false ); + + // Support: IE 9 - 10 only + } else if ( subWindow.attachEvent ) { + subWindow.attachEvent( "onunload", unloadHandler ); + } + } + + // Support: IE 8 - 11+, Edge 12 - 18+, Chrome <=16 - 25 only, Firefox <=3.6 - 31 only, + // Safari 4 - 5 only, Opera <=11.6 - 12.x only + // IE/Edge & older browsers don't support the :scope pseudo-class. + // Support: Safari 6.0 only + // Safari 6.0 supports :scope but it's an alias of :root there. 
+ support.scope = assert( function( el ) { + docElem.appendChild( el ).appendChild( document.createElement( "div" ) ); + return typeof el.querySelectorAll !== "undefined" && + !el.querySelectorAll( ":scope fieldset div" ).length; + } ); + + /* Attributes + ---------------------------------------------------------------------- */ + + // Support: IE<8 + // Verify that getAttribute really returns attributes and not properties + // (excepting IE8 booleans) + support.attributes = assert( function( el ) { + el.className = "i"; + return !el.getAttribute( "className" ); + } ); + + /* getElement(s)By* + ---------------------------------------------------------------------- */ + + // Check if getElementsByTagName("*") returns only elements + support.getElementsByTagName = assert( function( el ) { + el.appendChild( document.createComment( "" ) ); + return !el.getElementsByTagName( "*" ).length; + } ); + + // Support: IE<9 + support.getElementsByClassName = rnative.test( document.getElementsByClassName ); + + // Support: IE<10 + // Check if getElementById returns elements by name + // The broken getElementById methods don't pick up programmatically-set names, + // so use a roundabout getElementsByName test + support.getById = assert( function( el ) { + docElem.appendChild( el ).id = expando; + return !document.getElementsByName || !document.getElementsByName( expando ).length; + } ); + + // ID filter and find + if ( support.getById ) { + Expr.filter[ "ID" ] = function( id ) { + var attrId = id.replace( runescape, funescape ); + return function( elem ) { + return elem.getAttribute( "id" ) === attrId; + }; + }; + Expr.find[ "ID" ] = function( id, context ) { + if ( typeof context.getElementById !== "undefined" && documentIsHTML ) { + var elem = context.getElementById( id ); + return elem ? [ elem ] : []; + } + }; + } else { + Expr.filter[ "ID" ] = function( id ) { + var attrId = id.replace( runescape, funescape ); + return function( elem ) { + var node = typeof elem.getAttributeNode !== "undefined" && + elem.getAttributeNode( "id" ); + return node && node.value === attrId; + }; + }; + + // Support: IE 6 - 7 only + // getElementById is not reliable as a find shortcut + Expr.find[ "ID" ] = function( id, context ) { + if ( typeof context.getElementById !== "undefined" && documentIsHTML ) { + var node, i, elems, + elem = context.getElementById( id ); + + if ( elem ) { + + // Verify the id attribute + node = elem.getAttributeNode( "id" ); + if ( node && node.value === id ) { + return [ elem ]; + } + + // Fall back on getElementsByName + elems = context.getElementsByName( id ); + i = 0; + while ( ( elem = elems[ i++ ] ) ) { + node = elem.getAttributeNode( "id" ); + if ( node && node.value === id ) { + return [ elem ]; + } + } + } + + return []; + } + }; + } + + // Tag + Expr.find[ "TAG" ] = support.getElementsByTagName ? 
+ function( tag, context ) { + if ( typeof context.getElementsByTagName !== "undefined" ) { + return context.getElementsByTagName( tag ); + + // DocumentFragment nodes don't have gEBTN + } else if ( support.qsa ) { + return context.querySelectorAll( tag ); + } + } : + + function( tag, context ) { + var elem, + tmp = [], + i = 0, + + // By happy coincidence, a (broken) gEBTN appears on DocumentFragment nodes too + results = context.getElementsByTagName( tag ); + + // Filter out possible comments + if ( tag === "*" ) { + while ( ( elem = results[ i++ ] ) ) { + if ( elem.nodeType === 1 ) { + tmp.push( elem ); + } + } + + return tmp; + } + return results; + }; + + // Class + Expr.find[ "CLASS" ] = support.getElementsByClassName && function( className, context ) { + if ( typeof context.getElementsByClassName !== "undefined" && documentIsHTML ) { + return context.getElementsByClassName( className ); + } + }; + + /* QSA/matchesSelector + ---------------------------------------------------------------------- */ + + // QSA and matchesSelector support + + // matchesSelector(:active) reports false when true (IE9/Opera 11.5) + rbuggyMatches = []; + + // qSa(:focus) reports false when true (Chrome 21) + // We allow this because of a bug in IE8/9 that throws an error + // whenever `document.activeElement` is accessed on an iframe + // So, we allow :focus to pass through QSA all the time to avoid the IE error + // See https://bugs.jquery.com/ticket/13378 + rbuggyQSA = []; + + if ( ( support.qsa = rnative.test( document.querySelectorAll ) ) ) { + + // Build QSA regex + // Regex strategy adopted from Diego Perini + assert( function( el ) { + + var input; + + // Select is set to empty string on purpose + // This is to test IE's treatment of not explicitly + // setting a boolean content attribute, + // since its presence should be enough + // https://bugs.jquery.com/ticket/12359 + docElem.appendChild( el ).innerHTML = "" + + ""; + + // Support: IE8, Opera 11-12.16 + // Nothing should be selected when empty strings follow ^= or $= or *= + // The test attribute must be unknown in Opera but "safe" for WinRT + // https://msdn.microsoft.com/en-us/library/ie/hh465388.aspx#attribute_section + if ( el.querySelectorAll( "[msallowcapture^='']" ).length ) { + rbuggyQSA.push( "[*^$]=" + whitespace + "*(?:''|\"\")" ); + } + + // Support: IE8 + // Boolean attributes and "value" are not treated correctly + if ( !el.querySelectorAll( "[selected]" ).length ) { + rbuggyQSA.push( "\\[" + whitespace + "*(?:value|" + booleans + ")" ); + } + + // Support: Chrome<29, Android<4.4, Safari<7.0+, iOS<7.0+, PhantomJS<1.9.8+ + if ( !el.querySelectorAll( "[id~=" + expando + "-]" ).length ) { + rbuggyQSA.push( "~=" ); + } + + // Support: IE 11+, Edge 15 - 18+ + // IE 11/Edge don't find elements on a `[name='']` query in some cases. + // Adding a temporary attribute to the document before the selection works + // around the issue. + // Interestingly, IE 10 & older don't seem to have the issue. 
+ input = document.createElement( "input" ); + input.setAttribute( "name", "" ); + el.appendChild( input ); + if ( !el.querySelectorAll( "[name='']" ).length ) { + rbuggyQSA.push( "\\[" + whitespace + "*name" + whitespace + "*=" + + whitespace + "*(?:''|\"\")" ); + } + + // Webkit/Opera - :checked should return selected option elements + // http://www.w3.org/TR/2011/REC-css3-selectors-20110929/#checked + // IE8 throws error here and will not see later tests + if ( !el.querySelectorAll( ":checked" ).length ) { + rbuggyQSA.push( ":checked" ); + } + + // Support: Safari 8+, iOS 8+ + // https://bugs.webkit.org/show_bug.cgi?id=136851 + // In-page `selector#id sibling-combinator selector` fails + if ( !el.querySelectorAll( "a#" + expando + "+*" ).length ) { + rbuggyQSA.push( ".#.+[+~]" ); + } + + // Support: Firefox <=3.6 - 5 only + // Old Firefox doesn't throw on a badly-escaped identifier. + el.querySelectorAll( "\\\f" ); + rbuggyQSA.push( "[\\r\\n\\f]" ); + } ); + + assert( function( el ) { + el.innerHTML = "" + + ""; + + // Support: Windows 8 Native Apps + // The type and name attributes are restricted during .innerHTML assignment + var input = document.createElement( "input" ); + input.setAttribute( "type", "hidden" ); + el.appendChild( input ).setAttribute( "name", "D" ); + + // Support: IE8 + // Enforce case-sensitivity of name attribute + if ( el.querySelectorAll( "[name=d]" ).length ) { + rbuggyQSA.push( "name" + whitespace + "*[*^$|!~]?=" ); + } + + // FF 3.5 - :enabled/:disabled and hidden elements (hidden elements are still enabled) + // IE8 throws error here and will not see later tests + if ( el.querySelectorAll( ":enabled" ).length !== 2 ) { + rbuggyQSA.push( ":enabled", ":disabled" ); + } + + // Support: IE9-11+ + // IE's :disabled selector does not pick up the children of disabled fieldsets + docElem.appendChild( el ).disabled = true; + if ( el.querySelectorAll( ":disabled" ).length !== 2 ) { + rbuggyQSA.push( ":enabled", ":disabled" ); + } + + // Support: Opera 10 - 11 only + // Opera 10-11 does not throw on post-comma invalid pseudos + el.querySelectorAll( "*,:x" ); + rbuggyQSA.push( ",.*:" ); + } ); + } + + if ( ( support.matchesSelector = rnative.test( ( matches = docElem.matches || + docElem.webkitMatchesSelector || + docElem.mozMatchesSelector || + docElem.oMatchesSelector || + docElem.msMatchesSelector ) ) ) ) { + + assert( function( el ) { + + // Check to see if it's possible to do matchesSelector + // on a disconnected node (IE 9) + support.disconnectedMatch = matches.call( el, "*" ); + + // This should fail with an exception + // Gecko does not error, returns false instead + matches.call( el, "[s!='']:x" ); + rbuggyMatches.push( "!=", pseudos ); + } ); + } + + rbuggyQSA = rbuggyQSA.length && new RegExp( rbuggyQSA.join( "|" ) ); + rbuggyMatches = rbuggyMatches.length && new RegExp( rbuggyMatches.join( "|" ) ); + + /* Contains + ---------------------------------------------------------------------- */ + hasCompare = rnative.test( docElem.compareDocumentPosition ); + + // Element contains another + // Purposefully self-exclusive + // As in, an element does not contain itself + contains = hasCompare || rnative.test( docElem.contains ) ? + function( a, b ) { + var adown = a.nodeType === 9 ? a.documentElement : a, + bup = b && b.parentNode; + return a === bup || !!( bup && bup.nodeType === 1 && ( + adown.contains ? 
+ adown.contains( bup ) : + a.compareDocumentPosition && a.compareDocumentPosition( bup ) & 16 + ) ); + } : + function( a, b ) { + if ( b ) { + while ( ( b = b.parentNode ) ) { + if ( b === a ) { + return true; + } + } + } + return false; + }; + + /* Sorting + ---------------------------------------------------------------------- */ + + // Document order sorting + sortOrder = hasCompare ? + function( a, b ) { + + // Flag for duplicate removal + if ( a === b ) { + hasDuplicate = true; + return 0; + } + + // Sort on method existence if only one input has compareDocumentPosition + var compare = !a.compareDocumentPosition - !b.compareDocumentPosition; + if ( compare ) { + return compare; + } + + // Calculate position if both inputs belong to the same document + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + compare = ( a.ownerDocument || a ) == ( b.ownerDocument || b ) ? + a.compareDocumentPosition( b ) : + + // Otherwise we know they are disconnected + 1; + + // Disconnected nodes + if ( compare & 1 || + ( !support.sortDetached && b.compareDocumentPosition( a ) === compare ) ) { + + // Choose the first element that is related to our preferred document + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( a == document || a.ownerDocument == preferredDoc && + contains( preferredDoc, a ) ) { + return -1; + } + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( b == document || b.ownerDocument == preferredDoc && + contains( preferredDoc, b ) ) { + return 1; + } + + // Maintain original order + return sortInput ? + ( indexOf( sortInput, a ) - indexOf( sortInput, b ) ) : + 0; + } + + return compare & 4 ? -1 : 1; + } : + function( a, b ) { + + // Exit early if the nodes are identical + if ( a === b ) { + hasDuplicate = true; + return 0; + } + + var cur, + i = 0, + aup = a.parentNode, + bup = b.parentNode, + ap = [ a ], + bp = [ b ]; + + // Parentless nodes are either documents or disconnected + if ( !aup || !bup ) { + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + /* eslint-disable eqeqeq */ + return a == document ? -1 : + b == document ? 1 : + /* eslint-enable eqeqeq */ + aup ? -1 : + bup ? 1 : + sortInput ? + ( indexOf( sortInput, a ) - indexOf( sortInput, b ) ) : + 0; + + // If the nodes are siblings, we can do a quick check + } else if ( aup === bup ) { + return siblingCheck( a, b ); + } + + // Otherwise we need full lists of their ancestors for comparison + cur = a; + while ( ( cur = cur.parentNode ) ) { + ap.unshift( cur ); + } + cur = b; + while ( ( cur = cur.parentNode ) ) { + bp.unshift( cur ); + } + + // Walk down the tree looking for a discrepancy + while ( ap[ i ] === bp[ i ] ) { + i++; + } + + return i ? + + // Do a sibling check if the nodes have a common ancestor + siblingCheck( ap[ i ], bp[ i ] ) : + + // Otherwise nodes in our document sort first + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. 
+ /* eslint-disable eqeqeq */ + ap[ i ] == preferredDoc ? -1 : + bp[ i ] == preferredDoc ? 1 : + /* eslint-enable eqeqeq */ + 0; + }; + + return document; +}; + +Sizzle.matches = function( expr, elements ) { + return Sizzle( expr, null, null, elements ); +}; + +Sizzle.matchesSelector = function( elem, expr ) { + setDocument( elem ); + + if ( support.matchesSelector && documentIsHTML && + !nonnativeSelectorCache[ expr + " " ] && + ( !rbuggyMatches || !rbuggyMatches.test( expr ) ) && + ( !rbuggyQSA || !rbuggyQSA.test( expr ) ) ) { + + try { + var ret = matches.call( elem, expr ); + + // IE 9's matchesSelector returns false on disconnected nodes + if ( ret || support.disconnectedMatch || + + // As well, disconnected nodes are said to be in a document + // fragment in IE 9 + elem.document && elem.document.nodeType !== 11 ) { + return ret; + } + } catch ( e ) { + nonnativeSelectorCache( expr, true ); + } + } + + return Sizzle( expr, document, null, [ elem ] ).length > 0; +}; + +Sizzle.contains = function( context, elem ) { + + // Set document vars if needed + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( ( context.ownerDocument || context ) != document ) { + setDocument( context ); + } + return contains( context, elem ); +}; + +Sizzle.attr = function( elem, name ) { + + // Set document vars if needed + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( ( elem.ownerDocument || elem ) != document ) { + setDocument( elem ); + } + + var fn = Expr.attrHandle[ name.toLowerCase() ], + + // Don't get fooled by Object.prototype properties (jQuery #13807) + val = fn && hasOwn.call( Expr.attrHandle, name.toLowerCase() ) ? + fn( elem, name, !documentIsHTML ) : + undefined; + + return val !== undefined ? + val : + support.attributes || !documentIsHTML ? + elem.getAttribute( name ) : + ( val = elem.getAttributeNode( name ) ) && val.specified ? 
+ val.value : + null; +}; + +Sizzle.escape = function( sel ) { + return ( sel + "" ).replace( rcssescape, fcssescape ); +}; + +Sizzle.error = function( msg ) { + throw new Error( "Syntax error, unrecognized expression: " + msg ); +}; + +/** + * Document sorting and removing duplicates + * @param {ArrayLike} results + */ +Sizzle.uniqueSort = function( results ) { + var elem, + duplicates = [], + j = 0, + i = 0; + + // Unless we *know* we can detect duplicates, assume their presence + hasDuplicate = !support.detectDuplicates; + sortInput = !support.sortStable && results.slice( 0 ); + results.sort( sortOrder ); + + if ( hasDuplicate ) { + while ( ( elem = results[ i++ ] ) ) { + if ( elem === results[ i ] ) { + j = duplicates.push( i ); + } + } + while ( j-- ) { + results.splice( duplicates[ j ], 1 ); + } + } + + // Clear input after sorting to release objects + // See https://github.com/jquery/sizzle/pull/225 + sortInput = null; + + return results; +}; + +/** + * Utility function for retrieving the text value of an array of DOM nodes + * @param {Array|Element} elem + */ +getText = Sizzle.getText = function( elem ) { + var node, + ret = "", + i = 0, + nodeType = elem.nodeType; + + if ( !nodeType ) { + + // If no nodeType, this is expected to be an array + while ( ( node = elem[ i++ ] ) ) { + + // Do not traverse comment nodes + ret += getText( node ); + } + } else if ( nodeType === 1 || nodeType === 9 || nodeType === 11 ) { + + // Use textContent for elements + // innerText usage removed for consistency of new lines (jQuery #11153) + if ( typeof elem.textContent === "string" ) { + return elem.textContent; + } else { + + // Traverse its children + for ( elem = elem.firstChild; elem; elem = elem.nextSibling ) { + ret += getText( elem ); + } + } + } else if ( nodeType === 3 || nodeType === 4 ) { + return elem.nodeValue; + } + + // Do not include comment or processing instruction nodes + + return ret; +}; + +Expr = Sizzle.selectors = { + + // Can be adjusted by the user + cacheLength: 50, + + createPseudo: markFunction, + + match: matchExpr, + + attrHandle: {}, + + find: {}, + + relative: { + ">": { dir: "parentNode", first: true }, + " ": { dir: "parentNode" }, + "+": { dir: "previousSibling", first: true }, + "~": { dir: "previousSibling" } + }, + + preFilter: { + "ATTR": function( match ) { + match[ 1 ] = match[ 1 ].replace( runescape, funescape ); + + // Move the given value to match[3] whether quoted or unquoted + match[ 3 ] = ( match[ 3 ] || match[ 4 ] || + match[ 5 ] || "" ).replace( runescape, funescape ); + + if ( match[ 2 ] === "~=" ) { + match[ 3 ] = " " + match[ 3 ] + " "; + } + + return match.slice( 0, 4 ); + }, + + "CHILD": function( match ) { + + /* matches from matchExpr["CHILD"] + 1 type (only|nth|...) + 2 what (child|of-type) + 3 argument (even|odd|\d*|\d*n([+-]\d+)?|...) + 4 xn-component of xn+y argument ([+-]?\d*n|) + 5 sign of xn-component + 6 x of xn-component + 7 sign of y-component + 8 y of y-component + */ + match[ 1 ] = match[ 1 ].toLowerCase(); + + if ( match[ 1 ].slice( 0, 3 ) === "nth" ) { + + // nth-* requires argument + if ( !match[ 3 ] ) { + Sizzle.error( match[ 0 ] ); + } + + // numeric x and y parameters for Expr.filter.CHILD + // remember that false/true cast respectively to 0/1 + match[ 4 ] = +( match[ 4 ] ? 
+ match[ 5 ] + ( match[ 6 ] || 1 ) : + 2 * ( match[ 3 ] === "even" || match[ 3 ] === "odd" ) ); + match[ 5 ] = +( ( match[ 7 ] + match[ 8 ] ) || match[ 3 ] === "odd" ); + + // other types prohibit arguments + } else if ( match[ 3 ] ) { + Sizzle.error( match[ 0 ] ); + } + + return match; + }, + + "PSEUDO": function( match ) { + var excess, + unquoted = !match[ 6 ] && match[ 2 ]; + + if ( matchExpr[ "CHILD" ].test( match[ 0 ] ) ) { + return null; + } + + // Accept quoted arguments as-is + if ( match[ 3 ] ) { + match[ 2 ] = match[ 4 ] || match[ 5 ] || ""; + + // Strip excess characters from unquoted arguments + } else if ( unquoted && rpseudo.test( unquoted ) && + + // Get excess from tokenize (recursively) + ( excess = tokenize( unquoted, true ) ) && + + // advance to the next closing parenthesis + ( excess = unquoted.indexOf( ")", unquoted.length - excess ) - unquoted.length ) ) { + + // excess is a negative index + match[ 0 ] = match[ 0 ].slice( 0, excess ); + match[ 2 ] = unquoted.slice( 0, excess ); + } + + // Return only captures needed by the pseudo filter method (type and argument) + return match.slice( 0, 3 ); + } + }, + + filter: { + + "TAG": function( nodeNameSelector ) { + var nodeName = nodeNameSelector.replace( runescape, funescape ).toLowerCase(); + return nodeNameSelector === "*" ? + function() { + return true; + } : + function( elem ) { + return elem.nodeName && elem.nodeName.toLowerCase() === nodeName; + }; + }, + + "CLASS": function( className ) { + var pattern = classCache[ className + " " ]; + + return pattern || + ( pattern = new RegExp( "(^|" + whitespace + + ")" + className + "(" + whitespace + "|$)" ) ) && classCache( + className, function( elem ) { + return pattern.test( + typeof elem.className === "string" && elem.className || + typeof elem.getAttribute !== "undefined" && + elem.getAttribute( "class" ) || + "" + ); + } ); + }, + + "ATTR": function( name, operator, check ) { + return function( elem ) { + var result = Sizzle.attr( elem, name ); + + if ( result == null ) { + return operator === "!="; + } + if ( !operator ) { + return true; + } + + result += ""; + + /* eslint-disable max-len */ + + return operator === "=" ? result === check : + operator === "!=" ? result !== check : + operator === "^=" ? check && result.indexOf( check ) === 0 : + operator === "*=" ? check && result.indexOf( check ) > -1 : + operator === "$=" ? check && result.slice( -check.length ) === check : + operator === "~=" ? ( " " + result.replace( rwhitespace, " " ) + " " ).indexOf( check ) > -1 : + operator === "|=" ? result === check || result.slice( 0, check.length + 1 ) === check + "-" : + false; + /* eslint-enable max-len */ + + }; + }, + + "CHILD": function( type, what, _argument, first, last ) { + var simple = type.slice( 0, 3 ) !== "nth", + forward = type.slice( -4 ) !== "last", + ofType = what === "of-type"; + + return first === 1 && last === 0 ? + + // Shortcut for :nth-*(n) + function( elem ) { + return !!elem.parentNode; + } : + + function( elem, _context, xml ) { + var cache, uniqueCache, outerCache, node, nodeIndex, start, + dir = simple !== forward ? "nextSibling" : "previousSibling", + parent = elem.parentNode, + name = ofType && elem.nodeName.toLowerCase(), + useCache = !xml && !ofType, + diff = false; + + if ( parent ) { + + // :(first|last|only)-(child|of-type) + if ( simple ) { + while ( dir ) { + node = elem; + while ( ( node = node[ dir ] ) ) { + if ( ofType ? 
+ node.nodeName.toLowerCase() === name : + node.nodeType === 1 ) { + + return false; + } + } + + // Reverse direction for :only-* (if we haven't yet done so) + start = dir = type === "only" && !start && "nextSibling"; + } + return true; + } + + start = [ forward ? parent.firstChild : parent.lastChild ]; + + // non-xml :nth-child(...) stores cache data on `parent` + if ( forward && useCache ) { + + // Seek `elem` from a previously-cached index + + // ...in a gzip-friendly way + node = parent; + outerCache = node[ expando ] || ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + cache = uniqueCache[ type ] || []; + nodeIndex = cache[ 0 ] === dirruns && cache[ 1 ]; + diff = nodeIndex && cache[ 2 ]; + node = nodeIndex && parent.childNodes[ nodeIndex ]; + + while ( ( node = ++nodeIndex && node && node[ dir ] || + + // Fallback to seeking `elem` from the start + ( diff = nodeIndex = 0 ) || start.pop() ) ) { + + // When found, cache indexes on `parent` and break + if ( node.nodeType === 1 && ++diff && node === elem ) { + uniqueCache[ type ] = [ dirruns, nodeIndex, diff ]; + break; + } + } + + } else { + + // Use previously-cached element index if available + if ( useCache ) { + + // ...in a gzip-friendly way + node = elem; + outerCache = node[ expando ] || ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + cache = uniqueCache[ type ] || []; + nodeIndex = cache[ 0 ] === dirruns && cache[ 1 ]; + diff = nodeIndex; + } + + // xml :nth-child(...) + // or :nth-last-child(...) or :nth(-last)?-of-type(...) + if ( diff === false ) { + + // Use the same loop as above to seek `elem` from the start + while ( ( node = ++nodeIndex && node && node[ dir ] || + ( diff = nodeIndex = 0 ) || start.pop() ) ) { + + if ( ( ofType ? + node.nodeName.toLowerCase() === name : + node.nodeType === 1 ) && + ++diff ) { + + // Cache the index of each encountered element + if ( useCache ) { + outerCache = node[ expando ] || + ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + uniqueCache[ type ] = [ dirruns, diff ]; + } + + if ( node === elem ) { + break; + } + } + } + } + } + + // Incorporate the offset, then check against cycle size + diff -= last; + return diff === first || ( diff % first === 0 && diff / first >= 0 ); + } + }; + }, + + "PSEUDO": function( pseudo, argument ) { + + // pseudo-class names are case-insensitive + // http://www.w3.org/TR/selectors/#pseudo-classes + // Prioritize by case sensitivity in case custom pseudos are added with uppercase letters + // Remember that setFilters inherits from pseudos + var args, + fn = Expr.pseudos[ pseudo ] || Expr.setFilters[ pseudo.toLowerCase() ] || + Sizzle.error( "unsupported pseudo: " + pseudo ); + + // The user may use createPseudo to indicate that + // arguments are needed to create the filter function + // just as Sizzle does + if ( fn[ expando ] ) { + return fn( argument ); + } + + // But maintain support for old signatures + if ( fn.length > 1 ) { + args = [ pseudo, pseudo, "", argument ]; + return Expr.setFilters.hasOwnProperty( pseudo.toLowerCase() ) ? 
+ markFunction( function( seed, matches ) { + var idx, + matched = fn( seed, argument ), + i = matched.length; + while ( i-- ) { + idx = indexOf( seed, matched[ i ] ); + seed[ idx ] = !( matches[ idx ] = matched[ i ] ); + } + } ) : + function( elem ) { + return fn( elem, 0, args ); + }; + } + + return fn; + } + }, + + pseudos: { + + // Potentially complex pseudos + "not": markFunction( function( selector ) { + + // Trim the selector passed to compile + // to avoid treating leading and trailing + // spaces as combinators + var input = [], + results = [], + matcher = compile( selector.replace( rtrim, "$1" ) ); + + return matcher[ expando ] ? + markFunction( function( seed, matches, _context, xml ) { + var elem, + unmatched = matcher( seed, null, xml, [] ), + i = seed.length; + + // Match elements unmatched by `matcher` + while ( i-- ) { + if ( ( elem = unmatched[ i ] ) ) { + seed[ i ] = !( matches[ i ] = elem ); + } + } + } ) : + function( elem, _context, xml ) { + input[ 0 ] = elem; + matcher( input, null, xml, results ); + + // Don't keep the element (issue #299) + input[ 0 ] = null; + return !results.pop(); + }; + } ), + + "has": markFunction( function( selector ) { + return function( elem ) { + return Sizzle( selector, elem ).length > 0; + }; + } ), + + "contains": markFunction( function( text ) { + text = text.replace( runescape, funescape ); + return function( elem ) { + return ( elem.textContent || getText( elem ) ).indexOf( text ) > -1; + }; + } ), + + // "Whether an element is represented by a :lang() selector + // is based solely on the element's language value + // being equal to the identifier C, + // or beginning with the identifier C immediately followed by "-". + // The matching of C against the element's language value is performed case-insensitively. + // The identifier C does not have to be a valid language name." + // http://www.w3.org/TR/selectors/#lang-pseudo + "lang": markFunction( function( lang ) { + + // lang value must be a valid identifier + if ( !ridentifier.test( lang || "" ) ) { + Sizzle.error( "unsupported lang: " + lang ); + } + lang = lang.replace( runescape, funescape ).toLowerCase(); + return function( elem ) { + var elemLang; + do { + if ( ( elemLang = documentIsHTML ? 
+ elem.lang : + elem.getAttribute( "xml:lang" ) || elem.getAttribute( "lang" ) ) ) { + + elemLang = elemLang.toLowerCase(); + return elemLang === lang || elemLang.indexOf( lang + "-" ) === 0; + } + } while ( ( elem = elem.parentNode ) && elem.nodeType === 1 ); + return false; + }; + } ), + + // Miscellaneous + "target": function( elem ) { + var hash = window.location && window.location.hash; + return hash && hash.slice( 1 ) === elem.id; + }, + + "root": function( elem ) { + return elem === docElem; + }, + + "focus": function( elem ) { + return elem === document.activeElement && + ( !document.hasFocus || document.hasFocus() ) && + !!( elem.type || elem.href || ~elem.tabIndex ); + }, + + // Boolean properties + "enabled": createDisabledPseudo( false ), + "disabled": createDisabledPseudo( true ), + + "checked": function( elem ) { + + // In CSS3, :checked should return both checked and selected elements + // http://www.w3.org/TR/2011/REC-css3-selectors-20110929/#checked + var nodeName = elem.nodeName.toLowerCase(); + return ( nodeName === "input" && !!elem.checked ) || + ( nodeName === "option" && !!elem.selected ); + }, + + "selected": function( elem ) { + + // Accessing this property makes selected-by-default + // options in Safari work properly + if ( elem.parentNode ) { + // eslint-disable-next-line no-unused-expressions + elem.parentNode.selectedIndex; + } + + return elem.selected === true; + }, + + // Contents + "empty": function( elem ) { + + // http://www.w3.org/TR/selectors/#empty-pseudo + // :empty is negated by element (1) or content nodes (text: 3; cdata: 4; entity ref: 5), + // but not by others (comment: 8; processing instruction: 7; etc.) + // nodeType < 6 works because attributes (2) do not appear as children + for ( elem = elem.firstChild; elem; elem = elem.nextSibling ) { + if ( elem.nodeType < 6 ) { + return false; + } + } + return true; + }, + + "parent": function( elem ) { + return !Expr.pseudos[ "empty" ]( elem ); + }, + + // Element/input types + "header": function( elem ) { + return rheader.test( elem.nodeName ); + }, + + "input": function( elem ) { + return rinputs.test( elem.nodeName ); + }, + + "button": function( elem ) { + var name = elem.nodeName.toLowerCase(); + return name === "input" && elem.type === "button" || name === "button"; + }, + + "text": function( elem ) { + var attr; + return elem.nodeName.toLowerCase() === "input" && + elem.type === "text" && + + // Support: IE<8 + // New HTML5 attribute values (e.g., "search") appear with elem.type === "text" + ( ( attr = elem.getAttribute( "type" ) ) == null || + attr.toLowerCase() === "text" ); + }, + + // Position-in-collection + "first": createPositionalPseudo( function() { + return [ 0 ]; + } ), + + "last": createPositionalPseudo( function( _matchIndexes, length ) { + return [ length - 1 ]; + } ), + + "eq": createPositionalPseudo( function( _matchIndexes, length, argument ) { + return [ argument < 0 ? argument + length : argument ]; + } ), + + "even": createPositionalPseudo( function( matchIndexes, length ) { + var i = 0; + for ( ; i < length; i += 2 ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "odd": createPositionalPseudo( function( matchIndexes, length ) { + var i = 1; + for ( ; i < length; i += 2 ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "lt": createPositionalPseudo( function( matchIndexes, length, argument ) { + var i = argument < 0 ? + argument + length : + argument > length ? 
+ length : + argument; + for ( ; --i >= 0; ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "gt": createPositionalPseudo( function( matchIndexes, length, argument ) { + var i = argument < 0 ? argument + length : argument; + for ( ; ++i < length; ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ) + } +}; + +Expr.pseudos[ "nth" ] = Expr.pseudos[ "eq" ]; + +// Add button/input type pseudos +for ( i in { radio: true, checkbox: true, file: true, password: true, image: true } ) { + Expr.pseudos[ i ] = createInputPseudo( i ); +} +for ( i in { submit: true, reset: true } ) { + Expr.pseudos[ i ] = createButtonPseudo( i ); +} + +// Easy API for creating new setFilters +function setFilters() {} +setFilters.prototype = Expr.filters = Expr.pseudos; +Expr.setFilters = new setFilters(); + +tokenize = Sizzle.tokenize = function( selector, parseOnly ) { + var matched, match, tokens, type, + soFar, groups, preFilters, + cached = tokenCache[ selector + " " ]; + + if ( cached ) { + return parseOnly ? 0 : cached.slice( 0 ); + } + + soFar = selector; + groups = []; + preFilters = Expr.preFilter; + + while ( soFar ) { + + // Comma and first run + if ( !matched || ( match = rcomma.exec( soFar ) ) ) { + if ( match ) { + + // Don't consume trailing commas as valid + soFar = soFar.slice( match[ 0 ].length ) || soFar; + } + groups.push( ( tokens = [] ) ); + } + + matched = false; + + // Combinators + if ( ( match = rcombinators.exec( soFar ) ) ) { + matched = match.shift(); + tokens.push( { + value: matched, + + // Cast descendant combinators to space + type: match[ 0 ].replace( rtrim, " " ) + } ); + soFar = soFar.slice( matched.length ); + } + + // Filters + for ( type in Expr.filter ) { + if ( ( match = matchExpr[ type ].exec( soFar ) ) && ( !preFilters[ type ] || + ( match = preFilters[ type ]( match ) ) ) ) { + matched = match.shift(); + tokens.push( { + value: matched, + type: type, + matches: match + } ); + soFar = soFar.slice( matched.length ); + } + } + + if ( !matched ) { + break; + } + } + + // Return the length of the invalid excess + // if we're just parsing + // Otherwise, throw an error or return tokens + return parseOnly ? + soFar.length : + soFar ? + Sizzle.error( selector ) : + + // Cache the tokens + tokenCache( selector, groups ).slice( 0 ); +}; + +function toSelector( tokens ) { + var i = 0, + len = tokens.length, + selector = ""; + for ( ; i < len; i++ ) { + selector += tokens[ i ].value; + } + return selector; +} + +function addCombinator( matcher, combinator, base ) { + var dir = combinator.dir, + skip = combinator.next, + key = skip || dir, + checkNonElements = base && key === "parentNode", + doneName = done++; + + return combinator.first ? 
+ + // Check against closest ancestor/preceding element + function( elem, context, xml ) { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + return matcher( elem, context, xml ); + } + } + return false; + } : + + // Check against all ancestor/preceding elements + function( elem, context, xml ) { + var oldCache, uniqueCache, outerCache, + newCache = [ dirruns, doneName ]; + + // We can't set arbitrary data on XML nodes, so they don't benefit from combinator caching + if ( xml ) { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + if ( matcher( elem, context, xml ) ) { + return true; + } + } + } + } else { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + outerCache = elem[ expando ] || ( elem[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ elem.uniqueID ] || + ( outerCache[ elem.uniqueID ] = {} ); + + if ( skip && skip === elem.nodeName.toLowerCase() ) { + elem = elem[ dir ] || elem; + } else if ( ( oldCache = uniqueCache[ key ] ) && + oldCache[ 0 ] === dirruns && oldCache[ 1 ] === doneName ) { + + // Assign to newCache so results back-propagate to previous elements + return ( newCache[ 2 ] = oldCache[ 2 ] ); + } else { + + // Reuse newcache so results back-propagate to previous elements + uniqueCache[ key ] = newCache; + + // A match means we're done; a fail means we have to keep checking + if ( ( newCache[ 2 ] = matcher( elem, context, xml ) ) ) { + return true; + } + } + } + } + } + return false; + }; +} + +function elementMatcher( matchers ) { + return matchers.length > 1 ? + function( elem, context, xml ) { + var i = matchers.length; + while ( i-- ) { + if ( !matchers[ i ]( elem, context, xml ) ) { + return false; + } + } + return true; + } : + matchers[ 0 ]; +} + +function multipleContexts( selector, contexts, results ) { + var i = 0, + len = contexts.length; + for ( ; i < len; i++ ) { + Sizzle( selector, contexts[ i ], results ); + } + return results; +} + +function condense( unmatched, map, filter, context, xml ) { + var elem, + newUnmatched = [], + i = 0, + len = unmatched.length, + mapped = map != null; + + for ( ; i < len; i++ ) { + if ( ( elem = unmatched[ i ] ) ) { + if ( !filter || filter( elem, context, xml ) ) { + newUnmatched.push( elem ); + if ( mapped ) { + map.push( i ); + } + } + } + } + + return newUnmatched; +} + +function setMatcher( preFilter, selector, matcher, postFilter, postFinder, postSelector ) { + if ( postFilter && !postFilter[ expando ] ) { + postFilter = setMatcher( postFilter ); + } + if ( postFinder && !postFinder[ expando ] ) { + postFinder = setMatcher( postFinder, postSelector ); + } + return markFunction( function( seed, results, context, xml ) { + var temp, i, elem, + preMap = [], + postMap = [], + preexisting = results.length, + + // Get initial elements from seed or context + elems = seed || multipleContexts( + selector || "*", + context.nodeType ? [ context ] : context, + [] + ), + + // Prefilter to get matcher input, preserving a map for seed-results synchronization + matcherIn = preFilter && ( seed || !selector ) ? + condense( elems, preMap, preFilter, context, xml ) : + elems, + + matcherOut = matcher ? + + // If we have a postFinder, or filtered seed, or non-seed postFilter or preexisting results, + postFinder || ( seed ? preFilter : preexisting || postFilter ) ? 
+ + // ...intermediate processing is necessary + [] : + + // ...otherwise use results directly + results : + matcherIn; + + // Find primary matches + if ( matcher ) { + matcher( matcherIn, matcherOut, context, xml ); + } + + // Apply postFilter + if ( postFilter ) { + temp = condense( matcherOut, postMap ); + postFilter( temp, [], context, xml ); + + // Un-match failing elements by moving them back to matcherIn + i = temp.length; + while ( i-- ) { + if ( ( elem = temp[ i ] ) ) { + matcherOut[ postMap[ i ] ] = !( matcherIn[ postMap[ i ] ] = elem ); + } + } + } + + if ( seed ) { + if ( postFinder || preFilter ) { + if ( postFinder ) { + + // Get the final matcherOut by condensing this intermediate into postFinder contexts + temp = []; + i = matcherOut.length; + while ( i-- ) { + if ( ( elem = matcherOut[ i ] ) ) { + + // Restore matcherIn since elem is not yet a final match + temp.push( ( matcherIn[ i ] = elem ) ); + } + } + postFinder( null, ( matcherOut = [] ), temp, xml ); + } + + // Move matched elements from seed to results to keep them synchronized + i = matcherOut.length; + while ( i-- ) { + if ( ( elem = matcherOut[ i ] ) && + ( temp = postFinder ? indexOf( seed, elem ) : preMap[ i ] ) > -1 ) { + + seed[ temp ] = !( results[ temp ] = elem ); + } + } + } + + // Add elements to results, through postFinder if defined + } else { + matcherOut = condense( + matcherOut === results ? + matcherOut.splice( preexisting, matcherOut.length ) : + matcherOut + ); + if ( postFinder ) { + postFinder( null, results, matcherOut, xml ); + } else { + push.apply( results, matcherOut ); + } + } + } ); +} + +function matcherFromTokens( tokens ) { + var checkContext, matcher, j, + len = tokens.length, + leadingRelative = Expr.relative[ tokens[ 0 ].type ], + implicitRelative = leadingRelative || Expr.relative[ " " ], + i = leadingRelative ? 1 : 0, + + // The foundational matcher ensures that elements are reachable from top-level context(s) + matchContext = addCombinator( function( elem ) { + return elem === checkContext; + }, implicitRelative, true ), + matchAnyContext = addCombinator( function( elem ) { + return indexOf( checkContext, elem ) > -1; + }, implicitRelative, true ), + matchers = [ function( elem, context, xml ) { + var ret = ( !leadingRelative && ( xml || context !== outermostContext ) ) || ( + ( checkContext = context ).nodeType ? + matchContext( elem, context, xml ) : + matchAnyContext( elem, context, xml ) ); + + // Avoid hanging onto element (issue #299) + checkContext = null; + return ret; + } ]; + + for ( ; i < len; i++ ) { + if ( ( matcher = Expr.relative[ tokens[ i ].type ] ) ) { + matchers = [ addCombinator( elementMatcher( matchers ), matcher ) ]; + } else { + matcher = Expr.filter[ tokens[ i ].type ].apply( null, tokens[ i ].matches ); + + // Return special upon seeing a positional matcher + if ( matcher[ expando ] ) { + + // Find the next relative operator (if any) for proper handling + j = ++i; + for ( ; j < len; j++ ) { + if ( Expr.relative[ tokens[ j ].type ] ) { + break; + } + } + return setMatcher( + i > 1 && elementMatcher( matchers ), + i > 1 && toSelector( + + // If the preceding token was a descendant combinator, insert an implicit any-element `*` + tokens + .slice( 0, i - 1 ) + .concat( { value: tokens[ i - 2 ].type === " " ? 
"*" : "" } ) + ).replace( rtrim, "$1" ), + matcher, + i < j && matcherFromTokens( tokens.slice( i, j ) ), + j < len && matcherFromTokens( ( tokens = tokens.slice( j ) ) ), + j < len && toSelector( tokens ) + ); + } + matchers.push( matcher ); + } + } + + return elementMatcher( matchers ); +} + +function matcherFromGroupMatchers( elementMatchers, setMatchers ) { + var bySet = setMatchers.length > 0, + byElement = elementMatchers.length > 0, + superMatcher = function( seed, context, xml, results, outermost ) { + var elem, j, matcher, + matchedCount = 0, + i = "0", + unmatched = seed && [], + setMatched = [], + contextBackup = outermostContext, + + // We must always have either seed elements or outermost context + elems = seed || byElement && Expr.find[ "TAG" ]( "*", outermost ), + + // Use integer dirruns iff this is the outermost matcher + dirrunsUnique = ( dirruns += contextBackup == null ? 1 : Math.random() || 0.1 ), + len = elems.length; + + if ( outermost ) { + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + outermostContext = context == document || context || outermost; + } + + // Add elements passing elementMatchers directly to results + // Support: IE<9, Safari + // Tolerate NodeList properties (IE: "length"; Safari: ) matching elements by id + for ( ; i !== len && ( elem = elems[ i ] ) != null; i++ ) { + if ( byElement && elem ) { + j = 0; + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( !context && elem.ownerDocument != document ) { + setDocument( elem ); + xml = !documentIsHTML; + } + while ( ( matcher = elementMatchers[ j++ ] ) ) { + if ( matcher( elem, context || document, xml ) ) { + results.push( elem ); + break; + } + } + if ( outermost ) { + dirruns = dirrunsUnique; + } + } + + // Track unmatched elements for set filters + if ( bySet ) { + + // They will have gone through all possible matchers + if ( ( elem = !matcher && elem ) ) { + matchedCount--; + } + + // Lengthen the array for every element, matched or not + if ( seed ) { + unmatched.push( elem ); + } + } + } + + // `i` is now the count of elements visited above, and adding it to `matchedCount` + // makes the latter nonnegative. + matchedCount += i; + + // Apply set filters to unmatched elements + // NOTE: This can be skipped if there are no unmatched elements (i.e., `matchedCount` + // equals `i`), unless we didn't visit _any_ elements in the above loop because we have + // no element matchers and no seed. + // Incrementing an initially-string "0" `i` allows `i` to remain a string only in that + // case, which will result in a "00" `matchedCount` that differs from `i` but is also + // numerically zero. 
+ if ( bySet && i !== matchedCount ) { + j = 0; + while ( ( matcher = setMatchers[ j++ ] ) ) { + matcher( unmatched, setMatched, context, xml ); + } + + if ( seed ) { + + // Reintegrate element matches to eliminate the need for sorting + if ( matchedCount > 0 ) { + while ( i-- ) { + if ( !( unmatched[ i ] || setMatched[ i ] ) ) { + setMatched[ i ] = pop.call( results ); + } + } + } + + // Discard index placeholder values to get only actual matches + setMatched = condense( setMatched ); + } + + // Add matches to results + push.apply( results, setMatched ); + + // Seedless set matches succeeding multiple successful matchers stipulate sorting + if ( outermost && !seed && setMatched.length > 0 && + ( matchedCount + setMatchers.length ) > 1 ) { + + Sizzle.uniqueSort( results ); + } + } + + // Override manipulation of globals by nested matchers + if ( outermost ) { + dirruns = dirrunsUnique; + outermostContext = contextBackup; + } + + return unmatched; + }; + + return bySet ? + markFunction( superMatcher ) : + superMatcher; +} + +compile = Sizzle.compile = function( selector, match /* Internal Use Only */ ) { + var i, + setMatchers = [], + elementMatchers = [], + cached = compilerCache[ selector + " " ]; + + if ( !cached ) { + + // Generate a function of recursive functions that can be used to check each element + if ( !match ) { + match = tokenize( selector ); + } + i = match.length; + while ( i-- ) { + cached = matcherFromTokens( match[ i ] ); + if ( cached[ expando ] ) { + setMatchers.push( cached ); + } else { + elementMatchers.push( cached ); + } + } + + // Cache the compiled function + cached = compilerCache( + selector, + matcherFromGroupMatchers( elementMatchers, setMatchers ) + ); + + // Save selector and tokenization + cached.selector = selector; + } + return cached; +}; + +/** + * A low-level selection function that works with Sizzle's compiled + * selector functions + * @param {String|Function} selector A selector or a pre-compiled + * selector function built with Sizzle.compile + * @param {Element} context + * @param {Array} [results] + * @param {Array} [seed] A set of elements to match against + */ +select = Sizzle.select = function( selector, context, results, seed ) { + var i, tokens, token, type, find, + compiled = typeof selector === "function" && selector, + match = !seed && tokenize( ( selector = compiled.selector || selector ) ); + + results = results || []; + + // Try to minimize operations if there is only one selector in the list and no seed + // (the latter of which guarantees us context) + if ( match.length === 1 ) { + + // Reduce context if the leading compound selector is an ID + tokens = match[ 0 ] = match[ 0 ].slice( 0 ); + if ( tokens.length > 2 && ( token = tokens[ 0 ] ).type === "ID" && + context.nodeType === 9 && documentIsHTML && Expr.relative[ tokens[ 1 ].type ] ) { + + context = ( Expr.find[ "ID" ]( token.matches[ 0 ] + .replace( runescape, funescape ), context ) || [] )[ 0 ]; + if ( !context ) { + return results; + + // Precompiled matchers will still verify ancestry, so step up a level + } else if ( compiled ) { + context = context.parentNode; + } + + selector = selector.slice( tokens.shift().value.length ); + } + + // Fetch a seed set for right-to-left matching + i = matchExpr[ "needsContext" ].test( selector ) ? 
0 : tokens.length; + while ( i-- ) { + token = tokens[ i ]; + + // Abort if we hit a combinator + if ( Expr.relative[ ( type = token.type ) ] ) { + break; + } + if ( ( find = Expr.find[ type ] ) ) { + + // Search, expanding context for leading sibling combinators + if ( ( seed = find( + token.matches[ 0 ].replace( runescape, funescape ), + rsibling.test( tokens[ 0 ].type ) && testContext( context.parentNode ) || + context + ) ) ) { + + // If seed is empty or no tokens remain, we can return early + tokens.splice( i, 1 ); + selector = seed.length && toSelector( tokens ); + if ( !selector ) { + push.apply( results, seed ); + return results; + } + + break; + } + } + } + } + + // Compile and execute a filtering function if one is not provided + // Provide `match` to avoid retokenization if we modified the selector above + ( compiled || compile( selector, match ) )( + seed, + context, + !documentIsHTML, + results, + !context || rsibling.test( selector ) && testContext( context.parentNode ) || context + ); + return results; +}; + +// One-time assignments + +// Sort stability +support.sortStable = expando.split( "" ).sort( sortOrder ).join( "" ) === expando; + +// Support: Chrome 14-35+ +// Always assume duplicates if they aren't passed to the comparison function +support.detectDuplicates = !!hasDuplicate; + +// Initialize against the default document +setDocument(); + +// Support: Webkit<537.32 - Safari 6.0.3/Chrome 25 (fixed in Chrome 27) +// Detached nodes confoundingly follow *each other* +support.sortDetached = assert( function( el ) { + + // Should return 1, but returns 4 (following) + return el.compareDocumentPosition( document.createElement( "fieldset" ) ) & 1; +} ); + +// Support: IE<8 +// Prevent attribute/property "interpolation" +// https://msdn.microsoft.com/en-us/library/ms536429%28VS.85%29.aspx +if ( !assert( function( el ) { + el.innerHTML = ""; + return el.firstChild.getAttribute( "href" ) === "#"; +} ) ) { + addHandle( "type|href|height|width", function( elem, name, isXML ) { + if ( !isXML ) { + return elem.getAttribute( name, name.toLowerCase() === "type" ? 1 : 2 ); + } + } ); +} + +// Support: IE<9 +// Use defaultValue in place of getAttribute("value") +if ( !support.attributes || !assert( function( el ) { + el.innerHTML = ""; + el.firstChild.setAttribute( "value", "" ); + return el.firstChild.getAttribute( "value" ) === ""; +} ) ) { + addHandle( "value", function( elem, _name, isXML ) { + if ( !isXML && elem.nodeName.toLowerCase() === "input" ) { + return elem.defaultValue; + } + } ); +} + +// Support: IE<9 +// Use getAttributeNode to fetch booleans when getAttribute lies +if ( !assert( function( el ) { + return el.getAttribute( "disabled" ) == null; +} ) ) { + addHandle( booleans, function( elem, name, isXML ) { + var val; + if ( !isXML ) { + return elem[ name ] === true ? name.toLowerCase() : + ( val = elem.getAttributeNode( name ) ) && val.specified ? 
+ val.value : + null; + } + } ); +} + +return Sizzle; + +} )( window ); + + + +jQuery.find = Sizzle; +jQuery.expr = Sizzle.selectors; + +// Deprecated +jQuery.expr[ ":" ] = jQuery.expr.pseudos; +jQuery.uniqueSort = jQuery.unique = Sizzle.uniqueSort; +jQuery.text = Sizzle.getText; +jQuery.isXMLDoc = Sizzle.isXML; +jQuery.contains = Sizzle.contains; +jQuery.escapeSelector = Sizzle.escape; + + + + +var dir = function( elem, dir, until ) { + var matched = [], + truncate = until !== undefined; + + while ( ( elem = elem[ dir ] ) && elem.nodeType !== 9 ) { + if ( elem.nodeType === 1 ) { + if ( truncate && jQuery( elem ).is( until ) ) { + break; + } + matched.push( elem ); + } + } + return matched; +}; + + +var siblings = function( n, elem ) { + var matched = []; + + for ( ; n; n = n.nextSibling ) { + if ( n.nodeType === 1 && n !== elem ) { + matched.push( n ); + } + } + + return matched; +}; + + +var rneedsContext = jQuery.expr.match.needsContext; + + + +function nodeName( elem, name ) { + + return elem.nodeName && elem.nodeName.toLowerCase() === name.toLowerCase(); + +} +var rsingleTag = ( /^<([a-z][^\/\0>:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i ); + + + +// Implement the identical functionality for filter and not +function winnow( elements, qualifier, not ) { + if ( isFunction( qualifier ) ) { + return jQuery.grep( elements, function( elem, i ) { + return !!qualifier.call( elem, i, elem ) !== not; + } ); + } + + // Single element + if ( qualifier.nodeType ) { + return jQuery.grep( elements, function( elem ) { + return ( elem === qualifier ) !== not; + } ); + } + + // Arraylike of elements (jQuery, arguments, Array) + if ( typeof qualifier !== "string" ) { + return jQuery.grep( elements, function( elem ) { + return ( indexOf.call( qualifier, elem ) > -1 ) !== not; + } ); + } + + // Filtered directly for both simple and complex selectors + return jQuery.filter( qualifier, elements, not ); +} + +jQuery.filter = function( expr, elems, not ) { + var elem = elems[ 0 ]; + + if ( not ) { + expr = ":not(" + expr + ")"; + } + + if ( elems.length === 1 && elem.nodeType === 1 ) { + return jQuery.find.matchesSelector( elem, expr ) ? [ elem ] : []; + } + + return jQuery.find.matches( expr, jQuery.grep( elems, function( elem ) { + return elem.nodeType === 1; + } ) ); +}; + +jQuery.fn.extend( { + find: function( selector ) { + var i, ret, + len = this.length, + self = this; + + if ( typeof selector !== "string" ) { + return this.pushStack( jQuery( selector ).filter( function() { + for ( i = 0; i < len; i++ ) { + if ( jQuery.contains( self[ i ], this ) ) { + return true; + } + } + } ) ); + } + + ret = this.pushStack( [] ); + + for ( i = 0; i < len; i++ ) { + jQuery.find( selector, self[ i ], ret ); + } + + return len > 1 ? jQuery.uniqueSort( ret ) : ret; + }, + filter: function( selector ) { + return this.pushStack( winnow( this, selector || [], false ) ); + }, + not: function( selector ) { + return this.pushStack( winnow( this, selector || [], true ) ); + }, + is: function( selector ) { + return !!winnow( + this, + + // If this is a positional/relative selector, check membership in the returned set + // so $("p:first").is("p:last") won't return true for a doc with two "p". + typeof selector === "string" && rneedsContext.test( selector ) ? 
+ jQuery( selector ) : + selector || [], + false + ).length; + } +} ); + + +// Initialize a jQuery object + + +// A central reference to the root jQuery(document) +var rootjQuery, + + // A simple way to check for HTML strings + // Prioritize #id over to avoid XSS via location.hash (#9521) + // Strict HTML recognition (#11290: must start with <) + // Shortcut simple #id case for speed + rquickExpr = /^(?:\s*(<[\w\W]+>)[^>]*|#([\w-]+))$/, + + init = jQuery.fn.init = function( selector, context, root ) { + var match, elem; + + // HANDLE: $(""), $(null), $(undefined), $(false) + if ( !selector ) { + return this; + } + + // Method init() accepts an alternate rootjQuery + // so migrate can support jQuery.sub (gh-2101) + root = root || rootjQuery; + + // Handle HTML strings + if ( typeof selector === "string" ) { + if ( selector[ 0 ] === "<" && + selector[ selector.length - 1 ] === ">" && + selector.length >= 3 ) { + + // Assume that strings that start and end with <> are HTML and skip the regex check + match = [ null, selector, null ]; + + } else { + match = rquickExpr.exec( selector ); + } + + // Match html or make sure no context is specified for #id + if ( match && ( match[ 1 ] || !context ) ) { + + // HANDLE: $(html) -> $(array) + if ( match[ 1 ] ) { + context = context instanceof jQuery ? context[ 0 ] : context; + + // Option to run scripts is true for back-compat + // Intentionally let the error be thrown if parseHTML is not present + jQuery.merge( this, jQuery.parseHTML( + match[ 1 ], + context && context.nodeType ? context.ownerDocument || context : document, + true + ) ); + + // HANDLE: $(html, props) + if ( rsingleTag.test( match[ 1 ] ) && jQuery.isPlainObject( context ) ) { + for ( match in context ) { + + // Properties of context are called as methods if possible + if ( isFunction( this[ match ] ) ) { + this[ match ]( context[ match ] ); + + // ...and otherwise set as attributes + } else { + this.attr( match, context[ match ] ); + } + } + } + + return this; + + // HANDLE: $(#id) + } else { + elem = document.getElementById( match[ 2 ] ); + + if ( elem ) { + + // Inject the element directly into the jQuery object + this[ 0 ] = elem; + this.length = 1; + } + return this; + } + + // HANDLE: $(expr, $(...)) + } else if ( !context || context.jquery ) { + return ( context || root ).find( selector ); + + // HANDLE: $(expr, context) + // (which is just equivalent to: $(context).find(expr) + } else { + return this.constructor( context ).find( selector ); + } + + // HANDLE: $(DOMElement) + } else if ( selector.nodeType ) { + this[ 0 ] = selector; + this.length = 1; + return this; + + // HANDLE: $(function) + // Shortcut for document ready + } else if ( isFunction( selector ) ) { + return root.ready !== undefined ? 
+ root.ready( selector ) : + + // Execute immediately if ready is not present + selector( jQuery ); + } + + return jQuery.makeArray( selector, this ); + }; + +// Give the init function the jQuery prototype for later instantiation +init.prototype = jQuery.fn; + +// Initialize central reference +rootjQuery = jQuery( document ); + + +var rparentsprev = /^(?:parents|prev(?:Until|All))/, + + // Methods guaranteed to produce a unique set when starting from a unique set + guaranteedUnique = { + children: true, + contents: true, + next: true, + prev: true + }; + +jQuery.fn.extend( { + has: function( target ) { + var targets = jQuery( target, this ), + l = targets.length; + + return this.filter( function() { + var i = 0; + for ( ; i < l; i++ ) { + if ( jQuery.contains( this, targets[ i ] ) ) { + return true; + } + } + } ); + }, + + closest: function( selectors, context ) { + var cur, + i = 0, + l = this.length, + matched = [], + targets = typeof selectors !== "string" && jQuery( selectors ); + + // Positional selectors never match, since there's no _selection_ context + if ( !rneedsContext.test( selectors ) ) { + for ( ; i < l; i++ ) { + for ( cur = this[ i ]; cur && cur !== context; cur = cur.parentNode ) { + + // Always skip document fragments + if ( cur.nodeType < 11 && ( targets ? + targets.index( cur ) > -1 : + + // Don't pass non-elements to Sizzle + cur.nodeType === 1 && + jQuery.find.matchesSelector( cur, selectors ) ) ) { + + matched.push( cur ); + break; + } + } + } + } + + return this.pushStack( matched.length > 1 ? jQuery.uniqueSort( matched ) : matched ); + }, + + // Determine the position of an element within the set + index: function( elem ) { + + // No argument, return index in parent + if ( !elem ) { + return ( this[ 0 ] && this[ 0 ].parentNode ) ? this.first().prevAll().length : -1; + } + + // Index in selector + if ( typeof elem === "string" ) { + return indexOf.call( jQuery( elem ), this[ 0 ] ); + } + + // Locate the position of the desired element + return indexOf.call( this, + + // If it receives a jQuery object, the first element is used + elem.jquery ? elem[ 0 ] : elem + ); + }, + + add: function( selector, context ) { + return this.pushStack( + jQuery.uniqueSort( + jQuery.merge( this.get(), jQuery( selector, context ) ) + ) + ); + }, + + addBack: function( selector ) { + return this.add( selector == null ? + this.prevObject : this.prevObject.filter( selector ) + ); + } +} ); + +function sibling( cur, dir ) { + while ( ( cur = cur[ dir ] ) && cur.nodeType !== 1 ) {} + return cur; +} + +jQuery.each( { + parent: function( elem ) { + var parent = elem.parentNode; + return parent && parent.nodeType !== 11 ? 
parent : null; + }, + parents: function( elem ) { + return dir( elem, "parentNode" ); + }, + parentsUntil: function( elem, _i, until ) { + return dir( elem, "parentNode", until ); + }, + next: function( elem ) { + return sibling( elem, "nextSibling" ); + }, + prev: function( elem ) { + return sibling( elem, "previousSibling" ); + }, + nextAll: function( elem ) { + return dir( elem, "nextSibling" ); + }, + prevAll: function( elem ) { + return dir( elem, "previousSibling" ); + }, + nextUntil: function( elem, _i, until ) { + return dir( elem, "nextSibling", until ); + }, + prevUntil: function( elem, _i, until ) { + return dir( elem, "previousSibling", until ); + }, + siblings: function( elem ) { + return siblings( ( elem.parentNode || {} ).firstChild, elem ); + }, + children: function( elem ) { + return siblings( elem.firstChild ); + }, + contents: function( elem ) { + if ( elem.contentDocument != null && + + // Support: IE 11+ + // elements with no `data` attribute has an object + // `contentDocument` with a `null` prototype. + getProto( elem.contentDocument ) ) { + + return elem.contentDocument; + } + + // Support: IE 9 - 11 only, iOS 7 only, Android Browser <=4.3 only + // Treat the template element as a regular one in browsers that + // don't support it. + if ( nodeName( elem, "template" ) ) { + elem = elem.content || elem; + } + + return jQuery.merge( [], elem.childNodes ); + } +}, function( name, fn ) { + jQuery.fn[ name ] = function( until, selector ) { + var matched = jQuery.map( this, fn, until ); + + if ( name.slice( -5 ) !== "Until" ) { + selector = until; + } + + if ( selector && typeof selector === "string" ) { + matched = jQuery.filter( selector, matched ); + } + + if ( this.length > 1 ) { + + // Remove duplicates + if ( !guaranteedUnique[ name ] ) { + jQuery.uniqueSort( matched ); + } + + // Reverse order for parents* and prev-derivatives + if ( rparentsprev.test( name ) ) { + matched.reverse(); + } + } + + return this.pushStack( matched ); + }; +} ); +var rnothtmlwhite = ( /[^\x20\t\r\n\f]+/g ); + + + +// Convert String-formatted options into Object-formatted ones +function createOptions( options ) { + var object = {}; + jQuery.each( options.match( rnothtmlwhite ) || [], function( _, flag ) { + object[ flag ] = true; + } ); + return object; +} + +/* + * Create a callback list using the following parameters: + * + * options: an optional list of space-separated options that will change how + * the callback list behaves or a more traditional option object + * + * By default a callback list will act like an event callback list and can be + * "fired" multiple times. + * + * Possible options: + * + * once: will ensure the callback list can only be fired once (like a Deferred) + * + * memory: will keep track of previous values and will call any callback added + * after the list has been fired right away with the latest "memorized" + * values (like a Deferred) + * + * unique: will ensure a callback can only be added once (no duplicate in the list) + * + * stopOnFalse: interrupt callings when a callback returns false + * + */ +jQuery.Callbacks = function( options ) { + + // Convert options from String-formatted to Object-formatted if needed + // (we check in cache first) + options = typeof options === "string" ? 
+ createOptions( options ) : + jQuery.extend( {}, options ); + + var // Flag to know if list is currently firing + firing, + + // Last fire value for non-forgettable lists + memory, + + // Flag to know if list was already fired + fired, + + // Flag to prevent firing + locked, + + // Actual callback list + list = [], + + // Queue of execution data for repeatable lists + queue = [], + + // Index of currently firing callback (modified by add/remove as needed) + firingIndex = -1, + + // Fire callbacks + fire = function() { + + // Enforce single-firing + locked = locked || options.once; + + // Execute callbacks for all pending executions, + // respecting firingIndex overrides and runtime changes + fired = firing = true; + for ( ; queue.length; firingIndex = -1 ) { + memory = queue.shift(); + while ( ++firingIndex < list.length ) { + + // Run callback and check for early termination + if ( list[ firingIndex ].apply( memory[ 0 ], memory[ 1 ] ) === false && + options.stopOnFalse ) { + + // Jump to end and forget the data so .add doesn't re-fire + firingIndex = list.length; + memory = false; + } + } + } + + // Forget the data if we're done with it + if ( !options.memory ) { + memory = false; + } + + firing = false; + + // Clean up if we're done firing for good + if ( locked ) { + + // Keep an empty list if we have data for future add calls + if ( memory ) { + list = []; + + // Otherwise, this object is spent + } else { + list = ""; + } + } + }, + + // Actual Callbacks object + self = { + + // Add a callback or a collection of callbacks to the list + add: function() { + if ( list ) { + + // If we have memory from a past run, we should fire after adding + if ( memory && !firing ) { + firingIndex = list.length - 1; + queue.push( memory ); + } + + ( function add( args ) { + jQuery.each( args, function( _, arg ) { + if ( isFunction( arg ) ) { + if ( !options.unique || !self.has( arg ) ) { + list.push( arg ); + } + } else if ( arg && arg.length && toType( arg ) !== "string" ) { + + // Inspect recursively + add( arg ); + } + } ); + } )( arguments ); + + if ( memory && !firing ) { + fire(); + } + } + return this; + }, + + // Remove a callback from the list + remove: function() { + jQuery.each( arguments, function( _, arg ) { + var index; + while ( ( index = jQuery.inArray( arg, list, index ) ) > -1 ) { + list.splice( index, 1 ); + + // Handle firing indexes + if ( index <= firingIndex ) { + firingIndex--; + } + } + } ); + return this; + }, + + // Check if a given callback is in the list. + // If no argument is given, return whether or not list has callbacks attached. + has: function( fn ) { + return fn ? + jQuery.inArray( fn, list ) > -1 : + list.length > 0; + }, + + // Remove all callbacks from the list + empty: function() { + if ( list ) { + list = []; + } + return this; + }, + + // Disable .fire and .add + // Abort any current/pending executions + // Clear all callbacks and values + disable: function() { + locked = queue = []; + list = memory = ""; + return this; + }, + disabled: function() { + return !list; + }, + + // Disable .fire + // Also disable .add unless we have memory (since it would have no effect) + // Abort any pending executions + lock: function() { + locked = queue = []; + if ( !memory && !firing ) { + list = memory = ""; + } + return this; + }, + locked: function() { + return !!locked; + }, + + // Call all callbacks with the given context and arguments + fireWith: function( context, args ) { + if ( !locked ) { + args = args || []; + args = [ context, args.slice ? 
args.slice() : args ]; + queue.push( args ); + if ( !firing ) { + fire(); + } + } + return this; + }, + + // Call all the callbacks with the given arguments + fire: function() { + self.fireWith( this, arguments ); + return this; + }, + + // To know if the callbacks have already been called at least once + fired: function() { + return !!fired; + } + }; + + return self; +}; + + +function Identity( v ) { + return v; +} +function Thrower( ex ) { + throw ex; +} + +function adoptValue( value, resolve, reject, noValue ) { + var method; + + try { + + // Check for promise aspect first to privilege synchronous behavior + if ( value && isFunction( ( method = value.promise ) ) ) { + method.call( value ).done( resolve ).fail( reject ); + + // Other thenables + } else if ( value && isFunction( ( method = value.then ) ) ) { + method.call( value, resolve, reject ); + + // Other non-thenables + } else { + + // Control `resolve` arguments by letting Array#slice cast boolean `noValue` to integer: + // * false: [ value ].slice( 0 ) => resolve( value ) + // * true: [ value ].slice( 1 ) => resolve() + resolve.apply( undefined, [ value ].slice( noValue ) ); + } + + // For Promises/A+, convert exceptions into rejections + // Since jQuery.when doesn't unwrap thenables, we can skip the extra checks appearing in + // Deferred#then to conditionally suppress rejection. + } catch ( value ) { + + // Support: Android 4.0 only + // Strict mode functions invoked without .call/.apply get global-object context + reject.apply( undefined, [ value ] ); + } +} + +jQuery.extend( { + + Deferred: function( func ) { + var tuples = [ + + // action, add listener, callbacks, + // ... .then handlers, argument index, [final state] + [ "notify", "progress", jQuery.Callbacks( "memory" ), + jQuery.Callbacks( "memory" ), 2 ], + [ "resolve", "done", jQuery.Callbacks( "once memory" ), + jQuery.Callbacks( "once memory" ), 0, "resolved" ], + [ "reject", "fail", jQuery.Callbacks( "once memory" ), + jQuery.Callbacks( "once memory" ), 1, "rejected" ] + ], + state = "pending", + promise = { + state: function() { + return state; + }, + always: function() { + deferred.done( arguments ).fail( arguments ); + return this; + }, + "catch": function( fn ) { + return promise.then( null, fn ); + }, + + // Keep pipe for back-compat + pipe: function( /* fnDone, fnFail, fnProgress */ ) { + var fns = arguments; + + return jQuery.Deferred( function( newDefer ) { + jQuery.each( tuples, function( _i, tuple ) { + + // Map tuples (progress, done, fail) to arguments (done, fail, progress) + var fn = isFunction( fns[ tuple[ 4 ] ] ) && fns[ tuple[ 4 ] ]; + + // deferred.progress(function() { bind to newDefer or newDefer.notify }) + // deferred.done(function() { bind to newDefer or newDefer.resolve }) + // deferred.fail(function() { bind to newDefer or newDefer.reject }) + deferred[ tuple[ 1 ] ]( function() { + var returned = fn && fn.apply( this, arguments ); + if ( returned && isFunction( returned.promise ) ) { + returned.promise() + .progress( newDefer.notify ) + .done( newDefer.resolve ) + .fail( newDefer.reject ); + } else { + newDefer[ tuple[ 0 ] + "With" ]( + this, + fn ? 
[ returned ] : arguments + ); + } + } ); + } ); + fns = null; + } ).promise(); + }, + then: function( onFulfilled, onRejected, onProgress ) { + var maxDepth = 0; + function resolve( depth, deferred, handler, special ) { + return function() { + var that = this, + args = arguments, + mightThrow = function() { + var returned, then; + + // Support: Promises/A+ section 2.3.3.3.3 + // https://promisesaplus.com/#point-59 + // Ignore double-resolution attempts + if ( depth < maxDepth ) { + return; + } + + returned = handler.apply( that, args ); + + // Support: Promises/A+ section 2.3.1 + // https://promisesaplus.com/#point-48 + if ( returned === deferred.promise() ) { + throw new TypeError( "Thenable self-resolution" ); + } + + // Support: Promises/A+ sections 2.3.3.1, 3.5 + // https://promisesaplus.com/#point-54 + // https://promisesaplus.com/#point-75 + // Retrieve `then` only once + then = returned && + + // Support: Promises/A+ section 2.3.4 + // https://promisesaplus.com/#point-64 + // Only check objects and functions for thenability + ( typeof returned === "object" || + typeof returned === "function" ) && + returned.then; + + // Handle a returned thenable + if ( isFunction( then ) ) { + + // Special processors (notify) just wait for resolution + if ( special ) { + then.call( + returned, + resolve( maxDepth, deferred, Identity, special ), + resolve( maxDepth, deferred, Thrower, special ) + ); + + // Normal processors (resolve) also hook into progress + } else { + + // ...and disregard older resolution values + maxDepth++; + + then.call( + returned, + resolve( maxDepth, deferred, Identity, special ), + resolve( maxDepth, deferred, Thrower, special ), + resolve( maxDepth, deferred, Identity, + deferred.notifyWith ) + ); + } + + // Handle all other returned values + } else { + + // Only substitute handlers pass on context + // and multiple values (non-spec behavior) + if ( handler !== Identity ) { + that = undefined; + args = [ returned ]; + } + + // Process the value(s) + // Default process is resolve + ( special || deferred.resolveWith )( that, args ); + } + }, + + // Only normal processors (resolve) catch and reject exceptions + process = special ? + mightThrow : + function() { + try { + mightThrow(); + } catch ( e ) { + + if ( jQuery.Deferred.exceptionHook ) { + jQuery.Deferred.exceptionHook( e, + process.stackTrace ); + } + + // Support: Promises/A+ section 2.3.3.3.4.1 + // https://promisesaplus.com/#point-61 + // Ignore post-resolution exceptions + if ( depth + 1 >= maxDepth ) { + + // Only substitute handlers pass on context + // and multiple values (non-spec behavior) + if ( handler !== Thrower ) { + that = undefined; + args = [ e ]; + } + + deferred.rejectWith( that, args ); + } + } + }; + + // Support: Promises/A+ section 2.3.3.3.1 + // https://promisesaplus.com/#point-57 + // Re-resolve promises immediately to dodge false rejection from + // subsequent errors + if ( depth ) { + process(); + } else { + + // Call an optional hook to record the stack, in case of exception + // since it's otherwise lost when execution goes async + if ( jQuery.Deferred.getStackHook ) { + process.stackTrace = jQuery.Deferred.getStackHook(); + } + window.setTimeout( process ); + } + }; + } + + return jQuery.Deferred( function( newDefer ) { + + // progress_handlers.add( ... ) + tuples[ 0 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onProgress ) ? + onProgress : + Identity, + newDefer.notifyWith + ) + ); + + // fulfilled_handlers.add( ... 
) + tuples[ 1 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onFulfilled ) ? + onFulfilled : + Identity + ) + ); + + // rejected_handlers.add( ... ) + tuples[ 2 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onRejected ) ? + onRejected : + Thrower + ) + ); + } ).promise(); + }, + + // Get a promise for this deferred + // If obj is provided, the promise aspect is added to the object + promise: function( obj ) { + return obj != null ? jQuery.extend( obj, promise ) : promise; + } + }, + deferred = {}; + + // Add list-specific methods + jQuery.each( tuples, function( i, tuple ) { + var list = tuple[ 2 ], + stateString = tuple[ 5 ]; + + // promise.progress = list.add + // promise.done = list.add + // promise.fail = list.add + promise[ tuple[ 1 ] ] = list.add; + + // Handle state + if ( stateString ) { + list.add( + function() { + + // state = "resolved" (i.e., fulfilled) + // state = "rejected" + state = stateString; + }, + + // rejected_callbacks.disable + // fulfilled_callbacks.disable + tuples[ 3 - i ][ 2 ].disable, + + // rejected_handlers.disable + // fulfilled_handlers.disable + tuples[ 3 - i ][ 3 ].disable, + + // progress_callbacks.lock + tuples[ 0 ][ 2 ].lock, + + // progress_handlers.lock + tuples[ 0 ][ 3 ].lock + ); + } + + // progress_handlers.fire + // fulfilled_handlers.fire + // rejected_handlers.fire + list.add( tuple[ 3 ].fire ); + + // deferred.notify = function() { deferred.notifyWith(...) } + // deferred.resolve = function() { deferred.resolveWith(...) } + // deferred.reject = function() { deferred.rejectWith(...) } + deferred[ tuple[ 0 ] ] = function() { + deferred[ tuple[ 0 ] + "With" ]( this === deferred ? undefined : this, arguments ); + return this; + }; + + // deferred.notifyWith = list.fireWith + // deferred.resolveWith = list.fireWith + // deferred.rejectWith = list.fireWith + deferred[ tuple[ 0 ] + "With" ] = list.fireWith; + } ); + + // Make the deferred a promise + promise.promise( deferred ); + + // Call given func if any + if ( func ) { + func.call( deferred, deferred ); + } + + // All done! + return deferred; + }, + + // Deferred helper + when: function( singleValue ) { + var + + // count of uncompleted subordinates + remaining = arguments.length, + + // count of unprocessed arguments + i = remaining, + + // subordinate fulfillment data + resolveContexts = Array( i ), + resolveValues = slice.call( arguments ), + + // the primary Deferred + primary = jQuery.Deferred(), + + // subordinate callback factory + updateFunc = function( i ) { + return function( value ) { + resolveContexts[ i ] = this; + resolveValues[ i ] = arguments.length > 1 ? slice.call( arguments ) : value; + if ( !( --remaining ) ) { + primary.resolveWith( resolveContexts, resolveValues ); + } + }; + }; + + // Single- and empty arguments are adopted like Promise.resolve + if ( remaining <= 1 ) { + adoptValue( singleValue, primary.done( updateFunc( i ) ).resolve, primary.reject, + !remaining ); + + // Use .then() to unwrap secondary thenables (cf. gh-3000) + if ( primary.state() === "pending" || + isFunction( resolveValues[ i ] && resolveValues[ i ].then ) ) { + + return primary.then(); + } + } + + // Multiple arguments are aggregated like Promise.all array elements + while ( i-- ) { + adoptValue( resolveValues[ i ], updateFunc( i ), primary.reject ); + } + + return primary.promise(); + } +} ); + + +// These usually indicate a programmer mistake during development, +// warn about them ASAP rather than swallowing them by default. 
+var rerrorNames = /^(Eval|Internal|Range|Reference|Syntax|Type|URI)Error$/; + +jQuery.Deferred.exceptionHook = function( error, stack ) { + + // Support: IE 8 - 9 only + // Console exists when dev tools are open, which can happen at any time + if ( window.console && window.console.warn && error && rerrorNames.test( error.name ) ) { + window.console.warn( "jQuery.Deferred exception: " + error.message, error.stack, stack ); + } +}; + + + + +jQuery.readyException = function( error ) { + window.setTimeout( function() { + throw error; + } ); +}; + + + + +// The deferred used on DOM ready +var readyList = jQuery.Deferred(); + +jQuery.fn.ready = function( fn ) { + + readyList + .then( fn ) + + // Wrap jQuery.readyException in a function so that the lookup + // happens at the time of error handling instead of callback + // registration. + .catch( function( error ) { + jQuery.readyException( error ); + } ); + + return this; +}; + +jQuery.extend( { + + // Is the DOM ready to be used? Set to true once it occurs. + isReady: false, + + // A counter to track how many items to wait for before + // the ready event fires. See #6781 + readyWait: 1, + + // Handle when the DOM is ready + ready: function( wait ) { + + // Abort if there are pending holds or we're already ready + if ( wait === true ? --jQuery.readyWait : jQuery.isReady ) { + return; + } + + // Remember that the DOM is ready + jQuery.isReady = true; + + // If a normal DOM Ready event fired, decrement, and wait if need be + if ( wait !== true && --jQuery.readyWait > 0 ) { + return; + } + + // If there are functions bound, to execute + readyList.resolveWith( document, [ jQuery ] ); + } +} ); + +jQuery.ready.then = readyList.then; + +// The ready event handler and self cleanup method +function completed() { + document.removeEventListener( "DOMContentLoaded", completed ); + window.removeEventListener( "load", completed ); + jQuery.ready(); +} + +// Catch cases where $(document).ready() is called +// after the browser event has already occurred. +// Support: IE <=9 - 10 only +// Older IE sometimes signals "interactive" too soon +if ( document.readyState === "complete" || + ( document.readyState !== "loading" && !document.documentElement.doScroll ) ) { + + // Handle it asynchronously to allow scripts the opportunity to delay ready + window.setTimeout( jQuery.ready ); + +} else { + + // Use the handy event callback + document.addEventListener( "DOMContentLoaded", completed ); + + // A fallback to window.onload, that will always work + window.addEventListener( "load", completed ); +} + + + + +// Multifunctional method to get and set values of a collection +// The value/s can optionally be executed if it's a function +var access = function( elems, fn, key, value, chainable, emptyGet, raw ) { + var i = 0, + len = elems.length, + bulk = key == null; + + // Sets many values + if ( toType( key ) === "object" ) { + chainable = true; + for ( i in key ) { + access( elems, fn, i, key[ i ], true, emptyGet, raw ); + } + + // Sets one value + } else if ( value !== undefined ) { + chainable = true; + + if ( !isFunction( value ) ) { + raw = true; + } + + if ( bulk ) { + + // Bulk operations run against the entire set + if ( raw ) { + fn.call( elems, value ); + fn = null; + + // ...except when executing function values + } else { + bulk = fn; + fn = function( elem, _key, value ) { + return bulk.call( jQuery( elem ), value ); + }; + } + } + + if ( fn ) { + for ( ; i < len; i++ ) { + fn( + elems[ i ], key, raw ? 
+ value : + value.call( elems[ i ], i, fn( elems[ i ], key ) ) + ); + } + } + } + + if ( chainable ) { + return elems; + } + + // Gets + if ( bulk ) { + return fn.call( elems ); + } + + return len ? fn( elems[ 0 ], key ) : emptyGet; +}; + + +// Matches dashed string for camelizing +var rmsPrefix = /^-ms-/, + rdashAlpha = /-([a-z])/g; + +// Used by camelCase as callback to replace() +function fcamelCase( _all, letter ) { + return letter.toUpperCase(); +} + +// Convert dashed to camelCase; used by the css and data modules +// Support: IE <=9 - 11, Edge 12 - 15 +// Microsoft forgot to hump their vendor prefix (#9572) +function camelCase( string ) { + return string.replace( rmsPrefix, "ms-" ).replace( rdashAlpha, fcamelCase ); +} +var acceptData = function( owner ) { + + // Accepts only: + // - Node + // - Node.ELEMENT_NODE + // - Node.DOCUMENT_NODE + // - Object + // - Any + return owner.nodeType === 1 || owner.nodeType === 9 || !( +owner.nodeType ); +}; + + + + +function Data() { + this.expando = jQuery.expando + Data.uid++; +} + +Data.uid = 1; + +Data.prototype = { + + cache: function( owner ) { + + // Check if the owner object already has a cache + var value = owner[ this.expando ]; + + // If not, create one + if ( !value ) { + value = {}; + + // We can accept data for non-element nodes in modern browsers, + // but we should not, see #8335. + // Always return an empty object. + if ( acceptData( owner ) ) { + + // If it is a node unlikely to be stringify-ed or looped over + // use plain assignment + if ( owner.nodeType ) { + owner[ this.expando ] = value; + + // Otherwise secure it in a non-enumerable property + // configurable must be true to allow the property to be + // deleted when data is removed + } else { + Object.defineProperty( owner, this.expando, { + value: value, + configurable: true + } ); + } + } + } + + return value; + }, + set: function( owner, data, value ) { + var prop, + cache = this.cache( owner ); + + // Handle: [ owner, key, value ] args + // Always use camelCase key (gh-2257) + if ( typeof data === "string" ) { + cache[ camelCase( data ) ] = value; + + // Handle: [ owner, { properties } ] args + } else { + + // Copy the properties one-by-one to the cache object + for ( prop in data ) { + cache[ camelCase( prop ) ] = data[ prop ]; + } + } + return cache; + }, + get: function( owner, key ) { + return key === undefined ? + this.cache( owner ) : + + // Always use camelCase key (gh-2257) + owner[ this.expando ] && owner[ this.expando ][ camelCase( key ) ]; + }, + access: function( owner, key, value ) { + + // In cases where either: + // + // 1. No key was specified + // 2. A string key was specified, but no value provided + // + // Take the "read" path and allow the get method to determine + // which value to return, respectively either: + // + // 1. The entire cache object + // 2. The data stored at the key + // + if ( key === undefined || + ( ( key && typeof key === "string" ) && value === undefined ) ) { + + return this.get( owner, key ); + } + + // When the key is not a string, or both a key and value + // are specified, set or extend (existing objects) with either: + // + // 1. An object of properties + // 2. A key and value + // + this.set( owner, key, value ); + + // Since the "set" path can have two possible entry points + // return the expected data based on which path was taken[*] + return value !== undefined ? 
value : key; + }, + remove: function( owner, key ) { + var i, + cache = owner[ this.expando ]; + + if ( cache === undefined ) { + return; + } + + if ( key !== undefined ) { + + // Support array or space separated string of keys + if ( Array.isArray( key ) ) { + + // If key is an array of keys... + // We always set camelCase keys, so remove that. + key = key.map( camelCase ); + } else { + key = camelCase( key ); + + // If a key with the spaces exists, use it. + // Otherwise, create an array by matching non-whitespace + key = key in cache ? + [ key ] : + ( key.match( rnothtmlwhite ) || [] ); + } + + i = key.length; + + while ( i-- ) { + delete cache[ key[ i ] ]; + } + } + + // Remove the expando if there's no more data + if ( key === undefined || jQuery.isEmptyObject( cache ) ) { + + // Support: Chrome <=35 - 45 + // Webkit & Blink performance suffers when deleting properties + // from DOM nodes, so set to undefined instead + // https://bugs.chromium.org/p/chromium/issues/detail?id=378607 (bug restricted) + if ( owner.nodeType ) { + owner[ this.expando ] = undefined; + } else { + delete owner[ this.expando ]; + } + } + }, + hasData: function( owner ) { + var cache = owner[ this.expando ]; + return cache !== undefined && !jQuery.isEmptyObject( cache ); + } +}; +var dataPriv = new Data(); + +var dataUser = new Data(); + + + +// Implementation Summary +// +// 1. Enforce API surface and semantic compatibility with 1.9.x branch +// 2. Improve the module's maintainability by reducing the storage +// paths to a single mechanism. +// 3. Use the same single mechanism to support "private" and "user" data. +// 4. _Never_ expose "private" data to user code (TODO: Drop _data, _removeData) +// 5. Avoid exposing implementation details on user objects (eg. expando properties) +// 6. Provide a clear path for implementation upgrade to WeakMap in 2014 + +var rbrace = /^(?:\{[\w\W]*\}|\[[\w\W]*\])$/, + rmultiDash = /[A-Z]/g; + +function getData( data ) { + if ( data === "true" ) { + return true; + } + + if ( data === "false" ) { + return false; + } + + if ( data === "null" ) { + return null; + } + + // Only convert to a number if it doesn't change the string + if ( data === +data + "" ) { + return +data; + } + + if ( rbrace.test( data ) ) { + return JSON.parse( data ); + } + + return data; +} + +function dataAttr( elem, key, data ) { + var name; + + // If nothing was found internally, try to fetch any + // data from the HTML5 data-* attribute + if ( data === undefined && elem.nodeType === 1 ) { + name = "data-" + key.replace( rmultiDash, "-$&" ).toLowerCase(); + data = elem.getAttribute( name ); + + if ( typeof data === "string" ) { + try { + data = getData( data ); + } catch ( e ) {} + + // Make sure we set the data so it isn't changed later + dataUser.set( elem, key, data ); + } else { + data = undefined; + } + } + return data; +} + +jQuery.extend( { + hasData: function( elem ) { + return dataUser.hasData( elem ) || dataPriv.hasData( elem ); + }, + + data: function( elem, name, data ) { + return dataUser.access( elem, name, data ); + }, + + removeData: function( elem, name ) { + dataUser.remove( elem, name ); + }, + + // TODO: Now that all calls to _data and _removeData have been replaced + // with direct calls to dataPriv methods, these can be deprecated. 
+ _data: function( elem, name, data ) { + return dataPriv.access( elem, name, data ); + }, + + _removeData: function( elem, name ) { + dataPriv.remove( elem, name ); + } +} ); + +jQuery.fn.extend( { + data: function( key, value ) { + var i, name, data, + elem = this[ 0 ], + attrs = elem && elem.attributes; + + // Gets all values + if ( key === undefined ) { + if ( this.length ) { + data = dataUser.get( elem ); + + if ( elem.nodeType === 1 && !dataPriv.get( elem, "hasDataAttrs" ) ) { + i = attrs.length; + while ( i-- ) { + + // Support: IE 11 only + // The attrs elements can be null (#14894) + if ( attrs[ i ] ) { + name = attrs[ i ].name; + if ( name.indexOf( "data-" ) === 0 ) { + name = camelCase( name.slice( 5 ) ); + dataAttr( elem, name, data[ name ] ); + } + } + } + dataPriv.set( elem, "hasDataAttrs", true ); + } + } + + return data; + } + + // Sets multiple values + if ( typeof key === "object" ) { + return this.each( function() { + dataUser.set( this, key ); + } ); + } + + return access( this, function( value ) { + var data; + + // The calling jQuery object (element matches) is not empty + // (and therefore has an element appears at this[ 0 ]) and the + // `value` parameter was not undefined. An empty jQuery object + // will result in `undefined` for elem = this[ 0 ] which will + // throw an exception if an attempt to read a data cache is made. + if ( elem && value === undefined ) { + + // Attempt to get data from the cache + // The key will always be camelCased in Data + data = dataUser.get( elem, key ); + if ( data !== undefined ) { + return data; + } + + // Attempt to "discover" the data in + // HTML5 custom data-* attrs + data = dataAttr( elem, key ); + if ( data !== undefined ) { + return data; + } + + // We tried really hard, but the data doesn't exist. + return; + } + + // Set the data... 
+ this.each( function() { + + // We always store the camelCased key + dataUser.set( this, key, value ); + } ); + }, null, value, arguments.length > 1, null, true ); + }, + + removeData: function( key ) { + return this.each( function() { + dataUser.remove( this, key ); + } ); + } +} ); + + +jQuery.extend( { + queue: function( elem, type, data ) { + var queue; + + if ( elem ) { + type = ( type || "fx" ) + "queue"; + queue = dataPriv.get( elem, type ); + + // Speed up dequeue by getting out quickly if this is just a lookup + if ( data ) { + if ( !queue || Array.isArray( data ) ) { + queue = dataPriv.access( elem, type, jQuery.makeArray( data ) ); + } else { + queue.push( data ); + } + } + return queue || []; + } + }, + + dequeue: function( elem, type ) { + type = type || "fx"; + + var queue = jQuery.queue( elem, type ), + startLength = queue.length, + fn = queue.shift(), + hooks = jQuery._queueHooks( elem, type ), + next = function() { + jQuery.dequeue( elem, type ); + }; + + // If the fx queue is dequeued, always remove the progress sentinel + if ( fn === "inprogress" ) { + fn = queue.shift(); + startLength--; + } + + if ( fn ) { + + // Add a progress sentinel to prevent the fx queue from being + // automatically dequeued + if ( type === "fx" ) { + queue.unshift( "inprogress" ); + } + + // Clear up the last queue stop function + delete hooks.stop; + fn.call( elem, next, hooks ); + } + + if ( !startLength && hooks ) { + hooks.empty.fire(); + } + }, + + // Not public - generate a queueHooks object, or return the current one + _queueHooks: function( elem, type ) { + var key = type + "queueHooks"; + return dataPriv.get( elem, key ) || dataPriv.access( elem, key, { + empty: jQuery.Callbacks( "once memory" ).add( function() { + dataPriv.remove( elem, [ type + "queue", key ] ); + } ) + } ); + } +} ); + +jQuery.fn.extend( { + queue: function( type, data ) { + var setter = 2; + + if ( typeof type !== "string" ) { + data = type; + type = "fx"; + setter--; + } + + if ( arguments.length < setter ) { + return jQuery.queue( this[ 0 ], type ); + } + + return data === undefined ? 
+ this : + this.each( function() { + var queue = jQuery.queue( this, type, data ); + + // Ensure a hooks for this queue + jQuery._queueHooks( this, type ); + + if ( type === "fx" && queue[ 0 ] !== "inprogress" ) { + jQuery.dequeue( this, type ); + } + } ); + }, + dequeue: function( type ) { + return this.each( function() { + jQuery.dequeue( this, type ); + } ); + }, + clearQueue: function( type ) { + return this.queue( type || "fx", [] ); + }, + + // Get a promise resolved when queues of a certain type + // are emptied (fx is the type by default) + promise: function( type, obj ) { + var tmp, + count = 1, + defer = jQuery.Deferred(), + elements = this, + i = this.length, + resolve = function() { + if ( !( --count ) ) { + defer.resolveWith( elements, [ elements ] ); + } + }; + + if ( typeof type !== "string" ) { + obj = type; + type = undefined; + } + type = type || "fx"; + + while ( i-- ) { + tmp = dataPriv.get( elements[ i ], type + "queueHooks" ); + if ( tmp && tmp.empty ) { + count++; + tmp.empty.add( resolve ); + } + } + resolve(); + return defer.promise( obj ); + } +} ); +var pnum = ( /[+-]?(?:\d*\.|)\d+(?:[eE][+-]?\d+|)/ ).source; + +var rcssNum = new RegExp( "^(?:([+-])=|)(" + pnum + ")([a-z%]*)$", "i" ); + + +var cssExpand = [ "Top", "Right", "Bottom", "Left" ]; + +var documentElement = document.documentElement; + + + + var isAttached = function( elem ) { + return jQuery.contains( elem.ownerDocument, elem ); + }, + composed = { composed: true }; + + // Support: IE 9 - 11+, Edge 12 - 18+, iOS 10.0 - 10.2 only + // Check attachment across shadow DOM boundaries when possible (gh-3504) + // Support: iOS 10.0-10.2 only + // Early iOS 10 versions support `attachShadow` but not `getRootNode`, + // leading to errors. We need to check for `getRootNode`. + if ( documentElement.getRootNode ) { + isAttached = function( elem ) { + return jQuery.contains( elem.ownerDocument, elem ) || + elem.getRootNode( composed ) === elem.ownerDocument; + }; + } +var isHiddenWithinTree = function( elem, el ) { + + // isHiddenWithinTree might be called from jQuery#filter function; + // in that case, element will be second argument + elem = el || elem; + + // Inline style trumps all + return elem.style.display === "none" || + elem.style.display === "" && + + // Otherwise, check computed style + // Support: Firefox <=43 - 45 + // Disconnected elements can have computed display: none, so first confirm that elem is + // in the document. + isAttached( elem ) && + + jQuery.css( elem, "display" ) === "none"; + }; + + + +function adjustCSS( elem, prop, valueParts, tween ) { + var adjusted, scale, + maxIterations = 20, + currentValue = tween ? + function() { + return tween.cur(); + } : + function() { + return jQuery.css( elem, prop, "" ); + }, + initial = currentValue(), + unit = valueParts && valueParts[ 3 ] || ( jQuery.cssNumber[ prop ] ? 
"" : "px" ), + + // Starting value computation is required for potential unit mismatches + initialInUnit = elem.nodeType && + ( jQuery.cssNumber[ prop ] || unit !== "px" && +initial ) && + rcssNum.exec( jQuery.css( elem, prop ) ); + + if ( initialInUnit && initialInUnit[ 3 ] !== unit ) { + + // Support: Firefox <=54 + // Halve the iteration target value to prevent interference from CSS upper bounds (gh-2144) + initial = initial / 2; + + // Trust units reported by jQuery.css + unit = unit || initialInUnit[ 3 ]; + + // Iteratively approximate from a nonzero starting point + initialInUnit = +initial || 1; + + while ( maxIterations-- ) { + + // Evaluate and update our best guess (doubling guesses that zero out). + // Finish if the scale equals or crosses 1 (making the old*new product non-positive). + jQuery.style( elem, prop, initialInUnit + unit ); + if ( ( 1 - scale ) * ( 1 - ( scale = currentValue() / initial || 0.5 ) ) <= 0 ) { + maxIterations = 0; + } + initialInUnit = initialInUnit / scale; + + } + + initialInUnit = initialInUnit * 2; + jQuery.style( elem, prop, initialInUnit + unit ); + + // Make sure we update the tween properties later on + valueParts = valueParts || []; + } + + if ( valueParts ) { + initialInUnit = +initialInUnit || +initial || 0; + + // Apply relative offset (+=/-=) if specified + adjusted = valueParts[ 1 ] ? + initialInUnit + ( valueParts[ 1 ] + 1 ) * valueParts[ 2 ] : + +valueParts[ 2 ]; + if ( tween ) { + tween.unit = unit; + tween.start = initialInUnit; + tween.end = adjusted; + } + } + return adjusted; +} + + +var defaultDisplayMap = {}; + +function getDefaultDisplay( elem ) { + var temp, + doc = elem.ownerDocument, + nodeName = elem.nodeName, + display = defaultDisplayMap[ nodeName ]; + + if ( display ) { + return display; + } + + temp = doc.body.appendChild( doc.createElement( nodeName ) ); + display = jQuery.css( temp, "display" ); + + temp.parentNode.removeChild( temp ); + + if ( display === "none" ) { + display = "block"; + } + defaultDisplayMap[ nodeName ] = display; + + return display; +} + +function showHide( elements, show ) { + var display, elem, + values = [], + index = 0, + length = elements.length; + + // Determine new display value for elements that need to change + for ( ; index < length; index++ ) { + elem = elements[ index ]; + if ( !elem.style ) { + continue; + } + + display = elem.style.display; + if ( show ) { + + // Since we force visibility upon cascade-hidden elements, an immediate (and slow) + // check is required in this first loop unless we have a nonempty display value (either + // inline or about-to-be-restored) + if ( display === "none" ) { + values[ index ] = dataPriv.get( elem, "display" ) || null; + if ( !values[ index ] ) { + elem.style.display = ""; + } + } + if ( elem.style.display === "" && isHiddenWithinTree( elem ) ) { + values[ index ] = getDefaultDisplay( elem ); + } + } else { + if ( display !== "none" ) { + values[ index ] = "none"; + + // Remember what we're overwriting + dataPriv.set( elem, "display", display ); + } + } + } + + // Set the display of the elements in a second loop to avoid constant reflow + for ( index = 0; index < length; index++ ) { + if ( values[ index ] != null ) { + elements[ index ].style.display = values[ index ]; + } + } + + return elements; +} + +jQuery.fn.extend( { + show: function() { + return showHide( this, true ); + }, + hide: function() { + return showHide( this ); + }, + toggle: function( state ) { + if ( typeof state === "boolean" ) { + return state ? 
this.show() : this.hide();
+	}
+
+	return this.each( function() {
+		if ( isHiddenWithinTree( this ) ) {
+			jQuery( this ).show();
+		} else {
+			jQuery( this ).hide();
+		}
+	} );
+	}
+} );
+var rcheckableType = ( /^(?:checkbox|radio)$/i );
+
+var rtagName = ( /<([a-z][^\/\0>\x20\t\r\n\f]*)/i );
+
+var rscriptType = ( /^$|^module$|\/(?:java|ecma)script/i );
+
+
+
+( function() {
+	var fragment = document.createDocumentFragment(),
+		div = fragment.appendChild( document.createElement( "div" ) ),
+		input = document.createElement( "input" );
+
+	// Support: Android 4.0 - 4.3 only
+	// Check state lost if the name is set (#11217)
+	// Support: Windows Web Apps (WWA)
+	// `name` and `type` must use .setAttribute for WWA (#14901)
+	input.setAttribute( "type", "radio" );
+	input.setAttribute( "checked", "checked" );
+	input.setAttribute( "name", "t" );
+
+	div.appendChild( input );
+
+	// Support: Android <=4.1 only
+	// Older WebKit doesn't clone checked state correctly in fragments
+	support.checkClone = div.cloneNode( true ).cloneNode( true ).lastChild.checked;
+
+	// Support: IE <=11 only
+	// Make sure textarea (and checkbox) defaultValue is properly cloned
+	div.innerHTML = "<textarea>x</textarea>";
+	support.noCloneChecked = !!div.cloneNode( true ).lastChild.defaultValue;
+
+	// Support: IE <=9 only
+	// IE <=9 replaces <option> tags with their contents when inserted outside of
+	// the select element.
+	div.innerHTML = "<option></option>";
+	support.option = !!div.lastChild;
+} )();
+
+
+// We have to close these tags to support XHTML (#13200)
+var wrapMap = {
+
+	// XHTML parsers do not magically insert elements in the
+	// same way that tag soup parsers do. So we cannot shorten
+	// this by omitting <tbody> or other required elements.
+	thead: [ 1, "<table>", "</table>" ],
+	col: [ 2, "<table><colgroup>", "</colgroup></table>" ],
+	tr: [ 2, "<table><tbody>", "</tbody></table>" ],
+	td: [ 3, "<table><tbody><tr>", "</tr></tbody></table>
" ], + + _default: [ 0, "", "" ] +}; + +wrapMap.tbody = wrapMap.tfoot = wrapMap.colgroup = wrapMap.caption = wrapMap.thead; +wrapMap.th = wrapMap.td; + +// Support: IE <=9 only +if ( !support.option ) { + wrapMap.optgroup = wrapMap.option = [ 1, "" ]; +} + + +function getAll( context, tag ) { + + // Support: IE <=9 - 11 only + // Use typeof to avoid zero-argument method invocation on host objects (#15151) + var ret; + + if ( typeof context.getElementsByTagName !== "undefined" ) { + ret = context.getElementsByTagName( tag || "*" ); + + } else if ( typeof context.querySelectorAll !== "undefined" ) { + ret = context.querySelectorAll( tag || "*" ); + + } else { + ret = []; + } + + if ( tag === undefined || tag && nodeName( context, tag ) ) { + return jQuery.merge( [ context ], ret ); + } + + return ret; +} + + +// Mark scripts as having already been evaluated +function setGlobalEval( elems, refElements ) { + var i = 0, + l = elems.length; + + for ( ; i < l; i++ ) { + dataPriv.set( + elems[ i ], + "globalEval", + !refElements || dataPriv.get( refElements[ i ], "globalEval" ) + ); + } +} + + +var rhtml = /<|&#?\w+;/; + +function buildFragment( elems, context, scripts, selection, ignored ) { + var elem, tmp, tag, wrap, attached, j, + fragment = context.createDocumentFragment(), + nodes = [], + i = 0, + l = elems.length; + + for ( ; i < l; i++ ) { + elem = elems[ i ]; + + if ( elem || elem === 0 ) { + + // Add nodes directly + if ( toType( elem ) === "object" ) { + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( nodes, elem.nodeType ? [ elem ] : elem ); + + // Convert non-html into a text node + } else if ( !rhtml.test( elem ) ) { + nodes.push( context.createTextNode( elem ) ); + + // Convert html into DOM nodes + } else { + tmp = tmp || fragment.appendChild( context.createElement( "div" ) ); + + // Deserialize a standard representation + tag = ( rtagName.exec( elem ) || [ "", "" ] )[ 1 ].toLowerCase(); + wrap = wrapMap[ tag ] || wrapMap._default; + tmp.innerHTML = wrap[ 1 ] + jQuery.htmlPrefilter( elem ) + wrap[ 2 ]; + + // Descend through wrappers to the right content + j = wrap[ 0 ]; + while ( j-- ) { + tmp = tmp.lastChild; + } + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( nodes, tmp.childNodes ); + + // Remember the top-level container + tmp = fragment.firstChild; + + // Ensure the created nodes are orphaned (#12392) + tmp.textContent = ""; + } + } + } + + // Remove wrapper from fragment + fragment.textContent = ""; + + i = 0; + while ( ( elem = nodes[ i++ ] ) ) { + + // Skip elements already in the context collection (trac-4087) + if ( selection && jQuery.inArray( elem, selection ) > -1 ) { + if ( ignored ) { + ignored.push( elem ); + } + continue; + } + + attached = isAttached( elem ); + + // Append to fragment + tmp = getAll( fragment.appendChild( elem ), "script" ); + + // Preserve script evaluation history + if ( attached ) { + setGlobalEval( tmp ); + } + + // Capture executables + if ( scripts ) { + j = 0; + while ( ( elem = tmp[ j++ ] ) ) { + if ( rscriptType.test( elem.type || "" ) ) { + scripts.push( elem ); + } + } + } + } + + return fragment; +} + + +var rtypenamespace = /^([^.]*)(?:\.(.+)|)/; + +function returnTrue() { + return true; +} + +function returnFalse() { + return false; +} + +// Support: IE <=9 - 11+ +// focus() and blur() are asynchronous, except when they are no-op. 
+// So expect focus to be synchronous when the element is already active, +// and blur to be synchronous when the element is not already active. +// (focus and blur are always synchronous in other supported browsers, +// this just defines when we can count on it). +function expectSync( elem, type ) { + return ( elem === safeActiveElement() ) === ( type === "focus" ); +} + +// Support: IE <=9 only +// Accessing document.activeElement can throw unexpectedly +// https://bugs.jquery.com/ticket/13393 +function safeActiveElement() { + try { + return document.activeElement; + } catch ( err ) { } +} + +function on( elem, types, selector, data, fn, one ) { + var origFn, type; + + // Types can be a map of types/handlers + if ( typeof types === "object" ) { + + // ( types-Object, selector, data ) + if ( typeof selector !== "string" ) { + + // ( types-Object, data ) + data = data || selector; + selector = undefined; + } + for ( type in types ) { + on( elem, type, selector, data, types[ type ], one ); + } + return elem; + } + + if ( data == null && fn == null ) { + + // ( types, fn ) + fn = selector; + data = selector = undefined; + } else if ( fn == null ) { + if ( typeof selector === "string" ) { + + // ( types, selector, fn ) + fn = data; + data = undefined; + } else { + + // ( types, data, fn ) + fn = data; + data = selector; + selector = undefined; + } + } + if ( fn === false ) { + fn = returnFalse; + } else if ( !fn ) { + return elem; + } + + if ( one === 1 ) { + origFn = fn; + fn = function( event ) { + + // Can use an empty set, since event contains the info + jQuery().off( event ); + return origFn.apply( this, arguments ); + }; + + // Use same guid so caller can remove using origFn + fn.guid = origFn.guid || ( origFn.guid = jQuery.guid++ ); + } + return elem.each( function() { + jQuery.event.add( this, types, fn, data, selector ); + } ); +} + +/* + * Helper functions for managing events -- not part of the public interface. + * Props to Dean Edwards' addEvent library for many of the ideas. + */ +jQuery.event = { + + global: {}, + + add: function( elem, types, handler, data, selector ) { + + var handleObjIn, eventHandle, tmp, + events, t, handleObj, + special, handlers, type, namespaces, origType, + elemData = dataPriv.get( elem ); + + // Only attach events to objects that accept data + if ( !acceptData( elem ) ) { + return; + } + + // Caller can pass in an object of custom data in lieu of the handler + if ( handler.handler ) { + handleObjIn = handler; + handler = handleObjIn.handler; + selector = handleObjIn.selector; + } + + // Ensure that invalid selectors throw exceptions at attach time + // Evaluate against documentElement in case elem is a non-element node (e.g., document) + if ( selector ) { + jQuery.find.matchesSelector( documentElement, selector ); + } + + // Make sure that the handler has a unique ID, used to find/remove it later + if ( !handler.guid ) { + handler.guid = jQuery.guid++; + } + + // Init the element's event structure and main handler, if this is the first + if ( !( events = elemData.events ) ) { + events = elemData.events = Object.create( null ); + } + if ( !( eventHandle = elemData.handle ) ) { + eventHandle = elemData.handle = function( e ) { + + // Discard the second event of a jQuery.event.trigger() and + // when an event is called after a page has unloaded + return typeof jQuery !== "undefined" && jQuery.event.triggered !== e.type ? 
+ jQuery.event.dispatch.apply( elem, arguments ) : undefined; + }; + } + + // Handle multiple events separated by a space + types = ( types || "" ).match( rnothtmlwhite ) || [ "" ]; + t = types.length; + while ( t-- ) { + tmp = rtypenamespace.exec( types[ t ] ) || []; + type = origType = tmp[ 1 ]; + namespaces = ( tmp[ 2 ] || "" ).split( "." ).sort(); + + // There *must* be a type, no attaching namespace-only handlers + if ( !type ) { + continue; + } + + // If event changes its type, use the special event handlers for the changed type + special = jQuery.event.special[ type ] || {}; + + // If selector defined, determine special event api type, otherwise given type + type = ( selector ? special.delegateType : special.bindType ) || type; + + // Update special based on newly reset type + special = jQuery.event.special[ type ] || {}; + + // handleObj is passed to all event handlers + handleObj = jQuery.extend( { + type: type, + origType: origType, + data: data, + handler: handler, + guid: handler.guid, + selector: selector, + needsContext: selector && jQuery.expr.match.needsContext.test( selector ), + namespace: namespaces.join( "." ) + }, handleObjIn ); + + // Init the event handler queue if we're the first + if ( !( handlers = events[ type ] ) ) { + handlers = events[ type ] = []; + handlers.delegateCount = 0; + + // Only use addEventListener if the special events handler returns false + if ( !special.setup || + special.setup.call( elem, data, namespaces, eventHandle ) === false ) { + + if ( elem.addEventListener ) { + elem.addEventListener( type, eventHandle ); + } + } + } + + if ( special.add ) { + special.add.call( elem, handleObj ); + + if ( !handleObj.handler.guid ) { + handleObj.handler.guid = handler.guid; + } + } + + // Add to the element's handler list, delegates in front + if ( selector ) { + handlers.splice( handlers.delegateCount++, 0, handleObj ); + } else { + handlers.push( handleObj ); + } + + // Keep track of which events have ever been used, for event optimization + jQuery.event.global[ type ] = true; + } + + }, + + // Detach an event or set of events from an element + remove: function( elem, types, handler, selector, mappedTypes ) { + + var j, origCount, tmp, + events, t, handleObj, + special, handlers, type, namespaces, origType, + elemData = dataPriv.hasData( elem ) && dataPriv.get( elem ); + + if ( !elemData || !( events = elemData.events ) ) { + return; + } + + // Once for each type.namespace in types; type may be omitted + types = ( types || "" ).match( rnothtmlwhite ) || [ "" ]; + t = types.length; + while ( t-- ) { + tmp = rtypenamespace.exec( types[ t ] ) || []; + type = origType = tmp[ 1 ]; + namespaces = ( tmp[ 2 ] || "" ).split( "." ).sort(); + + // Unbind all events (on this namespace, if provided) for the element + if ( !type ) { + for ( type in events ) { + jQuery.event.remove( elem, type + types[ t ], handler, selector, true ); + } + continue; + } + + special = jQuery.event.special[ type ] || {}; + type = ( selector ? 
special.delegateType : special.bindType ) || type; + handlers = events[ type ] || []; + tmp = tmp[ 2 ] && + new RegExp( "(^|\\.)" + namespaces.join( "\\.(?:.*\\.|)" ) + "(\\.|$)" ); + + // Remove matching events + origCount = j = handlers.length; + while ( j-- ) { + handleObj = handlers[ j ]; + + if ( ( mappedTypes || origType === handleObj.origType ) && + ( !handler || handler.guid === handleObj.guid ) && + ( !tmp || tmp.test( handleObj.namespace ) ) && + ( !selector || selector === handleObj.selector || + selector === "**" && handleObj.selector ) ) { + handlers.splice( j, 1 ); + + if ( handleObj.selector ) { + handlers.delegateCount--; + } + if ( special.remove ) { + special.remove.call( elem, handleObj ); + } + } + } + + // Remove generic event handler if we removed something and no more handlers exist + // (avoids potential for endless recursion during removal of special event handlers) + if ( origCount && !handlers.length ) { + if ( !special.teardown || + special.teardown.call( elem, namespaces, elemData.handle ) === false ) { + + jQuery.removeEvent( elem, type, elemData.handle ); + } + + delete events[ type ]; + } + } + + // Remove data and the expando if it's no longer used + if ( jQuery.isEmptyObject( events ) ) { + dataPriv.remove( elem, "handle events" ); + } + }, + + dispatch: function( nativeEvent ) { + + var i, j, ret, matched, handleObj, handlerQueue, + args = new Array( arguments.length ), + + // Make a writable jQuery.Event from the native event object + event = jQuery.event.fix( nativeEvent ), + + handlers = ( + dataPriv.get( this, "events" ) || Object.create( null ) + )[ event.type ] || [], + special = jQuery.event.special[ event.type ] || {}; + + // Use the fix-ed jQuery.Event rather than the (read-only) native event + args[ 0 ] = event; + + for ( i = 1; i < arguments.length; i++ ) { + args[ i ] = arguments[ i ]; + } + + event.delegateTarget = this; + + // Call the preDispatch hook for the mapped type, and let it bail if desired + if ( special.preDispatch && special.preDispatch.call( this, event ) === false ) { + return; + } + + // Determine handlers + handlerQueue = jQuery.event.handlers.call( this, event, handlers ); + + // Run delegates first; they may want to stop propagation beneath us + i = 0; + while ( ( matched = handlerQueue[ i++ ] ) && !event.isPropagationStopped() ) { + event.currentTarget = matched.elem; + + j = 0; + while ( ( handleObj = matched.handlers[ j++ ] ) && + !event.isImmediatePropagationStopped() ) { + + // If the event is namespaced, then each handler is only invoked if it is + // specially universal or its namespaces are a superset of the event's. 
+ if ( !event.rnamespace || handleObj.namespace === false || + event.rnamespace.test( handleObj.namespace ) ) { + + event.handleObj = handleObj; + event.data = handleObj.data; + + ret = ( ( jQuery.event.special[ handleObj.origType ] || {} ).handle || + handleObj.handler ).apply( matched.elem, args ); + + if ( ret !== undefined ) { + if ( ( event.result = ret ) === false ) { + event.preventDefault(); + event.stopPropagation(); + } + } + } + } + } + + // Call the postDispatch hook for the mapped type + if ( special.postDispatch ) { + special.postDispatch.call( this, event ); + } + + return event.result; + }, + + handlers: function( event, handlers ) { + var i, handleObj, sel, matchedHandlers, matchedSelectors, + handlerQueue = [], + delegateCount = handlers.delegateCount, + cur = event.target; + + // Find delegate handlers + if ( delegateCount && + + // Support: IE <=9 + // Black-hole SVG instance trees (trac-13180) + cur.nodeType && + + // Support: Firefox <=42 + // Suppress spec-violating clicks indicating a non-primary pointer button (trac-3861) + // https://www.w3.org/TR/DOM-Level-3-Events/#event-type-click + // Support: IE 11 only + // ...but not arrow key "clicks" of radio inputs, which can have `button` -1 (gh-2343) + !( event.type === "click" && event.button >= 1 ) ) { + + for ( ; cur !== this; cur = cur.parentNode || this ) { + + // Don't check non-elements (#13208) + // Don't process clicks on disabled elements (#6911, #8165, #11382, #11764) + if ( cur.nodeType === 1 && !( event.type === "click" && cur.disabled === true ) ) { + matchedHandlers = []; + matchedSelectors = {}; + for ( i = 0; i < delegateCount; i++ ) { + handleObj = handlers[ i ]; + + // Don't conflict with Object.prototype properties (#13203) + sel = handleObj.selector + " "; + + if ( matchedSelectors[ sel ] === undefined ) { + matchedSelectors[ sel ] = handleObj.needsContext ? + jQuery( sel, this ).index( cur ) > -1 : + jQuery.find( sel, this, null, [ cur ] ).length; + } + if ( matchedSelectors[ sel ] ) { + matchedHandlers.push( handleObj ); + } + } + if ( matchedHandlers.length ) { + handlerQueue.push( { elem: cur, handlers: matchedHandlers } ); + } + } + } + } + + // Add the remaining (directly-bound) handlers + cur = this; + if ( delegateCount < handlers.length ) { + handlerQueue.push( { elem: cur, handlers: handlers.slice( delegateCount ) } ); + } + + return handlerQueue; + }, + + addProp: function( name, hook ) { + Object.defineProperty( jQuery.Event.prototype, name, { + enumerable: true, + configurable: true, + + get: isFunction( hook ) ? + function() { + if ( this.originalEvent ) { + return hook( this.originalEvent ); + } + } : + function() { + if ( this.originalEvent ) { + return this.originalEvent[ name ]; + } + }, + + set: function( value ) { + Object.defineProperty( this, name, { + enumerable: true, + configurable: true, + writable: true, + value: value + } ); + } + } ); + }, + + fix: function( originalEvent ) { + return originalEvent[ jQuery.expando ] ? + originalEvent : + new jQuery.Event( originalEvent ); + }, + + special: { + load: { + + // Prevent triggered image.load events from bubbling to window.load + noBubble: true + }, + click: { + + // Utilize native event to ensure correct state for checkable inputs + setup: function( data ) { + + // For mutual compressibility with _default, replace `this` access with a local var. + // `|| data` is dead code meant only to preserve the variable through minification. 
+ var el = this || data; + + // Claim the first handler + if ( rcheckableType.test( el.type ) && + el.click && nodeName( el, "input" ) ) { + + // dataPriv.set( el, "click", ... ) + leverageNative( el, "click", returnTrue ); + } + + // Return false to allow normal processing in the caller + return false; + }, + trigger: function( data ) { + + // For mutual compressibility with _default, replace `this` access with a local var. + // `|| data` is dead code meant only to preserve the variable through minification. + var el = this || data; + + // Force setup before triggering a click + if ( rcheckableType.test( el.type ) && + el.click && nodeName( el, "input" ) ) { + + leverageNative( el, "click" ); + } + + // Return non-false to allow normal event-path propagation + return true; + }, + + // For cross-browser consistency, suppress native .click() on links + // Also prevent it if we're currently inside a leveraged native-event stack + _default: function( event ) { + var target = event.target; + return rcheckableType.test( target.type ) && + target.click && nodeName( target, "input" ) && + dataPriv.get( target, "click" ) || + nodeName( target, "a" ); + } + }, + + beforeunload: { + postDispatch: function( event ) { + + // Support: Firefox 20+ + // Firefox doesn't alert if the returnValue field is not set. + if ( event.result !== undefined && event.originalEvent ) { + event.originalEvent.returnValue = event.result; + } + } + } + } +}; + +// Ensure the presence of an event listener that handles manually-triggered +// synthetic events by interrupting progress until reinvoked in response to +// *native* events that it fires directly, ensuring that state changes have +// already occurred before other listeners are invoked. +function leverageNative( el, type, expectSync ) { + + // Missing expectSync indicates a trigger call, which must force setup through jQuery.event.add + if ( !expectSync ) { + if ( dataPriv.get( el, type ) === undefined ) { + jQuery.event.add( el, type, returnTrue ); + } + return; + } + + // Register the controller as a special universal handler for all event namespaces + dataPriv.set( el, type, false ); + jQuery.event.add( el, type, { + namespace: false, + handler: function( event ) { + var notAsync, result, + saved = dataPriv.get( this, type ); + + if ( ( event.isTrigger & 1 ) && this[ type ] ) { + + // Interrupt processing of the outer synthetic .trigger()ed event + // Saved data should be false in such cases, but might be a leftover capture object + // from an async native handler (gh-4350) + if ( !saved.length ) { + + // Store arguments for use when handling the inner native event + // There will always be at least one argument (an event object), so this array + // will not be confused with a leftover capture object. + saved = slice.call( arguments ); + dataPriv.set( this, type, saved ); + + // Trigger the native event and capture its result + // Support: IE <=9 - 11+ + // focus() and blur() are asynchronous + notAsync = expectSync( this, type ); + this[ type ](); + result = dataPriv.get( this, type ); + if ( saved !== result || notAsync ) { + dataPriv.set( this, type, false ); + } else { + result = {}; + } + if ( saved !== result ) { + + // Cancel the outer synthetic event + event.stopImmediatePropagation(); + event.preventDefault(); + + // Support: Chrome 86+ + // In Chrome, if an element having a focusout handler is blurred by + // clicking outside of it, it invokes the handler synchronously. 
If + // that handler calls `.remove()` on the element, the data is cleared, + // leaving `result` undefined. We need to guard against this. + return result && result.value; + } + + // If this is an inner synthetic event for an event with a bubbling surrogate + // (focus or blur), assume that the surrogate already propagated from triggering the + // native event and prevent that from happening again here. + // This technically gets the ordering wrong w.r.t. to `.trigger()` (in which the + // bubbling surrogate propagates *after* the non-bubbling base), but that seems + // less bad than duplication. + } else if ( ( jQuery.event.special[ type ] || {} ).delegateType ) { + event.stopPropagation(); + } + + // If this is a native event triggered above, everything is now in order + // Fire an inner synthetic event with the original arguments + } else if ( saved.length ) { + + // ...and capture the result + dataPriv.set( this, type, { + value: jQuery.event.trigger( + + // Support: IE <=9 - 11+ + // Extend with the prototype to reset the above stopImmediatePropagation() + jQuery.extend( saved[ 0 ], jQuery.Event.prototype ), + saved.slice( 1 ), + this + ) + } ); + + // Abort handling of the native event + event.stopImmediatePropagation(); + } + } + } ); +} + +jQuery.removeEvent = function( elem, type, handle ) { + + // This "if" is needed for plain objects + if ( elem.removeEventListener ) { + elem.removeEventListener( type, handle ); + } +}; + +jQuery.Event = function( src, props ) { + + // Allow instantiation without the 'new' keyword + if ( !( this instanceof jQuery.Event ) ) { + return new jQuery.Event( src, props ); + } + + // Event object + if ( src && src.type ) { + this.originalEvent = src; + this.type = src.type; + + // Events bubbling up the document may have been marked as prevented + // by a handler lower down the tree; reflect the correct value. + this.isDefaultPrevented = src.defaultPrevented || + src.defaultPrevented === undefined && + + // Support: Android <=2.3 only + src.returnValue === false ? + returnTrue : + returnFalse; + + // Create target properties + // Support: Safari <=6 - 7 only + // Target should not be a text node (#504, #13143) + this.target = ( src.target && src.target.nodeType === 3 ) ? 
+ src.target.parentNode : + src.target; + + this.currentTarget = src.currentTarget; + this.relatedTarget = src.relatedTarget; + + // Event type + } else { + this.type = src; + } + + // Put explicitly provided properties onto the event object + if ( props ) { + jQuery.extend( this, props ); + } + + // Create a timestamp if incoming event doesn't have one + this.timeStamp = src && src.timeStamp || Date.now(); + + // Mark it as fixed + this[ jQuery.expando ] = true; +}; + +// jQuery.Event is based on DOM3 Events as specified by the ECMAScript Language Binding +// https://www.w3.org/TR/2003/WD-DOM-Level-3-Events-20030331/ecma-script-binding.html +jQuery.Event.prototype = { + constructor: jQuery.Event, + isDefaultPrevented: returnFalse, + isPropagationStopped: returnFalse, + isImmediatePropagationStopped: returnFalse, + isSimulated: false, + + preventDefault: function() { + var e = this.originalEvent; + + this.isDefaultPrevented = returnTrue; + + if ( e && !this.isSimulated ) { + e.preventDefault(); + } + }, + stopPropagation: function() { + var e = this.originalEvent; + + this.isPropagationStopped = returnTrue; + + if ( e && !this.isSimulated ) { + e.stopPropagation(); + } + }, + stopImmediatePropagation: function() { + var e = this.originalEvent; + + this.isImmediatePropagationStopped = returnTrue; + + if ( e && !this.isSimulated ) { + e.stopImmediatePropagation(); + } + + this.stopPropagation(); + } +}; + +// Includes all common event props including KeyEvent and MouseEvent specific props +jQuery.each( { + altKey: true, + bubbles: true, + cancelable: true, + changedTouches: true, + ctrlKey: true, + detail: true, + eventPhase: true, + metaKey: true, + pageX: true, + pageY: true, + shiftKey: true, + view: true, + "char": true, + code: true, + charCode: true, + key: true, + keyCode: true, + button: true, + buttons: true, + clientX: true, + clientY: true, + offsetX: true, + offsetY: true, + pointerId: true, + pointerType: true, + screenX: true, + screenY: true, + targetTouches: true, + toElement: true, + touches: true, + which: true +}, jQuery.event.addProp ); + +jQuery.each( { focus: "focusin", blur: "focusout" }, function( type, delegateType ) { + jQuery.event.special[ type ] = { + + // Utilize native event if possible so blur/focus sequence is correct + setup: function() { + + // Claim the first handler + // dataPriv.set( this, "focus", ... ) + // dataPriv.set( this, "blur", ... ) + leverageNative( this, type, expectSync ); + + // Return false to allow normal processing in the caller + return false; + }, + trigger: function() { + + // Force setup before trigger + leverageNative( this, type ); + + // Return non-false to allow normal event-path propagation + return true; + }, + + // Suppress native focus or blur as it's already being fired + // in leverageNative. + _default: function() { + return true; + }, + + delegateType: delegateType + }; +} ); + +// Create mouseenter/leave events using mouseover/out and event-time checks +// so that event delegation works in jQuery. +// Do the same for pointerenter/pointerleave and pointerover/pointerout +// +// Support: Safari 7 only +// Safari sends mouseenter too often; see: +// https://bugs.chromium.org/p/chromium/issues/detail?id=470258 +// for the description of the bug (it existed in older Chrome versions as well). 
+jQuery.each( { + mouseenter: "mouseover", + mouseleave: "mouseout", + pointerenter: "pointerover", + pointerleave: "pointerout" +}, function( orig, fix ) { + jQuery.event.special[ orig ] = { + delegateType: fix, + bindType: fix, + + handle: function( event ) { + var ret, + target = this, + related = event.relatedTarget, + handleObj = event.handleObj; + + // For mouseenter/leave call the handler if related is outside the target. + // NB: No relatedTarget if the mouse left/entered the browser window + if ( !related || ( related !== target && !jQuery.contains( target, related ) ) ) { + event.type = handleObj.origType; + ret = handleObj.handler.apply( this, arguments ); + event.type = fix; + } + return ret; + } + }; +} ); + +jQuery.fn.extend( { + + on: function( types, selector, data, fn ) { + return on( this, types, selector, data, fn ); + }, + one: function( types, selector, data, fn ) { + return on( this, types, selector, data, fn, 1 ); + }, + off: function( types, selector, fn ) { + var handleObj, type; + if ( types && types.preventDefault && types.handleObj ) { + + // ( event ) dispatched jQuery.Event + handleObj = types.handleObj; + jQuery( types.delegateTarget ).off( + handleObj.namespace ? + handleObj.origType + "." + handleObj.namespace : + handleObj.origType, + handleObj.selector, + handleObj.handler + ); + return this; + } + if ( typeof types === "object" ) { + + // ( types-object [, selector] ) + for ( type in types ) { + this.off( type, selector, types[ type ] ); + } + return this; + } + if ( selector === false || typeof selector === "function" ) { + + // ( types [, fn] ) + fn = selector; + selector = undefined; + } + if ( fn === false ) { + fn = returnFalse; + } + return this.each( function() { + jQuery.event.remove( this, types, fn, selector ); + } ); + } +} ); + + +var + + // Support: IE <=10 - 11, Edge 12 - 13 only + // In IE/Edge using regex groups here causes severe slowdowns. + // See https://connect.microsoft.com/IE/feedback/details/1736512/ + rnoInnerhtml = /\s*$/g; + +// Prefer a tbody over its parent table for containing new rows +function manipulationTarget( elem, content ) { + if ( nodeName( elem, "table" ) && + nodeName( content.nodeType !== 11 ? content : content.firstChild, "tr" ) ) { + + return jQuery( elem ).children( "tbody" )[ 0 ] || elem; + } + + return elem; +} + +// Replace/restore the type attribute of script elements for safe DOM manipulation +function disableScript( elem ) { + elem.type = ( elem.getAttribute( "type" ) !== null ) + "/" + elem.type; + return elem; +} +function restoreScript( elem ) { + if ( ( elem.type || "" ).slice( 0, 5 ) === "true/" ) { + elem.type = elem.type.slice( 5 ); + } else { + elem.removeAttribute( "type" ); + } + + return elem; +} + +function cloneCopyEvent( src, dest ) { + var i, l, type, pdataOld, udataOld, udataCur, events; + + if ( dest.nodeType !== 1 ) { + return; + } + + // 1. Copy private data: events, handlers, etc. + if ( dataPriv.hasData( src ) ) { + pdataOld = dataPriv.get( src ); + events = pdataOld.events; + + if ( events ) { + dataPriv.remove( dest, "handle events" ); + + for ( type in events ) { + for ( i = 0, l = events[ type ].length; i < l; i++ ) { + jQuery.event.add( dest, type, events[ type ][ i ] ); + } + } + } + } + + // 2. 
Copy user data + if ( dataUser.hasData( src ) ) { + udataOld = dataUser.access( src ); + udataCur = jQuery.extend( {}, udataOld ); + + dataUser.set( dest, udataCur ); + } +} + +// Fix IE bugs, see support tests +function fixInput( src, dest ) { + var nodeName = dest.nodeName.toLowerCase(); + + // Fails to persist the checked state of a cloned checkbox or radio button. + if ( nodeName === "input" && rcheckableType.test( src.type ) ) { + dest.checked = src.checked; + + // Fails to return the selected option to the default selected state when cloning options + } else if ( nodeName === "input" || nodeName === "textarea" ) { + dest.defaultValue = src.defaultValue; + } +} + +function domManip( collection, args, callback, ignored ) { + + // Flatten any nested arrays + args = flat( args ); + + var fragment, first, scripts, hasScripts, node, doc, + i = 0, + l = collection.length, + iNoClone = l - 1, + value = args[ 0 ], + valueIsFunction = isFunction( value ); + + // We can't cloneNode fragments that contain checked, in WebKit + if ( valueIsFunction || + ( l > 1 && typeof value === "string" && + !support.checkClone && rchecked.test( value ) ) ) { + return collection.each( function( index ) { + var self = collection.eq( index ); + if ( valueIsFunction ) { + args[ 0 ] = value.call( this, index, self.html() ); + } + domManip( self, args, callback, ignored ); + } ); + } + + if ( l ) { + fragment = buildFragment( args, collection[ 0 ].ownerDocument, false, collection, ignored ); + first = fragment.firstChild; + + if ( fragment.childNodes.length === 1 ) { + fragment = first; + } + + // Require either new content or an interest in ignored elements to invoke the callback + if ( first || ignored ) { + scripts = jQuery.map( getAll( fragment, "script" ), disableScript ); + hasScripts = scripts.length; + + // Use the original fragment for the last item + // instead of the first because it can end up + // being emptied incorrectly in certain situations (#8070). + for ( ; i < l; i++ ) { + node = fragment; + + if ( i !== iNoClone ) { + node = jQuery.clone( node, true, true ); + + // Keep references to cloned scripts for later restoration + if ( hasScripts ) { + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( scripts, getAll( node, "script" ) ); + } + } + + callback.call( collection[ i ], node, i ); + } + + if ( hasScripts ) { + doc = scripts[ scripts.length - 1 ].ownerDocument; + + // Reenable scripts + jQuery.map( scripts, restoreScript ); + + // Evaluate executable scripts on first document insertion + for ( i = 0; i < hasScripts; i++ ) { + node = scripts[ i ]; + if ( rscriptType.test( node.type || "" ) && + !dataPriv.access( node, "globalEval" ) && + jQuery.contains( doc, node ) ) { + + if ( node.src && ( node.type || "" ).toLowerCase() !== "module" ) { + + // Optional AJAX dependency, but won't run scripts if not present + if ( jQuery._evalUrl && !node.noModule ) { + jQuery._evalUrl( node.src, { + nonce: node.nonce || node.getAttribute( "nonce" ) + }, doc ); + } + } else { + DOMEval( node.textContent.replace( rcleanScript, "" ), node, doc ); + } + } + } + } + } + } + + return collection; +} + +function remove( elem, selector, keepData ) { + var node, + nodes = selector ? 
jQuery.filter( selector, elem ) : elem, + i = 0; + + for ( ; ( node = nodes[ i ] ) != null; i++ ) { + if ( !keepData && node.nodeType === 1 ) { + jQuery.cleanData( getAll( node ) ); + } + + if ( node.parentNode ) { + if ( keepData && isAttached( node ) ) { + setGlobalEval( getAll( node, "script" ) ); + } + node.parentNode.removeChild( node ); + } + } + + return elem; +} + +jQuery.extend( { + htmlPrefilter: function( html ) { + return html; + }, + + clone: function( elem, dataAndEvents, deepDataAndEvents ) { + var i, l, srcElements, destElements, + clone = elem.cloneNode( true ), + inPage = isAttached( elem ); + + // Fix IE cloning issues + if ( !support.noCloneChecked && ( elem.nodeType === 1 || elem.nodeType === 11 ) && + !jQuery.isXMLDoc( elem ) ) { + + // We eschew Sizzle here for performance reasons: https://jsperf.com/getall-vs-sizzle/2 + destElements = getAll( clone ); + srcElements = getAll( elem ); + + for ( i = 0, l = srcElements.length; i < l; i++ ) { + fixInput( srcElements[ i ], destElements[ i ] ); + } + } + + // Copy the events from the original to the clone + if ( dataAndEvents ) { + if ( deepDataAndEvents ) { + srcElements = srcElements || getAll( elem ); + destElements = destElements || getAll( clone ); + + for ( i = 0, l = srcElements.length; i < l; i++ ) { + cloneCopyEvent( srcElements[ i ], destElements[ i ] ); + } + } else { + cloneCopyEvent( elem, clone ); + } + } + + // Preserve script evaluation history + destElements = getAll( clone, "script" ); + if ( destElements.length > 0 ) { + setGlobalEval( destElements, !inPage && getAll( elem, "script" ) ); + } + + // Return the cloned set + return clone; + }, + + cleanData: function( elems ) { + var data, elem, type, + special = jQuery.event.special, + i = 0; + + for ( ; ( elem = elems[ i ] ) !== undefined; i++ ) { + if ( acceptData( elem ) ) { + if ( ( data = elem[ dataPriv.expando ] ) ) { + if ( data.events ) { + for ( type in data.events ) { + if ( special[ type ] ) { + jQuery.event.remove( elem, type ); + + // This is a shortcut to avoid jQuery.event.remove's overhead + } else { + jQuery.removeEvent( elem, type, data.handle ); + } + } + } + + // Support: Chrome <=35 - 45+ + // Assign undefined instead of using delete, see Data#remove + elem[ dataPriv.expando ] = undefined; + } + if ( elem[ dataUser.expando ] ) { + + // Support: Chrome <=35 - 45+ + // Assign undefined instead of using delete, see Data#remove + elem[ dataUser.expando ] = undefined; + } + } + } + } +} ); + +jQuery.fn.extend( { + detach: function( selector ) { + return remove( this, selector, true ); + }, + + remove: function( selector ) { + return remove( this, selector ); + }, + + text: function( value ) { + return access( this, function( value ) { + return value === undefined ? 
+ jQuery.text( this ) : + this.empty().each( function() { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + this.textContent = value; + } + } ); + }, null, value, arguments.length ); + }, + + append: function() { + return domManip( this, arguments, function( elem ) { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + var target = manipulationTarget( this, elem ); + target.appendChild( elem ); + } + } ); + }, + + prepend: function() { + return domManip( this, arguments, function( elem ) { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + var target = manipulationTarget( this, elem ); + target.insertBefore( elem, target.firstChild ); + } + } ); + }, + + before: function() { + return domManip( this, arguments, function( elem ) { + if ( this.parentNode ) { + this.parentNode.insertBefore( elem, this ); + } + } ); + }, + + after: function() { + return domManip( this, arguments, function( elem ) { + if ( this.parentNode ) { + this.parentNode.insertBefore( elem, this.nextSibling ); + } + } ); + }, + + empty: function() { + var elem, + i = 0; + + for ( ; ( elem = this[ i ] ) != null; i++ ) { + if ( elem.nodeType === 1 ) { + + // Prevent memory leaks + jQuery.cleanData( getAll( elem, false ) ); + + // Remove any remaining nodes + elem.textContent = ""; + } + } + + return this; + }, + + clone: function( dataAndEvents, deepDataAndEvents ) { + dataAndEvents = dataAndEvents == null ? false : dataAndEvents; + deepDataAndEvents = deepDataAndEvents == null ? dataAndEvents : deepDataAndEvents; + + return this.map( function() { + return jQuery.clone( this, dataAndEvents, deepDataAndEvents ); + } ); + }, + + html: function( value ) { + return access( this, function( value ) { + var elem = this[ 0 ] || {}, + i = 0, + l = this.length; + + if ( value === undefined && elem.nodeType === 1 ) { + return elem.innerHTML; + } + + // See if we can take a shortcut and just use innerHTML + if ( typeof value === "string" && !rnoInnerhtml.test( value ) && + !wrapMap[ ( rtagName.exec( value ) || [ "", "" ] )[ 1 ].toLowerCase() ] ) { + + value = jQuery.htmlPrefilter( value ); + + try { + for ( ; i < l; i++ ) { + elem = this[ i ] || {}; + + // Remove element nodes and prevent memory leaks + if ( elem.nodeType === 1 ) { + jQuery.cleanData( getAll( elem, false ) ); + elem.innerHTML = value; + } + } + + elem = 0; + + // If using innerHTML throws an exception, use the fallback method + } catch ( e ) {} + } + + if ( elem ) { + this.empty().append( value ); + } + }, null, value, arguments.length ); + }, + + replaceWith: function() { + var ignored = []; + + // Make the changes, replacing each non-ignored context element with the new content + return domManip( this, arguments, function( elem ) { + var parent = this.parentNode; + + if ( jQuery.inArray( this, ignored ) < 0 ) { + jQuery.cleanData( getAll( this ) ); + if ( parent ) { + parent.replaceChild( elem, this ); + } + } + + // Force callback invocation + }, ignored ); + } +} ); + +jQuery.each( { + appendTo: "append", + prependTo: "prepend", + insertBefore: "before", + insertAfter: "after", + replaceAll: "replaceWith" +}, function( name, original ) { + jQuery.fn[ name ] = function( selector ) { + var elems, + ret = [], + insert = jQuery( selector ), + last = insert.length - 1, + i = 0; + + for ( ; i <= last; i++ ) { + elems = i === last ? 
this : this.clone( true ); + jQuery( insert[ i ] )[ original ]( elems ); + + // Support: Android <=4.0 only, PhantomJS 1 only + // .get() because push.apply(_, arraylike) throws on ancient WebKit + push.apply( ret, elems.get() ); + } + + return this.pushStack( ret ); + }; +} ); +var rnumnonpx = new RegExp( "^(" + pnum + ")(?!px)[a-z%]+$", "i" ); + +var getStyles = function( elem ) { + + // Support: IE <=11 only, Firefox <=30 (#15098, #14150) + // IE throws on elements created in popups + // FF meanwhile throws on frame elements through "defaultView.getComputedStyle" + var view = elem.ownerDocument.defaultView; + + if ( !view || !view.opener ) { + view = window; + } + + return view.getComputedStyle( elem ); + }; + +var swap = function( elem, options, callback ) { + var ret, name, + old = {}; + + // Remember the old values, and insert the new ones + for ( name in options ) { + old[ name ] = elem.style[ name ]; + elem.style[ name ] = options[ name ]; + } + + ret = callback.call( elem ); + + // Revert the old values + for ( name in options ) { + elem.style[ name ] = old[ name ]; + } + + return ret; +}; + + +var rboxStyle = new RegExp( cssExpand.join( "|" ), "i" ); + + + +( function() { + + // Executing both pixelPosition & boxSizingReliable tests require only one layout + // so they're executed at the same time to save the second computation. + function computeStyleTests() { + + // This is a singleton, we need to execute it only once + if ( !div ) { + return; + } + + container.style.cssText = "position:absolute;left:-11111px;width:60px;" + + "margin-top:1px;padding:0;border:0"; + div.style.cssText = + "position:relative;display:block;box-sizing:border-box;overflow:scroll;" + + "margin:auto;border:1px;padding:1px;" + + "width:60%;top:1%"; + documentElement.appendChild( container ).appendChild( div ); + + var divStyle = window.getComputedStyle( div ); + pixelPositionVal = divStyle.top !== "1%"; + + // Support: Android 4.0 - 4.3 only, Firefox <=3 - 44 + reliableMarginLeftVal = roundPixelMeasures( divStyle.marginLeft ) === 12; + + // Support: Android 4.0 - 4.3 only, Safari <=9.1 - 10.1, iOS <=7.0 - 9.3 + // Some styles come back with percentage values, even though they shouldn't + div.style.right = "60%"; + pixelBoxStylesVal = roundPixelMeasures( divStyle.right ) === 36; + + // Support: IE 9 - 11 only + // Detect misreporting of content dimensions for box-sizing:border-box elements + boxSizingReliableVal = roundPixelMeasures( divStyle.width ) === 36; + + // Support: IE 9 only + // Detect overflow:scroll screwiness (gh-3699) + // Support: Chrome <=64 + // Don't get tricked when zoom affects offsetWidth (gh-4029) + div.style.position = "absolute"; + scrollboxSizeVal = roundPixelMeasures( div.offsetWidth / 3 ) === 12; + + documentElement.removeChild( container ); + + // Nullify the div so it wouldn't be stored in the memory and + // it will also be a sign that checks already performed + div = null; + } + + function roundPixelMeasures( measure ) { + return Math.round( parseFloat( measure ) ); + } + + var pixelPositionVal, boxSizingReliableVal, scrollboxSizeVal, pixelBoxStylesVal, + reliableTrDimensionsVal, reliableMarginLeftVal, + container = document.createElement( "div" ), + div = document.createElement( "div" ); + + // Finish early in limited (non-browser) environments + if ( !div.style ) { + return; + } + + // Support: IE <=9 - 11 only + // Style of cloned element affects source element cloned (#8908) + div.style.backgroundClip = "content-box"; + div.cloneNode( true ).style.backgroundClip = ""; + 
support.clearCloneStyle = div.style.backgroundClip === "content-box"; + + jQuery.extend( support, { + boxSizingReliable: function() { + computeStyleTests(); + return boxSizingReliableVal; + }, + pixelBoxStyles: function() { + computeStyleTests(); + return pixelBoxStylesVal; + }, + pixelPosition: function() { + computeStyleTests(); + return pixelPositionVal; + }, + reliableMarginLeft: function() { + computeStyleTests(); + return reliableMarginLeftVal; + }, + scrollboxSize: function() { + computeStyleTests(); + return scrollboxSizeVal; + }, + + // Support: IE 9 - 11+, Edge 15 - 18+ + // IE/Edge misreport `getComputedStyle` of table rows with width/height + // set in CSS while `offset*` properties report correct values. + // Behavior in IE 9 is more subtle than in newer versions & it passes + // some versions of this test; make sure not to make it pass there! + // + // Support: Firefox 70+ + // Only Firefox includes border widths + // in computed dimensions. (gh-4529) + reliableTrDimensions: function() { + var table, tr, trChild, trStyle; + if ( reliableTrDimensionsVal == null ) { + table = document.createElement( "table" ); + tr = document.createElement( "tr" ); + trChild = document.createElement( "div" ); + + table.style.cssText = "position:absolute;left:-11111px;border-collapse:separate"; + tr.style.cssText = "border:1px solid"; + + // Support: Chrome 86+ + // Height set through cssText does not get applied. + // Computed height then comes back as 0. + tr.style.height = "1px"; + trChild.style.height = "9px"; + + // Support: Android 8 Chrome 86+ + // In our bodyBackground.html iframe, + // display for all div elements is set to "inline", + // which causes a problem only in Android 8 Chrome 86. + // Ensuring the div is display: block + // gets around this issue. + trChild.style.display = "block"; + + documentElement + .appendChild( table ) + .appendChild( tr ) + .appendChild( trChild ); + + trStyle = window.getComputedStyle( tr ); + reliableTrDimensionsVal = ( parseInt( trStyle.height, 10 ) + + parseInt( trStyle.borderTopWidth, 10 ) + + parseInt( trStyle.borderBottomWidth, 10 ) ) === tr.offsetHeight; + + documentElement.removeChild( table ); + } + return reliableTrDimensionsVal; + } + } ); +} )(); + + +function curCSS( elem, name, computed ) { + var width, minWidth, maxWidth, ret, + + // Support: Firefox 51+ + // Retrieving style before computed somehow + // fixes an issue with getting wrong values + // on detached elements + style = elem.style; + + computed = computed || getStyles( elem ); + + // getPropertyValue is needed for: + // .css('filter') (IE 9 only, #12537) + // .css('--customProperty) (#3144) + if ( computed ) { + ret = computed.getPropertyValue( name ) || computed[ name ]; + + if ( ret === "" && !isAttached( elem ) ) { + ret = jQuery.style( elem, name ); + } + + // A tribute to the "awesome hack by Dean Edwards" + // Android Browser returns percentage for some values, + // but width seems to be reliably pixels. 
+ // This is against the CSSOM draft spec: + // https://drafts.csswg.org/cssom/#resolved-values + if ( !support.pixelBoxStyles() && rnumnonpx.test( ret ) && rboxStyle.test( name ) ) { + + // Remember the original values + width = style.width; + minWidth = style.minWidth; + maxWidth = style.maxWidth; + + // Put in the new values to get a computed value out + style.minWidth = style.maxWidth = style.width = ret; + ret = computed.width; + + // Revert the changed values + style.width = width; + style.minWidth = minWidth; + style.maxWidth = maxWidth; + } + } + + return ret !== undefined ? + + // Support: IE <=9 - 11 only + // IE returns zIndex value as an integer. + ret + "" : + ret; +} + + +function addGetHookIf( conditionFn, hookFn ) { + + // Define the hook, we'll check on the first run if it's really needed. + return { + get: function() { + if ( conditionFn() ) { + + // Hook not needed (or it's not possible to use it due + // to missing dependency), remove it. + delete this.get; + return; + } + + // Hook needed; redefine it so that the support test is not executed again. + return ( this.get = hookFn ).apply( this, arguments ); + } + }; +} + + +var cssPrefixes = [ "Webkit", "Moz", "ms" ], + emptyStyle = document.createElement( "div" ).style, + vendorProps = {}; + +// Return a vendor-prefixed property or undefined +function vendorPropName( name ) { + + // Check for vendor prefixed names + var capName = name[ 0 ].toUpperCase() + name.slice( 1 ), + i = cssPrefixes.length; + + while ( i-- ) { + name = cssPrefixes[ i ] + capName; + if ( name in emptyStyle ) { + return name; + } + } +} + +// Return a potentially-mapped jQuery.cssProps or vendor prefixed property +function finalPropName( name ) { + var final = jQuery.cssProps[ name ] || vendorProps[ name ]; + + if ( final ) { + return final; + } + if ( name in emptyStyle ) { + return name; + } + return vendorProps[ name ] = vendorPropName( name ) || name; +} + + +var + + // Swappable if display is none or starts with table + // except "table", "table-cell", or "table-caption" + // See here for display values: https://developer.mozilla.org/en-US/docs/CSS/display + rdisplayswap = /^(none|table(?!-c[ea]).+)/, + rcustomProp = /^--/, + cssShow = { position: "absolute", visibility: "hidden", display: "block" }, + cssNormalTransform = { + letterSpacing: "0", + fontWeight: "400" + }; + +function setPositiveNumber( _elem, value, subtract ) { + + // Any relative (+/-) values have already been + // normalized at this point + var matches = rcssNum.exec( value ); + return matches ? + + // Guard against undefined "subtract", e.g., when used as in cssHooks + Math.max( 0, matches[ 2 ] - ( subtract || 0 ) ) + ( matches[ 3 ] || "px" ) : + value; +} + +function boxModelAdjustment( elem, dimension, box, isBorderBox, styles, computedVal ) { + var i = dimension === "width" ? 1 : 0, + extra = 0, + delta = 0; + + // Adjustment may not be necessary + if ( box === ( isBorderBox ? 
"border" : "content" ) ) { + return 0; + } + + for ( ; i < 4; i += 2 ) { + + // Both box models exclude margin + if ( box === "margin" ) { + delta += jQuery.css( elem, box + cssExpand[ i ], true, styles ); + } + + // If we get here with a content-box, we're seeking "padding" or "border" or "margin" + if ( !isBorderBox ) { + + // Add padding + delta += jQuery.css( elem, "padding" + cssExpand[ i ], true, styles ); + + // For "border" or "margin", add border + if ( box !== "padding" ) { + delta += jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + + // But still keep track of it otherwise + } else { + extra += jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + } + + // If we get here with a border-box (content + padding + border), we're seeking "content" or + // "padding" or "margin" + } else { + + // For "content", subtract padding + if ( box === "content" ) { + delta -= jQuery.css( elem, "padding" + cssExpand[ i ], true, styles ); + } + + // For "content" or "padding", subtract border + if ( box !== "margin" ) { + delta -= jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + } + } + } + + // Account for positive content-box scroll gutter when requested by providing computedVal + if ( !isBorderBox && computedVal >= 0 ) { + + // offsetWidth/offsetHeight is a rounded sum of content, padding, scroll gutter, and border + // Assuming integer scroll gutter, subtract the rest and round down + delta += Math.max( 0, Math.ceil( + elem[ "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ) ] - + computedVal - + delta - + extra - + 0.5 + + // If offsetWidth/offsetHeight is unknown, then we can't determine content-box scroll gutter + // Use an explicit zero to avoid NaN (gh-3964) + ) ) || 0; + } + + return delta; +} + +function getWidthOrHeight( elem, dimension, extra ) { + + // Start with computed style + var styles = getStyles( elem ), + + // To avoid forcing a reflow, only fetch boxSizing if we need it (gh-4322). + // Fake content-box until we know it's needed to know the true value. + boxSizingNeeded = !support.boxSizingReliable() || extra, + isBorderBox = boxSizingNeeded && + jQuery.css( elem, "boxSizing", false, styles ) === "border-box", + valueIsBorderBox = isBorderBox, + + val = curCSS( elem, dimension, styles ), + offsetProp = "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ); + + // Support: Firefox <=54 + // Return a confounding non-pixel value or feign ignorance, as appropriate. + if ( rnumnonpx.test( val ) ) { + if ( !extra ) { + return val; + } + val = "auto"; + } + + + // Support: IE 9 - 11 only + // Use offsetWidth/offsetHeight for when box sizing is unreliable. + // In those cases, the computed value can be trusted to be border-box. + if ( ( !support.boxSizingReliable() && isBorderBox || + + // Support: IE 10 - 11+, Edge 15 - 18+ + // IE/Edge misreport `getComputedStyle` of table rows with width/height + // set in CSS while `offset*` properties report correct values. + // Interestingly, in some cases IE 9 doesn't suffer from this issue. 
+ !support.reliableTrDimensions() && nodeName( elem, "tr" ) || + + // Fall back to offsetWidth/offsetHeight when value is "auto" + // This happens for inline elements with no explicit setting (gh-3571) + val === "auto" || + + // Support: Android <=4.1 - 4.3 only + // Also use offsetWidth/offsetHeight for misreported inline dimensions (gh-3602) + !parseFloat( val ) && jQuery.css( elem, "display", false, styles ) === "inline" ) && + + // Make sure the element is visible & connected + elem.getClientRects().length ) { + + isBorderBox = jQuery.css( elem, "boxSizing", false, styles ) === "border-box"; + + // Where available, offsetWidth/offsetHeight approximate border box dimensions. + // Where not available (e.g., SVG), assume unreliable box-sizing and interpret the + // retrieved value as a content box dimension. + valueIsBorderBox = offsetProp in elem; + if ( valueIsBorderBox ) { + val = elem[ offsetProp ]; + } + } + + // Normalize "" and auto + val = parseFloat( val ) || 0; + + // Adjust for the element's box model + return ( val + + boxModelAdjustment( + elem, + dimension, + extra || ( isBorderBox ? "border" : "content" ), + valueIsBorderBox, + styles, + + // Provide the current computed size to request scroll gutter calculation (gh-3589) + val + ) + ) + "px"; +} + +jQuery.extend( { + + // Add in style property hooks for overriding the default + // behavior of getting and setting a style property + cssHooks: { + opacity: { + get: function( elem, computed ) { + if ( computed ) { + + // We should always get a number back from opacity + var ret = curCSS( elem, "opacity" ); + return ret === "" ? "1" : ret; + } + } + } + }, + + // Don't automatically add "px" to these possibly-unitless properties + cssNumber: { + "animationIterationCount": true, + "columnCount": true, + "fillOpacity": true, + "flexGrow": true, + "flexShrink": true, + "fontWeight": true, + "gridArea": true, + "gridColumn": true, + "gridColumnEnd": true, + "gridColumnStart": true, + "gridRow": true, + "gridRowEnd": true, + "gridRowStart": true, + "lineHeight": true, + "opacity": true, + "order": true, + "orphans": true, + "widows": true, + "zIndex": true, + "zoom": true + }, + + // Add in properties whose names you wish to fix before + // setting or getting the value + cssProps: {}, + + // Get and set the style property on a DOM Node + style: function( elem, name, value, extra ) { + + // Don't set styles on text and comment nodes + if ( !elem || elem.nodeType === 3 || elem.nodeType === 8 || !elem.style ) { + return; + } + + // Make sure that we're working with the right name + var ret, type, hooks, + origName = camelCase( name ), + isCustomProp = rcustomProp.test( name ), + style = elem.style; + + // Make sure that we're working with the right name. We don't + // want to query the value if it is a CSS custom property + // since they are user-defined. 
+ if ( !isCustomProp ) { + name = finalPropName( origName ); + } + + // Gets hook for the prefixed version, then unprefixed version + hooks = jQuery.cssHooks[ name ] || jQuery.cssHooks[ origName ]; + + // Check if we're setting a value + if ( value !== undefined ) { + type = typeof value; + + // Convert "+=" or "-=" to relative numbers (#7345) + if ( type === "string" && ( ret = rcssNum.exec( value ) ) && ret[ 1 ] ) { + value = adjustCSS( elem, name, ret ); + + // Fixes bug #9237 + type = "number"; + } + + // Make sure that null and NaN values aren't set (#7116) + if ( value == null || value !== value ) { + return; + } + + // If a number was passed in, add the unit (except for certain CSS properties) + // The isCustomProp check can be removed in jQuery 4.0 when we only auto-append + // "px" to a few hardcoded values. + if ( type === "number" && !isCustomProp ) { + value += ret && ret[ 3 ] || ( jQuery.cssNumber[ origName ] ? "" : "px" ); + } + + // background-* props affect original clone's values + if ( !support.clearCloneStyle && value === "" && name.indexOf( "background" ) === 0 ) { + style[ name ] = "inherit"; + } + + // If a hook was provided, use that value, otherwise just set the specified value + if ( !hooks || !( "set" in hooks ) || + ( value = hooks.set( elem, value, extra ) ) !== undefined ) { + + if ( isCustomProp ) { + style.setProperty( name, value ); + } else { + style[ name ] = value; + } + } + + } else { + + // If a hook was provided get the non-computed value from there + if ( hooks && "get" in hooks && + ( ret = hooks.get( elem, false, extra ) ) !== undefined ) { + + return ret; + } + + // Otherwise just get the value from the style object + return style[ name ]; + } + }, + + css: function( elem, name, extra, styles ) { + var val, num, hooks, + origName = camelCase( name ), + isCustomProp = rcustomProp.test( name ); + + // Make sure that we're working with the right name. We don't + // want to modify the value if it is a CSS custom property + // since they are user-defined. + if ( !isCustomProp ) { + name = finalPropName( origName ); + } + + // Try prefixed name followed by the unprefixed name + hooks = jQuery.cssHooks[ name ] || jQuery.cssHooks[ origName ]; + + // If a hook was provided get the computed value from there + if ( hooks && "get" in hooks ) { + val = hooks.get( elem, true, extra ); + } + + // Otherwise, if a way to get the computed value exists, use that + if ( val === undefined ) { + val = curCSS( elem, name, styles ); + } + + // Convert "normal" to computed value + if ( val === "normal" && name in cssNormalTransform ) { + val = cssNormalTransform[ name ]; + } + + // Make numeric if forced or a qualifier was provided and val looks numeric + if ( extra === "" || extra ) { + num = parseFloat( val ); + return extra === true || isFinite( num ) ? num || 0 : val; + } + + return val; + } +} ); + +jQuery.each( [ "height", "width" ], function( _i, dimension ) { + jQuery.cssHooks[ dimension ] = { + get: function( elem, computed, extra ) { + if ( computed ) { + + // Certain elements can have dimension info if we invisibly show them + // but it must have a current display style that would benefit + return rdisplayswap.test( jQuery.css( elem, "display" ) ) && + + // Support: Safari 8+ + // Table columns in Safari have non-zero offsetWidth & zero + // getBoundingClientRect().width unless display is changed. + // Support: IE <=11 only + // Running getBoundingClientRect on a disconnected node + // in IE throws an error. 
+ ( !elem.getClientRects().length || !elem.getBoundingClientRect().width ) ? + swap( elem, cssShow, function() { + return getWidthOrHeight( elem, dimension, extra ); + } ) : + getWidthOrHeight( elem, dimension, extra ); + } + }, + + set: function( elem, value, extra ) { + var matches, + styles = getStyles( elem ), + + // Only read styles.position if the test has a chance to fail + // to avoid forcing a reflow. + scrollboxSizeBuggy = !support.scrollboxSize() && + styles.position === "absolute", + + // To avoid forcing a reflow, only fetch boxSizing if we need it (gh-3991) + boxSizingNeeded = scrollboxSizeBuggy || extra, + isBorderBox = boxSizingNeeded && + jQuery.css( elem, "boxSizing", false, styles ) === "border-box", + subtract = extra ? + boxModelAdjustment( + elem, + dimension, + extra, + isBorderBox, + styles + ) : + 0; + + // Account for unreliable border-box dimensions by comparing offset* to computed and + // faking a content-box to get border and padding (gh-3699) + if ( isBorderBox && scrollboxSizeBuggy ) { + subtract -= Math.ceil( + elem[ "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ) ] - + parseFloat( styles[ dimension ] ) - + boxModelAdjustment( elem, dimension, "border", false, styles ) - + 0.5 + ); + } + + // Convert to pixels if value adjustment is needed + if ( subtract && ( matches = rcssNum.exec( value ) ) && + ( matches[ 3 ] || "px" ) !== "px" ) { + + elem.style[ dimension ] = value; + value = jQuery.css( elem, dimension ); + } + + return setPositiveNumber( elem, value, subtract ); + } + }; +} ); + +jQuery.cssHooks.marginLeft = addGetHookIf( support.reliableMarginLeft, + function( elem, computed ) { + if ( computed ) { + return ( parseFloat( curCSS( elem, "marginLeft" ) ) || + elem.getBoundingClientRect().left - + swap( elem, { marginLeft: 0 }, function() { + return elem.getBoundingClientRect().left; + } ) + ) + "px"; + } + } +); + +// These hooks are used by animate to expand properties +jQuery.each( { + margin: "", + padding: "", + border: "Width" +}, function( prefix, suffix ) { + jQuery.cssHooks[ prefix + suffix ] = { + expand: function( value ) { + var i = 0, + expanded = {}, + + // Assumes a single number if not a string + parts = typeof value === "string" ? value.split( " " ) : [ value ]; + + for ( ; i < 4; i++ ) { + expanded[ prefix + cssExpand[ i ] + suffix ] = + parts[ i ] || parts[ i - 2 ] || parts[ 0 ]; + } + + return expanded; + } + }; + + if ( prefix !== "margin" ) { + jQuery.cssHooks[ prefix + suffix ].set = setPositiveNumber; + } +} ); + +jQuery.fn.extend( { + css: function( name, value ) { + return access( this, function( elem, name, value ) { + var styles, len, + map = {}, + i = 0; + + if ( Array.isArray( name ) ) { + styles = getStyles( elem ); + len = name.length; + + for ( ; i < len; i++ ) { + map[ name[ i ] ] = jQuery.css( elem, name[ i ], false, styles ); + } + + return map; + } + + return value !== undefined ? + jQuery.style( elem, name, value ) : + jQuery.css( elem, name ); + }, name, value, arguments.length > 1 ); + } +} ); + + +function Tween( elem, options, prop, end, easing ) { + return new Tween.prototype.init( elem, options, prop, end, easing ); +} +jQuery.Tween = Tween; + +Tween.prototype = { + constructor: Tween, + init: function( elem, options, prop, end, easing, unit ) { + this.elem = elem; + this.prop = prop; + this.easing = easing || jQuery.easing._default; + this.options = options; + this.start = this.now = this.cur(); + this.end = end; + this.unit = unit || ( jQuery.cssNumber[ prop ] ? 
"" : "px" ); + }, + cur: function() { + var hooks = Tween.propHooks[ this.prop ]; + + return hooks && hooks.get ? + hooks.get( this ) : + Tween.propHooks._default.get( this ); + }, + run: function( percent ) { + var eased, + hooks = Tween.propHooks[ this.prop ]; + + if ( this.options.duration ) { + this.pos = eased = jQuery.easing[ this.easing ]( + percent, this.options.duration * percent, 0, 1, this.options.duration + ); + } else { + this.pos = eased = percent; + } + this.now = ( this.end - this.start ) * eased + this.start; + + if ( this.options.step ) { + this.options.step.call( this.elem, this.now, this ); + } + + if ( hooks && hooks.set ) { + hooks.set( this ); + } else { + Tween.propHooks._default.set( this ); + } + return this; + } +}; + +Tween.prototype.init.prototype = Tween.prototype; + +Tween.propHooks = { + _default: { + get: function( tween ) { + var result; + + // Use a property on the element directly when it is not a DOM element, + // or when there is no matching style property that exists. + if ( tween.elem.nodeType !== 1 || + tween.elem[ tween.prop ] != null && tween.elem.style[ tween.prop ] == null ) { + return tween.elem[ tween.prop ]; + } + + // Passing an empty string as a 3rd parameter to .css will automatically + // attempt a parseFloat and fallback to a string if the parse fails. + // Simple values such as "10px" are parsed to Float; + // complex values such as "rotate(1rad)" are returned as-is. + result = jQuery.css( tween.elem, tween.prop, "" ); + + // Empty strings, null, undefined and "auto" are converted to 0. + return !result || result === "auto" ? 0 : result; + }, + set: function( tween ) { + + // Use step hook for back compat. + // Use cssHook if its there. + // Use .style if available and use plain properties where available. + if ( jQuery.fx.step[ tween.prop ] ) { + jQuery.fx.step[ tween.prop ]( tween ); + } else if ( tween.elem.nodeType === 1 && ( + jQuery.cssHooks[ tween.prop ] || + tween.elem.style[ finalPropName( tween.prop ) ] != null ) ) { + jQuery.style( tween.elem, tween.prop, tween.now + tween.unit ); + } else { + tween.elem[ tween.prop ] = tween.now; + } + } + } +}; + +// Support: IE <=9 only +// Panic based approach to setting things on disconnected nodes +Tween.propHooks.scrollTop = Tween.propHooks.scrollLeft = { + set: function( tween ) { + if ( tween.elem.nodeType && tween.elem.parentNode ) { + tween.elem[ tween.prop ] = tween.now; + } + } +}; + +jQuery.easing = { + linear: function( p ) { + return p; + }, + swing: function( p ) { + return 0.5 - Math.cos( p * Math.PI ) / 2; + }, + _default: "swing" +}; + +jQuery.fx = Tween.prototype.init; + +// Back compat <1.8 extension point +jQuery.fx.step = {}; + + + + +var + fxNow, inProgress, + rfxtypes = /^(?:toggle|show|hide)$/, + rrun = /queueHooks$/; + +function schedule() { + if ( inProgress ) { + if ( document.hidden === false && window.requestAnimationFrame ) { + window.requestAnimationFrame( schedule ); + } else { + window.setTimeout( schedule, jQuery.fx.interval ); + } + + jQuery.fx.tick(); + } +} + +// Animations created synchronously will run synchronously +function createFxNow() { + window.setTimeout( function() { + fxNow = undefined; + } ); + return ( fxNow = Date.now() ); +} + +// Generate parameters to create a standard animation +function genFx( type, includeWidth ) { + var which, + i = 0, + attrs = { height: type }; + + // If we include width, step value is 1 to do all cssExpand values, + // otherwise step value is 2 to skip over Left and Right + includeWidth = includeWidth ? 
1 : 0; + for ( ; i < 4; i += 2 - includeWidth ) { + which = cssExpand[ i ]; + attrs[ "margin" + which ] = attrs[ "padding" + which ] = type; + } + + if ( includeWidth ) { + attrs.opacity = attrs.width = type; + } + + return attrs; +} + +function createTween( value, prop, animation ) { + var tween, + collection = ( Animation.tweeners[ prop ] || [] ).concat( Animation.tweeners[ "*" ] ), + index = 0, + length = collection.length; + for ( ; index < length; index++ ) { + if ( ( tween = collection[ index ].call( animation, prop, value ) ) ) { + + // We're done with this property + return tween; + } + } +} + +function defaultPrefilter( elem, props, opts ) { + var prop, value, toggle, hooks, oldfire, propTween, restoreDisplay, display, + isBox = "width" in props || "height" in props, + anim = this, + orig = {}, + style = elem.style, + hidden = elem.nodeType && isHiddenWithinTree( elem ), + dataShow = dataPriv.get( elem, "fxshow" ); + + // Queue-skipping animations hijack the fx hooks + if ( !opts.queue ) { + hooks = jQuery._queueHooks( elem, "fx" ); + if ( hooks.unqueued == null ) { + hooks.unqueued = 0; + oldfire = hooks.empty.fire; + hooks.empty.fire = function() { + if ( !hooks.unqueued ) { + oldfire(); + } + }; + } + hooks.unqueued++; + + anim.always( function() { + + // Ensure the complete handler is called before this completes + anim.always( function() { + hooks.unqueued--; + if ( !jQuery.queue( elem, "fx" ).length ) { + hooks.empty.fire(); + } + } ); + } ); + } + + // Detect show/hide animations + for ( prop in props ) { + value = props[ prop ]; + if ( rfxtypes.test( value ) ) { + delete props[ prop ]; + toggle = toggle || value === "toggle"; + if ( value === ( hidden ? "hide" : "show" ) ) { + + // Pretend to be hidden if this is a "show" and + // there is still data from a stopped show/hide + if ( value === "show" && dataShow && dataShow[ prop ] !== undefined ) { + hidden = true; + + // Ignore all other no-op show/hide data + } else { + continue; + } + } + orig[ prop ] = dataShow && dataShow[ prop ] || jQuery.style( elem, prop ); + } + } + + // Bail out if this is a no-op like .hide().hide() + propTween = !jQuery.isEmptyObject( props ); + if ( !propTween && jQuery.isEmptyObject( orig ) ) { + return; + } + + // Restrict "overflow" and "display" styles during box animations + if ( isBox && elem.nodeType === 1 ) { + + // Support: IE <=9 - 11, Edge 12 - 15 + // Record all 3 overflow attributes because IE does not infer the shorthand + // from identically-valued overflowX and overflowY and Edge just mirrors + // the overflowX value there. 
+ opts.overflow = [ style.overflow, style.overflowX, style.overflowY ]; + + // Identify a display type, preferring old show/hide data over the CSS cascade + restoreDisplay = dataShow && dataShow.display; + if ( restoreDisplay == null ) { + restoreDisplay = dataPriv.get( elem, "display" ); + } + display = jQuery.css( elem, "display" ); + if ( display === "none" ) { + if ( restoreDisplay ) { + display = restoreDisplay; + } else { + + // Get nonempty value(s) by temporarily forcing visibility + showHide( [ elem ], true ); + restoreDisplay = elem.style.display || restoreDisplay; + display = jQuery.css( elem, "display" ); + showHide( [ elem ] ); + } + } + + // Animate inline elements as inline-block + if ( display === "inline" || display === "inline-block" && restoreDisplay != null ) { + if ( jQuery.css( elem, "float" ) === "none" ) { + + // Restore the original display value at the end of pure show/hide animations + if ( !propTween ) { + anim.done( function() { + style.display = restoreDisplay; + } ); + if ( restoreDisplay == null ) { + display = style.display; + restoreDisplay = display === "none" ? "" : display; + } + } + style.display = "inline-block"; + } + } + } + + if ( opts.overflow ) { + style.overflow = "hidden"; + anim.always( function() { + style.overflow = opts.overflow[ 0 ]; + style.overflowX = opts.overflow[ 1 ]; + style.overflowY = opts.overflow[ 2 ]; + } ); + } + + // Implement show/hide animations + propTween = false; + for ( prop in orig ) { + + // General show/hide setup for this element animation + if ( !propTween ) { + if ( dataShow ) { + if ( "hidden" in dataShow ) { + hidden = dataShow.hidden; + } + } else { + dataShow = dataPriv.access( elem, "fxshow", { display: restoreDisplay } ); + } + + // Store hidden/visible for toggle so `.stop().toggle()` "reverses" + if ( toggle ) { + dataShow.hidden = !hidden; + } + + // Show elements before animating them + if ( hidden ) { + showHide( [ elem ], true ); + } + + /* eslint-disable no-loop-func */ + + anim.done( function() { + + /* eslint-enable no-loop-func */ + + // The final step of a "hide" animation is actually hiding the element + if ( !hidden ) { + showHide( [ elem ] ); + } + dataPriv.remove( elem, "fxshow" ); + for ( prop in orig ) { + jQuery.style( elem, prop, orig[ prop ] ); + } + } ); + } + + // Per-property setup + propTween = createTween( hidden ? dataShow[ prop ] : 0, prop, anim ); + if ( !( prop in dataShow ) ) { + dataShow[ prop ] = propTween.start; + if ( hidden ) { + propTween.end = propTween.start; + propTween.start = 0; + } + } + } +} + +function propFilter( props, specialEasing ) { + var index, name, easing, value, hooks; + + // camelCase, specialEasing and expand cssHook pass + for ( index in props ) { + name = camelCase( index ); + easing = specialEasing[ name ]; + value = props[ index ]; + if ( Array.isArray( value ) ) { + easing = value[ 1 ]; + value = props[ index ] = value[ 0 ]; + } + + if ( index !== name ) { + props[ name ] = value; + delete props[ index ]; + } + + hooks = jQuery.cssHooks[ name ]; + if ( hooks && "expand" in hooks ) { + value = hooks.expand( value ); + delete props[ name ]; + + // Not quite $.extend, this won't overwrite existing keys. 
+ // Reusing 'index' because we have the correct "name" + for ( index in value ) { + if ( !( index in props ) ) { + props[ index ] = value[ index ]; + specialEasing[ index ] = easing; + } + } + } else { + specialEasing[ name ] = easing; + } + } +} + +function Animation( elem, properties, options ) { + var result, + stopped, + index = 0, + length = Animation.prefilters.length, + deferred = jQuery.Deferred().always( function() { + + // Don't match elem in the :animated selector + delete tick.elem; + } ), + tick = function() { + if ( stopped ) { + return false; + } + var currentTime = fxNow || createFxNow(), + remaining = Math.max( 0, animation.startTime + animation.duration - currentTime ), + + // Support: Android 2.3 only + // Archaic crash bug won't allow us to use `1 - ( 0.5 || 0 )` (#12497) + temp = remaining / animation.duration || 0, + percent = 1 - temp, + index = 0, + length = animation.tweens.length; + + for ( ; index < length; index++ ) { + animation.tweens[ index ].run( percent ); + } + + deferred.notifyWith( elem, [ animation, percent, remaining ] ); + + // If there's more to do, yield + if ( percent < 1 && length ) { + return remaining; + } + + // If this was an empty animation, synthesize a final progress notification + if ( !length ) { + deferred.notifyWith( elem, [ animation, 1, 0 ] ); + } + + // Resolve the animation and report its conclusion + deferred.resolveWith( elem, [ animation ] ); + return false; + }, + animation = deferred.promise( { + elem: elem, + props: jQuery.extend( {}, properties ), + opts: jQuery.extend( true, { + specialEasing: {}, + easing: jQuery.easing._default + }, options ), + originalProperties: properties, + originalOptions: options, + startTime: fxNow || createFxNow(), + duration: options.duration, + tweens: [], + createTween: function( prop, end ) { + var tween = jQuery.Tween( elem, animation.opts, prop, end, + animation.opts.specialEasing[ prop ] || animation.opts.easing ); + animation.tweens.push( tween ); + return tween; + }, + stop: function( gotoEnd ) { + var index = 0, + + // If we are going to the end, we want to run all the tweens + // otherwise we skip this part + length = gotoEnd ? 
animation.tweens.length : 0; + if ( stopped ) { + return this; + } + stopped = true; + for ( ; index < length; index++ ) { + animation.tweens[ index ].run( 1 ); + } + + // Resolve when we played the last frame; otherwise, reject + if ( gotoEnd ) { + deferred.notifyWith( elem, [ animation, 1, 0 ] ); + deferred.resolveWith( elem, [ animation, gotoEnd ] ); + } else { + deferred.rejectWith( elem, [ animation, gotoEnd ] ); + } + return this; + } + } ), + props = animation.props; + + propFilter( props, animation.opts.specialEasing ); + + for ( ; index < length; index++ ) { + result = Animation.prefilters[ index ].call( animation, elem, props, animation.opts ); + if ( result ) { + if ( isFunction( result.stop ) ) { + jQuery._queueHooks( animation.elem, animation.opts.queue ).stop = + result.stop.bind( result ); + } + return result; + } + } + + jQuery.map( props, createTween, animation ); + + if ( isFunction( animation.opts.start ) ) { + animation.opts.start.call( elem, animation ); + } + + // Attach callbacks from options + animation + .progress( animation.opts.progress ) + .done( animation.opts.done, animation.opts.complete ) + .fail( animation.opts.fail ) + .always( animation.opts.always ); + + jQuery.fx.timer( + jQuery.extend( tick, { + elem: elem, + anim: animation, + queue: animation.opts.queue + } ) + ); + + return animation; +} + +jQuery.Animation = jQuery.extend( Animation, { + + tweeners: { + "*": [ function( prop, value ) { + var tween = this.createTween( prop, value ); + adjustCSS( tween.elem, prop, rcssNum.exec( value ), tween ); + return tween; + } ] + }, + + tweener: function( props, callback ) { + if ( isFunction( props ) ) { + callback = props; + props = [ "*" ]; + } else { + props = props.match( rnothtmlwhite ); + } + + var prop, + index = 0, + length = props.length; + + for ( ; index < length; index++ ) { + prop = props[ index ]; + Animation.tweeners[ prop ] = Animation.tweeners[ prop ] || []; + Animation.tweeners[ prop ].unshift( callback ); + } + }, + + prefilters: [ defaultPrefilter ], + + prefilter: function( callback, prepend ) { + if ( prepend ) { + Animation.prefilters.unshift( callback ); + } else { + Animation.prefilters.push( callback ); + } + } +} ); + +jQuery.speed = function( speed, easing, fn ) { + var opt = speed && typeof speed === "object" ? 
jQuery.extend( {}, speed ) : { + complete: fn || !fn && easing || + isFunction( speed ) && speed, + duration: speed, + easing: fn && easing || easing && !isFunction( easing ) && easing + }; + + // Go to the end state if fx are off + if ( jQuery.fx.off ) { + opt.duration = 0; + + } else { + if ( typeof opt.duration !== "number" ) { + if ( opt.duration in jQuery.fx.speeds ) { + opt.duration = jQuery.fx.speeds[ opt.duration ]; + + } else { + opt.duration = jQuery.fx.speeds._default; + } + } + } + + // Normalize opt.queue - true/undefined/null -> "fx" + if ( opt.queue == null || opt.queue === true ) { + opt.queue = "fx"; + } + + // Queueing + opt.old = opt.complete; + + opt.complete = function() { + if ( isFunction( opt.old ) ) { + opt.old.call( this ); + } + + if ( opt.queue ) { + jQuery.dequeue( this, opt.queue ); + } + }; + + return opt; +}; + +jQuery.fn.extend( { + fadeTo: function( speed, to, easing, callback ) { + + // Show any hidden elements after setting opacity to 0 + return this.filter( isHiddenWithinTree ).css( "opacity", 0 ).show() + + // Animate to the value specified + .end().animate( { opacity: to }, speed, easing, callback ); + }, + animate: function( prop, speed, easing, callback ) { + var empty = jQuery.isEmptyObject( prop ), + optall = jQuery.speed( speed, easing, callback ), + doAnimation = function() { + + // Operate on a copy of prop so per-property easing won't be lost + var anim = Animation( this, jQuery.extend( {}, prop ), optall ); + + // Empty animations, or finishing resolves immediately + if ( empty || dataPriv.get( this, "finish" ) ) { + anim.stop( true ); + } + }; + + doAnimation.finish = doAnimation; + + return empty || optall.queue === false ? + this.each( doAnimation ) : + this.queue( optall.queue, doAnimation ); + }, + stop: function( type, clearQueue, gotoEnd ) { + var stopQueue = function( hooks ) { + var stop = hooks.stop; + delete hooks.stop; + stop( gotoEnd ); + }; + + if ( typeof type !== "string" ) { + gotoEnd = clearQueue; + clearQueue = type; + type = undefined; + } + if ( clearQueue ) { + this.queue( type || "fx", [] ); + } + + return this.each( function() { + var dequeue = true, + index = type != null && type + "queueHooks", + timers = jQuery.timers, + data = dataPriv.get( this ); + + if ( index ) { + if ( data[ index ] && data[ index ].stop ) { + stopQueue( data[ index ] ); + } + } else { + for ( index in data ) { + if ( data[ index ] && data[ index ].stop && rrun.test( index ) ) { + stopQueue( data[ index ] ); + } + } + } + + for ( index = timers.length; index--; ) { + if ( timers[ index ].elem === this && + ( type == null || timers[ index ].queue === type ) ) { + + timers[ index ].anim.stop( gotoEnd ); + dequeue = false; + timers.splice( index, 1 ); + } + } + + // Start the next in the queue if the last step wasn't forced. + // Timers currently will call their complete callbacks, which + // will dequeue but only if they were gotoEnd. + if ( dequeue || !gotoEnd ) { + jQuery.dequeue( this, type ); + } + } ); + }, + finish: function( type ) { + if ( type !== false ) { + type = type || "fx"; + } + return this.each( function() { + var index, + data = dataPriv.get( this ), + queue = data[ type + "queue" ], + hooks = data[ type + "queueHooks" ], + timers = jQuery.timers, + length = queue ? 
queue.length : 0; + + // Enable finishing flag on private data + data.finish = true; + + // Empty the queue first + jQuery.queue( this, type, [] ); + + if ( hooks && hooks.stop ) { + hooks.stop.call( this, true ); + } + + // Look for any active animations, and finish them + for ( index = timers.length; index--; ) { + if ( timers[ index ].elem === this && timers[ index ].queue === type ) { + timers[ index ].anim.stop( true ); + timers.splice( index, 1 ); + } + } + + // Look for any animations in the old queue and finish them + for ( index = 0; index < length; index++ ) { + if ( queue[ index ] && queue[ index ].finish ) { + queue[ index ].finish.call( this ); + } + } + + // Turn off finishing flag + delete data.finish; + } ); + } +} ); + +jQuery.each( [ "toggle", "show", "hide" ], function( _i, name ) { + var cssFn = jQuery.fn[ name ]; + jQuery.fn[ name ] = function( speed, easing, callback ) { + return speed == null || typeof speed === "boolean" ? + cssFn.apply( this, arguments ) : + this.animate( genFx( name, true ), speed, easing, callback ); + }; +} ); + +// Generate shortcuts for custom animations +jQuery.each( { + slideDown: genFx( "show" ), + slideUp: genFx( "hide" ), + slideToggle: genFx( "toggle" ), + fadeIn: { opacity: "show" }, + fadeOut: { opacity: "hide" }, + fadeToggle: { opacity: "toggle" } +}, function( name, props ) { + jQuery.fn[ name ] = function( speed, easing, callback ) { + return this.animate( props, speed, easing, callback ); + }; +} ); + +jQuery.timers = []; +jQuery.fx.tick = function() { + var timer, + i = 0, + timers = jQuery.timers; + + fxNow = Date.now(); + + for ( ; i < timers.length; i++ ) { + timer = timers[ i ]; + + // Run the timer and safely remove it when done (allowing for external removal) + if ( !timer() && timers[ i ] === timer ) { + timers.splice( i--, 1 ); + } + } + + if ( !timers.length ) { + jQuery.fx.stop(); + } + fxNow = undefined; +}; + +jQuery.fx.timer = function( timer ) { + jQuery.timers.push( timer ); + jQuery.fx.start(); +}; + +jQuery.fx.interval = 13; +jQuery.fx.start = function() { + if ( inProgress ) { + return; + } + + inProgress = true; + schedule(); +}; + +jQuery.fx.stop = function() { + inProgress = null; +}; + +jQuery.fx.speeds = { + slow: 600, + fast: 200, + + // Default speed + _default: 400 +}; + + +// Based off of the plugin by Clint Helfers, with permission. +// https://web.archive.org/web/20100324014747/http://blindsignals.com/index.php/2009/07/jquery-delay/ +jQuery.fn.delay = function( time, type ) { + time = jQuery.fx ? 
jQuery.fx.speeds[ time ] || time : time; + type = type || "fx"; + + return this.queue( type, function( next, hooks ) { + var timeout = window.setTimeout( next, time ); + hooks.stop = function() { + window.clearTimeout( timeout ); + }; + } ); +}; + + +( function() { + var input = document.createElement( "input" ), + select = document.createElement( "select" ), + opt = select.appendChild( document.createElement( "option" ) ); + + input.type = "checkbox"; + + // Support: Android <=4.3 only + // Default value for a checkbox should be "on" + support.checkOn = input.value !== ""; + + // Support: IE <=11 only + // Must access selectedIndex to make default options select + support.optSelected = opt.selected; + + // Support: IE <=11 only + // An input loses its value after becoming a radio + input = document.createElement( "input" ); + input.value = "t"; + input.type = "radio"; + support.radioValue = input.value === "t"; +} )(); + + +var boolHook, + attrHandle = jQuery.expr.attrHandle; + +jQuery.fn.extend( { + attr: function( name, value ) { + return access( this, jQuery.attr, name, value, arguments.length > 1 ); + }, + + removeAttr: function( name ) { + return this.each( function() { + jQuery.removeAttr( this, name ); + } ); + } +} ); + +jQuery.extend( { + attr: function( elem, name, value ) { + var ret, hooks, + nType = elem.nodeType; + + // Don't get/set attributes on text, comment and attribute nodes + if ( nType === 3 || nType === 8 || nType === 2 ) { + return; + } + + // Fallback to prop when attributes are not supported + if ( typeof elem.getAttribute === "undefined" ) { + return jQuery.prop( elem, name, value ); + } + + // Attribute hooks are determined by the lowercase version + // Grab necessary hook if one is defined + if ( nType !== 1 || !jQuery.isXMLDoc( elem ) ) { + hooks = jQuery.attrHooks[ name.toLowerCase() ] || + ( jQuery.expr.match.bool.test( name ) ? boolHook : undefined ); + } + + if ( value !== undefined ) { + if ( value === null ) { + jQuery.removeAttr( elem, name ); + return; + } + + if ( hooks && "set" in hooks && + ( ret = hooks.set( elem, value, name ) ) !== undefined ) { + return ret; + } + + elem.setAttribute( name, value + "" ); + return value; + } + + if ( hooks && "get" in hooks && ( ret = hooks.get( elem, name ) ) !== null ) { + return ret; + } + + ret = jQuery.find.attr( elem, name ); + + // Non-existent attributes return null, we normalize to undefined + return ret == null ? 
undefined : ret; + }, + + attrHooks: { + type: { + set: function( elem, value ) { + if ( !support.radioValue && value === "radio" && + nodeName( elem, "input" ) ) { + var val = elem.value; + elem.setAttribute( "type", value ); + if ( val ) { + elem.value = val; + } + return value; + } + } + } + }, + + removeAttr: function( elem, value ) { + var name, + i = 0, + + // Attribute names can contain non-HTML whitespace characters + // https://html.spec.whatwg.org/multipage/syntax.html#attributes-2 + attrNames = value && value.match( rnothtmlwhite ); + + if ( attrNames && elem.nodeType === 1 ) { + while ( ( name = attrNames[ i++ ] ) ) { + elem.removeAttribute( name ); + } + } + } +} ); + +// Hooks for boolean attributes +boolHook = { + set: function( elem, value, name ) { + if ( value === false ) { + + // Remove boolean attributes when set to false + jQuery.removeAttr( elem, name ); + } else { + elem.setAttribute( name, name ); + } + return name; + } +}; + +jQuery.each( jQuery.expr.match.bool.source.match( /\w+/g ), function( _i, name ) { + var getter = attrHandle[ name ] || jQuery.find.attr; + + attrHandle[ name ] = function( elem, name, isXML ) { + var ret, handle, + lowercaseName = name.toLowerCase(); + + if ( !isXML ) { + + // Avoid an infinite loop by temporarily removing this function from the getter + handle = attrHandle[ lowercaseName ]; + attrHandle[ lowercaseName ] = ret; + ret = getter( elem, name, isXML ) != null ? + lowercaseName : + null; + attrHandle[ lowercaseName ] = handle; + } + return ret; + }; +} ); + + + + +var rfocusable = /^(?:input|select|textarea|button)$/i, + rclickable = /^(?:a|area)$/i; + +jQuery.fn.extend( { + prop: function( name, value ) { + return access( this, jQuery.prop, name, value, arguments.length > 1 ); + }, + + removeProp: function( name ) { + return this.each( function() { + delete this[ jQuery.propFix[ name ] || name ]; + } ); + } +} ); + +jQuery.extend( { + prop: function( elem, name, value ) { + var ret, hooks, + nType = elem.nodeType; + + // Don't get/set properties on text, comment and attribute nodes + if ( nType === 3 || nType === 8 || nType === 2 ) { + return; + } + + if ( nType !== 1 || !jQuery.isXMLDoc( elem ) ) { + + // Fix name and attach hooks + name = jQuery.propFix[ name ] || name; + hooks = jQuery.propHooks[ name ]; + } + + if ( value !== undefined ) { + if ( hooks && "set" in hooks && + ( ret = hooks.set( elem, value, name ) ) !== undefined ) { + return ret; + } + + return ( elem[ name ] = value ); + } + + if ( hooks && "get" in hooks && ( ret = hooks.get( elem, name ) ) !== null ) { + return ret; + } + + return elem[ name ]; + }, + + propHooks: { + tabIndex: { + get: function( elem ) { + + // Support: IE <=9 - 11 only + // elem.tabIndex doesn't always return the + // correct value when it hasn't been explicitly set + // https://web.archive.org/web/20141116233347/http://fluidproject.org/blog/2008/01/09/getting-setting-and-removing-tabindex-values-with-javascript/ + // Use proper attribute retrieval(#12072) + var tabindex = jQuery.find.attr( elem, "tabindex" ); + + if ( tabindex ) { + return parseInt( tabindex, 10 ); + } + + if ( + rfocusable.test( elem.nodeName ) || + rclickable.test( elem.nodeName ) && + elem.href + ) { + return 0; + } + + return -1; + } + } + }, + + propFix: { + "for": "htmlFor", + "class": "className" + } +} ); + +// Support: IE <=11 only +// Accessing the selectedIndex property +// forces the browser to respect setting selected +// on the option +// The getter ensures a default option is selected +// when in an 
optgroup +// eslint rule "no-unused-expressions" is disabled for this code +// since it considers such accessions noop +if ( !support.optSelected ) { + jQuery.propHooks.selected = { + get: function( elem ) { + + /* eslint no-unused-expressions: "off" */ + + var parent = elem.parentNode; + if ( parent && parent.parentNode ) { + parent.parentNode.selectedIndex; + } + return null; + }, + set: function( elem ) { + + /* eslint no-unused-expressions: "off" */ + + var parent = elem.parentNode; + if ( parent ) { + parent.selectedIndex; + + if ( parent.parentNode ) { + parent.parentNode.selectedIndex; + } + } + } + }; +} + +jQuery.each( [ + "tabIndex", + "readOnly", + "maxLength", + "cellSpacing", + "cellPadding", + "rowSpan", + "colSpan", + "useMap", + "frameBorder", + "contentEditable" +], function() { + jQuery.propFix[ this.toLowerCase() ] = this; +} ); + + + + + // Strip and collapse whitespace according to HTML spec + // https://infra.spec.whatwg.org/#strip-and-collapse-ascii-whitespace + function stripAndCollapse( value ) { + var tokens = value.match( rnothtmlwhite ) || []; + return tokens.join( " " ); + } + + +function getClass( elem ) { + return elem.getAttribute && elem.getAttribute( "class" ) || ""; +} + +function classesToArray( value ) { + if ( Array.isArray( value ) ) { + return value; + } + if ( typeof value === "string" ) { + return value.match( rnothtmlwhite ) || []; + } + return []; +} + +jQuery.fn.extend( { + addClass: function( value ) { + var classes, elem, cur, curValue, clazz, j, finalValue, + i = 0; + + if ( isFunction( value ) ) { + return this.each( function( j ) { + jQuery( this ).addClass( value.call( this, j, getClass( this ) ) ); + } ); + } + + classes = classesToArray( value ); + + if ( classes.length ) { + while ( ( elem = this[ i++ ] ) ) { + curValue = getClass( elem ); + cur = elem.nodeType === 1 && ( " " + stripAndCollapse( curValue ) + " " ); + + if ( cur ) { + j = 0; + while ( ( clazz = classes[ j++ ] ) ) { + if ( cur.indexOf( " " + clazz + " " ) < 0 ) { + cur += clazz + " "; + } + } + + // Only assign if different to avoid unneeded rendering. + finalValue = stripAndCollapse( cur ); + if ( curValue !== finalValue ) { + elem.setAttribute( "class", finalValue ); + } + } + } + } + + return this; + }, + + removeClass: function( value ) { + var classes, elem, cur, curValue, clazz, j, finalValue, + i = 0; + + if ( isFunction( value ) ) { + return this.each( function( j ) { + jQuery( this ).removeClass( value.call( this, j, getClass( this ) ) ); + } ); + } + + if ( !arguments.length ) { + return this.attr( "class", "" ); + } + + classes = classesToArray( value ); + + if ( classes.length ) { + while ( ( elem = this[ i++ ] ) ) { + curValue = getClass( elem ); + + // This expression is here for better compressibility (see addClass) + cur = elem.nodeType === 1 && ( " " + stripAndCollapse( curValue ) + " " ); + + if ( cur ) { + j = 0; + while ( ( clazz = classes[ j++ ] ) ) { + + // Remove *all* instances + while ( cur.indexOf( " " + clazz + " " ) > -1 ) { + cur = cur.replace( " " + clazz + " ", " " ); + } + } + + // Only assign if different to avoid unneeded rendering. + finalValue = stripAndCollapse( cur ); + if ( curValue !== finalValue ) { + elem.setAttribute( "class", finalValue ); + } + } + } + } + + return this; + }, + + toggleClass: function( value, stateVal ) { + var type = typeof value, + isValidValue = type === "string" || Array.isArray( value ); + + if ( typeof stateVal === "boolean" && isValidValue ) { + return stateVal ? 
this.addClass( value ) : this.removeClass( value ); + } + + if ( isFunction( value ) ) { + return this.each( function( i ) { + jQuery( this ).toggleClass( + value.call( this, i, getClass( this ), stateVal ), + stateVal + ); + } ); + } + + return this.each( function() { + var className, i, self, classNames; + + if ( isValidValue ) { + + // Toggle individual class names + i = 0; + self = jQuery( this ); + classNames = classesToArray( value ); + + while ( ( className = classNames[ i++ ] ) ) { + + // Check each className given, space separated list + if ( self.hasClass( className ) ) { + self.removeClass( className ); + } else { + self.addClass( className ); + } + } + + // Toggle whole class name + } else if ( value === undefined || type === "boolean" ) { + className = getClass( this ); + if ( className ) { + + // Store className if set + dataPriv.set( this, "__className__", className ); + } + + // If the element has a class name or if we're passed `false`, + // then remove the whole classname (if there was one, the above saved it). + // Otherwise bring back whatever was previously saved (if anything), + // falling back to the empty string if nothing was stored. + if ( this.setAttribute ) { + this.setAttribute( "class", + className || value === false ? + "" : + dataPriv.get( this, "__className__" ) || "" + ); + } + } + } ); + }, + + hasClass: function( selector ) { + var className, elem, + i = 0; + + className = " " + selector + " "; + while ( ( elem = this[ i++ ] ) ) { + if ( elem.nodeType === 1 && + ( " " + stripAndCollapse( getClass( elem ) ) + " " ).indexOf( className ) > -1 ) { + return true; + } + } + + return false; + } +} ); + + + + +var rreturn = /\r/g; + +jQuery.fn.extend( { + val: function( value ) { + var hooks, ret, valueIsFunction, + elem = this[ 0 ]; + + if ( !arguments.length ) { + if ( elem ) { + hooks = jQuery.valHooks[ elem.type ] || + jQuery.valHooks[ elem.nodeName.toLowerCase() ]; + + if ( hooks && + "get" in hooks && + ( ret = hooks.get( elem, "value" ) ) !== undefined + ) { + return ret; + } + + ret = elem.value; + + // Handle most common string cases + if ( typeof ret === "string" ) { + return ret.replace( rreturn, "" ); + } + + // Handle cases where value is null/undef or number + return ret == null ? "" : ret; + } + + return; + } + + valueIsFunction = isFunction( value ); + + return this.each( function( i ) { + var val; + + if ( this.nodeType !== 1 ) { + return; + } + + if ( valueIsFunction ) { + val = value.call( this, i, jQuery( this ).val() ); + } else { + val = value; + } + + // Treat null/undefined as ""; convert numbers to string + if ( val == null ) { + val = ""; + + } else if ( typeof val === "number" ) { + val += ""; + + } else if ( Array.isArray( val ) ) { + val = jQuery.map( val, function( value ) { + return value == null ? "" : value + ""; + } ); + } + + hooks = jQuery.valHooks[ this.type ] || jQuery.valHooks[ this.nodeName.toLowerCase() ]; + + // If set returns undefined, fall back to normal setting + if ( !hooks || !( "set" in hooks ) || hooks.set( this, val, "value" ) === undefined ) { + this.value = val; + } + } ); + } +} ); + +jQuery.extend( { + valHooks: { + option: { + get: function( elem ) { + + var val = jQuery.find.attr( elem, "value" ); + return val != null ? 
+ val : + + // Support: IE <=10 - 11 only + // option.text throws exceptions (#14686, #14858) + // Strip and collapse whitespace + // https://html.spec.whatwg.org/#strip-and-collapse-whitespace + stripAndCollapse( jQuery.text( elem ) ); + } + }, + select: { + get: function( elem ) { + var value, option, i, + options = elem.options, + index = elem.selectedIndex, + one = elem.type === "select-one", + values = one ? null : [], + max = one ? index + 1 : options.length; + + if ( index < 0 ) { + i = max; + + } else { + i = one ? index : 0; + } + + // Loop through all the selected options + for ( ; i < max; i++ ) { + option = options[ i ]; + + // Support: IE <=9 only + // IE8-9 doesn't update selected after form reset (#2551) + if ( ( option.selected || i === index ) && + + // Don't return options that are disabled or in a disabled optgroup + !option.disabled && + ( !option.parentNode.disabled || + !nodeName( option.parentNode, "optgroup" ) ) ) { + + // Get the specific value for the option + value = jQuery( option ).val(); + + // We don't need an array for one selects + if ( one ) { + return value; + } + + // Multi-Selects return an array + values.push( value ); + } + } + + return values; + }, + + set: function( elem, value ) { + var optionSet, option, + options = elem.options, + values = jQuery.makeArray( value ), + i = options.length; + + while ( i-- ) { + option = options[ i ]; + + /* eslint-disable no-cond-assign */ + + if ( option.selected = + jQuery.inArray( jQuery.valHooks.option.get( option ), values ) > -1 + ) { + optionSet = true; + } + + /* eslint-enable no-cond-assign */ + } + + // Force browsers to behave consistently when non-matching value is set + if ( !optionSet ) { + elem.selectedIndex = -1; + } + return values; + } + } + } +} ); + +// Radios and checkboxes getter/setter +jQuery.each( [ "radio", "checkbox" ], function() { + jQuery.valHooks[ this ] = { + set: function( elem, value ) { + if ( Array.isArray( value ) ) { + return ( elem.checked = jQuery.inArray( jQuery( elem ).val(), value ) > -1 ); + } + } + }; + if ( !support.checkOn ) { + jQuery.valHooks[ this ].get = function( elem ) { + return elem.getAttribute( "value" ) === null ? "on" : elem.value; + }; + } +} ); + + + + +// Return jQuery for attributes-only inclusion + + +support.focusin = "onfocusin" in window; + + +var rfocusMorph = /^(?:focusinfocus|focusoutblur)$/, + stopPropagationCallback = function( e ) { + e.stopPropagation(); + }; + +jQuery.extend( jQuery.event, { + + trigger: function( event, data, elem, onlyHandlers ) { + + var i, cur, tmp, bubbleType, ontype, handle, special, lastElement, + eventPath = [ elem || document ], + type = hasOwn.call( event, "type" ) ? event.type : event, + namespaces = hasOwn.call( event, "namespace" ) ? event.namespace.split( "." ) : []; + + cur = lastElement = tmp = elem = elem || document; + + // Don't do events on text and comment nodes + if ( elem.nodeType === 3 || elem.nodeType === 8 ) { + return; + } + + // focus/blur morphs to focusin/out; ensure we're not firing them right now + if ( rfocusMorph.test( type + jQuery.event.triggered ) ) { + return; + } + + if ( type.indexOf( "." ) > -1 ) { + + // Namespaced trigger; create a regexp to match event type in handle() + namespaces = type.split( "." ); + type = namespaces.shift(); + namespaces.sort(); + } + ontype = type.indexOf( ":" ) < 0 && "on" + type; + + // Caller can pass in a jQuery.Event object, Object, or just an event type string + event = event[ jQuery.expando ] ? 

Lasso Classes and Functions


Base Classes


CubeTransit

Class for storing information about transit defined in Cube line files.

StandardTransit

Holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files (a usage sketch follows this table).

ModelRoadwayNetwork

Subclass of the network_wrangler RoadwayNetwork class.

Project

A single or set of changes to the roadway or transit system.

Parameters

A class representing all the parameters defining the networks including time of day, categories, etc.

+
+
+
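As a concrete illustration of how the transit classes fit together, the sketch below converts a GTFS feed into Cube transit line files with StandardTransit. It is a minimal sketch, not a definitive recipe: the paths are placeholders, the class is assumed to be importable from the package top level, and the output location is passed positionally because its exact keyword name should be checked against the StandardTransit reference.

```python
from lasso import StandardTransit

# Placeholder locations -- point these at a real GTFS feed and an output folder.
GTFS_DIR = "examples/gtfs"
CUBE_LINE_DIR = "examples/cube_transit"

# Read the GTFS feed into a StandardTransit (Partridge-backed) object,
# then translate and write it out as Cube transit line (.lin) files.
standard_transit = StandardTransit.read_gtfs(GTFS_DIR)
standard_transit.write_as_cube_lines(CUBE_LINE_DIR)
```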

Utils and Functions


util

logger

+ + + + \ No newline at end of file diff --git a/branch/bicounty/genindex/index.html b/branch/bicounty/genindex/index.html new file mode 100644 index 0000000..30c2efe --- /dev/null +++ b/branch/bicounty/genindex/index.html @@ -0,0 +1,1010 @@ + + + + + + Index — lasso documentation + + + + + + + + + + + + + + + + +

Index

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+ + + + \ No newline at end of file diff --git a/branch/bicounty/index.html b/branch/bicounty/index.html new file mode 100644 index 0000000..602fa6e --- /dev/null +++ b/branch/bicounty/index.html @@ -0,0 +1,177 @@ + + + + + + + Welcome to lasso’s documentation! — lasso documentation + + + + + + + + + + + + + + + + + +

Welcome to lasso’s documentation!


This package of utilities is a wrapper around the [network_wrangler](http://github.com/wsp-sag/network_wrangler) package for MetCouncil. It aims to have the following functionality:

1. parse Cube log files and base highway networks and create ProjectCards for Network Wrangler;

2. parse two Cube transit line files and create ProjectCards for Network Wrangler; and

3. refine Network Wrangler highway networks to contain specific variables and settings for Metropolitan Council and export them to a format that can be read in by Citilabs' Cube software.

A minimal sketch of the first workflow follows.
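The sketch below builds a Network Wrangler project card from a Cube log file and a base highway network. It is a hedged example: the paths are placeholders, and the keyword argument names (roadway_log_file, base_roadway_dir) should be verified against the Project class reference.

```python
import os
from lasso import Project

# Placeholder inputs -- substitute a real Cube log file and base network folder.
LOGFILE = os.path.join("examples", "cube", "highway_change.log")
BASE_ROADWAY_DIR = os.path.join("examples", "base_roadway")
SCRATCH_DIR = "scratch"

# Read the Cube log file against the base highway network and turn the
# recorded edits into a Network Wrangler project card (a YAML file).
project = Project.create_project(
    roadway_log_file=LOGFILE,
    base_roadway_dir=BASE_ROADWAY_DIR,
)
project.write_project_card(os.path.join(SCRATCH_DIR, "highway_change.yml"))
```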

Indices and tables

+ + + + \ No newline at end of file diff --git a/branch/bicounty/objects.inv b/branch/bicounty/objects.inv new file mode 100644 index 0000000..efee943 Binary files /dev/null and b/branch/bicounty/objects.inv differ diff --git a/branch/bicounty/py-modindex/index.html b/branch/bicounty/py-modindex/index.html new file mode 100644 index 0000000..b1cdf4a --- /dev/null +++ b/branch/bicounty/py-modindex/index.html @@ -0,0 +1,130 @@ + + + + + + Python Module Index — lasso documentation + + + + + + + + + + + + + + + + + + + +

Python Module Index

+ lasso +
    + lasso.logger +
    + lasso.util +
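lasso.logger and lasso.util are support modules rather than classes. The snippet below shows one way logging might be set up before running the larger workflows; the setupLogging argument names are assumptions inferred from the logger documentation and should be verified there.

```python
from lasso.logger import setupLogging

# Terse progress messages go to the info log, very verbose output to the
# debug log, and everything is echoed to the console while working
# interactively. (Argument names are assumptions -- check lasso.logger.)
setupLogging(
    infoLogFilename="lasso_info.log",
    debugLogFilename="lasso_debug.log",
    logToConsole=True,
)
```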
+ + + + \ No newline at end of file diff --git a/branch/bicounty/running/index.html b/branch/bicounty/running/index.html new file mode 100644 index 0000000..2532cdc --- /dev/null +++ b/branch/bicounty/running/index.html @@ -0,0 +1,127 @@ + + + + + + + Running Lasso — lasso documentation + + + + + + + + + + + + + + + + + + +

Running Lasso


Create project files


Create a scenario


Exporting networks
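A rough sketch of this step, assuming a ModelRoadwayNetwork built from standard network files and assuming that read, write_roadway_as_shp, and write_roadway_as_fixedwidth take roughly the arguments shown; the exact signatures should be verified against the ModelRoadwayNetwork reference.

```python
from lasso import ModelRoadwayNetwork

# Placeholder standard-network inputs -- substitute real files.
model_net = ModelRoadwayNetwork.read(
    link_filename="examples/link.json",
    node_filename="examples/node.geojson",
    shape_filename="examples/shape.geojson",
)

# Shapefile / CSV export, useful for GIS review of the model network.
model_net.write_roadway_as_shp()

# Fixed-width node and link files plus a Cube build script.
model_net.write_roadway_as_fixedwidth(output_dir="examples/output")
```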


Auditing and Reporting

+ + + + \ No newline at end of file diff --git a/branch/bicounty/search/index.html b/branch/bicounty/search/index.html new file mode 100644 index 0000000..3c6d6f2 --- /dev/null +++ b/branch/bicounty/search/index.html @@ -0,0 +1,120 @@ + + + + + + Search — lasso documentation + + + + + + + + + + + + + + + + + + + +
"cross": 6, "disjoint": 6, "unitless": 6, "empti": 6, "val": 6, "94210181747008": 6, "topolog": 6, "toler": 6, "comparison": 6, "geometrytyp": 6, "hausdorff_dist": 6, "hausdorff": 6, "interpol": 6, "normal": 6, "along": 6, "linear": 6, "taken": 6, "measur": 6, "revers": 6, "rang": 6, "index": [6, 8], "handl": 6, "clamp": 6, "interpret": 6, "fraction": 6, "intersect": 6, "form": 6, "canon": 6, "ring": 6, "multi": 6, "multilinestr": 6, "overlap": 6, "nearest": 6, "relat": 6, "de": 6, "9im": 6, "matrix": 6, "relate_pattern": 6, "pattern": 6, "relationship": 6, "representative_point": 6, "guarante": 6, "cheapli": 6, "simplifi": 6, "preserve_topologi": 6, "dougla": 6, "peucker": 6, "algorithm": 6, "unless": 6, "topologi": 6, "preserv": 6, "invalid": 6, "svg": 6, "scale_factor": 6, "fill_color": 6, "opac": 6, "element": 6, "factor": 6, "diamet": 6, "hex": 6, "color": 6, "66cc99": 6, "ff3333": 6, "symmetric_differ": 6, "symmetr": 6, "touch": 6, "union": 6, "array_interface_bas": 6, "dimens": 6, "bound": 6, "collect": 6, "null": 6, "minx": 6, "mini": 6, "maxx": 6, "maxi": 6, "geometr": 6, "convex_hul": 6, "convex": 6, "hull": 6, "less": 6, "three": [6, 11], "multipoint": 6, "triangular": 6, "imagin": 6, "elast": 6, "band": 6, "stretch": 6, "coordinatesequ": 6, "ctype": 6, "envelop": 6, "figur": 6, "geom_typ": 6, "has_z": 6, "sequenc": 6, "impl": 6, "geosimpl": 6, "geo": 6, "is_clos": 6, "close": 6, "applic": 6, "is_empti": 6, "is_r": 6, "is_simpl": 6, "simpl": 6, "mean": 6, "is_valid": 6, "definit": 6, "depend": 6, "minimum_clear": 6, "move": 6, "minimum_rotated_rectangl": 6, "rectangl": 6, "possibli": 6, "rotat": 6, "degener": 6, "wkb": 6, "wkb_hex": 6, "xy": 6, "shell": 6, "hole": 6, "It": [6, 8], "space": 6, "interior": 6, "classmethod": 6, "from_bound": 6, "xmin": 6, "ymin": 6, "xmax": 6, "ymax": 6, "stroke": 6, "partial": 6, "func": 6, "given": [6, 11], "futur": 6, "call": 6, "column_name_to_part": 6, "create_locationrefer": 6, "geodesic_point_buff": 6, "lat": 6, "lon": 6, "circular": 6, "get_shared_streets_intersection_hash": 6, "per": [6, 11], "sharedstreet": 6, "j": 6, "blob": 6, "0e6d7de0aee2e9ae3b007d1e45284b06cc241d02": 6, "src": 6, "l553": 6, "l565": 6, "93": 6, "0965985": 6, "44": 6, "952112199999995": 6, "954734870": 6, "69f13f881649cb21ee3b359730790bb9": 6, "hhmmss_to_datetim": 6, "hhmmss_str": 6, "datetim": 6, "hh": 6, "mm": 6, "ss": 6, "dt": 6, "secs_to_datetim": 6, "sec": 6, "shorten_nam": 6, "geom": 6, "xp": 6, "yp": 6, "zp": 6, "shall": 6, "ident": 6, "def": 6, "id_func": 6, "g2": 6, "g1": 6, "pyproj": 6, "accur": 6, "wgs84": 6, "utm": 6, "32618": 6, "from_cr": 6, "always_xi": 6, "support": 6, "lambda": 6, "unidecod": 6, "error": 6, "replace_str": 6, "transliter": 6, "unicod": 6, "ascii": 6, "\u5317\u4eb0": 6, "bei": 6, "jing": 6, "tri": 6, "codec": 6, "charact": 6, "fall": 6, "back": 6, "five": 6, "faster": 6, "slightli": 6, "slower": 6, "unicode_expect_nonascii": 6, "present": 6, "replac": [6, 11], "strict": 6, "rais": 6, "unidecodeerror": 6, "substitut": 6, "might": [6, 11], "packag": [8, 11], "aim": 8, "networkwrangl": [8, 11], "refin": 8, "metropolitan": 8, "council": 8, "citilab": 8, "softwar": [8, 11], "instal": 8, "bleed": 8, "edg": 8, "clone": 8, "brief": 8, "intro": 8, "workflow": 8, "quickstart": 8, "jupyt": 8, "notebook": 8, "setup": 8, "scenario": 8, "audit": 8, "report": 8, "logger": 8, "modul": 8, "page": 8, "suggest": 11, "virtualenv": 11, "conda": 11, "virtual": 11, "environ": 11, "recommend": 11, "pip": 11, "lasso": 11, "config": 11, "channel": 11, "forg": 11, 
"rtree": 11, "my_lasso_environ": 11, "activ": 11, "git": 11, "master": 11, "pypi": 11, "repositori": 11, "date": 11, "branch": 11, "work": 11, "your": 11, "machin": 11, "edit": 11, "plan": 11, "well": 11, "cd": 11, "team": 11, "contribut": 11, "bxack": 11, "pleas": 11, "fork": 11, "befor": 11, "upstream": 11, "tag": 11, "branchnam": 11, "frequent": 11, "instruct": 11, "good": 11, "atom": 11, "sublim": 11, "text": 11, "syntax": 11, "desktop": 11, "built": 11, "mashup": 11, "open": 11, "In": 11, "nest": 11, "span": 11, "implement": 11, "novel": 11, "travel": 11, "break": 11, "publictransport": 11, "document": 11, "done": 11, "gui": 11, "public": 11, "transport": 11, "infrastructur": 11, "servic": 11, "tier": 11, "made": 11, "mainli": 11, "my_link_fil": 11, "my_node_fil": 11, "my_shape_fil": 11, "my_select": 11, "35e": 11, "961117623": 11, "2564047368": 11, "my_chang": 11, "my_net": 11, "ml_net": 11, "_": 11, "disconnected_nod": 11, "my_out_prefix": 11, "my_dir": 11, "my_base_scenario": 11, "road_net": 11, "stpaul_link_fil": 11, "stpaul_node_fil": 11, "stpaul_shape_fil": 11, "transit_net": 11, "stpaul_dir": 11, "card_filenam": 11, "3_multiple_roadway_attribute_chang": 11, "multiple_chang": 11, "4_simple_managed_lan": 11, "project_card_directori": 11, "project_card": 11, "project_cards_list": 11, "my_scenario": 11, "create_scenario": 11, "base_scenario": 11, "check_scenario_requisit": 11, "apply_all_project": 11, "scenario_summari": 11, "base_transit_sourc": 11, "build_transit_sourc": 11, "understand": 11, "how": 11, "overrid": 11, "those": 11, "instanti": 11, "yaml": 11, "configur": 11, "pair": 11, "config_fil": 11, "f": 11, "my_config": 11, "safe_load": 11, "model_road_net": 11, "my_paramet": 11, "accomplish": 11, "goal": 11, "top": 11, "learn": 11, "basic": 11, "creation": 11, "ipynb": 11}, "objects": {"": [[7, 0, 0, "-", "lasso"]], "lasso": [[0, 1, 1, "", "CubeTransit"], [1, 1, 1, "", "ModelRoadwayNetwork"], [2, 1, 1, "", "Parameters"], [3, 1, 1, "", "Project"], [4, 1, 1, "", "StandardTransit"], [5, 0, 0, "-", "logger"], [6, 0, 0, "-", "util"]], "lasso.CubeTransit": [[0, 2, 1, "", "__init__"], [0, 2, 1, "", "add_additional_time_periods"], [0, 2, 1, "", "add_cube"], [0, 2, 1, "", "build_route_name"], [0, 2, 1, "", "calculate_start_end_times"], [0, 2, 1, "", "create_add_route_card_dict"], [0, 2, 1, "", "create_delete_route_card_dict"], [0, 2, 1, "", "create_from_cube"], [0, 2, 1, "", "create_update_route_card_dict"], [0, 2, 1, "", "cube_properties_to_standard_properties"], [0, 3, 1, "", "diff_dict"], [0, 2, 1, "", "evaluate_differences"], [0, 2, 1, "", "evaluate_route_property_differences"], [0, 2, 1, "", "evaluate_route_shape_changes"], [0, 2, 1, "", "get_time_period_numbers_from_cube_properties"], [0, 3, 1, "", "line_properties"], [0, 3, 1, "", "lines"], [0, 3, 1, "", "parameters"], [0, 3, 1, "", "program_type"], [0, 3, 1, "", "shapes"], [0, 3, 1, "", "source_list"], [0, 2, 1, "", "unpack_route_name"]], "lasso.ModelRoadwayNetwork": [[1, 3, 1, "", "CALCULATED_VALUES"], [1, 2, 1, "", "__init__"], [1, 2, 1, "", "add_counts"], [1, 2, 1, "", "add_incident_link_data_to_nodes"], [1, 2, 1, "", "add_new_roadway_feature_change"], [1, 2, 1, "", "add_variable_using_shst_reference"], [1, 2, 1, "", "addition_map"], [1, 2, 1, "", "apply"], [1, 2, 1, "", "apply_managed_lane_feature_change"], [1, 2, 1, "", "apply_python_calculation"], [1, 2, 1, "", "apply_roadway_feature_change"], [1, 2, 1, "", "assess_connectivity"], [1, 2, 1, "", "build_selection_key"], [1, 2, 1, "", "calculate_area_type"], [1, 2, 1, "", 
"calculate_centroidconnect"], [1, 2, 1, "", "calculate_county"], [1, 2, 1, "", "calculate_distance"], [1, 2, 1, "", "calculate_mpo"], [1, 2, 1, "", "calculate_use"], [1, 2, 1, "", "convert_int"], [1, 2, 1, "", "create_ML_variable"], [1, 2, 1, "", "create_calculated_variables"], [1, 2, 1, "", "create_dummy_connector_links"], [1, 2, 1, "", "create_hov_corridor_variable"], [1, 2, 1, "", "create_managed_lane_network"], [1, 2, 1, "", "create_managed_variable"], [1, 2, 1, "", "dataframe_to_fixed_width"], [1, 2, 1, "", "delete_roadway_feature_change"], [1, 2, 1, "", "deletion_map"], [1, 2, 1, "", "fill_na"], [1, 2, 1, "", "from_RoadwayNetwork"], [1, 2, 1, "", "get_attribute"], [1, 2, 1, "", "get_managed_lane_node_ids"], [1, 2, 1, "", "get_modal_graph"], [1, 2, 1, "", "get_modal_links_nodes"], [1, 2, 1, "", "get_property_by_time_period_and_group"], [1, 2, 1, "", "identify_segment"], [1, 2, 1, "", "identify_segment_endpoints"], [1, 2, 1, "", "is_network_connected"], [1, 2, 1, "", "load_transform_network"], [1, 2, 1, "", "network_connection_plot"], [1, 2, 1, "", "orig_dest_nodes_foreign_key"], [1, 2, 1, "", "ox_graph"], [1, 2, 1, "", "path_search"], [1, 2, 1, "", "read"], [1, 2, 1, "", "read_match_result"], [1, 2, 1, "", "rename_variables_for_dbf"], [1, 2, 1, "", "roadway_net_to_gdf"], [1, 2, 1, "", "roadway_standard_to_met_council_network"], [1, 2, 1, "", "select_roadway_features"], [1, 2, 1, "", "selection_has_unique_link_id"], [1, 2, 1, "", "selection_map"], [1, 2, 1, "", "shortest_path"], [1, 2, 1, "", "split_properties_by_time_period_and_category"], [1, 2, 1, "", "update_distance"], [1, 2, 1, "", "validate_link_schema"], [1, 2, 1, "", "validate_node_schema"], [1, 2, 1, "", "validate_properties"], [1, 2, 1, "", "validate_selection"], [1, 2, 1, "", "validate_shape_schema"], [1, 2, 1, "", "validate_uniqueness"], [1, 2, 1, "", "write"], [1, 2, 1, "", "write_roadway_as_fixedwidth"], [1, 2, 1, "", "write_roadway_as_shp"]], "lasso.Parameters": [[2, 2, 1, "", "__init__"], [2, 3, 1, "", "county_link_range_dict"], [2, 3, 1, "", "cube_time_periods"], [2, 3, 1, "", "properties_to_split"], [2, 3, 1, "", "zones"]], "lasso.Project": [[3, 3, 1, "", "CALCULATED_VALUES"], [3, 3, 1, "id0", "DEFAULT_PROJECT_NAME"], [3, 3, 1, "id1", "STATIC_VALUES"], [3, 2, 1, "", "__init__"], [3, 2, 1, "", "add_highway_changes"], [3, 2, 1, "", "add_transit_changes"], [3, 3, 1, "", "base_cube_transit_network"], [3, 3, 1, "", "base_roadway_network"], [3, 3, 1, "", "build_cube_transit_network"], [3, 3, 1, "", "card_data"], [3, 2, 1, "", "create_project"], [3, 2, 1, "", "determine_roadway_network_changes_compatibility"], [3, 2, 1, "", "emme_id_to_wrangler_id"], [3, 2, 1, "", "emme_name_to_wrangler_name"], [3, 2, 1, "", "evaluate_changes"], [3, 2, 1, "", "get_object_from_network_build_command"], [3, 2, 1, "", "get_operation_from_network_build_command"], [3, 3, 1, "", "parameters"], [3, 3, 1, "", "project_name"], [3, 2, 1, "", "read_logfile"], [3, 2, 1, "", "read_network_build_file"], [3, 3, 1, "", "roadway_link_changes"], [3, 3, 1, "", "roadway_node_changes"], [3, 3, 1, "", "transit_changes"], [3, 2, 1, "", "write_project_card"]], "lasso.StandardTransit": [[4, 2, 1, "", "__init__"], [4, 2, 1, "", "calculate_cube_mode"], [4, 2, 1, "", "cube_format"], [4, 2, 1, "", "evaluate_differences"], [4, 3, 1, "", "feed"], [4, 2, 1, "", "fromTransitNetwork"], [4, 3, 1, "", "parameters"], [4, 2, 1, "", "read_gtfs"], [4, 2, 1, "", "route_properties_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_emme"], [4, 2, 1, "", 
"time_to_cube_time_period"], [4, 2, 1, "", "write_as_cube_lin"]], "lasso.logger": [[5, 4, 1, "", "setupLogging"]], "lasso.util": [[6, 1, 1, "", "Point"], [6, 1, 1, "", "Polygon"], [6, 4, 1, "", "column_name_to_parts"], [6, 4, 1, "", "create_locationreference"], [6, 4, 1, "", "geodesic_point_buffer"], [6, 4, 1, "", "get_shared_streets_intersection_hash"], [6, 4, 1, "", "hhmmss_to_datetime"], [6, 1, 1, "", "partial"], [6, 4, 1, "", "secs_to_datetime"], [6, 4, 1, "", "shorten_name"], [6, 4, 1, "", "transform"], [6, 4, 1, "", "unidecode"]], "lasso.util.Point": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 2, 1, "", "array_interface"], [6, 5, 1, "", "array_interface_base"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 5, 1, "", "ctypes"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "empty"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 3, 1, "", "impl"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "x"], [6, 5, 1, "", "xy"], [6, 5, 1, "", "y"], [6, 5, 1, "", "z"]], "lasso.util.Polygon": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "array_interface_base"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 5, 1, "", "ctypes"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "empty"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "id0", "exterior"], [6, 2, 1, "", "from_bounds"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 3, 1, "", "impl"], [6, 5, 1, "id1", "interiors"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], 
[6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "xy"]], "lasso.util.partial": [[6, 3, 1, "", "args"], [6, 3, 1, "", "func"], [6, 3, 1, "", "keywords"]]}, "objtypes": {"0": "py:module", "1": "py:class", "2": "py:method", "3": "py:attribute", "4": "py:function", "5": "py:property"}, "objnames": {"0": ["py", "module", "Python module"], "1": ["py", "class", "Python class"], "2": ["py", "method", "Python method"], "3": ["py", "attribute", "Python attribute"], "4": ["py", "function", "Python function"], "5": ["py", "property", "Python property"]}, "titleterms": {"lasso": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], "cubetransit": [0, 11], "modelroadwaynetwork": [1, 11], "todo": 1, "paramet": [2, 10, 11], "project": [3, 9, 10, 11], "standardtransit": [4, 11], "logger": 5, "util": [6, 7], "class": 7, "function": 7, "base": 7, "welcom": 8, "": 8, "document": 8, "content": 8, "indic": 8, "tabl": 8, "run": [9, 11], "creat": 9, "file": [9, 10, 11], "scenario": [9, 11], "export": 9, "network": [9, 11], "audit": 9, "report": 9, "setup": 10, "set": 10, "addit": 10, "data": 10, "start": 11, "out": 11, "instal": 11, "bleed": 11, "edg": 11, "from": 11, "clone": 11, "brief": 11, "intro": 11, "compon": 11, "roadwaynetwork": 11, "transitnetwork": 11, "projectcard": 11, "typic": 11, "workflow": 11, "card": 11, "transit": 11, "lin": 11, "cube": 11, "log": 11, "model": 11, "quickstart": 11, "jupyt": 11, "notebook": 11}, "envversion": {"sphinx.domains.c": 2, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 6, "sphinx.domains.index": 1, "sphinx.domains.javascript": 2, "sphinx.domains.math": 2, "sphinx.domains.python": 3, "sphinx.domains.rst": 2, "sphinx.domains.std": 2, "sphinx.ext.intersphinx": 1, "sphinx.ext.todo": 2, "sphinx.ext.viewcode": 1, "sphinx": 56}}) \ No newline at end of file diff --git a/branch/bicounty/setup/index.html b/branch/bicounty/setup/index.html new file mode 100644 index 0000000..bf0bb2b --- /dev/null +++ b/branch/bicounty/setup/index.html @@ -0,0 +1,127 @@ + + + + + + + Setup — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Setup

+
+

Projects

+
+
+

Parameters

+
+
+

Settings

+
+
+

Additional Data Files

+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty/starting/index.html b/branch/bicounty/starting/index.html new file mode 100644 index 0000000..ad98d16 --- /dev/null +++ b/branch/bicounty/starting/index.html @@ -0,0 +1,428 @@ + + + + + + + Starting Out — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Starting Out

+
+

Installation

+

If you are managing multiple python versions, we suggest using virtualenv or conda virtual environments.

+

Example using a conda environment (recommended) and using the package manager pip to install Lasso from the source on GitHub.

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/Lasso@master
+
+
+

Lasso will install network_wrangler from the PyPI repository because it is included in Lasso’s requirements.txt.

+
+

Bleeding Edge

+

If you want to install a more up-to-date or development version of network wrangler and lasso, you can do so by installing them from the develop branches of their GitHub repositories:

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/network_wrangler@develop
+pip install git+https://github.com/wsp-sag/Lasso@develop
+
+
+
+
+

From Clone

+

If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in editable mode.

+

If you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+git clone https://github.com/wsp-sag/Lasso
+git clone https://github.com/wsp-sag/network_wrangler
+cd network_wrangler
+pip install -e .
+cd ..
+cd Lasso
+pip install -e .
+
+
+

Notes:

+
    +
  1. The -e installs it in editable mode.

  2. +
  3. If you are not part of the project team and want to contribute code back to the project, please fork before you clone and then add the original repository to your upstream origin list per these directions on GitHub.

  4. +
  5. If you want to install from a specific tag/version number or branch, replace @master with @<branchname> or @<tag>.

  6. +
  7. If you want to make use of frequent developer updates for network wrangler as well, you can also install it from a clone by following the instructions above for cloning and installing Lasso, applied to Network Wrangler.

  8. +
+

If you are going to be doing Lasso development, we also recommend:

+
    +
  • a good IDE such as Atom, VS Code, Sublime Text, etc. +with Python syntax highlighting turned on.

  • +
  • GitHub Desktop to locally update your clones

  • +
+
+
+
+

Brief Intro

+

Lasso is a ‘wrapper’ around the Network Wrangler utility.

+

Both Lasso and NetworkWrangler are built around the following data schemas:

+
    +
  • [roadway network], which is based on a mashup of OpenStreetMap and Shared Streets. In Network Wrangler these are read in from three JSON files representing links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category.

  • +
  • [transit network], which is based on a frequency-based implementation of the csv-based GTFS; and

  • +
  • [project card], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml.

  • +
+

In addition, Lasso utilizes the following data schemas:

+
    +
  • [MetCouncil Model Roadway Network Schema], which adds data fields to the roadway network schema that MetCouncil uses in their travel model including breaking out data fields by time period.

  • +
  • [MetCouncil Model Transit Network Schema], which uses the Cube PublicTransport format, and

  • +
  • [Cube Log Files], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler.

  • +
  • [Cube public transport line files], which define a set of transit lines in the cube software.

  • +
+
+

Components

+

Network Wrangler has the following atomic parts:

+
    +
  • RoadwayNetwork object, which represents the roadway network data as GeoDataFrames;

  • +
  • TransitNetwork object, which represents the transit network data as DataFrames;

  • +
  • ProjectCard object, which represents the data of the project card. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run;

  • +
  • Scenario object, which consists of at least a RoadwayNetwork and a TransitNetwork. Scenarios can be based on or tiered from other scenarios. Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network.

  • +
+

In addition, Lasso has the following atomic parts:

+
    +
  • Project object, which creates project cards from one of the following: a base and a build transit network in Cube format, a base and a build highway network, or a base highway network and a Cube log file.

  • +
  • ModelRoadwayNetwork object is a subclass of RoadwayNetwork and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube.

  • +
  • StandardTransit, an object that holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil’s Cube Line files.

  • +
  • CubeTransit, an object for storing information about transit defined in Cube public transport line files. It has the capability to parse Cube line file properties and shapes into Python dictionaries, compare line files, and represent changes as Project Card dictionaries.

  • +
  • Parameters, a class representing all the parameters defining the networks, including time of day, categories, etc. Parameters can be set at runtime by initializing a Parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use the default parameters listed in this class.

  • +
+
+

RoadwayNetwork

+

Reads, writes, queries and manipulates roadway network data, which is mainly stored in the GeoDataFrames links_df, nodes_df, and shapes_df.

+
net = RoadwayNetwork.read(
+        link_filename=MY_LINK_FILE,
+        node_filename=MY_NODE_FILE,
+        shape_filename=MY_SHAPE_FILE,
+        shape_foreign_key ='shape_id',
+        
+    )
+my_selection = {
+    "link": [{"name": ["I 35E"]}],
+    "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+    "B": {"osm_node_id": "2564047368"},
+}
+net.select_roadway_features(my_selection)
+
+my_change = [
+    {
+        'property': 'lanes',
+        'existing': 1,
+        'set': 2,
+     },
+     {
+        'property': 'drive_access',
+        'set': 0,
+      },
+]
+
+my_net.apply_roadway_feature_change(
+    my_net.select_roadway_features(my_selection),
+    my_change
+)
+
+ml_net = net.create_managed_lane_network(in_place=False)
+
+ml_net.is_network_connected(mode="drive")
+
+_, disconnected_nodes = ml_net.assess_connectivity(
+  mode="walk",
+  ignore_end_nodes=True
+)
+ml_net.write(filename=my_out_prefix, path=my_dir)
+
+
+
+
+

TransitNetwork

+
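Represents the transit network data as DataFrames (see Components above). A minimal sketch of reading one in, as it is used in the Scenario example below (STPAUL_DIR is a placeholder for a directory of standard transit files):

from network_wrangler import TransitNetwork

# read the standard (GTFS-like) transit network from a directory of files
transit_net = TransitNetwork.read(STPAUL_DIR)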
+
+

ProjectCard

+
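Represents the data of a single project card. A minimal sketch of reading one from a YAML file, mirroring its use in the Scenario example below (the path is a placeholder):

import os
from network_wrangler import ProjectCard

# read a project card from a yaml file; validate=False skips schema validation
card = ProjectCard.read(
    os.path.join(STPAUL_DIR, "project_cards", "4_simple_managed_lane.yml"),
    validate=False,
)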
+
+

Scenario

+

Manages sets of project cards and tiering from a base scenario/set of networks.

+

+my_base_scenario = {
+    "road_net": RoadwayNetwork.read(
+        link_filename=STPAUL_LINK_FILE,
+        node_filename=STPAUL_NODE_FILE,
+        shape_filename=STPAUL_SHAPE_FILE,
+        fast=True,
+        shape_foreign_key ='shape_id',
+    ),
+    "transit_net": TransitNetwork.read(STPAUL_DIR),
+}
+
+card_filenames = [
+    "3_multiple_roadway_attribute_change.yml",
+    "multiple_changes.yml",
+    "4_simple_managed_lane.yml",
+]
+
+project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+
+project_cards_list = [
+    ProjectCard.read(os.path.join(project_card_directory, filename), validate=False)
+    for filename in card_filenames
+]
+
+my_scenario = Scenario.create_scenario(
+  base_scenario=my_base_scenario,
+  project_cards_list=project_cards_list,
+)
+my_scenario.check_scenario_requisites()
+
+my_scenario.apply_all_projects()
+
+my_scenario.scenario_summary()
+
+
+
+
+

Project

+

Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a Cube log file and a base network.

+

+test_project = Project.create_project(
+  base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+  build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+  )
+
+test_project.evaluate_changes()
+
+test_project.write_project_card(
+  os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+  )
+
+
+
+
+

ModelRoadwayNetwork

+

A subclass of network_wrangler’s RoadwayNetwork class which adds additional understanding about how to translate and write the network out to the MetCouncil Roadway Network schema.

+
net = ModelRoadwayNetwork.read(
+      link_filename=STPAUL_LINK_FILE,
+      node_filename=STPAUL_NODE_FILE,
+      shape_filename=STPAUL_SHAPE_FILE,
+      fast=True,
+      shape_foreign_key ='shape_id',
+  )
+
+net.write_roadway_as_fixedwidth()
+
+
+
+
+

StandardTransit

+

Translates the standard GTFS data to MetCouncil’s Cube Line files.

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+

CubeTransit

+

Used by the project class and has the capability to:

+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+

Parameters

+

Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary.

+
# read parameters from a yaml configuration  file
+# could also provide as a key/value pair
+with open(config_file) as f:
+      my_config = yaml.safe_load(f)
+
+# provide parameters at instantiation of ModelRoadwayNetwork
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+            my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+        )
+# network written with direction from the parameters given
+model_road_net.write_roadway_as_shp()
+
+
+
+
+
+

Typical Workflow

+

Workflows in Lasso and Network Wrangler typically accomplish one of two goals:

+
    +
  1. Create Project Cards to document network changes as a result of either transit or roadway projects.

  2. +
  3. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network.

  4. +
+
+

Project Cards from Transit LIN Files

+
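A sketch of this workflow, mirroring the Project example above: compare a base and a build set of Cube .LIN files and write the differences out as a project card (the directory and file names are placeholders):

import os
from lasso import Project

# compare base and build Cube transit line files
lin_project = Project.create_project(
    base_transit_source=os.path.join(CUBE_DIR, "base"),
    build_transit_source=os.path.join(CUBE_DIR, "build"),
)

# evaluate the differences and write them out as a project card
lin_project.evaluate_changes()
lin_project.write_project_card(os.path.join(SCRATCH_DIR, "transit_changes.yml"))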
+
+

Project Cards from Cube LOG Files

+
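A hedged sketch of this workflow, assuming Project.create_project accepts roadway_log_file and base_roadway_dir keywords as described in the lasso.Project API reference (check the signature before use; file names are placeholders):

import os
from lasso import Project

# translate roadway edits recorded in a Cube .log file into a project card,
# evaluated against the base roadway network the edits were made on
log_project = Project.create_project(
    roadway_log_file=os.path.join(CUBE_DIR, "roadway_edits.log"),  # placeholder log file
    base_roadway_dir=BASE_ROADWAY_DIR,                             # base link/node/shape directory
)
log_project.evaluate_changes()
log_project.write_project_card(os.path.join(SCRATCH_DIR, "roadway_changes.yml"))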
+
+

Model Network Files for a Scenario

+
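A hedged sketch of this workflow, combining pieces shown elsewhere on this page; the StandardTransit.fromTransitNetwork call follows the API reference, but the exact arguments may differ:

import os
from lasso import ModelRoadwayNetwork, StandardTransit

# apply the scenario's project cards to the base networks
my_scenario.apply_all_projects()

# translate the roadway network to the MetCouncil model schema and write it out
model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
    my_scenario.road_net, parameters=my_parameters
)
model_road_net.write_roadway_as_fixedwidth()

# translate the transit network to Cube .lin format and write it out
standard_transit = StandardTransit.fromTransitNetwork(my_scenario.transit_net)
standard_transit.write_as_cube_lin(os.path.join(WRITE_DIR, "transit.lin"))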
+
+
+
+

Running Quickstart Jupyter Notebooks

+

To learn basic lasso functionality, please refer to the following jupyter notebooks in the /notebooks directory:

+
    +
  • Lasso Project Card Creation Quickstart.ipynb

  • +
  • Lasso Scenario Creation Quickstart.ipynb

  • +
+

Jupyter notebooks can be started by activating the lasso conda environment and typing jupyter notebook:

+
conda activate <my_lasso_environment>
+jupyter notebook
+
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/.buildinfo b/branch/bicounty_2035_hwy_update/.buildinfo new file mode 100644 index 0000000..b36862c --- /dev/null +++ b/branch/bicounty_2035_hwy_update/.buildinfo @@ -0,0 +1,4 @@ +# Sphinx build info version 1 +# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. +config: a1cf922aaabc1875e26d788834f2f593 +tags: d77d1c0d9ca2f4c8421862c7c5a0d620 diff --git a/branch/bicounty_2035_hwy_update/_generated/lasso.CubeTransit/index.html b/branch/bicounty_2035_hwy_update/_generated/lasso.CubeTransit/index.html new file mode 100644 index 0000000..8f061a9 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_generated/lasso.CubeTransit/index.html @@ -0,0 +1,571 @@ + + + + + + + lasso.CubeTransit — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.CubeTransit

+
+
+class lasso.CubeTransit(parameters={})[source]
+

Bases: object

+

Class for storing information about transit defined in Cube line +files.

+

Has the capability to:

+
+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
+

Typical usage example:

+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+lines
+

list of strings representing unique line names in +the cube network.

+
+
Type:
+

list

+
+
+
+ +
+
+line_properties
+

dictionary of line properties keyed by line name. Property +values are stored in a dictionary by property name. These +properties are directly read from the cube line files and haven’t +been translated to standard transit values.

+
+
Type:
+

dict

+
+
+
+ +
+
+shapes
+

dictionary of shapes +keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns:

+
+
    +
  • ‘node_id’ (int): positive integer of node id

  • +
  • ‘node’ (int): node number, with negative indicating a non-stop

  • +
  • ‘stop’ (boolean): indicates if it is a stop

  • +
  • ‘order’ (int): order within this shape

  • +
+
+
+
Type:
+

dict

+
+
+
+ +
+
+program_type
+

Either PT or TRNBLD

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

Parameters instance that will be applied to this instance which +includes information about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+source_list
+

List of cube line file sources that have been read and added.

+
+
Type:
+

list

+
+
+
+ +
+
+diff_dict
+
+
Type:
+

dict

+
+
+
+ +
+
+__init__(parameters={})[source]
+

Constructor for CubeTransit

+

parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([parameters])

Constructor for CubeTransit

add_additional_time_periods(...)

Copies a route to another cube time period with appropriate values for time-period-specific properties.

add_cube(transit_source)

Reads a .lin file and adds it to existing TransitNetwork instance.

build_route_name([route_id, time_period, ...])

Create a route name by concatenating route, time period, agency, and direction

calculate_start_end_times(line_properties_dict)

Calculate the start and end times of the property change. WARNING: does not handle non-contiguous time periods.

create_add_route_card_dict(line)

Creates a project card change formatted dictionary for adding a route based on the information in self.route_properties for the line.

create_delete_route_card_dict(line, ...)

Creates a project card change formatted dictionary for deleting a line.

create_from_cube(transit_source[, parameters])

Reads a cube .lin file and stores as TransitNetwork object.

create_update_route_card_dict(line, ...)

Creates a project card change formatted dictionary for updating the line.

cube_properties_to_standard_properties(...)

Converts cube style properties to standard properties.

evaluate_differences(base_transit)

    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
+

evaluate_route_property_differences(...[, ...])

Checks if any values have been updated or added for a specific route and creates project card entries for each.

evaluate_route_shape_changes(shape_build, ...)

Compares two route shapes and returns a list of changes suitable for a project card.

get_time_period_numbers_from_cube_properties(...)

Finds properties that are associated with time periods and returns the numbers in them.

unpack_route_name(line_name)

Unpacks route name into direction, route, agency, and time period info

+
+
+add_additional_time_periods(new_time_period_number, orig_line_name)[source]
+

Copies a route to another cube time period with appropriate +values for time-period-specific properties.

+
+
New properties are stored under the new name in:
    +
  • ::self.shapes

  • +
  • ::self.line_properties

  • +
+
+
+
+
Parameters:
+
    +
  • new_time_period_number (int) – cube time period number

  • +
  • orig_line_name (str) – name of the originating line, from which +the new line will copy its properties.

  • +
+
+
Returns:
+

Line name with new time period.

+
+
+
+ +
+
+add_cube(transit_source)[source]
+

Reads a .lin file and adds it to existing TransitNetwork instance.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
+
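For example, additional line files can be layered onto an existing instance (a sketch; the file name is a placeholder):

import os

tn = CubeTransit.create_from_cube(CUBE_DIR)
tn.add_cube(os.path.join(CUBE_DIR, "more_routes.LIN"))  # parse and append another .lin file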
+ +
+
+static build_route_name(route_id='', time_period='', agency_id=0, direction_id=1)[source]
+

Create a route name by concatenating route, time period, agency, and direction

+
+
Parameters:
+
    +
  • route_id – i.e. 452-111

  • +
  • time_period – i.e. pk

  • +
  • direction_id – i.e. 1

  • +
  • agency_id – i.e. 0

  • +
+
+
Returns:
+

constructed line_name i.e. “0_452-111_452_pk1”

+
+
+
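A quick sketch of the naming convention this implements (values taken from the docstring above):

name = CubeTransit.build_route_name(
    route_id="452-111", time_period="pk", agency_id=0, direction_id=1
)
# per the docstring, this yields a composite name like "0_452-111_452_pk1"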
+ +
+
+calculate_start_end_times(line_properties_dict)[source]
+

Calculate the start and end times of the property change. WARNING: does not handle non-contiguous time periods.

+
+
Parameters:
+

line_properties_dict – dictionary of cube-flavor properties for a transit line

+
+
+
+ +
+
+create_add_route_card_dict(line)[source]
+

Creates a project card change formatted dictionary for adding +a route based on the information in self.route_properties for +the line.

+
+
Parameters:
+

line – name of line that is being updated

+
+
Returns:
+

A project card change-formatted dictionary for the route addition.

+
+
+
+ +
+
+create_delete_route_card_dict(line, base_transit_line_properties_dict)[source]
+

Creates a project card change formatted dictionary for deleting a line.

+
+
Parameters:
+
    +
  • line – name of line that is being deleted

  • +
  • base_transit_line_properties_dict – dictionary of cube-style +attribute values in order to find time periods and +start and end times.

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the route deletion.

+
+
+
+ +
+
+static create_from_cube(transit_source, parameters={})[source]
+

Reads a cube .lin file and stores as TransitNetwork object.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
Returns:
+

A ::CubeTransit object created from the transit_source.

+
+
+
+ +
+
+create_update_route_card_dict(line, updated_properties_dict)[source]
+

Creates a project card change formatted dictionary for updating +the line.

+
+
Parameters:
+
    +
  • line – name of line that is being updated

  • +
  • updated_properties_dict – dictionary of attributes to update as +‘property’: <property name>, +‘set’: <new property value>

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the attribute update.

+
+
+
+ +
+
+static cube_properties_to_standard_properties(cube_properties_dict)[source]
+

Converts cube style properties to standard properties.

+

This is most pertinent to time-period-specific variables like headway, and variables that have standard units like headway, which is in minutes in Cube and in seconds in standard format.

+
+
Parameters:
+

cube_properties_dict – <cube style property name> : <property value>

+
+
Returns:
A list of dictionaries with values for “property”: <standard style property name>, “set”: <property value with correct units>

+
+
+
+ +
+
+evaluate_differences(base_transit)[source]
+
    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
  3. +
    For routes being added or updated, identify if the time periods

    have changed or if there are multiples, and make duplicate lines if so

    +
    +
    +
  4. +
  5. Create project card dictionaries for each change.

  6. +
+
+
Parameters:
+

base_transit (CubeTransit) – an instance of this class for the base condition

+
+
Returns:
+

A list of dictionaries containing project card changes +required to evaluate the differences between the base network +and this transit network instance.

+
+
+
+ +
+
+evaluate_route_property_differences(properties_build, properties_base, time_period_number, absolute=True, validate_base=False)[source]
+

Checks if any values have been updated or added for a specific +route and creates project card entries for each.

+
+
Parameters:
+
    +
  • properties_build – ::<property_name>: <property_value>

  • +
  • properties_base – ::<property_name>: <property_value>

  • +
  • time_period_number – time period to evaluate

  • +
  • absolute – if True, will use a set command rather than a change. If False, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway

  • +
  • validate_base – if True, will add the existing line in the project card

  • +
+
+
Returns:
+

+
a list of dictionary values suitable for writing to a project card

{
‘property’: <property_name>,
‘set’: <set value>,
‘change’: <change from existing value>,
‘existing’: <existing value to check>,
}

+
+
+

+
+
Return type:
+

transit_change_list (list)

+
+
+
+ +
+
+static evaluate_route_shape_changes(shape_build, shape_base)[source]
+

Compares two route shapes and returns a list of changes suitable for a project card.

+
+
Parameters:
+
    +
  • shape_build – DataFrame of the build-version of the route shape.

  • +
  • shape_base – DataFrame of the base-version of the route shape.

  • +
+
+
Returns:
+

List of shape changes formatted as a project card-change dictionary.

+
+
+
+ +
+
+static get_time_period_numbers_from_cube_properties(properties_list)[source]
+

Finds properties that are associated with time periods and returns the numbers in them.

+
+
Parameters:
+

properties_list (list) – list of all properties.

+
+
Returns:
+

list of strings of the time period numbers found

+
+
+
+ +
+
+static unpack_route_name(line_name)[source]
+

Unpacks route name into direction, route, agency, and time period info

+
+
Parameters:
+

line_name (str) – i.e. “0_452-111_452_pk1”

+
+
Returns:
route_id (str): i.e. 452-111
time_period (str): i.e. pk
direction_id (str): i.e. 1
agency_id (str): i.e. 0

+
+
+
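And the inverse operation, a sketch based on the docstring above (the exact ordering of the returned values should be confirmed against the source):

parts = CubeTransit.unpack_route_name("0_452-111_452_pk1")
# yields the route_id ("452-111"), time period ("pk"), direction ("1") and agency ("0")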
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_generated/lasso.ModelRoadwayNetwork/index.html b/branch/bicounty_2035_hwy_update/_generated/lasso.ModelRoadwayNetwork/index.html new file mode 100644 index 0000000..b824de9 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_generated/lasso.ModelRoadwayNetwork/index.html @@ -0,0 +1,1573 @@ + + + + + + + lasso.ModelRoadwayNetwork — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.ModelRoadwayNetwork

+
+
+class lasso.ModelRoadwayNetwork(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Bases: RoadwayNetwork

+

Subclass of network_wrangler class RoadwayNetwork

+

A representation of the physical roadway network and its properties.

+
+
+__init__(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Constructor

+
+
Parameters:
+
    +
  • nodes – geodataframe of nodes

  • +
  • links – dataframe of links

  • +
  • shapes – geodataframe of shapes

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. +If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(nodes, links, shapes[, parameters])

Constructor

add_counts([network_variable, ...])

Adds count variable.

add_incident_link_data_to_nodes([links_df, ...])

Add data from links going to/from nodes to node.

add_new_roadway_feature_change(links, nodes)

add the new roadway features defined in the project card.

add_variable_using_shst_reference([...])

Join network links with source data, via SHST API node match result.

addition_map(links, nodes)

Shows which links and nodes are added to the roadway network

apply(project_card_dictionary)

Wrapper method to apply a project to a roadway network.

apply_managed_lane_feature_change(link_idx, ...)

Apply the managed lane feature changes to the roadway network

apply_python_calculation(pycode[, in_place])

Changes roadway network object by executing pycode.

apply_roadway_feature_change(link_idx, ...)

Changes the roadway attributes for the selected features based on the project card information passed

assess_connectivity([mode, ...])

Returns a network graph and list of disconnected subgraphs as described by a list of their member nodes.

build_selection_key(selection_dict)

Selections are stored by a key combining the query and the A and B ids.

calculate_area_type([area_type_shape, ...])

#MC Calculates area type variable.

calculate_centroidconnect(parameters[, ...])

Calculates centroid connector variable.

calculate_county([county_shape, ...])

#MC Calculates county variable.

calculate_distance([network_variable, ...])

calculate link distance in miles

calculate_mpo([county_network_variable, ...])

Calculates mpo variable.

calculate_use([network_variable, ...])

Calculates use variable.

convert_int([int_col_names])

Convert integer columns

create_ML_variable([network_variable, overwrite])

Created ML lanes placeholder for project to write out ML changes

create_calculated_variables()

Creates calculated roadway variables.

create_dummy_connector_links(ml_df[, ...])

create dummy connector links between the general purpose and managed lanes

create_hov_corridor_variable([...])

Created hov corridor placeholder for project to write out corridor changes

create_managed_lane_network([...])

Create a roadway network with managed lanes links separated out.

create_managed_variable([network_variable, ...])

Created placeholder for project to write out managed

dataframe_to_fixed_width(df)

Convert dataframe to fixed width format, geometry column will not be transformed.

delete_roadway_feature_change(links, nodes)

delete the roadway features defined in the project card.

deletion_map(links, nodes)

Shows which links and nodes are deleted from the roadway network

fill_na()

Fill na values from create_managed_lane_network()

from_RoadwayNetwork(roadway_network_object)

RoadwayNetwork to ModelRoadwayNetwork

get_attribute(links_df, join_key, ...)

Gets attribute from source data using SHST match result.

get_managed_lane_node_ids(nodes_list[, scalar])

Transform a list of node IDS by a scalar.

get_modal_graph(links_df, nodes_df[, mode, ...])

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

get_modal_links_nodes(links_df, nodes_df[, ...])

Returns nodes and link dataframes for specific mode.

get_property_by_time_period_and_group(prop)

Return a series for the properties with a specific group or time period.

identify_segment(O_id, D_id[, ...])

+
param endpoints:
+

list of length of two unique keys of nodes making up endpoints of segment

+
+
+

identify_segment_endpoints([mode, links_df, ...])

+
param mode:
+

list of modes of the network, one of drive,`transit`,

+
+
+

is_network_connected([mode, links_df, nodes_df])

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

load_transform_network(node_filename, ...[, ...])

Reads roadway network files from disk and transforms them into GeoDataFrames.

network_connection_plot(G, ...)

Plot a graph to check for network connection.

orig_dest_nodes_foreign_key(selection[, ...])

Returns the foreign key id (whatever is used in the u and v variables in the links file) for the AB nodes as a tuple.

ox_graph(nodes_df, links_df[, ...])

create an osmnx-flavored network graph

path_search(candidate_links_df, O_id, D_id)

+
param candidate_links:
+

selection of links geodataframe with links likely to be part of path

+
+
+

read(link_filename, node_filename, ...[, ...])

Reads in links and nodes network standard.

read_match_result(path)

Reads the shst geojson match returns.

rename_variables_for_dbf(input_df[, ...])

Rename attributes for DBF/SHP, make sure length within 10 chars.

roadway_net_to_gdf(roadway_net)

+
rtype:
+

GeoDataFrame

+
+
+

roadway_standard_to_met_council_network([...])

Rename and format roadway attributes to be consistent with what MetCouncil’s model is expecting.

select_roadway_features(selection[, ...])

Selects roadway features that satisfy selection criteria

selection_has_unique_link_id(selection_dict)

+
rtype:
+

bool

+
+
+

selection_map(selected_link_idx[, A, B, ...])

Shows which links are selected for roadway property change or parallel managed lanes category of roadway projects.

shortest_path(graph_links_df, O_id, D_id[, ...])

+
rtype:
+

tuple

+
+
+

split_properties_by_time_period_and_category([...])

Splits properties by time period, assuming a variable structure of

update_distance([links_df, use_shapes, ...])

Calculate link distance in specified units to network variable using either straight line distance or (if specified) shape distance if available.

validate_link_schema(link_filename[, ...])

Validate roadway network data link schema and output a boolean

validate_node_schema(node_file[, ...])

Validate roadway network data node schema and output a boolean

validate_properties(properties[, ...])

If there are change or existing commands, make sure that that property exists in the network.

validate_selection(selection[, ...])

Evaluate whether the selection dictionary contains the minimum required values.

validate_shape_schema(shape_file[, ...])

Validate roadway network data shape schema and output a boolean

validate_uniqueness()

Confirms that the unique identifiers are met.

write([path, filename])

Writes a network in the roadway network standard

write_roadway_as_fixedwidth(output_dir[, ...])

Writes out fixed width file.

write_roadway_as_shp(output_dir[, ...])

Write out dbf/shp/gpkg for cube.

+

Attributes

+ + + + + + +

CALCULATED_VALUES

+
+
+add_counts(network_variable='AADT', mndot_count_shst_data=None, widot_count_shst_data=None, mndot_count_variable_shp=None, widot_count_variable_shp=None)[source]
+

Adds count variable. +#MC +join the network with count node data, via SHST API node match result

+
+
Parameters:
+
    +
  • network_variable (str) – Name of the variable that should be written to. Default to “AADT”.

  • +
  • mndot_count_shst_data (str) – File path to MNDOT count location SHST API node match result.

  • +
  • widot_count_shst_data (str) – File path to WIDOT count location SHST API node match result.

  • +
  • mndot_count_variable_shp (str) – File path to MNDOT count location geodatabase.

  • +
  • widot_count_variable_shp (str) – File path to WIDOT count location geodatabase.

  • +
+
+
Returns:
+

None

+
+
+
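A hedged sketch of a call, assuming net is a ModelRoadwayNetwork instance; the file path arguments are placeholders, and when left as None they are presumably filled from the network’s Parameters instance:

# join MnDOT and WisDOT count locations onto the network links as "AADT"
net.add_counts(
    network_variable="AADT",
    mndot_count_shst_data="mn_count_shst_match.csv",  # placeholder SHST match result
    widot_count_shst_data="wi_count_shst_match.csv",  # placeholder SHST match result
)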
+ +
+ +

Add data from links going to/from nodes to node.

+
+
Return type:
+

DataFrame

+
+
Parameters:
+
    +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
  • link_variables – list of columns in links dataframe to add to incident nodes

  • +
+
+
Returns:
+

nodes DataFrame with link data where length is N*number of links going in/out

+
+
+
+ +
+
+add_new_roadway_feature_change(links, nodes)
+

add the new roadway features defined in the project card. +new shapes are also added for the new roadway links.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – list of dictionaries

  • +
  • nodes – list of dictionaries

  • +
+
+
+

returns: None

+
+ +
+
+add_variable_using_shst_reference(var_shst_csvdata=None, shst_csv_variable=None, network_variable=None, network_var_type=<class 'int'>, overwrite=False)[source]
+

Join network links with source data, via SHST API node match result.

+
+
Parameters:
+
    +
  • var_shst_csvdata (str) – File path to SHST API return.

  • +
  • shst_csv_variable (str) – Variable name in the source data.

  • +
  • network_variable (str) – Name of the variable that should be written to.

  • +
  • network_var_type – Variable type in the written network.

  • +
  • overwrite (bool) – True if overwriting existing variable. Defaults to False.

  • +
+
+
Returns:
+

None

+
+
+
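For example, a hedged sketch of attaching a source attribute matched via SharedStreets to the network links, assuming net is a ModelRoadwayNetwork instance (the file path and variable names are placeholders):

# join an attribute from a SHST match result csv onto the network links
net.add_variable_using_shst_reference(
    var_shst_csvdata="shst_match_result.csv",  # placeholder path to the SHST API return
    shst_csv_variable="lanes",                 # column name in the source data
    network_variable="lanes",                  # variable to write to in the network
    network_var_type=int,
    overwrite=True,
)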
+ +
+
+addition_map(links, nodes)
+

Shows which links and nodes are added to the roadway network

+
+ +
+
+apply(project_card_dictionary)
+

Wrapper method to apply a project to a roadway network.

+
+
Parameters:
+

project_card_dictionary – dict +a dictionary of the project card object

+
+
+
+ +
+
+apply_managed_lane_feature_change(link_idx, properties, in_place=True)
+

Apply the managed lane feature changes to the roadway network

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean to indicate whether to update self or return +a new roadway network object

  • +
+
+
+
+ +
+
+apply_python_calculation(pycode, in_place=True)
+

Changes roadway network object by executing pycode.

+
+
Parameters:
+
    +
  • pycode – python code which changes values in the roadway network object

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+
+ +
+
+apply_roadway_feature_change(link_idx, properties, in_place=True)
+

Changes the roadway attributes for the selected features based on the +project card information passed

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean +update self or return a new roadway network object

  • +
+
+
+
+ +
+
+assess_connectivity(mode='', ignore_end_nodes=True, links_df=None, nodes_df=None)
+

Returns a network graph and list of disconnected subgraphs +as described by a list of their member nodes.

+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • ignore_end_nodes – if True, ignores stray singleton nodes

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+
Returns: Tuple of

Network Graph (osmnx flavored networkX DiGraph) +List of disconnected subgraphs described by the list of their

+
+

member nodes (as described by their model_node_id)

+
+
+
+
+ +
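A sketch of checking drive connectivity and plotting any disconnected subgraphs, assuming net is an existing roadway network instance:

    graph, disconnected_subgraph_nodes = net.assess_connectivity(
        mode="drive", ignore_end_nodes=True
    )
    if disconnected_subgraph_nodes:
        # network_connection_plot returns (fig, ax) per its documentation
        fig, ax = net.network_connection_plot(graph, disconnected_subgraph_nodes)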
+
+build_selection_key(selection_dict)
+

Selections are stored by a key combining the query and the A and B ids. +This method combines the two for you based on the selection dictionary.

+
+
Return type:
+

tuple

+
+
Parameters:
+

selection_dictionary – Selection Dictionary

+
+
+

Returns: Tuple serving as the selection key.

+
+ +
+
+calculate_area_type(area_type_shape=None, area_type_shape_variable=None, network_variable='area_type', area_type_codes_dict=None, downtown_area_type_shape=None, downtown_area_type=None, overwrite=False)[source]
+

#MC +Calculates area type variable.

+

This uses the centroid of the geometry field to determine which area it should be labeled. +This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • area_type_shape (str) – File path to the area type geodatabase.

  • +
  • area_type_shape_variable (str) – The variable name of area type in the area geodatabase.

  • +
  • network_variable (str) – The variable name of area type in network standard. Default to “area_type”.

  • +
  • area_type_codes_dict – The dictionary to map input area_type_shape_variable to network_variable

  • +
  • downtown_area_type_shape – The file path to the downtown area type boundary.

  • +
  • downtown_area_type (int) – Integer value of downtown area type

  • +
  • overwrite (Bool) – True if overwriting existing area type variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
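A sketch using the MetCouncil-style inputs documented as defaults in the Parameters class; the shapefile paths and values shown are those defaults and may need to be adjusted:

    model_net.calculate_area_type(
        area_type_shape="metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp",
        area_type_shape_variable="COMDES2040",
        network_variable="area_type",
        downtown_area_type_shape="metcouncil_data/area_type/downtownzones_TAZ.shp",
        downtown_area_type=5,
        overwrite=True,
    )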
+
+calculate_centroidconnect(parameters, network_variable='centroidconnect', highest_taz_number=None, as_integer=True, overwrite=False)[source]
+

Calculates centroid connector variable.

+
+
Parameters:
+
    +
  • parameters (Parameters) – A Lasso Parameters, which stores input files.

  • +
  • network_variable (str) – Variable that should be written to in the network. Default to “centroidconnect”

  • +
  • highest_taz_number (int) – the max TAZ number in the network.

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing centroid connector variable in network. Default to False.

  • +
+
+
Returns:
+

RoadwayNetwork

+
+
+
+ +
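A sketch of a call, assuming model_net keeps its Parameters instance on .parameters and using the documented default highest TAZ number:

    model_net.calculate_centroidconnect(
        parameters=model_net.parameters,
        network_variable="centroidconnect",
        highest_taz_number=3100,
        as_integer=True,
        overwrite=False,
    )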
+
+calculate_county(county_shape=None, county_shape_variable=None, network_variable='county', county_codes_dict=None, overwrite=False)[source]
+

#MC +Calculates county variable.

+

This uses the centroid of the geometry field to determine which county it should be labeled. +This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • county_shape (str) – File path to the county geodatabase.

  • +
  • county_shape_variable (str) – The variable name of county in the county geodatabase.

  • +
  • network_variable (str) – The variable name of county in network standard. Default to “county”.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_distance(network_variable='distance', centroidconnect_only=False, overwrite=False)[source]
+

calculate link distance in miles

+
+
Parameters:
+
    +
  • centroidconnect_only (Bool) – True if calculating distance for centroidconnectors only. Default to False.

  • +
  • overwrite (Bool) – True if overwriting existing variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_mpo(county_network_variable='county', network_variable='mpo', as_integer=True, mpo_counties=None, overwrite=False)[source]
+

Calculates mpo variable.
#MC

Parameters:

  • county_network_variable (str) – Name of the variable where the county names are stored. Default to “county”.

  • network_variable (str) – Name of the variable that should be written to. Default to “mpo”.

  • as_integer (bool) – If true, will convert true/false to 1/0s.

  • mpo_counties (list) – List of county names that are within the mpo region.

  • overwrite (Bool) – True if overwriting existing mpo variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
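A sketch of a call using the MPO county list documented as the default in the Parameters class:

    model_net.calculate_mpo(
        county_network_variable="county",
        network_variable="mpo",
        as_integer=True,
        mpo_counties=[
            "ANOKA", "DAKOTA", "HENNEPIN", "RAMSEY",
            "SCOTT", "WASHINGTON", "CARVER",
        ],
    )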
+
+calculate_use(network_variable='use', as_integer=True, overwrite=False)[source]
+

Calculates use variable.

+
+
Parameters:
+
    +
  • network_variable (str) – Variable that should be written to in the network. Default to “use”

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing use variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+convert_int(int_col_names=[])[source]
+

Convert integer columns

+
+ +
+
+create_ML_variable(network_variable='ML_lanes', overwrite=False)[source]
+

Created ML lanes placeholder for project to write out ML changes

+

ML lanes default to 0; ML info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_calculated_variables()[source]
+

Creates calculated roadway variables.

+
+
Parameters:
+

None

+
+
+
+ +
+ +

create dummy connector links between the general purpose and managed lanes

+
+
Parameters:
+
    +
  • gp_df – GeoDataFrame +dataframe of general purpose links (where managed lane also exists)

  • +
  • ml_df – GeoDataFrame +dataframe of corresponding managed lane links,

  • +
  • access_lanes – int +number of lanes in access dummy link

  • +
  • egress_lanes – int +number of lanes in egress dummy link

  • +
  • access_roadway – str +roadway type for access dummy link

  • +
  • egress_roadway – str +roadway type for egress dummy link

  • +
  • access_name_prefix – str +prefix for access dummy link name

  • +
  • egress_name_prefix – str +prefix for egress dummy link name

  • +
+
+
+
+ +
+
+create_hov_corridor_variable(network_variable='segment_id', overwrite=False)[source]
+

Created hov corridor placeholder for project to write out corridor changes

+

hov corridor id defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_managed_lane_network(keep_same_attributes_ml_and_gp=None, keep_additional_attributes_ml_and_gp=[], managed_lanes_required_attributes=[], managed_lanes_node_id_scalar=None, managed_lanes_link_id_scalar=None, in_place=False)
+

Create a roadway network with managed lanes links separated out. +Add new parallel managed lane links, access/egress links, +and add shapes corresponding to the new links

+
+
Return type:
+

RoadwayNetwork

+
+
Parameters:
+
    +
  • keep_same_attributes_ml_and_gp – list of attributes to copy from general purpose +lane to managed lane. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to KEEP_SAME_ATTRIBUTES_ML_AND_GP.

  • +
  • keep_additional_attributes_ml_and_gp – list of additional attributes to add. This is useful +if you want to leave the default attributes and then ALSO some others.

  • +
  • managed_lanes_required_attributes – list of attributes that are required to be specified +in new managed lanes. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_REQUIRED_ATTRIBUTES.

  • +
  • managed_lanes_node_id_scalar – integer value added to original node IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_NODE_ID_SCALAR.

  • +
  • managed_lanes_link_id_scalar – integer value added to original link IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_LINK_ID_SCALAR.

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+

returns: A RoadwayNetwork instance

+
+ +
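A sketch that returns a new network with managed-lane links split out; the extra attribute names carried over to the managed lanes are illustrative:

    ml_net = net.create_managed_lane_network(
        keep_additional_attributes_ml_and_gp=["county", "area_type"],
        in_place=False,
    )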
+
+create_managed_variable(network_variable='managed', overwrite=False)[source]
+

Created placeholder for project to write out managed

+

managed defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+static dataframe_to_fixed_width(df)[source]
+

Convert dataframe to fixed width format, geometry column will not be transformed.

+
+
Parameters:
+

df (pandas DataFrame) –

+
+
Returns:
+

dataframe with fixed width for each column. +dict: dictionary with column names as keys, column widths as values.

+
+
Return type:
+

pandas dataframe

+
+
+
+ +
+
+delete_roadway_feature_change(links, nodes, ignore_missing=True)
+

delete the roadway features defined in the project card. +valid links and nodes defined in the project gets deleted +and shapes corresponding to the deleted links are also deleted.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – dict +list of dictionaries

  • +
  • nodes – dict +list of dictionaries

  • +
  • ignore_missing – bool +If True, will only warn about links/nodes that are missing from +network but specified to “delete” in project card +If False, will fail.

  • +
+
+
+
+ +
+
+deletion_map(links, nodes)
+

Shows which links and nodes are deleted from the roadway network

+
+ +
+
+fill_na()[source]
+

Fill na values from create_managed_lane_network()

+
+ +
+
+static from_RoadwayNetwork(roadway_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters:
+
    +
  • roadway_network_object (RoadwayNetwork) –

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
+ +
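A sketch of the conversion, assuming road_net is an existing network_wrangler RoadwayNetwork instance:

    from lasso import ModelRoadwayNetwork

    model_net = ModelRoadwayNetwork.from_RoadwayNetwork(
        roadway_network_object=road_net,
        parameters={},  # or a lasso.Parameters instance
    )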
+
+static get_attribute(links_df, join_key, source_shst_ref_df, source_gdf, field_name)[source]
+

Gets attribute from source data using SHST match result.

+
+
Parameters:
+
    +
  • links_df (dataframe) – The network dataframe that new attribute should be written to.

  • +
  • join_key (str) – SHST ID variable name used to join source data with network dataframe.

  • +
  • source_shst_ref_df (str) – File path to source data SHST match result.

  • +
  • source_gdf (str) – File path to source data.

  • +
  • field_name (str) – Name of the attribute to get from source data.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+static get_managed_lane_node_ids(nodes_list, scalar=4500000)
+

Transform a list of node IDs by a scalar. +..todo #237 what if node ids are not a number?

+
+
Parameters:
+
    +
  • nodes_list – list of integers

  • +
  • scalar – value to add to node IDs

  • +
+
+
+

Returns: list of integers

+
+ +
+
+static get_modal_graph(links_df, nodes_df, mode=None, modes_to_network_link_variables={'bike': ['bike_access'], 'bus': ['bus_only', 'drive_access'], 'drive': ['drive_access'], 'rail': ['rail_only'], 'transit': ['bus_only', 'rail_only', 'drive_access'], 'walk': ['walk_access']})
+

Creates a graph of the network for the specified mode (an osmnx-flavored networkX DiGraph).

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: networkx: osmnx: DiGraph of network

+
+ +
+ +

Returns nodes and link dataframes for specific mode.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • modes – list of the modes of the network to be kept, must be in drive,`transit`,`rail`,`bus`, +walk, bike. For example, if bike and walk are selected, both bike and walk links will be kept.

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: tuple of DataFrames for links, nodes filtered by mode

+

links with walk access are not marked as having walk access +Issue discussed in https://github.com/wsp-sag/network_wrangler/issues/145 +modal_nodes_df = nodes_df[nodes_df[mode_node_variable] == 1]

+
+ +
+
+get_property_by_time_period_and_group(prop, time_period=None, category=None, default_return=None)
+

Return a series for the properties with a specific group or time period.

+
+
Parameters:
+
    +
  • prop (str) – the variable that you want from network

  • +
  • time_period (list(str)) – the time period that you are querying for +i.e. [‘16:00’, ‘19:00’]

  • +
  • category (str or list(str)(Optional)) –

    the group category +i.e. “sov”

    +

    or

    +

    list of group categories in order of search, i.e. +[“hov3”,”hov2”]

    +

  • +
  • default_return (what to return if variable or time period not found. Default is None.) –

  • +
+
+
Return type:
+

pandas series

+
+
+
+ +
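A sketch pulling AM-peak lanes for HOV2 traffic, using the documented category search order and Parameters-style time strings; net is assumed to be an existing roadway network instance:

    am_hov2_lanes = net.get_property_by_time_period_and_group(
        prop="lanes",
        time_period=["6:00", "10:00"],
        category=["hov2", "default", "sov"],
        default_return=0,
    )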
+
+identify_segment(O_id, D_id, selection_dict={}, mode=None, nodes_df=None, links_df=None)
+
+
Parameters:
+
    +
  • endpoints – list of length of two unique keys of nodes making up endpoints of segment

  • +
  • selection_dict – dictionary of link variables to select candidate links from, otherwise will create a graph of ALL links which will be both a RAM hog and could result in odd shortest paths.

  • +
  • segment_variables – list of variables to keep

  • +
+
+
+
+ +
+
+identify_segment_endpoints(mode='', links_df=None, nodes_df=None, min_connecting_links=10, min_distance=None, max_link_deviation=2)
+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+ +
+
+is_network_connected(mode=None, links_df=None, nodes_df=None)
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
+
+
+

Returns: boolean

+
+ +
+
+static load_transform_network(node_filename, link_filename, shape_filename, crs=4326, node_foreign_key='model_node_id', validate_schema=True, **kwargs)
+

Reads roadway network files from disk and transforms them into GeoDataFrames.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • node_filename – file name for nodes.

  • +
  • link_filename – file name for links.

  • +
  • shape_filename – file name for shapes.

  • +
  • crs – coordinate reference system. Defaults to value in CRS.

  • +
  • node_foreign_key – variable linking the node table to the link table. Defaults +to NODE_FOREIGN_KEY.

  • +
  • validate_schema – boolean indicating if network should be validated to schema.

  • +
+
+
+

returns: tuple of GeodataFrames nodes_df, links_df, shapes_df

+
+ +
+
+static network_connection_plot(G, disconnected_subgraph_nodes)
+

Plot a graph to check for network connection.

+
+
Parameters:
+
    +
  • G – OSMNX flavored networkX graph.

  • +
  • disconnected_subgraph_nodes – List of disconnected subgraphs described by the list of their +member nodes (as described by their model_node_id).

  • +
+
+
+

returns: fig, ax : tuple

+
+ +
+
+orig_dest_nodes_foreign_key(selection, node_foreign_key='')
+

Returns the foreign key id (whatever is used in the u and v +variables in the links file) for the AB nodes as a tuple.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • selection – selection dictionary with A and B keys

  • +
  • node_foreign_key – variable name for whatever is used by the u and v variables +in the links_df file. If nothing is specified, assumes whatever the default is.

  • +
+
+
+

Returns: tuple of (A_id, B_id)

+
+ +
+
+static ox_graph(nodes_df, links_df, node_foreign_key='model_node_id', link_foreign_key=['A', 'B'], unique_link_key='model_link_id')
+

create an osmnx-flavored network graph

+

osmnx doesn’t like values that are arrays, so remove the variables +that have arrays. osmnx also requires that certain variables +be filled in, so do that too.

+
+
Parameters:
+
    +
  • nodes_df – GeoDataFrame of nodes

  • +
  • links_df – GeoDataFrame of links

  • +
  • node_foreign_key – field referenced in link_foreign_key

  • +
  • link_foreign_key – list of attributes that define the link start and end nodes to the node foreign key

  • +
  • unique_link_key – primary key for links

  • +
+
+
+

Returns: a networkx multidigraph

+
+ +
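A sketch of building the graph from a network's standard frames, assuming net exposes nodes_df and links_df attributes:

    G = net.ox_graph(
        nodes_df=net.nodes_df,
        links_df=net.links_df,
        node_foreign_key="model_node_id",
        link_foreign_key=["A", "B"],
        unique_link_key="model_link_id",
    )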
+ +
+
Parameters:
+
    +
  • candidate_links – selection of links geodataframe with links likely to be part of path

  • +
  • O_id – origin node foreign key ID

  • +
  • D_id – destination node foreign key ID

  • +
  • weight_column – column to use for weight of shortest path. Defaults to “i” (iteration)

  • +
  • weight_factor – optional weight to multiply the weight column by when finding the shortest path

  • +
  • search_breadth

  • +
+
+
+

Returns

+
+ +
+
+static read(link_filename, node_filename, shape_filename, fast=False, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Reads in links and nodes in the network standard format.

+
+
Parameters:
+
    +
  • link_filename (str) – File path to link json.

  • +
  • node_filename (str) – File path to node geojson.

  • +
  • shape_filename (str) – File path to link true shape geojson

  • +
  • fast (bool) – boolean that will skip validation to speed up read time.

  • +
  • recalculate_calculated_variables (bool) – calculates fields from spatial joins, etc.

  • +
  • recalculate_distance (bool) – re-calculates distance.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
+ +
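A sketch of reading a standard network into a ModelRoadwayNetwork; the file locations are placeholders:

    from lasso import ModelRoadwayNetwork

    model_net = ModelRoadwayNetwork.read(
        link_filename="network/link.json",
        node_filename="network/node.geojson",
        shape_filename="network/shape.geojson",
        fast=True,       # skip schema validation for speed
        parameters={},   # or a lasso.Parameters instance
    )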
+
+static read_match_result(path)[source]
+

Reads the shst geojson match returns.

+

Returns shst dataframe.

+

Reads many files of the same type and concatenates them into a single DataFrame.

+
+
Parameters:
+

path (str) – File path to SHST match results.

+
+
Returns:
+

geopandas geodataframe

+
+
Return type:
+

geodataframe

+
+
+

##todo +not sure why we need, but should be in utilities not this class

+
+ +
+
+rename_variables_for_dbf(input_df, variable_crosswalk=None, output_variables=None, convert_geometry_to_xy=False)[source]
+

Rename attributes for DBF/SHP, make sure length within 10 chars.

+
+
Parameters:
+
    +
  • input_df (dataframe) – Network standard DataFrame.

  • +
  • variable_crosswalk (str) – File path to variable name crosswalk from network standard to DBF names.

  • +
  • output_variables (list) – List of strings for DBF variables.

  • +
  • convert_geometry_to_xy (bool) – True if converting node geometry to X/Y

  • +
+
+
Returns:
+

dataframe

+
+
+
+ +
+
+static roadway_net_to_gdf(roadway_net)
+
+
Return type:
+

GeoDataFrame

+
+
+

Turn the roadway network into a GeoDataFrame.

Parameters:

roadway_net – the roadway network to export

+

returns: shapes dataframe

+
+ +
+
+roadway_standard_to_met_council_network(output_epsg=None)[source]
+

Rename and format roadway attributes to be consistent with what MetCouncil’s model is expecting.
#MC

Parameters:

output_epsg (int) – EPSG number of output network.

+
+
Returns:
+

None

+
+
+
+ +
+
+select_roadway_features(selection, search_mode='drive', force_search=False, sp_weight_factor=None)
+

Selects roadway features that satisfy selection criteria

+
+
Return type:
+

GeoDataFrame

+
+
+
+
Example usage:
+
net.select_roadway_features(
+
selection = [ {

# a match condition for the from node using osm,
# shared streets, or model node number
'from': {'osm_model_link_id': '1234'},
# a match for the to-node
'to': {'shstid': '4321'},
# a regex or match for facility condition
# could be # of lanes, facility type, etc.
'facility': {'name': 'Main St'},
}, ... ])

+
+
+
+
+
+
+
+
Parameters:
+
    +
  • selection – dictionary with keys for: +A - from node +B - to node +link - which includes at least a variable for name

  • +
  • search_mode – mode which you are searching for; defaults to “drive”

  • +
  • force_search – boolean directing method to perform search even if one +with same selection dict is stored from a previous search.

  • +
  • sp_weight_factor – multiple used to discourage shortest paths which +meander from original search returned from name or ref query. +If not set here, will default to value of sp_weight_factor in +RoadwayNetwork instance. If not set there, will default to SP_WEIGHT_FACTOR.

  • +
+
+
+

Returns: a list of link IDs in selection

+
+ +
+ +
+
Return type:
+

bool

+
+
Parameters:
+

selection_dictionary – Dictionary representation of selection +of roadway features, containing a “link” key.

+
+
+
+
Returns: A boolean indicating if the selection dictionary contains

a unique identifier for links.

+
+
+
+ +
+
+selection_map(selected_link_idx, A=None, B=None, candidate_link_idx=[])
+

Shows which links are selected for roadway property change or parallel +managed lanes category of roadway projects.

+
+
Parameters:
+
    +
  • selected_links_idx – list of selected link indices

  • +
  • candidate_links_idx – optional list of candidate link indices to also include in map

  • +
  • A – optional foreign key of starting node of a route selection

  • +
  • B – optional foreign key of ending node of a route selection

  • +
+
+
+
+ +
+
+shortest_path(graph_links_df, O_id, D_id, nodes_df=None, weight_column='i', weight_factor=100)
+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • graph_links_df

  • +
  • O_id – foreign key for start node

  • +
  • D_id – foreign key for end node

  • +
  • nodes_df – optional nodes df, otherwise will use network instance

  • +
  • weight_column – column to use as a weight, defaults to “i”

  • +
  • weight_factor – any additional weighting to multiply the weight column by, defaults to SP_WEIGHT_FACTOR

  • +
+
+
+

Returns: tuple with length of four +- Boolean if shortest path found +- nx Directed graph of graph links +- route of shortest path nodes as List +- links in shortest path selected from links_df

+
+ +
+
+split_properties_by_time_period_and_category(properties_to_split=None)[source]
+

Splits properties by time period, assuming a variable structure of

+
+
Parameters:
+

properties_to_split

dict: dictionary of output variable prefix mapped to the source variable and what to stratify it by, e.g.:

{
    'lanes': {'v': 'lanes', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    'ML_lanes': {'v': 'ML_lanes', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    'use': {'v': 'use', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
}

+

+
+
+
+ +
+
+update_distance(links_df=None, use_shapes=False, units='miles', network_variable='distance', overwrite=True, inplace=True)
+

Calculate link distance in specified units to network variable using either straight line +distance or (if specified) shape distance if available.

+
+
Parameters:
+
    +
  • links_df – Links GeoDataFrame. Useful if want to update a portion of network links +(i.e. only centroid connectors). If not provided, will use entire self.links_df.

  • +
  • use_shapes – if True, will add length information from self.shapes_df rather than crow-fly. +If no corresponding shape found in self.shapes_df, will default to crow-fly.

  • +
  • units – units to use. Defaults to the standard unit of miles. Available units: “meters”, “miles”.

  • +
  • network_variable – variable to store link distance in. Defaults to “distance”.

  • +
  • overwrite – Defaults to True and will overwrite all existing calculated distances. +False will only update NaNs.

  • +
  • inplace – updates self.links_df

  • +
+
+
Returns:
+

links_df with updated distance

+
+
+
+ +
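A sketch that fills in missing link distances from true shapes where available, assuming net is an existing roadway network instance:

    net.update_distance(
        use_shapes=True,
        units="miles",
        network_variable="distance",
        overwrite=False,  # only update NaN distances
        inplace=True,
    )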
+ +

Validate roadway network data link schema and output a boolean

+
+ +
+
+static validate_node_schema(node_file, schema_location='roadway_network_node.json')
+

Validate roadway network data node schema and output a boolean

+
+ +
+
+validate_properties(properties, ignore_existing=False, require_existing_for_change=False)
+

If there are change or existing commands, make sure that the +property exists in the network.

+
+
Return type:
+

bool

+
+
Parameters:
+
    +
  • properties – properties dictionary to be evaluated

  • +
  • ignore_existing – If True, will only warn about properties +that specify an “existing” value. If False, will fail.

  • +
  • require_existing_for_change – If True, will fail if there isn’t +a specified value in the project card for existing when a +change is specified.

  • +
+
+
+

Returns: boolean value as to whether the properties dictionary is valid.

+
+ +
+
+validate_selection(selection, selection_requires=['link'])
+

Evaluate whether the selection dictionary contains the +minimum required values.

+
+
Return type:
+

bool

+
+
Parameters:
+

selection – selection dictionary to be evaluated

+
+
+

Returns: boolean value as to whether the selection dictionary is valid.

+
+ +
+
+static validate_shape_schema(shape_file, schema_location='roadway_network_shape.json')
+

Validate roadway network data shape schema and output a boolean

+
+ +
+
+validate_uniqueness()
+

Confirms that the unique identifiers are met.

+
+
Return type:
+

bool

+
+
+
+ +
+
+write(path='.', filename=None)
+

Writes a network in the roadway network standard

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • path – the path where the output will be saved

  • +
  • filename – the name prefix of the roadway files that will be generated

  • +
+
+
+
+ +
+
+write_roadway_as_fixedwidth(output_dir, node_output_variables=None, link_output_variables=None, output_link_txt=None, output_node_txt=None, output_link_header_width_txt=None, output_node_header_width_txt=None, output_cube_network_script=None, drive_only=False)[source]
+

Writes out fixed width file.

+

This function does:
1. write out link and node fixed width data files for cube.
2. write out header and width correspondence.
3. write out cube network building script with header and width specification.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to where links, nodes and script will be written and run

  • +
  • node_output_variables (list) – list of node variable names.

  • +
  • link_output_variables (list) – list of link variable names.

  • +
  • output_link_txt (str) – File name of output link database (within output_dir)

  • +
  • output_node_txt (str) – File name of output node database (within output_dir)

  • +
  • output_link_header_width_txt (str) – File name of link column width records (within output_dir)

  • +
  • output_node_header_width_txt (str) – File name of node column width records (within output_dir)

  • +
  • output_cube_network_script (str) – File name of CUBE network building script (within output_dir)

  • +
  • drive_only (bool) – If True, only writes drive nodes and links

  • +
+
+
Returns:
+

None

+
+
+
+ +
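A sketch mirroring the default output names documented in the Parameters class; all file names are written inside output_dir:

    model_net.write_roadway_as_fixedwidth(
        output_dir="tests/scratch",
        output_link_txt="links.txt",
        output_node_txt="nodes.txt",
        output_link_header_width_txt="links_header_width.txt",
        output_node_header_width_txt="nodes_header_width.txt",
        output_cube_network_script="make_complete_network_from_fixed_width_file.s",
        drive_only=False,
    )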
+
+write_roadway_as_shp(output_dir, node_output_variables=None, link_output_variables=None, data_to_csv=True, data_to_dbf=False, output_link_shp=None, output_node_shp=None, output_link_csv=None, output_node_csv=None, output_gpkg=None, output_link_gpkg_layer=None, output_node_gpkg_layer=None, output_gpkg_link_filter=None)[source]
+

Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to directory

  • +
  • node_output_variables (list) – List of strings for node output variables.

  • +
  • link_output_variables (list) – List of strings for link output variables.

  • +
  • data_to_csv (bool) – True if write network in csv format.

  • +
  • data_to_dbf (bool) – True if write network in dbf/shp format.

  • +
  • output_link_shp (str) – File name to output link dbf/shp.

  • +
  • output_node_shp (str) – File name of output node dbf/shp.

  • +
  • output_link_csv (str) – File name to output link csv.

  • +
  • output_node_csv (str) – File name to output node csv.

  • +
  • output_gpkg (str) – File name to output GeoPackage.

  • +
  • output_link_gpkg_layer (str) – Layer name within output_gpkg to output links.

  • +
  • output_node_gpkg_layer (str) – Layer name within output_gpkg to output nodes.

  • +
  • output_gpkg_link_filter (str) – Optional column name used to output additional link subset layers

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_generated/lasso.Parameters/index.html b/branch/bicounty_2035_hwy_update/_generated/lasso.Parameters/index.html new file mode 100644 index 0000000..ead392d --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_generated/lasso.Parameters/index.html @@ -0,0 +1,555 @@ + + + + + + + lasso.Parameters — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.Parameters

+
+
+class lasso.Parameters(**kwargs)[source]
+

Bases: object

+

A class representing all the parameters defining the networks +including time of day, categories, etc.

+

Parameters can be set at runtime by initializing a parameters instance +with a keyword argument setting the attribute. Parameters that are +not explicitly set will use default parameters listed in this class. +.. highlight:: python

+
+
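For example, a minimal sketch of overriding a couple of the defaults listed below at runtime (the override values here are purely illustrative):

    from lasso import Parameters

    params = Parameters(
        centroid_connect_lanes=1,
        mpo_counties=["HENNEPIN", "RAMSEY"],  # illustrative subset
    )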
Attr:
+
time_period_to_time (dict): Maps time period abbreviations used in

Cube to time of days used on gtfs and highway network standard +Default:

+
{
+    "EA": ("3:00", "6:00"),
+    "AM": ("6:00", "10:00"),
+    "MD": ("10:00", "15:00"),
+    "PM": ("15:00", "19:00"),
+    "EV": ("19:00", "3:00"),
+}
+
+
+
+
cube_time_periods (dict): Maps cube time period numbers used in

transit line files to the time period abbreviations in time_period_to_time +dictionary. +Default:

+
{"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"}
+
+
+
+
categories (dict): Maps demand category abbreviations to a list of

network categories they are allowed to use. +Default:

+
{
+    # suffix, source (in order of search)
+    "sov": ["sov", "default"],
+    "hov2": ["hov2", "default", "sov"],
+    "hov3": ["hov3", "hov2", "default", "sov"],
+    "truck": ["trk", "sov", "default"],
+}
+
+
+
+
properties_to_split (dict): Dictionary mapping variables in standard

roadway network to categories and time periods that need to be +split out in final model network to get variables like LANES_AM. +Default:

+
{
+    "lanes": {
+        "v": "lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "ML_lanes": {
+        "v": "ML_lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "use": {
+        "v": "use",
+        "time_periods": self.time_periods_to_time
+    },
+}
+
+
+
+
county_shape (str): File location of shapefile defining counties.

Default:

+
r"metcouncil_data/county/cb_2017_us_county_5m.shp"
+
+
+
+
county_variable_shp (str): Property defining the county name in

the county_shape file. +Default:

+
NAME
+
+
+
+
lanes_lookup_file (str): Lookup table of number of lanes for different data sources.

Default:

+
r"metcouncil_data/lookups/lanes.csv"
+
+
+
+
centroid_connect_lanes (int): Number of lanes for centroid connectors.

Default:

+
1
+
+
+
+
mpo_counties (list): list of county names within MPO boundary.

Default:

+
[
+    "ANOKA",
+    "DAKOTA",
+    "HENNEPIN",
+    "RAMSEY",
+    "SCOTT",
+    "WASHINGTON",
+    "CARVER",
+]
+
+
+
+
taz_shape (str):

Default:

+
r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp"
+
+
+
+
taz_data (str):

Default:

+
??
+
+
+
+
highest_taz_number (int): highest TAZ number in order to define

centroid connectors. +Default:

+
3100
+
+
+
+
output_variables (list): list of variables to output in final model

network. +Default:

+
[
+    "model_link_id",
+    "link_id",
+    "A",
+    "B",
+    "shstGeometryId",
+    "distance",
+    "roadway",
+    "name",
+    "roadway_class",
+    "bike_access",
+    "walk_access",
+    "drive_access",
+    "truck_access",
+    "trn_priority_EA",
+    "trn_priority_AM",
+    "trn_priority_MD",
+    "trn_priority_PM",
+    "trn_priority_EV",
+    "ttime_assert_EA",
+    "ttime_assert_AM",
+    "ttime_assert_MD",
+    "ttime_assert_PM",
+    "ttime_assert_EV",
+    "lanes_EA",
+    "lanes_AM",
+    "lanes_MD",
+    "lanes_PM",
+    "lanes_EV",
+    "price_sov_EA",
+    "price_hov2_EA",
+    "price_hov3_EA",
+    "price_truck_EA",
+    "price_sov_AM",
+    "price_hov2_AM",
+    "price_hov3_AM",
+    "price_truck_AM",
+    "price_sov_MD",
+    "price_hov2_MD",
+    "price_hov3_MD",
+    "price_truck_MD",
+    "price_sov_PM",
+    "price_hov2_PM",
+    "price_hov3_PM",
+    "price_truck_PM",
+    "price_sov_EV",
+    "price_hov2_EV",
+    "price_hov3_EV",
+    "price_truck_EV",
+    "roadway_class_idx",
+    "facility_type",
+    "county",
+    "centroidconnect",
+    "model_node_id",
+    "N",
+    "osm_node_id",
+    "bike_node",
+    "transit_node",
+    "walk_node",
+    "drive_node",
+    "geometry",
+    "X",
+    "Y",
+    "ML_lanes_EA",
+    "ML_lanes_AM",
+    "ML_lanes_MD",
+    "ML_lanes_PM",
+    "ML_lanes_EV",
+    "segment_id",
+    "managed",
+    "bus_only",
+    "rail_only"
+]
+
+
+
+
osm_facility_type_dict (dict): Mapping between OSM Roadway variable

and facility type. Default:

+
+
area_type_shape (str): Location of shapefile defining area type.

Default:

+
r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp"
+
+
+
+
area_type_variable_shp (str): property in area_type_shape with area

type in it. +Default:

+
"COMDES2040"
+
+
+
+
area_type_code_dict (dict): Mapping of the area_type_variable_shp to

the area type code used in the MetCouncil cube network. +Default:

+
{
+    23: 4,  # urban center
+    24: 3,
+    25: 2,
+    35: 2,
+    36: 1,
+    41: 1,
+    51: 1,
+    52: 1,
+    53: 1,
+    60: 1,
+}
+
+
+
+
downtown_area_type_shape (str): Location of shapefile defining downtown area type.

Default:

+
r"metcouncil_data/area_type/downtownzones_TAZ.shp"
+
+
+
+
downtown_area_type (int): Area type integer for downtown.

Default:

+
5
+
+
+
+
mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property

associated with roadway class. Default:

+
r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp"
+
+
+
+
mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp

associated with roadway class. Default:

+
"ROUTE_SYS"
+
+
+
+
widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property

associated with roadway class. Default:

+
r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp"
+
+
+
+
widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape

associated with roadway class.Default:

+
"RDWY_CTGY_"
+
+
+
+
mndot_count_shape (str): Shapefile of MnDOT links with a property

associated with counts. Default:

+
r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp"
+
+
+
+
mndot_count_variable_shp (str): The property in mndot_count_shape

associated with counts. Default:

+
+
::

“lookups/osm_highway_facility_type_crosswalk.csv”

+
+
+
+
legacy_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from Legacy TM2 network. Default:

+
"lookups/legacy_tm2_attributes.csv"
+
+
+
+
osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId

from OSM. Default:

+
"lookups/osm_lanes_attributes.csv"
+
+
+
+
tam_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from TAM TM2 network. Default:

+
"lookups/tam_tm2_attributes.csv"
+
+
+
+
tom_tom_attributes (str): CSV file of link attributes by

shStReferenceId from TomTom network. Default:

+
"lookups/tomtom_attributes.csv"
+
+
+
+
sfcta_attributes (str): CSV file of link attributes by

shStReferenceId from SFCTA network. Default:

+
"lookups/sfcta_attributes.csv"
+
+
+
+
output_epsg (int): EPSG type of geographic projection for output

shapefiles. Default:

+
102646
+
+
+
+
output_link_shp (str): Output shapefile for roadway links. Default:
+
::

r”tests/scratch/links.shp”

+
+
+
+
output_node_shp (str): Output shapefile for roadway nodes. Default:
+
::

r”tests/scratch/nodes.shp”

+
+
+
+
output_link_csv (str): Output csv for roadway links. Default:
+
::

r”tests/scratch/links.csv”

+
+
+
+
output_node_csv (str): Output csv for roadway nodes. Default:
+
::

r”tests/scratch/nodes.csv”

+
+
+
+
output_link_txt (str): Output fixed format txt for roadway links. Default:
+
::

r”tests/scratch/links.txt”

+
+
+
+
output_node_txt (str): Output fixed format txt for roadway nodes. Default:
+
::

r”tests/scratch/nodes.txt”

+
+
+
+
output_link_header_width_txt (str): Header for txt roadway links. Default:
+
::

r”tests/scratch/links_header_width.txt”

+
+
+
+
output_node_header_width_txt (str): Header for txt for roadway Nodes. Default:
+
::

r”tests/scratch/nodes_header_width.txt”

+
+
+
+
output_cube_network_script (str): Cube script for importing

fixed-format roadway network. Default:

+
+r"tests/scratch/make_complete_network_from_fixed_width_file.s"
+
+
+
+
+
+
+
+
+__init__(**kwargs)[source]
+

Time period and category splitting info

+
+ +

Methods

+ + + + + + +

__init__(**kwargs)

Time period and category splitting info

+

Attributes

+ + + + + + + + + + + + + + + +

cube_time_periods

#MC self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

properties_to_split

Details for calculating the county based on the centroid of the link.

county_link_range_dict

self.county_code_dict = {

zones

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+
self.county_code_dict = {

“Anoka”: 1, +“Carver”: 2, +“Dakota”: 3, +“Hennepin”: 4, +“Ramsey”: 5, +“Scott”: 6, +“Washington”: 7, +“external”: 10, +“Chisago”: 11, +“Goodhue”: 12, +“Isanti”: 13, +“Le Sueur”: 14, +“McLeod”: 15, +“Pierce”: 16, +“Polk”: 17, +“Rice”: 18, +“Sherburne”: 19, +“Sibley”: 20, +“St. Croix”: 21, +“Wright”: 22,

+
+
+

}

+
+ +
+
+cube_time_periods
+

#MC +self.route_type_bus_mode_dict = {“Urb Loc”: 5, “Sub Loc”: 6, “Express”: 7}

+

self.route_type_mode_dict = {0: 8, 2: 9}

+

self.cube_time_periods = {“1”: “AM”, “2”: “MD”} +self.cube_time_periods_name = {“AM”: “pk”, “MD”: “op”}

+
+ +
+
+properties_to_split
+

Details for calculating the county based on the centroid of the link. +The NAME variable should be the name of a field in the shapefile.

+
+ +
+
+zones
+

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_generated/lasso.Project/index.html b/branch/bicounty_2035_hwy_update/_generated/lasso.Project/index.html new file mode 100644 index 0000000..68c1dc2 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_generated/lasso.Project/index.html @@ -0,0 +1,521 @@ + + + + + + + lasso.Project — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.Project

+
+
+class lasso.Project(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Bases: object

+

A single or set of changes to the roadway or transit system.

+

Compares a base and a build transit network or a base and build +highway network and produces project cards.

+

Typical usage example:

+
test_project = Project.create_project(
+    base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+test_project.evaluate_changes()
+test_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+)
+
+
+
+
+DEFAULT_PROJECT_NAME
+

a class-level constant that defines what +the project name will be if none is set.

+
+ +
+
+STATIC_VALUES
+

a class-level constant which defines values that +are not evaluated when assessing changes.

+
+ +
+
+card_data
+

{“project”: <project_name>, “changes”: <list of change dicts>}

+
+
Type:
+

dict

+
+
+
+ +
+ +

pandas dataframe of CUBE roadway link changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+roadway_node_changes
+

pandas dataframe of CUBE roadway node changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+transit_changes
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+base_roadway_network
+
+
Type:
+

RoadwayNetwork

+
+
+
+ +
+
+base_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+build_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+project_name
+

name of the project, set to DEFAULT_PROJECT_NAME if not provided

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

an instance of the Parameters class which sets a bunch of parameters

+
+ +
+
+__init__(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

ProjectCard constructor.

+
+
Parameters:
+
    +
  • roadway_link_changes – dataframe of roadway changes read from a log file

  • +
  • roadway_node_changes – dataframe of roadway changes read from a log file

  • +
  • transit_changes – dataframe of transit changes read from a log file

  • +
  • base_roadway_network – RoadwayNetwork instance for base case

  • +
  • base_transit_network – StandardTransit instance for base case

  • +
  • base_cube_transit_network – CubeTransit instance for base transit network

  • +
  • build_cube_transit_network – CubeTransit instance for build transit network

  • +
  • project_name – name of the project

  • +
  • evaluate – defaults to false, but if true, will create card data

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
+

returns: instance of ProjectCard

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([roadway_link_changes, ...])

ProjectCard constructor.

add_highway_changes([...])

Evaluates changes from the log file based on the base highway object and adds entries into the self.card_data dictionary.

add_transit_changes()

Evaluates changes between base and build transit objects and adds entries into the self.card_data dictionary.

create_project([roadway_log_file, ...])

Constructor for a Project instance.

determine_roadway_network_changes_compatibility(...)

Checks to see that any links or nodes that change exist in base roadway network.

emme_id_to_wrangler_id(emme_link_change_df, ...)

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

emme_name_to_wrangler_name(...)

rename emme names to wrangler names using crosswalk file

evaluate_changes()

Determines which changes should be evaluated, initiates self.card_data to be an aggregation of transit and highway changes.

get_object_from_network_build_command()

determine whether the network build object is a node or a link

get_operation_from_network_build_command()

determine the network build object action type

read_logfile(logfilename)

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

read_network_build_file(networkbuildfilename)

Reads an EMME network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

write_project_card([filename])

Writes project cards.

+

Attributes

+ + + + + + + + + + + + +

CALCULATED_VALUES

DEFAULT_PROJECT_NAME

STATIC_VALUES

+
+
+add_highway_changes(limit_variables_to_existing_network=False)[source]
+

Evaluates changes from the log file based on the base highway object and +adds entries into the self.card_data dictionary.

+
+
Parameters:
+

limit_variables_to_existing_network (bool) – True if no ad-hoc variables. Default to False.

+
+
+
+ +
+
+add_transit_changes()[source]
+

Evaluates changes between base and build transit objects and +adds entries into the self.card_data dictionary.

+
+ +
+
+static create_project(roadway_log_file=None, roadway_shp_file=None, roadway_csv_file=None, network_build_file=None, emme_node_id_crosswalk_file=None, emme_name_crosswalk_file=None, base_roadway_dir=None, base_transit_dir=None, base_cube_transit_source=None, build_cube_transit_source=None, roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name=None, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Constructor for a Project instance.

+
+
Parameters:
+
    +
  • roadway_log_file – File path to consuming logfile or a list of logfile paths.

  • +
  • roadway_shp_file – File path to consuming shape file for roadway changes.

  • +
  • roadway_csv_file – File path to consuming csv file for roadway changes.

  • +
  • network_build_file – File path to consuming EMME network build for network changes.

  • +
  • base_roadway_dir – Folder path to base roadway network.

  • +
  • base_transit_dir – Folder path to base transit network.

  • +
  • base_cube_transit_source – Folder path to base transit network or cube line file string.

  • +
  • base_cube_transit_file – File path to base transit network.

  • +
  • build_cube_transit_source – Folder path to build transit network or cube line file string.

  • +
  • build_cube_transit_file – File path to build transit network.

  • +
  • roadway_link_changes – pandas dataframe of CUBE roadway link changes.

  • +
  • roadway_node_changes – pandas dataframe of CUBE roadway node changes.

  • +
  • transit_changes – build transit changes.

  • +
  • base_roadway_network – Base roadway network object.

  • +
  • base_cube_transit_network – Base cube transit network object.

  • +
  • build_cube_transit_network – Build cube transit network object.

  • +
  • project_name – If not provided, will default to the roadway_log_file filename if +provided (or the first filename if a list is provided)

  • +
  • recalculate_calculated_variables – if reading in a base network, if this is true it +will recalculate variables such as area type, etc. This only needs to be true +if you are creating project cards that are changing the calculated variables.

  • +
  • recalculate_distance – recalculate the distance variable. This only needs to be +true if you are creating project cards that change the distance.

  • +
  • parameters – dictionary of parameters

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in +the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables +in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

A Project instance.

+
+
+
+ +
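A sketch of building a project card from a Cube log file against a base network; the file and folder paths are hypothetical:

    from lasso import Project

    project = Project.create_project(
        roadway_log_file="build/roadway_changes.log",
        base_roadway_dir="base_network/",
        project_name="example_roadway_project",
    )
    project.write_project_card("project_cards/example_roadway_project.yml")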
+
+static determine_roadway_network_changes_compatibility(base_roadway_network, roadway_link_changes, roadway_node_changes, parameters)[source]
+

Checks to see that any links or nodes that change exist in base roadway network.

+
+ +
+
+static emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file)[source]
+

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

+
+ +
+
+static emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file)[source]
+

rename emme names to wrangler names using crosswalk file

+
+ +
+
+evaluate_changes()[source]
+

Determines which changes should be evaluated, initiates +self.card_data to be an aggregation of transit and highway changes.

+
+ +
+
+get_object_from_network_build_command()[source]
+

determine whether the network build object is a node or a link

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘N’ for node, ‘L’ for link

+
+
+
+ +
+
+get_operation_from_network_build_command()[source]
+

determine the network build object action type

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘A’, ‘C’, ‘D’

+
+
+
+ +
+
+static read_logfile(logfilename)[source]
+

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

logfilename (str or list[str]) – File path to CUBE logfile or list of logfile paths.

+
+
Returns:
+

A DataFrame representation of the log file.

+
+
+
+ +
+
+static read_network_build_file(networkbuildfilename)[source]
+

Reads an EMME network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

networkbuildfilename (str or list[str]) – File path to EMME network build file or list of network build file paths.

+
+
Returns:
+

A DataFrame representation of the network build file

+
+
+
+ +
+
+write_project_card(filename=None)[source]
+

Writes project cards.

+
+
Parameters:
+

filename (str) – File path to output .yml

+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'assign_group', 'centroidconnect']
+
+ +
+
+DEFAULT_PROJECT_NAME = 'USER TO define'
+
+ +
+
+STATIC_VALUES = ['model_link_id', 'area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_generated/lasso.StandardTransit/index.html b/branch/bicounty_2035_hwy_update/_generated/lasso.StandardTransit/index.html new file mode 100644 index 0000000..e3c050b --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_generated/lasso.StandardTransit/index.html @@ -0,0 +1,420 @@ + + + + + + + lasso.StandardTransit — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.StandardTransit

+
+
+class lasso.StandardTransit(ptg_feed, parameters={})[source]
+

Bases: object

+

Holds a standard transit feed as a Partridge object and contains +methods to manipulate and translate the GTFS data to MetCouncil’s +Cube Line files.

+

Typical usage example:

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+feed
+

Partridge Feed object containing read-only access to GTFS feed

+
+ +
+
+parameters
+

Parameters instance containing information +about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+__init__(ptg_feed, parameters={})[source]
+
+
Parameters:
+
    +
  • ptg_feed – partridge feed object

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(ptg_feed[, parameters])

+
param ptg_feed:
+

partridge feed object

+
+
+

calculate_cube_mode(row)

Assigns a cube mode number by following logic.

cube_format(row)

Creates a string representing the route in cube line file notation. #MC :param row: row of a DataFrame representing a cube-formatted trip, with the Attributes trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR.

evaluate_differences(transit_changes)

Compare changes from the transit_changes dataframe with the standard transit network and returns the project card changes in dictionary format

fromTransitNetwork(transit_network_object[, ...])

Converts a TransitNetwork instance to a StandardTransit instance.

read_gtfs(gtfs_feed_dir[, parameters])

Reads GTFS files from a directory and returns a StandardTransit instance.

route_properties_gtfs_to_cube(self)

Prepare gtfs for cube lin file.

shape_gtfs_to_cube(row[, add_nntime])

Creates a list of nodes for the route in the appropriate cube format.

shape_gtfs_to_emme(trip_row)

Creates transit segment for the trips in appropriate emme format.

time_to_cube_time_period(start_time_secs[, ...])

Converts seconds from midnight to the cube time period.

write_as_cube_lin([outpath])

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties.

+
+
+calculate_cube_mode(row)[source]
+

Assigns a cube mode number by the following logic. +#MC +For rail, uses GTFS route_type variable: +https://developers.google.com/transit/gtfs/reference

+
+
::

# route_type : cube_mode
route_type_to_cube_mode = {
    0: 8,  # Tram, Streetcar, Light rail
    3: 0,  # Bus; further disaggregated for cube
    2: 9,  # Rail
}

+
+
+
+

For buses, uses route id numbers and route name to find +express and suburban buses as follows:

+
+
::
+
if not cube_mode:
    if 'express' in row['LONGNAME'].lower():
        cube_mode = 7  # Express
    elif int(row['route_id'].split("-")[0]) > 99:
        cube_mode = 6  # Suburban Local
    else:
        cube_mode = 5  # Urban Local

+
+
+
+
+
+
+
+
Parameters:
+

row – A DataFrame row with route_type, route_long_name, and route_id

+
+
Returns:
+

cube mode number

+
+
+
+ +
+
+cube_format(row)[source]
+

Creates a string representing the route in cube line file notation. +#MC +:param row: row of a DataFrame representing a cube-formatted trip, with the Attributes

+
+

trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR

+
+
+
Returns:
+

string representation of route in cube line file notation

+
+
+
+ +
+
+evaluate_differences(transit_changes)[source]
+

Compare changes from the transit_changes dataframe with the standard transit network and +returns the project card changes in dictionary format

+
+ +
+
+static fromTransitNetwork(transit_network_object, parameters={})[source]
+

Converts a TransitNetwork instance into a StandardTransit instance.

+
+
Parameters:

  • transit_network_object – Reference to an instance of TransitNetwork.

  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided, default parameters will be used.
+
+
Returns:
+

StandardTransit

+
+
+
+ +
+
+static read_gtfs(gtfs_feed_dir, parameters={})[source]
+

Reads GTFS files from a directory and returns a StandardTransit +instance.

+
+
Parameters:

  • gtfs_feed_dir – location of the GTFS files

  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided, default parameters will be used.
+
+
Returns:
+

StandardTransit instance

+
+
+
+ +
+
+static route_properties_gtfs_to_cube(self)[source]
+

Prepares GTFS data for a Cube .lin file. Does the following operations:

1. Combines route, frequency, trip, and shape information

2. Converts time of day to time periods

3. Calculates the cube route name from the GTFS route name and properties

4. Assigns a cube-appropriate mode number

5. Assigns a cube-appropriate operator number

+
+
Returns:

  DataFrame of trips with cube-appropriate values for:

    • NAME

    • ONEWAY

    • OPERATOR

    • MODE

    • HEADWAY
+
+

+
+
Return type:
+

trip_df (DataFrame)

+
+
+
+ +
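A minimal sketch of the kind of table assembly the five steps above imply is shown below. It assumes a partridge-style feed object whose trips, routes, and frequencies attributes are DataFrames with standard GTFS columns; the column handling, time-of-day bins, and derived property names are illustrative assumptions, not lasso's exact code.

def route_properties_sketch(feed):
    # 1. Combine route, frequency, and trip information (shape merge omitted).
    trip_df = feed.trips.merge(feed.routes, on="route_id")
    trip_df = trip_df.merge(feed.frequencies, on="trip_id")
    # 2. Convert start_time (assumed seconds from midnight) to an illustrative time-of-day label.
    trip_df["tod"] = trip_df["start_time"].apply(
        lambda s: "AM" if 6 * 3600 <= s < 9 * 3600 else "OP"  # assumed bins
    )
    # 3.-5. Derive cube-style properties (names and rules here are illustrative).
    trip_df["NAME"] = trip_df["route_id"].astype(str) + "_" + trip_df["tod"]
    trip_df["HEADWAY"] = (trip_df["headway_secs"] / 60).round().astype(int)
    return trip_df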
+
+shape_gtfs_to_cube(row, add_nntime=False)[source]
+

Creates a list of nodes for the route in the appropriate cube format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a string representation of the node list for a route in cube format.

+
+
+
+ +
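The node list portion of a Cube line record is typically a comma-separated sequence of node numbers, with a negative sign marking nodes the route passes without stopping. The sketch below builds such a string under that assumption from a shape DataFrame with hypothetical node and stop columns; it is an illustration, not lasso's exact implementation.

import pandas as pd

def node_list_sketch(shape_df):
    # shape_df is assumed to carry 'node' (int) and 'stop' (bool) columns,
    # ordered along the route; a negative number flags a non-stop node.
    nodes = [n if is_stop else -n
             for n, is_stop in zip(shape_df["node"], shape_df["stop"])]
    return "N=" + ", ".join(str(n) for n in nodes)

shape = pd.DataFrame({"node": [101, 102, 103], "stop": [True, False, True]})
print(node_list_sketch(shape))  # N=101, -102, 103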
+
+shape_gtfs_to_emme(trip_row)[source]
+

Creates transit segments for the trips in the appropriate Emme format.

+
+
Parameters:
+

trip_row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a DataFrame representation of the transit segment for a trip in Emme format.

+
+
+
+ +
+
+time_to_cube_time_period(start_time_secs, as_str=True, verbose=False)[source]
+

Converts seconds from midnight to the cube time period.

+
+
Parameters:

  • start_time_secs – start time for transit trip in seconds from midnight

  • as_str – if True, returns the time period as a string; otherwise returns a numeric time period

Returns:

  this_tp_num – if as_str is False, the numeric time period

  this_tp – if as_str is True, the Cube time period name abbreviation

+
+
+
+ +
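As an illustration of the conversion, the sketch below maps seconds from midnight into a time-period number and abbreviation using assumed MetCouncil-style period boundaries; the real boundaries come from the Parameters instance and may differ.

# Assumed time-period boundaries in hours; the actual values live in Parameters.
periods = [("AM", 1, 6, 9), ("MD", 2, 9, 15.5), ("PM", 3, 15.5, 19), ("NT", 4, 19, 30)]

def to_time_period(start_time_secs, as_str=True):
    hours = (start_time_secs / 3600.0) % 24
    if hours < 6:
        hours += 24  # treat early-morning trips as part of the night period
    for name, num, lo, hi in periods:
        if lo <= hours < hi:
            return name if as_str else num
    return "NT" if as_str else 4

print(to_time_period(7 * 3600))          # "AM"
print(to_time_period(7 * 3600, False))   # 1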
+
+write_as_cube_lin(outpath=None)[source]
+

Writes the GTFS feed as a cube line file after converting GTFS properties to MetCouncil cube properties. :param outpath: File location for the output cube line file.

+
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_generated/lasso.logger/index.html b/branch/bicounty_2035_hwy_update/_generated/lasso.logger/index.html new file mode 100644 index 0000000..baf81a9 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_generated/lasso.logger/index.html @@ -0,0 +1,143 @@ + + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.logger

+

Functions

+ + + + + + +

setupLogging(infoLogFilename, debugLogFilename)

Sets up the logger.

+
+
+lasso.logger.setupLogging(infoLogFilename, debugLogFilename, logToConsole=True)[source]
+

Sets up the logger. The infoLog is terse, giving just the bare minimum of details so the network composition will be clear later. The debugLog is very noisy, for debugging.

Pass None to either filename to skip that log. Spews it all out to the console too, if logToConsole is True.

+
+ +
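A typical call using the documented signature; the log file names here are placeholders:

from lasso.logger import setupLogging

# Write a terse info log and a verbose debug log, and echo to the console.
setupLogging("lasso_info.log", "lasso_debug.log", logToConsole=True)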
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_generated/lasso.util/index.html b/branch/bicounty_2035_hwy_update/_generated/lasso.util/index.html new file mode 100644 index 0000000..1d14e5f --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_generated/lasso.util/index.html @@ -0,0 +1,1696 @@ + + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.util

+

Functions

+ + + + + + + + + + + + + + + + + + + + + + + + +

column_name_to_parts(c[, parameters])

create_locationreference(node, link)

geodesic_point_buffer(lat, lon, meters)

Creates a circular buffer polygon for a node.

get_shared_streets_intersection_hash(lat, long)

Calculated per:

hhmmss_to_datetime(hhmmss_str)

Creates a datetime time object from a string of hh:mm:ss

secs_to_datetime(secs)

Creates a datetime time object from a seconds from midnight

shorten_name(name)

+
+
+class lasso.util.Point(*args)[source]
+

Bases: BaseGeometry

+

A geometry type that represents a single coordinate with +x,y and possibly z values.

+

A point is a zero-dimensional feature and has zero length and zero area.

+
+
Parameters:
+

args (float, or sequence of floats) –

The coordinates can either be passed as a single parameter, or as +individual float values using multiple parameters:

+
    +
  1. 1 parameter: a sequence or array-like of with 2 or 3 values.

  2. +
  3. 2 or 3 parameters (float): x, y, and possibly z.

  4. +
+

+
+
+
+
+x, y, z
+

Coordinate values

+
+
Type:
+

float

+
+
+
+ +

Examples

+

Constructing the Point using separate parameters for x and y:

+
>>> p = Point(1.0, -1.0)
+
+
+

Constructing the Point using a list of x, y coordinates:

+
>>> p = Point([1.0, -1.0])
+>>> print(p)
+POINT (1 -1)
+>>> p.y
+-1.0
+>>> p.x
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
  • equality (This method considers coordinate) –

  • +
  • requires (which) –

  • +
  • components (coordinates to be equal and in the same order for all) –

  • +
  • geometry. (of a) –

  • +
  • two (Because of this it is possible for "equals()" to be True for) –

  • +
  • False. (geometries and "equals_exact()" to be) –

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG circle element for the Point geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG circle diameter. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that's a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property x
+

Return x coordinate.

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+

Example

+
>>> x, y = Point(0, 0).xy
+>>> list(x)
+[0.0]
+>>> list(y)
+[0.0]
+
+
+
+ +
+
+property y
+

Return y coordinate.

+
+ +
+
+property z
+

Return z coordinate.

+
+ +
+ +
+
+class lasso.util.Polygon(shell=None, holes=None)[source]
+

Bases: BaseGeometry

+

A geometry type representing an area that is enclosed by a linear ring.

+

A polygon is a two-dimensional feature and has a non-zero area. It may +have one or more negative-space “holes” which are also bounded by linear +rings. If any rings cross each other, the feature is invalid and +operations on it may fail.

+
+
Parameters:
+
    +
  • shell (sequence) – A sequence of (x, y [,z]) numeric coordinate pairs or triples, or +an array-like with shape (N, 2) or (N, 3). +Also can be a sequence of Point objects.

  • +
  • holes (sequence) – A sequence of objects which satisfy the same requirements as the +shell parameters above

  • +
+
+
+
+
+exterior
+

The ring which bounds the positive space of the polygon.

+
+
Type:
+

LinearRing

+
+
+
+ +
+
+interiors
+

A sequence of rings which bound all existing holes.

+
+
Type:
+

sequence

+
+
+
+ +

Examples

+

Create a square polygon with no holes

+
>>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.))
+>>> polygon = Polygon(coords)
+>>> polygon.area
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
  • equality (This method considers coordinate) –

  • +
  • requires (which) –

  • +
  • components (coordinates to be equal and in the same order for all) –

  • +
  • geometry. (of a) –

  • +
  • two (Because of this it is possible for "equals()" to be True for) –

  • +
  • False. (geometries and "equals_exact()" to be) –

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+classmethod from_bounds(xmin, ymin, xmax, ymax)[source]
+

Construct a Polygon() from spatial bounds.

+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG path element for the Polygon geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG stroke-width. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that's a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property exterior
+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property interiors
+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+
+ +
+ +
+
+class lasso.util.partial[source]
+

Bases: object

+

partial(func, *args, **keywords) - new function with partial application +of the given arguments and keywords.

+
+
+args
+

tuple of arguments to future partial calls

+
+ +
+
+func
+

function object to use in future partial calls

+
+ +
+
+keywords
+

dictionary of keyword arguments to future partial calls

+
+ +
+ +
+
+lasso.util.column_name_to_parts(c, parameters=None)[source]
+
+ +
+
+lasso.util.create_locationreference(node, link)[source]
+
+ +
+
+lasso.util.geodesic_point_buffer(lat, lon, meters)[source]
+

Creates a circular buffer polygon for a node.

Parameters:

  • lat – node lat

  • lon – node lon

  • meters – buffer distance (radius of the circle)
+
+
Returns:
+

Polygon

+
+
+
+ +
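A common way to build such a buffer, and a plausible reading of this helper, is to project the point into an azimuthal-equidistant CRS centered on it, buffer in meters, and project back to WGS84. The sketch below uses pyproj and shapely under that assumption; it is not necessarily lasso's exact implementation.

import pyproj
from shapely.geometry import Point
from shapely.ops import transform

def geodesic_point_buffer_sketch(lat, lon, meters):
    # An azimuthal equidistant projection centered on the node keeps distances
    # from the center true, so a planar buffer of `meters` is ~geodesically correct.
    aeqd = pyproj.CRS(proj="aeqd", lat_0=lat, lon_0=lon, datum="WGS84")
    wgs84 = pyproj.CRS("EPSG:4326")
    to_wgs84 = pyproj.Transformer.from_crs(aeqd, wgs84, always_xy=True).transform
    buf = Point(0.0, 0.0).buffer(meters)  # the node sits at the projection origin
    return transform(to_wgs84, buf)       # circular polygon in lon/lat coordinates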
+
+lasso.util.get_shared_streets_intersection_hash(lat, long, osm_node_id=None)[source]
+
+
Calculated per:
https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565

Expected in/out:

  in:  -93.0965985, 44.952112199999995, osm_node_id = 954734870

  out: 69f13f881649cb21ee3b359730790bb9

+
+
+
+
+
+ +
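The linked sharedstreets-js code derives the intersection ID as an md5 digest of a formatted "Intersection <lon> <lat>" message. The sketch below mirrors that idea; the exact message template and coordinate rounding are assumptions here, so verify against the linked source (and the expected in/out above) before relying on it.

import hashlib

def shared_streets_intersection_hash_sketch(lat, lon):
    # Assumed message format per sharedstreets-js; rounding/template may differ.
    message = "Intersection {0:.5f} {1:.5f}".format(lon, lat)
    return hashlib.md5(message.encode("utf-8")).hexdigest()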
+
+lasso.util.hhmmss_to_datetime(hhmmss_str)[source]
+

Creates a datetime time object from a string of hh:mm:ss

+
+
Parameters:
+

hhmmss_str – string of hh:mm:ss

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

dt

+
+
+
+ +
+
+lasso.util.secs_to_datetime(secs)[source]
+

Creates a datetime time object from a seconds from midnight

+
+
Parameters:
+

secs – seconds from midnight

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

dt

+
+
+
+ +
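Both hhmmss_to_datetime and secs_to_datetime are thin wrappers around the standard library. A minimal sketch of equivalent conversions (not necessarily lasso's exact code):

import datetime

def hhmmss_to_time(hhmmss_str):
    # "08:15:30" -> datetime.time(8, 15, 30)
    return datetime.datetime.strptime(hhmmss_str, "%H:%M:%S").time()

def secs_to_time(secs):
    # 29730 seconds from midnight -> datetime.time(8, 15, 30)
    return (datetime.datetime.min + datetime.timedelta(seconds=secs)).time()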
+
+lasso.util.shorten_name(name)[source]
+
+ +
+
+lasso.util.transform(func, geom)[source]
+

Applies func to all coordinates of geom and returns a new +geometry of the same type from the transformed coordinates.

+

func maps x, y, and optionally z to output xp, yp, zp. The input +parameters may iterable types like lists or arrays or single values. +The output shall be of the same type. Scalars in, scalars out. +Lists in, lists out.

+

For example, here is an identity function applicable to both types +of input.

+
+
+
def id_func(x, y, z=None):

return tuple(filter(None, [x, y, z]))

+
+
+

g2 = transform(id_func, g1)

+
+

Using pyproj >= 2.1, this example will accurately project Shapely geometries:

+
+

import pyproj

+

wgs84 = pyproj.CRS(‘EPSG:4326’) +utm = pyproj.CRS(‘EPSG:32618’)

+

project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform

+

g2 = transform(project, g1)

+
+

Note that the always_xy kwarg is required here as Shapely geometries only support +X,Y coordinate ordering.

+

Lambda expressions such as the one in

+
+

g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1)

+
+

also satisfy the requirements for func.

+
+ +
+
+lasso.util.unidecode(string, errors='ignore', replace_str='?')
+

Transliterate an Unicode object into an ASCII string

+
+
Return type:
+

str

+
+
+
>>> unidecode("北亰")
+"Bei Jing "
+
+
+

This function first tries to convert the string using ASCII codec. +If it fails (because of non-ASCII characters), it falls back to +transliteration using the character tables.

+

This is approx. five times faster if the string only contains ASCII +characters, but slightly slower than unicode_expect_nonascii if +non-ASCII characters are present.

+

errors specifies what to do with characters that have not been +found in replacement tables. The default is ‘ignore’ which ignores +the character. ‘strict’ raises an UnidecodeError. ‘replace’ +substitutes the character with replace_str (default is ‘?’). +‘preserve’ keeps the original character.

+

Note that if ‘preserve’ is used the returned string might not be +ASCII!

+
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_modules/functools/index.html b/branch/bicounty_2035_hwy_update/_modules/functools/index.html new file mode 100644 index 0000000..bd88918 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_modules/functools/index.html @@ -0,0 +1,1083 @@ + + + + + + functools — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for functools

+"""functools.py - Tools for working with functions and callable objects
+"""
+# Python module wrapper for _functools C module
+# to allow utilities written in Python to be added
+# to the functools module.
+# Written by Nick Coghlan <ncoghlan at gmail.com>,
+# Raymond Hettinger <python at rcn.com>,
+# and Łukasz Langa <lukasz at langa.pl>.
+#   Copyright (C) 2006-2013 Python Software Foundation.
+# See C source code for _functools credits/copyright
+
+__all__ = ['update_wrapper', 'wraps', 'WRAPPER_ASSIGNMENTS', 'WRAPPER_UPDATES',
+           'total_ordering', 'cmp_to_key', 'lru_cache', 'reduce', 'partial',
+           'partialmethod', 'singledispatch', 'singledispatchmethod',
+           "cached_property"]
+
+from abc import get_cache_token
+from collections import namedtuple
+# import types, weakref  # Deferred to single_dispatch()
+from reprlib import recursive_repr
+from _thread import RLock
+
+
+################################################################################
+### update_wrapper() and wraps() decorator
+################################################################################
+
+# update_wrapper() and wraps() are tools to help write
+# wrapper functions that can handle naive introspection
+
+WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__',
+                       '__annotations__')
+WRAPPER_UPDATES = ('__dict__',)
+def update_wrapper(wrapper,
+                   wrapped,
+                   assigned = WRAPPER_ASSIGNMENTS,
+                   updated = WRAPPER_UPDATES):
+    """Update a wrapper function to look like the wrapped function
+
+       wrapper is the function to be updated
+       wrapped is the original function
+       assigned is a tuple naming the attributes assigned directly
+       from the wrapped function to the wrapper function (defaults to
+       functools.WRAPPER_ASSIGNMENTS)
+       updated is a tuple naming the attributes of the wrapper that
+       are updated with the corresponding attribute from the wrapped
+       function (defaults to functools.WRAPPER_UPDATES)
+    """
+    for attr in assigned:
+        try:
+            value = getattr(wrapped, attr)
+        except AttributeError:
+            pass
+        else:
+            setattr(wrapper, attr, value)
+    for attr in updated:
+        getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
+    # Issue #17482: set __wrapped__ last so we don't inadvertently copy it
+    # from the wrapped function when updating __dict__
+    wrapper.__wrapped__ = wrapped
+    # Return the wrapper so this can be used as a decorator via partial()
+    return wrapper
+
+def wraps(wrapped,
+          assigned = WRAPPER_ASSIGNMENTS,
+          updated = WRAPPER_UPDATES):
+    """Decorator factory to apply update_wrapper() to a wrapper function
+
+       Returns a decorator that invokes update_wrapper() with the decorated
+       function as the wrapper argument and the arguments to wraps() as the
+       remaining arguments. Default arguments are as for update_wrapper().
+       This is a convenience function to simplify applying partial() to
+       update_wrapper().
+    """
+    return partial(update_wrapper, wrapped=wrapped,
+                   assigned=assigned, updated=updated)
+
+
+################################################################################
+### total_ordering class decorator
+################################################################################
+
+# The total ordering functions all invoke the root magic method directly
+# rather than using the corresponding operator.  This avoids possible
+# infinite recursion that could occur when the operator dispatch logic
+# detects a NotImplemented result and then calls a reflected method.
+
+def _gt_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a < b) and (a != b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _le_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (a < b) or (a == b).'
+    op_result = self.__lt__(other)
+    return op_result or self == other
+
+def _ge_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a < b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _ge_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a <= b) or (a == b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _lt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (a <= b) and (a != b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _gt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a <= b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _lt_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a > b) and (a != b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _ge_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (a > b) or (a == b).'
+    op_result = self.__gt__(other)
+    return op_result or self == other
+
+def _le_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a > b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _le_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a >= b) or (a == b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _gt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (a >= b) and (a != b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _lt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a >= b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+_convert = {
+    '__lt__': [('__gt__', _gt_from_lt),
+               ('__le__', _le_from_lt),
+               ('__ge__', _ge_from_lt)],
+    '__le__': [('__ge__', _ge_from_le),
+               ('__lt__', _lt_from_le),
+               ('__gt__', _gt_from_le)],
+    '__gt__': [('__lt__', _lt_from_gt),
+               ('__ge__', _ge_from_gt),
+               ('__le__', _le_from_gt)],
+    '__ge__': [('__le__', _le_from_ge),
+               ('__gt__', _gt_from_ge),
+               ('__lt__', _lt_from_ge)]
+}
+
+def total_ordering(cls):
+    """Class decorator that fills in missing ordering methods"""
+    # Find user-defined comparisons (not those inherited from object).
+    roots = {op for op in _convert if getattr(cls, op, None) is not getattr(object, op, None)}
+    if not roots:
+        raise ValueError('must define at least one ordering operation: < > <= >=')
+    root = max(roots)       # prefer __lt__ to __le__ to __gt__ to __ge__
+    for opname, opfunc in _convert[root]:
+        if opname not in roots:
+            opfunc.__name__ = opname
+            setattr(cls, opname, opfunc)
+    return cls
+
+
+################################################################################
+### cmp_to_key() function converter
+################################################################################
+
+def cmp_to_key(mycmp):
+    """Convert a cmp= function into a key= function"""
+    class K(object):
+        __slots__ = ['obj']
+        def __init__(self, obj):
+            self.obj = obj
+        def __lt__(self, other):
+            return mycmp(self.obj, other.obj) < 0
+        def __gt__(self, other):
+            return mycmp(self.obj, other.obj) > 0
+        def __eq__(self, other):
+            return mycmp(self.obj, other.obj) == 0
+        def __le__(self, other):
+            return mycmp(self.obj, other.obj) <= 0
+        def __ge__(self, other):
+            return mycmp(self.obj, other.obj) >= 0
+        __hash__ = None
+    return K
+
+try:
+    from _functools import cmp_to_key
+except ImportError:
+    pass
+
+
+################################################################################
+### reduce() sequence to a single item
+################################################################################
+
+_initial_missing = object()
+
+def reduce(function, sequence, initial=_initial_missing):
+    """
+    reduce(function, sequence[, initial]) -> value
+
+    Apply a function of two arguments cumulatively to the items of a sequence,
+    from left to right, so as to reduce the sequence to a single value.
+    For example, reduce(lambda x, y: x+y, [1, 2, 3, 4, 5]) calculates
+    ((((1+2)+3)+4)+5).  If initial is present, it is placed before the items
+    of the sequence in the calculation, and serves as a default when the
+    sequence is empty.
+    """
+
+    it = iter(sequence)
+
+    if initial is _initial_missing:
+        try:
+            value = next(it)
+        except StopIteration:
+            raise TypeError("reduce() of empty sequence with no initial value") from None
+    else:
+        value = initial
+
+    for element in it:
+        value = function(value, element)
+
+    return value
+
+try:
+    from _functools import reduce
+except ImportError:
+    pass
+
+
+################################################################################
+### partial() argument application
+################################################################################
+
+# Purely functional, no descriptor behaviour
+
+class partial:
+    """New function with partial application of the given arguments
+    and keywords.
+    """
+
+    __slots__ = "func", "args", "keywords", "__dict__", "__weakref__"
+
+    def __new__(cls, func, /, *args, **keywords):
+        if not callable(func):
+            raise TypeError("the first argument must be callable")
+
+        if hasattr(func, "func"):
+            args = func.args + args
+            keywords = {**func.keywords, **keywords}
+            func = func.func
+
+        self = super(partial, cls).__new__(cls)
+
+        self.func = func
+        self.args = args
+        self.keywords = keywords
+        return self
+
+    def __call__(self, /, *args, **keywords):
+        keywords = {**self.keywords, **keywords}
+        return self.func(*self.args, *args, **keywords)
+
+    @recursive_repr()
+    def __repr__(self):
+        qualname = type(self).__qualname__
+        args = [repr(self.func)]
+        args.extend(repr(x) for x in self.args)
+        args.extend(f"{k}={v!r}" for (k, v) in self.keywords.items())
+        if type(self).__module__ == "functools":
+            return f"functools.{qualname}({', '.join(args)})"
+        return f"{qualname}({', '.join(args)})"
+
+    def __reduce__(self):
+        return type(self), (self.func,), (self.func, self.args,
+                                          self.keywords or None, self.__dict__ or None)
+
+    def __setstate__(self, state):
+        if not isinstance(state, tuple):
+            raise TypeError("argument to __setstate__ must be a tuple")
+        if len(state) != 4:
+            raise TypeError(f"expected 4 items in state, got {len(state)}")
+        func, args, kwds, namespace = state
+        if (not callable(func) or not isinstance(args, tuple) or
+                (kwds is not None and not isinstance(kwds, dict)) or
+                (namespace is not None and not isinstance(namespace, dict))):
+            raise TypeError("invalid partial state")
+
+        args = tuple(args)  # just in case it's a subclass
+        if kwds is None:
+            kwds = {}
+        elif type(kwds) is not dict:  # XXX does it need to be *exactly* dict?
+            kwds = dict(kwds)
+        if namespace is None:
+            namespace = {}
+
+        self.__dict__ = namespace
+        self.func = func
+        self.args = args
+        self.keywords = kwds
+ +try: + from _functools import partial +except ImportError: + pass + +# Descriptor version +class partialmethod(object): + """Method descriptor with partial application of the given arguments + and keywords. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(*args, **keywords): + if len(args) >= 2: + self, func, *args = args + elif not args: + raise TypeError("descriptor '__init__' of partialmethod " + "needs an argument") + elif 'func' in keywords: + func = keywords.pop('func') + self, *args = args + import warnings + warnings.warn("Passing 'func' as keyword argument is deprecated", + DeprecationWarning, stacklevel=2) + else: + raise TypeError("type 'partialmethod' takes at least one argument, " + "got %d" % (len(args)-1)) + args = tuple(args) + + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError("{!r} is not callable or a descriptor" + .format(func)) + + # func could be a descriptor like classmethod which isn't callable, + # so we can't inherit from partial (it verifies func is callable) + if isinstance(func, partialmethod): + # flattening is mandatory in order to place cls/self before all + # other arguments + # it's also more efficient since only one function will be called + self.func = func.func + self.args = func.args + args + self.keywords = {**func.keywords, **keywords} + else: + self.func = func + self.args = args + self.keywords = keywords + __init__.__text_signature__ = '($self, func, /, *args, **keywords)' + + def __repr__(self): + args = ", ".join(map(repr, self.args)) + keywords = ", ".join("{}={!r}".format(k, v) + for k, v in self.keywords.items()) + format_string = "{module}.{cls}({func}, {args}, {keywords})" + return format_string.format(module=self.__class__.__module__, + cls=self.__class__.__qualname__, + func=self.func, + args=args, + keywords=keywords) + + def _make_unbound_method(self): + def _method(cls_or_self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(cls_or_self, *self.args, *args, **keywords) + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method._partialmethod = self + return _method + + def __get__(self, obj, cls=None): + get = getattr(self.func, "__get__", None) + result = None + if get is not None: + new_func = get(obj, cls) + if new_func is not self.func: + # Assume __get__ returning something new indicates the + # creation of an appropriate callable + result = partial(new_func, *self.args, **self.keywords) + try: + result.__self__ = new_func.__self__ + except AttributeError: + pass + if result is None: + # If the underlying descriptor didn't do anything, treat this + # like an instance method + result = self._make_unbound_method().__get__(obj, cls) + return result + + @property + def __isabstractmethod__(self): + return getattr(self.func, "__isabstractmethod__", False) + +# Helper functions + +def _unwrap_partial(func): + while isinstance(func, partial): + func = func.func + return func + +################################################################################ +### LRU Cache function decorator +################################################################################ + +_CacheInfo = namedtuple("CacheInfo", ["hits", "misses", "maxsize", "currsize"]) + +class _HashedSeq(list): + """ This class guarantees that hash() will be called no more than once + per element. This is important because the lru_cache() will hash + the key multiple times on a cache miss. 
+ + """ + + __slots__ = 'hashvalue' + + def __init__(self, tup, hash=hash): + self[:] = tup + self.hashvalue = hash(tup) + + def __hash__(self): + return self.hashvalue + +def _make_key(args, kwds, typed, + kwd_mark = (object(),), + fasttypes = {int, str}, + tuple=tuple, type=type, len=len): + """Make a cache key from optionally typed positional and keyword arguments + + The key is constructed in a way that is flat as possible rather than + as a nested structure that would take more memory. + + If there is only a single argument and its data type is known to cache + its hash value, then that argument is returned without a wrapper. This + saves space and improves lookup speed. + + """ + # All of code below relies on kwds preserving the order input by the user. + # Formerly, we sorted() the kwds before looping. The new way is *much* + # faster; however, it means that f(x=1, y=2) will now be treated as a + # distinct call from f(y=2, x=1) which will be cached separately. + key = args + if kwds: + key += kwd_mark + for item in kwds.items(): + key += item + if typed: + key += tuple(type(v) for v in args) + if kwds: + key += tuple(type(v) for v in kwds.values()) + elif len(key) == 1 and type(key[0]) in fasttypes: + return key[0] + return _HashedSeq(key) + +def lru_cache(maxsize=128, typed=False): + """Least-recently-used cache decorator. + + If *maxsize* is set to None, the LRU features are disabled and the cache + can grow without bound. + + If *typed* is True, arguments of different types will be cached separately. + For example, f(3.0) and f(3) will be treated as distinct calls with + distinct results. + + Arguments to the cached function must be hashable. + + View the cache statistics named tuple (hits, misses, maxsize, currsize) + with f.cache_info(). Clear the cache and statistics with f.cache_clear(). + Access the underlying function with f.__wrapped__. + + See: http://en.wikipedia.org/wiki/Cache_replacement_policies#Least_recently_used_(LRU) + + """ + + # Users should only access the lru_cache through its public API: + # cache_info, cache_clear, and f.__wrapped__ + # The internals of the lru_cache are encapsulated for thread safety and + # to allow the implementation to change (including a possible C version). 
+ + if isinstance(maxsize, int): + # Negative maxsize is treated as 0 + if maxsize < 0: + maxsize = 0 + elif callable(maxsize) and isinstance(typed, bool): + # The user_function was passed in directly via the maxsize argument + user_function, maxsize = maxsize, 128 + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + elif maxsize is not None: + raise TypeError( + 'Expected first argument to be an integer, a callable, or None') + + def decorating_function(user_function): + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + + return decorating_function + +def _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo): + # Constants shared by all lru cache instances: + sentinel = object() # unique object used to signal cache misses + make_key = _make_key # build a key from the function arguments + PREV, NEXT, KEY, RESULT = 0, 1, 2, 3 # names for the link fields + + cache = {} + hits = misses = 0 + full = False + cache_get = cache.get # bound method to lookup a key or return None + cache_len = cache.__len__ # get cache size without calling len() + lock = RLock() # because linkedlist updates aren't threadsafe + root = [] # root of the circular doubly linked list + root[:] = [root, root, None, None] # initialize by pointing to self + + if maxsize == 0: + + def wrapper(*args, **kwds): + # No caching -- just a statistics update + nonlocal misses + misses += 1 + result = user_function(*args, **kwds) + return result + + elif maxsize is None: + + def wrapper(*args, **kwds): + # Simple caching without ordering or size limit + nonlocal hits, misses + key = make_key(args, kwds, typed) + result = cache_get(key, sentinel) + if result is not sentinel: + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + cache[key] = result + return result + + else: + + def wrapper(*args, **kwds): + # Size limited caching that tracks accesses by recency + nonlocal root, hits, misses, full + key = make_key(args, kwds, typed) + with lock: + link = cache_get(key) + if link is not None: + # Move the link to the front of the circular queue + link_prev, link_next, _key, result = link + link_prev[NEXT] = link_next + link_next[PREV] = link_prev + last = root[PREV] + last[NEXT] = root[PREV] = link + link[PREV] = last + link[NEXT] = root + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + with lock: + if key in cache: + # Getting here means that this same key was added to the + # cache while the lock was released. Since the link + # update is already done, we need only return the + # computed result and update the count of misses. + pass + elif full: + # Use the old root to store the new key and result. + oldroot = root + oldroot[KEY] = key + oldroot[RESULT] = result + # Empty the oldest link and make it the new root. + # Keep a reference to the old key and old result to + # prevent their ref counts from going to zero during the + # update. That will prevent potentially arbitrary object + # clean-up code (i.e. __del__) from running while we're + # still adjusting the links. + root = oldroot[NEXT] + oldkey = root[KEY] + oldresult = root[RESULT] + root[KEY] = root[RESULT] = None + # Now update the cache dictionary. + del cache[oldkey] + # Save the potentially reentrant cache[key] assignment + # for last, after the root and links have been put in + # a consistent state. 
+ cache[key] = oldroot + else: + # Put result in a new link at the front of the queue. + last = root[PREV] + link = [last, root, key, result] + last[NEXT] = root[PREV] = cache[key] = link + # Use the cache_len bound method instead of the len() function + # which could potentially be wrapped in an lru_cache itself. + full = (cache_len() >= maxsize) + return result + + def cache_info(): + """Report cache statistics""" + with lock: + return _CacheInfo(hits, misses, maxsize, cache_len()) + + def cache_clear(): + """Clear the cache and cache statistics""" + nonlocal hits, misses, full + with lock: + cache.clear() + root[:] = [root, root, None, None] + hits = misses = 0 + full = False + + wrapper.cache_info = cache_info + wrapper.cache_clear = cache_clear + return wrapper + +try: + from _functools import _lru_cache_wrapper +except ImportError: + pass + + +################################################################################ +### singledispatch() - single-dispatch generic function decorator +################################################################################ + +def _c3_merge(sequences): + """Merges MROs in *sequences* to a single MRO using the C3 algorithm. + + Adapted from http://www.python.org/download/releases/2.3/mro/. + + """ + result = [] + while True: + sequences = [s for s in sequences if s] # purge empty sequences + if not sequences: + return result + for s1 in sequences: # find merge candidates among seq heads + candidate = s1[0] + for s2 in sequences: + if candidate in s2[1:]: + candidate = None + break # reject the current head, it appears later + else: + break + if candidate is None: + raise RuntimeError("Inconsistent hierarchy") + result.append(candidate) + # remove the chosen candidate + for seq in sequences: + if seq[0] == candidate: + del seq[0] + +def _c3_mro(cls, abcs=None): + """Computes the method resolution order using extended C3 linearization. + + If no *abcs* are given, the algorithm works exactly like the built-in C3 + linearization used for method resolution. + + If given, *abcs* is a list of abstract base classes that should be inserted + into the resulting MRO. Unrelated ABCs are ignored and don't end up in the + result. The algorithm inserts ABCs where their functionality is introduced, + i.e. issubclass(cls, abc) returns True for the class itself but returns + False for all its direct base classes. Implicit ABCs for a given class + (either registered or inferred from the presence of a special method like + __len__) are inserted directly after the last ABC explicitly listed in the + MRO of said class. If two implicit ABCs end up next to each other in the + resulting MRO, their ordering depends on the order of types in *abcs*. + + """ + for i, base in enumerate(reversed(cls.__bases__)): + if hasattr(base, '__abstractmethods__'): + boundary = len(cls.__bases__) - i + break # Bases up to the last explicit ABC are considered first. + else: + boundary = 0 + abcs = list(abcs) if abcs else [] + explicit_bases = list(cls.__bases__[:boundary]) + abstract_bases = [] + other_bases = list(cls.__bases__[boundary:]) + for base in abcs: + if issubclass(cls, base) and not any( + issubclass(b, base) for b in cls.__bases__ + ): + # If *cls* is the class that introduces behaviour described by + # an ABC *base*, insert said ABC to its MRO. 
+ abstract_bases.append(base) + for base in abstract_bases: + abcs.remove(base) + explicit_c3_mros = [_c3_mro(base, abcs=abcs) for base in explicit_bases] + abstract_c3_mros = [_c3_mro(base, abcs=abcs) for base in abstract_bases] + other_c3_mros = [_c3_mro(base, abcs=abcs) for base in other_bases] + return _c3_merge( + [[cls]] + + explicit_c3_mros + abstract_c3_mros + other_c3_mros + + [explicit_bases] + [abstract_bases] + [other_bases] + ) + +def _compose_mro(cls, types): + """Calculates the method resolution order for a given class *cls*. + + Includes relevant abstract base classes (with their respective bases) from + the *types* iterable. Uses a modified C3 linearization algorithm. + + """ + bases = set(cls.__mro__) + # Remove entries which are already present in the __mro__ or unrelated. + def is_related(typ): + return (typ not in bases and hasattr(typ, '__mro__') + and issubclass(cls, typ)) + types = [n for n in types if is_related(n)] + # Remove entries which are strict bases of other entries (they will end up + # in the MRO anyway. + def is_strict_base(typ): + for other in types: + if typ != other and typ in other.__mro__: + return True + return False + types = [n for n in types if not is_strict_base(n)] + # Subclasses of the ABCs in *types* which are also implemented by + # *cls* can be used to stabilize ABC ordering. + type_set = set(types) + mro = [] + for typ in types: + found = [] + for sub in typ.__subclasses__(): + if sub not in bases and issubclass(cls, sub): + found.append([s for s in sub.__mro__ if s in type_set]) + if not found: + mro.append(typ) + continue + # Favor subclasses with the biggest number of useful bases + found.sort(key=len, reverse=True) + for sub in found: + for subcls in sub: + if subcls not in mro: + mro.append(subcls) + return _c3_mro(cls, abcs=mro) + +def _find_impl(cls, registry): + """Returns the best matching implementation from *registry* for type *cls*. + + Where there is no registered implementation for a specific type, its method + resolution order is used to find a more generic implementation. + + Note: if *registry* does not contain an implementation for the base + *object* type, this function may return None. + + """ + mro = _compose_mro(cls, registry.keys()) + match = None + for t in mro: + if match is not None: + # If *match* is an implicit ABC but there is another unrelated, + # equally matching implicit ABC, refuse the temptation to guess. + if (t in registry and t not in cls.__mro__ + and match not in cls.__mro__ + and not issubclass(match, t)): + raise RuntimeError("Ambiguous dispatch: {} or {}".format( + match, t)) + break + if t in registry: + match = t + return registry.get(match) + +def singledispatch(func): + """Single-dispatch generic function decorator. + + Transforms a function into a generic function, which can have different + behaviours depending upon the type of its first argument. The decorated + function acts as the default implementation, and additional + implementations can be registered using the register() attribute of the + generic function. + """ + # There are many programs that use functools without singledispatch, so we + # trade-off making singledispatch marginally slower for the benefit of + # making start-up of such applications slightly faster. 
+ import types, weakref + + registry = {} + dispatch_cache = weakref.WeakKeyDictionary() + cache_token = None + + def dispatch(cls): + """generic_func.dispatch(cls) -> <function implementation> + + Runs the dispatch algorithm to return the best available implementation + for the given *cls* registered on *generic_func*. + + """ + nonlocal cache_token + if cache_token is not None: + current_token = get_cache_token() + if cache_token != current_token: + dispatch_cache.clear() + cache_token = current_token + try: + impl = dispatch_cache[cls] + except KeyError: + try: + impl = registry[cls] + except KeyError: + impl = _find_impl(cls, registry) + dispatch_cache[cls] = impl + return impl + + def register(cls, func=None): + """generic_func.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_func*. + + """ + nonlocal cache_token + if func is None: + if isinstance(cls, type): + return lambda f: register(cls, f) + ann = getattr(cls, '__annotations__', {}) + if not ann: + raise TypeError( + f"Invalid first argument to `register()`: {cls!r}. " + f"Use either `@register(some_class)` or plain `@register` " + f"on an annotated function." + ) + func = cls + + # only import typing if annotation parsing is necessary + from typing import get_type_hints + argname, cls = next(iter(get_type_hints(func).items())) + if not isinstance(cls, type): + raise TypeError( + f"Invalid annotation for {argname!r}. " + f"{cls!r} is not a class." + ) + registry[cls] = func + if cache_token is None and hasattr(cls, '__abstractmethods__'): + cache_token = get_cache_token() + dispatch_cache.clear() + return func + + def wrapper(*args, **kw): + if not args: + raise TypeError(f'{funcname} requires at least ' + '1 positional argument') + + return dispatch(args[0].__class__)(*args, **kw) + + funcname = getattr(func, '__name__', 'singledispatch function') + registry[object] = func + wrapper.register = register + wrapper.dispatch = dispatch + wrapper.registry = types.MappingProxyType(registry) + wrapper._clear_cache = dispatch_cache.clear + update_wrapper(wrapper, func) + return wrapper + + +# Descriptor version +class singledispatchmethod: + """Single-dispatch generic method descriptor. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(self, func): + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError(f"{func!r} is not callable or a descriptor") + + self.dispatcher = singledispatch(func) + self.func = func + + def register(self, cls, method=None): + """generic_method.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_method*. 
+ """ + return self.dispatcher.register(cls, func=method) + + def __get__(self, obj, cls=None): + def _method(*args, **kwargs): + method = self.dispatcher.dispatch(args[0].__class__) + return method.__get__(obj, cls)(*args, **kwargs) + + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method.register = self.register + update_wrapper(_method, self.func) + return _method + + @property + def __isabstractmethod__(self): + return getattr(self.func, '__isabstractmethod__', False) + + +################################################################################ +### cached_property() - computed once per instance, cached as attribute +################################################################################ + +_NOT_FOUND = object() + + +class cached_property: + def __init__(self, func): + self.func = func + self.attrname = None + self.__doc__ = func.__doc__ + self.lock = RLock() + + def __set_name__(self, owner, name): + if self.attrname is None: + self.attrname = name + elif name != self.attrname: + raise TypeError( + "Cannot assign the same cached_property to two different names " + f"({self.attrname!r} and {name!r})." + ) + + def __get__(self, instance, owner=None): + if instance is None: + return self + if self.attrname is None: + raise TypeError( + "Cannot use cached_property instance without calling __set_name__ on it.") + try: + cache = instance.__dict__ + except AttributeError: # not all objects have __dict__ (e.g. class defines slots) + msg = ( + f"No '__dict__' attribute on {type(instance).__name__!r} " + f"instance to cache {self.attrname!r} property." + ) + raise TypeError(msg) from None + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + with self.lock: + # check if another thread filled cache while we awaited lock + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + val = self.func(instance) + try: + cache[self.attrname] = val + except TypeError: + msg = ( + f"The '__dict__' attribute on {type(instance).__name__!r} instance " + f"does not support item assignment for caching {self.attrname!r} property." + ) + raise TypeError(msg) from None + return val +
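The decorators implemented above follow the standard functools API. A minimal, illustrative sketch of how they are typically used (the fib and Circle names are invented for this example):

from functools import lru_cache, singledispatch, cached_property

@lru_cache(maxsize=128)
def fib(n):
    # Memoized: repeated calls with the same argument are served from the cache.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(30)
print(fib.cache_info())  # CacheInfo(hits=..., misses=..., maxsize=128, currsize=...)

@singledispatch
def describe(arg):
    return "something else"

@describe.register
def _(arg: int):
    # Selected whenever the first positional argument is an int.
    return "an integer"

class Circle:
    def __init__(self, radius):
        self.radius = radius

    @cached_property
    def area(self):
        # Computed on first access, then cached in the instance __dict__.
        return 3.141592653589793 * self.radius ** 2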
diff --git a/branch/bicounty_2035_hwy_update/_modules/index.html b/branch/bicounty_2035_hwy_update/_modules/index.html
new file mode 100644
index 0000000..19f96a2
--- /dev/null
+++ b/branch/bicounty_2035_hwy_update/_modules/index.html
Overview: module code
diff --git a/branch/bicounty_2035_hwy_update/_modules/lasso/logger/index.html b/branch/bicounty_2035_hwy_update/_modules/lasso/logger/index.html
new file mode 100644
index 0000000..1561ede
--- /dev/null
+++ b/branch/bicounty_2035_hwy_update/_modules/lasso/logger/index.html
Source code for lasso.logger

+import logging
+
+__all__ = ["WranglerLogger", "setupLogging"]
+
+
+# for all the Wrangler logging needs!
+WranglerLogger = logging.getLogger("WranglerLogger")
+
+
+
[docs]def setupLogging(infoLogFilename, debugLogFilename, logToConsole=True): + """Sets up the logger. The infoLog is terse, just gives the bare minimum of details + so the network composition will be clear later. + The debuglog is very noisy, for debugging. + + Pass none to either. + Spews it all out to console too, if logToConsole is true. + """ + # clear handlers if any exist already + WranglerLogger.handlers = [] + + # create a logger + WranglerLogger.setLevel(logging.DEBUG) + + if infoLogFilename: + infologhandler = logging.StreamHandler(open(infoLogFilename, "w")) + infologhandler.setLevel(logging.INFO) + infologhandler.setFormatter( + logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s") + ) + WranglerLogger.addHandler(infologhandler) + + if debugLogFilename: + debugloghandler = logging.StreamHandler(open(debugLogFilename, "w")) + debugloghandler.setLevel(logging.DEBUG) + debugloghandler.setFormatter( + logging.Formatter("%(asctime)s %(levelname)s %(message)s", "%Y-%m-%d %H:%M") + ) + WranglerLogger.addHandler(debugloghandler) + + if logToConsole: + consolehandler = logging.StreamHandler() + consolehandler.setLevel(logging.DEBUG) + consolehandler.setFormatter( + logging.Formatter("%(name)-12s: %(levelname)-8s %(message)s") + ) + WranglerLogger.addHandler(consolehandler)
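A short usage sketch for setupLogging (the log file names below are placeholders):

from lasso.logger import WranglerLogger, setupLogging

# Terse INFO log, verbose DEBUG log, and echo everything to the console.
setupLogging("lasso_info.log", "lasso_debug.log", logToConsole=True)

WranglerLogger.info("Starting network build")
WranglerLogger.debug("Detailed diagnostic message")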
diff --git a/branch/bicounty_2035_hwy_update/_modules/lasso/parameters/index.html b/branch/bicounty_2035_hwy_update/_modules/lasso/parameters/index.html
new file mode 100644
index 0000000..7b8be0d
--- /dev/null
+++ b/branch/bicounty_2035_hwy_update/_modules/lasso/parameters/index.html
Source code for lasso.parameters

+import os
+import pyproj
+from .logger import WranglerLogger
+
+
+from pyproj import CRS
+
+
+def get_base_dir(lasso_base_dir=os.getcwd()):
+    d = lasso_base_dir
+    for i in range(3):
+        if "metcouncil_data" in os.listdir(d):
+
+            WranglerLogger.info("Lasso base directory set as: {}".format(d))
+            return d
+        d = os.path.dirname(d)
+
+    msg = "Cannot find Lasso base directory from {}, please input using keyword in parameters: `lasso_base_dir =` ".format(
+        lasso_base_dir
+    )
+    WranglerLogger.error(msg)
+    raise (ValueError(msg))
+
+
+
[docs]class Parameters: + """A class representing all the parameters defining the networks + including time of day, categories, etc. + + Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + .. highlight:: python + + Attr: + time_period_to_time (dict): Maps time period abbreviations used in + Cube to time of days used on gtfs and highway network standard + Default: + :: + { + "EA": ("3:00", "6:00"), + "AM": ("6:00, "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + cube_time_periods (dict): Maps cube time period numbers used in + transit line files to the time period abbreviations in time_period_to_time + dictionary. + Default: + :: + {"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"} + categories (dict): Maps demand category abbreviations to a list of + network categories they are allowed to use. + Default: + :: + { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + properties_to_split (dict): Dictionary mapping variables in standard + roadway network to categories and time periods that need to be + split out in final model network to get variables like LANES_AM. + Default: + :: + { + "lanes": { + "v": "lanes", + "time_periods": self.time_periods_to_time + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_periods_to_time + }, + "use": { + "v": "use", + "time_periods": self.time_periods_to_time + }, + } + + county_shape (str): File location of shapefile defining counties. + Default: + :: + r"metcouncil_data/county/cb_2017_us_county_5m.shp" + + county_variable_shp (str): Property defining the county n ame in + the county_shape file. + Default: + :: + NAME + lanes_lookup_file (str): Lookup table of number of lanes for different data sources. + Default: + :: + r"metcouncil_data/lookups/lanes.csv" + centroid_connect_lanes (int): Number of lanes for centroid connectors. + Default: + :: + 1 + mpo_counties (list): list of county names within MPO boundary. + Default: + :: + [ + "ANOKA", + "DAKOTA", + "HENNEPIN", + "RAMSEY", + "SCOTT", + "WASHINGTON", + "CARVER", + ] + + taz_shape (str): + Default: + :: + r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp" + taz_data (str): + Default: + :: + ?? + highest_taz_number (int): highest TAZ number in order to define + centroid connectors. + Default: + :: + 3100 + + output_variables (list): list of variables to output in final model + network. 
+ Default: + :: + [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + "distance", + "roadway", + "name", + "roadway_class", + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "trn_priority_EA", + "trn_priority_AM", + "trn_priority_MD", + "trn_priority_PM", + "trn_priority_EV", + "ttime_assert_EA", + "ttime_assert_AM", + "ttime_assert_MD", + "ttime_assert_PM", + "ttime_assert_EV", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "price_sov_EA", + "price_hov2_EA", + "price_hov3_EA", + "price_truck_EA", + "price_sov_AM", + "price_hov2_AM", + "price_hov3_AM", + "price_truck_AM", + "price_sov_MD", + "price_hov2_MD", + "price_hov3_MD", + "price_truck_MD", + "price_sov_PM", + "price_hov2_PM", + "price_hov3_PM", + "price_truck_PM", + "price_sov_EV", + "price_hov2_EV", + "price_hov3_EV", + "price_truck_EV", + "roadway_class_idx", + "facility_type", + "county", + "centroidconnect", + "model_node_id", + "N", + "osm_node_id", + "bike_node", + "transit_node", + "walk_node", + "drive_node", + "geometry", + "X", + "Y", + "ML_lanes_EA", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "segment_id", + "managed", + "bus_only", + "rail_only" + ] + + osm_facility_type_dict (dict): Mapping between OSM Roadway variable + and facility type. Default: + + area_type_shape (str): Location of shapefile defining area type. + Default: + :: + r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp" + area_type_variable_shp (str): property in area_type_shape with area + type in it. + Default: + :: + "COMDES2040" + area_type_code_dict (dict): Mapping of the area_type_variable_shp to + the area type code used in the MetCouncil cube network. + Default: + :: + { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + downtown_area_type_shape (str): Location of shapefile defining downtown area type. + Default: + :: + r"metcouncil_data/area_type/downtownzones_TAZ.shp" + downtown_area_type (int): Area type integer for downtown. + Default: + :: + 5 + mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp" + mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp + associated with roadway class. Default: + :: + "ROUTE_SYS" + widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp" + widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape + associated with roadway class.Default: + :: + "RDWY_CTGY_" + mndot_count_shape (str): Shapefile of MnDOT links with a property + associated with counts. Default: + :: + r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp" + mndot_count_variable_shp (str): The property in mndot_count_shape + associated with counts. Default: + + :: + "lookups/osm_highway_facility_type_crosswalk.csv" + legacy_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from Legacy TM2 network. Default: + :: + "lookups/legacy_tm2_attributes.csv" + osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId + from OSM. Default: + :: + "lookups/osm_lanes_attributes.csv" + tam_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from TAM TM2 network. 
Default: + :: + "lookups/tam_tm2_attributes.csv" + tom_tom_attributes (str): CSV file of link attributes by + shStReferenceId from TomTom network. Default: + :: + "lookups/tomtom_attributes.csv" + sfcta_attributes (str): CSV file of link attributes by + shStReferenceId from SFCTA network. Default: + :: + "lookups/sfcta_attributes.csv" + output_epsg (int): EPSG type of geographic projection for output + shapefiles. Default: + :: + 102646 + output_link_shp (str): Output shapefile for roadway links. Default: + :: + r"tests/scratch/links.shp" + output_node_shp (str): Output shapefile for roadway nodes. Default: + :: + r"tests/scratch/nodes.shp" + output_link_csv (str): Output csv for roadway links. Default: + :: + r"tests/scratch/links.csv" + output_node_csv (str): Output csv for roadway nodes. Default: + :: + r"tests/scratch/nodes.csv" + output_link_txt (str): Output fixed format txt for roadway links. Default: + :: + r"tests/scratch/links.txt" + output_node_txt (str): Output fixed format txt for roadway nodes. Default: + :: + r"tests/scratch/nodes.txt" + output_link_header_width_txt (str): Header for txt roadway links. Default: + :: + r"tests/scratch/links_header_width.txt" + output_node_header_width_txt (str): Header for txt for roadway Nodes. Default: + :: + r"tests/scratch/nodes_header_width.txt" + output_cube_network_script (str): Cube script for importing + fixed-format roadway network. Default: + :: + r"tests/scratch/make_complete_network_from_fixed_width_file.s + + + + """ + +
[docs] def __init__(self, **kwargs): + """ + Time period and category splitting info + """ + if "time_periods_to_time" in kwargs: + self.time_periods_to_time = kwargs.get("time_periods_to_time") + else: + self.time_period_to_time = { + "EA": ("3:00", "6:00"), + "AM": ("6:00", "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + + #MTC + self.cube_time_periods = { + "1": "EA", + "2": "AM", + "3": "MD", + "4": "PM", + "5": "EV", + } + + """ + #MC + self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7} + + self.route_type_mode_dict = {0: 8, 2: 9} + + self.cube_time_periods = {"1": "AM", "2": "MD"} + self.cube_time_periods_name = {"AM": "pk", "MD": "op"} + """ + if "categories" in kwargs: + self.categories = kwargs.get("categories") + else: + self.categories = { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + + # prefix, source variable, categories + self.properties_to_split = { + "lanes": { + "v": "lanes", + "time_periods": self.time_period_to_time, + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_period_to_time, + }, + "useclass": { + "v": "useclass", + "time_periods": self.time_period_to_time, + }, + } + + """ + Details for calculating the county based on the centroid of the link. + The NAME varible should be the name of a field in shapefile. + """ + #MTC + if 'lasso_base_dir' in kwargs: + self.base_dir = get_base_dir(lasso_base_dir = kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + + if 'data_file_location' in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "mtc_data") + + #MC + if "lasso_base_dir" in kwargs: + self.base_dir = get_base_dir(lasso_base_dir=kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + """ + if "data_file_location" in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "metcouncil_data") + """ + + #-------- + if "settings_location" in kwargs: + self.settings_location = kwargs.get("settings_location") + else: + self.settings_location = os.path.join(self.base_dir, "examples", "settings") + + if "scratch_location" in kwargs: + self.scratch_location = kwargs.get("scratch_location") + else: + self.scratch_location = os.path.join(self.base_dir, "tests", "scratch") + + ### COUNTIES + + self.county_shape = os.path.join( + self.data_file_location, "county", "county.shp" + ) + self.county_variable_shp = "NAME" + + #MTC + self.county_code_dict = { + 'San Francisco':1, + 'San Mateo':2, + 'Santa Clara':3, + 'Alameda':4, + 'Contra Costa':5, + 'Solano':6, + 'Napa':7, + 'Sonoma':8, + 'Marin':9, + 'San Joaquin':11, + 'External':10, + } + + self.county_centroid_range_dict = { + 'San Francisco':range(1,100000), + 'San Mateo':range(100001,200000), + 'Santa Clara':range(200001,300000), + 'Alameda':range(300001,400000), + 'Contra Costa':range(400001,500000), + 'Solano':range(500001,600000), + 'Napa':range(600001,700000), + 'Sonoma':range(700001,800000), + 'Marin':range(800001,900000), + 'External':range(900001,1000000) + } + + self.county_node_range_dict = { + 'San Francisco':range(1000000,1500000), + 'San Mateo':range(1500000,2000000), + 'Santa Clara':range(2000000,2500000), + 'Alameda':range(2500000,3000000), + 'Contra Costa':range(3000000,3500000), + 
'Solano':range(3500000,4000000), + 'Napa':range(4000000,4500000), + 'Sonoma':range(4500000,5000000), + 'Marin':range(5000000,5500000), + } + + self.county_hov_node_range_dict = { + 'San Francisco':range(5500000,6000000), + 'San Mateo':range(6000000,6500000), + 'Santa Clara':range(6500000,7000000), + 'Alameda':range(7000000,7500000), + 'Contra Costa':range(7500000,8000000), + 'Solano':range(8000000,8500000), + 'Napa':range(8500000,9000000), + 'Sonoma':range(9000000,9500000), + 'Marin':range(9500000,10000000), + } + + self.county_link_range_dict = { + 'San Francisco':range(1,1000000), + 'San Mateo':range(1000000,2000000), + 'Santa Clara':range(2000000,3000000), + 'Alameda':range(3000000,4000000), + 'Contra Costa':range(4000000,5000000), + 'Solano':range(5000000,6000000), + 'Napa':range(6000000,7000000), + 'Sonoma':range(7000000,8000000), + 'Marin':range(8000000,9000000) + } + + #MC + """ + self.county_code_dict = { + "Anoka": 1, + "Carver": 2, + "Dakota": 3, + "Hennepin": 4, + "Ramsey": 5, + "Scott": 6, + "Washington": 7, + "external": 10, + "Chisago": 11, + "Goodhue": 12, + "Isanti": 13, + "Le Sueur": 14, + "McLeod": 15, + "Pierce": 16, + "Polk": 17, + "Rice": 18, + "Sherburne": 19, + "Sibley": 20, + "St. Croix": 21, + "Wright": 22, + } + """ + + self.mpo_counties = [ + 1, + 3, + 4, + 5, + 6, + 7, + 8, + 9 + ] + + self.taz_N_list = list(range(1, 10000)) + list(range(100001, 110000)) + list(range(200001, 210000)) + list(range(300001, 310000))\ + + list(range(400001, 410000)) + list(range(500001, 510000)) + list(range(600001, 610000)) + list(range(700001, 710000))\ + + list(range(800001, 810000)) + list(range(900001, 1000000)) + + self.maz_N_list = list(range(10001, 90000)) + list(range(110001, 190000)) + list(range(210001, 290000)) + list(range(310001, 390000))\ + + list(range(410001, 490000)) + list(range(510001, 590000)) + list(range(610001, 690000)) + list(range(710001, 790000))\ + + list(range(810001, 890000)) + + self.tap_N_list = list(range(90001, 99999)) + list(range(190001, 199999)) + list(range(290001, 299999)) + list(range(390001, 399999))\ + + list(range(490001, 499999)) + list(range(590001, 599999)) + list(range(690001, 699999)) + list(range(790001, 799999))\ + + list(range(890001, 899999)) + + self.tap_N_start = { + "San Francisco" : 90001, + "San Mateo" : 190001, + "Santa Clara" : 290001, + "Alameda" : 390001, + "Contra Costa" : 490001, + "Solano" : 590001, + "Napa" : 690001, + "Sonoma" : 790001, + "Marin" : 890001 + } + + #MTC + self.osm_facility_type_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_facility_type_crosswalk.csv" + ) + #MC + ### Lanes + self.lanes_lookup_file = os.path.join( + self.data_file_location, "lookups", "lanes.csv" + ) + + ### TAZS + + self.taz_shape = os.path.join( + self.data_file_location, "TAZ", "TAZOfficialWCurrentForecasts.shp" + ) + ###### + #MTC + self.osm_lanes_attributes = os.path.join( + self.data_file_location, "lookups", "osm_lanes_attributes.csv" + ) + + self.legacy_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "legacy_tm2_attributes.csv" + ) + + self.assignable_analysis = os.path.join( + self.data_file_location, "lookups", "assignable_analysis_links.csv" + ) + ### + ### AREA TYPE - MC + self.area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "ThriveMSP2040CommunityDesignation.shp", + ) + self.area_type_variable_shp = "COMDES2040" + # area type map from raw data to model category + + # source 
https://metrocouncil.org/Planning/Publications-And-Resources/Thrive-MSP-2040-Plan-(1)/7_ThriveMSP2040_LandUsePoliciesbyCD.aspx + # urban center + # urban + # suburban + # suburban edge + # emerging suburban edge + # rural center + # diversified rural + # rural residential + # agricultural + self.area_type_code_dict = { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + + self.downtown_area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "downtownzones_TAZ.shp", + ) + + self.downtown_area_type = int(5) + + self.centroid_connect_lanes = int(1) + + self.osm_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_asgngrp_crosswalk.csv" + ) + self.mrcc_roadway_class_shape = os.path.join( + self.data_file_location, "mrcc", "trans_mrcc_centerlines.shp" + ) + #### + ###MTC + self.tam_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "tam_tm2_attributes.csv" + ) + + self.sfcta_attributes = os.path.join( + self.data_file_location, "lookups", "sfcta_attributes.csv" + ) + + self.tomtom_attributes = os.path.join( + self.data_file_location, "lookups", "tomtom_attributes.csv" + ) + + self.pems_attributes = os.path.join( + self.data_file_location, "lookups", "pems_attributes.csv" + ) + + self.centroid_file = os.path.join( + self.data_file_location, "centroid", "centroid_node.pickle" + ) + #### + ###MC + self.widot_shst_data = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "widot.out.matched.geojson", + ) + #### + + self.centroid_connector_link_file = os.path.join( + self.data_file_location, "centroid", "cc_link.pickle" + ) + + self.centroid_connector_shape_file = os.path.join( + self.data_file_location, "centroid", "cc_shape.pickle" + ) + + self.tap_file = os.path.join( + self.data_file_location, "tap", "tap_node.pickle" + ) + + self.tap_connector_link_file = os.path.join( + self.data_file_location, "tap", "tap_link.pickle" + ) + + self.tap_connector_shape_file = os.path.join( + self.data_file_location, "tap", "tap_shape.pickle" + ) + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + ###MTC + self.log_to_net_crosswalk = os.path.join(self.settings_location, "log_to_net.csv") + + self.emme_name_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "emme_attribute_names.csv" + ) + #### + #MC + self.mndot_count_variable_shp = "AADT_mn" + + self.widot_county_shape = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "TRADAS_(counts).shp", + ) + ### + ###MTC + self.mode_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "gtfs_to_tm2_mode_crosswalk.csv" + ) + + self.veh_cap_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "transitSeatCap.csv" + ) + + self.faresystem_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "faresystem_crosswalk.txt" + ) + + # https://app.asana.com/0/12291104512575/1200287255197808/f + self.fare_2015_to_2010_deflator = 0.927 + #### + #MC + self.widot_count_variable_shp = "AADT_wi" + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + self.log_to_net_crosswalk = os.path.join( + self.settings_location, "log_to_net.csv" + ) + + self.subregion_boundary_file = os.path.join( + self.data_file_location, 'emme', 'subregion_boundary_for_active_modes.shp' + ) + + self.subregion_boundary_id_variable = 'subregion' + #### + + self.output_variables = [ + 
"model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + #MTC + 'name', + "distance", + "roadway", + #"name", + #MC + #"shape_id", + #"distance", + #"roadway", + #"name", + #"roadway_class", + #### + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "county", + "model_node_id", + "N", + "osm_node_id", + "geometry", + "X", + "Y", + "segment_id", + "managed", + "bus_only", + "rail_only", + "pnr", + #MTC + "assignable", + "cntype", + "useclass_AM", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "useclass_EA", + "transit", + "tollbooth", + "tollseg", + "ft", + "tap_drive", + "tollbooth", + "tollseg", + "farezone", + "tap_id", + #### + #MC + "bike_facility", + "mrcc_id", + "ROUTE_SYS", # mrcc functional class + #### + #bi-county + "nmt2010", + "nmt2020", + "BRT", + "has_transit" + ] + + self.output_link_shp = os.path.join(self.scratch_location, "links.shp") + self.output_node_shp = os.path.join(self.scratch_location, "nodes.shp") + self.output_link_csv = os.path.join(self.scratch_location, "links.csv") + self.output_node_csv = os.path.join(self.scratch_location, "nodes.csv") + self.output_link_txt = os.path.join(self.scratch_location, "links.txt") + self.output_node_txt = os.path.join(self.scratch_location, "nodes.txt") + self.output_link_header_width_txt = os.path.join( + self.scratch_location, "links_header_width.txt" + ) + self.output_node_header_width_txt = os.path.join( + self.scratch_location, "nodes_header_width.txt" + ) + self.output_cube_network_script = os.path.join( + self.scratch_location, "make_complete_network_from_fixed_width_file.s" + ) + self.output_dir = os.path.join(self.scratch_location) + self.output_proj = CRS("ESRI:102646") + self.output_proj4 = '+proj=lcc +lat_1=32.78333333333333 +lat_2=33.88333333333333 +lat_0=32.16666666666666 +lon_0=-116.25 +x_0=2000000 +y_0=500000.0000000002 +ellps=GRS80 +datum=NAD83 +to_meter=0.3048006096012192 +no_defs' + self.prj_file = os.path.join(self.data_file_location, 'projection', '102646.prj') + self.wkt_projection = 'PROJCS["NAD_1983_StatePlane_California_VI_FIPS_0406_Feet",GEOGCS["GCS_North_American_1983",DATUM["North_American_Datum_1983",SPHEROID["GRS_1980",6378137,298.257222101]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["False_Easting",6561666.666666666],PARAMETER["False_Northing",1640416.666666667],PARAMETER["Central_Meridian",-116.25],PARAMETER["Standard_Parallel_1",32.78333333333333],PARAMETER["Standard_Parallel_2",33.88333333333333],PARAMETER["Latitude_Of_Origin",32.16666666666666],UNIT["Foot_US",0.30480060960121924],AUTHORITY["EPSG","102646"]]' + + self.fare_matrix_output_variables = ["faresystem", "origin_farezone", "destination_farezone", "price"] + + self.zones = 6593 + """ + Create all the possible headway variable combinations based on the cube time periods setting + """ + self.time_period_properties_list = [ + p + "[" + str(t) + "]" + for p in ["HEADWAY", "FREQ"] + for t in self.cube_time_periods.keys() + ] + + self.int_col = [ + "model_link_id", + "model_node_id", + "A", + "B", + #MTC + #"county", + ### + #MC + # "lanes", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "roadway_class", + "assign_group", + #"county", + "area_type", + "trn_priority", + "AADT", + "count_AM", + "count_MD", + "count_PM", + "count_NT", + "count_daily", + "centroidconnect", + "bike_facility", + #### + "drive_access", + "walk_access", + "bike_access", + 
"truck_access", + #MTC + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "ML_lanes_EA", + ### + #MC + "drive_node", + "walk_node", + "bike_node", + "transit_node", + # "ML_lanes", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_NT", + #### + "segment_id", + "managed", + "bus_only", + "rail_only", + "transit", + ##MTC + "ft", + "assignable", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EA", + "lanes_EV", + "useclass_AM", + "useclass_EA", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "tollseg", + "tollbooth", + "farezone", + "tap_id", + #### + #bi-county + "nmt2010", + "nmt2020", + "BRT", + "has_transit" + ] + + self.float_col = [ + "distance", + "price", + "X", + "Y" + "mrcc_id", + ] + + self.float_col = ["distance", "ttime_assert", "price", "X", "Y"] + + self.string_col = [ + "osm_node_id", + "name", + "pnr", + "roadway", + "shstGeometryId", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "ROUTE_SYS", + ] + + # pnr parameters + self.pnr_node_location = os.path.join( + self.data_file_location, "lookups", "pnr_stations.csv" + ) + + self.drive_buffer = 6 + + #self.network_build_crs = CRS("EPSG:2875") + #self.project_card_crs = CRS("EPSG:4326") + #self.transformer = pyproj.Transformer.from_crs( + # self.network_build_crs, self.project_card_crs, always_xy=True + #) + + self.__dict__.update(kwargs)
+
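A minimal sketch of constructing Parameters with a few overrides. The directory paths are hypothetical; note that get_base_dir() looks for a metcouncil_data folder starting at lasso_base_dir and walking up a few parent directories, and that any keyword argument not handled explicitly is still attached to the instance by the final self.__dict__.update(kwargs):

from lasso.parameters import Parameters

parameters = Parameters(
    lasso_base_dir="/home/user/Lasso",            # hypothetical checkout location
    scratch_location="/home/user/Lasso/scratch",  # hypothetical output folder
)

print(parameters.cube_time_periods)  # {'1': 'EA', '2': 'AM', '3': 'MD', '4': 'PM', '5': 'EV'}
print(parameters.output_link_csv)    # .../scratch/links.csv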
diff --git a/branch/bicounty_2035_hwy_update/_modules/lasso/project/index.html b/branch/bicounty_2035_hwy_update/_modules/lasso/project/index.html
new file mode 100644
index 0000000..e1285f4
--- /dev/null
+++ b/branch/bicounty_2035_hwy_update/_modules/lasso/project/index.html

Source code for lasso.project

+import json
+import os
+import re
+from typing import Any, Dict, Optional, Union, List
+from csv import reader
+
+from pandas.core import base
+
+import numpy as np
+import pandas as pd
+from pandas import DataFrame
+import geopandas as gpd
+
+from network_wrangler import ProjectCard
+from network_wrangler import RoadwayNetwork
+
+from .transit import CubeTransit, StandardTransit
+from .logger import WranglerLogger
+from .parameters import Parameters
+from .roadway import ModelRoadwayNetwork
+from .util import column_name_to_parts
+
+
+
[docs]class Project(object): + """A single or set of changes to the roadway or transit system. + + Compares a base and a build transit network or a base and build + highway network and produces project cards. + + .. highlight:: python + + Typical usage example: + :: + test_project = Project.create_project( + base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + test_project.evaluate_changes() + test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + + Attributes: + DEFAULT_PROJECT_NAME: a class-level constant that defines what + the project name will be if none is set. + STATIC_VALUES: a class-level constant which defines values that + are not evaluated when assessing changes. + card_data (dict): {"project": <project_name>, "changes": <list of change dicts>} + roadway_link_changes (DataFrame): pandas dataframe of CUBE roadway link changes. + roadway_node_changes (DataFrame): pandas dataframe of CUBE roadway node changes. + transit_changes (CubeTransit): + base_roadway_network (RoadwayNetwork): + base_cube_transit_network (CubeTransit): + build_cube_transit_network (CubeTransit): + project_name (str): name of the project, set to DEFAULT_PROJECT_NAME if not provided + parameters: an instance of the Parameters class which sets a bunch of parameters + """ + + DEFAULT_PROJECT_NAME = "USER TO define" + + STATIC_VALUES = [ + "model_link_id", + "area_type", + "county", + # "assign_group", + "centroidconnect", + ] + CALCULATED_VALUES = [ + "area_type", + "county", + "assign_group", + "centroidconnect", + ] + +
[docs] def __init__( + self, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[DataFrame] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = "", + evaluate: Optional[bool] = False, + parameters: Union[dict, Parameters] = {}, + ): + """ + ProjectCard constructor. + + args: + roadway_link_changes: dataframe of roadway changes read from a log file + roadway_node_changes: dataframe of roadway changes read from a log file + transit_changes: dataframe of transit changes read from a log file + base_roadway_network: RoadwayNetwork instance for base case + base_transit_network: StandardTransit instance for base case + base_cube_transit_network: CubeTransit instance for base transit network + build_cube_transit_network: CubeTransit instance for build transit network + project_name: name of the project + evaluate: defaults to false, but if true, will create card data + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + returns: instance of ProjectCard + """ + self.card_data = Dict[str, Dict[str, Any]] + + self.roadway_link_changes = roadway_link_changes + self.roadway_node_changes = roadway_node_changes + self.base_roadway_network = base_roadway_network + self.base_transit_network = base_transit_network + self.base_cube_transit_network = base_cube_transit_network + self.build_cube_transit_network = build_cube_transit_network + self.transit_changes = transit_changes + self.project_name = ( + project_name if project_name else Project.DEFAULT_PROJECT_NAME + ) + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + if base_roadway_network != None: + self.determine_roadway_network_changes_compatibility( + self.base_roadway_network, + self.roadway_link_changes, + self.roadway_node_changes, + self.parameters + ) + + if evaluate: + self.evaluate_changes()
+ +
[docs] def write_project_card(self, filename: str = None): + """ + Writes project cards. + + Args: + filename (str): File path to output .yml + + Returns: + None + """ + ProjectCard(self.card_data).write(out_filename=filename)
+ +
[docs] @staticmethod + def create_project( + roadway_log_file: Union[str, List[str], None] = None, + roadway_shp_file: Optional[str] = None, + roadway_csv_file: Optional[str] = None, + network_build_file: Optional[str] = None, + emme_node_id_crosswalk_file: Optional[str] = None, + emme_name_crosswalk_file: Optional[str] = None, + base_roadway_dir: Optional[str] = None, + base_transit_dir: Optional[str] = None, + base_cube_transit_source: Optional[str] = None, + build_cube_transit_source: Optional[str] = None, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[CubeTransit] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = None, + recalculate_calculated_variables: Optional[bool] = False, + recalculate_distance: Optional[bool] = False, + parameters: Optional[dict] = {}, + **kwargs, + ): + """ + Constructor for a Project instance. + + Args: + roadway_log_file: File path to consuming logfile or a list of logfile paths. + roadway_shp_file: File path to consuming shape file for roadway changes. + roadway_csv_file: File path to consuming csv file for roadway changes. + network_build_file: File path to consuming EMME network build for network changes. + base_roadway_dir: Folder path to base roadway network. + base_transit_dir: Folder path to base transit network. + base_cube_transit_source: Folder path to base transit network or cube line file string. + base_cube_transit_file: File path to base transit network. + build_cube_transit_source: Folder path to build transit network or cube line file string. + build_cube_transit_file: File path to build transit network. + roadway_link_changes: pandas dataframe of CUBE roadway link changes. + roadway_node_changes: pandas dataframe of CUBE roadway node changes. + transit_changes: build transit changes. + base_roadway_network: Base roadway network object. + base_cube_transit_network: Base cube transit network object. + build_cube_transit_network: Build cube transit network object. + project_name: If not provided, will default to the roadway_log_file filename if + provided (or the first filename if a list is provided) + recalculate_calculated_variables: if reading in a base network, if this is true it + will recalculate variables such as area type, etc. This only needs to be true + if you are creating project cards that are changing the calculated variables. + recalculate_distance: recalculate the distance variable. This only needs to be + true if you are creating project cards that change the distance. + parameters: dictionary of parameters + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in + the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables + in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. 
+ managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + + Returns: + A Project instance. + """ + + if base_cube_transit_source and base_cube_transit_network: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_cube_transit_network' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_cube_transit_source: + base_cube_transit_network = CubeTransit.create_from_cube(base_cube_transit_source, parameters) + WranglerLogger.debug( + "Base network has {} lines".format(len(base_cube_transit_network.lines)) + ) + if len(base_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Base network lines: {}".format( + "\n - ".join(base_cube_transit_network.lines) + ) + ) + elif base_cube_transit_network: + pass + else: + msg = "No base cube transit network." + WranglerLogger.info(msg) + base_cube_transit_network = None + + if build_cube_transit_source and transit_changes: + msg = "Method takes only one of 'build_cube_transit_source' and 'transit_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if build_cube_transit_source: + WranglerLogger.debug("build") + build_cube_transit_network = CubeTransit.create_from_cube(build_cube_transit_source, parameters) + WranglerLogger.debug( + "Build network has {} lines".format(len(build_cube_transit_network.lines)) + ) + if len(build_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Build network lines: {}".format( + "\n - ".join(build_cube_transit_network.lines) + ) + ) + elif transit_changes: + pass + else: + msg = "No cube transit changes given or processed." 
+ WranglerLogger.info(msg) + transit_changes = None + + if roadway_log_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_log_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_csv_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_csv_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_shp_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_shp_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and not project_name: + if type(roadway_log_file) == list: + project_name = os.path.splitext(os.path.basename(roadway_log_file[0]))[ + 0 + ] + WranglerLogger.info( + "No Project Name - Using name of first log file in list" + ) + else: + project_name = os.path.splitext(os.path.basename(roadway_log_file))[0] + WranglerLogger.info("No Project Name - Using name of log file") + if network_build_file and not project_name: + if type(network_build_file) == list: + with open(network_build_file[0]) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info( + "No Project Name - Using metadata of first network build file in list" + ) + else: + with open(network_build_file) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info("No Project Name - Using metadata of network build file") + if roadway_log_file: + roadway_link_changes, roadway_node_changes = Project.read_logfile(roadway_log_file) + elif roadway_shp_file: + roadway_changes = gpd.read_file(roadway_shp_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_link_changes = DataFrame(roadway_link_changes.drop("geometry", axis=1)) + roadway_node_changes = DataFrame(roadway_node_changes.drop("geometry", axis=1)) + roadway_node_changes["model_node_id"] = 0 + elif roadway_csv_file: + roadway_changes = pd.read_csv(roadway_csv_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_node_changes["model_node_id"] = 0 + elif network_build_file: + roadway_link_changes, roadway_node_changes, transit_changes = Project.read_network_build_file(network_build_file) + if emme_node_id_crosswalk_file: + # get wrangler IDs from emme element_id + roadway_link_changes, roadway_node_changes, transit_changes = Project.emme_id_to_wrangler_id( + 
roadway_link_changes, + roadway_node_changes, + transit_changes, + emme_node_id_crosswalk_file + ) + else: + msg = "User needs to specify emme node id crosswalk file using emme_node_id_crosswalk_file = " + WranglerLogger.error(msg) + raise ValueError(msg) + # rename emme attributes to wrangler attributes + if emme_name_crosswalk_file is None: + emme_name_crosswalk_file = parameters.emme_name_crosswalk_file + roadway_link_changes, roadway_node_changes = Project.emme_name_to_wrangler_name( + roadway_link_changes, + roadway_node_changes, + emme_name_crosswalk_file + ) + elif roadway_link_changes: + pass + elif roadway_node_changes: + pass + else: + msg = "No roadway changes given or processed." + WranglerLogger.info(msg) + roadway_link_changes = pd.DataFrame({}) + roadway_node_changes = pd.DataFrame({}) + + if base_roadway_network and base_roadway_dir: + msg = "Method takes only one of 'base_roadway_network' and 'base_roadway_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_roadway_dir: + base_roadway_network = ModelRoadwayNetwork.read( + os.path.join(base_roadway_dir, "link.json"), + os.path.join(base_roadway_dir, "node.geojson"), + os.path.join(base_roadway_dir, "shape.geojson"), + fast=True, + recalculate_calculated_variables=recalculate_calculated_variables, + recalculate_distance=recalculate_distance, + parameters=parameters, + **kwargs, + ) + base_roadway_network.split_properties_by_time_period_and_category() + elif base_roadway_network: + base_roadway_network.split_properties_by_time_period_and_category() + else: + msg = "No base roadway network." + WranglerLogger.info(msg) + base_roadway_network = None + + if base_cube_transit_source and base_transit_dir: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_transit_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_transit_dir: + base_transit_network = StandardTransit.read_gtfs( + gtfs_feed_dir=base_transit_dir, + parameters=parameters + ) + elif base_transit_network: + base_transit_network = base_transit_network + else: + msg = "No base transit network." + WranglerLogger.info(msg) + base_transit_network = None + + project = Project( + roadway_link_changes=roadway_link_changes, + roadway_node_changes=roadway_node_changes, + transit_changes=transit_changes, + base_roadway_network=base_roadway_network, + base_transit_network=base_transit_network, + base_cube_transit_network=base_cube_transit_network, + build_cube_transit_network=build_cube_transit_network, + evaluate=True, + project_name=project_name, + parameters=parameters, + ) + + return project
+ +
[docs] @staticmethod + def read_logfile(logfilename: Union[str, List[str]]): + """ + Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + logfilename (str or list[str]): File path to CUBE logfile or list of logfile paths. + + Returns: + A DataFrame reprsentation of the log file. + """ + if type(logfilename) == str: + logfilename = [logfilename] + + link_df = pd.DataFrame() + node_df = pd.DataFrame() + + for file in logfilename: + WranglerLogger.info("Reading logfile: {}".format(file)) + with open(file) as f: + _content = f.readlines() + + _node_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("N") + ] + WranglerLogger.debug("node lines: {}".format(_node_lines)) + _link_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("L") + ] + WranglerLogger.debug("link lines: {}".format(_link_lines)) + + _nodecol = ["OBJECT", "OPERATION", "GROUP"] + _node_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Node Cols: {}".format(_nodecol)) + _linkcol = ["OBJECT", "OPERATION", "GROUP"] + _link_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Link Cols: {}".format(_linkcol)) + + def split_log(x): + return list(reader([x], delimiter=',', quotechar='"'))[0] + + _node_df = pd.DataFrame([split_log(x) for x in _node_lines[1:]],columns = _nodecol) + WranglerLogger.debug("Node DF: {}".format(_node_df)) + _link_df = pd.DataFrame([split_log(x) for x in _link_lines[1:]],columns = _linkcol) + WranglerLogger.debug("Link DF: {}".format(_link_df)) + + node_df = pd.concat([node_df, _node_df]) + link_df = pd.concat([link_df, _link_df]) + + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + link_df.columns = [c.split("[")[0] for c in link_df.columns] + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + node_df.columns = [c.split("[")[0] for c in node_df.columns] + + if len(link_df) > 0: + # create operation history + action_history_df = ( + link_df.groupby(['A', 'B'])["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + link_df = pd.merge(link_df, action_history_df, on=['A', 'B'], how="left") + + if len(node_df) > 0: + action_history_df = ( + node_df.groupby('N')["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + node_df = pd.merge(node_df, action_history_df, on='N', how="left") + + WranglerLogger.info( + "Processed {} Node lines and {} Link lines".format( + node_df.shape[0], link_df.shape[0] + ) + ) + + return link_df, node_df
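The static method above can also be used on its own; a small sketch (log path hypothetical):

    link_df, node_df = Project.read_logfile("build/highway_changes.log")  # hypothetical path
    link_df["operation_final"].value_counts()   # distribution of final A/C/D/N operations per A-B pair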
+ +
[docs] @staticmethod + def read_network_build_file(networkbuildfilename: Union[str, List[str]]): + """ + Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + networkbuildfilename (str or list[str]): File path to emme nework build file or list of network build file paths. + + Returns: + A DataFrame representation of the network build file + """ + if type(networkbuildfilename) == str: + networkbuildfilename = [networkbuildfilename] + + _link_command_history_df = DataFrame() + _node_command_history_df = DataFrame() + _transit_command_history_df = DataFrame() + + for file in networkbuildfilename: + WranglerLogger.info("Reading network build file: {}".format(file)) + with open(file) as f: + _content = json.load(f) + + _command_history = _content.get('command_history') + + # loop through all the commands + for command in _command_history: + if command.get('command') == 'set_attribute': + element_id = command.get('parameters').get('element_ids') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + _command_df[command.get('parameters').get('attribute_name')] = command.get('parameters').get('value') + + if command.get('command') in ['create_link', 'create_node']: + if command.get('command') == 'create_link': + element_id = command.get('results').get('changes').get('added').get('LINK') + if command.get('command') == 'create_node': + element_id = command.get('results').get('changes').get('added').get('NODE') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + for attribute_name, attribute_value in command.get('parameters').get('attributes').items(): + _command_df[attribute_name] = attribute_value + + if command.get('command') == 'delete_link': + element_id = command.get('results').get('changes').get('removed').get('LINK') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + if command.get('command') == 'modify_transit_line': + element_id = command.get('parameters').get('line_id') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : pd.Series(element_id), + 'object' : pd.Series(object), + 'operation' : pd.Series(operation) + } + ) + + _command_df['new_itinerary'] = [command.get('parameters').get('new_itinerary')] + + if ('L' in _command_df['object'].unique()): + _link_command_history_df = _link_command_history_df.append( + _command_df[_command_df['object'] == 'L'], + sort = False, + ignore_index = True + ) + + if ('N' in _command_df['object'].unique()): + _node_command_history_df = _node_command_history_df.append( + _command_df[_command_df['object'] == 'N'], + sort = False, + ignore_index = True + ) + + if ( + ('TRANSIT_LINE' in _command_df['object'].unique()) | + ('TRANSIT_STOP' in _command_df['object'].unique()) | + ('TRANSIT_SHAPE' in 
_command_df['object'].unique()) + ): + _transit_command_history_df = _transit_command_history_df.append( + _command_df[_command_df['object'].isin(['TRANSIT_LINE', 'TRANSIT_STOP', 'TRANSIT_SHAPE'])], + sort = False, + ignore_index = True + ) + + if len(_link_command_history_df) > 0: + # create operation history + link_action_history_df = ( + _link_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + link_action_history_df["operation_final"] = link_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + # get the last none null value for each element + # consolidate elements to single record + def get_last_valid(series): + if len(series.dropna()) > 0: + return series.dropna().iloc[-1] + else: + return np.nan + + #_command_history_df = _command_history_df.groupby(['element_id']).apply(get_last_valid).reset_index() + _link_command_history_df = _link_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _link_command_history_df = pd.merge(_link_command_history_df, link_action_history_df, on='element_id', how="left") + + if len(_node_command_history_df) > 0: + # create node operation history + node_action_history_df = ( + _node_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + node_action_history_df["operation_final"] = node_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + _node_command_history_df = _node_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _node_command_history_df = pd.merge(_node_command_history_df, node_action_history_df, on='element_id', how="left") + + WranglerLogger.info( + "Processed {} link element commands, {} node element commands".format( + _link_command_history_df.shape[0], + _node_command_history_df.shape[0] + ) + ) + + return _link_command_history_df, _node_command_history_df, _transit_command_history_df
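A sketch of calling this reader directly (file path hypothetical):

    link_df, node_df, transit_df = Project.read_network_build_file(
        "build/emme_network_build.json"   # hypothetical Emme network build file
    )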
+ +
[docs] @staticmethod + def emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file): + """ + rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder + """ + WranglerLogger.info('Reading emme node id crosswalk file from {}'.format(emme_node_id_crosswalk_file)) + emme_node_id_crosswalk_df = pd.read_csv(emme_node_id_crosswalk_file) + emme_node_id_dict = dict(zip(emme_node_id_crosswalk_df['emme_node_id'], emme_node_id_crosswalk_df['model_node_id'])) + + # get node changes + if len(emme_node_change_df) > 0: + emme_node_change_df['emme_id'] = emme_node_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + + # get new emme nodes + new_emme_node_id_list = [ + n for n in emme_node_change_df['emme_id'].to_list() if n not in emme_node_id_crosswalk_df['emme_node_id'].to_list() + ] + WranglerLogger.info('New emme node id list {}'.format(new_emme_node_id_list)) + new_wrangler_node = emme_node_id_crosswalk_df['model_node_id'].max() + + # add crosswalk for new emme nodes + for new_emme_node in new_emme_node_id_list: + if new_emme_node in emme_node_id_dict.keys(): + msg = "new node id {} has already been added to the crosswalk".format(new_emme_node) + WranglerLogger.error(msg) + raise ValueError(msg) + else: + new_wrangler_node = new_wrangler_node + 1 + emme_node_id_dict.update({new_emme_node : new_wrangler_node}) + new_emme_node_id_crosswalk_df = pd.DataFrame(emme_node_id_dict.items(), columns=['emme_node_id', 'model_node_id']) + new_emme_node_id_crosswalk_df.to_csv(emme_node_id_crosswalk_file, index=False) + + # for nodes update model_node_id + emme_node_change_df['model_node_id'] = emme_node_change_df['emme_id'].map(emme_node_id_dict).fillna(0) + + if len(emme_link_change_df) > 0: + emme_link_change_df['A'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + emme_link_change_df['B'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[-1])) + # for links update A,B nodes + emme_link_change_df['A'] = emme_link_change_df['A'].map(emme_node_id_dict) + emme_link_change_df['B'] = emme_link_change_df['B'].map(emme_node_id_dict) + + if len(emme_transit_changes_df) > 0: + emme_transit_changes_df['i_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-3] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + emme_transit_changes_df['j_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-2] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + # update i,j nodes + emme_transit_changes_df['i_node'] = emme_transit_changes_df[ + 'i_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + emme_transit_changes_df['j_node'] = emme_transit_changes_df[ + 'j_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + + # update routing nodes + emme_transit_changes_df['new_itinerary'] = emme_transit_changes_df.apply( + lambda x: [emme_node_id_dict.get(n) for n in x['new_itinerary']] if x['object'] == 'TRANSIT_SHAPE' else 0, + axis = 1 + ) + + return emme_link_change_df, emme_node_change_df, emme_transit_changes_df
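Continuing the sketch above, the Emme element ids can then be translated to Wrangler ids (crosswalk path hypothetical; the file is expected to carry the emme_node_id and model_node_id columns used above):

    link_df, node_df, transit_df = Project.emme_id_to_wrangler_id(
        link_df,
        node_df,
        transit_df,
        emme_node_id_crosswalk_file="database/emme_node_id_crosswalk.csv",  # hypothetical crosswalk path
    )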
+ +
[docs] @staticmethod + def get_object_from_network_build_command(row): + """ + Determine whether a network build command refers to a node, a link, or a transit element. + + Args: + row: a single command dictionary from the network build file's command history + + Returns: + 'N' for node, 'L' for link, or 'TRANSIT_LINE' / 'TRANSIT_STOP' / 'TRANSIT_SHAPE' for transit elements + """ + + if row.get('command') == 'create_link': + return 'L' + + if row.get('command') == 'create_node': + return 'N' + + if row.get('command') == 'delete_link': + return 'L' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'L' + if row.get('parameters').get('element_type') == 'NODE': + return 'N' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'TRANSIT_LINE' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'TRANSIT_STOP' + + if row.get('command') == 'modify_transit_line': + return 'TRANSIT_SHAPE'
+ +
[docs] @staticmethod + def get_operation_from_network_build_command(row): + """ + Determine the action type (operation) of a network build command. + + Args: + row: a single command dictionary from the network build file's command history + + Returns: + 'A' for add, 'C' for change, 'D' for delete + """ + + if row.get('command') == 'create_link': + return 'A' + + if row.get('command') == 'create_node': + return 'A' + + if row.get('command') == 'delete_link': + return 'D' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'C' + if row.get('parameters').get('element_type') == 'NODE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'C' + + if row.get('command') == 'modify_transit_line': + return 'C'
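A small illustration of the two helpers above on a hand-built 'set_attribute' command; the dictionary layout mirrors how commands are read above, and the attribute name and element id are hypothetical:

    command = {
        "command": "set_attribute",
        "parameters": {
            "element_type": "LINK",
            "element_ids": ["1001-1002"],   # hypothetical Emme A-B element id
            "attribute_name": "@lanes",      # hypothetical attribute
            "value": 3,
        },
    }
    Project.get_object_from_network_build_command(command)     # returns 'L'
    Project.get_operation_from_network_build_command(command)  # returns 'C'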
+ +
[docs] @staticmethod + def emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file): + """ + rename emme names to wrangler names using crosswalk file + """ + + WranglerLogger.info('Reading emme attribute name crosswalk file {}'.format(emme_name_crosswalk_file)) + emme_name_crosswalk_df = pd.read_csv(emme_name_crosswalk_file) + emme_name_crosswalk_dict = dict(zip(emme_name_crosswalk_df['emme_name'], emme_name_crosswalk_df['wrangler_name'])) + + # drop columns we don't need from emme to avoid confusion + ignore_columns = [ + c for c in emme_link_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'A', 'B'] + ] + WranglerLogger.info('Ignoring link changes in {}'.format(ignore_columns)) + emme_link_change_df = emme_link_change_df.drop(ignore_columns, axis = 1) + + ignore_columns = [ + c for c in emme_node_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'model_node_id'] + ] + WranglerLogger.info('Ignoring node changes in {}'.format(ignore_columns)) + emme_node_change_df = emme_node_change_df.drop(ignore_columns, axis = 1) + + # rename emme name to wrangler name + emme_link_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + emme_node_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + + return emme_link_change_df, emme_node_change_df
+ +
[docs] @staticmethod + def determine_roadway_network_changes_compatibility( + base_roadway_network: ModelRoadwayNetwork, + roadway_link_changes: DataFrame, + roadway_node_changes: DataFrame, + parameters: Parameters, + ): + """ + Checks to see that any links or nodes that change exist in base roadway network. + """ + WranglerLogger.info( + "Evaluating compatibility between roadway network changes and base network. Not evaluating deletions." + ) + + # CUBE log file saves all variable names in upper cases, need to convert them to be same as network + log_to_net_df = pd.read_csv(parameters.log_to_net_crosswalk) + log_to_net_dict = dict(zip(log_to_net_df["log"], log_to_net_df["net"])) + + dbf_to_net_df = pd.read_csv(parameters.net_to_dbf_crosswalk) + dbf_to_net_dict = dict(zip(dbf_to_net_df["dbf"], dbf_to_net_df["net"])) + + for c in roadway_link_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_link_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_link_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_link_changes.rename(columns=dbf_to_net_dict, inplace=True) + + for c in roadway_node_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B", "X", "Y"]): + roadway_node_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_node_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_node_changes.rename(columns=dbf_to_net_dict, inplace=True) + + # for links "L" that change "C", + # find locations where there isn't a base roadway link + if len(roadway_link_changes) > 0: + link_changes_df = roadway_link_changes[ + roadway_link_changes["operation_final"] == "C" + ].copy() + + link_merge_df = pd.merge( + link_changes_df[["A", "B"]].astype(str), + base_roadway_network.links_df[["A", "B", "model_link_id"]].astype(str), + how="left", + on=["A", "B"], + ) + + missing_links = link_merge_df.loc[link_merge_df["model_link_id"].isna()] + + if missing_links.shape[0]: + msg = "Network missing the following AB links:\n{}".format(missing_links) + WranglerLogger.error(msg) + raise ValueError(msg) + + # for links "N" that change "C", + # find locations where there isn't a base roadway node + if len(roadway_node_changes) > 0: + node_changes_df = roadway_node_changes[ + roadway_node_changes["operation_final"] == "C" + ].copy() + + node_merge_df = pd.merge( + node_changes_df[["model_node_id"]], + base_roadway_network.nodes_df[["model_node_id", "geometry"]], + how="left", + on=["model_node_id"], + ) + missing_nodes = node_merge_df.loc[node_merge_df["geometry"].isna()] + if missing_nodes.shape[0]: + msg = "Network missing the following nodes:\n{}".format(missing_nodes) + WranglerLogger.error(msg) + raise ValueError(msg)
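A sketch of running the check standalone; it raises a ValueError listing any changed A-B links or nodes that are missing from the base network:

    Project.determine_roadway_network_changes_compatibility(
        base_roadway_network,     # a ModelRoadwayNetwork
        roadway_link_changes,     # link change DataFrame, e.g. from Project.read_logfile
        roadway_node_changes,     # node change DataFrame
        parameters,               # a Parameters instance with the log/dbf crosswalk paths
    )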
+ +
[docs] def evaluate_changes(self): + """ + Determines which changes should be evaluated, initiates + self.card_data to be an aggregation of transit and highway changes. + """ + highway_change_list = [] + transit_change_list = [] + + WranglerLogger.info("Evaluating project changes.") + + if (not self.roadway_link_changes.empty) | (not self.roadway_node_changes.empty): + highway_change_list = self.add_highway_changes() + + if self.transit_changes is not None: + if (not self.transit_changes.empty) or ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + if ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + self.card_data = { + "project": self.project_name, + "changes": transit_change_list + highway_change_list, + }
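After evaluation, the aggregated changes live on self.card_data; a small sketch:

    project.evaluate_changes()
    project.card_data["project"]          # the project name
    len(project.card_data["changes"])     # number of transit + highway change dictionaries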
+ +
[docs] def add_transit_changes(self): + """ + Evaluates changes between base and build transit objects and + adds entries into the self.card_data dictionary. + """ + if self.build_cube_transit_network: + transit_change_list = self.build_cube_transit_network.evaluate_differences( + self.base_cube_transit_network + ) + elif self.base_transit_network: + transit_change_list = self.base_transit_network.evaluate_differences( + self.transit_changes + ) + return transit_change_list
+ + @staticmethod + def _final_op(x): + if x["operation_history"][-1] == "D": + if "A" in x["operation_history"][:-1]: + return "N" + else: + return "D" + elif x["operation_history"][-1] == "A": + if "D" in x["operation_history"][:-1]: + return "C" + else: + return "A" + else: + if "A" in x["operation_history"][:-1]: + return "A" + else: + return "C" + +
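To illustrate how _final_op collapses an operation history into a single net operation, a few hand-worked cases (it only reads the "operation_history" key, so a plain dict works for illustration):

    Project._final_op({"operation_history": ["A", "C", "D"]})  # 'N' - added then deleted, no net change
    Project._final_op({"operation_history": ["D", "A"]})       # 'C' - deleted then re-added, treated as a change
    Project._final_op({"operation_history": ["C", "C"]})       # 'C' - repeated edits stay a change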
[docs] def add_highway_changes(self, limit_variables_to_existing_network=False): + """ + Evaluates changes from the log file based on the base highway object and + adds entries into the self.card_data dictionary. + + Args: + limit_variables_to_existing_network (bool): True if no ad-hoc variables. Default to False. + """ + + for c in self.parameters.string_col: + if c in self.roadway_link_changes.columns: + self.roadway_link_changes[c] = self.roadway_link_changes[c].str.lstrip(" ") + if c in self.roadway_node_changes.columns: + self.roadway_node_changes[c] = self.roadway_node_changes[c].str.lstrip(" ") + + ## if worth it, could also add some functionality to network wrangler itself. + node_changes_df = self.roadway_node_changes.copy() + + link_changes_df = self.roadway_link_changes.copy() + + def _process_deletions(link_changes_df): + """ + create deletion section in project card + """ + WranglerLogger.debug("Processing link deletions") + + cube_delete_df = link_changes_df[link_changes_df["operation_final"] == "D"].copy() + + # make sure columns has the same type as base network + cube_delete_df['A'] = cube_delete_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_delete_df['B'] = cube_delete_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_delete_df.columns: + cube_delete_df.drop(['model_link_id'], axis = 1, inplace = True) + + cube_delete_df = pd.merge( + cube_delete_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if len(cube_delete_df) > 0: + links_to_delete = cube_delete_df["model_link_id"].tolist() + delete_link_dict = { + "category": "Roadway Deletion", + "links": {"model_link_id": links_to_delete}, + } + WranglerLogger.debug("{} Links Deleted.".format(len(links_to_delete))) + else: + delete_link_dict = None + WranglerLogger.debug("No link deletions processed") + + return delete_link_dict + + def _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ): + """""" + WranglerLogger.debug("Processing link additions") + cube_add_df = link_changes_df[link_changes_df["operation_final"] == "A"] + if len(cube_add_df) == 0: + WranglerLogger.debug("No link additions processed") + return {} + + if limit_variables_to_existing_network: + add_col = [ + c + for c in cube_add_df.columns + if c in self.base_roadway_network.links_df.columns + ] + else: + add_col = [ + c for c in cube_add_df.columns if c not in ["operation_final"] + ] + # can leave out "operation_final" from writing out, is there a reason to write it out? 
+ + for x in add_col: + cube_add_df[x] = cube_add_df[x].astype(self.base_roadway_network.links_df[x].dtype) + + add_link_properties = cube_add_df[add_col].to_dict("records") + + # WranglerLogger.debug("Add Link Properties: {}".format(add_link_properties)) + WranglerLogger.debug("{} Links Added".format(len(add_link_properties))) + + return {"category": "Add New Roadway", "links": add_link_properties} + + def _process_node_additions(node_add_df): + """""" + WranglerLogger.debug("Processing node additions") + + if len(node_add_df) == 0: + WranglerLogger.debug("No node additions processed") + return [] + + node_add_df = node_add_df.drop(["operation_final"], axis=1) + + node_add_df = node_add_df.apply(_reproject_coordinates, axis=1) + + for x in node_add_df.columns: + node_add_df[x] = node_add_df[x].astype(self.base_roadway_network.nodes_df[x].dtype) + + add_nodes_dict_list = node_add_df.to_dict( + "records" + ) + WranglerLogger.debug("{} Nodes Added".format(len(add_nodes_dict_list))) + + return add_nodes_dict_list + + def _reproject_coordinates(row): + reprojected_x, reprojected_y = self.parameters.transformer.transform(row['X'], row['Y']) + row['X'] = reprojected_x + row['Y'] = reprojected_y + return row + + def _process_single_link_change(change_row, changeable_col): + """""" + + # 1. Find associated base year network values + base_df = self.base_roadway_network.links_df[ + (self.base_roadway_network.links_df["A"] == int(change_row.A)) + & (self.base_roadway_network.links_df["B"] == int(change_row.B)) + ] + + if not base_df.shape[0]: + msg = "No match found in network for AB combination: ({},{}). Incompatible base network.".format( + change_row.A, change_row.B + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + elif base_df.shape[0] > 1: + WranglerLogger.warning( + "Found more than one match in base network for AB combination: ({},{}). Selecting first one to operate on but AB should be unique to network.".format( + change_row.A, change_row.B + ) + ) + + base_row = base_df.iloc[0] + # WranglerLogger.debug("Properties with changes: {}".format(changeable_col)) + + # 2. find columns that changed (enough) + changed_col = [] + for col in changeable_col: + WranglerLogger.debug("Assessing Column: {}".format(col)) + # if it is the same as before, or a static value, don't process as a change + if str(change_row[col]).strip('"\'') == str(base_row[col]).strip('"\''): + continue + # if it is NaN or None, don't process as a change + if (change_row[col] != change_row[col]) | (change_row[col] is None): + continue + if (col == "roadway_class") & (change_row[col] == 0): + continue + # only look at distance if it has significantly changed + if col == "distance": + if ( + abs( + (change_row[col] - float(base_row[col])) + / base_row[col].astype(float) + ) + > 0.01 + ): + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + else: + continue + else: + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + + WranglerLogger.debug( + "Properties with changes that will be processed: {}".format(changed_col) + ) + + if not changed_col: + return pd.DataFrame() + + # 3. Iterate through columns with changed values and structure the changes as expected in project card + property_dict_list = [] + processed_properties = [] + + # check if it's a manged lane change + for c in changed_col: + if c.startswith("ML_"): + # TODO ML project card skeleton + msg = "Detected managed lane changes, please create managed lane project card!" 
+ WranglerLogger.error(msg) + raise ValueError(msg) + return + + # regular roadway property change + for c in changed_col: + # WranglerLogger.debug("Processing Column: {}".format(c)) + ( + p_base_name, + p_time_period, + p_category, + managed_lane, + ) = column_name_to_parts(c, self.parameters) + + _d = { + "existing": base_row[c], + "set": change_row[c], + } + if c in Project.CALCULATED_VALUES: + _d = { + "set": change_row[c], + } + if p_time_period: + if managed_lane == 1: + _d["time"] = list( + self.parameters.time_period_to_time[p_time_period] + ) + if p_category: + _d["category"] = p_category + + # iterate through existing properties that have been changed and see if you should just add + if (p_base_name in processed_properties) & (managed_lane == 1): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + processed_p["timeofday"] += [_d] + elif (p_base_name in processed_properties) & (managed_lane == 0): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + if processed_p["set"] != change_row[c]: + msg = "Detected different changes for split-property variables on regular roadway links: " + msg += "conflicting \"{}\" values \"{}\", \"{}\"".format(p_base_name, processed_p["set"], change_row[c]) + WranglerLogger.error(msg) + raise ValueError(msg) + elif p_time_period: + if managed_lane == 1: + property_dict = {"property": p_base_name, "timeofday": [_d]} + processed_properties.append(p_base_name) + property_dict_list.append(property_dict) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + + card_df = pd.DataFrame( + { + "properties": pd.Series([property_dict_list]), + "model_link_id": pd.Series(base_row["model_link_id"]), + } + ) + + # WranglerLogger.debug('single change card_df:\n {}'.format(card_df)) + + return card_df + + def _process_link_changes(link_changes_df, changeable_col): + """""" + cube_change_df = link_changes_df[link_changes_df["operation_final"] == "C"].copy() + + # make sure columns has the same type as base network + cube_change_df['A'] = cube_change_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_change_df['B'] = cube_change_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_change_df.columns: + cube_change_df.drop('model_link_id', axis = 1, inplace = True) + + cube_change_df = pd.merge( + cube_change_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if not cube_change_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + change_link_dict_df = pd.DataFrame(columns=["properties", "model_link_id"]) + + for index, row in cube_change_df.iterrows(): + card_df = _process_single_link_change(row, changeable_col) + + change_link_dict_df = pd.concat( + [change_link_dict_df, card_df], ignore_index=True, sort=False + ) + + if not change_link_dict_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + # WranglerLogger.debug('change_link_dict_df Unaggregated:\n {}'.format(change_link_dict_df)) + + # Have to change to string so that it is a hashable type for the aggregation + change_link_dict_df["properties"] = change_link_dict_df[ + "properties" + ].astype(str) + # Group the changes that are the same + change_link_dict_df = ( + 
change_link_dict_df.groupby("properties")[["model_link_id"]] + .agg(lambda x: list(x)) + .reset_index() + ) + # WranglerLogger.debug('change_link_dict_df Aggregated:\n {}'.format(change_link_dict_df)) + + # Reformat model link id to correct "facility" format + change_link_dict_df["facility"] = change_link_dict_df.apply( + lambda x: {"link": [{"model_link_id": x.model_link_id}]}, axis=1 + ) + + # WranglerLogger.debug('change_link_dict_df 3: {}'.format(change_link_dict_df)) + change_link_dict_df["properties"] = change_link_dict_df["properties"].apply( + lambda x: json.loads( + x.replace("'\"", "'").replace("\"'", "'").replace("'", '"') + ) + ) + + change_link_dict_df["category"] = "Roadway Property Change" + + change_link_dict_list = change_link_dict_df[ + ["category", "facility", "properties"] + ].to_dict("record") + + WranglerLogger.debug( + "{} Changes Processed".format(len(change_link_dict_list)) + ) + return change_link_dict_list + + def _consolidate_actions(log, base, key_list): + log_df = log.copy() + # will be changed if to allow new variables being added/changed that are not in base network + changeable_col = [x for x in log_df.columns if x in base.columns] + #print(log_df) + #for x in changeable_col: + # print(x) + #log_df[x] = log_df[x].astype(base[x].dtype) + + if 'operation_final' not in log_df.columns: + action_history_df = ( + log_df.groupby(key_list)["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + + log_df = pd.merge(log_df, action_history_df, on=key_list, how="left") + log_df.drop_duplicates(subset=key_list, keep="last", inplace=True) + log_df["operation_final"] = log_df.apply(lambda x: Project._final_op(x), axis=1) + + return log_df[changeable_col + ["operation_final"]] + + delete_link_dict = None + add_link_dict = None + change_link_dict_list = [] + + if len(link_changes_df) != 0: + link_changes_df = _consolidate_actions( + link_changes_df, self.base_roadway_network.links_df, ["A", "B"] + ) + + # process deletions + delete_link_dict = _process_deletions(link_changes_df) + + # process additions + add_link_dict = _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ) + + # process changes + WranglerLogger.debug("Processing changes") + WranglerLogger.debug(link_changes_df) + changeable_col = list( + ( + set(link_changes_df.columns) + & set(self.base_roadway_network.links_df.columns) + ) + - set(Project.STATIC_VALUES) + ) + + cols_in_changes_not_in_net = list( + set(link_changes_df.columns) + - set(self.base_roadway_network.links_df.columns) + ) + + if cols_in_changes_not_in_net: + WranglerLogger.warning( + "The following attributes are specified in the changes but do not exist in the base network: {}".format( + cols_in_changes_not_in_net + ) + ) + + change_link_dict_list = _process_link_changes(link_changes_df, changeable_col) + + if len(node_changes_df) != 0: + node_changes_df = _consolidate_actions( + node_changes_df, self.base_roadway_network.nodes_df, ["model_node_id"] + ) + + # print error message for node change and node deletion + if ( + len(node_changes_df[node_changes_df["operation_final"].isin(["C", "D"])]) + > 0 + ): + msg = "NODE changes and deletions are not allowed!" 
+ WranglerLogger.warning(msg) + #raise ValueError(msg) + node_add_df = node_changes_df[node_changes_df["operation_final"] == "A"] + + if add_link_dict: + add_link_dict["nodes"] = _process_node_additions(node_add_df) + else: + add_link_dict = {"category": "Add New Roadway", "nodes": _process_node_additions(node_add_df)} + + else: + None + + # combine together + + highway_change_list = list( + filter(None, [delete_link_dict] + [add_link_dict] + change_link_dict_list) + ) + + return highway_change_list
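A sketch of calling the method and inspecting the resulting change dictionaries; the categories follow the helper functions above, and the actual contents depend on the log file:

    highway_change_list = project.add_highway_changes()
    [change["category"] for change in highway_change_list]
    # e.g. ['Roadway Deletion', 'Add New Roadway', 'Roadway Property Change']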
+
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_modules/lasso/roadway/index.html b/branch/bicounty_2035_hwy_update/_modules/lasso/roadway/index.html new file mode 100644 index 0000000..f276fd1 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_modules/lasso/roadway/index.html @@ -0,0 +1,2046 @@ + + + + + + lasso.roadway — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.roadway

+import copy
+import glob
+import os
+from typing import Optional, Union
+
+import geopandas as gpd
+import pandas as pd
+
+from geopandas import GeoDataFrame
+from pandas import DataFrame
+import numpy as np
+
+from network_wrangler import RoadwayNetwork
+from .parameters import Parameters
+from .logger import WranglerLogger
+
+
+
[docs]class ModelRoadwayNetwork(RoadwayNetwork): + """ + Subclass of network_wrangler class :ref:`RoadwayNetwork <network_wrangler:RoadwayNetwork>` + + A representation of the physical roadway network and its properties. + """ + + CALCULATED_VALUES = [ + "area_type", + "county", + "centroidconnect", + ] + +
[docs] def __init__( + self, + nodes: GeoDataFrame, + links: DataFrame, + shapes: GeoDataFrame, + parameters: Union[Parameters, dict] = {}, + **kwargs, + ): + """ + Constructor + + Args: + nodes: geodataframe of nodes + links: dataframe of links + shapes: geodataframe of shapes + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. + If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + """ + super().__init__(nodes, links, shapes, **kwargs) + + # will have to change if want to alter them + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.links_metcouncil_df = None + self.nodes_metcouncil_df = None + + self.fill_na() + self.convert_int()
+ # self.shapes_metcouncil_df = None + ##todo also write to file + # WranglerLogger.debug("Used PARAMS\n", '\n'.join(['{}: {}'.format(k,v) for k,v in self.parameters.__dict__.items()])) + +
[docs] @staticmethod + def read( + link_filename: str, + node_filename: str, + shape_filename: str, + fast: bool = False, + recalculate_calculated_variables=False, + recalculate_distance=False, + parameters: Union[dict, Parameters] = {}, + **kwargs, + ): + """ + Reads in links and nodes network standard. + + Args: + link_filename (str): File path to link json. + node_filename (str): File path to node geojson. + shape_filename (str): File path to link true shape geojson + fast (bool): boolean that will skip validation to speed up read time. + recalculate_calculated_variables (bool): calculates fields from spatial joins, etc. + recalculate_distance (bool): re-calculates distance. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + Returns: + ModelRoadwayNetwork + """ + + nodes_df, links_df, shapes_df = RoadwayNetwork.load_transform_network( + node_filename, + link_filename, + shape_filename, + validate_schema=not fast, + **kwargs, + ) + + m_road_net = ModelRoadwayNetwork( + nodes_df, + links_df, + shapes_df, + parameters=parameters, + **kwargs, + ) + + if recalculate_calculated_variables: + m_road_net.create_calculated_variables() + if recalculate_distance: + m_road_net.calculate_distance(overwrite=True) + + m_road_net.fill_na() + # this method is making period values as string "NaN", need to revise. + m_road_net.split_properties_by_time_period_and_category() + for c in m_road_net.links_df.columns: + m_road_net.links_df[c] = m_road_net.links_df[c].replace("NaN", np.nan) + m_road_net.convert_int() + + return m_road_net
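A minimal read sketch (file paths hypothetical; fast=True skips schema validation as described above):

    net = ModelRoadwayNetwork.read(
        link_filename="examples/stpaul/link.json",       # hypothetical paths
        node_filename="examples/stpaul/node.geojson",
        shape_filename="examples/stpaul/shape.geojson",
        fast=True,
    )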
+ +
[docs] @staticmethod + def from_RoadwayNetwork( + roadway_network_object, + parameters: Union[dict, Parameters] = {}, + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + roadway_network_object (RoadwayNetwork). + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + Returns: + ModelRoadwayNetwork + """ + + additional_params_dict = { + k: v + for k, v in roadway_network_object.__dict__.items() + if k not in ["nodes_df", "links_df", "shapes_df", "parameters"] + } + + return ModelRoadwayNetwork( + roadway_network_object.nodes_df, + roadway_network_object.links_df, + roadway_network_object.shapes_df, + parameters=parameters, + **additional_params_dict, + )
+ +
[docs] def split_properties_by_time_period_and_category(self, properties_to_split=None): + """ + Splits properties by time period, assuming a variable structure of + + Args: + properties_to_split: dict + dictionary of output variable prefix mapped to the source variable and what to stratify it by + e.g. + { + 'lanes' : {'v':'lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'ML_lanes' : {'v':'ML_lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'use' : {'v':'use', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + } + + """ + import itertools + + if properties_to_split == None: + properties_to_split = self.parameters.properties_to_split + + for out_var, params in properties_to_split.items(): + if params["v"] not in self.links_df.columns: + WranglerLogger.warning( + "Specified variable to split: {} not in network variables: {}. Returning 0.".format( + params["v"], str(self.links_df.columns) + ) + ) + if params.get("time_periods") and params.get("categories"): + + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + time_suffix + "_" + category_suffix + ] = 0 + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[out_var + "_" + time_suffix] = 0 + elif params.get("time_periods") and params.get("categories"): + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + category_suffix + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=params["categories"][category_suffix], + time_period=params["time_periods"][time_suffix], + ) + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[ + out_var + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=None, + time_period=params["time_periods"][time_suffix], + ) + else: + raise ValueError( + "Shoudn't have a category without a time period: {}".format(params) + )
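For example, a single entry keyed the way the method body reads it ("v", "time_periods") produces period-specific columns on links_df:

    net.split_properties_by_time_period_and_category(
        {"lanes": {"v": "lanes", "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}}}
    )
    # adds 'lanes_AM' and 'lanes_PM' columns to net.links_df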
+ +
[docs] def create_calculated_variables(self): + """ + Creates calculated roadway variables. + + Args: + None + """ + WranglerLogger.info("Creating calculated roadway variables.") + + #MTC + self.create_ML_variable() + #/MTC + #MC + self.calculate_area_type() + self.calculate_county() + self.calculate_mpo() + self.add_counts() + self.create_ML_variable() + self.create_hov_corridor_variable() + self.create_managed_variable()
+ #/MC + +
[docs] def calculate_county( + self, + county_shape=None, + county_shape_variable=None, + network_variable="county", + county_codes_dict=None, + overwrite=False, + ): + """ + #MC + Calculates county variable. + + This uses the centroid of the geometry field to determine which county it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + county_shape (str): The File path to county geodatabase. + county_shape_variable (str): The variable name of county in county geodadabase. + network_variable (str): The variable name of county in network standard. Default to "county". + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing County Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "County Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + county_shape = county_shape if county_shape else self.parameters.county_shape + + county_shape_variable = ( + county_shape_variable + if county_shape_variable + else self.parameters.county_variable_shp + ) + + WranglerLogger.info( + "Adding roadway network variable for county using a spatial join with: {}".format( + county_shape + ) + ) + + county_codes_dict = ( + county_codes_dict if county_codes_dict else self.parameters.county_code_dict + ) + if not county_codes_dict: + msg = "No county codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + county_gdf = gpd.read_file(county_shape) + county_gdf = county_gdf.to_crs(epsg=self.crs) + joined_gdf = gpd.sjoin(centroids_gdf, county_gdf, how="left", op="intersects") + + joined_gdf[county_shape_variable] = ( + joined_gdf[county_shape_variable] + .map(county_codes_dict) + .fillna(10) + .astype(int) + ) + + self.links_df[network_variable] = joined_gdf[county_shape_variable] + + WranglerLogger.info( + "Finished Calculating county variable: {}".format(network_variable) + )
+ +
[docs] def calculate_area_type( + self, + area_type_shape=None, + area_type_shape_variable=None, + network_variable="area_type", + area_type_codes_dict=None, + downtown_area_type_shape=None, + downtown_area_type=None, + overwrite=False, + ): + """ + #MC + Calculates area type variable. + + This uses the centroid of the geometry field to determine which area it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + area_type_shape (str): The File path to area geodatabase. + area_type_shape_variable (str): The variable name of area type in area geodadabase. + network_variable (str): The variable name of area type in network standard. Default to "area_type". + area_type_codes_dict: The dictionary to map input area_type_shape_variable to network_variable + downtown_area_type_shape: The file path to the downtown area type boundary. + downtown_area_type (int): Integer value of downtown area type + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Area Type Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Area Type Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Area Type from Spatial Data and adding as roadway network variable: {}".format( + network_variable + ) + ) + + """ + Verify inputs + """ + + area_type_shape = ( + area_type_shape if area_type_shape else self.parameters.area_type_shape + ) + + if not area_type_shape: + msg = "No area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(area_type_shape): + msg = "File not found for area type shape: {}".format(area_type_shape) + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_shape_variable = ( + area_type_shape_variable + if area_type_shape_variable + else self.parameters.area_type_variable_shp + ) + + if not area_type_shape_variable: + msg = "No area type shape varible specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_codes_dict = ( + area_type_codes_dict + if area_type_codes_dict + else self.parameters.area_type_code_dict + ) + if not area_type_codes_dict: + msg = "No area type codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type_shape = ( + downtown_area_type_shape + if downtown_area_type_shape + else self.parameters.downtown_area_type_shape + ) + + if not downtown_area_type_shape: + msg = "No downtown area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(downtown_area_type_shape): + msg = "File not found for downtown area type shape: {}".format( + downtown_area_type_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type = ( + downtown_area_type + if downtown_area_type + else self.parameters.downtown_area_type + ) + if not downtown_area_type: + msg = "No downtown area type value specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + WranglerLogger.debug("Reading Area Type Shapefile {}".format(area_type_shape)) + area_type_gdf = gpd.read_file(area_type_shape) + area_type_gdf = area_type_gdf.to_crs(epsg=self.crs) 
+ + downtown_gdf = gpd.read_file(downtown_area_type_shape) + downtown_gdf = downtown_gdf.to_crs(epsg=self.crs) + + joined_gdf = gpd.sjoin( + centroids_gdf, area_type_gdf, how="left", op="intersects" + ) + + joined_gdf[area_type_shape_variable] = ( + joined_gdf[area_type_shape_variable] + .map(area_type_codes_dict) + .fillna(1) + .astype(int) + ) + + WranglerLogger.debug("Area Type Codes Used: {}".format(area_type_codes_dict)) + + d_joined_gdf = gpd.sjoin( + centroids_gdf, downtown_gdf, how="left", op="intersects" + ) + + d_joined_gdf["downtown_area_type"] = d_joined_gdf["Id"].fillna(-99).astype(int) + + joined_gdf.loc[ + d_joined_gdf["downtown_area_type"] == 0, area_type_shape_variable + ] = downtown_area_type + + WranglerLogger.debug( + "Downtown Area Type used boundary file: {}".format(downtown_area_type_shape) + ) + + self.links_df[network_variable] = joined_gdf[area_type_shape_variable] + + WranglerLogger.info( + "Finished Calculating Area Type from Spatial Data into variable: {}".format( + network_variable + ) + )
+ +
[docs] def calculate_mpo( + self, + county_network_variable="county", + network_variable="mpo", + as_integer=True, + mpo_counties=None, + overwrite=False, + ): + """ + Calculates mpo variable. + #MC + Args: + county_variable (str): Name of the variable where the county names are stored. Default to "county". + network_variable (str): Name of the variable that should be written to. Default to "mpo". + as_integer (bool): If true, will convert true/false to 1/0s. + mpo_counties (list): List of county names that are within mpo region. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating MPO as roadway network variable: {}".format(network_variable) + ) + """ + Verify inputs + """ + county_network_variable = ( + county_network_variable + if county_network_variable + else self.parameters.county_network_variable + ) + + if not county_network_variable: + msg = "No variable specified as containing 'county' in the network." + WranglerLogger.error(msg) + raise ValueError(msg) + if county_network_variable not in self.links_df.columns: + msg = "Specified county network variable: {} does not exist in network. Try running or debuging county calculation." + WranglerLogger.error(msg) + raise ValueError(msg) + + mpo_counties = mpo_counties if mpo_counties else self.parameters.mpo_counties + + if not mpo_counties: + msg = "No MPO Counties specified in method call or in parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("MPO Counties: {}".format(",".join(str(mpo_counties)))) + + """ + Start actual process + """ + + mpo = self.links_df[county_network_variable].isin(mpo_counties) + + if as_integer: + mpo = mpo.astype(int) + + self.links_df[network_variable] = mpo + + WranglerLogger.info( + "Finished calculating MPO variable: {}".format(network_variable) + )
+ +
[docs] def add_variable_using_shst_reference( + self, + var_shst_csvdata=None, + shst_csv_variable=None, + network_variable=None, + network_var_type=int, + overwrite=False, + ): + """ + Join network links with source data, via SHST API node match result. + + Args: + var_shst_csvdata (str): File path to SHST API return. + shst_csv_variable (str): Variable name in the source data. + network_variable (str): Name of the variable that should be written to. + network_var_type : Variable type in the written network. + overwrite (bool): True is overwriting existing variable. Default to False. + + Returns: + None + + """ + WranglerLogger.info( + "Adding Variable {} using Shared Streets Reference from {}".format( + network_variable, var_shst_csvdata + ) + ) + + var_shst_df = pd.read_csv(var_shst_csvdata) + + if "shstReferenceId" not in var_shst_df.columns: + msg = "'shstReferenceId' required but not found in {}".format(var_shst_data) + WranglerLogger.error(msg) + raise ValueError(msg) + + if shst_csv_variable not in var_shst_df.columns: + msg = "{} required but not found in {}".format( + shst_csv_variable, var_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + join_gdf = pd.merge( + self.links_df, + var_shst_df[["shstReferenceId", shst_csv_variable]], + how="left", + on="shstReferenceId", + ) + + join_gdf[shst_csv_variable].fillna(0, inplace=True) + + if network_variable in self.links_df.columns and not overwrite: + join_gdf.loc[join_gdf[network_variable] > 0, network_variable] = join_gdf[ + shst_csv_variable + ].astype(network_var_type) + else: + join_gdf[network_variable] = join_gdf[shst_csv_variable].astype( + network_var_type + ) + + self.links_df[network_variable] = join_gdf[network_variable] + + WranglerLogger.info( + "Added variable: {} using Shared Streets Reference".format(network_variable) + )
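A usage sketch (the CSV path and its column name are hypothetical; the file must contain a 'shstReferenceId' column, as checked above):

    net.add_variable_using_shst_reference(
        var_shst_csvdata="data/aadt_shst_match.csv",  # hypothetical SHST match result
        shst_csv_variable="AADT",                     # hypothetical column in that file
        network_variable="AADT",
        network_var_type=int,
        overwrite=True,
    )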
+ +
[docs] def add_counts( + self, + network_variable="AADT", + mndot_count_shst_data=None, + widot_count_shst_data=None, + mndot_count_variable_shp=None, + widot_count_variable_shp=None, + ): + + """ + Adds count variable. + #MC + join the network with count node data, via SHST API node match result + + Args: + network_variable (str): Name of the variable that should be written to. Default to "AADT". + mndot_count_shst_data (str): File path to MNDOT count location SHST API node match result. + widot_count_shst_data (str): File path to WIDOT count location SHST API node match result. + mndot_count_variable_shp (str): File path to MNDOT count location geodatabase. + widot_count_variable_shp (str): File path to WIDOT count location geodatabase. + + Returns: + None + """ + + WranglerLogger.info("Adding Counts") + + """ + Verify inputs + """ + + mndot_count_shst_data = ( + mndot_count_shst_data + if mndot_count_shst_data + else self.parameters.mndot_count_shst_data + ) + widot_count_shst_data = ( + widot_count_shst_data + if widot_count_shst_data + else self.parameters.widot_count_shst_data + ) + mndot_count_variable_shp = ( + mndot_count_variable_shp + if mndot_count_variable_shp + else self.parameters.mndot_count_variable_shp + ) + widot_count_variable_shp = ( + widot_count_variable_shp + if widot_count_variable_shp + else self.parameters.widot_count_variable_shp + ) + + for varname, var in { + "mndot_count_shst_data": mndot_count_shst_data, + "widot_count_shst_data": widot_count_shst_data, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(var): + msg = "{}' not found at following location: {}.".format(varname, var) + WranglerLogger.error(msg) + raise ValueError(msg) + + for varname, var in { + "mndot_count_variable_shp": mndot_count_variable_shp, + "widot_count_variable_shp": widot_count_variable_shp, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + WranglerLogger.debug( + "Adding MNDOT Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + mndot_count_shst_data, mndot_count_variable_shp, network_variable + ) + ) + # Add Minnesota Counts + self.add_variable_using_shst_reference( + var_shst_csvdata=mndot_count_shst_data, + shst_csv_variable=mndot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=True, + ) + WranglerLogger.debug( + "Adding WiDot Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + widot_count_shst_data, widot_count_variable_shp, network_variable + ) + ) + # Add Wisconsin Counts, but don't overwrite Minnesota + self.add_variable_using_shst_reference( + var_shst_csvdata=widot_count_shst_data, + shst_csv_variable=widot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=False, + ) + + self.links_df["count_AM"] = self.links_df[network_variable] / 4 + self.links_df["count_MD"] = self.links_df[network_variable] / 4 + self.links_df["count_PM"] = self.links_df[network_variable] / 4 + self.links_df["count_NT"] = self.links_df[network_variable] / 4 + + self.links_df["count_daily"] = self.links_df[network_variable] + self.links_df["count_year"] = 2017 + + WranglerLogger.info( + "Finished adding counts variable: {}".format(network_variable) + )
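When the MnDOT/WisDOT file locations are already set on the Parameters instance, the call needs no arguments; afterwards links_df carries the derived count columns shown above:

    net.add_counts()
    net.links_df[["AADT", "count_AM", "count_MD", "count_PM", "count_NT", "count_daily", "count_year"]].head()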
+ +
[docs] @staticmethod + def read_match_result(path): + """ + Reads SHST geojson match results. + + Globs the given path pattern, reads each matching file, and concatenates them into a single dataframe. + + Args: + path (str): File path or glob pattern for SHST match results. + + Returns: + geodataframe: geopandas geodataframe of the concatenated match results + + ##todo + not sure why we need, but should be in utilities not this class + """ + refId_gdf = DataFrame() + refid_file = glob.glob(path) + for i in refid_file: + new = gpd.read_file(i) + refId_gdf = pd.concat([refId_gdf, new], ignore_index=True, sort=False) + return refId_gdf
+ +
[docs]    @staticmethod
    def get_attribute(
        links_df,
        join_key,  # either "shstReferenceId" or "shstGeometryId"; tests showed the latter gave better coverage
        source_shst_ref_df,  # source shst refId
        source_gdf,  # source dataframe
        field_name,  # targeted attribute from source
    ):
        """
        Gets attribute from source data using SHST match result.

        Args:
            links_df (DataFrame): The network dataframe that the new attribute should be written to.
            join_key (str): SHST ID variable name used to join source data with the network dataframe.
            source_shst_ref_df (DataFrame): SHST match result for the source data.
            source_gdf (GeoDataFrame): Source data.
            field_name (str): Name of the attribute to get from the source data.

        Returns:
            DataFrame: the network links with `field_name` and `source_link_id` attached.
        """
        # join based on shared streets geometry ID
        # pp_link_id is the shared streets match return
        # source_link_id is mrcc
        WranglerLogger.debug(
            "source ShSt rename_variables_for_dbf columns\n{}".format(
                source_shst_ref_df.columns
            )
        )
        WranglerLogger.debug("source gdf columns\n{}".format(source_gdf.columns))
        # end up with OSM network with the MRCC Link ID
        # could also do with route_sys...would that be quicker?
        join_refId_df = pd.merge(
            links_df,
            source_shst_ref_df[[join_key, "pp_link_id", "score"]].rename(
                columns={"pp_link_id": "source_link_id", "score": "source_score"}
            ),
            how="left",
            on=join_key,
        )

        # join with the source dataframe to get the target attribute (e.g. route_sys)
        join_refId_df = pd.merge(
            join_refId_df,
            source_gdf[["LINK_ID", field_name]].rename(
                columns={"LINK_ID": "source_link_id"}
            ),
            how="left",
            on="source_link_id",
        )

        # drop duplicated records with the same field value
        join_refId_df.drop_duplicates(
            subset=["model_link_id", "shstReferenceId", field_name], inplace=True
        )

        # when there is more than one match, keep the best score
        join_refId_df.sort_values(
            by=["model_link_id", "source_score"],
            ascending=True,
            na_position="first",
            inplace=True,
        )

        join_refId_df.drop_duplicates(
            subset=["model_link_id"], keep="last", inplace=True
        )

        return join_refId_df[links_df.columns.tolist() + [field_name, "source_link_id"]]
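The two merges in get_attribute() can be illustrated with toy frames: the network links first join the SHST match on the geometry id, then the match's source link id pulls the wanted field from the source data. All values below are made up.

    import pandas as pd

    links = pd.DataFrame({"model_link_id": [1, 2], "shstGeometryId": ["g1", "g2"]})
    shst_match = pd.DataFrame(
        {"shstGeometryId": ["g1", "g2"], "pp_link_id": [101, 102], "score": [0.9, 0.7]}
    ).rename(columns={"pp_link_id": "source_link_id", "score": "source_score"})
    source = pd.DataFrame({"LINK_ID": [101, 102], "ROUTE_SYS": ["I-94", "CSAH 5"]}).rename(
        columns={"LINK_ID": "source_link_id"}
    )

    joined = links.merge(shst_match, how="left", on="shstGeometryId").merge(
        source, how="left", on="source_link_id"
    )
    # joined now has ROUTE_SYS attached to each model_link_id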
+ +
[docs]    def calculate_use(
        self,
        network_variable="use",
        as_integer=True,
        overwrite=False,
    ):
        """
        Calculates the 'use' variable, which flags HOV-type links.

        Args:
            network_variable (str): Variable that should be written to in the network. Default to "use".
            as_integer (bool): If True, will convert true/false to 1/0s. Default to True.
            overwrite (bool): True if overwriting an existing 'use' variable in the network. Default to False.

        Returns:
            None
        """

        if network_variable in self.links_df:
            if overwrite:
                WranglerLogger.info(
                    "Overwriting existing 'use' variable '{}' already in network".format(
                        network_variable
                    )
                )
            else:
                WranglerLogger.info(
                    "'use' variable '{}' already in network. Returning without overwriting.".format(
                        network_variable
                    )
                )
                return

        WranglerLogger.info(
            "Calculating hov use and adding as roadway network variable: {}".format(
                network_variable
            )
        )
        """
        Verify inputs
        """

        if not network_variable:
            msg = "No network variable specified for 'use'"
            WranglerLogger.error(msg)
            raise ValueError(msg)

        """
        Start actual process
        """
        #MTC
        self.links_df[network_variable] = int(1)
        #/MTC
        #MC
        self.links_df[network_variable] = 0

        self.links_df.loc[
            (self.links_df["assign_group"] == 8) | (self.links_df["access"] == "hov"),
            network_variable,
        ] = 100
        #/MC

        if as_integer:
            self.links_df[network_variable] = self.links_df[network_variable].astype(
                int
            )
        WranglerLogger.info(
            "Finished calculating hov use variable: {}".format(network_variable)
        )
+ +
[docs]    def create_ML_variable(
        self,
        network_variable="ML_lanes",
        overwrite=False,
    ):
        """
        Creates an ML lanes placeholder for projects to write out ML changes.

        ML lanes default to 0; ML info comes from the cube LOG file and is stored in project cards.

        Args:
            network_variable (str): Name of the ML lanes variable. Default to "ML_lanes".
            overwrite (bool): True if overwriting an existing variable in the network. Default to False.

        Returns:
            None
        """
        if network_variable in self.links_df:
            if overwrite:
                WranglerLogger.info(
                    "Overwriting existing ML Variable '{}' already in network".format(
                        network_variable
                    )
                )
            else:
                WranglerLogger.info(
                    "ML Variable '{}' already in network. Returning without overwriting.".format(
                        network_variable
                    )
                )
                return

        """
        Verify inputs
        """

        self.links_df[network_variable] = int(0)

        WranglerLogger.info(
            "Finished creating ML lanes variable: {}".format(network_variable)
        )
+ +
[docs] def create_hov_corridor_variable( + self, + network_variable="segment_id", + overwrite=False, + ): + """ + Created hov corridor placeholder for project to write out corridor changes + + hov corridor id default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov corridor Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Hov corridor Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating hov corridor variable: {}".format(network_variable) + )
+ +
[docs] def create_managed_variable( + self, + network_variable="managed", + overwrite=False, + ): + """ + Created placeholder for project to write out managed + + managed default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing managed Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Managed Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating managed variable: {}".format(network_variable) + )
+ +
[docs] def calculate_centroidconnect( + self, + parameters, + network_variable="centroidconnect", + highest_taz_number=None, + as_integer=True, + overwrite=False, + ): + """ + Calculates centroid connector variable. + + Args: + parameters (Parameters): A Lasso Parameters, which stores input files. + network_variable (str): Variable that should be written to in the network. Default to "centroidconnect" + highest_taz_number (int): the max TAZ number in the network. + as_integer (bool): If True, will convert true/false to 1/0s. Default to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + Returns: + RoadwayNetwork + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Centroid Connector Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Centroid Connector Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Centroid Connector and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + highest_taz_number = ( + highest_taz_number if highest_taz_number else parameters.highest_taz_number + ) + + if not highest_taz_number: + msg = "No highest_TAZ number specified in method variable or in parameters" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug( + "Calculating Centroid Connectors using highest TAZ number: {}".format( + highest_taz_number + ) + ) + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + self.links_df[network_variable] = False + + self.links_df.loc[ + (self.links_df["A"] <= highest_taz_number) + | (self.links_df["B"] <= highest_taz_number), + network_variable, + ] = True + + if as_integer: + self.links_df[network_variable] = self.links_df[ + network_variable + ].astype(int) + WranglerLogger.info( + "Finished calculating centroid connector variable: {}".format(network_variable) + )
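The core of calculate_centroidconnect() is a simple threshold test on the A and B node numbers; a self-contained version of that test, with made-up node numbers and a made-up TAZ cutoff, looks like this.

    import pandas as pd

    highest_taz_number = 3100                      # made-up cutoff
    links = pd.DataFrame({"A": [1200, 50000], "B": [45000, 46000]})
    links["centroidconnect"] = (
        (links["A"] <= highest_taz_number) | (links["B"] <= highest_taz_number)
    ).astype(int)
    # -> [1, 0]: only the first link touches a centroid (TAZ) node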
+ + +
[docs] def calculate_distance( + self, network_variable="distance", centroidconnect_only=False, overwrite=False + ): + """ + calculate link distance in miles + + Args: + centroidconnect_only (Bool): True if calculating distance for centroidconnectors only. Default to False. + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing distance Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Distance Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + #MC + if ("centroidconnect" not in self.links_df) & ("taz" not in self.links_df.roadway.unique()): + if centroidconnect_only: + msg = "No variable specified for centroid connector, calculating centroidconnect first" + WranglerLogger.error(msg) + raise ValueError(msg) + #/MC + + """ + Start actual process + """ + + temp_links_gdf = self.links_df.copy() + temp_links_gdf.crs = "EPSG:4326" + temp_links_gdf = temp_links_gdf.to_crs(epsg=26915) + + #MTC + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MTC + #MC + if centroidconnect_only: + WranglerLogger.info( + "Calculating {} for centroid connectors".format(network_variable) + ) + temp_links_gdf[network_variable] = np.where( + temp_links_gdf.centroidconnect == 1, + temp_links_gdf.geometry.length / 1609.34, + temp_links_gdf[network_variable], + ) + else: + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MC + + self.links_df[network_variable] = temp_links_gdf[network_variable]
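The distance calculation reprojects the WGS84 geometry to UTM 15N (EPSG:26915) so lengths come out in meters, then divides by 1609.34 to get miles. A standalone sketch with a single made-up line segment:

    import geopandas as gpd
    from shapely.geometry import LineString

    gdf = gpd.GeoDataFrame(
        geometry=[LineString([(-93.27, 44.98), (-93.26, 44.98)])], crs="EPSG:4326"
    )
    gdf = gdf.to_crs(epsg=26915)                       # meters
    gdf["distance"] = gdf.geometry.length / 1609.34    # miles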
+ +
[docs]    def convert_int(self, int_col_names=[]):
        """
        Convert integer columns, replacing blanks and NaN with zero.

        Args:
            int_col_names (list): list of column names to convert to integer;
                in the MTC variant this is taken from parameters.int_col.

        Returns:
            None
        """

        #MTC
        WranglerLogger.info(
            "Converting variable type to mtc standard"
        )

        int_col_names = self.parameters.int_col
        #/MTC
        #MC
        """
        WranglerLogger.info("Converting variable type to MetCouncil standard")

        if not int_col_names:
            int_col_names = self.parameters.int_col
        #/MC
        """
        ##Why are we doing this?
        # int_col_names.remove("lanes")

        for c in list(set(self.links_df.columns) & set(int_col_names)):
            # replace NaN and blanks with zero for integer columns, then cast
            try:
                self.links_df[c] = self.links_df[c].replace(np.nan, 0)
                self.links_df[c] = self.links_df[c].replace("", 0)
                self.links_df[c] = self.links_df[c].astype(int)
            except ValueError:
                try:
                    self.links_df[c] = self.links_df[c].astype(float)
                    self.links_df[c] = self.links_df[c].astype(int)
                except Exception:
                    msg = f"Could not convert column {c} to integer."
                    WranglerLogger.error(msg)
                    raise ValueError(msg)
            except Exception:
                self.links_df[c] = self.links_df[c].astype(float)
                self.links_df[c] = self.links_df[c].astype(int)

        for c in list(set(self.nodes_df.columns) & set(int_col_names)):
            self.nodes_df[c] = self.nodes_df[c].replace("", 0)
            self.nodes_df[c] = self.nodes_df[c].astype(int)
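The coercion convert_int() performs on each column, shown on a toy Series with an arbitrary mix of values; the fallback through float handles strings like "3.0".

    import numpy as np
    import pandas as pd

    s = pd.Series([1, "", np.nan, "3.0"])
    s = s.replace(np.nan, 0).replace("", 0)
    try:
        s = s.astype(int)
    except ValueError:
        s = s.astype(float).astype(int)   # handles "3.0"-style strings
    # -> [1, 0, 0, 3]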
+ +
[docs] def fill_na(self): + """ + Fill na values from create_managed_lane_network() + """ + + WranglerLogger.info("Filling nan for network from network wrangler") + + num_col = self.parameters.int_col + self.parameters.float_col + + for x in list(self.links_df.columns): + if x in num_col: + self.links_df[x].fillna(0, inplace=True) + self.links_df[x] = self.links_df[x].apply( + lambda k: 0 if k in [np.nan, "", float("nan"), "NaN"] else k + ) + + else: + self.links_df[x].fillna("", inplace=True) + + for x in list(self.nodes_df.columns): + if x in num_col: + self.nodes_df[x].fillna(0, inplace=True) + else: + self.nodes_df[x].fillna("", inplace=True)
+ + +
[docs] def roadway_standard_to_met_council_network(self, output_epsg=None): + """ + Rename and format roadway attributes to be consistent with what metcouncil's model is expecting. + #MC + Args: + output_epsg (int): epsg number of output network. + + Returns: + None + """ + + WranglerLogger.info( + "Renaming roadway attributes to be consistent with what metcouncil's model is expecting" + ) + + """ + Verify inputs + """ + + output_epsg = output_epsg if output_epsg else self.parameters.output_epsg + + """ + Start actual process + """ + if "managed" in self.links_df.columns: + WranglerLogger.info("Creating managed lane network.") + self.create_managed_lane_network(in_place=True) + + # when ML and assign_group projects are applied together, assign_group is filled as "" by wrangler for ML links + for c in ModelRoadwayNetwork.CALCULATED_VALUES: + if c in self.links_df.columns and c in self.parameters.int_col: + self.links_df[c] = self.links_df[c].replace("", 0) + else: + WranglerLogger.info("Didn't detect managed lanes in network.") + + self.calculate_centroidconnect(self.parameters) + self.create_calculated_variables() + self.calculate_distance(overwrite=True) + + self.fill_na() + # no method to calculate price yet, will be hard coded in project card + WranglerLogger.info("Splitting variables by time period and category") + self.split_properties_by_time_period_and_category() + self.convert_int() + + self.links_metcouncil_df = self.links_df.copy() + self.nodes_metcouncil_df = self.nodes_df.copy() + + self.links_metcouncil_df = pd.merge( + self.links_metcouncil_df.drop( + "geometry", axis=1 + ), # drop the stick geometry in links_df + self.shapes_df[["shape_id", "geometry"]], + how="left", + on="shape_id", + ) + + self.links_metcouncil_df.crs = "EPSG:4326" + self.nodes_metcouncil_df.crs = "EPSG:4326" + WranglerLogger.info("Setting Coordinate Reference System to EPSG 26915") + self.links_metcouncil_df = self.links_metcouncil_df.to_crs(epsg=26915) + self.nodes_metcouncil_df = self.nodes_metcouncil_df.to_crs(epsg=26915) + + self.nodes_metcouncil_df["X"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.x + ) + self.nodes_metcouncil_df["Y"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.y + ) + + # CUBE expect node id to be N + self.nodes_metcouncil_df.rename(columns={"model_node_id": "N"}, inplace=True)
+ +
[docs] def rename_variables_for_dbf( + self, + input_df, + variable_crosswalk: str = None, + output_variables: list = None, + convert_geometry_to_xy=False, + ): + """ + Rename attributes for DBF/SHP, make sure length within 10 chars. + + Args: + input_df (dataframe): Network standard DataFrame. + variable_crosswalk (str): File path to variable name crosswalk from network standard to DBF names. + output_variables (list): List of strings for DBF variables. + convert_geometry_to_xy (bool): True if converting node geometry to X/Y + + Returns: + dataframe + + """ + WranglerLogger.info("Renaming variables so that they are DBF-safe") + + """ + Verify inputs + """ + + variable_crosswalk = ( + variable_crosswalk + if variable_crosswalk + else self.parameters.net_to_dbf_crosswalk + ) + + output_variables = ( + output_variables if output_variables else self.parameters.output_variables + ) + + """ + Start actual process + """ + + crosswalk_df = pd.read_csv(variable_crosswalk) + WranglerLogger.debug( + "Variable crosswalk: {} \n {}".format(variable_crosswalk, crosswalk_df) + ) + net_to_dbf_dict = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"])) + + dbf_name_list = [] + + dbf_df = copy.deepcopy(input_df) + + # only write out variables that we specify + # if variable is specified in the crosswalk, rename it to that variable + for c in dbf_df.columns: + if c in output_variables: + try: + dbf_df.rename(columns={c: net_to_dbf_dict[c]}, inplace=True) + dbf_name_list += [net_to_dbf_dict[c]] + except: + dbf_name_list += [c] + + if "geometry" in dbf_df.columns: + if str(dbf_df["geometry"].iloc[0].geom_type) == "Point": + dbf_df["X"] = dbf_df.geometry.apply(lambda g: g.x) + dbf_df["Y"] = dbf_df.geometry.apply(lambda g: g.y) + dbf_name_list += ["X", "Y"] + + WranglerLogger.debug("DBF Variables: {}".format(",".join(dbf_name_list))) + + return dbf_df[dbf_name_list]
+ +
[docs] def write_roadway_as_shp( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + data_to_csv: bool = True, + data_to_dbf: bool = False, + output_link_shp: str = None, + output_node_shp: str = None, + output_link_csv: str = None, + output_node_csv: str = None, + output_gpkg: str = None, + output_link_gpkg_layer: str = None, + output_node_gpkg_layer: str = None, + output_gpkg_link_filter: str = None + ): + """ + Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names. + + Args: + output_dir (str): File path to directory + node_output_variables (list): List of strings for node output variables. + link_output_variables (list): List of strings for link output variables. + data_to_csv (bool): True if write network in csv format. + data_to_dbf (bool): True if write network in dbf/shp format. + output_link_shp (str): File name to output link dbf/shp. + output_node_shp (str): File name of output node dbf/shp. + output_link_csv (str): File name to output link csv. + output_node_csv (str): File name to output node csv. + output_gpkg (str): File name to output GeoPackage. + output_link_gpkg_layer (str): Layer name within output_gpkg to output links. + output_node_gpkg_layer (str): Layer name within output_gpkg to output links. + output_gpkg_link_filter (str): Optional column name to additional output link subset layers + + Returns: + None + """ + + WranglerLogger.info("Writing Network as Shapefile") + WranglerLogger.debug( + "Output Variables: \n - {}".format( + "\n - ".join(self.parameters.output_variables) + ) + ) + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_met_council_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + # unless specified that all the data goes to the DBF, only output A and B + dbf_link_output_variables = ( + #MTC + link_output_variables if link_output_variables else ["A", "B", "geometry"] + #MC + #link_output_variables if data_to_dbf else ["A", "B", "shape_id", "geometry"] + ) + + # Removing code to set this to versions from parameters + # User can use these as arg + + """ + Start Process + """ + # rename these to short only for shapefile option + if output_node_shp: + WranglerLogger.info("Renaming DBF Node Variables") + nodes_dbf_df = self.rename_variables_for_dbf(self.nodes_mtc_df, output_variables=node_output_variables) + else: + WranglerLogger.debug("nodes_mtc_df columns: {}".format(list(self.nodes_mtc_df.columns))) + nodes_dbf_df = self.nodes_mtc_df[node_output_variables] + + if output_link_shp: + WranglerLogger.info("Renaming DBF Link Variables") + links_dbf_df = self.rename_variables_for_dbf(self.links_mtc_df, output_variables=dbf_link_output_variables) + else: + WranglerLogger.debug("links_mtc_df columns: {}".format(list(self.links_mtc_df.columns))) + links_dbf_df = self.links_mtc_df[dbf_link_output_variables] + + links_dbf_df = gpd.GeoDataFrame(links_dbf_df, 
geometry=links_dbf_df["geometry"]) + + # temp debug + WranglerLogger.debug("links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))]:\n{}".format( + links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))] + )) + + if output_node_shp: + WranglerLogger.info("Writing Node Shapes: {}".format(os.path.join(output_dir, output_node_shp))) + nodes_dbf_df.to_file(os.path.join(output_dir, output_node_shp)) + + if output_gpkg and output_node_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Node Layer {}".format(os.path.join(output_dir, output_gpkg), output_node_gpkg_layer)) + nodes_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_node_gpkg_layer, driver="GPKG") + + if output_link_shp: + WranglerLogger.info("Writing Link Shapes: {}".format(os.path.join(output_dir, output_link_shp))) + links_dbf_df.to_file(os.path.join(output_dir, output_link_shp)) + + # debug test + link_schema = { + "properties": { + "A" : "int:8", + "B" : "int:8", + "model_link_id" : "int:10", + "shstGeometryId": "str:32", + "name" : "str:84", + "ft" : "int:2", + "assignable" : "int:18", + "cntype" : "str:80", + "distance" : "float", + "county" : "str:15", + "bike_access" : "int:2", + "drive_access" : "int:2", + "walk_access" : "int:2", + "rail_only" : "int:2", + "bus_only" : "int:2", + "transit" : "int:2", + "managed" : "int:2", + "tollbooth" : "int:2", + "tollseg" : "int:2", + "segment_id" : "int:4", + "lanes_EA" : "int:2", + "lanes_AM" : "int:2", + "lanes_MD" : "int:2", + "lanes_PM" : "int:2", + "lanes_EV" : "int:2", + "useclass_EA" : "int:2", + "useclass_AM" : "int:2", + "useclass_MD" : "int:2", + "useclass_PM" : "int:2", + "useclass_EV" : "int:2", + "nmt2010" : "int:2", + "nmt2020" : "int:2", + }, + "geometry": "LineString" + } + if output_gpkg and output_link_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Link Layer {}".format(os.path.join(output_dir, output_gpkg), output_link_gpkg_layer)) + links_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_link_gpkg_layer, schema=link_schema, driver="GPKG") + + # output additional link layers if filter column is specified + # e.g. if county-subsets are output + if output_gpkg_link_filter: + link_value_counts = links_dbf_df[output_gpkg_link_filter].value_counts() + for filter_val,filter_count in link_value_counts.items(): + gpkg_layer_name = "{}_{}".format(output_link_gpkg_layer, filter_val) + gpkg_layer_name = gpkg_layer_name.replace(" ","_") + WranglerLogger.info("Writing GeoPackage {} with Link Layer {} for {} rows".format( + os.path.join(output_dir, output_gpkg), gpkg_layer_name, filter_count)) + links_dbf_df.loc[ links_dbf_df[output_gpkg_link_filter]==filter_val ].to_file( + os.path.join(output_dir, output_gpkg), layer=gpkg_layer_name, schema=link_schema, driver="GPKG") + + + + + if data_to_csv: + WranglerLogger.info( + "Writing Network Data to CSVs:\n - {}\n - {}".format( + output_link_csv, output_node_csv + ) + ) + self.links_mtc_df[link_output_variables].to_csv( + output_link_csv, index=False + ) + self.nodes_mtc_df[node_output_variables].to_csv( + output_node_csv, index=False + )
+ + + # this should be moved to util +
[docs] @staticmethod + def dataframe_to_fixed_width(df): + """ + Convert dataframe to fixed width format, geometry column will not be transformed. + + Args: + df (pandas DataFrame). + + Returns: + pandas dataframe: dataframe with fixed width for each column. + dict: dictionary with columns names as keys, column width as values. + """ + WranglerLogger.info("Starting fixed width conversion") + if 'name' in df.columns: + df['name']=df['name'].apply(lambda x: x.strip().split(',')[0].replace("[",'').replace("'nan'","").replace("nan","").replace("'","")) + + + # get the max length for each variable column + max_width_dict = dict( + [ + (v, df[v].apply(lambda r: len(str(r)) if r != None else 0).max()) + for v in df.columns.values + if v != "geometry" + ] + ) + + fw_df = df.drop("geometry", axis=1).copy() + for c in fw_df.columns: + fw_df[c] = fw_df[c].apply(lambda x: str(x)) + fw_df["pad"] = fw_df[c].apply(lambda x: " " * (max_width_dict[c] - len(x))) + fw_df[c] = fw_df.apply(lambda x: x["pad"] + x[c], axis=1) + + return fw_df, max_width_dict
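The padding logic in dataframe_to_fixed_width() right-justifies every value to its column's maximum string width; here is the same idea on a tiny frame with invented values.

    import pandas as pd

    df = pd.DataFrame({"N": [1, 23456], "county": ["Hennepin", "Ramsey"]})
    widths = {c: df[c].astype(str).str.len().max() for c in df.columns}
    fw = df.astype(str).apply(lambda col: col.str.rjust(widths[col.name]))
    # widths == {"N": 5, "county": 8}; every cell in `fw` is padded to that width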
+ +
[docs] def write_roadway_as_fixedwidth( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + output_link_txt: str = None, + output_node_txt: str = None, + output_link_header_width_txt: str = None, + output_node_header_width_txt: str = None, + output_cube_network_script: str = None, + drive_only: bool = False, + ): + """ + Writes out fixed width file. + + This function does: + 1. write out link and node fixed width data files for cube. + 2. write out header and width correspondence. + 3. write out cube network building script with header and width specification. + + Args: + output_dir (str): File path to where links, nodes and script will be written and run + node_output_variables (list): list of node variable names. + link_output_variables (list): list of link variable names. + output_link_txt (str): File name of output link database (within output_dir) + output_node_txt (str): File name of output node database (within output_dir) + output_link_header_width_txt (str): File name of link column width records (within output_dir) + output_node_header_width_txt (str): File name of node column width records (within output_dir) + output_cube_network_script (str): File name of CUBE network building script (within output_dir) + drive_only (bool): If True, only writes drive nodes and links + + Returns: + None + + """ + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_mtc_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + output_link_txt = ( + output_link_txt if output_link_txt else self.parameters.output_link_txt + ) + + output_node_txt = ( + output_node_txt if output_node_txt else self.parameters.output_node_txt + ) + + output_link_header_width_txt = ( + output_link_header_width_txt + if output_link_header_width_txt + else self.parameters.output_link_header_width_txt + ) + + output_node_header_width_txt = ( + output_node_header_width_txt + if output_node_header_width_txt + else self.parameters.output_node_header_width_txt + ) + + output_cube_network_script = ( + output_cube_network_script + if output_cube_network_script + else self.parameters.output_cube_network_script + ) + + """ + Start Process + """ + #MTC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_mtc_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df['drive_access'] == 1] + #/MTC + """ + #MC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_metcouncil_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df["drive_access"] == 1] + #/MC + """ + WranglerLogger.info("Writing out link database") + + link_ff_df.to_csv(os.path.join(output_dir, output_link_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out link header and width ----") + link_max_width_df = DataFrame( + 
list(link_max_width_dict.items()), columns=["header", "width"] + ) + link_max_width_df.to_csv(os.path.join(output_dir, output_link_header_width_txt), index=False) + + #MTC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_mtc_df[node_output_variables] + ) + #/MTC + """ + #MC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_metcouncil_df[node_output_variables] + ) + #/MC + """ + WranglerLogger.info("Writing out node database") + + if drive_only: + node_ff_df = node_ff_df.loc[node_ff_df["drive_node"] == 1] + + + node_ff_df.to_csv(os.path.join(output_dir, output_node_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out node header and width") + node_max_width_df = DataFrame( + list(node_max_width_dict.items()), columns=["header", "width"] + ) + node_max_width_df.to_csv(os.path.join(output_dir, output_node_header_width_txt), index=False) + + # write out cube script + s = 'RUN PGM = NETWORK MSG = "Read in network from fixed width file" \n' + s += 'FILEI LINKI[1] = "{}",'.format(output_link_txt) + start_pos = 1 + for i in range(len(link_max_width_df)): + s += " VAR=" + link_max_width_df.header.iloc[i] + + if ( + self.links_mtc_df.dtypes.loc[link_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(link_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(link_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += link_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += "\n" + s += 'FILEI NODEI[1] = "{}",'.format(output_node_txt) + start_pos = 1 + for i in range(len(node_max_width_df)): + s += " VAR=" + node_max_width_df.header.iloc[i] + + if ( + self.nodes_mtc_df.dtypes.loc[node_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(node_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(node_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += node_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += '\n' + s += 'FILEO NETO = "complete_network.net"\n\n' + s += ' ZONES = {}\n\n'.format(self.parameters.zones) + s += '; Trim leading whitespace from string variables\n' + # todo: The below should be built above based on columns that are strings + s += ' phase=NODEMERGE\n' + s += ' county = LTRIM(county)\n' + s += ' endphase\n' + s += ' phase=LINKMERGE\n' + s += ' name = LTRIM(name)\n' + s += ' county = LTRIM(county)\n' + s += ' cntype = LTRIM(cntype)\n' + s += ' endphase\n' + s += '\nENDRUN\n' + + with open(os.path.join(output_dir, output_cube_network_script), "w") as f: + f.write(s) + + # run the cube script to create the cube network + import subprocess + env = copy.copy(os.environ) + cube_cmd = '"C:\\Program Files\\Citilabs\\CubeVoyager\\runtpp.exe" {}'.format(output_cube_network_script) + try: + WranglerLogger.info("Running [{}] in cwd [{}]".format(cube_cmd, output_dir)) + ret = subprocess.run(cube_cmd, cwd=output_dir, capture_output=True, check=True) + + WranglerLogger.info("return code: {}".format(ret.returncode)) + + for line in ret.stdout.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stdout: {}".format(line)) + + for line in ret.stderr.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stderr: {}".format(line)) + + except Exception as e: + WranglerLogger.error(e)
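The Cube script this method generates declares each fixed-width field with VAR/BEG/LEN clauses whose start positions accumulate column widths plus one separator character. A standalone sketch of that loop, using an invented width table:

    widths = [("N", 7), ("X", 12), ("county", 15)]   # invented header/width pairs
    spec, start_pos = "", 1
    for header, width in widths:
        spec += " VAR={}, BEG={}, LEN={},".format(header, start_pos, width)
        start_pos += width + 1
    spec = spec.rstrip(",")
    # -> ' VAR=N, BEG=1, LEN=7, VAR=X, BEG=9, LEN=12, VAR=county, BEG=22, LEN=15'
    # (string columns additionally get a (C<width>) suffix in the real method)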
\ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_modules/lasso/transit/index.html b/branch/bicounty_2035_hwy_update/_modules/lasso/transit/index.html new file mode 100644 index 0000000..13a7d41 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_modules/lasso/transit/index.html @@ -0,0 +1,1858 @@ +lasso.transit — lasso documentation

Source code for lasso.transit

+"""Transit-related classes to parse, compare, and write standard and cube transit files.
+
+  Typical usage example:
+
+    tn = CubeTransit.create_from_cube(CUBE_DIR)
+    transit_change_list = tn.evaluate_differences(base_transit_network)
+
+    cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+    cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+"""
+import os
+import copy
+import csv
+import datetime, time
+from typing import Any, Dict, Optional, Union
+
+from lark import Lark, Transformer, v_args
+from pandas import DataFrame
+
+import pandas as pd
+import partridge as ptg
+import numpy as np
+
+from network_wrangler import TransitNetwork
+
+from .logger import WranglerLogger
+from .parameters import Parameters
+
+
[docs]class CubeTransit(object): + """Class for storing information about transit defined in Cube line + files. + + Has the capability to: + + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + + .. highlight:: python + + Typical usage example: + :: + tn = CubeTransit.create_from_cube(CUBE_DIR) + transit_change_list = tn.evaluate_differences(base_transit_network) + + Attributes: + lines (list): list of strings representing unique line names in + the cube network. + line_properties (dict): dictionary of line properties keyed by line name. Property + values are stored in a dictionary by property name. These + properties are directly read from the cube line files and haven't + been translated to standard transit values. + shapes (dict): dictionary of shapes + keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns: + - 'node_id' (int): positive integer of node id + - 'node' (int): node number, with negative indicating a non-stop + - 'stop' (boolean): indicates if it is a stop + - 'order' (int): order within this shape + program_type (str): Either PT or TRNBLD + parameters (Parameters): + Parameters instance that will be applied to this instance which + includes information about time periods and variables. + source_list (list): + List of cube line file sources that have been read and added. + diff_dict (dict): + """ + +
[docs] def __init__(self, parameters: Union[Parameters, dict] = {}): + """ + Constructor for CubeTransit + + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + WranglerLogger.debug("Creating a new Cube Transit instance") + + self.lines = [] + + self.line_properties = {} + self.shapes = {} + + self.program_type = None + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.source_list = [] + + self.diff_dict = Dict[str, Any]
+ +
[docs]    def add_cube(self, transit_source: str):
        """Reads a .lin file and adds it to the existing CubeTransit instance.

        Args:
            transit_source: a line-file string, a cube line file, or a directory of cube line files to be parsed

        """

        """
        Figure out what kind of transit source it is
        """

        parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, debug="debug", parser="lalr")

        if "NAME=" in transit_source:
            WranglerLogger.debug("reading transit source as string")
            self.source_list.append("input_str")
            parse_tree = parser.parse(transit_source)
        elif os.path.isfile(transit_source):
            print("reading: {}".format(transit_source))
            with open(transit_source) as file:
                WranglerLogger.debug(
                    "reading transit source: {}".format(transit_source)
                )
                self.source_list.append(transit_source)
                parse_tree = parser.parse(file.read())
        elif os.path.isdir(transit_source):
            import glob

            for lin_file in glob.glob(os.path.join(transit_source, "*.LIN")):
                self.add_cube(lin_file)
            return
        else:
            msg = "{} is not a valid transit line string, directory, or file".format(
                transit_source
            )
            WranglerLogger.error(msg)
            raise ValueError(msg)

        WranglerLogger.debug("finished parsing cube line file")
        # WranglerLogger.debug("--Parse Tree--\n {}".format(parse_tree.pretty()))
        transformed_tree_data = CubeTransformer().transform(parse_tree)
        # WranglerLogger.debug("--Transformed Parse Tree--\n {}".format(transformed_tree_data))

        _line_data = transformed_tree_data["lines"]

        line_properties_dict = {k: v["line_properties"] for k, v in _line_data.items()}
        line_shapes_dict = {k: v["line_shape"] for k, v in _line_data.items()}
        new_lines = list(line_properties_dict.keys())

        """
        Before adding lines, check to see if any are overlapping with existing ones in the network
        """

        overlapping_lines = set(new_lines) & set(self.lines)
        if overlapping_lines:
            msg = "Overlapping lines found when adding from {}.\nSource files:\n{}\n{} overlapping lines of {} total new lines.\n-->{}".format(
                transit_source,
                "\n - ".join(self.source_list),
                len(overlapping_lines),
                len(new_lines),
                overlapping_lines,
            )
            print(msg)
            WranglerLogger.error(msg)
            raise ValueError(msg)

        self.program_type = transformed_tree_data.get("program_type", None)

        self.lines += new_lines
        self.line_properties.update(line_properties_dict)
        self.shapes.update(line_shapes_dict)

        WranglerLogger.debug("Added lines to CubeTransit: \n{}".format(new_lines))
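A hedged usage sketch for add_cube(); the directory and file names below are placeholders, not paths from this project.

    tn = CubeTransit()
    tn.add_cube("cube_transit_dir/")     # a directory: every *.LIN file inside is parsed
    tn.add_cube("extra_route.lin")       # or a single Cube line file
    # A raw string containing "NAME=" is also accepted and parsed directly.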
+ +
[docs] @staticmethod + def create_from_cube(transit_source: str, parameters: Optional[dict] = {}): + """ + Reads a cube .lin file and stores as TransitNetwork object. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + Returns: + A ::CubeTransit object created from the transit_source. + """ + + tn = CubeTransit(parameters) + tn.add_cube(transit_source) + + return tn
+ +
[docs] def evaluate_differences(self, base_transit): + """ + 1. Identifies what routes need to be updated, deleted, or added + 2. For routes being added or updated, identify if the time periods + have changed or if there are multiples, and make duplicate lines if so + 3. Create project card dictionaries for each change. + + Args: + base_transit (CubeTransit): an instance of this class for the base condition + + Returns: + A list of dictionaries containing project card changes + required to evaluate the differences between the base network + and this transit network instance. + """ + transit_change_list = [] + + """ + Identify what needs to be evaluated + """ + lines_to_update = [l for l in self.lines if l in base_transit.lines] + lines_to_delete = [l for l in base_transit.lines if l not in self.lines] + lines_to_add = [l for l in self.lines if l not in base_transit.lines] + + project_card_changes = [] + + """ + Evaluate Property Updates + """ + + for line in lines_to_update: + WranglerLogger.debug( + "Finding differences in time periods for: {}".format(line) + ) + + """ + Find any additional time periods that might need to add or delete. + """ + base_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + base_transit.line_properties[line] + ) + ) + + try: + assert len(base_cube_time_period_numbers) == 1 + except: + msg = "Base network line {} should only have one time period per route, but {} found".format( + line, base_cube_time_period_numbers + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + base_cube_time_period_number = base_cube_time_period_numbers[0] + + build_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + + time_periods_to_add = [ + tp + for tp in build_cube_time_period_numbers + if tp not in base_cube_time_period_numbers + ] + + for tp in time_periods_to_add: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + time_periods_to_delete = [ + tp + for tp in base_cube_time_period_numbers + if tp not in build_cube_time_period_numbers + ] + + for tp in time_periods_to_delete: + lines_to_delete.append(line) + + WranglerLogger.debug("Evaluating differences in: {}".format(line)) + updated_properties = self.evaluate_route_property_differences( + self.line_properties[line], + base_transit.line_properties[line], + base_cube_time_period_number, + ) + updated_shapes = CubeTransit.evaluate_route_shape_changes( + self.shapes[line].node, base_transit.shapes[line].node + ) + if updated_properties: + update_prop_card_dict = self.create_update_route_card_dict( + line, updated_properties + ) + project_card_changes.append(update_prop_card_dict) + + if updated_shapes: + update_shape_card_dict = self.create_update_route_card_dict( + line, updated_shapes + ) + project_card_changes.append(update_shape_card_dict) + + """ + Evaluate Deletions + """ + for line in lines_to_delete: + delete_card_dict = self.create_delete_route_card_dict( + line, base_transit.line_properties[line] + ) + project_card_changes.append(delete_card_dict) + + """ + Evaluate Additions + + First assess if need to add multiple routes if there are multiple time periods + """ + for line in lines_to_add: + time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + if len(time_period_numbers) > 1: + for tp in time_period_numbers[1:]: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + for line in lines_to_add: + 
add_card_dict = self.create_add_route_card_dict(line) + project_card_changes.append(add_card_dict) + + return project_card_changes
+ +
[docs]    def add_additional_time_periods(
        self, new_time_period_number: int, orig_line_name: str
    ):
        """
        Copies a route to another cube time period with appropriate
        values for time-period-specific properties.

        New properties are stored under the new name in:
        - ::self.shapes
        - ::self.line_properties

        Args:
            new_time_period_number (int): cube time period number
            orig_line_name(str): name of the originating line, from which
                the new line will copy its properties.

        Returns:
            Line name with new time period.
        """
        WranglerLogger.debug(
            "adding time periods {} to line {}".format(
                new_time_period_number, orig_line_name
            )
        )

        (
            route_id,
            _init_time_period,
            agency_id,
            direction_id,
        ) = CubeTransit.unpack_route_name(orig_line_name)
        new_time_period_name = self.parameters.cube_time_periods[new_time_period_number]
        new_tp_line_name = CubeTransit.build_route_name(
            route_id=route_id,
            time_period=new_time_period_name,
            agency_id=agency_id,
            direction_id=direction_id,
        )

        try:
            assert new_tp_line_name not in self.lines
        except:
            msg = "Trying to add a new time period {} to line {}, but constructed name {} is already in line list.".format(
                new_time_period_number, orig_line_name, new_tp_line_name
            )
            WranglerLogger.error(msg)
            raise ValueError(msg)

        # copy to a new line and add it to list of lines to add
        self.line_properties[new_tp_line_name] = copy.deepcopy(
            self.line_properties[orig_line_name]
        )
        self.shapes[new_tp_line_name] = copy.deepcopy(self.shapes[orig_line_name])
        self.line_properties[new_tp_line_name]["NAME"] = new_tp_line_name

        """
        Remove entries that aren't for this time period from the new line's properties list.
        """
        this_time_period_properties_list = [
            p + "[" + str(new_time_period_number) + "]"
            ##todo parameterize all time period specific variables
            for p in ["HEADWAY", "FREQ"]
        ]

        not_this_tp_properties_list = list(
            set(self.parameters.time_period_properties_list)
            - set(this_time_period_properties_list)
        )

        for k in not_this_tp_properties_list:
            self.line_properties[new_tp_line_name].pop(k, None)

        """
        Remove entries for this time period from the original line's properties list.
        """
        for k in this_time_period_properties_list:
            self.line_properties[orig_line_name].pop(k, None)

        """
        Add new line to list of lines to add.
        """
        WranglerLogger.debug(
            "Adding new time period {} for line {} as {}.".format(
                new_time_period_number, orig_line_name, new_tp_line_name
            )
        )
        return new_tp_line_name
+ +
[docs] def create_update_route_card_dict(self, line: str, updated_properties_dict: dict): + """ + Creates a project card change formatted dictionary for updating + the line. + + Args: + line: name of line that is being updated + updated_properties_dict: dictionary of attributes to update as + 'property': <property name>, + 'set': <new property value> + + Returns: + A project card change-formatted dictionary for the attribute update. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.split("_")[-2].strip("d\"")), + "shape_id": line.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + "properties": updated_properties_dict, + } + WranglerLogger.debug( + "Updating {} route to changes:\n{}".format(line, str(update_card_dict)) + ) + + return update_card_dict
+ +
[docs] def create_delete_route_card_dict( + self, line: str, base_transit_line_properties_dict: dict + ): + """ + Creates a project card change formatted dictionary for deleting a line. + + Args: + line: name of line that is being deleted + base_transit_line_properties_dict: dictionary of cube-style + attribute values in order to find time periods and + start and end times. + + Returns: + A project card change-formatted dictionary for the route deletion. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + base_transit_line_properties_dict + ) + + delete_card_dict = { + "category": "Delete Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + } + WranglerLogger.debug( + "Deleting {} route to changes:\n{}".format(line, delete_card_dict) + ) + + return delete_card_dict
+ +
[docs]    def create_add_route_card_dict(self, line: str):
        """
        Creates a project card change formatted dictionary for adding
        a route based on the information in self.route_properties for
        the line.

        Args:
            line: name of line that is being added

        Returns:
            A project card change-formatted dictionary for the route addition.
        """
        start_time_str, end_time_str = self.calculate_start_end_times(
            self.line_properties[line]
        )

        standard_properties = self.cube_properties_to_standard_properties(
            self.line_properties[line]
        )

        routing_properties = {
            "property": "routing",
            "set": self.shapes[line]["node"].tolist(),
        }

        add_card_dict = {
            "category": "New Transit Service",
            "facility": {
                "route_id": line.split("_")[1],
                "direction_id": int(line.strip('"')[-1]),
                "start_time": start_time_str,
                "end_time": end_time_str,
                "agency_id": line.split("_")[0].strip('"'),
            },
            "properties": standard_properties + [routing_properties],
        }

        WranglerLogger.debug(
            "Adding {} route to changes:\n{}".format(line, add_card_dict)
        )
        return add_card_dict
+ +
[docs] @staticmethod + def get_time_period_numbers_from_cube_properties(properties_list: list): + """ + Finds properties that are associated with time periods and the + returns the numbers in them. + + Args: + properties_list (list): list of all properties. + + Returns: + list of strings of the time period numbers found + """ + time_periods_list = [] + for p in properties_list: + if ("[" not in p) or ("]" not in p): + continue + tp_num = p.split("[")[1][0] + if tp_num and tp_num not in time_periods_list: + time_periods_list.append(tp_num) + return time_periods_list
+ +
[docs]    @staticmethod
    def build_route_name(
        route_id: str = "",
        time_period: str = "",
        agency_id: str = 0,
        direction_id: str = 1,
    ):
        """
        Create a route name by concatenating route, time period, agency, and direction

        Args:
            route_id: i.e. 452-111
            time_period: i.e. pk
            direction_id: i.e. 1
            agency_id: i.e. 0

        Returns:
            constructed line_name i.e. "0_452-111_452_pk1"
        """

        return (
            str(agency_id)
            + "_"
            + str(route_id)
            + "_"
            + str(route_id.split("-")[0])
            + "_"
            + str(time_period)
            + str(direction_id)
        )
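Because build_route_name() is pure string concatenation, its behavior is easy to verify inline; the check below mirrors the example in the docstring (assuming CubeTransit is imported from lasso).

    name = CubeTransit.build_route_name(
        route_id="452-111", time_period="pk", agency_id=0, direction_id=1
    )
    assert name == "0_452-111_452_pk1"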
+ +
[docs] @staticmethod + def unpack_route_name(line_name: str): + """ + Unpacks route name into direction, route, agency, and time period info + + Args: + line_name (str): i.e. "0_452-111_452_pk1" + + Returns: + route_id (str): 452-111 + time_period (str): i.e. pk + direction_id (str) : i.e. 1 + agency_id (str) : i.e. 0 + """ + + line_name = line_name.strip('"') + + agency_id, route_id, _rtid, _tp_direction = line_name.split("_") + time_period = _tp_direction[0:-1] + direction_id = _tp_direction[-1] + + return route_id, time_period, agency_id, direction_id
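unpack_route_name() is the inverse of build_route_name(); a quick round-trip check using the docstring's example name (assuming CubeTransit is imported from lasso):

    route_id, time_period, agency_id, direction_id = CubeTransit.unpack_route_name(
        '"0_452-111_452_pk1"'
    )
    assert (route_id, time_period, agency_id, direction_id) == ("452-111", "pk", "0", "1")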
+ +
[docs] def calculate_start_end_times(self, line_properties_dict: dict): + """ + Calculate the start and end times of the property change + WARNING: Doesn't take care of discongruous time periods!!!! + + Args: + line_properties_dict: dictionary of cube-flavor properties for a transit line + """ + start_time_m = 24 * 60 + end_time_m = 0 * 60 + + WranglerLogger.debug( + "parameters.time_period_properties_list: {}".format( + self.parameters.time_period_properties_list + ) + ) + current_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + line_properties_dict + ) + ) + + WranglerLogger.debug( + "current_cube_time_period_numbers:{}".format( + current_cube_time_period_numbers + ) + ) + + for tp in current_cube_time_period_numbers: + time_period_name = self.parameters.cube_time_periods[tp] + WranglerLogger.debug("time_period_name:{}".format(time_period_name)) + _start_time, _end_time = self.parameters.time_period_to_time[ + time_period_name + ] + + # change from "HH:MM" to integer # of seconds + _start_time_m = (int(_start_time.split(":")[0]) * 60) + int( + _start_time.split(":")[1] + ) + _end_time_m = (int(_end_time.split(":")[0]) * 60) + int( + _end_time.split(":")[1] + ) + + # find bounding start and end times + if _start_time_m < start_time_m: + start_time_m = _start_time_m + if _end_time_m > end_time_m: + end_time_m = _end_time_m + + if start_time_m > end_time_m: + msg = "Start time ({}) is after end time ({})".format( + start_time_m, end_time_m + ) + #WranglerLogger.error(msg) + #raise ValueError(msg) + + start_time_str = "{:02d}:{:02d}".format(*divmod(start_time_m, 60)) + end_time_str = "{:02d}:{:02d}".format(*divmod(end_time_m, 60)) + return start_time_str, end_time_str
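The start/end bounding that calculate_start_end_times() performs, shown without any lasso objects: convert each period's "HH:MM" bounds to minutes, take the minimum start and maximum end, and format back. The time-period windows here are invented.

    periods = {"AM": ("06:00", "09:00"), "MD": ("09:00", "15:30")}   # invented windows

    def to_minutes(hhmm):
        h, m = hhmm.split(":")
        return int(h) * 60 + int(m)

    start_m = min(to_minutes(start) for start, _ in periods.values())
    end_m = max(to_minutes(end) for _, end in periods.values())
    start_str = "{:02d}:{:02d}".format(*divmod(start_m, 60))
    end_str = "{:02d}:{:02d}".format(*divmod(end_m, 60))
    # -> start_str == "06:00", end_str == "15:30"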
+ +
[docs]    @staticmethod
    def cube_properties_to_standard_properties(cube_properties_dict: dict):
        """
        Converts cube-style properties to standard properties.

        This is most pertinent to time-period-specific variables like headway,
        and variables that have standard units, such as headway, which is in minutes
        in cube and in seconds in the standard format.

        Args:
            cube_properties_dict: <cube style property name> : <property value>

        Returns:
            A list of dictionaries with values for `"property": <standard
            style property name>, "set" : <property value with correct units>`

        """
        standard_properties_list = []
        for k, v in cube_properties_dict.items():
            change_item = {}
            if any(i in k for i in ["HEADWAY", "FREQ"]):
                change_item["property"] = "headway_secs"
                change_item["set"] = v * 60
            else:
                change_item["property"] = k
                change_item["set"] = v
            standard_properties_list.append(change_item)

        return standard_properties_list
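A small worked example of the unit handling: cube HEADWAY values are minutes while the standard headway_secs is seconds, so frequency-type properties are multiplied by 60 and everything else passes through unchanged. The property values are invented.

    cube_props = {"HEADWAY[1]": 10, "MODE": 5}   # invented values
    changes = CubeTransit.cube_properties_to_standard_properties(cube_props)
    # -> [{"property": "headway_secs", "set": 600}, {"property": "MODE", "set": 5}]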
+ +
[docs] def evaluate_route_property_differences( + self, + properties_build: dict, + properties_base: dict, + time_period_number: str, + absolute: bool = True, + validate_base: bool = False, + ): + """ + Checks if any values have been updated or added for a specific + route and creates project card entries for each. + + Args: + properties_build: ::<property_name>: <property_value> + properties_base: ::<property_name>: <property_value> + time_period_number: time period to evaluate + absolute: if True, will use `set` command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway + validate_base: if True, will add the `existing` line in the project card + + Returns: + transit_change_list (list): a list of dictionary values suitable for writing to a project card + `{ + 'property': <property_name>, + 'set': <set value>, + 'change': <change from existing value>, + 'existing': <existing value to check>, + }` + + """ + + # Remove time period specific values for things that aren't part of the time period in question + this_time_period_properties_list = [ + p + "[" + str(time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + properties_build.pop(k, None) + properties_base.pop(k, None) + + difference_dict = dict( + set(properties_build.items()) ^ set(properties_base.items()) + ) + + # Iterate through properties list to build difference project card list + + properties_list = [] + for k, v in difference_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + + if absolute: + change_item["set"] = ( + v * 60 + ) # project cards are in secs, cube is in minutes + else: + change_item["change"] = ( + properties_build[k] - properties_base[k] + ) * 60 + if validate_base or not absolute: + change_item["existing"] = properties_base[k] * 60 + else: + change_item["property"] = k + change_item["set"] = v + if validate_base: + change_item["existing"] = properties_base[k] + + properties_list.append(change_item) + WranglerLogger.debug( + "Evaluated Route Changes: \n {})".format( + "\n".join(map(str, properties_list)) + ) + ) + return properties_list
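At its core the comparison takes the symmetric difference of the two property dictionaries' (key, value) pairs, so only properties whose values differ survive; a toy illustration with invented property values:

    build = {"HEADWAY[1]": 10, "MODE": 5}
    base = {"HEADWAY[1]": 15, "MODE": 5}
    changed = dict(set(build.items()) ^ set(base.items()))
    sorted(changed.keys())   # -> ['HEADWAY[1]']; the unchanged MODE pair drops out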
+ +
[docs] @staticmethod + def evaluate_route_shape_changes( + shape_build: DataFrame, shape_base: DataFrame + ): + """ + Compares two route shapes and constructs returns list of changes + suitable for a project card. + + Args: + shape_build: DataFrame of the build-version of the route shape. + shape_base: dDataFrame of the base-version of the route shape. + + Returns: + List of shape changes formatted as a project card-change dictionary. + + """ + + if shape_build.equals(shape_base): + return None + + shape_change_list = [] + + base_node_list = shape_base.tolist() + build_node_list = shape_build.tolist() + + sort_len = max(len(base_node_list), len(build_node_list)) + + start_pos = None + end_pos = None + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + start_pos = i - 1 + break + if base_node_list[i] != build_node_list[i]: + start_pos = i + break + else: + continue + + j = -1 + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + end_pos = j + 1 + break + if base_node_list[j] != build_node_list[j]: + end_pos = j + break + else: + j -= 1 + + if start_pos or end_pos: + existing = base_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + set = build_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + + shape_change_list.append( + {"property": "routing", "existing": existing, "set": set} + ) + + return shape_change_list
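evaluate_route_shape_changes() works on plain node sequences; a toy run with invented node numbers (assuming lasso's CubeTransit is importable) shows the single routing change it emits, including the anchor nodes the method keeps around the altered span for context.

    import pandas as pd

    base_shape = pd.Series([1, 2, 3, 4, 5])
    build_shape = pd.Series([1, 2, 9, 4, 5])    # node 3 rerouted through node 9
    CubeTransit.evaluate_route_shape_changes(build_shape, base_shape)
    # -> [{'property': 'routing', 'existing': [1, 2, 3, 4], 'set': [1, 2, 9, 4]}]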
+ + +
[docs]class StandardTransit(object): + """Holds a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's + Cube Line files. + + .. highlight:: python + Typical usage example: + :: + cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) + cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) + + Attributes: + feed: Partridge Feed object containing read-only access to GTFS feed + parameters (Parameters): Parameters instance containing information + about time periods and variables. + """ + +
[docs] def __init__(self, ptg_feed, parameters: Union[Parameters, dict] = {}): + """ + + Args: + ptg_feed: partridge feed object + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + self.feed = ptg_feed + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] @staticmethod + def fromTransitNetwork( + transit_network_object: TransitNetwork, parameters: Union[Parameters, dict] = {} + ): + """ + Converts a TransitNetwork instance into a StandardTransit instance. + + Args: + transit_network_object: Reference to an instance of TransitNetwork. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit + """ + return StandardTransit(transit_network_object.feed, parameters=parameters)
+ +
[docs] @staticmethod + def read_gtfs(gtfs_feed_dir: str, parameters: Union[Parameters, dict] = {}): + """ + Reads GTFS files from a directory and returns a StandardTransit + instance. + + Args: + gtfs_feed_dir: location of the GTFS files + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit instance + """ + return StandardTransit(ptg.load_feed(gtfs_feed_dir), parameters=parameters)
+ +
[docs] def write_as_cube_lin(self, outpath: str = None): + """ + Writes the gtfs feed as a cube line file after + converting gtfs properties to MetCouncil cube properties. + #MC + Args: + outpath: File location for output cube line file. + + """ + if not outpath: + outpath = os.path.join(self.parameters.scratch_location, "outtransit.lin") + trip_cube_df = self.route_properties_gtfs_to_cube(self) + + trip_cube_df["LIN"] = trip_cube_df.apply(self.cube_format, axis=1) + + l = trip_cube_df["LIN"].tolist() + + with open(outpath, "w") as f: + f.write("\n".join(l))
+ +
[docs] @staticmethod + def route_properties_gtfs_to_cube(self): + """ + Prepare gtfs for cube lin file. + #MC + Does the following operations: + 1. Combines route, frequency, trip, and shape information + 2. Converts time of day to time periods + 3. Calculates cube route name from gtfs route name and properties + 4. Assigns a cube-appropriate mode number + 5. Assigns a cube-appropriate operator number + + Returns: + trip_df (DataFrame): DataFrame of trips with cube-appropriate values for: + - NAME + - ONEWAY + - OPERATOR + - MODE + - HEADWAY + """ + WranglerLogger.info( + "Converting GTFS Standard Properties to MetCouncil's Cube Standard" + ) + metro_operator_dict = { + "0": 3, + "1": 3, + "2": 3, + "3": 4, + "4": 2, + "5": 5, + "6": 8, + "7": 1, + "8": 1, + "9": 10, + "10": 3, + "11": 9, + "12": 3, + "13": 4, + "14": 4, + "15": 3, + } + + shape_df = self.feed.shapes.copy() + trip_df = self.feed.trips.copy() + + """ + Add information from: routes, frequencies, and routetype to trips_df + """ + trip_df = pd.merge(trip_df, self.feed.routes, how="left", on="route_id") + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + inv_cube_time_periods_map = { + v: k for k, v in self.parameters.cube_time_periods.items() + } + trip_df["tod_num"] = trip_df.tod_name.map(inv_cube_time_periods_map) + trip_df["tod_name"] = trip_df.tod_name.map( + self.parameters.cube_time_periods_name + ) + + trip_df["NAME"] = trip_df.apply( + lambda x: x.agency_id + + "_" + + x.route_id + + "_" + + x.route_short_name + + "_" + + x.tod_name + + str(x.direction_id), + axis=1, + ) + + trip_df["LONGNAME"] = trip_df["route_long_name"] + trip_df["HEADWAY"] = (trip_df["headway_secs"] / 60).astype(int) + trip_df["MODE"] = trip_df.apply(self.calculate_cube_mode, axis=1) + trip_df["ONEWAY"] = "T" + trip_df["OPERATOR"] = trip_df["agency_id"].map(metro_operator_dict) + + return trip_df
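As a quick illustration of the conversions above (the trip values below are made up), the headway and route name are derived like this:

    import pandas as pd

    trips = pd.DataFrame(
        {
            "agency_id": ["0"],
            "route_id": ["10"],
            "route_short_name": ["10"],
            "tod_name": ["AM"],
            "direction_id": [0],
            "headway_secs": [600],
        }
    )

    trips["HEADWAY"] = (trips["headway_secs"] / 60).astype(int)  # 600 secs -> 10 minutes
    trips["NAME"] = trips.apply(
        lambda x: x.agency_id + "_" + x.route_id + "_" + x.route_short_name + "_" + x.tod_name + str(x.direction_id),
        axis=1,
    )
    # trips["NAME"].iloc[0] -> "0_10_10_AM0"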
+ +
[docs] def calculate_cube_mode(self, row): + """ + Assigns a cube mode number by following logic. + #MC + For rail, uses GTFS route_type variable: + https://developers.google.com/transit/gtfs/reference + + :: + # route_type : cube_mode + route_type_to_cube_mode = {0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9} # Rail + + For buses, uses route id numbers and route name to find + express and suburban buses as follows: + + :: + if not cube_mode: + if 'express' in row['LONGNAME'].lower(): + cube_mode = 7 # Express + elif int(row['route_id'].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + Args: + row: A DataFrame row with route_type, route_long_name, and route_id + + Returns: + cube mode number + """ + # route_type : cube_mode + route_type_to_cube_mode = { + 0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9, + } # Rail + + cube_mode = route_type_to_cube_mode[row["route_type"]] + + if not cube_mode: + if "express" in row["route_long_name"].lower(): + cube_mode = 7 # Express + elif int(row["route_id"].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + return cube_mode
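A stand-alone sketch of the same decision logic, with made-up rows, shows how buses (route_type 3, cube mode 0) get refined while rail modes pass straight through:

    # route_type : cube_mode, as in the method above
    route_type_to_cube_mode = {0: 8, 3: 0, 2: 9}

    def cube_mode_for(row: dict) -> int:
        cube_mode = route_type_to_cube_mode[row["route_type"]]
        if not cube_mode:  # buses map to 0, which is falsy, so only they reach the refinement below
            if "express" in row["route_long_name"].lower():
                cube_mode = 7  # Express
            elif int(row["route_id"].split("-")[0]) > 99:
                cube_mode = 6  # Suburban Local
            else:
                cube_mode = 5  # Urban Local
        return cube_mode

    cube_mode_for({"route_type": 3, "route_long_name": "Express - Downtown", "route_id": "250-75"})  # -> 7
    cube_mode_for({"route_type": 3, "route_long_name": "Local", "route_id": "10-55"})                # -> 5
    cube_mode_for({"route_type": 0, "route_long_name": "Green Line", "route_id": "902"})             # -> 8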
+ +
[docs] def time_to_cube_time_period( + self, start_time_secs: int, as_str: bool = True, verbose: bool = False + ): + """ + Converts seconds from midnight to the cube time period. + + Args: + start_time_secs: start time for transit trip in seconds + from midnight + as_str: if True, returns the time period as a string, + otherwise returns a numeric time period + + Returns: + this_tp_num: if as_str is False, returns the numeric + time period + this_tp: if as_str is True, returns the Cube time period + name abbreviation + """ + from .util import hhmmss_to_datetime, secs_to_datetime + + # set initial time as the time that spans midnight + + start_time_dt = secs_to_datetime(start_time_secs) + + # set initial time as the time that spans midnight + this_tp = "NA" + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + _dt_end_time = hhmmss_to_datetime(_end_time) + if _dt_start_time > _dt_end_time: + this_tp = tp_name + break + + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + if start_time_dt >= _dt_start_time: + this_time = _dt_start_time + this_tp = tp_name + + if verbose: + WranglerLogger.debug( + "Finding Cube Time Period from Start Time: \ + \n - start_time_sec: {} \ + \n - start_time_dt: {} \ + \n - this_tp: {}".format( + start_time_secs, start_time_dt, this_tp + ) + ) + + if as_str: + return this_tp + + name_to_num = {v: k for k, v in self.parameters.cube_time_periods.items()} + this_tp_num = name_to_num.get(this_tp) + + if not this_tp_num: + msg = ( + "Cannot find time period number in {} for time period name: {}".format( + name_to_num, this_tp + ) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + return this_tp_num
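Ignoring the spans-midnight special case, the lookup above boils down to "the latest period whose start time has already passed". Here is a simplified sketch with a hypothetical period table (the real values come from Parameters.time_period_to_time):

    import datetime

    time_period_to_time = {          # hypothetical values for illustration only
        "EA": ("3:00", "6:00"),
        "AM": ("6:00", "10:00"),
        "MD": ("10:00", "15:00"),
    }

    def period_for(start_time_secs: int) -> str:
        start_dt = (datetime.datetime.min + datetime.timedelta(seconds=start_time_secs)).time()
        this_tp = "NA"
        for tp_name, (tp_start, _tp_end) in time_period_to_time.items():
            tp_start_dt = datetime.time(*[int(i) for i in tp_start.split(":")])
            if start_dt >= tp_start_dt:
                this_tp = tp_name  # keep the latest period that has already started
        return this_tp

    period_for(7 * 3600)   # 07:00 -> "AM"
    period_for(12 * 3600)  # 12:00 -> "MD"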
+ +
[docs] def shape_gtfs_to_cube(self, row, add_nntime = False): + """ + Creates a list of nodes that for the route in appropriate + cube format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a string representation of the node list + for a route in cube format. + + """ + agency_raw_name = row.agency_raw_name + shape_id = row.shape_id + trip_id = row.trip_id + + trip_stop_times_df = self.feed.stop_times.copy() + + if 'agency_raw_name' in trip_stop_times_df.columns: + trip_stop_times_df.drop('agency_raw_name', axis = 1, inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, + self.feed.trips[['trip_id', 'agency_raw_name']], + how = 'left', + on = ['trip_id'] + ) + + trip_stop_times_df = trip_stop_times_df[ + (trip_stop_times_df.trip_id == row.trip_id) & + (trip_stop_times_df.agency_raw_name == agency_raw_name) + ] + + trip_node_df = self.feed.shapes.copy() + if 'agency_raw_name' in trip_node_df.columns: + trip_node_df.drop('agency_raw_name', axis = 1, inplace = True) + + trip_node_df = pd.merge( + trip_node_df, + self.feed.trips[['shape_id', 'agency_raw_name']].drop_duplicates(), + how = 'left', + on = ['shape_id'] + ) + + trip_node_df = trip_node_df[ + (trip_node_df.shape_id == shape_id) & + (trip_node_df.agency_raw_name == agency_raw_name) + ] + + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + stops_df = self.feed.stops.copy() + stops_df['stop_id'] = stops_df['stop_id'].astype(float).astype(int) + trip_stop_times_df['stop_id'] = trip_stop_times_df['stop_id'].astype(float).astype(int) + + if 'trip_id' in self.feed.stops.columns: + if agency_raw_name != 'sjrtd_2015_0127': + stops_df = stops_df[stops_df.agency_raw_name != 'sjrtd_2015_0127'] + trip_stop_times_df = pd.merge( + trip_stop_times_df, stops_df.drop('trip_id', axis = 1), how="left", on=["stop_id"] + ) + else: + stops_df = stops_df[stops_df.agency_raw_name == 'sjrtd_2015_0127'] + stops_df['trip_id'] = stops_df['trip_id'].astype(float).astype(int).astype(str) + trip_stop_times_df['trip_id'] = trip_stop_times_df['trip_id'].astype(float).astype(int).astype(str) + trip_stop_times_df = pd.merge( + trip_stop_times_df, stops_df, how="left", on=['agency_raw_name', 'trip_id',"stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, stops_df, how="left", on="stop_id" + ) + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. 
VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + trip_runtime = round(trip_stop_times_df[trip_stop_times_df['NNTIME'] > 0]['NNTIME'].sum(),2) + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + return node_list_str, trip_runtime
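The node block assembled above follows the Cube convention of positive ids for stops and negative ids for pass-through nodes. A toy sketch of just that sign convention (made-up node ids, without the NNTIME/ACCESS decoration):

    shape_nodes = [101, 102, 103, 104]   # made-up model_node_ids along the shape
    stop_nodes = {101, 103, 104}         # made-up subset that are stops

    parts = [f" {n}" if n in stop_nodes else f" -{n}" for n in shape_nodes]
    node_block = ",\n".join(parts)
    # node_block:
    #  101,
    #  -102,
    #  103,
    #  104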
+ + +
[docs] def cube_format(self, row): + """ + Creates a string representing the route in cube line file notation. + #MC + Args: + row: row of a DataFrame representing a cube-formatted trip, with the attributes + trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR + + Returns: + string representation of route in cube line file notation + """ + + s = '\nLINE NAME="{}",'.format(row.NAME) + s += '\n LONGNAME="{}",'.format(row.LONGNAME) + s += "\n HEADWAY[{}]={},".format(row.tod_num, row.HEADWAY) + s += "\n MODE={},".format(row.MODE) + s += "\n ONEWAY={},".format(row.ONEWAY) + s += "\n OPERATOR={},".format(row.OPERATOR) + s += "\n NODES={}".format(self.shape_gtfs_to_cube(row)) + + return s
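For orientation, the record assembled here looks roughly like the following; all values are illustrative, and the node block comes from shape_gtfs_to_cube:

    LINE NAME="0_10_10_AM0",
     LONGNAME="Route 10",
     HEADWAY[1]=10,
     MODE=5,
     ONEWAY=T,
     OPERATOR=3,
     NODES=
      101,
      -102,
      103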
+ +
[docs] def shape_gtfs_to_emme(self, trip_row): + """ + Creates transit segment for the trips in appropriate + emme format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a dataframe representation of the transit segment + for a trip in emme format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == trip_row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == trip_row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # node list + stop_seq = 0 + nntimes = [] + allow_alightings=[] + allow_boardings=[] + stop_names=[] + + if trip_row.TM2_line_haul_name in ["Light rail", "Heavy rail", "Commuter rail", "Ferry service"]: + add_nntime = True + else: + add_nntime = False + + for nodeIdx in range(len(trip_node_list)): + + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + nntimes.append(nntime_v) + else: + nntimes.append(0) + + pickup_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"pickup_type"].iloc[0] + if pickup_type in [1, "1"]: + allow_alightings.append(0) + else: + allow_alightings.append(1) + + drop_off_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"drop_off_type"].iloc[0] + if drop_off_type in [1, "1"]: + allow_boardings.append(0) + else: + allow_boardings.append(1) + + stop_name = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"stop_name"].iloc[0] + stop_names.append(stop_name) + + else: + nntimes.append(0) + allow_alightings.append(0) + allow_boardings.append(0) + stop_names.append("") + + trip_node_df['time_minutes'] = nntimes + trip_node_df['allow_alightings'] = allow_alightings + trip_node_df['allow_boardings'] = allow_boardings + trip_node_df['stop_name'] = stop_names + trip_node_df['line_id'] = trip_row['line_id'] + trip_node_df['node_id'] = trip_node_df['shape_model_node_id'].astype(int) + trip_node_df['stop_order'] = trip_node_df['shape_pt_sequence'] + + return trip_node_df
+ +
[docs] def evaluate_differences(self, transit_changes): + """ + Compare changes from the transit_changes dataframe with the standard transit network + returns the project card changes in dictionary format + """ + + # simple properties change + trip_df = self.feed.trips.copy() + + mode_crosswalk = pd.read_csv(self.parameters.mode_crosswalk_file) + mode_crosswalk.drop_duplicates(subset = ["agency_raw_name", "route_type", "is_express_bus"], inplace = True) + + trip_df = pd.merge(trip_df, self.feed.routes.drop("agency_raw_name", axis = 1), how="left", on="route_id") + + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod"] = trip_df.start_time.apply(self.time_to_cube_time_period, as_str = False) + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + + trip_df["headway_minutes"] = (trip_df["headway_secs"] / 60).astype(int) + + trip_df = pd.merge(trip_df, self.feed.agency[["agency_name", "agency_raw_name", "agency_id"]], how = "left", on = ["agency_raw_name", "agency_id"]) + + # identify express bus + from .mtc import _is_express_bus + trip_df["is_express_bus"] = trip_df.apply(lambda x: _is_express_bus(x), axis = 1) + trip_df.drop("agency_name", axis = 1 , inplace = True) + + trip_df = pd.merge( + trip_df, + mode_crosswalk.drop("agency_id", axis = 1), + how = "left", + on = ["agency_raw_name", "route_type", "is_express_bus"] + ) + + trip_df["line_id"] = trip_df.apply( + lambda x: str(x.TM2_operator) + + "_" + + str(x.route_id) + + "_" + + x.tod_name + + "_" + + "d" + + str(int(x.direction_id)) + + "_s" + + x.shape_id, + axis=1, + ) + + trip_df["line_id"] = trip_df["line_id"].str.slice(stop = 28) + + project_card_changes = [] + + # lines updated + transit_changes['line_id'] = transit_changes.apply( + lambda x: '-'.join(x['element_id'].split('-')[:-3]) if + x['object'] == 'TRANSIT_STOP' else + x['element_id'], + axis = 1 + ) + + lines_updated_df = transit_changes[ + (transit_changes['operation'] == 'C') & + (transit_changes['line_id'].isin(trip_df['line_id'].tolist())) + ].copy() + + ######################### + # simple property changes + ######################### + + property_changes_df = lines_updated_df[ + lines_updated_df.object == 'TRANSIT_LINE' + ].copy() + + property_attribute_list = ['headway_secs'] + + for index, row in property_changes_df.iterrows(): + line_id = row['line_id'] + properties_list = [] + change_item = {} + for c in property_attribute_list: + existing_value = int(trip_df[ + trip_df['line_id'] == line_id + ][c].iloc[0]) + + change_item["existing"] = existing_value + + if c == 'headway_secs': + change_item["set"] = row['headway'] * 60 + else: + change_item["set"] = row[c] + + change_item["property"] = c + + properties_list.append(change_item) + + property_changes_df.loc[index, 'properties'] = properties_list + + ############### + # shape changes + ############### + + shape_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_SHAPE']) + ].copy() + + for index, row in shape_changes_df.iterrows(): + line_id = row.line_id + + # get base shape + trip_row = trip_df[trip_df.line_id == line_id].copy().squeeze() + + base_shape = self.shape_gtfs_to_emme( + trip_row=trip_row + ) + base_shape['shape_model_node_id'] = base_shape['shape_model_node_id'].astype(int) + + # get build shape + build_shape = row.new_itinerary + + updated_shapes = CubeTransit.evaluate_route_shape_changes( + shape_base = base_shape.shape_model_node_id, + shape_build = pd.Series(row.new_itinerary) + ) + 
updated_shapes[0]['property'] = 'shapes' + shape_changes_df.loc[index, 'properties'] = updated_shapes + + ############## + # stop changes + ############## + stop_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_STOP']) + ].copy() + + stop_attribute_list = ['allow_alightings', 'allow_boardings'] + + stop_changes_df = stop_changes_df.groupby( + ['line_id','i_node'] + )[stop_attribute_list].last().reset_index() + + stop_attribute_changes_df = pd.DataFrame() + + for attribute in stop_attribute_list: + + attribute_df = stop_changes_df.groupby( + ['line_id', attribute] + )['i_node'].apply(list).reset_index() + attribute_df['properties'] = attribute_df.apply( + lambda x: { + 'property' : attribute if x[attribute] == True else 'no_'+attribute.split('_')[-1], + 'set': x['i_node']}, + axis = 1 + ) + + stop_attribute_changes_df = pd.concat( + [stop_attribute_changes_df, + attribute_df[['line_id', 'properties']]], + sort = False, + ignore_index = True + ) + + ############## + # combine all transit changes + ############## + transit_changes_df = pd.concat( + [ + property_changes_df, + shape_changes_df, + stop_attribute_changes_df + ], + sort = False, + ignore_index = True + ) + + # groupby line_id + transit_changes_df = transit_changes_df.groupby( + ['line_id'] + )['properties'].apply(list).reset_index() + + # create change items by line_id + for index, row in transit_changes_df.iterrows(): + line_id = row['line_id'] + base_start_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[0] + + base_end_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[1] + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line_id.split("_")[1], + "direction_id": int(line_id.split("_")[-2].strip("d\"")), + "shape_id": line_id.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str + }, + "properties": row['properties'], + } + + project_card_changes.append(update_card_dict) + + return project_card_changes
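Each entry appended to project_card_changes therefore has the following shape (the values here are hypothetical):

    update_card_dict = {
        "category": "Transit Service Property Change",
        "facility": {
            "route_id": "10",
            "direction_id": 0,
            "shape_id": "p_123",
            "start_time": "06:00:00",
            "end_time": "10:00:00",
        },
        "properties": [
            {"property": "headway_secs", "existing": 900, "set": 600},
        ],
    }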
+ +class CubeTransformer(Transformer): + """A lark-parsing Transformer which transforms the parse-tree to + a dictionary. + + .. highlight:: python + Typical usage example: + :: + transformed_tree_data = CubeTransformer().transform(parse_tree) + + Attributes: + line_order (int): a dynamic counter to hold the order of the nodes within + a route shape + lines_list (list): a list of the line names + """ + + def __init__(self): + self.line_order = 0 + self.lines_list = [] + + def lines(self, line): + # WranglerLogger.debug("lines: \n {}".format(line)) + + # This MUST be a tuple because it returns to start in the tree + lines = {k: v for k, v in line} + return ("lines", lines) + + @v_args(inline=True) + def program_type_line(self, PROGRAM_TYPE, whitespace=None): + # WranglerLogger.debug("program_type_line:{}".format(PROGRAM_TYPE)) + self.program_type = PROGRAM_TYPE.value + + # This MUST be a tuple because it returns to start in the tree + return ("program_type", PROGRAM_TYPE.value) + + @v_args(inline=True) + def line(self, lin_attributes, nodes): + # WranglerLogger.debug("line...attributes:\n {}".format(lin_attributes)) + # WranglerLogger.debug("line...nodes:\n {}".format(nodes)) + lin_name = lin_attributes["NAME"] + + self.line_order = 0 + # WranglerLogger.debug("parsing: {}".format(lin_name)) + + return (lin_name, {"line_properties": lin_attributes, "line_shape": nodes}) + + @v_args(inline=True) + def lin_attributes(self, *lin_attr): + lin_attr = {k: v for (k, v) in lin_attr} + # WranglerLogger.debug("lin_attributes: {}".format(lin_attr)) + return lin_attr + + @v_args(inline=True) + def lin_attr(self, lin_attr_name, attr_value, SEMICOLON_COMMENT=None): + # WranglerLogger.debug("lin_attr {}: {}".format(lin_attr_name, attr_value)) + return lin_attr_name, attr_value + + def lin_attr_name(self, args): + attr_name = args[0].value.upper() + # WranglerLogger.debug(".......args {}".format(args)) + if attr_name in ["FREQ", "HEADWAY"]: + attr_name = attr_name + "[" + str(args[2]) + "]" + return attr_name + + def attr_value(self, attr_value): + try: + return int(attr_value[0].value) + except: + return attr_value[0].value + + def nodes(self, lin_node): + lin_node = DataFrame(lin_node) + # WranglerLogger.debug("nodes:\n {}".format(lin_node)) + + return lin_node + + @v_args(inline=True) + def lin_node(self, NODE_NUM, SEMICOLON_COMMENT=None, *lin_nodeattr): + self.line_order += 1 + n = int(NODE_NUM.value) + return {"node_id": abs(n), "node": n, "stop": n > 0, "order": self.line_order} + + start = dict + + +TRANSIT_LINE_FILE_GRAMMAR = r""" + +start : program_type_line? lines +WHITESPACE : /[ \t\r\n]/+ +STRING : /("(?!"").*?(?<!\\)(\\\\)*?"|'(?!'').*?(?<!\\)(\\\\)*?')/i +SEMICOLON_COMMENT : /;[^\n]*/ +BOOLEAN : "T"i | "F"i +program_type_line : ";;<<" PROGRAM_TYPE ">><<LINE>>;;" WHITESPACE? +PROGRAM_TYPE : "PT" | "TRNBUILD" + +lines : line* +line : "LINE" lin_attributes nodes + +lin_attributes : lin_attr+ +lin_attr : lin_attr_name "=" attr_value "," SEMICOLON_COMMENT* +TIME_PERIOD : "1".."5" +!lin_attr_name : "allstops"i + | "color"i + | ("freq"i "[" TIME_PERIOD "]") + | ("headway"i "[" TIME_PERIOD "]") + | "mode"i + | "name"i + | "oneway"i + | "owner"i + | "runtime"i + | "timefac"i + | "xyspeed"i + | "longname"i + | "shortname"i + | ("usera1"i) + | ("usera2"i) + | "circular"i + | "vehicletype"i + | "operator"i + | "faresystem"i + +attr_value : BOOLEAN | STRING | SIGNED_INT | FLOAT + +nodes : lin_node+ +lin_node : ("N" | "NODES")? "="? NODE_NUM ","? SEMICOLON_COMMENT? 
lin_nodeattr* +NODE_NUM : SIGNED_INT +lin_nodeattr : lin_nodeattr_name "=" attr_value ","? SEMICOLON_COMMENT* +!lin_nodeattr_name : "access_c"i + | "access"i + | "delay"i + | "xyspeed"i + | "timefac"i + | "nntime"i + | "time"i + +operator : SEMICOLON_COMMENT* "OPERATOR" opmode_attr* SEMICOLON_COMMENT* +mode : SEMICOLON_COMMENT* "MODE" opmode_attr* SEMICOLON_COMMENT* +opmode_attr : ( (opmode_attr_name "=" attr_value) ","? ) +opmode_attr_name : "number" | "name" | "longname" + +%import common.SIGNED_INT +%import common.FLOAT +%import common.WS +%ignore WS + +""" +
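A minimal sketch of how the grammar and transformer are used together, assuming this module is importable and "transit.lin" is a hypothetical Cube line file path:

    from lark import Lark

    parser = Lark(TRANSIT_LINE_FILE_GRAMMAR)      # lark's default Earley parser
    with open("transit.lin") as f:
        parse_tree = parser.parse(f.read())

    line_data = CubeTransformer().transform(parse_tree)
    line_names = list(line_data["lines"].keys())  # dict keyed by line name, per the transformer above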
+ +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_modules/lasso/util/index.html b/branch/bicounty_2035_hwy_update/_modules/lasso/util/index.html new file mode 100644 index 0000000..f82d773 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_modules/lasso/util/index.html @@ -0,0 +1,256 @@ + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.util

+from functools import partial
+import pyproj
+from shapely.ops import transform
+from shapely.geometry import Point, Polygon
+import re
+from unidecode import unidecode
+
+
[docs]def get_shared_streets_intersection_hash(lat, long, osm_node_id=None): + """ + Calculated per: + https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565 + Expected in/out + -93.0965985, 44.952112199999995 osm_node_id = 954734870 + 69f13f881649cb21ee3b359730790bb9 + + """ + import hashlib + + message = "Intersection {0:.5f} {0:.5f}".format(long, lat) + if osm_node_id: + message += " {}".format(osm_node_id) + unhashed = message.encode("utf-8") + hash = hashlib.md5(unhashed).hexdigest() + return hash
+ + +
[docs]def hhmmss_to_datetime(hhmmss_str: str): + """ + Creates a datetime time object from a string of hh:mm:ss + + Args: + hhmmss_str: string of hh:mm:ss + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = datetime.time(*[int(i) for i in hhmmss_str.split(":")]) + + return dt
+ + +
[docs]def secs_to_datetime(secs: int): + """ + Creates a datetime time object from a seconds from midnight + + Args: + secs: seconds from midnight + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = (datetime.datetime.min + datetime.timedelta(seconds=secs)).time() + + return dt
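For example, both helpers map the same clock time to a datetime.time:

    hhmmss_to_datetime("06:30:00")        # -> datetime.time(6, 30)
    secs_to_datetime(6 * 3600 + 30 * 60)  # -> datetime.time(6, 30)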
+ + +
[docs]def geodesic_point_buffer(lat, lon, meters): + """ + creates circular buffer polygon for node + + Args: + lat: node lat + lon: node lon + meters: buffer distance, radius of circle + Returns: + Polygon + """ + proj_wgs84 = pyproj.Proj('+proj=longlat +datum=WGS84') + # Azimuthal equidistant projection + aeqd_proj = '+proj=aeqd +lat_0={lat} +lon_0={lon} +x_0=0 +y_0=0' + project = partial( + pyproj.transform, + pyproj.Proj(aeqd_proj.format(lat=lat, lon=lon)), + proj_wgs84) + buf = Point(0, 0).buffer(meters) # distance in meters + return Polygon(transform(project, buf).exterior.coords[:])
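Typical usage is a small buffer around a network node, e.g. (illustrative coordinates; note that newer pyproj releases may warn about the legacy pyproj.transform call used above):

    buffer_poly = geodesic_point_buffer(44.95, -93.09, 100)  # ~100 m circle around lat 44.95, lon -93.09
    buffer_poly.bounds                                       # WGS84 lon/lat extents of the buffer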
+ +
[docs]def create_locationreference(node, link): + node['X'] = node['geometry'].apply(lambda p: p.x) + node['Y'] = node['geometry'].apply(lambda p: p.y) + node['point'] = [list(xy) for xy in zip(node.X, node.Y)] + node_dict = dict(zip(node.model_node_id, node.point)) + + link['A_point'] = link['A'].map(node_dict) + link['B_point'] = link['B'].map(node_dict) + link['locationReferences'] = link.apply(lambda x: [{'sequence':1, + 'point': x['A_point'], + 'distanceToNextRef':x['length'], + 'bearing' : 0, + 'intersectionId':x['fromIntersectionId']}, + {'sequence':2, + 'point': x['B_point'], + 'intersectionId':x['toIntersectionId']}], + axis = 1)
+ +
[docs]def column_name_to_parts(c, parameters=None): + + if not parameters: + from .parameters import Parameters + + parameters = Parameters() + + if c[0:2] == "ML": + managed = True + else: + managed = False + + time_period = None + category = None + + if c.split("_")[0] not in parameters.properties_to_split.keys(): + return c, None, None, managed + + tps = parameters.time_period_to_time.keys() + cats = parameters.categories.keys() + + if c.split("_")[-1] in tps: + time_period = c.split("_")[-1] + base_name = c.split(time_period)[-2][:-1] + if c.split("_")[-2] in cats: + category = c.split("_")[-2] + base_name = c.split(category)[-2][:-1] + elif c.split("_")[-1] in cats: + category = c.split("_")[-1] + base_name = c.split(category)[-2][:-1] + else: + msg = "Can't split property correctly: {}".format(c) + WranglerLogger.error(msg) + + return base_name, time_period, category, managed
+ +
[docs]def shorten_name(name): + if type(name) == str: + name_list = name.split(',') + else: + name_list = name + name_list = [re.sub(r'\W+', ' ', c).replace('nan', '').strip(' ') for c in name_list] + + name_list = list(set(name_list)) + #name_list.remove('') + + name_new = ' '.join(name_list).strip(' ') + + # convert non english character to english + name_new = unidecode(name_new) + + return name_new
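For example:

    shorten_name("Main St,Main St,nan")  # -> "Main St"  (duplicates and 'nan' placeholders dropped)
    shorten_name("Cañada Rd")            # -> "Canada Rd" (non-ASCII transliterated by unidecode)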
+
+ +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_modules/shapely/geometry/point/index.html b/branch/bicounty_2035_hwy_update/_modules/shapely/geometry/point/index.html new file mode 100644 index 0000000..a0ef069 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_modules/shapely/geometry/point/index.html @@ -0,0 +1,252 @@ + + + + + + shapely.geometry.point — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.point

+"""Points and related utilities
+"""
+import numpy as np
+
+import shapely
+from shapely.errors import DimensionError
+from shapely.geometry.base import BaseGeometry
+
+__all__ = ["Point"]
+
+
+
[docs]class Point(BaseGeometry): + """ + A geometry type that represents a single coordinate with + x,y and possibly z values. + + A point is a zero-dimensional feature and has zero length and zero area. + + Parameters + ---------- + args : float, or sequence of floats + The coordinates can either be passed as a single parameter, or as + individual float values using multiple parameters: + + 1) 1 parameter: a sequence or array-like of with 2 or 3 values. + 2) 2 or 3 parameters (float): x, y, and possibly z. + + Attributes + ---------- + x, y, z : float + Coordinate values + + Examples + -------- + Constructing the Point using separate parameters for x and y: + + >>> p = Point(1.0, -1.0) + + Constructing the Point using a list of x, y coordinates: + + >>> p = Point([1.0, -1.0]) + >>> print(p) + POINT (1 -1) + >>> p.y + -1.0 + >>> p.x + 1.0 + """ + + __slots__ = [] + + def __new__(self, *args): + if len(args) == 0: + # empty geometry + # TODO better constructor + return shapely.from_wkt("POINT EMPTY") + elif len(args) > 3: + raise TypeError(f"Point() takes at most 3 arguments ({len(args)} given)") + elif len(args) == 1: + coords = args[0] + if isinstance(coords, Point): + return coords + + # Accept either (x, y) or [(x, y)] + if not hasattr(coords, "__getitem__"): # generators + coords = list(coords) + coords = np.asarray(coords).squeeze() + else: + # 2 or 3 args + coords = np.array(args).squeeze() + + if coords.ndim > 1: + raise ValueError( + f"Point() takes only scalar or 1-size vector arguments, got {args}" + ) + if not np.issubdtype(coords.dtype, np.number): + coords = [float(c) for c in coords] + geom = shapely.points(coords) + if not isinstance(geom, Point): + raise ValueError("Invalid values passed to Point constructor") + return geom + + # Coordinate getters and setters + + @property + def x(self): + """Return x coordinate.""" + return shapely.get_x(self) + + @property + def y(self): + """Return y coordinate.""" + return shapely.get_y(self) + + @property + def z(self): + """Return z coordinate.""" + if not shapely.has_z(self): + raise DimensionError("This point has no z coordinate.") + # return shapely.get_z(self) -> get_z only supported for GEOS 3.7+ + return self.coords[0][2] + + @property + def __geo_interface__(self): + return {"type": "Point", "coordinates": self.coords[0]} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG circle element for the Point geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG circle diameter. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + return ( + '<circle cx="{0.x}" cy="{0.y}" r="{1}" ' + 'stroke="#555555" stroke-width="{2}" fill="{3}" opacity="{4}" />' + ).format(self, 3.0 * scale_factor, 1.0 * scale_factor, fill_color, opacity)
+ + @property + def xy(self): + """Separate arrays of X and Y coordinate values + + Example: + >>> x, y = Point(0, 0).xy + >>> list(x) + [0.0] + >>> list(y) + [0.0] + """ + return self.coords.xy
+ + +shapely.lib.registry[0] = Point +
+ +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_modules/shapely/geometry/polygon/index.html b/branch/bicounty_2035_hwy_update/_modules/shapely/geometry/polygon/index.html new file mode 100644 index 0000000..f73f375 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_modules/shapely/geometry/polygon/index.html @@ -0,0 +1,462 @@ + + + + + + shapely.geometry.polygon — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.geometry.polygon

+"""Polygons and their linear ring components
+"""
+
+import numpy as np
+
+import shapely
+from shapely.algorithms.cga import is_ccw_impl, signed_area
+from shapely.errors import TopologicalError
+from shapely.geometry.base import BaseGeometry
+from shapely.geometry.linestring import LineString
+from shapely.geometry.point import Point
+
+__all__ = ["Polygon", "LinearRing"]
+
+
+def _unpickle_linearring(wkb):
+    linestring = shapely.from_wkb(wkb)
+    srid = shapely.get_srid(linestring)
+    linearring = shapely.linearrings(shapely.get_coordinates(linestring))
+    if srid:
+        linearring = shapely.set_srid(linearring, srid)
+    return linearring
+
+
+class LinearRing(LineString):
+    """
+    A geometry type composed of one or more line segments
+    that forms a closed loop.
+
+    A LinearRing is a closed, one-dimensional feature.
+    A LinearRing that crosses itself or touches itself at a single point is
+    invalid and operations on it may fail.
+
+    Parameters
+    ----------
+    coordinates : sequence
+        A sequence of (x, y [,z]) numeric coordinate pairs or triples, or
+        an array-like with shape (N, 2) or (N, 3).
+        Also can be a sequence of Point objects.
+
+    Notes
+    -----
+    Rings are automatically closed. There is no need to specify a final
+    coordinate pair identical to the first.
+
+    Examples
+    --------
+    Construct a square ring.
+
+    >>> ring = LinearRing( ((0, 0), (0, 1), (1 ,1 ), (1 , 0)) )
+    >>> ring.is_closed
+    True
+    >>> list(ring.coords)
+    [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0), (0.0, 0.0)]
+    >>> ring.length
+    4.0
+
+    """
+
+    __slots__ = []
+
+    def __new__(self, coordinates=None):
+        if coordinates is None:
+            # empty geometry
+            # TODO better way?
+            return shapely.from_wkt("LINEARRING EMPTY")
+        elif isinstance(coordinates, LineString):
+            if type(coordinates) == LinearRing:
+                # return original objects since geometries are immutable
+                return coordinates
+            elif not coordinates.is_valid:
+                raise TopologicalError("An input LineString must be valid.")
+            else:
+                # LineString
+                # TODO convert LineString to LinearRing more directly?
+                coordinates = coordinates.coords
+
+        else:
+            if hasattr(coordinates, "__array__"):
+                coordinates = np.asarray(coordinates)
+            if isinstance(coordinates, np.ndarray) and np.issubdtype(
+                coordinates.dtype, np.number
+            ):
+                pass
+            else:
+                # check coordinates on points
+                def _coords(o):
+                    if isinstance(o, Point):
+                        return o.coords[0]
+                    else:
+                        return [float(c) for c in o]
+
+                coordinates = np.array([_coords(o) for o in coordinates])
+                if not np.issubdtype(coordinates.dtype, np.number):
+                    # conversion of coords to 2D array failed, this might be due
+                    # to inconsistent coordinate dimensionality
+                    raise ValueError("Inconsistent coordinate dimensionality")
+
+        if len(coordinates) == 0:
+            # empty geometry
+            # TODO better constructor + should shapely.linearrings handle this?
+            return shapely.from_wkt("LINEARRING EMPTY")
+
+        geom = shapely.linearrings(coordinates)
+        if not isinstance(geom, LinearRing):
+            raise ValueError("Invalid values passed to LinearRing constructor")
+        return geom
+
+    @property
+    def __geo_interface__(self):
+        return {"type": "LinearRing", "coordinates": tuple(self.coords)}
+
+    def __reduce__(self):
+        """WKB doesn't differentiate between LineString and LinearRing so we
+        need to move the coordinate sequence into the correct geometry type"""
+        return (_unpickle_linearring, (shapely.to_wkb(self, include_srid=True),))
+
+    @property
+    def is_ccw(self):
+        """True is the ring is oriented counter clock-wise"""
+        return bool(is_ccw_impl()(self))
+
+    @property
+    def is_simple(self):
+        """True if the geometry is simple, meaning that any self-intersections
+        are only at boundary points, else False"""
+        return bool(shapely.is_simple(self))
+
+
+shapely.lib.registry[2] = LinearRing
+
+
+class InteriorRingSequence:
+
+    _parent = None
+    _ndim = None
+    _index = 0
+    _length = 0
+
+    def __init__(self, parent):
+        self._parent = parent
+        self._ndim = parent._ndim
+
+    def __iter__(self):
+        self._index = 0
+        self._length = self.__len__()
+        return self
+
+    def __next__(self):
+        if self._index < self._length:
+            ring = self._get_ring(self._index)
+            self._index += 1
+            return ring
+        else:
+            raise StopIteration
+
+    def __len__(self):
+        return shapely.get_num_interior_rings(self._parent)
+
+    def __getitem__(self, key):
+        m = self.__len__()
+        if isinstance(key, int):
+            if key + m < 0 or key >= m:
+                raise IndexError("index out of range")
+            if key < 0:
+                i = m + key
+            else:
+                i = key
+            return self._get_ring(i)
+        elif isinstance(key, slice):
+            res = []
+            start, stop, stride = key.indices(m)
+            for i in range(start, stop, stride):
+                res.append(self._get_ring(i))
+            return res
+        else:
+            raise TypeError("key must be an index or slice")
+
+    def _get_ring(self, i):
+        return shapely.get_interior_ring(self._parent, i)
+
+
+
[docs]class Polygon(BaseGeometry): + """ + A geometry type representing an area that is enclosed by a linear ring. + + A polygon is a two-dimensional feature and has a non-zero area. It may + have one or more negative-space "holes" which are also bounded by linear + rings. If any rings cross each other, the feature is invalid and + operations on it may fail. + + Parameters + ---------- + shell : sequence + A sequence of (x, y [,z]) numeric coordinate pairs or triples, or + an array-like with shape (N, 2) or (N, 3). + Also can be a sequence of Point objects. + holes : sequence + A sequence of objects which satisfy the same requirements as the + shell parameters above + + Attributes + ---------- + exterior : LinearRing + The ring which bounds the positive space of the polygon. + interiors : sequence + A sequence of rings which bound all existing holes. + + Examples + -------- + Create a square polygon with no holes + + >>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.)) + >>> polygon = Polygon(coords) + >>> polygon.area + 1.0 + """ + + __slots__ = [] + + def __new__(self, shell=None, holes=None): + if shell is None: + # empty geometry + # TODO better way? + return shapely.from_wkt("POLYGON EMPTY") + elif isinstance(shell, Polygon): + # return original objects since geometries are immutable + return shell + else: + shell = LinearRing(shell) + + if holes is not None: + if len(holes) == 0: + # shapely constructor cannot handle holes=[] + holes = None + else: + holes = [LinearRing(ring) for ring in holes] + + geom = shapely.polygons(shell, holes=holes) + if not isinstance(geom, Polygon): + raise ValueError("Invalid values passed to Polygon constructor") + return geom + + @property + def exterior(self): + return shapely.get_exterior_ring(self) + + @property + def interiors(self): + if self.is_empty: + return [] + return InteriorRingSequence(self) + + @property + def coords(self): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not" + ) + + def __eq__(self, other): + if not isinstance(other, BaseGeometry): + return NotImplemented + if not isinstance(other, Polygon): + return False + check_empty = (self.is_empty, other.is_empty) + if all(check_empty): + return True + elif any(check_empty): + return False + my_coords = [self.exterior.coords] + [ + interior.coords for interior in self.interiors + ] + other_coords = [other.exterior.coords] + [ + interior.coords for interior in other.interiors + ] + if not len(my_coords) == len(other_coords): + return False + # equal_nan=False is the default, but not yet available for older numpy + return np.all( + [ + np.array_equal(left, right) # , equal_nan=False) + for left, right in zip(my_coords, other_coords) + ] + ) + + def __hash__(self): + return super().__hash__() + + @property + def __geo_interface__(self): + if self.exterior == LinearRing(): + coords = [] + else: + coords = [tuple(self.exterior.coords)] + for hole in self.interiors: + coords.append(tuple(hole.coords)) + return {"type": "Polygon", "coordinates": tuple(coords)} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG path element for the Polygon geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG stroke-width. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + exterior_coords = [["{},{}".format(*c) for c in self.exterior.coords]] + interior_coords = [ + ["{},{}".format(*c) for c in interior.coords] for interior in self.interiors + ] + path = " ".join( + [ + "M {} L {} z".format(coords[0], " L ".join(coords[1:])) + for coords in exterior_coords + interior_coords + ] + ) + return ( + '<path fill-rule="evenodd" fill="{2}" stroke="#555555" ' + 'stroke-width="{0}" opacity="{3}" d="{1}" />' + ).format(2.0 * scale_factor, path, fill_color, opacity)
+ +
[docs] @classmethod + def from_bounds(cls, xmin, ymin, xmax, ymax): + """Construct a `Polygon()` from spatial bounds.""" + return cls([(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin)])
+ + +shapely.lib.registry[3] = Polygon + + +def orient(polygon, sign=1.0): + s = float(sign) + rings = [] + ring = polygon.exterior + if signed_area(ring) / s >= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + for ring in polygon.interiors: + if signed_area(ring) / s <= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + return Polygon(rings[0], rings[1:]) +
+ +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_modules/shapely/ops/index.html b/branch/bicounty_2035_hwy_update/_modules/shapely/ops/index.html new file mode 100644 index 0000000..90625be --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_modules/shapely/ops/index.html @@ -0,0 +1,845 @@ + + + + + + shapely.ops — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for shapely.ops

+"""Support for various GEOS geometry operations
+"""
+
+from warnings import warn
+
+import shapely
+from shapely.algorithms.polylabel import polylabel  # noqa
+from shapely.errors import GeometryTypeError, ShapelyDeprecationWarning
+from shapely.geometry import (
+    GeometryCollection,
+    LineString,
+    MultiLineString,
+    MultiPoint,
+    Point,
+    Polygon,
+    shape,
+)
+from shapely.geometry.base import BaseGeometry, BaseMultipartGeometry
+from shapely.geometry.polygon import orient as orient_
+from shapely.prepared import prep
+
+__all__ = [
+    "cascaded_union",
+    "linemerge",
+    "operator",
+    "polygonize",
+    "polygonize_full",
+    "transform",
+    "unary_union",
+    "triangulate",
+    "voronoi_diagram",
+    "split",
+    "nearest_points",
+    "validate",
+    "snap",
+    "shared_paths",
+    "clip_by_rect",
+    "orient",
+    "substring",
+]
+
+
+class CollectionOperator:
+    def shapeup(self, ob):
+        if isinstance(ob, BaseGeometry):
+            return ob
+        else:
+            try:
+                return shape(ob)
+            except (ValueError, AttributeError):
+                return LineString(ob)
+
+    def polygonize(self, lines):
+        """Creates polygons from a source of lines
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        collection = shapely.polygonize(obs)
+        return collection.geoms
+
+    def polygonize_full(self, lines):
+        """Creates polygons from a source of lines, returning the polygons
+        and leftover geometries.
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.
+
+        Returns a tuple of objects: (polygons, cut edges, dangles, invalid ring
+        lines). Each are a geometry collection.
+
+        Dangles are edges which have one or both ends which are not incident on
+        another edge endpoint. Cut edges are connected at both ends but do not
+        form part of polygon. Invalid ring lines form rings which are invalid
+        (bowties, etc).
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        return shapely.polygonize_full(obs)
+
+    def linemerge(self, lines, directed=False):
+        """Merges all connected lines from a source
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects than can be adapted to LineStrings.  Returns a
+        LineString or MultiLineString when lines are not contiguous.
+        """
+        source = None
+        if getattr(lines, "geom_type", None) == "MultiLineString":
+            source = lines
+        elif hasattr(lines, "geoms"):
+            # other Multi geometries
+            source = MultiLineString([ls.coords for ls in lines.geoms])
+        elif hasattr(lines, "__iter__"):
+            try:
+                source = MultiLineString([ls.coords for ls in lines])
+            except AttributeError:
+                source = MultiLineString(lines)
+        if source is None:
+            raise ValueError(f"Cannot linemerge {lines}")
+        return shapely.line_merge(source, directed=directed)
+
+    def cascaded_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        .. deprecated:: 1.8
+            This function was superseded by :meth:`unary_union`.
+        """
+        warn(
+            "The 'cascaded_union()' function is deprecated. "
+            "Use 'unary_union()' instead.",
+            ShapelyDeprecationWarning,
+            stacklevel=2,
+        )
+        return shapely.union_all(geoms, axis=None)
+
+    def unary_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        Usually used to convert a collection into the smallest set of polygons
+        that cover the same area.
+        """
+        return shapely.union_all(geoms, axis=None)
+
+
+operator = CollectionOperator()
+polygonize = operator.polygonize
+polygonize_full = operator.polygonize_full
+linemerge = operator.linemerge
+cascaded_union = operator.cascaded_union
+unary_union = operator.unary_union
+
+
+def triangulate(geom, tolerance=0.0, edges=False):
+    """Creates the Delaunay triangulation and returns a list of geometries
+
+    The source may be any geometry type. All vertices of the geometry will be
+    used as the points of the triangulation.
+
+    From the GEOS documentation:
+    tolerance is the snapping tolerance used to improve the robustness of
+    the triangulation computation. A tolerance of 0.0 specifies that no
+    snapping will take place.
+
+    If edges is False, a list of Polygons (triangles) will be returned.
+    Otherwise the list of LineString edges is returned.
+
+    """
+    collection = shapely.delaunay_triangles(geom, tolerance=tolerance, only_edges=edges)
+    return [g for g in collection.geoms]
+
+
+def voronoi_diagram(geom, envelope=None, tolerance=0.0, edges=False):
+    """
+    Constructs a Voronoi Diagram [1] from the given geometry.
+    Returns a list of geometries.
+
+    Parameters
+    ----------
+    geom: geometry
+        the input geometry whose vertices will be used to calculate
+        the final diagram.
+    envelope: geometry, None
+        clipping envelope for the returned diagram, automatically
+        determined if None. The diagram will be clipped to the larger
+        of this envelope or an envelope surrounding the sites.
+    tolerance: float, 0.0
+        sets the snapping tolerance used to improve the robustness
+        of the computation. A tolerance of 0.0 specifies that no
+        snapping will take place.
+    edges: bool, False
+        If False, return regions as polygons. Else, return only
+        edges e.g. LineStrings.
+
+    GEOS documentation can be found at [2]
+
+    Returns
+    -------
+    GeometryCollection
+        geometries representing the Voronoi regions.
+
+    Notes
+    -----
+    The tolerance `argument` can be finicky and is known to cause the
+    algorithm to fail in several cases. If you're using `tolerance`
+    and getting a failure, try removing it. The test cases in
+    tests/test_voronoi_diagram.py show more details.
+
+
+    References
+    ----------
+    [1] https://en.wikipedia.org/wiki/Voronoi_diagram
+    [2] https://geos.osgeo.org/doxygen/geos__c_8h_source.html  (line 730)
+    """
+    try:
+        result = shapely.voronoi_polygons(
+            geom, tolerance=tolerance, extend_to=envelope, only_edges=edges
+        )
+    except shapely.GEOSException as err:
+        errstr = "Could not create Voronoi Diagram with the specified inputs "
+        errstr += f"({err!s})."
+        if tolerance:
+            errstr += " Try running again with default tolerance value."
+        raise ValueError(errstr) from err
+
+    if result.geom_type != "GeometryCollection":
+        return GeometryCollection([result])
+    return result
+
+
+def validate(geom):
+    return shapely.is_valid_reason(geom)
+
+
+
[docs]def transform(func, geom): + """Applies `func` to all coordinates of `geom` and returns a new + geometry of the same type from the transformed coordinates. + + `func` maps x, y, and optionally z to output xp, yp, zp. The input + parameters may iterable types like lists or arrays or single values. + The output shall be of the same type. Scalars in, scalars out. + Lists in, lists out. + + For example, here is an identity function applicable to both types + of input. + + def id_func(x, y, z=None): + return tuple(filter(None, [x, y, z])) + + g2 = transform(id_func, g1) + + Using pyproj >= 2.1, this example will accurately project Shapely geometries: + + import pyproj + + wgs84 = pyproj.CRS('EPSG:4326') + utm = pyproj.CRS('EPSG:32618') + + project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform + + g2 = transform(project, g1) + + Note that the always_xy kwarg is required here as Shapely geometries only support + X,Y coordinate ordering. + + Lambda expressions such as the one in + + g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1) + + also satisfy the requirements for `func`. + """ + if geom.is_empty: + return geom + if geom.geom_type in ("Point", "LineString", "LinearRing", "Polygon"): + + # First we try to apply func to x, y, z sequences. When func is + # optimized for sequences, this is the fastest, though zipping + # the results up to go back into the geometry constructors adds + # extra cost. + try: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)(zip(*func(*zip(*geom.coords)))) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)(zip(*func(*zip(*geom.exterior.coords)))) + holes = list( + type(ring)(zip(*func(*zip(*ring.coords)))) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + # A func that assumes x, y, z are single values will likely raise a + # TypeError, in which case we'll try again. + except TypeError: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)([func(*c) for c in geom.coords]) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)([func(*c) for c in geom.exterior.coords]) + holes = list( + type(ring)([func(*c) for c in ring.coords]) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + elif geom.geom_type.startswith("Multi") or geom.geom_type == "GeometryCollection": + return type(geom)([transform(func, part) for part in geom.geoms]) + else: + raise GeometryTypeError(f"Type {geom.geom_type!r} not recognized")
+ + +def nearest_points(g1, g2): + """Returns the calculated nearest points in the input geometries + + The points are returned in the same order as the input geometries. + """ + seq = shapely.shortest_line(g1, g2) + if seq is None: + if g1.is_empty: + raise ValueError("The first input geometry is empty") + else: + raise ValueError("The second input geometry is empty") + + p1 = shapely.get_point(seq, 0) + p2 = shapely.get_point(seq, 1) + return (p1, p2) + + +def snap(g1, g2, tolerance): + """ + Snaps an input geometry (g1) to reference (g2) geometry's vertices. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + tolerance : float + The snapping tolerance + + Refer to :func:`shapely.snap` for full documentation. + """ + + return shapely.snap(g1, g2, tolerance) + + +def shared_paths(g1, g2): + """Find paths shared between the two given lineal geometries + + Returns a GeometryCollection with two elements: + - First element is a MultiLineString containing shared paths with the + same direction for both inputs. + - Second element is a MultiLineString containing shared paths with the + opposite direction for the two inputs. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + """ + if not isinstance(g1, LineString): + raise GeometryTypeError("First geometry must be a LineString") + if not isinstance(g2, LineString): + raise GeometryTypeError("Second geometry must be a LineString") + return shapely.shared_paths(g1, g2) + + +class SplitOp: + @staticmethod + def _split_polygon_with_line(poly, splitter): + """Split a Polygon with a LineString""" + if not isinstance(poly, Polygon): + raise GeometryTypeError("First argument must be a Polygon") + if not isinstance(splitter, LineString): + raise GeometryTypeError("Second argument must be a LineString") + + union = poly.boundary.union(splitter) + + # greatly improves split performance for big geometries with many + # holes (the following contains checks) with minimal overhead + # for common cases + poly = prep(poly) + + # some polygonized geometries may be holes, we do not want them + # that's why we test if the original polygon (poly) contains + # an inner point of polygonized geometry (pg) + return [ + pg for pg in polygonize(union) if poly.contains(pg.representative_point()) + ] + + @staticmethod + def _split_line_with_line(line, splitter): + """Split a LineString with another (Multi)LineString or (Multi)Polygon""" + + # if splitter is a polygon, pick it's boundary + if splitter.geom_type in ("Polygon", "MultiPolygon"): + splitter = splitter.boundary + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, LineString) and not isinstance( + splitter, MultiLineString + ): + raise GeometryTypeError( + "Second argument must be either a LineString or a MultiLineString" + ) + + # | s\l | Interior | Boundary | Exterior | + # |----------|----------|----------|----------| + # | Interior | 0 or F | * | * | At least one of these two must be 0 + # | Boundary | 0 or F | * | * | So either '0********' or '[0F]**0*****' + # | Exterior | * | * | * | No overlapping interiors ('1********') + relation = splitter.relate(line) + if relation[0] == "1": + # The lines overlap at some segment (linear intersection of interiors) + raise ValueError("Input geometry segment overlaps with the splitter.") + elif relation[0] == "0" or relation[3] == "0": + # The splitter crosses or touches the line's 
interior --> return multilinestring from the split + return line.difference(splitter) + else: + # The splitter does not cross or touch the line's interior --> return collection with identity line + return [line] + + @staticmethod + def _split_line_with_point(line, splitter): + """Split a LineString with a Point""" + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, Point): + raise GeometryTypeError("Second argument must be a Point") + + # check if point is in the interior of the line + if not line.relate_pattern(splitter, "0********"): + # point not on line interior --> return collection with single identity line + # (REASONING: Returning a list with the input line reference and creating a + # GeometryCollection at the general split function prevents unnecessary copying + # of linestrings in multipoint splitting function) + return [line] + elif line.coords[0] == splitter.coords[0]: + # if line is a closed ring the previous test doesn't behave as desired + return [line] + + # point is on line, get the distance from the first point on line + distance_on_line = line.project(splitter) + coords = list(line.coords) + # split the line at the point and create two new lines + current_position = 0.0 + for i in range(len(coords) - 1): + point1 = coords[i] + point2 = coords[i + 1] + dx = point1[0] - point2[0] + dy = point1[1] - point2[1] + segment_length = (dx**2 + dy**2) ** 0.5 + current_position += segment_length + if distance_on_line == current_position: + # splitter is exactly on a vertex + return [LineString(coords[: i + 2]), LineString(coords[i + 1 :])] + elif distance_on_line < current_position: + # splitter is between two vertices + return [ + LineString(coords[: i + 1] + [splitter.coords[0]]), + LineString([splitter.coords[0]] + coords[i + 1 :]), + ] + return [line] + + @staticmethod + def _split_line_with_multipoint(line, splitter): + """Split a LineString with a MultiPoint""" + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, MultiPoint): + raise GeometryTypeError("Second argument must be a MultiPoint") + + chunks = [line] + for pt in splitter.geoms: + new_chunks = [] + for chunk in filter(lambda x: not x.is_empty, chunks): + # add the newly split 2 lines or the same line if not split + new_chunks.extend(SplitOp._split_line_with_point(chunk, pt)) + chunks = new_chunks + + return chunks + + @staticmethod + def split(geom, splitter): + """ + Splits a geometry by another geometry and returns a collection of geometries. This function is the theoretical + opposite of the union of the split geometry parts. If the splitter does not split the geometry, a collection + with a single geometry equal to the input geometry is returned. + The function supports: + - Splitting a (Multi)LineString by a (Multi)Point or (Multi)LineString or (Multi)Polygon + - Splitting a (Multi)Polygon by a LineString + + It may be convenient to snap the splitter with low tolerance to the geometry. For example in the case + of splitting a line by a point, the point must be exactly on the line, for the line to be correctly split. + When splitting a line by a polygon, the boundary of the polygon is used for the operation. + When splitting a line by another line, a ValueError is raised if the two overlap at some segment. 
+ + Parameters + ---------- + geom : geometry + The geometry to be split + splitter : geometry + The geometry that will split the input geom + + Example + ------- + >>> pt = Point((1, 1)) + >>> line = LineString([(0,0), (2,2)]) + >>> result = split(line, pt) + >>> result.wkt + 'GEOMETRYCOLLECTION (LINESTRING (0 0, 1 1), LINESTRING (1 1, 2 2))' + """ + + if geom.geom_type in ("MultiLineString", "MultiPolygon"): + return GeometryCollection( + [i for part in geom.geoms for i in SplitOp.split(part, splitter).geoms] + ) + + elif geom.geom_type == "LineString": + if splitter.geom_type in ( + "LineString", + "MultiLineString", + "Polygon", + "MultiPolygon", + ): + split_func = SplitOp._split_line_with_line + elif splitter.geom_type == "Point": + split_func = SplitOp._split_line_with_point + elif splitter.geom_type == "MultiPoint": + split_func = SplitOp._split_line_with_multipoint + else: + raise GeometryTypeError( + f"Splitting a LineString with a {splitter.geom_type} is not supported" + ) + + elif geom.geom_type == "Polygon": + if splitter.geom_type == "LineString": + split_func = SplitOp._split_polygon_with_line + else: + raise GeometryTypeError( + f"Splitting a Polygon with a {splitter.geom_type} is not supported" + ) + + else: + raise GeometryTypeError( + f"Splitting {geom.geom_type} geometry is not supported" + ) + + return GeometryCollection(split_func(geom, splitter)) + + +split = SplitOp.split + + +def substring(geom, start_dist, end_dist, normalized=False): + """Return a line segment between specified distances along a LineString + + Negative distance values are taken as measured in the reverse + direction from the end of the geometry. Out-of-range index + values are handled by clamping them to the valid range of values. + + If the start distance equals the end distance, a Point is returned. + + If the start distance is actually beyond the end distance, then the + reversed substring is returned such that the start distance is + at the first coordinate. + + Parameters + ---------- + geom : LineString + The geometry to get a substring of. + start_dist : float + The distance along `geom` of the start of the substring. + end_dist : float + The distance along `geom` of the end of the substring. + normalized : bool, False + Whether the distance parameters are interpreted as a + fraction of the geometry's length. + + Returns + ------- + Union[Point, LineString] + The substring between `start_dist` and `end_dist` or a Point + if they are at the same location. + + Raises + ------ + TypeError + If `geom` is not a LineString. + + Examples + -------- + >>> from shapely.geometry import LineString + >>> from shapely.ops import substring + >>> ls = LineString((i, 0) for i in range(6)) + >>> ls.wkt + 'LINESTRING (0 0, 1 0, 2 0, 3 0, 4 0, 5 0)' + >>> substring(ls, start_dist=1, end_dist=3).wkt + 'LINESTRING (1 0, 2 0, 3 0)' + >>> substring(ls, start_dist=3, end_dist=1).wkt + 'LINESTRING (3 0, 2 0, 1 0)' + >>> substring(ls, start_dist=1, end_dist=-3).wkt + 'LINESTRING (1 0, 2 0)' + >>> substring(ls, start_dist=0.2, end_dist=-0.6, normalized=True).wkt + 'LINESTRING (1 0, 2 0)' + + Returning a `Point` when `start_dist` and `end_dist` are at the + same location. + + >>> substring(ls, 2.5, -2.5).wkt + 'POINT (2.5 0)' + """ + + if not isinstance(geom, LineString): + raise GeometryTypeError( + "Can only calculate a substring of LineString geometries. " + f"A {geom.geom_type} was provided." 
+ ) + + # Filter out cases in which to return a point + if start_dist == end_dist: + return geom.interpolate(start_dist, normalized) + elif not normalized and start_dist >= geom.length and end_dist >= geom.length: + return geom.interpolate(geom.length, normalized) + elif not normalized and -start_dist >= geom.length and -end_dist >= geom.length: + return geom.interpolate(0, normalized) + elif normalized and start_dist >= 1 and end_dist >= 1: + return geom.interpolate(1, normalized) + elif normalized and -start_dist >= 1 and -end_dist >= 1: + return geom.interpolate(0, normalized) + + if normalized: + start_dist *= geom.length + end_dist *= geom.length + + # Filter out cases where distances meet at a middle point from opposite ends. + if start_dist < 0 < end_dist and abs(start_dist) + end_dist == geom.length: + return geom.interpolate(end_dist) + elif end_dist < 0 < start_dist and abs(end_dist) + start_dist == geom.length: + return geom.interpolate(start_dist) + + start_point = geom.interpolate(start_dist) + end_point = geom.interpolate(end_dist) + + if start_dist < 0: + start_dist = geom.length + start_dist # Values may still be negative, + if end_dist < 0: # but only in the out-of-range + end_dist = geom.length + end_dist # sense, not the wrap-around sense. + + reverse = start_dist > end_dist + if reverse: + start_dist, end_dist = end_dist, start_dist + + if start_dist < 0: + start_dist = 0 # to avoid duplicating the first vertex + + if reverse: + vertex_list = [tuple(*end_point.coords)] + else: + vertex_list = [tuple(*start_point.coords)] + + coords = list(geom.coords) + current_distance = 0 + for p1, p2 in zip(coords, coords[1:]): + if start_dist < current_distance < end_dist: + vertex_list.append(p1) + elif current_distance >= end_dist: + break + + current_distance += ((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2) ** 0.5 + + if reverse: + vertex_list.append(tuple(*start_point.coords)) + # reverse direction result + vertex_list = reversed(vertex_list) + else: + vertex_list.append(tuple(*end_point.coords)) + + return LineString(vertex_list) + + +def clip_by_rect(geom, xmin, ymin, xmax, ymax): + """Returns the portion of a geometry within a rectangle + + The geometry is clipped in a fast but possibly dirty way. The output is + not guaranteed to be valid. No exceptions will be raised for topological + errors. + + Parameters + ---------- + geom : geometry + The geometry to be clipped + xmin : float + Minimum x value of the rectangle + ymin : float + Minimum y value of the rectangle + xmax : float + Maximum x value of the rectangle + ymax : float + Maximum y value of the rectangle + + Notes + ----- + Requires GEOS >= 3.5.0 + New in 1.7. + """ + if geom.is_empty: + return geom + return shapely.clip_by_rect(geom, xmin, ymin, xmax, ymax) + + +def orient(geom, sign=1.0): + """A properly oriented copy of the given geometry. + + The signed area of the result will have the given sign. A sign of + 1.0 means that the coordinates of the product's exterior rings will + be oriented counter-clockwise. + + Parameters + ---------- + geom : Geometry + The original geometry. May be a Polygon, MultiPolygon, or + GeometryCollection. + sign : float, optional. + The sign of the result's signed area. + + Returns + ------- + Geometry + + """ + if isinstance(geom, BaseMultipartGeometry): + return geom.__class__( + list( + map( + lambda geom: orient(geom, sign), + geom.geoms, + ) + ) + ) + if isinstance(geom, (Polygon,)): + return orient_(geom, sign) + return geom +
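+
+
+# Illustrative sketch, not part of shapely.ops: with the default sign of 1.0,
+# `orient` returns a copy whose exterior rings wind counter-clockwise.
+# `Polygon` is already imported at the top of this module.
+def _orient_example():
+    """Tiny demonstration of `orient` on a clockwise square."""
+    clockwise_square = Polygon([(0, 0), (0, 1), (1, 1), (1, 0)])
+    assert not clockwise_square.exterior.is_ccw
+    # Re-orienting yields a counter-clockwise exterior (positive signed area).
+    assert orient(clockwise_square).exterior.is_ccw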
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.CubeTransit.rst.txt b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.CubeTransit.rst.txt new file mode 100644 index 0000000..e24b49e --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.CubeTransit.rst.txt @@ -0,0 +1,36 @@ +lasso.CubeTransit +================= + +.. currentmodule:: lasso + +.. autoclass:: CubeTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~CubeTransit.__init__ + ~CubeTransit.add_additional_time_periods + ~CubeTransit.add_cube + ~CubeTransit.build_route_name + ~CubeTransit.calculate_start_end_times + ~CubeTransit.create_add_route_card_dict + ~CubeTransit.create_delete_route_card_dict + ~CubeTransit.create_from_cube + ~CubeTransit.create_update_route_card_dict + ~CubeTransit.cube_properties_to_standard_properties + ~CubeTransit.evaluate_differences + ~CubeTransit.evaluate_route_property_differences + ~CubeTransit.evaluate_route_shape_changes + ~CubeTransit.get_time_period_numbers_from_cube_properties + ~CubeTransit.unpack_route_name + + + + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt new file mode 100644 index 0000000..29190d8 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt @@ -0,0 +1,90 @@ +lasso.ModelRoadwayNetwork +========================= + +.. currentmodule:: lasso + +.. autoclass:: ModelRoadwayNetwork + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~ModelRoadwayNetwork.__init__ + ~ModelRoadwayNetwork.add_counts + ~ModelRoadwayNetwork.add_incident_link_data_to_nodes + ~ModelRoadwayNetwork.add_new_roadway_feature_change + ~ModelRoadwayNetwork.add_variable_using_shst_reference + ~ModelRoadwayNetwork.addition_map + ~ModelRoadwayNetwork.apply + ~ModelRoadwayNetwork.apply_managed_lane_feature_change + ~ModelRoadwayNetwork.apply_python_calculation + ~ModelRoadwayNetwork.apply_roadway_feature_change + ~ModelRoadwayNetwork.assess_connectivity + ~ModelRoadwayNetwork.build_selection_key + ~ModelRoadwayNetwork.calculate_area_type + ~ModelRoadwayNetwork.calculate_centroidconnect + ~ModelRoadwayNetwork.calculate_county + ~ModelRoadwayNetwork.calculate_distance + ~ModelRoadwayNetwork.calculate_mpo + ~ModelRoadwayNetwork.calculate_use + ~ModelRoadwayNetwork.convert_int + ~ModelRoadwayNetwork.create_ML_variable + ~ModelRoadwayNetwork.create_calculated_variables + ~ModelRoadwayNetwork.create_dummy_connector_links + ~ModelRoadwayNetwork.create_hov_corridor_variable + ~ModelRoadwayNetwork.create_managed_lane_network + ~ModelRoadwayNetwork.create_managed_variable + ~ModelRoadwayNetwork.dataframe_to_fixed_width + ~ModelRoadwayNetwork.delete_roadway_feature_change + ~ModelRoadwayNetwork.deletion_map + ~ModelRoadwayNetwork.fill_na + ~ModelRoadwayNetwork.from_RoadwayNetwork + ~ModelRoadwayNetwork.get_attribute + ~ModelRoadwayNetwork.get_managed_lane_node_ids + ~ModelRoadwayNetwork.get_modal_graph + ~ModelRoadwayNetwork.get_modal_links_nodes + ~ModelRoadwayNetwork.get_property_by_time_period_and_group + ~ModelRoadwayNetwork.identify_segment + ~ModelRoadwayNetwork.identify_segment_endpoints + ~ModelRoadwayNetwork.is_network_connected + ~ModelRoadwayNetwork.load_transform_network + ~ModelRoadwayNetwork.network_connection_plot + 
~ModelRoadwayNetwork.orig_dest_nodes_foreign_key + ~ModelRoadwayNetwork.ox_graph + ~ModelRoadwayNetwork.path_search + ~ModelRoadwayNetwork.read + ~ModelRoadwayNetwork.read_match_result + ~ModelRoadwayNetwork.rename_variables_for_dbf + ~ModelRoadwayNetwork.roadway_net_to_gdf + ~ModelRoadwayNetwork.roadway_standard_to_met_council_network + ~ModelRoadwayNetwork.select_roadway_features + ~ModelRoadwayNetwork.selection_has_unique_link_id + ~ModelRoadwayNetwork.selection_map + ~ModelRoadwayNetwork.shortest_path + ~ModelRoadwayNetwork.split_properties_by_time_period_and_category + ~ModelRoadwayNetwork.update_distance + ~ModelRoadwayNetwork.validate_link_schema + ~ModelRoadwayNetwork.validate_node_schema + ~ModelRoadwayNetwork.validate_properties + ~ModelRoadwayNetwork.validate_selection + ~ModelRoadwayNetwork.validate_shape_schema + ~ModelRoadwayNetwork.validate_uniqueness + ~ModelRoadwayNetwork.write + ~ModelRoadwayNetwork.write_roadway_as_fixedwidth + ~ModelRoadwayNetwork.write_roadway_as_shp + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~ModelRoadwayNetwork.CALCULATED_VALUES + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.Parameters.rst.txt b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.Parameters.rst.txt new file mode 100644 index 0000000..28d2c86 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.Parameters.rst.txt @@ -0,0 +1,31 @@ +lasso.Parameters +================ + +.. currentmodule:: lasso + +.. autoclass:: Parameters + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Parameters.__init__ + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Parameters.cube_time_periods + ~Parameters.properties_to_split + ~Parameters.county_link_range_dict + ~Parameters.zones + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.Project.rst.txt b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.Project.rst.txt new file mode 100644 index 0000000..863945b --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.Project.rst.txt @@ -0,0 +1,42 @@ +lasso.Project +============= + +.. currentmodule:: lasso + +.. autoclass:: Project + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Project.__init__ + ~Project.add_highway_changes + ~Project.add_transit_changes + ~Project.create_project + ~Project.determine_roadway_network_changes_compatibility + ~Project.emme_id_to_wrangler_id + ~Project.emme_name_to_wrangler_name + ~Project.evaluate_changes + ~Project.get_object_from_network_build_command + ~Project.get_operation_from_network_build_command + ~Project.read_logfile + ~Project.read_network_build_file + ~Project.write_project_card + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Project.CALCULATED_VALUES + ~Project.DEFAULT_PROJECT_NAME + ~Project.STATIC_VALUES + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.StandardTransit.rst.txt b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.StandardTransit.rst.txt new file mode 100644 index 0000000..1175b4b --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.StandardTransit.rst.txt @@ -0,0 +1,32 @@ +lasso.StandardTransit +===================== + +.. currentmodule:: lasso + +.. autoclass:: StandardTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. 
autosummary:: + + ~StandardTransit.__init__ + ~StandardTransit.calculate_cube_mode + ~StandardTransit.cube_format + ~StandardTransit.evaluate_differences + ~StandardTransit.fromTransitNetwork + ~StandardTransit.read_gtfs + ~StandardTransit.route_properties_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_emme + ~StandardTransit.time_to_cube_time_period + ~StandardTransit.write_as_cube_lin + + + + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.logger.rst.txt b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.logger.rst.txt new file mode 100644 index 0000000..2054273 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.logger.rst.txt @@ -0,0 +1,29 @@ +lasso.logger +============ + +.. automodule:: lasso.logger + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + setupLogging + + + + + + + + + + + + + diff --git a/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.util.rst.txt b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.util.rst.txt new file mode 100644 index 0000000..95fecf8 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/_generated/lasso.util.rst.txt @@ -0,0 +1,35 @@ +lasso.util +========== + +.. automodule:: lasso.util + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + column_name_to_parts + create_locationreference + geodesic_point_buffer + get_shared_streets_intersection_hash + hhmmss_to_datetime + secs_to_datetime + shorten_name + + + + + + + + + + + + + diff --git a/branch/bicounty_2035_hwy_update/_sources/autodoc.rst.txt b/branch/bicounty_2035_hwy_update/_sources/autodoc.rst.txt new file mode 100644 index 0000000..7e48d58 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/autodoc.rst.txt @@ -0,0 +1,29 @@ +Lasso Classes and Functions +==================================== + +.. automodule:: lasso + :no-members: + :no-undoc-members: + :no-inherited-members: + :no-show-inheritance: + + +Base Classes +-------------- +.. autosummary:: + :toctree: _generated + :nosignatures: + + CubeTransit + StandardTransit + ModelRoadwayNetwork + Project + Parameters + +Utils and Functions +-------------------- +.. autosummary:: + :toctree: _generated + + util + logger diff --git a/branch/bicounty_2035_hwy_update/_sources/index.rst.txt b/branch/bicounty_2035_hwy_update/_sources/index.rst.txt new file mode 100644 index 0000000..1255d4e --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/index.rst.txt @@ -0,0 +1,35 @@ +.. lasso documentation master file, created by + sphinx-quickstart on Thu Dec 5 15:43:28 2019. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. + +Welcome to lasso's documentation! +================================= + +This package of utilities is a wrapper around the +[network_wrangler](http://github.com/wsp-sag/network_wrangler) package +for MetCouncil. It aims to have the following functionality: +1. parse Cube log files and base highway networks and create ProjectCards + for Network Wrangler +2. parse two Cube transit line files and create ProjectCards for NetworkWrangler +3. refine Network Wrangler highway networks to contain specific variables and + settings for Metropolitan Council and export them to a format that can + be read in by Citilab's Cube software. + +.. 
toctree:: + :maxdepth: 3 + :caption: Contents: + + starting + setup + running + autodoc + + + +Indices and tables +================== + +* :ref:`genindex` +* :ref:`modindex` +* :ref:`search` diff --git a/branch/bicounty_2035_hwy_update/_sources/running.md.txt b/branch/bicounty_2035_hwy_update/_sources/running.md.txt new file mode 100644 index 0000000..e139dc8 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/running.md.txt @@ -0,0 +1,12 @@ +# Running Lasso + +## Create project files + + +## Create a scenario + + +## Exporting networks + + +## Auditing and Reporting diff --git a/branch/bicounty_2035_hwy_update/_sources/setup.md.txt b/branch/bicounty_2035_hwy_update/_sources/setup.md.txt new file mode 100644 index 0000000..e77d463 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/setup.md.txt @@ -0,0 +1,9 @@ +# Setup + +### Projects + +### Parameters + +### Settings + +### Additional Data Files diff --git a/branch/bicounty_2035_hwy_update/_sources/starting.md.txt b/branch/bicounty_2035_hwy_update/_sources/starting.md.txt new file mode 100644 index 0000000..8886f95 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_sources/starting.md.txt @@ -0,0 +1,292 @@ +# Starting Out + +## Installation + +If you are managing multiple python versions, we suggest using [`virtualenv`](https://virtualenv.pypa.io/en/latest/) or [`conda`](https://conda.io/en/latest/) virtual environments. + +Example using a conda environment (recommended) and using the package manager [pip](https://pip.pypa.io/en/stable/) to install Lasso from the source on GitHub. + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/Lasso@master +``` + +Lasso will install `network_wrangler` from the [PyPi](https://pypi.org/project/network-wrangler/) repository because it is included in Lasso's `requirements.txt`. + +#### Bleeding Edge +If you want to install a more up-to-date or development version of network wrangler and lasso , you can do so by installing it from the `develop` branch of + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/network_wrangler@develop +pip install git+https://github.com/wsp-sag/Lasso@develop +``` + +#### From Clone +If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in [editable mode](https://pip.pypa.io/en/stable/reference/pip_install/?highlight=editable#editable-installs). + +**if you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!** + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas osmnx -n +conda activate +git clone https://github.com/wsp-sag/Lasso +git clone https://github.com/wsp-sag/network_wrangler +cd network_wrangler +pip install -e . +cd .. +cd Lasso +pip install -e . +``` + +Notes: + +1. The -e installs it in editable mode. +2. If you are not part of the project team and want to contribute code bxack to the project, please fork before you clone and then add the original repository to your upstream origin list per [these directions on github](https://help.github.com/en/articles/fork-a-repo). +3. if you wanted to install from a specific tag/version number or branch, replace `@master` with `@` or `@tag` +4. 
If you want to make use of frequent developer updates for network wrangler as well, you can also install it from clone by copying the instructions for cloning and installing Lasso for Network Wrangler + +If you are going to be doing Lasso development, we also recommend: + - a good IDE such as [Atom](http://atom.io), VS Code, Sublime Text, etc. + with Python syntax highlighting turned on. + - [GitHub Desktop](https://desktop.github.com/) to locally update your clones + +## Brief Intro + +Lasso is a 'wrapper' around the [Network Wrangler](http://wsp-sag.github.io/network_wrangler) utility. + +Both Lasso and NetworkWrangler are built around the following data schemas: + - [`roadway network`], which is based on a mashup of Open Street Map and [Shared Streets](http://sharedstreets.io). In Network Wrangler these are read in from three json files reprsenting: links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category. + - [`transit network`], which is based on a frequency-based implementation of the csv-based GTFS; and + - [`project card`], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml. + +In addition, Lasso utilizes the following data schemas: + + - [`MetCouncil Model Roadway Network Schema`], which adds data fields to the `roadway network` schema that MetCouncil uses in their travel model including breaking out data fields by time period. + - [`MetCouncil Model Transit Network Schema`], which uses the Cube PublicTransport format, and + - [`Cube Log Files`], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler. + - [`Cube public transport line files`], which define a set of transit lines in the cube software. + +### Components +Network Wrangler has the following atomic parts: + + - _RoadwayNetwork_ object, which represents the `roadway network` data as GeoDataFrames; + - _TransitNetwork_ object, which represents the `transit network` data as DataFrames; + - _ProjectCard_ object, which represents the data of the `project card`. Project cards identify the infrastructure that is changing (a selection) and defines the changes; or contains information about a new facility to be constructed or a new service to be run.; + - _Scenario_ object, which consist of at least a RoadwayNetwork, and +TransitNetwork. Scenarios can be based on or tiered from other scenarios. +Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network. + +In addition, Lasso has the following atomic parts: + + - _Project_ object, creates project cards from one of the following: a base and a build transit network in cube format, a base and build highway network, or a base highway network and a Cube log file. + - _ModelRoadwayNetwork_ object is a subclass of `RoadwayNetwork` and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube. + - _StandardTransit_, an object for holding a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files. + - _CubeTransit_, an object for storing information about transit defined in `Cube public transport line files` + . 
Has the capability to parse cube line file properties and shapes into python dictionaries and compare line files and represent changes as Project Card dictionaries. + - _Parameters_, A class representing all the parameters defining the networks + including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + +#### RoadwayNetwork + +Reads, writes, queries and and manipulates roadway network data, which +is mainly stored in the GeoDataFrames `links_df`, `nodes_df`, and `shapes_df`. + +```python +net = RoadwayNetwork.read( + link_filename=MY_LINK_FILE, + node_filename=MY_NODE_FILE, + shape_filename=MY_SHAPE_FILE, + shape_foreign_key ='shape_id', + + ) +my_selection = { + "link": [{"name": ["I 35E"]}], + "A": {"osm_node_id": "961117623"}, # start searching for segments at A + "B": {"osm_node_id": "2564047368"}, +} +net.select_roadway_features(my_selection) + +my_change = [ + { + 'property': 'lanes', + 'existing': 1, + 'set': 2, + }, + { + 'property': 'drive_access', + 'set': 0, + }, +] + +my_net.apply_roadway_feature_change( + my_net.select_roadway_features(my_selection), + my_change +) + +ml_net = net.create_managed_lane_network(in_place=False) + +ml_net.is_network_connected(mode="drive")) + +_, disconnected_nodes = ml_net.assess_connectivity( + mode="walk", + ignore_end_nodes=True +) +ml_net.write(filename=my_out_prefix, path=my_dir) +``` +#### TransitNetwork + +#### ProjectCard + +#### Scenario + +Manages sets of project cards and tiering from a base scenario/set of networks. + +```python + +my_base_scenario = { + "road_net": RoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ), + "transit_net": TransitNetwork.read(STPAUL_DIR), +} + +card_filenames = [ + "3_multiple_roadway_attribute_change.yml", + "multiple_changes.yml", + "4_simple_managed_lane.yml", +] + +project_card_directory = os.path.join(STPAUL_DIR, "project_cards") + +project_cards_list = [ + ProjectCard.read(os.path.join(project_card_directory, filename), validate=False) + for filename in card_filenames +] + +my_scenario = Scenario.create_scenario( + base_scenario=my_base_scenario, + project_cards_list=project_cards_list, +) +my_scenario.check_scenario_requisites() + +my_scenario.apply_all_projects() + +my_scenario.scenario_summary() +``` + +#### Project +Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network; + +```python + +test_project = Project.create_project( + base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + +test_project.evaluate_changes() + +test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + +``` + +#### ModelRoadwayNetwork +A subclass of network_wrangler's RoadwayNetwork +class which additional understanding about how to translate and write the +network out to the MetCouncil Roadway Network schema. 
+ +```Python + +net = ModelRoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ) + +net.write_roadway_as_fixedwidth() + +``` + +#### StandardTransit +Translates the standard GTFS data to MetCouncil's Cube Line files. + +```Python +cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) +cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) +``` + +#### CubeTransit +Used by the project class and has the capability to: + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + +```python +tn = CubeTransit.create_from_cube(CUBE_DIR) +transit_change_list = tn.evaluate_differences(base_transit_network) +``` + +#### Parameters +Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary. + +```Python +# read parameters from a yaml configuration file +# could also provide as a key/value pair +with open(config_file) as f: + my_config = yaml.safe_load(f) + +# provide parameters at instantiation of ModelRoadwayNetwork +model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork( + my_scenario.road_net, parameters=my_config.get("my_parameters", {}) + ) +# network written with direction from the parameters given +model_road_net.write_roadway_as_shp() + +``` + +### Typical Workflow + +Workflows in Lasso and Network Wrangler typically accomplish one of two goals: +1. Create Project Cards to document network changes as a result of either transit or roadway projects. +2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network. + +#### Project Cards from Transit LIN Files + + +#### Project Cards from Cube LOG Files + + +#### Model Network Files for a Scenario + + + +## Running Quickstart Jupyter Notebooks + +To learn basic lasso functionality, please refer to the following jupyter notebooks in the `/notebooks` directory: + + - `Lasso Project Card Creation Quickstart.ipynb` + - `Lasso Scenario Creation Quickstart.ipynb` + + Jupyter notebooks can be started by activating the lasso conda environment and typing `jupyter notebook`: + + ```bash + conda activate + jupyter notebook + ``` diff --git a/branch/bicounty_2035_hwy_update/_static/_sphinx_javascript_frameworks_compat.js b/branch/bicounty_2035_hwy_update/_static/_sphinx_javascript_frameworks_compat.js new file mode 100644 index 0000000..8141580 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/_sphinx_javascript_frameworks_compat.js @@ -0,0 +1,123 @@ +/* Compatability shim for jQuery and underscores.js. + * + * Copyright Sphinx contributors + * Released under the two clause BSD licence + */ + +/** + * small helper function to urldecode strings + * + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent#Decoding_query_parameters_from_a_URL + */ +jQuery.urldecode = function(x) { + if (!x) { + return x + } + return decodeURIComponent(x.replace(/\+/g, ' ')); +}; + +/** + * small helper function to urlencode strings + */ +jQuery.urlencode = encodeURIComponent; + +/** + * This function returns the parsed url parameters of the + * current request. Multiple values per key are supported, + * it will always return arrays of strings for the value parts. 
+ */ +jQuery.getQueryParameters = function(s) { + if (typeof s === 'undefined') + s = document.location.search; + var parts = s.substr(s.indexOf('?') + 1).split('&'); + var result = {}; + for (var i = 0; i < parts.length; i++) { + var tmp = parts[i].split('=', 2); + var key = jQuery.urldecode(tmp[0]); + var value = jQuery.urldecode(tmp[1]); + if (key in result) + result[key].push(value); + else + result[key] = [value]; + } + return result; +}; + +/** + * highlight a given string on a jquery object by wrapping it in + * span elements with the given class name. + */ +jQuery.fn.highlightText = function(text, className) { + function highlight(node, addItems) { + if (node.nodeType === 3) { + var val = node.nodeValue; + var pos = val.toLowerCase().indexOf(text); + if (pos >= 0 && + !jQuery(node.parentNode).hasClass(className) && + !jQuery(node.parentNode).hasClass("nohighlight")) { + var span; + var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.className = className; + } + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + node.parentNode.insertBefore(span, node.parentNode.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling)); + node.nodeValue = val.substr(0, pos); + if (isInSVG) { + var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect"); + var bbox = node.parentElement.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute('class', className); + addItems.push({ + "parent": node.parentNode, + "target": rect}); + } + } + } + else if (!jQuery(node).is("button, select, textarea")) { + jQuery.each(node.childNodes, function() { + highlight(this, addItems); + }); + } + } + var addItems = []; + var result = this.each(function() { + highlight(this, addItems); + }); + for (var i = 0; i < addItems.length; ++i) { + jQuery(addItems[i].parent).before(addItems[i].target); + } + return result; +}; + +/* + * backward compatibility for jQuery.browser + * This will be supported until firefox bug is fixed. + */ +if (!jQuery.browser) { + jQuery.uaMatch = function(ua) { + ua = ua.toLowerCase(); + + var match = /(chrome)[ \/]([\w.]+)/.exec(ua) || + /(webkit)[ \/]([\w.]+)/.exec(ua) || + /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) || + /(msie) ([\w.]+)/.exec(ua) || + ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) || + []; + + return { + browser: match[ 1 ] || "", + version: match[ 2 ] || "0" + }; + }; + jQuery.browser = {}; + jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true; +} diff --git a/branch/bicounty_2035_hwy_update/_static/basic.css b/branch/bicounty_2035_hwy_update/_static/basic.css new file mode 100644 index 0000000..cfc60b8 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/basic.css @@ -0,0 +1,921 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. 
+ * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 230px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li p.context { + color: #888; + margin: 2px 0 0 30px; + text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + 
+div.body { + min-width: 360px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, figure.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, figure.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, figure.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, figure.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar, +aside.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} + +nav.contents, +aside.topic, +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ + +nav.contents, +aside.topic, +div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +aside.sidebar > :last-child, +nav.contents > :last-child, +aside.topic > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +aside.sidebar::after, +nav.contents::after, +aside.topic::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + 
border-left: 0; + border-right: 0; + border-bottom: 1px solid #aaa; +} + +th { + text-align: left; + padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure, figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption, figcaption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number, +figcaption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text, +figcaption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + +/* -- object description styles --------------------------------------------- */ + +.sig { + font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; +} + +.sig-name, code.descname { + background-color: transparent; + font-weight: bold; +} + +.sig-name { + font-size: 1.1em; +} + +code.descname { + font-size: 1.2em; +} + +.sig-prename, code.descclassname { + background-color: transparent; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.sig-param.n { + font-style: italic; +} + +/* C++ specific styling */ + +.sig-inline.c-texpr, +.sig-inline.cpp-texpr { + font-family: unset; +} + +.sig.c .k, .sig.c .kt, +.sig.cpp .k, .sig.cpp .kt { + color: #0033B3; +} + +.sig.c .m, +.sig.cpp .m { + color: #1750EB; +} + +.sig.c .s, .sig.c .sc, +.sig.cpp .s, .sig.cpp .sc { + color: #067D17; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} + +aside.footnote > span, +div.citation > span { + float: left; +} +aside.footnote > span:last-of-type, +div.citation > span:last-of-type { + padding-right: 0.5em; +} +aside.footnote > p { + margin-left: 2em; +} +div.citation > p { + margin-left: 4em; +} +aside.footnote > p:last-of-type, +div.citation > p:last-of-type { + margin-bottom: 0em; +} +aside.footnote > p:last-of-type:after, +div.citation > p:last-of-type:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; 
+ word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +.sig dd { + margin-top: 0px; + margin-bottom: 0px; +} + +.sig dl { + margin-top: 0px; + margin-bottom: 0px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: oblique; +} + +.classifier:before { + font-style: normal; + margin: 0 0.5em; + content: ":"; + display: inline-block; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +.translated { + background-color: rgba(207, 255, 207, 0.2) +} + +.untranslated { + background-color: rgba(255, 207, 207, 0.2) +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; + white-space: nowrap; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + +td.linenos pre { + border: 0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +span.linenos, +div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; + -webkit-user-select: text; /* Safari fallback only */ + -webkit-user-select: none; /* Chrome/Safari */ + -moz-user-select: none; /* Firefox */ + -ms-user-select: none; /* IE10+ */ +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: 
-1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + +span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + +@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_static/css/badge_only.css b/branch/bicounty_2035_hwy_update/_static/css/badge_only.css new file mode 100644 index 0000000..c718cee --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/css/badge_only.css @@ -0,0 +1 @@ +.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-style:normal;font-weight:400;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#FontAwesome) format("svg")}.fa:before{font-family:FontAwesome;font-style:normal;font-weight:400;line-height:1}.fa:before,a .fa{text-decoration:inherit}.fa:before,a .fa,li .fa{display:inline-block}li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before,.icon-book:before{content:"\f02d"}.fa-caret-down:before,.icon-caret-down:before{content:"\f0d7"}.fa-caret-up:before,.icon-caret-up:before{content:"\f0d8"}.fa-caret-left:before,.icon-caret-left:before{content:"\f0d9"}.fa-caret-right:before,.icon-caret-right:before{content:"\f0da"}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60}.rst-versions .rst-current-version:after{clear:both;content:"";display:block}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd 
a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}} \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Bold.woff b/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Bold.woff differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Bold.woff2 b/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Regular.woff b/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Regular.woff differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Regular.woff2 b/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.eot b/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.eot new file mode 100644 index 0000000..e9f60ca Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.eot differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.svg b/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.svg new file mode 100644 index 0000000..855c845 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. 
+ [SVG glyph definitions omitted] diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.ttf b/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.ttf new file mode 100644 index 0000000..35acda2 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.ttf differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.woff b/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.woff new file mode 100644 index 0000000..400014a Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.woff differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.woff2 b/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.woff2 new file mode 100644 index 0000000..4d13fc6 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/fontawesome-webfont.woff2 differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold-italic.woff b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold-italic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold-italic.woff differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold-italic.woff2 b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold-italic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold-italic.woff2 differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold.woff b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold.woff differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold.woff2 b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and
b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-bold.woff2 differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal-italic.woff b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal-italic.woff new file mode 100644 index 0000000..76114bc Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal-italic.woff differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal-italic.woff2 b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal-italic.woff2 differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal.woff b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal.woff differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal.woff2 b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal.woff2 new file mode 100644 index 0000000..3bf9843 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/css/fonts/lato-normal.woff2 differ diff --git a/branch/bicounty_2035_hwy_update/_static/css/theme.css b/branch/bicounty_2035_hwy_update/_static/css/theme.css new file mode 100644 index 0000000..19a446a --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/css/theme.css @@ -0,0 +1,4 @@ +html{box-sizing:border-box}*,:after,:before{box-sizing:inherit}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}[hidden],audio:not([controls]){display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:active,a:hover{outline:0}abbr[title]{border-bottom:1px dotted}b,strong{font-weight:700}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;text-decoration:none}ins,mark{color:#000}mark{background:#ff0;font-style:italic;font-weight:700}.rst-content code,.rst-content tt,code,kbd,pre,samp{font-family:monospace,serif;_font-family:courier new,monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:after,q:before{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sup{top:-.5em}sub{bottom:-.25em}dl,ol,ul{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure,form{margin:0}label{cursor:pointer}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type=button],input[type=reset],input[type=submit]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type=search]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}textarea{resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:.2em 0;background:#ccc;color:#000;padding:.2em 
0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir br{display:none}.hidden{display:none!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media print{body,html,section{background:none!important}*{box-shadow:none!important;text-shadow:none!important;filter:none!important;-ms-filter:none!important}a,a:visited{text-decoration:underline}.ir a:after,a[href^="#"]:after,a[href^="javascript:"]:after{content:""}blockquote,pre{page-break-inside:avoid}thead{display:table-header-group}img,tr{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}.rst-content .toctree-wrapper>p.caption,h2,h3,p{orphans:3;widows:3}.rst-content .toctree-wrapper>p.caption,h2,h3{page-break-after:avoid}}.btn,.fa:before,.icon:before,.rst-content .admonition,.rst-content .admonition-title:before,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .code-block-caption .headerlink:before,.rst-content .danger,.rst-content .eqno .headerlink:before,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-alert,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before,input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week],select,textarea{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}/*! 
+ * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome + * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License) + */@font-face{font-family:FontAwesome;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713);src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix&v=4.7.0) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#fontawesomeregular) format("svg");font-weight:400;font-style:normal}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{display:inline-block;font:normal normal normal 14px/1 FontAwesome;font-size:inherit;text-rendering:auto;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.fa-lg{font-size:1.33333em;line-height:.75em;vertical-align:-15%}.fa-2x{font-size:2em}.fa-3x{font-size:3em}.fa-4x{font-size:4em}.fa-5x{font-size:5em}.fa-fw{width:1.28571em;text-align:center}.fa-ul{padding-left:0;margin-left:2.14286em;list-style-type:none}.fa-ul>li{position:relative}.fa-li{position:absolute;left:-2.14286em;width:2.14286em;top:.14286em;text-align:center}.fa-li.fa-lg{left:-1.85714em}.fa-border{padding:.2em .25em .15em;border:.08em solid #eee;border-radius:.1em}.fa-pull-left{float:left}.fa-pull-right{float:right}.fa-pull-left.icon,.fa.fa-pull-left,.rst-content .code-block-caption .fa-pull-left.headerlink,.rst-content .eqno .fa-pull-left.headerlink,.rst-content .fa-pull-left.admonition-title,.rst-content code.download span.fa-pull-left:first-child,.rst-content dl dt .fa-pull-left.headerlink,.rst-content h1 .fa-pull-left.headerlink,.rst-content h2 .fa-pull-left.headerlink,.rst-content h3 .fa-pull-left.headerlink,.rst-content h4 .fa-pull-left.headerlink,.rst-content h5 .fa-pull-left.headerlink,.rst-content h6 .fa-pull-left.headerlink,.rst-content p .fa-pull-left.headerlink,.rst-content table>caption .fa-pull-left.headerlink,.rst-content tt.download span.fa-pull-left:first-child,.wy-menu-vertical li.current>a button.fa-pull-left.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-left.toctree-expand,.wy-menu-vertical li button.fa-pull-left.toctree-expand{margin-right:.3em}.fa-pull-right.icon,.fa.fa-pull-right,.rst-content .code-block-caption .fa-pull-right.headerlink,.rst-content .eqno .fa-pull-right.headerlink,.rst-content .fa-pull-right.admonition-title,.rst-content code.download span.fa-pull-right:first-child,.rst-content dl dt .fa-pull-right.headerlink,.rst-content h1 .fa-pull-right.headerlink,.rst-content h2 .fa-pull-right.headerlink,.rst-content h3 .fa-pull-right.headerlink,.rst-content h4 .fa-pull-right.headerlink,.rst-content h5 .fa-pull-right.headerlink,.rst-content h6 
.fa-pull-right.headerlink,.rst-content p .fa-pull-right.headerlink,.rst-content table>caption .fa-pull-right.headerlink,.rst-content tt.download span.fa-pull-right:first-child,.wy-menu-vertical li.current>a button.fa-pull-right.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-right.toctree-expand,.wy-menu-vertical li button.fa-pull-right.toctree-expand{margin-left:.3em}.pull-right{float:right}.pull-left{float:left}.fa.pull-left,.pull-left.icon,.rst-content .code-block-caption .pull-left.headerlink,.rst-content .eqno .pull-left.headerlink,.rst-content .pull-left.admonition-title,.rst-content code.download span.pull-left:first-child,.rst-content dl dt .pull-left.headerlink,.rst-content h1 .pull-left.headerlink,.rst-content h2 .pull-left.headerlink,.rst-content h3 .pull-left.headerlink,.rst-content h4 .pull-left.headerlink,.rst-content h5 .pull-left.headerlink,.rst-content h6 .pull-left.headerlink,.rst-content p .pull-left.headerlink,.rst-content table>caption .pull-left.headerlink,.rst-content tt.download span.pull-left:first-child,.wy-menu-vertical li.current>a button.pull-left.toctree-expand,.wy-menu-vertical li.on a button.pull-left.toctree-expand,.wy-menu-vertical li button.pull-left.toctree-expand{margin-right:.3em}.fa.pull-right,.pull-right.icon,.rst-content .code-block-caption .pull-right.headerlink,.rst-content .eqno .pull-right.headerlink,.rst-content .pull-right.admonition-title,.rst-content code.download span.pull-right:first-child,.rst-content dl dt .pull-right.headerlink,.rst-content h1 .pull-right.headerlink,.rst-content h2 .pull-right.headerlink,.rst-content h3 .pull-right.headerlink,.rst-content h4 .pull-right.headerlink,.rst-content h5 .pull-right.headerlink,.rst-content h6 .pull-right.headerlink,.rst-content p .pull-right.headerlink,.rst-content table>caption .pull-right.headerlink,.rst-content tt.download span.pull-right:first-child,.wy-menu-vertical li.current>a button.pull-right.toctree-expand,.wy-menu-vertical li.on a button.pull-right.toctree-expand,.wy-menu-vertical li button.pull-right.toctree-expand{margin-left:.3em}.fa-spin{-webkit-animation:fa-spin 2s linear infinite;animation:fa-spin 2s linear infinite}.fa-pulse{-webkit-animation:fa-spin 1s steps(8) infinite;animation:fa-spin 1s steps(8) infinite}@-webkit-keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}.fa-rotate-90{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=1)";-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.fa-rotate-180{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2)";-webkit-transform:rotate(180deg);-ms-transform:rotate(180deg);transform:rotate(180deg)}.fa-rotate-270{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=3)";-webkit-transform:rotate(270deg);-ms-transform:rotate(270deg);transform:rotate(270deg)}.fa-flip-horizontal{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=0, mirror=1)";-webkit-transform:scaleX(-1);-ms-transform:scaleX(-1);transform:scaleX(-1)}.fa-flip-vertical{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2, mirror=1)";-webkit-transform:scaleY(-1);-ms-transform:scaleY(-1);transform:scaleY(-1)}:root .fa-flip-horizontal,:root .fa-flip-vertical,:root .fa-rotate-90,:root .fa-rotate-180,:root 
.fa-rotate-270{filter:none}.fa-stack{position:relative;display:inline-block;width:2em;height:2em;line-height:2em;vertical-align:middle}.fa-stack-1x,.fa-stack-2x{position:absolute;left:0;width:100%;text-align:center}.fa-stack-1x{line-height:inherit}.fa-stack-2x{font-size:2em}.fa-inverse{color:#fff}.fa-glass:before{content:""}.fa-music:before{content:""}.fa-search:before,.icon-search:before{content:""}.fa-envelope-o:before{content:""}.fa-heart:before{content:""}.fa-star:before{content:""}.fa-star-o:before{content:""}.fa-user:before{content:""}.fa-film:before{content:""}.fa-th-large:before{content:""}.fa-th:before{content:""}.fa-th-list:before{content:""}.fa-check:before{content:""}.fa-close:before,.fa-remove:before,.fa-times:before{content:""}.fa-search-plus:before{content:""}.fa-search-minus:before{content:""}.fa-power-off:before{content:""}.fa-signal:before{content:""}.fa-cog:before,.fa-gear:before{content:""}.fa-trash-o:before{content:""}.fa-home:before,.icon-home:before{content:""}.fa-file-o:before{content:""}.fa-clock-o:before{content:""}.fa-road:before{content:""}.fa-download:before,.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{content:""}.fa-arrow-circle-o-down:before{content:""}.fa-arrow-circle-o-up:before{content:""}.fa-inbox:before{content:""}.fa-play-circle-o:before{content:""}.fa-repeat:before,.fa-rotate-right:before{content:""}.fa-refresh:before{content:""}.fa-list-alt:before{content:""}.fa-lock:before{content:""}.fa-flag:before{content:""}.fa-headphones:before{content:""}.fa-volume-off:before{content:""}.fa-volume-down:before{content:""}.fa-volume-up:before{content:""}.fa-qrcode:before{content:""}.fa-barcode:before{content:""}.fa-tag:before{content:""}.fa-tags:before{content:""}.fa-book:before,.icon-book:before{content:""}.fa-bookmark:before{content:""}.fa-print:before{content:""}.fa-camera:before{content:""}.fa-font:before{content:""}.fa-bold:before{content:""}.fa-italic:before{content:""}.fa-text-height:before{content:""}.fa-text-width:before{content:""}.fa-align-left:before{content:""}.fa-align-center:before{content:""}.fa-align-right:before{content:""}.fa-align-justify:before{content:""}.fa-list:before{content:""}.fa-dedent:before,.fa-outdent:before{content:""}.fa-indent:before{content:""}.fa-video-camera:before{content:""}.fa-image:before,.fa-photo:before,.fa-picture-o:before{content:""}.fa-pencil:before{content:""}.fa-map-marker:before{content:""}.fa-adjust:before{content:""}.fa-tint:before{content:""}.fa-edit:before,.fa-pencil-square-o:before{content:""}.fa-share-square-o:before{content:""}.fa-check-square-o:before{content:""}.fa-arrows:before{content:""}.fa-step-backward:before{content:""}.fa-fast-backward:before{content:""}.fa-backward:before{content:""}.fa-play:before{content:""}.fa-pause:before{content:""}.fa-stop:before{content:""}.fa-forward:before{content:""}.fa-fast-forward:before{content:""}.fa-step-forward:before{content:""}.fa-eject:before{content:""}.fa-chevron-left:before{content:""}.fa-chevron-right:before{content:""}.fa-plus-circle:before{content:""}.fa-minus-circle:before{content:""}.fa-times-circle:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before{content:""}.fa-check-circle:before,.wy-inline-validate.wy-inline-validate-success 
.wy-input-context:before{content:""}.fa-question-circle:before{content:""}.fa-info-circle:before{content:""}.fa-crosshairs:before{content:""}.fa-times-circle-o:before{content:""}.fa-check-circle-o:before{content:""}.fa-ban:before{content:""}.fa-arrow-left:before{content:""}.fa-arrow-right:before{content:""}.fa-arrow-up:before{content:""}.fa-arrow-down:before{content:""}.fa-mail-forward:before,.fa-share:before{content:""}.fa-expand:before{content:""}.fa-compress:before{content:""}.fa-plus:before{content:""}.fa-minus:before{content:""}.fa-asterisk:before{content:""}.fa-exclamation-circle:before,.rst-content .admonition-title:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before{content:""}.fa-gift:before{content:""}.fa-leaf:before{content:""}.fa-fire:before,.icon-fire:before{content:""}.fa-eye:before{content:""}.fa-eye-slash:before{content:""}.fa-exclamation-triangle:before,.fa-warning:before{content:""}.fa-plane:before{content:""}.fa-calendar:before{content:""}.fa-random:before{content:""}.fa-comment:before{content:""}.fa-magnet:before{content:""}.fa-chevron-up:before{content:""}.fa-chevron-down:before{content:""}.fa-retweet:before{content:""}.fa-shopping-cart:before{content:""}.fa-folder:before{content:""}.fa-folder-open:before{content:""}.fa-arrows-v:before{content:""}.fa-arrows-h:before{content:""}.fa-bar-chart-o:before,.fa-bar-chart:before{content:""}.fa-twitter-square:before{content:""}.fa-facebook-square:before{content:""}.fa-camera-retro:before{content:""}.fa-key:before{content:""}.fa-cogs:before,.fa-gears:before{content:""}.fa-comments:before{content:""}.fa-thumbs-o-up:before{content:""}.fa-thumbs-o-down:before{content:""}.fa-star-half:before{content:""}.fa-heart-o:before{content:""}.fa-sign-out:before{content:""}.fa-linkedin-square:before{content:""}.fa-thumb-tack:before{content:""}.fa-external-link:before{content:""}.fa-sign-in:before{content:""}.fa-trophy:before{content:""}.fa-github-square:before{content:""}.fa-upload:before{content:""}.fa-lemon-o:before{content:""}.fa-phone:before{content:""}.fa-square-o:before{content:""}.fa-bookmark-o:before{content:""}.fa-phone-square:before{content:""}.fa-twitter:before{content:""}.fa-facebook-f:before,.fa-facebook:before{content:""}.fa-github:before,.icon-github:before{content:""}.fa-unlock:before{content:""}.fa-credit-card:before{content:""}.fa-feed:before,.fa-rss:before{content:""}.fa-hdd-o:before{content:""}.fa-bullhorn:before{content:""}.fa-bell:before{content:""}.fa-certificate:before{content:""}.fa-hand-o-right:before{content:""}.fa-hand-o-left:before{content:""}.fa-hand-o-up:before{content:""}.fa-hand-o-down:before{content:""}.fa-arrow-circle-left:before,.icon-circle-arrow-left:before{content:""}.fa-arrow-circle-right:before,.icon-circle-arrow-right:before{content:""}.fa-arrow-circle-up:before{content:""}.fa-arrow-circle-down:before{content:""}.fa-globe:before{content:""}.fa-wrench:before{content:""}.fa-tasks:before{content:""}.fa-filter:before{content:""}.fa-briefcase:before{content:""}.fa-arrows-alt:before{content:""}.fa-group:before,.fa-users:before{content:""}.fa-chain:before,.fa-link:before,.icon-link:before{content:""}.fa-cloud:before{content:""}.fa-flask:before{content:""}.fa-cut:before,.fa-scissors:before{content:""}.fa-copy:before,.fa-files-o:before{content:""}.fa-paperclip:before{content:""}.fa-floppy-o:before,.fa-save:before{content:""}.fa
-square:before{content:""}.fa-bars:before,.fa-navicon:before,.fa-reorder:before{content:""}.fa-list-ul:before{content:""}.fa-list-ol:before{content:""}.fa-strikethrough:before{content:""}.fa-underline:before{content:""}.fa-table:before{content:""}.fa-magic:before{content:""}.fa-truck:before{content:""}.fa-pinterest:before{content:""}.fa-pinterest-square:before{content:""}.fa-google-plus-square:before{content:""}.fa-google-plus:before{content:""}.fa-money:before{content:""}.fa-caret-down:before,.icon-caret-down:before,.wy-dropdown .caret:before{content:""}.fa-caret-up:before{content:""}.fa-caret-left:before{content:""}.fa-caret-right:before{content:""}.fa-columns:before{content:""}.fa-sort:before,.fa-unsorted:before{content:""}.fa-sort-desc:before,.fa-sort-down:before{content:""}.fa-sort-asc:before,.fa-sort-up:before{content:""}.fa-envelope:before{content:""}.fa-linkedin:before{content:""}.fa-rotate-left:before,.fa-undo:before{content:""}.fa-gavel:before,.fa-legal:before{content:""}.fa-dashboard:before,.fa-tachometer:before{content:""}.fa-comment-o:before{content:""}.fa-comments-o:before{content:""}.fa-bolt:before,.fa-flash:before{content:""}.fa-sitemap:before{content:""}.fa-umbrella:before{content:""}.fa-clipboard:before,.fa-paste:before{content:""}.fa-lightbulb-o:before{content:""}.fa-exchange:before{content:""}.fa-cloud-download:before{content:""}.fa-cloud-upload:before{content:""}.fa-user-md:before{content:""}.fa-stethoscope:before{content:""}.fa-suitcase:before{content:""}.fa-bell-o:before{content:""}.fa-coffee:before{content:""}.fa-cutlery:before{content:""}.fa-file-text-o:before{content:""}.fa-building-o:before{content:""}.fa-hospital-o:before{content:""}.fa-ambulance:before{content:""}.fa-medkit:before{content:""}.fa-fighter-jet:before{content:""}.fa-beer:before{content:""}.fa-h-square:before{content:""}.fa-plus-square:before{content:""}.fa-angle-double-left:before{content:""}.fa-angle-double-right:before{content:""}.fa-angle-double-up:before{content:""}.fa-angle-double-down:before{content:""}.fa-angle-left:before{content:""}.fa-angle-right:before{content:""}.fa-angle-up:before{content:""}.fa-angle-down:before{content:""}.fa-desktop:before{content:""}.fa-laptop:before{content:""}.fa-tablet:before{content:""}.fa-mobile-phone:before,.fa-mobile:before{content:""}.fa-circle-o:before{content:""}.fa-quote-left:before{content:""}.fa-quote-right:before{content:""}.fa-spinner:before{content:""}.fa-circle:before{content:""}.fa-mail-reply:before,.fa-reply:before{content:""}.fa-github-alt:before{content:""}.fa-folder-o:before{content:""}.fa-folder-open-o:before{content:""}.fa-smile-o:before{content:""}.fa-frown-o:before{content:""}.fa-meh-o:before{content:""}.fa-gamepad:before{content:""}.fa-keyboard-o:before{content:""}.fa-flag-o:before{content:""}.fa-flag-checkered:before{content:""}.fa-terminal:before{content:""}.fa-code:before{content:""}.fa-mail-reply-all:before,.fa-reply-all:before{content:""}.fa-star-half-empty:before,.fa-star-half-full:before,.fa-star-half-o:before{content:""}.fa-location-arrow:before{content:""}.fa-crop:before{content:""}.fa-code-fork:before{content:""}.fa-chain-broken:before,.fa-unlink:before{content:""}.fa-question:before{content:""}.fa-info:before{content:""}.fa-exclamation:before{content:""}.fa-superscript:before{content:""}.fa-subscript:before{content:""}.fa-eraser:before{content:""}.fa-puzzle-piece:before{content:""}.fa-microphone:before{content:""}.fa-microphone-slash:
before{content:""}.fa-shield:before{content:""}.fa-calendar-o:before{content:""}.fa-fire-extinguisher:before{content:""}.fa-rocket:before{content:""}.fa-maxcdn:before{content:""}.fa-chevron-circle-left:before{content:""}.fa-chevron-circle-right:before{content:""}.fa-chevron-circle-up:before{content:""}.fa-chevron-circle-down:before{content:""}.fa-html5:before{content:""}.fa-css3:before{content:""}.fa-anchor:before{content:""}.fa-unlock-alt:before{content:""}.fa-bullseye:before{content:""}.fa-ellipsis-h:before{content:""}.fa-ellipsis-v:before{content:""}.fa-rss-square:before{content:""}.fa-play-circle:before{content:""}.fa-ticket:before{content:""}.fa-minus-square:before{content:""}.fa-minus-square-o:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before{content:""}.fa-level-up:before{content:""}.fa-level-down:before{content:""}.fa-check-square:before{content:""}.fa-pencil-square:before{content:""}.fa-external-link-square:before{content:""}.fa-share-square:before{content:""}.fa-compass:before{content:""}.fa-caret-square-o-down:before,.fa-toggle-down:before{content:""}.fa-caret-square-o-up:before,.fa-toggle-up:before{content:""}.fa-caret-square-o-right:before,.fa-toggle-right:before{content:""}.fa-eur:before,.fa-euro:before{content:""}.fa-gbp:before{content:""}.fa-dollar:before,.fa-usd:before{content:""}.fa-inr:before,.fa-rupee:before{content:""}.fa-cny:before,.fa-jpy:before,.fa-rmb:before,.fa-yen:before{content:""}.fa-rouble:before,.fa-rub:before,.fa-ruble:before{content:""}.fa-krw:before,.fa-won:before{content:""}.fa-bitcoin:before,.fa-btc:before{content:""}.fa-file:before{content:""}.fa-file-text:before{content:""}.fa-sort-alpha-asc:before{content:""}.fa-sort-alpha-desc:before{content:""}.fa-sort-amount-asc:before{content:""}.fa-sort-amount-desc:before{content:""}.fa-sort-numeric-asc:before{content:""}.fa-sort-numeric-desc:before{content:""}.fa-thumbs-up:before{content:""}.fa-thumbs-down:before{content:""}.fa-youtube-square:before{content:""}.fa-youtube:before{content:""}.fa-xing:before{content:""}.fa-xing-square:before{content:""}.fa-youtube-play:before{content:""}.fa-dropbox:before{content:""}.fa-stack-overflow:before{content:""}.fa-instagram:before{content:""}.fa-flickr:before{content:""}.fa-adn:before{content:""}.fa-bitbucket:before,.icon-bitbucket:before{content:""}.fa-bitbucket-square:before{content:""}.fa-tumblr:before{content:""}.fa-tumblr-square:before{content:""}.fa-long-arrow-down:before{content:""}.fa-long-arrow-up:before{content:""}.fa-long-arrow-left:before{content:""}.fa-long-arrow-right:before{content:""}.fa-apple:before{content:""}.fa-windows:before{content:""}.fa-android:before{content:""}.fa-linux:before{content:""}.fa-dribbble:before{content:""}.fa-skype:before{content:""}.fa-foursquare:before{content:""}.fa-trello:before{content:""}.fa-female:before{content:""}.fa-male:before{content:""}.fa-gittip:before,.fa-gratipay:before{content:""}.fa-sun-o:before{content:""}.fa-moon-o:before{content:""}.fa-archive:before{content:""}.fa-bug:before{content:""}.fa-vk:before{content:""}.fa-weibo:before{content:""}.fa-renren:before{content:""}.fa-pagelines:before{content:""}.fa-stack-exchange:before{content:""}.fa-arrow-circle-o-right:before{content:""}.fa-arrow-circle-o-left:before{content:""}.fa-caret-square-o-left:before,.fa-toggle-left:before{content:""}.fa-dot-circle-o:before{content:""}.fa-wheelchair:before{content:""}.fa-
vimeo-square:before{content:""}.fa-try:before,.fa-turkish-lira:before{content:""}.fa-plus-square-o:before,.wy-menu-vertical li button.toctree-expand:before{content:""}.fa-space-shuttle:before{content:""}.fa-slack:before{content:""}.fa-envelope-square:before{content:""}.fa-wordpress:before{content:""}.fa-openid:before{content:""}.fa-bank:before,.fa-institution:before,.fa-university:before{content:""}.fa-graduation-cap:before,.fa-mortar-board:before{content:""}.fa-yahoo:before{content:""}.fa-google:before{content:""}.fa-reddit:before{content:""}.fa-reddit-square:before{content:""}.fa-stumbleupon-circle:before{content:""}.fa-stumbleupon:before{content:""}.fa-delicious:before{content:""}.fa-digg:before{content:""}.fa-pied-piper-pp:before{content:""}.fa-pied-piper-alt:before{content:""}.fa-drupal:before{content:""}.fa-joomla:before{content:""}.fa-language:before{content:""}.fa-fax:before{content:""}.fa-building:before{content:""}.fa-child:before{content:""}.fa-paw:before{content:""}.fa-spoon:before{content:""}.fa-cube:before{content:""}.fa-cubes:before{content:""}.fa-behance:before{content:""}.fa-behance-square:before{content:""}.fa-steam:before{content:""}.fa-steam-square:before{content:""}.fa-recycle:before{content:""}.fa-automobile:before,.fa-car:before{content:""}.fa-cab:before,.fa-taxi:before{content:""}.fa-tree:before{content:""}.fa-spotify:before{content:""}.fa-deviantart:before{content:""}.fa-soundcloud:before{content:""}.fa-database:before{content:""}.fa-file-pdf-o:before{content:""}.fa-file-word-o:before{content:""}.fa-file-excel-o:before{content:""}.fa-file-powerpoint-o:before{content:""}.fa-file-image-o:before,.fa-file-photo-o:before,.fa-file-picture-o:before{content:""}.fa-file-archive-o:before,.fa-file-zip-o:before{content:""}.fa-file-audio-o:before,.fa-file-sound-o:before{content:""}.fa-file-movie-o:before,.fa-file-video-o:before{content:""}.fa-file-code-o:before{content:""}.fa-vine:before{content:""}.fa-codepen:before{content:""}.fa-jsfiddle:before{content:""}.fa-life-bouy:before,.fa-life-buoy:before,.fa-life-ring:before,.fa-life-saver:before,.fa-support:before{content:""}.fa-circle-o-notch:before{content:""}.fa-ra:before,.fa-rebel:before,.fa-resistance:before{content:""}.fa-empire:before,.fa-ge:before{content:""}.fa-git-square:before{content:""}.fa-git:before{content:""}.fa-hacker-news:before,.fa-y-combinator-square:before,.fa-yc-square:before{content:""}.fa-tencent-weibo:before{content:""}.fa-qq:before{content:""}.fa-wechat:before,.fa-weixin:before{content:""}.fa-paper-plane:before,.fa-send:before{content:""}.fa-paper-plane-o:before,.fa-send-o:before{content:""}.fa-history:before{content:""}.fa-circle-thin:before{content:""}.fa-header:before{content:""}.fa-paragraph:before{content:""}.fa-sliders:before{content:""}.fa-share-alt:before{content:""}.fa-share-alt-square:before{content:""}.fa-bomb:before{content:""}.fa-futbol-o:before,.fa-soccer-ball-o:before{content:""}.fa-tty:before{content:""}.fa-binoculars:before{content:""}.fa-plug:before{content:""}.fa-slideshare:before{content:""}.fa-twitch:before{content:""}.fa-yelp:before{content:""}.fa-newspaper-o:before{content:""}.fa-wifi:before{content:""}.fa-calculator:before{content:""}.fa-paypal:before{content:""}.fa-google-wallet:before{content:""}.fa-cc-visa:before{content:""}.fa-cc-mastercard:before{content:""}.fa-cc-discover:before{content:""}.fa-cc-amex:before{content:""}.fa-cc-paypal:before{content:""}.fa-cc-stripe:before{content:""}.fa-b
ell-slash:before{content:""}.fa-bell-slash-o:before{content:""}.fa-trash:before{content:""}.fa-copyright:before{content:""}.fa-at:before{content:""}.fa-eyedropper:before{content:""}.fa-paint-brush:before{content:""}.fa-birthday-cake:before{content:""}.fa-area-chart:before{content:""}.fa-pie-chart:before{content:""}.fa-line-chart:before{content:""}.fa-lastfm:before{content:""}.fa-lastfm-square:before{content:""}.fa-toggle-off:before{content:""}.fa-toggle-on:before{content:""}.fa-bicycle:before{content:""}.fa-bus:before{content:""}.fa-ioxhost:before{content:""}.fa-angellist:before{content:""}.fa-cc:before{content:""}.fa-ils:before,.fa-shekel:before,.fa-sheqel:before{content:""}.fa-meanpath:before{content:""}.fa-buysellads:before{content:""}.fa-connectdevelop:before{content:""}.fa-dashcube:before{content:""}.fa-forumbee:before{content:""}.fa-leanpub:before{content:""}.fa-sellsy:before{content:""}.fa-shirtsinbulk:before{content:""}.fa-simplybuilt:before{content:""}.fa-skyatlas:before{content:""}.fa-cart-plus:before{content:""}.fa-cart-arrow-down:before{content:""}.fa-diamond:before{content:""}.fa-ship:before{content:""}.fa-user-secret:before{content:""}.fa-motorcycle:before{content:""}.fa-street-view:before{content:""}.fa-heartbeat:before{content:""}.fa-venus:before{content:""}.fa-mars:before{content:""}.fa-mercury:before{content:""}.fa-intersex:before,.fa-transgender:before{content:""}.fa-transgender-alt:before{content:""}.fa-venus-double:before{content:""}.fa-mars-double:before{content:""}.fa-venus-mars:before{content:""}.fa-mars-stroke:before{content:""}.fa-mars-stroke-v:before{content:""}.fa-mars-stroke-h:before{content:""}.fa-neuter:before{content:""}.fa-genderless:before{content:""}.fa-facebook-official:before{content:""}.fa-pinterest-p:before{content:""}.fa-whatsapp:before{content:""}.fa-server:before{content:""}.fa-user-plus:before{content:""}.fa-user-times:before{content:""}.fa-bed:before,.fa-hotel:before{content:""}.fa-viacoin:before{content:""}.fa-train:before{content:""}.fa-subway:before{content:""}.fa-medium:before{content:""}.fa-y-combinator:before,.fa-yc:before{content:""}.fa-optin-monster:before{content:""}.fa-opencart:before{content:""}.fa-expeditedssl:before{content:""}.fa-battery-4:before,.fa-battery-full:before,.fa-battery:before{content:""}.fa-battery-3:before,.fa-battery-three-quarters:before{content:""}.fa-battery-2:before,.fa-battery-half:before{content:""}.fa-battery-1:before,.fa-battery-quarter:before{content:""}.fa-battery-0:before,.fa-battery-empty:before{content:""}.fa-mouse-pointer:before{content:""}.fa-i-cursor:before{content:""}.fa-object-group:before{content:""}.fa-object-ungroup:before{content:""}.fa-sticky-note:before{content:""}.fa-sticky-note-o:before{content:""}.fa-cc-jcb:before{content:""}.fa-cc-diners-club:before{content:""}.fa-clone:before{content:""}.fa-balance-scale:before{content:""}.fa-hourglass-o:before{content:""}.fa-hourglass-1:before,.fa-hourglass-start:before{content:""}.fa-hourglass-2:before,.fa-hourglass-half:before{content:""}.fa-hourglass-3:before,.fa-hourglass-end:before{content:""}.fa-hourglass:before{content:""}.fa-hand-grab-o:before,.fa-hand-rock-o:before{content:""}.fa-hand-paper-o:before,.fa-hand-stop-o:before{content:""}.fa-hand-scissors-o:before{content:""}.fa-hand-lizard-o:before{content:""}.fa-hand-spock-o:before{content:""}.fa-hand-pointer-o:before{content:""}.fa-hand-peace-o:before{content:""}.fa-trademark:before{content:""}.fa-register
ed:before{content:""}.fa-creative-commons:before{content:""}.fa-gg:before{content:""}.fa-gg-circle:before{content:""}.fa-tripadvisor:before{content:""}.fa-odnoklassniki:before{content:""}.fa-odnoklassniki-square:before{content:""}.fa-get-pocket:before{content:""}.fa-wikipedia-w:before{content:""}.fa-safari:before{content:""}.fa-chrome:before{content:""}.fa-firefox:before{content:""}.fa-opera:before{content:""}.fa-internet-explorer:before{content:""}.fa-television:before,.fa-tv:before{content:""}.fa-contao:before{content:""}.fa-500px:before{content:""}.fa-amazon:before{content:""}.fa-calendar-plus-o:before{content:""}.fa-calendar-minus-o:before{content:""}.fa-calendar-times-o:before{content:""}.fa-calendar-check-o:before{content:""}.fa-industry:before{content:""}.fa-map-pin:before{content:""}.fa-map-signs:before{content:""}.fa-map-o:before{content:""}.fa-map:before{content:""}.fa-commenting:before{content:""}.fa-commenting-o:before{content:""}.fa-houzz:before{content:""}.fa-vimeo:before{content:""}.fa-black-tie:before{content:""}.fa-fonticons:before{content:""}.fa-reddit-alien:before{content:""}.fa-edge:before{content:""}.fa-credit-card-alt:before{content:""}.fa-codiepie:before{content:""}.fa-modx:before{content:""}.fa-fort-awesome:before{content:""}.fa-usb:before{content:""}.fa-product-hunt:before{content:""}.fa-mixcloud:before{content:""}.fa-scribd:before{content:""}.fa-pause-circle:before{content:""}.fa-pause-circle-o:before{content:""}.fa-stop-circle:before{content:""}.fa-stop-circle-o:before{content:""}.fa-shopping-bag:before{content:""}.fa-shopping-basket:before{content:""}.fa-hashtag:before{content:""}.fa-bluetooth:before{content:""}.fa-bluetooth-b:before{content:""}.fa-percent:before{content:""}.fa-gitlab:before,.icon-gitlab:before{content:""}.fa-wpbeginner:before{content:""}.fa-wpforms:before{content:""}.fa-envira:before{content:""}.fa-universal-access:before{content:""}.fa-wheelchair-alt:before{content:""}.fa-question-circle-o:before{content:""}.fa-blind:before{content:""}.fa-audio-description:before{content:""}.fa-volume-control-phone:before{content:""}.fa-braille:before{content:""}.fa-assistive-listening-systems:before{content:""}.fa-american-sign-language-interpreting:before,.fa-asl-interpreting:before{content:""}.fa-deaf:before,.fa-deafness:before,.fa-hard-of-hearing:before{content:""}.fa-glide:before{content:""}.fa-glide-g:before{content:""}.fa-sign-language:before,.fa-signing:before{content:""}.fa-low-vision:before{content:""}.fa-viadeo:before{content:""}.fa-viadeo-square:before{content:""}.fa-snapchat:before{content:""}.fa-snapchat-ghost:before{content:""}.fa-snapchat-square:before{content:""}.fa-pied-piper:before{content:""}.fa-first-order:before{content:""}.fa-yoast:before{content:""}.fa-themeisle:before{content:""}.fa-google-plus-circle:before,.fa-google-plus-official:before{content:""}.fa-fa:before,.fa-font-awesome:before{content:""}.fa-handshake-o:before{content:""}.fa-envelope-open:before{content:""}.fa-envelope-open-o:before{content:""}.fa-linode:before{content:""}.fa-address-book:before{content:""}.fa-address-book-o:before{content:""}.fa-address-card:before,.fa-vcard:before{content:""}.fa-address-card-o:before,.fa-vcard-o:before{content:""}.fa-user-circle:before{content:""}.fa-user-circle-o:before{content:""}.fa-user-o:before{content:""}.fa-id-badge:before{content:""}.fa-drivers-license:before,.fa-id-card:before{content:""}.fa-drivers-license-o:before,.fa-id-card-o:before{c
ontent:""}.fa-quora:before{content:""}.fa-free-code-camp:before{content:""}.fa-telegram:before{content:""}.fa-thermometer-4:before,.fa-thermometer-full:before,.fa-thermometer:before{content:""}.fa-thermometer-3:before,.fa-thermometer-three-quarters:before{content:""}.fa-thermometer-2:before,.fa-thermometer-half:before{content:""}.fa-thermometer-1:before,.fa-thermometer-quarter:before{content:""}.fa-thermometer-0:before,.fa-thermometer-empty:before{content:""}.fa-shower:before{content:""}.fa-bath:before,.fa-bathtub:before,.fa-s15:before{content:""}.fa-podcast:before{content:""}.fa-window-maximize:before{content:""}.fa-window-minimize:before{content:""}.fa-window-restore:before{content:""}.fa-times-rectangle:before,.fa-window-close:before{content:""}.fa-times-rectangle-o:before,.fa-window-close-o:before{content:""}.fa-bandcamp:before{content:""}.fa-grav:before{content:""}.fa-etsy:before{content:""}.fa-imdb:before{content:""}.fa-ravelry:before{content:""}.fa-eercast:before{content:""}.fa-microchip:before{content:""}.fa-snowflake-o:before{content:""}.fa-superpowers:before{content:""}.fa-wpexplorer:before{content:""}.fa-meetup:before{content:""}.sr-only{position:absolute;width:1px;height:1px;padding:0;margin:-1px;overflow:hidden;clip:rect(0,0,0,0);border:0}.sr-only-focusable:active,.sr-only-focusable:focus{position:static;width:auto;height:auto;margin:0;overflow:visible;clip:auto}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-dropdown .caret,.wy-inline-validate.wy-inline-validate-danger .wy-input-context,.wy-inline-validate.wy-inline-validate-info .wy-input-context,.wy-inline-validate.wy-inline-validate-success .wy-input-context,.wy-inline-validate.wy-inline-validate-warning .wy-input-context,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{font-family:inherit}.fa:before,.icon:before,.rst-content .admonition-title:before,.rst-content .code-block-caption .headerlink:before,.rst-content .eqno .headerlink:before,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li 
button.toctree-expand:before{font-family:FontAwesome;display:inline-block;font-style:normal;font-weight:400;line-height:1;text-decoration:inherit}.rst-content .code-block-caption a .headerlink,.rst-content .eqno a .headerlink,.rst-content a .admonition-title,.rst-content code.download a span:first-child,.rst-content dl dt a .headerlink,.rst-content h1 a .headerlink,.rst-content h2 a .headerlink,.rst-content h3 a .headerlink,.rst-content h4 a .headerlink,.rst-content h5 a .headerlink,.rst-content h6 a .headerlink,.rst-content p.caption a .headerlink,.rst-content p a .headerlink,.rst-content table>caption a .headerlink,.rst-content tt.download a span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li a button.toctree-expand,a .fa,a .icon,a .rst-content .admonition-title,a .rst-content .code-block-caption .headerlink,a .rst-content .eqno .headerlink,a .rst-content code.download span:first-child,a .rst-content dl dt .headerlink,a .rst-content h1 .headerlink,a .rst-content h2 .headerlink,a .rst-content h3 .headerlink,a .rst-content h4 .headerlink,a .rst-content h5 .headerlink,a .rst-content h6 .headerlink,a .rst-content p.caption .headerlink,a .rst-content p .headerlink,a .rst-content table>caption .headerlink,a .rst-content tt.download span:first-child,a .wy-menu-vertical li button.toctree-expand{display:inline-block;text-decoration:inherit}.btn .fa,.btn .icon,.btn .rst-content .admonition-title,.btn .rst-content .code-block-caption .headerlink,.btn .rst-content .eqno .headerlink,.btn .rst-content code.download span:first-child,.btn .rst-content dl dt .headerlink,.btn .rst-content h1 .headerlink,.btn .rst-content h2 .headerlink,.btn .rst-content h3 .headerlink,.btn .rst-content h4 .headerlink,.btn .rst-content h5 .headerlink,.btn .rst-content h6 .headerlink,.btn .rst-content p .headerlink,.btn .rst-content table>caption .headerlink,.btn .rst-content tt.download span:first-child,.btn .wy-menu-vertical li.current>a button.toctree-expand,.btn .wy-menu-vertical li.on a button.toctree-expand,.btn .wy-menu-vertical li button.toctree-expand,.nav .fa,.nav .icon,.nav .rst-content .admonition-title,.nav .rst-content .code-block-caption .headerlink,.nav .rst-content .eqno .headerlink,.nav .rst-content code.download span:first-child,.nav .rst-content dl dt .headerlink,.nav .rst-content h1 .headerlink,.nav .rst-content h2 .headerlink,.nav .rst-content h3 .headerlink,.nav .rst-content h4 .headerlink,.nav .rst-content h5 .headerlink,.nav .rst-content h6 .headerlink,.nav .rst-content p .headerlink,.nav .rst-content table>caption .headerlink,.nav .rst-content tt.download span:first-child,.nav .wy-menu-vertical li.current>a button.toctree-expand,.nav .wy-menu-vertical li.on a button.toctree-expand,.nav .wy-menu-vertical li button.toctree-expand,.rst-content .btn .admonition-title,.rst-content .code-block-caption .btn .headerlink,.rst-content .code-block-caption .nav .headerlink,.rst-content .eqno .btn .headerlink,.rst-content .eqno .nav .headerlink,.rst-content .nav .admonition-title,.rst-content code.download .btn span:first-child,.rst-content code.download .nav span:first-child,.rst-content dl dt .btn .headerlink,.rst-content dl dt .nav .headerlink,.rst-content h1 .btn .headerlink,.rst-content h1 .nav .headerlink,.rst-content h2 .btn .headerlink,.rst-content h2 .nav .headerlink,.rst-content h3 .btn .headerlink,.rst-content h3 .nav .headerlink,.rst-content h4 .btn .headerlink,.rst-content h4 .nav .headerlink,.rst-content h5 .btn 
.footnote-reference>span.fn-bracket{display:none}.rst-content .hlist{width:100%}.rst-content dl dt span.classifier:before{content:" : "}.rst-content dl dt span.classifier-delimiter{display:none!important}html.writer-html4 .rst-content table.docutils.citation,html.writer-html4 .rst-content table.docutils.footnote{background:none;border:none}html.writer-html4 .rst-content table.docutils.citation td,html.writer-html4 .rst-content table.docutils.citation tr,html.writer-html4 .rst-content table.docutils.footnote td,html.writer-html4 .rst-content table.docutils.footnote tr{border:none;background-color:transparent!important;white-space:normal}html.writer-html4 .rst-content table.docutils.citation td.label,html.writer-html4 .rst-content table.docutils.footnote td.label{padding-left:0;padding-right:0;vertical-align:top}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{display:grid;grid-template-columns:auto minmax(80%,95%)}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{display:inline-grid;grid-template-columns:max-content auto}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{display:grid;grid-template-columns:auto auto minmax(.65rem,auto) minmax(40%,95%)}html.writer-html5 .rst-content aside.citation>span.label,html.writer-html5 .rst-content aside.footnote>span.label,html.writer-html5 .rst-content div.citation>span.label{grid-column-start:1;grid-column-end:2}html.writer-html5 .rst-content aside.citation>span.backrefs,html.writer-html5 .rst-content aside.footnote>span.backrefs,html.writer-html5 .rst-content div.citation>span.backrefs{grid-column-start:2;grid-column-end:3;grid-row-start:1;grid-row-end:3}html.writer-html5 .rst-content aside.citation>p,html.writer-html5 .rst-content aside.footnote>p,html.writer-html5 .rst-content div.citation>p{grid-column-start:4;grid-column-end:5}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{margin-bottom:24px}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{padding-left:1rem}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dd,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dd,html.writer-html5 .rst-content dl.footnote>dt{margin-bottom:0}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{font-size:.9rem}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.footnote>dt{margin:0 .5rem .5rem 0;line-height:1.2rem;word-break:break-all;font-weight:400}html.writer-html5 .rst-content dl.citation>dt>span.brackets:before,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:before{content:"["}html.writer-html5 .rst-content dl.citation>dt>span.brackets:after,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:after{content:"]"}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a,html.writer-html5 
.rst-content dl.footnote>dt>span.fn-backref>a{word-break:keep-all}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a:not(:first-child):before,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.footnote>dd{margin:0 0 .5rem;line-height:1.2rem}html.writer-html5 .rst-content dl.citation>dd p,html.writer-html5 .rst-content dl.footnote>dd p{font-size:.9rem}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{padding-left:1rem;padding-right:1rem;font-size:.9rem;line-height:1.2rem}html.writer-html5 .rst-content aside.citation p,html.writer-html5 .rst-content aside.footnote p,html.writer-html5 .rst-content div.citation p{font-size:.9rem;line-height:1.2rem;margin-bottom:12px}html.writer-html5 .rst-content aside.citation span.backrefs,html.writer-html5 .rst-content aside.footnote span.backrefs,html.writer-html5 .rst-content div.citation span.backrefs{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content aside.citation span.backrefs>a,html.writer-html5 .rst-content aside.footnote span.backrefs>a,html.writer-html5 .rst-content div.citation span.backrefs>a{word-break:keep-all}html.writer-html5 .rst-content aside.citation span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content aside.footnote span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content div.citation span.backrefs>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content aside.citation span.label,html.writer-html5 .rst-content aside.footnote span.label,html.writer-html5 .rst-content div.citation span.label{line-height:1.2rem}html.writer-html5 .rst-content aside.citation-list,html.writer-html5 .rst-content aside.footnote-list,html.writer-html5 .rst-content div.citation-list{margin-bottom:24px}html.writer-html5 .rst-content dl.option-list kbd{font-size:.9rem}.rst-content table.docutils.footnote,html.writer-html4 .rst-content table.docutils.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content aside.footnote-list aside.footnote,html.writer-html5 .rst-content div.citation-list>div.citation,html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{color:grey}.rst-content table.docutils.footnote code,.rst-content table.docutils.footnote tt,html.writer-html4 .rst-content table.docutils.citation code,html.writer-html4 .rst-content table.docutils.citation tt,html.writer-html5 .rst-content aside.footnote-list aside.footnote code,html.writer-html5 .rst-content aside.footnote-list aside.footnote tt,html.writer-html5 .rst-content aside.footnote code,html.writer-html5 .rst-content aside.footnote tt,html.writer-html5 .rst-content div.citation-list>div.citation code,html.writer-html5 .rst-content div.citation-list>div.citation tt,html.writer-html5 .rst-content dl.citation code,html.writer-html5 .rst-content dl.citation tt,html.writer-html5 .rst-content dl.footnote code,html.writer-html5 .rst-content dl.footnote tt{color:#555}.rst-content .wy-table-responsive.citation,.rst-content .wy-table-responsive.footnote{margin-bottom:0}.rst-content .wy-table-responsive.citation+:not(.citation),.rst-content .wy-table-responsive.footnote+:not(.footnote){margin-top:24px}.rst-content .wy-table-responsive.citation:last-child,.rst-content 
.wy-table-responsive.footnote:last-child{margin-bottom:24px}.rst-content table.docutils th{border-color:#e1e4e5}html.writer-html5 .rst-content table.docutils th{border:1px solid #e1e4e5}html.writer-html5 .rst-content table.docutils td>p,html.writer-html5 .rst-content table.docutils th>p{line-height:1rem;margin-bottom:0;font-size:.9rem}.rst-content table.docutils td .last,.rst-content table.docutils td .last>:last-child{margin-bottom:0}.rst-content table.field-list,.rst-content table.field-list td{border:none}.rst-content table.field-list td p{line-height:inherit}.rst-content table.field-list td>strong{display:inline-block}.rst-content table.field-list .field-name{padding-right:10px;text-align:left;white-space:nowrap}.rst-content table.field-list .field-body{text-align:left}.rst-content code,.rst-content tt{color:#000;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;padding:2px 5px}.rst-content code big,.rst-content code em,.rst-content tt big,.rst-content tt em{font-size:100%!important;line-height:normal}.rst-content code.literal,.rst-content tt.literal{color:#e74c3c;white-space:normal}.rst-content code.xref,.rst-content tt.xref,a .rst-content code,a .rst-content tt{font-weight:700;color:#404040;overflow-wrap:normal}.rst-content kbd,.rst-content pre,.rst-content samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace}.rst-content a code,.rst-content a tt{color:#2980b9}.rst-content dl{margin-bottom:24px}.rst-content dl dt{font-weight:700;margin-bottom:12px}.rst-content dl ol,.rst-content dl p,.rst-content dl table,.rst-content dl ul{margin-bottom:12px}.rst-content dl dd{margin:0 0 12px 24px;line-height:24px}.rst-content dl dd>ol:last-child,.rst-content dl dd>p:last-child,.rst-content dl dd>table:last-child,.rst-content dl dd>ul:last-child{margin-bottom:0}html.writer-html4 .rst-content dl:not(.docutils),html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple){margin-bottom:24px}html.writer-html4 .rst-content dl:not(.docutils)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{display:table;margin:6px 0;font-size:90%;line-height:normal;background:#e7f2fa;color:#2980b9;border-top:3px solid #6ab0de;padding:6px;position:relative}html.writer-html4 .rst-content dl:not(.docutils)>dt:before,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:before{color:#6ab0de}html.writer-html4 .rst-content dl:not(.docutils)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{margin-bottom:6px;border:none;border-left:3px solid #ccc;background:#f0f0f0;color:#555}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink,html.writer-html5 .rst-content 
dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px;max-width:100%}html.writer-html4 .rst-content dl:not(.docutils) .k,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .k{font-style:italic}html.writer-html4 .rst-content dl:not(.docutils) .descclassname,html.writer-html4 .rst-content dl:not(.docutils) .descname,html.writer-html4 .rst-content dl:not(.docutils) .sig-name,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .sig-name{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#000}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content 
code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel,.rst-content .menuselection{font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content .guilabel,.rst-content .menuselection{border:1px solid #7fbbe3;background:#e7f2fa}.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>.kbd,.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>kbd{color:inherit;font-size:80%;background-color:#fff;border:1px solid #a6a6a6;border-radius:4px;box-shadow:0 2px grey;padding:2.4px 6px;margin:auto 0}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content .sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_static/doctools.js b/branch/bicounty_2035_hwy_update/_static/doctools.js new file mode 100644 index 0000000..d06a71d --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/doctools.js @@ -0,0 +1,156 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Base JavaScript utilities for all Sphinx HTML documentation. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +const BLACKLISTED_KEY_CONTROL_ELEMENTS = new Set([ + "TEXTAREA", + "INPUT", + "SELECT", + "BUTTON", +]); + +const _ready = (callback) => { + if (document.readyState !== "loading") { + callback(); + } else { + document.addEventListener("DOMContentLoaded", callback); + } +}; + +/** + * Small JavaScript module for the documentation. 
+ */ +const Documentation = { + init: () => { + Documentation.initDomainIndexTable(); + Documentation.initOnKeyListeners(); + }, + + /** + * i18n support + */ + TRANSLATIONS: {}, + PLURAL_EXPR: (n) => (n === 1 ? 0 : 1), + LOCALE: "unknown", + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext: (string) => { + const translated = Documentation.TRANSLATIONS[string]; + switch (typeof translated) { + case "undefined": + return string; // no translation + case "string": + return translated; // translation exists + default: + return translated[0]; // (singular, plural) translation tuple exists + } + }, + + ngettext: (singular, plural, n) => { + const translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated !== "undefined") + return translated[Documentation.PLURAL_EXPR(n)]; + return n === 1 ? singular : plural; + }, + + addTranslations: (catalog) => { + Object.assign(Documentation.TRANSLATIONS, catalog.messages); + Documentation.PLURAL_EXPR = new Function( + "n", + `return (${catalog.plural_expr})` + ); + Documentation.LOCALE = catalog.locale; + }, + + /** + * helper function to focus on search bar + */ + focusSearchBar: () => { + document.querySelectorAll("input[name=q]")[0]?.focus(); + }, + + /** + * Initialise the domain index toggle buttons + */ + initDomainIndexTable: () => { + const toggler = (el) => { + const idNumber = el.id.substr(7); + const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); + if (el.src.substr(-9) === "minus.png") { + el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; + toggledRows.forEach((el) => (el.style.display = "none")); + } else { + el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; + toggledRows.forEach((el) => (el.style.display = "")); + } + }; + + const togglerElements = document.querySelectorAll("img.toggler"); + togglerElements.forEach((el) => + el.addEventListener("click", (event) => toggler(event.currentTarget)) + ); + togglerElements.forEach((el) => (el.style.display = "")); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); + }, + + initOnKeyListeners: () => { + // only install a listener if it is really needed + if ( + !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && + !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS + ) + return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.altKey || event.ctrlKey || event.metaKey) return; + + if (!event.shiftKey) { + switch (event.key) { + case "ArrowLeft": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const prevLink = document.querySelector('link[rel="prev"]'); + if (prevLink && prevLink.href) { + window.location.href = prevLink.href; + event.preventDefault(); + } + break; + case "ArrowRight": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const nextLink = document.querySelector('link[rel="next"]'); + if (nextLink && nextLink.href) { + window.location.href = nextLink.href; + event.preventDefault(); + } + break; + } + } + + // some keyboard layouts may need Shift to get / + switch (event.key) { + case "/": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.focusSearchBar(); + event.preventDefault(); + } + }); + }, +}; + +// quick alias for translations +const _ = Documentation.gettext; + +_ready(Documentation.init); diff --git 
a/branch/bicounty_2035_hwy_update/_static/documentation_options.js b/branch/bicounty_2035_hwy_update/_static/documentation_options.js new file mode 100644 index 0000000..c066c69 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/documentation_options.js @@ -0,0 +1,14 @@ +var DOCUMENTATION_OPTIONS = { + URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'), + VERSION: '', + LANGUAGE: 'en', + COLLAPSE_INDEX: false, + BUILDER: 'dirhtml', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false, + SHOW_SEARCH_SUMMARY: true, + ENABLE_SEARCH_SHORTCUTS: true, +}; \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_static/file.png b/branch/bicounty_2035_hwy_update/_static/file.png new file mode 100644 index 0000000..a858a41 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/file.png differ diff --git a/branch/bicounty_2035_hwy_update/_static/graphviz.css b/branch/bicounty_2035_hwy_update/_static/graphviz.css new file mode 100644 index 0000000..8d81c02 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/graphviz.css @@ -0,0 +1,19 @@ +/* + * graphviz.css + * ~~~~~~~~~~~~ + * + * Sphinx stylesheet -- graphviz extension. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +img.graphviz { + border: 0; + max-width: 100%; +} + +object.graphviz { + max-width: 100%; +} diff --git a/branch/bicounty_2035_hwy_update/_static/jquery.js b/branch/bicounty_2035_hwy_update/_static/jquery.js new file mode 100644 index 0000000..c4c6022 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/jquery.js @@ -0,0 +1,2 @@ +/*! jQuery v3.6.0 | (c) OpenJS Foundation and other contributors | jquery.org/license */ +!function(e,t){"use strict";"object"==typeof module&&"object"==typeof module.exports?module.exports=e.document?t(e,!0):function(e){if(!e.document)throw new Error("jQuery requires a window with a document");return t(e)}:t(e)}("undefined"!=typeof window?window:this,function(C,e){"use strict";var t=[],r=Object.getPrototypeOf,s=t.slice,g=t.flat?function(e){return t.flat.call(e)}:function(e){return t.concat.apply([],e)},u=t.push,i=t.indexOf,n={},o=n.toString,v=n.hasOwnProperty,a=v.toString,l=a.call(Object),y={},m=function(e){return"function"==typeof e&&"number"!=typeof e.nodeType&&"function"!=typeof e.item},x=function(e){return null!=e&&e===e.window},E=C.document,c={type:!0,src:!0,nonce:!0,noModule:!0};function b(e,t,n){var r,i,o=(n=n||E).createElement("script");if(o.text=e,t)for(r in c)(i=t[r]||t.getAttribute&&t.getAttribute(r))&&o.setAttribute(r,i);n.head.appendChild(o).parentNode.removeChild(o)}function w(e){return null==e?e+"":"object"==typeof e||"function"==typeof e?n[o.call(e)]||"object":typeof e}var f="3.6.0",S=function(e,t){return new S.fn.init(e,t)};function p(e){var t=!!e&&"length"in e&&e.length,n=w(e);return!m(e)&&!x(e)&&("array"===n||0===t||"number"==typeof t&&0+~]|"+M+")"+M+"*"),U=new RegExp(M+"|>"),X=new RegExp(F),V=new RegExp("^"+I+"$"),G={ID:new RegExp("^#("+I+")"),CLASS:new RegExp("^\\.("+I+")"),TAG:new RegExp("^("+I+"|[*])"),ATTR:new RegExp("^"+W),PSEUDO:new RegExp("^"+F),CHILD:new RegExp("^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\("+M+"*(even|odd|(([+-]|)(\\d*)n|)"+M+"*(?:([+-]|)"+M+"*(\\d+)|))"+M+"*\\)|)","i"),bool:new RegExp("^(?:"+R+")$","i"),needsContext:new 
RegExp("^"+M+"*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\("+M+"*((?:-\\d)?\\d*)"+M+"*\\)|)(?=[^-]|$)","i")},Y=/HTML$/i,Q=/^(?:input|select|textarea|button)$/i,J=/^h\d$/i,K=/^[^{]+\{\s*\[native \w/,Z=/^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/,ee=/[+~]/,te=new RegExp("\\\\[\\da-fA-F]{1,6}"+M+"?|\\\\([^\\r\\n\\f])","g"),ne=function(e,t){var n="0x"+e.slice(1)-65536;return t||(n<0?String.fromCharCode(n+65536):String.fromCharCode(n>>10|55296,1023&n|56320))},re=/([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g,ie=function(e,t){return t?"\0"===e?"\ufffd":e.slice(0,-1)+"\\"+e.charCodeAt(e.length-1).toString(16)+" ":"\\"+e},oe=function(){T()},ae=be(function(e){return!0===e.disabled&&"fieldset"===e.nodeName.toLowerCase()},{dir:"parentNode",next:"legend"});try{H.apply(t=O.call(p.childNodes),p.childNodes),t[p.childNodes.length].nodeType}catch(e){H={apply:t.length?function(e,t){L.apply(e,O.call(t))}:function(e,t){var n=e.length,r=0;while(e[n++]=t[r++]);e.length=n-1}}}function se(t,e,n,r){var i,o,a,s,u,l,c,f=e&&e.ownerDocument,p=e?e.nodeType:9;if(n=n||[],"string"!=typeof t||!t||1!==p&&9!==p&&11!==p)return n;if(!r&&(T(e),e=e||C,E)){if(11!==p&&(u=Z.exec(t)))if(i=u[1]){if(9===p){if(!(a=e.getElementById(i)))return n;if(a.id===i)return n.push(a),n}else if(f&&(a=f.getElementById(i))&&y(e,a)&&a.id===i)return n.push(a),n}else{if(u[2])return H.apply(n,e.getElementsByTagName(t)),n;if((i=u[3])&&d.getElementsByClassName&&e.getElementsByClassName)return H.apply(n,e.getElementsByClassName(i)),n}if(d.qsa&&!N[t+" "]&&(!v||!v.test(t))&&(1!==p||"object"!==e.nodeName.toLowerCase())){if(c=t,f=e,1===p&&(U.test(t)||z.test(t))){(f=ee.test(t)&&ye(e.parentNode)||e)===e&&d.scope||((s=e.getAttribute("id"))?s=s.replace(re,ie):e.setAttribute("id",s=S)),o=(l=h(t)).length;while(o--)l[o]=(s?"#"+s:":scope")+" "+xe(l[o]);c=l.join(",")}try{return H.apply(n,f.querySelectorAll(c)),n}catch(e){N(t,!0)}finally{s===S&&e.removeAttribute("id")}}}return g(t.replace($,"$1"),e,n,r)}function ue(){var r=[];return function e(t,n){return r.push(t+" ")>b.cacheLength&&delete e[r.shift()],e[t+" "]=n}}function le(e){return e[S]=!0,e}function ce(e){var t=C.createElement("fieldset");try{return!!e(t)}catch(e){return!1}finally{t.parentNode&&t.parentNode.removeChild(t),t=null}}function fe(e,t){var n=e.split("|"),r=n.length;while(r--)b.attrHandle[n[r]]=t}function pe(e,t){var n=t&&e,r=n&&1===e.nodeType&&1===t.nodeType&&e.sourceIndex-t.sourceIndex;if(r)return r;if(n)while(n=n.nextSibling)if(n===t)return-1;return e?1:-1}function de(t){return function(e){return"input"===e.nodeName.toLowerCase()&&e.type===t}}function he(n){return function(e){var t=e.nodeName.toLowerCase();return("input"===t||"button"===t)&&e.type===n}}function ge(t){return function(e){return"form"in e?e.parentNode&&!1===e.disabled?"label"in e?"label"in e.parentNode?e.parentNode.disabled===t:e.disabled===t:e.isDisabled===t||e.isDisabled!==!t&&ae(e)===t:e.disabled===t:"label"in e&&e.disabled===t}}function ve(a){return le(function(o){return o=+o,le(function(e,t){var n,r=a([],e.length,o),i=r.length;while(i--)e[n=r[i]]&&(e[n]=!(t[n]=e[n]))})})}function ye(e){return e&&"undefined"!=typeof e.getElementsByTagName&&e}for(e in d=se.support={},i=se.isXML=function(e){var t=e&&e.namespaceURI,n=e&&(e.ownerDocument||e).documentElement;return!Y.test(t||n&&n.nodeName||"HTML")},T=se.setDocument=function(e){var t,n,r=e?e.ownerDocument||e:p;return 
r!=C&&9===r.nodeType&&r.documentElement&&(a=(C=r).documentElement,E=!i(C),p!=C&&(n=C.defaultView)&&n.top!==n&&(n.addEventListener?n.addEventListener("unload",oe,!1):n.attachEvent&&n.attachEvent("onunload",oe)),d.scope=ce(function(e){return a.appendChild(e).appendChild(C.createElement("div")),"undefined"!=typeof e.querySelectorAll&&!e.querySelectorAll(":scope fieldset div").length}),d.attributes=ce(function(e){return e.className="i",!e.getAttribute("className")}),d.getElementsByTagName=ce(function(e){return e.appendChild(C.createComment("")),!e.getElementsByTagName("*").length}),d.getElementsByClassName=K.test(C.getElementsByClassName),d.getById=ce(function(e){return a.appendChild(e).id=S,!C.getElementsByName||!C.getElementsByName(S).length}),d.getById?(b.filter.ID=function(e){var t=e.replace(te,ne);return function(e){return e.getAttribute("id")===t}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n=t.getElementById(e);return n?[n]:[]}}):(b.filter.ID=function(e){var n=e.replace(te,ne);return function(e){var t="undefined"!=typeof e.getAttributeNode&&e.getAttributeNode("id");return t&&t.value===n}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n,r,i,o=t.getElementById(e);if(o){if((n=o.getAttributeNode("id"))&&n.value===e)return[o];i=t.getElementsByName(e),r=0;while(o=i[r++])if((n=o.getAttributeNode("id"))&&n.value===e)return[o]}return[]}}),b.find.TAG=d.getElementsByTagName?function(e,t){return"undefined"!=typeof t.getElementsByTagName?t.getElementsByTagName(e):d.qsa?t.querySelectorAll(e):void 0}:function(e,t){var n,r=[],i=0,o=t.getElementsByTagName(e);if("*"===e){while(n=o[i++])1===n.nodeType&&r.push(n);return r}return o},b.find.CLASS=d.getElementsByClassName&&function(e,t){if("undefined"!=typeof t.getElementsByClassName&&E)return t.getElementsByClassName(e)},s=[],v=[],(d.qsa=K.test(C.querySelectorAll))&&(ce(function(e){var t;a.appendChild(e).innerHTML="",e.querySelectorAll("[msallowcapture^='']").length&&v.push("[*^$]="+M+"*(?:''|\"\")"),e.querySelectorAll("[selected]").length||v.push("\\["+M+"*(?:value|"+R+")"),e.querySelectorAll("[id~="+S+"-]").length||v.push("~="),(t=C.createElement("input")).setAttribute("name",""),e.appendChild(t),e.querySelectorAll("[name='']").length||v.push("\\["+M+"*name"+M+"*="+M+"*(?:''|\"\")"),e.querySelectorAll(":checked").length||v.push(":checked"),e.querySelectorAll("a#"+S+"+*").length||v.push(".#.+[+~]"),e.querySelectorAll("\\\f"),v.push("[\\r\\n\\f]")}),ce(function(e){e.innerHTML="";var t=C.createElement("input");t.setAttribute("type","hidden"),e.appendChild(t).setAttribute("name","D"),e.querySelectorAll("[name=d]").length&&v.push("name"+M+"*[*^$|!~]?="),2!==e.querySelectorAll(":enabled").length&&v.push(":enabled",":disabled"),a.appendChild(e).disabled=!0,2!==e.querySelectorAll(":disabled").length&&v.push(":enabled",":disabled"),e.querySelectorAll("*,:x"),v.push(",.*:")})),(d.matchesSelector=K.test(c=a.matches||a.webkitMatchesSelector||a.mozMatchesSelector||a.oMatchesSelector||a.msMatchesSelector))&&ce(function(e){d.disconnectedMatch=c.call(e,"*"),c.call(e,"[s!='']:x"),s.push("!=",F)}),v=v.length&&new RegExp(v.join("|")),s=s.length&&new RegExp(s.join("|")),t=K.test(a.compareDocumentPosition),y=t||K.test(a.contains)?function(e,t){var n=9===e.nodeType?e.documentElement:e,r=t&&t.parentNode;return 
e===r||!(!r||1!==r.nodeType||!(n.contains?n.contains(r):e.compareDocumentPosition&&16&e.compareDocumentPosition(r)))}:function(e,t){if(t)while(t=t.parentNode)if(t===e)return!0;return!1},j=t?function(e,t){if(e===t)return l=!0,0;var n=!e.compareDocumentPosition-!t.compareDocumentPosition;return n||(1&(n=(e.ownerDocument||e)==(t.ownerDocument||t)?e.compareDocumentPosition(t):1)||!d.sortDetached&&t.compareDocumentPosition(e)===n?e==C||e.ownerDocument==p&&y(p,e)?-1:t==C||t.ownerDocument==p&&y(p,t)?1:u?P(u,e)-P(u,t):0:4&n?-1:1)}:function(e,t){if(e===t)return l=!0,0;var n,r=0,i=e.parentNode,o=t.parentNode,a=[e],s=[t];if(!i||!o)return e==C?-1:t==C?1:i?-1:o?1:u?P(u,e)-P(u,t):0;if(i===o)return pe(e,t);n=e;while(n=n.parentNode)a.unshift(n);n=t;while(n=n.parentNode)s.unshift(n);while(a[r]===s[r])r++;return r?pe(a[r],s[r]):a[r]==p?-1:s[r]==p?1:0}),C},se.matches=function(e,t){return se(e,null,null,t)},se.matchesSelector=function(e,t){if(T(e),d.matchesSelector&&E&&!N[t+" "]&&(!s||!s.test(t))&&(!v||!v.test(t)))try{var n=c.call(e,t);if(n||d.disconnectedMatch||e.document&&11!==e.document.nodeType)return n}catch(e){N(t,!0)}return 0":{dir:"parentNode",first:!0}," ":{dir:"parentNode"},"+":{dir:"previousSibling",first:!0},"~":{dir:"previousSibling"}},preFilter:{ATTR:function(e){return e[1]=e[1].replace(te,ne),e[3]=(e[3]||e[4]||e[5]||"").replace(te,ne),"~="===e[2]&&(e[3]=" "+e[3]+" "),e.slice(0,4)},CHILD:function(e){return e[1]=e[1].toLowerCase(),"nth"===e[1].slice(0,3)?(e[3]||se.error(e[0]),e[4]=+(e[4]?e[5]+(e[6]||1):2*("even"===e[3]||"odd"===e[3])),e[5]=+(e[7]+e[8]||"odd"===e[3])):e[3]&&se.error(e[0]),e},PSEUDO:function(e){var t,n=!e[6]&&e[2];return G.CHILD.test(e[0])?null:(e[3]?e[2]=e[4]||e[5]||"":n&&X.test(n)&&(t=h(n,!0))&&(t=n.indexOf(")",n.length-t)-n.length)&&(e[0]=e[0].slice(0,t),e[2]=n.slice(0,t)),e.slice(0,3))}},filter:{TAG:function(e){var t=e.replace(te,ne).toLowerCase();return"*"===e?function(){return!0}:function(e){return e.nodeName&&e.nodeName.toLowerCase()===t}},CLASS:function(e){var t=m[e+" "];return t||(t=new RegExp("(^|"+M+")"+e+"("+M+"|$)"))&&m(e,function(e){return t.test("string"==typeof e.className&&e.className||"undefined"!=typeof e.getAttribute&&e.getAttribute("class")||"")})},ATTR:function(n,r,i){return function(e){var t=se.attr(e,n);return null==t?"!="===r:!r||(t+="","="===r?t===i:"!="===r?t!==i:"^="===r?i&&0===t.indexOf(i):"*="===r?i&&-1:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i;function j(e,n,r){return m(n)?S.grep(e,function(e,t){return!!n.call(e,t,e)!==r}):n.nodeType?S.grep(e,function(e){return e===n!==r}):"string"!=typeof n?S.grep(e,function(e){return-1)[^>]*|#([\w-]+))$/;(S.fn.init=function(e,t,n){var r,i;if(!e)return this;if(n=n||D,"string"==typeof e){if(!(r="<"===e[0]&&">"===e[e.length-1]&&3<=e.length?[null,e,null]:q.exec(e))||!r[1]&&t)return!t||t.jquery?(t||n).find(e):this.constructor(t).find(e);if(r[1]){if(t=t instanceof S?t[0]:t,S.merge(this,S.parseHTML(r[1],t&&t.nodeType?t.ownerDocument||t:E,!0)),N.test(r[1])&&S.isPlainObject(t))for(r in t)m(this[r])?this[r](t[r]):this.attr(r,t[r]);return this}return(i=E.getElementById(r[2]))&&(this[0]=i,this.length=1),this}return e.nodeType?(this[0]=e,this.length=1,this):m(e)?void 0!==n.ready?n.ready(e):e(S):S.makeArray(e,this)}).prototype=S.fn,D=S(E);var L=/^(?:parents|prev(?:Until|All))/,H={children:!0,contents:!0,next:!0,prev:!0};function O(e,t){while((e=e[t])&&1!==e.nodeType);return e}S.fn.extend({has:function(e){var t=S(e,this),n=t.length;return this.filter(function(){for(var 
e=0;e\x20\t\r\n\f]*)/i,he=/^$|^module$|\/(?:java|ecma)script/i;ce=E.createDocumentFragment().appendChild(E.createElement("div")),(fe=E.createElement("input")).setAttribute("type","radio"),fe.setAttribute("checked","checked"),fe.setAttribute("name","t"),ce.appendChild(fe),y.checkClone=ce.cloneNode(!0).cloneNode(!0).lastChild.checked,ce.innerHTML="",y.noCloneChecked=!!ce.cloneNode(!0).lastChild.defaultValue,ce.innerHTML="",y.option=!!ce.lastChild;var ge={thead:[1,"","
"],col:[2,"","
"],tr:[2,"","
"],td:[3,"","
"],_default:[0,"",""]};function ve(e,t){var n;return n="undefined"!=typeof e.getElementsByTagName?e.getElementsByTagName(t||"*"):"undefined"!=typeof e.querySelectorAll?e.querySelectorAll(t||"*"):[],void 0===t||t&&A(e,t)?S.merge([e],n):n}function ye(e,t){for(var n=0,r=e.length;n",""]);var me=/<|&#?\w+;/;function xe(e,t,n,r,i){for(var o,a,s,u,l,c,f=t.createDocumentFragment(),p=[],d=0,h=e.length;d\s*$/g;function je(e,t){return A(e,"table")&&A(11!==t.nodeType?t:t.firstChild,"tr")&&S(e).children("tbody")[0]||e}function De(e){return e.type=(null!==e.getAttribute("type"))+"/"+e.type,e}function qe(e){return"true/"===(e.type||"").slice(0,5)?e.type=e.type.slice(5):e.removeAttribute("type"),e}function Le(e,t){var n,r,i,o,a,s;if(1===t.nodeType){if(Y.hasData(e)&&(s=Y.get(e).events))for(i in Y.remove(t,"handle events"),s)for(n=0,r=s[i].length;n").attr(n.scriptAttrs||{}).prop({charset:n.scriptCharset,src:n.url}).on("load error",i=function(e){r.remove(),i=null,e&&t("error"===e.type?404:200,e.type)}),E.head.appendChild(r[0])},abort:function(){i&&i()}}});var _t,zt=[],Ut=/(=)\?(?=&|$)|\?\?/;S.ajaxSetup({jsonp:"callback",jsonpCallback:function(){var e=zt.pop()||S.expando+"_"+wt.guid++;return this[e]=!0,e}}),S.ajaxPrefilter("json jsonp",function(e,t,n){var r,i,o,a=!1!==e.jsonp&&(Ut.test(e.url)?"url":"string"==typeof e.data&&0===(e.contentType||"").indexOf("application/x-www-form-urlencoded")&&Ut.test(e.data)&&"data");if(a||"jsonp"===e.dataTypes[0])return r=e.jsonpCallback=m(e.jsonpCallback)?e.jsonpCallback():e.jsonpCallback,a?e[a]=e[a].replace(Ut,"$1"+r):!1!==e.jsonp&&(e.url+=(Tt.test(e.url)?"&":"?")+e.jsonp+"="+r),e.converters["script json"]=function(){return o||S.error(r+" was not called"),o[0]},e.dataTypes[0]="json",i=C[r],C[r]=function(){o=arguments},n.always(function(){void 0===i?S(C).removeProp(r):C[r]=i,e[r]&&(e.jsonpCallback=t.jsonpCallback,zt.push(r)),o&&m(i)&&i(o[0]),o=i=void 0}),"script"}),y.createHTMLDocument=((_t=E.implementation.createHTMLDocument("").body).innerHTML="
",2===_t.childNodes.length),S.parseHTML=function(e,t,n){return"string"!=typeof e?[]:("boolean"==typeof t&&(n=t,t=!1),t||(y.createHTMLDocument?((r=(t=E.implementation.createHTMLDocument("")).createElement("base")).href=E.location.href,t.head.appendChild(r)):t=E),o=!n&&[],(i=N.exec(e))?[t.createElement(i[1])]:(i=xe([e],t,o),o&&o.length&&S(o).remove(),S.merge([],i.childNodes)));var r,i,o},S.fn.load=function(e,t,n){var r,i,o,a=this,s=e.indexOf(" ");return-1").append(S.parseHTML(e)).find(r):e)}).always(n&&function(e,t){a.each(function(){n.apply(this,o||[e.responseText,t,e])})}),this},S.expr.pseudos.animated=function(t){return S.grep(S.timers,function(e){return t===e.elem}).length},S.offset={setOffset:function(e,t,n){var r,i,o,a,s,u,l=S.css(e,"position"),c=S(e),f={};"static"===l&&(e.style.position="relative"),s=c.offset(),o=S.css(e,"top"),u=S.css(e,"left"),("absolute"===l||"fixed"===l)&&-1<(o+u).indexOf("auto")?(a=(r=c.position()).top,i=r.left):(a=parseFloat(o)||0,i=parseFloat(u)||0),m(t)&&(t=t.call(e,n,S.extend({},s))),null!=t.top&&(f.top=t.top-s.top+a),null!=t.left&&(f.left=t.left-s.left+i),"using"in t?t.using.call(e,f):c.css(f)}},S.fn.extend({offset:function(t){if(arguments.length)return void 0===t?this:this.each(function(e){S.offset.setOffset(this,t,e)});var e,n,r=this[0];return r?r.getClientRects().length?(e=r.getBoundingClientRect(),n=r.ownerDocument.defaultView,{top:e.top+n.pageYOffset,left:e.left+n.pageXOffset}):{top:0,left:0}:void 0},position:function(){if(this[0]){var e,t,n,r=this[0],i={top:0,left:0};if("fixed"===S.css(r,"position"))t=r.getBoundingClientRect();else{t=this.offset(),n=r.ownerDocument,e=r.offsetParent||n.documentElement;while(e&&(e===n.body||e===n.documentElement)&&"static"===S.css(e,"position"))e=e.parentNode;e&&e!==r&&1===e.nodeType&&((i=S(e).offset()).top+=S.css(e,"borderTopWidth",!0),i.left+=S.css(e,"borderLeftWidth",!0))}return{top:t.top-i.top-S.css(r,"marginTop",!0),left:t.left-i.left-S.css(r,"marginLeft",!0)}}},offsetParent:function(){return this.map(function(){var e=this.offsetParent;while(e&&"static"===S.css(e,"position"))e=e.offsetParent;return e||re})}}),S.each({scrollLeft:"pageXOffset",scrollTop:"pageYOffset"},function(t,i){var o="pageYOffset"===i;S.fn[t]=function(e){return $(this,function(e,t,n){var r;if(x(e)?r=e:9===e.nodeType&&(r=e.defaultView),void 0===n)return r?r[i]:e[t];r?r.scrollTo(o?r.pageXOffset:n,o?n:r.pageYOffset):e[t]=n},t,e,arguments.length)}}),S.each(["top","left"],function(e,n){S.cssHooks[n]=Fe(y.pixelPosition,function(e,t){if(t)return t=We(e,n),Pe.test(t)?S(e).position()[n]+"px":t})}),S.each({Height:"height",Width:"width"},function(a,s){S.each({padding:"inner"+a,content:s,"":"outer"+a},function(r,o){S.fn[o]=function(e,t){var n=arguments.length&&(r||"boolean"!=typeof e),i=r||(!0===e||!0===t?"margin":"border");return $(this,function(e,t,n){var r;return x(e)?0===o.indexOf("outer")?e["inner"+a]:e.document.documentElement["client"+a]:9===e.nodeType?(r=e.documentElement,Math.max(e.body["scroll"+a],r["scroll"+a],e.body["offset"+a],r["offset"+a],r["client"+a])):void 0===n?S.css(e,t,i):S.style(e,t,n,i)},s,n?e:void 0,n)}})}),S.each(["ajaxStart","ajaxStop","ajaxComplete","ajaxError","ajaxSuccess","ajaxSend"],function(e,t){S.fn[t]=function(e){return this.on(t,e)}}),S.fn.extend({bind:function(e,t,n){return this.on(e,null,t,n)},unbind:function(e,t){return this.off(e,null,t)},delegate:function(e,t,n,r){return this.on(t,e,n,r)},undelegate:function(e,t,n){return 1===arguments.length?this.off(e,"**"):this.off(t,e||"**",n)},hover:function(e,t){return 
this.mouseenter(e).mouseleave(t||e)}}),S.each("blur focus focusin focusout resize scroll click dblclick mousedown mouseup mousemove mouseover mouseout mouseenter mouseleave change select submit keydown keypress keyup contextmenu".split(" "),function(e,n){S.fn[n]=function(e,t){return 0",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=y.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=y.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),y.elements=c+" "+a,j(b)}function f(a){var b=x[a[v]];return b||(b={},w++,a[v]=w,x[w]=b),b}function g(a,c,d){if(c||(c=b),q)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():u.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||t.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),q)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return y.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(y,b.frag)}function j(a){a||(a=b);var d=f(a);return!y.shivCSS||p||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),q||i(a,d),a}function k(a){for(var b,c=a.getElementsByTagName("*"),e=c.length,f=RegExp("^(?:"+d().join("|")+")$","i"),g=[];e--;)b=c[e],f.test(b.nodeName)&&g.push(b.applyElement(l(b)));return g}function l(a){for(var b,c=a.attributes,d=c.length,e=a.ownerDocument.createElement(A+":"+a.nodeName);d--;)b=c[d],b.specified&&e.setAttribute(b.nodeName,b.nodeValue);return e.style.cssText=a.style.cssText,e}function m(a){for(var b,c=a.split("{"),e=c.length,f=RegExp("(^|[\\s,>+~])("+d().join("|")+")(?=[[\\s,>+~#.:]|$)","gi"),g="$1"+A+"\\:$2";e--;)b=c[e]=c[e].split("}"),b[b.length-1]=b[b.length-1].replace(f,g),c[e]=b.join("}");return c.join("{")}function n(a){for(var b=a.length;b--;)a[b].removeNode()}function o(a){function b(){clearTimeout(g._removeSheetTimer),d&&d.removeNode(!0),d=null}var d,e,g=f(a),h=a.namespaces,i=a.parentWindow;return!B||a.printShived?a:("undefined"==typeof h[A]&&h.add(A),i.attachEvent("onbeforeprint",function(){b();for(var f,g,h,i=a.styleSheets,j=[],l=i.length,n=Array(l);l--;)n[l]=i[l];for(;h=n.pop();)if(!h.disabled&&z.test(h.media)){try{f=h.imports,g=f.length}catch(o){g=0}for(l=0;g>l;l++)n.push(f[l]);try{j.push(h.cssText)}catch(o){}}j=m(j.reverse().join("")),e=k(a),d=c(a,j)}),i.attachEvent("onafterprint",function(){n(e),clearTimeout(g._removeSheetTimer),g._removeSheetTimer=setTimeout(b,500)}),a.printShived=!0,a)}var p,q,r="3.7.3",s=a.html5||{},t=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,u=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,v="_html5shiv",w=0,x={};!function(){try{var a=b.createElement("a");a.innerHTML="",p="hidden"in a,q=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof 
a.createElement}()}catch(c){p=!0,q=!0}}();var y={elements:s.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time video",version:r,shivCSS:s.shivCSS!==!1,supportsUnknownElements:q,shivMethods:s.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=y,j(b);var z=/^$|\b(?:all|print)\b/,A="html5shiv",B=!q&&function(){var c=b.documentElement;return!("undefined"==typeof b.namespaces||"undefined"==typeof b.parentWindow||"undefined"==typeof c.applyElement||"undefined"==typeof c.removeNode||"undefined"==typeof a.attachEvent)}();y.type+=" print",y.shivPrint=o,o(b),"object"==typeof module&&module.exports&&(module.exports=y)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_static/js/html5shiv.min.js b/branch/bicounty_2035_hwy_update/_static/js/html5shiv.min.js new file mode 100644 index 0000000..cd1c674 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/js/html5shiv.min.js @@ -0,0 +1,4 @@ +/** +* @preserve HTML5 Shiv 3.7.3 | @afarkas @jdalton @jon_neal @rem | MIT/GPL2 Licensed +*/ +!function(a,b){function c(a,b){var c=a.createElement("p"),d=a.getElementsByTagName("head")[0]||a.documentElement;return c.innerHTML="x",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=t.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=t.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),t.elements=c+" "+a,j(b)}function f(a){var b=s[a[q]];return b||(b={},r++,a[q]=r,s[r]=b),b}function g(a,c,d){if(c||(c=b),l)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():p.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||o.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),l)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return t.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(t,b.frag)}function j(a){a||(a=b);var d=f(a);return!t.shivCSS||k||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),l||i(a,d),a}var k,l,m="3.7.3-pre",n=a.html5||{},o=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,p=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,q="_html5shiv",r=0,s={};!function(){try{var a=b.createElement("a");a.innerHTML="",k="hidden"in a,l=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof a.createElement}()}catch(c){k=!0,l=!0}}();var t={elements:n.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary 
template time video",version:m,shivCSS:n.shivCSS!==!1,supportsUnknownElements:l,shivMethods:n.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=t,j(b),"object"==typeof module&&module.exports&&(module.exports=t)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_static/js/theme.js b/branch/bicounty_2035_hwy_update/_static/js/theme.js new file mode 100644 index 0000000..1fddb6e --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/js/theme.js @@ -0,0 +1 @@ +!function(n){var e={};function t(i){if(e[i])return e[i].exports;var o=e[i]={i:i,l:!1,exports:{}};return n[i].call(o.exports,o,o.exports,t),o.l=!0,o.exports}t.m=n,t.c=e,t.d=function(n,e,i){t.o(n,e)||Object.defineProperty(n,e,{enumerable:!0,get:i})},t.r=function(n){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(n,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(n,"__esModule",{value:!0})},t.t=function(n,e){if(1&e&&(n=t(n)),8&e)return n;if(4&e&&"object"==typeof n&&n&&n.__esModule)return n;var i=Object.create(null);if(t.r(i),Object.defineProperty(i,"default",{enumerable:!0,value:n}),2&e&&"string"!=typeof n)for(var o in n)t.d(i,o,function(e){return n[e]}.bind(null,o));return i},t.n=function(n){var e=n&&n.__esModule?function(){return n.default}:function(){return n};return t.d(e,"a",e),e},t.o=function(n,e){return Object.prototype.hasOwnProperty.call(n,e)},t.p="",t(t.s=0)}([function(n,e,t){t(1),n.exports=t(3)},function(n,e,t){(function(){var e="undefined"!=typeof window?window.jQuery:t(2);n.exports.ThemeNav={navBar:null,win:null,winScroll:!1,winResize:!1,linkScroll:!1,winPosition:0,winHeight:null,docHeight:null,isRunning:!1,enable:function(n){var t=this;void 0===n&&(n=!0),t.isRunning||(t.isRunning=!0,e((function(e){t.init(e),t.reset(),t.win.on("hashchange",t.reset),n&&t.win.on("scroll",(function(){t.linkScroll||t.winScroll||(t.winScroll=!0,requestAnimationFrame((function(){t.onScroll()})))})),t.win.on("resize",(function(){t.winResize||(t.winResize=!0,requestAnimationFrame((function(){t.onResize()})))})),t.onResize()})))},enableSticky:function(){this.enable(!0)},init:function(n){n(document);var e=this;this.navBar=n("div.wy-side-scroll:first"),this.win=n(window),n(document).on("click","[data-toggle='wy-nav-top']",(function(){n("[data-toggle='wy-nav-shift']").toggleClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift")})).on("click",".wy-menu-vertical .current ul li a",(function(){var t=n(this);n("[data-toggle='wy-nav-shift']").removeClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift"),e.toggleCurrent(t),e.hashChange()})).on("click","[data-toggle='rst-current-version']",(function(){n("[data-toggle='rst-versions']").toggleClass("shift-up")})),n("table.docutils:not(.field-list,.footnote,.citation)").wrap("
"),n("table.docutils.footnote").wrap("
"),n("table.docutils.citation").wrap("
"),n(".wy-menu-vertical ul").not(".simple").siblings("a").each((function(){var t=n(this);expand=n(''),expand.on("click",(function(n){return e.toggleCurrent(t),n.stopPropagation(),!1})),t.prepend(expand)}))},reset:function(){var n=encodeURI(window.location.hash)||"#";try{var e=$(".wy-menu-vertical"),t=e.find('[href="'+n+'"]');if(0===t.length){var i=$('.document [id="'+n.substring(1)+'"]').closest("div.section");0===(t=e.find('[href="#'+i.attr("id")+'"]')).length&&(t=e.find('[href="#"]'))}if(t.length>0){$(".wy-menu-vertical .current").removeClass("current").attr("aria-expanded","false"),t.addClass("current").attr("aria-expanded","true"),t.closest("li.toctree-l1").parent().addClass("current").attr("aria-expanded","true");for(let n=1;n<=10;n++)t.closest("li.toctree-l"+n).addClass("current").attr("aria-expanded","true");t[0].scrollIntoView()}}catch(n){console.log("Error expanding nav for anchor",n)}},onScroll:function(){this.winScroll=!1;var n=this.win.scrollTop(),e=n+this.winHeight,t=this.navBar.scrollTop()+(n-this.winPosition);n<0||e>this.docHeight||(this.navBar.scrollTop(t),this.winPosition=n)},onResize:function(){this.winResize=!1,this.winHeight=this.win.height(),this.docHeight=$(document).height()},hashChange:function(){this.linkScroll=!0,this.win.one("hashchange",(function(){this.linkScroll=!1}))},toggleCurrent:function(n){var e=n.closest("li");e.siblings("li.current").removeClass("current").attr("aria-expanded","false"),e.siblings().find("li.current").removeClass("current").attr("aria-expanded","false");var t=e.find("> ul li");t.length&&(t.removeClass("current").attr("aria-expanded","false"),e.toggleClass("current").attr("aria-expanded",(function(n,e){return"true"==e?"false":"true"})))}},"undefined"!=typeof window&&(window.SphinxRtdTheme={Navigation:n.exports.ThemeNav,StickyNav:n.exports.ThemeNav}),function(){for(var n=0,e=["ms","moz","webkit","o"],t=0;t0 + var meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$"; // [C]VC[V] is m=1 + var mgr1 = "^(" + C + ")?" + V + C + V + C; // [C]VCVC... is m>1 + var s_v = "^(" + C + ")?" 
+ v; // vowel in stem + + this.stemWord = function (w) { + var stem; + var suffix; + var firstch; + var origword = w; + + if (w.length < 3) + return w; + + var re; + var re2; + var re3; + var re4; + + firstch = w.substr(0,1); + if (firstch == "y") + w = firstch.toUpperCase() + w.substr(1); + + // Step 1a + re = /^(.+?)(ss|i)es$/; + re2 = /^(.+?)([^s])s$/; + + if (re.test(w)) + w = w.replace(re,"$1$2"); + else if (re2.test(w)) + w = w.replace(re2,"$1$2"); + + // Step 1b + re = /^(.+?)eed$/; + re2 = /^(.+?)(ed|ing)$/; + if (re.test(w)) { + var fp = re.exec(w); + re = new RegExp(mgr0); + if (re.test(fp[1])) { + re = /.$/; + w = w.replace(re,""); + } + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + re2 = new RegExp(s_v); + if (re2.test(stem)) { + w = stem; + re2 = /(at|bl|iz)$/; + re3 = new RegExp("([^aeiouylsz])\\1$"); + re4 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re2.test(w)) + w = w + "e"; + else if (re3.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + else if (re4.test(w)) + w = w + "e"; + } + } + + // Step 1c + re = /^(.+?)y$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(s_v); + if (re.test(stem)) + w = stem + "i"; + } + + // Step 2 + re = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step2list[suffix]; + } + + // Step 3 + re = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step3list[suffix]; + } + + // Step 4 + re = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/; + re2 = /^(.+?)(s|t)(ion)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + if (re.test(stem)) + w = stem; + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1] + fp[2]; + re2 = new RegExp(mgr1); + if (re2.test(stem)) + w = stem; + } + + // Step 5 + re = /^(.+?)e$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + re2 = new RegExp(meq1); + re3 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) + w = stem; + } + re = /ll$/; + re2 = new RegExp(mgr1); + if (re.test(w) && re2.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + + // and turn initial Y back to y + if (firstch == "y") + w = firstch.toLowerCase() + w.substr(1); + return w; + } +} + diff --git a/branch/bicounty_2035_hwy_update/_static/minus.png b/branch/bicounty_2035_hwy_update/_static/minus.png new file mode 100644 index 0000000..d96755f Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/minus.png differ diff --git a/branch/bicounty_2035_hwy_update/_static/plus.png b/branch/bicounty_2035_hwy_update/_static/plus.png new file mode 100644 index 0000000..7107cec Binary files /dev/null and b/branch/bicounty_2035_hwy_update/_static/plus.png differ diff --git a/branch/bicounty_2035_hwy_update/_static/pygments.css b/branch/bicounty_2035_hwy_update/_static/pygments.css new file mode 100644 index 0000000..84ab303 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/pygments.css @@ -0,0 +1,75 @@ +pre { line-height: 125%; } +td.linenos .normal { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } 
+span.linenos { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } +td.linenos .special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +span.linenos.special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +.highlight .hll { background-color: #ffffcc } +.highlight { background: #f8f8f8; } +.highlight .c { color: #3D7B7B; font-style: italic } /* Comment */ +.highlight .err { border: 1px solid #FF0000 } /* Error */ +.highlight .k { color: #008000; font-weight: bold } /* Keyword */ +.highlight .o { color: #666666 } /* Operator */ +.highlight .ch { color: #3D7B7B; font-style: italic } /* Comment.Hashbang */ +.highlight .cm { color: #3D7B7B; font-style: italic } /* Comment.Multiline */ +.highlight .cp { color: #9C6500 } /* Comment.Preproc */ +.highlight .cpf { color: #3D7B7B; font-style: italic } /* Comment.PreprocFile */ +.highlight .c1 { color: #3D7B7B; font-style: italic } /* Comment.Single */ +.highlight .cs { color: #3D7B7B; font-style: italic } /* Comment.Special */ +.highlight .gd { color: #A00000 } /* Generic.Deleted */ +.highlight .ge { font-style: italic } /* Generic.Emph */ +.highlight .ges { font-weight: bold; font-style: italic } /* Generic.EmphStrong */ +.highlight .gr { color: #E40000 } /* Generic.Error */ +.highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */ +.highlight .gi { color: #008400 } /* Generic.Inserted */ +.highlight .go { color: #717171 } /* Generic.Output */ +.highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */ +.highlight .gs { font-weight: bold } /* Generic.Strong */ +.highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */ +.highlight .gt { color: #0044DD } /* Generic.Traceback */ +.highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */ +.highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */ +.highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */ +.highlight .kp { color: #008000 } /* Keyword.Pseudo */ +.highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */ +.highlight .kt { color: #B00040 } /* Keyword.Type */ +.highlight .m { color: #666666 } /* Literal.Number */ +.highlight .s { color: #BA2121 } /* Literal.String */ +.highlight .na { color: #687822 } /* Name.Attribute */ +.highlight .nb { color: #008000 } /* Name.Builtin */ +.highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */ +.highlight .no { color: #880000 } /* Name.Constant */ +.highlight .nd { color: #AA22FF } /* Name.Decorator */ +.highlight .ni { color: #717171; font-weight: bold } /* Name.Entity */ +.highlight .ne { color: #CB3F38; font-weight: bold } /* Name.Exception */ +.highlight .nf { color: #0000FF } /* Name.Function */ +.highlight .nl { color: #767600 } /* Name.Label */ +.highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */ +.highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */ +.highlight .nv { color: #19177C } /* Name.Variable */ +.highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */ +.highlight .w { color: #bbbbbb } /* Text.Whitespace */ +.highlight .mb { color: #666666 } /* Literal.Number.Bin */ +.highlight .mf { color: #666666 } /* Literal.Number.Float */ +.highlight .mh { color: #666666 } /* Literal.Number.Hex */ +.highlight .mi { color: #666666 } /* Literal.Number.Integer */ +.highlight .mo { color: #666666 } /* Literal.Number.Oct */ +.highlight .sa { color: 
#BA2121 } /* Literal.String.Affix */ +.highlight .sb { color: #BA2121 } /* Literal.String.Backtick */ +.highlight .sc { color: #BA2121 } /* Literal.String.Char */ +.highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */ +.highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */ +.highlight .s2 { color: #BA2121 } /* Literal.String.Double */ +.highlight .se { color: #AA5D1F; font-weight: bold } /* Literal.String.Escape */ +.highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */ +.highlight .si { color: #A45A77; font-weight: bold } /* Literal.String.Interpol */ +.highlight .sx { color: #008000 } /* Literal.String.Other */ +.highlight .sr { color: #A45A77 } /* Literal.String.Regex */ +.highlight .s1 { color: #BA2121 } /* Literal.String.Single */ +.highlight .ss { color: #19177C } /* Literal.String.Symbol */ +.highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */ +.highlight .fm { color: #0000FF } /* Name.Function.Magic */ +.highlight .vc { color: #19177C } /* Name.Variable.Class */ +.highlight .vg { color: #19177C } /* Name.Variable.Global */ +.highlight .vi { color: #19177C } /* Name.Variable.Instance */ +.highlight .vm { color: #19177C } /* Name.Variable.Magic */ +.highlight .il { color: #666666 } /* Literal.Number.Integer.Long */ \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/_static/searchtools.js b/branch/bicounty_2035_hwy_update/_static/searchtools.js new file mode 100644 index 0000000..97d56a7 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/searchtools.js @@ -0,0 +1,566 @@ +/* + * searchtools.js + * ~~~~~~~~~~~~~~~~ + * + * Sphinx JavaScript utilities for the full-text search. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +/** + * Simple result scoring code. + */ +if (typeof Scorer === "undefined") { + var Scorer = { + // Implement the following function to further tweak the score for each result + // The function takes a result array [docname, title, anchor, descr, score, filename] + // and returns the new score. + /* + score: result => { + const [docname, title, anchor, descr, score, filename] = result + return score + }, + */ + + // query matches the full name of an object + objNameMatch: 11, + // or matches in the last dotted part of the object name + objPartialMatch: 6, + // Additive scores depending on the priority of the object + objPrio: { + 0: 15, // used to be importantResults + 1: 5, // used to be objectResults + 2: -5, // used to be unimportantResults + }, + // Used when the priority is not in the mapping. 
+ objPrioDefault: 0, + + // query found in title + title: 15, + partialTitle: 7, + // query found in terms + term: 5, + partialTerm: 2, + }; +} + +const _removeChildren = (element) => { + while (element && element.lastChild) element.removeChild(element.lastChild); +}; + +/** + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions#escaping + */ +const _escapeRegExp = (string) => + string.replace(/[.*+\-?^${}()|[\]\\]/g, "\\$&"); // $& means the whole matched string + +const _displayItem = (item, searchTerms) => { + const docBuilder = DOCUMENTATION_OPTIONS.BUILDER; + const docUrlRoot = DOCUMENTATION_OPTIONS.URL_ROOT; + const docFileSuffix = DOCUMENTATION_OPTIONS.FILE_SUFFIX; + const docLinkSuffix = DOCUMENTATION_OPTIONS.LINK_SUFFIX; + const showSearchSummary = DOCUMENTATION_OPTIONS.SHOW_SEARCH_SUMMARY; + + const [docName, title, anchor, descr, score, _filename] = item; + + let listItem = document.createElement("li"); + let requestUrl; + let linkUrl; + if (docBuilder === "dirhtml") { + // dirhtml builder + let dirname = docName + "/"; + if (dirname.match(/\/index\/$/)) + dirname = dirname.substring(0, dirname.length - 6); + else if (dirname === "index/") dirname = ""; + requestUrl = docUrlRoot + dirname; + linkUrl = requestUrl; + } else { + // normal html builders + requestUrl = docUrlRoot + docName + docFileSuffix; + linkUrl = docName + docLinkSuffix; + } + let linkEl = listItem.appendChild(document.createElement("a")); + linkEl.href = linkUrl + anchor; + linkEl.dataset.score = score; + linkEl.innerHTML = title; + if (descr) + listItem.appendChild(document.createElement("span")).innerHTML = + " (" + descr + ")"; + else if (showSearchSummary) + fetch(requestUrl) + .then((responseData) => responseData.text()) + .then((data) => { + if (data) + listItem.appendChild( + Search.makeSearchSummary(data, searchTerms) + ); + }); + Search.output.appendChild(listItem); +}; +const _finishSearch = (resultCount) => { + Search.stopPulse(); + Search.title.innerText = _("Search Results"); + if (!resultCount) + Search.status.innerText = Documentation.gettext( + "Your search did not match any documents. Please make sure that all words are spelled correctly and that you've selected enough categories." + ); + else + Search.status.innerText = _( + `Search finished, found ${resultCount} page(s) matching the search query.` + ); +}; +const _displayNextItem = ( + results, + resultCount, + searchTerms +) => { + // results left, load the summary and display it + // this is intended to be dynamic (don't sub resultsCount) + if (results.length) { + _displayItem(results.pop(), searchTerms); + setTimeout( + () => _displayNextItem(results, resultCount, searchTerms), + 5 + ); + } + // search finished, update title and status message + else _finishSearch(resultCount); +}; + +/** + * Default splitQuery function. Can be overridden in ``sphinx.search`` with a + * custom function per language. + * + * The regular expression works by splitting the string on consecutive characters + * that are not Unicode letters, numbers, underscores, or emoji characters. + * This is the same as ``\W+`` in Python, preserving the surrogate pair area. 
+ */ +if (typeof splitQuery === "undefined") { + var splitQuery = (query) => query + .split(/[^\p{Letter}\p{Number}_\p{Emoji_Presentation}]+/gu) + .filter(term => term) // remove remaining empty strings +} + +/** + * Search Module + */ +const Search = { + _index: null, + _queued_query: null, + _pulse_status: -1, + + htmlToText: (htmlString) => { + const htmlElement = new DOMParser().parseFromString(htmlString, 'text/html'); + htmlElement.querySelectorAll(".headerlink").forEach((el) => { el.remove() }); + const docContent = htmlElement.querySelector('[role="main"]'); + if (docContent !== undefined) return docContent.textContent; + console.warn( + "Content block not found. Sphinx search tries to obtain it via '[role=main]'. Could you check your theme or template." + ); + return ""; + }, + + init: () => { + const query = new URLSearchParams(window.location.search).get("q"); + document + .querySelectorAll('input[name="q"]') + .forEach((el) => (el.value = query)); + if (query) Search.performSearch(query); + }, + + loadIndex: (url) => + (document.body.appendChild(document.createElement("script")).src = url), + + setIndex: (index) => { + Search._index = index; + if (Search._queued_query !== null) { + const query = Search._queued_query; + Search._queued_query = null; + Search.query(query); + } + }, + + hasIndex: () => Search._index !== null, + + deferQuery: (query) => (Search._queued_query = query), + + stopPulse: () => (Search._pulse_status = -1), + + startPulse: () => { + if (Search._pulse_status >= 0) return; + + const pulse = () => { + Search._pulse_status = (Search._pulse_status + 1) % 4; + Search.dots.innerText = ".".repeat(Search._pulse_status); + if (Search._pulse_status >= 0) window.setTimeout(pulse, 500); + }; + pulse(); + }, + + /** + * perform a search for something (or wait until index is loaded) + */ + performSearch: (query) => { + // create the required interface elements + const searchText = document.createElement("h2"); + searchText.textContent = _("Searching"); + const searchSummary = document.createElement("p"); + searchSummary.classList.add("search-summary"); + searchSummary.innerText = ""; + const searchList = document.createElement("ul"); + searchList.classList.add("search"); + + const out = document.getElementById("search-results"); + Search.title = out.appendChild(searchText); + Search.dots = Search.title.appendChild(document.createElement("span")); + Search.status = out.appendChild(searchSummary); + Search.output = out.appendChild(searchList); + + const searchProgress = document.getElementById("search-progress"); + // Some themes don't use the search progress node + if (searchProgress) { + searchProgress.innerText = _("Preparing search..."); + } + Search.startPulse(); + + // index already loaded, the browser was quick! 
+ if (Search.hasIndex()) Search.query(query); + else Search.deferQuery(query); + }, + + /** + * execute search (requires search index to be loaded) + */ + query: (query) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + const allTitles = Search._index.alltitles; + const indexEntries = Search._index.indexentries; + + // stem the search terms and add them to the correct list + const stemmer = new Stemmer(); + const searchTerms = new Set(); + const excludedTerms = new Set(); + const highlightTerms = new Set(); + const objectTerms = new Set(splitQuery(query.toLowerCase().trim())); + splitQuery(query.trim()).forEach((queryTerm) => { + const queryTermLower = queryTerm.toLowerCase(); + + // maybe skip this "word" + // stopwords array is from language_data.js + if ( + stopwords.indexOf(queryTermLower) !== -1 || + queryTerm.match(/^\d+$/) + ) + return; + + // stem the word + let word = stemmer.stemWord(queryTermLower); + // select the correct list + if (word[0] === "-") excludedTerms.add(word.substr(1)); + else { + searchTerms.add(word); + highlightTerms.add(queryTermLower); + } + }); + + if (SPHINX_HIGHLIGHT_ENABLED) { // set in sphinx_highlight.js + localStorage.setItem("sphinx_highlight_terms", [...highlightTerms].join(" ")) + } + + // console.debug("SEARCH: searching for:"); + // console.info("required: ", [...searchTerms]); + // console.info("excluded: ", [...excludedTerms]); + + // array of [docname, title, anchor, descr, score, filename] + let results = []; + _removeChildren(document.getElementById("search-progress")); + + const queryLower = query.toLowerCase(); + for (const [title, foundTitles] of Object.entries(allTitles)) { + if (title.toLowerCase().includes(queryLower) && (queryLower.length >= title.length/2)) { + for (const [file, id] of foundTitles) { + let score = Math.round(100 * queryLower.length / title.length) + results.push([ + docNames[file], + titles[file] !== title ? `${titles[file]} > ${title}` : title, + id !== null ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // search for explicit entries in index directives + for (const [entry, foundEntries] of Object.entries(indexEntries)) { + if (entry.includes(queryLower) && (queryLower.length >= entry.length/2)) { + for (const [file, id] of foundEntries) { + let score = Math.round(100 * queryLower.length / entry.length) + results.push([ + docNames[file], + titles[file], + id ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // lookup as object + objectTerms.forEach((term) => + results.push(...Search.performObjectSearch(term, objectTerms)) + ); + + // lookup as search terms in fulltext + results.push(...Search.performTermsSearch(searchTerms, excludedTerms)); + + // let the scorer override scores with a custom scoring function + if (Scorer.score) results.forEach((item) => (item[4] = Scorer.score(item))); + + // now sort the results by score (in opposite order of appearance, since the + // display function below uses pop() to retrieve items) and then + // alphabetically + results.sort((a, b) => { + const leftScore = a[4]; + const rightScore = b[4]; + if (leftScore === rightScore) { + // same score: sort alphabetically + const leftTitle = a[1].toLowerCase(); + const rightTitle = b[1].toLowerCase(); + if (leftTitle === rightTitle) return 0; + return leftTitle > rightTitle ? -1 : 1; // inverted is intentional + } + return leftScore > rightScore ? 
1 : -1; + }); + + // remove duplicate search results + // note the reversing of results, so that in the case of duplicates, the highest-scoring entry is kept + let seen = new Set(); + results = results.reverse().reduce((acc, result) => { + let resultStr = result.slice(0, 4).concat([result[5]]).map(v => String(v)).join(','); + if (!seen.has(resultStr)) { + acc.push(result); + seen.add(resultStr); + } + return acc; + }, []); + + results = results.reverse(); + + // for debugging + //Search.lastresults = results.slice(); // a copy + // console.info("search results:", Search.lastresults); + + // print the results + _displayNextItem(results, results.length, searchTerms); + }, + + /** + * search for object names + */ + performObjectSearch: (object, objectTerms) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const objects = Search._index.objects; + const objNames = Search._index.objnames; + const titles = Search._index.titles; + + const results = []; + + const objectSearchCallback = (prefix, match) => { + const name = match[4] + const fullname = (prefix ? prefix + "." : "") + name; + const fullnameLower = fullname.toLowerCase(); + if (fullnameLower.indexOf(object) < 0) return; + + let score = 0; + const parts = fullnameLower.split("."); + + // check for different match types: exact matches of full name or + // "last name" (i.e. last dotted part) + if (fullnameLower === object || parts.slice(-1)[0] === object) + score += Scorer.objNameMatch; + else if (parts.slice(-1)[0].indexOf(object) > -1) + score += Scorer.objPartialMatch; // matches in last name + + const objName = objNames[match[1]][2]; + const title = titles[match[0]]; + + // If more than one term searched for, we require other words to be + // found in the name/title/description + const otherTerms = new Set(objectTerms); + otherTerms.delete(object); + if (otherTerms.size > 0) { + const haystack = `${prefix} ${name} ${objName} ${title}`.toLowerCase(); + if ( + [...otherTerms].some((otherTerm) => haystack.indexOf(otherTerm) < 0) + ) + return; + } + + let anchor = match[3]; + if (anchor === "") anchor = fullname; + else if (anchor === "-") anchor = objNames[match[1]][1] + "-" + fullname; + + const descr = objName + _(", in ") + title; + + // add custom score for some objects according to scorer + if (Scorer.objPrio.hasOwnProperty(match[2])) + score += Scorer.objPrio[match[2]]; + else score += Scorer.objPrioDefault; + + results.push([ + docNames[match[0]], + fullname, + "#" + anchor, + descr, + score, + filenames[match[0]], + ]); + }; + Object.keys(objects).forEach((prefix) => + objects[prefix].forEach((array) => + objectSearchCallback(prefix, array) + ) + ); + return results; + }, + + /** + * search for full-text terms in the index + */ + performTermsSearch: (searchTerms, excludedTerms) => { + // prepare search + const terms = Search._index.terms; + const titleTerms = Search._index.titleterms; + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + + const scoreMap = new Map(); + const fileMap = new Map(); + + // perform the search on the required terms + searchTerms.forEach((word) => { + const files = []; + const arr = [ + { files: terms[word], score: Scorer.term }, + { files: titleTerms[word], score: Scorer.title }, + ]; + // add support for partial matches + if (word.length > 2) { + const escapedWord = _escapeRegExp(word); + Object.keys(terms).forEach((term) => { + if (term.match(escapedWord) && !terms[word]) + arr.push({ 
files: terms[term], score: Scorer.partialTerm }); + }); + Object.keys(titleTerms).forEach((term) => { + if (term.match(escapedWord) && !titleTerms[word]) + arr.push({ files: titleTerms[word], score: Scorer.partialTitle }); + }); + } + + // no match but word was a required one + if (arr.every((record) => record.files === undefined)) return; + + // found search word in contents + arr.forEach((record) => { + if (record.files === undefined) return; + + let recordFiles = record.files; + if (recordFiles.length === undefined) recordFiles = [recordFiles]; + files.push(...recordFiles); + + // set score for the word in each file + recordFiles.forEach((file) => { + if (!scoreMap.has(file)) scoreMap.set(file, {}); + scoreMap.get(file)[word] = record.score; + }); + }); + + // create the mapping + files.forEach((file) => { + if (fileMap.has(file) && fileMap.get(file).indexOf(word) === -1) + fileMap.get(file).push(word); + else fileMap.set(file, [word]); + }); + }); + + // now check if the files don't contain excluded terms + const results = []; + for (const [file, wordList] of fileMap) { + // check if all requirements are matched + + // as search terms with length < 3 are discarded + const filteredTermCount = [...searchTerms].filter( + (term) => term.length > 2 + ).length; + if ( + wordList.length !== searchTerms.size && + wordList.length !== filteredTermCount + ) + continue; + + // ensure that none of the excluded terms is in the search result + if ( + [...excludedTerms].some( + (term) => + terms[term] === file || + titleTerms[term] === file || + (terms[term] || []).includes(file) || + (titleTerms[term] || []).includes(file) + ) + ) + break; + + // select one (max) score for the file. + const score = Math.max(...wordList.map((w) => scoreMap.get(file)[w])); + // add result to the result list + results.push([ + docNames[file], + titles[file], + "", + null, + score, + filenames[file], + ]); + } + return results; + }, + + /** + * helper function to return a node containing the + * search summary for a given text. keywords is a list + * of stemmed words. + */ + makeSearchSummary: (htmlText, keywords) => { + const text = Search.htmlToText(htmlText); + if (text === "") return null; + + const textLower = text.toLowerCase(); + const actualStartPosition = [...keywords] + .map((k) => textLower.indexOf(k.toLowerCase())) + .filter((i) => i > -1) + .slice(-1)[0]; + const startWithContext = Math.max(actualStartPosition - 120, 0); + + const top = startWithContext === 0 ? "" : "..."; + const tail = startWithContext + 240 < text.length ? "..." : ""; + + let summary = document.createElement("p"); + summary.classList.add("context"); + summary.textContent = top + text.substr(startWithContext, 240).trim() + tail; + + return summary; + }, +}; + +_ready(Search.init); diff --git a/branch/bicounty_2035_hwy_update/_static/sphinx_highlight.js b/branch/bicounty_2035_hwy_update/_static/sphinx_highlight.js new file mode 100644 index 0000000..aae669d --- /dev/null +++ b/branch/bicounty_2035_hwy_update/_static/sphinx_highlight.js @@ -0,0 +1,144 @@ +/* Highlighting utilities for Sphinx HTML documentation. */ +"use strict"; + +const SPHINX_HIGHLIGHT_ENABLED = true + +/** + * highlight a given string on a node by wrapping it in + * span elements with the given class name. 
+ */ +const _highlight = (node, addItems, text, className) => { + if (node.nodeType === Node.TEXT_NODE) { + const val = node.nodeValue; + const parent = node.parentNode; + const pos = val.toLowerCase().indexOf(text); + if ( + pos >= 0 && + !parent.classList.contains(className) && + !parent.classList.contains("nohighlight") + ) { + let span; + + const closestNode = parent.closest("body, svg, foreignObject"); + const isInSVG = closestNode && closestNode.matches("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.classList.add(className); + } + + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + parent.insertBefore( + span, + parent.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling + ) + ); + node.nodeValue = val.substr(0, pos); + + if (isInSVG) { + const rect = document.createElementNS( + "http://www.w3.org/2000/svg", + "rect" + ); + const bbox = parent.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute("class", className); + addItems.push({ parent: parent, target: rect }); + } + } + } else if (node.matches && !node.matches("button, select, textarea")) { + node.childNodes.forEach((el) => _highlight(el, addItems, text, className)); + } +}; +const _highlightText = (thisNode, text, className) => { + let addItems = []; + _highlight(thisNode, addItems, text, className); + addItems.forEach((obj) => + obj.parent.insertAdjacentElement("beforebegin", obj.target) + ); +}; + +/** + * Small JavaScript module for the documentation. + */ +const SphinxHighlight = { + + /** + * highlight the search words provided in localstorage in the text + */ + highlightSearchWords: () => { + if (!SPHINX_HIGHLIGHT_ENABLED) return; // bail if no highlight + + // get and clear terms from localstorage + const url = new URL(window.location); + const highlight = + localStorage.getItem("sphinx_highlight_terms") + || url.searchParams.get("highlight") + || ""; + localStorage.removeItem("sphinx_highlight_terms") + url.searchParams.delete("highlight"); + window.history.replaceState({}, "", url); + + // get individual terms from highlight string + const terms = highlight.toLowerCase().split(/\s+/).filter(x => x); + if (terms.length === 0) return; // nothing to do + + // There should never be more than one element matching "div.body" + const divBody = document.querySelectorAll("div.body"); + const body = divBody.length ? 
divBody[0] : document.querySelector("body"); + window.setTimeout(() => { + terms.forEach((term) => _highlightText(body, term, "highlighted")); + }, 10); + + const searchBox = document.getElementById("searchbox"); + if (searchBox === null) return; + searchBox.appendChild( + document + .createRange() + .createContextualFragment( + '" + ) + ); + }, + + /** + * helper function to hide the search marks again + */ + hideSearchWords: () => { + document + .querySelectorAll("#searchbox .highlight-link") + .forEach((el) => el.remove()); + document + .querySelectorAll("span.highlighted") + .forEach((el) => el.classList.remove("highlighted")); + localStorage.removeItem("sphinx_highlight_terms") + }, + + initEscapeListener: () => { + // only install a listener if it is really needed + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.shiftKey || event.altKey || event.ctrlKey || event.metaKey) return; + if (DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS && (event.key === "Escape")) { + SphinxHighlight.hideSearchWords(); + event.preventDefault(); + } + }); + }, +}; + +_ready(SphinxHighlight.highlightSearchWords); +_ready(SphinxHighlight.initEscapeListener); diff --git a/branch/bicounty_2035_hwy_update/autodoc/index.html b/branch/bicounty_2035_hwy_update/autodoc/index.html new file mode 100644 index 0000000..376fe34 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/autodoc/index.html @@ -0,0 +1,165 @@ + + + + + + + Lasso Classes and Functions — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Lasso Classes and Functions

+
+

Base Classes

CubeTransit: Class for storing information about transit defined in Cube line files.
StandardTransit: Holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files (see the sketch below).
ModelRoadwayNetwork: Subclass of network_wrangler class RoadwayNetwork.
Project: A single or set of changes to the roadway or transit system.
Parameters: A class representing all the parameters defining the networks including time of day, categories, etc.
+
+
+
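As a quick illustration of the base classes listed above, here is a minimal Python sketch of the StandardTransit workflow (GTFS feed in, Cube line file out). It assumes the read_gtfs and write_as_cube_lin methods named in the API reference; the feed directory and output path are placeholders.

import os
from lasso import StandardTransit

# Placeholder locations for illustration only.
BASE_TRANSIT_DIR = "examples/gtfs_feed"  # directory containing GTFS text files
WRITE_DIR = "examples/output"

# Read the GTFS feed into a StandardTransit object (wrapping a partridge feed).
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)

# Translate the feed and write it out as a Cube line (.lin) file.
cube_transit_net.write_as_cube_lin(outpath=os.path.join(WRITE_DIR, "transit.lin"))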

Utils and Functions

util
logger
+
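The logger module exposes a setup helper. A hedged sketch of calling it is below; the argument names follow the terms in the API reference (infoLogFilename, debugLogFilename, logToConsole) and should be checked against lasso/logger.py before use.

from lasso.logger import setupLogging  # assumed import path per the module index

# Write a terse info log and a verbose debug log, and echo messages to the console.
setupLogging(
    infoLogFilename="lasso_info.log",
    debugLogFilename="lasso_debug.log",
    logToConsole=True,
)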
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/genindex/index.html b/branch/bicounty_2035_hwy_update/genindex/index.html new file mode 100644 index 0000000..cb4a8a9 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/genindex/index.html @@ -0,0 +1,1038 @@ + + + + + + Index — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Index

+ +
+ _ + | A + | B + | C + | D + | E + | F + | G + | H + | I + | K + | L + | M + | N + | O + | P + | R + | S + | T + | U + | V + | W + | X + | Y + | Z + +
+

_

+ + +
+ +

A

+ + + +
+ +

B

+ + + +
+ +

C

+ + + +
+ +

D

+ + + +
+ +

E

+ + + +
+ +

F

+ + + +
+ +

G

+ + + +
+ +

H

+ + + +
+ +

I

+ + + +
+ +

K

+ + +
+ +

L

+ + + +
+ +

M

+ + + +
+ +

N

+ + + +
+ +

O

+ + + +
+ +

P

+ + + +
+ +

R

+ + + +
+ +

S

+ + + +
+ +

T

+ + + +
+ +

U

+ + + +
+ +

V

+ + + +
+ +

W

+ + + +
+ +

X

+ + + +
+ +

Y

+ + +
+ +

Z

+ + + +
+ + + +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
Built with Sphinx using a theme provided by Read the Docs.
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/index.html b/branch/bicounty_2035_hwy_update/index.html new file mode 100644 index 0000000..5a887f3 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/index.html @@ -0,0 +1,183 @@ + + + + + + + Welcome to lasso’s documentation! — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Welcome to lasso’s documentation!

+

This package of utilities is a wrapper around the [network_wrangler](http://github.com/wsp-sag/network_wrangler) package for MetCouncil. It aims to have the following functionality:

1. parse Cube log files and base highway networks and create ProjectCards for Network Wrangler (a sketch of this workflow follows the list)
2. parse two Cube transit line files and create ProjectCards for NetworkWrangler
3. refine Network Wrangler highway networks to contain specific variables and settings for Metropolitan Council and export them to a format that can be read in by Citilab’s Cube software.
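For the first item in the list, a minimal sketch of turning a Cube build log into a project card is shown below. It assumes the create_project keyword names listed in the Project API reference (roadway_log_file, base_roadway_dir); every path is a placeholder.

import os
from lasso import Project

CUBE_LOG_DIR = "examples/cube"      # placeholder: folder containing the Cube .log file
BASE_ROADWAY_DIR = "examples/base"  # placeholder: base standard roadway network
SCRATCH_DIR = "examples/scratch"

# Build a Project from the highway changes recorded in a Cube log file.
project = Project.create_project(
    roadway_log_file=os.path.join(CUBE_LOG_DIR, "build.log"),
    base_roadway_dir=BASE_ROADWAY_DIR,
)

# Write the changes out as a Network Wrangler project card.
project.write_project_card(os.path.join(SCRATCH_DIR, "roadway_changes.yml"))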
+ +
+
+

Indices and tables

+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/objects.inv b/branch/bicounty_2035_hwy_update/objects.inv new file mode 100644 index 0000000..5f26290 Binary files /dev/null and b/branch/bicounty_2035_hwy_update/objects.inv differ diff --git a/branch/bicounty_2035_hwy_update/py-modindex/index.html b/branch/bicounty_2035_hwy_update/py-modindex/index.html new file mode 100644 index 0000000..a64daae --- /dev/null +++ b/branch/bicounty_2035_hwy_update/py-modindex/index.html @@ -0,0 +1,136 @@ + + + + + + Python Module Index — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Python Module Index

+ +
+ l +
+ + + + + + + + + + + + + +
 
+ l
+ lasso +
    + lasso.logger +
    + lasso.util +
+ + +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/running/index.html b/branch/bicounty_2035_hwy_update/running/index.html new file mode 100644 index 0000000..36a27f9 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/running/index.html @@ -0,0 +1,133 @@ + + + + + + + Running Lasso — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Running Lasso

+
+

Create project files

+
+
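A hedged sketch of creating a project card from two Cube transit line files, following the typical-usage terms in the Project API reference (create_project, evaluate_changes, write_project_card); the directory and file names are placeholders.

import os
from lasso import Project

CUBE_DIR = "examples/cube"        # placeholder: folder with base and build .lin files
SCRATCH_DIR = "examples/scratch"  # placeholder: output folder

# Compare a base and a build Cube transit line file.
test_project = Project.create_project(
    base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
    build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
)

# Evaluate the differences and write them out as a project card for Network Wrangler.
test_project.evaluate_changes()
test_project.write_project_card(os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml"))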
+

Create a scenario

+
+
+

Exporting networks

+
+
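A hedged sketch of exporting a standard network in MetCouncil's model format, assuming the ModelRoadwayNetwork methods named in the API reference (read, roadway_standard_to_met_council_network, write_roadway_as_shp, write_roadway_as_fixedwidth); the input files are placeholders and output locations fall back to the Parameters defaults.

from lasso import ModelRoadwayNetwork

# Placeholder standard-network files as produced by network_wrangler.
model_net = ModelRoadwayNetwork.read(
    link_filename="examples/link.json",
    node_filename="examples/node.geojson",
    shape_filename="examples/shape.geojson",
)

# Convert the standard network to MetCouncil's model network format.
model_net.roadway_standard_to_met_council_network()

# Export as shapefile/CSV and as fixed-width text files with a Cube read script.
model_net.write_roadway_as_shp()
model_net.write_roadway_as_fixedwidth()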
+

Auditing and Reporting

+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/search/index.html b/branch/bicounty_2035_hwy_update/search/index.html new file mode 100644 index 0000000..8bafd66 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/search/index.html @@ -0,0 +1,126 @@ + + + + + + Search — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + + + +
+ +
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + + + + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/searchindex.js b/branch/bicounty_2035_hwy_update/searchindex.js new file mode 100644 index 0000000..8ef2c89 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/searchindex.js @@ -0,0 +1 @@ +Search.setIndex({"docnames": ["_generated/lasso.CubeTransit", "_generated/lasso.ModelRoadwayNetwork", "_generated/lasso.Parameters", "_generated/lasso.Project", "_generated/lasso.StandardTransit", "_generated/lasso.logger", "_generated/lasso.util", "autodoc", "index", "running", "setup", "starting"], "filenames": ["_generated/lasso.CubeTransit.rst", "_generated/lasso.ModelRoadwayNetwork.rst", "_generated/lasso.Parameters.rst", "_generated/lasso.Project.rst", "_generated/lasso.StandardTransit.rst", "_generated/lasso.logger.rst", "_generated/lasso.util.rst", "autodoc.rst", "index.rst", "running.md", "setup.md", "starting.md"], "titles": ["lasso.CubeTransit", "lasso.ModelRoadwayNetwork", "lasso.Parameters", "lasso.Project", "lasso.StandardTransit", "lasso.logger", "lasso.util", "Lasso Classes and Functions", "Welcome to lasso\u2019s documentation!", "Running Lasso", "Setup", "Starting Out"], "terms": {"class": [0, 1, 2, 3, 4, 6, 8, 11], "paramet": [0, 1, 3, 4, 6, 8], "sourc": [0, 1, 2, 3, 4, 5, 6, 11], "base": [0, 1, 2, 3, 4, 6, 8, 11], "object": [0, 1, 2, 3, 4, 6, 11], "store": [0, 1, 11], "inform": [0, 1, 4, 11], "about": [0, 1, 4, 11], "transit": [0, 1, 2, 3, 4, 8], "defin": [0, 1, 2, 3, 11], "cube": [0, 1, 2, 3, 4, 8], "line": [0, 1, 2, 3, 4, 6, 8, 11], "file": [0, 1, 2, 3, 4, 8], "ha": [0, 6, 11], "capabl": [0, 11], "pars": [0, 8, 11], "properti": [0, 1, 2, 4, 6, 11], "shape": [0, 1, 3, 4, 6, 11], "python": [0, 1, 2, 11], "dictionari": [0, 1, 2, 3, 4, 6, 11], "compar": [0, 3, 4, 11], "repres": [0, 2, 4, 6, 11], "chang": [0, 1, 3, 4, 11], "project": [0, 1, 2, 4, 6, 8], "card": [0, 1, 3, 4], "typic": [0, 3, 4, 6, 8], "usag": [0, 1, 3, 4], "exampl": [0, 1, 3, 4, 6, 11], "tn": [0, 11], "create_from_cub": [0, 11], "cube_dir": [0, 3, 11], "transit_change_list": [0, 11], "evaluate_differ": [0, 4, 11], "base_transit_network": [0, 3, 11], "list": [0, 1, 2, 3, 4, 6, 11], "string": [0, 1, 3, 4, 6], "uniqu": [0, 1, 3], "name": [0, 1, 2, 3, 4, 6, 11], "network": [0, 1, 2, 3, 4, 5, 8], "type": [0, 1, 2, 3, 4, 6, 11], "line_properti": 0, "kei": [0, 1, 3, 11], "valu": [0, 1, 3, 4, 6, 11], "ar": [0, 1, 2, 3, 6, 11], "These": 0, "directli": 0, "read": [0, 1, 3, 4, 8, 11], "from": [0, 1, 2, 3, 4, 6, 8], "haven": 0, "t": [0, 1, 6], "been": [0, 6], "translat": [0, 4, 11], "standard": [0, 1, 2, 4, 11], "dict": [0, 1, 2, 3], "panda": [0, 1, 3], "datafram": [0, 1, 3, 4, 11], "node": [0, 1, 2, 3, 4, 6, 11], "follow": [0, 4, 8, 11], "column": [0, 1], "node_id": 0, "int": [0, 1, 2, 3, 4, 6], "posit": [0, 6], "integ": [0, 1, 2], "id": [0, 1, 3, 4, 11], "number": [0, 1, 2, 3, 4, 6, 11], "neg": [0, 6], "indic": [0, 1, 6], "non": [0, 6], "stop": 0, "boolean": [0, 1], "i": [0, 1, 3, 4, 5, 6, 8, 11], "order": [0, 1, 2, 6, 11], "within": [0, 1, 2, 6], "thi": [0, 1, 2, 3, 6, 8, 11], "program_typ": 0, "either": [0, 1, 5, 6, 11], "pt": 0, "trnbld": 0, "str": [0, 1, 2, 3, 6], "instanc": [0, 1, 2, 3, 4, 11], "appli": [0, 1, 6], "which": [0, 1, 3, 6, 11], "includ": [0, 1, 2, 11], "time": [0, 1, 2, 4, 6, 11], "period": [0, 1, 2, 4, 11], "variabl": [0, 1, 2, 3, 4, 8, 11], "source_list": 0, "have": [0, 1, 6, 8], "ad": [0, 1, 3, 6, 11], "diff_dict": 0, "__init__": [0, 1, 2, 3, 4], "constructor": [0, 1, 3], "set": [0, 1, 2, 3, 4, 5, 6, 8, 11], "see": [0, 
1, 3, 4, 6], "an": [0, 1, 3, 4, 6, 11], "method": [0, 1, 2, 3, 4, 6, 11], "add_additional_time_period": 0, "new_time_period_numb": 0, "orig_line_nam": 0, "copi": [0, 1, 3, 6, 11], "rout": [0, 1, 4], "anoth": 0, "appropri": [0, 4], "specif": [0, 1, 8, 11], "new": [0, 1, 6, 11], "under": 0, "self": [0, 1, 2, 3, 4, 6], "origin": [0, 1, 6, 11], "its": [0, 1], "return": [0, 1, 3, 4, 6], "add_cub": 0, "transit_sourc": 0, "lin": [0, 3, 4], "add": [0, 1, 3, 6, 11], "exist": [0, 1, 3, 6, 11], "transitnetwork": [0, 4], "directori": [0, 1, 4, 11], "static": [0, 1, 3, 4], "build_route_nam": 0, "route_id": [0, 4], "time_period": [0, 1, 2], "agency_id": 0, "0": [0, 1, 2, 4, 6, 11], "direction_id": 0, "1": [0, 1, 2, 4, 6, 8, 11], "creat": [0, 1, 2, 3, 4, 6, 8, 11], "contaten": 0, "agenc": 0, "direct": [0, 1, 6, 11], "e": [0, 1, 11], "452": 0, "111": 0, "pk": [0, 2], "construct": [0, 6, 11], "line_nam": 0, "0_452": 0, "111_452_pk1": 0, "calculate_start_end_tim": 0, "line_properties_dict": 0, "calcul": [0, 1, 2, 3, 4, 6], "start": [0, 1, 4, 8], "end": [0, 1, 6], "warn": [0, 1], "doesn": [0, 1], "take": [0, 1], "care": 0, "discongru": 0, "flavor": [0, 1], "create_add_route_card_dict": 0, "format": [0, 1, 2, 4, 8, 11], "route_properti": 0, "being": 0, "updat": [0, 1, 11], "A": [0, 1, 2, 3, 4, 6, 11], "addit": [0, 1, 6, 8, 11], "create_delete_route_card_dict": 0, "base_transit_line_properties_dict": 0, "delet": [0, 1], "style": [0, 6], "attribut": [0, 1, 2, 3, 4, 11], "find": [0, 1, 4], "create_update_route_card_dict": 0, "updated_properties_dict": 0, "cube_properties_to_standard_properti": 0, "cube_properties_dict": 0, "convert": [0, 1, 4, 6], "most": 0, "pertin": 0, "like": [0, 1, 2, 6], "headwai": [0, 2, 4], "varibl": [0, 2], "stnadard": 0, "unit": [0, 1, 6], "minut": 0, "second": [0, 4, 6], "correct": 0, "base_transit": 0, "identifi": [0, 1, 11], "what": [0, 1, 3, 6], "need": [0, 1, 2, 3], "For": [0, 1, 4, 6], "multipl": [0, 1, 6, 11], "make": [0, 1, 11], "duplic": 0, "so": [0, 1, 5, 6, 11], "each": [0, 1, 3, 6], "condit": [0, 1], "contain": [0, 1, 4, 6, 8, 11], "requir": [0, 1, 6, 11], "evalu": [0, 1, 3], "differ": [0, 2, 6], "between": [0, 1, 2, 3, 6], "evaluate_route_property_differ": 0, "properties_build": 0, "properties_bas": 0, "time_period_numb": 0, "absolut": [0, 6], "true": [0, 1, 3, 4, 5, 6, 11], "validate_bas": 0, "fals": [0, 1, 3, 4, 6, 11], "check": [0, 1, 3, 6], "ani": [0, 1, 3, 6, 11], "entri": [0, 3], "property_nam": 0, "property_valu": 0, "us": [0, 1, 2, 3, 4, 6, 11], "command": [0, 1, 3], "rather": [0, 1], "than": [0, 1, 6], "If": [0, 1, 3, 4, 6, 11], "automat": 0, "note": [0, 6, 11], "onli": [0, 1, 3, 4, 6], "numer": [0, 4, 6], "frequenc": [0, 4, 11], "suitabl": 0, "write": [0, 1, 3, 4, 11], "evaluate_route_shape_chang": 0, "shape_build": 0, "shape_bas": 0, "two": [0, 1, 6, 8, 11], "build": [0, 1, 3, 11], "version": [0, 6, 11], "ddatafram": 0, "get_time_period_numbers_from_cube_properti": 0, "properties_list": 0, "associ": [0, 2], "them": [0, 1, 6, 8], "all": [0, 1, 2, 5, 6, 11], "found": [0, 1, 6], "unpack_route_nam": 0, "unpack": 0, "info": [0, 1, 2], "link": [1, 2, 3, 6, 11], "kwarg": [1, 2, 3, 6], "roadwaynetwork": [1, 3, 4], "subclass": [1, 11], "network_wrangl": [1, 8, 11], "represent": [1, 3, 4, 6], "physic": 1, "roadwai": [1, 2, 3, 11], "geodatafram": [1, 11], "specifi": [1, 3, 6], "default": [1, 2, 3, 4, 6, 11], "cr": [1, 3, 6], "coordin": [1, 3, 6], "refer": [1, 3, 4, 6, 11], "system": [1, 3], "espg": [1, 3], "node_foreign_kei": [1, 3], "tabl": [1, 2, 3, 6], 
"link_foreign_kei": [1, 3], "foreign": [1, 3], "shape_foreign_kei": [1, 3, 11], "unique_link_id": [1, 3], "unique_node_id": [1, 3], "modes_to_network_link_vari": [1, 3], "map": [1, 2, 3, 6, 11], "mode": [1, 3, 4, 11], "modes_to_network_nodes_vari": [1, 3], "managed_lanes_node_id_scalar": [1, 3], "scalar": [1, 3, 6], "primari": [1, 3], "correspond": [1, 3], "manag": [1, 2, 3, 11], "lane": [1, 2, 3, 11], "managed_lanes_link_id_scalar": [1, 3], "managed_lanes_required_attribut": [1, 3], "must": [1, 3, 6], "keep_same_attributes_ml_and_gp": [1, 3], "parallel": [1, 3, 6], "gener": [1, 3], "purpos": [1, 3, 6], "add_count": 1, "network_vari": 1, "aadt": 1, "mndot_count_shst_data": 1, "none": [1, 3, 4, 5, 6], "widot_count_shst_data": 1, "mndot_count_variable_shp": [1, 2], "widot_count_variable_shp": 1, "count": [1, 2], "mc": [1, 2, 4], "join": [1, 3, 4, 6, 11], "data": [1, 2, 3, 4, 8, 11], "via": 1, "shst": 1, "api": 1, "match": 1, "result": [1, 6, 11], "should": [1, 2, 3, 6, 11], "written": [1, 11], "path": [1, 3, 4, 6, 11], "mndot": [1, 2], "locat": [1, 2, 3, 4], "widot": 1, "geodatabas": 1, "add_incident_link_data_to_nod": 1, "links_df": [1, 11], "nodes_df": [1, 11], "link_vari": 1, "unique_node_kei": 1, "model_node_id": [1, 2], "go": [1, 11], "assess": [1, 3], "connect": 1, "incid": 1, "where": 1, "length": [1, 6], "n": [1, 2, 3, 6, 11], "out": [1, 2, 5, 6, 8], "add_new_roadway_feature_chang": 1, "featur": [1, 6], "also": [1, 6, 11], "valid": [1, 6, 11], "add_variable_using_shst_refer": 1, "var_shst_csvdata": 1, "shst_csv_variabl": 1, "network_var_typ": 1, "overwrit": 1, "bool": [1, 3, 6], "addition_map": 1, "show": 1, "project_card_dictionari": 1, "wrapper": [1, 8, 11], "apply_managed_lane_feature_chang": 1, "link_idx": 1, "in_plac": [1, 11], "lndice": 1, "whether": 1, "decid": 1, "connector": [1, 2], "when": [1, 3, 6], "thei": [1, 2], "more": [1, 6, 11], "apply_python_calcul": 1, "pycod": 1, "execut": 1, "code": [1, 2, 6, 11], "apply_roadway_feature_chang": [1, 11], "select": [1, 11], "pass": [1, 5, 6], "assess_connect": [1, 11], "ignore_end_nod": [1, 11], "graph": 1, "disconnect": 1, "subgraph": 1, "describ": [1, 11], "member": [1, 6], "one": [1, 6, 11], "drive": [1, 11], "walk": [1, 11], "bike": 1, "ignor": [1, 6], "strai": 1, "singleton": 1, "tupl": [1, 6], "osmnx": [1, 11], "networkx": 1, "digraph": 1, "build_selection_kei": 1, "selection_dict": 1, "combin": [1, 2, 4, 6], "queri": [1, 11], "b": [1, 2, 11], "you": [1, 3, 11], "selection_dictonari": 1, "serv": 1, "calculate_area_typ": 1, "area_type_shap": [1, 2], "area_type_shape_vari": 1, "area_typ": [1, 2, 3], "area_type_codes_dict": 1, "downtown_area_type_shap": [1, 2], "downtown_area_typ": [1, 2], "area": [1, 2, 3, 6], "centroid": [1, 2, 6], "geometri": [1, 2, 6], "field": [1, 2, 11], "determin": [1, 3, 6], "label": 1, "isn": 1, "perfect": 1, "much": 1, "quicker": 1, "other": [1, 6, 11], "The": [1, 2, 5, 6, 11], "geodadabas": 1, "input": [1, 6], "downtown": [1, 2], "boundari": [1, 2, 6], "counti": [1, 2, 3], "calculate_centroidconnect": 1, "centroidconnect": [1, 2, 3], "highest_taz_numb": [1, 2], "as_integ": 1, "max": 1, "taz": [1, 2], "calculate_counti": 1, "county_shap": [1, 2], "county_shape_vari": 1, "county_codes_dict": 1, "calculate_dist": 1, "distanc": [1, 2, 3, 6], "centroidconnect_onli": 1, "mile": 1, "centroidconnector": 1, "calculate_mpo": 1, "county_network_vari": 1, "mpo": [1, 2], "mpo_counti": [1, 2], "param": [1, 4], "county_vari": 1, "region": [1, 6], "calculate_us": 1, "defauli": 1, "convert_int": 1, "int_col_nam": 1, 
"create_ml_vari": 1, "ml_lane": [1, 2], "ml": 1, "placehold": 1, "come": 1, "log": [1, 3, 8], "create_calculated_vari": 1, "create_dummy_connector_link": 1, "ml_df": 1, "access_lan": 1, "egress_lan": 1, "access_roadwai": 1, "ml_access": 1, "egress_roadwai": 1, "access_name_prefix": 1, "access": [1, 4, 6], "dummi": 1, "egress_name_prefix": 1, "egress": 1, "gp_df": 1, "roadai": 1, "prefix": 1, "create_hov_corridor_vari": 1, "segment_id": [1, 2], "hov": 1, "corridor": 1, "create_managed_lane_network": [1, 11], "keep_additional_attributes_ml_and_gp": 1, "separ": [1, 3, 6], "look": 1, "want": [1, 11], "leav": 1, "some": 1, "rigor": 1, "test": [1, 2, 6], "create_managed_vari": 1, "dataframe_to_fixed_width": 1, "df": 1, "fix": [1, 2], "width": [1, 6], "transform": [1, 6], "delete_roadway_feature_chang": 1, "ignore_miss": 1, "get": [1, 2, 6, 11], "miss": 1, "fail": [1, 6], "deletion_map": 1, "fill_na": 1, "fill": [1, 6], "na": 1, "from_roadwaynetwork": [1, 11], "roadway_network_object": 1, "get_attribut": 1, "join_kei": 1, "source_shst_ref_df": 1, "source_gdf": 1, "field_nam": 1, "get_managed_lane_node_id": 1, "nodes_list": 1, "4500000": 1, "237": 1, "get_modal_graph": 1, "bike_access": [1, 2], "bu": [1, 4], "bus_onli": [1, 2], "drive_access": [1, 2, 11], "rail": [1, 4], "rail_onli": [1, 2], "walk_access": [1, 2], "strongli": 1, "vertex": [1, 6], "reachabl": 1, "everi": [1, 6], "get_modal_links_nod": 1, "kept": 1, "both": [1, 4, 6, 11], "filter": [1, 6], "right": [1, 6], "now": 1, "we": [1, 11], "don": 1, "becaus": [1, 6, 11], "mark": 1, "issu": 1, "discuss": 1, "http": [1, 4, 6, 8, 11], "github": [1, 6, 8, 11], "com": [1, 4, 6, 8, 11], "wsp": [1, 8, 11], "sag": [1, 8, 11], "145": 1, "modal_nodes_df": 1, "mode_node_vari": 1, "get_property_by_time_period_and_group": 1, "prop": 1, "categori": [1, 2, 11], "default_return": 1, "seri": [1, 11], "group": 1, "16": [1, 2, 6], "00": [1, 2], "19": [1, 2], "option": [1, 6], "sov": [1, 2], "search": [1, 2, 8, 11], "hov3": [1, 2], "hov2": [1, 2], "identify_seg": 1, "o_id": 1, "d_id": 1, "endpoint": 1, "up": [1, 5, 11], "segment": [1, 4, 6, 11], "candid": 1, "otherwis": [1, 4, 6], "ram": 1, "hog": 1, "could": [1, 6, 11], "odd": 1, "shortest": 1, "segment_vari": 1, "keep": [1, 6], "identify_segment_endpoint": 1, "min_connecting_link": 1, "10": [1, 2, 6], "min_dist": 1, "max_link_devi": 1, "2": [1, 2, 4, 6, 11], "is_network_connect": [1, 11], "consid": [1, 6, 11], "cach": 1, "long": [1, 6], "load_transform_network": 1, "node_filenam": [1, 11], "link_filenam": [1, 11], "shape_filenam": [1, 11], "4326": [1, 6], "validate_schema": 1, "disk": 1, "schema": [1, 11], "shapes_df": [1, 11], "network_connection_plot": 1, "g": [1, 6], "disconnected_subgraph_nod": 1, "plot": 1, "fig": 1, "ax": [1, 6], "orig_dest_nodes_foreign_kei": 1, "whatev": 1, "u": 1, "v": [1, 2, 11], "ab": 1, "noth": 1, "assum": 1, "a_id": 1, "b_id": 1, "ox_graph": 1, "unique_link_kei": 1, "model_link_id": [1, 2, 3], "arrai": [1, 6], "remov": [1, 6], "certain": 1, "do": [1, 6, 11], "too": [1, 5], "link_df": 1, "referenc": 1, "multidigraph": 1, "path_search": 1, "candidate_links_df": 1, "weight_column": 1, "weight_factor": 1, "search_breadth": 1, "5": [1, 2, 4, 6], "max_search_breadth": 1, "candidate_link": 1, "part": [1, 6, 11], "foreigh": 1, "destin": 1, "weight": 1, "iter": [1, 6], "multipli": 1, "fast": [1, 11], "recalculate_calculated_vari": [1, 3], "recalculate_dist": [1, 3], "json": [1, 11], "geojson": 1, "skip": 1, "speed": 1, "spatial": [1, 6], "etc": [1, 2, 3, 11], "re": 1, "read_match_result": 
1, "lot": 1, "same": [1, 6], "concaten": 1, "singl": [1, 3, 6], "geopanda": [1, 11], "sure": 1, "why": 1, "util": [1, 8, 11], "rename_variables_for_dbf": 1, "input_df": 1, "variable_crosswalk": 1, "output_vari": [1, 2], "convert_geometry_to_xi": 1, "renam": [1, 3], "dbf": 1, "shp": [1, 2], "char": 1, "crosswalk": [1, 3], "x": [1, 2, 6], "y": [1, 2, 6], "roadway_net_to_gdf": 1, "roadway_net": 1, "turn": [1, 11], "export": [1, 8, 11], "sophist": 1, "attach": 1, "roadway_standard_to_met_council_network": 1, "output_epsg": [1, 2], "consist": [1, 6, 11], "metcouncil": [1, 2, 4, 8, 11], "": [1, 2, 4, 6, 11], "model": [1, 2], "expect": [1, 6], "epsg": [1, 2, 6], "output": [1, 2, 3, 4, 6], "select_roadway_featur": [1, 11], "search_mod": 1, "force_search": 1, "sp_weight_factor": 1, "satisfi": [1, 6], "criteria": 1, "net": [1, 11], "osm": [1, 2], "share": [1, 11], "street": [1, 11], "osm_model_link_id": 1, "1234": 1, "shstid": 1, "4321": 1, "regex": 1, "facil": [1, 2, 11], "main": 1, "st": [1, 2], "least": [1, 11], "perform": 1, "even": 1, "previou": 1, "discourag": 1, "meander": 1, "ref": 1, "here": [1, 6], "defaul": 1, "selection_has_unique_link_id": 1, "selection_dictionari": 1, "selection_map": 1, "selected_link_idx": 1, "candidate_link_idx": 1, "selected_links_idx": 1, "candidate_links_idx": 1, "shortest_path": 1, "graph_links_df": 1, "100": 1, "four": 1, "nx": 1, "split_properties_by_time_period_and_categori": 1, "properties_to_split": [1, 2], "split": [1, 2, 4], "structur": 1, "stratifi": 1, "times_period": 1, "am": [1, 2], "6": [1, 2, 4, 6], "pm": [1, 2], "15": [1, 2], "update_dist": 1, "use_shap": 1, "inplac": 1, "straight": 1, "avail": 1, "portion": 1, "provid": [1, 3, 4, 6, 11], "entir": 1, "crow": 1, "fly": 1, "meter": [1, 6], "nan": 1, "validate_link_schema": 1, "schema_loc": 1, "roadway_network_link": 1, "validate_node_schema": 1, "node_fil": 1, "roadway_network_nod": 1, "validate_properti": 1, "ignore_exist": 1, "require_existing_for_chang": 1, "theproject": 1, "dictonari": 1, "validate_select": 1, "selection_requir": 1, "whetther": 1, "minimum": [1, 5, 6], "validate_shape_schema": 1, "shape_fil": 1, "roadway_network_shap": 1, "validate_uniqu": 1, "confirm": 1, "met": 1, "filenam": [1, 3, 11], "were": 1, "save": 1, "write_roadway_as_fixedwidth": [1, 11], "output_dir": 1, "node_output_vari": 1, "link_output_vari": 1, "output_link_txt": [1, 2], "output_node_txt": [1, 2], "output_link_header_width_txt": [1, 2], "output_node_header_width_txt": [1, 2], "output_cube_network_script": [1, 2], "drive_onli": 1, "function": [1, 5, 6, 8, 11], "doe": [1, 4], "header": [1, 2], "3": [1, 2, 4, 6, 11], "script": [1, 2], "run": [1, 8], "databas": [1, 3], "record": 1, "write_roadway_as_shp": [1, 11], "data_to_csv": 1, "data_to_dbf": 1, "output_link_shp": [1, 2], "output_node_shp": [1, 2], "output_link_csv": [1, 2], "output_node_csv": [1, 2], "output_gpkg": 1, "output_link_gpkg_lay": 1, "output_node_gpkg_lay": 1, "output_gpkg_link_filt": 1, "gpkg": 1, "csv": [1, 2, 3, 11], "full": [1, 6], "geopackag": 1, "layer": [1, 11], "subset": 1, "calculated_valu": [1, 3], "dai": [2, 4, 11], "can": [2, 6, 8, 11], "runtim": [2, 11], "initi": [2, 3, 11], "keyword": [2, 6, 11], "argument": [2, 6, 11], "explicitli": [2, 11], "highlight": [2, 11], "attr": 2, "time_period_to_tim": 2, "abbrevi": [2, 4], "gtf": [2, 4, 11], "highwai": [2, 3, 8, 11], "ea": 2, "md": 2, "ev": 2, "cube_time_period": 2, "4": [2, 4, 6], "demand": 2, "allow": [2, 6], "suffix": 2, "truck": 2, "trk": 2, "final": 2, "lanes_am": 2, 
"time_periods_to_tim": 2, "shapefil": 2, "r": 2, "metcouncil_data": 2, "cb_2017_us_county_5m": 2, "county_variable_shp": 2, "lanes_lookup_fil": 2, "lookup": 2, "centroid_connect_lan": 2, "anoka": 2, "dakota": 2, "hennepin": 2, "ramsei": 2, "scott": 2, "washington": 2, "carver": 2, "taz_shap": 2, "tazofficialwcurrentforecast": 2, "taz_data": 2, "highest": 2, "3100": 2, "link_id": 2, "shstgeometryid": 2, "roadway_class": 2, "truck_access": 2, "trn_priority_ea": 2, "trn_priority_am": 2, "trn_priority_md": 2, "trn_priority_pm": 2, "trn_priority_ev": 2, "ttime_assert_ea": 2, "ttime_assert_am": 2, "ttime_assert_md": 2, "ttime_assert_pm": 2, "ttime_assert_ev": 2, "lanes_ea": 2, "lanes_md": 2, "lanes_pm": 2, "lanes_ev": 2, "price_sov_ea": 2, "price_hov2_ea": 2, "price_hov3_ea": 2, "price_truck_ea": 2, "price_sov_am": 2, "price_hov2_am": 2, "price_hov3_am": 2, "price_truck_am": 2, "price_sov_md": 2, "price_hov2_md": 2, "price_hov3_md": 2, "price_truck_md": 2, "price_sov_pm": 2, "price_hov2_pm": 2, "price_hov3_pm": 2, "price_truck_pm": 2, "price_sov_ev": 2, "price_hov2_ev": 2, "price_hov3_ev": 2, "price_truck_ev": 2, "roadway_class_idx": 2, "facility_typ": 2, "osm_node_id": [2, 6, 11], "bike_nod": 2, "transit_nod": 2, "walk_nod": 2, "drive_nod": 2, "ml_lanes_ea": 2, "ml_lanes_am": 2, "ml_lanes_md": 2, "ml_lanes_pm": 2, "ml_lanes_ev": 2, "osm_facility_type_dict": 2, "thrivemsp2040communitydesign": 2, "area_type_variable_shp": 2, "comdes2040": 2, "area_type_code_dict": 2, "23": 2, "urban": [2, 4], "center": [2, 6], "24": 2, "25": 2, "35": 2, "36": 2, "41": 2, "51": 2, "52": 2, "53": 2, "60": 2, "downtownzones_taz": 2, "mrcc_roadway_class_shap": 2, "mrcc": 2, "trans_mrcc_centerlin": 2, "mrcc_roadway_class_variable_shp": 2, "mrcc_roadway_class_shp": 2, "route_si": 2, "widot_roadway_class_shap": 2, "wisconsin": 2, "wisconsin_lanes_counts_median": 2, "wislr": 2, "widot_roadway_class_variable_shp": 2, "rdwy_ctgy_": 2, "mndot_count_shap": 2, "count_mn": 2, "aadt_2017_count_loc": 2, "osm_highway_facility_type_crosswalk": 2, "legacy_tm2_attribut": 2, "shstreferenceid": 2, "legaci": 2, "tm2": 2, "osm_lanes_attribut": 2, "tam_tm2_attribut": 2, "tam": 2, "tom_tom_attribut": 2, "tomtom": 2, "tomtom_attribut": 2, "sfcta_attribut": 2, "sfcta": 2, "geograph": 2, "102646": 2, "scratch": 2, "txt": [2, 11], "links_header_width": 2, "nodes_header_width": 2, "import": [2, 6], "make_complete_network_from_fixed_width_fil": 2, "county_link_range_dict": 2, "county_code_dict": 2, "7": [2, 4, 11], "extern": 2, "chisago": 2, "11": 2, "goodhu": 2, "12": 2, "isanti": 2, "13": 2, "le": 2, "sueur": 2, "14": 2, "mcleod": 2, "pierc": 2, "polk": 2, "17": 2, "rice": 2, "18": 2, "sherburn": 2, "siblei": 2, "20": 2, "croix": 2, "21": 2, "wright": 2, "22": 2, "route_type_bus_mode_dict": 2, "urb": 2, "loc": 2, "sub": [2, 6], "express": [2, 4, 6], "route_type_mode_dict": 2, "8": [2, 4, 6], "9": [2, 4], "cube_time_periods_nam": 2, "op": 2, "detail": [2, 5], "zone": 2, "possibl": [2, 6], "roadway_link_chang": 3, "roadway_node_chang": 3, "transit_chang": [3, 4], "base_roadway_network": 3, "base_cube_transit_network": 3, "build_cube_transit_network": 3, "project_nam": 3, "produc": [3, 6], "test_project": [3, 11], "create_project": [3, 11], "base_cube_transit_sourc": 3, "o": [3, 4, 11], "build_cube_transit_sourc": 3, "transit_route_shape_chang": [3, 11], "evaluate_chang": [3, 11], "write_project_card": [3, 11], "scratch_dir": [3, 11], "t_transit_shape_test": [3, 11], "yml": [3, 11], "default_project_nam": 3, "level": 3, "constant": 3, 
"static_valu": 3, "card_data": 3, "cubetransit": [3, 8], "bunch": 3, "projectcard": [3, 8], "case": 3, "standardtransit": [3, 8], "add_highway_chang": 3, "limit_variables_to_existing_network": 3, "hoc": [3, 11], "add_transit_chang": 3, "roadway_log_fil": 3, "roadway_shp_fil": 3, "roadway_csv_fil": 3, "network_build_fil": 3, "emme_node_id_crosswalk_fil": 3, "emme_name_crosswalk_fil": 3, "base_roadway_dir": 3, "base_transit_dir": [3, 4, 11], "consum": 3, "logfil": 3, "emm": [3, 4], "folder": 3, "base_cube_transit_fil": 3, "build_cube_transit_fil": 3, "first": [3, 6], "recalcul": 3, "determine_roadway_network_changes_compat": 3, "emme_id_to_wrangler_id": 3, "emme_link_change_df": 3, "emme_node_change_df": 3, "emme_transit_changes_df": 3, "rewrit": 3, "wrangler": [3, 8, 11], "emme_name_to_wrangler_nam": 3, "aggreg": 3, "get_object_from_network_build_command": 3, "row": [3, 4], "histori": 3, "l": 3, "get_operation_from_network_build_command": 3, "action": 3, "c": [3, 6], "d": [3, 6], "read_logfil": 3, "logfilenam": 3, "reprsent": [3, 11], "read_network_build_fil": 3, "networkbuildfilenam": 3, "nework": 3, "assign_group": 3, "user": [3, 11], "TO": 3, "ptg_feed": 4, "hold": [4, 11], "feed": [4, 11], "partridg": [4, 11], "manipul": [4, 11], "cube_transit_net": [4, 11], "read_gtf": [4, 11], "write_as_cube_lin": [4, 11], "write_dir": [4, 11], "outfil": [4, 11], "calculate_cube_mod": 4, "assign": 4, "logic": 4, "route_typ": 4, "develop": [4, 11], "googl": 4, "cube_mod": 4, "route_type_to_cube_mod": 4, "tram": 4, "streetcar": 4, "light": 4, "further": 4, "disaggreg": 4, "buse": 4, "suburban": 4, "longnam": 4, "lower": [4, 6], "elif": 4, "99": 4, "local": [4, 11], "els": [4, 6], "route_long_nam": 4, "cube_format": 4, "represnt": 4, "notat": 4, "trip": 4, "trip_id": 4, "shape_id": [4, 11], "tod": 4, "onewai": 4, "oper": [4, 6], "fromtransitnetwork": 4, "transit_network_object": 4, "modelroadwaynetwork": [4, 8], "gtfs_feed_dir": 4, "route_properties_gtfs_to_cub": 4, "prepar": 4, "trip_df": 4, "shape_gtfs_to_cub": 4, "add_nntim": 4, "shape_gtfs_to_emm": 4, "trip_row": 4, "time_to_cube_time_period": 4, "start_time_sec": 4, "as_str": 4, "verbos": 4, "midnight": [4, 6], "this_tp": 4, "this_tp_num": 4, "outpath": 4, "after": 4, "setuplog": 5, "infologfilenam": 5, "debuglogfilenam": 5, "logtoconsol": 5, "infolog": 5, "ters": 5, "just": 5, "give": 5, "bare": 5, "composit": 5, "clear": 5, "later": 5, "debuglog": 5, "veri": [5, 6], "noisi": 5, "debug": 5, "spew": 5, "consol": 5, "point": 6, "arg": 6, "basegeometri": 6, "possibli": 6, "z": 6, "zero": 6, "dimension": 6, "float": 6, "sequenc": 6, "individu": 6, "p": 6, "print": 6, "almost_equ": 6, "decim": 6, "equal": 6, "place": 6, "deprec": 6, "sinc": 6, "confus": 6, "equals_exact": 6, "instead": 6, "approxim": 6, "compon": [6, 8], "linestr": 6, "1e": 6, "buffer": 6, "quad_seg": 6, "cap_styl": 6, "round": 6, "join_styl": 6, "mitre_limit": 6, "single_sid": 6, "dilat": 6, "eros": 6, "small": 6, "mai": 6, "sometim": 6, "tidi": 6, "polygon": 6, "around": [6, 8, 11], "resolut": 6, "angl": 6, "fillet": 6, "buffercapstyl": 6, "squar": 6, "flat": 6, "circular": 6, "rectangular": 6, "while": 6, "involv": 6, "bufferjoinstyl": 6, "mitr": 6, "bevel": 6, "midpoint": 6, "edg": [6, 8], "touch": 6, "depend": 6, "limit": 6, "ratio": 6, "sharp": 6, "corner": 6, "offset": 6, "meet": 6, "miter": 6, "extend": 6, "To": [6, 11], "prevent": 6, "unreason": 6, "control": 6, "maximum": 6, "exce": 6, "side": 6, "sign": 6, "left": 6, "hand": 6, "regular": 6, "cap": 6, "alwai": 6, "forc": 6, 
"equival": 6, "cap_flat": 6, "quadseg": 6, "alia": 6, "strictli": 6, "wkt": 6, "load": 6, "gon": 6, "approx": 6, "radiu": 6, "circl": 6, "1365484905459": 6, "128": 6, "141513801144": 6, "triangl": 6, "exterior": 6, "coord": 6, "contains_properli": 6, "complet": 6, "common": 6, "document": [6, 11], "covered_bi": 6, "cover": 6, "cross": 6, "grid_siz": 6, "disjoint": 6, "unitless": 6, "dwithin": 6, "given": [6, 11], "topolog": 6, "toler": 6, "comparison": 6, "geometrytyp": 6, "hausdorff_dist": 6, "hausdorff": 6, "interpol": 6, "normal": 6, "along": 6, "linear": 6, "taken": 6, "measur": 6, "revers": 6, "rang": 6, "index": [6, 8], "handl": 6, "clamp": 6, "interpret": 6, "fraction": 6, "line_interpolate_point": 6, "intersect": 6, "line_locate_point": 6, "nearest": 6, "form": 6, "canon": 6, "ring": 6, "multi": 6, "multilinestr": 6, "overlap": 6, "point_on_surfac": 6, "guarante": 6, "cheapli": 6, "representative_point": 6, "relat": 6, "de": 6, "9im": 6, "matrix": 6, "relate_pattern": 6, "pattern": 6, "relationship": 6, "interior": 6, "unchang": 6, "is_ccw": 6, "clockwis": 6, "max_segment_length": 6, "vertic": 6, "longer": 6, "evenli": 6, "subdivid": 6, "densifi": 6, "unmodifi": 6, "array_lik": 6, "greater": 6, "simplifi": 6, "preserve_topologi": 6, "dougla": 6, "peucker": 6, "algorithm": 6, "unless": 6, "topologi": 6, "preserv": 6, "invalid": 6, "svg": 6, "scale_factor": 6, "fill_color": 6, "opac": 6, "element": 6, "factor": 6, "diamet": 6, "hex": 6, "color": 6, "66cc99": 6, "ff3333": 6, "symmetric_differ": 6, "symmetr": 6, "union": 6, "dimens": 6, "bound": 6, "collect": 6, "empti": 6, "null": 6, "minx": 6, "mini": 6, "maxx": 6, "maxi": 6, "geometr": 6, "convex_hul": 6, "convex": 6, "hull": 6, "less": 6, "three": [6, 11], "multipoint": 6, "triangular": 6, "imagin": 6, "elast": 6, "band": 6, "stretch": 6, "coordinatesequ": 6, "envelop": 6, "figur": 6, "geom_typ": 6, "has_z": 6, "is_clos": 6, "close": 6, "applic": 6, "is_empti": 6, "is_r": 6, "is_simpl": 6, "simpl": 6, "mean": 6, "is_valid": 6, "definit": 6, "minimum_clear": 6, "move": 6, "minimum_rotated_rectangl": 6, "orient": 6, "rotat": 6, "rectangl": 6, "enclos": 6, "unlik": 6, "constrain": 6, "degener": 6, "oriented_envelop": 6, "wkb": 6, "wkb_hex": 6, "xy": 6, "shell": 6, "hole": 6, "It": [6, 8], "space": 6, "pair": [6, 11], "tripl": 6, "abov": 6, "classmethod": 6, "from_bound": 6, "xmin": 6, "ymin": 6, "xmax": 6, "ymax": 6, "stroke": 6, "partial": 6, "func": 6, "futur": 6, "call": 6, "column_name_to_part": 6, "create_locationrefer": 6, "geodesic_point_buff": 6, "lat": 6, "lon": 6, "get_shared_streets_intersection_hash": 6, "per": [6, 11], "sharedstreet": 6, "j": 6, "blob": 6, "0e6d7de0aee2e9ae3b007d1e45284b06cc241d02": 6, "src": 6, "l553": 6, "l565": 6, "93": 6, "0965985": 6, "44": 6, "952112199999995": 6, "954734870": 6, "69f13f881649cb21ee3b359730790bb9": 6, "hhmmss_to_datetim": 6, "hhmmss_str": 6, "datetim": 6, "hh": 6, "mm": 6, "ss": 6, "dt": 6, "secs_to_datetim": 6, "sec": 6, "shorten_nam": 6, "geom": 6, "xp": 6, "yp": 6, "zp": 6, "shall": 6, "ident": 6, "def": 6, "id_func": 6, "g2": 6, "g1": 6, "pyproj": 6, "accur": 6, "wgs84": 6, "utm": 6, "32618": 6, "from_cr": 6, "always_xi": 6, "support": 6, "lambda": 6, "unidecod": 6, "error": 6, "replace_str": 6, "transliter": 6, "unicod": 6, "ascii": 6, "\u5317\u4eb0": 6, "bei": 6, "jing": 6, "tri": 6, "codec": 6, "charact": 6, "fall": 6, "back": 6, "five": 6, "faster": 6, "slightli": 6, "slower": 6, "unicode_expect_nonascii": 6, "present": 6, "replac": [6, 11], "strict": 6, "rais": 6, 
"unidecodeerror": 6, "substitut": 6, "might": [6, 11], "packag": [8, 11], "aim": 8, "networkwrangl": [8, 11], "refin": 8, "metropolitan": 8, "council": 8, "citilab": 8, "softwar": [8, 11], "instal": 8, "bleed": 8, "clone": 8, "brief": 8, "intro": 8, "workflow": 8, "quickstart": 8, "jupyt": 8, "notebook": 8, "setup": 8, "scenario": 8, "audit": 8, "report": 8, "logger": 8, "modul": 8, "page": 8, "suggest": 11, "virtualenv": 11, "conda": 11, "virtual": 11, "environ": 11, "recommend": 11, "pip": 11, "lasso": 11, "config": 11, "channel": 11, "forg": 11, "rtree": 11, "my_lasso_environ": 11, "activ": 11, "git": 11, "master": 11, "pypi": 11, "repositori": 11, "date": 11, "branch": 11, "work": 11, "your": 11, "machin": 11, "edit": 11, "plan": 11, "well": 11, "cd": 11, "team": 11, "contribut": 11, "bxack": 11, "pleas": 11, "fork": 11, "befor": 11, "upstream": 11, "tag": 11, "branchnam": 11, "frequent": 11, "instruct": 11, "good": 11, "atom": 11, "sublim": 11, "text": 11, "syntax": 11, "desktop": 11, "built": 11, "mashup": 11, "open": 11, "In": 11, "nest": 11, "span": 11, "implement": 11, "novel": 11, "travel": 11, "break": 11, "publictransport": 11, "done": 11, "gui": 11, "public": 11, "transport": 11, "infrastructur": 11, "servic": 11, "tier": 11, "made": 11, "mainli": 11, "my_link_fil": 11, "my_node_fil": 11, "my_shape_fil": 11, "my_select": 11, "35e": 11, "961117623": 11, "2564047368": 11, "my_chang": 11, "my_net": 11, "ml_net": 11, "_": 11, "disconnected_nod": 11, "my_out_prefix": 11, "my_dir": 11, "my_base_scenario": 11, "road_net": 11, "stpaul_link_fil": 11, "stpaul_node_fil": 11, "stpaul_shape_fil": 11, "transit_net": 11, "stpaul_dir": 11, "card_filenam": 11, "3_multiple_roadway_attribute_chang": 11, "multiple_chang": 11, "4_simple_managed_lan": 11, "project_card_directori": 11, "project_card": 11, "project_cards_list": 11, "my_scenario": 11, "create_scenario": 11, "base_scenario": 11, "check_scenario_requisit": 11, "apply_all_project": 11, "scenario_summari": 11, "base_transit_sourc": 11, "build_transit_sourc": 11, "understand": 11, "how": 11, "overrid": 11, "those": 11, "instanti": 11, "yaml": 11, "configur": 11, "config_fil": 11, "f": 11, "my_config": 11, "safe_load": 11, "model_road_net": 11, "my_paramet": 11, "accomplish": 11, "goal": 11, "top": 11, "learn": 11, "basic": 11, "creation": 11, "ipynb": 11}, "objects": {"": [[7, 0, 0, "-", "lasso"]], "lasso": [[0, 1, 1, "", "CubeTransit"], [1, 1, 1, "", "ModelRoadwayNetwork"], [2, 1, 1, "", "Parameters"], [3, 1, 1, "", "Project"], [4, 1, 1, "", "StandardTransit"], [5, 0, 0, "-", "logger"], [6, 0, 0, "-", "util"]], "lasso.CubeTransit": [[0, 2, 1, "", "__init__"], [0, 2, 1, "", "add_additional_time_periods"], [0, 2, 1, "", "add_cube"], [0, 2, 1, "", "build_route_name"], [0, 2, 1, "", "calculate_start_end_times"], [0, 2, 1, "", "create_add_route_card_dict"], [0, 2, 1, "", "create_delete_route_card_dict"], [0, 2, 1, "", "create_from_cube"], [0, 2, 1, "", "create_update_route_card_dict"], [0, 2, 1, "", "cube_properties_to_standard_properties"], [0, 3, 1, "", "diff_dict"], [0, 2, 1, "", "evaluate_differences"], [0, 2, 1, "", "evaluate_route_property_differences"], [0, 2, 1, "", "evaluate_route_shape_changes"], [0, 2, 1, "", "get_time_period_numbers_from_cube_properties"], [0, 3, 1, "", "line_properties"], [0, 3, 1, "", "lines"], [0, 3, 1, "", "parameters"], [0, 3, 1, "", "program_type"], [0, 3, 1, "", "shapes"], [0, 3, 1, "", "source_list"], [0, 2, 1, "", "unpack_route_name"]], "lasso.ModelRoadwayNetwork": [[1, 3, 1, "", "CALCULATED_VALUES"], [1, 
2, 1, "", "__init__"], [1, 2, 1, "", "add_counts"], [1, 2, 1, "", "add_incident_link_data_to_nodes"], [1, 2, 1, "", "add_new_roadway_feature_change"], [1, 2, 1, "", "add_variable_using_shst_reference"], [1, 2, 1, "", "addition_map"], [1, 2, 1, "", "apply"], [1, 2, 1, "", "apply_managed_lane_feature_change"], [1, 2, 1, "", "apply_python_calculation"], [1, 2, 1, "", "apply_roadway_feature_change"], [1, 2, 1, "", "assess_connectivity"], [1, 2, 1, "", "build_selection_key"], [1, 2, 1, "", "calculate_area_type"], [1, 2, 1, "", "calculate_centroidconnect"], [1, 2, 1, "", "calculate_county"], [1, 2, 1, "", "calculate_distance"], [1, 2, 1, "", "calculate_mpo"], [1, 2, 1, "", "calculate_use"], [1, 2, 1, "", "convert_int"], [1, 2, 1, "", "create_ML_variable"], [1, 2, 1, "", "create_calculated_variables"], [1, 2, 1, "", "create_dummy_connector_links"], [1, 2, 1, "", "create_hov_corridor_variable"], [1, 2, 1, "", "create_managed_lane_network"], [1, 2, 1, "", "create_managed_variable"], [1, 2, 1, "", "dataframe_to_fixed_width"], [1, 2, 1, "", "delete_roadway_feature_change"], [1, 2, 1, "", "deletion_map"], [1, 2, 1, "", "fill_na"], [1, 2, 1, "", "from_RoadwayNetwork"], [1, 2, 1, "", "get_attribute"], [1, 2, 1, "", "get_managed_lane_node_ids"], [1, 2, 1, "", "get_modal_graph"], [1, 2, 1, "", "get_modal_links_nodes"], [1, 2, 1, "", "get_property_by_time_period_and_group"], [1, 2, 1, "", "identify_segment"], [1, 2, 1, "", "identify_segment_endpoints"], [1, 2, 1, "", "is_network_connected"], [1, 2, 1, "", "load_transform_network"], [1, 2, 1, "", "network_connection_plot"], [1, 2, 1, "", "orig_dest_nodes_foreign_key"], [1, 2, 1, "", "ox_graph"], [1, 2, 1, "", "path_search"], [1, 2, 1, "", "read"], [1, 2, 1, "", "read_match_result"], [1, 2, 1, "", "rename_variables_for_dbf"], [1, 2, 1, "", "roadway_net_to_gdf"], [1, 2, 1, "", "roadway_standard_to_met_council_network"], [1, 2, 1, "", "select_roadway_features"], [1, 2, 1, "", "selection_has_unique_link_id"], [1, 2, 1, "", "selection_map"], [1, 2, 1, "", "shortest_path"], [1, 2, 1, "", "split_properties_by_time_period_and_category"], [1, 2, 1, "", "update_distance"], [1, 2, 1, "", "validate_link_schema"], [1, 2, 1, "", "validate_node_schema"], [1, 2, 1, "", "validate_properties"], [1, 2, 1, "", "validate_selection"], [1, 2, 1, "", "validate_shape_schema"], [1, 2, 1, "", "validate_uniqueness"], [1, 2, 1, "", "write"], [1, 2, 1, "", "write_roadway_as_fixedwidth"], [1, 2, 1, "", "write_roadway_as_shp"]], "lasso.Parameters": [[2, 2, 1, "", "__init__"], [2, 3, 1, "", "county_link_range_dict"], [2, 3, 1, "", "cube_time_periods"], [2, 3, 1, "", "properties_to_split"], [2, 3, 1, "", "zones"]], "lasso.Project": [[3, 3, 1, "", "CALCULATED_VALUES"], [3, 3, 1, "id0", "DEFAULT_PROJECT_NAME"], [3, 3, 1, "id1", "STATIC_VALUES"], [3, 2, 1, "", "__init__"], [3, 2, 1, "", "add_highway_changes"], [3, 2, 1, "", "add_transit_changes"], [3, 3, 1, "", "base_cube_transit_network"], [3, 3, 1, "", "base_roadway_network"], [3, 3, 1, "", "build_cube_transit_network"], [3, 3, 1, "", "card_data"], [3, 2, 1, "", "create_project"], [3, 2, 1, "", "determine_roadway_network_changes_compatibility"], [3, 2, 1, "", "emme_id_to_wrangler_id"], [3, 2, 1, "", "emme_name_to_wrangler_name"], [3, 2, 1, "", "evaluate_changes"], [3, 2, 1, "", "get_object_from_network_build_command"], [3, 2, 1, "", "get_operation_from_network_build_command"], [3, 3, 1, "", "parameters"], [3, 3, 1, "", "project_name"], [3, 2, 1, "", "read_logfile"], [3, 2, 1, "", "read_network_build_file"], [3, 3, 1, "", 
"roadway_link_changes"], [3, 3, 1, "", "roadway_node_changes"], [3, 3, 1, "", "transit_changes"], [3, 2, 1, "", "write_project_card"]], "lasso.StandardTransit": [[4, 2, 1, "", "__init__"], [4, 2, 1, "", "calculate_cube_mode"], [4, 2, 1, "", "cube_format"], [4, 2, 1, "", "evaluate_differences"], [4, 3, 1, "", "feed"], [4, 2, 1, "", "fromTransitNetwork"], [4, 3, 1, "", "parameters"], [4, 2, 1, "", "read_gtfs"], [4, 2, 1, "", "route_properties_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_emme"], [4, 2, 1, "", "time_to_cube_time_period"], [4, 2, 1, "", "write_as_cube_lin"]], "lasso.logger": [[5, 4, 1, "", "setupLogging"]], "lasso.util": [[6, 1, 1, "", "Point"], [6, 1, 1, "", "Polygon"], [6, 4, 1, "", "column_name_to_parts"], [6, 4, 1, "", "create_locationreference"], [6, 4, 1, "", "geodesic_point_buffer"], [6, 4, 1, "", "get_shared_streets_intersection_hash"], [6, 4, 1, "", "hhmmss_to_datetime"], [6, 1, 1, "", "partial"], [6, 4, 1, "", "secs_to_datetime"], [6, 4, 1, "", "shorten_name"], [6, 4, 1, "", "transform"], [6, 4, 1, "", "unidecode"]], "lasso.util.Point": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "x"], [6, 5, 1, "", "xy"], [6, 5, 1, "", "y"], [6, 5, 1, "", "z"]], "lasso.util.Polygon": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "id0", "exterior"], [6, 2, 1, "", "from_bounds"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 
1, "", "hausdorff_distance"], [6, 5, 1, "id1", "interiors"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "xy"]], "lasso.util.partial": [[6, 3, 1, "", "args"], [6, 3, 1, "", "func"], [6, 3, 1, "", "keywords"]]}, "objtypes": {"0": "py:module", "1": "py:class", "2": "py:method", "3": "py:attribute", "4": "py:function", "5": "py:property"}, "objnames": {"0": ["py", "module", "Python module"], "1": ["py", "class", "Python class"], "2": ["py", "method", "Python method"], "3": ["py", "attribute", "Python attribute"], "4": ["py", "function", "Python function"], "5": ["py", "property", "Python property"]}, "titleterms": {"lasso": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], "cubetransit": [0, 11], "modelroadwaynetwork": [1, 11], "todo": 1, "paramet": [2, 10, 11], "project": [3, 9, 10, 11], "standardtransit": [4, 11], "logger": 5, "util": [6, 7], "class": 7, "function": 7, "base": 7, "welcom": 8, "": 8, "document": 8, "content": 8, "indic": 8, "tabl": 8, "run": [9, 11], "creat": 9, "file": [9, 10, 11], "scenario": [9, 11], "export": 9, "network": [9, 11], "audit": 9, "report": 9, "setup": 10, "set": 10, "addit": 10, "data": 10, "start": 11, "out": 11, "instal": 11, "bleed": 11, "edg": 11, "from": 11, "clone": 11, "brief": 11, "intro": 11, "compon": 11, "roadwaynetwork": 11, "transitnetwork": 11, "projectcard": 11, "typic": 11, "workflow": 11, "card": 11, "transit": 11, "lin": 11, "cube": 11, "log": 11, "model": 11, "quickstart": 11, "jupyt": 11, "notebook": 11}, "envversion": {"sphinx.domains.c": 3, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 9, "sphinx.domains.index": 1, "sphinx.domains.javascript": 3, "sphinx.domains.math": 2, "sphinx.domains.python": 4, "sphinx.domains.rst": 2, "sphinx.domains.std": 2, "sphinx.ext.intersphinx": 1, "sphinx.ext.todo": 2, "sphinx.ext.viewcode": 1, "sphinx": 58}, "alltitles": {"lasso.CubeTransit": [[0, "lasso-cubetransit"]], "lasso.ModelRoadwayNetwork": [[1, "lasso-modelroadwaynetwork"]], "Todo": [[1, "id1"], [1, "id2"], [1, "id3"], [1, "id4"], [1, "id5"], [1, "id6"]], "lasso.Parameters": [[2, "lasso-parameters"]], "lasso.Project": [[3, "lasso-project"]], "lasso.StandardTransit": [[4, "lasso-standardtransit"]], "lasso.logger": [[5, "module-lasso.logger"]], "lasso.util": [[6, "module-lasso.util"]], "Lasso Classes and Functions": [[7, "module-lasso"]], "Base Classes": [[7, "base-classes"]], "Utils and Functions": [[7, "utils-and-functions"]], "Welcome to lasso\u2019s documentation!": [[8, "welcome-to-lasso-s-documentation"]], "Contents:": [[8, null]], "Indices and tables": [[8, "indices-and-tables"]], "Running Lasso": [[9, 
"running-lasso"]], "Create project files": [[9, "create-project-files"]], "Create a scenario": [[9, "create-a-scenario"]], "Exporting networks": [[9, "exporting-networks"]], "Auditing and Reporting": [[9, "auditing-and-reporting"]], "Setup": [[10, "setup"]], "Projects": [[10, "projects"]], "Parameters": [[10, "parameters"], [11, "parameters"]], "Settings": [[10, "settings"]], "Additional Data Files": [[10, "additional-data-files"]], "Starting Out": [[11, "starting-out"]], "Installation": [[11, "installation"]], "Bleeding Edge": [[11, "bleeding-edge"]], "From Clone": [[11, "from-clone"]], "Brief Intro": [[11, "brief-intro"]], "Components": [[11, "components"]], "RoadwayNetwork": [[11, "roadwaynetwork"]], "TransitNetwork": [[11, "transitnetwork"]], "ProjectCard": [[11, "projectcard"]], "Scenario": [[11, "scenario"]], "Project": [[11, "project"]], "ModelRoadwayNetwork": [[11, "modelroadwaynetwork"]], "StandardTransit": [[11, "standardtransit"]], "CubeTransit": [[11, "cubetransit"]], "Typical Workflow": [[11, "typical-workflow"]], "Project Cards from Transit LIN Files": [[11, "project-cards-from-transit-lin-files"]], "Project Cards from Cube LOG Files": [[11, "project-cards-from-cube-log-files"]], "Model Network Files for a Scenario": [[11, "model-network-files-for-a-scenario"]], "Running Quickstart Jupyter Notebooks": [[11, "running-quickstart-jupyter-notebooks"]]}, "indexentries": {"cubetransit (class in lasso)": [[0, "lasso.CubeTransit"]], "__init__() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.__init__"]], "add_additional_time_periods() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_additional_time_periods"]], "add_cube() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_cube"]], "build_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.build_route_name"]], "calculate_start_end_times() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.calculate_start_end_times"]], "create_add_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_add_route_card_dict"]], "create_delete_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_delete_route_card_dict"]], "create_from_cube() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.create_from_cube"]], "create_update_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_update_route_card_dict"]], "cube_properties_to_standard_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.cube_properties_to_standard_properties"]], "diff_dict (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.diff_dict"]], "evaluate_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_differences"]], "evaluate_route_property_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_route_property_differences"]], "evaluate_route_shape_changes() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.evaluate_route_shape_changes"]], "get_time_period_numbers_from_cube_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.get_time_period_numbers_from_cube_properties"]], "line_properties (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.line_properties"]], "lines (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.lines"]], "parameters (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.parameters"]], "program_type (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.program_type"]], "shapes (lasso.cubetransit attribute)": [[0, 
"lasso.CubeTransit.shapes"]], "source_list (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.source_list"]], "unpack_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.unpack_route_name"]], "calculated_values (lasso.modelroadwaynetwork attribute)": [[1, "lasso.ModelRoadwayNetwork.CALCULATED_VALUES"]], "modelroadwaynetwork (class in lasso)": [[1, "lasso.ModelRoadwayNetwork"]], "__init__() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.__init__"]], "add_counts() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_counts"]], "add_incident_link_data_to_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.add_incident_link_data_to_nodes"]], "add_new_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_new_roadway_feature_change"]], "add_variable_using_shst_reference() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_variable_using_shst_reference"]], "addition_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.addition_map"]], "apply() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply"]], "apply_managed_lane_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_managed_lane_feature_change"]], "apply_python_calculation() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_python_calculation"]], "apply_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_roadway_feature_change"]], "assess_connectivity() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.assess_connectivity"]], "build_selection_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.build_selection_key"]], "calculate_area_type() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_area_type"]], "calculate_centroidconnect() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_centroidconnect"]], "calculate_county() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_county"]], "calculate_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_distance"]], "calculate_mpo() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_mpo"]], "calculate_use() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_use"]], "convert_int() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.convert_int"]], "create_ml_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_ML_variable"]], "create_calculated_variables() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_calculated_variables"]], "create_dummy_connector_links() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_dummy_connector_links"]], "create_hov_corridor_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_hov_corridor_variable"]], "create_managed_lane_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_lane_network"]], "create_managed_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_variable"]], "dataframe_to_fixed_width() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.dataframe_to_fixed_width"]], 
"delete_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.delete_roadway_feature_change"]], "deletion_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.deletion_map"]], "fill_na() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.fill_na"]], "from_roadwaynetwork() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.from_RoadwayNetwork"]], "get_attribute() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_attribute"]], "get_managed_lane_node_ids() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_managed_lane_node_ids"]], "get_modal_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_graph"]], "get_modal_links_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_links_nodes"]], "get_property_by_time_period_and_group() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.get_property_by_time_period_and_group"]], "identify_segment() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment"]], "identify_segment_endpoints() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment_endpoints"]], "is_network_connected() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.is_network_connected"]], "load_transform_network() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.load_transform_network"]], "network_connection_plot() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.network_connection_plot"]], "orig_dest_nodes_foreign_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.orig_dest_nodes_foreign_key"]], "ox_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.ox_graph"]], "path_search() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.path_search"]], "read() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read"]], "read_match_result() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read_match_result"]], "rename_variables_for_dbf() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.rename_variables_for_dbf"]], "roadway_net_to_gdf() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.roadway_net_to_gdf"]], "roadway_standard_to_met_council_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.roadway_standard_to_met_council_network"]], "select_roadway_features() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.select_roadway_features"]], "selection_has_unique_link_id() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_has_unique_link_id"]], "selection_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_map"]], "shortest_path() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.shortest_path"]], "split_properties_by_time_period_and_category() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.split_properties_by_time_period_and_category"]], "update_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.update_distance"]], "validate_link_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_link_schema"]], "validate_node_schema() 
(lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_node_schema"]], "validate_properties() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_properties"]], "validate_selection() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_selection"]], "validate_shape_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_shape_schema"]], "validate_uniqueness() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_uniqueness"]], "write() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write"]], "write_roadway_as_fixedwidth() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_fixedwidth"]], "write_roadway_as_shp() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_shp"]], "parameters (class in lasso)": [[2, "lasso.Parameters"]], "__init__() (lasso.parameters method)": [[2, "lasso.Parameters.__init__"]], "county_link_range_dict (lasso.parameters attribute)": [[2, "lasso.Parameters.county_link_range_dict"]], "cube_time_periods (lasso.parameters attribute)": [[2, "lasso.Parameters.cube_time_periods"]], "properties_to_split (lasso.parameters attribute)": [[2, "lasso.Parameters.properties_to_split"]], "zones (lasso.parameters attribute)": [[2, "lasso.Parameters.zones"]], "calculated_values (lasso.project attribute)": [[3, "lasso.Project.CALCULATED_VALUES"]], "default_project_name (lasso.project attribute)": [[3, "id0"], [3, "lasso.Project.DEFAULT_PROJECT_NAME"]], "project (class in lasso)": [[3, "lasso.Project"]], "static_values (lasso.project attribute)": [[3, "id1"], [3, "lasso.Project.STATIC_VALUES"]], "__init__() (lasso.project method)": [[3, "lasso.Project.__init__"]], "add_highway_changes() (lasso.project method)": [[3, "lasso.Project.add_highway_changes"]], "add_transit_changes() (lasso.project method)": [[3, "lasso.Project.add_transit_changes"]], "base_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.base_cube_transit_network"]], "base_roadway_network (lasso.project attribute)": [[3, "lasso.Project.base_roadway_network"]], "build_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.build_cube_transit_network"]], "card_data (lasso.project attribute)": [[3, "lasso.Project.card_data"]], "create_project() (lasso.project static method)": [[3, "lasso.Project.create_project"]], "determine_roadway_network_changes_compatibility() (lasso.project static method)": [[3, "lasso.Project.determine_roadway_network_changes_compatibility"]], "emme_id_to_wrangler_id() (lasso.project static method)": [[3, "lasso.Project.emme_id_to_wrangler_id"]], "emme_name_to_wrangler_name() (lasso.project static method)": [[3, "lasso.Project.emme_name_to_wrangler_name"]], "evaluate_changes() (lasso.project method)": [[3, "lasso.Project.evaluate_changes"]], "get_object_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_object_from_network_build_command"]], "get_operation_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_operation_from_network_build_command"]], "parameters (lasso.project attribute)": [[3, "lasso.Project.parameters"]], "project_name (lasso.project attribute)": [[3, "lasso.Project.project_name"]], "read_logfile() (lasso.project static method)": [[3, "lasso.Project.read_logfile"]], "read_network_build_file() (lasso.project static method)": [[3, "lasso.Project.read_network_build_file"]], 
"roadway_link_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_link_changes"]], "roadway_node_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_node_changes"]], "transit_changes (lasso.project attribute)": [[3, "lasso.Project.transit_changes"]], "write_project_card() (lasso.project method)": [[3, "lasso.Project.write_project_card"]], "standardtransit (class in lasso)": [[4, "lasso.StandardTransit"]], "__init__() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.__init__"]], "calculate_cube_mode() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.calculate_cube_mode"]], "cube_format() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.cube_format"]], "evaluate_differences() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.evaluate_differences"]], "feed (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.feed"]], "fromtransitnetwork() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.fromTransitNetwork"]], "parameters (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.parameters"]], "read_gtfs() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.read_gtfs"]], "route_properties_gtfs_to_cube() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.route_properties_gtfs_to_cube"]], "shape_gtfs_to_cube() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_cube"]], "shape_gtfs_to_emme() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_emme"]], "time_to_cube_time_period() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.time_to_cube_time_period"]], "write_as_cube_lin() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.write_as_cube_lin"]], "lasso.logger": [[5, "module-lasso.logger"]], "module": [[5, "module-lasso.logger"], [6, "module-lasso.util"], [7, "module-lasso"]], "setuplogging() (in module lasso.logger)": [[5, "lasso.logger.setupLogging"]], "point (class in lasso.util)": [[6, "lasso.util.Point"]], "polygon (class in lasso.util)": [[6, "lasso.util.Polygon"]], "almost_equals() (lasso.util.point method)": [[6, "lasso.util.Point.almost_equals"]], "almost_equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.almost_equals"]], "area (lasso.util.point property)": [[6, "lasso.util.Point.area"]], "area (lasso.util.polygon property)": [[6, "lasso.util.Polygon.area"]], "args (lasso.util.partial attribute)": [[6, "lasso.util.partial.args"]], "boundary (lasso.util.point property)": [[6, "lasso.util.Point.boundary"]], "boundary (lasso.util.polygon property)": [[6, "lasso.util.Polygon.boundary"]], "bounds (lasso.util.point property)": [[6, "lasso.util.Point.bounds"]], "bounds (lasso.util.polygon property)": [[6, "lasso.util.Polygon.bounds"]], "buffer() (lasso.util.point method)": [[6, "lasso.util.Point.buffer"]], "buffer() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.buffer"]], "centroid (lasso.util.point property)": [[6, "lasso.util.Point.centroid"]], "centroid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.centroid"]], "column_name_to_parts() (in module lasso.util)": [[6, "lasso.util.column_name_to_parts"]], "contains() (lasso.util.point method)": [[6, "lasso.util.Point.contains"]], "contains() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains"]], "contains_properly() (lasso.util.point method)": [[6, "lasso.util.Point.contains_properly"]], "contains_properly() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains_properly"]], 
"convex_hull (lasso.util.point property)": [[6, "lasso.util.Point.convex_hull"]], "convex_hull (lasso.util.polygon property)": [[6, "lasso.util.Polygon.convex_hull"]], "coords (lasso.util.point property)": [[6, "lasso.util.Point.coords"]], "coords (lasso.util.polygon property)": [[6, "lasso.util.Polygon.coords"]], "covered_by() (lasso.util.point method)": [[6, "lasso.util.Point.covered_by"]], "covered_by() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covered_by"]], "covers() (lasso.util.point method)": [[6, "lasso.util.Point.covers"]], "covers() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covers"]], "create_locationreference() (in module lasso.util)": [[6, "lasso.util.create_locationreference"]], "crosses() (lasso.util.point method)": [[6, "lasso.util.Point.crosses"]], "crosses() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.crosses"]], "difference() (lasso.util.point method)": [[6, "lasso.util.Point.difference"]], "difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.difference"]], "disjoint() (lasso.util.point method)": [[6, "lasso.util.Point.disjoint"]], "disjoint() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.disjoint"]], "distance() (lasso.util.point method)": [[6, "lasso.util.Point.distance"]], "distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.distance"]], "dwithin() (lasso.util.point method)": [[6, "lasso.util.Point.dwithin"]], "dwithin() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.dwithin"]], "envelope (lasso.util.point property)": [[6, "lasso.util.Point.envelope"]], "envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.envelope"]], "equals() (lasso.util.point method)": [[6, "lasso.util.Point.equals"]], "equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals"]], "equals_exact() (lasso.util.point method)": [[6, "lasso.util.Point.equals_exact"]], "equals_exact() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals_exact"]], "exterior (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.exterior"]], "exterior (lasso.util.polygon property)": [[6, "id0"]], "from_bounds() (lasso.util.polygon class method)": [[6, "lasso.util.Polygon.from_bounds"]], "func (lasso.util.partial attribute)": [[6, "lasso.util.partial.func"]], "geodesic_point_buffer() (in module lasso.util)": [[6, "lasso.util.geodesic_point_buffer"]], "geom_type (lasso.util.point property)": [[6, "lasso.util.Point.geom_type"]], "geom_type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.geom_type"]], "geometrytype() (lasso.util.point method)": [[6, "lasso.util.Point.geometryType"]], "geometrytype() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.geometryType"]], "get_shared_streets_intersection_hash() (in module lasso.util)": [[6, "lasso.util.get_shared_streets_intersection_hash"]], "has_z (lasso.util.point property)": [[6, "lasso.util.Point.has_z"]], "has_z (lasso.util.polygon property)": [[6, "lasso.util.Polygon.has_z"]], "hausdorff_distance() (lasso.util.point method)": [[6, "lasso.util.Point.hausdorff_distance"]], "hausdorff_distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.hausdorff_distance"]], "hhmmss_to_datetime() (in module lasso.util)": [[6, "lasso.util.hhmmss_to_datetime"]], "interiors (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.interiors"]], "interiors (lasso.util.polygon property)": [[6, "id1"]], "interpolate() (lasso.util.point method)": [[6, "lasso.util.Point.interpolate"]], "interpolate() (lasso.util.polygon method)": [[6, 
"lasso.util.Polygon.interpolate"]], "intersection() (lasso.util.point method)": [[6, "lasso.util.Point.intersection"]], "intersection() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersection"]], "intersects() (lasso.util.point method)": [[6, "lasso.util.Point.intersects"]], "intersects() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersects"]], "is_closed (lasso.util.point property)": [[6, "lasso.util.Point.is_closed"]], "is_closed (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_closed"]], "is_empty (lasso.util.point property)": [[6, "lasso.util.Point.is_empty"]], "is_empty (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_empty"]], "is_ring (lasso.util.point property)": [[6, "lasso.util.Point.is_ring"]], "is_ring (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_ring"]], "is_simple (lasso.util.point property)": [[6, "lasso.util.Point.is_simple"]], "is_simple (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_simple"]], "is_valid (lasso.util.point property)": [[6, "lasso.util.Point.is_valid"]], "is_valid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_valid"]], "keywords (lasso.util.partial attribute)": [[6, "lasso.util.partial.keywords"]], "lasso.util": [[6, "module-lasso.util"]], "length (lasso.util.point property)": [[6, "lasso.util.Point.length"]], "length (lasso.util.polygon property)": [[6, "lasso.util.Polygon.length"]], "line_interpolate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_interpolate_point"]], "line_interpolate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_interpolate_point"]], "line_locate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_locate_point"]], "line_locate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_locate_point"]], "minimum_clearance (lasso.util.point property)": [[6, "lasso.util.Point.minimum_clearance"]], "minimum_clearance (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_clearance"]], "minimum_rotated_rectangle (lasso.util.point property)": [[6, "lasso.util.Point.minimum_rotated_rectangle"]], "minimum_rotated_rectangle (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_rotated_rectangle"]], "normalize() (lasso.util.point method)": [[6, "lasso.util.Point.normalize"]], "normalize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.normalize"]], "oriented_envelope (lasso.util.point property)": [[6, "lasso.util.Point.oriented_envelope"]], "oriented_envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.oriented_envelope"]], "overlaps() (lasso.util.point method)": [[6, "lasso.util.Point.overlaps"]], "overlaps() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.overlaps"]], "partial (class in lasso.util)": [[6, "lasso.util.partial"]], "point_on_surface() (lasso.util.point method)": [[6, "lasso.util.Point.point_on_surface"]], "point_on_surface() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.point_on_surface"]], "project() (lasso.util.point method)": [[6, "lasso.util.Point.project"]], "project() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.project"]], "relate() (lasso.util.point method)": [[6, "lasso.util.Point.relate"]], "relate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate"]], "relate_pattern() (lasso.util.point method)": [[6, "lasso.util.Point.relate_pattern"]], "relate_pattern() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate_pattern"]], "representative_point() (lasso.util.point method)": [[6, 
"lasso.util.Point.representative_point"]], "representative_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.representative_point"]], "reverse() (lasso.util.point method)": [[6, "lasso.util.Point.reverse"]], "reverse() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.reverse"]], "secs_to_datetime() (in module lasso.util)": [[6, "lasso.util.secs_to_datetime"]], "segmentize() (lasso.util.point method)": [[6, "lasso.util.Point.segmentize"]], "segmentize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.segmentize"]], "shorten_name() (in module lasso.util)": [[6, "lasso.util.shorten_name"]], "simplify() (lasso.util.point method)": [[6, "lasso.util.Point.simplify"]], "simplify() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.simplify"]], "svg() (lasso.util.point method)": [[6, "lasso.util.Point.svg"]], "svg() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.svg"]], "symmetric_difference() (lasso.util.point method)": [[6, "lasso.util.Point.symmetric_difference"]], "symmetric_difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.symmetric_difference"]], "touches() (lasso.util.point method)": [[6, "lasso.util.Point.touches"]], "touches() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.touches"]], "transform() (in module lasso.util)": [[6, "lasso.util.transform"]], "type (lasso.util.point property)": [[6, "lasso.util.Point.type"]], "type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.type"]], "unidecode() (in module lasso.util)": [[6, "lasso.util.unidecode"]], "union() (lasso.util.point method)": [[6, "lasso.util.Point.union"]], "union() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.union"]], "within() (lasso.util.point method)": [[6, "lasso.util.Point.within"]], "within() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.within"]], "wkb (lasso.util.point property)": [[6, "lasso.util.Point.wkb"]], "wkb (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb"]], "wkb_hex (lasso.util.point property)": [[6, "lasso.util.Point.wkb_hex"]], "wkb_hex (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb_hex"]], "wkt (lasso.util.point property)": [[6, "lasso.util.Point.wkt"]], "wkt (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkt"]], "x (lasso.util.point property)": [[6, "lasso.util.Point.x"]], "xy (lasso.util.point property)": [[6, "lasso.util.Point.xy"]], "xy (lasso.util.polygon property)": [[6, "lasso.util.Polygon.xy"]], "y (lasso.util.point property)": [[6, "lasso.util.Point.y"]], "z (lasso.util.point property)": [[6, "lasso.util.Point.z"]], "lasso": [[7, "module-lasso"]]}}) \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/setup/index.html b/branch/bicounty_2035_hwy_update/setup/index.html new file mode 100644 index 0000000..868c23c --- /dev/null +++ b/branch/bicounty_2035_hwy_update/setup/index.html @@ -0,0 +1,133 @@ + + + + + + + Setup — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Setup

+
+

Projects

+
+
+

Parameters

+
+
+

Settings

+
+
+

Additional Data Files

+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_2035_hwy_update/starting/index.html b/branch/bicounty_2035_hwy_update/starting/index.html new file mode 100644 index 0000000..ef31ab2 --- /dev/null +++ b/branch/bicounty_2035_hwy_update/starting/index.html @@ -0,0 +1,434 @@ + + + + + + + Starting Out — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Starting Out

+
+

Installation

+

If you are managing multiple python versions, we suggest using virtualenv or conda virtual environments.

+

Example using a conda environment (recommended) and the pip package manager to install Lasso from source on GitHub.

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/Lasso@master
+
+
+

Lasso will install network_wrangler from the PyPI repository because it is included in Lasso’s requirements.txt.

+
+

Bleeding Edge

+

If you want to install a more up-to-date or development version of Network Wrangler and Lasso, you can do so by installing them from the develop branch of each GitHub repository:

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/network_wrangler@develop
+pip install git+https://github.com/wsp-sag/Lasso@develop
+
+
+
+
+

From Clone

+

If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e flag installs it in editable mode.

+

If you plan to do development on both Network Wrangler and Lasso locally, consider installing Network Wrangler from a clone as well!

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+git clone https://github.com/wsp-sag/Lasso
+git clone https://github.com/wsp-sag/network_wrangler
+cd network_wrangler
+pip install -e .
+cd ..
+cd Lasso
+pip install -e .
+
+
+

Notes:

+
    +
  1. The -e flag installs the package in editable mode.

  2. If you are not part of the project team and want to contribute code back to the project, please fork before you clone and then add the original repository to your upstream origin list per these directions on GitHub.

  3. If you want to install from a specific tag/version number or branch, replace @master with @<branchname> or @tag.

  4. If you want to make use of frequent developer updates for Network Wrangler as well, you can also install it from a clone by following the same cloning and installation instructions used above for Lasso.
+

If you are going to be doing Lasso development, we also recommend:

+
    +
  • a good IDE such as Atom, VS Code, Sublime Text, etc., with Python syntax highlighting turned on.

  • +
  • GitHub Desktop to locally update your clones

  • +
+
+
+
+

Brief Intro

+

Lasso is a ‘wrapper’ around the Network Wrangler utility.

+

Both Lasso and NetworkWrangler are built around the following data schemas:

+
    +
  • [roadway network], which is based on a mashup of OpenStreetMap and Shared Streets. In Network Wrangler these are read in from three JSON files representing links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields so that any field can be defined for an ad-hoc time-of-day span or user category (a sketch follows this list).

  • +
  • [transit network], which is based on a frequency-based implementation of the CSV-based GTFS; and

  • +
  • [project card], which is novel to Network Wrangler and stores information about network changes as a result of projects in YAML.

  • +
+
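To make the nested-field idea in the roadway network item above concrete, here is a minimal sketch of a single link property; the field names and values are illustrative assumptions only, not taken from the schema documents themselves.

# Illustrative sketch only: field names and values are assumed for illustration.
# A single roadway link whose 'lanes' value varies by time-of-day span and user category.
link = {
    "model_link_id": 123,
    "lanes": {
        "default": 2,                                # value used when no override applies
        "timeofday": [
            {"time": ("6:00", "9:00"), "value": 3},  # ad-hoc AM-peak override
        ],
        "category": [
            {"category": "hov", "value": 1},         # user-category override
        ],
    },
}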

In addition, Lasso utilizes the following data schemas:

+
    +
  • [MetCouncil Model Roadway Network Schema], which adds the data fields that MetCouncil uses in its travel model to the roadway network schema, including breaking out data fields by time period.

  • +
  • [MetCouncil Model Transit Network Schema], which uses the Cube PublicTransport format, and

  • +
  • [Cube Log Files], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards so that they can be used by Network Wrangler.

  • +
  • [Cube public transport line files], which define a set of transit lines in the Cube software.

  • +
+
+

Components

+

Network Wrangler has the following atomic parts:

+
    +
  • RoadwayNetwork object, which represents the roadway network data as GeoDataFrames;

  • +
  • TransitNetwork object, which represents the transit network data as DataFrames;

  • +
  • ProjectCard object, which represents the data of the project card. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run;

  • +
  • Scenario object, which consists of at least a RoadwayNetwork and a TransitNetwork. Scenarios can be based on or tiered from other scenarios. Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network.

  • +
+

In addition, Lasso has the following atomic parts:

+
    +
  • Project object, which creates project cards from one of the following: a base and a build transit network in Cube format, a base and a build highway network, or a base highway network and a Cube log file.

  • +
  • ModelRoadwayNetwork object, a subclass of RoadwayNetwork, which contains methods to define and create MetCouncil-specific variables and to export the network to a format that can be read by Cube.

  • +
  • StandardTransit, an object for holding a standard transit feed as a Partridge object, with methods to manipulate and translate the GTFS data to MetCouncil’s Cube Line files.

  • +
  • CubeTransit, an object for storing information about transit defined in Cube public transport line files. It has the capability to parse cube line file properties and shapes into python dictionaries and to compare line files and represent changes as Project Card dictionaries.

  • +
  • Parameters, a class representing all the parameters defining the networks, including time of day, categories, etc. Parameters can be set at runtime by initializing a Parameters instance with a keyword argument setting the attribute (see the sketch after this list). Parameters that are not explicitly set will use the default parameters listed in this class.

  • +
+
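As a minimal sketch of the keyword-argument initialization described in the Parameters item above: the attribute name below comes from the Parameters documentation, while the value shown is only an example, not a recommended setting.

from lasso import Parameters

# Override a documented Parameters attribute at initialization;
# attributes not passed as keyword arguments keep their defaults.
my_parameters = Parameters(zones=3100)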
+

RoadwayNetwork

+

Reads, writes, queries, and manipulates roadway network data, which is mainly stored in the GeoDataFrames links_df, nodes_df, and shapes_df.

+
net = RoadwayNetwork.read(
+        link_filename=MY_LINK_FILE,
+        node_filename=MY_NODE_FILE,
+        shape_filename=MY_SHAPE_FILE,
+        shape_foreign_key ='shape_id',
+        
+    )
+my_selection = {
+    "link": [{"name": ["I 35E"]}],
+    "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+    "B": {"osm_node_id": "2564047368"},
+}
+net.select_roadway_features(my_selection)
+
+my_change = [
+    {
+        'property': 'lanes',
+        'existing': 1,
+        'set': 2,
+     },
+     {
+        'property': 'drive_access',
+        'set': 0,
+      },
+]
+
+my_net.apply_roadway_feature_change(
+    my_net.select_roadway_features(my_selection),
+    my_change
+)
+
+ml_net = net.create_managed_lane_network(in_place=False)
+
+ml_net.is_network_connected(mode="drive"))
+
+_, disconnected_nodes = ml_net.assess_connectivity(
+  mode="walk",
+  ignore_end_nodes=True
+)
+ml_net.write(filename=my_out_prefix, path=my_dir)
+
+
+
+
+

TransitNetwork

+
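Represents the transit network data as DataFrames. A minimal usage sketch, reusing the TransitNetwork.read call from the Scenario example below; the write call and its arguments are an assumption mirroring RoadwayNetwork.write:

transit_net = TransitNetwork.read(STPAUL_DIR)

# write the transit network back out in the standard format
# (method and keyword names assumed to mirror RoadwayNetwork.write)
transit_net.write(path=my_dir, filename=my_out_prefix)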
+
+

ProjectCard

+
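Represents the data of a single project card. A minimal sketch using the ProjectCard.read call from the Scenario example below:

project_card = ProjectCard.read(
    os.path.join(STPAUL_DIR, "project_cards", "3_multiple_roadway_attribute_change.yml"),
    validate=False,
)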
+
+

Scenario

+

Manages sets of project cards and tiering from a base scenario/set of networks.

+

+my_base_scenario = {
+    "road_net": RoadwayNetwork.read(
+        link_filename=STPAUL_LINK_FILE,
+        node_filename=STPAUL_NODE_FILE,
+        shape_filename=STPAUL_SHAPE_FILE,
+        fast=True,
+        shape_foreign_key ='shape_id',
+    ),
+    "transit_net": TransitNetwork.read(STPAUL_DIR),
+}
+
+card_filenames = [
+    "3_multiple_roadway_attribute_change.yml",
+    "multiple_changes.yml",
+    "4_simple_managed_lane.yml",
+]
+
+project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+
+project_cards_list = [
+    ProjectCard.read(os.path.join(project_card_directory, filename), validate=False)
+    for filename in card_filenames
+]
+
+my_scenario = Scenario.create_scenario(
+  base_scenario=my_base_scenario,
+  project_cards_list=project_cards_list,
+)
+my_scenario.check_scenario_requisites()
+
+my_scenario.apply_all_projects()
+
+my_scenario.scenario_summary()
+
+
+
+
+

Project

+

Creates project cards by comparing two MetCouncil Model Transit Network files, or by reading a cube log file and a base network.

+

+test_project = Project.create_project(
+  base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+  build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+  )
+
+test_project.evaluate_changes()
+
+test_project.write_project_card(
+  os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+  )
+
+
+
+
+

ModelRoadwayNetwork

+

A subclass of network_wrangler’s RoadwayNetwork class with additional understanding of how to translate and write the network out to the MetCouncil Roadway Network schema.

+
net = ModelRoadwayNetwork.read(
+      link_filename=STPAUL_LINK_FILE,
+      node_filename=STPAUL_NODE_FILE,
+      shape_filename=STPAUL_SHAPE_FILE,
+      fast=True,
+      shape_foreign_key ='shape_id',
+  )
+
+net.write_roadway_as_fixedwidth()
+
+
+
+
+

StandardTransit

+

Translates the standard GTFS data to MetCouncil’s Cube Line files.

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+

CubeTransit

+

Used by the project class and has the capability to:

+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+

Parameters

+

Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary.

+
# read parameters from a yaml configuration file
+# could also provide as a key/value pair
+with open(config_file) as f:
+      my_config = yaml.safe_load(f)
+
+# provide parameters at instantiation of ModelRoadwayNetwork
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+            my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+        )
+# network written with direction from the parameters given
+model_road_net.write_roadway_as_shp()
+
+
+
+
+
+

Typical Workflow

+

Workflows in Lasso and Network Wrangler typically accomplish one of two goals:

+
    +
  1. Create Project Cards to document network changes as a result of either transit or roadway projects.

  2. +
  3. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network.

  4. +
+
+

Project Cards from Transit LIN Files

+
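A sketch of this workflow, reusing the Project example above (file names are illustrative):

test_project = Project.create_project(
    base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
    build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
)
test_project.evaluate_changes()
test_project.write_project_card(
    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
)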
+
+

Project Cards from Cube LOG Files

+
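A hedged sketch of this workflow; the keyword names roadway_log_file and base_roadway_dir are assumptions and may differ from the actual Project.create_project signature:

log_project = Project.create_project(
    roadway_log_file=os.path.join(CUBE_DIR, "roadway_changes.log"),  # assumed keyword; illustrative file name
    base_roadway_dir=STPAUL_DIR,                                     # assumed keyword
)
log_project.evaluate_changes()
log_project.write_project_card(
    os.path.join(SCRATCH_DIR, "t_roadway_log_test.yml")
)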
+
+

Model Network Files for a Scenario

+
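A sketch combining the Scenario, Parameters, and ModelRoadwayNetwork examples above to produce model network files for a scenario (variable names come from those examples; the output_dir keyword follows the ModelRoadwayNetwork API documented later in this build):

my_scenario = Scenario.create_scenario(
    base_scenario=my_base_scenario,
    project_cards_list=project_cards_list,
)
my_scenario.apply_all_projects()

model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
    my_scenario.road_net, parameters=my_config.get("my_parameters", {})
)
model_road_net.write_roadway_as_fixedwidth(output_dir=SCRATCH_DIR)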
+
+
+
+

Running Quickstart Jupyter Notebooks

+

To learn basic lasso functionality, please refer to the following jupyter notebooks in the /notebooks directory:

+
    +
  • Lasso Project Card Creation Quickstart.ipynb

  • +
  • Lasso Scenario Creation Quickstart.ipynb

  • +
+

Jupyter notebooks can be started by activating the lasso conda environment and typing jupyter notebook:

+
conda activate <my_lasso_environment>
+jupyter notebook
+
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/.buildinfo b/branch/bicounty_dev/.buildinfo new file mode 100644 index 0000000..b36862c --- /dev/null +++ b/branch/bicounty_dev/.buildinfo @@ -0,0 +1,4 @@ +# Sphinx build info version 1 +# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. +config: a1cf922aaabc1875e26d788834f2f593 +tags: d77d1c0d9ca2f4c8421862c7c5a0d620 diff --git a/branch/bicounty_dev/_generated/lasso.CubeTransit/index.html b/branch/bicounty_dev/_generated/lasso.CubeTransit/index.html new file mode 100644 index 0000000..8f061a9 --- /dev/null +++ b/branch/bicounty_dev/_generated/lasso.CubeTransit/index.html @@ -0,0 +1,571 @@ + + + + + + + lasso.CubeTransit — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.CubeTransit

+
+
+class lasso.CubeTransit(parameters={})[source]
+

Bases: object

+

Class for storing information about transit defined in Cube line +files.

+

Has the capability to:

+
+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
+

Typical usage example:

+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+lines
+

list of strings representing unique line names in +the cube network.

+
+
Type:
+

list

+
+
+
+ +
+
+line_properties
+

dictionary of line properties keyed by line name. Property +values are stored in a dictionary by property name. These +properties are directly read from the cube line files and haven’t +been translated to standard transit values.

+
+
Type:
+

dict

+
+
+
+ +
+
+shapes
+

dictionary of shapes +keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns:

+
+
    +
  • ‘node_id’ (int): positive integer of node id

  • +
  • ‘node’ (int): node number, with negative indicating a non-stop

  • +
  • ‘stop’ (boolean): indicates if it is a stop

  • +
  • ‘order’ (int): order within this shape

  • +
+
+
+
Type:
+

dict

+
+
+
+ +
+
+program_type
+

Either PT or TRNBLD

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

Parameters instance that will be applied to this instance which +includes information about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+source_list
+

List of cube line file sources that have been read and added.

+
+
Type:
+

list

+
+
+
+ +
+
+diff_dict
+
+
Type:
+

dict

+
+
+
+ +
+
+__init__(parameters={})[source]
+

Constructor for CubeTransit

+

parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([parameters])

Constructor for CubeTransit

add_additional_time_periods(...)

Copies a route to another cube time period with appropriate values for time-period-specific properties.

add_cube(transit_source)

Reads a .lin file and adds it to existing TransitNetwork instance.

build_route_name([route_id, time_period, ...])

Create a route name by concatenating route, time period, agency, and direction

calculate_start_end_times(line_properties_dict)

Calculate the start and end times of the property change. WARNING: Doesn’t take care of non-contiguous time periods!!!!

create_add_route_card_dict(line)

Creates a project card change formatted dictionary for adding a route based on the information in self.route_properties for the line.

create_delete_route_card_dict(line, ...)

Creates a project card change formatted dictionary for deleting a line.

create_from_cube(transit_source[, parameters])

Reads a cube .lin file and stores as TransitNetwork object.

create_update_route_card_dict(line, ...)

Creates a project card change formatted dictionary for updating the line.

cube_properties_to_standard_properties(...)

Converts cube style properties to standard properties.

evaluate_differences(base_transit)

    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
+

evaluate_route_property_differences(...[, ...])

Checks if any values have been updated or added for a specific route and creates project card entries for each.

evaluate_route_shape_changes(shape_build, ...)

Compares two route shapes and returns a list of changes suitable for a project card.

get_time_period_numbers_from_cube_properties(...)

Finds properties that are associated with time periods and returns the numbers in them.

unpack_route_name(line_name)

Unpacks route name into direction, route, agency, and time period info

+
+
+add_additional_time_periods(new_time_period_number, orig_line_name)[source]
+

Copies a route to another cube time period with appropriate +values for time-period-specific properties.

+
+
New properties are stored under the new name in:
    +
  • ::self.shapes

  • +
  • ::self.line_properties

  • +
+
+
+
+
Parameters:
+
    +
  • new_time_period_number (int) – cube time period number

  • +
  • orig_line_name (str) – name of the originating line, from which +the new line will copy its properties.

  • +
+
+
Returns:
+

Line name with new time period.

+
+
+
+ +
+
+add_cube(transit_source)[source]
+

Reads a .lin file and adds it to existing TransitNetwork instance.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
+
+ +
+
+static build_route_name(route_id='', time_period='', agency_id=0, direction_id=1)[source]
+

Create a route name by concatenating route, time period, agency, and direction

+
+
Parameters:
+
    +
  • route_id – i.e. 452-111

  • +
  • time_period – i.e. pk

  • +
  • direction_id – i.e. 1

  • +
  • agency_id – i.e. 0

  • +
+
+
Returns:
+

constructed line_name i.e. “0_452-111_452_pk1”

+
+
+
+ +
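A usage sketch based on the parameter and return examples in this docstring:

line_name = CubeTransit.build_route_name(
    route_id="452-111",
    time_period="pk",
    agency_id=0,
    direction_id=1,
)
# per the docstring above, yields a line name like "0_452-111_452_pk1"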
+
+calculate_start_end_times(line_properties_dict)[source]
+

Calculate the start and end times of the property change. WARNING: Doesn’t take care of non-contiguous time periods!!!!

+
+
Parameters:
+

line_properties_dict – dictionary of cube-flavor properties for a transit line

+
+
+
+ +
+
+create_add_route_card_dict(line)[source]
+

Creates a project card change formatted dictionary for adding +a route based on the information in self.route_properties for +the line.

+
+
Parameters:
+

line – name of line that is being updated

+
+
Returns:
+

A project card change-formatted dictionary for the route addition.

+
+
+
+ +
+
+create_delete_route_card_dict(line, base_transit_line_properties_dict)[source]
+

Creates a project card change formatted dictionary for deleting a line.

+
+
Parameters:
+
    +
  • line – name of line that is being deleted

  • +
  • base_transit_line_properties_dict – dictionary of cube-style +attribute values in order to find time periods and +start and end times.

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the route deletion.

+
+
+
+ +
+
+static create_from_cube(transit_source, parameters={})[source]
+

Reads a cube .lin file and stores as TransitNetwork object.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
Returns:
+

A ::CubeTransit object created from the transit_source.

+
+
+
+ +
+
+create_update_route_card_dict(line, updated_properties_dict)[source]
+

Creates a project card change formatted dictionary for updating +the line.

+
+
Parameters:
+
    +
  • line – name of line that is being updated

  • +
  • updated_properties_dict – dictionary of attributes to update as +‘property’: <property name>, +‘set’: <new property value>

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the attribute update.

+
+
+
+ +
+
+static cube_properties_to_standard_properties(cube_properties_dict)[source]
+

Converts cube style properties to standard properties.

+

This is most pertinent to time-period-specific variables like headway, and variables that have standard units like headway, which is minutes in cube and seconds in standard format.

+
+
Parameters:
+

cube_properties_dict – <cube style property name> : <property value>

+
+
Returns:
+

+
<standard

style property name>, “set” : <property value with correct units>`

+
+
+

+
+
Return type:
+

A list of dictionaries with values for `”property”

+
+
+
+ +
+
+evaluate_differences(base_transit)[source]
+
    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
  3. +
    For routes being added or updated, identify if the time periods

    have changed or if there are multiples, and make duplicate lines if so

    +
    +
    +
  4. +
  5. Create project card dictionaries for each change.

  6. +
+
+
Parameters:
+

base_transit (CubeTransit) – an instance of this class for the base condition

+
+
Returns:
+

A list of dictionaries containing project card changes +required to evaluate the differences between the base network +and this transit network instance.

+
+
+
+ +
+
+evaluate_route_property_differences(properties_build, properties_base, time_period_number, absolute=True, validate_base=False)[source]
+

Checks if any values have been updated or added for a specific +route and creates project card entries for each.

+
+
Parameters:
+
    +
  • properties_build – ::<property_name>: <property_value>

  • +
  • properties_base – ::<property_name>: <property_value>

  • +
  • time_period_number – time period to evaluate

  • +
  • absolute – if True, will use set command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway

  • +
  • validate_base – if True, will add the existing line in the project card

  • +
+
+
Returns:
+

+
a list of dictionary values suitable for writing to a project card

{ +‘property’: <property_name>, +‘set’: <set value>, +‘change’: <change from existing value>, +‘existing’: <existing value to check>, +}

+
+
+

+
+
Return type:
+

transit_change_list (list)

+
+
+
+ +
+
+static evaluate_route_shape_changes(shape_build, shape_base)[source]
+

Compares two route shapes and returns a list of changes suitable for a project card.

+
+
Parameters:
+
    +
  • shape_build – DataFrame of the build-version of the route shape.

  • +
  • shape_base – DataFrame of the base-version of the route shape.

  • +
+
+
Returns:
+

List of shape changes formatted as a project card-change dictionary.

+
+
+
+ +
+
+static get_time_period_numbers_from_cube_properties(properties_list)[source]
+

Finds properties that are associated with time periods and returns the numbers in them.

+
+
Parameters:
+

properties_list (list) – list of all properties.

+
+
Returns:
+

list of strings of the time period numbers found

+
+
+
+ +
+
+static unpack_route_name(line_name)[source]
+

Unpacks route name into direction, route, agency, and time period info

+
+
Parameters:
+

line_name (str) – i.e. “0_452-111_452_pk1”

+
+
Returns:
+

452-111 +time_period (str): i.e. pk +direction_id (str) : i.e. 1 +agency_id (str) : i.e. 0

+
+
Return type:
+

route_id (str)

+
+
+
+ +
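A usage sketch based on the docstring above; the ordering of the returned values is an assumption:

route_id, time_period, agency_id, direction_id = CubeTransit.unpack_route_name(
    "0_452-111_452_pk1"
)
# expected per the docstring: route_id "452-111", time_period "pk",
# agency_id "0", direction_id "1"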
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/_generated/lasso.ModelRoadwayNetwork/index.html b/branch/bicounty_dev/_generated/lasso.ModelRoadwayNetwork/index.html new file mode 100644 index 0000000..16c638c --- /dev/null +++ b/branch/bicounty_dev/_generated/lasso.ModelRoadwayNetwork/index.html @@ -0,0 +1,1573 @@ + + + + + + + lasso.ModelRoadwayNetwork — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.ModelRoadwayNetwork

+
+
+class lasso.ModelRoadwayNetwork(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Bases: RoadwayNetwork

+

Subclass of network_wrangler class RoadwayNetwork

+

A representation of the physical roadway network and its properties.

+
+
+__init__(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Constructor

+
+
Parameters:
+
    +
  • nodes – geodataframe of nodes

  • +
  • links – dataframe of links

  • +
  • shapes – geodataframe of shapes

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. +If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(nodes, links, shapes[, parameters])

Constructor

add_counts([network_variable, ...])

Adds count variable.

add_incident_link_data_to_nodes([links_df, ...])

Add data from links going to/from nodes to node.

add_new_roadway_feature_change(links, nodes)

add the new roadway features defined in the project card.

add_variable_using_shst_reference([...])

Join network links with source data, via SHST API node match result.

addition_map(links, nodes)

Shows which links and nodes are added to the roadway network

apply(project_card_dictionary)

Wrapper method to apply a project to a roadway network.

apply_managed_lane_feature_change(link_idx, ...)

Apply the managed lane feature changes to the roadway network

apply_python_calculation(pycode[, in_place])

Changes roadway network object by executing pycode.

apply_roadway_feature_change(link_idx, ...)

Changes the roadway attributes for the selected features based on the project card information passed

assess_connectivity([mode, ...])

Returns a network graph and list of disconnected subgraphs as described by a list of their member nodes.

build_selection_key(selection_dict)

Selections are stored by a key combining the query and the A and B ids.

calculate_area_type([area_type_shape, ...])

#MC Calculates area type variable.

calculate_centroidconnect(parameters[, ...])

Calculates centroid connector variable.

calculate_county([county_shape, ...])

#MC Calculates county variable.

calculate_distance([network_variable, ...])

calculate link distance in miles

calculate_mpo([county_network_variable, ...])

Calculates mpo variable.

calculate_use([network_variable, ...])

Calculates use variable.

convert_int([int_col_names])

Convert integer columns

create_ML_variable([network_variable, overwrite])

Created ML lanes placeholder for project to write out ML changes

create_calculated_variables()

Creates calculated roadway variables.

create_dummy_connector_links(ml_df[, ...])

create dummy connector links between the general purpose and managed lanes

create_hov_corridor_variable([...])

Created hov corridor placeholder for project to write out corridor changes

create_managed_lane_network([...])

Create a roadway network with managed lanes links separated out.

create_managed_variable([network_variable, ...])

Created placeholder for project to write out managed

dataframe_to_fixed_width(df)

Convert dataframe to fixed width format, geometry column will not be transformed.

delete_roadway_feature_change(links, nodes)

delete the roadway features defined in the project card.

deletion_map(links, nodes)

Shows which links and nodes are deleted from the roadway network

fill_na()

Fill na values from create_managed_lane_network()

from_RoadwayNetwork(roadway_network_object)

RoadwayNetwork to ModelRoadwayNetwork

get_attribute(links_df, join_key, ...)

Gets attribute from source data using SHST match result.

get_managed_lane_node_ids(nodes_list[, scalar])

Transform a list of node IDS by a scalar.

get_modal_graph(links_df, nodes_df[, mode, ...])

Determines if the network graph is "strongly" connected. A graph is strongly connected if each vertex is reachable from every other vertex.

get_modal_links_nodes(links_df, nodes_df[, ...])

Returns nodes and link dataframes for specific mode.

get_property_by_time_period_and_group(prop)

Return a series for the properties with a specific group or time period.

identify_segment(O_id, D_id[, ...])

+
param endpoints:
+

list of length of two unique keys of nodes making up endpoints of segment

+
+
+

identify_segment_endpoints([mode, links_df, ...])

+
param mode:
+

list of modes of the network, one of drive,`transit`,

+
+
+

is_network_connected([mode, links_df, nodes_df])

Determines if the network graph is "strongly" connected. A graph is strongly connected if each vertex is reachable from every other vertex.

load_transform_network(node_filename, ...[, ...])

Reads roadway network files from disk and transforms them into GeoDataFrames.

network_connection_plot(G, ...)

Plot a graph to check for network connection.

orig_dest_nodes_foreign_key(selection[, ...])

Returns the foreign key id (whatever is used in the u and v variables in the links file) for the AB nodes as a tuple.

ox_graph(nodes_df, links_df[, ...])

create an osmnx-flavored network graph

path_search(candidate_links_df, O_id, D_id)

+
param candidate_links:
+

selection of links geodataframe with links likely to be part of path

+
+
+

read(link_filename, node_filename, ...[, ...])

Reads in links and nodes network standard.

read_match_result(path)

Reads the shst geojson match returns.

rename_variables_for_dbf(input_df[, ...])

Rename attributes for DBF/SHP, make sure length within 10 chars.

roadway_net_to_gdf(roadway_net)

+
rtype:
+

GeoDataFrame

+
+
+

roadway_standard_to_met_council_network([...])

Rename and format roadway attributes to be consistent with what metcouncil's model is expecting.

select_roadway_features(selection[, ...])

Selects roadway features that satisfy selection criteria

selection_has_unique_link_id(selection_dict)

+
rtype:
+

bool

+
+
+

selection_map(selected_link_idx[, A, B, ...])

Shows which links are selected for roadway property change or parallel managed lanes category of roadway projects.

shortest_path(graph_links_df, O_id, D_id[, ...])

+
rtype:
+

tuple

+
+
+

split_properties_by_time_period_and_category([...])

Splits properties by time period, assuming a variable structure of

update_distance([links_df, use_shapes, ...])

Calculate link distance in specified units to network variable using either straight line distance or (if specified) shape distance if available.

validate_link_schema(link_filename[, ...])

Validate roadway network data link schema and output a boolean

validate_node_schema(node_file[, ...])

Validate roadway network data node schema and output a boolean

validate_properties(properties[, ...])

If there are change or existing commands, make sure that that property exists in the network.

validate_selection(selection[, ...])

Evaluate whether the selection dictionary contains the minimum required values.

validate_shape_schema(shape_file[, ...])

Validate roadway network data shape schema and output a boolean

validate_uniqueness()

Confirms that the unique identifiers are met.

write([path, filename])

Writes a network in the roadway network standard

write_roadway_as_fixedwidth(output_dir[, ...])

Writes out fixed width file.

write_roadway_as_shp(output_dir[, ...])

Write out dbf/shp/gpkg for cube.

+

Attributes

+ + + + + + +

CALCULATED_VALUES

+
+
+add_counts(network_variable='AADT', mndot_count_shst_data=None, widot_count_shst_data=None, mndot_count_variable_shp=None, widot_count_variable_shp=None)[source]
+

Adds count variable. +#MC +join the network with count node data, via SHST API node match result

+
+
Parameters:
+
    +
  • network_variable (str) – Name of the variable that should be written to. Default to “AADT”.

  • +
  • mndot_count_shst_data (str) – File path to MNDOT count location SHST API node match result.

  • +
  • widot_count_shst_data (str) – File path to WIDOT count location SHST API node match result.

  • +
  • mndot_count_variable_shp (str) – File path to MNDOT count location geodatabase.

  • +
  • widot_count_variable_shp (str) – File path to WIDOT count location geodatabase.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+ +

Add data from links going to/from nodes to node.

+
+
Return type:
+

DataFrame

+
+
Parameters:
+
    +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
  • link_variables – list of columns in links dataframe to add to incident nodes

  • +
+
+
Returns:
+

nodes DataFrame with link data where length is N*number of links going in/out

+
+
+
+ +
+
+add_new_roadway_feature_change(links, nodes)
+

add the new roadway features defined in the project card. +new shapes are also added for the new roadway links.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – list of dictionaries

  • +
  • nodes – list of dictionaries

  • +
+
+
+

returns: None

+
+ +
+
+add_variable_using_shst_reference(var_shst_csvdata=None, shst_csv_variable=None, network_variable=None, network_var_type=<class 'int'>, overwrite=False)[source]
+

Join network links with source data, via SHST API node match result.

+
+
Parameters:
+
    +
  • var_shst_csvdata (str) – File path to SHST API return.

  • +
  • shst_csv_variable (str) – Variable name in the source data.

  • +
  • network_variable (str) – Name of the variable that should be written to.

  • +
  • network_var_type – Variable type in the written network.

  • +
  • overwrite (bool) – True if overwriting existing variable. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
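A usage sketch; the file paths and variable names are illustrative, not part of the API:

net.add_variable_using_shst_reference(
    var_shst_csvdata="shst_match_result.csv",  # illustrative path to SHST API return
    shst_csv_variable="speed_limit",           # illustrative source column
    network_variable="posted_speed",           # illustrative target variable
    network_var_type=int,
    overwrite=False,
)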
+
+addition_map(links, nodes)
+

Shows which links and nodes are added to the roadway network

+
+ +
+
+apply(project_card_dictionary)
+

Wrapper method to apply a project to a roadway network.

+
+
Parameters:
+

project_card_dictionary – dict +a dictionary of the project card object

+
+
+
+ +
+
+apply_managed_lane_feature_change(link_idx, properties, in_place=True)
+

Apply the managed lane feature changes to the roadway network

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean to indicate whether to update self or return +a new roadway network object

  • +
+
+
+
+ +
+
+apply_python_calculation(pycode, in_place=True)
+

Changes roadway network object by executing pycode.

+
+
Parameters:
+
    +
  • pycode – python code which changes values in the roadway network object

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+
+ +
+
+apply_roadway_feature_change(link_idx, properties, in_place=True)
+

Changes the roadway attributes for the selected features based on the +project card information passed

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean +update self or return a new roadway network object

  • +
+
+
+
+ +
+
+assess_connectivity(mode='', ignore_end_nodes=True, links_df=None, nodes_df=None)
+

Returns a network graph and list of disconnected subgraphs +as described by a list of their member nodes.

+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • ignore_end_nodes – if True, ignores stray singleton nodes

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+
Returns: Tuple of

Network Graph (osmnx flavored networkX DiGraph) +List of disconnected subgraphs described by the list of their

+
+

member nodes (as described by their model_node_id)

+
+
+
+
+ +
+
+build_selection_key(selection_dict)
+

Selections are stored by a key combining the query and the A and B ids. +This method combines the two for you based on the selection dictionary.

+
+
Return type:
+

tuple

+
+
Parameters:
+

selection_dict – Selection Dictionary

+
+
+

Returns: Tuple serving as the selection key.

+
+ +
+
+calculate_area_type(area_type_shape=None, area_type_shape_variable=None, network_variable='area_type', area_type_codes_dict=None, downtown_area_type_shape=None, downtown_area_type=None, overwrite=False)[source]
+

#MC +Calculates area type variable.

+

This uses the centroid of the geometry field to determine which area it should be labeled. This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • area_type_shape (str) – The File path to area geodatabase.

  • +
  • area_type_shape_variable (str) – The variable name of area type in the area geodatabase.

  • +
  • network_variable (str) – The variable name of area type in network standard. Default to “area_type”.

  • +
  • area_type_codes_dict – The dictionary to map input area_type_shape_variable to network_variable

  • +
  • downtown_area_type_shape – The file path to the downtown area type boundary.

  • +
  • downtown_area_type (int) – Integer value of downtown area type

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_centroidconnect(parameters, network_variable='centroidconnect', highest_taz_number=None, as_integer=True, overwrite=False)[source]
+

Calculates centroid connector variable.

+
+
Parameters:
+
    +
  • parameters (Parameters) – A Lasso Parameters, which stores input files.

  • +
  • network_variable (str) – Variable that should be written to in the network. Default to “centroidconnect”

  • +
  • highest_taz_number (int) – the max TAZ number in the network.

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

RoadwayNetwork

+
+
+
+ +
+
+calculate_county(county_shape=None, county_shape_variable=None, network_variable='county', county_codes_dict=None, overwrite=False)[source]
+

#MC +Calculates county variable.

+

This uses the centroid of the geometry field to determine which county it should be labeled. This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • county_shape (str) – The File path to county geodatabase.

  • +
  • county_shape_variable (str) – The variable name of county in the county geodatabase.

  • +
  • network_variable (str) – The variable name of county in network standard. Default to “county”.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_distance(network_variable='distance', centroidconnect_only=False, overwrite=False)[source]
+

calculate link distance in miles

+
+
Parameters:
+
    +
  • centroidconnect_only (Bool) – True if calculating distance for centroidconnectors only. Default to False.

  • +
  • overwrite (Bool) – True if overwriting existing variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_mpo(county_network_variable='county', network_variable='mpo', as_integer=True, mpo_counties=None, overwrite=False)[source]
+

Calculates mpo variable. #MC
Parameters: county_variable (str) – Name of the variable where the county names are stored. Default to “county”. network_variable (str) – Name of the variable that should be written to. Default to “mpo”. as_integer (bool) – If true, will convert true/false to 1/0s. mpo_counties (list) – List of county names that are within the mpo region. overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_use(network_variable='use', as_integer=True, overwrite=False)[source]
+

Calculates use variable.

+
+
Parameters:
+
    +
  • network_variable (str) – Variable that should be written to in the network. Default to “use”

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+convert_int(int_col_names=[])[source]
+

Convert integer columns

+
+ +
+
+create_ML_variable(network_variable='ML_lanes', overwrite=False)[source]
+

Created ML lanes placeholder for project to write out ML changes

+

ML lanes default to 0, ML info comes from cube LOG file and store in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_calculated_variables()[source]
+

Creates calculated roadway variables.

+
+
Parameters:
+

None

+
+
+
+ +
+ +

create dummy connector links between the general purpose and managed lanes

+
+
Parameters:
+
    +
  • gp_df – GeoDataFrame +dataframe of general purpose links (where managed lane also exists)

  • +
  • ml_df – GeoDataFrame +dataframe of corresponding managed lane links,

  • +
  • access_lanes – int +number of lanes in access dummy link

  • +
  • egress_lanes – int +number of lanes in egress dummy link

  • +
  • access_roadway – str roadway type for access dummy link

  • +
  • egress_roadway – str +roadway type for egress dummy link

  • +
  • access_name_prefix – str +prefix for access dummy link name

  • +
  • egress_name_prefix – str +prefix for egress dummy link name

  • +
+
+
+
+ +
+
+create_hov_corridor_variable(network_variable='segment_id', overwrite=False)[source]
+

Created hov corridor placeholder for project to write out corridor changes

+

hov corridor id default to 0, its info comes from cube LOG file and store in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_managed_lane_network(keep_same_attributes_ml_and_gp=None, keep_additional_attributes_ml_and_gp=[], managed_lanes_required_attributes=[], managed_lanes_node_id_scalar=None, managed_lanes_link_id_scalar=None, in_place=False)
+

Create a roadway network with managed lanes links separated out. +Add new parallel managed lane links, access/egress links, +and add shapes corresponding to the new links

+
+
Return type:
+

RoadwayNetwork

+
+
Parameters:
+
    +
  • keep_same_attributes_ml_and_gp – list of attributes to copy from general purpose +lane to managed lane. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to KEEP_SAME_ATTRIBUTES_ML_AND_GP.

  • +
  • keep_additional_attributes_ml_and_gp – list of additional attributes to add. This is useful +if you want to leave the default attributes and then ALSO some others.

  • +
  • managed_lanes_required_attributes – list of attributes that are required to be specified +in new managed lanes. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_REQUIRED_ATTRIBUTES.

  • +
  • managed_lanes_node_id_scalar – integer value added to original node IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_NODE_ID_SCALAR.

  • +
  • managed_lanes_link_id_scalar – integer value added to original link IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_LINK_ID_SCALAR.

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+

returns: A RoadwayNetwork instance

+
+ +
+
+create_managed_variable(network_variable='managed', overwrite=False)[source]
+

Created placeholder for project to write out managed

+

managed default to 0, its info comes from cube LOG file and store in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+static dataframe_to_fixed_width(df)[source]
+

Convert dataframe to fixed width format, geometry column will not be transformed.

+
+
Parameters:
+

df (pandas DataFrame) –

+
+
Returns:
+

dataframe with fixed width for each column. +dict: dictionary with columns names as keys, column width as values.

+
+
Return type:
+

pandas dataframe

+
+
+
+ +
+
+delete_roadway_feature_change(links, nodes, ignore_missing=True)
+

delete the roadway features defined in the project card. +valid links and nodes defined in the project gets deleted +and shapes corresponding to the deleted links are also deleted.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – dict +list of dictionaries

  • +
  • nodes – dict +list of dictionaries

  • +
  • ignore_missing – bool +If True, will only warn about links/nodes that are missing from +network but specified to “delete” in project card +If False, will fail.

  • +
+
+
+
+ +
+
+deletion_map(links, nodes)
+

Shows which links and nodes are deleted from the roadway network

+
+ +
+
+fill_na()[source]
+

Fill na values from create_managed_lane_network()

+
+ +
+
+static from_RoadwayNetwork(roadway_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters:
+
    +
  • roadway_network_object (RoadwayNetwork) –

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
+ +
+
+static get_attribute(links_df, join_key, source_shst_ref_df, source_gdf, field_name)[source]
+

Gets attribute from source data using SHST match result.

+
+
Parameters:
+
    +
  • links_df (dataframe) – The network dataframe that new attribute should be written to.

  • +
  • join_key (str) – SHST ID variable name used to join source data with network dataframe.

  • +
  • source_shst_ref_df (str) – File path to source data SHST match result.

  • +
  • source_gdf (str) – File path to source data.

  • +
  • field_name (str) – Name of the attribute to get from source data.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+static get_managed_lane_node_ids(nodes_list, scalar=4500000)
+

Transform a list of node IDs by a scalar. TODO #237: what if node ids are not a number?

+
+
Parameters:
+
    +
  • nodes_list – list of integers

  • +
  • scalar – value to add to node IDs

  • +
+
+
+

Returns: list of integers

+
+ +
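For example, with the default scalar of 4500000 shown in the signature:

ml_node_ids = ModelRoadwayNetwork.get_managed_lane_node_ids([1, 2, 3])
# -> [4500001, 4500002, 4500003]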
+
+static get_modal_graph(links_df, nodes_df, mode=None, modes_to_network_link_variables={'bike': ['bike_access'], 'bus': ['bus_only', 'drive_access'], 'drive': ['drive_access'], 'rail': ['rail_only'], 'transit': ['bus_only', 'rail_only', 'drive_access'], 'walk': ['walk_access']})
+

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: networkx: osmnx: DiGraph of network

+
+ +
+ +

Returns nodes and link dataframes for specific mode.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • modes – list of the modes of the network to be kept, must be in drive,`transit`,`rail`,`bus`, +walk, bike. For example, if bike and walk are selected, both bike and walk links will be kept.

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: tuple of DataFrames for links, nodes filtered by mode

+

links with walk access are not marked as having walk access +Issue discussed in https://github.com/wsp-sag/network_wrangler/issues/145 +modal_nodes_df = nodes_df[nodes_df[mode_node_variable] == 1]

+
+ +
+
+get_property_by_time_period_and_group(prop, time_period=None, category=None, default_return=None)
+

Return a series for the properties with a specific group or time period.

+
+
Parameters:
+
    +
  • prop (str) – the variable that you want from network

  • +
  • time_period (list(str)) – the time period that you are querying for +i.e. [‘16:00’, ‘19:00’]

  • +
  • category (str or list(str)(Optional)) –

    the group category +i.e. “sov”

    +

    or

    +

    list of group categories in order of search, i.e. +[“hov3”,”hov2”]

    +

  • +
  • default_return (what to return if variable or time period not found. Default is None.) –

  • +
+
+
Return type:
+

pandas series

+
+
+
+ +
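A usage sketch based on the parameter examples above (the PM time period and HOV categories are illustrative):

pm_hov_lanes = net.get_property_by_time_period_and_group(
    "lanes",
    time_period=["16:00", "19:00"],
    category=["hov3", "hov2"],
    default_return=0,
)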
+
+identify_segment(O_id, D_id, selection_dict={}, mode=None, nodes_df=None, links_df=None)
+
+
Parameters:
+
    +
  • endpoints – list of length of two unique keys of nodes making up endpoints of segment

  • +
  • selection_dict – dictionary of link variables to select candidate links from, otherwise will create a graph of ALL links which will be both a RAM hog and could result in odd shortest paths.

  • +
  • segment_variables – list of variables to keep

  • +
+
+
+
+ +
+
+identify_segment_endpoints(mode='', links_df=None, nodes_df=None, min_connecting_links=10, min_distance=None, max_link_deviation=2)
+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+ +
+
+is_network_connected(mode=None, links_df=None, nodes_df=None)
+

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
+
+
+

Returns: boolean

+
+ +
+
+static load_transform_network(node_filename, link_filename, shape_filename, crs=4326, node_foreign_key='model_node_id', validate_schema=True, **kwargs)
+

Reads roadway network files from disk and transforms them into GeoDataFrames.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • node_filename – file name for nodes.

  • +
  • link_filename – file name for links.

  • +
  • shape_filename – file name for shapes.

  • +
  • crs – coordinate reference system. Defaults to value in CRS.

  • +
  • node_foreign_key – variable linking the node table to the link table. Defaults +to NODE_FOREIGN_KEY.

  • +
  • validate_schema – boolean indicating if network should be validated to schema.

  • +
+
+
+

returns: tuple of GeodataFrames nodes_df, links_df, shapes_df

+
+ +
+
+static network_connection_plot(G, disconnected_subgraph_nodes)
+

Plot a graph to check for network connection.

+
+
Parameters:
+
    +
  • G – OSMNX flavored networkX graph.

  • +
  • disconnected_subgraph_nodes – List of disconnected subgraphs described by the list of their +member nodes (as described by their model_node_id).

  • +
+
+
+

returns: fig, ax : tuple

+
+ +
+
+orig_dest_nodes_foreign_key(selection, node_foreign_key='')
+

Returns the foreign key id (whatever is used in the u and v +variables in the links file) for the AB nodes as a tuple.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • selection – selection dictionary with A and B keys

  • +
  • node_foreign_key – variable name for whatever is used by the u and v variables in the links_df file. If nothing is specified, assumes whatever the default is.

  • +
+
+
+

Returns: tuple of (A_id, B_id)

+
+ +
+
+static ox_graph(nodes_df, links_df, node_foreign_key='model_node_id', link_foreign_key=['A', 'B'], unique_link_key='model_link_id')
+

create an osmnx-flavored network graph

+

osmnx doesn’t like values that are arrays, so remove the variables +that have arrays. osmnx also requires that certain variables +be filled in, so do that too.

+
+
Parameters:
+
    +
  • nodes_df – GeoDataFrame of nodes

  • +
  • link_df – GeoDataFrame of links

  • +
  • node_foreign_key – field referenced in link_foreign_key

  • +
  • link_foreign_key – list of attributes that define the link start and end nodes to the node foreign key

  • +
  • unique_link_key – primary key for links

  • +
+
+
+

Returns: a networkx multidigraph

+
+ +
+ +
+
Parameters:
+
    +
  • candidate_links – selection of links geodataframe with links likely to be part of path

  • +
  • O_id – origin node foreign key ID

  • +
  • D_id – destination node foreign key ID

  • +
  • weight_column – column to use for weight of shortest path. Defaults to “i” (iteration)

  • +
  • weight_factor – optional weight to multiply the weight column by when finding the shortest path

  • +
  • search_breadth

  • +
+
+
+

Returns

+
+ +
+
+static read(link_filename, node_filename, shape_filename, fast=False, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Reads in links and nodes network standard.

+
+
Parameters:
+
    +
  • link_filename (str) – File path to link json.

  • +
  • node_filename (str) – File path to node geojson.

  • +
  • shape_filename (str) – File path to link true shape geojson

  • +
  • fast (bool) – boolean that will skip validation to speed up read time.

  • +
  • recalculate_calculated_variables (bool) – calculates fields from spatial joins, etc.

  • +
  • recalculate_distance (bool) – re-calculates distance.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
+ +
+
+static read_match_result(path)[source]
+

Reads the shst geojson match returns.

+

Returns shst dataframe.

+

Reads many files of the same type and concatenates them into a single DataFrame.

+
+
Parameters:
+

path (str) – File path to SHST match results.

+
+
Returns:
+

geopandas geodataframe

+
+
Return type:
+

geodataframe

+
+
+

##todo +not sure why we need, but should be in utilities not this class

+
+ +
+
+rename_variables_for_dbf(input_df, variable_crosswalk=None, output_variables=None, convert_geometry_to_xy=False)[source]
+

Rename attributes for DBF/SHP, make sure length within 10 chars.

+
+
Parameters:
+
    +
  • input_df (dataframe) – Network standard DataFrame.

  • +
  • variable_crosswalk (str) – File path to variable name crosswalk from network standard to DBF names.

  • +
  • output_variables (list) – List of strings for DBF variables.

  • +
  • convert_geometry_to_xy (bool) – True if converting node geometry to X/Y

  • +
+
+
Returns:
+

dataframe

+
+
+
+ +
+
+static roadway_net_to_gdf(roadway_net)
+
+
Return type:
+

GeoDataFrame

+
+
+

Turn the roadway network into a GeoDataFrame. Parameters: roadway_net – the roadway network to export

+

returns: shapes dataframe

+
+ +
+
+roadway_standard_to_met_council_network(output_epsg=None)[source]
+

Rename and format roadway attributes to be consistent with what metcouncil’s model is expecting. #MC
Parameters: output_epsg (int) – EPSG number of output network.

+
+
Returns:
+

None

+
+
+
+ +
+
+select_roadway_features(selection, search_mode='drive', force_search=False, sp_weight_factor=None)
+

Selects roadway features that satisfy selection criteria

+
+
Return type:
+

GeoDataFrame

+
+
+
+
Example usage:
+
net.select_roadway_features(
+
selection = [ {

# a match condition for the from node using osm, +# shared streets, or model node number +‘from’: {‘osm_model_link_id’: ‘1234’}, +# a match for the to-node.. +‘to’: {‘shstid’: ‘4321’}, +# a regex or match for facility condition +# could be # of lanes, facility type, etc. +‘facility’: {‘name’:’Main St’}, +}, … ])

+
+
+
+
+
+
+
+
Parameters:
+
    +
  • selection – dictionary with keys for: +A - from node +B - to node +link - which includes at least a variable for name

  • +
  • search_mode – mode which you are searching for; defaults to “drive”

  • +
  • force_search – boolean directing method to perform search even if one +with same selection dict is stored from a previous search.

  • +
  • sp_weight_factor – multiple used to discourage shortest paths which meander from the original search returned from name or ref query. If not set here, will default to the value of sp_weight_factor in the RoadwayNetwork instance. If not set there, will default to SP_WEIGHT_FACTOR.

  • +
+
+
+

Returns: a list of link IDs in selection

+
+ +
+ +
+
Return type:
+

bool

+
+
Parameters:
+

selection_dictionary – Dictionary representation of selection +of roadway features, containing a “link” key.

+
+
+
+
Returns: A boolean indicating if the selection dictionary contains

a unique identifier for links.

+
+
+
+ +
+
+selection_map(selected_link_idx, A=None, B=None, candidate_link_idx=[])
+

Shows which links are selected for roadway property change or parallel +managed lanes category of roadway projects.

+
+
Parameters:
+
    +
  • selected_links_idx – list of selected link indices

  • +
  • candidate_links_idx – optional list of candidate link indices to also include in map

  • +
  • A – optional foreign key of starting node of a route selection

  • +
  • B – optional foreign key of ending node of a route selection

  • +
+
+
+
+ +
+
+shortest_path(graph_links_df, O_id, D_id, nodes_df=None, weight_column='i', weight_factor=100)
+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • graph_links_df

  • +
  • O_id – foreign key for start node

  • +
  • D_id – foreign key for end node

  • +
  • nodes_df – optional nodes df, otherwise will use network instance

  • +
  • weight_column – column to use as a weight, defaults to “i”

  • +
  • weight_factor – any additional weighting to multiply the weight column by, defaults to SP_WEIGHT_FACTOR

  • +
+
+
+

Returns: tuple with length of four +- Boolean if shortest path found +- nx Directed graph of graph links +- route of shortest path nodes as List +- links in shortest path selected from links_df

+
+ +
+
+split_properties_by_time_period_and_category(properties_to_split=None)[source]
+

Splits properties by time period, assuming a variable structure of

+
+
Parameters:
+

properties_to_split

dict +dictionary of output variable prefix mapped to the source variable and what to stratify it by +e.g. +{

+
+

’lanes’ : {‘v’:’lanes’, ‘times_periods’:{“AM”: (“6:00”, “10:00”),”PM”: (“15:00”, “19:00”)}}, +‘ML_lanes’ : {‘v’:’ML_lanes’, ‘times_periods’:{“AM”: (“6:00”, “10:00”),”PM”: (“15:00”, “19:00”)}}, +‘use’ : {‘v’:’use’, ‘times_periods’:{“AM”: (“6:00”, “10:00”),”PM”: (“15:00”, “19:00”)}},

+
+

}

+

+
+
+
+ +
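A usage sketch mirroring the dictionary structure shown in the docstring (key names, including times_periods, follow the docstring as written):

net.split_properties_by_time_period_and_category(
    properties_to_split={
        "lanes": {
            "v": "lanes",
            "times_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
        },
    }
)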
+
+update_distance(links_df=None, use_shapes=False, units='miles', network_variable='distance', overwrite=True, inplace=True)
+

Calculate link distance in specified units to network variable using either straight line +distance or (if specified) shape distance if available.

+
+
Parameters:
+
    +
  • links_df – Links GeoDataFrame. Useful if want to update a portion of network links +(i.e. only centroid connectors). If not provided, will use entire self.links_df.

  • +
  • use_shapes – if True, will add length information from self.shapes_df rather than crow-fly. +If no corresponding shape found in self.shapes_df, will default to crow-fly.

  • +
  • units – units to use. Defaults to the standard unit of miles. Available units: “meters”, “miles”.

  • +
  • network_variable – variable to store link distance in. Defaults to “distance”.

  • +
  • overwrite – Defaults to True and will overwrite all existing calculated distances. +False will only update NaNs.

  • +
  • inplace – updates self.links_df

  • +
+
+
Returns:
+

links_df with updated distance

+
+
+
+ +
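For example, a sketch that recalculates distance from true shapes where available:

net.update_distance(use_shapes=True, units="miles", network_variable="distance")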
+ +

Validate roadway network data link schema and output a boolean

+
+ +
+
+static validate_node_schema(node_file, schema_location='roadway_network_node.json')
+

Validate roadway network data node schema and output a boolean

+
+ +
+
+validate_properties(properties, ignore_existing=False, require_existing_for_change=False)
+

If there are change or existing commands, make sure that that +property exists in the network.

+
+
Return type:
+

bool

+
+
Parameters:
+
    +
  • properties – properties dictionary to be evaluated

  • +
  • ignore_existing – If True, will only warn about properties +that specify an “existing” value. If False, will fail.

  • +
  • require_existing_for_change – If True, will fail if there isn’t a specified value in the project card for existing when a change is specified.

  • +
+
+
+

Returns: boolean value as to whether the properties dictionary is valid.

+
+ +
+
+validate_selection(selection, selection_requires=['link'])
+

Evaluate whether the selection dictionary contains the minimum required values.

+
+
Return type:
+

bool

+
+
Parameters:
+

selection – selection dictionary to be evaluated

+
+
+

Returns: boolean value as to whether the selection dictionary is valid.

+
+ +
+
+static validate_shape_schema(shape_file, schema_location='roadway_network_shape.json')
+

Validate roadway network data shape schema and output a boolean

+
+ +
+
+validate_uniqueness()
+

Confirms that the unique identifiers are met.

+
+
Return type:
+

bool

+
+
+
+ +
+
+write(path='.', filename=None)
+

Writes a network in the roadway network standard

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • path – the path were the output will be saved

  • +
  • filename – the name prefix of the roadway files that will be generated

  • +
+
+
+
+ +
+
+write_roadway_as_fixedwidth(output_dir, node_output_variables=None, link_output_variables=None, output_link_txt=None, output_node_txt=None, output_link_header_width_txt=None, output_node_header_width_txt=None, output_cube_network_script=None, drive_only=False)[source]
+

Writes out fixed width file.

+

This function does:
1. write out link and node fixed width data files for cube.
2. write out header and width correspondence.
3. write out cube network building script with header and width specification.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to where links, nodes and script will be written and run

  • +
  • node_output_variables (list) – list of node variable names.

  • +
  • link_output_variables (list) – list of link variable names.

  • +
  • output_link_txt (str) – File name of output link database (within output_dir)

  • +
  • output_node_txt (str) – File name of output node database (within output_dir)

  • +
  • output_link_header_width_txt (str) – File name of link column width records (within output_dir)

  • +
  • output_node_header_width_txt (str) – File name of node column width records (within output_dir)

  • +
  • output_cube_network_script (str) – File name of CUBE network building script (within output_dir)

  • +
  • drive_only (bool) – If True, only writes drive nodes and links

  • +
+
+
Returns:
+

None

+
+
+
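A hedged usage sketch of write_roadway_as_fixedwidth; the output directory and file names are illustrative choices, not library defaults, and net is a placeholder instance:

# Hedged sketch: write Cube fixed-width inputs plus the network-building script.
net.write_roadway_as_fixedwidth(
    output_dir="tests/scratch",
    output_link_txt="links.txt",
    output_node_txt="nodes.txt",
    output_link_header_width_txt="links_header_width.txt",
    output_node_header_width_txt="nodes_header_width.txt",
    output_cube_network_script="make_network.s",
    drive_only=False,
)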
+ +
+
+write_roadway_as_shp(output_dir, node_output_variables=None, link_output_variables=None, data_to_csv=True, data_to_dbf=False, output_link_shp=None, output_node_shp=None, output_link_csv=None, output_node_csv=None, output_gpkg=None, output_link_gpkg_layer=None, output_node_gpkg_layer=None, output_gpkg_link_filter=None)[source]
+

Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to directory

  • +
  • node_output_variables (list) – List of strings for node output variables.

  • +
  • link_output_variables (list) – List of strings for link output variables.

  • +
  • data_to_csv (bool) – True if write network in csv format.

  • +
  • data_to_dbf (bool) – True if write network in dbf/shp format.

  • +
  • output_link_shp (str) – File name to output link dbf/shp.

  • +
  • output_node_shp (str) – File name of output node dbf/shp.

  • +
  • output_link_csv (str) – File name to output link csv.

  • +
  • output_node_csv (str) – File name to output node csv.

  • +
  • output_gpkg (str) – File name to output GeoPackage.

  • +
  • output_link_gpkg_layer (str) – Layer name within output_gpkg to output links.

  • +
  • output_node_gpkg_layer (str) – Layer name within output_gpkg to output nodes.

  • +
  • output_gpkg_link_filter (str) – Optional column name used to output additional link subset layers

  • +
+
+
Returns:
+

None

+
+
+
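A hedged usage sketch of write_roadway_as_shp that writes a GeoPackage plus companion CSVs; file and layer names are illustrative, and net is a placeholder instance:

# Hedged sketch: links and nodes to a GeoPackage, with full-length variable
# names preserved in the CSV outputs.
net.write_roadway_as_shp(
    output_dir="tests/scratch",
    data_to_csv=True,
    data_to_dbf=False,
    output_gpkg="network.gpkg",
    output_link_gpkg_layer="links",
    output_node_gpkg_layer="nodes",
    output_link_csv="links.csv",
    output_node_csv="nodes.csv",
)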
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+ +
+
+
+
diff --git a/branch/bicounty_dev/_generated/lasso.Parameters/index.html b/branch/bicounty_dev/_generated/lasso.Parameters/index.html
new file mode 100644
index 0000000..ead392d
--- /dev/null
+++ b/branch/bicounty_dev/_generated/lasso.Parameters/index.html
@@ -0,0 +1,555 @@

lasso.Parameters

+
+
+class lasso.Parameters(**kwargs)[source]
+

Bases: object

+

A class representing all the parameters defining the networks +including time of day, categories, etc.

+

Parameters can be set at runtime by initializing a Parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use the default values listed in this class.

+
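A minimal sketch of overriding a default at runtime with a keyword argument; the value shown simply restates the documented default for cube_time_periods:

from lasso import Parameters

# Hedged sketch: any documented attribute can be passed as a keyword argument;
# attributes that are not passed keep the defaults listed below.
params = Parameters(
    cube_time_periods={"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"},
)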
+
Attr:
+
time_period_to_time (dict): Maps time period abbreviations used in

Cube to time of days used on gtfs and highway network standard +Default:

+
{
+    "EA": ("3:00", "6:00"),
+    "AM": ("6:00, "10:00"),
+    "MD": ("10:00", "15:00"),
+    "PM": ("15:00", "19:00"),
+    "EV": ("19:00", "3:00"),
+}
+
+
+
+
cube_time_periods (dict): Maps cube time period numbers used in

transit line files to the time period abbreviations in time_period_to_time +dictionary. +Default:

+
{"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"}
+
+
+
+
categories (dict): Maps demand category abbreviations to a list of

network categories they are allowed to use. +Default:

+
{
+    # suffix, source (in order of search)
+    "sov": ["sov", "default"],
+    "hov2": ["hov2", "default", "sov"],
+    "hov3": ["hov3", "hov2", "default", "sov"],
+    "truck": ["trk", "sov", "default"],
+}
+
+
+
+
properties_to_split (dict): Dictionary mapping variables in standard

roadway network to categories and time periods that need to be +split out in final model network to get variables like LANES_AM. +Default:

+
{
+    "lanes": {
+        "v": "lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "ML_lanes": {
+        "v": "ML_lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "use": {
+        "v": "use",
+        "time_periods": self.time_periods_to_time
+    },
+}
+
+
+
+
county_shape (str): File location of shapefile defining counties.

Default:

+
r"metcouncil_data/county/cb_2017_us_county_5m.shp"
+
+
+
+
county_variable_shp (str): Property defining the county name in

the county_shape file. +Default:

+
NAME
+
+
+
+
lanes_lookup_file (str): Lookup table of number of lanes for different data sources.

Default:

+
r"metcouncil_data/lookups/lanes.csv"
+
+
+
+
centroid_connect_lanes (int): Number of lanes for centroid connectors.

Default:

+
1
+
+
+
+
mpo_counties (list): list of county names within MPO boundary.

Default:

+
[
+    "ANOKA",
+    "DAKOTA",
+    "HENNEPIN",
+    "RAMSEY",
+    "SCOTT",
+    "WASHINGTON",
+    "CARVER",
+]
+
+
+
+
taz_shape (str):

Default:

+
r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp"
+
+
+
+
taz_data (str):

Default:

+
??
+
+
+
+
highest_taz_number (int): highest TAZ number in order to define

centroid connectors. +Default:

+
3100
+
+
+
+
output_variables (list): list of variables to output in final model

network. +Default:

+
[
+    "model_link_id",
+    "link_id",
+    "A",
+    "B",
+    "shstGeometryId",
+    "distance",
+    "roadway",
+    "name",
+    "roadway_class",
+    "bike_access",
+    "walk_access",
+    "drive_access",
+    "truck_access",
+    "trn_priority_EA",
+    "trn_priority_AM",
+    "trn_priority_MD",
+    "trn_priority_PM",
+    "trn_priority_EV",
+    "ttime_assert_EA",
+    "ttime_assert_AM",
+    "ttime_assert_MD",
+    "ttime_assert_PM",
+    "ttime_assert_EV",
+    "lanes_EA",
+    "lanes_AM",
+    "lanes_MD",
+    "lanes_PM",
+    "lanes_EV",
+    "price_sov_EA",
+    "price_hov2_EA",
+    "price_hov3_EA",
+    "price_truck_EA",
+    "price_sov_AM",
+    "price_hov2_AM",
+    "price_hov3_AM",
+    "price_truck_AM",
+    "price_sov_MD",
+    "price_hov2_MD",
+    "price_hov3_MD",
+    "price_truck_MD",
+    "price_sov_PM",
+    "price_hov2_PM",
+    "price_hov3_PM",
+    "price_truck_PM",
+    "price_sov_EV",
+    "price_hov2_EV",
+    "price_hov3_EV",
+    "price_truck_EV",
+    "roadway_class_idx",
+    "facility_type",
+    "county",
+    "centroidconnect",
+    "model_node_id",
+    "N",
+    "osm_node_id",
+    "bike_node",
+    "transit_node",
+    "walk_node",
+    "drive_node",
+    "geometry",
+    "X",
+    "Y",
+    "ML_lanes_EA",
+    "ML_lanes_AM",
+    "ML_lanes_MD",
+    "ML_lanes_PM",
+    "ML_lanes_EV",
+    "segment_id",
+    "managed",
+    "bus_only",
+    "rail_only"
+]
+
+
+
+
osm_facility_type_dict (dict): Mapping between OSM Roadway variable

and facility type. Default:

+
+
area_type_shape (str): Location of shapefile defining area type.

Default:

+
r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp"
+
+
+
+
area_type_variable_shp (str): property in area_type_shape with area

type in it. +Default:

+
"COMDES2040"
+
+
+
+
area_type_code_dict (dict): Mapping of the area_type_variable_shp to

the area type code used in the MetCouncil cube network. +Default:

+
{
+    23: 4,  # urban center
+    24: 3,
+    25: 2,
+    35: 2,
+    36: 1,
+    41: 1,
+    51: 1,
+    52: 1,
+    53: 1,
+    60: 1,
+}
+
+
+
+
downtown_area_type_shape (str): Location of shapefile defining downtown area type.

Default:

+
r"metcouncil_data/area_type/downtownzones_TAZ.shp"
+
+
+
+
downtown_area_type (int): Area type integer for downtown.

Default:

+
5
+
+
+
+
mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property

associated with roadway class. Default:

+
r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp"
+
+
+
+
mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp

associated with roadway class. Default:

+
"ROUTE_SYS"
+
+
+
+
widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property

associated with roadway class. Default:

+
r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp"
+
+
+
+
widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape

associated with roadway class.Default:

+
"RDWY_CTGY_"
+
+
+
+
mndot_count_shape (str): Shapefile of MnDOT links with a property

associated with counts. Default:

+
r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp"
+
+
+
+
mndot_count_variable_shp (str): The property in mndot_count_shape

associated with counts. Default:

+
+
::

“lookups/osm_highway_facility_type_crosswalk.csv”

+
+
+
+
legacy_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from Legacy TM2 network. Default:

+
"lookups/legacy_tm2_attributes.csv"
+
+
+
+
osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId

from OSM. Default:

+
"lookups/osm_lanes_attributes.csv"
+
+
+
+
tam_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from TAM TM2 network. Default:

+
"lookups/tam_tm2_attributes.csv"
+
+
+
+
tom_tom_attributes (str): CSV file of link attributes by

shStReferenceId from TomTom network. Default:

+
"lookups/tomtom_attributes.csv"
+
+
+
+
sfcta_attributes (str): CSV file of link attributes by

shStReferenceId from SFCTA network. Default:

+
"lookups/sfcta_attributes.csv"
+
+
+
+
output_epsg (int): EPSG type of geographic projection for output

shapefiles. Default:

+
102646
+
+
+
+
output_link_shp (str): Output shapefile for roadway links. Default:
+
::

r”tests/scratch/links.shp”

+
+
+
+
output_node_shp (str): Output shapefile for roadway nodes. Default:
+
::

r”tests/scratch/nodes.shp”

+
+
+
+
output_link_csv (str): Output csv for roadway links. Default:
+
::

r”tests/scratch/links.csv”

+
+
+
+
output_node_csv (str): Output csv for roadway nodes. Default:
+
::

r”tests/scratch/nodes.csv”

+
+
+
+
output_link_txt (str): Output fixed format txt for roadway links. Default:
+
::

r”tests/scratch/links.txt”

+
+
+
+
output_node_txt (str): Output fixed format txt for roadway nodes. Default:
+
::

r”tests/scratch/nodes.txt”

+
+
+
+
output_link_header_width_txt (str): Header for txt roadway links. Default:
+
::

r”tests/scratch/links_header_width.txt”

+
+
+
+
output_node_header_width_txt (str): Header for txt for roadway Nodes. Default:
+
::

r”tests/scratch/nodes_header_width.txt”

+
+
+
+
output_cube_network_script (str): Cube script for importing

fixed-format roadway network. Default:

+
r"tests/scratch/make_complete_network_from_fixed_width_file.s
+
+
+
+
+
+
+
+
+__init__(**kwargs)[source]
+

Time period and category splitting info

+
+ +

Methods

+ + + + + + +

__init__(**kwargs)

Time period and category splitting info

+

Attributes

+ + + + + + + + + + + + + + + +

cube_time_periods

#MC self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

properties_to_split

Details for calculating the county based on the centroid of the link.

county_link_range_dict

self.county_code_dict = {

zones

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+
self.county_code_dict = {
    "Anoka": 1,
    "Carver": 2,
    "Dakota": 3,
    "Hennepin": 4,
    "Ramsey": 5,
    "Scott": 6,
    "Washington": 7,
    "external": 10,
    "Chisago": 11,
    "Goodhue": 12,
    "Isanti": 13,
    "Le Sueur": 14,
    "McLeod": 15,
    "Pierce": 16,
    "Polk": 17,
    "Rice": 18,
    "Sherburne": 19,
    "Sibley": 20,
    "St. Croix": 21,
    "Wright": 22,
}

+
+ +
+
+cube_time_periods
+

#MC
self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

self.route_type_mode_dict = {0: 8, 2: 9}

self.cube_time_periods = {"1": "AM", "2": "MD"}
self.cube_time_periods_name = {"AM": "pk", "MD": "op"}

+
+ +
+
+properties_to_split
+

Details for calculating the county based on the centroid of the link. The NAME variable should be the name of a field in the shapefile.

+
+ +
+
+zones
+

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+ +
+ + +
+
+ +
+
+
+
diff --git a/branch/bicounty_dev/_generated/lasso.Project/index.html b/branch/bicounty_dev/_generated/lasso.Project/index.html
new file mode 100644
index 0000000..68c1dc2
--- /dev/null
+++ b/branch/bicounty_dev/_generated/lasso.Project/index.html
@@ -0,0 +1,521 @@

lasso.Project

+
+
+class lasso.Project(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Bases: object

+

A single or set of changes to the roadway or transit system.

+

Compares a base and a build transit network or a base and build +highway network and produces project cards.

+

Typical usage example:

+
test_project = Project.create_project(
+    base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+test_project.evaluate_changes()
+test_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+)
+
+
+
+
+DEFAULT_PROJECT_NAME
+

a class-level constant that defines what +the project name will be if none is set.

+
+ +
+
+STATIC_VALUES
+

a class-level constant which defines values that +are not evaluated when assessing changes.

+
+ +
+
+card_data
+

{“project”: <project_name>, “changes”: <list of change dicts>}

+
+
Type:
+

dict

+
+
+
+ +
+ +

pandas dataframe of CUBE roadway link changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+roadway_node_changes
+

pandas dataframe of CUBE roadway node changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+transit_changes
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+base_roadway_network
+
+
Type:
+

RoadwayNetwork

+
+
+
+ +
+
+base_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+build_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+project_name
+

name of the project, set to DEFAULT_PROJECT_NAME if not provided

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

an instance of the Parameters class which sets a bunch of parameters

+
+ +
+
+__init__(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Project constructor.

+
+
Parameters:
+
    +
  • roadway_link_changes – dataframe of roadway changes read from a log file

  • +
  • roadway_node_changes – dataframe of roadway changes read from a log file

  • +
  • transit_changes – dataframe of transit changes read from a log file

  • +
  • base_roadway_network – RoadwayNetwork instance for base case

  • +
  • base_transit_network – StandardTransit instance for base case

  • +
  • base_cube_transit_network – CubeTransit instance for base transit network

  • +
  • build_cube_transit_network – CubeTransit instance for build transit network

  • +
  • project_name – name of the project

  • +
  • evaluate – defaults to false, but if true, will create card data

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
+

returns: instance of Project

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([roadway_link_changes, ...])

Project constructor.

add_highway_changes([...])

Evaluates changes from the log file based on the base highway object and adds entries into the self.card_data dictionary.

add_transit_changes()

Evaluates changes between base and build transit objects and adds entries into the self.card_data dictionary.

create_project([roadway_log_file, ...])

Constructor for a Project instance.

determine_roadway_network_changes_compatibility(...)

Checks to see that any links or nodes that change exist in base roadway network.

emme_id_to_wrangler_id(emme_link_change_df, ...)

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

emme_name_to_wrangler_name(...)

rename emme names to wrangler names using crosswalk file

evaluate_changes()

Determines which changes should be evaluated, initiates self.card_data to be an aggregation of transit and highway changes.

get_object_from_network_build_command()

determine the network build object is node or link

get_operation_from_network_build_command()

determine the network build object action type

read_logfile(logfilename)

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

read_network_build_file(networkbuildfilename)

Reads an EMME network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

write_project_card([filename])

Writes project cards.

+

Attributes

+ + + + + + + + + + + + +

CALCULATED_VALUES

DEFAULT_PROJECT_NAME

STATIC_VALUES

+
+
+add_highway_changes(limit_variables_to_existing_network=False)[source]
+

Evaluates changes from the log file based on the base highway object and +adds entries into the self.card_data dictionary.

+
+
Parameters:
+

limit_variables_to_existing_network (bool) – True if no ad-hoc variables. Default to False.

+
+
+
+ +
+
+add_transit_changes()[source]
+

Evaluates changes between base and build transit objects and +adds entries into the self.card_data dictionary.

+
+ +
+
+static create_project(roadway_log_file=None, roadway_shp_file=None, roadway_csv_file=None, network_build_file=None, emme_node_id_crosswalk_file=None, emme_name_crosswalk_file=None, base_roadway_dir=None, base_transit_dir=None, base_cube_transit_source=None, build_cube_transit_source=None, roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name=None, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Constructor for a Project instance.

+
+
Parameters:
+
    +
  • roadway_log_file – File path to consuming logfile or a list of logfile paths.

  • +
  • roadway_shp_file – File path to consuming shape file for roadway changes.

  • +
  • roadway_csv_file – File path to consuming csv file for roadway changes.

  • +
  • network_build_file – File path to consuming EMME network build for network changes.

  • +
  • base_roadway_dir – Folder path to base roadway network.

  • +
  • base_transit_dir – Folder path to base transit network.

  • +
  • base_cube_transit_source – Folder path to base transit network or cube line file string.

  • +
  • base_cube_transit_file – File path to base transit network.

  • +
  • build_cube_transit_source – Folder path to build transit network or cube line file string.

  • +
  • build_cube_transit_file – File path to build transit network.

  • +
  • roadway_link_changes – pandas dataframe of CUBE roadway link changes.

  • +
  • roadway_node_changes – pandas dataframe of CUBE roadway node changes.

  • +
  • transit_changes – build transit changes.

  • +
  • base_roadway_network – Base roadway network object.

  • +
  • base_cube_transit_network – Base cube transit network object.

  • +
  • build_cube_transit_network – Build cube transit network object.

  • +
  • project_name – If not provided, will default to the roadway_log_file filename if +provided (or the first filename if a list is provided)

  • +
  • recalculate_calculated_variables – if reading in a base network, if this is true it +will recalculate variables such as area type, etc. This only needs to be true +if you are creating project cards that are changing the calculated variables.

  • +
  • recalculate_distance – recalculate the distance variable. This only needs to be +true if you are creating project cards that change the distance.

  • +
  • parameters – dictionary of parameters

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in +the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables +in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

A Project instance.

+
+
+
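A hedged sketch tying create_project to the other documented methods; the file paths and project name are illustrative placeholders, and whether these inputs alone are sufficient depends on the project being evaluated:

# Hedged sketch: build a Project from a Cube log file and a base roadway
# network directory, then write out the resulting project card.
project = Project.create_project(
    roadway_log_file="example_roadway_change.log",
    base_roadway_dir="base_roadway_network_dir",
    project_name="example_roadway_change",
)
project.evaluate_changes()
project.write_project_card("example_roadway_change.yml")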
+ +
+
+static determine_roadway_network_changes_compatibility(base_roadway_network, roadway_link_changes, roadway_node_changes, parameters)[source]
+

Checks to see that any links or nodes that change exist in base roadway network.

+
+ +
+
+static emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file)[source]
+

Rewrites the EMME IDs with Wrangler IDs, using the EMME-to-Wrangler ID crosswalk located in the database folder

+
+ +
+
+static emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file)[source]
+

Renames EMME names to Wrangler names using the crosswalk file

+
+ +
+
+evaluate_changes()[source]
+

Determines which changes should be evaluated, initiates +self.card_data to be an aggregation of transit and highway changes.

+
+ +
+
+get_object_from_network_build_command()[source]
+

Determines whether the network build object is a node or a link

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘N’ for node, ‘L’ for link

+
+
+
+ +
+
+get_operation_from_network_build_command()[source]
+

Determines the action type of the network build command

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘A’, ‘C’, ‘D’

+
+
+
+ +
+
+static read_logfile(logfilename)[source]
+

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

logfilename (str or list[str]) – File path to CUBE logfile or list of logfile paths.

+
+
Returns:
+

A DataFrame representation of the log file.

+
+
+
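A short sketch of read_logfile; the path is illustrative. The methods summary above describes the result as separate link and node change dataframes, so that is what is unpacked here:

# Hedged sketch: parse a Cube log file into link- and node-change dataframes.
link_changes_df, node_changes_df = Project.read_logfile("example_1.log")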
+ +
+
+static read_network_build_file(networkbuildfilename)[source]
+

Reads an EMME network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

networkbuildfilename (str or list[str]) – File path to an EMME network build file or a list of network build file paths.

+
+
Returns:
+

A DataFrame representation of the network build file

+
+
+
+ +
+
+write_project_card(filename=None)[source]
+

Writes project cards.

+
+
Parameters:
+

filename (str) – File path to output .yml

+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'assign_group', 'centroidconnect']
+
+ +
+
+DEFAULT_PROJECT_NAME = 'USER TO define'
+
+ +
+
+STATIC_VALUES = ['model_link_id', 'area_type', 'county', 'centroidconnect']
+
+ +
+ +
+ + +
+
+ +
+
+
+
diff --git a/branch/bicounty_dev/_generated/lasso.StandardTransit/index.html b/branch/bicounty_dev/_generated/lasso.StandardTransit/index.html
new file mode 100644
index 0000000..e3c050b
--- /dev/null
+++ b/branch/bicounty_dev/_generated/lasso.StandardTransit/index.html
@@ -0,0 +1,420 @@

lasso.StandardTransit

+
+
+class lasso.StandardTransit(ptg_feed, parameters={})[source]
+

Bases: object

+

Holds a standard transit feed as a Partridge object and contains +methods to manipulate and translate the GTFS data to MetCouncil’s +Cube Line files.

+

Typical usage example:

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+feed
+

Partridge Feed object containing read-only access to GTFS feed

+
+ +
+
+parameters
+

Parameters instance containing information +about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+__init__(ptg_feed, parameters={})[source]
+
+
Parameters:
+
    +
  • ptg_feed – partridge feed object

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(ptg_feed[, parameters])

+
param ptg_feed:
+

partridge feed object

+
+
+

calculate_cube_mode(row)

Assigns a cube mode number by following logic.

cube_format(row)

Creates a string representing the route in cube line file notation. #MC :param row: row of a DataFrame representing a cube-formatted trip, with the Attributes trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR.

evaluate_differences(transit_changes)

Compare changes from the transit_changes dataframe with the standard transit network returns the project card changes in dictionary format

fromTransitNetwork(transit_network_object[, ...])

RoadwayNetwork to ModelRoadwayNetwork

read_gtfs(gtfs_feed_dir[, parameters])

Reads GTFS files from a directory and returns a StandardTransit instance.

route_properties_gtfs_to_cube(self)

Prepare gtfs for cube lin file.

shape_gtfs_to_cube(row[, add_nntime])

Creates a list of nodes that for the route in appropriate cube format.

shape_gtfs_to_emme(trip_row)

Creates transit segment for the trips in appropriate emme format.

time_to_cube_time_period(start_time_secs[, ...])

Converts seconds from midnight to the cube time period.

write_as_cube_lin([outpath])

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties.

+
+
+calculate_cube_mode(row)[source]
+

Assigns a cube mode number by following logic. +#MC +For rail, uses GTFS route_type variable: +https://developers.google.com/transit/gtfs/reference

+
+
::

    # route_type : cube_mode
    route_type_to_cube_mode = {0: 8,  # Tram, Streetcar, Light rail
                               3: 0,  # Bus; further disaggregated for cube
                               2: 9}  # Rail

+
+
+
+

For buses, uses route id numbers and route name to find +express and suburban buses as follows:

+
+
::
+
    if not cube_mode:
        if 'express' in row['LONGNAME'].lower():
            cube_mode = 7  # Express
        elif int(row['route_id'].split("-")[0]) > 99:
            cube_mode = 6  # Suburban Local
        else:
            cube_mode = 5  # Urban Local

+
+
+
+
+
+
+
+
Parameters:
+

row – A DataFrame row with route_type, route_long_name, and route_id

+
+
Returns:
+

cube mode number

+
+
+
+ +
+
+cube_format(row)[source]
+

Creates a string representing the route in cube line file notation.
#MC
:param row: row of a DataFrame representing a cube-formatted trip, with the Attributes

+
+

trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR

+
+
+
Returns:
+

string representation of route in cube line file notation

+
+
+
+ +
+
+evaluate_differences(transit_changes)[source]
+

Compare changes from the transit_changes dataframe with the standard transit network +returns the project card changes in dictionary format

+
+ +
+
+static fromTransitNetwork(transit_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters:
+
    +
  • transit_network_object – Reference to an instance of TransitNetwork.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit

+
+
+
+ +
+
+static read_gtfs(gtfs_feed_dir, parameters={})[source]
+

Reads GTFS files from a directory and returns a StandardTransit +instance.

+
+
Parameters:
+
    +
  • gtfs_feed_dir – location of the GTFS files

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit instance

+
+
+
+ +
+
+static route_properties_gtfs_to_cube(self)[source]
+

Prepare gtfs for cube lin file.
#MC
Does the following operations:
1. Combines route, frequency, trip, and shape information
2. Converts time of day to time periods
3. Calculates cube route name from gtfs route name and properties
4. Assigns a cube-appropriate mode number
5. Assigns a cube-appropriate operator number

+
+
Returns:
+

+
DataFrame of trips with cube-appropriate values for:
    +
  • NAME

  • +
  • ONEWAY

  • +
  • OPERATOR

  • +
  • MODE

  • +
  • HEADWAY

  • +
+
+
+

+
+
Return type:
+

trip_df (DataFrame)

+
+
+
+ +
+
+shape_gtfs_to_cube(row, add_nntime=False)[source]
+

Creates a list of nodes that for the route in appropriate +cube format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a string representation of the node list

for a route in cube format.

+
+
+
+ +
+
+shape_gtfs_to_emme(trip_row)[source]
+

Creates transit segment for the trips in appropriate +emme format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a dataframe representation of the transit segment

for a trip in emme format.

+
+
+
+ +
+
+time_to_cube_time_period(start_time_secs, as_str=True, verbose=False)[source]
+

Converts seconds from midnight to the cube time period.

+
+
Parameters:
+
    +
  • start_time_secs – start time for transit trip in seconds +from midnight

  • +
  • as_str – if True, returns the time period as a string, +otherwise returns a numeric time period

  • +
+
+
Returns:
+

+
this_tp_num: if as_str is False, returns the numeric time period

this_tp: if as_str is True, returns the Cube time period name abbreviation

+
+
+
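A small sketch converting a clock time to a Cube time period; cube_transit_net follows the naming of the typical usage example above, and the expected "AM" result assumes the documented default time period definitions:

# Hedged sketch: 8:30 AM expressed as seconds from midnight.
start_secs = 8 * 3600 + 30 * 60
period = cube_transit_net.time_to_cube_time_period(start_secs, as_str=True)
# With the default periods, 8:30 falls in the "AM" period (6:00-10:00).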
+ +
+
+write_as_cube_lin(outpath=None)[source]
+

Writes the gtfs feed as a cube line file after +converting gtfs properties to MetCouncil cube properties. +#MC +:param outpath: File location for output cube line file.

+
+ +
+ +
+ + +
+
+ +
+
+
+
diff --git a/branch/bicounty_dev/_generated/lasso.logger/index.html b/branch/bicounty_dev/_generated/lasso.logger/index.html
new file mode 100644
index 0000000..baf81a9
--- /dev/null
+++ b/branch/bicounty_dev/_generated/lasso.logger/index.html
@@ -0,0 +1,143 @@

lasso.logger

+

Functions

+ + + + + + +

setupLogging(infoLogFilename, debugLogFilename)

Sets up the logger.

+
+
+lasso.logger.setupLogging(infoLogFilename, debugLogFilename, logToConsole=True)[source]
+

Sets up the logger. The infoLog is terse, just gives the bare minimum of details +so the network composition will be clear later. +The debuglog is very noisy, for debugging.

+

Pass None to either. Also writes everything to the console if logToConsole is True.

+
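A minimal usage sketch; the log file names are illustrative:

from lasso.logger import setupLogging

# Hedged sketch: terse info log, verbose debug log, and echo everything
# to the console.
setupLogging("lasso_info.log", "lasso_debug.log", logToConsole=True)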
+ +
+ + +
+
+ +
+
+
+
diff --git a/branch/bicounty_dev/_generated/lasso.util/index.html b/branch/bicounty_dev/_generated/lasso.util/index.html
new file mode 100644
index 0000000..1d14e5f
--- /dev/null
+++ b/branch/bicounty_dev/_generated/lasso.util/index.html
@@ -0,0 +1,1696 @@

lasso.util

+

Functions

+ + + + + + + + + + + + + + + + + + + + + + + + +

column_name_to_parts(c[, parameters])

create_locationreference(node, link)

geodesic_point_buffer(lat, lon, meters)

creates circular buffer polygon for node

get_shared_streets_intersection_hash(lat, long)

Calculated per:

hhmmss_to_datetime(hhmmss_str)

Creates a datetime time object from a string of hh:mm:ss

secs_to_datetime(secs)

Creates a datetime time object from a seconds from midnight

shorten_name(name)

+
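A brief sketch exercising two of the helpers listed above; the argument values are illustrative and the variable names are assumptions:

from lasso.util import geodesic_point_buffer, hhmmss_to_datetime

# Hedged sketch based on the function summaries above.
start_time = hhmmss_to_datetime("06:30:00")               # datetime time object
buffer_poly = geodesic_point_buffer(44.95, -93.09, 100)   # circular buffer polygon for a node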
+
+class lasso.util.Point(*args)[source]
+

Bases: BaseGeometry

+

A geometry type that represents a single coordinate with +x,y and possibly z values.

+

A point is a zero-dimensional feature and has zero length and zero area.

+
+
Parameters:
+

args (float, or sequence of floats) –

The coordinates can either be passed as a single parameter, or as +individual float values using multiple parameters:

+
    +
  1. 1 parameter: a sequence or array-like of with 2 or 3 values.

  2. +
  3. 2 or 3 parameters (float): x, y, and possibly z.

  4. +
+

+
+
+
+
+x, y, z
+

Coordinate values

+
+
Type:
+

float

+
+
+
+ +

Examples

+

Constructing the Point using separate parameters for x and y:

+
>>> p = Point(1.0, -1.0)
+
+
+

Constructing the Point using a list of x, y coordinates:

+
>>> p = Point([1.0, -1.0])
+>>> print(p)
+POINT (1 -1)
+>>> p.y
+-1.0
+>>> p.x
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry.

Because of this it is possible for "equals()" to be True for two geometries and "equals_exact()" to be False.

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG circle element for the Point geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG circle diameter. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that's a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property x
+

Return x coordinate.

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+

Example

+
>>> x, y = Point(0, 0).xy
+>>> list(x)
+[0.0]
+>>> list(y)
+[0.0]
+
+
+
+ +
+
+property y
+

Return y coordinate.

+
+ +
+
+property z
+

Return z coordinate.

+
+ +
+ +
+
+class lasso.util.Polygon(shell=None, holes=None)[source]
+

Bases: BaseGeometry

+

A geometry type representing an area that is enclosed by a linear ring.

+

A polygon is a two-dimensional feature and has a non-zero area. It may +have one or more negative-space “holes” which are also bounded by linear +rings. If any rings cross each other, the feature is invalid and +operations on it may fail.

+
+
Parameters:
+
    +
  • shell (sequence) – A sequence of (x, y [,z]) numeric coordinate pairs or triples, or +an array-like with shape (N, 2) or (N, 3). +Also can be a sequence of Point objects.

  • +
  • holes (sequence) – A sequence of objects which satisfy the same requirements as the +shell parameters above

  • +
+
+
+
+
+exterior
+

The ring which bounds the positive space of the polygon.

+
+
Type:
+

LinearRing

+
+
+
+ +
+
+interiors
+

A sequence of rings which bound all existing holes.

+
+
Type:
+

sequence

+
+
+
+ +

Examples

+

Create a square polygon with no holes

+
>>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.))
+>>> polygon = Polygon(coords)
+>>> polygon.area
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • +
  • tolerance (float) – Absolute tolerance in the same units as coordinates.

  • +
This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry.

Because of this it is possible for "equals()" to be True for two geometries and "equals_exact()" to be False.

  • +
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+classmethod from_bounds(xmin, ymin, xmax, ymax)[source]
+

Construct a Polygon() from spatial bounds.

+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.
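Illustrative example (added; not from the upstream docstring; reprs assume Shapely 2.x):

>>> from shapely.geometry import LineString
>>> LineString([(0, 0), (0, 10)]).interpolate(5)
<POINT (0 5)>
>>> LineString([(0, 0), (0, 10)]).interpolate(0.5, normalized=True)
<POINT (0 5)>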

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.
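Illustrative example (added; not from the upstream docstring). The point on the line below nearest to (5, 5) is (0, 5), which lies 5 units, or 0.5 of the line's length, from its start:

>>> from shapely.geometry import LineString, Point
>>> LineString([(0, 0), (0, 10)]).line_locate_point(Point(5, 5))
5.0
>>> LineString([(0, 0), (0, 10)]).line_locate_point(Point(5, 5), normalized=True)
0.5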

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False
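Illustrative example (added; not from the upstream docstring). Two identical points relate as '0FFFFFFF2', and 'T*****FF*' is the standard DE-9IM pattern for containment:

>>> from shapely.geometry import Point
>>> Point(0, 0).relate(Point(0, 0))
'0FFFFFFF2'
>>> Point(0, 0).buffer(1.0).relate_pattern(Point(0, 0), 'T*****FF*')
True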

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.
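Illustrative example (added; not from the upstream docstring; repr assumes Shapely 2.x). The middle vertex deviates only 0.1 from the straight line, well inside the 0.5 tolerance, so it is dropped:

>>> from shapely.geometry import LineString
>>> LineString([(0, 0), (1, 0.1), (2, 0)]).simplify(0.5)
<LINESTRING (0 0, 2 0)>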

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG path element for the Polygon geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG stroke-width. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that's a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelops the geometry

+
+ +
+
+property exterior
+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property interiors
+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+
+ +
+ +
+
+class lasso.util.partial[source]
+

Bases: object

+

partial(func, *args, **keywords) - new function with partial application +of the given arguments and keywords.

+
+
+args
+

tuple of arguments to future partial calls

+
+ +
+
+func
+

function object to use in future partial calls

+
+ +
+
+keywords
+

dictionary of keyword arguments to future partial calls
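Illustrative example (added; not from the original page). This class appears to be functools.partial re-exported by lasso.util, so the documented attributes behave as follows:

>>> from functools import partial
>>> add_two = partial(lambda x, y: x + y, 2)
>>> add_two(3)
5
>>> add_two.args
(2,)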

+
+ +
+ +
+
+lasso.util.column_name_to_parts(c, parameters=None)[source]
+
+ +
+
+lasso.util.create_locationreference(node, link)[source]
+
+ +
+
+lasso.util.geodesic_point_buffer(lat, lon, meters)[source]
+

Creates a circular buffer polygon around a node location.

+
+
Parameters:
+
    +
  • lat – node lat

  • +
  • lon – node lon

  • +
  • meters – buffer distance, radius of circle

  • +
+
+
Returns:
+

Polygon
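A hedged usage sketch based only on the signature and return type above (the coordinate and distance values are hypothetical):

>>> from lasso.util import geodesic_point_buffer
>>> poly = geodesic_point_buffer(44.95, -93.09, 100)  # 100 m buffer around a node
>>> poly.geom_type
'Polygon'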

+
+
+
+ +
+
+lasso.util.get_shared_streets_intersection_hash(lat, long, osm_node_id=None)[source]
+
+
Computes the SharedStreets intersection hash for a node. Calculated per:

https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565

Expected in/out:

-93.0965985, 44.952112199999995, osm_node_id = 954734870
→ 69f13f881649cb21ee3b359730790bb9

+
+
+
+
+
+ +
+
+lasso.util.hhmmss_to_datetime(hhmmss_str)[source]
+

Creates a datetime time object from a string of hh:mm:ss

+
+
Parameters:
+

hhmmss_str – string of hh:mm:ss

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

datetime.time

+
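A hedged usage sketch based on the documented signature and return type (exact behaviour not verified against the source):

from lasso.util import hhmmss_to_datetime

t = hhmmss_to_datetime("06:30:00")  # expected: a datetime.time such as datetime.time(6, 30)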
+
+
+ +
+
+lasso.util.secs_to_datetime(secs)[source]
+

Creates a datetime time object from seconds after midnight

+
+
Parameters:
+

secs – seconds from midnight

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

datetime.time

+
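A hedged usage sketch based on the documented signature and return type (exact behaviour not verified against the source):

from lasso.util import secs_to_datetime

t = secs_to_datetime(23400)  # 23400 seconds after midnight -> expected datetime.time(6, 30)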
+
+
+ +
+
+lasso.util.shorten_name(name)[source]
+
+ +
+
+lasso.util.transform(func, geom)[source]
+

Applies func to all coordinates of geom and returns a new +geometry of the same type from the transformed coordinates.

+

func maps x, y, and optionally z to output xp, yp, zp. The input parameters may be iterable types like lists or arrays, or single values. The output shall be of the same type: scalars in, scalars out; lists in, lists out.

+

For example, here is an identity function applicable to both types +of input.

+
+
+
def id_func(x, y, z=None):
    return tuple(filter(None, [x, y, z]))

+
+
+

g2 = transform(id_func, g1)

+
+

Using pyproj >= 2.1, this example will accurately project Shapely geometries:

+
+

import pyproj

+

wgs84 = pyproj.CRS("EPSG:4326")
utm = pyproj.CRS("EPSG:32618")

+

project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform

+

g2 = transform(project, g1)

+
+

Note that the always_xy kwarg is required here as Shapely geometries only support +X,Y coordinate ordering.

+

Lambda expressions such as the one in

+
+

g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1)

+
+

also satisfy the requirements for func.

+
+ +
+
+lasso.util.unidecode(string, errors='ignore', replace_str='?')
+

Transliterate a Unicode object into an ASCII string

+
+
Return type:
+

str

+
+
+
>>> unidecode("北亰")
+"Bei Jing "
+
+
+

This function first tries to convert the string using ASCII codec. +If it fails (because of non-ASCII characters), it falls back to +transliteration using the character tables.

+

This is approx. five times faster if the string only contains ASCII +characters, but slightly slower than unicode_expect_nonascii if +non-ASCII characters are present.

+

errors specifies what to do with characters that have not been +found in replacement tables. The default is ‘ignore’ which ignores +the character. ‘strict’ raises an UnidecodeError. ‘replace’ +substitutes the character with replace_str (default is ‘?’). +‘preserve’ keeps the original character.

+

Note that if ‘preserve’ is used the returned string might not be +ASCII!

+
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/_modules/functools/index.html b/branch/bicounty_dev/_modules/functools/index.html new file mode 100644 index 0000000..bd88918 --- /dev/null +++ b/branch/bicounty_dev/_modules/functools/index.html @@ -0,0 +1,1083 @@ + + + + + + functools — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for functools

+"""functools.py - Tools for working with functions and callable objects
+"""
+# Python module wrapper for _functools C module
+# to allow utilities written in Python to be added
+# to the functools module.
+# Written by Nick Coghlan <ncoghlan at gmail.com>,
+# Raymond Hettinger <python at rcn.com>,
+# and Łukasz Langa <lukasz at langa.pl>.
+#   Copyright (C) 2006-2013 Python Software Foundation.
+# See C source code for _functools credits/copyright
+
+__all__ = ['update_wrapper', 'wraps', 'WRAPPER_ASSIGNMENTS', 'WRAPPER_UPDATES',
+           'total_ordering', 'cmp_to_key', 'lru_cache', 'reduce', 'partial',
+           'partialmethod', 'singledispatch', 'singledispatchmethod',
+           "cached_property"]
+
+from abc import get_cache_token
+from collections import namedtuple
+# import types, weakref  # Deferred to single_dispatch()
+from reprlib import recursive_repr
+from _thread import RLock
+
+
+################################################################################
+### update_wrapper() and wraps() decorator
+################################################################################
+
+# update_wrapper() and wraps() are tools to help write
+# wrapper functions that can handle naive introspection
+
+WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__',
+                       '__annotations__')
+WRAPPER_UPDATES = ('__dict__',)
+def update_wrapper(wrapper,
+                   wrapped,
+                   assigned = WRAPPER_ASSIGNMENTS,
+                   updated = WRAPPER_UPDATES):
+    """Update a wrapper function to look like the wrapped function
+
+       wrapper is the function to be updated
+       wrapped is the original function
+       assigned is a tuple naming the attributes assigned directly
+       from the wrapped function to the wrapper function (defaults to
+       functools.WRAPPER_ASSIGNMENTS)
+       updated is a tuple naming the attributes of the wrapper that
+       are updated with the corresponding attribute from the wrapped
+       function (defaults to functools.WRAPPER_UPDATES)
+    """
+    for attr in assigned:
+        try:
+            value = getattr(wrapped, attr)
+        except AttributeError:
+            pass
+        else:
+            setattr(wrapper, attr, value)
+    for attr in updated:
+        getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
+    # Issue #17482: set __wrapped__ last so we don't inadvertently copy it
+    # from the wrapped function when updating __dict__
+    wrapper.__wrapped__ = wrapped
+    # Return the wrapper so this can be used as a decorator via partial()
+    return wrapper
+
+def wraps(wrapped,
+          assigned = WRAPPER_ASSIGNMENTS,
+          updated = WRAPPER_UPDATES):
+    """Decorator factory to apply update_wrapper() to a wrapper function
+
+       Returns a decorator that invokes update_wrapper() with the decorated
+       function as the wrapper argument and the arguments to wraps() as the
+       remaining arguments. Default arguments are as for update_wrapper().
+       This is a convenience function to simplify applying partial() to
+       update_wrapper().
+    """
+    return partial(update_wrapper, wrapped=wrapped,
+                   assigned=assigned, updated=updated)
+
+
+################################################################################
+### total_ordering class decorator
+################################################################################
+
+# The total ordering functions all invoke the root magic method directly
+# rather than using the corresponding operator.  This avoids possible
+# infinite recursion that could occur when the operator dispatch logic
+# detects a NotImplemented result and then calls a reflected method.
+
+def _gt_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a < b) and (a != b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _le_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (a < b) or (a == b).'
+    op_result = self.__lt__(other)
+    return op_result or self == other
+
+def _ge_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a < b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _ge_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a <= b) or (a == b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _lt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (a <= b) and (a != b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _gt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a <= b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _lt_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a > b) and (a != b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _ge_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (a > b) or (a == b).'
+    op_result = self.__gt__(other)
+    return op_result or self == other
+
+def _le_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a > b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _le_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a >= b) or (a == b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _gt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (a >= b) and (a != b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _lt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a >= b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+_convert = {
+    '__lt__': [('__gt__', _gt_from_lt),
+               ('__le__', _le_from_lt),
+               ('__ge__', _ge_from_lt)],
+    '__le__': [('__ge__', _ge_from_le),
+               ('__lt__', _lt_from_le),
+               ('__gt__', _gt_from_le)],
+    '__gt__': [('__lt__', _lt_from_gt),
+               ('__ge__', _ge_from_gt),
+               ('__le__', _le_from_gt)],
+    '__ge__': [('__le__', _le_from_ge),
+               ('__gt__', _gt_from_ge),
+               ('__lt__', _lt_from_ge)]
+}
+
+def total_ordering(cls):
+    """Class decorator that fills in missing ordering methods"""
+    # Find user-defined comparisons (not those inherited from object).
+    roots = {op for op in _convert if getattr(cls, op, None) is not getattr(object, op, None)}
+    if not roots:
+        raise ValueError('must define at least one ordering operation: < > <= >=')
+    root = max(roots)       # prefer __lt__ to __le__ to __gt__ to __ge__
+    for opname, opfunc in _convert[root]:
+        if opname not in roots:
+            opfunc.__name__ = opname
+            setattr(cls, opname, opfunc)
+    return cls
+
+
+################################################################################
+### cmp_to_key() function converter
+################################################################################
+
+def cmp_to_key(mycmp):
+    """Convert a cmp= function into a key= function"""
+    class K(object):
+        __slots__ = ['obj']
+        def __init__(self, obj):
+            self.obj = obj
+        def __lt__(self, other):
+            return mycmp(self.obj, other.obj) < 0
+        def __gt__(self, other):
+            return mycmp(self.obj, other.obj) > 0
+        def __eq__(self, other):
+            return mycmp(self.obj, other.obj) == 0
+        def __le__(self, other):
+            return mycmp(self.obj, other.obj) <= 0
+        def __ge__(self, other):
+            return mycmp(self.obj, other.obj) >= 0
+        __hash__ = None
+    return K
+
+try:
+    from _functools import cmp_to_key
+except ImportError:
+    pass
+
+
+################################################################################
+### reduce() sequence to a single item
+################################################################################
+
+_initial_missing = object()
+
+def reduce(function, sequence, initial=_initial_missing):
+    """
+    reduce(function, sequence[, initial]) -> value
+
+    Apply a function of two arguments cumulatively to the items of a sequence,
+    from left to right, so as to reduce the sequence to a single value.
+    For example, reduce(lambda x, y: x+y, [1, 2, 3, 4, 5]) calculates
+    ((((1+2)+3)+4)+5).  If initial is present, it is placed before the items
+    of the sequence in the calculation, and serves as a default when the
+    sequence is empty.
+    """
+
+    it = iter(sequence)
+
+    if initial is _initial_missing:
+        try:
+            value = next(it)
+        except StopIteration:
+            raise TypeError("reduce() of empty sequence with no initial value") from None
+    else:
+        value = initial
+
+    for element in it:
+        value = function(value, element)
+
+    return value
+
+try:
+    from _functools import reduce
+except ImportError:
+    pass
+
+
+################################################################################
+### partial() argument application
+################################################################################
+
+# Purely functional, no descriptor behaviour
+
[docs]class partial: + """New function with partial application of the given arguments + and keywords. + """ + + __slots__ = "func", "args", "keywords", "__dict__", "__weakref__" + + def __new__(cls, func, /, *args, **keywords): + if not callable(func): + raise TypeError("the first argument must be callable") + + if hasattr(func, "func"): + args = func.args + args + keywords = {**func.keywords, **keywords} + func = func.func + + self = super(partial, cls).__new__(cls) + + self.func = func + self.args = args + self.keywords = keywords + return self + + def __call__(self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(*self.args, *args, **keywords) + + @recursive_repr() + def __repr__(self): + qualname = type(self).__qualname__ + args = [repr(self.func)] + args.extend(repr(x) for x in self.args) + args.extend(f"{k}={v!r}" for (k, v) in self.keywords.items()) + if type(self).__module__ == "functools": + return f"functools.{qualname}({', '.join(args)})" + return f"{qualname}({', '.join(args)})" + + def __reduce__(self): + return type(self), (self.func,), (self.func, self.args, + self.keywords or None, self.__dict__ or None) + + def __setstate__(self, state): + if not isinstance(state, tuple): + raise TypeError("argument to __setstate__ must be a tuple") + if len(state) != 4: + raise TypeError(f"expected 4 items in state, got {len(state)}") + func, args, kwds, namespace = state + if (not callable(func) or not isinstance(args, tuple) or + (kwds is not None and not isinstance(kwds, dict)) or + (namespace is not None and not isinstance(namespace, dict))): + raise TypeError("invalid partial state") + + args = tuple(args) # just in case it's a subclass + if kwds is None: + kwds = {} + elif type(kwds) is not dict: # XXX does it need to be *exactly* dict? + kwds = dict(kwds) + if namespace is None: + namespace = {} + + self.__dict__ = namespace + self.func = func + self.args = args + self.keywords = kwds
+ +try: + from _functools import partial +except ImportError: + pass + +# Descriptor version +class partialmethod(object): + """Method descriptor with partial application of the given arguments + and keywords. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(*args, **keywords): + if len(args) >= 2: + self, func, *args = args + elif not args: + raise TypeError("descriptor '__init__' of partialmethod " + "needs an argument") + elif 'func' in keywords: + func = keywords.pop('func') + self, *args = args + import warnings + warnings.warn("Passing 'func' as keyword argument is deprecated", + DeprecationWarning, stacklevel=2) + else: + raise TypeError("type 'partialmethod' takes at least one argument, " + "got %d" % (len(args)-1)) + args = tuple(args) + + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError("{!r} is not callable or a descriptor" + .format(func)) + + # func could be a descriptor like classmethod which isn't callable, + # so we can't inherit from partial (it verifies func is callable) + if isinstance(func, partialmethod): + # flattening is mandatory in order to place cls/self before all + # other arguments + # it's also more efficient since only one function will be called + self.func = func.func + self.args = func.args + args + self.keywords = {**func.keywords, **keywords} + else: + self.func = func + self.args = args + self.keywords = keywords + __init__.__text_signature__ = '($self, func, /, *args, **keywords)' + + def __repr__(self): + args = ", ".join(map(repr, self.args)) + keywords = ", ".join("{}={!r}".format(k, v) + for k, v in self.keywords.items()) + format_string = "{module}.{cls}({func}, {args}, {keywords})" + return format_string.format(module=self.__class__.__module__, + cls=self.__class__.__qualname__, + func=self.func, + args=args, + keywords=keywords) + + def _make_unbound_method(self): + def _method(cls_or_self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(cls_or_self, *self.args, *args, **keywords) + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method._partialmethod = self + return _method + + def __get__(self, obj, cls=None): + get = getattr(self.func, "__get__", None) + result = None + if get is not None: + new_func = get(obj, cls) + if new_func is not self.func: + # Assume __get__ returning something new indicates the + # creation of an appropriate callable + result = partial(new_func, *self.args, **self.keywords) + try: + result.__self__ = new_func.__self__ + except AttributeError: + pass + if result is None: + # If the underlying descriptor didn't do anything, treat this + # like an instance method + result = self._make_unbound_method().__get__(obj, cls) + return result + + @property + def __isabstractmethod__(self): + return getattr(self.func, "__isabstractmethod__", False) + +# Helper functions + +def _unwrap_partial(func): + while isinstance(func, partial): + func = func.func + return func + +################################################################################ +### LRU Cache function decorator +################################################################################ + +_CacheInfo = namedtuple("CacheInfo", ["hits", "misses", "maxsize", "currsize"]) + +class _HashedSeq(list): + """ This class guarantees that hash() will be called no more than once + per element. This is important because the lru_cache() will hash + the key multiple times on a cache miss. 
+ + """ + + __slots__ = 'hashvalue' + + def __init__(self, tup, hash=hash): + self[:] = tup + self.hashvalue = hash(tup) + + def __hash__(self): + return self.hashvalue + +def _make_key(args, kwds, typed, + kwd_mark = (object(),), + fasttypes = {int, str}, + tuple=tuple, type=type, len=len): + """Make a cache key from optionally typed positional and keyword arguments + + The key is constructed in a way that is flat as possible rather than + as a nested structure that would take more memory. + + If there is only a single argument and its data type is known to cache + its hash value, then that argument is returned without a wrapper. This + saves space and improves lookup speed. + + """ + # All of code below relies on kwds preserving the order input by the user. + # Formerly, we sorted() the kwds before looping. The new way is *much* + # faster; however, it means that f(x=1, y=2) will now be treated as a + # distinct call from f(y=2, x=1) which will be cached separately. + key = args + if kwds: + key += kwd_mark + for item in kwds.items(): + key += item + if typed: + key += tuple(type(v) for v in args) + if kwds: + key += tuple(type(v) for v in kwds.values()) + elif len(key) == 1 and type(key[0]) in fasttypes: + return key[0] + return _HashedSeq(key) + +def lru_cache(maxsize=128, typed=False): + """Least-recently-used cache decorator. + + If *maxsize* is set to None, the LRU features are disabled and the cache + can grow without bound. + + If *typed* is True, arguments of different types will be cached separately. + For example, f(3.0) and f(3) will be treated as distinct calls with + distinct results. + + Arguments to the cached function must be hashable. + + View the cache statistics named tuple (hits, misses, maxsize, currsize) + with f.cache_info(). Clear the cache and statistics with f.cache_clear(). + Access the underlying function with f.__wrapped__. + + See: http://en.wikipedia.org/wiki/Cache_replacement_policies#Least_recently_used_(LRU) + + """ + + # Users should only access the lru_cache through its public API: + # cache_info, cache_clear, and f.__wrapped__ + # The internals of the lru_cache are encapsulated for thread safety and + # to allow the implementation to change (including a possible C version). 
+ + if isinstance(maxsize, int): + # Negative maxsize is treated as 0 + if maxsize < 0: + maxsize = 0 + elif callable(maxsize) and isinstance(typed, bool): + # The user_function was passed in directly via the maxsize argument + user_function, maxsize = maxsize, 128 + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + elif maxsize is not None: + raise TypeError( + 'Expected first argument to be an integer, a callable, or None') + + def decorating_function(user_function): + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + + return decorating_function + +def _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo): + # Constants shared by all lru cache instances: + sentinel = object() # unique object used to signal cache misses + make_key = _make_key # build a key from the function arguments + PREV, NEXT, KEY, RESULT = 0, 1, 2, 3 # names for the link fields + + cache = {} + hits = misses = 0 + full = False + cache_get = cache.get # bound method to lookup a key or return None + cache_len = cache.__len__ # get cache size without calling len() + lock = RLock() # because linkedlist updates aren't threadsafe + root = [] # root of the circular doubly linked list + root[:] = [root, root, None, None] # initialize by pointing to self + + if maxsize == 0: + + def wrapper(*args, **kwds): + # No caching -- just a statistics update + nonlocal misses + misses += 1 + result = user_function(*args, **kwds) + return result + + elif maxsize is None: + + def wrapper(*args, **kwds): + # Simple caching without ordering or size limit + nonlocal hits, misses + key = make_key(args, kwds, typed) + result = cache_get(key, sentinel) + if result is not sentinel: + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + cache[key] = result + return result + + else: + + def wrapper(*args, **kwds): + # Size limited caching that tracks accesses by recency + nonlocal root, hits, misses, full + key = make_key(args, kwds, typed) + with lock: + link = cache_get(key) + if link is not None: + # Move the link to the front of the circular queue + link_prev, link_next, _key, result = link + link_prev[NEXT] = link_next + link_next[PREV] = link_prev + last = root[PREV] + last[NEXT] = root[PREV] = link + link[PREV] = last + link[NEXT] = root + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + with lock: + if key in cache: + # Getting here means that this same key was added to the + # cache while the lock was released. Since the link + # update is already done, we need only return the + # computed result and update the count of misses. + pass + elif full: + # Use the old root to store the new key and result. + oldroot = root + oldroot[KEY] = key + oldroot[RESULT] = result + # Empty the oldest link and make it the new root. + # Keep a reference to the old key and old result to + # prevent their ref counts from going to zero during the + # update. That will prevent potentially arbitrary object + # clean-up code (i.e. __del__) from running while we're + # still adjusting the links. + root = oldroot[NEXT] + oldkey = root[KEY] + oldresult = root[RESULT] + root[KEY] = root[RESULT] = None + # Now update the cache dictionary. + del cache[oldkey] + # Save the potentially reentrant cache[key] assignment + # for last, after the root and links have been put in + # a consistent state. 
+ cache[key] = oldroot + else: + # Put result in a new link at the front of the queue. + last = root[PREV] + link = [last, root, key, result] + last[NEXT] = root[PREV] = cache[key] = link + # Use the cache_len bound method instead of the len() function + # which could potentially be wrapped in an lru_cache itself. + full = (cache_len() >= maxsize) + return result + + def cache_info(): + """Report cache statistics""" + with lock: + return _CacheInfo(hits, misses, maxsize, cache_len()) + + def cache_clear(): + """Clear the cache and cache statistics""" + nonlocal hits, misses, full + with lock: + cache.clear() + root[:] = [root, root, None, None] + hits = misses = 0 + full = False + + wrapper.cache_info = cache_info + wrapper.cache_clear = cache_clear + return wrapper + +try: + from _functools import _lru_cache_wrapper +except ImportError: + pass + + +################################################################################ +### singledispatch() - single-dispatch generic function decorator +################################################################################ + +def _c3_merge(sequences): + """Merges MROs in *sequences* to a single MRO using the C3 algorithm. + + Adapted from http://www.python.org/download/releases/2.3/mro/. + + """ + result = [] + while True: + sequences = [s for s in sequences if s] # purge empty sequences + if not sequences: + return result + for s1 in sequences: # find merge candidates among seq heads + candidate = s1[0] + for s2 in sequences: + if candidate in s2[1:]: + candidate = None + break # reject the current head, it appears later + else: + break + if candidate is None: + raise RuntimeError("Inconsistent hierarchy") + result.append(candidate) + # remove the chosen candidate + for seq in sequences: + if seq[0] == candidate: + del seq[0] + +def _c3_mro(cls, abcs=None): + """Computes the method resolution order using extended C3 linearization. + + If no *abcs* are given, the algorithm works exactly like the built-in C3 + linearization used for method resolution. + + If given, *abcs* is a list of abstract base classes that should be inserted + into the resulting MRO. Unrelated ABCs are ignored and don't end up in the + result. The algorithm inserts ABCs where their functionality is introduced, + i.e. issubclass(cls, abc) returns True for the class itself but returns + False for all its direct base classes. Implicit ABCs for a given class + (either registered or inferred from the presence of a special method like + __len__) are inserted directly after the last ABC explicitly listed in the + MRO of said class. If two implicit ABCs end up next to each other in the + resulting MRO, their ordering depends on the order of types in *abcs*. + + """ + for i, base in enumerate(reversed(cls.__bases__)): + if hasattr(base, '__abstractmethods__'): + boundary = len(cls.__bases__) - i + break # Bases up to the last explicit ABC are considered first. + else: + boundary = 0 + abcs = list(abcs) if abcs else [] + explicit_bases = list(cls.__bases__[:boundary]) + abstract_bases = [] + other_bases = list(cls.__bases__[boundary:]) + for base in abcs: + if issubclass(cls, base) and not any( + issubclass(b, base) for b in cls.__bases__ + ): + # If *cls* is the class that introduces behaviour described by + # an ABC *base*, insert said ABC to its MRO. 
+ abstract_bases.append(base) + for base in abstract_bases: + abcs.remove(base) + explicit_c3_mros = [_c3_mro(base, abcs=abcs) for base in explicit_bases] + abstract_c3_mros = [_c3_mro(base, abcs=abcs) for base in abstract_bases] + other_c3_mros = [_c3_mro(base, abcs=abcs) for base in other_bases] + return _c3_merge( + [[cls]] + + explicit_c3_mros + abstract_c3_mros + other_c3_mros + + [explicit_bases] + [abstract_bases] + [other_bases] + ) + +def _compose_mro(cls, types): + """Calculates the method resolution order for a given class *cls*. + + Includes relevant abstract base classes (with their respective bases) from + the *types* iterable. Uses a modified C3 linearization algorithm. + + """ + bases = set(cls.__mro__) + # Remove entries which are already present in the __mro__ or unrelated. + def is_related(typ): + return (typ not in bases and hasattr(typ, '__mro__') + and issubclass(cls, typ)) + types = [n for n in types if is_related(n)] + # Remove entries which are strict bases of other entries (they will end up + # in the MRO anyway. + def is_strict_base(typ): + for other in types: + if typ != other and typ in other.__mro__: + return True + return False + types = [n for n in types if not is_strict_base(n)] + # Subclasses of the ABCs in *types* which are also implemented by + # *cls* can be used to stabilize ABC ordering. + type_set = set(types) + mro = [] + for typ in types: + found = [] + for sub in typ.__subclasses__(): + if sub not in bases and issubclass(cls, sub): + found.append([s for s in sub.__mro__ if s in type_set]) + if not found: + mro.append(typ) + continue + # Favor subclasses with the biggest number of useful bases + found.sort(key=len, reverse=True) + for sub in found: + for subcls in sub: + if subcls not in mro: + mro.append(subcls) + return _c3_mro(cls, abcs=mro) + +def _find_impl(cls, registry): + """Returns the best matching implementation from *registry* for type *cls*. + + Where there is no registered implementation for a specific type, its method + resolution order is used to find a more generic implementation. + + Note: if *registry* does not contain an implementation for the base + *object* type, this function may return None. + + """ + mro = _compose_mro(cls, registry.keys()) + match = None + for t in mro: + if match is not None: + # If *match* is an implicit ABC but there is another unrelated, + # equally matching implicit ABC, refuse the temptation to guess. + if (t in registry and t not in cls.__mro__ + and match not in cls.__mro__ + and not issubclass(match, t)): + raise RuntimeError("Ambiguous dispatch: {} or {}".format( + match, t)) + break + if t in registry: + match = t + return registry.get(match) + +def singledispatch(func): + """Single-dispatch generic function decorator. + + Transforms a function into a generic function, which can have different + behaviours depending upon the type of its first argument. The decorated + function acts as the default implementation, and additional + implementations can be registered using the register() attribute of the + generic function. + """ + # There are many programs that use functools without singledispatch, so we + # trade-off making singledispatch marginally slower for the benefit of + # making start-up of such applications slightly faster. 
+ import types, weakref + + registry = {} + dispatch_cache = weakref.WeakKeyDictionary() + cache_token = None + + def dispatch(cls): + """generic_func.dispatch(cls) -> <function implementation> + + Runs the dispatch algorithm to return the best available implementation + for the given *cls* registered on *generic_func*. + + """ + nonlocal cache_token + if cache_token is not None: + current_token = get_cache_token() + if cache_token != current_token: + dispatch_cache.clear() + cache_token = current_token + try: + impl = dispatch_cache[cls] + except KeyError: + try: + impl = registry[cls] + except KeyError: + impl = _find_impl(cls, registry) + dispatch_cache[cls] = impl + return impl + + def register(cls, func=None): + """generic_func.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_func*. + + """ + nonlocal cache_token + if func is None: + if isinstance(cls, type): + return lambda f: register(cls, f) + ann = getattr(cls, '__annotations__', {}) + if not ann: + raise TypeError( + f"Invalid first argument to `register()`: {cls!r}. " + f"Use either `@register(some_class)` or plain `@register` " + f"on an annotated function." + ) + func = cls + + # only import typing if annotation parsing is necessary + from typing import get_type_hints + argname, cls = next(iter(get_type_hints(func).items())) + if not isinstance(cls, type): + raise TypeError( + f"Invalid annotation for {argname!r}. " + f"{cls!r} is not a class." + ) + registry[cls] = func + if cache_token is None and hasattr(cls, '__abstractmethods__'): + cache_token = get_cache_token() + dispatch_cache.clear() + return func + + def wrapper(*args, **kw): + if not args: + raise TypeError(f'{funcname} requires at least ' + '1 positional argument') + + return dispatch(args[0].__class__)(*args, **kw) + + funcname = getattr(func, '__name__', 'singledispatch function') + registry[object] = func + wrapper.register = register + wrapper.dispatch = dispatch + wrapper.registry = types.MappingProxyType(registry) + wrapper._clear_cache = dispatch_cache.clear + update_wrapper(wrapper, func) + return wrapper + + +# Descriptor version +class singledispatchmethod: + """Single-dispatch generic method descriptor. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(self, func): + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError(f"{func!r} is not callable or a descriptor") + + self.dispatcher = singledispatch(func) + self.func = func + + def register(self, cls, method=None): + """generic_method.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_method*. 
+ """ + return self.dispatcher.register(cls, func=method) + + def __get__(self, obj, cls=None): + def _method(*args, **kwargs): + method = self.dispatcher.dispatch(args[0].__class__) + return method.__get__(obj, cls)(*args, **kwargs) + + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method.register = self.register + update_wrapper(_method, self.func) + return _method + + @property + def __isabstractmethod__(self): + return getattr(self.func, '__isabstractmethod__', False) + + +################################################################################ +### cached_property() - computed once per instance, cached as attribute +################################################################################ + +_NOT_FOUND = object() + + +class cached_property: + def __init__(self, func): + self.func = func + self.attrname = None + self.__doc__ = func.__doc__ + self.lock = RLock() + + def __set_name__(self, owner, name): + if self.attrname is None: + self.attrname = name + elif name != self.attrname: + raise TypeError( + "Cannot assign the same cached_property to two different names " + f"({self.attrname!r} and {name!r})." + ) + + def __get__(self, instance, owner=None): + if instance is None: + return self + if self.attrname is None: + raise TypeError( + "Cannot use cached_property instance without calling __set_name__ on it.") + try: + cache = instance.__dict__ + except AttributeError: # not all objects have __dict__ (e.g. class defines slots) + msg = ( + f"No '__dict__' attribute on {type(instance).__name__!r} " + f"instance to cache {self.attrname!r} property." + ) + raise TypeError(msg) from None + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + with self.lock: + # check if another thread filled cache while we awaited lock + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + val = self.func(instance) + try: + cache[self.attrname] = val + except TypeError: + msg = ( + f"The '__dict__' attribute on {type(instance).__name__!r} instance " + f"does not support item assignment for caching {self.attrname!r} property." + ) + raise TypeError(msg) from None + return val +
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/_modules/index.html b/branch/bicounty_dev/_modules/index.html new file mode 100644 index 0000000..19f96a2 --- /dev/null +++ b/branch/bicounty_dev/_modules/index.html @@ -0,0 +1,116 @@ + + + + + + Overview: module code — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
+
+
+ +
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/_modules/lasso/logger/index.html b/branch/bicounty_dev/_modules/lasso/logger/index.html new file mode 100644 index 0000000..1561ede --- /dev/null +++ b/branch/bicounty_dev/_modules/lasso/logger/index.html @@ -0,0 +1,153 @@ + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.logger

+import logging
+
+__all__ = ["WranglerLogger", "setupLogging"]
+
+
+# for all the Wrangler logging needs!
+WranglerLogger = logging.getLogger("WranglerLogger")
+
+
+
[docs]def setupLogging(infoLogFilename, debugLogFilename, logToConsole=True): + """Sets up the logger. The infoLog is terse, just gives the bare minimum of details + so the network composition will be clear later. + The debuglog is very noisy, for debugging. + + Pass none to either. + Spews it all out to console too, if logToConsole is true. + """ + # clear handlers if any exist already + WranglerLogger.handlers = [] + + # create a logger + WranglerLogger.setLevel(logging.DEBUG) + + if infoLogFilename: + infologhandler = logging.StreamHandler(open(infoLogFilename, "w")) + infologhandler.setLevel(logging.INFO) + infologhandler.setFormatter( + logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s") + ) + WranglerLogger.addHandler(infologhandler) + + if debugLogFilename: + debugloghandler = logging.StreamHandler(open(debugLogFilename, "w")) + debugloghandler.setLevel(logging.DEBUG) + debugloghandler.setFormatter( + logging.Formatter("%(asctime)s %(levelname)s %(message)s", "%Y-%m-%d %H:%M") + ) + WranglerLogger.addHandler(debugloghandler) + + if logToConsole: + consolehandler = logging.StreamHandler() + consolehandler.setLevel(logging.DEBUG) + consolehandler.setFormatter( + logging.Formatter("%(name)-12s: %(levelname)-8s %(message)s") + ) + WranglerLogger.addHandler(consolehandler)
+
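A minimal usage sketch of the function above (the log file names are hypothetical):

from lasso.logger import WranglerLogger, setupLogging

setupLogging("lasso_info.log", "lasso_debug.log", logToConsole=True)
WranglerLogger.info("starting network build")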
+ +
+
+
+ +
+ +
+

+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/_modules/lasso/parameters/index.html b/branch/bicounty_dev/_modules/lasso/parameters/index.html new file mode 100644 index 0000000..7b8be0d --- /dev/null +++ b/branch/bicounty_dev/_modules/lasso/parameters/index.html @@ -0,0 +1,1060 @@ + + + + + + lasso.parameters — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +

Source code for lasso.parameters

+import os
+import pyproj
+from .logger import WranglerLogger
+
+
+from pyproj import CRS
+
+
+def get_base_dir(lasso_base_dir=os.getcwd()):
+    d = lasso_base_dir
+    for i in range(3):
+        if "metcouncil_data" in os.listdir(d):
+
+            WranglerLogger.info("Lasso base directory set as: {}".format(d))
+            return d
+        d = os.path.dirname(d)
+
+    msg = "Cannot find Lasso base directory from {}, please input using keyword in parameters: `lasso_base_dir =` ".format(
+        lasso_base_dir
+    )
+    WranglerLogger.error(msg)
+    raise (ValueError(msg))
+
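+# Illustrative note (added; not part of the original module): get_base_dir() walks up
+# to three directory levels from `lasso_base_dir` looking for a "metcouncil_data"
+# folder and returns the first directory that contains it, e.g. (hypothetical path):
+#
+#     base = get_base_dir(lasso_base_dir="/path/to/lasso/checkout")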
+
+
[docs]class Parameters: + """A class representing all the parameters defining the networks + including time of day, categories, etc. + + Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + .. highlight:: python + + Attr: + time_period_to_time (dict): Maps time period abbreviations used in + Cube to time of days used on gtfs and highway network standard + Default: + :: + { + "EA": ("3:00", "6:00"), + "AM": ("6:00, "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + cube_time_periods (dict): Maps cube time period numbers used in + transit line files to the time period abbreviations in time_period_to_time + dictionary. + Default: + :: + {"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"} + categories (dict): Maps demand category abbreviations to a list of + network categories they are allowed to use. + Default: + :: + { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + properties_to_split (dict): Dictionary mapping variables in standard + roadway network to categories and time periods that need to be + split out in final model network to get variables like LANES_AM. + Default: + :: + { + "lanes": { + "v": "lanes", + "time_periods": self.time_periods_to_time + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_periods_to_time + }, + "use": { + "v": "use", + "time_periods": self.time_periods_to_time + }, + } + + county_shape (str): File location of shapefile defining counties. + Default: + :: + r"metcouncil_data/county/cb_2017_us_county_5m.shp" + + county_variable_shp (str): Property defining the county n ame in + the county_shape file. + Default: + :: + NAME + lanes_lookup_file (str): Lookup table of number of lanes for different data sources. + Default: + :: + r"metcouncil_data/lookups/lanes.csv" + centroid_connect_lanes (int): Number of lanes for centroid connectors. + Default: + :: + 1 + mpo_counties (list): list of county names within MPO boundary. + Default: + :: + [ + "ANOKA", + "DAKOTA", + "HENNEPIN", + "RAMSEY", + "SCOTT", + "WASHINGTON", + "CARVER", + ] + + taz_shape (str): + Default: + :: + r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp" + taz_data (str): + Default: + :: + ?? + highest_taz_number (int): highest TAZ number in order to define + centroid connectors. + Default: + :: + 3100 + + output_variables (list): list of variables to output in final model + network. 
+ Default: + :: + [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + "distance", + "roadway", + "name", + "roadway_class", + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "trn_priority_EA", + "trn_priority_AM", + "trn_priority_MD", + "trn_priority_PM", + "trn_priority_EV", + "ttime_assert_EA", + "ttime_assert_AM", + "ttime_assert_MD", + "ttime_assert_PM", + "ttime_assert_EV", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "price_sov_EA", + "price_hov2_EA", + "price_hov3_EA", + "price_truck_EA", + "price_sov_AM", + "price_hov2_AM", + "price_hov3_AM", + "price_truck_AM", + "price_sov_MD", + "price_hov2_MD", + "price_hov3_MD", + "price_truck_MD", + "price_sov_PM", + "price_hov2_PM", + "price_hov3_PM", + "price_truck_PM", + "price_sov_EV", + "price_hov2_EV", + "price_hov3_EV", + "price_truck_EV", + "roadway_class_idx", + "facility_type", + "county", + "centroidconnect", + "model_node_id", + "N", + "osm_node_id", + "bike_node", + "transit_node", + "walk_node", + "drive_node", + "geometry", + "X", + "Y", + "ML_lanes_EA", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "segment_id", + "managed", + "bus_only", + "rail_only" + ] + + osm_facility_type_dict (dict): Mapping between OSM Roadway variable + and facility type. Default: + + area_type_shape (str): Location of shapefile defining area type. + Default: + :: + r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp" + area_type_variable_shp (str): property in area_type_shape with area + type in it. + Default: + :: + "COMDES2040" + area_type_code_dict (dict): Mapping of the area_type_variable_shp to + the area type code used in the MetCouncil cube network. + Default: + :: + { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + downtown_area_type_shape (str): Location of shapefile defining downtown area type. + Default: + :: + r"metcouncil_data/area_type/downtownzones_TAZ.shp" + downtown_area_type (int): Area type integer for downtown. + Default: + :: + 5 + mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp" + mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp + associated with roadway class. Default: + :: + "ROUTE_SYS" + widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp" + widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape + associated with roadway class.Default: + :: + "RDWY_CTGY_" + mndot_count_shape (str): Shapefile of MnDOT links with a property + associated with counts. Default: + :: + r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp" + mndot_count_variable_shp (str): The property in mndot_count_shape + associated with counts. Default: + + :: + "lookups/osm_highway_facility_type_crosswalk.csv" + legacy_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from Legacy TM2 network. Default: + :: + "lookups/legacy_tm2_attributes.csv" + osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId + from OSM. Default: + :: + "lookups/osm_lanes_attributes.csv" + tam_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from TAM TM2 network. 
Default: + :: + "lookups/tam_tm2_attributes.csv" + tom_tom_attributes (str): CSV file of link attributes by + shStReferenceId from TomTom network. Default: + :: + "lookups/tomtom_attributes.csv" + sfcta_attributes (str): CSV file of link attributes by + shStReferenceId from SFCTA network. Default: + :: + "lookups/sfcta_attributes.csv" + output_epsg (int): EPSG type of geographic projection for output + shapefiles. Default: + :: + 102646 + output_link_shp (str): Output shapefile for roadway links. Default: + :: + r"tests/scratch/links.shp" + output_node_shp (str): Output shapefile for roadway nodes. Default: + :: + r"tests/scratch/nodes.shp" + output_link_csv (str): Output csv for roadway links. Default: + :: + r"tests/scratch/links.csv" + output_node_csv (str): Output csv for roadway nodes. Default: + :: + r"tests/scratch/nodes.csv" + output_link_txt (str): Output fixed format txt for roadway links. Default: + :: + r"tests/scratch/links.txt" + output_node_txt (str): Output fixed format txt for roadway nodes. Default: + :: + r"tests/scratch/nodes.txt" + output_link_header_width_txt (str): Header for txt roadway links. Default: + :: + r"tests/scratch/links_header_width.txt" + output_node_header_width_txt (str): Header for txt for roadway Nodes. Default: + :: + r"tests/scratch/nodes_header_width.txt" + output_cube_network_script (str): Cube script for importing + fixed-format roadway network. Default: + :: + r"tests/scratch/make_complete_network_from_fixed_width_file.s + + + + """ + +
[docs] def __init__(self, **kwargs): + """ + Time period and category splitting info + """ + if "time_periods_to_time" in kwargs: + self.time_periods_to_time = kwargs.get("time_periods_to_time") + else: + self.time_period_to_time = { + "EA": ("3:00", "6:00"), + "AM": ("6:00", "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + + #MTC + self.cube_time_periods = { + "1": "EA", + "2": "AM", + "3": "MD", + "4": "PM", + "5": "EV", + } + + """ + #MC + self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7} + + self.route_type_mode_dict = {0: 8, 2: 9} + + self.cube_time_periods = {"1": "AM", "2": "MD"} + self.cube_time_periods_name = {"AM": "pk", "MD": "op"} + """ + if "categories" in kwargs: + self.categories = kwargs.get("categories") + else: + self.categories = { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + + # prefix, source variable, categories + self.properties_to_split = { + "lanes": { + "v": "lanes", + "time_periods": self.time_period_to_time, + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_period_to_time, + }, + "useclass": { + "v": "useclass", + "time_periods": self.time_period_to_time, + }, + } + + """ + Details for calculating the county based on the centroid of the link. + The NAME varible should be the name of a field in shapefile. + """ + #MTC + if 'lasso_base_dir' in kwargs: + self.base_dir = get_base_dir(lasso_base_dir = kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + + if 'data_file_location' in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "mtc_data") + + #MC + if "lasso_base_dir" in kwargs: + self.base_dir = get_base_dir(lasso_base_dir=kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + """ + if "data_file_location" in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "metcouncil_data") + """ + + #-------- + if "settings_location" in kwargs: + self.settings_location = kwargs.get("settings_location") + else: + self.settings_location = os.path.join(self.base_dir, "examples", "settings") + + if "scratch_location" in kwargs: + self.scratch_location = kwargs.get("scratch_location") + else: + self.scratch_location = os.path.join(self.base_dir, "tests", "scratch") + + ### COUNTIES + + self.county_shape = os.path.join( + self.data_file_location, "county", "county.shp" + ) + self.county_variable_shp = "NAME" + + #MTC + self.county_code_dict = { + 'San Francisco':1, + 'San Mateo':2, + 'Santa Clara':3, + 'Alameda':4, + 'Contra Costa':5, + 'Solano':6, + 'Napa':7, + 'Sonoma':8, + 'Marin':9, + 'San Joaquin':11, + 'External':10, + } + + self.county_centroid_range_dict = { + 'San Francisco':range(1,100000), + 'San Mateo':range(100001,200000), + 'Santa Clara':range(200001,300000), + 'Alameda':range(300001,400000), + 'Contra Costa':range(400001,500000), + 'Solano':range(500001,600000), + 'Napa':range(600001,700000), + 'Sonoma':range(700001,800000), + 'Marin':range(800001,900000), + 'External':range(900001,1000000) + } + + self.county_node_range_dict = { + 'San Francisco':range(1000000,1500000), + 'San Mateo':range(1500000,2000000), + 'Santa Clara':range(2000000,2500000), + 'Alameda':range(2500000,3000000), + 'Contra Costa':range(3000000,3500000), + 
'Solano':range(3500000,4000000), + 'Napa':range(4000000,4500000), + 'Sonoma':range(4500000,5000000), + 'Marin':range(5000000,5500000), + } + + self.county_hov_node_range_dict = { + 'San Francisco':range(5500000,6000000), + 'San Mateo':range(6000000,6500000), + 'Santa Clara':range(6500000,7000000), + 'Alameda':range(7000000,7500000), + 'Contra Costa':range(7500000,8000000), + 'Solano':range(8000000,8500000), + 'Napa':range(8500000,9000000), + 'Sonoma':range(9000000,9500000), + 'Marin':range(9500000,10000000), + } + + self.county_link_range_dict = { + 'San Francisco':range(1,1000000), + 'San Mateo':range(1000000,2000000), + 'Santa Clara':range(2000000,3000000), + 'Alameda':range(3000000,4000000), + 'Contra Costa':range(4000000,5000000), + 'Solano':range(5000000,6000000), + 'Napa':range(6000000,7000000), + 'Sonoma':range(7000000,8000000), + 'Marin':range(8000000,9000000) + } + + #MC + """ + self.county_code_dict = { + "Anoka": 1, + "Carver": 2, + "Dakota": 3, + "Hennepin": 4, + "Ramsey": 5, + "Scott": 6, + "Washington": 7, + "external": 10, + "Chisago": 11, + "Goodhue": 12, + "Isanti": 13, + "Le Sueur": 14, + "McLeod": 15, + "Pierce": 16, + "Polk": 17, + "Rice": 18, + "Sherburne": 19, + "Sibley": 20, + "St. Croix": 21, + "Wright": 22, + } + """ + + self.mpo_counties = [ + 1, + 3, + 4, + 5, + 6, + 7, + 8, + 9 + ] + + self.taz_N_list = list(range(1, 10000)) + list(range(100001, 110000)) + list(range(200001, 210000)) + list(range(300001, 310000))\ + + list(range(400001, 410000)) + list(range(500001, 510000)) + list(range(600001, 610000)) + list(range(700001, 710000))\ + + list(range(800001, 810000)) + list(range(900001, 1000000)) + + self.maz_N_list = list(range(10001, 90000)) + list(range(110001, 190000)) + list(range(210001, 290000)) + list(range(310001, 390000))\ + + list(range(410001, 490000)) + list(range(510001, 590000)) + list(range(610001, 690000)) + list(range(710001, 790000))\ + + list(range(810001, 890000)) + + self.tap_N_list = list(range(90001, 99999)) + list(range(190001, 199999)) + list(range(290001, 299999)) + list(range(390001, 399999))\ + + list(range(490001, 499999)) + list(range(590001, 599999)) + list(range(690001, 699999)) + list(range(790001, 799999))\ + + list(range(890001, 899999)) + + self.tap_N_start = { + "San Francisco" : 90001, + "San Mateo" : 190001, + "Santa Clara" : 290001, + "Alameda" : 390001, + "Contra Costa" : 490001, + "Solano" : 590001, + "Napa" : 690001, + "Sonoma" : 790001, + "Marin" : 890001 + } + + #MTC + self.osm_facility_type_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_facility_type_crosswalk.csv" + ) + #MC + ### Lanes + self.lanes_lookup_file = os.path.join( + self.data_file_location, "lookups", "lanes.csv" + ) + + ### TAZS + + self.taz_shape = os.path.join( + self.data_file_location, "TAZ", "TAZOfficialWCurrentForecasts.shp" + ) + ###### + #MTC + self.osm_lanes_attributes = os.path.join( + self.data_file_location, "lookups", "osm_lanes_attributes.csv" + ) + + self.legacy_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "legacy_tm2_attributes.csv" + ) + + self.assignable_analysis = os.path.join( + self.data_file_location, "lookups", "assignable_analysis_links.csv" + ) + ### + ### AREA TYPE - MC + self.area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "ThriveMSP2040CommunityDesignation.shp", + ) + self.area_type_variable_shp = "COMDES2040" + # area type map from raw data to model category + + # source 
https://metrocouncil.org/Planning/Publications-And-Resources/Thrive-MSP-2040-Plan-(1)/7_ThriveMSP2040_LandUsePoliciesbyCD.aspx + # urban center + # urban + # suburban + # suburban edge + # emerging suburban edge + # rural center + # diversified rural + # rural residential + # agricultural + self.area_type_code_dict = { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + + self.downtown_area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "downtownzones_TAZ.shp", + ) + + self.downtown_area_type = int(5) + + self.centroid_connect_lanes = int(1) + + self.osm_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_asgngrp_crosswalk.csv" + ) + self.mrcc_roadway_class_shape = os.path.join( + self.data_file_location, "mrcc", "trans_mrcc_centerlines.shp" + ) + #### + ###MTC + self.tam_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "tam_tm2_attributes.csv" + ) + + self.sfcta_attributes = os.path.join( + self.data_file_location, "lookups", "sfcta_attributes.csv" + ) + + self.tomtom_attributes = os.path.join( + self.data_file_location, "lookups", "tomtom_attributes.csv" + ) + + self.pems_attributes = os.path.join( + self.data_file_location, "lookups", "pems_attributes.csv" + ) + + self.centroid_file = os.path.join( + self.data_file_location, "centroid", "centroid_node.pickle" + ) + #### + ###MC + self.widot_shst_data = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "widot.out.matched.geojson", + ) + #### + + self.centroid_connector_link_file = os.path.join( + self.data_file_location, "centroid", "cc_link.pickle" + ) + + self.centroid_connector_shape_file = os.path.join( + self.data_file_location, "centroid", "cc_shape.pickle" + ) + + self.tap_file = os.path.join( + self.data_file_location, "tap", "tap_node.pickle" + ) + + self.tap_connector_link_file = os.path.join( + self.data_file_location, "tap", "tap_link.pickle" + ) + + self.tap_connector_shape_file = os.path.join( + self.data_file_location, "tap", "tap_shape.pickle" + ) + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + ###MTC + self.log_to_net_crosswalk = os.path.join(self.settings_location, "log_to_net.csv") + + self.emme_name_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "emme_attribute_names.csv" + ) + #### + #MC + self.mndot_count_variable_shp = "AADT_mn" + + self.widot_county_shape = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "TRADAS_(counts).shp", + ) + ### + ###MTC + self.mode_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "gtfs_to_tm2_mode_crosswalk.csv" + ) + + self.veh_cap_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "transitSeatCap.csv" + ) + + self.faresystem_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "faresystem_crosswalk.txt" + ) + + # https://app.asana.com/0/12291104512575/1200287255197808/f + self.fare_2015_to_2010_deflator = 0.927 + #### + #MC + self.widot_count_variable_shp = "AADT_wi" + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + self.log_to_net_crosswalk = os.path.join( + self.settings_location, "log_to_net.csv" + ) + + self.subregion_boundary_file = os.path.join( + self.data_file_location, 'emme', 'subregion_boundary_for_active_modes.shp' + ) + + self.subregion_boundary_id_variable = 'subregion' + #### + + self.output_variables = [ + 
"model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + #MTC + 'name', + "distance", + "roadway", + #"name", + #MC + #"shape_id", + #"distance", + #"roadway", + #"name", + #"roadway_class", + #### + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "county", + "model_node_id", + "N", + "osm_node_id", + "geometry", + "X", + "Y", + "segment_id", + "managed", + "bus_only", + "rail_only", + "pnr", + #MTC + "assignable", + "cntype", + "useclass_AM", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "useclass_EA", + "transit", + "tollbooth", + "tollseg", + "ft", + "tap_drive", + "tollbooth", + "tollseg", + "farezone", + "tap_id", + #### + #MC + "bike_facility", + "mrcc_id", + "ROUTE_SYS", # mrcc functional class + #### + #bi-county + "nmt2010", + "nmt2020", + "BRT", + "has_transit" + ] + + self.output_link_shp = os.path.join(self.scratch_location, "links.shp") + self.output_node_shp = os.path.join(self.scratch_location, "nodes.shp") + self.output_link_csv = os.path.join(self.scratch_location, "links.csv") + self.output_node_csv = os.path.join(self.scratch_location, "nodes.csv") + self.output_link_txt = os.path.join(self.scratch_location, "links.txt") + self.output_node_txt = os.path.join(self.scratch_location, "nodes.txt") + self.output_link_header_width_txt = os.path.join( + self.scratch_location, "links_header_width.txt" + ) + self.output_node_header_width_txt = os.path.join( + self.scratch_location, "nodes_header_width.txt" + ) + self.output_cube_network_script = os.path.join( + self.scratch_location, "make_complete_network_from_fixed_width_file.s" + ) + self.output_dir = os.path.join(self.scratch_location) + self.output_proj = CRS("ESRI:102646") + self.output_proj4 = '+proj=lcc +lat_1=32.78333333333333 +lat_2=33.88333333333333 +lat_0=32.16666666666666 +lon_0=-116.25 +x_0=2000000 +y_0=500000.0000000002 +ellps=GRS80 +datum=NAD83 +to_meter=0.3048006096012192 +no_defs' + self.prj_file = os.path.join(self.data_file_location, 'projection', '102646.prj') + self.wkt_projection = 'PROJCS["NAD_1983_StatePlane_California_VI_FIPS_0406_Feet",GEOGCS["GCS_North_American_1983",DATUM["North_American_Datum_1983",SPHEROID["GRS_1980",6378137,298.257222101]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["False_Easting",6561666.666666666],PARAMETER["False_Northing",1640416.666666667],PARAMETER["Central_Meridian",-116.25],PARAMETER["Standard_Parallel_1",32.78333333333333],PARAMETER["Standard_Parallel_2",33.88333333333333],PARAMETER["Latitude_Of_Origin",32.16666666666666],UNIT["Foot_US",0.30480060960121924],AUTHORITY["EPSG","102646"]]' + + self.fare_matrix_output_variables = ["faresystem", "origin_farezone", "destination_farezone", "price"] + + self.zones = 6593 + """ + Create all the possible headway variable combinations based on the cube time periods setting + """ + self.time_period_properties_list = [ + p + "[" + str(t) + "]" + for p in ["HEADWAY", "FREQ"] + for t in self.cube_time_periods.keys() + ] + + self.int_col = [ + "model_link_id", + "model_node_id", + "A", + "B", + #MTC + #"county", + ### + #MC + # "lanes", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "roadway_class", + "assign_group", + #"county", + "area_type", + "trn_priority", + "AADT", + "count_AM", + "count_MD", + "count_PM", + "count_NT", + "count_daily", + "centroidconnect", + "bike_facility", + #### + "drive_access", + "walk_access", + "bike_access", + 
"truck_access", + #MTC + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "ML_lanes_EA", + ### + #MC + "drive_node", + "walk_node", + "bike_node", + "transit_node", + # "ML_lanes", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_NT", + #### + "segment_id", + "managed", + "bus_only", + "rail_only", + "transit", + ##MTC + "ft", + "assignable", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EA", + "lanes_EV", + "useclass_AM", + "useclass_EA", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "tollseg", + "tollbooth", + "farezone", + "tap_id", + #### + #bi-county + "nmt2010", + "nmt2020", + "BRT", + "has_transit" + ] + + self.float_col = [ + "distance", + "price", + "X", + "Y" + "mrcc_id", + ] + + self.float_col = ["distance", "ttime_assert", "price", "X", "Y"] + + self.string_col = [ + "osm_node_id", + "name", + "pnr", + "roadway", + "shstGeometryId", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "ROUTE_SYS", + ] + + # pnr parameters + self.pnr_node_location = os.path.join( + self.data_file_location, "lookups", "pnr_stations.csv" + ) + + self.drive_buffer = 6 + + #self.network_build_crs = CRS("EPSG:2875") + #self.project_card_crs = CRS("EPSG:4326") + #self.transformer = pyproj.Transformer.from_crs( + # self.network_build_crs, self.project_card_crs, always_xy=True + #) + + self.__dict__.update(kwargs)
\ No newline at end of file
diff --git a/branch/bicounty_dev/_modules/lasso/project/index.html b/branch/bicounty_dev/_modules/lasso/project/index.html
new file mode 100644
index 0000000..e1285f4
--- /dev/null
+++ b/branch/bicounty_dev/_modules/lasso/project/index.html
@@ -0,0 +1,1530 @@
+lasso.project — lasso documentation
Source code for lasso.project

+import json
+import os
+import re
+from typing import Any, Dict, Optional, Union, List
+from csv import reader
+
+from pandas.core import base
+
+import numpy as np
+import pandas as pd
+from pandas import DataFrame
+import geopandas as gpd
+
+from network_wrangler import ProjectCard
+from network_wrangler import RoadwayNetwork
+
+from .transit import CubeTransit, StandardTransit
+from .logger import WranglerLogger
+from .parameters import Parameters
+from .roadway import ModelRoadwayNetwork
+from .util import column_name_to_parts
+
+
+
[docs]class Project(object): + """A single or set of changes to the roadway or transit system. + + Compares a base and a build transit network or a base and build + highway network and produces project cards. + + .. highlight:: python + + Typical usage example: + :: + test_project = Project.create_project( + base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + test_project.evaluate_changes() + test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + + Attributes: + DEFAULT_PROJECT_NAME: a class-level constant that defines what + the project name will be if none is set. + STATIC_VALUES: a class-level constant which defines values that + are not evaluated when assessing changes. + card_data (dict): {"project": <project_name>, "changes": <list of change dicts>} + roadway_link_changes (DataFrame): pandas dataframe of CUBE roadway link changes. + roadway_node_changes (DataFrame): pandas dataframe of CUBE roadway node changes. + transit_changes (CubeTransit): + base_roadway_network (RoadwayNetwork): + base_cube_transit_network (CubeTransit): + build_cube_transit_network (CubeTransit): + project_name (str): name of the project, set to DEFAULT_PROJECT_NAME if not provided + parameters: an instance of the Parameters class which sets a bunch of parameters + """ + + DEFAULT_PROJECT_NAME = "USER TO define" + + STATIC_VALUES = [ + "model_link_id", + "area_type", + "county", + # "assign_group", + "centroidconnect", + ] + CALCULATED_VALUES = [ + "area_type", + "county", + "assign_group", + "centroidconnect", + ] + +
[docs] def __init__( + self, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[DataFrame] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = "", + evaluate: Optional[bool] = False, + parameters: Union[dict, Parameters] = {}, + ): + """ + ProjectCard constructor. + + args: + roadway_link_changes: dataframe of roadway changes read from a log file + roadway_node_changes: dataframe of roadway changes read from a log file + transit_changes: dataframe of transit changes read from a log file + base_roadway_network: RoadwayNetwork instance for base case + base_transit_network: StandardTransit instance for base case + base_cube_transit_network: CubeTransit instance for base transit network + build_cube_transit_network: CubeTransit instance for build transit network + project_name: name of the project + evaluate: defaults to false, but if true, will create card data + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + returns: instance of ProjectCard + """ + self.card_data = Dict[str, Dict[str, Any]] + + self.roadway_link_changes = roadway_link_changes + self.roadway_node_changes = roadway_node_changes + self.base_roadway_network = base_roadway_network + self.base_transit_network = base_transit_network + self.base_cube_transit_network = base_cube_transit_network + self.build_cube_transit_network = build_cube_transit_network + self.transit_changes = transit_changes + self.project_name = ( + project_name if project_name else Project.DEFAULT_PROJECT_NAME + ) + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + if base_roadway_network != None: + self.determine_roadway_network_changes_compatibility( + self.base_roadway_network, + self.roadway_link_changes, + self.roadway_node_changes, + self.parameters + ) + + if evaluate: + self.evaluate_changes()
+ +
[docs] def write_project_card(self, filename: str = None): + """ + Writes project cards. + + Args: + filename (str): File path to output .yml + + Returns: + None + """ + ProjectCard(self.card_data).write(out_filename=filename)
+ +
[docs] @staticmethod + def create_project( + roadway_log_file: Union[str, List[str], None] = None, + roadway_shp_file: Optional[str] = None, + roadway_csv_file: Optional[str] = None, + network_build_file: Optional[str] = None, + emme_node_id_crosswalk_file: Optional[str] = None, + emme_name_crosswalk_file: Optional[str] = None, + base_roadway_dir: Optional[str] = None, + base_transit_dir: Optional[str] = None, + base_cube_transit_source: Optional[str] = None, + build_cube_transit_source: Optional[str] = None, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[CubeTransit] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = None, + recalculate_calculated_variables: Optional[bool] = False, + recalculate_distance: Optional[bool] = False, + parameters: Optional[dict] = {}, + **kwargs, + ): + """ + Constructor for a Project instance. + + Args: + roadway_log_file: File path to consuming logfile or a list of logfile paths. + roadway_shp_file: File path to consuming shape file for roadway changes. + roadway_csv_file: File path to consuming csv file for roadway changes. + network_build_file: File path to consuming EMME network build for network changes. + base_roadway_dir: Folder path to base roadway network. + base_transit_dir: Folder path to base transit network. + base_cube_transit_source: Folder path to base transit network or cube line file string. + base_cube_transit_file: File path to base transit network. + build_cube_transit_source: Folder path to build transit network or cube line file string. + build_cube_transit_file: File path to build transit network. + roadway_link_changes: pandas dataframe of CUBE roadway link changes. + roadway_node_changes: pandas dataframe of CUBE roadway node changes. + transit_changes: build transit changes. + base_roadway_network: Base roadway network object. + base_cube_transit_network: Base cube transit network object. + build_cube_transit_network: Build cube transit network object. + project_name: If not provided, will default to the roadway_log_file filename if + provided (or the first filename if a list is provided) + recalculate_calculated_variables: if reading in a base network, if this is true it + will recalculate variables such as area type, etc. This only needs to be true + if you are creating project cards that are changing the calculated variables. + recalculate_distance: recalculate the distance variable. This only needs to be + true if you are creating project cards that change the distance. + parameters: dictionary of parameters + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in + the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables + in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. 
+ managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + + Returns: + A Project instance. + """ + + if base_cube_transit_source and base_cube_transit_network: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_cube_transit_network' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_cube_transit_source: + base_cube_transit_network = CubeTransit.create_from_cube(base_cube_transit_source, parameters) + WranglerLogger.debug( + "Base network has {} lines".format(len(base_cube_transit_network.lines)) + ) + if len(base_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Base network lines: {}".format( + "\n - ".join(base_cube_transit_network.lines) + ) + ) + elif base_cube_transit_network: + pass + else: + msg = "No base cube transit network." + WranglerLogger.info(msg) + base_cube_transit_network = None + + if build_cube_transit_source and transit_changes: + msg = "Method takes only one of 'build_cube_transit_source' and 'transit_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if build_cube_transit_source: + WranglerLogger.debug("build") + build_cube_transit_network = CubeTransit.create_from_cube(build_cube_transit_source, parameters) + WranglerLogger.debug( + "Build network has {} lines".format(len(build_cube_transit_network.lines)) + ) + if len(build_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Build network lines: {}".format( + "\n - ".join(build_cube_transit_network.lines) + ) + ) + elif transit_changes: + pass + else: + msg = "No cube transit changes given or processed." 
+ WranglerLogger.info(msg) + transit_changes = None + + if roadway_log_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_log_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_csv_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_csv_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_shp_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_shp_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and not project_name: + if type(roadway_log_file) == list: + project_name = os.path.splitext(os.path.basename(roadway_log_file[0]))[ + 0 + ] + WranglerLogger.info( + "No Project Name - Using name of first log file in list" + ) + else: + project_name = os.path.splitext(os.path.basename(roadway_log_file))[0] + WranglerLogger.info("No Project Name - Using name of log file") + if network_build_file and not project_name: + if type(network_build_file) == list: + with open(network_build_file[0]) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info( + "No Project Name - Using metadata of first network build file in list" + ) + else: + with open(network_build_file) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info("No Project Name - Using metadata of network build file") + if roadway_log_file: + roadway_link_changes, roadway_node_changes = Project.read_logfile(roadway_log_file) + elif roadway_shp_file: + roadway_changes = gpd.read_file(roadway_shp_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_link_changes = DataFrame(roadway_link_changes.drop("geometry", axis=1)) + roadway_node_changes = DataFrame(roadway_node_changes.drop("geometry", axis=1)) + roadway_node_changes["model_node_id"] = 0 + elif roadway_csv_file: + roadway_changes = pd.read_csv(roadway_csv_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_node_changes["model_node_id"] = 0 + elif network_build_file: + roadway_link_changes, roadway_node_changes, transit_changes = Project.read_network_build_file(network_build_file) + if emme_node_id_crosswalk_file: + # get wrangler IDs from emme element_id + roadway_link_changes, roadway_node_changes, transit_changes = Project.emme_id_to_wrangler_id( + 
roadway_link_changes, + roadway_node_changes, + transit_changes, + emme_node_id_crosswalk_file + ) + else: + msg = "User needs to specify emme node id crosswalk file using emme_node_id_crosswalk_file = " + WranglerLogger.error(msg) + raise ValueError(msg) + # rename emme attributes to wrangler attributes + if emme_name_crosswalk_file is None: + emme_name_crosswalk_file = parameters.emme_name_crosswalk_file + roadway_link_changes, roadway_node_changes = Project.emme_name_to_wrangler_name( + roadway_link_changes, + roadway_node_changes, + emme_name_crosswalk_file + ) + elif roadway_link_changes: + pass + elif roadway_node_changes: + pass + else: + msg = "No roadway changes given or processed." + WranglerLogger.info(msg) + roadway_link_changes = pd.DataFrame({}) + roadway_node_changes = pd.DataFrame({}) + + if base_roadway_network and base_roadway_dir: + msg = "Method takes only one of 'base_roadway_network' and 'base_roadway_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_roadway_dir: + base_roadway_network = ModelRoadwayNetwork.read( + os.path.join(base_roadway_dir, "link.json"), + os.path.join(base_roadway_dir, "node.geojson"), + os.path.join(base_roadway_dir, "shape.geojson"), + fast=True, + recalculate_calculated_variables=recalculate_calculated_variables, + recalculate_distance=recalculate_distance, + parameters=parameters, + **kwargs, + ) + base_roadway_network.split_properties_by_time_period_and_category() + elif base_roadway_network: + base_roadway_network.split_properties_by_time_period_and_category() + else: + msg = "No base roadway network." + WranglerLogger.info(msg) + base_roadway_network = None + + if base_cube_transit_source and base_transit_dir: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_transit_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_transit_dir: + base_transit_network = StandardTransit.read_gtfs( + gtfs_feed_dir=base_transit_dir, + parameters=parameters + ) + elif base_transit_network: + base_transit_network = base_transit_network + else: + msg = "No base transit network." + WranglerLogger.info(msg) + base_transit_network = None + + project = Project( + roadway_link_changes=roadway_link_changes, + roadway_node_changes=roadway_node_changes, + transit_changes=transit_changes, + base_roadway_network=base_roadway_network, + base_transit_network=base_transit_network, + base_cube_transit_network=base_cube_transit_network, + build_cube_transit_network=build_cube_transit_network, + evaluate=True, + project_name=project_name, + parameters=parameters, + ) + + return project
+ +
[docs] @staticmethod + def read_logfile(logfilename: Union[str, List[str]]): + """ + Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + logfilename (str or list[str]): File path to CUBE logfile or list of logfile paths. + + Returns: + A DataFrame reprsentation of the log file. + """ + if type(logfilename) == str: + logfilename = [logfilename] + + link_df = pd.DataFrame() + node_df = pd.DataFrame() + + for file in logfilename: + WranglerLogger.info("Reading logfile: {}".format(file)) + with open(file) as f: + _content = f.readlines() + + _node_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("N") + ] + WranglerLogger.debug("node lines: {}".format(_node_lines)) + _link_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("L") + ] + WranglerLogger.debug("link lines: {}".format(_link_lines)) + + _nodecol = ["OBJECT", "OPERATION", "GROUP"] + _node_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Node Cols: {}".format(_nodecol)) + _linkcol = ["OBJECT", "OPERATION", "GROUP"] + _link_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Link Cols: {}".format(_linkcol)) + + def split_log(x): + return list(reader([x], delimiter=',', quotechar='"'))[0] + + _node_df = pd.DataFrame([split_log(x) for x in _node_lines[1:]],columns = _nodecol) + WranglerLogger.debug("Node DF: {}".format(_node_df)) + _link_df = pd.DataFrame([split_log(x) for x in _link_lines[1:]],columns = _linkcol) + WranglerLogger.debug("Link DF: {}".format(_link_df)) + + node_df = pd.concat([node_df, _node_df]) + link_df = pd.concat([link_df, _link_df]) + + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + link_df.columns = [c.split("[")[0] for c in link_df.columns] + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + node_df.columns = [c.split("[")[0] for c in node_df.columns] + + if len(link_df) > 0: + # create operation history + action_history_df = ( + link_df.groupby(['A', 'B'])["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + link_df = pd.merge(link_df, action_history_df, on=['A', 'B'], how="left") + + if len(node_df) > 0: + action_history_df = ( + node_df.groupby('N')["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + node_df = pd.merge(node_df, action_history_df, on='N', how="left") + + WranglerLogger.info( + "Processed {} Node lines and {} Link lines".format( + node_df.shape[0], link_df.shape[0] + ) + ) + + return link_df, node_df
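# Illustrative note (not part of the module source): read_logfile returns one
# DataFrame per object type, keyed by A/B for links and N for nodes, with the
# derived operation_history and operation_final columns appended. The log file
# path below is hypothetical.
#
#   link_df, node_df = Project.read_logfile("roadway_edits.log")
#   link_df[["A", "B", "OPERATION", "operation_history", "operation_final"]].head()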
+ +
[docs] @staticmethod + def read_network_build_file(networkbuildfilename: Union[str, List[str]]): + """ + Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + networkbuildfilename (str or list[str]): File path to emme nework build file or list of network build file paths. + + Returns: + A DataFrame representation of the network build file + """ + if type(networkbuildfilename) == str: + networkbuildfilename = [networkbuildfilename] + + _link_command_history_df = DataFrame() + _node_command_history_df = DataFrame() + _transit_command_history_df = DataFrame() + + for file in networkbuildfilename: + WranglerLogger.info("Reading network build file: {}".format(file)) + with open(file) as f: + _content = json.load(f) + + _command_history = _content.get('command_history') + + # loop through all the commands + for command in _command_history: + if command.get('command') == 'set_attribute': + element_id = command.get('parameters').get('element_ids') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + _command_df[command.get('parameters').get('attribute_name')] = command.get('parameters').get('value') + + if command.get('command') in ['create_link', 'create_node']: + if command.get('command') == 'create_link': + element_id = command.get('results').get('changes').get('added').get('LINK') + if command.get('command') == 'create_node': + element_id = command.get('results').get('changes').get('added').get('NODE') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + for attribute_name, attribute_value in command.get('parameters').get('attributes').items(): + _command_df[attribute_name] = attribute_value + + if command.get('command') == 'delete_link': + element_id = command.get('results').get('changes').get('removed').get('LINK') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + if command.get('command') == 'modify_transit_line': + element_id = command.get('parameters').get('line_id') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : pd.Series(element_id), + 'object' : pd.Series(object), + 'operation' : pd.Series(operation) + } + ) + + _command_df['new_itinerary'] = [command.get('parameters').get('new_itinerary')] + + if ('L' in _command_df['object'].unique()): + _link_command_history_df = _link_command_history_df.append( + _command_df[_command_df['object'] == 'L'], + sort = False, + ignore_index = True + ) + + if ('N' in _command_df['object'].unique()): + _node_command_history_df = _node_command_history_df.append( + _command_df[_command_df['object'] == 'N'], + sort = False, + ignore_index = True + ) + + if ( + ('TRANSIT_LINE' in _command_df['object'].unique()) | + ('TRANSIT_STOP' in _command_df['object'].unique()) | + ('TRANSIT_SHAPE' in 
_command_df['object'].unique()) + ): + _transit_command_history_df = _transit_command_history_df.append( + _command_df[_command_df['object'].isin(['TRANSIT_LINE', 'TRANSIT_STOP', 'TRANSIT_SHAPE'])], + sort = False, + ignore_index = True + ) + + if len(_link_command_history_df) > 0: + # create operation history + link_action_history_df = ( + _link_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + link_action_history_df["operation_final"] = link_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + # get the last none null value for each element + # consolidate elements to single record + def get_last_valid(series): + if len(series.dropna()) > 0: + return series.dropna().iloc[-1] + else: + return np.nan + + #_command_history_df = _command_history_df.groupby(['element_id']).apply(get_last_valid).reset_index() + _link_command_history_df = _link_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _link_command_history_df = pd.merge(_link_command_history_df, link_action_history_df, on='element_id', how="left") + + if len(_node_command_history_df) > 0: + # create node operation history + node_action_history_df = ( + _node_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + node_action_history_df["operation_final"] = node_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + _node_command_history_df = _node_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _node_command_history_df = pd.merge(_node_command_history_df, node_action_history_df, on='element_id', how="left") + + WranglerLogger.info( + "Processed {} link element commands, {} node element commands".format( + _link_command_history_df.shape[0], + _node_command_history_df.shape[0] + ) + ) + + return _link_command_history_df, _node_command_history_df, _transit_command_history_df
+ +
[docs] @staticmethod + def emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file): + """ + rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder + """ + WranglerLogger.info('Reading emme node id crosswalk file from {}'.format(emme_node_id_crosswalk_file)) + emme_node_id_crosswalk_df = pd.read_csv(emme_node_id_crosswalk_file) + emme_node_id_dict = dict(zip(emme_node_id_crosswalk_df['emme_node_id'], emme_node_id_crosswalk_df['model_node_id'])) + + # get node changes + if len(emme_node_change_df) > 0: + emme_node_change_df['emme_id'] = emme_node_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + + # get new emme nodes + new_emme_node_id_list = [ + n for n in emme_node_change_df['emme_id'].to_list() if n not in emme_node_id_crosswalk_df['emme_node_id'].to_list() + ] + WranglerLogger.info('New emme node id list {}'.format(new_emme_node_id_list)) + new_wrangler_node = emme_node_id_crosswalk_df['model_node_id'].max() + + # add crosswalk for new emme nodes + for new_emme_node in new_emme_node_id_list: + if new_emme_node in emme_node_id_dict.keys(): + msg = "new node id {} has already been added to the crosswalk".format(new_emme_node) + WranglerLogger.error(msg) + raise ValueError(msg) + else: + new_wrangler_node = new_wrangler_node + 1 + emme_node_id_dict.update({new_emme_node : new_wrangler_node}) + new_emme_node_id_crosswalk_df = pd.DataFrame(emme_node_id_dict.items(), columns=['emme_node_id', 'model_node_id']) + new_emme_node_id_crosswalk_df.to_csv(emme_node_id_crosswalk_file, index=False) + + # for nodes update model_node_id + emme_node_change_df['model_node_id'] = emme_node_change_df['emme_id'].map(emme_node_id_dict).fillna(0) + + if len(emme_link_change_df) > 0: + emme_link_change_df['A'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + emme_link_change_df['B'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[-1])) + # for links update A,B nodes + emme_link_change_df['A'] = emme_link_change_df['A'].map(emme_node_id_dict) + emme_link_change_df['B'] = emme_link_change_df['B'].map(emme_node_id_dict) + + if len(emme_transit_changes_df) > 0: + emme_transit_changes_df['i_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-3] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + emme_transit_changes_df['j_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-2] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + # update i,j nodes + emme_transit_changes_df['i_node'] = emme_transit_changes_df[ + 'i_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + emme_transit_changes_df['j_node'] = emme_transit_changes_df[ + 'j_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + + # update routing nodes + emme_transit_changes_df['new_itinerary'] = emme_transit_changes_df.apply( + lambda x: [emme_node_id_dict.get(n) for n in x['new_itinerary']] if x['object'] == 'TRANSIT_SHAPE' else 0, + axis = 1 + ) + + return emme_link_change_df, emme_node_change_df, emme_transit_changes_df
+ +
[docs] def get_object_from_network_build_command(row): + """ + determine the network build object is node or link + + Args: + row: network build command history dataframe + + Returns: + 'N' for node, 'L' for link + """ + + if row.get('command') == 'create_link': + return 'L' + + if row.get('command') == 'create_node': + return 'N' + + if row.get('command') == 'delete_link': + return 'L' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'L' + if row.get('parameters').get('element_type') == 'NODE': + return 'N' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'TRANSIT_LINE' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'TRANSIT_STOP' + + if row.get('command') == 'modify_transit_line': + return 'TRANSIT_SHAPE'
+ +
[docs] def get_operation_from_network_build_command(row): + """ + determine the network build object action type + + Args: + row: network build command history dataframe + + Returns: + 'A', 'C', 'D' + """ + + if row.get('command') == 'create_link': + return 'A' + + if row.get('command') == 'create_node': + return 'A' + + if row.get('command') == 'delete_link': + return 'D' + + if row.get('command') == 'set_attribute': + if row.get('parameters').get('element_type') == 'LINK': + return 'C' + if row.get('parameters').get('element_type') == 'NODE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_LINE': + return 'C' + if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT': + return 'C' + + if row.get('command') == 'modify_transit_line': + return 'C'
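# Illustrative note (not part of the module source): a stripped-down command
# dict shaped like the entries these two helpers inspect resolves as follows.
#
#   command = {
#       "command": "set_attribute",
#       "parameters": {"element_type": "LINK", "attribute_name": "lanes", "value": 3},
#   }
#   Project.get_object_from_network_build_command(command)     # -> "L"
#   Project.get_operation_from_network_build_command(command)  # -> "C"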
+ +
[docs] @staticmethod + def emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file): + """ + rename emme names to wrangler names using crosswalk file + """ + + WranglerLogger.info('Reading emme attribute name crosswalk file {}'.format(emme_name_crosswalk_file)) + emme_name_crosswalk_df = pd.read_csv(emme_name_crosswalk_file) + emme_name_crosswalk_dict = dict(zip(emme_name_crosswalk_df['emme_name'], emme_name_crosswalk_df['wrangler_name'])) + + # drop columns we don't need from emme to avoid confusion + ignore_columns = [ + c for c in emme_link_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'A', 'B'] + ] + WranglerLogger.info('Ignoring link changes in {}'.format(ignore_columns)) + emme_link_change_df = emme_link_change_df.drop(ignore_columns, axis = 1) + + ignore_columns = [ + c for c in emme_node_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'model_node_id'] + ] + WranglerLogger.info('Ignoring node changes in {}'.format(ignore_columns)) + emme_node_change_df = emme_node_change_df.drop(ignore_columns, axis = 1) + + # rename emme name to wrangler name + emme_link_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + emme_node_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + + return emme_link_change_df, emme_node_change_df
+ +
[docs] @staticmethod + def determine_roadway_network_changes_compatibility( + base_roadway_network: ModelRoadwayNetwork, + roadway_link_changes: DataFrame, + roadway_node_changes: DataFrame, + parameters: Parameters, + ): + """ + Checks to see that any links or nodes that change exist in base roadway network. + """ + WranglerLogger.info( + "Evaluating compatibility between roadway network changes and base network. Not evaluating deletions." + ) + + # CUBE log file saves all variable names in upper cases, need to convert them to be same as network + log_to_net_df = pd.read_csv(parameters.log_to_net_crosswalk) + log_to_net_dict = dict(zip(log_to_net_df["log"], log_to_net_df["net"])) + + dbf_to_net_df = pd.read_csv(parameters.net_to_dbf_crosswalk) + dbf_to_net_dict = dict(zip(dbf_to_net_df["dbf"], dbf_to_net_df["net"])) + + for c in roadway_link_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_link_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_link_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_link_changes.rename(columns=dbf_to_net_dict, inplace=True) + + for c in roadway_node_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B", "X", "Y"]): + roadway_node_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_node_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_node_changes.rename(columns=dbf_to_net_dict, inplace=True) + + # for links "L" that change "C", + # find locations where there isn't a base roadway link + if len(roadway_link_changes) > 0: + link_changes_df = roadway_link_changes[ + roadway_link_changes["operation_final"] == "C" + ].copy() + + link_merge_df = pd.merge( + link_changes_df[["A", "B"]].astype(str), + base_roadway_network.links_df[["A", "B", "model_link_id"]].astype(str), + how="left", + on=["A", "B"], + ) + + missing_links = link_merge_df.loc[link_merge_df["model_link_id"].isna()] + + if missing_links.shape[0]: + msg = "Network missing the following AB links:\n{}".format(missing_links) + WranglerLogger.error(msg) + raise ValueError(msg) + + # for links "N" that change "C", + # find locations where there isn't a base roadway node + if len(roadway_node_changes) > 0: + node_changes_df = roadway_node_changes[ + roadway_node_changes["operation_final"] == "C" + ].copy() + + node_merge_df = pd.merge( + node_changes_df[["model_node_id"]], + base_roadway_network.nodes_df[["model_node_id", "geometry"]], + how="left", + on=["model_node_id"], + ) + missing_nodes = node_merge_df.loc[node_merge_df["geometry"].isna()] + if missing_nodes.shape[0]: + msg = "Network missing the following nodes:\n{}".format(missing_nodes) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] def evaluate_changes(self): + """ + Determines which changes should be evaluated, initiates + self.card_data to be an aggregation of transit and highway changes. + """ + highway_change_list = [] + transit_change_list = [] + + WranglerLogger.info("Evaluating project changes.") + + if (not self.roadway_link_changes.empty) | (not self.roadway_node_changes.empty): + highway_change_list = self.add_highway_changes() + + if self.transit_changes is not None: + if (not self.transit_changes.empty) or ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + if ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + self.card_data = { + "project": self.project_name, + "changes": transit_change_list + highway_change_list, + }
+ +
[docs] def add_transit_changes(self): + """ + Evaluates changes between base and build transit objects and + adds entries into the self.card_data dictionary. + """ + if self.build_cube_transit_network: + transit_change_list = self.build_cube_transit_network.evaluate_differences( + self.base_cube_transit_network + ) + elif self.base_transit_network: + transit_change_list = self.base_transit_network.evaluate_differences( + self.transit_changes + ) + return transit_change_list
+ + @staticmethod + def _final_op(x): + if x["operation_history"][-1] == "D": + if "A" in x["operation_history"][:-1]: + return "N" + else: + return "D" + elif x["operation_history"][-1] == "A": + if "D" in x["operation_history"][:-1]: + return "C" + else: + return "A" + else: + if "A" in x["operation_history"][:-1]: + return "A" + else: + return "C" + +
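# Illustrative note (not part of the module source): outcomes of the
# consolidation rule in _final_op for a few operation histories.
#
#   ["A", "C", "D"] -> "N"   added and later deleted in the same log: net no-op
#   ["C", "D"]      -> "D"   an existing link edited, then deleted
#   ["D", "A"]      -> "C"   deleted and re-added: treated as a change
#   ["A", "C"]      -> "A"   added, then edited: still an addition
#   ["C", "C"]      -> "C"   repeated edits collapse to a single change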
[docs] def add_highway_changes(self, limit_variables_to_existing_network=False): + """ + Evaluates changes from the log file based on the base highway object and + adds entries into the self.card_data dictionary. + + Args: + limit_variables_to_existing_network (bool): True if no ad-hoc variables. Default to False. + """ + + for c in self.parameters.string_col: + if c in self.roadway_link_changes.columns: + self.roadway_link_changes[c] = self.roadway_link_changes[c].str.lstrip(" ") + if c in self.roadway_node_changes.columns: + self.roadway_node_changes[c] = self.roadway_node_changes[c].str.lstrip(" ") + + ## if worth it, could also add some functionality to network wrangler itself. + node_changes_df = self.roadway_node_changes.copy() + + link_changes_df = self.roadway_link_changes.copy() + + def _process_deletions(link_changes_df): + """ + create deletion section in project card + """ + WranglerLogger.debug("Processing link deletions") + + cube_delete_df = link_changes_df[link_changes_df["operation_final"] == "D"].copy() + + # make sure columns has the same type as base network + cube_delete_df['A'] = cube_delete_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_delete_df['B'] = cube_delete_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_delete_df.columns: + cube_delete_df.drop(['model_link_id'], axis = 1, inplace = True) + + cube_delete_df = pd.merge( + cube_delete_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if len(cube_delete_df) > 0: + links_to_delete = cube_delete_df["model_link_id"].tolist() + delete_link_dict = { + "category": "Roadway Deletion", + "links": {"model_link_id": links_to_delete}, + } + WranglerLogger.debug("{} Links Deleted.".format(len(links_to_delete))) + else: + delete_link_dict = None + WranglerLogger.debug("No link deletions processed") + + return delete_link_dict + + def _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ): + """""" + WranglerLogger.debug("Processing link additions") + cube_add_df = link_changes_df[link_changes_df["operation_final"] == "A"] + if len(cube_add_df) == 0: + WranglerLogger.debug("No link additions processed") + return {} + + if limit_variables_to_existing_network: + add_col = [ + c + for c in cube_add_df.columns + if c in self.base_roadway_network.links_df.columns + ] + else: + add_col = [ + c for c in cube_add_df.columns if c not in ["operation_final"] + ] + # can leave out "operation_final" from writing out, is there a reason to write it out? 
+ + for x in add_col: + cube_add_df[x] = cube_add_df[x].astype(self.base_roadway_network.links_df[x].dtype) + + add_link_properties = cube_add_df[add_col].to_dict("records") + + # WranglerLogger.debug("Add Link Properties: {}".format(add_link_properties)) + WranglerLogger.debug("{} Links Added".format(len(add_link_properties))) + + return {"category": "Add New Roadway", "links": add_link_properties} + + def _process_node_additions(node_add_df): + """""" + WranglerLogger.debug("Processing node additions") + + if len(node_add_df) == 0: + WranglerLogger.debug("No node additions processed") + return [] + + node_add_df = node_add_df.drop(["operation_final"], axis=1) + + node_add_df = node_add_df.apply(_reproject_coordinates, axis=1) + + for x in node_add_df.columns: + node_add_df[x] = node_add_df[x].astype(self.base_roadway_network.nodes_df[x].dtype) + + add_nodes_dict_list = node_add_df.to_dict( + "records" + ) + WranglerLogger.debug("{} Nodes Added".format(len(add_nodes_dict_list))) + + return add_nodes_dict_list + + def _reproject_coordinates(row): + reprojected_x, reprojected_y = self.parameters.transformer.transform(row['X'], row['Y']) + row['X'] = reprojected_x + row['Y'] = reprojected_y + return row + + def _process_single_link_change(change_row, changeable_col): + """""" + + # 1. Find associated base year network values + base_df = self.base_roadway_network.links_df[ + (self.base_roadway_network.links_df["A"] == int(change_row.A)) + & (self.base_roadway_network.links_df["B"] == int(change_row.B)) + ] + + if not base_df.shape[0]: + msg = "No match found in network for AB combination: ({},{}). Incompatible base network.".format( + change_row.A, change_row.B + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + elif base_df.shape[0] > 1: + WranglerLogger.warning( + "Found more than one match in base network for AB combination: ({},{}). Selecting first one to operate on but AB should be unique to network.".format( + change_row.A, change_row.B + ) + ) + + base_row = base_df.iloc[0] + # WranglerLogger.debug("Properties with changes: {}".format(changeable_col)) + + # 2. find columns that changed (enough) + changed_col = [] + for col in changeable_col: + WranglerLogger.debug("Assessing Column: {}".format(col)) + # if it is the same as before, or a static value, don't process as a change + if str(change_row[col]).strip('"\'') == str(base_row[col]).strip('"\''): + continue + # if it is NaN or None, don't process as a change + if (change_row[col] != change_row[col]) | (change_row[col] is None): + continue + if (col == "roadway_class") & (change_row[col] == 0): + continue + # only look at distance if it has significantly changed + if col == "distance": + if ( + abs( + (change_row[col] - float(base_row[col])) + / base_row[col].astype(float) + ) + > 0.01 + ): + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + else: + continue + else: + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + + WranglerLogger.debug( + "Properties with changes that will be processed: {}".format(changed_col) + ) + + if not changed_col: + return pd.DataFrame() + + # 3. Iterate through columns with changed values and structure the changes as expected in project card + property_dict_list = [] + processed_properties = [] + + # check if it's a manged lane change + for c in changed_col: + if c.startswith("ML_"): + # TODO ML project card skeleton + msg = "Detected managed lane changes, please create managed lane project card!" 
+ WranglerLogger.error(msg) + raise ValueError(msg) + return + + # regular roadway property change + for c in changed_col: + # WranglerLogger.debug("Processing Column: {}".format(c)) + ( + p_base_name, + p_time_period, + p_category, + managed_lane, + ) = column_name_to_parts(c, self.parameters) + + _d = { + "existing": base_row[c], + "set": change_row[c], + } + if c in Project.CALCULATED_VALUES: + _d = { + "set": change_row[c], + } + if p_time_period: + if managed_lane == 1: + _d["time"] = list( + self.parameters.time_period_to_time[p_time_period] + ) + if p_category: + _d["category"] = p_category + + # iterate through existing properties that have been changed and see if you should just add + if (p_base_name in processed_properties) & (managed_lane == 1): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + processed_p["timeofday"] += [_d] + elif (p_base_name in processed_properties) & (managed_lane == 0): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + if processed_p["set"] != change_row[c]: + msg = "Detected different changes for split-property variables on regular roadway links: " + msg += "conflicting \"{}\" values \"{}\", \"{}\"".format(p_base_name, processed_p["set"], change_row[c]) + WranglerLogger.error(msg) + raise ValueError(msg) + elif p_time_period: + if managed_lane == 1: + property_dict = {"property": p_base_name, "timeofday": [_d]} + processed_properties.append(p_base_name) + property_dict_list.append(property_dict) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + + card_df = pd.DataFrame( + { + "properties": pd.Series([property_dict_list]), + "model_link_id": pd.Series(base_row["model_link_id"]), + } + ) + + # WranglerLogger.debug('single change card_df:\n {}'.format(card_df)) + + return card_df + + def _process_link_changes(link_changes_df, changeable_col): + """""" + cube_change_df = link_changes_df[link_changes_df["operation_final"] == "C"].copy() + + # make sure columns has the same type as base network + cube_change_df['A'] = cube_change_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_change_df['B'] = cube_change_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_change_df.columns: + cube_change_df.drop('model_link_id', axis = 1, inplace = True) + + cube_change_df = pd.merge( + cube_change_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if not cube_change_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + change_link_dict_df = pd.DataFrame(columns=["properties", "model_link_id"]) + + for index, row in cube_change_df.iterrows(): + card_df = _process_single_link_change(row, changeable_col) + + change_link_dict_df = pd.concat( + [change_link_dict_df, card_df], ignore_index=True, sort=False + ) + + if not change_link_dict_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + # WranglerLogger.debug('change_link_dict_df Unaggregated:\n {}'.format(change_link_dict_df)) + + # Have to change to string so that it is a hashable type for the aggregation + change_link_dict_df["properties"] = change_link_dict_df[ + "properties" + ].astype(str) + # Group the changes that are the same + change_link_dict_df = ( + 
change_link_dict_df.groupby("properties")[["model_link_id"]] + .agg(lambda x: list(x)) + .reset_index() + ) + # WranglerLogger.debug('change_link_dict_df Aggregated:\n {}'.format(change_link_dict_df)) + + # Reformat model link id to correct "facility" format + change_link_dict_df["facility"] = change_link_dict_df.apply( + lambda x: {"link": [{"model_link_id": x.model_link_id}]}, axis=1 + ) + + # WranglerLogger.debug('change_link_dict_df 3: {}'.format(change_link_dict_df)) + change_link_dict_df["properties"] = change_link_dict_df["properties"].apply( + lambda x: json.loads( + x.replace("'\"", "'").replace("\"'", "'").replace("'", '"') + ) + ) + + change_link_dict_df["category"] = "Roadway Property Change" + + change_link_dict_list = change_link_dict_df[ + ["category", "facility", "properties"] + ].to_dict("record") + + WranglerLogger.debug( + "{} Changes Processed".format(len(change_link_dict_list)) + ) + return change_link_dict_list + + def _consolidate_actions(log, base, key_list): + log_df = log.copy() + # will be changed if to allow new variables being added/changed that are not in base network + changeable_col = [x for x in log_df.columns if x in base.columns] + #print(log_df) + #for x in changeable_col: + # print(x) + #log_df[x] = log_df[x].astype(base[x].dtype) + + if 'operation_final' not in log_df.columns: + action_history_df = ( + log_df.groupby(key_list)["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + + log_df = pd.merge(log_df, action_history_df, on=key_list, how="left") + log_df.drop_duplicates(subset=key_list, keep="last", inplace=True) + log_df["operation_final"] = log_df.apply(lambda x: Project._final_op(x), axis=1) + + return log_df[changeable_col + ["operation_final"]] + + delete_link_dict = None + add_link_dict = None + change_link_dict_list = [] + + if len(link_changes_df) != 0: + link_changes_df = _consolidate_actions( + link_changes_df, self.base_roadway_network.links_df, ["A", "B"] + ) + + # process deletions + delete_link_dict = _process_deletions(link_changes_df) + + # process additions + add_link_dict = _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ) + + # process changes + WranglerLogger.debug("Processing changes") + WranglerLogger.debug(link_changes_df) + changeable_col = list( + ( + set(link_changes_df.columns) + & set(self.base_roadway_network.links_df.columns) + ) + - set(Project.STATIC_VALUES) + ) + + cols_in_changes_not_in_net = list( + set(link_changes_df.columns) + - set(self.base_roadway_network.links_df.columns) + ) + + if cols_in_changes_not_in_net: + WranglerLogger.warning( + "The following attributes are specified in the changes but do not exist in the base network: {}".format( + cols_in_changes_not_in_net + ) + ) + + change_link_dict_list = _process_link_changes(link_changes_df, changeable_col) + + if len(node_changes_df) != 0: + node_changes_df = _consolidate_actions( + node_changes_df, self.base_roadway_network.nodes_df, ["model_node_id"] + ) + + # print error message for node change and node deletion + if ( + len(node_changes_df[node_changes_df["operation_final"].isin(["C", "D"])]) + > 0 + ): + msg = "NODE changes and deletions are not allowed!" 
+ WranglerLogger.warning(msg) + #raise ValueError(msg) + node_add_df = node_changes_df[node_changes_df["operation_final"] == "A"] + + if add_link_dict: + add_link_dict["nodes"] = _process_node_additions(node_add_df) + else: + add_link_dict = {"category": "Add New Roadway", "nodes": _process_node_additions(node_add_df)} + + else: + None + + # combine together + + highway_change_list = list( + filter(None, [delete_link_dict] + [add_link_dict] + change_link_dict_list) + ) + + return highway_change_list
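For orientation, the list assembled above follows the project card roadway-change layout used elsewhere in Lasso. The sketch below is illustrative only, with hypothetical link and node IDs and property values, and shows the two change categories built in this method:

    # Hypothetical example of the structure returned above; IDs and values are illustrative.
    example_highway_change_list = [
        {
            "category": "Add New Roadway",
            "links": [{"A": 900001, "B": 900002, "lanes": 2}],
            "nodes": [{"model_node_id": 900001, "X": -93.09, "Y": 44.95}],
        },
        {
            "category": "Roadway Property Change",
            "facility": {"link": [{"model_link_id": [123457, 123458]}]},
            "properties": [{"property": "lanes", "existing": 2, "set": 3}],
        },
    ]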
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/_modules/lasso/roadway/index.html b/branch/bicounty_dev/_modules/lasso/roadway/index.html new file mode 100644 index 0000000..f276fd1 --- /dev/null +++ b/branch/bicounty_dev/_modules/lasso/roadway/index.html @@ -0,0 +1,2046 @@ + + + + + + lasso.roadway — lasso documentation + + + + + + + + + + + + + + + + + + +

Source code for lasso.roadway

+import copy
+import glob
+import os
+from typing import Optional, Union
+
+import geopandas as gpd
+import pandas as pd
+
+from geopandas import GeoDataFrame
+from pandas import DataFrame
+import numpy as np
+
+from network_wrangler import RoadwayNetwork
+from .parameters import Parameters
+from .logger import WranglerLogger
+
+
+
[docs]class ModelRoadwayNetwork(RoadwayNetwork): + """ + Subclass of network_wrangler class :ref:`RoadwayNetwork <network_wrangler:RoadwayNetwork>` + + A representation of the physical roadway network and its properties. + """ + + CALCULATED_VALUES = [ + "area_type", + "county", + "centroidconnect", + ] + +
[docs] def __init__( + self, + nodes: GeoDataFrame, + links: DataFrame, + shapes: GeoDataFrame, + parameters: Union[Parameters, dict] = {}, + **kwargs, + ): + """ + Constructor + + Args: + nodes: geodataframe of nodes + links: dataframe of links + shapes: geodataframe of shapes + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. + If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + """ + super().__init__(nodes, links, shapes, **kwargs) + + # will have to change if want to alter them + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.links_metcouncil_df = None + self.nodes_metcouncil_df = None + + self.fill_na() + self.convert_int()
+ # self.shapes_metcouncil_df = None + ##todo also write to file + # WranglerLogger.debug("Used PARAMS\n", '\n'.join(['{}: {}'.format(k,v) for k,v in self.parameters.__dict__.items()])) + +
[docs] @staticmethod + def read( + link_filename: str, + node_filename: str, + shape_filename: str, + fast: bool = False, + recalculate_calculated_variables=False, + recalculate_distance=False, + parameters: Union[dict, Parameters] = {}, + **kwargs, + ): + """ + Reads in links and nodes network standard. + + Args: + link_filename (str): File path to link json. + node_filename (str): File path to node geojson. + shape_filename (str): File path to link true shape geojson + fast (bool): boolean that will skip validation to speed up read time. + recalculate_calculated_variables (bool): calculates fields from spatial joins, etc. + recalculate_distance (bool): re-calculates distance. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + Returns: + ModelRoadwayNetwork + """ + + nodes_df, links_df, shapes_df = RoadwayNetwork.load_transform_network( + node_filename, + link_filename, + shape_filename, + validate_schema=not fast, + **kwargs, + ) + + m_road_net = ModelRoadwayNetwork( + nodes_df, + links_df, + shapes_df, + parameters=parameters, + **kwargs, + ) + + if recalculate_calculated_variables: + m_road_net.create_calculated_variables() + if recalculate_distance: + m_road_net.calculate_distance(overwrite=True) + + m_road_net.fill_na() + # this method is making period values as string "NaN", need to revise. + m_road_net.split_properties_by_time_period_and_category() + for c in m_road_net.links_df.columns: + m_road_net.links_df[c] = m_road_net.links_df[c].replace("NaN", np.nan) + m_road_net.convert_int() + + return m_road_net
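A minimal usage sketch for the reader above; the file paths are hypothetical and simply need to point at a standard network directory:

    net = ModelRoadwayNetwork.read(
        link_filename="network_standard/link.json",      # hypothetical paths
        node_filename="network_standard/node.geojson",
        shape_filename="network_standard/shape.geojson",
        fast=True,       # skip schema validation for speed
        parameters={},   # fall back to default Parameters
    )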
+ +
[docs] @staticmethod + def from_RoadwayNetwork( + roadway_network_object, + parameters: Union[dict, Parameters] = {}, + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + roadway_network_object (RoadwayNetwork). + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + Returns: + ModelRoadwayNetwork + """ + + additional_params_dict = { + k: v + for k, v in roadway_network_object.__dict__.items() + if k not in ["nodes_df", "links_df", "shapes_df", "parameters"] + } + + return ModelRoadwayNetwork( + roadway_network_object.nodes_df, + roadway_network_object.links_df, + roadway_network_object.shapes_df, + parameters=parameters, + **additional_params_dict, + )
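A hedged usage sketch: road_net is assumed to be a network_wrangler RoadwayNetwork read elsewhere.

    model_net = ModelRoadwayNetwork.from_RoadwayNetwork(
        roadway_network_object=road_net,
        parameters={},  # or an existing Parameters instance
    )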
+ +
[docs] def split_properties_by_time_period_and_category(self, properties_to_split=None): + """ + Splits properties by time period, assuming a variable structure of + + Args: + properties_to_split: dict + dictionary of output variable prefix mapped to the source variable and what to stratify it by + e.g. + { + 'lanes' : {'v':'lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'ML_lanes' : {'v':'ML_lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'use' : {'v':'use', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + } + + """ + import itertools + + if properties_to_split == None: + properties_to_split = self.parameters.properties_to_split + + for out_var, params in properties_to_split.items(): + if params["v"] not in self.links_df.columns: + WranglerLogger.warning( + "Specified variable to split: {} not in network variables: {}. Returning 0.".format( + params["v"], str(self.links_df.columns) + ) + ) + if params.get("time_periods") and params.get("categories"): + + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + time_suffix + "_" + category_suffix + ] = 0 + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[out_var + "_" + time_suffix] = 0 + elif params.get("time_periods") and params.get("categories"): + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + category_suffix + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=params["categories"][category_suffix], + time_period=params["time_periods"][time_suffix], + ) + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[ + out_var + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=None, + time_period=params["time_periods"][time_suffix], + ) + else: + raise ValueError( + "Shoudn't have a category without a time period: {}".format(params) + )
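To illustrate the naming convention, a small standalone sketch of the column expansion follows; note that the docstring's 'times_periods' key appears to be a typo for the 'time_periods' key the code actually reads:

    # Illustrative only: new columns follow the out_var + "_" + time_suffix pattern above.
    properties_to_split = {
        "lanes": {
            "v": "lanes",
            "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
        },
    }
    for out_var, params in properties_to_split.items():
        for time_suffix in params["time_periods"]:
            print(out_var + "_" + time_suffix)  # lanes_AM, lanes_PM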
+ +
[docs] def create_calculated_variables(self): + """ + Creates calculated roadway variables. + + Args: + None + """ + WranglerLogger.info("Creating calculated roadway variables.") + + #MTC + self.create_ML_variable() + #/MTC + #MC + self.calculate_area_type() + self.calculate_county() + self.calculate_mpo() + self.add_counts() + self.create_ML_variable() + self.create_hov_corridor_variable() + self.create_managed_variable()
+ #/MC + +
[docs] def calculate_county( + self, + county_shape=None, + county_shape_variable=None, + network_variable="county", + county_codes_dict=None, + overwrite=False, + ): + """ + #MC + Calculates county variable. + + This uses the centroid of the geometry field to determine which county it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + county_shape (str): The File path to county geodatabase. + county_shape_variable (str): The variable name of county in county geodadabase. + network_variable (str): The variable name of county in network standard. Default to "county". + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing County Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "County Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + county_shape = county_shape if county_shape else self.parameters.county_shape + + county_shape_variable = ( + county_shape_variable + if county_shape_variable + else self.parameters.county_variable_shp + ) + + WranglerLogger.info( + "Adding roadway network variable for county using a spatial join with: {}".format( + county_shape + ) + ) + + county_codes_dict = ( + county_codes_dict if county_codes_dict else self.parameters.county_code_dict + ) + if not county_codes_dict: + msg = "No county codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + county_gdf = gpd.read_file(county_shape) + county_gdf = county_gdf.to_crs(epsg=self.crs) + joined_gdf = gpd.sjoin(centroids_gdf, county_gdf, how="left", op="intersects") + + joined_gdf[county_shape_variable] = ( + joined_gdf[county_shape_variable] + .map(county_codes_dict) + .fillna(10) + .astype(int) + ) + + self.links_df[network_variable] = joined_gdf[county_shape_variable] + + WranglerLogger.info( + "Finished Calculating county variable: {}".format(network_variable) + )
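The county assignment is a centroid-based spatial join. Below is a standalone sketch with synthetic geometries and a hypothetical county attribute name; the real method reads the county layer and code dictionary from Parameters, and newer geopandas uses predicate= where the method above uses the older op= keyword:

    import geopandas as gpd
    from shapely.geometry import LineString, Polygon

    # Illustrative only: label each link by the polygon containing its centroid.
    links = gpd.GeoDataFrame(
        {"model_link_id": [1]},
        geometry=[LineString([(0.2, 0.2), (0.4, 0.4)])],
        crs="EPSG:4326",
    )
    counties = gpd.GeoDataFrame(
        {"CO_NAME": ["Example County"]},  # hypothetical attribute name
        geometry=[Polygon([(0, 0), (1, 0), (1, 1), (0, 1)])],
        crs="EPSG:4326",
    )
    centroids = links.copy()
    centroids["geometry"] = centroids.geometry.centroid
    joined = gpd.sjoin(centroids, counties, how="left", predicate="intersects")
    links["county"] = joined["CO_NAME"]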
+ +
[docs] def calculate_area_type( + self, + area_type_shape=None, + area_type_shape_variable=None, + network_variable="area_type", + area_type_codes_dict=None, + downtown_area_type_shape=None, + downtown_area_type=None, + overwrite=False, + ): + """ + #MC + Calculates area type variable. + + This uses the centroid of the geometry field to determine which area it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + area_type_shape (str): The File path to area geodatabase. + area_type_shape_variable (str): The variable name of area type in area geodadabase. + network_variable (str): The variable name of area type in network standard. Default to "area_type". + area_type_codes_dict: The dictionary to map input area_type_shape_variable to network_variable + downtown_area_type_shape: The file path to the downtown area type boundary. + downtown_area_type (int): Integer value of downtown area type + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Area Type Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Area Type Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Area Type from Spatial Data and adding as roadway network variable: {}".format( + network_variable + ) + ) + + """ + Verify inputs + """ + + area_type_shape = ( + area_type_shape if area_type_shape else self.parameters.area_type_shape + ) + + if not area_type_shape: + msg = "No area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(area_type_shape): + msg = "File not found for area type shape: {}".format(area_type_shape) + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_shape_variable = ( + area_type_shape_variable + if area_type_shape_variable + else self.parameters.area_type_variable_shp + ) + + if not area_type_shape_variable: + msg = "No area type shape varible specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_codes_dict = ( + area_type_codes_dict + if area_type_codes_dict + else self.parameters.area_type_code_dict + ) + if not area_type_codes_dict: + msg = "No area type codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type_shape = ( + downtown_area_type_shape + if downtown_area_type_shape + else self.parameters.downtown_area_type_shape + ) + + if not downtown_area_type_shape: + msg = "No downtown area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(downtown_area_type_shape): + msg = "File not found for downtown area type shape: {}".format( + downtown_area_type_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type = ( + downtown_area_type + if downtown_area_type + else self.parameters.downtown_area_type + ) + if not downtown_area_type: + msg = "No downtown area type value specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + WranglerLogger.debug("Reading Area Type Shapefile {}".format(area_type_shape)) + area_type_gdf = gpd.read_file(area_type_shape) + area_type_gdf = area_type_gdf.to_crs(epsg=self.crs) 
+ + downtown_gdf = gpd.read_file(downtown_area_type_shape) + downtown_gdf = downtown_gdf.to_crs(epsg=self.crs) + + joined_gdf = gpd.sjoin( + centroids_gdf, area_type_gdf, how="left", op="intersects" + ) + + joined_gdf[area_type_shape_variable] = ( + joined_gdf[area_type_shape_variable] + .map(area_type_codes_dict) + .fillna(1) + .astype(int) + ) + + WranglerLogger.debug("Area Type Codes Used: {}".format(area_type_codes_dict)) + + d_joined_gdf = gpd.sjoin( + centroids_gdf, downtown_gdf, how="left", op="intersects" + ) + + d_joined_gdf["downtown_area_type"] = d_joined_gdf["Id"].fillna(-99).astype(int) + + joined_gdf.loc[ + d_joined_gdf["downtown_area_type"] == 0, area_type_shape_variable + ] = downtown_area_type + + WranglerLogger.debug( + "Downtown Area Type used boundary file: {}".format(downtown_area_type_shape) + ) + + self.links_df[network_variable] = joined_gdf[area_type_shape_variable] + + WranglerLogger.info( + "Finished Calculating Area Type from Spatial Data into variable: {}".format( + network_variable + ) + )
+ +
[docs] def calculate_mpo( + self, + county_network_variable="county", + network_variable="mpo", + as_integer=True, + mpo_counties=None, + overwrite=False, + ): + """ + Calculates mpo variable. + #MC + Args: + county_variable (str): Name of the variable where the county names are stored. Default to "county". + network_variable (str): Name of the variable that should be written to. Default to "mpo". + as_integer (bool): If true, will convert true/false to 1/0s. + mpo_counties (list): List of county names that are within mpo region. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating MPO as roadway network variable: {}".format(network_variable) + ) + """ + Verify inputs + """ + county_network_variable = ( + county_network_variable + if county_network_variable + else self.parameters.county_network_variable + ) + + if not county_network_variable: + msg = "No variable specified as containing 'county' in the network." + WranglerLogger.error(msg) + raise ValueError(msg) + if county_network_variable not in self.links_df.columns: + msg = "Specified county network variable: {} does not exist in network. Try running or debuging county calculation." + WranglerLogger.error(msg) + raise ValueError(msg) + + mpo_counties = mpo_counties if mpo_counties else self.parameters.mpo_counties + + if not mpo_counties: + msg = "No MPO Counties specified in method call or in parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("MPO Counties: {}".format(",".join(str(mpo_counties)))) + + """ + Start actual process + """ + + mpo = self.links_df[county_network_variable].isin(mpo_counties) + + if as_integer: + mpo = mpo.astype(int) + + self.links_df[network_variable] = mpo + + WranglerLogger.info( + "Finished calculating MPO variable: {}".format(network_variable) + )
+ +
[docs] def add_variable_using_shst_reference( + self, + var_shst_csvdata=None, + shst_csv_variable=None, + network_variable=None, + network_var_type=int, + overwrite=False, + ): + """ + Join network links with source data, via SHST API node match result. + + Args: + var_shst_csvdata (str): File path to SHST API return. + shst_csv_variable (str): Variable name in the source data. + network_variable (str): Name of the variable that should be written to. + network_var_type : Variable type in the written network. + overwrite (bool): True is overwriting existing variable. Default to False. + + Returns: + None + + """ + WranglerLogger.info( + "Adding Variable {} using Shared Streets Reference from {}".format( + network_variable, var_shst_csvdata + ) + ) + + var_shst_df = pd.read_csv(var_shst_csvdata) + + if "shstReferenceId" not in var_shst_df.columns: + msg = "'shstReferenceId' required but not found in {}".format(var_shst_data) + WranglerLogger.error(msg) + raise ValueError(msg) + + if shst_csv_variable not in var_shst_df.columns: + msg = "{} required but not found in {}".format( + shst_csv_variable, var_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + join_gdf = pd.merge( + self.links_df, + var_shst_df[["shstReferenceId", shst_csv_variable]], + how="left", + on="shstReferenceId", + ) + + join_gdf[shst_csv_variable].fillna(0, inplace=True) + + if network_variable in self.links_df.columns and not overwrite: + join_gdf.loc[join_gdf[network_variable] > 0, network_variable] = join_gdf[ + shst_csv_variable + ].astype(network_var_type) + else: + join_gdf[network_variable] = join_gdf[shst_csv_variable].astype( + network_var_type + ) + + self.links_df[network_variable] = join_gdf[network_variable] + + WranglerLogger.info( + "Added variable: {} using Shared Streets Reference".format(network_variable) + )
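A hedged usage sketch; the CSV path and source column name are hypothetical, and the file only needs a 'shstReferenceId' column plus the attribute being pulled in:

    # Hypothetical CSV produced by a SharedStreets match run:
    #   shstReferenceId,aadt_2017
    #   <shst reference id>,12400
    net.add_variable_using_shst_reference(
        var_shst_csvdata="data/shst_match_aadt.csv",  # hypothetical path
        shst_csv_variable="aadt_2017",                # hypothetical source column
        network_variable="AADT",
        network_var_type=int,
        overwrite=True,
    )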
+ +
[docs] def add_counts( + self, + network_variable="AADT", + mndot_count_shst_data=None, + widot_count_shst_data=None, + mndot_count_variable_shp=None, + widot_count_variable_shp=None, + ): + + """ + Adds count variable. + #MC + join the network with count node data, via SHST API node match result + + Args: + network_variable (str): Name of the variable that should be written to. Default to "AADT". + mndot_count_shst_data (str): File path to MNDOT count location SHST API node match result. + widot_count_shst_data (str): File path to WIDOT count location SHST API node match result. + mndot_count_variable_shp (str): File path to MNDOT count location geodatabase. + widot_count_variable_shp (str): File path to WIDOT count location geodatabase. + + Returns: + None + """ + + WranglerLogger.info("Adding Counts") + + """ + Verify inputs + """ + + mndot_count_shst_data = ( + mndot_count_shst_data + if mndot_count_shst_data + else self.parameters.mndot_count_shst_data + ) + widot_count_shst_data = ( + widot_count_shst_data + if widot_count_shst_data + else self.parameters.widot_count_shst_data + ) + mndot_count_variable_shp = ( + mndot_count_variable_shp + if mndot_count_variable_shp + else self.parameters.mndot_count_variable_shp + ) + widot_count_variable_shp = ( + widot_count_variable_shp + if widot_count_variable_shp + else self.parameters.widot_count_variable_shp + ) + + for varname, var in { + "mndot_count_shst_data": mndot_count_shst_data, + "widot_count_shst_data": widot_count_shst_data, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(var): + msg = "{}' not found at following location: {}.".format(varname, var) + WranglerLogger.error(msg) + raise ValueError(msg) + + for varname, var in { + "mndot_count_variable_shp": mndot_count_variable_shp, + "widot_count_variable_shp": widot_count_variable_shp, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + WranglerLogger.debug( + "Adding MNDOT Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + mndot_count_shst_data, mndot_count_variable_shp, network_variable + ) + ) + # Add Minnesota Counts + self.add_variable_using_shst_reference( + var_shst_csvdata=mndot_count_shst_data, + shst_csv_variable=mndot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=True, + ) + WranglerLogger.debug( + "Adding WiDot Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + widot_count_shst_data, widot_count_variable_shp, network_variable + ) + ) + # Add Wisconsin Counts, but don't overwrite Minnesota + self.add_variable_using_shst_reference( + var_shst_csvdata=widot_count_shst_data, + shst_csv_variable=widot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=False, + ) + + self.links_df["count_AM"] = self.links_df[network_variable] / 4 + self.links_df["count_MD"] = self.links_df[network_variable] / 4 + self.links_df["count_PM"] = self.links_df[network_variable] / 4 + self.links_df["count_NT"] = self.links_df[network_variable] / 4 + + self.links_df["count_daily"] = self.links_df[network_variable] + self.links_df["count_year"] = 2017 + + WranglerLogger.info( + "Finished adding counts variable: {}".format(network_variable) + )
+ +
[docs]    @staticmethod
    def read_match_result(path):
        """
        Reads the SHST geojson match results.

        Reads many files of the same type and concatenates them into a single dataframe.

        Args:
            path (str): File path (or glob pattern) to SHST match results.

        Returns:
            geodataframe: geopandas geodataframe of the concatenated match results

        ##todo
        this is a general-purpose utility and should probably live in utilities rather than this class
        """
        refId_gdf = DataFrame()
        refid_file = glob.glob(path)
        for i in refid_file:
            new = gpd.read_file(i)
            refId_gdf = pd.concat([refId_gdf, new], ignore_index=True, sort=False)
        return refId_gdf
+ +
[docs] @staticmethod + def get_attribute( + links_df, + join_key, # either "shstReferenceId", or "shstGeometryId", tests showed the latter gave better coverage + source_shst_ref_df, # source shst refId + source_gdf, # source dataframe + field_name, # , # targetted attribute from source + ): + """ + Gets attribute from source data using SHST match result. + + Args: + links_df (dataframe): The network dataframe that new attribute should be written to. + join_key (str): SHST ID variable name used to join source data with network dataframe. + source_shst_ref_df (str): File path to source data SHST match result. + source_gdf (str): File path to source data. + field_name (str): Name of the attribute to get from source data. + + Returns: + None + """ + # join based on shared streets geometry ID + # pp_link_id is shared streets match return + # source_ink_id is mrcc + WranglerLogger.debug( + "source ShSt rename_variables_for_dbf columns\n{}".format( + source_shst_ref_df.columns + ) + ) + WranglerLogger.debug("source gdf columns\n{}".format(source_gdf.columns)) + # end up with OSM network with the MRCC Link ID + # could also do with route_sys...would that be quicker? + join_refId_df = pd.merge( + links_df, + source_shst_ref_df[[join_key, "pp_link_id", "score"]].rename( + columns={"pp_link_id": "source_link_id", "score": "source_score"} + ), + how="left", + on=join_key, + ) + + # joined with MRCC dataframe to get route_sys + + join_refId_df = pd.merge( + join_refId_df, + source_gdf[["LINK_ID", field_name]].rename( + columns={"LINK_ID": "source_link_id"} + ), + how="left", + on="source_link_id", + ) + + # drop duplicated records with same field value + + join_refId_df.drop_duplicates( + subset=["model_link_id", "shstReferenceId", field_name], inplace=True + ) + + # more than one match, take the best score + + join_refId_df.sort_values( + by=["model_link_id", "source_score"], + ascending=True, + na_position="first", + inplace=True, + ) + + join_refId_df.drop_duplicates( + subset=["model_link_id"], keep="last", inplace=True + ) + + # self.links_df[field_name] = join_refId_df[field_name] + + return join_refId_df[links_df.columns.tolist() + [field_name, "source_link_id"]]
+ +
[docs] def calculate_use( + self, + network_variable="use", + as_integer=True, + overwrite=False, + ): + """ + Calculates use variable. + + Args: + network_variable (str): Variable that should be written to in the network. Default to "use" + as_integer (bool): If True, will convert true/false to 1/0s. Defauly to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "'use' Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating hov and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + #MTC + self.links_df[network_variable] = int(1) + #/MTC + + self.links_df[network_variable] = 0 + + self.links_df.loc[ + (self.links_df["assign_group"] == 8) | (self.links_df["access"] == "hov"), + network_variable, + ] = 100 + #/MC + + + if as_integer: + self.links_df[network_variable] = self.links_df[network_variable].astype( + int + ) + WranglerLogger.info( + "Finished calculating hov variable: {}".format(network_variable) + )
+ +
[docs]    def create_ML_variable(
        self,
        network_variable="ML_lanes",
        overwrite=False,
    ):
        """
        Creates an ML lanes placeholder so projects can write out managed lane changes.

        ML lanes default to 0; ML info comes from the cube LOG file and is stored in project cards.

        Args:
            overwrite (Bool): True if overwriting existing variable in network. Default to False.

        Returns:
            None
        """
        if network_variable in self.links_df:
            if overwrite:
                WranglerLogger.info(
                    "Overwriting existing ML Variable '{}' already in network".format(
                        network_variable
                    )
                )
            else:
                WranglerLogger.info(
                    "ML Variable '{}' already in network. Returning without overwriting.".format(
                        network_variable
                    )
                )
                return

        """
        Verify inputs
        """

        # default the placeholder to zero, matching the hov corridor and managed variables below
        self.links_df[network_variable] = int(0)

        WranglerLogger.info(
            "Finished creating ML lanes variable: {}".format(network_variable)
        )
+ +
[docs] def create_hov_corridor_variable( + self, + network_variable="segment_id", + overwrite=False, + ): + """ + Created hov corridor placeholder for project to write out corridor changes + + hov corridor id default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov corridor Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Hov corridor Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating hov corridor variable: {}".format(network_variable) + )
+ +
[docs] def create_managed_variable( + self, + network_variable="managed", + overwrite=False, + ): + """ + Created placeholder for project to write out managed + + managed default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing managed Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Managed Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating managed variable: {}".format(network_variable) + )
+ +
[docs] def calculate_centroidconnect( + self, + parameters, + network_variable="centroidconnect", + highest_taz_number=None, + as_integer=True, + overwrite=False, + ): + """ + Calculates centroid connector variable. + + Args: + parameters (Parameters): A Lasso Parameters, which stores input files. + network_variable (str): Variable that should be written to in the network. Default to "centroidconnect" + highest_taz_number (int): the max TAZ number in the network. + as_integer (bool): If True, will convert true/false to 1/0s. Default to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + Returns: + RoadwayNetwork + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Centroid Connector Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Centroid Connector Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Centroid Connector and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + highest_taz_number = ( + highest_taz_number if highest_taz_number else parameters.highest_taz_number + ) + + if not highest_taz_number: + msg = "No highest_TAZ number specified in method variable or in parameters" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug( + "Calculating Centroid Connectors using highest TAZ number: {}".format( + highest_taz_number + ) + ) + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + self.links_df[network_variable] = False + + self.links_df.loc[ + (self.links_df["A"] <= highest_taz_number) + | (self.links_df["B"] <= highest_taz_number), + network_variable, + ] = True + + if as_integer: + self.links_df[network_variable] = self.links_df[ + network_variable + ].astype(int) + WranglerLogger.info( + "Finished calculating centroid connector variable: {}".format(network_variable) + )
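The flagging rule reduces to a vectorized comparison on the A and B node numbers; a standalone sketch with made-up values:

    import pandas as pd

    # Illustrative only: links whose A or B node is at or below the highest TAZ
    # number are flagged as centroid connectors, mirroring the logic above.
    links = pd.DataFrame({"A": [150, 4200, 3100], "B": [5000, 6000, 9000]})
    highest_taz_number = 3100  # hypothetical
    links["centroidconnect"] = (
        (links["A"] <= highest_taz_number) | (links["B"] <= highest_taz_number)
    ).astype(int)
    print(links)  # rows 0 and 2 are flagged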
+ + +
[docs] def calculate_distance( + self, network_variable="distance", centroidconnect_only=False, overwrite=False + ): + """ + calculate link distance in miles + + Args: + centroidconnect_only (Bool): True if calculating distance for centroidconnectors only. Default to False. + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing distance Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Distance Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + #MC + if ("centroidconnect" not in self.links_df) & ("taz" not in self.links_df.roadway.unique()): + if centroidconnect_only: + msg = "No variable specified for centroid connector, calculating centroidconnect first" + WranglerLogger.error(msg) + raise ValueError(msg) + #/MC + + """ + Start actual process + """ + + temp_links_gdf = self.links_df.copy() + temp_links_gdf.crs = "EPSG:4326" + temp_links_gdf = temp_links_gdf.to_crs(epsg=26915) + + #MTC + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MTC + #MC + if centroidconnect_only: + WranglerLogger.info( + "Calculating {} for centroid connectors".format(network_variable) + ) + temp_links_gdf[network_variable] = np.where( + temp_links_gdf.centroidconnect == 1, + temp_links_gdf.geometry.length / 1609.34, + temp_links_gdf[network_variable], + ) + else: + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MC + + self.links_df[network_variable] = temp_links_gdf[network_variable]
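The distance computation is a reprojection to a meters-based CRS followed by a unit conversion; a hedged geopandas sketch with EPSG codes taken from the method above and a made-up geometry:

    import geopandas as gpd
    from shapely.geometry import LineString

    # Illustrative only: project to EPSG:26915 (UTM 15N, meters) and convert to miles.
    gdf = gpd.GeoDataFrame(
        {"model_link_id": [1]},
        geometry=[LineString([(-93.10, 44.95), (-93.09, 44.95)])],
        crs="EPSG:4326",
    )
    gdf = gdf.to_crs(epsg=26915)
    gdf["distance"] = gdf.geometry.length / 1609.34  # meters -> miles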
+ +
[docs] def convert_int(self, int_col_names=[]): + """ + Convert integer columns + """ + + #MTC + WranglerLogger.info( + "Converting variable type to mtc standard" + ) + + int_col_names = self.parameters.int_col + #/MTC + #MC + """ + WranglerLogger.info("Converting variable type to MetCouncil standard") + + if not int_col_names: + int_col_names = self.parameters.int_col + #/MC + """ + ##Why are we doing this? + # int_col_names.remove("lanes") + + for c in list(set(self.links_df.columns) & set(int_col_names)): + self.links_df[c] = self.links_df[c].replace(np.nan, 0) + # REPLACE BLANKS WITH ZERO FOR INTEGER COLUMNS + self.links_df[c] = self.links_df[c].replace('', 0) + try: + self.links_df[c] = self.links_df[c].replace(np.nan, 0) + self.links_df[c] = self.links_df[c].replace("", 0) + self.links_df[c] = self.links_df[c].astype(int) + except ValueError: + try: + self.links_df[c] = self.links_df[c].astype(float) + self.links_df[c] = self.links_df[c].astype(int) + except: + msg = f"Could not convert column {c} to integer." + WranglerLogger.error(msg) + raise ValueError(msg) + except: + self.links_df[c] = self.links_df[c].astype(float) + self.links_df[c] = self.links_df[c].astype(int) + + for c in list(set(self.nodes_df.columns) & set(int_col_names)): + self.nodes_df[c] = self.nodes_df[c].replace("", 0) + self.nodes_df[c] = self.nodes_df[c].astype(int)
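A standalone sketch of the blank/NaN handling the conversion relies on (column name hypothetical):

    import numpy as np
    import pandas as pd

    # Illustrative only: blanks and NaN are zero-filled before casting to int,
    # matching the replace/astype sequence above.
    df = pd.DataFrame({"lanes": [2, "", np.nan, "3"]})
    df["lanes"] = (
        df["lanes"].replace("", 0).replace(np.nan, 0).astype(float).astype(int)
    )
    print(df["lanes"].tolist())  # [2, 0, 0, 3]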
+ +
[docs] def fill_na(self): + """ + Fill na values from create_managed_lane_network() + """ + + WranglerLogger.info("Filling nan for network from network wrangler") + + num_col = self.parameters.int_col + self.parameters.float_col + + for x in list(self.links_df.columns): + if x in num_col: + self.links_df[x].fillna(0, inplace=True) + self.links_df[x] = self.links_df[x].apply( + lambda k: 0 if k in [np.nan, "", float("nan"), "NaN"] else k + ) + + else: + self.links_df[x].fillna("", inplace=True) + + for x in list(self.nodes_df.columns): + if x in num_col: + self.nodes_df[x].fillna(0, inplace=True) + else: + self.nodes_df[x].fillna("", inplace=True)
+ + +
[docs] def roadway_standard_to_met_council_network(self, output_epsg=None): + """ + Rename and format roadway attributes to be consistent with what metcouncil's model is expecting. + #MC + Args: + output_epsg (int): epsg number of output network. + + Returns: + None + """ + + WranglerLogger.info( + "Renaming roadway attributes to be consistent with what metcouncil's model is expecting" + ) + + """ + Verify inputs + """ + + output_epsg = output_epsg if output_epsg else self.parameters.output_epsg + + """ + Start actual process + """ + if "managed" in self.links_df.columns: + WranglerLogger.info("Creating managed lane network.") + self.create_managed_lane_network(in_place=True) + + # when ML and assign_group projects are applied together, assign_group is filled as "" by wrangler for ML links + for c in ModelRoadwayNetwork.CALCULATED_VALUES: + if c in self.links_df.columns and c in self.parameters.int_col: + self.links_df[c] = self.links_df[c].replace("", 0) + else: + WranglerLogger.info("Didn't detect managed lanes in network.") + + self.calculate_centroidconnect(self.parameters) + self.create_calculated_variables() + self.calculate_distance(overwrite=True) + + self.fill_na() + # no method to calculate price yet, will be hard coded in project card + WranglerLogger.info("Splitting variables by time period and category") + self.split_properties_by_time_period_and_category() + self.convert_int() + + self.links_metcouncil_df = self.links_df.copy() + self.nodes_metcouncil_df = self.nodes_df.copy() + + self.links_metcouncil_df = pd.merge( + self.links_metcouncil_df.drop( + "geometry", axis=1 + ), # drop the stick geometry in links_df + self.shapes_df[["shape_id", "geometry"]], + how="left", + on="shape_id", + ) + + self.links_metcouncil_df.crs = "EPSG:4326" + self.nodes_metcouncil_df.crs = "EPSG:4326" + WranglerLogger.info("Setting Coordinate Reference System to EPSG 26915") + self.links_metcouncil_df = self.links_metcouncil_df.to_crs(epsg=26915) + self.nodes_metcouncil_df = self.nodes_metcouncil_df.to_crs(epsg=26915) + + self.nodes_metcouncil_df["X"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.x + ) + self.nodes_metcouncil_df["Y"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.y + ) + + # CUBE expect node id to be N + self.nodes_metcouncil_df.rename(columns={"model_node_id": "N"}, inplace=True)
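A short usage sketch of the export flow this method feeds; net is assumed to be a ModelRoadwayNetwork with projects already applied:

    net.roadway_standard_to_met_council_network(output_epsg=26915)
    links_for_model = net.links_metcouncil_df   # reprojected, time-period-split copy
    nodes_for_model = net.nodes_metcouncil_df   # node ids renamed to N for Cube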
+ +
[docs] def rename_variables_for_dbf( + self, + input_df, + variable_crosswalk: str = None, + output_variables: list = None, + convert_geometry_to_xy=False, + ): + """ + Rename attributes for DBF/SHP, make sure length within 10 chars. + + Args: + input_df (dataframe): Network standard DataFrame. + variable_crosswalk (str): File path to variable name crosswalk from network standard to DBF names. + output_variables (list): List of strings for DBF variables. + convert_geometry_to_xy (bool): True if converting node geometry to X/Y + + Returns: + dataframe + + """ + WranglerLogger.info("Renaming variables so that they are DBF-safe") + + """ + Verify inputs + """ + + variable_crosswalk = ( + variable_crosswalk + if variable_crosswalk + else self.parameters.net_to_dbf_crosswalk + ) + + output_variables = ( + output_variables if output_variables else self.parameters.output_variables + ) + + """ + Start actual process + """ + + crosswalk_df = pd.read_csv(variable_crosswalk) + WranglerLogger.debug( + "Variable crosswalk: {} \n {}".format(variable_crosswalk, crosswalk_df) + ) + net_to_dbf_dict = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"])) + + dbf_name_list = [] + + dbf_df = copy.deepcopy(input_df) + + # only write out variables that we specify + # if variable is specified in the crosswalk, rename it to that variable + for c in dbf_df.columns: + if c in output_variables: + try: + dbf_df.rename(columns={c: net_to_dbf_dict[c]}, inplace=True) + dbf_name_list += [net_to_dbf_dict[c]] + except: + dbf_name_list += [c] + + if "geometry" in dbf_df.columns: + if str(dbf_df["geometry"].iloc[0].geom_type) == "Point": + dbf_df["X"] = dbf_df.geometry.apply(lambda g: g.x) + dbf_df["Y"] = dbf_df.geometry.apply(lambda g: g.y) + dbf_name_list += ["X", "Y"] + + WranglerLogger.debug("DBF Variables: {}".format(",".join(dbf_name_list))) + + return dbf_df[dbf_name_list]
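The crosswalk is assumed to be a two-column CSV mapping standard names to DBF-safe (10 characters or fewer) names; a hypothetical sketch of its contents and how it becomes the rename dictionary:

    import pandas as pd

    # Hypothetical crosswalk contents (net,dbf) consumed by the method above:
    #   net,dbf
    #   model_link_id,MODEL_ID
    #   shstGeometryId,SHST_GEOM
    crosswalk_df = pd.DataFrame(
        {"net": ["model_link_id", "shstGeometryId"], "dbf": ["MODEL_ID", "SHST_GEOM"]}
    )
    net_to_dbf_dict = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"]))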
+ +
[docs] def write_roadway_as_shp( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + data_to_csv: bool = True, + data_to_dbf: bool = False, + output_link_shp: str = None, + output_node_shp: str = None, + output_link_csv: str = None, + output_node_csv: str = None, + output_gpkg: str = None, + output_link_gpkg_layer: str = None, + output_node_gpkg_layer: str = None, + output_gpkg_link_filter: str = None + ): + """ + Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names. + + Args: + output_dir (str): File path to directory + node_output_variables (list): List of strings for node output variables. + link_output_variables (list): List of strings for link output variables. + data_to_csv (bool): True if write network in csv format. + data_to_dbf (bool): True if write network in dbf/shp format. + output_link_shp (str): File name to output link dbf/shp. + output_node_shp (str): File name of output node dbf/shp. + output_link_csv (str): File name to output link csv. + output_node_csv (str): File name to output node csv. + output_gpkg (str): File name to output GeoPackage. + output_link_gpkg_layer (str): Layer name within output_gpkg to output links. + output_node_gpkg_layer (str): Layer name within output_gpkg to output links. + output_gpkg_link_filter (str): Optional column name to additional output link subset layers + + Returns: + None + """ + + WranglerLogger.info("Writing Network as Shapefile") + WranglerLogger.debug( + "Output Variables: \n - {}".format( + "\n - ".join(self.parameters.output_variables) + ) + ) + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_met_council_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + # unless specified that all the data goes to the DBF, only output A and B + dbf_link_output_variables = ( + #MTC + link_output_variables if link_output_variables else ["A", "B", "geometry"] + #MC + #link_output_variables if data_to_dbf else ["A", "B", "shape_id", "geometry"] + ) + + # Removing code to set this to versions from parameters + # User can use these as arg + + """ + Start Process + """ + # rename these to short only for shapefile option + if output_node_shp: + WranglerLogger.info("Renaming DBF Node Variables") + nodes_dbf_df = self.rename_variables_for_dbf(self.nodes_mtc_df, output_variables=node_output_variables) + else: + WranglerLogger.debug("nodes_mtc_df columns: {}".format(list(self.nodes_mtc_df.columns))) + nodes_dbf_df = self.nodes_mtc_df[node_output_variables] + + if output_link_shp: + WranglerLogger.info("Renaming DBF Link Variables") + links_dbf_df = self.rename_variables_for_dbf(self.links_mtc_df, output_variables=dbf_link_output_variables) + else: + WranglerLogger.debug("links_mtc_df columns: {}".format(list(self.links_mtc_df.columns))) + links_dbf_df = self.links_mtc_df[dbf_link_output_variables] + + links_dbf_df = gpd.GeoDataFrame(links_dbf_df, 
geometry=links_dbf_df["geometry"]) + + # temp debug + WranglerLogger.debug("links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))]:\n{}".format( + links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))] + )) + + if output_node_shp: + WranglerLogger.info("Writing Node Shapes: {}".format(os.path.join(output_dir, output_node_shp))) + nodes_dbf_df.to_file(os.path.join(output_dir, output_node_shp)) + + if output_gpkg and output_node_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Node Layer {}".format(os.path.join(output_dir, output_gpkg), output_node_gpkg_layer)) + nodes_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_node_gpkg_layer, driver="GPKG") + + if output_link_shp: + WranglerLogger.info("Writing Link Shapes: {}".format(os.path.join(output_dir, output_link_shp))) + links_dbf_df.to_file(os.path.join(output_dir, output_link_shp)) + + # debug test + link_schema = { + "properties": { + "A" : "int:8", + "B" : "int:8", + "model_link_id" : "int:10", + "shstGeometryId": "str:32", + "name" : "str:84", + "ft" : "int:2", + "assignable" : "int:18", + "cntype" : "str:80", + "distance" : "float", + "county" : "str:15", + "bike_access" : "int:2", + "drive_access" : "int:2", + "walk_access" : "int:2", + "rail_only" : "int:2", + "bus_only" : "int:2", + "transit" : "int:2", + "managed" : "int:2", + "tollbooth" : "int:2", + "tollseg" : "int:2", + "segment_id" : "int:4", + "lanes_EA" : "int:2", + "lanes_AM" : "int:2", + "lanes_MD" : "int:2", + "lanes_PM" : "int:2", + "lanes_EV" : "int:2", + "useclass_EA" : "int:2", + "useclass_AM" : "int:2", + "useclass_MD" : "int:2", + "useclass_PM" : "int:2", + "useclass_EV" : "int:2", + "nmt2010" : "int:2", + "nmt2020" : "int:2", + }, + "geometry": "LineString" + } + if output_gpkg and output_link_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Link Layer {}".format(os.path.join(output_dir, output_gpkg), output_link_gpkg_layer)) + links_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_link_gpkg_layer, schema=link_schema, driver="GPKG") + + # output additional link layers if filter column is specified + # e.g. if county-subsets are output + if output_gpkg_link_filter: + link_value_counts = links_dbf_df[output_gpkg_link_filter].value_counts() + for filter_val,filter_count in link_value_counts.items(): + gpkg_layer_name = "{}_{}".format(output_link_gpkg_layer, filter_val) + gpkg_layer_name = gpkg_layer_name.replace(" ","_") + WranglerLogger.info("Writing GeoPackage {} with Link Layer {} for {} rows".format( + os.path.join(output_dir, output_gpkg), gpkg_layer_name, filter_count)) + links_dbf_df.loc[ links_dbf_df[output_gpkg_link_filter]==filter_val ].to_file( + os.path.join(output_dir, output_gpkg), layer=gpkg_layer_name, schema=link_schema, driver="GPKG") + + + + + if data_to_csv: + WranglerLogger.info( + "Writing Network Data to CSVs:\n - {}\n - {}".format( + output_link_csv, output_node_csv + ) + ) + self.links_mtc_df[link_output_variables].to_csv( + output_link_csv, index=False + ) + self.nodes_mtc_df[node_output_variables].to_csv( + output_node_csv, index=False + )
+ + + # this should be moved to util +
[docs] @staticmethod + def dataframe_to_fixed_width(df): + """ + Convert dataframe to fixed width format, geometry column will not be transformed. + + Args: + df (pandas DataFrame). + + Returns: + pandas dataframe: dataframe with fixed width for each column. + dict: dictionary with columns names as keys, column width as values. + """ + WranglerLogger.info("Starting fixed width conversion") + if 'name' in df.columns: + df['name']=df['name'].apply(lambda x: x.strip().split(',')[0].replace("[",'').replace("'nan'","").replace("nan","").replace("'","")) + + + # get the max length for each variable column + max_width_dict = dict( + [ + (v, df[v].apply(lambda r: len(str(r)) if r != None else 0).max()) + for v in df.columns.values + if v != "geometry" + ] + ) + + fw_df = df.drop("geometry", axis=1).copy() + for c in fw_df.columns: + fw_df[c] = fw_df[c].apply(lambda x: str(x)) + fw_df["pad"] = fw_df[c].apply(lambda x: " " * (max_width_dict[c] - len(x))) + fw_df[c] = fw_df.apply(lambda x: x["pad"] + x[c], axis=1) + + return fw_df, max_width_dict
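A small runnable sketch of the padding behavior, with made-up data and the geometry column omitted (the method drops it anyway):

    import pandas as pd

    # Illustrative only: left-pad every value to its column's maximum string width.
    df = pd.DataFrame({"N": [1, 25, 300], "county": ["Ramsey", "Hennepin", "Scott"]})
    max_width = {c: int(df[c].astype(str).str.len().max()) for c in df.columns}
    fw_df = df.astype(str).apply(lambda col: col.str.rjust(max_width[col.name]))
    print(fw_df.to_string(index=False, header=False))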
+ +
[docs] def write_roadway_as_fixedwidth( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + output_link_txt: str = None, + output_node_txt: str = None, + output_link_header_width_txt: str = None, + output_node_header_width_txt: str = None, + output_cube_network_script: str = None, + drive_only: bool = False, + ): + """ + Writes out fixed width file. + + This function does: + 1. write out link and node fixed width data files for cube. + 2. write out header and width correspondence. + 3. write out cube network building script with header and width specification. + + Args: + output_dir (str): File path to where links, nodes and script will be written and run + node_output_variables (list): list of node variable names. + link_output_variables (list): list of link variable names. + output_link_txt (str): File name of output link database (within output_dir) + output_node_txt (str): File name of output node database (within output_dir) + output_link_header_width_txt (str): File name of link column width records (within output_dir) + output_node_header_width_txt (str): File name of node column width records (within output_dir) + output_cube_network_script (str): File name of CUBE network building script (within output_dir) + drive_only (bool): If True, only writes drive nodes and links + + Returns: + None + + """ + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_mtc_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + output_link_txt = ( + output_link_txt if output_link_txt else self.parameters.output_link_txt + ) + + output_node_txt = ( + output_node_txt if output_node_txt else self.parameters.output_node_txt + ) + + output_link_header_width_txt = ( + output_link_header_width_txt + if output_link_header_width_txt + else self.parameters.output_link_header_width_txt + ) + + output_node_header_width_txt = ( + output_node_header_width_txt + if output_node_header_width_txt + else self.parameters.output_node_header_width_txt + ) + + output_cube_network_script = ( + output_cube_network_script + if output_cube_network_script + else self.parameters.output_cube_network_script + ) + + """ + Start Process + """ + #MTC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_mtc_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df['drive_access'] == 1] + #/MTC + """ + #MC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_metcouncil_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df["drive_access"] == 1] + #/MC + """ + WranglerLogger.info("Writing out link database") + + link_ff_df.to_csv(os.path.join(output_dir, output_link_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out link header and width ----") + link_max_width_df = DataFrame( + 
list(link_max_width_dict.items()), columns=["header", "width"] + ) + link_max_width_df.to_csv(os.path.join(output_dir, output_link_header_width_txt), index=False) + + #MTC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_mtc_df[node_output_variables] + ) + #/MTC + """ + #MC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_metcouncil_df[node_output_variables] + ) + #/MC + """ + WranglerLogger.info("Writing out node database") + + if drive_only: + node_ff_df = node_ff_df.loc[node_ff_df["drive_node"] == 1] + + + node_ff_df.to_csv(os.path.join(output_dir, output_node_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out node header and width") + node_max_width_df = DataFrame( + list(node_max_width_dict.items()), columns=["header", "width"] + ) + node_max_width_df.to_csv(os.path.join(output_dir, output_node_header_width_txt), index=False) + + # write out cube script + s = 'RUN PGM = NETWORK MSG = "Read in network from fixed width file" \n' + s += 'FILEI LINKI[1] = "{}",'.format(output_link_txt) + start_pos = 1 + for i in range(len(link_max_width_df)): + s += " VAR=" + link_max_width_df.header.iloc[i] + + if ( + self.links_mtc_df.dtypes.loc[link_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(link_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(link_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += link_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += "\n" + s += 'FILEI NODEI[1] = "{}",'.format(output_node_txt) + start_pos = 1 + for i in range(len(node_max_width_df)): + s += " VAR=" + node_max_width_df.header.iloc[i] + + if ( + self.nodes_mtc_df.dtypes.loc[node_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(node_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(node_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += node_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += '\n' + s += 'FILEO NETO = "complete_network.net"\n\n' + s += ' ZONES = {}\n\n'.format(self.parameters.zones) + s += '; Trim leading whitespace from string variables\n' + # todo: The below should be built above based on columns that are strings + s += ' phase=NODEMERGE\n' + s += ' county = LTRIM(county)\n' + s += ' endphase\n' + s += ' phase=LINKMERGE\n' + s += ' name = LTRIM(name)\n' + s += ' county = LTRIM(county)\n' + s += ' cntype = LTRIM(cntype)\n' + s += ' endphase\n' + s += '\nENDRUN\n' + + with open(os.path.join(output_dir, output_cube_network_script), "w") as f: + f.write(s) + + # run the cube script to create the cube network + import subprocess + env = copy.copy(os.environ) + cube_cmd = '"C:\\Program Files\\Citilabs\\CubeVoyager\\runtpp.exe" {}'.format(output_cube_network_script) + try: + WranglerLogger.info("Running [{}] in cwd [{}]".format(cube_cmd, output_dir)) + ret = subprocess.run(cube_cmd, cwd=output_dir, capture_output=True, check=True) + + WranglerLogger.info("return code: {}".format(ret.returncode)) + + for line in ret.stdout.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stdout: {}".format(line)) + + for line in ret.stderr.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stderr: {}".format(line)) + + except Exception as e: + WranglerLogger.error(e)
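For reference, a shortened, hypothetical excerpt of the kind of Cube NETWORK build script the code above assembles; field names, widths, zone count, and file names are illustrative only:

    example_script = (
        'RUN PGM = NETWORK MSG = "Read in network from fixed width file" \n'
        'FILEI LINKI[1] = "links.txt", VAR=A, BEG=1, LEN=7, VAR=B, BEG=9, LEN=7\n'
        'FILEI NODEI[1] = "nodes.txt", VAR=N, BEG=1, LEN=7, VAR=X, BEG=9, LEN=14\n'
        'FILEO NETO = "complete_network.net"\n\n'
        ' ZONES = 3100\n\n'
        'ENDRUN\n'
    )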
\ No newline at end of file diff --git a/branch/bicounty_dev/_modules/lasso/transit/index.html b/branch/bicounty_dev/_modules/lasso/transit/index.html new file mode 100644 index 0000000..13a7d41 --- /dev/null +++ b/branch/bicounty_dev/_modules/lasso/transit/index.html @@ -0,0 +1,1858 @@
+lasso.transit — lasso documentation
Source code for lasso.transit

+"""Transit-related classes to parse, compare, and write standard and cube transit files.
+
+  Typical usage example:
+
+    tn = CubeTransit.create_from_cube(CUBE_DIR)
+    transit_change_list = tn.evaluate_differences(base_transit_network)
+
+    cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+    cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+"""
+import os
+import copy
+import csv
+import datetime, time
+from typing import Any, Dict, Optional, Union
+
+from lark import Lark, Transformer, v_args
+from pandas import DataFrame
+
+import pandas as pd
+import partridge as ptg
+import numpy as np
+
+from network_wrangler import TransitNetwork
+
+from .logger import WranglerLogger
+from .parameters import Parameters
+
+
[docs]class CubeTransit(object): + """Class for storing information about transit defined in Cube line + files. + + Has the capability to: + + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + + .. highlight:: python + + Typical usage example: + :: + tn = CubeTransit.create_from_cube(CUBE_DIR) + transit_change_list = tn.evaluate_differences(base_transit_network) + + Attributes: + lines (list): list of strings representing unique line names in + the cube network. + line_properties (dict): dictionary of line properties keyed by line name. Property + values are stored in a dictionary by property name. These + properties are directly read from the cube line files and haven't + been translated to standard transit values. + shapes (dict): dictionary of shapes + keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns: + - 'node_id' (int): positive integer of node id + - 'node' (int): node number, with negative indicating a non-stop + - 'stop' (boolean): indicates if it is a stop + - 'order' (int): order within this shape + program_type (str): Either PT or TRNBLD + parameters (Parameters): + Parameters instance that will be applied to this instance which + includes information about time periods and variables. + source_list (list): + List of cube line file sources that have been read and added. + diff_dict (dict): + """ + +
[docs] def __init__(self, parameters: Union[Parameters, dict] = {}): + """ + Constructor for CubeTransit + + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + WranglerLogger.debug("Creating a new Cube Transit instance") + + self.lines = [] + + self.line_properties = {} + self.shapes = {} + + self.program_type = None + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.source_list = [] + + self.diff_dict = Dict[str, Any]
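Construction accepts either a plain dict of Parameters keyword arguments or an existing Parameters instance (which is copied); anything else raises a ValueError. A short sketch, assuming the package layout shown in the imports above and that a default Parameters object can be built without arguments:

    from lasso.parameters import Parameters
    from lasso.transit import CubeTransit

    tn_default = CubeTransit()                           # empty dict -> default Parameters
    tn_from_obj = CubeTransit(parameters=Parameters())   # settings copied from the instance
    # CubeTransit(parameters="oops") would raise ValueError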
+ +
[docs] def add_cube(self, transit_source: str): + """Reads a .lin file and adds it to existing TransitNetwork instance. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + """ + + """ + Figure out what kind of transit source it is + """ + + parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, debug="debug", parser="lalr") + + if "NAME=" in transit_source: + WranglerLogger.debug("reading transit source as string") + self.source_list.append("input_str") + parse_tree = parser.parse(transit_source) + elif os.path.isfile(transit_source): + print("reading: {}".format(transit_source)) + with open(transit_source) as file: + WranglerLogger.debug( + "reading transit source: {}".format(transit_source) + ) + self.source_list.append(transit_source) + parse_tree = parser.parse(file.read()) + elif os.path.isdir(transit_source): + import glob + + for lin_file in glob.glob(os.path.join(transit_source, "*.LIN")): + self.add_cube(lin_file) + return + else: + msg = "{} not a valid transit line string, directory, or file" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("finished parsing cube line file") + # WranglerLogger.debug("--Parse Tree--\n {}".format(parse_tree.pretty())) + transformed_tree_data = CubeTransformer().transform(parse_tree) + # WranglerLogger.debug("--Transformed Parse Tree--\n {}".format(transformed_tree_data)) + + _line_data = transformed_tree_data["lines"] + + line_properties_dict = {k: v["line_properties"] for k, v in _line_data.items()} + line_shapes_dict = {k: v["line_shape"] for k, v in _line_data.items()} + new_lines = list(line_properties_dict.keys()) + """ + Before adding lines, check to see if any are overlapping with existing ones in the network + """ + + overlapping_lines = set(new_lines) & set(self.lines) + if overlapping_lines: + msg = "Overlapping lines found when adding from {}. \nSource files:\n{}\n{} Overlapping Lines of {} total new lines.\n-->{}".format( + transit_source, + "\n - ".join(self.source_list), + len(new_lines), + len(overlapping_lines), + overlapping_lines, + ) + print(msg) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.program_type = transformed_tree_data.get("program_type", None) + + self.lines += new_lines + self.line_properties.update(line_properties_dict) + self.shapes.update(line_shapes_dict) + + WranglerLogger.debug("Added lines to CubeTransit: \n".format(new_lines))
+ +
[docs] @staticmethod + def create_from_cube(transit_source: str, parameters: Optional[dict] = {}): + """ + Reads a cube .lin file and stores as TransitNetwork object. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + Returns: + A ::CubeTransit object created from the transit_source. + """ + + tn = CubeTransit(parameters) + tn.add_cube(transit_source) + + return tn
+ +
[docs] def evaluate_differences(self, base_transit): + """ + 1. Identifies what routes need to be updated, deleted, or added + 2. For routes being added or updated, identify if the time periods + have changed or if there are multiples, and make duplicate lines if so + 3. Create project card dictionaries for each change. + + Args: + base_transit (CubeTransit): an instance of this class for the base condition + + Returns: + A list of dictionaries containing project card changes + required to evaluate the differences between the base network + and this transit network instance. + """ + transit_change_list = [] + + """ + Identify what needs to be evaluated + """ + lines_to_update = [l for l in self.lines if l in base_transit.lines] + lines_to_delete = [l for l in base_transit.lines if l not in self.lines] + lines_to_add = [l for l in self.lines if l not in base_transit.lines] + + project_card_changes = [] + + """ + Evaluate Property Updates + """ + + for line in lines_to_update: + WranglerLogger.debug( + "Finding differences in time periods for: {}".format(line) + ) + + """ + Find any additional time periods that might need to add or delete. + """ + base_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + base_transit.line_properties[line] + ) + ) + + try: + assert len(base_cube_time_period_numbers) == 1 + except: + msg = "Base network line {} should only have one time period per route, but {} found".format( + line, base_cube_time_period_numbers + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + base_cube_time_period_number = base_cube_time_period_numbers[0] + + build_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + + time_periods_to_add = [ + tp + for tp in build_cube_time_period_numbers + if tp not in base_cube_time_period_numbers + ] + + for tp in time_periods_to_add: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + time_periods_to_delete = [ + tp + for tp in base_cube_time_period_numbers + if tp not in build_cube_time_period_numbers + ] + + for tp in time_periods_to_delete: + lines_to_delete.append(line) + + WranglerLogger.debug("Evaluating differences in: {}".format(line)) + updated_properties = self.evaluate_route_property_differences( + self.line_properties[line], + base_transit.line_properties[line], + base_cube_time_period_number, + ) + updated_shapes = CubeTransit.evaluate_route_shape_changes( + self.shapes[line].node, base_transit.shapes[line].node + ) + if updated_properties: + update_prop_card_dict = self.create_update_route_card_dict( + line, updated_properties + ) + project_card_changes.append(update_prop_card_dict) + + if updated_shapes: + update_shape_card_dict = self.create_update_route_card_dict( + line, updated_shapes + ) + project_card_changes.append(update_shape_card_dict) + + """ + Evaluate Deletions + """ + for line in lines_to_delete: + delete_card_dict = self.create_delete_route_card_dict( + line, base_transit.line_properties[line] + ) + project_card_changes.append(delete_card_dict) + + """ + Evaluate Additions + + First assess if need to add multiple routes if there are multiple time periods + """ + for line in lines_to_add: + time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + if len(time_period_numbers) > 1: + for tp in time_period_numbers[1:]: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + for line in lines_to_add: + 
add_card_dict = self.create_add_route_card_dict(line) + project_card_changes.append(add_card_dict) + + return project_card_changes
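In practice the comparison workflow mirrors the class docstring's usage example; BASE_DIR and BUILD_DIR below are placeholder paths to folders of .lin files:

    base_transit_net = CubeTransit.create_from_cube(BASE_DIR)
    build_transit_net = CubeTransit.create_from_cube(BUILD_DIR)
    transit_change_list = build_transit_net.evaluate_differences(base_transit_net)
    # each item is a project card change dict: update, delete, or new transit service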
+ +
[docs] def add_additional_time_periods( + self, new_time_period_number: int, orig_line_name: str + ): + """ + Copies a route to another cube time period with appropriate + values for time-period-specific properties. + + New properties are stored under the new name in: + - ::self.shapes + - ::self.line_properties + + Args: + new_time_period_number (int): cube time period number + orig_line_name(str): name of the originating line, from which + the new line will copy its properties. + + Returns: + Line name with new time period. + """ + WranglerLogger.debug( + "adding time periods {} to line {}".format( + new_time_period_number, orig_line_name + ) + ) + + ( + route_id, + _init_time_period, + agency_id, + direction_id, + ) = CubeTransit.unpack_route_name(orig_line_name) + new_time_period_name = self.parameters.cube_time_periods[new_time_period_number] + new_tp_line_name = CubeTransit.build_route_name( + route_id=route_id, + time_period=new_time_period_name, + agency_id=agency_id, + direction_id=direction_id, + ) + + try: + assert new_tp_line_name not in self.lines + except: + msg = "Trying to add a new time period {} to line {}, but constructed name {} is already in line list.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + WrangerLogger.error(msg) + raise ValueError(msg) + + # copy to a new line and add it to list of lines to add + self.line_properties[new_tp_line_name] = copy.deepcopy( + self.line_properties[orig_line_name] + ) + self.shapes[new_tp_line_name] = copy.deepcopy(self.shapes[orig_line_name]) + self.line_properties[new_tp_line_name]["NAME"] = new_tp_line_name + + """ + Remove entries that aren't for this time period from the new line's properties list. + """ + this_time_period_properties_list = [ + p + "[" + str(new_time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + self.line_properties[new_tp_line_name].pop(k, None) + + """ + Remove entries for time period from the original line's properties list. + """ + for k in this_time_period_properties_list: + self.line_properties[orig_line_name].pop(k, None) + + """ + Add new line to list of lines to add. + """ + WranglerLogger.debug( + "Adding new time period {} for line {} as {}.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + ) + return new_tp_line_name
+ +
[docs] def create_update_route_card_dict(self, line: str, updated_properties_dict: dict): + """ + Creates a project card change formatted dictionary for updating + the line. + + Args: + line: name of line that is being updated + updated_properties_dict: dictionary of attributes to update as + 'property': <property name>, + 'set': <new property value> + + Returns: + A project card change-formatted dictionary for the attribute update. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.split("_")[-2].strip("d\"")), + "shape_id": line.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + "properties": updated_properties_dict, + } + WranglerLogger.debug( + "Updating {} route to changes:\n{}".format(line, str(update_card_dict)) + ) + + return update_card_dict
+ +
[docs] def create_delete_route_card_dict( + self, line: str, base_transit_line_properties_dict: dict + ): + """ + Creates a project card change formatted dictionary for deleting a line. + + Args: + line: name of line that is being deleted + base_transit_line_properties_dict: dictionary of cube-style + attribute values in order to find time periods and + start and end times. + + Returns: + A project card change-formatted dictionary for the route deletion. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + base_transit_line_properties_dict + ) + + delete_card_dict = { + "category": "Delete Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + } + WranglerLogger.debug( + "Deleting {} route to changes:\n{}".format(line, delete_card_dict) + ) + + return delete_card_dict
+ +
[docs] def create_add_route_card_dict(self, line: str): + """ + Creates a project card change formatted dictionary for adding + a route based on the information in self.route_properties for + the line. + + Args: + line: name of line that is being added + + Returns: + A project card change-formatted dictionary for the route addition. + """ + start_time_str, end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + standard_properties = self.cube_properties_to_standard_properties( + self.line_properties[line] + ) + + routing_properties = { + "property": "routing", + "set": self.shapes[line]["node"].tolist(), + } + + add_card_dict = { + "category": "New Transit Service", + "facility": { + "route_id": line.split("_")[1], + # direction is the trailing digit of the line name (see unpack_route_name) + "direction_id": int(line.strip('"').split("_")[-1][-1]), + "start_time": start_time_str, + "end_time": end_time_str, + "agency_id": line.strip('"').split("_")[0], + }, + "properties": standard_properties + [routing_properties], + } + + WranglerLogger.debug( + "Adding {} route to changes:\n{}".format(line, add_card_dict) + ) + return add_card_dict
+ +
[docs] @staticmethod + def get_time_period_numbers_from_cube_properties(properties_list: list): + """ + Finds properties that are associated with time periods and + returns the numbers found in them. + + Args: + properties_list (list): list of all properties. + + Returns: + list of strings of the time period numbers found + """ + time_periods_list = [] + for p in properties_list: + if ("[" not in p) or ("]" not in p): + continue + tp_num = p.split("[")[1][0] + if tp_num and tp_num not in time_periods_list: + time_periods_list.append(tp_num) + return time_periods_list
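For example, the bracketed time-period numbers are returned as strings:

    props = ["NAME", "HEADWAY[1]", "HEADWAY[2]", "FREQ[2]", "ONEWAY"]
    CubeTransit.get_time_period_numbers_from_cube_properties(props)
    # -> ['1', '2']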
+ +
[docs] @staticmethod + def build_route_name( + route_id: str = "", + time_period: str = "", + agency_id: str = 0, + direction_id: str = 1, + ): + """ + Create a route name by concatenating route, time period, agency, and direction + + Args: + route_id: i.e. 452-111 + time_period: i.e. pk + direction_id: i.e. 1 + agency_id: i.e. 0 + + Returns: + constructed line_name i.e. "0_452-111_452_pk1" + """ + + return ( + str(agency_id) + + "_" + + str(route_id) + + "_" + + str(route_id.split("-")[0]) + + "_" + + str(time_period) + + str(direction_id) + )
+ +
[docs] @staticmethod + def unpack_route_name(line_name: str): + """ + Unpacks route name into direction, route, agency, and time period info + + Args: + line_name (str): i.e. "0_452-111_452_pk1" + + Returns: + route_id (str): 452-111 + time_period (str): i.e. pk + direction_id (str) : i.e. 1 + agency_id (str) : i.e. 0 + """ + + line_name = line_name.strip('"') + + agency_id, route_id, _rtid, _tp_direction = line_name.split("_") + time_period = _tp_direction[0:-1] + direction_id = _tp_direction[-1] + + return route_id, time_period, agency_id, direction_id
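build_route_name and unpack_route_name are inverses for well-formed names, matching the docstring examples:

    name = CubeTransit.build_route_name(
        route_id="452-111", time_period="pk", agency_id="0", direction_id="1"
    )
    # name == "0_452-111_452_pk1"
    CubeTransit.unpack_route_name(name)
    # -> ('452-111', 'pk', '0', '1')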
+ +
[docs] def calculate_start_end_times(self, line_properties_dict: dict): + """ + Calculate the start and end times of the property change + WARNING: Doesn't take care of discongruous time periods!!!! + + Args: + line_properties_dict: dictionary of cube-flavor properties for a transit line + """ + start_time_m = 24 * 60 + end_time_m = 0 * 60 + + WranglerLogger.debug( + "parameters.time_period_properties_list: {}".format( + self.parameters.time_period_properties_list + ) + ) + current_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + line_properties_dict + ) + ) + + WranglerLogger.debug( + "current_cube_time_period_numbers:{}".format( + current_cube_time_period_numbers + ) + ) + + for tp in current_cube_time_period_numbers: + time_period_name = self.parameters.cube_time_periods[tp] + WranglerLogger.debug("time_period_name:{}".format(time_period_name)) + _start_time, _end_time = self.parameters.time_period_to_time[ + time_period_name + ] + + # change from "HH:MM" to integer # of seconds + _start_time_m = (int(_start_time.split(":")[0]) * 60) + int( + _start_time.split(":")[1] + ) + _end_time_m = (int(_end_time.split(":")[0]) * 60) + int( + _end_time.split(":")[1] + ) + + # find bounding start and end times + if _start_time_m < start_time_m: + start_time_m = _start_time_m + if _end_time_m > end_time_m: + end_time_m = _end_time_m + + if start_time_m > end_time_m: + msg = "Start time ({}) is after end time ({})".format( + start_time_m, end_time_m + ) + #WranglerLogger.error(msg) + #raise ValueError(msg) + + start_time_str = "{:02d}:{:02d}".format(*divmod(start_time_m, 60)) + end_time_str = "{:02d}:{:02d}".format(*divmod(end_time_m, 60)) + return start_time_str, end_time_str
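The bounding minutes-from-midnight values are formatted back into "HH:MM" strings with divmod, for example:

    start_time_m, end_time_m = 360, 900                   # 06:00 and 15:00 as minutes from midnight
    "{:02d}:{:02d}".format(*divmod(start_time_m, 60))     # -> '06:00'
    "{:02d}:{:02d}".format(*divmod(end_time_m, 60))       # -> '15:00'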
+ +
[docs] @staticmethod + def cube_properties_to_standard_properties(cube_properties_dict: dict): + """ + Converts cube style properties to standard properties. + + This is most pertinent to time-period-specific variables like headway, + and variables that have standard units, such as headway, which is in minutes + in cube and in seconds in standard format. + + Args: + cube_properties_dict: <cube style property name> : <property value> + + Returns: + A list of dictionaries with values for `"property": <standard + style property name>, "set" : <property value with correct units>` + + """ + standard_properties_list = [] + for k, v in cube_properties_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + change_item["set"] = v * 60 + else: + change_item["property"] = k + change_item["set"] = v + standard_properties_list.append(change_item) + + return standard_properties_list
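For example, a cube headway in minutes becomes a standard headway_secs entry, while any other property passes through unchanged:

    CubeTransit.cube_properties_to_standard_properties({"HEADWAY[1]": 10, "ONEWAY": "T"})
    # -> [{'property': 'headway_secs', 'set': 600},
    #     {'property': 'ONEWAY', 'set': 'T'}]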
+ +
[docs] def evaluate_route_property_differences( + self, + properties_build: dict, + properties_base: dict, + time_period_number: str, + absolute: bool = True, + validate_base: bool = False, + ): + """ + Checks if any values have been updated or added for a specific + route and creates project card entries for each. + + Args: + properties_build: ::<property_name>: <property_value> + properties_base: ::<property_name>: <property_value> + time_period_number: time period to evaluate + absolute: if True, will use `set` command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway + validate_base: if True, will add the `existing` line in the project card + + Returns: + transit_change_list (list): a list of dictionary values suitable for writing to a project card + `{ + 'property': <property_name>, + 'set': <set value>, + 'change': <change from existing value>, + 'existing': <existing value to check>, + }` + + """ + + # Remove time period specific values for things that aren't part of the time period in question + this_time_period_properties_list = [ + p + "[" + str(time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + properties_build.pop(k, None) + properties_base.pop(k, None) + + difference_dict = dict( + set(properties_build.items()) ^ set(properties_base.items()) + ) + + # Iterate through properties list to build difference project card list + + properties_list = [] + for k, v in difference_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + + if absolute: + change_item["set"] = ( + v * 60 + ) # project cards are in secs, cube is in minutes + else: + change_item["change"] = ( + properties_build[k] - properties_base[k] + ) * 60 + if validate_base or not absolute: + change_item["existing"] = properties_base[k] * 60 + else: + change_item["property"] = k + change_item["set"] = v + if validate_base: + change_item["existing"] = properties_base[k] + + properties_list.append(change_item) + WranglerLogger.debug( + "Evaluated Route Changes: \n {})".format( + "\n".join(map(str, properties_list)) + ) + ) + return properties_list
+ +
[docs] @staticmethod + def evaluate_route_shape_changes( + shape_build: DataFrame, shape_base: DataFrame + ): + """ + Compares two route shapes and constructs a list of changes + suitable for a project card. + + Args: + shape_build: DataFrame of the build-version of the route shape. + shape_base: DataFrame of the base-version of the route shape. + + Returns: + List of shape changes formatted as a project card-change dictionary. + + """ + + if shape_build.equals(shape_base): + return None + + shape_change_list = [] + + base_node_list = shape_base.tolist() + build_node_list = shape_build.tolist() + + sort_len = max(len(base_node_list), len(build_node_list)) + + start_pos = None + end_pos = None + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + start_pos = i - 1 + break + if base_node_list[i] != build_node_list[i]: + start_pos = i + break + else: + continue + + j = -1 + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + end_pos = j + 1 + break + if base_node_list[j] != build_node_list[j]: + end_pos = j + break + else: + j -= 1 + + if start_pos or end_pos: + existing = base_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + set = build_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + + shape_change_list.append( + {"property": "routing", "existing": existing, "set": set} + ) + + return shape_change_list
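For example, a single substituted node is reported with up to two unchanged nodes of context on either side of the edit:

    import pandas as pd

    base = pd.Series([1, 2, 3, 4, 5])
    build = pd.Series([1, 2, 6, 4, 5])
    CubeTransit.evaluate_route_shape_changes(shape_build=build, shape_base=base)
    # -> [{'property': 'routing', 'existing': [1, 2, 3, 4], 'set': [1, 2, 6, 4]}]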
+ + +
[docs]class StandardTransit(object): + """Holds a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's + Cube Line files. + + .. highlight:: python + Typical usage example: + :: + cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) + cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) + + Attributes: + feed: Partridge Feed object containing read-only access to GTFS feed + parameters (Parameters): Parameters instance containing information + about time periods and variables. + """ + +
[docs] def __init__(self, ptg_feed, parameters: Union[Parameters, dict] = {}): + """ + + Args: + ptg_feed: partridge feed object + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + self.feed = ptg_feed + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] @staticmethod + def fromTransitNetwork( + transit_network_object: TransitNetwork, parameters: Union[Parameters, dict] = {} + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + transit_network_object: Reference to an instance of TransitNetwork. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit + """ + return StandardTransit(transit_network_object.feed, parameters=parameters)
+ +
[docs] @staticmethod + def read_gtfs(gtfs_feed_dir: str, parameters: Union[Parameters, dict] = {}): + """ + Reads GTFS files from a directory and returns a StandardTransit + instance. + + Args: + gtfs_feed_dir: location of the GTFS files + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit instance + """ + return StandardTransit(ptg.load_feed(gtfs_feed_dir), parameters=parameters)
+ +
[docs] def write_as_cube_lin(self, outpath: str = None): + """ + Writes the gtfs feed as a cube line file after + converting gtfs properties to MetCouncil cube properties. + #MC + Args: + outpath: File location for output cube line file. + + """ + if not outpath: + outpath = os.path.join(self.parameters.scratch_location, "outtransit.lin") + trip_cube_df = self.route_properties_gtfs_to_cube(self) + + trip_cube_df["LIN"] = trip_cube_df.apply(self.cube_format, axis=1) + + l = trip_cube_df["LIN"].tolist() + + with open(outpath, "w") as f: + f.write("\n".join(l))
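End to end, the GTFS-to-cube conversion is the two calls from the class docstring; BASE_TRANSIT_DIR and WRITE_DIR are placeholders:

    cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
    cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))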
+ +
[docs] @staticmethod + def route_properties_gtfs_to_cube(self): + """ + Prepare gtfs for cube lin file. + #MC + Does the following operations: + 1. Combines route, frequency, trip, and shape information + 2. Converts time of day to time periods + 3. Calculates cube route name from gtfs route name and properties + 4. Assigns a cube-appropriate mode number + 5. Assigns a cube-appropriate operator number + + Returns: + trip_df (DataFrame): DataFrame of trips with cube-appropriate values for: + - NAME + - ONEWAY + - OPERATOR + - MODE + - HEADWAY + """ + WranglerLogger.info( + "Converting GTFS Standard Properties to MetCouncil's Cube Standard" + ) + metro_operator_dict = { + "0": 3, + "1": 3, + "2": 3, + "3": 4, + "4": 2, + "5": 5, + "6": 8, + "7": 1, + "8": 1, + "9": 10, + "10": 3, + "11": 9, + "12": 3, + "13": 4, + "14": 4, + "15": 3, + } + + shape_df = self.feed.shapes.copy() + trip_df = self.feed.trips.copy() + + """ + Add information from: routes, frequencies, and routetype to trips_df + """ + trip_df = pd.merge(trip_df, self.feed.routes, how="left", on="route_id") + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + inv_cube_time_periods_map = { + v: k for k, v in self.parameters.cube_time_periods.items() + } + trip_df["tod_num"] = trip_df.tod_name.map(inv_cube_time_periods_map) + trip_df["tod_name"] = trip_df.tod_name.map( + self.parameters.cube_time_periods_name + ) + + trip_df["NAME"] = trip_df.apply( + lambda x: x.agency_id + + "_" + + x.route_id + + "_" + + x.route_short_name + + "_" + + x.tod_name + + str(x.direction_id), + axis=1, + ) + + trip_df["LONGNAME"] = trip_df["route_long_name"] + trip_df["HEADWAY"] = (trip_df["headway_secs"] / 60).astype(int) + trip_df["MODE"] = trip_df.apply(self.calculate_cube_mode, axis=1) + trip_df["ONEWAY"] = "T" + trip_df["OPERATOR"] = trip_df["agency_id"].map(metro_operator_dict) + + return trip_df
+ +
[docs] def calculate_cube_mode(self, row): + """ + Assigns a cube mode number by following logic. + #MC + For rail, uses GTFS route_type variable: + https://developers.google.com/transit/gtfs/reference + + :: + # route_type : cube_mode + route_type_to_cube_mode = {0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9} # Rail + + For buses, uses route id numbers and route name to find + express and suburban buses as follows: + + :: + if not cube_mode: + if 'express' in row['LONGNAME'].lower(): + cube_mode = 7 # Express + elif int(row['route_id'].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + Args: + row: A DataFrame row with route_type, route_long_name, and route_id + + Returns: + cube mode number + """ + # route_type : cube_mode + route_type_to_cube_mode = { + 0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9, + } # Rail + + cube_mode = route_type_to_cube_mode[row["route_type"]] + + if not cube_mode: + if "express" in row["route_long_name"].lower(): + cube_mode = 7 # Express + elif int(row["route_id"].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + return cube_mode
+ +
[docs] def time_to_cube_time_period( + self, start_time_secs: int, as_str: bool = True, verbose: bool = False + ): + """ + Converts seconds from midnight to the cube time period. + + Args: + start_time_secs: start time for transit trip in seconds + from midnight + as_str: if True, returns the time period as a string, + otherwise returns a numeric time period + + Returns: + this_tp_num: if as_str is False, returns the numeric + time period + this_tp: if as_str is True, returns the Cube time period + name abbreviation + """ + from .util import hhmmss_to_datetime, secs_to_datetime + + # set initial time as the time that spans midnight + + start_time_dt = secs_to_datetime(start_time_secs) + + # set initial time as the time that spans midnight + this_tp = "NA" + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + _dt_end_time = hhmmss_to_datetime(_end_time) + if _dt_start_time > _dt_end_time: + this_tp = tp_name + break + + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + if start_time_dt >= _dt_start_time: + this_time = _dt_start_time + this_tp = tp_name + + if verbose: + WranglerLogger.debug( + "Finding Cube Time Period from Start Time: \ + \n - start_time_sec: {} \ + \n - start_time_dt: {} \ + \n - this_tp: {}".format( + start_time_secs, start_time_dt, this_tp + ) + ) + + if as_str: + return this_tp + + name_to_num = {v: k for k, v in self.parameters.cube_time_periods.items()} + this_tp_num = name_to_num.get(this_tp) + + if not this_tp_num: + msg = ( + "Cannot find time period number in {} for time period name: {}".format( + name_to_num, this_tp + ) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + return this_tp_num
+ +
[docs] def shape_gtfs_to_cube(self, row, add_nntime = False): + """ + Creates a list of nodes that for the route in appropriate + cube format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a string representation of the node list + for a route in cube format. + + """ + agency_raw_name = row.agency_raw_name + shape_id = row.shape_id + trip_id = row.trip_id + + trip_stop_times_df = self.feed.stop_times.copy() + + if 'agency_raw_name' in trip_stop_times_df.columns: + trip_stop_times_df.drop('agency_raw_name', axis = 1, inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, + self.feed.trips[['trip_id', 'agency_raw_name']], + how = 'left', + on = ['trip_id'] + ) + + trip_stop_times_df = trip_stop_times_df[ + (trip_stop_times_df.trip_id == row.trip_id) & + (trip_stop_times_df.agency_raw_name == agency_raw_name) + ] + + trip_node_df = self.feed.shapes.copy() + if 'agency_raw_name' in trip_node_df.columns: + trip_node_df.drop('agency_raw_name', axis = 1, inplace = True) + + trip_node_df = pd.merge( + trip_node_df, + self.feed.trips[['shape_id', 'agency_raw_name']].drop_duplicates(), + how = 'left', + on = ['shape_id'] + ) + + trip_node_df = trip_node_df[ + (trip_node_df.shape_id == shape_id) & + (trip_node_df.agency_raw_name == agency_raw_name) + ] + + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + stops_df = self.feed.stops.copy() + stops_df['stop_id'] = stops_df['stop_id'].astype(float).astype(int) + trip_stop_times_df['stop_id'] = trip_stop_times_df['stop_id'].astype(float).astype(int) + + if 'trip_id' in self.feed.stops.columns: + if agency_raw_name != 'sjrtd_2015_0127': + stops_df = stops_df[stops_df.agency_raw_name != 'sjrtd_2015_0127'] + trip_stop_times_df = pd.merge( + trip_stop_times_df, stops_df.drop('trip_id', axis = 1), how="left", on=["stop_id"] + ) + else: + stops_df = stops_df[stops_df.agency_raw_name == 'sjrtd_2015_0127'] + stops_df['trip_id'] = stops_df['trip_id'].astype(float).astype(int).astype(str) + trip_stop_times_df['trip_id'] = trip_stop_times_df['trip_id'].astype(float).astype(int).astype(str) + trip_stop_times_df = pd.merge( + trip_stop_times_df, stops_df, how="left", on=['agency_raw_name', 'trip_id',"stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, stops_df, how="left", on="stop_id" + ) + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. 
VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + trip_runtime = round(trip_stop_times_df[trip_stop_times_df['NNTIME'] > 0]['NNTIME'].sum(),2) + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + return node_list_str, trip_runtime
+ + +
[docs] def cube_format(self, row): + """ + Creates a string representing the route in cube line file notation. + #MC + Args: + row: row of a DataFrame representing a cube-formatted trip, with the attributes + trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR + + Returns: + string representation of route in cube line file notation + """ + + s = '\nLINE NAME="{}",'.format(row.NAME) + s += '\n LONGNAME="{}",'.format(row.LONGNAME) + s += "\n HEADWAY[{}]={},".format(row.tod_num, row.HEADWAY) + s += "\n MODE={},".format(row.MODE) + s += "\n ONEWAY={},".format(row.ONEWAY) + s += "\n OPERATOR={},".format(row.OPERATOR) + # shape_gtfs_to_cube returns (node_string, run_time); only the node string belongs in NODES= + s += "\n NODES={}".format(self.shape_gtfs_to_cube(row)[0]) + + return s
+ +
[docs] def shape_gtfs_to_emme(self, trip_row): + """ + Creates transit segment for the trips in appropriate + emme format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a dataframe representation of the transit segment + for a trip in emme format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == trip_row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == trip_row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # node list + stop_seq = 0 + nntimes = [] + allow_alightings=[] + allow_boardings=[] + stop_names=[] + + if trip_row.TM2_line_haul_name in ["Light rail", "Heavy rail", "Commuter rail", "Ferry service"]: + add_nntime = True + else: + add_nntime = False + + for nodeIdx in range(len(trip_node_list)): + + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + nntimes.append(nntime_v) + else: + nntimes.append(0) + + pickup_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"pickup_type"].iloc[0] + if pickup_type in [1, "1"]: + allow_alightings.append(0) + else: + allow_alightings.append(1) + + drop_off_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"drop_off_type"].iloc[0] + if drop_off_type in [1, "1"]: + allow_boardings.append(0) + else: + allow_boardings.append(1) + + stop_name = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"stop_name"].iloc[0] + stop_names.append(stop_name) + + else: + nntimes.append(0) + allow_alightings.append(0) + allow_boardings.append(0) + stop_names.append("") + + trip_node_df['time_minutes'] = nntimes + trip_node_df['allow_alightings'] = allow_alightings + trip_node_df['allow_boardings'] = allow_boardings + trip_node_df['stop_name'] = stop_names + trip_node_df['line_id'] = trip_row['line_id'] + trip_node_df['node_id'] = trip_node_df['shape_model_node_id'].astype(int) + trip_node_df['stop_order'] = trip_node_df['shape_pt_sequence'] + + return trip_node_df
+ +
[docs] def evaluate_differences(self, transit_changes): + """ + Compare changes from the transit_changes dataframe with the standard transit network + returns the project card changes in dictionary format + """ + + # simple properties change + trip_df = self.feed.trips.copy() + + mode_crosswalk = pd.read_csv(self.parameters.mode_crosswalk_file) + mode_crosswalk.drop_duplicates(subset = ["agency_raw_name", "route_type", "is_express_bus"], inplace = True) + + trip_df = pd.merge(trip_df, self.feed.routes.drop("agency_raw_name", axis = 1), how="left", on="route_id") + + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod"] = trip_df.start_time.apply(self.time_to_cube_time_period, as_str = False) + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + + trip_df["headway_minutes"] = (trip_df["headway_secs"] / 60).astype(int) + + trip_df = pd.merge(trip_df, self.feed.agency[["agency_name", "agency_raw_name", "agency_id"]], how = "left", on = ["agency_raw_name", "agency_id"]) + + # identify express bus + from .mtc import _is_express_bus + trip_df["is_express_bus"] = trip_df.apply(lambda x: _is_express_bus(x), axis = 1) + trip_df.drop("agency_name", axis = 1 , inplace = True) + + trip_df = pd.merge( + trip_df, + mode_crosswalk.drop("agency_id", axis = 1), + how = "left", + on = ["agency_raw_name", "route_type", "is_express_bus"] + ) + + trip_df["line_id"] = trip_df.apply( + lambda x: str(x.TM2_operator) + + "_" + + str(x.route_id) + + "_" + + x.tod_name + + "_" + + "d" + + str(int(x.direction_id)) + + "_s" + + x.shape_id, + axis=1, + ) + + trip_df["line_id"] = trip_df["line_id"].str.slice(stop = 28) + + project_card_changes = [] + + # lines updated + transit_changes['line_id'] = transit_changes.apply( + lambda x: '-'.join(x['element_id'].split('-')[:-3]) if + x['object'] == 'TRANSIT_STOP' else + x['element_id'], + axis = 1 + ) + + lines_updated_df = transit_changes[ + (transit_changes['operation'] == 'C') & + (transit_changes['line_id'].isin(trip_df['line_id'].tolist())) + ].copy() + + ######################### + # simple property changes + ######################### + + property_changes_df = lines_updated_df[ + lines_updated_df.object == 'TRANSIT_LINE' + ].copy() + + property_attribute_list = ['headway_secs'] + + for index, row in property_changes_df.iterrows(): + line_id = row['line_id'] + properties_list = [] + change_item = {} + for c in property_attribute_list: + existing_value = int(trip_df[ + trip_df['line_id'] == line_id + ][c].iloc[0]) + + change_item["existing"] = existing_value + + if c == 'headway_secs': + change_item["set"] = row['headway'] * 60 + else: + change_item["set"] = row[c] + + change_item["property"] = c + + properties_list.append(change_item) + + property_changes_df.loc[index, 'properties'] = properties_list + + ############### + # shape changes + ############### + + shape_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_SHAPE']) + ].copy() + + for index, row in shape_changes_df.iterrows(): + line_id = row.line_id + + # get base shape + trip_row = trip_df[trip_df.line_id == line_id].copy().squeeze() + + base_shape = self.shape_gtfs_to_emme( + trip_row=trip_row + ) + base_shape['shape_model_node_id'] = base_shape['shape_model_node_id'].astype(int) + + # get build shape + build_shape = row.new_itinerary + + updated_shapes = CubeTransit.evaluate_route_shape_changes( + shape_base = base_shape.shape_model_node_id, + shape_build = pd.Series(row.new_itinerary) + ) + 
updated_shapes[0]['property'] = 'shapes' + shape_changes_df.loc[index, 'properties'] = updated_shapes + + ############## + # stop changes + ############## + stop_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_STOP']) + ].copy() + + stop_attribute_list = ['allow_alightings', 'allow_boardings'] + + stop_changes_df = stop_changes_df.groupby( + ['line_id','i_node'] + )[stop_attribute_list].last().reset_index() + + stop_attribute_changes_df = pd.DataFrame() + + for attribute in stop_attribute_list: + + attribute_df = stop_changes_df.groupby( + ['line_id', attribute] + )['i_node'].apply(list).reset_index() + attribute_df['properties'] = attribute_df.apply( + lambda x: { + 'property' : attribute if x[attribute] == True else 'no_'+attribute.split('_')[-1], + 'set': x['i_node']}, + axis = 1 + ) + + stop_attribute_changes_df = pd.concat( + [stop_attribute_changes_df, + attribute_df[['line_id', 'properties']]], + sort = False, + ignore_index = True + ) + + ############## + # combine all transit changes + ############## + transit_changes_df = pd.concat( + [ + property_changes_df, + shape_changes_df, + stop_attribute_changes_df + ], + sort = False, + ignore_index = True + ) + + # groupby line_id + transit_changes_df = transit_changes_df.groupby( + ['line_id'] + )['properties'].apply(list).reset_index() + + # create change items by line_id + for index, row in transit_changes_df.iterrows(): + line_id = row['line_id'] + base_start_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[0] + + base_end_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[1] + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line_id.split("_")[1], + "direction_id": int(line_id.split("_")[-2].strip("d\"")), + "shape_id": line_id.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str + }, + "properties": row['properties'], + } + + project_card_changes.append(update_card_dict) + + return project_card_changes
+ +class CubeTransformer(Transformer): + """A lark-parsing Transformer which transforms the parse-tree to + a dictionary. + + .. highlight:: python + Typical usage example: + :: + transformed_tree_data = CubeTransformer().transform(parse_tree) + + Attributes: + line_order (int): a dynamic counter to hold the order of the nodes within + a route shape + lines_list (list): a list of the line names + """ + + def __init__(self): + self.line_order = 0 + self.lines_list = [] + + def lines(self, line): + # WranglerLogger.debug("lines: \n {}".format(line)) + + # This MUST be a tuple because it returns to start in the tree + lines = {k: v for k, v in line} + return ("lines", lines) + + @v_args(inline=True) + def program_type_line(self, PROGRAM_TYPE, whitespace=None): + # WranglerLogger.debug("program_type_line:{}".format(PROGRAM_TYPE)) + self.program_type = PROGRAM_TYPE.value + + # This MUST be a tuple because it returns to start in the tree + return ("program_type", PROGRAM_TYPE.value) + + @v_args(inline=True) + def line(self, lin_attributes, nodes): + # WranglerLogger.debug("line...attributes:\n {}".format(lin_attributes)) + # WranglerLogger.debug("line...nodes:\n {}".format(nodes)) + lin_name = lin_attributes["NAME"] + + self.line_order = 0 + # WranglerLogger.debug("parsing: {}".format(lin_name)) + + return (lin_name, {"line_properties": lin_attributes, "line_shape": nodes}) + + @v_args(inline=True) + def lin_attributes(self, *lin_attr): + lin_attr = {k: v for (k, v) in lin_attr} + # WranglerLogger.debug("lin_attributes: {}".format(lin_attr)) + return lin_attr + + @v_args(inline=True) + def lin_attr(self, lin_attr_name, attr_value, SEMICOLON_COMMENT=None): + # WranglerLogger.debug("lin_attr {}: {}".format(lin_attr_name, attr_value)) + return lin_attr_name, attr_value + + def lin_attr_name(self, args): + attr_name = args[0].value.upper() + # WranglerLogger.debug(".......args {}".format(args)) + if attr_name in ["FREQ", "HEADWAY"]: + attr_name = attr_name + "[" + str(args[2]) + "]" + return attr_name + + def attr_value(self, attr_value): + try: + return int(attr_value[0].value) + except: + return attr_value[0].value + + def nodes(self, lin_node): + lin_node = DataFrame(lin_node) + # WranglerLogger.debug("nodes:\n {}".format(lin_node)) + + return lin_node + + @v_args(inline=True) + def lin_node(self, NODE_NUM, SEMICOLON_COMMENT=None, *lin_nodeattr): + self.line_order += 1 + n = int(NODE_NUM.value) + return {"node_id": abs(n), "node": n, "stop": n > 0, "order": self.line_order} + + start = dict + + +TRANSIT_LINE_FILE_GRAMMAR = r""" + +start : program_type_line? lines +WHITESPACE : /[ \t\r\n]/+ +STRING : /("(?!"").*?(?<!\\)(\\\\)*?"|'(?!'').*?(?<!\\)(\\\\)*?')/i +SEMICOLON_COMMENT : /;[^\n]*/ +BOOLEAN : "T"i | "F"i +program_type_line : ";;<<" PROGRAM_TYPE ">><<LINE>>;;" WHITESPACE? +PROGRAM_TYPE : "PT" | "TRNBUILD" + +lines : line* +line : "LINE" lin_attributes nodes + +lin_attributes : lin_attr+ +lin_attr : lin_attr_name "=" attr_value "," SEMICOLON_COMMENT* +TIME_PERIOD : "1".."5" +!lin_attr_name : "allstops"i + | "color"i + | ("freq"i "[" TIME_PERIOD "]") + | ("headway"i "[" TIME_PERIOD "]") + | "mode"i + | "name"i + | "oneway"i + | "owner"i + | "runtime"i + | "timefac"i + | "xyspeed"i + | "longname"i + | "shortname"i + | ("usera1"i) + | ("usera2"i) + | "circular"i + | "vehicletype"i + | "operator"i + | "faresystem"i + +attr_value : BOOLEAN | STRING | SIGNED_INT | FLOAT + +nodes : lin_node+ +lin_node : ("N" | "NODES")? "="? NODE_NUM ","? SEMICOLON_COMMENT? 
lin_nodeattr* +NODE_NUM : SIGNED_INT +lin_nodeattr : lin_nodeattr_name "=" attr_value ","? SEMICOLON_COMMENT* +!lin_nodeattr_name : "access_c"i + | "access"i + | "delay"i + | "xyspeed"i + | "timefac"i + | "nntime"i + | "time"i + +operator : SEMICOLON_COMMENT* "OPERATOR" opmode_attr* SEMICOLON_COMMENT* +mode : SEMICOLON_COMMENT* "MODE" opmode_attr* SEMICOLON_COMMENT* +opmode_attr : ( (opmode_attr_name "=" attr_value) ","? ) +opmode_attr_name : "number" | "name" | "longname" + +%import common.SIGNED_INT +%import common.FLOAT +%import common.WS +%ignore WS + +""" +
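The grammar and transformer above can be exercised directly with the same Lark setup that CubeTransit.add_cube uses. A sketch with a minimal one-line PT file; the line content is illustrative and assumes typical .lin syntax:

    from lark import Lark

    lin = ';;<<PT>><<LINE>>;;\n' \
          'LINE NAME="0_452-111_452_pk1", HEADWAY[1]=10, MODE=5, ONEWAY=T, N=1, -2, 3\n'

    parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, parser="lalr")
    tree_data = CubeTransformer().transform(parser.parse(lin))

    tree_data["program_type"]   # -> 'PT'
    # tree_data["lines"] is keyed by the quoted NAME value; each entry holds
    # 'line_properties' (dict) and 'line_shape' (DataFrame of node_id/node/stop/order)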
\ No newline at end of file diff --git a/branch/bicounty_dev/_modules/lasso/util/index.html b/branch/bicounty_dev/_modules/lasso/util/index.html new file mode 100644 index 0000000..f82d773 --- /dev/null +++ b/branch/bicounty_dev/_modules/lasso/util/index.html @@ -0,0 +1,256 @@
+lasso.util — lasso documentation
Source code for lasso.util

+from functools import partial
+import pyproj
+from shapely.ops import transform
+from shapely.geometry import Point, Polygon
+import re
+from unidecode import unidecode
+
+from .logger import WranglerLogger
+
+
[docs]def get_shared_streets_intersection_hash(lat, long, osm_node_id=None): + """ + Calculated per: + https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565 + Expected in/out + -93.0965985, 44.952112199999995 osm_node_id = 954734870 + 69f13f881649cb21ee3b359730790bb9 + + """ + import hashlib + + message = "Intersection {0:.5f} {0:.5f}".format(long, lat) + if osm_node_id: + message += " {}".format(osm_node_id) + unhashed = message.encode("utf-8") + hash = hashlib.md5(unhashed).hexdigest() + return hash
+ + +
[docs]def hhmmss_to_datetime(hhmmss_str: str): + """ + Creates a datetime time object from a string of hh:mm:ss + + Args: + hhmmss_str: string of hh:mm:ss + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = datetime.time(*[int(i) for i in hhmmss_str.split(":")]) + + return dt
+ + +
[docs]def secs_to_datetime(secs: int): + """ + Creates a datetime time object from a seconds from midnight + + Args: + secs: seconds from midnight + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = (datetime.datetime.min + datetime.timedelta(seconds=secs)).time() + + return dt
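Both helpers return datetime.time objects, for example:

    hhmmss_to_datetime("06:30:00")   # -> datetime.time(6, 30)
    secs_to_datetime(3600)           # -> datetime.time(1, 0)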
+ + +
[docs]def geodesic_point_buffer(lat, lon, meters): + """ + creates circular buffer polygon for node + + Args: + lat: node lat + lon: node lon + meters: buffer distance, radius of circle + Returns: + Polygon + """ + proj_wgs84 = pyproj.Proj('+proj=longlat +datum=WGS84') + # Azimuthal equidistant projection + aeqd_proj = '+proj=aeqd +lat_0={lat} +lon_0={lon} +x_0=0 +y_0=0' + project = partial( + pyproj.transform, + pyproj.Proj(aeqd_proj.format(lat=lat, lon=lon)), + proj_wgs84) + buf = Point(0, 0).buffer(meters) # distance in meters + return Polygon(transform(project, buf).exterior.coords[:])
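A quick usage sketch; note that the legacy pyproj.transform API used here may emit a deprecation warning on pyproj 2+, and that shapely coordinates are ordered (lon, lat):

    buffer_poly = geodesic_point_buffer(44.95, -93.09, 100.0)   # ~100 m radius circle
    buffer_poly.contains(Point(-93.09, 44.95))                  # -> True (center falls inside)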
+ +
[docs]def create_locationreference(node, link): + node['X'] = node['geometry'].apply(lambda p: p.x) + node['Y'] = node['geometry'].apply(lambda p: p.y) + node['point'] = [list(xy) for xy in zip(node.X, node.Y)] + node_dict = dict(zip(node.model_node_id, node.point)) + + link['A_point'] = link['A'].map(node_dict) + link['B_point'] = link['B'].map(node_dict) + link['locationReferences'] = link.apply(lambda x: [{'sequence':1, + 'point': x['A_point'], + 'distanceToNextRef':x['length'], + 'bearing' : 0, + 'intersectionId':x['fromIntersectionId']}, + {'sequence':2, + 'point': x['B_point'], + 'intersectionId':x['toIntersectionId']}], + axis = 1)
+ +
[docs]def column_name_to_parts(c, parameters=None): + + if not parameters: + from .parameters import Parameters + + parameters = Parameters() + + if c[0:2] == "ML": + managed = True + else: + managed = False + + time_period = None + category = None + + if c.split("_")[0] not in parameters.properties_to_split.keys(): + return c, None, None, managed + + tps = parameters.time_period_to_time.keys() + cats = parameters.categories.keys() + + if c.split("_")[-1] in tps: + time_period = c.split("_")[-1] + base_name = c.split(time_period)[-2][:-1] + if c.split("_")[-2] in cats: + category = c.split("_")[-2] + base_name = c.split(category)[-2][:-1] + elif c.split("_")[-1] in cats: + category = c.split("_")[-1] + base_name = c.split(category)[-2][:-1] + else: + msg = "Can't split property correctly: {}".format(c) + WranglerLogger.error(msg) + + return base_name, time_period, category, managed
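For instance, assuming the default Parameters split a property named `lanes` by time period and define an `AM` period (both are assumptions about the default parameter files), a time-period column decomposes as:

    column_name_to_parts("lanes_AM")
    # -> ('lanes', 'AM', None, False)
    # i.e. base name, time period, no category, and not a managed-lane ("ML_") field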
+ +
[docs]def shorten_name(name): + if type(name) == str: + name_list = name.split(',') + else: + name_list = name + name_list = [re.sub(r'\W+', ' ', c).replace('nan', '').strip(' ') for c in name_list] + + name_list = list(set(name_list)) + #name_list.remove('') + + name_new = ' '.join(name_list).strip(' ') + + # convert non english character to english + name_new = unidecode(name_new) + + return name_new
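For example, duplicate and placeholder entries collapse into a single cleaned, ASCII-only name:

    shorten_name("Main St,Main St,nan")   # -> 'Main St'
    shorten_name("Café Rd")               # -> 'Cafe Rd'  (transliterated by unidecode)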
+
\ No newline at end of file diff --git a/branch/bicounty_dev/_modules/shapely/geometry/point/index.html b/branch/bicounty_dev/_modules/shapely/geometry/point/index.html new file mode 100644 index 0000000..a0ef069 --- /dev/null +++ b/branch/bicounty_dev/_modules/shapely/geometry/point/index.html @@ -0,0 +1,252 @@
+shapely.geometry.point — lasso documentation
Source code for shapely.geometry.point

+"""Points and related utilities
+"""
+import numpy as np
+
+import shapely
+from shapely.errors import DimensionError
+from shapely.geometry.base import BaseGeometry
+
+__all__ = ["Point"]
+
+
+
[docs]class Point(BaseGeometry): + """ + A geometry type that represents a single coordinate with + x,y and possibly z values. + + A point is a zero-dimensional feature and has zero length and zero area. + + Parameters + ---------- + args : float, or sequence of floats + The coordinates can either be passed as a single parameter, or as + individual float values using multiple parameters: + + 1) 1 parameter: a sequence or array-like of with 2 or 3 values. + 2) 2 or 3 parameters (float): x, y, and possibly z. + + Attributes + ---------- + x, y, z : float + Coordinate values + + Examples + -------- + Constructing the Point using separate parameters for x and y: + + >>> p = Point(1.0, -1.0) + + Constructing the Point using a list of x, y coordinates: + + >>> p = Point([1.0, -1.0]) + >>> print(p) + POINT (1 -1) + >>> p.y + -1.0 + >>> p.x + 1.0 + """ + + __slots__ = [] + + def __new__(self, *args): + if len(args) == 0: + # empty geometry + # TODO better constructor + return shapely.from_wkt("POINT EMPTY") + elif len(args) > 3: + raise TypeError(f"Point() takes at most 3 arguments ({len(args)} given)") + elif len(args) == 1: + coords = args[0] + if isinstance(coords, Point): + return coords + + # Accept either (x, y) or [(x, y)] + if not hasattr(coords, "__getitem__"): # generators + coords = list(coords) + coords = np.asarray(coords).squeeze() + else: + # 2 or 3 args + coords = np.array(args).squeeze() + + if coords.ndim > 1: + raise ValueError( + f"Point() takes only scalar or 1-size vector arguments, got {args}" + ) + if not np.issubdtype(coords.dtype, np.number): + coords = [float(c) for c in coords] + geom = shapely.points(coords) + if not isinstance(geom, Point): + raise ValueError("Invalid values passed to Point constructor") + return geom + + # Coordinate getters and setters + + @property + def x(self): + """Return x coordinate.""" + return shapely.get_x(self) + + @property + def y(self): + """Return y coordinate.""" + return shapely.get_y(self) + + @property + def z(self): + """Return z coordinate.""" + if not shapely.has_z(self): + raise DimensionError("This point has no z coordinate.") + # return shapely.get_z(self) -> get_z only supported for GEOS 3.7+ + return self.coords[0][2] + + @property + def __geo_interface__(self): + return {"type": "Point", "coordinates": self.coords[0]} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG circle element for the Point geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG circle diameter. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + return ( + '<circle cx="{0.x}" cy="{0.y}" r="{1}" ' + 'stroke="#555555" stroke-width="{2}" fill="{3}" opacity="{4}" />' + ).format(self, 3.0 * scale_factor, 1.0 * scale_factor, fill_color, opacity)
+ + @property + def xy(self): + """Separate arrays of X and Y coordinate values + + Example: + >>> x, y = Point(0, 0).xy + >>> list(x) + [0.0] + >>> list(y) + [0.0] + """ + return self.coords.xy
+ + +shapely.lib.registry[0] = Point +
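For context, a short sketch of the `Point` accessors documented above; this exercises the upstream Shapely behavior rendered in these docs rather than lasso itself, and the commented values follow the docstrings:

```python
# Sketch of the Point interface shown above; not lasso-specific code.
from shapely.geometry import Point

p = Point(1.0, -1.0)        # x, y passed as separate parameters
q = Point([2.0, 3.0, 4.0])  # a single sequence may carry 2 or 3 values

print(p.x, p.y)             # 1.0 -1.0
print(q.z)                  # 4.0 (raises DimensionError on a 2D point)
x, y = p.xy                 # separate arrays of X and Y coordinate values
print(list(x), list(y))     # [1.0] [-1.0]
```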
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/_modules/shapely/geometry/polygon/index.html b/branch/bicounty_dev/_modules/shapely/geometry/polygon/index.html new file mode 100644 index 0000000..f73f375 --- /dev/null +++ b/branch/bicounty_dev/_modules/shapely/geometry/polygon/index.html @@ -0,0 +1,462 @@ + + + + + + shapely.geometry.polygon — lasso documentation + + + + + + + + + + + + + + + + + + +
Source code for shapely.geometry.polygon

+"""Polygons and their linear ring components
+"""
+
+import numpy as np
+
+import shapely
+from shapely.algorithms.cga import is_ccw_impl, signed_area
+from shapely.errors import TopologicalError
+from shapely.geometry.base import BaseGeometry
+from shapely.geometry.linestring import LineString
+from shapely.geometry.point import Point
+
+__all__ = ["Polygon", "LinearRing"]
+
+
+def _unpickle_linearring(wkb):
+    linestring = shapely.from_wkb(wkb)
+    srid = shapely.get_srid(linestring)
+    linearring = shapely.linearrings(shapely.get_coordinates(linestring))
+    if srid:
+        linearring = shapely.set_srid(linearring, srid)
+    return linearring
+
+
+class LinearRing(LineString):
+    """
+    A geometry type composed of one or more line segments
+    that forms a closed loop.
+
+    A LinearRing is a closed, one-dimensional feature.
+    A LinearRing that crosses itself or touches itself at a single point is
+    invalid and operations on it may fail.
+
+    Parameters
+    ----------
+    coordinates : sequence
+        A sequence of (x, y [,z]) numeric coordinate pairs or triples, or
+        an array-like with shape (N, 2) or (N, 3).
+        Also can be a sequence of Point objects.
+
+    Notes
+    -----
+    Rings are automatically closed. There is no need to specify a final
+    coordinate pair identical to the first.
+
+    Examples
+    --------
+    Construct a square ring.
+
+    >>> ring = LinearRing( ((0, 0), (0, 1), (1 ,1 ), (1 , 0)) )
+    >>> ring.is_closed
+    True
+    >>> list(ring.coords)
+    [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0), (0.0, 0.0)]
+    >>> ring.length
+    4.0
+
+    """
+
+    __slots__ = []
+
+    def __new__(self, coordinates=None):
+        if coordinates is None:
+            # empty geometry
+            # TODO better way?
+            return shapely.from_wkt("LINEARRING EMPTY")
+        elif isinstance(coordinates, LineString):
+            if type(coordinates) == LinearRing:
+                # return original objects since geometries are immutable
+                return coordinates
+            elif not coordinates.is_valid:
+                raise TopologicalError("An input LineString must be valid.")
+            else:
+                # LineString
+                # TODO convert LineString to LinearRing more directly?
+                coordinates = coordinates.coords
+
+        else:
+            if hasattr(coordinates, "__array__"):
+                coordinates = np.asarray(coordinates)
+            if isinstance(coordinates, np.ndarray) and np.issubdtype(
+                coordinates.dtype, np.number
+            ):
+                pass
+            else:
+                # check coordinates on points
+                def _coords(o):
+                    if isinstance(o, Point):
+                        return o.coords[0]
+                    else:
+                        return [float(c) for c in o]
+
+                coordinates = np.array([_coords(o) for o in coordinates])
+                if not np.issubdtype(coordinates.dtype, np.number):
+                    # conversion of coords to 2D array failed, this might be due
+                    # to inconsistent coordinate dimensionality
+                    raise ValueError("Inconsistent coordinate dimensionality")
+
+        if len(coordinates) == 0:
+            # empty geometry
+            # TODO better constructor + should shapely.linearrings handle this?
+            return shapely.from_wkt("LINEARRING EMPTY")
+
+        geom = shapely.linearrings(coordinates)
+        if not isinstance(geom, LinearRing):
+            raise ValueError("Invalid values passed to LinearRing constructor")
+        return geom
+
+    @property
+    def __geo_interface__(self):
+        return {"type": "LinearRing", "coordinates": tuple(self.coords)}
+
+    def __reduce__(self):
+        """WKB doesn't differentiate between LineString and LinearRing so we
+        need to move the coordinate sequence into the correct geometry type"""
+        return (_unpickle_linearring, (shapely.to_wkb(self, include_srid=True),))
+
+    @property
+    def is_ccw(self):
+        """True is the ring is oriented counter clock-wise"""
+        return bool(is_ccw_impl()(self))
+
+    @property
+    def is_simple(self):
+        """True if the geometry is simple, meaning that any self-intersections
+        are only at boundary points, else False"""
+        return bool(shapely.is_simple(self))
+
+
+shapely.lib.registry[2] = LinearRing
+
+
+class InteriorRingSequence:
+
+    _parent = None
+    _ndim = None
+    _index = 0
+    _length = 0
+
+    def __init__(self, parent):
+        self._parent = parent
+        self._ndim = parent._ndim
+
+    def __iter__(self):
+        self._index = 0
+        self._length = self.__len__()
+        return self
+
+    def __next__(self):
+        if self._index < self._length:
+            ring = self._get_ring(self._index)
+            self._index += 1
+            return ring
+        else:
+            raise StopIteration
+
+    def __len__(self):
+        return shapely.get_num_interior_rings(self._parent)
+
+    def __getitem__(self, key):
+        m = self.__len__()
+        if isinstance(key, int):
+            if key + m < 0 or key >= m:
+                raise IndexError("index out of range")
+            if key < 0:
+                i = m + key
+            else:
+                i = key
+            return self._get_ring(i)
+        elif isinstance(key, slice):
+            res = []
+            start, stop, stride = key.indices(m)
+            for i in range(start, stop, stride):
+                res.append(self._get_ring(i))
+            return res
+        else:
+            raise TypeError("key must be an index or slice")
+
+    def _get_ring(self, i):
+        return shapely.get_interior_ring(self._parent, i)
+
+
+
[docs]class Polygon(BaseGeometry): + """ + A geometry type representing an area that is enclosed by a linear ring. + + A polygon is a two-dimensional feature and has a non-zero area. It may + have one or more negative-space "holes" which are also bounded by linear + rings. If any rings cross each other, the feature is invalid and + operations on it may fail. + + Parameters + ---------- + shell : sequence + A sequence of (x, y [,z]) numeric coordinate pairs or triples, or + an array-like with shape (N, 2) or (N, 3). + Also can be a sequence of Point objects. + holes : sequence + A sequence of objects which satisfy the same requirements as the + shell parameters above + + Attributes + ---------- + exterior : LinearRing + The ring which bounds the positive space of the polygon. + interiors : sequence + A sequence of rings which bound all existing holes. + + Examples + -------- + Create a square polygon with no holes + + >>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.)) + >>> polygon = Polygon(coords) + >>> polygon.area + 1.0 + """ + + __slots__ = [] + + def __new__(self, shell=None, holes=None): + if shell is None: + # empty geometry + # TODO better way? + return shapely.from_wkt("POLYGON EMPTY") + elif isinstance(shell, Polygon): + # return original objects since geometries are immutable + return shell + else: + shell = LinearRing(shell) + + if holes is not None: + if len(holes) == 0: + # shapely constructor cannot handle holes=[] + holes = None + else: + holes = [LinearRing(ring) for ring in holes] + + geom = shapely.polygons(shell, holes=holes) + if not isinstance(geom, Polygon): + raise ValueError("Invalid values passed to Polygon constructor") + return geom + + @property + def exterior(self): + return shapely.get_exterior_ring(self) + + @property + def interiors(self): + if self.is_empty: + return [] + return InteriorRingSequence(self) + + @property + def coords(self): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not" + ) + + def __eq__(self, other): + if not isinstance(other, BaseGeometry): + return NotImplemented + if not isinstance(other, Polygon): + return False + check_empty = (self.is_empty, other.is_empty) + if all(check_empty): + return True + elif any(check_empty): + return False + my_coords = [self.exterior.coords] + [ + interior.coords for interior in self.interiors + ] + other_coords = [other.exterior.coords] + [ + interior.coords for interior in other.interiors + ] + if not len(my_coords) == len(other_coords): + return False + # equal_nan=False is the default, but not yet available for older numpy + return np.all( + [ + np.array_equal(left, right) # , equal_nan=False) + for left, right in zip(my_coords, other_coords) + ] + ) + + def __hash__(self): + return super().__hash__() + + @property + def __geo_interface__(self): + if self.exterior == LinearRing(): + coords = [] + else: + coords = [tuple(self.exterior.coords)] + for hole in self.interiors: + coords.append(tuple(hole.coords)) + return {"type": "Polygon", "coordinates": tuple(coords)} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG path element for the Polygon geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG stroke-width. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + exterior_coords = [["{},{}".format(*c) for c in self.exterior.coords]] + interior_coords = [ + ["{},{}".format(*c) for c in interior.coords] for interior in self.interiors + ] + path = " ".join( + [ + "M {} L {} z".format(coords[0], " L ".join(coords[1:])) + for coords in exterior_coords + interior_coords + ] + ) + return ( + '<path fill-rule="evenodd" fill="{2}" stroke="#555555" ' + 'stroke-width="{0}" opacity="{3}" d="{1}" />' + ).format(2.0 * scale_factor, path, fill_color, opacity)
+ +
[docs] @classmethod + def from_bounds(cls, xmin, ymin, xmax, ymax): + """Construct a `Polygon()` from spatial bounds.""" + return cls([(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin)])
+ + +shapely.lib.registry[3] = Polygon + + +def orient(polygon, sign=1.0): + s = float(sign) + rings = [] + ring = polygon.exterior + if signed_area(ring) / s >= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + for ring in polygon.interiors: + if signed_area(ring) / s <= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + return Polygon(rings[0], rings[1:]) +
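A brief sketch of the `LinearRing`, `Polygon`, and `orient` behavior described above; again this exercises the vendored Shapely source rather than lasso itself, and the commented values follow the docstrings:

```python
# Sketch of LinearRing / Polygon construction and reorientation documented above.
from shapely.geometry import Polygon
from shapely.geometry.polygon import LinearRing, orient

ring = LinearRing([(0, 0), (0, 1), (1, 1), (1, 0)])  # closed automatically
print(ring.is_closed)       # True
print(ring.is_ccw)          # False -- this ring is traced clockwise

shell = [(0, 0), (0, 3), (3, 3), (3, 0)]
hole = [(1, 1), (1, 2), (2, 2), (2, 1)]
poly = Polygon(shell, holes=[hole])
print(poly.area)            # 8.0 (9.0 outer square minus the 1.0 hole)

ccw = orient(poly, sign=1.0)   # exterior counter-clockwise, holes clockwise
print(ccw.exterior.is_ccw)     # True
```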
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/_modules/shapely/ops/index.html b/branch/bicounty_dev/_modules/shapely/ops/index.html new file mode 100644 index 0000000..90625be --- /dev/null +++ b/branch/bicounty_dev/_modules/shapely/ops/index.html @@ -0,0 +1,845 @@ + + + + + + shapely.ops — lasso documentation + + + + + + + + + + + + + + + + + + +
Source code for shapely.ops

+"""Support for various GEOS geometry operations
+"""
+
+from warnings import warn
+
+import shapely
+from shapely.algorithms.polylabel import polylabel  # noqa
+from shapely.errors import GeometryTypeError, ShapelyDeprecationWarning
+from shapely.geometry import (
+    GeometryCollection,
+    LineString,
+    MultiLineString,
+    MultiPoint,
+    Point,
+    Polygon,
+    shape,
+)
+from shapely.geometry.base import BaseGeometry, BaseMultipartGeometry
+from shapely.geometry.polygon import orient as orient_
+from shapely.prepared import prep
+
+__all__ = [
+    "cascaded_union",
+    "linemerge",
+    "operator",
+    "polygonize",
+    "polygonize_full",
+    "transform",
+    "unary_union",
+    "triangulate",
+    "voronoi_diagram",
+    "split",
+    "nearest_points",
+    "validate",
+    "snap",
+    "shared_paths",
+    "clip_by_rect",
+    "orient",
+    "substring",
+]
+
+
+class CollectionOperator:
+    def shapeup(self, ob):
+        if isinstance(ob, BaseGeometry):
+            return ob
+        else:
+            try:
+                return shape(ob)
+            except (ValueError, AttributeError):
+                return LineString(ob)
+
+    def polygonize(self, lines):
+        """Creates polygons from a source of lines
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        collection = shapely.polygonize(obs)
+        return collection.geoms
+
+    def polygonize_full(self, lines):
+        """Creates polygons from a source of lines, returning the polygons
+        and leftover geometries.
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.
+
+        Returns a tuple of objects: (polygons, cut edges, dangles, invalid ring
+        lines). Each are a geometry collection.
+
+        Dangles are edges which have one or both ends which are not incident on
+        another edge endpoint. Cut edges are connected at both ends but do not
+        form part of a polygon. Invalid ring lines form rings which are invalid
+        (bowties, etc).
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        return shapely.polygonize_full(obs)
+
+    def linemerge(self, lines, directed=False):
+        """Merges all connected lines from a source
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.  Returns a
+        LineString or MultiLineString when lines are not contiguous.
+        """
+        source = None
+        if getattr(lines, "geom_type", None) == "MultiLineString":
+            source = lines
+        elif hasattr(lines, "geoms"):
+            # other Multi geometries
+            source = MultiLineString([ls.coords for ls in lines.geoms])
+        elif hasattr(lines, "__iter__"):
+            try:
+                source = MultiLineString([ls.coords for ls in lines])
+            except AttributeError:
+                source = MultiLineString(lines)
+        if source is None:
+            raise ValueError(f"Cannot linemerge {lines}")
+        return shapely.line_merge(source, directed=directed)
+
+    def cascaded_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        .. deprecated:: 1.8
+            This function was superseded by :meth:`unary_union`.
+        """
+        warn(
+            "The 'cascaded_union()' function is deprecated. "
+            "Use 'unary_union()' instead.",
+            ShapelyDeprecationWarning,
+            stacklevel=2,
+        )
+        return shapely.union_all(geoms, axis=None)
+
+    def unary_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        Usually used to convert a collection into the smallest set of polygons
+        that cover the same area.
+        """
+        return shapely.union_all(geoms, axis=None)
+
+
+operator = CollectionOperator()
+polygonize = operator.polygonize
+polygonize_full = operator.polygonize_full
+linemerge = operator.linemerge
+cascaded_union = operator.cascaded_union
+unary_union = operator.unary_union
+
+
+def triangulate(geom, tolerance=0.0, edges=False):
+    """Creates the Delaunay triangulation and returns a list of geometries
+
+    The source may be any geometry type. All vertices of the geometry will be
+    used as the points of the triangulation.
+
+    From the GEOS documentation:
+    tolerance is the snapping tolerance used to improve the robustness of
+    the triangulation computation. A tolerance of 0.0 specifies that no
+    snapping will take place.
+
+    If edges is False, a list of Polygons (triangles) will be returned.
+    Otherwise the list of LineString edges is returned.
+
+    """
+    collection = shapely.delaunay_triangles(geom, tolerance=tolerance, only_edges=edges)
+    return [g for g in collection.geoms]
+
+
+def voronoi_diagram(geom, envelope=None, tolerance=0.0, edges=False):
+    """
+    Constructs a Voronoi Diagram [1] from the given geometry.
+    Returns a list of geometries.
+
+    Parameters
+    ----------
+    geom: geometry
+        the input geometry whose vertices will be used to calculate
+        the final diagram.
+    envelope: geometry, None
+        clipping envelope for the returned diagram, automatically
+        determined if None. The diagram will be clipped to the larger
+        of this envelope or an envelope surrounding the sites.
+    tolerance: float, 0.0
+        sets the snapping tolerance used to improve the robustness
+        of the computation. A tolerance of 0.0 specifies that no
+        snapping will take place.
+    edges: bool, False
+        If False, return regions as polygons. Else, return only
+        edges e.g. LineStrings.
+
+    GEOS documentation can be found at [2]
+
+    Returns
+    -------
+    GeometryCollection
+        geometries representing the Voronoi regions.
+
+    Notes
+    -----
+    The `tolerance` argument can be finicky and is known to cause the
+    algorithm to fail in several cases. If you're using `tolerance`
+    and getting a failure, try removing it. The test cases in
+    tests/test_voronoi_diagram.py show more details.
+
+
+    References
+    ----------
+    [1] https://en.wikipedia.org/wiki/Voronoi_diagram
+    [2] https://geos.osgeo.org/doxygen/geos__c_8h_source.html  (line 730)
+    """
+    try:
+        result = shapely.voronoi_polygons(
+            geom, tolerance=tolerance, extend_to=envelope, only_edges=edges
+        )
+    except shapely.GEOSException as err:
+        errstr = "Could not create Voronoi Diagram with the specified inputs "
+        errstr += f"({err!s})."
+        if tolerance:
+            errstr += " Try running again with default tolerance value."
+        raise ValueError(errstr) from err
+
+    if result.geom_type != "GeometryCollection":
+        return GeometryCollection([result])
+    return result
+
+
+def validate(geom):
+    return shapely.is_valid_reason(geom)
+
+
+
[docs]def transform(func, geom): + """Applies `func` to all coordinates of `geom` and returns a new + geometry of the same type from the transformed coordinates. + + `func` maps x, y, and optionally z to output xp, yp, zp. The input + parameters may iterable types like lists or arrays or single values. + The output shall be of the same type. Scalars in, scalars out. + Lists in, lists out. + + For example, here is an identity function applicable to both types + of input. + + def id_func(x, y, z=None): + return tuple(filter(None, [x, y, z])) + + g2 = transform(id_func, g1) + + Using pyproj >= 2.1, this example will accurately project Shapely geometries: + + import pyproj + + wgs84 = pyproj.CRS('EPSG:4326') + utm = pyproj.CRS('EPSG:32618') + + project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform + + g2 = transform(project, g1) + + Note that the always_xy kwarg is required here as Shapely geometries only support + X,Y coordinate ordering. + + Lambda expressions such as the one in + + g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1) + + also satisfy the requirements for `func`. + """ + if geom.is_empty: + return geom + if geom.geom_type in ("Point", "LineString", "LinearRing", "Polygon"): + + # First we try to apply func to x, y, z sequences. When func is + # optimized for sequences, this is the fastest, though zipping + # the results up to go back into the geometry constructors adds + # extra cost. + try: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)(zip(*func(*zip(*geom.coords)))) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)(zip(*func(*zip(*geom.exterior.coords)))) + holes = list( + type(ring)(zip(*func(*zip(*ring.coords)))) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + # A func that assumes x, y, z are single values will likely raise a + # TypeError, in which case we'll try again. + except TypeError: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)([func(*c) for c in geom.coords]) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)([func(*c) for c in geom.exterior.coords]) + holes = list( + type(ring)([func(*c) for c in ring.coords]) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + elif geom.geom_type.startswith("Multi") or geom.geom_type == "GeometryCollection": + return type(geom)([transform(func, part) for part in geom.geoms]) + else: + raise GeometryTypeError(f"Type {geom.geom_type!r} not recognized")
+ + +def nearest_points(g1, g2): + """Returns the calculated nearest points in the input geometries + + The points are returned in the same order as the input geometries. + """ + seq = shapely.shortest_line(g1, g2) + if seq is None: + if g1.is_empty: + raise ValueError("The first input geometry is empty") + else: + raise ValueError("The second input geometry is empty") + + p1 = shapely.get_point(seq, 0) + p2 = shapely.get_point(seq, 1) + return (p1, p2) + + +def snap(g1, g2, tolerance): + """ + Snaps an input geometry (g1) to reference (g2) geometry's vertices. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + tolerance : float + The snapping tolerance + + Refer to :func:`shapely.snap` for full documentation. + """ + + return shapely.snap(g1, g2, tolerance) + + +def shared_paths(g1, g2): + """Find paths shared between the two given lineal geometries + + Returns a GeometryCollection with two elements: + - First element is a MultiLineString containing shared paths with the + same direction for both inputs. + - Second element is a MultiLineString containing shared paths with the + opposite direction for the two inputs. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + """ + if not isinstance(g1, LineString): + raise GeometryTypeError("First geometry must be a LineString") + if not isinstance(g2, LineString): + raise GeometryTypeError("Second geometry must be a LineString") + return shapely.shared_paths(g1, g2) + + +class SplitOp: + @staticmethod + def _split_polygon_with_line(poly, splitter): + """Split a Polygon with a LineString""" + if not isinstance(poly, Polygon): + raise GeometryTypeError("First argument must be a Polygon") + if not isinstance(splitter, LineString): + raise GeometryTypeError("Second argument must be a LineString") + + union = poly.boundary.union(splitter) + + # greatly improves split performance for big geometries with many + # holes (the following contains checks) with minimal overhead + # for common cases + poly = prep(poly) + + # some polygonized geometries may be holes, we do not want them + # that's why we test if the original polygon (poly) contains + # an inner point of polygonized geometry (pg) + return [ + pg for pg in polygonize(union) if poly.contains(pg.representative_point()) + ] + + @staticmethod + def _split_line_with_line(line, splitter): + """Split a LineString with another (Multi)LineString or (Multi)Polygon""" + + # if splitter is a polygon, pick it's boundary + if splitter.geom_type in ("Polygon", "MultiPolygon"): + splitter = splitter.boundary + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, LineString) and not isinstance( + splitter, MultiLineString + ): + raise GeometryTypeError( + "Second argument must be either a LineString or a MultiLineString" + ) + + # | s\l | Interior | Boundary | Exterior | + # |----------|----------|----------|----------| + # | Interior | 0 or F | * | * | At least one of these two must be 0 + # | Boundary | 0 or F | * | * | So either '0********' or '[0F]**0*****' + # | Exterior | * | * | * | No overlapping interiors ('1********') + relation = splitter.relate(line) + if relation[0] == "1": + # The lines overlap at some segment (linear intersection of interiors) + raise ValueError("Input geometry segment overlaps with the splitter.") + elif relation[0] == "0" or relation[3] == "0": + # The splitter crosses or touches the line's 
interior --> return multilinestring from the split + return line.difference(splitter) + else: + # The splitter does not cross or touch the line's interior --> return collection with identity line + return [line] + + @staticmethod + def _split_line_with_point(line, splitter): + """Split a LineString with a Point""" + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, Point): + raise GeometryTypeError("Second argument must be a Point") + + # check if point is in the interior of the line + if not line.relate_pattern(splitter, "0********"): + # point not on line interior --> return collection with single identity line + # (REASONING: Returning a list with the input line reference and creating a + # GeometryCollection at the general split function prevents unnecessary copying + # of linestrings in multipoint splitting function) + return [line] + elif line.coords[0] == splitter.coords[0]: + # if line is a closed ring the previous test doesn't behave as desired + return [line] + + # point is on line, get the distance from the first point on line + distance_on_line = line.project(splitter) + coords = list(line.coords) + # split the line at the point and create two new lines + current_position = 0.0 + for i in range(len(coords) - 1): + point1 = coords[i] + point2 = coords[i + 1] + dx = point1[0] - point2[0] + dy = point1[1] - point2[1] + segment_length = (dx**2 + dy**2) ** 0.5 + current_position += segment_length + if distance_on_line == current_position: + # splitter is exactly on a vertex + return [LineString(coords[: i + 2]), LineString(coords[i + 1 :])] + elif distance_on_line < current_position: + # splitter is between two vertices + return [ + LineString(coords[: i + 1] + [splitter.coords[0]]), + LineString([splitter.coords[0]] + coords[i + 1 :]), + ] + return [line] + + @staticmethod + def _split_line_with_multipoint(line, splitter): + """Split a LineString with a MultiPoint""" + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, MultiPoint): + raise GeometryTypeError("Second argument must be a MultiPoint") + + chunks = [line] + for pt in splitter.geoms: + new_chunks = [] + for chunk in filter(lambda x: not x.is_empty, chunks): + # add the newly split 2 lines or the same line if not split + new_chunks.extend(SplitOp._split_line_with_point(chunk, pt)) + chunks = new_chunks + + return chunks + + @staticmethod + def split(geom, splitter): + """ + Splits a geometry by another geometry and returns a collection of geometries. This function is the theoretical + opposite of the union of the split geometry parts. If the splitter does not split the geometry, a collection + with a single geometry equal to the input geometry is returned. + The function supports: + - Splitting a (Multi)LineString by a (Multi)Point or (Multi)LineString or (Multi)Polygon + - Splitting a (Multi)Polygon by a LineString + + It may be convenient to snap the splitter with low tolerance to the geometry. For example in the case + of splitting a line by a point, the point must be exactly on the line, for the line to be correctly split. + When splitting a line by a polygon, the boundary of the polygon is used for the operation. + When splitting a line by another line, a ValueError is raised if the two overlap at some segment. 
+ + Parameters + ---------- + geom : geometry + The geometry to be split + splitter : geometry + The geometry that will split the input geom + + Example + ------- + >>> pt = Point((1, 1)) + >>> line = LineString([(0,0), (2,2)]) + >>> result = split(line, pt) + >>> result.wkt + 'GEOMETRYCOLLECTION (LINESTRING (0 0, 1 1), LINESTRING (1 1, 2 2))' + """ + + if geom.geom_type in ("MultiLineString", "MultiPolygon"): + return GeometryCollection( + [i for part in geom.geoms for i in SplitOp.split(part, splitter).geoms] + ) + + elif geom.geom_type == "LineString": + if splitter.geom_type in ( + "LineString", + "MultiLineString", + "Polygon", + "MultiPolygon", + ): + split_func = SplitOp._split_line_with_line + elif splitter.geom_type == "Point": + split_func = SplitOp._split_line_with_point + elif splitter.geom_type == "MultiPoint": + split_func = SplitOp._split_line_with_multipoint + else: + raise GeometryTypeError( + f"Splitting a LineString with a {splitter.geom_type} is not supported" + ) + + elif geom.geom_type == "Polygon": + if splitter.geom_type == "LineString": + split_func = SplitOp._split_polygon_with_line + else: + raise GeometryTypeError( + f"Splitting a Polygon with a {splitter.geom_type} is not supported" + ) + + else: + raise GeometryTypeError( + f"Splitting {geom.geom_type} geometry is not supported" + ) + + return GeometryCollection(split_func(geom, splitter)) + + +split = SplitOp.split + + +def substring(geom, start_dist, end_dist, normalized=False): + """Return a line segment between specified distances along a LineString + + Negative distance values are taken as measured in the reverse + direction from the end of the geometry. Out-of-range index + values are handled by clamping them to the valid range of values. + + If the start distance equals the end distance, a Point is returned. + + If the start distance is actually beyond the end distance, then the + reversed substring is returned such that the start distance is + at the first coordinate. + + Parameters + ---------- + geom : LineString + The geometry to get a substring of. + start_dist : float + The distance along `geom` of the start of the substring. + end_dist : float + The distance along `geom` of the end of the substring. + normalized : bool, False + Whether the distance parameters are interpreted as a + fraction of the geometry's length. + + Returns + ------- + Union[Point, LineString] + The substring between `start_dist` and `end_dist` or a Point + if they are at the same location. + + Raises + ------ + TypeError + If `geom` is not a LineString. + + Examples + -------- + >>> from shapely.geometry import LineString + >>> from shapely.ops import substring + >>> ls = LineString((i, 0) for i in range(6)) + >>> ls.wkt + 'LINESTRING (0 0, 1 0, 2 0, 3 0, 4 0, 5 0)' + >>> substring(ls, start_dist=1, end_dist=3).wkt + 'LINESTRING (1 0, 2 0, 3 0)' + >>> substring(ls, start_dist=3, end_dist=1).wkt + 'LINESTRING (3 0, 2 0, 1 0)' + >>> substring(ls, start_dist=1, end_dist=-3).wkt + 'LINESTRING (1 0, 2 0)' + >>> substring(ls, start_dist=0.2, end_dist=-0.6, normalized=True).wkt + 'LINESTRING (1 0, 2 0)' + + Returning a `Point` when `start_dist` and `end_dist` are at the + same location. + + >>> substring(ls, 2.5, -2.5).wkt + 'POINT (2.5 0)' + """ + + if not isinstance(geom, LineString): + raise GeometryTypeError( + "Can only calculate a substring of LineString geometries. " + f"A {geom.geom_type} was provided." 
+ ) + + # Filter out cases in which to return a point + if start_dist == end_dist: + return geom.interpolate(start_dist, normalized) + elif not normalized and start_dist >= geom.length and end_dist >= geom.length: + return geom.interpolate(geom.length, normalized) + elif not normalized and -start_dist >= geom.length and -end_dist >= geom.length: + return geom.interpolate(0, normalized) + elif normalized and start_dist >= 1 and end_dist >= 1: + return geom.interpolate(1, normalized) + elif normalized and -start_dist >= 1 and -end_dist >= 1: + return geom.interpolate(0, normalized) + + if normalized: + start_dist *= geom.length + end_dist *= geom.length + + # Filter out cases where distances meet at a middle point from opposite ends. + if start_dist < 0 < end_dist and abs(start_dist) + end_dist == geom.length: + return geom.interpolate(end_dist) + elif end_dist < 0 < start_dist and abs(end_dist) + start_dist == geom.length: + return geom.interpolate(start_dist) + + start_point = geom.interpolate(start_dist) + end_point = geom.interpolate(end_dist) + + if start_dist < 0: + start_dist = geom.length + start_dist # Values may still be negative, + if end_dist < 0: # but only in the out-of-range + end_dist = geom.length + end_dist # sense, not the wrap-around sense. + + reverse = start_dist > end_dist + if reverse: + start_dist, end_dist = end_dist, start_dist + + if start_dist < 0: + start_dist = 0 # to avoid duplicating the first vertex + + if reverse: + vertex_list = [tuple(*end_point.coords)] + else: + vertex_list = [tuple(*start_point.coords)] + + coords = list(geom.coords) + current_distance = 0 + for p1, p2 in zip(coords, coords[1:]): + if start_dist < current_distance < end_dist: + vertex_list.append(p1) + elif current_distance >= end_dist: + break + + current_distance += ((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2) ** 0.5 + + if reverse: + vertex_list.append(tuple(*start_point.coords)) + # reverse direction result + vertex_list = reversed(vertex_list) + else: + vertex_list.append(tuple(*end_point.coords)) + + return LineString(vertex_list) + + +def clip_by_rect(geom, xmin, ymin, xmax, ymax): + """Returns the portion of a geometry within a rectangle + + The geometry is clipped in a fast but possibly dirty way. The output is + not guaranteed to be valid. No exceptions will be raised for topological + errors. + + Parameters + ---------- + geom : geometry + The geometry to be clipped + xmin : float + Minimum x value of the rectangle + ymin : float + Minimum y value of the rectangle + xmax : float + Maximum x value of the rectangle + ymax : float + Maximum y value of the rectangle + + Notes + ----- + Requires GEOS >= 3.5.0 + New in 1.7. + """ + if geom.is_empty: + return geom + return shapely.clip_by_rect(geom, xmin, ymin, xmax, ymax) + + +def orient(geom, sign=1.0): + """A properly oriented copy of the given geometry. + + The signed area of the result will have the given sign. A sign of + 1.0 means that the coordinates of the product's exterior rings will + be oriented counter-clockwise. + + Parameters + ---------- + geom : Geometry + The original geometry. May be a Polygon, MultiPolygon, or + GeometryCollection. + sign : float, optional. + The sign of the result's signed area. + + Returns + ------- + Geometry + + """ + if isinstance(geom, BaseMultipartGeometry): + return geom.__class__( + list( + map( + lambda geom: orient(geom, sign), + geom.geoms, + ) + ) + ) + if isinstance(geom, (Polygon,)): + return orient_(geom, sign) + return geom +
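A short sketch exercising several of the `shapely.ops` helpers defined above (`split`, `substring`, `transform`, `unary_union`, `triangulate`); the expected outputs in the comments follow the docstrings shown in this module:

```python
# Sketch of a few shapely.ops helpers; expected outputs follow the docstrings above.
from shapely.geometry import LineString, Point
from shapely.ops import split, substring, transform, unary_union, triangulate

# Split a line at a point lying exactly on its interior.
line = LineString([(0, 0), (2, 2)])
print(split(line, Point(1, 1)).wkt)
# GEOMETRYCOLLECTION (LINESTRING (0 0, 1 1), LINESTRING (1 1, 2 2))

# Take the portion of a line between two distances along it.
ls = LineString([(i, 0) for i in range(6)])
print(substring(ls, 1, 3).wkt)        # LINESTRING (1 0, 2 0, 3 0)

# Apply a coordinate transformation (a simple shift here).
print(transform(lambda x, y, z=None: (x + 10.0, y + 10.0), line).wkt)

# Dissolve overlapping geometries into a single polygon.
merged = unary_union([Point(0, 0).buffer(1.0), Point(1, 0).buffer(1.0)])
print(merged.geom_type)               # Polygon

# Delaunay triangulation of the dissolved shape's vertices.
triangles = triangulate(merged)
print(len(triangles) > 0)             # True
```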
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/_sources/_generated/lasso.CubeTransit.rst.txt b/branch/bicounty_dev/_sources/_generated/lasso.CubeTransit.rst.txt new file mode 100644 index 0000000..e24b49e --- /dev/null +++ b/branch/bicounty_dev/_sources/_generated/lasso.CubeTransit.rst.txt @@ -0,0 +1,36 @@ +lasso.CubeTransit +================= + +.. currentmodule:: lasso + +.. autoclass:: CubeTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~CubeTransit.__init__ + ~CubeTransit.add_additional_time_periods + ~CubeTransit.add_cube + ~CubeTransit.build_route_name + ~CubeTransit.calculate_start_end_times + ~CubeTransit.create_add_route_card_dict + ~CubeTransit.create_delete_route_card_dict + ~CubeTransit.create_from_cube + ~CubeTransit.create_update_route_card_dict + ~CubeTransit.cube_properties_to_standard_properties + ~CubeTransit.evaluate_differences + ~CubeTransit.evaluate_route_property_differences + ~CubeTransit.evaluate_route_shape_changes + ~CubeTransit.get_time_period_numbers_from_cube_properties + ~CubeTransit.unpack_route_name + + + + + + \ No newline at end of file diff --git a/branch/bicounty_dev/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt b/branch/bicounty_dev/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt new file mode 100644 index 0000000..29190d8 --- /dev/null +++ b/branch/bicounty_dev/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt @@ -0,0 +1,90 @@ +lasso.ModelRoadwayNetwork +========================= + +.. currentmodule:: lasso + +.. autoclass:: ModelRoadwayNetwork + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~ModelRoadwayNetwork.__init__ + ~ModelRoadwayNetwork.add_counts + ~ModelRoadwayNetwork.add_incident_link_data_to_nodes + ~ModelRoadwayNetwork.add_new_roadway_feature_change + ~ModelRoadwayNetwork.add_variable_using_shst_reference + ~ModelRoadwayNetwork.addition_map + ~ModelRoadwayNetwork.apply + ~ModelRoadwayNetwork.apply_managed_lane_feature_change + ~ModelRoadwayNetwork.apply_python_calculation + ~ModelRoadwayNetwork.apply_roadway_feature_change + ~ModelRoadwayNetwork.assess_connectivity + ~ModelRoadwayNetwork.build_selection_key + ~ModelRoadwayNetwork.calculate_area_type + ~ModelRoadwayNetwork.calculate_centroidconnect + ~ModelRoadwayNetwork.calculate_county + ~ModelRoadwayNetwork.calculate_distance + ~ModelRoadwayNetwork.calculate_mpo + ~ModelRoadwayNetwork.calculate_use + ~ModelRoadwayNetwork.convert_int + ~ModelRoadwayNetwork.create_ML_variable + ~ModelRoadwayNetwork.create_calculated_variables + ~ModelRoadwayNetwork.create_dummy_connector_links + ~ModelRoadwayNetwork.create_hov_corridor_variable + ~ModelRoadwayNetwork.create_managed_lane_network + ~ModelRoadwayNetwork.create_managed_variable + ~ModelRoadwayNetwork.dataframe_to_fixed_width + ~ModelRoadwayNetwork.delete_roadway_feature_change + ~ModelRoadwayNetwork.deletion_map + ~ModelRoadwayNetwork.fill_na + ~ModelRoadwayNetwork.from_RoadwayNetwork + ~ModelRoadwayNetwork.get_attribute + ~ModelRoadwayNetwork.get_managed_lane_node_ids + ~ModelRoadwayNetwork.get_modal_graph + ~ModelRoadwayNetwork.get_modal_links_nodes + ~ModelRoadwayNetwork.get_property_by_time_period_and_group + ~ModelRoadwayNetwork.identify_segment + ~ModelRoadwayNetwork.identify_segment_endpoints + ~ModelRoadwayNetwork.is_network_connected + ~ModelRoadwayNetwork.load_transform_network + ~ModelRoadwayNetwork.network_connection_plot + ~ModelRoadwayNetwork.orig_dest_nodes_foreign_key + ~ModelRoadwayNetwork.ox_graph + 
~ModelRoadwayNetwork.path_search + ~ModelRoadwayNetwork.read + ~ModelRoadwayNetwork.read_match_result + ~ModelRoadwayNetwork.rename_variables_for_dbf + ~ModelRoadwayNetwork.roadway_net_to_gdf + ~ModelRoadwayNetwork.roadway_standard_to_met_council_network + ~ModelRoadwayNetwork.select_roadway_features + ~ModelRoadwayNetwork.selection_has_unique_link_id + ~ModelRoadwayNetwork.selection_map + ~ModelRoadwayNetwork.shortest_path + ~ModelRoadwayNetwork.split_properties_by_time_period_and_category + ~ModelRoadwayNetwork.update_distance + ~ModelRoadwayNetwork.validate_link_schema + ~ModelRoadwayNetwork.validate_node_schema + ~ModelRoadwayNetwork.validate_properties + ~ModelRoadwayNetwork.validate_selection + ~ModelRoadwayNetwork.validate_shape_schema + ~ModelRoadwayNetwork.validate_uniqueness + ~ModelRoadwayNetwork.write + ~ModelRoadwayNetwork.write_roadway_as_fixedwidth + ~ModelRoadwayNetwork.write_roadway_as_shp + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~ModelRoadwayNetwork.CALCULATED_VALUES + + \ No newline at end of file diff --git a/branch/bicounty_dev/_sources/_generated/lasso.Parameters.rst.txt b/branch/bicounty_dev/_sources/_generated/lasso.Parameters.rst.txt new file mode 100644 index 0000000..28d2c86 --- /dev/null +++ b/branch/bicounty_dev/_sources/_generated/lasso.Parameters.rst.txt @@ -0,0 +1,31 @@ +lasso.Parameters +================ + +.. currentmodule:: lasso + +.. autoclass:: Parameters + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Parameters.__init__ + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Parameters.cube_time_periods + ~Parameters.properties_to_split + ~Parameters.county_link_range_dict + ~Parameters.zones + + \ No newline at end of file diff --git a/branch/bicounty_dev/_sources/_generated/lasso.Project.rst.txt b/branch/bicounty_dev/_sources/_generated/lasso.Project.rst.txt new file mode 100644 index 0000000..863945b --- /dev/null +++ b/branch/bicounty_dev/_sources/_generated/lasso.Project.rst.txt @@ -0,0 +1,42 @@ +lasso.Project +============= + +.. currentmodule:: lasso + +.. autoclass:: Project + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Project.__init__ + ~Project.add_highway_changes + ~Project.add_transit_changes + ~Project.create_project + ~Project.determine_roadway_network_changes_compatibility + ~Project.emme_id_to_wrangler_id + ~Project.emme_name_to_wrangler_name + ~Project.evaluate_changes + ~Project.get_object_from_network_build_command + ~Project.get_operation_from_network_build_command + ~Project.read_logfile + ~Project.read_network_build_file + ~Project.write_project_card + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Project.CALCULATED_VALUES + ~Project.DEFAULT_PROJECT_NAME + ~Project.STATIC_VALUES + + \ No newline at end of file diff --git a/branch/bicounty_dev/_sources/_generated/lasso.StandardTransit.rst.txt b/branch/bicounty_dev/_sources/_generated/lasso.StandardTransit.rst.txt new file mode 100644 index 0000000..1175b4b --- /dev/null +++ b/branch/bicounty_dev/_sources/_generated/lasso.StandardTransit.rst.txt @@ -0,0 +1,32 @@ +lasso.StandardTransit +===================== + +.. currentmodule:: lasso + +.. autoclass:: StandardTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. 
autosummary:: + + ~StandardTransit.__init__ + ~StandardTransit.calculate_cube_mode + ~StandardTransit.cube_format + ~StandardTransit.evaluate_differences + ~StandardTransit.fromTransitNetwork + ~StandardTransit.read_gtfs + ~StandardTransit.route_properties_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_emme + ~StandardTransit.time_to_cube_time_period + ~StandardTransit.write_as_cube_lin + + + + + + \ No newline at end of file diff --git a/branch/bicounty_dev/_sources/_generated/lasso.logger.rst.txt b/branch/bicounty_dev/_sources/_generated/lasso.logger.rst.txt new file mode 100644 index 0000000..2054273 --- /dev/null +++ b/branch/bicounty_dev/_sources/_generated/lasso.logger.rst.txt @@ -0,0 +1,29 @@ +lasso.logger +============ + +.. automodule:: lasso.logger + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + setupLogging + + + + + + + + + + + + + diff --git a/branch/bicounty_dev/_sources/_generated/lasso.util.rst.txt b/branch/bicounty_dev/_sources/_generated/lasso.util.rst.txt new file mode 100644 index 0000000..95fecf8 --- /dev/null +++ b/branch/bicounty_dev/_sources/_generated/lasso.util.rst.txt @@ -0,0 +1,35 @@ +lasso.util +========== + +.. automodule:: lasso.util + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + column_name_to_parts + create_locationreference + geodesic_point_buffer + get_shared_streets_intersection_hash + hhmmss_to_datetime + secs_to_datetime + shorten_name + + + + + + + + + + + + + diff --git a/branch/bicounty_dev/_sources/autodoc.rst.txt b/branch/bicounty_dev/_sources/autodoc.rst.txt new file mode 100644 index 0000000..7e48d58 --- /dev/null +++ b/branch/bicounty_dev/_sources/autodoc.rst.txt @@ -0,0 +1,29 @@ +Lasso Classes and Functions +==================================== + +.. automodule:: lasso + :no-members: + :no-undoc-members: + :no-inherited-members: + :no-show-inheritance: + + +Base Classes +-------------- +.. autosummary:: + :toctree: _generated + :nosignatures: + + CubeTransit + StandardTransit + ModelRoadwayNetwork + Project + Parameters + +Utils and Functions +-------------------- +.. autosummary:: + :toctree: _generated + + util + logger diff --git a/branch/bicounty_dev/_sources/index.rst.txt b/branch/bicounty_dev/_sources/index.rst.txt new file mode 100644 index 0000000..1255d4e --- /dev/null +++ b/branch/bicounty_dev/_sources/index.rst.txt @@ -0,0 +1,35 @@ +.. lasso documentation master file, created by + sphinx-quickstart on Thu Dec 5 15:43:28 2019. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. + +Welcome to lasso's documentation! +================================= + +This package of utilities is a wrapper around the +[network_wrangler](http://github.com/wsp-sag/network_wrangler) package +for MetCouncil. It aims to have the following functionality: +1. parse Cube log files and base highway networks and create ProjectCards + for Network Wrangler +2. parse two Cube transit line files and create ProjectCards for NetworkWrangler +3. refine Network Wrangler highway networks to contain specific variables and + settings for Metropolitan Council and export them to a format that can + be read in by Citilab's Cube software. + +.. 
toctree:: + :maxdepth: 3 + :caption: Contents: + + starting + setup + running + autodoc + + + +Indices and tables +================== + +* :ref:`genindex` +* :ref:`modindex` +* :ref:`search` diff --git a/branch/bicounty_dev/_sources/running.md.txt b/branch/bicounty_dev/_sources/running.md.txt new file mode 100644 index 0000000..e139dc8 --- /dev/null +++ b/branch/bicounty_dev/_sources/running.md.txt @@ -0,0 +1,12 @@ +# Running Lasso + +## Create project files + + +## Create a scenario + + +## Exporting networks + + +## Auditing and Reporting diff --git a/branch/bicounty_dev/_sources/setup.md.txt b/branch/bicounty_dev/_sources/setup.md.txt new file mode 100644 index 0000000..e77d463 --- /dev/null +++ b/branch/bicounty_dev/_sources/setup.md.txt @@ -0,0 +1,9 @@ +# Setup + +### Projects + +### Parameters + +### Settings + +### Additional Data Files diff --git a/branch/bicounty_dev/_sources/starting.md.txt b/branch/bicounty_dev/_sources/starting.md.txt new file mode 100644 index 0000000..8886f95 --- /dev/null +++ b/branch/bicounty_dev/_sources/starting.md.txt @@ -0,0 +1,292 @@ +# Starting Out + +## Installation + +If you are managing multiple python versions, we suggest using [`virtualenv`](https://virtualenv.pypa.io/en/latest/) or [`conda`](https://conda.io/en/latest/) virtual environments. + +Example using a conda environment (recommended) and using the package manager [pip](https://pip.pypa.io/en/stable/) to install Lasso from the source on GitHub. + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/Lasso@master +``` + +Lasso will install `network_wrangler` from the [PyPi](https://pypi.org/project/network-wrangler/) repository because it is included in Lasso's `requirements.txt`. + +#### Bleeding Edge +If you want to install a more up-to-date or development version of network wrangler and lasso , you can do so by installing it from the `develop` branch of + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/network_wrangler@develop +pip install git+https://github.com/wsp-sag/Lasso@develop +``` + +#### From Clone +If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in [editable mode](https://pip.pypa.io/en/stable/reference/pip_install/?highlight=editable#editable-installs). + +**if you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!** + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas osmnx -n +conda activate +git clone https://github.com/wsp-sag/Lasso +git clone https://github.com/wsp-sag/network_wrangler +cd network_wrangler +pip install -e . +cd .. +cd Lasso +pip install -e . +``` + +Notes: + +1. The -e installs it in editable mode. +2. If you are not part of the project team and want to contribute code bxack to the project, please fork before you clone and then add the original repository to your upstream origin list per [these directions on github](https://help.github.com/en/articles/fork-a-repo). +3. if you wanted to install from a specific tag/version number or branch, replace `@master` with `@` or `@tag` +4. 
If you want to make use of frequent developer updates for network wrangler as well, you can also install it from clone by copying the instructions for cloning and installing Lasso for Network Wrangler + +If you are going to be doing Lasso development, we also recommend: + - a good IDE such as [Atom](http://atom.io), VS Code, Sublime Text, etc. + with Python syntax highlighting turned on. + - [GitHub Desktop](https://desktop.github.com/) to locally update your clones + +## Brief Intro + +Lasso is a 'wrapper' around the [Network Wrangler](http://wsp-sag.github.io/network_wrangler) utility. + +Both Lasso and NetworkWrangler are built around the following data schemas: + - [`roadway network`], which is based on a mashup of Open Street Map and [Shared Streets](http://sharedstreets.io). In Network Wrangler these are read in from three json files reprsenting: links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category. + - [`transit network`], which is based on a frequency-based implementation of the csv-based GTFS; and + - [`project card`], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml. + +In addition, Lasso utilizes the following data schemas: + + - [`MetCouncil Model Roadway Network Schema`], which adds data fields to the `roadway network` schema that MetCouncil uses in their travel model including breaking out data fields by time period. + - [`MetCouncil Model Transit Network Schema`], which uses the Cube PublicTransport format, and + - [`Cube Log Files`], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler. + - [`Cube public transport line files`], which define a set of transit lines in the cube software. + +### Components +Network Wrangler has the following atomic parts: + + - _RoadwayNetwork_ object, which represents the `roadway network` data as GeoDataFrames; + - _TransitNetwork_ object, which represents the `transit network` data as DataFrames; + - _ProjectCard_ object, which represents the data of the `project card`. Project cards identify the infrastructure that is changing (a selection) and defines the changes; or contains information about a new facility to be constructed or a new service to be run.; + - _Scenario_ object, which consist of at least a RoadwayNetwork, and +TransitNetwork. Scenarios can be based on or tiered from other scenarios. +Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network. + +In addition, Lasso has the following atomic parts: + + - _Project_ object, creates project cards from one of the following: a base and a build transit network in cube format, a base and build highway network, or a base highway network and a Cube log file. + - _ModelRoadwayNetwork_ object is a subclass of `RoadwayNetwork` and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube. + - _StandardTransit_, an object for holding a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files. + - _CubeTransit_, an object for storing information about transit defined in `Cube public transport line files` + . 
Has the capability to parse cube line file properties and shapes into python dictionaries and compare line files and represent changes as Project Card dictionaries. + - _Parameters_, A class representing all the parameters defining the networks + including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + +#### RoadwayNetwork + +Reads, writes, queries and and manipulates roadway network data, which +is mainly stored in the GeoDataFrames `links_df`, `nodes_df`, and `shapes_df`. + +```python +net = RoadwayNetwork.read( + link_filename=MY_LINK_FILE, + node_filename=MY_NODE_FILE, + shape_filename=MY_SHAPE_FILE, + shape_foreign_key ='shape_id', + + ) +my_selection = { + "link": [{"name": ["I 35E"]}], + "A": {"osm_node_id": "961117623"}, # start searching for segments at A + "B": {"osm_node_id": "2564047368"}, +} +net.select_roadway_features(my_selection) + +my_change = [ + { + 'property': 'lanes', + 'existing': 1, + 'set': 2, + }, + { + 'property': 'drive_access', + 'set': 0, + }, +] + +my_net.apply_roadway_feature_change( + my_net.select_roadway_features(my_selection), + my_change +) + +ml_net = net.create_managed_lane_network(in_place=False) + +ml_net.is_network_connected(mode="drive")) + +_, disconnected_nodes = ml_net.assess_connectivity( + mode="walk", + ignore_end_nodes=True +) +ml_net.write(filename=my_out_prefix, path=my_dir) +``` +#### TransitNetwork + +#### ProjectCard + +#### Scenario + +Manages sets of project cards and tiering from a base scenario/set of networks. + +```python + +my_base_scenario = { + "road_net": RoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ), + "transit_net": TransitNetwork.read(STPAUL_DIR), +} + +card_filenames = [ + "3_multiple_roadway_attribute_change.yml", + "multiple_changes.yml", + "4_simple_managed_lane.yml", +] + +project_card_directory = os.path.join(STPAUL_DIR, "project_cards") + +project_cards_list = [ + ProjectCard.read(os.path.join(project_card_directory, filename), validate=False) + for filename in card_filenames +] + +my_scenario = Scenario.create_scenario( + base_scenario=my_base_scenario, + project_cards_list=project_cards_list, +) +my_scenario.check_scenario_requisites() + +my_scenario.apply_all_projects() + +my_scenario.scenario_summary() +``` + +#### Project +Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network; + +```python + +test_project = Project.create_project( + base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + +test_project.evaluate_changes() + +test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + +``` + +#### ModelRoadwayNetwork +A subclass of network_wrangler's RoadwayNetwork +class which additional understanding about how to translate and write the +network out to the MetCouncil Roadway Network schema. 
+ +```Python + +net = ModelRoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ) + +net.write_roadway_as_fixedwidth() + +``` + +#### StandardTransit +Translates the standard GTFS data to MetCouncil's Cube Line files. + +```Python +cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) +cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) +``` + +#### CubeTransit +Used by the project class and has the capability to: + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + +```python +tn = CubeTransit.create_from_cube(CUBE_DIR) +transit_change_list = tn.evaluate_differences(base_transit_network) +``` + +#### Parameters +Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary. + +```Python +# read parameters from a yaml configuration file +# could also provide as a key/value pair +with open(config_file) as f: + my_config = yaml.safe_load(f) + +# provide parameters at instantiation of ModelRoadwayNetwork +model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork( + my_scenario.road_net, parameters=my_config.get("my_parameters", {}) + ) +# network written with direction from the parameters given +model_road_net.write_roadway_as_shp() + +``` + +### Typical Workflow + +Workflows in Lasso and Network Wrangler typically accomplish one of two goals: +1. Create Project Cards to document network changes as a result of either transit or roadway projects. +2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network. + +#### Project Cards from Transit LIN Files + + +#### Project Cards from Cube LOG Files + + +#### Model Network Files for a Scenario + + + +## Running Quickstart Jupyter Notebooks + +To learn basic lasso functionality, please refer to the following jupyter notebooks in the `/notebooks` directory: + + - `Lasso Project Card Creation Quickstart.ipynb` + - `Lasso Scenario Creation Quickstart.ipynb` + + Jupyter notebooks can be started by activating the lasso conda environment and typing `jupyter notebook`: + + ```bash + conda activate + jupyter notebook + ``` diff --git a/branch/bicounty_dev/_static/_sphinx_javascript_frameworks_compat.js b/branch/bicounty_dev/_static/_sphinx_javascript_frameworks_compat.js new file mode 100644 index 0000000..8141580 --- /dev/null +++ b/branch/bicounty_dev/_static/_sphinx_javascript_frameworks_compat.js @@ -0,0 +1,123 @@ +/* Compatability shim for jQuery and underscores.js. + * + * Copyright Sphinx contributors + * Released under the two clause BSD licence + */ + +/** + * small helper function to urldecode strings + * + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent#Decoding_query_parameters_from_a_URL + */ +jQuery.urldecode = function(x) { + if (!x) { + return x + } + return decodeURIComponent(x.replace(/\+/g, ' ')); +}; + +/** + * small helper function to urlencode strings + */ +jQuery.urlencode = encodeURIComponent; + +/** + * This function returns the parsed url parameters of the + * current request. Multiple values per key are supported, + * it will always return arrays of strings for the value parts. 
+ */ +jQuery.getQueryParameters = function(s) { + if (typeof s === 'undefined') + s = document.location.search; + var parts = s.substr(s.indexOf('?') + 1).split('&'); + var result = {}; + for (var i = 0; i < parts.length; i++) { + var tmp = parts[i].split('=', 2); + var key = jQuery.urldecode(tmp[0]); + var value = jQuery.urldecode(tmp[1]); + if (key in result) + result[key].push(value); + else + result[key] = [value]; + } + return result; +}; + +/** + * highlight a given string on a jquery object by wrapping it in + * span elements with the given class name. + */ +jQuery.fn.highlightText = function(text, className) { + function highlight(node, addItems) { + if (node.nodeType === 3) { + var val = node.nodeValue; + var pos = val.toLowerCase().indexOf(text); + if (pos >= 0 && + !jQuery(node.parentNode).hasClass(className) && + !jQuery(node.parentNode).hasClass("nohighlight")) { + var span; + var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.className = className; + } + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + node.parentNode.insertBefore(span, node.parentNode.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling)); + node.nodeValue = val.substr(0, pos); + if (isInSVG) { + var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect"); + var bbox = node.parentElement.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute('class', className); + addItems.push({ + "parent": node.parentNode, + "target": rect}); + } + } + } + else if (!jQuery(node).is("button, select, textarea")) { + jQuery.each(node.childNodes, function() { + highlight(this, addItems); + }); + } + } + var addItems = []; + var result = this.each(function() { + highlight(this, addItems); + }); + for (var i = 0; i < addItems.length; ++i) { + jQuery(addItems[i].parent).before(addItems[i].target); + } + return result; +}; + +/* + * backward compatibility for jQuery.browser + * This will be supported until firefox bug is fixed. + */ +if (!jQuery.browser) { + jQuery.uaMatch = function(ua) { + ua = ua.toLowerCase(); + + var match = /(chrome)[ \/]([\w.]+)/.exec(ua) || + /(webkit)[ \/]([\w.]+)/.exec(ua) || + /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) || + /(msie) ([\w.]+)/.exec(ua) || + ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) || + []; + + return { + browser: match[ 1 ] || "", + version: match[ 2 ] || "0" + }; + }; + jQuery.browser = {}; + jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true; +} diff --git a/branch/bicounty_dev/_static/basic.css b/branch/bicounty_dev/_static/basic.css new file mode 100644 index 0000000..cfc60b8 --- /dev/null +++ b/branch/bicounty_dev/_static/basic.css @@ -0,0 +1,921 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. 
+ * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 230px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li p.context { + color: #888; + margin: 2px 0 0 30px; + text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + 
+div.body { + min-width: 360px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, figure.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, figure.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, figure.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, figure.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar, +aside.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} + +nav.contents, +aside.topic, +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ + +nav.contents, +aside.topic, +div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +aside.sidebar > :last-child, +nav.contents > :last-child, +aside.topic > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +aside.sidebar::after, +nav.contents::after, +aside.topic::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + 
border-left: 0; + border-right: 0; + border-bottom: 1px solid #aaa; +} + +th { + text-align: left; + padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure, figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption, figcaption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number, +figcaption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text, +figcaption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + +/* -- object description styles --------------------------------------------- */ + +.sig { + font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; +} + +.sig-name, code.descname { + background-color: transparent; + font-weight: bold; +} + +.sig-name { + font-size: 1.1em; +} + +code.descname { + font-size: 1.2em; +} + +.sig-prename, code.descclassname { + background-color: transparent; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.sig-param.n { + font-style: italic; +} + +/* C++ specific styling */ + +.sig-inline.c-texpr, +.sig-inline.cpp-texpr { + font-family: unset; +} + +.sig.c .k, .sig.c .kt, +.sig.cpp .k, .sig.cpp .kt { + color: #0033B3; +} + +.sig.c .m, +.sig.cpp .m { + color: #1750EB; +} + +.sig.c .s, .sig.c .sc, +.sig.cpp .s, .sig.cpp .sc { + color: #067D17; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} + +aside.footnote > span, +div.citation > span { + float: left; +} +aside.footnote > span:last-of-type, +div.citation > span:last-of-type { + padding-right: 0.5em; +} +aside.footnote > p { + margin-left: 2em; +} +div.citation > p { + margin-left: 4em; +} +aside.footnote > p:last-of-type, +div.citation > p:last-of-type { + margin-bottom: 0em; +} +aside.footnote > p:last-of-type:after, +div.citation > p:last-of-type:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; 
+ word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +.sig dd { + margin-top: 0px; + margin-bottom: 0px; +} + +.sig dl { + margin-top: 0px; + margin-bottom: 0px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: oblique; +} + +.classifier:before { + font-style: normal; + margin: 0 0.5em; + content: ":"; + display: inline-block; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +.translated { + background-color: rgba(207, 255, 207, 0.2) +} + +.untranslated { + background-color: rgba(255, 207, 207, 0.2) +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; + white-space: nowrap; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + +td.linenos pre { + border: 0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +span.linenos, +div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; + -webkit-user-select: text; /* Safari fallback only */ + -webkit-user-select: none; /* Chrome/Safari */ + -moz-user-select: none; /* Firefox */ + -ms-user-select: none; /* IE10+ */ +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: 
-1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + +span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + +@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/branch/bicounty_dev/_static/css/badge_only.css b/branch/bicounty_dev/_static/css/badge_only.css new file mode 100644 index 0000000..c718cee --- /dev/null +++ b/branch/bicounty_dev/_static/css/badge_only.css @@ -0,0 +1 @@ +.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-style:normal;font-weight:400;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#FontAwesome) format("svg")}.fa:before{font-family:FontAwesome;font-style:normal;font-weight:400;line-height:1}.fa:before,a .fa{text-decoration:inherit}.fa:before,a .fa,li .fa{display:inline-block}li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before,.icon-book:before{content:"\f02d"}.fa-caret-down:before,.icon-caret-down:before{content:"\f0d7"}.fa-caret-up:before,.icon-caret-up:before{content:"\f0d8"}.fa-caret-left:before,.icon-caret-left:before{content:"\f0d9"}.fa-caret-right:before,.icon-caret-right:before{content:"\f0da"}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60}.rst-versions .rst-current-version:after{clear:both;content:"";display:block}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd 
a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}} \ No newline at end of file diff --git a/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Bold.woff b/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Bold.woff differ diff --git a/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Bold.woff2 b/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Regular.woff b/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Regular.woff differ diff --git a/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Regular.woff2 b/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.eot b/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.eot new file mode 100644 index 0000000..e9f60ca Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.eot differ diff --git a/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.svg b/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.svg new file mode 100644 index 0000000..855c845 --- /dev/null +++ b/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.ttf b/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.ttf new file mode 100644 index 0000000..35acda2 Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.ttf differ diff --git a/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.woff b/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.woff new file mode 100644 index 0000000..400014a Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.woff differ diff --git a/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.woff2 b/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.woff2 new file mode 100644 index 0000000..4d13fc6 Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/fontawesome-webfont.woff2 differ diff --git a/branch/bicounty_dev/_static/css/fonts/lato-bold-italic.woff b/branch/bicounty_dev/_static/css/fonts/lato-bold-italic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/lato-bold-italic.woff differ diff --git a/branch/bicounty_dev/_static/css/fonts/lato-bold-italic.woff2 b/branch/bicounty_dev/_static/css/fonts/lato-bold-italic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/lato-bold-italic.woff2 differ diff --git a/branch/bicounty_dev/_static/css/fonts/lato-bold.woff b/branch/bicounty_dev/_static/css/fonts/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/lato-bold.woff differ diff --git a/branch/bicounty_dev/_static/css/fonts/lato-bold.woff2 b/branch/bicounty_dev/_static/css/fonts/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/lato-bold.woff2 differ diff --git a/branch/bicounty_dev/_static/css/fonts/lato-normal-italic.woff b/branch/bicounty_dev/_static/css/fonts/lato-normal-italic.woff new file mode 100644 index 0000000..76114bc Binary files /dev/null and 
b/branch/bicounty_dev/_static/css/fonts/lato-normal-italic.woff differ diff --git a/branch/bicounty_dev/_static/css/fonts/lato-normal-italic.woff2 b/branch/bicounty_dev/_static/css/fonts/lato-normal-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/lato-normal-italic.woff2 differ diff --git a/branch/bicounty_dev/_static/css/fonts/lato-normal.woff b/branch/bicounty_dev/_static/css/fonts/lato-normal.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/lato-normal.woff differ diff --git a/branch/bicounty_dev/_static/css/fonts/lato-normal.woff2 b/branch/bicounty_dev/_static/css/fonts/lato-normal.woff2 new file mode 100644 index 0000000..3bf9843 Binary files /dev/null and b/branch/bicounty_dev/_static/css/fonts/lato-normal.woff2 differ diff --git a/branch/bicounty_dev/_static/css/theme.css b/branch/bicounty_dev/_static/css/theme.css new file mode 100644 index 0000000..19a446a --- /dev/null +++ b/branch/bicounty_dev/_static/css/theme.css @@ -0,0 +1,4 @@ +html{box-sizing:border-box}*,:after,:before{box-sizing:inherit}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}[hidden],audio:not([controls]){display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:active,a:hover{outline:0}abbr[title]{border-bottom:1px dotted}b,strong{font-weight:700}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;text-decoration:none}ins,mark{color:#000}mark{background:#ff0;font-style:italic;font-weight:700}.rst-content code,.rst-content tt,code,kbd,pre,samp{font-family:monospace,serif;_font-family:courier new,monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:after,q:before{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sup{top:-.5em}sub{bottom:-.25em}dl,ol,ul{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure,form{margin:0}label{cursor:pointer}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type=button],input[type=reset],input[type=submit]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type=search]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}textarea{resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:.2em 0;background:#ccc;color:#000;padding:.2em 0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir br{display:none}.hidden{display:none!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media 
print{body,html,section{background:none!important}*{box-shadow:none!important;text-shadow:none!important;filter:none!important;-ms-filter:none!important}a,a:visited{text-decoration:underline}.ir a:after,a[href^="#"]:after,a[href^="javascript:"]:after{content:""}blockquote,pre{page-break-inside:avoid}thead{display:table-header-group}img,tr{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}.rst-content .toctree-wrapper>p.caption,h2,h3,p{orphans:3;widows:3}.rst-content .toctree-wrapper>p.caption,h2,h3{page-break-after:avoid}}.btn,.fa:before,.icon:before,.rst-content .admonition,.rst-content .admonition-title:before,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .code-block-caption .headerlink:before,.rst-content .danger,.rst-content .eqno .headerlink:before,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-alert,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before,input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week],select,textarea{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}/*! 
+ * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome + * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License) + */@font-face{font-family:FontAwesome;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713);src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix&v=4.7.0) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#fontawesomeregular) format("svg");font-weight:400;font-style:normal}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{display:inline-block;font:normal normal normal 14px/1 FontAwesome;font-size:inherit;text-rendering:auto;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.fa-lg{font-size:1.33333em;line-height:.75em;vertical-align:-15%}.fa-2x{font-size:2em}.fa-3x{font-size:3em}.fa-4x{font-size:4em}.fa-5x{font-size:5em}.fa-fw{width:1.28571em;text-align:center}.fa-ul{padding-left:0;margin-left:2.14286em;list-style-type:none}.fa-ul>li{position:relative}.fa-li{position:absolute;left:-2.14286em;width:2.14286em;top:.14286em;text-align:center}.fa-li.fa-lg{left:-1.85714em}.fa-border{padding:.2em .25em .15em;border:.08em solid #eee;border-radius:.1em}.fa-pull-left{float:left}.fa-pull-right{float:right}.fa-pull-left.icon,.fa.fa-pull-left,.rst-content .code-block-caption .fa-pull-left.headerlink,.rst-content .eqno .fa-pull-left.headerlink,.rst-content .fa-pull-left.admonition-title,.rst-content code.download span.fa-pull-left:first-child,.rst-content dl dt .fa-pull-left.headerlink,.rst-content h1 .fa-pull-left.headerlink,.rst-content h2 .fa-pull-left.headerlink,.rst-content h3 .fa-pull-left.headerlink,.rst-content h4 .fa-pull-left.headerlink,.rst-content h5 .fa-pull-left.headerlink,.rst-content h6 .fa-pull-left.headerlink,.rst-content p .fa-pull-left.headerlink,.rst-content table>caption .fa-pull-left.headerlink,.rst-content tt.download span.fa-pull-left:first-child,.wy-menu-vertical li.current>a button.fa-pull-left.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-left.toctree-expand,.wy-menu-vertical li button.fa-pull-left.toctree-expand{margin-right:.3em}.fa-pull-right.icon,.fa.fa-pull-right,.rst-content .code-block-caption .fa-pull-right.headerlink,.rst-content .eqno .fa-pull-right.headerlink,.rst-content .fa-pull-right.admonition-title,.rst-content code.download span.fa-pull-right:first-child,.rst-content dl dt .fa-pull-right.headerlink,.rst-content h1 .fa-pull-right.headerlink,.rst-content h2 .fa-pull-right.headerlink,.rst-content h3 .fa-pull-right.headerlink,.rst-content h4 .fa-pull-right.headerlink,.rst-content h5 .fa-pull-right.headerlink,.rst-content h6 
.fa-pull-right.headerlink,.rst-content p .fa-pull-right.headerlink,.rst-content table>caption .fa-pull-right.headerlink,.rst-content tt.download span.fa-pull-right:first-child,.wy-menu-vertical li.current>a button.fa-pull-right.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-right.toctree-expand,.wy-menu-vertical li button.fa-pull-right.toctree-expand{margin-left:.3em}.pull-right{float:right}.pull-left{float:left}.fa.pull-left,.pull-left.icon,.rst-content .code-block-caption .pull-left.headerlink,.rst-content .eqno .pull-left.headerlink,.rst-content .pull-left.admonition-title,.rst-content code.download span.pull-left:first-child,.rst-content dl dt .pull-left.headerlink,.rst-content h1 .pull-left.headerlink,.rst-content h2 .pull-left.headerlink,.rst-content h3 .pull-left.headerlink,.rst-content h4 .pull-left.headerlink,.rst-content h5 .pull-left.headerlink,.rst-content h6 .pull-left.headerlink,.rst-content p .pull-left.headerlink,.rst-content table>caption .pull-left.headerlink,.rst-content tt.download span.pull-left:first-child,.wy-menu-vertical li.current>a button.pull-left.toctree-expand,.wy-menu-vertical li.on a button.pull-left.toctree-expand,.wy-menu-vertical li button.pull-left.toctree-expand{margin-right:.3em}.fa.pull-right,.pull-right.icon,.rst-content .code-block-caption .pull-right.headerlink,.rst-content .eqno .pull-right.headerlink,.rst-content .pull-right.admonition-title,.rst-content code.download span.pull-right:first-child,.rst-content dl dt .pull-right.headerlink,.rst-content h1 .pull-right.headerlink,.rst-content h2 .pull-right.headerlink,.rst-content h3 .pull-right.headerlink,.rst-content h4 .pull-right.headerlink,.rst-content h5 .pull-right.headerlink,.rst-content h6 .pull-right.headerlink,.rst-content p .pull-right.headerlink,.rst-content table>caption .pull-right.headerlink,.rst-content tt.download span.pull-right:first-child,.wy-menu-vertical li.current>a button.pull-right.toctree-expand,.wy-menu-vertical li.on a button.pull-right.toctree-expand,.wy-menu-vertical li button.pull-right.toctree-expand{margin-left:.3em}.fa-spin{-webkit-animation:fa-spin 2s linear infinite;animation:fa-spin 2s linear infinite}.fa-pulse{-webkit-animation:fa-spin 1s steps(8) infinite;animation:fa-spin 1s steps(8) infinite}@-webkit-keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}.fa-rotate-90{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=1)";-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.fa-rotate-180{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2)";-webkit-transform:rotate(180deg);-ms-transform:rotate(180deg);transform:rotate(180deg)}.fa-rotate-270{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=3)";-webkit-transform:rotate(270deg);-ms-transform:rotate(270deg);transform:rotate(270deg)}.fa-flip-horizontal{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=0, mirror=1)";-webkit-transform:scaleX(-1);-ms-transform:scaleX(-1);transform:scaleX(-1)}.fa-flip-vertical{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2, mirror=1)";-webkit-transform:scaleY(-1);-ms-transform:scaleY(-1);transform:scaleY(-1)}:root .fa-flip-horizontal,:root .fa-flip-vertical,:root .fa-rotate-90,:root .fa-rotate-180,:root 
.fa-rotate-270{filter:none}.fa-stack{position:relative;display:inline-block;width:2em;height:2em;line-height:2em;vertical-align:middle}.fa-stack-1x,.fa-stack-2x{position:absolute;left:0;width:100%;text-align:center}.fa-stack-1x{line-height:inherit}.fa-stack-2x{font-size:2em}.fa-inverse{color:#fff}.fa-glass:before{content:""}.fa-music:before{content:""}.fa-search:before,.icon-search:before{content:""}.fa-envelope-o:before{content:""}.fa-heart:before{content:""}.fa-star:before{content:""}.fa-star-o:before{content:""}.fa-user:before{content:""}.fa-film:before{content:""}.fa-th-large:before{content:""}.fa-th:before{content:""}.fa-th-list:before{content:""}.fa-check:before{content:""}.fa-close:before,.fa-remove:before,.fa-times:before{content:""}.fa-search-plus:before{content:""}.fa-search-minus:before{content:""}.fa-power-off:before{content:""}.fa-signal:before{content:""}.fa-cog:before,.fa-gear:before{content:""}.fa-trash-o:before{content:""}.fa-home:before,.icon-home:before{content:""}.fa-file-o:before{content:""}.fa-clock-o:before{content:""}.fa-road:before{content:""}.fa-download:before,.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{content:""}.fa-arrow-circle-o-down:before{content:""}.fa-arrow-circle-o-up:before{content:""}.fa-inbox:before{content:""}.fa-play-circle-o:before{content:""}.fa-repeat:before,.fa-rotate-right:before{content:""}.fa-refresh:before{content:""}.fa-list-alt:before{content:""}.fa-lock:before{content:""}.fa-flag:before{content:""}.fa-headphones:before{content:""}.fa-volume-off:before{content:""}.fa-volume-down:before{content:""}.fa-volume-up:before{content:""}.fa-qrcode:before{content:""}.fa-barcode:before{content:""}.fa-tag:before{content:""}.fa-tags:before{content:""}.fa-book:before,.icon-book:before{content:""}.fa-bookmark:before{content:""}.fa-print:before{content:""}.fa-camera:before{content:""}.fa-font:before{content:""}.fa-bold:before{content:""}.fa-italic:before{content:""}.fa-text-height:before{content:""}.fa-text-width:before{content:""}.fa-align-left:before{content:""}.fa-align-center:before{content:""}.fa-align-right:before{content:""}.fa-align-justify:before{content:""}.fa-list:before{content:""}.fa-dedent:before,.fa-outdent:before{content:""}.fa-indent:before{content:""}.fa-video-camera:before{content:""}.fa-image:before,.fa-photo:before,.fa-picture-o:before{content:""}.fa-pencil:before{content:""}.fa-map-marker:before{content:""}.fa-adjust:before{content:""}.fa-tint:before{content:""}.fa-edit:before,.fa-pencil-square-o:before{content:""}.fa-share-square-o:before{content:""}.fa-check-square-o:before{content:""}.fa-arrows:before{content:""}.fa-step-backward:before{content:""}.fa-fast-backward:before{content:""}.fa-backward:before{content:""}.fa-play:before{content:""}.fa-pause:before{content:""}.fa-stop:before{content:""}.fa-forward:before{content:""}.fa-fast-forward:before{content:""}.fa-step-forward:before{content:""}.fa-eject:before{content:""}.fa-chevron-left:before{content:""}.fa-chevron-right:before{content:""}.fa-plus-circle:before{content:""}.fa-minus-circle:before{content:""}.fa-times-circle:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before{content:""}.fa-check-circle:before,.wy-inline-validate.wy-inline-validate-success 
.wy-input-context:before{content:""}.fa-question-circle:before{content:""}.fa-info-circle:before{content:""}.fa-crosshairs:before{content:""}.fa-times-circle-o:before{content:""}.fa-check-circle-o:before{content:""}.fa-ban:before{content:""}.fa-arrow-left:before{content:""}.fa-arrow-right:before{content:""}.fa-arrow-up:before{content:""}.fa-arrow-down:before{content:""}.fa-mail-forward:before,.fa-share:before{content:""}.fa-expand:before{content:""}.fa-compress:before{content:""}.fa-plus:before{content:""}.fa-minus:before{content:""}.fa-asterisk:before{content:""}.fa-exclamation-circle:before,.rst-content .admonition-title:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before{content:""}.fa-gift:before{content:""}.fa-leaf:before{content:""}.fa-fire:before,.icon-fire:before{content:""}.fa-eye:before{content:""}.fa-eye-slash:before{content:""}.fa-exclamation-triangle:before,.fa-warning:before{content:""}.fa-plane:before{content:""}.fa-calendar:before{content:""}.fa-random:before{content:""}.fa-comment:before{content:""}.fa-magnet:before{content:""}.fa-chevron-up:before{content:""}.fa-chevron-down:before{content:""}.fa-retweet:before{content:""}.fa-shopping-cart:before{content:""}.fa-folder:before{content:""}.fa-folder-open:before{content:""}.fa-arrows-v:before{content:""}.fa-arrows-h:before{content:""}.fa-bar-chart-o:before,.fa-bar-chart:before{content:""}.fa-twitter-square:before{content:""}.fa-facebook-square:before{content:""}.fa-camera-retro:before{content:""}.fa-key:before{content:""}.fa-cogs:before,.fa-gears:before{content:""}.fa-comments:before{content:""}.fa-thumbs-o-up:before{content:""}.fa-thumbs-o-down:before{content:""}.fa-star-half:before{content:""}.fa-heart-o:before{content:""}.fa-sign-out:before{content:""}.fa-linkedin-square:before{content:""}.fa-thumb-tack:before{content:""}.fa-external-link:before{content:""}.fa-sign-in:before{content:""}.fa-trophy:before{content:""}.fa-github-square:before{content:""}.fa-upload:before{content:""}.fa-lemon-o:before{content:""}.fa-phone:before{content:""}.fa-square-o:before{content:""}.fa-bookmark-o:before{content:""}.fa-phone-square:before{content:""}.fa-twitter:before{content:""}.fa-facebook-f:before,.fa-facebook:before{content:""}.fa-github:before,.icon-github:before{content:""}.fa-unlock:before{content:""}.fa-credit-card:before{content:""}.fa-feed:before,.fa-rss:before{content:""}.fa-hdd-o:before{content:""}.fa-bullhorn:before{content:""}.fa-bell:before{content:""}.fa-certificate:before{content:""}.fa-hand-o-right:before{content:""}.fa-hand-o-left:before{content:""}.fa-hand-o-up:before{content:""}.fa-hand-o-down:before{content:""}.fa-arrow-circle-left:before,.icon-circle-arrow-left:before{content:""}.fa-arrow-circle-right:before,.icon-circle-arrow-right:before{content:""}.fa-arrow-circle-up:before{content:""}.fa-arrow-circle-down:before{content:""}.fa-globe:before{content:""}.fa-wrench:before{content:""}.fa-tasks:before{content:""}.fa-filter:before{content:""}.fa-briefcase:before{content:""}.fa-arrows-alt:before{content:""}.fa-group:before,.fa-users:before{content:""}.fa-chain:before,.fa-link:before,.icon-link:before{content:""}.fa-cloud:before{content:""}.fa-flask:before{content:""}.fa-cut:before,.fa-scissors:before{content:""}.fa-copy:before,.fa-files-o:before{content:""}.fa-paperclip:before{content:""}.fa-floppy-o:before,.fa-save:before{content:""}.fa
-square:before{content:""}.fa-bars:before,.fa-navicon:before,.fa-reorder:before{content:""}.fa-list-ul:before{content:""}.fa-list-ol:before{content:""}.fa-strikethrough:before{content:""}.fa-underline:before{content:""}.fa-table:before{content:""}.fa-magic:before{content:""}.fa-truck:before{content:""}.fa-pinterest:before{content:""}.fa-pinterest-square:before{content:""}.fa-google-plus-square:before{content:""}.fa-google-plus:before{content:""}.fa-money:before{content:""}.fa-caret-down:before,.icon-caret-down:before,.wy-dropdown .caret:before{content:""}.fa-caret-up:before{content:""}.fa-caret-left:before{content:""}.fa-caret-right:before{content:""}.fa-columns:before{content:""}.fa-sort:before,.fa-unsorted:before{content:""}.fa-sort-desc:before,.fa-sort-down:before{content:""}.fa-sort-asc:before,.fa-sort-up:before{content:""}.fa-envelope:before{content:""}.fa-linkedin:before{content:""}.fa-rotate-left:before,.fa-undo:before{content:""}.fa-gavel:before,.fa-legal:before{content:""}.fa-dashboard:before,.fa-tachometer:before{content:""}.fa-comment-o:before{content:""}.fa-comments-o:before{content:""}.fa-bolt:before,.fa-flash:before{content:""}.fa-sitemap:before{content:""}.fa-umbrella:before{content:""}.fa-clipboard:before,.fa-paste:before{content:""}.fa-lightbulb-o:before{content:""}.fa-exchange:before{content:""}.fa-cloud-download:before{content:""}.fa-cloud-upload:before{content:""}.fa-user-md:before{content:""}.fa-stethoscope:before{content:""}.fa-suitcase:before{content:""}.fa-bell-o:before{content:""}.fa-coffee:before{content:""}.fa-cutlery:before{content:""}.fa-file-text-o:before{content:""}.fa-building-o:before{content:""}.fa-hospital-o:before{content:""}.fa-ambulance:before{content:""}.fa-medkit:before{content:""}.fa-fighter-jet:before{content:""}.fa-beer:before{content:""}.fa-h-square:before{content:""}.fa-plus-square:before{content:""}.fa-angle-double-left:before{content:""}.fa-angle-double-right:before{content:""}.fa-angle-double-up:before{content:""}.fa-angle-double-down:before{content:""}.fa-angle-left:before{content:""}.fa-angle-right:before{content:""}.fa-angle-up:before{content:""}.fa-angle-down:before{content:""}.fa-desktop:before{content:""}.fa-laptop:before{content:""}.fa-tablet:before{content:""}.fa-mobile-phone:before,.fa-mobile:before{content:""}.fa-circle-o:before{content:""}.fa-quote-left:before{content:""}.fa-quote-right:before{content:""}.fa-spinner:before{content:""}.fa-circle:before{content:""}.fa-mail-reply:before,.fa-reply:before{content:""}.fa-github-alt:before{content:""}.fa-folder-o:before{content:""}.fa-folder-open-o:before{content:""}.fa-smile-o:before{content:""}.fa-frown-o:before{content:""}.fa-meh-o:before{content:""}.fa-gamepad:before{content:""}.fa-keyboard-o:before{content:""}.fa-flag-o:before{content:""}.fa-flag-checkered:before{content:""}.fa-terminal:before{content:""}.fa-code:before{content:""}.fa-mail-reply-all:before,.fa-reply-all:before{content:""}.fa-star-half-empty:before,.fa-star-half-full:before,.fa-star-half-o:before{content:""}.fa-location-arrow:before{content:""}.fa-crop:before{content:""}.fa-code-fork:before{content:""}.fa-chain-broken:before,.fa-unlink:before{content:""}.fa-question:before{content:""}.fa-info:before{content:""}.fa-exclamation:before{content:""}.fa-superscript:before{content:""}.fa-subscript:before{content:""}.fa-eraser:before{content:""}.fa-puzzle-piece:before{content:""}.fa-microphone:before{content:""}.fa-microphone-slash:
button.toctree-expand{display:block;line-height:18px;color:#333}.wy-menu-vertical li.toctree-l1.current>a{border-bottom:1px solid #c9c9c9;border-top:1px solid #c9c9c9}.wy-menu-vertical .toctree-l1.current .toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .toctree-l11>ul{display:none}.wy-menu-vertical .toctree-l1.current .current.toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .current.toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .current.toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .current.toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .current.toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .current.toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .current.toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .current.toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .current.toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .current.toctree-l11>ul{display:block}.wy-menu-vertical li.toctree-l3,.wy-menu-vertical li.toctree-l4{font-size:.9em}.wy-menu-vertical li.toctree-l2 a,.wy-menu-vertical li.toctree-l3 a,.wy-menu-vertical li.toctree-l4 a,.wy-menu-vertical li.toctree-l5 a,.wy-menu-vertical li.toctree-l6 a,.wy-menu-vertical li.toctree-l7 a,.wy-menu-vertical li.toctree-l8 a,.wy-menu-vertical li.toctree-l9 a,.wy-menu-vertical li.toctree-l10 a{color:#404040}.wy-menu-vertical li.toctree-l2 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l3 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l4 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l5 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l6 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l7 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l8 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l9 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l10 a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a,.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a,.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a,.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a,.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a,.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a,.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a,.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{display:block}.wy-menu-vertical li.toctree-l2.current>a{padding:.4045em 2.427em}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{padding:.4045em 1.618em .4045em 4.045em}.wy-menu-vertical li.toctree-l3.current>a{padding:.4045em 4.045em}.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{padding:.4045em 1.618em .4045em 5.663em}.wy-menu-vertical li.toctree-l4.current>a{padding:.4045em 5.663em}.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a{padding:.4045em 1.618em .4045em 7.281em}.wy-menu-vertical li.toctree-l5.current>a{padding:.4045em 7.281em}.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a{padding:.4045em 1.618em .4045em 8.899em}.wy-menu-vertical li.toctree-l6.current>a{padding:.4045em 
8.899em}.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a{padding:.4045em 1.618em .4045em 10.517em}.wy-menu-vertical li.toctree-l7.current>a{padding:.4045em 10.517em}.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a{padding:.4045em 1.618em .4045em 12.135em}.wy-menu-vertical li.toctree-l8.current>a{padding:.4045em 12.135em}.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a{padding:.4045em 1.618em .4045em 13.753em}.wy-menu-vertical li.toctree-l9.current>a{padding:.4045em 13.753em}.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a{padding:.4045em 1.618em .4045em 15.371em}.wy-menu-vertical li.toctree-l10.current>a{padding:.4045em 15.371em}.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{padding:.4045em 1.618em .4045em 16.989em}.wy-menu-vertical li.toctree-l2.current>a,.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{background:#c9c9c9}.wy-menu-vertical li.toctree-l2 button.toctree-expand{color:#a3a3a3}.wy-menu-vertical li.toctree-l3.current>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{background:#bdbdbd}.wy-menu-vertical li.toctree-l3 button.toctree-expand{color:#969696}.wy-menu-vertical li.current ul{display:block}.wy-menu-vertical li ul{margin-bottom:0;display:none}.wy-menu-vertical li ul li a{margin-bottom:0;color:#d9d9d9;font-weight:400}.wy-menu-vertical a{line-height:18px;padding:.4045em 1.618em;display:block;position:relative;font-size:90%;color:#d9d9d9}.wy-menu-vertical a:hover{background-color:#4e4a4a;cursor:pointer}.wy-menu-vertical a:hover button.toctree-expand{color:#d9d9d9}.wy-menu-vertical a:active{background-color:#2980b9;cursor:pointer;color:#fff}.wy-menu-vertical a:active button.toctree-expand{color:#fff}.wy-side-nav-search{display:block;width:300px;padding:.809em;margin-bottom:.809em;z-index:200;background-color:#2980b9;text-align:center;color:#fcfcfc}.wy-side-nav-search input[type=text]{width:100%;border-radius:50px;padding:6px 12px;border-color:#2472a4}.wy-side-nav-search img{display:block;margin:auto auto .809em;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-side-nav-search .wy-dropdown>a,.wy-side-nav-search>a{color:#fcfcfc;font-size:100%;font-weight:700;display:inline-block;padding:4px 6px;margin-bottom:.809em;max-width:100%}.wy-side-nav-search .wy-dropdown>a:hover,.wy-side-nav-search>a:hover{background:hsla(0,0%,100%,.1)}.wy-side-nav-search .wy-dropdown>a img.logo,.wy-side-nav-search>a img.logo{display:block;margin:0 auto;height:auto;width:auto;border-radius:0;max-width:100%;background:transparent}.wy-side-nav-search .wy-dropdown>a.icon img.logo,.wy-side-nav-search>a.icon img.logo{margin-top:.85em}.wy-side-nav-search>div.version{margin-top:-.4045em;margin-bottom:.809em;font-weight:400;color:hsla(0,0%,100%,.3)}.wy-nav .wy-menu-vertical header{color:#2980b9}.wy-nav .wy-menu-vertical a{color:#b3b3b3}.wy-nav .wy-menu-vertical a:hover{background-color:#2980b9;color:#fff}[data-menu-wrap]{-webkit-transition:all .2s ease-in;-moz-transition:all .2s ease-in;transition:all .2s 
ease-in;position:absolute;opacity:1;width:100%;opacity:0}[data-menu-wrap].move-center{left:0;right:auto;opacity:1}[data-menu-wrap].move-left{right:auto;left:-100%;opacity:0}[data-menu-wrap].move-right{right:-100%;left:auto;opacity:0}.wy-body-for-nav{background:#fcfcfc}.wy-grid-for-nav{position:absolute;width:100%;height:100%}.wy-nav-side{position:fixed;top:0;bottom:0;left:0;padding-bottom:2em;width:300px;overflow-x:hidden;overflow-y:hidden;min-height:100%;color:#9b9b9b;background:#343131;z-index:200}.wy-side-scroll{width:320px;position:relative;overflow-x:hidden;overflow-y:scroll;height:100%}.wy-nav-top{display:none;background:#2980b9;color:#fff;padding:.4045em .809em;position:relative;line-height:50px;text-align:center;font-size:100%;*zoom:1}.wy-nav-top:after,.wy-nav-top:before{display:table;content:""}.wy-nav-top:after{clear:both}.wy-nav-top a{color:#fff;font-weight:700}.wy-nav-top img{margin-right:12px;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-nav-top i{font-size:30px;float:left;cursor:pointer;padding-top:inherit}.wy-nav-content-wrap{margin-left:300px;background:#fcfcfc;min-height:100%}.wy-nav-content{padding:1.618em 3.236em;height:100%;max-width:800px;margin:auto}.wy-body-mask{position:fixed;width:100%;height:100%;background:rgba(0,0,0,.2);display:none;z-index:499}.wy-body-mask.on{display:block}footer{color:grey}footer p{margin-bottom:12px}.rst-content footer span.commit tt,footer span.commit .rst-content tt,footer span.commit code{padding:0;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:1em;background:none;border:none;color:grey}.rst-footer-buttons{*zoom:1}.rst-footer-buttons:after,.rst-footer-buttons:before{width:100%;display:table;content:""}.rst-footer-buttons:after{clear:both}.rst-breadcrumbs-buttons{margin-top:12px;*zoom:1}.rst-breadcrumbs-buttons:after,.rst-breadcrumbs-buttons:before{display:table;content:""}.rst-breadcrumbs-buttons:after{clear:both}#search-results .search li{margin-bottom:24px;border-bottom:1px solid #e1e4e5;padding-bottom:24px}#search-results .search li:first-child{border-top:1px solid #e1e4e5;padding-top:24px}#search-results .search li a{font-size:120%;margin-bottom:12px;display:inline-block}#search-results .context{color:grey;font-size:90%}.genindextable li>ul{margin-left:24px}@media screen and (max-width:768px){.wy-body-for-nav{background:#fcfcfc}.wy-nav-top{display:block}.wy-nav-side{left:-300px}.wy-nav-side.shift{width:85%;left:0}.wy-menu.wy-menu-vertical,.wy-side-nav-search,.wy-side-scroll{width:auto}.wy-nav-content-wrap{margin-left:0}.wy-nav-content-wrap .wy-nav-content{padding:1.618em}.wy-nav-content-wrap.shift{position:fixed;min-width:100%;left:85%;top:0;height:100%;overflow:hidden}}@media screen and (min-width:1100px){.wy-nav-content-wrap{background:rgba(0,0,0,.05)}.wy-nav-content{margin:0;background:#fcfcfc}}@media print{.rst-versions,.wy-nav-side,footer{display:none}.wy-nav-content-wrap{margin-left:0}}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60;*zoom:1}.rst-versions .rst-current-version:after,.rst-versions .rst-current-version:before{display:table;content:""}.rst-versions 
.rst-current-version:after{clear:both}.rst-content .code-block-caption .rst-versions .rst-current-version .headerlink,.rst-content .eqno .rst-versions .rst-current-version .headerlink,.rst-content .rst-versions .rst-current-version .admonition-title,.rst-content code.download .rst-versions .rst-current-version span:first-child,.rst-content dl dt .rst-versions .rst-current-version .headerlink,.rst-content h1 .rst-versions .rst-current-version .headerlink,.rst-content h2 .rst-versions .rst-current-version .headerlink,.rst-content h3 .rst-versions .rst-current-version .headerlink,.rst-content h4 .rst-versions .rst-current-version .headerlink,.rst-content h5 .rst-versions .rst-current-version .headerlink,.rst-content h6 .rst-versions .rst-current-version .headerlink,.rst-content p .rst-versions .rst-current-version .headerlink,.rst-content table>caption .rst-versions .rst-current-version .headerlink,.rst-content tt.download .rst-versions .rst-current-version span:first-child,.rst-versions .rst-current-version .fa,.rst-versions .rst-current-version .icon,.rst-versions .rst-current-version .rst-content .admonition-title,.rst-versions .rst-current-version .rst-content .code-block-caption .headerlink,.rst-versions .rst-current-version .rst-content .eqno .headerlink,.rst-versions .rst-current-version .rst-content code.download span:first-child,.rst-versions .rst-current-version .rst-content dl dt .headerlink,.rst-versions .rst-current-version .rst-content h1 .headerlink,.rst-versions .rst-current-version .rst-content h2 .headerlink,.rst-versions .rst-current-version .rst-content h3 .headerlink,.rst-versions .rst-current-version .rst-content h4 .headerlink,.rst-versions .rst-current-version .rst-content h5 .headerlink,.rst-versions .rst-current-version .rst-content h6 .headerlink,.rst-versions .rst-current-version .rst-content p .headerlink,.rst-versions .rst-current-version .rst-content table>caption .headerlink,.rst-versions .rst-current-version .rst-content tt.download span:first-child,.rst-versions .rst-current-version .wy-menu-vertical li button.toctree-expand,.wy-menu-vertical li .rst-versions .rst-current-version button.toctree-expand{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and 
(max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}}.rst-content .toctree-wrapper>p.caption,.rst-content h1,.rst-content h2,.rst-content h3,.rst-content h4,.rst-content h5,.rst-content h6{margin-bottom:24px}.rst-content img{max-width:100%;height:auto}.rst-content div.figure,.rst-content figure{margin-bottom:24px}.rst-content div.figure .caption-text,.rst-content figure .caption-text{font-style:italic}.rst-content div.figure p:last-child.caption,.rst-content figure p:last-child.caption{margin-bottom:0}.rst-content div.figure.align-center,.rst-content figure.align-center{text-align:center}.rst-content .section>a>img,.rst-content .section>img,.rst-content section>a>img,.rst-content section>img{margin-bottom:24px}.rst-content abbr[title]{text-decoration:none}.rst-content.style-external-links a.reference.external:after{font-family:FontAwesome;content:"\f08e";color:#b3b3b3;vertical-align:super;font-size:60%;margin:0 .2em}.rst-content blockquote{margin-left:24px;line-height:24px;margin-bottom:24px}.rst-content pre.literal-block{white-space:pre;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;display:block;overflow:auto}.rst-content div[class^=highlight],.rst-content pre.literal-block{border:1px solid #e1e4e5;overflow-x:auto;margin:1px 0 24px}.rst-content div[class^=highlight] div[class^=highlight],.rst-content pre.literal-block div[class^=highlight]{padding:0;border:none;margin:0}.rst-content div[class^=highlight] td.code{width:100%}.rst-content .linenodiv pre{border-right:1px solid #e6e9ea;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;user-select:none;pointer-events:none}.rst-content div[class^=highlight] pre{white-space:pre;margin:0;padding:12px;display:block;overflow:auto}.rst-content div[class^=highlight] pre .hll{display:block;margin:0 -12px;padding:0 12px}.rst-content .linenodiv pre,.rst-content div[class^=highlight] pre,.rst-content pre.literal-block{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:12px;line-height:1.4}.rst-content div.highlight .gp,.rst-content div.highlight span.linenos{user-select:none;pointer-events:none}.rst-content div.highlight span.linenos{display:inline-block;padding-left:0;padding-right:12px;margin-right:12px;border-right:1px solid #e6e9ea}.rst-content .code-block-caption{font-style:italic;font-size:85%;line-height:1;padding:1em 0;text-align:center}@media print{.rst-content .codeblock,.rst-content div[class^=highlight],.rst-content div[class^=highlight] pre{white-space:pre-wrap}}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning{clear:both}.rst-content .admonition-todo .last,.rst-content .admonition-todo>:last-child,.rst-content .admonition .last,.rst-content .admonition>:last-child,.rst-content .attention .last,.rst-content .attention>:last-child,.rst-content .caution .last,.rst-content .caution>:last-child,.rst-content .danger .last,.rst-content .danger>:last-child,.rst-content .error .last,.rst-content .error>:last-child,.rst-content .hint .last,.rst-content .hint>:last-child,.rst-content .important .last,.rst-content .important>:last-child,.rst-content .note .last,.rst-content .note>:last-child,.rst-content .seealso 
.last,.rst-content .seealso>:last-child,.rst-content .tip .last,.rst-content .tip>:last-child,.rst-content .warning .last,.rst-content .warning>:last-child{margin-bottom:0}.rst-content .admonition-title:before{margin-right:4px}.rst-content .admonition table{border-color:rgba(0,0,0,.1)}.rst-content .admonition table td,.rst-content .admonition table th{background:transparent!important;border-color:rgba(0,0,0,.1)!important}.rst-content .section ol.loweralpha,.rst-content .section ol.loweralpha>li,.rst-content .toctree-wrapper ol.loweralpha,.rst-content .toctree-wrapper ol.loweralpha>li,.rst-content section ol.loweralpha,.rst-content section ol.loweralpha>li{list-style:lower-alpha}.rst-content .section ol.upperalpha,.rst-content .section ol.upperalpha>li,.rst-content .toctree-wrapper ol.upperalpha,.rst-content .toctree-wrapper ol.upperalpha>li,.rst-content section ol.upperalpha,.rst-content section ol.upperalpha>li{list-style:upper-alpha}.rst-content .section ol li>*,.rst-content .section ul li>*,.rst-content .toctree-wrapper ol li>*,.rst-content .toctree-wrapper ul li>*,.rst-content section ol li>*,.rst-content section ul li>*{margin-top:12px;margin-bottom:12px}.rst-content .section ol li>:first-child,.rst-content .section ul li>:first-child,.rst-content .toctree-wrapper ol li>:first-child,.rst-content .toctree-wrapper ul li>:first-child,.rst-content section ol li>:first-child,.rst-content section ul li>:first-child{margin-top:0}.rst-content .section ol li>p,.rst-content .section ol li>p:last-child,.rst-content .section ul li>p,.rst-content .section ul li>p:last-child,.rst-content .toctree-wrapper ol li>p,.rst-content .toctree-wrapper ol li>p:last-child,.rst-content .toctree-wrapper ul li>p,.rst-content .toctree-wrapper ul li>p:last-child,.rst-content section ol li>p,.rst-content section ol li>p:last-child,.rst-content section ul li>p,.rst-content section ul li>p:last-child{margin-bottom:12px}.rst-content .section ol li>p:only-child,.rst-content .section ol li>p:only-child:last-child,.rst-content .section ul li>p:only-child,.rst-content .section ul li>p:only-child:last-child,.rst-content .toctree-wrapper ol li>p:only-child,.rst-content .toctree-wrapper ol li>p:only-child:last-child,.rst-content .toctree-wrapper ul li>p:only-child,.rst-content .toctree-wrapper ul li>p:only-child:last-child,.rst-content section ol li>p:only-child,.rst-content section ol li>p:only-child:last-child,.rst-content section ul li>p:only-child,.rst-content section ul li>p:only-child:last-child{margin-bottom:0}.rst-content .section ol li>ol,.rst-content .section ol li>ul,.rst-content .section ul li>ol,.rst-content .section ul li>ul,.rst-content .toctree-wrapper ol li>ol,.rst-content .toctree-wrapper ol li>ul,.rst-content .toctree-wrapper ul li>ol,.rst-content .toctree-wrapper ul li>ul,.rst-content section ol li>ol,.rst-content section ol li>ul,.rst-content section ul li>ol,.rst-content section ul li>ul{margin-bottom:12px}.rst-content .section ol.simple li>*,.rst-content .section ol.simple li ol,.rst-content .section ol.simple li ul,.rst-content .section ul.simple li>*,.rst-content .section ul.simple li ol,.rst-content .section ul.simple li ul,.rst-content .toctree-wrapper ol.simple li>*,.rst-content .toctree-wrapper ol.simple li ol,.rst-content .toctree-wrapper ol.simple li ul,.rst-content .toctree-wrapper ul.simple li>*,.rst-content .toctree-wrapper ul.simple li ol,.rst-content .toctree-wrapper ul.simple li ul,.rst-content section ol.simple li>*,.rst-content section ol.simple li ol,.rst-content section ol.simple li 
ul,.rst-content section ul.simple li>*,.rst-content section ul.simple li ol,.rst-content section ul.simple li ul{margin-top:0;margin-bottom:0}.rst-content .line-block{margin-left:0;margin-bottom:24px;line-height:24px}.rst-content .line-block .line-block{margin-left:24px;margin-bottom:0}.rst-content .topic-title{font-weight:700;margin-bottom:12px}.rst-content .toc-backref{color:#404040}.rst-content .align-right{float:right;margin:0 0 24px 24px}.rst-content .align-left{float:left;margin:0 24px 24px 0}.rst-content .align-center{margin:auto}.rst-content .align-center:not(table){display:block}.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink{opacity:0;font-size:14px;font-family:FontAwesome;margin-left:.5em}.rst-content .code-block-caption .headerlink:focus,.rst-content .code-block-caption:hover .headerlink,.rst-content .eqno .headerlink:focus,.rst-content .eqno:hover .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink:focus,.rst-content .toctree-wrapper>p.caption:hover .headerlink,.rst-content dl dt .headerlink:focus,.rst-content dl dt:hover .headerlink,.rst-content h1 .headerlink:focus,.rst-content h1:hover .headerlink,.rst-content h2 .headerlink:focus,.rst-content h2:hover .headerlink,.rst-content h3 .headerlink:focus,.rst-content h3:hover .headerlink,.rst-content h4 .headerlink:focus,.rst-content h4:hover .headerlink,.rst-content h5 .headerlink:focus,.rst-content h5:hover .headerlink,.rst-content h6 .headerlink:focus,.rst-content h6:hover .headerlink,.rst-content p.caption .headerlink:focus,.rst-content p.caption:hover .headerlink,.rst-content p .headerlink:focus,.rst-content p:hover .headerlink,.rst-content table>caption .headerlink:focus,.rst-content table>caption:hover .headerlink{opacity:1}.rst-content p a{overflow-wrap:anywhere}.rst-content .wy-table td p,.rst-content .wy-table td ul,.rst-content .wy-table th p,.rst-content .wy-table th ul,.rst-content table.docutils td p,.rst-content table.docutils td ul,.rst-content table.docutils th p,.rst-content table.docutils th ul,.rst-content table.field-list td p,.rst-content table.field-list td ul,.rst-content table.field-list th p,.rst-content table.field-list th ul{font-size:inherit}.rst-content .btn:focus{outline:2px solid}.rst-content table>caption .headerlink:after{font-size:12px}.rst-content .centered{text-align:center}.rst-content .sidebar{float:right;width:40%;display:block;margin:0 0 24px 24px;padding:24px;background:#f3f6f6;border:1px solid #e1e4e5}.rst-content .sidebar dl,.rst-content .sidebar p,.rst-content .sidebar ul{font-size:90%}.rst-content .sidebar .last,.rst-content .sidebar>:last-child{margin-bottom:0}.rst-content .sidebar .sidebar-title{display:block;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif;font-weight:700;background:#e1e4e5;padding:6px 12px;margin:-24px -24px 24px;font-size:100%}.rst-content .highlighted{background:#f1c40f;box-shadow:0 0 0 2px #f1c40f;display:inline;font-weight:700}.rst-content .citation-reference,.rst-content .footnote-reference{vertical-align:baseline;position:relative;top:-.4em;line-height:0;font-size:90%}.rst-content .citation-reference>span.fn-bracket,.rst-content 
.footnote-reference>span.fn-bracket{display:none}.rst-content .hlist{width:100%}.rst-content dl dt span.classifier:before{content:" : "}.rst-content dl dt span.classifier-delimiter{display:none!important}html.writer-html4 .rst-content table.docutils.citation,html.writer-html4 .rst-content table.docutils.footnote{background:none;border:none}html.writer-html4 .rst-content table.docutils.citation td,html.writer-html4 .rst-content table.docutils.citation tr,html.writer-html4 .rst-content table.docutils.footnote td,html.writer-html4 .rst-content table.docutils.footnote tr{border:none;background-color:transparent!important;white-space:normal}html.writer-html4 .rst-content table.docutils.citation td.label,html.writer-html4 .rst-content table.docutils.footnote td.label{padding-left:0;padding-right:0;vertical-align:top}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{display:grid;grid-template-columns:auto minmax(80%,95%)}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{display:inline-grid;grid-template-columns:max-content auto}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{display:grid;grid-template-columns:auto auto minmax(.65rem,auto) minmax(40%,95%)}html.writer-html5 .rst-content aside.citation>span.label,html.writer-html5 .rst-content aside.footnote>span.label,html.writer-html5 .rst-content div.citation>span.label{grid-column-start:1;grid-column-end:2}html.writer-html5 .rst-content aside.citation>span.backrefs,html.writer-html5 .rst-content aside.footnote>span.backrefs,html.writer-html5 .rst-content div.citation>span.backrefs{grid-column-start:2;grid-column-end:3;grid-row-start:1;grid-row-end:3}html.writer-html5 .rst-content aside.citation>p,html.writer-html5 .rst-content aside.footnote>p,html.writer-html5 .rst-content div.citation>p{grid-column-start:4;grid-column-end:5}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{margin-bottom:24px}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{padding-left:1rem}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dd,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dd,html.writer-html5 .rst-content dl.footnote>dt{margin-bottom:0}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{font-size:.9rem}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.footnote>dt{margin:0 .5rem .5rem 0;line-height:1.2rem;word-break:break-all;font-weight:400}html.writer-html5 .rst-content dl.citation>dt>span.brackets:before,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:before{content:"["}html.writer-html5 .rst-content dl.citation>dt>span.brackets:after,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:after{content:"]"}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a,html.writer-html5 
.rst-content dl.footnote>dt>span.fn-backref>a{word-break:keep-all}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a:not(:first-child):before,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.footnote>dd{margin:0 0 .5rem;line-height:1.2rem}html.writer-html5 .rst-content dl.citation>dd p,html.writer-html5 .rst-content dl.footnote>dd p{font-size:.9rem}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{padding-left:1rem;padding-right:1rem;font-size:.9rem;line-height:1.2rem}html.writer-html5 .rst-content aside.citation p,html.writer-html5 .rst-content aside.footnote p,html.writer-html5 .rst-content div.citation p{font-size:.9rem;line-height:1.2rem;margin-bottom:12px}html.writer-html5 .rst-content aside.citation span.backrefs,html.writer-html5 .rst-content aside.footnote span.backrefs,html.writer-html5 .rst-content div.citation span.backrefs{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content aside.citation span.backrefs>a,html.writer-html5 .rst-content aside.footnote span.backrefs>a,html.writer-html5 .rst-content div.citation span.backrefs>a{word-break:keep-all}html.writer-html5 .rst-content aside.citation span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content aside.footnote span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content div.citation span.backrefs>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content aside.citation span.label,html.writer-html5 .rst-content aside.footnote span.label,html.writer-html5 .rst-content div.citation span.label{line-height:1.2rem}html.writer-html5 .rst-content aside.citation-list,html.writer-html5 .rst-content aside.footnote-list,html.writer-html5 .rst-content div.citation-list{margin-bottom:24px}html.writer-html5 .rst-content dl.option-list kbd{font-size:.9rem}.rst-content table.docutils.footnote,html.writer-html4 .rst-content table.docutils.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content aside.footnote-list aside.footnote,html.writer-html5 .rst-content div.citation-list>div.citation,html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{color:grey}.rst-content table.docutils.footnote code,.rst-content table.docutils.footnote tt,html.writer-html4 .rst-content table.docutils.citation code,html.writer-html4 .rst-content table.docutils.citation tt,html.writer-html5 .rst-content aside.footnote-list aside.footnote code,html.writer-html5 .rst-content aside.footnote-list aside.footnote tt,html.writer-html5 .rst-content aside.footnote code,html.writer-html5 .rst-content aside.footnote tt,html.writer-html5 .rst-content div.citation-list>div.citation code,html.writer-html5 .rst-content div.citation-list>div.citation tt,html.writer-html5 .rst-content dl.citation code,html.writer-html5 .rst-content dl.citation tt,html.writer-html5 .rst-content dl.footnote code,html.writer-html5 .rst-content dl.footnote tt{color:#555}.rst-content .wy-table-responsive.citation,.rst-content .wy-table-responsive.footnote{margin-bottom:0}.rst-content .wy-table-responsive.citation+:not(.citation),.rst-content .wy-table-responsive.footnote+:not(.footnote){margin-top:24px}.rst-content .wy-table-responsive.citation:last-child,.rst-content 
.wy-table-responsive.footnote:last-child{margin-bottom:24px}.rst-content table.docutils th{border-color:#e1e4e5}html.writer-html5 .rst-content table.docutils th{border:1px solid #e1e4e5}html.writer-html5 .rst-content table.docutils td>p,html.writer-html5 .rst-content table.docutils th>p{line-height:1rem;margin-bottom:0;font-size:.9rem}.rst-content table.docutils td .last,.rst-content table.docutils td .last>:last-child{margin-bottom:0}.rst-content table.field-list,.rst-content table.field-list td{border:none}.rst-content table.field-list td p{line-height:inherit}.rst-content table.field-list td>strong{display:inline-block}.rst-content table.field-list .field-name{padding-right:10px;text-align:left;white-space:nowrap}.rst-content table.field-list .field-body{text-align:left}.rst-content code,.rst-content tt{color:#000;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;padding:2px 5px}.rst-content code big,.rst-content code em,.rst-content tt big,.rst-content tt em{font-size:100%!important;line-height:normal}.rst-content code.literal,.rst-content tt.literal{color:#e74c3c;white-space:normal}.rst-content code.xref,.rst-content tt.xref,a .rst-content code,a .rst-content tt{font-weight:700;color:#404040;overflow-wrap:normal}.rst-content kbd,.rst-content pre,.rst-content samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace}.rst-content a code,.rst-content a tt{color:#2980b9}.rst-content dl{margin-bottom:24px}.rst-content dl dt{font-weight:700;margin-bottom:12px}.rst-content dl ol,.rst-content dl p,.rst-content dl table,.rst-content dl ul{margin-bottom:12px}.rst-content dl dd{margin:0 0 12px 24px;line-height:24px}.rst-content dl dd>ol:last-child,.rst-content dl dd>p:last-child,.rst-content dl dd>table:last-child,.rst-content dl dd>ul:last-child{margin-bottom:0}html.writer-html4 .rst-content dl:not(.docutils),html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple){margin-bottom:24px}html.writer-html4 .rst-content dl:not(.docutils)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{display:table;margin:6px 0;font-size:90%;line-height:normal;background:#e7f2fa;color:#2980b9;border-top:3px solid #6ab0de;padding:6px;position:relative}html.writer-html4 .rst-content dl:not(.docutils)>dt:before,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:before{color:#6ab0de}html.writer-html4 .rst-content dl:not(.docutils)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{margin-bottom:6px;border:none;border-left:3px solid #ccc;background:#f0f0f0;color:#555}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink,html.writer-html5 .rst-content 
dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px;max-width:100%}html.writer-html4 .rst-content dl:not(.docutils) .k,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .k{font-style:italic}html.writer-html4 .rst-content dl:not(.docutils) .descclassname,html.writer-html4 .rst-content dl:not(.docutils) .descname,html.writer-html4 .rst-content dl:not(.docutils) .sig-name,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .sig-name{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#000}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content 
code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel,.rst-content .menuselection{font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content .guilabel,.rst-content .menuselection{border:1px solid #7fbbe3;background:#e7f2fa}.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>.kbd,.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>kbd{color:inherit;font-size:80%;background-color:#fff;border:1px solid #a6a6a6;border-radius:4px;box-shadow:0 2px grey;padding:2.4px 6px;margin:auto 0}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content .sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} \ No newline at end of file diff --git a/branch/bicounty_dev/_static/doctools.js b/branch/bicounty_dev/_static/doctools.js new file mode 100644 index 0000000..d06a71d --- /dev/null +++ b/branch/bicounty_dev/_static/doctools.js @@ -0,0 +1,156 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Base JavaScript utilities for all Sphinx HTML documentation. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +const BLACKLISTED_KEY_CONTROL_ELEMENTS = new Set([ + "TEXTAREA", + "INPUT", + "SELECT", + "BUTTON", +]); + +const _ready = (callback) => { + if (document.readyState !== "loading") { + callback(); + } else { + document.addEventListener("DOMContentLoaded", callback); + } +}; + +/** + * Small JavaScript module for the documentation. 
+ */ +const Documentation = { + init: () => { + Documentation.initDomainIndexTable(); + Documentation.initOnKeyListeners(); + }, + + /** + * i18n support + */ + TRANSLATIONS: {}, + PLURAL_EXPR: (n) => (n === 1 ? 0 : 1), + LOCALE: "unknown", + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext: (string) => { + const translated = Documentation.TRANSLATIONS[string]; + switch (typeof translated) { + case "undefined": + return string; // no translation + case "string": + return translated; // translation exists + default: + return translated[0]; // (singular, plural) translation tuple exists + } + }, + + ngettext: (singular, plural, n) => { + const translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated !== "undefined") + return translated[Documentation.PLURAL_EXPR(n)]; + return n === 1 ? singular : plural; + }, + + addTranslations: (catalog) => { + Object.assign(Documentation.TRANSLATIONS, catalog.messages); + Documentation.PLURAL_EXPR = new Function( + "n", + `return (${catalog.plural_expr})` + ); + Documentation.LOCALE = catalog.locale; + }, + + /** + * helper function to focus on search bar + */ + focusSearchBar: () => { + document.querySelectorAll("input[name=q]")[0]?.focus(); + }, + + /** + * Initialise the domain index toggle buttons + */ + initDomainIndexTable: () => { + const toggler = (el) => { + const idNumber = el.id.substr(7); + const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); + if (el.src.substr(-9) === "minus.png") { + el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; + toggledRows.forEach((el) => (el.style.display = "none")); + } else { + el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; + toggledRows.forEach((el) => (el.style.display = "")); + } + }; + + const togglerElements = document.querySelectorAll("img.toggler"); + togglerElements.forEach((el) => + el.addEventListener("click", (event) => toggler(event.currentTarget)) + ); + togglerElements.forEach((el) => (el.style.display = "")); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); + }, + + initOnKeyListeners: () => { + // only install a listener if it is really needed + if ( + !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && + !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS + ) + return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.altKey || event.ctrlKey || event.metaKey) return; + + if (!event.shiftKey) { + switch (event.key) { + case "ArrowLeft": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const prevLink = document.querySelector('link[rel="prev"]'); + if (prevLink && prevLink.href) { + window.location.href = prevLink.href; + event.preventDefault(); + } + break; + case "ArrowRight": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const nextLink = document.querySelector('link[rel="next"]'); + if (nextLink && nextLink.href) { + window.location.href = nextLink.href; + event.preventDefault(); + } + break; + } + } + + // some keyboard layouts may need Shift to get / + switch (event.key) { + case "/": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.focusSearchBar(); + event.preventDefault(); + } + }); + }, +}; + +// quick alias for translations +const _ = Documentation.gettext; + +_ready(Documentation.init); diff --git 
a/branch/bicounty_dev/_static/documentation_options.js b/branch/bicounty_dev/_static/documentation_options.js new file mode 100644 index 0000000..c066c69 --- /dev/null +++ b/branch/bicounty_dev/_static/documentation_options.js @@ -0,0 +1,14 @@ +var DOCUMENTATION_OPTIONS = { + URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'), + VERSION: '', + LANGUAGE: 'en', + COLLAPSE_INDEX: false, + BUILDER: 'dirhtml', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false, + SHOW_SEARCH_SUMMARY: true, + ENABLE_SEARCH_SHORTCUTS: true, +}; \ No newline at end of file diff --git a/branch/bicounty_dev/_static/file.png b/branch/bicounty_dev/_static/file.png new file mode 100644 index 0000000..a858a41 Binary files /dev/null and b/branch/bicounty_dev/_static/file.png differ diff --git a/branch/bicounty_dev/_static/graphviz.css b/branch/bicounty_dev/_static/graphviz.css new file mode 100644 index 0000000..8d81c02 --- /dev/null +++ b/branch/bicounty_dev/_static/graphviz.css @@ -0,0 +1,19 @@ +/* + * graphviz.css + * ~~~~~~~~~~~~ + * + * Sphinx stylesheet -- graphviz extension. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +img.graphviz { + border: 0; + max-width: 100%; +} + +object.graphviz { + max-width: 100%; +} diff --git a/branch/bicounty_dev/_static/jquery.js b/branch/bicounty_dev/_static/jquery.js new file mode 100644 index 0000000..c4c6022 --- /dev/null +++ b/branch/bicounty_dev/_static/jquery.js @@ -0,0 +1,2 @@ +/*! jQuery v3.6.0 | (c) OpenJS Foundation and other contributors | jquery.org/license */ +!function(e,t){"use strict";"object"==typeof module&&"object"==typeof module.exports?module.exports=e.document?t(e,!0):function(e){if(!e.document)throw new Error("jQuery requires a window with a document");return t(e)}:t(e)}("undefined"!=typeof window?window:this,function(C,e){"use strict";var t=[],r=Object.getPrototypeOf,s=t.slice,g=t.flat?function(e){return t.flat.call(e)}:function(e){return t.concat.apply([],e)},u=t.push,i=t.indexOf,n={},o=n.toString,v=n.hasOwnProperty,a=v.toString,l=a.call(Object),y={},m=function(e){return"function"==typeof e&&"number"!=typeof e.nodeType&&"function"!=typeof e.item},x=function(e){return null!=e&&e===e.window},E=C.document,c={type:!0,src:!0,nonce:!0,noModule:!0};function b(e,t,n){var r,i,o=(n=n||E).createElement("script");if(o.text=e,t)for(r in c)(i=t[r]||t.getAttribute&&t.getAttribute(r))&&o.setAttribute(r,i);n.head.appendChild(o).parentNode.removeChild(o)}function w(e){return null==e?e+"":"object"==typeof e||"function"==typeof e?n[o.call(e)]||"object":typeof e}var f="3.6.0",S=function(e,t){return new S.fn.init(e,t)};function p(e){var t=!!e&&"length"in e&&e.length,n=w(e);return!m(e)&&!x(e)&&("array"===n||0===t||"number"==typeof t&&0+~]|"+M+")"+M+"*"),U=new RegExp(M+"|>"),X=new RegExp(F),V=new RegExp("^"+I+"$"),G={ID:new RegExp("^#("+I+")"),CLASS:new RegExp("^\\.("+I+")"),TAG:new RegExp("^("+I+"|[*])"),ATTR:new RegExp("^"+W),PSEUDO:new RegExp("^"+F),CHILD:new RegExp("^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\("+M+"*(even|odd|(([+-]|)(\\d*)n|)"+M+"*(?:([+-]|)"+M+"*(\\d+)|))"+M+"*\\)|)","i"),bool:new RegExp("^(?:"+R+")$","i"),needsContext:new RegExp("^"+M+"*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\("+M+"*((?:-\\d)?\\d*)"+M+"*\\)|)(?=[^-]|$)","i")},Y=/HTML$/i,Q=/^(?:input|select|textarea|button)$/i,J=/^h\d$/i,K=/^[^{]+\{\s*\[native 
\w/,Z=/^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/,ee=/[+~]/,te=new RegExp("\\\\[\\da-fA-F]{1,6}"+M+"?|\\\\([^\\r\\n\\f])","g"),ne=function(e,t){var n="0x"+e.slice(1)-65536;return t||(n<0?String.fromCharCode(n+65536):String.fromCharCode(n>>10|55296,1023&n|56320))},re=/([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g,ie=function(e,t){return t?"\0"===e?"\ufffd":e.slice(0,-1)+"\\"+e.charCodeAt(e.length-1).toString(16)+" ":"\\"+e},oe=function(){T()},ae=be(function(e){return!0===e.disabled&&"fieldset"===e.nodeName.toLowerCase()},{dir:"parentNode",next:"legend"});try{H.apply(t=O.call(p.childNodes),p.childNodes),t[p.childNodes.length].nodeType}catch(e){H={apply:t.length?function(e,t){L.apply(e,O.call(t))}:function(e,t){var n=e.length,r=0;while(e[n++]=t[r++]);e.length=n-1}}}function se(t,e,n,r){var i,o,a,s,u,l,c,f=e&&e.ownerDocument,p=e?e.nodeType:9;if(n=n||[],"string"!=typeof t||!t||1!==p&&9!==p&&11!==p)return n;if(!r&&(T(e),e=e||C,E)){if(11!==p&&(u=Z.exec(t)))if(i=u[1]){if(9===p){if(!(a=e.getElementById(i)))return n;if(a.id===i)return n.push(a),n}else if(f&&(a=f.getElementById(i))&&y(e,a)&&a.id===i)return n.push(a),n}else{if(u[2])return H.apply(n,e.getElementsByTagName(t)),n;if((i=u[3])&&d.getElementsByClassName&&e.getElementsByClassName)return H.apply(n,e.getElementsByClassName(i)),n}if(d.qsa&&!N[t+" "]&&(!v||!v.test(t))&&(1!==p||"object"!==e.nodeName.toLowerCase())){if(c=t,f=e,1===p&&(U.test(t)||z.test(t))){(f=ee.test(t)&&ye(e.parentNode)||e)===e&&d.scope||((s=e.getAttribute("id"))?s=s.replace(re,ie):e.setAttribute("id",s=S)),o=(l=h(t)).length;while(o--)l[o]=(s?"#"+s:":scope")+" "+xe(l[o]);c=l.join(",")}try{return H.apply(n,f.querySelectorAll(c)),n}catch(e){N(t,!0)}finally{s===S&&e.removeAttribute("id")}}}return g(t.replace($,"$1"),e,n,r)}function ue(){var r=[];return function e(t,n){return r.push(t+" ")>b.cacheLength&&delete e[r.shift()],e[t+" "]=n}}function le(e){return e[S]=!0,e}function ce(e){var t=C.createElement("fieldset");try{return!!e(t)}catch(e){return!1}finally{t.parentNode&&t.parentNode.removeChild(t),t=null}}function fe(e,t){var n=e.split("|"),r=n.length;while(r--)b.attrHandle[n[r]]=t}function pe(e,t){var n=t&&e,r=n&&1===e.nodeType&&1===t.nodeType&&e.sourceIndex-t.sourceIndex;if(r)return r;if(n)while(n=n.nextSibling)if(n===t)return-1;return e?1:-1}function de(t){return function(e){return"input"===e.nodeName.toLowerCase()&&e.type===t}}function he(n){return function(e){var t=e.nodeName.toLowerCase();return("input"===t||"button"===t)&&e.type===n}}function ge(t){return function(e){return"form"in e?e.parentNode&&!1===e.disabled?"label"in e?"label"in e.parentNode?e.parentNode.disabled===t:e.disabled===t:e.isDisabled===t||e.isDisabled!==!t&&ae(e)===t:e.disabled===t:"label"in e&&e.disabled===t}}function ve(a){return le(function(o){return o=+o,le(function(e,t){var n,r=a([],e.length,o),i=r.length;while(i--)e[n=r[i]]&&(e[n]=!(t[n]=e[n]))})})}function ye(e){return e&&"undefined"!=typeof e.getElementsByTagName&&e}for(e in d=se.support={},i=se.isXML=function(e){var t=e&&e.namespaceURI,n=e&&(e.ownerDocument||e).documentElement;return!Y.test(t||n&&n.nodeName||"HTML")},T=se.setDocument=function(e){var t,n,r=e?e.ownerDocument||e:p;return r!=C&&9===r.nodeType&&r.documentElement&&(a=(C=r).documentElement,E=!i(C),p!=C&&(n=C.defaultView)&&n.top!==n&&(n.addEventListener?n.addEventListener("unload",oe,!1):n.attachEvent&&n.attachEvent("onunload",oe)),d.scope=ce(function(e){return a.appendChild(e).appendChild(C.createElement("div")),"undefined"!=typeof e.querySelectorAll&&!e.querySelectorAll(":scope fieldset 
div").length}),d.attributes=ce(function(e){return e.className="i",!e.getAttribute("className")}),d.getElementsByTagName=ce(function(e){return e.appendChild(C.createComment("")),!e.getElementsByTagName("*").length}),d.getElementsByClassName=K.test(C.getElementsByClassName),d.getById=ce(function(e){return a.appendChild(e).id=S,!C.getElementsByName||!C.getElementsByName(S).length}),d.getById?(b.filter.ID=function(e){var t=e.replace(te,ne);return function(e){return e.getAttribute("id")===t}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n=t.getElementById(e);return n?[n]:[]}}):(b.filter.ID=function(e){var n=e.replace(te,ne);return function(e){var t="undefined"!=typeof e.getAttributeNode&&e.getAttributeNode("id");return t&&t.value===n}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n,r,i,o=t.getElementById(e);if(o){if((n=o.getAttributeNode("id"))&&n.value===e)return[o];i=t.getElementsByName(e),r=0;while(o=i[r++])if((n=o.getAttributeNode("id"))&&n.value===e)return[o]}return[]}}),b.find.TAG=d.getElementsByTagName?function(e,t){return"undefined"!=typeof t.getElementsByTagName?t.getElementsByTagName(e):d.qsa?t.querySelectorAll(e):void 0}:function(e,t){var n,r=[],i=0,o=t.getElementsByTagName(e);if("*"===e){while(n=o[i++])1===n.nodeType&&r.push(n);return r}return o},b.find.CLASS=d.getElementsByClassName&&function(e,t){if("undefined"!=typeof t.getElementsByClassName&&E)return t.getElementsByClassName(e)},s=[],v=[],(d.qsa=K.test(C.querySelectorAll))&&(ce(function(e){var t;a.appendChild(e).innerHTML="",e.querySelectorAll("[msallowcapture^='']").length&&v.push("[*^$]="+M+"*(?:''|\"\")"),e.querySelectorAll("[selected]").length||v.push("\\["+M+"*(?:value|"+R+")"),e.querySelectorAll("[id~="+S+"-]").length||v.push("~="),(t=C.createElement("input")).setAttribute("name",""),e.appendChild(t),e.querySelectorAll("[name='']").length||v.push("\\["+M+"*name"+M+"*="+M+"*(?:''|\"\")"),e.querySelectorAll(":checked").length||v.push(":checked"),e.querySelectorAll("a#"+S+"+*").length||v.push(".#.+[+~]"),e.querySelectorAll("\\\f"),v.push("[\\r\\n\\f]")}),ce(function(e){e.innerHTML="";var t=C.createElement("input");t.setAttribute("type","hidden"),e.appendChild(t).setAttribute("name","D"),e.querySelectorAll("[name=d]").length&&v.push("name"+M+"*[*^$|!~]?="),2!==e.querySelectorAll(":enabled").length&&v.push(":enabled",":disabled"),a.appendChild(e).disabled=!0,2!==e.querySelectorAll(":disabled").length&&v.push(":enabled",":disabled"),e.querySelectorAll("*,:x"),v.push(",.*:")})),(d.matchesSelector=K.test(c=a.matches||a.webkitMatchesSelector||a.mozMatchesSelector||a.oMatchesSelector||a.msMatchesSelector))&&ce(function(e){d.disconnectedMatch=c.call(e,"*"),c.call(e,"[s!='']:x"),s.push("!=",F)}),v=v.length&&new RegExp(v.join("|")),s=s.length&&new RegExp(s.join("|")),t=K.test(a.compareDocumentPosition),y=t||K.test(a.contains)?function(e,t){var n=9===e.nodeType?e.documentElement:e,r=t&&t.parentNode;return e===r||!(!r||1!==r.nodeType||!(n.contains?n.contains(r):e.compareDocumentPosition&&16&e.compareDocumentPosition(r)))}:function(e,t){if(t)while(t=t.parentNode)if(t===e)return!0;return!1},j=t?function(e,t){if(e===t)return l=!0,0;var n=!e.compareDocumentPosition-!t.compareDocumentPosition;return n||(1&(n=(e.ownerDocument||e)==(t.ownerDocument||t)?e.compareDocumentPosition(t):1)||!d.sortDetached&&t.compareDocumentPosition(e)===n?e==C||e.ownerDocument==p&&y(p,e)?-1:t==C||t.ownerDocument==p&&y(p,t)?1:u?P(u,e)-P(u,t):0:4&n?-1:1)}:function(e,t){if(e===t)return l=!0,0;var 
n,r=0,i=e.parentNode,o=t.parentNode,a=[e],s=[t];if(!i||!o)return e==C?-1:t==C?1:i?-1:o?1:u?P(u,e)-P(u,t):0;if(i===o)return pe(e,t);n=e;while(n=n.parentNode)a.unshift(n);n=t;while(n=n.parentNode)s.unshift(n);while(a[r]===s[r])r++;return r?pe(a[r],s[r]):a[r]==p?-1:s[r]==p?1:0}),C},se.matches=function(e,t){return se(e,null,null,t)},se.matchesSelector=function(e,t){if(T(e),d.matchesSelector&&E&&!N[t+" "]&&(!s||!s.test(t))&&(!v||!v.test(t)))try{var n=c.call(e,t);if(n||d.disconnectedMatch||e.document&&11!==e.document.nodeType)return n}catch(e){N(t,!0)}return 0":{dir:"parentNode",first:!0}," ":{dir:"parentNode"},"+":{dir:"previousSibling",first:!0},"~":{dir:"previousSibling"}},preFilter:{ATTR:function(e){return e[1]=e[1].replace(te,ne),e[3]=(e[3]||e[4]||e[5]||"").replace(te,ne),"~="===e[2]&&(e[3]=" "+e[3]+" "),e.slice(0,4)},CHILD:function(e){return e[1]=e[1].toLowerCase(),"nth"===e[1].slice(0,3)?(e[3]||se.error(e[0]),e[4]=+(e[4]?e[5]+(e[6]||1):2*("even"===e[3]||"odd"===e[3])),e[5]=+(e[7]+e[8]||"odd"===e[3])):e[3]&&se.error(e[0]),e},PSEUDO:function(e){var t,n=!e[6]&&e[2];return G.CHILD.test(e[0])?null:(e[3]?e[2]=e[4]||e[5]||"":n&&X.test(n)&&(t=h(n,!0))&&(t=n.indexOf(")",n.length-t)-n.length)&&(e[0]=e[0].slice(0,t),e[2]=n.slice(0,t)),e.slice(0,3))}},filter:{TAG:function(e){var t=e.replace(te,ne).toLowerCase();return"*"===e?function(){return!0}:function(e){return e.nodeName&&e.nodeName.toLowerCase()===t}},CLASS:function(e){var t=m[e+" "];return t||(t=new RegExp("(^|"+M+")"+e+"("+M+"|$)"))&&m(e,function(e){return t.test("string"==typeof e.className&&e.className||"undefined"!=typeof e.getAttribute&&e.getAttribute("class")||"")})},ATTR:function(n,r,i){return function(e){var t=se.attr(e,n);return null==t?"!="===r:!r||(t+="","="===r?t===i:"!="===r?t!==i:"^="===r?i&&0===t.indexOf(i):"*="===r?i&&-1:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i;function j(e,n,r){return m(n)?S.grep(e,function(e,t){return!!n.call(e,t,e)!==r}):n.nodeType?S.grep(e,function(e){return e===n!==r}):"string"!=typeof n?S.grep(e,function(e){return-1)[^>]*|#([\w-]+))$/;(S.fn.init=function(e,t,n){var r,i;if(!e)return this;if(n=n||D,"string"==typeof e){if(!(r="<"===e[0]&&">"===e[e.length-1]&&3<=e.length?[null,e,null]:q.exec(e))||!r[1]&&t)return!t||t.jquery?(t||n).find(e):this.constructor(t).find(e);if(r[1]){if(t=t instanceof S?t[0]:t,S.merge(this,S.parseHTML(r[1],t&&t.nodeType?t.ownerDocument||t:E,!0)),N.test(r[1])&&S.isPlainObject(t))for(r in t)m(this[r])?this[r](t[r]):this.attr(r,t[r]);return this}return(i=E.getElementById(r[2]))&&(this[0]=i,this.length=1),this}return e.nodeType?(this[0]=e,this.length=1,this):m(e)?void 0!==n.ready?n.ready(e):e(S):S.makeArray(e,this)}).prototype=S.fn,D=S(E);var L=/^(?:parents|prev(?:Until|All))/,H={children:!0,contents:!0,next:!0,prev:!0};function O(e,t){while((e=e[t])&&1!==e.nodeType);return e}S.fn.extend({has:function(e){var t=S(e,this),n=t.length;return this.filter(function(){for(var e=0;e\x20\t\r\n\f]*)/i,he=/^$|^module$|\/(?:java|ecma)script/i;ce=E.createDocumentFragment().appendChild(E.createElement("div")),(fe=E.createElement("input")).setAttribute("type","radio"),fe.setAttribute("checked","checked"),fe.setAttribute("name","t"),ce.appendChild(fe),y.checkClone=ce.cloneNode(!0).cloneNode(!0).lastChild.checked,ce.innerHTML="",y.noCloneChecked=!!ce.cloneNode(!0).lastChild.defaultValue,ce.innerHTML="",y.option=!!ce.lastChild;var ge={thead:[1,"","
"],col:[2,"","
"],tr:[2,"","
"],td:[3,"","
"],_default:[0,"",""]};function ve(e,t){var n;return n="undefined"!=typeof e.getElementsByTagName?e.getElementsByTagName(t||"*"):"undefined"!=typeof e.querySelectorAll?e.querySelectorAll(t||"*"):[],void 0===t||t&&A(e,t)?S.merge([e],n):n}function ye(e,t){for(var n=0,r=e.length;n",""]);var me=/<|&#?\w+;/;function xe(e,t,n,r,i){for(var o,a,s,u,l,c,f=t.createDocumentFragment(),p=[],d=0,h=e.length;d\s*$/g;function je(e,t){return A(e,"table")&&A(11!==t.nodeType?t:t.firstChild,"tr")&&S(e).children("tbody")[0]||e}function De(e){return e.type=(null!==e.getAttribute("type"))+"/"+e.type,e}function qe(e){return"true/"===(e.type||"").slice(0,5)?e.type=e.type.slice(5):e.removeAttribute("type"),e}function Le(e,t){var n,r,i,o,a,s;if(1===t.nodeType){if(Y.hasData(e)&&(s=Y.get(e).events))for(i in Y.remove(t,"handle events"),s)for(n=0,r=s[i].length;n").attr(n.scriptAttrs||{}).prop({charset:n.scriptCharset,src:n.url}).on("load error",i=function(e){r.remove(),i=null,e&&t("error"===e.type?404:200,e.type)}),E.head.appendChild(r[0])},abort:function(){i&&i()}}});var _t,zt=[],Ut=/(=)\?(?=&|$)|\?\?/;S.ajaxSetup({jsonp:"callback",jsonpCallback:function(){var e=zt.pop()||S.expando+"_"+wt.guid++;return this[e]=!0,e}}),S.ajaxPrefilter("json jsonp",function(e,t,n){var r,i,o,a=!1!==e.jsonp&&(Ut.test(e.url)?"url":"string"==typeof e.data&&0===(e.contentType||"").indexOf("application/x-www-form-urlencoded")&&Ut.test(e.data)&&"data");if(a||"jsonp"===e.dataTypes[0])return r=e.jsonpCallback=m(e.jsonpCallback)?e.jsonpCallback():e.jsonpCallback,a?e[a]=e[a].replace(Ut,"$1"+r):!1!==e.jsonp&&(e.url+=(Tt.test(e.url)?"&":"?")+e.jsonp+"="+r),e.converters["script json"]=function(){return o||S.error(r+" was not called"),o[0]},e.dataTypes[0]="json",i=C[r],C[r]=function(){o=arguments},n.always(function(){void 0===i?S(C).removeProp(r):C[r]=i,e[r]&&(e.jsonpCallback=t.jsonpCallback,zt.push(r)),o&&m(i)&&i(o[0]),o=i=void 0}),"script"}),y.createHTMLDocument=((_t=E.implementation.createHTMLDocument("").body).innerHTML="
",2===_t.childNodes.length),S.parseHTML=function(e,t,n){return"string"!=typeof e?[]:("boolean"==typeof t&&(n=t,t=!1),t||(y.createHTMLDocument?((r=(t=E.implementation.createHTMLDocument("")).createElement("base")).href=E.location.href,t.head.appendChild(r)):t=E),o=!n&&[],(i=N.exec(e))?[t.createElement(i[1])]:(i=xe([e],t,o),o&&o.length&&S(o).remove(),S.merge([],i.childNodes)));var r,i,o},S.fn.load=function(e,t,n){var r,i,o,a=this,s=e.indexOf(" ");return-1").append(S.parseHTML(e)).find(r):e)}).always(n&&function(e,t){a.each(function(){n.apply(this,o||[e.responseText,t,e])})}),this},S.expr.pseudos.animated=function(t){return S.grep(S.timers,function(e){return t===e.elem}).length},S.offset={setOffset:function(e,t,n){var r,i,o,a,s,u,l=S.css(e,"position"),c=S(e),f={};"static"===l&&(e.style.position="relative"),s=c.offset(),o=S.css(e,"top"),u=S.css(e,"left"),("absolute"===l||"fixed"===l)&&-1<(o+u).indexOf("auto")?(a=(r=c.position()).top,i=r.left):(a=parseFloat(o)||0,i=parseFloat(u)||0),m(t)&&(t=t.call(e,n,S.extend({},s))),null!=t.top&&(f.top=t.top-s.top+a),null!=t.left&&(f.left=t.left-s.left+i),"using"in t?t.using.call(e,f):c.css(f)}},S.fn.extend({offset:function(t){if(arguments.length)return void 0===t?this:this.each(function(e){S.offset.setOffset(this,t,e)});var e,n,r=this[0];return r?r.getClientRects().length?(e=r.getBoundingClientRect(),n=r.ownerDocument.defaultView,{top:e.top+n.pageYOffset,left:e.left+n.pageXOffset}):{top:0,left:0}:void 0},position:function(){if(this[0]){var e,t,n,r=this[0],i={top:0,left:0};if("fixed"===S.css(r,"position"))t=r.getBoundingClientRect();else{t=this.offset(),n=r.ownerDocument,e=r.offsetParent||n.documentElement;while(e&&(e===n.body||e===n.documentElement)&&"static"===S.css(e,"position"))e=e.parentNode;e&&e!==r&&1===e.nodeType&&((i=S(e).offset()).top+=S.css(e,"borderTopWidth",!0),i.left+=S.css(e,"borderLeftWidth",!0))}return{top:t.top-i.top-S.css(r,"marginTop",!0),left:t.left-i.left-S.css(r,"marginLeft",!0)}}},offsetParent:function(){return this.map(function(){var e=this.offsetParent;while(e&&"static"===S.css(e,"position"))e=e.offsetParent;return e||re})}}),S.each({scrollLeft:"pageXOffset",scrollTop:"pageYOffset"},function(t,i){var o="pageYOffset"===i;S.fn[t]=function(e){return $(this,function(e,t,n){var r;if(x(e)?r=e:9===e.nodeType&&(r=e.defaultView),void 0===n)return r?r[i]:e[t];r?r.scrollTo(o?r.pageXOffset:n,o?n:r.pageYOffset):e[t]=n},t,e,arguments.length)}}),S.each(["top","left"],function(e,n){S.cssHooks[n]=Fe(y.pixelPosition,function(e,t){if(t)return t=We(e,n),Pe.test(t)?S(e).position()[n]+"px":t})}),S.each({Height:"height",Width:"width"},function(a,s){S.each({padding:"inner"+a,content:s,"":"outer"+a},function(r,o){S.fn[o]=function(e,t){var n=arguments.length&&(r||"boolean"!=typeof e),i=r||(!0===e||!0===t?"margin":"border");return $(this,function(e,t,n){var r;return x(e)?0===o.indexOf("outer")?e["inner"+a]:e.document.documentElement["client"+a]:9===e.nodeType?(r=e.documentElement,Math.max(e.body["scroll"+a],r["scroll"+a],e.body["offset"+a],r["offset"+a],r["client"+a])):void 0===n?S.css(e,t,i):S.style(e,t,n,i)},s,n?e:void 0,n)}})}),S.each(["ajaxStart","ajaxStop","ajaxComplete","ajaxError","ajaxSuccess","ajaxSend"],function(e,t){S.fn[t]=function(e){return this.on(t,e)}}),S.fn.extend({bind:function(e,t,n){return this.on(e,null,t,n)},unbind:function(e,t){return this.off(e,null,t)},delegate:function(e,t,n,r){return this.on(t,e,n,r)},undelegate:function(e,t,n){return 1===arguments.length?this.off(e,"**"):this.off(t,e||"**",n)},hover:function(e,t){return 
this.mouseenter(e).mouseleave(t||e)}}),S.each("blur focus focusin focusout resize scroll click dblclick mousedown mouseup mousemove mouseover mouseout mouseenter mouseleave change select submit keydown keypress keyup contextmenu".split(" "),function(e,n){S.fn[n]=function(e,t){return 0",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=y.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=y.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),y.elements=c+" "+a,j(b)}function f(a){var b=x[a[v]];return b||(b={},w++,a[v]=w,x[w]=b),b}function g(a,c,d){if(c||(c=b),q)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():u.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||t.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),q)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return y.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(y,b.frag)}function j(a){a||(a=b);var d=f(a);return!y.shivCSS||p||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),q||i(a,d),a}function k(a){for(var b,c=a.getElementsByTagName("*"),e=c.length,f=RegExp("^(?:"+d().join("|")+")$","i"),g=[];e--;)b=c[e],f.test(b.nodeName)&&g.push(b.applyElement(l(b)));return g}function l(a){for(var b,c=a.attributes,d=c.length,e=a.ownerDocument.createElement(A+":"+a.nodeName);d--;)b=c[d],b.specified&&e.setAttribute(b.nodeName,b.nodeValue);return e.style.cssText=a.style.cssText,e}function m(a){for(var b,c=a.split("{"),e=c.length,f=RegExp("(^|[\\s,>+~])("+d().join("|")+")(?=[[\\s,>+~#.:]|$)","gi"),g="$1"+A+"\\:$2";e--;)b=c[e]=c[e].split("}"),b[b.length-1]=b[b.length-1].replace(f,g),c[e]=b.join("}");return c.join("{")}function n(a){for(var b=a.length;b--;)a[b].removeNode()}function o(a){function b(){clearTimeout(g._removeSheetTimer),d&&d.removeNode(!0),d=null}var d,e,g=f(a),h=a.namespaces,i=a.parentWindow;return!B||a.printShived?a:("undefined"==typeof h[A]&&h.add(A),i.attachEvent("onbeforeprint",function(){b();for(var f,g,h,i=a.styleSheets,j=[],l=i.length,n=Array(l);l--;)n[l]=i[l];for(;h=n.pop();)if(!h.disabled&&z.test(h.media)){try{f=h.imports,g=f.length}catch(o){g=0}for(l=0;g>l;l++)n.push(f[l]);try{j.push(h.cssText)}catch(o){}}j=m(j.reverse().join("")),e=k(a),d=c(a,j)}),i.attachEvent("onafterprint",function(){n(e),clearTimeout(g._removeSheetTimer),g._removeSheetTimer=setTimeout(b,500)}),a.printShived=!0,a)}var p,q,r="3.7.3",s=a.html5||{},t=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,u=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,v="_html5shiv",w=0,x={};!function(){try{var a=b.createElement("a");a.innerHTML="",p="hidden"in a,q=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof 
a.createElement}()}catch(c){p=!0,q=!0}}();var y={elements:s.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time video",version:r,shivCSS:s.shivCSS!==!1,supportsUnknownElements:q,shivMethods:s.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=y,j(b);var z=/^$|\b(?:all|print)\b/,A="html5shiv",B=!q&&function(){var c=b.documentElement;return!("undefined"==typeof b.namespaces||"undefined"==typeof b.parentWindow||"undefined"==typeof c.applyElement||"undefined"==typeof c.removeNode||"undefined"==typeof a.attachEvent)}();y.type+=" print",y.shivPrint=o,o(b),"object"==typeof module&&module.exports&&(module.exports=y)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/bicounty_dev/_static/js/html5shiv.min.js b/branch/bicounty_dev/_static/js/html5shiv.min.js new file mode 100644 index 0000000..cd1c674 --- /dev/null +++ b/branch/bicounty_dev/_static/js/html5shiv.min.js @@ -0,0 +1,4 @@ +/** +* @preserve HTML5 Shiv 3.7.3 | @afarkas @jdalton @jon_neal @rem | MIT/GPL2 Licensed +*/ +!function(a,b){function c(a,b){var c=a.createElement("p"),d=a.getElementsByTagName("head")[0]||a.documentElement;return c.innerHTML="x",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=t.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=t.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),t.elements=c+" "+a,j(b)}function f(a){var b=s[a[q]];return b||(b={},r++,a[q]=r,s[r]=b),b}function g(a,c,d){if(c||(c=b),l)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():p.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||o.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),l)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return t.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(t,b.frag)}function j(a){a||(a=b);var d=f(a);return!t.shivCSS||k||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),l||i(a,d),a}var k,l,m="3.7.3-pre",n=a.html5||{},o=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,p=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,q="_html5shiv",r=0,s={};!function(){try{var a=b.createElement("a");a.innerHTML="",k="hidden"in a,l=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof a.createElement}()}catch(c){k=!0,l=!0}}();var t={elements:n.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time 
video",version:m,shivCSS:n.shivCSS!==!1,supportsUnknownElements:l,shivMethods:n.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=t,j(b),"object"==typeof module&&module.exports&&(module.exports=t)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/bicounty_dev/_static/js/theme.js b/branch/bicounty_dev/_static/js/theme.js new file mode 100644 index 0000000..1fddb6e --- /dev/null +++ b/branch/bicounty_dev/_static/js/theme.js @@ -0,0 +1 @@ +!function(n){var e={};function t(i){if(e[i])return e[i].exports;var o=e[i]={i:i,l:!1,exports:{}};return n[i].call(o.exports,o,o.exports,t),o.l=!0,o.exports}t.m=n,t.c=e,t.d=function(n,e,i){t.o(n,e)||Object.defineProperty(n,e,{enumerable:!0,get:i})},t.r=function(n){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(n,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(n,"__esModule",{value:!0})},t.t=function(n,e){if(1&e&&(n=t(n)),8&e)return n;if(4&e&&"object"==typeof n&&n&&n.__esModule)return n;var i=Object.create(null);if(t.r(i),Object.defineProperty(i,"default",{enumerable:!0,value:n}),2&e&&"string"!=typeof n)for(var o in n)t.d(i,o,function(e){return n[e]}.bind(null,o));return i},t.n=function(n){var e=n&&n.__esModule?function(){return n.default}:function(){return n};return t.d(e,"a",e),e},t.o=function(n,e){return Object.prototype.hasOwnProperty.call(n,e)},t.p="",t(t.s=0)}([function(n,e,t){t(1),n.exports=t(3)},function(n,e,t){(function(){var e="undefined"!=typeof window?window.jQuery:t(2);n.exports.ThemeNav={navBar:null,win:null,winScroll:!1,winResize:!1,linkScroll:!1,winPosition:0,winHeight:null,docHeight:null,isRunning:!1,enable:function(n){var t=this;void 0===n&&(n=!0),t.isRunning||(t.isRunning=!0,e((function(e){t.init(e),t.reset(),t.win.on("hashchange",t.reset),n&&t.win.on("scroll",(function(){t.linkScroll||t.winScroll||(t.winScroll=!0,requestAnimationFrame((function(){t.onScroll()})))})),t.win.on("resize",(function(){t.winResize||(t.winResize=!0,requestAnimationFrame((function(){t.onResize()})))})),t.onResize()})))},enableSticky:function(){this.enable(!0)},init:function(n){n(document);var e=this;this.navBar=n("div.wy-side-scroll:first"),this.win=n(window),n(document).on("click","[data-toggle='wy-nav-top']",(function(){n("[data-toggle='wy-nav-shift']").toggleClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift")})).on("click",".wy-menu-vertical .current ul li a",(function(){var t=n(this);n("[data-toggle='wy-nav-shift']").removeClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift"),e.toggleCurrent(t),e.hashChange()})).on("click","[data-toggle='rst-current-version']",(function(){n("[data-toggle='rst-versions']").toggleClass("shift-up")})),n("table.docutils:not(.field-list,.footnote,.citation)").wrap("
"),n("table.docutils.footnote").wrap("
"),n("table.docutils.citation").wrap("
"),n(".wy-menu-vertical ul").not(".simple").siblings("a").each((function(){var t=n(this);expand=n(''),expand.on("click",(function(n){return e.toggleCurrent(t),n.stopPropagation(),!1})),t.prepend(expand)}))},reset:function(){var n=encodeURI(window.location.hash)||"#";try{var e=$(".wy-menu-vertical"),t=e.find('[href="'+n+'"]');if(0===t.length){var i=$('.document [id="'+n.substring(1)+'"]').closest("div.section");0===(t=e.find('[href="#'+i.attr("id")+'"]')).length&&(t=e.find('[href="#"]'))}if(t.length>0){$(".wy-menu-vertical .current").removeClass("current").attr("aria-expanded","false"),t.addClass("current").attr("aria-expanded","true"),t.closest("li.toctree-l1").parent().addClass("current").attr("aria-expanded","true");for(let n=1;n<=10;n++)t.closest("li.toctree-l"+n).addClass("current").attr("aria-expanded","true");t[0].scrollIntoView()}}catch(n){console.log("Error expanding nav for anchor",n)}},onScroll:function(){this.winScroll=!1;var n=this.win.scrollTop(),e=n+this.winHeight,t=this.navBar.scrollTop()+(n-this.winPosition);n<0||e>this.docHeight||(this.navBar.scrollTop(t),this.winPosition=n)},onResize:function(){this.winResize=!1,this.winHeight=this.win.height(),this.docHeight=$(document).height()},hashChange:function(){this.linkScroll=!0,this.win.one("hashchange",(function(){this.linkScroll=!1}))},toggleCurrent:function(n){var e=n.closest("li");e.siblings("li.current").removeClass("current").attr("aria-expanded","false"),e.siblings().find("li.current").removeClass("current").attr("aria-expanded","false");var t=e.find("> ul li");t.length&&(t.removeClass("current").attr("aria-expanded","false"),e.toggleClass("current").attr("aria-expanded",(function(n,e){return"true"==e?"false":"true"})))}},"undefined"!=typeof window&&(window.SphinxRtdTheme={Navigation:n.exports.ThemeNav,StickyNav:n.exports.ThemeNav}),function(){for(var n=0,e=["ms","moz","webkit","o"],t=0;t0 + var meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$"; // [C]VC[V] is m=1 + var mgr1 = "^(" + C + ")?" + V + C + V + C; // [C]VCVC... is m>1 + var s_v = "^(" + C + ")?" 
+ v; // vowel in stem + + this.stemWord = function (w) { + var stem; + var suffix; + var firstch; + var origword = w; + + if (w.length < 3) + return w; + + var re; + var re2; + var re3; + var re4; + + firstch = w.substr(0,1); + if (firstch == "y") + w = firstch.toUpperCase() + w.substr(1); + + // Step 1a + re = /^(.+?)(ss|i)es$/; + re2 = /^(.+?)([^s])s$/; + + if (re.test(w)) + w = w.replace(re,"$1$2"); + else if (re2.test(w)) + w = w.replace(re2,"$1$2"); + + // Step 1b + re = /^(.+?)eed$/; + re2 = /^(.+?)(ed|ing)$/; + if (re.test(w)) { + var fp = re.exec(w); + re = new RegExp(mgr0); + if (re.test(fp[1])) { + re = /.$/; + w = w.replace(re,""); + } + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + re2 = new RegExp(s_v); + if (re2.test(stem)) { + w = stem; + re2 = /(at|bl|iz)$/; + re3 = new RegExp("([^aeiouylsz])\\1$"); + re4 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re2.test(w)) + w = w + "e"; + else if (re3.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + else if (re4.test(w)) + w = w + "e"; + } + } + + // Step 1c + re = /^(.+?)y$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(s_v); + if (re.test(stem)) + w = stem + "i"; + } + + // Step 2 + re = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step2list[suffix]; + } + + // Step 3 + re = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step3list[suffix]; + } + + // Step 4 + re = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/; + re2 = /^(.+?)(s|t)(ion)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + if (re.test(stem)) + w = stem; + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1] + fp[2]; + re2 = new RegExp(mgr1); + if (re2.test(stem)) + w = stem; + } + + // Step 5 + re = /^(.+?)e$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + re2 = new RegExp(meq1); + re3 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) + w = stem; + } + re = /ll$/; + re2 = new RegExp(mgr1); + if (re.test(w) && re2.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + + // and turn initial Y back to y + if (firstch == "y") + w = firstch.toLowerCase() + w.substr(1); + return w; + } +} + diff --git a/branch/bicounty_dev/_static/minus.png b/branch/bicounty_dev/_static/minus.png new file mode 100644 index 0000000..d96755f Binary files /dev/null and b/branch/bicounty_dev/_static/minus.png differ diff --git a/branch/bicounty_dev/_static/plus.png b/branch/bicounty_dev/_static/plus.png new file mode 100644 index 0000000..7107cec Binary files /dev/null and b/branch/bicounty_dev/_static/plus.png differ diff --git a/branch/bicounty_dev/_static/pygments.css b/branch/bicounty_dev/_static/pygments.css new file mode 100644 index 0000000..84ab303 --- /dev/null +++ b/branch/bicounty_dev/_static/pygments.css @@ -0,0 +1,75 @@ +pre { line-height: 125%; } +td.linenos .normal { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } +span.linenos { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } +td.linenos 
.special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +span.linenos.special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +.highlight .hll { background-color: #ffffcc } +.highlight { background: #f8f8f8; } +.highlight .c { color: #3D7B7B; font-style: italic } /* Comment */ +.highlight .err { border: 1px solid #FF0000 } /* Error */ +.highlight .k { color: #008000; font-weight: bold } /* Keyword */ +.highlight .o { color: #666666 } /* Operator */ +.highlight .ch { color: #3D7B7B; font-style: italic } /* Comment.Hashbang */ +.highlight .cm { color: #3D7B7B; font-style: italic } /* Comment.Multiline */ +.highlight .cp { color: #9C6500 } /* Comment.Preproc */ +.highlight .cpf { color: #3D7B7B; font-style: italic } /* Comment.PreprocFile */ +.highlight .c1 { color: #3D7B7B; font-style: italic } /* Comment.Single */ +.highlight .cs { color: #3D7B7B; font-style: italic } /* Comment.Special */ +.highlight .gd { color: #A00000 } /* Generic.Deleted */ +.highlight .ge { font-style: italic } /* Generic.Emph */ +.highlight .ges { font-weight: bold; font-style: italic } /* Generic.EmphStrong */ +.highlight .gr { color: #E40000 } /* Generic.Error */ +.highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */ +.highlight .gi { color: #008400 } /* Generic.Inserted */ +.highlight .go { color: #717171 } /* Generic.Output */ +.highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */ +.highlight .gs { font-weight: bold } /* Generic.Strong */ +.highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */ +.highlight .gt { color: #0044DD } /* Generic.Traceback */ +.highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */ +.highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */ +.highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */ +.highlight .kp { color: #008000 } /* Keyword.Pseudo */ +.highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */ +.highlight .kt { color: #B00040 } /* Keyword.Type */ +.highlight .m { color: #666666 } /* Literal.Number */ +.highlight .s { color: #BA2121 } /* Literal.String */ +.highlight .na { color: #687822 } /* Name.Attribute */ +.highlight .nb { color: #008000 } /* Name.Builtin */ +.highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */ +.highlight .no { color: #880000 } /* Name.Constant */ +.highlight .nd { color: #AA22FF } /* Name.Decorator */ +.highlight .ni { color: #717171; font-weight: bold } /* Name.Entity */ +.highlight .ne { color: #CB3F38; font-weight: bold } /* Name.Exception */ +.highlight .nf { color: #0000FF } /* Name.Function */ +.highlight .nl { color: #767600 } /* Name.Label */ +.highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */ +.highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */ +.highlight .nv { color: #19177C } /* Name.Variable */ +.highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */ +.highlight .w { color: #bbbbbb } /* Text.Whitespace */ +.highlight .mb { color: #666666 } /* Literal.Number.Bin */ +.highlight .mf { color: #666666 } /* Literal.Number.Float */ +.highlight .mh { color: #666666 } /* Literal.Number.Hex */ +.highlight .mi { color: #666666 } /* Literal.Number.Integer */ +.highlight .mo { color: #666666 } /* Literal.Number.Oct */ +.highlight .sa { color: #BA2121 } /* Literal.String.Affix */ +.highlight .sb { color: #BA2121 } /* Literal.String.Backtick */ +.highlight .sc { 
color: #BA2121 } /* Literal.String.Char */ +.highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */ +.highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */ +.highlight .s2 { color: #BA2121 } /* Literal.String.Double */ +.highlight .se { color: #AA5D1F; font-weight: bold } /* Literal.String.Escape */ +.highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */ +.highlight .si { color: #A45A77; font-weight: bold } /* Literal.String.Interpol */ +.highlight .sx { color: #008000 } /* Literal.String.Other */ +.highlight .sr { color: #A45A77 } /* Literal.String.Regex */ +.highlight .s1 { color: #BA2121 } /* Literal.String.Single */ +.highlight .ss { color: #19177C } /* Literal.String.Symbol */ +.highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */ +.highlight .fm { color: #0000FF } /* Name.Function.Magic */ +.highlight .vc { color: #19177C } /* Name.Variable.Class */ +.highlight .vg { color: #19177C } /* Name.Variable.Global */ +.highlight .vi { color: #19177C } /* Name.Variable.Instance */ +.highlight .vm { color: #19177C } /* Name.Variable.Magic */ +.highlight .il { color: #666666 } /* Literal.Number.Integer.Long */ \ No newline at end of file diff --git a/branch/bicounty_dev/_static/searchtools.js b/branch/bicounty_dev/_static/searchtools.js new file mode 100644 index 0000000..97d56a7 --- /dev/null +++ b/branch/bicounty_dev/_static/searchtools.js @@ -0,0 +1,566 @@ +/* + * searchtools.js + * ~~~~~~~~~~~~~~~~ + * + * Sphinx JavaScript utilities for the full-text search. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +/** + * Simple result scoring code. + */ +if (typeof Scorer === "undefined") { + var Scorer = { + // Implement the following function to further tweak the score for each result + // The function takes a result array [docname, title, anchor, descr, score, filename] + // and returns the new score. + /* + score: result => { + const [docname, title, anchor, descr, score, filename] = result + return score + }, + */ + + // query matches the full name of an object + objNameMatch: 11, + // or matches in the last dotted part of the object name + objPartialMatch: 6, + // Additive scores depending on the priority of the object + objPrio: { + 0: 15, // used to be importantResults + 1: 5, // used to be objectResults + 2: -5, // used to be unimportantResults + }, + // Used when the priority is not in the mapping. 
+ objPrioDefault: 0, + + // query found in title + title: 15, + partialTitle: 7, + // query found in terms + term: 5, + partialTerm: 2, + }; +} + +const _removeChildren = (element) => { + while (element && element.lastChild) element.removeChild(element.lastChild); +}; + +/** + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions#escaping + */ +const _escapeRegExp = (string) => + string.replace(/[.*+\-?^${}()|[\]\\]/g, "\\$&"); // $& means the whole matched string + +const _displayItem = (item, searchTerms) => { + const docBuilder = DOCUMENTATION_OPTIONS.BUILDER; + const docUrlRoot = DOCUMENTATION_OPTIONS.URL_ROOT; + const docFileSuffix = DOCUMENTATION_OPTIONS.FILE_SUFFIX; + const docLinkSuffix = DOCUMENTATION_OPTIONS.LINK_SUFFIX; + const showSearchSummary = DOCUMENTATION_OPTIONS.SHOW_SEARCH_SUMMARY; + + const [docName, title, anchor, descr, score, _filename] = item; + + let listItem = document.createElement("li"); + let requestUrl; + let linkUrl; + if (docBuilder === "dirhtml") { + // dirhtml builder + let dirname = docName + "/"; + if (dirname.match(/\/index\/$/)) + dirname = dirname.substring(0, dirname.length - 6); + else if (dirname === "index/") dirname = ""; + requestUrl = docUrlRoot + dirname; + linkUrl = requestUrl; + } else { + // normal html builders + requestUrl = docUrlRoot + docName + docFileSuffix; + linkUrl = docName + docLinkSuffix; + } + let linkEl = listItem.appendChild(document.createElement("a")); + linkEl.href = linkUrl + anchor; + linkEl.dataset.score = score; + linkEl.innerHTML = title; + if (descr) + listItem.appendChild(document.createElement("span")).innerHTML = + " (" + descr + ")"; + else if (showSearchSummary) + fetch(requestUrl) + .then((responseData) => responseData.text()) + .then((data) => { + if (data) + listItem.appendChild( + Search.makeSearchSummary(data, searchTerms) + ); + }); + Search.output.appendChild(listItem); +}; +const _finishSearch = (resultCount) => { + Search.stopPulse(); + Search.title.innerText = _("Search Results"); + if (!resultCount) + Search.status.innerText = Documentation.gettext( + "Your search did not match any documents. Please make sure that all words are spelled correctly and that you've selected enough categories." + ); + else + Search.status.innerText = _( + `Search finished, found ${resultCount} page(s) matching the search query.` + ); +}; +const _displayNextItem = ( + results, + resultCount, + searchTerms +) => { + // results left, load the summary and display it + // this is intended to be dynamic (don't sub resultsCount) + if (results.length) { + _displayItem(results.pop(), searchTerms); + setTimeout( + () => _displayNextItem(results, resultCount, searchTerms), + 5 + ); + } + // search finished, update title and status message + else _finishSearch(resultCount); +}; + +/** + * Default splitQuery function. Can be overridden in ``sphinx.search`` with a + * custom function per language. + * + * The regular expression works by splitting the string on consecutive characters + * that are not Unicode letters, numbers, underscores, or emoji characters. + * This is the same as ``\W+`` in Python, preserving the surrogate pair area. 
+ */ +if (typeof splitQuery === "undefined") { + var splitQuery = (query) => query + .split(/[^\p{Letter}\p{Number}_\p{Emoji_Presentation}]+/gu) + .filter(term => term) // remove remaining empty strings +} + +/** + * Search Module + */ +const Search = { + _index: null, + _queued_query: null, + _pulse_status: -1, + + htmlToText: (htmlString) => { + const htmlElement = new DOMParser().parseFromString(htmlString, 'text/html'); + htmlElement.querySelectorAll(".headerlink").forEach((el) => { el.remove() }); + const docContent = htmlElement.querySelector('[role="main"]'); + if (docContent !== undefined) return docContent.textContent; + console.warn( + "Content block not found. Sphinx search tries to obtain it via '[role=main]'. Could you check your theme or template." + ); + return ""; + }, + + init: () => { + const query = new URLSearchParams(window.location.search).get("q"); + document + .querySelectorAll('input[name="q"]') + .forEach((el) => (el.value = query)); + if (query) Search.performSearch(query); + }, + + loadIndex: (url) => + (document.body.appendChild(document.createElement("script")).src = url), + + setIndex: (index) => { + Search._index = index; + if (Search._queued_query !== null) { + const query = Search._queued_query; + Search._queued_query = null; + Search.query(query); + } + }, + + hasIndex: () => Search._index !== null, + + deferQuery: (query) => (Search._queued_query = query), + + stopPulse: () => (Search._pulse_status = -1), + + startPulse: () => { + if (Search._pulse_status >= 0) return; + + const pulse = () => { + Search._pulse_status = (Search._pulse_status + 1) % 4; + Search.dots.innerText = ".".repeat(Search._pulse_status); + if (Search._pulse_status >= 0) window.setTimeout(pulse, 500); + }; + pulse(); + }, + + /** + * perform a search for something (or wait until index is loaded) + */ + performSearch: (query) => { + // create the required interface elements + const searchText = document.createElement("h2"); + searchText.textContent = _("Searching"); + const searchSummary = document.createElement("p"); + searchSummary.classList.add("search-summary"); + searchSummary.innerText = ""; + const searchList = document.createElement("ul"); + searchList.classList.add("search"); + + const out = document.getElementById("search-results"); + Search.title = out.appendChild(searchText); + Search.dots = Search.title.appendChild(document.createElement("span")); + Search.status = out.appendChild(searchSummary); + Search.output = out.appendChild(searchList); + + const searchProgress = document.getElementById("search-progress"); + // Some themes don't use the search progress node + if (searchProgress) { + searchProgress.innerText = _("Preparing search..."); + } + Search.startPulse(); + + // index already loaded, the browser was quick! 
+ if (Search.hasIndex()) Search.query(query); + else Search.deferQuery(query); + }, + + /** + * execute search (requires search index to be loaded) + */ + query: (query) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + const allTitles = Search._index.alltitles; + const indexEntries = Search._index.indexentries; + + // stem the search terms and add them to the correct list + const stemmer = new Stemmer(); + const searchTerms = new Set(); + const excludedTerms = new Set(); + const highlightTerms = new Set(); + const objectTerms = new Set(splitQuery(query.toLowerCase().trim())); + splitQuery(query.trim()).forEach((queryTerm) => { + const queryTermLower = queryTerm.toLowerCase(); + + // maybe skip this "word" + // stopwords array is from language_data.js + if ( + stopwords.indexOf(queryTermLower) !== -1 || + queryTerm.match(/^\d+$/) + ) + return; + + // stem the word + let word = stemmer.stemWord(queryTermLower); + // select the correct list + if (word[0] === "-") excludedTerms.add(word.substr(1)); + else { + searchTerms.add(word); + highlightTerms.add(queryTermLower); + } + }); + + if (SPHINX_HIGHLIGHT_ENABLED) { // set in sphinx_highlight.js + localStorage.setItem("sphinx_highlight_terms", [...highlightTerms].join(" ")) + } + + // console.debug("SEARCH: searching for:"); + // console.info("required: ", [...searchTerms]); + // console.info("excluded: ", [...excludedTerms]); + + // array of [docname, title, anchor, descr, score, filename] + let results = []; + _removeChildren(document.getElementById("search-progress")); + + const queryLower = query.toLowerCase(); + for (const [title, foundTitles] of Object.entries(allTitles)) { + if (title.toLowerCase().includes(queryLower) && (queryLower.length >= title.length/2)) { + for (const [file, id] of foundTitles) { + let score = Math.round(100 * queryLower.length / title.length) + results.push([ + docNames[file], + titles[file] !== title ? `${titles[file]} > ${title}` : title, + id !== null ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // search for explicit entries in index directives + for (const [entry, foundEntries] of Object.entries(indexEntries)) { + if (entry.includes(queryLower) && (queryLower.length >= entry.length/2)) { + for (const [file, id] of foundEntries) { + let score = Math.round(100 * queryLower.length / entry.length) + results.push([ + docNames[file], + titles[file], + id ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // lookup as object + objectTerms.forEach((term) => + results.push(...Search.performObjectSearch(term, objectTerms)) + ); + + // lookup as search terms in fulltext + results.push(...Search.performTermsSearch(searchTerms, excludedTerms)); + + // let the scorer override scores with a custom scoring function + if (Scorer.score) results.forEach((item) => (item[4] = Scorer.score(item))); + + // now sort the results by score (in opposite order of appearance, since the + // display function below uses pop() to retrieve items) and then + // alphabetically + results.sort((a, b) => { + const leftScore = a[4]; + const rightScore = b[4]; + if (leftScore === rightScore) { + // same score: sort alphabetically + const leftTitle = a[1].toLowerCase(); + const rightTitle = b[1].toLowerCase(); + if (leftTitle === rightTitle) return 0; + return leftTitle > rightTitle ? -1 : 1; // inverted is intentional + } + return leftScore > rightScore ? 
1 : -1; + }); + + // remove duplicate search results + // note the reversing of results, so that in the case of duplicates, the highest-scoring entry is kept + let seen = new Set(); + results = results.reverse().reduce((acc, result) => { + let resultStr = result.slice(0, 4).concat([result[5]]).map(v => String(v)).join(','); + if (!seen.has(resultStr)) { + acc.push(result); + seen.add(resultStr); + } + return acc; + }, []); + + results = results.reverse(); + + // for debugging + //Search.lastresults = results.slice(); // a copy + // console.info("search results:", Search.lastresults); + + // print the results + _displayNextItem(results, results.length, searchTerms); + }, + + /** + * search for object names + */ + performObjectSearch: (object, objectTerms) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const objects = Search._index.objects; + const objNames = Search._index.objnames; + const titles = Search._index.titles; + + const results = []; + + const objectSearchCallback = (prefix, match) => { + const name = match[4] + const fullname = (prefix ? prefix + "." : "") + name; + const fullnameLower = fullname.toLowerCase(); + if (fullnameLower.indexOf(object) < 0) return; + + let score = 0; + const parts = fullnameLower.split("."); + + // check for different match types: exact matches of full name or + // "last name" (i.e. last dotted part) + if (fullnameLower === object || parts.slice(-1)[0] === object) + score += Scorer.objNameMatch; + else if (parts.slice(-1)[0].indexOf(object) > -1) + score += Scorer.objPartialMatch; // matches in last name + + const objName = objNames[match[1]][2]; + const title = titles[match[0]]; + + // If more than one term searched for, we require other words to be + // found in the name/title/description + const otherTerms = new Set(objectTerms); + otherTerms.delete(object); + if (otherTerms.size > 0) { + const haystack = `${prefix} ${name} ${objName} ${title}`.toLowerCase(); + if ( + [...otherTerms].some((otherTerm) => haystack.indexOf(otherTerm) < 0) + ) + return; + } + + let anchor = match[3]; + if (anchor === "") anchor = fullname; + else if (anchor === "-") anchor = objNames[match[1]][1] + "-" + fullname; + + const descr = objName + _(", in ") + title; + + // add custom score for some objects according to scorer + if (Scorer.objPrio.hasOwnProperty(match[2])) + score += Scorer.objPrio[match[2]]; + else score += Scorer.objPrioDefault; + + results.push([ + docNames[match[0]], + fullname, + "#" + anchor, + descr, + score, + filenames[match[0]], + ]); + }; + Object.keys(objects).forEach((prefix) => + objects[prefix].forEach((array) => + objectSearchCallback(prefix, array) + ) + ); + return results; + }, + + /** + * search for full-text terms in the index + */ + performTermsSearch: (searchTerms, excludedTerms) => { + // prepare search + const terms = Search._index.terms; + const titleTerms = Search._index.titleterms; + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + + const scoreMap = new Map(); + const fileMap = new Map(); + + // perform the search on the required terms + searchTerms.forEach((word) => { + const files = []; + const arr = [ + { files: terms[word], score: Scorer.term }, + { files: titleTerms[word], score: Scorer.title }, + ]; + // add support for partial matches + if (word.length > 2) { + const escapedWord = _escapeRegExp(word); + Object.keys(terms).forEach((term) => { + if (term.match(escapedWord) && !terms[word]) + arr.push({ 
files: terms[term], score: Scorer.partialTerm }); + }); + Object.keys(titleTerms).forEach((term) => { + if (term.match(escapedWord) && !titleTerms[word]) + arr.push({ files: titleTerms[word], score: Scorer.partialTitle }); + }); + } + + // no match but word was a required one + if (arr.every((record) => record.files === undefined)) return; + + // found search word in contents + arr.forEach((record) => { + if (record.files === undefined) return; + + let recordFiles = record.files; + if (recordFiles.length === undefined) recordFiles = [recordFiles]; + files.push(...recordFiles); + + // set score for the word in each file + recordFiles.forEach((file) => { + if (!scoreMap.has(file)) scoreMap.set(file, {}); + scoreMap.get(file)[word] = record.score; + }); + }); + + // create the mapping + files.forEach((file) => { + if (fileMap.has(file) && fileMap.get(file).indexOf(word) === -1) + fileMap.get(file).push(word); + else fileMap.set(file, [word]); + }); + }); + + // now check if the files don't contain excluded terms + const results = []; + for (const [file, wordList] of fileMap) { + // check if all requirements are matched + + // as search terms with length < 3 are discarded + const filteredTermCount = [...searchTerms].filter( + (term) => term.length > 2 + ).length; + if ( + wordList.length !== searchTerms.size && + wordList.length !== filteredTermCount + ) + continue; + + // ensure that none of the excluded terms is in the search result + if ( + [...excludedTerms].some( + (term) => + terms[term] === file || + titleTerms[term] === file || + (terms[term] || []).includes(file) || + (titleTerms[term] || []).includes(file) + ) + ) + break; + + // select one (max) score for the file. + const score = Math.max(...wordList.map((w) => scoreMap.get(file)[w])); + // add result to the result list + results.push([ + docNames[file], + titles[file], + "", + null, + score, + filenames[file], + ]); + } + return results; + }, + + /** + * helper function to return a node containing the + * search summary for a given text. keywords is a list + * of stemmed words. + */ + makeSearchSummary: (htmlText, keywords) => { + const text = Search.htmlToText(htmlText); + if (text === "") return null; + + const textLower = text.toLowerCase(); + const actualStartPosition = [...keywords] + .map((k) => textLower.indexOf(k.toLowerCase())) + .filter((i) => i > -1) + .slice(-1)[0]; + const startWithContext = Math.max(actualStartPosition - 120, 0); + + const top = startWithContext === 0 ? "" : "..."; + const tail = startWithContext + 240 < text.length ? "..." : ""; + + let summary = document.createElement("p"); + summary.classList.add("context"); + summary.textContent = top + text.substr(startWithContext, 240).trim() + tail; + + return summary; + }, +}; + +_ready(Search.init); diff --git a/branch/bicounty_dev/_static/sphinx_highlight.js b/branch/bicounty_dev/_static/sphinx_highlight.js new file mode 100644 index 0000000..aae669d --- /dev/null +++ b/branch/bicounty_dev/_static/sphinx_highlight.js @@ -0,0 +1,144 @@ +/* Highlighting utilities for Sphinx HTML documentation. */ +"use strict"; + +const SPHINX_HIGHLIGHT_ENABLED = true + +/** + * highlight a given string on a node by wrapping it in + * span elements with the given class name. 
+ */ +const _highlight = (node, addItems, text, className) => { + if (node.nodeType === Node.TEXT_NODE) { + const val = node.nodeValue; + const parent = node.parentNode; + const pos = val.toLowerCase().indexOf(text); + if ( + pos >= 0 && + !parent.classList.contains(className) && + !parent.classList.contains("nohighlight") + ) { + let span; + + const closestNode = parent.closest("body, svg, foreignObject"); + const isInSVG = closestNode && closestNode.matches("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.classList.add(className); + } + + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + parent.insertBefore( + span, + parent.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling + ) + ); + node.nodeValue = val.substr(0, pos); + + if (isInSVG) { + const rect = document.createElementNS( + "http://www.w3.org/2000/svg", + "rect" + ); + const bbox = parent.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute("class", className); + addItems.push({ parent: parent, target: rect }); + } + } + } else if (node.matches && !node.matches("button, select, textarea")) { + node.childNodes.forEach((el) => _highlight(el, addItems, text, className)); + } +}; +const _highlightText = (thisNode, text, className) => { + let addItems = []; + _highlight(thisNode, addItems, text, className); + addItems.forEach((obj) => + obj.parent.insertAdjacentElement("beforebegin", obj.target) + ); +}; + +/** + * Small JavaScript module for the documentation. + */ +const SphinxHighlight = { + + /** + * highlight the search words provided in localstorage in the text + */ + highlightSearchWords: () => { + if (!SPHINX_HIGHLIGHT_ENABLED) return; // bail if no highlight + + // get and clear terms from localstorage + const url = new URL(window.location); + const highlight = + localStorage.getItem("sphinx_highlight_terms") + || url.searchParams.get("highlight") + || ""; + localStorage.removeItem("sphinx_highlight_terms") + url.searchParams.delete("highlight"); + window.history.replaceState({}, "", url); + + // get individual terms from highlight string + const terms = highlight.toLowerCase().split(/\s+/).filter(x => x); + if (terms.length === 0) return; // nothing to do + + // There should never be more than one element matching "div.body" + const divBody = document.querySelectorAll("div.body"); + const body = divBody.length ? 
divBody[0] : document.querySelector("body"); + window.setTimeout(() => { + terms.forEach((term) => _highlightText(body, term, "highlighted")); + }, 10); + + const searchBox = document.getElementById("searchbox"); + if (searchBox === null) return; + searchBox.appendChild( + document + .createRange() + .createContextualFragment( + '" + ) + ); + }, + + /** + * helper function to hide the search marks again + */ + hideSearchWords: () => { + document + .querySelectorAll("#searchbox .highlight-link") + .forEach((el) => el.remove()); + document + .querySelectorAll("span.highlighted") + .forEach((el) => el.classList.remove("highlighted")); + localStorage.removeItem("sphinx_highlight_terms") + }, + + initEscapeListener: () => { + // only install a listener if it is really needed + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.shiftKey || event.altKey || event.ctrlKey || event.metaKey) return; + if (DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS && (event.key === "Escape")) { + SphinxHighlight.hideSearchWords(); + event.preventDefault(); + } + }); + }, +}; + +_ready(SphinxHighlight.highlightSearchWords); +_ready(SphinxHighlight.initEscapeListener); diff --git a/branch/bicounty_dev/autodoc/index.html b/branch/bicounty_dev/autodoc/index.html new file mode 100644 index 0000000..376fe34 --- /dev/null +++ b/branch/bicounty_dev/autodoc/index.html @@ -0,0 +1,165 @@ + + + + + + + Lasso Classes and Functions — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Lasso Classes and Functions

+
+

Base Classes

+ + + + + + + + + + + + + + + + + + +

CubeTransit

Class for storing information about transit defined in Cube line files.

StandardTransit

Holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files.

ModelRoadwayNetwork

Subclass of the network_wrangler class RoadwayNetwork.

Project

A single or set of changes to the roadway or transit system.

Parameters

A class representing all the parameters defining the networks, including time of day, categories, etc. (a short usage sketch follows this table).

+
+
+
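To show how these base classes fit together, here is a minimal, hypothetical sketch. The keyword names (lasso_base_dir, link_filename, node_filename, shape_filename) and the ModelRoadwayNetwork.read signature are assumptions made for illustration rather than API confirmed on this page; the idea is simply that a Parameters instance is built once and handed to the other classes through their constructors or readers.

# Hypothetical sketch only; argument names below are assumptions, not confirmed lasso API.
from lasso import Parameters, ModelRoadwayNetwork

parameters = Parameters(lasso_base_dir="/path/to/lasso")  # assumed keyword

# ModelRoadwayNetwork subclasses network_wrangler's RoadwayNetwork,
# so it is read from standard roadway link/node/shape files (file names assumed).
model_net = ModelRoadwayNetwork.read(
    link_filename="links.json",
    node_filename="nodes.geojson",
    shape_filename="shapes.geojson",
    parameters=parameters,
)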

Utils and Functions

+ + + + + + + + + +

util

logger

+
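util and logger are small helper modules rather than classes. As a purely illustrative sketch, and assuming the setupLogging name and its arguments (neither is documented on this page), logging could be configured once before any network work:

import logging

from lasso import setupLogging  # assumed import path

setupLogging(
    level=logging.DEBUG,           # assumed: logging level for console/file output
    log_filename="lasso_run.log",  # assumed: optional log file path
)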
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/genindex/index.html b/branch/bicounty_dev/genindex/index.html new file mode 100644 index 0000000..cb4a8a9 --- /dev/null +++ b/branch/bicounty_dev/genindex/index.html @@ -0,0 +1,1038 @@ + + + + + + Index — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
+
+
+
+
+ + +

Index

+ +
_ | A | B | C | D | E | F | G | H | I | K | L | M | N | O | P | R | S | T | U | V | W | X | Y | Z
+ + + +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
Built with Sphinx using a theme provided by Read the Docs.
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/index.html b/branch/bicounty_dev/index.html new file mode 100644 index 0000000..5a887f3 --- /dev/null +++ b/branch/bicounty_dev/index.html @@ -0,0 +1,183 @@ + + + + + + + Welcome to lasso’s documentation! — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Welcome to lasso’s documentation!

+

This package of utilities is a wrapper around the
[network_wrangler](http://github.com/wsp-sag/network_wrangler) package
for MetCouncil. It aims to have the following functionality (a usage
sketch for the first item follows the list):

1. parse Cube log files and base highway networks and create ProjectCards
for Network Wrangler

2. parse two Cube transit line files and create ProjectCards for Network Wrangler

3. refine Network Wrangler highway networks to contain specific variables and
settings for Metropolitan Council and export them to a format that can
be read in by Citilabs’ Cube software.
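As a minimal, hypothetical sketch of the first item above (turning a Cube log file into a project card), using the Project.create_project() and write_project_card() methods documented under lasso.Project; the file locations are placeholders:

import os

from lasso import Project

# Hypothetical locations -- substitute real paths.
LOGFILE_PATH = os.path.join("examples", "cube", "my_roadway_changes.log")
BASE_ROADWAY_DIR = os.path.join("examples", "stpaul")
CARD_DIR = os.path.join("examples", "cards")

# Parse the Cube log file against the base roadway network and build a project card.
project = Project.create_project(
    roadway_log_file=LOGFILE_PATH,
    base_roadway_dir=BASE_ROADWAY_DIR,
)

# Write the resulting ProjectCard as YAML for Network Wrangler.
project.write_project_card(os.path.join(CARD_DIR, "my_roadway_changes.yml"))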
+ +
+
+

Indices and tables

+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/objects.inv b/branch/bicounty_dev/objects.inv new file mode 100644 index 0000000..5f26290 Binary files /dev/null and b/branch/bicounty_dev/objects.inv differ diff --git a/branch/bicounty_dev/py-modindex/index.html b/branch/bicounty_dev/py-modindex/index.html new file mode 100644 index 0000000..a64daae --- /dev/null +++ b/branch/bicounty_dev/py-modindex/index.html @@ -0,0 +1,136 @@ + + + + + + Python Module Index — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Python Module Index

+ +
+ l +
+ + + + + + + + + + + + + +
 
+ l
+ lasso +
    + lasso.logger +
    + lasso.util +
+ + +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/running/index.html b/branch/bicounty_dev/running/index.html new file mode 100644 index 0000000..36a27f9 --- /dev/null +++ b/branch/bicounty_dev/running/index.html @@ -0,0 +1,133 @@ + + + + + + + Running Lasso — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Running Lasso

+
+

Create project files

+
+
+

Create a scenario

+
+
+

Exporting networks

+
+
+
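As a rough, hypothetical sketch of the exporting step (assuming a ModelRoadwayNetwork instance named model_road_net already exists and that output locations fall back to the defaults in Parameters; the method names are from lasso.ModelRoadwayNetwork, and their optional arguments should be checked there):

# Convert the standard network to MetCouncil model conventions before export.
model_road_net.roadway_standard_to_met_council_network()

# Shapefile / CSV outputs (paths default to the values supplied by Parameters).
model_road_net.write_roadway_as_shp()

# Fixed-width text files plus the accompanying Cube network build script.
model_road_net.write_roadway_as_fixedwidth()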

Auditing and Reporting

+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/search/index.html b/branch/bicounty_dev/search/index.html new file mode 100644 index 0000000..8bafd66 --- /dev/null +++ b/branch/bicounty_dev/search/index.html @@ -0,0 +1,126 @@ + + + + + + Search — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + + + +
+ +
+ +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + + + + + + \ No newline at end of file diff --git a/branch/bicounty_dev/searchindex.js b/branch/bicounty_dev/searchindex.js new file mode 100644 index 0000000..8ef2c89 --- /dev/null +++ b/branch/bicounty_dev/searchindex.js @@ -0,0 +1 @@ +Search.setIndex({"docnames": ["_generated/lasso.CubeTransit", "_generated/lasso.ModelRoadwayNetwork", "_generated/lasso.Parameters", "_generated/lasso.Project", "_generated/lasso.StandardTransit", "_generated/lasso.logger", "_generated/lasso.util", "autodoc", "index", "running", "setup", "starting"], "filenames": ["_generated/lasso.CubeTransit.rst", "_generated/lasso.ModelRoadwayNetwork.rst", "_generated/lasso.Parameters.rst", "_generated/lasso.Project.rst", "_generated/lasso.StandardTransit.rst", "_generated/lasso.logger.rst", "_generated/lasso.util.rst", "autodoc.rst", "index.rst", "running.md", "setup.md", "starting.md"], "titles": ["lasso.CubeTransit", "lasso.ModelRoadwayNetwork", "lasso.Parameters", "lasso.Project", "lasso.StandardTransit", "lasso.logger", "lasso.util", "Lasso Classes and Functions", "Welcome to lasso\u2019s documentation!", "Running Lasso", "Setup", "Starting Out"], "terms": {"class": [0, 1, 2, 3, 4, 6, 8, 11], "paramet": [0, 1, 3, 4, 6, 8], "sourc": [0, 1, 2, 3, 4, 5, 6, 11], "base": [0, 1, 2, 3, 4, 6, 8, 11], "object": [0, 1, 2, 3, 4, 6, 11], "store": [0, 1, 11], "inform": [0, 1, 4, 11], "about": [0, 1, 4, 11], "transit": [0, 1, 2, 3, 4, 8], "defin": [0, 1, 2, 3, 11], "cube": [0, 1, 2, 3, 4, 8], "line": [0, 1, 2, 3, 4, 6, 8, 11], "file": [0, 1, 2, 3, 4, 8], "ha": [0, 6, 11], "capabl": [0, 11], "pars": [0, 8, 11], "properti": [0, 1, 2, 4, 6, 11], "shape": [0, 1, 3, 4, 6, 11], "python": [0, 1, 2, 11], "dictionari": [0, 1, 2, 3, 4, 6, 11], "compar": [0, 3, 4, 11], "repres": [0, 2, 4, 6, 11], "chang": [0, 1, 3, 4, 11], "project": [0, 1, 2, 4, 6, 8], "card": [0, 1, 3, 4], "typic": [0, 3, 4, 6, 8], "usag": [0, 1, 3, 4], "exampl": [0, 1, 3, 4, 6, 11], "tn": [0, 11], "create_from_cub": [0, 11], "cube_dir": [0, 3, 11], "transit_change_list": [0, 11], "evaluate_differ": [0, 4, 11], "base_transit_network": [0, 3, 11], "list": [0, 1, 2, 3, 4, 6, 11], "string": [0, 1, 3, 4, 6], "uniqu": [0, 1, 3], "name": [0, 1, 2, 3, 4, 6, 11], "network": [0, 1, 2, 3, 4, 5, 8], "type": [0, 1, 2, 3, 4, 6, 11], "line_properti": 0, "kei": [0, 1, 3, 11], "valu": [0, 1, 3, 4, 6, 11], "ar": [0, 1, 2, 3, 6, 11], "These": 0, "directli": 0, "read": [0, 1, 3, 4, 8, 11], "from": [0, 1, 2, 3, 4, 6, 8], "haven": 0, "t": [0, 1, 6], "been": [0, 6], "translat": [0, 4, 11], "standard": [0, 1, 2, 4, 11], "dict": [0, 1, 2, 3], "panda": [0, 1, 3], "datafram": [0, 1, 3, 4, 11], "node": [0, 1, 2, 3, 4, 6, 11], "follow": [0, 4, 8, 11], "column": [0, 1], "node_id": 0, "int": [0, 1, 2, 3, 4, 6], "posit": [0, 6], "integ": [0, 1, 2], "id": [0, 1, 3, 4, 11], "number": [0, 1, 2, 3, 4, 6, 11], "neg": [0, 6], "indic": [0, 1, 6], "non": [0, 6], "stop": 0, "boolean": [0, 1], "i": [0, 1, 3, 4, 5, 6, 8, 11], "order": [0, 1, 2, 6, 11], "within": [0, 1, 2, 6], "thi": [0, 1, 2, 3, 6, 8, 11], "program_typ": 0, "either": [0, 1, 5, 6, 11], "pt": 0, "trnbld": 0, "str": [0, 1, 2, 3, 6], "instanc": [0, 1, 2, 3, 4, 11], "appli": [0, 1, 6], "which": [0, 1, 3, 6, 11], "includ": [0, 1, 2, 11], "time": [0, 1, 2, 4, 6, 11], "period": [0, 1, 2, 4, 11], "variabl": [0, 1, 2, 3, 4, 8, 11], "source_list": 0, "have": [0, 1, 6, 8], "ad": [0, 1, 3, 6, 11], "diff_dict": 0, "__init__": [0, 1, 2, 3, 4], "constructor": [0, 1, 3], "set": [0, 1, 2, 3, 4, 5, 6, 8, 11], "see": [0, 1, 3, 4, 6], "an": [0, 1, 3, 4, 6, 
11], "method": [0, 1, 2, 3, 4, 6, 11], "add_additional_time_period": 0, "new_time_period_numb": 0, "orig_line_nam": 0, "copi": [0, 1, 3, 6, 11], "rout": [0, 1, 4], "anoth": 0, "appropri": [0, 4], "specif": [0, 1, 8, 11], "new": [0, 1, 6, 11], "under": 0, "self": [0, 1, 2, 3, 4, 6], "origin": [0, 1, 6, 11], "its": [0, 1], "return": [0, 1, 3, 4, 6], "add_cub": 0, "transit_sourc": 0, "lin": [0, 3, 4], "add": [0, 1, 3, 6, 11], "exist": [0, 1, 3, 6, 11], "transitnetwork": [0, 4], "directori": [0, 1, 4, 11], "static": [0, 1, 3, 4], "build_route_nam": 0, "route_id": [0, 4], "time_period": [0, 1, 2], "agency_id": 0, "0": [0, 1, 2, 4, 6, 11], "direction_id": 0, "1": [0, 1, 2, 4, 6, 8, 11], "creat": [0, 1, 2, 3, 4, 6, 8, 11], "contaten": 0, "agenc": 0, "direct": [0, 1, 6, 11], "e": [0, 1, 11], "452": 0, "111": 0, "pk": [0, 2], "construct": [0, 6, 11], "line_nam": 0, "0_452": 0, "111_452_pk1": 0, "calculate_start_end_tim": 0, "line_properties_dict": 0, "calcul": [0, 1, 2, 3, 4, 6], "start": [0, 1, 4, 8], "end": [0, 1, 6], "warn": [0, 1], "doesn": [0, 1], "take": [0, 1], "care": 0, "discongru": 0, "flavor": [0, 1], "create_add_route_card_dict": 0, "format": [0, 1, 2, 4, 8, 11], "route_properti": 0, "being": 0, "updat": [0, 1, 11], "A": [0, 1, 2, 3, 4, 6, 11], "addit": [0, 1, 6, 8, 11], "create_delete_route_card_dict": 0, "base_transit_line_properties_dict": 0, "delet": [0, 1], "style": [0, 6], "attribut": [0, 1, 2, 3, 4, 11], "find": [0, 1, 4], "create_update_route_card_dict": 0, "updated_properties_dict": 0, "cube_properties_to_standard_properti": 0, "cube_properties_dict": 0, "convert": [0, 1, 4, 6], "most": 0, "pertin": 0, "like": [0, 1, 2, 6], "headwai": [0, 2, 4], "varibl": [0, 2], "stnadard": 0, "unit": [0, 1, 6], "minut": 0, "second": [0, 4, 6], "correct": 0, "base_transit": 0, "identifi": [0, 1, 11], "what": [0, 1, 3, 6], "need": [0, 1, 2, 3], "For": [0, 1, 4, 6], "multipl": [0, 1, 6, 11], "make": [0, 1, 11], "duplic": 0, "so": [0, 1, 5, 6, 11], "each": [0, 1, 3, 6], "condit": [0, 1], "contain": [0, 1, 4, 6, 8, 11], "requir": [0, 1, 6, 11], "evalu": [0, 1, 3], "differ": [0, 2, 6], "between": [0, 1, 2, 3, 6], "evaluate_route_property_differ": 0, "properties_build": 0, "properties_bas": 0, "time_period_numb": 0, "absolut": [0, 6], "true": [0, 1, 3, 4, 5, 6, 11], "validate_bas": 0, "fals": [0, 1, 3, 4, 6, 11], "check": [0, 1, 3, 6], "ani": [0, 1, 3, 6, 11], "entri": [0, 3], "property_nam": 0, "property_valu": 0, "us": [0, 1, 2, 3, 4, 6, 11], "command": [0, 1, 3], "rather": [0, 1], "than": [0, 1, 6], "If": [0, 1, 3, 4, 6, 11], "automat": 0, "note": [0, 6, 11], "onli": [0, 1, 3, 4, 6], "numer": [0, 4, 6], "frequenc": [0, 4, 11], "suitabl": 0, "write": [0, 1, 3, 4, 11], "evaluate_route_shape_chang": 0, "shape_build": 0, "shape_bas": 0, "two": [0, 1, 6, 8, 11], "build": [0, 1, 3, 11], "version": [0, 6, 11], "ddatafram": 0, "get_time_period_numbers_from_cube_properti": 0, "properties_list": 0, "associ": [0, 2], "them": [0, 1, 6, 8], "all": [0, 1, 2, 5, 6, 11], "found": [0, 1, 6], "unpack_route_nam": 0, "unpack": 0, "info": [0, 1, 2], "link": [1, 2, 3, 6, 11], "kwarg": [1, 2, 3, 6], "roadwaynetwork": [1, 3, 4], "subclass": [1, 11], "network_wrangl": [1, 8, 11], "represent": [1, 3, 4, 6], "physic": 1, "roadwai": [1, 2, 3, 11], "geodatafram": [1, 11], "specifi": [1, 3, 6], "default": [1, 2, 3, 4, 6, 11], "cr": [1, 3, 6], "coordin": [1, 3, 6], "refer": [1, 3, 4, 6, 11], "system": [1, 3], "espg": [1, 3], "node_foreign_kei": [1, 3], "tabl": [1, 2, 3, 6], "link_foreign_kei": [1, 3], "foreign": [1, 3], 
"shape_foreign_kei": [1, 3, 11], "unique_link_id": [1, 3], "unique_node_id": [1, 3], "modes_to_network_link_vari": [1, 3], "map": [1, 2, 3, 6, 11], "mode": [1, 3, 4, 11], "modes_to_network_nodes_vari": [1, 3], "managed_lanes_node_id_scalar": [1, 3], "scalar": [1, 3, 6], "primari": [1, 3], "correspond": [1, 3], "manag": [1, 2, 3, 11], "lane": [1, 2, 3, 11], "managed_lanes_link_id_scalar": [1, 3], "managed_lanes_required_attribut": [1, 3], "must": [1, 3, 6], "keep_same_attributes_ml_and_gp": [1, 3], "parallel": [1, 3, 6], "gener": [1, 3], "purpos": [1, 3, 6], "add_count": 1, "network_vari": 1, "aadt": 1, "mndot_count_shst_data": 1, "none": [1, 3, 4, 5, 6], "widot_count_shst_data": 1, "mndot_count_variable_shp": [1, 2], "widot_count_variable_shp": 1, "count": [1, 2], "mc": [1, 2, 4], "join": [1, 3, 4, 6, 11], "data": [1, 2, 3, 4, 8, 11], "via": 1, "shst": 1, "api": 1, "match": 1, "result": [1, 6, 11], "should": [1, 2, 3, 6, 11], "written": [1, 11], "path": [1, 3, 4, 6, 11], "mndot": [1, 2], "locat": [1, 2, 3, 4], "widot": 1, "geodatabas": 1, "add_incident_link_data_to_nod": 1, "links_df": [1, 11], "nodes_df": [1, 11], "link_vari": 1, "unique_node_kei": 1, "model_node_id": [1, 2], "go": [1, 11], "assess": [1, 3], "connect": 1, "incid": 1, "where": 1, "length": [1, 6], "n": [1, 2, 3, 6, 11], "out": [1, 2, 5, 6, 8], "add_new_roadway_feature_chang": 1, "featur": [1, 6], "also": [1, 6, 11], "valid": [1, 6, 11], "add_variable_using_shst_refer": 1, "var_shst_csvdata": 1, "shst_csv_variabl": 1, "network_var_typ": 1, "overwrit": 1, "bool": [1, 3, 6], "addition_map": 1, "show": 1, "project_card_dictionari": 1, "wrapper": [1, 8, 11], "apply_managed_lane_feature_chang": 1, "link_idx": 1, "in_plac": [1, 11], "lndice": 1, "whether": 1, "decid": 1, "connector": [1, 2], "when": [1, 3, 6], "thei": [1, 2], "more": [1, 6, 11], "apply_python_calcul": 1, "pycod": 1, "execut": 1, "code": [1, 2, 6, 11], "apply_roadway_feature_chang": [1, 11], "select": [1, 11], "pass": [1, 5, 6], "assess_connect": [1, 11], "ignore_end_nod": [1, 11], "graph": 1, "disconnect": 1, "subgraph": 1, "describ": [1, 11], "member": [1, 6], "one": [1, 6, 11], "drive": [1, 11], "walk": [1, 11], "bike": 1, "ignor": [1, 6], "strai": 1, "singleton": 1, "tupl": [1, 6], "osmnx": [1, 11], "networkx": 1, "digraph": 1, "build_selection_kei": 1, "selection_dict": 1, "combin": [1, 2, 4, 6], "queri": [1, 11], "b": [1, 2, 11], "you": [1, 3, 11], "selection_dictonari": 1, "serv": 1, "calculate_area_typ": 1, "area_type_shap": [1, 2], "area_type_shape_vari": 1, "area_typ": [1, 2, 3], "area_type_codes_dict": 1, "downtown_area_type_shap": [1, 2], "downtown_area_typ": [1, 2], "area": [1, 2, 3, 6], "centroid": [1, 2, 6], "geometri": [1, 2, 6], "field": [1, 2, 11], "determin": [1, 3, 6], "label": 1, "isn": 1, "perfect": 1, "much": 1, "quicker": 1, "other": [1, 6, 11], "The": [1, 2, 5, 6, 11], "geodadabas": 1, "input": [1, 6], "downtown": [1, 2], "boundari": [1, 2, 6], "counti": [1, 2, 3], "calculate_centroidconnect": 1, "centroidconnect": [1, 2, 3], "highest_taz_numb": [1, 2], "as_integ": 1, "max": 1, "taz": [1, 2], "calculate_counti": 1, "county_shap": [1, 2], "county_shape_vari": 1, "county_codes_dict": 1, "calculate_dist": 1, "distanc": [1, 2, 3, 6], "centroidconnect_onli": 1, "mile": 1, "centroidconnector": 1, "calculate_mpo": 1, "county_network_vari": 1, "mpo": [1, 2], "mpo_counti": [1, 2], "param": [1, 4], "county_vari": 1, "region": [1, 6], "calculate_us": 1, "defauli": 1, "convert_int": 1, "int_col_nam": 1, "create_ml_vari": 1, "ml_lane": [1, 2], "ml": 1, 
"placehold": 1, "come": 1, "log": [1, 3, 8], "create_calculated_vari": 1, "create_dummy_connector_link": 1, "ml_df": 1, "access_lan": 1, "egress_lan": 1, "access_roadwai": 1, "ml_access": 1, "egress_roadwai": 1, "access_name_prefix": 1, "access": [1, 4, 6], "dummi": 1, "egress_name_prefix": 1, "egress": 1, "gp_df": 1, "roadai": 1, "prefix": 1, "create_hov_corridor_vari": 1, "segment_id": [1, 2], "hov": 1, "corridor": 1, "create_managed_lane_network": [1, 11], "keep_additional_attributes_ml_and_gp": 1, "separ": [1, 3, 6], "look": 1, "want": [1, 11], "leav": 1, "some": 1, "rigor": 1, "test": [1, 2, 6], "create_managed_vari": 1, "dataframe_to_fixed_width": 1, "df": 1, "fix": [1, 2], "width": [1, 6], "transform": [1, 6], "delete_roadway_feature_chang": 1, "ignore_miss": 1, "get": [1, 2, 6, 11], "miss": 1, "fail": [1, 6], "deletion_map": 1, "fill_na": 1, "fill": [1, 6], "na": 1, "from_roadwaynetwork": [1, 11], "roadway_network_object": 1, "get_attribut": 1, "join_kei": 1, "source_shst_ref_df": 1, "source_gdf": 1, "field_nam": 1, "get_managed_lane_node_id": 1, "nodes_list": 1, "4500000": 1, "237": 1, "get_modal_graph": 1, "bike_access": [1, 2], "bu": [1, 4], "bus_onli": [1, 2], "drive_access": [1, 2, 11], "rail": [1, 4], "rail_onli": [1, 2], "walk_access": [1, 2], "strongli": 1, "vertex": [1, 6], "reachabl": 1, "everi": [1, 6], "get_modal_links_nod": 1, "kept": 1, "both": [1, 4, 6, 11], "filter": [1, 6], "right": [1, 6], "now": 1, "we": [1, 11], "don": 1, "becaus": [1, 6, 11], "mark": 1, "issu": 1, "discuss": 1, "http": [1, 4, 6, 8, 11], "github": [1, 6, 8, 11], "com": [1, 4, 6, 8, 11], "wsp": [1, 8, 11], "sag": [1, 8, 11], "145": 1, "modal_nodes_df": 1, "mode_node_vari": 1, "get_property_by_time_period_and_group": 1, "prop": 1, "categori": [1, 2, 11], "default_return": 1, "seri": [1, 11], "group": 1, "16": [1, 2, 6], "00": [1, 2], "19": [1, 2], "option": [1, 6], "sov": [1, 2], "search": [1, 2, 8, 11], "hov3": [1, 2], "hov2": [1, 2], "identify_seg": 1, "o_id": 1, "d_id": 1, "endpoint": 1, "up": [1, 5, 11], "segment": [1, 4, 6, 11], "candid": 1, "otherwis": [1, 4, 6], "ram": 1, "hog": 1, "could": [1, 6, 11], "odd": 1, "shortest": 1, "segment_vari": 1, "keep": [1, 6], "identify_segment_endpoint": 1, "min_connecting_link": 1, "10": [1, 2, 6], "min_dist": 1, "max_link_devi": 1, "2": [1, 2, 4, 6, 11], "is_network_connect": [1, 11], "consid": [1, 6, 11], "cach": 1, "long": [1, 6], "load_transform_network": 1, "node_filenam": [1, 11], "link_filenam": [1, 11], "shape_filenam": [1, 11], "4326": [1, 6], "validate_schema": 1, "disk": 1, "schema": [1, 11], "shapes_df": [1, 11], "network_connection_plot": 1, "g": [1, 6], "disconnected_subgraph_nod": 1, "plot": 1, "fig": 1, "ax": [1, 6], "orig_dest_nodes_foreign_kei": 1, "whatev": 1, "u": 1, "v": [1, 2, 11], "ab": 1, "noth": 1, "assum": 1, "a_id": 1, "b_id": 1, "ox_graph": 1, "unique_link_kei": 1, "model_link_id": [1, 2, 3], "arrai": [1, 6], "remov": [1, 6], "certain": 1, "do": [1, 6, 11], "too": [1, 5], "link_df": 1, "referenc": 1, "multidigraph": 1, "path_search": 1, "candidate_links_df": 1, "weight_column": 1, "weight_factor": 1, "search_breadth": 1, "5": [1, 2, 4, 6], "max_search_breadth": 1, "candidate_link": 1, "part": [1, 6, 11], "foreigh": 1, "destin": 1, "weight": 1, "iter": [1, 6], "multipli": 1, "fast": [1, 11], "recalculate_calculated_vari": [1, 3], "recalculate_dist": [1, 3], "json": [1, 11], "geojson": 1, "skip": 1, "speed": 1, "spatial": [1, 6], "etc": [1, 2, 3, 11], "re": 1, "read_match_result": 1, "lot": 1, "same": [1, 6], "concaten": 1, 
"singl": [1, 3, 6], "geopanda": [1, 11], "sure": 1, "why": 1, "util": [1, 8, 11], "rename_variables_for_dbf": 1, "input_df": 1, "variable_crosswalk": 1, "output_vari": [1, 2], "convert_geometry_to_xi": 1, "renam": [1, 3], "dbf": 1, "shp": [1, 2], "char": 1, "crosswalk": [1, 3], "x": [1, 2, 6], "y": [1, 2, 6], "roadway_net_to_gdf": 1, "roadway_net": 1, "turn": [1, 11], "export": [1, 8, 11], "sophist": 1, "attach": 1, "roadway_standard_to_met_council_network": 1, "output_epsg": [1, 2], "consist": [1, 6, 11], "metcouncil": [1, 2, 4, 8, 11], "": [1, 2, 4, 6, 11], "model": [1, 2], "expect": [1, 6], "epsg": [1, 2, 6], "output": [1, 2, 3, 4, 6], "select_roadway_featur": [1, 11], "search_mod": 1, "force_search": 1, "sp_weight_factor": 1, "satisfi": [1, 6], "criteria": 1, "net": [1, 11], "osm": [1, 2], "share": [1, 11], "street": [1, 11], "osm_model_link_id": 1, "1234": 1, "shstid": 1, "4321": 1, "regex": 1, "facil": [1, 2, 11], "main": 1, "st": [1, 2], "least": [1, 11], "perform": 1, "even": 1, "previou": 1, "discourag": 1, "meander": 1, "ref": 1, "here": [1, 6], "defaul": 1, "selection_has_unique_link_id": 1, "selection_dictionari": 1, "selection_map": 1, "selected_link_idx": 1, "candidate_link_idx": 1, "selected_links_idx": 1, "candidate_links_idx": 1, "shortest_path": 1, "graph_links_df": 1, "100": 1, "four": 1, "nx": 1, "split_properties_by_time_period_and_categori": 1, "properties_to_split": [1, 2], "split": [1, 2, 4], "structur": 1, "stratifi": 1, "times_period": 1, "am": [1, 2], "6": [1, 2, 4, 6], "pm": [1, 2], "15": [1, 2], "update_dist": 1, "use_shap": 1, "inplac": 1, "straight": 1, "avail": 1, "portion": 1, "provid": [1, 3, 4, 6, 11], "entir": 1, "crow": 1, "fly": 1, "meter": [1, 6], "nan": 1, "validate_link_schema": 1, "schema_loc": 1, "roadway_network_link": 1, "validate_node_schema": 1, "node_fil": 1, "roadway_network_nod": 1, "validate_properti": 1, "ignore_exist": 1, "require_existing_for_chang": 1, "theproject": 1, "dictonari": 1, "validate_select": 1, "selection_requir": 1, "whetther": 1, "minimum": [1, 5, 6], "validate_shape_schema": 1, "shape_fil": 1, "roadway_network_shap": 1, "validate_uniqu": 1, "confirm": 1, "met": 1, "filenam": [1, 3, 11], "were": 1, "save": 1, "write_roadway_as_fixedwidth": [1, 11], "output_dir": 1, "node_output_vari": 1, "link_output_vari": 1, "output_link_txt": [1, 2], "output_node_txt": [1, 2], "output_link_header_width_txt": [1, 2], "output_node_header_width_txt": [1, 2], "output_cube_network_script": [1, 2], "drive_onli": 1, "function": [1, 5, 6, 8, 11], "doe": [1, 4], "header": [1, 2], "3": [1, 2, 4, 6, 11], "script": [1, 2], "run": [1, 8], "databas": [1, 3], "record": 1, "write_roadway_as_shp": [1, 11], "data_to_csv": 1, "data_to_dbf": 1, "output_link_shp": [1, 2], "output_node_shp": [1, 2], "output_link_csv": [1, 2], "output_node_csv": [1, 2], "output_gpkg": 1, "output_link_gpkg_lay": 1, "output_node_gpkg_lay": 1, "output_gpkg_link_filt": 1, "gpkg": 1, "csv": [1, 2, 3, 11], "full": [1, 6], "geopackag": 1, "layer": [1, 11], "subset": 1, "calculated_valu": [1, 3], "dai": [2, 4, 11], "can": [2, 6, 8, 11], "runtim": [2, 11], "initi": [2, 3, 11], "keyword": [2, 6, 11], "argument": [2, 6, 11], "explicitli": [2, 11], "highlight": [2, 11], "attr": 2, "time_period_to_tim": 2, "abbrevi": [2, 4], "gtf": [2, 4, 11], "highwai": [2, 3, 8, 11], "ea": 2, "md": 2, "ev": 2, "cube_time_period": 2, "4": [2, 4, 6], "demand": 2, "allow": [2, 6], "suffix": 2, "truck": 2, "trk": 2, "final": 2, "lanes_am": 2, "time_periods_to_tim": 2, "shapefil": 2, "r": 2, 
"metcouncil_data": 2, "cb_2017_us_county_5m": 2, "county_variable_shp": 2, "lanes_lookup_fil": 2, "lookup": 2, "centroid_connect_lan": 2, "anoka": 2, "dakota": 2, "hennepin": 2, "ramsei": 2, "scott": 2, "washington": 2, "carver": 2, "taz_shap": 2, "tazofficialwcurrentforecast": 2, "taz_data": 2, "highest": 2, "3100": 2, "link_id": 2, "shstgeometryid": 2, "roadway_class": 2, "truck_access": 2, "trn_priority_ea": 2, "trn_priority_am": 2, "trn_priority_md": 2, "trn_priority_pm": 2, "trn_priority_ev": 2, "ttime_assert_ea": 2, "ttime_assert_am": 2, "ttime_assert_md": 2, "ttime_assert_pm": 2, "ttime_assert_ev": 2, "lanes_ea": 2, "lanes_md": 2, "lanes_pm": 2, "lanes_ev": 2, "price_sov_ea": 2, "price_hov2_ea": 2, "price_hov3_ea": 2, "price_truck_ea": 2, "price_sov_am": 2, "price_hov2_am": 2, "price_hov3_am": 2, "price_truck_am": 2, "price_sov_md": 2, "price_hov2_md": 2, "price_hov3_md": 2, "price_truck_md": 2, "price_sov_pm": 2, "price_hov2_pm": 2, "price_hov3_pm": 2, "price_truck_pm": 2, "price_sov_ev": 2, "price_hov2_ev": 2, "price_hov3_ev": 2, "price_truck_ev": 2, "roadway_class_idx": 2, "facility_typ": 2, "osm_node_id": [2, 6, 11], "bike_nod": 2, "transit_nod": 2, "walk_nod": 2, "drive_nod": 2, "ml_lanes_ea": 2, "ml_lanes_am": 2, "ml_lanes_md": 2, "ml_lanes_pm": 2, "ml_lanes_ev": 2, "osm_facility_type_dict": 2, "thrivemsp2040communitydesign": 2, "area_type_variable_shp": 2, "comdes2040": 2, "area_type_code_dict": 2, "23": 2, "urban": [2, 4], "center": [2, 6], "24": 2, "25": 2, "35": 2, "36": 2, "41": 2, "51": 2, "52": 2, "53": 2, "60": 2, "downtownzones_taz": 2, "mrcc_roadway_class_shap": 2, "mrcc": 2, "trans_mrcc_centerlin": 2, "mrcc_roadway_class_variable_shp": 2, "mrcc_roadway_class_shp": 2, "route_si": 2, "widot_roadway_class_shap": 2, "wisconsin": 2, "wisconsin_lanes_counts_median": 2, "wislr": 2, "widot_roadway_class_variable_shp": 2, "rdwy_ctgy_": 2, "mndot_count_shap": 2, "count_mn": 2, "aadt_2017_count_loc": 2, "osm_highway_facility_type_crosswalk": 2, "legacy_tm2_attribut": 2, "shstreferenceid": 2, "legaci": 2, "tm2": 2, "osm_lanes_attribut": 2, "tam_tm2_attribut": 2, "tam": 2, "tom_tom_attribut": 2, "tomtom": 2, "tomtom_attribut": 2, "sfcta_attribut": 2, "sfcta": 2, "geograph": 2, "102646": 2, "scratch": 2, "txt": [2, 11], "links_header_width": 2, "nodes_header_width": 2, "import": [2, 6], "make_complete_network_from_fixed_width_fil": 2, "county_link_range_dict": 2, "county_code_dict": 2, "7": [2, 4, 11], "extern": 2, "chisago": 2, "11": 2, "goodhu": 2, "12": 2, "isanti": 2, "13": 2, "le": 2, "sueur": 2, "14": 2, "mcleod": 2, "pierc": 2, "polk": 2, "17": 2, "rice": 2, "18": 2, "sherburn": 2, "siblei": 2, "20": 2, "croix": 2, "21": 2, "wright": 2, "22": 2, "route_type_bus_mode_dict": 2, "urb": 2, "loc": 2, "sub": [2, 6], "express": [2, 4, 6], "route_type_mode_dict": 2, "8": [2, 4, 6], "9": [2, 4], "cube_time_periods_nam": 2, "op": 2, "detail": [2, 5], "zone": 2, "possibl": [2, 6], "roadway_link_chang": 3, "roadway_node_chang": 3, "transit_chang": [3, 4], "base_roadway_network": 3, "base_cube_transit_network": 3, "build_cube_transit_network": 3, "project_nam": 3, "produc": [3, 6], "test_project": [3, 11], "create_project": [3, 11], "base_cube_transit_sourc": 3, "o": [3, 4, 11], "build_cube_transit_sourc": 3, "transit_route_shape_chang": [3, 11], "evaluate_chang": [3, 11], "write_project_card": [3, 11], "scratch_dir": [3, 11], "t_transit_shape_test": [3, 11], "yml": [3, 11], "default_project_nam": 3, "level": 3, "constant": 3, "static_valu": 3, "card_data": 3, "cubetransit": [3, 8], 
"bunch": 3, "projectcard": [3, 8], "case": 3, "standardtransit": [3, 8], "add_highway_chang": 3, "limit_variables_to_existing_network": 3, "hoc": [3, 11], "add_transit_chang": 3, "roadway_log_fil": 3, "roadway_shp_fil": 3, "roadway_csv_fil": 3, "network_build_fil": 3, "emme_node_id_crosswalk_fil": 3, "emme_name_crosswalk_fil": 3, "base_roadway_dir": 3, "base_transit_dir": [3, 4, 11], "consum": 3, "logfil": 3, "emm": [3, 4], "folder": 3, "base_cube_transit_fil": 3, "build_cube_transit_fil": 3, "first": [3, 6], "recalcul": 3, "determine_roadway_network_changes_compat": 3, "emme_id_to_wrangler_id": 3, "emme_link_change_df": 3, "emme_node_change_df": 3, "emme_transit_changes_df": 3, "rewrit": 3, "wrangler": [3, 8, 11], "emme_name_to_wrangler_nam": 3, "aggreg": 3, "get_object_from_network_build_command": 3, "row": [3, 4], "histori": 3, "l": 3, "get_operation_from_network_build_command": 3, "action": 3, "c": [3, 6], "d": [3, 6], "read_logfil": 3, "logfilenam": 3, "reprsent": [3, 11], "read_network_build_fil": 3, "networkbuildfilenam": 3, "nework": 3, "assign_group": 3, "user": [3, 11], "TO": 3, "ptg_feed": 4, "hold": [4, 11], "feed": [4, 11], "partridg": [4, 11], "manipul": [4, 11], "cube_transit_net": [4, 11], "read_gtf": [4, 11], "write_as_cube_lin": [4, 11], "write_dir": [4, 11], "outfil": [4, 11], "calculate_cube_mod": 4, "assign": 4, "logic": 4, "route_typ": 4, "develop": [4, 11], "googl": 4, "cube_mod": 4, "route_type_to_cube_mod": 4, "tram": 4, "streetcar": 4, "light": 4, "further": 4, "disaggreg": 4, "buse": 4, "suburban": 4, "longnam": 4, "lower": [4, 6], "elif": 4, "99": 4, "local": [4, 11], "els": [4, 6], "route_long_nam": 4, "cube_format": 4, "represnt": 4, "notat": 4, "trip": 4, "trip_id": 4, "shape_id": [4, 11], "tod": 4, "onewai": 4, "oper": [4, 6], "fromtransitnetwork": 4, "transit_network_object": 4, "modelroadwaynetwork": [4, 8], "gtfs_feed_dir": 4, "route_properties_gtfs_to_cub": 4, "prepar": 4, "trip_df": 4, "shape_gtfs_to_cub": 4, "add_nntim": 4, "shape_gtfs_to_emm": 4, "trip_row": 4, "time_to_cube_time_period": 4, "start_time_sec": 4, "as_str": 4, "verbos": 4, "midnight": [4, 6], "this_tp": 4, "this_tp_num": 4, "outpath": 4, "after": 4, "setuplog": 5, "infologfilenam": 5, "debuglogfilenam": 5, "logtoconsol": 5, "infolog": 5, "ters": 5, "just": 5, "give": 5, "bare": 5, "composit": 5, "clear": 5, "later": 5, "debuglog": 5, "veri": [5, 6], "noisi": 5, "debug": 5, "spew": 5, "consol": 5, "point": 6, "arg": 6, "basegeometri": 6, "possibli": 6, "z": 6, "zero": 6, "dimension": 6, "float": 6, "sequenc": 6, "individu": 6, "p": 6, "print": 6, "almost_equ": 6, "decim": 6, "equal": 6, "place": 6, "deprec": 6, "sinc": 6, "confus": 6, "equals_exact": 6, "instead": 6, "approxim": 6, "compon": [6, 8], "linestr": 6, "1e": 6, "buffer": 6, "quad_seg": 6, "cap_styl": 6, "round": 6, "join_styl": 6, "mitre_limit": 6, "single_sid": 6, "dilat": 6, "eros": 6, "small": 6, "mai": 6, "sometim": 6, "tidi": 6, "polygon": 6, "around": [6, 8, 11], "resolut": 6, "angl": 6, "fillet": 6, "buffercapstyl": 6, "squar": 6, "flat": 6, "circular": 6, "rectangular": 6, "while": 6, "involv": 6, "bufferjoinstyl": 6, "mitr": 6, "bevel": 6, "midpoint": 6, "edg": [6, 8], "touch": 6, "depend": 6, "limit": 6, "ratio": 6, "sharp": 6, "corner": 6, "offset": 6, "meet": 6, "miter": 6, "extend": 6, "To": [6, 11], "prevent": 6, "unreason": 6, "control": 6, "maximum": 6, "exce": 6, "side": 6, "sign": 6, "left": 6, "hand": 6, "regular": 6, "cap": 6, "alwai": 6, "forc": 6, "equival": 6, "cap_flat": 6, "quadseg": 6, "alia": 6, 
"strictli": 6, "wkt": 6, "load": 6, "gon": 6, "approx": 6, "radiu": 6, "circl": 6, "1365484905459": 6, "128": 6, "141513801144": 6, "triangl": 6, "exterior": 6, "coord": 6, "contains_properli": 6, "complet": 6, "common": 6, "document": [6, 11], "covered_bi": 6, "cover": 6, "cross": 6, "grid_siz": 6, "disjoint": 6, "unitless": 6, "dwithin": 6, "given": [6, 11], "topolog": 6, "toler": 6, "comparison": 6, "geometrytyp": 6, "hausdorff_dist": 6, "hausdorff": 6, "interpol": 6, "normal": 6, "along": 6, "linear": 6, "taken": 6, "measur": 6, "revers": 6, "rang": 6, "index": [6, 8], "handl": 6, "clamp": 6, "interpret": 6, "fraction": 6, "line_interpolate_point": 6, "intersect": 6, "line_locate_point": 6, "nearest": 6, "form": 6, "canon": 6, "ring": 6, "multi": 6, "multilinestr": 6, "overlap": 6, "point_on_surfac": 6, "guarante": 6, "cheapli": 6, "representative_point": 6, "relat": 6, "de": 6, "9im": 6, "matrix": 6, "relate_pattern": 6, "pattern": 6, "relationship": 6, "interior": 6, "unchang": 6, "is_ccw": 6, "clockwis": 6, "max_segment_length": 6, "vertic": 6, "longer": 6, "evenli": 6, "subdivid": 6, "densifi": 6, "unmodifi": 6, "array_lik": 6, "greater": 6, "simplifi": 6, "preserve_topologi": 6, "dougla": 6, "peucker": 6, "algorithm": 6, "unless": 6, "topologi": 6, "preserv": 6, "invalid": 6, "svg": 6, "scale_factor": 6, "fill_color": 6, "opac": 6, "element": 6, "factor": 6, "diamet": 6, "hex": 6, "color": 6, "66cc99": 6, "ff3333": 6, "symmetric_differ": 6, "symmetr": 6, "union": 6, "dimens": 6, "bound": 6, "collect": 6, "empti": 6, "null": 6, "minx": 6, "mini": 6, "maxx": 6, "maxi": 6, "geometr": 6, "convex_hul": 6, "convex": 6, "hull": 6, "less": 6, "three": [6, 11], "multipoint": 6, "triangular": 6, "imagin": 6, "elast": 6, "band": 6, "stretch": 6, "coordinatesequ": 6, "envelop": 6, "figur": 6, "geom_typ": 6, "has_z": 6, "is_clos": 6, "close": 6, "applic": 6, "is_empti": 6, "is_r": 6, "is_simpl": 6, "simpl": 6, "mean": 6, "is_valid": 6, "definit": 6, "minimum_clear": 6, "move": 6, "minimum_rotated_rectangl": 6, "orient": 6, "rotat": 6, "rectangl": 6, "enclos": 6, "unlik": 6, "constrain": 6, "degener": 6, "oriented_envelop": 6, "wkb": 6, "wkb_hex": 6, "xy": 6, "shell": 6, "hole": 6, "It": [6, 8], "space": 6, "pair": [6, 11], "tripl": 6, "abov": 6, "classmethod": 6, "from_bound": 6, "xmin": 6, "ymin": 6, "xmax": 6, "ymax": 6, "stroke": 6, "partial": 6, "func": 6, "futur": 6, "call": 6, "column_name_to_part": 6, "create_locationrefer": 6, "geodesic_point_buff": 6, "lat": 6, "lon": 6, "get_shared_streets_intersection_hash": 6, "per": [6, 11], "sharedstreet": 6, "j": 6, "blob": 6, "0e6d7de0aee2e9ae3b007d1e45284b06cc241d02": 6, "src": 6, "l553": 6, "l565": 6, "93": 6, "0965985": 6, "44": 6, "952112199999995": 6, "954734870": 6, "69f13f881649cb21ee3b359730790bb9": 6, "hhmmss_to_datetim": 6, "hhmmss_str": 6, "datetim": 6, "hh": 6, "mm": 6, "ss": 6, "dt": 6, "secs_to_datetim": 6, "sec": 6, "shorten_nam": 6, "geom": 6, "xp": 6, "yp": 6, "zp": 6, "shall": 6, "ident": 6, "def": 6, "id_func": 6, "g2": 6, "g1": 6, "pyproj": 6, "accur": 6, "wgs84": 6, "utm": 6, "32618": 6, "from_cr": 6, "always_xi": 6, "support": 6, "lambda": 6, "unidecod": 6, "error": 6, "replace_str": 6, "transliter": 6, "unicod": 6, "ascii": 6, "\u5317\u4eb0": 6, "bei": 6, "jing": 6, "tri": 6, "codec": 6, "charact": 6, "fall": 6, "back": 6, "five": 6, "faster": 6, "slightli": 6, "slower": 6, "unicode_expect_nonascii": 6, "present": 6, "replac": [6, 11], "strict": 6, "rais": 6, "unidecodeerror": 6, "substitut": 6, "might": [6, 11], 
"packag": [8, 11], "aim": 8, "networkwrangl": [8, 11], "refin": 8, "metropolitan": 8, "council": 8, "citilab": 8, "softwar": [8, 11], "instal": 8, "bleed": 8, "clone": 8, "brief": 8, "intro": 8, "workflow": 8, "quickstart": 8, "jupyt": 8, "notebook": 8, "setup": 8, "scenario": 8, "audit": 8, "report": 8, "logger": 8, "modul": 8, "page": 8, "suggest": 11, "virtualenv": 11, "conda": 11, "virtual": 11, "environ": 11, "recommend": 11, "pip": 11, "lasso": 11, "config": 11, "channel": 11, "forg": 11, "rtree": 11, "my_lasso_environ": 11, "activ": 11, "git": 11, "master": 11, "pypi": 11, "repositori": 11, "date": 11, "branch": 11, "work": 11, "your": 11, "machin": 11, "edit": 11, "plan": 11, "well": 11, "cd": 11, "team": 11, "contribut": 11, "bxack": 11, "pleas": 11, "fork": 11, "befor": 11, "upstream": 11, "tag": 11, "branchnam": 11, "frequent": 11, "instruct": 11, "good": 11, "atom": 11, "sublim": 11, "text": 11, "syntax": 11, "desktop": 11, "built": 11, "mashup": 11, "open": 11, "In": 11, "nest": 11, "span": 11, "implement": 11, "novel": 11, "travel": 11, "break": 11, "publictransport": 11, "done": 11, "gui": 11, "public": 11, "transport": 11, "infrastructur": 11, "servic": 11, "tier": 11, "made": 11, "mainli": 11, "my_link_fil": 11, "my_node_fil": 11, "my_shape_fil": 11, "my_select": 11, "35e": 11, "961117623": 11, "2564047368": 11, "my_chang": 11, "my_net": 11, "ml_net": 11, "_": 11, "disconnected_nod": 11, "my_out_prefix": 11, "my_dir": 11, "my_base_scenario": 11, "road_net": 11, "stpaul_link_fil": 11, "stpaul_node_fil": 11, "stpaul_shape_fil": 11, "transit_net": 11, "stpaul_dir": 11, "card_filenam": 11, "3_multiple_roadway_attribute_chang": 11, "multiple_chang": 11, "4_simple_managed_lan": 11, "project_card_directori": 11, "project_card": 11, "project_cards_list": 11, "my_scenario": 11, "create_scenario": 11, "base_scenario": 11, "check_scenario_requisit": 11, "apply_all_project": 11, "scenario_summari": 11, "base_transit_sourc": 11, "build_transit_sourc": 11, "understand": 11, "how": 11, "overrid": 11, "those": 11, "instanti": 11, "yaml": 11, "configur": 11, "config_fil": 11, "f": 11, "my_config": 11, "safe_load": 11, "model_road_net": 11, "my_paramet": 11, "accomplish": 11, "goal": 11, "top": 11, "learn": 11, "basic": 11, "creation": 11, "ipynb": 11}, "objects": {"": [[7, 0, 0, "-", "lasso"]], "lasso": [[0, 1, 1, "", "CubeTransit"], [1, 1, 1, "", "ModelRoadwayNetwork"], [2, 1, 1, "", "Parameters"], [3, 1, 1, "", "Project"], [4, 1, 1, "", "StandardTransit"], [5, 0, 0, "-", "logger"], [6, 0, 0, "-", "util"]], "lasso.CubeTransit": [[0, 2, 1, "", "__init__"], [0, 2, 1, "", "add_additional_time_periods"], [0, 2, 1, "", "add_cube"], [0, 2, 1, "", "build_route_name"], [0, 2, 1, "", "calculate_start_end_times"], [0, 2, 1, "", "create_add_route_card_dict"], [0, 2, 1, "", "create_delete_route_card_dict"], [0, 2, 1, "", "create_from_cube"], [0, 2, 1, "", "create_update_route_card_dict"], [0, 2, 1, "", "cube_properties_to_standard_properties"], [0, 3, 1, "", "diff_dict"], [0, 2, 1, "", "evaluate_differences"], [0, 2, 1, "", "evaluate_route_property_differences"], [0, 2, 1, "", "evaluate_route_shape_changes"], [0, 2, 1, "", "get_time_period_numbers_from_cube_properties"], [0, 3, 1, "", "line_properties"], [0, 3, 1, "", "lines"], [0, 3, 1, "", "parameters"], [0, 3, 1, "", "program_type"], [0, 3, 1, "", "shapes"], [0, 3, 1, "", "source_list"], [0, 2, 1, "", "unpack_route_name"]], "lasso.ModelRoadwayNetwork": [[1, 3, 1, "", "CALCULATED_VALUES"], [1, 2, 1, "", "__init__"], [1, 2, 1, "", "add_counts"], [1, 
2, 1, "", "add_incident_link_data_to_nodes"], [1, 2, 1, "", "add_new_roadway_feature_change"], [1, 2, 1, "", "add_variable_using_shst_reference"], [1, 2, 1, "", "addition_map"], [1, 2, 1, "", "apply"], [1, 2, 1, "", "apply_managed_lane_feature_change"], [1, 2, 1, "", "apply_python_calculation"], [1, 2, 1, "", "apply_roadway_feature_change"], [1, 2, 1, "", "assess_connectivity"], [1, 2, 1, "", "build_selection_key"], [1, 2, 1, "", "calculate_area_type"], [1, 2, 1, "", "calculate_centroidconnect"], [1, 2, 1, "", "calculate_county"], [1, 2, 1, "", "calculate_distance"], [1, 2, 1, "", "calculate_mpo"], [1, 2, 1, "", "calculate_use"], [1, 2, 1, "", "convert_int"], [1, 2, 1, "", "create_ML_variable"], [1, 2, 1, "", "create_calculated_variables"], [1, 2, 1, "", "create_dummy_connector_links"], [1, 2, 1, "", "create_hov_corridor_variable"], [1, 2, 1, "", "create_managed_lane_network"], [1, 2, 1, "", "create_managed_variable"], [1, 2, 1, "", "dataframe_to_fixed_width"], [1, 2, 1, "", "delete_roadway_feature_change"], [1, 2, 1, "", "deletion_map"], [1, 2, 1, "", "fill_na"], [1, 2, 1, "", "from_RoadwayNetwork"], [1, 2, 1, "", "get_attribute"], [1, 2, 1, "", "get_managed_lane_node_ids"], [1, 2, 1, "", "get_modal_graph"], [1, 2, 1, "", "get_modal_links_nodes"], [1, 2, 1, "", "get_property_by_time_period_and_group"], [1, 2, 1, "", "identify_segment"], [1, 2, 1, "", "identify_segment_endpoints"], [1, 2, 1, "", "is_network_connected"], [1, 2, 1, "", "load_transform_network"], [1, 2, 1, "", "network_connection_plot"], [1, 2, 1, "", "orig_dest_nodes_foreign_key"], [1, 2, 1, "", "ox_graph"], [1, 2, 1, "", "path_search"], [1, 2, 1, "", "read"], [1, 2, 1, "", "read_match_result"], [1, 2, 1, "", "rename_variables_for_dbf"], [1, 2, 1, "", "roadway_net_to_gdf"], [1, 2, 1, "", "roadway_standard_to_met_council_network"], [1, 2, 1, "", "select_roadway_features"], [1, 2, 1, "", "selection_has_unique_link_id"], [1, 2, 1, "", "selection_map"], [1, 2, 1, "", "shortest_path"], [1, 2, 1, "", "split_properties_by_time_period_and_category"], [1, 2, 1, "", "update_distance"], [1, 2, 1, "", "validate_link_schema"], [1, 2, 1, "", "validate_node_schema"], [1, 2, 1, "", "validate_properties"], [1, 2, 1, "", "validate_selection"], [1, 2, 1, "", "validate_shape_schema"], [1, 2, 1, "", "validate_uniqueness"], [1, 2, 1, "", "write"], [1, 2, 1, "", "write_roadway_as_fixedwidth"], [1, 2, 1, "", "write_roadway_as_shp"]], "lasso.Parameters": [[2, 2, 1, "", "__init__"], [2, 3, 1, "", "county_link_range_dict"], [2, 3, 1, "", "cube_time_periods"], [2, 3, 1, "", "properties_to_split"], [2, 3, 1, "", "zones"]], "lasso.Project": [[3, 3, 1, "", "CALCULATED_VALUES"], [3, 3, 1, "id0", "DEFAULT_PROJECT_NAME"], [3, 3, 1, "id1", "STATIC_VALUES"], [3, 2, 1, "", "__init__"], [3, 2, 1, "", "add_highway_changes"], [3, 2, 1, "", "add_transit_changes"], [3, 3, 1, "", "base_cube_transit_network"], [3, 3, 1, "", "base_roadway_network"], [3, 3, 1, "", "build_cube_transit_network"], [3, 3, 1, "", "card_data"], [3, 2, 1, "", "create_project"], [3, 2, 1, "", "determine_roadway_network_changes_compatibility"], [3, 2, 1, "", "emme_id_to_wrangler_id"], [3, 2, 1, "", "emme_name_to_wrangler_name"], [3, 2, 1, "", "evaluate_changes"], [3, 2, 1, "", "get_object_from_network_build_command"], [3, 2, 1, "", "get_operation_from_network_build_command"], [3, 3, 1, "", "parameters"], [3, 3, 1, "", "project_name"], [3, 2, 1, "", "read_logfile"], [3, 2, 1, "", "read_network_build_file"], [3, 3, 1, "", "roadway_link_changes"], [3, 3, 1, "", "roadway_node_changes"], [3, 3, 1, 
"", "transit_changes"], [3, 2, 1, "", "write_project_card"]], "lasso.StandardTransit": [[4, 2, 1, "", "__init__"], [4, 2, 1, "", "calculate_cube_mode"], [4, 2, 1, "", "cube_format"], [4, 2, 1, "", "evaluate_differences"], [4, 3, 1, "", "feed"], [4, 2, 1, "", "fromTransitNetwork"], [4, 3, 1, "", "parameters"], [4, 2, 1, "", "read_gtfs"], [4, 2, 1, "", "route_properties_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_emme"], [4, 2, 1, "", "time_to_cube_time_period"], [4, 2, 1, "", "write_as_cube_lin"]], "lasso.logger": [[5, 4, 1, "", "setupLogging"]], "lasso.util": [[6, 1, 1, "", "Point"], [6, 1, 1, "", "Polygon"], [6, 4, 1, "", "column_name_to_parts"], [6, 4, 1, "", "create_locationreference"], [6, 4, 1, "", "geodesic_point_buffer"], [6, 4, 1, "", "get_shared_streets_intersection_hash"], [6, 4, 1, "", "hhmmss_to_datetime"], [6, 1, 1, "", "partial"], [6, 4, 1, "", "secs_to_datetime"], [6, 4, 1, "", "shorten_name"], [6, 4, 1, "", "transform"], [6, 4, 1, "", "unidecode"]], "lasso.util.Point": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "x"], [6, 5, 1, "", "xy"], [6, 5, 1, "", "y"], [6, 5, 1, "", "z"]], "lasso.util.Polygon": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "id0", "exterior"], [6, 2, 1, "", "from_bounds"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 5, 1, "id1", "interiors"], [6, 2, 1, "", 
"interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "xy"]], "lasso.util.partial": [[6, 3, 1, "", "args"], [6, 3, 1, "", "func"], [6, 3, 1, "", "keywords"]]}, "objtypes": {"0": "py:module", "1": "py:class", "2": "py:method", "3": "py:attribute", "4": "py:function", "5": "py:property"}, "objnames": {"0": ["py", "module", "Python module"], "1": ["py", "class", "Python class"], "2": ["py", "method", "Python method"], "3": ["py", "attribute", "Python attribute"], "4": ["py", "function", "Python function"], "5": ["py", "property", "Python property"]}, "titleterms": {"lasso": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], "cubetransit": [0, 11], "modelroadwaynetwork": [1, 11], "todo": 1, "paramet": [2, 10, 11], "project": [3, 9, 10, 11], "standardtransit": [4, 11], "logger": 5, "util": [6, 7], "class": 7, "function": 7, "base": 7, "welcom": 8, "": 8, "document": 8, "content": 8, "indic": 8, "tabl": 8, "run": [9, 11], "creat": 9, "file": [9, 10, 11], "scenario": [9, 11], "export": 9, "network": [9, 11], "audit": 9, "report": 9, "setup": 10, "set": 10, "addit": 10, "data": 10, "start": 11, "out": 11, "instal": 11, "bleed": 11, "edg": 11, "from": 11, "clone": 11, "brief": 11, "intro": 11, "compon": 11, "roadwaynetwork": 11, "transitnetwork": 11, "projectcard": 11, "typic": 11, "workflow": 11, "card": 11, "transit": 11, "lin": 11, "cube": 11, "log": 11, "model": 11, "quickstart": 11, "jupyt": 11, "notebook": 11}, "envversion": {"sphinx.domains.c": 3, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 9, "sphinx.domains.index": 1, "sphinx.domains.javascript": 3, "sphinx.domains.math": 2, "sphinx.domains.python": 4, "sphinx.domains.rst": 2, "sphinx.domains.std": 2, "sphinx.ext.intersphinx": 1, "sphinx.ext.todo": 2, "sphinx.ext.viewcode": 1, "sphinx": 58}, "alltitles": {"lasso.CubeTransit": [[0, "lasso-cubetransit"]], "lasso.ModelRoadwayNetwork": [[1, "lasso-modelroadwaynetwork"]], "Todo": [[1, "id1"], [1, "id2"], [1, "id3"], [1, "id4"], [1, "id5"], [1, "id6"]], "lasso.Parameters": [[2, "lasso-parameters"]], "lasso.Project": [[3, "lasso-project"]], "lasso.StandardTransit": [[4, "lasso-standardtransit"]], "lasso.logger": [[5, "module-lasso.logger"]], "lasso.util": [[6, "module-lasso.util"]], "Lasso Classes and Functions": [[7, "module-lasso"]], "Base Classes": [[7, "base-classes"]], "Utils and Functions": [[7, "utils-and-functions"]], "Welcome to lasso\u2019s documentation!": [[8, "welcome-to-lasso-s-documentation"]], "Contents:": [[8, null]], "Indices and tables": [[8, "indices-and-tables"]], "Running Lasso": [[9, "running-lasso"]], "Create project files": [[9, "create-project-files"]], 
"Create a scenario": [[9, "create-a-scenario"]], "Exporting networks": [[9, "exporting-networks"]], "Auditing and Reporting": [[9, "auditing-and-reporting"]], "Setup": [[10, "setup"]], "Projects": [[10, "projects"]], "Parameters": [[10, "parameters"], [11, "parameters"]], "Settings": [[10, "settings"]], "Additional Data Files": [[10, "additional-data-files"]], "Starting Out": [[11, "starting-out"]], "Installation": [[11, "installation"]], "Bleeding Edge": [[11, "bleeding-edge"]], "From Clone": [[11, "from-clone"]], "Brief Intro": [[11, "brief-intro"]], "Components": [[11, "components"]], "RoadwayNetwork": [[11, "roadwaynetwork"]], "TransitNetwork": [[11, "transitnetwork"]], "ProjectCard": [[11, "projectcard"]], "Scenario": [[11, "scenario"]], "Project": [[11, "project"]], "ModelRoadwayNetwork": [[11, "modelroadwaynetwork"]], "StandardTransit": [[11, "standardtransit"]], "CubeTransit": [[11, "cubetransit"]], "Typical Workflow": [[11, "typical-workflow"]], "Project Cards from Transit LIN Files": [[11, "project-cards-from-transit-lin-files"]], "Project Cards from Cube LOG Files": [[11, "project-cards-from-cube-log-files"]], "Model Network Files for a Scenario": [[11, "model-network-files-for-a-scenario"]], "Running Quickstart Jupyter Notebooks": [[11, "running-quickstart-jupyter-notebooks"]]}, "indexentries": {"cubetransit (class in lasso)": [[0, "lasso.CubeTransit"]], "__init__() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.__init__"]], "add_additional_time_periods() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_additional_time_periods"]], "add_cube() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_cube"]], "build_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.build_route_name"]], "calculate_start_end_times() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.calculate_start_end_times"]], "create_add_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_add_route_card_dict"]], "create_delete_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_delete_route_card_dict"]], "create_from_cube() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.create_from_cube"]], "create_update_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_update_route_card_dict"]], "cube_properties_to_standard_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.cube_properties_to_standard_properties"]], "diff_dict (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.diff_dict"]], "evaluate_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_differences"]], "evaluate_route_property_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_route_property_differences"]], "evaluate_route_shape_changes() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.evaluate_route_shape_changes"]], "get_time_period_numbers_from_cube_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.get_time_period_numbers_from_cube_properties"]], "line_properties (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.line_properties"]], "lines (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.lines"]], "parameters (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.parameters"]], "program_type (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.program_type"]], "shapes (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.shapes"]], "source_list (lasso.cubetransit attribute)": [[0, 
"lasso.CubeTransit.source_list"]], "unpack_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.unpack_route_name"]], "calculated_values (lasso.modelroadwaynetwork attribute)": [[1, "lasso.ModelRoadwayNetwork.CALCULATED_VALUES"]], "modelroadwaynetwork (class in lasso)": [[1, "lasso.ModelRoadwayNetwork"]], "__init__() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.__init__"]], "add_counts() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_counts"]], "add_incident_link_data_to_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.add_incident_link_data_to_nodes"]], "add_new_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_new_roadway_feature_change"]], "add_variable_using_shst_reference() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_variable_using_shst_reference"]], "addition_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.addition_map"]], "apply() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply"]], "apply_managed_lane_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_managed_lane_feature_change"]], "apply_python_calculation() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_python_calculation"]], "apply_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_roadway_feature_change"]], "assess_connectivity() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.assess_connectivity"]], "build_selection_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.build_selection_key"]], "calculate_area_type() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_area_type"]], "calculate_centroidconnect() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_centroidconnect"]], "calculate_county() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_county"]], "calculate_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_distance"]], "calculate_mpo() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_mpo"]], "calculate_use() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_use"]], "convert_int() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.convert_int"]], "create_ml_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_ML_variable"]], "create_calculated_variables() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_calculated_variables"]], "create_dummy_connector_links() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_dummy_connector_links"]], "create_hov_corridor_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_hov_corridor_variable"]], "create_managed_lane_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_lane_network"]], "create_managed_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_variable"]], "dataframe_to_fixed_width() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.dataframe_to_fixed_width"]], "delete_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, 
"lasso.ModelRoadwayNetwork.delete_roadway_feature_change"]], "deletion_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.deletion_map"]], "fill_na() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.fill_na"]], "from_roadwaynetwork() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.from_RoadwayNetwork"]], "get_attribute() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_attribute"]], "get_managed_lane_node_ids() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_managed_lane_node_ids"]], "get_modal_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_graph"]], "get_modal_links_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_links_nodes"]], "get_property_by_time_period_and_group() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.get_property_by_time_period_and_group"]], "identify_segment() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment"]], "identify_segment_endpoints() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment_endpoints"]], "is_network_connected() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.is_network_connected"]], "load_transform_network() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.load_transform_network"]], "network_connection_plot() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.network_connection_plot"]], "orig_dest_nodes_foreign_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.orig_dest_nodes_foreign_key"]], "ox_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.ox_graph"]], "path_search() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.path_search"]], "read() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read"]], "read_match_result() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read_match_result"]], "rename_variables_for_dbf() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.rename_variables_for_dbf"]], "roadway_net_to_gdf() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.roadway_net_to_gdf"]], "roadway_standard_to_met_council_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.roadway_standard_to_met_council_network"]], "select_roadway_features() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.select_roadway_features"]], "selection_has_unique_link_id() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_has_unique_link_id"]], "selection_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_map"]], "shortest_path() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.shortest_path"]], "split_properties_by_time_period_and_category() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.split_properties_by_time_period_and_category"]], "update_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.update_distance"]], "validate_link_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_link_schema"]], "validate_node_schema() (lasso.modelroadwaynetwork static method)": [[1, 
"lasso.ModelRoadwayNetwork.validate_node_schema"]], "validate_properties() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_properties"]], "validate_selection() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_selection"]], "validate_shape_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_shape_schema"]], "validate_uniqueness() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_uniqueness"]], "write() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write"]], "write_roadway_as_fixedwidth() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_fixedwidth"]], "write_roadway_as_shp() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_shp"]], "parameters (class in lasso)": [[2, "lasso.Parameters"]], "__init__() (lasso.parameters method)": [[2, "lasso.Parameters.__init__"]], "county_link_range_dict (lasso.parameters attribute)": [[2, "lasso.Parameters.county_link_range_dict"]], "cube_time_periods (lasso.parameters attribute)": [[2, "lasso.Parameters.cube_time_periods"]], "properties_to_split (lasso.parameters attribute)": [[2, "lasso.Parameters.properties_to_split"]], "zones (lasso.parameters attribute)": [[2, "lasso.Parameters.zones"]], "calculated_values (lasso.project attribute)": [[3, "lasso.Project.CALCULATED_VALUES"]], "default_project_name (lasso.project attribute)": [[3, "id0"], [3, "lasso.Project.DEFAULT_PROJECT_NAME"]], "project (class in lasso)": [[3, "lasso.Project"]], "static_values (lasso.project attribute)": [[3, "id1"], [3, "lasso.Project.STATIC_VALUES"]], "__init__() (lasso.project method)": [[3, "lasso.Project.__init__"]], "add_highway_changes() (lasso.project method)": [[3, "lasso.Project.add_highway_changes"]], "add_transit_changes() (lasso.project method)": [[3, "lasso.Project.add_transit_changes"]], "base_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.base_cube_transit_network"]], "base_roadway_network (lasso.project attribute)": [[3, "lasso.Project.base_roadway_network"]], "build_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.build_cube_transit_network"]], "card_data (lasso.project attribute)": [[3, "lasso.Project.card_data"]], "create_project() (lasso.project static method)": [[3, "lasso.Project.create_project"]], "determine_roadway_network_changes_compatibility() (lasso.project static method)": [[3, "lasso.Project.determine_roadway_network_changes_compatibility"]], "emme_id_to_wrangler_id() (lasso.project static method)": [[3, "lasso.Project.emme_id_to_wrangler_id"]], "emme_name_to_wrangler_name() (lasso.project static method)": [[3, "lasso.Project.emme_name_to_wrangler_name"]], "evaluate_changes() (lasso.project method)": [[3, "lasso.Project.evaluate_changes"]], "get_object_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_object_from_network_build_command"]], "get_operation_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_operation_from_network_build_command"]], "parameters (lasso.project attribute)": [[3, "lasso.Project.parameters"]], "project_name (lasso.project attribute)": [[3, "lasso.Project.project_name"]], "read_logfile() (lasso.project static method)": [[3, "lasso.Project.read_logfile"]], "read_network_build_file() (lasso.project static method)": [[3, "lasso.Project.read_network_build_file"]], "roadway_link_changes (lasso.project attribute)": [[3, 
"lasso.Project.roadway_link_changes"]], "roadway_node_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_node_changes"]], "transit_changes (lasso.project attribute)": [[3, "lasso.Project.transit_changes"]], "write_project_card() (lasso.project method)": [[3, "lasso.Project.write_project_card"]], "standardtransit (class in lasso)": [[4, "lasso.StandardTransit"]], "__init__() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.__init__"]], "calculate_cube_mode() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.calculate_cube_mode"]], "cube_format() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.cube_format"]], "evaluate_differences() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.evaluate_differences"]], "feed (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.feed"]], "fromtransitnetwork() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.fromTransitNetwork"]], "parameters (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.parameters"]], "read_gtfs() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.read_gtfs"]], "route_properties_gtfs_to_cube() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.route_properties_gtfs_to_cube"]], "shape_gtfs_to_cube() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_cube"]], "shape_gtfs_to_emme() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_emme"]], "time_to_cube_time_period() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.time_to_cube_time_period"]], "write_as_cube_lin() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.write_as_cube_lin"]], "lasso.logger": [[5, "module-lasso.logger"]], "module": [[5, "module-lasso.logger"], [6, "module-lasso.util"], [7, "module-lasso"]], "setuplogging() (in module lasso.logger)": [[5, "lasso.logger.setupLogging"]], "point (class in lasso.util)": [[6, "lasso.util.Point"]], "polygon (class in lasso.util)": [[6, "lasso.util.Polygon"]], "almost_equals() (lasso.util.point method)": [[6, "lasso.util.Point.almost_equals"]], "almost_equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.almost_equals"]], "area (lasso.util.point property)": [[6, "lasso.util.Point.area"]], "area (lasso.util.polygon property)": [[6, "lasso.util.Polygon.area"]], "args (lasso.util.partial attribute)": [[6, "lasso.util.partial.args"]], "boundary (lasso.util.point property)": [[6, "lasso.util.Point.boundary"]], "boundary (lasso.util.polygon property)": [[6, "lasso.util.Polygon.boundary"]], "bounds (lasso.util.point property)": [[6, "lasso.util.Point.bounds"]], "bounds (lasso.util.polygon property)": [[6, "lasso.util.Polygon.bounds"]], "buffer() (lasso.util.point method)": [[6, "lasso.util.Point.buffer"]], "buffer() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.buffer"]], "centroid (lasso.util.point property)": [[6, "lasso.util.Point.centroid"]], "centroid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.centroid"]], "column_name_to_parts() (in module lasso.util)": [[6, "lasso.util.column_name_to_parts"]], "contains() (lasso.util.point method)": [[6, "lasso.util.Point.contains"]], "contains() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains"]], "contains_properly() (lasso.util.point method)": [[6, "lasso.util.Point.contains_properly"]], "contains_properly() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains_properly"]], "convex_hull (lasso.util.point property)": [[6, 
"lasso.util.Point.convex_hull"]], "convex_hull (lasso.util.polygon property)": [[6, "lasso.util.Polygon.convex_hull"]], "coords (lasso.util.point property)": [[6, "lasso.util.Point.coords"]], "coords (lasso.util.polygon property)": [[6, "lasso.util.Polygon.coords"]], "covered_by() (lasso.util.point method)": [[6, "lasso.util.Point.covered_by"]], "covered_by() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covered_by"]], "covers() (lasso.util.point method)": [[6, "lasso.util.Point.covers"]], "covers() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covers"]], "create_locationreference() (in module lasso.util)": [[6, "lasso.util.create_locationreference"]], "crosses() (lasso.util.point method)": [[6, "lasso.util.Point.crosses"]], "crosses() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.crosses"]], "difference() (lasso.util.point method)": [[6, "lasso.util.Point.difference"]], "difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.difference"]], "disjoint() (lasso.util.point method)": [[6, "lasso.util.Point.disjoint"]], "disjoint() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.disjoint"]], "distance() (lasso.util.point method)": [[6, "lasso.util.Point.distance"]], "distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.distance"]], "dwithin() (lasso.util.point method)": [[6, "lasso.util.Point.dwithin"]], "dwithin() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.dwithin"]], "envelope (lasso.util.point property)": [[6, "lasso.util.Point.envelope"]], "envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.envelope"]], "equals() (lasso.util.point method)": [[6, "lasso.util.Point.equals"]], "equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals"]], "equals_exact() (lasso.util.point method)": [[6, "lasso.util.Point.equals_exact"]], "equals_exact() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals_exact"]], "exterior (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.exterior"]], "exterior (lasso.util.polygon property)": [[6, "id0"]], "from_bounds() (lasso.util.polygon class method)": [[6, "lasso.util.Polygon.from_bounds"]], "func (lasso.util.partial attribute)": [[6, "lasso.util.partial.func"]], "geodesic_point_buffer() (in module lasso.util)": [[6, "lasso.util.geodesic_point_buffer"]], "geom_type (lasso.util.point property)": [[6, "lasso.util.Point.geom_type"]], "geom_type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.geom_type"]], "geometrytype() (lasso.util.point method)": [[6, "lasso.util.Point.geometryType"]], "geometrytype() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.geometryType"]], "get_shared_streets_intersection_hash() (in module lasso.util)": [[6, "lasso.util.get_shared_streets_intersection_hash"]], "has_z (lasso.util.point property)": [[6, "lasso.util.Point.has_z"]], "has_z (lasso.util.polygon property)": [[6, "lasso.util.Polygon.has_z"]], "hausdorff_distance() (lasso.util.point method)": [[6, "lasso.util.Point.hausdorff_distance"]], "hausdorff_distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.hausdorff_distance"]], "hhmmss_to_datetime() (in module lasso.util)": [[6, "lasso.util.hhmmss_to_datetime"]], "interiors (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.interiors"]], "interiors (lasso.util.polygon property)": [[6, "id1"]], "interpolate() (lasso.util.point method)": [[6, "lasso.util.Point.interpolate"]], "interpolate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.interpolate"]], "intersection() (lasso.util.point 
method)": [[6, "lasso.util.Point.intersection"]], "intersection() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersection"]], "intersects() (lasso.util.point method)": [[6, "lasso.util.Point.intersects"]], "intersects() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersects"]], "is_closed (lasso.util.point property)": [[6, "lasso.util.Point.is_closed"]], "is_closed (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_closed"]], "is_empty (lasso.util.point property)": [[6, "lasso.util.Point.is_empty"]], "is_empty (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_empty"]], "is_ring (lasso.util.point property)": [[6, "lasso.util.Point.is_ring"]], "is_ring (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_ring"]], "is_simple (lasso.util.point property)": [[6, "lasso.util.Point.is_simple"]], "is_simple (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_simple"]], "is_valid (lasso.util.point property)": [[6, "lasso.util.Point.is_valid"]], "is_valid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_valid"]], "keywords (lasso.util.partial attribute)": [[6, "lasso.util.partial.keywords"]], "lasso.util": [[6, "module-lasso.util"]], "length (lasso.util.point property)": [[6, "lasso.util.Point.length"]], "length (lasso.util.polygon property)": [[6, "lasso.util.Polygon.length"]], "line_interpolate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_interpolate_point"]], "line_interpolate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_interpolate_point"]], "line_locate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_locate_point"]], "line_locate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_locate_point"]], "minimum_clearance (lasso.util.point property)": [[6, "lasso.util.Point.minimum_clearance"]], "minimum_clearance (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_clearance"]], "minimum_rotated_rectangle (lasso.util.point property)": [[6, "lasso.util.Point.minimum_rotated_rectangle"]], "minimum_rotated_rectangle (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_rotated_rectangle"]], "normalize() (lasso.util.point method)": [[6, "lasso.util.Point.normalize"]], "normalize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.normalize"]], "oriented_envelope (lasso.util.point property)": [[6, "lasso.util.Point.oriented_envelope"]], "oriented_envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.oriented_envelope"]], "overlaps() (lasso.util.point method)": [[6, "lasso.util.Point.overlaps"]], "overlaps() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.overlaps"]], "partial (class in lasso.util)": [[6, "lasso.util.partial"]], "point_on_surface() (lasso.util.point method)": [[6, "lasso.util.Point.point_on_surface"]], "point_on_surface() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.point_on_surface"]], "project() (lasso.util.point method)": [[6, "lasso.util.Point.project"]], "project() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.project"]], "relate() (lasso.util.point method)": [[6, "lasso.util.Point.relate"]], "relate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate"]], "relate_pattern() (lasso.util.point method)": [[6, "lasso.util.Point.relate_pattern"]], "relate_pattern() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate_pattern"]], "representative_point() (lasso.util.point method)": [[6, "lasso.util.Point.representative_point"]], "representative_point() 
(lasso.util.polygon method)": [[6, "lasso.util.Polygon.representative_point"]], "reverse() (lasso.util.point method)": [[6, "lasso.util.Point.reverse"]], "reverse() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.reverse"]], "secs_to_datetime() (in module lasso.util)": [[6, "lasso.util.secs_to_datetime"]], "segmentize() (lasso.util.point method)": [[6, "lasso.util.Point.segmentize"]], "segmentize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.segmentize"]], "shorten_name() (in module lasso.util)": [[6, "lasso.util.shorten_name"]], "simplify() (lasso.util.point method)": [[6, "lasso.util.Point.simplify"]], "simplify() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.simplify"]], "svg() (lasso.util.point method)": [[6, "lasso.util.Point.svg"]], "svg() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.svg"]], "symmetric_difference() (lasso.util.point method)": [[6, "lasso.util.Point.symmetric_difference"]], "symmetric_difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.symmetric_difference"]], "touches() (lasso.util.point method)": [[6, "lasso.util.Point.touches"]], "touches() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.touches"]], "transform() (in module lasso.util)": [[6, "lasso.util.transform"]], "type (lasso.util.point property)": [[6, "lasso.util.Point.type"]], "type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.type"]], "unidecode() (in module lasso.util)": [[6, "lasso.util.unidecode"]], "union() (lasso.util.point method)": [[6, "lasso.util.Point.union"]], "union() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.union"]], "within() (lasso.util.point method)": [[6, "lasso.util.Point.within"]], "within() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.within"]], "wkb (lasso.util.point property)": [[6, "lasso.util.Point.wkb"]], "wkb (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb"]], "wkb_hex (lasso.util.point property)": [[6, "lasso.util.Point.wkb_hex"]], "wkb_hex (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb_hex"]], "wkt (lasso.util.point property)": [[6, "lasso.util.Point.wkt"]], "wkt (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkt"]], "x (lasso.util.point property)": [[6, "lasso.util.Point.x"]], "xy (lasso.util.point property)": [[6, "lasso.util.Point.xy"]], "xy (lasso.util.polygon property)": [[6, "lasso.util.Polygon.xy"]], "y (lasso.util.point property)": [[6, "lasso.util.Point.y"]], "z (lasso.util.point property)": [[6, "lasso.util.Point.z"]], "lasso": [[7, "module-lasso"]]}}) \ No newline at end of file diff --git a/branch/bicounty_dev/setup/index.html b/branch/bicounty_dev/setup/index.html new file mode 100644 index 0000000..868c23c --- /dev/null +++ b/branch/bicounty_dev/setup/index.html @@ -0,0 +1,133 @@ + + + + + + + Setup — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Setup

+
+

Projects

+
+
+

Parameters

+
+
+

Settings

+
+
+

Additional Data Files

+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_dev/starting/index.html b/branch/bicounty_dev/starting/index.html new file mode 100644 index 0000000..ef31ab2 --- /dev/null +++ b/branch/bicounty_dev/starting/index.html @@ -0,0 +1,434 @@ + + + + + + + Starting Out — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Starting Out

+
+

Installation

+

If you are managing multiple python versions, we suggest using virtualenv or conda virtual environments.

+

Example using a conda environment (recommended) and the package manager pip to install Lasso from source on GitHub.

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/Lasso@master
+
+
+

Lasso will install network_wrangler from the PyPi repository because it is included in Lasso’s requirements.txt.

+
+

Bleeding Edge

+

If you want to install a more up-to-date or development version of network wrangler and lasso, you can do so by installing them from the develop branch of each repository on GitHub:

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/network_wrangler@develop
+pip install git+https://github.com/wsp-sag/Lasso@develop
+
+
+
+
+

From Clone

+

If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in editable mode.

+

If you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+git clone https://github.com/wsp-sag/Lasso
+git clone https://github.com/wsp-sag/network_wrangler
+cd network_wrangler
+pip install -e .
+cd ..
+cd Lasso
+pip install -e .
+
+
+

Notes:

+
    +
  1. The -e installs it in editable mode.

  2. +
  3. If you are not part of the project team and want to contribute code back to the project, please fork before you clone and then add the original repository to your upstream origin list per these directions on github.

  4. +
  5. If you want to install from a specific tag/version number or branch, replace @master with @<branchname> or @tag

  6. +
  7. If you want to make use of frequent developer updates for network wrangler as well, you can also install it from a clone by following the same cloning and installation instructions shown above for Lasso.

  8. +
+

If you are going to be doing Lasso development, we also recommend:

+
    +
  • a good IDE such as Atom, VS Code, Sublime Text, etc. +with Python syntax highlighting turned on.

  • +
  • GitHub Desktop to locally update your clones

  • +
+
+
+
+

Brief Intro

+

Lasso is a ‘wrapper’ around the Network Wrangler utility.

+

Both Lasso and NetworkWrangler are built around the following data schemas:

+
    +
  • [roadway network], which is based on a mashup of Open Street Map and Shared Streets. In Network Wrangler these are read in from three json files representing links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category (see the sketch after this list).

  • +
  • [transit network], which is based on a frequency-based implementation of the csv-based GTFS; and

  • +
  • [project card], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml.

  • +
+
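The following minimal sketch illustrates the nested-field idea described in the roadway network bullet above; the field names and time-span encoding shown here are illustrative assumptions, not a definition of the schema.

# Hypothetical link record: "lanes" has a default value plus a
# time-of-day override (structure shown for illustration only).
link = {
    "model_link_id": 1234,
    "lanes": {
        "default": 3,
        "timeofday": [
            {"time": ("6:00", "9:00"), "value": 4},  # AM peak override
        ],
    },
}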

In addition, Lasso utilizes the following data schemas:

+
    +
  • [MetCouncil Model Roadway Network Schema], which adds the data fields that MetCouncil uses in its travel model to the roadway network schema, including breaking out data fields by time period.

  • +
  • [MetCouncil Model Transit Network Schema], which uses the Cube PublicTransport format, and

  • +
  • [Cube Log Files], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler.

  • +
  • [Cube public transport line files], which define a set of transit lines in the cube software.

  • +
+
+

Components

+

Network Wrangler has the following atomic parts:

+
    +
  • RoadwayNetwork object, which represents the roadway network data as GeoDataFrames;

  • +
  • TransitNetwork object, which represents the transit network data as DataFrames;

  • +
  • ProjectCard object, which represents the data of the project card. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run;

  • +
  • Scenario object, which consists of at least a RoadwayNetwork and a TransitNetwork. Scenarios can be based on or tiered from other scenarios. Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network.

  • +
+

In addition, Lasso has the following atomic parts:

+
    +
  • Project object, which creates project cards from one of the following: a base and a build transit network in cube format, a base and a build highway network, or a base highway network and a Cube log file.

  • +
  • ModelRoadwayNetwork object is a subclass of RoadwayNetwork and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube.

  • +
  • StandardTransit, an object for holding a standard transit feed as a Partridge object, which contains methods to manipulate and translate the GTFS data to MetCouncil’s Cube Line files.

  • +
  • CubeTransit, an object for storing information about transit defined in Cube public transport line files. It has the capability to parse cube line file properties and shapes into python dictionaries, and to compare line files and represent changes as Project Card dictionaries.

  • +
  • Parameters, a class representing all the parameters defining the networks, including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use the default parameters listed in this class.

  • +
+
+

RoadwayNetwork

+

Reads, writes, queries, and manipulates roadway network data, which is mainly stored in the GeoDataFrames links_df, nodes_df, and shapes_df.

+
net = RoadwayNetwork.read(
+        link_filename=MY_LINK_FILE,
+        node_filename=MY_NODE_FILE,
+        shape_filename=MY_SHAPE_FILE,
+        shape_foreign_key ='shape_id',
+        
+    )
+my_selection = {
+    "link": [{"name": ["I 35E"]}],
+    "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+    "B": {"osm_node_id": "2564047368"},
+}
+net.select_roadway_features(my_selection)
+
+my_change = [
+    {
+        'property': 'lanes',
+        'existing': 1,
+        'set': 2,
+     },
+     {
+        'property': 'drive_access',
+        'set': 0,
+      },
+]
+
+my_net.apply_roadway_feature_change(
+    my_net.select_roadway_features(my_selection),
+    my_change
+)
+
+ml_net = net.create_managed_lane_network(in_place=False)
+
+ml_net.is_network_connected(mode="drive")
+
+_, disconnected_nodes = ml_net.assess_connectivity(
+  mode="walk",
+  ignore_end_nodes=True
+)
+ml_net.write(filename=my_out_prefix, path=my_dir)
+
+
+
+
+

TransitNetwork

+
+
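Represents the transit network data as DataFrames. A minimal usage sketch, assuming the same network_wrangler API used in the Scenario example below (STPAUL_DIR is a placeholder feed directory; the feed attribute access is an assumption):

from network_wrangler import TransitNetwork

# Read a standard transit feed directory into a TransitNetwork,
# as done in the Scenario example below.
transit_net = TransitNetwork.read(STPAUL_DIR)

# The underlying feed tables (routes, trips, frequencies, shapes, stops)
# are stored as DataFrames on the feed object (attribute names assumed).
print(transit_net.feed.routes.head())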
+

ProjectCard

+
+
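Represents the data of a project card. A minimal usage sketch, reusing the ProjectCard.read call from the Scenario example below (the file path is a placeholder; the attribute shown at the end is an assumption):

import os
from network_wrangler import ProjectCard

# Read a project card YAML; validate=False skips schema validation,
# matching the Scenario example below.
card = ProjectCard.read(
    os.path.join(STPAUL_DIR, "project_cards", "4_simple_managed_lane.yml"),
    validate=False,
)

# The card's YAML keys become attributes on the object (assumed here).
print(card.project)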
+

Scenario

+

Manages sets of project cards and tiering from a base scenario/set of networks.

+

+my_base_scenario = {
+    "road_net": RoadwayNetwork.read(
+        link_filename=STPAUL_LINK_FILE,
+        node_filename=STPAUL_NODE_FILE,
+        shape_filename=STPAUL_SHAPE_FILE,
+        fast=True,
+        shape_foreign_key ='shape_id',
+    ),
+    "transit_net": TransitNetwork.read(STPAUL_DIR),
+}
+
+card_filenames = [
+    "3_multiple_roadway_attribute_change.yml",
+    "multiple_changes.yml",
+    "4_simple_managed_lane.yml",
+]
+
+project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+
+project_cards_list = [
+    ProjectCard.read(os.path.join(project_card_directory, filename), validate=False)
+    for filename in card_filenames
+]
+
+my_scenario = Scenario.create_scenario(
+  base_scenario=my_base_scenario,
+  project_cards_list=project_cards_list,
+)
+my_scenario.check_scenario_requisites()
+
+my_scenario.apply_all_projects()
+
+my_scenario.scenario_summary()
+
+
+
+
+

Project

+

Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network.

+

+test_project = Project.create_project(
+  base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+  build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+  )
+
+test_project.evaluate_changes()
+
+test_project.write_project_card(
+  os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+  )
+
+
+
+
+

ModelRoadwayNetwork

+

A subclass of network_wrangler’s RoadwayNetwork class which has additional understanding about how to translate and write the network out to the MetCouncil Roadway Network schema.

+
net = ModelRoadwayNetwork.read(
+      link_filename=STPAUL_LINK_FILE,
+      node_filename=STPAUL_NODE_FILE,
+      shape_filename=STPAUL_SHAPE_FILE,
+      fast=True,
+      shape_foreign_key ='shape_id',
+  )
+
+net.write_roadway_as_fixedwidth()
+
+
+
+
+

StandardTransit

+

Translates the standard GTFS data to MetCouncil’s Cube Line files.

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+

CubeTransit

+

Used by the project class and has the capability to:

+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+

Parameters

+

Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary.

+
# read parameters from a yaml configuration  file
+# could also provide as a key/value pair
+with open(config_file) as f:
+      my_config = yaml.safe_load(f)
+
+# provide parameters at instantiation of ModelRoadwayNetwork
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+            my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+        )
+# network written with direction from the parameters given
+model_road_net.write_roadway_as_shp()
+
+
+
+
+
+

Typical Workflow

+

Workflows in Lasso and Network Wrangler typically accomplish one of two goals:

+
    +
  1. Create Project Cards to document network changes as a result of either transit or roadway projects.

  2. +
  3. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network.

  4. +
+
+

Project Cards from Transit LIN Files

+
+
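A minimal sketch of this workflow, reusing the Project example above: compare a base and a build set of Cube LIN files, evaluate the differences, and write the resulting project card (directory names are placeholders).

import os
from lasso import Project

# Compare base vs. build transit line files and write out a project card,
# following the Project example above.
project = Project.create_project(
    base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
    build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
)
project.evaluate_changes()
project.write_project_card(os.path.join(SCRATCH_DIR, "transit_changes.yml"))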
+

Project Cards from Cube LOG Files

+
+
+

Model Network Files for a Scenario

+
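A minimal sketch of this workflow, combining the Scenario, Parameters/ModelRoadwayNetwork, and StandardTransit examples above; the scenario attribute names and the StandardTransit conversion call are assumptions.

import os

# Build the scenario from a base network plus project cards
# (my_base_scenario and project_cards_list as in the Scenario example above).
my_scenario = Scenario.create_scenario(
    base_scenario=my_base_scenario,
    project_cards_list=project_cards_list,
)
my_scenario.apply_all_projects()

# Convert the scenario roadway network to a MetCouncil model roadway network
# and write it out (see the ModelRoadwayNetwork example above).
model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(my_scenario.road_net)
model_road_net.write_roadway_as_fixedwidth()

# Write the scenario transit network as Cube line files; fromTransitNetwork
# is listed in the API reference, but its exact signature is assumed here.
standard_transit = StandardTransit.fromTransitNetwork(my_scenario.transit_net)
standard_transit.write_as_cube_lin(os.path.join(WRITE_DIR, "transit.lin"))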
+
+
+
+

Running Quickstart Jupyter Notebooks

+

To learn basic lasso functionality, please refer to the following jupyter notebooks in the /notebooks directory:

+
    +
  • Lasso Project Card Creation Quickstart.ipynb

  • +
  • Lasso Scenario Creation Quickstart.ipynb

  • +
+

Jupyter notebooks can be started by activating the lasso conda environment and typing jupyter notebook:

+
conda activate <my_lasso_environment>
+jupyter notebook
+
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/.buildinfo b/branch/bicounty_emme/.buildinfo new file mode 100644 index 0000000..b36862c --- /dev/null +++ b/branch/bicounty_emme/.buildinfo @@ -0,0 +1,4 @@ +# Sphinx build info version 1 +# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. +config: a1cf922aaabc1875e26d788834f2f593 +tags: d77d1c0d9ca2f4c8421862c7c5a0d620 diff --git a/branch/bicounty_emme/_generated/lasso.CubeTransit/index.html b/branch/bicounty_emme/_generated/lasso.CubeTransit/index.html new file mode 100644 index 0000000..8f061a9 --- /dev/null +++ b/branch/bicounty_emme/_generated/lasso.CubeTransit/index.html @@ -0,0 +1,571 @@ + + + + + + + lasso.CubeTransit — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.CubeTransit

+
+
+class lasso.CubeTransit(parameters={})[source]
+

Bases: object

+

Class for storing information about transit defined in Cube line +files.

+

Has the capability to:

+
+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
+

Typical usage example:

+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+lines
+

list of strings representing unique line names in +the cube network.

+
+
Type:
+

list

+
+
+
+ +
+
+line_properties
+

dictionary of line properties keyed by line name. Property +values are stored in a dictionary by property name. These +properties are directly read from the cube line files and haven’t +been translated to standard transit values.

+
+
Type:
+

dict

+
+
+
+ +
+
+shapes
+

dictionary of shapes +keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns:

+
+
    +
  • ‘node_id’ (int): positive integer of node id

  • +
  • ‘node’ (int): node number, with negative indicating a non-stop

  • +
  • ‘stop’ (boolean): indicates if it is a stop

  • +
  • ‘order’ (int): order within this shape

  • +
+
+
+
Type:
+

dict

+
+
+
+ +
+
+program_type
+

Either PT or TRNBLD

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

Parameters instance that will be applied to this instance which +includes information about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+source_list
+

List of cube line file sources that have been read and added.

+
+
Type:
+

list

+
+
+
+ +
+
+diff_dict
+
+
Type:
+

dict

+
+
+
+ +
+
+__init__(parameters={})[source]
+

Constructor for CubeTransit

+

parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([parameters])

Constructor for CubeTransit

add_additional_time_periods(...)

Copies a route to another cube time period with appropriate values for time-period-specific properties.

add_cube(transit_source)

Reads a .lin file and adds it to an existing TransitNetwork instance.

build_route_name([route_id, time_period, ...])

Create a route name by concatenating route, time period, agency, and direction

calculate_start_end_times(line_properties_dict)

Calculate the start and end times of the property change. WARNING: Doesn’t take care of non-contiguous time periods!

create_add_route_card_dict(line)

Creates a project card change formatted dictionary for adding a route based on the information in self.route_properties for the line.

create_delete_route_card_dict(line, ...)

Creates a project card change formatted dictionary for deleting a line.

create_from_cube(transit_source[, parameters])

Reads a cube .lin file and stores as TransitNetwork object.

create_update_route_card_dict(line, ...)

Creates a project card change formatted dictionary for updating the line.

cube_properties_to_standard_properties(...)

Converts cube style properties to standard properties.

evaluate_differences(base_transit)

    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
+

evaluate_route_property_differences(...[, ...])

Checks if any values have been updated or added for a specific route and creates project card entries for each.

evaluate_route_shape_changes(shape_build, ...)

Compares two route shapes and constructs and returns a list of changes suitable for a project card.

get_time_period_numbers_from_cube_properties(...)

Finds properties that are associated with time periods and then returns the numbers in them.

unpack_route_name(line_name)

Unpacks route name into direction, route, agency, and time period info

+
+
+add_additional_time_periods(new_time_period_number, orig_line_name)[source]
+

Copies a route to another cube time period with appropriate +values for time-period-specific properties.

+
+
New properties are stored under the new name in:
    +
  • ::self.shapes

  • +
  • ::self.line_properties

  • +
+
+
+
+
Parameters:
+
    +
  • new_time_period_number (int) – cube time period number

  • +
  • orig_line_name (str) – name of the originating line, from which +the new line will copy its properties.

  • +
+
+
Returns:
+

Line name with new time period.

+
+
+
+ +
+
+add_cube(transit_source)[source]
+

Reads a .lin file and adds it to an existing TransitNetwork instance.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
+
+ +
+
+static build_route_name(route_id='', time_period='', agency_id=0, direction_id=1)[source]
+

Create a route name by concatenating route, time period, agency, and direction

+
+
Parameters:
+
    +
  • route_id – i.e. 452-111

  • +
  • time_period – i.e. pk

  • +
  • direction_id – i.e. 1

  • +
  • agency_id – i.e. 0

  • +
+
+
Returns:
+

constructed line_name i.e. “0_452-111_452_pk1”

+
+
+
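A usage sketch for build_route_name, using the example inputs and the output given above:

# Static method; argument values follow the docstring examples above.
line_name = CubeTransit.build_route_name(
    route_id="452-111",
    time_period="pk",
    agency_id=0,
    direction_id=1,
)
# line_name == "0_452-111_452_pk1"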
+ +
+
+calculate_start_end_times(line_properties_dict)[source]
+

Calculate the start and end times of the property change. WARNING: Doesn’t take care of non-contiguous time periods!

+
+
Parameters:
+

line_properties_dict – dictionary of cube-flavor properties for a transit line

+
+
+
+ +
+
+create_add_route_card_dict(line)[source]
+

Creates a project card change formatted dictionary for adding +a route based on the information in self.route_properties for +the line.

+
+
Parameters:
+

line – name of line that is being updated

+
+
Returns:
+

A project card change-formatted dictionary for the route addition.

+
+
+
+ +
+
+create_delete_route_card_dict(line, base_transit_line_properties_dict)[source]
+

Creates a project card change formatted dictionary for deleting a line.

+
+
Parameters:
+
    +
  • line – name of line that is being deleted

  • +
  • base_transit_line_properties_dict – dictionary of cube-style +attribute values in order to find time periods and +start and end times.

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the route deletion.

+
+
+
+ +
+
+static create_from_cube(transit_source, parameters={})[source]
+

Reads a cube .lin file and stores as TransitNetwork object.

+
+
Parameters:
+

transit_source – a string or the directory of the cube line file to be parsed

+
+
Returns:
+

A ::CubeTransit object created from the transit_source.

+
+
+
+ +
+
+create_update_route_card_dict(line, updated_properties_dict)[source]
+

Creates a project card change formatted dictionary for updating +the line.

+
+
Parameters:
+
    +
  • line – name of line that is being updated

  • +
  • updated_properties_dict – dictionary of attributes to update as +‘property’: <property name>, +‘set’: <new property value>

  • +
+
+
Returns:
+

A project card change-formatted dictionary for the attribute update.

+
+
+
+ +
+
+static cube_properties_to_standard_properties(cube_properties_dict)[source]
+

Converts cube style properties to standard properties.

+

This is most pertinent to time-period-specific variables like headway, and variables that have standard units like headway, which is minutes in cube and seconds in standard format.

+
+
Parameters:
+

cube_properties_dict – <cube style property name> : <property value>

+
+
Returns:
+

+
A list of dictionaries with values for “property”: <standard style property name> and “set”: <property value with correct units>.

+
+
+

+
+
Return type:
+

list of dictionaries

+
+
+
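An illustrative call only; the cube property key used here ("HEADWAY[1]") and the exact standard property names returned are assumptions, not confirmed by this page.

# Hypothetical input: a cube-style headway of 10 minutes for time period 1.
standard_props = CubeTransit.cube_properties_to_standard_properties(
    {"HEADWAY[1]": 10}
)
# Expected result shape: a list of {"property": ..., "set": ...} dictionaries,
# with the headway converted from minutes (cube) to seconds (standard).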
+ +
+
+evaluate_differences(base_transit)[source]
+
    +
  1. Identifies what routes need to be updated, deleted, or added

  2. +
  3. +
    For routes being added or updated, identify if the time periods

    have changed or if there are multiples, and make duplicate lines if so

    +
    +
    +
  4. +
  5. Create project card dictionaries for each change.

  6. +
+
+
Parameters:
+

base_transit (CubeTransit) – an instance of this class for the base condition

+
+
Returns:
+

A list of dictionaries containing project card changes +required to evaluate the differences between the base network +and this transit network instance.

+
+
+
+ +
+
+evaluate_route_property_differences(properties_build, properties_base, time_period_number, absolute=True, validate_base=False)[source]
+

Checks if any values have been updated or added for a specific +route and creates project card entries for each.

+
+
Parameters:
+
    +
  • properties_build – ::<property_name>: <property_value>

  • +
  • properties_base – ::<property_name>: <property_value>

  • +
  • time_period_number – time period to evaluate

  • +
  • absolute – if True, will use set command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway

  • +
  • validate_base – if True, will add the existing line in the project card

  • +
+
+
Returns:
+

+
a list of dictionary values suitable for writing to a project card

{ +‘property’: <property_name>, +‘set’: <set value>, +‘change’: <change from existing value>, +‘existing’: <existing value to check>, +}

+
+
+

+
+
Return type:
+

transit_change_list (list)

+
+
+
+ +
+
+static evaluate_route_shape_changes(shape_build, shape_base)[source]
+

Compares two route shapes and constructs and returns a list of changes suitable for a project card.

+
+
Parameters:
+
    +
  • shape_build – DataFrame of the build-version of the route shape.

  • +
  • shape_base – DataFrame of the base-version of the route shape.

  • +
+
+
Returns:
+

List of shape changes formatted as a project card-change dictionary.

+
+
+
+ +
+
+static get_time_period_numbers_from_cube_properties(properties_list)[source]
+

Finds properties that are associated with time periods and then returns the numbers in them.

+
+
Parameters:
+

properties_list (list) – list of all properties.

+
+
Returns:
+

list of strings of the time period numbers found

+
+
+
+ +
+
+static unpack_route_name(line_name)[source]
+

Unpacks route name into direction, route, agency, and time period info

+
+
Parameters:
+

line_name (str) – i.e. “0_452-111_452_pk1”

+
+
Returns:
+

452-111 +time_period (str): i.e. pk +direction_id (str) : i.e. 1 +agency_id (str) : i.e. 0

+
+
Return type:
+

route_id (str)

+
+
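A usage sketch for unpack_route_name, inverting the build_route_name example above:

# Static method; per the docstring, this should recover route_id "452-111",
# time period "pk", direction_id "1", and agency_id "0"
# (the exact return structure/order is assumed).
parts = CubeTransit.unpack_route_name("0_452-111_452_pk1")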
+
+ +
+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_generated/lasso.ModelRoadwayNetwork/index.html b/branch/bicounty_emme/_generated/lasso.ModelRoadwayNetwork/index.html new file mode 100644 index 0000000..16c638c --- /dev/null +++ b/branch/bicounty_emme/_generated/lasso.ModelRoadwayNetwork/index.html @@ -0,0 +1,1573 @@ + + + + + + + lasso.ModelRoadwayNetwork — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

lasso.ModelRoadwayNetwork

+
+
+class lasso.ModelRoadwayNetwork(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Bases: RoadwayNetwork

+

Subclass of network_wrangler class RoadwayNetwork

+

A representation of the physical roadway network and its properties.

+
+
+__init__(nodes, links, shapes, parameters={}, **kwargs)[source]
+

Constructor

+
+
Parameters:
+
    +
  • nodes – geodataframe of nodes

  • +
  • links – dataframe of links

  • +
  • shapes – geodataframe of shapes

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. +If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, EPSG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(nodes, links, shapes[, parameters])

Constructor

add_counts([network_variable, ...])

Adds count variable.

add_incident_link_data_to_nodes([links_df, ...])

Add data from links going to/from nodes to node.

add_new_roadway_feature_change(links, nodes)

add the new roadway features defined in the project card.

add_variable_using_shst_reference([...])

Join network links with source data, via SHST API node match result.

addition_map(links, nodes)

Shows which links and nodes are added to the roadway network

apply(project_card_dictionary)

Wrapper method to apply a project to a roadway network.

apply_managed_lane_feature_change(link_idx, ...)

Apply the managed lane feature changes to the roadway network

apply_python_calculation(pycode[, in_place])

Changes roadway network object by executing pycode.

apply_roadway_feature_change(link_idx, ...)

Changes the roadway attributes for the selected features based on the project card information passed

assess_connectivity([mode, ...])

Returns a network graph and list of disconnected subgraphs as described by a list of their member nodes.

build_selection_key(selection_dict)

Selections are stored by a key combining the query and the A and B ids.

calculate_area_type([area_type_shape, ...])

#MC Calculates area type variable.

calculate_centroidconnect(parameters[, ...])

Calculates centroid connector variable.

calculate_county([county_shape, ...])

#MC Calculates county variable.

calculate_distance([network_variable, ...])

calculate link distance in miles

calculate_mpo([county_network_variable, ...])

Calculates mpo variable.

calculate_use([network_variable, ...])

Calculates use variable.

convert_int([int_col_names])

Convert integer columns

create_ML_variable([network_variable, overwrite])

Created ML lanes placeholder for project to write out ML changes

create_calculated_variables()

Creates calculated roadway variables.

create_dummy_connector_links(ml_df[, ...])

create dummy connector links between the general purpose and managed lanes

create_hov_corridor_variable([...])

Created hov corridor placeholder for project to write out corridor changes

create_managed_lane_network([...])

Create a roadway network with managed lanes links separated out.

create_managed_variable([network_variable, ...])

Created placeholder for project to write out managed

dataframe_to_fixed_width(df)

Convert dataframe to fixed width format, geometry column will not be transformed.

delete_roadway_feature_change(links, nodes)

delete the roadway features defined in the project card.

deletion_map(links, nodes)

Shows which links and nodes are deleted from the roadway network

fill_na()

Fill na values from create_managed_lane_network()

from_RoadwayNetwork(roadway_network_object)

RoadwayNetwork to ModelRoadwayNetwork

get_attribute(links_df, join_key, ...)

Gets attribute from source data using SHST match result.

get_managed_lane_node_ids(nodes_list[, scalar])

Transform a list of node IDS by a scalar.

get_modal_graph(links_df, nodes_df[, mode, ...])

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

get_modal_links_nodes(links_df, nodes_df[, ...])

Returns nodes and link dataframes for specific mode.

get_property_by_time_period_and_group(prop)

Return a series for the properties with a specific group or time period.

identify_segment(O_id, D_id[, ...])

+
param endpoints:
+

list of length two of unique keys of nodes making up the endpoints of the segment

+
+
+

identify_segment_endpoints([mode, links_df, ...])

+
param mode:
+

list of modes of the network, one of drive,`transit`,

+
+
+

is_network_connected([mode, links_df, nodes_df])

Determines if the network graph is “strongly” connected. A graph is strongly connected if each vertex is reachable from every other vertex.

load_transform_network(node_filename, ...[, ...])

Reads roadway network files from disk and transforms them into GeoDataFrames.

network_connection_plot(G, ...)

Plot a graph to check for network connection.

orig_dest_nodes_foreign_key(selection[, ...])

Returns the foreign key id (whatever is used in the u and v variables in the links file) for the AB nodes as a tuple.

ox_graph(nodes_df, links_df[, ...])

create an osmnx-flavored network graph

path_search(candidate_links_df, O_id, D_id)

+
param candidate_links:
+

selection of links geodataframe with links likely to be part of path

+
+
+

read(link_filename, node_filename, ...[, ...])

Reads in links and nodes network standard.

read_match_result(path)

Reads the shst geojson match returns.

rename_variables_for_dbf(input_df[, ...])

Rename attributes for DBF/SHP, make sure length within 10 chars.

roadway_net_to_gdf(roadway_net)

+
rtype:
+

GeoDataFrame

+
+
+

roadway_standard_to_met_council_network([...])

Rename and format roadway attributes to be consistent with what MetCouncil’s model is expecting.

select_roadway_features(selection[, ...])

Selects roadway features that satisfy selection criteria

selection_has_unique_link_id(selection_dict)

+
rtype:
+

bool

+
+
+

selection_map(selected_link_idx[, A, B, ...])

Shows which links are selected for roadway property change or parallel managed lanes category of roadway projects.

shortest_path(graph_links_df, O_id, D_id[, ...])

+
rtype:
+

tuple

+
+
+

split_properties_by_time_period_and_category([...])

Splits properties by time period, assuming a variable structure of

update_distance([links_df, use_shapes, ...])

Calculate link distance in specified units to network variable using either straight line distance or (if specified) shape distance if available.

validate_link_schema(link_filename[, ...])

Validate roadway network data link schema and output a boolean

validate_node_schema(node_file[, ...])

Validate roadway network data node schema and output a boolean

validate_properties(properties[, ...])

If there are change or existing commands, make sure that the property exists in the network.

validate_selection(selection[, ...])

Evaluate whether the selection dictionary contains the minimum required values.

validate_shape_schema(shape_file[, ...])

Validate roadway network data shape schema and output a boolean

validate_uniqueness()

Confirms that the unique identifiers are met.

write([path, filename])

Writes a network in the roadway network standard

write_roadway_as_fixedwidth(output_dir[, ...])

Writes out fixed width file.

write_roadway_as_shp(output_dir[, ...])

Write out dbf/shp/gpkg for cube.

+

Attributes

+ + + + + + +

CALCULATED_VALUES

+
+
+add_counts(network_variable='AADT', mndot_count_shst_data=None, widot_count_shst_data=None, mndot_count_variable_shp=None, widot_count_variable_shp=None)[source]
+

Adds count variable. +#MC +join the network with count node data, via SHST API node match result

+
+
Parameters:
+
    +
  • network_variable (str) – Name of the variable that should be written to. Default to “AADT”.

  • +
  • mndot_count_shst_data (str) – File path to MNDOT count location SHST API node match result.

  • +
  • widot_count_shst_data (str) – File path to WIDOT count location SHST API node match result.

  • +
  • mndot_count_variable_shp (str) – File path to MNDOT count location geodatabase.

  • +
  • widot_count_variable_shp (str) – File path to WIDOT count location geodatabase.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+ +

Add data from links going to/from nodes to node.

+
+
Return type:
+

DataFrame

+
+
Parameters:
+
    +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
  • link_variables – list of columns in links dataframe to add to incident nodes

  • +
+
+
Returns:
+

nodes DataFrame with link data where length is N*number of links going in/out

+
+
+
+ +
+
+add_new_roadway_feature_change(links, nodes)
+

Add the new roadway features defined in the project card. New shapes are also added for the new roadway links.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – list of dictionaries

  • +
  • nodes – list of dictionaries

  • +
+
+
+

returns: None

+
+ +
+
+add_variable_using_shst_reference(var_shst_csvdata=None, shst_csv_variable=None, network_variable=None, network_var_type=<class 'int'>, overwrite=False)[source]
+

Join network links with source data, via SHST API node match result.

+
+
Parameters:
+
    +
  • var_shst_csvdata (str) – File path to SHST API return.

  • +
  • shst_csv_variable (str) – Variable name in the source data.

  • +
  • network_variable (str) – Name of the variable that should be written to.

  • +
  • network_var_type – Variable type in the written network.

  • +
  • overwrite (bool) – True is overwriting existing variable. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
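A usage sketch using the parameters documented above; the file path and column names are placeholders.

# Join a SHST API match result onto the network links and write it to "AADT".
net.add_variable_using_shst_reference(
    var_shst_csvdata="shst_match_aadt.csv",
    shst_csv_variable="aadt",
    network_variable="AADT",
    network_var_type=int,
    overwrite=True,
)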
+ +
+
+addition_map(links, nodes)
+

Shows which links and nodes are added to the roadway network

+
+ +
+
+apply(project_card_dictionary)
+

Wrapper method to apply a project to a roadway network.

+
+
Parameters:
+

project_card_dictionary – dict +a dictionary of the project card object

+
+
+
+ +
+
+apply_managed_lane_feature_change(link_idx, properties, in_place=True)
+

Apply the managed lane feature changes to the roadway network

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean to indicate whether to update self or return +a new roadway network object

  • +
+
+
+
+ +
+
+apply_python_calculation(pycode, in_place=True)
+

Changes roadway network object by executing pycode.

+
+
Parameters:
+
    +
  • pycode – python code which changes values in the roadway network object

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+
+ +
+
+apply_roadway_feature_change(link_idx, properties, in_place=True)
+

Changes the roadway attributes for the selected features based on the +project card information passed

+
+
Parameters:
+
    +
  • link_idx – list of indices of all links to apply change to

  • +
  • properties – list of dictionaries of roadway properties to change

  • +
  • in_place – boolean +update self or return a new roadway network object

  • +
+
+
+
+ +
+
+assess_connectivity(mode='', ignore_end_nodes=True, links_df=None, nodes_df=None)
+

Returns a network graph and list of disconnected subgraphs +as described by a list of their member nodes.

+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • ignore_end_nodes – if True, ignores stray singleton nodes

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+
Returns: Tuple of

Network Graph (osmnx flavored networkX DiGraph) +List of disconnected subgraphs described by the list of their

+
+

member nodes (as described by their model_node_id)

+
+
+
+
+ +
+
+build_selection_key(selection_dict)
+

Selections are stored by a key combining the query and the A and B ids. +This method combines the two for you based on the selection dictionary.

+
+
Return type:
+

tuple

+
+
Parameters:
+

selection_dict – Selection Dictionary

+
+
+

Returns: Tuple serving as the selection key.

+
+ +
+
+calculate_area_type(area_type_shape=None, area_type_shape_variable=None, network_variable='area_type', area_type_codes_dict=None, downtown_area_type_shape=None, downtown_area_type=None, overwrite=False)[source]
+

#MC +Calculates area type variable.

+

This uses the centroid of the geometry field to determine which area it should be labeled. This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • area_type_shape (str) – The File path to area geodatabase.

  • +
  • area_type_shape_variable (str) – The variable name of area type in the area geodatabase.

  • +
  • network_variable (str) – The variable name of area type in network standard. Default to “area_type”.

  • +
  • area_type_codes_dict – The dictionary to map input area_type_shape_variable to network_variable

  • +
  • downtown_area_type_shape – The file path to the downtown area type boundary.

  • +
  • downtown_area_type (int) – Integer value of downtown area type

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_centroidconnect(parameters, network_variable='centroidconnect', highest_taz_number=None, as_integer=True, overwrite=False)[source]
+

Calculates centroid connector variable.

+
+
Parameters:
+
    +
  • parameters (Parameters) – A Lasso Parameters, which stores input files.

  • +
  • network_variable (str) – Variable that should be written to in the network. Default to “centroidconnect”

  • +
  • highest_taz_number (int) – the max TAZ number in the network.

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

RoadwayNetwork

+
+
+
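A usage sketch with the documented arguments; the highest_taz_number value is a placeholder and net.parameters is assumed to be the Parameters instance supplied at construction.

# Flag centroid connector links, writing 1/0 values to "centroidconnect".
net.calculate_centroidconnect(
    parameters=net.parameters,   # assumed attribute holding the Lasso Parameters
    network_variable="centroidconnect",
    highest_taz_number=3100,     # placeholder: highest TAZ number in the network
    as_integer=True,
    overwrite=False,
)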
+ +
+
+calculate_county(county_shape=None, county_shape_variable=None, network_variable='county', county_codes_dict=None, overwrite=False)[source]
+

#MC +Calculates county variable.

+

This uses the centroid of the geometry field to determine which county it should be labeled. This isn’t perfect, but it is much quicker than other methods.

+
+
Parameters:
+
    +
  • county_shape (str) – The File path to county geodatabase.

  • +
  • county_shape_variable (str) – The variable name of county in the county geodatabase.

  • +
  • network_variable (str) – The variable name of county in network standard. Default to “county”.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_distance(network_variable='distance', centroidconnect_only=False, overwrite=False)[source]
+

calculate link distance in miles

+
+
Parameters:
+
    +
  • centroidconnect_only (Bool) – True if calculating distance for centroidconnectors only. Default to False.

  • +
  • overwrite (Bool) – True if overwriting existing variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_mpo(county_network_variable='county', network_variable='mpo', as_integer=True, mpo_counties=None, overwrite=False)[source]
+

Calculates mpo variable.
#MC

Parameters:

  • county_variable (str) – Name of the variable where the county names are stored. Default to “county”.

  • network_variable (str) – Name of the variable that should be written to. Default to “mpo”.

  • as_integer (bool) – If true, will convert true/false to 1/0s.

  • mpo_counties (list) – List of county names that are within mpo region.

  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+calculate_use(network_variable='use', as_integer=True, overwrite=False)[source]
+

Calculates use variable.

+
+
Parameters:
+
    +
  • network_variable (str) – Variable that should be written to in the network. Default to “use”

  • +
  • as_integer (bool) – If True, will convert true/false to 1/0s. Default to True.

  • +
  • overwrite (Bool) – True if overwriting existing county variable in network. Default to False.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+convert_int(int_col_names=[])[source]
+

Convert integer columns

+
+ +
+
+create_ML_variable(network_variable='ML_lanes', overwrite=False)[source]
+

Created ML lanes placeholder for project to write out ML changes

+

ML lanes default to 0; ML info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_calculated_variables()[source]
+

Creates calculated roadway variables.

+
+
Parameters:
+

None

+
+
+
+ +
+ +

create dummy connector links between the general purpose and managed lanes

+
+
Parameters:
+
    +
  • gp_df – GeoDataFrame +dataframe of general purpose links (where managed lane also exists)

  • +
  • ml_df – GeoDataFrame +dataframe of corresponding managed lane links,

  • +
  • access_lanes – int +number of lanes in access dummy link

  • +
  • egress_lanes – int +number of lanes in egress dummy link

  • +
  • access_roadway – str roadway type for access dummy link

  • +
  • egress_roadway – str +roadway type for egress dummy link

  • +
  • access_name_prefix – str +prefix for access dummy link name

  • +
  • egress_name_prefix – str +prefix for egress dummy link name

  • +
+
+
+
+ +
+
+create_hov_corridor_variable(network_variable='segment_id', overwrite=False)[source]
+

Created hov corridor placeholder for project to write out corridor changes

+

hov corridor id defaults to 0; its info comes from the cube LOG file and is stored in project cards

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+create_managed_lane_network(keep_same_attributes_ml_and_gp=None, keep_additional_attributes_ml_and_gp=[], managed_lanes_required_attributes=[], managed_lanes_node_id_scalar=None, managed_lanes_link_id_scalar=None, in_place=False)
+

Create a roadway network with managed lanes links separated out. +Add new parallel managed lane links, access/egress links, +and add shapes corresponding to the new links

+
+
Return type:
+

RoadwayNetwork

+
+
Parameters:
+
    +
  • keep_same_attributes_ml_and_gp – list of attributes to copy from general purpose +lane to managed lane. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to KEEP_SAME_ATTRIBUTES_ML_AND_GP.

  • +
  • keep_additional_attributes_ml_and_gp – list of additional attributes to add. This is useful +if you want to leave the default attributes and then ALSO some others.

  • +
  • managed_lanes_required_attributes – list of attributes that are required to be specified +in new managed lanes. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_REQUIRED_ATTRIBUTES.

  • +
  • managed_lanes_node_id_scalar – integer value added to original node IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_NODE_ID_SCALAR.

  • +
  • managed_lanes_link_id_scalar – integer value added to original link IDs to create managed +lane unique ids. If not specified, will look for value in the RoadwayNetwork +instance. If not found there, will default to MANAGED_LANES_LINK_ID_SCALAR.

  • +
  • in_place – update self or return a new roadway network object

  • +
+
+
+

returns: A RoadwayNetwork instance

+
+ +
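A minimal sketch (not from the source docs) of create_managed_lane_network; net is an assumed, already-read network instance and the scalar values are illustrative placeholders, not required values:

    # Hypothetical sketch: build a network with managed lanes split out into
    # parallel links plus access/egress connectors.
    ml_net = net.create_managed_lane_network(
        keep_same_attributes_ml_and_gp=None,   # fall back to instance / class defaults
        managed_lanes_node_id_scalar=4500000,  # example scalar only
        managed_lanes_link_id_scalar=10000000, # example scalar only
        in_place=False,                        # return a new RoadwayNetwork rather than mutating net
    )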
+
+create_managed_variable(network_variable='managed', overwrite=False)[source]
+

Creates a placeholder for projects to write out the managed variable.

+

The managed variable defaults to 0; its information comes from the cube LOG file and is stored in project cards.

+
+
Parameters:
+

overwrite (Bool) – True if overwriting existing variable in network. Default to False.

+
+
Returns:
+

None

+
+
+
+ +
+
+static dataframe_to_fixed_width(df)[source]
+

Convert dataframe to fixed width format, geometry column will not be transformed.

+
+
Parameters:
+

df (pandas DataFrame) –

+
+
Returns:
+

dataframe with fixed width for each column.
dict: dictionary with column names as keys and column widths as values.

+
+
Return type:
+

pandas dataframe

+
+
+
+ +
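A minimal sketch (not from the source docs) of dataframe_to_fixed_width; the import path and class name (lasso.ModelRoadwayNetwork) are assumptions based on this page, and the sample dataframe is made up:

    import pandas as pd
    from lasso import ModelRoadwayNetwork  # assumed class name for this page

    df = pd.DataFrame({"model_link_id": [1, 22, 333],
                       "name": ["Main St", "1st Ave", "I-94"]})
    # Convert to fixed-width strings and capture each column's width
    # (used later when writing Cube fixed-format files).
    fw_df, col_widths = ModelRoadwayNetwork.dataframe_to_fixed_width(df)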
+
+delete_roadway_feature_change(links, nodes, ignore_missing=True)
+

Deletes the roadway features defined in the project card. Valid links and nodes defined in the project get deleted, and shapes corresponding to the deleted links are also deleted.

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • links – dict +list of dictionaries

  • +
  • nodes – dict +list of dictionaries

  • +
  • ignore_missing – bool +If True, will only warn about links/nodes that are missing from +network but specified to “delete” in project card +If False, will fail.

  • +
+
+
+
+ +
+
+deletion_map(links, nodes)
+

Shows which links and nodes are deleted from the roadway network

+
+ +
+
+fill_na()[source]
+

Fill na values from create_managed_lane_network()

+
+ +
+
+static from_RoadwayNetwork(roadway_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters:
+
    +
  • roadway_network_object (RoadwayNetwork) –

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
+ +
+
+static get_attribute(links_df, join_key, source_shst_ref_df, source_gdf, field_name)[source]
+

Gets attribute from source data using SHST match result.

+
+
Parameters:
+
    +
  • links_df (dataframe) – The network dataframe that new attribute should be written to.

  • +
  • join_key (str) – SHST ID variable name used to join source data with network dataframe.

  • +
  • source_shst_ref_df (str) – File path to source data SHST match result.

  • +
  • source_gdf (str) – File path to source data.

  • +
  • field_name (str) – Name of the attribute to get from source data.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+static get_managed_lane_node_ids(nodes_list, scalar=4500000)
+

Transform a list of node IDs by adding a scalar.

TODO (#237): what if node ids are not numbers?

+
+
Parameters:
+
    +
  • nodes_list – list of integers

  • +
  • scalar – value to add to node IDs

  • +
+
+
+

Returns: list of integers

+
+ +
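A minimal sketch (not from the source docs); net is an assumed, already-read network instance and the node ids are made up:

    # Hypothetical sketch: offset general-purpose node ids to get managed-lane node ids.
    gp_node_ids = [1001, 1002, 1003]
    ml_node_ids = net.get_managed_lane_node_ids(gp_node_ids, scalar=4500000)
    # With the default scalar this simply adds 4,500,000 to each id:
    # [4501001, 4501002, 4501003]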
+
+static get_modal_graph(links_df, nodes_df, mode=None, modes_to_network_link_variables={'bike': ['bike_access'], 'bus': ['bus_only', 'drive_access'], 'drive': ['drive_access'], 'rail': ['rail_only'], 'transit': ['bus_only', 'rail_only', 'drive_access'], 'walk': ['walk_access']})
+

Creates a modal graph: an osmnx-flavored networkx DiGraph containing only the links and nodes available to the specified mode.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: networkx: osmnx: DiGraph of network

+
+ +
+ +

Returns nodes and link dataframes for specific mode.

+
+
Parameters:
+
    +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
  • modes – list of the modes of the network to be kept, must be in drive,`transit`,`rail`,`bus`, +walk, bike. For example, if bike and walk are selected, both bike and walk links will be kept.

  • +
  • modes_to_network_link_variables – dictionary mapping the mode selections to the network variables +that must bool to true to select that mode. Defaults to MODES_TO_NETWORK_LINK_VARIABLES

  • +
+
+
+

Returns: tuple of DataFrames for links, nodes filtered by mode

+

Note: links with walk access are not always marked as having walk access; this issue is discussed in https://github.com/wsp-sag/network_wrangler/issues/145.
modal_nodes_df = nodes_df[nodes_df[mode_node_variable] == 1]

+
+ +
+
+get_property_by_time_period_and_group(prop, time_period=None, category=None, default_return=None)
+

Return a series for the properties with a specific group or time period.

+
+
Parameters:
+
    +
  • prop (str) – the variable that you want from network

  • +
  • time_period (list(str)) – the time period that you are querying for +i.e. [‘16:00’, ‘19:00’]

  • +
  • category (str or list(str)(Optional)) –

    the group category +i.e. “sov”

    +

    or

    +

    list of group categories in order of search, i.e. +[“hov3”,”hov2”]

    +

  • +
  • default_return (what to return if variable or time period not found. Default is None.) –

  • +
+
+
Return type:
+

pandas series

+
+
+
+ +
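A minimal sketch (not from the source docs) of get_property_by_time_period_and_group; net is an assumed network instance and the property name, time window, and categories are illustrative:

    # Hypothetical sketch: pull the AM-peak HOV2 value out of a complex "price" property.
    am_hov2_price = net.get_property_by_time_period_and_group(
        "price",
        time_period=["6:00", "10:00"],   # query window, per the parameter description above
        category=["hov2", "default"],    # categories searched in order
        default_return=0,                # value used when nothing matches
    )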
+
+identify_segment(O_id, D_id, selection_dict={}, mode=None, nodes_df=None, links_df=None)
+
+
Parameters:
+
    +
  • endpoints – list of length of two unique keys of nodes making up endpoints of segment

  • +
  • selection_dict – dictionary of link variables to select candidate links from, otherwise will create a graph of ALL links which will be both a RAM hog and could result in odd shortest paths.

  • +
  • segment_variables – list of variables to keep

  • +
+
+
+
+ +
+
+identify_segment_endpoints(mode='', links_df=None, nodes_df=None, min_connecting_links=10, min_distance=None, max_link_deviation=2)
+
+
Parameters:
+
    +
  • mode – list of modes of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – if specified, will assess connectivity of this +links list rather than self.links_df

  • +
  • nodes_df – if specified, will assess connectivity of this +nodes list rather than self.nodes_df

  • +
+
+
+
+ +
+
+is_network_connected(mode=None, links_df=None, nodes_df=None)
+

Determines if the network graph is “strongly” connected +A graph is strongly connected if each vertex is reachable from every other vertex.

+
+
Parameters:
+
    +
  • mode – mode of the network, one of drive,`transit`, +walk, bike

  • +
  • links_df – DataFrame of standard network links

  • +
  • nodes_df – DataFrame of standard network nodes

  • +
+
+
+

Returns: boolean

+
+ +
+
+static load_transform_network(node_filename, link_filename, shape_filename, crs=4326, node_foreign_key='model_node_id', validate_schema=True, **kwargs)
+

Reads roadway network files from disk and transforms them into GeoDataFrames.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • node_filename – file name for nodes.

  • +
  • link_filename – file name for links.

  • +
  • shape_filename – file name for shapes.

  • +
  • crs – coordinate reference system. Defaults to value in CRS.

  • +
  • node_foreign_key – variable linking the node table to the link table. Defaults +to NODE_FOREIGN_KEY.

  • +
  • validate_schema – boolean indicating if network should be validated to schema.

  • +
+
+
+

returns: tuple of GeodataFrames nodes_df, links_df, shapes_df

+
+ +
+
+static network_connection_plot(G, disconnected_subgraph_nodes)
+

Plot a graph to check for network connection.

+
+
Parameters:
+
    +
  • G – OSMNX flavored networkX graph.

  • +
  • disconnected_subgraph_nodes – List of disconnected subgraphs described by the list of their +member nodes (as described by their model_node_id).

  • +
+
+
+

returns: fig, ax : tuple

+
+ +
+
+orig_dest_nodes_foreign_key(selection, node_foreign_key='')
+

Returns the foreign key id (whatever is used in the u and v +variables in the links file) for the AB nodes as a tuple.

+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • selection – selection dictionary with A and B keys

  • +
  • node_foreign_key – variable name for whatever is used by the u and v variables in the links_df file. If nothing is specified, assume whatever the default is.

  • +
+
+
+

Returns: tuple of (A_id, B_id)

+
+ +
+
+static ox_graph(nodes_df, links_df, node_foreign_key='model_node_id', link_foreign_key=['A', 'B'], unique_link_key='model_link_id')
+

create an osmnx-flavored network graph

+

osmnx doesn’t like values that are arrays, so remove the variables +that have arrays. osmnx also requires that certain variables +be filled in, so do that too.

+
+
Parameters:
+
    +
  • nodes_df – GeoDataFrame of nodes

  • +
  • link_df – GeoDataFrame of links

  • +
  • node_foreign_key – field referenced in link_foreign_key

  • +
  • link_foreign_key – list of attributes that define the link start and end nodes to the node foreign key

  • +
  • unique_link_key – primary key for links

  • +
+
+
+

Returns: a networkx multidigraph

+
+ +
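A minimal sketch (not from the source docs) of ox_graph; net is an assumed, already-read network instance whose nodes_df and links_df attributes hold the standard node and link GeoDataFrames:

    # Hypothetical sketch: build an osmnx-flavored graph from the node/link GeoDataFrames.
    G = net.ox_graph(
        net.nodes_df,
        net.links_df,
        node_foreign_key="model_node_id",
        link_foreign_key=["A", "B"],
        unique_link_key="model_link_id",
    )
    print(G.number_of_nodes(), G.number_of_edges())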
+ +
+
Parameters:
+
    +
  • candidate_links – selection of links geodataframe with links likely to be part of path

  • +
  • O_id – origin node foreign key ID

  • +
  • D_id – destination node foreign key ID

  • +
  • weight_column – column to use for weight of shortest path. Defaults to “i” (iteration)

  • +
  • weight_factor – optional weight to multiply the weight column by when finding the shortest path

  • +
  • search_breadth

  • +
+
+
+

Returns

+
+ +
+
+static read(link_filename, node_filename, shape_filename, fast=False, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Reads in links and nodes network standard.

+
+
Parameters:
+
    +
  • link_filename (str) – File path to link json.

  • +
  • node_filename (str) – File path to node geojson.

  • +
  • shape_filename (str) – File path to link true shape geojson

  • +
  • fast (bool) – boolean that will skip validation to speed up read time.

  • +
  • recalculate_calculated_variables (bool) – calculates fields from spatial joins, etc.

  • +
  • recalculate_distance (bool) – re-calculates distance.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

ModelRoadwayNetwork

+
+
+
+ +
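A minimal sketch (not from the source docs) of reading a network with read(); the import path (lasso.ModelRoadwayNetwork) is an assumption based on this page and all file paths are hypothetical:

    from lasso import ModelRoadwayNetwork  # assumed import path

    # Hypothetical sketch: read a standard roadway network into a ModelRoadwayNetwork.
    net = ModelRoadwayNetwork.read(
        link_filename="example/link.json",
        node_filename="example/node.geojson",
        shape_filename="example/shape.geojson",
        fast=True,        # skip validation to speed up the read
        parameters={},    # dict of Parameters settings or a Parameters instance
    )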
+
+static read_match_result(path)[source]
+

Reads the shst geojson match returns.

+

Returns shst dataframe.

+

Reads many files of the same type and concatenates them into a single DataFrame.

+
+
Parameters:
+

path (str) – File path to SHST match results.

+
+
Returns:
+

geopandas geodataframe

+
+
Return type:
+

geodataframe

+
+
+

TODO: not sure why we need this; it should live in utilities, not this class.

+
+ +
+
+rename_variables_for_dbf(input_df, variable_crosswalk=None, output_variables=None, convert_geometry_to_xy=False)[source]
+

Rename attributes for DBF/SHP, make sure length within 10 chars.

+
+
Parameters:
+
    +
  • input_df (dataframe) – Network standard DataFrame.

  • +
  • variable_crosswalk (str) – File path to variable name crosswalk from network standard to DBF names.

  • +
  • output_variables (list) – List of strings for DBF variables.

  • +
  • convert_geometry_to_xy (bool) – True if converting node geometry to X/Y

  • +
+
+
Returns:
+

dataframe

+
+
+
+ +
+
+static roadway_net_to_gdf(roadway_net)
+
+
Return type:
+

GeoDataFrame

+
+
+

Turn the roadway network into a GeoDataFrame.

Parameters:

roadway_net – the roadway network to export

+

returns: shapes dataframe

+
+ +
+
+roadway_standard_to_met_council_network(output_epsg=None)[source]
+

Rename and format roadway attributes to be consistent with what MetCouncil's model is expecting.

Parameters:

output_epsg (int) – EPSG number of the output network.

+
+
Returns:
+

None

+
+
+
+ +
+
+select_roadway_features(selection, search_mode='drive', force_search=False, sp_weight_factor=None)
+

Selects roadway features that satisfy selection criteria

+
+
Return type:
+

GeoDataFrame

+
+
+
+
Example usage:
+
net.select_roadway_features(
    selection=[{
        # a match condition for the from node using osm,
        # shared streets, or model node number
        'from': {'osm_model_link_id': '1234'},
        # a match for the to-node
        'to': {'shstid': '4321'},
        # a regex or match for facility condition
        # could be # of lanes, facility type, etc.
        'facility': {'name': 'Main St'},
    }, ...])

+
+
+
+
+
+
+
+
Parameters:
+
    +
  • selection – dictionary with keys for: +A - from node +B - to node +link - which includes at least a variable for name

  • +
  • search_mode – mode which you are searching for; defaults to “drive”

  • +
  • force_search – boolean directing method to perform search even if one +with same selection dict is stored from a previous search.

  • +
  • sp_weight_factor – multiple used to discourage shortest paths which meander from the original search returned from the name or ref query. If not set here, will default to the value of sp_weight_factor in the RoadwayNetwork instance. If not set there, will default to SP_WEIGHT_FACTOR.

  • +
+
+
+

Returns: a list of link IDs in selection

+
+ +
+ +
+
Return type:
+

bool

+
+
Parameters:
+

selection_dictionary – Dictionary representation of selection +of roadway features, containing a “link” key.

+
+
+
+
Returns: A boolean indicating if the selection dictionary contains

a unique identifier for links.

+
+
+
+ +
+
+selection_map(selected_link_idx, A=None, B=None, candidate_link_idx=[])
+

Shows which links are selected for roadway property change or parallel +managed lanes category of roadway projects.

+
+
Parameters:
+
    +
  • selected_links_idx – list of selected link indices

  • +
  • candidate_links_idx – optional list of candidate link indices to also include in map

  • +
  • A – optional foreign key of starting node of a route selection

  • +
  • B – optional foreign key of ending node of a route selection

  • +
+
+
+
+ +
+
+shortest_path(graph_links_df, O_id, D_id, nodes_df=None, weight_column='i', weight_factor=100)
+
+
Return type:
+

tuple

+
+
Parameters:
+
    +
  • graph_links_df

  • +
  • O_id – foreign key for start node

  • +
  • D_id – foreign key for end node

  • +
  • nodes_df – optional nodes df, otherwise will use network instance

  • +
  • weight_column – column to use as a weight, defaults to “i”

  • +
  • weight_factor – any additional weighting to multiply the weight column by, defaults to SP_WEIGHT_FACTOR

  • +
+
+
+

Returns: tuple with length of four +- Boolean if shortest path found +- nx Directed graph of graph links +- route of shortest path nodes as List +- links in shortest path selected from links_df

+
+ +
+
+split_properties_by_time_period_and_category(properties_to_split=None)[source]
+

Splits properties by time period, assuming a variable structure of

+
+
Parameters:
+

properties_to_split – dict
    dictionary of output variable prefix mapped to the source variable and what to stratify it by, e.g.:

    {
        'lanes': {'v': 'lanes', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
        'ML_lanes': {'v': 'ML_lanes', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
        'use': {'v': 'use', 'times_periods': {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")}},
    }

+

+
+
+
+ +
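A minimal sketch (not from the source docs) of split_properties_by_time_period_and_category; net is an assumed network instance, and the inner key name follows the Parameters.properties_to_split default shown later in this document ("time_periods"):

    # Hypothetical sketch: split a complex property into time-period columns
    # such as lanes_AM / lanes_PM.
    net.split_properties_by_time_period_and_category(
        properties_to_split={
            "lanes": {
                "v": "lanes",
                "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
            },
        }
    )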
+
+update_distance(links_df=None, use_shapes=False, units='miles', network_variable='distance', overwrite=True, inplace=True)
+

Calculate link distance in specified units to network variable using either straight line +distance or (if specified) shape distance if available.

+
+
Parameters:
+
    +
  • links_df – Links GeoDataFrame. Useful if want to update a portion of network links +(i.e. only centroid connectors). If not provided, will use entire self.links_df.

  • +
  • use_shapes – if True, will add length information from self.shapes_df rather than crow-fly. +If no corresponding shape found in self.shapes_df, will default to crow-fly.

  • +
  • units – units to use. Defaults to the standard unit of miles. Available units: “meters”, “miles”.

  • +
  • network_variable – variable to store link distance in. Defaults to “distance”.

  • +
  • overwrite – Defaults to True and will overwrite all existing calculated distances. +False will only update NaNs.

  • +
  • inplace – updates self.links_df

  • +
+
+
Returns:
+

links_df with updated distance

+
+
+
+ +
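A minimal sketch (not from the source docs) of update_distance; net is an assumed, already-read network instance:

    # Hypothetical sketch: recompute distances in miles from link shapes where available,
    # only filling in values that are currently NaN.
    links_with_distance = net.update_distance(
        use_shapes=True,            # prefer shape length over crow-fly distance
        units="miles",
        network_variable="distance",
        overwrite=False,            # only update NaN values
        inplace=False,              # return an updated links_df instead of modifying net
    )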
+ +

Validate roadway network data link schema and output a boolean

+
+ +
+
+static validate_node_schema(node_file, schema_location='roadway_network_node.json')
+

Validate roadway network data node schema and output a boolean

+
+ +
+
+validate_properties(properties, ignore_existing=False, require_existing_for_change=False)
+

If there are change or existing commands, make sure that the property exists in the network.

+
+
Return type:
+

bool

+
+
Parameters:
+
    +
  • properties – properties dictionary to be evaluated

  • +
  • ignore_existing – If True, will only warn about properties +that specify an “existing” value. If False, will fail.

  • +
  • require_existing_for_change – If True, will fail if there isn't a specified value in the project card for existing when a change is specified.

  • +
+
+
+

Returns: boolean value as to whether the properties dictionary is valid.

+
+ +
+
+validate_selection(selection, selection_requires=['link'])
+

Evaluate whether the selection dictionary contains the minimum required values.

+
+
Return type:
+

bool

+
+
Parameters:
+

selection – selection dictionary to be evaluated

+
+
+

Returns: boolean value as to whether the selection dictionary is valid.

+
+ +
+
+static validate_shape_schema(shape_file, schema_location='roadway_network_shape.json')
+

Validate roadway network data shape schema and output a boolean

+
+ +
+
+validate_uniqueness()
+

Confirms that the unique identifiers are met.

+
+
Return type:
+

bool

+
+
+
+ +
+
+write(path='.', filename=None)
+

Writes a network in the roadway network standard

+
+
Return type:
+

None

+
+
Parameters:
+
    +
  • path – the path where the output will be saved

  • +
  • filename – the name prefix of the roadway files that will be generated

  • +
+
+
+
+ +
+
+write_roadway_as_fixedwidth(output_dir, node_output_variables=None, link_output_variables=None, output_link_txt=None, output_node_txt=None, output_link_header_width_txt=None, output_node_header_width_txt=None, output_cube_network_script=None, drive_only=False)[source]
+

Writes out fixed width file.

+

This function does: +1. write out link and node fixed width data files for cube. +2. write out header and width correspondence. +3. write out cube network building script with header and width specification.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to where links, nodes and script will be written and run

  • +
  • node_output_variables (list) – list of node variable names.

  • +
  • link_output_variables (list) – list of link variable names.

  • +
  • output_link_txt (str) – File name of output link database (within output_dir)

  • +
  • output_node_txt (str) – File name of output node database (within output_dir)

  • +
  • output_link_header_width_txt (str) – File name of link column width records (within output_dir)

  • +
  • output_node_header_width_txt (str) – File name of node column width records (within output_dir)

  • +
  • output_cube_network_script (str) – File name of CUBE network building script (within output_dir)

  • +
  • drive_only (bool) – If True, only writes drive nodes and links

  • +
+
+
Returns:
+

None

+
+
+
+ +
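A minimal sketch (not from the source docs) of write_roadway_as_fixedwidth; net is an assumed network instance and the directory and file names are illustrative:

    # Hypothetical sketch: write the Cube fixed-width link/node files plus the
    # network-building script into an output directory.
    net.write_roadway_as_fixedwidth(
        output_dir="example/output",
        output_link_txt="links.txt",
        output_node_txt="nodes.txt",
        output_link_header_width_txt="links_header_width.txt",
        output_node_header_width_txt="nodes_header_width.txt",
        output_cube_network_script="make_network.s",
        drive_only=False,
    )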
+
+write_roadway_as_shp(output_dir, node_output_variables=None, link_output_variables=None, data_to_csv=True, data_to_dbf=False, output_link_shp=None, output_node_shp=None, output_link_csv=None, output_node_csv=None, output_gpkg=None, output_link_gpkg_layer=None, output_node_gpkg_layer=None, output_gpkg_link_filter=None)[source]
+

Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names.

+
+
Parameters:
+
    +
  • output_dir (str) – File path to directory

  • +
  • node_output_variables (list) – List of strings for node output variables.

  • +
  • link_output_variables (list) – List of strings for link output variables.

  • +
  • data_to_csv (bool) – True if write network in csv format.

  • +
  • data_to_dbf (bool) – True if write network in dbf/shp format.

  • +
  • output_link_shp (str) – File name to output link dbf/shp.

  • +
  • output_node_shp (str) – File name of output node dbf/shp.

  • +
  • output_link_csv (str) – File name to output link csv.

  • +
  • output_node_csv (str) – File name to output node csv.

  • +
  • output_gpkg (str) – File name to output GeoPackage.

  • +
  • output_link_gpkg_layer (str) – Layer name within output_gpkg to output links.

  • +
  • output_node_gpkg_layer (str) – Layer name within output_gpkg to output nodes.

  • +
  • output_gpkg_link_filter (str) – Optional column name used to output additional link subset layers.

  • +
+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'centroidconnect']
+
+ +
+ +

lasso.Parameters

+
+
+class lasso.Parameters(**kwargs)[source]
+

Bases: object

+

A class representing all the parameters defining the networks +including time of day, categories, etc.

+

Parameters can be set at runtime by initializing a parameters instance +with a keyword argument setting the attribute. Parameters that are +not explicitly set will use default parameters listed in this class. +.. highlight:: python

+
+
Attr:
+
time_period_to_time (dict): Maps time period abbreviations used in

Cube to time of days used on gtfs and highway network standard +Default:

+
{
+    "EA": ("3:00", "6:00"),
+    "AM": ("6:00, "10:00"),
+    "MD": ("10:00", "15:00"),
+    "PM": ("15:00", "19:00"),
+    "EV": ("19:00", "3:00"),
+}
+
+
+
+
cube_time_periods (dict): Maps cube time period numbers used in

transit line files to the time period abbreviations in time_period_to_time +dictionary. +Default:

+
{"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"}
+
+
+
+
categories (dict): Maps demand category abbreviations to a list of

network categories they are allowed to use. +Default:

+
{
+    # suffix, source (in order of search)
+    "sov": ["sov", "default"],
+    "hov2": ["hov2", "default", "sov"],
+    "hov3": ["hov3", "hov2", "default", "sov"],
+    "truck": ["trk", "sov", "default"],
+}
+
+
+
+
properties_to_split (dict): Dictionary mapping variables in standard

roadway network to categories and time periods that need to be +split out in final model network to get variables like LANES_AM. +Default:

+
{
+    "lanes": {
+        "v": "lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "ML_lanes": {
+        "v": "ML_lanes",
+        "time_periods": self.time_periods_to_time
+    },
+    "use": {
+        "v": "use",
+        "time_periods": self.time_periods_to_time
+    },
+}
+
+
+
+
county_shape (str): File location of shapefile defining counties.

Default:

+
r"metcouncil_data/county/cb_2017_us_county_5m.shp"
+
+
+
+
county_variable_shp (str): Property defining the county name in

the county_shape file. +Default:

+
NAME
+
+
+
+
lanes_lookup_file (str): Lookup table of number of lanes for different data sources.

Default:

+
r"metcouncil_data/lookups/lanes.csv"
+
+
+
+
centroid_connect_lanes (int): Number of lanes for centroid connectors.

Default:

+
1
+
+
+
+
mpo_counties (list): list of county names within MPO boundary.

Default:

+
[
+    "ANOKA",
+    "DAKOTA",
+    "HENNEPIN",
+    "RAMSEY",
+    "SCOTT",
+    "WASHINGTON",
+    "CARVER",
+]
+
+
+
+
taz_shape (str):

Default:

+
r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp"
+
+
+
+
taz_data (str):

Default:

+
??
+
+
+
+
highest_taz_number (int): highest TAZ number in order to define

centroid connectors. +Default:

+
3100
+
+
+
+
output_variables (list): list of variables to output in final model

network. +Default:

+
[
+    "model_link_id",
+    "link_id",
+    "A",
+    "B",
+    "shstGeometryId",
+    "distance",
+    "roadway",
+    "name",
+    "roadway_class",
+    "bike_access",
+    "walk_access",
+    "drive_access",
+    "truck_access",
+    "trn_priority_EA",
+    "trn_priority_AM",
+    "trn_priority_MD",
+    "trn_priority_PM",
+    "trn_priority_EV",
+    "ttime_assert_EA",
+    "ttime_assert_AM",
+    "ttime_assert_MD",
+    "ttime_assert_PM",
+    "ttime_assert_EV",
+    "lanes_EA",
+    "lanes_AM",
+    "lanes_MD",
+    "lanes_PM",
+    "lanes_EV",
+    "price_sov_EA",
+    "price_hov2_EA",
+    "price_hov3_EA",
+    "price_truck_EA",
+    "price_sov_AM",
+    "price_hov2_AM",
+    "price_hov3_AM",
+    "price_truck_AM",
+    "price_sov_MD",
+    "price_hov2_MD",
+    "price_hov3_MD",
+    "price_truck_MD",
+    "price_sov_PM",
+    "price_hov2_PM",
+    "price_hov3_PM",
+    "price_truck_PM",
+    "price_sov_EV",
+    "price_hov2_EV",
+    "price_hov3_EV",
+    "price_truck_EV",
+    "roadway_class_idx",
+    "facility_type",
+    "county",
+    "centroidconnect",
+    "model_node_id",
+    "N",
+    "osm_node_id",
+    "bike_node",
+    "transit_node",
+    "walk_node",
+    "drive_node",
+    "geometry",
+    "X",
+    "Y",
+    "ML_lanes_EA",
+    "ML_lanes_AM",
+    "ML_lanes_MD",
+    "ML_lanes_PM",
+    "ML_lanes_EV",
+    "segment_id",
+    "managed",
+    "bus_only",
+    "rail_only"
+]
+
+
+
+
osm_facility_type_dict (dict): Mapping between OSM Roadway variable

and facility type. Default:

+
+
area_type_shape (str): Location of shapefile defining area type.

Default:

+
r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp"
+
+
+
+
area_type_variable_shp (str): property in area_type_shape with area

type in it. +Default:

+
"COMDES2040"
+
+
+
+
area_type_code_dict (dict): Mapping of the area_type_variable_shp to

the area type code used in the MetCouncil cube network. +Default:

+
{
+    23: 4,  # urban center
+    24: 3,
+    25: 2,
+    35: 2,
+    36: 1,
+    41: 1,
+    51: 1,
+    52: 1,
+    53: 1,
+    60: 1,
+}
+
+
+
+
downtown_area_type_shape (str): Location of shapefile defining downtown area type.

Default:

+
r"metcouncil_data/area_type/downtownzones_TAZ.shp"
+
+
+
+
downtown_area_type (int): Area type integer for downtown.

Default:

+
5
+
+
+
+
mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property

associated with roadway class. Default:

+
r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp"
+
+
+
+
mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp

associated with roadway class. Default:

+
"ROUTE_SYS"
+
+
+
+
widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property

associated with roadway class. Default:

+
r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp"
+
+
+
+
widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape

associated with roadway class.Default:

+
"RDWY_CTGY_"
+
+
+
+
mndot_count_shape (str): Shapefile of MnDOT links with a property

associated with counts. Default:

+
r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp"
+
+
+
+
mndot_count_variable_shp (str): The property in mndot_count_shape

associated with counts. Default:

+
+
::

“lookups/osm_highway_facility_type_crosswalk.csv”

+
+
+
+
legacy_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from Legacy TM2 network. Default:

+
"lookups/legacy_tm2_attributes.csv"
+
+
+
+
osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId

from OSM. Default:

+
"lookups/osm_lanes_attributes.csv"
+
+
+
+
tam_tm2_attributes (str): CSV file of link attributes by

shStReferenceId from TAM TM2 network. Default:

+
"lookups/tam_tm2_attributes.csv"
+
+
+
+
tom_tom_attributes (str): CSV file of link attributes by

shStReferenceId from TomTom network. Default:

+
"lookups/tomtom_attributes.csv"
+
+
+
+
sfcta_attributes (str): CSV file of link attributes by

shStReferenceId from SFCTA network. Default:

+
"lookups/sfcta_attributes.csv"
+
+
+
+
output_epsg (int): EPSG type of geographic projection for output

shapefiles. Default:

+
102646
+
+
+
+
output_link_shp (str): Output shapefile for roadway links. Default:
+
::

r”tests/scratch/links.shp”

+
+
+
+
output_node_shp (str): Output shapefile for roadway nodes. Default:
+
::

r”tests/scratch/nodes.shp”

+
+
+
+
output_link_csv (str): Output csv for roadway links. Default:
+
::

r”tests/scratch/links.csv”

+
+
+
+
output_node_csv (str): Output csv for roadway nodes. Default:
+
::

r”tests/scratch/nodes.csv”

+
+
+
+
output_link_txt (str): Output fixed format txt for roadway links. Default:
+
::

r”tests/scratch/links.txt”

+
+
+
+
output_node_txt (str): Output fixed format txt for roadway nodes. Default:
+
::

r”tests/scratch/nodes.txt”

+
+
+
+
output_link_header_width_txt (str): Header for txt roadway links. Default:
+
::

r”tests/scratch/links_header_width.txt”

+
+
+
+
output_node_header_width_txt (str): Header for txt for roadway Nodes. Default:
+
::

r”tests/scratch/nodes_header_width.txt”

+
+
+
+
output_cube_network_script (str): Cube script for importing

fixed-format roadway network. Default:

+
r"tests/scratch/make_complete_network_from_fixed_width_file.s
+
+
+
+
+
+
+
+
+__init__(**kwargs)[source]
+

Time period and category splitting info

+
+ +
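A minimal sketch (not from the source docs) of constructing Parameters with runtime overrides; the specific values are illustrative and any attribute not passed keeps the class default described above:

    from lasso import Parameters

    parameters = Parameters(
        time_period_to_time={
            "AM": ("6:00", "10:00"),
            "PM": ("15:00", "19:00"),
        },
        centroid_connect_lanes=1,
    )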

Methods

+ + + + + + +

__init__(**kwargs)

Time period and category splitting info

+

Attributes

+ + + + + + + + + + + + + + + +

cube_time_periods

#MC self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

properties_to_split

Details for calculating the county based on the centroid of the link.

county_link_range_dict

self.county_code_dict = {

zones

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+
self.county_code_dict = {

"Anoka": 1,
"Carver": 2,
"Dakota": 3,
"Hennepin": 4,
"Ramsey": 5,
"Scott": 6,
"Washington": 7,
"external": 10,
"Chisago": 11,
"Goodhue": 12,
"Isanti": 13,
"Le Sueur": 14,
"McLeod": 15,
"Pierce": 16,
"Polk": 17,
"Rice": 18,
"Sherburne": 19,
"Sibley": 20,
"St. Croix": 21,
"Wright": 22,

+
+
+

}

+
+ +
+
+cube_time_periods
+

#MC
self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7}

+

self.route_type_mode_dict = {0: 8, 2: 9}

+

self.cube_time_periods = {"1": "AM", "2": "MD"}
self.cube_time_periods_name = {"AM": "pk", "MD": "op"}

+
+ +
+
+properties_to_split
+

Details for calculating the county based on the centroid of the link. The NAME variable should be the name of a field in the shapefile.

+
+ +
+
+zones
+

Create all the possible headway variable combinations based on the cube time periods setting

+
+ +
+ +

lasso.Project

+
+
+class lasso.Project(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

Bases: object

+

A single or set of changes to the roadway or transit system.

+

Compares a base and a build transit network or a base and build +highway network and produces project cards.

+

Typical usage example:

+
test_project = Project.create_project(
+    base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+test_project.evaluate_changes()
+test_project.write_project_card(
+    os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+)
+
+
+
+
+DEFAULT_PROJECT_NAME
+

a class-level constant that defines what +the project name will be if none is set.

+
+ +
+
+STATIC_VALUES
+

a class-level constant which defines values that +are not evaluated when assessing changes.

+
+ +
+
+card_data
+

{“project”: <project_name>, “changes”: <list of change dicts>}

+
+
Type:
+

dict

+
+
+
+ +
+ +

pandas dataframe of CUBE roadway link changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+roadway_node_changes
+

pandas dataframe of CUBE roadway node changes.

+
+
Type:
+

DataFrame

+
+
+
+ +
+
+transit_changes
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+base_roadway_network
+
+
Type:
+

RoadwayNetwork

+
+
+
+ +
+
+base_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+build_cube_transit_network
+
+
Type:
+

CubeTransit

+
+
+
+ +
+
+project_name
+

name of the project, set to DEFAULT_PROJECT_NAME if not provided

+
+
Type:
+

str

+
+
+
+ +
+
+parameters
+

an instance of the Parameters class which sets a bunch of parameters

+
+ +
+
+__init__(roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name='', evaluate=False, parameters={})[source]
+

ProjectCard constructor.

+
+
Parameters:
+
    +
  • roadway_link_changes – dataframe of roadway changes read from a log file

  • +
  • roadway_node_changes – dataframe of roadway changes read from a log file

  • +
  • transit_changes – dataframe of transit changes read from a log file

  • +
  • base_roadway_network – RoadwayNetwork instance for base case

  • +
  • base_transit_network – StandardTransit instance for base case

  • +
  • base_cube_transit_network – CubeTransit instance for base transit network

  • +
  • build_cube_transit_network – CubeTransit instance for build transit network

  • +
  • project_name – name of the project

  • +
  • evaluate – defaults to false, but if true, will create card data

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters.

  • +
+
+
+

returns: instance of ProjectCard

+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__([roadway_link_changes, ...])

ProjectCard constructor.

add_highway_changes([...])

Evaluates changes from the log file based on the base highway object and adds entries into the self.card_data dictionary.

add_transit_changes()

Evaluates changes between base and build transit objects and adds entries into the self.card_data dictionary.

create_project([roadway_log_file, ...])

Constructor for a Project instance.

determine_roadway_network_changes_compatibility(...)

Checks to see that any links or nodes that change exist in base roadway network.

emme_id_to_wrangler_id(emme_link_change_df, ...)

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

emme_name_to_wrangler_name(...)

rename emme names to wrangler names using crosswalk file

evaluate_changes()

Determines which changes should be evaluated, initiates self.card_data to be an aggregation of transit and highway changes.

get_object_from_network_build_command()

determine the network build object is node or link

get_operation_from_network_build_command()

determine the network build object action type

read_logfile(logfilename)

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

read_network_build_file(networkbuildfilename)

Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

write_project_card([filename])

Writes project cards.

+

Attributes

+ + + + + + + + + + + + +

CALCULATED_VALUES

DEFAULT_PROJECT_NAME

STATIC_VALUES

+
+
+add_highway_changes(limit_variables_to_existing_network=False)[source]
+

Evaluates changes from the log file based on the base highway object and +adds entries into the self.card_data dictionary.

+
+
Parameters:
+

limit_variables_to_existing_network (bool) – True if no ad-hoc variables. Default to False.

+
+
+
+ +
+
+add_transit_changes()[source]
+

Evaluates changes between base and build transit objects and +adds entries into the self.card_data dictionary.

+
+ +
+
+static create_project(roadway_log_file=None, roadway_shp_file=None, roadway_csv_file=None, network_build_file=None, emme_node_id_crosswalk_file=None, emme_name_crosswalk_file=None, base_roadway_dir=None, base_transit_dir=None, base_cube_transit_source=None, build_cube_transit_source=None, roadway_link_changes=None, roadway_node_changes=None, transit_changes=None, base_roadway_network=None, base_transit_network=None, base_cube_transit_network=None, build_cube_transit_network=None, project_name=None, recalculate_calculated_variables=False, recalculate_distance=False, parameters={}, **kwargs)[source]
+

Constructor for a Project instance.

+
+
Parameters:
+
    +
  • roadway_log_file – File path to consuming logfile or a list of logfile paths.

  • +
  • roadway_shp_file – File path to consuming shape file for roadway changes.

  • +
  • roadway_csv_file – File path to consuming csv file for roadway changes.

  • +
  • network_build_file – File path to consuming EMME network build for network changes.

  • +
  • base_roadway_dir – Folder path to base roadway network.

  • +
  • base_transit_dir – Folder path to base transit network.

  • +
  • base_cube_transit_source – Folder path to base transit network or cube line file string.

  • +
  • base_cube_transit_file – File path to base transit network.

  • +
  • build_cube_transit_source – Folder path to build transit network or cube line file string.

  • +
  • build_cube_transit_file – File path to build transit network.

  • +
  • roadway_link_changes – pandas dataframe of CUBE roadway link changes.

  • +
  • roadway_node_changes – pandas dataframe of CUBE roadway node changes.

  • +
  • transit_changes – build transit changes.

  • +
  • base_roadway_network – Base roadway network object.

  • +
  • base_cube_transit_network – Base cube transit network object.

  • +
  • build_cube_transit_network – Build cube transit network object.

  • +
  • project_name – If not provided, will default to the roadway_log_file filename if +provided (or the first filename if a list is provided)

  • +
  • recalculate_calculated_variables – if reading in a base network, if this is true it +will recalculate variables such as area type, etc. This only needs to be true +if you are creating project cards that are changing the calculated variables.

  • +
  • recalculate_distance – recalculate the distance variable. This only needs to be +true if you are creating project cards that change the distance.

  • +
  • parameters – dictionary of parameters

  • +
  • crs (int) – coordinate reference system, ESPG number

  • +
  • node_foreign_key (str) – variable linking the node table to the link table

  • +
  • link_foreign_key (list) – list of variable linking the link table to the node foreign key

  • +
  • shape_foreign_key (str) – variable linking the links table and shape table

  • +
  • unique_link_ids (list) – list of variables unique to each link

  • +
  • unique_node_ids (list) – list of variables unique to each node

  • +
  • modes_to_network_link_variables (dict) – Mapping of modes to link variables in +the network

  • +
  • modes_to_network_nodes_variables (dict) – Mapping of modes to node variables +in the network

  • +
  • managed_lanes_node_id_scalar (int) – Scalar values added to primary keys for nodes for +corresponding managed lanes.

  • +
  • managed_lanes_link_id_scalar (int) – Scalar values added to primary keys for links for +corresponding managed lanes.

  • +
  • managed_lanes_required_attributes (list) – attributes that must be specified in managed +lane projects.

  • +
  • keep_same_attributes_ml_and_gp (list) – attributes to copy to managed lanes from parallel +general purpose lanes.

  • +
+
+
Returns:
+

A Project instance.

+
+
+
+ +
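A minimal sketch (not from the source docs) of create_project, following the same pattern as the class-level usage example above; all paths and the project name are illustrative:

    import os
    from lasso import Project

    # Hypothetical sketch: build a Project from a Cube log file and a base roadway
    # network directory, evaluate the changes, and write out a project card.
    project = Project.create_project(
        roadway_log_file=os.path.join("example", "build.log"),
        base_roadway_dir=os.path.join("example", "base_roadway"),
        project_name="example_roadway_project",
    )
    project.evaluate_changes()
    project.write_project_card(os.path.join("example", "example_roadway_project.yml"))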
+
+static determine_roadway_network_changes_compatibility(base_roadway_network, roadway_link_changes, roadway_node_changes, parameters)[source]
+

Checks to see that any links or nodes that change exist in base roadway network.

+
+ +
+
+static emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file)[source]
+

rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder

+
+ +
+
+static emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file)[source]
+

rename emme names to wrangler names using crosswalk file

+
+ +
+
+evaluate_changes()[source]
+

Determines which changes should be evaluated, initiates +self.card_data to be an aggregation of transit and highway changes.

+
+ +
+
+get_object_from_network_build_command()[source]
+

determine the network build object is node or link

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘N’ for node, ‘L’ for link

+
+
+
+ +
+
+get_operation_from_network_build_command()[source]
+

determine the network build object action type

+
+
Parameters:
+

row – network build command history dataframe

+
+
Returns:
+

‘A’, ‘C’, ‘D’

+
+
+
+ +
+
+static read_logfile(logfilename)[source]
+

Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

logfilename (str or list[str]) – File path to CUBE logfile or list of logfile paths.

+
+
Returns:
+

A DataFrame representation of the log file.

+
+
+
+ +
+
+static read_network_build_file(networkbuildfilename)[source]
+

Reads an EMME network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes

+
+
Parameters:
+

networkbuildfilename (str or list[str]) – File path to the EMME network build file or a list of network build file paths.

+
+
Returns:
+

A DataFrame representation of the network build file

+
+
+
+ +
+
+write_project_card(filename=None)[source]
+

Writes project cards.

+
+
Parameters:
+

filename (str) – File path to output .yml

+
+
Returns:
+

None

+
+
+
+ +
+
+CALCULATED_VALUES = ['area_type', 'county', 'assign_group', 'centroidconnect']
+
+ +
+
+DEFAULT_PROJECT_NAME = 'USER TO define'
+
+ +
+
+STATIC_VALUES = ['model_link_id', 'area_type', 'county', 'centroidconnect']
+
+ +
+ +

lasso.StandardTransit

+
+
+class lasso.StandardTransit(ptg_feed, parameters={})[source]
+

Bases: object

+

Holds a standard transit feed as a Partridge object and contains +methods to manipulate and translate the GTFS data to MetCouncil’s +Cube Line files.

+

Typical usage example:

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+feed
+

Partridge Feed object containing read-only access to GTFS feed

+
+ +
+
+parameters
+

Parameters instance containing information +about time periods and variables.

+
+
Type:
+

Parameters

+
+
+
+ +
+
+__init__(ptg_feed, parameters={})[source]
+
+
Parameters:
+
    +
  • ptg_feed – partridge feed object

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters

  • +
+
+
+
+ +

Methods

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

__init__(ptg_feed[, parameters])

+
param ptg_feed:
+

partridge feed object

+
+
+

calculate_cube_mode(row)

Assigns a cube mode number by following logic.

cube_format(row)

Creates a string representing the route in cube line file notation, given a row of a DataFrame representing a cube-formatted trip (with the attributes trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR).

evaluate_differences(transit_changes)

Compare changes from the transit_changes dataframe with the standard transit network returns the project card changes in dictionary format

fromTransitNetwork(transit_network_object[, ...])

RoadwayNetwork to ModelRoadwayNetwork

read_gtfs(gtfs_feed_dir[, parameters])

Reads GTFS files from a directory and returns a StandardTransit instance.

route_properties_gtfs_to_cube(self)

Prepare gtfs for cube lin file.

shape_gtfs_to_cube(row[, add_nntime])

Creates a list of nodes that for the route in appropriate cube format.

shape_gtfs_to_emme(trip_row)

Creates transit segment for the trips in appropriate emme format.

time_to_cube_time_period(start_time_secs[, ...])

Converts seconds from midnight to the cube time period.

write_as_cube_lin([outpath])

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties.

+
+
+calculate_cube_mode(row)[source]
+

Assigns a cube mode number using the following logic.

For rail, uses the GTFS route_type variable: https://developers.google.com/transit/gtfs/reference

+
+
::

    # route_type : cube_mode
    route_type_to_cube_mode = {0: 8,  # Tram, Streetcar, Light rail
                               3: 0,  # Bus; further disaggregated for cube
                               2: 9}  # Rail

+
+
+
+

For buses, uses route id numbers and route name to find +express and suburban buses as follows:

+
+
::

    if not cube_mode:
        if 'express' in row['LONGNAME'].lower():
            cube_mode = 7  # Express
        elif int(row['route_id'].split("-")[0]) > 99:
            cube_mode = 6  # Suburban Local
        else:
            cube_mode = 5  # Urban Local

+
+
+
+
+
+
+
+
Parameters:
+

row – A DataFrame row with route_type, route_long_name, and route_id

+
+
Returns:
+

cube mode number

+
+
+
+ +
+
+cube_format(row)[source]
+

Creates a string representing the route in cube line file notation.

Parameters:

row – row of a DataFrame representing a cube-formatted trip, with the attributes

+
+

trip_id, shape_id, NAME, LONGNAME, tod, HEADWAY, MODE, ONEWAY, OPERATOR

+
+
+
Returns:
+

string representation of route in cube line file notation

+
+
+
+ +
+
+evaluate_differences(transit_changes)[source]
+

Compare changes from the transit_changes dataframe with the standard transit network +returns the project card changes in dictionary format

+
+ +
+
+static fromTransitNetwork(transit_network_object, parameters={})[source]
+

RoadwayNetwork to ModelRoadwayNetwork

+
+
Parameters:
+
    +
  • transit_network_object – Reference to an instance of TransitNetwork.

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit

+
+
+
+ +
+
+static read_gtfs(gtfs_feed_dir, parameters={})[source]
+

Reads GTFS files from a directory and returns a StandardTransit +instance.

+
+
Parameters:
+
    +
  • gtfs_feed_dir – location of the GTFS files

  • +
  • parameters – dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will +use default parameters.

  • +
+
+
Returns:
+

StandardTransit instance

+
+
+
+ +
+
+static route_properties_gtfs_to_cube(self)[source]
+

Prepare GTFS for the cube lin file.

Does the following operations:

1. Combines route, frequency, trip, and shape information
2. Converts time of day to time periods
3. Calculates cube route name from gtfs route name and properties
4. Assigns a cube-appropriate mode number
5. Assigns a cube-appropriate operator number

+
+
Returns:
+

+
DataFrame of trips with cube-appropriate values for:
    +
  • NAME

  • +
  • ONEWAY

  • +
  • OPERATOR

  • +
  • MODE

  • +
  • HEADWAY

  • +
+
+
+

+
+
Return type:
+

trip_df (DataFrame)

+
+
+
+ +
+
+shape_gtfs_to_cube(row, add_nntime=False)[source]
+

Creates a list of nodes for the route in the appropriate cube format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a string representation of the node list

for a route in cube format.

+
+
+
+ +
+
+shape_gtfs_to_emme(trip_row)[source]
+

Creates transit segments for the trips in the appropriate emme format.

+
+
Parameters:
+

row – DataFrame row with both shape_id and trip_id

+
+
+
+
Returns: a dataframe representation of the transit segment

for a trip in emme format.

+
+
+
+ +
+
+time_to_cube_time_period(start_time_secs, as_str=True, verbose=False)[source]
+

Converts seconds from midnight to the cube time period.

+
+
Parameters:
+
    +
  • start_time_secs – start time for transit trip in seconds +from midnight

  • +
  • as_str – if True, returns the time period as a string, +otherwise returns a numeric time period

  • +
+
+
Returns:
+

+
if as_str is False, returns the numeric

time period

+
+
this_tp: if as_str is True, returns the Cube time period

name abbreviation

+
+
+

+
+
Return type:
+

this_tp_num

+
+
+
+ +
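A minimal sketch (not from the source docs) of time_to_cube_time_period; the GTFS directory path is hypothetical, and the resulting period abbreviation depends on the time period definitions in the Parameters instance:

    from lasso import StandardTransit

    transit_net = StandardTransit.read_gtfs("example/gtfs")  # hypothetical GTFS directory

    # 8:30 AM expressed as seconds from midnight.
    secs = 8 * 3600 + 30 * 60
    tod = transit_net.time_to_cube_time_period(secs, as_str=True)
    # Under the default period definitions this would be the "AM" abbreviation.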
+
+write_as_cube_lin(outpath=None)[source]
+

Writes the gtfs feed as a cube line file after converting gtfs properties to MetCouncil cube properties.

Parameters:

outpath – File location for the output cube line file.

+
+ +
+ +

lasso.logger

+

Functions

+ + + + + + +

setupLogging(infoLogFilename, debugLogFilename)

Sets up the logger.

+
+
+lasso.logger.setupLogging(infoLogFilename, debugLogFilename, logToConsole=True)[source]
+

Sets up the logger. The infoLog is terse, just gives the bare minimum of details +so the network composition will be clear later. +The debuglog is very noisy, for debugging.

+

Pass None to either to skip it. Spews it all out to the console too, if logToConsole is True.

+
+ +
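A minimal sketch (not from the source docs) of setupLogging; the log file names are illustrative:

    from lasso.logger import setupLogging

    # Hypothetical sketch: write a terse info log and a verbose debug log to file,
    # and echo everything to the console as well.
    setupLogging(
        infoLogFilename="lasso_info.log",
        debugLogFilename="lasso_debug.log",
        logToConsole=True,
    )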

lasso.util

+

Functions

+ + + + + + + + + + + + + + + + + + + + + + + + +

column_name_to_parts(c[, parameters])

create_locationreference(node, link)

geodesic_point_buffer(lat, lon, meters)

creates circular buffer polygon for node

get_shared_streets_intersection_hash(lat, long)

Calculated per:

hhmmss_to_datetime(hhmmss_str)

Creates a datetime time object from a string of hh:mm:ss

secs_to_datetime(secs)

Creates a datetime time object from a seconds from midnight

shorten_name(name)

+
+
+class lasso.util.Point(*args)[source]
+

Bases: BaseGeometry

+

A geometry type that represents a single coordinate with +x,y and possibly z values.

+

A point is a zero-dimensional feature and has zero length and zero area.

+
+
Parameters:
+

args (float, or sequence of floats) –

The coordinates can either be passed as a single parameter, or as +individual float values using multiple parameters:

+
    +
  1. 1 parameter: a sequence or array-like of with 2 or 3 values.

  2. +
  3. 2 or 3 parameters (float): x, y, and possibly z.

  4. +
+

+
+
+
+
+x, y, z
+

Coordinate values

+
+
Type:
+

float

+
+
+
+ +

Examples

+

Constructing the Point using separate parameters for x and y:

+
>>> p = Point(1.0, -1.0)
+
+
+

Constructing the Point using a list of x, y coordinates:

+
>>> p = Point([1.0, -1.0])
+>>> print(p)
+POINT (1 -1)
+>>> p.y
+-1.0
+>>> p.x
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • tolerance (float) – Absolute tolerance in the same units as coordinates.

This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry. Because of this it is possible for “equals()” to be True for two geometries and “equals_exact()” to be False.
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG circle element for the Point geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG circle diameter. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property x
+

Return x coordinate.

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+

Example

+
>>> x, y = Point(0, 0).xy
+>>> list(x)
+[0.0]
+>>> list(y)
+[0.0]
+
+
+
+ +
+
+property y
+

Return y coordinate.

+
+ +
+
+property z
+

Return z coordinate.

+
+ +
+ +
+
+class lasso.util.Polygon(shell=None, holes=None)[source]
+

Bases: BaseGeometry

+

A geometry type representing an area that is enclosed by a linear ring.

+

A polygon is a two-dimensional feature and has a non-zero area. It may +have one or more negative-space “holes” which are also bounded by linear +rings. If any rings cross each other, the feature is invalid and +operations on it may fail.

+
+
Parameters:
+
    +
  • shell (sequence) – A sequence of (x, y [,z]) numeric coordinate pairs or triples, or +an array-like with shape (N, 2) or (N, 3). +Also can be a sequence of Point objects.

  • +
  • holes (sequence) – A sequence of objects which satisfy the same requirements as the +shell parameters above

  • +
+
+
+
+
+exterior
+

The ring which bounds the positive space of the polygon.

+
+
Type:
+

LinearRing

+
+
+
+ +
+
+interiors
+

A sequence of rings which bound all existing holes.

+
+
Type:
+

sequence

+
+
+
+ +

Examples

+

Create a square polygon with no holes

+
>>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.))
+>>> polygon = Polygon(coords)
+>>> polygon.area
+1.0
+
+
+
+
+almost_equals(other, decimal=6)
+

True if geometries are equal at all coordinates to a +specified decimal place.

+
+

Deprecated since version 1.8.0: The ‘almost_equals()’ method is deprecated +and will be removed in Shapely 2.1 because the name is +confusing. The ‘equals_exact()’ method should be used +instead.

+
+

Refers to approximate coordinate equality, which requires +coordinates to be approximately equal and in the same order for +all components of a geometry.

+

Because of this it is possible for “equals()” to be True for two +geometries and “almost_equals()” to be False.

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+buffer(distance, quad_segs=16, cap_style='round', join_style='round', mitre_limit=5.0, single_sided=False, **kwargs)
+

Get a geometry that represents all points within a distance +of this geometry.

+

A positive distance produces a dilation, a negative distance an +erosion. A very small or zero distance may sometimes be used to +“tidy” a polygon.

+
+
Parameters:
+
    +
  • distance (float) – The distance to buffer around the object.

  • +
  • resolution (int, optional) – The resolution of the buffer around each vertex of the +object.

  • +
  • quad_segs (int, optional) – Sets the number of line segments used to approximate an +angle fillet.

  • +
  • cap_style (shapely.BufferCapStyle or {'round', 'square', 'flat'}, default 'round') – Specifies the shape of buffered line endings. BufferCapStyle.round (‘round’) +results in circular line endings (see quad_segs). Both BufferCapStyle.square +(‘square’) and BufferCapStyle.flat (‘flat’) result in rectangular line endings, +only BufferCapStyle.flat (‘flat’) will end at the original vertex, +while BufferCapStyle.square (‘square’) involves adding the buffer width.

  • +
  • join_style (shapely.BufferJoinStyle or {'round', 'mitre', 'bevel'}, default 'round') – Specifies the shape of buffered line midpoints. BufferJoinStyle.ROUND (‘round’) +results in rounded shapes. BufferJoinStyle.bevel (‘bevel’) results in a beveled +edge that touches the original vertex. BufferJoinStyle.mitre (‘mitre’) results +in a single vertex that is beveled depending on the mitre_limit parameter.

  • +
  • mitre_limit (float, optional) – The mitre limit ratio is used for very sharp corners. The +mitre ratio is the ratio of the distance from the corner to +the end of the mitred offset corner. When two line segments +meet at a sharp angle, a miter join will extend the original +geometry. To prevent unreasonable geometry, the mitre limit +allows controlling the maximum length of the join corner. +Corners with a ratio which exceed the limit will be beveled.

  • +
  • single_side (bool, optional) –

    The side used is determined by the sign of the buffer +distance:

    +
    +

    a positive distance indicates the left-hand side +a negative distance indicates the right-hand side

    +
    +

    The single-sided buffer of point geometries is the same as +the regular buffer. The End Cap Style for single-sided +buffers is always ignored, and forced to the equivalent of +CAP_FLAT.

    +

  • +
  • quadsegs (int, optional) – Deprecated alias for quad_segs.

  • +
+
+
Return type:
+

Geometry

+
+
+

Notes

+

The return value is a strictly two-dimensional geometry. All +Z coordinates of the original geometry will be ignored.

+

Examples

+
>>> from shapely.wkt import loads
+>>> g = loads('POINT (0.0 0.0)')
+
+
+

16-gon approx of a unit radius circle:

+
>>> g.buffer(1.0).area  
+3.1365484905459...
+
+
+

128-gon approximation:

+
>>> g.buffer(1.0, 128).area  
+3.141513801144...
+
+
+

triangle approximation:

+
>>> g.buffer(1.0, 3).area
+3.0
+>>> list(g.buffer(1.0, cap_style=BufferCapStyle.square).exterior.coords)
+[(1.0, 1.0), (1.0, -1.0), (-1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
+>>> g.buffer(1.0, cap_style=BufferCapStyle.square).area
+4.0
+
+
+
+ +
+
+contains(other)
+

Returns True if the geometry contains the other, else False

+
+ +
+
+contains_properly(other)
+

Returns True if the geometry completely contains the other, with no +common boundary points, else False

+

Refer to shapely.contains_properly for full documentation.

+
+ +
+
+covered_by(other)
+

Returns True if the geometry is covered by the other, else False

+
+ +
+
+covers(other)
+

Returns True if the geometry covers the other, else False

+
+ +
+
+crosses(other)
+

Returns True if the geometries cross, else False

+
+ +
+
+difference(other, grid_size=None)
+

Returns the difference of the geometries.

+

Refer to shapely.difference for full documentation.

+
+ +
+
+disjoint(other)
+

Returns True if geometries are disjoint, else False

+
+ +
+
+distance(other)
+

Unitless distance to other geometry (float)

+
+ +
+
+dwithin(other, distance)
+

Returns True if geometry is within a given distance from the other, else False.

+

Refer to shapely.dwithin for full documentation.

+
+ +
+
+equals(other)
+

Returns True if geometries are equal, else False.

+

This method considers point-set equality (or topological +equality), and is equivalent to (self.within(other) & +self.contains(other)).

+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals(
+...     LineString([(0, 0), (1, 1), (2, 2)])
+... )
+True
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+equals_exact(other, tolerance)
+

True if geometries are equal to within a specified +tolerance.

+
+
Parameters:
+
    +
  • other (BaseGeometry) – The other geometry object in this comparison.

  • tolerance (float) – Absolute tolerance in the same units as coordinates.

This method considers coordinate equality, which requires coordinates to be equal and in the same order for all components of a geometry. Because of this it is possible for “equals()” to be True for two geometries and “equals_exact()” to be False.
+
+
+

Examples

+
>>> LineString(
+...     [(0, 0), (2, 2)]
+... ).equals_exact(
+...     LineString([(0, 0), (1, 1), (2, 2)]),
+...     1e-6
+... )
+False
+
+
+
+
Return type:
+

bool

+
+
+
+ +
+
+classmethod from_bounds(xmin, ymin, xmax, ymax)[source]
+

Construct a Polygon() from spatial bounds.
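As an illustration (not taken from the lasso docs), an axis-aligned rectangle built from its bounds; the bounds property is documented further down this page:

>>> Polygon.from_bounds(0.0, 0.0, 2.0, 1.0).bounds
(0.0, 0.0, 2.0, 1.0)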

+
+ +
+
+geometryType()
+
+ +
+
+hausdorff_distance(other)
+

Unitless hausdorff distance to other geometry (float)

+
+ +
+
+interpolate(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of line_interpolate_point.

+
+ +
+
+intersection(other, grid_size=None)
+

Returns the intersection of the geometries.

+

Refer to shapely.intersection for full documentation.

+
+ +
+
+intersects(other)
+

Returns True if geometries intersect, else False

+
+ +
+
+line_interpolate_point(distance, normalized=False)
+

Return a point at the specified distance along a linear geometry

+

Negative length values are taken as measured in the reverse +direction from the end of the geometry. Out-of-range index +values are handled by clamping them to the valid range of values. +If the normalized arg is True, the distance will be interpreted as a +fraction of the geometry’s length.

+

Alias of interpolate.

+
+ +
+
+line_locate_point(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of project.

+
+ +
+
+normalize()
+

Converts geometry to normal form (or canonical form).

+

This method orders the coordinates, rings of a polygon and parts of +multi geometries consistently. Typically useful for testing purposes +(for example in combination with equals_exact).

+

Examples

+
>>> from shapely import MultiLineString
+>>> line = MultiLineString([[(0, 0), (1, 1)], [(3, 3), (2, 2)]])
+>>> line.normalize()
+<MULTILINESTRING ((2 2, 3 3), (0 0, 1 1))>
+
+
+
+ +
+
+overlaps(other)
+

Returns True if geometries overlap, else False

+
+ +
+
+point_on_surface()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of representative_point.

+
+ +
+
+project(other, normalized=False)
+

Returns the distance along this geometry to a point nearest the +specified point

+

If the normalized arg is True, return the distance normalized to the +length of the linear geometry.

+

Alias of line_locate_point.

+
+ +
+
+relate(other)
+

Returns the DE-9IM intersection matrix for the two geometries +(string)

+
+ +
+
+relate_pattern(other, pattern)
+

Returns True if the DE-9IM string code for the relationship between +the geometries satisfies the pattern, else False

+
+ +
+
+representative_point()
+

Returns a point guaranteed to be within the object, cheaply.

+

Alias of point_on_surface.

+
+ +
+
+reverse()
+

Returns a copy of this geometry with the order of coordinates reversed.

+

If the geometry is a polygon with interior rings, the interior rings are also +reversed.

+

Points are unchanged.

+
+

See also

+
+
is_ccw

Checks if a geometry is clockwise.

+
+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (1, 2)]).reverse()
+<LINESTRING (1 2, 0 0)>
+>>> Polygon([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]).reverse()
+<POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))>
+
+
+
+ +
+
+segmentize(max_segment_length)
+

Adds vertices to line segments based on maximum segment length.

+

Additional vertices will be added to every line segment in an input geometry +so that segments are no longer than the provided maximum segment length. New +vertices will evenly subdivide each segment.

+

Only linear components of input geometries are densified; other geometries +are returned unmodified.

+
+
Parameters:
+

max_segment_length (float or array_like) – Additional vertices will be added so that all line segments are no +longer this value. Must be greater than 0.

+
+
+

Examples

+
>>> from shapely import LineString, Polygon
+>>> LineString([(0, 0), (0, 10)]).segmentize(max_segment_length=5)
+<LINESTRING (0 0, 0 5, 0 10)>
+>>> Polygon([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]).segmentize(max_segment_length=5)
+<POLYGON ((0 0, 5 0, 10 0, 10 5, 10 10, 5 10, 0 10, 0 5, 0 0))>
+
+
+
+ +
+
+simplify(tolerance, preserve_topology=True)
+

Returns a simplified geometry produced by the Douglas-Peucker +algorithm

+

Coordinates of the simplified geometry will be no more than the +tolerance distance from the original. Unless the topology preserving +option is used, the algorithm may produce self-intersecting or +otherwise invalid geometries.

+
+ +
+
+svg(scale_factor=1.0, fill_color=None, opacity=None)[source]
+

Returns SVG path element for the Polygon geometry.

+
+
Parameters:
+
    +
  • scale_factor (float) – Multiplication factor for the SVG stroke-width. Default is 1.

  • +
  • fill_color (str, optional) – Hex string for fill color. Default is to use “#66cc99” if +geometry is valid, and “#ff3333” if invalid.

  • +
  • opacity (float) – Float number between 0 and 1 for color opacity. Default value is 0.6

  • +
+
+
+
+ +
+
+symmetric_difference(other, grid_size=None)
+

Returns the symmetric difference of the geometries.

+

Refer to shapely.symmetric_difference for full documentation.

+
+ +
+
+touches(other)
+

Returns True if geometries touch, else False

+
+ +
+
+union(other, grid_size=None)
+

Returns the union of the geometries.

+

Refer to shapely.union for full documentation.

+
+ +
+
+within(other)
+

Returns True if geometry is within the other, else False

+
+ +
+
+property area
+

Unitless area of the geometry (float)

+
+ +
+
+property boundary
+

Returns a lower dimension geometry that bounds the object

+

The boundary of a polygon is a line, the boundary of a line is a +collection of points. The boundary of a point is an empty (null) +collection.

+
+ +
+
+property bounds
+

Returns minimum bounding region (minx, miny, maxx, maxy)

+
+ +
+
+property centroid
+

Returns the geometric center of the object

+
+ +
+
+property convex_hull
+

Imagine an elastic band stretched around the geometry: that’s a convex hull, more or less.

The convex hull of a three member multipoint, for example, is a triangular polygon.

+
+
+
+ +
+
+property coords
+

Access to geometry’s coordinates (CoordinateSequence)

+
+ +
+
+property envelope
+

A figure that envelopes the geometry

+
+ +
+
+property exterior
+
+ +
+
+property geom_type
+

Name of the geometry’s type, such as ‘Point’

+
+ +
+
+property has_z
+

True if the geometry’s coordinate sequence(s) have z values (are +3-dimensional)

+
+ +
+
+property interiors
+
+ +
+
+property is_closed
+

True if the geometry is closed, else False

+

Applicable only to 1-D geometries.

+
+ +
+
+property is_empty
+

True if the set of points in this geometry is empty, else False

+
+ +
+
+property is_ring
+

True if the geometry is a closed ring, else False

+
+ +
+
+property is_simple
+

True if the geometry is simple, meaning that any self-intersections +are only at boundary points, else False

+
+ +
+
+property is_valid
+

True if the geometry is valid (definition depends on sub-class), +else False

+
+ +
+
+property length
+

Unitless length of the geometry (float)

+
+ +
+
+property minimum_clearance
+

Unitless distance by which a node could be moved to produce an invalid geometry (float)

+
+ +
+
+property minimum_rotated_rectangle
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of oriented_envelope.

+
+ +
+
+property oriented_envelope
+

Returns the oriented envelope (minimum rotated rectangle) that +encloses the geometry.

+

Unlike envelope this rectangle is not constrained to be parallel to the +coordinate axes. If the convex hull of the object is a degenerate (line +or point) this degenerate is returned.

+

Alias of minimum_rotated_rectangle.

+
+ +
+
+property type
+
+ +
+
+property wkb
+

WKB representation of the geometry

+
+ +
+
+property wkb_hex
+

WKB hex representation of the geometry

+
+ +
+
+property wkt
+

WKT representation of the geometry

+
+ +
+
+property xy
+

Separate arrays of X and Y coordinate values

+
+ +
+ +
+
+class lasso.util.partial[source]
+

Bases: object

+

partial(func, *args, **keywords) - new function with partial application +of the given arguments and keywords.

+
+
+args
+

tuple of arguments to future partial calls

+
+ +
+
+func
+

function object to use in future partial calls

+
+ +
+
+keywords
+

dictionary of keyword arguments to future partial calls

+
+ +
+ +
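A quick illustration of partial application (standard functools behavior, shown here with the re-exported name):

>>> from lasso.util import partial
>>> basetwo = partial(int, base=2)
>>> basetwo("10010")
18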
+
+lasso.util.column_name_to_parts(c, parameters=None)[source]
+
+ +
+
+lasso.util.create_locationreference(node, link)[source]
+
+ +
+
+lasso.util.geodesic_point_buffer(lat, lon, meters)[source]
+

Creates a circular buffer polygon around a node.

+
+
Parameters:
+
    +
  • lat – node lat

  • lon – node lon

  • meters – buffer distance, radius of circle
+
+
Returns:
+

Polygon

+
+
+
+ +
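A doctest-style sketch of a typical call; the coordinates are arbitrary illustration values, and the last line assumes the documented return type (a shapely Polygon):

>>> from lasso.util import geodesic_point_buffer
>>> poly = geodesic_point_buffer(44.95, -93.09, meters=100)
>>> poly.geom_type
'Polygon'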
+
+lasso.util.get_shared_streets_intersection_hash(lat, long, osm_node_id=None)[source]
+
+
Calculated per:

https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565

Expected in/out:

in: -93.0965985, 44.952112199999995, osm_node_id = 954734870

out: 69f13f881649cb21ee3b359730790bb9

+
+
+
+
+
+ +
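The expected in/out above, restated as a doctest-style sketch; the argument order simply mirrors the docstring and has not been verified against the implementation:

>>> from lasso.util import get_shared_streets_intersection_hash
>>> get_shared_streets_intersection_hash(-93.0965985, 44.952112199999995, osm_node_id=954734870)
'69f13f881649cb21ee3b359730790bb9'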
+
+lasso.util.hhmmss_to_datetime(hhmmss_str)[source]
+

Creates a datetime time object from a string of hh:mm:ss

+
+
Parameters:
+

hhmmss_str – string of hh:mm:ss

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

dt

+
+
+
+ +
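A doctest-style sketch (the input time is an arbitrary example):

>>> from lasso.util import hhmmss_to_datetime
>>> t = hhmmss_to_datetime("06:30:00")
>>> (t.hour, t.minute)
(6, 30)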
+
+lasso.util.secs_to_datetime(secs)[source]
+

Creates a datetime.time object from seconds after midnight.

+
+
Parameters:
+

secs – seconds from midnight

+
+
Returns:
+

datetime.time object representing time

+
+
Return type:
+

dt

+
+
+
+ +
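A matching sketch for seconds after midnight (23400 seconds is 06:30:00):

>>> from lasso.util import secs_to_datetime
>>> t = secs_to_datetime(23400)
>>> (t.hour, t.minute)
(6, 30)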
+
+lasso.util.shorten_name(name)[source]
+
+ +
+
+lasso.util.transform(func, geom)[source]
+

Applies func to all coordinates of geom and returns a new +geometry of the same type from the transformed coordinates.

+

func maps x, y, and optionally z to output xp, yp, zp. The input +parameters may iterable types like lists or arrays or single values. +The output shall be of the same type. Scalars in, scalars out. +Lists in, lists out.

+

For example, here is an identity function applicable to both types +of input.

+
+
+
def id_func(x, y, z=None):
    return tuple(filter(None, [x, y, z]))

g2 = transform(id_func, g1)

+
+

Using pyproj >= 2.1, this example will accurately project Shapely geometries:

+
+

import pyproj

wgs84 = pyproj.CRS('EPSG:4326')
utm = pyproj.CRS('EPSG:32618')

project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform

g2 = transform(project, g1)

+
+

Note that the always_xy kwarg is required here as Shapely geometries only support +X,Y coordinate ordering.

+

Lambda expressions such as the one in

+
+

g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1)

+
+

also satisfy the requirements for func.

+
+ +
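Pulling the examples above into one runnable doctest-style sketch, using the Point class re-exported on this page; the WKT output assumes shapely's usual trimming of trailing zeros:

>>> from lasso.util import Point, transform
>>> transform(lambda x, y, z=None: (x + 1.0, y + 1.0), Point(0, 0)).wkt
'POINT (1 1)'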
+
+lasso.util.unidecode(string, errors='ignore', replace_str='?')
+

Transliterate a Unicode object into an ASCII string.

+
+
Return type:
+

str

+
+
+
>>> unidecode("北亰")
+"Bei Jing "
+
+
+

This function first tries to convert the string using ASCII codec. +If it fails (because of non-ASCII characters), it falls back to +transliteration using the character tables.

+

This is approx. five times faster if the string only contains ASCII +characters, but slightly slower than unicode_expect_nonascii if +non-ASCII characters are present.

+

errors specifies what to do with characters that have not been +found in replacement tables. The default is ‘ignore’ which ignores +the character. ‘strict’ raises an UnidecodeError. ‘replace’ +substitutes the character with replace_str (default is ‘?’). +‘preserve’ keeps the original character.

+

Note that if ‘preserve’ is used the returned string might not be +ASCII!

+
+ +
+ + +
+
+ +
+
+
+
\ No newline at end of file
diff --git a/branch/bicounty_emme/_modules/functools/index.html b/branch/bicounty_emme/_modules/functools/index.html
new file mode 100644
index 0000000..bd88918
--- /dev/null
+++ b/branch/bicounty_emme/_modules/functools/index.html
@@ -0,0 +1,1083 @@
+functools — lasso documentation

Source code for functools

+"""functools.py - Tools for working with functions and callable objects
+"""
+# Python module wrapper for _functools C module
+# to allow utilities written in Python to be added
+# to the functools module.
+# Written by Nick Coghlan <ncoghlan at gmail.com>,
+# Raymond Hettinger <python at rcn.com>,
+# and Łukasz Langa <lukasz at langa.pl>.
+#   Copyright (C) 2006-2013 Python Software Foundation.
+# See C source code for _functools credits/copyright
+
+__all__ = ['update_wrapper', 'wraps', 'WRAPPER_ASSIGNMENTS', 'WRAPPER_UPDATES',
+           'total_ordering', 'cmp_to_key', 'lru_cache', 'reduce', 'partial',
+           'partialmethod', 'singledispatch', 'singledispatchmethod',
+           "cached_property"]
+
+from abc import get_cache_token
+from collections import namedtuple
+# import types, weakref  # Deferred to single_dispatch()
+from reprlib import recursive_repr
+from _thread import RLock
+
+
+################################################################################
+### update_wrapper() and wraps() decorator
+################################################################################
+
+# update_wrapper() and wraps() are tools to help write
+# wrapper functions that can handle naive introspection
+
+WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__',
+                       '__annotations__')
+WRAPPER_UPDATES = ('__dict__',)
+def update_wrapper(wrapper,
+                   wrapped,
+                   assigned = WRAPPER_ASSIGNMENTS,
+                   updated = WRAPPER_UPDATES):
+    """Update a wrapper function to look like the wrapped function
+
+       wrapper is the function to be updated
+       wrapped is the original function
+       assigned is a tuple naming the attributes assigned directly
+       from the wrapped function to the wrapper function (defaults to
+       functools.WRAPPER_ASSIGNMENTS)
+       updated is a tuple naming the attributes of the wrapper that
+       are updated with the corresponding attribute from the wrapped
+       function (defaults to functools.WRAPPER_UPDATES)
+    """
+    for attr in assigned:
+        try:
+            value = getattr(wrapped, attr)
+        except AttributeError:
+            pass
+        else:
+            setattr(wrapper, attr, value)
+    for attr in updated:
+        getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
+    # Issue #17482: set __wrapped__ last so we don't inadvertently copy it
+    # from the wrapped function when updating __dict__
+    wrapper.__wrapped__ = wrapped
+    # Return the wrapper so this can be used as a decorator via partial()
+    return wrapper
+
+def wraps(wrapped,
+          assigned = WRAPPER_ASSIGNMENTS,
+          updated = WRAPPER_UPDATES):
+    """Decorator factory to apply update_wrapper() to a wrapper function
+
+       Returns a decorator that invokes update_wrapper() with the decorated
+       function as the wrapper argument and the arguments to wraps() as the
+       remaining arguments. Default arguments are as for update_wrapper().
+       This is a convenience function to simplify applying partial() to
+       update_wrapper().
+    """
+    return partial(update_wrapper, wrapped=wrapped,
+                   assigned=assigned, updated=updated)
+
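+# --- Editorial sketch, not part of functools.py: typical use of wraps(). ---
+# A decorator that logs calls while preserving the wrapped function's metadata;
+# the helper name here is hypothetical.
+def _example_log_calls(func):
+    @wraps(func)                      # copies __name__, __doc__, etc. onto the wrapper
+    def wrapper(*args, **kwargs):
+        print(f"calling {func.__name__}")
+        return func(*args, **kwargs)
+    return wrapper
+# After decoration, wrapper.__wrapped__ points back to the original function.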
+
+################################################################################
+### total_ordering class decorator
+################################################################################
+
+# The total ordering functions all invoke the root magic method directly
+# rather than using the corresponding operator.  This avoids possible
+# infinite recursion that could occur when the operator dispatch logic
+# detects a NotImplemented result and then calls a reflected method.
+
+def _gt_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a < b) and (a != b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _le_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (a < b) or (a == b).'
+    op_result = self.__lt__(other)
+    return op_result or self == other
+
+def _ge_from_lt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a < b).'
+    op_result = self.__lt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _ge_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (not a <= b) or (a == b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _lt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (a <= b) and (a != b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _gt_from_le(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (not a <= b).'
+    op_result = self.__le__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _lt_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a > b) and (a != b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result and self != other
+
+def _ge_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a >= b.  Computed by @total_ordering from (a > b) or (a == b).'
+    op_result = self.__gt__(other)
+    return op_result or self == other
+
+def _le_from_gt(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a > b).'
+    op_result = self.__gt__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+def _le_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a <= b.  Computed by @total_ordering from (not a >= b) or (a == b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result or self == other
+
+def _gt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a > b.  Computed by @total_ordering from (a >= b) and (a != b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return op_result and self != other
+
+def _lt_from_ge(self, other, NotImplemented=NotImplemented):
+    'Return a < b.  Computed by @total_ordering from (not a >= b).'
+    op_result = self.__ge__(other)
+    if op_result is NotImplemented:
+        return op_result
+    return not op_result
+
+_convert = {
+    '__lt__': [('__gt__', _gt_from_lt),
+               ('__le__', _le_from_lt),
+               ('__ge__', _ge_from_lt)],
+    '__le__': [('__ge__', _ge_from_le),
+               ('__lt__', _lt_from_le),
+               ('__gt__', _gt_from_le)],
+    '__gt__': [('__lt__', _lt_from_gt),
+               ('__ge__', _ge_from_gt),
+               ('__le__', _le_from_gt)],
+    '__ge__': [('__le__', _le_from_ge),
+               ('__gt__', _gt_from_ge),
+               ('__lt__', _lt_from_ge)]
+}
+
+def total_ordering(cls):
+    """Class decorator that fills in missing ordering methods"""
+    # Find user-defined comparisons (not those inherited from object).
+    roots = {op for op in _convert if getattr(cls, op, None) is not getattr(object, op, None)}
+    if not roots:
+        raise ValueError('must define at least one ordering operation: < > <= >=')
+    root = max(roots)       # prefer __lt__ to __le__ to __gt__ to __ge__
+    for opname, opfunc in _convert[root]:
+        if opname not in roots:
+            opfunc.__name__ = opname
+            setattr(cls, opname, opfunc)
+    return cls
+
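+# --- Editorial sketch, not part of functools.py: total_ordering in practice. ---
+# Only __eq__ and __lt__ are written; total_ordering fills in __le__, __gt__ and
+# __ge__, so _ExampleVersion(1, 2) >= _ExampleVersion(1, 0) evaluates to True.
+@total_ordering
+class _ExampleVersion:
+    def __init__(self, major, minor):
+        self.major, self.minor = major, minor
+    def __eq__(self, other):
+        return (self.major, self.minor) == (other.major, other.minor)
+    def __lt__(self, other):
+        return (self.major, self.minor) < (other.major, other.minor)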
+
+################################################################################
+### cmp_to_key() function converter
+################################################################################
+
+def cmp_to_key(mycmp):
+    """Convert a cmp= function into a key= function"""
+    class K(object):
+        __slots__ = ['obj']
+        def __init__(self, obj):
+            self.obj = obj
+        def __lt__(self, other):
+            return mycmp(self.obj, other.obj) < 0
+        def __gt__(self, other):
+            return mycmp(self.obj, other.obj) > 0
+        def __eq__(self, other):
+            return mycmp(self.obj, other.obj) == 0
+        def __le__(self, other):
+            return mycmp(self.obj, other.obj) <= 0
+        def __ge__(self, other):
+            return mycmp(self.obj, other.obj) >= 0
+        __hash__ = None
+    return K
+
+try:
+    from _functools import cmp_to_key
+except ImportError:
+    pass
+
+
+################################################################################
+### reduce() sequence to a single item
+################################################################################
+
+_initial_missing = object()
+
+def reduce(function, sequence, initial=_initial_missing):
+    """
+    reduce(function, sequence[, initial]) -> value
+
+    Apply a function of two arguments cumulatively to the items of a sequence,
+    from left to right, so as to reduce the sequence to a single value.
+    For example, reduce(lambda x, y: x+y, [1, 2, 3, 4, 5]) calculates
+    ((((1+2)+3)+4)+5).  If initial is present, it is placed before the items
+    of the sequence in the calculation, and serves as a default when the
+    sequence is empty.
+    """
+
+    it = iter(sequence)
+
+    if initial is _initial_missing:
+        try:
+            value = next(it)
+        except StopIteration:
+            raise TypeError("reduce() of empty sequence with no initial value") from None
+    else:
+        value = initial
+
+    for element in it:
+        value = function(value, element)
+
+    return value
+
+try:
+    from _functools import reduce
+except ImportError:
+    pass
+
+
+################################################################################
+### partial() argument application
+################################################################################
+
+# Purely functional, no descriptor behaviour
+
[docs]class partial: + """New function with partial application of the given arguments + and keywords. + """ + + __slots__ = "func", "args", "keywords", "__dict__", "__weakref__" + + def __new__(cls, func, /, *args, **keywords): + if not callable(func): + raise TypeError("the first argument must be callable") + + if hasattr(func, "func"): + args = func.args + args + keywords = {**func.keywords, **keywords} + func = func.func + + self = super(partial, cls).__new__(cls) + + self.func = func + self.args = args + self.keywords = keywords + return self + + def __call__(self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(*self.args, *args, **keywords) + + @recursive_repr() + def __repr__(self): + qualname = type(self).__qualname__ + args = [repr(self.func)] + args.extend(repr(x) for x in self.args) + args.extend(f"{k}={v!r}" for (k, v) in self.keywords.items()) + if type(self).__module__ == "functools": + return f"functools.{qualname}({', '.join(args)})" + return f"{qualname}({', '.join(args)})" + + def __reduce__(self): + return type(self), (self.func,), (self.func, self.args, + self.keywords or None, self.__dict__ or None) + + def __setstate__(self, state): + if not isinstance(state, tuple): + raise TypeError("argument to __setstate__ must be a tuple") + if len(state) != 4: + raise TypeError(f"expected 4 items in state, got {len(state)}") + func, args, kwds, namespace = state + if (not callable(func) or not isinstance(args, tuple) or + (kwds is not None and not isinstance(kwds, dict)) or + (namespace is not None and not isinstance(namespace, dict))): + raise TypeError("invalid partial state") + + args = tuple(args) # just in case it's a subclass + if kwds is None: + kwds = {} + elif type(kwds) is not dict: # XXX does it need to be *exactly* dict? + kwds = dict(kwds) + if namespace is None: + namespace = {} + + self.__dict__ = namespace + self.func = func + self.args = args + self.keywords = kwds
+ +try: + from _functools import partial +except ImportError: + pass + +# Descriptor version +class partialmethod(object): + """Method descriptor with partial application of the given arguments + and keywords. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(*args, **keywords): + if len(args) >= 2: + self, func, *args = args + elif not args: + raise TypeError("descriptor '__init__' of partialmethod " + "needs an argument") + elif 'func' in keywords: + func = keywords.pop('func') + self, *args = args + import warnings + warnings.warn("Passing 'func' as keyword argument is deprecated", + DeprecationWarning, stacklevel=2) + else: + raise TypeError("type 'partialmethod' takes at least one argument, " + "got %d" % (len(args)-1)) + args = tuple(args) + + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError("{!r} is not callable or a descriptor" + .format(func)) + + # func could be a descriptor like classmethod which isn't callable, + # so we can't inherit from partial (it verifies func is callable) + if isinstance(func, partialmethod): + # flattening is mandatory in order to place cls/self before all + # other arguments + # it's also more efficient since only one function will be called + self.func = func.func + self.args = func.args + args + self.keywords = {**func.keywords, **keywords} + else: + self.func = func + self.args = args + self.keywords = keywords + __init__.__text_signature__ = '($self, func, /, *args, **keywords)' + + def __repr__(self): + args = ", ".join(map(repr, self.args)) + keywords = ", ".join("{}={!r}".format(k, v) + for k, v in self.keywords.items()) + format_string = "{module}.{cls}({func}, {args}, {keywords})" + return format_string.format(module=self.__class__.__module__, + cls=self.__class__.__qualname__, + func=self.func, + args=args, + keywords=keywords) + + def _make_unbound_method(self): + def _method(cls_or_self, /, *args, **keywords): + keywords = {**self.keywords, **keywords} + return self.func(cls_or_self, *self.args, *args, **keywords) + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method._partialmethod = self + return _method + + def __get__(self, obj, cls=None): + get = getattr(self.func, "__get__", None) + result = None + if get is not None: + new_func = get(obj, cls) + if new_func is not self.func: + # Assume __get__ returning something new indicates the + # creation of an appropriate callable + result = partial(new_func, *self.args, **self.keywords) + try: + result.__self__ = new_func.__self__ + except AttributeError: + pass + if result is None: + # If the underlying descriptor didn't do anything, treat this + # like an instance method + result = self._make_unbound_method().__get__(obj, cls) + return result + + @property + def __isabstractmethod__(self): + return getattr(self.func, "__isabstractmethod__", False) + +# Helper functions + +def _unwrap_partial(func): + while isinstance(func, partial): + func = func.func + return func + +################################################################################ +### LRU Cache function decorator +################################################################################ + +_CacheInfo = namedtuple("CacheInfo", ["hits", "misses", "maxsize", "currsize"]) + +class _HashedSeq(list): + """ This class guarantees that hash() will be called no more than once + per element. This is important because the lru_cache() will hash + the key multiple times on a cache miss. 
+ + """ + + __slots__ = 'hashvalue' + + def __init__(self, tup, hash=hash): + self[:] = tup + self.hashvalue = hash(tup) + + def __hash__(self): + return self.hashvalue + +def _make_key(args, kwds, typed, + kwd_mark = (object(),), + fasttypes = {int, str}, + tuple=tuple, type=type, len=len): + """Make a cache key from optionally typed positional and keyword arguments + + The key is constructed in a way that is flat as possible rather than + as a nested structure that would take more memory. + + If there is only a single argument and its data type is known to cache + its hash value, then that argument is returned without a wrapper. This + saves space and improves lookup speed. + + """ + # All of code below relies on kwds preserving the order input by the user. + # Formerly, we sorted() the kwds before looping. The new way is *much* + # faster; however, it means that f(x=1, y=2) will now be treated as a + # distinct call from f(y=2, x=1) which will be cached separately. + key = args + if kwds: + key += kwd_mark + for item in kwds.items(): + key += item + if typed: + key += tuple(type(v) for v in args) + if kwds: + key += tuple(type(v) for v in kwds.values()) + elif len(key) == 1 and type(key[0]) in fasttypes: + return key[0] + return _HashedSeq(key) + +def lru_cache(maxsize=128, typed=False): + """Least-recently-used cache decorator. + + If *maxsize* is set to None, the LRU features are disabled and the cache + can grow without bound. + + If *typed* is True, arguments of different types will be cached separately. + For example, f(3.0) and f(3) will be treated as distinct calls with + distinct results. + + Arguments to the cached function must be hashable. + + View the cache statistics named tuple (hits, misses, maxsize, currsize) + with f.cache_info(). Clear the cache and statistics with f.cache_clear(). + Access the underlying function with f.__wrapped__. + + See: http://en.wikipedia.org/wiki/Cache_replacement_policies#Least_recently_used_(LRU) + + """ + + # Users should only access the lru_cache through its public API: + # cache_info, cache_clear, and f.__wrapped__ + # The internals of the lru_cache are encapsulated for thread safety and + # to allow the implementation to change (including a possible C version). 
+ + if isinstance(maxsize, int): + # Negative maxsize is treated as 0 + if maxsize < 0: + maxsize = 0 + elif callable(maxsize) and isinstance(typed, bool): + # The user_function was passed in directly via the maxsize argument + user_function, maxsize = maxsize, 128 + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + elif maxsize is not None: + raise TypeError( + 'Expected first argument to be an integer, a callable, or None') + + def decorating_function(user_function): + wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo) + return update_wrapper(wrapper, user_function) + + return decorating_function + +def _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo): + # Constants shared by all lru cache instances: + sentinel = object() # unique object used to signal cache misses + make_key = _make_key # build a key from the function arguments + PREV, NEXT, KEY, RESULT = 0, 1, 2, 3 # names for the link fields + + cache = {} + hits = misses = 0 + full = False + cache_get = cache.get # bound method to lookup a key or return None + cache_len = cache.__len__ # get cache size without calling len() + lock = RLock() # because linkedlist updates aren't threadsafe + root = [] # root of the circular doubly linked list + root[:] = [root, root, None, None] # initialize by pointing to self + + if maxsize == 0: + + def wrapper(*args, **kwds): + # No caching -- just a statistics update + nonlocal misses + misses += 1 + result = user_function(*args, **kwds) + return result + + elif maxsize is None: + + def wrapper(*args, **kwds): + # Simple caching without ordering or size limit + nonlocal hits, misses + key = make_key(args, kwds, typed) + result = cache_get(key, sentinel) + if result is not sentinel: + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + cache[key] = result + return result + + else: + + def wrapper(*args, **kwds): + # Size limited caching that tracks accesses by recency + nonlocal root, hits, misses, full + key = make_key(args, kwds, typed) + with lock: + link = cache_get(key) + if link is not None: + # Move the link to the front of the circular queue + link_prev, link_next, _key, result = link + link_prev[NEXT] = link_next + link_next[PREV] = link_prev + last = root[PREV] + last[NEXT] = root[PREV] = link + link[PREV] = last + link[NEXT] = root + hits += 1 + return result + misses += 1 + result = user_function(*args, **kwds) + with lock: + if key in cache: + # Getting here means that this same key was added to the + # cache while the lock was released. Since the link + # update is already done, we need only return the + # computed result and update the count of misses. + pass + elif full: + # Use the old root to store the new key and result. + oldroot = root + oldroot[KEY] = key + oldroot[RESULT] = result + # Empty the oldest link and make it the new root. + # Keep a reference to the old key and old result to + # prevent their ref counts from going to zero during the + # update. That will prevent potentially arbitrary object + # clean-up code (i.e. __del__) from running while we're + # still adjusting the links. + root = oldroot[NEXT] + oldkey = root[KEY] + oldresult = root[RESULT] + root[KEY] = root[RESULT] = None + # Now update the cache dictionary. + del cache[oldkey] + # Save the potentially reentrant cache[key] assignment + # for last, after the root and links have been put in + # a consistent state. 
+ cache[key] = oldroot + else: + # Put result in a new link at the front of the queue. + last = root[PREV] + link = [last, root, key, result] + last[NEXT] = root[PREV] = cache[key] = link + # Use the cache_len bound method instead of the len() function + # which could potentially be wrapped in an lru_cache itself. + full = (cache_len() >= maxsize) + return result + + def cache_info(): + """Report cache statistics""" + with lock: + return _CacheInfo(hits, misses, maxsize, cache_len()) + + def cache_clear(): + """Clear the cache and cache statistics""" + nonlocal hits, misses, full + with lock: + cache.clear() + root[:] = [root, root, None, None] + hits = misses = 0 + full = False + + wrapper.cache_info = cache_info + wrapper.cache_clear = cache_clear + return wrapper + +try: + from _functools import _lru_cache_wrapper +except ImportError: + pass + + +################################################################################ +### singledispatch() - single-dispatch generic function decorator +################################################################################ + +def _c3_merge(sequences): + """Merges MROs in *sequences* to a single MRO using the C3 algorithm. + + Adapted from http://www.python.org/download/releases/2.3/mro/. + + """ + result = [] + while True: + sequences = [s for s in sequences if s] # purge empty sequences + if not sequences: + return result + for s1 in sequences: # find merge candidates among seq heads + candidate = s1[0] + for s2 in sequences: + if candidate in s2[1:]: + candidate = None + break # reject the current head, it appears later + else: + break + if candidate is None: + raise RuntimeError("Inconsistent hierarchy") + result.append(candidate) + # remove the chosen candidate + for seq in sequences: + if seq[0] == candidate: + del seq[0] + +def _c3_mro(cls, abcs=None): + """Computes the method resolution order using extended C3 linearization. + + If no *abcs* are given, the algorithm works exactly like the built-in C3 + linearization used for method resolution. + + If given, *abcs* is a list of abstract base classes that should be inserted + into the resulting MRO. Unrelated ABCs are ignored and don't end up in the + result. The algorithm inserts ABCs where their functionality is introduced, + i.e. issubclass(cls, abc) returns True for the class itself but returns + False for all its direct base classes. Implicit ABCs for a given class + (either registered or inferred from the presence of a special method like + __len__) are inserted directly after the last ABC explicitly listed in the + MRO of said class. If two implicit ABCs end up next to each other in the + resulting MRO, their ordering depends on the order of types in *abcs*. + + """ + for i, base in enumerate(reversed(cls.__bases__)): + if hasattr(base, '__abstractmethods__'): + boundary = len(cls.__bases__) - i + break # Bases up to the last explicit ABC are considered first. + else: + boundary = 0 + abcs = list(abcs) if abcs else [] + explicit_bases = list(cls.__bases__[:boundary]) + abstract_bases = [] + other_bases = list(cls.__bases__[boundary:]) + for base in abcs: + if issubclass(cls, base) and not any( + issubclass(b, base) for b in cls.__bases__ + ): + # If *cls* is the class that introduces behaviour described by + # an ABC *base*, insert said ABC to its MRO. 
+ abstract_bases.append(base) + for base in abstract_bases: + abcs.remove(base) + explicit_c3_mros = [_c3_mro(base, abcs=abcs) for base in explicit_bases] + abstract_c3_mros = [_c3_mro(base, abcs=abcs) for base in abstract_bases] + other_c3_mros = [_c3_mro(base, abcs=abcs) for base in other_bases] + return _c3_merge( + [[cls]] + + explicit_c3_mros + abstract_c3_mros + other_c3_mros + + [explicit_bases] + [abstract_bases] + [other_bases] + ) + +def _compose_mro(cls, types): + """Calculates the method resolution order for a given class *cls*. + + Includes relevant abstract base classes (with their respective bases) from + the *types* iterable. Uses a modified C3 linearization algorithm. + + """ + bases = set(cls.__mro__) + # Remove entries which are already present in the __mro__ or unrelated. + def is_related(typ): + return (typ not in bases and hasattr(typ, '__mro__') + and issubclass(cls, typ)) + types = [n for n in types if is_related(n)] + # Remove entries which are strict bases of other entries (they will end up + # in the MRO anyway. + def is_strict_base(typ): + for other in types: + if typ != other and typ in other.__mro__: + return True + return False + types = [n for n in types if not is_strict_base(n)] + # Subclasses of the ABCs in *types* which are also implemented by + # *cls* can be used to stabilize ABC ordering. + type_set = set(types) + mro = [] + for typ in types: + found = [] + for sub in typ.__subclasses__(): + if sub not in bases and issubclass(cls, sub): + found.append([s for s in sub.__mro__ if s in type_set]) + if not found: + mro.append(typ) + continue + # Favor subclasses with the biggest number of useful bases + found.sort(key=len, reverse=True) + for sub in found: + for subcls in sub: + if subcls not in mro: + mro.append(subcls) + return _c3_mro(cls, abcs=mro) + +def _find_impl(cls, registry): + """Returns the best matching implementation from *registry* for type *cls*. + + Where there is no registered implementation for a specific type, its method + resolution order is used to find a more generic implementation. + + Note: if *registry* does not contain an implementation for the base + *object* type, this function may return None. + + """ + mro = _compose_mro(cls, registry.keys()) + match = None + for t in mro: + if match is not None: + # If *match* is an implicit ABC but there is another unrelated, + # equally matching implicit ABC, refuse the temptation to guess. + if (t in registry and t not in cls.__mro__ + and match not in cls.__mro__ + and not issubclass(match, t)): + raise RuntimeError("Ambiguous dispatch: {} or {}".format( + match, t)) + break + if t in registry: + match = t + return registry.get(match) + +def singledispatch(func): + """Single-dispatch generic function decorator. + + Transforms a function into a generic function, which can have different + behaviours depending upon the type of its first argument. The decorated + function acts as the default implementation, and additional + implementations can be registered using the register() attribute of the + generic function. + """ + # There are many programs that use functools without singledispatch, so we + # trade-off making singledispatch marginally slower for the benefit of + # making start-up of such applications slightly faster. 
+ import types, weakref + + registry = {} + dispatch_cache = weakref.WeakKeyDictionary() + cache_token = None + + def dispatch(cls): + """generic_func.dispatch(cls) -> <function implementation> + + Runs the dispatch algorithm to return the best available implementation + for the given *cls* registered on *generic_func*. + + """ + nonlocal cache_token + if cache_token is not None: + current_token = get_cache_token() + if cache_token != current_token: + dispatch_cache.clear() + cache_token = current_token + try: + impl = dispatch_cache[cls] + except KeyError: + try: + impl = registry[cls] + except KeyError: + impl = _find_impl(cls, registry) + dispatch_cache[cls] = impl + return impl + + def register(cls, func=None): + """generic_func.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_func*. + + """ + nonlocal cache_token + if func is None: + if isinstance(cls, type): + return lambda f: register(cls, f) + ann = getattr(cls, '__annotations__', {}) + if not ann: + raise TypeError( + f"Invalid first argument to `register()`: {cls!r}. " + f"Use either `@register(some_class)` or plain `@register` " + f"on an annotated function." + ) + func = cls + + # only import typing if annotation parsing is necessary + from typing import get_type_hints + argname, cls = next(iter(get_type_hints(func).items())) + if not isinstance(cls, type): + raise TypeError( + f"Invalid annotation for {argname!r}. " + f"{cls!r} is not a class." + ) + registry[cls] = func + if cache_token is None and hasattr(cls, '__abstractmethods__'): + cache_token = get_cache_token() + dispatch_cache.clear() + return func + + def wrapper(*args, **kw): + if not args: + raise TypeError(f'{funcname} requires at least ' + '1 positional argument') + + return dispatch(args[0].__class__)(*args, **kw) + + funcname = getattr(func, '__name__', 'singledispatch function') + registry[object] = func + wrapper.register = register + wrapper.dispatch = dispatch + wrapper.registry = types.MappingProxyType(registry) + wrapper._clear_cache = dispatch_cache.clear + update_wrapper(wrapper, func) + return wrapper + + +# Descriptor version +class singledispatchmethod: + """Single-dispatch generic method descriptor. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. + """ + + def __init__(self, func): + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError(f"{func!r} is not callable or a descriptor") + + self.dispatcher = singledispatch(func) + self.func = func + + def register(self, cls, method=None): + """generic_method.register(cls, func) -> func + + Registers a new implementation for the given *cls* on a *generic_method*. 
+ """ + return self.dispatcher.register(cls, func=method) + + def __get__(self, obj, cls=None): + def _method(*args, **kwargs): + method = self.dispatcher.dispatch(args[0].__class__) + return method.__get__(obj, cls)(*args, **kwargs) + + _method.__isabstractmethod__ = self.__isabstractmethod__ + _method.register = self.register + update_wrapper(_method, self.func) + return _method + + @property + def __isabstractmethod__(self): + return getattr(self.func, '__isabstractmethod__', False) + + +################################################################################ +### cached_property() - computed once per instance, cached as attribute +################################################################################ + +_NOT_FOUND = object() + + +class cached_property: + def __init__(self, func): + self.func = func + self.attrname = None + self.__doc__ = func.__doc__ + self.lock = RLock() + + def __set_name__(self, owner, name): + if self.attrname is None: + self.attrname = name + elif name != self.attrname: + raise TypeError( + "Cannot assign the same cached_property to two different names " + f"({self.attrname!r} and {name!r})." + ) + + def __get__(self, instance, owner=None): + if instance is None: + return self + if self.attrname is None: + raise TypeError( + "Cannot use cached_property instance without calling __set_name__ on it.") + try: + cache = instance.__dict__ + except AttributeError: # not all objects have __dict__ (e.g. class defines slots) + msg = ( + f"No '__dict__' attribute on {type(instance).__name__!r} " + f"instance to cache {self.attrname!r} property." + ) + raise TypeError(msg) from None + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + with self.lock: + # check if another thread filled cache while we awaited lock + val = cache.get(self.attrname, _NOT_FOUND) + if val is _NOT_FOUND: + val = self.func(instance) + try: + cache[self.attrname] = val + except TypeError: + msg = ( + f"The '__dict__' attribute on {type(instance).__name__!r} instance " + f"does not support item assignment for caching {self.attrname!r} property." + ) + raise TypeError(msg) from None + return val +
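For orientation, the decorators defined above behave as in the following minimal sketch; the Circle class and fib function are illustrative and not part of this module:

    import functools

    class Circle:
        def __init__(self, radius):
            self.radius = radius

        @functools.cached_property
        def area(self):
            # computed on first access, then stored in the instance __dict__
            return 3.141592653589793 * self.radius ** 2

    @functools.lru_cache(maxsize=128)
    def fib(n):
        # repeated calls with the same argument are answered from the cache
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    c = Circle(2.0)
    print(c.area)                     # computed and cached
    print(c.area)                     # served from the cached attribute
    print(fib(30), fib.cache_info())  # hits/misses reported by the wrapper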
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_modules/index.html b/branch/bicounty_emme/_modules/index.html new file mode 100644 index 0000000..19f96a2 --- /dev/null +++ b/branch/bicounty_emme/_modules/index.html @@ -0,0 +1,116 @@ + + + + + + Overview: module code — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_modules/lasso/logger/index.html b/branch/bicounty_emme/_modules/lasso/logger/index.html new file mode 100644 index 0000000..1561ede --- /dev/null +++ b/branch/bicounty_emme/_modules/lasso/logger/index.html @@ -0,0 +1,153 @@ + + + + + + lasso.logger — lasso documentation + + + + + + + + + + + + + + + + + + +
Source code for lasso.logger

+import logging
+
+__all__ = ["WranglerLogger", "setupLogging"]
+
+
+# for all the Wrangler logging needs!
+WranglerLogger = logging.getLogger("WranglerLogger")
+
+
+
[docs]def setupLogging(infoLogFilename, debugLogFilename, logToConsole=True):
+    """Sets up the logger.
+
+    The info log is terse, giving just enough detail for the network
+    composition to be clear later; the debug log is verbose and intended
+    for debugging.
+
+    Pass None for either filename to skip writing that log.
+    Also echoes everything to the console if logToConsole is True.
+    """
+    # clear any handlers that already exist
+    WranglerLogger.handlers = []
+
+    # capture everything at the logger level; individual handlers filter by level
+    WranglerLogger.setLevel(logging.DEBUG)
+
+    if infoLogFilename:
+        infologhandler = logging.StreamHandler(open(infoLogFilename, "w"))
+        infologhandler.setLevel(logging.INFO)
+        infologhandler.setFormatter(
+            logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
+        )
+        WranglerLogger.addHandler(infologhandler)
+
+    if debugLogFilename:
+        debugloghandler = logging.StreamHandler(open(debugLogFilename, "w"))
+        debugloghandler.setLevel(logging.DEBUG)
+        debugloghandler.setFormatter(
+            logging.Formatter("%(asctime)s %(levelname)s %(message)s", "%Y-%m-%d %H:%M")
+        )
+        WranglerLogger.addHandler(debugloghandler)
+
+    if logToConsole:
+        consolehandler = logging.StreamHandler()
+        consolehandler.setLevel(logging.DEBUG)
+        consolehandler.setFormatter(
+            logging.Formatter("%(name)-12s: %(levelname)-8s %(message)s")
+        )
+        WranglerLogger.addHandler(consolehandler)
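A minimal usage sketch of the setup function above; the log file names are illustrative:

    from lasso.logger import WranglerLogger, setupLogging

    # terse messages go to info.log, verbose messages to debug.log,
    # and everything is echoed to the console
    setupLogging("info.log", "debug.log", logToConsole=True)

    WranglerLogger.info("Starting network build")
    WranglerLogger.debug("Detailed diagnostic output")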
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_modules/lasso/parameters/index.html b/branch/bicounty_emme/_modules/lasso/parameters/index.html new file mode 100644 index 0000000..7b8be0d --- /dev/null +++ b/branch/bicounty_emme/_modules/lasso/parameters/index.html @@ -0,0 +1,1060 @@ + + + + + + lasso.parameters — lasso documentation + + + + + + + + + + + + + + + + + + +
Source code for lasso.parameters

+import os
+import pyproj
+from .logger import WranglerLogger
+
+
+from pyproj import CRS
+
+
+def get_base_dir(lasso_base_dir=os.getcwd()):
+    d = lasso_base_dir
+    for i in range(3):
+        if "metcouncil_data" in os.listdir(d):
+
+            WranglerLogger.info("Lasso base directory set as: {}".format(d))
+            return d
+        d = os.path.dirname(d)
+
+    msg = "Cannot find Lasso base directory from {}, please input using keyword in parameters: `lasso_base_dir =` ".format(
+        lasso_base_dir
+    )
+    WranglerLogger.error(msg)
+    raise (ValueError(msg))
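As the loop above shows, get_base_dir checks the supplied directory and up to two parent levels for a metcouncil_data folder. A small illustrative sketch, with a placeholder path:

    from lasso.parameters import get_base_dir

    # returns the first directory (the one given, its parent, or its grandparent)
    # that contains "metcouncil_data"; otherwise raises ValueError
    base_dir = get_base_dir(lasso_base_dir="/path/to/lasso/examples")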
+
+
+
[docs]class Parameters: + """A class representing all the parameters defining the networks + including time of day, categories, etc. + + Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + .. highlight:: python + + Attr: + time_period_to_time (dict): Maps time period abbreviations used in + Cube to time of days used on gtfs and highway network standard + Default: + :: + { + "EA": ("3:00", "6:00"), + "AM": ("6:00, "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + cube_time_periods (dict): Maps cube time period numbers used in + transit line files to the time period abbreviations in time_period_to_time + dictionary. + Default: + :: + {"1": "EA", "2": "AM", "3": "MD", "4": "PM", "5": "EV"} + categories (dict): Maps demand category abbreviations to a list of + network categories they are allowed to use. + Default: + :: + { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + properties_to_split (dict): Dictionary mapping variables in standard + roadway network to categories and time periods that need to be + split out in final model network to get variables like LANES_AM. + Default: + :: + { + "lanes": { + "v": "lanes", + "time_periods": self.time_periods_to_time + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_periods_to_time + }, + "use": { + "v": "use", + "time_periods": self.time_periods_to_time + }, + } + + county_shape (str): File location of shapefile defining counties. + Default: + :: + r"metcouncil_data/county/cb_2017_us_county_5m.shp" + + county_variable_shp (str): Property defining the county n ame in + the county_shape file. + Default: + :: + NAME + lanes_lookup_file (str): Lookup table of number of lanes for different data sources. + Default: + :: + r"metcouncil_data/lookups/lanes.csv" + centroid_connect_lanes (int): Number of lanes for centroid connectors. + Default: + :: + 1 + mpo_counties (list): list of county names within MPO boundary. + Default: + :: + [ + "ANOKA", + "DAKOTA", + "HENNEPIN", + "RAMSEY", + "SCOTT", + "WASHINGTON", + "CARVER", + ] + + taz_shape (str): + Default: + :: + r"metcouncil_data/TAZ/TAZOfficialWCurrentForecasts.shp" + taz_data (str): + Default: + :: + ?? + highest_taz_number (int): highest TAZ number in order to define + centroid connectors. + Default: + :: + 3100 + + output_variables (list): list of variables to output in final model + network. 
+ Default: + :: + [ + "model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + "distance", + "roadway", + "name", + "roadway_class", + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "trn_priority_EA", + "trn_priority_AM", + "trn_priority_MD", + "trn_priority_PM", + "trn_priority_EV", + "ttime_assert_EA", + "ttime_assert_AM", + "ttime_assert_MD", + "ttime_assert_PM", + "ttime_assert_EV", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "price_sov_EA", + "price_hov2_EA", + "price_hov3_EA", + "price_truck_EA", + "price_sov_AM", + "price_hov2_AM", + "price_hov3_AM", + "price_truck_AM", + "price_sov_MD", + "price_hov2_MD", + "price_hov3_MD", + "price_truck_MD", + "price_sov_PM", + "price_hov2_PM", + "price_hov3_PM", + "price_truck_PM", + "price_sov_EV", + "price_hov2_EV", + "price_hov3_EV", + "price_truck_EV", + "roadway_class_idx", + "facility_type", + "county", + "centroidconnect", + "model_node_id", + "N", + "osm_node_id", + "bike_node", + "transit_node", + "walk_node", + "drive_node", + "geometry", + "X", + "Y", + "ML_lanes_EA", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "segment_id", + "managed", + "bus_only", + "rail_only" + ] + + osm_facility_type_dict (dict): Mapping between OSM Roadway variable + and facility type. Default: + + area_type_shape (str): Location of shapefile defining area type. + Default: + :: + r"metcouncil_data/area_type/ThriveMSP2040CommunityDesignation.shp" + area_type_variable_shp (str): property in area_type_shape with area + type in it. + Default: + :: + "COMDES2040" + area_type_code_dict (dict): Mapping of the area_type_variable_shp to + the area type code used in the MetCouncil cube network. + Default: + :: + { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + downtown_area_type_shape (str): Location of shapefile defining downtown area type. + Default: + :: + r"metcouncil_data/area_type/downtownzones_TAZ.shp" + downtown_area_type (int): Area type integer for downtown. + Default: + :: + 5 + mrcc_roadway_class_shape (str): Shapefile of MRCC links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/mrcc/trans_mrcc_centerlines.shp" + mrcc_roadway_class_variable_shp (str): The property in mrcc_roadway_class_shp + associated with roadway class. Default: + :: + "ROUTE_SYS" + widot_roadway_class_shape (str): Shapefile of Wisconsin links with a property + associated with roadway class. Default: + :: + r"metcouncil_data/Wisconsin_Lanes_Counts_Median/WISLR.shp" + widot_roadway_class_variable_shp (str): The property in widot_roadway_class_shape + associated with roadway class.Default: + :: + "RDWY_CTGY_" + mndot_count_shape (str): Shapefile of MnDOT links with a property + associated with counts. Default: + :: + r"metcouncil_data/count_mn/AADT_2017_Count_Locations.shp" + mndot_count_variable_shp (str): The property in mndot_count_shape + associated with counts. Default: + + :: + "lookups/osm_highway_facility_type_crosswalk.csv" + legacy_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from Legacy TM2 network. Default: + :: + "lookups/legacy_tm2_attributes.csv" + osm_lanes_attributes (str): CSV file of number of lanes by shStReferenceId + from OSM. Default: + :: + "lookups/osm_lanes_attributes.csv" + tam_tm2_attributes (str): CSV file of link attributes by + shStReferenceId from TAM TM2 network. 
Default: + :: + "lookups/tam_tm2_attributes.csv" + tom_tom_attributes (str): CSV file of link attributes by + shStReferenceId from TomTom network. Default: + :: + "lookups/tomtom_attributes.csv" + sfcta_attributes (str): CSV file of link attributes by + shStReferenceId from SFCTA network. Default: + :: + "lookups/sfcta_attributes.csv" + output_epsg (int): EPSG type of geographic projection for output + shapefiles. Default: + :: + 102646 + output_link_shp (str): Output shapefile for roadway links. Default: + :: + r"tests/scratch/links.shp" + output_node_shp (str): Output shapefile for roadway nodes. Default: + :: + r"tests/scratch/nodes.shp" + output_link_csv (str): Output csv for roadway links. Default: + :: + r"tests/scratch/links.csv" + output_node_csv (str): Output csv for roadway nodes. Default: + :: + r"tests/scratch/nodes.csv" + output_link_txt (str): Output fixed format txt for roadway links. Default: + :: + r"tests/scratch/links.txt" + output_node_txt (str): Output fixed format txt for roadway nodes. Default: + :: + r"tests/scratch/nodes.txt" + output_link_header_width_txt (str): Header for txt roadway links. Default: + :: + r"tests/scratch/links_header_width.txt" + output_node_header_width_txt (str): Header for txt for roadway Nodes. Default: + :: + r"tests/scratch/nodes_header_width.txt" + output_cube_network_script (str): Cube script for importing + fixed-format roadway network. Default: + :: + r"tests/scratch/make_complete_network_from_fixed_width_file.s + + + + """ + +
[docs] def __init__(self, **kwargs): + """ + Time period and category splitting info + """ + if "time_periods_to_time" in kwargs: + self.time_periods_to_time = kwargs.get("time_periods_to_time") + else: + self.time_period_to_time = { + "EA": ("3:00", "6:00"), + "AM": ("6:00", "10:00"), + "MD": ("10:00", "15:00"), + "PM": ("15:00", "19:00"), + "EV": ("19:00", "3:00"), + } + + #MTC + self.cube_time_periods = { + "1": "EA", + "2": "AM", + "3": "MD", + "4": "PM", + "5": "EV", + } + + """ + #MC + self.route_type_bus_mode_dict = {"Urb Loc": 5, "Sub Loc": 6, "Express": 7} + + self.route_type_mode_dict = {0: 8, 2: 9} + + self.cube_time_periods = {"1": "AM", "2": "MD"} + self.cube_time_periods_name = {"AM": "pk", "MD": "op"} + """ + if "categories" in kwargs: + self.categories = kwargs.get("categories") + else: + self.categories = { + # suffix, source (in order of search) + "sov": ["sov", "default"], + "hov2": ["hov2", "default", "sov"], + "hov3": ["hov3", "hov2", "default", "sov"], + "truck": ["trk", "sov", "default"], + } + + # prefix, source variable, categories + self.properties_to_split = { + "lanes": { + "v": "lanes", + "time_periods": self.time_period_to_time, + }, + "ML_lanes": { + "v": "ML_lanes", + "time_periods": self.time_period_to_time, + }, + "useclass": { + "v": "useclass", + "time_periods": self.time_period_to_time, + }, + } + + """ + Details for calculating the county based on the centroid of the link. + The NAME varible should be the name of a field in shapefile. + """ + #MTC + if 'lasso_base_dir' in kwargs: + self.base_dir = get_base_dir(lasso_base_dir = kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + + if 'data_file_location' in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "mtc_data") + + #MC + if "lasso_base_dir" in kwargs: + self.base_dir = get_base_dir(lasso_base_dir=kwargs.get("lasso_base_dir")) + else: + self.base_dir = get_base_dir() + """ + if "data_file_location" in kwargs: + self.data_file_location = kwargs.get("data_file_location") + else: + self.data_file_location = os.path.join(self.base_dir, "metcouncil_data") + """ + + #-------- + if "settings_location" in kwargs: + self.settings_location = kwargs.get("settings_location") + else: + self.settings_location = os.path.join(self.base_dir, "examples", "settings") + + if "scratch_location" in kwargs: + self.scratch_location = kwargs.get("scratch_location") + else: + self.scratch_location = os.path.join(self.base_dir, "tests", "scratch") + + ### COUNTIES + + self.county_shape = os.path.join( + self.data_file_location, "county", "county.shp" + ) + self.county_variable_shp = "NAME" + + #MTC + self.county_code_dict = { + 'San Francisco':1, + 'San Mateo':2, + 'Santa Clara':3, + 'Alameda':4, + 'Contra Costa':5, + 'Solano':6, + 'Napa':7, + 'Sonoma':8, + 'Marin':9, + 'San Joaquin':11, + 'External':10, + } + + self.county_centroid_range_dict = { + 'San Francisco':range(1,100000), + 'San Mateo':range(100001,200000), + 'Santa Clara':range(200001,300000), + 'Alameda':range(300001,400000), + 'Contra Costa':range(400001,500000), + 'Solano':range(500001,600000), + 'Napa':range(600001,700000), + 'Sonoma':range(700001,800000), + 'Marin':range(800001,900000), + 'External':range(900001,1000000) + } + + self.county_node_range_dict = { + 'San Francisco':range(1000000,1500000), + 'San Mateo':range(1500000,2000000), + 'Santa Clara':range(2000000,2500000), + 'Alameda':range(2500000,3000000), + 'Contra Costa':range(3000000,3500000), + 
'Solano':range(3500000,4000000), + 'Napa':range(4000000,4500000), + 'Sonoma':range(4500000,5000000), + 'Marin':range(5000000,5500000), + } + + self.county_hov_node_range_dict = { + 'San Francisco':range(5500000,6000000), + 'San Mateo':range(6000000,6500000), + 'Santa Clara':range(6500000,7000000), + 'Alameda':range(7000000,7500000), + 'Contra Costa':range(7500000,8000000), + 'Solano':range(8000000,8500000), + 'Napa':range(8500000,9000000), + 'Sonoma':range(9000000,9500000), + 'Marin':range(9500000,10000000), + } + + self.county_link_range_dict = { + 'San Francisco':range(1,1000000), + 'San Mateo':range(1000000,2000000), + 'Santa Clara':range(2000000,3000000), + 'Alameda':range(3000000,4000000), + 'Contra Costa':range(4000000,5000000), + 'Solano':range(5000000,6000000), + 'Napa':range(6000000,7000000), + 'Sonoma':range(7000000,8000000), + 'Marin':range(8000000,9000000) + } + + #MC + """ + self.county_code_dict = { + "Anoka": 1, + "Carver": 2, + "Dakota": 3, + "Hennepin": 4, + "Ramsey": 5, + "Scott": 6, + "Washington": 7, + "external": 10, + "Chisago": 11, + "Goodhue": 12, + "Isanti": 13, + "Le Sueur": 14, + "McLeod": 15, + "Pierce": 16, + "Polk": 17, + "Rice": 18, + "Sherburne": 19, + "Sibley": 20, + "St. Croix": 21, + "Wright": 22, + } + """ + + self.mpo_counties = [ + 1, + 3, + 4, + 5, + 6, + 7, + 8, + 9 + ] + + self.taz_N_list = list(range(1, 10000)) + list(range(100001, 110000)) + list(range(200001, 210000)) + list(range(300001, 310000))\ + + list(range(400001, 410000)) + list(range(500001, 510000)) + list(range(600001, 610000)) + list(range(700001, 710000))\ + + list(range(800001, 810000)) + list(range(900001, 1000000)) + + self.maz_N_list = list(range(10001, 90000)) + list(range(110001, 190000)) + list(range(210001, 290000)) + list(range(310001, 390000))\ + + list(range(410001, 490000)) + list(range(510001, 590000)) + list(range(610001, 690000)) + list(range(710001, 790000))\ + + list(range(810001, 890000)) + + self.tap_N_list = list(range(90001, 99999)) + list(range(190001, 199999)) + list(range(290001, 299999)) + list(range(390001, 399999))\ + + list(range(490001, 499999)) + list(range(590001, 599999)) + list(range(690001, 699999)) + list(range(790001, 799999))\ + + list(range(890001, 899999)) + + self.tap_N_start = { + "San Francisco" : 90001, + "San Mateo" : 190001, + "Santa Clara" : 290001, + "Alameda" : 390001, + "Contra Costa" : 490001, + "Solano" : 590001, + "Napa" : 690001, + "Sonoma" : 790001, + "Marin" : 890001 + } + + #MTC + self.osm_facility_type_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_facility_type_crosswalk.csv" + ) + #MC + ### Lanes + self.lanes_lookup_file = os.path.join( + self.data_file_location, "lookups", "lanes.csv" + ) + + ### TAZS + + self.taz_shape = os.path.join( + self.data_file_location, "TAZ", "TAZOfficialWCurrentForecasts.shp" + ) + ###### + #MTC + self.osm_lanes_attributes = os.path.join( + self.data_file_location, "lookups", "osm_lanes_attributes.csv" + ) + + self.legacy_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "legacy_tm2_attributes.csv" + ) + + self.assignable_analysis = os.path.join( + self.data_file_location, "lookups", "assignable_analysis_links.csv" + ) + ### + ### AREA TYPE - MC + self.area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "ThriveMSP2040CommunityDesignation.shp", + ) + self.area_type_variable_shp = "COMDES2040" + # area type map from raw data to model category + + # source 
https://metrocouncil.org/Planning/Publications-And-Resources/Thrive-MSP-2040-Plan-(1)/7_ThriveMSP2040_LandUsePoliciesbyCD.aspx + # urban center + # urban + # suburban + # suburban edge + # emerging suburban edge + # rural center + # diversified rural + # rural residential + # agricultural + self.area_type_code_dict = { + 23: 4, # urban center + 24: 3, + 25: 2, + 35: 2, + 36: 1, + 41: 1, + 51: 1, + 52: 1, + 53: 1, + 60: 1, + } + + self.downtown_area_type_shape = os.path.join( + self.data_file_location, + "area_type", + "downtownzones_TAZ.shp", + ) + + self.downtown_area_type = int(5) + + self.centroid_connect_lanes = int(1) + + self.osm_assgngrp_dict = os.path.join( + self.data_file_location, "lookups", "osm_highway_asgngrp_crosswalk.csv" + ) + self.mrcc_roadway_class_shape = os.path.join( + self.data_file_location, "mrcc", "trans_mrcc_centerlines.shp" + ) + #### + ###MTC + self.tam_tm2_attributes = os.path.join( + self.data_file_location, "lookups", "tam_tm2_attributes.csv" + ) + + self.sfcta_attributes = os.path.join( + self.data_file_location, "lookups", "sfcta_attributes.csv" + ) + + self.tomtom_attributes = os.path.join( + self.data_file_location, "lookups", "tomtom_attributes.csv" + ) + + self.pems_attributes = os.path.join( + self.data_file_location, "lookups", "pems_attributes.csv" + ) + + self.centroid_file = os.path.join( + self.data_file_location, "centroid", "centroid_node.pickle" + ) + #### + ###MC + self.widot_shst_data = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "widot.out.matched.geojson", + ) + #### + + self.centroid_connector_link_file = os.path.join( + self.data_file_location, "centroid", "cc_link.pickle" + ) + + self.centroid_connector_shape_file = os.path.join( + self.data_file_location, "centroid", "cc_shape.pickle" + ) + + self.tap_file = os.path.join( + self.data_file_location, "tap", "tap_node.pickle" + ) + + self.tap_connector_link_file = os.path.join( + self.data_file_location, "tap", "tap_link.pickle" + ) + + self.tap_connector_shape_file = os.path.join( + self.data_file_location, "tap", "tap_shape.pickle" + ) + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + ###MTC + self.log_to_net_crosswalk = os.path.join(self.settings_location, "log_to_net.csv") + + self.emme_name_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "emme_attribute_names.csv" + ) + #### + #MC + self.mndot_count_variable_shp = "AADT_mn" + + self.widot_county_shape = os.path.join( + self.data_file_location, + "Wisconsin_Lanes_Counts_Median", + "TRADAS_(counts).shp", + ) + ### + ###MTC + self.mode_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "gtfs_to_tm2_mode_crosswalk.csv" + ) + + self.veh_cap_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "transitSeatCap.csv" + ) + + self.faresystem_crosswalk_file = os.path.join( + self.data_file_location, "lookups", "faresystem_crosswalk.txt" + ) + + # https://app.asana.com/0/12291104512575/1200287255197808/f + self.fare_2015_to_2010_deflator = 0.927 + #### + #MC + self.widot_count_variable_shp = "AADT_wi" + + self.net_to_dbf_crosswalk = os.path.join( + self.settings_location, "net_to_dbf.csv" + ) + + self.log_to_net_crosswalk = os.path.join( + self.settings_location, "log_to_net.csv" + ) + + self.subregion_boundary_file = os.path.join( + self.data_file_location, 'emme', 'subregion_boundary_for_active_modes.shp' + ) + + self.subregion_boundary_id_variable = 'subregion' + #### + + self.output_variables = [ + 
"model_link_id", + "link_id", + "A", + "B", + "shstGeometryId", + #MTC + 'name', + "distance", + "roadway", + #"name", + #MC + #"shape_id", + #"distance", + #"roadway", + #"name", + #"roadway_class", + #### + "bike_access", + "walk_access", + "drive_access", + "truck_access", + "lanes_EA", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EV", + "county", + "model_node_id", + "N", + "osm_node_id", + "geometry", + "X", + "Y", + "segment_id", + "managed", + "bus_only", + "rail_only", + "pnr", + #MTC + "assignable", + "cntype", + "useclass_AM", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "useclass_EA", + "transit", + "tollbooth", + "tollseg", + "ft", + "tap_drive", + "tollbooth", + "tollseg", + "farezone", + "tap_id", + #### + #MC + "bike_facility", + "mrcc_id", + "ROUTE_SYS", # mrcc functional class + #### + #bi-county + "nmt2010", + "nmt2020", + "BRT", + "has_transit" + ] + + self.output_link_shp = os.path.join(self.scratch_location, "links.shp") + self.output_node_shp = os.path.join(self.scratch_location, "nodes.shp") + self.output_link_csv = os.path.join(self.scratch_location, "links.csv") + self.output_node_csv = os.path.join(self.scratch_location, "nodes.csv") + self.output_link_txt = os.path.join(self.scratch_location, "links.txt") + self.output_node_txt = os.path.join(self.scratch_location, "nodes.txt") + self.output_link_header_width_txt = os.path.join( + self.scratch_location, "links_header_width.txt" + ) + self.output_node_header_width_txt = os.path.join( + self.scratch_location, "nodes_header_width.txt" + ) + self.output_cube_network_script = os.path.join( + self.scratch_location, "make_complete_network_from_fixed_width_file.s" + ) + self.output_dir = os.path.join(self.scratch_location) + self.output_proj = CRS("ESRI:102646") + self.output_proj4 = '+proj=lcc +lat_1=32.78333333333333 +lat_2=33.88333333333333 +lat_0=32.16666666666666 +lon_0=-116.25 +x_0=2000000 +y_0=500000.0000000002 +ellps=GRS80 +datum=NAD83 +to_meter=0.3048006096012192 +no_defs' + self.prj_file = os.path.join(self.data_file_location, 'projection', '102646.prj') + self.wkt_projection = 'PROJCS["NAD_1983_StatePlane_California_VI_FIPS_0406_Feet",GEOGCS["GCS_North_American_1983",DATUM["North_American_Datum_1983",SPHEROID["GRS_1980",6378137,298.257222101]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["False_Easting",6561666.666666666],PARAMETER["False_Northing",1640416.666666667],PARAMETER["Central_Meridian",-116.25],PARAMETER["Standard_Parallel_1",32.78333333333333],PARAMETER["Standard_Parallel_2",33.88333333333333],PARAMETER["Latitude_Of_Origin",32.16666666666666],UNIT["Foot_US",0.30480060960121924],AUTHORITY["EPSG","102646"]]' + + self.fare_matrix_output_variables = ["faresystem", "origin_farezone", "destination_farezone", "price"] + + self.zones = 6593 + """ + Create all the possible headway variable combinations based on the cube time periods setting + """ + self.time_period_properties_list = [ + p + "[" + str(t) + "]" + for p in ["HEADWAY", "FREQ"] + for t in self.cube_time_periods.keys() + ] + + self.int_col = [ + "model_link_id", + "model_node_id", + "A", + "B", + #MTC + #"county", + ### + #MC + # "lanes", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_NT", + "roadway_class", + "assign_group", + #"county", + "area_type", + "trn_priority", + "AADT", + "count_AM", + "count_MD", + "count_PM", + "count_NT", + "count_daily", + "centroidconnect", + "bike_facility", + #### + "drive_access", + "walk_access", + "bike_access", + 
"truck_access", + #MTC + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_EV", + "ML_lanes_EA", + ### + #MC + "drive_node", + "walk_node", + "bike_node", + "transit_node", + # "ML_lanes", + "ML_lanes_AM", + "ML_lanes_MD", + "ML_lanes_PM", + "ML_lanes_NT", + #### + "segment_id", + "managed", + "bus_only", + "rail_only", + "transit", + ##MTC + "ft", + "assignable", + "lanes_AM", + "lanes_MD", + "lanes_PM", + "lanes_EA", + "lanes_EV", + "useclass_AM", + "useclass_EA", + "useclass_MD", + "useclass_PM", + "useclass_EV", + "tollseg", + "tollbooth", + "farezone", + "tap_id", + #### + #bi-county + "nmt2010", + "nmt2020", + "BRT", + "has_transit" + ] + + self.float_col = [ + "distance", + "price", + "X", + "Y" + "mrcc_id", + ] + + self.float_col = ["distance", "ttime_assert", "price", "X", "Y"] + + self.string_col = [ + "osm_node_id", + "name", + "pnr", + "roadway", + "shstGeometryId", + "access_AM", + "access_MD", + "access_PM", + "access_NT", + "ROUTE_SYS", + ] + + # pnr parameters + self.pnr_node_location = os.path.join( + self.data_file_location, "lookups", "pnr_stations.csv" + ) + + self.drive_buffer = 6 + + #self.network_build_crs = CRS("EPSG:2875") + #self.project_card_crs = CRS("EPSG:4326") + #self.transformer = pyproj.Transformer.from_crs( + # self.network_build_crs, self.project_card_crs, always_xy=True + #) + + self.__dict__.update(kwargs)
+
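Because __init__ ends with self.__dict__.update(kwargs), any attribute documented above can be overridden by keyword at construction time. A hedged sketch with placeholder paths:

    from lasso.parameters import Parameters

    params = Parameters(
        lasso_base_dir="/path/to/lasso",      # must have metcouncil_data at or up to two levels above it
        scratch_location="/path/to/scratch",  # output file defaults are derived from this
    )
    print(params.output_link_csv)             # e.g. /path/to/scratch/links.csv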
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_modules/lasso/project/index.html b/branch/bicounty_emme/_modules/lasso/project/index.html new file mode 100644 index 0000000..e1285f4 --- /dev/null +++ b/branch/bicounty_emme/_modules/lasso/project/index.html @@ -0,0 +1,1530 @@ + + + + + + lasso.project — lasso documentation + + + + + + + + + + + + + + + + + + +
Source code for lasso.project

+import json
+import os
+import re
+from typing import Any, Dict, Optional, Union, List
+from csv import reader
+
+from pandas.core import base
+
+import numpy as np
+import pandas as pd
+from pandas import DataFrame
+import geopandas as gpd
+
+from network_wrangler import ProjectCard
+from network_wrangler import RoadwayNetwork
+
+from .transit import CubeTransit, StandardTransit
+from .logger import WranglerLogger
+from .parameters import Parameters
+from .roadway import ModelRoadwayNetwork
+from .util import column_name_to_parts
+
+
+
[docs]class Project(object): + """A single or set of changes to the roadway or transit system. + + Compares a base and a build transit network or a base and build + highway network and produces project cards. + + .. highlight:: python + + Typical usage example: + :: + test_project = Project.create_project( + base_cube_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_cube_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + test_project.evaluate_changes() + test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + + Attributes: + DEFAULT_PROJECT_NAME: a class-level constant that defines what + the project name will be if none is set. + STATIC_VALUES: a class-level constant which defines values that + are not evaluated when assessing changes. + card_data (dict): {"project": <project_name>, "changes": <list of change dicts>} + roadway_link_changes (DataFrame): pandas dataframe of CUBE roadway link changes. + roadway_node_changes (DataFrame): pandas dataframe of CUBE roadway node changes. + transit_changes (CubeTransit): + base_roadway_network (RoadwayNetwork): + base_cube_transit_network (CubeTransit): + build_cube_transit_network (CubeTransit): + project_name (str): name of the project, set to DEFAULT_PROJECT_NAME if not provided + parameters: an instance of the Parameters class which sets a bunch of parameters + """ + + DEFAULT_PROJECT_NAME = "USER TO define" + + STATIC_VALUES = [ + "model_link_id", + "area_type", + "county", + # "assign_group", + "centroidconnect", + ] + CALCULATED_VALUES = [ + "area_type", + "county", + "assign_group", + "centroidconnect", + ] + +
[docs] def __init__( + self, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[DataFrame] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = "", + evaluate: Optional[bool] = False, + parameters: Union[dict, Parameters] = {}, + ): + """ + Project constructor. + + args: + roadway_link_changes: dataframe of roadway link changes read from a log file + roadway_node_changes: dataframe of roadway node changes read from a log file + transit_changes: dataframe of transit changes read from a log file + base_roadway_network: RoadwayNetwork instance for base case + base_transit_network: StandardTransit instance for base case + base_cube_transit_network: CubeTransit instance for base transit network + build_cube_transit_network: CubeTransit instance for build transit network + project_name: name of the project + evaluate: defaults to False; if True, will create card data + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + returns: instance of Project + """ + self.card_data = Dict[str, Dict[str, Any]] + + self.roadway_link_changes = roadway_link_changes + self.roadway_node_changes = roadway_node_changes + self.base_roadway_network = base_roadway_network + self.base_transit_network = base_transit_network + self.base_cube_transit_network = base_cube_transit_network + self.build_cube_transit_network = build_cube_transit_network + self.transit_changes = transit_changes + self.project_name = ( + project_name if project_name else Project.DEFAULT_PROJECT_NAME + ) + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + if base_roadway_network != None: + self.determine_roadway_network_changes_compatibility( + self.base_roadway_network, + self.roadway_link_changes, + self.roadway_node_changes, + self.parameters + ) + + if evaluate: + self.evaluate_changes()
+ +
[docs] def write_project_card(self, filename: str = None): + """ + Writes project cards. + + Args: + filename (str): File path to output .yml + + Returns: + None + """ + ProjectCard(self.card_data).write(out_filename=filename)
+ +
[docs] @staticmethod + def create_project( + roadway_log_file: Union[str, List[str], None] = None, + roadway_shp_file: Optional[str] = None, + roadway_csv_file: Optional[str] = None, + network_build_file: Optional[str] = None, + emme_node_id_crosswalk_file: Optional[str] = None, + emme_name_crosswalk_file: Optional[str] = None, + base_roadway_dir: Optional[str] = None, + base_transit_dir: Optional[str] = None, + base_cube_transit_source: Optional[str] = None, + build_cube_transit_source: Optional[str] = None, + roadway_link_changes: Optional[DataFrame] = None, + roadway_node_changes: Optional[DataFrame] = None, + transit_changes: Optional[CubeTransit] = None, + base_roadway_network: Optional[RoadwayNetwork] = None, + base_transit_network: Optional[StandardTransit] = None, + base_cube_transit_network: Optional[CubeTransit] = None, + build_cube_transit_network: Optional[CubeTransit] = None, + project_name: Optional[str] = None, + recalculate_calculated_variables: Optional[bool] = False, + recalculate_distance: Optional[bool] = False, + parameters: Optional[dict] = {}, + **kwargs, + ): + """ + Constructor for a Project instance. + + Args: + roadway_log_file: File path to consuming logfile or a list of logfile paths. + roadway_shp_file: File path to consuming shape file for roadway changes. + roadway_csv_file: File path to consuming csv file for roadway changes. + network_build_file: File path to consuming EMME network build for network changes. + base_roadway_dir: Folder path to base roadway network. + base_transit_dir: Folder path to base transit network. + base_cube_transit_source: Folder path to base transit network or cube line file string. + base_cube_transit_file: File path to base transit network. + build_cube_transit_source: Folder path to build transit network or cube line file string. + build_cube_transit_file: File path to build transit network. + roadway_link_changes: pandas dataframe of CUBE roadway link changes. + roadway_node_changes: pandas dataframe of CUBE roadway node changes. + transit_changes: build transit changes. + base_roadway_network: Base roadway network object. + base_cube_transit_network: Base cube transit network object. + build_cube_transit_network: Build cube transit network object. + project_name: If not provided, will default to the roadway_log_file filename if + provided (or the first filename if a list is provided) + recalculate_calculated_variables: if reading in a base network, if this is true it + will recalculate variables such as area type, etc. This only needs to be true + if you are creating project cards that are changing the calculated variables. + recalculate_distance: recalculate the distance variable. This only needs to be + true if you are creating project cards that change the distance. + parameters: dictionary of parameters + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in + the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables + in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. 
+ managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + + Returns: + A Project instance. + """ + + if base_cube_transit_source and base_cube_transit_network: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_cube_transit_network' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_cube_transit_source: + base_cube_transit_network = CubeTransit.create_from_cube(base_cube_transit_source, parameters) + WranglerLogger.debug( + "Base network has {} lines".format(len(base_cube_transit_network.lines)) + ) + if len(base_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Base network lines: {}".format( + "\n - ".join(base_cube_transit_network.lines) + ) + ) + elif base_cube_transit_network: + pass + else: + msg = "No base cube transit network." + WranglerLogger.info(msg) + base_cube_transit_network = None + + if build_cube_transit_source and transit_changes: + msg = "Method takes only one of 'build_cube_transit_source' and 'transit_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if build_cube_transit_source: + WranglerLogger.debug("build") + build_cube_transit_network = CubeTransit.create_from_cube(build_cube_transit_source, parameters) + WranglerLogger.debug( + "Build network has {} lines".format(len(build_cube_transit_network.lines)) + ) + if len(build_cube_transit_network.lines) <= 10: + WranglerLogger.debug( + "Build network lines: {}".format( + "\n - ".join(build_cube_transit_network.lines) + ) + ) + elif transit_changes: + pass + else: + msg = "No cube transit changes given or processed." 
+ WranglerLogger.info(msg) + transit_changes = None + + if roadway_log_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_log_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_csv_file and (roadway_link_changes or roadway_node_changes): + msg = "Method takes only one of 'roadway_csv_file' and 'roadway_changes' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_shp_file and roadway_csv_file: + msg = "Method takes only one of 'roadway_shp_file' and 'roadway_csv_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and roadway_shp_file: + msg = "Method takes only one of 'roadway_log_file' and 'roadway_shp_file' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if roadway_log_file and not project_name: + if type(roadway_log_file) == list: + project_name = os.path.splitext(os.path.basename(roadway_log_file[0]))[ + 0 + ] + WranglerLogger.info( + "No Project Name - Using name of first log file in list" + ) + else: + project_name = os.path.splitext(os.path.basename(roadway_log_file))[0] + WranglerLogger.info("No Project Name - Using name of log file") + if network_build_file and not project_name: + if type(network_build_file) == list: + with open(network_build_file[0]) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info( + "No Project Name - Using metadata of first network build file in list" + ) + else: + with open(network_build_file) as f: + _content = json.load(f) + project_name = ( + _content.get('metadata').get('project_title') + ' ' + + _content.get('metadata').get('date') + ' ' + + _content.get('metadata').get('comments') + ) + WranglerLogger.info("No Project Name - Using metadata of network build file") + if roadway_log_file: + roadway_link_changes, roadway_node_changes = Project.read_logfile(roadway_log_file) + elif roadway_shp_file: + roadway_changes = gpd.read_file(roadway_shp_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_link_changes = DataFrame(roadway_link_changes.drop("geometry", axis=1)) + roadway_node_changes = DataFrame(roadway_node_changes.drop("geometry", axis=1)) + roadway_node_changes["model_node_id"] = 0 + elif roadway_csv_file: + roadway_changes = pd.read_csv(roadway_csv_file) + roadway_link_changes = roadway_changes[roadway_changes.OBJECT == 'L'].copy() + roadway_node_changes = roadway_changes[roadway_changes.OBJECT == 'N'].copy() + roadway_node_changes["model_node_id"] = 0 + elif network_build_file: + roadway_link_changes, roadway_node_changes, transit_changes = Project.read_network_build_file(network_build_file) + if emme_node_id_crosswalk_file: + # get wrangler IDs from emme element_id + roadway_link_changes, roadway_node_changes, transit_changes = Project.emme_id_to_wrangler_id( + 
roadway_link_changes, + roadway_node_changes, + transit_changes, + emme_node_id_crosswalk_file + ) + else: + msg = "User needs to specify emme node id crosswalk file using emme_node_id_crosswalk_file = " + WranglerLogger.error(msg) + raise ValueError(msg) + # rename emme attributes to wrangler attributes + if emme_name_crosswalk_file is None: + emme_name_crosswalk_file = parameters.emme_name_crosswalk_file + roadway_link_changes, roadway_node_changes = Project.emme_name_to_wrangler_name( + roadway_link_changes, + roadway_node_changes, + emme_name_crosswalk_file + ) + elif roadway_link_changes: + pass + elif roadway_node_changes: + pass + else: + msg = "No roadway changes given or processed." + WranglerLogger.info(msg) + roadway_link_changes = pd.DataFrame({}) + roadway_node_changes = pd.DataFrame({}) + + if base_roadway_network and base_roadway_dir: + msg = "Method takes only one of 'base_roadway_network' and 'base_roadway_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_roadway_dir: + base_roadway_network = ModelRoadwayNetwork.read( + os.path.join(base_roadway_dir, "link.json"), + os.path.join(base_roadway_dir, "node.geojson"), + os.path.join(base_roadway_dir, "shape.geojson"), + fast=True, + recalculate_calculated_variables=recalculate_calculated_variables, + recalculate_distance=recalculate_distance, + parameters=parameters, + **kwargs, + ) + base_roadway_network.split_properties_by_time_period_and_category() + elif base_roadway_network: + base_roadway_network.split_properties_by_time_period_and_category() + else: + msg = "No base roadway network." + WranglerLogger.info(msg) + base_roadway_network = None + + if base_cube_transit_source and base_transit_dir: + msg = "Method takes only one of 'base_cube_transit_source' and 'base_transit_dir' but both given" + WranglerLogger.error(msg) + raise ValueError(msg) + if base_transit_dir: + base_transit_network = StandardTransit.read_gtfs( + gtfs_feed_dir=base_transit_dir, + parameters=parameters + ) + elif base_transit_network: + base_transit_network = base_transit_network + else: + msg = "No base transit network." + WranglerLogger.info(msg) + base_transit_network = None + + project = Project( + roadway_link_changes=roadway_link_changes, + roadway_node_changes=roadway_node_changes, + transit_changes=transit_changes, + base_roadway_network=base_roadway_network, + base_transit_network=base_transit_network, + base_cube_transit_network=base_cube_transit_network, + build_cube_transit_network=build_cube_transit_network, + evaluate=True, + project_name=project_name, + parameters=parameters, + ) + + return project
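A hedged sketch of the typical entry point documented above; the paths are placeholders, and the base roadway folder is assumed to contain the link.json, node.geojson, and shape.geojson files that ModelRoadwayNetwork.read expects:

    from lasso.project import Project

    project = Project.create_project(
        roadway_log_file="/path/to/cube_changes.log",      # Cube logfile of roadway edits
        base_roadway_dir="/path/to/base_roadway_network",  # standard network the changes are compared against
        parameters={"lasso_base_dir": "/path/to/lasso"},
    )
    project.write_project_card("/path/to/project_card.yml")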
+ +
[docs] @staticmethod + def read_logfile(logfilename: Union[str, List[str]]): + """ + Reads a Cube log file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + logfilename (str or list[str]): File path to CUBE logfile or list of logfile paths. + + Returns: + A DataFrame reprsentation of the log file. + """ + if type(logfilename) == str: + logfilename = [logfilename] + + link_df = pd.DataFrame() + node_df = pd.DataFrame() + + for file in logfilename: + WranglerLogger.info("Reading logfile: {}".format(file)) + with open(file) as f: + _content = f.readlines() + + _node_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("N") + ] + WranglerLogger.debug("node lines: {}".format(_node_lines)) + _link_lines = [ + x.strip().replace(";", ",") for x in _content if x.startswith("L") + ] + WranglerLogger.debug("link lines: {}".format(_link_lines)) + + _nodecol = ["OBJECT", "OPERATION", "GROUP"] + _node_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Node Cols: {}".format(_nodecol)) + _linkcol = ["OBJECT", "OPERATION", "GROUP"] + _link_lines[0].split(",")[ + 1: + ] + WranglerLogger.debug("Link Cols: {}".format(_linkcol)) + + def split_log(x): + return list(reader([x], delimiter=',', quotechar='"'))[0] + + _node_df = pd.DataFrame([split_log(x) for x in _node_lines[1:]],columns = _nodecol) + WranglerLogger.debug("Node DF: {}".format(_node_df)) + _link_df = pd.DataFrame([split_log(x) for x in _link_lines[1:]],columns = _linkcol) + WranglerLogger.debug("Link DF: {}".format(_link_df)) + + node_df = pd.concat([node_df, _node_df]) + link_df = pd.concat([link_df, _link_df]) + + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + link_df.columns = [c.split("[")[0] for c in link_df.columns] + # CUBE logfile headers for string fields: NAME[111] instead of NAME, need to shorten that + node_df.columns = [c.split("[")[0] for c in node_df.columns] + + if len(link_df) > 0: + # create operation history + action_history_df = ( + link_df.groupby(['A', 'B'])["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + link_df = pd.merge(link_df, action_history_df, on=['A', 'B'], how="left") + + if len(node_df) > 0: + action_history_df = ( + node_df.groupby('N')["OPERATION"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + action_history_df["operation_final"] = action_history_df.apply(lambda x: Project._final_op(x), axis=1) + + node_df = pd.merge(node_df, action_history_df, on='N', how="left") + + WranglerLogger.info( + "Processed {} Node lines and {} Link lines".format( + node_df.shape[0], link_df.shape[0] + ) + ) + + return link_df, node_df
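For example, the static method above can be used on its own to inspect a Cube logfile before building a full project; the file name is illustrative and the columns present depend on the logfile contents:

    from lasso.project import Project

    link_df, node_df = Project.read_logfile("/path/to/cube_changes.log")
    print(link_df[["A", "B", "OPERATION", "operation_final"]].head())
    print(node_df[["N", "OPERATION", "operation_final"]].head())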
+ +
[docs] @staticmethod + def read_network_build_file(networkbuildfilename: Union[str, List[str]]): + """ + Reads a emme network build file and returns separate dataframes of roadway_link_changes and roadway_node_changes + + Args: + networkbuildfilename (str or list[str]): File path to emme nework build file or list of network build file paths. + + Returns: + A DataFrame representation of the network build file + """ + if type(networkbuildfilename) == str: + networkbuildfilename = [networkbuildfilename] + + _link_command_history_df = DataFrame() + _node_command_history_df = DataFrame() + _transit_command_history_df = DataFrame() + + for file in networkbuildfilename: + WranglerLogger.info("Reading network build file: {}".format(file)) + with open(file) as f: + _content = json.load(f) + + _command_history = _content.get('command_history') + + # loop through all the commands + for command in _command_history: + if command.get('command') == 'set_attribute': + element_id = command.get('parameters').get('element_ids') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + _command_df[command.get('parameters').get('attribute_name')] = command.get('parameters').get('value') + + if command.get('command') in ['create_link', 'create_node']: + if command.get('command') == 'create_link': + element_id = command.get('results').get('changes').get('added').get('LINK') + if command.get('command') == 'create_node': + element_id = command.get('results').get('changes').get('added').get('NODE') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + for attribute_name, attribute_value in command.get('parameters').get('attributes').items(): + _command_df[attribute_name] = attribute_value + + if command.get('command') == 'delete_link': + element_id = command.get('results').get('changes').get('removed').get('LINK') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : element_id, + 'object' : object, + 'operation' : operation + } + ) + + if command.get('command') == 'modify_transit_line': + element_id = command.get('parameters').get('line_id') + object = Project.get_object_from_network_build_command(command) + operation = Project.get_operation_from_network_build_command(command) + + _command_df = DataFrame( + data = { + 'element_id' : pd.Series(element_id), + 'object' : pd.Series(object), + 'operation' : pd.Series(operation) + } + ) + + _command_df['new_itinerary'] = [command.get('parameters').get('new_itinerary')] + + if ('L' in _command_df['object'].unique()): + _link_command_history_df = _link_command_history_df.append( + _command_df[_command_df['object'] == 'L'], + sort = False, + ignore_index = True + ) + + if ('N' in _command_df['object'].unique()): + _node_command_history_df = _node_command_history_df.append( + _command_df[_command_df['object'] == 'N'], + sort = False, + ignore_index = True + ) + + if ( + ('TRANSIT_LINE' in _command_df['object'].unique()) | + ('TRANSIT_STOP' in _command_df['object'].unique()) | + ('TRANSIT_SHAPE' in 
_command_df['object'].unique()) + ): + _transit_command_history_df = _transit_command_history_df.append( + _command_df[_command_df['object'].isin(['TRANSIT_LINE', 'TRANSIT_STOP', 'TRANSIT_SHAPE'])], + sort = False, + ignore_index = True + ) + + if len(_link_command_history_df) > 0: + # create operation history + link_action_history_df = ( + _link_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + link_action_history_df["operation_final"] = link_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + # get the last none null value for each element + # consolidate elements to single record + def get_last_valid(series): + if len(series.dropna()) > 0: + return series.dropna().iloc[-1] + else: + return np.nan + + #_command_history_df = _command_history_df.groupby(['element_id']).apply(get_last_valid).reset_index() + _link_command_history_df = _link_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _link_command_history_df = pd.merge(_link_command_history_df, link_action_history_df, on='element_id', how="left") + + if len(_node_command_history_df) > 0: + # create node operation history + node_action_history_df = ( + _node_command_history_df.groupby('element_id')["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + node_action_history_df["operation_final"] = node_action_history_df.apply( + lambda x: Project._final_op(x), + axis=1 + ) + + _node_command_history_df = _node_command_history_df.groupby( + ['element_id'] + ).last().reset_index() + + _node_command_history_df = pd.merge(_node_command_history_df, node_action_history_df, on='element_id', how="left") + + WranglerLogger.info( + "Processed {} link element commands, {} node element commands".format( + _link_command_history_df.shape[0], + _node_command_history_df.shape[0] + ) + ) + + return _link_command_history_df, _node_command_history_df, _transit_command_history_df
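A usage sketch with a hypothetical file path; the method returns three tables of commands: link, node, and transit:

    link_cmd_df, node_cmd_df, transit_cmd_df = Project.read_network_build_file(
        "./example_emme_network_build.json"  # hypothetical Emme network build export
    )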
+ +
[docs] @staticmethod + def emme_id_to_wrangler_id(emme_link_change_df, emme_node_change_df, emme_transit_changes_df, emme_node_id_crosswalk_file): + """ + rewrite the emme id with wrangler id, using the emme wrangler id crosswalk located in database folder + """ + WranglerLogger.info('Reading emme node id crosswalk file from {}'.format(emme_node_id_crosswalk_file)) + emme_node_id_crosswalk_df = pd.read_csv(emme_node_id_crosswalk_file) + emme_node_id_dict = dict(zip(emme_node_id_crosswalk_df['emme_node_id'], emme_node_id_crosswalk_df['model_node_id'])) + + # get node changes + if len(emme_node_change_df) > 0: + emme_node_change_df['emme_id'] = emme_node_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + + # get new emme nodes + new_emme_node_id_list = [ + n for n in emme_node_change_df['emme_id'].to_list() if n not in emme_node_id_crosswalk_df['emme_node_id'].to_list() + ] + WranglerLogger.info('New emme node id list {}'.format(new_emme_node_id_list)) + new_wrangler_node = emme_node_id_crosswalk_df['model_node_id'].max() + + # add crosswalk for new emme nodes + for new_emme_node in new_emme_node_id_list: + if new_emme_node in emme_node_id_dict.keys(): + msg = "new node id {} has already been added to the crosswalk".format(new_emme_node) + WranglerLogger.error(msg) + raise ValueError(msg) + else: + new_wrangler_node = new_wrangler_node + 1 + emme_node_id_dict.update({new_emme_node : new_wrangler_node}) + new_emme_node_id_crosswalk_df = pd.DataFrame(emme_node_id_dict.items(), columns=['emme_node_id', 'model_node_id']) + new_emme_node_id_crosswalk_df.to_csv(emme_node_id_crosswalk_file, index=False) + + # for nodes update model_node_id + emme_node_change_df['model_node_id'] = emme_node_change_df['emme_id'].map(emme_node_id_dict).fillna(0) + + if len(emme_link_change_df) > 0: + emme_link_change_df['A'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[0])) + emme_link_change_df['B'] = emme_link_change_df['element_id'].apply(lambda x: int(x.split('-')[-1])) + # for links update A,B nodes + emme_link_change_df['A'] = emme_link_change_df['A'].map(emme_node_id_dict) + emme_link_change_df['B'] = emme_link_change_df['B'].map(emme_node_id_dict) + + if len(emme_transit_changes_df) > 0: + emme_transit_changes_df['i_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-3] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + emme_transit_changes_df['j_node'] = emme_transit_changes_df.apply( + lambda x: x['element_id'].split('-')[-2] if x['object'] == 'TRANSIT_STOP' else 0, + axis = 1 + ) + + # update i,j nodes + emme_transit_changes_df['i_node'] = emme_transit_changes_df[ + 'i_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + emme_transit_changes_df['j_node'] = emme_transit_changes_df[ + 'j_node' + ].astype( + int + ).map( + emme_node_id_dict + ).fillna(0).astype(int) + + # update routing nodes + emme_transit_changes_df['new_itinerary'] = emme_transit_changes_df.apply( + lambda x: [emme_node_id_dict.get(n) for n in x['new_itinerary']] if x['object'] == 'TRANSIT_SHAPE' else 0, + axis = 1 + ) + + return emme_link_change_df, emme_node_change_df, emme_transit_changes_df
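A hedged usage sketch; the crosswalk CSV (hypothetical path) is expected to carry emme_node_id and model_node_id columns, and the method writes newly encountered Emme nodes back to that same file:

    link_cmd_df, node_cmd_df, transit_cmd_df = Project.emme_id_to_wrangler_id(
        link_cmd_df,
        node_cmd_df,
        transit_cmd_df,
        emme_node_id_crosswalk_file="./emme_to_wrangler_node_id_crosswalk.csv",  # hypothetical path
    )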
+ +
[docs]
    def get_object_from_network_build_command(row):
        """
        Determine which network object type an Emme network build command refers to.

        Args:
            row: a single command dictionary from the network build file's command history.

        Returns:
            'L' for a link, 'N' for a node, or 'TRANSIT_LINE' / 'TRANSIT_STOP' /
            'TRANSIT_SHAPE' for transit commands.
        """

        if row.get('command') == 'create_link':
            return 'L'

        if row.get('command') == 'create_node':
            return 'N'

        if row.get('command') == 'delete_link':
            return 'L'

        if row.get('command') == 'set_attribute':
            if row.get('parameters').get('element_type') == 'LINK':
                return 'L'
            if row.get('parameters').get('element_type') == 'NODE':
                return 'N'
            if row.get('parameters').get('element_type') == 'TRANSIT_LINE':
                return 'TRANSIT_LINE'
            if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT':
                return 'TRANSIT_STOP'

        if row.get('command') == 'modify_transit_line':
            return 'TRANSIT_SHAPE'
+ +
[docs]
    def get_operation_from_network_build_command(row):
        """
        Determine the operation type of an Emme network build command.

        Args:
            row: a single command dictionary from the network build file's command history.

        Returns:
            'A' for add, 'C' for change, or 'D' for delete.
        """

        if row.get('command') == 'create_link':
            return 'A'

        if row.get('command') == 'create_node':
            return 'A'

        if row.get('command') == 'delete_link':
            return 'D'

        if row.get('command') == 'set_attribute':
            if row.get('parameters').get('element_type') == 'LINK':
                return 'C'
            if row.get('parameters').get('element_type') == 'NODE':
                return 'C'
            if row.get('parameters').get('element_type') == 'TRANSIT_LINE':
                return 'C'
            if row.get('parameters').get('element_type') == 'TRANSIT_SEGMENT':
                return 'C'

        if row.get('command') == 'modify_transit_line':
            return 'C'
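Together these two helpers translate one entry of the build file's command history into an object code and an operation code. A small illustration with a hand-built 'set_attribute' command (the attribute name and value are made up):

    command = {
        "command": "set_attribute",
        "parameters": {"element_type": "LINK", "attribute_name": "@lanes", "value": 3},
    }
    Project.get_object_from_network_build_command(command)     # -> 'L'
    Project.get_operation_from_network_build_command(command)  # -> 'C'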
+ +
[docs] @staticmethod + def emme_name_to_wrangler_name(emme_link_change_df, emme_node_change_df, emme_name_crosswalk_file): + """ + rename emme names to wrangler names using crosswalk file + """ + + WranglerLogger.info('Reading emme attribute name crosswalk file {}'.format(emme_name_crosswalk_file)) + emme_name_crosswalk_df = pd.read_csv(emme_name_crosswalk_file) + emme_name_crosswalk_dict = dict(zip(emme_name_crosswalk_df['emme_name'], emme_name_crosswalk_df['wrangler_name'])) + + # drop columns we don't need from emme to avoid confusion + ignore_columns = [ + c for c in emme_link_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'A', 'B'] + ] + WranglerLogger.info('Ignoring link changes in {}'.format(ignore_columns)) + emme_link_change_df = emme_link_change_df.drop(ignore_columns, axis = 1) + + ignore_columns = [ + c for c in emme_node_change_df.columns if c not in list(emme_name_crosswalk_dict.keys()) + ['operation_final', 'model_node_id'] + ] + WranglerLogger.info('Ignoring node changes in {}'.format(ignore_columns)) + emme_node_change_df = emme_node_change_df.drop(ignore_columns, axis = 1) + + # rename emme name to wrangler name + emme_link_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + emme_node_change_df.rename(columns = emme_name_crosswalk_dict, inplace = True) + + return emme_link_change_df, emme_node_change_df
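A usage sketch; the crosswalk CSV (hypothetical path) needs emme_name and wrangler_name columns, and columns not covered by the crosswalk or the keep-lists above are dropped:

    link_cmd_df, node_cmd_df = Project.emme_name_to_wrangler_name(
        link_cmd_df,
        node_cmd_df,
        emme_name_crosswalk_file="./emme_attribute_name_crosswalk.csv",  # hypothetical path
    )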
+ +
[docs] @staticmethod + def determine_roadway_network_changes_compatibility( + base_roadway_network: ModelRoadwayNetwork, + roadway_link_changes: DataFrame, + roadway_node_changes: DataFrame, + parameters: Parameters, + ): + """ + Checks to see that any links or nodes that change exist in base roadway network. + """ + WranglerLogger.info( + "Evaluating compatibility between roadway network changes and base network. Not evaluating deletions." + ) + + # CUBE log file saves all variable names in upper cases, need to convert them to be same as network + log_to_net_df = pd.read_csv(parameters.log_to_net_crosswalk) + log_to_net_dict = dict(zip(log_to_net_df["log"], log_to_net_df["net"])) + + dbf_to_net_df = pd.read_csv(parameters.net_to_dbf_crosswalk) + dbf_to_net_dict = dict(zip(dbf_to_net_df["dbf"], dbf_to_net_df["net"])) + + for c in roadway_link_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B"]): + roadway_link_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_link_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_link_changes.rename(columns=dbf_to_net_dict, inplace=True) + + for c in roadway_node_changes.columns: + if (c not in log_to_net_df["log"].tolist() + log_to_net_df["net"].tolist()) & (c not in ["A", "B", "X", "Y"]): + roadway_node_changes.rename(columns={c : c.lower()}, inplace=True) + roadway_node_changes.rename(columns=log_to_net_dict, inplace=True) + roadway_node_changes.rename(columns=dbf_to_net_dict, inplace=True) + + # for links "L" that change "C", + # find locations where there isn't a base roadway link + if len(roadway_link_changes) > 0: + link_changes_df = roadway_link_changes[ + roadway_link_changes["operation_final"] == "C" + ].copy() + + link_merge_df = pd.merge( + link_changes_df[["A", "B"]].astype(str), + base_roadway_network.links_df[["A", "B", "model_link_id"]].astype(str), + how="left", + on=["A", "B"], + ) + + missing_links = link_merge_df.loc[link_merge_df["model_link_id"].isna()] + + if missing_links.shape[0]: + msg = "Network missing the following AB links:\n{}".format(missing_links) + WranglerLogger.error(msg) + raise ValueError(msg) + + # for links "N" that change "C", + # find locations where there isn't a base roadway node + if len(roadway_node_changes) > 0: + node_changes_df = roadway_node_changes[ + roadway_node_changes["operation_final"] == "C" + ].copy() + + node_merge_df = pd.merge( + node_changes_df[["model_node_id"]], + base_roadway_network.nodes_df[["model_node_id", "geometry"]], + how="left", + on=["model_node_id"], + ) + missing_nodes = node_merge_df.loc[node_merge_df["geometry"].isna()] + if missing_nodes.shape[0]: + msg = "Network missing the following nodes:\n{}".format(missing_nodes) + WranglerLogger.error(msg) + raise ValueError(msg)
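Because the check raises a ValueError when a changed link or node is missing from the base network, it is typically run before building project cards. A sketch with assumed objects (base_net as a ModelRoadwayNetwork and params as a lasso Parameters instance):

    Project.determine_roadway_network_changes_compatibility(
        base_roadway_network=base_net,        # assumed ModelRoadwayNetwork
        roadway_link_changes=link_changes_df,
        roadway_node_changes=node_changes_df,
        parameters=params,                    # assumed Parameters instance
    )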
+ +
[docs] def evaluate_changes(self): + """ + Determines which changes should be evaluated, initiates + self.card_data to be an aggregation of transit and highway changes. + """ + highway_change_list = [] + transit_change_list = [] + + WranglerLogger.info("Evaluating project changes.") + + if (not self.roadway_link_changes.empty) | (not self.roadway_node_changes.empty): + highway_change_list = self.add_highway_changes() + + if self.transit_changes is not None: + if (not self.transit_changes.empty) or ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + if ( + self.base_cube_transit_network is not None + and self.build_cube_transit_network is not None + ): + transit_change_list = self.add_transit_changes() + + self.card_data = { + "project": self.project_name, + "changes": transit_change_list + highway_change_list, + }
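On a fully constructed Project instance (called project here only for illustration), the typical call is simply:

    project.evaluate_changes()
    # project.card_data now holds {"project": <project name>, "changes": [...]},
    # where "changes" concatenates the transit and highway change dictionaries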
+ +
[docs]
    def add_transit_changes(self):
        """
        Evaluates differences between the base and build transit networks and
        returns them as a list of project card change dictionaries.
        """
        transit_change_list = []
        if self.build_cube_transit_network:
            transit_change_list = self.build_cube_transit_network.evaluate_differences(
                self.base_cube_transit_network
            )
        elif self.base_transit_network:
            transit_change_list = self.base_transit_network.evaluate_differences(
                self.transit_changes
            )
        return transit_change_list
+ + @staticmethod + def _final_op(x): + if x["operation_history"][-1] == "D": + if "A" in x["operation_history"][:-1]: + return "N" + else: + return "D" + elif x["operation_history"][-1] == "A": + if "D" in x["operation_history"][:-1]: + return "C" + else: + return "A" + else: + if "A" in x["operation_history"][:-1]: + return "A" + else: + return "C" + +
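The _final_op helper collapses the ordered operation history of a single element into one net operation code. A few illustrative calls (plain dictionaries are passed here only for illustration; in the methods above it is applied to DataFrame rows):

    Project._final_op({"operation_history": ["A", "C", "D"]})  # added then deleted     -> "N"
    Project._final_op({"operation_history": ["C", "D"]})       # existing link deleted  -> "D"
    Project._final_op({"operation_history": ["D", "A"]})       # deleted then re-added  -> "C"
    Project._final_op({"operation_history": ["A", "C"]})       # added then edited      -> "A"
    Project._final_op({"operation_history": ["C", "C"]})       # edited twice           -> "C"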
[docs] def add_highway_changes(self, limit_variables_to_existing_network=False): + """ + Evaluates changes from the log file based on the base highway object and + adds entries into the self.card_data dictionary. + + Args: + limit_variables_to_existing_network (bool): True if no ad-hoc variables. Default to False. + """ + + for c in self.parameters.string_col: + if c in self.roadway_link_changes.columns: + self.roadway_link_changes[c] = self.roadway_link_changes[c].str.lstrip(" ") + if c in self.roadway_node_changes.columns: + self.roadway_node_changes[c] = self.roadway_node_changes[c].str.lstrip(" ") + + ## if worth it, could also add some functionality to network wrangler itself. + node_changes_df = self.roadway_node_changes.copy() + + link_changes_df = self.roadway_link_changes.copy() + + def _process_deletions(link_changes_df): + """ + create deletion section in project card + """ + WranglerLogger.debug("Processing link deletions") + + cube_delete_df = link_changes_df[link_changes_df["operation_final"] == "D"].copy() + + # make sure columns has the same type as base network + cube_delete_df['A'] = cube_delete_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_delete_df['B'] = cube_delete_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_delete_df.columns: + cube_delete_df.drop(['model_link_id'], axis = 1, inplace = True) + + cube_delete_df = pd.merge( + cube_delete_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if len(cube_delete_df) > 0: + links_to_delete = cube_delete_df["model_link_id"].tolist() + delete_link_dict = { + "category": "Roadway Deletion", + "links": {"model_link_id": links_to_delete}, + } + WranglerLogger.debug("{} Links Deleted.".format(len(links_to_delete))) + else: + delete_link_dict = None + WranglerLogger.debug("No link deletions processed") + + return delete_link_dict + + def _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ): + """""" + WranglerLogger.debug("Processing link additions") + cube_add_df = link_changes_df[link_changes_df["operation_final"] == "A"] + if len(cube_add_df) == 0: + WranglerLogger.debug("No link additions processed") + return {} + + if limit_variables_to_existing_network: + add_col = [ + c + for c in cube_add_df.columns + if c in self.base_roadway_network.links_df.columns + ] + else: + add_col = [ + c for c in cube_add_df.columns if c not in ["operation_final"] + ] + # can leave out "operation_final" from writing out, is there a reason to write it out? 
+ + for x in add_col: + cube_add_df[x] = cube_add_df[x].astype(self.base_roadway_network.links_df[x].dtype) + + add_link_properties = cube_add_df[add_col].to_dict("records") + + # WranglerLogger.debug("Add Link Properties: {}".format(add_link_properties)) + WranglerLogger.debug("{} Links Added".format(len(add_link_properties))) + + return {"category": "Add New Roadway", "links": add_link_properties} + + def _process_node_additions(node_add_df): + """""" + WranglerLogger.debug("Processing node additions") + + if len(node_add_df) == 0: + WranglerLogger.debug("No node additions processed") + return [] + + node_add_df = node_add_df.drop(["operation_final"], axis=1) + + node_add_df = node_add_df.apply(_reproject_coordinates, axis=1) + + for x in node_add_df.columns: + node_add_df[x] = node_add_df[x].astype(self.base_roadway_network.nodes_df[x].dtype) + + add_nodes_dict_list = node_add_df.to_dict( + "records" + ) + WranglerLogger.debug("{} Nodes Added".format(len(add_nodes_dict_list))) + + return add_nodes_dict_list + + def _reproject_coordinates(row): + reprojected_x, reprojected_y = self.parameters.transformer.transform(row['X'], row['Y']) + row['X'] = reprojected_x + row['Y'] = reprojected_y + return row + + def _process_single_link_change(change_row, changeable_col): + """""" + + # 1. Find associated base year network values + base_df = self.base_roadway_network.links_df[ + (self.base_roadway_network.links_df["A"] == int(change_row.A)) + & (self.base_roadway_network.links_df["B"] == int(change_row.B)) + ] + + if not base_df.shape[0]: + msg = "No match found in network for AB combination: ({},{}). Incompatible base network.".format( + change_row.A, change_row.B + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + elif base_df.shape[0] > 1: + WranglerLogger.warning( + "Found more than one match in base network for AB combination: ({},{}). Selecting first one to operate on but AB should be unique to network.".format( + change_row.A, change_row.B + ) + ) + + base_row = base_df.iloc[0] + # WranglerLogger.debug("Properties with changes: {}".format(changeable_col)) + + # 2. find columns that changed (enough) + changed_col = [] + for col in changeable_col: + WranglerLogger.debug("Assessing Column: {}".format(col)) + # if it is the same as before, or a static value, don't process as a change + if str(change_row[col]).strip('"\'') == str(base_row[col]).strip('"\''): + continue + # if it is NaN or None, don't process as a change + if (change_row[col] != change_row[col]) | (change_row[col] is None): + continue + if (col == "roadway_class") & (change_row[col] == 0): + continue + # only look at distance if it has significantly changed + if col == "distance": + if ( + abs( + (change_row[col] - float(base_row[col])) + / base_row[col].astype(float) + ) + > 0.01 + ): + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + else: + continue + else: + change_row[col] = type(base_row[col])(change_row[col]) + changed_col.append(col) + + WranglerLogger.debug( + "Properties with changes that will be processed: {}".format(changed_col) + ) + + if not changed_col: + return pd.DataFrame() + + # 3. Iterate through columns with changed values and structure the changes as expected in project card + property_dict_list = [] + processed_properties = [] + + # check if it's a manged lane change + for c in changed_col: + if c.startswith("ML_"): + # TODO ML project card skeleton + msg = "Detected managed lane changes, please create managed lane project card!" 
+ WranglerLogger.error(msg) + raise ValueError(msg) + return + + # regular roadway property change + for c in changed_col: + # WranglerLogger.debug("Processing Column: {}".format(c)) + ( + p_base_name, + p_time_period, + p_category, + managed_lane, + ) = column_name_to_parts(c, self.parameters) + + _d = { + "existing": base_row[c], + "set": change_row[c], + } + if c in Project.CALCULATED_VALUES: + _d = { + "set": change_row[c], + } + if p_time_period: + if managed_lane == 1: + _d["time"] = list( + self.parameters.time_period_to_time[p_time_period] + ) + if p_category: + _d["category"] = p_category + + # iterate through existing properties that have been changed and see if you should just add + if (p_base_name in processed_properties) & (managed_lane == 1): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + processed_p["timeofday"] += [_d] + elif (p_base_name in processed_properties) & (managed_lane == 0): + for processed_p in property_dict_list: + if processed_p["property"] == p_base_name: + if processed_p["set"] != change_row[c]: + msg = "Detected different changes for split-property variables on regular roadway links: " + msg += "conflicting \"{}\" values \"{}\", \"{}\"".format(p_base_name, processed_p["set"], change_row[c]) + WranglerLogger.error(msg) + raise ValueError(msg) + elif p_time_period: + if managed_lane == 1: + property_dict = {"property": p_base_name, "timeofday": [_d]} + processed_properties.append(p_base_name) + property_dict_list.append(property_dict) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + else: + _d["property"] = p_base_name + processed_properties.append(_d["property"]) + property_dict_list.append(_d) + + card_df = pd.DataFrame( + { + "properties": pd.Series([property_dict_list]), + "model_link_id": pd.Series(base_row["model_link_id"]), + } + ) + + # WranglerLogger.debug('single change card_df:\n {}'.format(card_df)) + + return card_df + + def _process_link_changes(link_changes_df, changeable_col): + """""" + cube_change_df = link_changes_df[link_changes_df["operation_final"] == "C"].copy() + + # make sure columns has the same type as base network + cube_change_df['A'] = cube_change_df['A'].astype( + type(self.base_roadway_network.links_df['A'].iloc[0]) + ) + cube_change_df['B'] = cube_change_df['B'].astype( + type(self.base_roadway_network.links_df['B'].iloc[0]) + ) + + if 'model_link_id' in cube_change_df.columns: + cube_change_df.drop('model_link_id', axis = 1, inplace = True) + + cube_change_df = pd.merge( + cube_change_df, + self.base_roadway_network.links_df[['A', 'B', 'model_link_id']], + how = 'left', + on = ['A', 'B'] + ) + + if not cube_change_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + change_link_dict_df = pd.DataFrame(columns=["properties", "model_link_id"]) + + for index, row in cube_change_df.iterrows(): + card_df = _process_single_link_change(row, changeable_col) + + change_link_dict_df = pd.concat( + [change_link_dict_df, card_df], ignore_index=True, sort=False + ) + + if not change_link_dict_df.shape[0]: + WranglerLogger.info("No link changes processed") + return [] + + # WranglerLogger.debug('change_link_dict_df Unaggregated:\n {}'.format(change_link_dict_df)) + + # Have to change to string so that it is a hashable type for the aggregation + change_link_dict_df["properties"] = change_link_dict_df[ + "properties" + ].astype(str) + # Group the changes that are the same + change_link_dict_df = ( + 
change_link_dict_df.groupby("properties")[["model_link_id"]] + .agg(lambda x: list(x)) + .reset_index() + ) + # WranglerLogger.debug('change_link_dict_df Aggregated:\n {}'.format(change_link_dict_df)) + + # Reformat model link id to correct "facility" format + change_link_dict_df["facility"] = change_link_dict_df.apply( + lambda x: {"link": [{"model_link_id": x.model_link_id}]}, axis=1 + ) + + # WranglerLogger.debug('change_link_dict_df 3: {}'.format(change_link_dict_df)) + change_link_dict_df["properties"] = change_link_dict_df["properties"].apply( + lambda x: json.loads( + x.replace("'\"", "'").replace("\"'", "'").replace("'", '"') + ) + ) + + change_link_dict_df["category"] = "Roadway Property Change" + + change_link_dict_list = change_link_dict_df[ + ["category", "facility", "properties"] + ].to_dict("record") + + WranglerLogger.debug( + "{} Changes Processed".format(len(change_link_dict_list)) + ) + return change_link_dict_list + + def _consolidate_actions(log, base, key_list): + log_df = log.copy() + # will be changed if to allow new variables being added/changed that are not in base network + changeable_col = [x for x in log_df.columns if x in base.columns] + #print(log_df) + #for x in changeable_col: + # print(x) + #log_df[x] = log_df[x].astype(base[x].dtype) + + if 'operation_final' not in log_df.columns: + action_history_df = ( + log_df.groupby(key_list)["operation"] + .agg(lambda x: x.tolist()) + .rename("operation_history") + .reset_index() + ) + + log_df = pd.merge(log_df, action_history_df, on=key_list, how="left") + log_df.drop_duplicates(subset=key_list, keep="last", inplace=True) + log_df["operation_final"] = log_df.apply(lambda x: Project._final_op(x), axis=1) + + return log_df[changeable_col + ["operation_final"]] + + delete_link_dict = None + add_link_dict = None + change_link_dict_list = [] + + if len(link_changes_df) != 0: + link_changes_df = _consolidate_actions( + link_changes_df, self.base_roadway_network.links_df, ["A", "B"] + ) + + # process deletions + delete_link_dict = _process_deletions(link_changes_df) + + # process additions + add_link_dict = _process_link_additions( + link_changes_df, limit_variables_to_existing_network + ) + + # process changes + WranglerLogger.debug("Processing changes") + WranglerLogger.debug(link_changes_df) + changeable_col = list( + ( + set(link_changes_df.columns) + & set(self.base_roadway_network.links_df.columns) + ) + - set(Project.STATIC_VALUES) + ) + + cols_in_changes_not_in_net = list( + set(link_changes_df.columns) + - set(self.base_roadway_network.links_df.columns) + ) + + if cols_in_changes_not_in_net: + WranglerLogger.warning( + "The following attributes are specified in the changes but do not exist in the base network: {}".format( + cols_in_changes_not_in_net + ) + ) + + change_link_dict_list = _process_link_changes(link_changes_df, changeable_col) + + if len(node_changes_df) != 0: + node_changes_df = _consolidate_actions( + node_changes_df, self.base_roadway_network.nodes_df, ["model_node_id"] + ) + + # print error message for node change and node deletion + if ( + len(node_changes_df[node_changes_df["operation_final"].isin(["C", "D"])]) + > 0 + ): + msg = "NODE changes and deletions are not allowed!" 
+ WranglerLogger.warning(msg) + #raise ValueError(msg) + node_add_df = node_changes_df[node_changes_df["operation_final"] == "A"] + + if add_link_dict: + add_link_dict["nodes"] = _process_node_additions(node_add_df) + else: + add_link_dict = {"category": "Add New Roadway", "nodes": _process_node_additions(node_add_df)} + + else: + None + + # combine together + + highway_change_list = list( + filter(None, [delete_link_dict] + [add_link_dict] + change_link_dict_list) + ) + + return highway_change_list
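For reference, each entry appended to the returned highway change list follows the project card structures assembled above; an illustrative "Roadway Property Change" entry (the model_link_id and values are made up) looks like:

    {
        "category": "Roadway Property Change",
        "facility": {"link": [{"model_link_id": [123456]}]},
        "properties": [{"property": "lanes", "existing": 2, "set": 3}],
    }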
\ No newline at end of file
diff --git a/branch/bicounty_emme/_modules/lasso/roadway/index.html b/branch/bicounty_emme/_modules/lasso/roadway/index.html
new file mode 100644
index 0000000..f276fd1
--- /dev/null
+++ b/branch/bicounty_emme/_modules/lasso/roadway/index.html
@@ -0,0 +1,2046 @@
+lasso.roadway — lasso documentation
Source code for lasso.roadway

+import copy
+import glob
+import os
+from typing import Optional, Union
+
+import geopandas as gpd
+import pandas as pd
+
+from geopandas import GeoDataFrame
+from pandas import DataFrame
+import numpy as np
+
+from network_wrangler import RoadwayNetwork
+from .parameters import Parameters
+from .logger import WranglerLogger
+
+
+
[docs]class ModelRoadwayNetwork(RoadwayNetwork): + """ + Subclass of network_wrangler class :ref:`RoadwayNetwork <network_wrangler:RoadwayNetwork>` + + A representation of the physical roadway network and its properties. + """ + + CALCULATED_VALUES = [ + "area_type", + "county", + "centroidconnect", + ] + +
[docs] def __init__( + self, + nodes: GeoDataFrame, + links: DataFrame, + shapes: GeoDataFrame, + parameters: Union[Parameters, dict] = {}, + **kwargs, + ): + """ + Constructor + + Args: + nodes: geodataframe of nodes + links: dataframe of links + shapes: geodataframe of shapes + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. + If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + """ + super().__init__(nodes, links, shapes, **kwargs) + + # will have to change if want to alter them + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.links_metcouncil_df = None + self.nodes_metcouncil_df = None + + self.fill_na() + self.convert_int()
+ # self.shapes_metcouncil_df = None + ##todo also write to file + # WranglerLogger.debug("Used PARAMS\n", '\n'.join(['{}: {}'.format(k,v) for k,v in self.parameters.__dict__.items()])) + +
[docs] @staticmethod + def read( + link_filename: str, + node_filename: str, + shape_filename: str, + fast: bool = False, + recalculate_calculated_variables=False, + recalculate_distance=False, + parameters: Union[dict, Parameters] = {}, + **kwargs, + ): + """ + Reads in links and nodes network standard. + + Args: + link_filename (str): File path to link json. + node_filename (str): File path to node geojson. + shape_filename (str): File path to link true shape geojson + fast (bool): boolean that will skip validation to speed up read time. + recalculate_calculated_variables (bool): calculates fields from spatial joins, etc. + recalculate_distance (bool): re-calculates distance. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + crs (int): coordinate reference system, ESPG number + node_foreign_key (str): variable linking the node table to the link table + link_foreign_key (list): list of variable linking the link table to the node foreign key + shape_foreign_key (str): variable linking the links table and shape table + unique_link_ids (list): list of variables unique to each link + unique_node_ids (list): list of variables unique to each node + modes_to_network_link_variables (dict): Mapping of modes to link variables in the network + modes_to_network_nodes_variables (dict): Mapping of modes to node variables in the network + managed_lanes_node_id_scalar (int): Scalar values added to primary keys for nodes for + corresponding managed lanes. + managed_lanes_link_id_scalar (int): Scalar values added to primary keys for links for + corresponding managed lanes. + managed_lanes_required_attributes (list): attributes that must be specified in managed + lane projects. + keep_same_attributes_ml_and_gp (list): attributes to copy to managed lanes from parallel + general purpose lanes. + Returns: + ModelRoadwayNetwork + """ + + nodes_df, links_df, shapes_df = RoadwayNetwork.load_transform_network( + node_filename, + link_filename, + shape_filename, + validate_schema=not fast, + **kwargs, + ) + + m_road_net = ModelRoadwayNetwork( + nodes_df, + links_df, + shapes_df, + parameters=parameters, + **kwargs, + ) + + if recalculate_calculated_variables: + m_road_net.create_calculated_variables() + if recalculate_distance: + m_road_net.calculate_distance(overwrite=True) + + m_road_net.fill_na() + # this method is making period values as string "NaN", need to revise. + m_road_net.split_properties_by_time_period_and_category() + for c in m_road_net.links_df.columns: + m_road_net.links_df[c] = m_road_net.links_df[c].replace("NaN", np.nan) + m_road_net.convert_int() + + return m_road_net
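A minimal read sketch with hypothetical file paths; fast=True skips schema validation, and parameters falls back to the lasso defaults when omitted:

    # assumes ModelRoadwayNetwork has been imported from lasso
    model_net = ModelRoadwayNetwork.read(
        link_filename="./link.json",       # hypothetical standard-network files
        node_filename="./node.geojson",
        shape_filename="./shape.geojson",
        fast=True,
    )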
+ +
[docs] @staticmethod + def from_RoadwayNetwork( + roadway_network_object, + parameters: Union[dict, Parameters] = {}, + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + roadway_network_object (RoadwayNetwork). + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not specified, will use default parameters. + + Returns: + ModelRoadwayNetwork + """ + + additional_params_dict = { + k: v + for k, v in roadway_network_object.__dict__.items() + if k not in ["nodes_df", "links_df", "shapes_df", "parameters"] + } + + return ModelRoadwayNetwork( + roadway_network_object.nodes_df, + roadway_network_object.links_df, + roadway_network_object.shapes_df, + parameters=parameters, + **additional_params_dict, + )
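A sketch for wrapping an existing network_wrangler RoadwayNetwork (assumed to be held in road_net); nodes, links, shapes, and any additional attributes are carried over:

    model_net = ModelRoadwayNetwork.from_RoadwayNetwork(road_net)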
+ +
[docs] def split_properties_by_time_period_and_category(self, properties_to_split=None): + """ + Splits properties by time period, assuming a variable structure of + + Args: + properties_to_split: dict + dictionary of output variable prefix mapped to the source variable and what to stratify it by + e.g. + { + 'lanes' : {'v':'lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'ML_lanes' : {'v':'ML_lanes', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + 'use' : {'v':'use', 'times_periods':{"AM": ("6:00", "10:00"),"PM": ("15:00", "19:00")}}, + } + + """ + import itertools + + if properties_to_split == None: + properties_to_split = self.parameters.properties_to_split + + for out_var, params in properties_to_split.items(): + if params["v"] not in self.links_df.columns: + WranglerLogger.warning( + "Specified variable to split: {} not in network variables: {}. Returning 0.".format( + params["v"], str(self.links_df.columns) + ) + ) + if params.get("time_periods") and params.get("categories"): + + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + time_suffix + "_" + category_suffix + ] = 0 + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[out_var + "_" + time_suffix] = 0 + elif params.get("time_periods") and params.get("categories"): + for time_suffix, category_suffix in itertools.product( + params["time_periods"], params["categories"] + ): + self.links_df[ + out_var + "_" + category_suffix + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=params["categories"][category_suffix], + time_period=params["time_periods"][time_suffix], + ) + elif params.get("time_periods"): + for time_suffix in params["time_periods"]: + self.links_df[ + out_var + "_" + time_suffix + ] = self.get_property_by_time_period_and_group( + params["v"], + category=None, + time_period=params["time_periods"][time_suffix], + ) + else: + raise ValueError( + "Shoudn't have a category without a time period: {}".format(params) + )
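A call sketch using the "time_periods" key that the implementation reads; the property name trn_priority is hypothetical:

    model_net.split_properties_by_time_period_and_category(
        {
            "trn_priority": {
                "v": "trn_priority",  # hypothetical source variable
                "time_periods": {"AM": ("6:00", "10:00"), "PM": ("15:00", "19:00")},
            }
        }
    )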
+ +
[docs] def create_calculated_variables(self): + """ + Creates calculated roadway variables. + + Args: + None + """ + WranglerLogger.info("Creating calculated roadway variables.") + + #MTC + self.create_ML_variable() + #/MTC + #MC + self.calculate_area_type() + self.calculate_county() + self.calculate_mpo() + self.add_counts() + self.create_ML_variable() + self.create_hov_corridor_variable() + self.create_managed_variable()
+ #/MC + +
[docs] def calculate_county( + self, + county_shape=None, + county_shape_variable=None, + network_variable="county", + county_codes_dict=None, + overwrite=False, + ): + """ + #MC + Calculates county variable. + + This uses the centroid of the geometry field to determine which county it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + county_shape (str): The File path to county geodatabase. + county_shape_variable (str): The variable name of county in county geodadabase. + network_variable (str): The variable name of county in network standard. Default to "county". + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing County Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "County Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + county_shape = county_shape if county_shape else self.parameters.county_shape + + county_shape_variable = ( + county_shape_variable + if county_shape_variable + else self.parameters.county_variable_shp + ) + + WranglerLogger.info( + "Adding roadway network variable for county using a spatial join with: {}".format( + county_shape + ) + ) + + county_codes_dict = ( + county_codes_dict if county_codes_dict else self.parameters.county_code_dict + ) + if not county_codes_dict: + msg = "No county codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + county_gdf = gpd.read_file(county_shape) + county_gdf = county_gdf.to_crs(epsg=self.crs) + joined_gdf = gpd.sjoin(centroids_gdf, county_gdf, how="left", op="intersects") + + joined_gdf[county_shape_variable] = ( + joined_gdf[county_shape_variable] + .map(county_codes_dict) + .fillna(10) + .astype(int) + ) + + self.links_df[network_variable] = joined_gdf[county_shape_variable] + + WranglerLogger.info( + "Finished Calculating county variable: {}".format(network_variable) + )
+ +
[docs] def calculate_area_type( + self, + area_type_shape=None, + area_type_shape_variable=None, + network_variable="area_type", + area_type_codes_dict=None, + downtown_area_type_shape=None, + downtown_area_type=None, + overwrite=False, + ): + """ + #MC + Calculates area type variable. + + This uses the centroid of the geometry field to determine which area it should be labeled. + This isn't perfect, but it much quicker than other methods. + + Args: + area_type_shape (str): The File path to area geodatabase. + area_type_shape_variable (str): The variable name of area type in area geodadabase. + network_variable (str): The variable name of area type in network standard. Default to "area_type". + area_type_codes_dict: The dictionary to map input area_type_shape_variable to network_variable + downtown_area_type_shape: The file path to the downtown area type boundary. + downtown_area_type (int): Integer value of downtown area type + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Area Type Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Area Type Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Area Type from Spatial Data and adding as roadway network variable: {}".format( + network_variable + ) + ) + + """ + Verify inputs + """ + + area_type_shape = ( + area_type_shape if area_type_shape else self.parameters.area_type_shape + ) + + if not area_type_shape: + msg = "No area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(area_type_shape): + msg = "File not found for area type shape: {}".format(area_type_shape) + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_shape_variable = ( + area_type_shape_variable + if area_type_shape_variable + else self.parameters.area_type_variable_shp + ) + + if not area_type_shape_variable: + msg = "No area type shape varible specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + area_type_codes_dict = ( + area_type_codes_dict + if area_type_codes_dict + else self.parameters.area_type_code_dict + ) + if not area_type_codes_dict: + msg = "No area type codes dictionary specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type_shape = ( + downtown_area_type_shape + if downtown_area_type_shape + else self.parameters.downtown_area_type_shape + ) + + if not downtown_area_type_shape: + msg = "No downtown area type shape specified" + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(downtown_area_type_shape): + msg = "File not found for downtown area type shape: {}".format( + downtown_area_type_shape + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + downtown_area_type = ( + downtown_area_type + if downtown_area_type + else self.parameters.downtown_area_type + ) + if not downtown_area_type: + msg = "No downtown area type value specified" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + centroids_gdf = self.links_df.copy() + centroids_gdf["geometry"] = centroids_gdf["geometry"].centroid + + WranglerLogger.debug("Reading Area Type Shapefile {}".format(area_type_shape)) + area_type_gdf = gpd.read_file(area_type_shape) + area_type_gdf = area_type_gdf.to_crs(epsg=self.crs) 
+ + downtown_gdf = gpd.read_file(downtown_area_type_shape) + downtown_gdf = downtown_gdf.to_crs(epsg=self.crs) + + joined_gdf = gpd.sjoin( + centroids_gdf, area_type_gdf, how="left", op="intersects" + ) + + joined_gdf[area_type_shape_variable] = ( + joined_gdf[area_type_shape_variable] + .map(area_type_codes_dict) + .fillna(1) + .astype(int) + ) + + WranglerLogger.debug("Area Type Codes Used: {}".format(area_type_codes_dict)) + + d_joined_gdf = gpd.sjoin( + centroids_gdf, downtown_gdf, how="left", op="intersects" + ) + + d_joined_gdf["downtown_area_type"] = d_joined_gdf["Id"].fillna(-99).astype(int) + + joined_gdf.loc[ + d_joined_gdf["downtown_area_type"] == 0, area_type_shape_variable + ] = downtown_area_type + + WranglerLogger.debug( + "Downtown Area Type used boundary file: {}".format(downtown_area_type_shape) + ) + + self.links_df[network_variable] = joined_gdf[area_type_shape_variable] + + WranglerLogger.info( + "Finished Calculating Area Type from Spatial Data into variable: {}".format( + network_variable + ) + )
+ +
[docs] def calculate_mpo( + self, + county_network_variable="county", + network_variable="mpo", + as_integer=True, + mpo_counties=None, + overwrite=False, + ): + """ + Calculates mpo variable. + #MC + Args: + county_variable (str): Name of the variable where the county names are stored. Default to "county". + network_variable (str): Name of the variable that should be written to. Default to "mpo". + as_integer (bool): If true, will convert true/false to 1/0s. + mpo_counties (list): List of county names that are within mpo region. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing MPO Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "MPO Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating MPO as roadway network variable: {}".format(network_variable) + ) + """ + Verify inputs + """ + county_network_variable = ( + county_network_variable + if county_network_variable + else self.parameters.county_network_variable + ) + + if not county_network_variable: + msg = "No variable specified as containing 'county' in the network." + WranglerLogger.error(msg) + raise ValueError(msg) + if county_network_variable not in self.links_df.columns: + msg = "Specified county network variable: {} does not exist in network. Try running or debuging county calculation." + WranglerLogger.error(msg) + raise ValueError(msg) + + mpo_counties = mpo_counties if mpo_counties else self.parameters.mpo_counties + + if not mpo_counties: + msg = "No MPO Counties specified in method call or in parameters." + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("MPO Counties: {}".format(",".join(str(mpo_counties)))) + + """ + Start actual process + """ + + mpo = self.links_df[county_network_variable].isin(mpo_counties) + + if as_integer: + mpo = mpo.astype(int) + + self.links_df[network_variable] = mpo + + WranglerLogger.info( + "Finished calculating MPO variable: {}".format(network_variable) + )
+ +
[docs] def add_variable_using_shst_reference( + self, + var_shst_csvdata=None, + shst_csv_variable=None, + network_variable=None, + network_var_type=int, + overwrite=False, + ): + """ + Join network links with source data, via SHST API node match result. + + Args: + var_shst_csvdata (str): File path to SHST API return. + shst_csv_variable (str): Variable name in the source data. + network_variable (str): Name of the variable that should be written to. + network_var_type : Variable type in the written network. + overwrite (bool): True is overwriting existing variable. Default to False. + + Returns: + None + + """ + WranglerLogger.info( + "Adding Variable {} using Shared Streets Reference from {}".format( + network_variable, var_shst_csvdata + ) + ) + + var_shst_df = pd.read_csv(var_shst_csvdata) + + if "shstReferenceId" not in var_shst_df.columns: + msg = "'shstReferenceId' required but not found in {}".format(var_shst_data) + WranglerLogger.error(msg) + raise ValueError(msg) + + if shst_csv_variable not in var_shst_df.columns: + msg = "{} required but not found in {}".format( + shst_csv_variable, var_shst_data + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + join_gdf = pd.merge( + self.links_df, + var_shst_df[["shstReferenceId", shst_csv_variable]], + how="left", + on="shstReferenceId", + ) + + join_gdf[shst_csv_variable].fillna(0, inplace=True) + + if network_variable in self.links_df.columns and not overwrite: + join_gdf.loc[join_gdf[network_variable] > 0, network_variable] = join_gdf[ + shst_csv_variable + ].astype(network_var_type) + else: + join_gdf[network_variable] = join_gdf[shst_csv_variable].astype( + network_var_type + ) + + self.links_df[network_variable] = join_gdf[network_variable] + + WranglerLogger.info( + "Added variable: {} using Shared Streets Reference".format(network_variable) + )
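A hedged usage sketch; the CSV path and the AADT_2019 column are hypothetical, but the file must contain a shstReferenceId column, as checked above:

    model_net.add_variable_using_shst_reference(
        var_shst_csvdata="./counts_shst_match.csv",  # hypothetical SHST match result
        shst_csv_variable="AADT_2019",               # hypothetical column in that file
        network_variable="AADT",
        network_var_type=int,
        overwrite=True,
    )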
+ +
[docs] def add_counts( + self, + network_variable="AADT", + mndot_count_shst_data=None, + widot_count_shst_data=None, + mndot_count_variable_shp=None, + widot_count_variable_shp=None, + ): + + """ + Adds count variable. + #MC + join the network with count node data, via SHST API node match result + + Args: + network_variable (str): Name of the variable that should be written to. Default to "AADT". + mndot_count_shst_data (str): File path to MNDOT count location SHST API node match result. + widot_count_shst_data (str): File path to WIDOT count location SHST API node match result. + mndot_count_variable_shp (str): File path to MNDOT count location geodatabase. + widot_count_variable_shp (str): File path to WIDOT count location geodatabase. + + Returns: + None + """ + + WranglerLogger.info("Adding Counts") + + """ + Verify inputs + """ + + mndot_count_shst_data = ( + mndot_count_shst_data + if mndot_count_shst_data + else self.parameters.mndot_count_shst_data + ) + widot_count_shst_data = ( + widot_count_shst_data + if widot_count_shst_data + else self.parameters.widot_count_shst_data + ) + mndot_count_variable_shp = ( + mndot_count_variable_shp + if mndot_count_variable_shp + else self.parameters.mndot_count_variable_shp + ) + widot_count_variable_shp = ( + widot_count_variable_shp + if widot_count_variable_shp + else self.parameters.widot_count_variable_shp + ) + + for varname, var in { + "mndot_count_shst_data": mndot_count_shst_data, + "widot_count_shst_data": widot_count_shst_data, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + if not os.path.exists(var): + msg = "{}' not found at following location: {}.".format(varname, var) + WranglerLogger.error(msg) + raise ValueError(msg) + + for varname, var in { + "mndot_count_variable_shp": mndot_count_variable_shp, + "widot_count_variable_shp": widot_count_variable_shp, + }.items(): + if not var: + msg = "'{}' not found in method or lasso parameters.".format(varname) + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + WranglerLogger.debug( + "Adding MNDOT Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + mndot_count_shst_data, mndot_count_variable_shp, network_variable + ) + ) + # Add Minnesota Counts + self.add_variable_using_shst_reference( + var_shst_csvdata=mndot_count_shst_data, + shst_csv_variable=mndot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=True, + ) + WranglerLogger.debug( + "Adding WiDot Counts using \n- shst file: {}\n- shp file: {}\n- as network variable: {}".format( + widot_count_shst_data, widot_count_variable_shp, network_variable + ) + ) + # Add Wisconsin Counts, but don't overwrite Minnesota + self.add_variable_using_shst_reference( + var_shst_csvdata=widot_count_shst_data, + shst_csv_variable=widot_count_variable_shp, + network_variable=network_variable, + network_var_type=int, + overwrite=False, + ) + + self.links_df["count_AM"] = self.links_df[network_variable] / 4 + self.links_df["count_MD"] = self.links_df[network_variable] / 4 + self.links_df["count_PM"] = self.links_df[network_variable] / 4 + self.links_df["count_NT"] = self.links_df[network_variable] / 4 + + self.links_df["count_daily"] = self.links_df[network_variable] + self.links_df["count_year"] = 2017 + + WranglerLogger.info( + "Finished adding counts variable: {}".format(network_variable) + )
+ +
[docs]
    @staticmethod
    def read_match_result(path):
        """
        Reads SharedStreets (shst) geojson match results.

        Reads every file matching the glob pattern and concatenates them into a
        single dataframe.

        Args:
            path (str): File path (or glob pattern) to SHST match results.

        Returns:
            geodataframe: geopandas geodataframe of the concatenated match results

        ##todo
        this is a generic reader and probably belongs in a utilities module rather than this class
        """
        refId_gdf = DataFrame()
        refid_file = glob.glob(path)
        for i in refid_file:
            new = gpd.read_file(i)
            refId_gdf = pd.concat([refId_gdf, new], ignore_index=True, sort=False)
        return refId_gdf
+ +
[docs] @staticmethod + def get_attribute( + links_df, + join_key, # either "shstReferenceId", or "shstGeometryId", tests showed the latter gave better coverage + source_shst_ref_df, # source shst refId + source_gdf, # source dataframe + field_name, # , # targetted attribute from source + ): + """ + Gets attribute from source data using SHST match result. + + Args: + links_df (dataframe): The network dataframe that new attribute should be written to. + join_key (str): SHST ID variable name used to join source data with network dataframe. + source_shst_ref_df (str): File path to source data SHST match result. + source_gdf (str): File path to source data. + field_name (str): Name of the attribute to get from source data. + + Returns: + None + """ + # join based on shared streets geometry ID + # pp_link_id is shared streets match return + # source_ink_id is mrcc + WranglerLogger.debug( + "source ShSt rename_variables_for_dbf columns\n{}".format( + source_shst_ref_df.columns + ) + ) + WranglerLogger.debug("source gdf columns\n{}".format(source_gdf.columns)) + # end up with OSM network with the MRCC Link ID + # could also do with route_sys...would that be quicker? + join_refId_df = pd.merge( + links_df, + source_shst_ref_df[[join_key, "pp_link_id", "score"]].rename( + columns={"pp_link_id": "source_link_id", "score": "source_score"} + ), + how="left", + on=join_key, + ) + + # joined with MRCC dataframe to get route_sys + + join_refId_df = pd.merge( + join_refId_df, + source_gdf[["LINK_ID", field_name]].rename( + columns={"LINK_ID": "source_link_id"} + ), + how="left", + on="source_link_id", + ) + + # drop duplicated records with same field value + + join_refId_df.drop_duplicates( + subset=["model_link_id", "shstReferenceId", field_name], inplace=True + ) + + # more than one match, take the best score + + join_refId_df.sort_values( + by=["model_link_id", "source_score"], + ascending=True, + na_position="first", + inplace=True, + ) + + join_refId_df.drop_duplicates( + subset=["model_link_id"], keep="last", inplace=True + ) + + # self.links_df[field_name] = join_refId_df[field_name] + + return join_refId_df[links_df.columns.tolist() + [field_name, "source_link_id"]]
+ +
[docs] def calculate_use( + self, + network_variable="use", + as_integer=True, + overwrite=False, + ): + """ + Calculates use variable. + + Args: + network_variable (str): Variable that should be written to in the network. Default to "use" + as_integer (bool): If True, will convert true/false to 1/0s. Defauly to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + + Returns: + None + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "'use' Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating hov and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + #MTC + self.links_df[network_variable] = int(1) + #/MTC + + self.links_df[network_variable] = 0 + + self.links_df.loc[ + (self.links_df["assign_group"] == 8) | (self.links_df["access"] == "hov"), + network_variable, + ] = 100 + #/MC + + + if as_integer: + self.links_df[network_variable] = self.links_df[network_variable].astype( + int + ) + WranglerLogger.info( + "Finished calculating hov variable: {}".format(network_variable) + )
+ +
[docs] def create_ML_variable( + self, + network_variable="ML_lanes", + overwrite=False, + ): + """ + Created ML lanes placeholder for project to write out ML changes + + ML lanes default to 0, ML info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing ML Variable '{}' already in network".format( + network_variable + ) + ) + self.links_df[network_variable] = int(0) + else: + WranglerLogger.info( + "ML Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + WranglerLogger.info( + "Finished creating ML lanes variable: {}".format(network_variable) + )
+ +
[docs] def create_hov_corridor_variable( + self, + network_variable="segment_id", + overwrite=False, + ): + """ + Created hov corridor placeholder for project to write out corridor changes + + hov corridor id default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing hov corridor Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Hov corridor Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating hov corridor variable: {}".format(network_variable) + )
+ +
[docs] def create_managed_variable( + self, + network_variable="managed", + overwrite=False, + ): + """ + Created placeholder for project to write out managed + + managed default to 0, its info comes from cube LOG file and store in project cards + + Args: + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + """ + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing managed Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Managed Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + self.links_df[network_variable] = int(0) + + WranglerLogger.info( + "Finished creating managed variable: {}".format(network_variable) + )
+ +
[docs] def calculate_centroidconnect( + self, + parameters, + network_variable="centroidconnect", + highest_taz_number=None, + as_integer=True, + overwrite=False, + ): + """ + Calculates centroid connector variable. + + Args: + parameters (Parameters): A Lasso Parameters, which stores input files. + network_variable (str): Variable that should be written to in the network. Default to "centroidconnect" + highest_taz_number (int): the max TAZ number in the network. + as_integer (bool): If True, will convert true/false to 1/0s. Default to True. + overwrite (Bool): True if overwriting existing county variable in network. Default to False. + Returns: + RoadwayNetwork + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing Centroid Connector Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Centroid Connector Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + WranglerLogger.info( + "Calculating Centroid Connector and adding as roadway network variable: {}".format( + network_variable + ) + ) + """ + Verify inputs + """ + highest_taz_number = ( + highest_taz_number if highest_taz_number else parameters.highest_taz_number + ) + + if not highest_taz_number: + msg = "No highest_TAZ number specified in method variable or in parameters" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug( + "Calculating Centroid Connectors using highest TAZ number: {}".format( + highest_taz_number + ) + ) + + if not network_variable: + msg = "No network variable specified for centroid connector" + WranglerLogger.error(msg) + raise ValueError(msg) + + """ + Start actual process + """ + self.links_df[network_variable] = False + + self.links_df.loc[ + (self.links_df["A"] <= highest_taz_number) + | (self.links_df["B"] <= highest_taz_number), + network_variable, + ] = True + + if as_integer: + self.links_df[network_variable] = self.links_df[ + network_variable + ].astype(int) + WranglerLogger.info( + "Finished calculating centroid connector variable: {}".format(network_variable) + )
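A usage sketch under the assumption that TAZ centroid nodes are numbered at or below a known threshold (the 3100 here is illustrative, not a real regional value):

# `net` is a hypothetical instance; `parameters` is a lasso Parameters object
net.calculate_centroidconnect(parameters, highest_taz_number=3100, overwrite=True)
# links whose A or B node is <= 3100 are now flagged with centroidconnect == 1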
+ + +
[docs] def calculate_distance( + self, network_variable="distance", centroidconnect_only=False, overwrite=False + ): + """ + calculate link distance in miles + + Args: + centroidconnect_only (Bool): True if calculating distance for centroidconnectors only. Default to False. + overwrite (Bool): True if overwriting existing variable in network. Default to False. + + Returns: + None + + """ + + if network_variable in self.links_df: + if overwrite: + WranglerLogger.info( + "Overwriting existing distance Variable '{}' already in network".format( + network_variable + ) + ) + else: + WranglerLogger.info( + "Distance Variable '{}' already in network. Returning without overwriting.".format( + network_variable + ) + ) + return + + """ + Verify inputs + """ + + #MC + if ("centroidconnect" not in self.links_df) & ("taz" not in self.links_df.roadway.unique()): + if centroidconnect_only: + msg = "No variable specified for centroid connector, calculating centroidconnect first" + WranglerLogger.error(msg) + raise ValueError(msg) + #/MC + + """ + Start actual process + """ + + temp_links_gdf = self.links_df.copy() + temp_links_gdf.crs = "EPSG:4326" + temp_links_gdf = temp_links_gdf.to_crs(epsg=26915) + + #MTC + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MTC + #MC + if centroidconnect_only: + WranglerLogger.info( + "Calculating {} for centroid connectors".format(network_variable) + ) + temp_links_gdf[network_variable] = np.where( + temp_links_gdf.centroidconnect == 1, + temp_links_gdf.geometry.length / 1609.34, + temp_links_gdf[network_variable], + ) + else: + WranglerLogger.info( + "Calculating distance for all links".format(network_variable) + ) + temp_links_gdf[network_variable] = temp_links_gdf.geometry.length / 1609.34 + #/MC + + self.links_df[network_variable] = temp_links_gdf[network_variable]
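A usage sketch (hypothetical net instance). The method reprojects a copy of the links to EPSG:26915 (meters) and converts geometric length to miles by dividing by 1609.34:

net.calculate_distance(network_variable="distance", overwrite=True)
# links_df["distance"] now holds link length in miles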
+ +
[docs] def convert_int(self, int_col_names=[]): + """ + Convert integer columns + """ + + #MTC + WranglerLogger.info( + "Converting variable type to mtc standard" + ) + + int_col_names = self.parameters.int_col + #/MTC + #MC + """ + WranglerLogger.info("Converting variable type to MetCouncil standard") + + if not int_col_names: + int_col_names = self.parameters.int_col + #/MC + """ + ##Why are we doing this? + # int_col_names.remove("lanes") + + for c in list(set(self.links_df.columns) & set(int_col_names)): + self.links_df[c] = self.links_df[c].replace(np.nan, 0) + # REPLACE BLANKS WITH ZERO FOR INTEGER COLUMNS + self.links_df[c] = self.links_df[c].replace('', 0) + try: + self.links_df[c] = self.links_df[c].replace(np.nan, 0) + self.links_df[c] = self.links_df[c].replace("", 0) + self.links_df[c] = self.links_df[c].astype(int) + except ValueError: + try: + self.links_df[c] = self.links_df[c].astype(float) + self.links_df[c] = self.links_df[c].astype(int) + except: + msg = f"Could not convert column {c} to integer." + WranglerLogger.error(msg) + raise ValueError(msg) + except: + self.links_df[c] = self.links_df[c].astype(float) + self.links_df[c] = self.links_df[c].astype(int) + + for c in list(set(self.nodes_df.columns) & set(int_col_names)): + self.nodes_df[c] = self.nodes_df[c].replace("", 0) + self.nodes_df[c] = self.nodes_df[c].astype(int)
+ +
[docs] def fill_na(self): + """ + Fill na values from create_managed_lane_network() + """ + + WranglerLogger.info("Filling nan for network from network wrangler") + + num_col = self.parameters.int_col + self.parameters.float_col + + for x in list(self.links_df.columns): + if x in num_col: + self.links_df[x].fillna(0, inplace=True) + self.links_df[x] = self.links_df[x].apply( + lambda k: 0 if k in [np.nan, "", float("nan"), "NaN"] else k + ) + + else: + self.links_df[x].fillna("", inplace=True) + + for x in list(self.nodes_df.columns): + if x in num_col: + self.nodes_df[x].fillna(0, inplace=True) + else: + self.nodes_df[x].fillna("", inplace=True)
+ + +
[docs] def roadway_standard_to_met_council_network(self, output_epsg=None): + """ + Rename and format roadway attributes to be consistent with what metcouncil's model is expecting. + #MC + Args: + output_epsg (int): epsg number of output network. + + Returns: + None + """ + + WranglerLogger.info( + "Renaming roadway attributes to be consistent with what metcouncil's model is expecting" + ) + + """ + Verify inputs + """ + + output_epsg = output_epsg if output_epsg else self.parameters.output_epsg + + """ + Start actual process + """ + if "managed" in self.links_df.columns: + WranglerLogger.info("Creating managed lane network.") + self.create_managed_lane_network(in_place=True) + + # when ML and assign_group projects are applied together, assign_group is filled as "" by wrangler for ML links + for c in ModelRoadwayNetwork.CALCULATED_VALUES: + if c in self.links_df.columns and c in self.parameters.int_col: + self.links_df[c] = self.links_df[c].replace("", 0) + else: + WranglerLogger.info("Didn't detect managed lanes in network.") + + self.calculate_centroidconnect(self.parameters) + self.create_calculated_variables() + self.calculate_distance(overwrite=True) + + self.fill_na() + # no method to calculate price yet, will be hard coded in project card + WranglerLogger.info("Splitting variables by time period and category") + self.split_properties_by_time_period_and_category() + self.convert_int() + + self.links_metcouncil_df = self.links_df.copy() + self.nodes_metcouncil_df = self.nodes_df.copy() + + self.links_metcouncil_df = pd.merge( + self.links_metcouncil_df.drop( + "geometry", axis=1 + ), # drop the stick geometry in links_df + self.shapes_df[["shape_id", "geometry"]], + how="left", + on="shape_id", + ) + + self.links_metcouncil_df.crs = "EPSG:4326" + self.nodes_metcouncil_df.crs = "EPSG:4326" + WranglerLogger.info("Setting Coordinate Reference System to EPSG 26915") + self.links_metcouncil_df = self.links_metcouncil_df.to_crs(epsg=26915) + self.nodes_metcouncil_df = self.nodes_metcouncil_df.to_crs(epsg=26915) + + self.nodes_metcouncil_df["X"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.x + ) + self.nodes_metcouncil_df["Y"] = self.nodes_metcouncil_df.geometry.apply( + lambda g: g.y + ) + + # CUBE expect node id to be N + self.nodes_metcouncil_df.rename(columns={"model_node_id": "N"}, inplace=True)
+ +
[docs] def rename_variables_for_dbf( + self, + input_df, + variable_crosswalk: str = None, + output_variables: list = None, + convert_geometry_to_xy=False, + ): + """ + Rename attributes for DBF/SHP, make sure length within 10 chars. + + Args: + input_df (dataframe): Network standard DataFrame. + variable_crosswalk (str): File path to variable name crosswalk from network standard to DBF names. + output_variables (list): List of strings for DBF variables. + convert_geometry_to_xy (bool): True if converting node geometry to X/Y + + Returns: + dataframe + + """ + WranglerLogger.info("Renaming variables so that they are DBF-safe") + + """ + Verify inputs + """ + + variable_crosswalk = ( + variable_crosswalk + if variable_crosswalk + else self.parameters.net_to_dbf_crosswalk + ) + + output_variables = ( + output_variables if output_variables else self.parameters.output_variables + ) + + """ + Start actual process + """ + + crosswalk_df = pd.read_csv(variable_crosswalk) + WranglerLogger.debug( + "Variable crosswalk: {} \n {}".format(variable_crosswalk, crosswalk_df) + ) + net_to_dbf_dict = dict(zip(crosswalk_df["net"], crosswalk_df["dbf"])) + + dbf_name_list = [] + + dbf_df = copy.deepcopy(input_df) + + # only write out variables that we specify + # if variable is specified in the crosswalk, rename it to that variable + for c in dbf_df.columns: + if c in output_variables: + try: + dbf_df.rename(columns={c: net_to_dbf_dict[c]}, inplace=True) + dbf_name_list += [net_to_dbf_dict[c]] + except: + dbf_name_list += [c] + + if "geometry" in dbf_df.columns: + if str(dbf_df["geometry"].iloc[0].geom_type) == "Point": + dbf_df["X"] = dbf_df.geometry.apply(lambda g: g.x) + dbf_df["Y"] = dbf_df.geometry.apply(lambda g: g.y) + dbf_name_list += ["X", "Y"] + + WranglerLogger.debug("DBF Variables: {}".format(",".join(dbf_name_list))) + + return dbf_df[dbf_name_list]
+ +
[docs] def write_roadway_as_shp( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + data_to_csv: bool = True, + data_to_dbf: bool = False, + output_link_shp: str = None, + output_node_shp: str = None, + output_link_csv: str = None, + output_node_csv: str = None, + output_gpkg: str = None, + output_link_gpkg_layer: str = None, + output_node_gpkg_layer: str = None, + output_gpkg_link_filter: str = None + ): + """ + Write out dbf/shp/gpkg for cube. Write out csv in addition to shp with full length variable names. + + Args: + output_dir (str): File path to directory + node_output_variables (list): List of strings for node output variables. + link_output_variables (list): List of strings for link output variables. + data_to_csv (bool): True if write network in csv format. + data_to_dbf (bool): True if write network in dbf/shp format. + output_link_shp (str): File name to output link dbf/shp. + output_node_shp (str): File name of output node dbf/shp. + output_link_csv (str): File name to output link csv. + output_node_csv (str): File name to output node csv. + output_gpkg (str): File name to output GeoPackage. + output_link_gpkg_layer (str): Layer name within output_gpkg to output links. + output_node_gpkg_layer (str): Layer name within output_gpkg to output links. + output_gpkg_link_filter (str): Optional column name to additional output link subset layers + + Returns: + None + """ + + WranglerLogger.info("Writing Network as Shapefile") + WranglerLogger.debug( + "Output Variables: \n - {}".format( + "\n - ".join(self.parameters.output_variables) + ) + ) + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_met_council_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + # unless specified that all the data goes to the DBF, only output A and B + dbf_link_output_variables = ( + #MTC + link_output_variables if link_output_variables else ["A", "B", "geometry"] + #MC + #link_output_variables if data_to_dbf else ["A", "B", "shape_id", "geometry"] + ) + + # Removing code to set this to versions from parameters + # User can use these as arg + + """ + Start Process + """ + # rename these to short only for shapefile option + if output_node_shp: + WranglerLogger.info("Renaming DBF Node Variables") + nodes_dbf_df = self.rename_variables_for_dbf(self.nodes_mtc_df, output_variables=node_output_variables) + else: + WranglerLogger.debug("nodes_mtc_df columns: {}".format(list(self.nodes_mtc_df.columns))) + nodes_dbf_df = self.nodes_mtc_df[node_output_variables] + + if output_link_shp: + WranglerLogger.info("Renaming DBF Link Variables") + links_dbf_df = self.rename_variables_for_dbf(self.links_mtc_df, output_variables=dbf_link_output_variables) + else: + WranglerLogger.debug("links_mtc_df columns: {}".format(list(self.links_mtc_df.columns))) + links_dbf_df = self.links_mtc_df[dbf_link_output_variables] + + links_dbf_df = gpd.GeoDataFrame(links_dbf_df, 
geometry=links_dbf_df["geometry"]) + + # temp debug + WranglerLogger.debug("links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))]:\n{}".format( + links_dbf_df.loc[(links_dbf_df.A.isin([7063066,2563066]))|(links_dbf_df.B.isin([7063066,2563066]))] + )) + + if output_node_shp: + WranglerLogger.info("Writing Node Shapes: {}".format(os.path.join(output_dir, output_node_shp))) + nodes_dbf_df.to_file(os.path.join(output_dir, output_node_shp)) + + if output_gpkg and output_node_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Node Layer {}".format(os.path.join(output_dir, output_gpkg), output_node_gpkg_layer)) + nodes_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_node_gpkg_layer, driver="GPKG") + + if output_link_shp: + WranglerLogger.info("Writing Link Shapes: {}".format(os.path.join(output_dir, output_link_shp))) + links_dbf_df.to_file(os.path.join(output_dir, output_link_shp)) + + # debug test + link_schema = { + "properties": { + "A" : "int:8", + "B" : "int:8", + "model_link_id" : "int:10", + "shstGeometryId": "str:32", + "name" : "str:84", + "ft" : "int:2", + "assignable" : "int:18", + "cntype" : "str:80", + "distance" : "float", + "county" : "str:15", + "bike_access" : "int:2", + "drive_access" : "int:2", + "walk_access" : "int:2", + "rail_only" : "int:2", + "bus_only" : "int:2", + "transit" : "int:2", + "managed" : "int:2", + "tollbooth" : "int:2", + "tollseg" : "int:2", + "segment_id" : "int:4", + "lanes_EA" : "int:2", + "lanes_AM" : "int:2", + "lanes_MD" : "int:2", + "lanes_PM" : "int:2", + "lanes_EV" : "int:2", + "useclass_EA" : "int:2", + "useclass_AM" : "int:2", + "useclass_MD" : "int:2", + "useclass_PM" : "int:2", + "useclass_EV" : "int:2", + "nmt2010" : "int:2", + "nmt2020" : "int:2", + }, + "geometry": "LineString" + } + if output_gpkg and output_link_gpkg_layer: + WranglerLogger.info("Writing GeoPackage {} with Link Layer {}".format(os.path.join(output_dir, output_gpkg), output_link_gpkg_layer)) + links_dbf_df.to_file(os.path.join(output_dir, output_gpkg), layer=output_link_gpkg_layer, schema=link_schema, driver="GPKG") + + # output additional link layers if filter column is specified + # e.g. if county-subsets are output + if output_gpkg_link_filter: + link_value_counts = links_dbf_df[output_gpkg_link_filter].value_counts() + for filter_val,filter_count in link_value_counts.items(): + gpkg_layer_name = "{}_{}".format(output_link_gpkg_layer, filter_val) + gpkg_layer_name = gpkg_layer_name.replace(" ","_") + WranglerLogger.info("Writing GeoPackage {} with Link Layer {} for {} rows".format( + os.path.join(output_dir, output_gpkg), gpkg_layer_name, filter_count)) + links_dbf_df.loc[ links_dbf_df[output_gpkg_link_filter]==filter_val ].to_file( + os.path.join(output_dir, output_gpkg), layer=gpkg_layer_name, schema=link_schema, driver="GPKG") + + + + + if data_to_csv: + WranglerLogger.info( + "Writing Network Data to CSVs:\n - {}\n - {}".format( + output_link_csv, output_node_csv + ) + ) + self.links_mtc_df[link_output_variables].to_csv( + output_link_csv, index=False + ) + self.nodes_mtc_df[node_output_variables].to_csv( + output_node_csv, index=False + )
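A call sketch for the writer above (paths and layer names are hypothetical; the GeoPackage layers are only written when both output_gpkg and the corresponding layer name are supplied, and CSVs are written because data_to_csv defaults to True):

net.write_roadway_as_shp(
    output_dir="outputs",
    output_gpkg="network.gpkg",
    output_link_gpkg_layer="links",
    output_node_gpkg_layer="nodes",
    output_link_csv="outputs/links.csv",
    output_node_csv="outputs/nodes.csv",
)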
+ + + # this should be moved to util +
[docs] @staticmethod + def dataframe_to_fixed_width(df): + """ + Convert dataframe to fixed width format, geometry column will not be transformed. + + Args: + df (pandas DataFrame). + + Returns: + pandas dataframe: dataframe with fixed width for each column. + dict: dictionary with columns names as keys, column width as values. + """ + WranglerLogger.info("Starting fixed width conversion") + if 'name' in df.columns: + df['name']=df['name'].apply(lambda x: x.strip().split(',')[0].replace("[",'').replace("'nan'","").replace("nan","").replace("'","")) + + + # get the max length for each variable column + max_width_dict = dict( + [ + (v, df[v].apply(lambda r: len(str(r)) if r != None else 0).max()) + for v in df.columns.values + if v != "geometry" + ] + ) + + fw_df = df.drop("geometry", axis=1).copy() + for c in fw_df.columns: + fw_df[c] = fw_df[c].apply(lambda x: str(x)) + fw_df["pad"] = fw_df[c].apply(lambda x: " " * (max_width_dict[c] - len(x))) + fw_df[c] = fw_df.apply(lambda x: x["pad"] + x[c], axis=1) + + return fw_df, max_width_dict
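A small worked example of the fixed-width helper (hypothetical data, and assuming the method lives on lasso's ModelRoadwayNetwork class as the references above suggest; the geometry column is required by the implementation but dropped from the output):

import pandas as pd
from shapely.geometry import Point

df = pd.DataFrame({
    "A": [1, 23],
    "B": [456, 7],
    "geometry": [Point(0, 0), Point(1, 1)],
})
fw_df, widths = ModelRoadwayNetwork.dataframe_to_fixed_width(df)
# widths == {"A": 2, "B": 3}; values in fw_df are left-padded with spaces to those widths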
+ +
[docs] def write_roadway_as_fixedwidth( + self, + output_dir, + node_output_variables: list = None, + link_output_variables: list = None, + output_link_txt: str = None, + output_node_txt: str = None, + output_link_header_width_txt: str = None, + output_node_header_width_txt: str = None, + output_cube_network_script: str = None, + drive_only: bool = False, + ): + """ + Writes out fixed width file. + + This function does: + 1. write out link and node fixed width data files for cube. + 2. write out header and width correspondence. + 3. write out cube network building script with header and width specification. + + Args: + output_dir (str): File path to where links, nodes and script will be written and run + node_output_variables (list): list of node variable names. + link_output_variables (list): list of link variable names. + output_link_txt (str): File name of output link database (within output_dir) + output_node_txt (str): File name of output node database (within output_dir) + output_link_header_width_txt (str): File name of link column width records (within output_dir) + output_node_header_width_txt (str): File name of node column width records (within output_dir) + output_cube_network_script (str): File name of CUBE network building script (within output_dir) + drive_only (bool): If True, only writes drive nodes and links + + Returns: + None + + """ + + """ + Verify inputs + """ + + if self.nodes_mtc_df is None: + self.roadway_standard_to_mtc_network() + + WranglerLogger.debug( + "Network Link Variables: \n - {}".format( + "\n - ".join(self.links_mtc_df.columns) + ) + ) + WranglerLogger.debug( + "Network Node Variables: \n - {}".format( + "\n - ".join(self.nodes_mtc_df.columns) + ) + ) + + link_output_variables = ( + link_output_variables + if link_output_variables + else [ + c + for c in self.links_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + node_output_variables = ( + node_output_variables + if node_output_variables + else [ + c + for c in self.nodes_mtc_df.columns + if c in self.parameters.output_variables + ] + ) + + output_link_txt = ( + output_link_txt if output_link_txt else self.parameters.output_link_txt + ) + + output_node_txt = ( + output_node_txt if output_node_txt else self.parameters.output_node_txt + ) + + output_link_header_width_txt = ( + output_link_header_width_txt + if output_link_header_width_txt + else self.parameters.output_link_header_width_txt + ) + + output_node_header_width_txt = ( + output_node_header_width_txt + if output_node_header_width_txt + else self.parameters.output_node_header_width_txt + ) + + output_cube_network_script = ( + output_cube_network_script + if output_cube_network_script + else self.parameters.output_cube_network_script + ) + + """ + Start Process + """ + #MTC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_mtc_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df['drive_access'] == 1] + #/MTC + """ + #MC + link_ff_df, link_max_width_dict = self.dataframe_to_fixed_width( + self.links_metcouncil_df[link_output_variables] + ) + + if drive_only: + link_ff_df = link_ff_df.loc[link_ff_df["drive_access"] == 1] + #/MC + """ + WranglerLogger.info("Writing out link database") + + link_ff_df.to_csv(os.path.join(output_dir, output_link_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out link header and width ----") + link_max_width_df = DataFrame( + 
list(link_max_width_dict.items()), columns=["header", "width"] + ) + link_max_width_df.to_csv(os.path.join(output_dir, output_link_header_width_txt), index=False) + + #MTC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_mtc_df[node_output_variables] + ) + #/MTC + """ + #MC + node_ff_df, node_max_width_dict = self.dataframe_to_fixed_width( + self.nodes_metcouncil_df[node_output_variables] + ) + #/MC + """ + WranglerLogger.info("Writing out node database") + + if drive_only: + node_ff_df = node_ff_df.loc[node_ff_df["drive_node"] == 1] + + + node_ff_df.to_csv(os.path.join(output_dir, output_node_txt), sep=";", index=False, header=False) + + # write out header and width correspondence + WranglerLogger.info("Writing out node header and width") + node_max_width_df = DataFrame( + list(node_max_width_dict.items()), columns=["header", "width"] + ) + node_max_width_df.to_csv(os.path.join(output_dir, output_node_header_width_txt), index=False) + + # write out cube script + s = 'RUN PGM = NETWORK MSG = "Read in network from fixed width file" \n' + s += 'FILEI LINKI[1] = "{}",'.format(output_link_txt) + start_pos = 1 + for i in range(len(link_max_width_df)): + s += " VAR=" + link_max_width_df.header.iloc[i] + + if ( + self.links_mtc_df.dtypes.loc[link_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(link_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(link_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += link_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += "\n" + s += 'FILEI NODEI[1] = "{}",'.format(output_node_txt) + start_pos = 1 + for i in range(len(node_max_width_df)): + s += " VAR=" + node_max_width_df.header.iloc[i] + + if ( + self.nodes_mtc_df.dtypes.loc[node_max_width_df.header.iloc[i]] + == "O" + ): + s += "(C" + str(node_max_width_df.width.iloc[i]) + ")" + + s += ( + ", BEG=" + + str(start_pos) + + ", LEN=" + + str(node_max_width_df.width.iloc[i]) + + "," + ) + + start_pos += node_max_width_df.width.iloc[i] + 1 + + s = s[:-1] + s += '\n' + s += 'FILEO NETO = "complete_network.net"\n\n' + s += ' ZONES = {}\n\n'.format(self.parameters.zones) + s += '; Trim leading whitespace from string variables\n' + # todo: The below should be built above based on columns that are strings + s += ' phase=NODEMERGE\n' + s += ' county = LTRIM(county)\n' + s += ' endphase\n' + s += ' phase=LINKMERGE\n' + s += ' name = LTRIM(name)\n' + s += ' county = LTRIM(county)\n' + s += ' cntype = LTRIM(cntype)\n' + s += ' endphase\n' + s += '\nENDRUN\n' + + with open(os.path.join(output_dir, output_cube_network_script), "w") as f: + f.write(s) + + # run the cube script to create the cube network + import subprocess + env = copy.copy(os.environ) + cube_cmd = '"C:\\Program Files\\Citilabs\\CubeVoyager\\runtpp.exe" {}'.format(output_cube_network_script) + try: + WranglerLogger.info("Running [{}] in cwd [{}]".format(cube_cmd, output_dir)) + ret = subprocess.run(cube_cmd, cwd=output_dir, capture_output=True, check=True) + + WranglerLogger.info("return code: {}".format(ret.returncode)) + + for line in ret.stdout.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stdout: {}".format(line)) + + for line in ret.stderr.decode('utf-8').split('\r\n'): + if len(line) > 0: WranglerLogger.info("stderr: {}".format(line)) + + except Exception as e: + WranglerLogger.error(e)
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_modules/lasso/transit/index.html b/branch/bicounty_emme/_modules/lasso/transit/index.html new file mode 100644 index 0000000..13a7d41 --- /dev/null +++ b/branch/bicounty_emme/_modules/lasso/transit/index.html @@ -0,0 +1,1858 @@ + + + + + + lasso.transit — lasso documentation + + + + + + + + + + + + + + + + + + +

Source code for lasso.transit

+"""Transit-related classes to parse, compare, and write standard and cube transit files.
+
+  Typical usage example:
+
+    tn = CubeTransit.create_from_cube(CUBE_DIR)
+    transit_change_list = tn.evaluate_differences(base_transit_network)
+
+    cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+    cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+"""
+import os
+import copy
+import csv
+import datetime, time
+from typing import Any, Dict, Optional, Union
+
+from lark import Lark, Transformer, v_args
+from pandas import DataFrame
+
+import pandas as pd
+import partridge as ptg
+import numpy as np
+
+from network_wrangler import TransitNetwork
+
+from .logger import WranglerLogger
+from .parameters import Parameters
+
+
[docs]class CubeTransit(object): + """Class for storing information about transit defined in Cube line + files. + + Has the capability to: + + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + + .. highlight:: python + + Typical usage example: + :: + tn = CubeTransit.create_from_cube(CUBE_DIR) + transit_change_list = tn.evaluate_differences(base_transit_network) + + Attributes: + lines (list): list of strings representing unique line names in + the cube network. + line_properties (dict): dictionary of line properties keyed by line name. Property + values are stored in a dictionary by property name. These + properties are directly read from the cube line files and haven't + been translated to standard transit values. + shapes (dict): dictionary of shapes + keyed by line name. Shapes stored as a pandas DataFrame of nodes with following columns: + - 'node_id' (int): positive integer of node id + - 'node' (int): node number, with negative indicating a non-stop + - 'stop' (boolean): indicates if it is a stop + - 'order' (int): order within this shape + program_type (str): Either PT or TRNBLD + parameters (Parameters): + Parameters instance that will be applied to this instance which + includes information about time periods and variables. + source_list (list): + List of cube line file sources that have been read and added. + diff_dict (dict): + """ + +
[docs] def __init__(self, parameters: Union[Parameters, dict] = {}): + """ + Constructor for CubeTransit + + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + WranglerLogger.debug("Creating a new Cube Transit instance") + + self.lines = [] + + self.line_properties = {} + self.shapes = {} + + self.program_type = None + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.source_list = [] + + self.diff_dict = Dict[str, Any]
+ +
[docs] def add_cube(self, transit_source: str): + """Reads a .lin file and adds it to existing TransitNetwork instance. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + """ + + """ + Figure out what kind of transit source it is + """ + + parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, debug="debug", parser="lalr") + + if "NAME=" in transit_source: + WranglerLogger.debug("reading transit source as string") + self.source_list.append("input_str") + parse_tree = parser.parse(transit_source) + elif os.path.isfile(transit_source): + print("reading: {}".format(transit_source)) + with open(transit_source) as file: + WranglerLogger.debug( + "reading transit source: {}".format(transit_source) + ) + self.source_list.append(transit_source) + parse_tree = parser.parse(file.read()) + elif os.path.isdir(transit_source): + import glob + + for lin_file in glob.glob(os.path.join(transit_source, "*.LIN")): + self.add_cube(lin_file) + return + else: + msg = "{} not a valid transit line string, directory, or file" + WranglerLogger.error(msg) + raise ValueError(msg) + + WranglerLogger.debug("finished parsing cube line file") + # WranglerLogger.debug("--Parse Tree--\n {}".format(parse_tree.pretty())) + transformed_tree_data = CubeTransformer().transform(parse_tree) + # WranglerLogger.debug("--Transformed Parse Tree--\n {}".format(transformed_tree_data)) + + _line_data = transformed_tree_data["lines"] + + line_properties_dict = {k: v["line_properties"] for k, v in _line_data.items()} + line_shapes_dict = {k: v["line_shape"] for k, v in _line_data.items()} + new_lines = list(line_properties_dict.keys()) + """ + Before adding lines, check to see if any are overlapping with existing ones in the network + """ + + overlapping_lines = set(new_lines) & set(self.lines) + if overlapping_lines: + msg = "Overlapping lines found when adding from {}. \nSource files:\n{}\n{} Overlapping Lines of {} total new lines.\n-->{}".format( + transit_source, + "\n - ".join(self.source_list), + len(new_lines), + len(overlapping_lines), + overlapping_lines, + ) + print(msg) + WranglerLogger.error(msg) + raise ValueError(msg) + + self.program_type = transformed_tree_data.get("program_type", None) + + self.lines += new_lines + self.line_properties.update(line_properties_dict) + self.shapes.update(line_shapes_dict) + + WranglerLogger.debug("Added lines to CubeTransit: \n".format(new_lines))
+ +
[docs] @staticmethod + def create_from_cube(transit_source: str, parameters: Optional[dict] = {}): + """ + Reads a cube .lin file and stores as TransitNetwork object. + + Args: + transit_source: a string or the directory of the cube line file to be parsed + + Returns: + A ::CubeTransit object created from the transit_source. + """ + + tn = CubeTransit(parameters) + tn.add_cube(transit_source) + + return tn
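Typical usage, mirroring the class docstring (the path is a placeholder for a directory of .LIN files, a single file, or a line string):

tn = CubeTransit.create_from_cube("examples/cube_transit")
print(tn.lines[:5])        # unique line names parsed from the source
print(tn.program_type)     # "PT" or "TRNBLD", per the class attributes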
+ +
[docs] def evaluate_differences(self, base_transit): + """ + 1. Identifies what routes need to be updated, deleted, or added + 2. For routes being added or updated, identify if the time periods + have changed or if there are multiples, and make duplicate lines if so + 3. Create project card dictionaries for each change. + + Args: + base_transit (CubeTransit): an instance of this class for the base condition + + Returns: + A list of dictionaries containing project card changes + required to evaluate the differences between the base network + and this transit network instance. + """ + transit_change_list = [] + + """ + Identify what needs to be evaluated + """ + lines_to_update = [l for l in self.lines if l in base_transit.lines] + lines_to_delete = [l for l in base_transit.lines if l not in self.lines] + lines_to_add = [l for l in self.lines if l not in base_transit.lines] + + project_card_changes = [] + + """ + Evaluate Property Updates + """ + + for line in lines_to_update: + WranglerLogger.debug( + "Finding differences in time periods for: {}".format(line) + ) + + """ + Find any additional time periods that might need to add or delete. + """ + base_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + base_transit.line_properties[line] + ) + ) + + try: + assert len(base_cube_time_period_numbers) == 1 + except: + msg = "Base network line {} should only have one time period per route, but {} found".format( + line, base_cube_time_period_numbers + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + base_cube_time_period_number = base_cube_time_period_numbers[0] + + build_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + + time_periods_to_add = [ + tp + for tp in build_cube_time_period_numbers + if tp not in base_cube_time_period_numbers + ] + + for tp in time_periods_to_add: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + time_periods_to_delete = [ + tp + for tp in base_cube_time_period_numbers + if tp not in build_cube_time_period_numbers + ] + + for tp in time_periods_to_delete: + lines_to_delete.append(line) + + WranglerLogger.debug("Evaluating differences in: {}".format(line)) + updated_properties = self.evaluate_route_property_differences( + self.line_properties[line], + base_transit.line_properties[line], + base_cube_time_period_number, + ) + updated_shapes = CubeTransit.evaluate_route_shape_changes( + self.shapes[line].node, base_transit.shapes[line].node + ) + if updated_properties: + update_prop_card_dict = self.create_update_route_card_dict( + line, updated_properties + ) + project_card_changes.append(update_prop_card_dict) + + if updated_shapes: + update_shape_card_dict = self.create_update_route_card_dict( + line, updated_shapes + ) + project_card_changes.append(update_shape_card_dict) + + """ + Evaluate Deletions + """ + for line in lines_to_delete: + delete_card_dict = self.create_delete_route_card_dict( + line, base_transit.line_properties[line] + ) + project_card_changes.append(delete_card_dict) + + """ + Evaluate Additions + + First assess if need to add multiple routes if there are multiple time periods + """ + for line in lines_to_add: + time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + self.line_properties[line] + ) + ) + if len(time_period_numbers) > 1: + for tp in time_period_numbers[1:]: + lines_to_add.append(self.add_additional_time_periods(tp, line)) + + for line in lines_to_add: + 
add_card_dict = self.create_add_route_card_dict(line) + project_card_changes.append(add_card_dict) + + return project_card_changes
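A round-trip sketch of the comparison workflow (directory names are placeholders; each returned entry is a project-card style change dictionary):

base_tn = CubeTransit.create_from_cube("base_transit")    # base .LIN files
build_tn = CubeTransit.create_from_cube("build_transit")  # build .LIN files
changes = build_tn.evaluate_differences(base_tn)
for change in changes:
    print(change["category"], change["facility"]["route_id"])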
+ +
[docs] def add_additional_time_periods( + self, new_time_period_number: int, orig_line_name: str + ): + """ + Copies a route to another cube time period with appropriate + values for time-period-specific properties. + + New properties are stored under the new name in: + - ::self.shapes + - ::self.line_properties + + Args: + new_time_period_number (int): cube time period number + orig_line_name(str): name of the originating line, from which + the new line will copy its properties. + + Returns: + Line name with new time period. + """ + WranglerLogger.debug( + "adding time periods {} to line {}".format( + new_time_period_number, orig_line_name + ) + ) + + ( + route_id, + _init_time_period, + agency_id, + direction_id, + ) = CubeTransit.unpack_route_name(orig_line_name) + new_time_period_name = self.parameters.cube_time_periods[new_time_period_number] + new_tp_line_name = CubeTransit.build_route_name( + route_id=route_id, + time_period=new_time_period_name, + agency_id=agency_id, + direction_id=direction_id, + ) + + try: + assert new_tp_line_name not in self.lines + except: + msg = "Trying to add a new time period {} to line {}, but constructed name {} is already in line list.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + WrangerLogger.error(msg) + raise ValueError(msg) + + # copy to a new line and add it to list of lines to add + self.line_properties[new_tp_line_name] = copy.deepcopy( + self.line_properties[orig_line_name] + ) + self.shapes[new_tp_line_name] = copy.deepcopy(self.shapes[orig_line_name]) + self.line_properties[new_tp_line_name]["NAME"] = new_tp_line_name + + """ + Remove entries that aren't for this time period from the new line's properties list. + """ + this_time_period_properties_list = [ + p + "[" + str(new_time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + self.line_properties[new_tp_line_name].pop(k, None) + + """ + Remove entries for time period from the original line's properties list. + """ + for k in this_time_period_properties_list: + self.line_properties[orig_line_name].pop(k, None) + + """ + Add new line to list of lines to add. + """ + WranglerLogger.debug( + "Adding new time period {} for line {} as {}.".format( + new_time_period_number, orig_line_name, new_tp_line_name + ) + ) + return new_tp_line_name
+ +
[docs] def create_update_route_card_dict(self, line: str, updated_properties_dict: dict): + """ + Creates a project card change formatted dictionary for updating + the line. + + Args: + line: name of line that is being updated + updated_properties_dict: dictionary of attributes to update as + 'property': <property name>, + 'set': <new property value> + + Returns: + A project card change-formatted dictionary for the attribute update. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.split("_")[-2].strip("d\"")), + "shape_id": line.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + "properties": updated_properties_dict, + } + WranglerLogger.debug( + "Updating {} route to changes:\n{}".format(line, str(update_card_dict)) + ) + + return update_card_dict
+ +
[docs] def create_delete_route_card_dict( + self, line: str, base_transit_line_properties_dict: dict + ): + """ + Creates a project card change formatted dictionary for deleting a line. + + Args: + line: name of line that is being deleted + base_transit_line_properties_dict: dictionary of cube-style + attribute values in order to find time periods and + start and end times. + + Returns: + A project card change-formatted dictionary for the route deletion. + """ + base_start_time_str, base_end_time_str = self.calculate_start_end_times( + base_transit_line_properties_dict + ) + + delete_card_dict = { + "category": "Delete Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('"')[-1]), + "start_time": base_start_time_str, + "end_time": base_end_time_str, + }, + } + WranglerLogger.debug( + "Deleting {} route to changes:\n{}".format(line, delete_card_dict) + ) + + return delete_card_dict
+ +
[docs] def create_add_route_card_dict(self, line: str): + """ + Creates a project card change formatted dictionary for adding + a route based on the information in self.route_properties for + the line. + + Args: + line: name of line that is being updated + + Returns: + A project card change-formatted dictionary for the route addition. + """ + start_time_str, end_time_str = self.calculate_start_end_times( + self.line_properties[line] + ) + + standard_properties = self.cube_properties_to_standard_properties( + self.line_properties[line] + ) + + routing_properties = { + "property": "routing", + "set": self.shapes[line]["node"].tolist(), + } + + add_card_dict = { + "category": "New Transit Service", + "facility": { + "route_id": line.split("_")[1], + "direction_id": int(line.strip('_')[-2]), + "start_time": start_time_str, + "end_time": end_time_str, + "agency_id": line.strip('_')[0], + }, + "properties": standard_properties + [routing_properties], + } + + WranglerLogger.debug( + "Adding {} route to changes:\n{}".format(line, add_card_dict) + ) + return add_card_dict
+ +
[docs] @staticmethod + def get_time_period_numbers_from_cube_properties(properties_list: list): + """ + Finds properties that are associated with time periods and the + returns the numbers in them. + + Args: + properties_list (list): list of all properties. + + Returns: + list of strings of the time period numbers found + """ + time_periods_list = [] + for p in properties_list: + if ("[" not in p) or ("]" not in p): + continue + tp_num = p.split("[")[1][0] + if tp_num and tp_num not in time_periods_list: + time_periods_list.append(tp_num) + return time_periods_list
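A quick example of the time-period scan; it iterates over the property names (dictionary keys in practice) and keeps the digit inside the brackets:

props = {"HEADWAY[1]": 10.0, "HEADWAY[2]": 15.0, "MODE": 5}
CubeTransit.get_time_period_numbers_from_cube_properties(props)
# -> ["1", "2"]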
+ +
[docs] @staticmethod + def build_route_name( + route_id: str = "", + time_period: str = "", + agency_id: str = 0, + direction_id: str = 1, + ): + """ + Create a route name by contatenating route, time period, agency, and direction + + Args: + route_id: i.e. 452-111 + time_period: i.e. pk + direction_id: i.e. 1 + agency_id: i.e. 0 + + Returns: + constructed line_name i.e. "0_452-111_452_pk1" + """ + + return ( + str(agency_id) + + "_" + + str(route_id) + + "_" + + str(route_id.split("-")[0]) + + "_" + + str(time_period) + + str(direction_id) + )
+ +
[docs] @staticmethod + def unpack_route_name(line_name: str): + """ + Unpacks route name into direction, route, agency, and time period info + + Args: + line_name (str): i.e. "0_452-111_452_pk1" + + Returns: + route_id (str): 452-111 + time_period (str): i.e. pk + direction_id (str) : i.e. 1 + agency_id (str) : i.e. 0 + """ + + line_name = line_name.strip('"') + + agency_id, route_id, _rtid, _tp_direction = line_name.split("_") + time_period = _tp_direction[0:-1] + direction_id = _tp_direction[-1] + + return route_id, time_period, agency_id, direction_id
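build_route_name and unpack_route_name mirror each other; a small round trip using the values from the docstrings:

name = CubeTransit.build_route_name(route_id="452-111", time_period="pk", agency_id=0, direction_id=1)
# name == "0_452-111_452_pk1"
CubeTransit.unpack_route_name(name)
# -> ("452-111", "pk", "0", "1")   i.e. route_id, time_period, agency_id, direction_id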
+ +
[docs] def calculate_start_end_times(self, line_properties_dict: dict): + """ + Calculate the start and end times of the property change + WARNING: Doesn't take care of discongruous time periods!!!! + + Args: + line_properties_dict: dictionary of cube-flavor properties for a transit line + """ + start_time_m = 24 * 60 + end_time_m = 0 * 60 + + WranglerLogger.debug( + "parameters.time_period_properties_list: {}".format( + self.parameters.time_period_properties_list + ) + ) + current_cube_time_period_numbers = ( + CubeTransit.get_time_period_numbers_from_cube_properties( + line_properties_dict + ) + ) + + WranglerLogger.debug( + "current_cube_time_period_numbers:{}".format( + current_cube_time_period_numbers + ) + ) + + for tp in current_cube_time_period_numbers: + time_period_name = self.parameters.cube_time_periods[tp] + WranglerLogger.debug("time_period_name:{}".format(time_period_name)) + _start_time, _end_time = self.parameters.time_period_to_time[ + time_period_name + ] + + # change from "HH:MM" to integer # of seconds + _start_time_m = (int(_start_time.split(":")[0]) * 60) + int( + _start_time.split(":")[1] + ) + _end_time_m = (int(_end_time.split(":")[0]) * 60) + int( + _end_time.split(":")[1] + ) + + # find bounding start and end times + if _start_time_m < start_time_m: + start_time_m = _start_time_m + if _end_time_m > end_time_m: + end_time_m = _end_time_m + + if start_time_m > end_time_m: + msg = "Start time ({}) is after end time ({})".format( + start_time_m, end_time_m + ) + #WranglerLogger.error(msg) + #raise ValueError(msg) + + start_time_str = "{:02d}:{:02d}".format(*divmod(start_time_m, 60)) + end_time_str = "{:02d}:{:02d}".format(*divmod(end_time_m, 60)) + return start_time_str, end_time_str
+ +
[docs] @staticmethod + def cube_properties_to_standard_properties(cube_properties_dict: dict): + """ + Converts cube style properties to standard properties. + + This is most pertinent to time-period specific variables like headway, + and varibles that have stnadard units like headway, which is minutes + in cube and seconds in standard format. + + Args: + cube_properties_dict: <cube style property name> : <property value> + + Returns: + A list of dictionaries with values for `"property": <standard + style property name>, "set" : <property value with correct units>` + + """ + standard_properties_list = [] + for k, v in cube_properties_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + change_item["set"] = v * 60 + else: + change_item["property"] = k + change_item["set"] = v + standard_properties_list.append(change_item) + + return standard_properties_list
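A short example of the unit conversion (headway minutes in cube become headway_secs in the standard format; other properties pass through unchanged):

CubeTransit.cube_properties_to_standard_properties({"HEADWAY[1]": 10, "MODE": 5})
# -> [{"property": "headway_secs", "set": 600}, {"property": "MODE", "set": 5}]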
+ +
[docs] def evaluate_route_property_differences( + self, + properties_build: dict, + properties_base: dict, + time_period_number: str, + absolute: bool = True, + validate_base: bool = False, + ): + """ + Checks if any values have been updated or added for a specific + route and creates project card entries for each. + + Args: + properties_build: ::<property_name>: <property_value> + properties_base: ::<property_name>: <property_value> + time_period_number: time period to evaluate + absolute: if True, will use `set` command rather than a change. If false, will automatically check the base value. Note that this only applies to the numeric values of frequency/headway + validate_base: if True, will add the `existing` line in the project card + + Returns: + transit_change_list (list): a list of dictionary values suitable for writing to a project card + `{ + 'property': <property_name>, + 'set': <set value>, + 'change': <change from existing value>, + 'existing': <existing value to check>, + }` + + """ + + # Remove time period specific values for things that aren't part of the time period in question + this_time_period_properties_list = [ + p + "[" + str(time_period_number) + "]" + ##todo parameterize all time period specific variables + for p in ["HEADWAY", "FREQ"] + ] + + not_this_tp_properties_list = list( + set(self.parameters.time_period_properties_list) + - set(this_time_period_properties_list) + ) + + for k in not_this_tp_properties_list: + properties_build.pop(k, None) + properties_base.pop(k, None) + + difference_dict = dict( + set(properties_build.items()) ^ set(properties_base.items()) + ) + + # Iterate through properties list to build difference project card list + + properties_list = [] + for k, v in difference_dict.items(): + change_item = {} + if any(i in k for i in ["HEADWAY", "FREQ"]): + change_item["property"] = "headway_secs" + + if absolute: + change_item["set"] = ( + v * 60 + ) # project cards are in secs, cube is in minutes + else: + change_item["change"] = ( + properties_build[k] - properties_base[k] + ) * 60 + if validate_base or not absolute: + change_item["existing"] = properties_base[k] * 60 + else: + change_item["property"] = k + change_item["set"] = v + if validate_base: + change_item["existing"] = properties_base[k] + + properties_list.append(change_item) + WranglerLogger.debug( + "Evaluated Route Changes: \n {})".format( + "\n".join(map(str, properties_list)) + ) + ) + return properties_list
+ +
[docs] @staticmethod + def evaluate_route_shape_changes( + shape_build: DataFrame, shape_base: DataFrame + ): + """ + Compares two route shapes and constructs returns list of changes + suitable for a project card. + + Args: + shape_build: DataFrame of the build-version of the route shape. + shape_base: dDataFrame of the base-version of the route shape. + + Returns: + List of shape changes formatted as a project card-change dictionary. + + """ + + if shape_build.equals(shape_base): + return None + + shape_change_list = [] + + base_node_list = shape_base.tolist() + build_node_list = shape_build.tolist() + + sort_len = max(len(base_node_list), len(build_node_list)) + + start_pos = None + end_pos = None + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + start_pos = i - 1 + break + if base_node_list[i] != build_node_list[i]: + start_pos = i + break + else: + continue + + j = -1 + for i in range(sort_len): + if (i == len(base_node_list)) | (i == len(build_node_list)): + end_pos = j + 1 + break + if base_node_list[j] != build_node_list[j]: + end_pos = j + break + else: + j -= 1 + + if start_pos or end_pos: + existing = base_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + set = build_node_list[ + (start_pos - 2 if start_pos > 1 else None) : ( + end_pos + 2 if end_pos < -2 else None + ) + ] + + shape_change_list.append( + {"property": "routing", "existing": existing, "set": set} + ) + + return shape_change_list
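A small illustration with hand-made node sequences (pandas Series, matching how evaluate_differences passes the shape node column); the changed span is padded with up to two unchanged anchor nodes on each side, which in this short example covers the whole list:

import pandas as pd

base = pd.Series([1, 2, 3, 4, 5])
build = pd.Series([1, 2, 6, 7, 5])
CubeTransit.evaluate_route_shape_changes(build, base)
# -> [{"property": "routing", "existing": [1, 2, 3, 4, 5], "set": [1, 2, 6, 7, 5]}]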
+ + +
[docs]class StandardTransit(object): + """Holds a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's + Cube Line files. + + .. highlight:: python + Typical usage example: + :: + cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) + cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) + + Attributes: + feed: Partridge Feed object containing read-only access to GTFS feed + parameters (Parameters): Parameters instance containing information + about time periods and variables. + """ + +
[docs] def __init__(self, ptg_feed, parameters: Union[Parameters, dict] = {}): + """ + + Args: + ptg_feed: partridge feed object + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters + """ + self.feed = ptg_feed + + if type(parameters) is dict: + self.parameters = Parameters(**parameters) + elif isinstance(parameters, Parameters): + self.parameters = Parameters(**parameters.__dict__) + else: + msg = "Parameters should be a dict or instance of Parameters: found {} which is of type:{}".format( + parameters, type(parameters) + ) + WranglerLogger.error(msg) + raise ValueError(msg)
+ +
[docs] @staticmethod + def fromTransitNetwork( + transit_network_object: TransitNetwork, parameters: Union[Parameters, dict] = {} + ): + """ + RoadwayNetwork to ModelRoadwayNetwork + + Args: + transit_network_object: Reference to an instance of TransitNetwork. + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit + """ + return StandardTransit(transit_network_object.feed, parameters=parameters)
+ +
[docs] @staticmethod + def read_gtfs(gtfs_feed_dir: str, parameters: Union[Parameters, dict] = {}): + """ + Reads GTFS files from a directory and returns a StandardTransit + instance. + + Args: + gtfs_feed_dir: location of the GTFS files + parameters: dictionary of parameter settings (see Parameters class) or an instance of Parameters. If not provided will + use default parameters. + + Returns: + StandardTransit instance + """ + return StandardTransit(ptg.load_feed(gtfs_feed_dir), parameters=parameters)
+ +
[docs] def write_as_cube_lin(self, outpath: str = None): + """ + Writes the gtfs feed as a cube line file after + converting gtfs properties to MetCouncil cube properties. + #MC + Args: + outpath: File location for output cube line file. + + """ + if not outpath: + outpath = os.path.join(self.parameters.scratch_location, "outtransit.lin") + trip_cube_df = self.route_properties_gtfs_to_cube(self) + + trip_cube_df["LIN"] = trip_cube_df.apply(self.cube_format, axis=1) + + l = trip_cube_df["LIN"].tolist() + + with open(outpath, "w") as f: + f.write("\n".join(l))
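Typical usage, echoing the class docstring (paths are placeholders):

cube_transit_net = StandardTransit.read_gtfs("data/gtfs_feed")
cube_transit_net.write_as_cube_lin("outputs/transit.lin")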
+ +
[docs] @staticmethod + def route_properties_gtfs_to_cube(self): + """ + Prepare gtfs for cube lin file. + #MC + Does the following operations: + 1. Combines route, frequency, trip, and shape information + 2. Converts time of day to time periods + 3. Calculates cube route name from gtfs route name and properties + 4. Assigns a cube-appropriate mode number + 5. Assigns a cube-appropriate operator number + + Returns: + trip_df (DataFrame): DataFrame of trips with cube-appropriate values for: + - NAME + - ONEWAY + - OPERATOR + - MODE + - HEADWAY + """ + WranglerLogger.info( + "Converting GTFS Standard Properties to MetCouncil's Cube Standard" + ) + metro_operator_dict = { + "0": 3, + "1": 3, + "2": 3, + "3": 4, + "4": 2, + "5": 5, + "6": 8, + "7": 1, + "8": 1, + "9": 10, + "10": 3, + "11": 9, + "12": 3, + "13": 4, + "14": 4, + "15": 3, + } + + shape_df = self.feed.shapes.copy() + trip_df = self.feed.trips.copy() + + """ + Add information from: routes, frequencies, and routetype to trips_df + """ + trip_df = pd.merge(trip_df, self.feed.routes, how="left", on="route_id") + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + inv_cube_time_periods_map = { + v: k for k, v in self.parameters.cube_time_periods.items() + } + trip_df["tod_num"] = trip_df.tod_name.map(inv_cube_time_periods_map) + trip_df["tod_name"] = trip_df.tod_name.map( + self.parameters.cube_time_periods_name + ) + + trip_df["NAME"] = trip_df.apply( + lambda x: x.agency_id + + "_" + + x.route_id + + "_" + + x.route_short_name + + "_" + + x.tod_name + + str(x.direction_id), + axis=1, + ) + + trip_df["LONGNAME"] = trip_df["route_long_name"] + trip_df["HEADWAY"] = (trip_df["headway_secs"] / 60).astype(int) + trip_df["MODE"] = trip_df.apply(self.calculate_cube_mode, axis=1) + trip_df["ONEWAY"] = "T" + trip_df["OPERATOR"] = trip_df["agency_id"].map(metro_operator_dict) + + return trip_df
+ +
[docs] def calculate_cube_mode(self, row): + """ + Assigns a cube mode number by following logic. + #MC + For rail, uses GTFS route_type variable: + https://developers.google.com/transit/gtfs/reference + + :: + # route_type : cube_mode + route_type_to_cube_mode = {0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9} # Rail + + For buses, uses route id numbers and route name to find + express and suburban buses as follows: + + :: + if not cube_mode: + if 'express' in row['LONGNAME'].lower(): + cube_mode = 7 # Express + elif int(row['route_id'].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + Args: + row: A DataFrame row with route_type, route_long_name, and route_id + + Returns: + cube mode number + """ + # route_type : cube_mode + route_type_to_cube_mode = { + 0: 8, # Tram, Streetcar, Light rail + 3: 0, # Bus; further disaggregated for cube + 2: 9, + } # Rail + + cube_mode = route_type_to_cube_mode[row["route_type"]] + + if not cube_mode: + if "express" in row["route_long_name"].lower(): + cube_mode = 7 # Express + elif int(row["route_id"].split("-")[0]) > 99: + cube_mode = 6 # Suburban Local + else: + cube_mode = 5 # Urban Local + + return cube_mode
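A worked example of the mode assignment (hypothetical rows; std_net is any StandardTransit instance, since the method only reads the row):

import pandas as pd

rows = pd.DataFrame({
    "route_type": [0, 3],
    "route_long_name": ["Green Line", "Express to downtown"],
    "route_id": ["901", "250-75"],
})
rows.apply(std_net.calculate_cube_mode, axis=1)
# -> 8 for the light-rail row, 7 for the bus row whose long name contains "express"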
+ +
[docs] def time_to_cube_time_period( + self, start_time_secs: int, as_str: bool = True, verbose: bool = False + ): + """ + Converts seconds from midnight to the cube time period. + + Args: + start_time_secs: start time for transit trip in seconds + from midnight + as_str: if True, returns the time period as a string, + otherwise returns a numeric time period + + Returns: + this_tp_num: if as_str is False, returns the numeric + time period + this_tp: if as_str is True, returns the Cube time period + name abbreviation + """ + from .util import hhmmss_to_datetime, secs_to_datetime + + # set initial time as the time that spans midnight + + start_time_dt = secs_to_datetime(start_time_secs) + + # set initial time as the time that spans midnight + this_tp = "NA" + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + _dt_end_time = hhmmss_to_datetime(_end_time) + if _dt_start_time > _dt_end_time: + this_tp = tp_name + break + + for tp_name, _times in self.parameters.time_period_to_time.items(): + _start_time, _end_time = _times + _dt_start_time = hhmmss_to_datetime(_start_time) + if start_time_dt >= _dt_start_time: + this_time = _dt_start_time + this_tp = tp_name + + if verbose: + WranglerLogger.debug( + "Finding Cube Time Period from Start Time: \ + \n - start_time_sec: {} \ + \n - start_time_dt: {} \ + \n - this_tp: {}".format( + start_time_secs, start_time_dt, this_tp + ) + ) + + if as_str: + return this_tp + + name_to_num = {v: k for k, v in self.parameters.cube_time_periods.items()} + this_tp_num = name_to_num.get(this_tp) + + if not this_tp_num: + msg = ( + "Cannot find time period number in {} for time period name: {}".format( + name_to_num, this_tp + ) + ) + WranglerLogger.error(msg) + raise ValueError(msg) + + return this_tp_num
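A usage sketch (the actual period returned depends on parameters.time_period_to_time, so the comments are only indicative):

# 8:30 AM expressed as seconds after midnight
std_net.time_to_cube_time_period(8 * 3600 + 30 * 60)                # e.g. the AM period abbreviation
std_net.time_to_cube_time_period(8 * 3600 + 30 * 60, as_str=False)  # corresponding numeric period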
+ +
[docs] def shape_gtfs_to_cube(self, row, add_nntime = False): + """ + Creates a list of nodes that for the route in appropriate + cube format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a string representation of the node list + for a route in cube format. + + """ + agency_raw_name = row.agency_raw_name + shape_id = row.shape_id + trip_id = row.trip_id + + trip_stop_times_df = self.feed.stop_times.copy() + + if 'agency_raw_name' in trip_stop_times_df.columns: + trip_stop_times_df.drop('agency_raw_name', axis = 1, inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, + self.feed.trips[['trip_id', 'agency_raw_name']], + how = 'left', + on = ['trip_id'] + ) + + trip_stop_times_df = trip_stop_times_df[ + (trip_stop_times_df.trip_id == row.trip_id) & + (trip_stop_times_df.agency_raw_name == agency_raw_name) + ] + + trip_node_df = self.feed.shapes.copy() + if 'agency_raw_name' in trip_node_df.columns: + trip_node_df.drop('agency_raw_name', axis = 1, inplace = True) + + trip_node_df = pd.merge( + trip_node_df, + self.feed.trips[['shape_id', 'agency_raw_name']].drop_duplicates(), + how = 'left', + on = ['shape_id'] + ) + + trip_node_df = trip_node_df[ + (trip_node_df.shape_id == shape_id) & + (trip_node_df.agency_raw_name == agency_raw_name) + ] + + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + stops_df = self.feed.stops.copy() + stops_df['stop_id'] = stops_df['stop_id'].astype(float).astype(int) + trip_stop_times_df['stop_id'] = trip_stop_times_df['stop_id'].astype(float).astype(int) + + if 'trip_id' in self.feed.stops.columns: + if agency_raw_name != 'sjrtd_2015_0127': + stops_df = stops_df[stops_df.agency_raw_name != 'sjrtd_2015_0127'] + trip_stop_times_df = pd.merge( + trip_stop_times_df, stops_df.drop('trip_id', axis = 1), how="left", on=["stop_id"] + ) + else: + stops_df = stops_df[stops_df.agency_raw_name == 'sjrtd_2015_0127'] + stops_df['trip_id'] = stops_df['trip_id'].astype(float).astype(int).astype(str) + trip_stop_times_df['trip_id'] = trip_stop_times_df['trip_id'].astype(float).astype(int).astype(str) + trip_stop_times_df = pd.merge( + trip_stop_times_df, stops_df, how="left", on=['agency_raw_name', 'trip_id',"stop_id"] + ) + else: + trip_stop_times_df = pd.merge( + trip_stop_times_df, stops_df, how="left", on="stop_id" + ) + + trip_stop_times_df["model_node_id"] = pd.to_numeric(trip_stop_times_df["model_node_id"]).astype(int) + trip_node_df["shape_model_node_id"] = pd.to_numeric(trip_node_df["shape_model_node_id"]).astype(int) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. 
VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # ACCESS + def _access_type(x): + if (x.pickup_type in [1, "1"]): + return 2 + elif (x.drop_off_type in [1, "1"]): + return 1 + else: + return 0 + + trip_stop_times_df["ACCESS"] = trip_stop_times_df.apply(lambda x: _access_type(x), axis = 1) + + trip_runtime = round(trip_stop_times_df[trip_stop_times_df['NNTIME'] > 0]['NNTIME'].sum(),2) + + # node list + node_list_str = "" + stop_seq = 0 + for nodeIdx in range(len(trip_node_list)): + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + if nntime_v > 0: + nntime = ", NNTIME=%s" % (nntime_v) + else: + nntime = "" + else: + nntime = "" + + access_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"ACCESS"].iloc[0] + if access_v > 0: + access = ", ACCESS=%s" % (access_v) + else: + access = "" + + node_list_str += "\n %s%s%s" % (trip_node_list[nodeIdx], nntime, access) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + if ((add_nntime) & (stop_seq > 1) & (len(nntime) > 0)) | (len(access) > 0): + node_list_str += " N=" + else: + node_list_str += "\n -%s" % (trip_node_list[nodeIdx]) + if nodeIdx < (len(trip_node_list) - 1): + node_list_str += "," + + # remove NNTIME = 0 + node_list_str = node_list_str.replace(" NNTIME=0.0, N=", "") + node_list_str = node_list_str.replace(" NNTIME=0.0,", "") + + return node_list_str, trip_runtime
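Stripped of the NNTIME and ACCESS handling, the node-list step above amounts to writing stop nodes as positive IDs and pass-through shape nodes with a leading minus sign; a minimal sketch::

    def sketch_node_list(shape_nodes, stop_nodes):
        pieces = []
        for n in shape_nodes:
            # stops keep their node number, non-stops are negated
            pieces.append(" %s" % n if n in stop_nodes else " -%s" % n)
        return ",\n".join(pieces)

    print(sketch_node_list([101, 102, 103, 104], {101, 103, 104}))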
+ + +
[docs]    def cube_format(self, row):
+        """
+        Creates a string representing the route in cube line file notation.
+        #MC
+        Args:
+            row: row of a DataFrame representing a cube-formatted trip, with the attributes
+                trip_id, shape_id, NAME, LONGNAME, tod_num, HEADWAY, MODE, ONEWAY, OPERATOR
+
+        Returns:
+            string representation of the route in cube line file notation
+        """
+
+        s = '\nLINE NAME="{}",'.format(row.NAME)
+        s += '\n LONGNAME="{}",'.format(row.LONGNAME)
+        s += "\n HEADWAY[{}]={},".format(row.tod_num, row.HEADWAY)
+        s += "\n MODE={},".format(row.MODE)
+        s += "\n ONEWAY={},".format(row.ONEWAY)
+        s += "\n OPERATOR={},".format(row.OPERATOR)
+        s += "\n NODES={}".format(self.shape_gtfs_to_cube(row))
+
+        return s
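For an invented cube-formatted row (and a stand-in node block in place of shape_gtfs_to_cube), the assembled LINE string looks like this::

    from types import SimpleNamespace

    row = SimpleNamespace(NAME="0_452_452_pk1", LONGNAME="Express 452",
                          tod_num=1, HEADWAY=10, MODE=7, ONEWAY="T", OPERATOR=3)
    node_block = "\n 39249,\n -39240,\n 54648"  # stand-in for the node list

    s = '\nLINE NAME="{}",'.format(row.NAME)
    s += '\n LONGNAME="{}",'.format(row.LONGNAME)
    s += "\n HEADWAY[{}]={},".format(row.tod_num, row.HEADWAY)
    s += "\n MODE={},".format(row.MODE)
    s += "\n ONEWAY={},".format(row.ONEWAY)
    s += "\n OPERATOR={},".format(row.OPERATOR)
    s += "\n NODES={}".format(node_block)
    print(s)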
+ +
[docs] def shape_gtfs_to_emme(self, trip_row): + """ + Creates transit segment for the trips in appropriate + emme format. + + Args: + row: DataFrame row with both shape_id and trip_id + + Returns: a dataframe representation of the transit segment + for a trip in emme format. + + """ + trip_stop_times_df = self.feed.stop_times.copy() + trip_stop_times_df = trip_stop_times_df[ + trip_stop_times_df.trip_id == trip_row.trip_id + ] + + trip_node_df = self.feed.shapes.copy() + trip_node_df = trip_node_df[trip_node_df.shape_id == trip_row.shape_id] + trip_node_df.sort_values(by = ["shape_pt_sequence"], inplace = True) + + trip_stop_times_df = pd.merge( + trip_stop_times_df, self.feed.stops, how="left", on="stop_id" + ) + + stop_node_id_list = trip_stop_times_df["model_node_id"].tolist() + trip_node_list = trip_node_df["shape_model_node_id"].tolist() + + trip_stop_times_df.sort_values(by = ["stop_sequence"], inplace = True) + # sometimes GTFS `stop_sequence` does not start with 1, e.g. SFMTA light rails + trip_stop_times_df["internal_stop_sequence"] = range(1, 1+len(trip_stop_times_df)) + # sometimes GTFS `departure_time` is not recorded for every stop, e.g. VTA light rails + trip_stop_times_df["departure_time"].fillna(method = "ffill", inplace = True) + trip_stop_times_df["departure_time"].fillna(0, inplace = True) + trip_stop_times_df["NNTIME"] = trip_stop_times_df["departure_time"].diff() / 60 + # CUBE NNTIME takes 2 decimals + trip_stop_times_df["NNTIME"] = trip_stop_times_df["NNTIME"].round(2) + trip_stop_times_df["NNTIME"].fillna(-1, inplace = True) + + # node list + stop_seq = 0 + nntimes = [] + allow_alightings=[] + allow_boardings=[] + stop_names=[] + + if trip_row.TM2_line_haul_name in ["Light rail", "Heavy rail", "Commuter rail", "Ferry service"]: + add_nntime = True + else: + add_nntime = False + + for nodeIdx in range(len(trip_node_list)): + + if trip_node_list[nodeIdx] in stop_node_id_list: + # in case a route stops at a stop more than once, e.g. 
circular route + stop_seq += 1 + + if (add_nntime) & (stop_seq > 1): + if len(trip_stop_times_df[ + trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]]) > 1: + + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]) & + (trip_stop_times_df["internal_stop_sequence"] == stop_seq), + "NNTIME"].iloc[0] + else: + nntime_v = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"NNTIME"].iloc[0] + + nntimes.append(nntime_v) + else: + nntimes.append(0) + + pickup_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"pickup_type"].iloc[0] + if pickup_type in [1, "1"]: + allow_alightings.append(0) + else: + allow_alightings.append(1) + + drop_off_type = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"drop_off_type"].iloc[0] + if drop_off_type in [1, "1"]: + allow_boardings.append(0) + else: + allow_boardings.append(1) + + stop_name = trip_stop_times_df.loc[ + (trip_stop_times_df["model_node_id"] == trip_node_list[nodeIdx]),"stop_name"].iloc[0] + stop_names.append(stop_name) + + else: + nntimes.append(0) + allow_alightings.append(0) + allow_boardings.append(0) + stop_names.append("") + + trip_node_df['time_minutes'] = nntimes + trip_node_df['allow_alightings'] = allow_alightings + trip_node_df['allow_boardings'] = allow_boardings + trip_node_df['stop_name'] = stop_names + trip_node_df['line_id'] = trip_row['line_id'] + trip_node_df['node_id'] = trip_node_df['shape_model_node_id'].astype(int) + trip_node_df['stop_order'] = trip_node_df['shape_pt_sequence'] + + return trip_node_df
+ +
[docs] def evaluate_differences(self, transit_changes): + """ + Compare changes from the transit_changes dataframe with the standard transit network + returns the project card changes in dictionary format + """ + + # simple properties change + trip_df = self.feed.trips.copy() + + mode_crosswalk = pd.read_csv(self.parameters.mode_crosswalk_file) + mode_crosswalk.drop_duplicates(subset = ["agency_raw_name", "route_type", "is_express_bus"], inplace = True) + + trip_df = pd.merge(trip_df, self.feed.routes.drop("agency_raw_name", axis = 1), how="left", on="route_id") + + trip_df = pd.merge(trip_df, self.feed.frequencies, how="left", on="trip_id") + + trip_df["tod"] = trip_df.start_time.apply(self.time_to_cube_time_period, as_str = False) + trip_df["tod_name"] = trip_df.start_time.apply(self.time_to_cube_time_period) + + trip_df["headway_minutes"] = (trip_df["headway_secs"] / 60).astype(int) + + trip_df = pd.merge(trip_df, self.feed.agency[["agency_name", "agency_raw_name", "agency_id"]], how = "left", on = ["agency_raw_name", "agency_id"]) + + # identify express bus + from .mtc import _is_express_bus + trip_df["is_express_bus"] = trip_df.apply(lambda x: _is_express_bus(x), axis = 1) + trip_df.drop("agency_name", axis = 1 , inplace = True) + + trip_df = pd.merge( + trip_df, + mode_crosswalk.drop("agency_id", axis = 1), + how = "left", + on = ["agency_raw_name", "route_type", "is_express_bus"] + ) + + trip_df["line_id"] = trip_df.apply( + lambda x: str(x.TM2_operator) + + "_" + + str(x.route_id) + + "_" + + x.tod_name + + "_" + + "d" + + str(int(x.direction_id)) + + "_s" + + x.shape_id, + axis=1, + ) + + trip_df["line_id"] = trip_df["line_id"].str.slice(stop = 28) + + project_card_changes = [] + + # lines updated + transit_changes['line_id'] = transit_changes.apply( + lambda x: '-'.join(x['element_id'].split('-')[:-3]) if + x['object'] == 'TRANSIT_STOP' else + x['element_id'], + axis = 1 + ) + + lines_updated_df = transit_changes[ + (transit_changes['operation'] == 'C') & + (transit_changes['line_id'].isin(trip_df['line_id'].tolist())) + ].copy() + + ######################### + # simple property changes + ######################### + + property_changes_df = lines_updated_df[ + lines_updated_df.object == 'TRANSIT_LINE' + ].copy() + + property_attribute_list = ['headway_secs'] + + for index, row in property_changes_df.iterrows(): + line_id = row['line_id'] + properties_list = [] + change_item = {} + for c in property_attribute_list: + existing_value = int(trip_df[ + trip_df['line_id'] == line_id + ][c].iloc[0]) + + change_item["existing"] = existing_value + + if c == 'headway_secs': + change_item["set"] = row['headway'] * 60 + else: + change_item["set"] = row[c] + + change_item["property"] = c + + properties_list.append(change_item) + + property_changes_df.loc[index, 'properties'] = properties_list + + ############### + # shape changes + ############### + + shape_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_SHAPE']) + ].copy() + + for index, row in shape_changes_df.iterrows(): + line_id = row.line_id + + # get base shape + trip_row = trip_df[trip_df.line_id == line_id].copy().squeeze() + + base_shape = self.shape_gtfs_to_emme( + trip_row=trip_row + ) + base_shape['shape_model_node_id'] = base_shape['shape_model_node_id'].astype(int) + + # get build shape + build_shape = row.new_itinerary + + updated_shapes = CubeTransit.evaluate_route_shape_changes( + shape_base = base_shape.shape_model_node_id, + shape_build = pd.Series(row.new_itinerary) + ) + 
updated_shapes[0]['property'] = 'shapes' + shape_changes_df.loc[index, 'properties'] = updated_shapes + + ############## + # stop changes + ############## + stop_changes_df = lines_updated_df[ + lines_updated_df.object.isin(['TRANSIT_STOP']) + ].copy() + + stop_attribute_list = ['allow_alightings', 'allow_boardings'] + + stop_changes_df = stop_changes_df.groupby( + ['line_id','i_node'] + )[stop_attribute_list].last().reset_index() + + stop_attribute_changes_df = pd.DataFrame() + + for attribute in stop_attribute_list: + + attribute_df = stop_changes_df.groupby( + ['line_id', attribute] + )['i_node'].apply(list).reset_index() + attribute_df['properties'] = attribute_df.apply( + lambda x: { + 'property' : attribute if x[attribute] == True else 'no_'+attribute.split('_')[-1], + 'set': x['i_node']}, + axis = 1 + ) + + stop_attribute_changes_df = pd.concat( + [stop_attribute_changes_df, + attribute_df[['line_id', 'properties']]], + sort = False, + ignore_index = True + ) + + ############## + # combine all transit changes + ############## + transit_changes_df = pd.concat( + [ + property_changes_df, + shape_changes_df, + stop_attribute_changes_df + ], + sort = False, + ignore_index = True + ) + + # groupby line_id + transit_changes_df = transit_changes_df.groupby( + ['line_id'] + )['properties'].apply(list).reset_index() + + # create change items by line_id + for index, row in transit_changes_df.iterrows(): + line_id = row['line_id'] + base_start_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[0] + + base_end_time_str = self.parameters.time_period_to_time.get( + line_id.split("_")[2] + )[1] + + update_card_dict = { + "category": "Transit Service Property Change", + "facility": { + "route_id": line_id.split("_")[1], + "direction_id": int(line_id.split("_")[-2].strip("d\"")), + "shape_id": line_id.split("_")[-1].strip("s\""), + "start_time": base_start_time_str, + "end_time": base_end_time_str + }, + "properties": row['properties'], + } + + project_card_changes.append(update_card_dict) + + return project_card_changes
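Each entry in the returned list is a project card change dictionary; a representative (entirely invented) entry for a headway change has this shape::

    example_change = {
        "category": "Transit Service Property Change",
        "facility": {
            "route_id": "452",
            "direction_id": 0,
            "shape_id": "10023",
            "start_time": "06:00",
            "end_time": "10:00",
        },
        "properties": [
            {"property": "headway_secs", "existing": 600, "set": 900},
        ],
    }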
+ +class CubeTransformer(Transformer): + """A lark-parsing Transformer which transforms the parse-tree to + a dictionary. + + .. highlight:: python + Typical usage example: + :: + transformed_tree_data = CubeTransformer().transform(parse_tree) + + Attributes: + line_order (int): a dynamic counter to hold the order of the nodes within + a route shape + lines_list (list): a list of the line names + """ + + def __init__(self): + self.line_order = 0 + self.lines_list = [] + + def lines(self, line): + # WranglerLogger.debug("lines: \n {}".format(line)) + + # This MUST be a tuple because it returns to start in the tree + lines = {k: v for k, v in line} + return ("lines", lines) + + @v_args(inline=True) + def program_type_line(self, PROGRAM_TYPE, whitespace=None): + # WranglerLogger.debug("program_type_line:{}".format(PROGRAM_TYPE)) + self.program_type = PROGRAM_TYPE.value + + # This MUST be a tuple because it returns to start in the tree + return ("program_type", PROGRAM_TYPE.value) + + @v_args(inline=True) + def line(self, lin_attributes, nodes): + # WranglerLogger.debug("line...attributes:\n {}".format(lin_attributes)) + # WranglerLogger.debug("line...nodes:\n {}".format(nodes)) + lin_name = lin_attributes["NAME"] + + self.line_order = 0 + # WranglerLogger.debug("parsing: {}".format(lin_name)) + + return (lin_name, {"line_properties": lin_attributes, "line_shape": nodes}) + + @v_args(inline=True) + def lin_attributes(self, *lin_attr): + lin_attr = {k: v for (k, v) in lin_attr} + # WranglerLogger.debug("lin_attributes: {}".format(lin_attr)) + return lin_attr + + @v_args(inline=True) + def lin_attr(self, lin_attr_name, attr_value, SEMICOLON_COMMENT=None): + # WranglerLogger.debug("lin_attr {}: {}".format(lin_attr_name, attr_value)) + return lin_attr_name, attr_value + + def lin_attr_name(self, args): + attr_name = args[0].value.upper() + # WranglerLogger.debug(".......args {}".format(args)) + if attr_name in ["FREQ", "HEADWAY"]: + attr_name = attr_name + "[" + str(args[2]) + "]" + return attr_name + + def attr_value(self, attr_value): + try: + return int(attr_value[0].value) + except: + return attr_value[0].value + + def nodes(self, lin_node): + lin_node = DataFrame(lin_node) + # WranglerLogger.debug("nodes:\n {}".format(lin_node)) + + return lin_node + + @v_args(inline=True) + def lin_node(self, NODE_NUM, SEMICOLON_COMMENT=None, *lin_nodeattr): + self.line_order += 1 + n = int(NODE_NUM.value) + return {"node_id": abs(n), "node": n, "stop": n > 0, "order": self.line_order} + + start = dict + + +TRANSIT_LINE_FILE_GRAMMAR = r""" + +start : program_type_line? lines +WHITESPACE : /[ \t\r\n]/+ +STRING : /("(?!"").*?(?<!\\)(\\\\)*?"|'(?!'').*?(?<!\\)(\\\\)*?')/i +SEMICOLON_COMMENT : /;[^\n]*/ +BOOLEAN : "T"i | "F"i +program_type_line : ";;<<" PROGRAM_TYPE ">><<LINE>>;;" WHITESPACE? +PROGRAM_TYPE : "PT" | "TRNBUILD" + +lines : line* +line : "LINE" lin_attributes nodes + +lin_attributes : lin_attr+ +lin_attr : lin_attr_name "=" attr_value "," SEMICOLON_COMMENT* +TIME_PERIOD : "1".."5" +!lin_attr_name : "allstops"i + | "color"i + | ("freq"i "[" TIME_PERIOD "]") + | ("headway"i "[" TIME_PERIOD "]") + | "mode"i + | "name"i + | "oneway"i + | "owner"i + | "runtime"i + | "timefac"i + | "xyspeed"i + | "longname"i + | "shortname"i + | ("usera1"i) + | ("usera2"i) + | "circular"i + | "vehicletype"i + | "operator"i + | "faresystem"i + +attr_value : BOOLEAN | STRING | SIGNED_INT | FLOAT + +nodes : lin_node+ +lin_node : ("N" | "NODES")? "="? NODE_NUM ","? SEMICOLON_COMMENT? 
lin_nodeattr* +NODE_NUM : SIGNED_INT +lin_nodeattr : lin_nodeattr_name "=" attr_value ","? SEMICOLON_COMMENT* +!lin_nodeattr_name : "access_c"i + | "access"i + | "delay"i + | "xyspeed"i + | "timefac"i + | "nntime"i + | "time"i + +operator : SEMICOLON_COMMENT* "OPERATOR" opmode_attr* SEMICOLON_COMMENT* +mode : SEMICOLON_COMMENT* "MODE" opmode_attr* SEMICOLON_COMMENT* +opmode_attr : ( (opmode_attr_name "=" attr_value) ","? ) +opmode_attr_name : "number" | "name" | "longname" + +%import common.SIGNED_INT +%import common.FLOAT +%import common.WS +%ignore WS + +""" +
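A small sketch of running a one-line .lin snippet through this grammar and transformer; it assumes the lark package is installed and that CubeTransformer and TRANSIT_LINE_FILE_GRAMMAR are importable from the transit module shown here (lasso.transit in the upstream layout)::

    from lark import Lark
    from lasso.transit import CubeTransformer, TRANSIT_LINE_FILE_GRAMMAR

    lin_text = '''
    LINE NAME="0_452_452_pk1", MODE=5, HEADWAY[1]=10, ONEWAY=T, OPERATOR=3,
     N=39249, -39240, 54648
    '''

    parser = Lark(TRANSIT_LINE_FILE_GRAMMAR, parser="lalr")
    tree = parser.parse(lin_text)
    data = CubeTransformer().transform(tree)
    print(data["lines"].keys())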
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_modules/lasso/util/index.html b/branch/bicounty_emme/_modules/lasso/util/index.html new file mode 100644 index 0000000..f82d773 --- /dev/null +++ b/branch/bicounty_emme/_modules/lasso/util/index.html @@ -0,0 +1,256 @@ + + + + + + lasso.util — lasso documentation + + + + + + + + + + + + + + + + + + +

Source code for lasso.util

+from functools import partial
+import pyproj
+from shapely.ops import transform
+from shapely.geometry import Point, Polygon
+import re
+from unidecode import unidecode
+from network_wrangler import WranglerLogger  # referenced by column_name_to_parts below
+
+
[docs]def get_shared_streets_intersection_hash(lat, long, osm_node_id=None):
+    """
+    Calculates the SharedStreets intersection hash, per:
+    https://github.com/sharedstreets/sharedstreets-js/blob/0e6d7de0aee2e9ae3b007d1e45284b06cc241d02/src/index.ts#L553-L565
+
+    Expected input/output:
+    lon = -93.0965985, lat = 44.952112199999995, osm_node_id = 954734870
+    -> 69f13f881649cb21ee3b359730790bb9
+    """
+    import hashlib
+
+    message = "Intersection {0:.5f} {0:.5f}".format(long, lat)
+    if osm_node_id:
+        message += " {}".format(osm_node_id)
+    unhashed = message.encode("utf-8")
+    hash = hashlib.md5(unhashed).hexdigest()
+    return hash
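A quick usage check with the coordinates from the docstring (assumes lasso is installed; the hash is printed rather than asserted)::

    from lasso.util import get_shared_streets_intersection_hash

    print(get_shared_streets_intersection_hash(44.952112199999995, -93.0965985, 954734870))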
+ + +
[docs]def hhmmss_to_datetime(hhmmss_str: str): + """ + Creates a datetime time object from a string of hh:mm:ss + + Args: + hhmmss_str: string of hh:mm:ss + Returns: + dt: datetime.time object representing time + """ + import datetime + + dt = datetime.time(*[int(i) for i in hhmmss_str.split(":")]) + + return dt
+ + +
[docs]def secs_to_datetime(secs: int):
+    """
+    Creates a datetime time object from seconds after midnight
+
+    Args:
+        secs: seconds from midnight
+    Returns:
+        dt: datetime.time object representing time
+    """
+    import datetime
+
+    dt = (datetime.datetime.min + datetime.timedelta(seconds=secs)).time()
+
+    return dt
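Quick usage of the two time helpers above (assumes lasso is importable)::

    from lasso.util import hhmmss_to_datetime, secs_to_datetime

    print(hhmmss_to_datetime("06:30:00"))      # 06:30:00
    print(secs_to_datetime(6 * 3600 + 1800))   # 06:30:00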
+ + +
[docs]def geodesic_point_buffer(lat, lon, meters):
+    """
+    Creates a circular buffer polygon around a node
+
+    Args:
+        lat: node latitude
+        lon: node longitude
+        meters: buffer distance (radius of the circle)
+    Returns:
+        Polygon
+    """
+    proj_wgs84 = pyproj.Proj('+proj=longlat +datum=WGS84')
+    # Azimuthal equidistant projection centered on the node
+    aeqd_proj = '+proj=aeqd +lat_0={lat} +lon_0={lon} +x_0=0 +y_0=0'
+    project = partial(
+        pyproj.transform,
+        pyproj.Proj(aeqd_proj.format(lat=lat, lon=lon)),
+        proj_wgs84)
+    buf = Point(0, 0).buffer(meters)  # distance in meters
+    return Polygon(transform(project, buf).exterior.coords[:])
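Usage sketch: an approximately 100-meter buffer polygon around a node (assumes lasso, shapely, and a pyproj version that still provides the legacy pyproj.transform API used above; exact vertices vary by pyproj version)::

    from lasso.util import geodesic_point_buffer

    buffer_polygon = geodesic_point_buffer(lat=44.95, lon=-93.09, meters=100)
    print(buffer_polygon.bounds)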
+ +
[docs]def create_locationreference(node, link): + node['X'] = node['geometry'].apply(lambda p: p.x) + node['Y'] = node['geometry'].apply(lambda p: p.y) + node['point'] = [list(xy) for xy in zip(node.X, node.Y)] + node_dict = dict(zip(node.model_node_id, node.point)) + + link['A_point'] = link['A'].map(node_dict) + link['B_point'] = link['B'].map(node_dict) + link['locationReferences'] = link.apply(lambda x: [{'sequence':1, + 'point': x['A_point'], + 'distanceToNextRef':x['length'], + 'bearing' : 0, + 'intersectionId':x['fromIntersectionId']}, + {'sequence':2, + 'point': x['B_point'], + 'intersectionId':x['toIntersectionId']}], + axis = 1)
+ +
[docs]def column_name_to_parts(c, parameters=None): + + if not parameters: + from .parameters import Parameters + + parameters = Parameters() + + if c[0:2] == "ML": + managed = True + else: + managed = False + + time_period = None + category = None + + if c.split("_")[0] not in parameters.properties_to_split.keys(): + return c, None, None, managed + + tps = parameters.time_period_to_time.keys() + cats = parameters.categories.keys() + + if c.split("_")[-1] in tps: + time_period = c.split("_")[-1] + base_name = c.split(time_period)[-2][:-1] + if c.split("_")[-2] in cats: + category = c.split("_")[-2] + base_name = c.split(category)[-2][:-1] + elif c.split("_")[-1] in cats: + category = c.split("_")[-1] + base_name = c.split(category)[-2][:-1] + else: + msg = "Can't split property correctly: {}".format(c) + WranglerLogger.error(msg) + + return base_name, time_period, category, managed
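Usage sketch: splitting a composite column name into its parts; the exact tuple depends on the default Parameters property and time-period lists, so the result shown in the comment is indicative only::

    from lasso.util import column_name_to_parts

    base_name, time_period, category, managed = column_name_to_parts("lanes_AM")
    print(base_name, time_period, category, managed)  # e.g. lanes AM None False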
+ +
[docs]def shorten_name(name): + if type(name) == str: + name_list = name.split(',') + else: + name_list = name + name_list = [re.sub(r'\W+', ' ', c).replace('nan', '').strip(' ') for c in name_list] + + name_list = list(set(name_list)) + #name_list.remove('') + + name_new = ' '.join(name_list).strip(' ') + + # convert non english character to english + name_new = unidecode(name_new) + + return name_new
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_modules/shapely/geometry/point/index.html b/branch/bicounty_emme/_modules/shapely/geometry/point/index.html new file mode 100644 index 0000000..a0ef069 --- /dev/null +++ b/branch/bicounty_emme/_modules/shapely/geometry/point/index.html @@ -0,0 +1,252 @@ + + + + + + shapely.geometry.point — lasso documentation + + + + + + + + + + + + + + + + + + +

Source code for shapely.geometry.point

+"""Points and related utilities
+"""
+import numpy as np
+
+import shapely
+from shapely.errors import DimensionError
+from shapely.geometry.base import BaseGeometry
+
+__all__ = ["Point"]
+
+
+
[docs]class Point(BaseGeometry): + """ + A geometry type that represents a single coordinate with + x,y and possibly z values. + + A point is a zero-dimensional feature and has zero length and zero area. + + Parameters + ---------- + args : float, or sequence of floats + The coordinates can either be passed as a single parameter, or as + individual float values using multiple parameters: + + 1) 1 parameter: a sequence or array-like of with 2 or 3 values. + 2) 2 or 3 parameters (float): x, y, and possibly z. + + Attributes + ---------- + x, y, z : float + Coordinate values + + Examples + -------- + Constructing the Point using separate parameters for x and y: + + >>> p = Point(1.0, -1.0) + + Constructing the Point using a list of x, y coordinates: + + >>> p = Point([1.0, -1.0]) + >>> print(p) + POINT (1 -1) + >>> p.y + -1.0 + >>> p.x + 1.0 + """ + + __slots__ = [] + + def __new__(self, *args): + if len(args) == 0: + # empty geometry + # TODO better constructor + return shapely.from_wkt("POINT EMPTY") + elif len(args) > 3: + raise TypeError(f"Point() takes at most 3 arguments ({len(args)} given)") + elif len(args) == 1: + coords = args[0] + if isinstance(coords, Point): + return coords + + # Accept either (x, y) or [(x, y)] + if not hasattr(coords, "__getitem__"): # generators + coords = list(coords) + coords = np.asarray(coords).squeeze() + else: + # 2 or 3 args + coords = np.array(args).squeeze() + + if coords.ndim > 1: + raise ValueError( + f"Point() takes only scalar or 1-size vector arguments, got {args}" + ) + if not np.issubdtype(coords.dtype, np.number): + coords = [float(c) for c in coords] + geom = shapely.points(coords) + if not isinstance(geom, Point): + raise ValueError("Invalid values passed to Point constructor") + return geom + + # Coordinate getters and setters + + @property + def x(self): + """Return x coordinate.""" + return shapely.get_x(self) + + @property + def y(self): + """Return y coordinate.""" + return shapely.get_y(self) + + @property + def z(self): + """Return z coordinate.""" + if not shapely.has_z(self): + raise DimensionError("This point has no z coordinate.") + # return shapely.get_z(self) -> get_z only supported for GEOS 3.7+ + return self.coords[0][2] + + @property + def __geo_interface__(self): + return {"type": "Point", "coordinates": self.coords[0]} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG circle element for the Point geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG circle diameter. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + return ( + '<circle cx="{0.x}" cy="{0.y}" r="{1}" ' + 'stroke="#555555" stroke-width="{2}" fill="{3}" opacity="{4}" />' + ).format(self, 3.0 * scale_factor, 1.0 * scale_factor, fill_color, opacity)
+ + @property + def xy(self): + """Separate arrays of X and Y coordinate values + + Example: + >>> x, y = Point(0, 0).xy + >>> list(x) + [0.0] + >>> list(y) + [0.0] + """ + return self.coords.xy
+ + +shapely.lib.registry[0] = Point +
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_modules/shapely/geometry/polygon/index.html b/branch/bicounty_emme/_modules/shapely/geometry/polygon/index.html new file mode 100644 index 0000000..f73f375 --- /dev/null +++ b/branch/bicounty_emme/_modules/shapely/geometry/polygon/index.html @@ -0,0 +1,462 @@ + + + + + + shapely.geometry.polygon — lasso documentation + + + + + + + + + + + + + + + + + + +

Source code for shapely.geometry.polygon

+"""Polygons and their linear ring components
+"""
+
+import numpy as np
+
+import shapely
+from shapely.algorithms.cga import is_ccw_impl, signed_area
+from shapely.errors import TopologicalError
+from shapely.geometry.base import BaseGeometry
+from shapely.geometry.linestring import LineString
+from shapely.geometry.point import Point
+
+__all__ = ["Polygon", "LinearRing"]
+
+
+def _unpickle_linearring(wkb):
+    linestring = shapely.from_wkb(wkb)
+    srid = shapely.get_srid(linestring)
+    linearring = shapely.linearrings(shapely.get_coordinates(linestring))
+    if srid:
+        linearring = shapely.set_srid(linearring, srid)
+    return linearring
+
+
+class LinearRing(LineString):
+    """
+    A geometry type composed of one or more line segments
+    that form a closed loop.
+
+    A LinearRing is a closed, one-dimensional feature.
+    A LinearRing that crosses itself or touches itself at a single point is
+    invalid and operations on it may fail.
+
+    Parameters
+    ----------
+    coordinates : sequence
+        A sequence of (x, y [,z]) numeric coordinate pairs or triples, or
+        an array-like with shape (N, 2) or (N, 3).
+        Also can be a sequence of Point objects.
+
+    Notes
+    -----
+    Rings are automatically closed. There is no need to specify a final
+    coordinate pair identical to the first.
+
+    Examples
+    --------
+    Construct a square ring.
+
+    >>> ring = LinearRing( ((0, 0), (0, 1), (1 ,1 ), (1 , 0)) )
+    >>> ring.is_closed
+    True
+    >>> list(ring.coords)
+    [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0), (0.0, 0.0)]
+    >>> ring.length
+    4.0
+
+    """
+
+    __slots__ = []
+
+    def __new__(self, coordinates=None):
+        if coordinates is None:
+            # empty geometry
+            # TODO better way?
+            return shapely.from_wkt("LINEARRING EMPTY")
+        elif isinstance(coordinates, LineString):
+            if type(coordinates) == LinearRing:
+                # return original objects since geometries are immutable
+                return coordinates
+            elif not coordinates.is_valid:
+                raise TopologicalError("An input LineString must be valid.")
+            else:
+                # LineString
+                # TODO convert LineString to LinearRing more directly?
+                coordinates = coordinates.coords
+
+        else:
+            if hasattr(coordinates, "__array__"):
+                coordinates = np.asarray(coordinates)
+            if isinstance(coordinates, np.ndarray) and np.issubdtype(
+                coordinates.dtype, np.number
+            ):
+                pass
+            else:
+                # check coordinates on points
+                def _coords(o):
+                    if isinstance(o, Point):
+                        return o.coords[0]
+                    else:
+                        return [float(c) for c in o]
+
+                coordinates = np.array([_coords(o) for o in coordinates])
+                if not np.issubdtype(coordinates.dtype, np.number):
+                    # conversion of coords to 2D array failed, this might be due
+                    # to inconsistent coordinate dimensionality
+                    raise ValueError("Inconsistent coordinate dimensionality")
+
+        if len(coordinates) == 0:
+            # empty geometry
+            # TODO better constructor + should shapely.linearrings handle this?
+            return shapely.from_wkt("LINEARRING EMPTY")
+
+        geom = shapely.linearrings(coordinates)
+        if not isinstance(geom, LinearRing):
+            raise ValueError("Invalid values passed to LinearRing constructor")
+        return geom
+
+    @property
+    def __geo_interface__(self):
+        return {"type": "LinearRing", "coordinates": tuple(self.coords)}
+
+    def __reduce__(self):
+        """WKB doesn't differentiate between LineString and LinearRing so we
+        need to move the coordinate sequence into the correct geometry type"""
+        return (_unpickle_linearring, (shapely.to_wkb(self, include_srid=True),))
+
+    @property
+    def is_ccw(self):
+        """True if the ring is oriented counter clock-wise"""
+        return bool(is_ccw_impl()(self))
+
+    @property
+    def is_simple(self):
+        """True if the geometry is simple, meaning that any self-intersections
+        are only at boundary points, else False"""
+        return bool(shapely.is_simple(self))
+
+
+shapely.lib.registry[2] = LinearRing
+
+
+class InteriorRingSequence:
+
+    _parent = None
+    _ndim = None
+    _index = 0
+    _length = 0
+
+    def __init__(self, parent):
+        self._parent = parent
+        self._ndim = parent._ndim
+
+    def __iter__(self):
+        self._index = 0
+        self._length = self.__len__()
+        return self
+
+    def __next__(self):
+        if self._index < self._length:
+            ring = self._get_ring(self._index)
+            self._index += 1
+            return ring
+        else:
+            raise StopIteration
+
+    def __len__(self):
+        return shapely.get_num_interior_rings(self._parent)
+
+    def __getitem__(self, key):
+        m = self.__len__()
+        if isinstance(key, int):
+            if key + m < 0 or key >= m:
+                raise IndexError("index out of range")
+            if key < 0:
+                i = m + key
+            else:
+                i = key
+            return self._get_ring(i)
+        elif isinstance(key, slice):
+            res = []
+            start, stop, stride = key.indices(m)
+            for i in range(start, stop, stride):
+                res.append(self._get_ring(i))
+            return res
+        else:
+            raise TypeError("key must be an index or slice")
+
+    def _get_ring(self, i):
+        return shapely.get_interior_ring(self._parent, i)
+
+
+
[docs]class Polygon(BaseGeometry): + """ + A geometry type representing an area that is enclosed by a linear ring. + + A polygon is a two-dimensional feature and has a non-zero area. It may + have one or more negative-space "holes" which are also bounded by linear + rings. If any rings cross each other, the feature is invalid and + operations on it may fail. + + Parameters + ---------- + shell : sequence + A sequence of (x, y [,z]) numeric coordinate pairs or triples, or + an array-like with shape (N, 2) or (N, 3). + Also can be a sequence of Point objects. + holes : sequence + A sequence of objects which satisfy the same requirements as the + shell parameters above + + Attributes + ---------- + exterior : LinearRing + The ring which bounds the positive space of the polygon. + interiors : sequence + A sequence of rings which bound all existing holes. + + Examples + -------- + Create a square polygon with no holes + + >>> coords = ((0., 0.), (0., 1.), (1., 1.), (1., 0.), (0., 0.)) + >>> polygon = Polygon(coords) + >>> polygon.area + 1.0 + """ + + __slots__ = [] + + def __new__(self, shell=None, holes=None): + if shell is None: + # empty geometry + # TODO better way? + return shapely.from_wkt("POLYGON EMPTY") + elif isinstance(shell, Polygon): + # return original objects since geometries are immutable + return shell + else: + shell = LinearRing(shell) + + if holes is not None: + if len(holes) == 0: + # shapely constructor cannot handle holes=[] + holes = None + else: + holes = [LinearRing(ring) for ring in holes] + + geom = shapely.polygons(shell, holes=holes) + if not isinstance(geom, Polygon): + raise ValueError("Invalid values passed to Polygon constructor") + return geom + + @property + def exterior(self): + return shapely.get_exterior_ring(self) + + @property + def interiors(self): + if self.is_empty: + return [] + return InteriorRingSequence(self) + + @property + def coords(self): + raise NotImplementedError( + "Component rings have coordinate sequences, but the polygon does not" + ) + + def __eq__(self, other): + if not isinstance(other, BaseGeometry): + return NotImplemented + if not isinstance(other, Polygon): + return False + check_empty = (self.is_empty, other.is_empty) + if all(check_empty): + return True + elif any(check_empty): + return False + my_coords = [self.exterior.coords] + [ + interior.coords for interior in self.interiors + ] + other_coords = [other.exterior.coords] + [ + interior.coords for interior in other.interiors + ] + if not len(my_coords) == len(other_coords): + return False + # equal_nan=False is the default, but not yet available for older numpy + return np.all( + [ + np.array_equal(left, right) # , equal_nan=False) + for left, right in zip(my_coords, other_coords) + ] + ) + + def __hash__(self): + return super().__hash__() + + @property + def __geo_interface__(self): + if self.exterior == LinearRing(): + coords = [] + else: + coords = [tuple(self.exterior.coords)] + for hole in self.interiors: + coords.append(tuple(hole.coords)) + return {"type": "Polygon", "coordinates": tuple(coords)} + +
[docs] def svg(self, scale_factor=1.0, fill_color=None, opacity=None): + """Returns SVG path element for the Polygon geometry. + + Parameters + ========== + scale_factor : float + Multiplication factor for the SVG stroke-width. Default is 1. + fill_color : str, optional + Hex string for fill color. Default is to use "#66cc99" if + geometry is valid, and "#ff3333" if invalid. + opacity : float + Float number between 0 and 1 for color opacity. Default value is 0.6 + """ + if self.is_empty: + return "<g />" + if fill_color is None: + fill_color = "#66cc99" if self.is_valid else "#ff3333" + if opacity is None: + opacity = 0.6 + exterior_coords = [["{},{}".format(*c) for c in self.exterior.coords]] + interior_coords = [ + ["{},{}".format(*c) for c in interior.coords] for interior in self.interiors + ] + path = " ".join( + [ + "M {} L {} z".format(coords[0], " L ".join(coords[1:])) + for coords in exterior_coords + interior_coords + ] + ) + return ( + '<path fill-rule="evenodd" fill="{2}" stroke="#555555" ' + 'stroke-width="{0}" opacity="{3}" d="{1}" />' + ).format(2.0 * scale_factor, path, fill_color, opacity)
+ +
[docs] @classmethod + def from_bounds(cls, xmin, ymin, xmax, ymax): + """Construct a `Polygon()` from spatial bounds.""" + return cls([(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin)])
+ + +shapely.lib.registry[3] = Polygon + + +def orient(polygon, sign=1.0): + s = float(sign) + rings = [] + ring = polygon.exterior + if signed_area(ring) / s >= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + for ring in polygon.interiors: + if signed_area(ring) / s <= 0.0: + rings.append(ring) + else: + rings.append(list(ring.coords)[::-1]) + return Polygon(rings[0], rings[1:]) +
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_modules/shapely/ops/index.html b/branch/bicounty_emme/_modules/shapely/ops/index.html new file mode 100644 index 0000000..90625be --- /dev/null +++ b/branch/bicounty_emme/_modules/shapely/ops/index.html @@ -0,0 +1,845 @@ + + + + + + shapely.ops — lasso documentation + + + + + + + + + + + + + + + + + + +

Source code for shapely.ops

+"""Support for various GEOS geometry operations
+"""
+
+from warnings import warn
+
+import shapely
+from shapely.algorithms.polylabel import polylabel  # noqa
+from shapely.errors import GeometryTypeError, ShapelyDeprecationWarning
+from shapely.geometry import (
+    GeometryCollection,
+    LineString,
+    MultiLineString,
+    MultiPoint,
+    Point,
+    Polygon,
+    shape,
+)
+from shapely.geometry.base import BaseGeometry, BaseMultipartGeometry
+from shapely.geometry.polygon import orient as orient_
+from shapely.prepared import prep
+
+__all__ = [
+    "cascaded_union",
+    "linemerge",
+    "operator",
+    "polygonize",
+    "polygonize_full",
+    "transform",
+    "unary_union",
+    "triangulate",
+    "voronoi_diagram",
+    "split",
+    "nearest_points",
+    "validate",
+    "snap",
+    "shared_paths",
+    "clip_by_rect",
+    "orient",
+    "substring",
+]
+
+
+class CollectionOperator:
+    def shapeup(self, ob):
+        if isinstance(ob, BaseGeometry):
+            return ob
+        else:
+            try:
+                return shape(ob)
+            except (ValueError, AttributeError):
+                return LineString(ob)
+
+    def polygonize(self, lines):
+        """Creates polygons from a source of lines
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        collection = shapely.polygonize(obs)
+        return collection.geoms
+
+    def polygonize_full(self, lines):
+        """Creates polygons from a source of lines, returning the polygons
+        and leftover geometries.
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.
+
+        Returns a tuple of objects: (polygons, cut edges, dangles, invalid ring
+        lines). Each are a geometry collection.
+
+        Dangles are edges which have one or both ends which are not incident on
+        another edge endpoint. Cut edges are connected at both ends but do not
+        form part of polygon. Invalid ring lines form rings which are invalid
+        (bowties, etc).
+        """
+        source = getattr(lines, "geoms", None) or lines
+        try:
+            source = iter(source)
+        except TypeError:
+            source = [source]
+        finally:
+            obs = [self.shapeup(line) for line in source]
+        return shapely.polygonize_full(obs)
+
+    def linemerge(self, lines, directed=False):
+        """Merges all connected lines from a source
+
+        The source may be a MultiLineString, a sequence of LineString objects,
+        or a sequence of objects that can be adapted to LineStrings.  Returns a
+        LineString or MultiLineString when lines are not contiguous.
+        """
+        source = None
+        if getattr(lines, "geom_type", None) == "MultiLineString":
+            source = lines
+        elif hasattr(lines, "geoms"):
+            # other Multi geometries
+            source = MultiLineString([ls.coords for ls in lines.geoms])
+        elif hasattr(lines, "__iter__"):
+            try:
+                source = MultiLineString([ls.coords for ls in lines])
+            except AttributeError:
+                source = MultiLineString(lines)
+        if source is None:
+            raise ValueError(f"Cannot linemerge {lines}")
+        return shapely.line_merge(source, directed=directed)
+
+    def cascaded_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        .. deprecated:: 1.8
+            This function was superseded by :meth:`unary_union`.
+        """
+        warn(
+            "The 'cascaded_union()' function is deprecated. "
+            "Use 'unary_union()' instead.",
+            ShapelyDeprecationWarning,
+            stacklevel=2,
+        )
+        return shapely.union_all(geoms, axis=None)
+
+    def unary_union(self, geoms):
+        """Returns the union of a sequence of geometries
+
+        Usually used to convert a collection into the smallest set of polygons
+        that cover the same area.
+        """
+        return shapely.union_all(geoms, axis=None)
+
+
+operator = CollectionOperator()
+polygonize = operator.polygonize
+polygonize_full = operator.polygonize_full
+linemerge = operator.linemerge
+cascaded_union = operator.cascaded_union
+unary_union = operator.unary_union
+
+
+def triangulate(geom, tolerance=0.0, edges=False):
+    """Creates the Delaunay triangulation and returns a list of geometries
+
+    The source may be any geometry type. All vertices of the geometry will be
+    used as the points of the triangulation.
+
+    From the GEOS documentation:
+    tolerance is the snapping tolerance used to improve the robustness of
+    the triangulation computation. A tolerance of 0.0 specifies that no
+    snapping will take place.
+
+    If edges is False, a list of Polygons (triangles) will be returned.
+    Otherwise the list of LineString edges is returned.
+
+    """
+    collection = shapely.delaunay_triangles(geom, tolerance=tolerance, only_edges=edges)
+    return [g for g in collection.geoms]
+
+
+def voronoi_diagram(geom, envelope=None, tolerance=0.0, edges=False):
+    """
+    Constructs a Voronoi Diagram [1] from the given geometry.
+    Returns a list of geometries.
+
+    Parameters
+    ----------
+    geom: geometry
+        the input geometry whose vertices will be used to calculate
+        the final diagram.
+    envelope: geometry, None
+        clipping envelope for the returned diagram, automatically
+        determined if None. The diagram will be clipped to the larger
+        of this envelope or an envelope surrounding the sites.
+    tolerance: float, 0.0
+        sets the snapping tolerance used to improve the robustness
+        of the computation. A tolerance of 0.0 specifies that no
+        snapping will take place.
+    edges: bool, False
+        If False, return regions as polygons. Else, return only
+        edges e.g. LineStrings.
+
+    GEOS documentation can be found at [2]
+
+    Returns
+    -------
+    GeometryCollection
+        geometries representing the Voronoi regions.
+
+    Notes
+    -----
+    The tolerance `argument` can be finicky and is known to cause the
+    algorithm to fail in several cases. If you're using `tolerance`
+    and getting a failure, try removing it. The test cases in
+    tests/test_voronoi_diagram.py show more details.
+
+
+    References
+    ----------
+    [1] https://en.wikipedia.org/wiki/Voronoi_diagram
+    [2] https://geos.osgeo.org/doxygen/geos__c_8h_source.html  (line 730)
+    """
+    try:
+        result = shapely.voronoi_polygons(
+            geom, tolerance=tolerance, extend_to=envelope, only_edges=edges
+        )
+    except shapely.GEOSException as err:
+        errstr = "Could not create Voronoi Diagram with the specified inputs "
+        errstr += f"({err!s})."
+        if tolerance:
+            errstr += " Try running again with default tolerance value."
+        raise ValueError(errstr) from err
+
+    if result.geom_type != "GeometryCollection":
+        return GeometryCollection([result])
+    return result
+
+
+def validate(geom):
+    return shapely.is_valid_reason(geom)
+
+
+
[docs]def transform(func, geom): + """Applies `func` to all coordinates of `geom` and returns a new + geometry of the same type from the transformed coordinates. + + `func` maps x, y, and optionally z to output xp, yp, zp. The input + parameters may iterable types like lists or arrays or single values. + The output shall be of the same type. Scalars in, scalars out. + Lists in, lists out. + + For example, here is an identity function applicable to both types + of input. + + def id_func(x, y, z=None): + return tuple(filter(None, [x, y, z])) + + g2 = transform(id_func, g1) + + Using pyproj >= 2.1, this example will accurately project Shapely geometries: + + import pyproj + + wgs84 = pyproj.CRS('EPSG:4326') + utm = pyproj.CRS('EPSG:32618') + + project = pyproj.Transformer.from_crs(wgs84, utm, always_xy=True).transform + + g2 = transform(project, g1) + + Note that the always_xy kwarg is required here as Shapely geometries only support + X,Y coordinate ordering. + + Lambda expressions such as the one in + + g2 = transform(lambda x, y, z=None: (x+1.0, y+1.0), g1) + + also satisfy the requirements for `func`. + """ + if geom.is_empty: + return geom + if geom.geom_type in ("Point", "LineString", "LinearRing", "Polygon"): + + # First we try to apply func to x, y, z sequences. When func is + # optimized for sequences, this is the fastest, though zipping + # the results up to go back into the geometry constructors adds + # extra cost. + try: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)(zip(*func(*zip(*geom.coords)))) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)(zip(*func(*zip(*geom.exterior.coords)))) + holes = list( + type(ring)(zip(*func(*zip(*ring.coords)))) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + # A func that assumes x, y, z are single values will likely raise a + # TypeError, in which case we'll try again. + except TypeError: + if geom.geom_type in ("Point", "LineString", "LinearRing"): + return type(geom)([func(*c) for c in geom.coords]) + elif geom.geom_type == "Polygon": + shell = type(geom.exterior)([func(*c) for c in geom.exterior.coords]) + holes = list( + type(ring)([func(*c) for c in ring.coords]) + for ring in geom.interiors + ) + return type(geom)(shell, holes) + + elif geom.geom_type.startswith("Multi") or geom.geom_type == "GeometryCollection": + return type(geom)([transform(func, part) for part in geom.geoms]) + else: + raise GeometryTypeError(f"Type {geom.geom_type!r} not recognized")
+ + +def nearest_points(g1, g2): + """Returns the calculated nearest points in the input geometries + + The points are returned in the same order as the input geometries. + """ + seq = shapely.shortest_line(g1, g2) + if seq is None: + if g1.is_empty: + raise ValueError("The first input geometry is empty") + else: + raise ValueError("The second input geometry is empty") + + p1 = shapely.get_point(seq, 0) + p2 = shapely.get_point(seq, 1) + return (p1, p2) + + +def snap(g1, g2, tolerance): + """ + Snaps an input geometry (g1) to reference (g2) geometry's vertices. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + tolerance : float + The snapping tolerance + + Refer to :func:`shapely.snap` for full documentation. + """ + + return shapely.snap(g1, g2, tolerance) + + +def shared_paths(g1, g2): + """Find paths shared between the two given lineal geometries + + Returns a GeometryCollection with two elements: + - First element is a MultiLineString containing shared paths with the + same direction for both inputs. + - Second element is a MultiLineString containing shared paths with the + opposite direction for the two inputs. + + Parameters + ---------- + g1 : geometry + The first geometry + g2 : geometry + The second geometry + """ + if not isinstance(g1, LineString): + raise GeometryTypeError("First geometry must be a LineString") + if not isinstance(g2, LineString): + raise GeometryTypeError("Second geometry must be a LineString") + return shapely.shared_paths(g1, g2) + + +class SplitOp: + @staticmethod + def _split_polygon_with_line(poly, splitter): + """Split a Polygon with a LineString""" + if not isinstance(poly, Polygon): + raise GeometryTypeError("First argument must be a Polygon") + if not isinstance(splitter, LineString): + raise GeometryTypeError("Second argument must be a LineString") + + union = poly.boundary.union(splitter) + + # greatly improves split performance for big geometries with many + # holes (the following contains checks) with minimal overhead + # for common cases + poly = prep(poly) + + # some polygonized geometries may be holes, we do not want them + # that's why we test if the original polygon (poly) contains + # an inner point of polygonized geometry (pg) + return [ + pg for pg in polygonize(union) if poly.contains(pg.representative_point()) + ] + + @staticmethod + def _split_line_with_line(line, splitter): + """Split a LineString with another (Multi)LineString or (Multi)Polygon""" + + # if splitter is a polygon, pick it's boundary + if splitter.geom_type in ("Polygon", "MultiPolygon"): + splitter = splitter.boundary + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, LineString) and not isinstance( + splitter, MultiLineString + ): + raise GeometryTypeError( + "Second argument must be either a LineString or a MultiLineString" + ) + + # | s\l | Interior | Boundary | Exterior | + # |----------|----------|----------|----------| + # | Interior | 0 or F | * | * | At least one of these two must be 0 + # | Boundary | 0 or F | * | * | So either '0********' or '[0F]**0*****' + # | Exterior | * | * | * | No overlapping interiors ('1********') + relation = splitter.relate(line) + if relation[0] == "1": + # The lines overlap at some segment (linear intersection of interiors) + raise ValueError("Input geometry segment overlaps with the splitter.") + elif relation[0] == "0" or relation[3] == "0": + # The splitter crosses or touches the line's 
interior --> return multilinestring from the split + return line.difference(splitter) + else: + # The splitter does not cross or touch the line's interior --> return collection with identity line + return [line] + + @staticmethod + def _split_line_with_point(line, splitter): + """Split a LineString with a Point""" + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, Point): + raise GeometryTypeError("Second argument must be a Point") + + # check if point is in the interior of the line + if not line.relate_pattern(splitter, "0********"): + # point not on line interior --> return collection with single identity line + # (REASONING: Returning a list with the input line reference and creating a + # GeometryCollection at the general split function prevents unnecessary copying + # of linestrings in multipoint splitting function) + return [line] + elif line.coords[0] == splitter.coords[0]: + # if line is a closed ring the previous test doesn't behave as desired + return [line] + + # point is on line, get the distance from the first point on line + distance_on_line = line.project(splitter) + coords = list(line.coords) + # split the line at the point and create two new lines + current_position = 0.0 + for i in range(len(coords) - 1): + point1 = coords[i] + point2 = coords[i + 1] + dx = point1[0] - point2[0] + dy = point1[1] - point2[1] + segment_length = (dx**2 + dy**2) ** 0.5 + current_position += segment_length + if distance_on_line == current_position: + # splitter is exactly on a vertex + return [LineString(coords[: i + 2]), LineString(coords[i + 1 :])] + elif distance_on_line < current_position: + # splitter is between two vertices + return [ + LineString(coords[: i + 1] + [splitter.coords[0]]), + LineString([splitter.coords[0]] + coords[i + 1 :]), + ] + return [line] + + @staticmethod + def _split_line_with_multipoint(line, splitter): + """Split a LineString with a MultiPoint""" + + if not isinstance(line, LineString): + raise GeometryTypeError("First argument must be a LineString") + if not isinstance(splitter, MultiPoint): + raise GeometryTypeError("Second argument must be a MultiPoint") + + chunks = [line] + for pt in splitter.geoms: + new_chunks = [] + for chunk in filter(lambda x: not x.is_empty, chunks): + # add the newly split 2 lines or the same line if not split + new_chunks.extend(SplitOp._split_line_with_point(chunk, pt)) + chunks = new_chunks + + return chunks + + @staticmethod + def split(geom, splitter): + """ + Splits a geometry by another geometry and returns a collection of geometries. This function is the theoretical + opposite of the union of the split geometry parts. If the splitter does not split the geometry, a collection + with a single geometry equal to the input geometry is returned. + The function supports: + - Splitting a (Multi)LineString by a (Multi)Point or (Multi)LineString or (Multi)Polygon + - Splitting a (Multi)Polygon by a LineString + + It may be convenient to snap the splitter with low tolerance to the geometry. For example in the case + of splitting a line by a point, the point must be exactly on the line, for the line to be correctly split. + When splitting a line by a polygon, the boundary of the polygon is used for the operation. + When splitting a line by another line, a ValueError is raised if the two overlap at some segment. 
+ + Parameters + ---------- + geom : geometry + The geometry to be split + splitter : geometry + The geometry that will split the input geom + + Example + ------- + >>> pt = Point((1, 1)) + >>> line = LineString([(0,0), (2,2)]) + >>> result = split(line, pt) + >>> result.wkt + 'GEOMETRYCOLLECTION (LINESTRING (0 0, 1 1), LINESTRING (1 1, 2 2))' + """ + + if geom.geom_type in ("MultiLineString", "MultiPolygon"): + return GeometryCollection( + [i for part in geom.geoms for i in SplitOp.split(part, splitter).geoms] + ) + + elif geom.geom_type == "LineString": + if splitter.geom_type in ( + "LineString", + "MultiLineString", + "Polygon", + "MultiPolygon", + ): + split_func = SplitOp._split_line_with_line + elif splitter.geom_type == "Point": + split_func = SplitOp._split_line_with_point + elif splitter.geom_type == "MultiPoint": + split_func = SplitOp._split_line_with_multipoint + else: + raise GeometryTypeError( + f"Splitting a LineString with a {splitter.geom_type} is not supported" + ) + + elif geom.geom_type == "Polygon": + if splitter.geom_type == "LineString": + split_func = SplitOp._split_polygon_with_line + else: + raise GeometryTypeError( + f"Splitting a Polygon with a {splitter.geom_type} is not supported" + ) + + else: + raise GeometryTypeError( + f"Splitting {geom.geom_type} geometry is not supported" + ) + + return GeometryCollection(split_func(geom, splitter)) + + +split = SplitOp.split + + +def substring(geom, start_dist, end_dist, normalized=False): + """Return a line segment between specified distances along a LineString + + Negative distance values are taken as measured in the reverse + direction from the end of the geometry. Out-of-range index + values are handled by clamping them to the valid range of values. + + If the start distance equals the end distance, a Point is returned. + + If the start distance is actually beyond the end distance, then the + reversed substring is returned such that the start distance is + at the first coordinate. + + Parameters + ---------- + geom : LineString + The geometry to get a substring of. + start_dist : float + The distance along `geom` of the start of the substring. + end_dist : float + The distance along `geom` of the end of the substring. + normalized : bool, False + Whether the distance parameters are interpreted as a + fraction of the geometry's length. + + Returns + ------- + Union[Point, LineString] + The substring between `start_dist` and `end_dist` or a Point + if they are at the same location. + + Raises + ------ + TypeError + If `geom` is not a LineString. + + Examples + -------- + >>> from shapely.geometry import LineString + >>> from shapely.ops import substring + >>> ls = LineString((i, 0) for i in range(6)) + >>> ls.wkt + 'LINESTRING (0 0, 1 0, 2 0, 3 0, 4 0, 5 0)' + >>> substring(ls, start_dist=1, end_dist=3).wkt + 'LINESTRING (1 0, 2 0, 3 0)' + >>> substring(ls, start_dist=3, end_dist=1).wkt + 'LINESTRING (3 0, 2 0, 1 0)' + >>> substring(ls, start_dist=1, end_dist=-3).wkt + 'LINESTRING (1 0, 2 0)' + >>> substring(ls, start_dist=0.2, end_dist=-0.6, normalized=True).wkt + 'LINESTRING (1 0, 2 0)' + + Returning a `Point` when `start_dist` and `end_dist` are at the + same location. + + >>> substring(ls, 2.5, -2.5).wkt + 'POINT (2.5 0)' + """ + + if not isinstance(geom, LineString): + raise GeometryTypeError( + "Can only calculate a substring of LineString geometries. " + f"A {geom.geom_type} was provided." 
+ ) + + # Filter out cases in which to return a point + if start_dist == end_dist: + return geom.interpolate(start_dist, normalized) + elif not normalized and start_dist >= geom.length and end_dist >= geom.length: + return geom.interpolate(geom.length, normalized) + elif not normalized and -start_dist >= geom.length and -end_dist >= geom.length: + return geom.interpolate(0, normalized) + elif normalized and start_dist >= 1 and end_dist >= 1: + return geom.interpolate(1, normalized) + elif normalized and -start_dist >= 1 and -end_dist >= 1: + return geom.interpolate(0, normalized) + + if normalized: + start_dist *= geom.length + end_dist *= geom.length + + # Filter out cases where distances meet at a middle point from opposite ends. + if start_dist < 0 < end_dist and abs(start_dist) + end_dist == geom.length: + return geom.interpolate(end_dist) + elif end_dist < 0 < start_dist and abs(end_dist) + start_dist == geom.length: + return geom.interpolate(start_dist) + + start_point = geom.interpolate(start_dist) + end_point = geom.interpolate(end_dist) + + if start_dist < 0: + start_dist = geom.length + start_dist # Values may still be negative, + if end_dist < 0: # but only in the out-of-range + end_dist = geom.length + end_dist # sense, not the wrap-around sense. + + reverse = start_dist > end_dist + if reverse: + start_dist, end_dist = end_dist, start_dist + + if start_dist < 0: + start_dist = 0 # to avoid duplicating the first vertex + + if reverse: + vertex_list = [tuple(*end_point.coords)] + else: + vertex_list = [tuple(*start_point.coords)] + + coords = list(geom.coords) + current_distance = 0 + for p1, p2 in zip(coords, coords[1:]): + if start_dist < current_distance < end_dist: + vertex_list.append(p1) + elif current_distance >= end_dist: + break + + current_distance += ((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2) ** 0.5 + + if reverse: + vertex_list.append(tuple(*start_point.coords)) + # reverse direction result + vertex_list = reversed(vertex_list) + else: + vertex_list.append(tuple(*end_point.coords)) + + return LineString(vertex_list) + + +def clip_by_rect(geom, xmin, ymin, xmax, ymax): + """Returns the portion of a geometry within a rectangle + + The geometry is clipped in a fast but possibly dirty way. The output is + not guaranteed to be valid. No exceptions will be raised for topological + errors. + + Parameters + ---------- + geom : geometry + The geometry to be clipped + xmin : float + Minimum x value of the rectangle + ymin : float + Minimum y value of the rectangle + xmax : float + Maximum x value of the rectangle + ymax : float + Maximum y value of the rectangle + + Notes + ----- + Requires GEOS >= 3.5.0 + New in 1.7. + """ + if geom.is_empty: + return geom + return shapely.clip_by_rect(geom, xmin, ymin, xmax, ymax) + + +def orient(geom, sign=1.0): + """A properly oriented copy of the given geometry. + + The signed area of the result will have the given sign. A sign of + 1.0 means that the coordinates of the product's exterior rings will + be oriented counter-clockwise. + + Parameters + ---------- + geom : Geometry + The original geometry. May be a Polygon, MultiPolygon, or + GeometryCollection. + sign : float, optional. + The sign of the result's signed area. + + Returns + ------- + Geometry + + """ + if isinstance(geom, BaseMultipartGeometry): + return geom.__class__( + list( + map( + lambda geom: orient(geom, sign), + geom.geoms, + ) + ) + ) + if isinstance(geom, (Polygon,)): + return orient_(geom, sign) + return geom +
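The operations defined above are easiest to see with a few small calls. The sketch below is not part of the module source; the geometries and the expected outputs noted in the comments are illustrative assumptions based on the docstrings above.

# Minimal usage sketch for the shapely.ops helpers shown above (illustrative values only).
from shapely.geometry import LineString, Point
from shapely.ops import nearest_points, snap, split, substring

line = LineString([(0, 0), (2, 2)])

# nearest_points: the closest pair, returned in the same order as the inputs
p1, p2 = nearest_points(line, Point(0, 2))
# p1.wkt == 'POINT (1 1)'; p2.wkt == 'POINT (0 2)'

# snap: vertices of the first geometry move onto the second within the tolerance
snapped = snap(Point(0.1, 0.05), line, tolerance=0.2)
# snapped.wkt == 'POINT (0 0)'

# split: a point on the line's interior yields a two-part GeometryCollection
parts = split(line, Point(1, 1))
# [g.wkt for g in parts.geoms]
# -> ['LINESTRING (0 0, 1 1)', 'LINESTRING (1 1, 2 2)']

# substring: the piece of a line between two distances along its length
ls = LineString([(i, 0) for i in range(6)])
print(substring(ls, 1, 3).wkt)  # LINESTRING (1 0, 2 0, 3 0)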
© Copyright 2019 Metropolitan Council.

Built with Sphinx using a theme provided by Read the Docs.
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_sources/_generated/lasso.CubeTransit.rst.txt b/branch/bicounty_emme/_sources/_generated/lasso.CubeTransit.rst.txt new file mode 100644 index 0000000..e24b49e --- /dev/null +++ b/branch/bicounty_emme/_sources/_generated/lasso.CubeTransit.rst.txt @@ -0,0 +1,36 @@ +lasso.CubeTransit +================= + +.. currentmodule:: lasso + +.. autoclass:: CubeTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~CubeTransit.__init__ + ~CubeTransit.add_additional_time_periods + ~CubeTransit.add_cube + ~CubeTransit.build_route_name + ~CubeTransit.calculate_start_end_times + ~CubeTransit.create_add_route_card_dict + ~CubeTransit.create_delete_route_card_dict + ~CubeTransit.create_from_cube + ~CubeTransit.create_update_route_card_dict + ~CubeTransit.cube_properties_to_standard_properties + ~CubeTransit.evaluate_differences + ~CubeTransit.evaluate_route_property_differences + ~CubeTransit.evaluate_route_shape_changes + ~CubeTransit.get_time_period_numbers_from_cube_properties + ~CubeTransit.unpack_route_name + + + + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt b/branch/bicounty_emme/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt new file mode 100644 index 0000000..29190d8 --- /dev/null +++ b/branch/bicounty_emme/_sources/_generated/lasso.ModelRoadwayNetwork.rst.txt @@ -0,0 +1,90 @@ +lasso.ModelRoadwayNetwork +========================= + +.. currentmodule:: lasso + +.. autoclass:: ModelRoadwayNetwork + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~ModelRoadwayNetwork.__init__ + ~ModelRoadwayNetwork.add_counts + ~ModelRoadwayNetwork.add_incident_link_data_to_nodes + ~ModelRoadwayNetwork.add_new_roadway_feature_change + ~ModelRoadwayNetwork.add_variable_using_shst_reference + ~ModelRoadwayNetwork.addition_map + ~ModelRoadwayNetwork.apply + ~ModelRoadwayNetwork.apply_managed_lane_feature_change + ~ModelRoadwayNetwork.apply_python_calculation + ~ModelRoadwayNetwork.apply_roadway_feature_change + ~ModelRoadwayNetwork.assess_connectivity + ~ModelRoadwayNetwork.build_selection_key + ~ModelRoadwayNetwork.calculate_area_type + ~ModelRoadwayNetwork.calculate_centroidconnect + ~ModelRoadwayNetwork.calculate_county + ~ModelRoadwayNetwork.calculate_distance + ~ModelRoadwayNetwork.calculate_mpo + ~ModelRoadwayNetwork.calculate_use + ~ModelRoadwayNetwork.convert_int + ~ModelRoadwayNetwork.create_ML_variable + ~ModelRoadwayNetwork.create_calculated_variables + ~ModelRoadwayNetwork.create_dummy_connector_links + ~ModelRoadwayNetwork.create_hov_corridor_variable + ~ModelRoadwayNetwork.create_managed_lane_network + ~ModelRoadwayNetwork.create_managed_variable + ~ModelRoadwayNetwork.dataframe_to_fixed_width + ~ModelRoadwayNetwork.delete_roadway_feature_change + ~ModelRoadwayNetwork.deletion_map + ~ModelRoadwayNetwork.fill_na + ~ModelRoadwayNetwork.from_RoadwayNetwork + ~ModelRoadwayNetwork.get_attribute + ~ModelRoadwayNetwork.get_managed_lane_node_ids + ~ModelRoadwayNetwork.get_modal_graph + ~ModelRoadwayNetwork.get_modal_links_nodes + ~ModelRoadwayNetwork.get_property_by_time_period_and_group + ~ModelRoadwayNetwork.identify_segment + ~ModelRoadwayNetwork.identify_segment_endpoints + ~ModelRoadwayNetwork.is_network_connected + ~ModelRoadwayNetwork.load_transform_network + ~ModelRoadwayNetwork.network_connection_plot + ~ModelRoadwayNetwork.orig_dest_nodes_foreign_key + ~ModelRoadwayNetwork.ox_graph + 
~ModelRoadwayNetwork.path_search + ~ModelRoadwayNetwork.read + ~ModelRoadwayNetwork.read_match_result + ~ModelRoadwayNetwork.rename_variables_for_dbf + ~ModelRoadwayNetwork.roadway_net_to_gdf + ~ModelRoadwayNetwork.roadway_standard_to_met_council_network + ~ModelRoadwayNetwork.select_roadway_features + ~ModelRoadwayNetwork.selection_has_unique_link_id + ~ModelRoadwayNetwork.selection_map + ~ModelRoadwayNetwork.shortest_path + ~ModelRoadwayNetwork.split_properties_by_time_period_and_category + ~ModelRoadwayNetwork.update_distance + ~ModelRoadwayNetwork.validate_link_schema + ~ModelRoadwayNetwork.validate_node_schema + ~ModelRoadwayNetwork.validate_properties + ~ModelRoadwayNetwork.validate_selection + ~ModelRoadwayNetwork.validate_shape_schema + ~ModelRoadwayNetwork.validate_uniqueness + ~ModelRoadwayNetwork.write + ~ModelRoadwayNetwork.write_roadway_as_fixedwidth + ~ModelRoadwayNetwork.write_roadway_as_shp + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~ModelRoadwayNetwork.CALCULATED_VALUES + + \ No newline at end of file diff --git a/branch/bicounty_emme/_sources/_generated/lasso.Parameters.rst.txt b/branch/bicounty_emme/_sources/_generated/lasso.Parameters.rst.txt new file mode 100644 index 0000000..28d2c86 --- /dev/null +++ b/branch/bicounty_emme/_sources/_generated/lasso.Parameters.rst.txt @@ -0,0 +1,31 @@ +lasso.Parameters +================ + +.. currentmodule:: lasso + +.. autoclass:: Parameters + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Parameters.__init__ + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Parameters.cube_time_periods + ~Parameters.properties_to_split + ~Parameters.county_link_range_dict + ~Parameters.zones + + \ No newline at end of file diff --git a/branch/bicounty_emme/_sources/_generated/lasso.Project.rst.txt b/branch/bicounty_emme/_sources/_generated/lasso.Project.rst.txt new file mode 100644 index 0000000..863945b --- /dev/null +++ b/branch/bicounty_emme/_sources/_generated/lasso.Project.rst.txt @@ -0,0 +1,42 @@ +lasso.Project +============= + +.. currentmodule:: lasso + +.. autoclass:: Project + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Project.__init__ + ~Project.add_highway_changes + ~Project.add_transit_changes + ~Project.create_project + ~Project.determine_roadway_network_changes_compatibility + ~Project.emme_id_to_wrangler_id + ~Project.emme_name_to_wrangler_name + ~Project.evaluate_changes + ~Project.get_object_from_network_build_command + ~Project.get_operation_from_network_build_command + ~Project.read_logfile + ~Project.read_network_build_file + ~Project.write_project_card + + + + + + .. rubric:: Attributes + + .. autosummary:: + + ~Project.CALCULATED_VALUES + ~Project.DEFAULT_PROJECT_NAME + ~Project.STATIC_VALUES + + \ No newline at end of file diff --git a/branch/bicounty_emme/_sources/_generated/lasso.StandardTransit.rst.txt b/branch/bicounty_emme/_sources/_generated/lasso.StandardTransit.rst.txt new file mode 100644 index 0000000..1175b4b --- /dev/null +++ b/branch/bicounty_emme/_sources/_generated/lasso.StandardTransit.rst.txt @@ -0,0 +1,32 @@ +lasso.StandardTransit +===================== + +.. currentmodule:: lasso + +.. autoclass:: StandardTransit + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. 
autosummary:: + + ~StandardTransit.__init__ + ~StandardTransit.calculate_cube_mode + ~StandardTransit.cube_format + ~StandardTransit.evaluate_differences + ~StandardTransit.fromTransitNetwork + ~StandardTransit.read_gtfs + ~StandardTransit.route_properties_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_cube + ~StandardTransit.shape_gtfs_to_emme + ~StandardTransit.time_to_cube_time_period + ~StandardTransit.write_as_cube_lin + + + + + + \ No newline at end of file diff --git a/branch/bicounty_emme/_sources/_generated/lasso.logger.rst.txt b/branch/bicounty_emme/_sources/_generated/lasso.logger.rst.txt new file mode 100644 index 0000000..2054273 --- /dev/null +++ b/branch/bicounty_emme/_sources/_generated/lasso.logger.rst.txt @@ -0,0 +1,29 @@ +lasso.logger +============ + +.. automodule:: lasso.logger + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + setupLogging + + + + + + + + + + + + + diff --git a/branch/bicounty_emme/_sources/_generated/lasso.util.rst.txt b/branch/bicounty_emme/_sources/_generated/lasso.util.rst.txt new file mode 100644 index 0000000..95fecf8 --- /dev/null +++ b/branch/bicounty_emme/_sources/_generated/lasso.util.rst.txt @@ -0,0 +1,35 @@ +lasso.util +========== + +.. automodule:: lasso.util + + + + + + + + .. rubric:: Functions + + .. autosummary:: + + column_name_to_parts + create_locationreference + geodesic_point_buffer + get_shared_streets_intersection_hash + hhmmss_to_datetime + secs_to_datetime + shorten_name + + + + + + + + + + + + + diff --git a/branch/bicounty_emme/_sources/autodoc.rst.txt b/branch/bicounty_emme/_sources/autodoc.rst.txt new file mode 100644 index 0000000..7e48d58 --- /dev/null +++ b/branch/bicounty_emme/_sources/autodoc.rst.txt @@ -0,0 +1,29 @@ +Lasso Classes and Functions +==================================== + +.. automodule:: lasso + :no-members: + :no-undoc-members: + :no-inherited-members: + :no-show-inheritance: + + +Base Classes +-------------- +.. autosummary:: + :toctree: _generated + :nosignatures: + + CubeTransit + StandardTransit + ModelRoadwayNetwork + Project + Parameters + +Utils and Functions +-------------------- +.. autosummary:: + :toctree: _generated + + util + logger diff --git a/branch/bicounty_emme/_sources/index.rst.txt b/branch/bicounty_emme/_sources/index.rst.txt new file mode 100644 index 0000000..1255d4e --- /dev/null +++ b/branch/bicounty_emme/_sources/index.rst.txt @@ -0,0 +1,35 @@ +.. lasso documentation master file, created by + sphinx-quickstart on Thu Dec 5 15:43:28 2019. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. + +Welcome to lasso's documentation! +================================= + +This package of utilities is a wrapper around the +[network_wrangler](http://github.com/wsp-sag/network_wrangler) package +for MetCouncil. It aims to have the following functionality: +1. parse Cube log files and base highway networks and create ProjectCards + for Network Wrangler +2. parse two Cube transit line files and create ProjectCards for NetworkWrangler +3. refine Network Wrangler highway networks to contain specific variables and + settings for Metropolitan Council and export them to a format that can + be read in by Citilab's Cube software. + +.. 
toctree:: + :maxdepth: 3 + :caption: Contents: + + starting + setup + running + autodoc + + + +Indices and tables +================== + +* :ref:`genindex` +* :ref:`modindex` +* :ref:`search` diff --git a/branch/bicounty_emme/_sources/running.md.txt b/branch/bicounty_emme/_sources/running.md.txt new file mode 100644 index 0000000..e139dc8 --- /dev/null +++ b/branch/bicounty_emme/_sources/running.md.txt @@ -0,0 +1,12 @@ +# Running Lasso + +## Create project files + + +## Create a scenario + + +## Exporting networks + + +## Auditing and Reporting diff --git a/branch/bicounty_emme/_sources/setup.md.txt b/branch/bicounty_emme/_sources/setup.md.txt new file mode 100644 index 0000000..e77d463 --- /dev/null +++ b/branch/bicounty_emme/_sources/setup.md.txt @@ -0,0 +1,9 @@ +# Setup + +### Projects + +### Parameters + +### Settings + +### Additional Data Files diff --git a/branch/bicounty_emme/_sources/starting.md.txt b/branch/bicounty_emme/_sources/starting.md.txt new file mode 100644 index 0000000..8886f95 --- /dev/null +++ b/branch/bicounty_emme/_sources/starting.md.txt @@ -0,0 +1,292 @@ +# Starting Out + +## Installation + +If you are managing multiple python versions, we suggest using [`virtualenv`](https://virtualenv.pypa.io/en/latest/) or [`conda`](https://conda.io/en/latest/) virtual environments. + +Example using a conda environment (recommended) and using the package manager [pip](https://pip.pypa.io/en/stable/) to install Lasso from the source on GitHub. + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/Lasso@master +``` + +Lasso will install `network_wrangler` from the [PyPi](https://pypi.org/project/network-wrangler/) repository because it is included in Lasso's `requirements.txt`. + +#### Bleeding Edge +If you want to install a more up-to-date or development version of network wrangler and lasso , you can do so by installing it from the `develop` branch of + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas -n +conda activate +pip install git+https://github.com/wsp-sag/network_wrangler@develop +pip install git+https://github.com/wsp-sag/Lasso@develop +``` + +#### From Clone +If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e will install it in [editable mode](https://pip.pypa.io/en/stable/reference/pip_install/?highlight=editable#editable-installs). + +**if you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!** + +```bash +conda config --add channels conda-forge +conda create python=3.7 rtree geopandas osmnx -n +conda activate +git clone https://github.com/wsp-sag/Lasso +git clone https://github.com/wsp-sag/network_wrangler +cd network_wrangler +pip install -e . +cd .. +cd Lasso +pip install -e . +``` + +Notes: + +1. The -e installs it in editable mode. +2. If you are not part of the project team and want to contribute code bxack to the project, please fork before you clone and then add the original repository to your upstream origin list per [these directions on github](https://help.github.com/en/articles/fork-a-repo). +3. if you wanted to install from a specific tag/version number or branch, replace `@master` with `@` or `@tag` +4. 
If you want to make use of frequent developer updates for network wrangler as well, you can also install it from clone by copying the instructions for cloning and installing Lasso for Network Wrangler + +If you are going to be doing Lasso development, we also recommend: + - a good IDE such as [Atom](http://atom.io), VS Code, Sublime Text, etc. + with Python syntax highlighting turned on. + - [GitHub Desktop](https://desktop.github.com/) to locally update your clones + +## Brief Intro + +Lasso is a 'wrapper' around the [Network Wrangler](http://wsp-sag.github.io/network_wrangler) utility. + +Both Lasso and NetworkWrangler are built around the following data schemas: + - [`roadway network`], which is based on a mashup of Open Street Map and [Shared Streets](http://sharedstreets.io). In Network Wrangler these are read in from three json files reprsenting: links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category. + - [`transit network`], which is based on a frequency-based implementation of the csv-based GTFS; and + - [`project card`], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml. + +In addition, Lasso utilizes the following data schemas: + + - [`MetCouncil Model Roadway Network Schema`], which adds data fields to the `roadway network` schema that MetCouncil uses in their travel model including breaking out data fields by time period. + - [`MetCouncil Model Transit Network Schema`], which uses the Cube PublicTransport format, and + - [`Cube Log Files`], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler. + - [`Cube public transport line files`], which define a set of transit lines in the cube software. + +### Components +Network Wrangler has the following atomic parts: + + - _RoadwayNetwork_ object, which represents the `roadway network` data as GeoDataFrames; + - _TransitNetwork_ object, which represents the `transit network` data as DataFrames; + - _ProjectCard_ object, which represents the data of the `project card`. Project cards identify the infrastructure that is changing (a selection) and defines the changes; or contains information about a new facility to be constructed or a new service to be run.; + - _Scenario_ object, which consist of at least a RoadwayNetwork, and +TransitNetwork. Scenarios can be based on or tiered from other scenarios. +Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network. + +In addition, Lasso has the following atomic parts: + + - _Project_ object, creates project cards from one of the following: a base and a build transit network in cube format, a base and build highway network, or a base highway network and a Cube log file. + - _ModelRoadwayNetwork_ object is a subclass of `RoadwayNetwork` and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube. + - _StandardTransit_, an object for holding a standard transit feed as a Partridge object and contains + methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files. + - _CubeTransit_, an object for storing information about transit defined in `Cube public transport line files` + . 
Has the capability to parse cube line file properties and shapes into python dictionaries and compare line files and represent changes as Project Card dictionaries. + - _Parameters_, A class representing all the parameters defining the networks + including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance + with a keyword argument setting the attribute. Parameters that are + not explicitly set will use default parameters listed in this class. + +#### RoadwayNetwork + +Reads, writes, queries and and manipulates roadway network data, which +is mainly stored in the GeoDataFrames `links_df`, `nodes_df`, and `shapes_df`. + +```python +net = RoadwayNetwork.read( + link_filename=MY_LINK_FILE, + node_filename=MY_NODE_FILE, + shape_filename=MY_SHAPE_FILE, + shape_foreign_key ='shape_id', + + ) +my_selection = { + "link": [{"name": ["I 35E"]}], + "A": {"osm_node_id": "961117623"}, # start searching for segments at A + "B": {"osm_node_id": "2564047368"}, +} +net.select_roadway_features(my_selection) + +my_change = [ + { + 'property': 'lanes', + 'existing': 1, + 'set': 2, + }, + { + 'property': 'drive_access', + 'set': 0, + }, +] + +my_net.apply_roadway_feature_change( + my_net.select_roadway_features(my_selection), + my_change +) + +ml_net = net.create_managed_lane_network(in_place=False) + +ml_net.is_network_connected(mode="drive")) + +_, disconnected_nodes = ml_net.assess_connectivity( + mode="walk", + ignore_end_nodes=True +) +ml_net.write(filename=my_out_prefix, path=my_dir) +``` +#### TransitNetwork + +#### ProjectCard + +#### Scenario + +Manages sets of project cards and tiering from a base scenario/set of networks. + +```python + +my_base_scenario = { + "road_net": RoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ), + "transit_net": TransitNetwork.read(STPAUL_DIR), +} + +card_filenames = [ + "3_multiple_roadway_attribute_change.yml", + "multiple_changes.yml", + "4_simple_managed_lane.yml", +] + +project_card_directory = os.path.join(STPAUL_DIR, "project_cards") + +project_cards_list = [ + ProjectCard.read(os.path.join(project_card_directory, filename), validate=False) + for filename in card_filenames +] + +my_scenario = Scenario.create_scenario( + base_scenario=my_base_scenario, + project_cards_list=project_cards_list, +) +my_scenario.check_scenario_requisites() + +my_scenario.apply_all_projects() + +my_scenario.scenario_summary() +``` + +#### Project +Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a cube log file and a base network; + +```python + +test_project = Project.create_project( + base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"), + build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"), + ) + +test_project.evaluate_changes() + +test_project.write_project_card( + os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml") + ) + +``` + +#### ModelRoadwayNetwork +A subclass of network_wrangler's RoadwayNetwork +class which additional understanding about how to translate and write the +network out to the MetCouncil Roadway Network schema. 
+ +```Python + +net = ModelRoadwayNetwork.read( + link_filename=STPAUL_LINK_FILE, + node_filename=STPAUL_NODE_FILE, + shape_filename=STPAUL_SHAPE_FILE, + fast=True, + shape_foreign_key ='shape_id', + ) + +net.write_roadway_as_fixedwidth() + +``` + +#### StandardTransit +Translates the standard GTFS data to MetCouncil's Cube Line files. + +```Python +cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR) +cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin")) +``` + +#### CubeTransit +Used by the project class and has the capability to: + - Parse cube line file properties and shapes into python dictionaries + - Compare line files and represent changes as Project Card dictionaries + +```python +tn = CubeTransit.create_from_cube(CUBE_DIR) +transit_change_list = tn.evaluate_differences(base_transit_network) +``` + +#### Parameters +Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary. + +```Python +# read parameters from a yaml configuration file +# could also provide as a key/value pair +with open(config_file) as f: + my_config = yaml.safe_load(f) + +# provide parameters at instantiation of ModelRoadwayNetwork +model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork( + my_scenario.road_net, parameters=my_config.get("my_parameters", {}) + ) +# network written with direction from the parameters given +model_road_net.write_roadway_as_shp() + +``` + +### Typical Workflow + +Workflows in Lasso and Network Wrangler typically accomplish one of two goals: +1. Create Project Cards to document network changes as a result of either transit or roadway projects. +2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network. + +#### Project Cards from Transit LIN Files + + +#### Project Cards from Cube LOG Files + + +#### Model Network Files for a Scenario + + + +## Running Quickstart Jupyter Notebooks + +To learn basic lasso functionality, please refer to the following jupyter notebooks in the `/notebooks` directory: + + - `Lasso Project Card Creation Quickstart.ipynb` + - `Lasso Scenario Creation Quickstart.ipynb` + + Jupyter notebooks can be started by activating the lasso conda environment and typing `jupyter notebook`: + + ```bash + conda activate + jupyter notebook + ``` diff --git a/branch/bicounty_emme/_static/_sphinx_javascript_frameworks_compat.js b/branch/bicounty_emme/_static/_sphinx_javascript_frameworks_compat.js new file mode 100644 index 0000000..8141580 --- /dev/null +++ b/branch/bicounty_emme/_static/_sphinx_javascript_frameworks_compat.js @@ -0,0 +1,123 @@ +/* Compatability shim for jQuery and underscores.js. + * + * Copyright Sphinx contributors + * Released under the two clause BSD licence + */ + +/** + * small helper function to urldecode strings + * + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent#Decoding_query_parameters_from_a_URL + */ +jQuery.urldecode = function(x) { + if (!x) { + return x + } + return decodeURIComponent(x.replace(/\+/g, ' ')); +}; + +/** + * small helper function to urlencode strings + */ +jQuery.urlencode = encodeURIComponent; + +/** + * This function returns the parsed url parameters of the + * current request. Multiple values per key are supported, + * it will always return arrays of strings for the value parts. 
+ */ +jQuery.getQueryParameters = function(s) { + if (typeof s === 'undefined') + s = document.location.search; + var parts = s.substr(s.indexOf('?') + 1).split('&'); + var result = {}; + for (var i = 0; i < parts.length; i++) { + var tmp = parts[i].split('=', 2); + var key = jQuery.urldecode(tmp[0]); + var value = jQuery.urldecode(tmp[1]); + if (key in result) + result[key].push(value); + else + result[key] = [value]; + } + return result; +}; + +/** + * highlight a given string on a jquery object by wrapping it in + * span elements with the given class name. + */ +jQuery.fn.highlightText = function(text, className) { + function highlight(node, addItems) { + if (node.nodeType === 3) { + var val = node.nodeValue; + var pos = val.toLowerCase().indexOf(text); + if (pos >= 0 && + !jQuery(node.parentNode).hasClass(className) && + !jQuery(node.parentNode).hasClass("nohighlight")) { + var span; + var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.className = className; + } + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + node.parentNode.insertBefore(span, node.parentNode.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling)); + node.nodeValue = val.substr(0, pos); + if (isInSVG) { + var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect"); + var bbox = node.parentElement.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute('class', className); + addItems.push({ + "parent": node.parentNode, + "target": rect}); + } + } + } + else if (!jQuery(node).is("button, select, textarea")) { + jQuery.each(node.childNodes, function() { + highlight(this, addItems); + }); + } + } + var addItems = []; + var result = this.each(function() { + highlight(this, addItems); + }); + for (var i = 0; i < addItems.length; ++i) { + jQuery(addItems[i].parent).before(addItems[i].target); + } + return result; +}; + +/* + * backward compatibility for jQuery.browser + * This will be supported until firefox bug is fixed. + */ +if (!jQuery.browser) { + jQuery.uaMatch = function(ua) { + ua = ua.toLowerCase(); + + var match = /(chrome)[ \/]([\w.]+)/.exec(ua) || + /(webkit)[ \/]([\w.]+)/.exec(ua) || + /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) || + /(msie) ([\w.]+)/.exec(ua) || + ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) || + []; + + return { + browser: match[ 1 ] || "", + version: match[ 2 ] || "0" + }; + }; + jQuery.browser = {}; + jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true; +} diff --git a/branch/bicounty_emme/_static/basic.css b/branch/bicounty_emme/_static/basic.css new file mode 100644 index 0000000..cfc60b8 --- /dev/null +++ b/branch/bicounty_emme/_static/basic.css @@ -0,0 +1,921 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. 
+ * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 230px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li p.context { + color: #888; + margin: 2px 0 0 30px; + text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + 
+div.body { + min-width: 360px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, figure.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, figure.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, figure.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, figure.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar, +aside.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} + +nav.contents, +aside.topic, +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ + +nav.contents, +aside.topic, +div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +aside.sidebar > :last-child, +nav.contents > :last-child, +aside.topic > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +aside.sidebar::after, +nav.contents::after, +aside.topic::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + 
border-left: 0; + border-right: 0; + border-bottom: 1px solid #aaa; +} + +th { + text-align: left; + padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure, figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption, figcaption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number, +figcaption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text, +figcaption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + +/* -- object description styles --------------------------------------------- */ + +.sig { + font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; +} + +.sig-name, code.descname { + background-color: transparent; + font-weight: bold; +} + +.sig-name { + font-size: 1.1em; +} + +code.descname { + font-size: 1.2em; +} + +.sig-prename, code.descclassname { + background-color: transparent; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.sig-param.n { + font-style: italic; +} + +/* C++ specific styling */ + +.sig-inline.c-texpr, +.sig-inline.cpp-texpr { + font-family: unset; +} + +.sig.c .k, .sig.c .kt, +.sig.cpp .k, .sig.cpp .kt { + color: #0033B3; +} + +.sig.c .m, +.sig.cpp .m { + color: #1750EB; +} + +.sig.c .s, .sig.c .sc, +.sig.cpp .s, .sig.cpp .sc { + color: #067D17; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} + +aside.footnote > span, +div.citation > span { + float: left; +} +aside.footnote > span:last-of-type, +div.citation > span:last-of-type { + padding-right: 0.5em; +} +aside.footnote > p { + margin-left: 2em; +} +div.citation > p { + margin-left: 4em; +} +aside.footnote > p:last-of-type, +div.citation > p:last-of-type { + margin-bottom: 0em; +} +aside.footnote > p:last-of-type:after, +div.citation > p:last-of-type:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; 
+ word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +.sig dd { + margin-top: 0px; + margin-bottom: 0px; +} + +.sig dl { + margin-top: 0px; + margin-bottom: 0px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: oblique; +} + +.classifier:before { + font-style: normal; + margin: 0 0.5em; + content: ":"; + display: inline-block; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +.translated { + background-color: rgba(207, 255, 207, 0.2) +} + +.untranslated { + background-color: rgba(255, 207, 207, 0.2) +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; + white-space: nowrap; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + +td.linenos pre { + border: 0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +span.linenos, +div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; + -webkit-user-select: text; /* Safari fallback only */ + -webkit-user-select: none; /* Chrome/Safari */ + -moz-user-select: none; /* Firefox */ + -ms-user-select: none; /* IE10+ */ +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: 
-1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + +span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + +@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/branch/bicounty_emme/_static/css/badge_only.css b/branch/bicounty_emme/_static/css/badge_only.css new file mode 100644 index 0000000..c718cee --- /dev/null +++ b/branch/bicounty_emme/_static/css/badge_only.css @@ -0,0 +1 @@ +.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-style:normal;font-weight:400;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#FontAwesome) format("svg")}.fa:before{font-family:FontAwesome;font-style:normal;font-weight:400;line-height:1}.fa:before,a .fa{text-decoration:inherit}.fa:before,a .fa,li .fa{display:inline-block}li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before,.icon-book:before{content:"\f02d"}.fa-caret-down:before,.icon-caret-down:before{content:"\f0d7"}.fa-caret-up:before,.icon-caret-up:before{content:"\f0d8"}.fa-caret-left:before,.icon-caret-left:before{content:"\f0d9"}.fa-caret-right:before,.icon-caret-right:before{content:"\f0da"}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60}.rst-versions .rst-current-version:after{clear:both;content:"";display:block}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd 
a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}} \ No newline at end of file diff --git a/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Bold.woff b/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 0000000..6cb6000 Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Bold.woff differ diff --git a/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Bold.woff2 b/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 0000000..7059e23 Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Regular.woff b/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 0000000..f815f63 Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Regular.woff differ diff --git a/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Regular.woff2 b/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 0000000..f2c76e5 Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.eot b/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.eot new file mode 100644 index 0000000..e9f60ca Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.eot differ diff --git a/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.svg b/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.svg new file mode 100644 index 0000000..855c845 --- /dev/null +++ b/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.ttf b/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.ttf new file mode 100644 index 0000000..35acda2 Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.ttf differ diff --git a/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.woff b/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.woff new file mode 100644 index 0000000..400014a Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.woff differ diff --git a/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.woff2 b/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.woff2 new file mode 100644 index 0000000..4d13fc6 Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/fontawesome-webfont.woff2 differ diff --git a/branch/bicounty_emme/_static/css/fonts/lato-bold-italic.woff b/branch/bicounty_emme/_static/css/fonts/lato-bold-italic.woff new file mode 100644 index 0000000..88ad05b Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/lato-bold-italic.woff differ diff --git a/branch/bicounty_emme/_static/css/fonts/lato-bold-italic.woff2 b/branch/bicounty_emme/_static/css/fonts/lato-bold-italic.woff2 new file mode 100644 index 0000000..c4e3d80 Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/lato-bold-italic.woff2 differ diff --git a/branch/bicounty_emme/_static/css/fonts/lato-bold.woff b/branch/bicounty_emme/_static/css/fonts/lato-bold.woff new file mode 100644 index 0000000..c6dff51 Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/lato-bold.woff differ diff --git a/branch/bicounty_emme/_static/css/fonts/lato-bold.woff2 b/branch/bicounty_emme/_static/css/fonts/lato-bold.woff2 new file mode 100644 index 0000000..bb19504 Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/lato-bold.woff2 differ diff --git a/branch/bicounty_emme/_static/css/fonts/lato-normal-italic.woff b/branch/bicounty_emme/_static/css/fonts/lato-normal-italic.woff new file mode 100644 index 0000000..76114bc 
Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/lato-normal-italic.woff differ diff --git a/branch/bicounty_emme/_static/css/fonts/lato-normal-italic.woff2 b/branch/bicounty_emme/_static/css/fonts/lato-normal-italic.woff2 new file mode 100644 index 0000000..3404f37 Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/lato-normal-italic.woff2 differ diff --git a/branch/bicounty_emme/_static/css/fonts/lato-normal.woff b/branch/bicounty_emme/_static/css/fonts/lato-normal.woff new file mode 100644 index 0000000..ae1307f Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/lato-normal.woff differ diff --git a/branch/bicounty_emme/_static/css/fonts/lato-normal.woff2 b/branch/bicounty_emme/_static/css/fonts/lato-normal.woff2 new file mode 100644 index 0000000..3bf9843 Binary files /dev/null and b/branch/bicounty_emme/_static/css/fonts/lato-normal.woff2 differ diff --git a/branch/bicounty_emme/_static/css/theme.css b/branch/bicounty_emme/_static/css/theme.css new file mode 100644 index 0000000..19a446a --- /dev/null +++ b/branch/bicounty_emme/_static/css/theme.css @@ -0,0 +1,4 @@ +html{box-sizing:border-box}*,:after,:before{box-sizing:inherit}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}[hidden],audio:not([controls]){display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:active,a:hover{outline:0}abbr[title]{border-bottom:1px dotted}b,strong{font-weight:700}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;text-decoration:none}ins,mark{color:#000}mark{background:#ff0;font-style:italic;font-weight:700}.rst-content code,.rst-content tt,code,kbd,pre,samp{font-family:monospace,serif;_font-family:courier new,monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:after,q:before{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sup{top:-.5em}sub{bottom:-.25em}dl,ol,ul{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure,form{margin:0}label{cursor:pointer}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type=button],input[type=reset],input[type=submit]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type=search]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}textarea{resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:.2em 0;background:#ccc;color:#000;padding:.2em 0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir br{display:none}.hidden{display:none!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 
0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media print{body,html,section{background:none!important}*{box-shadow:none!important;text-shadow:none!important;filter:none!important;-ms-filter:none!important}a,a:visited{text-decoration:underline}.ir a:after,a[href^="#"]:after,a[href^="javascript:"]:after{content:""}blockquote,pre{page-break-inside:avoid}thead{display:table-header-group}img,tr{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}.rst-content .toctree-wrapper>p.caption,h2,h3,p{orphans:3;widows:3}.rst-content .toctree-wrapper>p.caption,h2,h3{page-break-after:avoid}}.btn,.fa:before,.icon:before,.rst-content .admonition,.rst-content .admonition-title:before,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .code-block-caption .headerlink:before,.rst-content .danger,.rst-content .eqno .headerlink:before,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-alert,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before,input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week],select,textarea{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}/*! 
+ * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome + * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License) + */@font-face{font-family:FontAwesome;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713);src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix&v=4.7.0) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#fontawesomeregular) format("svg");font-weight:400;font-style:normal}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{display:inline-block;font:normal normal normal 14px/1 FontAwesome;font-size:inherit;text-rendering:auto;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.fa-lg{font-size:1.33333em;line-height:.75em;vertical-align:-15%}.fa-2x{font-size:2em}.fa-3x{font-size:3em}.fa-4x{font-size:4em}.fa-5x{font-size:5em}.fa-fw{width:1.28571em;text-align:center}.fa-ul{padding-left:0;margin-left:2.14286em;list-style-type:none}.fa-ul>li{position:relative}.fa-li{position:absolute;left:-2.14286em;width:2.14286em;top:.14286em;text-align:center}.fa-li.fa-lg{left:-1.85714em}.fa-border{padding:.2em .25em .15em;border:.08em solid #eee;border-radius:.1em}.fa-pull-left{float:left}.fa-pull-right{float:right}.fa-pull-left.icon,.fa.fa-pull-left,.rst-content .code-block-caption .fa-pull-left.headerlink,.rst-content .eqno .fa-pull-left.headerlink,.rst-content .fa-pull-left.admonition-title,.rst-content code.download span.fa-pull-left:first-child,.rst-content dl dt .fa-pull-left.headerlink,.rst-content h1 .fa-pull-left.headerlink,.rst-content h2 .fa-pull-left.headerlink,.rst-content h3 .fa-pull-left.headerlink,.rst-content h4 .fa-pull-left.headerlink,.rst-content h5 .fa-pull-left.headerlink,.rst-content h6 .fa-pull-left.headerlink,.rst-content p .fa-pull-left.headerlink,.rst-content table>caption .fa-pull-left.headerlink,.rst-content tt.download span.fa-pull-left:first-child,.wy-menu-vertical li.current>a button.fa-pull-left.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-left.toctree-expand,.wy-menu-vertical li button.fa-pull-left.toctree-expand{margin-right:.3em}.fa-pull-right.icon,.fa.fa-pull-right,.rst-content .code-block-caption .fa-pull-right.headerlink,.rst-content .eqno .fa-pull-right.headerlink,.rst-content .fa-pull-right.admonition-title,.rst-content code.download span.fa-pull-right:first-child,.rst-content dl dt .fa-pull-right.headerlink,.rst-content h1 .fa-pull-right.headerlink,.rst-content h2 .fa-pull-right.headerlink,.rst-content h3 .fa-pull-right.headerlink,.rst-content h4 .fa-pull-right.headerlink,.rst-content h5 .fa-pull-right.headerlink,.rst-content h6 
.fa-pull-right.headerlink,.rst-content p .fa-pull-right.headerlink,.rst-content table>caption .fa-pull-right.headerlink,.rst-content tt.download span.fa-pull-right:first-child,.wy-menu-vertical li.current>a button.fa-pull-right.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-right.toctree-expand,.wy-menu-vertical li button.fa-pull-right.toctree-expand{margin-left:.3em}.pull-right{float:right}.pull-left{float:left}.fa.pull-left,.pull-left.icon,.rst-content .code-block-caption .pull-left.headerlink,.rst-content .eqno .pull-left.headerlink,.rst-content .pull-left.admonition-title,.rst-content code.download span.pull-left:first-child,.rst-content dl dt .pull-left.headerlink,.rst-content h1 .pull-left.headerlink,.rst-content h2 .pull-left.headerlink,.rst-content h3 .pull-left.headerlink,.rst-content h4 .pull-left.headerlink,.rst-content h5 .pull-left.headerlink,.rst-content h6 .pull-left.headerlink,.rst-content p .pull-left.headerlink,.rst-content table>caption .pull-left.headerlink,.rst-content tt.download span.pull-left:first-child,.wy-menu-vertical li.current>a button.pull-left.toctree-expand,.wy-menu-vertical li.on a button.pull-left.toctree-expand,.wy-menu-vertical li button.pull-left.toctree-expand{margin-right:.3em}.fa.pull-right,.pull-right.icon,.rst-content .code-block-caption .pull-right.headerlink,.rst-content .eqno .pull-right.headerlink,.rst-content .pull-right.admonition-title,.rst-content code.download span.pull-right:first-child,.rst-content dl dt .pull-right.headerlink,.rst-content h1 .pull-right.headerlink,.rst-content h2 .pull-right.headerlink,.rst-content h3 .pull-right.headerlink,.rst-content h4 .pull-right.headerlink,.rst-content h5 .pull-right.headerlink,.rst-content h6 .pull-right.headerlink,.rst-content p .pull-right.headerlink,.rst-content table>caption .pull-right.headerlink,.rst-content tt.download span.pull-right:first-child,.wy-menu-vertical li.current>a button.pull-right.toctree-expand,.wy-menu-vertical li.on a button.pull-right.toctree-expand,.wy-menu-vertical li button.pull-right.toctree-expand{margin-left:.3em}.fa-spin{-webkit-animation:fa-spin 2s linear infinite;animation:fa-spin 2s linear infinite}.fa-pulse{-webkit-animation:fa-spin 1s steps(8) infinite;animation:fa-spin 1s steps(8) infinite}@-webkit-keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}.fa-rotate-90{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=1)";-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.fa-rotate-180{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2)";-webkit-transform:rotate(180deg);-ms-transform:rotate(180deg);transform:rotate(180deg)}.fa-rotate-270{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=3)";-webkit-transform:rotate(270deg);-ms-transform:rotate(270deg);transform:rotate(270deg)}.fa-flip-horizontal{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=0, mirror=1)";-webkit-transform:scaleX(-1);-ms-transform:scaleX(-1);transform:scaleX(-1)}.fa-flip-vertical{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2, mirror=1)";-webkit-transform:scaleY(-1);-ms-transform:scaleY(-1);transform:scaleY(-1)}:root .fa-flip-horizontal,:root .fa-flip-vertical,:root .fa-rotate-90,:root .fa-rotate-180,:root 
.fa-rotate-270{filter:none}.fa-stack{position:relative;display:inline-block;width:2em;height:2em;line-height:2em;vertical-align:middle}.fa-stack-1x,.fa-stack-2x{position:absolute;left:0;width:100%;text-align:center}.fa-stack-1x{line-height:inherit}.fa-stack-2x{font-size:2em}.fa-inverse{color:#fff}.fa-glass:before{content:""}.fa-music:before{content:""}.fa-search:before,.icon-search:before{content:""}.fa-envelope-o:before{content:""}.fa-heart:before{content:""}.fa-star:before{content:""}.fa-star-o:before{content:""}.fa-user:before{content:""}.fa-film:before{content:""}.fa-th-large:before{content:""}.fa-th:before{content:""}.fa-th-list:before{content:""}.fa-check:before{content:""}.fa-close:before,.fa-remove:before,.fa-times:before{content:""}.fa-search-plus:before{content:""}.fa-search-minus:before{content:""}.fa-power-off:before{content:""}.fa-signal:before{content:""}.fa-cog:before,.fa-gear:before{content:""}.fa-trash-o:before{content:""}.fa-home:before,.icon-home:before{content:""}.fa-file-o:before{content:""}.fa-clock-o:before{content:""}.fa-road:before{content:""}.fa-download:before,.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{content:""}.fa-arrow-circle-o-down:before{content:""}.fa-arrow-circle-o-up:before{content:""}.fa-inbox:before{content:""}.fa-play-circle-o:before{content:""}.fa-repeat:before,.fa-rotate-right:before{content:""}.fa-refresh:before{content:""}.fa-list-alt:before{content:""}.fa-lock:before{content:""}.fa-flag:before{content:""}.fa-headphones:before{content:""}.fa-volume-off:before{content:""}.fa-volume-down:before{content:""}.fa-volume-up:before{content:""}.fa-qrcode:before{content:""}.fa-barcode:before{content:""}.fa-tag:before{content:""}.fa-tags:before{content:""}.fa-book:before,.icon-book:before{content:""}.fa-bookmark:before{content:""}.fa-print:before{content:""}.fa-camera:before{content:""}.fa-font:before{content:""}.fa-bold:before{content:""}.fa-italic:before{content:""}.fa-text-height:before{content:""}.fa-text-width:before{content:""}.fa-align-left:before{content:""}.fa-align-center:before{content:""}.fa-align-right:before{content:""}.fa-align-justify:before{content:""}.fa-list:before{content:""}.fa-dedent:before,.fa-outdent:before{content:""}.fa-indent:before{content:""}.fa-video-camera:before{content:""}.fa-image:before,.fa-photo:before,.fa-picture-o:before{content:""}.fa-pencil:before{content:""}.fa-map-marker:before{content:""}.fa-adjust:before{content:""}.fa-tint:before{content:""}.fa-edit:before,.fa-pencil-square-o:before{content:""}.fa-share-square-o:before{content:""}.fa-check-square-o:before{content:""}.fa-arrows:before{content:""}.fa-step-backward:before{content:""}.fa-fast-backward:before{content:""}.fa-backward:before{content:""}.fa-play:before{content:""}.fa-pause:before{content:""}.fa-stop:before{content:""}.fa-forward:before{content:""}.fa-fast-forward:before{content:""}.fa-step-forward:before{content:""}.fa-eject:before{content:""}.fa-chevron-left:before{content:""}.fa-chevron-right:before{content:""}.fa-plus-circle:before{content:""}.fa-minus-circle:before{content:""}.fa-times-circle:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before{content:""}.fa-check-circle:before,.wy-inline-validate.wy-inline-validate-success 
.wy-input-context:before{content:""}.fa-question-circle:before{content:""}.fa-info-circle:before{content:""}.fa-crosshairs:before{content:""}.fa-times-circle-o:before{content:""}.fa-check-circle-o:before{content:""}.fa-ban:before{content:""}.fa-arrow-left:before{content:""}.fa-arrow-right:before{content:""}.fa-arrow-up:before{content:""}.fa-arrow-down:before{content:""}.fa-mail-forward:before,.fa-share:before{content:""}.fa-expand:before{content:""}.fa-compress:before{content:""}.fa-plus:before{content:""}.fa-minus:before{content:""}.fa-asterisk:before{content:""}.fa-exclamation-circle:before,.rst-content .admonition-title:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before{content:""}.fa-gift:before{content:""}.fa-leaf:before{content:""}.fa-fire:before,.icon-fire:before{content:""}.fa-eye:before{content:""}.fa-eye-slash:before{content:""}.fa-exclamation-triangle:before,.fa-warning:before{content:""}.fa-plane:before{content:""}.fa-calendar:before{content:""}.fa-random:before{content:""}.fa-comment:before{content:""}.fa-magnet:before{content:""}.fa-chevron-up:before{content:""}.fa-chevron-down:before{content:""}.fa-retweet:before{content:""}.fa-shopping-cart:before{content:""}.fa-folder:before{content:""}.fa-folder-open:before{content:""}.fa-arrows-v:before{content:""}.fa-arrows-h:before{content:""}.fa-bar-chart-o:before,.fa-bar-chart:before{content:""}.fa-twitter-square:before{content:""}.fa-facebook-square:before{content:""}.fa-camera-retro:before{content:""}.fa-key:before{content:""}.fa-cogs:before,.fa-gears:before{content:""}.fa-comments:before{content:""}.fa-thumbs-o-up:before{content:""}.fa-thumbs-o-down:before{content:""}.fa-star-half:before{content:""}.fa-heart-o:before{content:""}.fa-sign-out:before{content:""}.fa-linkedin-square:before{content:""}.fa-thumb-tack:before{content:""}.fa-external-link:before{content:""}.fa-sign-in:before{content:""}.fa-trophy:before{content:""}.fa-github-square:before{content:""}.fa-upload:before{content:""}.fa-lemon-o:before{content:""}.fa-phone:before{content:""}.fa-square-o:before{content:""}.fa-bookmark-o:before{content:""}.fa-phone-square:before{content:""}.fa-twitter:before{content:""}.fa-facebook-f:before,.fa-facebook:before{content:""}.fa-github:before,.icon-github:before{content:""}.fa-unlock:before{content:""}.fa-credit-card:before{content:""}.fa-feed:before,.fa-rss:before{content:""}.fa-hdd-o:before{content:""}.fa-bullhorn:before{content:""}.fa-bell:before{content:""}.fa-certificate:before{content:""}.fa-hand-o-right:before{content:""}.fa-hand-o-left:before{content:""}.fa-hand-o-up:before{content:""}.fa-hand-o-down:before{content:""}.fa-arrow-circle-left:before,.icon-circle-arrow-left:before{content:""}.fa-arrow-circle-right:before,.icon-circle-arrow-right:before{content:""}.fa-arrow-circle-up:before{content:""}.fa-arrow-circle-down:before{content:""}.fa-globe:before{content:""}.fa-wrench:before{content:""}.fa-tasks:before{content:""}.fa-filter:before{content:""}.fa-briefcase:before{content:""}.fa-arrows-alt:before{content:""}.fa-group:before,.fa-users:before{content:""}.fa-chain:before,.fa-link:before,.icon-link:before{content:""}.fa-cloud:before{content:""}.fa-flask:before{content:""}.fa-cut:before,.fa-scissors:before{content:""}.fa-copy:before,.fa-files-o:before{content:""}.fa-paperclip:before{content:""}.fa-floppy-o:before,.fa-save:before{content:""}.fa
-square:before{content:""}.fa-bars:before,.fa-navicon:before,.fa-reorder:before{content:""}.fa-list-ul:before{content:""}.fa-list-ol:before{content:""}.fa-strikethrough:before{content:""}.fa-underline:before{content:""}.fa-table:before{content:""}.fa-magic:before{content:""}.fa-truck:before{content:""}.fa-pinterest:before{content:""}.fa-pinterest-square:before{content:""}.fa-google-plus-square:before{content:""}.fa-google-plus:before{content:""}.fa-money:before{content:""}.fa-caret-down:before,.icon-caret-down:before,.wy-dropdown .caret:before{content:""}.fa-caret-up:before{content:""}.fa-caret-left:before{content:""}.fa-caret-right:before{content:""}.fa-columns:before{content:""}.fa-sort:before,.fa-unsorted:before{content:""}.fa-sort-desc:before,.fa-sort-down:before{content:""}.fa-sort-asc:before,.fa-sort-up:before{content:""}.fa-envelope:before{content:""}.fa-linkedin:before{content:""}.fa-rotate-left:before,.fa-undo:before{content:""}.fa-gavel:before,.fa-legal:before{content:""}.fa-dashboard:before,.fa-tachometer:before{content:""}.fa-comment-o:before{content:""}.fa-comments-o:before{content:""}.fa-bolt:before,.fa-flash:before{content:""}.fa-sitemap:before{content:""}.fa-umbrella:before{content:""}.fa-clipboard:before,.fa-paste:before{content:""}.fa-lightbulb-o:before{content:""}.fa-exchange:before{content:""}.fa-cloud-download:before{content:""}.fa-cloud-upload:before{content:""}.fa-user-md:before{content:""}.fa-stethoscope:before{content:""}.fa-suitcase:before{content:""}.fa-bell-o:before{content:""}.fa-coffee:before{content:""}.fa-cutlery:before{content:""}.fa-file-text-o:before{content:""}.fa-building-o:before{content:""}.fa-hospital-o:before{content:""}.fa-ambulance:before{content:""}.fa-medkit:before{content:""}.fa-fighter-jet:before{content:""}.fa-beer:before{content:""}.fa-h-square:before{content:""}.fa-plus-square:before{content:""}.fa-angle-double-left:before{content:""}.fa-angle-double-right:before{content:""}.fa-angle-double-up:before{content:""}.fa-angle-double-down:before{content:""}.fa-angle-left:before{content:""}.fa-angle-right:before{content:""}.fa-angle-up:before{content:""}.fa-angle-down:before{content:""}.fa-desktop:before{content:""}.fa-laptop:before{content:""}.fa-tablet:before{content:""}.fa-mobile-phone:before,.fa-mobile:before{content:""}.fa-circle-o:before{content:""}.fa-quote-left:before{content:""}.fa-quote-right:before{content:""}.fa-spinner:before{content:""}.fa-circle:before{content:""}.fa-mail-reply:before,.fa-reply:before{content:""}.fa-github-alt:before{content:""}.fa-folder-o:before{content:""}.fa-folder-open-o:before{content:""}.fa-smile-o:before{content:""}.fa-frown-o:before{content:""}.fa-meh-o:before{content:""}.fa-gamepad:before{content:""}.fa-keyboard-o:before{content:""}.fa-flag-o:before{content:""}.fa-flag-checkered:before{content:""}.fa-terminal:before{content:""}.fa-code:before{content:""}.fa-mail-reply-all:before,.fa-reply-all:before{content:""}.fa-star-half-empty:before,.fa-star-half-full:before,.fa-star-half-o:before{content:""}.fa-location-arrow:before{content:""}.fa-crop:before{content:""}.fa-code-fork:before{content:""}.fa-chain-broken:before,.fa-unlink:before{content:""}.fa-question:before{content:""}.fa-info:before{content:""}.fa-exclamation:before{content:""}.fa-superscript:before{content:""}.fa-subscript:before{content:""}.fa-eraser:before{content:""}.fa-puzzle-piece:before{content:""}.fa-microphone:before{content:""}.fa-microphone-slash:
before{content:""}.fa-shield:before{content:""}.fa-calendar-o:before{content:""}.fa-fire-extinguisher:before{content:""}.fa-rocket:before{content:""}.fa-maxcdn:before{content:""}.fa-chevron-circle-left:before{content:""}.fa-chevron-circle-right:before{content:""}.fa-chevron-circle-up:before{content:""}.fa-chevron-circle-down:before{content:""}.fa-html5:before{content:""}.fa-css3:before{content:""}.fa-anchor:before{content:""}.fa-unlock-alt:before{content:""}.fa-bullseye:before{content:""}.fa-ellipsis-h:before{content:""}.fa-ellipsis-v:before{content:""}.fa-rss-square:before{content:""}.fa-play-circle:before{content:""}.fa-ticket:before{content:""}.fa-minus-square:before{content:""}.fa-minus-square-o:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before{content:""}.fa-level-up:before{content:""}.fa-level-down:before{content:""}.fa-check-square:before{content:""}.fa-pencil-square:before{content:""}.fa-external-link-square:before{content:""}.fa-share-square:before{content:""}.fa-compass:before{content:""}.fa-caret-square-o-down:before,.fa-toggle-down:before{content:""}.fa-caret-square-o-up:before,.fa-toggle-up:before{content:""}.fa-caret-square-o-right:before,.fa-toggle-right:before{content:""}.fa-eur:before,.fa-euro:before{content:""}.fa-gbp:before{content:""}.fa-dollar:before,.fa-usd:before{content:""}.fa-inr:before,.fa-rupee:before{content:""}.fa-cny:before,.fa-jpy:before,.fa-rmb:before,.fa-yen:before{content:""}.fa-rouble:before,.fa-rub:before,.fa-ruble:before{content:""}.fa-krw:before,.fa-won:before{content:""}.fa-bitcoin:before,.fa-btc:before{content:""}.fa-file:before{content:""}.fa-file-text:before{content:""}.fa-sort-alpha-asc:before{content:""}.fa-sort-alpha-desc:before{content:""}.fa-sort-amount-asc:before{content:""}.fa-sort-amount-desc:before{content:""}.fa-sort-numeric-asc:before{content:""}.fa-sort-numeric-desc:before{content:""}.fa-thumbs-up:before{content:""}.fa-thumbs-down:before{content:""}.fa-youtube-square:before{content:""}.fa-youtube:before{content:""}.fa-xing:before{content:""}.fa-xing-square:before{content:""}.fa-youtube-play:before{content:""}.fa-dropbox:before{content:""}.fa-stack-overflow:before{content:""}.fa-instagram:before{content:""}.fa-flickr:before{content:""}.fa-adn:before{content:""}.fa-bitbucket:before,.icon-bitbucket:before{content:""}.fa-bitbucket-square:before{content:""}.fa-tumblr:before{content:""}.fa-tumblr-square:before{content:""}.fa-long-arrow-down:before{content:""}.fa-long-arrow-up:before{content:""}.fa-long-arrow-left:before{content:""}.fa-long-arrow-right:before{content:""}.fa-apple:before{content:""}.fa-windows:before{content:""}.fa-android:before{content:""}.fa-linux:before{content:""}.fa-dribbble:before{content:""}.fa-skype:before{content:""}.fa-foursquare:before{content:""}.fa-trello:before{content:""}.fa-female:before{content:""}.fa-male:before{content:""}.fa-gittip:before,.fa-gratipay:before{content:""}.fa-sun-o:before{content:""}.fa-moon-o:before{content:""}.fa-archive:before{content:""}.fa-bug:before{content:""}.fa-vk:before{content:""}.fa-weibo:before{content:""}.fa-renren:before{content:""}.fa-pagelines:before{content:""}.fa-stack-exchange:before{content:""}.fa-arrow-circle-o-right:before{content:""}.fa-arrow-circle-o-left:before{content:""}.fa-caret-square-o-left:before,.fa-toggle-left:before{content:""}.fa-dot-circle-o:before{content:""}.fa-wheelchair:before{content:""}.fa-
vimeo-square:before{content:""}.fa-try:before,.fa-turkish-lira:before{content:""}.fa-plus-square-o:before,.wy-menu-vertical li button.toctree-expand:before{content:""}.fa-space-shuttle:before{content:""}.fa-slack:before{content:""}.fa-envelope-square:before{content:""}.fa-wordpress:before{content:""}.fa-openid:before{content:""}.fa-bank:before,.fa-institution:before,.fa-university:before{content:""}.fa-graduation-cap:before,.fa-mortar-board:before{content:""}.fa-yahoo:before{content:""}.fa-google:before{content:""}.fa-reddit:before{content:""}.fa-reddit-square:before{content:""}.fa-stumbleupon-circle:before{content:""}.fa-stumbleupon:before{content:""}.fa-delicious:before{content:""}.fa-digg:before{content:""}.fa-pied-piper-pp:before{content:""}.fa-pied-piper-alt:before{content:""}.fa-drupal:before{content:""}.fa-joomla:before{content:""}.fa-language:before{content:""}.fa-fax:before{content:""}.fa-building:before{content:""}.fa-child:before{content:""}.fa-paw:before{content:""}.fa-spoon:before{content:""}.fa-cube:before{content:""}.fa-cubes:before{content:""}.fa-behance:before{content:""}.fa-behance-square:before{content:""}.fa-steam:before{content:""}.fa-steam-square:before{content:""}.fa-recycle:before{content:""}.fa-automobile:before,.fa-car:before{content:""}.fa-cab:before,.fa-taxi:before{content:""}.fa-tree:before{content:""}.fa-spotify:before{content:""}.fa-deviantart:before{content:""}.fa-soundcloud:before{content:""}.fa-database:before{content:""}.fa-file-pdf-o:before{content:""}.fa-file-word-o:before{content:""}.fa-file-excel-o:before{content:""}.fa-file-powerpoint-o:before{content:""}.fa-file-image-o:before,.fa-file-photo-o:before,.fa-file-picture-o:before{content:""}.fa-file-archive-o:before,.fa-file-zip-o:before{content:""}.fa-file-audio-o:before,.fa-file-sound-o:before{content:""}.fa-file-movie-o:before,.fa-file-video-o:before{content:""}.fa-file-code-o:before{content:""}.fa-vine:before{content:""}.fa-codepen:before{content:""}.fa-jsfiddle:before{content:""}.fa-life-bouy:before,.fa-life-buoy:before,.fa-life-ring:before,.fa-life-saver:before,.fa-support:before{content:""}.fa-circle-o-notch:before{content:""}.fa-ra:before,.fa-rebel:before,.fa-resistance:before{content:""}.fa-empire:before,.fa-ge:before{content:""}.fa-git-square:before{content:""}.fa-git:before{content:""}.fa-hacker-news:before,.fa-y-combinator-square:before,.fa-yc-square:before{content:""}.fa-tencent-weibo:before{content:""}.fa-qq:before{content:""}.fa-wechat:before,.fa-weixin:before{content:""}.fa-paper-plane:before,.fa-send:before{content:""}.fa-paper-plane-o:before,.fa-send-o:before{content:""}.fa-history:before{content:""}.fa-circle-thin:before{content:""}.fa-header:before{content:""}.fa-paragraph:before{content:""}.fa-sliders:before{content:""}.fa-share-alt:before{content:""}.fa-share-alt-square:before{content:""}.fa-bomb:before{content:""}.fa-futbol-o:before,.fa-soccer-ball-o:before{content:""}.fa-tty:before{content:""}.fa-binoculars:before{content:""}.fa-plug:before{content:""}.fa-slideshare:before{content:""}.fa-twitch:before{content:""}.fa-yelp:before{content:""}.fa-newspaper-o:before{content:""}.fa-wifi:before{content:""}.fa-calculator:before{content:""}.fa-paypal:before{content:""}.fa-google-wallet:before{content:""}.fa-cc-visa:before{content:""}.fa-cc-mastercard:before{content:""}.fa-cc-discover:before{content:""}.fa-cc-amex:before{content:""}.fa-cc-paypal:before{content:""}.fa-cc-stripe:before{content:""}.fa-b
ell-slash:before{content:""}.fa-bell-slash-o:before{content:""}.fa-trash:before{content:""}.fa-copyright:before{content:""}.fa-at:before{content:""}.fa-eyedropper:before{content:""}.fa-paint-brush:before{content:""}.fa-birthday-cake:before{content:""}.fa-area-chart:before{content:""}.fa-pie-chart:before{content:""}.fa-line-chart:before{content:""}.fa-lastfm:before{content:""}.fa-lastfm-square:before{content:""}.fa-toggle-off:before{content:""}.fa-toggle-on:before{content:""}.fa-bicycle:before{content:""}.fa-bus:before{content:""}.fa-ioxhost:before{content:""}.fa-angellist:before{content:""}.fa-cc:before{content:""}.fa-ils:before,.fa-shekel:before,.fa-sheqel:before{content:""}.fa-meanpath:before{content:""}.fa-buysellads:before{content:""}.fa-connectdevelop:before{content:""}.fa-dashcube:before{content:""}.fa-forumbee:before{content:""}.fa-leanpub:before{content:""}.fa-sellsy:before{content:""}.fa-shirtsinbulk:before{content:""}.fa-simplybuilt:before{content:""}.fa-skyatlas:before{content:""}.fa-cart-plus:before{content:""}.fa-cart-arrow-down:before{content:""}.fa-diamond:before{content:""}.fa-ship:before{content:""}.fa-user-secret:before{content:""}.fa-motorcycle:before{content:""}.fa-street-view:before{content:""}.fa-heartbeat:before{content:""}.fa-venus:before{content:""}.fa-mars:before{content:""}.fa-mercury:before{content:""}.fa-intersex:before,.fa-transgender:before{content:""}.fa-transgender-alt:before{content:""}.fa-venus-double:before{content:""}.fa-mars-double:before{content:""}.fa-venus-mars:before{content:""}.fa-mars-stroke:before{content:""}.fa-mars-stroke-v:before{content:""}.fa-mars-stroke-h:before{content:""}.fa-neuter:before{content:""}.fa-genderless:before{content:""}.fa-facebook-official:before{content:""}.fa-pinterest-p:before{content:""}.fa-whatsapp:before{content:""}.fa-server:before{content:""}.fa-user-plus:before{content:""}.fa-user-times:before{content:""}.fa-bed:before,.fa-hotel:before{content:""}.fa-viacoin:before{content:""}.fa-train:before{content:""}.fa-subway:before{content:""}.fa-medium:before{content:""}.fa-y-combinator:before,.fa-yc:before{content:""}.fa-optin-monster:before{content:""}.fa-opencart:before{content:""}.fa-expeditedssl:before{content:""}.fa-battery-4:before,.fa-battery-full:before,.fa-battery:before{content:""}.fa-battery-3:before,.fa-battery-three-quarters:before{content:""}.fa-battery-2:before,.fa-battery-half:before{content:""}.fa-battery-1:before,.fa-battery-quarter:before{content:""}.fa-battery-0:before,.fa-battery-empty:before{content:""}.fa-mouse-pointer:before{content:""}.fa-i-cursor:before{content:""}.fa-object-group:before{content:""}.fa-object-ungroup:before{content:""}.fa-sticky-note:before{content:""}.fa-sticky-note-o:before{content:""}.fa-cc-jcb:before{content:""}.fa-cc-diners-club:before{content:""}.fa-clone:before{content:""}.fa-balance-scale:before{content:""}.fa-hourglass-o:before{content:""}.fa-hourglass-1:before,.fa-hourglass-start:before{content:""}.fa-hourglass-2:before,.fa-hourglass-half:before{content:""}.fa-hourglass-3:before,.fa-hourglass-end:before{content:""}.fa-hourglass:before{content:""}.fa-hand-grab-o:before,.fa-hand-rock-o:before{content:""}.fa-hand-paper-o:before,.fa-hand-stop-o:before{content:""}.fa-hand-scissors-o:before{content:""}.fa-hand-lizard-o:before{content:""}.fa-hand-spock-o:before{content:""}.fa-hand-pointer-o:before{content:""}.fa-hand-peace-o:before{content:""}.fa-trademark:before{content:""}.fa-register
ed:before{content:""}.fa-creative-commons:before{content:""}.fa-gg:before{content:""}.fa-gg-circle:before{content:""}.fa-tripadvisor:before{content:""}.fa-odnoklassniki:before{content:""}.fa-odnoklassniki-square:before{content:""}.fa-get-pocket:before{content:""}.fa-wikipedia-w:before{content:""}.fa-safari:before{content:""}.fa-chrome:before{content:""}.fa-firefox:before{content:""}.fa-opera:before{content:""}.fa-internet-explorer:before{content:""}.fa-television:before,.fa-tv:before{content:""}.fa-contao:before{content:""}.fa-500px:before{content:""}.fa-amazon:before{content:""}.fa-calendar-plus-o:before{content:""}.fa-calendar-minus-o:before{content:""}.fa-calendar-times-o:before{content:""}.fa-calendar-check-o:before{content:""}.fa-industry:before{content:""}.fa-map-pin:before{content:""}.fa-map-signs:before{content:""}.fa-map-o:before{content:""}.fa-map:before{content:""}.fa-commenting:before{content:""}.fa-commenting-o:before{content:""}.fa-houzz:before{content:""}.fa-vimeo:before{content:""}.fa-black-tie:before{content:""}.fa-fonticons:before{content:""}.fa-reddit-alien:before{content:""}.fa-edge:before{content:""}.fa-credit-card-alt:before{content:""}.fa-codiepie:before{content:""}.fa-modx:before{content:""}.fa-fort-awesome:before{content:""}.fa-usb:before{content:""}.fa-product-hunt:before{content:""}.fa-mixcloud:before{content:""}.fa-scribd:before{content:""}.fa-pause-circle:before{content:""}.fa-pause-circle-o:before{content:""}.fa-stop-circle:before{content:""}.fa-stop-circle-o:before{content:""}.fa-shopping-bag:before{content:""}.fa-shopping-basket:before{content:""}.fa-hashtag:before{content:""}.fa-bluetooth:before{content:""}.fa-bluetooth-b:before{content:""}.fa-percent:before{content:""}.fa-gitlab:before,.icon-gitlab:before{content:""}.fa-wpbeginner:before{content:""}.fa-wpforms:before{content:""}.fa-envira:before{content:""}.fa-universal-access:before{content:""}.fa-wheelchair-alt:before{content:""}.fa-question-circle-o:before{content:""}.fa-blind:before{content:""}.fa-audio-description:before{content:""}.fa-volume-control-phone:before{content:""}.fa-braille:before{content:""}.fa-assistive-listening-systems:before{content:""}.fa-american-sign-language-interpreting:before,.fa-asl-interpreting:before{content:""}.fa-deaf:before,.fa-deafness:before,.fa-hard-of-hearing:before{content:""}.fa-glide:before{content:""}.fa-glide-g:before{content:""}.fa-sign-language:before,.fa-signing:before{content:""}.fa-low-vision:before{content:""}.fa-viadeo:before{content:""}.fa-viadeo-square:before{content:""}.fa-snapchat:before{content:""}.fa-snapchat-ghost:before{content:""}.fa-snapchat-square:before{content:""}.fa-pied-piper:before{content:""}.fa-first-order:before{content:""}.fa-yoast:before{content:""}.fa-themeisle:before{content:""}.fa-google-plus-circle:before,.fa-google-plus-official:before{content:""}.fa-fa:before,.fa-font-awesome:before{content:""}.fa-handshake-o:before{content:""}.fa-envelope-open:before{content:""}.fa-envelope-open-o:before{content:""}.fa-linode:before{content:""}.fa-address-book:before{content:""}.fa-address-book-o:before{content:""}.fa-address-card:before,.fa-vcard:before{content:""}.fa-address-card-o:before,.fa-vcard-o:before{content:""}.fa-user-circle:before{content:""}.fa-user-circle-o:before{content:""}.fa-user-o:before{content:""}.fa-id-badge:before{content:""}.fa-drivers-license:before,.fa-id-card:before{content:""}.fa-drivers-license-o:before,.fa-id-card-o:before{c
ontent:""}.fa-quora:before{content:""}.fa-free-code-camp:before{content:""}.fa-telegram:before{content:""}.fa-thermometer-4:before,.fa-thermometer-full:before,.fa-thermometer:before{content:""}.fa-thermometer-3:before,.fa-thermometer-three-quarters:before{content:""}.fa-thermometer-2:before,.fa-thermometer-half:before{content:""}.fa-thermometer-1:before,.fa-thermometer-quarter:before{content:""}.fa-thermometer-0:before,.fa-thermometer-empty:before{content:""}.fa-shower:before{content:""}.fa-bath:before,.fa-bathtub:before,.fa-s15:before{content:""}.fa-podcast:before{content:""}.fa-window-maximize:before{content:""}.fa-window-minimize:before{content:""}.fa-window-restore:before{content:""}.fa-times-rectangle:before,.fa-window-close:before{content:""}.fa-times-rectangle-o:before,.fa-window-close-o:before{content:""}.fa-bandcamp:before{content:""}.fa-grav:before{content:""}.fa-etsy:before{content:""}.fa-imdb:before{content:""}.fa-ravelry:before{content:""}.fa-eercast:before{content:""}.fa-microchip:before{content:""}.fa-snowflake-o:before{content:""}.fa-superpowers:before{content:""}.fa-wpexplorer:before{content:""}.fa-meetup:before{content:""}.sr-only{position:absolute;width:1px;height:1px;padding:0;margin:-1px;overflow:hidden;clip:rect(0,0,0,0);border:0}.sr-only-focusable:active,.sr-only-focusable:focus{position:static;width:auto;height:auto;margin:0;overflow:visible;clip:auto}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-dropdown .caret,.wy-inline-validate.wy-inline-validate-danger .wy-input-context,.wy-inline-validate.wy-inline-validate-info .wy-input-context,.wy-inline-validate.wy-inline-validate-success .wy-input-context,.wy-inline-validate.wy-inline-validate-warning .wy-input-context,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{font-family:inherit}.fa:before,.icon:before,.rst-content .admonition-title:before,.rst-content .code-block-caption .headerlink:before,.rst-content .eqno .headerlink:before,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li 
button.toctree-expand:before{font-family:FontAwesome;display:inline-block;font-style:normal;font-weight:400;line-height:1;text-decoration:inherit}.rst-content .code-block-caption a .headerlink,.rst-content .eqno a .headerlink,.rst-content a .admonition-title,.rst-content code.download a span:first-child,.rst-content dl dt a .headerlink,.rst-content h1 a .headerlink,.rst-content h2 a .headerlink,.rst-content h3 a .headerlink,.rst-content h4 a .headerlink,.rst-content h5 a .headerlink,.rst-content h6 a .headerlink,.rst-content p.caption a .headerlink,.rst-content p a .headerlink,.rst-content table>caption a .headerlink,.rst-content tt.download a span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li a button.toctree-expand,a .fa,a .icon,a .rst-content .admonition-title,a .rst-content .code-block-caption .headerlink,a .rst-content .eqno .headerlink,a .rst-content code.download span:first-child,a .rst-content dl dt .headerlink,a .rst-content h1 .headerlink,a .rst-content h2 .headerlink,a .rst-content h3 .headerlink,a .rst-content h4 .headerlink,a .rst-content h5 .headerlink,a .rst-content h6 .headerlink,a .rst-content p.caption .headerlink,a .rst-content p .headerlink,a .rst-content table>caption .headerlink,a .rst-content tt.download span:first-child,a .wy-menu-vertical li button.toctree-expand{display:inline-block;text-decoration:inherit}.btn .fa,.btn .icon,.btn .rst-content .admonition-title,.btn .rst-content .code-block-caption .headerlink,.btn .rst-content .eqno .headerlink,.btn .rst-content code.download span:first-child,.btn .rst-content dl dt .headerlink,.btn .rst-content h1 .headerlink,.btn .rst-content h2 .headerlink,.btn .rst-content h3 .headerlink,.btn .rst-content h4 .headerlink,.btn .rst-content h5 .headerlink,.btn .rst-content h6 .headerlink,.btn .rst-content p .headerlink,.btn .rst-content table>caption .headerlink,.btn .rst-content tt.download span:first-child,.btn .wy-menu-vertical li.current>a button.toctree-expand,.btn .wy-menu-vertical li.on a button.toctree-expand,.btn .wy-menu-vertical li button.toctree-expand,.nav .fa,.nav .icon,.nav .rst-content .admonition-title,.nav .rst-content .code-block-caption .headerlink,.nav .rst-content .eqno .headerlink,.nav .rst-content code.download span:first-child,.nav .rst-content dl dt .headerlink,.nav .rst-content h1 .headerlink,.nav .rst-content h2 .headerlink,.nav .rst-content h3 .headerlink,.nav .rst-content h4 .headerlink,.nav .rst-content h5 .headerlink,.nav .rst-content h6 .headerlink,.nav .rst-content p .headerlink,.nav .rst-content table>caption .headerlink,.nav .rst-content tt.download span:first-child,.nav .wy-menu-vertical li.current>a button.toctree-expand,.nav .wy-menu-vertical li.on a button.toctree-expand,.nav .wy-menu-vertical li button.toctree-expand,.rst-content .btn .admonition-title,.rst-content .code-block-caption .btn .headerlink,.rst-content .code-block-caption .nav .headerlink,.rst-content .eqno .btn .headerlink,.rst-content .eqno .nav .headerlink,.rst-content .nav .admonition-title,.rst-content code.download .btn span:first-child,.rst-content code.download .nav span:first-child,.rst-content dl dt .btn .headerlink,.rst-content dl dt .nav .headerlink,.rst-content h1 .btn .headerlink,.rst-content h1 .nav .headerlink,.rst-content h2 .btn .headerlink,.rst-content h2 .nav .headerlink,.rst-content h3 .btn .headerlink,.rst-content h3 .nav .headerlink,.rst-content h4 .btn .headerlink,.rst-content h4 .nav .headerlink,.rst-content h5 .btn 
.headerlink,.rst-content h5 .nav .headerlink,.rst-content h6 .btn .headerlink,.rst-content h6 .nav .headerlink,.rst-content p .btn .headerlink,.rst-content p .nav .headerlink,.rst-content table>caption .btn .headerlink,.rst-content table>caption .nav .headerlink,.rst-content tt.download .btn span:first-child,.rst-content tt.download .nav span:first-child,.wy-menu-vertical li .btn button.toctree-expand,.wy-menu-vertical li.current>a .btn button.toctree-expand,.wy-menu-vertical li.current>a .nav button.toctree-expand,.wy-menu-vertical li .nav button.toctree-expand,.wy-menu-vertical li.on a .btn button.toctree-expand,.wy-menu-vertical li.on a .nav button.toctree-expand{display:inline}.btn .fa-large.icon,.btn .fa.fa-large,.btn .rst-content .code-block-caption .fa-large.headerlink,.btn .rst-content .eqno .fa-large.headerlink,.btn .rst-content .fa-large.admonition-title,.btn .rst-content code.download span.fa-large:first-child,.btn .rst-content dl dt .fa-large.headerlink,.btn .rst-content h1 .fa-large.headerlink,.btn .rst-content h2 .fa-large.headerlink,.btn .rst-content h3 .fa-large.headerlink,.btn .rst-content h4 .fa-large.headerlink,.btn .rst-content h5 .fa-large.headerlink,.btn .rst-content h6 .fa-large.headerlink,.btn .rst-content p .fa-large.headerlink,.btn .rst-content table>caption .fa-large.headerlink,.btn .rst-content tt.download span.fa-large:first-child,.btn .wy-menu-vertical li button.fa-large.toctree-expand,.nav .fa-large.icon,.nav .fa.fa-large,.nav .rst-content .code-block-caption .fa-large.headerlink,.nav .rst-content .eqno .fa-large.headerlink,.nav .rst-content .fa-large.admonition-title,.nav .rst-content code.download span.fa-large:first-child,.nav .rst-content dl dt .fa-large.headerlink,.nav .rst-content h1 .fa-large.headerlink,.nav .rst-content h2 .fa-large.headerlink,.nav .rst-content h3 .fa-large.headerlink,.nav .rst-content h4 .fa-large.headerlink,.nav .rst-content h5 .fa-large.headerlink,.nav .rst-content h6 .fa-large.headerlink,.nav .rst-content p .fa-large.headerlink,.nav .rst-content table>caption .fa-large.headerlink,.nav .rst-content tt.download span.fa-large:first-child,.nav .wy-menu-vertical li button.fa-large.toctree-expand,.rst-content .btn .fa-large.admonition-title,.rst-content .code-block-caption .btn .fa-large.headerlink,.rst-content .code-block-caption .nav .fa-large.headerlink,.rst-content .eqno .btn .fa-large.headerlink,.rst-content .eqno .nav .fa-large.headerlink,.rst-content .nav .fa-large.admonition-title,.rst-content code.download .btn span.fa-large:first-child,.rst-content code.download .nav span.fa-large:first-child,.rst-content dl dt .btn .fa-large.headerlink,.rst-content dl dt .nav .fa-large.headerlink,.rst-content h1 .btn .fa-large.headerlink,.rst-content h1 .nav .fa-large.headerlink,.rst-content h2 .btn .fa-large.headerlink,.rst-content h2 .nav .fa-large.headerlink,.rst-content h3 .btn .fa-large.headerlink,.rst-content h3 .nav .fa-large.headerlink,.rst-content h4 .btn .fa-large.headerlink,.rst-content h4 .nav .fa-large.headerlink,.rst-content h5 .btn .fa-large.headerlink,.rst-content h5 .nav .fa-large.headerlink,.rst-content h6 .btn .fa-large.headerlink,.rst-content h6 .nav .fa-large.headerlink,.rst-content p .btn .fa-large.headerlink,.rst-content p .nav .fa-large.headerlink,.rst-content table>caption .btn .fa-large.headerlink,.rst-content table>caption .nav .fa-large.headerlink,.rst-content tt.download .btn span.fa-large:first-child,.rst-content tt.download .nav span.fa-large:first-child,.wy-menu-vertical li .btn 
button.fa-large.toctree-expand,.wy-menu-vertical li .nav button.fa-large.toctree-expand{line-height:.9em}.btn .fa-spin.icon,.btn .fa.fa-spin,.btn .rst-content .code-block-caption .fa-spin.headerlink,.btn .rst-content .eqno .fa-spin.headerlink,.btn .rst-content .fa-spin.admonition-title,.btn .rst-content code.download span.fa-spin:first-child,.btn .rst-content dl dt .fa-spin.headerlink,.btn .rst-content h1 .fa-spin.headerlink,.btn .rst-content h2 .fa-spin.headerlink,.btn .rst-content h3 .fa-spin.headerlink,.btn .rst-content h4 .fa-spin.headerlink,.btn .rst-content h5 .fa-spin.headerlink,.btn .rst-content h6 .fa-spin.headerlink,.btn .rst-content p .fa-spin.headerlink,.btn .rst-content table>caption .fa-spin.headerlink,.btn .rst-content tt.download span.fa-spin:first-child,.btn .wy-menu-vertical li button.fa-spin.toctree-expand,.nav .fa-spin.icon,.nav .fa.fa-spin,.nav .rst-content .code-block-caption .fa-spin.headerlink,.nav .rst-content .eqno .fa-spin.headerlink,.nav .rst-content .fa-spin.admonition-title,.nav .rst-content code.download span.fa-spin:first-child,.nav .rst-content dl dt .fa-spin.headerlink,.nav .rst-content h1 .fa-spin.headerlink,.nav .rst-content h2 .fa-spin.headerlink,.nav .rst-content h3 .fa-spin.headerlink,.nav .rst-content h4 .fa-spin.headerlink,.nav .rst-content h5 .fa-spin.headerlink,.nav .rst-content h6 .fa-spin.headerlink,.nav .rst-content p .fa-spin.headerlink,.nav .rst-content table>caption .fa-spin.headerlink,.nav .rst-content tt.download span.fa-spin:first-child,.nav .wy-menu-vertical li button.fa-spin.toctree-expand,.rst-content .btn .fa-spin.admonition-title,.rst-content .code-block-caption .btn .fa-spin.headerlink,.rst-content .code-block-caption .nav .fa-spin.headerlink,.rst-content .eqno .btn .fa-spin.headerlink,.rst-content .eqno .nav .fa-spin.headerlink,.rst-content .nav .fa-spin.admonition-title,.rst-content code.download .btn span.fa-spin:first-child,.rst-content code.download .nav span.fa-spin:first-child,.rst-content dl dt .btn .fa-spin.headerlink,.rst-content dl dt .nav .fa-spin.headerlink,.rst-content h1 .btn .fa-spin.headerlink,.rst-content h1 .nav .fa-spin.headerlink,.rst-content h2 .btn .fa-spin.headerlink,.rst-content h2 .nav .fa-spin.headerlink,.rst-content h3 .btn .fa-spin.headerlink,.rst-content h3 .nav .fa-spin.headerlink,.rst-content h4 .btn .fa-spin.headerlink,.rst-content h4 .nav .fa-spin.headerlink,.rst-content h5 .btn .fa-spin.headerlink,.rst-content h5 .nav .fa-spin.headerlink,.rst-content h6 .btn .fa-spin.headerlink,.rst-content h6 .nav .fa-spin.headerlink,.rst-content p .btn .fa-spin.headerlink,.rst-content p .nav .fa-spin.headerlink,.rst-content table>caption .btn .fa-spin.headerlink,.rst-content table>caption .nav .fa-spin.headerlink,.rst-content tt.download .btn span.fa-spin:first-child,.rst-content tt.download .nav span.fa-spin:first-child,.wy-menu-vertical li .btn button.fa-spin.toctree-expand,.wy-menu-vertical li .nav button.fa-spin.toctree-expand{display:inline-block}.btn.fa:before,.btn.icon:before,.rst-content .btn.admonition-title:before,.rst-content .code-block-caption .btn.headerlink:before,.rst-content .eqno .btn.headerlink:before,.rst-content code.download span.btn:first-child:before,.rst-content dl dt .btn.headerlink:before,.rst-content h1 .btn.headerlink:before,.rst-content h2 .btn.headerlink:before,.rst-content h3 .btn.headerlink:before,.rst-content h4 .btn.headerlink:before,.rst-content h5 .btn.headerlink:before,.rst-content h6 .btn.headerlink:before,.rst-content p .btn.headerlink:before,.rst-content table>caption 
.btn.headerlink:before,.rst-content tt.download span.btn:first-child:before,.wy-menu-vertical li button.btn.toctree-expand:before{opacity:.5;-webkit-transition:opacity .05s ease-in;-moz-transition:opacity .05s ease-in;transition:opacity .05s ease-in}.btn.fa:hover:before,.btn.icon:hover:before,.rst-content .btn.admonition-title:hover:before,.rst-content .code-block-caption .btn.headerlink:hover:before,.rst-content .eqno .btn.headerlink:hover:before,.rst-content code.download span.btn:first-child:hover:before,.rst-content dl dt .btn.headerlink:hover:before,.rst-content h1 .btn.headerlink:hover:before,.rst-content h2 .btn.headerlink:hover:before,.rst-content h3 .btn.headerlink:hover:before,.rst-content h4 .btn.headerlink:hover:before,.rst-content h5 .btn.headerlink:hover:before,.rst-content h6 .btn.headerlink:hover:before,.rst-content p .btn.headerlink:hover:before,.rst-content table>caption .btn.headerlink:hover:before,.rst-content tt.download span.btn:first-child:hover:before,.wy-menu-vertical li button.btn.toctree-expand:hover:before{opacity:1}.btn-mini .fa:before,.btn-mini .icon:before,.btn-mini .rst-content .admonition-title:before,.btn-mini .rst-content .code-block-caption .headerlink:before,.btn-mini .rst-content .eqno .headerlink:before,.btn-mini .rst-content code.download span:first-child:before,.btn-mini .rst-content dl dt .headerlink:before,.btn-mini .rst-content h1 .headerlink:before,.btn-mini .rst-content h2 .headerlink:before,.btn-mini .rst-content h3 .headerlink:before,.btn-mini .rst-content h4 .headerlink:before,.btn-mini .rst-content h5 .headerlink:before,.btn-mini .rst-content h6 .headerlink:before,.btn-mini .rst-content p .headerlink:before,.btn-mini .rst-content table>caption .headerlink:before,.btn-mini .rst-content tt.download span:first-child:before,.btn-mini .wy-menu-vertical li button.toctree-expand:before,.rst-content .btn-mini .admonition-title:before,.rst-content .code-block-caption .btn-mini .headerlink:before,.rst-content .eqno .btn-mini .headerlink:before,.rst-content code.download .btn-mini span:first-child:before,.rst-content dl dt .btn-mini .headerlink:before,.rst-content h1 .btn-mini .headerlink:before,.rst-content h2 .btn-mini .headerlink:before,.rst-content h3 .btn-mini .headerlink:before,.rst-content h4 .btn-mini .headerlink:before,.rst-content h5 .btn-mini .headerlink:before,.rst-content h6 .btn-mini .headerlink:before,.rst-content p .btn-mini .headerlink:before,.rst-content table>caption .btn-mini .headerlink:before,.rst-content tt.download .btn-mini span:first-child:before,.wy-menu-vertical li .btn-mini button.toctree-expand:before{font-size:14px;vertical-align:-15%}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.wy-alert{padding:12px;line-height:24px;margin-bottom:24px;background:#e7f2fa}.rst-content .admonition-title,.wy-alert-title{font-weight:700;display:block;color:#fff;background:#6ab0de;padding:6px 12px;margin:-12px -12px 12px}.rst-content .danger,.rst-content .error,.rst-content .wy-alert-danger.admonition,.rst-content .wy-alert-danger.admonition-todo,.rst-content .wy-alert-danger.attention,.rst-content .wy-alert-danger.caution,.rst-content .wy-alert-danger.hint,.rst-content .wy-alert-danger.important,.rst-content .wy-alert-danger.note,.rst-content .wy-alert-danger.seealso,.rst-content .wy-alert-danger.tip,.rst-content 
.wy-table-responsive.footnote:last-child{margin-bottom:24px}.rst-content table.docutils th{border-color:#e1e4e5}html.writer-html5 .rst-content table.docutils th{border:1px solid #e1e4e5}html.writer-html5 .rst-content table.docutils td>p,html.writer-html5 .rst-content table.docutils th>p{line-height:1rem;margin-bottom:0;font-size:.9rem}.rst-content table.docutils td .last,.rst-content table.docutils td .last>:last-child{margin-bottom:0}.rst-content table.field-list,.rst-content table.field-list td{border:none}.rst-content table.field-list td p{line-height:inherit}.rst-content table.field-list td>strong{display:inline-block}.rst-content table.field-list .field-name{padding-right:10px;text-align:left;white-space:nowrap}.rst-content table.field-list .field-body{text-align:left}.rst-content code,.rst-content tt{color:#000;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;padding:2px 5px}.rst-content code big,.rst-content code em,.rst-content tt big,.rst-content tt em{font-size:100%!important;line-height:normal}.rst-content code.literal,.rst-content tt.literal{color:#e74c3c;white-space:normal}.rst-content code.xref,.rst-content tt.xref,a .rst-content code,a .rst-content tt{font-weight:700;color:#404040;overflow-wrap:normal}.rst-content kbd,.rst-content pre,.rst-content samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace}.rst-content a code,.rst-content a tt{color:#2980b9}.rst-content dl{margin-bottom:24px}.rst-content dl dt{font-weight:700;margin-bottom:12px}.rst-content dl ol,.rst-content dl p,.rst-content dl table,.rst-content dl ul{margin-bottom:12px}.rst-content dl dd{margin:0 0 12px 24px;line-height:24px}.rst-content dl dd>ol:last-child,.rst-content dl dd>p:last-child,.rst-content dl dd>table:last-child,.rst-content dl dd>ul:last-child{margin-bottom:0}html.writer-html4 .rst-content dl:not(.docutils),html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple){margin-bottom:24px}html.writer-html4 .rst-content dl:not(.docutils)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{display:table;margin:6px 0;font-size:90%;line-height:normal;background:#e7f2fa;color:#2980b9;border-top:3px solid #6ab0de;padding:6px;position:relative}html.writer-html4 .rst-content dl:not(.docutils)>dt:before,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:before{color:#6ab0de}html.writer-html4 .rst-content dl:not(.docutils)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{margin-bottom:6px;border:none;border-left:3px solid #ccc;background:#f0f0f0;color:#555}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink,html.writer-html5 .rst-content 
dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px;max-width:100%}html.writer-html4 .rst-content dl:not(.docutils) .k,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .k{font-style:italic}html.writer-html4 .rst-content dl:not(.docutils) .descclassname,html.writer-html4 .rst-content dl:not(.docutils) .descname,html.writer-html4 .rst-content dl:not(.docutils) .sig-name,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .sig-name{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#000}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content 
code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel,.rst-content .menuselection{font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content .guilabel,.rst-content .menuselection{border:1px solid #7fbbe3;background:#e7f2fa}.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>.kbd,.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>kbd{color:inherit;font-size:80%;background-color:#fff;border:1px solid #a6a6a6;border-radius:4px;box-shadow:0 2px grey;padding:2.4px 6px;margin:auto 0}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content .sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} \ No newline at end of file diff --git a/branch/bicounty_emme/_static/doctools.js b/branch/bicounty_emme/_static/doctools.js new file mode 100644 index 0000000..d06a71d --- /dev/null +++ b/branch/bicounty_emme/_static/doctools.js @@ -0,0 +1,156 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Base JavaScript utilities for all Sphinx HTML documentation. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +const BLACKLISTED_KEY_CONTROL_ELEMENTS = new Set([ + "TEXTAREA", + "INPUT", + "SELECT", + "BUTTON", +]); + +const _ready = (callback) => { + if (document.readyState !== "loading") { + callback(); + } else { + document.addEventListener("DOMContentLoaded", callback); + } +}; + +/** + * Small JavaScript module for the documentation. 
+ */ +const Documentation = { + init: () => { + Documentation.initDomainIndexTable(); + Documentation.initOnKeyListeners(); + }, + + /** + * i18n support + */ + TRANSLATIONS: {}, + PLURAL_EXPR: (n) => (n === 1 ? 0 : 1), + LOCALE: "unknown", + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext: (string) => { + const translated = Documentation.TRANSLATIONS[string]; + switch (typeof translated) { + case "undefined": + return string; // no translation + case "string": + return translated; // translation exists + default: + return translated[0]; // (singular, plural) translation tuple exists + } + }, + + ngettext: (singular, plural, n) => { + const translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated !== "undefined") + return translated[Documentation.PLURAL_EXPR(n)]; + return n === 1 ? singular : plural; + }, + + addTranslations: (catalog) => { + Object.assign(Documentation.TRANSLATIONS, catalog.messages); + Documentation.PLURAL_EXPR = new Function( + "n", + `return (${catalog.plural_expr})` + ); + Documentation.LOCALE = catalog.locale; + }, + + /** + * helper function to focus on search bar + */ + focusSearchBar: () => { + document.querySelectorAll("input[name=q]")[0]?.focus(); + }, + + /** + * Initialise the domain index toggle buttons + */ + initDomainIndexTable: () => { + const toggler = (el) => { + const idNumber = el.id.substr(7); + const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); + if (el.src.substr(-9) === "minus.png") { + el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; + toggledRows.forEach((el) => (el.style.display = "none")); + } else { + el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; + toggledRows.forEach((el) => (el.style.display = "")); + } + }; + + const togglerElements = document.querySelectorAll("img.toggler"); + togglerElements.forEach((el) => + el.addEventListener("click", (event) => toggler(event.currentTarget)) + ); + togglerElements.forEach((el) => (el.style.display = "")); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); + }, + + initOnKeyListeners: () => { + // only install a listener if it is really needed + if ( + !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && + !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS + ) + return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.altKey || event.ctrlKey || event.metaKey) return; + + if (!event.shiftKey) { + switch (event.key) { + case "ArrowLeft": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const prevLink = document.querySelector('link[rel="prev"]'); + if (prevLink && prevLink.href) { + window.location.href = prevLink.href; + event.preventDefault(); + } + break; + case "ArrowRight": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const nextLink = document.querySelector('link[rel="next"]'); + if (nextLink && nextLink.href) { + window.location.href = nextLink.href; + event.preventDefault(); + } + break; + } + } + + // some keyboard layouts may need Shift to get / + switch (event.key) { + case "/": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.focusSearchBar(); + event.preventDefault(); + } + }); + }, +}; + +// quick alias for translations +const _ = Documentation.gettext; + +_ready(Documentation.init); diff --git 
a/branch/bicounty_emme/_static/documentation_options.js b/branch/bicounty_emme/_static/documentation_options.js new file mode 100644 index 0000000..c066c69 --- /dev/null +++ b/branch/bicounty_emme/_static/documentation_options.js @@ -0,0 +1,14 @@ +var DOCUMENTATION_OPTIONS = { + URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'), + VERSION: '', + LANGUAGE: 'en', + COLLAPSE_INDEX: false, + BUILDER: 'dirhtml', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false, + SHOW_SEARCH_SUMMARY: true, + ENABLE_SEARCH_SHORTCUTS: true, +}; \ No newline at end of file diff --git a/branch/bicounty_emme/_static/file.png b/branch/bicounty_emme/_static/file.png new file mode 100644 index 0000000..a858a41 Binary files /dev/null and b/branch/bicounty_emme/_static/file.png differ diff --git a/branch/bicounty_emme/_static/graphviz.css b/branch/bicounty_emme/_static/graphviz.css new file mode 100644 index 0000000..8d81c02 --- /dev/null +++ b/branch/bicounty_emme/_static/graphviz.css @@ -0,0 +1,19 @@ +/* + * graphviz.css + * ~~~~~~~~~~~~ + * + * Sphinx stylesheet -- graphviz extension. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +img.graphviz { + border: 0; + max-width: 100%; +} + +object.graphviz { + max-width: 100%; +} diff --git a/branch/bicounty_emme/_static/jquery.js b/branch/bicounty_emme/_static/jquery.js new file mode 100644 index 0000000..c4c6022 --- /dev/null +++ b/branch/bicounty_emme/_static/jquery.js @@ -0,0 +1,2 @@ +/*! jQuery v3.6.0 | (c) OpenJS Foundation and other contributors | jquery.org/license */ +!function(e,t){"use strict";"object"==typeof module&&"object"==typeof module.exports?module.exports=e.document?t(e,!0):function(e){if(!e.document)throw new Error("jQuery requires a window with a document");return t(e)}:t(e)}("undefined"!=typeof window?window:this,function(C,e){"use strict";var t=[],r=Object.getPrototypeOf,s=t.slice,g=t.flat?function(e){return t.flat.call(e)}:function(e){return t.concat.apply([],e)},u=t.push,i=t.indexOf,n={},o=n.toString,v=n.hasOwnProperty,a=v.toString,l=a.call(Object),y={},m=function(e){return"function"==typeof e&&"number"!=typeof e.nodeType&&"function"!=typeof e.item},x=function(e){return null!=e&&e===e.window},E=C.document,c={type:!0,src:!0,nonce:!0,noModule:!0};function b(e,t,n){var r,i,o=(n=n||E).createElement("script");if(o.text=e,t)for(r in c)(i=t[r]||t.getAttribute&&t.getAttribute(r))&&o.setAttribute(r,i);n.head.appendChild(o).parentNode.removeChild(o)}function w(e){return null==e?e+"":"object"==typeof e||"function"==typeof e?n[o.call(e)]||"object":typeof e}var f="3.6.0",S=function(e,t){return new S.fn.init(e,t)};function p(e){var t=!!e&&"length"in e&&e.length,n=w(e);return!m(e)&&!x(e)&&("array"===n||0===t||"number"==typeof t&&0+~]|"+M+")"+M+"*"),U=new RegExp(M+"|>"),X=new RegExp(F),V=new RegExp("^"+I+"$"),G={ID:new RegExp("^#("+I+")"),CLASS:new RegExp("^\\.("+I+")"),TAG:new RegExp("^("+I+"|[*])"),ATTR:new RegExp("^"+W),PSEUDO:new RegExp("^"+F),CHILD:new RegExp("^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\("+M+"*(even|odd|(([+-]|)(\\d*)n|)"+M+"*(?:([+-]|)"+M+"*(\\d+)|))"+M+"*\\)|)","i"),bool:new RegExp("^(?:"+R+")$","i"),needsContext:new RegExp("^"+M+"*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\("+M+"*((?:-\\d)?\\d*)"+M+"*\\)|)(?=[^-]|$)","i")},Y=/HTML$/i,Q=/^(?:input|select|textarea|button)$/i,J=/^h\d$/i,K=/^[^{]+\{\s*\[native 
\w/,Z=/^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/,ee=/[+~]/,te=new RegExp("\\\\[\\da-fA-F]{1,6}"+M+"?|\\\\([^\\r\\n\\f])","g"),ne=function(e,t){var n="0x"+e.slice(1)-65536;return t||(n<0?String.fromCharCode(n+65536):String.fromCharCode(n>>10|55296,1023&n|56320))},re=/([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g,ie=function(e,t){return t?"\0"===e?"\ufffd":e.slice(0,-1)+"\\"+e.charCodeAt(e.length-1).toString(16)+" ":"\\"+e},oe=function(){T()},ae=be(function(e){return!0===e.disabled&&"fieldset"===e.nodeName.toLowerCase()},{dir:"parentNode",next:"legend"});try{H.apply(t=O.call(p.childNodes),p.childNodes),t[p.childNodes.length].nodeType}catch(e){H={apply:t.length?function(e,t){L.apply(e,O.call(t))}:function(e,t){var n=e.length,r=0;while(e[n++]=t[r++]);e.length=n-1}}}function se(t,e,n,r){var i,o,a,s,u,l,c,f=e&&e.ownerDocument,p=e?e.nodeType:9;if(n=n||[],"string"!=typeof t||!t||1!==p&&9!==p&&11!==p)return n;if(!r&&(T(e),e=e||C,E)){if(11!==p&&(u=Z.exec(t)))if(i=u[1]){if(9===p){if(!(a=e.getElementById(i)))return n;if(a.id===i)return n.push(a),n}else if(f&&(a=f.getElementById(i))&&y(e,a)&&a.id===i)return n.push(a),n}else{if(u[2])return H.apply(n,e.getElementsByTagName(t)),n;if((i=u[3])&&d.getElementsByClassName&&e.getElementsByClassName)return H.apply(n,e.getElementsByClassName(i)),n}if(d.qsa&&!N[t+" "]&&(!v||!v.test(t))&&(1!==p||"object"!==e.nodeName.toLowerCase())){if(c=t,f=e,1===p&&(U.test(t)||z.test(t))){(f=ee.test(t)&&ye(e.parentNode)||e)===e&&d.scope||((s=e.getAttribute("id"))?s=s.replace(re,ie):e.setAttribute("id",s=S)),o=(l=h(t)).length;while(o--)l[o]=(s?"#"+s:":scope")+" "+xe(l[o]);c=l.join(",")}try{return H.apply(n,f.querySelectorAll(c)),n}catch(e){N(t,!0)}finally{s===S&&e.removeAttribute("id")}}}return g(t.replace($,"$1"),e,n,r)}function ue(){var r=[];return function e(t,n){return r.push(t+" ")>b.cacheLength&&delete e[r.shift()],e[t+" "]=n}}function le(e){return e[S]=!0,e}function ce(e){var t=C.createElement("fieldset");try{return!!e(t)}catch(e){return!1}finally{t.parentNode&&t.parentNode.removeChild(t),t=null}}function fe(e,t){var n=e.split("|"),r=n.length;while(r--)b.attrHandle[n[r]]=t}function pe(e,t){var n=t&&e,r=n&&1===e.nodeType&&1===t.nodeType&&e.sourceIndex-t.sourceIndex;if(r)return r;if(n)while(n=n.nextSibling)if(n===t)return-1;return e?1:-1}function de(t){return function(e){return"input"===e.nodeName.toLowerCase()&&e.type===t}}function he(n){return function(e){var t=e.nodeName.toLowerCase();return("input"===t||"button"===t)&&e.type===n}}function ge(t){return function(e){return"form"in e?e.parentNode&&!1===e.disabled?"label"in e?"label"in e.parentNode?e.parentNode.disabled===t:e.disabled===t:e.isDisabled===t||e.isDisabled!==!t&&ae(e)===t:e.disabled===t:"label"in e&&e.disabled===t}}function ve(a){return le(function(o){return o=+o,le(function(e,t){var n,r=a([],e.length,o),i=r.length;while(i--)e[n=r[i]]&&(e[n]=!(t[n]=e[n]))})})}function ye(e){return e&&"undefined"!=typeof e.getElementsByTagName&&e}for(e in d=se.support={},i=se.isXML=function(e){var t=e&&e.namespaceURI,n=e&&(e.ownerDocument||e).documentElement;return!Y.test(t||n&&n.nodeName||"HTML")},T=se.setDocument=function(e){var t,n,r=e?e.ownerDocument||e:p;return r!=C&&9===r.nodeType&&r.documentElement&&(a=(C=r).documentElement,E=!i(C),p!=C&&(n=C.defaultView)&&n.top!==n&&(n.addEventListener?n.addEventListener("unload",oe,!1):n.attachEvent&&n.attachEvent("onunload",oe)),d.scope=ce(function(e){return a.appendChild(e).appendChild(C.createElement("div")),"undefined"!=typeof e.querySelectorAll&&!e.querySelectorAll(":scope fieldset 
div").length}),d.attributes=ce(function(e){return e.className="i",!e.getAttribute("className")}),d.getElementsByTagName=ce(function(e){return e.appendChild(C.createComment("")),!e.getElementsByTagName("*").length}),d.getElementsByClassName=K.test(C.getElementsByClassName),d.getById=ce(function(e){return a.appendChild(e).id=S,!C.getElementsByName||!C.getElementsByName(S).length}),d.getById?(b.filter.ID=function(e){var t=e.replace(te,ne);return function(e){return e.getAttribute("id")===t}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n=t.getElementById(e);return n?[n]:[]}}):(b.filter.ID=function(e){var n=e.replace(te,ne);return function(e){var t="undefined"!=typeof e.getAttributeNode&&e.getAttributeNode("id");return t&&t.value===n}},b.find.ID=function(e,t){if("undefined"!=typeof t.getElementById&&E){var n,r,i,o=t.getElementById(e);if(o){if((n=o.getAttributeNode("id"))&&n.value===e)return[o];i=t.getElementsByName(e),r=0;while(o=i[r++])if((n=o.getAttributeNode("id"))&&n.value===e)return[o]}return[]}}),b.find.TAG=d.getElementsByTagName?function(e,t){return"undefined"!=typeof t.getElementsByTagName?t.getElementsByTagName(e):d.qsa?t.querySelectorAll(e):void 0}:function(e,t){var n,r=[],i=0,o=t.getElementsByTagName(e);if("*"===e){while(n=o[i++])1===n.nodeType&&r.push(n);return r}return o},b.find.CLASS=d.getElementsByClassName&&function(e,t){if("undefined"!=typeof t.getElementsByClassName&&E)return t.getElementsByClassName(e)},s=[],v=[],(d.qsa=K.test(C.querySelectorAll))&&(ce(function(e){var t;a.appendChild(e).innerHTML="",e.querySelectorAll("[msallowcapture^='']").length&&v.push("[*^$]="+M+"*(?:''|\"\")"),e.querySelectorAll("[selected]").length||v.push("\\["+M+"*(?:value|"+R+")"),e.querySelectorAll("[id~="+S+"-]").length||v.push("~="),(t=C.createElement("input")).setAttribute("name",""),e.appendChild(t),e.querySelectorAll("[name='']").length||v.push("\\["+M+"*name"+M+"*="+M+"*(?:''|\"\")"),e.querySelectorAll(":checked").length||v.push(":checked"),e.querySelectorAll("a#"+S+"+*").length||v.push(".#.+[+~]"),e.querySelectorAll("\\\f"),v.push("[\\r\\n\\f]")}),ce(function(e){e.innerHTML="";var t=C.createElement("input");t.setAttribute("type","hidden"),e.appendChild(t).setAttribute("name","D"),e.querySelectorAll("[name=d]").length&&v.push("name"+M+"*[*^$|!~]?="),2!==e.querySelectorAll(":enabled").length&&v.push(":enabled",":disabled"),a.appendChild(e).disabled=!0,2!==e.querySelectorAll(":disabled").length&&v.push(":enabled",":disabled"),e.querySelectorAll("*,:x"),v.push(",.*:")})),(d.matchesSelector=K.test(c=a.matches||a.webkitMatchesSelector||a.mozMatchesSelector||a.oMatchesSelector||a.msMatchesSelector))&&ce(function(e){d.disconnectedMatch=c.call(e,"*"),c.call(e,"[s!='']:x"),s.push("!=",F)}),v=v.length&&new RegExp(v.join("|")),s=s.length&&new RegExp(s.join("|")),t=K.test(a.compareDocumentPosition),y=t||K.test(a.contains)?function(e,t){var n=9===e.nodeType?e.documentElement:e,r=t&&t.parentNode;return e===r||!(!r||1!==r.nodeType||!(n.contains?n.contains(r):e.compareDocumentPosition&&16&e.compareDocumentPosition(r)))}:function(e,t){if(t)while(t=t.parentNode)if(t===e)return!0;return!1},j=t?function(e,t){if(e===t)return l=!0,0;var n=!e.compareDocumentPosition-!t.compareDocumentPosition;return n||(1&(n=(e.ownerDocument||e)==(t.ownerDocument||t)?e.compareDocumentPosition(t):1)||!d.sortDetached&&t.compareDocumentPosition(e)===n?e==C||e.ownerDocument==p&&y(p,e)?-1:t==C||t.ownerDocument==p&&y(p,t)?1:u?P(u,e)-P(u,t):0:4&n?-1:1)}:function(e,t){if(e===t)return l=!0,0;var 
n,r=0,i=e.parentNode,o=t.parentNode,a=[e],s=[t];if(!i||!o)return e==C?-1:t==C?1:i?-1:o?1:u?P(u,e)-P(u,t):0;if(i===o)return pe(e,t);n=e;while(n=n.parentNode)a.unshift(n);n=t;while(n=n.parentNode)s.unshift(n);while(a[r]===s[r])r++;return r?pe(a[r],s[r]):a[r]==p?-1:s[r]==p?1:0}),C},se.matches=function(e,t){return se(e,null,null,t)},se.matchesSelector=function(e,t){if(T(e),d.matchesSelector&&E&&!N[t+" "]&&(!s||!s.test(t))&&(!v||!v.test(t)))try{var n=c.call(e,t);if(n||d.disconnectedMatch||e.document&&11!==e.document.nodeType)return n}catch(e){N(t,!0)}return 0":{dir:"parentNode",first:!0}," ":{dir:"parentNode"},"+":{dir:"previousSibling",first:!0},"~":{dir:"previousSibling"}},preFilter:{ATTR:function(e){return e[1]=e[1].replace(te,ne),e[3]=(e[3]||e[4]||e[5]||"").replace(te,ne),"~="===e[2]&&(e[3]=" "+e[3]+" "),e.slice(0,4)},CHILD:function(e){return e[1]=e[1].toLowerCase(),"nth"===e[1].slice(0,3)?(e[3]||se.error(e[0]),e[4]=+(e[4]?e[5]+(e[6]||1):2*("even"===e[3]||"odd"===e[3])),e[5]=+(e[7]+e[8]||"odd"===e[3])):e[3]&&se.error(e[0]),e},PSEUDO:function(e){var t,n=!e[6]&&e[2];return G.CHILD.test(e[0])?null:(e[3]?e[2]=e[4]||e[5]||"":n&&X.test(n)&&(t=h(n,!0))&&(t=n.indexOf(")",n.length-t)-n.length)&&(e[0]=e[0].slice(0,t),e[2]=n.slice(0,t)),e.slice(0,3))}},filter:{TAG:function(e){var t=e.replace(te,ne).toLowerCase();return"*"===e?function(){return!0}:function(e){return e.nodeName&&e.nodeName.toLowerCase()===t}},CLASS:function(e){var t=m[e+" "];return t||(t=new RegExp("(^|"+M+")"+e+"("+M+"|$)"))&&m(e,function(e){return t.test("string"==typeof e.className&&e.className||"undefined"!=typeof e.getAttribute&&e.getAttribute("class")||"")})},ATTR:function(n,r,i){return function(e){var t=se.attr(e,n);return null==t?"!="===r:!r||(t+="","="===r?t===i:"!="===r?t!==i:"^="===r?i&&0===t.indexOf(i):"*="===r?i&&-1:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i;function j(e,n,r){return m(n)?S.grep(e,function(e,t){return!!n.call(e,t,e)!==r}):n.nodeType?S.grep(e,function(e){return e===n!==r}):"string"!=typeof n?S.grep(e,function(e){return-1)[^>]*|#([\w-]+))$/;(S.fn.init=function(e,t,n){var r,i;if(!e)return this;if(n=n||D,"string"==typeof e){if(!(r="<"===e[0]&&">"===e[e.length-1]&&3<=e.length?[null,e,null]:q.exec(e))||!r[1]&&t)return!t||t.jquery?(t||n).find(e):this.constructor(t).find(e);if(r[1]){if(t=t instanceof S?t[0]:t,S.merge(this,S.parseHTML(r[1],t&&t.nodeType?t.ownerDocument||t:E,!0)),N.test(r[1])&&S.isPlainObject(t))for(r in t)m(this[r])?this[r](t[r]):this.attr(r,t[r]);return this}return(i=E.getElementById(r[2]))&&(this[0]=i,this.length=1),this}return e.nodeType?(this[0]=e,this.length=1,this):m(e)?void 0!==n.ready?n.ready(e):e(S):S.makeArray(e,this)}).prototype=S.fn,D=S(E);var L=/^(?:parents|prev(?:Until|All))/,H={children:!0,contents:!0,next:!0,prev:!0};function O(e,t){while((e=e[t])&&1!==e.nodeType);return e}S.fn.extend({has:function(e){var t=S(e,this),n=t.length;return this.filter(function(){for(var e=0;e\x20\t\r\n\f]*)/i,he=/^$|^module$|\/(?:java|ecma)script/i;ce=E.createDocumentFragment().appendChild(E.createElement("div")),(fe=E.createElement("input")).setAttribute("type","radio"),fe.setAttribute("checked","checked"),fe.setAttribute("name","t"),ce.appendChild(fe),y.checkClone=ce.cloneNode(!0).cloneNode(!0).lastChild.checked,ce.innerHTML="",y.noCloneChecked=!!ce.cloneNode(!0).lastChild.defaultValue,ce.innerHTML="",y.option=!!ce.lastChild;var ge={thead:[1,"","
"],col:[2,"","
"],tr:[2,"","
"],td:[3,"","
"],_default:[0,"",""]};function ve(e,t){var n;return n="undefined"!=typeof e.getElementsByTagName?e.getElementsByTagName(t||"*"):"undefined"!=typeof e.querySelectorAll?e.querySelectorAll(t||"*"):[],void 0===t||t&&A(e,t)?S.merge([e],n):n}function ye(e,t){for(var n=0,r=e.length;n",""]);var me=/<|&#?\w+;/;function xe(e,t,n,r,i){for(var o,a,s,u,l,c,f=t.createDocumentFragment(),p=[],d=0,h=e.length;d\s*$/g;function je(e,t){return A(e,"table")&&A(11!==t.nodeType?t:t.firstChild,"tr")&&S(e).children("tbody")[0]||e}function De(e){return e.type=(null!==e.getAttribute("type"))+"/"+e.type,e}function qe(e){return"true/"===(e.type||"").slice(0,5)?e.type=e.type.slice(5):e.removeAttribute("type"),e}function Le(e,t){var n,r,i,o,a,s;if(1===t.nodeType){if(Y.hasData(e)&&(s=Y.get(e).events))for(i in Y.remove(t,"handle events"),s)for(n=0,r=s[i].length;n").attr(n.scriptAttrs||{}).prop({charset:n.scriptCharset,src:n.url}).on("load error",i=function(e){r.remove(),i=null,e&&t("error"===e.type?404:200,e.type)}),E.head.appendChild(r[0])},abort:function(){i&&i()}}});var _t,zt=[],Ut=/(=)\?(?=&|$)|\?\?/;S.ajaxSetup({jsonp:"callback",jsonpCallback:function(){var e=zt.pop()||S.expando+"_"+wt.guid++;return this[e]=!0,e}}),S.ajaxPrefilter("json jsonp",function(e,t,n){var r,i,o,a=!1!==e.jsonp&&(Ut.test(e.url)?"url":"string"==typeof e.data&&0===(e.contentType||"").indexOf("application/x-www-form-urlencoded")&&Ut.test(e.data)&&"data");if(a||"jsonp"===e.dataTypes[0])return r=e.jsonpCallback=m(e.jsonpCallback)?e.jsonpCallback():e.jsonpCallback,a?e[a]=e[a].replace(Ut,"$1"+r):!1!==e.jsonp&&(e.url+=(Tt.test(e.url)?"&":"?")+e.jsonp+"="+r),e.converters["script json"]=function(){return o||S.error(r+" was not called"),o[0]},e.dataTypes[0]="json",i=C[r],C[r]=function(){o=arguments},n.always(function(){void 0===i?S(C).removeProp(r):C[r]=i,e[r]&&(e.jsonpCallback=t.jsonpCallback,zt.push(r)),o&&m(i)&&i(o[0]),o=i=void 0}),"script"}),y.createHTMLDocument=((_t=E.implementation.createHTMLDocument("").body).innerHTML="
",2===_t.childNodes.length),S.parseHTML=function(e,t,n){return"string"!=typeof e?[]:("boolean"==typeof t&&(n=t,t=!1),t||(y.createHTMLDocument?((r=(t=E.implementation.createHTMLDocument("")).createElement("base")).href=E.location.href,t.head.appendChild(r)):t=E),o=!n&&[],(i=N.exec(e))?[t.createElement(i[1])]:(i=xe([e],t,o),o&&o.length&&S(o).remove(),S.merge([],i.childNodes)));var r,i,o},S.fn.load=function(e,t,n){var r,i,o,a=this,s=e.indexOf(" ");return-1").append(S.parseHTML(e)).find(r):e)}).always(n&&function(e,t){a.each(function(){n.apply(this,o||[e.responseText,t,e])})}),this},S.expr.pseudos.animated=function(t){return S.grep(S.timers,function(e){return t===e.elem}).length},S.offset={setOffset:function(e,t,n){var r,i,o,a,s,u,l=S.css(e,"position"),c=S(e),f={};"static"===l&&(e.style.position="relative"),s=c.offset(),o=S.css(e,"top"),u=S.css(e,"left"),("absolute"===l||"fixed"===l)&&-1<(o+u).indexOf("auto")?(a=(r=c.position()).top,i=r.left):(a=parseFloat(o)||0,i=parseFloat(u)||0),m(t)&&(t=t.call(e,n,S.extend({},s))),null!=t.top&&(f.top=t.top-s.top+a),null!=t.left&&(f.left=t.left-s.left+i),"using"in t?t.using.call(e,f):c.css(f)}},S.fn.extend({offset:function(t){if(arguments.length)return void 0===t?this:this.each(function(e){S.offset.setOffset(this,t,e)});var e,n,r=this[0];return r?r.getClientRects().length?(e=r.getBoundingClientRect(),n=r.ownerDocument.defaultView,{top:e.top+n.pageYOffset,left:e.left+n.pageXOffset}):{top:0,left:0}:void 0},position:function(){if(this[0]){var e,t,n,r=this[0],i={top:0,left:0};if("fixed"===S.css(r,"position"))t=r.getBoundingClientRect();else{t=this.offset(),n=r.ownerDocument,e=r.offsetParent||n.documentElement;while(e&&(e===n.body||e===n.documentElement)&&"static"===S.css(e,"position"))e=e.parentNode;e&&e!==r&&1===e.nodeType&&((i=S(e).offset()).top+=S.css(e,"borderTopWidth",!0),i.left+=S.css(e,"borderLeftWidth",!0))}return{top:t.top-i.top-S.css(r,"marginTop",!0),left:t.left-i.left-S.css(r,"marginLeft",!0)}}},offsetParent:function(){return this.map(function(){var e=this.offsetParent;while(e&&"static"===S.css(e,"position"))e=e.offsetParent;return e||re})}}),S.each({scrollLeft:"pageXOffset",scrollTop:"pageYOffset"},function(t,i){var o="pageYOffset"===i;S.fn[t]=function(e){return $(this,function(e,t,n){var r;if(x(e)?r=e:9===e.nodeType&&(r=e.defaultView),void 0===n)return r?r[i]:e[t];r?r.scrollTo(o?r.pageXOffset:n,o?n:r.pageYOffset):e[t]=n},t,e,arguments.length)}}),S.each(["top","left"],function(e,n){S.cssHooks[n]=Fe(y.pixelPosition,function(e,t){if(t)return t=We(e,n),Pe.test(t)?S(e).position()[n]+"px":t})}),S.each({Height:"height",Width:"width"},function(a,s){S.each({padding:"inner"+a,content:s,"":"outer"+a},function(r,o){S.fn[o]=function(e,t){var n=arguments.length&&(r||"boolean"!=typeof e),i=r||(!0===e||!0===t?"margin":"border");return $(this,function(e,t,n){var r;return x(e)?0===o.indexOf("outer")?e["inner"+a]:e.document.documentElement["client"+a]:9===e.nodeType?(r=e.documentElement,Math.max(e.body["scroll"+a],r["scroll"+a],e.body["offset"+a],r["offset"+a],r["client"+a])):void 0===n?S.css(e,t,i):S.style(e,t,n,i)},s,n?e:void 0,n)}})}),S.each(["ajaxStart","ajaxStop","ajaxComplete","ajaxError","ajaxSuccess","ajaxSend"],function(e,t){S.fn[t]=function(e){return this.on(t,e)}}),S.fn.extend({bind:function(e,t,n){return this.on(e,null,t,n)},unbind:function(e,t){return this.off(e,null,t)},delegate:function(e,t,n,r){return this.on(t,e,n,r)},undelegate:function(e,t,n){return 1===arguments.length?this.off(e,"**"):this.off(t,e||"**",n)},hover:function(e,t){return 
this.mouseenter(e).mouseleave(t||e)}}),S.each("blur focus focusin focusout resize scroll click dblclick mousedown mouseup mousemove mouseover mouseout mouseenter mouseleave change select submit keydown keypress keyup contextmenu".split(" "),function(e,n){S.fn[n]=function(e,t){return 0",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=y.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=y.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),y.elements=c+" "+a,j(b)}function f(a){var b=x[a[v]];return b||(b={},w++,a[v]=w,x[w]=b),b}function g(a,c,d){if(c||(c=b),q)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():u.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||t.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),q)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return y.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(y,b.frag)}function j(a){a||(a=b);var d=f(a);return!y.shivCSS||p||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),q||i(a,d),a}function k(a){for(var b,c=a.getElementsByTagName("*"),e=c.length,f=RegExp("^(?:"+d().join("|")+")$","i"),g=[];e--;)b=c[e],f.test(b.nodeName)&&g.push(b.applyElement(l(b)));return g}function l(a){for(var b,c=a.attributes,d=c.length,e=a.ownerDocument.createElement(A+":"+a.nodeName);d--;)b=c[d],b.specified&&e.setAttribute(b.nodeName,b.nodeValue);return e.style.cssText=a.style.cssText,e}function m(a){for(var b,c=a.split("{"),e=c.length,f=RegExp("(^|[\\s,>+~])("+d().join("|")+")(?=[[\\s,>+~#.:]|$)","gi"),g="$1"+A+"\\:$2";e--;)b=c[e]=c[e].split("}"),b[b.length-1]=b[b.length-1].replace(f,g),c[e]=b.join("}");return c.join("{")}function n(a){for(var b=a.length;b--;)a[b].removeNode()}function o(a){function b(){clearTimeout(g._removeSheetTimer),d&&d.removeNode(!0),d=null}var d,e,g=f(a),h=a.namespaces,i=a.parentWindow;return!B||a.printShived?a:("undefined"==typeof h[A]&&h.add(A),i.attachEvent("onbeforeprint",function(){b();for(var f,g,h,i=a.styleSheets,j=[],l=i.length,n=Array(l);l--;)n[l]=i[l];for(;h=n.pop();)if(!h.disabled&&z.test(h.media)){try{f=h.imports,g=f.length}catch(o){g=0}for(l=0;g>l;l++)n.push(f[l]);try{j.push(h.cssText)}catch(o){}}j=m(j.reverse().join("")),e=k(a),d=c(a,j)}),i.attachEvent("onafterprint",function(){n(e),clearTimeout(g._removeSheetTimer),g._removeSheetTimer=setTimeout(b,500)}),a.printShived=!0,a)}var p,q,r="3.7.3",s=a.html5||{},t=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,u=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,v="_html5shiv",w=0,x={};!function(){try{var a=b.createElement("a");a.innerHTML="",p="hidden"in a,q=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof 
a.createElement}()}catch(c){p=!0,q=!0}}();var y={elements:s.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time video",version:r,shivCSS:s.shivCSS!==!1,supportsUnknownElements:q,shivMethods:s.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=y,j(b);var z=/^$|\b(?:all|print)\b/,A="html5shiv",B=!q&&function(){var c=b.documentElement;return!("undefined"==typeof b.namespaces||"undefined"==typeof b.parentWindow||"undefined"==typeof c.applyElement||"undefined"==typeof c.removeNode||"undefined"==typeof a.attachEvent)}();y.type+=" print",y.shivPrint=o,o(b),"object"==typeof module&&module.exports&&(module.exports=y)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/bicounty_emme/_static/js/html5shiv.min.js b/branch/bicounty_emme/_static/js/html5shiv.min.js new file mode 100644 index 0000000..cd1c674 --- /dev/null +++ b/branch/bicounty_emme/_static/js/html5shiv.min.js @@ -0,0 +1,4 @@ +/** +* @preserve HTML5 Shiv 3.7.3 | @afarkas @jdalton @jon_neal @rem | MIT/GPL2 Licensed +*/ +!function(a,b){function c(a,b){var c=a.createElement("p"),d=a.getElementsByTagName("head")[0]||a.documentElement;return c.innerHTML="x",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=t.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=t.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),t.elements=c+" "+a,j(b)}function f(a){var b=s[a[q]];return b||(b={},r++,a[q]=r,s[r]=b),b}function g(a,c,d){if(c||(c=b),l)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():p.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||o.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),l)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return t.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(t,b.frag)}function j(a){a||(a=b);var d=f(a);return!t.shivCSS||k||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),l||i(a,d),a}var k,l,m="3.7.3-pre",n=a.html5||{},o=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,p=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,q="_html5shiv",r=0,s={};!function(){try{var a=b.createElement("a");a.innerHTML="",k="hidden"in a,l=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof a.createElement}()}catch(c){k=!0,l=!0}}();var t={elements:n.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time 
video",version:m,shivCSS:n.shivCSS!==!1,supportsUnknownElements:l,shivMethods:n.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=t,j(b),"object"==typeof module&&module.exports&&(module.exports=t)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/branch/bicounty_emme/_static/js/theme.js b/branch/bicounty_emme/_static/js/theme.js new file mode 100644 index 0000000..1fddb6e --- /dev/null +++ b/branch/bicounty_emme/_static/js/theme.js @@ -0,0 +1 @@ +!function(n){var e={};function t(i){if(e[i])return e[i].exports;var o=e[i]={i:i,l:!1,exports:{}};return n[i].call(o.exports,o,o.exports,t),o.l=!0,o.exports}t.m=n,t.c=e,t.d=function(n,e,i){t.o(n,e)||Object.defineProperty(n,e,{enumerable:!0,get:i})},t.r=function(n){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(n,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(n,"__esModule",{value:!0})},t.t=function(n,e){if(1&e&&(n=t(n)),8&e)return n;if(4&e&&"object"==typeof n&&n&&n.__esModule)return n;var i=Object.create(null);if(t.r(i),Object.defineProperty(i,"default",{enumerable:!0,value:n}),2&e&&"string"!=typeof n)for(var o in n)t.d(i,o,function(e){return n[e]}.bind(null,o));return i},t.n=function(n){var e=n&&n.__esModule?function(){return n.default}:function(){return n};return t.d(e,"a",e),e},t.o=function(n,e){return Object.prototype.hasOwnProperty.call(n,e)},t.p="",t(t.s=0)}([function(n,e,t){t(1),n.exports=t(3)},function(n,e,t){(function(){var e="undefined"!=typeof window?window.jQuery:t(2);n.exports.ThemeNav={navBar:null,win:null,winScroll:!1,winResize:!1,linkScroll:!1,winPosition:0,winHeight:null,docHeight:null,isRunning:!1,enable:function(n){var t=this;void 0===n&&(n=!0),t.isRunning||(t.isRunning=!0,e((function(e){t.init(e),t.reset(),t.win.on("hashchange",t.reset),n&&t.win.on("scroll",(function(){t.linkScroll||t.winScroll||(t.winScroll=!0,requestAnimationFrame((function(){t.onScroll()})))})),t.win.on("resize",(function(){t.winResize||(t.winResize=!0,requestAnimationFrame((function(){t.onResize()})))})),t.onResize()})))},enableSticky:function(){this.enable(!0)},init:function(n){n(document);var e=this;this.navBar=n("div.wy-side-scroll:first"),this.win=n(window),n(document).on("click","[data-toggle='wy-nav-top']",(function(){n("[data-toggle='wy-nav-shift']").toggleClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift")})).on("click",".wy-menu-vertical .current ul li a",(function(){var t=n(this);n("[data-toggle='wy-nav-shift']").removeClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift"),e.toggleCurrent(t),e.hashChange()})).on("click","[data-toggle='rst-current-version']",(function(){n("[data-toggle='rst-versions']").toggleClass("shift-up")})),n("table.docutils:not(.field-list,.footnote,.citation)").wrap("
"),n("table.docutils.footnote").wrap("
"),n("table.docutils.citation").wrap("
"),n(".wy-menu-vertical ul").not(".simple").siblings("a").each((function(){var t=n(this);expand=n(''),expand.on("click",(function(n){return e.toggleCurrent(t),n.stopPropagation(),!1})),t.prepend(expand)}))},reset:function(){var n=encodeURI(window.location.hash)||"#";try{var e=$(".wy-menu-vertical"),t=e.find('[href="'+n+'"]');if(0===t.length){var i=$('.document [id="'+n.substring(1)+'"]').closest("div.section");0===(t=e.find('[href="#'+i.attr("id")+'"]')).length&&(t=e.find('[href="#"]'))}if(t.length>0){$(".wy-menu-vertical .current").removeClass("current").attr("aria-expanded","false"),t.addClass("current").attr("aria-expanded","true"),t.closest("li.toctree-l1").parent().addClass("current").attr("aria-expanded","true");for(let n=1;n<=10;n++)t.closest("li.toctree-l"+n).addClass("current").attr("aria-expanded","true");t[0].scrollIntoView()}}catch(n){console.log("Error expanding nav for anchor",n)}},onScroll:function(){this.winScroll=!1;var n=this.win.scrollTop(),e=n+this.winHeight,t=this.navBar.scrollTop()+(n-this.winPosition);n<0||e>this.docHeight||(this.navBar.scrollTop(t),this.winPosition=n)},onResize:function(){this.winResize=!1,this.winHeight=this.win.height(),this.docHeight=$(document).height()},hashChange:function(){this.linkScroll=!0,this.win.one("hashchange",(function(){this.linkScroll=!1}))},toggleCurrent:function(n){var e=n.closest("li");e.siblings("li.current").removeClass("current").attr("aria-expanded","false"),e.siblings().find("li.current").removeClass("current").attr("aria-expanded","false");var t=e.find("> ul li");t.length&&(t.removeClass("current").attr("aria-expanded","false"),e.toggleClass("current").attr("aria-expanded",(function(n,e){return"true"==e?"false":"true"})))}},"undefined"!=typeof window&&(window.SphinxRtdTheme={Navigation:n.exports.ThemeNav,StickyNav:n.exports.ThemeNav}),function(){for(var n=0,e=["ms","moz","webkit","o"],t=0;t0 + var meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$"; // [C]VC[V] is m=1 + var mgr1 = "^(" + C + ")?" + V + C + V + C; // [C]VCVC... is m>1 + var s_v = "^(" + C + ")?" 
+ v; // vowel in stem + + this.stemWord = function (w) { + var stem; + var suffix; + var firstch; + var origword = w; + + if (w.length < 3) + return w; + + var re; + var re2; + var re3; + var re4; + + firstch = w.substr(0,1); + if (firstch == "y") + w = firstch.toUpperCase() + w.substr(1); + + // Step 1a + re = /^(.+?)(ss|i)es$/; + re2 = /^(.+?)([^s])s$/; + + if (re.test(w)) + w = w.replace(re,"$1$2"); + else if (re2.test(w)) + w = w.replace(re2,"$1$2"); + + // Step 1b + re = /^(.+?)eed$/; + re2 = /^(.+?)(ed|ing)$/; + if (re.test(w)) { + var fp = re.exec(w); + re = new RegExp(mgr0); + if (re.test(fp[1])) { + re = /.$/; + w = w.replace(re,""); + } + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + re2 = new RegExp(s_v); + if (re2.test(stem)) { + w = stem; + re2 = /(at|bl|iz)$/; + re3 = new RegExp("([^aeiouylsz])\\1$"); + re4 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re2.test(w)) + w = w + "e"; + else if (re3.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + else if (re4.test(w)) + w = w + "e"; + } + } + + // Step 1c + re = /^(.+?)y$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(s_v); + if (re.test(stem)) + w = stem + "i"; + } + + // Step 2 + re = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step2list[suffix]; + } + + // Step 3 + re = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step3list[suffix]; + } + + // Step 4 + re = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/; + re2 = /^(.+?)(s|t)(ion)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + if (re.test(stem)) + w = stem; + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1] + fp[2]; + re2 = new RegExp(mgr1); + if (re2.test(stem)) + w = stem; + } + + // Step 5 + re = /^(.+?)e$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + re2 = new RegExp(meq1); + re3 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) + w = stem; + } + re = /ll$/; + re2 = new RegExp(mgr1); + if (re.test(w) && re2.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + + // and turn initial Y back to y + if (firstch == "y") + w = firstch.toLowerCase() + w.substr(1); + return w; + } +} + diff --git a/branch/bicounty_emme/_static/minus.png b/branch/bicounty_emme/_static/minus.png new file mode 100644 index 0000000..d96755f Binary files /dev/null and b/branch/bicounty_emme/_static/minus.png differ diff --git a/branch/bicounty_emme/_static/plus.png b/branch/bicounty_emme/_static/plus.png new file mode 100644 index 0000000..7107cec Binary files /dev/null and b/branch/bicounty_emme/_static/plus.png differ diff --git a/branch/bicounty_emme/_static/pygments.css b/branch/bicounty_emme/_static/pygments.css new file mode 100644 index 0000000..84ab303 --- /dev/null +++ b/branch/bicounty_emme/_static/pygments.css @@ -0,0 +1,75 @@ +pre { line-height: 125%; } +td.linenos .normal { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } +span.linenos { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } 
+td.linenos .special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +span.linenos.special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +.highlight .hll { background-color: #ffffcc } +.highlight { background: #f8f8f8; } +.highlight .c { color: #3D7B7B; font-style: italic } /* Comment */ +.highlight .err { border: 1px solid #FF0000 } /* Error */ +.highlight .k { color: #008000; font-weight: bold } /* Keyword */ +.highlight .o { color: #666666 } /* Operator */ +.highlight .ch { color: #3D7B7B; font-style: italic } /* Comment.Hashbang */ +.highlight .cm { color: #3D7B7B; font-style: italic } /* Comment.Multiline */ +.highlight .cp { color: #9C6500 } /* Comment.Preproc */ +.highlight .cpf { color: #3D7B7B; font-style: italic } /* Comment.PreprocFile */ +.highlight .c1 { color: #3D7B7B; font-style: italic } /* Comment.Single */ +.highlight .cs { color: #3D7B7B; font-style: italic } /* Comment.Special */ +.highlight .gd { color: #A00000 } /* Generic.Deleted */ +.highlight .ge { font-style: italic } /* Generic.Emph */ +.highlight .ges { font-weight: bold; font-style: italic } /* Generic.EmphStrong */ +.highlight .gr { color: #E40000 } /* Generic.Error */ +.highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */ +.highlight .gi { color: #008400 } /* Generic.Inserted */ +.highlight .go { color: #717171 } /* Generic.Output */ +.highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */ +.highlight .gs { font-weight: bold } /* Generic.Strong */ +.highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */ +.highlight .gt { color: #0044DD } /* Generic.Traceback */ +.highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */ +.highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */ +.highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */ +.highlight .kp { color: #008000 } /* Keyword.Pseudo */ +.highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */ +.highlight .kt { color: #B00040 } /* Keyword.Type */ +.highlight .m { color: #666666 } /* Literal.Number */ +.highlight .s { color: #BA2121 } /* Literal.String */ +.highlight .na { color: #687822 } /* Name.Attribute */ +.highlight .nb { color: #008000 } /* Name.Builtin */ +.highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */ +.highlight .no { color: #880000 } /* Name.Constant */ +.highlight .nd { color: #AA22FF } /* Name.Decorator */ +.highlight .ni { color: #717171; font-weight: bold } /* Name.Entity */ +.highlight .ne { color: #CB3F38; font-weight: bold } /* Name.Exception */ +.highlight .nf { color: #0000FF } /* Name.Function */ +.highlight .nl { color: #767600 } /* Name.Label */ +.highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */ +.highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */ +.highlight .nv { color: #19177C } /* Name.Variable */ +.highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */ +.highlight .w { color: #bbbbbb } /* Text.Whitespace */ +.highlight .mb { color: #666666 } /* Literal.Number.Bin */ +.highlight .mf { color: #666666 } /* Literal.Number.Float */ +.highlight .mh { color: #666666 } /* Literal.Number.Hex */ +.highlight .mi { color: #666666 } /* Literal.Number.Integer */ +.highlight .mo { color: #666666 } /* Literal.Number.Oct */ +.highlight .sa { color: #BA2121 } /* Literal.String.Affix */ +.highlight .sb { color: #BA2121 } /* Literal.String.Backtick */ 
+.highlight .sc { color: #BA2121 } /* Literal.String.Char */ +.highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */ +.highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */ +.highlight .s2 { color: #BA2121 } /* Literal.String.Double */ +.highlight .se { color: #AA5D1F; font-weight: bold } /* Literal.String.Escape */ +.highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */ +.highlight .si { color: #A45A77; font-weight: bold } /* Literal.String.Interpol */ +.highlight .sx { color: #008000 } /* Literal.String.Other */ +.highlight .sr { color: #A45A77 } /* Literal.String.Regex */ +.highlight .s1 { color: #BA2121 } /* Literal.String.Single */ +.highlight .ss { color: #19177C } /* Literal.String.Symbol */ +.highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */ +.highlight .fm { color: #0000FF } /* Name.Function.Magic */ +.highlight .vc { color: #19177C } /* Name.Variable.Class */ +.highlight .vg { color: #19177C } /* Name.Variable.Global */ +.highlight .vi { color: #19177C } /* Name.Variable.Instance */ +.highlight .vm { color: #19177C } /* Name.Variable.Magic */ +.highlight .il { color: #666666 } /* Literal.Number.Integer.Long */ \ No newline at end of file diff --git a/branch/bicounty_emme/_static/searchtools.js b/branch/bicounty_emme/_static/searchtools.js new file mode 100644 index 0000000..97d56a7 --- /dev/null +++ b/branch/bicounty_emme/_static/searchtools.js @@ -0,0 +1,566 @@ +/* + * searchtools.js + * ~~~~~~~~~~~~~~~~ + * + * Sphinx JavaScript utilities for the full-text search. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +/** + * Simple result scoring code. + */ +if (typeof Scorer === "undefined") { + var Scorer = { + // Implement the following function to further tweak the score for each result + // The function takes a result array [docname, title, anchor, descr, score, filename] + // and returns the new score. + /* + score: result => { + const [docname, title, anchor, descr, score, filename] = result + return score + }, + */ + + // query matches the full name of an object + objNameMatch: 11, + // or matches in the last dotted part of the object name + objPartialMatch: 6, + // Additive scores depending on the priority of the object + objPrio: { + 0: 15, // used to be importantResults + 1: 5, // used to be objectResults + 2: -5, // used to be unimportantResults + }, + // Used when the priority is not in the mapping. 
+ objPrioDefault: 0, + + // query found in title + title: 15, + partialTitle: 7, + // query found in terms + term: 5, + partialTerm: 2, + }; +} + +const _removeChildren = (element) => { + while (element && element.lastChild) element.removeChild(element.lastChild); +}; + +/** + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions#escaping + */ +const _escapeRegExp = (string) => + string.replace(/[.*+\-?^${}()|[\]\\]/g, "\\$&"); // $& means the whole matched string + +const _displayItem = (item, searchTerms) => { + const docBuilder = DOCUMENTATION_OPTIONS.BUILDER; + const docUrlRoot = DOCUMENTATION_OPTIONS.URL_ROOT; + const docFileSuffix = DOCUMENTATION_OPTIONS.FILE_SUFFIX; + const docLinkSuffix = DOCUMENTATION_OPTIONS.LINK_SUFFIX; + const showSearchSummary = DOCUMENTATION_OPTIONS.SHOW_SEARCH_SUMMARY; + + const [docName, title, anchor, descr, score, _filename] = item; + + let listItem = document.createElement("li"); + let requestUrl; + let linkUrl; + if (docBuilder === "dirhtml") { + // dirhtml builder + let dirname = docName + "/"; + if (dirname.match(/\/index\/$/)) + dirname = dirname.substring(0, dirname.length - 6); + else if (dirname === "index/") dirname = ""; + requestUrl = docUrlRoot + dirname; + linkUrl = requestUrl; + } else { + // normal html builders + requestUrl = docUrlRoot + docName + docFileSuffix; + linkUrl = docName + docLinkSuffix; + } + let linkEl = listItem.appendChild(document.createElement("a")); + linkEl.href = linkUrl + anchor; + linkEl.dataset.score = score; + linkEl.innerHTML = title; + if (descr) + listItem.appendChild(document.createElement("span")).innerHTML = + " (" + descr + ")"; + else if (showSearchSummary) + fetch(requestUrl) + .then((responseData) => responseData.text()) + .then((data) => { + if (data) + listItem.appendChild( + Search.makeSearchSummary(data, searchTerms) + ); + }); + Search.output.appendChild(listItem); +}; +const _finishSearch = (resultCount) => { + Search.stopPulse(); + Search.title.innerText = _("Search Results"); + if (!resultCount) + Search.status.innerText = Documentation.gettext( + "Your search did not match any documents. Please make sure that all words are spelled correctly and that you've selected enough categories." + ); + else + Search.status.innerText = _( + `Search finished, found ${resultCount} page(s) matching the search query.` + ); +}; +const _displayNextItem = ( + results, + resultCount, + searchTerms +) => { + // results left, load the summary and display it + // this is intended to be dynamic (don't sub resultsCount) + if (results.length) { + _displayItem(results.pop(), searchTerms); + setTimeout( + () => _displayNextItem(results, resultCount, searchTerms), + 5 + ); + } + // search finished, update title and status message + else _finishSearch(resultCount); +}; + +/** + * Default splitQuery function. Can be overridden in ``sphinx.search`` with a + * custom function per language. + * + * The regular expression works by splitting the string on consecutive characters + * that are not Unicode letters, numbers, underscores, or emoji characters. + * This is the same as ``\W+`` in Python, preserving the surrogate pair area. 
+ */ +if (typeof splitQuery === "undefined") { + var splitQuery = (query) => query + .split(/[^\p{Letter}\p{Number}_\p{Emoji_Presentation}]+/gu) + .filter(term => term) // remove remaining empty strings +} + +/** + * Search Module + */ +const Search = { + _index: null, + _queued_query: null, + _pulse_status: -1, + + htmlToText: (htmlString) => { + const htmlElement = new DOMParser().parseFromString(htmlString, 'text/html'); + htmlElement.querySelectorAll(".headerlink").forEach((el) => { el.remove() }); + const docContent = htmlElement.querySelector('[role="main"]'); + if (docContent !== undefined) return docContent.textContent; + console.warn( + "Content block not found. Sphinx search tries to obtain it via '[role=main]'. Could you check your theme or template." + ); + return ""; + }, + + init: () => { + const query = new URLSearchParams(window.location.search).get("q"); + document + .querySelectorAll('input[name="q"]') + .forEach((el) => (el.value = query)); + if (query) Search.performSearch(query); + }, + + loadIndex: (url) => + (document.body.appendChild(document.createElement("script")).src = url), + + setIndex: (index) => { + Search._index = index; + if (Search._queued_query !== null) { + const query = Search._queued_query; + Search._queued_query = null; + Search.query(query); + } + }, + + hasIndex: () => Search._index !== null, + + deferQuery: (query) => (Search._queued_query = query), + + stopPulse: () => (Search._pulse_status = -1), + + startPulse: () => { + if (Search._pulse_status >= 0) return; + + const pulse = () => { + Search._pulse_status = (Search._pulse_status + 1) % 4; + Search.dots.innerText = ".".repeat(Search._pulse_status); + if (Search._pulse_status >= 0) window.setTimeout(pulse, 500); + }; + pulse(); + }, + + /** + * perform a search for something (or wait until index is loaded) + */ + performSearch: (query) => { + // create the required interface elements + const searchText = document.createElement("h2"); + searchText.textContent = _("Searching"); + const searchSummary = document.createElement("p"); + searchSummary.classList.add("search-summary"); + searchSummary.innerText = ""; + const searchList = document.createElement("ul"); + searchList.classList.add("search"); + + const out = document.getElementById("search-results"); + Search.title = out.appendChild(searchText); + Search.dots = Search.title.appendChild(document.createElement("span")); + Search.status = out.appendChild(searchSummary); + Search.output = out.appendChild(searchList); + + const searchProgress = document.getElementById("search-progress"); + // Some themes don't use the search progress node + if (searchProgress) { + searchProgress.innerText = _("Preparing search..."); + } + Search.startPulse(); + + // index already loaded, the browser was quick! 
+ if (Search.hasIndex()) Search.query(query); + else Search.deferQuery(query); + }, + + /** + * execute search (requires search index to be loaded) + */ + query: (query) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + const allTitles = Search._index.alltitles; + const indexEntries = Search._index.indexentries; + + // stem the search terms and add them to the correct list + const stemmer = new Stemmer(); + const searchTerms = new Set(); + const excludedTerms = new Set(); + const highlightTerms = new Set(); + const objectTerms = new Set(splitQuery(query.toLowerCase().trim())); + splitQuery(query.trim()).forEach((queryTerm) => { + const queryTermLower = queryTerm.toLowerCase(); + + // maybe skip this "word" + // stopwords array is from language_data.js + if ( + stopwords.indexOf(queryTermLower) !== -1 || + queryTerm.match(/^\d+$/) + ) + return; + + // stem the word + let word = stemmer.stemWord(queryTermLower); + // select the correct list + if (word[0] === "-") excludedTerms.add(word.substr(1)); + else { + searchTerms.add(word); + highlightTerms.add(queryTermLower); + } + }); + + if (SPHINX_HIGHLIGHT_ENABLED) { // set in sphinx_highlight.js + localStorage.setItem("sphinx_highlight_terms", [...highlightTerms].join(" ")) + } + + // console.debug("SEARCH: searching for:"); + // console.info("required: ", [...searchTerms]); + // console.info("excluded: ", [...excludedTerms]); + + // array of [docname, title, anchor, descr, score, filename] + let results = []; + _removeChildren(document.getElementById("search-progress")); + + const queryLower = query.toLowerCase(); + for (const [title, foundTitles] of Object.entries(allTitles)) { + if (title.toLowerCase().includes(queryLower) && (queryLower.length >= title.length/2)) { + for (const [file, id] of foundTitles) { + let score = Math.round(100 * queryLower.length / title.length) + results.push([ + docNames[file], + titles[file] !== title ? `${titles[file]} > ${title}` : title, + id !== null ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // search for explicit entries in index directives + for (const [entry, foundEntries] of Object.entries(indexEntries)) { + if (entry.includes(queryLower) && (queryLower.length >= entry.length/2)) { + for (const [file, id] of foundEntries) { + let score = Math.round(100 * queryLower.length / entry.length) + results.push([ + docNames[file], + titles[file], + id ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // lookup as object + objectTerms.forEach((term) => + results.push(...Search.performObjectSearch(term, objectTerms)) + ); + + // lookup as search terms in fulltext + results.push(...Search.performTermsSearch(searchTerms, excludedTerms)); + + // let the scorer override scores with a custom scoring function + if (Scorer.score) results.forEach((item) => (item[4] = Scorer.score(item))); + + // now sort the results by score (in opposite order of appearance, since the + // display function below uses pop() to retrieve items) and then + // alphabetically + results.sort((a, b) => { + const leftScore = a[4]; + const rightScore = b[4]; + if (leftScore === rightScore) { + // same score: sort alphabetically + const leftTitle = a[1].toLowerCase(); + const rightTitle = b[1].toLowerCase(); + if (leftTitle === rightTitle) return 0; + return leftTitle > rightTitle ? -1 : 1; // inverted is intentional + } + return leftScore > rightScore ? 
1 : -1; + }); + + // remove duplicate search results + // note the reversing of results, so that in the case of duplicates, the highest-scoring entry is kept + let seen = new Set(); + results = results.reverse().reduce((acc, result) => { + let resultStr = result.slice(0, 4).concat([result[5]]).map(v => String(v)).join(','); + if (!seen.has(resultStr)) { + acc.push(result); + seen.add(resultStr); + } + return acc; + }, []); + + results = results.reverse(); + + // for debugging + //Search.lastresults = results.slice(); // a copy + // console.info("search results:", Search.lastresults); + + // print the results + _displayNextItem(results, results.length, searchTerms); + }, + + /** + * search for object names + */ + performObjectSearch: (object, objectTerms) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const objects = Search._index.objects; + const objNames = Search._index.objnames; + const titles = Search._index.titles; + + const results = []; + + const objectSearchCallback = (prefix, match) => { + const name = match[4] + const fullname = (prefix ? prefix + "." : "") + name; + const fullnameLower = fullname.toLowerCase(); + if (fullnameLower.indexOf(object) < 0) return; + + let score = 0; + const parts = fullnameLower.split("."); + + // check for different match types: exact matches of full name or + // "last name" (i.e. last dotted part) + if (fullnameLower === object || parts.slice(-1)[0] === object) + score += Scorer.objNameMatch; + else if (parts.slice(-1)[0].indexOf(object) > -1) + score += Scorer.objPartialMatch; // matches in last name + + const objName = objNames[match[1]][2]; + const title = titles[match[0]]; + + // If more than one term searched for, we require other words to be + // found in the name/title/description + const otherTerms = new Set(objectTerms); + otherTerms.delete(object); + if (otherTerms.size > 0) { + const haystack = `${prefix} ${name} ${objName} ${title}`.toLowerCase(); + if ( + [...otherTerms].some((otherTerm) => haystack.indexOf(otherTerm) < 0) + ) + return; + } + + let anchor = match[3]; + if (anchor === "") anchor = fullname; + else if (anchor === "-") anchor = objNames[match[1]][1] + "-" + fullname; + + const descr = objName + _(", in ") + title; + + // add custom score for some objects according to scorer + if (Scorer.objPrio.hasOwnProperty(match[2])) + score += Scorer.objPrio[match[2]]; + else score += Scorer.objPrioDefault; + + results.push([ + docNames[match[0]], + fullname, + "#" + anchor, + descr, + score, + filenames[match[0]], + ]); + }; + Object.keys(objects).forEach((prefix) => + objects[prefix].forEach((array) => + objectSearchCallback(prefix, array) + ) + ); + return results; + }, + + /** + * search for full-text terms in the index + */ + performTermsSearch: (searchTerms, excludedTerms) => { + // prepare search + const terms = Search._index.terms; + const titleTerms = Search._index.titleterms; + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + + const scoreMap = new Map(); + const fileMap = new Map(); + + // perform the search on the required terms + searchTerms.forEach((word) => { + const files = []; + const arr = [ + { files: terms[word], score: Scorer.term }, + { files: titleTerms[word], score: Scorer.title }, + ]; + // add support for partial matches + if (word.length > 2) { + const escapedWord = _escapeRegExp(word); + Object.keys(terms).forEach((term) => { + if (term.match(escapedWord) && !terms[word]) + arr.push({ 
files: terms[term], score: Scorer.partialTerm }); + }); + Object.keys(titleTerms).forEach((term) => { + if (term.match(escapedWord) && !titleTerms[word]) + arr.push({ files: titleTerms[word], score: Scorer.partialTitle }); + }); + } + + // no match but word was a required one + if (arr.every((record) => record.files === undefined)) return; + + // found search word in contents + arr.forEach((record) => { + if (record.files === undefined) return; + + let recordFiles = record.files; + if (recordFiles.length === undefined) recordFiles = [recordFiles]; + files.push(...recordFiles); + + // set score for the word in each file + recordFiles.forEach((file) => { + if (!scoreMap.has(file)) scoreMap.set(file, {}); + scoreMap.get(file)[word] = record.score; + }); + }); + + // create the mapping + files.forEach((file) => { + if (fileMap.has(file) && fileMap.get(file).indexOf(word) === -1) + fileMap.get(file).push(word); + else fileMap.set(file, [word]); + }); + }); + + // now check if the files don't contain excluded terms + const results = []; + for (const [file, wordList] of fileMap) { + // check if all requirements are matched + + // as search terms with length < 3 are discarded + const filteredTermCount = [...searchTerms].filter( + (term) => term.length > 2 + ).length; + if ( + wordList.length !== searchTerms.size && + wordList.length !== filteredTermCount + ) + continue; + + // ensure that none of the excluded terms is in the search result + if ( + [...excludedTerms].some( + (term) => + terms[term] === file || + titleTerms[term] === file || + (terms[term] || []).includes(file) || + (titleTerms[term] || []).includes(file) + ) + ) + break; + + // select one (max) score for the file. + const score = Math.max(...wordList.map((w) => scoreMap.get(file)[w])); + // add result to the result list + results.push([ + docNames[file], + titles[file], + "", + null, + score, + filenames[file], + ]); + } + return results; + }, + + /** + * helper function to return a node containing the + * search summary for a given text. keywords is a list + * of stemmed words. + */ + makeSearchSummary: (htmlText, keywords) => { + const text = Search.htmlToText(htmlText); + if (text === "") return null; + + const textLower = text.toLowerCase(); + const actualStartPosition = [...keywords] + .map((k) => textLower.indexOf(k.toLowerCase())) + .filter((i) => i > -1) + .slice(-1)[0]; + const startWithContext = Math.max(actualStartPosition - 120, 0); + + const top = startWithContext === 0 ? "" : "..."; + const tail = startWithContext + 240 < text.length ? "..." : ""; + + let summary = document.createElement("p"); + summary.classList.add("context"); + summary.textContent = top + text.substr(startWithContext, 240).trim() + tail; + + return summary; + }, +}; + +_ready(Search.init); diff --git a/branch/bicounty_emme/_static/sphinx_highlight.js b/branch/bicounty_emme/_static/sphinx_highlight.js new file mode 100644 index 0000000..aae669d --- /dev/null +++ b/branch/bicounty_emme/_static/sphinx_highlight.js @@ -0,0 +1,144 @@ +/* Highlighting utilities for Sphinx HTML documentation. */ +"use strict"; + +const SPHINX_HIGHLIGHT_ENABLED = true + +/** + * highlight a given string on a node by wrapping it in + * span elements with the given class name. 
+ */ +const _highlight = (node, addItems, text, className) => { + if (node.nodeType === Node.TEXT_NODE) { + const val = node.nodeValue; + const parent = node.parentNode; + const pos = val.toLowerCase().indexOf(text); + if ( + pos >= 0 && + !parent.classList.contains(className) && + !parent.classList.contains("nohighlight") + ) { + let span; + + const closestNode = parent.closest("body, svg, foreignObject"); + const isInSVG = closestNode && closestNode.matches("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.classList.add(className); + } + + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + parent.insertBefore( + span, + parent.insertBefore( + document.createTextNode(val.substr(pos + text.length)), + node.nextSibling + ) + ); + node.nodeValue = val.substr(0, pos); + + if (isInSVG) { + const rect = document.createElementNS( + "http://www.w3.org/2000/svg", + "rect" + ); + const bbox = parent.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute("class", className); + addItems.push({ parent: parent, target: rect }); + } + } + } else if (node.matches && !node.matches("button, select, textarea")) { + node.childNodes.forEach((el) => _highlight(el, addItems, text, className)); + } +}; +const _highlightText = (thisNode, text, className) => { + let addItems = []; + _highlight(thisNode, addItems, text, className); + addItems.forEach((obj) => + obj.parent.insertAdjacentElement("beforebegin", obj.target) + ); +}; + +/** + * Small JavaScript module for the documentation. + */ +const SphinxHighlight = { + + /** + * highlight the search words provided in localstorage in the text + */ + highlightSearchWords: () => { + if (!SPHINX_HIGHLIGHT_ENABLED) return; // bail if no highlight + + // get and clear terms from localstorage + const url = new URL(window.location); + const highlight = + localStorage.getItem("sphinx_highlight_terms") + || url.searchParams.get("highlight") + || ""; + localStorage.removeItem("sphinx_highlight_terms") + url.searchParams.delete("highlight"); + window.history.replaceState({}, "", url); + + // get individual terms from highlight string + const terms = highlight.toLowerCase().split(/\s+/).filter(x => x); + if (terms.length === 0) return; // nothing to do + + // There should never be more than one element matching "div.body" + const divBody = document.querySelectorAll("div.body"); + const body = divBody.length ? 
divBody[0] : document.querySelector("body"); + window.setTimeout(() => { + terms.forEach((term) => _highlightText(body, term, "highlighted")); + }, 10); + + const searchBox = document.getElementById("searchbox"); + if (searchBox === null) return; + searchBox.appendChild( + document + .createRange() + .createContextualFragment( + '" + ) + ); + }, + + /** + * helper function to hide the search marks again + */ + hideSearchWords: () => { + document + .querySelectorAll("#searchbox .highlight-link") + .forEach((el) => el.remove()); + document + .querySelectorAll("span.highlighted") + .forEach((el) => el.classList.remove("highlighted")); + localStorage.removeItem("sphinx_highlight_terms") + }, + + initEscapeListener: () => { + // only install a listener if it is really needed + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.shiftKey || event.altKey || event.ctrlKey || event.metaKey) return; + if (DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS && (event.key === "Escape")) { + SphinxHighlight.hideSearchWords(); + event.preventDefault(); + } + }); + }, +}; + +_ready(SphinxHighlight.highlightSearchWords); +_ready(SphinxHighlight.initEscapeListener); diff --git a/branch/bicounty_emme/autodoc/index.html b/branch/bicounty_emme/autodoc/index.html new file mode 100644 index 0000000..376fe34 --- /dev/null +++ b/branch/bicounty_emme/autodoc/index.html @@ -0,0 +1,165 @@ + + + + + + + Lasso Classes and Functions — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Lasso Classes and Functions

+
+

Base Classes

+ + + + + + + + + + + + + + + + + + +

CubeTransit

Class for storing information about transit defined in Cube line files.

StandardTransit

Holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil's Cube Line files.

ModelRoadwayNetwork

Subclass of the network_wrangler RoadwayNetwork class.

Project

A single change or a set of changes to the roadway or transit system.

Parameters

A class representing all the parameters defining the networks, including time of day, categories, etc.

+
+
+
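As a quick illustration of how these base classes fit together, here is a minimal sketch (not taken from the package's own examples; the feed and output directories are placeholders and keyword names may vary by version) that loads a GTFS feed into StandardTransit with MetCouncil Parameters and writes Cube line files:

from lasso import Parameters, StandardTransit

parameters = Parameters()  # default MetCouncil settings; accepts keyword overrides

# Placeholder paths -- point these at a real GTFS feed and an output directory.
transit = StandardTransit.read_gtfs("my_gtfs_feed_dir", parameters=parameters)
transit.write_as_cube_lin("my_output_dir")  # writes Cube .lin transit line files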

Utils and Functions

+ + + + + + + + + +

util

logger

+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/genindex/index.html b/branch/bicounty_emme/genindex/index.html new file mode 100644 index 0000000..cb4a8a9 --- /dev/null +++ b/branch/bicounty_emme/genindex/index.html @@ -0,0 +1,1038 @@ + + + + + + Index — lasso documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Index

+ +
+ _ + | A + | B + | C + | D + | E + | F + | G + | H + | I + | K + | L + | M + | N + | O + | P + | R + | S + | T + | U + | V + | W + | X + | Y + | Z + +
+

_

+ + +
+ +

A

+ + + +
+ +

B

+ + + +
+ +

C

+ + + +
+ +

D

+ + + +
+ +

E

+ + + +
+ +

F

+ + + +
+ +

G

+ + + +
+ +

H

+ + + +
+ +

I

+ + + +
+ +

K

+ + +
+ +

L

+ + + +
+ +

M

+ + + +
+ +

N

+ + + +
+ +

O

+ + + +
+ +

P

+ + + +
+ +

R

+ + + +
+ +

S

+ + + +
+ +

T

+ + + +
+ +

U

+ + + +
+ +

V

+ + + +
+ +

W

+ + + +
+ +

X

+ + + +
+ +

Y

+ + +
+ +

Z

+ + + +
+ + + +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/index.html b/branch/bicounty_emme/index.html new file mode 100644 index 0000000..5a887f3 --- /dev/null +++ b/branch/bicounty_emme/index.html @@ -0,0 +1,183 @@ + + + + + + + Welcome to lasso’s documentation! — lasso documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Welcome to lasso’s documentation!

+

This package of utilities is a wrapper around the network_wrangler package (http://github.com/wsp-sag/network_wrangler) for MetCouncil. It aims to have the following functionality:

1. parse Cube log files and base highway networks and create ProjectCards for Network Wrangler
2. parse two Cube transit line files and create ProjectCards for Network Wrangler
3. refine Network Wrangler highway networks to contain specific variables and settings for Metropolitan Council and export them to a format that can be read in by Citilabs' Cube software
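As a rough illustration of the first item above, the sketch below turns a Cube network-build log into a project card with the Project class. The keyword names follow the Project reference in this documentation; the file and directory names are purely hypothetical.

from lasso import Project

# Hypothetical paths -- substitute a real Cube log file and base network directory.
project = Project.create_project(
    roadway_log_file="build_changes.log",     # Cube log of highway network edits
    base_roadway_dir="base_roadway_network",  # directory holding the base highway network
)
project.write_project_card("roadway_changes.yml")  # ProjectCard for Network Wrangler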
+ +
+
+

Indices and tables

+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/objects.inv b/branch/bicounty_emme/objects.inv new file mode 100644 index 0000000..5f26290 Binary files /dev/null and b/branch/bicounty_emme/objects.inv differ diff --git a/branch/bicounty_emme/py-modindex/index.html b/branch/bicounty_emme/py-modindex/index.html new file mode 100644 index 0000000..a64daae --- /dev/null +++ b/branch/bicounty_emme/py-modindex/index.html @@ -0,0 +1,136 @@ + + + + + + Python Module Index — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + +

Python Module Index

+ +
+ l +
+ + + + + + + + + + + + + +
 
+ l
+ lasso +
    + lasso.logger +
    + lasso.util +
+ + +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/running/index.html b/branch/bicounty_emme/running/index.html new file mode 100644 index 0000000..36a27f9 --- /dev/null +++ b/branch/bicounty_emme/running/index.html @@ -0,0 +1,133 @@ + + + + + + + Running Lasso — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Running Lasso

+
+

Create project files

+
+
+

Create a scenario

+
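This section has no content yet in this build. As a placeholder illustration, a scenario is typically assembled with network_wrangler before lasso refines it; the sketch below follows the network_wrangler Scenario API referenced elsewhere in this documentation, with hypothetical file names, and exact signatures may differ between versions.

from network_wrangler import ProjectCard, RoadwayNetwork, Scenario, TransitNetwork

# Hypothetical inputs -- replace with real network files and project cards.
road_net = RoadwayNetwork.read(
    link_file="my_links.json",
    node_file="my_nodes.geojson",
    shape_file="my_shapes.geojson",
)
transit_net = TransitNetwork.read("my_transit_feed_dir")

base_scenario = {"road_net": road_net, "transit_net": transit_net}
project_cards = [ProjectCard.read(f) for f in ("project_a.yml", "project_b.yml")]

my_scenario = Scenario.create_scenario(
    base_scenario=base_scenario,
    project_cards_list=project_cards,
)
my_scenario.apply_all_projects()  # apply every project card to the base networks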
+
+

Exporting networks

+
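This section is also empty in this build. The sketch below outlines the intended export flow; the method names come from the ModelRoadwayNetwork reference in this documentation, while the argument names and file paths are placeholders, so treat it as illustrative rather than definitive.

from lasso import ModelRoadwayNetwork

# Placeholder file names for a standard roadway network on disk.
model_net = ModelRoadwayNetwork.read(
    link_filename="my_links.json",
    node_filename="my_nodes.geojson",
    shape_filename="my_shapes.geojson",
)
model_net.roadway_standard_to_met_council_network()  # recode to MetCouncil variables
model_net.write_roadway_as_shp()         # shapefile / CSV outputs
model_net.write_roadway_as_fixedwidth()  # fixed-width files plus a Cube build script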
+
+

Auditing and Reporting

+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/search/index.html b/branch/bicounty_emme/search/index.html new file mode 100644 index 0000000..8bafd66 --- /dev/null +++ b/branch/bicounty_emme/search/index.html @@ -0,0 +1,126 @@ + + + + + + Search — lasso documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + +
  • +
  • +
+
+
+
+
+ + + + +
+ +
+ +
+
+
+ +
+ +
+

© Copyright 2019 Metropolitan Council.

+
+ + Built with Sphinx using a + theme + provided by Read the Docs. + + +
+
+
+
+
+ + + + + + + + + \ No newline at end of file diff --git a/branch/bicounty_emme/searchindex.js b/branch/bicounty_emme/searchindex.js new file mode 100644 index 0000000..8ef2c89 --- /dev/null +++ b/branch/bicounty_emme/searchindex.js @@ -0,0 +1 @@ +Search.setIndex({"docnames": ["_generated/lasso.CubeTransit", "_generated/lasso.ModelRoadwayNetwork", "_generated/lasso.Parameters", "_generated/lasso.Project", "_generated/lasso.StandardTransit", "_generated/lasso.logger", "_generated/lasso.util", "autodoc", "index", "running", "setup", "starting"], "filenames": ["_generated/lasso.CubeTransit.rst", "_generated/lasso.ModelRoadwayNetwork.rst", "_generated/lasso.Parameters.rst", "_generated/lasso.Project.rst", "_generated/lasso.StandardTransit.rst", "_generated/lasso.logger.rst", "_generated/lasso.util.rst", "autodoc.rst", "index.rst", "running.md", "setup.md", "starting.md"], "titles": ["lasso.CubeTransit", "lasso.ModelRoadwayNetwork", "lasso.Parameters", "lasso.Project", "lasso.StandardTransit", "lasso.logger", "lasso.util", "Lasso Classes and Functions", "Welcome to lasso\u2019s documentation!", "Running Lasso", "Setup", "Starting Out"], "terms": {"class": [0, 1, 2, 3, 4, 6, 8, 11], "paramet": [0, 1, 3, 4, 6, 8], "sourc": [0, 1, 2, 3, 4, 5, 6, 11], "base": [0, 1, 2, 3, 4, 6, 8, 11], "object": [0, 1, 2, 3, 4, 6, 11], "store": [0, 1, 11], "inform": [0, 1, 4, 11], "about": [0, 1, 4, 11], "transit": [0, 1, 2, 3, 4, 8], "defin": [0, 1, 2, 3, 11], "cube": [0, 1, 2, 3, 4, 8], "line": [0, 1, 2, 3, 4, 6, 8, 11], "file": [0, 1, 2, 3, 4, 8], "ha": [0, 6, 11], "capabl": [0, 11], "pars": [0, 8, 11], "properti": [0, 1, 2, 4, 6, 11], "shape": [0, 1, 3, 4, 6, 11], "python": [0, 1, 2, 11], "dictionari": [0, 1, 2, 3, 4, 6, 11], "compar": [0, 3, 4, 11], "repres": [0, 2, 4, 6, 11], "chang": [0, 1, 3, 4, 11], "project": [0, 1, 2, 4, 6, 8], "card": [0, 1, 3, 4], "typic": [0, 3, 4, 6, 8], "usag": [0, 1, 3, 4], "exampl": [0, 1, 3, 4, 6, 11], "tn": [0, 11], "create_from_cub": [0, 11], "cube_dir": [0, 3, 11], "transit_change_list": [0, 11], "evaluate_differ": [0, 4, 11], "base_transit_network": [0, 3, 11], "list": [0, 1, 2, 3, 4, 6, 11], "string": [0, 1, 3, 4, 6], "uniqu": [0, 1, 3], "name": [0, 1, 2, 3, 4, 6, 11], "network": [0, 1, 2, 3, 4, 5, 8], "type": [0, 1, 2, 3, 4, 6, 11], "line_properti": 0, "kei": [0, 1, 3, 11], "valu": [0, 1, 3, 4, 6, 11], "ar": [0, 1, 2, 3, 6, 11], "These": 0, "directli": 0, "read": [0, 1, 3, 4, 8, 11], "from": [0, 1, 2, 3, 4, 6, 8], "haven": 0, "t": [0, 1, 6], "been": [0, 6], "translat": [0, 4, 11], "standard": [0, 1, 2, 4, 11], "dict": [0, 1, 2, 3], "panda": [0, 1, 3], "datafram": [0, 1, 3, 4, 11], "node": [0, 1, 2, 3, 4, 6, 11], "follow": [0, 4, 8, 11], "column": [0, 1], "node_id": 0, "int": [0, 1, 2, 3, 4, 6], "posit": [0, 6], "integ": [0, 1, 2], "id": [0, 1, 3, 4, 11], "number": [0, 1, 2, 3, 4, 6, 11], "neg": [0, 6], "indic": [0, 1, 6], "non": [0, 6], "stop": 0, "boolean": [0, 1], "i": [0, 1, 3, 4, 5, 6, 8, 11], "order": [0, 1, 2, 6, 11], "within": [0, 1, 2, 6], "thi": [0, 1, 2, 3, 6, 8, 11], "program_typ": 0, "either": [0, 1, 5, 6, 11], "pt": 0, "trnbld": 0, "str": [0, 1, 2, 3, 6], "instanc": [0, 1, 2, 3, 4, 11], "appli": [0, 1, 6], "which": [0, 1, 3, 6, 11], "includ": [0, 1, 2, 11], "time": [0, 1, 2, 4, 6, 11], "period": [0, 1, 2, 4, 11], "variabl": [0, 1, 2, 3, 4, 8, 11], "source_list": 0, "have": [0, 1, 6, 8], "ad": [0, 1, 3, 6, 11], "diff_dict": 0, "__init__": [0, 1, 2, 3, 4], "constructor": [0, 1, 3], "set": [0, 1, 2, 3, 4, 5, 6, 8, 11], "see": [0, 1, 3, 4, 6], "an": [0, 1, 3, 4, 6, 
11], "method": [0, 1, 2, 3, 4, 6, 11], "add_additional_time_period": 0, "new_time_period_numb": 0, "orig_line_nam": 0, "copi": [0, 1, 3, 6, 11], "rout": [0, 1, 4], "anoth": 0, "appropri": [0, 4], "specif": [0, 1, 8, 11], "new": [0, 1, 6, 11], "under": 0, "self": [0, 1, 2, 3, 4, 6], "origin": [0, 1, 6, 11], "its": [0, 1], "return": [0, 1, 3, 4, 6], "add_cub": 0, "transit_sourc": 0, "lin": [0, 3, 4], "add": [0, 1, 3, 6, 11], "exist": [0, 1, 3, 6, 11], "transitnetwork": [0, 4], "directori": [0, 1, 4, 11], "static": [0, 1, 3, 4], "build_route_nam": 0, "route_id": [0, 4], "time_period": [0, 1, 2], "agency_id": 0, "0": [0, 1, 2, 4, 6, 11], "direction_id": 0, "1": [0, 1, 2, 4, 6, 8, 11], "creat": [0, 1, 2, 3, 4, 6, 8, 11], "contaten": 0, "agenc": 0, "direct": [0, 1, 6, 11], "e": [0, 1, 11], "452": 0, "111": 0, "pk": [0, 2], "construct": [0, 6, 11], "line_nam": 0, "0_452": 0, "111_452_pk1": 0, "calculate_start_end_tim": 0, "line_properties_dict": 0, "calcul": [0, 1, 2, 3, 4, 6], "start": [0, 1, 4, 8], "end": [0, 1, 6], "warn": [0, 1], "doesn": [0, 1], "take": [0, 1], "care": 0, "discongru": 0, "flavor": [0, 1], "create_add_route_card_dict": 0, "format": [0, 1, 2, 4, 8, 11], "route_properti": 0, "being": 0, "updat": [0, 1, 11], "A": [0, 1, 2, 3, 4, 6, 11], "addit": [0, 1, 6, 8, 11], "create_delete_route_card_dict": 0, "base_transit_line_properties_dict": 0, "delet": [0, 1], "style": [0, 6], "attribut": [0, 1, 2, 3, 4, 11], "find": [0, 1, 4], "create_update_route_card_dict": 0, "updated_properties_dict": 0, "cube_properties_to_standard_properti": 0, "cube_properties_dict": 0, "convert": [0, 1, 4, 6], "most": 0, "pertin": 0, "like": [0, 1, 2, 6], "headwai": [0, 2, 4], "varibl": [0, 2], "stnadard": 0, "unit": [0, 1, 6], "minut": 0, "second": [0, 4, 6], "correct": 0, "base_transit": 0, "identifi": [0, 1, 11], "what": [0, 1, 3, 6], "need": [0, 1, 2, 3], "For": [0, 1, 4, 6], "multipl": [0, 1, 6, 11], "make": [0, 1, 11], "duplic": 0, "so": [0, 1, 5, 6, 11], "each": [0, 1, 3, 6], "condit": [0, 1], "contain": [0, 1, 4, 6, 8, 11], "requir": [0, 1, 6, 11], "evalu": [0, 1, 3], "differ": [0, 2, 6], "between": [0, 1, 2, 3, 6], "evaluate_route_property_differ": 0, "properties_build": 0, "properties_bas": 0, "time_period_numb": 0, "absolut": [0, 6], "true": [0, 1, 3, 4, 5, 6, 11], "validate_bas": 0, "fals": [0, 1, 3, 4, 6, 11], "check": [0, 1, 3, 6], "ani": [0, 1, 3, 6, 11], "entri": [0, 3], "property_nam": 0, "property_valu": 0, "us": [0, 1, 2, 3, 4, 6, 11], "command": [0, 1, 3], "rather": [0, 1], "than": [0, 1, 6], "If": [0, 1, 3, 4, 6, 11], "automat": 0, "note": [0, 6, 11], "onli": [0, 1, 3, 4, 6], "numer": [0, 4, 6], "frequenc": [0, 4, 11], "suitabl": 0, "write": [0, 1, 3, 4, 11], "evaluate_route_shape_chang": 0, "shape_build": 0, "shape_bas": 0, "two": [0, 1, 6, 8, 11], "build": [0, 1, 3, 11], "version": [0, 6, 11], "ddatafram": 0, "get_time_period_numbers_from_cube_properti": 0, "properties_list": 0, "associ": [0, 2], "them": [0, 1, 6, 8], "all": [0, 1, 2, 5, 6, 11], "found": [0, 1, 6], "unpack_route_nam": 0, "unpack": 0, "info": [0, 1, 2], "link": [1, 2, 3, 6, 11], "kwarg": [1, 2, 3, 6], "roadwaynetwork": [1, 3, 4], "subclass": [1, 11], "network_wrangl": [1, 8, 11], "represent": [1, 3, 4, 6], "physic": 1, "roadwai": [1, 2, 3, 11], "geodatafram": [1, 11], "specifi": [1, 3, 6], "default": [1, 2, 3, 4, 6, 11], "cr": [1, 3, 6], "coordin": [1, 3, 6], "refer": [1, 3, 4, 6, 11], "system": [1, 3], "espg": [1, 3], "node_foreign_kei": [1, 3], "tabl": [1, 2, 3, 6], "link_foreign_kei": [1, 3], "foreign": [1, 3], 
"shape_foreign_kei": [1, 3, 11], "unique_link_id": [1, 3], "unique_node_id": [1, 3], "modes_to_network_link_vari": [1, 3], "map": [1, 2, 3, 6, 11], "mode": [1, 3, 4, 11], "modes_to_network_nodes_vari": [1, 3], "managed_lanes_node_id_scalar": [1, 3], "scalar": [1, 3, 6], "primari": [1, 3], "correspond": [1, 3], "manag": [1, 2, 3, 11], "lane": [1, 2, 3, 11], "managed_lanes_link_id_scalar": [1, 3], "managed_lanes_required_attribut": [1, 3], "must": [1, 3, 6], "keep_same_attributes_ml_and_gp": [1, 3], "parallel": [1, 3, 6], "gener": [1, 3], "purpos": [1, 3, 6], "add_count": 1, "network_vari": 1, "aadt": 1, "mndot_count_shst_data": 1, "none": [1, 3, 4, 5, 6], "widot_count_shst_data": 1, "mndot_count_variable_shp": [1, 2], "widot_count_variable_shp": 1, "count": [1, 2], "mc": [1, 2, 4], "join": [1, 3, 4, 6, 11], "data": [1, 2, 3, 4, 8, 11], "via": 1, "shst": 1, "api": 1, "match": 1, "result": [1, 6, 11], "should": [1, 2, 3, 6, 11], "written": [1, 11], "path": [1, 3, 4, 6, 11], "mndot": [1, 2], "locat": [1, 2, 3, 4], "widot": 1, "geodatabas": 1, "add_incident_link_data_to_nod": 1, "links_df": [1, 11], "nodes_df": [1, 11], "link_vari": 1, "unique_node_kei": 1, "model_node_id": [1, 2], "go": [1, 11], "assess": [1, 3], "connect": 1, "incid": 1, "where": 1, "length": [1, 6], "n": [1, 2, 3, 6, 11], "out": [1, 2, 5, 6, 8], "add_new_roadway_feature_chang": 1, "featur": [1, 6], "also": [1, 6, 11], "valid": [1, 6, 11], "add_variable_using_shst_refer": 1, "var_shst_csvdata": 1, "shst_csv_variabl": 1, "network_var_typ": 1, "overwrit": 1, "bool": [1, 3, 6], "addition_map": 1, "show": 1, "project_card_dictionari": 1, "wrapper": [1, 8, 11], "apply_managed_lane_feature_chang": 1, "link_idx": 1, "in_plac": [1, 11], "lndice": 1, "whether": 1, "decid": 1, "connector": [1, 2], "when": [1, 3, 6], "thei": [1, 2], "more": [1, 6, 11], "apply_python_calcul": 1, "pycod": 1, "execut": 1, "code": [1, 2, 6, 11], "apply_roadway_feature_chang": [1, 11], "select": [1, 11], "pass": [1, 5, 6], "assess_connect": [1, 11], "ignore_end_nod": [1, 11], "graph": 1, "disconnect": 1, "subgraph": 1, "describ": [1, 11], "member": [1, 6], "one": [1, 6, 11], "drive": [1, 11], "walk": [1, 11], "bike": 1, "ignor": [1, 6], "strai": 1, "singleton": 1, "tupl": [1, 6], "osmnx": [1, 11], "networkx": 1, "digraph": 1, "build_selection_kei": 1, "selection_dict": 1, "combin": [1, 2, 4, 6], "queri": [1, 11], "b": [1, 2, 11], "you": [1, 3, 11], "selection_dictonari": 1, "serv": 1, "calculate_area_typ": 1, "area_type_shap": [1, 2], "area_type_shape_vari": 1, "area_typ": [1, 2, 3], "area_type_codes_dict": 1, "downtown_area_type_shap": [1, 2], "downtown_area_typ": [1, 2], "area": [1, 2, 3, 6], "centroid": [1, 2, 6], "geometri": [1, 2, 6], "field": [1, 2, 11], "determin": [1, 3, 6], "label": 1, "isn": 1, "perfect": 1, "much": 1, "quicker": 1, "other": [1, 6, 11], "The": [1, 2, 5, 6, 11], "geodadabas": 1, "input": [1, 6], "downtown": [1, 2], "boundari": [1, 2, 6], "counti": [1, 2, 3], "calculate_centroidconnect": 1, "centroidconnect": [1, 2, 3], "highest_taz_numb": [1, 2], "as_integ": 1, "max": 1, "taz": [1, 2], "calculate_counti": 1, "county_shap": [1, 2], "county_shape_vari": 1, "county_codes_dict": 1, "calculate_dist": 1, "distanc": [1, 2, 3, 6], "centroidconnect_onli": 1, "mile": 1, "centroidconnector": 1, "calculate_mpo": 1, "county_network_vari": 1, "mpo": [1, 2], "mpo_counti": [1, 2], "param": [1, 4], "county_vari": 1, "region": [1, 6], "calculate_us": 1, "defauli": 1, "convert_int": 1, "int_col_nam": 1, "create_ml_vari": 1, "ml_lane": [1, 2], "ml": 1, 
"placehold": 1, "come": 1, "log": [1, 3, 8], "create_calculated_vari": 1, "create_dummy_connector_link": 1, "ml_df": 1, "access_lan": 1, "egress_lan": 1, "access_roadwai": 1, "ml_access": 1, "egress_roadwai": 1, "access_name_prefix": 1, "access": [1, 4, 6], "dummi": 1, "egress_name_prefix": 1, "egress": 1, "gp_df": 1, "roadai": 1, "prefix": 1, "create_hov_corridor_vari": 1, "segment_id": [1, 2], "hov": 1, "corridor": 1, "create_managed_lane_network": [1, 11], "keep_additional_attributes_ml_and_gp": 1, "separ": [1, 3, 6], "look": 1, "want": [1, 11], "leav": 1, "some": 1, "rigor": 1, "test": [1, 2, 6], "create_managed_vari": 1, "dataframe_to_fixed_width": 1, "df": 1, "fix": [1, 2], "width": [1, 6], "transform": [1, 6], "delete_roadway_feature_chang": 1, "ignore_miss": 1, "get": [1, 2, 6, 11], "miss": 1, "fail": [1, 6], "deletion_map": 1, "fill_na": 1, "fill": [1, 6], "na": 1, "from_roadwaynetwork": [1, 11], "roadway_network_object": 1, "get_attribut": 1, "join_kei": 1, "source_shst_ref_df": 1, "source_gdf": 1, "field_nam": 1, "get_managed_lane_node_id": 1, "nodes_list": 1, "4500000": 1, "237": 1, "get_modal_graph": 1, "bike_access": [1, 2], "bu": [1, 4], "bus_onli": [1, 2], "drive_access": [1, 2, 11], "rail": [1, 4], "rail_onli": [1, 2], "walk_access": [1, 2], "strongli": 1, "vertex": [1, 6], "reachabl": 1, "everi": [1, 6], "get_modal_links_nod": 1, "kept": 1, "both": [1, 4, 6, 11], "filter": [1, 6], "right": [1, 6], "now": 1, "we": [1, 11], "don": 1, "becaus": [1, 6, 11], "mark": 1, "issu": 1, "discuss": 1, "http": [1, 4, 6, 8, 11], "github": [1, 6, 8, 11], "com": [1, 4, 6, 8, 11], "wsp": [1, 8, 11], "sag": [1, 8, 11], "145": 1, "modal_nodes_df": 1, "mode_node_vari": 1, "get_property_by_time_period_and_group": 1, "prop": 1, "categori": [1, 2, 11], "default_return": 1, "seri": [1, 11], "group": 1, "16": [1, 2, 6], "00": [1, 2], "19": [1, 2], "option": [1, 6], "sov": [1, 2], "search": [1, 2, 8, 11], "hov3": [1, 2], "hov2": [1, 2], "identify_seg": 1, "o_id": 1, "d_id": 1, "endpoint": 1, "up": [1, 5, 11], "segment": [1, 4, 6, 11], "candid": 1, "otherwis": [1, 4, 6], "ram": 1, "hog": 1, "could": [1, 6, 11], "odd": 1, "shortest": 1, "segment_vari": 1, "keep": [1, 6], "identify_segment_endpoint": 1, "min_connecting_link": 1, "10": [1, 2, 6], "min_dist": 1, "max_link_devi": 1, "2": [1, 2, 4, 6, 11], "is_network_connect": [1, 11], "consid": [1, 6, 11], "cach": 1, "long": [1, 6], "load_transform_network": 1, "node_filenam": [1, 11], "link_filenam": [1, 11], "shape_filenam": [1, 11], "4326": [1, 6], "validate_schema": 1, "disk": 1, "schema": [1, 11], "shapes_df": [1, 11], "network_connection_plot": 1, "g": [1, 6], "disconnected_subgraph_nod": 1, "plot": 1, "fig": 1, "ax": [1, 6], "orig_dest_nodes_foreign_kei": 1, "whatev": 1, "u": 1, "v": [1, 2, 11], "ab": 1, "noth": 1, "assum": 1, "a_id": 1, "b_id": 1, "ox_graph": 1, "unique_link_kei": 1, "model_link_id": [1, 2, 3], "arrai": [1, 6], "remov": [1, 6], "certain": 1, "do": [1, 6, 11], "too": [1, 5], "link_df": 1, "referenc": 1, "multidigraph": 1, "path_search": 1, "candidate_links_df": 1, "weight_column": 1, "weight_factor": 1, "search_breadth": 1, "5": [1, 2, 4, 6], "max_search_breadth": 1, "candidate_link": 1, "part": [1, 6, 11], "foreigh": 1, "destin": 1, "weight": 1, "iter": [1, 6], "multipli": 1, "fast": [1, 11], "recalculate_calculated_vari": [1, 3], "recalculate_dist": [1, 3], "json": [1, 11], "geojson": 1, "skip": 1, "speed": 1, "spatial": [1, 6], "etc": [1, 2, 3, 11], "re": 1, "read_match_result": 1, "lot": 1, "same": [1, 6], "concaten": 1, 
"singl": [1, 3, 6], "geopanda": [1, 11], "sure": 1, "why": 1, "util": [1, 8, 11], "rename_variables_for_dbf": 1, "input_df": 1, "variable_crosswalk": 1, "output_vari": [1, 2], "convert_geometry_to_xi": 1, "renam": [1, 3], "dbf": 1, "shp": [1, 2], "char": 1, "crosswalk": [1, 3], "x": [1, 2, 6], "y": [1, 2, 6], "roadway_net_to_gdf": 1, "roadway_net": 1, "turn": [1, 11], "export": [1, 8, 11], "sophist": 1, "attach": 1, "roadway_standard_to_met_council_network": 1, "output_epsg": [1, 2], "consist": [1, 6, 11], "metcouncil": [1, 2, 4, 8, 11], "": [1, 2, 4, 6, 11], "model": [1, 2], "expect": [1, 6], "epsg": [1, 2, 6], "output": [1, 2, 3, 4, 6], "select_roadway_featur": [1, 11], "search_mod": 1, "force_search": 1, "sp_weight_factor": 1, "satisfi": [1, 6], "criteria": 1, "net": [1, 11], "osm": [1, 2], "share": [1, 11], "street": [1, 11], "osm_model_link_id": 1, "1234": 1, "shstid": 1, "4321": 1, "regex": 1, "facil": [1, 2, 11], "main": 1, "st": [1, 2], "least": [1, 11], "perform": 1, "even": 1, "previou": 1, "discourag": 1, "meander": 1, "ref": 1, "here": [1, 6], "defaul": 1, "selection_has_unique_link_id": 1, "selection_dictionari": 1, "selection_map": 1, "selected_link_idx": 1, "candidate_link_idx": 1, "selected_links_idx": 1, "candidate_links_idx": 1, "shortest_path": 1, "graph_links_df": 1, "100": 1, "four": 1, "nx": 1, "split_properties_by_time_period_and_categori": 1, "properties_to_split": [1, 2], "split": [1, 2, 4], "structur": 1, "stratifi": 1, "times_period": 1, "am": [1, 2], "6": [1, 2, 4, 6], "pm": [1, 2], "15": [1, 2], "update_dist": 1, "use_shap": 1, "inplac": 1, "straight": 1, "avail": 1, "portion": 1, "provid": [1, 3, 4, 6, 11], "entir": 1, "crow": 1, "fly": 1, "meter": [1, 6], "nan": 1, "validate_link_schema": 1, "schema_loc": 1, "roadway_network_link": 1, "validate_node_schema": 1, "node_fil": 1, "roadway_network_nod": 1, "validate_properti": 1, "ignore_exist": 1, "require_existing_for_chang": 1, "theproject": 1, "dictonari": 1, "validate_select": 1, "selection_requir": 1, "whetther": 1, "minimum": [1, 5, 6], "validate_shape_schema": 1, "shape_fil": 1, "roadway_network_shap": 1, "validate_uniqu": 1, "confirm": 1, "met": 1, "filenam": [1, 3, 11], "were": 1, "save": 1, "write_roadway_as_fixedwidth": [1, 11], "output_dir": 1, "node_output_vari": 1, "link_output_vari": 1, "output_link_txt": [1, 2], "output_node_txt": [1, 2], "output_link_header_width_txt": [1, 2], "output_node_header_width_txt": [1, 2], "output_cube_network_script": [1, 2], "drive_onli": 1, "function": [1, 5, 6, 8, 11], "doe": [1, 4], "header": [1, 2], "3": [1, 2, 4, 6, 11], "script": [1, 2], "run": [1, 8], "databas": [1, 3], "record": 1, "write_roadway_as_shp": [1, 11], "data_to_csv": 1, "data_to_dbf": 1, "output_link_shp": [1, 2], "output_node_shp": [1, 2], "output_link_csv": [1, 2], "output_node_csv": [1, 2], "output_gpkg": 1, "output_link_gpkg_lay": 1, "output_node_gpkg_lay": 1, "output_gpkg_link_filt": 1, "gpkg": 1, "csv": [1, 2, 3, 11], "full": [1, 6], "geopackag": 1, "layer": [1, 11], "subset": 1, "calculated_valu": [1, 3], "dai": [2, 4, 11], "can": [2, 6, 8, 11], "runtim": [2, 11], "initi": [2, 3, 11], "keyword": [2, 6, 11], "argument": [2, 6, 11], "explicitli": [2, 11], "highlight": [2, 11], "attr": 2, "time_period_to_tim": 2, "abbrevi": [2, 4], "gtf": [2, 4, 11], "highwai": [2, 3, 8, 11], "ea": 2, "md": 2, "ev": 2, "cube_time_period": 2, "4": [2, 4, 6], "demand": 2, "allow": [2, 6], "suffix": 2, "truck": 2, "trk": 2, "final": 2, "lanes_am": 2, "time_periods_to_tim": 2, "shapefil": 2, "r": 2, 
"metcouncil_data": 2, "cb_2017_us_county_5m": 2, "county_variable_shp": 2, "lanes_lookup_fil": 2, "lookup": 2, "centroid_connect_lan": 2, "anoka": 2, "dakota": 2, "hennepin": 2, "ramsei": 2, "scott": 2, "washington": 2, "carver": 2, "taz_shap": 2, "tazofficialwcurrentforecast": 2, "taz_data": 2, "highest": 2, "3100": 2, "link_id": 2, "shstgeometryid": 2, "roadway_class": 2, "truck_access": 2, "trn_priority_ea": 2, "trn_priority_am": 2, "trn_priority_md": 2, "trn_priority_pm": 2, "trn_priority_ev": 2, "ttime_assert_ea": 2, "ttime_assert_am": 2, "ttime_assert_md": 2, "ttime_assert_pm": 2, "ttime_assert_ev": 2, "lanes_ea": 2, "lanes_md": 2, "lanes_pm": 2, "lanes_ev": 2, "price_sov_ea": 2, "price_hov2_ea": 2, "price_hov3_ea": 2, "price_truck_ea": 2, "price_sov_am": 2, "price_hov2_am": 2, "price_hov3_am": 2, "price_truck_am": 2, "price_sov_md": 2, "price_hov2_md": 2, "price_hov3_md": 2, "price_truck_md": 2, "price_sov_pm": 2, "price_hov2_pm": 2, "price_hov3_pm": 2, "price_truck_pm": 2, "price_sov_ev": 2, "price_hov2_ev": 2, "price_hov3_ev": 2, "price_truck_ev": 2, "roadway_class_idx": 2, "facility_typ": 2, "osm_node_id": [2, 6, 11], "bike_nod": 2, "transit_nod": 2, "walk_nod": 2, "drive_nod": 2, "ml_lanes_ea": 2, "ml_lanes_am": 2, "ml_lanes_md": 2, "ml_lanes_pm": 2, "ml_lanes_ev": 2, "osm_facility_type_dict": 2, "thrivemsp2040communitydesign": 2, "area_type_variable_shp": 2, "comdes2040": 2, "area_type_code_dict": 2, "23": 2, "urban": [2, 4], "center": [2, 6], "24": 2, "25": 2, "35": 2, "36": 2, "41": 2, "51": 2, "52": 2, "53": 2, "60": 2, "downtownzones_taz": 2, "mrcc_roadway_class_shap": 2, "mrcc": 2, "trans_mrcc_centerlin": 2, "mrcc_roadway_class_variable_shp": 2, "mrcc_roadway_class_shp": 2, "route_si": 2, "widot_roadway_class_shap": 2, "wisconsin": 2, "wisconsin_lanes_counts_median": 2, "wislr": 2, "widot_roadway_class_variable_shp": 2, "rdwy_ctgy_": 2, "mndot_count_shap": 2, "count_mn": 2, "aadt_2017_count_loc": 2, "osm_highway_facility_type_crosswalk": 2, "legacy_tm2_attribut": 2, "shstreferenceid": 2, "legaci": 2, "tm2": 2, "osm_lanes_attribut": 2, "tam_tm2_attribut": 2, "tam": 2, "tom_tom_attribut": 2, "tomtom": 2, "tomtom_attribut": 2, "sfcta_attribut": 2, "sfcta": 2, "geograph": 2, "102646": 2, "scratch": 2, "txt": [2, 11], "links_header_width": 2, "nodes_header_width": 2, "import": [2, 6], "make_complete_network_from_fixed_width_fil": 2, "county_link_range_dict": 2, "county_code_dict": 2, "7": [2, 4, 11], "extern": 2, "chisago": 2, "11": 2, "goodhu": 2, "12": 2, "isanti": 2, "13": 2, "le": 2, "sueur": 2, "14": 2, "mcleod": 2, "pierc": 2, "polk": 2, "17": 2, "rice": 2, "18": 2, "sherburn": 2, "siblei": 2, "20": 2, "croix": 2, "21": 2, "wright": 2, "22": 2, "route_type_bus_mode_dict": 2, "urb": 2, "loc": 2, "sub": [2, 6], "express": [2, 4, 6], "route_type_mode_dict": 2, "8": [2, 4, 6], "9": [2, 4], "cube_time_periods_nam": 2, "op": 2, "detail": [2, 5], "zone": 2, "possibl": [2, 6], "roadway_link_chang": 3, "roadway_node_chang": 3, "transit_chang": [3, 4], "base_roadway_network": 3, "base_cube_transit_network": 3, "build_cube_transit_network": 3, "project_nam": 3, "produc": [3, 6], "test_project": [3, 11], "create_project": [3, 11], "base_cube_transit_sourc": 3, "o": [3, 4, 11], "build_cube_transit_sourc": 3, "transit_route_shape_chang": [3, 11], "evaluate_chang": [3, 11], "write_project_card": [3, 11], "scratch_dir": [3, 11], "t_transit_shape_test": [3, 11], "yml": [3, 11], "default_project_nam": 3, "level": 3, "constant": 3, "static_valu": 3, "card_data": 3, "cubetransit": [3, 8], 
"bunch": 3, "projectcard": [3, 8], "case": 3, "standardtransit": [3, 8], "add_highway_chang": 3, "limit_variables_to_existing_network": 3, "hoc": [3, 11], "add_transit_chang": 3, "roadway_log_fil": 3, "roadway_shp_fil": 3, "roadway_csv_fil": 3, "network_build_fil": 3, "emme_node_id_crosswalk_fil": 3, "emme_name_crosswalk_fil": 3, "base_roadway_dir": 3, "base_transit_dir": [3, 4, 11], "consum": 3, "logfil": 3, "emm": [3, 4], "folder": 3, "base_cube_transit_fil": 3, "build_cube_transit_fil": 3, "first": [3, 6], "recalcul": 3, "determine_roadway_network_changes_compat": 3, "emme_id_to_wrangler_id": 3, "emme_link_change_df": 3, "emme_node_change_df": 3, "emme_transit_changes_df": 3, "rewrit": 3, "wrangler": [3, 8, 11], "emme_name_to_wrangler_nam": 3, "aggreg": 3, "get_object_from_network_build_command": 3, "row": [3, 4], "histori": 3, "l": 3, "get_operation_from_network_build_command": 3, "action": 3, "c": [3, 6], "d": [3, 6], "read_logfil": 3, "logfilenam": 3, "reprsent": [3, 11], "read_network_build_fil": 3, "networkbuildfilenam": 3, "nework": 3, "assign_group": 3, "user": [3, 11], "TO": 3, "ptg_feed": 4, "hold": [4, 11], "feed": [4, 11], "partridg": [4, 11], "manipul": [4, 11], "cube_transit_net": [4, 11], "read_gtf": [4, 11], "write_as_cube_lin": [4, 11], "write_dir": [4, 11], "outfil": [4, 11], "calculate_cube_mod": 4, "assign": 4, "logic": 4, "route_typ": 4, "develop": [4, 11], "googl": 4, "cube_mod": 4, "route_type_to_cube_mod": 4, "tram": 4, "streetcar": 4, "light": 4, "further": 4, "disaggreg": 4, "buse": 4, "suburban": 4, "longnam": 4, "lower": [4, 6], "elif": 4, "99": 4, "local": [4, 11], "els": [4, 6], "route_long_nam": 4, "cube_format": 4, "represnt": 4, "notat": 4, "trip": 4, "trip_id": 4, "shape_id": [4, 11], "tod": 4, "onewai": 4, "oper": [4, 6], "fromtransitnetwork": 4, "transit_network_object": 4, "modelroadwaynetwork": [4, 8], "gtfs_feed_dir": 4, "route_properties_gtfs_to_cub": 4, "prepar": 4, "trip_df": 4, "shape_gtfs_to_cub": 4, "add_nntim": 4, "shape_gtfs_to_emm": 4, "trip_row": 4, "time_to_cube_time_period": 4, "start_time_sec": 4, "as_str": 4, "verbos": 4, "midnight": [4, 6], "this_tp": 4, "this_tp_num": 4, "outpath": 4, "after": 4, "setuplog": 5, "infologfilenam": 5, "debuglogfilenam": 5, "logtoconsol": 5, "infolog": 5, "ters": 5, "just": 5, "give": 5, "bare": 5, "composit": 5, "clear": 5, "later": 5, "debuglog": 5, "veri": [5, 6], "noisi": 5, "debug": 5, "spew": 5, "consol": 5, "point": 6, "arg": 6, "basegeometri": 6, "possibli": 6, "z": 6, "zero": 6, "dimension": 6, "float": 6, "sequenc": 6, "individu": 6, "p": 6, "print": 6, "almost_equ": 6, "decim": 6, "equal": 6, "place": 6, "deprec": 6, "sinc": 6, "confus": 6, "equals_exact": 6, "instead": 6, "approxim": 6, "compon": [6, 8], "linestr": 6, "1e": 6, "buffer": 6, "quad_seg": 6, "cap_styl": 6, "round": 6, "join_styl": 6, "mitre_limit": 6, "single_sid": 6, "dilat": 6, "eros": 6, "small": 6, "mai": 6, "sometim": 6, "tidi": 6, "polygon": 6, "around": [6, 8, 11], "resolut": 6, "angl": 6, "fillet": 6, "buffercapstyl": 6, "squar": 6, "flat": 6, "circular": 6, "rectangular": 6, "while": 6, "involv": 6, "bufferjoinstyl": 6, "mitr": 6, "bevel": 6, "midpoint": 6, "edg": [6, 8], "touch": 6, "depend": 6, "limit": 6, "ratio": 6, "sharp": 6, "corner": 6, "offset": 6, "meet": 6, "miter": 6, "extend": 6, "To": [6, 11], "prevent": 6, "unreason": 6, "control": 6, "maximum": 6, "exce": 6, "side": 6, "sign": 6, "left": 6, "hand": 6, "regular": 6, "cap": 6, "alwai": 6, "forc": 6, "equival": 6, "cap_flat": 6, "quadseg": 6, "alia": 6, 
"strictli": 6, "wkt": 6, "load": 6, "gon": 6, "approx": 6, "radiu": 6, "circl": 6, "1365484905459": 6, "128": 6, "141513801144": 6, "triangl": 6, "exterior": 6, "coord": 6, "contains_properli": 6, "complet": 6, "common": 6, "document": [6, 11], "covered_bi": 6, "cover": 6, "cross": 6, "grid_siz": 6, "disjoint": 6, "unitless": 6, "dwithin": 6, "given": [6, 11], "topolog": 6, "toler": 6, "comparison": 6, "geometrytyp": 6, "hausdorff_dist": 6, "hausdorff": 6, "interpol": 6, "normal": 6, "along": 6, "linear": 6, "taken": 6, "measur": 6, "revers": 6, "rang": 6, "index": [6, 8], "handl": 6, "clamp": 6, "interpret": 6, "fraction": 6, "line_interpolate_point": 6, "intersect": 6, "line_locate_point": 6, "nearest": 6, "form": 6, "canon": 6, "ring": 6, "multi": 6, "multilinestr": 6, "overlap": 6, "point_on_surfac": 6, "guarante": 6, "cheapli": 6, "representative_point": 6, "relat": 6, "de": 6, "9im": 6, "matrix": 6, "relate_pattern": 6, "pattern": 6, "relationship": 6, "interior": 6, "unchang": 6, "is_ccw": 6, "clockwis": 6, "max_segment_length": 6, "vertic": 6, "longer": 6, "evenli": 6, "subdivid": 6, "densifi": 6, "unmodifi": 6, "array_lik": 6, "greater": 6, "simplifi": 6, "preserve_topologi": 6, "dougla": 6, "peucker": 6, "algorithm": 6, "unless": 6, "topologi": 6, "preserv": 6, "invalid": 6, "svg": 6, "scale_factor": 6, "fill_color": 6, "opac": 6, "element": 6, "factor": 6, "diamet": 6, "hex": 6, "color": 6, "66cc99": 6, "ff3333": 6, "symmetric_differ": 6, "symmetr": 6, "union": 6, "dimens": 6, "bound": 6, "collect": 6, "empti": 6, "null": 6, "minx": 6, "mini": 6, "maxx": 6, "maxi": 6, "geometr": 6, "convex_hul": 6, "convex": 6, "hull": 6, "less": 6, "three": [6, 11], "multipoint": 6, "triangular": 6, "imagin": 6, "elast": 6, "band": 6, "stretch": 6, "coordinatesequ": 6, "envelop": 6, "figur": 6, "geom_typ": 6, "has_z": 6, "is_clos": 6, "close": 6, "applic": 6, "is_empti": 6, "is_r": 6, "is_simpl": 6, "simpl": 6, "mean": 6, "is_valid": 6, "definit": 6, "minimum_clear": 6, "move": 6, "minimum_rotated_rectangl": 6, "orient": 6, "rotat": 6, "rectangl": 6, "enclos": 6, "unlik": 6, "constrain": 6, "degener": 6, "oriented_envelop": 6, "wkb": 6, "wkb_hex": 6, "xy": 6, "shell": 6, "hole": 6, "It": [6, 8], "space": 6, "pair": [6, 11], "tripl": 6, "abov": 6, "classmethod": 6, "from_bound": 6, "xmin": 6, "ymin": 6, "xmax": 6, "ymax": 6, "stroke": 6, "partial": 6, "func": 6, "futur": 6, "call": 6, "column_name_to_part": 6, "create_locationrefer": 6, "geodesic_point_buff": 6, "lat": 6, "lon": 6, "get_shared_streets_intersection_hash": 6, "per": [6, 11], "sharedstreet": 6, "j": 6, "blob": 6, "0e6d7de0aee2e9ae3b007d1e45284b06cc241d02": 6, "src": 6, "l553": 6, "l565": 6, "93": 6, "0965985": 6, "44": 6, "952112199999995": 6, "954734870": 6, "69f13f881649cb21ee3b359730790bb9": 6, "hhmmss_to_datetim": 6, "hhmmss_str": 6, "datetim": 6, "hh": 6, "mm": 6, "ss": 6, "dt": 6, "secs_to_datetim": 6, "sec": 6, "shorten_nam": 6, "geom": 6, "xp": 6, "yp": 6, "zp": 6, "shall": 6, "ident": 6, "def": 6, "id_func": 6, "g2": 6, "g1": 6, "pyproj": 6, "accur": 6, "wgs84": 6, "utm": 6, "32618": 6, "from_cr": 6, "always_xi": 6, "support": 6, "lambda": 6, "unidecod": 6, "error": 6, "replace_str": 6, "transliter": 6, "unicod": 6, "ascii": 6, "\u5317\u4eb0": 6, "bei": 6, "jing": 6, "tri": 6, "codec": 6, "charact": 6, "fall": 6, "back": 6, "five": 6, "faster": 6, "slightli": 6, "slower": 6, "unicode_expect_nonascii": 6, "present": 6, "replac": [6, 11], "strict": 6, "rais": 6, "unidecodeerror": 6, "substitut": 6, "might": [6, 11], 
"packag": [8, 11], "aim": 8, "networkwrangl": [8, 11], "refin": 8, "metropolitan": 8, "council": 8, "citilab": 8, "softwar": [8, 11], "instal": 8, "bleed": 8, "clone": 8, "brief": 8, "intro": 8, "workflow": 8, "quickstart": 8, "jupyt": 8, "notebook": 8, "setup": 8, "scenario": 8, "audit": 8, "report": 8, "logger": 8, "modul": 8, "page": 8, "suggest": 11, "virtualenv": 11, "conda": 11, "virtual": 11, "environ": 11, "recommend": 11, "pip": 11, "lasso": 11, "config": 11, "channel": 11, "forg": 11, "rtree": 11, "my_lasso_environ": 11, "activ": 11, "git": 11, "master": 11, "pypi": 11, "repositori": 11, "date": 11, "branch": 11, "work": 11, "your": 11, "machin": 11, "edit": 11, "plan": 11, "well": 11, "cd": 11, "team": 11, "contribut": 11, "bxack": 11, "pleas": 11, "fork": 11, "befor": 11, "upstream": 11, "tag": 11, "branchnam": 11, "frequent": 11, "instruct": 11, "good": 11, "atom": 11, "sublim": 11, "text": 11, "syntax": 11, "desktop": 11, "built": 11, "mashup": 11, "open": 11, "In": 11, "nest": 11, "span": 11, "implement": 11, "novel": 11, "travel": 11, "break": 11, "publictransport": 11, "done": 11, "gui": 11, "public": 11, "transport": 11, "infrastructur": 11, "servic": 11, "tier": 11, "made": 11, "mainli": 11, "my_link_fil": 11, "my_node_fil": 11, "my_shape_fil": 11, "my_select": 11, "35e": 11, "961117623": 11, "2564047368": 11, "my_chang": 11, "my_net": 11, "ml_net": 11, "_": 11, "disconnected_nod": 11, "my_out_prefix": 11, "my_dir": 11, "my_base_scenario": 11, "road_net": 11, "stpaul_link_fil": 11, "stpaul_node_fil": 11, "stpaul_shape_fil": 11, "transit_net": 11, "stpaul_dir": 11, "card_filenam": 11, "3_multiple_roadway_attribute_chang": 11, "multiple_chang": 11, "4_simple_managed_lan": 11, "project_card_directori": 11, "project_card": 11, "project_cards_list": 11, "my_scenario": 11, "create_scenario": 11, "base_scenario": 11, "check_scenario_requisit": 11, "apply_all_project": 11, "scenario_summari": 11, "base_transit_sourc": 11, "build_transit_sourc": 11, "understand": 11, "how": 11, "overrid": 11, "those": 11, "instanti": 11, "yaml": 11, "configur": 11, "config_fil": 11, "f": 11, "my_config": 11, "safe_load": 11, "model_road_net": 11, "my_paramet": 11, "accomplish": 11, "goal": 11, "top": 11, "learn": 11, "basic": 11, "creation": 11, "ipynb": 11}, "objects": {"": [[7, 0, 0, "-", "lasso"]], "lasso": [[0, 1, 1, "", "CubeTransit"], [1, 1, 1, "", "ModelRoadwayNetwork"], [2, 1, 1, "", "Parameters"], [3, 1, 1, "", "Project"], [4, 1, 1, "", "StandardTransit"], [5, 0, 0, "-", "logger"], [6, 0, 0, "-", "util"]], "lasso.CubeTransit": [[0, 2, 1, "", "__init__"], [0, 2, 1, "", "add_additional_time_periods"], [0, 2, 1, "", "add_cube"], [0, 2, 1, "", "build_route_name"], [0, 2, 1, "", "calculate_start_end_times"], [0, 2, 1, "", "create_add_route_card_dict"], [0, 2, 1, "", "create_delete_route_card_dict"], [0, 2, 1, "", "create_from_cube"], [0, 2, 1, "", "create_update_route_card_dict"], [0, 2, 1, "", "cube_properties_to_standard_properties"], [0, 3, 1, "", "diff_dict"], [0, 2, 1, "", "evaluate_differences"], [0, 2, 1, "", "evaluate_route_property_differences"], [0, 2, 1, "", "evaluate_route_shape_changes"], [0, 2, 1, "", "get_time_period_numbers_from_cube_properties"], [0, 3, 1, "", "line_properties"], [0, 3, 1, "", "lines"], [0, 3, 1, "", "parameters"], [0, 3, 1, "", "program_type"], [0, 3, 1, "", "shapes"], [0, 3, 1, "", "source_list"], [0, 2, 1, "", "unpack_route_name"]], "lasso.ModelRoadwayNetwork": [[1, 3, 1, "", "CALCULATED_VALUES"], [1, 2, 1, "", "__init__"], [1, 2, 1, "", "add_counts"], [1, 
2, 1, "", "add_incident_link_data_to_nodes"], [1, 2, 1, "", "add_new_roadway_feature_change"], [1, 2, 1, "", "add_variable_using_shst_reference"], [1, 2, 1, "", "addition_map"], [1, 2, 1, "", "apply"], [1, 2, 1, "", "apply_managed_lane_feature_change"], [1, 2, 1, "", "apply_python_calculation"], [1, 2, 1, "", "apply_roadway_feature_change"], [1, 2, 1, "", "assess_connectivity"], [1, 2, 1, "", "build_selection_key"], [1, 2, 1, "", "calculate_area_type"], [1, 2, 1, "", "calculate_centroidconnect"], [1, 2, 1, "", "calculate_county"], [1, 2, 1, "", "calculate_distance"], [1, 2, 1, "", "calculate_mpo"], [1, 2, 1, "", "calculate_use"], [1, 2, 1, "", "convert_int"], [1, 2, 1, "", "create_ML_variable"], [1, 2, 1, "", "create_calculated_variables"], [1, 2, 1, "", "create_dummy_connector_links"], [1, 2, 1, "", "create_hov_corridor_variable"], [1, 2, 1, "", "create_managed_lane_network"], [1, 2, 1, "", "create_managed_variable"], [1, 2, 1, "", "dataframe_to_fixed_width"], [1, 2, 1, "", "delete_roadway_feature_change"], [1, 2, 1, "", "deletion_map"], [1, 2, 1, "", "fill_na"], [1, 2, 1, "", "from_RoadwayNetwork"], [1, 2, 1, "", "get_attribute"], [1, 2, 1, "", "get_managed_lane_node_ids"], [1, 2, 1, "", "get_modal_graph"], [1, 2, 1, "", "get_modal_links_nodes"], [1, 2, 1, "", "get_property_by_time_period_and_group"], [1, 2, 1, "", "identify_segment"], [1, 2, 1, "", "identify_segment_endpoints"], [1, 2, 1, "", "is_network_connected"], [1, 2, 1, "", "load_transform_network"], [1, 2, 1, "", "network_connection_plot"], [1, 2, 1, "", "orig_dest_nodes_foreign_key"], [1, 2, 1, "", "ox_graph"], [1, 2, 1, "", "path_search"], [1, 2, 1, "", "read"], [1, 2, 1, "", "read_match_result"], [1, 2, 1, "", "rename_variables_for_dbf"], [1, 2, 1, "", "roadway_net_to_gdf"], [1, 2, 1, "", "roadway_standard_to_met_council_network"], [1, 2, 1, "", "select_roadway_features"], [1, 2, 1, "", "selection_has_unique_link_id"], [1, 2, 1, "", "selection_map"], [1, 2, 1, "", "shortest_path"], [1, 2, 1, "", "split_properties_by_time_period_and_category"], [1, 2, 1, "", "update_distance"], [1, 2, 1, "", "validate_link_schema"], [1, 2, 1, "", "validate_node_schema"], [1, 2, 1, "", "validate_properties"], [1, 2, 1, "", "validate_selection"], [1, 2, 1, "", "validate_shape_schema"], [1, 2, 1, "", "validate_uniqueness"], [1, 2, 1, "", "write"], [1, 2, 1, "", "write_roadway_as_fixedwidth"], [1, 2, 1, "", "write_roadway_as_shp"]], "lasso.Parameters": [[2, 2, 1, "", "__init__"], [2, 3, 1, "", "county_link_range_dict"], [2, 3, 1, "", "cube_time_periods"], [2, 3, 1, "", "properties_to_split"], [2, 3, 1, "", "zones"]], "lasso.Project": [[3, 3, 1, "", "CALCULATED_VALUES"], [3, 3, 1, "id0", "DEFAULT_PROJECT_NAME"], [3, 3, 1, "id1", "STATIC_VALUES"], [3, 2, 1, "", "__init__"], [3, 2, 1, "", "add_highway_changes"], [3, 2, 1, "", "add_transit_changes"], [3, 3, 1, "", "base_cube_transit_network"], [3, 3, 1, "", "base_roadway_network"], [3, 3, 1, "", "build_cube_transit_network"], [3, 3, 1, "", "card_data"], [3, 2, 1, "", "create_project"], [3, 2, 1, "", "determine_roadway_network_changes_compatibility"], [3, 2, 1, "", "emme_id_to_wrangler_id"], [3, 2, 1, "", "emme_name_to_wrangler_name"], [3, 2, 1, "", "evaluate_changes"], [3, 2, 1, "", "get_object_from_network_build_command"], [3, 2, 1, "", "get_operation_from_network_build_command"], [3, 3, 1, "", "parameters"], [3, 3, 1, "", "project_name"], [3, 2, 1, "", "read_logfile"], [3, 2, 1, "", "read_network_build_file"], [3, 3, 1, "", "roadway_link_changes"], [3, 3, 1, "", "roadway_node_changes"], [3, 3, 1, 
"", "transit_changes"], [3, 2, 1, "", "write_project_card"]], "lasso.StandardTransit": [[4, 2, 1, "", "__init__"], [4, 2, 1, "", "calculate_cube_mode"], [4, 2, 1, "", "cube_format"], [4, 2, 1, "", "evaluate_differences"], [4, 3, 1, "", "feed"], [4, 2, 1, "", "fromTransitNetwork"], [4, 3, 1, "", "parameters"], [4, 2, 1, "", "read_gtfs"], [4, 2, 1, "", "route_properties_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_cube"], [4, 2, 1, "", "shape_gtfs_to_emme"], [4, 2, 1, "", "time_to_cube_time_period"], [4, 2, 1, "", "write_as_cube_lin"]], "lasso.logger": [[5, 4, 1, "", "setupLogging"]], "lasso.util": [[6, 1, 1, "", "Point"], [6, 1, 1, "", "Polygon"], [6, 4, 1, "", "column_name_to_parts"], [6, 4, 1, "", "create_locationreference"], [6, 4, 1, "", "geodesic_point_buffer"], [6, 4, 1, "", "get_shared_streets_intersection_hash"], [6, 4, 1, "", "hhmmss_to_datetime"], [6, 1, 1, "", "partial"], [6, 4, 1, "", "secs_to_datetime"], [6, 4, 1, "", "shorten_name"], [6, 4, 1, "", "transform"], [6, 4, 1, "", "unidecode"]], "lasso.util.Point": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 2, 1, "", "interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "x"], [6, 5, 1, "", "xy"], [6, 5, 1, "", "y"], [6, 5, 1, "", "z"]], "lasso.util.Polygon": [[6, 2, 1, "", "almost_equals"], [6, 5, 1, "", "area"], [6, 5, 1, "", "boundary"], [6, 5, 1, "", "bounds"], [6, 2, 1, "", "buffer"], [6, 5, 1, "", "centroid"], [6, 2, 1, "", "contains"], [6, 2, 1, "", "contains_properly"], [6, 5, 1, "", "convex_hull"], [6, 5, 1, "", "coords"], [6, 2, 1, "", "covered_by"], [6, 2, 1, "", "covers"], [6, 2, 1, "", "crosses"], [6, 2, 1, "", "difference"], [6, 2, 1, "", "disjoint"], [6, 2, 1, "", "distance"], [6, 2, 1, "", "dwithin"], [6, 5, 1, "", "envelope"], [6, 2, 1, "", "equals"], [6, 2, 1, "", "equals_exact"], [6, 5, 1, "id0", "exterior"], [6, 2, 1, "", "from_bounds"], [6, 5, 1, "", "geom_type"], [6, 2, 1, "", "geometryType"], [6, 5, 1, "", "has_z"], [6, 2, 1, "", "hausdorff_distance"], [6, 5, 1, "id1", "interiors"], [6, 2, 1, "", 
"interpolate"], [6, 2, 1, "", "intersection"], [6, 2, 1, "", "intersects"], [6, 5, 1, "", "is_closed"], [6, 5, 1, "", "is_empty"], [6, 5, 1, "", "is_ring"], [6, 5, 1, "", "is_simple"], [6, 5, 1, "", "is_valid"], [6, 5, 1, "", "length"], [6, 2, 1, "", "line_interpolate_point"], [6, 2, 1, "", "line_locate_point"], [6, 5, 1, "", "minimum_clearance"], [6, 5, 1, "", "minimum_rotated_rectangle"], [6, 2, 1, "", "normalize"], [6, 5, 1, "", "oriented_envelope"], [6, 2, 1, "", "overlaps"], [6, 2, 1, "", "point_on_surface"], [6, 2, 1, "", "project"], [6, 2, 1, "", "relate"], [6, 2, 1, "", "relate_pattern"], [6, 2, 1, "", "representative_point"], [6, 2, 1, "", "reverse"], [6, 2, 1, "", "segmentize"], [6, 2, 1, "", "simplify"], [6, 2, 1, "", "svg"], [6, 2, 1, "", "symmetric_difference"], [6, 2, 1, "", "touches"], [6, 5, 1, "", "type"], [6, 2, 1, "", "union"], [6, 2, 1, "", "within"], [6, 5, 1, "", "wkb"], [6, 5, 1, "", "wkb_hex"], [6, 5, 1, "", "wkt"], [6, 5, 1, "", "xy"]], "lasso.util.partial": [[6, 3, 1, "", "args"], [6, 3, 1, "", "func"], [6, 3, 1, "", "keywords"]]}, "objtypes": {"0": "py:module", "1": "py:class", "2": "py:method", "3": "py:attribute", "4": "py:function", "5": "py:property"}, "objnames": {"0": ["py", "module", "Python module"], "1": ["py", "class", "Python class"], "2": ["py", "method", "Python method"], "3": ["py", "attribute", "Python attribute"], "4": ["py", "function", "Python function"], "5": ["py", "property", "Python property"]}, "titleterms": {"lasso": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], "cubetransit": [0, 11], "modelroadwaynetwork": [1, 11], "todo": 1, "paramet": [2, 10, 11], "project": [3, 9, 10, 11], "standardtransit": [4, 11], "logger": 5, "util": [6, 7], "class": 7, "function": 7, "base": 7, "welcom": 8, "": 8, "document": 8, "content": 8, "indic": 8, "tabl": 8, "run": [9, 11], "creat": 9, "file": [9, 10, 11], "scenario": [9, 11], "export": 9, "network": [9, 11], "audit": 9, "report": 9, "setup": 10, "set": 10, "addit": 10, "data": 10, "start": 11, "out": 11, "instal": 11, "bleed": 11, "edg": 11, "from": 11, "clone": 11, "brief": 11, "intro": 11, "compon": 11, "roadwaynetwork": 11, "transitnetwork": 11, "projectcard": 11, "typic": 11, "workflow": 11, "card": 11, "transit": 11, "lin": 11, "cube": 11, "log": 11, "model": 11, "quickstart": 11, "jupyt": 11, "notebook": 11}, "envversion": {"sphinx.domains.c": 3, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 9, "sphinx.domains.index": 1, "sphinx.domains.javascript": 3, "sphinx.domains.math": 2, "sphinx.domains.python": 4, "sphinx.domains.rst": 2, "sphinx.domains.std": 2, "sphinx.ext.intersphinx": 1, "sphinx.ext.todo": 2, "sphinx.ext.viewcode": 1, "sphinx": 58}, "alltitles": {"lasso.CubeTransit": [[0, "lasso-cubetransit"]], "lasso.ModelRoadwayNetwork": [[1, "lasso-modelroadwaynetwork"]], "Todo": [[1, "id1"], [1, "id2"], [1, "id3"], [1, "id4"], [1, "id5"], [1, "id6"]], "lasso.Parameters": [[2, "lasso-parameters"]], "lasso.Project": [[3, "lasso-project"]], "lasso.StandardTransit": [[4, "lasso-standardtransit"]], "lasso.logger": [[5, "module-lasso.logger"]], "lasso.util": [[6, "module-lasso.util"]], "Lasso Classes and Functions": [[7, "module-lasso"]], "Base Classes": [[7, "base-classes"]], "Utils and Functions": [[7, "utils-and-functions"]], "Welcome to lasso\u2019s documentation!": [[8, "welcome-to-lasso-s-documentation"]], "Contents:": [[8, null]], "Indices and tables": [[8, "indices-and-tables"]], "Running Lasso": [[9, "running-lasso"]], "Create project files": [[9, "create-project-files"]], 
"Create a scenario": [[9, "create-a-scenario"]], "Exporting networks": [[9, "exporting-networks"]], "Auditing and Reporting": [[9, "auditing-and-reporting"]], "Setup": [[10, "setup"]], "Projects": [[10, "projects"]], "Parameters": [[10, "parameters"], [11, "parameters"]], "Settings": [[10, "settings"]], "Additional Data Files": [[10, "additional-data-files"]], "Starting Out": [[11, "starting-out"]], "Installation": [[11, "installation"]], "Bleeding Edge": [[11, "bleeding-edge"]], "From Clone": [[11, "from-clone"]], "Brief Intro": [[11, "brief-intro"]], "Components": [[11, "components"]], "RoadwayNetwork": [[11, "roadwaynetwork"]], "TransitNetwork": [[11, "transitnetwork"]], "ProjectCard": [[11, "projectcard"]], "Scenario": [[11, "scenario"]], "Project": [[11, "project"]], "ModelRoadwayNetwork": [[11, "modelroadwaynetwork"]], "StandardTransit": [[11, "standardtransit"]], "CubeTransit": [[11, "cubetransit"]], "Typical Workflow": [[11, "typical-workflow"]], "Project Cards from Transit LIN Files": [[11, "project-cards-from-transit-lin-files"]], "Project Cards from Cube LOG Files": [[11, "project-cards-from-cube-log-files"]], "Model Network Files for a Scenario": [[11, "model-network-files-for-a-scenario"]], "Running Quickstart Jupyter Notebooks": [[11, "running-quickstart-jupyter-notebooks"]]}, "indexentries": {"cubetransit (class in lasso)": [[0, "lasso.CubeTransit"]], "__init__() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.__init__"]], "add_additional_time_periods() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_additional_time_periods"]], "add_cube() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.add_cube"]], "build_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.build_route_name"]], "calculate_start_end_times() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.calculate_start_end_times"]], "create_add_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_add_route_card_dict"]], "create_delete_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_delete_route_card_dict"]], "create_from_cube() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.create_from_cube"]], "create_update_route_card_dict() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.create_update_route_card_dict"]], "cube_properties_to_standard_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.cube_properties_to_standard_properties"]], "diff_dict (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.diff_dict"]], "evaluate_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_differences"]], "evaluate_route_property_differences() (lasso.cubetransit method)": [[0, "lasso.CubeTransit.evaluate_route_property_differences"]], "evaluate_route_shape_changes() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.evaluate_route_shape_changes"]], "get_time_period_numbers_from_cube_properties() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.get_time_period_numbers_from_cube_properties"]], "line_properties (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.line_properties"]], "lines (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.lines"]], "parameters (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.parameters"]], "program_type (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.program_type"]], "shapes (lasso.cubetransit attribute)": [[0, "lasso.CubeTransit.shapes"]], "source_list (lasso.cubetransit attribute)": [[0, 
"lasso.CubeTransit.source_list"]], "unpack_route_name() (lasso.cubetransit static method)": [[0, "lasso.CubeTransit.unpack_route_name"]], "calculated_values (lasso.modelroadwaynetwork attribute)": [[1, "lasso.ModelRoadwayNetwork.CALCULATED_VALUES"]], "modelroadwaynetwork (class in lasso)": [[1, "lasso.ModelRoadwayNetwork"]], "__init__() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.__init__"]], "add_counts() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_counts"]], "add_incident_link_data_to_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.add_incident_link_data_to_nodes"]], "add_new_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_new_roadway_feature_change"]], "add_variable_using_shst_reference() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.add_variable_using_shst_reference"]], "addition_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.addition_map"]], "apply() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply"]], "apply_managed_lane_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_managed_lane_feature_change"]], "apply_python_calculation() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_python_calculation"]], "apply_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.apply_roadway_feature_change"]], "assess_connectivity() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.assess_connectivity"]], "build_selection_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.build_selection_key"]], "calculate_area_type() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_area_type"]], "calculate_centroidconnect() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_centroidconnect"]], "calculate_county() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_county"]], "calculate_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_distance"]], "calculate_mpo() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_mpo"]], "calculate_use() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.calculate_use"]], "convert_int() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.convert_int"]], "create_ml_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_ML_variable"]], "create_calculated_variables() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_calculated_variables"]], "create_dummy_connector_links() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_dummy_connector_links"]], "create_hov_corridor_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_hov_corridor_variable"]], "create_managed_lane_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_lane_network"]], "create_managed_variable() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.create_managed_variable"]], "dataframe_to_fixed_width() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.dataframe_to_fixed_width"]], "delete_roadway_feature_change() (lasso.modelroadwaynetwork method)": [[1, 
"lasso.ModelRoadwayNetwork.delete_roadway_feature_change"]], "deletion_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.deletion_map"]], "fill_na() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.fill_na"]], "from_roadwaynetwork() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.from_RoadwayNetwork"]], "get_attribute() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_attribute"]], "get_managed_lane_node_ids() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_managed_lane_node_ids"]], "get_modal_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_graph"]], "get_modal_links_nodes() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.get_modal_links_nodes"]], "get_property_by_time_period_and_group() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.get_property_by_time_period_and_group"]], "identify_segment() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment"]], "identify_segment_endpoints() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.identify_segment_endpoints"]], "is_network_connected() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.is_network_connected"]], "load_transform_network() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.load_transform_network"]], "network_connection_plot() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.network_connection_plot"]], "orig_dest_nodes_foreign_key() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.orig_dest_nodes_foreign_key"]], "ox_graph() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.ox_graph"]], "path_search() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.path_search"]], "read() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read"]], "read_match_result() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.read_match_result"]], "rename_variables_for_dbf() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.rename_variables_for_dbf"]], "roadway_net_to_gdf() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.roadway_net_to_gdf"]], "roadway_standard_to_met_council_network() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.roadway_standard_to_met_council_network"]], "select_roadway_features() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.select_roadway_features"]], "selection_has_unique_link_id() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_has_unique_link_id"]], "selection_map() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.selection_map"]], "shortest_path() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.shortest_path"]], "split_properties_by_time_period_and_category() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.split_properties_by_time_period_and_category"]], "update_distance() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.update_distance"]], "validate_link_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_link_schema"]], "validate_node_schema() (lasso.modelroadwaynetwork static method)": [[1, 
"lasso.ModelRoadwayNetwork.validate_node_schema"]], "validate_properties() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_properties"]], "validate_selection() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_selection"]], "validate_shape_schema() (lasso.modelroadwaynetwork static method)": [[1, "lasso.ModelRoadwayNetwork.validate_shape_schema"]], "validate_uniqueness() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.validate_uniqueness"]], "write() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write"]], "write_roadway_as_fixedwidth() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_fixedwidth"]], "write_roadway_as_shp() (lasso.modelroadwaynetwork method)": [[1, "lasso.ModelRoadwayNetwork.write_roadway_as_shp"]], "parameters (class in lasso)": [[2, "lasso.Parameters"]], "__init__() (lasso.parameters method)": [[2, "lasso.Parameters.__init__"]], "county_link_range_dict (lasso.parameters attribute)": [[2, "lasso.Parameters.county_link_range_dict"]], "cube_time_periods (lasso.parameters attribute)": [[2, "lasso.Parameters.cube_time_periods"]], "properties_to_split (lasso.parameters attribute)": [[2, "lasso.Parameters.properties_to_split"]], "zones (lasso.parameters attribute)": [[2, "lasso.Parameters.zones"]], "calculated_values (lasso.project attribute)": [[3, "lasso.Project.CALCULATED_VALUES"]], "default_project_name (lasso.project attribute)": [[3, "id0"], [3, "lasso.Project.DEFAULT_PROJECT_NAME"]], "project (class in lasso)": [[3, "lasso.Project"]], "static_values (lasso.project attribute)": [[3, "id1"], [3, "lasso.Project.STATIC_VALUES"]], "__init__() (lasso.project method)": [[3, "lasso.Project.__init__"]], "add_highway_changes() (lasso.project method)": [[3, "lasso.Project.add_highway_changes"]], "add_transit_changes() (lasso.project method)": [[3, "lasso.Project.add_transit_changes"]], "base_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.base_cube_transit_network"]], "base_roadway_network (lasso.project attribute)": [[3, "lasso.Project.base_roadway_network"]], "build_cube_transit_network (lasso.project attribute)": [[3, "lasso.Project.build_cube_transit_network"]], "card_data (lasso.project attribute)": [[3, "lasso.Project.card_data"]], "create_project() (lasso.project static method)": [[3, "lasso.Project.create_project"]], "determine_roadway_network_changes_compatibility() (lasso.project static method)": [[3, "lasso.Project.determine_roadway_network_changes_compatibility"]], "emme_id_to_wrangler_id() (lasso.project static method)": [[3, "lasso.Project.emme_id_to_wrangler_id"]], "emme_name_to_wrangler_name() (lasso.project static method)": [[3, "lasso.Project.emme_name_to_wrangler_name"]], "evaluate_changes() (lasso.project method)": [[3, "lasso.Project.evaluate_changes"]], "get_object_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_object_from_network_build_command"]], "get_operation_from_network_build_command() (lasso.project method)": [[3, "lasso.Project.get_operation_from_network_build_command"]], "parameters (lasso.project attribute)": [[3, "lasso.Project.parameters"]], "project_name (lasso.project attribute)": [[3, "lasso.Project.project_name"]], "read_logfile() (lasso.project static method)": [[3, "lasso.Project.read_logfile"]], "read_network_build_file() (lasso.project static method)": [[3, "lasso.Project.read_network_build_file"]], "roadway_link_changes (lasso.project attribute)": [[3, 
"lasso.Project.roadway_link_changes"]], "roadway_node_changes (lasso.project attribute)": [[3, "lasso.Project.roadway_node_changes"]], "transit_changes (lasso.project attribute)": [[3, "lasso.Project.transit_changes"]], "write_project_card() (lasso.project method)": [[3, "lasso.Project.write_project_card"]], "standardtransit (class in lasso)": [[4, "lasso.StandardTransit"]], "__init__() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.__init__"]], "calculate_cube_mode() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.calculate_cube_mode"]], "cube_format() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.cube_format"]], "evaluate_differences() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.evaluate_differences"]], "feed (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.feed"]], "fromtransitnetwork() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.fromTransitNetwork"]], "parameters (lasso.standardtransit attribute)": [[4, "lasso.StandardTransit.parameters"]], "read_gtfs() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.read_gtfs"]], "route_properties_gtfs_to_cube() (lasso.standardtransit static method)": [[4, "lasso.StandardTransit.route_properties_gtfs_to_cube"]], "shape_gtfs_to_cube() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_cube"]], "shape_gtfs_to_emme() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.shape_gtfs_to_emme"]], "time_to_cube_time_period() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.time_to_cube_time_period"]], "write_as_cube_lin() (lasso.standardtransit method)": [[4, "lasso.StandardTransit.write_as_cube_lin"]], "lasso.logger": [[5, "module-lasso.logger"]], "module": [[5, "module-lasso.logger"], [6, "module-lasso.util"], [7, "module-lasso"]], "setuplogging() (in module lasso.logger)": [[5, "lasso.logger.setupLogging"]], "point (class in lasso.util)": [[6, "lasso.util.Point"]], "polygon (class in lasso.util)": [[6, "lasso.util.Polygon"]], "almost_equals() (lasso.util.point method)": [[6, "lasso.util.Point.almost_equals"]], "almost_equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.almost_equals"]], "area (lasso.util.point property)": [[6, "lasso.util.Point.area"]], "area (lasso.util.polygon property)": [[6, "lasso.util.Polygon.area"]], "args (lasso.util.partial attribute)": [[6, "lasso.util.partial.args"]], "boundary (lasso.util.point property)": [[6, "lasso.util.Point.boundary"]], "boundary (lasso.util.polygon property)": [[6, "lasso.util.Polygon.boundary"]], "bounds (lasso.util.point property)": [[6, "lasso.util.Point.bounds"]], "bounds (lasso.util.polygon property)": [[6, "lasso.util.Polygon.bounds"]], "buffer() (lasso.util.point method)": [[6, "lasso.util.Point.buffer"]], "buffer() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.buffer"]], "centroid (lasso.util.point property)": [[6, "lasso.util.Point.centroid"]], "centroid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.centroid"]], "column_name_to_parts() (in module lasso.util)": [[6, "lasso.util.column_name_to_parts"]], "contains() (lasso.util.point method)": [[6, "lasso.util.Point.contains"]], "contains() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains"]], "contains_properly() (lasso.util.point method)": [[6, "lasso.util.Point.contains_properly"]], "contains_properly() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.contains_properly"]], "convex_hull (lasso.util.point property)": [[6, 
"lasso.util.Point.convex_hull"]], "convex_hull (lasso.util.polygon property)": [[6, "lasso.util.Polygon.convex_hull"]], "coords (lasso.util.point property)": [[6, "lasso.util.Point.coords"]], "coords (lasso.util.polygon property)": [[6, "lasso.util.Polygon.coords"]], "covered_by() (lasso.util.point method)": [[6, "lasso.util.Point.covered_by"]], "covered_by() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covered_by"]], "covers() (lasso.util.point method)": [[6, "lasso.util.Point.covers"]], "covers() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.covers"]], "create_locationreference() (in module lasso.util)": [[6, "lasso.util.create_locationreference"]], "crosses() (lasso.util.point method)": [[6, "lasso.util.Point.crosses"]], "crosses() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.crosses"]], "difference() (lasso.util.point method)": [[6, "lasso.util.Point.difference"]], "difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.difference"]], "disjoint() (lasso.util.point method)": [[6, "lasso.util.Point.disjoint"]], "disjoint() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.disjoint"]], "distance() (lasso.util.point method)": [[6, "lasso.util.Point.distance"]], "distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.distance"]], "dwithin() (lasso.util.point method)": [[6, "lasso.util.Point.dwithin"]], "dwithin() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.dwithin"]], "envelope (lasso.util.point property)": [[6, "lasso.util.Point.envelope"]], "envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.envelope"]], "equals() (lasso.util.point method)": [[6, "lasso.util.Point.equals"]], "equals() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals"]], "equals_exact() (lasso.util.point method)": [[6, "lasso.util.Point.equals_exact"]], "equals_exact() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.equals_exact"]], "exterior (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.exterior"]], "exterior (lasso.util.polygon property)": [[6, "id0"]], "from_bounds() (lasso.util.polygon class method)": [[6, "lasso.util.Polygon.from_bounds"]], "func (lasso.util.partial attribute)": [[6, "lasso.util.partial.func"]], "geodesic_point_buffer() (in module lasso.util)": [[6, "lasso.util.geodesic_point_buffer"]], "geom_type (lasso.util.point property)": [[6, "lasso.util.Point.geom_type"]], "geom_type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.geom_type"]], "geometrytype() (lasso.util.point method)": [[6, "lasso.util.Point.geometryType"]], "geometrytype() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.geometryType"]], "get_shared_streets_intersection_hash() (in module lasso.util)": [[6, "lasso.util.get_shared_streets_intersection_hash"]], "has_z (lasso.util.point property)": [[6, "lasso.util.Point.has_z"]], "has_z (lasso.util.polygon property)": [[6, "lasso.util.Polygon.has_z"]], "hausdorff_distance() (lasso.util.point method)": [[6, "lasso.util.Point.hausdorff_distance"]], "hausdorff_distance() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.hausdorff_distance"]], "hhmmss_to_datetime() (in module lasso.util)": [[6, "lasso.util.hhmmss_to_datetime"]], "interiors (lasso.util.polygon attribute)": [[6, "lasso.util.Polygon.interiors"]], "interiors (lasso.util.polygon property)": [[6, "id1"]], "interpolate() (lasso.util.point method)": [[6, "lasso.util.Point.interpolate"]], "interpolate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.interpolate"]], "intersection() (lasso.util.point 
method)": [[6, "lasso.util.Point.intersection"]], "intersection() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersection"]], "intersects() (lasso.util.point method)": [[6, "lasso.util.Point.intersects"]], "intersects() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.intersects"]], "is_closed (lasso.util.point property)": [[6, "lasso.util.Point.is_closed"]], "is_closed (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_closed"]], "is_empty (lasso.util.point property)": [[6, "lasso.util.Point.is_empty"]], "is_empty (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_empty"]], "is_ring (lasso.util.point property)": [[6, "lasso.util.Point.is_ring"]], "is_ring (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_ring"]], "is_simple (lasso.util.point property)": [[6, "lasso.util.Point.is_simple"]], "is_simple (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_simple"]], "is_valid (lasso.util.point property)": [[6, "lasso.util.Point.is_valid"]], "is_valid (lasso.util.polygon property)": [[6, "lasso.util.Polygon.is_valid"]], "keywords (lasso.util.partial attribute)": [[6, "lasso.util.partial.keywords"]], "lasso.util": [[6, "module-lasso.util"]], "length (lasso.util.point property)": [[6, "lasso.util.Point.length"]], "length (lasso.util.polygon property)": [[6, "lasso.util.Polygon.length"]], "line_interpolate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_interpolate_point"]], "line_interpolate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_interpolate_point"]], "line_locate_point() (lasso.util.point method)": [[6, "lasso.util.Point.line_locate_point"]], "line_locate_point() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.line_locate_point"]], "minimum_clearance (lasso.util.point property)": [[6, "lasso.util.Point.minimum_clearance"]], "minimum_clearance (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_clearance"]], "minimum_rotated_rectangle (lasso.util.point property)": [[6, "lasso.util.Point.minimum_rotated_rectangle"]], "minimum_rotated_rectangle (lasso.util.polygon property)": [[6, "lasso.util.Polygon.minimum_rotated_rectangle"]], "normalize() (lasso.util.point method)": [[6, "lasso.util.Point.normalize"]], "normalize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.normalize"]], "oriented_envelope (lasso.util.point property)": [[6, "lasso.util.Point.oriented_envelope"]], "oriented_envelope (lasso.util.polygon property)": [[6, "lasso.util.Polygon.oriented_envelope"]], "overlaps() (lasso.util.point method)": [[6, "lasso.util.Point.overlaps"]], "overlaps() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.overlaps"]], "partial (class in lasso.util)": [[6, "lasso.util.partial"]], "point_on_surface() (lasso.util.point method)": [[6, "lasso.util.Point.point_on_surface"]], "point_on_surface() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.point_on_surface"]], "project() (lasso.util.point method)": [[6, "lasso.util.Point.project"]], "project() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.project"]], "relate() (lasso.util.point method)": [[6, "lasso.util.Point.relate"]], "relate() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate"]], "relate_pattern() (lasso.util.point method)": [[6, "lasso.util.Point.relate_pattern"]], "relate_pattern() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.relate_pattern"]], "representative_point() (lasso.util.point method)": [[6, "lasso.util.Point.representative_point"]], "representative_point() 
(lasso.util.polygon method)": [[6, "lasso.util.Polygon.representative_point"]], "reverse() (lasso.util.point method)": [[6, "lasso.util.Point.reverse"]], "reverse() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.reverse"]], "secs_to_datetime() (in module lasso.util)": [[6, "lasso.util.secs_to_datetime"]], "segmentize() (lasso.util.point method)": [[6, "lasso.util.Point.segmentize"]], "segmentize() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.segmentize"]], "shorten_name() (in module lasso.util)": [[6, "lasso.util.shorten_name"]], "simplify() (lasso.util.point method)": [[6, "lasso.util.Point.simplify"]], "simplify() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.simplify"]], "svg() (lasso.util.point method)": [[6, "lasso.util.Point.svg"]], "svg() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.svg"]], "symmetric_difference() (lasso.util.point method)": [[6, "lasso.util.Point.symmetric_difference"]], "symmetric_difference() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.symmetric_difference"]], "touches() (lasso.util.point method)": [[6, "lasso.util.Point.touches"]], "touches() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.touches"]], "transform() (in module lasso.util)": [[6, "lasso.util.transform"]], "type (lasso.util.point property)": [[6, "lasso.util.Point.type"]], "type (lasso.util.polygon property)": [[6, "lasso.util.Polygon.type"]], "unidecode() (in module lasso.util)": [[6, "lasso.util.unidecode"]], "union() (lasso.util.point method)": [[6, "lasso.util.Point.union"]], "union() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.union"]], "within() (lasso.util.point method)": [[6, "lasso.util.Point.within"]], "within() (lasso.util.polygon method)": [[6, "lasso.util.Polygon.within"]], "wkb (lasso.util.point property)": [[6, "lasso.util.Point.wkb"]], "wkb (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb"]], "wkb_hex (lasso.util.point property)": [[6, "lasso.util.Point.wkb_hex"]], "wkb_hex (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkb_hex"]], "wkt (lasso.util.point property)": [[6, "lasso.util.Point.wkt"]], "wkt (lasso.util.polygon property)": [[6, "lasso.util.Polygon.wkt"]], "x (lasso.util.point property)": [[6, "lasso.util.Point.x"]], "xy (lasso.util.point property)": [[6, "lasso.util.Point.xy"]], "xy (lasso.util.polygon property)": [[6, "lasso.util.Polygon.xy"]], "y (lasso.util.point property)": [[6, "lasso.util.Point.y"]], "z (lasso.util.point property)": [[6, "lasso.util.Point.z"]], "lasso": [[7, "module-lasso"]]}}) \ No newline at end of file diff --git a/branch/bicounty_emme/setup/index.html b/branch/bicounty_emme/setup/index.html new file mode 100644 index 0000000..868c23c --- /dev/null +++ b/branch/bicounty_emme/setup/index.html @@ -0,0 +1,133 @@ + + + + + + + Setup — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Setup

+
+

Projects

+
+
+

Parameters

+
+
+

Settings

+
+
+

Additional Data Files

+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/branch/bicounty_emme/starting/index.html b/branch/bicounty_emme/starting/index.html new file mode 100644 index 0000000..ef31ab2 --- /dev/null +++ b/branch/bicounty_emme/starting/index.html @@ -0,0 +1,434 @@ + + + + + + + Starting Out — lasso documentation + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Starting Out

+
+

Installation

+

If you are managing multiple python versions, we suggest using virtualenv or conda virtual environments.

+

The following example uses a conda environment (recommended) and the pip package manager to install Lasso from source on GitHub.

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/Lasso@master
+
+
+

Lasso will install network_wrangler from the PyPI repository because it is included in Lasso’s requirements.txt.

+
+

Bleeding Edge

+

If you want to install a more up-to-date or development version of network wrangler and lasso, you can do so by installing them from the develop branch of each GitHub repository:

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/network_wrangler@develop
+pip install git+https://github.com/wsp-sag/Lasso@develop
+
+
+
+
+

From Clone

+

If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone; the -e flag installs it in editable mode.

+

If you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+git clone https://github.com/wsp-sag/Lasso
+git clone https://github.com/wsp-sag/network_wrangler
+cd network_wrangler
+pip install -e .
+cd ..
+cd Lasso
+pip install -e .
+
+
+

Notes:

+
  1. The -e flag installs the package in editable mode.
  2. If you are not part of the project team and want to contribute code back to the project, please fork before you clone and then add the original repository to your upstream origin list per these directions on github.
  3. If you want to install from a specific tag/version number or branch, replace @master with @<branchname> or @<tag>.
  4. If you want to make use of frequent developer updates for network wrangler as well, you can also install it from a clone by following the same cloning and installation instructions used above for Lasso.
+

If you are going to be doing Lasso development, we also recommend:

+
  • a good IDE such as Atom, VS Code, Sublime Text, etc. with Python syntax highlighting turned on.
  • GitHub Desktop to locally update your clones.
+
+
+
+

Brief Intro

+

Lasso is a ‘wrapper’ around the Network Wrangler utility.

+

Both Lasso and NetworkWrangler are built around the following data schemas:

+
    +
  • [roadway network], which is based on a mashup of Open Street Map and Shared Streets. In Network Wrangler these are read in from three json files representing links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields so that any field can be defined for an ad-hoc time-of-day span or user category.

  • +
  • [transit network], which is based on a frequency-based implementation of the csv-based GTFS; and

  • +
  • [project card], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml.

  • +
+

In addition, Lasso utilizes the following data schemas:

+
    +
  • [MetCouncil Model Roadway Network Schema], which adds data fields to the roadway network schema that MetCouncil uses in their travel model including breaking out data fields by time period.

  • +
  • [MetCouncil Model Transit Network Schema], which uses the Cube PublicTransport format, and

  • +
  • [Cube Log Files], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler.

  • +
  • [Cube public transport line files], which define a set of transit lines in the cube software.

  • +
+
+

Components

+

Network Wrangler has the following atomic parts:

+
    +
  • RoadwayNetwork object, which represents the roadway network data as GeoDataFrames;

  • +
  • TransitNetwork object, which represents the transit network data as DataFrames;

  • +
  • ProjectCard object, which represents the data of the project card. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run;

  • +
  • Scenario object, which consists of at least a RoadwayNetwork and a TransitNetwork. Scenarios can be based on or tiered from other scenarios. Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network.

  • +
+

In addition, Lasso has the following atomic parts:

+
    +
  • Project object, which creates project cards from one of the following: a base and a build transit network in Cube format, a base and a build highway network, or a base highway network and a Cube log file.

  • +
  • ModelRoadwayNetwork object is a subclass of RoadwayNetwork and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube.

  • +
  • StandardTransit, an object that holds a standard transit feed as a Partridge object and contains methods to manipulate and translate the GTFS data to MetCouncil’s Cube Line files.

  • +
  • CubeTransit, an object for storing information about transit defined in Cube public transport line files. It can parse Cube line file properties and shapes into Python dictionaries, compare line files, and represent changes as Project Card dictionaries.

  • +
  • Parameters, a class representing all the parameters defining the networks, including time of day, categories, etc. Parameters can be set at runtime by initializing a Parameters instance with a keyword argument setting the attribute. Parameters that are not explicitly set will use the defaults listed in this class.

  • +
+
+

RoadwayNetwork

+

Reads, writes, queries and manipulates roadway network data, which is mainly stored in the GeoDataFrames links_df, nodes_df, and shapes_df.

+
net = RoadwayNetwork.read(
+        link_filename=MY_LINK_FILE,
+        node_filename=MY_NODE_FILE,
+        shape_filename=MY_SHAPE_FILE,
+        shape_foreign_key ='shape_id',
+        
+    )
+my_selection = {
+    "link": [{"name": ["I 35E"]}],
+    "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+    "B": {"osm_node_id": "2564047368"},
+}
+net.select_roadway_features(my_selection)
+
+my_change = [
+    {
+        'property': 'lanes',
+        'existing': 1,
+        'set': 2,
+     },
+     {
+        'property': 'drive_access',
+        'set': 0,
+      },
+]
+
+net.apply_roadway_feature_change(
+    net.select_roadway_features(my_selection),
+    my_change
+)
+
+ml_net = net.create_managed_lane_network(in_place=False)
+
+ml_net.is_network_connected(mode="drive")
+
+_, disconnected_nodes = ml_net.assess_connectivity(
+  mode="walk",
+  ignore_end_nodes=True
+)
+ml_net.write(filename=my_out_prefix, path=my_dir)
+
+
+
+
+

TransitNetwork

+
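As a minimal sketch (mirroring the Scenario example below, where STPAUL_DIR is assumed to be a directory of standard transit files), a transit network can be read with network_wrangler’s TransitNetwork class:
+from network_wrangler import TransitNetwork
+
+# STPAUL_DIR is assumed to point at a directory of standard (GTFS-like) transit files
+transit_net = TransitNetwork.read(STPAUL_DIR)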
+
+

ProjectCard

+
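As a minimal sketch (reusing the project card directory from the Scenario example below; the card filename is illustrative), a project card is read from a yml file with network_wrangler’s ProjectCard class:
+import os
+from network_wrangler import ProjectCard
+
+# the filename is illustrative; see the Scenario example below for a full card list
+card = ProjectCard.read(
+    os.path.join(STPAUL_DIR, "project_cards", "4_simple_managed_lane.yml"),
+    validate=False,
+)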
+
+

Scenario

+

Manages sets of project cards and tiering from a base scenario/set of networks.

+

+my_base_scenario = {
+    "road_net": RoadwayNetwork.read(
+        link_filename=STPAUL_LINK_FILE,
+        node_filename=STPAUL_NODE_FILE,
+        shape_filename=STPAUL_SHAPE_FILE,
+        fast=True,
+        shape_foreign_key ='shape_id',
+    ),
+    "transit_net": TransitNetwork.read(STPAUL_DIR),
+}
+
+card_filenames = [
+    "3_multiple_roadway_attribute_change.yml",
+    "multiple_changes.yml",
+    "4_simple_managed_lane.yml",
+]
+
+project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+
+project_cards_list = [
+    ProjectCard.read(os.path.join(project_card_directory, filename), validate=False)
+    for filename in card_filenames
+]
+
+my_scenario = Scenario.create_scenario(
+  base_scenario=my_base_scenario,
+  project_cards_list=project_cards_list,
+)
+my_scenario.check_scenario_requisites()
+
+my_scenario.apply_all_projects()
+
+my_scenario.scenario_summary()
+
+
+
+
+

Project

+

Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a Cube log file and a base network.

+

+test_project = Project.create_project(
+  base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+  build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+  )
+
+test_project.evaluate_changes()
+
+test_project.write_project_card(
+  os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+  )
+
+
+
+
+

ModelRoadwayNetwork

+

A subclass of network_wrangler’s RoadwayNetwork class with additional understanding of how to translate and write the network out to the MetCouncil Roadway Network schema.

+
net = ModelRoadwayNetwork.read(
+      link_filename=STPAUL_LINK_FILE,
+      node_filename=STPAUL_NODE_FILE,
+      shape_filename=STPAUL_SHAPE_FILE,
+      fast=True,
+      shape_foreign_key ='shape_id',
+  )
+
+net.write_roadway_as_fixedwidth()
+
+
+
+
+

StandardTransit

+

Translates the standard GTFS data to MetCouncil’s Cube Line files.

+
cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+

CubeTransit

+

Used by the Project class; it has the capability to:

+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+

Parameters

+

Holds information about default parameters but can also be initialized to override those parameters at object instantiation using a dictionary.

+
# read parameters from a yaml configuration  file
+# could also provide as a key/value pair
+with open(config_file) as f:
+      my_config = yaml.safe_load(f)
+
+# provide parameters at instantiation of ModelRoadwayNetwork
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+            my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+        )
+# network written with direction from the parameters given
+model_road_net.write_roadway_as_shp()
+
+
+
+
+
+

Typical Workflow

+

Workflows in Lasso and Network Wrangler typically accomplish one of two goals:

+
  1. Create Project Cards to document network changes as a result of either transit or roadway projects.
  2. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network.
+
+

Project Cards from Transit LIN Files

+
+
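A minimal sketch of this workflow, following the Project example above (the base and build transit sources are illustrative paths; a source can be a single .LIN file or a directory of line files, and the output card name is hypothetical):
+import os
+from lasso import Project
+
+# base/build transit sources and the output card name are illustrative
+project = Project.create_project(
+    base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+project.evaluate_changes()
+project.write_project_card(os.path.join(SCRATCH_DIR, "transit_project_card.yml"))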
+

Project Cards from Cube LOG Files

+
+
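A rough sketch of building a project card from a Cube log file plus a base roadway network. The keyword names used here (roadway_log_file, base_roadway_dir) and the paths are assumptions for illustration; check Project.create_project in the API reference for the exact arguments:
+import os
+from lasso import Project
+
+# keyword argument names and paths are assumptions -- verify against the Project API reference
+log_project = Project.create_project(
+    roadway_log_file=os.path.join(CUBE_DIR, "roadway_changes.log"),  # hypothetical Cube log file
+    base_roadway_dir=BASE_ROADWAY_DIR,  # hypothetical directory of base network json files
+)
+log_project.evaluate_changes()
+log_project.write_project_card(os.path.join(SCRATCH_DIR, "roadway_project_card.yml"))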
+

Model Network Files for a Scenario

+
+
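A rough sketch of this workflow, combining the Scenario, Parameters, ModelRoadwayNetwork, and StandardTransit pieces shown above. The output paths are illustrative, and the exact fromTransitNetwork signature should be checked against the StandardTransit API reference:
+import os
+from lasso import ModelRoadwayNetwork, StandardTransit
+
+# apply the scenario's project cards to the base networks (see the Scenario example above)
+my_scenario.apply_all_projects()
+
+# translate the roadway network to the MetCouncil model schema and write it out
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+    my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+)
+model_road_net.write_roadway_as_fixedwidth()
+
+# translate the transit network to Cube line files (call signature assumed; see the API reference)
+standard_transit = StandardTransit.fromTransitNetwork(my_scenario.transit_net)
+standard_transit.write_as_cube_lin(os.path.join(WRITE_DIR, "transit.lin"))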
+
+
+

Running Quickstart Jupyter Notebooks

+

To learn basic lasso functionality, please refer to the following jupyter notebooks in the /notebooks directory:

+
    +
  • Lasso Project Card Creation Quickstart.ipynb

  • +
  • Lasso Scenario Creation Quickstart.ipynb

  • +
+

Jupyter notebooks can be started by activating the lasso conda environment and typing jupyter notebook:

+
conda activate <my_lasso_environment>
+jupyter notebook
+
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/genindex.html b/genindex.html new file mode 100644 index 0000000..b8fc347 --- /dev/null +++ b/genindex.html @@ -0,0 +1,721 @@ + + + + + + + + + + Index — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ +
    + +
  • »
  • + +
  • Index
  • + + +
  • + + + +
  • + +
+ + +
+
+
+
+ + +

Index

+ +
+ _ + | A + | B + | C + | D + | E + | F + | G + | H + | I + | K + | L + | M + | N + | O + | P + | R + | S + | T + | U + | V + | W + +
+

_

+ + +
+ +

A

+ + + +
+ +

B

+ + + +
+ +

C

+ + + +
+ +

D

+ + + +
+ +

E

+ + + +
+ +

F

+ + + +
+ +

G

+ + + +
+ +

H

+ + +
+ +

I

+ + +
+ +

K

+ + +
+ +

L

+ + + +
    +
  • + lasso + +
  • +
  • + lasso.logger + +
  • +
+ +

M

+ + + +
+ +

N

+ + + +
+ +

O

+ + + +
+ +

P

+ + + +
+ +

R

+ + + +
+ +

S

+ + + +
+ +

T

+ + + +
+ +

U

+ + + +
+ +

V

+ + + +
+ +

W

+ + + +
+ + + +
+ +
+
+ + +
+ +
+

+ + © Copyright 2019 Metropolitan Council + +

+
+ + + + Built with Sphinx using a + + theme + + provided by Read the Docs. + +
+ +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/index.html b/index.html new file mode 100644 index 0000000..e567f3e --- /dev/null +++ b/index.html @@ -0,0 +1,284 @@ + + + + + + + + + + Welcome to lasso’s documentation! — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ +
    + +
  • »
  • + +
  • Welcome to lasso’s documentation!
  • + + +
  • + + + View page source + + +
  • + +
+ + +
+
+
+
+ +
+

Welcome to lasso’s documentation!

+

This package of utilities is a wrapper around the [network_wrangler](http://github.com/wsp-sag/network_wrangler) package for MetCouncil. It aims to have the following functionality:

  1. parse Cube log files and base highway networks and create ProjectCards for Network Wrangler
  2. parse two Cube transit line files and create ProjectCards for Network Wrangler
  3. refine Network Wrangler highway networks to contain specific variables and settings for Metropolitan Council and export them to a format that can be read in by Citilab’s Cube software.
+ +
+
+

Indices and tables

+ +
+ + +
+ +
+
+ + + + +
+ +
+

+ + © Copyright 2019 Metropolitan Council + +

+
+ + + + Built with Sphinx using a + + theme + + provided by Read the Docs. + +
+ +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/objects.inv b/objects.inv new file mode 100644 index 0000000..798e9c6 Binary files /dev/null and b/objects.inv differ diff --git a/py-modindex.html b/py-modindex.html new file mode 100644 index 0000000..a1c4d06 --- /dev/null +++ b/py-modindex.html @@ -0,0 +1,230 @@ + + + + + + + + + + Python Module Index — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ +
    + +
  • »
  • + +
  • Python Module Index
  • + + +
  • + +
  • + +
+ + +
+
+
+
+ + +

Python Module Index

+ +
+ l +
+ + + + + + + + + + + + + +
 
+ l
+ lasso +
    + lasso.logger +
    + lasso.util +
+ + +
+ +
+
+ + +
+ +
+

+ + © Copyright 2019 Metropolitan Council + +

+
+ + + + Built with Sphinx using a + + theme + + provided by Read the Docs. + +
+ +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/running.html b/running.html new file mode 100644 index 0000000..cf1017e --- /dev/null +++ b/running.html @@ -0,0 +1,235 @@ + + + + + + + + + + Running Lasso — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ + + + +
+
+
+
+ +
+

Running Lasso

+
+

Create project files

+
+
+

Create a scenario

+
+
+

Exporting networks

+
+
+

Auditing and Reporting

+
+
+ + +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/search.html b/search.html new file mode 100644 index 0000000..53b817e --- /dev/null +++ b/search.html @@ -0,0 +1,221 @@ + + + + + + + + + + Search — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ +
    + +
  • »
  • + +
  • Search
  • + + +
  • + + + +
  • + +
+ + +
+
+
+
+ + + + +
+ +
+ +
+ +
+
+ + +
+ +
+

+ + © Copyright 2019 Metropolitan Council + +

+
+ + + + Built with Sphinx using a + + theme + + provided by Read the Docs. + +
+ +
+
+ +
+ +
+ + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/searchindex.js b/searchindex.js new file mode 100644 index 0000000..7aabf63 --- /dev/null +++ b/searchindex.js @@ -0,0 +1 @@ +Search.setIndex({docnames:["_generated/lasso.CubeTransit","_generated/lasso.ModelRoadwayNetwork","_generated/lasso.Parameters","_generated/lasso.Project","_generated/lasso.StandardTransit","_generated/lasso.logger","_generated/lasso.util","autodoc","index","running","setup","starting"],envversion:{"sphinx.domains.c":2,"sphinx.domains.changeset":1,"sphinx.domains.citation":1,"sphinx.domains.cpp":3,"sphinx.domains.index":1,"sphinx.domains.javascript":2,"sphinx.domains.math":2,"sphinx.domains.python":2,"sphinx.domains.rst":2,"sphinx.domains.std":1,"sphinx.ext.intersphinx":1,"sphinx.ext.todo":2,"sphinx.ext.viewcode":1,sphinx:56},filenames:["_generated/lasso.CubeTransit.rst","_generated/lasso.ModelRoadwayNetwork.rst","_generated/lasso.Parameters.rst","_generated/lasso.Project.rst","_generated/lasso.StandardTransit.rst","_generated/lasso.logger.rst","_generated/lasso.util.rst","autodoc.rst","index.rst","running.md","setup.md","starting.md"],objects:{"":{lasso:[7,0,0,"-"]},"lasso.CubeTransit":{__init__:[0,2,1,""],add_additional_time_periods:[0,2,1,""],add_cube:[0,2,1,""],build_route_name:[0,2,1,""],calculate_start_end_times:[0,2,1,""],create_add_route_card_dict:[0,2,1,""],create_delete_route_card_dict:[0,2,1,""],create_from_cube:[0,2,1,""],create_update_route_card_dict:[0,2,1,""],cube_properties_to_standard_properties:[0,2,1,""],diff_dict:[0,3,1,""],evaluate_differences:[0,2,1,""],evaluate_route_property_differences:[0,2,1,""],evaluate_route_shape_changes:[0,2,1,""],get_time_period_numbers_from_cube_properties:[0,2,1,""],line_properties:[0,3,1,""],lines:[0,3,1,""],parameters:[0,3,1,""],program_type:[0,3,1,""],shapes:[0,3,1,""],source_list:[0,3,1,""],unpack_route_name:[0,2,1,""]},"lasso.ModelRoadwayNetwork":{CALCULATED_VALUES:[1,3,1,""],CRS:[1,3,1,""],KEEP_SAME_ATTRIBUTES_ML_AND_GP:[1,3,1,""],LINK_FOREIGN_KEY:[1,3,1,""],MANAGED_LANES_LINK_ID_SCALAR:[1,3,1,""],MANAGED_LANES_NODE_ID_SCALAR:[1,3,1,""],MANAGED_LANES_REQUIRED_ATTRIBUTES:[1,3,1,""],MANAGED_LANES_SCALAR:[1,3,1,""],MAX_SEARCH_BREADTH:[1,3,1,""],MODES_TO_NETWORK_LINK_VARIABLES:[1,3,1,""],MODES_TO_NETWORK_NODE_VARIABLES:[1,3,1,""],NODE_FOREIGN_KEY:[1,3,1,""],SEARCH_BREADTH:[1,3,1,""],SELECTION_REQUIRES:[1,3,1,""],SP_WEIGHT_FACTOR:[1,3,1,""],UNIQUE_LINK_KEY:[1,3,1,""],UNIQUE_MODEL_LINK_IDENTIFIERS:[1,3,1,""],UNIQUE_NODE_IDENTIFIERS:[1,3,1,""],UNIQUE_NODE_KEY:[1,3,1,""],UNIQUE_SHAPE_KEY:[1,3,1,""],__init__:[1,2,1,""],add_counts:[1,2,1,""],add_new_roadway_feature_change:[1,2,1,""],add_variable_using_shst_reference:[1,2,1,""],addition_map:[1,2,1,""],apply:[1,2,1,""],apply_managed_lane_feature_change:[1,2,1,""],apply_python_calculation:[1,2,1,""],apply_roadway_feature_change:[1,2,1,""],assess_connectivity:[1,2,1,""],build_selection_key:[1,2,1,""],calculate_area_type:[1,2,1,""],calculate_assign_group_and_roadway_class:[1,2,1,""],calculate_centroidconnect:[1,2,1,""],calculate_county:[1,2,1,""],calculate_distance:[1,2,1,""],calculate_hov:[1,2,1,""],calculate_mpo:[1,2,1,""],convert_int:[1,2,1,""],create_ML_variable:[1,2,1,""],create_calculated_variables:[1,2,1,""],create_dummy_connector_links:[1,2,1,""],create_hov_corridor_variable:[1,2,1,""],create_managed_lane_network:[1,2,1,""],create_managed_variable:[1,2,1,""],dataframe_to_fixed_width:[1,2,1,""],delete_roadway_feature_change:[1,2,1,""],deletion_map:[1,2,1,""],fill_na:[1,2,1,""],from_RoadwayNetwork:[1
,2,1,""],get_attribute:[1,2,1,""],get_managed_lane_node_ids:[1,2,1,""],get_modal_graph:[1,2,1,""],get_modal_links_nodes:[1,2,1,""],get_property_by_time_period_and_group:[1,2,1,""],is_network_connected:[1,2,1,""],network_connection_plot:[1,2,1,""],orig_dest_nodes_foreign_key:[1,2,1,""],ox_graph:[1,2,1,""],read:[1,2,1,""],read_match_result:[1,2,1,""],rename_variables_for_dbf:[1,2,1,""],roadway_net_to_gdf:[1,2,1,""],roadway_standard_to_met_council_network:[1,2,1,""],select_roadway_features:[1,2,1,""],selection_has_unique_link_id:[1,2,1,""],selection_map:[1,2,1,""],split_properties_by_time_period_and_category:[1,2,1,""],validate_link_schema:[1,2,1,""],validate_node_schema:[1,2,1,""],validate_object_types:[1,2,1,""],validate_properties:[1,2,1,""],validate_selection:[1,2,1,""],validate_shape_schema:[1,2,1,""],validate_uniqueness:[1,2,1,""],write:[1,2,1,""],write_roadway_as_fixedwidth:[1,2,1,""],write_roadway_as_shp:[1,2,1,""]},"lasso.Parameters":{__init__:[2,2,1,""],output_epsg:[2,3,1,""],properties_to_split:[2,3,1,""]},"lasso.Project":{CALCULATED_VALUES:[3,3,1,""],DEFAULT_PROJECT_NAME:[3,3,1,"id0"],STATIC_VALUES:[3,3,1,"id1"],__init__:[3,2,1,""],add_highway_changes:[3,2,1,""],add_transit_changes:[3,2,1,""],base_roadway_network:[3,3,1,""],base_transit_network:[3,3,1,""],build_transit_network:[3,3,1,""],card_data:[3,3,1,""],create_project:[3,2,1,""],determine_roadway_network_changes_compatability:[3,2,1,""],evaluate_changes:[3,2,1,""],parameters:[3,3,1,""],project_name:[3,3,1,""],read_logfile:[3,2,1,""],roadway_changes:[3,3,1,""],transit_changes:[3,3,1,""],write_project_card:[3,2,1,""]},"lasso.StandardTransit":{__init__:[4,2,1,""],calculate_cube_mode:[4,2,1,""],cube_format:[4,2,1,""],feed:[4,3,1,""],fromTransitNetwork:[4,2,1,""],parameters:[4,3,1,""],read_gtfs:[4,2,1,""],route_properties_gtfs_to_cube:[4,2,1,""],shape_gtfs_to_cube:[4,2,1,""],time_to_cube_time_period:[4,2,1,""],write_as_cube_lin:[4,2,1,""]},"lasso.logger":{setupLogging:[5,4,1,""]},"lasso.util":{column_name_to_parts:[6,4,1,""],get_shared_streets_intersection_hash:[6,4,1,""],hhmmss_to_datetime:[6,4,1,""],secs_to_datetime:[6,4,1,""]},lasso:{CubeTransit:[0,1,1,""],ModelRoadwayNetwork:[1,1,1,""],Parameters:[2,1,1,""],Project:[3,1,1,""],StandardTransit:[4,1,1,""],logger:[5,0,0,"-"],util:[6,0,0,"-"]}},objnames:{"0":["py","module","Python module"],"1":["py","class","Python class"],"2":["py","method","Python method"],"3":["py","attribute","Python attribute"],"4":["py","function","Python 
function"]},objtypes:{"0":"py:module","1":"py:class","2":"py:method","3":"py:attribute","4":"py:function"},terms:{"0965985":6,"0_452":0,"0e6d7de0aee2e9ae3b007d1e45284b06cc241d02":6,"100":1,"1000000":1,"111":0,"111_452_pk1":0,"1234":1,"145":1,"2564047368":11,"26915":2,"3100":2,"35e":11,"3_multiple_roadway_attribute_chang":11,"4321":1,"4326":1,"452":0,"4_simple_managed_lan":11,"500000":1,"69f13f881649cb21ee3b359730790bb9":6,"952112199999995":6,"954734870":6,"961117623":11,"boolean":[0,1],"break":11,"case":3,"char":1,"class":[0,1,2,3,4,8,11],"default":[1,2,3,4,11],"export":[1,8,11],"final":2,"function":[1,5,6,8,11],"import":2,"int":[0,1,2,4],"long":[1,6],"new":[0,1,11],"public":11,"return":[0,1,3,4,6],"static":[0,1,3,4],"true":[0,1,3,4,5,11],Bus:4,CRS:1,For:[0,1,4],Has:[0,11],IDE:11,IDs:1,The:[1,2,5,11],There:1,These:0,Used:11,__init__:[0,1,2,3,4],a_id:1,aadt:[1,2],aadt_2017_count_loc:2,aadt_mn:2,aadt_wi:2,abbrevi:[2,4],about:[0,1,4,11],absolut:0,access:[1,2,4],access_am:2,access_md:2,access_nt:2,access_pm:2,accomplish:11,activ:11,add:[0,1,3,11],add_additional_time_period:0,add_count:1,add_cub:0,add_highway_chang:3,add_new_roadway_feature_chang:1,add_transit_chang:3,add_variable_using_shst_refer:1,added:[0,1],adding:0,addit:[0,1,8,11],addition_map:1,after:4,agenc:0,agency_id:0,aggreg:3,aim:8,all:[0,1,2,5,11],allow:2,also:[1,11],ame:2,ani:[0,3,11],anoka:2,anoth:0,api:1,appli:[0,1],apply_all_project:11,apply_managed_lane_feature_chang:1,apply_python_calcul:1,apply_roadway_feature_chang:[1,11],appropri:[0,4],area:[1,2,3],area_typ:[1,2,3],area_type_code_dict:2,area_type_codes_dict:1,area_type_shap:[1,2],area_type_shape_vari:1,area_type_variable_shp:2,argument:[2,11],around:[8,11],arrai:1,as_integ:1,as_str:4,asgngrp_rc_num_crosswalk:2,assess:[1,3],assess_connect:[1,11],assign:[1,2,4],assign_group:[1,2,3],assign_group_variable_nam:1,associ:[0,2],assum:1,atom:11,attach:1,attr:2,attribut:[0,1,2,3,4,11],audit:8,automat:0,b_id:1,bare:5,base:[0,1,2,3,4,8,11],base_roadway_dir:3,base_roadway_network:3,base_scenario:11,base_transit:0,base_transit_dir:[4,11],base_transit_fil:3,base_transit_line_properties_dict:0,base_transit_network:[0,3,11],base_transit_sourc:[3,11],basic:11,becaus:[1,11],been:0,beetween:2,befor:11,being:[0,1],between:[0,1,2,3],bike:1,bike_access:[1,2],bike_nod:[1,2],bleed:8,blob:6,bool:[1,3],both:[1,4,11],boundari:[1,2],branch:11,branchnam:11,brief:8,build:[0,1,3,11],build_route_nam:0,build_selection_kei:1,build_transit_fil:3,build_transit_network:3,build_transit_sourc:[3,11],built:[1,11],bunch:3,bus:1,bus_onli:1,buse:4,bxack:11,cach:1,calcul:[0,1,2,3,4,6],calculate_area_typ:1,calculate_assign_group_and_roadway_class:1,calculate_centroidconnect:1,calculate_counti:1,calculate_cube_mod:4,calculate_dist:1,calculate_hov:1,calculate_mpo:1,calculate_start_end_tim:0,calculated_valu:[1,3],call:1,can:[2,8,11],candid:1,candidate_link_idx:1,candidate_links_idx:1,capabl:[0,11],card:[0,1,3],card_data:3,card_filenam:11,care:0,carver:2,categori:[1,2,11],cb_2017_us_county_5m:2,center:2,centroid:[1,2],centroid_connect_lan:2,centroidconnect:[1,2,3],centroidconnect_onli:1,centroidconnector:1,certain:1,chang:[0,1,3,11],channel:11,check:[0,1,3],check_scenario_requisit:11,citilab:8,clear:5,clone:8,code:[1,2,11],column:[0,1],column_name_to_part:6,com:[1,4,6,8,11],combin:[1,2,4],comdes2040:2,come:1,command:[0,1],compar:[0,3,11],compliant:2,compon:8,composit:5,concaten:1,conda:11,condit:[0,1],config:11,config_fil:11,configur:11,confirm:1,connect:1,connector:[1,2],consid:[1,11],consist:[1,11],consol:5,constant:3,co
nstruct:[0,11],constructor:[0,1,3],consum:3,contain:[0,1,4,8,11],contaten:0,content:8,contribut:11,convert:[0,1,4],convert_geometry_to_xi:1,convert_int:1,copi:[0,11],correct:0,correspond:1,corridor:1,could:[1,11],council:[1,8],count:[1,2],count_am:2,count_daili:2,count_md:2,count_mn:2,count_nt:2,count_pm:2,count_year:2,counti:[1,2,3],county_codes_dict:1,county_network_vari:1,county_shap:[1,2],county_shape_vari:1,county_vari:[1,2],county_variable_shp:2,creat:[0,1,2,3,4,6,8,11],create_add_route_card_dict:0,create_calculated_vari:1,create_delete_route_card_dict:0,create_dummy_connector_link:1,create_from_cub:[0,11],create_hov_corridor_vari:1,create_managed_lane_network:[1,11],create_managed_vari:1,create_ml_vari:1,create_project:[3,11],create_scenario:11,create_update_route_card_dict:0,creation:11,criteria:1,crosswalk:[1,2],csv:[1,2,3,11],cube:[0,1,2,3,4,8],cube_dir:[0,3,11],cube_format:4,cube_mod:4,cube_properties_dict:0,cube_properties_to_standard_properti:0,cube_time_period:2,cube_transit_net:[4,11],cubetransit:[3,8],curv:1,dai:[2,4,11],dakota:2,data:[1,2,3,4,8,11],data_to_csv:1,data_to_dbf:1,databas:1,datafram:[0,1,3,4,11],dataframe_to_fixed_width:1,date:11,datetim:6,dbf:[1,2],ddatafram:0,debug:5,debuglog:5,debuglogfilenam:5,decid:1,defauli:1,default_project_nam:3,defin:[0,1,2,3,11],delai:1,delet:[0,1],delete_roadway_feature_chang:1,deletion_map:1,demand:2,describ:[1,11],desktop:11,detail:[2,5],determin:[1,3],determine_roadway_network_changes_compat:3,develop:[4,11],dict:[0,1,2,3],dictionari:[0,1,2,3,4,11],dictonari:1,diff_dict:0,differ:[0,2],digraph:1,direct:[0,11],direction_id:0,directli:0,directori:[0,4,11],disaggreg:4,discongru:0,disconnect:1,disconnected_nod:11,disconnected_subgraph_nod:1,discuss:1,distanc:[1,2,3],document:11,doe:[1,4],doesn:[0,1],doing:11,don:1,done:11,dot:[1,2],downtown:[1,2],downtown_area_typ:[1,2],downtown_area_type_shap:[1,2],downtownzones_taz:2,drive:[1,11],drive_access:[1,2,11],drive_nod:[1,2],drive_onli:1,dummi:1,duplic:0,each:[0,1],edg:8,edit:11,egress:1,either:[0,5,11],elif:4,els:4,end:[0,1],entri:[0,3],environ:11,epsg:[1,2],etc:[1,2,3,11],evalu:[0,1,3],evaluate_chang:[3,11],evaluate_differ:[0,11],evaluate_route_property_differ:0,evaluate_route_shape_chang:0,everi:1,exampl:[0,1,2,3,4,11],execut:1,exist:[0,1,3,11],expect:[1,6],explicitli:[2,11],express:4,facil:[1,11],fail:1,fals:[0,1,3,4,11],fast:[1,11],featur:1,feed:[4,11],field:[1,2,11],field_nam:1,fig:1,file:[0,1,2,3,4,8],filenam:[1,3,11],fill:1,fill_na:1,filter:1,find:[0,4],first:3,fix:[1,2],flavor:[0,1],folder:3,follow:[0,4,8,11],force_search:1,foreign:1,forg:11,fork:11,format:[0,1,2,4,8,11],found:0,frequenc:[0,4,11],frequent:11,from:[0,1,3,4,6,8],from_roadwaynetwork:[1,11],fromtransitnetwork:4,full:1,further:4,gener:1,geodadabas:1,geodatabas:1,geodatafram:[1,11],geograph:2,geojson:[1,2],geometri:[1,2],geopanda:[1,11],get:[1,2,11],get_attribut:1,get_managed_lane_node_id:1,get_modal_graph:1,get_modal_links_nod:1,get_property_by_time_period_and_group:1,get_shared_streets_intersection_hash:6,get_time_period_numbers_from_cube_properti:0,git:11,github:[1,6,8,11],give:5,given:11,goal:11,going:11,good:11,googl:4,gp_df:1,graph:1,group:[1,2],gtf:[2,4,11],gtfs_feed_dir:4,gui:11,has:[1,11],have:[0,1,8],haven:0,header:[1,2],headwai:[0,2,4],hennepin:2,hhmmss_str:6,hhmmss_to_datetim:6,highest:2,highest_taz_numb:[1,2],highlight:[2,11],highwai:[2,3,8,11],hoc:[3,11],hold:[4,11],hov2:[1,2],hov3:[1,2],hov:1,how:11,http:[1,4,6,8,11],identifi:[0,1,11],ids:1,ignor:1,ignore_end_nod:[1,11],ignore_exist:1,ignore_miss:1,implement
:11,in_plac:[1,11],includ:[0,1,2,11],index:[6,8],indic:[0,1],info:[0,1,2],infolog:5,infologfilenam:5,inform:[0,1,4,11],infrastructur:11,initi:[2,3,11],input:1,input_df:1,instal:8,instanc:[0,1,2,3,4,11],instanti:11,instruct:11,int_col_nam:1,integ:[0,1,2],intro:8,ipynb:11,is_network_connect:[1,11],isn:1,issu:1,its:[0,1],join:[1,3,4,11],join_kei:1,json:[1,11],jupyt:8,just:5,keep_same_attributes_ml_and_gp:1,kei:[0,1,11],kept:1,keyword:[2,11],kwarg:2,l553:6,l565:6,label:1,lane:[1,2,11],lanes_am:2,lanes_lookup_fil:2,lanes_md:2,lanes_nt:2,lanes_pm:2,lanes_vari:1,lasso:11,lat:6,later:5,layer:11,learn:11,least:[1,11],length:[1,2],level:3,light:4,like:[0,1,2],limit_variables_to_existing_network:3,lin:[0,3,4],line:[0,2,3,4,8,11],line_nam:0,line_properti:0,line_properties_dict:0,link:[1,2,3,11],link_df:1,link_fil:[1,11],link_foreign_kei:1,link_idx:1,link_output_vari:1,links_df:[1,11],links_header_width:2,list:[0,1,2,3,4,11],lndice:1,local:[4,11],locat:[1,2,4],locationrefer:1,log:[1,3,8],logfil:3,logfilenam:3,logger:8,logic:4,logtoconsol:5,longnam:4,lookup:2,lot:1,lower:4,machin:11,made:11,main:1,mainli:11,make:[0,1,11],make_complete_network_from_fixed_width_fil:2,manag:[1,11],managed_lanes_link_id_scalar:1,managed_lanes_node_id_scalar:1,managed_lanes_required_attribut:1,managed_lanes_scalar:1,manipul:[4,11],map:[1,2,11],mark:1,mashup:11,master:11,match:[1,2],matcher:1,max:1,max_search_breadth:1,maxspe:1,member:1,met:1,metcouncil:[1,2,4,8,11],metcouncil_data:2,method:[0,1,2,3,4,11],metropolitan:8,midnight:[4,6],might:11,mile:1,minimum:[1,5],minnesota:1,minut:0,miss:1,ml_df:1,ml_lane:1,ml_net:11,mn_count_shst_api_match:2,mndot:[1,2],mndot_count_shap:2,mndot_count_shst_data:[1,2],mndot_count_variable_shp:[1,2],modal_nodes_df:1,mode:[1,4,11],mode_node_vari:1,model:[1,2],model_link_id:[1,2,3],model_node_id:[1,2],model_road_net:11,modelroadwaynetwork:[4,8],modes_to_network_link_vari:1,modes_to_network_node_vari:1,modul:8,more:[1,11],most:0,mpo:[1,2],mpo_counti:[1,2],mrcc:[1,2],mrcc_assgngrp_dict:[1,2],mrcc_roadway_class_shap:[1,2],mrcc_roadway_class_shp:2,mrcc_roadway_class_variable_shp:[1,2],mrcc_route_sys_asgngrp_crosswalk:2,mrcc_shst_data:[1,2],much:1,multidigraph:1,multipl:[0,11],multiple_chang:11,must:1,my_base_scenario:11,my_chang:11,my_config:11,my_dir:11,my_lasso_environ:11,my_link_fil:11,my_net:11,my_node_fil:11,my_out_prefix:11,my_paramet:11,my_scenario:11,my_select:11,my_shape_fil:11,name:[0,1,2,3,4,11],need:[0,1,2,3],neg:0,nest:11,net:[1,11],net_to_dbf:2,network:[0,1,2,3,5,8],network_connection_plot:1,network_var_typ:1,network_vari:1,network_wrangl:[1,8,11],networkwrangl:[8,11],networkx:1,new_time_period_numb:0,node:[0,1,2,3,4,11],node_fil:[1,11],node_foreign_kei:1,node_id:0,node_output_vari:1,nodes_df:[1,11],nodes_header_width:2,nodes_list:1,noisi:5,non:0,none:[0,1,3,4,5,6],notat:4,note:[0,11],notebook:8,noth:1,novel:11,now:1,number:[0,1,2,4,11],number_of_lan:1,numer:[0,4],object:[0,1,2,3,4,6,11],one:[1,11],onewai:[1,4],onli:[0,1,3,4],open:11,oper:4,option:1,order:[0,1,2,11],orig_dest_nodes_foreign_kei:1,orig_line_nam:0,origin:[0,1,11],osm:[1,2],osm_assgngrp_dict:[1,2],osm_highway_asgngrp_crosswalk:2,osm_model_link_id:1,osm_node_id:[2,6,11],osmnx:[1,11],other:[1,11],otherwis:4,out:[1,2,5,6,8],outfil:[4,11],outpath:4,output:[1,2,3,4],output_cube_network_script:[1,2],output_epsg:[1,2],output_link_csv:[1,2],output_link_header_width_txt:[1,2],output_link_shp:[1,2],output_link_txt:[1,2],output_node_csv:[1,2],output_node_header_width_txt:[1,2],output_node_shp:[1,2],output_node_txt:[1,2],output_vari:[1
,2],overrid:11,overwrit:1,ox_graph:1,packag:[8,11],page:8,pair:11,panda:[0,1,3],parallel:1,param:1,paramet:[0,1,3,4,6,8],pars:[0,8,11],part:11,partridg:[4,11],pass:[1,5],path:[1,3,4,11],per:[6,11],perfect:1,period:[0,1,2,4,11],pertin:0,physic:1,pip:11,placehold:1,plan:11,pleas:11,plot:1,posit:0,possibl:2,potenti:2,prefix:1,prepar:4,price:[1,2],price_hov2_am:2,price_hov2_md:2,price_hov2_nt:2,price_hov2_pm:2,price_hov3_am:2,price_hov3_md:2,price_hov3_nt:2,price_hov3_pm:2,price_sov_am:2,price_sov_md:2,price_sov_nt:2,price_sov_pm:2,price_truck_am:2,price_truck_md:2,price_truck_nt:2,price_truck_pm:2,produc:3,program_typ:0,project:[0,1,2,8],project_card:11,project_card_dictionari:1,project_card_directori:11,project_cards_list:11,project_nam:3,projectcard:[3,8],properti:[0,1,2,4,11],properties_bas:0,properties_build:0,properties_list:0,properties_to_split:[1,2],property_nam:0,property_valu:0,provid:[3,4,11],ptg_feed:4,publictransport:11,purpos:1,pycod:1,pypi:11,python:[0,1,2,11],queri:[1,11],quicker:1,quickstart:8,rail:[1,4],rail_onli:1,ramsei:2,rather:[0,1],rdwy_ctgy_:2,reachabl:1,read:[0,1,3,4,8,11],read_gtf:[4,11],read_logfil:3,read_match_result:1,recalcul:3,recalculate_calculated_vari:[1,3],recalculate_dist:[1,3],recommend:11,record:1,ref:1,refer:[4,11],refin:8,regex:1,region:1,remov:1,renam:1,rename_variables_for_dbf:1,replac:11,report:8,repositori:11,repres:[0,2,4,6,11],represent:[1,4],represnt:4,reprsent:[3,11],requir:[0,1,11],require_existing_for_chang:1,result:[1,11],right:1,rigor:1,road_class_variable_nam:1,road_net:11,roadwai:[1,2,3,11],roadway_chang:3,roadway_class:[1,2],roadway_class_dict:2,roadway_class_idx:2,roadway_csv_fil:3,roadway_log_fil:3,roadway_net:1,roadway_net_to_gdf:1,roadway_network_link:1,roadway_network_nod:1,roadway_network_object:1,roadway_network_shap:1,roadway_shp_fil:3,roadway_standard_to_met_council_network:1,roadwaynetwork:[1,3,4],rout:[0,1,4],route_id:[0,4],route_long_nam:4,route_properti:0,route_properties_gtfs_to_cub:4,route_si:2,route_typ:4,route_type_to_cube_mod:4,row:4,rtree:11,run:8,runtim:[2,11],safe_load:11,sag:[1,8,11],same:1,satisfi:1,save:1,scenario:8,scenario_summari:11,schema:[1,11],schema_loc:1,scott:2,scratch:2,scratch_dir:[3,11],script:[1,2],search:[1,2,8,11],search_breadth:1,search_mod:1,sec:6,second:[0,4,6],secs_to_datetim:6,see:[0,1,3,4],segment:[1,11],segment_id:1,select:[1,11],select_roadway_featur:[1,11],selected_link_idx:1,selected_links_idx:1,selection_dict:1,selection_dictionari:1,selection_dictonari:1,selection_has_unique_link_id:1,selection_map:1,selection_requir:1,self:[0,1,2,3,4],separ:1,seri:[1,11],serv:1,servic:11,set:[0,1,2,3,4,5,8,11],setup:8,setuplog:5,sever:2,shape:[0,1,3,4,11],shape_bas:0,shape_build:0,shape_fil:[1,11],shape_gtfs_to_cub:4,shape_id:[1,4],shapefil:2,shapes_df:11,share:[1,2,11],sharedstreet:6,shortest:1,should:[1,2,3,11],show:1,shp:[1,2],shst:[1,2],shst_csv_variabl:1,shstgeometryid:2,shstid:1,singl:[1,3],singleton:1,skip:1,softwar:[8,11],sophist:1,sourc:[0,1,2,3,4,5,6,11],source_gdf:1,source_list:0,source_shst_ref_df:1,sov:[1,2],sp_weight_factor:1,span:11,spatial:1,specif:[0,1,8,11],specifi:[1,3],speed:1,spew:5,split:[1,2,4],split_properties_by_time_period_and_categori:1,src:6,standard:[0,1,2,4,11],standardtransit:8,start:[0,1,4,8],start_time_sec:4,static_valu:3,stnadard:0,stop:0,store:[0,1,11],stpaul_dir:11,stpaul_link_fil:11,stpaul_node_fil:11,stpaul_shape_fil:11,str:[0,1,2,3],strai:1,stratifi:1,street:[1,2,11],streetcar:4,string:[0,1,3,4,6],strongli:1,structur:1,style:0,subclass:[1,11],subgraph:1,sublim:11,sub
urban:4,suffix:2,suggest:11,suitabl:0,sure:1,syntax:11,system:[1,3],t_transit_shape_test:[3,11],tabl:2,tag:11,take:[0,1],taz:[1,2],taz_data:2,taz_shap:2,tazofficialwcurrentforecast:2,team:11,ters:5,test:[1,2],test_project:[3,11],text:11,than:[0,1],thei:[1,2],them:[0,1,8],theproject:1,thi:[0,1,2,3,8,11],this_tp:4,this_tp_num:4,those:11,three:11,thrivemsp2040communitydesign:2,tier:11,time:[0,1,2,4,6,11],time_period:[0,1,2],time_period_numb:0,time_period_to_tim:2,time_periods_to_tim:2,time_to_cube_time_period:4,times_period:1,tod:4,todo:2,too:[1,5],top:11,tradas_:2,traffic:1,tram:4,trans_mrcc_centerlin:2,transform:1,transit:[0,1,2,3,4,8],transit_access:[1,2],transit_chang:3,transit_change_list:[0,11],transit_net:11,transit_network_object:4,transit_nod:2,transit_route_shape_chang:[3,11],transit_sourc:0,transitnetwork:[0,4],translat:[0,4,11],transport:11,travel:11,trip:4,trip_df:4,trip_id:4,trk:2,trn_prioriti:[1,2],trn_priority_am:2,trn_priority_md:2,trn_priority_nt:2,trn_priority_pm:2,trnbld:0,truck:2,truck_access:2,ttime_assert:[1,2],ttime_assert_am:2,ttime_assert_md:2,ttime_assert_nt:2,ttime_assert_pm:2,tupl:1,turn:[1,11],two:[0,1,8,11],txt:[2,11],type:[0,1,2,3,4,6,11],typic:[0,3,4,8],under:0,understand:11,union:1,uniqu:[0,1],unique_link_kei:1,unique_model_link_identifi:1,unique_node_identifi:1,unique_node_kei:1,unique_shape_kei:1,unit:0,unpack:0,unpack_route_nam:0,updat:[0,1,11],updated_properties_dict:0,upstream:11,urban:[2,4],usag:[0,1,3,4],use:[0,1,2,3,4,11],used:[1,2,11],user:[3,11],uses:[1,4,11],using:[1,11],util:[1,8,11],valid:[1,11],validate_bas:0,validate_link_schema:1,validate_node_schema:1,validate_object_typ:1,validate_properti:1,validate_select:1,validate_shape_schema:1,validate_uniqu:1,valu:[0,1,3,4,11],var_shst_csvdata:1,variabl:[0,1,2,3,4,8,11],variable_crosswalk:1,varibl:0,verbos:4,veri:5,version:[0,11],vertex:1,via:1,virtual:11,virtualenv:11,volum:1,walk:[1,11],walk_access:[1,2],walk_nod:[1,2],want:[1,11],warn:[0,1],washington:2,well:11,were:1,what:[0,1,3],whatev:1,when:[1,3],where:1,whether:1,whetther:1,which:[0,1,3,11],why:1,wi_count_shst_api_match:2,widot:[1,2],widot_assgngrp_dict:[1,2],widot_count_shap:2,widot_count_shst_data:[1,2],widot_count_variable_shp:[1,2],widot_ctgy_asgngrp_crosswalk:2,widot_roadway_class_shap:[1,2],widot_roadway_class_variable_shp:[1,2],widot_shst_data:[1,2],width:1,wisconsin:[1,2],wisconsin_lanes_counts_median:2,wisconsindot:2,wislr:2,within:[0,1,2],work:11,workflow:8,wrangler:[8,11],wrapper:[1,8,11],write:[0,1,3,4,11],write_as_cube_lin:[4,11],write_dir:[4,11],write_project_card:[3,11],write_roadway_as_fixedwidth:[1,11],write_roadway_as_shp:[1,11],written:[1,11],wsp:[1,8,11],yaml:11,yml:[3,11],you:[1,3,11],your:11},titles:["lasso.CubeTransit","lasso.ModelRoadwayNetwork","lasso.Parameters","lasso.Project","lasso.StandardTransit","lasso.logger","lasso.util","Lasso Classes and Functions","Welcome to lasso\u2019s documentation!","Running Lasso","Setup","Starting 
Out"],titleterms:{"class":7,"export":9,"function":7,addit:10,audit:9,base:7,bleed:11,brief:11,card:11,clone:11,compon:11,creat:9,cube:11,cubetransit:[0,11],data:10,document:8,edg:11,file:[9,10,11],from:11,indic:8,instal:11,intro:11,jupyt:11,lasso:[0,1,2,3,4,5,6,7,8,9],lin:11,log:11,logger:5,model:11,modelroadwaynetwork:[1,11],network:[9,11],notebook:11,out:11,paramet:[2,10,11],project:[3,9,10,11],projectcard:11,quickstart:11,report:9,roadwaynetwork:11,run:[9,11],scenario:[9,11],set:10,setup:10,standardtransit:[4,11],start:11,tabl:8,todo:1,transit:11,transitnetwork:11,typic:11,util:[6,7],welcom:8,workflow:11}}) \ No newline at end of file diff --git a/setup.html b/setup.html new file mode 100644 index 0000000..097ad6a --- /dev/null +++ b/setup.html @@ -0,0 +1,235 @@ + + + + + + + + + + Setup — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ + + + +
+
+
+
+ +
+

Setup

+
+

Projects

+
+
+

Parameters

+
+
+

Settings

+
+
+

Additional Data Files

+
+
+ + +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file diff --git a/starting.html b/starting.html new file mode 100644 index 0000000..b8387cb --- /dev/null +++ b/starting.html @@ -0,0 +1,532 @@ + + + + + + + + + + Starting Out — lasso documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + +
+ + + + + +
+ +
+ + + + + + + + + + + + + + + + + +
+ + + + +
+
+
+
+ +
+

Starting Out

+
+

Installation

+

If you are managing multiple python versions, we suggest using virtualenv or conda virtual environments.

+

The following example uses a conda environment (recommended) and the pip package manager to install Lasso from source on GitHub.

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/Lasso@master
+
+
+

Lasso will install network_wrangler from the PyPI repository because it is included in Lasso’s requirements.txt.

+
+

Bleeding Edge

+

If you want to install a more up-to-date or development version of network wrangler and lasso, you can do so by installing them from the develop branch of each repository:

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+pip install git+https://github.com/wsp-sag/network_wrangler@develop
+pip install git+https://github.com/wsp-sag/Lasso@develop
+
+
+
+
+

From Clone

+

If you are going to be working on Lasso locally, you might want to clone it to your local machine and install it from the clone. The -e flag installs it in editable mode.

+

if you plan to do development on both network wrangler and lasso locally, consider installing network wrangler from a clone as well!

+
conda config --add channels conda-forge
+conda create python=3.7 rtree geopandas osmnx -n <my_lasso_environment>
+conda activate <my_lasso_environment>
+git clone https://github.com/wsp-sag/Lasso
+git clone https://github.com/wsp-sag/network_wrangler
+cd network_wrangler
+pip install -e .
+cd ..
+cd Lasso
+pip install -e .
+
+
+

Notes:

+
    +
  1. The -e installs it in editable mode.

  2. +
  3. If you are not part of the project team and want to contribute code back to the project, please fork before you clone and then add the original repository to your upstream origin list per these directions on GitHub.

  4. +
  5. If you want to install from a specific tag/version number or branch, replace @master with @<branchname> or @<tag>.

  6. +
  7. If you want to make use of frequent developer updates for network wrangler as well, you can also install it from a clone by following the same cloning and installation steps shown above for Lasso.

  8. +
+

If you are going to be doing Lasso development, we also recommend:

+
    +
  • a good IDE such as Atom, VS Code, Sublime Text, etc. +with Python syntax highlighting turned on.

  • +
  • GitHub Desktop to locally update your clones

  • +
+
+
+
+

Brief Intro

+

Lasso is a ‘wrapper’ around the Network Wrangler utility.

+

Both Lasso and NetworkWrangler are built around the following data schemas:

+
    +
  • [roadway network], which is based on a mashup of OpenStreetMap and Shared Streets. In Network Wrangler these are read in from three json files representing links, shapes, and nodes. Data fields that change by time of day or by user category are represented as nested fields such that any field can be defined for an ad-hoc time-of-day span or user category (see the sketch after this list).

  • +
  • [transit network], which is based on a frequency-based implementation of the CSV-based GTFS; and

  • +
  • [project card], which is novel to Network Wrangler and stores information about network changes as a result of projects in yml.

  • +
+
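For example, a link field with a time-of-day override could be sketched as the nested Python dictionary below; the key names here are illustrative only, and the authoritative structure is defined by the Network Wrangler roadway schema.
+
+# hypothetical sketch of a nested link property; key names are illustrative
+lanes = {
+    "default": 1,                                # value when no override applies
+    "timeofday": [
+        {"time": ["6:00", "9:00"], "value": 2},  # AM-peak override
+    ],
+}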

In addition, Lasso utilizes the following data schemas:

+
    +
  • [MetCouncil Model Roadway Network Schema], which adds data fields to the roadway network schema that MetCouncil uses in their travel model including breaking out data fields by time period.

  • +
  • [MetCouncil Model Transit Network Schema], which uses the Cube PublicTransport format, and

  • +
  • [Cube Log Files], which document changes to the roadway network done in the Cube GUI. Lasso translates these to project cards in order to be used by NetworkWrangler.

  • +
  • [Cube public transport line files], which define a set of transit lines in the cube software.

  • +
+
+

Components

+

Network Wrangler has the following atomic parts:

+
    +
  • RoadwayNetwork object, which represents the roadway network data as GeoDataFrames;

  • +
  • TransitNetwork object, which represents the transit network data as DataFrames;

  • +
  • ProjectCard object, which represents the data of the project card. Project cards identify the infrastructure that is changing (a selection) and define the changes, or contain information about a new facility to be constructed or a new service to be run;

  • +
  • Scenario object, which consists of at least a RoadwayNetwork and +TransitNetwork. Scenarios can be based on or tiered from other scenarios. +Scenarios can query and add ProjectCards to describe a set of changes that should be made to the network.

  • +
+

In addition, Lasso has the following atomic parts:

+
    +
  • Project object, which creates project cards from one of the following: a base and a build transit network in Cube format, a base and a build highway network, or a base highway network and a Cube log file.

  • +
  • ModelRoadwayNetwork object is a subclass of RoadwayNetwork and contains MetCouncil-specific methods to define and create MetCouncil-specific variables and export the network to a format that can be read by Cube.

  • +
  • StandardTransit, an object for holding a standard transit feed as a Partridge object, with +methods to manipulate and translate the GTFS data to MetCouncil’s Cube Line files.

  • +
  • CubeTransit, an object for storing information about transit defined in Cube public transport line files. +It has the capability to parse Cube line file properties and shapes into python dictionaries and to compare line files and represent changes as Project Card dictionaries.

  • +
  • Parameters, a class representing all the parameters defining the networks, +including time of day, categories, etc. Parameters can be set at runtime by initializing a parameters instance +with a keyword argument setting the attribute. Parameters that are +not explicitly set will use the default values listed in this class.

  • +
+
+

RoadwayNetwork

+

Reads, writes, queries, and manipulates roadway network data, which +is mainly stored in the GeoDataFrames links_df, nodes_df, and shapes_df.

+
from network_wrangler import RoadwayNetwork
+
+net = RoadwayNetwork.read(
+        link_file=MY_LINK_FILE,
+        node_file=MY_NODE_FILE,
+        shape_file=MY_SHAPE_FILE,
+    )
+my_selection = {
+    "link": [{"name": ["I 35E"]}],
+    "A": {"osm_node_id": "961117623"},  # start searching for segments at A
+    "B": {"osm_node_id": "2564047368"},
+}
+net.select_roadway_features(my_selection)
+
+my_change = [
+    {
+        'property': 'lanes',
+        'existing': 1,
+        'set': 2,
+     },
+     {
+        'property': 'drive_access',
+        'set': 0,
+      },
+]
+
+# apply the property changes to the links selected above
+net.apply_roadway_feature_change(
+    net.select_roadway_features(my_selection),
+    my_change
+)
+
+ml_net = net.create_managed_lane_network(in_place=False)
+
+ml_net.is_network_connected(mode="drive")
+
+_, disconnected_nodes = ml_net.assess_connectivity(
+  mode="walk",
+  ignore_end_nodes=True
+)
+ml_net.write(filename=my_out_prefix, path=my_dir)
+
+
+
+
+

TransitNetwork

+
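A minimal sketch of reading a transit network with Network Wrangler, reusing the STPAUL_DIR placeholder from the Scenario example below:
+
+from network_wrangler import TransitNetwork
+
+# read a standard (GTFS-like) transit network from a directory
+transit_net = TransitNetwork.read(STPAUL_DIR)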
+
+

ProjectCard

+
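A minimal sketch of reading a project card from a yml file, mirroring the usage in the Scenario example below (the file name is one of the example cards used there):
+
+from network_wrangler import ProjectCard
+
+# read a single project card; validate=False skips schema validation
+my_card = ProjectCard.read("4_simple_managed_lane.yml", validate=False)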
+
+

Scenario

+

Manages sets of project cards and tiering from a base scenario/set of networks.

+

+import os
+
+from network_wrangler import ProjectCard, RoadwayNetwork, Scenario, TransitNetwork
+
+my_base_scenario = {
+    "road_net": RoadwayNetwork.read(
+        link_file=STPAUL_LINK_FILE,
+        node_file=STPAUL_NODE_FILE,
+        shape_file=STPAUL_SHAPE_FILE,
+        fast=True,
+    ),
+    "transit_net": TransitNetwork.read(STPAUL_DIR),
+}
+
+card_filenames = [
+    "3_multiple_roadway_attribute_change.yml",
+    "multiple_changes.yml",
+    "4_simple_managed_lane.yml",
+]
+
+project_card_directory = os.path.join(STPAUL_DIR, "project_cards")
+
+project_cards_list = [
+    ProjectCard.read(os.path.join(project_card_directory, filename), validate=False)
+    for filename in card_filenames
+]
+
+my_scenario = Scenario.create_scenario(
+  base_scenario=my_base_scenario,
+  project_cards_list=project_cards_list,
+)
+my_scenario.check_scenario_requisites()
+
+my_scenario.apply_all_projects()
+
+my_scenario.scenario_summary()
+
+
+
+
+

Project

+

Creates project cards by comparing two MetCouncil Model Transit Network files or by reading a Cube log file and a base network.

+

+import os
+
+from lasso import Project
+
+test_project = Project.create_project(
+  base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+  build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+  )
+
+test_project.evaluate_changes()
+
+test_project.write_project_card(
+  os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml")
+  )
+
+
+
+
+

ModelRoadwayNetwork

+

A subclass of network_wrangler’s RoadwayNetwork +class which adds additional understanding about how to translate and write the +network out to the MetCouncil Roadway Network schema.

+
from lasso import ModelRoadwayNetwork
+
+net = ModelRoadwayNetwork.read(
+      link_file=STPAUL_LINK_FILE,
+      node_file=STPAUL_NODE_FILE,
+      shape_file=STPAUL_SHAPE_FILE,
+      fast=True,
+  )
+
+net.write_roadway_as_fixedwidth()
+
+
+
+
+

StandardTransit

+

Translates the standard GTFS data to MetCouncil’s Cube Line files.

+
import os
+
+from lasso import StandardTransit
+
+cube_transit_net = StandardTransit.read_gtfs(BASE_TRANSIT_DIR)
+cube_transit_net.write_as_cube_lin(os.path.join(WRITE_DIR, "outfile.lin"))
+
+
+
+
+

CubeTransit

+

Used by the project class and has the capability to:

+
    +
  • Parse cube line file properties and shapes into python dictionaries

  • +
  • Compare line files and represent changes as Project Card dictionaries

  • +
+
from lasso import CubeTransit
+
+tn = CubeTransit.create_from_cube(CUBE_DIR)
+transit_change_list = tn.evaluate_differences(base_transit_network)
+
+
+
+
+

Parameters

+

Holds information about default parameters but can +also be initialized to override those parameters at object instantiation using a dictionary.

+
import yaml
+
+from lasso import ModelRoadwayNetwork
+
+# read parameters from a yaml configuration file
+# could also provide as a key/value pair
+with open(config_file) as f:
+      my_config = yaml.safe_load(f)
+
+# provide parameters at instantiation of ModelRoadwayNetwork
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+            my_scenario.road_net, parameters=my_config.get("my_parameters", {})
+        )
+# the network is written using the parameter values provided above
+model_road_net.write_roadway_as_shp()
+
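The Parameters class can also be instantiated directly and handed to Lasso objects; the sketch below is an assumption-laden illustration (it assumes keyword arguments mirror the documented attributes such as output_epsg, and that from_RoadwayNetwork accepts a Parameters instance as well as a dictionary).
+
+from lasso import Parameters
+
+# assumed usage: keyword overrides fall back to the documented defaults
+my_parameters = Parameters(output_epsg=26915)
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(
+    my_scenario.road_net, parameters=my_parameters
+)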
+
+
+
+
+

Typical Workflow

+

Workflows in Lasso and Network Wrangler typically accomplish one of two goals:

+
    +
  1. Create Project Cards to document network changes as a result of either transit or roadway projects.

  2. +
  3. Create Model Network Files for a scenario as represented by a series of Project Cards layered on top of a base network.

  4. +
+
+

Project Cards from Transit LIN Files

+
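This workflow follows the Project example shown above; a condensed sketch (directory names are placeholders):
+
+import os
+
+from lasso import Project
+
+# compare a base and a build set of Cube transit line files
+test_project = Project.create_project(
+    base_transit_source=os.path.join(CUBE_DIR, "transit.LIN"),
+    build_transit_source=os.path.join(CUBE_DIR, "transit_route_shape_change"),
+)
+test_project.evaluate_changes()
+test_project.write_project_card(os.path.join(SCRATCH_DIR, "t_transit_shape_test.yml"))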
+
+

Project Cards from Cube LOG Files

+
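A sketch of this workflow under stated assumptions: it assumes Project.create_project accepts a Cube log file and a base roadway network directory via the roadway_log_file and base_roadway_dir keywords that appear in the lasso.Project documentation; the file and directory names are hypothetical placeholders.
+
+import os
+
+from lasso import Project
+
+log_project = Project.create_project(
+    roadway_log_file=os.path.join(CUBE_DIR, "roadway_changes.log"),  # hypothetical file name
+    base_roadway_dir=BASE_ROADWAY_DIR,                               # hypothetical placeholder
+)
+log_project.evaluate_changes()
+log_project.write_project_card(os.path.join(SCRATCH_DIR, "t_roadway_log_test.yml"))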
+
+

Model Network Files for a Scenario

+
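A sketch of this workflow, combining the Scenario and ModelRoadwayNetwork examples shown above (my_scenario is the applied scenario from the Scenario example):
+
+from lasso import ModelRoadwayNetwork
+
+# convert the scenario's roadway network to a MetCouncil model roadway network
+model_road_net = ModelRoadwayNetwork.from_RoadwayNetwork(my_scenario.road_net)
+model_road_net.write_roadway_as_fixedwidth()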
+
+
+
+

Running Quickstart Jupyter Notebooks

+

To learn basic Lasso functionality, please refer to the following Jupyter notebooks in the /notebooks directory:

+
    +
  • Lasso Project Card Creation Quickstart.ipynb

  • +
  • Lasso Scenario Creation Quickstart.ipynb

  • +
+

Jupyter notebooks can be started by activating the lasso conda environment and typing jupyter notebook:

+
conda activate <my_lasso_environment>
+jupyter notebook
+
+
+
+
+ + +
+ +
+ + +
+
+ +
+ +
+ + + + + + + + + + + \ No newline at end of file