released version (#454)
* fixes errors on qgis (#417)

* runs examples (#422)

Co-authored-by: pveigadecamargo <pveigadecamargo@anl.gov>

* Update LICENSE.TXT (#427)

* fix matrix indices for memory-only matrices (#425)

* fix matrix indices for memory-only matrices

* run black

---------

Co-authored-by: Pelle Koster <pelle.koster@nginfra.nl>

* Pedro/cores skimming (#426)

* Moves the setting of number of cores to the right place

* Moves documentation to the right place

---------

Co-authored-by: pveigadecamargo <pveigadecamargo@anl.gov>

* Updates assignment logging (#423)

* assignment logging

* Updates logging

* Adds __config log

* Adds test and docs

* Update aequilibrae/paths/traffic_assignment.py

* Fixes tests

* fixes coverage testing

* merges testing changes

* style

* style

* style

* style

* narrows coverage test

* Update test_traffic_assignment.py

* Apply suggestions from code review

* Fixes test_traffic_assignment

---------

Co-authored-by: pveigadecamargo <pveigadecamargo@anl.gov>
Co-authored-by: Pedro Camargo <c@margo.co>

* narrows coverage test (#429)

Co-authored-by: pveigadecamargo <pveigadecamargo@anl.gov>

* moves openmatrix to a primary dependency (#431)

* moves openmatrix to a primary dependency

* QGIS moved to Numpy 1.24 before moving to Python 3.10

---------

Co-authored-by: pveigadecamargo <pveigadecamargo@anl.gov>

* Updates software version for release

* Bumps up version

* Bumps up documentation version

---------

Co-authored-by: pveigadecamargo <pveigadecamargo@anl.gov>

* minor change to test CI

* Fix multiple classes not being presented in the returned d (#438)

* Prevent invalid attribute names on AequilibraE data fields

This previously would have resulted in a syntax error when accessing, but
it's best we don't accept them anyway.

* Fix multiple classes not being presented in the returned df

Bug report: https://groups.google.com/g/aequilibrae/c/y_q9nLNs6-Y/m/yWXNpey9AAAJ

* Style

* Skip if network fails

* Rounding

* fixes test

* Revert "Rounding"

Vatican City really did move huh

This reverts commit d9d0a5d.

---------

Co-authored-by: pveigadecamargo <pveigadecamargo@anl.gov>

* Update README.md (#439)

* Add implicit noexcept from Cython<3.0.0 (#440)

With the release of Cython 3.0.0 there are a few changes of note for us, particularly the performance impact of the
removal of the implicit noexcept.

All cdef functions now allow exceptions by default, meaning every cdef must acquire the GIL at the end of the function
regardless of whether it was declared nogil. Adding the noexcept clause restores the old behaviour.

There is a compiler directive to restore the old behaviour globally, but it is better to be explicit when the solution is one regex away:
`^(cp?def(?:.|\n).*?)(nogil|):$`

https://cython.readthedocs.io/en/latest/src/userguide/migrating_to_cy30.html#exception-values-and-noexcept
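The regex-based migration mentioned above can be sketched in plain Python. This is a simplified pattern (the original handles the alternation slightly differently), shown only to illustrate appending `noexcept` while keeping any trailing `nogil`:

```python
import re

# Simplified sketch of the migration regex above: append "noexcept" to
# cdef/cpdef signatures, preserving a trailing "nogil" clause.
# Assumes signatures end with ":" at end of line; multi-line signatures
# are crossed via DOTALL.
pattern = re.compile(r"^(cp?def.*?)\s*(nogil)?:$", re.MULTILINE | re.DOTALL)

def add_noexcept(src: str) -> str:
    def repl(m: re.Match) -> str:
        tail = " noexcept"
        if m.group(2):
            tail += " " + m.group(2)
        return m.group(1) + tail + ":"
    return pattern.sub(repl, src)

before = (
    "cpdef void _total_prods(double[:, :] flows,\n"
    "                        int cpus) nogil:"
)
print(add_noexcept(before))
```

Run against a signature like `_total_prods` above, this yields the `noexcept nogil:` form seen throughout the diff below.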

* Disable select link (#443)

* Fix empty matrices not being saved (#444)

Introduced in 7c2beb6

* Matrix exports and deprecation warning (#435)

* Bumps up version

* Bumps up documentation version

* fixes matrix export

* fixes deprecations

* addresses SciPy versions

* addresses SciPy versions

* addresses SciPy versions

* addresses SciPy versions

* addresses SciPy versions

* fixes test

* fixes test

---------

Co-authored-by: pveigadecamargo <pveigadecamargo@anl.gov>

* Select link correctness fix (#447)

* Revert "Disable select link (#443)"

This reverts commit 0cd3d48.

* Add Kai Tang's test and data

* Potential select link fix

* Test formatting

* Fix tests imports

* Add select link test

This test asserts that the results of the select link on links 7 and 13 are the same as the results of the
assignment. These links were chosen for this particular network to cover all paths used.

* Prevent data races in select link results

Memory for the multi-threaded runs is now allocated in MultiThreadedAoN alongside the rest of the multi-threaded memory.

* Installs package to run documentation pipeline

* installing

* Install pandoc

---------

Co-authored-by: Pedro Camargo <c@margo.co>
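The data-race fix described above follows a common pattern: give each thread its own preallocated slice of the results array and reduce afterwards. A minimal sketch of the idea (names are illustrative, not the actual MultiThreadedAoN code):

```python
import threading

import numpy as np

# Each thread accumulates only into its own row of a preallocated
# per-thread buffer, so no two threads ever write the same memory.
# Results are summed across threads once all workers have joined.
def parallel_accumulate(contributions: np.ndarray, num_threads: int = 4) -> np.ndarray:
    per_thread = np.zeros((num_threads, contributions.shape[1]))

    def worker(t: int) -> None:
        # thread t only touches per_thread[t] -> no data race
        for row in contributions[t::num_threads]:
            per_thread[t] += row

    threads = [threading.Thread(target=worker, args=(t,)) for t in range(num_threads)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return per_thread.sum(axis=0)
```

The reduction at the end trades a little memory (one buffer per thread) for lock-free accumulation during the parallel phase.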

* bumps up the version (#451)

Co-authored-by: pveigadecamargo <pveigadecamargo@anl.gov>

* new version

---------

Co-authored-by: Renata Imai <53949163+r-akemii@users.noreply.github.com>
Co-authored-by: pveigadecamargo <pveigadecamargo@anl.gov>
Co-authored-by: PelleK <elfjes@users.noreply.github.com>
Co-authored-by: Pelle Koster <pelle.koster@nginfra.nl>
Co-authored-by: Jamie Cook <jimi.cook@gmail.com>
Co-authored-by: Jake Moss <jake.moss@uqconnect.edu.au>
Co-authored-by: Jamie Cook <jamie.cook@veitchlister.com.au>
8 people committed Oct 9, 2023
1 parent fe3f307 commit c5ee6d6
Showing 28 changed files with 190 additions and 110 deletions.
3 changes: 3 additions & 0 deletions .github/qgis_requirements.py
@@ -15,4 +15,7 @@ def replace_in_file(file_path, text_orig, suffix):
replace_in_file("../requirements.txt", "pandas", "<1.2")
replace_in_file("../pyproject.toml", "pandas", "<1.2")

replace_in_file("../requirements.txt", "scipy", "<1.11")
replace_in_file("../pyproject.toml", "scipy", "<1.11")

replace_in_file("../__version__.py", "{minor_version}", ".dev0")
3 changes: 2 additions & 1 deletion .github/workflows/documentation.yml
@@ -29,12 +29,13 @@ jobs:
pip install -r docs/requirements-docs.txt
python -m pip install sphinx-gallery --user
sudo apt update
sudo apt install -y --fix-missing libsqlite3-mod-spatialite libspatialite-dev
sudo apt install -y --fix-missing libsqlite3-mod-spatialite libspatialite-dev pandoc
sudo ln -s /usr/lib/x86_64-linux-gnu/mod_spatialite.so /usr/lib/x86_64-linux-gnu/mod_spatialite
- name: Compile library
run: |
python setup.py build_ext --inplace
pip install .
- name: Check history of versions
run: |
4 changes: 2 additions & 2 deletions README.md
@@ -15,12 +15,12 @@
[![QAequilibraE artifacts](https://github.com/AequilibraE/aequilibrae/actions/workflows/build_artifacts_qgis.yml/badge.svg)](https://github.com/AequilibraE/aequilibrae/actions/workflows/build_artifacts_qgis.yml)


AequilibraE is the first comprehensive Python package for transportation modeling, and it aims to provide all the
AequilibraE is the first comprehensive Python package for transportation modeling. It aims to provide all the
resources not available from other open-source packages in the Python (NumPy, really) ecosystem.

## Comprehensive documentation

[AequilibraE documentation built with Sphinx ](http://www.aequilibrae.com>)
[AequilibraE documentation built with Sphinx ](http://www.aequilibrae.com)

## What is available

2 changes: 1 addition & 1 deletion __version__.py
@@ -1,5 +1,5 @@
version = 0.9
minor_version = "3"
minor_version = "4"
release_name = "Queluz"

release_version = f"{version}.{minor_version}"
10 changes: 5 additions & 5 deletions aequilibrae/distribution/ipf_core.pyx
@@ -56,7 +56,7 @@ cdef _fratar(double[:, :] flows,
double[:] attr_factor,
int max_iter,
double toler,
int cpus):
int cpus) noexcept:

cdef double err = 1.0
cdef int iter = 0
@@ -99,7 +99,7 @@ cdef _fratar(double[:, :] flows,
cpdef void _total_attra(double[:, :] flows,
double[:] prod_tgt,
double[:] attr_tot,
int cpus):
int cpus) noexcept:

cdef long long i, j, jk
cdef double *local_buf
@@ -133,7 +133,7 @@ cpdef void _total_attra(double[:, :] flows,
cpdef void _total_prods(double[:, :] flows,
double[:] prod_tgt,
double[:] prod_tot,
int cpus)nogil:
int cpus) noexcept nogil:

cdef long long i, j
cdef long long I = flows.shape[0]
@@ -154,7 +154,7 @@ cpdef void _total_prods(double[:, :] flows,
cpdef double _factors(double[:] target,
double[:] total,
double[:] factor,
int cpus):
int cpus) noexcept:

cdef long long i, I = target.shape[0]
cdef double err = 1.0
@@ -174,7 +174,7 @@ cpdef double _factors(double[:] target,
@cython.embedsignature(True)
@cython.boundscheck(False)
cpdef double _calc_err(double[:] p_factor,
double[:] a_factor):
double[:] a_factor) noexcept:

cdef long long i, I = p_factor.shape[0]
cdef long long j, J = a_factor.shape[0]
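The Cython kernels in ipf_core.pyx implement Fratar/IPF balancing. A plain-NumPy sketch of the same procedure (variable names follow the kernels above, but this is an illustration, not the library code):

```python
import numpy as np

# Iterative proportional fitting: alternately scale rows to match
# production targets and columns to match attraction targets, until
# the row-sum error drops below the tolerance.
def ipf(seed: np.ndarray, prod_tgt: np.ndarray, attr_tgt: np.ndarray,
        max_iter: int = 100, toler: float = 1e-6) -> np.ndarray:
    flows = seed.astype(float).copy()
    for _ in range(max_iter):
        flows *= (prod_tgt / flows.sum(axis=1))[:, None]   # balance rows
        flows *= (attr_tgt / flows.sum(axis=0))[None, :]   # balance columns
        err = np.abs(flows.sum(axis=1) - prod_tgt).max()
        if err < toler:
            break
    return flows
```

The Cython version parallelises the row and column totals (`_total_prods`, `_total_attra`) and the factor computation (`_factors`) across cores; the arithmetic is the same.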
4 changes: 4 additions & 0 deletions aequilibrae/matrix/aequilibrae_data.py
@@ -106,6 +106,10 @@ def create_empty(
# raise ValueError('Data types need to be Python or Numpy data types')

for field in self.fields:
if not type(field) is str:
raise TypeError(field + " is not a string. You cannot use it as a field name")
if not field.isidentifier():
raise Exception(field + " is a not a valid identifier name. You cannot use it as a field name")
if field in object.__dict__:
raise Exception(field + " is a reserved name. You cannot use it as a field name")

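The field-name validation added to aequilibrae_data.py above leans on `str.isidentifier()`. A standalone sketch of the same checks (function name and message wording are illustrative, not the exact AequilibraE code):

```python
# Reject field names that are not strings, are not valid Python
# identifiers, or shadow reserved object attributes.
def validate_field_names(fields) -> None:
    for field in fields:
        if not isinstance(field, str):
            raise TypeError(f"{field!r} is not a string. You cannot use it as a field name")
        if not field.isidentifier():
            raise ValueError(f"{field} is not a valid identifier. You cannot use it as a field name")
        if field in object.__dict__:
            raise ValueError(f"{field} is a reserved name. You cannot use it as a field name")

validate_field_names(["speed", "free_flow_time"])  # valid names pass silently
```

Names like `2fast` or `link-id` would previously have been accepted and only failed later, as attribute access.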
25 changes: 16 additions & 9 deletions aequilibrae/matrix/aequilibrae_matrix.py
@@ -3,6 +3,7 @@
import os
import tempfile
import uuid
import warnings
from functools import reduce
from typing import List

@@ -399,7 +400,9 @@ def create_from_trip_list(self, path_to_file: str, from_column: str, to_column:

new_mat = AequilibraeMatrix()
nb_of_zones = len(zones_list)
new_mat.create_empty(file_name=path_to_file[:-4] + ".aem", zones=nb_of_zones, matrix_names=list_cores)
new_mat.create_empty(
file_name=path_to_file[:-4] + ".aem", zones=nb_of_zones, matrix_names=list_cores, memory_only=False
)

for idx, core in enumerate(list_cores):
m = (
@@ -677,6 +680,9 @@ def __flush(self, obj_to_flush: np.memmap):

def __getattr__(self, mat_name: str):
if mat_name in object.__dict__:
if mat_name == "matrix" and self.__omx:
warnings.warn("You can't access OMX matrix cores like that")
return
return self.__dict__[mat_name]

if mat_name in self.names:
@@ -748,18 +754,14 @@ def export(self, output_name: str, cores: List[str] = None):
>>> mat2.cores
2
"""

if self.__omx:
raise NotImplementedError("This operation does not make sense for OMX matrices")

fname, file_extension = os.path.splitext(output_name.upper())

if file_extension == ".OMX":
if not has_omx:
raise ValueError("Open Matrix is not installed. Cannot continue")

if file_extension not in [".AEM", ".CSV", ".OMX"]:
raise ValueError("File extension {} not implemented yet".format(file_extension))
raise NotImplementedError(f"File extension {file_extension} not implemented yet")

if cores is None:
cores = self.names
@@ -770,16 +772,21 @@ def export(self, output_name: str, cores: List[str] = None):
elif file_extension == ".OMX":
omx_export = omx.open_file(output_name, "w")
for c in cores:
omx_export[c] = self.matrix[c]

if self.__omx:
omx_export[c] = np.array(self.omx_file[c])
else:
omx_export[c] = self.matrix[c]
for i, idx in enumerate(self.index_names):
omx_export.create_mapping(idx, self.indices[:, i])
omx_export.close()

elif file_extension == ".CSV":

def f(name):
coo = coo_matrix(self.matrix[name])
if self.__omx:
coo = np.array(self.omx_file[name])
else:
coo = coo_matrix(self.matrix[name])
data = {"row": self.index[coo.row], "column": self.index[coo.col], name: coo.data}
return pd.DataFrame(data).set_index(["row", "column"])

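The CSV branch of the export above converts each matrix core to COO triplets and writes them in long (row, column, value) format. A minimal standalone sketch (the zone ids and core name are illustrative):

```python
import numpy as np
import pandas as pd
from scipy.sparse import coo_matrix

# A dense demand matrix indexed by zone ids; only nonzero cells are
# exported, via scipy's COO (coordinate) representation.
index = np.array([100, 200, 300])  # illustrative zone ids
mat = np.array([[0.0, 5.0, 0.0],
                [0.0, 0.0, 2.0],
                [0.0, 0.0, 0.0]])

coo = coo_matrix(mat)
df = pd.DataFrame(
    {"row": index[coo.row], "column": index[coo.col], "demand": coo.data}
).set_index(["row", "column"])
print(df)
```

For OMX-backed matrices the fix above reads the core with `np.array(self.omx_file[c])` first, since memory-mapped OMX cores cannot be sliced like the in-memory `matrix` dict.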
7 changes: 3 additions & 4 deletions aequilibrae/paths/AoN.pyx
@@ -97,11 +97,10 @@ def one_to_all(origin, matrix, graph, result, aux_result, curr_thread):
bint select_link = False

if result._selected_links:

has_flow_mask = aux_result.has_flow_mask[curr_thread, :]
sl_od_matrix_view = aux_result.temp_sl_od_matrix[:, origin_index, :, :]
sl_link_loading_view = aux_result.temp_sl_link_loading[:, :, :]
link_list = aux_result.select_links[:, :]
sl_od_matrix_view = aux_result.temp_sl_od_matrix[curr_thread, :, origin_index, :, :]
sl_link_loading_view = aux_result.temp_sl_link_loading[curr_thread, :, :, :]
link_list = aux_result.select_links[:, :] # Read only, don't need to slice on curr_thread
select_link = True
#Now we do all procedures with NO GIL
with nogil:
18 changes: 9 additions & 9 deletions aequilibrae/paths/basic_path_finding.pyx
@@ -25,7 +25,7 @@ cpdef void network_loading(long classes,
long long [:] no_path,
long long [:] reached_first,
double [:, :] node_load,
long found) nogil:
long found) noexcept nogil:

cdef long long i, j, predecessor, connector, node
cdef long long zones = demand.shape[0]
@@ -70,7 +70,7 @@ cpdef void network_loading(long classes,
@cython.embedsignature(True)
@cython.boundscheck(False)
cdef void _copy_skims(double[:,:] skim_matrix, #Skim matrix_procedures computed from one origin to all nodes
double[:,:] final_skim_matrix) nogil: #Skim matrix_procedures computed for one origin to all other centroids only
double[:,:] final_skim_matrix) noexcept nogil: #Skim matrix_procedures computed for one origin to all other centroids only

cdef long i, j
cdef long N = final_skim_matrix.shape[0]
@@ -81,7 +81,7 @@ cdef void _copy_skims(double[:,:] skim_matrix, #Skim matrix_procedures computed
final_skim_matrix[i,j]=skim_matrix[i,j]


cdef return_an_int_view(input):
cdef int[:] return_an_int_view(input) noexcept nogil:
cdef int [:] critical_links_view = input
return critical_links_view

@@ -97,7 +97,7 @@ cdef void sl_network_loading(
double [:, :, :] sl_od_matrix,
double [:, :, :] sl_link_loading,
unsigned char [:] has_flow_mask,
long classes) nogil:
long classes) noexcept nogil:
# VARIABLES:
# selected_links: 2d memoryview. Each row corresponds to a set of selected links specified by the user
# demand: The input demand matrix for a given origin. The first index corresponds to destination,
@@ -169,7 +169,7 @@ cpdef void put_path_file_on_disk(unsigned int orig,
long long [:] connectors,
long long [:] all_nodes,
unsigned int [:] origins_to_write,
unsigned int [:] nodes_to_write) nogil:
unsigned int [:] nodes_to_write) noexcept nogil:
cdef long long i
cdef long long k = pred.shape[0]

@@ -188,7 +188,7 @@ cdef void blocking_centroid_flows(int action,
long long centroids,
long long [:] fs,
long long [:] temp_b_nodes,
long long [:] real_b_nodes) nogil:
long long [:] real_b_nodes) noexcept nogil:
cdef long long i

if action == 1: # We are unblocking
@@ -213,7 +213,7 @@ cdef void skim_single_path(long origin,
long long [:] conn,
double[:, :] graph_costs,
long long [:] reached_first,
long found) nogil:
long found) noexcept nogil:
cdef long long i, node, predecessor, connector, j

# sets all skims to infinity
@@ -250,7 +250,7 @@ cpdef void skim_multiple_fields(long origin,
double[:, :] graph_costs,
long long [:] reached_first,
long found,
double [:,:] final_skims) nogil:
double [:,:] final_skims) noexcept nogil:
cdef long long i, node, predecessor, connector, j

# sets all skims to infinity
@@ -295,7 +295,7 @@ cpdef int path_finding(long origin,
long long [:] pred,
long long [:] ids,
long long [:] connectors,
long long [:] reached_first) nogil:
long long [:] reached_first) noexcept nogil:

cdef unsigned int N = graph_costs.shape[0]
cdef unsigned int M = pred.shape[0]
6 changes: 3 additions & 3 deletions aequilibrae/paths/bpr.pyx
@@ -34,7 +34,7 @@ cpdef void bpr_cython(double[:] congested_time,
double [:] fftime,
double[:] alpha,
double [:] beta,
int cores):
int cores) noexcept:
cdef long long i
cdef long long l = congested_time.shape[0]

@@ -53,12 +53,12 @@ cpdef void dbpr_cython(double[:] deltaresult,
double [:] fftime,
double[:] alpha,
double [:] beta,
int cores):
int cores) noexcept:
cdef long long i
cdef long long l = deltaresult.shape[0]

for i in prange(l, nogil=True, num_threads=cores):
if link_flows[i] > 0:
deltaresult[i] = fftime[i] * (alpha[i] * beta[i] * (pow(link_flows[i] / capacity[i], beta[i]-1)))/ capacity[i]
else:
deltaresult[i] = fftime[i]
deltaresult[i] = fftime[i]
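The kernels in bpr.pyx implement the standard BPR volume-delay function and its derivative. A NumPy sketch of the formulas the loops above compute (this is an illustration of the math, not the Cython code itself):

```python
import numpy as np

# BPR congested travel time: t = t0 * (1 + alpha * (v/c) ** beta)
def bpr(fftime, flows, capacity, alpha, beta):
    return fftime * (1.0 + alpha * (flows / capacity) ** beta)

# Its derivative with respect to flow, mirroring dbpr_cython above,
# including the fallback to free-flow time at zero flow.
def dbpr(fftime, flows, capacity, alpha, beta):
    return np.where(
        np.asarray(flows) > 0,
        fftime * alpha * beta * (np.asarray(flows) / capacity) ** (beta - 1) / capacity,
        fftime,
    )
```

With the common parameters alpha = 0.15 and beta = 4, a link at capacity (v/c = 1) takes 15% longer than free flow.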
4 changes: 2 additions & 2 deletions aequilibrae/paths/bpr2.pyx
@@ -34,7 +34,7 @@ cpdef void bpr2_cython(double[:] congested_time,
double [:] fftime,
double[:] alpha,
double [:] beta,
int cores):
int cores) noexcept:
cdef long long i
cdef long long l = congested_time.shape[0]

@@ -58,7 +58,7 @@ cpdef void dbpr2_cython(double[:] deltaresult,
double [:] fftime,
double[:] alpha,
double [:] beta,
int cores):
int cores) noexcept:
cdef long long i
cdef long long l = deltaresult.shape[0]

4 changes: 2 additions & 2 deletions aequilibrae/paths/conical.pyx
@@ -34,7 +34,7 @@ cpdef void conical_cython(double[:] congested_time,
double [:] fftime,
double[:] alpha,
double [:] beta,
int cores):
int cores) noexcept:
cdef long long i
cdef long long l = congested_time.shape[0]

@@ -57,7 +57,7 @@ cpdef void dconical_cython(double[:] deltaresult,
double [:] fftime,
double[:] alpha,
double [:] beta,
int cores):
int cores) noexcept:
cdef long long i
cdef long long l = deltaresult.shape[0]

6 changes: 3 additions & 3 deletions aequilibrae/paths/graph.py
@@ -192,8 +192,8 @@ def __build_directed_graph(self, network: pd.DataFrame, centroids: np.ndarray):
nlist = np.arange(num_nodes)
nodes_to_indices[all_nodes] = nlist

df.loc[:, "a_node"] = nodes_to_indices[df.a_node.values][:]
df.loc[:, "b_node"] = nodes_to_indices[df.b_node.values][:]
df.a_node = nodes_to_indices[df.a_node.values]
df.b_node = nodes_to_indices[df.b_node.values]
df = df.sort_values(by=["a_node", "b_node"])
df.index = np.arange(df.shape[0])
df["id"] = np.arange(df.shape[0])
@@ -321,7 +321,7 @@ def set_skimming(self, skim_fields: list) -> None:
raise ValueError("At least one of the skim fields does not exist in the graph: {}".format(",".join(k)))

self.compact_skims = np.zeros((self.compact_num_links + 1, len(skim_fields) + 1), self.__float_type)
df = self.__graph_groupby.sum()[skim_fields].reset_index()
df = self.__graph_groupby.sum(numeric_only=True)[skim_fields].reset_index()
for i, skm in enumerate(skim_fields):
self.compact_skims[df.index.values, i] = df[skm].values.astype(self.__float_type)

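The `numeric_only=True` change in graph.py above matters because newer pandas no longer silently drops non-numeric columns from a groupby sum. A minimal illustration (assumed data, pandas >= 1.5 where the keyword is available):

```python
import pandas as pd

# A mix of numeric and string columns, grouped and summed.
df = pd.DataFrame({
    "g": [1, 1, 2],          # group key
    "w": [1.0, 2.0, 3.0],    # numeric skim field
    "name": ["a", "b", "c"], # non-numeric metadata column
})

# numeric_only=True restricts the aggregation to numeric columns,
# avoiding the string column being concatenated or raising.
sums = df.groupby("g").sum(numeric_only=True)
print(sums["w"].tolist())
```

Without the keyword, recent pandas either concatenates the strings or errors out depending on version and dtype, so being explicit keeps the skim aggregation stable across pandas releases.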
12 changes: 6 additions & 6 deletions aequilibrae/paths/graph_building.pyx
@@ -19,7 +19,7 @@ cdef long long _build_compressed_graph(long long[:] link_idx,
long long[:] all_links,
long long[:] compressed_dir,
long long[:] compressed_a_node,
long long[:] compressed_b_node) nogil:
long long[:] compressed_b_node) noexcept nogil:
cdef:
long long slink = 0
long long pre_link, n, first_node, lnk, lidx, a_node, b_node
@@ -93,7 +93,7 @@ cdef long long _build_compressed_graph(long long[:] link_idx,
@cython.wraparound(False)
@cython.embedsignature(True)
@cython.boundscheck(False)
cdef void _back_fill(long long[:] links_index, long long max_node):
cdef void _back_fill(long long[:] links_index, long long max_node) noexcept:
cdef Py_ssize_t i

for i in range(max_node + 1, 0, -1):
@@ -172,7 +172,7 @@ def build_compressed_graph(graph):
}
)
max_link_id = link_id_max * 10
comp_lnk.loc[:, "link_id"] += max_link_id
comp_lnk.link_id += max_link_id

df = pd.concat([df, comp_lnk])
df = df[["id", "link_id", "a_node", "b_node", "direction"]]
@@ -193,11 +193,11 @@ def build_compressed_graph(graph):
)

crosswalk = crosswalk[crosswalk.compressed_link >= 0]
crosswalk.loc[:, "compressed_link"] += max_link_id
crosswalk.compressed_link += max_link_id

cw2 = pd.DataFrame(crosswalk, copy=True)
cw2.loc[:, "link_direction"] *= -1
cw2.loc[:, "compressed_direction"] = -1
cw2.link_direction *= -1
cw2.compressed_direction = -1

crosswalk = pd.concat([crosswalk, cw2])
crosswalk = crosswalk.assign(key=crosswalk.compressed_link * crosswalk.compressed_direction)
