Exact Algorithm for Diameters of Large Real Directed Graphs #29309
Comments
comment:3
A few comments:
comment:5
Thank you so much for your input. I have taken care of points 2, 3, and 5 (I wasn't aware of the existence of a function like bitwise_next in O(1), thank you!). Regarding the 6th point (sorry for the typo), the comment refers to the fact that the BFS should be done only on the SCC of the considered node, with the edges inverted, in order to obtain the backward distances. Temporarily I had used the given graph because I was considering G as an undirected graph. The comment was only there to remind me to find an efficient way to perform the BFS on the desired graph.
Branch pushed to git repo; I updated commit sha1.
comment:7
Replying to @dcoudert:
About the 6th point, as I commented before, the problem is that I have to consider the graph of reversed edges restricted to the vertices of the SCC. One way to do this is to create these graphs explicitly, but that seems very inefficient. One solution I've thought of is to create only the reverse graph using init_reverse, which is already implemented, and to run the BFS while adding only vertices of the same SCC to the queue. The current implementation of simple_BFS does not allow this, but with a couple of very simple changes, adding an optional list of components to the arguments of the function, it could. I am hesitant about this path because I would then have to fix all calls to this function. Can I get your input on this one? Is it preferred not to change the arguments of already implemented functions? Perhaps I should just switch to another graph structure which allows these operations, although it is a bit less efficient?
Branch pushed to git repo; I updated commit sha1.
Branch pushed to git repo; I updated commit sha1.
Branch pushed to git repo; I updated commit sha1.
Author: João Tavares
comment:13
You must add doctests for your algorithms. Also, benchmarks would be welcome.
Branch pushed to git repo; I updated commit sha1.
comment:15
I have been experimenting, and it seems this method is more efficient when the graph is sparse.
Branch pushed to git repo; I updated commit sha1.
comment:19
FYI, #29346 also proposes an implementation of the same algorithm. The authors could join forces to get better code.
comment:20
Replying to @dcoudert:
#29346 implements the same algorithm with a modified definition of diameter, where we consider only the maximum finite eccentricities. It seems counterintuitive to have different algorithms giving different answers.
comment:21
Maybe with proper documentation and an appropriate parameter, the same method could be used for both definitions, no?
comment:22
Replying to @dcoudert:
Absolutely, but in that case the implementation over at #29346 should be used; mine has a lot of simplifications due to the restriction to connected graphs.
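As a sketch of how one parameter could switch between the two definitions being discussed (a naive all-pairs BFS illustration, not the iFUB-based code in either ticket; `diameter` and `finite_only` are hypothetical names):

```python
from collections import deque

def diameter(adj, finite_only=False):
    """Diameter of a digraph by BFS from every vertex (illustration only).

    With ``finite_only=True``, unreachable pairs are ignored and the
    largest finite eccentricity is returned (the definition used in
    #29346); otherwise a digraph that is not strongly connected has
    infinite diameter.
    """
    best = 0
    n = len(adj)
    for s in adj:
        # Forward BFS from s gives the distances d(s, v).
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        if len(dist) < n and not finite_only:
            # Some vertex is unreachable from s.
            return float('inf')
        best = max(best, max(dist.values()))
    return best
```

Under this scheme, the two tickets' answers differ only in the value of the flag, which keeps a single documented entry point for both definitions.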
comment:23
Batch modifying tickets that will likely not be ready for 9.1, based on a review of the ticket title, branch/review status, and last modification date.
comment:24
red flag |
comment:26
Setting new milestone based on a cursory review of ticket status, priority, and last modification date. |
comment:27
Setting a new milestone for this ticket based on a cursory review. |
Adds an algorithm for computing the diameter of directed graphs, as described in https://doi.org/10.1007/978-3-319-20086-6_5
Component: graph theory
Keywords: diameter
Author: João Tavares
Branch/Commit: u/gh-tabus/exact_algorithm_for_diameters_of_large_real_directed_graphs @ a79cfd1
Issue created by migration from https://trac.sagemath.org/ticket/29309