Centrality betweenness in Sage #18137
Comments
Branch: u/ncohen/18137
Commit:
comment:2
Hello, I have only small remarks:
comment:3
Hello,
It would not be much faster, because most of what this array would contain is zeroes (a bint takes up an int's space in memory). Plus, the bottleneck in this case is the floating-point computation.
There is one, isn't there? In
I don't think that it is worth it. Saving a linear number of multiplications after all this work, really?
Done.
Done.
They are in the same folder, so it works. It is even more robust this way: we can move the files wherever we want and the path does not change. I also changed a Cython flag that checks for exceptions when doing float divisions. Nathann
Branch pushed to git repo; I updated commit sha1. New commits:
comment:5
There is numerical noise; add a tolerance, see the patchbot report.
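Sage's doctest framework handles exactly this with tolerance markers (`# abs tol` / `# rel tol`) on the expected output. The underlying idea is just a toleranced comparison instead of exact equality, which can be sketched in plain Python:

```python
import math

# 0.1 + 0.2 is not exactly representable in binary floating point,
# so an exact equality test against 0.3 fails...
noisy = 0.1 + 0.2
print(noisy == 0.3)   # False

# ...while a comparison with an absolute tolerance absorbs the noise,
# which is what a doctest marker like "# abs tol 1e-10" does for us.
print(math.isclose(noisy, 0.3, abs_tol=1e-10))   # True
```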
Branch pushed to git repo; I updated commit sha1. New commits:
comment:8
Another thing the patchbot saved us from. Thanks, Nathann
comment:10
The advantage of using rationals was that the computation is exact! Here you are using floats without any guarantee on the result, aren't you? Do you have an estimate of the error as a function of the number of vertices/edges? One solution would be to use ball arithmetic, which also produces a bound on the error (see the recently added arb package). Or interval arithmetic (but that is slower). Vincent
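The exactness point is easy to demonstrate with Python's standard `fractions` module (a stand-in here for Sage's rationals, purely for illustration): the same accumulation stays exact with rationals while the float version drifts.

```python
from fractions import Fraction

# Add 1/10 ten times, once with binary floats and once with exact rationals.
float_total = 0.0
exact_total = Fraction(0)
for _ in range(10):
    float_total += 0.1              # each 0.1 carries a tiny representation error
    exact_total += Fraction(1, 10)  # exact rational arithmetic, no rounding

print(float_total == 1.0)   # False: the rounding errors accumulate
print(exact_total == 1)     # True: rationals give the exact answer
```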
comment:11
Although Nathann would prefer not to, we could have two versions of the code: the fast one as the default, and a slower exact one.
comment:12
Hello,
And this is the very reason why I wrote both implementations. I am not so sure that it is a very big problem, however, as the algorithm will not add noise to noise the way it can happen in PDE computations. The current version of
I wanted to check how Boost does it, but I was not able to locate the source code (God, how can anyone read those files???). (15 minutes later) Here it is! Line 338 of:

So the answer is "it depends on dependency_type", which is... a template. For igraph it is apparently a double too. For graph-tool (the last of the libraries compared on the link in the ticket's description) it is apparently a double too, though I can't make sure, for I do not find the https://git.skewed.de/count0/graph-tool/blob/master/src/graph_tool/centrality/__init__.py#L326

Sooooooo please don't just limit your argumentation to "not exact = BAD". I care about this, and for this reason I implemented both (which definitely took more than a couple of minutes, as you can imagine), but I do believe that for this kind of computation working on floats is not that bad, for I know when the divisions occur and, well, we do not mind much.

I would personally be very happy to have both in Sage, with an easy flag to switch from one implementation to the other. If you check out the first of my commits, you will see that only one variable needs to be changed for the doubles to become rationals. My trouble is that using Cython's preprocessor instructions requires to run

I would also like NOT to have the same code copy/pasted twice, and not to pay for an 'if' inside the loops. I would be happy to have both if there is a free (in terms of computation) way to handle both at once, and a cheap (in terms of lines of code) way to have both. So far I have not found a way out, and I thought that the best was to have what everybody seems interested in: computations on doubles (we can also turn them into 'long double' if necessary).

Nathann

P.S.: I uploaded a commit with both versions, so that the rational implementation will be available somewhere (and not on my computer only) if we ever need it. I did that on purpose, to have it archived somewhere.
comment:13
Replying to @nathanncohen:
Do not oversimplify. My argument was "not exact => extra care needed". Floats are wonderful because they are very fast.
Would be interesting to investigate (experimentally) the error propagation.
Merely summing (a lot of) floating-point numbers already creates problems. A simple (not so dramatic) example:
If you mix that with divisions, it is of course even worse.
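A concrete instance of the summation problem, with Python doubles standing in for the C doubles in the Cython code (an illustration, not the ticket's code): a naive left-to-right sum of a million copies of 0.1 misses the true value, while a compensated summation such as `math.fsum` recovers it.

```python
import math

values = [0.1] * 10**6   # the exact sum should be 100000

naive = 0.0
for v in values:         # left-to-right accumulation: errors pile up
    naive += v

compensated = math.fsum(values)   # Shewchuk-style compensated summation

print(naive == 100000.0)          # False: accumulated rounding error
print(compensated == 100000.0)    # True: correct to the last bit
```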
I also believe so, but it would be better if we were sure and the documentation mentioned it. Something like: if you do have a graph with

Vincent
comment:14
Hello! I agree that float operations introduce errors, but I do not know how to evaluate them. I expect the relative error to stay very, very small in those cases, and in the graphs that are of interest to the networks community. Would you know a trick to have both implementations available in the code (without recompilation)? I do not think that we can have 'real templates' in Cython, can we? Nathann
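For the record, Cython does have an analogue of C++ templates: fused types (`ctypedef fused`), which compile one function body once per listed C type. The pure-Python counterpart of that idea is plain duck typing, sketched here with hypothetical names: the same body serves both floats and exact rationals.

```python
from fractions import Fraction

def mean(values):
    # One generic body: anything supporting +, / and len() works here.
    # In Cython the "write once, compile for each numeric type" tool
    # is a fused type covering e.g. double and a rational wrapper.
    total = values[0]
    for v in values[1:]:
        total = total + v
    return total / len(values)

print(mean([0.1, 0.2, 0.3]))   # a float, with the usual rounding
print(mean([Fraction(1, 10), Fraction(2, 10), Fraction(3, 10)]))  # exactly 1/5
```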
comment:15
Okay. Here it is. It cost me the last four hours. Nathann |
Branch pushed to git repo; I updated commit sha1. This was a forced push. New commits:
comment:17
Replying to @nathanncohen:
Youhou! You have initiated me into the world of Cython templating! I am having a careful look right now.
comment:44
Sorry, this is the "mistake due to lack of experience". I thought "positive review" meant that I was happy with the code, but now I understand it is much more. I think it's better to leave this issue to more experienced people. |
comment:45
No proooooob!!! If you have some spare time you can read our manual a bit. Reviewing a ticket is not very complicated, and the 'technical checks' do not take more than a couple of minutes once you get used to them. And of course you can ask us any question if the manual isn't clear. Nathann
comment:46
One issue:
Did you know that your code also works for multigraphs?
Branch pushed to git repo; I updated commit sha1. New commits:
comment:48
Right. Fixed.
Yeah, that was good news! At some point I wondered whether I should add a 'scream if not simple' check somewhere, then figured out that it worked fine. It also extends the definition in the most natural way, i.e. by considering a path as a set of edges instead of a set of vertices. And it also works for loops. Nathann
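For readers landing here: the quantity under discussion is betweenness centrality, i.e. for each vertex v the sum, over pairs (s, t), of the fraction of shortest s-t paths passing through v. It can be sanity-checked against a tiny brute-force reference (illustrative names, undirected simple graphs only, unnormalized, each unordered pair counted once, exact arithmetic via `Fraction` in the spirit of the rational implementation):

```python
from collections import deque
from fractions import Fraction

def shortest_path_counts(adj, s):
    # BFS from s: distance to, and number of shortest paths reaching,
    # every vertex (Brandes-style sigma counting).
    dist, sigma = {s: 0}, {s: 1}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w], sigma[w] = dist[u] + 1, 0
                queue.append(w)
            if dist[w] == dist[u] + 1:
                sigma[w] += sigma[u]
    return dist, sigma

def betweenness(adj):
    # Unnormalized betweenness, each unordered pair {s, t} counted once.
    nodes = list(adj)
    info = {s: shortest_path_counts(adj, s) for s in nodes}
    bc = {v: Fraction(0) for v in nodes}
    for i, s in enumerate(nodes):
        dist_s, sigma_s = info[s]
        for t in nodes[i + 1:]:
            if t not in dist_s:
                continue  # t unreachable from s
            dist_t, sigma_t = info[t]
            for v in nodes:
                if v in (s, t) or v not in dist_s or v not in dist_t:
                    continue
                # v lies on a shortest s-t path iff the distances add up.
                if dist_s[v] + dist_t[v] == dist_s[t]:
                    bc[v] += Fraction(sigma_s[v] * sigma_t[v], sigma_s[t])
    return bc

# Path graph 0-1-2: every shortest 0-2 path passes through 1.
path = {0: [1], 1: [0, 2], 2: [1]}
print(betweenness(path)[1])   # 1
```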
Reviewer: David Coudert
comment:49
Good.
comment:50
I'm getting this on 32-bit. You should probably add an
Branch pushed to git repo; I updated commit sha1. New commits:
Branch pushed to git repo; I updated commit sha1 and set ticket back to needs_review. This was a forced push. New commits:
Changed branch from public/18137 to |
I hate that we do not appear in comparisons like the following, just because we are slower than the worst library
:-P
http://graph-tool.skewed.de/performance
With this branch we can compute the betweenness centrality in Sage with a decent speed.
Nathann
P.S.: The version of the code that deals with rationals instead of floats has been removed, because it is much slower (60x in some cases), and because I did not see how to make the two coexist without copy/pasting most of the code.
CC: @dcoudert @sagetrac-borassi
Component: graph theory
Author: Nathann Cohen
Branch/Commit:
2db68fb
Reviewer: David Coudert
Issue created by migration from https://trac.sagemath.org/ticket/18137