Build vroom as library #42

Closed

frodrigo opened this issue Sep 10, 2016 · 30 comments

Comments

@frodrigo
Contributor

It would be interesting to build vroom as a library containing only the solver, and to have the main, tsplib and osrm loaders in a command-line tool linked against the lib.

@jcoupey
Collaborator

jcoupey commented Sep 12, 2016

I can see how useful this could be for integration in other pieces of software. At the moment, a loader could build a problem from a string (or file name) and solving would return a string (or a rapidjson object).

I think a proper library would require exposing more than just this: for example, a problem object that could be populated directly, and an object representing a solution that could be used directly after solving.
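
For illustration only, a library-style interface along those lines might look roughly like the sketch below. None of these names exist in vroom at this point; they are hypothetical and only meant to show the kind of objects that would need to be exposed.

// Hypothetical sketch: a problem populated directly and a solution object
// returned by the solver, instead of strings/json on both ends.
#include <vector>

namespace sketch {

struct location {
  double lon;
  double lat;
};

struct step {
  unsigned job_id;
  location loc;
};

struct solution {
  int code;                              // 0 on success
  std::vector<std::vector<step>> routes; // one sequence of steps per vehicle
};

class problem {
public:
  void add_vehicle(unsigned id, location start) {
    _vehicles.push_back({id, start});
  }
  void add_job(unsigned id, location loc) {
    _jobs.push_back({id, loc});
  }
  solution solve(unsigned nb_threads) const; // would run the actual solver

private:
  struct vehicle { unsigned id; location start; };
  struct job { unsigned id; location loc; };
  std::vector<vehicle> _vehicles;
  std::vector<job> _jobs;
};

} // namespace sketch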

@frodrigo
Contributor Author

The idea is to use it as a native library in an Android application (@fijemax).

@jcoupey
Collaborator

jcoupey commented Sep 12, 2016

This sounds cool. But how do you plan to re-use the solution output, just parse it again in the client application?

@frodrigo
Contributor Author

The idea with a library is to share objects, not to exchange and parse files.

@jcoupey
Collaborator

jcoupey commented Sep 12, 2016

to share objects

Precisely what I mean: we can't do this right now. At the moment, there is no "solution" object that could be shared, only code reaching out all over the place to write the expected json to output.

I think this should be considered in the light of #44 (I just filed that issue to write down things I had in mind before).

@jcoupey jcoupey added this to the v1.2.0 milestone Oct 5, 2016
@jcoupey
Collaborator

jcoupey commented Feb 24, 2017

This is probably getting nearer with the refactor for #44, as the objects involved (locations, jobs, vehicles, routes, input problem, solution, etc.) now have a proper data representation.

@jcoupey jcoupey removed this from the v1.2.0 milestone Oct 4, 2017
@sashakh
Contributor

sashakh commented Oct 31, 2017

Hi,

I'm using vroom as a library, with the following simple makefile patch:

diff --git a/src/makefile b/src/makefile
index c729fcf..0ff069f 100644
--- a/src/makefile
+++ b/src/makefile
@@ -10,6 +10,7 @@ LDLIBS = -lboost_system -lboost_regex -lboost_log -lboost_log_setup -lpthread -l
 
 # Using all cpp files in current directory.
 MAIN = ../bin/vroom
+LIB = ../lib/libvroom.a
 SRC = $(wildcard *.cpp)\
                        $(wildcard ./algorithms/*.cpp)\
                        $(wildcard ./routing/*.cpp)\
@@ -34,11 +35,14 @@ endif
 OBJ = $(SRC:.cpp=.o)
 
 # Main target.
-all : $(MAIN)
+all : $(MAIN) $(LIB)
 
 $(MAIN) : $(OBJ) main.o
        $(CXX) $(CXXFLAGS) -o $@ $^ $(LDLIBS)
 
+$(LIB) : $(OBJ)
+       $(AR) cr $@ $^
+
 # Building .o files.
 %.o : %.cpp %.h
        $(CXX) $(CXXFLAGS) -c $< -o $@

And my code then looks like:

        input input_data(NULL, false);  // geometry = false

        input_data.add_vehicle(0, optional_coords_t({ p[0].pos->lon,
                                                      p[0].pos->lat}),
                               boost::none);

        unsigned i;
        for (i = 1; i < size; i++) {
                input_data.add_job(i, optional_coords_t({p[i].pos->lon,
                                                         p[i].pos->lat}));
        }

        input_data._max_cost_per_line.assign(size, 0);
        input_data._max_cost_per_column.assign(size, 0);
        input_data._matrix = get_matrix(osrm, input_data._locations,
                                        input_data._max_cost_per_line,
                                        input_data._max_cost_per_column);

        solution sol = input_data.solve(1);       // nb_threads = 1

        if (sol.code != 0) {
                err("sol.code %d: %s\n", sol.code, sol.error.c_str());
                return;
        }

        for (const auto & r:sol.routes) {
                unsigned i = 0; 
                for (const auto & s:r.steps) {
                        //const location_t &l = s.location;
                        //dbg("%lu: (%f,%f)\n", s.job, l.lon.get(), l.lat.get());

                        n[i] = s.job;
                        i++;
                }
                n[0] = 0;
        }

@sashakh
Contributor

sashakh commented Oct 31, 2017

However #69 will create certain problems with continuous runs.

@jcoupey
Collaborator

jcoupey commented Nov 1, 2017

@sashakh thanks for sharing, glad to know this works fine for you. The add_job and add_vehicle functions were added with that exact usage in mind.

Ideally you should not have to worry about internal members like _max_cost_per_[line|column] or have to call get_matrix yourself. You can avoid that part by passing your osrm object as the first argument to the input ctor (instead of NULL). In the current code base, this object is constructed here:

std::unique_ptr<routing_io<cost_t>> routing_wrapper;
if (!cl_args.use_libosrm) {
  // Use osrm-routed.
  routing_wrapper = std::make_unique<routed_wrapper>(cl_args.osrm_address,
                                                     cl_args.osrm_port,
                                                     cl_args.osrm_profile);
} else {
#if LIBOSRM
  // Use libosrm.
  if (cl_args.osrm_profile.empty()) {
    throw custom_exception("-l flag requires -m.");
  }
  routing_wrapper = std::make_unique<libosrm_wrapper>(cl_args.osrm_profile);
#else
  throw custom_exception("libosrm must be installed to use -l.");
#endif
}

depending on whether we use libosrm or osrm-routed.

Then calling solve on an input object will automatically take care of setting up the matrix via set_matrix.
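
Putting the snippets in this thread together, a caller that lets the input handle the matrix could then look roughly like the sketch below. The class and function names (input, routed_wrapper, optional_coords_t, add_vehicle, add_job, solve) are taken from the comments above, but the exact constructor signatures, header paths and the "localhost"/"5000"/"car" values are assumptions, not the documented API.

#include <memory>
#include <utility>
#include <vector>
#include <boost/none.hpp>

// Header paths are indicative only and depend on the code base at the time:
// #include "routing/routed_wrapper.h"
// #include "structures/vroom/input/input.h"

solution plan(double depot_lon,
              double depot_lat,
              const std::vector<std::pair<double, double>>& job_coords) {
  // Hand a routing wrapper to the input instead of passing NULL and
  // filling _matrix by hand; address, port and profile are placeholders.
  auto wrapper =
    std::make_unique<routed_wrapper>("localhost", "5000", "car");

  input input_data(std::move(wrapper), false); // geometry = false

  input_data.add_vehicle(0,
                         optional_coords_t({depot_lon, depot_lat}),
                         boost::none);

  unsigned id = 1;
  for (const auto& c : job_coords) {
    input_data.add_job(id++, optional_coords_t({c.first, c.second}));
  }

  // solve() queries the routing wrapper and sets the matrix internally.
  return input_data.solve(1); // nb_threads = 1
}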

@sashakh
Contributor

sashakh commented Nov 1, 2017 via email

@jcoupey
Collaborator

jcoupey commented Nov 1, 2017

@sashakh could you submit your makefile patch as a pull request when you find time? I'd be happy to merge it in. The support for library use is still quite experimental, but it does seem relevant for several use cases.

@sashakh
Contributor

sashakh commented Nov 2, 2017 via email

@jcoupey
Collaborator

jcoupey commented Nov 2, 2017

@sashakh creating the bin and lib directories from makefile rules is probably the more convenient option. Thanks!

@sashakh
Contributor

sashakh commented Nov 2, 2017 via email

@sashakh sashakh mentioned this issue Nov 4, 2017
@sashakh
Contributor

sashakh commented Nov 4, 2017 via email

@jcoupey
Collaborator

jcoupey commented Feb 14, 2018

FYI: the next step toward this landed with #84, introducing an input::set_matrix(matrix<cost_t>&&) function that is used throughout the codebase (the _matrix member in input is now private).

See updated libvroom.cpp example for usage from a C++ context.
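
For callers computing their own costs, the flow presumably becomes: build the input, then move a matrix<cost_t> into it before solving. A minimal sketch, assuming matrix<cost_t> can be constructed from its dimension and filled via operator[] (that part is an assumption here; see libvroom.cpp for the actual usage):

#include <cstddef>
#include <utility>
#include <vector>

// Sketch only: input, matrix<cost_t>, set_matrix and solve are the names
// used in this thread; the matrix construction and indexing below are
// assumptions, check libvroom.cpp for the actual interface.
solution solve_with_custom_costs(
  input& input_data,
  const std::vector<std::vector<cost_t>>& costs) {
  // Assumed: square matrix built from its dimension, filled via operator[].
  matrix<cost_t> m(costs.size());

  for (std::size_t i = 0; i < costs.size(); ++i) {
    for (std::size_t j = 0; j < costs.size(); ++j) {
      m[i][j] = costs[i][j]; // custom cost from location i to location j
    }
  }

  // The rvalue-reference overload hands the matrix over to the input, so
  // solve() uses it instead of querying a routing wrapper.
  input_data.set_matrix(std::move(m));

  return input_data.solve(1); // nb_threads = 1
}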

@jcoupey
Collaborator

jcoupey commented Jul 6, 2018

I flagged this as an experimental feature in the v1.2.0 release. It should be fully working. Yet I think we should keep that ticket open for reference as the C++ API might still evolve. Also there is no proper documentation besides the examples in libvroom_examples/libvroom.cpp.

@jcoupey jcoupey mentioned this issue Jul 6, 2018
@dbhoot

dbhoot commented Aug 20, 2019

I second this request. I think it'd be good to have JavaScript and/or Python wrappers so that the vroom solver can be invoked from other high-level languages.

@jcoupey
Collaborator

jcoupey commented Aug 20, 2019

@dbhoot thanks for your interest. This ticket was primarily about compiling a C++ library, which is now effective, and I've had a couple of reports from people using it successfully in their own C++ code. The reason I did not close here is that I still consider it experimental (read: "I don't want to be bound to the C++ API, and any internal change will not be considered a breaking change"). I try to keep the example file up to date though.

I don't have a need myself for bindings to other languages, but I definitely see how that might be useful to others. If you're interested in setting something up, then feel free to open a dedicated ticket here to discuss the best way to proceed.

For the sake of completeness: you might want to check out vroom-express, an expressjs-based wrapper to use VROOM with HTTP POST requests.

@sashakh
Contributor

sashakh commented Aug 20, 2019 via email

@dbhoot

dbhoot commented Aug 21, 2019

For the sake of completeness: you might want to check out vroom-express, an expressjs-based wrapper to use VROOM with HTTP POST requests.

I saw the express project. If I remember correctly, it basically shells out to invoke the binary.

The PR sashakh made is great for C++ projects. I didn't try creating v8 bindings using SWIG or something similar on top of his work, but I'm sure it's possible with some effort.

My point in commenting above is that I think bindings would generally make the project more useful.

@dndll

dndll commented Feb 4, 2020

I struggled a bit trying to get libvroom working from Rust, gave up and shelled out to the command instead.

I did the same thing in Kotlin, as it was difficult and there was not enough time to build JNI bindings for it.

That said, it would be very useful if the library were easier to use than the above methods, as shelling out to a command is expensive on the JVM.

@jcoupey
Collaborator

jcoupey commented Feb 5, 2020

@AwesomeIbex not sure exactly what you're trying to achieve here as the purpose of libvroom is to call the native functions from C++.

If you're interested in using it from your C++ code, then you'll find an example on how to link the library and how to set up and solve a problem.

If you're interested in bindings for other languages, then please check my previous comment above that still applies to date.

@yhilem

yhilem commented Nov 16, 2021

Hi,
I'm trying to use the vroom lib from Java. I thought about using JavaCPP (https://github.com/bytedeco/javacpp) and, if possible, creating a module in JavaCPP Presets (https://github.com/bytedeco/javacpp-presets). I am not a C++ expert, so I would need some help creating the cppbuild.sh file (https://github.com/bytedeco/javacpp-presets/wiki/Create-New-Presets#the-cppbuildsh-file).
Thanks in advance.

@jcoupey
Collaborator

jcoupey commented Nov 17, 2021

I'm not familiar at all with JavaCPP (and not that much with Java either), so I can't really comment on the whole process. You'll probably get better generic advice from the JavaCPP maintainers.

On the specific question of the cppbuild.sh script: the example you're pointing to seems to boil down to building with make. So if you don't need all the fancy multi-platform stuff, you should be able to simply paste the usual build instructions for vroom into your script, including pulling the repo and installing dependencies.

I'd be interested in any feedback if you end up using the project from Java.

@yhilem

yhilem commented Nov 17, 2021

Thank you for your reply. We are already using vroom from Java with the ProcessBuilder class (https://docs.oracle.com/javase/9/docs/api/java/lang/ProcessBuilder.html), same as what vroom-express does.
This is not scalable for our new use cases: between 20,000 and 40,000 requests, in real time, for optimizing routes in the morning between 8 a.m. and 10 a.m.
We first want to reduce the process startup overhead.
In addition, we want to deploy vroom on our agents' Android smartphones with a cost matrix that corresponds to the geographic coverage area of our sites.
I will share the results of our work as soon as it is done.

@kkarbowiak
Contributor

I flagged this as an experimental feature in the v1.2.0 release. It should be fully working. Yet I think we should keep that ticket open for reference as the C++ API might still evolve. Also there is no proper documentation besides the examples in libvroom_examples/libvroom.cpp.

I think the library part of this project is a fact, and a number of people already depend on it (myself included). I would therefore propose to close this issue and handle any outstanding topics as new issues.

Any thoughts?

@jcoupey
Collaborator

jcoupey commented Jun 21, 2022

I think the library part of this project is a fact

Yes, you're right of course. Again, my only concern here is that I don't want to be tied to the C++ API and have to use workarounds to make it evolve in a non-breaking way.

Maybe we could advertise the C++ API but add some warnings stating that the non-breaking semantic versioning approach only applies to the json API?

@kkarbowiak
Contributor

Maybe we could advertise the C++ API but add some warnings stating that the non-breaking semantic versioning approach only applies to the json API?

Works for me.

@dbhoot

dbhoot commented Jun 21, 2022

👍
