[Feature] Bulk Operation Object #1862
Comments
Thank you for the feature request. We're currently in the process of a cleanup of the issue tracker, and aren't currently accepting feature requests (there will be an official policy written soon, but the TL;DR is that open issues should reflect something that is a bug or on our immediate roadmap). We're happy to discuss feature requests, but the place to do so is discourse.diesel.rs, not the issue tracker.

sgrif closed this Sep 21, 2018
@rrichardson Have you tried just passing the whole list of values to the insert function? Something like this:

```rust
::diesel::insert_into(table).values(&vec![
    (column_a.eq(42), column_b.eq("foo")),
    (column_a.eq(43), column_b.eq("bar")),
    // more tuples
]).execute(conn)?;
```

This should generate the following SQL query: `INSERT INTO table(column_a, column_b) VALUES (42, 'foo'), (43, 'bar');`
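For illustration, here is a minimal plain-Rust sketch of the multi-row `VALUES` clause such a batched call is expected to produce. The function name and the fixed `column_a`/`column_b` schema are assumptions for this example, and the string-building here stands in for Diesel's actual query generation (which uses bind parameters, not inline literals):

```rust
// Hypothetical sketch: assemble the multi-row INSERT that a batched
// .values(&vec![...]) call should correspond to. Plain string-building
// for illustration only; Diesel itself uses bind parameters.
fn multi_row_insert(table: &str, rows: &[(i32, &str)]) -> String {
    let values: Vec<String> = rows
        .iter()
        .map(|(a, b)| format!("({}, '{}')", a, b))
        .collect();
    format!(
        "INSERT INTO {}(column_a, column_b) VALUES {};",
        table,
        values.join(", ")
    )
}

fn main() {
    let sql = multi_row_insert("table", &[(42, "foo"), (43, "bar")]);
    println!("{}", sql);
    // prints: INSERT INTO table(column_a, column_b) VALUES (42, 'foo'), (43, 'bar');
}
```

The point is that one round-trip carries all the rows, which is where the bulk-insert speedup comes from.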
rrichardson commented Sep 20, 2018

I am using Diesel for bulk data ingest. Presently I have been able to get mediocre insert performance by disabling indexing for the tables using an `exec_sql`, then re-enabling and re-indexing, but I'm still calling one execute per record. It would be nice to have a batch object which constructs either multi-insert statements, or at the very least wraps a bunch of calls in `BEGIN TRANSACTION` and `COMMIT`. Thoughts?
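As a rough sketch of the batching behavior being requested, the following plain-Rust example chunks a row set into fixed-size multi-row inserts and wraps the whole batch in `BEGIN TRANSACTION` / `COMMIT`. Everything here is hypothetical (the helper name, the chunk size, the two-column `table` schema); it only illustrates the shape of the SQL a batch object might emit, not any existing Diesel API:

```rust
// Hypothetical sketch of the requested "bulk operation object":
// chunk the rows, emit one multi-row INSERT per chunk, and wrap the
// whole batch in a single transaction. Illustration only.
fn batched_script(rows: &[(i32, &str)], chunk_size: usize) -> String {
    let mut script = String::from("BEGIN TRANSACTION;\n");
    for chunk in rows.chunks(chunk_size) {
        let values: Vec<String> = chunk
            .iter()
            .map(|(a, b)| format!("({}, '{}')", a, b))
            .collect();
        script.push_str(&format!(
            "INSERT INTO table(column_a, column_b) VALUES {};\n",
            values.join(", ")
        ));
    }
    script.push_str("COMMIT;");
    script
}

fn main() {
    let rows = vec![(1, "a"), (2, "b"), (3, "c")];
    println!("{}", batched_script(&rows, 2));
}
```

Chunking matters in practice because backends cap statement size or bind-parameter counts, so one giant insert is usually split into several transaction-wrapped chunks.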