Request for batch queries. #177
Issuing multiple statements in the same query requires the simple query protocol; postgrex only supports extended queries. There are currently no plans to support simple queries because they only use the text format encoding. The binary format is simpler and more performant; adding the text format would mean duplicating all of the encoding/decoding and would expose you to subtle bugs such as SQL injection. I suggest rewriting your inserts as a single parametrised multi-row insert. For updates you can build a single query and execute it multiple times with different parameters.
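The inline example seems to have been lost from this comment. A minimal sketch of the kind of rewrite being suggested might look like the following; the `conn`, the `entries` table, and its columns are hypothetical, not from the original:

```elixir
# Hypothetical sketch: batch many rows into ONE extended-protocol query
# instead of issuing many separate statements.
rows = [["a", 1], ["b", 2], ["c", 3]]

# Build "($1,$2),($3,$4),($5,$6)" placeholders, one group per row.
{groups, _next} =
  Enum.map_reduce(rows, 1, fn row, i ->
    ps = Enum.map_join(i..(i + length(row) - 1), ",", &"$#{&1}")
    {"(#{ps})", i + length(row)}
  end)

sql = "INSERT INTO entries (name, value) VALUES " <> Enum.join(groups, ",")
Postgrex.query!(conn, sql, List.flatten(rows))

# For updates: prepare once, then execute with different parameters.
query = Postgrex.prepare!(conn, "upd", "UPDATE entries SET value = $2 WHERE name = $1")
Enum.each(rows, fn [name, value] ->
  Postgrex.execute!(conn, query, [name, value])
end)
```

The multi-row insert keeps the query parametrised, so it still gets the binary-format encoding and injection safety of the extended protocol.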
Are you requesting support for
@ericmj, sorry, but how can I do a thousand updates with one prepare/execute command? AFAIK it still makes several thousand requests. I understand that it will not parse the query each time, but that hardly saves us from the many requests, which take the majority of the time. @fishcakez, well, that would help us a little, but not with update queries. In our app we process entities, each of which has several thousand transactions. We then calculate diff values and update all of them with one server request by joining all the queries with ';'. This drastically speeds up processing compared to processing each transaction separately. (I meant we did that with Rails.) With Elixir we save all the update queries to a file, then execute psql with it. I understand this is tied to the binary protocol and you don't want to redo all the work, but it would be really nice to have.
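The workaround described above (write every UPDATE to one file, then let psql run them in a single session) can be sketched like this; the table, columns, and database name are hypothetical:

```shell
# Hypothetical sketch of the psql workaround: batch all updates into one
# file so they travel to the server in a single psql session.
cat > updates.sql <<'SQL'
UPDATE transactions SET amount = amount + 10 WHERE id = 1;
UPDATE transactions SET amount = amount - 3  WHERE id = 2;
SQL

# Then run the whole file in one go (commented out: needs a live database):
# psql -d myapp_db -f updates.sql
```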
@nateless it would be possible to support simple (unparametrised/unprepared) queries, but someone wanting it would have to do the proposal/work to add the feature, and it would be a lot for them to do. An alternative might be to look at #145. It is out of date and could do better batching to the socket; both of those issues would need fixing. However, it does have a reasonable API and would allow batched/pipelined requests with the benefits of parametrised/prepared queries.
Closing this as there is no plan to support simple queries for now. |
@fishcakez it would be great to have https://www.depesz.com/2007/07/05/how-to-insert-data-to-database-as-fast-as-possible/
@lessless can you show what you are trying? postgrex has supported COPY FROM since 0.11.2 and explicit prepared inserts since 0.11.0.
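For reference, a hedged sketch of COPY FROM STDIN with postgrex — the copy API has changed across versions, and the `entries` table and tab-separated row data here are assumptions for illustration. With a recent postgrex, `Postgrex.stream/4` inside a transaction acts as a collectable for copy data:

```elixir
# Hypothetical sketch: bulk-load rows with COPY, the fastest insert path.
# `pid` is an existing Postgrex connection; table/columns are made up.
Postgrex.transaction(pid, fn conn ->
  stream = Postgrex.stream(conn, "COPY entries (name, value) FROM STDIN", [])
  # The stream is a Collectable: rows piped into it are sent as copy data.
  Enum.into(["a\t1\n", "b\t2\n", "c\t3\n"], stream)
end)
```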
Would you guys be ok with me adding support for simple text queries to enable batching? It seems like you closed everything that wasn't going into Ecto, and I don't want to do the work in this library if you don't want it. I can always build another library or fork this one. Though maybe I can just use https://github.com/epgsql/epgsql
If it means writing encoders and decoders for all text formats, then it is probably too much for us to take. Thanks for asking upfront!

--
José Valim
www.plataformatec.com.br
Founder and Director of R&D
Hey guys,
We need to process several thousand inserts/updates as raw SQL. But currently that's not possible because of prepared statements. We would like to have that ability, as the only way we have now is to put them into a text file and process them with psql copy, which is kinda weird.