Have ability to perform bulk insert. #400
Comments
It would be very useful for me too.
+1
Much needed for me as well - an essential feature for using Pony on large (and growing) datasets.
It's strange that such a powerful ORM doesn't provide this yet.
Coming from a background of .NET and Entity Framework, this is a must-have feature.
Would be really cool to have it.
Yes please! What is the best way to ingest large amounts of data otherwise? My use case is importing tens of millions of records at a time, on a monthly basis, from text files.
Another upvote.
We had to (sadly) switch away from pony for this exact case as well as migration functionality. We’re still lurking around in the hopes that both get addressed at some point. Our solution with SQLA was far from optimal as well but it did work and was reasonably maintainable.
That is extremely weak. At first I thought this thread was a hoax. I can only agree with @Hamzakhalid01; as someone who comes from the .NET world, this is completely unimaginable. This is basic functionality, why is it not implemented?!? Sorry to be so direct, but from my perspective, Pony is just a fun project that is not suitable for the real world. Scalability is not given, which makes it a nonsensical ORM 🤬 @minalike You have to use some other ORM until they provide bulk inserts.
It is unthinkable that Pony doesn't have this basic feature, while inserting large amounts of data in bulk is a common practice.
It appears implemented in the docs: https://ponyorm.readthedocs.io/en/latest/api_reference.html#collection-attribute-methods (look under add).
Using that method is ungodly slow for "real" bulk insert cases. For simple schemas and a few hundred entries, maybe, but forget it for even modestly big data. We had to revert to "that other ORM" in the hopes that Pony would someday have bulk insert and migration. I still love Pony, just not yet big-data ready IMO/E.
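For a sense of why per-entity inserts fall over at scale: each entity creation becomes its own INSERT statement, while a bulk call hands all rows to the driver at once. This can be illustrated with the stdlib sqlite3 DB-API alone (this is not Pony code; the table and names are made up for the illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (name TEXT)")

rows = [(f"person-{i}",) for i in range(1000)]

# Row-by-row: one execute() call per row, which is roughly what
# per-entity ORM inserts boil down to.
for row in rows:
    conn.execute("INSERT INTO person (name) VALUES (?)", row)

# Bulk: a single executemany() call for all rows.
conn.executemany("INSERT INTO person (name) VALUES (?)", rows)

conn.commit()
count = conn.execute("SELECT count(*) FROM person").fetchone()[0]
print(count)  # 2000: 1000 from the loop + 1000 from executemany
```

On real databases over a network, the per-row version also pays one round trip per statement, which is where most of the slowdown comes from.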
Hi,
I use the database adapter directly for performing bulk inserts. Is it possible to do this via Pony?
For example, something like this:
Expected: 2 rows inserted as one bulk insert.
Or is there another way/pattern to do this with Pony?
Thank you!