Handling Bulk Inserts #32

Open
lamjohnson opened this issue Jun 29, 2017 · 3 comments

@lamjohnson

It'd be cool if it could keep the database and the Elasticsearch index in sync for bulk inserts. I noticed it isn't a feature yet.

@barseghyanartur
Contributor

@jlam17:

The only way it could work is:

  • Create a custom Django Model manager, override the bulk_create method and handle updates there.
  • Use that custom Django Model manager with every model you would want to index.

The same applies to the bulk update and delete actions.

@sabricot:

If it sounds convenient to you, I could implement that.

@lamjohnson
Author

@barseghyanartur Thanks for the reply, I didn't know that. I just started learning Django so I appreciate the help.

@rapkyt

rapkyt commented Nov 30, 2020

Not sure if I'm late to this, but what I do is perform the bulk_create, grab the ids of the newly created objects, build a queryset from them, and call the document's update method.

For instance, let's say you have a Blog model and a BlogDocument:

blogs_to_create = [Blog(**data) for data in bulk_data]
blogs_created = Blog.objects.bulk_create(blogs_to_create)

blogs_id = [blog.id for blog in blogs_created]
new_blogs_qs = Blog.objects.filter(id__in=blogs_id)
BlogDocument().update(new_blogs_qs)
