Bulk save #68

@caseydm

Description

First off, let me say this is an awesome Python package. I'm trying to import 12 million records from a Heroku Postgres instance into Elastic Cloud, but it stops immediately with an out-of-memory error. I think it's due to the way the normal manage.py search_index --rebuild works. Is there an easy way to implement a bulk save that batches the items or otherwise limits memory use?
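A common workaround for this kind of OOM (a sketch, not this package's API) is to avoid materializing all 12 million records at once and instead stream them in fixed-size batches, e.g. by combining Django's `QuerySet.iterator()` with a chunking helper and the Elasticsearch bulk API. The chunking piece is plain stdlib Python and can be shown self-contained; the `chunked` helper name and the batch size of 500 are illustrative choices, not something the package provides:

```python
from itertools import islice


def chunked(iterable, size):
    """Yield successive lists of at most `size` items from any iterable,
    so the full record set never has to be held in memory at once."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch


# Demonstration with a small range; in the real case the iterable would be
# a streaming source such as queryset.iterator(), indexed one batch at a
# time (e.g. via elasticsearch.helpers.bulk), keeping memory roughly
# proportional to the batch size rather than to the 12M-row table.
batches = list(chunked(range(10), 4))
print(batches)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

With a pattern like this, peak memory is bounded by one batch of serialized documents instead of the whole table, which is typically what makes a multi-million-row rebuild feasible on a small dyno.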
