django-chunkator

Chunk large QuerySets into small chunks, and iterate over them without killing your RAM.

Usage

from chunkator import chunkator
for item in chunkator(LargeModel.objects.all(), 200):
    do_something(item)

This tool is intended to work on Django querysets.

Your model must define a pk field (Django does this by default, but it can be overridden), and this pk must be unique. django-chunkator has been tested with PostgreSQL and SQLite, using both regular integer PKs and UUIDs as primary keys.
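
For example, a model with a UUID primary key works the same way as one with a default integer pk. The model and field names below are purely illustrative, not part of the library:

import uuid
from django.db import models
from chunkator import chunkator

class LargeModel(models.Model):
    # A UUID primary key is supported, as long as it is unique.
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
    name = models.CharField(max_length=255)

for item in chunkator(LargeModel.objects.all(), 200):
    do_something(item)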

You can also use values():

from chunkator import chunkator
for item in chunkator(LargeModel.objects.values('pk', 'name'), 200):
    do_something(item)

Important

If you're using values(), you must include at least the "pk" field in the values; otherwise, chunkator will raise a MissingPkFieldException.
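
To illustrate (the failing call below is an assumption-free restatement of the rule above, not additional API):

from chunkator import chunkator

# Raises MissingPkFieldException: "pk" is missing from values()
for item in chunkator(LargeModel.objects.values('name'), 200):
    do_something(item)

# Works: "pk" is included
for item in chunkator(LargeModel.objects.values('pk', 'name'), 200):
    do_something(item)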

Warning

This will not speed up your processing. Instead of one big query, you'll run several small queries. What it saves is RAM: the whole queryset result is never loaded into memory before you loop over it.
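
Conceptually, the chunking boils down to pk-based pagination, sketched below. This is a simplified illustration of the general technique, not the library's exact implementation:

def chunk_by_pk(queryset, chunk_size):
    """Yield items in pk order, fetching chunk_size rows per query."""
    last_pk = None
    queryset = queryset.order_by('pk')
    while True:
        page = queryset
        if last_pk is not None:
            # Resume after the last pk seen, instead of using OFFSET.
            page = page.filter(pk__gt=last_pk)
        page = list(page[:chunk_size])  # one small query per chunk
        if not page:
            return
        for item in page:
            yield item
        last_pk = page[-1].pk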

License

MIT License.