Fixed #17668 - prefetch_related does not work in in_bulk

Thanks to gurets for the report, and akaariai for the initial patch.

git-svn-id: bcc190cf-cafb-0310-a4f2-bffc1f526a37
commit de9942a6673cfbe442abdfabc1e8f7c0a652ef5b (1 parent: 4b641b7)
Authored by spookylukey
django/db/models/ (2 changed lines)
@@ -485,7 +485,7 @@ def in_bulk(self, id_list):
         qs = self._clone()
         qs.query.add_filter(('pk__in', id_list))
-        return dict([(obj._get_pk_val(), obj) for obj in qs.iterator()])
+        return dict([(obj._get_pk_val(), obj) for obj in qs])

     def delete(self):
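The one-line change above works because iterating a queryset directly goes through its result cache, where prefetching is triggered, while `iterator()` streams rows and skips that step. A minimal sketch of the distinction, using a hypothetical `FakeQuerySet` class (not Django's real implementation):

```python
class FakeQuerySet:
    """Toy stand-in for a Django QuerySet (hypothetical): full iteration
    fills a result cache and then runs any registered prefetch step;
    iterator() streams rows and never reaches that step."""

    def __init__(self, rows, prefetch=None):
        self._rows = list(rows)
        self._prefetch = prefetch   # callable invoked once over the cache
        self._cache = None
        self.prefetch_done = False

    def __iter__(self):
        # Plain iteration: populate the cache, then do the prefetch pass.
        if self._cache is None:
            self._cache = list(self._rows)
            if self._prefetch is not None:
                self._prefetch(self._cache)
                self.prefetch_done = True
        return iter(self._cache)

    def iterator(self):
        # Streaming path: no cache, so there is no point at which a
        # batch prefetch over the full result set could run.
        for row in self._rows:
            yield row


def in_bulk(qs):
    # Post-fix behaviour: iterate the queryset directly, which uses the
    # cached (and therefore prefetched) path.
    return {obj["pk"]: obj for obj in qs}
```

With this toy model, `in_bulk` triggers the prefetch step, while draining `iterator()` leaves `prefetch_done` false.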
docs/ref/models/querysets.txt (6 changed lines)
@@ -871,6 +871,9 @@ could be generated, which, depending on the database, might have performance
 problems of its own when it comes to parsing or executing the SQL query. Always
 profile for your use case!

+Note that if you use ``iterator()`` to run the query, ``prefetch_related()``
+calls will be ignored since these two optimizations do not make sense together.
@@ -1430,6 +1433,9 @@ performance and a significant reduction in memory.

 Note that using ``iterator()`` on a ``QuerySet`` which has already been
 evaluated will force it to evaluate again, repeating the query.

+Also, use of ``iterator()`` causes previous ``prefetch_related()`` calls to be
+ignored since these two optimizations do not make sense together.
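The reason the two optimizations conflict, as the doc notes say, is that prefetching needs the full set of parent objects up front so it can issue one batched lookup for all of their related objects, while `iterator()` deliberately never materializes that set. A toy sketch of the batching step (hypothetical data shapes, not Django internals):

```python
from collections import defaultdict

def prefetch_children(parents, child_table):
    """Toy version of prefetch_related's batch step: group one pass over
    a child table by parent pk. It requires ALL parents before it can
    build the pk__in-style lookup, which is exactly what a row-at-a-time
    iterator() cannot provide."""
    parent_pks = {p["pk"] for p in parents}          # needs the full list
    by_parent = defaultdict(list)
    for child in child_table:                        # the "one extra query"
        if child["parent_pk"] in parent_pks:
            by_parent[child["parent_pk"]].append(child)
    for p in parents:
        p["children"] = by_parent.get(p["pk"], [])
    return parents
```

This is why silently ignoring `prefetch_related()` under `iterator()` is the only consistent behaviour: there is no complete parent list to batch over.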
tests/modeltests/prefetch_related/ (13 changed lines)
@@ -470,3 +470,16 @@ def test_prefetch_nullable(self):
                     for e in qs2]
         self.assertEqual(co_serfs, co_serfs2)
+
+    def test_in_bulk(self):
+        """
+        In-bulk does correctly prefetch objects by not using .iterator()
+        directly.
+        """
+        boss1 = Employee.objects.create(name="Peter")
+        boss2 = Employee.objects.create(name="Jack")
+        with self.assertNumQueries(2):
+            # Check that prefetch is done and it does not cause any errors.
+            bulk = Employee.objects.prefetch_related('serfs').in_bulk([boss1.pk, boss2.pk])
+            for b in bulk.values():
+                list(b.serfs.all())
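The `assertNumQueries(2)` assertion is the heart of the test: one query fetches the employees, and one batched query prefetches all their serfs; before the fix, `iterator()` silently dropped the prefetch. The idea behind that assertion can be sketched with a toy query counter (a hypothetical re-implementation, not Django's test utility):

```python
from contextlib import contextmanager

class QueryLog:
    """Minimal stand-in for a database connection's query log (hypothetical)."""
    def __init__(self):
        self.queries = []

    def run(self, sql):
        self.queries.append(sql)

@contextmanager
def assert_num_queries(log, expected):
    # Count queries executed inside the with-block and compare against
    # the expected number, like Django's assertNumQueries.
    start = len(log.queries)
    yield
    executed = len(log.queries) - start
    assert executed == expected, "expected %d queries, ran %d" % (expected, executed)
```

A passing run of the scenario above would record exactly two queries: the main employee fetch plus the single batched serf lookup.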
