diff --git a/docs/bulk_import.rst b/docs/bulk_import.rst
index 233dab17e..4d4ddce57 100644
--- a/docs/bulk_import.rst
+++ b/docs/bulk_import.rst
@@ -31,9 +31,9 @@ Caveats
 * In bulk mode, exceptions are not linked to a row. Any exceptions raised by bulk operations are logged and returned
   as critical (non-validation) errors (and re-raised if ``raise_errors`` is true).
 
-* If you use :class:`~import_export.widgets.ForeignKeyWidget` then this can affect performance, because it reads from
-  the database for each row. If this is an issue then create a subclass which caches ``get_queryset()`` results rather
-  than reading for each invocation.
+* If you use :class:`~import_export.widgets.ForeignKeyWidget` then this should not affect performance during lookups,
+  because the ``QuerySet`` cache should be used. Some more information is available
+  `here `_.
 
 * If there is the potential for concurrent writes to a table during a bulk operation, then you need to consider the
   potential impact of this. Refer to :ref:`concurrent-writes` for more information.
diff --git a/docs/testing.rst b/docs/testing.rst
index dc2c69baf..b75ed81fb 100644
--- a/docs/testing.rst
+++ b/docs/testing.rst
@@ -64,4 +64,20 @@ You can then run the script as follows::
 
     # pass 'create', 'update' or 'delete' to run the single test
     ./manage.py runscript bulk_import --script-args create
 
+Enable logging
+^^^^^^^^^^^^^^
+
+You can see console debug logging by updating the ``LOGGING`` block in ``settings.py``::
+
+    LOGGING = {
+        "version": 1,
+        "handlers": {"console": {"class": "logging.StreamHandler"}},
+        "root": {
+            "handlers": ["console"],
+        },
+        "loggers": {
+            "django.db.backends": {"level": "DEBUG", "handlers": ["console"]},
+        }
+    }
+
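As a quick sanity check of the ``LOGGING`` block added in ``docs/testing.rst`` above, the same dictionary can be fed through the standard library's ``logging.config.dictConfig`` (which is what Django uses to apply ``LOGGING``). This is only a sketch to show the config is well-formed; outside a Django process no SQL is actually emitted, so we just confirm the ``django.db.backends`` logger ends up at ``DEBUG`` with a console handler attached:

```python
import logging
import logging.config

# The LOGGING dict from the docs change above, verbatim.
LOGGING = {
    "version": 1,
    "handlers": {"console": {"class": "logging.StreamHandler"}},
    "root": {
        "handlers": ["console"],
    },
    "loggers": {
        "django.db.backends": {"level": "DEBUG", "handlers": ["console"]},
    },
}

# Django applies the LOGGING setting via dictConfig; doing the same here
# confirms the dictionary is a valid logging configuration.
logging.config.dictConfig(LOGGING)

# Under Django, this logger receives each executed SQL statement; at DEBUG
# with a StreamHandler attached, queries would be echoed to the console.
sql_logger = logging.getLogger("django.db.backends")
print(sql_logger.getEffectiveLevel() == logging.DEBUG)  # → True
```

Note that the root logger here has no explicit ``level``, so it falls back to the default (``WARNING``); only the ``django.db.backends`` logger is opted in to ``DEBUG`` output.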