Commit

Merge 71fce64 into 270484b
matthewhegarty committed Jan 10, 2024
2 parents 270484b + 71fce64 commit 1c0f895
Showing 10 changed files with 181 additions and 32 deletions.
73 changes: 68 additions & 5 deletions docs/advanced_usage.rst
@@ -222,16 +222,79 @@ Errors are retained in each :class:`~import_export.results.RowResult` instance w
The :meth:`~import_export.resources.Resource.import_data` method takes optional parameters which can be used to
customize the handling of errors. Refer to the method documentation for specific details.

Validation Errors
-----------------

During import of a row, each field is iterated, and any `ValueError <https://docs.python.org/3/library/exceptions.html#ValueError>`_
raised by a widget is stored in an instance of Django's
`ValidationError <https://docs.djangoproject.com/en/stable/ref/forms/validation/>`_.
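
The collection mechanism can be sketched in plain Python. This is not the library's actual code: ``clean_date`` is a stand-in for a date widget's ``clean()`` method, and a plain dict stands in for ``ValidationError.error_dict``:

```python
# Illustrative sketch only: shows how per-field ValueErrors can be
# accumulated into a dict keyed by field name, similar in shape to
# Django's ValidationError.error_dict.
from datetime import datetime

def clean_date(value):
    # Stand-in for a date widget's clean() method.
    return datetime.strptime(value, "%Y-%m-%d").date()

row = {"id": "2", "name": "The Hobbit", "published": "1996-01-02x"}
errors = {}
for field, value in row.items():
    if field != "published":
        continue  # only the date field is widget-cleaned in this sketch
    try:
        clean_date(value)
    except ValueError as e:
        errors.setdefault(field, []).append(str(e))

# errors -> {'published': ['unconverted data remains: x']}
```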

Validation errors are retained within the :attr:`~import_export.results.Result.invalid_rows` list as a
:class:`~import_export.results.InvalidRow` instance.

If importing programmatically, you can set the ``raise_errors`` parameter of :meth:`~import_export.resources.Resource.import_data`
to ``True``; the import will then halt at the first row which has errors::

    rows = [
        (1, 'Lord of the Rings', '1996-01-01'),
        (2, 'The Hobbit', '1996-01-02x'),
    ]
    dataset = tablib.Dataset(*rows, headers=['id', 'name', 'published'])
    resource = BookResource()
    resource.import_data(dataset, raise_errors=True)

The above process will exit with a row number and error (formatted for clarity)::

    ImportError: 2: {'published': ['Value could not be parsed using defined date formats.']}
    (OrderedDict({'id': 2, 'name': 'The Hobbit', 'published': '1996-01-02x'}))
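
This message format comes from the ``__str__`` of the new exception class. A hypothetical mirror of the wrapper (the real class is ``import_export.exceptions.ImportError``; ``RowImportError`` below is only an illustrative name) shows the ``<row number>: <error> (<row>)`` layout:

```python
# Hypothetical mirror of import_export.exceptions.ImportError, shown only
# to illustrate the message format; use the real class in application code.
class RowImportError(Exception):
    def __init__(self, error, number=None, row=None):
        self.error = error    # the underlying error
        self.number = number  # row number, if obtainable
        self.row = row        # the offending row, if obtainable

    def __str__(self):
        return f"{self.number}: {self.error} ({self.row})"

err = RowImportError(
    {"published": ["Value could not be parsed using defined date formats."]},
    number=2,
    row={"id": 2, "name": "The Hobbit", "published": "1996-01-02x"},
)
print(err)
```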

To iterate over all validation errors produced from an import, pass ``False`` to ``raise_errors``::

    result = resource.import_data(dataset, raise_errors=False)
    if result.has_errors():
        for row in result.rows:
            for error in row.errors:
                print(str(error.error))
    for row in result.invalid_rows:
        print(f"--- row {row.number} ---")
        for field, error in row.error.error_dict.items():
            print(f"{field}: {error} ({row.values})")

If using the :ref:`Admin UI<admin-integration>`, errors are presented to the user during import (see below).

Generic Errors
--------------

Generic errors are raised during import for cases which are not validation errors.
For example, generic errors are usually raised at the point the model instance is saved, such as an attempt to save
a float to an int field. Because generic errors are raised from a lower point in the stack, it is not always possible
to identify which field caused the error.

Generic errors are retained within the :attr:`~import_export.results.Result.error_rows` list as a
:class:`~import_export.results.ErrorRow` instance.

The ``raise_errors`` parameter can be used during programmatic import to halt the import at the first error::

    rows = [
        (1, 'Lord of the Rings', '1x'),
        (2, 'The Hobbit', 'x'),
    ]
    dataset = tablib.Dataset(*rows, headers=['id', 'name', 'price'])
    resource = BookResource()
    result = resource.import_data(dataset, raise_errors=True)

The above process will exit with a row number and error (formatted for clarity)::

    ImportError: 1: [<class 'decimal.ConversionSyntax'>]
    (OrderedDict({'id': 1, 'name': 'Lord of the Rings', 'price': '1x'}))
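
The ``ConversionSyntax`` in this message originates in Python's ``decimal`` module rather than in import_export itself. A standalone snippet shows the kind of low-level exception that surfaces here, and why no field name is available:

```python
from decimal import Decimal, InvalidOperation

caught = None
try:
    Decimal("1x")  # invalid literal, as in the 'price' column above
except InvalidOperation as e:
    caught = e

# The exception reports only the failed conversion; it knows nothing
# about which model field the value was destined for.
print(type(caught).__name__, caught.args)
```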

To iterate over all generic errors produced from an import, pass ``False`` to ``raise_errors``::

    result = resource.import_data(dataset, raise_errors=False)
    for row in result.error_rows:
        print(f"--- row {row.number} ---")
        for error in row.errors:
            print(f"{error.error} ({error.row})")
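
To see the shape of ``error_rows`` without running a full import, here is a sketch of iterating generic errors using minimal stand-ins for the result classes (the real ``Error`` and ``ErrorRow`` live in ``import_export.results``):

```python
# Minimal stand-ins for import_export.results.Error / ErrorRow, used only
# to demonstrate iterating over generic errors.
class Error:
    def __init__(self, error, row=None, number=None):
        self.error = error
        self.row = row
        self.number = number

class ErrorRow:
    def __init__(self, number, errors):
        self.number = number  # the row number
        self.errors = errors  # list of Error instances for that row

error_rows = [
    ErrorRow(1, [Error("ConversionSyntax", row={"id": 1, "price": "1x"})]),
]
lines = []
for row in error_rows:
    lines.append(f"--- row {row.number} ---")
    for error in row.errors:
        lines.append(f"{error.error} ({error.row})")
print("\n".join(lines))
```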

Field level validation
----------------------

23 changes: 23 additions & 0 deletions docs/api_exceptions.rst
@@ -0,0 +1,23 @@
==========
Exceptions
==========

.. currentmodule:: import_export.exceptions

ImportExportError
-----------------

.. autoclass:: import_export.exceptions.ImportExportError
:members:

FieldError
----------

.. autoclass:: import_export.exceptions.FieldError
:members:

ImportError
-----------

.. autoclass:: import_export.exceptions.ImportError
:members:
1 change: 1 addition & 0 deletions docs/changelog.rst
@@ -13,6 +13,7 @@ Changelog
- Relocated admin integration section from advanced_usage.rst into new file (#1713)
- Fix slow export with ForeignKey id (#1717)
- Added customization of Admin UI import error messages (#1727)
- Improve output of error messages (#1729)

4.0.0-beta.2 (2023-12-09)
-------------------------
1 change: 1 addition & 0 deletions docs/index.rst
@@ -61,6 +61,7 @@ exporting data with included admin integration.
api_tmp_storages
api_results
api_forms
api_exceptions

.. toctree::
:maxdepth: 2
10 changes: 9 additions & 1 deletion docs/release_notes.rst
@@ -57,10 +57,18 @@ In v4, return values are rendered as strings by default (where applicable), with
Refer to the :doc:`documentation<api_widgets>` for more information.

Export field order
------------------

The ordering rules for exported fields have been standardized. See :ref:`documentation<field_ordering>`.

Error output
------------

If the ``raise_errors`` parameter of :meth:`~import_export.resources.Resource.import_data` is ``True``, then an instance
of :class:`~import_export.exceptions.ImportError` is raised. This exception wraps the underlying exception.

See `this PR <https://github.com/django-import-export/django-import-export/issues/1729>`_.

Deprecations
============

16 changes: 16 additions & 0 deletions import_export/exceptions.py
@@ -8,3 +8,19 @@ class FieldError(ImportExportError):
"""Raised when a field encounters an error."""

pass


class ImportError(ImportExportError):
def __init__(self, error, number=None, row=None):
"""A wrapper for errors thrown from the import process.
:param error: The underlying error that occurred.
:param number: The row number of the row containing the error (if obtainable).
:param row: The row containing the error (if obtainable).
"""
self.error = error
self.number = number
self.row = row

def __str__(self):
return f"{self.number}: {self.error} ({self.row})"
34 changes: 24 additions & 10 deletions import_export/resources.py
@@ -21,9 +21,8 @@
from django.utils.safestring import mark_safe
from django.utils.translation import gettext_lazy as _

from . import widgets
from . import exceptions, widgets
from .declarative import DeclarativeMetaclass, ModelDeclarativeMetaclass
from .exceptions import FieldError
from .fields import Field
from .results import Error, Result, RowResult
from .utils import atomic_if_using_transaction, get_related_model
@@ -659,9 +658,11 @@ def handle_import_error(self, result, error, raise_errors=False):
logger.debug(error, exc_info=error)
if result:
tb_info = traceback.format_exc()
result.append_base_error(self.get_error_result_class()(error, tb_info))
result.append_base_error(
self.get_error_result_class()(error, traceback=tb_info)
)
if raise_errors:
raise
raise exceptions.ImportError(error)

def import_row(self, row, instance_loader, **kwargs):
r"""
@@ -762,7 +763,11 @@ def import_row(self, row, instance_loader, **kwargs):
if not isinstance(e, TransactionManagementError):
logger.debug(e, exc_info=e)
tb_info = traceback.format_exc()
row_result.errors.append(self.get_error_result_class()(e, tb_info, row))
row_result.errors.append(
self.get_error_result_class()(
e, traceback=tb_info, row=row, number=kwargs["row_number"]
)
)

return row_result

@@ -788,8 +793,12 @@ def import_data(
:param use_transactions: If ``True`` the import process will be processed
inside a transaction.
:param collect_failed_rows: If ``True`` the import process will collect
failed rows.
:param collect_failed_rows:
If ``True`` the import process will create a new dataset object comprising
failed rows and errors.
This can be useful for debugging purposes but will cause higher memory usage
for larger datasets.
See :attr:`~import_export.results.Result.failed_dataset`.
:param rollback_on_validation_errors: If both ``use_transactions`` and
``rollback_on_validation_errors`` are set to ``True``, the import process will
@@ -921,16 +930,21 @@ def import_data_inner(
result.increment_row_result_total(row_result)

if row_result.errors:
result.append_error_row(i, row, row_result.errors)
if collect_failed_rows:
result.append_failed_row(row, row_result.errors[0])
if raise_errors:
raise row_result.errors[-1].error
raise exceptions.ImportError(
row_result.errors[-1].error, number=i, row=row
)
elif row_result.validation_error:
result.append_invalid_row(i, row, row_result.validation_error)
if collect_failed_rows:
result.append_failed_row(row, row_result.validation_error)
if raise_errors:
raise row_result.validation_error
raise exceptions.ImportError(
row_result.validation_error, number=i, row=row
)
if (
row_result.import_type != RowResult.IMPORT_TYPE_SKIP
or self._meta.report_skipped
@@ -1100,7 +1114,7 @@ def _check_import_id_fields(self, headers):
missing_fields.append(col)

if missing_fields:
raise FieldError(
raise exceptions.FieldError(
_(
"The following import_id_fields are not present in the dataset: %s"
% ", ".join(missing_fields)
25 changes: 22 additions & 3 deletions import_export/results.py
@@ -6,10 +6,11 @@


class Error:
def __init__(self, error, traceback=None, row=None):
def __init__(self, error, traceback=None, row=None, number=None):
self.error = error
self.traceback = traceback
self.row = row
self.number = number


class RowResult:
@@ -148,13 +149,28 @@ def error_count(self):
return count


class ErrorRow:
"""A row that resulted in one or more errors being raised during import."""

def __init__(self, number, errors):
#: The row number
self.number = number
#: A list of errors associated with the row
self.errors = errors


class Result:
def __init__(self, *args, **kwargs):
super().__init__()
self.base_errors = []
self.diff_headers = []
self.rows = [] # RowResults
self.invalid_rows = [] # InvalidRow
#: The rows associated with the result.
self.rows = []
#: The collection of rows which had validation errors.
self.invalid_rows = []
#: The collection of rows which had generic errors.
self.error_rows = []
#: A custom Dataset containing only failed rows and associated errors.
self.failed_dataset = Dataset()
self.totals = OrderedDict(
[
@@ -197,6 +213,9 @@ def append_invalid_row(self, number, row, validation_error):
InvalidRow(number=number, validation_error=validation_error, values=values)
)

def append_error_row(self, number, row, errors):
self.error_rows.append(ErrorRow(number=number, errors=errors))

def increment_row_result_total(self, row_result):
if row_result.import_type:
self.totals[row_result.import_type] += 1
6 changes: 3 additions & 3 deletions tests/core/tests/test_resources/test_bulk_operations.py
@@ -6,7 +6,7 @@
from django.core.exceptions import ValidationError
from django.test import TestCase

from import_export import fields, resources, widgets
from import_export import exceptions, fields, resources, widgets
from import_export.instance_loaders import ModelInstanceLoader


@@ -246,7 +246,7 @@ class Meta:
batch_size = 100

resource = _BookResource()
with self.assertRaises(ValidationError):
with self.assertRaises(exceptions.ImportError):
resource.import_data(self.dataset, raise_errors=True)

@mock.patch("core.models.Book.objects.bulk_create")
@@ -457,7 +457,7 @@ class Meta:
use_bulk = True

resource = _BookResource()
with self.assertRaises(ValidationError) as raised_exc:
with self.assertRaises(exceptions.ImportError) as raised_exc:
resource.import_data(self.dataset, raise_errors=True)
self.assertEqual(e, raised_exc)
