We have been using DeepDiff to compare complex nested structures for a while now.
After investigating a CPU usage issue, we found that there is a `max_diffs` option, which sounds promising.
Our use case is to trigger actions whenever there is any diff at all, so `max_diffs=1` seems like the way to go.
We noticed, though, that we get spammed with this log.
Would it be possible to have a way to avoid this?
It could come from either:
- `max_diffs=1` itself, where there is a clear intent to exit early without detailed info about what has been going on
- a dedicated parameter (though that looks like overkill)
- raising an exception
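In the meantime, a possible stopgap for the log spam: DeepDiff emits its messages through Python's standard `logging` module, so the logger can be quieted from the application side. This is a sketch, assuming the message is logged at WARNING level from a logger under the `deepdiff` namespace:

```python
import logging

# Silence anything below ERROR from deepdiff's loggers (e.g. "deepdiff.diff");
# child loggers inherit the effective level from the "deepdiff" parent.
logging.getLogger("deepdiff").setLevel(logging.ERROR)
```

This suppresses the repeated max-diffs warning without touching the library, at the cost of also hiding any other deepdiff warnings.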
I have to admit the exception sounds appealing. Indeed, it would remove the need to systematically look at the stats to find out what happened, since, if I'm not mistaken, you get the same return value from a `DeepDiff(max_diffs=1)` whether there is a diff or not.
(The log in question is emitted at deepdiff/deepdiff/diff.py, line 1609, as of commit 69adaf1.)