After this change, the unique attributes must be unique for the entire MongoDB collection. Previously, they had to be unique only for the API resource. This becomes a problem when the datasource is configured with a filter and you have multiple resources sharing the same database collection.
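To make the scenario concrete, here is a minimal sketch of the kind of setup we mean (resource, collection and field names are made up for illustration): two resources backed by the same MongoDB collection, separated by a datasource filter, each declaring a unique field.

```python
# settings.py (Eve domain configuration) -- hypothetical names
DOMAIN = {
    'eu_customers': {
        'datasource': {
            'source': 'customers',        # shared MongoDB collection
            'filter': {'region': 'eu'},   # this resource only sees EU documents
        },
        'schema': {
            'code': {'type': 'string', 'unique': True},
            'region': {'type': 'string'},
        },
    },
    'us_customers': {
        'datasource': {
            'source': 'customers',        # same collection as above
            'filter': {'region': 'us'},
        },
        'schema': {
            'code': {'type': 'string', 'unique': True},
            'region': {'type': 'string'},
        },
    },
}
```

With the new behaviour, POSTing a us_customers document whose code already exists in an eu_customers document is rejected, because uniqueness is now checked against the whole customers collection instead of only the documents matching the resource filter.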
At first we thought this was just a bug. However, the mentioned commit introduces a comment indicating that the change is in fact intentional:
we perform the check on the native mongo driver (and not on
app.data.find_one()) because in this case we don't want the usual
(for eve) query injection to interfere with this validation. We
are still operating within eve's mongo namespace anyway.
However, I fail to understand its motivation. Can you elaborate?
We don't use the soft-delete feature ourselves, but looking at the code, it seems like it should also be affected (i.e. soft-deleted documents will be taken into account when validating the uniqueness of an attribute).
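For reference, this is the kind of setup we have in mind (again with made-up names). With soft delete enabled, a DELETE only flags the document, so it stays in the collection; if the uniqueness check goes through the raw driver, it would presumably still count against new documents.

```python
# settings.py -- hypothetical names; SOFT_DELETE keeps deleted documents
# in the collection with a `_deleted: True` flag instead of removing them.
SOFT_DELETE = True

DOMAIN = {
    'customers': {
        'schema': {
            'code': {'type': 'string', 'unique': True},
        },
    },
}

# After DELETE /customers/<id>, the document remains in MongoDB with
# `_deleted: True`; a later POST reusing the same `code` would presumably
# still be rejected, since the check bypasses Eve's query layer.
```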
Ok, so after reading the docs and the tests more carefully, I think I now understand the source of the conflict.
There are different and conflicting use cases for the datasource.filter feature:
a) Have N resources, where one of them is the canonical one and the rest are subsets of it. Think of them as named filters, or database views.
b) Have N resources, each of them canonical, that for some reason you want to keep in the same database collection. One possible reason could be that you want another (read-only) resource that retrieves the documents of all the other resources.
I think both use cases are valid, so I propose adding another unique_xxx validation rule to discriminate between them. As this change was released quite a while ago (we only just upgraded from 0.5.3, sorry!), I think unique should remain as it is now, and the new validation rule should be unique_within_resource. I've opened #1292 with the suggestion.
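Roughly, the idea would look something like this in a domain configuration (a sketch only: unique_within_resource does not exist in Eve today, it is just the rule proposed in #1292, and all resource/field names are made up):

```python
DOMAIN = {
    # Use case (a): a filtered "view" of a canonical resource. Collection-wide
    # uniqueness is what you want here, so the current `unique` behaviour fits.
    'archived_invoices': {
        'datasource': {'source': 'invoices', 'filter': {'archived': True}},
        'schema': {
            'number': {'type': 'string', 'unique': True},
        },
    },
    # Use case (b): independent resources that merely share a collection.
    # Uniqueness should be scoped by the resource filter, which is what the
    # proposed rule would provide.
    'eu_orders': {
        'datasource': {'source': 'orders', 'filter': {'region': 'eu'}},
        'schema': {
            'code': {'type': 'string', 'unique_within_resource': True},
        },
    },
    'us_orders': {
        'datasource': {'source': 'orders', 'filter': {'region': 'us'}},
        'schema': {
            'code': {'type': 'string', 'unique_within_resource': True},
        },
    },
}
```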