
Potential fix for #719, #1001: data loss when reordering items in upward direction #1002

Merged
merged 2 commits into develop from duplicate_ids on May 28, 2022

Conversation

olivierkes
Owner

@olivierkes commented Jan 21, 2022

I've run into this issue (#719, #1001) as well.

With that fix, I haven't been able to reproduce the bug.

Description of the fix

In my testing, it happens only when items have duplicate IDs (though I haven't been able to understand why):

WARNING> There are some items with overlapping IDs: ['5', '5']

And this happens at least if items are copied and pasted (or dragged and dropped with CTRL).

The issue comes from here:

# In case of copy actions, items might be duplicates, so we need new IDs.
# But they might not be, if we cut, then paste. Paste is a Copy Action.
# The first paste would not need new IDs. But subsequent ones will.
# Recursively change the existing IDs to new, unique values. No need to strip out the old
# even if they are not duplicated in pasting. There is no practical need for ID conservation.
if action == Qt.CopyAction:
    IDs = self.rootItem.listAllIDs()
    for item in items:
        if item.ID() in IDs:
            item.getUniqueID(recursive=True)

Since getUniqueID does nothing if the item has no model:

def getUniqueID(self, recursive=False):
    if not self._model:
        return
    self.setData(self.enum.ID, self._model.requestNewID())
    if recursive:
        for c in self.children():
            c.getUniqueID(recursive)

and at that moment in the drop, the items have no model attached.
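
To illustrate the failure mode, here is a minimal standalone repro (using a toy class, not manuskript's real outline item) of why the CopyAction branch above leaves duplicate IDs untouched:

class ToyItem:
    """Toy stand-in for an outline item; mimics only the behaviour quoted above."""
    def __init__(self, ID):
        self._ID = ID
        self._model = None              # during the drop, the item is detached

    def ID(self):
        return self._ID

    def getUniqueID(self, recursive=False):
        if not self._model:             # bails out silently, exactly as above
            return
        self._ID = "new-unique-id"      # (the real code asks the model for one)

item = ToyItem("5")                     # a pasted duplicate of item "5"
item.getUniqueID(recursive=True)        # no-op: no model attached yet
assert item.ID() == "5"                 # the overlapping ID survives the drop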

Well, the easier fix here is to simply remove the items' IDs, so they get new ones when they're created, which is what this commit does.
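
One way the quoted CopyAction branch could be rewritten along those lines is sketched below; this is a sketch based on the setData/enum interface quoted above, not necessarily the exact change in the commit:

if action == Qt.CopyAction:
    for item in items:
        # Clearing the ID, rather than calling getUniqueID (which silently
        # fails without a model), lets the model assign a fresh unique ID
        # once the item is actually inserted.
        item.setData(item.enum.ID, None)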

Further things to consider

If this effectively fixes the bug, then in my mind the WARNING> There are some items with overlapping IDs: warning should be treated more seriously. It should not be possible for IDs to be non-unique.

@olivierkes
Owner Author

After some sleep, I decided it will be much safer to fix duplicate IDs on project opening, if this is actually the cause of the data loss.
The trade-off is that some references might stop working, since IDs are used to reference a scene. But between data loss and the possibility of a few broken links for the subset of people using that feature, I think the choice is clear.
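
A possible shape for such a repair-on-open pass, assuming the ID()/getUniqueID()/children() interface from the snippets above (a sketch, not the merged code):

def fixDuplicateIDs(root):
    seen = set()
    def walk(item):
        _id = item.ID()
        if not _id or _id in seen:
            # At load time the item is attached to a model, so this works.
            item.getUniqueID()
        seen.add(item.ID())
        for child in item.children():
            walk(child)
    walk(root)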

Now, since I don't understand why duplicate IDs were the cause of the data loss, I think more testing is needed to see whether this really fixes #719 or not.

@TheJackiMonster
Collaborator

@olivierkes Do you think it would make sense to merge the changes from this PR to address some of the issues, or should we wait until it's fully done? I would like to publish a new release in the next few days.

@TheJackiMonster
Collaborator

"After some sleep, I decided it will be much safer to fix duplicate IDs on project opening"

I think we actually do this already in manuskript/models/abstractItem.py:

if max([self.IDs.count(i) for i in self.IDs if i]) != 1:
    LOGGER.warning("There are some items with overlapping IDs: %s",
                   [i for i in self.IDs if i and self.IDs.count(i) != 1])

def checkChildren(item):
    for c in item.children():
        _id = c.ID()
        if not _id or _id == "0":
            c.getUniqueID()
        checkChildren(c)

checkChildren(self)

So the question is: do we still have another cause for invalid/non-unique IDs besides local file changes? Otherwise, I think this PR could fix the issues with reordering via drag & drop quite well.

@TheJackiMonster TheJackiMonster merged commit e780963 into develop May 28, 2022
@TheJackiMonster TheJackiMonster deleted the duplicate_ids branch January 5, 2023 23:13