Transferid collisions #3522
Comments
That's only the case if https://github.com/owncloud/core/blob/c700f42b68f430e9c89ce97b92ea91323c9f6ed5/lib/private/filechunking.php#L151 is indeed used for that. No idea how that chunking really works… I looked at it soooooooo long ago 🙊
signature_split isn't used anywhere; it looks like very old code. The transfer id is actually parsed from the file name sent by the client: https://github.com/owncloud/core/blob/c700f42b68f430e9c89ce97b92ea91323c9f6ed5/lib/private/filechunking.php#L36
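To make "parsed from the file name" concrete, here is a minimal sketch of such a parser. It assumes a `<name>-chunking-<transferid>-<chunkcount>-<index>` naming scheme for chunked uploads; the function name and the exact scheme are illustrative, not the server's actual API.

```cpp
#include <cassert>
#include <string>

// Hypothetical sketch: pull the transfer id out of a chunked-upload
// file name, assuming the "<name>-chunking-<transferid>-<count>-<index>"
// scheme. Returns -1 if the name is not a chunked name.
long long parseTransferId(const std::string &name) {
    const std::string marker = "-chunking-";
    std::size_t pos = name.rfind(marker);
    if (pos == std::string::npos)
        return -1;                                  // plain (unchunked) name
    std::size_t start = pos + marker.size();
    std::size_t end = name.find('-', start);        // id ends at the next '-'
    return std::stoll(name.substr(start, end - start));
}
```

For example, `parseTransferId("photo.jpg-chunking-3657486019-5-2")` would yield the transfer id `3657486019` seen in the logs below.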
Grepping the sync logs, I saw that the transfer id 3657486019 appears in both of the following logs:
@guruz pointed to https://github.com/owncloud/client/blob/master/src/libsync/propagateupload.cpp#L256 and to Line 73 in 7fc7925
Here is my Qt version, in case it would affect the algorithm:
@guruz advised me to add this in cmd.php's main: `qsrand(QTime::currentTime().msec() * QCoreApplication::applicationPid());` Now when I run the smashbox test, the collision is gone and all tests pass!
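A sketch of why that one-liner helps, using plain `srand()`/`rand()` as a stand-in for Qt's `qsrand()`/`qrand()` (the function name is illustrative): a PRNG's output is fully determined by its seed, so two client processes seeded from the same clock value draw identical "random" transfer ids, while folding the pid into the seed makes simultaneous starts diverge.

```cpp
#include <cassert>
#include <cstdlib>

// Stand-in for qsrand()+qrand(): seed the PRNG, then draw the first
// value, as a transfer id would be drawn. Two processes calling this
// with the same seed (e.g. the same millisecond-of-the-second) get the
// exact same id; multiplying the pid into the seed breaks the tie.
unsigned firstIdWithSeed(unsigned seed) {
    std::srand(seed);
    return static_cast<unsigned>(std::rand());
}
```

So `firstIdWithSeed(msec)` collides across two clients started in the same millisecond, whereas `firstIdWithSeed(msec * pid)` generally does not, since the pids differ.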
With the 2.0 beta coming out next week (hopefully), I don't think it's needed :)
@guruz, if we don't, smashbox tests will keep randomly failing in CI as they do now. Unless we switch CI directly to 2.0 once it's out.
Cherry-picked into 1.8 |
Thanks, so I'll run CI with a 1.8 branch build
See owncloud/core#17956 (comment)
This was found by running "test_basicSync.py" on my environment (see initial steps).
It looks like two instances of the sync client from the test are uploading the same file at the same time. Since the start time is the same, the transfer id is the same.
I suspect that the transfer id is based on the timestamp.
@LukasReschke suggested using a secure random source instead.
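The suspected failure mode and the suggested direction can be sketched as follows. The names and derivations here are illustrative, not the client's actual code: if the transfer id is a function of the start timestamp alone, two clients beginning the upload at the same instant compute the same id, while a non-deterministic source (`std::random_device` below, standing in for whatever secure random source was meant) has no such shared input.

```cpp
#include <cassert>
#include <ctime>
#include <random>

// Suspected scheme: id derived only from the start time, so two
// processes starting in the same second collide. (Placeholder
// derivation; the real one is in propagateupload.cpp.)
unsigned idFromTimestamp(std::time_t start) {
    return static_cast<unsigned>(start);
}

// Suggested direction: draw the id from OS entropy per call, so
// simultaneous starts no longer share an input and collisions
// become vanishingly unlikely.
unsigned idFromRandomDevice() {
    std::random_device rd;      // non-deterministic source on most platforms
    return rd();
}
```

Note `std::random_device` may be deterministic on exotic platforms; in the Qt codebase the equivalent would be Qt's own random facilities rather than this stdlib stand-in.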
This was with:
@ogoffart @guruz