
Onedrive: Error on writing data in serve mode with webdav #3131

Closed
prodigy7 opened this issue Apr 24, 2019 · 17 comments

Comments

@prodigy7

What is the problem you are having with rclone?

Because my NAS (Synology) does not support OneDrive as a target for storing backups (HyperBackup), I want to use rclone as a "proxy" for storing the backup data, since the backup software does support WebDAV.
For that I start rclone serve webdav with:

rclone serve webdav --addr 127.0.0.1:8090 onedrive:

So far it works for a while, but then I get an error from rclone:

2019/04/24 21:16:08 ERROR : Hyper Backup/nas-h01.hbk/Pool/0/0/156.bucket.2: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/04/24 21:16:08 ERROR : Hyper Backup/nas-h01.hbk/Pool/0/0/156.bucket.2: WriteFileHandle.New Rcat failed: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.

What is your rclone version (output from rclone version)

rclone v1.46

  • os/arch: linux/amd64
  • go version: go1.11.5

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Synology OS (Linux 4.4.59+), but I can reproduce this on Debian 9/x64.

Which cloud storage system are you using? (eg Google Drive)

OneDrive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone serve webdav --addr 127.0.0.1:8090 onedrive:

A log from the command with the -vv flag (eg output from rclone -vv copy /tmp remote:tmp)

2019/04/24 21:16:08 ERROR : Hyper Backup/nas-h01.hbk/Pool/0/0/156.bucket.2: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/04/24 21:16:08 ERROR : Hyper Backup/nas-h01.hbk/Pool/0/0/156.bucket.2: WriteFileHandle.New Rcat failed: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
@ncw ncw added the bug label Apr 25, 2019
@ncw ncw added this to the v1.48 milestone Apr 25, 2019
@ncw
Member

ncw commented Apr 25, 2019

That looks like a bug! Whether it is in the webdav serving or in the onedrive backend, I'm not sure.

Can you run rclone serve webdav with -vv --dump responses --log-file /tmp/rclone.log and try to capture the REQUEST and the RESPONSE for the failing transaction? That should help pin it down.
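
For example, combining the original serve command with those flags:

rclone serve webdav --addr 127.0.0.1:8090 onedrive: -vv --dump responses --log-file /tmp/rclone.log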

@prodigy7
Author

I now have a log. Before I post it: is it necessary to anonymize anything? I don't think so, because I don't see any OAuth2 credentials in the log file.

@Cnly
Member

Cnly commented Apr 28, 2019

@prodigy7 By default rclone replaces the auth header in your requests with Authorization: XXXX so the header part should be safe. But for OAuth2 tokens, another place where they may get leaked is when e.g. rclone refreshes them. So you may want to look for responses for requests like POST /common/oauth2/v2.0/token and remove them accordingly.

Some other things that you might want to redact are your organization name and your email address that may be returned by the server. If you tried listing files, there may also be some direct download links (which should have expired given the time passed). And there will be your Drive ID, etc. which just look like some random strings and should be safe.

@prodigy7
Author

I think these last lines of the log file contain the relevant part which shows the error:

2019/04/27 07:35:42 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:35:42 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.bucket.2: Uploading segment 20971520/52434456 size 10485760
2019/04/27 07:35:42 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:35:42 DEBUG : HTTP REQUEST (req 0xc000120200)
2019/04/27 07:35:42 DEBUG : PUT /rup/xxxxxxxxxac590a1/xxxxxxxxxxxxxxxxxjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiMTYyMi5idWNrZXQuMiJ9/xxxxxxxxMixxrn_uxHGXPNGKKTOkLmsPwb-Os9wn-yGLBkcmgfg3O_FJhTvEFRDTDl_bnyv4lMriz0MTobjacsUQNN9TlFPeso84nmvK2ZlX0/xxxxxxxlIjoiMTYyMi5idWNrZXQuMiIsIkZpbGVTeXN0ZW1JbmZvIjp7IkNyZWF0ZWREYXRlVGltZSI6IjIwMTktMDQtMjdUMDU6MzU6MjQrMDA6MDAiLCJMYXN0TW9kaWZpZWREYXRlVGltZSI6IjIwMTktMDQtMjdUMDU6MzU6MjQrMDA6MDAifX0/4wzqkpUcWHg5wFZvyQrRZpWGH6KFHCobPXzJ1AzHn_k15urRYZWTJJu-xxxxxxxav6l9N__pagUPM_Fo6umx2iY6Zo3W3wtREX_h9ddvQciJUkGhH5bTetbyOT3dT2uggV8WzTSfdhx1Ok34_va3H0_MiifICJKgdNS7wBGsHMKIjXf15Vb46lTCMYrhTHuWFH2SBXeO6XmJLCRjg7W18y8THAEVgWmXLVG120TwhC90E44OWugqh2YSJo3wkdPV06E3XdNI0uX113L_VgfUjC2NVqCwKn2nMm35yyzQ8wxpofKSoGvprKfWXMwOR1Ezy7uaB6Ta1GQcVD9q33rf23AFL5OL882MkLti5cSdZKcO9DTyqPV6eeA8A9A0_3isoHw5mmoiIFOhv-vk__xxxxxxxxxxVGZnm9Nl8jGNvA5xs34rbhlcdVK7Zhjquci67sMa88O-VY8LGfjjA0NdRnzyquN8kdVd9WlV3KhDfH-xxxxxxx9s1z1my-xxxxxxxxxNljrQblACUa6r2K7hRV3UWtMDtIUcrLjq4ixmmMxv4muqxI1y2wPnpAdYlprK0fGpcyaaUE0pR4nVHY5Se300JFrH0ATM42SwLc0MHwwDw6QhuNx5dVySlIiZWBHp-m HTTP/1.1
2019/04/27 07:35:42 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:35:50 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:35:50 DEBUG : HTTP RESPONSE (req 0xc000120200)
2019/04/27 07:35:50 DEBUG : HTTP/1.1 202 Accepted
2019/04/27 07:35:50 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:35:50 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.bucket.2: Uploading segment 31457280/52434456 size 10485760
2019/04/27 07:35:50 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:35:50 DEBUG : HTTP REQUEST (req 0xc0001b0500)
2019/04/27 07:35:50 DEBUG : PUT /rup/xxxxxxxxxac590a1/xxxxxxxxxxxxxxxxxjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiMTYyMi5idWNrZXQuMiJ9/xxxxxxxxMixxrn_uxHGXPNGKKTOkLmsPwb-Os9wn-yGLBkcmgfg3O_FJhTvEFRDTDl_bnyv4lMriz0MTobjacsUQNN9TlFPeso84nmvK2ZlX0/xxxxxxxlIjoiMTYyMi5idWNrZXQuMiIsIkZpbGVTeXN0ZW1JbmZvIjp7IkNyZWF0ZWREYXRlVGltZSI6IjIwMTktMDQtMjdUMDU6MzU6MjQrMDA6MDAiLCJMYXN0TW9kaWZpZWREYXRlVGltZSI6IjIwMTktMDQtMjdUMDU6MzU6MjQrMDA6MDAifX0/4wzqkpUcWHg5wFZvyQrRZpWGH6KFHCobPXzJ1AzHn_k15urRYZWTJJu-xxxxxxxav6l9N__pagUPM_Fo6umx2iY6Zo3W3wtREX_h9ddvQciJUkGhH5bTetbyOT3dT2uggV8WzTSfdhx1Ok34_va3H0_MiifICJKgdNS7wBGsHMKIjXf15Vb46lTCMYrhTHuWFH2SBXeO6XmJLCRjg7W18y8THAEVgWmXLVG120TwhC90E44OWugqh2YSJo3wkdPV06E3XdNI0uX113L_VgfUjC2NVqCwKn2nMm35yyzQ8wxpofKSoGvprKfWXMwOR1Ezy7uaB6Ta1GQcVD9q33rf23AFL5OL882MkLti5cSdZKcO9DTyqPV6eeA8A9A0_3isoHw5mmoiIFOhv-vk__xxxxxxxxxxVGZnm9Nl8jGNvA5xs34rbhlcdVK7Zhjquci67sMa88O-VY8LGfjjA0NdRnzyquN8kdVd9WlV3KhDfH-xxxxxxx9s1z1my-xxxxxxxxxNljrQblACUa6r2K7hRV3UWtMDtIUcrLjq4ixmmMxv4muqxI1y2wPnpAdYlprK0fGpcyaaUE0pR4nVHY5Se300JFrH0ATM42SwLc0MHwwDw6QhuNx5dVySlIiZWBHp-m HTTP/1.1
2019/04/27 07:35:50 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:35:59 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:35:59 DEBUG : HTTP RESPONSE (req 0xc0001b0500)
2019/04/27 07:35:59 DEBUG : HTTP/1.1 202 Accepted
2019/04/27 07:35:59 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:35:59 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.bucket.2: Uploading segment 41943040/52434456 size 10485760
2019/04/27 07:35:59 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:35:59 DEBUG : HTTP REQUEST (req 0xc0001b0e00)
2019/04/27 07:35:59 DEBUG : PUT /rup/xxxxxxxxxac590a1/xxxxxxxxxxxxxxxxxjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiMTYyMi5idWNrZXQuMiJ9/xxxxxxxxMixxrn_uxHGXPNGKKTOkLmsPwb-Os9wn-yGLBkcmgfg3O_FJhTvEFRDTDl_bnyv4lMriz0MTobjacsUQNN9TlFPeso84nmvK2ZlX0/xxxxxxxlIjoiMTYyMi5idWNrZXQuMiIsIkZpbGVTeXN0ZW1JbmZvIjp7IkNyZWF0ZWREYXRlVGltZSI6IjIwMTktMDQtMjdUMDU6MzU6MjQrMDA6MDAiLCJMYXN0TW9kaWZpZWREYXRlVGltZSI6IjIwMTktMDQtMjdUMDU6MzU6MjQrMDA6MDAifX0/4wzqkpUcWHg5wFZvyQrRZpWGH6KFHCobPXzJ1AzHn_k15urRYZWTJJu-xxxxxxxav6l9N__pagUPM_Fo6umx2iY6Zo3W3wtREX_h9ddvQciJUkGhH5bTetbyOT3dT2uggV8WzTSfdhx1Ok34_va3H0_MiifICJKgdNS7wBGsHMKIjXf15Vb46lTCMYrhTHuWFH2SBXeO6XmJLCRjg7W18y8THAEVgWmXLVG120TwhC90E44OWugqh2YSJo3wkdPV06E3XdNI0uX113L_VgfUjC2NVqCwKn2nMm35yyzQ8wxpofKSoGvprKfWXMwOR1Ezy7uaB6Ta1GQcVD9q33rf23AFL5OL882MkLti5cSdZKcO9DTyqPV6eeA8A9A0_3isoHw5mmoiIFOhv-vk__xxxxxxxxxxVGZnm9Nl8jGNvA5xs34rbhlcdVK7Zhjquci67sMa88O-VY8LGfjjA0NdRnzyquN8kdVd9WlV3KhDfH-xxxxxxx9s1z1my-xxxxxxxxxNljrQblACUa6r2K7hRV3UWtMDtIUcrLjq4ixmmMxv4muqxI1y2wPnpAdYlprK0fGpcyaaUE0pR4nVHY5Se300JFrH0ATM42SwLc0MHwwDw6QhuNx5dVySlIiZWBHp-m HTTP/1.1
2019/04/27 07:35:59 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:36:07 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:07 DEBUG : HTTP RESPONSE (req 0xc0001b0e00)
2019/04/27 07:36:07 DEBUG : HTTP/1.1 202 Accepted
2019/04/27 07:36:07 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:07 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.bucket.2: Uploading segment 52428800/52434456 size 5656
2019/04/27 07:36:07 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:36:07 DEBUG : HTTP REQUEST (req 0xc0001b0500)
2019/04/27 07:36:07 DEBUG : PUT /rup/xxxxxxxxxac590a1/xxxxxxxxxxxxxxxxxjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiMTYyMi5idWNrZXQuMiJ9/xxxxxxxxMixxrn_uxHGXPNGKKTOkLmsPwb-Os9wn-yGLBkcmgfg3O_FJhTvEFRDTDl_bnyv4lMriz0MTobjacsUQNN9TlFPeso84nmvK2ZlX0/xxxxxxxlIjoiMTYyMi5idWNrZXQuMiIsIkZpbGVTeXN0ZW1JbmZvIjp7IkNyZWF0ZWREYXRlVGltZSI6IjIwMTktMDQtMjdUMDU6MzU6MjQrMDA6MDAiLCJMYXN0TW9kaWZpZWREYXRlVGltZSI6IjIwMTktMDQtMjdUMDU6MzU6MjQrMDA6MDAifX0/4wzqkpUcWHg5wFZvyQrRZpWGH6KFHCobPXzJ1AzHn_k15urRYZWTJJu-xxxxxxxav6l9N__pagUPM_Fo6umx2iY6Zo3W3wtREX_h9ddvQciJUkGhH5bTetbyOT3dT2uggV8WzTSfdhx1Ok34_va3H0_MiifICJKgdNS7wBGsHMKIjXf15Vb46lTCMYrhTHuWFH2SBXeO6XmJLCRjg7W18y8THAEVgWmXLVG120TwhC90E44OWugqh2YSJo3wkdPV06E3XdNI0uX113L_VgfUjC2NVqCwKn2nMm35yyzQ8wxpofKSoGvprKfWXMwOR1Ezy7uaB6Ta1GQcVD9q33rf23AFL5OL882MkLti5cSdZKcO9DTyqPV6eeA8A9A0_3isoHw5mmoiIFOhv-vk__xxxxxxxxxxVGZnm9Nl8jGNvA5xs34rbhlcdVK7Zhjquci67sMa88O-VY8LGfjjA0NdRnzyquN8kdVd9WlV3KhDfH-xxxxxxx9s1z1my-xxxxxxxxxNljrQblACUa6r2K7hRV3UWtMDtIUcrLjq4ixmmMxv4muqxI1y2wPnpAdYlprK0fGpcyaaUE0pR4nVHY5Se300JFrH0ATM42SwLc0MHwwDw6QhuNx5dVySlIiZWBHp-m HTTP/1.1
2019/04/27 07:36:07 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:36:08 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:08 DEBUG : HTTP RESPONSE (req 0xc0001b0500)
2019/04/27 07:36:08 DEBUG : HTTP/1.1 201 Created
2019/04/27 07:36:08 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:08 INFO  : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.bucket.2: Copied (new)
2019/04/27 07:36:08 DEBUG : {Hyper Backup/nas-h01.hbk/Pool/0/0/1622.bucket.2}: ETag:
2019/04/27 07:36:08 DEBUG : {Hyper Backup/nas-h01.hbk/Pool/0/0/1622.bucket.2}: >ETag: etag="", err=not implemented
2019/04/27 07:36:08 INFO  : /Hyper Backup/nas-h01.hbk/Pool/0/0/1622.bucket.2: PUT from 127.0.0.1:49874
2019/04/27 07:36:08 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: OpenFile: flags=578, perm=-rw-rw-rw-
2019/04/27 07:36:08 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: OpenFile: flags=O_RDWR|O_CREATE|O_TRUNC, perm=-rw-rw-rw-
2019/04/27 07:36:08 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: Open: flags=O_RDWR|O_CREATE|O_TRUNC
2019/04/27 07:36:08 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: >Open: fd=Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2 (w), err=<nil>
2019/04/27 07:36:08 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: >OpenFile: fd=Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2 (w), err=<nil>
2019/04/27 07:36:08 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: >OpenFile: err = <nil>
2019/04/27 07:36:08 DEBUG : One drive root '': Target remote doesn't support streaming uploads, creating temporary local FS to spool file
2019/04/27 07:36:08 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: Size and modification time the same (differ by 0s, within tolerance 1s)
2019/04/27 07:36:08 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: Starting multipart upload
2019/04/27 07:36:08 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:36:08 DEBUG : HTTP REQUEST (req 0xc000120300)
2019/04/27 07:36:08 DEBUG : POST /v1.0/drives/xxxxxxxxxac590a1/items/xxxxxxxxxAC590A1!205:/1622.index.2:/createUploadSession HTTP/1.1
2019/04/27 07:36:08 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:36:08 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:08 DEBUG : HTTP RESPONSE (req 0xc000120300)
2019/04/27 07:36:08 DEBUG : HTTP/1.1 200 OK
2019/04/27 07:36:08 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:08 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: Uploading segment 0/207872 size 207872
2019/04/27 07:36:08 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:36:08 DEBUG : HTTP REQUEST (req 0xc002a32200)
2019/04/27 07:36:08 DEBUG : PUT /rup/xxxxxxxxxac590a1/xxxxxxxxxxxxxxxxxjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiMTYyMi5pbmRleC4yIn0/xxxxxxrf4I4FAGWeNOJGuYo9C3bDHSEXZGRZfhM4PETZcvpcRLk1X4d9gaKywTly-f2vaIhSPLiUnwpFxSWxPVPFTP6buEt3kDjll4_yARMA4/xxxxxxxlIjoiMTYyMi5pbmRleC4yIiwiRmlsZVN5c3RlbUluZm8iOnsiQ3JlYXRlZERhdGVUaW1lIjoiMjAxOS0wNC0yN1QwNTozNjowOCswMDowMCIsIkxhc3RNb2RpZmllZERhdGVUaW1lIjoiMjAxOS0wNC0yN1QwNTozNjowOCswMDowMCJ9fQ/4wjhBg9AZPWfsbRYwfymvn9yERSzgS3-wGJ0jwhv2LnhtK5kqy4WX3EfqtSxF1BA6RufH3h9aA5hXxylYzsGWAmH0Y76c6hTK60tbR4E-XBwiMH5pHdl7viGY-aTwasc9-fjnlgYcL0V8RUydF7eelFiDYYdsHpo2W27cE8l94VnATZs710RCLYMf_G54lrmhsT36Cy6J9LDQXdyBRQyLUYBHW144WD5vWFK2aqpAxhJk7ZDF-exUE3vhnFFoRcPUPbMsiwdvsLkGb6Cdo7AcPTj7zp1K2Hhf8958df25f4pQvU06Y7wpG1Vb0Z5GFc8KgwNXQrqCwmP4MRR6YX3fmW6s4RXib06jJAP7CALd5DklCMOT3q5ykQdUy2mvMTfPchq4BWxwYRiJ3eYNxML7OSAkeucHzNE-LpDzDd8vTOscX3K4KkvO81-HsV2j6vJCA7zrDcNLoJgx0j-0kDlEPtbDw8nk8UtixaaV1cr6mekF8t2hRrP3y5lwK__rI8Lih4kdnAgrFm3NVoEeowD2A0gOP54npHMviT3b-GySfEcoyGlW9GTWenpfL1zHXFXH3jC3o1WX2q-C-PUgRo_jwdKBQ_eMvSp8Q2QLQMbLK3p_cVgM1uU_AGVmm4cPgD8J1 HTTP/1.1
2019/04/27 07:36:08 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:36:27 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:27 DEBUG : HTTP RESPONSE (req 0xc002a32200)
2019/04/27 07:36:27 DEBUG : HTTP/1.1 400 Bad Request
2019/04/27 07:36:27 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:27 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: Cancelling multipart upload: itemNotFound: uploadSessionNotFound: Upload session not found
2019/04/27 07:36:27 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:36:27 DEBUG : HTTP REQUEST (req 0xc000120700)
2019/04/27 07:36:27 DEBUG : DELETE /rup/xxxxxxxxxac590a1/xxxxxxxxxxxxxxxxxjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiMTYyMi5pbmRleC4yIn0/xxxxxxrf4I4FAGWeNOJGuYo9C3bDHSEXZGRZfhM4PETZcvpcRLk1X4d9gaKywTly-f2vaIhSPLiUnwpFxSWxPVPFTP6buEt3kDjll4_yARMA4/xxxxxxxlIjoiMTYyMi5pbmRleC4yIiwiRmlsZVN5c3RlbUluZm8iOnsiQ3JlYXRlZERhdGVUaW1lIjoiMjAxOS0wNC0yN1QwNTozNjowOCswMDowMCIsIkxhc3RNb2RpZmllZERhdGVUaW1lIjoiMjAxOS0wNC0yN1QwNTozNjowOCswMDowMCJ9fQ/4wjhBg9AZPWfsbRYwfymvn9yERSzgS3-wGJ0jwhv2LnhtK5kqy4WX3EfqtSxF1BA6RufH3h9aA5hXxylYzsGWAmH0Y76c6hTK60tbR4E-XBwiMH5pHdl7viGY-aTwasc9-fjnlgYcL0V8RUydF7eelFiDYYdsHpo2W27cE8l94VnATZs710RCLYMf_G54lrmhsT36Cy6J9LDQXdyBRQyLUYBHW144WD5vWFK2aqpAxhJk7ZDF-exUE3vhnFFoRcPUPbMsiwdvsLkGb6Cdo7AcPTj7zp1K2Hhf8958df25f4pQvU06Y7wpG1Vb0Z5GFc8KgwNXQrqCwmP4MRR6YX3fmW6s4RXib06jJAP7CALd5DklCMOT3q5ykQdUy2mvMTfPchq4BWxwYRiJ3eYNxML7OSAkeucHzNE-LpDzDd8vTOscX3K4KkvO81-HsV2j6vJCA7zrDcNLoJgx0j-0kDlEPtbDw8nk8UtixaaV1cr6mekF8t2hRrP3y5lwK__rI8Lih4kdnAgrFm3NVoEeowD2A0gOP54npHMviT3b-GySfEcoyGlW9GTWenpfL1zHXFXH3jC3o1WX2q-C-PUgRo_jwdKBQ_eMvSp8Q2QLQMbLK3p_cVgM1uU_AGVmm4cPgD8J1 HTTP/1.1
2019/04/27 07:36:27 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:36:27 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:27 DEBUG : HTTP RESPONSE (req 0xc000120700)
2019/04/27 07:36:27 DEBUG : HTTP/1.1 204 No Content
2019/04/27 07:36:27 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:27 ERROR : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: Failed to copy: itemNotFound: uploadSessionNotFound: Upload session not found
2019/04/27 07:36:27 ERROR : Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: WriteFileHandle.New Rcat failed: itemNotFound: uploadSessionNotFound: Upload session not found
2019/04/27 07:36:27 INFO  : /Hyper Backup/nas-h01.hbk/Pool/0/0/1622.index.2: PUT from 127.0.0.1:49876
2019/04/27 07:36:27 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: Stat:
2019/04/27 07:36:27 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >Stat: fi={FileInfo:Hyper Backup/nas-h01.hbk/Pool/0/0/}, err = <nil>
2019/04/27 07:36:27 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=0, perm=----------
2019/04/27 07:36:27 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=O_RDONLY, perm=----------
2019/04/27 07:36:27 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: fd=Hyper Backup/nas-h01.hbk/Pool/0/0/ (r), err=<nil>
2019/04/27 07:36:27 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: err = <nil>
2019/04/27 07:36:27 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=0, perm=----------
2019/04/27 07:36:27 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=O_RDONLY, perm=----------
2019/04/27 07:36:27 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: fd=Hyper Backup/nas-h01.hbk/Pool/0/0/ (r), err=<nil>
2019/04/27 07:36:27 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: err = <nil>
2019/04/27 07:36:27 INFO  : /Hyper Backup/nas-h01.hbk/Pool/0/0: PROPFIND from 127.0.0.1:49878
2019/04/27 07:36:42 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.f8cdd654-db26-4c74-bfa8-6e4cacec7444: Stat:
2019/04/27 07:36:42 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.f8cdd654-db26-4c74-bfa8-6e4cacec7444: >Stat: fi={FileInfo:Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.f8cdd654-db26-4c74-bfa8-6e4cacec7444}, err = <nil>
2019/04/27 07:36:42 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.f8cdd654-db26-4c74-bfa8-6e4cacec7444: RemoveAll:
2019/04/27 07:36:42 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:36:42 DEBUG : HTTP REQUEST (req 0xc000b90300)
2019/04/27 07:36:42 DEBUG : DELETE /v1.0/drives/xxxxxxxxxac590a1/items/xxxxxxxxxAC590A1!6391 HTTP/1.1
2019/04/27 07:36:42 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/27 07:36:42 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:42 DEBUG : HTTP RESPONSE (req 0xc000b90300)
2019/04/27 07:36:42 DEBUG : HTTP/1.1 204 No Content
2019/04/27 07:36:42 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/27 07:36:42 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.f8cdd654-db26-4c74-bfa8-6e4cacec7444: >RemoveAll: err = <nil>
2019/04/27 07:36:42 INFO  : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.f8cdd654-db26-4c74-bfa8-6e4cacec7444: DELETE from 127.0.0.1:49880

It looks like rclone gets/uses an invalid upload handle?
And here is a log extract from this morning:

2019/04/28 07:01:34 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:01:34 DEBUG : HTTP RESPONSE (req 0xc0001b0500)
2019/04/28 07:01:34 DEBUG : HTTP/1.1 202 Accepted
2019/04/28 07:01:34 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:01:34 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/53.bucket.2: Uploading segment 41943040/52430908 size 10485760
2019/04/28 07:01:34 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:01:34 DEBUG : HTTP REQUEST (req 0xc0001b0e00)
2019/04/28 07:01:34 DEBUG : PUT /rup/xxxxxxc52ac590a1/xxxxxxxvdXJjZUlEIjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiNTMuYnVja2V0LjIifQ/xxxxxxxPkrZgkuSP_Orh388cZlqRdkMaB0KJXLUL9wW-adrZn1OKs7Sfi-xxxxxxx-S8aN8ksQhvTgC55zJGzCz-GcWwYwdSOnqTP_C-_VyyA/xxxxxxxlIjoiNTMuYnVja2V0LjIiLCJGaWxlU3lzdGVtSW5mbyI6eyJDcmVhdGVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDA1OjAwOjM1KzAwOjAwIiwiTGFzdE1vZGlmaWVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDA1OjAwOjM1KzAwOjAwIn19/xxxxxx_E0TA1M5T2dm2RI1T9LfNdDzOX8VJg1m4HWAaQ_rKCqA7WWMEGwZjADZwKlPNFDqCnZG4xjS6LvzvKSE2d-0Pj3mZBwcHDNSX2djzS_G1PuOzNC-x_C7vcW5hHr97yXT7TybUe6OD8S40Sde8NQpDuM4T-hgJ36i76wqh2NCb55Mz_hW2wKcgMv9skjOVEH1EeG8KFfU5-xxxxx9yPPPu7FW99yt_4TFy6H5D-iYyJjzUAXP1-We0HtMl8dtJtZP1uuXZRumK4-ozyulPkGqvRxhL1bwaFH-PCI5t7PAV9-jB1KI061I5gAGkrXkweTcYOb7Jh2pgqYfZpLRfrNPfOVnISTpxR1wHJf-jUY6MfAgA31xk6dgXNE6tiXsmXMSE1ZZ1c5CtaBvAc2EtvEbKKwpn4cGuY3-xxxxxplNkzlxXzNf2Qv0rZVXnyoYwYeB5DvdsF-9geTuU1XAlQ81k5NX12CiGfRIwjpdKzONsZZvmwnkjdIJaHfI1oZR4qxF3OV6JCWWFhGqYifkwIcyQEW7iQv79S1oqOAatU5qPnTmqyTYy6ejyuD_ERmb4ijTEGQVmRzYUstNL9fsdMMhJam3AFE0K2Gw2lGUclNxewDaJXrugsDkbApWpM5V HTTP/1.1
2019/04/28 07:01:34 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:05 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:05 DEBUG : HTTP RESPONSE (req 0xc0001b0e00)
2019/04/28 07:02:05 DEBUG : HTTP/1.1 500 Internal Server Error
2019/04/28 07:02:05 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:05 DEBUG : pacer: Rate limited, increasing sleep to 20ms
2019/04/28 07:02:05 DEBUG : pacer: low level retry 1/10 (error generalException: General Exception While Processing)
2019/04/28 07:02:05 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:05 DEBUG : HTTP REQUEST (req 0xc0001b0500)
2019/04/28 07:02:05 DEBUG : PUT /rup/xxxxxxc52ac590a1/xxxxxxxvdXJjZUlEIjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiNTMuYnVja2V0LjIifQ/xxxxxxxPkrZgkuSP_Orh388cZlqRdkMaB0KJXLUL9wW-adrZn1OKs7Sfi-xxxxxxx-S8aN8ksQhvTgC55zJGzCz-GcWwYwdSOnqTP_C-_VyyA/xxxxxxxlIjoiNTMuYnVja2V0LjIiLCJGaWxlU3lzdGVtSW5mbyI6eyJDcmVhdGVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDA1OjAwOjM1KzAwOjAwIiwiTGFzdE1vZGlmaWVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDA1OjAwOjM1KzAwOjAwIn19/xxxxxx_E0TA1M5T2dm2RI1T9LfNdDzOX8VJg1m4HWAaQ_rKCqA7WWMEGwZjADZwKlPNFDqCnZG4xjS6LvzvKSE2d-0Pj3mZBwcHDNSX2djzS_G1PuOzNC-x_C7vcW5hHr97yXT7TybUe6OD8S40Sde8NQpDuM4T-hgJ36i76wqh2NCb55Mz_hW2wKcgMv9skjOVEH1EeG8KFfU5-xxxxx9yPPPu7FW99yt_4TFy6H5D-iYyJjzUAXP1-We0HtMl8dtJtZP1uuXZRumK4-ozyulPkGqvRxhL1bwaFH-PCI5t7PAV9-jB1KI061I5gAGkrXkweTcYOb7Jh2pgqYfZpLRfrNPfOVnISTpxR1wHJf-jUY6MfAgA31xk6dgXNE6tiXsmXMSE1ZZ1c5CtaBvAc2EtvEbKKwpn4cGuY3-xxxxxplNkzlxXzNf2Qv0rZVXnyoYwYeB5DvdsF-9geTuU1XAlQ81k5NX12CiGfRIwjpdKzONsZZvmwnkjdIJaHfI1oZR4qxF3OV6JCWWFhGqYifkwIcyQEW7iQv79S1oqOAatU5qPnTmqyTYy6ejyuD_ERmb4ijTEGQVmRzYUstNL9fsdMMhJam3AFE0K2Gw2lGUclNxewDaJXrugsDkbApWpM5V HTTP/1.1
2019/04/28 07:02:05 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:19 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:19 DEBUG : HTTP RESPONSE (req 0xc0001b0500)
2019/04/28 07:02:19 DEBUG : HTTP/1.1 416 Requested Range Not Satisfiable
2019/04/28 07:02:19 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:19 DEBUG : pacer: Reducing sleep to 15ms
2019/04/28 07:02:19 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/53.bucket.2: Cancelling multipart upload: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/04/28 07:02:19 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:19 DEBUG : HTTP REQUEST (req 0xc000120200)
2019/04/28 07:02:19 DEBUG : DELETE /rup/xxxxxxc52ac590a1/xxxxxxxvdXJjZUlEIjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiNTMuYnVja2V0LjIifQ/xxxxxxxPkrZgkuSP_Orh388cZlqRdkMaB0KJXLUL9wW-adrZn1OKs7Sfi-xxxxxxx-S8aN8ksQhvTgC55zJGzCz-GcWwYwdSOnqTP_C-_VyyA/xxxxxxxlIjoiNTMuYnVja2V0LjIiLCJGaWxlU3lzdGVtSW5mbyI6eyJDcmVhdGVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDA1OjAwOjM1KzAwOjAwIiwiTGFzdE1vZGlmaWVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDA1OjAwOjM1KzAwOjAwIn19/xxxxxx_E0TA1M5T2dm2RI1T9LfNdDzOX8VJg1m4HWAaQ_rKCqA7WWMEGwZjADZwKlPNFDqCnZG4xjS6LvzvKSE2d-0Pj3mZBwcHDNSX2djzS_G1PuOzNC-x_C7vcW5hHr97yXT7TybUe6OD8S40Sde8NQpDuM4T-hgJ36i76wqh2NCb55Mz_hW2wKcgMv9skjOVEH1EeG8KFfU5-xxxxx9yPPPu7FW99yt_4TFy6H5D-iYyJjzUAXP1-We0HtMl8dtJtZP1uuXZRumK4-ozyulPkGqvRxhL1bwaFH-PCI5t7PAV9-jB1KI061I5gAGkrXkweTcYOb7Jh2pgqYfZpLRfrNPfOVnISTpxR1wHJf-jUY6MfAgA31xk6dgXNE6tiXsmXMSE1ZZ1c5CtaBvAc2EtvEbKKwpn4cGuY3-xxxxxplNkzlxXzNf2Qv0rZVXnyoYwYeB5DvdsF-9geTuU1XAlQ81k5NX12CiGfRIwjpdKzONsZZvmwnkjdIJaHfI1oZR4qxF3OV6JCWWFhGqYifkwIcyQEW7iQv79S1oqOAatU5qPnTmqyTYy6ejyuD_ERmb4ijTEGQVmRzYUstNL9fsdMMhJam3AFE0K2Gw2lGUclNxewDaJXrugsDkbApWpM5V HTTP/1.1
2019/04/28 07:02:19 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:19 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:19 DEBUG : HTTP RESPONSE (req 0xc000120200)
2019/04/28 07:02:19 DEBUG : HTTP/1.1 204 No Content
2019/04/28 07:02:19 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:19 DEBUG : pacer: Reducing sleep to 11.25ms
2019/04/28 07:02:19 ERROR : Hyper Backup/nas-h01.hbk/Pool/0/0/53.bucket.2: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/04/28 07:02:19 ERROR : Hyper Backup/nas-h01.hbk/Pool/0/0/53.bucket.2: WriteFileHandle.New Rcat failed: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/04/28 07:02:19 INFO  : /Hyper Backup/nas-h01.hbk/Pool/0/0/53.bucket.2: PUT from 127.0.0.1:35046
2019/04/28 07:02:19 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: Stat:
2019/04/28 07:02:19 DEBUG : : Re-reading directory (5m31.67589108s old)
2019/04/28 07:02:19 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:19 DEBUG : HTTP REQUEST (req 0xc000120700)
2019/04/28 07:02:19 DEBUG : GET /v1.0/drives/xxxxxxc52ac590a1/items/xxxxxxC52AC590A1!101/children?$top=1000 HTTP/1.1
2019/04/28 07:02:19 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:19 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:19 DEBUG : HTTP RESPONSE (req 0xc000120700)
2019/04/28 07:02:19 DEBUG : HTTP/1.1 200 OK
2019/04/28 07:02:19 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:19 DEBUG : pacer: Reducing sleep to 10ms
2019/04/28 07:02:19 DEBUG : Hyper Backup: Re-reading directory (5m31.436547037s old)
2019/04/28 07:02:19 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:19 DEBUG : HTTP REQUEST (req 0xc000120a00)
2019/04/28 07:02:19 DEBUG : GET /v1.0/drives/xxxxxxc52ac590a1/items/xxxxxxC52AC590A1!170/children?$top=1000 HTTP/1.1
2019/04/28 07:02:19 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:19 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:19 DEBUG : HTTP RESPONSE (req 0xc000120a00)
2019/04/28 07:02:19 DEBUG : HTTP/1.1 200 OK
2019/04/28 07:02:19 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:19 DEBUG : Hyper Backup/nas-h01.hbk: Re-reading directory (5m31.269447641s old)
2019/04/28 07:02:19 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:19 DEBUG : HTTP REQUEST (req 0xc0004fc200)
2019/04/28 07:02:19 DEBUG : GET /v1.0/drives/xxxxxxc52ac590a1/items/xxxxxxC52AC590A1!171/children?$top=1000 HTTP/1.1
2019/04/28 07:02:19 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:20 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:20 DEBUG : HTTP RESPONSE (req 0xc0004fc200)
2019/04/28 07:02:20 DEBUG : HTTP/1.1 200 OK
2019/04/28 07:02:20 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:20 DEBUG : Hyper Backup/nas-h01.hbk/Pool: Re-reading directory (5m30.926771143s old)
2019/04/28 07:02:20 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:20 DEBUG : HTTP REQUEST (req 0xc000318f00)
2019/04/28 07:02:20 DEBUG : GET /v1.0/drives/xxxxxxc52ac590a1/items/xxxxxxC52AC590A1!180/children?$top=1000 HTTP/1.1
2019/04/28 07:02:20 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:20 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:20 DEBUG : HTTP RESPONSE (req 0xc000318f00)
2019/04/28 07:02:20 DEBUG : HTTP/1.1 200 OK
2019/04/28 07:02:20 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:20 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0: Re-reading directory (5m30.943900578s old)
2019/04/28 07:02:20 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:20 DEBUG : HTTP REQUEST (req 0xc0004fc700)
2019/04/28 07:02:20 DEBUG : GET /v1.0/drives/xxxxxxc52ac590a1/items/xxxxxxC52AC590A1!204/children?$top=1000 HTTP/1.1
2019/04/28 07:02:20 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:20 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:20 DEBUG : HTTP RESPONSE (req 0xc0004fc700)
2019/04/28 07:02:20 DEBUG : HTTP/1.1 200 OK
2019/04/28 07:02:20 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:20 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >Stat: fi={FileInfo:Hyper Backup/nas-h01.hbk/Pool/0/0/}, err = <nil>
2019/04/28 07:02:20 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=0, perm=----------
2019/04/28 07:02:20 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=O_RDONLY, perm=----------
2019/04/28 07:02:20 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: fd=Hyper Backup/nas-h01.hbk/Pool/0/0/ (r), err=<nil>
2019/04/28 07:02:20 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: err = <nil>
2019/04/28 07:02:20 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=0, perm=----------
2019/04/28 07:02:20 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=O_RDONLY, perm=----------
2019/04/28 07:02:20 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: fd=Hyper Backup/nas-h01.hbk/Pool/0/0/ (r), err=<nil>
2019/04/28 07:02:20 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: err = <nil>
2019/04/28 07:02:20 INFO  : /Hyper Backup/nas-h01.hbk/Pool/0/0: PROPFIND from 127.0.0.1:35058
2019/04/28 07:02:27 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.d142eac1-35c9-4f3d-83a2-dd22ab2382ea: Stat:
2019/04/28 07:02:27 DEBUG : Hyper Backup/nas-h01.hbk/Control: Re-reading directory (20m20.690430088s old)
2019/04/28 07:02:27 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:27 DEBUG : HTTP REQUEST (req 0xc0004fc200)
2019/04/28 07:02:27 DEBUG : GET /v1.0/drives/xxxxxxc52ac590a1/items/xxxxxxC52AC590A1!190/children?$top=1000 HTTP/1.1
2019/04/28 07:02:27 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:28 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:28 DEBUG : HTTP RESPONSE (req 0xc0004fc200)
2019/04/28 07:02:28 DEBUG : HTTP/1.1 200 OK
2019/04/28 07:02:28 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:28 DEBUG : Hyper Backup/nas-h01.hbk/Control/lock: Re-reading directory (20m20.773180014s old)
2019/04/28 07:02:28 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:28 DEBUG : HTTP REQUEST (req 0xc0004fc900)
2019/04/28 07:02:28 DEBUG : GET /v1.0/drives/xxxxxxc52ac590a1/items/xxxxxxC52AC590A1!196/children?$top=1000 HTTP/1.1
2019/04/28 07:02:28 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:28 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:28 DEBUG : HTTP RESPONSE (req 0xc0004fc900)
2019/04/28 07:02:28 DEBUG : HTTP/1.1 200 OK
2019/04/28 07:02:28 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:28 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.d142eac1-35c9-4f3d-83a2-dd22ab2382ea: >Stat: fi={FileInfo:Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.d142eac1-35c9-4f3d-83a2-dd22ab2382ea}, err = <nil>
2019/04/28 07:02:28 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.d142eac1-35c9-4f3d-83a2-dd22ab2382ea: RemoveAll:
2019/04/28 07:02:28 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:28 DEBUG : HTTP REQUEST (req 0xc0004fcc00)
2019/04/28 07:02:28 DEBUG : DELETE /v1.0/drives/xxxxxxc52ac590a1/items/xxxxxxC52AC590A1!9637 HTTP/1.1
2019/04/28 07:02:28 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 07:02:28 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:28 DEBUG : HTTP RESPONSE (req 0xc0004fcc00)
2019/04/28 07:02:28 DEBUG : HTTP/1.1 204 No Content
2019/04/28 07:02:28 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 07:02:28 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.d142eac1-35c9-4f3d-83a2-dd22ab2382ea: >RemoveAll: err = <nil>
2019/04/28 07:02:28 INFO  : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.d142eac1-35c9-4f3d-83a2-dd22ab2382ea: DELETE from 127.0.0.1:35062

@Cnly
Member

Cnly commented Apr 28, 2019

Thanks! I took a look at your logs. My guess is that you're hitting some random bugs on the remote side. Can you retry these two actions with rclone v1.47? In that version a workaround was added which has been able to mitigate some similar problems. If you're still seeing them, feel free to post some new logs. :)

@prodigy7
Author

No problem, I updated to v1.47 and started a new try. I will report back! :-) Thanks until then!

@prodigy7
Author

Howdy! The next error appeared faster than expected ;-) Here is the new log:

2019/04/28 18:43:56 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:43:56 DEBUG : HTTP RESPONSE (req 0xc000186f00)
2019/04/28 18:43:56 DEBUG : HTTP/1.1 202 Accepted
2019/04/28 18:43:56 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:43:56 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/45.bucket.2: Uploading segment 31457280/52431708 size 10485760
2019/04/28 18:43:56 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:43:56 DEBUG : HTTP REQUEST (req 0xc0002f8d00)
2019/04/28 18:43:56 DEBUG : PUT /rup/xxxxxxc52ac590a1/xxxxxxxvdXJjZUlEIjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiNDUuYnVja2V0LjIifQ/xxxxxxxdP64YfdW-xxxxxxWCuOURPZivJBN-xxxxxx0mSdb3W5yA4Su9QApzA4DhxcR-xxxxxx-RGJevZyUC2hUSVj_F9G4xv1J-xxxxxxRmA/xxxxxxxxIjoiNDUuYnVja2V0LjIiLCJGaWxlU3lzdGVtSW5mbyI6eyJDcmVhdGVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIiwiTGFzdE1vZGlmaWVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIn19/xxxxxxxy4qBlqrOfr1ZmD3NPhfZo9DsGXTlmAdRtMv-xxxxxBs20AGs4rA6os6JtyTKRFLjmXKnc5cRg2vWjnLnEuWyT53gEJ2bd2Xg8PZ59A_kE7UeL7qVRpy1TdWOqpBQkPMJ0j6KEZ0KhHpx0BePZl6LfPRJeKRIIJ3uZZPpsDgEgew8ROrdWzm3-Q2-xxxxxxxxuRuhJy5fOe5DKTaMvfJEDta6OyodsBrb4DInTOjvTTxMzC_kQ5PS8umf0Ilm78OjnHev7LuwL2GcyLwu9EsDCItRbgHTePk0qVCMcuFVf5uwZWGGOhciw5bOgvOQ4JrMXgNqeZec8MeUn_NcMMAxPKgYjGJctWYn2Ga1BTJVWGqjGGG02Yb3BP6qx4rZVTzE8f0lRIWxBcfxkPNRLteU9NXBpXstmKN75v-xxxxxxQy8Ptb5K7cqk9vsyCAj3OKv3EaeigZCDAUXP9sCTh7OYPQWPYGUjc3yi7-xxxxxxkGofNuYEql0nHNUvxqNXGPbPLzqlv_NlLrbgNnDkbw7v6ixd1-ajHhZJcyIFipqow3_0GT7peVbExBfbBTJ-xxxxxxJTlfHXSxiEvr9yCypREBZ85c_5e981jQG3b8LzdusO8QilIq-hcLJd-p HTTP/1.1
2019/04/28 18:43:56 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:22 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:22 DEBUG : HTTP RESPONSE (req 0xc0002f8d00)
2019/04/28 18:44:22 DEBUG : HTTP/1.1 500 Internal Server Error
2019/04/28 18:44:22 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:22 DEBUG : pacer: low level retry 1/10 (error generalException: General Exception While Processing)
2019/04/28 18:44:22 DEBUG : pacer: Rate limited, increasing sleep to 20ms
2019/04/28 18:44:22 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:22 DEBUG : HTTP REQUEST (req 0xc0002f9100)
2019/04/28 18:44:22 DEBUG : PUT /rup/xxxxxxc52ac590a1/xxxxxxxvdXJjZUlEIjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiNDUuYnVja2V0LjIifQ/xxxxxxxdP64YfdW-xxxxxxWCuOURPZivJBN-xxxxxx0mSdb3W5yA4Su9QApzA4DhxcR-xxxxxx-RGJevZyUC2hUSVj_F9G4xv1J-xxxxxxRmA/xxxxxxxxIjoiNDUuYnVja2V0LjIiLCJGaWxlU3lzdGVtSW5mbyI6eyJDcmVhdGVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIiwiTGFzdE1vZGlmaWVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIn19/xxxxxxxy4qBlqrOfr1ZmD3NPhfZo9DsGXTlmAdRtMv-xxxxxBs20AGs4rA6os6JtyTKRFLjmXKnc5cRg2vWjnLnEuWyT53gEJ2bd2Xg8PZ59A_kE7UeL7qVRpy1TdWOqpBQkPMJ0j6KEZ0KhHpx0BePZl6LfPRJeKRIIJ3uZZPpsDgEgew8ROrdWzm3-Q2-xxxxxxxxuRuhJy5fOe5DKTaMvfJEDta6OyodsBrb4DInTOjvTTxMzC_kQ5PS8umf0Ilm78OjnHev7LuwL2GcyLwu9EsDCItRbgHTePk0qVCMcuFVf5uwZWGGOhciw5bOgvOQ4JrMXgNqeZec8MeUn_NcMMAxPKgYjGJctWYn2Ga1BTJVWGqjGGG02Yb3BP6qx4rZVTzE8f0lRIWxBcfxkPNRLteU9NXBpXstmKN75v-xxxxxxQy8Ptb5K7cqk9vsyCAj3OKv3EaeigZCDAUXP9sCTh7OYPQWPYGUjc3yi7-xxxxxxkGofNuYEql0nHNUvxqNXGPbPLzqlv_NlLrbgNnDkbw7v6ixd1-ajHhZJcyIFipqow3_0GT7peVbExBfbBTJ-xxxxxxJTlfHXSxiEvr9yCypREBZ85c_5e981jQG3b8LzdusO8QilIq-hcLJd-p HTTP/1.1
2019/04/28 18:44:22 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:30 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:30 DEBUG : HTTP RESPONSE (req 0xc0002f9100)
2019/04/28 18:44:30 DEBUG : HTTP/1.1 416 Requested Range Not Satisfiable
2019/04/28 18:44:30 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:30 DEBUG : pacer: Reducing sleep to 15ms
2019/04/28 18:44:30 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/45.bucket.2: Error encountered during upload: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/04/28 18:44:30 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/45.bucket.2: Cancelling multipart upload
2019/04/28 18:44:30 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:30 DEBUG : HTTP REQUEST (req 0xc000186700)
2019/04/28 18:44:30 DEBUG : DELETE /rup/xxxxxxc52ac590a1/xxxxxxxvdXJjZUlEIjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiNDUuYnVja2V0LjIifQ/xxxxxxxdP64YfdW-xxxxxxWCuOURPZivJBN-xxxxxx0mSdb3W5yA4Su9QApzA4DhxcR-xxxxxx-RGJevZyUC2hUSVj_F9G4xv1J-xxxxxxRmA/xxxxxxxxIjoiNDUuYnVja2V0LjIiLCJGaWxlU3lzdGVtSW5mbyI6eyJDcmVhdGVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIiwiTGFzdE1vZGlmaWVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIn19/xxxxxxxy4qBlqrOfr1ZmD3NPhfZo9DsGXTlmAdRtMv-xxxxxBs20AGs4rA6os6JtyTKRFLjmXKnc5cRg2vWjnLnEuWyT53gEJ2bd2Xg8PZ59A_kE7UeL7qVRpy1TdWOqpBQkPMJ0j6KEZ0KhHpx0BePZl6LfPRJeKRIIJ3uZZPpsDgEgew8ROrdWzm3-Q2-xxxxxxxxuRuhJy5fOe5DKTaMvfJEDta6OyodsBrb4DInTOjvTTxMzC_kQ5PS8umf0Ilm78OjnHev7LuwL2GcyLwu9EsDCItRbgHTePk0qVCMcuFVf5uwZWGGOhciw5bOgvOQ4JrMXgNqeZec8MeUn_NcMMAxPKgYjGJctWYn2Ga1BTJVWGqjGGG02Yb3BP6qx4rZVTzE8f0lRIWxBcfxkPNRLteU9NXBpXstmKN75v-xxxxxxQy8Ptb5K7cqk9vsyCAj3OKv3EaeigZCDAUXP9sCTh7OYPQWPYGUjc3yi7-xxxxxxkGofNuYEql0nHNUvxqNXGPbPLzqlv_NlLrbgNnDkbw7v6ixd1-ajHhZJcyIFipqow3_0GT7peVbExBfbBTJ-xxxxxxJTlfHXSxiEvr9yCypREBZ85c_5e981jQG3b8LzdusO8QilIq-hcLJd-p HTTP/1.1
2019/04/28 18:44:30 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:30 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:30 DEBUG : HTTP RESPONSE (req 0xc000186700)
2019/04/28 18:44:30 DEBUG : HTTP/1.1 204 No Content
2019/04/28 18:44:30 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:30 DEBUG : pacer: Reducing sleep to 11.25ms
2019/04/28 18:44:30 ERROR : Hyper Backup/nas-h01.hbk/Pool/0/0/45.bucket.2: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/04/28 18:44:30 ERROR : Hyper Backup/nas-h01.hbk/Pool/0/0/45.bucket.2: WriteFileHandle.New Rcat failed: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/04/28 18:44:30 INFO  : /Hyper Backup/nas-h01.hbk/Pool/0/0/45.bucket.2: PUT from 127.0.0.1:40444
2019/04/28 18:44:30 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: Stat:
2019/04/28 18:44:30 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >Stat: fi={FileInfo:Hyper Backup/nas-h01.hbk/Pool/0/0/}, err = <nil>
2019/04/28 18:44:30 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=0, perm=----------
2019/04/28 18:44:30 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=O_RDONLY, perm=----------
2019/04/28 18:44:30 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: fd=Hyper Backup/nas-h01.hbk/Pool/0/0/ (r), err=<nil>
2019/04/28 18:44:30 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: err = <nil>
2019/04/28 18:44:30 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=0, perm=----------
2019/04/28 18:44:30 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: OpenFile: flags=O_RDONLY, perm=----------
2019/04/28 18:44:30 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: fd=Hyper Backup/nas-h01.hbk/Pool/0/0/ (r), err=<nil>
2019/04/28 18:44:30 DEBUG : /Hyper Backup/nas-h01.hbk/Pool/0/0: >OpenFile: err = <nil>
2019/04/28 18:44:30 INFO  : /Hyper Backup/nas-h01.hbk/Pool/0/0: PROPFIND from 127.0.0.1:40456
2019/04/28 18:44:38 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.ff38b47a-ebf8-4a6f-abde-49adb23cc527: Stat:
2019/04/28 18:44:38 DEBUG : Hyper Backup/nas-h01.hbk/Control: Re-reading directory (6m29.374568253s old)
2019/04/28 18:44:38 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:38 DEBUG : HTTP REQUEST (req 0xc00010c300)
2019/04/28 18:44:38 DEBUG : GET /v1.0/drives/xxxxxxc52ac590a1/items/xxxxxxC52AC590A1!190/children?$top=1000 HTTP/1.1
2019/04/28 18:44:38 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:39 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:39 DEBUG : HTTP RESPONSE (req 0xc00010c300)
2019/04/28 18:44:39 DEBUG : HTTP/1.1 200 OK
2019/04/28 18:44:39 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:39 DEBUG : pacer: Reducing sleep to 10ms
2019/04/28 18:44:39 DEBUG : Hyper Backup/nas-h01.hbk/Control/lock: Re-reading directory (6m29.421597188s old)
2019/04/28 18:44:39 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:39 DEBUG : HTTP REQUEST (req 0xc000690000)
2019/04/28 18:44:39 DEBUG : GET /v1.0/drives/xxxxxxc52ac590a1/items/xxxxxxC52AC590A1!196/children?$top=1000 HTTP/1.1
2019/04/28 18:44:39 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:39 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:39 DEBUG : HTTP RESPONSE (req 0xc000690000)
2019/04/28 18:44:39 DEBUG : HTTP/1.1 200 OK
2019/04/28 18:44:39 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:39 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.ff38b47a-ebf8-4a6f-abde-49adb23cc527: >Stat: fi={FileInfo:Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.ff38b47a-ebf8-4a6f-abde-49adb23cc527}, err = <nil>
2019/04/28 18:44:39 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.ff38b47a-ebf8-4a6f-abde-49adb23cc527: RemoveAll:
2019/04/28 18:44:39 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:39 DEBUG : HTTP REQUEST (req 0xc000690300)
2019/04/28 18:44:39 DEBUG : DELETE /v1.0/drives/xxxxxxc52ac590a1/items/xxxxxxC52AC590A1!9743 HTTP/1.1
2019/04/28 18:44:39 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:39 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:39 DEBUG : HTTP RESPONSE (req 0xc000690300)
2019/04/28 18:44:39 DEBUG : HTTP/1.1 204 No Content
2019/04/28 18:44:39 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:39 DEBUG : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.ff38b47a-ebf8-4a6f-abde-49adb23cc527: >RemoveAll: err = <nil>
2019/04/28 18:44:39 INFO  : /Hyper Backup/nas-h01.hbk/Control/lock/lock_keep_alive.@writer_version_0.ff38b47a-ebf8-4a6f-abde-49adb23cc527: DELETE from 127.0.0.1:40460

@Cnly
Member

Cnly commented Apr 29, 2019

@prodigy7 Thanks. Does this happen if you don't use WebDAV and instead just rclone copy the file to the same remote location? And does it happen if you change the problematic file's name, or replace the contents of the file?

@prodigy7
Author

@Cnly Unfortunately I can't reproduce the scenario directly without the WebDAV frontend. The backup software I can use on my NAS only supports WebDAV as its "official frontend". All I do is schedule the backup software and it runs at a specific time. I don't rename/change or do anything else on the OneDrive storage.

@ncw
Member

ncw commented May 2, 2019

2019/04/28 18:44:22 DEBUG : HTTP REQUEST (req 0xc0002f9100)
2019/04/28 18:44:22 DEBUG : PUT /rup/xxxxxxc52ac590a1/xxxxxxxvdXJjZUlEIjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiNDUuYnVja2V0LjIifQ/xxxxxxxdP64YfdW-xxxxxxWCuOURPZivJBN-xxxxxx0mSdb3W5yA4Su9QApzA4DhxcR-xxxxxx-RGJevZyUC2hUSVj_F9G4xv1J-xxxxxxRmA/xxxxxxxxIjoiNDUuYnVja2V0LjIiLCJGaWxlU3lzdGVtSW5mbyI6eyJDcmVhdGVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIiwiTGFzdE1vZGlmaWVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIn19/xxxxxxxy4qBlqrOfr1ZmD3NPhfZo9DsGXTlmAdRtMv-xxxxxBs20AGs4rA6os6JtyTKRFLjmXKnc5cRg2vWjnLnEuWyT53gEJ2bd2Xg8PZ59A_kE7UeL7qVRpy1TdWOqpBQkPMJ0j6KEZ0KhHpx0BePZl6LfPRJeKRIIJ3uZZPpsDgEgew8ROrdWzm3-Q2-xxxxxxxxuRuhJy5fOe5DKTaMvfJEDta6OyodsBrb4DInTOjvTTxMzC_kQ5PS8umf0Ilm78OjnHev7LuwL2GcyLwu9EsDCItRbgHTePk0qVCMcuFVf5uwZWGGOhciw5bOgvOQ4JrMXgNqeZec8MeUn_NcMMAxPKgYjGJctWYn2Ga1BTJVWGqjGGG02Yb3BP6qx4rZVTzE8f0lRIWxBcfxkPNRLteU9NXBpXstmKN75v-xxxxxxQy8Ptb5K7cqk9vsyCAj3OKv3EaeigZCDAUXP9sCTh7OYPQWPYGUjc3yi7-xxxxxxkGofNuYEql0nHNUvxqNXGPbPLzqlv_NlLrbgNnDkbw7v6ixd1-ajHhZJcyIFipqow3_0GT7peVbExBfbBTJ-xxxxxxJTlfHXSxiEvr9yCypREBZ85c_5e981jQG3b8LzdusO8QilIq-hcLJd-p HTTP/1.1
2019/04/28 18:44:22 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:30 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:30 DEBUG : HTTP RESPONSE (req 0xc0002f9100)
2019/04/28 18:44:30 DEBUG : HTTP/1.1 416 Requested Range Not Satisfiable

This looks like the problem transaction...

Can you post more of the headers for the REQUEST? In particular Content-Range (or Range) and Content-Length

@prodigy7
Author

prodigy7 commented May 3, 2019

Does this contain enough information?

2019/04/28 18:43:47 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:43:47 DEBUG : HTTP RESPONSE (req 0xc000186600)
2019/04/28 18:43:47 DEBUG : HTTP/1.1 202 Accepted
Content-Type: application/json; charset=utf-8
Date: Sun, 28 Apr 2019 16:43:47 GMT
P3p: CP="BUS CUR CONo FIN IVDo ONL OUR PHY SAMo TELo"
Strict-Transport-Security: max-age=31536000; includeSubDomains
Www-Authenticate: Bearer realm="OneDriveAPI", error="invalid_token", error_description="Invalid auth token"
X-Asmversion: UNKNOWN; 19.237.422.2003
X-Msedge-Ref: Ref A: F50F27CFFECC43C98A1296CFBB238E1A Ref B: FRAEDGE0706 Ref C: 2019-04-28T16:43:39Z
X-Msnserver: SN3PPF9CF7F7B5D

{"expirationDateTime":"2019-05-05T16:43:30.667Z","nextExpectedRanges":["20971520-52431707"]}
2019/04/28 18:43:47 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:43:47 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/45.bucket.2: Uploading segment 20971520/52431708 size 10485760
2019/04/28 18:43:47 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:43:47 DEBUG : HTTP REQUEST (req 0xc000186f00)
2019/04/28 18:43:47 DEBUG : PUT /rup/XXXXXXXXXXXX90a1/XXXXXXXXXXXXZUlEIjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiNDUuYnVja2V0LjIifQ/XXXXXXXXXXXXfdW-FHdkP3WCuOURPZivJBN-VTtOeu0mSdb3W5yA4Su9QApzA4DhxcR-Cgs2sP-RGJevZyUC2hUSVj_F9G4xv1J-5tNnE5RmA/XXXXXXXXXXXXNDUuYnVja2V0LjIiLCJGaWxlU3lzdGVtSW5mbyI6eyJDcmVhdGVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIiwiTGFzdE1vZGlmaWVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIn19/XXXXXXXXXXXXqrOfr1ZmD3NPhfZo9DsGXTlmAdRtMv-yrzQWBs20AGs4rA6os6JtyTKRFLjmXKnc5cRg2vWjnLnEuWyT53gEJ2bd2Xg8PZ59A_kE7UeL7qVRpy1TdWOqpBQkPMJ0j6KEZ0KhHpx0BePZl6LfPRJeKRIIJ3uZZPpsDgEgew8ROrdWzm3-Q2-F2iGoV2euRuhJy5fOe5DKTaMvfJEDta6OyodsBrb4DInTOjvTTxMzC_kQ5PS8umf0Ilm78OjnHev7LuwL2GcyLwu9EsDCItRbgHTePk0qVCMcuFVf5uwZWGGOhciw5bOgvOQ4JrMXgNqeZec8MeUn_NcMMAxPKgYjGJctWYn2Ga1BTJVWGqjGGG02Yb3BP6qx4rZVTzE8f0lRIWxBcfxkPNRLteU9NXBpXstmKN75v-wLwRebQy8Ptb5K7cqk9vsyCAj3OKv3EaeigZCDAUXP9sCTh7OYPQWPYGUjc3yi7-LnLbILkGofNuYEql0nHNUvxqNXGPbPLzqlv_NlLrbgNnDkbw7v6ixd1-ajHhZJcyIFipqow3_0GT7peVbExBfbBTJ-ldxLD0JTlfHXSxiEvr9yCypREBZ85c_5e981jQG3b8LzdusO8QilIq-hcLJd-p HTTP/1.1
Host: api.onedrive.com
User-Agent: rclone/v1.47.0
Content-Length: 10485760
Authorization: XXXX
Content-Range: bytes 20971520-31457279/52431708
Accept-Encoding: gzip

2019/04/28 18:43:47 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:43:56 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:43:56 DEBUG : HTTP RESPONSE (req 0xc000186f00)
2019/04/28 18:43:56 DEBUG : HTTP/1.1 202 Accepted
Content-Type: application/json; charset=utf-8
Date: Sun, 28 Apr 2019 16:43:55 GMT
P3p: CP="BUS CUR CONo FIN IVDo ONL OUR PHY SAMo TELo"
Strict-Transport-Security: max-age=31536000; includeSubDomains
Www-Authenticate: Bearer realm="OneDriveAPI", error="invalid_token", error_description="Invalid auth token"
X-Asmversion: UNKNOWN; 19.237.422.2003
X-Msedge-Ref: Ref A: 46D6A84436554910BA45F616A6EBFBCC Ref B: FRAEDGE0706 Ref C: 2019-04-28T16:43:47Z
X-Msnserver: SN3PPF49E1332F9

{"expirationDateTime":"2019-05-05T16:43:30.667Z","nextExpectedRanges":["31457280-52431707"]}
2019/04/28 18:43:56 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:43:56 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/45.bucket.2: Uploading segment 31457280/52431708 size 10485760
2019/04/28 18:43:56 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:43:56 DEBUG : HTTP REQUEST (req 0xc0002f8d00)
2019/04/28 18:43:56 DEBUG : PUT /rup/XXXXXXXXXXXX90a1/XXXXXXXXXXXXZUlEIjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiNDUuYnVja2V0LjIifQ/XXXXXXXXXXXXfdW-FHdkP3WCuOURPZivJBN-VTtOeu0mSdb3W5yA4Su9QApzA4DhxcR-Cgs2sP-RGJevZyUC2hUSVj_F9G4xv1J-5tNnE5RmA/XXXXXXXXXXXXNDUuYnVja2V0LjIiLCJGaWxlU3lzdGVtSW5mbyI6eyJDcmVhdGVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIiwiTGFzdE1vZGlmaWVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIn19/XXXXXXXXXXXXqrOfr1ZmD3NPhfZo9DsGXTlmAdRtMv-yrzQWBs20AGs4rA6os6JtyTKRFLjmXKnc5cRg2vWjnLnEuWyT53gEJ2bd2Xg8PZ59A_kE7UeL7qVRpy1TdWOqpBQkPMJ0j6KEZ0KhHpx0BePZl6LfPRJeKRIIJ3uZZPpsDgEgew8ROrdWzm3-Q2-F2iGoV2euRuhJy5fOe5DKTaMvfJEDta6OyodsBrb4DInTOjvTTxMzC_kQ5PS8umf0Ilm78OjnHev7LuwL2GcyLwu9EsDCItRbgHTePk0qVCMcuFVf5uwZWGGOhciw5bOgvOQ4JrMXgNqeZec8MeUn_NcMMAxPKgYjGJctWYn2Ga1BTJVWGqjGGG02Yb3BP6qx4rZVTzE8f0lRIWxBcfxkPNRLteU9NXBpXstmKN75v-wLwRebQy8Ptb5K7cqk9vsyCAj3OKv3EaeigZCDAUXP9sCTh7OYPQWPYGUjc3yi7-LnLbILkGofNuYEql0nHNUvxqNXGPbPLzqlv_NlLrbgNnDkbw7v6ixd1-ajHhZJcyIFipqow3_0GT7peVbExBfbBTJ-ldxLD0JTlfHXSxiEvr9yCypREBZ85c_5e981jQG3b8LzdusO8QilIq-hcLJd-p HTTP/1.1
Host: api.onedrive.com
User-Agent: rclone/v1.47.0
Content-Length: 10485760
Authorization: XXXX
Content-Range: bytes 31457280-41943039/52431708
Accept-Encoding: gzip

2019/04/28 18:43:56 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:22 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:22 DEBUG : HTTP RESPONSE (req 0xc0002f8d00)
2019/04/28 18:44:22 DEBUG : HTTP/1.1 500 Internal Server Error
Content-Type: application/json; charset=utf-8
Date: Sun, 28 Apr 2019 16:44:21 GMT
P3p: CP="BUS CUR CONo FIN IVDo ONL OUR PHY SAMo TELo"
Strict-Transport-Security: max-age=31536000; includeSubDomains
Www-Authenticate: Bearer realm="OneDriveAPI", error="invalid_token", error_description="Invalid auth token"
X-Asmversion: UNKNOWN; 19.237.422.2003
X-Msedge-Ref: Ref A: 0717DAA38D544264B9442CD06D53D933 Ref B: FRAEDGE0706 Ref C: 2019-04-28T16:43:56Z
X-Msnserver: SN3PPF99485B56C
X-Qosstats: {"ApiId":0,"ResultType":0,"SourcePropertyId":42,"TargetPropertyId":274}
X-Throwsite: 15bf.1333

{"error":{"code":"generalException","message":"General Exception While Processing"}}
2019/04/28 18:44:22 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:22 DEBUG : pacer: low level retry 1/10 (error generalException: General Exception While Processing)
2019/04/28 18:44:22 DEBUG : pacer: Rate limited, increasing sleep to 20ms
2019/04/28 18:44:22 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:22 DEBUG : HTTP REQUEST (req 0xc0002f9100)
2019/04/28 18:44:22 DEBUG : PUT /rup/XXXXXXXXXXXX90a1/XXXXXXXXXXXXZUlEIjoiN0VEMEE2QzUyQUM1OTBBMSEyMDUiLCJSZWxhdGlvbnNoaXBOYW1lIjoiNDUuYnVja2V0LjIifQ/XXXXXXXXXXXXfdW-FHdkP3WCuOURPZivJBN-VTtOeu0mSdb3W5yA4Su9QApzA4DhxcR-Cgs2sP-RGJevZyUC2hUSVj_F9G4xv1J-5tNnE5RmA/XXXXXXXXXXXXNDUuYnVja2V0LjIiLCJGaWxlU3lzdGVtSW5mbyI6eyJDcmVhdGVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIiwiTGFzdE1vZGlmaWVkRGF0ZVRpbWUiOiIyMDE5LTA0LTI4VDE2OjQzOjI5KzAwOjAwIn19/XXXXXXXXXXXXqrOfr1ZmD3NPhfZo9DsGXTlmAdRtMv-yrzQWBs20AGs4rA6os6JtyTKRFLjmXKnc5cRg2vWjnLnEuWyT53gEJ2bd2Xg8PZ59A_kE7UeL7qVRpy1TdWOqpBQkPMJ0j6KEZ0KhHpx0BePZl6LfPRJeKRIIJ3uZZPpsDgEgew8ROrdWzm3-Q2-F2iGoV2euRuhJy5fOe5DKTaMvfJEDta6OyodsBrb4DInTOjvTTxMzC_kQ5PS8umf0Ilm78OjnHev7LuwL2GcyLwu9EsDCItRbgHTePk0qVCMcuFVf5uwZWGGOhciw5bOgvOQ4JrMXgNqeZec8MeUn_NcMMAxPKgYjGJctWYn2Ga1BTJVWGqjGGG02Yb3BP6qx4rZVTzE8f0lRIWxBcfxkPNRLteU9NXBpXstmKN75v-wLwRebQy8Ptb5K7cqk9vsyCAj3OKv3EaeigZCDAUXP9sCTh7OYPQWPYGUjc3yi7-LnLbILkGofNuYEql0nHNUvxqNXGPbPLzqlv_NlLrbgNnDkbw7v6ixd1-ajHhZJcyIFipqow3_0GT7peVbExBfbBTJ-ldxLD0JTlfHXSxiEvr9yCypREBZ85c_5e981jQG3b8LzdusO8QilIq-hcLJd-p HTTP/1.1
Host: api.onedrive.com
User-Agent: rclone/v1.47.0
Content-Length: 10485760
Authorization: XXXX
Content-Range: bytes 31457280-41943039/52431708
Accept-Encoding: gzip

2019/04/28 18:44:22 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019/04/28 18:44:30 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:30 DEBUG : HTTP RESPONSE (req 0xc0002f9100)
2019/04/28 18:44:30 DEBUG : HTTP/1.1 416 Requested Range Not Satisfiable
Content-Type: application/json; charset=utf-8
Date: Sun, 28 Apr 2019 16:44:30 GMT
P3p: CP="BUS CUR CONo FIN IVDo ONL OUR PHY SAMo TELo"
Strict-Transport-Security: max-age=31536000; includeSubDomains
Www-Authenticate: Bearer realm="OneDriveAPI", error="invalid_token", error_description="Invalid auth token"
X-Asmversion: UNKNOWN; 19.237.422.2003
X-Msedge-Ref: Ref A: E0DE8C11E85847E4914E19F3B25767B4 Ref B: FRAEDGE0706 Ref C: 2019-04-28T16:44:22Z
X-Msnserver: SN3PPF2EDBAD472
X-Qosstats: {"ApiId":0,"ResultType":2,"SourcePropertyId":0,"TargetPropertyId":42}
X-Throwsite: 06c5.1f40

{"error":{"code":"invalidRange","message":"The uploaded fragment overlaps with data that has already been received.","innererror":{"code":"fragmentOverlap"}}}
2019/04/28 18:44:30 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019/04/28 18:44:30 DEBUG : pacer: Reducing sleep to 15ms
2019/04/28 18:44:30 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/45.bucket.2: Error encountered during upload: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/04/28 18:44:30 DEBUG : Hyper Backup/nas-h01.hbk/Pool/0/0/45.bucket.2: Cancelling multipart upload

@ncw
Member

ncw commented May 13, 2019

So it looks like what happened is that we tried to PUT a 10MB chunk and got a 500 error for some reason.

However, when we retried, we got a "416 Requested Range Not Satisfiable", which probably means the server had already received the chunk.

This is covered in the docs:

On failures when the client sent a fragment the server had already received, the server will respond with HTTP 416 Requested Range Not Satisfiable. You can request upload status to get a more detailed list of missing ranges.

So we could assume that, if we get a 416 on a retry, the server already has the chunk, and skip it.

A better approach might be to fetch the upload status to work out where we are. Our hands are somewhat tied here - if the server is missing data that isn't in the current chunk, we'll have to abort.

What do you think @Cnly ?

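For illustration only, here is a minimal Go sketch (not rclone's actual code) of the upload-status check being discussed, assuming the documented behaviour that a GET on the upload session URL returns a JSON body with a nextExpectedRanges field; the URL in main() is a hypothetical placeholder:

package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"strconv"
	"strings"
)

// uploadStatus models the status response of an upload session.
type uploadStatus struct {
	NextExpectedRanges []string `json:"nextExpectedRanges"`
}

// nextExpectedStart asks the upload session which byte it expects next, so
// the caller can tell whether a 416 just means "fragment already received".
func nextExpectedStart(client *http.Client, uploadURL string) (int64, error) {
	resp, err := client.Get(uploadURL)
	if err != nil {
		return 0, err
	}
	defer resp.Body.Close()
	var status uploadStatus
	if err := json.NewDecoder(resp.Body).Decode(&status); err != nil {
		return 0, err
	}
	if len(status.NextExpectedRanges) == 0 {
		return 0, fmt.Errorf("no expected ranges returned")
	}
	// Ranges look like "12345-" or "12345-55232"; the start of the first
	// one is the next byte the server is waiting for.
	start := strings.SplitN(status.NextExpectedRanges[0], "-", 2)[0]
	return strconv.ParseInt(start, 10, 64)
}

func main() {
	// Hypothetical session URL from the createUploadSession response.
	const uploadURL = "https://api.onedrive.com/rup/placeholder"
	pos, err := nextExpectedStart(http.DefaultClient, uploadURL)
	if err != nil {
		fmt.Println("status check failed:", err)
		return
	}
	fmt.Println("server expects next byte at offset", pos)
}

Comparing that offset with the end of the chunk that drew the 416 would tell us whether the fragment really was already received.
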
@ncw ncw modified the milestones: v1.48, v1.49 Jun 19, 2019
@ncw ncw modified the milestones: v1.49, v1.50 Aug 27, 2019
@ncw ncw modified the milestones: v1.50, Known Problem Nov 14, 2019
@ichtestemalwieder
Copy link

ichtestemalwieder commented Dec 25, 2019

Just to let you know, there are more people having this problem:

...
2019/12/25 01:08:17 ERROR : Bkomplett.tar.016: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/12/25 05:21:08 ERROR : Bkomplett.tar.024: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/12/25 05:30:11 ERROR : Bkomplett.tar.021: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/12/25 05:37:13 ERROR : Bkomplett.tar.022: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/12/25 05:57:03 ERROR : Bkomplett.tar.023: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment has already been received.
2019/12/25 07:31:45 ERROR : Bkomplett.tar.025: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/12/25 07:43:07 ERROR : Attempt 1/3 failed with 27 errors and: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/12/25 08:55:23 ERROR : Bkomplett.tar.008: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/12/25 09:57:36 ERROR : Bkomplett.tar.004: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/12/25 12:02:42 ERROR : Bkomplett.tar.010: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/12/25 13:08:59 ERROR : Bkomplett.tar.009: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/12/25 13:25:22 ERROR : Bkomplett.tar.016: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
2019/12/25 13:29:02 ERROR : Bkomplett.tar.021: Failed to copy: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
...

I am trying to upload around 200GB in files of around 8GB each to OneDrive (Private), and with some files I get the fragmentOverlap errors. I do not use webdav, but do a direct copy including remote encryption. Thanks.
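(For reference, the kind of transfer described here would look something like the command below; the path and remote name are placeholders, not the reporter's actual configuration.)

rclone copy -vv /volume1/backup onedrive-crypt:backup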

ncw added a commit that referenced this issue Jan 25, 2020
Before this change, when uploading multipart files, onedrive would
sometimes return an unexpected 416 error and rclone would abort the
transfer.

This is usually after a 500 error which caused rclone to do a retry.

This change checks the upload position on a 416 error, and if the
segment has already been uploaded correctly (the most likely scenario)
it will skip it and go on to the next one. In any other case rclone
will abort the transfer.

See: https://forum.rclone.org/t/fragment-overlap-error-with-encrypted-onedrive/14001

Fixes #3131
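As a hedged illustration of the decision described in that commit message (not the actual patch), the check boils down to something like this in Go, where nextExpected would come from the upload status query sketched earlier:

package main

import (
	"errors"
	"fmt"
)

// shouldSkipChunk reports whether a chunk ending at chunkEnd (exclusive)
// can be treated as already received after a 416 response.
func shouldSkipChunk(nextExpected, chunkEnd int64) (bool, error) {
	if nextExpected >= chunkEnd {
		// The server already has everything in this chunk - skip it.
		return true, nil
	}
	// The server is missing data somewhere at or before this chunk;
	// this simple version of the fix just aborts the transfer.
	return false, errors.New("upload position does not match, aborting transfer")
}

func main() {
	// Example: server expects byte 41943040, chunk ends at 41943040 - skip.
	fmt.Println(shouldSkipChunk(41943040, 41943040))
}
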
@ncw
Copy link
Member

ncw commented Jan 25, 2020

I had a go at fixing this properly - if you could run the transfer in debug (-vv) and watch out for "Received 416 error - checking position" logs, that would be helpful!

https://beta.rclone.org/branch/v1.50.2-191-gca8bd807-fix-3131-onedrive-416-beta/ (uploaded in 15-30 mins)

ncw added a commit that referenced this issue Jan 27, 2020
@ncw
Copy link
Member

ncw commented Jan 27, 2020

That had a bug in, here is v2

https://beta.rclone.org/branch/v1.50.2-195-gd4091ccf-fix-3131-onedrive-416-beta/ (uploaded in 15-30 mins)

ncw added a commit that referenced this issue Jan 28, 2020
Before this change, when uploading multipart files, onedrive would
sometimes return an unexpected 416 error and rclone would abort the
transfer.

This is usually after a 500 error which caused rclone to do a retry.

This change checks the upload position on a 416 error and works out how
much of the current chunk to skip, then retries (or skips) the current
chunk as appropriate.

If the position is before the current chunk or after the current chunk
then rclone will abort the transfer.

See: https://forum.rclone.org/t/fragment-overlap-error-with-encrypted-onedrive/14001

Fixes #3131
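Purely as a hedged sketch of the refined behaviour described above (not the actual rclone source), the offset arithmetic might look like this; the numbers in main() mirror the Content-Range header from the log earlier in this issue:

package main

import (
	"errors"
	"fmt"
)

// skipBytes returns how many bytes at the front of the chunk starting at
// chunkStart (with size chunkSize) can be skipped, given the offset the
// server reports as its next expected byte.
func skipBytes(nextExpected, chunkStart, chunkSize int64) (int64, error) {
	switch {
	case nextExpected < chunkStart:
		// The server is missing data before this chunk - give up.
		return 0, errors.New("position before current chunk, aborting")
	case nextExpected > chunkStart+chunkSize:
		// The server claims to be ahead of what we have sent - give up.
		return 0, errors.New("position after current chunk, aborting")
	default:
		// Skip the part the server already has (0 up to the whole chunk).
		return nextExpected - chunkStart, nil
	}
}

func main() {
	// Chunk "bytes 31457280-41943039" (10485760 bytes); the server already
	// has it all, so its next expected offset is 41943040.
	skip, err := skipBytes(41943040, 31457280, 10485760)
	fmt.Println(skip, err) // 10485760 <nil> - the whole chunk can be skipped
}
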
ncw added a commit that referenced this issue Jan 29, 2020
ncw added a commit that referenced this issue Jan 31, 2020
@ncw ncw closed this as completed in 3dfa63b Feb 1, 2020
@ichtestemalwieder
Copy link

@ncw, I just wanted to thank you so much for your great work! You make this world a better place! rclone is absolutely great. And I am so sorry that I did not yet have the time to test your release. At the time I got the error I was only testing rclone, and I deleted everything because OneDrive was too unreliable. But I plan to recreate the setup and test it in the next few days. Thanks again for your great work and support!

@ncw
Copy link
Member

ncw commented Feb 2, 2020

I've had a willing tester for this patch, and I'm confident it is fixed now. It missed 1.50.0 but I'll release it in 1.50.1 at some point.
