large filesize and nonexistent bucket cause retry upload loop and exit status of 0 #69

Closed
lexinator opened this Issue · 7 comments

7 participants

@lexinator

%s3cmd --version
s3cmd version 1.1.0-beta3

%ls -l /tmp/a.bz2
-rw-r--r-- 1 root root 356924 2012-07-12 18:15 /tmp/a.bz2
%ls -l /etc/hosts
-rw-r--r-- 1 root root 325 2012-04-13 23:50 /etc/hosts

%s3cmd put /tmp/a.bz2 s3://nonexistent-bucket/
WARNING: Module python-magic is not available. Guessing MIME types based on file extensions.
/tmp/a.bz2 -> s3://nonexistent-bucket/a.bz2 [1 of 1]
90112 of 356924 25% in 1s 49.39 kB/s failed
WARNING: Upload failed: /a.bz2 ([Errno 32] Broken pipe)
WARNING: Retrying on lower speed (throttle=0.00)
WARNING: Waiting 3 sec...
/tmp/a.bz2 -> s3://nonexistent-bucket/a.bz2 [1 of 1]
77824 of 356924 21% in 1s 45.42 kB/s failed
WARNING: Upload failed: /a.bz2 ([Errno 32] Broken pipe)
WARNING: Retrying on lower speed (throttle=0.01)
WARNING: Waiting 6 sec...
...
WARNING: Waiting 15 sec...
/tmp/a.bz2 -> s3://nonexistent-bucket/a.bz2 [1 of 1]
8192 of 356924 2% in 1s 5.87 kB/s failed
ERROR: Upload of '/tmp/a.bz2' failed too many times. Skipping that file.
%echo $?
0

%s3cmd put /etc/hosts s3://nonexistent-bucket/

WARNING: Module python-magic is not available. Guessing MIME types based on file extensions.
/etc/hosts -> s3://nonexistent-bucket/hosts [1 of 1]
325 of 325 100% in 0s 4.99 kB/s done
ERROR: S3 error: 404 (NoSuchBucket): The specified bucket does not exist
%echo $?
1

@phantomwhale

I'm also getting this. I've been trying to work out why we are getting "true" returns from failed S3 puts, and also why we were getting "false" results when testing locally (answer: we were testing with 4 kB files, but in production we are putting much larger files).

We use this command to push our database backups up to S3, and then delete the local backups from temporary storage when we do so. Not having a return value to let us know whether the upload worked is probably a bit of a show stopper!
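The workflow described above is a minimal sketch to illustrate: delete the local copy only when the upload exits non-zero on failure. Here `fake_put` is a stand-in for `s3cmd put` (the bucket and file names are hypothetical); with the bug in this issue, a failed large upload still returned 0, so a guard like this would have deleted a backup that never reached S3.

```shell
# fake_put stands in for "s3cmd put", simulating the failing upload
# reported in this issue; bucket and paths are hypothetical.
fake_put() { return 1; }

backup=$(mktemp)   # stands in for the local database backup
if fake_put "$backup" "s3://backup-bucket/"; then
  rm -f "$backup"                          # only remove after a confirmed upload
  result="uploaded; local copy removed"
else
  result="upload failed; keeping local copy"
fi
echo "$result"
```

With the buggy exit status of 0, the `then` branch would run and the local backup would be lost even though the upload failed.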

@arodrime

+1

@jcook793

+1 on the exit value. I'd even like an option when uploading multiple files to fail the whole operation if any file fails.

@dahugi

+1 on the exit value on retry upload loop

@hrchu

+1. I think the reason is that AWS S3 denies large file uploads via the put-object API.

@mdomsch
Owner

Using master branch HEAD 18caa04
$ ./s3cmd put TODO s3://nobucket.domsch.com/
TODO -> s3://nobucket.domsch.com/TODO [1 of 1]
2507 of 2507 100% in 0s 7.29 kB/s done
ERROR: S3 error: 404 (NoSuchBucket): The specified bucket does not exist

and it exits with an error.

The exit value problem is duplicated in issues #19, #219, and #65, so I'm going to close this one as a duplicate. Lots of folks want it fixed; few have tried to do so systematically.

@mdomsch mdomsch closed this
@mdomsch
Owner

master branch has a fix.

[mdomsch@pws490 s3cmd (upstream-master)]$ ./s3cmd put does-not-exist s3://s3cmd-test.domsch.com/
ERROR: Parameter problem: Nothing to upload.
[mdomsch@pws490 s3cmd (upstream-master)]$ echo $?
64
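A short sketch of how a wrapper script might interpret the fixed behavior. The value 64 matches `EX_USAGE` from `sysexits.h` (my reading of the output above, not something the fix commit is quoted as stating); `( exit 64 )` simulates the failing `s3cmd` invocation so the example is self-contained.

```shell
# Simulate the corrected behavior: a parameter problem exits with 64,
# as shown in the transcript above (64 is EX_USAGE in sysexits.h).
( exit 64 )        # stand-in for a failing "s3cmd put" invocation
status=$?
if [ "$status" -eq 0 ]; then
  echo "upload ok"
else
  echo "upload failed with status $status"
fi
```

A caller can now branch on `$?` reliably instead of assuming every run succeeded.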
