general: remove '$ ' from command line examples
bagder committed Feb 17, 2017
1 parent 0df0452 commit 2fb7b3c
Showing 6 changed files with 42 additions and 42 deletions.
20 changes: 10 additions & 10 deletions cmdline-globbing.md
@@ -19,50 +19,50 @@
You can ask for a numerical range with [N-M] syntax, where N is the start
index and it goes up to and including M. For example, you can ask for 100
images one by one that are named numerically:

-$ curl -O http://example.com/[1-100].png
+curl -O http://example.com/[1-100].png

and it can even do ranges with zero prefixes, for example when the number is
always three digits:

-$ curl -O http://example.com/[001-100].png
+curl -O http://example.com/[001-100].png

Or maybe you only want even-numbered images, so you tell curl a step counter
too. This example range goes from 0 to 100 with an increment of 2:

-$ curl -O http://example.com/[0-100:2].png
+curl -O http://example.com/[0-100:2].png

### Alphabetical ranges

curl can also do alphabetical ranges, like when a site has sections named a
to z:

-$ curl -O http://example.com/section[a-z].html
+curl -O http://example.com/section[a-z].html

### A list

Sometimes the parts don't follow such an easy pattern, and then you can
instead give the full list yourself, within curly braces instead of the
brackets used for ranges:

-$ curl -O http://example.com/{one,two,three,alpha,beta}.html
+curl -O http://example.com/{one,two,three,alpha,beta}.html

### Combinations

You can use several globs in the same URL, which makes curl iterate over all
of them. To download the images of Ben, Alice and Frank, in both the 100x100
and 1000x1000 resolutions, a command line could look like:

-$ curl -O http://example.com/{Ben,Alice,Frank}-{100x100,1000x1000}.jpg
+curl -O http://example.com/{Ben,Alice,Frank}-{100x100,1000x1000}.jpg

Or download all the images of a chess board, indexed by two coordinates
ranging from 0 to 7:

-$ curl -O http://example.com/chess-[0-7]x[0-7].jpg
+curl -O http://example.com/chess-[0-7]x[0-7].jpg

And you can, of course, mix ranges and series. Get a week's worth of logs for
both the web server and the mail server:

-$ curl -O http://example.com/{web,mail}-log[0-6].txt
+curl -O http://example.com/{web,mail}-log[0-6].txt

### Output variables for globbing

@@ -81,8 +81,8 @@
starts with 1 for the first glob and ends with the last glob.

Save the main pages of two different sites:

-$ curl http://{one,two}.example.com -o "file_#1.txt"
+curl http://{one,two}.example.com -o "file_#1.txt"

Save the outputs from a command line with two globs in a subdirectory:

-$ curl http://{site,host}.host[1-5].example.com -o "subdir/#1_#2"
+curl http://{site,host}.host[1-5].example.com -o "subdir/#1_#2"
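
A range glob can, for example, be combined with the same variable syntax to
name each saved file after its index (using a made-up `page_#1.html` pattern):

curl http://example.com/[1-5].html -o "page_#1.html"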
18 changes: 9 additions & 9 deletions cmdline-urls.md
@@ -81,7 +81,7 @@
The use of this syntax is usually frowned upon these days since you easily
leak this information in scripts or otherwise. For example, listing the
directory of an FTP server using a given name and password:

-$ curl ftp://user:password@example.com/
+curl ftp://user:password@example.com/

The presence of user name and password in the URL is completely optional. curl
also allows that information to be provided with normal command-line options,
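
For example, a command line that provides the same FTP credentials with
curl's `-u` option instead of embedding them in the URL could look like:

curl -u user:password ftp://example.com/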
@@ -93,12 +93,12 @@
The host name part of the URL is, of course, simply a name that can be resolved
to a numerical IP address, or the numerical address itself. When specifying a
numerical address, use the dotted version for IPv4 addresses:

-$ curl http://127.0.0.1/
+curl http://127.0.0.1/

…and for IPv6 addresses the numerical version needs to be within square
brackets:

-$ curl http://[::1]/
+curl http://[::1]/

When a host name is used, converting the name to an IP address is
typically done using the system's resolver functions. That normally lets a
@@ -111,15 +111,15 @@
specified port number is given. The optional port number can be provided
within the URL after the host name part, as a colon and the port number
written in decimal. For example, asking for an HTTP document on port 8080:

-$ curl http://example.com:8080/
+curl http://example.com:8080/

With the name specified as an IPv4 address:

-$ curl http://127.0.0.1:8080/
+curl http://127.0.0.1:8080/

With the name given as an IPv6 address:

-$ curl http://[fdea::1]:8080/
+curl http://[fdea::1]:8080/

### Path

@@ -130,13 +130,13 @@
requested or that will be provided.
The exact use of the path is protocol dependent. For example, getting a file
README from the default anonymous user from an FTP server:

-$ curl ftp://ftp.example.com/README
+curl ftp://ftp.example.com/README

For the protocols that have a directory concept, ending the URL with a
trailing slash means that it is a directory and not a file. Thus, asking for
a directory listing from an FTP server is done with such a trailing slash:

-$ curl ftp://ftp.example.com/tmp/
+curl ftp://ftp.example.com/tmp/

### FTP type

@@ -225,7 +225,7 @@
As an example, we do an HTTP GET to a URL and follow redirects; we then make a
second HTTP POST to a different URL and round it off with a HEAD request to a
third URL. All in a single command line:

-$ curl --location http://example.com/1 --next
+curl --location http://example.com/1 --next
--data sendthis http://example.com/2 --next
--head http://example.com/3

2 changes: 1 addition & 1 deletion ftp-twoconnections.md
@@ -23,7 +23,7 @@
exactly which address to use, just setting the same one you come from is
almost always the correct choice, and you do that with `-P -`, like this way
to ask for a file:

-$ curl -P - ftp://example.com/foobar.txt
+curl -P - ftp://example.com/foobar.txt

You can also explicitly ask curl to not use EPRT (which is a slightly newer
command than PORT) with the `--no-eprt` command-line option.
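
For example, asking for the same file while also forcing the older PORT
command instead of EPRT could look like:

curl --no-eprt -P - ftp://example.com/foobar.txt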
2 changes: 1 addition & 1 deletion http-multipart.md
@@ -97,7 +97,7 @@
stream.

To replace the header, use `-H` like this:

-$ curl -F 'name=Dan' -H 'Content-Type: multipart/magic' https://example.com
+curl -F 'name=Dan' -H 'Content-Type: multipart/magic' https://example.com

### Converting an HTML form

2 changes: 1 addition & 1 deletion http-post.md
@@ -35,7 +35,7 @@
If that header is not good enough for you, you should, of course, replace that
and instead provide the correct one, such as when you POST JSON to a server
and want to tell the server more accurately what the content is:

-$ curl -d '{json}' -H 'Content-Type: application/json' https://example.com
+curl -d '{json}' -H 'Content-Type: application/json' https://example.com

### POSTing binary

40 changes: 20 additions & 20 deletions usingcurl-downloads.md
@@ -13,7 +13,7 @@
You specify the resource to download by giving curl a URL. curl defaults to
downloading a URL unless told otherwise, and the URL identifies what to
download. In this example the URL to download is "http://example.com":

-$ curl http://example.com
+curl http://example.com

The URL is broken down into its individual components ([as explained
elsewhere](cmdline-urls.md)), the correct server is contacted and is then
@@ -47,14 +47,14 @@
just a file name, a relative path to a file name or a full path to the file.
Also note that you can put the `-o` before or after the URL; it makes no
difference:

-$ curl -o output.html http://example.com/
-$ curl -o /tmp/index.html http://example.com/
-$ curl http://example.com -o ../../folder/savethis.html
+curl -o output.html http://example.com/
+curl -o /tmp/index.html http://example.com/
+curl http://example.com -o ../../folder/savethis.html

This is, of course, not limited to http:// URLs but works the same way no matter
which type of URL you download:

-$ curl -o file.txt ftp://example.com/path/to/file-name.ext
+curl -o file.txt ftp://example.com/path/to/file-name.ext

curl has several other ways to store and name the downloaded data. Details
follow!
@@ -65,11 +65,11 @@
Many URLs, however, already contain the file name part in the rightmost
end. curl lets you use that as a shortcut so you don't have to repeat it with
`-o`. So instead of:

-$ curl -o file.html http://example.com/file.html
+curl -o file.html http://example.com/file.html

You can save the remote URL resource into the local file 'file.html' with this:

-$ curl -O http://example.com/file.html
+curl -O http://example.com/file.html

This is the `-O` (uppercase letter o) option, or `--remote-name` for the long
name version. The -O option selects the local file name to use by picking the
@@ -113,7 +113,7 @@
displays as expected. curl will then not translate the arriving data.
A common example where this causes some surprising results is when a user
downloads a web page with something like:

-$ curl https://example.com/ -o storage.html
+curl https://example.com/ -o storage.html

…and when inspecting the `storage.html` file after the fact, the user realizes
that one or more characters look funny or downright wrong. This can then very
@@ -138,7 +138,7 @@
uses and is the widespread and popular way to do it! The common way to
compress HTTP content is using the **Content-Encoding** header. You ask curl to
use this with the `--compressed` option:

-$ curl --compressed http://example.com/
+curl --compressed http://example.com/

With this option enabled (and if the server supports it) it delivers the data in
a compressed way and curl will decompress it before saving it or sending it to
@@ -151,7 +151,7 @@
method, which is the header that was created for this automated method but was
never really widely adopted. You can tell curl to ask for Transfer-Encoded
compression with `--tr-encoding`:

-$ curl --tr-encoding http://example.com/
+curl --tr-encoding http://example.com/

In theory, there's nothing that prevents you from using both in the same
command line, although in practice, you may then experience that some servers
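
A combined command line could, for example, look like this:

curl --compressed --tr-encoding http://example.com/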
@@ -166,21 +166,21 @@
abilities. In most Linux and Unix shells and with Windows' command prompts,
you direct stdout to a file with `> filename`. Using this, of course, makes the
use of -o or -O superfluous.

-$ curl http://example.com/ > example.html
+curl http://example.com/ > example.html

Redirecting output to a file redirects all output from curl to that file, so
even if you ask to transfer more than one URL to stdout, redirecting the output
will get all the URLs' output stored in that single file.

-$ curl http://example.com/1 http://example.com/2 > files
+curl http://example.com/1 http://example.com/2 > files

Unix shells usually allow you to redirect the *stderr* stream separately. The
stderr stream also gets shown in the terminal by default, but you can redirect
it independently of the stdout stream. The stdout stream is for the data,
while stderr carries metadata, errors and other things that aren't data. You can
redirect stderr with `2>file` like this:

-$ curl http://example.com > files.html 2>errors
+curl http://example.com > files.html 2>errors
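
You can also, for example, discard the metadata entirely by sending stderr to
the null device on Unix-like systems:

curl http://example.com > files.html 2>/dev/null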

### Multiple downloads

@@ -193,7 +193,7 @@
instruction". Without said "storage instruction", curl will default to sending
the data to stdout. If you ask for two URLs and only tell curl where to save
the first URL, the second one is sent to stdout. Like this:

-$ curl -o one.html http://example.com/1 http://example.com/2
+curl -o one.html http://example.com/1 http://example.com/2

The "storage instructions" are read and handled in the same order as the
download URLs so they don't have to be next to the URL in any way. You can
@@ -202,15 +202,15 @@
choose!

These examples all work the same way:

-$ curl -o 1.txt -o 2.txt http://example.com/1 http://example.com/2
-$ curl http://example.com/1 http://example.com/2 -o 1.txt -o 2.txt
-$ curl -o 1.txt http://example.com/1 http://example.com/2 -o 2.txt
-$ curl -o 1.txt http://example.com/1 -o 2.txt http://example.com/2
+curl -o 1.txt -o 2.txt http://example.com/1 http://example.com/2
+curl http://example.com/1 http://example.com/2 -o 1.txt -o 2.txt
+curl -o 1.txt http://example.com/1 http://example.com/2 -o 2.txt
+curl -o 1.txt http://example.com/1 -o 2.txt http://example.com/2

The `-O` is similarly just an instruction for a single download, so if you
download multiple URLs, use one `-O` per URL:

-$ curl -O -O http://example.com/1 http://example.com/2
+curl -O -O http://example.com/1 http://example.com/2
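
The two styles can, for example, also be mixed in the same command line (with
a made-up `first.html` file name):

curl -o first.html http://example.com/1 -O http://example.com/2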

### Use the URL's file name part for all URLs

@@ -319,7 +319,7 @@
M and G for kilobytes, megabytes and gigabytes.

To make curl not download data any faster than 200 kilobytes per second:

-$ curl https://example.com/ --limit-rate 200K
+curl https://example.com/ --limit-rate 200K
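
A megabyte-based limit could, for example, use the M suffix:

curl https://example.com/ --limit-rate 1M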

The given limit is the maximum *average speed* allowed, counted during the
entire transfer. It means that curl might use higher transfer speeds in short
