The rehan-wget module can install wget and retrieve files using it.
This module is a clone of maestrodev-wget without any legacy support for Puppet versions older than 4.0. It manages installation of wget and supports retrieval of files and directories from the Internet.
- Installs and manages the wget package
- Retrieves files and directories using wget
To install rehan-wget, run the following command:
$ puppet module install rehan-wget
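Alternatively, if you manage modules with a Puppetfile (for example with r10k or librarian-puppet), a minimal sketch of the entry would be:

mod 'rehan-wget'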
The module can be used with Hiera to provide all configuration options. See Usage for examples on how to configure it.
This module is designed to be clean and compliant with the latest Puppet coding guidelines.
A basic install with the defaults would be:
include wget
Or, using the parameters explicitly:
class { 'wget':
  package_manage => true,
  package_ensure => present,
  package_name   => 'wget',
}
- package_manage: Controls whether this module manages the wget package. The default is true. If it is false, this module will not manage wget.
- package_ensure: Sets the ensure parameter passed to the package. The default is present.
- package_name: Provides the name of the package to be installed. The default is wget. It can be used on systems where the package is named differently.
- retrievals: A hash of retrieve resources for this module to download; see wget::retrieve for more details and the sketch just after this list.
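For example, a minimal sketch passing retrievals directly as a class parameter (assuming it accepts the same hash structure as the Hiera data shown below):

class { 'wget':
  retrievals => {
    'http://www.google.com/index.html' => {
      'destination' => '/tmp/',
      'timeout'     => 0,
    },
  },
}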
All of this data can be provided through Hiera:
wget::package_manage: true
wget::package_ensure: present
wget::package_name: 'wget'
wget::retrievals:
  'http://www.google.com/index.html':
    destination: '/tmp/'
    timeout: 0
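With that data in place, a plain include is enough; the class is expected to create one wget::retrieve resource per entry in wget::retrievals (a sketch, based on the parameter description above):

include wget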
Usage:
wget::retrieve { "download Google's index":
source => 'http://www.google.com/index.html',
destination => '/tmp/',
timeout => 0,
verbose => false,
}
or alternatively:
wget::retrieve { 'http://www.google.com/index.html':
  destination => '/tmp/',
  timeout     => 0,
  verbose     => false,
}
If $destination ends in either a forward or backward slash, wget::retrieve treats it as a directory and names the file with the basename of $source.
wget::retrieve { 'http://mywebsite.com/apples':
  destination => '/downloads/',
}
Download from an array of URLs into one directory:
$manyfiles = [
  'http://mywebsite.com/apples',
  'http://mywebsite.com/oranges',
  'http://mywebsite.com/bananas',
]

wget::retrieve { $manyfiles:
  destination => '/downloads/',
}
This retrieves a document which requires authentication:
wget::retrieve { 'Retrieve secret PDF':
  source      => 'https://confidential.example.com/secret.pdf',
  destination => '/tmp/',
  user        => 'user',
  password    => 'p$ssw0rd',
  timeout     => 0,
  verbose     => false,
}
This caches the downloaded file in an intermediate directory to avoid repeatedly downloading it. It uses the timestamping (-N) and prefix (-P) wget options to re-download only if the source file has been updated.
wget::retrieve { 'https://tool.com/downloads/tool-1.0.tgz':
  destination => '/tmp/',
  cache_dir   => '/var/cache/wget',
}
It's assumed that the cached file is named after the basename of the source URL, but this assumption can break if wget follows redirects. In that case you must specify the correct filename of the cached file, like this:
wget::retrieve { 'https://tool.com/downloads/tool-latest.tgz':
  destination => '/tmp/tool-1.0.tgz',
  cache_dir   => '/var/cache/wget',
  cache_file  => 'tool-1.1.tgz',
  execuser    => 'fileowner',
  group       => 'filegroup',
}
A checksum can be supplied via the source_hash parameter as the MD5 sum of the content to be downloaded. If the destination file already exists but does not match the checksum, it is removed before downloading.
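A minimal sketch (the checksum below is only a placeholder; substitute the real MD5 of the content you expect):

wget::retrieve { 'http://www.google.com/index.html':
  destination => '/tmp/index.html',
  # Placeholder value; replace with the actual MD5 of the expected file.
  source_hash => 'd41d8cd98f00b204e9800998ecf8427e',
}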
You can also supply your own unless condition. This example downloads the latest version of WordPress to the destination folder only if that folder is empty (the test command used returns 1 if the directory is empty and 0 if not).
wget::retrieve { 'wordpress':
  source      => 'https://wordpress.org/latest.tar.gz',
  destination => '/var/www/html/latest_wordpress.tar.gz',
  timeout     => 0,
  unless      => 'test $(ls -A /var/www/html 2>/dev/null)',
}
You can submit pull requests and create issues through the module's official page on GitHub. Please report any bugs and suggest new features or improvements.