voxpupuli / puppet-wget

This project forked from bryanrossuk/puppet-wget


A Puppet recipe for wget, a useful tool to download arbitrary files from the web

License: Apache License 2.0


puppet-wget's Introduction


This module is declared as deprecated by Vox Pupuli.

The module puppet-archive is suggested as its replacement.

A Puppet module to download files with wget, supporting authentication. It was migrated from MaestroDev to Vox Pupuli.



Example

Install wget and fetch a file:

    include wget
    wget::fetch { "download Google's index":
      source      => 'http://www.google.com/index.html',
      destination => '/tmp/',
      timeout     => 0,
      verbose     => false,
    }

or alternatively:

    wget::fetch { 'http://www.google.com/index.html':
      destination => '/tmp/',
      timeout     => 0,
      verbose     => false,
    }

If $destination ends in either a forward or backward slash, it will treat the destination as a directory and name the file with the basename of the $source.

    wget::fetch { 'http://mywebsite.com/apples':
      destination => '/downloads/',
    }

Download from an array of URLs into one directory:

    $manyfiles = [
      'http://mywebsite.com/apples',
      'http://mywebsite.com/oranges',
      'http://mywebsite.com/bananas',
    ]

    wget::fetch { $manyfiles:
      destination => '/downloads/',
    }

This fetches a document which requires authentication:

    wget::fetch { 'Fetch secret PDF':
      source      => 'https://confidential.example.com/secret.pdf',
      destination => '/tmp/',
      user        => 'user',
      password    => 'p$ssw0rd',
      timeout     => 0,
      verbose     => false,
    }

This caches the downloaded file in an intermediate directory to avoid repeatedly downloading it. This uses the timestamping (-N) and prefix (-P) wget options to only re-download if the source file has been updated.

    wget::fetch { 'https://tool.com/downloads/tool-1.0.tgz':
      destination => '/tmp/',
      cache_dir   => '/var/cache/wget',
    }
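For reference, the underlying command composed for a cached fetch looks roughly like the following. This is a sketch only: the -N and -P flags are as described above, and the overall shape is inferred from the wget commands quoted in this module's issue logs; the URL and cache path are the example values from above, not real endpoints.

```shell
# Sketch of the wget invocation built when cache_dir is set (assumed
# shape; URL and path are the example values from the README above).
cache_dir='/var/cache/wget'
source_url='https://tool.com/downloads/tool-1.0.tgz'
# -N: only re-download when the remote file is newer than the cached copy
# -P: download into the cache directory instead of the current directory
cmd="wget --no-verbose -N -P '${cache_dir}' '${source_url}'"
echo "$cmd"
```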

It's assumed that the cached file is named after the basename of the source URL, but this assumption can break if wget follows redirects. In that case you must specify the correct filename in the cache, like this:

    wget::fetch { 'https://tool.com/downloads/tool-latest.tgz':
      destination => '/tmp/tool-1.0.tgz',
      cache_dir   => '/var/cache/wget',
      cache_file  => 'tool-1.1.tgz',
      execuser    => 'fileowner',
      group       => 'filegroup',
    }

A checksum can be given in the source_hash parameter as the MD5 sum of the content to be downloaded. If the destination file exists but does not match, it is removed before downloading.
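The verification step can be sketched in shell as below. The file name and content are illustrative only; the `echo 'HASH  FILE' | md5sum -c --quiet` shape matches the commands visible in this module's error logs further down this page.

```shell
# Compute the MD5 of a sample file, then verify it the way the module
# does: feed a "HASH  FILE" line to md5sum -c, which exits non-zero on
# mismatch. File name and content here are illustrative only.
printf 'example content\n' > /tmp/example.bin
hash=$(md5sum /tmp/example.bin | awk '{print $1}')
echo "${hash}  /tmp/example.bin" | md5sum -c --quiet && echo "checksum ok"
```

Note the two spaces between the hash and the filename, which md5sum's check format expects.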

If you want to use your own unless condition, you can. This example downloads the latest version of WordPress to the destination folder only if that folder is empty (the test used returns 1 if the directory is empty and 0 if not).

    wget::fetch { 'wordpress':
      source      => 'https://wordpress.org/latest.tar.gz',
      destination => '/var/www/html/latest_wordpress.tar.gz',
      timeout     => 0,
      unless      => "test $(ls -A /var/www/html 2>/dev/null)",
    }
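The behaviour of that unless test can be checked directly in a shell (using a throwaway temp directory rather than /var/www/html): `ls -A` prints nothing for an empty directory, so `test` receives no argument and exits 1, which makes the exec run; once the directory has content, `test` gets an argument and exits 0, so the fetch is skipped.

```shell
# Demonstrate the unless condition above with a throwaway directory.
dir=$(mktemp -d)
test $(ls -A "$dir" 2>/dev/null); empty_rc=$?       # no args -> exit 1 -> fetch runs
touch "$dir/latest_wordpress.tar.gz"
test $(ls -A "$dir" 2>/dev/null); populated_rc=$?   # one arg -> exit 0 -> fetch skipped
echo "empty=$empty_rc populated=$populated_rc"      # empty=1 populated=0
rm -rf "$dir"
```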

Building

Testing is done with rspec, beaker-rspec and Beaker.

To test and build the module:

bundle install
# run specs
rake

# run Beaker system tests with vagrant vms
rake beaker
# to use another vm from the list in spec/acceptance/nodesets and not destroy the vm after the tests
BEAKER_destroy=no BEAKER_set=centos-65-x64-docker bundle exec rake beaker

# Release the Puppet module to the Forge, doing a clean, build, tag, push, bump_commit and git push
rake module:release

License

Copyright 2011-2013 MaestroDev

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

puppet-wget's People

Contributors

acr31, adamcrews, andrewh, bastelfreak, brettporter, bryanrossuk, carlossg, dan33l, dcarley, dhoppe, drewhemm, duskglow, ekohl, ghoneycutt, gnustavo, igalic, illogicalimbecile, juanibiapina, kitplummer, llowder, luyseyal, mens, mirthy, robbat2, sray, stevesaliman, tosmi


puppet-wget's Issues

Error during execution when executing as "user" on CentOS (6.7)

When I apply the puppet configuration, I get this error during the execution:

Error: Could not prefetch package provider 'yum': The yum provider can only be used as root

If I execute with sudo, this error does not occur, but the files end up owned by 'root' and I need them owned by my own user.

Is there any way to bypass the yum installation?

Another issue: a file called "my.destination.file.wgetrc" is created, containing only the password for the URL. How can I prevent this file from being created?

feature: Do not manage wget package

I have a use case where I want to use this module as a non-root user. The yum provider cannot be used as a non-root user: "Error: Could not prefetch package provider 'yum': The yum provider can only be used as root."

Would you be open to a pull request adding a 'manage_package' parameter that allows package management to be disabled?

Warning: Unknown variable

Hello

Using the latest 1.7.3 version of the module with Puppet 4.10.4, I get these warnings:

==> test.vagrant.local: Warning: Unknown variable: '::http_proxy'. at /etc/puppetlabs/code/environments/production/modules/wget/manifests/fetch.pp:60:19
==> test.vagrant.local: Warning: Unknown variable: '::https_proxy'. at /etc/puppetlabs/code/environments/production/modules/wget/manifests/fetch.pp:65:20
==> test.vagrant.local: Warning: Unknown variable: 'schedule'. at /etc/puppetlabs/code/environments/production/modules/wget/manifests/fetch.pp:184:20

Best regards,

source_hash not working on Darwin

Hello,

I've found source_hash to not work on Darwin. The system-provided "md5" can't handle the way it's called, and the brew-package "md5sha1sum" provides a md5sum binary that doesn't understand "--quiet" and has a mandatory filename ("-" is okay) on "-c".


destination file's permission can't be edited

I am facing an issue giving permissions to the destination file when the cache dir is set, because the resource is already defined in the wget::fetch module. So the workaround is to set the destination to a temporary location and create a new resource sourcing it from the temporary location.

file { $destination:
  ensure  => file,
  source  => "${cache_dir}/${cache}",
  owner   => $execuser,
  require => Exec["wget-${name}"],
}

this is the code from the wget::fetch module.

Error: Invalid parameter unless on Wget::Fetch

Hi,

It seems that the Puppet Forge code is slightly out of date compared to GitHub. The 'unless' parameter is not available when using the module from the Forge, resulting in error:

Error: Invalid parameter unless on Wget::Fetch

Please could the forge module be updated?

Thanks.

Support Mac

It looks like there is some partial Darwin support but ultimately this does not work on a Mac...

Error: Could not find command 'wget'
Error: /Stage[main]/Maven::Maven/Wget::Fetch[fetch-maven]/Exec[wget-fetch-maven]/returns: change from notrun to 0 failed: Could not find command 'wget'

Could that be supported?

Errors when enabling strict_variables

This module seems to use three global variables which puppet 4 will complain about if strict_variables is true:

  • http_proxy
  • https_proxy
  • schedule (manifests/fetch.pp:99)

As a workaround I have to add the following to site.pp:

$http_proxy = undef
$https_proxy = undef
$schedule = undef

Related to #45

License File

Could you add a license file to the project please?

Thanks.

Wget package fails under Windows

Log below. The real problem is not installation itself, but the fact that the operatingsystem check just looks at "not Darwin". I think it makes sense to skip the package definition for Windows entirely for now, so the module functions work if wget is preinstalled.

Error: The source parameter is required when using the Windows provider.
Error: /Stage[main]/Wget/Package[wget]/ensure: change from absent to present failed: The source parameter is required when using the Windows provider.

Add support for headers or cookies

Some download URLs (such as Oracle JDK) require a cookie to confirm acceptance of their EULA.

The wget module should support either a hash of cookies, or better yet, a hash of headers to send along with the wget request.

If I get time, I'll do a pull request for it

No clear error message when destination is a directory

If you specify a directory rather than a specific file for the destination, the resource declaration for wget::fetch silently fails. My expectation was that it would save the file into the directory if I specify a directory, and to a filename if I specify a filename. Failing that, an error message explaining that this is not the expected behaviour would be great.

wget idempotency issue on Windows

  • Puppet: 5.5.6
  • OS: Windows 2016
  • Module version: 2.0.0

How to reproduce (e.g. Puppet code you use)

wget::fetch { 'package.msi':
  source             => "https://api-host.com/packages/package.msi",
  destination        => "C:\\Program Files\\package.msi",
  user               => $api_user,
  password           => $api_password,
  nocheckcertificate => true,
}

What are you seeing

The resource is executed on every Puppet run, even when package.msi is already in the correct directory.

What behaviour did you expect instead

Only execute the resource when the package is not in the directory.

Any additional information you'd like to impart

In the wget module, the problem seems to be in manifests/fetch.pp, in this code:

if ($::operatingsystem == 'windows') {
  $exec_path = $::path
  $unless_test = "cmd.exe /c \"dir ${_destination}\""
} else {
...
}

Suggestion

The problem seems to be related to the space in the path. One possible solution could be something like:

    $unless_test = "cmd /c IF exist \"${_destination}\" (exit 0) ELSE (exit 1)"

or simply allow the user to define their own unless command on Windows.

Last release fails on Debian Wheezy

Hi,
since the last update we face this issue

err: Could not retrieve catalog from remote server: Error 400 on SERVER: Failed to parse inline template: `@32bit_packages' is not allowed as an instance variable name at /etc/puppet/modules/wget/manifests/fetch.pp:36 on node

Archive this in favour of puppet/archive?

Hi!

we've got our archive module, which is perfectly suited for downloading files. I don't see any purpose in maintaining this module while we still have archive. Unless someone would like to keep this module, I'm going to archive it in the next few weeks. @voxpupuli/collaborators please provide feedback :)

::kernel is not defined

When the ::kernel fact is not defined, Package[wget] is not declared, and so the exec { "wget-${name}": resource in the wget::fetch define also fails.

Additionally Package[wget] is only set when $manage_package is set to true.

Not sure what a fix would look like; perhaps some extra logic to not depend on Package[wget] if the ::kernel fact isn't set or $manage_package is false. I also wonder what happens to the exec if the kernel is set to FreeBSD.

bad download (source_hash) left at destination

Affected Puppet, Ruby, OS and module versions/distributions

  • Puppet: 3.8.7
  • Ruby: 2.0.0
  • Distribution: rhel7
  • Module version: 1.7.3

How to reproduce (e.g. Puppet code you use)

wget::fetch { 'http://www.google.com/':
  destination => '/tmp/google',
}

wget::fetch { 'https://www.google.com/':
  destination => '/tmp/google_bad',
  source_hash => 0,
}

What are you seeing

$ ls /tmp/google*
/tmp/google  /tmp/google_bad

What behaviour did you expect instead

$ ls /tmp/google*
/tmp/google

Output log

Notice: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-source_hash-check-https://www.google.com/]/returns: executed successfully
Info: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-source_hash-check-https://www.google.com/]: Scheduling refresh of Exec[wget-https://www.google.com/]
Notice: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-https://www.google.com/]/returns: 2019-02-19 12:47:08 URL:https://www.google.com/ [11192] -> "/tmp/google_bad" [1]
Notice: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-https://www.google.com/]/returns: md5sum: standard input: no properly formatted MD5 checksum lines found
Error: wget --no-verbose --output-document="/tmp/google_bad" "https://www.google.com/" && echo '0  /tmp/google_bad' | md5sum -c --quiet returned 1 instead of one of [0]
Error: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-https://www.google.com/]/returns: change from notrun to 0 failed: wget --no-verbose --output-document="/tmp/google_bad" "https://www.google.com/" && echo '0  /tmp/google_bad' | md5sum -c --quiet returned 1 instead of one of [0]
Notice: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-https://www.google.com/]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Main/Wget::Fetch[http://www.google.com/]/Exec[wget-http://www.google.com/]/returns: executed successfully
Notice: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-source_hash-check-https://www.google.com/]/returns: executed successfully
Info: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-source_hash-check-https://www.google.com/]: Scheduling refresh of Exec[wget-https://www.google.com/]
Notice: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-https://www.google.com/]/returns: 2019-02-19 12:48:41 URL:https://www.google.com/ [11213] -> "/tmp/google_bad" [1]
Notice: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-https://www.google.com/]/returns: md5sum: standard input: no properly formatted MD5 checksum lines found
Error: wget --no-verbose --output-document="/tmp/google_bad" "https://www.google.com/" && echo '0  /tmp/google_bad' | md5sum -c --quiet returned 1 instead of one of [0]
Error: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-https://www.google.com/]/returns: change from notrun to 0 failed: wget --no-verbose --output-document="/tmp/google_bad" "https://www.google.com/" && echo '0  /tmp/google_bad' | md5sum -c --quiet returned 1 instead of one of [0]
Notice: /Stage[main]/Main/Wget::Fetch[https://www.google.com/]/Exec[wget-https://www.google.com/]: Triggered 'refresh' from 1 events

Any additional information you'd like to impart

The reason for specifying source_hash was to ensure the downloaded file was correct. This is especially important for executables. The current behaviour enforces the md5sum check before the download, but not after: a failed post-download check fails the Puppet run, yet leaves the bad file at the destination.
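The missing behaviour could be approximated with a post-download guard along these lines. This is a sketch only, not the module's code; the path and the deliberately invalid hash mirror the report above.

```shell
# Reproduce the report's failing check, then clean up on mismatch so a
# bad download is not left at the destination. "0" is deliberately an
# invalid MD5, as in the report; /tmp/google_bad is the example path.
printf 'payload\n' > /tmp/google_bad
expected='0'
if ! echo "${expected}  /tmp/google_bad" | md5sum -c --quiet 2>/dev/null; then
  rm -f /tmp/google_bad
  echo "removed bad download"
fi
```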

Do not download if wget::fetch { redownload => false }

Add an option to skip the download if some other condition is true. For example, if the downloaded file is a zip (or a zip of zips) that has already been unzipped into a directory, fetching it again from the internet is pointless.

Add the possibility to force or skip the download via a redownload variable (or another variable), based on a test other than whether the destination exists.

Does that seem reasonable to you?

Thanks

no way to disable file backup

Can you please support an option to turn file backup off.

I am trying to download a 390 MB file, and every time it changes the module tries to back it up to the filebucket and fails. I don't really care about the backup, because the file is being pulled from Artifactory. We are using r10k, so I would have to fork the module to make that work, and I don't really want to do that. It should be a simple matter of adding the backup option to the File resource and exposing it as a parameter.

Thanks. :)

Question: 'wget::fetch' syntax not downloading file

The following syntax from my install_drush.pp, is not downloading drush.phar:

...
## download drush: wget is a wrapper around the 'exec' function
#
#  @timeout, override the default 'exec' timeout
#  @verbose, verbose logging
wget::fetch {'download-drush':
  source      => "https://github.com/drush-ops/drush/releases/download/${drush_version}/drush.phar",
  destination => '/tmp/drush.phar',
  verbose     => false,
}
...

When I run a build via vagrant up, I get the following error traceback:

$ vagrant up
Bringing machine 'default' up with 'virtualbox' provider...
==> default: vagrant-r10k: Building the r10k module path with puppet provisioner module_path "puppet/environment/development/modules". (if module_path is an array, first element is used)
==> default: vagrant-r10k: Beginning r10k deploy of puppet modules into c:/path/to/drupal-demonstration/puppet/environment/development/modules using c:/path/to/drupal-demonstration/puppet/environment/development/Puppetfile
==> default: vagrant-r10k: Deploy finished
==> default: Importing base box 'puppetlabs/centos-7.2-64-puppet'...
==> default: Matching MAC address for NAT networking...
==> default: Checking if box 'puppetlabs/centos-7.2-64-puppet' is up to date...
==> default: Setting the name of the VM: drupal-demonstration_default_1451593369309_73348
==> default: vagrant-r10k: Building the r10k module path with puppet provisioner
 module_path "puppet/environment/development/modules". (if module_path is an array, first element is used)
==> default: vagrant-r10k: Beginning r10k deploy of puppet modules into c:/path/to/drupal-demonstration/puppet/environment/development/modules using c:/path/to/drupal-demonstration/puppet/environment/development/Puppetfile
==> default: vagrant-r10k: Deploy finished
==> default: Clearing any previously set network interfaces...
==> default: Preparing network interfaces based on configuration...
    default: Adapter 1: nat
==> default: Forwarding ports...
    default: 22 => 2222 (adapter 1)
==> default: Booting VM...
==> default: Waiting for machine to boot. This may take a few minutes...
    default: SSH address: 127.0.0.1:2222
    default: SSH username: vagrant
    default: SSH auth method: private key
    default: Warning: Connection timeout. Retrying...
    default: Warning: Remote connection disconnect. Retrying...
    default:
    default: Vagrant insecure key detected. Vagrant will automatically replace
    default: this with a newly generated keypair for better security.
    default:
    default: Inserting generated public key within guest...
    default: Removing insecure key from the guest if it's present...
    default: Key inserted! Disconnecting and reconnecting using new SSH key...
==> default: Machine booted and ready!
GuestAdditions versions on your host (5.0.12) and guest (4.3.22) do not match.
Loaded plugins: fastestmirror
Determining fastest mirrors
 * base: mirrors.greenmountainaccess.net
 * extras: ftp.linux.ncsu.edu
 * updates: mirror.solarvps.com
Package kernel-devel-3.10.0-327.el7.x86_64 already installed and latest version
Package gcc-4.8.5-4.el7.x86_64 already installed and latest version
Package 1:make-3.82-21.el7.x86_64 already installed and latest version
Package 4:perl-5.16.3-286.el7.x86_64 already installed and latest version
Nothing to do
Copy iso file C:\Program Files\Oracle\VirtualBox\VBoxGuestAdditions.iso into the
 box /tmp/VBoxGuestAdditions.iso
mount: /dev/loop0 is write-protected, mounting read-only
Installing Virtualbox Guest Additions 5.0.12 - guest version is 4.3.22
Verifying archive integrity... All good.
Uncompressing VirtualBox 5.0.12 Guest Additions for Linux............
VirtualBox Guest Additions installer
Removing installed version 4.3.22 of VirtualBox Guest Additions...
Copying additional installer modules ...
Installing additional modules ...
Removing existing VirtualBox non-DKMS kernel modules[  OK  ]
Building the VirtualBox Guest Additions kernel modules
Building the main Guest Additions module[  OK  ]
Building the shared folder support module[  OK  ]
Building the OpenGL support module[  OK  ]
Doing non-kernel setup of the Guest Additions[  OK  ]
You should restart your guest to make sure the new modules are actually used

Installing the Window System drivers
Could not find the X.Org or XFree86 Window System, skipping.
An error occurred during installation of VirtualBox Guest Additions 5.0.12. Some functionality may not work as intended.
In most cases it is OK that the "Window System drivers" installation failed.
Restarting VM to apply changes...
==> default: Attempting graceful shutdown of VM...
==> default: Booting VM...
==> default: Waiting for machine to boot. This may take a few minutes...
    default: SSH address: 127.0.0.1:2222
    default: SSH username: vagrant
    default: SSH auth method: private key
    default: Warning: Connection timeout. Retrying...
    default: Warning: Remote connection disconnect. Retrying...
==> default: Machine booted and ready!
==> default: Checking for guest additions in VM...
==> default: Setting hostname...
==> default: Mounting shared folders...
    default: /vagrant => c:/path/to/drupal-demonstration
    default: /tmp/vagrant-puppet/environments => c:/path/to/drupal-demonstration/puppet/environment
    default: /tmp/vagrant-puppet/modules-f71316dd467cca918424590c4186206a => c:/path/to/drupal-demonstration/puppet/environment/development/modules
    default: /tmp/vagrant-puppet/manifests-864bb2b87e0b8d63b5331c073d57a286 => c:/path/to/drupal-demonstration/puppet/environment/development/manifests
==> default: Running provisioner: puppet...
==> default: Running Puppet with environment development...
==> default: Notice: Compiled catalog for drupal-demonstration.com in environment development in 1.92 seconds
==> default: Notice: /Stage[main]/Mysql::Server::Install/Package[mysql-server]/ensure: created
==> default: Notice: /Stage[main]/Mysql::Server::Config/File[mysql-config-file]/content: content changed '{md5}54dc3e561e817f9c0a376a58383eb013' to '{md5}f3c1bf65999dea8a571555188e422e79'
==> default: Notice: /Stage[main]/Mysql::Server::Installdb/Exec[mysql_install_db]/returns: executed successfully
==> default: Notice: /Stage[main]/Mysql::Server::Service/Service[mysqld]/ensure: ensure changed 'stopped' to 'running'
==> default: Notice: /Stage[main]/Mysql::Server::Root_password/Mysql_user[root@localhost]/password_hash: defined 'password_hash' as '*2470C0C06DEE42FD1618BB99005ADCA2EC9D1E19'
==> default: Notice: /Stage[main]/Mysql::Server::Root_password/File[/root/.my.cnf]/ensure: defined content as '{md5}46bf5a6182b0abfe142e30fe85424afd'
==> default: Notice: /Stage[main]/Mysql::Server::Providers/Mysql_user[authenticated@localhost]/ensure: created
==> default: Notice: /Stage[main]/Mysql::Server::Providers/Mysql_user[provisioner@localhost]/ensure: created
==> default: Notice: /Stage[main]/Mysql::Server::Providers/Mysql_grant[authenticated@localhost/db_drupal.*]/ensure: created
==> default: Notice: /Stage[main]/Mysql::Server::Providers/Mysql_grant[provisioner@localhost/db_drupal.*]/ensure: created
==> default: Notice: /Stage[main]/Mysql::Server::Providers/Mysql_database[db_drupal]/ensure: created
==> default: Notice: /Stage[main]/Mysql::Bindings::Python/Package[python-mysqldb]/ensure: created
==> default: Notice: Applied catalog in 40.86 seconds
==> default: Running provisioner: puppet...
==> default: Running Puppet with environment development...
==> default: Notice: Compiled catalog for drupal-demonstration.com in environment development in 0.92 seconds
==> default: Notice: /Stage[main]/Main/Package[git]/ensure: created
==> default: Notice: /Stage[main]/Main/Package[httpd]/ensure: created
==> default: Notice: /Stage[main]/Main/Package[php]/ensure: created
==> default: Notice: /Stage[main]/Main/Package[php-mysql]/ensure: created
==> default: Notice: /Stage[main]/Main/Package[gd]/ensure: created
==> default: Notice: /Stage[main]/Main/Package[dos2unix]/ensure: created
==> default: Notice: /Stage[main]/Main/Exec[start-httpd]: Triggered 'refresh' from 6 events
==> default: Notice: /Stage[main]/Main/Exec[autostart-httpd]: Triggered 'refresh ' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[add-epel]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[update-yum]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[install-phpmyadmin]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[define-errordocument-403]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[mv-httpd-conf-403]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[remove-comment-errordocument-1]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[mv-httpd-conf-comment-1]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[define-http-400]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[mv-httpd-conf-400]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[remove-comment-errordocument-2]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[mv-httpd-conf-comment-2]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[phpmyadmin-access-1]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[phpmyadmin-access-2]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[php-memory-limit]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[change-docroot]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[adjust-firewalld]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[restart-firewalld]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[allow-htaccess-1]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[mv-httpd-conf-htaccess-1]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[allow-htaccess-2]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[mv-httpd-conf-htaccess-2]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[set-time-zone]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Exec[build-rpm-package-1]: Triggered 'refresh' from 1 events
==> default: Notice: /Stage[main]/Main/Package[php-opcache]/ensure: created
==> default: Notice: /Stage[main]/Main/Exec[restart-services]: Triggered 'refresh' from 1 events
==> default: Notice: Applied catalog in 261.22 seconds
==> default: Running provisioner: puppet...
==> default: Running Puppet with environment development...
==> default: Notice: Compiled catalog for drupal-demonstration.com in environment development in 1.24 seconds
==> default: Error: Validation of Exec[test-drush-install] failed: 'php drush.phar core-status' is not qualified and no path was specified. Please qualify the command or specify a path. at /tmp/vagrant-puppet/manifests-864bb2b87e0b8d63b5331c073d57a286/install_drush.pp:18
The SSH command responded with a non-zero exit status. Vagrant
assumes that this means the command failed. The output for this command
should be in the log above. Please read the output to determine what
went wrong.

User/Pass doesn't work when using cache_dir

There is a problem using user/pass with the cache_dir. I haven't been able to properly diagnose the issue, but what I have found is that on the first Puppet run, the file downloads properly. However, on the second run, I get the following error:

Error: wget --no-verbose --user=USERNAME -N -P '/var/cache/wget' 'https://URL' returned 8 instead of one of [0]

Disable .wgetrc files

Unless I'm missing something, it would be great to be able to disable .wgetrc file generation, particularly if verbose is set to false.

MIGRATE4_OPTION_TYPE_MISMATCH when attempting to upgrade to 2015.3.x

Hi! We are using this module in our 3.8 environment and are looking to go to PE 2015.3.1. That requires that we go to Puppet Language v4 (grumble...), and this module (at tag:v1.7.2) throws the following errors from the catalog_preview tool:

Preview Warnings (by issue)
MIGRATE4_OPTION_TYPE_MISMATCH (1)
/etc/puppetlabs/puppet/environments/ftr_test_puppet4/modules/wget/manifests/fetch.pp:147:5

The line is:

...
case $source_hash {
  '', undef: {
    $command = "wget ${verbose_option}${nocheckcert_option}${no_cookies_option}${header_option}${user_option}${output_option}${flags_joined} \"${source}\""
...

Apparently, in Puppet Language 4, this throws an error because an empty string will never match a hash object, so it's a syntax error. (https://forge.puppetlabs.com/puppetlabs/catalog_preview#migration-warnings).

Is this already fixed in a higher version that we just aren't pulling in, yet? Thanks!

Module being maintained?

Hi @carlossg . Are you continuing to maintain this module? I notice there hasn't been much activity in over a year. The module is currently marked as Approved on the puppet forge. If you are no longer maintaining it we will remove the Approved badge. Let me know.

Thanks!

Unable to run as non-root user

Adding the user field under the exec for wget::fetch breaks the ability to run as a non-root user.

Perhaps if the user is not passed in, it could run as the current user?

Could not retrieve information from environment production source

Hi, when performing the following sequence I get the error shown:

root@localhost:/opt# cat d
wget::fetch { 'jungle':
  source      => 'https://www.dropbox.com/s/brk7zeszv3a8k2o/jungle.deb?dl=1',
  destination => '/home/dan/jungle.deb',
  cache_dir   => '/var/cache/wget',
  cache_file  => 'jungle.deb',
}

root@localhost:/opt# puppet apply d
Notice: Compiled catalog for localhost.members.linode.com in environment production in 0.20 seconds
Notice: /Stage[main]/Main/Wget::Fetch[jungle]/Exec[wget-jungle]/returns: executed successfully
Error: /Stage[main]/Main/Wget::Fetch[jungle]/File[/home/dan/jungle.deb]: Could not evaluate: Could not retrieve information from environment production source(s) file:/var/cache/wget/jungle.deb
Notice: Finished catalog run in 2.97 seconds

Add a $destination_dir param

It would be cool if I could do this. I should be able to send a PR.

    $rhsm_sslcerts = hiera('rhsm::register::sslcerts')

    wget::fetch { $rhsm_sslcerts:
      destination_dir => '/etc/rhsm/ca/',
    }

hiera:


    ---
    rhsm::register::sslcerts:
      - 'http://mysatelliteserver.mycompany.com/pub/certs/mycert-local.pem'
      - 'http://mysatelliteserver.mycompany.com/pub/certs/another-file-ca.pem'

And it downloads the files to <destination_dir>/<filename taken from the URL>.

Fetch module facts hash is empty value

It's failing on my system.
I get the error Evaluation Error: Operator '[]' is not applicable to an Undef Value. My Facter version is 2.4.6.
On my system, this manifest is not able to access Facter variables like $facts['var_name'].

Example:
$facts['os']['name'] returns an empty value in fetch.pp.

mode is not set

While I can set mode, it is not honored unless the download comes from a cached location; all files appear to end up with mode 644.
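
Until mode is honored for direct (non-cached) downloads, a hedged workaround is to chain a separate file resource that fixes the permissions after the fetch (paths below are examples, not from the original report):

    wget::fetch { 'fetch tool':
      source      => 'https://example.com/tool',
      destination => '/usr/local/bin/tool',
    }
    -> file { '/usr/local/bin/tool':
      mode => '0755',
    }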

cache_dir doesn't do what you think it does

From the README:

This caches the downloaded file in an intermediate directory to avoid repeatedly downloading it. This uses the timestamping (-N) and prefix (-P) wget options to only re-download if the source file has been updated.

   wget::fetch { 'https://tool.com/downloads/tool-1.0.tgz':
     destination => '/tmp/',
     cache_dir   => '/var/cache/wget',
   }

But if you specify cache_dir it actually keeps downloading the file over and over again.

This was introduced in a commit back in 2015 by @mirthy (023ed15):

    if $redownload == true or $cache_dir != undef {

Ref: #L62

Is this done for any particular reason? Apparently using cache_dir is the only way to set additional parameters such as mode on our files, so we would prefer to remove the $cache_dir != undef condition.

I don't mind raising a pull request for it, but I wonder what we might break. Can someone shed some light on this?
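
For reference, the change being suggested would look roughly like this (a sketch against the quoted line, not an actual diff; the guard-variable name is hypothetical):

    # Current behavior: a cache_dir alone forces an unconditional download.
    # if $redownload == true or $cache_dir != undef {

    # Proposed sketch: only an explicit redownload request skips the guard,
    # so cached fetches keep their "only re-download if newer" semantics.
    if $redownload == true {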

Missing wget on Windows

Hi,

This module says it is compatible with Windows, but it does not work.

On Windows Server 2012 R2 it execs wget, but Windows has no wget in cmd.exe. PowerShell's wget is an alias for Invoke-WebRequest, whose syntax is a bit different.

source_hash doesn't work with cache_dir

If you use cache_dir, the command generated to check the source_hash incorrectly checks the hash in the final destination before the file is moved there.

Since cache_dir is needed to work around #81 these two issues together mean you can't check the hash if mode needs to be changed to anything other than 0644.

Example manifest snippet:

    wget::fetch { '/usr/local/bin/ecs-cli':
      destination => '/usr/local/bin/ecs-cli',
      cache_dir   => '/tmp',
      mode        => '0755',
      source      => 'https://s3.amazonaws.com/amazon-ecs-cli/ecs-cli-linux-amd64-v0.6.1',
      # SEE: https://s3.amazonaws.com/amazon-ecs-cli/ecs-cli-linux-amd64-v0.6.1.md5
      source_hash => 'c26236bdde9ad5df013d60137da0a239',
    }

This results in the following error; you can see the command incorrectly checking the final destination:

    Error: wget --no-verbose -N -P "/tmp" "https://s3.amazonaws.com/amazon-ecs-cli/ecs-cli-linux-amd64-v0.6.1" && echo 'c26236bdde9ad5df013d60137da0a239 /usr/local/bin/ecs-cli' | md5sum -c --quiet returned 1 instead of one of [0]
    Error: /Stage[main]/Aws::Ecscli/Wget::Fetch[/usr/local/bin/ecs-cli]/Exec[wget-/usr/local/bin/ecs-cli]/returns: change from notrun to 0 failed: wget --no-verbose -N -P "/tmp" "https://s3.amazonaws.com/amazon-ecs-cli/ecs-cli-linux-amd64-v0.6.1" && echo 'c26236bdde9ad5df013d60137da0a239 /usr/local/bin/ecs-cli' | md5sum -c --quiet returned 1 instead of one of [0]

specify http_proxy information as wget::fetch attribute

Currently, HTTP proxy information for wget::fetch can be specified by setting the top-scope variables $::http_proxy and/or $::https_proxy. However, I think it is generally bad practice for a Puppet module to rely on the presence of certain global variables (if every module did it that way, it would be a total mess).

In my opinion, this piece of information should be specified as parameter for wget::fetch.
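
A sketch of what that could look like from the caller's side (the parameter names here are a proposal, not the module's current API):

    wget::fetch { 'behind-proxy':
      source      => 'https://example.com/file.tgz',
      destination => '/tmp/file.tgz',
      http_proxy  => 'http://proxy.example.com:3128',  # proposed parameter
    }

Internally, the value could be passed to the exec's environment attribute instead of being read from top scope.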

module always seems to notify even with cache

Even with cache enabled, I always get notifications from my wget::fetch resources. Is this a known issue, or something I'm doing wrong with the cache? I can see by the timestamp that the file is not updated when the cached version doesn't change.

File does not get updated when the file is updated

My code:

    wget::fetch { 'somefile':
      source      => 'someURL',
      destination => 'destFilePath/fileName',
      timeout     => 0,
      verbose     => false,
    }

When I alter the destination file, a Puppet run does not revert it back to the source file.
When I alter the source file, a Puppet run does not update the destination file with the latest version.
The only time it fetches the file is when I remove the destination file. Is there a way to use this module to pull the file the way the file resource type does? I am not able to use the file resource because the source file is not located on the Puppet master server.

I also try to put in the cache_dir:

    wget::fetch { 'somefile':
      source      => 'someURL',
      destination => "${destFilePath}/fileName",
      cache_dir   => '/var/cache/wget',
      timeout     => 0,
      verbose     => false,
    }

Using the cache_dir works great!! But when the source and destination files are identical, wget::fetch still runs as normal, which gives Puppet a false alert that something changed.

Not working in Windows

Hi,

This module is not working for me on Windows clients. Works fine on Linux machines.

I get the following error when running puppet agent -t on Windows:

    Could not evaluate: Could not find command 'test'

How to download the file inside a directory

Hi there,

My requirement is that I have the URL of a directory containing a package I need to download. The trouble is that the package name changes with each version, so all I have is the directory location. I have tried various ways to download the package from that URL without success, so I need help.

The wget::fetch in my site.pp looks like:

    wget::fetch { 'study-war':
      source      => 'http://car-build-001.com:8080/jenkins/job/Study/ws/dist',
      destination => '/home/souravb/study/',
      user        => 'uname',
      password    => 'password',
      timeout     => 0,
      verbose     => false,
    }

The package is inside the location http://car-build-001.com:8080/jenkins/job/Study/ws/dist.
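
Plain wget can fetch every matching file under a directory listing with its recursive options. Assuming wget::fetch forwards extra options through a flags parameter (the ${flags_joined} interpolation quoted in an earlier issue suggests it does; treat that as an assumption), a sketch:

    # Sketch only: relies on an assumed 'flags' pass-through to wget.
    wget::fetch { 'study-war':
      source      => 'http://car-build-001.com:8080/jenkins/job/Study/ws/dist/',
      destination => '/home/souravb/study/',
      flags       => ['--recursive', '--level=1', '--no-parent', '--no-directories', '--accept=*.war'],
      user        => 'uname',
      password    => 'password',
      timeout     => 0,
      verbose     => false,
    }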

Wget::Fetch[my_item]: has no parameter named 'http_proxy' on ubuntu

  • Using the examples provided.
  • Have confirmed that wget works directly from command prompt with same url
  • Have tried adding http_proxy => 'undef' to my wget::fetch
  • module version: 1.7.1
  • ubuntu 14.04
  • bash shell

I would be pleased to set any appropriate variables or environment variables if that is what needs to happen. If so, could this please be reflected in the README?
