In Ubuntu, downloading files from the internet can be efficiently managed with the Wget command.
As a versatile tool, Wget allows users to retrieve content from the web, whether you’re grabbing a single file or mirroring entire websites.
This capability makes it an important command for system administrators and regular users alike, automating downloads that would otherwise be tedious.
But have you ever wondered how Wget works or how to tailor its functionality to your needs?
In this tutorial, we'll dig into Wget, exploring its basics, options, and practical examples to help you harness the full potential of this command-line utility on your Ubuntu system.
How to Use the Wget Command in Ubuntu
Wget is a powerful command-line utility in Ubuntu used for downloading files from the internet. It supports protocols such as HTTP, HTTPS, and FTP, making it versatile for various download tasks.
Why Use Wget?
- Resuming Downloads: Wget can resume interrupted downloads, which is particularly useful for large files or unstable network connections.
- Recursive Downloads: It can download entire websites or directories, preserving the hierarchy for offline viewing.
- Bandwidth Control: Wget allows you to limit download speeds, preventing it from consuming all your bandwidth.
- Background Downloads: You can initiate downloads that continue in the background, freeing up the terminal for other tasks.
Installing Wget
Wget often comes pre-installed on Ubuntu systems. To check if it’s installed, open the terminal and type:
wget --version
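If you want to script this check, here is a minimal sketch (assuming only a standard Ubuntu shell) that reports whether wget is available:

```shell
#!/bin/sh
# Check whether wget is on the PATH and report its version if so.
if command -v wget >/dev/null 2>&1; then
    echo "wget is installed: $(wget --version | head -n 1)"
else
    echo "wget is not installed; run: sudo apt update && sudo apt install wget"
fi
```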
If it’s not installed, you can install it using the following commands:
sudo apt update
sudo apt install wget
Basic Syntax
The general syntax for the wget command is:
wget [options] [URL]
Common Options
- -O [file]: Write output to the specified file.
- -P [directory]: Save the downloaded file to the specified directory.
- -c: Continue getting a partially downloaded file.
- --limit-rate=[rate]: Limit the download speed.
- -b: Download in the background.
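These options combine freely. As a sketch, here is a hypothetical wrapper function (the name fetch_big and the 500k rate cap are assumptions, not part of wget) that resumes partial downloads, caps bandwidth, and saves into a chosen directory:

```shell
# fetch_big: hypothetical helper combining -c, --limit-rate, and -P.
# Usage: fetch_big <url> [directory]
fetch_big() {
    url=$1
    dir=${2:-.}    # default to the current directory
    wget -c --limit-rate=500k -P "$dir" "$url"
}

# Example (not run here): fetch_big https://example.com/largefile.zip ~/Downloads
```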
Examples
Downloading a Single File
To download a file from a specified URL:
wget https://example.com/file.zip
This command downloads file.zip from example.com to your current directory.
Saving with a Different Filename
To save the downloaded file under a specific name:
wget -O newname.zip https://example.com/file.zip
This saves the file as newname.zip in your current directory.
Downloading to a Specific Directory
To save the file to a particular directory:
wget -P /path/to/directory https://example.com/file.zip
This command saves file.zip to the specified directory.
Resuming Interrupted Downloads
If a download is interrupted, you can resume it:
wget -c https://example.com/largefile.zip
This attempts to continue the download from where it left off.
Limiting Download Speed
To prevent wget from consuming all your bandwidth:
wget --limit-rate=500k https://example.com/file.zip
This limits the download speed to 500 KB/s.
Downloading in the Background
For large files, you might want to download them in the background:
wget -b https://example.com/largefile.zip
This command downloads the file in the background, allowing you to continue using the terminal.
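By default, wget -b writes its progress to a file named wget-log in the current directory. A small sketch (the helper name bg_fetch is an assumption) gives each background download its own log via wget's -o option:

```shell
# bg_fetch: hypothetical helper that starts a background download with a unique log.
# -b backgrounds the process; -o redirects the log away from the default wget-log.
bg_fetch() {
    url=$1
    log="wget-$(date +%Y%m%d-%H%M%S).log"
    wget -b -o "$log" "$url"
    echo "Follow progress with: tail -f $log"
}

# Example (not run here): bg_fetch https://example.com/largefile.zip
```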
Advanced Usage
Recursive Downloading
To download an entire website for offline viewing:
wget --mirror --convert-links --page-requisites --no-parent https://example.com
- --mirror: Enables options suitable for mirroring.
- --convert-links: Converts links for offline viewing.
- --page-requisites: Downloads all files needed to display the page.
- --no-parent: Prevents downloading files from parent directories.
This command creates a local copy of the website, making it accessible offline.
Downloading Files via FTP
To download files from an FTP server:
wget --ftp-user=username --ftp-password=password ftp://example.com/file.zip
Replace username and password with your FTP credentials; the command then downloads file.zip from the FTP server. Note that a password passed on the command line is visible to other users in the process list, so avoid this on shared systems.
Downloading Multiple Files
If you have a list of URLs in a text file, you can download all of them:
wget -i urls.txt
This command reads each URL from urls.txt and downloads the corresponding files.
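For instance, you could generate urls.txt on the fly and hand it to wget -i. This sketch uses placeholder URLs and a scratch directory so nothing is actually fetched:

```shell
#!/bin/sh
# Build a URL list in a scratch directory (the URLs are placeholders).
tmp=$(mktemp -d)
cd "$tmp" || exit 1
printf '%s\n' \
    'https://example.com/file1.zip' \
    'https://example.com/file2.zip' \
    'https://example.com/file3.zip' > urls.txt
wc -l < urls.txt    # prints 3
# Then download them all (not run here): wget -i urls.txt
```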
Setting Retry Attempts
To set the number of retry attempts for a download:
wget --tries=100 https://example.com/file.zip
This command will retry downloading the file up to 100 times in case of failure.
Skipping Certificate Check
If you’re downloading from a server with an invalid SSL certificate:
wget --no-check-certificate https://example.com/file.zip
This command ignores SSL certificate errors. Use this option with caution, as it can pose security risks.
Changing User Agent
To change the user agent string sent to the server:
wget --user-agent="Mozilla/5.0" https://example.com/file.zip
This command identifies the request as coming from a web browser, which helps when servers reject or throttle the default wget user agent.
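Wget can also send arbitrary request headers with --header. Here is a sketch (the helper name ua_fetch and the header values are illustrative assumptions) combining a browser-like user agent with a Referer header:

```shell
# ua_fetch: hypothetical helper sending a browser-like User-Agent plus a Referer.
ua_fetch() {
    wget --user-agent="Mozilla/5.0 (X11; Linux x86_64)" \
         --header="Referer: https://example.com/" \
         "$1"
}

# Example (not run here): ua_fetch https://example.com/file.zip
```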
Check out the top Ubuntu networking commands, learn how to use the sudo command in Ubuntu, and browse a list of the top Ubuntu commands you need to know.