DEV Community

John Johnson Okah

How to download an entire website for offline usage (using wget)

Here is a simple way to download an entire website and make it available offline using wget.

💡Wget is a command-line utility for retrieving files using HTTP, HTTPS and FTP protocols.


How to Install Wget

Wget comes preinstalled on most Unix/Linux systems, but here is how to get it in case yours doesn't have it.

Use the command for your system:

On Mac:

brew install wget

On Linux (Debian/Ubuntu):

sudo apt-get install wget
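
After installing, you can confirm that wget is available (the exact version string will depend on your system):

```shell
# Print the installed wget version to confirm the install worked
wget --version | head -n 1
```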

Using Wget to download the website:

Enter this command in your terminal:

wget -m -p -E -k -np www.example.com

The options explained:

-m, --mirror            Turns on recursion and time-stamping, sets infinite
                          recursion depth, and keeps FTP directory listings.
-p, --page-requisites   Get all images, etc. needed to display HTML page.
-E, --adjust-extension  Save HTML/CSS files with .html/.css extensions.
-k, --convert-links     Make links in downloaded HTML point to local files.
-np, --no-parent        Don't ascend to the parent directory when retrieving
                        recursively. This guarantees that only the files below
                        a certain hierarchy will be downloaded. Requires a slash
                        at the end of the directory, e.g. example.com/foo/.

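As the -np note says, if you only want one section of a site, point wget at that directory with a trailing slash. A sketch, assuming a hypothetical /blog/ subdirectory and adding a small delay to be gentle on the server:

```shell
# Mirror only the /blog/ section; -np stops wget from climbing to the site root.
# --wait=1 pauses one second between requests to avoid hammering the server.
wget -m -p -E -k -np --wait=1 www.example.com/blog/
```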

For more details on wget usage, check out the manual page (man wget).
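Once the download finishes, wget saves the mirror in a directory named after the host (www.example.com in this example). Thanks to -k, you can open its index.html directly in a browser, or serve it locally; this sketch assumes Python 3 is installed:

```shell
# Serve the mirrored site at http://localhost:8000
cd www.example.com
python3 -m http.server 8000
```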
