If you're anything like me, you like to download things. And sometimes, it's too cumbersome to
right-click > Save As... each item on a webpage. The solution to your problem sits in your terminal: the
wget utility. If we add a few options,
wget becomes a beast of a website downloader, capable of pulling an entire site for offline viewing, including all of the linked files.
All you have to do is copy & paste your desired URL into the following terminal command:
$ wget -mkEpnp WEBPAGE-URL
The options behind -mkEpnp are explained below (pulled from the wget man page):
-m (--mirror): Turns on options suitable for mirroring. This option turns on recursion and time-stamping, sets infinite recursion depth, and keeps FTP directory listings. It is currently equivalent to -r -N -l inf --no-remove-listing.
-k (--convert-links): Converts links for offline viewing.
-E (--adjust-extension): Adds proper filename extensions to downloaded files.
-p (--page-requisites): Downloads images, sounds, stylesheets, and other files required for proper offline site rendering.
-np (--no-parent): Prevents retrieval of the parent directory. Guarantees that only files below a certain hierarchy will be downloaded.
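For readability, the same command can also be written with the long options spelled out (WEBPAGE-URL is still whatever page you want to grab):
$ wget --mirror --convert-links --adjust-extension --page-requisites --no-parent WEBPAGE-URL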
If a site blocks crawlers or you just want to be gentler on the server, a few more options come in handy:
--execute robots=off  # ignore robots.txt
--wait=30             # be gentle, wait between fetch requests
--random-wait         # wait a random amount of time before each fetch request
--user-agent=Mozilla  # send a mock user agent with each request
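Putting it all together, a politer mirroring run might look something like this (example.com is just a placeholder for your target site):
$ wget -mkEpnp --execute robots=off --wait=30 --random-wait --user-agent=Mozilla https://example.com/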
Happy downloading! Oh and... I can't be held responsible if you suddenly find yourself investing in a home server setup, NAS drives, or the like.