Mirroring a copy of a single webpage for offline viewing using wget is decent but can be rough sometimes. Any other CLI options? - SDF Chatter
This is one way to save a webpage from the command line, which I prefer over launching a GUI browser:

wget -E -H -k -K -p $URL

Sometimes the result renders great in Firefox and sometimes not (e.g. ifixit.com has some layout problems). But even when it works well, I always have to dig around the tree of saved files for the HTML page that renders it all; there is no index.html in the root directory of where things were saved. Is there any other option?

I'm aware of two options for saving as PDF (wkhtmltopdf and weasyprint), but that's a separate discussion. I just wonder if there are other ways to save the HTML in the most reproducible way… in a way that properly reconstitutes the page in a web browser.
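For reference, those short flags expand to --adjust-extension (-E), --span-hosts (-H), --convert-links (-k), --backup-converted (-K), and --page-requisites (-p). As for locating the entry page after the fact, here is a rough sketch (assuming GNU find/awk and whatever directory layout wget produced; not a definitive answer):

```shell
# After a mirror run, list saved HTML files sorted by path depth,
# shallowest first -- the entry page is usually near the top, inside
# the host-named directory wget created.
find . -name '*.html' | awk -F/ '{print NF, $0}' | sort -n | cut -d' ' -f2-
```

With -E the saved page should end in .html even if the original URL did not, so sorting by depth tends to surface it quickly, but this is a heuristic rather than a fix for the missing root index.html.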