If you ever need to download an entire website, you can use this. It works great for sites with lots of PDFs, or where you have to click each link one by one to see anything, like https://based.cooking/.
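If the goal is just the PDFs, wget (suggested further down in the thread) can filter by file type while it crawls. A minimal sketch, with the URL and the depth as placeholders:

    # follow links up to 2 levels deep, keep only PDFs; HTML pages are
    # fetched so links can be followed, then discarded because of --accept
    wget --recursive --level=2 --no-parent --accept pdf https://example.com/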
oldblo 3 points 3.9 years ago (Jul 14, 2021 02:04:36) (+3/-0)

I've never had good experiences with HTTrack. The archives it creates are constantly missing things, from randomly omitted images to entire pages, and that's when trying to copy simple sites that don't use lazy loading or other archive-confounding methods.

It is highly prone to shutting down partway through, seemingly without reason. The error messages come with no clear explanation of their meaning or cause. The program can never properly restart an interrupted download or be told to recopy omitted content. The entire thing is counterintuitive and poorly explained, if explained at all.

The list of issues goes on and on. It's only fit for one page at a time. If you want an entire site, or merely a portion of one such as all articles within a category, good fucking luck.
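For contrast, wget handles both of those complaints reasonably well. A rough sketch, with placeholder URLs:

    # re-running a mirror re-walks the site and fetches only pages that are
    # missing locally or newer on the server (--mirror implies -r -N -l inf)
    wget --mirror https://example.com/

    # resume a single partially downloaded file from where it stopped
    wget --continue https://example.com/big-file.pdf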
toobaditworks [op] 1 point 3.9 years ago (Jul 14, 2021 11:50:30) (+1/-0)*

I didn't know that. I've only used it a couple of times and it did a 'good' job. lord_nougat suggested another option called wget below; maybe try that one out: https://www.gnu.org/software/wget/

GUI version: https://sites.google.com/site/visualwget/
oldblo 0 points 3.9 years ago (Jul 14, 2021 16:21:36) (+0/-0)

Better recheck those archives well. More than once I thought it did a "good" job, only to find out it didn't. I did check the archive a bit; I just didn't check it well enough.
lord_nougat 2 points 3.9 years ago (Jul 14, 2021 02:32:38) (+2/-0)

wget.
toobaditworks [op] 0 points 3.9 years ago (Jul 14, 2021 11:49:08) (+0/-0)*

Website link if anyone is interested: https://www.gnu.org/software/wget/

Also found a GUI for it called VisualWget: https://sites.google.com/site/visualwget/a-download-manager-gui-based-on-wget-for-windows
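For anyone finding this later, a typical whole-site invocation looks something like this. Only a sketch (check the manual for your wget version), using the site from the original post:

    # mirror the site, rewrite links so the copy browses offline, grab each
    # page's images/CSS/JS, add .html extensions, stay under the start URL,
    # and pause between requests to be polite to the server
    wget --mirror --convert-links --page-requisites --adjust-extension --no-parent --wait=1 https://based.cooking/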
lord_nougat 0 points 3.9 years ago (Jul 14, 2021 13:05:51) (+0/-0)

I bet it can be made to work, though.
SturmUndTrinker 0 points 3.9 years ago (Jul 14, 2021 20:00:11) (+0/-0)

It's OK if you have to work on Windows. It's saved my butt over the years. When you work for a company that does not deliver source code to a client when they leave, it allows you to give them a static HTML version of their site.
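HTTrack aside, the same kind of static handoff can be produced with wget. A sketch under the assumption that the client site is plain HTML, with a placeholder URL and output folder:

    # same mirror recipe as above, but drop the hostname folder and put the
    # self-contained snapshot in ./client-site, ready to hand over
    wget --mirror --convert-links --page-requisites --adjust-extension --no-parent --no-host-directories --directory-prefix=./client-site https://example.com/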