You can use something as simple as a browser extension like SingleFile, which can automatically download complete, self-contained copies of anything you bookmark, or only certain URLs.
https://github.com/ArchiveBox/ArchiveBox/wiki/Web-Archiving-Community lists a wide range of options.
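
If you want to script the same idea yourself, here is a toy Python sketch (bookmarks.txt is a hypothetical file of URLs, one per line; unlike SingleFile's output, this saves the raw HTML only, without inlining images, CSS, or scripts):

    import pathlib
    import re
    import urllib.request

    out = pathlib.Path("archive")
    out.mkdir(exist_ok=True)

    for url in pathlib.Path("bookmarks.txt").read_text().split():
        # Derive a filesystem-safe name from the URL.
        name = re.sub(r"[^\w.-]+", "_", url) + ".html"
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                (out / name).write_bytes(resp.read())
        except (OSError, ValueError) as e:
            print(f"failed: {url}: {e}")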
But all our Tripod, Angelfire, GeoCities, etc. websites were little art projects.
I search for error messages all the time on DDG and it usually finds relevant results. It fails when errors are not sufficiently obscure, such as a common Python error that occurs in many codebases, permissions errors, vaguely worded errors, etc. But there is no way for a search engine to guess context in such a situation. Spam is not a problem.
If Google is so bad, stop using it.
And there are websites like https://wiby.me/ that exist to help people find that old-style content.
Sometimes I have resorted to searching blog domains and it isn’t bad for the right queries.
You know what the claw end of the hammer is for, right?
Actually, two hammers are sometimes really good for taking out a nail: you use one hammer to tap the claw of the other under the nail head.
Honestly, I have a cat's-paw-type nail puller, but 9 times out of 10 the hammer is the answer.
I agree. So much of the web now functionally happens on the socials, which are not indexed or accessible to crawlers.
I've always used cryptsetup and never seen any instructions like what you're describing. I wonder if you have a different use case than I do? Starting with one decryption method only to change it soon afterwards seems like added complexity. Why do that?
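
For context, the pattern being questioned would look roughly like this: LUKS supports multiple keyslots, so a new passphrase can be enrolled and the original retired without re-encrypting the volume. A minimal sketch (Python wrapping real cryptsetup subcommands; /dev/sdX2 is a placeholder device, and this needs root):

    import subprocess

    DEV = "/dev/sdX2"  # placeholder LUKS partition

    def cs(*args):
        # Thin wrapper around the cryptsetup CLI.
        subprocess.run(["cryptsetup", *args], check=True)

    cs("luksAddKey", DEV)         # enroll a new passphrase (prompts for an existing one)
    cs("luksKillSlot", DEV, "0")  # retire the original keyslot afterwards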