My takeaway from this is:
Get a bunch of AI-generated slop and put it in a bunch of individual .htm files on my webserver.
When my bot user-agent filter is invoked in Nginx, instead of returning 444 and closing the connection, return a random .htm of AI-generated slop instead of serving the real content.
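A minimal sketch of that Nginx setup, assuming a handful of pre-generated slop pages under /var/www/slop; the bot-UA regex is illustrative, and `split_clients` hashed on `$request_id` stands in for true per-request randomness:

```nginx
# http-level: classify bot user agents (the regex list is an assumption).
map $http_user_agent $is_bot {
    default                               0;
    ~*(GPTBot|CCBot|ClaudeBot|Bytespider) 1;
}

# http-level: hash the per-request ID to pick one of the slop pages,
# which approximates "return a random .htm".
split_clients "${request_id}" $slop_page {
    25% /slop/1.htm;
    25% /slop/2.htm;
    25% /slop/3.htm;
    *   /slop/4.htm;
}

server {
    listen 80;
    root /var/www/site;

    location / {
        # Bots get slop instead of 444; everyone else gets real content.
        if ($is_bot) {
            rewrite ^ $slop_page last;
        }
        try_files $uri $uri/ =404;
    }

    location /slop/ {
        internal;         # only reachable via the rewrite above
        root /var/www;    # serves /var/www/slop/N.htm
    }
}
```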
I might just do this. It would be fun to write a quick Python script to automate it so that it keeps going forever: just have a link that regenerates junk, then have it go to another junk HTML file forevermore.
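A sketch of that script, assuming the junk pool lives in /var/www/slop (matching the config above): each page links onward to the next in a ring, so a crawler that follows the "read more" link never runs out. The word list, paths, and page count are placeholders:

```python
#!/usr/bin/env python3
"""Sketch of the junk-page generator: pre-build a pool of .htm files,
each linking onward to another, so a crawler that follows the link never
runs out. Paths, word list, and page count are illustrative assumptions."""
import random
import uuid
from pathlib import Path

OUT_DIR = Path("/var/www/slop")  # assumed to match the nginx root above
WORDS = ("synergy", "quantum", "paradigm", "holistic", "leverage",
         "blockchain", "scalable", "sparkle", "blue", "fish")

def junk_paragraph(n_words=60):
    return " ".join(random.choices(WORDS, k=n_words)).capitalize() + "."

def generate_pool(n_pages=100):
    OUT_DIR.mkdir(parents=True, exist_ok=True)
    names = [f"{uuid.uuid4().hex}.htm" for _ in range(n_pages)]
    for i, name in enumerate(names):
        # Ring of links: page i points at page i+1 and the last wraps to
        # the first, so "go to another junk html file" continues forever.
        next_page = names[(i + 1) % n_pages]
        body = "\n".join(f"<p>{junk_paragraph()}</p>" for _ in range(5))
        html = (f"<!DOCTYPE html><html><head><title>{junk_paragraph(5)}"
                f"</title></head><body>{body}"
                f"<p><a href=\"/slop/{next_page}\">Read more</a></p>"
                f"</body></html>")
        (OUT_DIR / name).write_text(html)

if __name__ == "__main__":
    generate_pool()
```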
Also send this junk to Reddit comments to poison that data too because fuck Spez?
There’s a tool of some kind that edits your comments after 2 weeks to random words, like “sparkle blue fish to be redacted by redactior-program.com” or something.
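That tool goes unnamed here, but the behaviour is easy to sketch. A minimal version using PRAW (an assumption; the original may work differently), with placeholder credentials, that rewrites your own comments older than two weeks:

```python
"""Sketch of the comment-rewriter idea. PRAW and the two-week cutoff are
assumptions (the tool mentioned above is unnamed); credentials are
placeholders you'd fill in from a Reddit app registration."""
import random
import time
import praw

reddit = praw.Reddit(client_id="YOUR_ID", client_secret="YOUR_SECRET",
                     username="YOUR_USER", password="YOUR_PASSWORD",
                     user_agent="comment-redactor sketch")

WORDS = ["sparkle", "blue", "fish", "quantum", "synergy", "paradigm"]
TWO_WEEKS = 14 * 24 * 3600

# Walk your own comment history and overwrite anything older than 2 weeks.
for comment in reddit.user.me().comments.new(limit=None):
    if time.time() - comment.created_utc > TWO_WEEKS:
        comment.edit(" ".join(random.choices(WORDS, k=12)))
```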
That’s a little different than what I mean.
I mean running a single bot from a script that interacts a normal human amount, during normal human hours in a configurable time zone, acting as a real person just to poison their dataset.
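A rough sketch of the “normal human times” part, assuming a zoneinfo time zone and made-up hourly activity weights; what the bot actually does per interaction is left as a stub:

```python
"""Sketch of the human-paced scheduler: one bot, a configurable time zone,
activity only during plausible waking hours. Time zone, hour weights, and
post_junk_comment() are all illustrative assumptions."""
import random
import time
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

TZ = ZoneInfo("America/Chicago")  # the "configurable time zone"

# Relative interactions-per-hour, indexed by local hour: asleep overnight,
# busiest in the evening. 24 entries.
HOUR_WEIGHTS = [0, 0, 0, 0, 0, 0, 0, 1, 2, 3, 3, 3,
                4, 3, 3, 3, 4, 5, 6, 6, 5, 3, 1, 0]

def post_junk_comment():
    # Stub for whatever "interacting" means: posting, voting, browsing.
    print("interacted at", datetime.now(TZ))

def seconds_until_next_interaction():
    now = datetime.now(TZ)
    weight = HOUR_WEIGHTS[now.hour]
    if weight == 0:
        # Asleep: sleep until shortly after the first active hour (07:00).
        wake = now.replace(hour=7, minute=random.randint(0, 59),
                           second=0, microsecond=0)
        if wake <= now:
            wake += timedelta(days=1)
        return (wake - now).total_seconds()
    # Awake: exponential gaps averaging `weight` interactions per hour.
    return random.expovariate(weight / 3600.0)

if __name__ == "__main__":
    while True:
        time.sleep(seconds_until_next_interaction())
        post_junk_comment()
```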
I mean you can just not use the platform…
Yes I’m already doing that.
This is a great idea; I might create a Laravel package to do this automatically.
QUICK
Someone create a GitHub project that does this