I’ve discovered that Zuckbook has been leeching bandwidth at an insane rate. Can’t stop it with robots.txt, so I just had to get brutal with .htaccess. Took money out of my pocket because they can’t (or won’t) fix their shit, or are just stealing shit online for AI
Kévin
https://social.mmn.on.ca/2024/06/29/ive-discovered-that.html

It turns out I found something worse, which to be fair I shouldn’t be surprised by: Facebook is knowingly being used as a DDoS vector and, judging by the lack of responses on the dev forums, they really don’t give a shit about it.
Initially I thought it might have been some AI shit, but no, it’s plain old Big Tech not giving a fuck and damaging the internet for reasons question-mark-question-mark.
The clue to this was how fast I was burning through credit with Bunny. I expected an initial peak from relaunching my Plume instance, and that started to drop off, but then the numbers exposed that FB’s crawler was far too interested in photos. I did a search and voilà, this has been a problem for over 4 years.
Once I started to implement the not-very-easy task of blocking FB from crawling, which it is still attempting even with a 403 error on everything (a rough sketch of the block is below), I started to see the control checks of whoever is running the hit show up in the logs. Not sure why they bother; this isn’t an old domain, and I haven’t really offended anybody (except Kapitalistenschweine). But hey, it’s the Internet® by GAFAM℠, who fucking knows why this shit is running.
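If you want to do the same, the shape of it in .htaccess is something like this — a minimal sketch assuming Apache with mod_rewrite enabled, not my exact rules:

    # Match Meta's crawler by its User-Agent (case-insensitive) and
    # answer every request with a 403 Forbidden instead of serving files.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} facebookexternalhit [NC]
    RewriteRule .* - [F,L]

The [F] flag is what produces the 403, so even though the crawler keeps hammering away, it’s pulling a few hundred bytes of "fuck off" instead of multi-megabyte photos.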
So there you go: unless your website depends on Facebook (and if your ability to make a living rests in the hands of these clowns, you’re already fucked), just go block the User-Agent facebookexternalhit
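If mod_rewrite isn’t available on your host, the same idea works with mod_setenvif and the Apache 2.4 Require syntax — again just a sketch, and block_fb is an arbitrary label:

    # Tag any request whose User-Agent contains "facebookexternalhit",
    # then deny tagged requests (Apache 2.4+; "block_fb" is just a name).
    SetEnvIfNoCase User-Agent "facebookexternalhit" block_fb
    <RequireAll>
        Require all granted
        Require not env block_fb
    </RequireAll>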
Edit: to give you some context, this is what their DDoS machine was sucking in per day before more drastic actions were taken to stop it. At those rates it is 152GB a week (call it 21–22GB a day) in non-genuine traffic, and that comes with a real financial cost.
Mirrored from oh.mg.