The latest nightly builds of Firefox 139 include an experimental link preview feature which shows (among other things) an AI-generated summary of what a page is purportedly about before you visit it, saving you time, a click, or the need to ‘hear’ a real human voice.
Overboard? Because I disallow AI summaries?
Or are you referring to my “try to detect sketchy user agents” ruleset? Because that has had two false positives in the past two months, yet those rules stop about 2.5 million requests per day, none of which come from a human (I’d know: human visitors have very different access patterns, even when they visit the maze).
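For a sense of what a ruleset like that can look like (the actual rules aren't published, so the patterns below are purely illustrative), here is a minimal Python sketch of user-agent filtering:

```python
import re

# Hypothetical patterns; the real ruleset behind the quote is not public.
SKETCHY_UA_PATTERNS = [
    re.compile(r"GPTBot|ClaudeBot|CCBot|Bytespider", re.I),  # self-identified AI crawlers
    re.compile(r"^Mozilla/5\.0$"),  # bare, truncated UA strings
    re.compile(r"python-requests|curl|Go-http-client", re.I),  # generic HTTP libraries
]

def is_sketchy(user_agent: str) -> bool:
    """Return True if the User-Agent is empty or matches a known-bad pattern."""
    if not user_agent:
        return True  # an empty UA is itself suspicious
    return any(p.search(user_agent) for p in SKETCHY_UA_PATTERNS)
```

In practice a filter like this sits in front of the site (in the web server or a middleware layer) and rejects matching requests before they reach anything expensive.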
If the bots were behaving correctly and respected my robots.txt, I wouldn’t need to fight them. But when they’re DDoSing my sites from literally thousands of IPs, generating millions of requests a day, I will go to extreme lengths to make them go away.

You disallow access to your website, including when the user agent is a little unusual. Do you also only allow the latest major version of the official releases of major browsers?
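For contrast, “behaving correctly” is cheap: Python’s standard library ships a robots.txt parser, and a well-behaved crawler consults it before every fetch. A minimal sketch, with example.com standing in for a real site:

```python
from urllib.robotparser import RobotFileParser

# A well-behaved crawler asks robots.txt for permission before fetching.
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

if rp.can_fetch("MyCrawler/1.0", "https://example.com/some/page"):
    print("allowed: fetch the page")
else:
    print("disallowed: skip it")  # the bots in question skip this step entirely
```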
Nepenthes. Make them regret it.
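Nepenthes is a tarpit that traps misbehaving crawlers in an endless maze of slow, procedurally generated pages. A toy sketch of the same idea (not Nepenthes itself; the route, delay, and word list here are made up) using Flask:

```python
import random
import time
from flask import Flask

app = Flask(__name__)
WORDS = ["lorem", "ipsum", "dolor", "sit", "amet"]

@app.route("/maze/")
@app.route("/maze/<path:slug>")
def maze(slug=""):
    time.sleep(2)  # drip-feed responses to waste the crawler's time
    rng = random.Random(slug)  # seed from the URL so each page looks stable on revisit
    text = " ".join(rng.choices(WORDS, k=50))
    links = "".join(
        f'<a href="/maze/{rng.randrange(10**9)}">more</a> ' for _ in range(10)
    )
    return f"<html><body><p>{text}</p>{links}</body></html>"

if __name__ == "__main__":
    app.run()
```

Every page links to ten more nonexistent pages, so a crawler that ignores robots.txt wanders forever, each request costing it two seconds and earning it nothing but filler text.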