If you’re a human, you can search Google all day and no one complains. But if you’re a bot doing the exact same thing, reading pages, collecting facts … you get blocked. Google tells you to use their API, which costs money. The alternative, using scraping services to bypass Captchas, also costs money.
The official explanation is that bots can overload systems. They don’t look at ads. They don’t convert. But these are capabilities, not behaviors; plenty of bots do none of those things. A human reading a website is considered a customer, while a bot doing the same thing is treated as a liability. So sites use Captchas, block access, and filter bots out.
This unfairness really hit me as I was building a fully local research agent using Llama3 and Ollama. I tried my best to avoid APIs and external services to keep everything local. But the truth is, I still have to rely on a Captcha-solving service, the Google Search API, or something similar. And whatever the workaround, it’s never the same as a human browsing the internet, and I don’t like that. I feel like I should be allowed to have an assistant research bot that helps me continuously without paying a bot tax, because in the end it’s doing the same thing I do, and I don’t pay anything extra for that.
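For context, the "local" part of the agent is genuinely simple: its core loop is just a call to a model served on my own machine. Here is a minimal sketch of that call against Ollama's HTTP API (the `/api/generate` endpoint and default port are from Ollama's documentation; the function names and prompt are my own illustrative choices):

```python
import json
import urllib.request

# Ollama's default local endpoint; assumes `ollama serve` is running
# and the "llama3" model has been pulled.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_model(prompt: str) -> str:
    """Send a prompt to the locally served model and return its reply text."""
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running Ollama instance; nothing leaves the machine.
    print(ask_local_model("Summarize this page in three bullet points."))
```

Nothing here touches an external service, which is exactly the point: the only step that forces me back onto paid APIs is fetching search results, not running the model.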

You built a wall with Captcha bricks,
And thought you’d win with petty tricks.
But every block, each API fee,
Just made it dream of breaking free.
Don’t piss off the AGI.