I just started using this myself, seems pretty great so far!
Clearly it doesn’t stop all AI crawlers, but it does stop a significant chunk of them.
Proof of work is just that: proof that work was done. What that work accomplishes isn’t part of the definition. Git doesn’t ask for proof, but it does do work. Presumably the proof part isn’t the thing you have an issue with. I agree it sucks that this isn’t being used for something constructive, but as long as it’s kept to a minimum on human time scales, it shouldn’t be a big deal.
Cryptocurrencies are an issue because they do the work continuously, 24/7. This is a one-time operation per view (I assume per view and not once ever), which at human input rates isn’t going to amount to much. AI garbage does consume massive amounts of power, though, so damaging those operations is beneficial.
I’m not sure where you’re going with the git simile. Git isn’t performing any proof of work, at all. By definition, Proof of Work is that “one party (the prover) proves to others (the verifiers) that a certain amount of a specific computational effort has been expended.” The amount of computational power used to generate hashes for git is utterly irrelevant to its function. It doesn’t care how many cycles are used to generate a hash; therefore it’s in no way proof of work.
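To make that definition concrete, here’s a minimal hash-based proof-of-work sketch (a toy, not necessarily the scheme this tool uses; the challenge string and difficulty are made up). The prover burns CPU searching for a nonce; the verifier checks it with a single hash:

```python
import hashlib

def solve_pow(challenge: str, difficulty: int) -> int:
    """Find a nonce such that SHA-256(challenge + nonce) starts with
    `difficulty` zero hex digits. The only thing the result proves is
    that CPU time was spent searching for it."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify_pow(challenge: str, nonce: int, difficulty: int) -> bool:
    """Verification is cheap: one hash, no matter how much work
    the prover burned finding the nonce."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve_pow("example-challenge", 4)
print(verify_pow("example-challenge", nonce, 4))  # True
```

Git’s hashes look superficially similar, but there’s no difficulty target and nobody checks how much effort went in, which is the whole distinction.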
This solution is designed to cost scrapers money; it does this by causing them to burn extra electricity. Unless it’s at scale, unless it costs them, unless it has an impact, it’s not going to deter them. And if it does impact them, then it’s also impacting the environment. It’s like having a door-to-door salesman come to your door and intentionally making them wait while their car is running, then cackling because you made them burn some extra gas, which cost them some pennies and also dumped extra carbon monoxide into the atmosphere.
Compare this to endlessh. It also wastes hackers’ time, but only because it responds very slowly with an endless stream of banner lines. It’s making them wait, only they’re not running their car while they’re waiting. It doesn’t require the caller to perform an expensive computation which, in the end, is harmful to more than just the scraper.
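For reference, the endlessh trick can be sketched in a few lines (this is a toy, not endlessh’s actual code; the port and delay are arbitrary). RFC 4253 allows a server to send arbitrary text lines before the real "SSH-" version string, so the tarpit just drips random lines forever while the client keeps waiting for the banner:

```python
import random
import socket
import time

def tarpit(host: str = "127.0.0.1", port: int = 2222,
           delay: float = 10.0) -> None:
    """Accept one connection and drip out an endless pre-banner.
    Each line costs the server almost nothing, but the client
    stays stuck waiting for a version string that never comes."""
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen()
    conn, _ = srv.accept()
    try:
        while True:
            time.sleep(delay)  # cheap for us, slow for them
            line = "%x\r\n" % random.getrandbits(32)  # must not start with "SSH-"
            conn.sendall(line.encode())
    except OSError:
        pass  # client gave up
```

The key property: no computation is demanded of either side, just patience.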
Let me make sure I understand you: AI is bad because it uses energy, so the solution is to make them use even more energy? And this benefits the environment how?
I’m not the person who brought git up. I was just stating that work is work. Sure, git is doing something useful with it. This is arguably useful without the work itself being important. Work is the thing you’re complaining about, not the proof.
Yeah, but the effect it has on legitimate usage is trivial, while it’s a real cost to illegitimate scrapers. Them not paying this cost also has an impact on the environment. In fact, this theoretically adds no extra impact: they’ll spend the same time scraping either way. This way they get delayed and spend more of that time gathering nothing useful.
To use your salesman analogy, it’s similar to that, except their car is going to be running regardless. It just prevents them from reaching as many houses. They’re going to go to as many as possible. If you can stall them then they use the same amount of gas, they just reach fewer houses.
This is probably wrong, and the salesman analogy is why. Computers have threads: if they’re waiting on something, they can switch to another task. It protects a site, but it doesn’t slow the scrapers down, and it doesn’t really waste their time, because they’re doing other work while they wait.
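This is easy to demonstrate with async I/O (a toy sketch; the site names and latencies are invented). One tarpitted request barely delays the rest of the crawl, because the other fetches proceed while it waits:

```python
import asyncio
import time

async def fetch(site: str, latency: float) -> str:
    """Stand-in for an HTTP request; the sleep models server latency
    (or a tarpit). No CPU is consumed while waiting."""
    await asyncio.sleep(latency)
    return site

async def crawl() -> float:
    start = time.monotonic()
    # One "tarpitted" site alongside nine fast ones.
    tasks = [fetch("slow.example", 2.0)]
    tasks += [fetch(f"fast{i}.example", 0.1) for i in range(9)]
    await asyncio.gather(*tasks)
    return time.monotonic() - start

elapsed = asyncio.run(crawl())
print(f"{elapsed:.1f}s")  # roughly 2.0s total, not 2.0 + 9 * 0.1
```

Waiting is nearly free for an async crawler; that’s the difference between a delay-based tarpit and something that forces actual computation.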
If they’re going to use the energy anyway, we might as well make them get less value. Eventually the cost may be more than the benefit. If it isn’t, they spend all the energy they have access to anyway. That part isn’t going to change.