
Aren't we talking about spambots, mostly? While law-abiding bots should probably be allowed on most sites, nobody wants a spambot in their blog or forum.

Isn't it right to block spambots? And if so, how do you tell regular bots from spambots?



Use reCAPTCHA to prevent the spam bots from posting... the real bots will just crawl anyway.
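
For what it's worth, a minimal sketch of the server-side check, assuming TypeScript with Express and Node's global fetch; the route name and RECAPTCHA_SECRET env var are placeholders of mine, though the siteverify URL is Google's documented verification endpoint:

    // Hedged sketch, not production code: verify a reCAPTCHA token before accepting a post.
    // Express, global fetch (Node 18+), and the RECAPTCHA_SECRET env var are assumptions here.
    import express from "express";

    const app = express();
    app.use(express.urlencoded({ extended: false }));

    app.post("/comment", async (req, res) => {
      const token = req.body["g-recaptcha-response"] ?? "";
      const params = new URLSearchParams({
        secret: process.env.RECAPTCHA_SECRET ?? "",
        response: token,
      });
      // Google's documented verification endpoint for reCAPTCHA.
      const verify = await fetch("https://www.google.com/recaptcha/api/siteverify", {
        method: "POST",
        body: params,
      });
      const result = (await verify.json()) as { success: boolean };
      if (!result.success) return res.status(403).send("captcha failed");
      // ...store the comment; crawlers never hit this path anyway...
      res.send("ok");
    });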

A couple of months ago, I implemented some regular expressions to try to block a lot of bad actors, including smaller search engines... our analytics traffic dropped around 5% the next week, but our actual load on the servers dropped almost 40%. Unfortunately, it was decided that the 40% load reduction wasn't worth the 5% hit to analytics.
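
For illustration, a rough TypeScript/Express sketch of that kind of filter; the user-agent patterns below are placeholders, not the actual rules I used:

    // Hedged sketch: reject requests whose User-Agent matches bad or low-value bots
    // before any expensive rendering happens. The pattern list is illustrative only.
    import express from "express";

    const blockedAgents = /(scrapy|python-requests|mj12bot|ahrefsbot|semrushbot)/i;

    const app = express();
    app.use((req, res, next) => {
      const ua = req.headers["user-agent"] ?? "";
      if (blockedAgents.test(ua)) {
        return res.status(403).end(); // cheap rejection: no render, no DB hit
      }
      next();
    });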

Which sucks; moving forward, output caching will be used much more heavily, with JS enhancements for logged-in users layered on top of the nearly identical rendered output.
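
Roughly the shape of that, as a sketch: everyone gets the same cached HTML and logged-in users get client-side JS enhancements on top. The in-memory Map, TTL, and helper names are assumptions, not the real setup:

    // Hedged sketch of cached, identical-for-everyone output rendering.
    import express from "express";

    const htmlCache = new Map<string, { body: string; expires: number }>();
    const TTL_MS = 60_000; // assumed TTL

    function cachedPage(render: (req: express.Request) => string) {
      return (req: express.Request, res: express.Response) => {
        const hit = htmlCache.get(req.path);
        if (hit && hit.expires > Date.now()) {
          return res.type("html").send(hit.body); // cached output, same for everyone
        }
        const body = render(req);
        htmlCache.set(req.path, { body, expires: Date.now() + TTL_MS });
        res.type("html").send(body);
      };
    }

    // usage: app.get("/posts/:id", cachedPage(renderPost));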

Server-side React with some user-agent sniffing will break out three renderings server-side: "xs" for devices that are "mobile" (phones), "sm" for other tablet/mobile devices ("android", "ios", etc.), and "md" otherwise... "lg" will only be bumped up to from "md" on the client side. These correspond to the Bootstrap size breaks.
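
A rough sketch of that sniffing; the exact regexes are assumptions, not the production patterns:

    // Hedged sketch: pick the server-rendered Bootstrap breakpoint from the User-Agent.
    type Breakpoint = "xs" | "sm" | "md"; // "lg" only happens client-side, bumped up from "md"

    function breakpointFromUserAgent(ua: string): Breakpoint {
      if (/mobile/i.test(ua) && !/ipad|tablet/i.test(ua)) return "xs"; // phones
      if (/android|ipad|iphone|ipod|tablet/i.test(ua)) return "sm";    // other mobile/tablet
      return "md";                                                     // desktop default
    }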

In essence, I don't care. Bots get the same as everyone else... if you don't have JS, you can't log in or fill out forms. reCAPTCHA should go a step further in helping deal with bots...



