Hacker News

By "app's private apis" do you mean "apps other than tiktok running on the same device"?

Isn't that supposed to be prevented by the OS via the permissions system?



I mean bot-controlled spamming: uploading videos, faving via the private API, disliking and reporting your competitors' videos, etc.

Lots of "studios" use these APIs for for-profit astroturfing, sometimes spreading misinformation with tons of fake accounts.

A layer of VM obfuscation helps, like a DRM scheme or a packer.
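To illustrate what "VM obfuscation" means here: a minimal sketch, assuming a toy stack machine. The sensitive logic (here an invented request checksum; the opcodes and constants are all made up for illustration, not anyone's real scheme) is encoded as custom bytecode, so static analysis of the binary only sees a generic interpreter loop rather than the algorithm itself.

```python
# Toy VM-based obfuscation sketch. The "secret" computation is stored as
# bytecode for a custom stack machine; a reverser sees only run_vm(),
# not the formula. All opcodes/constants are hypothetical.

PUSH, ADD, XOR, ROT, HALT = range(5)

# Bytecode for: rotate_left32((x + 37) ^ 0x5A, 3)
PROGRAM = [PUSH, 37, ADD, PUSH, 0x5A, XOR, PUSH, 3, ROT, HALT]

def run_vm(program, x):
    stack = [x]
    pc = 0
    while True:
        op = program[pc]; pc += 1
        if op == PUSH:
            stack.append(program[pc]); pc += 1
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append((a + b) & 0xFFFFFFFF)
        elif op == XOR:
            b, a = stack.pop(), stack.pop()
            stack.append(a ^ b)
        elif op == ROT:  # rotate left within 32 bits
            n, a = stack.pop(), stack.pop()
            stack.append(((a << n) | (a >> (32 - n))) & 0xFFFFFFFF)
        elif op == HALT:
            return stack.pop()

print(run_vm(PROGRAM, 1000))  # 8888
```

Real-world obfuscating compilers go much further (randomized opcode tables per build, multiple interpreter layers), which is what makes the target keep moving.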


Wasn’t there also an article on here about the sub farms: basically rows of (usually) girls on a factory floor with cubicles, streaming for some coin? Not sure if it was China-specific.


It used to be; now it's all automated. The devices (phones) don't even have screens attached anymore, just mainboard + SIM.


And specifically, the idea is to make sure that any kind of effort at writing a bots-and-fraudulent-likes-as-a-service platform is such a moving target that it becomes uneconomical to maintain. It's never perfect, as efforts like the OP's attest, but the incremental advertiser trust (and brand trust from a valuation perspective, as we've seen with Twitter's bot problems!) from being ahead in this "arms race" is likely considered worth the cost of hiring obfuscating-compiler engineers.


Seems like the same approach as using a different kind of lock on your door: the groups with resources will already have the tools they need to get through it, and it only really stops people who aren't at that level yet.


Not when a sophisticated organization owns the device running the app.

Of course obfuscating the APIs is still an attempt to trust the client, which is not secure in a strict sense, but might slow people down.
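A minimal sketch of that point, with a made-up key and payload: suppose the app signs API requests with an HMAC key hidden behind the obfuscation, and the server checks the signature. Obfuscation only raises the cost of extracting the key; once anyone pulls it out of the client, their bot traffic verifies just like the real app's.

```python
# Why "trust the client" is delay, not security: signature verification
# proves possession of the embedded key, not that a human used the app.
# Key name and payload format are hypothetical, for illustration only.
import hmac
import hashlib

EMBEDDED_KEY = b"hidden-behind-obfuscation"  # lives inside the app binary

def sign_request(payload: bytes) -> str:
    """What the (or an extracted copy of the) client does."""
    return hmac.new(EMBEDDED_KEY, payload, hashlib.sha256).hexdigest()

def server_accepts(payload: bytes, signature: str) -> bool:
    """What the server checks; it can't tell app from bot."""
    expected = hmac.new(EMBEDDED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

payload = b'{"action":"like","video":123}'
print(server_accepts(payload, sign_request(payload)))  # True, even for a bot holding the key
```

Hence the arms race: the defense isn't the signature itself but how expensive it is to keep re-extracting the key and signing logic from each new obfuscated build.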



