The software that already exists along these lines exhibits bias against marginalized groups. I have no trouble foreseeing a filter put on the end of the spigot that exempts certain people from the inconvenience of such surveillance. Might need a new law (it'll get passed).
Sounds like the devil is in the details. This kind of AI often struggles with darker skin tones… are you suggesting we sift who can be monitored or prosecuted based on skin darkness? That sounds like a mess to try to enshrine in law.
Strong (and unhealthy) biases already exist in this tech, but I'm not sure that's the right lever to pull to fix the problem.
You know that's not what I was suggesting. I'm saying that if precedent is anything to go by, companies will be perfectly happy extending the paradigm established with sentencing software to anyone who can't pay or leverage their connections. If we continue down this path, tomorrow's just today, but worse, and more. (Please try to have a more rational understanding of today, tomorrow.)