> choose whether or not they want to accept an order."
If the app shows clearly what needs to be done (shop, order list, miles driven), and the pay the worker will earn, and asks if they want to accept, then IMO that's fine.
The business can set those offers however they like, even using a random number generator if they want, and IMO it's morally fine.
They can set offers however they like and you are free to not accept.
If the algorithm detects that you are likely to accept for little money and shorts you with lower offers than other users get, is it still morally fine?
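To make the worry concrete, here is a hypothetical sketch of that kind of "acceptance-based discounting" (the function name, thresholds, and multipliers are all invented for illustration, not taken from any real platform):

```python
# Hypothetical sketch of personalized offer pricing: discount offers to
# workers who historically accept almost anything, sweeten them for picky
# workers. All names and numbers are invented.

def personalized_offer(base_pay: float, acceptance_rate: float) -> float:
    """Return the pay shown to this worker for a given job.

    base_pay: what the platform would offer a brand-new worker.
    acceptance_rate: fraction of past offers this worker accepted (0.0-1.0).
    """
    if acceptance_rate > 0.9:      # reliably accepts -> shave the offer
        return round(base_pay * 0.85, 2)
    if acceptance_rate < 0.3:      # picky worker -> sweeten the offer
        return round(base_pay * 1.10, 2)
    return base_pay

# Two workers see different pay for the identical job:
print(personalized_offer(20.00, 0.95))  # 17.0
print(personalized_offer(20.00, 0.20))  # 22.0
```

Both workers are shown a take-it-or-leave-it number and never see what the other was offered, which is exactly what makes the question above hard.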
This runs into both the ideals and the limitations of the Free Market.
Ideally, there's incentive for people to collectively reach the most efficient solution through aggregated laziness and greed.
In practice, people only have so much bandwidth: shortcuts will be taken, options will be overlooked, and people will exploit or be exploited because of blinders either put on willingly or forced on them--on top of the natural limits of our capacity to observe reality, no matter how much information is provided.
Look to two-tailed tests: when worrying about only one side of a risk doesn't make sense, consider the flipside too. If you fear overpaying so much that you make a lowball offer, and someone feels compelled to accept it, the product or service might get "done", but in a way that screws you over in the long run as well.
That some companies underpay part of their staff for the same job doesn't make it moral. In some cases the underpayment correlates with genetics, which makes it even more questionable, and a black-box algorithm may introduce such bias without anyone intending it.
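How a black-box model does that without ever seeing a protected attribute can be sketched in a few lines (everything here is hypothetical: the zip codes, the "learned" multipliers, and the function names are invented). The model only uses a "neutral" feature like home zip code, but if that feature correlates with demographics, the pay differences do too:

```python
# Hypothetical illustration of proxy bias: no protected attribute appears
# anywhere, yet offers differ systematically by a correlated feature.
# Zip codes and multipliers are invented for illustration.

ZIP_MULTIPLIER = {         # "market adjustment" a model might have learned
    "60601": 1.00,
    "60621": 0.88,         # area with historically lower acceptance thresholds
}

def offered_pay(base: float, home_zip: str) -> float:
    """Adjust the base offer by the learned per-zip multiplier."""
    return round(base * ZIP_MULTIPLIER.get(home_zip, 1.0), 2)

print(offered_pay(20.0, "60601"))  # 20.0
print(offered_pay(20.0, "60621"))  # 17.6
```

If zip code is a proxy for demographics, the second worker is paid less for the same job, and nothing in the code ever mentions why.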
Anyway, what is moral depends on your personal values, not on the law or on how things are usually done; disagreement is to be expected.
I think a reasonable system here would also allow contractors to send a counteroffer, maybe at 10x, 100x, or 1000x the original. Then it would be up to the ordering side to accept or reject one of those.
And if the company makes it a policy to never accept any counteroffer (which is legal and fair), you're back to the same system, without that feature existing.
Platforms sometimes care, by which I mean: they achieve a market position where sellers can't afford not to use them, then leverage that power to force lots and lots of weaker people and entities to do what they want, possibly raising the lowest prices in the overall market in the process, and so hurting buyers too.
What if there's discrimination built into the system? Maybe a business is willing to pay white people more, or pay women less. They can do that while still following your framework. Is that moral?
When they offer to pay you X for the job, but then pay you < X.
Or if they get you to pay them money upfront (e.g. for uniforms) on the basis that 'workers earn $Y per day', but then change the rules so some workers don't earn $Y per day, and don't refund the upfront payment to unhappy workers.
Looks like Shipt/Target successfully converted gig work back from a percentage of revenue (a percentage of cart value) to a task-based rate. Workers lose when they can't capture value proportional to the revenue generation they support, only in proportion to their hours of labor.
> Workers lose when they can't capture value proportional to the revenue generation they support, only in proportion to their hours of labor.
Time-based contracts are pretty normal. I imagine most people on the planet are on them. There are exceptions - e.g. sales commissions - but to say that workers lose on the thing that most people do requires at least some elaboration.
If workers are low-skilled, easily replaceable and practically fungible then realistically speaking why would their employer pay them based on value-added?
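The difference between the two pay models in this subthread can be shown with toy numbers (the 7.5% share and $8 flat rate are invented, not Shipt's actual figures): under a percentage-of-cart model the worker's pay scales with the order's revenue, under a flat task rate it doesn't.

```python
# Toy comparison of two gig pay models; all rates are invented.

def pay_percent_of_cart(cart_value: float, rate: float = 0.075) -> float:
    """Worker captures a fixed share of the revenue they help generate."""
    return round(cart_value * rate, 2)

def pay_per_task(base: float = 8.00) -> float:
    """Worker gets a flat rate per order, regardless of its value."""
    return base

for cart in (50.0, 150.0, 400.0):
    print(cart, pay_percent_of_cart(cart), pay_per_task())
# 50.0 3.75 8.0
# 150.0 11.25 8.0
# 400.0 30.0 8.0
```

Small orders actually pay better under the flat rate; what workers lose in the switch is the upside on high-value carts, which is the "value proportional to revenue" the parent is talking about.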