The analogy is missing a key part of the system as designed today: putting something on someone else's property. The house analogies are terrible; something closer to what's actually happening is as follows.
I'm your neighbor and I want you to store a box for me. It's taped up, so you don't really want to open it, but would you store it for me without knowing what's in it? You could be like other cloud providers: rip the tape off and rummage through it looking for whatever.
Apple has designed a method for me to scan the box at my house, only for CSAM (where "CSAM" is defined by the intersection of multiple databases), and then hand it to you with a note attached that says "this is not CSAM". Now you can store the box with some confidence that you're not storing CSAM. It's also more private, because you don't need to go rummaging through my box - even though you currently can if you want.
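To make the "intersection of multiple databases" and "note attached to the box" parts concrete, here's a toy sketch in Python. This is NOT Apple's actual protocol - the real system uses a perceptual hash (NeuralHash) plus private set intersection so the device never even learns the match result, and the "note" is an encrypted safety voucher. All names below are hypothetical, and a cryptographic hash stands in for the perceptual one:

```python
# Toy illustration only -- not Apple's real NeuralHash/PSI design.
import hashlib

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash (the real system uses NeuralHash)."""
    return hashlib.sha256(data).hexdigest()

# Two independently sourced databases of known-bad hashes. Only hashes
# present in BOTH count -- the "intersection" limits any one provider's
# ability to sneak extra entries into the blocklist.
db_a = {image_hash(b"known-bad-1"), image_hash(b"known-bad-2")}
db_b = {image_hash(b"known-bad-2"), image_hash(b"known-bad-3")}
blocklist = db_a & db_b

def scan_before_upload(data: bytes) -> dict:
    """Scan on-device, then hand over the 'box' with a note attached."""
    matched = image_hash(data) in blocklist
    return {"payload": data, "voucher": "match" if matched else "no-match"}

print(scan_before_upload(b"vacation.jpg")["voucher"])  # no-match
print(scan_before_upload(b"known-bad-2")["voucher"])   # match
```

The point of the sketch is the ordering: the check happens on my side of the property line, before the box ever reaches your house.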
Of course we're starting from the basis that you have decided you don't want to store any CSAM. If you don't care about what you store it doesn't matter.
And you're right that Apple could make a policy change to scan my whole house and upload that somewhere, but guess what? iOS users (smartphone users in general) have always been one policy change away from Apple/Google doing something like uploading/sharing all the face ML data (which IMO is way more valuable than some hash matches). So nothing has really changed.
Analogies are always misleading. A cloud provider is no neighbor. They facilitate a specific service, and it is clear that the package is owned by the client. The cloud provider shouldn't have any liability for the data it stores, especially if it's encrypted and not shared with the public; nobody should ever care what's inside the package. If the authorities have a good reason to think that you have drugs or CP in your package, then they can force you to open it.
IMHO, no combination of ones and zeros should ever be illegal. The act of distributing them to others or creating them in the real world should be. The energy stored on a flash drive doesn't harm anybody, human action does.
iCloud servers are not your property; they are Apple's property. Even the neighbor terminology is accurate, since there is an implied level of trust higher than with a random stranger.