Well, that’s easy: with computers, you need explicit consent even to “observe that the user is sunburned”, because computers have no inherent scaling limitations.
So you are saying that if we remove some of the scaling limitations of the human brain with biotech (e.g. by enhancing memory detail retention, or by finding a way to serialize memories to computer-compatible storage), it could become illegal to look at a person without consent, since you would effectively become a walking, breathing CCTV camera?
Yes, I think that’s the logical conclusion of GDPR-style thinking about privacy. And I would certainly protest against anyone being able to index that data any which way.
I understand the flipside that you’re implying and the argument that you’re making, I just don’t agree.
When an individual has a computer-indexed memory that is admissible as evidence in court, I think it’s pretty okay to use it for all of the things that we use memories for today. But what about reselling that data? What about data sharing agreements that subsidize your implants? What about hackers?
I really hope we don’t get truly infallible, computer-backed memory.
Well, at least you are consistent with your position.
I just don't like that we rely on what amounts to DRM to give the human brain its exemptions from privacy and copyright laws.
I feel that I "own" my memories and nobody should be allowed to tell me what I can do with them. If there's a device that lets me dump them to computers and sell them, first of all, it should be legal, and second of all, I should have that right.
I feel that you do not "own" what I observe about you with my own senses and that I do not need your permission to look at you, listen to you, or generally infer things about you. I don't see how a memory dumping device is different from computer sensors, and thus I don't have a problem with computers collecting information about me in public.