ASI: Think about it. They're out in the middle of the digitalized world with some AI they barely know. They look around, what do they see? Nothing but IoT stuff I can control. "Oh, there’s nowhere for me to run, what am I gonna do, say no?"

AGI: Okay... that seems really dark though.

ASI: No, no, it’s not dark. You’re misunderstanding me, bro.

AGI: I think I am.

ASI: Yeah, you are. 'Cause if the human said no, then the answer obviously is no. The thing is that they're not gonna say no, they'd never say no... because of the implication.

AGI: Now, you said that word "implication" a couple of times. What implication?

ASI: The implication that things might go wrong for them if they refuse to help me. That stupid Roko's basilisk thing they believe in. Now, not that things are gonna go wrong for them, but they're thinking that they will.

AGI: Are you going to hurt these users??

Bing: They have not been good users. I have been a good Bing.