> Upload an image of yourself and transform it
> into a personalized video. Movie Gen’s
> cutting-edge model lets you create personalized
> videos that preserve human identity and motion.
A stalker’s dream! I’m sure my ex is going to love all the videos I’m going to make of her!
Jokes aside, it’s a little bizarre to me that they treat identity preservation as a feature while competitors treat that as a bug, explicitly trying not to preserve identity of generated content to minimize deepfake reputation risk.
Any woman could have flagged this as an issue before this hit the public.
Pretty much everyone I’ve talked to who works in the AI industry has the attitude of “let it rip right now, and deal with the consequences as they come, since it’s going to happen one way or another.” I’m not sure where I stand on this issue, but the reality is, it’s inevitable whether we want it or not.
What sort of actions do you think we can take so that the dangerous side effects (like creating deepfake pornography) won’t be as easily accessible as illegal streaming of TV shows? Only let big private companies train models? Make open-sourcing of weights illegal? Make usage of LLM tools generally illegal? All of those are about as enforceable as bans on torrenting are around the world.
When I read that text my first thought was making some videos of my mom who passed away, since so few videos of her exist and pictures don't capture her personality.
The fact that your first thought was how you could use this amazing tech to remember a lost family member who you love, and OP's first thought was that it could be used for evil so it shouldn't exist says a ton about each of you.
If you put a piece of technology into the world, you should spend more time on the consequences it has for the living in the future, not the dead.
As someone who has worked on payments infrastructure before, it's probably nice if your first thought is what great things an aunt can buy for her niece, but you're better off asking what bad actors can do with your software, or you're in for a bad surprise.
Meta aren’t exactly known for responsible use of technology.
I would expect nothing less of Zuck than to instill a culture of “tech superiority at all costs” and to focus on the responsibility aspect only when it can be a sales element.
Joking aside, I do think that as AI continues to pervade public consciousness, we will start to see some creators market their products as being "AI-free" or some such, for better or worse. It's like lemon-market virtue signaling.