I think the issue with a Google-glass style device is that you can see just as much info on a watch. We already have watches and are socially familiar with them, and watches are out of the face when not needed.
What info could you need that isn’t accessible by a watch? Especially considering a watch can be a simple touchscreen but a glasses UI is…?
The NReal Air is pretty cool, and it’s just glasses with a screen, but is it compelling enough for the price tag? Especially an apple price tag? It’s just a “head mounted monitor”. The problem is that moving around makes it shake and hard to focus on, and it requires a cable to get data and power. Apple probably could make it wireless with Apple Watch internals, but then it wouldn’t be able to stream graphics very well.
The watch has a very small display surface. Glasses can potentially use your whole field of vision. You could display an entire room full of monitors rendering full size web browsers and other applications. So there's a huge difference in what info you could see.
I don’t think “data on the fly” and “display a room full of web browsers” has a lot of overlap.
The NReal is "full vision(ish)" glasses, so that's available, but you probably couldn't produce that wire-free. It is doable if you're willing to plug in when needed (so not on the fly).
You could do apple-watch-in-the-eye (Google glass basically) but that probably couldn’t sustain browsers or monitor images for long. It would work for things like notifications though… which the watches already do.
You can do a lot more with the ability to display using the whole field of vision. That's just a fact. If you can also use better control systems (gestures, etc) and interact with a visual environment those are also big benefits. You're right that the glasses form factor can't get near that right now, but we know that headsets can (like the Quest).
I am also of the opinion that Apple really messed up the watch ecosystem by locking it down so much. They blocked full web browsing, limited text messaging, wouldn't allow Spotify streaming for years, etc., not because the watch was incapable, but because they wanted to maintain their walled garden and didn't trust watch owners to be able to type on a watch. If they had allowed these things the watch would have been much more successful.
I wear an Apple Watch; it's not really a good information display vector. It's great at displaying certain things but definitely bad for textual information, in addition to needing to raise your arm to actually read it.
My imagined concept of a glass display is to be able to read while I'm hiking/walking; an 80x24 display that can be hacked on would have been perfect. I wouldn't need my hands to hold anything, and could probably scroll using eye movements.
I don’t know, I think it’s a great vector. Not if you’re trying to read a book, but great if you’re trying to check a notification.
Even reading an email while walking with a visual overlay seems distracting or dangerous. I can’t imagine anything that requires strolling but isn’t better served with a phone display.