
Have you tried coding with these glasses?


Yeah. It’s doable, but I found that I needed to scale the UI to read comfortably. For normal UI elements I mostly knew what the buttons did and didn’t need to read them, but it’s like using a TV as a computer monitor… it’s just far enough away that everything felt too small.

The resolution isn’t quite high enough for fine text detail, and unlike VR goggles, glasses aren’t secured to your head as firmly, so the image shakes more. Again, that makes focusing on text harder.

That said, they’re good enough to use in a pinch (e.g. on a plane; I tried this). They prove the tech is around the corner. They work, they do what’s advertised, they just need some refinement. I suspect 1-2 years until the proper version gets made, if people keep buying them. But they work today if you’re really into it.

Surprisingly, I actually like the Oculus screen sharing in terms of the ability to focus on and read the screen. The headset is too heavy and the resolution too low, but I like the experience. The NReal is slightly worse in some regards and slightly better in others.

Edit: they have two modes. One acts as a USB-C monitor, and in the other your computer (via an app) projects virtual monitors that you look around at by turning your head, tracked via accelerometer. Mode 1 is good; mode 2 is glitchy. I couldn’t use mode 2 for anything real.
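
To give a sense of what mode 2 is doing, here’s a rough, hypothetical sketch (not the actual NReal SDK; the field-of-view and resolution numbers are made up) of how an app might keep a virtual monitor pinned in space while the head turns:

    import math

    # Hypothetical sketch of "mode 2": pin a virtual monitor at a fixed
    # direction in space and re-project it as the head turns.
    # Resolution and FOV values below are placeholders, not NReal specs.
    DISPLAY_W, DISPLAY_H = 1920, 1080      # assumed per-eye render resolution
    FOV_H = math.radians(46)               # assumed horizontal field of view
    FOV_V = math.radians(26)               # assumed vertical field of view

    def monitor_offset_px(head_yaw, head_pitch, monitor_yaw=0.0, monitor_pitch=0.0):
        """Return the pixel position where the virtual monitor's centre should
        be drawn, given head orientation (radians) from the IMU/accelerometer.
        The monitor is anchored at a fixed world direction (monitor_yaw/pitch);
        turning the head moves it the opposite way across the display."""
        dx = (monitor_yaw - head_yaw) / FOV_H * DISPLAY_W
        dy = (head_pitch - monitor_pitch) / FOV_V * DISPLAY_H
        return DISPLAY_W / 2 + dx, DISPLAY_H / 2 + dy

    # Example: look 10 degrees to the right of a monitor anchored straight ahead;
    # the monitor's centre shifts left on the display.
    print(monitor_offset_px(math.radians(10), 0.0))

Any lag or noise in the head-orientation readings feeding this kind of re-projection would show up as the monitor swimming around, which is presumably part of why mode 2 feels glitchy.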
