Additionally, the virtual screen was not fixed in space but moved around when you moved your head, which gave me vertigo after prolonged use.
The current version of these glasses has an optional device they sell, called the Beam, that provides this – I assume it’s got enough 3D hardware and such to do the projection.
The problem, as I mention in another comment, is that if you do any kind of 3D projection of a virtual monitor, you have to “spend” resolution from the physical displays on it – the virtual monitor has to be rendered at enough lower resolution than the physical panels that it still looks good after projection – and I don’t want to give up that resolution.
Like, there are physically 1080p, 1920x1080 OLED displays in front of each eye on these.
My laptop monitor, right now, is 2560x1600. So even from the start, I spend resolution just to get down to the resolution of the displays in the physical HMD.
Then I’m projecting a virtual monitor on that. You could argue about what a reasonable virtual-to-physical ratio is, but it’s gotta be less than 1.
The display might be big in terms of visual arc and use a lot of my optical receptors, but at the end of the day, I want to shovel a lot of data into those receptors.
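To make that arithmetic concrete, here’s a rough back-of-envelope sketch in Python. The panel and laptop numbers are the ones above; the 0.7 virtual-to-physical sampling ratio is purely an assumption for illustration, since nobody publishes that figure:

  # Back-of-envelope for the "spend resolution" argument above.
  # The 0.7 ratio is an assumed placeholder for how many virtual pixels
  # you can map onto each physical pixel and still have the reprojected
  # monitor look acceptable.
  panel_w, panel_h = 1920, 1080    # physical micro-OLED per eye
  laptop_w, laptop_h = 2560, 1600  # native laptop display

  ratio = 0.7                      # assumed virtual-to-physical ratio (< 1)
  virt_w, virt_h = int(panel_w * ratio), int(panel_h * ratio)

  share = (virt_w * virt_h) / (laptop_w * laptop_h)
  print(f"virtual monitor ~{virt_w}x{virt_h}, "
        f"about {share:.0%} of the laptop's native pixels")

Even before applying any ratio, 1920x1080 is only about half the pixels of 2560x1600; with the assumed 0.7 ratio the usable virtual monitor ends up around 1344x756, roughly a quarter of the laptop’s native pixel count.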
Right, I have a 1600p laptop screen as well, and the resolution downgrade was noticeable. What you say about the projection makes sense. Unfortunately, I haven’t seen any specs for the micro-OLED displays they use; they only claim that the virtual screen is 1080p, which might be achievable if the displays DO in fact have a higher vertical resolution. It DOES appear that they’ve increased the size of the displays from 0.55" to 0.68", but there’s no information on the native resolution that I can find.
If I saw these glasses in a store somewhere I’d probably try them out but they’d have to be VASTLY better than the ones I tried to convince me to buy them.
I think that some of the issue here is that the theoretical use case these are designed around is not what the author is trying to use them for.
The author is looking for a monitor replacement.
These are augmented reality goggles. Like, the hardware is optimized to look at the world around you and then display useful information annotated over it, for which resolution is not critical. If we had the data sources and software for that, it might be useful too, but right now we don’t really have that software library or those data sources.
I think that Snow Crash did a good job of highlighting some of the neat potential of AR, and also some of its issues:
Putting a rubber-band on brightness:
A black car, alive with nasty lights, whines past her the other way, closing in on the hapless Hiro Protagonist. Her RadiKS Knight Vision goggles darken strategically to cut the noxious glaring of same, her pupils feel safe to remain wide open, scanning the road for signs of movement.
Highlighting hazards in low-light conditions using sensor fusion can be useful (current high-end US military NVGs do some of this):
He turns off his view of the Metaverse entirely, making the goggles totally transparent. Then he switches his system into full gargoyle mode: enhanced visible light with false-color infrared, plus millimeter-wave radar. His view of the world goes into grainy black and white, much brighter than it was before. Here and there, certain objects glow fuzzily in pink or red. This comes from the infrared, and it means that these things are warm or hot; people are pink, engines and fires are red.
The millimeter-wave radar stuff is superimposed much more cleanly and crisply in neon green. Anything made of metal shows up. Hiro is now navigating down a grainy, charcoal-gray avenue of water lined with grainy, light gray pontoon bridges tied up to crisp neon-green barges and ships that glow reddishly from place to place, wherever they are generating heat. It’s not pretty. In fact, it’s so ugly that it probably explains why gargoyles are, in general, so socially retarded. But it’s a lot more useful than the charcoal-on-ebony view he had before.
And it saves his life. As he’s buzzing down a curving, narrow canal, a narrow green parabola appears hanging across the water in front of him, suddenly rising out of the water and snapping into a perfectly straight line at neck level. It’s a piece of piano wire. Hiro ducks under it, waves to the young Chinese men who set the booby trap, and keeps going.
The radar picks out three fuzzy pink individuals holding Chinese AK47s standing by the side of the channel. Hiro cuts into a side channel and avoids them.
Overlaying blueprint data can permit “seeing through walls”:
“YOU ARE HERE,” he says. His view of the Enterprise’s hull – a gently curved expanse of gray steel – turns into a three-dimensional wire frame drawing, showing him all the guts of the ship on the other side.
Down here along the waterline, the Enterprise has a belt of thick antitorpedo armor. It’s not too promising. Farther up, the armor is thinner, and there’s good stuff on the other side of it, actual rooms instead of fuel tanks or ammunition holds.
Hiro chooses a room marked WARDROOM and opens fire. The hull of the Enterprise is surprisingly tough. Reason doesn’t just blow a crater straight through; it takes a few moments for the burst to penetrate. And then all it does is make a hole about six inches across.
A lot of the obvious stuff that one might display in AR goggles doesn’t compete well with just showing reality in terms of usefulness:
He stumbles forward helplessly as something terrible happens to his back. It feels like being massaged with a hundred ballpeen hammers. At the same time, a yellow sputtering light overrides the loglo. A screaming red display flashes up on the goggles informing him that the millimeter-wave radar has noticed a stream of bullets headed in his direction and would you like to know where they came from, sir?
Hiro has just been shot in the back with a burst of machine-gun fire. All of the bullets have slapped into his vest and dropped to the floor, but in doing so they have cracked about half of the ribs on that side of his body and bruised a few internal organs. He turns around, which hurts.
The Enforcer has given up on bullets and whipped out another weapon. It says so right on Hiro’s goggles: PACIFIC ENFORCEMENT HARDWARE, INC. MODEL SX-29 RESTRAINT PROJECTION DEVICE (LOOGIE GUN).
He turns off all of the techno-shit in his goggles. All it does is confuse him; he stands there reading statistics about his own death even as it’s happening to him. Very post-modern. Time to get immersed in Reality, like all the people around him.
Yes, and he’s not wrong, as that appears to be the primary use case for these glasses. For full AR, you still need the Beam Pro, which costs half as much as the glasses themselves.
I do love Snow Crash (it was one of my favorite novels growing up), but I think Google Glass was probably much closer to that vision than these are. Personally, all I want is a big fucking screen fixed in space before me that doesn’t make me dizzy when I look at it for more than 5 minutes, or wear out my neck muscles too much because the headset is too heavy.