Bosch Smart glasses: A tiny laser array paints images directly onto your retina (ieee.org)
400 points by deniscepko2 on Feb 4, 2020 | 265 comments



Start of the article: "My priority at CES every year is to find futuristic new technology that I can get excited about. But even at a tech show as large as CES, this can be surprisingly difficult. If I’m very lucky [...]"

End of the article: "Bosch covered our costs for attending CES 2020."


Also: "After making a minor nuisance of myself, Bosch agreed to give me a private demo"

I mean it's possible they paid for him (and others) and still didn't allow him to get a demo... but it's unlikely. Also, I feel like IEEE can afford to pay the attendance costs for a reporter.

So, this is really just promoted content/an ad. Which is fine, and I think it's even fine to wait until the end to tell us, but not if the article body tries to paint a different, organic picture. That framing wasn't at all necessary for the content; it was simply there to distract from the paid-promo nature of the piece.


To paraphrase:

"We don't want this getting out just yet, but... oh okay, you can come behind the curtain just this once."

<end of private demo>

"So, what'dya think? By the way, here's a fresh marketing video we happen to have lying around. Tell everyone. No no, actually, let us pay you to tell everyone. We insist."


I wondered this myself, but I searched the IEEE website and it appears there are several stories about various Bosch technologies, so it's possible that Bosch was expecting this writer to cover their other stuff and wasn't planning to demo this particular item. Given that Bosch is looking for a commercial partner rather than to sell directly to the public, they might have had the glasses on hand to show possible partners, not reporters.


I don't know if the article was edited, but this is what the article now shows at the end:

> Robert Bosch LLC provided travel support for us to attend CES 2020. Bosch Sensortec, responsible for the Smartglasses Light Drive, was not aware of nor involved with the travel support.

So it seems like you are correct. The article could have been much more clear about this though.


I can't prove it, but at the time I checked, the OP of this thread had an exact and complete quote of what was on the page.

It has been edited at some point within the last ~2 hours.


It was edited since initial publication.

I don't think the new text changes anything, though. a) If that new text is trying to paint the picture that this isn't sponsored content and carries no bias, then why mention a disclaimer at all? b) Which exact legal entity or subsidiary was "officially" sponsoring the trip isn't really that relevant when they are so tightly connected, both in brand and organizationally.

They could have simply started the article with: "This year Bosch invited us to CES fully covering our attendance cost."

or alternatively later on "while Bosch sponsored our trip we still had a really hard time getting a demo for ... because of ..." (e.g. too many people)


> Bosch Sensortec GmbH is a fully owned subsidiary of Robert Bosch GmbH


This was my assumption too... they have plenty of consumer products currently for sale that they're probably much more interested in promoting.


sure. anything is possible.


Native advertising.


Aren't you generally supposed to disclose things up front? Not after the submarine has launched.


Preferably. If you want to create a more trustworthy rapport with your audience so they don't feel manipulated.

Sneaky disclosures kill any interest I have in a product/company.


Indeed. I was fooled. I went back after reading the OP's comment and now I have a bad aftertaste.


Yeah, they really Bosched this one...


Yeah, that's a big no-no for credibility. At least for me, I have to trust a stranger before I could ever convert on a pitch, and stating any financial obligations or fiduciary duties or conflicts of interest upfront or early on is far better than tacking it on later.

Case in point, I'm not even going to read this article after reading this comment.


Is it that unlikely that Bosch has sponsored a story written by an unaffiliated writer?


I don't understand your implication. Are you saying that an unaffiliated writer who is sponsored one time by Bosch wouldn't have an inclination of positive bias towards his financial benefactor, as opposed to a PR / marketing person on staff?


They wouldn't if they wrote the story before Bosch noticed it and decided to sponsor it as an article. Seems like good PR practice to me.


Does that change the fact that this is futuristic new technology?


The more relevant question is whether that changes the likelihood that this is a futuristic, new, working technology.


Well working is a qualifier that you added, not the author. Maybe he's only interested in things that explore innovative ideas rather than a functional product.


ieee.org tends to be like that. Still has interesting articles from time to time though


How about you actually quote it?!

>Robert Bosch LLC provided travel support for us to attend CES 2020. Bosch Sensortec, responsible for the Smartglasses Light Drive, was not aware of nor involved with the travel support.


I quoted what was on the page when I read it. They changed it at least twice, the last time I checked it said something about Bosch Sensortec, and I had to Google it to see if they were related to the glasses. They since apparently added that in as well.


Holy shit. The parent comment is as disingenuous as the implication it's making about the author of the article. It earned a very rare HN downvote from me.


Looks like they changed the disclosure at some point. The original version of the article is here: http://archive.is/QCmA6


Thank you! I tried using internet archive but somehow it wasn't loading the page properly. I was going a bit insane, but remembered it's just the internet and moved on.


I can confirm that the article was edited. It said exactly what the parent comment wrote when I read it.

This IMHO makes it even worse. The article should clarify that they updated this description.


As an amateur runner and triathlete I really hope someone will integrate this technology into suitable smart sunglasses. I would pay $1000 today for such a product if it actually works. My GPS fitness tracker is a great tool but I hate having to constantly glance down at my wrist to check pace, distance, and heart rate. This is particularly annoying when executing a structured intervals workout based on specific target metrics.

There are existing heads-up display products targeted at cyclists such as the Everysight Raptor and Garmin Varia Vision. However they aren't practical or comfortable for runners.

Ideally I'd like the smart glasses to have the following features: ANT+ Extended Display profile. 6 hour battery life. Lightweight with even weight distribution (not all on one side). Prescription lens compatible.


I don't trust any of the major tech companies to do a good job with this product. How long until your weather app needs access to that always on camera in your glasses?

Practically I also can't imagine how they'll solve weight/battery/networking issues with the device particularly in athletic settings where you're literally putting the device through a constant earthquake.

Maybe we should stop wishing for some ridiculous convenience layer to be added to our lives and just look around during our runs like we've done for thousands of years.


Don't worry about it. None of the major tech companies understand the endurance sports market and it's too small of a niche to even interest them.


All it needs to do is define an interface and the sports watch/app companies will do it themselves. Google doesn't need to understand every use case and make an app for it; they made Android Wear and anyone can make an appropriate app.


Having an app doesn't help if the hardware sucks. Android Wear has been a failure in the sports market so far due to short battery life, awkward touch interface, and limited sensor support.


Or (and this is controversial), design a convenience layer around some other sensory input that doesn't require vision or alienate people with visual disabilities.


Why not both? Except, there are already solutions that work quite well for audio. Google Glass even used bone-conduction speakers IIRC.


Agreed, I am also an experienced runner and this would be wonderful. Assuming it can get GPS and map trails, trail running would be vastly improved by this. Right now people have to paint trees every 20 feet or so to mark the trail, but if you have a live HUD it can tell you if you're on or off course fairly quickly.

Trouble is GPS isn't always reliable in heavily wooded areas, so there might need to be improvements in GPS before that is really viable.


I would actually prefer to not have any GNSS (GPS) receiver in the smart glasses in order to keep the weight down and battery life up. GNSS receiver can still be in a wrist device and just use the smart glasses as a display linked via ANT+.

GNSS accuracy is poor in forests, canyons, and dense urban environments due to line-of-sight obstructions and multipath reflections. The latest Navstar Block III satellites should improve that a little, but any real solution would require deploying additional satellites in geosynchronous orbit like the Japanese QZSS.


> any real solution would require deploying additional satellites in geosynchronous orbit like the Japanese QZSS.

There's nothing about geosync orbit that would help with the tree/canyon effect. Satellites directly overhead contribute the least to horizontal positioning - essentially only helping to establish the "true" clock value but not 2D position.

Multi-constellation tracking can be helpful, simply because there are more satellites to use.
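
To make the geometry concrete, here's a rough numpy sketch of the dilution-of-precision calculation (the constellation is invented for illustration): adding a zenith satellite barely moves HDOP while the vertical and clock terms tighten noticeably.

    import numpy as np

    def dops(los_vectors):
        # Geometry matrix: one row per satellite, [east, north, up, clock]
        G = np.hstack([np.asarray(los_vectors), np.ones((len(los_vectors), 1))])
        Q = np.linalg.inv(G.T @ G)
        return (np.sqrt(Q[0, 0] + Q[1, 1]),  # HDOP
                np.sqrt(Q[2, 2]),            # VDOP
                np.sqrt(Q[3, 3]))            # TDOP

    def los(az_deg, el_deg):
        # Receiver-to-satellite unit vector from azimuth/elevation
        az, el = np.radians(az_deg), np.radians(el_deg)
        return [np.cos(el) * np.sin(az), np.cos(el) * np.cos(az), np.sin(el)]

    sats = [los(0, 20), los(90, 40), los(180, 25), los(270, 45)]
    print("4 sats:       HDOP/VDOP/TDOP =", dops(sats))
    print("+ zenith sat: HDOP/VDOP/TDOP =", dops(sats + [los(0, 90)]))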


None of that is relevant to how QZSS actually augments GPS accuracy. Real world tests show that it works better than just using more constellations in lower orbits.

https://www.ion.org/publications/abstract.cfm?articleID=9679


> GNSS accuracy is poor in forests, canyons, and dense urban environments

I feel like this should be solvable on the software side, especially for medium or long distance runs. If I look at the traces, I can see that the recorded zig-zag is obviously wrong, and a bit of smoothing should be able to fix it.

Are there any running apps that are accurate?


Most fitness trackers do apply some level of automatic track smoothing but it's not a solution. Sometimes athletes really do zig-zag around, especially on trail runs in rough terrain. The Apple Watch is sometimes overly aggressive about track smoothing which leads to funny results showing the athlete running through solid obstacles.

Accelerometer, magnetometer, and gyroscope sensor data can help a little to sanity check GNSS inputs and fill in brief gaps. But those tiny sensors have terrible drift which makes them nearly useless for sustained position tracking.
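
For anyone curious what the "sanity check" half might look like, here's a minimal sketch (the speed threshold is invented, and real products surely do something fancier): gate out fixes that imply impossible speed, then lightly average what's left.

    import math

    MAX_SPEED = 12.0  # m/s; faster than any runner, so treat it as GPS noise

    def gate(points):
        # points: list of (t_seconds, lat, lon); drop physically impossible fixes
        kept = [points[0]]
        for t, lat, lon in points[1:]:
            t0, lat0, lon0 = kept[-1]
            # Equirectangular approximation, fine for consecutive fixes
            dx = math.radians(lon - lon0) * 6371000 * math.cos(math.radians(lat0))
            dy = math.radians(lat - lat0) * 6371000
            if math.hypot(dx, dy) / max(t - t0, 1e-6) <= MAX_SPEED:
                kept.append((t, lat, lon))
        return kept

    def smooth(points, window=5):
        # Centered moving average over lat/lon; wider window = straighter track
        out = []
        for i in range(len(points)):
            w = points[max(0, i - window // 2):i + window // 2 + 1]
            out.append((points[i][0],
                        sum(p[1] for p in w) / len(w),
                        sum(p[2] for p in w) / len(w)))
        return out

The window size is exactly the trade-off described above: too wide, and real zig-zags get averaged into running through buildings.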


I run in New York City using a Garmin (Forerunner 935 but the model doesn't really matter). GPS accuracy is terrible in midtown due to the signal bouncing off the buildings. Sometimes it records that I've run a two minute mile. At a minimum you'd think they'd have software that can detect that it's unlikely I'm setting a world record (or scaling the side of a building).

This frustrated me enough that I eventually got a Stryd footpod. The pace/distance tracking is extremely accurate and I use it to override what the watch records. So the GPS track still bounces all over the place but the recorded pace/distance data is correct.


I used to be an ultradistance runner and also built and sold an AR/Computer Vision company.

Unfortunately the state of localization isn't ready for passive (aka non-radiating) reliable sub-meter localization over anything more than a few square meters in a well-lit indoor space.

It's likely going to be a while before we get to "wear every day" AR glasses with sub-meter-accurate camera position/localization.


I would maybe buy smart glasses for $1000 if and only if the protocol to interface with them was documented. First because it’s unlikely they’ll support the platforms I like to use, and second because enough of these products have come and gone that I don’t trust they’ll be officially supported for more than two years or so.


The ANT+ Extended Display profile is documented. https://www.thisisant.com/developer


Wouldn't this be a total shoe-in for an audio-based interface? Probably I don't quite understand your requirements - I'm imagining you want alerts for when you're within your pace/heartrate targets, and for when you reach specific distance targets or intervals.


Audio alerts don't work very well in practice. My fitness tracker can be configured to play alert tones when I reach my target zone or when various metrics are too high or low, but when I try to rely on those alerts I tend to overcorrect and bounce off the limits. Plus sometimes the environment is just too noisy and I can't distinguish between different alerts. That makes for a frustrating, low-quality workout. Having the actual numbers constantly in the corner of my eye would make the process much smoother and easier.

Also I hate to be "that guy" whose stupid wristwatch is constantly beeping during group workouts and races.


I was thinking you'd use headphones. As for overcorrecting, I see what you mean, but I think there's probably a way to have the signals you want with just audio, and therefore, with today's tech. I've tried Jabra's fitness tracker and it's way too wordy, but I think that's a case of poor implementation.


Nope. Triathlon rules specifically ban all types of headphones. Train like you race.


Ah, well, that's that then. But how long would AR glasses be allowed?


No one is seriously proposing use of AR glasses for endurance sports because obviously anything that significantly interferes with vision is a safety problem. But a simple HUD is fine. Cyclists have been using them for a few years now and no one is calling for a ban.


*shoo-in (sorry)


Thanks! As a non-native speaker, I don't make spelling mistakes like that very often, but I guess that's a rare instance of having learned a word from spoken language rather than written!


There is a company that makes $200 swim goggles that do pace and heart rate, though that's only one of the disciplines.

https://www.formswim.com/

(I am in no way affiliated with this company)


I use Peloton Digital/Freeletics type apps where they tell you how to work out properly and build workout regimes for you but...on the phone.

I'd love to have it overlaid in the corner of my eye while I'm working out.


Social tics, such as frantically tapping the side of your head when trying to "remember" someone's name, will become normal. I thought it was weird when people were talking to themselves with bluetooth on or walking into a street sign while looking down at their phone... but it's just going to get stranger. I could easily imagine eye flickering or eye rolling when your brain OS is rebooting. These glasses, and the contact lenses in the works, will make us forget having to worry about things we take for granted today, just as our phones helped us forget people's phone numbers or stop caring about knowing how to get somewhere. The new devices will be contextually aware of the facts we need to know, which will just appear as an overlay... no more "let me look it up on Google". If it's coupled with audio, GPS, and other inputs, it could be even more proactive, finding things before you even knew you needed them.


I have a Bluetooth "joystick" which is so small it uses a ring so that I can hold it. No reason not to improve on that. The interface for your walking-around glasses could be a thick ring that you rub and click with your thumb.


This is how Focals by North [0] work. Everyone I’ve asked about it has thought it’s a silly idea (to have to wear a ring), but I still can’t think of a better interface.

[0]: https://www.bynorth.com/


I assure you, the combination of a stand that holds a tablet over my bed and the bluetooth joystick ring is the most comfortable possible bedtime reading scenario.


What stand do you use? Please share. :)


Amazon is full of them. I got a generic steel gooseneck with a clamp that attaches pretty well to my steel-pipe headboard, and I zip-tie it at a second point. My wife got a floor-standing gooseneck to hold her Chromebook, and added a 20 kg steel weight at the bottom to keep it upright.


Did they ever actually sell these? I've been trying to get my hands on a pair but have found it impossible.


Problem is I prefer fewer expensive things to charge and/or possibly lose


Can you share the product name? I'd like to get one.


It's the Acgam R1, about $16. The pictures make it look big and bulky, until you realize that the hole just fits a large man's index finger.


That's so cool. Thanks for sharing. What do you use it for?


> I thought it was weird when people were talking to themselves with bluetooth

When Bluetooth headsets became small enough to be inconspicuous I lived in a marginal neighborhood. When my wife and I would go out, we'd play a guessing game called "Bluetooth or mental illness."


I play a game with people talking on their BT device, where I pretend I don't already know that, and I talk back to them as if they're talking to me. The object of the game is to see how long I can keep them from their other conversation by refusing to acknowledge their device.


We play that too. Bluetooth or crazy.


I'm not sure how it can get stranger. I've had guys come up to the urinal next to me still talking into their ear pieces. It's incredibly rude, bizarre behavior. Their conversation is taking place in an entirely different context, and I think that's where their minds usually are too. So they might not think anything of it when they approach from behind saying something like, "We need to take care of this right now" too loud and too close to your ear.


It seems rude in most contexts, to just shove your conversation into everyone else's life. Unless it's a real emergency, I don't get how people think it's okay.

I think a lot of people don't realize that a lot of people consider it a luxury to be out of contact for periods of time.


When that happens to me, I make sure to let the person on the other side of that conversation know where it's being held.

I flush repeatedly, turn on the water, and take out my phone and have very loud imaginary phone calls, or just cycle through the available ringtones at high volume.

Some people are still so low class that it doesn't even faze them.


Hmm, you can also try pretending you have one on yourself: laugh joker-style, look up, then talk to the ceiling, making sure the volume of your voice is uncomfortably loud and inappropriate for the setting you are in. It's not like that person can call you rude, and it will give you some satisfaction that you have interrupted their call. Maybe they will self-reflect at some point, realizing how rude they are themselves. IDK.


Is this so important that there needs to be retaliation? I could not care less.


I wonder if guys like that realize that everyone on the call can totally hear them piss and flush the urinal.


I already can't stand talking to people who have those air pods in or are looking at screens. I don't know if social norms around this will change or not, but I hope they don't. No one can have a straight conversation with someone distracted by other stuff. Even if that person has no music on, it still feels as though he's elsewhere.


Judging by the (upper)middle class kids I work with, the social stigma around wearing earbuds while holding a conversation is clearly on the way out. It seems like a rapid social shift (which makes me old at 34, I guess) and is probably an “important” change for those who want to sell us on the AR future.

To that end, my experience of using transparency mode on AirPod Pro earbuds is that they very much do “become invisible” while allowing me to overlay (auditory) information on the world around me. If they were built to be as inconspicuous as my father’s hearing aids, nobody would know the difference and no overt social stigma would persist. The AR future of today is auditory.


Calling AirPods “those air pods”, but also having something as “young” as “big_chungus” as your username... I’m having a hard time figuring out your age!



That is a great point about distraction and not feeling as though you truly are the focal point. It will get very odd if something like Neuralink comes into play, where a person just gets ideas as if they were their own while talking to you; then you'll never know if they are actually thinking for themselves at all, on top of not paying you full attention. But if you are a parent with a teenager, you'll kind of already feel this way.


> content lenses

I'm guessing that's a typo, but it works just as well as "contact lenses".


Yah, it was, and funny that it did. Thanks, fixed. Maybe they should be called "content lenses". Funny accident.


I'm sure we'll develop sensitive clothing, body feedback and other forms of haptic interfaces - a tap on the side of the glasses might not be necessary if you can train your AR to respond to say, snapping fingers or tapping feet.


The thing I hate about this trend is that we seem to love to forego a vital part of interfaces in the process: visibility. I don't know why designers seem to hate buttons, but it's really annoying.

EDIT: where "visibility" also includes haptic feedback and all of that other good stuff. See http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesi...


I don't hate buttons. Here, I'm making a small device with lots of buttons:

https://photos.app.goo.gl/VpG8uGSzWzCskN2m7

Can you write C? Do with it what you like. I'll make a javascript framework that runs on a smartphone too. (Nordic Bluetooth LE chip, microphone, speaker, 12 RGB LEDs, gyro/accelerometer, battery that should last weeks). Internet connected walkie-talkie, translator, study languages (like this: https://forum.language-learners.org/viewtopic.php?t=8699).

I'll make more devices soon, and will sell them for $25. Send me an email, and I'll message when they are ready (mail in profile).


Love it! Sadly, my self-insight has reached the level where I know I would buy it and then never use it after tinkering for a few hours, so I'm afraid I have to pass on ordering one... but I really like the concept!


Buttons are expensive, and actually placing things in the real world carries a lot of inconveniences.

The good news is that AR can add visibility to hidden interfaces too. The bad news is that nobody will bother.


It has to be more than that. A touch sensor might be cheaper than a button, but even Software UI distances itself from the button visual of the 90s and 00s, instead opting for icons with no discernible background or button association. Software UI design is very much imitating how a physical touch sensor interface looks. It seems to be the current trend, chasing a "clean" design above all else.


I miss switches. Big metal toggle switches with labeled positions. These tiny hall effect buttons that barely give any feedback are the worst of both worlds. They have no visible status. The tactile element isn't much better than a touchscreen, and they break. Toggles!


Buttons aren't visible if they're on the side of your head. I have headphones with about 5 buttons. I can often successfully use the volume and power ones if I locate them correctly, but there are another two next to those that I sometimes end up pressing instead.


I should have been more clear: those buttons are still "visible" in that they have a haptic interface. I have headphones with "tap to play/pause, swipe to change songs or volume"; it's impossible to know that without reading the manual or triggering one of these actions by accident. Which incidentally also happens a lot.

That's bad interface design IMO. I guess the headphones look "cleaner" but it's really annoying in practice and not worth it.


Those are crappy buttons. Good buttons are ones you don't need your eyes for. Take an Xbox controller: every button on the device is labelled and there is a schematic detailing every aspect of their function in every game, yet you never look down at the controller because you can discern every single button by feel alone.


Buttons don't have to be visible. They can have different shapes that you can feel. Braille would be useful if people would learn it.


There are also non-button tactile controls, like the 'buttons' on the Airpods Pro that are actually just pressure-sensitive indents you squeeze.


Which - as a Minnesotan - I find to be a frustrating interface when gloves are required.

(somethingsomethingdesignedforcalifornia)


You know it's cold when your bare finger no longer works on a touchscreen.


> I could easily imagine eye flickering or eye rolling when your brain OS is rebooting.

That image reminds me of the transhumanist YouTube series H+. In one of the first episodes one of the characters keeps trying to “reboot” after his implant succumbs to a computer virus capable of killing people.


This reminds me of a technology from a couple of decades ago that seemed to disappear: wearable computer monitors. What ever happened to those?

Basically, you wore this thing like eyeglasses on your head, and it had a small arm that extended in front of your eye (but a little below it). When you looked down, it appeared like a computer monitor was hovering in front of you. The very early models were 320x200 resolution in monochrome (red on black), but I tried one at a trade show in 2000 that I think was 800x600 in VGA color, which at the time was pretty decent. I'm surprised these never got more popular; they would be great for laptop computers: you could have total privacy in your viewing (unlike a normal screen), and with improvements in the technology you could potentially "see" a much larger screen than a normal laptop has.

Does anyone else remember these?


Yes. They were terrible. Best case was Google Glass, which sounded entrancingly innovative until you tried one and immediately lost all interest.

Resolution was very low, battery life very short, UI very annoying, appearance very embarrassing.


I actually enjoyed my Google Glass, primarily its always-on easy-to-reach nature. Pulling your phone out of your pocket, turning it on, and unlocking it is a trivial action, sure. But putting your finger up to your face is still an order of magnitude faster and easier. As a result, I found myself taking way more photos and having literally dozens more phone calls with friends and family. It was pretty interesting.


Resolution and battery life were too low, costs too high. This is essentially what Google Glass was in a more compact and usable package.


The only time I ever saw one in the wild was on a security guard. He was using it to watch a movie from what I think was a mini-disc player.


What are the safety implications of this? If something goes wrong, will people be blinded? What effect do these lasers have long term on the retina, the lens, and the vitreous humor of the eye?

I did not see any part of the article address these issues.


Given that the lasers are light shining in a straight line, and that they're presumably calibrated to intensities the eye can deal with (otherwise we couldn't see them/they'd be less sharp) there won't be any more long term effects than just normal eye use.

Our eyes are designed to let light in. The power level of lasers that can run all day on a 350 mAh battery they share with some electronics is going to be minuscule, like maybe a fraction of a milliwatt. A common laser diode available right now that outputs 1 mW continuously consumes about 36 mA to do that. Three lasers outputting 1 mW would consume roughly 36 × 3 = 108 mA, so a 350 mAh battery would only power those three lasers for about 3.2 hours, with nothing left for the electronics to use.

Less than 1mW is not much, even for a laser, and at visible wavelengths it's not going to transfer enough heat to even measurably change the temperature of the human eye, much less damage it.
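
Spelling that estimate out (all numbers here are the assumptions above, not Bosch specs):

    battery_mah = 350   # assumed battery capacity
    diode_ma = 36       # assumed draw for a 1 mW laser diode
    lasers = 3          # R, G, B

    total_ma = lasers * diode_ma       # 108 mA
    hours = battery_mah / total_ma     # ~3.2 h, nothing left for the logic
    print(f"{total_ma} mA draw -> {hours:.1f} hours")

And that's the worst case of all three diodes at full continuous output; a scanned display that is dark most of the time should presumably average far less.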


But don't lasers need to scan in order to draw a full image (vs just a dot)? Doesn't that necessitate making the actual laser beam brighter than other light entering your eye? Are there any negative consequences to having a bright light scan over your retina quickly vs a dimmer light shining on your entire retina at once? What happens if the scanning mechanism fails and the laser ends up shining on one tiny point in the center of your eye for an extended period of time?

Lots of interesting safety questions unique to this particular display technology.


Remember that A) The laser is close to your eye and shining directly in and B) It's coherent, so it's focused on a single small spot. It doesn't have to be so much brighter than all the light around you given those things.


But it isn't focused on a single small spot; it has to scan across multiple small spots in order to form a recognisable 2d image. If it were just focused on a single small spot the entire time, all you'd see is a dot.

That means one of two things must be true. Either a) the beam is way brighter than other light entering your eye, such that it delivers in a single focused beam the same amount of energy that would otherwise be distributed evenly across the surface of your retina if you were viewing natural light. (Thus the questions I posed in my previous comment.) Or b) the beam is the same brightness as other light entering your eye, meaning the amount of energy in any individual pixel is far _less_ than what your eye receives from natural light. (I imagine this would make the image appear very dim.)


Depends how the circuit fails. I suspect the diode will fail to lase if it receives too much current; physically capping the laser power sounds like a good design decision for something going right into your eyeballs (unlike CD lasers).


What if someone hacks it to intentionally blind you?


There's going to be a hard limit to the amount of power you can direct through this laser. The actual hardware is going to be responsible for safety in this dimension, not software.


Just look away. It's just a display device, so at most it's going to display confusing pictures.

It's not possible to "hack" more output power into lasers with software changes. Would that it were. You can change the duration of the beam, but you can't pulse the beam without a Q switch in a way that changes the instantaneous power.


In the normal operations, the laser will scan across the retina without long dwell times at a single spot. Software is likely able to cause the laser to track a single point on the retina (I expect that the device needs some sort of an eye tracker and thus a camera aimed at the eye). I don't know if that can produce a harmful power density.


Given that it doesn't harm the retina when scanning, I'd say "full white" is the best it could do, which could be surprising and a bit uncomfortable, but not actually damage the eye.

Picture your monitor going all white... bright for a sec, but that's about it.


Given that it's scanning, the duty cycle as seen by any part of the eye is very small. If the perceived brightness were a function of power density averaged over time, then the beam would very obviously have to be much brighter than "full white" to create the full-white experience[^].

[^] In reality the perceived brightness is somewhat higher than the mean (i.e. a light source that's twice as bright with half the duty cycle appears brighter). I'm not sure how large an effect that is, or whether it has anything to do with pupil size adjustment.
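
Rough numbers for that duty-cycle argument (the resolution and refresh rate are guesses, not Bosch figures):

    lines = 300             # guess: 150 line pairs
    pixels_per_line = 400   # guess
    fps = 60                # guess

    pixels = lines * pixels_per_line
    dwell_s = 1 / (fps * pixels)   # beam time on one spot, per frame
    print(f"duty cycle per spot: 1/{pixels}")
    print(f"dwell time per spot: {dwell_s * 1e9:.0f} ns")
    # For the same time-averaged power density as a steady source, the
    # instantaneous beam would have to be ~'pixels' times brighter.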


> Picture your monitor going all white... bright for a sec, but that's about it

I remember when a CRT monitor would mess up and stop scanning right: the white point in the middle was much brighter than the normal brightness. I was always worried it would burn in, so I would shut off the monitor right away.


Then you get some marketing logo etched into your retina :)

I'm wondering how the laser is scanned over the retina (probably some MEMS chip). What happens if the beam scanning suddenly stops in place? Wouldn't you get the whole laser power concentrated in a small dot? And it would happen so quickly that you wouldn't have time to react.

There may also be some phantom image effects, like we used to get on those old TV monitors, or like we can get when fixating on a picture.

What's probably more insidious would be projecting some very lightly superimposed structure. You could probably induce some unconscious cognitive load or nausea. By decreasing the discomfort when the user is looking at an ad, for example, you would increase the effectiveness of the ad.


I guess maxing out its power would be comparable to looking at a full-white LCD screen. If the device has an ambient light sensor to adjust its brightness, then circumventing it and setting to full power could be comparable to switching on a bright light in previously completely dark room.


A laser is just a means to make light. Stare at the sun and you will go blind, but the majority of people spend their entire lives looking at light for all of their waking hours and die at an old age with their eyesight intact. But to answer your question: yes, absolutely. The power will have to be low enough to prevent damage.


This could become big in logistics. For example for order picking.

But also for maintenance crews. Want to know which machine broke down? The glasses will give you directions and will even give you an overview of the maintenance history.

I believe this is not a consumer product. Bosch has some consumer products but they are way bigger in the business market.

And about the laser: it's just light. A laser doesn't mean 'cut through everything'. It all depends on the power. I'm sure Bosch doesn't want to melt your retina.


> And about the laser: it's just light. A laser doesn't mean 'cut through everything'

Yeah, in this context laser means a light source with extremely tight cone, meaning it can render a very tiny point on the retina. Not a ray of death that will penetrate your brain and come out the other side of the skull ;)


Yeah, consumer "AR" as popularly imagined (video overlay) isn't a thing. But even at AR conferences a few years ago there were favorable case studies of AR use by shipyard workers, factory maintenance staff etc.


Maybe this will be the exception, but in general these seem so consumer focused as to be useless. I just want something that can accept some standardized or ad hoc well-documented protocol to do basic raster images or text or something as a baseline. I want something that application developers (and people like myself) can start to hack on and explore where it can go.


I know, I want to put these in ski goggles and wire up Google's maps for ski resorts, the ski resort's lift line times data, and apres ski events info, and the current and upcoming weather conditions.


Yes. So much this. I really just want a super low level API to paint monochrome vectors onto my retina. That's it. I don't want any cruft. I really really hope someone takes the *nix approach to a glasses-hud. I want the hardware to track where my retina is, paint it with a stream of vectors over bluetooth, and have strong hardware guarantees for my ocular safety. Maybe toss one of those nice Bosch IMUs in it as well.

Please. I'll take 3.
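
Something like this hypothetical wire format would be all it takes; every name here is invented, since no such device or protocol exists:

    import struct

    # Hypothetical frame: monochrome strokes in 16-bit retina-space coords,
    # redrawn until the next frame arrives.
    def pack_frame(strokes):
        header = struct.pack("<4sH", b"VHUD", len(strokes))
        body = b"".join(struct.pack("<4H", *s) for s in strokes)
        return header + body

    # A made-up pace readout as two strokes (x0, y0, x1, y1):
    frame = pack_frame([(100, 100, 200, 100), (200, 100, 200, 150)])
    # ...then ship `frame` over whatever BLE characteristic the glasses expose.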


I think AR is more suited to commercial users than consumers anyway. Google Glass and Microsoft's AR solution seem to be playing out that way.

Anecdotally, the only context I would want information beamed right into my line of sight is in work scenarios. All other scenarios I want technology to be in the background as much as possible.


I don't see the AR applications even being the most interesting part of this. A private facial recognition database coupled with "this person's name and a note to self" would be immensely helpful in a lot of situations -- I have bad facial recognition (not full on face-blindness, but inconvenient) and it would be neat to be able to hack on this a little bit. This would also require a camera of some sort, but I'd rather have that not be integrated.

Possibly even "real-world closed captions".

A tap for clock/calendar function would be handy.

Morse code (or other silent, maybe subvocal?) "telepathy" would be interesting as well.

I don't know how convenient or awful these would be in reality, but if the cost were not exorbitant and you weren't locked into a proprietary app ecosphere then it definitely seems like it would be worth a shot.


Naturally, I worry about all of this facial recognition and data gathering; however, as a K-12 teacher, the utility of displaying names/data about my students is immediately apparent. A great deal of time/effort is expended on assessing my students and modifying my interactions with them based on that information.


My main fear here is using any sort of centralized/cloud system for any portion of this. Facial recognition against a small corpus can be done fairly easily off the shelf.

Everyone wants to sell me something at a huge discount because they know that the enhancements to their own database will pay for the difference. I just want the basic version of this for my own personal use, preferably with no online use at all short of maybe encrypted backups (maybe).
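
For scale, the off-the-shelf version against a small personal corpus is about a dozen lines with the open-source face_recognition package (dlib under the hood), all local; the file names here are placeholders:

    import face_recognition

    # Encode a small personal corpus once.
    known = {
        "Alice": face_recognition.face_encodings(
            face_recognition.load_image_file("alice.jpg"))[0],
        "Bob": face_recognition.face_encodings(
            face_recognition.load_image_file("bob.jpg"))[0],
    }

    def who_is(image_path):
        # Return the first known name that matches a face in the image.
        for enc in face_recognition.face_encodings(
                face_recognition.load_image_file(image_path)):
            hits = face_recognition.compare_faces(list(known.values()), enc)
            for name, hit in zip(known, hits):
                if hit:
                    return name
        return None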


I imagine special care instructions for children with health conditions or allergies would be useful as well.


I think the current uses are mostly commercial because that's where you can get enough money to make a bespoke application, and people aren't as sensitive about how things look. For a consumer AR application you need a lot more value to overcome how bad these things currently look, and building that data out for the whole world is expensive.


I'm very curious what the effect would be over the course of decades of having even very weak lasers hitting your retina directly.


Photons do not get a special "laser" tag added to them by physics. If it's all in the visual wavelengths and at intensities below what we experience every day (sunlight is really bright, our eyes hide from us the number of orders of magnitude difference between even normal night-time artificial light and sunlight), there's no issue.

(However... since I often see this sorta thing misinterpreted in the wild on the internet, note that that is an if-then statement: if the antecedent is false, I make no claim.)

The primary safety concern I have would be met by designing the lasers such that if they are overdriven for any reason, they will physically burn out before outputting enough light to be dangerous. Per the classic Therac-25 [1] case study though, that is one safety feature I absolutely want in hardware. There is no amount of software I would accept to implement that.

I would also additionally stick some fuses into the system, tuned below the threshold where the power will burn out the laser, along with of course building the whole battery system to not be able to deliver enough power to power the lasers to a dangerous level. However, I really want excess power to physically burn out the lasers. (I wouldn't want to find out the hard way that an EMP of some sort can overdrive the lasers.)

For all that I'm laying out safety systems here, I am quite confident that it could be done safely. We trust our lives to much more dangerous systems all the time. I will say that I can't explain to you how you'd audit that safety, though.

[1]: https://www.bowdoin.edu/~allen/courses/cs260/readings/therac...


> Photons do not get a special "laser" tag added to them by physics.

True, but we usually get a pretty wide spread of light energies. These are likely going to be very specific frequencies, hitting similar areas over and over. I wonder if the retina can get fatigued by specific frequencies.


A reasonable question, but I suspect we'd already have some idea if this was the case. There's a lot of artificial light with relatively few, narrow frequencies already in use. I imagine someone, somewhere in some bizarre application would have discovered this as a problem.

Since we only see in three color dimensions, it's hard for us to notice day-to-day, but for instance, some fluorescent bulbs are just 5 spikes in particular frequencies. It looks fairly "white" to us, but it's far from normal light.

(I am interpreting your comment as being fatigued of/damaged by very specific frequencies in a way that it would not be fatigued/damaged for the same amount of energy spread out over a wider range still within the given cone's sensitivity range.)


> (I am interpreting your comment as being fatigued of/damaged by very specific frequencies in a way that it would not be fatigued/damaged for the same amount of energy spread out over a wider range still within the given cone's sensitivity range.)

Yep, that's my concern. I don't know if anyone's done long-term studies about low-level light of identical wavelength.

I mean, on the one hand, people used to be afraid of fast-moving vehicles, convinced that it was impossible for a human body to survive going faster than 40 miles per hour.

But on the other hand, people used to strap radium to their faces because they thought it was a cure-all, too.


Indeed! The neurons can get saturated / reduce their signalling as they adapt, but the recovery time is usually quite fast.

You can use the effect to generate colors that are actually impossible in the real world (and are only perceived):

https://en.wikipedia.org/wiki/Impossible_color


> along with of course building the whole battery system to not be able to deliver enough power to power the lasers to a dangerous level

That part is not likely. If you concentrate the light over just a few retina cells, dangerous levels are very low.


>Photons do not get a special "laser" tag added to them by physics.

I thought they kinda did, by being very similar in wavelength and power, compared to normal light, which has all sorts of wavelengths and powers. This difference can mean certain rods/cones get stressed more than average without triggering the sort of fatigue that normal light would cause.

Perhaps the best example of a similar concept, though not with lasers, is looking at a total solar eclipse right before or after the sun is fully eclipsed. There is a small period of time where extremely bright light makes it into our eyes, but not the frequencies that normally cause the pain associated with looking at the sun. This means our default defenses against looking at the sun don't kick in, and doing permanent eye damage is extremely easy without feeling any pain as the damage is done.


I have a lot of experience with lower-power laser diodes. They are extremely easy to burn out by even the slightest extra current.


If the lasers are low-powered enough it should be just the same as looking at a screen, shouldn't it? Either way the same amount of photons are hitting your retina, with a laser they are just aimed more carefully.


I'm mostly in agreement with you, but the part that gives me concern is just how precisely the lasers will be aimed at the retina. I doubt anyone ever really looks at a screen in the exact same place (even when staring at a specific character) due to eye jitter. After a year, or twenty, of use are we going to start seeing burn-in on our retinas, similar to plasma screens used as kiosks?


If research shows that this is possible, one need only code some jitter into your iEye app, like the bouncing LG logo on my TV.
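
That's essentially the pixel-orbiting trick OLED TVs use; a minimal sketch (radius and period are invented):

    import math

    RADIUS_PX = 2      # how far the whole image is allowed to wander
    PERIOD_S = 60.0    # one slow loop per minute

    def orbit_offset(t):
        # Drift the frame in a small circle so no retinal spot
        # sees the same static pixel for long.
        phase = 2 * math.pi * (t % PERIOD_S) / PERIOD_S
        return (round(RADIUS_PX * math.cos(phase)),
                round(RADIUS_PX * math.sin(phase)))

    # Per frame: draw the UI at (x + dx, y + dy).
    dx, dy = orbit_offset(12.5)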


With lasers, the photons are collimated, so they're all going almost exactly the same direction, instead of just being scattered like with normal light sources. Because of this, it doesn't take that much laser power in your eye for your lens to focus it down to a small patch on your retina and burn holes in it. So you'll need very low-powered lasers for this to be safe. But it certainly should be possible.


Lasers are effectively point light sources. We usually think of them as perfectly collimated light, but that's just a special case, where the point is at infinity.

The consequence of being a point light source is that lenses can refocus the beam, parallel or not, back into a point. Doing so concentrates a lot of energy on a tiny surface. And if that surface is your retina, that's when it becomes dangerous, literally burning a tiny hole into it.

It can never happen with, say, a regular light bulb, or even the sun. If the light source is spread out, the image on your retina, or anywhere else, will be spread out, limiting the energy density. This is an indirect consequence of the second law of thermodynamics called the conservation of etendue.
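
To put numbers on the energy-density point (the spot size is a made-up ballpark): even a "harmless sounding" 1 mW beam focused to a tiny spot beats ground-level sunlight irradiance by orders of magnitude, which is why a retinal display has to keep beam power far below that.

    import math

    power_w = 1e-3            # a 1 mW beam
    spot_diameter_m = 25e-6   # assumed focused spot on the retina
    sunlight_w_m2 = 1000      # rough solar irradiance at ground level

    area = math.pi * (spot_diameter_m / 2) ** 2
    irradiance = power_w / area
    print(f"{irradiance:.1e} W/m^2 (~{irradiance / sunlight_w_m2:.0f}x sunlight)")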


You can go blind looking at the sun or any high powered light source for long enough. If the laser is at a low enough power, then this is no different than looking at any reflected light. The damage is due to the energy density so you just lower the energy.


Like CRTs, plasmas burn in, but you cannot buy a new set of eyes.


Till you can, of course (well, maybe not us, but I see it as quite realistic for our children, with some perks like taking pictures, better vision in the dark, maybe infrared, zoom).


If the projected technology trends of the past have taught us anything, we can't make assumptions on anything we'll be getting. For all we know it'll be a hundred years and we'll never crack that tech, but boy will we get better cellphones.


https://spectrum.ieee.org/biomedical/imaging/in-the-eye-of-t...

IEEE's coverage way back in 2003; even by then, a prototype existed as far back as '91.

I've been waiting for people to circle back around to this type of display for about two decades now. I suspect the biggest barrier/reason is aversion to liability and the safety of beaming lasers directly into customers' eyes.


Lasers have been used routinely to measure the eye for the last 20 years. And everyone making that technology probably toyed with the idea of making a display rather than a sensor because you deal with the same issues. My guess is it has been prophetic patents holding progress back, so hopefully they are starting to expire now.


I disagree; the biggest barrier to commercialization of this technology is that it's technically infeasible. For the projection to make it through your pupil onto the retina, your eye needs to be carefully aligned with the projection beam. This just barely works indoors when your pupil is hugely dilated; now imagine being outside when your pupil is contracted. It's a non-starter.


Me too! Elsewhere in the thread I brought up the HIT Lab that patented the approach in the 90s. I've been waiting for it to come out of patent / become adopted for decades:

http://www.hitl.washington.edu/projects/wlva/


Right. Still waiting for long term studies about the effects of such retinal displays.


OMG. I've been waiting for this tech ever since reading about this in high school during the early 90's, coming out of the HIT Lab research (Human Interfaces Technology Laboratory at University of Washington). http://www.hitl.washington.edu/projects/wlva/

I was so enamored, I actually talked my mom into swinging by when we did a Pacific Northwest drive during summer break. Unfortunately, the lab was closed to tours at the time. And the lab's name also tickled me. It was my aspiration to go work there on VR technologies... and then the media chewed up the tech and spit it out, causing the long winter. (I also wanted to get into neural networks and AI...)


Ha! I went to college my freshman year at University of Puget Sound, and my philosophy professor who taught our class "Posthuman Future" took us on a tour of that very lab at UW. It's been many years, but I still think back on that experience as a revelation.


That's so cool! And sounds like an amazing class. More and more, I'm kinda wishing I had a chance to get to college. Maybe some day I'll return in my retirement. :)


I don't know if they are working with Bosch here, but Microvision in Redmond was working on a system like this with NASA about 10 years ago.


Possibly longer than that. I've been disappointed by lack of consumer product since 1993.

https://comotion.uw.edu/startups/microvision/

I think the first end users were supposed to be fighter pilots.


I wonder how much of that was lack of direction and leadership. They had a number of different channels they were exploring which is fine, but they didn't seem to be really focused. I know when we tried to work with them they were losing $8 million a year, but the CEO at the time was getting paid $6 million, so...


This is a great article and it covers a lot of stuff that I'd wondered about before.

Here's someone doing a (somewhat terrifying) DIY version: https://eclecti.cc/hardware/blinded-by-the-light-diy-retinal...

Virtual Retinal Displays are a pretty old idea, first being demonstrated in 1991. They've been stuck in development for years, so it's good to see them getting some more attention.

http://www.hitl.washington.edu/research/vrd/


That was me. It’s great to see that the light engine and electronics have been miniaturized to be almost glasses sized. It’s annoying that they are still stuck with the tiny exit pupil I saw in 2012 and HITLab had in the 90’s. I hoped that mass producible pupil replicators would have been ubiquitous by now.


When I say "somewhat terrifying" I do genuinely mean "also awesome"!



I think this will be the next frontier for wearable tech. Smartwatches are part of it; maybe even a smartwatch with some buttons you can feel, so you can control what you see in your AR glasses. At least until we get smart contact lenses.

The phone remains the main device, but these peripherals will enhance your use of it. Imagine being able to look up words you might not know, in real time, while having a conversation. Or seeing where the bathroom is in any building.

Edit: If you disagree at least tell me why.


I'd like to know why you're being downvoted too; I've thought about this very thing for a while now. Everyone right now runs around with their head tilted down so they can look at their phone (which is supposedly causing us to grow bone projections on the rear of our skulls). It would be much better to have eye implants so we can see information in real time from our mobile devices, even as we walk around. Like you said, it would be really handy to see a map while you walk in a building, showing you where the bathroom is, or to help you navigate as you walk in a dense city, instead of having to pull out your phone and look at it.


The bone projection study has been pretty much debunked:

https://arstechnica.com/science/2019/06/debunked-the-absurd-...

That said, the current mobile situation is not great on posture.


Yeah, and to be fair, your phone won't be ditched; it will still be useful. Nobody will force you to buy any of this tech, but it would be useful. I've heard of contact lenses eventually doing the same, and I fantasized about that a decade ago, long before I became a software engineer.


This sounds fascinating. I can't even load the page properly in Safari due to the stupid scroll hijacking around the ad.


Very impressive how far they went with MEMS scanners. That thing is a continuation of this: https://ae-bst.resource.bosch.com/media/_tech/media/product_...


On a somewhat related note, Google Glass enterprise edition 2 is now "generally" available https://developers.googleblog.com/2020/02/glass-enterprise-e...


If I were Google I would buy this company and replace Google Glass with this.


Google revenue 2018: $136B

Bosch revenue 2018: $78B


The problem is they're privately owned by a foundation. https://en.wikipedia.org/wiki/Robert_Bosch_Stiftung


Here's a concise 3min overview of all current AR tech and the continuum of product groups in the AR-VR (mixed reality) space.

https://youtu.be/U1BVI2JcNPc

This product would sit squarely in the "Smart Glasses" group.


Just in time for the Snow Crash TV show.


Is there a TV show?! I loved that book! Please don't be like Altered Carbon.


I had the exact same reaction as I read the comment :D


Michael Bacall is adapting it for HBO. He worked with Edgar Wright to adapt Scott Pilgrim, so I am cautiously optimistic.


I was under the impression that Altered Carbon was very popular. Do you not think it's very good, or is it that it doesn't do the book justice?


Snow Crash was VR. This is _exactly_ technology that William Gibson described in Virtual Light.

https://en.wikipedia.org/wiki/Virtual_Light


But this display does use photons, it shoots lasers into your eyes.

The glasses in Virtual Light "got these little EMP drivers around the lenses, work your optic nerves direct".


Hiro uses something like this (laser projecting into his retina) for his computer in Snow Crash.

EDIT: Actually he wears goggles that the laser shines onto, so a little different


Seems like this technology, when developed further, could be the sort of thing that https://www.skully.com/ was trying with their original prototype, which was really exciting to me at the time.


What does 150 Line pairs [1] for resolution mean?

[1] https://www.bosch-sensortec.com/products/optical-microsystem...


I assume they mean line pairs per mm, not 300 lines total.

Edit: Nope, the vertical resolution really is 150 line pairs (i.e. 300 lines).


So, with eye tracking, we could put these little laser things on a watch to give it a 20-inch screen?


Reminds me of the anime "Dennō Coil" [1],[2]. The protagonists use glasses to join an AR world. I never thought I would actually see something like this in real life within my lifetime.

If these glasses actually work we could be entering an AR revolution in the near future. Even if that future is still 20 years away that is still pretty close.

[1] https://www.youtube.com/watch?v=ODTvrQFtEOM

[2] https://en.wikipedia.org/wiki/Denn%C5%8D_Coil


This isn't new. QD Laser https://www.qdlaser.com/en/ has been making similar technology for a while. I tried their glasses at the CSUN assistive technology conference in 2014 or so. Their target users were mostly low-vision people, and the image was very clear. They also claimed that the technology is pretty safe for long use. One problem I found is that I have long eyelashes and they block part of the image, casting a shadow. But other than that, it looks like solid stuff.


I was working with similar glasses at Microvision 10 years ago! https://web.archive.org/web/20100114084506/http://microvisio...

I don't know why they didn't take off. I was an intern there for a few months. Every few weeks another engineer would quit, so maybe that had something to do with it. It was good to experience working for a failing company; now I know some warning signs!


This sounds really cool and promising but this made me laugh:

> The concept video doesn’t really do it justice—it looks great.

(two paragraphs later)

> The concept video is a quite accurate representation of how the glasses look when you’re using them.


Hrmmmm would have been good to know about the cost covering bit in the beginning of the article - but I digress.

I am extremely excited by the prospect of these or any glasses like system that can work well in a package that doesn't look like a bolted a computer to my head.

That laser warning sticker feels weird seeing as how that is literally the whole point of the glasses, so I'll probably wait for a 2nd iteration just to be sure. But for me, these would win over a decent watch any day.


IIRC the laser warning sticker is just required by law for any laser device of any appreciable power so there's no "well it's the whole point of the product" exemption to having the warning.


Very cool. I was super interested in Intel Vaunt and this is just an iterative improvement. I would buy a pair of these if they had a reasonably open API/firmware.


This is a really odd article.

Using this device sounds awful!

> What I do want to talk about is how this entire system fundamentally screws with your brain in a way that I can barely understand, illustrated by seemingly straightforward questions of “how do I adjust the focus of the image” and “what if I want the image to seem closer or farther away from me?”

> [...]

> because the Smartglasses are using lasers to paint an AR image directly onto your retina, that image is always in focus. There are tiny muscles in our eyes that we use to focus on things, and no matter what those muscles are doing (whether they’re focused on something near or far), the AR image doesn’t get sharper or blurrier. It doesn’t change at all.

> Furthermore, since only one eye is seeing the image, there’s no way for your eyes to converge on that image to estimate how far in front of you it is. Being able to see something that appears to be out there in the world but that has zero depth cues isn’t a situation that our brains are good at dealing with, which causes some weird effects.

Oh wait, maybe it's not awful?

> for the first 10 or so minutes of wearing the glasses, your brain will be spending a lot of time trying to figure out just what the heck is going on. But after that, it just works, and you stop thinking about it (or that’s how it went for me, anyway.) This is just an experience that you and your brain need to have together, and it’ll all make sense.


The article sounds like the device is unnatural and disorienting at first, but then you get used to it and it isn't anymore. It sounds like a promising technology.


I think it's probably something you could adjust to, once you get over the sickness and manage to avoid walking out in front of traffic ...


This demo is so weak I think we need a new noun for this category of failed wearable tech / lifestyle technology demos. Can anyone think of a good candidate?

wightbosch, n. failed futuristic product demonstration. Etymology from wight (poetic) ghost/deity + bosh (British) nonsense + Bosch (German technology firm)


It would be pretty cool, but then again Magic Leap looked pretty cool too. I'll get hyped after they start shipping.


Not for me. I'd rather have a camera for input and an audio bot in my ear, with no visible change to my face.


Reminds me of the “Virtual Retinal Display“ that Microvision was trying to create back in the late 90s.


Is this a vector display? The way it's worded seems like it, but I can't tell.


It works like a CRT monitor where the beam scans line by line, see https://en.wikipedia.org/wiki/Virtual_retinal_display
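
A quick sketch of what that scanning implies for the mirror, assuming the ~300-line vertical resolution and 60Hz frame rate mentioned elsewhere in this thread (both inputs are assumptions, not confirmed specs):

    # Rough raster-scan timing, CRT-style (all inputs are assumptions)
    frame_rate_hz = 60                   # frames per second
    lines_per_frame = 300                # ~150 line pairs x 2
    line_rate_hz = frame_rate_hz * lines_per_frame
    print(line_rate_hz)                  # -> 18000: ~18k line sweeps per second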


Even in the demo video, some of the stuff shown, like recipes or a shopping list, is just as easy with a phone.


Despite this being essentially a sponsored post, I’m glad someone has finally got this kind of thing working.

This is the path that should ultimately get us infinite depth of field and true image injection. Super exciting!


In one of the photos, the dude is wearing the glasses and smiling at the camera, and it looks like there is a green reflection. Is that just a reflection on the glasses from ambient light, or can we see the text he is seeing?


Probably just a reflection: with this system, there's no way you could see what is displayed if you aren't wearing the glasses.


Laser writing to the retina sounds like an awful idea. At least change the marketing.


Laser into the eye sounds like a great idea. What could possibly go wrong?


I don't see how they can hope to overcome the alignment issue. This idea has been prototyped before; IIRC Carmack built a simple one during very early development of the Oculus.


Out of interest, what's the holographic film for? Is it for focusing somehow?

(I'm kind of curious as to why the lasers can't directly hit the retina from the mirror array.)


From my understanding it's just for reflecting the laser light. The lasers initially shine parallel to the lens surface and thus need to be redirected towards the pupil; this is done with the film.


Oh yeah! That's exactly what I need after looking into a lightbulb (my computer monitor) for many hours a day! Laser, right into the eye! ;-)


With how well they design user interfaces on their appliances, it’s little surprise they put battery life right in the middle of the field of view.


I'm not sure how much I'd trust a laser directly on my retina--at least not until it's been very well tested (on other people).


Seems like very useful technology for fighter pilots or even race car drivers (maybe they already have something functionally similar).


Fighter pilots already have helmet mounted displays to project information to the pilot's eyes. These systems can also cue weapons systems to the direction the pilot is looking.

https://en.wikipedia.org/wiki/Helmet-mounted_display


I remember using MicroVision laser headmounts back in 1998. It's kind of shocking how slow commercialization of this has been.


I really like that this system relies on one eye only. I have a very slight strabismus, and some VR/3D stuff is out of reach.


How many pixels do they have, and what's the refresh rate? Also the one he's wearing still looks kinda dorky.


It doesn't work with pixels; it looks like it uses a galvanometer to guide a laser. It also says here [0] that it has a 60Hz frame rate.

[0] https://www.bosch-sensortec.com/products/optical-microsystem...


The relevant figures for a galvo would be kpps and angular precision.
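
For a sense of scale, here's a back-of-the-envelope on what a vector (galvo) display would need, if that's what this were; the points-per-frame figure is purely illustrative:

    # Vector-display point budget (illustrative, not Bosch's published design)
    frame_rate_hz = 60           # redraw rate needed to avoid visible flicker
    points_per_frame = 500       # assumed complexity of a simple HUD frame
    kpps_needed = frame_rate_hz * points_per_frame / 1000
    print(kpps_needed)           # -> 30.0 kpps, within ordinary galvo territory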


There's the more generic line pairs per millimeter, which is basically how many pairs of light and dark lines it can resolve per mm.


Great to see development is continuing after Intel shut down the division that was developing this technology.


Next: how to prevent screen burn in.


Could an interesting application of this be assisting the partially sighted?


The next evolution will be implanted direct optical nerve stimulation. Should I hold out?


Daemon operatives, take notice!


Finally!

We've been talking about these in Sci-Fi for years.

Surely the Jet Pack can't be far off now ...


Star Trek TNG S05E06: The Game: https://en.wikipedia.org/wiki/The_Game_(Star_Trek:_The_Next_...

Though the issue in the episode was the addictiveness of the game (which is already an issue) rather than the fact it painted directly on your eye.


Yes, this.



The problem with jet packs is that they're limited to people crazy enough to strap one on and light it up. Maybe once AI control is trusted...


Surely it's a fairly straightforward control system à la quadcopters. Just got to solve fuel and health-and-safety regulations and we're set. Boris and Donald are working on the latter; hopefully Elon can get the ball rolling with a 3D mockup of the former.


They had working jet packs in the 1960s. One was used in a James Bond film.


I didn’t say they haven’t been made, just that they aren’t a commonly available consumer item yet.


Can't wait to get burned-in "pixels" on my retina.


I am confused that it "paints images directly onto your retina" (presumably, shooting lasers at your eyeball, which sounds unhealthy) and yet the writer was able to take a picture of the image on the glass.


The image is being painted onto the camera sensor instead of the retina.

Regarding "shooting lasers at your eyeball, which sounds unhealthy", this technology has existed since at least the early '00s. Microvision was showing off their Nomad laser HMD (https://www.google.com/search?q=microvision+nomad) around 2002 IIRC. It's reasonably well established as safe.


It shoots a laser (well, three) at a film, which reflects it into your eye.

It's a lot less confusing if you read the article lol.


The photo you are referring to is the laser painting directly onto the sensor array of the camera.


I was definitely expecting some sort of joke article, but the tech does seem interesting.


My $50 monitor blits pixels directly onto my retina. Be impressed.


Oh boy, I love that headline: Bosch Gets Smartglasses Right With Tiny Eyeball Lasers. You know you're in the future when you start with the tiny eyeball lasers.


I really hope they're just connected to a smartphone as a display.

Who thought it was a good idea to put a whole computer in glasses or smartwatches?


Having the logic to interpret and respond to input on the device reduces the latency between interaction and effect. Also, having at least some power on the device means it's useful by itself. It also reduces battery drain on both devices, since only small BT packets are sent instead of essentially streaming video constantly during use.
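
A rough sense of the radio-traffic difference, with made-up but plausible numbers (the resolution, notification size, and frequency below are all assumptions for illustration):

    # Why on-device rendering saves radio power (illustrative numbers only)
    # Streaming raw monochrome frames: ~300 lines x ~400 px x 1 bit x 60 fps
    stream_bps = 300 * 400 * 1 * 60      # ~7.2 Mbit/s, far beyond BLE budgets
    # Sending a rendered notification instead: ~200 bytes, ~3 times a minute
    notify_bps = 200 * 8 * (3 / 60)      # ~80 bit/s on average
    print(stream_bps / notify_bps)       # -> 90000x less radio traffic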


Hope those lenses are not reflective


These lasers... are they dangerous? Can they leave lasting scars on the retina through prolonged use?

I would like some info on similar things.


Isn't this the same technology as Focals by North?


It says that in the article, yes.


TBH, I've wanted this tech for DECADES, ever since I saw Scientific American TV cover the "Cyborgs" at MIT in the early '90s, and then Google Glass (remember when everyone was selected based on their open idea submissions and thought they were getting FREE Google Glass, only to get the $1500 invoice?)

Let's start the list of all the things that can go wrong with this!

On the plus side, I wonder if they could measure the chromatic aberration and distortion of the reflected laser light and compute a corrective lens prescription.


This is similar to what Focals by North have been doing for a few years.

https://www.bynorth.com/

Turns out the article mentions them.


Nice to see there can be fashion-conscious smart eyewear.

Apple has been rumored to have such a product in the works. If theirs were this cool, I could see them really taking off.


I tried a pair of these in Toronto. It was pretty cool, but still very early.


It says that in the article, but also points out differences.


I stopped reading at "paints an image directly onto your retina". This device is basically a miniature DLP laser projector that projects light onto a fancy light gradient stuck to the glass. You are looking at a reflection, nothing is painting your eyeball.


"You are looking at a reflection, nothing is painting your eyeball."

Each photon is reflected to a specific, targeted location on the retina. This is in contrast to normal vision, where photons are reflected from, and arrive at, many different angles.

Which is why this is always perfectly in focus -- the normal path of light means that light is captured across the surface of our eye and needs to be focused to converge on a point (within the limits of the eye's ability to do so, and with the reality that it only has one focus distance at a time). This system, in contrast, is "painting" the retina so that it is addressing specific target pixels, so to speak, regardless of the focus of the eye.

Their terminology seems okay.
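
To put rough numbers on the always-in-focus claim: in small-angle geometric optics, the angular blur from defocus scales with the aperture the light enters through. A minimal sketch (the beam and pupil diameters are my guesses, not Bosch figures):

    # Geometric blur vs. aperture: why a narrow scanned beam stays sharp
    # Angular blur (radians) ~= aperture diameter (m) x defocus (diopters)
    defocus_diopters = 1.0       # eye focused 1 m away from where it "should" be
    full_pupil_m = 0.004         # ~4 mm pupil: ordinary viewing
    narrow_beam_m = 0.0005       # ~0.5 mm scanned beam entering the pupil (assumed)
    print(full_pupil_m * defocus_diopters)    # 0.004 rad (~4 mrad): visibly blurry
    print(narrow_beam_m * defocus_diopters)   # 0.0005 rad (~0.5 mrad): near-sharp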


Nice explanation, thanks. Kind of like a very short throw laser projector where everything (the wall) is in focus, regardless of the distance or angle from the "fancy lens". I was just triggered by their use of "paint" when the process is passive.


I believe that's the difference here: this isn't a DLP projector. It really does paint onto your retina, Snow Crash-style.


But then he explains that the projected image is always in focus, never blurry, no matter where his eyes are focusing. I know next to nothing about the optics, so I could be wrong though.


I think it's the "directly" part that's incorrect, not the "paints an image onto your retina" part.


But then, everything we look at paints an image onto our retina, so using it as a hook for a headline feels disingenuous.


If the article is correct, it's an important difference, because the light is projected such that it's always in focus and not affected by the cornea; it's just always there, and you don't have to change your focus to read it.


Any regular display "paints an image onto your retina".


Then you should have kept reading; that's literally the problem they've overcome here (well, kind of).



