The demo video on the site is hilarious where it's showing news or texts right over the woman's face that he's supposed to be listening to. It's like when they show what's going on in Homer Simpson's brain when his wife is talking.
I thought the same initially, but that's demoing translation. She's speaking in another language and the glasses are showing the English translation. Maybe not great UX either way since it is covering her face, but that's another story.
You would still be holding eye contact (mostly) with the speaker. If the text were above or below, you would seem to be looking off into space, which may be worse from the speaker's perspective.
Will there be other design / style options? I don't want round glasses, they don't fit / suit my face shape.
They also remind me of Harry Potter; not a fan of the look.
The last AR glasses I saw on HN were also this same round shape. Why is that? This looks like a potentially cool product! But not with the current design.
And since they offer sunglasses as well: I like mine to fit my face so exactly that I have zero/almost-zero stray light. Their shape of shades looks the opposite of what I'd want.
ha~ I think, if that's the only complaint people have, they've done a pretty good job.
Technically, this is a HUD. It's not AR. So, I'm happy to hear complaints about that.
...buuuuut, if you compare it to the ridiculously stupid promises from the rabbit and the AI clip, is there any harm in a simple, cheap (relatively) device that delivers what it promises?
Sounds pretty good to me.
...and that it doesn't quite match your personal style? eh.
I'm sure there will be other versions with other designs if the product actually works; but, I don't think it will make any meaningful difference to how people review it.
It'll live or die based on function, not form.
(Perhaps, you could argue, some other companies should have made that a priority with their devices...)
I'm guessing you're not a glasses wearer? Glasses shape is intensely personal for many people who wear them a lot. It's effectively modifying the shape of your face, for better or for worse, and that ties into a whole host of potential feelings about attractiveness and personal confidence. No matter the life-changing usefulness and specifications of a device, if it takes the form of a pair of spectacles, it's got to be a shape that you can stand wearing or it's an absolute non-starter.
That’s an awful lot of shiny emotive language dancing around a remarkably small amount of detail. Looks like it’s glasses which your phone can use as a display plus camera? Pretty cool, but there’s scant information about the display or capabilities.
I’d worry about ‘smartwatch syndrome’ where it’s neat but ultimately doesn’t really get you anything more than just using your phone.
Eh, smartwatches are actually useful, though. While I don't use them much as a "phone replacement," I do use them for a lot of things they do really well, namely checking the time (lol), as well as health tracking. My watch constantly tracks my heart rate, any variations in it, and a bunch of other data, including sleep.
I don't need to manually adjust workouts, runs, or walks either. When I work out, it is decently accurate in tracking those too.
the other day it actually woke me up because my heart rate had dropped dangerously low. So, useful there.
Other than these, a lot of "doing stuff on watch" has been replaced by the RayBan Meta smartglasses for me tbf.
No camera is interesting. On one hand, it's aesthetically strange - you would expect a computer on your face, putting screens right in front of your eyes, would be able to see what you can see. On the other hand, it improves the social experience if people don't feel like you're putting them in front of a potentially recording camera all the time.
At first blush I like the low-res, stylized display. Depends on the actual experience of it, though. Seems like it might make long-form reading less pleasant.
I just got my Brilliant Labs glasses, and while cool, I highly doubt the outdoor examples. Unless it's a super foggy SF day, you can't see anything.
I have a question for you. The Brilliant Labs website says:
Out of the box you have access to AI models like Perplexity, OpenAI’s ChatGPT, Whisper etc. allowing you to receive answers to questions about what you’re currently looking at, experience live translation from either speech or text, and query the internet real time.
Do you feel this is all true?
I am especially interested in ‘experience live translation from either speech or text’. Do you feel that works?
When Google Glass was announced, we proposed a POC for our field and datacentre folk (telco industry). Lack of availability killed it off, as we were in Australia and never made the cut for the trial. The ability to feed information to them while they can use both hands, not have a laptop balanced somewhere, and not be on the phone was invaluable in theory. I'm hopeful this sort of thing can become what it has the potential to be, but I feel it will be undercooked and die off.
I'm a fan of low-power, long-lifetime, low-weight displays in AR glasses, but the lack of a camera precludes a big class of applications. Even a simple low-resolution camera with a small FOV would open a ton of possibilities. That said, putting cameras in smart glasses at all presents serious industrial design problems: camera modules are all just too damn big and cubic. We really need dedicated miniaturized cameras for devices like this one.
Following a long trend started by Magic Leap, the videos are entirely made up. I bet the display is very faint outdoors, just like the AR headsets that came before.
I have another pair of AR glasses that came out a few months ago, the RayNeo X2s. They are easily bright enough to show photographs or text on a very bright summer day, even without the optional darker lenses installed.
From having tried out Google Glass back in the day, I found the display on Google Glass uncomfortably bright. I'm guessing they did that to make it more visible outdoors, but it didn't feel nice.
That seems like a very solvable problem with an ambient light sensor. Or even just manually; my Xreal Airs have a little rocker that you can change the brightness with.
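To make the ambient-light-sensor idea concrete, here's a toy auto-brightness curve: map a lux reading to a display brightness percentage by linearly interpolating between a few hand-picked breakpoints. All the numbers are made-up illustrations, not anything from an actual product.

```python
# Hypothetical auto-brightness curve. The breakpoints (lux, percent)
# are invented for illustration.
def auto_brightness(lux, points=((0, 5), (100, 25), (1000, 60), (10000, 100))):
    if lux <= points[0][0]:
        return points[0][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if lux <= x1:
            # Linear interpolation within this segment.
            return y0 + (y1 - y0) * (lux - x0) / (x1 - x0)
    return points[-1][1]  # clamp above the top breakpoint

print(auto_brightness(0))      # 5   (dark room: dim display)
print(auto_brightness(550))    # 42.5
print(auto_brightness(50000))  # 100 (direct sunlight: full brightness)
```

The real tuning question is hysteresis (not flickering between levels as clouds pass), but the mapping itself is trivial.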
Have done this very thing with a pair of Xreal glasses and my Samsung phone in Dex mode. Just be sure to turn down the brightness so you can switch focus.
Could be cool. It's a nicer, smaller form factor than most of the competition today, but I'm wary of the low resolution, lack of camera, lack of speakers, and awkward bumps on the earpieces.
But a 1.5 day battery (if true) sounds perfect; a lot of previous generations didn't last an entire day, making them kind of worthless as daily drivers.
Definitely in the market for smart glasses at some point, but will wait to see the reviews. Starting at $600 is still pretty steep for glasses.
One of the things that frustrates me in everyday life is people being distracted by a number of things while trying to engage with them in conversation. I understand that as we don more and more augmenting technology, we're inevitably going to have to share our attention with this technology, at least until we have BCIs, but it just makes me feel very anxious about the near-future of human-human interactions.
With smart glasses, you could be looking directly at somebody and still be as distracted as you want!
If the other person wants to make sure that you've been paying attention, don't worry! AI can summarize what they said and whisper it in your ears, so you can confidently pretend that you've been listening the whole time.
Very thin on details (though there are some specs at the bottom), but how are you supposed to focus on something projected right in front of your eyeballs? I'm a bit unclear on how they solve this.
> how are you supposed to focus on something projected right in front of your eyeballs
Your eye can't tell where the rays were emitted. You don't even need fancy waveguides to make an image appear to be at infinity (although the G1 appears to use waveguides, and I don't know what they've chosen as their focal plane). https://en.wikipedia.org/wiki/Collimator devices have been in HUDs for ages.
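For intuition, the standard thin-lens relation makes the point (a generic optics argument, not a claim about the G1's actual design or focal plane): place the microdisplay at the focal plane of the collimating optic and the virtual image recedes to infinity, so the eye focuses as if the image were far away.

```latex
\frac{1}{u} + \frac{1}{v} = \frac{1}{f},
\qquad u = f \;\Longrightarrow\; \frac{1}{v} = 0 \;\Longrightarrow\; v \to \infty
```

Choosing a closer focal plane (a few meters out) just means placing the display slightly inside the focal length.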
I find the realtime translation feature quite compelling. Audio translation requires everything to be said twice, which doubles latency. If two people speaking two different languages both used this, it would be a massive step towards a universal translator.
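Back-of-the-envelope on the "doubles latency" point (all durations are made-up, and this ignores recognition/translation compute time):

```python
# Toy model of conversation latency. Numbers are illustrative assumptions.
def audio_relay_time(utterances):
    # Consecutive interpretation: each utterance is spoken, then its
    # translation is played back, so the conversation waits twice per turn.
    return sum(2 * t for t in utterances)

def subtitle_time(utterances, display_lag=0.5):
    # Displayed translation: text appears while the speaker talks,
    # adding only a small assumed per-turn lag.
    return sum(t + display_lag for t in utterances)

turns = [4.0, 6.0, 5.0]  # seconds per utterance (made up)
print(audio_relay_time(turns))  # 30.0
print(subtitle_time(turns))     # 16.5
```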
A lot of the other features are meh, reminiscent of what a 1980s high-tech watch might do for you. Feels like a PDA.
The part where I get excited is the AI applications; this is also where it gets unethical: microexpression/voice analysis, AI-suggested responses, realtime fact-checking of what a person says, and so on. Applications in anything requiring human interaction: sales, meetings, dating, friendships, etc. Imagine being the voluntary puppet of an AI that tells you what to say. If it becomes mainstream, there's the implicit bias people will form towards someone who wears the glasses and therefore may be AI-fed, and the inequality of the haves versus the have-nots with regard to having such a personal AI.
Ultimately their best bet if shooting for success would probably be to connect it to an AI though. That and social media feeds.
> I find the realtime translation feature quite compelling. Audio translation requires everything to be said twice, which doubles latency. If two people speaking two different languages both used this, it would be a massive step towards a universal translator.
I thought keeping a single AirPod in one ear would have this solved by now. Growing up listening to things translated by a 'lektor', I'm hopeful that most people can hear tone/emotion from the speaker in one ear and message/content from a translator in the other, and use their mind to fuse the two back into coherence.
I was excited to use this to translate signs / menus, until I saw it has no camera...
> If two people speaking two different languages both used this, it would be a massive step towards a universal translator.
This could change the world. Ultimately books and films do not ‘need’ translation. No more learning foreign languages. Travel to foreign countries and understand everything, talk with everybody.
1. Not stream everything around you to third party online servers
2. Not use unreliable algorithms that almost certainly (accidentally, on all parts) discriminate against certain classes of people by returning more false positives for them.
3. In situations like dating, present yourself, not what an AI chooses to present.
... I'm sure there are plenty more.
I'm sure there are ethical use cases for the technology too, but in a world like GP proposes there are going to be plenty of opportunities to use it unethically.
I don't recognize a general obligation to limit my subjective experience to what my squishy meat brain can supply independently. We're all going to have to adapt to a more ubiquitously connected environment. Halting everyone's progress in the name of someone's personal idea of "ethics" isn't going to move us forward as a species.
And yet again, it's a hardware device that's only usable with their specialized software, rather than being a general-purpose display that you can talk to from any device and use for your own creative purposes.
Still waiting for someone to build one of these that's just a display and not a locked-in proprietary ecosystem. By all means provide some applications at launch that can drive it, but let anyone build applications that talk directly to the glasses.
Many excellent hinges have been shipped that don't wear out in years of regular use. I have no idea if this design will live up to those examples, but on brief reflection the hinges I've admired in products I've owned don't have screws. In short, my priors agree that lack of a screw is a positive.
Will there be a way to make custom “apps” or “plugins”? Even just feeding it dynamic text to display on certain inputs or commands. Or are we stuck with the 6 features it comes with?
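No such API is documented, but if one existed, even a bare "push text" plugin would mostly be about fitting messages into a tiny panel. A sketch under assumed dimensions (18 columns by 4 rows, both invented; the function name is hypothetical too):

```python
import textwrap

# Purely hypothetical: format a notification for an assumed low-res
# text panel of 18 columns x 4 rows, truncating with an ellipsis.
def fit_for_display(msg, cols=18, rows=4):
    lines = textwrap.wrap(msg, width=cols)
    if len(lines) > rows:
        lines = lines[:rows]
        lines[-1] = lines[-1][:cols - 1] + "…"  # mark truncation
    return lines

print(fit_for_display(
    "Train to Richmond delayed 10 min, next departure 14:32 from platform 2"))
# ['Train to Richmond', 'delayed 10 min,', 'next departure', '14:32 from…']
```

The interesting part an open API would have to expose is the trigger side (which inputs or commands cause a push), not the formatting.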
> If you need the G1 with prescription lenses, they will be individually and uniquely etched using software that interprets your optical data in microscopic detail
lol great. special lenses that can only be made using proprietary software and proprietary machines with data that is probably acquired with proprietary hardware and then locked away in this company's databases.
translation: good luck ever getting lenses fitted or replaced.
Do these features appeal to any of you? Enough to wear them all day long? I don’t see the killer use case here, especially as it requires a companion phone.
Maybe you would never misuse this technology, but if it becomes widespread and normalized then plenty of people are going to be walking into public toilets (other previously private spaces) with this tech on. We're creating the conditions for total surveillance of the physical world.
Is this the future you want to live in? It's not too late to back down.
EDIT: As pointed out by those below, these glasses don't have cameras. Which I think is much more preferable to the alternative.
No camera on v1. But a camera mod is desired, and always-on face surveillance tech is coming in the very first knockoffs.
They'd give off a more noble air by setting the standard for any recording of surroundings: a flashing red light visible from all sides, plus a physical kill switch whose off position others can verify.