Hacker News
Meta (YC S13), The Crazy AR Glasses That Aim To Do What Google Glass Can’t (techcrunch.com)
264 points by L4mppu on Aug 9, 2013 | 86 comments



Did anyone see actual demos of this thing in action? Meron came to my computer vision class this spring at CU and didn't have anything to show other than the CGI demo that has been up for over a year. The newer videos are also mostly just renderings and the only working examples that were posted are around 10 seconds long.

I wish them the best of luck but it seems to me like they're really overselling themselves.


These guys had a booth at the Epson stage at SIGGRAPH this year. I tried it out with one of the engineers. The unit could stand to be a bit lighter and less dorky, but it definitely works. They are very accessible and probably game to demo it if you visit their offices.


Looks like they invited the Reddit co-founder to test them, actually:

https://www.youtube.com/watch?v=Cv7nSng0yD8


That video of the Reddit co-founder struggling to use the product & struggling to say something good about it made me cringe a little.

They should work on a better testimonial video.


You make it sound worse than it is. Maybe it's just me, but he seemed to have some positive things to say during the actual demo. Just seems slightly staged at the end there is all. As for struggling to use the app, we don't all expect an early prototype + third-party app to have perfect usability.

Of course, this testimonial pales in comparison to, say, the Oculus Rift kickstarter video. But that product was further along at that point, and solving an easier problem.

Edit: Actually, I take that back. I didn't realize this was featured on their home page, which definitely should have a better testimonial. At least it reflects the reality of the tech at this point, setting the right expectations hopefully (in contrast to their main video).


The real kicker is how deluded the founders are about their idea. The sculpting app, for instance, is an awful idea. Sculpting in air doesn't work: there's no tactile or force feedback, which is a fundamental limitation of the model, not something that will improve with the technology. Demoing with this app shows they don't understand the nature of the technology they're developing. I would give them a pass, but later on in the video the guy with the beard says "so you can imagine a surface on every wall, like Gmail, Facebook, or whatever". This is not only unimaginative but wrongheaded. A moment's thought reveals that this idea makes no sense.

This video is brogrammer culture at its worst.


There are technologies being developed to simulate a feeling of touch in the air: http://www.engadget.com/2013/07/21/siggraph-disney-research-...

"AIREAL pumps out tight vortices of air to simulate tactility in three dimensional space. The idea is to give touchless experiences like motion control a form of physical interaction, offering the end user a more natural response through, well, touch."

It looks like Aireal can only create a pulsing sensation, which will still be insufficient for sculpting, and I imagine it is a weak sensation in the first place, but I am still interested to see how these technologies improve over time.


Are you saying that what I am seeing is a concept and not a created product? This looks really cool; I am actually considering buying one, but maybe I should just wait for some reviews. Either way this is way better than disappearing photos, congrats!


Hey, I guess I was in your class. Nice to see people from CU on here : )

It is strange that they haven't done any real demos (that I've come across), but an awesome idea nonetheless. Hopefully they release something more substantive soon.


I've seen actual demos and the product's been improving at lightning speed. Not quite as polished as the CG but I think the special effects are there to help people envision what can be done, since it's really difficult to articulate. It's early-stage but definitely well beyond mere concept. There's some solid talent on the team and I'm excited to see where they go with this.


It seems like a good sign that Steve Mann is involved, even if it's unlikely they'll be willing to make the same compromises he historically has, trading comfort and style for functional tech.


Looks like a great idea and technological advancement, but the biggest thing holding back Google Glass is not that it lacks feature X.

Glass will be held back because most people don't want to look like geeks. Meta's next design looks like it's from an 80's Sci-Fi movie. I'm not trying to be an asshole, but there is no way I could take someone seriously wearing those things.


Ya know, I'm more or less with you and I am a Google fanboy.

The 2 things that Glass will have to overcome:

1) The not-so-fashionable look. I'm sure this'll be corrected in the future.

2) The way people feel about a device that may or may not be recording them at any given moment. Let's not even mention anything about a red light glowing during recording, because we know that'll be hacked out. Google Glass will be able to record you without you knowing, period. People will simply have to accept that, or the product will fail or get banned in so many places it'll almost not be worth owning at all unless you're a hardcore geek. Then of course someone does some super slick mod where Glass just looks like any ordinary pair of glasses; then mass paranoia breaks out and either people get over it or all glasses are banned. ;)

Mind you this paranoia will be happening despite the fact we've had wearable hidden HiRes cameras smaller'ish than a penny for over a decade already... http://www.brickhousesecurity.com/product/b-w+indoor+high+re...


It's almost scary to imagine there will inevitably be Glass-like contact lenses. How do you protect yourself from that?

Which makes me wish there was an anti-Glass product out there. Something that makes you disappear, or at least masks you as a see-through hologram. It's perfectly acceptable if anti-Glass does not hide your feet, for regulatory reasons.

People who are not comfortable being recorded might welcome invisibility as protection.


Google did quite a good job in the aesthetic department (considering the goal and size of the device), but it's still 'invasive'.

ps: It's funny how the public reacts to obvious recording devices when they're already surrounded by them.


The way people feel about a device that may or may not be recording them at any given moment.

It's pretty difficult to go out in public without being recorded. Security cameras are ubiquitous.


True, but people feel differently about an individual they may or may not know recording them versus a security camera at the bank. All it will take is one melodramatic Fox News piece on TV connecting Glass to cyber-bullies or crime or some other ThinkOfTheChildren/terrorist nonsense and a bunch of people will freak out. I don't want it to happen, but I know it will, and suddenly Google will have to explain how it's "keeping our children & the public safe" or some other intangible goal.

EDIT: And look at the comment replying to me below by user "read" - https://news.ycombinator.com/item?id=6189941 . If we have HN users who already feel like they need to "protect themselves" against Glass, the general public would be whipped into a frenzy by one Fox News report.


If we have HN users who already feel like they need to "protect themselves" against Glass, the general public would be whipped into a frenzy by one Fox News report.

I don't think this is true. There is already "revenge porn" and there are upskirt shots from cell phones. Nobody has suggested banning cell phone cameras, or even complained about them in any serious way. (Remember that law that would require camera phones to always make a noise? How did that go? http://www.opencongress.org/bill/111-h414/show. Out of 435 potentially paranoid Fox-News-loving representatives, its sponsor could get no co-sponsor.)

I think the whole Glass camera thing is just people trying to rationalize their intrinsic dislike for the thing. (It's new. It looks like glasses. I got made fun of for wearing glasses when I was a kid. They are intrinsically weird.) You can already surreptitiously record pretty much anything that happens in public, and people upload the result to Facebook and YouTube regularly.


Most people wouldn't want to look like a geek, typing on a handheld computer while out with friends. Yet here we are.


Did texting ever have a "look like a geek" stigma? As far as I remember, cell phones have always been "cool", even back when they were super clunky like in Saved by the Bell. Pagers, too. And texting took off pretty damn fast after it was introduced, even before stuff like the Sidekick, let alone the iPhone.


And they didn't until Apple made them sleek and sexy. It will be the same with these glasses.


Texting was pretty mainstream popular in the US for at least 3 years before the iPhone came out...


Which makes me wonder why Apple is betting on the smart watch and that trashcan they started selling, ehhm, I mean the new Mac Pro.


My attitude is this: You're right. And this isn't really about what people look like now. This is about the first few steps in what I (and others, I assume) predict is going to be a fairly major change in the way people interact with computers.

Don't think about Google Glass / Meta as they are now, think about what happens when they fit on a contact lens...


I don't think this is meant to be used 24/7. More like when you want to play a certain game or do a certain thing at home/work, and for that use case I think it's OK if it doesn't look totally normal. I'd rather compare it to the Oculus Rift than Google Glass in terms of looks. If it's meant as a 24/7 device, I'm with you, it looks too geeky.


I'm with you, but it's important to keep in mind that the current iteration of Google Glass is the rough equivalent of the old "mobile phones" that looked like a plastic brick with an antenna sticking out of them [1], and that Meta's sure seems more like a proof of concept and tech demo than something they would put forward as polished.

These devices are going to keep getting smaller and in a few generations will be fairly indistinguishable from a pair of glasses (which many people wear now).

1 - Seriously, look at this guy: http://vni.s3.amazonaws.com/120802142609275.jpg


Totally agree with you here. But my approach to this kind of tech in the next decade is to treat it like another i/o device for my computer.

Like you, I'd never really consider wearing these things in public (or more specifically, in a casual setting when socialising), but I'd definitely consider using them at work/home if I could make them do neat time-saving things that other i/o devices couldn't.

Ultimately, I'm happy to be an early adopter but for me to use them in the public context shown in the video, they'd have to be somehow integrated into something much much smaller. E.g. contact lenses (but that is seriously way off).

Between this, Google Glass and Oculus, we will hopefully see some serious progress in the VR/AR industries.


Completely agree. Their next iteration looks better, but I just don't see most people wearing them. I honestly don't think this technology is going to become ubiquitous unless and until somebody manages to get past the issues of getting it onto a contact lens.

I'd even go as far as saying the current efforts might be a bad thing for that possibility, if it is even possible, since we might just end up with a patent encumbered wasteland by the time the technology's there.


On the other hand, Glass could make something like Meta MORE socially acceptable, by pioneering the concept of having a computer strapped to your face in public.


Also this article: http://news.cnet.com/8301-11386_3-57596204-76/metas-meron-gr...

Gribetz and his band of less than 25 employees are ensconced in the Los Altos mansion, filled with mattresses, cables, and aluminum bins of takeout food ... "We are hacking 24-7," Gribetz said, "and making less than McDonald's wages."


Awesome awesome awesome. I'm super psyched about this stuff. I'm going to pre-order a pair.

Anyone (from Meta maybe?) have any details on the SDK? I see "write code in Unity3D on a Windows PC" from their Kickstarter, but curious if that's the latest word...


All Unity 3D. Thanks for the feedback that you want more detail. We're posting some app video and a developer section soon.

We make the real world (surfaces/objects/hands) appear as 3D objects inside Unity. We do the heavy lifting with computer vision and math so you can code the game as you would any other; the cool bit is that the 3D objects correspond to stuff in the real world. Our number one goal is to be the easiest environment to dev on.


Will this work with the free version of Unity, or only the paid version? I'm a student doing research in computer vision and would love to develop for Meta. (I actually proposed something very similar a few years ago and wanted to work with Steve Feiner, but I was never able to get in contact with him).


Seems strange that you don't have a C++ SDK as well. Unity3D is great, but what if someone wants to add support to an existing 3D modeling application? Even if you're primarily targeting indie game developers, what if my engine of choice is UDK instead of Unity3D?

Depending on how you implement your Unity integration it probably wouldn't be very hard to add support for other code bases, but if you want a lot of developers making applications for Meta it seems like you'd have more options available.


Since it's a first dev version aimed at simplicity, I understand going with Unity first. I'm sure they will add other technologies at a later stage.


I would also love some more information on the SDK. $700 is a little steep but I would probably spring for it depending on what I can do with the development kit.


FYI, Google Glass is $1,200 and is going for twice that on eBay.


> Tracking blank white objects — be it a piece of paper, or a big blank wall — is one of the hardest computer vision challenges around.

Most of what I know about computer vision comes from deep learning approaches, but tracking a white object doesn't seem like it should be too difficult. Is tracking a large white object actually "one of the hardest computer vision challenges", or is this just a garbage quote?


Tracking within a plain white object is very hard. Plain coloured walls are featureless, so there's nothing for most algorithms to latch onto.

However, tracking a white object such as a piece of paper sitting on a contrasting desk is relatively easy. Especially if your algorithm is designed to handle such a case. You have the easily detectable corners and edges of the paper, and from that you can infer its transformation. You can also detect its soft deformation (such as bending or crumpling the paper) if your system is assuming a piece of paper as the model.

The way some tracking works is to use a corner detector to find "interesting" features. A naive tracking algorithm will then examine the spatial neighbourhood of each feature in the next frame in order to find out where it has moved to.

There are better feature representations (such as SIFT) which define a "feature" in an image in such a way as to be scale and rotation invariant (you can match the feature against scaled and transformed versions of itself). There are also much better ways to track across frames of video data.
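As a toy illustration of the detect-then-search pipeline described above (a deliberately naive numpy sketch, not Meta's actual pipeline: a Harris-style corner score plus an SSD neighbourhood search on synthetic frames):

```python
import numpy as np

def harris_response(img, k=0.05):
    # image gradients via finite differences
    Ix = np.gradient(img, axis=1)
    Iy = np.gradient(img, axis=0)
    # structure-tensor entries, smoothed with a small box filter
    def box(a, r=2):
        out = np.zeros_like(a)
        h, w = a.shape
        for y in range(r, h - r):
            for x in range(r, w - r):
                out[y, x] = a[y-r:y+r+1, x-r:x+r+1].mean()
        return out
    Sxx, Syy, Sxy = box(Ix*Ix), box(Iy*Iy), box(Ix*Iy)
    det = Sxx*Syy - Sxy*Sxy
    trace = Sxx + Syy
    return det - k * trace**2   # high only where both gradients vary

def track(center, prev, cur, patch=3, search=5):
    # naive tracker: slide the old patch over a small neighbourhood
    # of the new frame and keep the offset with the lowest SSD
    cy, cx = center
    tpl = prev[cy-patch:cy+patch+1, cx-patch:cx+patch+1]
    best, best_off = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = cur[cy+dy-patch:cy+dy+patch+1, cx+dx-patch:cx+dx+patch+1]
            ssd = ((win - tpl) ** 2).sum()
            if ssd < best:
                best, best_off = ssd, (dy, dx)
    return best_off

# synthetic "frame": a bright square on a dark background
frame0 = np.zeros((60, 60))
frame0[20:40, 20:40] = 1.0
# next frame: the same square shifted down 2 px and right 3 px
frame1 = np.roll(np.roll(frame0, 2, axis=0), 3, axis=1)

r = harris_response(frame0)
corner = np.unravel_index(np.argmax(r), r.shape)  # strongest corner
print(track(corner, frame0, frame1))  # → (2, 3)
```

A featureless white wall would produce a flat response map here, which is exactly why there's nothing for a tracker like this to latch onto.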

Given that Meta has infrared and RGB stereo cameras it has a lot more information to work with. I hope they can make it work well under all situations, but I am skeptical.


Thank you for the explanation. I was mostly thinking about the case of tracking something like a piece of paper in front of a wooden desk.

I can see how tracking the scale and orientation of a single-color object that fills the field of view would be difficult or impossible.

It doesn't seem like these worst-case scenarios would come up much in real-world use. It's fairly rare to encounter situations where one's entire field of view is filled with one featureless color. I would imagine that a wide field of view for the cameras would help greatly with this problem.


Without any texture it's practically impossible to locate and track any feature points on the wall. Most vision algorithms use "corners" (feature points) that can be matched/tracked.

A way to get around that is to use an infrared setup like the Kinect's to project a pattern onto the object, but I'm pretty sure that wouldn't work if both the projector and the object are moving.
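The "nothing to latch onto" point is easy to demonstrate: a feature detector needs gradients in both directions, and a uniformly painted wall has none. A minimal numpy sketch (toy data, toy criterion):

```python
import numpy as np

rng = np.random.default_rng(0)
wall = np.full((50, 50), 0.7)       # uniformly painted wall
textured = rng.random((50, 50))     # richly textured surface

def trackable_points(img, thresh=0.1):
    # count pixels with gradient in both directions -- the minimum
    # a corner detector needs to localise a feature in 2D
    gy, gx = np.gradient(img)
    return int(((np.abs(gx) > thresh) & (np.abs(gy) > thresh)).sum())

print(trackable_points(wall))       # 0: nothing to latch onto
print(trackable_points(textured))   # hundreds of candidate features
```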


No, it works for moving projectors as long as the camera is moving as well.

http://www.youtube.com/watch?v=CSBDY0RuhS4 http://www.youtube.com/watch?v=Sw4RvwhQ73E

Since you could assume that the projector is standing still and everything else is moving, the same should be true for the inverse and all points in between.


It depends - tracking a white piece of paper on a snowy backdrop is definitely a hard challenge ;)

But all in all, I wouldn't say it is. In undergrad, the final project of my computer vision class was to track a soccer ball over video frames. A white circular object against a mostly green backdrop: fairly straightforward.
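That ball-tracking exercise really is straightforward when the contrast is strong; a threshold-and-centroid sketch on a synthetic single-channel frame (a toy stand-in for real video, not a robust detector) already localises the ball:

```python
import numpy as np

# synthetic frame: a "green" pitch (value 0.3) with a bright
# white ball (value 1.0); one channel is enough for the idea
frame = np.full((120, 160), 0.3)
yy, xx = np.mgrid[0:120, 0:160]
ball_y, ball_x = 48, 90
frame[(yy - ball_y)**2 + (xx - ball_x)**2 < 8**2] = 1.0

# threshold for "white", then take the centroid of the mask
mask = frame > 0.8
cy = yy[mask].mean()
cx = xx[mask].mean()
print(round(cy), round(cx))  # → 48 90
```

Repeat per frame and the centroid trace is your track; real footage would of course need a smarter white/green separation.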


I'm pretty sure what Meron meant was getting the orientation of walls so that they can project stuff on top of them.


Their stylized logo looks like it says "METH".


What's wrong with METH? Ain't nobody going to touch that IP.


Wearing a computer on you head isn't normal... but on METH^HA it is. :P


Some other interesting videos about Meta that get more into the details:

http://www.meta-view.com/

https://www.youtube.com/watch?v=Cv7nSng0yD8


Brilliant and ballsy. It's refreshing to hear a startup go after long epic visions and to be able to gather so many people to work on it already.

Looking forward to seeing their future.


Are they already out?

Some of the people here saying the Meta glasses look geeky are missing the point. They can be used at work in a myriad of different ways:

* previewing 3D prints, with Tony Stark-type visualizations more generally

* collaborative games in offices around the world after work, where you can do things like fire projectiles or see the same objects or stats only if you have the glasses on

* metainformation overlaid for visitors to museums etc.


This is really exciting, that someone has turned this into reality (I was just daydreaming about it).

The top somewhat reminds me of the Kinect. Or are you guys using binocular vision? If it's the latter, does it work outside?

I think I'm way too excited about this, and having to wait for a teardown to find out what tech powers this beast is making me giddy like a 5-year-old in a candy store.


The laser tag could be incredibly awesome, since you can actually see all the lasers coming at you, and with none of the range and safety issues that come with paintball. You could conceivably play it as a team event in your own office using just the glasses, or scale up to a full real-time war simulation with the right cameras and software.


The concept video looks awesome. A few weeks ago I went hiking and thought to myself "wouldn't it be great if Google Glass had an app that showed me information about the flora and fauna I'm seeing? Oh wait, it can't yet." It looks like Meta might be able to.

When looking at new technologies there are always two questions: is it worth doing and can it be done. The answer to the former is obvious here. I don't know nearly enough about the state of hardware to make a call about the latter, but kudos to the team for unabashedly attacking such a huge problem and trying to make the future happen faster.


    wouldn't it be great if Google Glass had an app that
    showed me information about the flora and fauna I'm
    seeing? Oh wait, it can't yet
I think Glass could do this, but not as an overlay.


I've been hoping for a long time that Mann's EyeTap stuff could be more widely available, so here's hoping they deliver something amazing (and maybe affordable too, in my dreams).


There's quite a contrast between HN's reaction now and 3 months ago:

https://news.ycombinator.com/item?id=5726572


I am not sure such glasses will take off. You always have a fucking HUD in front of your eyes, and I am not sure everybody likes that (I've never tried it, but I don't even like normal glasses and prefer contact lenses, as I feel more free that way).

I would much prefer something like a small projector, where you can project a UI onto a suitable surface (holograms seem to be taking longer ;) and control it either with a pointing device or gesture control.


These guys are from the future.


Can someone explain to me how these are different from Vuzix, and any other company after that?

http://www.vuzix.com/augmented-reality/

PS: And yes, I understand the advertising potential after GG_AR. Maybe Vuzix should do the same, like Sony did against Microsoft at E3 (PS4 vs Xbox One).


Also, alternative name:

Singularity shades.


You scared me for a moment, because that's a mixup of the names of 2 things I'm working on right now. Geez.

Here, have another: Overlay Optics/Occulars.


Mirrorshades is really the only correct answer here.


They're not exactly mirroring anything though. I'd go with "Awesome", but I'm not sure whether that describes the product properly.

PS: Random thought: how long will it take the Japanese to create VR girlfriends on this thing?


I shall create one, and name her Molly (comedy answer, Y.T.).


I'm curious how far away the display visually renders. Do you have to focus on very close objects to see it sharply (and then can't focus on faraway objects)? Does it rest in a medium-ish focal zone (10-30 ft)?

I'd love an AR display, but I'm skeptical if it forces me to take my focus off of everything else.


The SDK works only on PCs, as per their tech spec. I wonder if it is all C#-based or some other MS technology.


It's Unity 3D, which if I recall correctly uses C# .NET.


Unity3D also supports JavaScript and Boo (a Python-like language).


I just took out my credit card and bought a pair. Looking forward to developing on their platform.


The technology here is groundbreaking, but personally I find reality amazing enough as it is without augmentation. I honestly can't see myself wanting to wear any iteration of these glasses in my lifetime, even if they were undetectable contact lenses.


"You need hungry, imaginative and foolish people in their 20s to do this, and we have that" --Article about Meta from USAToday

F'k you ageists... Notice how they need the old guy who invented the AR concept (Steve Feiner) to give them any credibility.


Wow, didn't realize Meta was a YC company. It's a very interesting product.

I watched the demo on Kickstarter. I couldn't put my finger on it, but something seemed off about the object occlusion. Was that FX or real tech?


I get the feeling that reality will not come close to the video.

I'd love to be wrong, but this is usually the case with entirely CGI promo videos...


I worked with Meron on a completely different concept back in college. I moved on and I'm not sure if that ever went anywhere, but man, that guy has got vision and can inspire people. If anyone can pull something like this off, he can. Or at least, if anyone can convince people he can pull something like this off, he can. Sometimes the latter can be more valuable.


If we can get AR with the quality of the Oculus Rift, gesture control with very fine precision, and an Emotiv-like layer for added thought interaction, for $300, I think we might slowly be getting somewhere...


Offer me such a controller with multi-platform support and I'd buy it at under $800. I think that's a good price to pay to add immersion capability to an Xbox, PC, and PlayStation.

And I'm not even a gamer anymore.


Awesome! Could anyone explain or provide resources for how the micro projector technology and virtual screen work? I find that tech simply fascinating.


Damn...

Where are our flying cars? Who cares. We have Meta glasses.


People are working on it: http://www.terrafugia.com/ :)


What kinds of wireless comms will the device support? The specification only seems to mention USB and HDMI interfaces?


According to the specs, the field of view is 23 degrees. The concept video seems to exaggerate the visible portion of the AR overlay.
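Those 23 degrees are easy to put in perspective with a little trigonometry; the 0.5 m viewing distance below is just an assumed arm's length:

```python
import math

def apparent_width(distance_m, fov_deg):
    # width of the region a given field of view covers at a distance
    return 2 * distance_m * math.tan(math.radians(fov_deg / 2))

# 23-degree FOV at an assumed arm's length of ~0.5 m:
print(round(apparent_width(0.5, 23), 2))  # ~0.2 m, about 20 cm wide
```

So the overlay covers roughly a sheet of paper held at arm's length, nowhere near the full-view scenes in the concept video.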


Is the price of $667 a play on the $666.66 that Steve Jobs and Steve Wozniak sold the original Apple I for?


Can't wait to see them in action for myself!


I really like what you created!


It's very interesting to note that there are many hardware startups (the serious ones, remember!) in the latest YC portfolio. I think it'd be pretty amazing to be around such an innovative bunch and do your work, even if one is doing just software. :-)


John Carmack is watching, gentlemen. Get ready for a job offer ;)



