Adam (unity3d.com)
657 points by nikolay on March 15, 2016 | hide | past | favorite | 132 comments



Much of what makes this appealing has nothing to do with the render engine. What I saw was some really excellent character animation / motion capture. The rendering itself wasn't particularly jaw dropping. And that's not a critique of unity. Rather it's a critique of their chosen subject. Metal, walls, and artificial objects in general are all very easy to render convincingly. Show me some trees, grass, translucency, or volumetrics and then I'll be impressed.


Even then, I'm unconvinced that the ability to do realistic graphics in real time will actually manifest itself in videogames/VR experiences we can interact with. So much of what makes this look good is the excellent art direction, and as you stated, excellent textures and animations of an artificial thing. But animation students have been producing similar quality visuals for their show-reels for ages. All that has changed is the ability to render faster, but the creation of this content is still being done by the same processes.

For videogames/vr to break the photorealistic barrier, there needs to be some order of magnitude reduction in art development costs for these experiences to be affordable. Not all videogames can be star wars battlefront, where probably $150M ($330M overall cost to develop and market [1]) was spent making the best damn textures videogames have ever seen, but produced a simple, limited game.

Photorealism is a dead end for videogames unless art costs come down.

[1] http://fortune.com/2015/12/30/star-wars-video-game-sales/


Well, John Carmack was impressed by the results of Brigade. And I think Otoy is a company to watch when it comes to VR and 3D.

https://home.otoy.com/render/brigade/


That's some really impressive stuff. The problem, though, is that asset creation remains massively expensive.


Thanks! Asset pipeline for Brigade is Octane.


I'm curious how you manage noise in low-light situations. The one video I found had significant compression artifacts so I wasn't able to tell what the actual engine looks like.


It's pretty but the demo doesn't have a hint of motion in it, and such organic objects as there are (plant leaves) are very plasticky. The HDR bloom is lovely, but presumably being faked just as it would be with any other render pipeline.


It's as dynamic as any rasterized game engine. Siggraph 2012 demo has exploding meshes at full speed for example.


> But animation students have been producing similar quality visuals for their show-reels for ages. All that has changed is the ability to render faster, but the creation of this content is still being done by the same processes.

Have you seen modern texturing pipelines? It's most certainly not the same process as 10 years ago. Have a look at this procedural wood flooring generator, built in Substance Designer's node-based texture workflow: https://www.youtube.com/watch?v=Zc5Pdcbjr0U

Same for animation. Doing it by hand won't get you the quality of results that you need for photorealism, so we're bringing in new processes that require a bunch of equipment, a bunch of time, and probably multiple crew members. Used to be you'd make a walk cycle by hand, now it's mocap. With the same old processes, there's no way Star Wars: Battlefront would have been able to make its content in the volume it needed, even with a large art budget.

Now that we know how to create these assets on at least a somewhat feasible budget (compared to armies of artists manually doing 8K textures in Photoshop), the next step will be taking this high-end work intensive stuff and bringing it down to the point where we can crank out similar quality results with lower time investment. It'll take a couple more years, but there's definitely a demand for it.

Releasing today: Substance Painter 2 https://youtu.be/1pIoA34MVBA?t=26 (warning - annoying soundtrack)


Yes, the tools have gotten much much better, that is true. But it still costs an incredible amount of time to generate the art assets, it's a huge part of the budget.


I have not. That's super neat and I'm glad people are making progress! Is there anything happening on the animation side which will push past mocapped movements?


I don't know a ton about animation, but HumanIK might be worth checking out: https://www.youtube.com/watch?v=blLBRmNA3zI

A couple of really cool things about the Substance texturing approach:

Feed it a 3D model and generate maps of its inside corners and outside edges, which can be used to mask layers for dirt accumulation and edge wear https://www.youtube.com/watch?v=zTYia53801k

Squish multiple simple generators/effects together to make really interesting shapes: http://bradfolio.com/art/substance/substance-red-rock-cliff-...

No reference handy, but you can expose parameters like "How much rust" or "color of scattered accent bricks" and make the textures be configurable in-engine. Really streamlines the process if the person building out a level in the game can adjust those sorts of features without going back to the texture artist every time.
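The idea of exposed parameters can be sketched in a few lines (a hypothetical illustration of the concept, not Substance's actual API): a weathering knob and a per-pixel cavity mask drive a blend, so turning the knob in-engine re-weathers the material with no round-trip to the texture artist.

```python
def weathered_pixel(base, rust, rust_amount, cavity):
    """Blend a rust layer over a base color, weighted by an exposed
    'rust_amount' knob and a per-pixel cavity mask (where grime collects).
    Colors are (r, g, b) tuples in [0, 1]."""
    w = max(0.0, min(1.0, rust_amount * cavity))
    return tuple(b * (1 - w) + r * w for b, r in zip(base, rust))

# Turning the exposed knob from 0 to 1 weathers the material in-engine:
clean = weathered_pixel((0.6, 0.6, 0.6), (0.4, 0.2, 0.1), 0.0, 0.9)
rusty = weathered_pixel((0.6, 0.6, 0.6), (0.4, 0.2, 0.1), 1.0, 0.9)
```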

There are a lot of neat examples in here: http://polycount.com/discussion/155851/weekly-substance-chal...

EDIT: Here's a cool one http://polycount.com/discussion/comment/2384431/#Comment_238... You can feed it an arbitrary mask to control the color pattern. Note how it's not just painting the color over the bricks, it actually makes brick seams following the edges. And if I had to guess, the dirt/sand accumulation is exposed as a configurable parameter.


Those methods (not texturing, but animation at least) seem to be evolving with the introduction of accessible mocap and 6dof controllers.

http://blog.leapmotion.com/vr-transform-world-3d-animation/

https://www.youtube.com/watch?v=0a_M9VsZ6Lk

http://www.wired.co.uk/news/archive/2015-05/08/animating-in-...


That sounds like the step forward these things need. VR to really get a good look at your stuff, and the ability to mocap while sitting at your pc. Cool stuff.


About [1], consider this:

https://www.reddit.com/r/todayilearned/comments/4412zk/til_s...

Both the topic: 'more spent on marketing than production' and the first post 'numbers can't be trusted because of creative accounting practices'. I wouldn't be surprised if this applied to the game studios, too.

Anyway, you're essentially stating two things. One is that art students have been able to produce this level of quality for their show-reels for ages. If students can do it, it can't be all that expensive. So why did we not see it in games before, but only show-reels? Not because these students or studios couldn't translate a show-reel to an interactive show-reel (i.e., a game). Rather more likely, because there was no market for it: nobody had the money for the hardware to run these things in real time. Now we're seeing this thing render in real time on consumer hardware that might in 5 years be affordable to mainstream audiences.

Seems to me the biggest barrier was that there was no market because there was no cheap hardware that could run high-quality graphics in real time. Of course the cost of art is a factor, a big one, but I don't think it's been the primary limiting factor.

Lastly, the cost of art at any given quality level has come down in a big way. It's offset by increasing demands. But try to buy some assets of 2011-type quality: it's cheap, while 5 years ago you'd have had to hire a big team to deliver that. Once you approach realism, the returns on quality improvements diminish and you get a build-up of cheap assets (textures, template models, etc.) that can be tweaked, reused, and bought cheaply. Assets from 5-10 years ago are like cheap commodities already; tomorrow's assets are expensive, but at some point there's a limit to quality improvements, and just like every other industry (e.g. smartphones in 2016) you see commoditisation and the cost come down.


A two minute film is not a videogame. A show reel requires one good animation to work in one specific situation. A videogame needs an animation that works generally. Also, making a good animation is hard, and artists deserve to be paid. Think of movies, where there are still movies today which have unrealistic CG, despite computation being far from a bottleneck for a feature film (Legolas on the elephant in LOTR: Two Towers) - it's not the technical challenges, it's the artistic challenges.

Right now, the vast majority of games have characters with fixed walk cycles that are used no matter the terrain. Realistic CG needs a walk cycle that captures the subtle changes in gait that correspond to a given surface. I know people have been doing research on adaptive walk cycles, but afaik it has yet to hit production games.

For generating art, there is hope, as procedurally generated games look fantastic (No Man's Sky), but have yet to expand beyond sci-fi games, or into games with story and specific art styles.

Maybe reusing assets is the way forward, but I'm skeptical. Reusing the visuals just means more army guys fighting in sandy deserts crouching behind crates. Maybe the Storm Trooper models for Battlefront really are as good as they get. But even then, I think realism can aesthetically lead to a dead end. The most visually impressive game I've played, other than Star Wars Battlefront, is The Witness, which is simple in the CG sense, but has some really tremendous visual aesthetic moments that are something I've never seen done in a game. For me, the stylized look and the artistic/game opportunities it enabled were far more exciting than Battlefront's perfect rocks.


> still movies today which have unrealistic CG, despite computation being far from a bottleneck for a feature film (Legolas on the elephant in LOTR: Two Towers)

One point to note - LOTR Two Towers was released in 2002. 14 years ago.

Yeah, I feel old too.


A videogame needs an animation that works generally.

What are you talking about? The video says "rendered in REAL TIME" so obviously this is working on an XBox with full motion control.


I'm claiming that at this level of hardware power, the hardware isn't the limiting factor in realism; the animations are.

Realistic visuals need realistic walking. But since real walking isn't a perfectly stable cycle (it is disturbed by head/hand motions, surface variations, etc.), there can be no realistic walk cycle, because real walking isn't a cycle at all. AFAIK current videogame animation all involves walk cycles. Realism at this level of detail needs some new form of animation, not just the love and care of a person that is so present in this demo.


Regarding walk cycles on terrain, inverse kinematics is seeing good use here. For modern games, the technology that solves this issue is readily available and generally robust.

I wouldn't describe the usage as trivial, but it's in line with other toolsets that deal with different issues.
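The math behind IK foot placement is compact enough to sketch. Below is a minimal 2D two-bone (thigh/shin) solver using the law of cosines; this is a simplified illustration, not any particular engine's API, and the function name and numbers are made up for the example.

```python
import math

def two_bone_ik(tx, ty, l1, l2):
    """Solve a two-bone chain (thigh length l1, shin length l2) rooted at
    the origin so the foot reaches (tx, ty). Returns (hip, knee) angles
    in radians; `knee` is the interior knee angle."""
    d = math.hypot(tx, ty)
    d = min(d, l1 + l2 - 1e-9)  # target out of reach: fully extend the leg
    clamp = lambda x: max(-1.0, min(1.0, x))
    # Law of cosines gives the interior knee angle from the three side lengths.
    knee = math.acos(clamp((l1 * l1 + l2 * l2 - d * d) / (2 * l1 * l2)))
    # Hip angle = direction to target, offset by the thigh's share of the triangle.
    hip = math.atan2(ty, tx) + math.acos(clamp((l1 * l1 + d * d - l2 * l2) / (2 * l1 * d)))
    return hip, knee

# Terrain dips, so plant the foot lower than the canned walk cycle would:
hip, knee = two_bone_ik(0.2, -0.8, 0.5, 0.5)
```

Run per-foot each frame against a terrain raycast, this is what keeps feet from floating above or sinking into slopes while the base walk cycle plays.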


>If students can do it, it can't be all that expensive.

That's not really true at all. It comes down to man-hours and how much time it takes the artist to do the work. A student isn't incurring that cost when they do work for themselves, but a company would be.


A valid point, but even realistic metal isn't necessarily easy. The volumetrically diffuse reflections (that actually get sharper as geometry approaches them) you see of the robot prisoner's arms on some of the walls haven't been possible in real time until very recently, and are clearly a new feature Unity is showing off here.

What's most interesting to me, though, is the Unity logo at the end, which I assume must be showing off engine capabilities as well. It makes heavy use of sub-surface scattering, which is of critical importance to lifelike skin and faces.


I'd much prefer if more people tried to improve animation, because state of the art motion captured movements still look like crap in modern AAA games. Graphics is good enough for now.


Even properly done human models are very challenging. You can spot a fake 3d actor in no time, even in movies.


This was the worst part of Matrix II.


Matrix Reloaded was 13 years ago. That's a bit unfair. You can still tell, I think, especially if you know where to look, but I don't think most people would notice, in Fast and the Furious 7, for example.


Nah, it was shoddy work. Compare Terminator 2 which was 25 years ago.


Terminator 2 used far, far more practical effects than one is likely to assume these days. The iconic shot of the T-1000 blown in half and sewing itself back up? Practical effect. Really amazing stuff overall. https://www.youtube.com/watch?v=EYQMfT6nsQs


Or skin, hair, and clothes.

I still found the motion capture and the crisp rendering quite impressive, though.


I always find it's the physics that break the feel of the game, not the visuals.


Honestly the most impressive part to me was being able to convey a story of "human somehow put into a machine" pretty much only through physical acting. That's not something you see every day in video games.


Absolutely this. It fully conveyed real emotion with a sense of panic.. then seeing the different behaviors in the crowd too. The guards. The whole thing tells a story. I hope it's significantly more than just a tech demo. I'm hooked.

And I suppose it's worth pointing out that creating all the tooling to put this together is just as impressive as a lot of the more obvious aspects. It's a giant content creation pipeline coming together. I expect these demos from Epic, not Unity. They're stepping up.


You may want to play SOMA then. The whole game deals with this very subject and its implications. I can highly recommend it for the excellent storytelling, dense atmosphere and brilliant sound design.

http://somagame.com/


That was what mainly put me off, it makes no sense. First the robot starts off breathing like he'd been under water for 3 minutes and came up for oxygen, what? A robot running on oxygen? Then he breathes and moans, what? Vocal cords on a moaning robot? Stumbling because his robot muscles haven't been used in a long time? It made no sense, they took a human's motion, behavior, appearance and sound, and then just exchanged flesh for metal, which makes no sense to me. And then you say that it's only the acting that made him human, come on, everything except his skin made him appear human!

I mean, anyone can come up with some convoluted ideas that explain why the above does make sense... like the robot breathes because there's an organic brain in there that needs oxygen, they moan because it's still a human brain and the machine is sending his brain an overload of sensory data that is hard to deal with, the robot stumbles because his brain is new to interfacing with its machine parts, etc. But I personally didn't like it and kinda roll my eyes when they go overboard with the anthropomorphism. Still it was hella cool, hope the full version will explain away my doubts neatly :)


I assumed it was all in his 'mind'. These didn't act like AI robots; it seemed clear from some of the exposition that there's a mind in there that thinks it is human. Notice the different choices different robots took - one ripped the covering off its arm, and in the big crowd you can see different robots reacting differently - one pushes another out of the way, for instance. I'm pretty sure there are reasons for how they acted.


Put me in a vat, I'm not going to make moaning or breathing sounds in my mind. And it seems like a pretty silly engineer who'd simulate such a thing. That's really my point. If it was a cartoon I couldn't care less, but I like my sci-fi to approach some level of realism. (And I'm really not the type of guy who complains about sound in space; it's about immersion for me, and a breathing and moaning robot kills it for me.)


Amputees still 'feel' the missing limb months and years later, so I find it plausible you would still 'feel' reactions in your body even if you no longer had one. I didn't take the sounds to be literal, just an artistic technique.

To be clear though, I think your opinion on this is every bit as valid as mine, I just really enjoy discussing film.


I had exactly the same reaction - a robot breathing oxygen, hahaha. Why is his body tired?!


Because they replicated a function of humans that they wanted intact, like speaking, which requires breath. Nice demo. As always, story is king; it gripped me :)


Here it is running real-time in Unity editor:

https://www.youtube.com/watch?v=eN3PsU_iA80&t=37m50s

It's a standalone PC project using DX11.

Cloth and cables physics simulation is pre-baked, lighting is dynamic.



Thanks. Seems they must have edited the video since yesterday, cutting out early part of the stream.

Curiously I can still see the proper thumbnails when hovering timeline at the original timestamp.


I want to watch this movie/or someone play this game for a few hours. I can assume that Adam is a human who was put into a robot for some reason. There is some potential there, like Chappie re-imagined.


It looks like they are/were all prisoners based on the word 'Felon' on his name plate as well as the orange pants. Definitely will be an interesting story (maybe repurposing felons as robots to test stuff).


If you haven't already, you should watch/play SOMA. It is a masterful exploration of the concept of the ego and self, and the consequences of attempting to shift that into a robot (robots).


The Talos Principle might be something you enjoy. You start as a robot that is basically attempting to prove it's human.


I am sure we will be able to clone/copy/emulate brains into machines someday. It will change the world in so many ways. We would be immortal, and that would change our priorities so much. People living in virtual worlds. The future would be so cool.


We'd probably kill earth before that.


This reminded me of the pod scene from The Matrix (1999): https://www.youtube.com/watch?v=0WCcX0KQ9V0

Very similar tubes removal process :)


I'm pretty sure that was the plan.


I think that's also Keanu Reeves' audio as well.


There have been a couple of games with atrocious performance on PS4 including Broforce and Firewatch—and Broforce is a 2D sidescroller(!)

I don’t know whether Unity is innately bad, or whether frameworks in general just tend to enable bad code.

Would love to hear more from people who know more; right now I associate Unity with people starting out in games rather than a platform people continue to use after they hone their skills.


Unity's massive problem is that it's still using an ancient version of Mono, with awful garbage collection and other performance issues.

In most other respects, it's an acceptable game engine. It's been used for a ton of big professional games you've probably played without even being aware:

https://en.wikipedia.org/wiki/List_of_Unity_games


It might have "awful garbage collection" and "other performance issues", but it has had nothing to do with Mono for almost two years now:

http://blogs.unity3d.com/2014/05/20/the-future-of-scripting-...


Not only that, Unity games also have a very inefficient game loop, which incurs a LOT of input latency. You don't notice it much on consoles, but on PC, at 60 FPS with a mouse, it's super noticeable.


Honestly, you can make a horribly inefficient game using any engine. Unity just has a lower barrier to entry, as it's very hobbyist-friendly (as you rightly point out), so there's more games being made overall with it, more games made by people with little experience in performance optimization.

I don't see this as a bad thing, either. The tradeoff, from the hobbyist's point of view, is that if you decide to use Unreal Engine, for example, your cost of learning the tech is so high you're unlikely to ever finish your game.

Source: I'm a (professional) game developer and we've used Unity for multiple projects.


You can write crappy games in assembler if you like. There are plenty of horribly performing Unity games simply because it sets a lower bar for developers. Also, rather ironically, developers using the less expensive license are forced to show the Unity logo at launch which tends to associate Unity with low-budget games.

There are plenty of awesome games done in Unity which you probably didn't realize they were done with Unity.


> Rendered in real time with Unity

...on what hardware?


From Twitter of one of the demo creators:

"I was using GTX 980, which runs it at 1440 in the editor with perf to spare"

https://twitter.com/robertcupisz/status/709789738316726272


Except that was in reference to the demo they gave at GDC while using the editor. Not necessarily the card used to render the demo, someone should ask.


From the YouTube comments:

Processor: octa-core Intel Core i7 (6th generation); GPU: Nvidia GeForce Titan X (3x SLI)


Update from one of the demo authors:

"The Adam demo runs in 1440p at 30fps on a GTX 980, by the way :) See it running real-time at the Unity booth at GDC!"

https://twitter.com/robertcupisz/status/710160351568986112

So it's very likely that the YouTube video was rendered with far better hardware.


And one Titan X gives 6.6 TFLOPS, single precision (peak)


From the YouTube comments: that was also just buddy pulling a guess out of his butt. The commenter was making it up.


Sure, but same info here: http://thenextweb.com/gadgets/2016/03/16/check-amazing-short...

Someone should ask though given another poster's comment below about the GDC demo of the editor where they were using a GTX 980.


on your hardware!


Nope, that's exactly what it led me to believe, but it is definitely not that.


Oops. Looks like I was wrong... This is a youtube video of a pre-baked "realtime" render.

Still impressive though...


love the ending... probably all going back to their cubes to write themselves some React JS code..


Damn, I thought this was running in browser until I read the comments.


I went from "Whoa, this is astounding for WebGL" in the first scene to "Ok, this probably isn't actually rendering on the client" as I saw some of the diffuse reflection shaders in the escape to "I feel kind of silly now" once all the other robots appeared.

The YouTube chrome is hidden, but the video must be in 4K; it looked like native resolution on my Retina MacBook Pro.


It's up to 1440p; it played at 1080p by default on my rMBP, but I may have set that as the default a long time ago, so perhaps you saw 1440p.

https://www.youtube.com/watch?v=44M7JsKqwow


Exactly. I feel a bit cheated.


Realtime with what hardware?


Even if high end I imagine in 2-3 years you'll have the same quality on mainstream.


the best part was the real time orchestra sound track.


haha. and the realtime coordination of all the 100+ agents.


If you like the philosophy of this, I heartily recommend you check out the first season of Ghost in the Shell: Stand Alone Complex. The second season is so-so, but the first season is amazing and jam-packed with cyberpunk and philosophical challenges like this one.

They’re also doing a (whitewashed) live-action version of some amalgamation of the movie and TV show, so might as well watch the original (1.0) movie and anime, before Hollywood ruins them for you.


Have Unity fixed the supposed big threading issues causing bad performance with v5.4?

https://youtu.be/HnVOi9wrZVU?t=188


Especially the second part with the humans looked very real.


Why not use the Unity Webplayer?


The webplayer actually doesn't work anymore in most recent browsers because of changes to how plugins work/security concerns with plugins in general.


Nowadays you can build to Javascript/WebGL. This demo uses special DirectX 11 features so it runs only on Windows native.


Not sure if sarcasm?


Well isn't that kinda what Unity stands for, running things on multiple platforms, including in your Browser? So if you want to showcase the engine, showcase the engine.

But anyway I don't want to be overly critical, after all this is probably just an internal demo, they thought would be cool to share with the public.


Anyone else reminded of the Metal Gear Solid series? The cutscenes (especially in MGSV) had the same exact shaky cam and overall feel in this clip.


That's nothing remotely new in games, though.


This is pretty cool. I'd be really interested to play around with Unity to build webpages for VR or something of the sort.


I was confused for a bit, but it's a YouTube video, not running in engine... you can understand how I'd be confused, as Unity has a web player.

Edit: Apparently they don't really have the web player anymore. Still confusing. Was actually hoping that for once I'd be proved wrong about WebGL / Emscripten.


It is still possible to export to e.g. WebGL, but the features required for something like this are not available in the browser. Also they were running on 3x Titan X, which doesn't exactly have widespread adoption :)


Great idea for Unity to have their own demo team producing stuff like this to push the technology further.


Great demo by the Unity folks. Hopefully small studios can harness that power just as well as the people who make the engine; they usually know how to cheat it best.


Great graphics do not make great games.


This is... too good. This is seriously all in browser?


Actually, it looks like it's a streaming video from Youtube (right click on the player while it's playing).



So what exactly is being rendered in real-time? Honest question. I don't quite get it.


What they mean is that it was rendered in a way in which it took 1 minute to render 1 minute of footage.

Instead of a typical movie rendering, which can take a long time for just a single frame. Like Toy Story: "Our original Toy Story frames were averaging four hours, which is 240 minutes" - https://www.quora.com/How-much-faster-would-it-be-to-render-...
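The arithmetic makes the gap concrete; a back-of-the-envelope comparison using the Toy Story figure quoted above (rough numbers, just for scale):

```python
# Real time at 30 fps: every frame must finish within its slice of the clock.
realtime_frame_ms = 1000 / 30             # ~33.3 ms per frame

# Offline rendering at Toy Story's quoted average of 4 hours per frame:
offline_frame_ms = 4 * 60 * 60 * 1000     # 14,400,000 ms per frame

# The real-time renderer has to produce each frame ~432,000x faster.
speedup = offline_frame_ms / realtime_frame_ms
print(f"{speedup:,.0f}x per frame")
```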


Thanks! That makes perfect sense.


It just took a cluster of a thousand computers to do it.


It was run on a single PC with the following hardware. Processor: octa-core Intel Core i7 (6th generation); GPU: Nvidia GeForce Titan X (3x SLI).


No it wasn't, that commenter was _guessing_.

A tweet above says it was on a GTX 980 at 1440p.


Actually I was joking.


They mean that it was rendered in real-time when it was recorded, not that it is rendering in real time in your browser.


I imagine this is a recording of the demo which was streamed in real time, as opposed to being rendered (a process which can take significant time per frame for scenes this complex).


E.g., see: https://www.youtube.com/watch?v=toLJh5nnJA8&t=1h36m47s

for current state of the art realistic characters and

https://www.youtube.com/watch?v=DRqMbHgBIyY

for current state of the art real-time architectural viz.


As soon as I thought I was decently smart, this guy started talking... Wow! https://youtu.be/toLJh5nnJA8?t=1h37m44s


These are not complex scenes by today's standards. Geometry's no problem. The shader complexity is a bigger concern, but here it's all hard surfaces.


That was my initial reaction when they said rendered in real time, but I'm guessing it meant low latency (real time) while waiting for frames to render during processing.


woah, what is this??

nvm i scrolled down lol


I had no clue to scroll down till I saw this.


Heh, I saw the scrollbar but it broke my brain a bit to see a button in the middle of the page that wasn't going to make it scroll.

cf: http://adventurega.me/bootstrap/


Same here, I completely didn't realize I could scroll down on the page until seeing this.


If it is rendered in real time why do I see a youtube logo at the bottom right corner?


Because it's a recording of a real-time render.

As opposed to a recording of a 1 minute video that took a few hours to render on industry hardware.

They ran it on a pretty beefy consumer PC that would cost thousands of dollars, definitely not feasible for ordinary gamers. Point is, it ran real-time on consumer hardware. It's sort of a showcase for what might become mainstream in the future, and what is already possible if you're the type of consumer who buys graphics cards that cost $1k for gaming.


Wait what? It says rendered in realtime. You mean there was an actor with dots on him/her, and that film was playing on a monitor next to them so the director could see the cables popping off?

Surely not


Not quite - the animations would have been created "off-line", and included when rendering the scene.

mparlane explained it nicely: https://news.ycombinator.com/item?id=11294300

PS: Hi Paul, hope all is well :-)


Cool - thanks for the link. I imagine everyone else is living about ten years more in the future than i do. Or maybe I am just ten years behind ...

Very well thanks. Doing any conferences this year?


How can a robot take breaths?

Why does he try to take off the "mask"? How can he identify that it's not a part of his "body"?


I find it particularly clever. What they are telling us is that this is a body for a human to occupy. The sensation and sound of breathing are simulated for him.

He's still in a bit of shock from waking up not in his own body. I guess pulling at it was an instinctive reaction upon seeing it and finding a (painless) hole in the side of his face.


I thought that also, but look at his body: it's machine to the bone, so I was wondering where the place for the human to occupy would be. I was thinking maybe he just "thinks" he is breathing, because he was a human and woke up like this.

What's that with the downvotes? A robot doesn't need oxygen to fuel chemical reactions.


Moore's Law would predict realtime rendering as inevitable. Now let's hope something interesting comes of it.


You're referring to ray tracing or some other physically based rendering approach.

This is still polygonal rendering, but by real time they mean that it was rendered 1:1 live on a single machine.

Most game trailers actually render at some fraction of real time (e.g. 12:1), often with everything up-resed, to a series of files which are turned into the actual trailer.

Also this is running on a pretty beastly machine. So realtime for them might be 12:1 time ratio on your machine at the same quality. Or much lower res and real time.


No. I meant high-fidelity rendering as was shown in the post. Of course Moore's Law will soon give us high-fidelity ray tracing in real time also. The new hardware implementations are a step in that direction. As for "running on a pretty beastly machine", that's my point - Moore's Law says this will soon run on your phone.


Hm? Realtime rendering is not something new. In fact it's decades old.


Moore's law doesn't predict that at all


Why you think the net was born? Porn porn porn


This isn't realtime rendering, like demoscene demos. It's just faster rendering of a video. I think the webpage is misleading.


I'm tired of projects using 'Unity' as a name.

There's this, there was an Apache Unity thing that was a Java implementation. There's an Ubuntu thing. Who knows how many others.


I'm tired of parents using David as a name.


Well that was an oddly personal response.


Fair enough but this one's been around for more than a decade and is a staple of game dev engines.



