In case anyone has 30-60 minutes and is interested in getting a quick glimpse of how easy (meaning free and accessible, at the least) it has become to do such graphics:
- Download and install Blender 2.71 (http://blender.org/download). On Linux (Ubuntu) I did not even have to install it; I just extracted the tarball and ran the blender binary.
As someone who does not have graphics training, I was blown away when I did this. Apparently there is this thing called 'path tracing' based rendering that takes care of accurate lighting, as long as you give it a correct specification of the geometry and materials.
Blender is an amazing piece of software; its history goes back 20 years. I had never worked with 3D before and was scared away by all the claims that Blender was too difficult. However, recent versions (2.6 and 2.7) have a completely revamped interface that is much easier to understand. It's also skinnable and scriptable with Python.
There are so many great Blender tutorials on YouTube, and Lynda.com has an excellent Blender essential training course.
I've tried to make a very rough inference based on John Carmack's statement along the lines of "at one order of magnitude improvement on today's GPUs we'll start seeing it in real things, and at two orders of magnitude it'll get competitive in games" (http://youtu.be/P6UKhR0T6cs?t=1h4m30s).
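To turn that statement into timelines, here's a back-of-envelope calculation. The doubling-per-generation assumption is mine (real GPU cadence and gains vary a lot), so treat the numbers as rough:

```python
import math

def generations_needed(speedup, per_gen=2.0):
    """Generations of roughly per_gen-times improvement needed for a speedup."""
    return math.log(speedup) / math.log(per_gen)

# One order of magnitude (10x): about 3.3 doublings.
# Two orders of magnitude (100x): about 6.6 doublings.
print(generations_needed(10))
print(generations_needed(100))
```

If a GPU generation is ~2 years, that's very roughly 7 years to "real things" and 13 to "competitive in games" under this (optimistic) assumption.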
Nvidia's OptiX is built on top of their general-purpose GPU computing platform, CUDA, and it massively accelerates ray casts. People have been using this for GPU-accelerated path tracers.
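For anyone unfamiliar with what "ray casts" means here, the core primitive is a ray-primitive intersection test like the one below. This is just a toy sketch in plain Python; OptiX's real API (CUDA programs, acceleration structures over millions of triangles) looks nothing like this, but it is essentially this test run massively in parallel:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Hit distance along a normalized ray, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(o * d for o, d in zip(oc, direction))      # half-coefficient of t
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return None                                     # ray misses the sphere
    t = -b - math.sqrt(disc)                            # nearer of the two roots
    return t if t > 0.0 else None
```

A GPU path tracer fires millions of these tests per frame, which is why hardware acceleration of the traversal matters so much.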
The renderer used by Ikea is V-Ray, the same renderer we have integrated into our online 3D modeling & rendering tool: http://Clara.io :)
Here are two simple Ikea-like furniture scenes; if you click "Edit Online" you can edit them in your browser (the geometry, the materials, and the lighting setup) as well as render them photoreal via V-Ray:
IKEA has a mobile catalog app which already has a bunch of interactive features like Augmented Reality furniture and a 3D shelf configurator. https://www.youtube.com/watch?v=uaxtLru4-Vw
The largest door in the kitchen is the refrigerator door. Kitchen equipment with front panels matching the furniture panels is quite popular (refrigerators can be bought without front panels; the panels can then be bought together with the furniture to match it exactly).
You can use the IKEA catalog iOS app to show items in your home using augmented reality. It uses the physical catalog as a reference in the camera view to determine size and viewing angle (there was an option for when you don't have the catalog, but I didn't try it to see the alternatives).
Works decently well, enough so that I used it to pick out a TV stand.
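The idea behind using the printed catalog as a size reference is, I assume, something like the similar-triangles estimate below (I don't know IKEA's actual implementation; the focal length and sizes here are made-up numbers for illustration):

```python
def distance_from_reference(focal_px, real_height_m, apparent_height_px):
    """Pinhole-camera similar triangles: distance = f * H / h."""
    return focal_px * real_height_m / apparent_height_px

def apparent_size(focal_px, real_height_m, distance_m):
    """Inverse: how many pixels tall an object appears at that distance."""
    return focal_px * real_height_m / distance_m

# A 0.28 m tall catalog appearing 140 px tall, with a 1000 px focal length,
# puts the floor 2.0 m away; a 0.45 m TV stand there should render 225 px tall.
d = distance_from_reference(1000, 0.28, 140)
stand_px = apparent_size(1000, 0.45, d)
```

Once the distance to the floor is known, every virtual item can be overlaid at a consistent scale, which is exactly what makes the overlay believable.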
There's a very large selection of Ikea models in the SketchUp 3D Warehouse. I'm not sure who keeps things up to date there, but it pretty much fits the bill for what you're asking. It's actually how I designed my study: I ended up rearranging the furniture three times (virtually) before deciding on what I needed to buy, and I'm so glad that rearranging was done on my screen and not on my floor.
Not really having paid much attention to how they did their catalogs in the past, I just kind of presumed they were moving towards 3D instead of 'real' photography... and I guess my assumption has been proved right (makes sense: it's a lot more flexible).
I can only wait for a well-integrated 'select the furniture for your own house' app/site/whatever... which makes me wonder if they're considering some of the opportunities presented by VR or, better, AR (such as Meta and others).
AR overlays of how furniture would look in your own home would be quite neat!
They actually have AR overlays already. What you do is lay the catalog down on the floor, and the iPad app uses it as a size reference to generate the overlay. It's a little finicky in practice but still pretty impressive.
Or the other obvious step (once 3D goggles mature) of replacing the physical shop with an online shop. Or combine the two: instead of you visiting IKEA's virtual shop, the virtual items are sent to you, rendered in 3D, and overlaid on your existing room, so you can walk around the item and see what it looks like in your own room. You then press the "buy" button and it is delivered.
Fun and games then ensue when people figure out how to dump the information from IKEA to their 3D printers.
> Or the other obvious step (once 3D goggles mature) of replacing the physical shop with an online shop.
I'm not sure they'd ever do that. They want you in their stores. Their stores are structured so that you have to go through everything and see everything and activate that "nesting instinct."
"Hmm, I want a chair, but that cutting board is really nice... and there's a knife block that matches it. And I guess I'll get some storage containers too. Might as well get lunch while I'm here."
A virtual store could also deliver in that department, though, in that knife blocks and storage containers could always be situated in the neighbouring department, no matter what the customer was actually looking for. The accessories and decorations in each virtual-store display could also be tailored on a per-customer basis, depending on what Ikea knows about the customer. "Nice table, and I really like the placemats they have used on it..." One can imagine Ikea providing a "buy the lot" option in their payment process.
They have an online store in several countries. I've heard from people who work there that they're not expanding the trial because shipping doesn't work with their slim margins (paying for one person returning a sofa ruins the profitability).
Then they wouldn't have an online store either, and they do. I think an online 3D shopping experiment would work for them. Otherwise the competition will go that way and they would be leapfrogged.
Epic is encouraging all kinds of applications such as architecture simulations and not just video games. I'm interested to see how the engine can be used to do something similar to what Ikea is doing.
I wonder how active the 3D-CG scene is these days. In the mid-2000s there was so much activity on CGsociety (then cgtalk.com). The kind of work people posted there was just out of this world. Absolutely impressive attention to detail. I was an enthusiast too so I would visit the site many times a day.
Of late, I haven't been in touch. Good to see stuff like this on Hacker News.
I wish they had given a bit more information about the actual workflow.
Specifically, I wonder if they leverage the original CAD models? And if so, how are they converted to 3D Studio Max, and if the process is automated in any way?
I went to the Ikea V-Ray talk at Siggraph. They are reusing CAD models, but the key to photorealistic rendering is not the model but the materials. They use a capture and calibration process to feed textures into their V-Ray based shaders in 3ds Max.
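One simple form of that capture-and-calibration step (my guess at the general idea, not IKEA's actual pipeline) is rescaling a captured texture so that a reference gray patch in the photo hits its known reflectance, which removes the camera and lighting bias from the capture:

```python
def calibrate(texture, gray_patch, known_reflectance=0.18):
    """Scale per-channel values so the gray patch averages the known value.

    texture and gray_patch are lists of (r, g, b) tuples in [0, 1];
    0.18 is the standard reflectance of a photographic gray card.
    """
    n = len(gray_patch)
    gains = [known_reflectance / (sum(px[c] for px in gray_patch) / n)
             for c in range(3)]
    return [[min(1.0, px[c] * gains[c]) for c in range(3)] for px in texture]
```

Real material capture is far more involved (flat-field correction, gloss and normal maps, etc.), but this is the flavor of why calibrated textures beat photos pulled straight off a camera.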
Hey, 3D artist here. When you're working with CAD/SolidWorks drawings, you can import them directly into 3ds Max as paths (2D shapes) and work with them in the viewport with accurate dimensions. I would be very surprised if that wasn't the typical Ikea workflow.
Failing that, I've heard of some artists actually whipping out calipers to take measurements from real-world pieces, but it seems like that method would defeat the purpose in this case.
> We use every computer in the building to give power to rendering as soon as they are not being used. As soon as someone goes to a meeting their computer-power is used, and of course there is overnight when people go home.
I'm very curious how they manage the distribution of computation?
Path tracing is highly parallelizable; a ray doesn't need to know anything about its neighbors to be traced (though there are integrators that give better results if more information is available). In practice, each process just gets assigned a part of the picture and works on it until it's done, then moves on to the next part.
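The "each process gets a part of the picture" scheme is usually tile-based: split the frame into tiles and hand them out to whichever machine asks next, so faster machines naturally take more tiles. A minimal sketch of the bookkeeping (my illustration, not any particular renderer's code):

```python
from queue import Queue

def make_tile_queue(width, height, tile=64):
    """Queue of (x, y, w, h) tiles covering a width x height frame.

    Edge tiles are clipped so the tiles exactly tile the frame.
    """
    q = Queue()
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            q.put((x, y, min(tile, width - x), min(tile, height - y)))
    return q
```

Workers then loop: pull a tile, trace all its pixels, send the result back, repeat until the queue is empty. This is also why idle office machines can join and leave a render mid-frame.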
This is only partly true. Path tracing is an embarrassingly parallel problem but only under the assumption that the entire scene description can be accessed.
When a light ray strikes the ceiling it can bounce off towards a vase that is on a diffuse table which scatters the light in all directions. So the calculation for this light ray needs to know the shape and material (BRDF) of all the objects that interact with the ray.
Before sending a ray from the camera into the scene it is unknown which objects are going to be hit along the way, which, as you can imagine, is a difficult problem to optimize for. The usual solution is to just distribute the entire scene.
On a single computer there is no problem; the entire scene is usually present in memory. On multiple computers it is more difficult, since you will end up distributing large amounts of data (scenes can be multiple gigabytes).
It's really just a bandwidth issue; VFX studios do this all the time with their render farms. Textures are the main problem. Product/archviz scenes like the Ikea stuff are generally really clean and don't have that many textures, whereas in VFX everything's dirty and generally very detailed, so you're typically pulling in >300 GB of textures per medium-level scene.
And at least in VFX everything's generally done lazily, so you only read textures as and when you need them, if they're not cached already. There's a bit of overhead to doing this (locking with a global cache, or duplicate memory with a per-thread cache, which is faster since there's no locking), but it solves the problem very nicely. On top of that, the textures are mipmapped, so for things like diffuse rays you only need to pull in very low-res approximations of the image instead of, say, point-sampling 8K images, which helps a lot too.
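For anyone who hasn't met mipmaps: a mip chain is just the texture repeatedly box-filtered down by 2x2, and a ray with a wide footprint reads a coarse level instead of the full-resolution image. A toy sketch (assuming a square, power-of-two, single-channel image; real texture systems such as OpenImageIO are far more sophisticated):

```python
import math

def build_mips(img):
    """List of levels, each a 2x2 box-filtered half of the previous one."""
    mips = [img]
    while len(img) > 1:
        img = [[(img[y][x] + img[y][x + 1]
                 + img[y + 1][x] + img[y + 1][x + 1]) / 4.0
                for x in range(0, len(img[0]), 2)]
               for y in range(0, len(img), 2)]
        mips.append(img)
    return mips

def mip_level(texels_per_sample):
    """Coarser level for wider footprints, clamped at the base level."""
    return max(0, int(math.log2(max(1.0, texels_per_sample))))
```

A diffuse ray whose footprint spans many texels picks a high level, so an 8K texture might contribute only a few kilobytes to that ray, which is what makes the bandwidth numbers above workable.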
The entire concept is very interesting and is a logical extension of the product catalog business (think about the impact of 3D and CG on movies, architecture, etc.).
I've been experimenting with Blender and Sculptris lately, and 3D modelling is quite amazing: a wonderful mix of technical and artistic skills. I wonder if IKEA will ever rethink their large super-store model and move towards smaller stores where you virtually walk into and interact with rooms and furniture.
I was hoping this was about an app to build 3D things by mixing and matching Ikea parts. I know there's at least one community around that idea [1] (and I've done it myself :).
I am amazed at how this could be cheaper and faster than actually doing real-life photos. The scenery and lighting quality is amazing, though. You can't do that in a warehouse full of Ikea products and fake housings either.
I'm also amazed. But if you've ever been on a photoshoot set it becomes a bit more believable. Sometimes the simplest shots can take forever to get right, not to mention the number of people required: photographers, grips, directors, gofers, etc. Factor in the point they made about shipping all the physical stuff to a central location, plus all the different room setups required (e.g. American vs. German vs. Japanese kitchens), and it starts to make more sense.
By the way, instead of selling home furnishings in different colors, for people with Google Glass or similar devices IKEA could just sell an app that colors a furnishing (only in the image projected onto the retina) in the "bought" color whenever the owner looks at the piece, Emerald City style.
- Go through this two part ceramic mug tutorial (30-60 minutes): http://youtu.be/y__uzGKmxt8 ... http://youtu.be/ChPle-aiJuA
Some interesting videos:
- Octane 2.0 renderer: http://youtu.be/gLyhma-kuAw
- Lightwave: http://youtu.be/TAZIvyAJfeM
- Brigade 3.0: http://youtu.be/BpT6MkCeP7Y
- Alex Roman, The Third &amp; The Seventh: http://vimeo.com/7809605
Brigade is an effort towards real-time path tracing, and it's predicted that within 2-3 GPU generations such graphics will be possible in games.