Hacker News | kakali's comments

They also acquired 250 people, whom they are now laying off and Weta is rehiring.

Seems like a disaster of a deal by Unity. Weta just got free money for a snapshot of their code.


Unity apparently had no clue what to do with the toolset when they had the team that built it; you can imagine what they will do with it now, without the team. Weta sure isn't re-buying it, because they already have it. It's literally been a $1.6B gift to Peter Jackson?


I understood that they are still keeping the actual Weta "tools" part, with its employees?

These 265 are just the people who worked in the "professional services" part of the business and were doing actual VFX work for Wētā FX, not working on the tooling part.

Edit: I was wrong, 275 was the entire workforce of Weta Digital when Unity acquired it back in 2021. So yeah.. it seems they just almost literally burnt $1.6 billion..


I worked for Weta a few years before the acquisition, let me just say that I'm not surprised that Unity didn't know what to do with it. It was never a single DCC set of code, but lots of separate tools (many very old) that only sort-of worked together. I imagine they would have to throw everything out and start over to build something on the level of Maya or Houdini.


It's always fascinating to me to watch companies do an acquisition of a technical stack that has no pedigree aligned with their own technical stack.

Software has shape, and if you're acquiring a stack that's wildly different from your own, there's no guarantee that integration will be particularly feasible. You may not even be acquiring the talent that would know where to start with such an integration because you never know what key pieces of the system were built ages ago by somebody who left long before acquisition was even discussed.

I've watched this happen with systems as closely aligned as two Python / JavaScript tech stacks that differed in frameworks, libraries, and wire protocol. It took something like 2 and 1/2 years past executive projections to integrate the offerings.


"Software has shape" - apt!


I worked in the industry and fairly closely with a number of notable Weta Engineers.

I still don't know why you'd spend $1.6 billion on a VFX pipeline. I mean, sure, it gives you cachet, but a lot of the pipeline is mostly asset management rather than the few bits of special sauce that provide artist tooling.

But $1.6 billion is a ridiculous amount of money for a company with a turnover of ~$110 million[1] and pretty thin profit margins, compared to a software company.

Maybe I'm wrong about Weta; they did make Mari, after all, which is pretty special. However, The Foundry didn't pay anywhere near that much for Mari, for what appears to be a similar licensing deal.

[1]https://rocketreach.co/weta-digital-profile_b5c60ffef42e0c50


I'm not familiar with those game engines, but the first search result told me that "Unity is more commonly used to develop mobile or 2D games while Unreal is used to create video games on consoles or next-generation PCs."

I have no idea, this is just my own speculation out of curiosity, but maybe Unity was looking for realistic-rendering know-how to build up a competence against Unreal's market space?

But if that were the case, why let the team fly away? Maybe this was part of the deal with Weta: to sell a snapshot, as the parent comment says, bundled with a group of programmers to train Unity's team temporarily?

I don't know. The numbers are very high; if I were in charge, I wouldn't invest that much money without first sending a couple of emissaries to read the code and verify that the toolset would be useful for the pipeline.

PS: The shareholders owe me a few beers for sowing the seeds of doubt.


> but maybe Unity was looking for realistic-rendering know-how to build up a competence against Unreal's market space?

Unity already has a 3D engine that can match Unreal (more or less, and we won't talk about the fact that there are three rendering pipelines and the deprecated one is the only one that is reliable. Really, we won't talk about it). If anything, Unity used to have some of the best realtime-CG engineers on their payroll. The mobile/2D reputation is mostly historical; a lot of AAA-quality games have been made in Unity.

VFX is a different beast. For the longest time, realtime CG and VFX/movie CG used to be two really separate fields (with a lot of connections, of course; CG is CG). Unreal made waves when it started being used in cinema (The Mandalorian, I think, was the first big name openly saying they were using Unreal in their pipeline). Realtime CG got so good that it can (for some specific effects) be used instead of a traditional VFX pipeline, which is usually slower to iterate in. I think the execs at Unity saw Unreal's "success" and wanted to get a piece of the cake before it was too late. It was not a bad idea, but Unity has been so mismanaged these past few years that it again completely failed at producing anything real, stable, and usable.


A major difference that is no longer a difference is that VFX used to be about creating visual imagery in the context of a forced perspective with live action shots mixed in.

The dynamic background technology being used now for shows like The Mandalorian and Star Trek changed the story. Now VFX teams are being asked to craft the back half of a set as a 3D model that can be blended with live action shot in front of a screen displaying that model in proper perspective based on camera position. So now they have to care about the 3D dynamics of the scene in a way they previously weren't required to... a way that video game engines have been wrestling to the ground for decades.


> The dynamic background technology being used now for shows like The Mandalorian and Star Trek changed the story

This is still very niche compared to normal vfx shots. VFX shots are everywhere.

> So now they have to care about the 3D dynamics of the scene in a way they previously weren't required to

What does "3D dynamics of the scene" mean specifically? Any VFX shot with 3D needs to track the camera, find the ground plane, and maybe map out some simple geometry of the scene.

> A way that video game engines have been wrestling to the ground for decades.

What does this mean?


I mean that a VFX artist working on a thin slice of the world (even with camera tracking, one isn't usually doing full VFX on a 360 or even 180-degree view) doesn't really have to worry too much about whether their scene makes volumetric sense beyond what the camera can see, nor do they have to model a physical space with its own physics going on everywhere in the background.

... but task them with creating a whole virtual set that links to the practical set the actors are acting within, and now questions like the realism of the whole space, light sources that impact the real world, etc. come into play. Game engines have been modeling whole spaces that can be inhabited at any point in the space for far longer than VFX tooling pipelined for post-production has.


> doesn't really have to worry too much about whether their scene makes volumetric sense beyond what the camera can see

This is absolutely not true. Typically there is lidar for an entire set and the camera and the set lidar need to fit together and make sense.

> but task them with creating a whole virtual set that links to the practical set and the actors are acting within, and now questions like the realism of the whole space, light sources that impact the real world, etc. come into play.

This is what camera trackers do. Lights are done by the lighting department.

> Game engines have been modeling whole spaces that can be inhabited at any point in the space for ages longer than VFX tooling pipelined for post-production has.

Photogrammetry is 25-30 years old, and virtual sets have been used for a very long time on green screens.

I'm not sure what point you are trying to make at this point, but the things you're saying aren't true.

Don't mistake the attention that the virtual stage stuff gets for the use it gets. The enabling technology is the large high-resolution screens more than anything, and the point of using them is mostly about reflections, refraction, and to a lesser extent hair and backlighting, which are difficult to make work on green screens.


This. Virtual production is an incredible saving on the cost of film production. It still has problems and expenses, but there was a huge push toward it during COVID. Film production is a big market worldwide.

I would also guess that if you have really good technology for virtual production you have tech that is useful for other things, like tele-presence, simulation, image scanning and so forth.


This has nothing to do with the conversation topic, but I'm curious where you picked up the word "lucubration"? I know there are a vast number of words I've never encountered, but this one was... interesting.


Now I understand. "Lucubration" was a bad suggestion from the DeepL translator and I didn't notice it, which is embarrassing because the meaning is literally the opposite of my intended sense. In fact, I used the translator for that part because I didn't know what English word would fit smoothly. The online dictionary I just checked is also wrong.

I've replaced the word in the message. "Speculation" is perhaps better suited to warning the reader that it is a thought without much research behind it.


Ah. Interesting that the translator would choose such an obscure word. lol. I love vocabulary though, so it was an interesting new word for me to run into.


Unity also got a ton of 3D models and assets, and the well-known Wētā Digital name, but yes it is strange to do this.


Looks like the Digital trademark will be going back to Wētā FX, so Unity isn't even keeping that.


Mindless Billions Anywhere at work..


That's fine, but it's okay to call out the author's bias. They contributed wonderful tests of the limitations of the headset, but the headset is only a failure for the author's use case (AR).


I agree. If you know the author the bias is clear, but it's okay to point it out for people who are not familiar with him.


Having used Hololens vs several passthrough-AR devices (Quest 2 etc) I'd take "poor image quality" over "limited FOV and the need to use it in a darkened room" any day.

Until AR display tech gets significantly better I think AR based on passthrough cameras is the only sensible approach.


That is a valid problem, but with MR passthrough there is also a compute issue. The cameras are not in the same location as the user's eyes: they have a wider parallax than the user's eyes and sit several inches forward. There is no simple transformation to correct the video to match the user's perspective. Instead, a 3D representation of the environment must be created, camera imagery projected onto it, and the result rendered from the position of the user's eye, all done as fast as possible and at the highest rate possible. This explodes in compute with increasing resolution. MQP is doing this with 3 cameras, color compositing, and without the aid of a depth camera.
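For intuition, here's a minimal sketch of that reprojection step, assuming an idealized pinhole model and a perfect per-pixel depth map (function and parameter names are my own; a real headset reconstructs a mesh and does this per frame on dedicated hardware):

```python
import numpy as np

def reproject_to_eye(depth, K_cam, K_eye, T_cam_to_eye):
    """Reproject a camera-view depth map into the eye's viewpoint.

    depth: (H, W) depth in meters as seen by the passthrough camera
    K_cam, K_eye: 3x3 pinhole intrinsics of camera and virtual eye
    T_cam_to_eye: 4x4 rigid transform (camera frame -> eye frame)
    Returns the (u, v) coordinates in the eye image that each camera
    pixel maps to.
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # 3xN
    # Unproject: pixel -> 3D point in the camera's frame
    rays = np.linalg.inv(K_cam) @ pix
    pts_cam = rays * depth.reshape(1, -1)
    # Rigid transform into the eye's frame (the several-inch offset)
    pts_h = np.vstack([pts_cam, np.ones((1, pts_cam.shape[1]))])
    pts_eye = (T_cam_to_eye @ pts_h)[:3]
    # Project into the eye image
    proj = K_eye @ pts_eye
    uv_eye = proj[:2] / proj[2:]
    return uv_eye.reshape(2, H, W)
```

With an identity camera-to-eye transform this maps every pixel back to itself; the expensive part in practice is estimating `depth` without a depth camera and filling the disocclusion holes where the eye can see surfaces the cameras cannot.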


This is the same person who did comics at Google. Their work was a highlight during my time there. Google even used their comics for basic employee training.

I'm surprised folks say that we can't be critical of our own employer. Our employers are not lords or kings. We choose to work for them and can replace them.


> I'm surprised folks say that we can't be critical of our own employer. Our employers are not lords or kings.

No one is saying you can't be critical of your own employer. They are saying there are repercussions to being critical and loss of employment is a common one which should be expected.

> We choose to work for them and can replace them.

They choose to employ us and can replace us.


> No one is saying you can't be critical of your own employer. They are saying there are repercussions to being critical and loss of employment is a common one which should be expected.

Why do you feel that the expected outcome of being critical is being punished instead of addressing the root causes of these issues? I mean how do you expect problems to be pointed out and addressed?


>I mean how do you expect problems to be pointed out and addressed?

Apparently, at Twitter, you don't.


> Apparently, at Twitter, you don't.

Twitter is itching to shed people any way they can, so the problem lies elsewhere.


Most companies I have worked for have a means to point out problems in an internal non-public manner. Once you bypass those to go public you generally lose the trust of your employer. Why would you employ someone you can no longer trust?


Is the status quo today akin to the story of The Emperor's New Clothes?


I think it was clear the person you're replying to is saying that the repercussions for being critical of your employer are what make it effectively 'not allowed'.


> They choose to employ us and can replace us.

It's really a mutual agreement, innit? The relative power in the situation is merely a supply / demand equation, for talent / jobs.


Why is this dead?


Because some people can't accept there are consequences to actions.


I believe you can shatter them with vice grips in an emergency.


"Linear Algebra Done Right" https://linear.axler.net/

There's a compact, free version of the book on that page.


Heed Axler's warning; the compact version is meant as a refresher, not a book to learn from.

> Linear Algebra Abridged is generated from Linear Algebra Done Right (third edition) by excluding all proofs, examples, and exercises, along with most comments. Learning linear algebra without proofs, examples, and exercises is probably impossible. Thus this abridged version should not substitute for the full book. However, this abridged version may be useful to students seeking to review the statements of the main results of linear algebra.

Maybe you previously worked through the full LADR (or some other math-major linear algebra book like the older, classic Halmos), but your memory is fuzzy. That might be a use for it.

Still, I don't know why you'd read through a 150 page refresher rather than the full book. If you can't afford it, the 3rd edition is on libgen like most other textbooks.


The book is not easy. It's more like a traditional textbook, albeit a great one. In my experience, the book is great at showing how seemingly hard concepts can be systematically developed, layer by layer, under the framework of vector space and linear transformation. When reading the book, I often marveled like this: that's it? Wow! A few references to the previously listed theorems and definitions and this is proved?


Axler is not easy. (At least, I did not find it easy.)


I think that book is usually recommended as a good book to learn from after you've already been through a course in linear algebra.


Axler won't really teach you to calculate though.


Axler's approach puts the cart behind the horse, where it belongs. Learning to do row reduction by hand is a really pointless exercise that misses the forest for the trees. But if you want the nitty-gritty on computation, Golub and Van Loan's "Matrix Computations" is pretty classic.


Only if you want to be a real mathematician. Most people don't and are better served by learning to calculate, since that teaches you how the techniques of linear algebra are applied.

You actually do need to learn to do row reduction by hand because it teaches you what is going on. Same reason you need to learn to integrate by hand even though you'll never do it again after your coursework.

BTW, I learned linear algebra from Axler first, so I have some basis for comparison. There is a reason Axler is not considered an introductory text even though it is not very hard.
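For what it's worth, the by-hand procedure being defended here is small enough to sketch. A toy Python version (my own illustration, not from Axler or any book mentioned, and not a numerically robust routine):

```python
def row_reduce(A):
    """Gaussian elimination to reduced row echelon form (RREF).

    A is a list of lists of floats; returns a new matrix in RREF.
    Mirrors the by-hand steps: pick a pivot, scale its row to a
    leading 1, and clear the pivot's column everywhere else.
    """
    M = [row[:] for row in A]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        if r >= rows:
            break
        # Partial pivoting: take the row with the largest entry in column c
        pivot = max(range(r, rows), key=lambda i: abs(M[i][c]))
        if abs(M[pivot][c]) < 1e-12:
            continue  # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]
        # Scale the pivot row so its leading entry is 1
        lead = M[r][c]
        M[r] = [x / lead for x in M[r]]
        # Eliminate the column entry in every other row
        for i in range(rows):
            if i != r:
                f = M[i][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return M
```

Working a few of these by hand is exactly what builds the intuition; the code just pins down the algorithm.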


https://pluspool.org/pool/design/

Seems like it is a giant strainer. The walls are permeable and the river is supposed to push the water through the pool/filter to produce clean water.


They say it's "like a giant strainer dropped in the river." I love the image they have there; they make it look like a giant deep-fryer basket, handle and everything!


Also helps the consumer. My ISP doesn't offer unlimited data; Covid-19 has me hitting caps from being home so much.


Isn't that what VecLib is for? Also there are other math libraries to use on ARM. I think we use EigenBLAS in our products.


This article opens with a mistake: it's 7 yuan per dollar, not the other way around.


Whoops, good catch! A little dyslexia, I guess. Thanks!

