Unreal Engine 5 enters Early Access (unrealengine.com)
183 points by meheleventyone on May 26, 2021 | 152 comments



I've used both Unreal and Unity for hobby projects, and as a developer I prefer Unreal Engine.

Unfortunately, I've been unable to use Unreal for game jams. The reason? Collaboration! I tried to get my team set up with Unreal Engine and git, but it ended up being a disaster where the team didn't have the time to figure it out. I also don't want to host my own subversion or perforce, and I also don't want to pay an arm and a leg for a hosted version.

On Unity, I pay $30/mo. You scroll to the "collaboration" tab, hit "sync" and your changes are in the cloud. It works great; my artists were immediately able to start syncing their changes.

I think in general Unreal is a more "professional, big teams" tool, and does not target the smaller market.


Unreal is built with Perforce; Epic Games' workflows and tooling are built around it. It is rock solid for managing large binary assets. I would highly recommend Dan Bloch's publications from Google about it: https://static.googleusercontent.com/media/research.google.c... and https://research.google/pubs/pub39983/

Yes, I know that they now use Piper.


Most game studios don't use git to manage their games. It's becoming more common, but it is by no means a standard in games. The git integration in UE4 is very, very bad. It uses Perforce terminology to describe git concepts, often to very confusing ends.


For good reason. Git is not well suited for games. Yes, there are solutions to work around its issues, but it's not a natural fit at all.


With git LFS it works fine for game dev. The main thing it still fails at is the artist workflow. Git's solution for checking out single files or ignoring folders is still experimental, and no GUI supports it yet.

I love git but P4 is still better if your workflow is on large single files.
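
For anyone setting this up, this is roughly what the LFS side looks like: a .gitattributes that routes large binary asset types through LFS (the patterns below are just illustrative, add whatever your project actually uses).

    # .gitattributes -- route big binary asset types through Git LFS
    # (illustrative patterns; adjust to your project)
    *.uasset filter=lfs diff=lfs merge=lfs -text
    *.umap   filter=lfs diff=lfs merge=lfs -text
    *.fbx    filter=lfs diff=lfs merge=lfs -text
    *.png    filter=lfs diff=lfs merge=lfs -text
    *.wav    filter=lfs diff=lfs merge=lfs -text

Run "git lfs install" once per clone; "git lfs track" will append these entries for you.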


Git LFS is not bad if that's the only thing you're referring to. I've used Git for plenty of games and never had a single issue.


Yep, and perforce is expensive!


What do most game studios use?


Perforce is pretty standard. Plastic SCM has a handful of enthusiastic users but Perforce is more common. Some studios use SVN because they can self-host it and managing art assets in git is really painful.


What's the MVP equivalent of a game? If you want to make a startup, even if you are planning to have a whole team of devops etc. in two years, you can start small, doing an MVP with one person (or two).

I always tried to do something similar, but it seems impossible in videogames unless you are aiming at a 2D mobile game.


An MVP in the gaming space is a complete demonstration of your core gameplay loop. For example, an MVP Pokemon game would have one urban area and one area with tall grass, 2-3 Pokemon, 2-3 NPCs with dialogue, 2-3 NPCs that challenge you to battles, a working battle system with 2-3 moves (no need to do stuff like status effects, buffs, types, resistances, etc), and wild random encounters. It's quite a bit more comprehensive than an MVP in a consumer product, but even the description above can be cut down substantially. An RPG MVP, for example, could be just a player character walking around a city.

All the graphics for the game above can use placeholders, or even solid colors.


Interesting comparing the answers here that are for a "looks-like" vs a "works-like" prototype, ref: https://predictabledesigns.com/the-essential-guide-to-protot...

Having no experience in gamedev, I could imagine that as in other fields these milestones are a bit orthogonal. You need a works-like prototype for internal use to confirm the basic questions like "is it fun" before committing to polish, and you need that vertical slice looks-like prototype to do demos, make a sizzle reel for Kickstarter, convince investors, etc.

The excellent NoClip doc on Horizon Zero Dawn talks about them actually making a handful of prototypes that explored various aspects of the game's mechanics (combat, traversal, etc) and then having to figure out how to integrate those concepts into a single entity. There are a few clips at https://youtu.be/h9tLcD1r-6w?t=1186 that show a very early demo of the Thunderjaw concept, with the destroyable parts just as big colored boxes.

On the other side of things, it's wild when you look at a game like Hollow Knight: Silksong, for which there was a reveal trailer over two years ago that showed seemingly dozens of fully finished environments, enemy encounters, etc— but in terms of the overall stage of the project, it must have been in alpha or pre-alpha at that point (unless it's encountered some significant unforeseen development barriers): https://www.youtube.com/watch?v=pFAknD_9U7c


You go for a looks-like prototype when you have a team and want to inspire them to move forward, when you want to impress investors or when you are planning a Kickstarter or similar. You go for a works-like prototype when you're experimenting with concepts or want to try out something new. You're completely correct in your assessment.


IIRC Silksong was originally planned as a DLC for the original instead of a sequel, which may be why they had so much ready


True, and maybe there's been a significant reworking of the overall structure since that teaser was produced. At the same time, you can't shake the feeling that that video is pitching something available in a few months, not something that's years away.


That depends on the size of your company, but for AA/AAA type games, this is usually a so-called "Vertical slice". It's a demo that shows off a small piece of a game working fully (so demonstrating main systems, UI, etc.) that is then used to pitch full development to publishers.

It's roughly equivalent to a pilot for a TV series: https://askagamedev.tumblr.com/post/77406994278/game-develop...

EDIT: I just remembered the other alternative - Early Access. This allows the developers to sell the game before it's complete and solicit gamer testing/feedback during the development process while iterating. It's a popular approach for smaller studios.


There’s traditionally a milestone during production called “vertical slice”, where the developer would present a segment of the game that’s meant to represent the final product. They’re not necessarily considered best practice, since a developer might have to pantomime that the game’s in a finished state when they still have a lot of work to do. Tim Rogers describes it akin to making a cake one slice at a time.

Mark Cerny, a developer for a lot of Sega and Sony titles, always suggests an experimental pre-production phase and a structured production. The former is meant to figure out the requirements for the latter and clarify the nature of the product.


The market is really harsh so I'm not sure you can actually make an MVP without making a polished, finished game.

You can noodle on mechanics, art direction, story, etc with a small team before you go into production but you probably won't have an actual viable product coming out of that stage.


Well, there is Steam Early Access - mind you, games need to be at least partly polished and quite playable before entering there. But the public there is more forgiving of "early beta" quality, sometimes even alpha if the game is obviously a gem waiting to be polished.


What you do is build your gameplay with the absolute minimum of graphics. For example if you want to build a platformer, start with a red square for your player character, blue squares for enemies, brown boxes for terrain. Do the physics first and add your unique gameplay ideas, but with the absolute minimum of artwork. No sound or music (unless your gameplay ideas require them). Once you have something fun to play with in this state, then you can add art.


That's not normally an MVP game, which needs to be the minimum viable product. Of course, this is targeted at the audience. If your audience wants a game world of blocks, that might work :-).

As many others have pointed out, a vertical slice is the term that applies to this vertical of making games. So, even the terms are audience specific - MVP for owners and Vertical Slice for staff.


Yes, but that doesn't sound like an MVP. An MVP (in this case) should be playable. No one will play a platformer with a red square.



Has a lot of sound and music though. It's an animated square with jump anims etc. A lot more work went into that game than what's described above.


It's sad for Super Meat Boy that the comments underneath yours keep insisting it's "just a platformer with a red square".


The point is that Super Meat Boy would still be a fun game even if the player character was just a red square. It wouldn't have the same commercial success without the artwork and music, but you can easily see how you could make an MVP and have it still be a fun game, then add the graphics after.


Super Meat Boy happened to have an extremely fast dev cycle. It only took 18 months to make the engine and game! https://www.gamasutra.com/view/feature/134717/postmortem_tea...


I get what you're saying, but if you read the replies, that's not what they're saying. Exact quote: "Super Meat Boy is basically that" (meaning, basically a red square).


Sure they will, if the gameplay is compelling. Maybe they won't pay full price for it (though there are many examples of games with very basic graphics that sell just fine), but an MVP doesn't need to be profitable. What you need at the MVP stage are playtesters who you can validate your gameplay ideas on.


You are confusing MVP (Minimum Viable Product) with MCP (Minimum Compelling Product). The red square is enough to enable key decision makers to assess the viability of the product. It is not intended to be a compelling product in that state.


Super Meat Boy did exactly that:

https://en.wikipedia.org/wiki/Super_Meat_Boy


I recently played Super Meat Boy. It's a great game and the character is basically a red square.


Often it's a vertical slice with gameplay mostly done and assets of medium to high quality. Since assets are expensive to make, this allows you to build a mostly complete product (without cutscenes and 50 levels) without too much effort.

You often don't ship this MVP to the public but use it to attract publishers or to demo at game conferences.

[Oped below]

So much of what makes a game good is art and feel based so testing early and often whether you're on the right track is important. Game studios that are smaller have to do this. Larger ones can take the hit if they make a bad call in favor of controlling the marketing/presentation more carefully. I'm not entirely sure how they test. Similar to developers I think most people in the game development industry have good or reasonable instincts about games that would work.


> So much of what makes a game good is art and feel based so testing early and often whether you're on the right track is important. Game studios that are smaller have to do this. Larger ones can take the hit if they make a bad call in favor of controlling the marketing/presentation more carefully. I'm not entirely sure how they test. Similar to developers I think most people in the game development industry have good or reasonable instincts about games that would work.

I think I’m in the minority, but I really care very little about the art and feel; I’m die-hard about the mechanics of the game. I hear people ranting and raving about the “cinematic quality of the game art” and I find myself thinking, “I mean, that’s nice, but there are all of these glaring issues with weapon balance and map design, the net code favors laggy players, the time-to-kill is so low that network rather than skill dominates the outcome of almost every confrontation, and the server randomly disconnects people...”. I play a lot of first person shooters, and I rarely care whether the “skin” of the game is futuristic, present day, WWII era, etc as long as the mechanics are right.


Never saw the responses. I agree with you that the mechanics come first. Game feel is so important. Graphics don’t need to be amazing for good game feel.


I'm with you there. I (not a game dev) imagine that balance is hard to find, particularly when choosing the right moment to release. I remember playing "Age of Conan" (the mmo) when it was released and in my opinion while the graphics were good, the game-play was hilariously out of balance (or buggy). My younger self found exploiting the lopsided game-play very enjoyable, but ultimately didn't stick around for too long. If the balance was reversed (more polished game mechanics, less polished art) my friends and I probably would have made it a staple.

But to each their own! It's nice to see the variety available from movie-like experiences to pixel art with all ranges of game mechanics.


What you're looking for is called a "Vertical Slice" in games. The systems you intend to build done as mocks or stubs, used to convey the overall sense a game is meant to convey. You have to build a lot, but it serves the same purpose of checking to see if people actually like a game/its mechanics. Also, in a vertical slice, you can use unpolished game assets or just graybox everything. The point is the "feel", not necessarily any individual look or mechanic.


Here's a good video from an excellent series:

Making Your First Game: Minimum Viable Product - Scope Small, Start Right - Extra Credits https://www.youtube.com/watch?v=UvCri1tqIxQ

It's been a while since I've watched it, but I think you should take MVP to mean "what's the smallest game I can build to tell whether it is fun" and not "what is the smallest thing I can build that people will buy".


You are correct that the purpose of the MVP (Minimum Viable Product) is to test if the idea is viable. As you point out you can generally assess the viability of a product long before you have a compelling product (Minimum Compelling Product = MCP)


Interesting. I've never heard of an MCP before. I like the idea.


We are working on a 3d mobile game and we have both the concept of mvp and vertical slices.

The vertical slice is the product we show to management. It's crude and only has the most core elements: no music, very few sounds, no introduction. Something similar to a pre-alpha product that you can play with.

MVP in our company is like the beta products that you put on the stores, those early access ones. You get full audio, polished visuals, full FTUE, the means of payment, etc. Basically a full game but with a very limited amount of gameplay content. Say Doom, but only with E1M1 and E1M2.

Then we grow from there and add a lot of things if it looks good. We first push the MVP (early access) to Tier-3 countries and then gradually open access to all countries if the revenue looks good. We kill it if it doesn't show a positive return after some tweaking. The majority of the programming frameworks are done at this stage, but new content still needs code here and there. Eventually you plan new major features, e.g. PvP, that need new frameworks, but that's probably months after the MVP.


MVP implies sales stretched over time, which is unlike traditional AAA games and the movie-industry processes they copy.

An MVP for a game is the cheapest playable version of the game you intend to make. It's hard to say anything else because games are an even more general category than apps: you might find out that you want a physical boardgame.


Really depends on who you are selling to. If you look at things as a startup you are selling to both investors and your customers.

From a games perspective the customer is actually much more picky than investors and the market is hard to get discovered in. Your MVP is a finished game.

For industry investors (publishers) it depends on your pedigree and connections to a large extent. No-name, first-time indies will find it hard to get even a bad publishing deal without a nearly complete game. Well-connected industry veterans will have a deal worth millions based on a pitch alone, even prior to starting a studio.

For more traditional startup investors you need pedigree, an idea of how you'll scale, some evidence that you can actually build the thing and to be in a segment hitting the zeitgeist. For example over the last few years doing something with mobile GaaS, VR/AR, blockchain, cloud streaming/native and so on.


What's the MVP of a painting? Or a film?

I'd argue that if your goal is MVP, you probably aren't making art. Think sketches and studies and not products imo. That's the path to lasting greatness in gamedev.


If you're trying to sell it, it's a product.

An MVP of a painting is a sketch. An MVP of a film is a smaller film with the less important parts of the plot stripped, made with a phone camera and paper decorations in the backyard.


> If you're trying to sell it, it's a product.

That's kind of my point. Not that no art can be made to be sold. But art made with sales as a primary success criterion is only a subset of art-space. And a lot of the great art out there (video games included) did not originate from that subset.


I suppose for a film it's something like a script or a storyboard.

For a painting, I'm not sure because I'm not an artist, but is the investment large?


Early access? Some of the games released this way are very rough, but have at least a level or two worth of gameplay.


The core loop (see the sketch after this list). For example:

- plant seed

- harvest seed

- use income to buy better seeds

rinse repeat
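
A throwaway version of that loop in plain C++, just to show how small a testable core loop can be (all names and numbers are made up):

    #include <cstdio>

    int main() {
        int money = 10;       // starting cash (arbitrary)
        int seedCost = 5;     // cost of one seed
        int cropValue = 8;    // sale price of the harvested crop

        for (int day = 1; day <= 5; ++day) {
            int planted = money / seedCost;   // plant as many seeds as we can afford
            money -= planted * seedCost;
            money += planted * cropValue;     // harvest and sell
            std::printf("day %d: planted %d, money now %d\n", day, planted, money);
            cropValue += 1;                   // "better seeds" pay off more over time
        }
        return 0;
    }

If pushing those numbers up is already a little satisfying, the loop is probably worth building out; if not, no amount of art will save it.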


I am curious about Unreal Verse, I hope there's a book or tutorial included. It's not on the official UE5 documentation page. https://forums.unrealengine.com/t/verse-the-new-unreal-scrip...


I'm a bit vague on what this is. Is this a part of UE5? Is it a side project?

My main problem with UE is C++, I don't want to even try to download files or read json in a blueprint.
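
For what it's worth, reading JSON from UE C++ isn't too bad once the "Json" module is added to your Build.cs dependencies. A rough sketch (the file path and field name are made up for illustration):

    // Assumes "Json" is in PublicDependencyModuleNames in your *.Build.cs
    #include "Misc/FileHelper.h"
    #include "Dom/JsonObject.h"
    #include "Serialization/JsonReader.h"
    #include "Serialization/JsonSerializer.h"

    bool LoadPlayerName(const FString& FilePath, FString& OutName)
    {
        FString JsonText;
        if (!FFileHelper::LoadFileToString(JsonText, *FilePath))   // slurp the file
        {
            return false;
        }

        TSharedPtr<FJsonObject> Root;
        TSharedRef<TJsonReader<>> Reader = TJsonReaderFactory<>::Create(JsonText);
        if (!FJsonSerializer::Deserialize(Reader, Root) || !Root.IsValid())
        {
            return false;
        }

        // "playerName" is a hypothetical field, purely for illustration
        return Root->TryGetStringField(TEXT("playerName"), OutName);
    }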


I really hope we get more technical information on how Lumen and Nanite work, and additionally that Epic doesn't patent the techniques in either of them. A patent on either would make me so sad, 20 years is really long in software, absent Epic's amazing work I expect we would have something else like it in like 3 years given what we've seen in things like http://dreams.mediamolecule.com/.


A lot of information has been released here: https://docs.unrealengine.com/5.0/en-US/RenderingFeatures/Na... and here https://docs.unrealengine.com/5.0/en-US/RenderingFeatures/Lu...

There is also a source code release if you want to dive into that level of detail: https://github.com/EpicGames/UnrealEngine/releases/tag/5.0.0...

I don't think the released details are that surprising to those working on realtime computer graphics, but the engineering details and tradeoffs are certainly interesting. Epic has the budget and business case to allocate a team, including some of the best graphics engineers in the industry, to do R&D for over a year to make this a reality.


So Nanite is just traditional LOD baking implemented in a holistic and automatic way?

The major difference seems to be they've done the work end to end to handle all the occlusion corner cases as well as a sophisticated mesh and texture streaming implementation that targets modern SSDs.


It's not traditional LOD baking. There's no LOD baking. It's the new rasterization system doing the whole work.

And most of us don't use fast SSDs like the PS5's, yet it works really well. The engineers also said it works just fine with slower HDDs, because they don't stream meshes on every camera movement; it's a continuous setup.


I'm not an expert on this, but there seems to be a custom GPU renderer optimized for dense triangle meshes, with its own occlusion pass. The LOD is also calculated based on clusters, multiple per mesh, with a way to fix seams between clusters at different levels. This works best with very dense meshes such as those from photogrammetry or ZBrush sculpts.


Critical comments on this thread are receiving an unusual amount of downvotes in my opinion.


> These comments are astroturfed af. Even a slightly critical comment is receiving tonnes of downvotes one minute after being posted.

You broke the site guidelines badly with this, as did everyone who upvoted it. Please read the rules: https://news.ycombinator.com/newsguidelines.html. They contain:

"Please don't post insinuations about astroturfing, shilling, brigading, foreign agents and the like. It degrades discussion and is usually mistaken. If you're worried about abuse, email hn@ycombinator.com and we'll look at the data."

We have that rule for good reason, as I've explained ad nauseam over many years:

https://hn.algolia.com/?sort=byDate&dateRange=all&type=comme...

including yesterday:

https://news.ycombinator.com/item?id=27271356

I looked at the data and saw no evidence that the comments are astroturfed, let alone "af". Perhaps I'm missing the obvious? In that case these astroturfers must be devilishly clever—enough not to leave any of the traces that we look for—and also dumb enough to leave comments that readers will pick out as obviously astroturfed.

More likely, though, you were doing what internet users commonly do, which is vastly over-weight the things they dislike and then read sinister interpretations into them. (See https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que... for many past explanations.) The overwhelming majority of these interpretations have literally zero evidence, and they're deeply poisonous to forum threads. In the future, if you don't have evidence, please don't post any of these; and if you do have evidence, please email hn@ycombinator.com so we can look into it. But people having different tastes in game engines does not count as evidence.


I've edited my comment, still not sure it's compliant to the guidelines but probably better than the original.


I appreciate your wish to be more in keeping with the guidelines! And I hope I didn't come down on you too hard. The problem is really a co-creation between comments and upvotes, and the upvotes do the greater damage—but there's no way to reply directly to them.

Your edit is still not in keeping though, because if you read https://news.ycombinator.com/newsguidelines.html to the second-last rule, you'll find: "Please don't comment about the voting on comments. It never does any good, and it makes boring reading."

Moreover, from what I've looked at in this thread, your comment appears to be false. Which critical comments are you talking about specifically?


If you think this is 'conspiracy' talk - enjoy this quote from Satya Nadella:

> In fact, this morning, I was reading a news article in Hacker News, which is a community where we have been working hard to make sure that Azure is growing in popularity and I was pleasantly surprised to see that we have made a lot of progress in some sense that at least basically said that we are neck to neck with Amazon when it comes to even lead developers as represented in that community. So we have more work to do, but we are making progress on all dimensions.

https://sg.finance.yahoo.com/news/microsoft-corp-msft-q1-201...

EDIT: I don't mean this to be an indictment of the moderation on HN - on the contrary, I think an amazing job is done to balance out this stuff. It's just crazy to realize that corporate efforts on HN are important enough to mention on an earnings call!


Wow, how brazen.

Everyone knows this happens on any forum with sufficient popularity, but it's striking to see it openly admitted.


If you're worried about abuse, email hn@ycombinator.com and they'll look at the data.


Including that one, which was downvoted three times in 0 minutes. Is that natural?

I suppose it is. It'd be more shocking if companies weren't shilling on these platforms. I have to admit I haven't been countering this with upvotes on my part, which is a fairly simple solution.

In retrospect astroturfing [0] is not the appropriate term, previous discussion on that here [1].

Shilling on the other hand is the correct term - but there doesn't appear to be a meaningful way to distinguish between paid and unpaid shills as an observer.

Perhaps my comment should've been more along the lines of "there are dissenting voices that have been buried," which on Reddit would be a callout that says "sort by controversial". Unfortunately, neither is useful for comments that aren't top-level.

[0] https://en.wikipedia.org/wiki/Astroturfing [1] https://news.ycombinator.com/item?id=19418177


You broke the site guidelines with this comment too. Please read them to the end: https://news.ycombinator.com/newsguidelines.html.


I presume the OP's comment is getting downvoted because they made an unsubstantiated accusation of shilling - pretty sure that's against the HN rules, but in any case HN users tend to take a dim view of this behaviour.


Looking through the release info, it states that Linux doesn't (currently) have Nanite or Lumen support. Nanite might be tightly coupled to the I/O subsystem on Windows (a bit rough, but it's.. understandable, almost..), but I'm a bit perplexed as to why their lighting engine is a Windows-only product.

Oh, they've totally released the source code for it - guess we can find out!


It's probably just not implemented for anything but D3D12 and the Playstation graphics API. Those are the priority targets for high-end game content.


Lumen seems to do light accumulation by raytracing, which in turn requires Direct3D/RTX, hence Windows (for now).


Neither Nanite nor Lumen use raytracing.

Source: they said so, somewhere.


I am very confident Lumen uses raytracing to accumulate global illumination. Notice the delay when the light direction changes. No rays, no GI.


I've looked it up [1]:

"Lumen uses ray tracing to solve indirect lighting, but not triangle ray tracing," explains Daniel Wright, technical director of graphics at Epic. "Lumen traces rays against a scene representation consisting of signed distance fields, voxels and height fields. As a result, it requires no special ray tracing hardware."

So, it technically does use a kind of raytracing, but not the hardware triangle raytracing that requires DXR.

[1] https://www.eurogamer.net/articles/digitalfoundry-2020-unrea...

Distance-field raytracing isn't a new thing in UE either, they've been using it for shadows and AO for a while.
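
To give a flavor of what "tracing rays against signed distance fields" means, here is the generic sphere-tracing idea in toy form (this is not Epic's implementation, just the standard technique against a one-sphere scene):

    #include <cmath>

    struct Vec3 { float x, y, z; };

    static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3 mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
    static float len(Vec3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

    // Toy scene SDF: signed distance to a unit sphere at the origin.
    static float sceneSDF(Vec3 p) { return len(p) - 1.0f; }

    // Sphere tracing: advance the ray by the distance the SDF guarantees is empty.
    // Returns true (and the hit distance) if the ray hits the scene within maxDist.
    bool traceSDF(Vec3 origin, Vec3 dir, float maxDist, float& hitT)
    {
        float t = 0.0f;
        for (int i = 0; i < 128 && t < maxDist; ++i) {
            float d = sceneSDF(add(origin, mul(dir, t)));
            if (d < 1e-3f) { hitT = t; return true; }   // close enough: treat as a hit
            t += d;                                     // always a safe step size
        }
        return false;
    }

The appeal is that the cost depends on the distance-field representation rather than the triangle count, which is presumably why no special ray tracing hardware is needed.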


https://docs.unrealengine.com/5.0/en-US/RenderingFeatures/Lu...

„Lumen uses Software Ray Tracing through Signed Distance Fields by default, but can achieve higher quality on supporting video cards when Hardware Ray Tracing is enabled.“


„ Hardware Ray Tracing also scales up better to higher qualities — it intersects against the actual triangles and has the option to evaluate lighting at the ray hit instead of the lower quality Surface Cache. “


The joy of testing a new version of an engine, spoiler: everything breaks.


Showcase trailer for anyone who missed it:

https://www.youtube.com/watch?v=qC5KtatMcUw


The sample project is 100Gb... I think I am going to need a new hard drive.


100GB of rocks.

I love the notion of their Nanite engine basically removing concern about LoD (though we've heard that pitch over many generations of engines), and it looks like they've finally improved on what was a terrible, clunky, extremely dated UI.

What I don't love is that it leans into going crazy with assets. Already games are getting simply ridiculous, with new games hitting 300GB, many with an unending stream of enormous patches (looking at CoD Warzone).

Instead of 100GB of rock meshes and textures, isn't there a benefit to procedural on-demand generation for stuff like this?


Since everything is just from megascans, it might make more sense to have 300gb of photorealistic assets stored on every next-gen device, and (UE5) games can just commonly load from them.


Yeah, at a certain point, it probably makes sense to have a giant library of universal assets like rocks and grass and trees. There's no reason for each outdoor game to keep reinventing the universe.


So basically you’re asking for a node_modules for games!


More like a /usr/lib shared across games on the same console


This is Unreal/C++ we're talking about. If the size of some field somewhere changes, your 300GB will need to be re-cooked...


You can do that with Substance. Store images in a library and procedurally generate textures at runtime.


"and it looks like they're finally improved on what was a terrible, clunky, extremely dated UI." doesn't hold true, it's mostly the same UI with a dark reskin. Certain editor hot paths have been reworked and that's about it. The editor is almost exclusively written in Slate, Epic's in-house Window framework/general GUI module. Slate's got a pretty interesting nested MACRO system. Most definitely not data-driven and pretty much as hard-coded as it gets, redesigning the editor for real would be difficult to say the least. In reality most experienced developers don't want a new design, they are happy with the workflow they have and I have to agree with them. I'm generally happy with the "if it ain't broke don't fix it" reskin decision.


> isn't there a benefit to procedural on-demand generation for stuff like this?

Like how demoscenes do it? I would think a huge trade-off is the performance hit (and potential delay / pop-in) of the generation.


In a way, procedural generation can be thought of as extreme compression!
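
In the sense that a few bytes of parameters stand in for megabytes of data. A toy sketch, nothing engine-specific: the seed plus a hash function is the entire "asset", and it expands to as much terrain as you care to sample.

    #include <cstdint>
    #include <cstdio>

    // Tiny hash-based value noise: the whole "asset" is the seed plus this function.
    static float hashToUnit(uint32_t x, uint32_t y, uint32_t seed)
    {
        uint32_t h = x * 374761393u + y * 668265263u + seed * 2246822519u;
        h = (h ^ (h >> 13)) * 1274126177u;
        return (h ^ (h >> 16)) / 4294967295.0f;   // map to [0, 1]
    }

    int main()
    {
        const uint32_t seed = 42;              // a handful of bytes...
        for (uint32_t y = 0; y < 4; ++y) {     // ...expands into as much heightfield as you like
            for (uint32_t x = 0; x < 4; ++x)
                std::printf("%.2f ", hashToUnit(x, y, seed));
            std::printf("\n");
        }
        return 0;
    }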


What are they storing that the project is that big?


It's a Chia mining plot and a few megabytes of demo project :)


Source high res texture and model assets are quite large.


Graphics and audio are almost always the culprit for large sized games.


Can you compile and run the sample project and have something to walk around in?


Yes. Your first compile may take an hour. It took about an hour and a half to get to a running sample game. You get all the source code for what goes on the target machine. It's here, on GitHub.[1]

I tried to load the demo used for the UE5 videos, but that's only loadable through the "Epic Games Launcher", which is Windows-only. It's apparently possible to bash that into running under Wine, but I don't have time to make that work right now.

For those who don't know what the deal is with Unreal Engine, there's no charge until you get the first million dollars in revenue. Then they take 5%. The nature of the game industry is that the top 50 or so titles generate almost all the revenue. So, rather than bothering with restrictive licenses for all the little guys, Epic just waits until a successful game starts getting media coverage and shows up on the industry charts, then has their billing department contact the game company.

[1] https://github.com/EpicGames/UnrealEngine/releases/tag/5.0.0...
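
To put numbers on that 5%, and assuming (as the published terms describe, worth double-checking against Epic's current EULA) that it applies only to gross revenue above the $1M threshold:

    #include <algorithm>
    #include <cstdio>

    // Royalty owed under a "5% of gross revenue above $1M" scheme.
    double RoyaltyOwed(double grossRevenue)
    {
        const double threshold = 1000000.0;
        return 0.05 * std::max(0.0, grossRevenue - threshold);
    }

    int main()
    {
        // A title grossing $3M would owe 5% of $2M = $100,000.
        std::printf("$%.0f\n", RoyaltyOwed(3000000.0));
        return 0;
    }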


You don't need the source code to make an executable version of the game. And it won't take an hour on a decent machine.


Is there any research published detailing how the Nanite "virtualized geometry" system works?


There is this now, someone trying to explain it after looking at it in RenderDoc:

http://www.elopezr.com/a-macro-view-of-nanite/

There's a little bit to be learned from the official documentation, too:

https://docs.unrealengine.com/5.0/en-US/RenderingFeatures/Na...


Glad to see that the Epic/Apple war hasn't boiled over to this.

From the website:

"What platforms does UE5 Early Access support?"

"Unreal Engine 5 Early Access broadly supports the same platforms as UE4—next‑generation consoles, current‐generation consoles, Windows, macOS, Linux, iOS, and Android..."


It almost did:

"The motion centers on iOS support for the Unreal Engine, which Apple has threatened to revoke as part of Epic’s broader loss of developer privileges. Epic has asked the court to restrain Apple from revoking that access while the case is ongoing."

https://www.theverge.com/2020/8/23/21397369/epic-apple-fortn...


Apple initially tried to ban UE too, but the court granted a preliminary injunction decoupling UE from Epic, since they are under different developer accounts belonging to two separately registered companies.


Same payment information though(!)


30 percent revenue cut is still better than 100 percent missed sale.


Will there ever be another Unreal Tournament, or are we stuck with Fortnite?


There was going to be, but it didn't work out: https://en.wikipedia.org/wiki/Unreal_Tournament_(cancelled_v...

The "arena shooter" genre is kind of dead these days, sadly.


The fanbase for these games can't be large anymore, sadly. My impression back then was that the new UT was always going to be a hobby project for Epic, at best a marketing vehicle for the engine. They never committed a full team, and a lot of the weapon design was outsourced to the community. So I don't think it "didn't work out" in any way that they hadn't already anticipated. It's probably just that when they hit the jackpot with Fortnite, it was hard to justify such a niche project.


VALORANT (a Counter-Strike / Fortnite mashup) in Deathmatch mode certainly feels like the old arena shooters (Quake/UT). It's fast and frantic, however the style is different as you don't spend much time picking up ammo/weapons and the spawning behavior after death means there's ALWAYS someone right around the corner.


Do they still have that kernel-level anti-cheat thing? I've wanted to try Valorant but don't feel comfortable letting a Chinese-owned company have kernel-level access to my machine.


You mean this system tray thingy that warns me in very stern language that I'll need to reboot if I close it? Yeah it's still there.


In Valorant (and Counter-Strike) you move at a snail's pace compared to Quake/UT; it really does not feel like the same kind of game.


I've been playing Hyper Dash on Quest and it is actually really fun. Maps are well laid out and the movement mechanics are unique yet intuitive. Good community as well. It's the first arena shooter in years that has made me want to play again and again.


I have a group of friends that I play Xonotic[1] with occasionally. Free-software, easy to get started, cross-platform. It scratches the UT/ Quake3 itch without having to deal with CD-keys and backwards compatibility with friends.

[1] https://xonotic.org/


I think games like Unreal Tournament 2004 and Quake III Arena are reflections of a different market. Production costs and expectations around sales/RoI are a lot different from the late 90s.


UT2k4 has virtually anything you could possibly want out of a UT game.


People still play that old game?


Only if they don't want to be stuck with fortnite.


Between virtualized geometry and raytracing the amount of work to make a game is steadily going down. Artists can simply make objects at full quality (no more manual LOD creation!) and put stuff in a scene and the lights _just work_.


I'm not an artist and I still can't wait to get a closer look at their new geometry renderer tech. The UE5 early access branch is already on github, this is gonna be wild. I also saw they are planning to deprecate raytracing and... use lumen for everything? Wonder what Nvidia says to that. But with the next-gen consoles not using RTX cards I guess this is a reasonable move by epic.


Some specific workflows may improve and be more efficient here and there, but it would be the first time in history if this would mean games are cheaper to produce, instead the scope will increase.

As for LOD creation, there are things like https://www.simplygon.com/ already.


Although I think it's extremely important to keep reducing the amount of work artists have to do, I do not expect a full ZBrush model without any retopology to go into the engine and just work, especially because most of the time you also need to take care of animation and texturing/UVs. I also worry about download size. So there are still many reasons why you'd want to optimize the hell out of a mesh in a game engine, even if it has the capability to support millions of polygons.


In the video they say that they animated the high poly golem mesh using their new animation tools and they are certainly marketing it as 'zbrush directly to engine'. I'm curious how they went about texturing that massive mesh though and how well it works with humans where muscle deformation is important (I'd imagine the answer is "it doesn't").


Yes but their demo is a 100GB download alone, it seems. That's just a demo. The capabilities of new hardware and engines to render worlds is massively outstripping the capabilities of distribution networks to get that data into people's homes. That's where the next challenge is. Xbox Live is already a long way from being able to deliver games quickly and it will only get worse.


…and then management asks you to make a Nintendo Switch version.


It's a good thing, but this increases work. The amount of detail that needs to be created continues to skyrocket. And it never "just works."


My understanding is that this shortens the time in some cases, but artists, being perfectionists, will continue to use their time refining other things, or more work will be assigned. Basically quality may go up in this case, but cost isn't going down at all.

I have been wondering for a long time if there are tools or massive innovation that bring the cost of building games down by a significant amount.


I'll believe it when I see it. In my experience, anything Unreal takes a long time to download, forever to build and then another eternity to cook the assets.


Just saw the video on YouTube. Amazing. Last year, when I was between jobs, I took a shot at creating my own 3D engine, and I know how much work it is. Congratulations.


I don't know about the engine, but this website renders at about a single frame per second on my 2020 Macbook Pro with 2,3 GHz 8-Core Intel Core i9 processor.


The MacBook Pro is a bit mediocre when it comes to rendering things.


I also have a stationary PC with 3070 video card, do you think it'll be up to the task?



Is there a demo you can "build" with UE5 or just UE projects in general?

As someone in DevOps I'm wondering what the build process and pipeline looks like for games.


Initial build on a UE4 AAA title typically takes 24-36 hours from what I've seen without intermediate or pre-processing on the art assets. You've gotta build the engine, editor, game and then all the art and lighting.


Varies between engines, but it's usually the same as a normal code build, with way more tooling around dealing with large art assets: importing them, compressing them, etc.


Does anyone know how they texture those monstrous meshes? Available 3D packages certainly couldn't handle it directly.


Quixel Megascans are based on photogrammetry. So they're taking lots of photos of real world objects and generating the meshes from laser scans. They aren't built by hand.


The golem from the sample project isn't just stitched together scans, it's definitely built by hand.


The video says that parts of the golem's armor are in fact megascans, iirc. But yes obviously some parts of it are manually modeled. I don't see why that's a problem - people have been making film quality assets for many years. The only thing that's new is it can run in real time now without a lot of work.


No, they just mention that it's made out of nanite meshes (the entire golem, that is).

>I don't see why that's a problem - people have been making film quality assets for many years.

Because there is no software in existence that allows you to UV a 20 million poly model. Automatic methods will crash and for manual methods the software will just be unusably slow. The only way to texture such a large mesh is to either

1) split it up and UV small parts individually. This would take ages, requiring at least an order of magnitude more time than the usual approach. This just isn't economical even for your average hero asset.

2) Create a low poly mesh (and not just by decimating, or you'll get bad UVs), but they're marketing Nanite as if you _don't_ have to do this ("straight from zbrush") and if you have to do this, while still very impressive obviously, it won't revolutionize the workflow like people think it will.


Do they use Unreal Engine 5 for The Mandalorian?


They used UE4 to run The Volume in season 1. For season 2, ILM developed their own real-time render engine (Helios).


Thanks, I didn't know that. I did a search on it [1], with some more details here [2], [3], but they still don't explain in technical detail why they dropped Unreal and made their own. I guess partly to protect their IP, partly to protect Unreal's PR.

I am also wondering how much it cost them to develop their own rendering engine.

Or could it be that Epic was too busy with Unreal 5 and didn't have the manpower to help ILM with their specific feature requests?

[1] https://80.lv/articles/a-look-at-ilm-s-work-on-the-mandalori...

[2] https://www.fxguide.com/fxfeatured/mandalorian-season-2-virt...

[3] https://www.engadget.com/the-mandalorian-season-two-stagecra...


I found an article in Variety [1] that says "Additionally, ILM has a proprietary real-time engine dubbed Helios, based on technology developed at Pixar" which sounds like they adapted it from Renderman rather than build something entirely new from scratch. Pixar has been working on adding real-time/GPU capability to Renderman forever (supposedly it's finally coming to the public in RM24. We'll see).

[1] https://variety.com/2019/biz/features/video-game-engines-vis...


Thank you, the use of RenderMan makes much more sense.



That's the previous Unreal Engine 4.


I really doubt it. I can't imagine UE5 being stable enough to use in production at this stage of development unless they've prioritized the featureset that Mandalorian is using.


I believe they used UE4 for season 1 and switched over to an internally-developed engine for season 2.


Does Unreal let you develop any kind of game you want using their engine? Are there any restricted subjects?


They used to not allow gambling products, which required a specific license, but that restriction was removed from UE4 at some point in the past couple years.

On the other hand, Unity does still require a specific gambling license.


For gambling games you need a custom license, but other than that I don't think there are any other restrictions.


There are no restrictions. You can use Blueprints (a kind of drag-and-drop visual programming) or C++.
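
If you go the C++ route, the entry point is usually just a subclass of AActor (or another engine base class). A bare-bones sketch, with the class name made up and the boilerplate as the editor's class wizard would generate it:

    // MyJamActor.h -- minimal actor that spins in place
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "MyJamActor.generated.h"

    UCLASS()
    class AMyJamActor : public AActor
    {
        GENERATED_BODY()

    public:
        AMyJamActor()
        {
            PrimaryActorTick.bCanEverTick = true;   // opt in to per-frame Tick
        }

        virtual void Tick(float DeltaSeconds) override
        {
            Super::Tick(DeltaSeconds);
            // Spin 90 degrees/sec around yaw, just to show per-frame logic.
            AddActorLocalRotation(FRotator(0.f, 90.f * DeltaSeconds, 0.f));
        }
    };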


All I wanted to see was the new UI, and they show it, so I'm glad. Definitely looks like they took some inspiration from the Blender 2.8 UI. It looks really nice.


Lol, why is their mobile site so fucked up?



