Hacker News
Doom Eternal – Graphics Study (simoncoenen.com)
167 points by todsacerdoti on Sept 7, 2020 | hide | past | favorite | 53 comments


And with enough processing power, you can get a computer that does all of this 1,000 times per second!

https://slayersclub.bethesda.net/en/article/48xD6yVj0VsulONX...

I work with computers every day and I still just have no concept of how powerful they really are.


Computers are powerful enough to host exactly one Wordpress site serving a few dozen users simultaneously.

(Point being, any extra computer power is wasted by software until it performs no better than a computer from 20 years ago)


> any extra computer power is wasted by software until it performs no better than a computer from 20 years ago

The existence of Doom Eternal disproves this.

In domains where high performance is important to the product's success, such as graphically impressive video-games, computational power isn't wasted. Bloat is tolerated in many other domains, though. Anything made with Electron, for instance.



Computers are powerful enough that a reasonable desktop PC these days could host 100K simultaneous web connections and still be entirely usable for several other things as well, as long as you are not flagrantly wasteful with them.

That is, static web hosting is an entirely solved problem that no longer requires special hardware.

Just yesterday, I was impressed that my four-year-old laptop could build 8 virtual machines at once, writing 500 MB/s to disk sustained, and I could still use it to compose emails without a significant performance hit.

That kind of thing would have been unthinkable a mere 10 years ago on even the most high-end servers, let alone something you can carry around with you...


Thanks to that post I discovered that there are such things as overclocking world champions. What a world!



A few thoughts on this from someone who happens to work with graphics programming.

Mega textures: I always thought their insistence on this technique was weird and was never quite convinced by it. I used to have a colleague who praised it to the skies and swore everyone would be doing it in a few years. It's interesting that they've moved away from it now.

Forward rendering: It's fascinating that they're going this way. I feel like everyone is all in on deferred rendering nowadays, except when targeting mobile. I wonder if this is because they want to target older systems and the Nintendo Switch. I have heard that Doom Eternal runs surprisingly well on older systems, and my own experience optimizing for the Switch is that it's not worth trying to get deferred rendering to work efficiently on it.

Shadow mapping: the use of simple 3x3 PCF makes me feel a bit sad that variance shadow mapping never really took off. I remember reading about it and implementing my own version in 2010 and it seemed like the future.
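For concreteness, the 3x3 PCF mentioned above boils down to nine binary depth comparisons averaged into a soft shadow factor. A CPU-side sketch, where the function name and the clamp-to-edge addressing are my own illustration, not id Tech 7's actual code:

```python
# CPU-side sketch of 3x3 percentage-closer filtering (PCF).
# The shadow map is a plain 2D grid of stored occluder depths;
# names and the clamp-to-edge addressing are illustrative.

def pcf_3x3(shadow_map, x, y, receiver_depth, bias=0.005):
    """Average nine binary depth comparisons around texel (x, y).

    Returns a shadow factor in [0, 1]: 0 = fully shadowed,
    1 = fully lit. Out-of-bounds taps clamp to the edge.
    """
    h, w = len(shadow_map), len(shadow_map[0])
    lit = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            tx = min(max(x + dx, 0), w - 1)
            ty = min(max(y + dy, 0), h - 1)
            # A tap is lit if the stored occluder is not closer to the
            # light than the receiver (minus a bias against shadow acne).
            if shadow_map[ty][tx] >= receiver_depth - bias:
                lit += 1
    return lit / 9.0
```

Each comparison is binary, so the filter can only produce ten shadow levels (0/9 through 9/9). Variance shadow maps instead store depth moments so the shadow map itself can be prefiltered, which is why they once seemed like the future.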

Uber shaders and draw call merging: I really like this from a technical standpoint, but I wonder how it affects artist workflow. I know a number of artists who love using specially crafted shader graphs for everything.

Reflections: I wonder if this was just an optimization for memory bandwidth, as suggested, or a compromise made to allow them to go fully forward.


Out of curiosity, have you played the game itself?

I played both the predecessor Doom 2016 game and Doom Eternal on the exact same hardware, and I was blown away by the performance and quality difference.

Never before have I seen a game crank up the graphics quality this much and improve performance to this extent without requiring a hardware upgrade.

To put things in perspective, I had to play Doom 2016 at 1080p on an NVIDIA RTX 2080 Ti for acceptable framerates, but I played Doom Eternal at 4K and it produced a silky smooth 60 fps throughout.

Whatever they're doing... it's working!


I've not yet had the chance to play Eternal myself. With a massive backlog of games and my own hobby project vying for what little of my free time I don't spend with my two kids...

But from everything I've seen, I'm deeply impressed with what Id has done these last few years. Of course, Id doing impressive things is nothing new, but the new Dooms and the latest Wolfenstein were still beyond expectation.


What they are doing is: they got help from Google to optimize the shit out of the game for Stadia (Google's game streaming service).


Such problems are not solvable with money or manpower, and I don't think the Id team depends on Google engineering knowledge for GPU optimizations ;)


> Forward rendering...

Isn't clustered forward rendering for lighting better in every respect than "traditional" deferred shading, whether running on low- or high-end GPUs (because it wastes less memory bandwidth accumulating light contributions from different light sources)? Also, I think most rendering pipelines in modern games mix elements of both deferred and forward rendering (e.g. Doom Eternal renders additional information needed later in the frame into 'fat framebuffers', much like deferred renderers do).


Clustered rendering can be done both deferred and forward.

They do a few things which are reminiscent of deferred, as you say. But what really makes a renderer deferred is the use of the gbuffer pass to only cull and collect data, followed by a rendering pass over a screen quad doing fragment shading with no potential overlap or wasted processing. Of course, you do get similar advantages from the depth prepass.
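For anyone unfamiliar with the clustered approach being discussed: the view frustum is partitioned into a 3D grid of clusters (screen tiles crossed with depth slices), a per-cluster light list is built, and each fragment shades only the lights in its cluster. A toy sketch of the cluster lookup, with an assumed 16x9x24 grid and logarithmic depth slicing (common choices, but not necessarily id Tech 7's):

```python
import math

# Toy sketch of the cluster lookup used in clustered shading:
# screen-space tiles crossed with logarithmic depth slices.
# The 16x9x24 grid and the near/far constants are assumptions
# for illustration, not id Tech 7's actual configuration.

GRID_X, GRID_Y, GRID_Z = 16, 9, 24
NEAR, FAR = 0.1, 1000.0

def depth_slice(view_z):
    """Logarithmic depth slicing: more slices near the camera."""
    z = min(max(view_z, NEAR), FAR)
    return min(int(GRID_Z * math.log(z / NEAR) / math.log(FAR / NEAR)),
               GRID_Z - 1)

def cluster_index(sx, sy, view_z, screen_w, screen_h):
    """Map a screen position plus view-space depth to a flat cluster id."""
    cx = min(int(sx / screen_w * GRID_X), GRID_X - 1)
    cy = min(int(sy / screen_h * GRID_Y), GRID_Y - 1)
    return (depth_slice(view_z) * GRID_Y + cy) * GRID_X + cx
```

At shading time, a fragment (forward) or a g-buffer sample (deferred) computes its `cluster_index` and iterates only that cluster's light list, which is why the technique works in either style.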


All that super complicated math to produce incredibly immersive graphics, but you still can't jump or walk on an elevator while it's moving (I still love the game, I just find such things amusing).


Game physics are hard.


Serious question: what is the value in an in-house engine these days? Performance? Ease of use? Profit margin?

I don't know anything about game dev, but the cost of maintaining an engine sounds very high, and the benefits seem unclear. I imagine that Unreal is more advanced than these homegrown engines, and the engineering cost of constantly upgrading a homegrown engine with new algorithms and such must be staggering.

What's the value?

Ps: Doom Eternal runs really great, but I'm sure with tuning Unreal can produce similar results, right?


So I work at a big AAA company that maintains a few big AAA engines (I'm sure you can guess which one it is), and while there are loads of drawbacks, the benefits are worth it for us. Basically it boils down to: we control the engine, we control the release schedule for features, we don't need to wait for permissions or licences from another company, and there is no risk of being screwed over by changing costs or licence restrictions.

As an example: right now, as next-gen consoles are slowly heading towards release, we need to have several games ready for launch. If we relied on external companies to provide, say, PS5 support, this might not have aligned with our schedules and company goals. In fact, since different projects are in different stages of development, things like PS5 support have been developed independently by a few different teams for different engines. This might seem wasteful from the outside, but in fact it allows us to pick the best implementation afterwards, or the best implementation for the genre of the game that we're building. We did the same with the Switch when it first came out: multiple implementations appeared on different projects, but nowadays there's only one or two that are considered "gold standard" within the company. You don't get this flexibility with external engines: you get one implementation, and it either works for you or it doesn't. You can make a request to change something, but it might or might not happen, and you usually don't know the timescale. With your own engine you control the timescale.

>> I imagine that Unreal is more advanced than these homegrown engines

Uhm no, it is and it isn't. Unreal as a commercial product has to serve many many goals, which means that for some it's just not the best engine to use. With your own engine you can tailor it to exactly the type of game or the system you're building for and achieve maximum benefit.

>>Ps: Doom Eternal runs really great, but I'm sure with tuning Unreal can produce similar results

Like, maybe Epic could do this internally. As an outside studio you have 0 chance of doing the same.


Makes sense, thanks a bunch for the explanation!


> Ps: Doom Eternal runs really great, but I'm sure with tuning Unreal can produce similar results, right?

Absolutely NOT! UE4 has great tools and awesome APIs for creating content from the perspective of an artist. The runtime, unfortunately, is mired in legacy code from a bygone era. It is also restricted by the need to maintain mobile support. Compared to an engine like what idTech has, or Activision's, or Ubisoft's, etc., the UE4 runtime is probably 10 years behind. There are numerous techniques, such as those mentioned in the linked post, that are simply not possible in UE4 because of how dated the RHI layer is.


I’ve worked with UE4 professionally for several years - this is absolutely true. It’s hard to get really good performance out of the engine. Furthermore, it’s a beast. The codebase is large and old enough that making deep modifications is very difficult and time consuming.

That said, the tools are (mostly) great.


Competitive advantage. On the PS4 at least, Doom 2016 and Doom Eternal are some of the best-looking games capable of running at a constant 60FPS on the platform. Additionally, the team will have a tonne of institutional knowledge built up over the engine's lineage and content creation tooling, making switching a costly proposition.

It's worth noting that after Carmack's departure, engine development is now headed by Tiago Sousa, the guy responsible for Crytek's CryEngine. He's very much a worthy successor IMO.

Perhaps a more historical point of owning the rights to your own engine is the ability to relicense it under something like the GPL for others in the community to learn from (once it has outlived its useful commercial life). Id is well known for doing this (see their GitHub repo), driven by Carmack's altruism, and there is a thriving community of ports and code review breakdowns (Google Fabien Sanglard) for their historical products as a result. It's unfortunate that this trend did not continue past Doom 3.


The direct $$$ cost of maintaining an engine is probably not as high as you think, as it really only takes a small team to move fast enough and keep the customizability advantage over COTS engines.

The value question can't be answered in general; it depends on what the engine is specialized for, how well it's executed compared to the alternatives, the cultural factors surrounding the idea of using in-house tech, how well the tooling supporting creatives is executed, etc.

The standard curve applies: most in-house engines aren't very good, but a few are.


Value-wise, it's potentially a competitive advantage: product differentiation, facilitating specialist niches that the commodity engines don't serve.

The performance advantage is real. Getting to 85% of the visuals there is 'easy'.

The last 15% is where the waters get muddy, the predators fierce and the black obelisks loom high.

For most teams, most projects, it's probably a boondoggle. As an indie, your use case basically needs to be really simple and performance dependent.


You think the company that made Wolfenstein, Doom, and Quake should use the Unreal Engine?


Is it still the same company? The new team seems pretty great but they’re still a different team with different strengths.


If you replace all components of a car one by one over a long period of time, will it be still the same car?


A more apt comparison would be a sports team rather than a car. If you replace all the players on a team over a long period of time, it usually wouldn't be considered the same team; sometimes replacing just one player is enough (the Bulls minus Jordan, etc.).


> What's the value?

Business alignment. The engine is optimized for the needs of your company, not others'.

Unreal is optimized for Fortnite, as Fortnite produces a lot of money for Epic.

E.g. if there were a bug that affected Fortnite and another that affected Doom Eternal, Epic would first fix the one that affects most of their revenue, and later fix (or not) the one affecting Doom Eternal, depending on other priorities.

The same happens with new features, or moving to new technologies or platforms.


While the point about business alignment is true when done right, it can also be a net negative when done wrong.

I used to work for one of the biggest publishers of AAA games. They have their own engine which all their studios must use. It's been a big source of issues because that engine was in many ways behind others on the market and the team in charge of it did not feel the necessary pressure to address issues since all their "customers" were required by management to use it.


Depends on what game you're trying to build and if the engine is tuned to that use case.

The needs of an open world vs corridor vs highly instanced simulation will drive a lot of the core design of an engine in different directions.


Yeah, UE seems general purpose, which inevitably makes it sub-optimal for a lot of use cases.


I'm going to respectfully disagree with most of the other replies to your question.

There isn't much value in maintaining your own engine. All you have to do is look at the top games made with Unreal 4 and compare them to the top games made without it, and you'll see it's mostly irrelevant to have your own engine. The tech geeks and engine creators will all claim otherwise, but sales and users don't care. They care that the game is fun. Just like in movies, they don't care what brand the cameras are, what software was used to render the CGI, or what software was used to edit it together. They only care that it's a good movie. At least half of fellow AAA game devs have switched to using a 3rd-party engine, and so it will likely continue.


I don't write in-house game engines, but I do write in-house devops tools, and I assume the reasons are similar. Instead of reaching for things like Jenkins, Docker, Kubernetes, et al., we developed a unified software solution that ties all of these concerns together in a way that aligns perfectly with our business objectives. Our biggest concern has always been the turnaround time of our software changes. We are at a point of doing multiple builds per day per customer, and pushing further into this realm requires some machine assistance.

For me, having 100% visibility of the entire vertical in code is the most compelling aspect. I can set a breakpoint somewhere in business logic and then step into the low-level if I need to. This sort of approach makes it really easy to spot things that are conceptually tricky. E.g. after stepping into an internal library method you discover that you are using ASCII and not UTF-8 encoding, which is resulting in loss of fidelity of persisted business data. If this library method was in some external source you have no control over, it would either be entirely invisible to you, or taunting you with its unchanging gaze. For us, we simply edit the low-level problem, push a commit, and it's done. Knowing that we can step into the lowest-level code encourages us to do so. F11 takes us all the way to the bottom of the rabbit hole. Debugging is a lot easier when you have confidence that you are passing the correct byte sequences into database/os/framework libraries. This also means that stack traces are very useful, since you can view the source behind everything.
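The ASCII-vs-UTF-8 example above is easy to reproduce in a few lines. A minimal sketch (the `persist_*` names are hypothetical, not from any real codebase):

```python
# Minimal reproduction of the kind of bug described above: a buried
# ASCII default with a lossy error handler silently mangles non-ASCII
# data, while UTF-8 round-trips it. The persist_* names are made up.

def persist_lossy(text: str) -> bytes:
    # The kind of library default that is invisible unless you can
    # step down into the low-level code.
    return text.encode("ascii", errors="replace")

def persist_utf8(text: str) -> bytes:
    return text.encode("utf-8")

name = "Søren Müller"
assert persist_utf8(name).decode("utf-8") == name   # round-trips cleanly
assert persist_lossy(name) == b"S?ren M?ller"       # fidelity silently lost
```

The lossy version raises no error at all, which is exactly why such a default can sit unnoticed until you step into it with a debugger.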

There is obviously a cost associated with maintaining this solution. We have found that it is substantially more expensive to maintain than just using a set of existing tools. For our case, 1 full-time developer is approximately what it takes to maintain all of this infrastructure. However, there is also the angle of opportunity and efficiencies. With the custom tools we have built, we are now able to cycle our software development process 4-5x faster than before. Our project managers can click a few buttons in a web UI to trigger a build of our software followed by automatic packaging and scheduled release to selected customer environments. This also includes the ability to review all errors (fully-automatic reporting) combined with snapshots of business state at time of stack trace generation. We have the ability to click on a commit hash and see a list of all errors which occurred on that tree up to that point in time. All of this links back into our source control tool. I would say that our devops process is ~95% full-auto at this point. I have not seen anything like what I am describing offered in any public marketplace.

Sometimes if you want something really nice, you are just going to have to build it yourself. Many times, the cost of in-house custom will not be justifiable at face value. You have to look beyond this and consider the higher-order consequences of your decision over longer time frames. Our solution started out very humble and gradually grew into the comprehensive monster that it is today. If we hadn't taken the leap away from Jenkins 3 years ago in favor of implementing the build logic in code, we would have never had the opportunity to incrementally add all of the other amazing stuff that we did.


Sorry to say, but this was the completely wrong approach and I’m amazed the business let you do it.

Some poor bastard is going to have to undo all your mess down the road; I hope you buy them a beer if they ever track you down. I mean, at what point do you make the call that writing your own in-house copy of _DOCKER_ is a smart move? Let alone Jenkins... The 'magic' abilities you describe this platform as having are utterly standard features and are yours as soon as you install e.g. GitLab...

Of course, you actually have to spend some time and learn them and how to use them properly, but that time is VASTLY less than learning how to implement them badly and minuscule compared to the amount of time it’s going to take to migrate off your in-house pile..


We did not write a 1:1 replacement for docker. We made technology choices which allowed for us to completely sidestep the conversation.

See: .NET Core Self-Contained Deployments and the inherent advantages of SQLite.

Just because we don't use a particular vendor's product does not mean that we have made the decision to re-implement 100% of their feature sets.


Oh, sounds exactly like what I'm in the process of laying down at work!

Do you push to on premises deployment or is it pure saas? The former is obviously harder to tackle, and unfortunately that's the case here.


We push to each of our customers' environments from our centralized administration tools. Each customer environment can vary wildly (on-prem, AWS, Azure, etc) and ultimately just needs a small component installed for communication with our mothership. We do not worry about the actual hardware or hosting, but do have tools that allow for us to inspect the health of whatever host we are running on from our remote perspective.

Longer term, we have considered shipping appliances (physical or VHD) to our customers so that they do not have to take the extra step of configuring a new windows or linux host each time we want to stand up a new product environment.


Can anyone tell me how one can get multi-pass screenshots like this? I am aware of RenderDoc. Does it allow this, or is there a different tool?


You should be able to do this in any competent rendering debugger: PIX, RenderDoc, Nvidia's graphics debuggers.


id has done excellent work with id Tech and I'd love to actually play it, but the Denuvo scandal leaves a bad taste in my mouth. I want to reward good industry behavior. I'm not sure if excellent technical advancement, competent design, and alluring art outweigh hostile behavior. They probably do at a 50% sale.


Didn't they back out Denuvo in the end? It seems fair to choose not to purchase a game if it has DRM, but conversely, if the company responds to the community by removing it, that seems like "good industry behaviour" worth rewarding :).


They did, but they also added it suddenly, without notice, two weeks after the game launched (after the return period for day-one sales expired). Pivoting to save face does not nullify that bad behavior. They'll do it again and will only back out if they get caught.


> Didn't they back out Denuvo in the end?

I know that mainstream video game journalism mostly ranges from regurgitating publisher's lies to spreading political propaganda, but you may want to read at least the PCGW article[0]. This game has been using the Denuvo Anti-Tamper malware since its launch (2020-03-20). The first update (2020-05-14) introduced the additional Denuvo Anti-Cheat malware. This kernel mode(!) service was also required in singleplayer, decreased system stability (blue screens), caused performance issues and made playing the game on Linux impossible, after it had worked perfectly fine before via Proton. Its integration with the game was removed in the second update (2020-05-27), but Bethesda.net customers had to manually uninstall the service themselves.

> "good industry behaviour"

The game requires both a Bethesda.net account and either the Steam or Bethesda.net launcher, which are both in themselves malware, has four unskippable intro screens that last almost half a minute and the encrypted savefiles are locked to a specific platform/account. The included versions of Doom and Doom II are censored, some of the background music in the port of Doom 64 plays differently than in the original and the Red Cross symbol is censored. The account check and intro screens can be disabled with command line arguments, but the vast majority of players probably do not know this.

Doom Eternal does not offer a demo, has a base price of $60, pre-order DLC, a $90 Deluxe Edition (or a $30 Year One Pass) and a $200 Collector's Edition that does not include a physical version of the game, but a download code locked to the Bethesda.net version, and whose included OST, leaving the controversy about its quality aside, was only made available a month after launch (2020-04-19), also locked behind a Bethesda.net account and still unavailable on other platforms as a standalone. Despite its creative director's lie, this game also contains microtransactions, which were time-limited and locked behind linking a paid (or "free" trial) subscription to a certain online shop/video game live streaming service to the Bethesda.net account. The only thing missing is gambling (loot boxes).

Such revolting behaviour is an even further departure than Doom (2016) from the once self-publishing id Software that offered a shareware episode of Doom and open sourced their cleaned up game engine code four years after release.

> worth rewarding

Would you reward your hostage-taker for holstering the loaded gun - for now - after having pointed it to your head, because you did not get shot this time? Your attitude is precisely the reason why the Western game industry has not yet deservedly collapsed despite being in such a self-destructive state. Why should I reward stopping only one of numerous blatantly user-hostile practices, instead of blacklisting anybody who uses them in the first place, or just completely ditching these racketeering publishers and supporting indie developers or a different hobby altogether? You get the industry you deserve.

A DRM-free version of the game is also available, if you know where to look. "Pirates" get the objectively superior experience, again.

[0] https://www.pcgamingwiki.com/wiki/Doom_Eternal


I feel that Doom Eternal is a bit slow/unoptimized for the actual visuals you get. Love the game though


Really? My gaming rig is over 7 years old now. While it was pretty decent for its time, it's definitely due an upgrade.

But I was still able to run D:E @ 1440p and get a respectable framerate with decent visual quality.

Compared to most other games, I think it's extremely well optimised.


Yeah, it runs fine, but not great. I was also put off by this demo (RTX 3080, 4K, maxed out): https://youtu.be/A7nYy7ZucxM


I take the exact opposite from that demo: they chose Doom Eternal for the first showing of FPS numbers from the 3000 series precisely because it's the best-looking title they could get to run at 4K 120+ on the card, not in spite of its looks.

Perhaps you just don't like the art style or prefer something that's prettier at low framerates than decent looking at high frame rates? I mean when it came out there were articles literally written about how optimized it is for the quality https://www.tomshardware.com/features/doom_eternal-graphics_.... I'm not saying it's the best optimized game of all time but it certainly is in the top quartile.


Doom Eternal runs better than most of the current Unreal/Unity engine games I've been playing lately, and it looks a touch better too

I haven't tried it on my 4K HTPC (no comfortable keyboard+mouse for FPS on it) but 1440p on a GTX 1080 with all the graphics maxed runs a consistent 60FPS in Vulkan.


Doom 2016 was just way too well optimized so anything after it will seem slow


I felt like the opposite was the case - both Doom 2016 and Eternal ran way better than any other AAA FPS on my machine.


I've read that both represent an early push towards truly multithreaded engines, especially on PC, including the graphics part. It's also supposedly why D:E is Vulkan-only, as it supposedly goes even deeper, with multiple threads sending commands to the GPU, etc.
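The "multiple threads sending commands" point maps to Vulkan's design, where each worker thread records into its own command buffer and a single submit hands them all to the queue. A CPU-only toy model of that pattern (all names are illustrative, not the engine's API):

```python
import threading

# Toy model of Vulkan-style multithreaded command recording: each
# worker thread appends draws to its OWN command list (no shared
# mutable state while recording), then one submit step gathers all
# lists, mirroring vkQueueSubmit accepting several command buffers.

def record_commands(draw_calls, num_threads=4):
    chunks = [draw_calls[i::num_threads] for i in range(num_threads)]
    buffers = [[] for _ in range(num_threads)]  # one buffer per thread

    def worker(tid):
        for call in chunks[tid]:
            buffers[tid].append(("draw", call))  # thread-local append only

    threads = [threading.Thread(target=worker, args=(t,))
               for t in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # "Submit": a single ordered handoff of every recorded buffer.
    return [cmd for buf in buffers for cmd in buf]
```

The recording step parallelizes cleanly because no thread ever touches another thread's buffer; older APIs like OpenGL funnel all submission through a single context, which is consistent with the commenter's point about D:E being Vulkan-only.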


Yeah, I noticed this as well. I think it's more about the level design and a lack of optimization passes over the levels rather than the engine. Looking at certain locations would give me massive FPS drops, up to a 50% reduction.



