Duke Nukem 3D Code Review (fabiensanglard.net)
200 points by lispython on Feb 15, 2013 | 39 comments



A true gem is hidden down at the end of the first page:

https://github.com/fabiensanglard/chocolate_duke3D

[...] a port of the Vanilla source code with two goals in mind:

    Education: Easy to read/understand and very portable.
    Fidelity: The gaming experience should be similar to what ran in 1996 on our 486s.
And from the README:

    Aimed at education: A lot of comments and documentation has been added in order to help programmers to understand and learn.


As I am unable to edit my earlier comment: there is also a project page (beats me how I overlooked that), explaining the modifications in detail:

http://fabiensanglard.net/duke3d/chocolate_duke_nukem_3D.php


Oh, awesome! I'm currently trying to do this with the LibreOffice source code. Frankly, it's a definite legacy code base. The source code organisation is poor, they have (IMO) not encapsulated functionality into appropriate classes, they rely on conditional directives too often, and while they abstracted base-level functionality into their system abstraction layer, too often I see it leaking outside of this layer.

And that's just after reviewing their startup code!

Interestingly, I believe the code can be refactored. They have a great basis of code, but it needs a major reorganisation and a solid architectural overview and plan. It needs, in short, disciplined coding based on solid OO practices.

In fact, I'd love to rip out the guts of the vcl into its own library, and create a runner abstract class that you derive from for each architecture they want to support. Better yet, they should find a good C++ IoC container and use it to plug in all their many components, including the UNO framework, which is actually very impressive.

To this end I'm strongly considering putting together a sequence diagram of the existing startup classes just to understand how they all work together. An interesting exercise at the very least :-)


Would love to see any notes you have regarding the current state. The developer documentation* isn't a great overview.

* https://www.libreoffice.org/developers-2/


I'm mainly focusing on startup at the moment; I believe the LibreOffice guys have their hands full with the more productive task of implementing features. I think they are approaching things from a different perspective to me: I'm interested in how it all hangs together, they want to work on things that give them the most bang for the buck.

And that's sensible, and very fair, given limited man-power :-) not to mention that what I'm attempting is a part-time hobby!



Thanks. I'll keep digging. If you want to socialize your notes, or need a proof-reader, I'll be around!


Ken Silverman (18 years old at the time) wrote a 3D engine at home and sent a demo for evaluation to 3D Realms. They found his skills promising and worked out a deal

Cool way to start a partnership!


Also, take a look at Ken's voxlap engine. He is so bright!! I wonder why he is not so active in the industry :(


Voxlap is how I got my sister interested in programming. She really liked the cave blasting demo.

http://advsys.net/ken/voxlap.htm

http://advsys.net/ken/voxlap/cave.zip


> Since I left my job at Amazon I have spent a lot of time reading great source code.

I know (or think) the OP has another job/company now, but the idea of still diving into source code after leaving a technical job is a pretty cool pursuit -- especially with the intention of explaining it to the world.


OK, this post certainly led to some fun internet hopping.

I liked this vintage Carmack quote from Ken Silverman's website:

St. John: If you could just hire anybody from the 3D world, who would you hire?

Carmack: Well there's a big difference between who I consider the most talented and who I would necessarily hire, because you have to hire people that fit right. If I had to pick who I think is just the most talented, it would probably be Ken Silverman, the guy that did the BUILD engine. He does engines and tools. He's great as an editor. He writes all the code for everything, and he's just extremely talented. I think it was 3D Realms' worst decisions not to coddle him, or whatever it took, to keep him on board. I think if he was still working directly for 3D Realms, they would have a Quake-type game shipped by now, just because he's extraordinarily good. There's maybe a half dozen people that are top-notch A-level 3D programmers. I'm not going to give you a list because I'd leave somebody off and they'd be all pissed off at me.

St. John: You've already left off 90% of them by naming Ken Silverman.

Carmack: All the people doing things that people are talking about now are pretty talented. The Epic people have been working on it for a long time. They've gone through a big learning process, but they've got the issues under control and they're going to ship a product.

St. John: So you think one day Tim Sweeney might grow to be as successful as you.

Carmack: It's hard to become successful by following in footsteps. This is probably going to come out sounding demeaning, but Epic wants Unreal to be Quake. Everything they did with Unreal, they did because they wanted it to be like what Quake turned out to be. And they're going to achieve a lot of that, because they're doing a lot of things well, but you're just never as big when you're second in line.

Hook: Just like Dark Forces and Duke were both phenomenal games, they still definitely didn't have the impact of Doom simply because they just weren't first out the gate.

Carmack: Like Prey, there's a lesson to be learned, something a lot of companies don't really ever learn. You hear it from the fan base a lot. "Do it right. We'll still be here. We'll wait," and it's tempting to just let things slip. But that's really not OK. If you're doing something cutting edge, you're making fundamental decisions about your architecture, and if you let it slide for a year or two, then it's just not the right decision anymore. Even if you pile on all these extras, it's not optimal. It's not targeted at what you're doing. So I have some concerns about Prey coming out this late.

It's funny to see Carmack trash talking what Epic was doing with Unreal now, given the way history has unfolded. Tenacity, persistence, and consistency can outweigh the disadvantages of following an industry leader with a me-too product.

And for anyone else who eats this kind of thing up, David Kushner's Masters of Doom is a great read that I can't recommend highly enough.

p.s. I can definitely identify with running across popular commercial game codebases that are composed almost entirely of one source file.


>It's funny to see Carmack trash talking what Epic was doing with Unreal now, given the way history has unfolded. Tenacity, persistence, and consistency can outweigh the disadvantages of following an industry leader with a me-too product.

If you're talking about how Epic does really well in the licensing space while id does not, I believe that was an intentional decision by id not to compete in that space post-Q3.


Sadly, Ken declined: http://videogamepotpourri.blogspot.ca/2012/05/interview-with...

Seems like he ended up quite depressed.


That is a very depressing interview. I find it difficult to feel sorry for him since he mentions declining a job offer from John Carmack.


I get the impression the guy has been called a genius so often he feels insecure about not living up to the name.


Awesome and depressing. He has set up his engine as something he will never top, and believes it, and that is the only thing keeping him from making something amazing. Some people get over self-imposed limits and blast ahead. Some are fine where they are.

Still, I am thankful to the world that he exists.


Not trying to fanboy anyone here, but while id was still in the "license the engine out for fun and profit" game (especially during the quake 2 - 3 era) they dominated. It wasn't until id tech 4 and Unreal Tournament 2004 (and the Unreal 3 engine) that they stopped being really competitive in the engine space, and Epic is as big as it is today not just because of Gears of War (was UT3 popular?) but because they own the high end licensed 3d engine space.


My impression after reading "Masters of Doom" was that Carmack never really wanted to be in the licensing business. The main driver behind id's early licensing pursuits was Romero, and although Carmack went along, he did so without enthusiasm. After Romero got fired, there was no one at id pushing for the studio to do much besides making games. Epic Games, on the other hand, took engine licensing very seriously from the beginning. There is a feeling of "continuity" between iterations of their technology, and this is very important to toolkit licensees, and something that you don't get from id. I've always felt that Tim Sweeney has a lot more business acumen than most people would expect from a programmer.


+1 for Masters of Doom. It's the best computer game book I've ever read.


I think Carmack's comment is still spot-on - Epic did get what they have now by doing something on their own: putting all their weight behind licensing, which was always only "we have to do it, but we do not want to do it" for id, i.e. they found an area where they are number one and executed well there.


I'm a little disappointed that the author didn't examine the original DOS version. DOS emulators and the Watcom toolkit are still available.


Slightly off-topic, what is the typical setup like for a 3D programmer these days? Is it still Windows/Visual Studio?


With the huge number of uses for 3D engines these days, the idea of 'typical' is the same as asking what the typical setup for any programmer is.

3D programmers for Android are probably on a combination of windows and linux. 3D programmers for iOS are no doubt coding away in Xcode. The new field of WebGL could have people coding 3d engines on Chromebooks. Etc.


Exactly.

There aren't really any requirements for 3D development over any other development. All you need is a compiler that'll hit the platform you're developing for. Right now I'm just using vim and gcc (under Ubuntu) for Android dev, and it works great.

If you want to target consoles, you'll need the correct platform SDKs from the manufacturers. Lots of people I've worked with used VS for console dev, but I tended to stick with vim because I preferred it, and at the end of the day it's all just code going into a compiler.


Actually that's an answer in and of itself. As recently as a few years ago the answer was Windows / Visual Studio even when targeting non-Windows based consoles.


I'm curious - and excuse me for my ignorance - but what is "unrolling code"?

http://fabiensanglard.net/duke3d/duke3d_code_review_unrolled...


Usually it simply means repeating the loop body N times so that the number of loop iterations can be reduced to 1/N.

For example, if you know that the iteration count is divisible by 4, you could do something like:

  int unrolledN = n / 4;
  for (int unrolledI = 0; unrolledI < unrolledN; unrolledI++) {
    int i = unrolledI * 4;
    // loop body...
    i++;
    // loop body...
    i++;
    // loop body...
    i++;
    // loop body...
  }
This wouldn't really offer any advantage over the plain loop, though. Next you'd need to reorganize the loop body so that e.g. memory reads for all iterations would occur at the start of the unrolled loop. This kind of optimization can offer significant performance increases because you get more control over what the CPU is doing within the loop, but it also depends greatly on the target platform. Even different x86 processors can be very different in this respect, so unrolling can easily become a de-optimization.
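
To make that more concrete, here is a rough sketch (hypothetical function, not from the Duke code) of a 4x-unrolled array sum where all the reads are grouped at the start of each iteration and separate accumulators shorten the dependency chains:

  // Hypothetical sketch: sum an array whose length n is divisible by 4.
  static int sum_unrolled(const int *data, int n) {
    int s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    for (int i = 0; i < n; i += 4) {
      int a = data[i + 0];  // issue all four reads first...
      int b = data[i + 1];
      int c = data[i + 2];
      int d = data[i + 3];
      s0 += a;              // ...then do the arithmetic
      s1 += b;
      s2 += c;
      s3 += d;
    }
    return s0 + s1 + s2 + s3;
  }

Whether this actually helps depends entirely on the compiler and the target CPU, as noted above.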


In this case he has basically replaced some function calls with the bodies of those functions, so the actual program flow is clearer to read.


Broadly speaking, it's removing flow control from the code in order to make it easier to read or faster to execute.

In this case, I think he's replaced function calls with the body of those functions. For example, "displayrooms" is immediately followed by { } surrounding the contents of what "displayrooms" actually does: interpolate, animate, and so on. This means you can read just the one source file and know what's being executed, instead of having to read the source of displayrooms.c and various other source files separately.

Another common form of "unrolling" is repeating the body of a loop some number of times. See, for example, http://en.wikipedia.org/wiki/Duff%27s_device . Basically, by reducing the number of branches (and therefore potential pipeline flushes) you can decrease computation time.
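
For reference, here is a sketch of Duff's device in C (assuming count > 0; in Tom Duff's original, "to" was a memory-mapped output register, which is why it is not incremented):

  // Copy count shorts from "from" to the fixed location "to", handling
  // the count % 8 remainder by jumping into the middle of the unrolled loop.
  void send(short *to, short *from, int count) {
    int n = (count + 7) / 8;
    switch (count % 8) {
    case 0: do { *to = *from++;
    case 7:      *to = *from++;
    case 6:      *to = *from++;
    case 5:      *to = *from++;
    case 4:      *to = *from++;
    case 3:      *to = *from++;
    case 2:      *to = *from++;
    case 1:      *to = *from++;
            } while (--n > 0);
    }
  }

On modern CPUs with branch predictors this rarely pays off, but it is a neat illustration of trading branches for code size.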


I think what he means here is that all the code that is abstracted into functions etc. is "unrolled" (at some sensible level) into one main function so you can see the complete flow of the program without searching around.


Not sure, but it could be something like Duff's Device. Look it up, it can be used for various things, like drawing and stuff.


> a.c: A C reverse-engineered implementation of what used to be highly optimized x86 assembly. It works but is a monstruous pain in the ass to read :(

Was this manually reverse-engineered or was a tool used?


Does anybody have tips on how to view the embedded quicktime videos?


curl -s http://fabiensanglard.net/duke3d/build_engine_internals.php | grep -Eo "http.*\.mov" | mplayer -playlist -

sadly does not work because they are not encoded well. So just use the following line to get the URLs and download them by hand. Substitute the page URL as needed.

curl -s http://fabiensanglard.net/duke3d/build_engine_internals.php | grep -Eo "http.*\.mov"


It seems that the .mov files are just references to .m4v files.

This will fetch those:

curl -s http://fabiensanglard.net/duke3d/build_engine_internals.php | grep -Eo "http.*\.mov" |sed -e 's/\.mov/-desktop.m4v/' | xargs -n1 curl -O


Just click on it and it should download and install QuickTime, which will place a plugin in your browser.


Any tips on watching QuickTime video that don't involve infecting your computer with Apple's garbage? I'm not familiar with .mov files, but it seems I can't just download the file directly and play it with my menagerie of players capable of playing m4v's. Which is frustrating.

.mov is acting like some kind of meta-file to point quicktime in the right direction. The only string in the file worth noting is the name of the actual video file with the .m4v extension. Maybe it's storing an IP address?

Honestly I'd rather not go through all of this; if he had used a modern standard I would have been able to scrub through the file locally.


Just install the totem-mozilla package. Should provide you with /usr/lib/mozilla/plugins/libtotem-narrowspace-plugin.so

Or if you'd rather watch outside the browser, see instructions above.



