Depends on what you can actually finish I suppose.
I have several GameMaker projects which are long dead and gone because I got bogged down trying to write nifty engines and crap for them. In one case, I spent probably weeks working on my own custom (and, to be fair, only barely passable) physics engine for a game; the same thing can be accomplished in a few lines now that GameMaker Studio comes with a physics engine.
I can only imagine how deep the rabbit hole would go if I ever started to learn game design in C++ or something (which, of course, I plan to) and then added language design on top of that. Would probably be really fun, though.
In line with the other commenter, I have spent many years tinkering with just game engines in my spare time, and indeed it is a rabbit hole.
For that reason, I think things like indie game jams (Global Game Jam & Ludum Dare) are a good idea: you will write crappy code, but you will actually make a game.
Any project that runs some sort of external script would count. Soon the config file has some sort of conditional (#define grows into #ifdef) and variables.
Also any project that has a VM (in the traditional sense) such as SCUMM.
It incorporates the notion that Lisp already does everything: the code-as-data-as-code that Lisp gives you is a win that coders dismiss until it's too late and they've spent years learning C++.
I think a lot of people misunderstand what is being proposed here. He's not saying to write a whole new language; he's saying to write a bytecode interpreter. You can implement one of those in about an hour.
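To make that concrete, here's a bare-bones sketch of the kind of thing he means (the opcodes are invented for illustration; the chapter builds up something along these lines):

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Invented opcodes for a tiny stack-based game VM.
    enum Op : uint8_t { OP_LITERAL, OP_ADD, OP_SET_HEALTH, OP_HALT };

    void interpret(const std::vector<uint8_t>& code) {
      int stack[64];
      int top = 0;
      for (size_t ip = 0; ip < code.size(); ip++) {
        switch (code[ip]) {
          case OP_LITERAL:              // push the next byte as a value
            stack[top++] = code[++ip];
            break;
          case OP_ADD: {                // pop two values, push their sum
            int b = stack[--top];
            int a = stack[--top];
            stack[top++] = a + b;
            break;
          }
          case OP_SET_HEALTH:           // pop a value, hand it to the engine
            printf("set health to %d\n", stack[--top]);
            break;
          case OP_HALT:
            return;
        }
      }
    }

    int main() {
      // "set health to 10 + 20", hand-assembled
      std::vector<uint8_t> script = {
          OP_LITERAL, 10, OP_LITERAL, 20, OP_ADD, OP_SET_HEALTH, OP_HALT};
      interpret(script);
    }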
While you're at it, you'll probably want to write some debugging tools since your designers are bound to dream up some complicated scripts you never imagined your system needing to handle...
If you're interested in this, check out Fabien Sanglard's code review of Another World[1], which is one of the more notable VM-based games, managing to target a multitude of different platforms with some impressive vector graphics for its time.
Also very interesting is the Emscripten port of ioquake3, which includes a custom compiler for Quake 3 VMs to Javascript.
"Console manufacturers and app marketplace gatekeepers have stringent quality requirements, and a single crash bug can prevent your game from shipping". Not in my experience, usually when the hard ship date comes it goes out the door.
There's a bit of give and take, but when I was at EA, certification was serious business. During beta when we had a final candidate build and the console manufacturers were testing it, we basically sat there with our fingers crossed hoping they didn't come back with an issue.
My friend scored his dream job as a "game tester" for Rockstar. The fun turned into tests like "run to the edge of the gameworld, fire 25 rockets, run back, swap the rocket gun for a pistol, run to the edge of the gameworld, fire 25 shots, run back, repeat".
You might want to look at how people are approaching components now. Instead of components containing logic, components only contain data.
You then have systems that operate on the components. This solves the problem of needing components to interact with each other, since a system can look at as many components as it wants.
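A bare-bones sketch of the split (names invented; real ECS implementations also track which entities actually have which components):

    #include <cstddef>
    #include <vector>

    // Components are plain data: no methods, no logic.
    struct Position { float x, y; };
    struct Velocity { float dx, dy; };

    // A system owns the logic and touches whichever components it needs.
    // Assumes entity i has both components, for brevity.
    struct MovementSystem {
      void update(std::vector<Position>& pos,
                  const std::vector<Velocity>& vel, float dt) {
        for (std::size_t i = 0; i < pos.size(); i++) {
          pos[i].x += vel[i].dx * dt;
          pos[i].y += vel[i].dy * dt;
        }
      }
    };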
Yeah, I'm aware of that approach too (all of those links are already gray for me!), but I wrote the chapter before that became as popular as it seems to be now.
I say "seems to be" because, while ECS is clearly being talked about a lot, I'm not sure how many shipped games are actually architected around it. The blogosphere doesn't always accurately reflect reality.
I've looked at a few ECS implementations and many of the ones I saw had such bizarre implementations that they more than nullified the performance benefits of components.
I do see the advantage of systems that work on multiple components, but I think some proponents are starting to cargo cult this model. That happens frequently in games because you have so many amateurs that want to know the "best" way to do things, and the constraints are challenging.
Hmm, this ended up sounding overly negative which I didn't intend. I think ECS is interesting, but I'll be interested to see how it matures over time. Right now it feels a bit like the OOP fads of the 90s to me. There's good ideas in it, but any good idea can be taken too far.
Is this a viable suggestion, though? Correct me if I'm wrong but I imagine that developing the entirety of your video game in Forth for anything but desktop operating systems would mean writing your own tooling and API bindings, if not your own Forth compiler.
It is pushing the machine model to where it is most appropriate: on the iron. No different from choosing Javascript because it is cross-platform; it's just that someone has already done lots of the base work.
Forth has strengths and weaknesses, lack of wide use on modern platforms is definitely a weakness. There is still plenty of Win waiting in there.
If the article about building a hobby OS was on a blog about making actual useful software, and framed building an OS as a reasonable subproblem of making actual useful software, I would expect lots of people to reply like that.
The blog is specifically about patterns used in game development not about game development per se. So it's about the journey, not about the destination.
Correct, and one of the patterns is embedding a language. Maybe Lua/JS doesn't fit your criteria for whatever reason, so it might be a good idea to know how to roll your own language.
More importantly, that you understand how the underlying language runtimes work and their call/data structure models. Most of us will still embed an existing system.
I am beyond very confident that the solution set containing embedding Python, JavaScript, AngelScript, and Lua covers just about everything in the problem set a game programmer reading that book is going to encounter.
And if it doesn't, it's not going to be improved by rolling your own.
Hi, I wrote that chapter. I also wrote the AI system for Hatsworth. There is absolutely no way we would have embedded Lua on a DS!
What we did do was write a tiny little bytecode VM as described in the chapter. The level editor I wrote[1] let designers author behavior using a little UI. It then compiled that to bytecode.
It worked like a charm if I may say so. Unlike a full-featured language VM, we didn't have to deal with strings, parsing, garbage collection, or any of that stuff at runtime, which would have been untenable on a DS.
That's really interesting, thank you for mentioning that. That's totally in my "just about", for sure--but this isn't 2008 (even the 3DS has 128MB of RAM), and this doesn't exactly strike me as the sort of book that somebody who already has a console devkit is going to be reading. "A bytecode interpreter" seems to me to be something that that guy (i.e., you) will have sussed out entirely, whereas somebody who actually needs a book like this is in just about every case going to end up off in the weeds building something instead of going off the rack.
(That said, no-GC AngelScript uses under a megabyte of RAM on ARM; before I switched to libgdx and decided that I could do without supporting the Galaxy Nexus I'd spent a lot of time examining it. It certainly might be tight on the DS, but on any modern system, I think you're probably cool.)
You saw them a lot in old games (the kind likely to be written in assembly), because not even C, let alone a general-purpose scripting language, was acceptable for performance reasons for quite some time, yet there has always been a need for greater abstraction and expressive power. Look at any Japanese RPG of the 80s and 90s, for instance, and there's probably a simple bytecode VM (or state machine, if you prefer) for scripts that move characters around during story sequences, one for displaying text in menus and dialogs, and so on. Look in an arcade shooter, and there's probably one for defining enemy movement patterns, etc.

Tiny bytecode DSLs not only boosted expressive power for programmers, they also allowed non-programmers to script behavior. 6502 assembly might be out of reach for non-programmers, but `SETANIM PLAYER1 PLAYER_WALK_TICK; WAIT 20; SETANIM PLAYER1 PLAYER_WALK_TOCK; WAIT 20; LOOP;` is not. If you only have to remember a few simple rules to get the behavior you want, even learning to hand-assemble bytecode is not out of the question for a dedicated professional.
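For illustration, here's one plausible (entirely made-up) encoding of that walk-cycle script, with a WAIT that yields until enough frames have ticked by:

    #include <cstdint>

    // One opcode byte followed by its operands. Operand values (actor 0,
    // animation ids 1 and 2) are made up for the example.
    enum Op : uint8_t { OP_SETANIM, OP_WAIT, OP_LOOP };

    const uint8_t kWalkScript[] = {
        OP_SETANIM, 0, 1,   // SETANIM PLAYER1 PLAYER_WALK_TICK
        OP_WAIT,    20,     // WAIT 20
        OP_SETANIM, 0, 2,   // SETANIM PLAYER1 PLAYER_WALK_TOCK
        OP_WAIT,    20,     // WAIT 20
        OP_LOOP             // back to the start
    };

    void setAnim(int actor, int anim) { /* hand off to the animation system */ }

    struct ScriptState {
      const uint8_t* pc = kWalkScript;
      int waitFrames = 0;
    };

    // Called once per frame. Runs instructions until the script yields via
    // WAIT (the script must contain a WAIT, or this would spin forever).
    void tickScript(ScriptState& s) {
      if (s.waitFrames > 0) { s.waitFrames--; return; }
      while (s.waitFrames == 0) {
        switch (*s.pc) {
          case OP_SETANIM: setAnim(s.pc[1], s.pc[2]); s.pc += 3; break;
          case OP_WAIT:    s.waitFrames = s.pc[1];    s.pc += 2; break;
          case OP_LOOP:    s.pc = kWalkScript;                   break;
        }
      }
    }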
Nowadays, you'd probably be better served by embedding a general-purpose dynamic language in most of these scenarios, but hand-rolled bytecode can still win in certain situations that need high performance and/or low latency. Imagine, for instance, one of those fancy "danmaku" shooters with hundreds or even thousands of bullets on screen moving in intricate patterns. You just might want a language more dynamically manipulable than C to script the behavior of those bullets, but using a dynamic language that furiously generates garbage and demands collection pauses for something that must happen at 60 frames per second across so many objects is eventually going to cause hitches that will get your player killed (in the game, at least). You'll then be subject to a very harsh but valid criticism: "I played games like this on my PlayStation back in the day with no slowdown; this guy must be a crappy programmer if his 2D game stutters on my monster PC." Sure, it's great that you don't need to break out the optimized assembly to do a Breakout clone anymore, but at the same time, you should feel bad if a 1MHz 6502 provides a considerably more reliable experience than your game does on a modern x86.
A certain popular 2D shooter series made by an ex-Taito arcade game programmer uses a bytecode DSL for its intricate bullet patterns for this reason. It was probably an easier decision to make when the series started in the late 90s, but today it lets the author be productive without adding wildly variable latency into the mix that would kill the gameplay. People should not forget that running smoothly even on poor hardware is a real feature. Even in 2014, I'm sure there are plenty of people out there still using Pentium 4s and Windows XP. If you make the only new games/apps/whatever that run well on these computers, you have a captive audience.
At the same time, I think you could extract 99% of the performance benefits of the bytecode approach if your language had a well-defined heapless, stackless "tasklet/microthread" subset that centered around mutating existing objects and making certain limited kinds of allocations (allocating from a memory pool and manually "freeing" by marking the object as unused later). Knowing that a piece of code will never generate garbage and that a garbage collection cycle will never happen during its execution makes a huge difference.
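For instance, a fixed pool gets you bullet "allocation" with zero heap traffic after startup (a crude sketch; a real pool would keep a free list instead of scanning):

    #include <cstddef>

    struct Bullet { float x, y, dx, dy; bool inUse = false; };

    class BulletPool {
     public:
      // "Allocate" by claiming an unused slot; no heap, no GC, no pauses.
      Bullet* create() {
        for (std::size_t i = 0; i < kMax; i++)
          if (!bullets_[i].inUse) { bullets_[i].inUse = true; return &bullets_[i]; }
        return nullptr;  // pool exhausted; the caller decides what to drop
      }
      // "Free" by marking the slot unused again.
      void destroy(Bullet* b) { b->inUse = false; }

     private:
      static const std::size_t kMax = 2048;
      Bullet bullets_[kMax];
    };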
Thank you for your answer. I do realise why it was done in the past, I was wondering about the present. Also what you're describing in the last paragraph sounds a bit like Andy Gavin's GOOL - http://all-things-andy-gavin.com/2011/03/12/making-crash-ban...
"At the same time, I think you could extract 99% of the performance benefits of the bytecode approach if your language had a well-defined heapless, stackless "tasklet/microthread" subset that centered around mutating existing objects"
Low-end mobile devices these days have 80+MB of RAM available to an application and the downsampled assets that I assume you're using on them 'cause you're a pro leave you with a pretty large chunk of RAM to work with.
I sorta doubt there are many games targeting these devices (and, more particularly, the people who still use these devices in 2014) in the first place, and even fewer push their headroom so hard that, if they decide they need what amounts to a scripting language, they can't fit one in off the rack.
If you can point me towards a smartphone, with significant market share and usage numbers in 2014, that doesn't have 128MB of RAM and, like, an 800MHz Adreno or something, I would really like to see it. Because that's my definition of "low end" and I've literally never seen crappier than that. (I'm targeting phones with at least two cores and 768MB of RAM and up, because I'm lazy and I don't think anybody with less would be buying my game; it doesn't fit the profile of titles that make money on such devices. But I do have a couple paperweights around for idle testing.)
My market research has shown that when you get below those specs you start getting into dumbphone/J2ME platforms, where you have other constraining factors.
It is not always about whether you can do it at all. Sometimes it's about the power budget: battery drain, etc. Run your Lua engine and you have less budget for particle effects. Run all the things and your game drains the battery in 10 minutes.
It's perfectly legitimate not to care. Writing a from-scratch bytecode VM is a fairly extreme last-resort solution.
But on the other hand, the bytecode VM, appropriately scoped, could end up being easier than embedding Lua. That is, if what you're doing is only barely more than loading and using data, not even quite crossing the boundary into Turing completeness.
The power delta between AngelScript or Lua and your own interpreter is gonna be pretty small, though. It's a cost you pay once (in a modern implementation) to interpret into... a bytecode VM. And you totally might be a wizard beyond my ken here, but I strongly doubt it'll be easier to implement a bytecode VM of your own than calling, like, five functions. =) I'm not denying that there are cases where such a thing is valuable, don't get me wrong, but don't reinvent what you don't have to. And it probably should not be the first thing you reach for in your toolbox. Or the second.
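For the record, those few calls look roughly like this with Lua's C API (the script name is made up):

    #include <cstdio>
    #include <lua.hpp>  // Lua's official C++ header

    int main() {
      lua_State* L = luaL_newstate();  // create the VM
      luaL_openlibs(L);                // load the standard libraries
      if (luaL_dofile(L, "enemy.lua")) {  // compile and run a script file
        fprintf(stderr, "%s\n", lua_tostring(L, -1));  // error is on stack top
      }
      lua_close(L);
    }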
You're probably right and it's totally irrelevant in this day and age with cheap memory and cpu power.
I can't help but think back fondly on the bytecode VM that "Out of This World" used to draw vector graphics, and think "That. is. neat."
Or the power and flexibility the SCUMM VM afforded story writers.
It's fetishistic and unrealistic, I suppose, to think it would be cool to do some amazing game on some extremely primitive, slow, low-power, cheap machine that anyone could buy for like $10. The whole fantasy usually ends when you realise that the cost of a display is well beyond the cost of any actual computer hardware you might consider.
Angry Birds is not a terribly involved or demanding game, and the hard part, 2D physics simulation, is handled in C by the Box2D library anyway. It requires no real-time control; you just aim the bird and watch the reaction. You could halve or quarter the framerate and add enormous interframe jitter and the game would play exactly the same. If you tried to do the same thing with a demanding twitch action game, it would quickly become unplayable.
Lua's been fast enough to do what you describe for years on gear much slower than even bottom-end smartphones, but if that's too iffy for you, AngelScript is a sub-megabyte runtime when run on Android via the NDK. It's historically been performant enough for use on the PS2 (with its ballin' 300MHz MIPS CPU and 32MB of RAM). It's statically typed, has a very well-defined garbage collector you can pretty easily avoid ever invoking once, and as a simple transform language generates very few (possibly even zero? I don't remember) heap allocations. It will also take literally ten minutes to set up for the sort of use case we're talking about here. (If you want to bind a big honking C++ library to it, then things get interesting, but we aren't talking about that.)
You still write your perf-sensitive code in C++, of course. Just like if you embed Lua. Because you aren't dumb.
munificent hit on the one use case where this makes a ton of sense to me--when you're in a basically-embedded environment; he was talking about a platform with four megabytes of RAM--but that hasn't been a serious use case in games on popular platforms for a while now. Which is why I'm confused that it's in a book that seems to basically be for novices, at least without some "you should rethink doing this unless" asterisks.
AngelScript seems relevant to my interests, I'll have to check it out.
I'm sure Lua is "good enough" for a lot of tasks in most games on most hardware, I was just trying to hint at scenarios where the GC overhead could be painful enough to warrant a custom approach.
>but that hasn't been a serious use case in games on popular platforms for a while now.
The DS, with its 4 megabytes of RAM, was still a relevant platform not that long ago. The 3DS doesn't have a whole lot of breathing room either: 128 megabytes of RAM, and who knows how much of that is available to the game rather than reserved for the OS.
>Which is why I'm confused why it's in a book that seems to basically be for novices, at least without some "you should rethink doing this unless" asterisks.
I haven't read too much of this book, but I got the impression that it was for programmers of other disciplines that wanted the skinny on game development practices, not for total novices. Somewhere between "Game Programming for Teens" and "Game Engine Architecture." This chapter doesn't seem that out of place to me.
> 128 megabytes of RAM, who knows how much of which is available to the game and not reserved for the OS
The OS reserves 32MB for itself, which leaves you with a bigger working set than a bottom-shelf Android phone. =) I get what you're saying, but I don't think anybody's going back to the halcyon days of fewer megabytes than I've got fingers.
> I got the impression that it was for programmers of other disciplines
It felt like something I'd recommend to a CS student, tbh. Maybe that's just me.
>The OS reserves 32MB for itself, which leaves you with a bigger working set than a bottom-shelf Android phone. =) I get what you're saying, but I don't think anybody's going back to the halcyon days of fewer megabytes than I've got fingers.
Ok, 96 megabytes. When you set aside RAM for everything else in a 3D game (the main binary, sound/mesh/texture caches, etc) there's not a whole lot of space left for a scripting language's heap, especially when you consider that most garbage collectors require that the heap be significantly larger than the working set in order to achieve reasonable performance. Again, not saying Lua wouldn't work just fine (any 3DS devs in the audience that would care to comment?), just that it's by no means impossible to bump against that limit. For comparison, here are the slides to a 2010 presentation that demonstrates how devs were having real performance problems with (PUC-Rio, not Mike Pall) Lua on the 360 and PS3:
I think--not sure, I haven't tried--that I'd be a lot less worried about the RAM budget on that phone, actually. That phone almost certainly doesn't have the GPU for a game with nontrivial models/textures. Just as an example, I don't think you could run even a downscaled, super-optimized, everything-in-C++ version of the Angry Bots demo that comes with Unity (it's on Google Play if you want to check it out) without stripping out so much actual functionality that you'd miss the point of the game; the GPU would give up the ghost. So on low-end hardware like this you're basically talking about fairly lightweight 2D games, because it can't push anything better, and I'm not sure I'd really worry about big honkin' meshes or textures there. Is that unfair? (My own libgdx game doesn't use more than 60MB of textures at its peak, and I'm doing a lot more stuff in my JRPG than I'd think almost anybody would try to do on that sort of bottom-of-the-barrel hardware.)
Also, yeah, maybe I'm overshooting the average programmer.
I think the point is that you don't generally need something with the extreme power and flexibility of Lua to handle the scenarios described in comments like yours. A simple bytecode with 5 or 6 instructions should do. And it'll be performant enough to run on a Casio calculator watch at full speed!
I used to work as a programmer in a game company shipping on consoles + PC. The first time we embedded a scripting language in our engine for the level designers to use, they surprised the whole team with their newfound 'toy'. This was great and lessened the burden on programmers.
Fast forward to the first milestone (everyone crunching), and the milestone was delayed ... The level designers (not being programmers) had introduced a whole new category of bugs. Compound that with the fact that this scripting language had a very, very primitive debugger ... hell ...
I didn't imply the scripting engine was to blame. Its unchecked use and the lack of tools for debugging it were mostly to blame ... of course, crunching amplified all these problems.