I was a college student in India in 2001, discouraged by my professors from taking a graphics elective because "there's no career in graphics". Twenty years later I can happily reflect on a dream career in graphics (EA, Pixar and my own game studio). The second edition of this book and its "make pretty pictures with a computer" message started it all off!
Heh, I had a similar story, just a decade later. I wasn't allowed to take the graphics elective at university even though it was part of the official syllabus, the reason being that it was difficult to get a good grade in the subject in the finals.
Indian society and the love for meaningless numbers on a sheet of paper, name a better duo.
I think this is just a sign of shifting fads. When I wanted to pick a research area in 2007, multiple professors and students told me to ignore databases. The understanding was that the RDBMSes of the day had solved all the problems. There were minor rumblings of the NoSQL movement back then, but no one was sure whether it would be impactful and/or last long. At any rate, not many people expected the field would see so much activity :-)
Another example I can think of is speech recognition systems (ASRs). Around 2007-08, again, I was told by researchers in the area that they were not coming to personal devices anytime soon, especially for Indian accents. A few years later you could dictate messages to Hangouts on an Android phone :-)
> A few years later you could dictate messages to hangouts on an Android phone :-)
... can you? It's 2021, and even on a top-of-the-line phone I'm definitely unable to get speech recognition to understand sentences to an acceptable level. A few words work... kinda. Hell, even typing has become harder than a few years ago, when it used to be a simple dictionary: now every other message the automatic correction is completely wrong, which is very frustrating.
> there's no career in graphics

sounds ridiculous today.
I'd wager that for these people, even today, "graphics" is not a career. A career, in their minds, is something where you look like a stock-photo executive: you write code for a couple of years and move on to management.
> A career in these people's mind is something where you look like a stock photo executive, you'd do code for a couple years and move on to management.
I shudder when I think about this outcome, despite its inevitability.
I’d be interested to hear from people currently working with OpenGL in any capacity: what does all this mean in 2021 with respect to Vulkan? By way of example, Wayland is ~12 years old, and in my (BSD) world it's just starting to become interesting to me.
Vulkan is really hard and is not really a "true" replacement for OpenGL. It's too low-level, and not a good API for people making games to interact with. Sure, big engines and AAA games are definitely going to implement it, but it will always sit alongside D3D and Metal.
But IMO Vulkan is way too much for most indies, small studios and solo developers to work with. I doubt we'll see many games targeting Vulkan as "the" multi-platform API the way we've seen with OpenGL over the last 15 years. Anyone having to target Vulkan will have to use Unreal/Unity3D, or maybe someone will develop a wrapper that converts OpenGL calls into Vulkan calls.
Honestly I see Vulkan as akin to regulatory capture. It will only enrich the big players by forcing small companies to use big engines.
EDIT: To give a concrete example, we (a part-time indie team) have an unfinished renderer in Vulkan, and it's already almost 5x larger than the other (finished) ones. Using Vulkan effectively, even in small games, means you need a substantially large abstraction layer. Sure, that doesn't matter for AAA titles or commercial engines, but not everyone is on such large projects.
I’m working with OpenGL in an indie game across Windows/Mac/Linux.
I feel like in the games space, if you’re not going through Unity or Unreal, it still makes a lot of sense to target OpenGL for most games rather than going to Vulkan. I definitely still get a bunch of error reports from folks whose computers don’t have OpenGL drivers capable of a context newer than 2.1 on Windows, and I expect that sort of problem would be even more common for Vulkan. Mac hasn’t been a problem thus far, as long as you don’t need advanced 4.x features (I’m just using a 3.3 core context).
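For anyone who hasn't done it, requesting that kind of context is only a few lines with GLFW. This is a minimal sketch (the window title, sizes and fallback handling are just illustrative, and it obviously needs a machine with a display and GL drivers to actually run):

```c
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit()) return 1;

    /* Ask for a 3.3 core profile context, like the comment above. */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    /* macOS only hands out 3.2+ core contexts with this hint set. */
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);

    GLFWwindow *win = glfwCreateWindow(640, 480, "demo", NULL, NULL);
    if (!win) {
        /* On the kind of old drivers mentioned above, this is where
           you'd retry with lower version hints (e.g. 2.1). */
        fprintf(stderr, "3.3 core context unavailable\n");
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);
    printf("GL version: %s\n", glGetString(GL_VERSION));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

The nice part is that `glfwCreateWindow` simply fails when the driver can't satisfy the hints, so the fallback logic stays in one place.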
While the ‘average’ hardware shown in the Steam hardware surveys is pretty good (better than all of my dev machines, in fact), being the support contact for my game has definitely demonstrated that the standard deviation on that hardware goes down as well as up, and sometimes by quite a lot!
I’d love to play with Vulkan someday, but I worry about support, especially on really old hardware.
(Other thoughts: in indie spaces, it probably makes a lot more sense to use Unity or Unreal or Godot rather than rolling your own. And in AAA spaces, you’d be silly not to go Vulkan instead of OpenGL. It’s really only in my unusual “I happen to have already rolled my own engine and I’m using it to make an indie game” case where OpenGL makes a lot of sense to me)
I have a laptop I bought around late 2012 with a GeForce 660M. It can run a lot of stuff, including some recent games at very low settings, but Vulkan is not supported at all since Nvidia stopped releasing new drivers for it.
Similarly, I have a GPD Win 1; it can only run lightweight games because it has an Atom CPU with integrated graphics, but for many smaller indie games (including some 3D games) that hardware should be enough. Under Windows, however, there is no Vulkan support (the hardware can support it, but Intel hasn't released any drivers). There is support under Linux, but then a lot of other stuff doesn't work.
In my new engine I was considering going with Vulkan or sticking with OpenGL (with which I am already comfortable anyway), and I even made a binding generator for Free Pascal, but then I noticed that aside from my main PC, no other computer in my house supports Vulkan - and I'd like to have at least an Nvidia and an AMD GPU to test on. So I decided to stick with OpenGL for the time being, as I really want to be able to run it on my portable PCs. I might consider a Vulkan renderer in the future, but only after Vulkan is available even on whatever counts as old low-end hardware at the time.
You just have to look at Android, where Google had to make Vulkan support compulsory on Android 10 because no one was adopting it, and it still looks like this after two OS releases:
With some libs and tutorials, setting up a modern OpenGL pipeline has become reasonably easy, and extensions give you access to some state-of-the-art features, sometimes even before DirectX (64-bit atomics only made it into DirectX in December, but I was already using them in OpenGL in 2018). I don't feel like switching to Vulkan, mainly because of all the reviews from others who said it's incredibly cumbersome to use, and after looking at certain aspects of Vulkan, I can see why. I don't really want to take the effort of diving into another badly designed graphics API at this point.
I do, however, very much look forward to WebGPU, which seems heavily based on modern Vulkan, Metal and DirectX, but is comparatively easy to set up and work with. The major downside is the lack of cutting-edge features, because WebGPU targets the lowest common denominator (smartphones, and also the lowest feature set that all backends (DirectX, Vulkan, Metal) offer), but I've got the impression that it's going to be what OpenGL and Vulkan were supposed to be but never really managed to become: a platform-independent modern graphics API.
OpenGL is like learning to ride a bike. Vulkan is like flying a fighter jet. Lots more work and not much benefit if you're just trying to go a mile or two.
I do some projects that require WebGL, and a good understanding of OpenGL makes life so much easier. WebGL, from what I understand, is based on OpenGL ES 2.0, and knowing how to write custom vertex and fragment shaders is important for my work.
Most DIYers and commercial indies (and companies like Blizzard, Capcom, Namco, Square Enix, Rare, etc., for that matter) are using third-party engines that handle the rendering for them. OpenGL will continue to be relevant as long as it's the API of choice for at least some subset of use cases in those engines. I would be very surprised to see any game company that uses cutting-edge graphics to sell its games or the hardware they run on (Rockstar, Guerilla, etc.) use it for a new project.
Doing raw OpenGL as an indie instead of using an engine these days is unusual and likely ill-advised, unless a big part of the point of the project is to learn more about low-level graphics programming, in which case you're not so much making games as studying programming through gamedev. It's like writing your own UI framework for your web app instead of using React / Angular / Vue. It's true that there are use cases that existing game engines don't cover perfectly, but the cost/benefit ratio just isn't going to be there for a custom solution on most projects.
Because everyone is busy targeting DirectX 11/12, NVN, Metal and LibGNMX.
Vulkan is mostly an Android 10+ thing in games development; even on the Switch there are other APIs to choose from, with Unity having a big piece of the pie - 50% of titles, according to the company's numbers in the 2020 Unite keynote.
Job reqs? Are you saying lots of people are looking for Vulkan jobs but can't be hired because there's no need, or that there are lots of open positions requiring Vulkan skills?
Man, HN is making me feel old lately. I bought the second edition of this when it was the current one - Amazon tells me it's almost 20 years old now. A few days ago I was reading a technical deep dive into the PS2 CPU/GPU as if it were historical tech. I remember when that was shiny and new, too.
This looks rather dated; the most recent version linked is from 2013, covering the API for OpenGL 4.3. I don't know much about graphics tech, yet I'd assume some of the best practices have changed with better hardware in the past 7 years.
The latest edition (7th ed., 2015) actually uses 4.5. AFAIK OpenGL hasn't changed too much since 3.3, certainly not to the point of being irrelevant for learning.
For me personally, DSA was an improvement, but not enough. I wish NV_command_list had made it to core one day; among other things, it introduced the concept of a "state object" which captures all the pipeline state and can be easily saved and restored.
I have moved on, however, toward the DX12/Vulkan-like APIs. I'm currently using WebGPU as a stop-gap; it gives me an easy-to-use API (compared to Vulkan) with the strengths of those explicit APIs. I think in 2021, even when people are writing OpenGL engines, they are already using the "RenderPipeline/RenderPass" abstraction to fit the new APIs better.
OpenGL 4.5 has named objects, which makes working with OpenGL feel less archaic. Also, 3.3 doesn't have tessellation shaders. But for some reason OpenGL 3.3 is still being called "modern OpenGL".
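For a rough picture of what named objects (direct state access) change in practice, here's a before/after sketch of creating and filling a vertex buffer. It assumes a 4.5 context is current and the entry points have been loaded (via glad or a similar loader); the function and variable names are just illustrative:

```c
#include <glad/glad.h>  /* any loader exposing GL 4.5 entry points */

static const float verts[] = { -0.5f, -0.5f,  0.5f, -0.5f,  0.0f, 0.5f };

void create_buffer_classic(GLuint *vbo) {
    /* Pre-DSA: an object must be bound to a target before it can be
       modified, which clobbers whatever was bound there before. */
    glGenBuffers(1, vbo);
    glBindBuffer(GL_ARRAY_BUFFER, *vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
}

void create_buffer_dsa(GLuint *vbo) {
    /* GL 4.5 named objects: the buffer is addressed directly by name,
       no bind point involved and no global binding state disturbed. */
    glCreateBuffers(1, vbo);
    glNamedBufferData(*vbo, sizeof(verts), verts, GL_STATIC_DRAW);
}
```

The DSA version is what makes state management feel less archaic: you can set up objects without a bind/modify/unbind dance around every call.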
For a while, OpenGL 3.3 was the only modern version you could count on finding everywhere, even on crappy hardware, and actually working without major issues.
A long time ago I read a second-hand copy of the red book, back when it covered the 1.x series. I'm starting to have some time to learn graphics programming again, and I remember the SuperBible and the Red Book being good ones.
Ah, the OpenGL blue book that I never got around to buying.
As opposed to the red book (reference guide)[1] and the orange book (shading language reference)[2], of which I got copies of the respective last versions before they merged into the newer "red-ish book" that combines both[3].
Just in case you ever wondered where that one Lego picture in the Windows 3D maze screen saver came from (the cover of the red book).
Has OpenGL gotten any better as an API over the years? Like becoming object-oriented? Or is it still the cryptic and verbose pre-OO-style library it was 20 or 25 years ago?