If you jump to 11:52, you’ll see a neat demo that shows off the engine’s performance: 70,000 spheres (ray traced on the card), 500 lights, at 60 frames per second. Great stuff!
Honestly, I'm not sure whether I should be impressed by the demo they showed. I haven't read the code, so take this with a grain of salt, but that particular demo felt mostly like a graphics card demo: you can get most, if not all, of that scene rendered purely on the card with fancy shaders, without hitting the CPU very much.

I would have been much more impressed by something like a playable Megaman level (they showed a Megaman sprite near the start), since that would spend more of its time in Racket handling physics, collision, AI, and so on, instead of spending the majority of each frame on the graphics card. Hopefully they'll be able to show a demo like that soon (if one doesn't exist already) to really show off what the engine can do from inside Racket.

That said, the whole project is exciting to me, since they're really just trying to explore new ways of writing games. I'm looking forward to what they do in the future.
Right. I don't remember whether I said it in the talk, but that demo saturates the AGP bus, meaning the engine can extract and write scene data at about 1.5 to 2 GB/s (at 60 FPS, that's roughly 25 to 33 MB of scene data per frame).
It's possible to put a lot more static content in each frame by uploading the vertex data to the card just once. That feature is on my to-do list.
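For anyone curious, here's a minimal sketch of what that optimization looks like at the OpenGL level. It's plain C using GLEW for function loading, purely illustrative, and not Pict3D's actual code: the vertex data goes to the card once with GL_STATIC_DRAW, and each frame just draws from the buffer that's already there.

    /* Illustrative sketch, not Pict3D code: upload static vertex data
       once, then draw from it every frame without re-sending it. */
    #include <stddef.h>
    #include <GL/glew.h>  /* assumes GLEW handles GL function loading */

    static GLuint vbo;

    void upload_static_mesh(const float *vertices, size_t n_floats) {
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        /* GL_STATIC_DRAW hints: written once, drawn many times */
        glBufferData(GL_ARRAY_BUFFER, n_floats * sizeof(float),
                     vertices, GL_STATIC_DRAW);
    }

    void draw_static_mesh(GLsizei n_vertices) {
        /* A core context would also need a VAO and a shader program;
           both are omitted here for brevity. */
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void *)0);
        glDrawArrays(GL_TRIANGLES, 0, n_vertices);  /* no per-frame upload */
    }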
The talk says it doesn't work on OS X because of which OpenGL versions are supported there. Neil, is that just because you're less familiar with the particular version OS X supports, or are there critical features of your Pict3D engine that won't work in OS X's version?
More precisely, the problem comes from how certain Linux libraries and OS X have responded to OpenGL's new profile system.
An OpenGL context is roughly a 3D drawing area plus its associated behind-the-scenes data. Through changes made in OpenGL 3.0 through 3.2, it became possible to request a specific OpenGL version for each context. There are two kinds of contexts: "compatibility" contexts, which expose all OpenGL features up to the version obtained, and "core" contexts, which drop deprecated features. For example, starting with 3.1, a core context won't draw quads - you have to split each one into two triangles.
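For concreteness, here's roughly what requesting a version and profile looks like. This sketch uses SDL2 purely for illustration - an assumption on my part, since Racket actually goes through GTK, as described next:

    /* Illustrative SDL2 sketch (not how Racket does it): request an
       OpenGL 3.2 *core* context before creating the window.
       Assumes SDL_Init(SDL_INIT_VIDEO) has already been called. */
    #include <SDL2/SDL.h>

    SDL_Window *make_core_window(void) {
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
        /* Core profile: deprecated features such as GL_QUADS are gone.
           SDL_GL_CONTEXT_PROFILE_COMPATIBILITY requests the other kind. */
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK,
                            SDL_GL_CONTEXT_PROFILE_CORE);
        SDL_Window *w = SDL_CreateWindow("demo",
                                         SDL_WINDOWPOS_CENTERED,
                                         SDL_WINDOWPOS_CENTERED,
                                         800, 600, SDL_WINDOW_OPENGL);
        SDL_GL_CreateContext(w);  /* NULL if 3.2 core isn't available */
        return w;
    }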
On Linux, Racket uses libgtkgl to get OpenGL contexts associated with GTK widgets. This library returns only compatibility contexts, and nothing newer than version 3.0.
On OS X, it's only possible to get an OpenGL 2.1 context or an OpenGL 3.2 or 3.3 core context.
There are a few ways to fix this.
1. The libgtkgl developers add support for core contexts and versions later than 3.0. I doubt this will happen. They're dragging their feet.
2. OS X adds support for compatibility contexts. Again, not hopeful.
3. We find some other way to get an OpenGL context on Linux that's associated with a GTK widget. There aren't many options.
4. I develop using OpenGL 2.1. I lose a lot of shading language features that way, though, and probably a lot of performance.
5. I develop multiple rendering paths (see the sketch after this list). I hate this idea, but it may be what I have to do.
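Here's a rough sketch of what that last option might look like: probe the version of the context you actually got, then dispatch to one of several paths. The render_gl21 and render_gl32core functions are hypothetical stand-ins, not real Pict3D code:

    /* Hypothetical sketch of option 5: pick a rendering path at startup
       based on the context version actually obtained. Assumes a current
       OpenGL context. */
    #include <stdio.h>
    #include <GL/gl.h>

    typedef void (*render_path)(void);
    void render_gl21(void);      /* GLSL 1.20, compatibility features OK */
    void render_gl32core(void);  /* GLSL 1.50, core profile only */

    render_path choose_render_path(void) {
        /* GL_VERSION starts with "major.minor", e.g. "3.2 ..." or "2.1 ..." */
        const char *ver = (const char *)glGetString(GL_VERSION);
        int major = 0, minor = 0;
        sscanf(ver, "%d.%d", &major, &minor);
        return (major >= 3) ? render_gl32core : render_gl21;
    }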
The talk (timestamped to the demo at 11:52): http://www.youtube.com/watch?v=t3xdv4UP9-U&t=11m52s
The source code is available on GitHub:
https://github.com/ntoronto/pict3d