Real pixel coding (2011) (iquilezles.org)
188 points by dshankar on March 5, 2013 | 79 comments



Back when I was coding on the Sharp MZ-80K (http://blog.jgc.org/2009/08/in-which-i-switch-on-30-year-old...) we used to write code by typing it directly onto the screen. The screen was just directly memory mapped with each character on screen being a single byte. And the machine had the complete character set available from its (odd) keyboard.

The program would be written in assembly, converted by hand to hex and then each hex byte found in the character map and typed in. It was then possible to CALL the start of screen memory to run the program.


"we used to write code by typing it directly onto the screen"

I once knew a guy who had programmed by drilling holes in leather with a hand drill. :-)

[Some ancient process control mainframe that booted originally from a paper tape - over the decades of use in a steel mill the paper tape had long since worn out and been replaced by a sturdy leather belt - he had to modify the boot code, hence the drill]


A useful debugging technique on the VZ200 was to locate the stack in video memory. The display would be a mass of snow when the program was running, since the CPU activity left hardly any cycles for the video controller to access screen RAM, but when the program crashed its state could be read directly off the screen.


Nice. I've used this type of technique in other situations where very high-speed debug output is needed.

At one job we had a machine code ring buffer implementation that could be called via a software interrupt (INT 3 I believe with a single character in AL) to write a single byte of debug information. This could be used when tracing a program's execution. When the program crashed the ring buffer could be examined using a system debugger (typically Soft-ICE) to see what state it was in. Very useful when it was critical to get tracing information in a way that had minimal timing impact.
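
A minimal sketch of that technique in C, for the curious (the names and the 256-byte size are my own choices; the original sat behind INT 3 in machine code):

    /* Single-byte trace ring buffer. TRACE_SIZE is a power of two,
       so the index wraps with a cheap mask instead of a branch. */
    #include <stdio.h>

    #define TRACE_SIZE 256

    static unsigned char trace_buf[TRACE_SIZE];
    static unsigned int  trace_head;

    /* The original was reached via a software interrupt with the
       character in AL; a plain call also keeps timing impact small. */
    static void trace(unsigned char c)
    {
        trace_buf[trace_head++ & (TRACE_SIZE - 1)] = c;
    }

    int main(void)
    {
        trace('A');   /* sprinkle these along the code path */
        trace('B');
        trace('C');
        /* After a crash, dump trace_buf in a debugger; once wrapped,
           the oldest byte sits at trace_head & (TRACE_SIZE - 1). */
        printf("%.3s\n", trace_buf);   /* prints "ABC" */
        return 0;
    }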


I'm using this exact technique in my daily work right now! :-). It's an embedded DSP doing real-time signal processing for digital radio, and the technique you describe allows the execution flow leading up to a crash to be determined, without the timing jitter introduced by the "real" debugger.


Something similar was also used by C64 cracker groups adding their intros to cracked games. Before the intro ran, the user saw a few seconds of random characters on screen - most likely a buffer for a decompression algorithm. In this case it wasn't done to ease debugging - it's just that the game itself occupied almost all available memory and the only area left that was big enough was the screen buffer. Plus, the resulting effect looked Matrix-like :)


There's this esoteric programming language: http://www.dangermouse.net/esoteric/piet.html

where the source code is an image resembling the works of Piet Mondrian.

And there's ColorForth where the colors replace some of the punctuation normally used in Forth: http://www.colorforth.com/cf.htm -- this (seems?) more serious.


"Kill two births with one stone" is the most unfortunate malapropism I've ever seen


And yet somehow appropriate to the unusualness of this technique. And, from a Google perspective, it's likely that you'll be able to reference this post with that phrase for all time.


Sure, if attracting anti-abortionists to your website is a good thing...


Doesn't anyone proofread their blog posts anymore...


He spends 70% of the video just manually copying bytes from a pre-written, compiled, and likely compressed source (off screen to the right). It's just dressed up for the general public, who find this stuff impressive because they don't really understand what's going on.

Not to downplay it here, I'm just describing what's going on. This is basically a small nice looking demo.


No compression of any type involved. He writes machine code and saves it to a .com file, which can be directly executed. The first thing it does is activate mode 13h, which gives direct access to the screen for drawing. The surprising thing here is that both of these hacks are still supported by various versions of Windows. They really used to take backwards compatibility seriously.


It took me several rereads to understand that what the author was referring to in the first few lines is a demo[1], say a 4K intro, where a few bytes of assembly language can create interesting graphics, often accompanied by music, often conveying a concept.

[1]: http://en.wikipedia.org/wiki/Demoscene


For reference, this is one of Iñigo's finest creations: http://www.youtube.com/watch?v=jB0vBmiTr6o



That was an excellent read!


In 4K? That's obscene! There's a link in the video notes to a description of how it was created, which makes for an interesting read: http://www.iquilezles.org/www/material/function2009/function...


iq is very well known in the demoscene. He's one of the greatest and most creative coders that there's ever been.


wow what an awesome read. the shadow idea is brilliant, so simple and so effective.


A similar demo written by iq, https://www.shadertoy.com/view/MdX3Rr , running in the browser (WebGL). ShaderToy.com is an awesome site. It was posted to HN a week ago and got very few upvotes (http://news.ycombinator.com/item?id=5280380).


I remember being impressed by a friend at University who would do:-

copy CON: reboot.com

and then enter 8 or 9 bytes (some via ALT+keypad) all from memory.

(There are ways to do it in just 2 bytes I believe, this isn't about reboot.com golf anyway...)


See also the EICAR Standard Anti-Virus Test File[0], which is a valid MS-DOS .com file composed of only printable ASCII characters. Granted, its output isn't nearly as pretty as the one in the video, but it's still a neat trick.

[0]: https://en.wikipedia.org/wiki/EICAR_test_file


0xCD19, which is INT 19h (call the bootstrap loader).
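
So building it is a two-byte write. A sketch (the filename is mine; the two bytes are the whole program, which DOS loads at offset 100h and executes):

    /* Writes the two bytes CD 19 (INT 19h, the bootstrap loader)
       to reboot.com. */
    #include <stdio.h>

    int main(void)
    {
        const unsigned char code[] = { 0xCD, 0x19 };  /* int 0x19 */
        FILE *f = fopen("reboot.com", "wb");
        if (!f) return 1;
        fwrite(code, 1, sizeof code, f);
        fclose(f);
        return 0;
    }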


Wow, 32×32 doesn't sound like much to me, but it's actually 1024 pixels. If you had 4 bytes per pixel, that's 4K - you can do quite a lot with that, even in Java: http://www.java4k.com


Finally a reason to have the Java Plugin installed, and now for security reasons it's not. Oh, the humanity!


His image was 9x9 pixels, not 32x32. So 324 bytes.


A great reminder that code is data, and data code.

Now I want to see how all my programs look as bitmapped images... brb.
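
A quick sketch of one way to do that: dump the program's bytes as a binary PGM, one grayscale pixel per byte (the name bin2pgm and the 256-pixel width are arbitrary choices of mine):

    /* Dumps any file as a binary PGM image, one byte per pixel.
       Usage: ./bin2pgm /bin/ls > ls.pgm */
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) return 1;
        FILE *f = fopen(argv[1], "rb");
        if (!f) return 1;
        fseek(f, 0, SEEK_END);
        long n = ftell(f);
        rewind(f);
        int  w = 256;
        long h = (n + w - 1) / w;
        printf("P5\n%d %ld\n255\n", w, h);    /* PGM header */
        int c;
        long written = 0;
        while ((c = fgetc(f)) != EOF) { putchar(c); written++; }
        while (written++ < w * h) putchar(0); /* pad the last row */
        fclose(f);
        return 0;
    }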


Hopefully it will look mostly like random noise, without many patterns. That would indicate compressed information without much repetition. If you see lots of regular patterns, perhaps that means refactoring is in order to maintain DRYness.

Hmm, curious if this visualization would actually work to spot duplicated code.


Hopefully it won't, actually. Most languages are not designed for compactness of source code, and most forms of factoring will produce fairly noticeable patterns; e.g. lots of function declarations.


True. One would have to first reduce the natural duplication/patterns that come with the language (i.e. function declarations, etc.) so that only the meaningful user code is left, before such analysis would be useful.


Back in the days of the A500, you could point the video viewport at any part of memory -- including currently executing code. Watching lz decompress files was mesmerizing.


Yeah, you're describing Shannon entropy: http://en.wikipedia.org/wiki/Entropy_(information_theory)

So that means DRY is "try to maximise entropy". Gzipped files should also look like random noise, for the same reasons.
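
A little sketch of that measurement over a file's byte histogram (bits per byte; build with -lm):

    /* Shannon entropy of a file's byte distribution:
       H = -sum over i of p_i * log2(p_i), in bits per byte.
       Values near 8.0 mean noise-like data (e.g. gzip output);
       plain source code scores noticeably lower. */
    #include <math.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) return 1;
        FILE *f = fopen(argv[1], "rb");
        if (!f) return 1;
        long count[256] = {0}, total = 0;
        int c;
        while ((c = fgetc(f)) != EOF) { count[c]++; total++; }
        fclose(f);
        if (!total) return 1;
        double h = 0.0;
        for (int i = 0; i < 256; i++) {
            if (!count[i]) continue;
            double p = (double)count[i] / total;
            h -= p * log2(p);
        }
        printf("%.3f bits/byte\n", h);
        return 0;
    }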



It's not Piet, though.

But granted, initially I assumed the article would be about that, but the colours give it away that it's not. Well, that, and the description of what he's doing.


You're right. I was disappointed when I found out what it was, but I didn't come to complain about it. I just thought people might like to see something a little more in line with the title.


Yeah, that's the first thing that came to my mind before reading the article.

http://en.wikipedia.org/wiki/Piet_(programming_language)


"during the last 5 years i’ve been explaining people that it takes less space to write a 3 minutes HD animated video and music clip than than it takes to store a 32×32 pixel icon."

This sounds so wrong. This sentence makes it sound as if any 3-minute HD video has a smaller filesize than a 32x32 pixel icon. Of course people won't believe you if you use that wording.

I assume what you mean to say is, it is possible to create SOME 3 minute video with music that is smaller than SOME 32x32 pixel icon.

E.g. if you store the 32x32 pixel icon in some XML format that describes every single pixel color component, and the video is plain white with some bleep music :p


Except for the next fucking sentence, where he explains it's about algorithmic generation, not arbitrary compression. He's assuming his readers have a brain. That might be a false assumption.


Couldn't you just apply the same techniques to generate 1024 pixels in less than the filesize it takes to generate a video?

If he's having trouble explaining it to people, it's because he's not wording it correctly. And if he's wording it correctly he should use the same, correct, understandable wording to tell us.


Yeah. He's not creating a video, he's creating a program that displays a video. If you create a 320x240 program that displays an animation and call it a video, you might as well call every frame an array of 300 16x16 icons.

The point is pretty cool though, algorithmic compression is pretty awesome, especially when you don't have to conform to any particular end result. A fresh Minecraft world is a 240k jar file used as a decompression tool, and the data is a single integer (the seed). In the end the result is some 9 million times the surface area of the Earth.

However, start modifying that world and the data needed to store it balloons. Make a world that accurately mimics the surface of just Iceland and you're going to be looking at a pretty large file.

Accept a world that's a bit of simplex noise and Brownian motion and let yourself be awed. Make a video of rotating around a fractal made from a simple formula - that looks cool. Try and compress a video of your daughter's first steps and you'll get something significantly larger than an icon. Especially an icon that's at least run-length encoded.


There's no real computer science distinction between a program which displays a video and a video. You can always create a video format which can (optionally) read some bytecode and run it in a virtual machine. There are Turing-complete font formats, you know.


I find it correct and understandable. That's not arrogance. This is basic CS. Reference Kolmogorov complexity. The measure of information relative to a generative basis is the length of the shortest unique representation of a generative description within that basis.
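
For reference, the textbook formulation of that (my paraphrase, not the parent's wording): the Kolmogorov complexity of a string x relative to a universal machine U is

    K_U(x) = \min \{\, |p| \;:\; U(p) = x \,\}

that is, the length of the shortest program p that makes U output x. The "generative basis" above is the choice of U.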


The comment you just replied to was pointing out that you could just as easily find a 32x32 pixel icon with Kolmogorov complexity lower than that of the 3 minute "HD" video. And they did so in a way without having to bring CS theory into what was previously a marvelously straightforward discussion.


This discussion is utterly straightforward, basic CS. I am a self-educated high-school dropout. I am intolerant of rationalizations of willful ignorance. Take that for what you will, but clearly one of us can do something the other can't, and for me that is telling.


Here's a playable, graphical first-person shooter. It's 96 KB.

(https://en.wikipedia.org/wiki/.kkrieger)


Looks like someone has a bad case of the Mondays. On Tuesday.


Yeah, you're right, that was grouchy. My bad. I get frustrated when I see someone post something I find interesting, and then the top HN comment is just someone ignorantly tearing it down.


But couldn't you use algorithmic generation to hold a 32x32 image as well?


There's no need for XML, nor for all-white video. A 32x32 icon of some color noise, stored as PNG, GIF, or BMP, takes about 2-3 KB. Writing an algorithm to procedurally generate a vaguely interesting video and music can easily be done in less than that. Check out the demoscene for numerous examples. Naturally not all videos can be generated this way, but his statement is still true.


This is a great example of the point the guy is making:

https://www.youtube.com/watch?v=36BPql6Nl_U

That's 128 bytes. You could encode that as a 43-pixel RGB888 image. That's less than a 7x7 pixel icon!

Now as Jare mentioned in another comment, this video

http://www.youtube.com/watch?v=jB0vBmiTr6o

contains fewer bytes than a basic 32-bit 32x32 pixel icon. The whole essence of this demoscene stuff is to work within these constraints (256 bytes, 1024 bytes, 4096 bytes, 64 kbytes, ...) to produce this kind of art. Custom virtual machines, on-the-fly decompression algorithms and methods, carefully crafted data, and ingenious re-use of data are the key. It's like black magic.


Also, English isn't his (?first? / only) language.


I've always wanted to do this, but never got around to it. Never thought of using the .raw format without a header, which probably makes it much easier. It still looks very impressive though, nice job!


I knew this was iq's even before looking at the domain name :) Great Spanish talent.

Don't forget to check out his productions with RGBA! http://pouet.net/groups.php?which=697


Pretty awesome. Correct me if I'm wrong, but I'm fairly certain this won't work in 64-bit Windows 7, as 16-bit COM applications are no longer supported. I've tried to go back and get my simple COM programs from Assembly class at college to work, but no luck. It was just really simple to do a very basic program using simple DOS function calls. Anyone know a simple way to do basic assembly programming on 64-bit Windows? I tried but was unsuccessful in getting a linker working.


You could presumably run it in DOSBox.


I'm sure this is very possible. There are only so many opcodes for an x86 real-mode .com to use, and if you look at the demoscene and see what those guys can do... this is not even that impressive :) Steve Wozniak wrote BASIC for the Apple computer in binary.


This is cool. Could somebody please explain how this works? From what I can guess, the image's 2D array of color values, when saved as a .com executable, becomes machine code that generates graphics? Thanks.


Initially it's a 2D array, but when saved without an image header to give it width and height (and other metadata), it becomes just an array of bytes. A color is encoded in 3 bytes, so with each pixel he can write one, two, or three opcodes. If you google opcode tables you'll find plenty of examples of how assembly instructions are encoded as bytes.

For example (this is 16-bit, but for the sake of demonstration), the byte pair 01 D8 means ADD AX, BX: opcode 01 (ADD r/m16, r16) followed by the ModRM byte D8, binary 11 011 000 (register mode, source BX, destination AX).

[Edit]

Found a cool image showing x86 opcodes. http://i.imgur.com/69Lli.png

And an even cooler one with a Windows exe description (not the DOS .com he used): http://i.imgur.com/tnUca.jpg Look at the right side of the image, where it says x86 Assembly and it colors and explains each byte. His .com was only that part, since .com files don't need headers or imports; it's just code.
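
To make that concrete, here's a sketch in the same spirit, with bytes I hand-assembled myself (not iq's actual program): a C program that writes a tiny DOS .com which sets mode 13h, fills the screen with one color, waits for a key, and returns to DOS. Each byte is exactly the kind of value being picked as a color channel in the video.

    /* Writes fill13h.com (a hypothetical example, not the demo from
       the video). Run the result under DOSBox or real DOS. */
    #include <stdio.h>

    int main(void)
    {
        static const unsigned char code[] = {
            0xB8, 0x13, 0x00,   /* mov ax, 0x0013 ; 320x200, 256 colors */
            0xCD, 0x10,         /* int 0x10       ; BIOS set video mode */
            0xB8, 0x00, 0xA0,   /* mov ax, 0xA000 ; VGA framebuffer seg */
            0x8E, 0xC0,         /* mov es, ax                           */
            0x31, 0xFF,         /* xor di, di                           */
            0xB9, 0x00, 0xFA,   /* mov cx, 64000  ; 320*200 pixels      */
            0xB0, 0x36,         /* mov al, 0x36   ; an arbitrary color  */
            0xF3, 0xAA,         /* rep stosb      ; fill the screen     */
            0x31, 0xC0,         /* xor ax, ax     ; AH=0: wait for key  */
            0xCD, 0x16,         /* int 0x16                             */
            0xC3,               /* ret            ; to PSP:0000 -> exit */
        };
        FILE *f = fopen("fill13h.com", "wb");
        if (!f) return 1;
        fwrite(code, 1, sizeof code, f);
        fclose(f);
        return 0;
    }

Those 24 bytes of machine code are exactly eight 24-bit pixels.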


Thanks for the explanation and images cthackers:) I guess that the tunnel is an infinite loop from JMPs. I think I'll try to make one.


Congratulations, you just gave syntax highlighting a whole new meaning!


A little known company had this same idea in the mid 90s. I believe it languished in obscurity and then died after an unfortunate takeover. The company was Macromedia, the product was Flash.


final vid needs Doctor Who music! hmm... I wonder how simple a MIDI file you could make to play something like that tune... and what that would look like as a RAW bitmap...


You can generate music with algorithms too. I came across this a while ago:

    echo "main(i){for(i=0;;i++)putchar(((i*(i>>8|i>>9)&46&i>>8))^(i&i>>13|i>>6));}" | gcc -x c - && ./a.out | aplay
The generated a.out is 8.5kB for me though.
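
For the curious, the same one-liner unrolled into a readable program (identical formula; pipe the output to aplay as above):

    /* The one-liner above, expanded. Emits unsigned 8-bit samples on
       stdout forever; aplay's defaults (8000 Hz, mono, u8) turn them
       into music. */
    #include <stdio.h>

    int main(void)
    {
        for (int i = 0; ; i++)
            putchar(((i * (i >> 8 | i >> 9) & 46 & i >> 8))
                    ^ (i & i >> 13 | i >> 6));
    }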


That sounds better than most of the pop music out there. Amusing demonstration of ... something!


Demonstration of using very few bytes of data to generate art. The same snippet of code can (very probably) be written as a < 128 byte ELF binary on Linux.


Of course, it gets linked in the standard way. A program with an empty main takes up about the same space. The point is that this snippet could be compiled very compactly with the right tools.


dude, that is awesome!

(although, I will now spend a good few hours working out how on earth you get the 16 bar repeats, with variations, and indeed the kick, from some crazy bitshift magic.)


Going the other way around is probably more interesting...


Absolutely fabulous!


I am only an egg.


mind blown


WOAAOOWW!


Fake, it's not hard to select colors based on hex values you're reading from a printout... But it does impress some people who think he's actually designing the code that way.


You've missed the point of the article. Of course the author is picking from a predefined palette and placing colors in a memorised pattern. This is a theatrical way to demonstrate the main point of the article, which is that:

it takes less space to write a 3 minutes HD animated video and music clip than it takes to store a 32×32 pixel icon

The video very clearly demonstrated that using only a 9x9x24bit image you can encode a program that, when executed, will produce many MBs of data. It's the same concept as x (k)Bytes of data producing an infinitely scrolling procedurally generated terrain.


> the filling of the pixels with carefully chosen colors

See, to me this means "reading the colors from the printout". It's ridiculous that anyone could get the least significant bits of the color right by just eyeballing it, and I don't think the author ever claimed he was doing that.


> It's ridiculous that anyone could get the least significant bits of the color right by just eyeballing it

If you look at the video carefully, you'll see that the color picker displays numeric RGB values and that it has been sped up. Do you think the code would work if he didn't get the values 100% right?


Of course the values have to be right. The colours are picked to build the algorithm.

Off to the right side of the screen (not visible) is probably another image with the colours in that he's picking from.

Maybe it's fake, but it's theoretically possible. I don't have a Windows PC to try it on. He's provided the image, though, so you could try it for yourself.


I'm pretty sure that is not what this is about (showing that one can 'paint' coded animations) but to showcase how powerful algorithms are.

Am I wrong?



