Tutorial on Creating Really Teensy ELF Executables for Linux (muppetlabs.com)
90 points by spudlyo on Oct 11, 2009 | 16 comments


On this topic, does anyone know of a good article that shows how you can profile your binary size? Say, given binary X, object Y takes up 60%, and within object Y, source file Z takes up the most. Or a list of common coding practices in [language] that cause the binary size to increase significantly?


A blog post on using pahole to find the holes/padding the compiler adds to your structs:

http://zecke.blogspot.com/2009/10/what-is-size-of-qlistdata-...
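
To make that concrete, here is a small made-up struct of the kind pahole flags (a sketch assuming a typical x86-64 Linux target; the sizes differ on 32-bit). Compile with -g and point pahole at the binary to see the holes reported:

    /* pahole_demo.c - a hypothetical struct with alignment holes.
     * Build with:  gcc -g pahole_demo.c   then run:  pahole ./a.out
     */
    #include <stdio.h>

    struct padded {
        char flag;   /* 1 byte, followed by a 7-byte hole        */
        long value;  /* 8 bytes, needs 8-byte alignment          */
        char tag;    /* 1 byte, followed by 7 bytes of tail pad  */
    };               /* sizeof == 24 on x86-64                   */

    struct reordered {
        long value;  /* 8 bytes                                  */
        char flag;   /* 1 byte                                   */
        char tag;    /* 1 byte, followed by 6 bytes of tail pad  */
    };               /* sizeof == 16 on x86-64                   */

    int main(void)
    {
        printf("padded:    %zu bytes\n", sizeof(struct padded));
        printf("reordered: %zu bytes\n", sizeof(struct reordered));
        return 0;
    }

Simply ordering the members from largest to smallest removes the holes, which is the kind of fix pahole's output points you toward.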


Try the 'nm' program on your binaries.


Or better, use the 'size' program to see how much space each .o/.obj file takes.

You can also use 'strip' to remove debug information that isn't needed (it's normally appended to the end of the file).


It would have been brilliant if he could have reduced the program size to 42 bytes - the same as the result when you run it, and the answer to everything.


In the postscript he mentions:

[...] Well, actually, it could be made smaller. When Linux starts up a new executable, one of the things it does is zero out the accumulator (as well as most of the other registers). Taking advantage of this fact would have allowed me to remove the xor [...]

Removing the xor reduces the program size to 43 bytes. Closer, but not there yet :-)

However, neither the original nor the 43-byte version runs on my machine [1]. The first is simply "Killed"; the second doesn't even load: "cannot execute binary file".

[1] Linux 2.6.28 i686 (Ubuntu 9.04 on Pentium M)


The original runs fine for me [1], but if you actually read the last bit of the article, you'll see that there's no way to get it below 45 bytes and still convince the kernel to run it. The 45th byte specifies the number of entries in the program header table, and that can't be zero (or missing); the sketch below shows where that byte sits in the header.

[1] Linux 2.6.31 x86_64
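
For anyone who wants to check that against the system headers, a quick sketch using the Elf32_Ehdr definition from <elf.h> (nothing from the article itself):

    /* phnum_offset.c - where does e_phnum live in a 32-bit ELF header?
     * offsetof(Elf32_Ehdr, e_phnum) prints 44, i.e. the field starts at
     * the 45th byte of the file, which is why the file can't end sooner.
     */
    #include <elf.h>
    #include <stddef.h>
    #include <stdio.h>

    int main(void)
    {
        printf("e_phnum: offset %zu, %zu bytes wide\n",
               offsetof(Elf32_Ehdr, e_phnum), sizeof(Elf32_Half));
        return 0;
    }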


Given the end size, some simple padding wouldn't have allowed for that, but I don't think he was trying to behave like a complete nonce.


The equivalent exercise for the Windows PE file format:

http://www.phreedom.org/solar/code/tinype/


Very interesting, although we are (sadly) past the point where squeezing every byte you can get out of your program matters.


I think crazy size optimization like that is a fun exercise, but I completely disagree with your "sadly" there. Avoiding the need to do all this silly micro-optimization means more time to create things that people actually care about and want to use.

Yes, modern software is comparatively huge and bloated, but the less your average programmer has to worry about code size and speed, the larger the body of useful applications we'll have.


Nostalgia is surely as bad a vice as they come, but I can't help but be a little nostalgic that marvelous hacks like these no longer have a place in our world.

I've heard there are similar hacks in the world of microbiology, with overlapping genes giving bacteria a selective advantage. But in the world of people, genomic code bloat is apparently not a problem.


I don't know; it was only last year that I was trying to squeeze a pixel shader down to 64 ARB_fragment_program instructions to fit on older ATI cards. The cycle of hardware reincarnation keeps giving me reasons to dust off the 8-bit skills. Writing for cell phones or even JavaScript environments often feels like old times.


That must be one of the reasons why a stock install of a basic OS takes up a gig or more of hard drive space and a similar amount of RAM?

If you want to see absolutely blistering speed, try installing a 10-year-old Linux distro with X on it.


It's unfortunate. The Sidekick II was the best smartphone I ever had in terms of usability, keyboard, and speed, and I am still highly partial to it.


Wrong topic. :)




