
> Maintaining older or smaller versions of software just to avoid self-hosting is bound to hit all kinds of bugs - big and small.

Unfortunately, any compiler capable of compiling current versions of GCC would probably itself be enormous and difficult to trust directly: C++ is a complicated language to compile.

As long as the chosen versions of TCC and GCC

1. are free of bugs that affect compiling the next-more-sophisticated compiler, and

2. can be trusted,

then having them as intermediate compilers isn't too terrible. Fixes for any bugs in them that affect compiling the next-more-sophisticated compiler can be written and committed now, and since the frozen versions never change afterwards, those fixes hold forever. New bugs can only appear if the current version of GCC changes so that it can no longer be compiled by the last-frozen version of GCC. There are two ways to fix that:

1. Patch the last-frozen version of GCC so that it can still compile the latest GCC, or

2. add another intermediate version of GCC to the toolchain. This is always possible: the last-frozen GCC could compile every release up to the point where the break happened, and the most current GCC must be compilable by at least one earlier release (GCC releases are themselves built with earlier compilers), so some release in between can bridge the gap.
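
As a rough model of that second option (a sketch only: repair_chain, can_build, and the version strings are all invented here, with can_build standing in for actually building one compiler with another and testing the result):

    # Hedged sketch of option 2: grow the frozen chain by one more
    # intermediate release when the newest GCC no longer builds.
    def repair_chain(chain, current, releases, can_build):
        """Extend `chain` until its last stage can build `current`.

        `releases` lists already-released GCC versions, oldest first,
        that could serve as new intermediate stages.
        """
        while not can_build(chain[-1], current):
            candidates = [r for r in releases
                          if r not in chain and can_build(chain[-1], r)]
            if not candidates:
                # No bridging release exists; fall back to option 1 and
                # patch the last-frozen compiler instead.
                raise RuntimeError("patch the last-frozen stage")
            # Append the newest release the frozen chain can still build.
            chain.append(candidates[-1])
        return chain

    # e.g. repair_chain(["tcc-0.9.27", "gcc-4.7.4"], "gcc-current",
    #                   ["gcc-7.5.0", "gcc-10.4.0"], can_build)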

This means that the compiler chain can potentially grow without bound, but it would do so slowly, and only by adding more compiler code onto the end of already-trusted compiler code. A long chain of obsolete-but-trusted compilers is not ideal, but should be a working and stable source of trust.
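
Concretely, the chain is nothing exotic: each link is an ordinary build whose CC points at the output of the previous link. A rough Python driver, with every path, version, and flag invented for illustration (real bootstrap chains such as live-bootstrap involve many more steps):

    # Illustrative only: build each stage with the compiler produced by
    # the stage before it, starting from a small trusted seed.
    import os
    import subprocess

    stages = ["tcc-0.9.27", "gcc-4.7.4", "gcc-10.4.0", "gcc-current"]
    cc = "/usr/bin/cc"  # seed compiler you already trust

    for stage in stages:
        prefix = os.path.abspath(os.path.join("out", stage))
        env = {**os.environ, "CC": cc}
        subprocess.run(["./configure", f"--prefix={prefix}"],
                       cwd=stage, env=env, check=True)
        subprocess.run(["make"], cwd=stage, check=True)
        subprocess.run(["make", "install"], cwd=stage, check=True)
        # The compiler just installed builds the next stage.
        binary = "tcc" if stage.startswith("tcc") else "gcc"
        cc = os.path.join(prefix, "bin", binary)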



