
I think it's the conservatism people have a problem with. At some point in NASA's history, someone had to take the plunge and develop those components and algorithms in the first place, so it isn't unreasonable to expect some new components and algorithms on new projects.



Computers are something NASA has always been conservative with. By the time Apollo flew, the technology in the Apollo Guidance Computer -- RTL integrated circuits -- was a few generations obsolete. The shuttle's main computers started out as designs derived from the IBM S/360. It takes time to design a radiation-hardened board, and from there, it takes more time to integrate that board into new designs. So progress seems slow, but it's about as slow as it has ever been in hardware. Swapping in a newer core late in the game means a lot of needless regression testing for questionable benefit. And because of how orbits work, the deadlines are much more set in stone: if your change breaks something, you can miss your launch window and end up sitting on your butt for a couple of years until it comes around again.
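
For a sense of scale on that last point: the gap between launch windows to another planet falls out of the synodic period of the two orbits, 1 / |1/T1 - 1/T2|. A quick back-of-the-envelope sketch in Python (orbital periods are approximate):

    # Time between successive Earth-Mars alignments, i.e. how long you
    # wait if you miss a window. Periods are approximate, in days.
    def synodic_period(t1_days, t2_days):
        return 1.0 / abs(1.0 / t1_days - 1.0 / t2_days)

    earth, mars = 365.25, 687.0
    print(synodic_period(earth, mars) / 365.25)  # ~2.1 years between windows

which is where the "couple of years" comes from.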

The algorithm side is equally simple to explain: most of these algorithms aren't broken, so there's no real impetus to fix them. The underlying equations are well known and don't change between missions, so there's no need to change the code that implements them.
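
As a toy illustration (not anything from NASA's actual codebase), here's one of those well-known equations, vis-viva, which gives orbital speed from the gravitational parameter, the current radius, and the orbit's semi-major axis. It hasn't changed since Newton, so neither would code built on it:

    import math

    MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

    def vis_viva(r_m, a_m, mu=MU_EARTH):
        # Orbital speed (m/s) at radius r for an orbit with semi-major axis a:
        # v = sqrt(mu * (2/r - 1/a))
        return math.sqrt(mu * (2.0 / r_m - 1.0 / a_m))

    # Circular orbit at ~400 km altitude (r == a): roughly ISS speed
    r = 6_371_000 + 400_000
    print(vis_viva(r, r))  # ~7670 m/s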


I imagine they build the new components and algorithms for less dicey missions, get as much data back as possible, and then look at reusing them elsewhere. NASA is basically a giant engine of risk mitigation: a lower-priority, lower-cost mission is a much better fit for something new that isn't tried and tested.



