Quoted:
Seems I heard it explained by some NASA type that they are and have been aware of the huge difference in computational power and did it on purpose.
The reason being that the simpler the computer, the less chance of it having a bug or crashing and shutting everything down. It needs to work flawlessly, all the time, every time.
The same applies to the computers in modern military jets too, I believe. Before the F-22 everything was pretty basic because simple was reliable.
Could be wrong though.
Critical-system developers spend an insane amount of time doing what computer scientists call formal verification. This involves analyzing every function for logical correctness, both on paper at the design level and by writing more code that models the intended behavior of the code you're really writing, then checking the two against each other.
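To make that concrete, here's a toy sketch of the "write code to model your code" idea. This is not a real verification tool (critical-systems shops use proof assistants and model checkers); it just shows the mindset: an independent specification function, and an exhaustive check of the implementation against it over a bounded input space. All the function names and ranges here are invented for illustration.

```python
def clamp_impl(x: int, lo: int, hi: int) -> int:
    """The code you're 'really writing'."""
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x


def clamp_spec(x: int, lo: int, hi: int) -> int:
    """Independent model of the required behavior."""
    return min(max(x, lo), hi)


def verify() -> bool:
    # Exhaustive check over a small, bounded domain. Feasible here only
    # because the input space is tiny -- which is exactly why critical
    # code keeps its state spaces small and simple.
    return all(
        clamp_impl(x, lo, hi) == clamp_spec(x, lo, hi)
        for lo in range(-8, 9)
        for hi in range(lo, 9)
        for x in range(-16, 17)
    )


print(verify())  # True: implementation matches the spec everywhere
```

Real tools prove this kind of equivalence for all inputs symbolically rather than by enumeration, but the shape of the work is the same: two independent descriptions of the behavior that must agree.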
A buddy did this kind of work several years back. Three teams had to independently come up with the same answer for a result to be considered right. I'd go insane.
The hardware only needs to be "fast enough" to come up with the outputs you need on a regular interval.
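"Fast enough on a regular interval" has a precise formulation in real-time scheduling: each task's worst-case execution time has to fit within its period, with margin. One classic sanity check is the Liu & Layland rate-monotonic utilization bound. The sketch below uses a made-up task set purely for illustration; real avionics timing analysis is far more involved.

```python
# Rate-monotonic schedulability check: n periodic tasks are guaranteed
# to meet every deadline under rate-monotonic priorities if their total
# CPU utilization is <= n * (2^(1/n) - 1)  (Liu & Layland bound).

def rm_utilization_bound(n: int) -> float:
    """Worst-case utilization bound for n rate-monotonic tasks."""
    return n * (2 ** (1.0 / n) - 1)


def is_schedulable(tasks: list[tuple[float, float]]) -> bool:
    """tasks: (worst_case_exec_time, period) pairs in the same units."""
    utilization = sum(c / t for c, t in tasks)
    return utilization <= rm_utilization_bound(len(tasks))


# Hypothetical task set: (WCET ms, period ms). Total utilization is
# 0.1 + 0.1 + 0.1 = 0.30, well under the 3-task bound of ~0.78, so a
# "slow" CPU with plenty of headroom is exactly what you want.
tasks = [(5, 50), (10, 100), (20, 200)]
print(is_schedulable(tasks))  # True
```

This is the sense in which raw computational power doesn't matter: the processor just has to clear the bound every cycle, predictably, and a simpler chip is easier to analyze for that guarantee.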
This is rarely done for ordinary software products, where acceptance is based primarily on functional testing.