"Tarver Engineering" wrote in message ...
"Harry Andreas" wrote in message
...
In article , "The CO"
wrote:
"Tarver Engineering" wrote in message
...
"Emmanuel.Gustin" wrote in message
...
Stephen Harding wrote:
Wasn't the rover about to attack 'an interesting geological
feature' when it failed? So it would be legitimate self-defence,
or at worst a legitimate pre-emptive strike...
A much more likely source of failure is the lack of EEs at NASA, as
outlined in the Shuttle crash investigation board report. Until the
aero mafia is rooted out of NASA, there can be no forward movement.
Aeros haven't controlled a successful aerospace company since the
1970s; it has been all EE since then.
Not all EEs. Hughes Aircraft had an ME at the helm for a good while.
My mistake, you are quite correct. In fact, most MEs can work complex
numbers, while an Aero with a BSAE will be lost. I had an aero come up to
me at Dryden and speak with fascination about the huge difficulty he had
applying complex numbers. From the way he described it, I don't believe he
had ever seen a number written with a "j" before his master's-level work.
Actually, it now appears highly likely that it's a problem with the
FLASH memory management software module. The FLASH hardware is
apparently OK. In short, it appears to be either a bug or something
that corrupted it, such as a high-energy particle impact. Some EE
apparently didn't put in EDAC.
From the story today it appears JPL used a software MMU. That would make
the most likely cause of the problem some softhead who couldn't get their
program to run without hogging memory. It is a story repeated again and
again in industry.
It is nice to see you posting, Harry.
I have been programming vxWorks for 12 years. There is a basic MMU
that comes with it, and another, more traditional one that keeps tasks
from clobbering each other. But from the press release, it sounds to
me exactly like the same problem I complained to Wind River about 10
years ago: all the directory entries are cached in regular DRAM (on
bootup, it rereads all the directory entries into memory). I believe
it was something like 64 bytes per entry. So when a customer of mine
logged 20000 data files, his system ran out of regular memory because
I didn't factor in this possibility. I think I heard they had to
delete tens of thousands of files - probably freeing a few precious
megs of memory.