(Continuing a thread I began in my July 13, 2007 entry.) A gratifying number of people who wrote to me indicated that they would fund research, in a lot of different areas. I'm for that; research is much less political than education, and much more can be done with less money. (Even a billion ... )
Comments (5)
It also made me think of past projects like IBM's Future Systems and OS/2, Pink and Taligent, BeOS and NeXT. How many attempts to replace the old Mac OS died quietly inside Apple?
Good luck!
Bill
There's a single management processor that is responsible for starting the rest of the hardware and monitoring it. It runs on a scaled-down version of that same chip and boots a flash-based Linux system.
For that matter, the operating system that runs on this hardware now has a binary compatibility layer that lets it run ELF binaries compiled for x86 Linux.
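(If you want the flavor of how a compat layer like that hooks in: the dispatch is basically a peek at the ELF header. Here's a rough C sketch that just reports which loader a kernel might route a binary to; the structure and messages are mine, not the actual OS code.)

    /* Sketch of the "peek at the ELF header" idea behind an x86 compat
     * layer.  It only reports which loader a kernel might hand the binary
     * to; the real dispatch obviously lives inside the kernel's exec path. */
    #include <elf.h>
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <binary>\n", argv[0]);
            return 1;
        }

        FILE *f = fopen(argv[1], "rb");
        if (!f) {
            perror(argv[1]);
            return 1;
        }

        /* e_ident and e_machine sit at the same offsets in 32- and 64-bit
         * ELF headers, so reading the shorter Elf32_Ehdr is enough here. */
        Elf32_Ehdr hdr;
        if (fread(&hdr, sizeof hdr, 1, f) != 1) {
            fclose(f);
            fputs("short read\n", stderr);
            return 1;
        }
        fclose(f);

        if (memcmp(hdr.e_ident, ELFMAG, SELFMAG) != 0) {
            puts("not an ELF file");
            return 1;
        }

        switch (hdr.e_machine) {
        case EM_386:
        case EM_X86_64:
            puts("x86 Linux binary: hand off to the compatibility layer");
            break;
        default:
            printf("machine type %u: native loader\n", hdr.e_machine);
        }
        return 0;
    }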
It's called the P-series from IBM. ;-) They're extremely good at virtualization at this point. Rather than specifying "I want this code to run on one processor and this code to run on another two and this to run on a fourth", you specify how much time each task should be able to request from the system as a whole. The hypervisor then balances it across all of the hardware in the box. Works very well. Not quite as fault-tolerant as a NonStop, but it's ( ... )
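(To make the contrast with CPU pinning concrete, here's a toy proportional-share calculation, in the same spirit but nothing like the real hypervisor internals; the partition names and numbers are invented.)

    /* Toy illustration of share-based allocation: each "partition" declares
     * a relative weight instead of naming CPUs, and the pool of processor
     * time is divided in proportion.  Purely illustrative numbers. */
    #include <stdio.h>

    struct partition {
        const char *name;
        double weight;            /* relative entitlement, not a CPU count */
    };

    int main(void)
    {
        struct partition parts[] = {
            { "database", 4.0 },
            { "app tier", 2.0 },
            { "batch",    1.0 },
        };
        const int nparts = sizeof parts / sizeof parts[0];
        const double total_cpus = 8.0;    /* the whole machine, pooled */

        double total_weight = 0.0;
        for (int i = 0; i < nparts; i++)
            total_weight += parts[i].weight;

        /* The "hypervisor" spreads the pool by weight; which physical core
         * actually does the work is its problem, not the guest's. */
        for (int i = 0; i < nparts; i++)
            printf("%-8s gets %.2f CPUs' worth of time\n", parts[i].name,
                   total_cpus * parts[i].weight / total_weight);

        return 0;
    }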
And a billion really doesn't go as far as it used to, heh.
For that matter, lots of games still use integer math internally for everything, which bypasses the whole point of the big, expensive (in terms of silicon area) vector coprocessing units everyone has been adding to their chips. Apple's page on optimizing the Noble Ape simulator focuses on one specific Apple debugging tool, but it's a good example of the kind of performance increase some applications can see from multithreading and using vector math ( ... )
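(For anyone who hasn't played with the vector units, here's the shape of it: the same multiply-add done one float at a time and then four at a time with SSE intrinsics. A toy sketch, built with something like gcc -O2 -msse; AltiVec code would look similar, and none of this is Apple's actual Noble Ape code.)

    /* Same multiply-add two ways: scalar, then four floats per instruction
     * using SSE.  Only a sketch of where the vector units pay off. */
    #include <stdio.h>
    #include <xmmintrin.h>            /* SSE intrinsics */

    #define N 1024

    void scale_add_scalar(const float *a, const float *b, float *out, float k)
    {
        for (int i = 0; i < N; i++)
            out[i] = a[i] * k + b[i];          /* one float per iteration */
    }

    void scale_add_sse(const float *a, const float *b, float *out, float k)
    {
        __m128 vk = _mm_set1_ps(k);
        for (int i = 0; i < N; i += 4) {       /* four floats per iteration */
            __m128 va = _mm_loadu_ps(a + i);
            __m128 vb = _mm_loadu_ps(b + i);
            _mm_storeu_ps(out + i, _mm_add_ps(_mm_mul_ps(va, vk), vb));
        }
    }

    int main(void)
    {
        static float a[N], b[N], out[N];
        for (int i = 0; i < N; i++) {
            a[i] = (float)i;
            b[i] = (float)(N - i);
        }

        scale_add_scalar(a, b, out, 0.5f);
        printf("scalar: out[10] = %f\n", out[10]);

        scale_add_sse(a, b, out, 0.5f);
        printf("sse:    out[10] = %f\n", out[10]);
        return 0;
    }

(The multithreading half of the win is basically splitting that loop across cores on top of this.)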