I finished George Dyson's "Turing's Cathedral." The book focuses mainly on John von Neumann, despite a cover image of Turing, who gets more or less a single chapter.
I think Dyson wanted to write a history of computing. His thesis is that much of the evolution of computing we see today (networks, search engines, artificial life, weather and climate modelling) was present at the very beginning, with the first programmable digital computers. This leads to lots of rhetorical flights about the digital universe that I didn't really appreciate (like the chain-reaction comment I quoted in the previous post). But his historical scope is constrained to Princeton and those working on the IAS machine, so these excursions into the future aren't well explained or developed.
The latter half of the book is a set of mini-biographies that don't mesh into a coherent narrative. Turing, Stanislaw Ulam, and Nils Barricelli (a first-class crackpot if ever there was one) each get a chapter. Von Neumann dies but then comes back for a couple of the remaining chapters.
The story is a somewhat sad one because the IAS machine, though widely copied, was a dead end for Princeton and the Institute for Advanced Study. The mathematicians and humanists of the IAS didn't want more engineering work taking place, and once von Neumann died, nobody championed applied studies in the same way.
Of course, I'm somewhat curious whether anybody has written a history (or a critical study) of the IAS itself. While it has hosted a great number of famous scientists, few seem to have done revolutionary work while in residence. (Maybe this is more true of the one-year visitors than of the permanent faculty.) The notion that freeing academics from day-to-day duties would help them excel did not really pan out. The exception that perhaps proves the rule is von Neumann himself, whose enormous output seems more a consequence of his constant motion and numerous responsibilities than of his position at the IAS.