From Daniel Greenfield, "Government Is Magic," Sultan Knish, October 27th, 2013:
Our technocracy is detached from competence. It's not the technocracy of engineers, but of "thinkers" who read Malcolm Gladwell and Thomas Friedman and watch TED talks and savor the flavor of competence, without ever imbibing its substance.
...
The ObamaCare ...
This has to do with what's called 'scalability.' No matter what, any one computer only has so many wires going into it, and they can only take so much data at once. If it gets 5,900 messages at a time, it's going to have to buffer them, line them up, and deal with them as they come in on that wire. If it can't deal with them fast enough, some of them will 'time out' waiting for a response, and fail. And your computer hangs.
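To put rough numbers on that buffering-and-timeout effect, here is a toy sketch - the service rate and client timeout are made-up illustration figures, not anything measured from the real site:

# Toy model of a single server draining a burst of requests.
# SERVICE_RATE and CLIENT_TIMEOUT are assumed numbers for illustration only.
SERVICE_RATE = 200       # requests the server can process per second
CLIENT_TIMEOUT = 30.0    # seconds a client waits before giving up

def failures_in_burst(burst_size: int) -> int:
    failed = 0
    for position in range(burst_size):
        wait_seconds = position / SERVICE_RATE   # the Nth request waits roughly N/rate seconds
        if wait_seconds > CLIENT_TIMEOUT:
            failed += 1
    return failed

print(failures_in_burst(5_900))    # 0 - the burst just barely drains within the timeout
print(failures_in_burst(59_000))   # ~53,000 - most requests time out and fail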
There are ways to deal with this problem, and techniques for aggregating at a point and spreading the volume out over multiple computers, but even those techniques fail when your initial page load sends down 50 JavaScript files and 9 style sheets.
It's just too much. It creates too much back-and-forth traffic. If you try to set something like this up for a website that you expect to have ten thousand people on, it collapses, because serving it that way is impossible. It works in your testing lab because your testing lab only has ten computers, not ten thousand. But it's a basic failure.
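Back-of-the-envelope, using the 50-scripts-and-9-style-sheets figure from above (the visitor counts here are assumptions for illustration, not measurements):

# Requests generated per page view: 50 scripts + 9 style sheets + the page itself.
requests_per_visitor = 50 + 9 + 1

lab_visitors = 10              # the testing lab
production_visitors = 10_000   # a modest real-world load

print(lab_visitors * requests_per_visitor)          # 600 requests - fine
print(production_visitors * requests_per_visitor)   # 600,000 requests - a very different problem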
Reply
No, I've not - I'm not American.
over 50 JavaScript files and 9 style sheets
G-d bless them, but why don't they use one nice big installer, for example?
I never programmed web apps, you know, but it seems self-evident to me.
It works in your testing lab because your testing lab only has ten computers, not ten thousand
It's funny, there are a lot of traffic generators around; it's basic to plan for peak capacity, isn't it?
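Even a crude, home-made traffic generator would have exposed the problem. A minimal sketch - the URL and user counts are placeholders, and real testing would use a proper tool such as JMeter:

# Crude load generator: CONCURRENT_USERS threads each fetch URL repeatedly
# and we count how many requests succeed vs. fail or time out. Placeholder values only.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "https://example.com/"   # placeholder, not the real site
CONCURRENT_USERS = 500
REQUESTS_PER_USER = 20

def one_user(_):
    ok = failed = 0
    for _ in range(REQUESTS_PER_USER):
        try:
            with urlopen(URL, timeout=30) as resp:
                resp.read()
            ok += 1
        except Exception:
            failed += 1
    return ok, failed

start = time.time()
with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    results = list(pool.map(one_user, range(CONCURRENT_USERS)))
ok = sum(r[0] for r in results)
failed = sum(r[1] for r in results)
print(f"{ok} ok, {failed} failed/timed out, {time.time() - start:.1f}s elapsed")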
Honestly, what is scarier: stupidity or corruption?
Reply
Exactly. And keep in mind that this is just one of the big glaring problems that they have found with the site design.
There are many different ways to do this kind of thing right, but they were not used. There are ways to generate realistic traffic for testing, but that wasn't done either. And so on.
The people hired for the job are the same ones who screwed up the Canadian firearms database (a much simpler job) to the tune of 2 billion dollars; they ended up with nothing that worked, and it had to be scrapped.
Reply
Indeed there are, and they should have tested the website under loads far greater than the expected peak. After all, various events could cause lots of people to worry about their health all at the same time, so such super-peak loads would realistically happen from time to time.
Honestly, what is scarier: stupidity or corruption?
A little of both. The politicians really really REALLY wanted to conceal the (terrible) menu of plan choices until the user was committed to the system (corruption), and they really didn't want to hear that it wasn't possible (stupidity). There's also a hefty helping of class-based arrogance here: the politicians and bureaucrats considered themselves not merely the hierarchical but also the inherent superiors of the techies they trusted to implement it, so they did not want to accept any correction from them.
Reply
I proceeded to die laughing at his disparaging criticism of the epic amounts of redundancy, apparent copy-paste (...THAT SECTION WAS JUST PASTED IN THREE TIMES!!!!!!!!!111) and exclamations of "WHY DID THEY THINK THAT A SCRIPT WAS GOING TO WORK RIGHT UNDER A NOSCRIPT HEADER? I could have farted better website coding than this!"
"Over two thousand lines of code that probably doesn't need most of it because fuck you two hundred million dollars that's why."
I seriously wish I had the presence of mind to record that truly gloriously epic rant.
Apparently, whoever did the code should not have been allowed anywhere near a computer. Or the Internet, because it looks almost like a Google search and copy-paste of every JavaScript snippet on the Net.
I know NOTHING about JavaScript, but even I went, "...Why are there multiple scripts under a noscript tag? Was that closed before the scripts were ... no, it wasn't." And there were links to other scripts external to the page ON TOP OF the regurgitation of scripts already in the page.
And apparently it is not meant to work under Firefox, and there's a 'Chrome=1' in there.
Also, if that site is crashing and unable to run, it's because of the nearly 20 MB worth of scripts. Downloading. Multiple times - let's say the 50 number you gave. In one go. For just the front page.
For each user.
Now multiply that by, say, at least a couple of hundred thousand users.
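Rough arithmetic, taking the ~20 MB page weight and a 200,000-visitor count as the estimates they are:

# Estimates from this thread, not measurements.
page_weight_mb = 20
visitors = 200_000
total_gb = page_weight_mb * visitors / 1024
print(f"~{total_gb:,.0f} GB served for the front page alone")   # ~3,906 GB - call it 4 TB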
I feel sorry for whoever has limited bandwidth in the US.
Reply
(1) The politicians repeatedly changed the requirements on the programmers, on many separate occasions demanding changes to either the overall structure or the details of the modules.
(2) They did this often enough and late enough into the coding process that there was no good way to move the modules around elegantly, so that
(3) when they were moved, all sorts of weird little legacies and sections were left behind, some of which had effects on other sections, because
(4) the programmers had little experience with massive projects and thus didn't realize just how important a strict application of structured programming techniques was going to be in order to make it possible to move around modules smoothly, and
(5) the final set of massive changes was so eleventh-hour that they had no time to debug the final product.
This is clearly because neither the politicians nor the bosses at the programming company knew what they were doing. FUBAR all around.
Reply
Another thing that is suspected is that 'code generators' were used - programs that write programs for you.
While such things have their uses, they usually generate overlarge, unreadable, unmodifiable bloatware. That may be what you see if you look into the JavaScript code; it doesn't look like a human wrote it.
The suspicion I have read is that the current effort is unfixable, like CGI's previous work in Canada, and that the 'tech surge' people are probably going to have to slap up a simple data-entry-and-storage website in the short run and do batch processing with the data they get.
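A "simple data-entry-and-storage website" really can be that simple. Here is a minimal sketch of the idea - the endpoint, fields, and spool file are hypothetical, not anything the tech-surge team actually built:

# Accept a POSTed application, stamp it, and append it to a spool file.
# A separate batch job can read the spool file later, offline.
import json
from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer

SPOOL_FILE = "applications.jsonl"   # one JSON record per line

class IntakeHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        record = json.loads(self.rfile.read(length))
        record["received_at"] = datetime.now(timezone.utc).isoformat()
        with open(SPOOL_FILE, "a") as spool:
            spool.write(json.dumps(record) + "\n")
        self.send_response(202)   # accepted for later, offline processing
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), IntakeHandler).serve_forever()

The batch side would then just read applications.jsonl on whatever schedule the back end can handle - no real-time integration at all.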
Reply