If you were paying attention to the field of pseudoscience a couple of weeks ago, you’ll remember the story of the maths teacher who revolutionised division by zero and, without a by-your-leave, was teaching it to secondary school pupils. Not only was he subverting the standard procedure of childhood education (by teaching his pet fantasies as the next great Truth), he was also talking a load of cobblers.
His point rests on the unfortunate fact that division by zero has no answer; in mathematics it is simply left undefined. In computer programming this occasionally causes a problem or two. Division doesn’t get used in maths or programming as much as, for example, addition, but it does pop up. What was your average score for all the games you’ve played? Well, add up all the scores and divide by the number of games.
But what if you’ve only just installed the software and the number of games played is zero? What then is the average, since we cannot divide by zero? This problem is easy enough to predict and avoid: we only calculate the average if there’s been at least one game played. Some circumstances are less easy to predict, and sometimes the programmer just plain forgets to check.
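Here’s a minimal sketch of that guard in Java (the class and method names are my own invention, purely for illustration): count the games before you dare do the division.

```java
public class Scoreboard {
    // Average of the scores, guarding against the "no games played yet" case.
    public static double averageScore(int[] scores) {
        if (scores.length == 0) {
            return 0.0; // or report "no games yet" however suits the UI
        }
        int total = 0;
        for (int score : scores) {
            total += score;
        }
        return (double) total / scores.length;
    }

    public static void main(String[] args) {
        System.out.println(averageScore(new int[] {12, 7, 9})); // 9.333...
        System.out.println(averageScore(new int[] {}));         // 0.0, no division by zero
    }
}
```

Whether you return zero, hide the figure or display “no games played yet” is a design choice; the point is simply that the division never happens with a zero underneath it.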
The result of division by zero is an error, and in floating-point arithmetic the tell-tale value it produces is named Not a Number, or NaN for short. If the programmer commits such a faux pas then one of two things will happen. The program may follow Elvis and “leave the building”, which on Windows often results in a little dialog box saying the program attempted a division by zero.
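Exactly how it leaves the building depends on the language. Here is a small sketch in Java (variable names made up for the example), where an integer division by zero throws an ArithmeticException and, if nothing catches it, the program dies with a stack trace rather than a Windows dialog:

```java
public class CrashDemo {
    public static void main(String[] args) {
        int totalScore = 42;
        int gamesPlayed = 0;

        // Integer division by zero throws ArithmeticException ("/ by zero").
        // Left uncaught, it terminates the program on the spot.
        int average = totalScore / gamesPlayed;

        System.out.println(average); // never reached
    }
}
```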
The other thing that happens - and this is, I think, more common in languages that aren’t fussy about well-typed results - is that “NaN” is rendered as a literal result. So the average score over the last zero games played is… NaN.
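Here is a quick sketch of that quieter failure mode, again in Java with made-up variable names: with floating-point values, zero divided by zero doesn’t crash at all. It silently evaluates to NaN, and if you format that value straight into your output, the user sees the literal text “NaN”.

```java
public class NanDemo {
    public static void main(String[] args) {
        double totalRainfall = 0.0;
        double daysMeasured = 0.0;

        // 0.0 / 0.0 doesn't throw; it quietly evaluates to NaN.
        double average = totalRainfall / daysMeasured;

        System.out.println("Average rainfall: " + average + " mm"); // Average rainfall: NaN mm
        System.out.println(Double.isNaN(average));                  // true
    }
}
```

JavaScript, the likely language behind most web widgets, behaves the same way for 0 / 0, which is how “NaN” ends up on screen where a number should be.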
Just to illustrate the point, codeman38 kindly allowed me to use the weather widget shown. The person who wrote it obviously didn’t check for the presence of zeroes before dividing, and the result is as you see - very silly.