Dec 12, 2007 02:50
So, I've been banging my head against the wall for the past couple of hours, trying to understand the Command design pattern (computer programming thing). Anyway, one of the examples I was reading through was a little desktop calculator that let you undo math operations. To undo a + 50, you'd do a - 50. To undo a / 5, you'd do a * 5. Okay, so what? Well, I realized the program would crash if you tried to undo multiplication by zero. Why? Because the undo of * 0 would be / 0--that is, division by zero!
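Here's roughly the setup in Python (just my own sketch of the idea, not the tutorial's actual code, and all the names are made up):

    class MultiplyCommand:
        """Multiply the calculator's value by a factor; undo by dividing it back out."""
        def __init__(self, factor):
            self.factor = factor

        def execute(self, value):
            return value * self.factor

        def undo(self, value):
            # Undoing "* 0" means "/ 0", which is where it blows up.
            return value / self.factor

    class Calculator:
        def __init__(self, value=0.0):
            self.value = value
            self.history = []  # stack of executed commands, for undo

        def apply(self, command):
            self.value = command.execute(self.value)
            self.history.append(command)

        def undo_last(self):
            self.value = self.history.pop().undo(self.value)

    calc = Calculator(42.0)
    calc.apply(MultiplyCommand(0))  # value is now 0.0; the 42 is gone for good
    calc.undo_last()                # ZeroDivisionError: no way back to 42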
So anyway, that gave me a cute little justification for division by zero being illegal. Multiplication by any nonzero real number can be undone by dividing by that same number. However, if you multiply a number by 0, you effectively destroy your data (the number)! Think about it: given any number n, if you multiply it by zero, you get zero. So * 0 maps every number to zero. But then, if you want to reverse that operation with division by zero, how could the division operator possibly know what number you started with? (Computer scientists would say the operation isn't one-to-one, so it has no inverse; it's a pigeonhole-principle sort of argument. A similar counting argument proves there's no file compressor [like ZIP or RAR] that can make every file smaller.)
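If you want it in symbols (my own notation, nothing from the calculator example): multiplying by zero is the function

    f(x) = 0 \cdot x = 0 \quad \text{for every real } x,

so, for example, f(3) = f(7) = 0. An "inverse" g would have to send 0 back to both 3 and 7 at the same time, and no function can do that.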
Maybe I'll run it by Eleanor many years down the road and see what she thinks about it. Then I'll try to convince her that 0.99999... = 1. Hehe!
math