Jul 27, 2009 22:52
Since I came to MIT, most of my aerobic exercise has been in the form of treadmill running at the fitness center. Running indoors has some obvious advantages over road-running, like weather-independence and safety from cars, but there are some benefits that I hadn't anticipated. The treadmill has a lot more give than a typical road or sidewalk, and this makes my knees feel a lot better. A lot of people suggest that treadmills are incredibly boring, but I've found that if I run fast enough, I'm not particularly attuned to my surroundings anyway. The freedom from distraction also lets me focus on making my running form more efficient.
I was a little concerned about the calibration of the treadmills, in part because of some inconsistencies I've seen. For example, the timer skips a second once in a while, say going from 5:32 straight to 5:34. This isn't particularly bad, since the distance meter skips in synchrony with it, suggesting that the computer has an internal state that is updated rather often, while the display has an independent clock with a cycle time of slightly more than 1 second that it uses to relay the computer's info to the runner. Unfortunately, the speed reading is not quite consistent with the time and distance readings. When I set the device to 8.0 miles/hour, I would find that every couple of miles I'd lose a second, e.g., two miles would take 15:01 or so. However, when I ran at 9.0 miles/hour, it never lost or gained any time. When I ran at 9.1 miles/hour, it looked a lot like 9.0, except that relative to the 9.0 pace I'd gain a second roughly every 83.3 seconds. That is, a distance a true 9.0 runner would cover in 84.3 seconds took only 83.3 seconds, whereas a true 9.1 runner should show 91 versus 90 seconds, since 9.1/9.0 = 91/90.
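Here is a quick back-of-the-envelope check of that inference in Python (my own arithmetic, not anything read off the machine): the observed one-second gain every 83.3 seconds pins down the belt's actual speed.

    # My own sanity check, not anything from the treadmill: if the belt
    # gains one 9.0-pace second every 83.3 seconds, its actual speed is
    # 9.0 * 84.3 / 83.3 miles/hour.
    gain_period = 83.3  # seconds at the "9.1" setting per extra second vs 9.0 pace
    actual_mph = 9.0 * (gain_period + 1.0) / gain_period
    print(round(actual_mph, 3))  # ~9.108, not 9.1

    # A true 9.1 mph would gain one second only every 90 seconds,
    # since 9.1 / 9.0 = 91 / 90.
    print(9.0 * 91.0 / 90.0)     # 9.1 exactly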
Because 9.0 miles/hour is distinguished by the property that a mile takes exactly 400 seconds, my guess was that the internal state involves a decimal representation of miles covered and seconds elapsed, and the speed is set by counting clock ticks against distance. The peculiar behavior at 9.1 strongly suggests that the machine counts increments of 1/100,000 of a mile each second (or something roughly equivalent, like millionths of a mile every tenth of a second), since it does 2.53 miles in 1000 seconds, and I have yet to see any evidence that, assuming its distance and time measurements are correct, this figure is anything but exact. This gives me a speed of 9.108 miles/hour, while a true 9.1 mile/hour runner would take 1000 seconds to run 2.52777... miles. The corresponding per-second increment, 0.00252777... miles, gets rounded up to the observed 0.00253 if you round to the nearest 1/100,000 of a mile, but not if you round to 1/10,000 of a mile (which gives 0.0025, identical to the 9.0 setting) or to 1/1,000,000 (which gives 0.002528, hardly distinguishable from a true 9.1). Similarly, 8.0 miles/hour yields 2.2222... miles/1000 seconds, and rounding down to 2.22 yields 1 second lost every 1000 seconds.
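A minimal simulation of this hypothesized mechanism reproduces all three observations. (This is my guess at the firmware's behavior, sketched in Python; the function name and the once-per-second update are assumptions, not anything from the manufacturer.)

    # Hypothesis (a guess, not the actual firmware): each second the
    # machine adds the set speed, converted to miles per second and
    # rounded to the nearest 1/100,000 of a mile, to its odometer.
    def effective_mph(set_mph, quantum=1e-5):
        increment = round(set_mph / 3600.0 / quantum) * quantum  # miles per second
        return increment * 3600.0

    for mph in (8.0, 9.0, 9.1):
        eff = effective_mph(mph)
        print(f"set {mph}: actual {eff:.3f} mph, {eff / 3.6:.4f} miles per 1000 s")
    # set 8.0: actual 7.992 mph, 2.2200 miles per 1000 s (loses 1 s per 1000 s)
    # set 9.0: actual 9.000 mph, 2.5000 miles per 1000 s (exact: 400 s per mile)
    # set 9.1: actual 9.108 mph, 2.5300 miles per 1000 s (the observed 2.53)

At the 8.0 setting, for instance, two miles at the effective 7.992 miles/hour take 2/7.992 * 3600, or about 900.9 seconds, which matches the 15:01 I was seeing.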
Anyway, I still don't know how consistent the treadmill is with real-world clocks and distances, but I was rather proud of myself for reverse-engineering the speed calibration from a few data points. I am thinking of testing it further at higher speeds if I get in better shape. Perhaps the most intriguing aspect of this exercise is the idea that someone made the engineering decision to use small decimal multiples of miles as internal units. I suppose it is a reasonable step once one has decided to display distance travelled in hundredths of a mile.