As most of you have probably picked up on, I'm among that minority of computer scientists who actually writes code, and often prefers it to writing papers (much to the chagrin of my advisors and colleagues). I enjoy my theoretical work, but if I spend too much time on theory alone, the joy turns hollow; I want to build things that people can use.
Here are some of my basic assumptions:
-They prefer productive action to allocative action, as a general rule. There are exceptions, but they are not the majority.
There are tons of studies showing this to be true (e.g., unemployed people tend to develop depression and low self-esteem even when they have another easily accessible means of survival), so I feel it's reasonably safe to state as a fact. I can also point to everyday examples, everywhere, of people doing work without expecting compensation, simply because productivity is more psychologically rewarding than sitting on one's ass whistling Dixie. :-) Just check out any knitting community.
-They do not, for the most part, understand how the world works on a large scale. An oft-used example of this is that we all use the roads, the sewers, and other massive pieces of national infrastructure, but I tend to believe that if people were told they could "opt out" of contributing, most of them would do so and expect someone else to pick up the slack. The average person doesn't understand why a seismic upgrade on highway supports is necessary until there's an earthquake and the road drops out from under them -- not because they're stupid, but because people in our culture are not encouraged to think about such things. (Have you seen the recent XKCD strip? "Congratulations, you're now the local computer expert!")
This is part of why I object to having routine health care funded either publicly or through comprehensive health insurance, and favor having it paid for out of pocket (let's set aside the more difficult issue of catastrophic care and major illnesses). I deal with my health care providers on that basis, and with my cats' health care providers. And so I discuss with them how much a procedure costs, what the medical necessity for it is, what standard of care it's meeting, and what risks I'm taking by having it or not having it. Being the decision maker myself motivates me to gain greater competence. And by moving us toward a situation where most people get even routine health care paid for through comprehensive insurance with modest copays, government policy has pushed us in the opposite direction, encouraging people not to take that kind of directive role in their health care decisions.
One of the virtues of markets is that because people pay the costs of their own bad decisions, they have an incentive to learn to make better decisions. (Obviously this doesn't apply to people in your situation, who face ruinous costs that are not the result of any decisions of theirs; you are not going to be taught not to foolishly have a medical disability by the discovery that having it costs a lot!)
And one of the vices of allocative programs is that they make thinking allocatively rather than productively pay off, and thus change people's behavior. It's a matter of historical record that during the Great Depression, many people preferred to face real economic hardship rather than shame themselves by going on welfare. FDR went to considerable trouble to find end runs around that attitude, and now it's a historical curiosity, and our culture is much more accepting of the idea that getting money from the government is a good thing.
Wow, this has gotten to be an extensive discussion. But in brief: my girlfriend chorale said to me years and years ago, "Life is opportunistic." I think she's absolutely right, and the corollary is that people respond to incentives. Whatever you do, you need to think about what kind of action it creates incentives for.