Feb 27, 2007 01:25
Lately computers are faster than ever, with more physical processor cores in a single home-level desktop. (Two with an AMD platform, or four with an Intel. Though AMD still holds its own, tyvm, and yes, I'm an AMD fanboy.) RAM is faster than ever and more efficient at storing data and getting it to the processor. Prices for everything are plummeting like a stone, power supplies are nearing 90% efficiency, and everything is becoming "green" so you can just chuck the computer in the trash when it gets too old. But there's one price no one ever looks at: power consumption.
Some might notice their computer keeps their dorm warm, or even hot, during the summer months. That's wasted energy in the form of heat. Inside a normal OEM case, a processor idles near 50°C; that's 122°F just sitting there doing squat. The energy wasted there is easily 40+ watts of electricity. That may not seem like much, but consider that every dorm room has at least one computer, on average, each wasting that much on the processor alone. That can add up to a gigawatt-hour quite easily.
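To show how fast that waste adds up, here's a quick back-of-envelope sketch. The 40-watt figure is from above; the room count and always-on assumption are mine, purely for illustration:

```python
# Rough estimate of campus-wide idle CPU waste.
# idle_waste_w comes from the post; rooms and runtime are assumptions.
idle_waste_w = 40          # watts wasted per idle processor
rooms = 500                # hypothetical number of dorm rooms on one campus
hours_per_year = 24 * 365  # machines left running around the clock

# Energy in watt-hours, then shown in megawatt-hours.
wh = idle_waste_w * rooms * hours_per_year
print(f"{wh / 1e6:.1f} MWh per year")  # 175.2 MWh
```

Scale that up to a big university or a whole city of always-on desktops and a gigawatt-hour stops sounding far-fetched.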
Now to hard drives: not much can be done here aside from turning them off. RAM is getting better at being less of a power hog, so it's all good.
Video cards... a current GPU eats 70 watts on its own. That's three quarters of a computer's wasted heat energy. According to a report on the latest GeForce 8800, anywhere from 80-110 watts goes into powering just one, and most people with access to that kind of card usually have two in their computer.
So overall your computer itself will produce 150 watts of heat, give or take a third depending on how new it is. That is, of course, assuming your computer is less than three years old. But even if yours is newer, you're still not out of the woods yet.
Your monitor sucks juice like nobody's business as well. A 17" CRT monitor will consume 100-125 watts of power, depending on its settings (source: two manuals I have lying around). Now compare it to a 17" LCD (actually it's larger viewable area, so it's like having a 19" CRT, silly rating standards): the LCD consumes 65 watts of power. Much, much less, and it's brighter, smaller, and easier on my eyes.
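The wattages above make the swap easy to justify on the power bill alone. Here's a rough sketch of the yearly savings; the wattages are from the manuals cited above, while the daily hours and electricity price are assumptions I'm plugging in just to get a ballpark:

```python
# Yearly savings from swapping a 17" CRT for a 17" LCD.
# crt_w and lcd_w are from the manuals; usage and price are assumptions.
crt_w, lcd_w = 110, 65   # typical draw in watts
hours_per_day = 8        # assumed daily screen-on time
price_per_kwh = 0.10     # assumed electricity price in dollars

saved_kwh = (crt_w - lcd_w) * hours_per_day * 365 / 1000
print(f"{saved_kwh:.0f} kWh saved, about ${saved_kwh * price_per_kwh:.2f}/yr")
```

Not a fortune on its own, but it's free money every year the monitor sits on your desk, and the CRT also dumps all that extra wattage into your room as heat.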
But take heart, my tech-dependent friend, there is hope.
AMD and Intel both offer low-current processors, that is, processors that are either hand-selected (AMD) or designed (Intel) for low power consumption. Intel's SpeedStep and AMD's Cool'n'Quiet give your computer the chance to lower its clock speed and further reduce the power it eats, but such features usually go unsupported by the OEM manufacturers; rather, it takes an aftermarket/third-party motherboard to enable them. Secondly, there are now decent built-in graphics processors on some motherboards as well. These setups are nowhere near as powerful as their standalone brethren, but they consume much less power and still offer decent performance in less demanding games. Don't game at all? Even better: these chips let you do whatever multimedia you want on your computer flawlessly. ATI and Nvidia both offer motherboards with built-in GPUs, and both are a fine choice, though Nvidia's is newer and offers better performance, at an increased price, for the most part.
LCD monitor prices are falling like stones, just like their bigger brethren that many sports bars use to show the big game. Buying one from a mail-order outfit is now actually cheaper than buying the cheapest CRT available at the local office supply store or Best Buy. Just ask your friends what to look for in such a monitor, and read the fine print on how many dead pixels there have to be before you can return it. That's only about a one-in-a-hundred chance anymore if you buy a good brand name and not some value-saver junk; that extra $5 will save you $300 in the long run in terms of reliability.
Now let's say you want to take this one step further: you want this computer of yours, with everything attached and running, to consume less than 150 watts. That's very possible, and it's also rather inexpensive to do anymore. VIA, a manufacturer of chipsets for AMD and Intel, has released processors so efficient they consume less power than the dim nightlight you might have next to your bathroom at home. Using less than 60 watts total for the entire tower, the VIA Cyrix, Luke, and soon (hopefully) the Envy processors are made with power savings in mind. Coming in clock speeds from 800 MHz to 1500 MHz, these may sound like slouches, but they are perfectly fine for non-gaming applications and keep a small footprint on your desk. They are also very quiet, some getting away with being passively cooled. All you need is a bit more RAM than what you'd think is needed (512 MB+ for anyone) and a decent hard drive, since those are honestly the two biggest things holding back the performance of the modern computer.
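If you want to sanity-check a build like that against the 150-watt target before buying, a simple power budget does the trick. Every wattage below is an assumption of mine for a hypothetical low-power parts list, not a spec sheet figure:

```python
# Sketch of a power budget for a low-power build.
# The 150 W target is from the post; per-part wattages are assumptions.
budget_w = 150
parts = {
    "low-power VIA CPU": 15,
    "motherboard + onboard GPU": 20,
    "512 MB RAM": 5,
    "hard drive": 10,
    "PSU losses (~85% efficient)": 25,
}

total = sum(parts.values())
print(f"total {total} W, {'under' if total <= budget_w else 'over'} budget")
```

Add up your own parts the same way; if the total clears the budget with room to spare, you can even consider a passively cooled case and lose the fan noise entirely.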
So take heart, you eco-concerned techies. There is hope for tomorrow's world in an electronic age. Do your part to conserve what you can; at least you're doing something. And you're doing it much more quietly than the hogs with their deafening roar of a bunch of case fans.