TV Power Knowledge for the Super Bowl
The Super Bowl will be here before you know it, and that means good times with family, friends, food… and, hopefully, a nice entertainment center. HDTVs are showing up in more and more homes these days, so we want you to be prepared for every eventuality. That just so happens to include your power consumption, too!
Electrical costs tend to be higher during the winter months, but we don’t want you to be caught off guard when the bill arrives. TV.com wrote an excellent article a while back that, by and large, still holds true today regarding TV power consumption. Keep reading to learn more.
TV Power Consumption Guide
If you diligently click off the lights when you leave a room, obsess over the fuel-economy stickers on new cars, and cringe when the electricity bill arrives, you may be interested to know that your shiny, new flat-screen television may be a power hog. With the price of electricity going nowhere but up, catching the big game on Sunday, watching the latest DVD, and playing Halo with your buddies can make your electricity meter spin. But just how much does it cost to run a TV? To get an idea of how thirsty for power today’s sets are, we measured how much electricity they actually use when they’re on and when they’re off. The results might just startle both you and your wallet.
The basics of TV power
At a time when electricity costs 30 percent more than it did last year, just about every modern appliance can be seen as a power-hungry mass of circuits, lights, and buttons that sucks down electricity, day and night. We put 20 TVs–old and new–to the test by measuring how much power each uses in a variety of circumstances. Our results show that it can cost between $13 and $145 a year to watch TV, depending on whether you want a small LCD TV or a huge plasma set.
Technology and size matter
There are four basic technologies that TVs use to produce a picture, and technology type has the largest influence on power consumption per inch of screen. The traditional cathode-ray tube blasts electrons onto chemical phosphors embedded on the inside of the tube, while plasma sets ionize gas to create colors in a million or more tiny pixel cells. Either way, SpongeBob or American Idol shows up on the other side of the glass, and both technologies require more electricity to create a brighter image. How much more? The typical CRT sucks down nearly double the power to create a white screen as compared to a black screen.
On the other hand, flat-panel LCDs and rear-projection microdisplays use a powerful fluorescent backlight or bulb that either punches through an LCD panel and its three color filters or reflects off a digital-light-processing chip with a million miniature mirrors and a spinning color wheel. Either way, they consume about the same power regardless of the brightness of the image, because the primary light source–the backlight or bulb–is essentially always running at maximum power. Note that some late-model flat LCDs have backlights you can turn down, which consumes less power but produces a dimmer image.
The 45 watts that a 20-inch LCD TV uses is about what it takes to charge a notebook PC, while the 55-inch plasma’s 507-watt consumption is closer to that of a large refrigerator. Of course, some sets, such as Panasonic’s 50-inch TH-50PHD8UK plasma, can be more efficient than others. It’s the same size as Maxent’s MX-50X3 plasma, but it used a little more than half as much power when we engaged its power-saving mode.
Size matters as well, so we divided each set’s power use by its screen area to get a watts-per-square-inch rating; this lets small and large screens be compared directly (a quick calculation sketch follows the list below). While there is some overlap, the TVs we tested form neat groups based on technology:
- Microdisplay rear projector: 0.11 to 0.15 watt per square inch
- LCD: 0.16 to 0.41 watt per square inch
- CRT: 0.25 to 0.40 watt per square inch
- Plasma: 0.30 to 0.39 watt per square inch
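If you’d like to run the same comparison on your own set, the arithmetic is simple enough to script. Here’s a minimal sketch in Python, assuming you know the set’s wattage and diagonal screen size; the example calls reuse the 20-inch/45-watt LCD and 55-inch/507-watt plasma figures mentioned above, with aspect ratios (4:3 and 16:9) that are our assumptions for illustration, not numbers from our test notes.

```python
import math

def screen_area_sq_in(diagonal_in, aspect_w=16, aspect_h=9):
    """Approximate screen area in square inches from the diagonal and aspect ratio."""
    # Width and height follow from the diagonal via the Pythagorean theorem.
    diag_units = math.hypot(aspect_w, aspect_h)
    width = diagonal_in * aspect_w / diag_units
    height = diagonal_in * aspect_h / diag_units
    return width * height

def watts_per_square_inch(watts, diagonal_in, aspect_w=16, aspect_h=9):
    """Divide a set's power draw by its screen area, as in the list above."""
    return watts / screen_area_sq_in(diagonal_in, aspect_w, aspect_h)

# Illustrative examples -- plug in your own measurements.
print(round(watts_per_square_inch(45, 20, 4, 3), 2))   # 20-inch 4:3 LCD, ~0.23 W/sq in
print(round(watts_per_square_inch(507, 55), 2))        # 55-inch 16:9 plasma, ~0.39 W/sq in
```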
If power efficiency is all you’re after, the clear choice is rear-projection technology, but these sets rarely get as bright as the others. While CRTs and LCDs are brighter, they are currently limited to about 36 and 40 inches, respectively. Of the four, plasma screens are generally the most power hungry, but on a square-inch basis, they are roughly equivalent to a large CRT set. Also, newer TVs are likely to be more efficient than older ones, and new technologies promise to make TVs more efficient.
Other power factors
It may also surprise you to hear that TVs use power even when they’re not turned on. So that the TV is ready to respond to the remote in an instant, all sets use what’s called phantom or standby power. Our tests revealed that standby power consumption varied widely among different TVs. For some, it’s just a few watts, while others use nearly 20 watts, but in either case, it adds up.
Few people have just a TV anymore, and all sorts of ancillary devices contribute to your yearly energy costs as well. Think of all that’s plugged into your set, from a DVD player, external speakers, and a gaming console to a satellite receiver, a digital recorder, and even a Wi-Fi transmitter. They all need power. It may not sound like much, but a DVD player can use about 10 watts, while a PlayStation 2 gaming machine draws about 50 watts. All told, these boxes can use more power than the TV itself. The Xbox 360, for example, uses 160 watts–significantly more than all but the big-screen HDTVs we tested.
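To put those wattages in dollar terms, here’s a minimal sketch of the conversion from a device’s draw and daily hours of use to an annual cost. The 10-cents-per-kilowatt-hour rate matches the figure used in our PowerView math later on, but the hours-per-day values in the examples are assumptions for illustration, not measured usage.

```python
def annual_cost_dollars(watts, hours_per_day, rate_per_kwh=0.10):
    """Annual electricity cost for one device at a flat rate (default: 10 cents/kWh)."""
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * rate_per_kwh

# Illustrative peripherals, plus a TV idling in standby most of the day.
print(round(annual_cost_dollars(10, 4), 2))    # DVD player, ~10 W, 4 hours/day
print(round(annual_cost_dollars(160, 2), 2))   # Xbox 360, ~160 W, 2 hours/day
print(round(annual_cost_dollars(20, 20), 2))   # worst-case ~20 W standby, 20 hours/day
```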
Unlike cars and refrigerators, for which the law mandates a posted energy estimate, TVs give shoppers no way to compare power use. Sure, scanning the manufacturers’ specs is a good start, but many don’t provide power information, there are no established power tests, and the numbers provided rarely include standby ratings. Our solution is PowerView, a real-world estimate based on actual viewing habits during a typical year (see “Test your TV’s thirst for electricity” below). It includes TV and DVD watching plus standby power, and it adds up to as little as $13 for a small LCD TV or more than 10 times that amount for a large plasma screen. The bottom line: at a time when we have to live with expensive electricity, the power a TV consumes over a 10-year lifetime can come close to the cost of the set itself.
Test your TV’s thirst for electricity
The whole idea behind this project is to get a good idea of how much electricity a TV uses by recording the actual current flow into the set. We talked to consultants and engineers both from the Environmental Protection Agency and from major TV makers to come up with a reliable test procedure. It turned out to be easy enough that we’re confident that you can do it too, so don’t sweat it if you don’t know squat about watts.
You’ll need a few tools, a little patience, and some time.
- An AC clamp current meter with volt meter (Extech’s MA200 works fine)
- A DVD player with Spider-Man 2
- A cable or satellite TV box with EPG or TV Guide channel
- A calculator
- A short extension cord that you don’t mind modifying
Take the extension cord, make sure it’s completely unplugged, and use an X-Acto knife to slit open the cord. Separate the white and black power lines, being careful not to remove any of the insulation from either one. Clamp the current meter around the separated white wire so that the clamp closes completely. Insert the meter’s red and black probes into the AC outlet, set the meter to VAC, and record the outlet’s line voltage. Then remove the probes from the socket.
Next, plug the TV’s power cable into one end of the extension cord, and plug the cord into the socket you just measured. Turn on the TV and leave its picture controls and volume at your normal settings. (To arrive at a level playing field for our test, we calibrated the TVs to 60 footlamberts with a gray field or, if they couldn’t get that bright, to maximum brightness. For testing at home, you should use your typical settings.)
Now you’re ready to get down to work. Connect the screw-type RF output from the cable or satellite TV box to the TV’s RF input, call up the program guide or the TV Guide Channel (usually you have to tune the TV to channel 3 to see it), and measure the current load using the meter’s 2-amp range; some high-power TVs will require the 20-amp range.
Next, switch to your DVD player’s input and play the even-numbered chapters of Spider-Man 2, recording the current flow for each scene. This provides a good assortment of light and dark scenes. Expect your measurements to vary wildly for CRTs and plasma screens, while LCDs and projectors will change little from scene to scene.
Then measure the TV’s standby current by turning the set off and watching the meter’s readings. Most TVs will settle to a standby level of a few watts in a matter of seconds, while some rear-projection TVs with cooling fans will take longer. It’s a good idea to continue this measurement for 2 hours, but you don’t have to stick around; just check it from time to time.
Finally, unplug the extension cord from the outlet, switch the meter back to VAC, and use the meter’s black and red probes to measure and record the outlet’s voltage a second time.
It’s now time for a few calculations. For those who slept through high school physics, here’s a refresher. To get power consumption in watts, multiply the amps measured for TV viewing, for DVD playback, and in standby by the average of the before and after voltage readings; the three results are the wattage consumed for each activity. (For the DVD figure, first average the amps you measured across the even-numbered Spider-Man 2 chapters, then multiply that average by the voltage.) Next, multiply the TV and DVD wattages by 1,460 hours each and the standby wattage by 5,840 hours; those figures correspond to 4 hours a day each of TV viewing and DVD watching and 16 hours a day of standby. Add the three results together to get the annual energy use in watt-hours, divide by 1,000 to convert to kilowatt-hours, then multiply by 0.1 to apply an average electricity price of 10 cents per kilowatt-hour. Voilà: this is the set’s PowerView rating, a good estimate of how much it costs to use the set for a year.
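If you’d rather not do the math by hand, here’s a minimal sketch of the same PowerView calculation as a script. The 1,460/5,840-hour split and the 10-cents-per-kilowatt-hour rate come straight from the description above; the current and voltage readings in the example call are made-up placeholders, not measurements from our tests.

```python
def powerview_dollars_per_year(tv_amps, dvd_chapter_amps, standby_amps,
                               volts_before, volts_after,
                               rate_per_kwh=0.10):
    """Annual cost estimate following the PowerView method described above."""
    volts = (volts_before + volts_after) / 2          # average line voltage

    tv_watts = tv_amps * volts                        # program-guide reading
    dvd_watts = (sum(dvd_chapter_amps) / len(dvd_chapter_amps)) * volts
    standby_watts = standby_amps * volts

    # 4 hours/day of TV, 4 hours/day of DVD, 16 hours/day of standby over a year
    watt_hours = tv_watts * 1460 + dvd_watts * 1460 + standby_watts * 5840
    return watt_hours / 1000 * rate_per_kwh           # kWh times price per kWh

# Placeholder readings, not measurements from the article.
print(round(powerview_dollars_per_year(
    tv_amps=1.8,
    dvd_chapter_amps=[1.6, 1.9, 1.7, 2.0],
    standby_amps=0.05,
    volts_before=120.4,
    volts_after=119.8,
), 2))
```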
The future of TV power
It’s hard to believe, but the 280 million TVs in the United States consume 4 percent of the power used, or 46 billion kilowatt-hours per year. That’s an electricity bill of more than $4 billion a year. The bad news, other than another season of Pimp My Ride, is that, with big screens becoming popular, power use is rising fast and could reach 70 billion kilowatt-hours by the end of the decade.
Every major TV maker is looking at cutting power, and each generation of TV generally uses a little less power than the last. For instance, a three-year-old 27-inch Sharp CRT we tested consumes one-third more power than a 27-inch RCA CRT made in 2005. The Environmental Protection Agency hopes that the pace of innovation will pick up with a revamped Energy Star program that replaces the current rating, which is based solely on standby power, with one that measures the total power consumption. The goal of many electrical utilities is to meet increasing demand through conservation, which is much cheaper than building new generators and power lines.
This trend toward low-power TV technology has already begun with current “professional” Panasonic plasmas that can limit brightness peaks and, as a result, overall power draw. It’s just the start: Matsushita (Panasonic’s corporate parent) has teamed up with Pioneer and Hitachi to create a prototype plasma screen that uses half as much power as today’s typical display. First shown at Japan’s CEATEC show, the prototype required a total reengineering of the set. These low-power plasma TVs could be on the market in a year or two.
Also, look for a new generation of LCD TVs that get rid of the three color filters and use an LED backlight instead. A Samsung prototype wide-screen, 32-inch, high-definition display consumes only 80 watts, about half as much as comparable sets. It could be on sale next year. Look for LEDs to help projector TVs as well. We’ve seen projector prototypes that are miserly enough to run on a battery but don’t yet come close to the light intensity produced by a traditional high-voltage lamp. Finally, the future may belong to the organic light-emitting diode, or OLED, a technology that delivers bright, rich screens but for much less power. At the moment, they’re so small they’re useful only for cell phones, but like all TV technologies, the trick is to start small while thinking big.