Energy Star ratings are supposed to tell consumers how much power their devices use under normal operation. But they only work if the tests and ratings reflect real-world use. A new study of Ultra HD televisions reveals that some of these devices use far more electricity than they claim if operated even slightly outside the preprogrammed conditions manufacturers use for testing, and that the end result can be substantially higher electric bills over the lifetime of the device.
A new report by the Natural Resources Defense Council (NRDC) criticizes many high-end UHD televisions for the way they implement power-saving features. In some cases, manufacturers may use motion-detecting dimming (MDD) to inflate the power savings they claim. This feature cuts backlight power when the on-screen image changes rapidly, but it saves far more power on the test clips used by the Department of Energy than on typical content. This may well reflect a weakness of the test, which switches from one type of content to another more rapidly than most broadcasts, or even sports games, do.
The NRDC team created their own content tests to nail down how the Samsung and LG televisions performed on the official IEC test versus real-world content. They found that the IEC test contains many more jump cuts, with the average scene lasting just 2.29 seconds. The average scene in their real-world content was longer, at 3.89 seconds, and those tests produced dramatically different (and less flattering) results.
Confusing standards, significant discrepancies
Not all of the problems here are a function of the test content used by the DoE. In some cases, shifting the display from its default mode into Vivid or Cinematic modes also disables automatic brightness control, or the aforementioned MDD.
Part of the problem here is that different manufacturers use the same labels to mean different things. LG, for example, calls its power-saving feature Auto Power Save; setting an LG TV to “Standard” completely disables all power-saving functions. On a Samsung TV, selecting “Standard” means that all power-saving options are enabled, but only so long as you don’t change anything. Some Samsung TVs apparently disable all power-saving options if you so much as tweak the brightness or contrast settings, even within the “Standard” profile, and they don’t warn you that the power-saving features have been switched off.
Now, it’s fair to ask how much all of this matters and how great the discrepancy is between what a TV set scores on a test versus its real-life power consumption. Here’s a comparison of results on the SAM 7100 and SAM 9000 with Automatic Brightness Control (ABC) and motion-detection dimming (Samsung calls this Motion Lighting) off vs. on when using the DoE’s own test setup.
The gaps here aren’t small, and keep in mind that these Samsung televisions turn the options off the instant you change Brightness or Contrast, even if you’ve selected the power-saving “Standard” setting. The difference in operating costs on a yearly basis is fairly small: the average US power cost is 12.6 cents per kWh, and at an estimated five hours of operation per day, the gap works out to only about $15 per year. At the same time, however, there’s no denying that the energy efficiency promised on the sticker isn’t being delivered to the consumer, at least not without a great many caveats and assumptions about operating conditions that may or may not be made clear, either by salespeople or in the device’s own documentation.
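The ~$15-per-year figure can be sanity-checked with a back-of-envelope calculation. The electricity rate and daily viewing hours below come from the text; the two wattage figures are hypothetical placeholders chosen for illustration, not the NRDC's measured numbers:

```python
# Back-of-envelope annual operating cost for a TV.
# Rate (12.6 cents/kWh) and usage (5 hours/day) are from the article;
# the wattages below are hypothetical, not NRDC measurements.

RATE_PER_KWH = 0.126   # average US residential rate, dollars per kWh
HOURS_PER_DAY = 5
DAYS_PER_YEAR = 365

def annual_cost(watts):
    """Yearly electricity cost in dollars for a device drawing `watts`."""
    kwh_per_year = watts * HOURS_PER_DAY * DAYS_PER_YEAR / 1000
    return kwh_per_year * RATE_PER_KWH

# Hypothetical example: power saving on (120 W) vs. disabled (185 W)
cost_on = annual_cost(120)
cost_off = annual_cost(185)
print(f"power saving on:  ${cost_on:.2f}/yr")
print(f"power saving off: ${cost_off:.2f}/yr")
print(f"gap:              ${cost_off - cost_on:.2f}/yr")
```

With a 65 W difference between modes, the gap lands at roughly $15 per year, consistent with the article's estimate.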
While not many people choose their devices solely on the basis of their energy efficiency, many people do take it into account when choosing between products. If Samsung and LG both make a well-regarded UHD TV, and one is listed as more energy efficient than the other, then the smart consumer will choose the device they calculate will save them money over the long term. Energy use, like vehicle mileage, is difficult to capture in a single test because how you use the device will necessarily affect the overall figure.
One takeaway from the NRDC’s results, however, is that the IEC test suite doesn’t seem to map well to real-world content, and that has knock-on effects for consumers who want to take actual energy efficiency into account. Vizio also gets dinged for the way it communicates certain features. It tells consumers to use one mode to reduce power consumption, but then tells them “Calibrated” mode produces the best picture. While this may be factually true, it implies that using the device in efficiency mode necessitates a worse image, and it offers no way for customers to fine-tune the settings to their liking.
HDR content sends power consumption spiking
One added wrinkle in this debate is high dynamic range, or HDR. HDR has many benefits: it reproduces a much broader range of brightness levels, it increases color accuracy, and it can improve video games and offer greater detail without requiring more powerful video cards (or at least, not more powerful in the conventional sense of pushing higher frame rates).
Unfortunately, in at least early models, those benefits come along with a whopping increase in power consumption, as captured below.