The guys from Tested.com performed rundown tests on five popular mobile phones to see whether they lived up to manufacturers’ talk-time claims. They managed to make a humorous video in the process and found that all but one came within +/- 10% of the manufacturers’ claims.
The Samsung Captivate lasted 354 minutes (rated at 350 minutes), the HTC Incredible fell short of its claimed time by just 11 minutes and the Palm Pre Plus came in at 305 minutes (rated at 330 minutes). What was even more encouraging from the real-world tests was that smartphones that weren’t fresh out of the box exceeded battery life claims. The iPhone 3GS, for example, hung in for 330 minutes even though it’s only rated to last 300 minutes. The Motorola Droid 2 was the furthest off the mark, lasting 465 minutes, which is 110 minutes (about 19%) short of Motorola’s 575 minute rating. The iPhone 4 lasted 382 minutes, about 10% under its rating.
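For the curious, the deviation arithmetic behind those figures is simple. Here’s a quick sketch using the numbers reported above (the phone names and minute values come straight from the tests; the helper function is just for illustration):

```python
# Measured talk time vs. manufacturer rating, in minutes,
# using the figures reported in the Tested.com rundown.
results = {
    "Samsung Captivate": (354, 350),
    "Palm Pre Plus": (305, 330),
    "iPhone 3GS": (330, 300),
    "Motorola Droid 2": (465, 575),
}

def pct_deviation(measured, rated):
    """Positive means the phone beat its rating; negative means it fell short."""
    return 100.0 * (measured - rated) / rated

for phone, (measured, rated) in results.items():
    print(f"{phone}: {pct_deviation(measured, rated):+.1f}%")
```

Run it and the Droid 2 comes out roughly 19% short, while the iPhone 3GS beats its rating by a full 10%.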
The Tested.com article got me thinking about battery life claims we see on PCs of all shapes and sizes. Why is it that HTC, Apple, Motorola and the rest of them can come within spitting distance of what people experience when using their smartphones as they were intended to be used, but mobile computer companies feel the need to sprinkle their battery life claims with a sack of fairy dust? When’s the last time you were able to see anything close to the claimed battery life on your notebook, netbook or Tablet PC?
The maximum battery life claims on computers are based on benchmarks, such as MobileMark. The problem with these benchmarking tools is that they don’t replicate how people actually use their computers. Instead, they spit out numbers that are more like theoretical maximums. To make matters worse, manufacturers will put their best configurations forward, then slap the battery claim on the entire series. They often put an asterisk or ‘Up To’ in little letters next to the BIG numbers on marketing material. The fine print often explains that those incredible battery claims are achieved with expensive components such as SSDs or power-sipping parts like less powerful, but more efficient processors.
In general, I take computer manufacturers’ battery life claims and chop them in half when considering a purchase. While I can sometimes stretch my notebooks’ batteries into the same ballpark (within 30% or so) of what’s claimed, it means giving up things like screen brightness and running real applications.
My HP Envy 13, for example, can last for 8 to 9.5 hours on a single charge with its extended battery, as I’ve experienced on long trips, if I dim the screen, switch off the GPU and kill the radios when not in use. When I’m not ‘trying’ though, battery life is more like 5 to 7 hours depending on the task at hand. Both are well short of the 16 hours HP advertised last fall when the product launched. I wasn’t all that surprised by the relatively low battery life; I’ve come to expect it. But what about the consumers who only get a new PC once every few years and buy into the battery life claims?
The HP Envy 13 isn’t the only notebook I have that falls well short of its battery life claims. My new MacBook Pro is rated to last for 8 to 9 hours of wireless productivity, but in the real world it’s more like 5 to 5.5 hours. That’s without stretching things out too much; I can get close to the lower of Apple’s claimed numbers if I disable the GPU completely, run a single app (Safari), switch off Bluetooth and keep an eye on the number of tabs I have open simultaneously. The average notebook user doesn’t do any of that. They just want to use their computers the same way as when they’re plugged in.
It’s no wonder I was skeptical when Apple announced that the iPad would run for up to 10 hours on a single charge. In my opinion the iPad’s killer app is its ability to meet and exceed its claimed battery life. When I get the 20% warning I know I still have enough juice to watch a feature-length movie before the screen goes black.
Should we really be surprised when a device meets or exceeds its claimed battery life? Of course companies want to put their best numbers forward, but it’s not the right thing to do. Not only do people not get what they think they’re paying for, they become skeptical of all the other claims these companies throw out there.
I would sure love it if every notebook that I put my hands on came within 20% of its stated battery life in real world scenarios. There are too many variables to count when trying to figure out how quickly an average user drains a notebook battery, while it’s pretty straightforward to measure talk time. I’m not sure what the perfect answer is, but there’s got to be a better solution than the benchmarks that are now the industry standard.