Original Link: https://www.anandtech.com/show/8619/investigating-nvidias-batteryboost-msi-gt72
Investigating NVIDIA's BatteryBoost with MSI GT72
by Jarred Walton on October 23, 2014 9:00 AM EST

![](https://images.anandtech.com/doci/8619/MSI-GT72-Back_678x452.jpg)
We previewed the performance of MSI's new GT72 notebook earlier this month, and while we're still running a few additional tests for the full review, one area we wanted to look at in more detail is BatteryBoost. Initially launched with the GTX 800M series earlier this year, the technology got its first airing here with the MSI GT70 and its GTX 880M. Unfortunately, that laptop's battery life wasn't exactly stellar even when not gaming, and powering up the GTX 880M didn't help matters. NVIDIA's stated goal is to get useful gaming battery life above two hours, which so far we haven't been able to do (and starting with a laptop that only manages 4-5 hours in our light and heavy Internet testing doesn't help).
Without BatteryBoost, the MSI GT70 managed around 50 minutes of battery life while gaming (give or take), while enabling BatteryBoost in some cases could get us up to 80+ minutes of battery life. More recently, we reviewed the updated Razer Blade (2014 edition) with a GTX 870M. We were able to see an improvement from 46 minutes without BatteryBoost to 76 minutes with BatteryBoost in our limited testing. However, if the goal is to get above two hours of gaming battery life, we're still not there.
Basically, the amount of battery life you are able to get while gaming is largely dependent on how high frame rates are without BatteryBoost and how low the target frame rate is set with BatteryBoost. If a game on battery power can run at 60FPS and BatteryBoost puts a 30FPS cap into place, battery life can improve a decent amount. A game that can hit 120FPS meanwhile would potentially experience a much larger benefit from BatteryBoost, especially when putting a 30FPS cap into effect. With GT72 and the GTX 980M, both power efficiency and performance should be better than the GT70 and GTX 880M, which means BatteryBoost has the potential to stretch its legs a bit more.
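The core mechanism behind a frame rate cap is simple, even if NVIDIA's BatteryBoost does more than this under the hood: once a frame finishes ahead of the budget, the hardware can idle instead of immediately racing into the next frame. As a rough illustration only (not NVIDIA's actual implementation), a minimal limiter looks something like this:

```python
import time

def run_with_fps_cap(render_frame, target_fps=30.0, duration_s=1.0):
    """Minimal frame-rate limiter sketch: sleep away whatever is left of
    each frame's time budget, letting the CPU/GPU idle instead of
    rendering frames no one needs."""
    frame_budget = 1.0 / target_fps
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        frame_start = time.perf_counter()
        render_frame()                      # stand-in for the game's render call
        elapsed = time.perf_counter() - frame_start
        if elapsed < frame_budget:          # finished early: idle until the budget is used
            time.sleep(frame_budget - elapsed)
        frames += 1
    return frames
```

Capping a trivial render loop at a 30FPS target produces roughly 30 frames per second of wall-clock time; the "saved" time is spent idle, which is where the power savings come from.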
For our testing, we've picked three games and run them at reasonably high settings – but not necessarily maxed out settings, as that would generally prevent BatteryBoost from providing much if any benefit. Our goal was to run settings that would allow at least 70+ FPS on battery power. Keep in mind that just because the GT72 can hit well over 60 FPS on AC power, even without BatteryBoost enabled there are some performance limitations in effect. In the end, our three games consist of Tomb Raider at High quality, the newly released Borderlands: The Pre-Sequel at nearly maxed out settings (we left PhysX on Low), and GRID Autosport with High settings. Anti-aliasing was not used in any of the games (though FXAA was enabled in Borderlands), and the resolution was set to 1080p. The power profile was set to Balanced, with the LCD running at 200 nits.
One of the interesting things about BatteryBoost is that it allows you to target a variety of frame rates (from 30 to 60 FPS in 5 FPS intervals). NVIDIA has also stated that they're doing more than just frame rate targeting, so we wanted to test that by enabling VSYNC and running without BatteryBoost at a steady 60FPS. Since BatteryBoost also doesn't inherently enable VSYNC, that was one more variable to test. (While in theory anything between 30 and 60 FPS should result in a 30FPS frame rate with VSYNC enabled, at least in GRID Autosport that doesn't happen, either due to triple buffering or some other factors.)
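The parenthetical above is worth making concrete. With classic double-buffered VSYNC, a finished frame is held until the next display refresh, so the frame time effectively rounds up to a whole number of refresh intervals; on a 60Hz panel, a game rendering anywhere between 30 and 60 FPS misses every other refresh and locks to 30FPS. A simplified model (which ignores triple buffering, the likely reason GRID Autosport doesn't behave this way):

```python
import math

def vsync_fps(render_ms, refresh_hz=60.0):
    """Effective FPS under idealized double-buffered VSYNC: the frame
    time rounds UP to a whole number of refresh intervals (~16.7 ms
    each at 60Hz), quantizing the displayed frame rate."""
    period_ms = 1000.0 / refresh_hz
    intervals = math.ceil(render_ms / period_ms)
    return refresh_hz / intervals

# A game rendering at 45 FPS (22.2 ms/frame) quantizes down to 30 FPS:
print(vsync_fps(1000.0 / 45))   # → 30.0
```

Triple buffering removes the hold-until-refresh stall, which lets the average frame rate float between the quantized steps.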
In the end, for at least one game – GRID Autosport – we tested both with and without VSYNC at 10FPS intervals with BatteryBoost, plus checking performance without BatteryBoost. That's ten different settings to test, and with each cycle requiring at least several hours we've been running BatteryBoost tests almost non-stop since our preview article. This is about the most effort we've ever put into testing gaming battery life on a laptop, and it might be a while before we decide to delve into this subject in such a fashion again. So join us as we thoroughly investigate BatteryBoost on the GTX 980M.
BatteryBoost: Gaming Battery Life x 3
First things first, let's talk about the typical FPS (Frames Per Second) before we get to BatteryBoost. Plugged in, GRID Autosport at our 1080p High settings will average around 145 FPS, making it a good candidate for BatteryBoost. Perhaps more importantly, even when running on battery power GRID Autosport is able to average nearly 120 FPS (117 to be precise). Obviously, enabling BatteryBoost FPS targets will result in the average FPS equaling the target – 30, 40, 50, or 60 is what we tested – and testing without BatteryBoost but with VSYNC will result in a 60 FPS average.
As for the other games, Tomb Raider at High (not the benchmark, but actually running the full game for our battery test) gets around 107 FPS on AC power but drops to 70-73 FPS on battery power, so we wouldn't expect nearly as much of a benefit from BatteryBoost, especially at the 60FPS target. Borderlands: The Pre-Sequel falls roughly between those two, getting 135-140 FPS on AC power and dropping to around 88 FPS on battery power.
If BatteryBoost is simply benefiting from lower FPS, our VSYNC results should be the same as the BatteryBoost 60FPS results, but as we'll see in a moment that's not the case. Figuring out exactly what NVIDIA is doing is a bit more complex, and we'll discuss this more on the next page. First, let's start with GRID Autosport and run some detailed tests at 10FPS intervals and see how battery life scales.
Interestingly, while BatteryBoost on its own is able to do a good job at improving battery life – the GT72 goes from just 55 minutes without BatteryBoost to 112 minutes with a 30FPS target – tacking on VSYNC adds a bit more battery life on top of that. Our best result is at 30FPS with VSYNC enabled, where the GT72 manages 124 minutes, just surpassing NVIDIA's target of two hours. Of course, you'll probably want to stop a few minutes earlier to make sure your game progress is saved (if applicable), and 30FPS isn't the best gaming experience. Moving to higher FPS targets, BatteryBoost offers diminishing returns, but that's sort of expected. Even at 60FPS however, BatteryBoost still manages 90 minutes compared to 70 minutes without BatteryBoost but with VSYNC.
Given VSYNC appears to help even when BatteryBoost is enabled, for our remaining tests we simply left VSYNC on (except for the one non-BatteryBoost test). We ended up running four tests: no BatteryBoost without VSYNC, no BatteryBoost with VSYNC, and then BatteryBoost at 60 and 30 FPS targets with VSYNC. Here's the same data from the above chart, but confined to these four test results.
GRID shows a nice, almost linear progression going from 30FPS to 60FPS to no BatteryBoost with VSYNC, and then finally to fully unlimited performance. What's interesting is that the other two games we tested don't show this same scaling…
Borderlands: The Pre-Sequel has lower frame rates by default, so BatteryBoost isn't able to help quite as much. Normal performance on battery power without VSYNC results in 54 minutes of gaming, which is pretty similar to the result with GRID Autosport. That actually makes sense as in both games we're basically running the system as fast as it will go. Putting a 60FPS cap into effect via VSYNC, battery life only improves by a few minutes, while tacking on BatteryBoost with a 60FPS target gets us up to 62 minutes. Since we're starting at just under 90FPS with no frame rate cap, the smaller gains in battery life with a 60FPS target aren't a surprise, but the very modest 15% improvement is less than I expected. Dropping to a 30FPS target, we're not quite able to get two hours, but we do come quite close at 112 minutes – so essentially double the battery life compared to running at full performance.
Last is Tomb Raider, and as the game with the lowest starting FPS (on battery power) I expected to see the smallest gains in battery life. Interestingly, battery life without BatteryBoost or VSYNC starts at 57 minutes, slightly more than the other two games, but Tomb Raider is known for being more of a GPU stress test than something that demands a lot of the CPU, so perhaps the Core i7-4710HQ just doesn't need to work as hard. Turning on VSYNC does almost nothing (the one minute increase is basically within the margin of error), and BatteryBoost targeting 60FPS is only slightly better (six minutes more than without BatteryBoost). Once we target 30FPS, the end result is about the same as Borderlands TPS: 113 minutes, just missing a 100% improvement in battery life.
Just for kicks, I ran a separate test with Tomb Raider using 1080p and Normal quality with the BatteryBoost 30FPS setting to see if I could get well over two hours by further reducing image quality. While there's still a lot going on that requires power from the system – remember we're dealing with a 45W TDP CPU and around a 100W maximum TDP GPU, plus various other components like the motherboard, LCD, storage, and RAM – at these moderate quality settings I was able to get 125 minutes out of Tomb Raider.
In essence, the less work the GPU has to do and the higher the starting frame rates, the more likely BatteryBoost is to help. It's little wonder then that NVIDIA's discussion of BatteryBoost often makes mention of League of Legends. The game is definitely popular, and what's more it's fairly light on the GPU. By capping FPS at 30 it's easy to see how such a light workload can reach into the 2+ hour range. Interestingly, with Tomb Raider managing 2.08 hours at Normal quality, and given the GT72 uses an ~87 Wh battery, that means the power draw of the notebook during this test is only around 41-42W – not bad for a notebook with a theoretical maximum TDP (under AC power) of roughly 150W.
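The 41-42W figure above is just the battery capacity divided by the runtime; working it through (using the approximate 87 Wh capacity and the 125-minute Tomb Raider Normal-quality result):

```python
BATTERY_WH = 87.0          # GT72 battery capacity (approximate)
runtime_h = 125 / 60       # Tomb Raider, Normal quality, 30FPS BatteryBoost

# Average system power draw implied by fully draining the battery:
avg_power_w = BATTERY_WH / runtime_h
print(f"{avg_power_w:.1f} W")   # → 41.8 W
```

Note this assumes the full rated capacity is usable and the drain is roughly constant, so it's an estimate rather than a measurement.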
A Closer Look at Clock Speeds and Power
Wrapping things up, while we've shown that BatteryBoost can certainly improve battery life, there's still the question of what exactly NVIDIA is doing behind the scenes. We know they're playing with the maximum FPS of course, but a frame rate cap alone isn't (always) able to match what BatteryBoost can deliver. To try and shed some additional light on what's going on internally, I logged performance data while running our three BatteryBoost gaming tests. This time, however, the goal was not to fully drain the battery but rather to try and find out what's going on in terms of clock speeds and power draw at a lower level; that means the tests were shorter and there may be more variance, but the numbers are generally in agreement.
There are four tests for each game where I logged data: AC power is the baseline, then I tested DC power without BatteryBoost, with BatteryBoost and a 60FPS target, and finally with BatteryBoost and a 30FPS target. I also tested all four settings with and without VSYNC. I won't guarantee the numbers are 100% accurate, as I have to rely on a utility to report clock speeds and other items, so I won't create any potentially misleading charts; nonetheless, the results are rather interesting to discuss.
First, under AC power the CPU is basically left to run free, and in most cases it will run near its maximum Turbo Boost clocks (3.2-3.3GHz); it also consumes quite a bit of power (25-35W) when VSYNC is off. The GTX 980M meanwhile is running basically full tilt (1100MHz plus or minus ~25MHz on the Core thanks to GPU Boost 2.0, and 5000MHz RAM). Turning VSYNC on gives us a taste of things to come, however: average CPU clocks are typically much lower (1800-2000MHz, with spikes up to 3400MHz and lows of 800MHz) and average CPU package power is likewise substantially lower (10-30W). The GPU clocks don't change much, but GPU utilization drops from close to 100% (95-99%, depending on the game) to 32-55%. Switch to battery power and things start to get a bit interesting.
Let's discuss the three games I tested in turn, starting with Tomb Raider. The CPU clock speeds and power tend to vary substantially based on the game, and the GPU varies a bit as well though not as much as the CPU. Even without BatteryBoost, CPU clocks are often at their lowest level (800MHz), and turning on VSYNC actually resulted in higher average CPU clocks but lower average CPU power – the logging data may not be capturing fully accurate CPU clocks, though I suspect the power figures are pretty accurate. GPU clocks show some similarly odd behavior: without VSYNC the average GPU clock was 479MHz with 3200MHz GDDR5, but utilization is at 97%; with VSYNC the average GPU clocks are a bit higher (~950/3200 core/RAM) but utilization is just under 52%.
Enabling BatteryBoost with 60FPS and 30FPS targets continues to generate somewhat unexpected results. At 60FPS, the CPU is generally close to the base 800MHz, but it does average slightly higher when VSYNC is on; power draw from the CPU is pretty consistent at around 6.1-6.5W for the package. Average GPU clocks meanwhile make a bit more sense (they're slightly lower with VSYNC enabled), while average GPU utilization is slightly higher with VSYNC. Overall, however, system power use is much lower with BatteryBoost than without, which is what we'd expect from our earlier battery testing results. It looks like in Tomb Raider the GPU (plus the rest of the system except for the CPU) draws around 60-65W without BatteryBoost, and that drops to 50-55W with BatteryBoost at 60FPS. Our 30FPS BatteryBoost numbers meanwhile don't show a significant change in CPU clocks (still close to the minimum 800MHz), but with the lower FPS the CPU doesn't have to work as hard so CPU package power is now down to around 4.6-4.7W. On the GPU front, the core clocks are around 670-700MHz with close to 50% utilization, but the GDDR5 memory is now running at 1620MHz, so there are some definite power savings there. Average power draw from the GPU and system (again, minus the CPU) is around 35-40W.
Borderlands: The Pre-Sequel behaves quite differently on battery power. The AC results are about the same (CPU and GPU basically running as fast as they can), but now on DC power without BatteryBoost the CPU continues to run at relatively high clocks (3.0-3.4GHz), and as you'd expect power draw remains pretty high as well (20-25W). With BatteryBoost at 60FPS, VSYNC actually had substantially higher CPU clocks (and CPU power use – 14.6W with VSYNC compared to 11.2W without, though the test didn't last as long so there's more chance for variance), but at 30FPS things start to look a lot more like Tomb Raider: the CPU runs at 800-1500MHz, with a 1.0GHz average with VSYNC and 1.125GHz average without; CPU power is 6-7W as well (slightly lower with VSYNC). As for the GPU, things aren't all that different; there's a hard cap of 3.2GHz on the GDDR5 when running off the battery, and while the 980M is frequently at that mark when striving for 60FPS, it's mostly at 1620MHz on the 30FPS setting. The GPU (and system other than CPU) draw close to 50W at 60FPS and 35W at 30FPS, while running without BatteryBoost puts things closer to 60W.
With GRID Autosport, the results on AC power and on DC without BatteryBoost are basically similar to the other two games, though the CPU apparently isn't working as hard as in Borderlands. On AC power it uses 35W and that drops to 23W with VSYNC; on DC without BatteryBoost the CPU is drawing 25W and 15W with VSYNC. The GPU plus other system components meanwhile look to be drawing around 66W without BatteryBoost and 56W with VSYNC enabled. Turn on BatteryBoost and again at 60FPS we see higher CPU clocks (and higher CPU power use) when VSYNC is enabled, but we're talking about 10.7W without VSYNC and 13.7W with VSYNC, and apparently other factors can make up for the difference. The GPU and other components draw around 42W without VSYNC and 39W with VSYNC, so it balances out. Last but not least, at 30FPS the CPU package power averages ~7.3W without VSYNC and ~7.8W with VSYNC, while the GPU and remaining components use 35.7W without VSYNC and 31.8W with VSYNC.
Based on our testing of three different games, it appears BatteryBoost is most effective in games that don't hit the CPU as hard, though with caveats. Tomb Raider for example is known to be pretty easy on the CPU (i.e. a slower AMD APU would likely get close to the same frame rates as a fast Core i7 when paired with the same GPU). However, the type of calculations each game uses (including AI) mean that in some cases a game that doesn't appear to be very CPU intensive may still draw a fair amount of power from the CPU. In general, it looks like the GTX 980M under most gaming workloads will draw at least 25-30W of power (and another 5W or so for the motherboard, RAM, LCD, etc.), which means the lower the CPU load the better. In some cases it should be possible to get the entire GT72 notebook close to 35W while gaming, which would mean the 87Wh battery might last up to nearly 2.5 hours; more realistically, I'd expect most games will pull 40-45W even at the 30FPS target with BatteryBoost, which equates to 1.9 to 2.2 hours at most. Obviously if you have a game that's more taxing (e.g. Metro: Last Light), you'll get even less battery life.
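Running the runtime projection in the other direction (battery capacity divided by estimated draw, again assuming the full 87 Wh is usable):

```python
BATTERY_WH = 87.0   # GT72 battery capacity (approximate)

def runtime_hours(avg_draw_w, capacity_wh=BATTERY_WH):
    """Estimated runtime from average system power draw, assuming the
    full rated capacity is usable at a constant drain."""
    return capacity_wh / avg_draw_w

# Best case (~35W) vs. the more realistic 40-45W range at a 30FPS target:
for draw in (35, 40, 45):
    print(f"{draw} W -> {runtime_hours(draw):.2f} h")
```

This reproduces the figures in the text: ~2.5 hours at 35W, and 1.9-2.2 hours across the 40-45W range.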
With that said, one other interesting piece of information is that in our Light battery test (Internet surfing) using the same Balanced power profile, with the GTX 980M enabled the GT72 manages around 220 minutes of mobility. (Our Heavy battery test drops it to 165 minutes, if you're wondering.) While two hours of gaming isn't going to be enough for a LAN party, it's still quite impressive to see the GTX 980M effectively drawing about as much power as a GT 750M when BatteryBoost is enabled – though in most cases it's also providing roughly the same level of performance as a GT 750M (under AC power).
Closing Thoughts
The bottom line is that BatteryBoost is certainly improving battery life, though it does so at the cost of frame rates. Considering many console games target 30FPS it's not a horrible solution, but gamers willing to fork out the money for a notebook with a GTX 980M are likely to pack around their AC adapter so that they can get every ounce of performance possible out of their notebook. At some point, I still want to see a gaming notebook that can deliver a decent gaming experience at 60FPS and high quality for more than two hours – and once we reach that level, I'll want to see three or four hours of gaming battery life, I'm sure. It's the great thing about technology: there's always some further milestone to try to achieve.
The results of our testing also highlight another interesting potential for BatteryBoost: G-SYNC. While no one has created a G-SYNC enabled notebook display (at least, not that I'm aware of), I personally find that 30FPS is a bit too choppy but 40+ FPS with G-SYNC can work very well. The amount of power needed to reach 60FPS tends to be a lot higher than what would be needed for 40FPS, so at some point NVIDIA may have to work on G-SYNC notebooks. Of course, G-SYNC might draw a bit more power as well for the extra circuitry, and for now G-SYNC also means no Optimus Technology (unless NVIDIA can figure out a workaround), but I suspect NVIDIA will cross those bridges when the time is right.
I suppose since I'm here testing the GT72, I should also note that I really like the changes MSI made with this model compared to the previous GT70. The decision to forego Optimus is also proving to be interesting; I like the idea of automatically switching to the Processor Graphics in theory, but there are definitely times when it gets in the way. For instance, I was just testing Civilization: Beyond Earth performance; none of the Optimus enabled laptops would let me connect an external 4K display over DisplayPort and run it (most likely due to a bug in either the Intel or NVIDIA drivers, though I'd lean towards Intel). What's more, I can't add a custom resolution through the Intel drivers of 2560x1440, because that "exceeds the available bandwidth", never mind the fact that 3840x2160 @ 60Hz works fine.
The full review of the GT72 will post next week, but if you're looking for a short verdict, I really like the notebook. It's expensive, and the battery is no longer externally accessible (so you can't take two or three batteries with you, though I don't know many people that ever do that). Overall however the design is much better looking, performance is great, and the dual cooling fans are definitely doing their job. When the IPS panels arrive, this will be one awesome notebook.