Slide 15 of 47

Since GPU power consumption is a major consideration for laptops and their batteries, Apple includes two GPUs for different workloads (heavy and light). It's interesting that it is economically more beneficial for Apple to add an extra, lower-end GPU to handle lightweight rendering than to invest the same effort in improving battery life directly.


You don't necessarily need the AMD GPU for casual computing, and we don't want to power it up every time (since that costs power and battery life). Therefore, in low-energy mode, we use the integrated GPU.


GPU switching in the Nvidia Optimus system uses the GPU driver software to determine whether a program would benefit from using the discrete GPU.
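To make the idea concrete, here is a minimal toy sketch of profile-based GPU selection, loosely inspired by how Optimus-style drivers consult per-application profiles. The profile table, process names, and function are all illustrative assumptions, not a real driver API.

```python
# Toy model of driver-side GPU selection (illustrative only; not the
# actual Optimus implementation or API).

# Hypothetical profile table: applications known to benefit from the dGPU.
DISCRETE_PROFILE = {"game.exe", "renderer.exe", "video_encoder.exe"}

def select_gpu(process_name: str, on_battery: bool) -> str:
    """Pick a GPU for a launching process based on a profile lookup."""
    if process_name in DISCRETE_PROFILE and not on_battery:
        return "discrete"    # high-performance dGPU
    return "integrated"      # low-power iGPU

print(select_gpu("game.exe", on_battery=False))     # -> discrete
print(select_gpu("browser.exe", on_battery=False))  # -> integrated
print(select_gpu("game.exe", on_battery=True))      # -> integrated
```

The design point is that the decision lives in driver software, not hardware: the driver can consult vendor-maintained application profiles and current power state before routing a new process to either GPU.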


What mechanism is responsible for deciding which to use at a given time?

MacBooks are sold as a complete package, so each component can be engineered to coordinate well with every other element of the system, since their properties are known at design time (or both are designed together). Are there ways to use this configuration (one "heavy" and one "light" GPU) in other machines, including custom-built ones? Is it common in non-Apple laptops (or any desktops)?


Comparing battery life in the two cases shows how GPU switching benefits both the consumer and Apple. With the iGPU, this model lasts around 5-6 hours; with the dGPU always on, battery life can be a dismal 30 minutes to 1.5 hours at best. For consumers, the iGPU is sufficient for everyday use, so they see decent battery life. For Apple, having multiple GPUs satisfies the need for higher-end processing while maintaining decent battery-life specs. If the laptop used only the dGPU, Apple might have had to put more research and money into improving battery life, which would have cut into profit margins.
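A quick back-of-the-envelope check puts those battery-life figures in perspective. Assuming (hypothetically) a 60 Wh battery, the capacity divided by runtime gives the average system power draw implied by each case; actual capacities vary by model.

```python
# Implied average power draw from battery capacity and runtime.
# 60 Wh is an assumed, illustrative capacity.
BATTERY_WH = 60.0

def avg_power_draw(hours: float) -> float:
    """Average system power draw (watts) implied by a given runtime."""
    return BATTERY_WH / hours

print(f"iGPU (5.5 h): {avg_power_draw(5.5):.1f} W")  # ~10.9 W
print(f"dGPU (1.0 h): {avg_power_draw(1.0):.1f} W")  # ~60.0 W
```

Roughly a 5-6x difference in average draw, which is consistent with the gap between a low-power integrated GPU and a discrete GPU running continuously.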


Doesn't having two GPUs impact energy efficiency? Is this the main reason the battery drains faster when playing a video game, since both GPUs are being used?


On most Linux distros, Nvidia's software allows you to manually switch between the integrated graphics and the discrete GPU. In casual testing (a Dell XPS 15 9550 with an Intel i7-6700HQ and a GeForce GTX 960M running Ubuntu 15.10), I have noticed a definite difference in battery life between the two. In addition, the integrated graphics are sufficient for the vast majority of use cases: unless you are playing an extremely intensive game or want high-performance video encoding, the integrated graphics are more than adequate.


@temmie: The first generation of Apple laptops with two types of GPUs required the user to manually switch and reboot the system. The second generation used a very simple heuristic: if an application uses OpenGL, fire up the discrete GPU. Once the Intel GPUs stopped being garbage, the operating system became smarter, switching based on load measurements.

Edit: I'll also add that app developers can now manually specify whether their application requires the faster GPU.
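On macOS, the opt-in goes in the other direction too: an app can declare in its Info.plist that it supports automatic graphics switching, so the system is free to keep it on the integrated GPU. The key below is a real macOS Info.plist key; the surrounding fragment is a minimal illustration, not a complete plist.

```xml
<!-- Fragment of an app's Info.plist: opting in to automatic graphics
     switching, allowing the OS to serve the app from the integrated GPU. -->
<key>NSSupportsAutomaticGraphicsSwitching</key>
<true/>
```

Apps that omit this key and touch the GPU-facing APIs may force the discrete GPU on, which matches the older OpenGL-based heuristic described above.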

A bit of caution is warranted here, because swapping GPUs out from underneath a running application is risky business.