How NVidia Optimus works on Thinkpad W520/W530 laptops
ThinkPads (and probably some other laptops) with an NVIDIA Quadro 1000M or 2000M are wired so that the Intel integrated graphics chip does all the rendering and display for the built-in LCD screen, while all external output ports (VGA/DisplayPort) are wired through the NVIDIA card, which can also be fired up on demand to render 3D content. So what’s the problem?
Basically, in order to have an external monitor connected to DisplayPort or VGA, we have two options:
- Go to BIOS -> Config -> Display, and set graphics to “Discrete Only”. This makes NVIDIA your primary graphics card, and with the proprietary NVIDIA drivers your external monitor will work. However, it also means your battery life will suck: in my case I saw a 60-70% drop in battery life with this setup, so it was a no-go.
- The alternative, which I’ve had working for a few weeks now: a complete solution that does not eat your battery when you’re unplugged, does not require restarting X or the computer when you want to connect/disconnect (or any such inconveniences), lets you connect one or two external monitors to your laptop, and is relatively easy to set up.
How to get there?
- Go to BIOS -> Config -> Display, and select “NVidia Optimus”, and make sure to enable “Optimus OS Detection”.
- Boot into Linux and log in. Then install the NVIDIA driver from the xorg-edgers PPA:
```shell
sudo add-apt-repository ppa:xorg-edgers/ppa
sudo apt-get update
sudo apt-get install nvidia-331
```
Now we need to install bumblebee:
```shell
sudo add-apt-repository ppa:bumblebee/stable
sudo apt-get update
sudo apt-get install bumblebee bumblebee-nvidia bbswitch-dkms
```
At this point, I recommend a reboot.
A few more things are needed to get this running, and I’ll cover them now. First, edit /etc/bumblebee/bumblebee.conf, find these parameters, and change them so they look like:
```
KeepUnusedXServer=true
Driver=nvidia
KernelDriver=nvidia-331
PMMethod=none
```

(PMMethod appears in two locations in the file; change both.)
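If you prefer to script these edits, a few sed substitutions will do. This is a sketch: it assumes each parameter sits at the start of its line, as in the stock Ubuntu config, and it is demonstrated here on a throwaway copy — point it at /etc/bumblebee/bumblebee.conf (with sudo, after backing the file up) to apply it for real.

```shell
# Demo copy standing in for /etc/bumblebee/bumblebee.conf
CONF=/tmp/bumblebee.conf.demo
printf '%s\n' 'KeepUnusedXServer=false' 'Driver=' \
              'KernelDriver=nvidia-current' 'PMMethod=auto' > "$CONF"

# Rewrite each parameter in place; ^-anchoring keeps
# "Driver=" from also matching "KernelDriver="
sed -i \
  -e 's/^KeepUnusedXServer=.*/KeepUnusedXServer=true/' \
  -e 's/^Driver=.*/Driver=nvidia/' \
  -e 's/^KernelDriver=.*/KernelDriver=nvidia-331/' \
  -e 's/^PMMethod=.*/PMMethod=none/' \
  "$CONF"

cat "$CONF"
```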
Next, edit /etc/bumblebee/xorg.conf.nvidia and make it look like this:
```
Section "ServerLayout"
    Identifier "Layout0"
EndSection

Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    VendorName  "NVIDIA Corporation"
    BusID       "PCI:01:00:0"
    Option      "ProbeAllGpus" "false"
    Option      "NoLogo" "true"
EndSection
```
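The BusID above matches the usual W520/W530 layout, but it’s worth checking yours with `lspci | grep -i nvidia`. One gotcha: lspci prints the address in hex, while xorg.conf expects decimal values. A small sketch of the conversion, demonstrated on a captured lspci line (on a real system you’d substitute your own output):

```shell
# Sample lspci line; replace with the output of: lspci | grep -i nvidia
SAMPLE='01:00.0 VGA compatible controller: NVIDIA Corporation GF108GLM [Quadro 1000M]'

SLOT=${SAMPLE%% *}     # "01:00.0"  (bus:device.function, in hex)
BUS=${SLOT%%:*}        # "01"
REST=${SLOT#*:}        # "00.0"
DEV=${REST%%.*}        # "00"
FN=${REST#*.}          # "0"

# Convert each hex field to decimal for xorg.conf
BUSID="PCI:$((0x$BUS)):$((0x$DEV)):$((0x$FN))"
echo "BusID \"$BUSID\""
```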
Next, you’ll need the latest xf86-video-intel driver installed (2.99). Ubuntu 13.10 ships with it, so in that case no driver update is needed. However, what makes all of this possible is the latest release of the intel-virtual-output tool, which comes bundled with the xf86-video-intel driver source. Ubuntu’s package does not include it, so we need to compile it from source. One MAJOR thing to note here: DO NOT compile it from Ubuntu’s deb-src package. That package is old, and the current release has some major fixes to the tool that we will actually need in order to have everything working properly. So let’s do it:
```shell
sudo apt-get install xorg-dev git autoconf automake libtool xorg-utils-dev
git clone git://anongit.freedesktop.org/xorg/driver/xf86-video-intel
cd xf86-video-intel
./autogen.sh
cd tools
make
sudo cp intel-virtual-output /usr/bin/
sudo chmod +x /usr/bin/intel-virtual-output
```
Oh, and now that precious moment we’ve all been waiting for
Now, connect your external monitor to VGA or DisplayPort, and run this:
```shell
sudo modprobe bbswitch
optirun true
intel-virtual-output
```
And you’re done! What the above commands did: they fired up the NVIDIA card in the background so that we can use its external ports for rendering, and started another X server in the background which runs on the NVIDIA card. All your apps are still rendered via the Intel card, but their output can be proxied to the external monitor. Just open KDE System Settings -> Display and Monitor, and you’ll see two monitors as you normally would, and you can place them in any position you like. The same goes for Unity’s settings.
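If you want to confirm which state the discrete card is in at any point, bbswitch exposes its power state via /proc/acpi/bbswitch (a line like `0000:01:00.0 ON`). A tiny helper sketch; the function just parses that file, and the demo below feeds it a fake copy so it runs anywhere:

```shell
# Report the discrete card's power state as seen by bbswitch
nvidia_state() {
    # $1: path to the bbswitch proc file (defaults to the real one)
    awk '{print $2}' "${1:-/proc/acpi/bbswitch}"
}

# Demo against a fake proc file; on a real system just run: nvidia_state
echo '0000:01:00.0 ON' > /tmp/bbswitch.demo
nvidia_state /tmp/bbswitch.demo    # prints ON
```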
You might notice a small lag here and there (nothing of major importance), but that’s being worked on, and future kernel and driver releases will improve the situation.
Wanna go mobile and turn off the nvidia card? No problem.
Now that you’ve enjoyed your static setup, it’s time to go mobile without draining the battery. These are the simple steps to do so:
- Disconnect your external monitor
```shell
# Kill the second X server. To find the process, run:
$ ps ax | grep Xorg
3342 ?  Ss  68:08 Xorg :8 -config /etc/bumblebee/xorg.conf.nvidia -configdir /etc/bumblebee/xorg.conf.d -sharevts -nolisten tcp -noreset -verbose 3 -isolateDevice PCI:01:00:0 -modulepath /usr/lib/nvidia-331/xorg,/usr/lib/xorg/modules

# Now kill the process:
$ sudo kill -15 3342
```
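Instead of eyeballing the ps output every time, the PID can be extracted automatically by matching on the bumblebee config path (an assumption here is that only this second X server references xorg.conf.nvidia, which holds in this setup). A sketch, demonstrated on a captured ps line so it runs anywhere:

```shell
# A captured ps line standing in for live output; on a real system use:
#   ps ax | grep Xorg        (or: pgrep -f xorg.conf.nvidia)
PSLINE='3342 ?  Ss  68:08 Xorg :8 -config /etc/bumblebee/xorg.conf.nvidia -sharevts'

# First field of the matching line is the PID
PID=$(echo "$PSLINE" | awk '/xorg\.conf\.nvidia/ {print $1}')
echo "$PID"    # prints 3342
# then:  sudo kill -15 "$PID"
```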
```shell
# Now turn off the nvidia card completely:
sudo rmmod nvidia
sudo tee /proc/acpi/bbswitch <<<OFF
```
That’s it. I hope I made your day :)