Linux NVidia Optimus on ThinkPad W520/W530 with external monitor – finally solved

How NVidia Optimus works on Thinkpad W520/W530 laptops

ThinkPads (and probably some other laptops) with an NVIDIA Quadro 1000M or 2000M are wired so that the Intel integrated graphics chip does all the rendering and display for the built-in LCD screen, while all external outputs (VGA/DisplayPort) are wired through the NVIDIA GPU, which can also be fired up on demand to render 3D stuff. So what's the problem?

Basically, in order to have an external monitor connected to DisplayPort or VGA, we have two options:

  1. Go to BIOS -> Config -> Display, and set graphics to "Discrete Only". This makes NVIDIA your primary graphics card, which, with the proprietary NVIDIA drivers, will make your external monitor work. However, it also means your battery life will suffer. In my case, I saw a 60-70% decrease in battery life with this setup, so it was a no-go.
  2. Keep the Intel card as the primary GPU and fire up the NVIDIA card only when an external monitor is needed. As of a few weeks ago, I have a complete working solution of this kind: it does not eat your battery when you're unplugged, does not require you to restart X or reboot when you connect or disconnect a monitor, lets you connect one or two external monitors to your laptop, and is relatively easy to set up.
Needless to say, this blog post focuses on #2.

 

How to get there?

Groundwork steps:

  1. Go to BIOS -> Config -> Display, select "NVidia Optimus", and make sure "Optimus OS Detection" is enabled.
  2. Boot into your Linux install and log in.
You will need the latest NVIDIA drivers installed. At the time of writing, the version is 331.20. On Ubuntu 13.10, the installation looks like this:
sudo add-apt-repository ppa:xorg-edgers/ppa 
sudo apt-get update 
sudo apt-get install nvidia-331

Now we need to install bumblebee:

sudo add-apt-repository ppa:bumblebee/stable
sudo apt-get install bumblebee bumblebee-nvidia bbswitch-dkms
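If you want a quick sanity check that everything landed before moving on, a simple package query will do (the names below are just the packages we installed in the previous steps):

dpkg -l | grep -E 'nvidia-331|bumblebee|bbswitch'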

At this point, I recommend a reboot.

Configure bumblebee

A few more things are needed in order to get this running, and I'll cover them now. First, you'll need to edit /etc/bumblebee/bumblebee.conf, find these parameters, and change them so they look like this:

KeepUnusedXServer=true
Driver=nvidia
KernelDriver=nvidia-331
PMMethod=none (this one appears in two places in the file; change both)
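If you'd rather script this than edit the file by hand, something like the following sed invocation applies the four changes above. This is just a sketch; the layout of bumblebee.conf can vary between versions, so double-check the result afterwards. The PMMethod substitution covers both occurrences:

sudo sed -i \
  -e 's/^KeepUnusedXServer=.*/KeepUnusedXServer=true/' \
  -e 's/^Driver=.*/Driver=nvidia/' \
  -e 's/^KernelDriver=.*/KernelDriver=nvidia-331/' \
  -e 's/^PMMethod=.*/PMMethod=none/' \
  /etc/bumblebee/bumblebee.conf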

Next, edit /etc/bumblebee/xorg.conf.nvidia and make it look like this:

Section "ServerLayout"
    Identifier    "Layout0"
EndSection

Section "Device"
    Identifier    "DiscreteNvidia"
    Driver        "nvidia"
    VendorName    "NVIDIA Corporation"
    BusID         "PCI:01:00:0"
    Option        "ProbeAllGpus" "false"
    Option        "NoLogo" "true"
EndSection
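The BusID above is where the discrete GPU usually sits on these ThinkPads. If you want to confirm it on your machine before relying on it, lspci will tell you (the comments show roughly what to expect; the exact model string will differ):

lspci | grep -i nvidia
# e.g. 01:00.0 VGA compatible controller: NVIDIA Corporation ... [Quadro 1000M/2000M]
# if your card is not at 01:00.0, adjust the BusID line accordingly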

Intel-virtual-output tool

First, you will need the latest xf86-video-intel driver installed (2.99). Ubuntu 13.10 ships with it, so you don't need to update the driver in that case. However, what made all of this possible is the latest release of the intel-virtual-output tool, which comes bundled with the xf86-video-intel driver source. Ubuntu's package does not bundle it, so we need to compile it from source. One MAJOR thing to note here: DO NOT compile it from Ubuntu's deb-src package. That package is old, and the current release has some major fixes for the tool that we actually need in order to have everything working properly. So let's do it:

sudo apt-get install xorg-dev git autoconf automake libtool xutils-dev
git clone git://anongit.freedesktop.org/xorg/driver/xf86-video-intel
cd xf86-video-intel
./autogen.sh 
cd tools
make 
sudo cp intel-virtual-output /usr/bin/ 
sudo chmod +x /usr/bin/intel-virtual-output

Oh, and now that precious moment we’ve all been waiting for

Now, connect your external monitor to VGA or DisplayPort, and run this:

sudo modprobe bbswitch
optirun true
intel-virtual-output

And you're done! What the above commands did is fire up the NVIDIA card in the background so that we can use its external ports, and start another X server in the background which runs on the NVIDIA card. However, all your apps are still rendered via the Intel card, and their output can be proxied to the external monitor. Just open up KDE System Settings -> Display and Monitor, and you'll see the monitors as you normally would, and you can place them in any position you like. The same goes for Unity's settings, and if you prefer the command line, xrandr works too (see the example below).
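The output names below (LVDS1, VIRTUAL1) are only examples; run plain xrandr first to see what your machine calls them:

xrandr                                            # list outputs, note the VIRTUALn name of the external screen
xrandr --output VIRTUAL1 --auto --right-of LVDS1  # put the external monitor to the right of the panel
xrandr --output VIRTUAL1 --off                    # disable it again before disconnecting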

You might notice a small lag here and there (nothing of major importance), but that's being worked on, and future kernel and driver releases will improve the situation.

Wanna go mobile and turn off the nvidia card? No problem.

Now that you’ve enjoyed your static setup, it’s time to go mobile without draining the battery. These are the simple steps to do so:

  1. Disconnect your external monitor
  2. # kill the second X server. 
    # To find the process, run: ps ax | grep Xorg
    # You should see something like this
    $ ps ax | grep Xorg
    3342 ?        Ss    68:08 Xorg :8 -config /etc/bumblebee/xorg.conf.nvidia -configdir /etc/bumblebee/xorg.conf.d -sharevts -nolisten tcp -noreset -verbose 3 -isolateDevice PCI:01:00:0 -modulepath /usr/lib/nvidia-331/xorg,/usr/lib/xorg/modules 
    # now kill the process 
    $ sudo kill -15 3342
  3. # Now you need to turn off the nvidia card completely.
    sudo rmmod nvidia
    sudo tee /proc/acpi/bbswitch <<<OFF
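To confirm the card really powered down, read the bbswitch state back; it should report the discrete card as OFF:

cat /proc/acpi/bbswitch
# 0000:01:00.0 OFF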

That’s it. I hope I made your day :)



47 Responses to Linux NVidia Optimus on ThinkPad W520/W530 with external monitor – finally solved

  1. Anh says:

    With the last command I got an error:

    Failed to find available VirtualHead "VIRTUAL1" for "VGA-0" on display ":8"

    and the external monitor still appears in the system settings->displays

    • scyth says:

      What does your system log say about discrete card when you run optirun and intel-virtual-output? Can you paste the logs?

  2. Anh says:

    … does not appear .. I meant

  3. Stephan Fabel says:

    Two comments:

    1) you need the autotools package installed as well, otherwise you won’t be able to compile the intel driver

    2) when I activate this on my W520, all three displays are being recognized, but their screens overlap (i.e. I see portions of the first external monitor mirrored on the other external monitor).

    I see these messages in /var/log/syslog:


    Jan 2 15:12:16 majestic kernel: [ 87.917426] bbswitch: version 0.8
    Jan 2 15:12:16 majestic kernel: [ 87.917432] bbswitch: Found integrated VGA device 0000:00:02.0: \_SB_.PCI0.VID_
    Jan 2 15:12:16 majestic kernel: [ 87.917437] bbswitch: Found discrete VGA device 0000:01:00.0: \_SB_.PCI0.PEG_.VID_
    Jan 2 15:12:16 majestic kernel: [ 87.917466] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20130517/nsarguments-95)
    Jan 2 15:12:16 majestic kernel: [ 87.918337] bbswitch: detected an Optimus _DSM function
    Jan 2 15:12:16 majestic kernel: [ 87.918369] pci 0000:01:00.0: enabling device (0000 -> 0003)
    Jan 2 15:12:16 majestic kernel: [ 87.918434] bbswitch: Succesfully loaded. Discrete card 0000:01:00.0 is on
    Jan 2 15:12:20 majestic kernel: [ 91.403961] nvidia: module license 'NVIDIA' taints kernel.
    Jan 2 15:12:20 majestic kernel: [ 91.403967] Disabling lock debugging due to kernel taint
    Jan 2 15:12:20 majestic kernel: [ 91.415275] vgaarb: device changed decodes: PCI:0000:01:00.0,olddecodes=io+mem,decodes=none:owns=none
    Jan 2 15:12:20 majestic kernel: [ 91.415479] [drm] Initialized nvidia-drm 0.0.0 20130102 for 0000:01:00.0 on minor 1
    Jan 2 15:12:20 majestic kernel: [ 91.415484] NVRM: loading NVIDIA UNIX x86_64 Kernel Module 331.20 Wed Oct 30 17:43:35 PDT 2013
    Jan 2 15:12:20 majestic acpid: client connected from 2031[0:999]
    Jan 2 15:12:20 majestic acpid: 1 client rule loaded
    Jan 2 15:12:21 majestic kernel: [ 92.517537] nvidia 0000:01:00.0: irq 57 for MSI/MSI-X
    Jan 2 15:12:21 majestic kernel: [ 92.523852] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20130517/nsarguments-95)
    Jan 2 15:12:21 majestic kernel: [ 92.523899] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20130517/nsarguments-95)
    Jan 2 15:12:21 majestic kernel: [ 92.523922] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20130517/nsarguments-95)
    Jan 2 15:12:21 majestic kernel: [ 92.523944] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20130517/nsarguments-95)
    Jan 2 15:12:21 majestic kernel: [ 92.523965] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20130517/nsarguments-95)
    Jan 2 15:12:21 majestic kernel: [ 92.523985] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20130517/nsarguments-95)
    Jan 2 15:12:21 majestic kernel: [ 92.524171] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20130517/nsarguments-95)
    Jan 2 15:12:21 majestic kernel: [ 92.524192] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20130517/nsarguments-95)
    Jan 2 15:12:22 majestic kernel: [ 93.712709] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20130517/nsarguments-95)
    Jan 2 15:12:23 majestic acpid: client connected from 2031[0:999]
    Jan 2 15:12:23 majestic acpid: 1 client rule loaded
    Jan 2 15:12:23 majestic bumblebeed[1266]: [XORG] (WW) "glamoregl" will not be loaded unless you've specified it to be loaded elsewhere.
    Jan 2 15:12:23 majestic bumblebeed[1266]: [XORG] (WW) "xmir" is not to be loaded by default. Skipping.
    Jan 2 15:12:23 majestic bumblebeed[1266]: [XORG] (WW) Unresolved symbol: fbGetGCPrivateKey
    Jan 2 15:12:23 majestic bumblebeed[1266]: [XORG] (WW) NVIDIA(0): UBB is incompatible with the Composite extension. Disabling
    Jan 2 15:12:23 majestic bumblebeed[1266]: [XORG] (WW) NVIDIA(0): UBB.
    Jan 2 15:12:23 majestic bumblebeed[1266]: [XORG] (EE) synaptics: SynPS/2 Synaptics TouchPad: Synaptics driver unable to detect protocol
    Jan 2 15:12:23 majestic bumblebeed[1266]: [XORG] (EE) PreInit returned 11 for "SynPS/2 Synaptics TouchPad"

    This is on a fresh install of Kubuntu 13.10, with nothing else configured after installation except a complete upgrade of all packages and the steps outlined above.

    • scyth says:

      Go to System Settings -> Display and Monitor. That's where you configure the positioning of your screens. Play around with it and see what works best for you. You can also do this manually with the 'xrandr' command. Check online resources for usage specifics, but it should be easy.

      • Dustin says:

        I’m having the same/similar problem as Stephan on a W530 running up to date Arch.

        It works great for one external monitor, but no combination of settings will make it work with two external monitors docked. I can extend the desktop from the LVDS-1 panel to one of the monitors, but if I try to turn VIRTUAL4 / VIRTUAL5 on with LVDS-1 off (or on), VIRTUAL4 and 5 are exact clones of each other. I'm assuming this is a bug in the auto-mapping in the intel-virtual-output tool, but I haven't been able to figure out where to start digging.

      • Dustin says:

        I was finally able to resolve my issue by switching to the nouveau driver. Interestingly enough, I could get a configuration that worked if I ran nvidia-settings under optirun, but when applying the same settings with xrandr, the screens would still overlap.

        Using nouveau it worked as expected the first time.

  4. med says:

    is suspend/hibernate working ?

  5. anon says:

    You made my day :) I’d been struggling to make my ext mon work with my laptop (ubuntu 13.10) for a week. Thanks!

    I have two questions.

    You say that you installed the latest nvidia driver and then bumblebee-nvidia. When I attempted to do so bumblebee-nvidia removed the latest nvidia drivers and I got kernel errors. It worked however when I used only bumblebee. Didn’t you have this problem?

    You also say “However, all your apps are still rendered via Intel card, but can be proxied to external monitor”

    What does that mean? The nvidia card isn’t used for rendering? But if a program is run with optirun then the nvidia card should handle rendering for that program, no?

    • scyth says:

      If you run apps with optirun, then yes, nvidia card will render those apps. Otherwise, intel card does the rendering, which is what we want.

  6. Steve says:

    I tested it with a fresh installation of Ubuntu 13.10 x64, compiled intel-virtual-output, and it fired up the screen connected to the DisplayPort. But it stays white. I can move the mouse pointer around the second screen, and when I click on the settings icon on the LVDS it draws Unity on the second screen and already-moved windows appear. It seems this is the only time the output is updated.

    What can this be?

    • Steve says:

      There is a bug in the ubuntu version. With the one from xf86 it’s working great

      • scyth says:

        I would also suggest to everyone to compile the latest xf86 driver, as it resolves a huge memory leak as well. Overall, it works much better than the packaged version.

  7. Steve says:

    An initial GUI supporting intel-virtual-output can be found on my GitHub page.

  8. Luke Swart says:

    Thank you for this post. To compile the intel drivers from source, I installed the autotools package via `autoconf` and `automake`. I also needed to install `libtool` and `xorg-utils-dev` (I may be mistaken on the last package name). I hope this helps any newbies out there.

  9. fragmede says:

    Yeah! I got it working and you made my day, what with not-having-to-restart-X and everything!

    One small issue though.

    Whenever I suspend, with nothing connected, the extra X server killed, the nvidia module removed, bbswitch set to off, and intel-virtual-output not running; and then I resume, with or without power connected, with or without the bbswitch module loaded, X insists that VIRTUAL3's output should be enabled until I run "xrandr --output VIRTUAL3 --off".

    Can anything be done to have VIRTUAL3 be disabled until I explicitly enable it?

  10. Luke says:

    Working great on a Lenovo T530. Thanks!

  11. Luke Swart says:

    After disconnecting my external monitor, how can I reconnect it? When I run `optirun` again, I get the error:

    `[16929.676769] [ERROR]Cannot access secondary GPU – error: [XORG] (EE) Server terminated successfully (0). Closing log file.

    [16929.676811] [ERROR]Aborting because fallback start is disabled.`

    Any ideas on how to restart my Nvidia driver? My external monitor also disconnects automatically after suspend. (I am running Ubuntu 13.10 on W520). More info here: http://unix.stackexchange.com/questions/118816/how-to-reload-nvidia-driver-after-rmmod-nvidia-optimus-multiscreen-how-to-re

    • Frankie says:

      Hey Luke I was having the same problem …. I believe scythe left out the following line:

      sudo tee /proc/acpi/bbswitch <<<ON

      You have to switch bb on again. Then follow with the optirun and intel commands
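      Putting it together, the re-enable sequence would be something like:

      sudo tee /proc/acpi/bbswitch <<<ON
      optirun true
      intel-virtual-output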

  12. Ivo says:

    Thanks, much better than the solution for previous ubuntus, which required compiling the intel graphics driver with a patch. Two notes:
    * for me this works with the regular ‘nvidia-current’ (304) driver
    * for me this works without requiring ‘modprobe bbswitch’ and ‘optirun true’.

  13. Krzysztof Suszyski says:

    Hey, checkout my scripts that automate enable/disable of external monitor based on this article: https://gist.github.com/cardil/10533170

  14. Tristan Muntsinger says:

    It’s xutils-dev, not xorg-utils-dev

  15. Luke Swart says:

    Any confirmation about whether this works for Ubuntu 14.04?

    • Luke Swart says:

      Trying this on Ubuntu 14.04, I get the following error:

      autoreconf: running: automake --add-missing --copy --no-force
      src/Makefile.am:37: error: Libtool library used but ‘LIBTOOL’ is undefined
      src/Makefile.am:37: The usual way to define ‘LIBTOOL’ is to add ‘LT_INIT’
      src/Makefile.am:37: to ‘configure.ac’ and run ‘aclocal’ and ‘autoconf’ again.
      src/Makefile.am:37: If ‘LT_INIT’ is in ‘configure.ac’, make sure
      src/Makefile.am:37: its definition is in aclocal’s search path.

    • scyth says:

      It works on ubuntu 14.04 as well.

  16. Ben says:

    I’ve been trying to get this to work on my W520 running Debian Jessie (3.13-1-amd64). I have followed all of the steps, but I have not been able to get an external display working. Everything seems to be going well until I try to run intel-virtual-output and get this error:
    Failed to find available VirtualHead "VIRTUAL1" for "VGA-0" on display ":8"

    I have searched for information about this error, but Google just keeps bringing me back here. I see that the first comment mentions this error, but no solution was mentioned. I am using version 331.67 of the nvidia driver and the intel driver packaged in xserver-xorg-video-intel. Both are from the Debian repository. I built the intel-virtual-output tool as instructed. I have tried this line in /etc/X11/xorg.conf.d/20-intel.conf, but it did not help:
    Option "VirtualHeads" "2"

    I see errors about type mismatches (see below) when I run “optirun true”, but I have read somewhere that that was not likely to cause a problem.

    Here is my log after loading bbswitch:
    May 11 12:54:17 blackbox kernel: [ 3043.774419] bbswitch: version 0.8
    May 11 12:54:17 blackbox kernel: [ 3043.774427] bbswitch: Found integrated VGA device 0000:00:02.0: \_SB_.PCI0.VID_
    May 11 12:54:17 blackbox kernel: [ 3043.774433] bbswitch: Found discrete VGA device 0000:01:00.0: \_SB_.PCI0.PEG_.VID_
    May 11 12:54:17 blackbox kernel: [ 3043.774443] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
    May 11 12:54:17 blackbox kernel: [ 3043.774792] bbswitch: detected an Optimus _DSM function
    May 11 12:54:17 blackbox kernel: [ 3043.774807] pci 0000:01:00.0: enabling device (0000 -> 0003)
    May 11 12:54:17 blackbox kernel: [ 3043.774840] bbswitch: Succesfully loaded. Discrete card 0000:01:00.0 is on

    Here is what comes up in the log after running “optirun true”:
    May 11 12:55:43 blackbox kernel: [ 3129.862337] nvidia: module license 'NVIDIA' taints kernel.
    May 11 12:55:43 blackbox kernel: [ 3129.862344] Disabling lock debugging due to kernel taint
    May 11 12:55:43 blackbox kernel: [ 3129.873681] vgaarb: device changed decodes: PCI:0000:01:00.0,olddecodes=io+mem,decodes=none:owns=none
    May 11 12:55:43 blackbox kernel: [ 3129.873814] NVRM: loading NVIDIA UNIX x86_64 Kernel Module 331.67 Fri Apr 4 13:48:39 PDT 2014
    May 11 12:55:43 blackbox kernel: [ 3130.532239] nvidia 0000:01:00.0: irq 55 for MSI/MSI-X
    May 11 12:55:43 blackbox kernel: [ 3130.538085] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
    May 11 12:55:43 blackbox kernel: [ 3130.538144] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
    May 11 12:55:43 blackbox kernel: [ 3130.538175] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
    May 11 12:55:43 blackbox kernel: [ 3130.538202] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
    May 11 12:55:43 blackbox kernel: [ 3130.538229] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
    May 11 12:55:43 blackbox kernel: [ 3130.538255] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
    May 11 12:55:43 blackbox kernel: [ 3130.538496] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
    May 11 12:55:43 blackbox kernel: [ 3130.538524] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
    May 11 12:55:52 blackbox kernel: [ 3138.744374] ACPI Warning: \_SB_.PCI0.PEG_.VID_._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
    May 11 12:55:52 blackbox systemd[1]: Starting ACPI event daemon...
    May 11 12:55:52 blackbox systemd[1]: Started ACPI event daemon.
    May 11 12:55:52 blackbox acpid: starting up with netlink and the input layer
    May 11 12:55:52 blackbox acpid: 42 rules loaded
    May 11 12:55:52 blackbox acpid: waiting for events: event logging is off
    May 11 12:55:52 blackbox acpid: client connected from 2994[0:999]
    May 11 12:55:52 blackbox acpid: 1 client rule loaded
    May 11 12:55:52 blackbox bumblebeed[1101]: [ 3140.075480] [WARN][XORG] (WW) Unresolved symbol: fbGetGCPrivateKey
    May 11 12:55:52 blackbox bumblebeed[1101]: [ 3140.075511] [WARN][XORG] (WW) NVIDIA(0): UBB is incompatible with the Composite extension. Disabling
    May 11 12:55:52 blackbox bumblebeed[1101]: [ 3140.075516] [WARN][XORG] (WW) NVIDIA(0): UBB.

    This is what xrandr shows:
    Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 32767 x 32767
    LVDS1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
    1920x1080 60.0*+ 59.9 50.0
    1680x1050 60.0 59.9
    1600x1024 60.2
    1400x1050 60.0
    1280x1024 60.0
    1440x900 59.9
    1280x960 60.0
    1360x768 59.8 60.0
    1152x864 60.0
    1024x768 60.0
    800x600 60.3 56.2
    640x480 59.9
    VGA1 disconnected (normal left inverted right x axis y axis)

    • scyth says:

      From what I can see, Debian (even Jessie) uses v2.21 of the intel driver. That one won't work; you'll need 2.99. When you're building the intel-virtual-output tool, you can install the driver from there as well. Just run make install and copy /usr/local/lib/xorg/modules/drivers/intel_drv.* to /usr/lib/xorg/modules/drivers/
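      In shell form, that would be roughly (paths assume the default /usr/local prefix from autogen.sh; adjust if your distro keeps Xorg modules elsewhere):

      cd xf86-video-intel
      ./autogen.sh && make
      sudo make install
      sudo cp /usr/local/lib/xorg/modules/drivers/intel_drv.* /usr/lib/xorg/modules/drivers/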

      This should fix it.

  17. Ole says:

    Man, you are my hero!!!
    Thank you, thank you, thank you.

  18. Mehmed says:

    Thanks, very good article.
    Is it possible to implement these steps on CentOS 6 or SL 6?

  19. Pingback: Nvidia Optimus with Bumblebee on Linux | Luke's Blog

  20. Grant says:

    Yes, you did make my day! I'd buy you a beer if I could.

    I have been trying to get this running as a script, but the "tee" command is tricky and I can't figure out a way to do it. If it's the first time I'm running external, no problem, as the /proc/acpi/bbswitch file is absent, but on a subsequent run there are problems trying to change the OFF to an ON. So I'm trying something like:

    ——————————————————-
    #!/bin/sh

    if [ -r /proc/acpi/bbswitch ]
    then
    sudo -c "tee /proc/acpi/bbswitch <<<ON"
    else
    sudo modprobe bbswitch
    fi

    optirun true
    intel-virtual-output
    #put xrandr command line here
    ——————————————————-

    but the <<< redirect being executed by sudo isn't working. Any ideas?

    And I fixed it myself. Always the way after I write a question. Found it easier to unload and load the bbswitch module each time. So:
    —Enable external———————————————————-
    #!/bin/sh

    if [ -r /proc/acpi/bbswitch ]
    then
    sudo modprobe -r bbswitch
    fi

    sudo modprobe bbswitch load_state=1 unload_state=0
    optirun true
    intel-virtual-output
    sleep 4
    $HOME/.screenlayout/External-27.sh
    —————————————————————————–

    —Disable External——————————————————
    #!/bin/sh

    $HOME/.screenlayout/Internal-only.sh

    sleep 4

    sudo kill -15 $(ps -ef | grep "Xorg :8" | grep -v grep | awk '{print $2}')

    sleep 2
    sudo rmmod nvidia
    sudo modprobe -r bbswitch
    —————————————————————————–

    Thanks

    Grant

  21. Luke Swart says:

    Just today I have an issue with the “ppa:xorg-edgers/ppa”, where I could not load my desktop. After logging in, I get the background and pointer, but no menu bars or ability to launch programs. After executing “ppa-purge xorg-edgers”, I can use my laptop, but obviously no more Bumblebee support. Does anyone else have this problem? Any suggestions? I am using Ubuntu 14.04 on a W530.

  22. Simon says:

    Hi,

    I tried this on my W520 with Arch Linux. Most of it is working correctly. I can configure the second screen in the settings tool from the gnome desktop. The external monitor is switching on – but shows nothing. I can only see a black screen.

    Have you any ideas how I can see the desktop on the external monitor?

    Thank you!

  23. Suat Karakusoglu says:

    I have followed the tutorial and encountered a few issues I would like to address:
    1) one of the package names is misspelled in the install step
    => sudo apt-get install xutils-dev
    2) it required me to install primus as well, so
    => sudo apt-get install primus

    Then it all worked smoothly :)
    Thanks dude !

  24. Ryan Lazuka says:

    What is the proper way to launch dual displays on startup? Right now, every time my computer starts up, I run:

    modprobe bbswitch && optirun true && intel-virtual-output

    What is the best way to automatically run this code every time my computer is restarted or powered on?

  25. Steve says:

    after the update of xorg-edgers to intel 2.99.912, turning on intel-virtual-output ends up in two black monitors on my tp 520.

    Xorg log of server 0 :
    [ 690.232] (II) config/udev: Adding drm device (/dev/dri/card1)
    [ 690.232] (II) xfree86: Adding drm device (/dev/dri/card1)
    [ 690.232] (II) LoadModule: “modesetting”
    [ 690.233] (II) Loading /usr/lib/xorg/modules/drivers/modesetting_drv.so
    [ 690.233] (II) Module modesetting: vendor=”X.Org Foundation”
    [ 690.233] compiled for 1.15.0, module version = 0.8.1
    [ 690.233] Module class: X.Org Video Driver
    [ 690.233] ABI class: X.Org Video Driver, version 15.0
    [ 690.233] xf86: found device 1
    [ 692.520] (II) intel(0): Output VIRTUAL2 has no monitor section
    [ 692.568] (II) intel(0): Output VIRTUAL3 has no monitor section
    [ 692.648] (II) intel(0): Output VIRTUAL4 has no monitor section
    [ 692.780] (II) intel(0): Output VIRTUAL5 has no monitor section
    [ 692.843] (II) intel(0): resizing framebuffer to 3840×1200
    [ 692.848] (II) intel(0): switch to mode 1920×1080@60.0 on LVDS1 using pipe 0, position (0, 0), rotation normal, reflection none
    [ 692.865] (II) intel(0): switch to mode 1920×1200 on VIRTUAL3, position (1920, 0), rotation normal, reflection none
    [ 693.008] (II) intel(0): Output VIRTUAL6 has no monitor section
    [ 693.056] (II) intel(0): Output VIRTUAL7 has no monitor section
    [ 693.128] (II) intel(0): Output VIRTUAL8 has no monitor section
    [ 693.184] (II) intel(0): Output VIRTUAL9 has no monitor section
    [ 693.621] (II) intel(0): resizing framebuffer to 1920×1200
    [ 693.624] (II) intel(0): switch to mode 1920×1200 on VIRTUAL3, position (0, 0), rotation normal, reflection none

    Also for server 8 log does not contain any errors:
    [ 692.503] (II) NVIDIA(GPU-0): Display (DELL U2412M (DFP-1)) does not support NVIDIA 3D
    [ 692.503] (II) NVIDIA(GPU-0): Vision stereo.
    [ 692.704] (II) NVIDIA(0): Setting mode “DP-0: nvidia-auto-select @1920×1200 +0+0 {ViewPortIn=1920×1200, ViewPortOut=1920×1200+0+0}”
    [ 692.719] (II) NVIDIA(0): Setting mode “NULL”
    [ 692.760] (II) NVIDIA(0): Setting mode “NULL”
    [ 693.226] (II) NVIDIA(0): Setting mode “DP-0: nvidia-auto-select @1920×1200 +0+0 {ViewPortIn=1920×1200, ViewPortOut=1920×1200+0+0}”

    Anyone else experiencing this problem?

  26. Clayton Zaugg says:

    I just wanted to say THANK YOU so very much! I am relatively new to the world of Linux and I have been scouring the web for how to get dual screens working, and it's just been a headache or less effective. Anyway, this worked perfectly; I had to retype a few things along the way (my mistakes), and I'm up and running.

    Again, thank you very much good sir/ma’am!

  27. Daniel says:

    Hi, I'm planning to buy a ThinkPad W5xx, and during my pre-buying research I've stumbled upon the issues with Optimus. Great blog!

    I was wondering about the wiring of these models. As far as I understand, the NVIDIA card is wired to the external ports and the Intel card is wired to the internal LCD. Now, considering what NVIDIA says in the release file of the 331.13 driver, I'm a bit confused and I don't understand whether hardware acceleration through the NVIDIA card actually works. Would I be able to use the discrete card and accelerate the desktop on the internal LCD screen? Would I be able to do so on the external monitor (see the caveats reported hereunder)? Would I only be able to accelerate single apps through bumblebee profiles, render them on the NVIDIA card and then feed them via RandR to the Intel card? Does the W5xx have a "mux" between the Intel and NVIDIA cards?

    http://us.download.nvidia.com/XFree86/Linux-x86/331.13/README/optimus.html

    The driver may be installed normally on Optimus systems, but the NVIDIA X driver and the NVIDIA OpenGL driver may not be able to display to the laptop’s internal display panel unless a means to connect the panel to the NVIDIA GPU (for example, a hardware multiplexer, or “mux”, often controllable by a BIOS setting) is available. On systems without a mux, the NVIDIA GPU can still be useful for offscreen rendering, running CUDA applications, and other uses that don’t require driving a display.

    On muxless Optimus laptops, or on laptops where a mux is present, but not set to drive the internal display from the NVIDIA GPU, the internal display is driven by the integrated GPU. On these systems, it’s important that the X server not be configured to use the NVIDIA X driver after the driver is installed. Instead, the correct driver for the integrated GPU should be used. Often, this can be determined automatically by the X server, and no explicit configuration is required, especially on newer X server versions. If your X server autoselects the NVIDIA X driver after installation, you may need to explicitly select the driver for your integrated GPU.

    and, same file, different section:

    http://us.download.nvidia.com/XFree86/Linux-x86/331.13/README/randr14.html

    Version 1.4 of the X Resize, Rotate, and Reflect Extension (RandR 1.4 for short) adds a way for drivers to work together so that one graphics device can display images rendered by another. This can be used on Optimus-based laptops to display a desktop rendered by an NVIDIA GPU on a screen connected to another graphics device, such as an Intel integrated graphics device or a USB-to-VGA adapter.

    [...]

    There is no synchronization between the images rendered by the NVIDIA GPU and the output device. This means that the output device can start reading the next frame of video while it is still being updated, producing a graphical artifact known as “tearing”. Tearing is currently expected due to limitations in the design of the X.Org X server.

    The NVIDIA driver currently only supports the Source Output capability. It does not support render offload and cannot be used as an output sink.

  28. Bob Stones says:

    Ok, I feel a bit of a divot here but I’m so desperate to get this working I’ll stand a bit of ridicule.

    Following your instructions I get to the point of running

    sudo apt-get install xorg-dev git autoconf automake libtool xorg-utils-dev

    and I get:

    Reading package lists… Done
    Building dependency tree
    Reading state information… Done
    E: Unable to locate package xorg-utils-dev

    Any clues for the clueless would be most appreciated

    rgds

    Bob

  29. Bob Stones says:

    got an even worse one now . . .

    sudo apt-get install xorg-dev
    Reading package lists… Done
    Building dependency tree
    Reading state information… Done
    Some packages could not be installed. This may mean that you have
    requested an impossible situation or if you are using the unstable
    distribution that some required packages have not yet been created
    or been moved out of Incoming.
    The following information may help to resolve the situation:

    The following packages have unmet dependencies:
    xorg-dev : Depends: libdmx-dev but it is not going to be installed
    Depends: libx11-dev but it is not going to be installed
    Depends: libxaw7-dev but it is not going to be installed
    Depends: libxcomposite-dev but it is not going to be installed
    Depends: libxcursor-dev but it is not going to be installed
    Depends: libxdamage-dev but it is not going to be installed
    Depends: libxext-dev but it is not going to be installed
    Depends: libxfixes-dev but it is not going to be installed
    Depends: libxfont-dev but it is not going to be installed
    Depends: libxft-dev but it is not going to be installed
    Depends: libxi-dev but it is not going to be installed
    Depends: libxinerama-dev but it is not going to be installed
    Depends: libxkbfile-dev but it is not going to be installed
    Depends: libxmu-dev but it is not going to be installed
    Depends: libxmuu-dev but it is not going to be installed
    Depends: libxpm-dev but it is not going to be installed
    Depends: libxrandr-dev but it is not going to be installed
    Depends: libxrender-dev but it is not going to be installed
    Depends: libxres-dev but it is not going to be installed
    Depends: libxss-dev but it is not going to be installed
    Depends: libxt-dev but it is not going to be installed
    Depends: libxtst-dev but it is not going to be installed
    Depends: libxv-dev but it is not going to be installed
    Depends: libxvmc-dev but it is not going to be installed
    Depends: libxxf86dga-dev but it is not going to be installed
    Depends: libxxf86vm-dev but it is not going to be installed
    Depends: xserver-xorg-dev but it is not going to be installed
    E: Unable to correct problems, you have held broken packages.

    Again any help for the newbie would be greatly appreciated

    Rgds

    Bob

  30. Fer says:

    Hi,

    Just a quick tip: you also have to update the file bumblebee.conf in order to use the nvidia-331 drivers. The file should look like this:


    [driver-nvidia]
    # Module name to load, defaults to Driver if empty or unset
    KernelDriver=nvidia-331
    PMMethod=none
    # colon-separated path to the nvidia libraries
    LibraryPath=/usr/lib/nvidia-331:/usr/lib32/nvidia-331
    # comma-separated path of the directory containing nvidia_drv.so and the
    # default Xorg modules path
    XorgModulePath=/usr/lib/nvidia-331/xorg,/usr/lib/xorg/modules
    XorgConfFile=/etc/bumblebee/xorg.conf.nvidia

    Hope this tip helps other people dealing with ThinkPads (especially W530s) in Ubuntu 14.04

  31. Miko says:

    it even works without bumblebee.
    I followed a mix of the beginning of your solution and something like this:
    sudo add-apt-repository ppa:xorg-edgers/ppa
    sudo apt-get update
    sudo apt-get dist-upgrade
    sudo apt-get install nvidia-331

    and then:

    sudo ldconfig -n
    sudo update-initramfs -u
    sudo reboot
    (found here: https://bugs.launchpad.net/ubuntu/+source/ubuntu-drivers-common/+bug/1310023)

    works great – I finally got nvidia-prime working on w520! :)
