Bumblebee

From ArchWiki
Revision as of 15:57, 24 December 2012

From Bumblebee's FAQ:

Bumblebee is an effort to make NVIDIA Optimus-enabled laptops work in GNU/Linux systems. This feature involves two graphics cards with two different power consumption profiles plugged in a layered way sharing a single framebuffer.

Bumblebee: Optimus for Linux

Optimus Technology is a hybrid graphics implementation without a hardware multiplexer. The integrated GPU manages the display while the dedicated GPU handles the most demanding rendering and ships the work to the integrated GPU to be displayed. When the laptop is running on battery power, the dedicated GPU is turned off to save power and prolong battery life.

Bumblebee is a software implementation comprising two parts:

  • Render programs off-screen on the dedicated video card and display the result on the screen using the integrated video card. This bridge is provided by VirtualGL or primus (read further) and connects to an X server started for the discrete video card.
  • Disable the dedicated video card when it is not in use (see the #Power Management section).

It tries to mimic the Optimus technology behavior: using the dedicated GPU for rendering when needed and powering it down when not in use. The present releases only support rendering on demand; automatically starting a program with the discrete video card based on workload is not implemented.

Warning: Bumblebee is still under heavy development! But your help is very welcome.

Installation

Before installing Bumblebee check your BIOS and activate Optimus (older laptops call it "switchable graphics") if possible (not all BIOSes provide this option), and install the intel driver for the secondary on-board graphics card.

Several packages are available for a complete setup:

  • bumblebeeAUR - the main package providing the daemon and client programs.
  • (optional) bbswitchAUR (or dkms-bbswitchAUR) - recommended for saving power by disabling the Nvidia card.
  • (optional) If you want more than just saving power, that is, rendering programs on the discrete Nvidia card, you also need:
    • a driver for the Nvidia card: the open-source nouveau driver or the closed-source nvidia driver. See the subsections below.
    • a render/display bridge. Two packages are currently available for that, primus-gitAUR and virtualglAUR. Only one of them is necessary, but installing them side by side does not hurt.
Note: If you want to run a 32-bit application on a 64-bit system you must install the proper lib32-* libraries for the program. In addition, you need to install lib32-virtualglAUR or lib32-primus-gitAUR, depending on your choice of render bridge. Make sure you run primusrun instead of optirun if you decide to use the Primus render bridge.

Installing Bumblebee with Intel / nvidia

Warning: Do not install the original nvidia-utils with Bumblebee - it will break your system!

  • Install the special nvidia package nvidia-utils-bumblebeeAUR for bumblebee from the AUR. If you want to run 32-bit applications (like games with wine) on a 64-bit system, you need the lib32-nvidia-utils-bumblebeeAUR package too.
  • Install the kernel module nvidia-bumblebeeAUR. Unlike nvidia, this package does not depend on nvidia-utils. If you install dkms-nvidiaAUR or nvidia, do not continue upgrading if you are asked to replace nvidia-utils-bumblebeeAUR with nvidia-utils.

Installing Bumblebee with Intel / nouveau

Install nouveau and the required packages first:

# pacman -S xf86-video-nouveau nouveau-dri mesa

  • xf86-video-nouveau - experimental 3D acceleration driver
  • nouveau-dri - Mesa classic DRI + Gallium3D drivers
  • mesa - Mesa 3-D graphics libraries

Now install bumblebeeAUR from the AUR.

Start Bumblebee

In order to use Bumblebee it is necessary to add yourself (and other users) to the bumblebee group:

# gpasswd -a $USER bumblebee

where $USER is the login name of the user to be added. Then log off and on again to apply the group changes.

To start bumblebee automatically at startup, enable the bumblebeed service:

# systemctl enable bumblebeed.service
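
Enabling the service only takes effect at the next boot. Assuming a systemd setup (as the enable command above implies), the daemon can also be started right away:

```shell
# Start the Bumblebee daemon immediately, without rebooting
systemctl start bumblebeed.service

# Verify that the daemon is running
systemctl status bumblebeed.service
```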

Finished! Reboot the system and use the shell program optirun for Optimus NVIDIA rendering.

Usage

The command line program optirun shipped with Bumblebee is your best friend for running applications on your Optimus NVIDIA card.

Test whether Bumblebee works with your Optimus system:

$ optirun glxgears -info

If it succeeds and the terminal you are running from mentions something about your NVIDIA card, Optimus with Bumblebee is working!
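
To confirm which GPU actually renders, you can compare the reported OpenGL renderer strings with and without optirun (glxinfo is assumed to be available, e.g. from the mesa-demos package):

```shell
# Renderer of the integrated card (should mention Intel)
glxinfo | grep "OpenGL renderer"

# Renderer used through Bumblebee (should mention NVIDIA or nouveau)
optirun glxinfo | grep "OpenGL renderer"
```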

General Usage:

$ optirun [options] <application> [application-parameters]

Some Examples:

Start Windows applications with Optimus:

$ optirun wine <windows application>.exe

Use NVIDIA Settings with Optimus:

$ optirun nvidia-settings -c :8 

For a list of options for optirun view its manual page:

$ man optirun

A new program, primus, is soon becoming the default choice because of its better performance. Currently you need to run this program separately (unlike optirun, it does not accept options), but in the future it will be started by optirun. Usage:

$ primusrun glxgears
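
Note that primus synchronizes rendering to the display's refresh rate, so raw benchmark numbers will look capped. For benchmarking only, vertical sync can reportedly be disabled through Mesa's vblank_mode environment variable (treat this as a hint, not a guarantee for every driver):

```shell
# Benchmark only: render as fast as possible instead of syncing to vblank
vblank_mode=0 primusrun glxgears
```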

Configuration

You can configure the behaviour of Bumblebee to fit your needs. Fine tuning such as speed optimization, power management and other options can be configured in /etc/bumblebee/bumblebee.conf.

Optimizing Speed when using VirtualGL as bridge

Bumblebee renders frames for your Optimus NVIDIA card in an invisible X Server with VirtualGL and transports them back to your visible X Server.

Frames will be compressed before they are transported - this saves bandwidth and can be used for speed-up optimization of bumblebee:

To use another compression method for a single application:

$ optirun -c <compress-method> application

The compression method affects CPU/GPU usage. Compressed methods (such as jpeg) load the CPU the most but load the GPU only as much as necessary; uncompressed methods load the GPU the most while keeping CPU load to a minimum.

Compressed Methods are: jpeg, rgb, yuv

Uncompressed Methods are: proxy, xv
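
A quick way to compare the transports on your machine is to run the same benchmark with each method. glxspheres (shipped with VirtualGL; the exact binary name may differ on your system) prints a frames-per-second figure:

```shell
# Compare two transports; prefer whichever reports the higher FPS on your setup
optirun -c jpeg glxspheres
optirun -c proxy glxspheres
```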

To use a default compression method for all applications, set the VGLTransport to <compress-method> in /etc/bumblebee/bumblebee.conf:

/etc/bumblebee/bumblebee.conf
...
[optirun]
VGLTransport=proxy
...

You can also play with the way VirtualGL reads back the pixels from your graphics card. Setting the VGL_READBACK environment variable to pbo should increase performance. Compare these two:

# PBO should be faster.
VGL_READBACK=pbo optirun glxspheres
# The default value is sync.
VGL_READBACK=sync optirun glxspheres
Note: CPU frequency scaling will directly affect render performance

Power Management

The goal of the power management feature is to turn off the NVIDIA card when it is no longer used by Bumblebee. If bbswitch is installed, it will be detected automatically when the Bumblebee daemon starts. No additional configuration is necessary.

Default power state of NVIDIA card using bbswitch

The default behavior of bbswitch is to leave the card power state unchanged. bumblebeed does disable the card when started, so the following is only necessary if you use bbswitch without bumblebeed.

Set load_state and unload_state module options according to your needs (see bbswitch documentation).

/etc/modprobe.d/bbswitch.conf
options bbswitch load_state=0 unload_state=1
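
bbswitch exposes the card's power state through /proc/acpi/bbswitch (the same file the shutdown hook below writes to), which is handy for checking that the options above behave as expected. A sketch, to be run as root and only while the secondary X server is stopped:

```shell
# Query the current power state of the discrete card (prints ON or OFF)
cat /proc/acpi/bbswitch

# Manually power the card off or back on
echo OFF > /proc/acpi/bbswitch
echo ON  > /proc/acpi/bbswitch
```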

Enable NVIDIA card during shutdown

The NVIDIA card may not correctly initialize during boot if the card was powered off when the system was last shut down. One option is to set TurnCardOffAtExit=false in /etc/bumblebee/bumblebee.conf, however this will enable the card every time you stop the Bumblebee daemon, even if done manually. To ensure that the NVIDIA card is always powered on during shutdown, add the following hook function (if using bbswitchAUR):

/etc/rc.d/functions.d/nvidia-card-enable
nvidia_card_enable() {
  BBSWITCH=/proc/acpi/bbswitch

  stat_busy "Enabling NVIDIA GPU"

  if [ -w ${BBSWITCH} ]; then
    echo ON > ${BBSWITCH}
    stat_done
  else
    stat_fail
  fi
}

add_hook shutdown_poweroff nvidia_card_enable

Multiple monitors

Note: This configuration is only valid for laptops where the extra output is hardwired to the Intel card. Unfortunately this is not the case for some (or most?) laptops where, let's say, the HDMI output is hardwired to the NVIDIA card. In that case there is no ideal solution like the one shown here, but you can make your extra output at least usable with the instructions on the Bumblebee wiki page.

You can set up multiple monitors with xorg.conf. Set them to use the Intel card, but Bumblebee can still use the NVIDIA card. One example configuration is below for two identical screens with 1080p resolution and using the HDMI out.

/etc/X11/xorg.conf
Section "Screen"
    Identifier     "Screen0"
    Device         "intelgpu0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "TwinView" "0"
    SubSection "Display"
        Depth          24
        Modes          "1920x1080_60.00"
    EndSubSection
EndSection

Section "Screen"
    Identifier     "Screen1"
    Device         "intelgpu1"
    Monitor        "Monitor1"
    DefaultDepth   24
    Option         "TwinView" "0"
    SubSection "Display"
        Depth          24
        Modes          "1920x1080_60.00"
    EndSubSection
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    Option         "Enable" "true"
EndSection

Section "Monitor"
    Identifier     "Monitor1"
    Option         "Enable" "true"
EndSection

Section "Device"
    Identifier     "intelgpu0"
    Driver         "intel"
    Option         "XvMC" "true"
    Option         "UseEvents" "true"
    Option         "AccelMethod" "UXA"
    BusID          "PCI:0:2:0"
EndSection

Section "Device"
    Identifier     "intelgpu1"
    Driver         "intel"
    Option         "XvMC" "true"
    Option         "UseEvents" "true"
    Option         "AccelMethod" "UXA"
    BusID          "PCI:0:2:0"
EndSection

You will probably need to change the BusID:

$ lspci | grep VGA
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)

The BusID is 0:2:0
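
Note that lspci prints the address in hexadecimal, while the BusID in xorg.conf expects decimal values. Also, on Optimus machines the NVIDIA GPU often shows up as a "3D controller" rather than a VGA controller, so a broader search helps:

```shell
# Match both the integrated (VGA) and the discrete (3D) controller
lspci | grep -E "VGA|3D"
# An lspci address of 01:00.0 becomes BusID "PCI:1:0:0" in xorg.conf
```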

CUDA Without Bumblebee

This is not well documented, but you do not need Bumblebee to use CUDA and it may work even on machines where optirun fails. For a guide on how to get it working with the Lenovo IdeaPad Y580 (which uses the GeForce 660M), see: https://wiki.archlinux.org/index.php/Lenovo_IdeaPad_Y580#NVIDIA_Card. Those instructions are very likely to work with other machines (except for the acpi-handle-hack part, which may not be necessary).

Troubleshooting

Note: Please report bugs at Bumblebee-Project's GitHub tracker as described in its Wiki.

[VGL] ERROR: Could not open display :8

There is a known problem with some wine applications that fork and kill the parent process without keeping track of it (for example the free-to-play online game "Runes of Magic").

A workaround for this problem is:

$ optirun bash
$ optirun wine <windows program>.exe

If using NVIDIA drivers a fix for this problem is to edit /etc/bumblebee/xorg.conf.nvidia and change Option ConnectedMonitor to CRT-0.

[ERROR]Cannot access secondary GPU

No devices detected.

In some instances, running optirun will return:

[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.
[ERROR]Aborting because fallback start is disabled.

In this case, you will need to move the file /etc/X11/xorg.conf.d/20-intel.conf to somewhere else. Restart the bumblebeed daemon, and it should work.
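
Assuming a systemd setup, the two steps can look like this (moving the file to /etc/X11/ is just one way to keep it out of Xorg's config directory while preserving it):

```shell
# Move the conflicting Intel snippet out of xorg.conf.d
mv /etc/X11/xorg.conf.d/20-intel.conf /etc/X11/

# Restart the Bumblebee daemon
systemctl restart bumblebeed.service
```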

It could be also necessary to comment the driver line in /etc/X11/xorg.conf.d/10-monitor.conf.

If you are using the nouveau driver, you could try switching to the NVIDIA driver.

NVIDIA(0): Failed to assign any connected display devices to X screen 0

If the console output is:

[ERROR]Cannot access secondary GPU - error: [XORG] (EE) NVIDIA(0): Failed to assign any connected display devices to X screen 0
[ERROR]Aborting because fallback start is disabled.

You can change this line in /etc/bumblebee/xorg.conf.nvidia:

Option "ConnectedMonitor" "DFP"

to

Option "ConnectedMonitor" "CRT"

ERROR: ld.so: object 'libdlfaker.so' from LD_PRELOAD cannot be preloaded: ignored.

You possibly tried to start a 32-bit application with Bumblebee on your 64-bit system. Install lib32-virtualgl or the faster Primus render bridge primus-git from the AUR. Remember to use primusrun with Primus.

Fatal IO error 11 (Resource temporarily unavailable) on X server

Change KeepUnusedXServer in /etc/bumblebee/bumblebee.conf from false to true. Your program forks into the background and Bumblebee does not know anything about it.
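
The relevant fragment of the configuration file then reads:

```ini
; /etc/bumblebee/bumblebee.conf
[bumblebeed]
KeepUnusedXServer=true
```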

Video tearing

Video tearing is a somewhat common problem on Bumblebee. To fix it, you need to enable vsync. It should be enabled by default on the Intel card, but verify that from Xorg logs. To check whether or not it is enabled for nvidia, run

$ optirun nvidia-settings -c :8 

X Server XVideo Settings -> Sync to VBlank and OpenGL Settings -> Sync to VBlank should both be enabled. The Intel card has in general less tearing, so use it for video playback. In particular, use VA-API for video decoding (e.g. mplayer-vaapi, with the -vsync parameter).

Refer to the Intel article on how to fix tearing on the Intel card.

If it is still not fixed, try to disable compositing from your desktop environment. Try also disabling triple buffering.

It tells you you're not in the group, but you are

First, check that you are actually in the group with the groups command. If you are not in the group, add yourself (as above), then log out and back in and try again.

Otherwise, removing /var/run/bumblebeed.socket might help (forum thread).

Important Links

Join us at #bumblebee at freenode.net