
Warning: Bumblebee is a work in progress and may not work properly on your machine
Note: Please report bugs at Bumblebee-Project's GitHub tracker as described in its Wiki.

From Bumblebee's FAQ:

Bumblebee is an effort to make Nvidia Optimus enabled laptops work in GNU/Linux systems. This feature involves two graphics cards with two different power consumption profiles plugged in a layered way, sharing a single framebuffer.

About Bumblebee

Optimus Technology is a hybrid graphics implementation without a hardware multiplexer. The integrated GPU manages the display while the dedicated GPU handles the most demanding rendering and ships the result to the integrated GPU to be displayed. When the laptop runs on battery, the dedicated GPU is turned off to save power and extend battery life.

Bumblebee is a software implementation based on VirtualGL and a kernel driver to be able to use the dedicated GPU, which is not physically connected to the screen.

How it works

Bumblebee tries to mimic the behavior of Optimus technology: it uses the dedicated GPU for rendering when needed and powers it down when not in use. The present releases only support on-demand rendering; power management is a work in progress.

The Nvidia dedicated card is managed by a separate X server connected to a "fake" screen (the screen is configured but not used). The second server is called through VirtualGL as if it were a remote server. As a result, you will need a series of steps to set up the kernel driver, the X server and a daemon.


Installation

Install the package bumblebeeAUR from the AUR. If you want the bleeding edge, in-development version, you can install bumblebee-gitAUR instead. Both packages can be used with the Nvidia or Nouveau drivers. For 32-bit applications on 64-bit systems you must install lib32-virtualglAUR and the relevant lib32-* libraries.
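If you have not built an AUR package before, a minimal manual build looks like the following, assuming you have already downloaded the tarball for the package from its AUR page:

$ tar xf bumblebee.tar.gz
$ cd bumblebee
$ makepkg -si

makepkg -si builds the package and installs it along with its available dependencies; an AUR helper can automate the same steps.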

To add power management functionality you need to install the bbswitchAUR package.

Using Nouveau Driver

To use the Nouveau driver, make sure you have the xf86-video-nouveau, nouveau-dri and mesa packages. To install them, run:

# pacman -S xf86-video-nouveau nouveau-dri mesa

Using Nvidia Driver

Warning: Do not install nvidia-utils from [extra] or lib32-nvidia-utils from [multilib]; both packages will break libGL.

To use the Nvidia driver you need to install nvidia-utils-bumblebeeAUR from the AUR and the nvidia kernel module of your choice; either dkms-nvidiaAUR or nvidia will work. Optionally, the nvidia-bumblebeeAUR package in the AUR depends explicitly on nvidia-utils-bumblebeeAUR for convenience.

Then make sure the proper kernel module is loaded at startup. If you run into trouble, consult the official Bumblebee wiki on GitHub.
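For example, with the initscripts configuration used elsewhere in this article, you can load the module at boot via the MODULES array in /etc/rc.conf (use nouveau instead of nvidia if you chose the Nouveau driver):

MODULES=(... nvidia)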


Setup

There are some post-install steps to complete before you can use Bumblebee.

Giving permission to use Bumblebee

Permission to use optirun is granted to all members of the bumblebee group, so you must add yourself (and any other users wishing to use Bumblebee) to that group:

# usermod -a -G bumblebee $USER

where $USER is the login name of the user to be added. Then log off and on again to apply the group changes.
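After logging back in, you can verify that the group change took effect:

$ groups

The output should now include bumblebee.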

Start Bumblebee Daemon

Bumblebee provides a daemon to start the second X server and manage some privileged functions. To start it, simply run:

# rc.d start bumblebeed

To have it started at boot, add it to your DAEMONS array in /etc/rc.conf:

DAEMONS=(... @bumblebeed)

Testing Bumblebee

You can test Bumblebee with this command:

$ optirun glxspheres

If it succeeds, it means you are able to offload rendering to the Nvidia card.
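As an additional check, you can compare the OpenGL renderer reported with and without optirun (glxinfo is part of the mesa-demos package); the second command should report the Nvidia or Nouveau renderer:

$ glxinfo | grep "OpenGL renderer"
$ optirun glxinfo | grep "OpenGL renderer"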


Configuration

You may configure some variables in the file /etc/bumblebee/bumblebee.conf.

Compression and VGL Transport

Compression and transport concern how frames are compressed on the server side (the Bumblebee X server), transported to the client side (the main X server) and uncompressed to be displayed in the application window. This mostly affects performance in terms of GPU/CPU usage, as the transport itself is not bandwidth-limited. Compressed methods (such as jpeg) load the CPU the most but load the GPU only as much as necessary; uncompressed methods load the GPU the most while keeping CPU load to a minimum.

Note: CPU frequency scaling directly affects rendering performance.

You can try different compression methods by adding the -c argument to the optirun command and test which suits you best:

$ optirun -c <compress-method> glxspheres

Where <compress-method> can be jpeg, xv, proxy, rgb or yuv. You can then set the one you prefer in the VGLTransport variable in /etc/bumblebee/bumblebee.conf.

Note: The uncompressed methods proxy and xv show lower framerates, but they perform better in some applications.
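For example, to make the proxy transport permanent, the relevant part of /etc/bumblebee/bumblebee.conf would look roughly like this (the exact section name may differ between Bumblebee versions):

[optirun]
VGLTransport=proxy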

Multiple monitors

You can set up multiple monitors with xorg.conf. Set them to use the Intel card; Bumblebee can still use the NVIDIA card. An example configuration for two identical 1080p screens using the HDMI output is below.

Section "Screen"
    Identifier     "Screen0"
    Device         "intelgpu0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "TwinView" "0"
    SubSection "Display"
        Depth          24
        Modes          "1980x1080_60.00"

Section "Screen"
    Identifier     "Screen1"
    Device         "intelgpu1"
    Monitor        "Monitor1"
    DefaultDepth   24
    Option         "TwinView" "0"
    SubSection "Display"
        Depth          24
        Modes          "1980x1080_60.00"

Section "Monitor"
    Identifier     "Monitor0"
    Option         "Enable" "true"

Section "Monitor"
    Identifier     "Monitor1"
    Option         "Enable" "true"

Section "Device"
    Identifier     "intelgpu0"
    Driver         "intel"
    Option         "XvMC" "true"
    Option         "UseEvents" "true"
    Option         "AccelMethod" "UXA"
    BusID          "PCI:0:2:0"

Section "Device"
    Identifier     "intelgpu1"
    Driver         "intel"
    Option         "XvMC" "true"
    Option         "UseEvents" "true"
    Option         "AccelMethod" "UXA"
    BusID          "PCI:0:2:0"

You will probably need to change the BusID:

$ lspci | grep VGA
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)

Here the BusID is 0:2:0.


Usage

To launch an application using the dedicated graphics card:

$ optirun [options] <application> [application-parameters]

For a list of options for optirun run:

$ optirun --help

If you want to run a 32-bit application on a 64-bit system, you may need to install the appropriate lib32 packages.
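Any application can be prefixed this way; as illustration (assuming the programs themselves are installed):

$ optirun glxgears
$ optirun wine windows_application.exe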

Power Management

The goal of power management is to turn the discrete card off when it is not used by any application, and turn it back on when it is needed. Currently the card can be used on-demand and no automatic switching is supported by default.

To enable it, first make sure you have installed bbswitchAUR.

Make sure the secondary Xorg server is stopped when not in use. Then, in the driver section of bumblebee.conf, set the PMMethod option to bbswitch.
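A minimal sketch of the relevant part of /etc/bumblebee/bumblebee.conf, assuming the proprietary Nvidia driver (use the corresponding driver section if you run Nouveau):

[driver-nvidia]
PMMethod=bbswitch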


This should be enough, as the daemon will load the bbswitch kernel module and act accordingly. Then restart the daemon:

# rc.d restart bumblebeed

Verify that bbswitch was loaded correctly:

$ lsmod | grep bbswitch
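If the module is loaded, bbswitch also exposes the current power state of the discrete card, which is handy for checking that it really powers off when idle (this interface is provided by bbswitch itself, not by Bumblebee):

$ cat /proc/acpi/bbswitch
0000:01:00.0 OFF

The PCI address in the output will match your own card; ON or OFF reflects the current state.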


Troubleshooting

No devices detected

In some instances, running optirun will return:

[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.
[ERROR]Aborting because fallback start is disabled.

In this case, you will need to move the file /etc/X11/xorg.conf.d/20-intel.conf somewhere else. Restart the bumblebeed daemon and it should work. Credit for this goes to Lekensteyn on #bumblebee at freenode.net.
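For example (the backup location is arbitrary):

# mv /etc/X11/xorg.conf.d/20-intel.conf /etc/X11/20-intel.conf.bak
# rc.d restart bumblebeed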

Video tearing

Video tearing is a somewhat common problem with Bumblebee. To fix it, you need to enable vsync. It should be enabled by default on the Intel card, but verify that from the Xorg logs. To check whether it is enabled for the Nvidia card, run:

$ optirun nvidia-settings -c :8

X Server XVideo Settings -> Sync to VBlank and OpenGL Settings -> Sync to VBlank should both be enabled. The Intel card generally has less tearing, so use it for video playback. In particular, use VA-API for video decoding (e.g. mplayer-vaapi with the -vsync parameter). Refer to the Intel article for how to fix tearing on the Intel card. If it is still not fixed, try disabling compositing in your desktop environment. Also try disabling triple buffering.
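A typical hardware-decoded playback command with mplayer-vaapi looks like the following, where video.mkv stands for your own file (exact flags may vary between builds):

$ mplayer -vo vaapi -va vaapi video.mkv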

See also