From ArchWiki
Revision as of 15:33, 9 April 2012 by Alex anthony (talk | contribs) (Don't confuse with nvidia-bumblebee and nvidia - normal nvidia is fine, and will update with kernel, only nvidia-utils-bumblebee must be special)

From Bumblebee's FAQ:

Bumblebee is an effort to make NVIDIA Optimus enabled laptops work in GNU/Linux systems. This feature involves two graphics cards with two different power consumption profiles plugged in a layered way, sharing a single framebuffer.

Bumblebee: Optimus for Linux

Optimus Technology is a hybrid graphics implementation without a hardware multiplexer. The integrated GPU manages the display, while the dedicated GPU handles the most demanding rendering and ships the result to the integrated GPU for display. When the laptop is running on battery, the dedicated GPU is turned off to save power and prolong battery life.

Bumblebee is a software implementation based on VirtualGL and a kernel driver that makes it possible to use the dedicated GPU, which is not physically connected to the screen.

Bumblebee tries to mimic the behavior of the Optimus technology: it uses the dedicated GPU for rendering when needed and powers it down when not in use. The present releases only support rendering on demand; power management is a work in progress.

The NVIDIA dedicated card is managed as a separate X server connected to a "fake" screen (the screen is configured but not used). The second server is accessed through VirtualGL as if it were a remote server. Consequently, a series of steps is needed to set up the kernel driver, the X server, and a daemon.

Warning: Bumblebee is still under heavy development! But your help is very welcome.


Before installing Bumblebee, check your BIOS and activate Optimus ("shareable graphics") if possible (not every BIOS provides this option), and install the intel driver for the integrated on-board graphics card.
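To confirm that your machine actually exposes two GPUs, you can list the video controllers. On Optimus systems the NVIDIA card usually appears as a separate "3D controller" entry (exact wording varies by model):

```shell
# List the video controllers; an Optimus laptop typically shows an Intel
# VGA controller plus an NVIDIA "3D controller" entry.
command -v lspci >/dev/null && lspci | grep -E 'VGA|3D' || true
```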

Note: If you want to run a 32-bit application on a 64-bit system, you must install lib32-virtualglAUR and the proper lib32-* libraries.

Installing Bumblebee with Intel / NVidia

Install bumblebeeAUR from the AUR, then install the special nvidia package for Bumblebee, nvidia-utils-bumblebeeAUR:

Warning: Do not install the original nvidia-utils alongside Bumblebee - it will break your system!

It is now safe to install the nvidia driver:

# pacman -S nvidia
Note: You can install dkms-nvidiaAUR from the AUR instead of nvidia if you need it.
Note: If you would like Bumblebee to turn off the NVIDIA card automatically after use, install bbswitchAUR from the AUR. See below.

Installing Bumblebee with Intel / Nouveau

Install nouveau and required packages first:

# pacman -S xf86-video-nouveau nouveau-dri mesa

Now install bumblebeeAUR from the AUR:

Note: If you would like Bumblebee to turn off the NVIDIA card automatically after use, install bbswitchAUR from the AUR. See below.

Start Bumblebee

In order to use Bumblebee, you need to add yourself (and any other users) to the bumblebee group:

# usermod -a -G bumblebee $USER

where $USER is the login name of the user to be added. Then log off and on again to apply the group changes.
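After logging back in, you can verify that the group change took effect. A small sketch (the in_group helper here is purely illustrative, not part of Bumblebee):

```shell
# Hypothetical helper: check whether a user belongs to a group.
in_group() {
    id -nG "$1" | tr ' ' '\n' | grep -qx "$2"
}

# After re-logging in, this should report membership:
in_group "$USER" bumblebee && echo "bumblebee group active" \
    || echo "not yet - log out and back in first"
```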

To start bumblebee automatically, add it to your DAEMONS array in /etc/rc.conf:

DAEMONS=(... @bumblebeed)

Finished! Reboot the system (or start the daemon now with rc.d start bumblebeed) and use the shell program optirun for Optimus NVIDIA rendering.


The command-line program optirun shipped with Bumblebee is your best friend for running applications on your Optimus NVIDIA card.

Test whether Bumblebee works with your Optimus system:

$ optirun glxgears

If it succeeds and the terminal you are running from mentions something about your NVIDIA card - congratulations - Optimus with Bumblebee is working!

General Usage:

$ optirun [options] <application> [application-parameters]

Some Examples:

Start Firefox accelerated with Optimus:

$ optirun firefox

Start Windows applications with Optimus:

$ optirun wine <windows application>.exe

Use Nvidia Settings with Optimus:

$ optirun nvidia-settings -c :8 

For a list of options for optirun run:

$ optirun --help


You can configure the behaviour of Bumblebee to fit your needs. Fine tuning like speed optimization, power management and other options can be configured in /etc/bumblebee/bumblebee.conf.

Optimizing Speed

Bumblebee renders frames for your Optimus NVIDIA card in an invisible X server with VirtualGL and transports them back to your visible X server.

Frames are compressed before they are transported - this saves bandwidth and can be used to speed up Bumblebee:

To use another compression method for a single application:

$ optirun -c <compress-method> application

The compression method affects performance and CPU/GPU usage. Compressed methods (such as jpeg) load the CPU the most but the GPU the least; uncompressed methods load the GPU the most and the CPU the least.

Compressed Methods are: jpeg, rgb, yuv

Uncompressed Methods are: proxy, xv

To use a standard compression method for all applications, set VGLTransport to <compress-method> in /etc/bumblebee/bumblebee.conf.
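As a sketch, the change can be scripted; the path and key name are taken from this article, and the yuv value is just an example (back up the file first):

```shell
# Sketch: set the default transport to "yuv" in bumblebee.conf.
# CONF is overridable so the snippet can be tried on a copy first;
# sed keeps a backup of the original as bumblebee.conf.bak.
CONF="${CONF:-/etc/bumblebee/bumblebee.conf}"
if [ -w "$CONF" ]; then
    sed -i.bak 's/^VGLTransport=.*/VGLTransport=yuv/' "$CONF"
fi
```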

Note: CPU frequency scaling directly affects render performance.

Power Management

The goal of the power management feature is to turn off the NVIDIA card when it is no longer used by Bumblebee.

To enable power management for Bumblebee, install bbswitchAUR from the AUR.

Warning: Make sure the secondary Xorg server is stopped when not in use!

Set the PMMethod to bbswitch in the driver section of /etc/bumblebee/bumblebee.conf:
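The relevant fragment looks roughly like this (the exact section name may differ between Bumblebee versions - check your own bumblebee.conf):

[driver-nvidia]
PMMethod=bbswitch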


Then restart the bumblebee daemon to activate power management:

# rc.d restart bumblebeed

Multiple monitors

You can set up multiple monitors with xorg.conf. Set them to use the Intel card; Bumblebee can still use the NVIDIA card. An example configuration is below for two identical 1080p screens, using the HDMI output.

Section "Screen"
    Identifier     "Screen0"
    Device         "intelgpu0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "TwinView" "0"
    SubSection "Display"
        Depth          24
        Modes          "1920x1080_60.00"
    EndSubSection
EndSection

Section "Screen"
    Identifier     "Screen1"
    Device         "intelgpu1"
    Monitor        "Monitor1"
    DefaultDepth   24
    Option         "TwinView" "0"
    SubSection "Display"
        Depth          24
        Modes          "1920x1080_60.00"
    EndSubSection
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    Option         "Enable" "true"
EndSection

Section "Monitor"
    Identifier     "Monitor1"
    Option         "Enable" "true"
EndSection

Section "Device"
    Identifier     "intelgpu0"
    Driver         "intel"
    Option         "XvMC" "true"
    Option         "UseEvents" "true"
    Option         "AccelMethod" "UXA"
    BusID          "PCI:0:2:0"
EndSection

Section "Device"
    Identifier     "intelgpu1"
    Driver         "intel"
    Option         "XvMC" "true"
    Option         "UseEvents" "true"
    Option         "AccelMethod" "UXA"
    BusID          "PCI:0:2:0"
EndSection

You will probably need to change the BusID:

$ lspci | grep VGA
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)

The BusID is 0:2:0
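Note that lspci prints the slot in hexadecimal, while the xorg.conf BusID expects decimal values in the form PCI:bus:device:function. A small conversion sketch (the slot_to_busid helper is illustrative, not a standard tool):

```shell
# Hypothetical helper: convert an lspci slot like "00:02.0" (hex)
# into the xorg.conf BusID form "PCI:0:2:0" (decimal).
slot_to_busid() {
    bus=${1%%:*}; rest=${1#*:}
    dev=${rest%%.*}; fn=${rest#*.}
    printf 'PCI:%d:%d:%d\n' "0x$bus" "0x$dev" "0x$fn"
}

slot_to_busid 00:02.0   # prints PCI:0:2:0
```

The hex-to-decimal step only matters for bus numbers above 9 (e.g. slot 0a:00.0 becomes PCI:10:0:0), but it is a common pitfall.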


Note: Please report bugs at Bumblebee-Project's GitHub tracker as described in its Wiki.

[VGL] ERROR: Could not open display :8

There is a known problem with some Wine applications that fork and kill the parent process without keeping track of it (for example, the free-to-play online game "Runes of Magic").

A workaround for this problem is:

$ optirun bash
$ optirun wine <windows program>.exe

[ERROR]Cannot access secondary GPU

In some instances, running optirun will return:

[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.

[ERROR]Aborting because fallback start is disabled.

In this case, you will need to move the file /etc/X11/xorg.conf.d/20-intel.conf somewhere else. Restart the bumblebeed daemon and it should work. Credit for this goes to Lekensteyn on #bumblebee at freenode.net.

Video tearing

Video tearing is a somewhat common problem with Bumblebee. To fix it, you need to enable vsync. It should be enabled by default on the Intel card, but verify that from the Xorg logs. To check whether or not it is enabled for the nvidia driver, run:

$ optirun nvidia-settings -c :8 

X Server XVideo Settings -> Sync to VBlank and OpenGL Settings -> Sync to VBlank should both be enabled. The Intel card generally has less tearing, so use it for video playback. In particular, use VA-API for video decoding (e.g. mplayer-vaapi with the -vsync parameter).

Refer to the Intel article on how to fix tearing on the Intel card.

If it is still not fixed, try to disable compositing from your desktop environment. Try also disabling triple buffering.

Important Links

Join us in #bumblebee on freenode.net.