From Bumblebee's FAQ:
Bumblebee is an effort to make Nvidia Optimus enabled laptops work on GNU/Linux systems. This feature involves two graphics cards with two different power consumption profiles, plugged in a layered way and sharing a single framebuffer.
- 1 About Bumblebee
- 2 Installation
- 3 Setup
- 4 Testing Bumblebee
- 5 Configuration
- 6 Usage
- 7 Power Management
- 8 Troubleshooting
- 9 See also
Optimus Technology is a hybrid graphics implementation without a hardware multiplexer. The integrated GPU manages the display, while the dedicated GPU handles the most demanding rendering and ships the result to the integrated GPU to be displayed. When the laptop is running on battery power, the dedicated GPU is turned off to save power and extend battery life.
Bumblebee is a software implementation based on VirtualGL and a kernel driver that make it possible to use the dedicated GPU, which is not physically connected to the screen.
How it works
Bumblebee tries to mimic the behavior of the Optimus technology: using the dedicated GPU for rendering when needed and powering it down when not in use. The present releases only support rendering on demand; power management is a work in progress.
The Nvidia dedicated card is managed as a separate X server connected to a "fake" screen (the screen is configured but not used). The second server is addressed through VirtualGL as if it were a remote server. Accordingly, you will need a series of steps to set up the kernel driver, the X server and a daemon.
To add power management functionality you need to install the bbswitch package from the AUR.
Using Nouveau Driver
To use the Nouveau driver make sure you have these packages:
- xf86-video-nouveau: experimental 3D acceleration driver
- nouveau-dri: Mesa classic DRI + Gallium3D drivers
- mesa: Mesa 3-D graphics libraries
To get them run
# pacman -S xf86-video-nouveau nouveau-dri mesa
Using Nvidia Driver
To use the Nvidia driver you need to install the Bumblebee-compatible proprietary driver packages from the AUR, together with whichever matching nvidia kernel module package you prefer. Optionally, there is a convenience package in the AUR that depends on them explicitly.
Then make sure you load the proper kernel module at startup. If you run into trouble try the official Bumblebee wiki on GitHub.
There are some post-install steps to do before you can use Bumblebee.
Giving permission to use Bumblebee
Permission to use optirun is granted to all members of the bumblebee group, so you must add yourself (and any other users wishing to use Bumblebee) to that group:
# usermod -a -G bumblebee $USER
Here $USER is the login name of the user to be added. Then log off and on again to apply the group change.
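To confirm the change took effect after logging back in, you can check your group list. A minimal sketch follows; the in_group helper is hypothetical, merely illustrating the check against the output of id -nG:

```shell
# Hypothetical helper: succeed if group $1 appears in the
# space-separated group list $2 (as printed by `id -nG`).
in_group() {
    case " $2 " in
        *" $1 "*) return 0 ;;
        *)        return 1 ;;
    esac
}

# Prints "ok" once the new group membership is active:
in_group bumblebee "$(id -nG)" && echo "ok" || echo "not yet in group"
```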
Start Bumblebee Daemon
Bumblebee provides a daemon to start the second X server and manage some privileged functions. To start it, simply run:
# rc.d start bumblebeed
To have it started at boot, add it to your DAEMONS array in /etc/rc.conf.
You can test Bumblebee with this command:
$ optirun glxspheres
If it succeeds, it means you are able to offload rendering to the Nvidia card.
You may configure some variables in the file /etc/bumblebee/bumblebee.conf.
Compression and VGL Transport
Compression and transport determine how frames are compressed on the server side (the Bumblebee X server), transported to the client side (the main X server), and uncompressed to be displayed in the application window. This mostly affects performance through CPU/GPU usage, as the transport itself is not bandwidth-limited. Compressed methods (such as
jpeg) load the CPU the most but keep GPU load to the minimum necessary; uncompressed methods load the GPU the most while keeping CPU load as low as possible.
You can try different compression methods by adding the -c argument to the optirun command and testing which suits you best:
optirun -c <compress-method> glxspheres
<compress-method> can be, for example, proxy, jpeg or yuv. Then set the one you like in the VGLTransport variable in /etc/bumblebee/bumblebee.conf:
...
[optirun]
VGLTransport=proxy
...
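If you script this change, a small shell guard can reject values other than the methods mentioned here before they end up in bumblebee.conf. This is only a sketch with a hypothetical valid_method helper; VirtualGL may accept additional methods:

```shell
# Hypothetical guard: accept only the transport values used in this
# article (proxy, jpeg, yuv); VirtualGL may support more.
valid_method() {
    case "$1" in
        proxy|jpeg|yuv) return 0 ;;
        *)              return 1 ;;
    esac
}

valid_method jpeg && echo "jpeg accepted"
```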
You can set up multiple monitors with xorg.conf. Set them to use the Intel card, but Bumblebee can still use the NVIDIA card. One example configuration is below for two identical screens with 1080p resolution and using the HDMI out.
Section "Screen"
    Identifier     "Screen0"
    Device         "intelgpu0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "TwinView" "0"
    SubSection "Display"
        Depth       24
        Modes      "1920x1080_60.00"
    EndSubSection
EndSection

Section "Screen"
    Identifier     "Screen1"
    Device         "intelgpu1"
    Monitor        "Monitor1"
    DefaultDepth    24
    Option         "TwinView" "0"
    SubSection "Display"
        Depth       24
        Modes      "1920x1080_60.00"
    EndSubSection
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    Option         "Enable" "true"
EndSection

Section "Monitor"
    Identifier     "Monitor1"
    Option         "Enable" "true"
EndSection

Section "Device"
    Identifier     "intelgpu0"
    Driver         "intel"
    Option         "XvMC" "true"
    Option         "UseEvents" "true"
    Option         "AccelMethod" "UXA"
    BusID          "PCI:0:2:0"
EndSection

Section "Device"
    Identifier     "intelgpu1"
    Driver         "intel"
    Option         "XvMC" "true"
    Option         "UseEvents" "true"
    Option         "AccelMethod" "UXA"
    BusID          "PCI:0:2:0"
EndSection
You will probably need to change the BusID:
$ lspci | grep VGA
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
The BusID is 0:2:0
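Note that lspci prints the slot in hexadecimal, while the BusID in xorg.conf is written in decimal. A sketch of the conversion follows; the to_busid helper is hypothetical:

```shell
# Hypothetical helper: convert an lspci slot such as "01:00.0"
# (hex) into the Xorg BusID form "PCI:1:0:0" (decimal).
to_busid() {
    slot=$1
    bus=${slot%%:*}     # e.g. "00"
    rest=${slot#*:}     # e.g. "02.0"
    dev=${rest%%.*}     # e.g. "02"
    fn=${rest#*.}       # e.g. "0"
    printf 'PCI:%d:%d:%d\n' "0x$bus" "0x$dev" "0x$fn"
}

# For the integrated card above:
to_busid 00:02.0    # prints PCI:0:2:0
```

Combined with lspci: `to_busid "$(lspci | awk '/VGA/ {print $1; exit}')"`.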
To launch an application using the dedicated graphics card:
$ optirun [options] <application> [application-parameters]
For a list of options for optirun, run:
$ optirun --help
If you want to run a 32-bit application on a 64-bit system you may need to install the proper 'lib32' packages.
The goal of power management is to turn the discrete card off when it is not used by any application, and turn it back on when it is needed. Currently the card can be used on-demand and no automatic switching is supported by default.
To enable it, first make sure you have installed bbswitch from the AUR.
Make sure the secondary Xorg server is stopped when not in use. Then, in the driver sections of bumblebee.conf, set the PMMethod option to bbswitch:
[bumblebeed]
KeepUnusedXServer=false
...
[driver-nvidia]
PMMethod=bbswitch
...
[driver-nouveau]
PMMethod=bbswitch
...
This should be enough, as the daemon will load the bbswitch kernel module and act accordingly. Then restart the daemon:
# rc.d restart bumblebeed
Verify that bbswitch was loaded fine:
$ lsmod | grep bbswitch
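When loaded, bbswitch also exposes the card's power state through /proc/acpi/bbswitch, as a line such as "0000:01:00.0 OFF". A sketch of reading it, with a hypothetical card_state helper:

```shell
# Hypothetical helper: extract ON/OFF from a bbswitch status line
# such as "0000:01:00.0 OFF".
card_state() {
    echo "${1##* }"     # last word of the line
}

# On a machine with bbswitch loaded you would run:
#   card_state "$(cat /proc/acpi/bbswitch)"
card_state "0000:01:00.0 OFF"    # prints OFF
```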
No devices detected
In some instances, running optirun will return:
[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.
[ERROR]Aborting because fallback start is disabled.
In this case, you will need to move the file
/etc/X11/xorg.conf.d/20-intel.conf somewhere else. Restart the bumblebeed daemon, and it should work.
Credit for this goes to Lekensteyn on #bumblebee at freenode.net
Video tearing is a somewhat common problem on Bumblebee. To fix it, you need to enable vsync. It should be enabled by default on the Intel card, but verify that from Xorg logs. To check whether or not it is enabled for nvidia, run
$ optirun nvidia-settings -c :8
X Server XVideo Settings -> Sync to VBlank and
OpenGL Settings -> Sync to VBlank should both be enabled. The Intel card generally has less tearing, so use it for video playback. In particular, use VA-API for video decoding (e.g.
mplayer-vaapi, together with the
-vsync parameter). Refer to the Intel article on how to fix tearing on the Intel card. If it is still not fixed, try disabling compositing in your desktop environment. Also try disabling triple buffering.