From Bumblebee's FAQ:
- Bumblebee is an effort to make NVIDIA Optimus enabled laptops work in GNU/Linux systems. The feature involves two graphics cards, with two different power consumption profiles, plugged in a layered way and sharing a single framebuffer.
Bumblebee: Optimus for Linux
Optimus Technology is a hybrid graphics implementation without a hardware multiplexer. The integrated GPU manages the display, while the dedicated GPU handles the most demanding rendering and ships the result to the integrated GPU to be displayed. When the laptop is running on battery power, the dedicated GPU is turned off to save power and prolong battery life. The setup has also been tested successfully on desktop machines with Intel integrated graphics and an NVIDIA dedicated graphics card.
Bumblebee is a software implementation comprising two parts:
- Rendering programs off-screen on the dedicated video card and displaying them on the screen using the integrated video card. This bridge is provided by VirtualGL or primus (read further) and connects to an X server started for the discrete video card.
- Disabling the dedicated video card when it is not in use (see the #Power management section).
Bumblebee tries to mimic the behavior of the Optimus technology: it uses the dedicated GPU for rendering when needed and powers it down when not in use. Present releases only support rendering on demand; automatically starting a program with the discrete video card based on workload is not implemented.
Installation
Before installing Bumblebee, check your BIOS and activate Optimus (older laptops call it "switchable graphics") if possible (the BIOS may not provide such an option). If neither "Optimus" nor "switchable" appears in the BIOS, still make sure both GPUs are enabled and that the integrated graphics (igfx) is the initial (primary) display. The display should be connected to the onboard integrated graphics, not the discrete graphics card. If integrated graphics had previously been disabled and discrete graphics drivers installed, be sure to remove /etc/X11/xorg.conf or any file in /etc/X11/xorg.conf.d related to the discrete graphics card.
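A quick way to spot leftover configuration that still references the discrete driver (a sketch; the paths are the standard Xorg locations, adjust as needed):
$ grep -l nvidia /etc/X11/xorg.conf /etc/X11/xorg.conf.d/* 2>/dev/null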
Install:
- bumblebee - The main package, providing the daemon and client programs.
- mesa - An open-source implementation of the OpenGL specification.
- An appropriate version of the NVIDIA driver, see NVIDIA#Installation.
- Optionally install xf86-video-intel - the Intel Xorg driver.
For 32-bit application support, enable the multilib repository and install:
- lib32-virtualgl - A render/display bridge for 32-bit applications.
- lib32-nvidia-utils or the matching lib32 legacy package from the AUR (match the version of the regular NVIDIA driver).
In order to use Bumblebee, it is necessary to add your regular user to the bumblebee group:
# gpasswd -a user bumblebee
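Also enable the Bumblebee daemon so it starts at boot; bumblebeed.service is the unit shipped with the package. Reboot (or at least re-login) afterwards so the group change takes effect:
# systemctl enable bumblebeed.service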
Note that installing Bumblebee blacklists the nvidia-drm module from loading on boot. Remember to remove this blacklist if you later switch to another solution.
Usage
Install mesa-demos and use glxgears to test if Bumblebee works with your Optimus system:
$ optirun glxgears -info
If it fails, try the following commands:
- 64 bit system:
$ optirun glxspheres64
- 32 bit system:
$ optirun glxspheres32
If the window with animation shows up, Optimus with Bumblebee is working.
If glxspheresXX worked but glxgears did not, replace "glxgears" with "glxspheresXX" in all examples below.
General usage:
$ optirun [options] application [application-parameters]
For example, start Windows applications with Optimus:
$ optirun wine application.exe
For another example, open NVIDIA Settings panel with Optimus:
$ optirun -b none nvidia-settings -c :8
For a list of all available options, see the optirun(1) man page.
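To quickly check whether the daemon is reachable and see the card's state, optirun can report its status (the output shown is illustrative):
$ optirun --status
Bumblebee status: Ready (3.2.1). X inactive. Discrete video card is off.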
Configuration
You can configure the behaviour of Bumblebee to fit your needs. Fine tuning, like speed optimization, power management and other features, can be configured in /etc/bumblebee/bumblebee.conf.
Optimizing speed
Using VirtualGL as bridge
Bumblebee renders frames for your Optimus NVIDIA card in an invisible X Server with VirtualGL and transports them back to your visible X Server. Frames will be compressed before they are transported - this saves bandwidth and can be used for speed-up optimization of bumblebee:
To use another compression method for a single application:
$ optirun -c compress-method application
The compression method affects performance in terms of GPU/CPU usage: compressed methods mostly load the CPU, while uncompressed methods mostly load the GPU.
Here is a performance table tested with an ASUS N550JV laptop and the benchmark app unigine-heaven AUR:
| Command | FPS | Score | Min FPS | Max FPS |
|---------|-----|-------|---------|---------|
| optirun -c jpeg unigine-heaven | 24.2 | 610 | 9.5 | 36.8 |
| optirun -c rgb unigine-heaven | 25.1 | 632 | 16.6 | 35.5 |
| optirun -c yuv unigine-heaven | 24.9 | 626 | 16.5 | 35.8 |
| optirun -c proxy unigine-heaven | 25.0 | 629 | 16.0 | 36.1 |
| optirun -c xv unigine-heaven | 22.9 | 577 | 15.4 | 32.2 |
Note: Lag spikes will occur when the jpeg compression method is used.
To use a standard compression for all applications, set the VGLTransport to the desired compress-method in /etc/bumblebee/bumblebee.conf:
[optirun]
VGLTransport=proxy
You can also play with the way VirtualGL reads back the pixels from your graphics card. Setting the VGL_READBACK environment variable to pbo should increase performance. Compare these two:
# PBO should be faster.
VGL_READBACK=pbo optirun glxgears
# The default value is sync.
VGL_READBACK=sync optirun glxgears
Primusrun
primusrun (from the primus package) is becoming the default choice, because it consumes less power and sometimes provides better performance than virtualgl. It may be run separately, but it does not accept options the way optirun does. Setting primus as the bridge for optirun provides more flexibility.
For 32-bit application support on 64-bit machines, install lib32-primus (multilib must be enabled).
Usage (run separately):
$ primusrun glxgears
Usage (as a bridge for optirun):
The default configuration sets virtualgl as the bridge. Override that on the command line:
$ optirun -b primus glxgears
Alternatively, set Bridge=primus in /etc/bumblebee/bumblebee.conf and you will not have to specify it on the command line.
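A minimal sketch of the relevant bumblebee.conf entry (the Bridge key belongs to the [optirun] section):
[optirun]
Bridge=primus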
Tip: Refer to #Primusrun mouse delay (disable VSYNC) if you want to disable VSYNC. Doing so can also remove mouse input delay lag and slightly increase performance.
Power management
The goal of the power management feature is to turn off the NVIDIA card when it is not used by Bumblebee any more. If bbswitch (for the stock linux kernel) or bbswitch-dkms (for other or custom kernels) is installed, it will be detected automatically when the Bumblebee daemon starts; no additional configuration is necessary. However, bbswitch is for Optimus laptops only and will not work on desktop computers. So, Bumblebee power management is not available for desktop computers, and there is no reason to install bbswitch on a desktop. (Nevertheless, the other features of Bumblebee do work on some desktop computers.)
Default power state of NVIDIA card using bbswitch
The default behavior of bbswitch is to leave the card power state unchanged.
bumblebeed does disable the card when started, so the following is only necessary if you use bbswitch without bumblebeed.
Set the load_state and unload_state module options according to your needs (see the bbswitch documentation):
/etc/modprobe.d/bbswitch.conf
options bbswitch load_state=0 unload_state=1
To run bbswitch without bumblebeed on system startup, do not forget to add bbswitch to the kernel modules to load at boot, as shown below.
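For example, using the standard modules-load.d mechanism:
/etc/modules-load.d/bbswitch.conf
bbswitch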
Enable NVIDIA card during shutdown
On some laptops, the NVIDIA card may not initialize correctly during boot if it was powered off when the system was last shut down. Therefore the Bumblebee daemon will power on the GPU when the daemon is stopped (e.g. on shutdown), due to the (default) setting TurnCardOffAtExit=false in /etc/bumblebee/bumblebee.conf. Note that this setting does not influence the power state while the daemon is running: once all optirun or primusrun programs have exited, the GPU will still be powered off.
When you stop the daemon manually, you might want to keep the card powered off while still powering it on at shutdown. To achieve the latter, add the following systemd service (if using bbswitch):
[Unit]
Description=Enable NVIDIA card
DefaultDependencies=no

[Service]
Type=oneshot
ExecStart=/bin/sh -c 'echo ON > /proc/acpi/bbswitch'

[Install]
WantedBy=shutdown.target
Then enable the unit:
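Assuming the unit above was saved as /etc/systemd/system/nvidia-enable.service (the file name here is illustrative):
# systemctl enable nvidia-enable.service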
Enable NVIDIA card after waking from suspend
The Bumblebee daemon may fail to activate the graphics card after suspending. A possible fix involves setting bbswitch as the power management method in /etc/bumblebee/bumblebee.conf:
[driver-nvidia]
PMMethod=bbswitch

[driver-nouveau]
PMMethod=bbswitch
If the above fix fails, try the following command:
# echo 1 > /sys/bus/pci/rescan
To rescan the PCI bus automatically after a suspend, create a script as described in Power management#Hooks in /usr/lib/systemd/system-sleep.
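A minimal sketch of such a hook (the file name is illustrative; system-sleep scripts receive pre or post as their first argument, and the script must be executable):
/usr/lib/systemd/system-sleep/99-rescan-pci.sh
#!/bin/sh
# Rescan the PCI bus after resume so the NVIDIA card is re-detected.
case "$1" in
    post)
        echo 1 > /sys/bus/pci/rescan
        ;;
esac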
Multiple monitors
Outputs wired to the Intel chip
If the port (DisplayPort/HDMI/VGA) is wired to the Intel chip, you can set up multiple monitors with xorg.conf. Set them to use the Intel card; Bumblebee can still use the NVIDIA card. One example configuration is below, for two identical screens with 1080p resolution, using the HDMI output.
Section "Screen" Identifier "Screen0" Device "intelgpu0" Monitor "Monitor0" DefaultDepth 24 Option "TwinView" "0" SubSection "Display" Depth 24 Modes "1920x1080_60.00" EndSubSection EndSection Section "Screen" Identifier "Screen1" Device "intelgpu1" Monitor "Monitor1" DefaultDepth 24 Option "TwinView" "0" SubSection "Display" Depth 24 Modes "1920x1080_60.00" EndSubSection EndSection Section "Monitor" Identifier "Monitor0" Option "Enable" "true" EndSection Section "Monitor" Identifier "Monitor1" Option "Enable" "true" EndSection Section "Device" Identifier "intelgpu0" Driver "intel" Option "XvMC" "true" Option "UseEvents" "true" Option "AccelMethod" "UXA" BusID "PCI:0:2:0" EndSection Section "Device" Identifier "intelgpu1" Driver "intel" Option "XvMC" "true" Option "UseEvents" "true" Option "AccelMethod" "UXA" BusID "PCI:0:2:0" EndSection Section "Device" Identifier "nvidiagpu1" Driver "nvidia" BusID "PCI:0:1:0" EndSection
You will probably need to change the BusID for both the Intel and the NVIDIA card:
$ lspci | grep VGA
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
Here the BusID is 0:2:0.
Output wired to the NVIDIA chip
On some notebooks, the digital video output (HDMI or DisplayPort) is hardwired to the NVIDIA chip. If you want to use all the displays on such a system simultaneously, the easiest solution is to use intel-virtual-output, a tool provided in the xf86-video-intel driver set, as of v2.99. It allows you to extend the existing X session onto other screens, leveraging virtual outputs to work with the discrete graphics card. Command-line usage is as follows:
$ intel-virtual-output [OPTION]... [TARGET_DISPLAY]...
-d <source display>  source display
-f                   keep in foreground (do not detach from console and daemonize)
-b                   start bumblebee
-a                   connect to all local displays (e.g. :1, :2, etc)
-S                   disable use of a singleton and launch a fresh intel-virtual-output process
-v                   all verbose output, implies -f
-V <category>        specific verbose output, implies -f
-h                   this help
If this command alone does not work, you can try running it with optirun to enable the discrete graphics and allow it to detect the outputs accordingly. This is known to be necessary on Lenovo's Legion Y720.
$ optirun intel-virtual-output
If no target displays are given on the command line, intel-virtual-output will attempt to connect to any local display. The detected displays can then be managed via any desktop display tool, such as xrandr or KDE Display. The tool will also start bumblebee (which may be left as the default installation). See the Bumblebee wiki page for more information.
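For example, a virtual output can be enabled and positioned with xrandr (the output names VIRTUAL1 and eDP1 are illustrative; check your own xrandr output):
$ xrandr --output VIRTUAL1 --auto --right-of eDP1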
When run in a terminal, intel-virtual-output will daemonize itself unless the -f switch is used. Games can be run on the external screen by first exporting the display with export DISPLAY=:8, and then running the game with optirun game_bin; however, the cursor and keyboard are not fully captured. Use export DISPLAY=:0 to revert to standard operation.
If intel-virtual-output does not detect displays, or if a no VIRTUAL outputs on ":0" message appears, then try editing the following two files as shown (and rebooting):
Section "Device" # This file does not exist by default Identifier "intelgpu0" Driver "intel" EndSection
Section "ServerLayout" Identifier "Layout0" Option "AutoAddDevices" "true" # Bumblebee defaults to false Option "AutoAddGPU" "false" EndSection Section "Device" Identifier "DiscreteNvidia" Driver "nvidia" VendorName "NVIDIA Corporation" Option "ProbeAllGpus" "false" Option "NoLogo" "true" Option "UseEDID" "true" # Bumblebee defaults to false Option "AllowEmptyInitialConfiguration" # Add this line # Option "UseDisplayDevice" "none" # Remove or comment out this line EndSection Section "Screen" # Add this section Identifier "Screen0" Device "DiscreteNvidia" EndSection
See  for further configurations to try. If the laptop screen is stretched and the cursor is misplaced while the external monitor shows only the cursor, try killing any running compositing managers.
If you do not want to use intel-virtual-output, another option is to configure Bumblebee to leave the discrete GPU on and directly configure X to use both the screens, as it will be able to detect them.
As a last resort, you can run two X servers. The first uses the Intel driver for the notebook's screen; the second is started through optirun on the NVIDIA card and shows on the external display. Make sure to disable any display/session manager before manually starting your desktop environment with optirun. Then you can log in to the integrated-graphics powered one.
Multiple NVIDIA Graphics Cards
If you have multiple NVIDIA graphics cards (e.g. when using an eGPU with a laptop that has another built-in NVIDIA graphics card), you need to make a minor edit to /etc/bumblebee/xorg.conf.nvidia. If this change is not made, the daemon may default to using the internal NVIDIA card.
First, determine the BusID of the external card:
$ lspci | grep -E "VGA|3D"
00:02.0 VGA compatible controller: Intel Corporation HD Graphics 530 (rev 06) 01:00.0 3D controller: NVIDIA Corporation GM107M [GeForce GTX 960M] (rev a2) 0b:00.0 VGA compatible controller: NVIDIA Corporation GP104 [GeForce GTX 1070] (rev a1)
In this case, the BusID of the external card is 0b:00.0. Open /etc/bumblebee/xorg.conf.nvidia and add the following to the "Device" section:
Section "Device" ... BusID "PCI:11:00:0" Option "AllowExternalGpus" "true" # If the GPU is external ... EndSection
Note that the hexadecimal 0b from lspci became the decimal 11 in the xorg BusID.
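The conversion can be done in the shell, for example:
$ printf '%d\n' 0x0b
11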
CUDA without Bumblebee
You can use CUDA without Bumblebee. All you need to do is ensure that the NVIDIA card is on:
# tee /proc/acpi/bbswitch <<< ON
Now, when you start a CUDA application, it will automatically load all the necessary modules.
To turn off the nvidia card after using CUDA do:
# rmmod nvidia_uvm
# rmmod nvidia
# tee /proc/acpi/bbswitch <<< OFF
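To verify the card's power state at any point (assuming bbswitch is loaded), query its proc interface:
$ cat /proc/acpi/bbswitch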
Troubleshooting
[VGL] ERROR: Could not open display :8
There is a known problem with some wine applications that fork and kill the parent process without keeping track of it (for example the free-to-play online game "Runes of Magic"). This is a known problem with VirtualGL. As of Bumblebee 3.1, as long as you have it installed, you can use Primus as your render bridge:
$ optirun -b primus wine windows_program.exe
If this does not work, an alternative workaround for this problem is:
$ optirun bash
$ optirun wine windows_program.exe
If using NVIDIA drivers, a fix for this problem is to edit /etc/bumblebee/xorg.conf.nvidia and change the Option "ConnectedMonitor" to "CRT-0".
Xlib: extension "GLX" missing on display ":0.0"
If you tried to install the NVIDIA driver from the NVIDIA website, this is not going to work.
1. Uninstall that driver in a similar way:
# ./NVIDIA-Linux-*.run --uninstall
2. Remove the Xorg configuration file generated by NVIDIA:
# rm /etc/X11/xorg.conf
3. (Re)install the correct NVIDIA driver: #Installation
[ERROR]Cannot access secondary GPU: No devices detected
In some instances, running optirun will return:
[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.
[ERROR]Aborting because fallback start is disabled.
In this case, you will need to move the file /etc/X11/xorg.conf.d/20-intel.conf somewhere else; restart the bumblebeed daemon and it should work. If you do need to change some features of the Intel module, a workaround is to merge /etc/X11/xorg.conf.d/20-intel.conf into /etc/X11/xorg.conf.
It could also be necessary to comment out the driver line in the relevant Xorg configuration file.
If you are using the nouveau driver, you could try switching to the nvidia driver.
You might need to define the NVIDIA card somewhere (e.g. in the file /etc/bumblebee/xorg.conf.nvidia), using the correct BusID according to the lspci output:
Section "Device" Identifier "nvidiagpu1" Driver "nvidia" BusID "PCI:0:1:0" EndSection
Observe that the BusID in lspci output is in hexadecimal, while in xorg it is in decimal. So if the output of lspci is, for example, 0a:00.0, the BusID should be PCI:10:0:0.
NVIDIA(0): Failed to assign any connected display devices to X screen 0
If the console output is:
[ERROR]Cannot access secondary GPU - error: [XORG] (EE) NVIDIA(0): Failed to assign any connected display devices to X screen 0
[ERROR]Aborting because fallback start is disabled.
If the following line in /etc/bumblebee/xorg.conf.nvidia does not exist, you can add it to the "Device" section:
Option "ConnectedMonitor" "DFP"
If it does already exist, you can try changing it to:
Option "ConnectedMonitor" "CRT"
After that, restart the Bumblebee service to apply these changes.
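For example, on a systemd system:
# systemctl restart bumblebeed.service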
Failed to initialize the NVIDIA GPU at PCI:1:0:0 (GPU fallen off the bus / RmInitAdapter failed!)
Try adding rcutree.rcu_idle_gp_delay=1 to the kernel parameters.
Failed to initialize the NVIDIA GPU at PCI:1:0:0 (Bumblebee daemon reported: error: [XORG] (EE) NVIDIA(GPU-0))
You might encounter an issue where, after resuming from sleep, the optirun command no longer works. There are two ways to fix this: reboot your system, or execute the following command:
# echo 1 > /sys/bus/pci/rescan
Then test if optirun works again.
If the above command did not help, try finding your NVIDIA card's bus ID:
$ lspci | grep NVIDIA
For example, if the above command showed 01:00.0, use the following commands with this bus ID:
# echo 1 > /sys/bus/pci/devices/0000:01:00.0/remove
# echo 1 > /sys/bus/pci/rescan
Could not load GPU driver
If the console output is:
[ERROR]Cannot access secondary GPU - error: Could not load GPU driver
and if you try to load the nvidia module you get:
# modprobe nvidia
modprobe: ERROR: could not insert 'nvidia': Exec format error
This could be because the nvidia driver is out of sync with the Linux kernel, for example if you installed the latest nvidia driver but have not updated the kernel in a while. A full system update might resolve the issue. If the problem persists, you should try manually compiling the nvidia packages against your current kernel, for example with the ABS.
NOUVEAU(0): [drm] failed to set drm interface version
Consider switching to the official nvidia driver. As commented here, the nouveau driver has issues with some cards when used with Bumblebee.
[ERROR]Cannot access secondary GPU - error: X did not start properly
"AutoAddDevices" option to
/etc/bumblebee/xorg.conf.nvidia (see here):
Section "ServerLayout" Identifier "Layout0" Option "AutoAddDevices" "true" Option "AutoAddGPU" "false" EndSection
/dev/dri/card0: failed to set DRM interface version 1.4: Permission denied
This can be worked around by appending the following lines to /etc/bumblebee/xorg.conf.nvidia (see here):
Section "Screen" Identifier "Default Screen" Device "DiscreteNvidia" EndSection
ERROR: ld.so: object 'libdlfaker.so' from LD_PRELOAD cannot be preloaded: ignored
You probably want to start a 32-bit application with bumblebee on a 64-bit system. See the "For 32-bit..." section in #Installation. If the problem persists or if it is a 64-bit application, try using the primus bridge.
Fatal IO error 11 (Resource temporarily unavailable) on X server
Change KeepUnusedXServer in /etc/bumblebee/bumblebee.conf from false to true. Your program forks into the background and Bumblebee does not know anything about it.
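A sketch of the resulting configuration entry (the option lives in the [bumblebeed] section of /etc/bumblebee/bumblebee.conf):
[bumblebeed]
KeepUnusedXServer=true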
Video tearing
Video tearing is a somewhat common problem on Bumblebee. To fix it, you need to enable VSync. It should be enabled by default on the Intel card, but verify that from the Xorg logs. To check whether or not it is enabled for NVIDIA, make sure nvidia-settings is installed and run:
$ optirun nvidia-settings -c :8
X Server XVideo Settings -> Sync to VBlank and OpenGL Settings -> Sync to VBlank should both be enabled. The Intel card generally has less tearing, so use it for video playback. In particular, use VA-API for video decoding (e.g. mplayer-vaapi, optionally with the -vsync parameter).
Refer to Intel#Tearing on how to fix tearing on the Intel card.
If it is still not fixed, try to disable compositing from your desktop environment. Try also disabling triple buffering.
Bumblebee cannot connect to socket
You might get something like:
$ optirun glxspheres64
or (for 32 bit):
$ optirun glxspheres32
[ 1648.179533] [ERROR]You have no permission to communicate with the Bumblebee daemon. Try adding yourself to the 'bumblebee' group
[ 1648.179628] [ERROR]Could not connect to bumblebee daemon - is it running?
If you are already in the bumblebee group ($ groups | grep bumblebee), you may try removing the socket /var/run/bumblebee.socket.
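For example (the socket path shown is the Bumblebee default; adjust it if your configuration differs):
# rm /var/run/bumblebee.socket
# systemctl restart bumblebeed.service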
Another reason for this error could be that you have not actually turned on both GPUs in your BIOS, and as a result the Bumblebee daemon is in fact not running. Check the BIOS settings carefully and make sure Intel graphics (integrated graphics, may be abbreviated in the BIOS as something like igfx) is enabled or set to auto, and that it is the primary GPU. Your display should be connected to the onboard integrated graphics, not the discrete graphics card.
If you mistakenly had the display connected to the discrete graphics card and Intel graphics was disabled, you probably installed Bumblebee after first trying to run NVIDIA alone. In this case, be sure to remove the /etc/X11/xorg.conf or .../20-nvidia... configuration files. If Xorg is instructed to use NVIDIA in a conf file, X will fail.
Running X.org from console after login (rootless X.org)
See Xorg#Rootless Xorg.
Using Primus causes a segmentation fault
In some instances, using primusrun instead of optirun results in a segmentation fault. This is caused by an issue in the code that auto-detects the faster upload method; see FS#58933. The workaround is to skip auto-detection by manually setting the PRIMUS_UPLOAD environment variable to either 1 or 2, depending on which one is faster on your setup.
$ PRIMUS_UPLOAD=1 primusrun ...
Primusrun mouse delay (disable VSYNC)
VSYNC is enabled by default and, as a result, it can cause mouse input delay lag or even slightly decrease performance. Test primusrun with VSYNC disabled:
$ vblank_mode=0 primusrun glxgears
If you are satisfied with the above setting, create an alias, e.g. alias primusrun="vblank_mode=0 primusrun".
(A benchmark table comparing FPS, score, and min/max FPS with VSYNC enabled versus disabled, tested with an ASUS N550JV notebook and the unigine-heaven AUR benchmark, originally appeared here.)
Primus issues under compositing window managers
Since compositing hurts performance, invoking primus when a compositing WM is active is not recommended. If you need to use primus with compositing and see flickering or bad performance, synchronizing primus' display thread with the application's rendering thread may help:
$ PRIMUS_SYNC=1 primusrun ...
This makes primus display the previously rendered frame.
Problems with bumblebee after resuming from standby
On some systems, it can happen that the nvidia module is loaded after resuming from standby. One possible solution for this is to install the acpi_call and acpi packages.
Optirun does not work, no debug output
Users are reporting that in some cases, even though Bumblebee was installed correctly, running
$ optirun glxgears -info
gives no output at all, and the glxgears window does not appear. Any program that needs 3D acceleration crashes:
$ optirun bash
$ glxgears
Segmentation fault (core dumped)
Apparently it is a bug in some versions of virtualgl. A workaround is to install primus and lib32-primus and use them instead:
$ primusrun glxspheres64 $ optirun -b primus glxspheres64
By default, primus locks the framerate to the refresh rate of your monitor (usually 60 FPS). If needed, it can be unlocked by passing the vblank_mode=0 environment variable.
$ vblank_mode=0 primusrun glxspheres64
Usually there is no need to display more frames than your monitor can handle, but you might want to for benchmarking, or to get faster reactions in games (e.g., if a game needs 3 frames to react to a mouse movement, with vblank_mode=0 the reaction will be as quick as your system can handle; without it, it will always take 1/20 of a second).
You might want to edit /etc/bumblebee/bumblebee.conf to use the primus bridge by default. If, after an update, you want to check whether the bug has been fixed, just use optirun -b virtualgl.
See this forum post for more information.
Broken power management with kernel 4.8
If you have a newer laptop (BIOS date 2015 or newer), then Linux 4.8 might break bbswitch (bbswitch issue 140) since bbswitch does not support the newer, recommended power management method. As a result, the GPU may fail to power on, fail to power off or worse.
As a workaround, add pcie_port_pm=off to your kernel parameters.
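With GRUB, for instance, append the parameter in /etc/default/grub and regenerate the configuration (a sketch following the standard Arch setup):
GRUB_CMDLINE_LINUX_DEFAULT="quiet pcie_port_pm=off"
# grub-mkconfig -o /boot/grub/grub.cfg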
Alternatively, if you are only interested in power saving (and perhaps use of external monitors), remove bbswitch and rely on Nouveau runtime power-management (which supports the new method).
Note that powertop --auto-tune will automatically enable power management on PCI devices, which leads to the same problem. Use the same workaround, or do not use such all-in-one tools.
Lockup issue (lspci hangs)
See NVIDIA Optimus#Lockup issue (lspci hangs) for an issue that affects new laptops with a GTX 965M (or similar).
Discrete card always on and acpi warnings
Screen 0 deleted because of no matching config section
Modify the file xorg.conf.nvidia: first add Screen 0 "nvidia" to the ServerLayout section, then create a new Screen section:
Section "Screen" Identifier "nvidia" Device "DiscreteNvidia" EndSection
Erratic, unpredictable behaviour
If Bumblebee starts or works in a random manner, check that you have set up Network configuration#Local network hostname resolution (details here).
Discrete card always on and nvidia driver cannot be unloaded
Make sure that nvidia-persistenced.service is disabled and not currently active. It is intended to keep the nvidia driver running at all times, which prevents the card from being turned off.
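For example, to disable and stop it in one step:
# systemctl disable --now nvidia-persistenced.service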
Discrete card is silently activated when egl is requested by some application
If the discrete card is activated by some program (say, mpv with the gpu backend) and stays on, the problem might be libglvnd, which loads the nvidia drivers and activates the card.
To disable this, set the __EGL_VENDOR_LIBRARY_FILENAMES environment variable (see the libglvnd documentation) to load only the mesa config file.
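A sketch, assuming mesa's vendor file is at its usual path:
$ export __EGL_VENDOR_LIBRARY_FILENAMES=/usr/share/glvnd/egl_vendor.d/50_mesa.json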
The nvidia-utils package (and its branches) installs the nvidia config file at /usr/share/glvnd/egl_vendor.d/10_nvidia.json, which has priority and causes libglvnd to load the nvidia drivers and enable the card.
The other solution is to remove the config file provided by nvidia-utils.
Framerate drops to 1 FPS after a fixed period of time
With the nvidia 440.36 driver, the DPMS setting is enabled by default, resulting in a timeout after a fixed period of time (e.g. 10 minutes) which causes the frame rate to throttle down to 1 FPS. To work around this, add the following line to the "Screen" section in /etc/bumblebee/xorg.conf.nvidia:
Option "HardDPMS" "false"
See also
Join us at #bumblebee on freenode.net.