ArchWiki - User contributions [en] (MediaWiki 1.41.0, feed retrieved 2024-03-29T09:41:20Z)
https://wiki.archlinux.org/index.php?title=Talk:Bumblebee&diff=212820 Talk:Bumblebee 2012-07-13T11:32:56Z
<p>FebLey: /* What about lib32-nvidia-utils-bumblebee */</p>
<hr />
<div>== Wiki rewritten ==<br />
<br />
Hi, I followed this wiki two days ago and now Optimus technology works fine on my laptop, but I found the article a bit confusing, so I decided to rewrite it. I'm not a Linux expert and English is not my first language (I'm Italian), so feel free to correct what I wrote.<br />
<br />
:1) Setup X Server: I moved this section first. New Bumblebee versions create an xorg.conf.nvidia.pacnew file, so I added a cp command.<br />
:2) Load Kernel Module: I reordered this section with this logic in mind: first, get rid of nouveau entirely; second, load the nvidia module.<br />
:3) Start Bumblebee Daemon: I created a section for this. This way you don't need to reboot and it's clearer what you're doing.<br />
:4) Start VirtualGL Client: I deleted this section because I don't think it's needed to make Bumblebee work. I never ran that command to use optirun or optirun32.<br />
:5) Usage: I added optirun32. It seems to work fine with the Unigine Tropics benchmark.<br />
:6) Autostart Bumblebee: I created a section for this because these operations were scattered around the article. This way it's more compact.<br />
:7) Nvidia ON/OFF... : Everything is fine here. I only added the command to check the battery discharge rate.<br />
:About the last section: I have an Acer Aspire 5742G (NVIDIA GT 540M), and if I follow the steps to turn off my card, my power usage is higher (+400 mA) with the card turned off and the nvidia module unloaded! I know it's unbelievable, but it's true. Is anyone else experiencing this? Bye<br />
[[User:Thewall|Thewall]] 18:06, 1 July 2011 (EDT)<br />
=== Samsagax Reply on thewall changes ===<br />
<br />
It's nice someone got interested!<br />
Now I'll argue some points about what takes precedence, what the bugs are, and what is planned for the future of Bumblebee in Arch Linux:<br />
:1) I would put the kernel module load first, before the configuration of the X server; I think that is better logic. <br />
:2) The issue with the ".pacnew" file is a bug; it should only be created if an "xorg.conf.nvidia" already exists (on upgrade). I'm also planning to move this conf file to the /etc/bumblebee directory. <br />
:3) Liked that (: <br />
:4) I really wouldn't delete that. I don't know why, but some people need the vglclient running; maybe it should be an optional, explanatory section. <br />
:5) For the new bumblebee package I'm trying to split it into smaller packages, keeping the libraries apart from the scripts, and optirun32 didn't work well for most people (especially under wine). <br />
:6) Liked that, it is cleaner this way <br />
:7) This is a dark spot. As long as acpi_call does not work reliably on most laptops, there is no safe way to tell whether it's working. For this reason I'm marking this as purely experimental and not supporting it for now. Your issue was reported and is known on a variety of ASUS laptops. I recommend reading about acpi_call and its known-to-work laptops. <br />
BTW: Thanks!<br />
<br />
==== Reply to Samsagax ====<br />
<br />
:1) Ok.<br />
:2) I tried to clarify. Is that bug solved?<br />
:3) Great (:<br />
:4) I re-entered the VGL Client section with a note.<br />
:5) You really made a good job here (:<br />
:6) Ok.<br />
:7) Nothing to say.<br />
:Other) A user on the Italian Arch Linux forum says that he must manually run the bumblebee daemon AFTER logging in with GNOME 3. When he puts it in /etc/rc.conf he gets this: "[VGL] ERROR: Could not open display :1." Would it be good to write that somewhere? Maybe in a "Troubleshooting" section?<br />
[[User:Thewall|Thewall]] 18:06, 1 July 2011 (EDT)<br />
<br />
<br />
==== Addition to 7) ====<br />
I think the higher power consumption is caused by the X server that hangs (it hogs 100% of one CPU core) when you switch off the card via acpi_call. I've got the same issue here on an ASUS X53S, which also has an NVIDIA GT 540M.<br />
<br />
[[User:florianb|florianb]] 00:19, 1 August 2011 (CET)<br />
<br />
:Try disabling the X server first or you will have some issues. If there is still a problem try the vga-switcheroo option. <br />
:[[User:Samsagax|Samsagax]] 19:27, 31 July 2011 (EDT)<br />
<br />
::I managed to reproduce the errors:<br />
::1. If you switch off the NVIDIA card before you stop the bumblebee daemon (which starts/stops the second X server), you get into trouble: the X process hogs 100% CPU, becomes unkillable, and the overall power consumption (in my case) goes from about 1500 mA to 2100 mA.<br />
::2. If you only stop the bumblebee daemon without switching off the NVIDIA card, power consumption goes from about 1500 mA to 1800-1900 mA (maybe user "thewall" only stopped the daemon without switching off the card?).<br />
::3. If you switch off the NVIDIA card (a GT 540M in my case) via acpi_call, power consumption goes down to 1200 mA, which is quite nice, *BUT* the fan goes to 100% a few seconds after you switch it off. This seems to consume about 50 mA more power and, above all, is totally annoying.<br />
::A guy in the Ubuntu forum apparently already fixed 3) on hardware similar to mine, but I guess the differences are in the details; I'm trying to find out.<br />
::[[User:florianb|florianb]] 08:07, 1 August 2011 (CET)<br />
<br />
:::I'll try to release the new model for the nvidia driver today, similar to the nouveau one. That way power switching is done automatically, by means of vga-switcheroo by default. I have to remind you that the acpi_call method calls are guessed and (in your case) may be incorrect. [[User:Samsagax|Samsagax]] 10:42, 1 August 2011 (EDT)<br />
<br />
::::Okay, sounds nice. I'd really like to contribute something to your work; if there's anything I can do, let me know.<br />
::::[[User:florianb|florianb]] 10:37, 2 August 2011 (CET)<br />
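Incidentally, the mA figures quoted in this thread can be read from sysfs; the kernel reports {{ic|current_now}} in microamps. A minimal sketch, assuming the battery shows up as BAT0 (the node name and availability vary per machine):

```shell
#!/bin/bash
# Read the battery discharge rate; BAT0 is an assumption, adjust as needed.
node=/sys/class/power_supply/BAT0/current_now
if [ -r "$node" ]; then
    uA=$(cat "$node")            # kernel reports microamps
else
    uA=1500000                   # illustrative sample when no battery node exists
fi
echo "$(( uA / 1000 )) mA"       # microamps -> milliamps
```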
<br />
== We are making some progress ==<br />
<br />
Well, some developers (real ones) and I are getting somewhere on a stable Bumblebee this week. I will update the package as soon as we get it done. [[User:Samsagax|Samsagax]] 14:27, 11 August 2011 (EDT)<br />
<br />
== What about lib32-nvidia-utils-bumblebee ==<br />
lib32-nvidia-utils-bumblebee is not mentioned anywhere in the wiki article. But it is necessary if I want to run 32-bit Wine games, right? --[[User:Onny|Onny]] 16:17, 29 January 2012 (EST)<br />
<br />
:I've added lib32-nvidia-utils-bumblebee to the installation instructions --[[User:febLey|febLey]] 13:37, 13 July 2012 (GMT+1)<br />
<br />
== No devices detected, error encountered due to different cause ==<br />
While I was trying to use Bumblebee with nouveau, I encountered<br />
<br />
<code> [ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.<br />
<br />
[ERROR]Aborting because fallback start is disabled. </code><br />
<br />
But apparently for a different reason; I haven't figured out what it was, but changing to nvidia (extra/nvidia 290.10-2) fixed it. (I also had to update to core/linux 3.2.2-1 for it.)</div>FebLey
https://wiki.archlinux.org/index.php?title=Bumblebee&diff=212818 Bumblebee 2012-07-13T11:27:40Z
<p>FebLey: /* Installing Bumblebee with Intel / nvidia */</p>
<hr />
<div>[[Category:Graphics]]<br />
[[Category:X Server]]<br />
[[fr:Bumblebee]]<br />
[[it:Bumblebee]]<br />
[[ru:Bumblebee]]<br />
[[tr:Bumblebee]]<br />
[[zh-CN:Bumblebee]]<br />
From Bumblebee's [https://github.com/Bumblebee-Project/Bumblebee/wiki/FAQ FAQ]:<br />
<br />
''Bumblebee is an effort to make NVIDIA Optimus enabled laptops work in GNU/Linux systems. Such feature involves two graphics cards with two different power consumption profiles plugged in a layered way sharing a single framebuffer.''<br />
<br />
== Bumblebee: Optimus for Linux ==<br />
<br />
[http://www.nvidia.com/object/optimus_technology.html Optimus Technology] is a ''[http://hybrid-graphics-linux.tuxfamily.org/index.php?title=Hybrid_graphics hybrid graphics]'' implementation without a hardware multiplexer. The integrated GPU manages the display, while the dedicated GPU handles the most demanding rendering and ships the result to the integrated GPU to be displayed. When the laptop is running on battery, the dedicated GPU is turned off to save power and prolong battery life.<br />
<br />
Bumblebee is a software implementation based on VirtualGL and a kernel driver that makes it possible to use the dedicated GPU, which is not physically connected to the screen.<br />
<br />
Bumblebee tries to mimic the behavior of the Optimus technology: it uses the dedicated GPU for rendering when needed and powers it down when not in use. The present releases only support rendering on demand; power management is a work in progress.<br />
<br />
The NVIDIA dedicated card is managed as a separate X server connected to a "fake" screen (the screen is configured but not used). The second server is called through VirtualGL as if it were a remote server. Consequently, you will need a series of steps to set up the kernel driver, the X server and a daemon.<br />
<br />
{{Warning|Bumblebee is still under heavy development! But your help is very welcome.}}<br />
<br />
==Installation==<br />
<br />
Before installing Bumblebee, check your BIOS and activate Optimus (switchable graphics) if possible (the BIOS may not provide this option), and install the [[Intel|intel driver]] for the secondary on-board graphics card.<br />
<br />
{{Note|If you want to run a 32-bit application on a 64-bit system you must install {{AUR|lib32-virtualgl}} and proper lib32-* libraries.}}<br />
<br />
=== Installing Bumblebee with Intel / nvidia ===<br />
<br />
Install {{AUR|bumblebee}} from [[Arch User Repository|AUR]], and then install the special nvidia package {{aur|nvidia-utils-bumblebee}} for bumblebee.<br><br />
If you want to run 32-bit applications (like games with wine) on a 64-bit system you need the {{AUR|lib32-nvidia-utils-bumblebee}} from AUR additionally.<br />
<br />
{{Warning|Don't install the original {{Pkg|nvidia-utils}} for Bumblebee - it will break your system!}}<br />
<br />
In order to avoid installing {{Pkg|nvidia-utils}} as a dependency when installing the {{Pkg|nvidia}} driver, you have to install the {{AUR|nvidia-bumblebee}} package from the [[AUR]] instead (which is the same driver packaged for bumblebee usage).<br />
<br />
{{Note|You can install {{AUR|dkms-nvidia}} from AUR instead of {{Pkg|nvidia}} if you need it.}}<br />
<br />
{{note|If you would like Bumblebee to turn off the NVIDIA card automatically after usage, use {{AUR|bbswitch}} from the AUR. See [[#Power Management|below]].}}<br />
<br />
=== Installing Bumblebee with Intel / nouveau ===<br />
<br />
Install nouveau and required packages first:<br />
{{bc|# pacman -S xf86-video-nouveau nouveau-dri mesa}}<br />
<br />
* {{Pkg|xf86-video-nouveau}} experimental 3D acceleration driver<br />
* {{Pkg|nouveau-dri}} Mesa classic DRI + Gallium3D drivers<br />
* {{Pkg|mesa}} Mesa 3-D graphics libraries<br />
<br />
Now install {{AUR|bumblebee}} from the [[Arch User Repository|AUR]]:<br />
<br />
{{note|If you would like Bumblebee to turn off the NVIDIA card automatically after usage, use {{AUR|bbswitch}} from the AUR. See [[#Power Management|below]].}}<br />
<br />
==Start Bumblebee==<br />
<br />
In order to use Bumblebee, it is necessary to add yourself (and any other users) to the bumblebee group:<br />
<br />
# usermod -a -G bumblebee $USER<br />
<br />
where {{ic|$USER}} is the login name of the user to be added. Then log off and on again to apply the group changes.<br />
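After logging back in, you can confirm that the group change took effect; a small sketch using {{ic|id -nG}}, which lists the current user's group names:

```shell
#!/bin/bash
# Check whether the current user is in the bumblebee group.
if id -nG | grep -qw bumblebee; then
    echo "bumblebee group: OK"
else
    echo "not in bumblebee group yet (log out and back in after usermod)"
fi
```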
<br />
To start bumblebee automatically, add it to your {{ic|DAEMONS}} array in {{ic|/etc/rc.conf}}:<br />
DAEMONS=(... @bumblebeed)<br />
<br />
Finished! Reboot the system and use the shell program {{ic|[[#Usage|optirun]]}} for Optimus NVIDIA rendering.<br />
<br />
== Usage ==<br />
<br />
The command line program {{ic|optirun}} shipped with Bumblebee is your best friend for running applications on your Optimus NVIDIA card.<br />
<br />
Test whether Bumblebee works with your Optimus system:<br />
{{bc|$ optirun glxgears}}<br />
<br />
If it succeeds and the terminal you are running it from mentions something about your NVIDIA card, Optimus with Bumblebee is working!<br />
<br />
General Usage:<br />
<br />
{{bc|$ optirun [options] <application> [application-parameters]}}<br />
<br />
Some Examples:<br />
<br />
Start Firefox accelerated with Optimus:<br />
<br />
{{bc|$ optirun firefox}}<br />
<br />
Start Windows applications with Optimus:<br />
<br />
{{bc|$ optirun wine <windows application>.exe}}<br />
<br />
Use NVIDIA Settings with Optimus:<br />
<br />
{{bc|$ optirun nvidia-settings -c :8 }}<br />
<br />
For a list of options for {{ic|optirun}} run:<br />
{{bc|$ optirun --help}}<br />
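If certain programs should always run on the discrete card, a tiny wrapper script saves typing. This is a sketch, not part of Bumblebee, and the {{ic|-nvidia}} symlink-suffix convention is an invention for illustration:

```shell
#!/bin/bash
# optiwrap: symlink this script as e.g. firefox-nvidia; it strips the
# suffix and relaunches the real program through optirun.
name=$(basename "$0")
target=${name%-nvidia}
if command -v optirun >/dev/null 2>&1; then
    exec optirun "$target" "$@"
fi
echo "optirun not installed; would have run: optirun $target"
```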
<br />
== Configuration ==<br />
<br />
You can configure the behaviour of Bumblebee to fit your needs. Fine tuning like speed optimization, power management and other options can be configured in {{ic|/etc/bumblebee/bumblebee.conf}}.<br />
<br />
=== Optimizing Speed ===<br />
<br />
Bumblebee renders frames for your Optimus NVIDIA card in an invisible X server with VirtualGL and transports them back to your visible X server.<br />
<br />
Frames are compressed before they are transported - this saves bandwidth and can be used to tune Bumblebee for speed:<br />
<br />
To use another compression method for a single application:<br />
<br />
$ optirun -c <compress-method> application<br />
<br />
The compression method affects how the load is split between CPU and GPU. Compressed methods (such as {{ic|jpeg}}) load the CPU the most but load the GPU only the minimum necessary; uncompressed methods load the GPU the most and keep the CPU load as low as possible.<br />
<br />
Compressed Methods are: {{ic|jpeg}}, {{ic|rgb}}, {{ic|yuv}}<br />
<br />
Uncompressed Methods are: {{ic|proxy}}, {{ic|xv}}<br />
<br />
To use a standard compression method for all applications, set {{ic|VGLTransport}} to {{ic|<compress-method>}} in {{ic|/etc/bumblebee/bumblebee.conf}}:<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|<nowiki><br />
...<br />
[optirun]<br />
VGLTransport=proxy<br />
...<br />
</nowiki>}}<br />
<br />
{{Note|CPU frequency scaling will directly affect render performance}}<br />
<br />
=== Power Management ===<br />
<br />
The goal of the power management feature is to turn off the NVIDIA card when it is no longer used by Bumblebee.<br />
<br />
To enable power management for Bumblebee, install {{AUR|bbswitch}} from the AUR.<br />
<br />
{{Warning|Make sure the secondary Xorg server is stopped when not in use!}}<br />
<br />
Set the {{ic|PMMethod}} to {{ic|bbswitch}} in the driver section of {{ic|/etc/bumblebee/bumblebee.conf}}:<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|<nowiki><br />
[bumblebeed]<br />
KeepUnusedXServer=false<br />
...<br />
[driver-nvidia]<br />
PMMethod=bbswitch<br />
...<br />
[driver-nouveau]<br />
PMMethod=bbswitch<br />
...<br />
</nowiki>}}<br />
<br />
==== Default power state of NVIDIA card ====<br />
<br />
Set {{ic|load_state}} and {{ic|unload_state}} module options according to your needs (see [https://github.com/Bumblebee-Project/bbswitch bbswitch documentation]).<br />
{{hc|/etc/modprobe.d/bbswitch.conf|<nowiki><br />
options bbswitch load_state=0 unload_state=0<br />
</nowiki>}}<br />
<br />
Just restart the bumblebee daemon to activate power management:<br />
{{bc|# rc.d restart bumblebeed}}<br />
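You can then inspect the card's power state through bbswitch's proc interface (path as given in the bbswitch documentation; the guard keeps this sketch harmless on machines without the module):

```shell
#!/bin/bash
# Report the discrete card's power state as seen by bbswitch.
if [ -r /proc/acpi/bbswitch ]; then
    cat /proc/acpi/bbswitch      # e.g. "0000:01:00.0 OFF"
else
    echo "bbswitch not loaded"
fi
```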
<br />
==== Enable NVIDIA card during shutdown ====<br />
<br />
The NVIDIA card may not correctly initialize during boot if the card was powered off when the system was last shut down. One option is to set {{ic|TurnCardOffAtExit&#61;false}} in {{ic|/etc/bumblebee/bumblebee.conf}}, however this will enable the card every time you stop the Bumblebee daemon, even if done manually. To ensure that the NVIDIA card is always powered on during shutdown, add the following [[Boot process#Custom_hooks|hook function]] (if using {{AUR|bbswitch}}):<br />
<br />
{{hc|/etc/rc.d/functions.d/nvidia-card-enable|<nowiki><br />
nvidia_card_enable() {<br />
BBSWITCH=/proc/acpi/bbswitch<br />
<br />
stat_busy "Enabling NVIDIA GPU"<br />
<br />
if [ -w ${BBSWITCH} ]; then<br />
echo ON > ${BBSWITCH}<br />
stat_done<br />
else<br />
stat_fail<br />
fi<br />
}<br />
<br />
add_hook shutdown_poweroff nvidia_card_enable<br />
</nowiki>}}<br />
<br />
=== Multiple monitors ===<br />
<br />
You can set up multiple monitors with xorg.conf. Configure them to use the Intel card; Bumblebee can still use the NVIDIA card. An example configuration is below for two identical screens with 1080p resolution, using the HDMI output.<br />
<br />
{{hc|/etc/X11/xorg.conf|<nowiki><br />
Section "Screen"<br />
Identifier "Screen0"<br />
Device "intelgpu0"<br />
Monitor "Monitor0"<br />
DefaultDepth 24<br />
Option "TwinView" "0"<br />
SubSection "Display"<br />
Depth 24<br />
Modes "1920x1080_60.00"<br />
EndSubSection<br />
EndSection<br />
<br />
Section "Screen"<br />
Identifier "Screen1"<br />
Device "intelgpu1"<br />
Monitor "Monitor1"<br />
DefaultDepth 24<br />
Option "TwinView" "0"<br />
SubSection "Display"<br />
Depth 24<br />
Modes "1920x1080_60.00"<br />
EndSubSection<br />
EndSection<br />
<br />
Section "Monitor"<br />
Identifier "Monitor0"<br />
Option "Enable" "true"<br />
EndSection<br />
<br />
Section "Monitor"<br />
Identifier "Monitor1"<br />
Option "Enable" "true"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "intelgpu0"<br />
Driver "intel"<br />
Option "XvMC" "true"<br />
Option "UseEvents" "true"<br />
Option "AccelMethod" "UXA"<br />
BusID "PCI:0:2:0"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "intelgpu1"<br />
Driver "intel"<br />
Option "XvMC" "true"<br />
Option "UseEvents" "true"<br />
Option "AccelMethod" "UXA"<br />
BusID "PCI:0:2:0"<br />
EndSection<br />
</nowiki>}}<br />
<br />
You probably need to change the BusID:<br />
<br />
{{hc|<nowiki>$ lspci | grep VGA</nowiki>|<br />
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)<br />
}}<br />
<br />
The BusID here is 0:2:0.<br />
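Note that lspci prints the slot in hexadecimal, while the {{ic|BusID}} in xorg.conf expects decimal numbers (identical for 0:2:0, but not for higher bus numbers). A sketch of the conversion, using the sample line above:

```shell
#!/bin/bash
# Convert an lspci slot (hex bus:dev.fn) into the decimal PCI:bus:dev:fn
# form used by xorg.conf.
line='00:02.0 VGA compatible controller: Intel Corporation ...'
slot=${line%% *}                       # first field -> 00:02.0
IFS=':.' read -r bus dev fn <<< "$slot"
busid=$(printf 'PCI:%d:%d:%d' "0x$bus" "0x$dev" "0x$fn")
echo "$busid"                          # -> PCI:0:2:0
```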
<br />
==CUDA Without Bumblebee==<br />
<br />
This is not well documented, but you do not need Bumblebee to use CUDA and it may work even on machines where optirun fails. For a guide on how to get it working with the Lenovo IdeaPad Y580 (which uses the GeForce 660M), see: https://wiki.archlinux.org/index.php/Lenovo_IdeaPad_Y580#NVIDIA_Card. Those instructions are very likely to work with other machines (except for the acpi-handle-hack part, which may not be necessary).<br />
<br />
==Troubleshooting==<br />
<br />
{{Note|Please report bugs at [https://github.com/Bumblebee-Project/Bumblebee Bumblebee-Project]'s GitHub tracker as described in its [https://github.com/Bumblebee-Project/Bumblebee/wiki/Reporting-Issues Wiki].}}<br />
<br />
=== [VGL] ERROR: Could not open display :8 ===<br />
<br />
There is a known problem with some wine applications that fork and kill the parent process without keeping track of it (for example the free-to-play online game "Runes of Magic").<br />
<br />
A workaround for this problem is:<br />
<br />
{{bc|$ optirun bash<br>$ optirun wine <windows program>.exe}}<br />
<br />
If using the NVIDIA driver, a fix for this problem is to edit {{ic|/etc/bumblebee/xorg.conf.nvidia}} and change Option "ConnectedMonitor" to "CRT-0".<br />
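The edit looks roughly like this (only the relevant option is shown; the rest of the {{ic|Device}} section in {{ic|/etc/bumblebee/xorg.conf.nvidia}} is abbreviated):

```
Section "Device"
    ...
    Option "ConnectedMonitor" "CRT-0"
EndSection
```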
<br />
=== [ERROR]Cannot access secondary GPU ===<br />
<br />
In some instances, running optirun will return:<br />
<br />
{{Ic|[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected. <br><br />
<br><br />
[ERROR]Aborting because fallback start is disabled.}}<br />
<br />
In this case, you will need to move the file {{ic|/etc/X11/xorg.conf.d/20-intel.conf}} somewhere else. Restart the bumblebeed daemon, and it should work. <br />
Credit for this goes to Lekensteyn in #bumblebee on freenode.net.<br />
<br />
<br />
It may also be necessary to comment out the driver line in {{ic|/etc/X11/xorg.conf.d/10-monitor.conf}}.<br />
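Rather than deleting the file outright, you can move it into a side directory so the change is easy to revert. A sketch; {{ic|disable_conf}} is a hypothetical helper, not something shipped by Bumblebee:

```shell
#!/bin/bash
# disable_conf FILE: move an Xorg conf snippet into a sibling "disabled"
# directory so the X server no longer reads it.
disable_conf() {
    local dir
    dir=$(dirname "$1")/disabled
    mkdir -p "$dir" && mv "$1" "$dir/"
}
# Usage (as root), then restart the daemon:
#   disable_conf /etc/X11/xorg.conf.d/20-intel.conf
#   rc.d restart bumblebeed
```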
<br />
=== Fatal IO error 11 (Resource temporarily unavailable) on X server ===<br />
<br />
Change {{ic|KeepUnusedXServer}} in {{ic|/etc/bumblebee/bumblebee.conf}} from {{ic|false}} to {{ic|true}}. Your program forks into the background and Bumblebee does not know anything about it.<br />
<br />
=== Video tearing ===<br />
<br />
Video tearing is a somewhat common problem with Bumblebee. To fix it, you need to enable vsync. It should be enabled by default on the Intel card, but verify that from the Xorg logs. To check whether or not it is enabled for nvidia, run<br />
<br />
{{bc|$ optirun nvidia-settings -c :8 }}<br />
<br />
{{ic|1=X Server XVideo Settings -> Sync to VBlank}} and {{ic|1=OpenGL Settings -> Sync to VBlank}} should both be enabled. The Intel card in general has less tearing, so use it for video playback. In particular, use VA-API for video decoding (e.g. {{ic|mplayer-vaapi}} with the {{ic|-vsync}} parameter).<br />
<br />
Refer to the [[Intel#Video_tearing|Intel]] article on how to fix tearing on the Intel card.<br />
<br />
If it is still not fixed, try disabling compositing in your desktop environment. Also try disabling triple buffering.<br />
<br />
== Important Links ==<br />
* [http://www.bumblebee-project.org Bumblebee Project repository]<br />
* [http://wiki.bumblebee-project.org/ Bumblebee Project Wiki]<br />
* [https://github.com/Bumblebee-Project/bbswitch Bumblebee Project bbswitch repository]<br />
<br />
Join us in #bumblebee on freenode.net</div>FebLey
<hr />
<div>[[Category:Graphics]]<br />
[[Category:X Server]]<br />
[[fr:Bumblebee]]<br />
[[it:Bumblebee]]<br />
[[ru:Bumblebee]]<br />
[[tr:Bumblebee]]<br />
[[zh-CN:Bumblebee]]<br />
From Bumblebee's [https://github.com/Bumblebee-Project/Bumblebee/wiki/FAQ FAQ]:<br />
<br />
''Bumblebee is an effort to make NVIDIA Optimus enabled laptops work in GNU/Linux systems. Such feature involves two graphics cards with two different power consumption profiles plugged in a layered way sharing a single framebuffer.''<br />
<br />
== Bumblebee: Optimus for Linux ==<br />
<br />
[http://www.nvidia.com/object/optimus_technology.html Optimus Technology] is an ''[http://hybrid-graphics-linux.tuxfamily.org/index.php?title=Hybrid_graphics hybrid graphics]'' implementation without a hardware multiplexer. The integrated GPU manages the display while the dedicated GPU manages the most demanding rendering and ships the work to the integrated GPU to be displayed. When the laptop is running on battery supply, the dedicated GPU is turned off to save power and prolong the battery life.<br />
<br />
Bumblebee is a software implementation based on VirtualGL and a kernel driver to be able to use the dedicated GPU, which is not physically connected to the screen.<br />
<br />
Bumblebee tries to mimic the Optimus technology behavior; using the dedicated GPU for rendering when needed and power it down when not in use. The present releases only support rendering on-demand, power-management is a work in progress.<br />
<br />
The NVIDIA dedicated card is managed as a separate X server connected to a "fake" screen (the screen is configured but not used). The second server is called using VirtualGL as if it were a remote server. That said, you will need a series of steps to set-up the kernel driver, the X server and a daemon.<br />
<br />
{{Warning|Bumblebee is still under heavy development! But your help is very welcome.}}<br />
<br />
==Installation==<br />
<br />
Before installing Bumblebee check your BIOS and activate Optimus (shareable graphics), if possible (BIOS doesn't have to provide this option), and install the [[Intel|intel driver]] for the secondary on board graphics card.<br />
<br />
{{Note|If you want to run a 32-bit application on a 64-bit system you must install {{AUR|lib32-virtualgl}} and proper lib32-* libraries.}}<br />
<br />
=== Installing Bumblebee with Intel / nvidia ===<br />
<br />
Install {{AUR|bumblebee}} from [[Arch User Repository|AUR]], and then install the special nvidia package {{aur|nvidia-utils-bumblebee}} for bumblebee.<br />
<br />
If you want to run 32-bit applications (like games with wine) on a 64-bit system you need the {{AUR|lib32-nvidia-utils-bumblebee}} from AUR additionally.<br />
<br />
{{Warning|Don't install the original {{Pkg|nvidia-utils}} for Bumblebee - it will break your system !}}<br />
<br />
In order to avoid installing {{Pkg|nvidia-utils}} as a dependency when installing the {{Pkg|nvidia}} driver, you have to install the {{AUR|nvidia-bumblebee}} package from the [[AUR]] instead (which is the same driver packaged for bumblebee usage).<br />
<br />
{{Note|You can install {{AUR|dkms-nvidia}} from AUR instead of {{Pkg|nvidia}} if you need it.}}<br />
<br />
{{note|If you would like Bumblebee to turn off the NVIDIA card automatically after usage, use {{AUR|bbswitch}} from AUR. See [[#Power Management|below]].}}<br />
<br />
=== Installing Bumblebee with Intel / nouveau ===<br />
<br />
Install nouveau and required packages first:<br />
{{bc|# pacman -S xf86-video-nouveau nouveau-dri mesa}}<br />
<br />
* {{Pkg|xf86-video-nouveau}} experimental 3D acceleration driver<br />
* {{Pkg|nouveau-dri}} Mesa classic DRI + Gallium3D drivers<br />
* {{Pkg|mesa}} Mesa 3-D graphics libraries<br />
<br />
Now Install {{AUR|bumblebee}} from [[Arch User Repository|AUR]]:<br />
<br />
{{note|If you would like Bumblebee to turn off the NVIDIA card automatically after usage, use {{AUR|bbswitch}} from AUR. See [[#Power Management|below]].}}<br />
<br />
==Start Bumblebee==<br />
<br />
In order to use Bumblebee, it is necessary to add yourself (and any other users) to the bumblebee group:<br />
<br />
# usermod -a -G bumblebee $USER<br />
<br />
where {{ic|$USER}} is the login name of the user to be added. Then log off and on again to apply the group changes.<br />
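<br />
Before logging off, you can check whether the current session already sees the new group; a minimal sketch (the group name {{ic|bumblebee}} is taken from the command above - a freshly added group only shows up after logging off and on again):<br />

```shell
# Check whether the current user's session is in the bumblebee group.
user="${USER:-$(id -un)}"
if id -nG "$user" | tr ' ' '\n' | grep -qx bumblebee; then
    echo "$user is in the bumblebee group"
else
    echo "$user is NOT in the bumblebee group yet (log off and on again)"
fi
```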
<br />
To start Bumblebee automatically, add it to your {{ic|DAEMONS}} array in {{ic|/etc/rc.conf}}:<br />
DAEMONS=(... @bumblebeed)<br />
<br />
Finished - reboot the system and use the shell program {{ic|[[#Usage|optirun]]}} for Optimus NVIDIA rendering!<br />
<br />
== Usage ==<br />
<br />
The command line program {{ic|optirun}} shipped with Bumblebee is your best friend for running applications on your Optimus NVIDIA card.<br />
<br />
Test whether Bumblebee works with your Optimus system:<br />
{{bc|$ optirun glxgears}}<br />
<br />
If it succeeds and the terminal output mentions your NVIDIA card - Optimus with Bumblebee is working!<br />
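<br />
To see which GPU actually renders, you can compare the OpenGL renderer string with and without {{ic|optirun}}; a sketch assuming {{ic|glxinfo}} (from the mesa-demos package) is installed:<br />

```shell
# Print the OpenGL renderer with and without optirun; the plain run should
# name the Intel GPU, the optirun one the NVIDIA GPU.
checked=0
for cmd in "glxinfo" "optirun glxinfo"; do
    # Intentional word splitting: "optirun glxinfo" is command + argument.
    if command -v ${cmd%% *} >/dev/null 2>&1; then
        $cmd | grep "OpenGL renderer"
    else
        echo "${cmd%% *} not found; skipping"
    fi
    checked=$((checked + 1))
done
```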
<br />
General Usage:<br />
<br />
{{bc|$ optirun [options] <application> [application-parameters]}}<br />
<br />
Some Examples:<br />
<br />
Start Firefox accelerated with Optimus:<br />
<br />
{{bc|$ optirun firefox}}<br />
<br />
Start Windows applications with Optimus:<br />
<br />
{{bc|$ optirun wine <windows application>.exe}}<br />
<br />
Use NVIDIA Settings with Optimus:<br />
<br />
{{bc|$ optirun nvidia-settings -c :8 }}<br />
<br />
For a list of options for {{ic|optirun}} run:<br />
{{bc|$ optirun --help}}<br />
<br />
== Configuration ==<br />
<br />
You can configure the behaviour of Bumblebee to fit your needs. Fine tuning like speed optimization, power management and other options can be configured in {{ic|/etc/bumblebee/bumblebee.conf}}.<br />
<br />
=== Optimizing Speed ===<br />
<br />
Bumblebee renders frames for your Optimus NVIDIA card in an invisible X server with VirtualGL and transports them back to your visible X server.<br />
<br />
Frames are compressed before they are transported - this saves bandwidth, and the compression method can be tuned to speed up Bumblebee:<br />
<br />
To use another compression method for a single application:<br />
<br />
$ optirun -c <compress-method> application<br />
<br />
The compression method affects performance and CPU/GPU usage. Compressed methods (such as {{ic|jpeg}}) load the CPU the most but the GPU the least; uncompressed methods load the GPU the most and the CPU the least.<br />
<br />
Compressed Methods are: {{ic|jpeg}}, {{ic|rgb}}, {{ic|yuv}}<br />
<br />
Uncompressed Methods are: {{ic|proxy}}, {{ic|xv}}<br />
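<br />
The fastest transport depends on your particular CPU and GPU, so it is worth benchmarking each method yourself; a rough sketch assuming the {{ic|glxspheres64}} benchmark from VirtualGL is available (the binary name may differ on your system):<br />

```shell
# Run glxspheres briefly with each VGL transport and keep the last line
# of its output (the frames/sec summary) for comparison.
tried=0
for method in jpeg rgb yuv proxy xv; do
    if command -v optirun >/dev/null 2>&1; then
        echo "== $method =="
        timeout 20 optirun -c "$method" glxspheres64 | tail -n 1
    else
        echo "optirun not found; would benchmark transport: $method"
    fi
    tried=$((tried + 1))
done
```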
<br />
To use a standard compression for all applications, set {{ic|VGLTransport}} to {{ic|<compress-method>}} in {{ic|/etc/bumblebee/bumblebee.conf}}:<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|<nowiki><br />
...<br />
[optirun]<br />
VGLTransport=proxy<br />
...<br />
</nowiki>}}<br />
<br />
{{Note|CPU frequency scaling directly affects render performance}}<br />
<br />
=== Power Management ===<br />
<br />
The goal of the power management feature is to turn off the NVIDIA card when it is no longer used by Bumblebee.<br />
<br />
To enable power management for Bumblebee, install {{AUR|bbswitch}} from AUR.<br />
<br />
{{Warning|Make sure the secondary Xorg server is stopped when not in use!}}<br />
<br />
Set the {{ic|PMMethod}} to {{ic|bbswitch}} in the driver section of {{ic|/etc/bumblebee/bumblebee.conf}}:<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|<nowiki><br />
[bumblebeed]<br />
KeepUnusedXServer=false<br />
...<br />
[driver-nvidia]<br />
PMMethod=bbswitch<br />
...<br />
[driver-nouveau]<br />
PMMethod=bbswitch<br />
...<br />
</nowiki>}}<br />
<br />
==== Default power state of NVIDIA card ====<br />
<br />
Set {{ic|load_state}} and {{ic|unload_state}} module options according to your needs (see [https://github.com/Bumblebee-Project/bbswitch bbswitch documentation]).<br />
{{hc|/etc/modprobe.d/bbswitch.conf|<nowiki><br />
options bbswitch load_state=0 unload_state=0<br />
</nowiki>}}<br />
<br />
Restart the Bumblebee daemon to activate power management:<br />
{{bc|# rc.d restart bumblebeed}}<br />
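<br />
You can verify that the card is actually powered down by querying bbswitch's proc interface; a minimal sketch (the path comes from the bbswitch documentation linked above):<br />

```shell
# Show the discrete card's power state as reported by bbswitch,
# e.g. "0000:01:00.0 OFF" when the card is powered down.
BBSWITCH=/proc/acpi/bbswitch
if [ -r "$BBSWITCH" ]; then
    state=$(cat "$BBSWITCH")
else
    state="bbswitch not loaded ($BBSWITCH missing)"
fi
echo "$state"
```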
<br />
==== Enable NVIDIA card during shutdown ====<br />
<br />
The NVIDIA card may not initialize correctly during boot if it was powered off when the system was last shut down. One option is to set {{ic|TurnCardOffAtExit&#61;false}} in {{ic|/etc/bumblebee/bumblebee.conf}}; however, this will enable the card every time you stop the Bumblebee daemon, even if done manually. To ensure that the NVIDIA card is always powered on during shutdown, add the following [[Boot process#Custom_hooks|hook function]] (if using {{AUR|bbswitch}}):<br />
<br />
{{hc|/etc/rc.d/functions.d/nvidia-card-enable|<nowiki><br />
nvidia_card_enable() {<br />
    BBSWITCH=/proc/acpi/bbswitch<br />
<br />
    stat_busy "Enabling NVIDIA GPU"<br />
<br />
    if [ -w ${BBSWITCH} ]; then<br />
        echo ON > ${BBSWITCH}<br />
        stat_done<br />
    else<br />
        stat_fail<br />
    fi<br />
}<br />
<br />
add_hook shutdown_poweroff nvidia_card_enable<br />
</nowiki>}}<br />
<br />
=== Multiple monitors ===<br />
<br />
You can set up multiple monitors with {{ic|xorg.conf}}. Set them to use the Intel card, but Bumblebee can still use the NVIDIA card. An example configuration is below for two identical 1080p screens, using the HDMI output.<br />
<br />
{{hc|/etc/X11/xorg.conf|<nowiki><br />
Section "Screen"<br />
    Identifier     "Screen0"<br />
    Device         "intelgpu0"<br />
    Monitor        "Monitor0"<br />
    DefaultDepth    24<br />
    Option         "TwinView" "0"<br />
    SubSection "Display"<br />
        Depth       24<br />
        Modes      "1920x1080_60.00"<br />
    EndSubSection<br />
EndSection<br />
<br />
Section "Screen"<br />
    Identifier     "Screen1"<br />
    Device         "intelgpu1"<br />
    Monitor        "Monitor1"<br />
    DefaultDepth    24<br />
    Option         "TwinView" "0"<br />
    SubSection "Display"<br />
        Depth       24<br />
        Modes      "1920x1080_60.00"<br />
    EndSubSection<br />
EndSection<br />
<br />
Section "Monitor"<br />
    Identifier     "Monitor0"<br />
    Option         "Enable" "true"<br />
EndSection<br />
<br />
Section "Monitor"<br />
    Identifier     "Monitor1"<br />
    Option         "Enable" "true"<br />
EndSection<br />
<br />
Section "Device"<br />
    Identifier     "intelgpu0"<br />
    Driver         "intel"<br />
    Option         "XvMC" "true"<br />
    Option         "UseEvents" "true"<br />
    Option         "AccelMethod" "UXA"<br />
    BusID          "PCI:0:2:0"<br />
EndSection<br />
<br />
Section "Device"<br />
    Identifier     "intelgpu1"<br />
    Driver         "intel"<br />
    Option         "XvMC" "true"<br />
    Option         "UseEvents" "true"<br />
    Option         "AccelMethod" "UXA"<br />
    BusID          "PCI:0:2:0"<br />
EndSection<br />
</nowiki>}}<br />
<br />
You will probably need to change the BusID:<br />
<br />
{{hc|<nowiki>$ lspci | grep VGA</nowiki>|<br />
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)<br />
}}<br />
<br />
The BusID here is {{ic|0:2:0}}.<br />
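<br />
Note that {{ic|lspci}} prints the slot in hexadecimal, while {{ic|xorg.conf}} expects decimal values, so the slot has to be converted; a sketch using the example slot above:<br />

```shell
# Convert an lspci slot like "00:02.0" into the xorg.conf form "PCI:0:2:0".
# Requires bash for base-16 arithmetic ($((16#...))).
slot="00:02.0"              # example value; on a real system: lspci | grep VGA
bus=$((16#${slot%%:*}))     # bus number (hex -> decimal)
rest=${slot#*:}
dev=$((16#${rest%%.*}))     # device number
fn=$((16#${rest#*.}))       # function number
busid="PCI:${bus}:${dev}:${fn}"
echo "$busid"
```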
<br />
==CUDA Without Bumblebee==<br />
<br />
This is not well documented, but you do not need Bumblebee to use CUDA, and CUDA may work even on machines where optirun fails. For a guide on how to get it working with the Lenovo IdeaPad Y580 (which uses the GeForce 660M), see: https://wiki.archlinux.org/index.php/Lenovo_IdeaPad_Y580#NVIDIA_Card. Those instructions are very likely to work with other machines (except for the acpi-handle-hack part, which may not be necessary).<br />
<br />
==Troubleshooting==<br />
<br />
{{Note|Please report bugs at [https://github.com/Bumblebee-Project/Bumblebee Bumblebee-Project]'s GitHub tracker as described in its [https://github.com/Bumblebee-Project/Bumblebee/wiki/Reporting-Issues Wiki].}}<br />
<br />
=== [VGL] ERROR: Could not open display :8 ===<br />
<br />
There is a known problem with some wine applications that fork and kill the parent process without keeping track of the child (for example the free-to-play online game "Runes of Magic").<br />
<br />
A workaround for this problem is:<br />
<br />
{{bc|$ optirun bash<br>$ optirun wine <windows program>.exe}}<br />
<br />
If using NVIDIA drivers, a fix for this problem is to edit {{ic|/etc/bumblebee/xorg.conf.nvidia}} and change the {{ic|ConnectedMonitor}} option to {{ic|CRT-0}}.<br />
<br />
=== [ERROR]Cannot access secondary GPU ===<br />
<br />
In some instances, running optirun will return:<br />
<br />
{{Ic|[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected. <br><br />
<br><br />
[ERROR]Aborting because fallback start is disabled.}}<br />
<br />
In this case, you will need to move the file {{ic|/etc/X11/xorg.conf.d/20-intel.conf}} somewhere else. Restart the bumblebeed daemon, and it should work.<br />
Credit for this goes to Lekensteyn on #bumblebee at freenode.net<br />
<br />
It could be also necessary to comment the driver line in {{ic|/etc/X11/xorg.conf.d/10-monitor.conf}}.<br />
<br />
=== Fatal IO error 11 (Resource temporarily unavailable) on X server ===<br />
<br />
Change {{ic|KeepUnusedXServer}} in {{ic|/etc/bumblebee/bumblebee.conf}} from {{ic|false}} to {{ic|true}}. This error occurs when your program forks into the background and Bumblebee loses track of it.<br />
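<br />
The change is a one-line edit; a sketch that performs it with sed on a throwaway copy of the relevant fragment (apply the same substitution to the real file as root):<br />

```shell
# Demonstrate the KeepUnusedXServer edit on a temporary copy of the
# relevant fragment of bumblebee.conf.
conf=$(mktemp)
printf '[bumblebeed]\nKeepUnusedXServer=false\n' > "$conf"
sed -i 's/^KeepUnusedXServer=false$/KeepUnusedXServer=true/' "$conf"
result=$(grep '^KeepUnusedXServer' "$conf")
rm -f "$conf"
echo "$result"
```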
<br />
=== Video tearing ===<br />
<br />
Video tearing is a somewhat common problem with Bumblebee. To fix it, you need to enable vsync. It should be enabled by default on the Intel card, but verify that from the Xorg logs. To check whether it is enabled for nvidia, run<br />
<br />
{{bc|$ optirun nvidia-settings -c :8 }}<br />
<br />
{{ic|1=X Server XVideo Settings -> Sync to VBlank}} and {{ic|1=OpenGL Settings -> Sync to VBlank}} should both be enabled. The Intel card generally has less tearing, so use it for video playback. In particular, use VA-API for video decoding (e.g. {{ic|mplayer-vaapi}} with the {{ic|-vsync}} parameter).<br />
<br />
Refer to the [[Intel#Video_tearing|Intel]] article on how to fix tearing on the Intel card.<br />
<br />
If it is still not fixed, try to disable compositing from your desktop environment. Try also disabling triple buffering.<br />
<br />
== Important Links ==<br />
* [http://www.bumblebee-project.org Bumblebee Project repository]<br />
* [http://wiki.bumblebee-project.org/ Bumblebee Project Wiki]<br />
* [https://github.com/Bumblebee-Project/bbswitch Bumblebee Project bbswitch repository]<br />
<br />
Join us at #bumblebee at freenode.net</div>
<hr />
<div>[[Category:Graphics]]<br />
[[Category:X Server]]<br />
[[fr:Bumblebee]]<br />
[[it:Bumblebee]]<br />
[[ru:Bumblebee]]<br />
[[tr:Bumblebee]]<br />
[[zh-CN:Bumblebee]]<br />
From Bumblebee's [https://github.com/Bumblebee-Project/Bumblebee/wiki/FAQ FAQ]:<br />
<br />
''Bumblebee is an effort to make NVIDIA Optimus enabled laptops work in GNU/Linux systems. Such feature involves two graphics cards with two different power consumption profiles plugged in a layered way sharing a single framebuffer.''<br />
<br />
== Bumblebee: Optimus for Linux ==<br />
<br />
[http://www.nvidia.com/object/optimus_technology.html Optimus Technology] is an ''[http://hybrid-graphics-linux.tuxfamily.org/index.php?title=Hybrid_graphics hybrid graphics]'' implementation without a hardware multiplexer. The integrated GPU manages the display while the dedicated GPU manages the most demanding rendering and ships the work to the integrated GPU to be displayed. When the laptop is running on battery supply, the dedicated GPU is turned off to save power and prolong the battery life.<br />
<br />
Bumblebee is a software implementation based on VirtualGL and a kernel driver to be able to use the dedicated GPU, which is not physically connected to the screen.<br />
<br />
Bumblebee tries to mimic the Optimus technology behavior; using the dedicated GPU for rendering when needed and power it down when not in use. The present releases only support rendering on-demand, power-management is a work in progress.<br />
<br />
The NVIDIA dedicated card is managed as a separate X server connected to a "fake" screen (the screen is configured but not used). The second server is called using VirtualGL as if it were a remote server. That said, you will need a series of steps to set-up the kernel driver, the X server and a daemon.<br />
<br />
{{Warning|Bumblebee is still under heavy development! But your help is very welcome.}}<br />
<br />
==Installation==<br />
<br />
Before installing Bumblebee check your BIOS and activate Optimus (shareable graphics), if possible (BIOS doesn't have to provide this option), and install the [[Intel|intel driver]] for the secondary on board graphics card.<br />
<br />
{{Note|If you want to run a 32-bit application on a 64-bit system you must install {{AUR|lib32-virtualgl}} and proper lib32-* libraries.}}<br />
<br />
=== Installing Bumblebee with Intel / nvidia ===<br />
<br />
Install {{AUR|bumblebee}} from [[Arch User Repository|AUR]], and then install the special nvidia package {{aur|nvidia-utils-bumblebee}} for bumblebee.<br />
<br />
<br />
If you want to run 32-bit applications (like games with wine) on a 64-bit system you need the {{AUR|lib32-nvidia-utils-bumblebee}} from AUR additionally.<br />
<br />
{{Warning|Don't install the original {{Pkg|nvidia-utils}} for Bumblebee - it will break your system !}}<br />
<br />
In order to avoid installing {{Pkg|nvidia-utils}} as a dependency when installing the {{Pkg|nvidia}} driver, you have to install the {{AUR|nvidia-bumblebee}} package from the [[AUR]] instead (which is the same driver packaged for bumblebee usage).<br />
<br />
{{Note|You can install {{AUR|dkms-nvidia}} from AUR instead of {{Pkg|nvidia}} if you need it.}}<br />
<br />
{{note|If you like bumblebee to turn off the NVIDIA card automatically after usage, use {{AUR|bbswitch}} from AUR. See [[#Power Management|below]].}}<br />
<br />
=== Installing Bumblebee with Intel / nouveau ===<br />
<br />
Install nouveau and required packages first:<br />
{{bc|# pacman -S xf86-video-nouveau nouveau-dri mesa}}<br />
<br />
* {{Pkg|xf86-video-nouveau}} experimental 3D acceleration driver<br />
* {{Pkg|nouveau-dri}} Mesa classic DRI + Gallium3D drivers<br />
* {{Pkg|mesa}} Mesa 3-D graphics libraries<br />
<br />
Now Install {{AUR|bumblebee}} from [[Arch User Repository|AUR]]:<br />
<br />
{{note|If you like bumblebee to turn off the NVIDIA card automatically after usage, use {{AUR|bbswitch}} from AUR. See [[#Power Management|below]].}}<br />
<br />
==Start Bumblebee==<br />
<br />
In order to use it is necessary add yourself (and other users) at Bumblebee group:<br />
<br />
# usermod -a -G bumblebee $USER<br />
<br />
where {{ic|$USER}} is the login name of the user to be added. Then log off and on again to apply the group changes.<br />
<br />
To start bumblebee automatically add it to your {{ic|DAEMONS}} array in {{ic|/etc/rc.conf}}<br />
DAEMONS=(... @bumblebeed)<br />
<br />
Finished - reboot system and use the shell program {{ic|[[#Usage|optirun]]}} for Optimus NVIDIA rendering!<br />
<br />
== Usage ==<br />
<br />
The command line programm {{ic|optirun}} shipped with bumblebee is your best friend for running applications on your Optimus NVIDIA card.<br />
<br />
Test Bumblebee if it works with your Optimus system:<br />
{{bc|$ optirun glxgears}}<br />
<br />
If it succeeds and the terminal you are running from mentions something about your NVIDIA - Optimus with Bumblebee is working!<br />
<br />
General Usage:<br />
<br />
{{bc|$ optirun [options] <application> [application-parameters]}}<br />
<br />
Some Examples:<br />
<br />
Start Firefox accelerated with Optimus:<br />
<br />
{{bc|$ optirun firefox}}<br />
<br />
Start Windows applications with Optimus:<br />
<br />
{{bc|$ optirun wine <windows application>.exe}}<br />
<br />
Use NVIDIA Settings with Optimus:<br />
<br />
{{bc|$ optirun nvidia-settings -c :8 }}<br />
<br />
For a list of options for {{ic|optirun}} run:<br />
{{bc|$ optirun --help}}<br />
<br />
== Configuration ==<br />
<br />
You can configure the behaviour of Bumblebee to fit your needs. Fine tuning like speed optimization, power managment and other stuff can be configured in {{ic|/etc/bumblebee/bumblebee.conf}}<br />
<br />
=== Optimizing Speed ===<br />
<br />
Bumblebee renders frames for your Optimus NVIDIA card in a invisible X Server with VirtualGL and transports them back to your visible X Server.<br />
<br />
Frames will be compressed before they are transported - this saves bandwith and can be used for speedup optimization of bumblebee:<br />
<br />
To use an other compression method for a single application:<br />
<br />
$ optirun -c <compress-method> application<br />
<br />
The method of compres will affect performance in the GPU/GPU usage. Compressed methods (such as {{ic|jpeg}}) will load the CPU the most but will load GPU the minimum necessary; uncompressed methods loads the most on GPU and the CPU will have the minimum load possible.<br />
<br />
Compressed Methods are: {{ic|jpeg}}, {{ic|rgb}}, {{ic|yuv}}<br />
<br />
Uncompressed Methods are: {{ic|proxy}}, {{ic|xv}}<br />
<br />
To use a standard compression for all applications set the {{ic|VGLTransport}} to {{ic|<compress-method>}} in {{ic|/etc/bumblebee/bumblebee.conf}}<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|<nowiki><br />
...<br />
[optirun]<br />
VGLTransport=proxy<br />
...<br />
</nowiki>}}<br />
<br />
{{Note|CPU frequency scaling will affect directly on render performance}}<br />
<br />
=== Power Management ===<br />
<br />
The goal of power management feature is to turnoff the NVIDIA card when it is not used by bumblebee anymore.<br />
<br />
To enable power managment for bumblebee install {{AUR|bbswitch}} from AUR.<br />
<br />
{{Warning|Make sure the secondary Xorg server is stopped when not in use !}}<br />
<br />
Set the {{ic|PMMethod}} to {{ic|bbswitch}} in the driver section of {{ic|/etc/bumblebee/bumblebee.conf}}:<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|<nowiki><br />
[bumblebeed]<br />
KeepUnusedXServer=false<br />
...<br />
[driver-nvidia]<br />
PMMethod=bbswitch<br />
...<br />
[driver-nouveau]<br />
PMMethod=bbswitch<br />
...<br />
</nowiki>}}<br />
<br />
==== Default power state of NVIDIA card ====<br />
<br />
Set {{ic|load_state}} and {{ic|unload_state}} module options according to your needs (see [https://github.com/Bumblebee-Project/bbswitch bbswitch documentation]).<br />
{{hc|/etc/modprobe.d/bbswitch.conf|<nowiki><br />
options bbswitch load_state=0 unload_state=0<br />
</nowiki>}}<br />
<br />
Just restart bumblebee daemon to activate power managment:<br />
{{bc|# rc.d restart bumblebeed}}<br />
<br />
==== Enable NVIDIA card during shutdown ====<br />
<br />
The NVIDIA card may not correctly initialize during boot if the card was powered off when the system was last shutdown. One option is to set {{ic|TurnCardOffAtExit&#61;false}} in {{ic|/etc/bumblebee/bumblebee.conf}}, however this will enable the card everytime you stop the Bumblebee daemon, even if done manually. To ensure that the NVIDIA card is always powered on during shutdown, add the following [[Boot process#Custom_hooks|hook function]] (if using {{AUR|bbswitch}}):<br />
<br />
{{hc|/etc/rc.d/functions.d/nvidia-card-enable|<nowiki><br />
nvidia_card_enable() {<br />
BBSWITCH=/proc/acpi/bbswitch<br />
<br />
stat_busy "Enabling NVIDIA GPU"<br />
<br />
if [ -w ${BBSWITCH} ]; then<br />
echo ON > ${BBSWITCH}<br />
stat_done<br />
else<br />
stat_fail<br />
fi<br />
}<br />
<br />
add_hook shutdown_poweroff nvidia_card_enable<br />
</nowiki>}}<br />
<br />
=== Multiple monitors ===<br />
<br />
You can set up multiple monitors with xorg.conf. Set them to use the Intel card, but Bumblebee can still use the NVIDIA card. One example configuration is below for two identical screens with 1080p resolution and using the HDMI out.<br />
<br />
{{hc|/etc/X11/xorg.conf|<nowiki><br />
Section "Screen"<br />
Identifier "Screen0"<br />
Device "intelgpu0"<br />
Monitor "Monitor0"<br />
DefaultDepth 24<br />
Option "TwinView" "0"<br />
SubSection "Display"<br />
Depth 24<br />
Modes "1980x1080_60.00"<br />
EndSubSection<br />
EndSection<br />
<br />
Section "Screen"<br />
Identifier "Screen1"<br />
Device "intelgpu1"<br />
Monitor "Monitor1"<br />
DefaultDepth 24<br />
Option "TwinView" "0"<br />
SubSection "Display"<br />
Depth 24<br />
Modes "1980x1080_60.00"<br />
EndSubSection<br />
EndSection<br />
<br />
Section "Monitor"<br />
Identifier "Monitor0"<br />
Option "Enable" "true"<br />
EndSection<br />
<br />
Section "Monitor"<br />
Identifier "Monitor1"<br />
Option "Enable" "true"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "intelgpu0"<br />
Driver "intel"<br />
Option "XvMC" "true"<br />
Option "UseEvents" "true"<br />
Option "AccelMethod" "UXA"<br />
BusID "PCI:0:2:0"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "intelgpu1"<br />
Driver "intel"<br />
Option "XvMC" "true"<br />
Option "UseEvents" "true"<br />
Option "AccelMethod" "UXA"<br />
BusID "PCI:0:2:0"<br />
EndSection<br />
</nowiki>}}<br />
<br />
You need to probably change the BusID:<br />
<br />
{{hc|<nowiki>$ lspci | grep VGA</nowiki>|<br />
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)<br />
}}<br />
<br />
The BusID is 0:2:0<br />
<br />
==CUDA Without Bumblebee==<br />
<br />
This is not well documented, but you do not need Bumblebee to use CUDA and it may work even on machines where optirun fails. For a guide on how to get it working with the Lenovo IdeaPad Y580 (which uses the GeForce 660M), see: https://wiki.archlinux.org/index.php/Lenovo_IdeaPad_Y580#NVIDIA_Card. Those instructions are very likely to work with other machines (except for the acpi-handle-hack part, which may not be necessary).<br />
<br />
==Troubleshooting==<br />
<br />
{{Note|Please report bugs at [https://github.com/Bumblebee-Project/Bumblebee Bumblebee-Project]'s GitHub tracker as described in its [https://github.com/Bumblebee-Project/Bumblebee/wiki/Reporting-Issues Wiki].}}<br />
<br />
=== [VGL] ERROR: Could not open display :8 ===<br />
<br />
There is a known problem with some wine applications that fork and kill the parent process without keeping track of it (for example the free to play online game "Runes of Magic")<br />
<br />
A workaround for this problem is:<br />
<br />
{{bc|$ optirun bash<br>$ optirun wine <windows program>.exe}}<br />
<br />
If using NVIDA drivers a fix for this problem is to edit /etc/bumblebee/xorg.conf.nvidia and change Option "ConnectedMonitor" to "CRT-0".<br />
<br />
=== [ERROR]Cannot access secondary GPU ===<br />
<br />
In some instances, running optirun will return:<br />
<br />
{{Ic|[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected. <br><br />
<br><br />
[ERROR]Aborting because fallback start is disabled.}}<br />
<br />
In this case, you will need to move the file {{ic|/etc/X11/xorg.conf.d/20-intel.conf}} to somewhere else. Restart the bumblebeed daemon, and it should work. <br />
Credit for this goes to Lekensteyn on #bumblebee at freenode.net<br />
<br />
<br />
It could be also necessary to comment the driver line in {{ic|/etc/X11/xorg.conf.d/10-monitor.conf}}.<br />
<br />
=== Fatal IO error 11 (Resource temporarily unavailable) on X server ===<br />
<br />
Change {{ic|KeepUnusedXServer}} in {{ic|/etc/bumblebee/bumblebee.conf}} from {{ic|false}} to {{ic|true}}. Your program forks into backgroung and bumblebee don't know anything about it.<br />
<br />
=== Video tearing ===<br />
<br />
Video tearing is a somewhat common problem on Bumblebee. To fix it, you need to enable vsync. It should be enabled by default on the Intel card, but verify that from Xorg logs. To check whether or not it is enabled for nvidia, run <br />
<br />
{{bc|$ optirun nvidia-settings -c :8 }}<br />
<br />
{{ic|1=X Server XVideo Settings -> Sync to VBlank}} and {{ic|1=OpenGL Settings -> Sync to VBlank}} should both be enabled. The Intel card has in general less tearing, so use it for video playback. Especially use VA-API for video decoding (e.g. {{ic|mplayer-vaapi}} and with {{ic|-vsync}} parameter).<br />
<br />
Refer to the [[Intel#Video_tearing|Intel]] article on how to fix tearing on the Intel card.<br />
<br />
If it is still not fixed, try to disable compositing from your desktop environment. Try also disabling triple buffering.<br />
<br />
== Important Links ==<br />
* [http://www.bumblebee-project.org Bumblebee Project repository]<br />
* [http://wiki.bumblebee-project.org/ Bumblebee Project Wiki]<br />
* [https://github.com/Bumblebee-Project/bbswitch Bumblebee Project bbswitch repository]<br />
<br />
Join us at #bumblebee at freenode.net</div>FebLeyhttps://wiki.archlinux.org/index.php?title=Bumblebee&diff=212815Bumblebee2012-07-13T11:24:57Z<p>FebLey: /* Installing Bumblebee with Intel / nvidia */</p>
<hr />
<div>[[Category:Graphics]]<br />
[[Category:X Server]]<br />
[[fr:Bumblebee]]<br />
[[it:Bumblebee]]<br />
[[ru:Bumblebee]]<br />
[[tr:Bumblebee]]<br />
[[zh-CN:Bumblebee]]<br />
From Bumblebee's [https://github.com/Bumblebee-Project/Bumblebee/wiki/FAQ FAQ]:<br />
<br />
''Bumblebee is an effort to make NVIDIA Optimus enabled laptops work in GNU/Linux systems. Such feature involves two graphics cards with two different power consumption profiles plugged in a layered way sharing a single framebuffer.''<br />
<br />
== Bumblebee: Optimus for Linux ==<br />
<br />
[http://www.nvidia.com/object/optimus_technology.html Optimus Technology] is an ''[http://hybrid-graphics-linux.tuxfamily.org/index.php?title=Hybrid_graphics hybrid graphics]'' implementation without a hardware multiplexer. The integrated GPU manages the display while the dedicated GPU manages the most demanding rendering and ships the work to the integrated GPU to be displayed. When the laptop is running on battery supply, the dedicated GPU is turned off to save power and prolong the battery life.<br />
<br />
Bumblebee is a software implementation based on VirtualGL and a kernel driver to be able to use the dedicated GPU, which is not physically connected to the screen.<br />
<br />
Bumblebee tries to mimic the Optimus technology behavior; using the dedicated GPU for rendering when needed and power it down when not in use. The present releases only support rendering on-demand, power-management is a work in progress.<br />
<br />
The NVIDIA dedicated card is managed as a separate X server connected to a "fake" screen (the screen is configured but not used). The second server is called using VirtualGL as if it were a remote server. That said, you will need a series of steps to set-up the kernel driver, the X server and a daemon.<br />
<br />
{{Warning|Bumblebee is still under heavy development! But your help is very welcome.}}<br />
<br />
==Installation==<br />
<br />
Before installing Bumblebee check your BIOS and activate Optimus (shareable graphics), if possible (BIOS doesn't have to provide this option), and install the [[Intel|intel driver]] for the secondary on board graphics card.<br />
<br />
{{Note|If you want to run a 32-bit application on a 64-bit system you must install {{AUR|lib32-virtualgl}} and proper lib32-* libraries.}}<br />
<br />
=== Installing Bumblebee with Intel / nvidia ===<br />
<br />
Install {{AUR|bumblebee}} from [[Arch User Repository|AUR]], and then install the special nvidia package {{aur|nvidia-utils-bumblebee}} for bumblebee.<br />
If you want to run 32-bit applications (like games with wine) on a 64-bit system you need the {{AUR|lib32-nvidia-utils-bumblebee}} from AUR additionally.<br />
<br />
{{Warning|Don't install the original {{Pkg|nvidia-utils}} for Bumblebee - it will break your system !}}<br />
<br />
In order to avoid installing {{Pkg|nvidia-utils}} as a dependency when installing the {{Pkg|nvidia}} driver, you have to install the {{AUR|nvidia-bumblebee}} package from the [[AUR]] instead (which is the same driver packaged for bumblebee usage).<br />
<br />
{{Note|You can install {{AUR|dkms-nvidia}} from AUR instead of {{Pkg|nvidia}} if you need it.}}<br />
<br />
{{note|If you like bumblebee to turn off the NVIDIA card automatically after usage, use {{AUR|bbswitch}} from AUR. See [[#Power Management|below]].}}<br />
<br />
=== Installing Bumblebee with Intel / nouveau ===<br />
<br />
Install nouveau and required packages first:<br />
{{bc|# pacman -S xf86-video-nouveau nouveau-dri mesa}}<br />
<br />
* {{Pkg|xf86-video-nouveau}} experimental 3D acceleration driver<br />
* {{Pkg|nouveau-dri}} Mesa classic DRI + Gallium3D drivers<br />
* {{Pkg|mesa}} Mesa 3-D graphics libraries<br />
<br />
Now Install {{AUR|bumblebee}} from [[Arch User Repository|AUR]]:<br />
<br />
{{note|If you like bumblebee to turn off the NVIDIA card automatically after usage, use {{AUR|bbswitch}} from AUR. See [[#Power Management|below]].}}<br />
<br />
==Start Bumblebee==<br />
<br />
In order to use it is necessary add yourself (and other users) at Bumblebee group:<br />
<br />
# usermod -a -G bumblebee $USER<br />
<br />
where {{ic|$USER}} is the login name of the user to be added. Then log off and on again to apply the group changes.<br />
<br />
To start bumblebee automatically add it to your {{ic|DAEMONS}} array in {{ic|/etc/rc.conf}}<br />
DAEMONS=(... @bumblebeed)<br />
<br />
Finished - reboot system and use the shell program {{ic|[[#Usage|optirun]]}} for Optimus NVIDIA rendering!<br />
<br />
== Usage ==<br />
<br />
The command line programm {{ic|optirun}} shipped with bumblebee is your best friend for running applications on your Optimus NVIDIA card.<br />
<br />
Test Bumblebee if it works with your Optimus system:<br />
{{bc|$ optirun glxgears}}<br />
<br />
If it succeeds and the terminal you are running from mentions something about your NVIDIA - Optimus with Bumblebee is working!<br />
<br />
General Usage:<br />
<br />
{{bc|$ optirun [options] <application> [application-parameters]}}<br />
<br />
Some Examples:<br />
<br />
Start Firefox accelerated with Optimus:<br />
<br />
{{bc|$ optirun firefox}}<br />
<br />
Start Windows applications with Optimus:<br />
<br />
{{bc|$ optirun wine <windows application>.exe}}<br />
<br />
Use NVIDIA Settings with Optimus:<br />
<br />
{{bc|$ optirun nvidia-settings -c :8 }}<br />
<br />
For a list of options for {{ic|optirun}} run:<br />
{{bc|$ optirun --help}}<br />
<br />
== Configuration ==<br />
<br />
You can configure the behaviour of Bumblebee to fit your needs. Fine tuning like speed optimization, power managment and other stuff can be configured in {{ic|/etc/bumblebee/bumblebee.conf}}<br />
<br />
=== Optimizing Speed ===<br />
<br />
Bumblebee renders frames for your Optimus NVIDIA card in a invisible X Server with VirtualGL and transports them back to your visible X Server.<br />
<br />
Frames will be compressed before they are transported - this saves bandwith and can be used for speedup optimization of bumblebee:<br />
<br />
To use an other compression method for a single application:<br />
<br />
$ optirun -c <compress-method> application<br />
<br />
The compression method affects how the load is distributed between the CPU and the GPU. Compressed methods (such as {{ic|jpeg}}) load the CPU the most but the GPU the least; uncompressed methods load the GPU the most and the CPU the least.<br />
<br />
Compressed Methods are: {{ic|jpeg}}, {{ic|rgb}}, {{ic|yuv}}<br />
<br />
Uncompressed Methods are: {{ic|proxy}}, {{ic|xv}}<br />
<br />
To use a default compression method for all applications, set {{ic|VGLTransport}} to the desired {{ic|<compress-method>}} in {{ic|/etc/bumblebee/bumblebee.conf}}:<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|<nowiki><br />
...<br />
[optirun]<br />
VGLTransport=proxy<br />
...<br />
</nowiki>}}<br />
<br />
{{Note|CPU frequency scaling directly affects render performance.}}<br />
<br />
=== Power Management ===<br />
<br />
The goal of the power management feature is to turn off the NVIDIA card when Bumblebee is no longer using it.<br />
<br />
To enable power management for Bumblebee, install {{AUR|bbswitch}} from the AUR.<br />
<br />
{{Warning|Make sure the secondary Xorg server is stopped when not in use!}}<br />
<br />
Set the {{ic|PMMethod}} to {{ic|bbswitch}} in the driver section of {{ic|/etc/bumblebee/bumblebee.conf}}:<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|<nowiki><br />
[bumblebeed]<br />
KeepUnusedXServer=false<br />
...<br />
[driver-nvidia]<br />
PMMethod=bbswitch<br />
...<br />
[driver-nouveau]<br />
PMMethod=bbswitch<br />
...<br />
</nowiki>}}<br />
<br />
==== Default power state of NVIDIA card ====<br />
<br />
Set {{ic|load_state}} and {{ic|unload_state}} module options according to your needs (see [https://github.com/Bumblebee-Project/bbswitch bbswitch documentation]).<br />
{{hc|/etc/modprobe.d/bbswitch.conf|<nowiki><br />
options bbswitch load_state=0 unload_state=0<br />
</nowiki>}}<br />
<br />
Restart the Bumblebee daemon to activate power management:<br />
{{bc|# rc.d restart bumblebeed}}<br />
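To verify the result, you can read the card's power state back from bbswitch's proc interface (the same {{ic|/proc/acpi/bbswitch}} node the shutdown hook writes to). A minimal sketch that degrades gracefully when the module is not loaded:

```shell
# Report the NVIDIA card's power state as seen by bbswitch.
# When loaded, bbswitch prints a line like "0000:01:00.0 OFF".
BBSWITCH=/proc/acpi/bbswitch
if [ -r "$BBSWITCH" ]; then
  cat "$BBSWITCH"
else
  echo "bbswitch not loaded"
fi
```

Run {{ic|optirun glxgears}}, check the state while it runs (it should read ON), and check again a few seconds after it exits (it should read OFF).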
<br />
==== Enable NVIDIA card during shutdown ====<br />
<br />
The NVIDIA card may not correctly initialize during boot if the card was powered off when the system was last shut down. One option is to set {{ic|TurnCardOffAtExit&#61;false}} in {{ic|/etc/bumblebee/bumblebee.conf}}, however this will enable the card every time you stop the Bumblebee daemon, even if done manually. To ensure that the NVIDIA card is always powered on during shutdown, add the following [[Boot process#Custom_hooks|hook function]] (if using {{AUR|bbswitch}}):<br />
<br />
{{hc|/etc/rc.d/functions.d/nvidia-card-enable|<nowiki><br />
nvidia_card_enable() {<br />
BBSWITCH=/proc/acpi/bbswitch<br />
<br />
stat_busy "Enabling NVIDIA GPU"<br />
<br />
if [ -w ${BBSWITCH} ]; then<br />
echo ON > ${BBSWITCH}<br />
stat_done<br />
else<br />
stat_fail<br />
fi<br />
}<br />
<br />
add_hook shutdown_poweroff nvidia_card_enable<br />
</nowiki>}}<br />
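The hook's core logic can be exercised outside the boot sequence by pointing it at a scratch file instead of the real proc node. This is an illustrative sketch only: the function mirrors the write-ON-if-writable behaviour of the hook above, minus the initscripts {{ic|stat_busy}}/{{ic|stat_done}} helpers:

```shell
# Same logic as the hook, parameterized so it can be dry-run safely.
nvidia_card_enable() {
  bbswitch=${1:-/proc/acpi/bbswitch}
  if [ -w "$bbswitch" ]; then
    echo ON > "$bbswitch"
    return 0
  fi
  return 1
}

# Dry run against a temporary file instead of /proc/acpi/bbswitch:
tmp=$(mktemp)
nvidia_card_enable "$tmp" && cat "$tmp"   # prints ON
rm -f "$tmp"
```

The {{ic|-w}} test is what makes the hook safe when bbswitch is not loaded: the function simply fails instead of erroring out during shutdown.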
<br />
=== Multiple monitors ===<br />
<br />
You can set up multiple monitors with xorg.conf, configuring them to use the Intel card; Bumblebee can still use the NVIDIA card. An example configuration for two identical 1080p screens using the HDMI output is below.<br />
<br />
{{hc|/etc/X11/xorg.conf|<nowiki><br />
Section "Screen"<br />
Identifier "Screen0"<br />
Device "intelgpu0"<br />
Monitor "Monitor0"<br />
DefaultDepth 24<br />
Option "TwinView" "0"<br />
SubSection "Display"<br />
Depth 24<br />
Modes "1980x1080_60.00"<br />
EndSubSection<br />
EndSection<br />
<br />
Section "Screen"<br />
Identifier "Screen1"<br />
Device "intelgpu1"<br />
Monitor "Monitor1"<br />
DefaultDepth 24<br />
Option "TwinView" "0"<br />
SubSection "Display"<br />
Depth 24<br />
Modes "1980x1080_60.00"<br />
EndSubSection<br />
EndSection<br />
<br />
Section "Monitor"<br />
Identifier "Monitor0"<br />
Option "Enable" "true"<br />
EndSection<br />
<br />
Section "Monitor"<br />
Identifier "Monitor1"<br />
Option "Enable" "true"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "intelgpu0"<br />
Driver "intel"<br />
Option "XvMC" "true"<br />
Option "UseEvents" "true"<br />
Option "AccelMethod" "UXA"<br />
BusID "PCI:0:2:0"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "intelgpu1"<br />
Driver "intel"<br />
Option "XvMC" "true"<br />
Option "UseEvents" "true"<br />
Option "AccelMethod" "UXA"<br />
BusID "PCI:0:2:0"<br />
EndSection<br />
</nowiki>}}<br />
<br />
You will probably need to change the BusID:<br />
<br />
{{hc|<nowiki>$ lspci | grep VGA</nowiki>|<br />
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)<br />
}}<br />
<br />
The BusID is 0:2:0<br />
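Note that {{ic|lspci}} prints bus, device and function in hexadecimal, while the {{ic|BusID}} line in xorg.conf expects decimal values. For bus 0, device 2 the two happen to coincide, but a discrete card at, say, {{ic|0a:00.0}} becomes {{ic|PCI:10:0:0}}. A hypothetical helper for the conversion (the function name is not part of any tool):

```shell
# Convert an lspci slot ("bus:dev.func", hexadecimal) into the decimal
# "PCI:bus:dev:func" form expected by the BusID line in xorg.conf.
lspci_to_busid() {
  slot=$1                                # e.g. "00:02.0"
  bus=$(printf '%d' "0x${slot%%:*}")     # hex bus  -> decimal
  rest=${slot#*:}
  dev=$(printf '%d' "0x${rest%%.*}")     # hex dev  -> decimal
  fn=$(printf '%d' "0x${rest#*.}")       # hex func -> decimal
  printf 'PCI:%d:%d:%d\n' "$bus" "$dev" "$fn"
}

lspci_to_busid "00:02.0"   # prints PCI:0:2:0
lspci_to_busid "0a:00.0"   # prints PCI:10:0:0
```

The slot value is exactly the first field of the matching {{ic|lspci}} output line.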
<br />
==CUDA Without Bumblebee==<br />
<br />
This is not well documented, but you do not need Bumblebee to use CUDA and it may work even on machines where optirun fails. For a guide on how to get it working with the Lenovo IdeaPad Y580 (which uses the GeForce 660M), see: https://wiki.archlinux.org/index.php/Lenovo_IdeaPad_Y580#NVIDIA_Card. Those instructions are very likely to work with other machines (except for the acpi-handle-hack part, which may not be necessary).<br />
<br />
==Troubleshooting==<br />
<br />
{{Note|Please report bugs at [https://github.com/Bumblebee-Project/Bumblebee Bumblebee-Project]'s GitHub tracker as described in its [https://github.com/Bumblebee-Project/Bumblebee/wiki/Reporting-Issues Wiki].}}<br />
<br />
=== [VGL] ERROR: Could not open display :8 ===<br />
<br />
There is a known problem with some Wine applications that fork and kill the parent process without keeping track of it (for example, the free-to-play online game "Runes of Magic").<br />
<br />
A workaround for this problem is:<br />
<br />
{{bc|$ optirun bash<br>$ optirun wine <windows program>.exe}}<br />
<br />
If using the NVIDIA driver, a fix for this problem is to edit {{ic|/etc/bumblebee/xorg.conf.nvidia}} and change {{ic|Option "ConnectedMonitor"}} to {{ic|"CRT-0"}}.<br />
<br />
=== [ERROR]Cannot access secondary GPU ===<br />
<br />
In some instances, running optirun will return:<br />
<br />
{{Ic|[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected. <br><br />
<br><br />
[ERROR]Aborting because fallback start is disabled.}}<br />
<br />
In this case, you will need to move the file {{ic|/etc/X11/xorg.conf.d/20-intel.conf}} somewhere else. Restart the bumblebeed daemon, and it should work.<br />
Credit for this goes to Lekensteyn on #bumblebee at freenode.net.<br />
<br />
It may also be necessary to comment out the driver line in {{ic|/etc/X11/xorg.conf.d/10-monitor.conf}}.<br />
<br />
=== Fatal IO error 11 (Resource temporarily unavailable) on X server ===<br />
<br />
Change {{ic|KeepUnusedXServer}} in {{ic|/etc/bumblebee/bumblebee.conf}} from {{ic|false}} to {{ic|true}}. Your program forks into the background and Bumblebee does not know anything about it.<br />
<br />
=== Video tearing ===<br />
<br />
Video tearing is a somewhat common problem with Bumblebee. To fix it, you need to enable vsync. It should be enabled by default on the Intel card, but verify that from the Xorg logs. To check whether or not it is enabled for the NVIDIA card, run<br />
<br />
{{bc|$ optirun nvidia-settings -c :8 }}<br />
<br />
{{ic|1=X Server XVideo Settings -> Sync to VBlank}} and {{ic|1=OpenGL Settings -> Sync to VBlank}} should both be enabled. The Intel card generally has less tearing, so use it for video playback, especially with VA-API for video decoding (e.g. {{ic|mplayer-vaapi}} with the {{ic|-vsync}} parameter).<br />
<br />
Refer to the [[Intel#Video_tearing|Intel]] article on how to fix tearing on the Intel card.<br />
<br />
If tearing persists, try disabling compositing in your desktop environment. Also try disabling triple buffering.<br />
<br />
== Important Links ==<br />
* [http://www.bumblebee-project.org Bumblebee Project repository]<br />
* [http://wiki.bumblebee-project.org/ Bumblebee Project Wiki]<br />
* [https://github.com/Bumblebee-Project/bbswitch Bumblebee Project bbswitch repository]<br />
<br />
Join us at #bumblebee at freenode.net</div>FebLey