Difference between revisions of "Talk:Bumblebee"

== <s> Wiki rewritten </s> ==
 
 
Hi, I followed this wiki two days ago and now Optimus technology works fine on my laptop, but I found the wiki a bit confusing, so I decided to rewrite it. I am not a Linux expert and not a native English speaker (I am Italian), so feel free to correct what I wrote.
 
 
:1) Setup X Server: I put this section first. New Bumblebee versions create an xorg.conf.nvidia.pacnew file, so I added a cp command.
 
:2) Load Kernel Module: I reordered this section with this logic in mind: first, get rid of nouveau entirely; second, load the nvidia module.
 
:3) Start Bumblebee Daemon: I created a section for this. This way you do not need to reboot, and it is clearer what you are doing.
 
:4) Start VirtualGL Client: I deleted this section because I do not think it is needed to make Bumblebee work. I never ran that command to use optirun or optirun32.
 
:5) Usage: I added optirun32. It seems to work fine with the Unigine Tropics benchmark.
 
:6) Autostart Bumblebee: I created a section for this because these operations were scattered around the wiki. This way it is more compact.
 
:7) Nvidia ON/OFF...: Everything is fine here. I only added the command to check the battery discharge rate.
 
:About the last section: I have an Acer Aspire 5742G (Nvidia GT 540M), and when I follow the steps to turn off the card, my power usage is actually higher (+400 mA) with the card turned off and the nvidia module unloaded! I know it is unbelievable, but it is true. Is anyone else experiencing this? Bye
 
[[User:Thewall|Thewall]] 18:06, 1 July 2011 (EDT)
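The battery-rate check mentioned in point 7 presumably reads sysfs; as a sketch, assuming the common layout where {{ic|/sys/class/power_supply/BAT0/current_now}} reports microamps (an assumption — some machines expose {{ic|power_now}} in µW instead), the conversion to the mA figures quoted in this discussion is:

```shell
# Hypothetical helper: convert a sysfs current_now reading (assumed µA)
# to mA. On a real laptop the reading would come from something like:
#   cat /sys/class/power_supply/BAT0/current_now
ua_to_ma() { echo $(( $1 / 1000 )); }

ua_to_ma 400000    # the +400 mA difference described above
```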
 
=== Samsagax's reply to Thewall's changes ===
 
 
It's nice someone got interested!
 
Now I'll argue some points about what takes precedence, what the bugs are, and what is planned for the future of Bumblebee in Arch Linux:
 
:1) I would put the kernel module load first, before the configuration of the X server; I think that ordering is more logical.
 
:2) The issue with the ".pacnew" file is a bug: it should be created only if an "xorg.conf.nvidia" already exists (on upgrade). I am also planning to move this conf file to the /etc/bumblebee directory.
 
:3) Liked that (:
 
:4) I really wouldn't delete that. I don't know why, but some people need vglclient running; maybe it should be an optional, explanatory section.
 
:5) With the new bumblebee package I am trying to split it into smaller packages, keeping the libraries separate from the scripts; also, optirun32 did not work well for most people (especially under Wine).
 
:6) Liked that, it is cleaner this way.
 
:7) This is a dark spot: as long as acpi_call does not work reliably on most laptops, there is no safe way to tell whether it is working. For this reason I am treating it as purely experimental and not supporting it for now. Your issue was reported and is known on a variety of ASUS laptops. I recommend reading about acpi_call and its known-to-work laptops.
 
BTW: Thanks!
 
 
==== Reply to Samsagax ====
 
 
:1) Ok.
 
:2) I tried to clarify it. Is that bug solved?
 
:3) Great (:
 
:4) I re-added the VGL Client section with a note.
 
:5) You really did a good job here (:
 
:6) Ok.
 
:7) Nothing to say.
 
:Other) A user on the Italian Arch Linux forum says that he must manually run the bumblebee daemon AFTER logging in with GNOME 3. When he puts it in /etc/rc.conf he gets: "[VGL] ERROR: Could not open display :1." Would it be good to write that somewhere? Maybe in a "Troubleshooting" section?
 
[[User:Thewall|Thewall]] 18:06, 1 July 2011 (EDT)
 
 
 
 
  
== <s> ... socket path /var/run/bumblebee.socket was incorrect. </s> ==

I get the following error:

{{bc|[42641.769973] [ERROR]The Bumblebee daemon has not been started yet or the socket path /var/run/bumblebee.socket was incorrect.
[42641.770121] [ERROR]Could not connect to bumblebee daemon - is it running?}}

I am in the bumblebee group and {{ic|bumblebeed}} is running; both {{ic|bumblebee-git 20120726-1}} and {{ic|bumblebee 3.0.1-2}} from the AUR show the same problem. (Aside: {{ic|bumblebee}} initially had the '{{ic|Cannot access secondary GPU}}' issue above, but updating linux, and maybe some other packages, fixed that.) I currently use the {{ic|extra/nvidia}} package, but had the same issue with {{ic|libgl}}. Edit: fixed it, breaking xorg; fixed that, broke this again >< [[User:Jasper1984|Jasper1984]] ([[User talk:Jasper1984|talk]]) 23:11, 23 August 2012 (UTC)

(continued) Using the systemd version instead did not work either. Running {{ic|/usr/sbin/bumblebeed}} directly I get {{ic|[ 4917.535145] [ERROR]Module 'nvidia' is not found.}}; maybe it does not look in {{ic|/usr/lib/modules/extramodules-3.4-ARCH/nvidia.ko.gz}}? [[User:Jasper1984|Jasper1984]] ([[User talk:Jasper1984|talk]]) 14:03, 24 August 2012 (UTC)

:::: [https://bbs.archlinux.org/viewtopic.php?pid=1178729#p1178729 It is fixed]; I also added the troubleshooting item to the wiki. (This discussion section can be deleted.) [[User:Jasper1984|Jasper1984]] ([[User talk:Jasper1984|talk]]) 11:55, 20 October 2012 (UTC)

Latest revision as of 23:39, 12 November 2016

== Nvidia ON/OFF ==

This is a dark spot: as long as acpi_call does not work reliably on most laptops, there is no safe way to tell whether it is working. For this reason I am treating this as purely experimental and not supporting it for now. Your issue was reported and is known on a variety of ASUS laptops. I recommend reading about acpi_call and its known-to-work laptops. BTW: Thanks!

:I think the higher power consumption is caused by the X server, which gets hung up (it hogs 100% of one CPU core) when you switch off the card via acpi_call. I have the same issue here on an ASUS X53S, which also has an Nvidia GT 540M.
:florianb 00:19, 1 August 2011 (CET)
::Try disabling the X server first or you will have some issues. If there is still a problem, try the vga-switcheroo option.
::Samsagax 19:27, 31 July 2011 (EDT)
:::I successfully reproduced the errors:
:::1. If you switch off the NVIDIA card before you stop the bumblebee daemon (which starts/stops the second X server), you get into trouble: the X process hogs 100% CPU, becomes unkillable, and the overall power consumption (in my case) goes from about 1500 mA to 2100 mA.
:::2. If you only stop the bumblebee daemon without switching off the NVIDIA card, power consumption goes from about 1500 mA to 1800-1900 mA (maybe user "thewall" only stopped the daemon without switching off the NVIDIA card?).
:::3. If you switch off the NVIDIA card (a GT 540M in my case) via acpi_call, power consumption goes down to 1200 mA, which is quite nice, *BUT* the fan goes to 100% a few seconds after you switch it off. This seems to consume about 50 mA more power and, above all, is totally annoying.
:::A guy on the Ubuntu forum apparently already fixed 3) on hardware similar to mine, but I guess the differences are in the details; I am trying to find out.
:::florianb 08:07, 1 August 2011 (CET)
::::I'll try to release the new model for the nvidia driver today, similar to the one for nouveau. That way power switching is done automatically, by means of vga-switcheroo by default. I have to remind you that the acpi_call method calls are guessed and (in your case) they may be incorrect. Samsagax 10:42, 1 August 2011 (EDT)
:::::Okay, sounds nice. I'd really like to contribute something to your work; if there is anything I can do, let me know.
:::::florianb 10:37, 2 August 2011 (CET)
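The ordering the thread converges on (stop the daemon that owns the second X server, then power off the card) can be sketched as below. The switch-file path is parameterized so the sequence can be exercised against a dummy file; on real hardware it would be something like {{ic|/sys/kernel/debug/vgaswitcheroo/switch}} (an assumption — acpi_call uses machine-specific method names instead).

```shell
# Sketch of the safe power-off ordering discussed above.
power_off_card() {
    switch_file="$1"              # e.g. /sys/kernel/debug/vgaswitcheroo/switch
    # systemctl stop bumblebeed   # first: stop the daemon (needs root; commented out)
    echo OFF > "$switch_file"     # then: cut power to the discrete card
}
```

Doing these two steps in the opposite order is exactly what produced the hung, unkillable X process reported above.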

== Multiple monitors with screenclone - wrong info ==

At the end of the manual it says "Take note of the position of the VIRTUAL display in the list of Outputs as shown by xrandr. The counting starts from zero, i.e. if it is the third display shown, you would specify -x 2 as parameter to screenclone". However, this was wrong in my case: I had to specify -x 2 even though VIRTUAL was first in my xrandr output (and thus it should have been -x 0, which only cloned my laptop display). Making a change that mentions this. Futile (talk) 21:29, 6 July 2013 (UTC)
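The counting rule the manual describes can be sketched as below; the sample output is hypothetical (real xrandr output varies per machine, which is presumably why the index does not always match, as reported above), so treat the computed index only as the first value to try with screenclone's -x.

```shell
# Hypothetical xrandr output; on a real system pipe `xrandr` itself.
xrandr_sample="Screen 0: minimum 8 x 8, current 1920 x 1080
LVDS1 connected 1920x1080+0+0
VGA1 disconnected
VIRTUAL1 disconnected"

# Zero-based position of the VIRTUAL output among the outputs
# (the lines after the "Screen" header):
idx=$(printf '%s\n' "$xrandr_sample" | awk 'NR > 1 && $1 ~ /^VIRTUAL/ { print NR-2; exit }')
echo "screenclone -x $idx"
```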

== systemd-logind: failed to get session: PID XXX does not belong to any known session ==

systemd-logind: failed to get session: PID XXX does not belong to any known session

I once got this error. When I tried what the wiki said, it made no difference.

But this worked:

Failed to initialize the NVIDIA GPU at PCI:1:0:0 (GPU fallen off the bus / RmInitAdapter failed!)

Add rcutree.rcu_idle_gp_delay=1 to the kernel parameters.
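Assuming GRUB as the boot loader (other boot loaders differ), the parameter would be appended along these lines, followed by regenerating grub.cfg with grub-mkconfig -o /boot/grub/grub.cfg:

```
# /etc/default/grub — sketch; append to any parameters already present
GRUB_CMDLINE_LINUX_DEFAULT="quiet rcutree.rcu_idle_gp_delay=1"
```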

I think these two issues have something in common.

However, people who have the same problem as mine should try it. —This unsigned comment is by Swordfeng (talk) 18 September 2014‎. Please sign your posts with ~~~~!

:Why was this error removed from the wiki? It is not fixed, and the workaround I added to the wiki still works...
:Aligator (talk) 18:58, 3 February 2015 (UTC)
::It was removed (along with a host of other content) with [1], with only vague (read: none) reasoning. Reverted it. @Archange: Please read ArchWiki:Contributing, make small edits, justify them and check the talk page. -- Alad (talk) 20:19, 3 February 2015 (UTC)
:::Sorry, I just wanted to clean up this page because, as I said, all this content was too old or even wrong. But I am not used to MediaWiki, so I probably did not do it correctly; plus I am definitely not comfortable with discussions here. I am a Bumblebee "dev", and I am currently cleaning up the most important wikis (Debian, Ubuntu, Arch) for the upcoming 4.0 release (which has been delayed, but was initially due at the end of January). About the aforementioned error: it has nothing to do with Bumblebee, it is a feature of rootless X. Bumblebee is coded to return X.org errors, but should ignore this one as it does some others (this is fixed in 4.0). -- Archange (talk) 12:56, 4 February 2015 (UTC)

Since ll /dev/dri/card0 gives something like: crw-rw-rw-+ 1 root vglusers 226, 0 Mar 6 14:24 /dev/dri/card0

I think I've solved the problem by reconfiguring the VirtualGL server: I ran sudo vglserver_config and disabled the first two options, i.e. "Restrict 3D X server access to vglusers group" and "Restrict framebuffer device access to vglusers group", as mentioned in /usr/share/doc/virtualgl/index.html.
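The distinction above (world-accessible device vs. access restricted to the vglusers group) can be read off the device's mode line; a sketch, using a hypothetical ls -l style line (on a real system: ls -l /dev/dri/card0):

```shell
# Hypothetical restricted-permissions line; compare the world-accessible
# "crw-rw-rw-+" line quoted in the discussion above.
perm_line="crw-rw---- 1 root vglusers 226, 0 Mar  6 14:24 /dev/dri/card0"

mode=$(printf '%s\n' "$perm_line" | awk '{print $1}')
group=$(printf '%s\n' "$perm_line" | awk '{print $4}')

# If owner/group/other all have rw, there is no group restriction.
restriction=$(case "$mode" in
    ?rw?rw?rw?*) echo "world-accessible: no group restriction" ;;
    *)           echo "restricted: add users to the $group group" ;;
esac)
echo "$restriction"
```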

The error messages are persistent in dmesg/Xorg.8.log, but optirun seems to be working perfectly.

P.S. please don't remove this discussion page.


--Rezad (talk) 10:20, 22 April 2015 (UTC)

I fixed the same error by adding this to /etc/bumblebee/

Section "Screen"
    Identifier "Default Screen"
    Device "DiscreteNvidia"
EndSection

as mentioned on the Debian wiki Bumblebee page: https://wiki.debian.org/Bumblebee#Common_issues

== Adding "user" to the "bumblebee" group ==

I believe adding "user" to the "bumblebee" group is still necessary. Bumblebee is not working for me without doing that:

$ optirun glxgears -info    
[ERROR]You've no permission to communicate with the Bumblebee daemon. Try adding yourself to the 'bumblebee' group

Also, there is still a group named "bumblebee" in /etc/group. Ghfujianbin (talk) 06:27, 21 September 2015 (UTC)

Added back: [2]. -- Lahwaacz (talk) 07:35, 21 September 2015 (UTC)
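Membership can be checked against /etc/group directly; a sketch using a hypothetical group line (on a real system, read /etc/group itself, and note the GID and member list shown here are made up):

```shell
# Hypothetical /etc/group entry for the bumblebee group.
group_line="bumblebee:x:972:alice,bob"
user=alice

# Field 4 is the comma-separated member list.
members=$(printf '%s\n' "$group_line" | awk -F: '{print $4}')

if printf '%s\n' "$members" | tr ',' '\n' | grep -qx "$user"; then
    msg="$user is already in the bumblebee group"
else
    msg="run: gpasswd -a $user bumblebee, then log out and back in"
fi
echo "$msg"
```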

== Intel/Nouveau: PRIMUS_libGLa ==

I moved Bumblebee#Intel/Nouveau: primus: fatal: failed to load any of the libraries: /usr/$LIB/nvidia/libGL.so.1 under Bumblebee#Troubleshooting, but it has been proposed to move it back under Bumblebee#Installing Bumblebee with Intel/Nouveau.

I see some other mentions of nouveau under Bumblebee#Troubleshooting, but I also understand Lahwaacz's point, and wouldn't mind moving the section to its original place in a Note. Are there more opinions?

Besides that though, wouldn't it be more correct to state that support for nouveau is probably lacking due to the practical death of the project? The deprecation Note was added with [3], and the Expansion template with [4], but on the website and related links I couldn't find any official deprecation statement, see in particular [5] and [6].

Kynikos (talk) 03:57, 21 November 2015 (UTC)

:Right, but the other mentions of nouveau are tied to recommending the proprietary driver instead, so that does not count :P
:-- Lahwaacz (talk) 13:58, 23 November 2015 (UTC)
::Eheh, I won't insist, it's practically the same to me; moved back :) — Kynikos (talk) 07:23, 24 November 2015 (UTC)
:I have no idea what the actual state really is and don't have the resources to find out experimentally, because my laptop is unfortunately one of the last pre-Optimus models. For what it's worth, the last two links lead to pages last edited in 2013, whereas the deprecation note was added a year ago.
:-- Lahwaacz (talk) 13:58, 23 November 2015 (UTC)
::I can't test the current working state either with my laptop, but the state of the project is quite clear... My point was indeed that in the latest update of the official docs (2013) there is no trace of a deprecation of nouveau support, and the deprecation note was added here by Svenstaro without any external reference. So I thought that blaming the death of the project, instead of an unreferenced deprecation, would make things clearer: my guess is that Bumblebee was working on nouveau in 2013 and stopped working in 2014 without anybody to fix it, hence the Note. — Kynikos (talk) 07:23, 24 November 2015 (UTC)

== Fix for Bumblebee Optirun with Nvidia v358.16-2.1 driver and bbswitch v0.8 ==

Please refer to this issue #699 for details.

Note: there are other, similar issues about bumblebee or bbswitch not unloading nvidia drivers, dating back to 2012. Further information can be found under the Bumblebee Project on GitHub.

System affected:

:Arch x86_64 laptop, Nvidia GT 540M,
:linux-lts 4.1.15-1, kded 5.17.0

Dude Doe Doh (talk) 22:53, 25 December 2015 (UTC)
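A common thread in these reports is that bbswitch cannot power off the card while the nvidia module is still loaded (or in use). A sketch of the check, using a hypothetical lsmod snapshot (on a real system, pipe lsmod itself):

```shell
# Hypothetical lsmod output; field 3 is the module's use count.
lsmod_sample="Module                  Size  Used by
nvidia               12345678  1
i915                   999999  20"

state=$(printf '%s\n' "$lsmod_sample" | awk '$1 == "nvidia" { if ($3 > 0) print "busy"; else print "idle" }')
echo "nvidia module is $state"   # "busy" means rmmod will fail and bbswitch cannot power off the card
```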

== TurnCardOffAtExit in /etc/bumblebee/bumblebee.conf ==

This section suggests that one workaround to a problem

:is to set TurnCardOffAtExit=false in /etc/bumblebee/bumblebee.conf, however this will enable the card every time you stop the Bumblebee daemon, even if done manually.

This suggests to me that it is a poor solution compared to the alternative option. However, the bumblebee package *already* sets this by default, which appears to conflict with this text. Also, does setting this mean that the card will be on at all times, i.e. using up unnecessary battery? -Ostiensis (talk) 01:04, 12 November 2016 (UTC)

:The default bumblebee.conf setting is TurnCardOffAtExit=true; the article suggests setting it to false (effectively writing OFF), but then dismisses the idea in favour of a better alternative (namely writing ON to /proc/acpi/bbswitch). What exactly is conflicting? Only when bumblebeed is stopped (e.g. systemctl stop bumblebeed, or on shutdown) will the power state be changed. It has no effect between invocations of optirun and will normally not eat your battery. --Lekensteyn (talk) 01:46, 12 November 2016 (UTC) PS: please sign your posts
::Sorry, I always forget to sign! So you are saying that if I set this option to false, and I only ever turn on the card with optirun/primusrun, it should still be turned off when unused? Hence it should not affect battery. Regarding the default settings, I downloaded the package; grep TurnCardOffAtExit bumblebee-3.2.1-12-x86_64.pkg/etc/bumblebee/bumblebee.conf gives TurnCardOffAtExit=false. -Ostiensis (talk) 02:21, 12 November 2016 (UTC)
:::Sorry, I was mistaken; I actually changed it to true myself. The default is indeed false. This default probably exists for compatibility reasons: some (older?) laptops would come up with a black screen if it was not done. And correct: it will only turn the card on with optirun (primusrun) and turn it off when optirun is not running. On daemon exit (e.g. shutdown) it will also turn the card on, but that should not be a problem, I guess? --Lekensteyn (talk) 10:24, 12 November 2016 (UTC)
::No worries. I edited the wiki in an attempt to clarify it, but I've possibly misunderstood, so please feel free to edit further. Thanks for the replies. -Ostiensis (talk) 23:39, 12 November 2016 (UTC)
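For reference, the setting discussed in this thread lives in bumblebee.conf; a sketch of the relevant fragment, with the section name and default value taken from this discussion (they may differ between package versions):

```
# /etc/bumblebee/bumblebee.conf
[bumblebeed]
# Power the card off when bumblebeed exits (false is the packaged default
# mentioned above; true means the card is powered off on daemon exit)
TurnCardOffAtExit=false
```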