Talk:Bumblebee
== Wiki rewritten ==
 
Hi, I followed this wiki two days ago and now Optimus technology works fine on my laptop, but I found the wiki a bit confusing, so I decided to rewrite it. I'm not a Linux expert and not a native English speaker (I'm Italian), so feel free to correct what I wrote.

:1) Setup X Server: I put this section first. Newer Bumblebee versions create an xorg.conf.nvidia.pacnew file, so I added a cp command (see the example sketched below this comment).
:2) Load Kernel Module: I reordered this section with this logic in mind: first, get rid of nouveau entirely; second, load the nvidia module.
:3) Start Bumblebee Daemon: I created a section for this. This way you don't need to reboot and it's clearer what you're doing.
:4) Start VirtualGL Client: Well, I deleted this section because I don't think it's needed to make Bumblebee work. I never ran that command and can still use optirun or optirun32.
:5) Usage: I added optirun32. It seems to work fine with the Unigine Tropics benchmark.
:6) Autostart Bumblebee: I created a section for this because these operations were scattered around the wiki. This way it's more compact.
:7) Nvidia ON/OFF...: Everything is fine here. I only added the command to check the battery discharge rate.
:About the last section: I have an Acer Aspire 5742G (Nvidia GT 540M) and I followed the steps to turn off my card: my power usage is higher (+400 mA) with the card turned off and the nvidia module unloaded! I know it's unbelievable, but it's true. Is anyone else experiencing this? Bye
[[User:Thewall|Thewall]] 18:06, 1 July 2011 (EDT)
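A sketch of the cp step mentioned in point 1 above, assuming the configuration file lives under /etc/X11 (adjust the path if your package ships it under /etc/bumblebee instead):

{{bc|# back up the old configuration, then apply the packaged template
cp /etc/X11/xorg.conf.nvidia /etc/X11/xorg.conf.nvidia.bak
cp /etc/X11/xorg.conf.nvidia.pacnew /etc/X11/xorg.conf.nvidia}}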
=== Samsagax Reply on thewall changes ===
 
It's nice someone got interested!

Now I'll argue some points about what takes precedence, what the bugs are, and what is planned for the future of Bumblebee in Arch Linux:
:1) I would put the kernel module loading first, before the configuration of the X server; I think that is better logic.
:2) The issue with the ".pacnew" file is a bug; it should only be created if an "xorg.conf.nvidia" already exists (i.e. on upgrade). I'm also planning to move this conf file to the /etc/bumblebee directory.
:3) Liked that (:
:4) I really wouldn't delete that; I don't know why, but some people need the vglclient running. Maybe it should be an optional, explanatory section.
:5) For the new bumblebee package I'm trying to split it into smaller packages, keeping the libraries apart from the scripts; also, optirun32 didn't work well for most people (especially under Wine).
:6) Liked that, it's cleaner this way.
:7) This is a dark spot. As long as acpi_call does not work reliably on most laptops, there is no safe way to tell whether it is working. For this reason I'm keeping it in a purely experimental state and not supporting it for now. Your issue has been reported and is known on a variety of ASUS laptops. I recommend reading about acpi_call and its list of known-to-work laptops.
 
BTW: Thanks!
 
  
==== Reply to Samsagax ====
 
:1) Ok.
:2) I tried to clarify. Is that bug solved?
:3) Great (:
:4) I re-entered the VGL Client section with a note.
:5) You really did a good job here (:
:6) Ok.
:7) Nothing to say.
:Other) A user on the Italian Arch Linux forum says that he must manually start the bumblebee daemon AFTER logging in with GNOME 3. When he puts it in /etc/rc.conf he gets this: "[VGL] ERROR: Could not open display :1." Would it be good to write that somewhere? Maybe in a "Troubleshooting" section? (Both start-up options are sketched below this comment.)
[[User:Thewall|Thewall]] 18:06, 1 July 2011 (EDT)
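A minimal sketch of the two start-up options discussed in the "Other" point, assuming the rc script shipped by the package is called "bumblebee" (depending on the package version it may be "bumblebeed"):

{{bc|1=# option A: start the daemon by hand after logging in (script name assumed)
sudo /etc/rc.d/bumblebee start

# option B: start it at boot by adding it to the DAEMONS array in /etc/rc.conf;
# the "@" prefix backgrounds it so it does not delay the boot
DAEMONS=(... @bumblebee)}}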
 
==== Addition to 7) ====
I think the higher power consumption is caused by the X server hanging (it hogs 100% of one CPU core) when you switch off the card via acpi_call. I've got the same issue here on an ASUS X53S, which also has an NVIDIA GT 540M.
[[User:florianb|florianb]] 00:19, 1 August 2011 (CET)
 
:Try disabling the X server first or you will have some issues. If there is still a problem, try the vga-switcheroo option.
:[[User:Samsagax|Samsagax]] 19:27, 31 July 2011 (EDT)
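A rough sketch of the vga-switcheroo route suggested above; this only applies when the driver in use registers with vga_switcheroo (nouveau does, the proprietary nvidia driver at this time does not), and it assumes debugfs is mounted at /sys/kernel/debug:

{{bc|# show the current state of both GPUs
cat /sys/kernel/debug/vgaswitcheroo/switch
# power off the GPU that is not driving the display
echo OFF > /sys/kernel/debug/vgaswitcheroo/switch}}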
::I successfully reproduced the errors:
::1. If you switch off the NVIDIA card before you stop the bumblebee daemon (which starts/stops the second X server), you get into trouble: the X process hogs 100% CPU, becomes unkillable, and the overall power consumption (in my case) goes from about 1500 mA to 2100 mA.
::2. If you only stop the bumblebee daemon without switching off the NVIDIA card, power consumption goes from about 1500 mA to 1800-1900 mA (maybe user "thewall" only stopped the daemon without switching off the NVIDIA card?).
::3. If you switch off the NVIDIA card (a GT 540M in my case) via acpi_call, power consumption goes down to 1200 mA, which is quite nice, *BUT* the fan goes to 100% a few seconds after you switch the card off; this seems to consume about 50 mA more power and, above all, is totally annoying.
::A guy on the Ubuntu forum apparently already fixed 3) on hardware similar to mine, but I guess the differences are in the details; I'm trying to find out.
::[[User:florianb|florianb]] 08:07, 1 August 2011 (CET)
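For reference, the acpi_call mechanism used in 3) works roughly as sketched below. The ACPI handle is machine-specific; the one shown is only a hypothetical example (sending the wrong handle can hang the machine, so check the acpi_call documentation for your laptop first):

{{bc|# load the module and check that its interface exists
modprobe acpi_call
ls /proc/acpi/call
# send a (laptop-specific, hypothetical) "GPU off" method call
echo '\_SB.PCI0.PEG0.GFX0._OFF' > /proc/acpi/call
# read back the result of the last call
cat /proc/acpi/call}}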
:::I'll try to release the new model for the nvidia driver today, similar to the one for nouveau. That way power switching is done automatically, and by means of vga-switcheroo by default. I have to remind you that the acpi_call method calls are guessed and (in your case) they may be incorrect. [[User:Samsagax|Samsagax]] 10:42, 1 August 2011 (EDT)
::::Okay, sounds nice. I'd really like to contribute something to your work; if there's anything I could do, let me know.
::::[[User:florianb|florianb]] 10:37, 2 August 2011 (CET)
== We are making some progress ==
Well, some developers (real ones) and I are getting somewhere with a stable Bumblebee due this week. I will update the package as soon as we get it done. [[User:Samsagax|Samsagax]] 14:27, 11 August 2011 (EDT)
 
== <s>What about lib32-nvidia-utils-bumblebee</s> ==
lib32-nvidia-utils-bumblebee is not mentioned anywhere in the wiki article, but it is necessary if I want to run 32-bit Wine games, right? --[[User:Onny|Onny]] 16:17, 29 January 2012 (EST)
:I've added lib32-nvidia-utils-bumblebee to the installation instructions --[[User:febLey|febLey]] 13:37, 13 July 2012 (GMT+1)
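For anyone following this thread, a quick way to check whether the 32-bit libraries are already present (where exactly the package comes from, the official repositories or the AUR, depends on the package version; 32-bit packages generally also need the multilib repository enabled in /etc/pacman.conf):

{{bc|# query the local package database for the 32-bit wrapper package
pacman -Qs lib32-nvidia-utils-bumblebee
# if it is missing, 32-bit Wine programs run through optirun32 will not find the GL libraries}}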
== No devices detected, error encountered due to different cause ==
While I was trying to use Bumblebee with nouveau, I encountered

{{bc|[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.
[ERROR]Aborting because fallback start is disabled.}}

but apparently for a different reason; I haven't figured out what it was. Changing to nvidia (extra/nvidia 290.10-2) fixed it. (I also had to update to core/linux 3.2.2-1 for it.)
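A note for anyone hitting this while on nouveau: the driver Bumblebee uses is selected in its configuration file. A rough sketch, assuming the Bumblebee 3.x layout where the setting lives in /etc/bumblebee/bumblebee.conf:

{{bc|1=# /etc/bumblebee/bumblebee.conf (path assumed for Bumblebee 3.x)
# switch the rendering driver from nouveau to the proprietary driver
Driver=nvidia}}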
== ... socket path /var/run/bumblebee.socket was incorrect. ==
I get the following error:

{{bc|[42641.769973] [ERROR]The Bumblebee daemon has not been started yet or the socket path /var/run/bumblebee.socket was incorrect.
[42641.770121] [ERROR]Could not connect to bumblebee daemon - is it running?}}

I am in the bumblebee group and {{ic|bumblebeed}} is running; both {{ic|bumblebee-git 20120726-1}} and {{ic|bumblebee 3.0.1-2}} from the AUR show the same problem. (Aside: {{ic|bumblebee}} initially had the '{{ic|Cannot access secondary GPU}}' issue above, but updating linux, and maybe some other stuff, fixed that.) [[User:Jasper1984|Jasper1984]] ([[User talk:Jasper1984|talk]]) 23:06, 23 August 2012 (UTC)
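A few sanity checks for this error (only a sketch; at the time of writing Arch was moving from initscripts to systemd, so use whichever applies to your setup, and the daemon/unit name is assumed to be bumblebeed):

{{bc|# group changes only take effect after logging in again; check the current session
groups
# confirm the daemon process is alive and the socket exists
pgrep -l bumblebeed
ls -l /var/run/bumblebee.socket
# on a systemd setup, check the daemon with (unit name assumed):
systemctl status bumblebeed}}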
== Multiple monitors with screenclone - wrong info ==
At the end of the manual it says "Take note of the position of the VIRTUAL display in the list of Outputs as shown by xrandr. The counting starts from zero, i.e. if it is the third display shown, you would specify -x 2 as parameter to screenclone". However, this was wrong in my case: I had to specify -x 2 even though VIRTUAL was the first output listed by xrandr (by that rule it should have been -x 0, but -x 0 only cloned my laptop display).

Making a change that mentions this.

[[User:Futile|Futile]] ([[User talk:Futile|talk]]) 21:29, 6 July 2013 (UTC)
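To illustrate the point above, a sketch of how to check the index in practice (the display number :8 and the index 2 are examples from this report, not fixed values; try values until the external monitor rather than the laptop panel is cloned):

{{bc|# list the outputs and note where VIRTUAL appears
xrandr -q
# the index passed to screenclone may not match that position, so experiment
# (display :8 is the typical Bumblebee X display, assumed here)
screenclone -d :8 -x 2}}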
