Talk:Bumblebee

Wiki rewritten

Hi, I followed this wiki two days ago and now Optimus technology works fine on my laptop, but I found the wiki a bit confusing, so I decided to rewrite it. I'm not a Linux expert and not a native English speaker (I'm Italian), so feel free to correct what I wrote.

1) Setup X Server: I put this section first. Newer Bumblebee versions create an xorg.conf.nvidia.pacnew file, so I added a cp command (the commands for these sections are sketched below the list).
2) Load Kernel Module: I reordered this section with this logic in mind: first, get rid of nouveau entirely; second, load the nvidia module.
3) Start Bumblebee Daemon: I created a section for this. This way you don't need to reboot, and it's clearer what you're doing.
4) Start VirtualGL Client: Well, I deleted this section because I don't think it's needed to make Bumblebee work. I never ran that command to use optirun or optirun32.
5) Usage: I added optirun32. It seems to work fine with the Unigine Tropics benchmark.
6) Autostart Bumblebee: I created a section for this because these operations were scattered around the wiki. This way it's more compact.
7) Nvidia ON/OFF...: Everything is fine here. I only added the command to check the battery discharge rate.
About the last section: I have an Acer Aspire 5742G (NVIDIA GT 540M), and after following the steps to turn off my card, my power usage is higher (+400 mA) with the card turned off and the nvidia module unloaded! I know it sounds unbelievable, but it's true. Is anyone else experiencing this? Bye
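
For reference, a minimal sketch of the commands behind those sections, as they would look on a 2011-era Arch system with initscripts. The file locations (/etc/X11/xorg.conf.nvidia, /etc/rc.d/bumblebee) and the blacklist file are assumptions, not taken from this page:

 # 1) Keep the packaged X config by overwriting the old one (assumed path)
 cp /etc/X11/xorg.conf.nvidia.pacnew /etc/X11/xorg.conf.nvidia
 # 2) Get rid of nouveau entirely, then load the nvidia module
 echo "blacklist nouveau" >> /etc/modprobe.d/modprobe.conf
 rmmod nouveau && modprobe nvidia
 # 3) Start the Bumblebee daemon without rebooting (initscripts style)
 /etc/rc.d/bumblebee start
 # 5) Usage: run an application on the NVIDIA card (32-bit variant: optirun32)
 optirun glxgears
 # 7) Check the battery discharge rate (current_now is usually in µA)
 cat /sys/class/power_supply/BAT0/current_now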

Thewall 18:06, 1 July 2011 (EDT)

Samsagax's reply to thewall's changes

It's nice someone got interested! Now I'll argue some points about what takes precedence, what the bugs are, and what is planned for the future of Bumblebee in Arch Linux:

1) I would put the kernel module loading first, before the X server configuration; I think that is better logic.
2) The issue with the ".pacnew" file is a bug; it should only be created if an "xorg.conf.nvidia" already exists (on upgrade). I'm also planning to move this conf file to the /etc/bumblebee directory.
3) Liked that (:
4) I really wouldn't delete that. I don't know why, but some people need the vglclient running; maybe it should be an optional, explanatory section (sketched below).
5) With the new bumblebee package I'm trying to split it into smaller packages, keeping the libraries apart from the scripts, and optirun32 didn't work well for most people (especially under Wine).
6) Liked that, it's cleaner this way.
7) This is a dark spot. As long as acpi_call does not work reliably on most laptops, there is no safe way to tell whether it's working. For this reason I'm keeping it in a purely experimental state and not supporting it for now. Your issue was reported and is known on a variety of ASUS laptops. I recommend reading about acpi_call and its list of known-to-work laptops.
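
If the VirtualGL client section comes back as an optional one, it could boil down to something like this; vglclient is the real VirtualGL binary, but starting it per session with -detach is an assumption from other setups, not something this page states:

 # Optional: start the VirtualGL client for the local display and
 # detach it into the background (flag assumed; check vglclient -h)
 vglclient -detach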

BTW: Thanks!

Reply to Samsagax

1) Ok.
2) I tried to clarify. Is that bug solved?
3) Great (:
4) I re-entered the VGL Client section with a note.
5) You really did a good job here (:
6) Ok.
7) Nothing to say.
Other) A user on the Italian Arch Linux forum says that he must manually start the bumblebee daemon AFTER logging in to GNOME 3. When he puts it in /etc/rc.conf he gets this: "[VGL] ERROR: Could not open display :1." Would it be good to write that somewhere? Maybe a "Troubleshooting" section? (Both variants are sketched below.)
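
A sketch of the two variants that user compared; the DAEMONS syntax is the standard /etc/rc.conf one for that era, and the daemon name is taken from this page, but the surrounding daemon list is only an example:

 # Fails for him: start bumblebee at boot, backgrounded, in /etc/rc.conf
 # (a guess: the daemon's second X server is not reachable at that point)
 DAEMONS=(syslog-ng dbus network @bumblebee)
 # Works for him: start it manually after logging in to GNOME 3
 sudo /etc/rc.d/bumblebee start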

Thewall 18:06, 1 July 2011 (EDT)


Addition to 7)

I think the higher power consumption is caused by the X server that hangs (it hogs 100% of one CPU core) when you switch off the card via acpi_call. I've got the same issue here on an ASUS X53S, which also has an NVIDIA GT 540M.
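
For context, "switching off via acpi_call" means echoing an ACPI method into /proc/acpi/call. That interface is standard for the module; the handle below is only a common example and very likely differs on the X53S:

 # Switch the discrete card off through the acpi_call module (as root);
 # the ACPI handle varies per laptop - this one is just an example
 modprobe acpi_call
 echo '\_SB.PCI0.PEG0.PEGP._OFF' > /proc/acpi/call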

florianb 00:19, 1 August 2011 (CET)

Try disabling the X server first, or you will have some issues. If there is still a problem, try the vga_switcheroo option (standard usage sketched below).
Samsagax 19:27, 31 July 2011 (EDT)
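
For the record, the vga_switcheroo interface lives in debugfs; this is its standard usage, although whether it actually works on these Optimus laptops is exactly what is being debated here:

 # Power off the discrete GPU via vga_switcheroo
 # (needs kernel support and a mounted debugfs)
 echo OFF > /sys/kernel/debug/vgaswitcheroo/switch
 cat /sys/kernel/debug/vgaswitcheroo/switch   # inspect the current state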
I tried to reproduce the errors, successfully:
1. If you switch off the NVIDIA card before you stop the bumblebee daemon (which starts/stops the second X server), you get into trouble: the X process hogs 100% CPU, becomes unkillable, and the overall power consumption (in my case) goes from about 1500 mA to 2100 mA.
2. If you only stop the bumblebee daemon without switching off the NVIDIA card, power consumption goes from about 1500 mA to 1800-1900 mA (maybe user "thewall" only stopped the daemon without switching off the NVIDIA card?).
3. If you switch off the NVIDIA card (which is a GT 540M in my case) via acpi_call, power consumption goes down to 1200 mA, which is quite nice, *BUT* the fan goes to 100% a few seconds after you switch it off; this seems to consume about 50 mA more power and, above all, is totally annoying.
A guy on the Ubuntu forum apparently already fixed 3) on hardware similar to mine, but I guess the differences are in the details; I'm trying to figure it out.
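
Going by 1) and 2) above, the safe ordering seems to be: stop the daemon (and with it the second X server) first, and only then cut power to the card. A sketch, reusing the same assumed acpi_call handle as earlier:

 # Safe order: stop the daemon and its X server BEFORE the card
 /etc/rc.d/bumblebee stop
 echo '\_SB.PCI0.PEG0.PEGP._OFF' > /proc/acpi/call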
florianb 08:07, 1 August 2011 (CET)