NVIDIA

How to install the NVIDIA driver with pacman

Info from Package Maintainer tpowa

This package is for people who run a stock Arch kernel! I only test with kernel 2.6 and Xorg.

Multiple kernel users: you need to install the nvidia package for each additional kernel!

Installing drivers

The packages are in the extra repository, so enable it for pacman. Exit the X server first, otherwise pacman cannot finish the installation and the driver will not work! As root, run:

pacman -Sy nvidia (for newer cards)
pacman -Sy nvidia-96xx or pacman -Sy nvidia-173xx (for older cards)

For the very newest cards you may need to install nvidia-beta from the AUR, because the stable drivers may not support them yet. (Check /usr/share/doc/nvidia/supported-cards.txt in the nvidia package to see whether your card is supported.)

If you use the -mm kernel, adjust the above command appropriately.

See the README from NVIDIA for details on which cards are supported by which driver.

Configuring X-Server

Edit /etc/X11/XF86Config or your /etc/X11/xorg.conf config file. In the Module section, disable the GLcore and DRI modules.

Add to modules section:

Load "glx"

Make sure you DON'T have a line

Load           "type1"

in the Module section, since recent versions of xorg-server do not include the type1 font module (it has been completely replaced by freetype).
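
Putting the Module changes together, the section might end up looking like this (a sketch; your list of loaded modules may differ):

Section "Module"
    # Disabled for the proprietary driver:
    #Load "GLcore"
    #Load "dri"
    # Required by the NVIDIA driver:
    Load "glx"
EndSection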

Disable the DRI section completely:

#Section "DRI"
# Mode 0666
#EndSection

Change

Driver "nv"

or

Driver "vesa"

to

Driver "nvidia"

If it exists, disable the Chipset option (it is only needed for the nv driver):

#Chipset "generic"

This was for basic setup; if you need more tweaking options, have a look at /usr/share/doc/nvidia/README.
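
For reference, after these edits the Device section might look like this minimal sketch (the Identifier string is just an example name):

Section "Device"
    Identifier "Videocard0"
    Driver     "nvidia"
EndSection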

You can also run:

nvidia-xconfig

See installing and configuring xorg.

Enabling Composite in Xorg

Refer to the Composite wiki for detailed instructions.

Modifying Arch rc.conf file

Add nvidia to the MODULES array in /etc/rc.conf (no longer needed if you run Xorg and udev). This is needed for nvidia-71xx with kernel >= 2.6.13!
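
For example, the MODULES line in /etc/rc.conf might then read (keep whatever other modules you already load):

MODULES=(nvidia)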

Problems that might occur

Nvidia specific

Xorg7: please remove your old /usr/X11R6 directory; it can cause trouble during installation. Also make sure you have installed pkgconfig. The NVIDIA installer uses pkgconfig to determine where the modular Xorg components are installed.

If you experience slow 3D performance, have a look at /usr/lib/libGL.so.1, /usr/lib/libGL.so, and /usr/lib/libGLcore.so.1. Perhaps they are wrongly linked to Mesa or something else. Try reinstalling with pacman -S nvidia.
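
One quick way to check where those links actually point:

ls -l /usr/lib/libGL.so* /usr/lib/libGLcore.so*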

If you get this message when you try to start an OpenGL application (for example enemy-territory, or glxgears):

Error: Could not open /dev/nvidiactl because the permissions are too
restrictive. Please see the FREQUENTLY ASKED QUESTIONS 
section of /usr/share/doc/NVIDIA_GLX-1.0/README 
for steps to correct.

Add yourself to the video group with gpasswd -a yourusername video (don't forget to log out and back in, or run source /etc/profile).
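
After logging back in, you can confirm the new membership with:

groups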

Arch specific

x86_64 and lib32-* (stale files problem): if you have previously used the NVIDIA binary installer and then switch to the packages pacman provides, you will probably run into a problem where 32-bit proprietary GL applications no longer start (they segfault, and you have no idea what went wrong); examples are Google Earth and Wine with GL applications. The difficulty is that the installer stores its files in other places, and you might have stale files in

/usr/lib64
/usr/lib/tls 
/opt/lib32/usr/lib/tls

or similar. The solution is to do a

updatedb; locate libnvidia-tls

then identify the mismatched tls libraries and remove them (if they live in a /tls directory, remove that directory). Contrary to popular belief, the installer does not always find and remove them, and the problem can go unnoticed for a long time. If you cannot identify the mismatched versions, remove the nvidia-utils and lib32-nvidia-utils packages (with pacman -Rd), run the command above again, and you will see which files did not belong to the packages. Delete them and reinstall nvidia-utils and lib32-nvidia-utils.
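
Put together, the recovery sequence is roughly this (the leftover paths are whatever locate reports on your system):

pacman -Rd nvidia-utils lib32-nvidia-utils
updatedb; locate libnvidia-tls
# delete the leftover files/directories found above, then:
pacman -S nvidia-utils lib32-nvidia-utils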

GCC update: you must compile the module with the same compiler that was used to build the kernel, or loading it may fail. A simple pacman -S nvidia should do it; if not, stay with the old kernel and gcc and wait for a new kernel release.

Kernel update: Kernel updates will require reinstalling the driver.

Driver Config Tool

The new configuration tool for the NVIDIA drivers, nvidia-settings, is included. You don't have to use it; it's only an add-on!
For more information about its use, have a look at the following file:
/usr/share/doc/NVIDIA_GLX-1.0/nvidia-settings-user-guide.txt
Install gtk2 with "pacman -S gtk2" in order to use this tool.

NOTE: if you experience problems such as the X server crashing while running the tool, delete the .nvidia-settings-rc file in your home directory.

Nvidia-Settings autostart: you might like to apply the settings chosen in nvidia-settings at startup. First run nvidia-settings at least once so that the settings are saved; the settings file is stored in ~/.nvidia-settings-rc. Then add the following to the autostart method of your DE:

nvidia-settings --load-config-only

Known Issues

If you experience crashes, try disabling the RenderAccel option.
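
That is, in the Device section set (or simply remove the RenderAccel line):

Option "RenderAccel" "False"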

If the NVIDIA installer complains about different versions of gcc between the current one and the one used for compiling the kernel, see how to install the traditional way, but remember to export IGNORE_CC_MISMATCH=1.

If you have comments on the package, please post them here: http://bbs.archlinux.org/viewtopic.php?t=10692
If you have a problem with the drivers, have a look at the NVIDIA forum: http://www.nvnews.net/vbulletin/forumdisplay.php?s=&forumid=14
For a changelog, please look here: http://www.nvidia.com/object/linux_display_ia32_1.0-8762.html

Note: please don't change the above part without notifying me.

Bad performance after installing new nvidia-driver

If you experience a very low frame rate compared with an older driver, first check whether direct rendering is turned on. You can do so with:

glxinfo | grep direct

If you get direct rendering: No, then that is your problem. Next, check whether the client and the server use the same GLX vendor and version, with:

glxinfo | egrep "glx (vendor|version)"
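
On a healthy setup the output looks roughly like this (the exact version strings are just an example):

server glx vendor string: NVIDIA Corporation
server glx version string: 1.4
client glx vendor string: NVIDIA Corporation
client glx version string: 1.4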

If you see different vendors or versions for the client and server, run:

ln -fs /usr/lib/libGL.so.$VER /usr/X11R6/lib/libGL.so
ln -fs /usr/lib/libGL.so.$VER /usr/X11R6/lib/libGL.so.1
ln -fs /usr/lib/libGL.so.$VER /usr/lib/libGL.so.1.2

where $VER is the version of the nvidia package you are using. You can check it with nvidia-settings.
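
For example, if nvidia-settings reported driver version 180.22 (a hypothetical value), the last command would read:

ln -fs /usr/lib/libGL.so.180.22 /usr/lib/libGL.so.1.2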

That's all. Now restart your X server and you should have normal acceleration.


Tweaking NVIDIA drivers

Open /etc/X11/xorg.conf or /etc/X11/XFree86Config with your editor of choice and try the following options to improve performance. Not all options may work for your system; try them carefully and always back up your configuration file.

Disable NVIDIA Graphics Logo on startup

Under Device section add the "NoLogo" Option

Option "NoLogo" "True"

Enable hardware acceleration

Under Device section add the "RenderAccel" Option.

Option "RenderAccel" "True"

NOTE: RenderAccel is enabled by default since driver version 9746.

Override monitor detection

The "ConnectedMonitor" Option under Device section allows to override the monitor detection when X server starts. This may save a bunch of seconds at start up. The available options are: "CRT" (cathode ray tube), "DFP" (digital flat panel), or "TV" (television).

The following statement forces the NVIDIA driver to use DFP monitors:

Option "ConnectedMonitor" "DFP"

NOTE: use "CRT" for all analog 15 pin VGA connections (even if you have a flat panel). "DFP" is intended for DVI digital connections only!

Enable TripleBuffer

Enable the use of triple buffering by adding the "TripleBuffer" option under the Device section:

Option "TripleBuffer" "True"

Use this option if your GPU has plenty of RAM (128 MB or more), and combine it with "Sync to VBlank". You can enable sync to VBlank in nvidia-settings.
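
Sync to VBlank can reportedly also be toggled from the command line via the SyncToVBlank attribute; verify the attribute name on your driver version with nvidia-settings -q all first:

nvidia-settings -a SyncToVBlank=1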

Enable BackingStore

This option is used to enable the server's support for backing store, a mechanism by which pixel data for occluded window regions is remembered by the server thereby alleviating the need to send expose events to X clients when the data needs to be redisplayed. BackingStore is not bound to NVIDIA drivers but to X server itself. ATI users would benefit from this option as well.

Under Device section add:

Option "BackingStore" "True"

Template:Box Note

Use OS-level events

Taken from the NVIDIA driver README: "Use OS-level events to efficiently notify X when a client has performed direct rendering to a window that needs to be composited." This may help improve performance. Note that this option is currently incompatible with SLI and multi-GPU modes.

Under Device section add:

Option "DamageEvents" "True"

This option is enabled by default in newer drivers.

Enable power saving

... For a greener planet (not strictly related to the NVIDIA drivers). Under the Monitor section, add:

Option "DPMS" "True"

Force Powermizer performance level (for laptops)

In your xorg.conf, add the following to the "Device" section:

# force PowerMizer to a certain level at all times
# level 0x1 = highest
# level 0x2 = medium
# level 0x3 = lowest
Option "RegistryDwords" "PowerMizerLevelAC=0x3"
Option "RegistryDwords" "PowerMizerLevel=0x3"
  • note from brazzmonkey

On my laptop (featuring an NVIDIA GeForce Go 7600), I need to set

Option "RegistryDwords" "PerfLevelSrc=0x2222"

in /etc/X11/xorg.conf (in the "Device" or "Screen" section). Otherwise I get artefacts, or even display corruption and occasional lockups in KDE4 with OpenGL desktop effects.

  • NVIDIA™ driver for X.org: performance and power saving hints

Here is a page that explains these options quite well: http://tutanhamon.com.ua/technovodstvo/NVIDIA-UNIX-driver/

Let the GPU set its own performance level (based on temperature)

In your xorg.conf, add the following to the "Device" section:

Option "RegistryDwords" "PerfLevelSrc=0x3333"

Disable vblank interrupts (for laptops)

When running the interrupt detection utility powertop, you can see that the nvidia driver generates an interrupt for every vblank. To disable this, place in the Device section:

Option         "OnDemandVBlankInterrupts" "True"

This will reduce interrupts to about one or two per second.

Enable overclocking via nvidia-settings

To enable overclocking, place the following line in the Device section:

Option         "Coolbits" "1"

This enables on-the-fly overclocking by running nvidia-settings inside X.

Please note that overclocking may damage your hardware, and the authors of this page take no responsibility for damage to any equipment operated outside the specifications set by the manufacturer.

Further readings

Using TV-out on your NVIDIA card

A good article on the subject can be found at:

 http://en.wikibooks.org/wiki/NVidia/TV-OUT

Why is the refresh rate not reported correctly by utilities that use the XRandR X extension (e.g., the GNOME "Screen Resolution Preferences" panel, `xrandr -q`, etc)?

The XRandR X extension is not presently aware of multiple display devices on a single X screen; it only sees the MetaMode bounding box, which may contain one or more actual modes. This means that if multiple MetaModes have the same bounding box, XRandR will not be able to distinguish between them.

In order to support DynamicTwinView, the NVIDIA X driver must make each MetaMode appear to be unique to XRandR. Presently, the NVIDIA X driver accomplishes this by using the refresh rate as a unique identifier.

You can use `nvidia-settings -q RefreshRate` to query the actual refresh rate on each display device.

The XRandR extension is currently being redesigned by the X.Org community, so the refresh rate workaround may be removed at some point in the future.

This workaround can also be disabled by setting the "DynamicTwinView" X configuration option to FALSE, which will disable NV-CONTROL support for manipulating MetaModes, but will cause the XRandR and XF86VidMode visible refresh rate to be accurate.
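
In the Device section, that would be:

Option "DynamicTwinView" "False"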

How to install NVIDIA Driver with custom kernel

It's an advantage to know how the ABS system works; read some of the other wiki pages about it first.


We will create our own pacman package quickly by using ABS, which will build the module for the currently running kernel:

Make a temporary directory for creating our new package:

 mkdir -p /var/abs/local/

Make a copy of the nvidia package directory:

 cp -r /var/abs/extra/nvidia/ /var/abs/local/

Go into our temporary nvidia directory:

 cd /var/abs/local/nvidia

We need to edit two files, nvidia.install and PKGBUILD, so they contain the right kernel version variables; then we don't have to move the module from the stock kernel's /lib/modules/2.6.xx-ARCH directory.

You can get your kernel version and local version name if you type:

 uname -r
  • In nvidia.install, replace the KERNEL_VERSION="2.6.xx-ARCH" variable with your kernel version, such as KERNEL_VERSION="2.6.22.6" or KERNEL_VERSION="2.6.22.6-custom", depending on your kernel's version and local version text/number. Do this for every occurrence of the version number in the file (see the sketch after this list).
  • In PKGBUILD, change the _kernver='2.6.xx-ARCH' variable to match your kernel version, as above.
  • If you have more than one kernel coexisting in parallel (such as a custom kernel alongside the default -ARCH kernel), change the pkgname=nvidia variable in the PKGBUILD to a unique identifier, such as nvidia-2622 or nvidia-custom. This allows both kernels to use the nvidia module, since the custom module has a different package name and will not overwrite the original.
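
As a sketch, for a hypothetical 2.6.22.6-custom kernel the edited lines would end up as:

# in nvidia.install:
KERNEL_VERSION="2.6.22.6-custom"

# in PKGBUILD:
pkgname=nvidia-custom
_kernver='2.6.22.6-custom'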

Then do:

 makepkg -i -c

Now makepkg will automatically build the NVIDIA module for your custom kernel and clean up the leftover files from creating the package. Enjoy!