NVIDIA


How to install the NVIDIA driver with pacman

Info from Package Maintainer tpowa

This package is for people who run a stock Arch kernel! I only test with the 2.6 kernel and Xorg.

NOTE by someone else: we didn't forget people using the -beyond kernel; see the next paragraph.

Multiple kernel users: you need to install the matching nvidia package for each additional kernel!

Installing drivers

The packages are in the extra repository, so enable it for pacman. Exit the X server first, otherwise pacman cannot finish the installation and the driver will not work! As root, run:

pacman -Sy nvidia (for newer cards)
pacman -Sy nvidia-96xx or pacman -Sy nvidia-71xx (for older cards)

For -beyond users:

pacman -Sy nvidia-beyond
pacman -Sy nvidia-96xx-beyond or pacman -Sy nvidia-71xx-beyond (for older cards)

If you use the -ck, -suspend2 or -mm kernels, adjust the above command appropriately.
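
For example, for a -ck kernel the command would presumably be the following (the exact package name is an assumption here; check the repositories for the name matching your kernel):

pacman -Sy nvidia-ck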

See the README from nvidia for details on which card is supported by which driver.

Configuring X-Server

Edit /etc/X11/XF86Config or your /etc/X11/xorg.conf config file. In the Module section, disable the GLcore and dri modules.

Add to the Module section:

Load "glx"

Comment out the DRI section completely:

#Section "DRI"
# Mode 0666
#EndSection

Change Driver "nv" or Driver "vesa" to Driver "nvidia" If it exists disable Chipset option (only needed for nv driver).

This covers the basic setup; if you need more tweaking options, have a look at /usr/share/doc/NVIDIA_GLX-1.0/README.txt.

You can also run:

nvidia-xconfig

See installing and configuring xorg.

Modifying Arch rc.conf file

Add nvidia to the MODULES array in /etc/rc.conf (not needed anymore if you run Xorg and udev). It is still needed for nvidia-legacy and kernels >= 2.6.13!
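
The line in /etc/rc.conf would then look like this (the other entries stand for whatever modules are already in your array):

MODULES=(... some modules ... nvidia)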

Problems that might occur

Nvidia specific

Xorg7: please remove your old /usr/X11R6 directory; it can cause trouble during installation. Also make sure you have installed pkgconfig; the NVIDIA installer uses pkgconfig to determine where the modular Xorg components are installed.

If you experience slow 3D performance, have a look at /usr/lib/libGL.so.1, /usr/lib/libGL.so and /usr/lib/libGLcore.so.1. They may be wrongly linked to Mesa or something else. Try reinstalling with pacman -S nvidia.
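
To see where these libraries actually point and which package owns them, you can run something like:

ls -l /usr/lib/libGL.so /usr/lib/libGL.so.1 /usr/lib/libGLcore.so.1
pacman -Qo /usr/lib/libGL.so.1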

If you get this message when you try to start an OpenGL application (for example enemy-territory or glxgears):

Error: Could not open /dev/nvidiactl because the permissions are too
restrictive. Please see the FREQUENTLY ASKED QUESTIONS 
section of /usr/share/doc/NVIDIA_GLX-1.0/README 
for steps to correct.

Add yourself to the video group using gpasswd -a yourusername video (don't forget to log out and back in).
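
After logging back in, you can verify the membership with:

groups yourusername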

Arch specific

GCC update: you must compile the module with the same compiler that was used for the kernel, or it may fail. A simple pacman -S nvidia should do it; if not, stay with the old kernel and gcc and wait for a new kernel release.

Kernel update: Kernel updates will require reinstalling the driver. A workaround is available.

Driver Config Tool

The new configuration tool for the NVIDIA drivers, nvidia-settings, is included. You don't have to use it; it's only an add-on!
For more information about its use, have a look at the following file:
/usr/share/doc/NVIDIA_GLX-1.0/nvidia-settings-user-guide.txt
Please install gtk2 with "pacman -S gtk2" in order to use this tool.

NOTE: if running the tool causes problems such as crashing the X server, delete the .nvidia-settings-rc file in your home directory.

Known Issues

If you experience crashes, try disabling the RenderAccel option.
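
To disable it, set the option to "false" in the Device section of your xorg.conf:

   Option "RenderAccel" "false"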

If the NVIDIA installer complains that the current gcc version differs from the one used to compile the kernel, see how to install the traditional way below, but remember to export IGNORE_CC_MISMATCH=1.
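
For example (using the installer file name from the section below):

export IGNORE_CC_MISMATCH=1
sh /path/to/NVIDIA-Linux-x86-1.0-7167-pkg0.run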

If you have comments on the package, please post them here: http://bbs.archlinux.org/viewtopic.php?t=10692
If you have a problem with the drivers, have a look at the NVIDIA forum: http://www.nvnews.net/vbulletin/forumdisplay.php?s=&forumid=14
For a changelog, please look here: http://www.nvidia.com/object/linux_display_ia32_1.0-8762.html

Note: please don't change the above part without notifying me.

Bad performance after installing a new nvidia driver

If you experience a very slow frame rate compared with the older driver, first check whether you have direct rendering turned on. You can do it with:

glxinfo | grep direct

If you get direct rendering: No, then that's your problem. Next, check whether you have the same GLX vendor and version for the client and server:

glxinfo | egrep "glx (vendor|version)"
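
On a correctly set up system, both the client and the server lines report the NVIDIA GLX; illustrative output might look like:

server glx vendor string: NVIDIA Corporation
client glx vendor string: NVIDIA Corporation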

If you see different vendors or versions for the client and server, run this:

ln -fs /usr/lib/libGL.so.$VER /usr/X11R6/lib/libGL.so
ln -fs /usr/lib/libGL.so.$VER /usr/X11R6/lib/libGL.so.1
ln -fs /usr/lib/libGL.so.$VER /usr/lib/libGL.so.1.2

Here $VER is the version of the nvidia package that you are using. You can check it with nvidia-settings.
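
You can also read the version from the installed package or from the library file name itself, for example:

pacman -Q nvidia
ls /usr/lib/libGL.so.*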

That's all. Now restart your X server and you should have normal acceleration.

How to install the NVIDIA driver the traditional way

The file name will look something like this: NVIDIA-Linux-x86-1.0-7167-pkg0.run

The kernel steps (the following four steps) can be left out if you use at least kernel 2.6.5, because the needed includes are now in the kernel package.

  • Download the kernel source for the kernel version you are using

uname -r will give you the kernel version

  • Move the current incomplete kernel source tree to 2.x.x.old:
mv /usr/src/2.x.x /usr/src/2.x.x.old
  • Uncompress and unpack your source code in /usr/src:
mv /path/to/linux-2.x.x.tar.bz2 /usr/src
cd /usr/src
tar --bzip2 -xvf linux-2.x.x.tar.bz2
  • Copy the old include directory and .config file into new source tree:
cp -rp linux-2.x.x.old/include/ linux-2.x.x/include/
cp linux-2.x.x.old/.config linux-2.x.x/.config
  • Go to a non-graphical session
    • Use Control-Alt-F5 (or whatever F key you want)
    • Login as root
    • Go to runlevel 3
init 3
  • Run NVIDIA installer
sh /path/to/NVIDIA-Linux-x86-1.0-5336-pkg0.run

You will be asked to accept their license and hit OK a couple of times on informational screens; the driver will then be built and installed.

  • Edit XFree86Config file
    • Use the editor of choice to open /etc/X11/XFree86Config and go to the Device section
    • Change your driver from current (probably nv or vesa) to nvidia:
      • Driver "nv" to Driver "nvidia"
    • Uncomment the glx load line
      • #Load "glx" to Load "glx"
    • Comment out or delete the Chipset line if it exists
  • Edit modules loaded on boot
    • Open /etc/rc.conf in editor
    • Add nvidia to modules section
MODULES=(... some modules ... nvidia)
  • Reboot the machine and enjoy 3D acceleration...and a tainted kernel :)


How to disable NVIDIA Graphics Logo on startup

  • Edit the xorg.conf file
    • Go to the Device section
    • Add the "NoLogo" Option
Option "NoLogo" "true"

Using TV-out on your NVIDIA card

A good article on the subject can be found at:

 http://en.wikibooks.org/wiki/NVidia/TV-OUT


Example: Clone of desktop to TV (CRT).

In this example we have connected the TV-OUT of the video card to the S-Video IN of our TV (a CRT solution). "PAL-B" is used below, as this is the standard for Australia. (See the above link for other regions.)

Use nano or vim to open the /etc/X11/xorg.conf file and scroll down until you find the "Device" section.

It will be something like:


   Section "Device"
       Identifier   "Card0"
       Driver       "nvidia"
       VendorName   "ALL"
       BoardName    "ALL"
   EndSection


Add the following:


   Option "TwinView"                  "true"
   Option "SecondMonitorHorizSync"    "30-50"
   Option "SecondMonitorVertRefresh"  "60"
   Option "TwinViewOrientation"       "Clone"
   Option "TVOutFormat"               "SVIDEO"
   Option "TVStandard"                "PAL-B"


So it becomes:


   Section "Device"
       Identifier   "Card0"
       Driver       "nvidia"
       VendorName   "ALL"
       BoardName    "ALL"
       Option       "TwinView"                  "true"
       Option       "SecondMonitorHorizSync"    "30-50"
       Option       "SecondMonitorVertRefresh"  "60"
       Option       "TwinViewOrientation"       "Clone"
       Option       "TVOutFormat"               "SVIDEO"
       Option       "TVStandard"                "PAL-B"
   EndSection

Save the changes and restart Xorg (Ctrl-Alt-Backspace).

This approach has been confirmed to work with a GeForce2 MX400 (with a Conexant TV-encoder chip), a GeForce 5900XT, a GeForce 7300GT, and a GeForce 7600GS. (The last three have onboard TV encoders on the GPU.)

SPECIAL NOTE: there is a problem with this approach. The TV-Out will suffer from slight "tearing" when viewing videos on it; the amount of tearing varies from one setup to another and with the video being watched. This is because the "Overlay" (which stops the tearing) only works on one display, "screen 0", and this could be a limitation of X. It is suggested that you modify xorg.conf so that the TV-Out is "screen 0". (The result is that the computer monitor will then suffer from tearing instead of the TV-Out.)

Another option is to do away with the computer monitor completely and use the TV as the display. This can be done by adding:


   Option       "ConnectedMonitor"   "TV"


However, this approach may cause inconvenience under some situations.

NOTE: if you still have trouble with "tearing" in videos (below HD quality), consider using the open driver nv instead. Bear in mind that you have to manually add additional entries for the TV-Out to the xorg.conf file for this to work. Do NOT use nvtv if you have a modern GeForce card; the program won't work, as it is specifically for TNT and early GeForce cards!

Why is the refresh rate not reported correctly by utilities that use the XRandR X extension (e.g., the GNOME "Screen Resolution Preferences" panel, `xrandr -q`, etc)?

The XRandR X extension is not presently aware of multiple display devices on a single X screen; it only sees the MetaMode bounding box, which may contain one or more actual modes. This means that if multiple MetaModes have the same bounding box, XRandR will not be able to distinguish between them.

In order to support DynamicTwinView, the NVIDIA X driver must make each MetaMode appear to be unique to XRandR. Presently, the NVIDIA X driver accomplishes this by using the refresh rate as a unique identifier.

You can use `nvidia-settings -q RefreshRate` to query the actual refresh rate on each display device.

The XRandR extension is currently being redesigned by the X.Org community, so the refresh rate workaround may be removed at some point in the future.

This workaround can also be disabled by setting the "DynamicTwinView" X configuration option to FALSE, which will disable NV-CONTROL support for manipulating MetaModes, but will cause the XRandR and XF86VidMode visible refresh rate to be accurate.
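
To do that, add the option to the Device section of xorg.conf:

   Option "DynamicTwinView" "false"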