 
[[Category:Graphics]]
[[Category:X server]]
[[cs:NVIDIA]]
[[de:Nvidia]]
[[tr:Nvidia]]
[[zh-CN:NVIDIA]]
{{Related articles start}}
{{Related|NVIDIA/Tips and tricks}}
{{Related|NVIDIA/Troubleshooting}}
{{Related|Nouveau}}
{{Related|Bumblebee}}
{{Related|NVIDIA Optimus}}
{{Related|Xorg}}
{{Related articles end}}

This article covers installing and configuring [http://www.nvidia.com NVIDIA]'s ''proprietary'' graphics card driver. For information about the open-source drivers, see [[Nouveau]]. If you have a laptop with hybrid Intel/NVIDIA graphics, see [[NVIDIA Optimus]] instead.
  
== Installation ==

{{Warning|Avoid installing the NVIDIA driver through the package provided by the NVIDIA website. Installation through [[pacman]] allows upgrading the driver together with the rest of the system.}}

These instructions are for those using the stock {{Pkg|linux}} or {{Pkg|linux-lts}} packages. For custom kernel setup, skip to the [[#Custom kernel|next]] subsection.
  
1. If you do not know what graphics card you have, find out by issuing:

:{{bc|<nowiki>$ lspci -k | grep -A 2 -E "(VGA|3D)"</nowiki>}}

2. Determine the necessary driver version for your card by:

:* finding the code name (e.g. NV50, NVC0, etc.) on [http://nouveau.freedesktop.org/wiki/CodeNames nouveau wiki's code names page],
:* looking up the name in NVIDIA's [http://www.nvidia.com/object/IO_32667.html legacy card list]: if your card is not there, you can use the latest driver,
:* visiting NVIDIA's [http://www.nvidia.com/Download/index.aspx driver download site].

3. Install the appropriate driver for your card:

:* For GeForce 400 series cards and newer [NVCx and newer], [[install]] the {{Pkg|nvidia}} or {{Pkg|nvidia-lts}} package along with {{Pkg|nvidia-libgl}}. If these packages do not work, {{AUR|nvidia-beta}} may have a newer driver version that offers support.
:* For GeForce 8000/9000, ION and 100-300 series cards [NV5x, NV8x, NV9x and NVAx] from around 2006-2010, [[install]] the {{Pkg|nvidia-340xx}} or {{Pkg|nvidia-340xx-lts}} package along with {{Pkg|nvidia-340xx-libgl}}.
:* For GeForce 6000/7000 series cards [NV4x and NV6x] from around 2004-2006, [[install]] the {{Pkg|nvidia-304xx}} or {{Pkg|nvidia-304xx-lts}} package along with {{Pkg|nvidia-304xx-libgl}}.
:* For even older cards, have a look at [[#Unsupported drivers]].

4. If you are on 64-bit and also need 32-bit OpenGL support, you must also install the equivalent ''lib32'' package from the [[multilib]] repository (e.g. {{Pkg|lib32-nvidia-libgl}}, {{Pkg|lib32-nvidia-340xx-libgl}} or {{Pkg|lib32-nvidia-304xx-libgl}}).

5. Reboot. The {{Pkg|nvidia}} package contains a file which blacklists the ''nouveau'' module, so rebooting is necessary.

Once the driver has been installed, continue to [[#Configuration]].
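
For example, on a 64-bit system with a recent GeForce card, the stock kernel and the [[multilib]] repository enabled, the steps above typically boil down to a single command (a sketch; substitute the packages matching your card from the list above):

 # pacman -S nvidia nvidia-libgl lib32-nvidia-libgl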

=== Unsupported drivers ===

If you have a GeForce 5 FX series card or older, Nvidia no longer supports drivers for your card. This means that these drivers [http://nvidia.custhelp.com/app/answers/detail/a_id/3142/ do not support the current Xorg version]. It thus might be easier if you use the [[nouveau]] driver, which supports the old cards with the current Xorg.

However, Nvidia's legacy drivers are still available and might provide better 3D performance/stability if you are willing to downgrade Xorg:

* For GeForce 5 FX series cards [NV30-NV36], install the {{AUR|nvidia-173xx-dkms}} package. The last supported Xorg version is 1.15.
* For GeForce 2/3/4 MX/Ti series cards [NV11, NV17-NV28], install the {{AUR|nvidia-96xx-dkms}} package. The last supported Xorg version is 1.12.

{{Tip|The legacy nvidia-96xx-dkms and nvidia-173xx-dkms drivers can also be installed from the unofficial [http://pkgbuild.com/~bgyorgy/city.html <nowiki>[city]</nowiki> repository]. (It is strongly advised that you do not skip any dependency restrictions when installing from there.)}}
  
=== Custom kernel ===

If you are using a custom kernel, compilation of the Nvidia kernel modules can be automated with [[DKMS]].

Install the {{Pkg|nvidia-dkms}} package (or a specific branch such as {{Pkg|nvidia-340xx-dkms}}). The Nvidia module will be rebuilt after every Nvidia or kernel update thanks to the DKMS [[Pacman#Hooks|pacman hook]].
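
As a sketch, for the current driver branch this amounts to installing the DKMS package together with the headers of every kernel the module should be built for (e.g. {{Pkg|linux-headers}} for the stock kernel, or the headers package of your custom kernel):

 # pacman -S nvidia-dkms linux-headers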

=== Pure Video HD ===

At least a video card with second generation [[wikipedia:Nvidia PureVideo#Table of GPUs containing a PureVideo SIP block|PureVideo HD]] is required for [[hardware video acceleration]] using VDPAU.
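
Once the driver is installed, VDPAU support can be verified by querying the available decoding profiles, for example with the {{Pkg|vdpauinfo}} tool (an assumption here: it is a separate package and must be installed first):

 $ vdpauinfo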
  
=== DRM kernel mode setting ===

{{Note|1=The NVIDIA driver does '''not''' provide an {{ic|fbdev}} driver for the high-resolution console for the kernel compiled-in {{ic|vesafb}} module. However, the kernel compiled-in {{ic|efifb}} module supports a high-resolution nvidia console on EFI systems.[http://forums.fedoraforum.org/showthread.php?t=306271]}}

{{Pkg|nvidia}} 364.16 adds support for DRM [[kernel mode setting]]. To enable this feature, add the {{ic|1=nvidia-drm.modeset=1}} [[kernel parameter]], and add {{ic|nvidia}}, {{ic|nvidia_modeset}}, {{ic|nvidia_uvm}} and {{ic|nvidia_drm}} to your [[initramfs#MODULES]].
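
A minimal sketch of those two changes, assuming the stock kernel and its default preset (append the kernel parameter in your [[bootloader]] configuration, then regenerate the initramfs):

{{hc|/etc/mkinitcpio.conf|2=
MODULES="nvidia nvidia_modeset nvidia_uvm nvidia_drm"
}}

 # mkinitcpio -p linux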
  

{{Warning|Do not forget to run mkinitcpio every time you update the driver.}}

==== Pacman hook ====

To avoid the possibility of forgetting to update your initramfs after an NVIDIA driver upgrade, you can use a pacman hook like the following:

{{hc|/etc/pacman.d/hooks/nvidia.hook|2=[Trigger]
Operation=Install
Operation=Upgrade
Operation=Remove
Type=Package
Target=nvidia

[Action]
Depends=mkinitcpio
When=PostTransaction
Exec=/usr/bin/mkinitcpio -p linux}}
  
=== Hardware accelerated video decoding with XvMC ===

Accelerated decoding of MPEG-1 and MPEG-2 videos via [[XvMC]] is supported on GeForce4, GeForce 5 FX, GeForce 6 and GeForce 7 series cards. See [[XvMC]] for details.
Section "Device"
+
  Identifier    "Device0"
+
  Driver        "nvidia"
+
  VendorName    "NVIDIA Corporation"
+
EndSection
+
}}
+
  
{{Tip|If upgrading from nouveau make sure to remove "{{ic|nouveau}}" from {{ic|/etc/mkinitcpio.conf}}. See [[NVIDIA#Switching between nvidia and nouveau drivers]], if switching between the open and proprietary drivers often.}}
+
=== Switching between NVIDIA and nouveau drivers ===
  
=== Multiple monitors ===
+
{{Deletion|This script literally just installs NVIDIA and uninstalls nouveau or vice versa. Also it doesn't mention the blacklisting of the nouveau module.}}
:''See [[Multihead]] for more general information''
+
  
To activate dual screen support, you just need to edit the {{ic|/etc/X11/xorg.conf.d/10-monitor.conf}} file which you made before.
+
If you need to switch between drivers, you may use the following script, run as root (say yes to all confirmations):
  
Per each physical monitor, add one Monitor, Device, and Screen Section entry, and then a ServerLayout section to manage it. Be advised that when Xinerama is enabled, the NVIDIA proprietary driver automatically disables compositing. If you desire compositing, you should comment out the {{ic|Xinerama}} line in "{{ic|ServerLayout}}" and use TwinView (see below) instead.
+
{{bc|1=<nowiki>
 +
#!/bin/bash
 +
BRANCH= # Enter a branch if needed, i.e. -340xx or -304xx
 +
NVIDIA=nvidia${BRANCH} # If no branch entered above this would be "nvidia"
 +
NOUVEAU=xf86-video-nouveau
  
{{hc|/etc/X11/xorg.conf.d/10-monitor.conf|
+
# Replace -R with -Rs to if you want to remove the unneeded dependencies
Section "ServerLayout"
+
if [ $(pacman -Qqs ^mesa-libgl$) ]; then
     Identifier    "DualSreen"
+
     pacman -S $NVIDIA ${NVIDIA}-libgl # Add lib32-${NVIDIA}-libgl and ${NVIDIA}-lts if needed
     Screen      0 "Screen0"
+
     # pacman -R $NOUVEAU
     Screen      1 "Screen1" RightOf "Screen0" #Screen1 at the right of Screen0
+
elif [ $(pacman -Qqs ^${NVIDIA}$) ]; then
     Option        "Xinerama" "1" #To move windows between screens
+
     pacman -S --needed $NOUVEAU mesa-libgl # Add lib32-mesa-libgl if needed
EndSection
+
     pacman -R $NVIDIA # Add ${NVIDIA}-lts if needed
 +
fi
 +
</nowiki>}}
  
Section "Monitor"
+
== Configuration ==
    Identifier    "Monitor0"
+
    Option        "Enable" "true"
+
EndSection
+
  
Section "Monitor"
+
{{Out of date|nvidia-xconfig should be avoided in 2016, and manual configuration isn't needed in most cases. Neither is the automatic configuration with nvidia-xconfig.}}
    Identifier    "Monitor1"
+
    Option        "Enable" "true"
+
EndSection
+
  
Section "Device"
+
It is possible that after installing the driver it may not be needed to create an Xorg server configuration file. You can run [[Xorg#Running|a test]] to see if the Xorg server will function correctly without a configuration file. However, it may be required to create a configuration file (prefer {{ic|/etc/X11/xorg.conf.d/20-nvidia.conf}} over {{ic|/etc/X11/xorg.conf}}) in order to adjust various settings. This configuration can be generated by the NVIDIA Xorg configuration tool, or it can be created manually. If created manually, it can be a minimal configuration (in the sense that it will only pass the basic options to the [[Xorg]] server), or it can include a number of settings that can bypass Xorg's auto-discovered or pre-configured options.
    Identifier    "Device0"
+
    Driver        "nvidia"
+
    Screen        0
+
EndSection
+
  
Section "Device"
+
{{Note|For maunal configuration see [[NVIDIA/Tips and tricks#Manual configuration]].}}
    Identifier    "Device1"
+
    Driver        "nvidia"
+
    Screen        1
+
EndSection
+
  
Section "Screen"
+
=== Minimal configuration ===

A basic configuration block in {{ic|20-nvidia.conf}} (or deprecated in {{ic|xorg.conf}}) would look like this:

{{hc|/etc/X11/xorg.conf.d/20-nvidia.conf|
Section "Device"
        Identifier "Nvidia Card"
        Driver "nvidia"
        VendorName "NVIDIA Corporation"
        Option "NoLogo" "true"
        #Option "UseEDID" "false"
        #Option "ConnectedMonitor" "DFP"
        # ...
EndSection
}}
  
{{Tip|If upgrading from nouveau make sure to remove {{ic|nouveau}} from {{ic|/etc/mkinitcpio.conf}}. See [[#Switching between NVIDIA and nouveau drivers]], if switching between the open and proprietary drivers often.}}

=== Automatic configuration ===

The NVIDIA package includes an automatic configuration tool to create an Xorg server configuration file ({{ic|xorg.conf}}) and can be run by:

 # nvidia-xconfig

This command will auto-detect and create (or edit, if already present) the {{ic|/etc/X11/xorg.conf}} configuration according to present hardware.

If there are instances of DRI, ensure they are commented out:

 #    Load        "dri"

Double check your {{ic|/etc/X11/xorg.conf}} to make sure your default depth, horizontal sync, vertical refresh, and resolutions are acceptable.

{{Warning|That may still not work properly with Xorg-server 1.8.}}

=== NVIDIA Settings ===

The {{Pkg|nvidia-settings}} tool lets you configure many options using either CLI or GUI. Running {{ic|nvidia-settings}} without any options launches the GUI; for CLI options see {{ic|nvidia-settings(1)}}.

You can run the GUI as a normal user and save the settings to {{ic|~/.nvidia-settings-rc}}. Then you can load the settings using {{ic|$ nvidia-settings --load-config-only}} (for example in your [[xinitrc]]). Alternatively, run {{ic|nvidia-settings}} as root, and then save the configuration to {{ic|/etc/X11/xorg.conf.d/}} as usual.
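
For instance, when X is started with [[xinitrc]], loading the saved settings can be a single line placed before the window manager is started (a sketch; the rest of the file is whatever you already have there):

{{hc|~/.xinitrc|2=<nowiki>
nvidia-settings --load-config-only &
# ... exec your window manager below
</nowiki>}}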
  
{{Tip|If your X server is crashing on startup, it may be because the GUI-generated settings are corrupt. Try deleting the generated file and starting from scratch.}}

=== Multiple monitors ===

See [[Multihead]] for more general information.

==== Using NVIDIA Settings ====

The [[#NVIDIA Settings|nvidia-settings]] tool can configure multiple monitors.

For CLI configuration, first get the {{ic|CurrentMetaMode}} by running:

{{hc|$ nvidia-settings -q CurrentMetaMode|2=
Attribute 'CurrentMetaMode' (hostname:0.0): id=50, switchable=no, source=nv-control :: DPY-1: 2880x1620 @2880x1620 +0+0 {ViewPortIn=2880x1620, ViewPortOut=2880x1620+0+0}
}}

Save everything after the {{ic|::}} to the end of the attribute (in this case: {{ic|1=DPY-1: 2880x1620 @2880x1620 +0+0 {ViewPortIn=2880x1620, ViewPortOut=2880x1620+0+0&#125;}}) and use it to reconfigure your displays with {{ic|1=nvidia-settings --assign "CurrentMetaMode=''your_meta_mode''"}}.
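
For example, with the {{ic|CurrentMetaMode}} queried above, the reconfiguration call would be:

 $ nvidia-settings --assign "CurrentMetaMode=DPY-1: 2880x1620 @2880x1620 +0+0 {ViewPortIn=2880x1620, ViewPortOut=2880x1620+0+0}"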
  
{{Tip|You can create shell aliases for the different monitor and resolution configurations you use.}}

==== ConnectedMonitor ====

If the driver does not properly detect a second monitor, you can force it to do so with ConnectedMonitor in {{ic|/etc/X11/xorg.conf}}.

The duplicated device with {{ic|Screen}} is how you get X to use two monitors on one card without {{ic|TwinView}}. Note that {{ic|nvidia-settings}} will strip out any {{ic|ConnectedMonitor}} options you have added.

==== TwinView ====

You want only one big screen instead of two. Set the {{ic|TwinView}} argument to {{ic|1}}. This option should be used if you desire compositing. TwinView only works on a per-card basis, when all participating monitors are connected to the same card.

 Option "TwinView" "1"

Example configuration:

{{hc|/etc/X11/xorg.conf.d/10-monitor.conf|
Section "ServerLayout"
    Identifier     "TwinLayout"
    Screen         0 "metaScreen" 0 0
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    Option         "Enable" "true"
EndSection

Section "Monitor"
    Identifier     "Monitor1"
    Option         "Enable" "true"
EndSection

Section "Device"
    Identifier     "Card0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"

    # Refer to the link below for more information on each of the following options.
    Option         "HorizSync"           "DFP-0: 28-33; DFP-1: 28-33"
    Option         "VertRefresh"         "DFP-0: 43-73; DFP-1: 43-73"
    Option         "MetaModes"           "1920x1080, 1920x1080"
    Option         "ConnectedMonitor"    "DFP-0, DFP-1"
    Option         "MetaModeOrientation" "DFP-1 LeftOf DFP-0"
EndSection

Section "Screen"
    Identifier     "metaScreen"
    Device         "Card0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "TwinView" "True"
    SubSection "Display"
        Modes      "1920x1080"
    EndSubSection
EndSection
}}

[ftp://download.nvidia.com/XFree86/Linux-x86/355.11/README/configtwinview.html Device option information].

Alternatively, the NVIDIA package can configure TwinView automatically for all monitors connected to a single video card:

 # nvidia-xconfig --twinview

If you have multiple cards that are SLI capable, it is possible to run more than one monitor attached to separate cards (for example: two cards in SLI with one monitor attached to each). The "MetaModes" option in conjunction with SLI Mosaic mode enables this. Below is a configuration which works for the aforementioned example and runs [[GNOME]] flawlessly.

{{hc|/etc/X11/xorg.conf.d/10-monitor.conf|
Section "Device"
        Identifier      "Card A"
        Driver          "nvidia"
        BusID           "PCI:1:00:0"
EndSection

Section "Device"
        Identifier      "Card B"
        Driver          "nvidia"
        BusID           "PCI:2:00:0"
EndSection

Section "Monitor"
        Identifier      "Right Monitor"
EndSection

Section "Monitor"
        Identifier      "Left Monitor"
EndSection

Section "Screen"
        Identifier      "Right Screen"
        Device          "Card A"
        Monitor         "Right Monitor"
        DefaultDepth    24
        Option          "SLI" "Mosaic"
        Option          "Stereo" "0"
        Option          "BaseMosaic" "True"
        Option          "MetaModes" "GPU-0.DFP-0: 1920x1200+4480+0, GPU-1.DFP-0: 1920x1200+0+0"
        SubSection      "Display"
                        Depth       24
        EndSubSection
EndSection

Section "Screen"
        Identifier      "Left Screen"
        Device          "Card B"
        Monitor         "Left Monitor"
        DefaultDepth    24
        Option          "SLI" "Mosaic"
        Option          "Stereo" "0"
        Option          "BaseMosaic" "True"
        Option          "MetaModes" "GPU-0.DFP-0: 1920x1200+4480+0, GPU-1.DFP-0: 1920x1200+0+0"
        SubSection      "Display"
                        Depth       24
        EndSubSection
EndSection

Section "ServerLayout"
        Identifier      "Default"
        Screen 0        "Right Screen" 0 0
        Option          "Xinerama" "0"
EndSection
}}

===== Manual CLI configuration with xrandr =====

{{Accuracy|Do these commands set up the monitors in ''TwinView'' mode?}}

If the latest solutions do not work for you, you can use your window manager's ''autostart'' implementation with {{Pkg|xorg-xrandr}}.

Some {{ic|xrandr}} examples could be:

 xrandr --output DVI-I-0 --auto --primary --left-of DVI-I-1

or:

 xrandr --output DVI-I-1 --pos 1440x0 --mode 1440x900 --rate 75.0

When:

* {{ic|--output}} is used to indicate the "monitor" to which the options are set.
* {{ic|DVI-I-1}} is the name of the second monitor.
* {{ic|--pos}} is the position of the second monitor relative to the first.
* {{ic|--mode}} is the resolution of the second monitor.
* {{ic|--rate}} is the refresh rate (in Hz).

You must adapt the {{ic|xrandr}} options with the help of the output of the command {{ic|xrandr}} run alone in a terminal.

===== Vertical sync using TwinView =====

If you are using TwinView and vertical sync (the "Sync to VBlank" option in '''nvidia-settings'''), you will notice that only one screen is being properly synced, unless you have two identical monitors. Although '''nvidia-settings''' does offer an option to change which screen is being synced (the "Sync to this display device" option), this does not always work. A solution is to add the following environment variables at startup, for example by appending them to {{ic|/etc/profile}}:

 export __GL_SYNC_TO_VBLANK=1
 export __GL_SYNC_DISPLAY_DEVICE=DFP-0
 export __VDPAU_NVIDIA_SYNC_DISPLAY_DEVICE=DFP-0

You can change {{ic|DFP-0}} to your preferred screen ({{ic|DFP-0}} is the DVI port and {{ic|CRT-0}} is the VGA port). You can find the identifier for your display from '''nvidia-settings''' in the "X Server XVideoSettings" section.

===== Gaming using TwinView =====

In case you want to play fullscreen games when using TwinView, you will notice that games recognize the two screens as being one big screen. While this is technically correct (the virtual X screen really is the size of your screens combined), you probably do not want to play on both screens at the same time.

To correct this behavior for SDL, try:

 export SDL_VIDEO_FULLSCREEN_HEAD=1

For OpenGL, add the appropriate Metamodes to your xorg.conf in section {{ic|Device}} and restart X:

 Option "Metamodes" "1680x1050,1680x1050; 1280x1024,1280x1024; 1680x1050,NULL; 1280x1024,NULL;"

Another method that may either work alone or in conjunction with those mentioned above is [[Gaming#Starting_games_in_a_separate_X_server|starting games in a separate X server]].

==== Mosaic mode ====

Mosaic mode is the only way to use more than 2 monitors across multiple graphics cards with compositing. Your window manager may or may not recognize the distinction between each monitor.

===== Base Mosaic =====

Base Mosaic mode works on any set of GeForce 8000 series or higher GPUs. It cannot be enabled from within the nvidia-settings GUI. You must either use the {{ic|nvidia-xconfig}} command line program or edit {{ic|xorg.conf}} by hand. Metamodes must be specified. The following is an example for four DFPs in a 2x2 configuration, each running at 1920x1024, with two DFPs connected to two cards:

 $ nvidia-xconfig --base-mosaic --metamodes="GPU-0.DFP-0: 1920x1024+0+0, GPU-0.DFP-1: 1920x1024+1920+0, GPU-1.DFP-0: 1920x1024+0+1024, GPU-1.DFP-1: 1920x1024+1920+1024"

{{Note|While the documentation lists a 2x2 configuration of monitors, Nvidia has reduced that ability to just 3 monitors in Base Mosaic mode as of driver version 304. More monitors are available with a Quadro card, but with standard consumer cards, it is limited to three. The explanation given for this reduction is "Feature parity with the Windows driver". As of September 2014, Windows has no restriction on the number of monitors, even on the same driver version. This is not a bug, this is entirely by design.}}

===== SLI Mosaic =====

If you have an SLI configuration and each GPU is a Quadro FX 5800, Quadro Fermi or newer, then you can use SLI Mosaic mode. It can be enabled from within the nvidia-settings GUI or from the command line with:

 $ nvidia-xconfig --sli=Mosaic --metamodes="GPU-0.DFP-0: 1920x1024+0+0, GPU-0.DFP-1: 1920x1024+1920+0, GPU-1.DFP-0: 1920x1024+0+1024, GPU-1.DFP-1: 1920x1024+1920+1024"

== Tweaking ==

=== GUI: nvidia-settings ===

The NVIDIA package includes the {{ic|nvidia-settings}} program that allows adjustment of several additional settings; see [[#NVIDIA Settings]] for loading and saving them.

For a dramatic 2D graphics performance increase in pixmap-intensive applications, e.g. Firefox, set the {{ic|InitialPixmapPlacement}} parameter to 2:

 $ nvidia-settings -a InitialPixmapPlacement=2

This is documented in the [http://cgit.freedesktop.org/~aplattner/nvidia-settings/tree/src/libXNVCtrl/NVCtrl.h?id=b27db3d10d58b821e87fbe3f46166e02dc589855#n2797 nvidia-settings source code]. For this setting to persist, this command needs to be run on every startup or added to your {{ic|~/.nvidia-settings-rc}}.

{{Tip|On rare occasions the {{ic|~/.nvidia-settings-rc}} may become corrupt. If this happens, the Xorg server may crash and the file will have to be deleted to fix the issue.}}

=== Enabling MSI (Message Signaled Interrupts) ===

By default, the graphics card uses a shared interrupt system. To give a small performance boost, edit {{ic|/etc/modprobe.d/modprobe.conf}} and add:

 options nvidia NVreg_EnableMSI=1

Be warned, as this has been known to damage some systems running older hardware!

To confirm, run:

 # cat /proc/interrupts | grep nvidia
  43:         0         49      4199      86318  PCI-MSI-edge      nvidia

=== Advanced: 20-nvidia.conf ===

Edit {{ic|/etc/X11/xorg.conf.d/20-nvidia.conf}}, and add the option to the correct section. The Xorg server will need to be restarted before any changes are applied.

* See the [http://http.download.nvidia.com/XFree86/Linux-x86/304.51/README/ NVIDIA Accelerated Linux Graphics Driver README and Installation Guide] for additional details and options.

==== Enabling desktop composition ====

As of NVIDIA driver version 180.44, support for GLX with the Damage and Composite X extensions is enabled by default. Refer to [[Xorg#Composite]] for detailed instructions.

==== Disabling the logo on startup ====

Add the {{ic|"NoLogo"}} option under section {{ic|Device}}:

 Option "NoLogo" "1"

==== Enabling hardware acceleration ====

{{Note|RenderAccel is enabled by default since driver version 97.46.xx.}}

Add the {{ic|"RenderAccel"}} option under section {{ic|Device}}:

 Option "RenderAccel" "1"

==== Overriding monitor detection ====

The {{ic|"ConnectedMonitor"}} option under section {{ic|Device}} allows overriding monitor detection when the X server starts, which may save a significant amount of time at start up. The available options are: {{ic|"CRT"}} for analog connections, {{ic|"DFP"}} for digital monitors and {{ic|"TV"}} for televisions.

The following statement forces the NVIDIA driver to bypass startup checks and recognize the monitor as DFP:

 Option "ConnectedMonitor" "DFP"

{{Note|Use "CRT" for all analog 15-pin VGA connections, even if the display is a flat panel. "DFP" is intended for DVI digital connections only.}}

==== Enabling triple buffering ====

Enable the use of triple buffering by adding the {{ic|"TripleBuffer"}} option under section {{ic|Device}}:

 Option "TripleBuffer" "1"

Use this option if the graphics card has plenty of RAM (equal to or greater than 128MB). The setting only takes effect when syncing to vblank is enabled, one of the options featured in nvidia-settings.

{{Note|This option may introduce full-screen tearing and reduce performance. As of the R300 drivers, vblank is enabled by default.}}

==== Using OS-level events ====

Taken from the NVIDIA driver's [http://http.download.nvidia.com/XFree86/Linux-x86/304.51/README/xconfigoptions.html README] file: ''"[...] Use OS-level events to efficiently notify X when a client has performed direct rendering to a window that needs to be composited."'' It may help improve performance, but it is currently incompatible with SLI and Multi-GPU modes.

Add under section {{ic|Device}}:

 Option "DamageEvents" "1"

{{Note|This option is enabled by default in newer driver versions.}}

==== Enabling power saving ====

Add under section {{ic|Monitor}}:

 Option "DPMS" "1"

==== Enabling Brightness Control ====

Add under section {{ic|Device}}:

 Option "RegistryDwords" "EnableBrightnessControl=1"

==== Enabling SLI ====

{{Warning|As of May 7, 2011, you may experience sluggish video performance in GNOME 3 after enabling SLI.}}

Taken from the NVIDIA driver's [http://http.download.nvidia.com/XFree86/Linux-x86/304.51/README/xconfigoptions.html README] appendix: ''This option controls the configuration of SLI rendering in supported configurations.'' A "supported configuration" is a computer equipped with an SLI-Certified motherboard and 2 or 3 SLI-Certified GeForce GPUs. See NVIDIA's [http://www.slizone.com/page/home.html SLI Zone] for more information.

Find the first GPU's PCI Bus ID using {{ic|lspci}}:

 $ lspci | grep VGA

This will return something similar to:

 03:00.0 VGA compatible controller: nVidia Corporation G92 [GeForce 8800 GTS 512] (rev a2)
 05:00.0 VGA compatible controller: nVidia Corporation G92 [GeForce 8800 GTS 512] (rev a2)

Add the BusID (3 in the previous example) under section {{ic|Device}}:

 BusID "PCI:3:0:0"

{{Note|The format is important. The BusID value must be specified as {{ic|"PCI:<BusID>:0:0"}}.}}
 
Add the desired SLI rendering mode value under section {{ic|Screen}}:

 Option "SLI" "SLIAA"

The following table presents the available rendering modes.

{| border="1"
! Value !! Behavior
|-
| 0, no, off, false, Single || Use only a single GPU when rendering.
|-
| 1, yes, on, true, Auto || Enable SLI and allow the driver to automatically select the appropriate rendering mode.
|-
| AFR || Enable SLI and use the alternate frame rendering mode.
|-
| SFR || Enable SLI and use the split frame rendering mode.
|-
| SLIAA || Enable SLI and use SLI antialiasing. Use this in conjunction with full scene antialiasing to improve visual quality.
|}

Alternatively, you can use the {{ic|nvidia-xconfig}} utility to insert these changes into {{ic|xorg.conf}} with a single command:

 # nvidia-xconfig --busid=PCI:3:0:0 --sli=SLIAA

To verify that SLI mode is enabled from a shell:

 $ nvidia-settings -q all | grep SLIMode
  Attribute 'SLIMode' (arch:0.0): AA
    'SLIMode' is a string attribute.
    'SLIMode' is a read-only attribute.
    'SLIMode' can use the following target types: X Screen.

==== Forcing Powermizer performance level (for laptops) ====

Add under section {{ic|Device}}:

 # Force Powermizer to a certain level at all times
 # level 0x1=highest
 # level 0x2=med
 # level 0x3=lowest

 # AC settings:
 Option "RegistryDwords" "PowerMizerLevelAC=0x3"
 # Battery settings:
 Option "RegistryDwords" "PowerMizerLevel=0x3"

===== Letting the GPU set its own performance level based on temperature =====

Add under section {{ic|Device}}:

 Option "RegistryDwords" "PerfLevelSrc=0x3333"

==== Disable vblank interrupts (for laptops) ====

When running the interrupt detection utility [[powertop]], it can be observed that the Nvidia driver will generate an interrupt for every vblank. To disable this, place in the {{ic|Device}} section:

 Option "OnDemandVBlankInterrupts" "1"

This will reduce interrupts to about one or two per second.

==== Enabling overclocking ====

{{Warning|Please note that overclocking may damage hardware and that no responsibility may be placed on the authors of this page due to any damage to any information technology equipment from operating products out of specifications set by the manufacturer.}}

To enable GPU and memory overclocking, place the following line in the {{ic|Device}} section:

 Option "Coolbits" "1"

This will enable on-the-fly overclocking within an X session by running:

 $ nvidia-settings

{{Note|GeForce 400/500/600 series Fermi/Kepler cores cannot currently be overclocked using the Coolbits method. The alternative is to edit and reflash the GPU BIOS either under DOS (preferred), or within a Win32 environment by way of [http://www.mvktech.net/component/option,com_remository/Itemid,26/func,select/id,127/orderby,2/page,1/ nvflash] and [http://www.mvktech.net/component/option,com_remository/Itemid,26/func,select/id,135/orderby,2/page,1/ NiBiTor 6.0]. The advantage of BIOS flashing is that not only can voltage limits be raised, but stability is generally improved over software overclocking methods such as Coolbits.}}

===== Setting static 2D/3D clocks =====

Set the following string in the {{ic|Device}} section to enable PowerMizer at its maximum performance level:

 Option "RegistryDwords" "PerfLevelSrc=0x2222"

Set one of the following two strings in the {{ic|Device}} section to enable manual GPU fan control within {{ic|nvidia-settings}}:

 Option "Coolbits" "4"

 Option "Coolbits" "5"

== Tips and tricks ==

=== Enabling Pure Video HD (VDPAU/VAAPI) ===

'''Hardware required:''' at least a video card with second generation [http://en.wikipedia.org/wiki/Nvidia_PureVideo#Table_of_PureVideo_.28HD.29_GPUs PureVideo HD].

'''Software required:''' Nvidia video cards with the proprietary driver installed will provide video decoding capabilities with the VDPAU interface at different levels according to the PureVideo generation.

You can also add support for the VA-API interface with:

 # pacman -S libva-vdpau-driver

Check VA-API support with:

 $ vainfo

To take full advantage of the hardware decoding capability of your video card you will need a media player that supports VDPAU or VA-API.

To enable hardware acceleration in [[MPlayer]], edit {{ic|~/.mplayer/config}}:

 vo=vdpau
 vc=ffmpeg12vdpau,ffwmv3vdpau,ffvc1vdpau,ffh264vdpau,ffodivxvdpau,

To enable hardware acceleration in [[VLC media player|VLC]] go:

{{ic|'''Tools'''}} -> {{ic|'''Preferences'''}} -> {{ic|'''Input & Codecs'''}} -> check {{ic|'''Use GPU accelerated decoding'''}}

To enable hardware acceleration in '''smplayer''' go:

{{ic|'''Options'''}} -> {{ic|'''Preferences'''}} -> {{ic|'''General'''}} -> {{ic|'''Video Tab'''}} -> select {{ic|vdpau}} as {{ic|'''output driver'''}}

To enable hardware acceleration in '''gnome-mplayer''' go:

{{ic|'''Edit'''}} -> {{ic|'''Preferences'''}} -> set {{ic|'''video output'''}} to {{ic|vdpau}}

'''Playing HD movies on cards with low memory:''' if your graphics card does not have a lot of memory (more than 512MB?), you can experience glitches when watching 1080p or even 720p movies. To avoid that, start a simple window manager like TWM or MWM.

Additionally, increasing MPlayer's cache size in {{ic|~/.mplayer/config}} can help when your hard drive spins down while watching HD movies.

=== Hardware accelerated video decoding with XvMC ===

Accelerated decoding of MPEG-1 and MPEG-2 videos via [[XvMC]] is supported on GeForce4, GeForce 5 FX, GeForce 6 and GeForce 7 series cards. To use it, create a new file {{ic|/etc/X11/XvMCConfig}} with the following content:

 libXvMCNVIDIA_dynamic.so.1

See how to configure [[XvMC#Supported software|supported software]].
=== Using TV-out ===

A good article on the subject can be found [http://en.wikibooks.org/wiki/NVidia/TV-OUT here].

=== X with a TV (DFP) as the only display ===

The X server falls back to CRT-0 if no monitor is automatically detected. This can be a problem when using a DVI-connected TV as the main display, and X is started while the TV is turned off or otherwise disconnected.

To force nvidia to use DFP, store a copy of the EDID somewhere in the filesystem so that X can parse the file instead of reading EDID from the TV/DFP.

To acquire the EDID, start nvidia-settings. It will show some information in tree format; ignore the rest of the settings for now and select the GPU (the corresponding entry should be titled "GPU-0" or similar), click the {{ic|DFP}} section (again, {{ic|DFP-0}} or similar), click on the {{ic|Acquire EDID}} button and store it somewhere, for example, {{ic|/etc/X11/dfp0.edid}}.

Edit {{ic|xorg.conf}} by adding to the {{ic|Device}} section:

 Option "ConnectedMonitor" "DFP"
 Option "CustomEDID" "DFP-0:/etc/X11/dfp0.edid"

The {{ic|ConnectedMonitor}} option forces the driver to recognize the DFP as if it were connected. The {{ic|CustomEDID}} provides EDID data for the device, meaning that it will start up just as if the TV/DFP were connected during X startup.

This way, one can automatically start a display manager at boot time and still have a working and properly configured X screen by the time the TV gets powered on.

=== Check the power source ===

The NVIDIA X.org driver can also be used to detect the GPU's current source of power. To see the current power source, check the 'GPUPowerSource' read-only parameter (0 - AC, 1 - battery):

 $ nvidia-settings -q GPUPowerSource -t
 1

If you are seeing an error message similar to the one below, then you either need to install [[acpid]] or start the systemd service via {{ic|systemctl start acpid.service}}:

 ACPI: failed to connect to the ACPI event daemon; the daemon
 may not be running or the "AcpidSocketPath" X
 configuration option may not be set correctly. When the
 ACPI event daemon is available, the NVIDIA X driver will
 try to use it to receive ACPI event notifications. For
 details, please see the "ConnectToAcpid" and
 "AcpidSocketPath" X configuration options in Appendix B: X
 Config Options in the README.

(If you are not seeing this error, it is not necessary to install/run acpid solely for this purpose. The current power source is reported correctly even without acpid installed.)
=== Displaying GPU temperature in the shell ===

==== Method 1 - nvidia-settings ====

{{Note|This method requires that you are using X. Use Method 2 or Method 3 if you are not. Also note that Method 3 currently does not work with newer NVIDIA cards such as the G210/220, as well as embedded GPUs such as the Zotac IONITX's 8800GS.}}

To display the GPU temperature in the shell, use {{ic|nvidia-settings}} as follows:

 $ nvidia-settings -q gpucoretemp

This will output something similar to the following:

 Attribute 'GPUCoreTemp' (hostname:0.0): 41.
 'GPUCoreTemp' is an integer attribute.
 'GPUCoreTemp' is a read-only attribute.
 'GPUCoreTemp' can use the following target types: X Screen, GPU.

The GPU temperature of this board is 41 °C.

In order to get just the temperature for use in utilities such as {{ic|rrdtool}} or {{ic|conky}}, among others:

 $ nvidia-settings -q gpucoretemp -t
 41

==== Method 2 - nvidia-smi ====

Use nvidia-smi, which can read temperatures directly from the GPU without the need to use X at all. This is important for the small group of users who do not have X running on their boxes, perhaps because the box is headless and running server applications. To display the GPU temperature in the shell, use nvidia-smi as follows:

 $ nvidia-smi

This should output something similar to the following:

{{bc|<nowiki>$ nvidia-smi
Fri Jan  6 18:53:54 2012
+------------------------------------------------------+
| NVIDIA-SMI 2.290.10   Driver Version: 290.10         |
|-------------------------------+----------------------+----------------------+
| Nb.  Name                     | Bus Id        Disp.  | Volatile ECC SB / DB |
| Fan   Temp   Power Usage /Cap | Memory Usage         | GPU Util. Compute M. |
|===============================+======================+======================|
| 0.  GeForce 8500 GT           | 0000:01:00.0  N/A    |       N/A        N/A |
|  30%   62 C  N/A   N/A /  N/A |  17%   42MB /  255MB |  N/A      Default    |
|-------------------------------+----------------------+----------------------|
| Compute processes:                                               GPU Memory |
|  GPU  PID     Process name                                       Usage      |
|=============================================================================|
|  0.           ERROR: Not Supported                                          |
+-----------------------------------------------------------------------------+
</nowiki>}}

To display only the temperature:

{{bc|<nowiki>
$ nvidia-smi -q -d TEMPERATURE

==============NVSMI LOG==============

Timestamp                       : Fri Jan  6 18:50:57 2012

Driver Version                  : 290.10

Attached GPUs                   : 1

GPU 0000:01:00.0
    Temperature
        Gpu                     : 62 C

</nowiki>}}

In order to get just the temperature for use in utilities such as rrdtool or conky, among others:

 $ nvidia-smi -q -d TEMPERATURE | grep Gpu | cut -c35-36
 62

Reference: http://www.question-defense.com/2010/03/22/gpu-linux-shell-temp-get-nvidia-gpu-temperatures-via-linux-cli

==== Method 3 - nvclock ====

Use nvclock, which is available from the [extra] repository.

{{Note|{{ic|nvclock}} cannot access thermal sensors on newer NVIDIA cards such as the G210/220.}}

There can be significant differences between the temperatures reported by nvclock and nvidia-settings/nv-control. According to [http://sourceforge.net/projects/nvclock/forums/forum/67426/topic/1906899 this post] by the author (thunderbird) of nvclock, the nvclock values should be more accurate.
=== Set Fan Speed at Login ===

You can adjust the fan speed on your graphics card with {{ic|nvidia-settings}}'s console interface. First ensure that your Xorg configuration sets the Coolbits option to {{ic|4}} or {{ic|5}} in your {{ic|Device}} section to enable fan control:

 Option "Coolbits" "4"

{{Note|GTX 4xx/5xx series cards cannot currently set fan speeds at login using this method. This method only allows for the setting of fan speeds within the current X session by way of nvidia-settings.}}

Place the following line in your [[xinitrc|{{ic|~/.xinitrc}}]] file to adjust the fan when you launch Xorg. Replace {{ic|<n>}} with the fan speed percentage you want to set.

 nvidia-settings -a "[gpu:0]/GPUFanControlState=1" -a "[fan:0]/GPUCurrentFanSpeed=<n>"

You can also configure a second GPU by incrementing the GPU and fan number:

 nvidia-settings -a "[gpu:0]/GPUFanControlState=1" \
 -a "[gpu:1]/GPUFanControlState=1" \
 -a "[fan:0]/GPUCurrentFanSpeed=<n>" \
 -a "[fan:1]/GPUCurrentFanSpeed=<n>" &

If you use a login manager such as GDM or KDM, you can create a desktop entry file to process this setting. Create {{ic|~/.config/autostart/nvidia-fan-speed.desktop}} and place this text inside it. Again, change {{ic|<n>}} to the speed percentage you want.

 [Desktop Entry]
 Type=Application
 Exec=nvidia-settings -a "[gpu:0]/GPUFanControlState=1" -a "[fan:0]/GPUCurrentFanSpeed=<n>"
 X-GNOME-Autostart-enabled=true
 Name=nvidia-fan-speed
=== Order of install/deinstall for changing drivers ===

Where the old driver is nvidiaO and the new driver is nvidiaN:

 remove nvidiaO
 install nvidia-utilsN
 install nvidiaN
 install lib32-nvidia-utilsN (if required)
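
As a concrete sketch (a hypothetical starting point; adjust the package names to what is actually installed on your system), moving from the 304xx legacy branch to the current driver could look like:

 # pacman -R nvidia-304xx nvidia-304xx-utils
 # pacman -S nvidia-utils nvidia lib32-nvidia-utils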

=== Switching between nvidia and nouveau drivers ===

If you are switching between the nvidia and nouveau driver often, you can use these two scripts to make it easier (both need to be run as root):

{{bc|<nowiki>
#!/bin/bash
# nvidia -> nouveau

sed -i 's/#*options nouveau modeset=1/options nouveau modeset=1/' /etc/modprobe.d/modprobe.conf
sed -i 's/#*MODULES="nouveau"/MODULES="nouveau"/' /etc/mkinitcpio.conf

pacman -Rdds --noconfirm nvidia{,-utils}
pacman -S --noconfirm nouveau-dri xf86-video-nouveau

#cp {10-monitor,20-nouveau}.conf /etc/X11/xorg.conf.d/

mkinitcpio -p linux
</nowiki>}}

{{bc|<nowiki>
#!/bin/bash
# nouveau -> nvidia

sed -i 's/options nouveau modeset=1/#options nouveau modeset=1/' /etc/modprobe.d/modprobe.conf
sed -i 's/MODULES="nouveau"/#MODULES="nouveau"/' /etc/mkinitcpio.conf

pacman -Rdds --noconfirm nouveau-dri xf86-video-nouveau libgl
pacman -S --noconfirm nvidia{,-utils}

#rm /etc/X11/xorg.conf.d/{10-monitor,20-nouveau}.conf

mkinitcpio -p linux
</nowiki>}}

A reboot is needed to complete the switch.

Adjust the scripts accordingly, if using other NVIDIA drivers (e.g. nvidia-173xx).

If using a xorg-server older than 1.10.2, uncomment the lines that copy and remove {{ic|{10-monitor,20-nouveau}.conf}}. Since 1.10.2, X loads nouveau automatically.

== Troubleshooting ==

=== Bad performance, e.g. slow repaints when switching tabs in Chrome ===

On some machines, recent nvidia drivers introduce a bug(?) that causes X11 to redraw pixmaps really slowly. Switching tabs in Chrome/Chromium (while having more than 2 tabs opened) takes 1-2 seconds, instead of a few milliseconds.

It seems that setting the variable '''InitialPixmapPlacement''' to '''0''' solves that problem, although (as described some paragraphs above) '''InitialPixmapPlacement=2''' should actually be the faster method.

The variable can be (temporarily) set with the command:

 nvidia-settings -a InitialPixmapPlacement=0

To make this permanent, this call can be placed in a startup script.
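
For example, when X is started via [[xinitrc]], the call can simply be added there (a sketch; desktop environments can use their own autostart mechanism instead):

 nvidia-settings -a InitialPixmapPlacement=0 &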

=== Old Xorg Settings ===

If upgrading from an old installation, please remove old {{ic|/usr/X11R6/}} paths as they can cause trouble during installation.

=== Corrupted screen: "Six screens" issue ===

For some users using GeForce GT 100M's, the screen turns out corrupted after X starts, divided into 6 sections with a resolution limited to 640x480. The same problem has been recently reported with the Quadro 2000 and high-resolution displays.

To solve this problem, enable the Validation Mode {{ic|NoTotalSizeCheck}} in section {{ic|Device}}:

 Section "Device"
  ...
  Option "ModeValidation" "NoTotalSizeCheck"
  ...
 EndSection

=== '/dev/nvidia0' Input/Output error ===

{{Accuracy|Verify that the BIOS related suggestions work and are not coincidentally set while troubleshooting.|section='/dev/nvidia0' Input/Output error... suggested fixes}}

This error can occur for several different reasons, and the most common solution given for this error is to check for group/file permissions, which in almost every case is ''not'' the issue. The NVIDIA documentation does not talk in detail on what you should do to correct this problem, but there are a few things that have worked for some people. The problem can be an IRQ conflict with another device or bad routing by either the kernel or your BIOS.

The first thing to try is to remove other video devices such as video capture cards and see if the problem goes away. If there are too many video processors on the same system, the kernel can be unable to start them because of memory allocation problems with the video controller. In particular on systems with low video memory this can occur even if there is only one video processor. In that case you should find out the amount of your system's video memory (e.g. with {{ic|lspci -v}}) and pass allocation parameters to the kernel, e.g.:

 vmalloc=64M

or

 vmalloc=256M

If running a 64-bit kernel, a driver defect can cause the nvidia module to fail initializing when IOMMU is on. Turning it off in the BIOS has been confirmed to work for some users. [http://www.nvnews.net/vbulletin/showthread.php?s=68bb2fabadcb53b10b286aa42d13c5bc&t=159335][[User:Clickthem#nvidia module]]

Another thing to try is to change your BIOS IRQ routing from {{ic|Operating system controlled}} to {{ic|BIOS controlled}} or the other way around. The first one can be passed as a kernel parameter:

 PCI=biosirq

The {{ic|noacpi}} kernel parameter has also been suggested as a solution, but since it disables ACPI completely it should be used with caution. Some hardware is easily damaged by overheating.

{{Note|The kernel parameters can be passed either through the kernel command line or the bootloader configuration file. See your bootloader Wiki page for more information.}}
+
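For example, with GRUB the parameter can be appended to the existing {{ic|GRUB_CMDLINE_LINUX_DEFAULT}} line in {{ic|/etc/default/grub}} (a sketch; keep whatever parameters are already there and adjust the value to your system):

 GRUB_CMDLINE_LINUX_DEFAULT="quiet vmalloc=256M"

Then regenerate the configuration with {{ic|grub-mkconfig -o /boot/grub/grub.cfg}} and reboot.
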
==='/dev/nvidiactl' errors===

Trying to start an OpenGL application might result in errors such as:

 Error: Could not open /dev/nvidiactl because the permissions are too
 restrictive. Please see the FREQUENTLY ASKED QUESTIONS
 section of /usr/share/doc/NVIDIA_GLX-1.0/README
 for steps to correct.

Solve this by adding the appropriate user to the {{ic|video}} group and logging in again:

 # gpasswd -a username video

===32-bit applications do not start===

Under 64-bit systems, installing {{ic|lib32-nvidia-utils}} that matches the version installed for the 64-bit driver fixes the issue.

===Errors after updating the kernel===

If a custom build of NVIDIA's module is used instead of the package from [extra], a recompile is required every time the kernel is updated. Rebooting is generally recommended after updating the kernel and graphics drivers.

===Crashing in general===

* Try disabling {{ic|RenderAccel}} in xorg.conf.
* If Xorg outputs an error about "conflicting memory type" or "failed to allocate primary buffer: out of memory", add {{ic|nopat}} at the end of the {{ic|kernel}} line in {{ic|/boot/grub/menu.lst}}.
* If the NVIDIA compiler complains about different versions of GCC between the current one and the one used for compiling the kernel, add the following to {{ic|/etc/profile}}:

 export IGNORE_CC_MISMATCH=1

* If Xorg crashes with a "Signal 11" while using the nvidia-96xx drivers, try disabling PAT. Pass the argument {{ic|nopat}} to the {{ic|kernel}} line in {{ic|menu.lst}}.

More information about troubleshooting the driver can be found in the [http://www.nvnews.net/vbulletin/forumdisplay.php?s=&forumid=14 NVIDIA forums].

===Bad performance after installing a new driver version===

If FPS have dropped in comparison with older drivers, first check if direct rendering is turned on:

 $ glxinfo | grep direct

If the command prints:

 direct rendering: No

then that is a likely cause of the sudden FPS drop.

A possible solution is to revert to the previously installed driver version and reboot afterwards.

===CPU spikes with 400 series cards===

If you are experiencing intermittent CPU spikes with a 400 series card, it may be caused by PowerMizer constantly changing the GPU's clock frequency. To switch PowerMizer's setting from Adaptive to Performance, add the following to the {{ic|Device}} section of your Xorg configuration:

 Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x3322; PowerMizerDefaultAC=0x1"

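A minimal {{ic|Device}} section carrying this option could look like the following (a sketch; the identifier is illustrative):

 Section "Device"
     Identifier "Nvidia Card"
     Driver "nvidia"
     Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x3322; PowerMizerDefaultAC=0x1"
 EndSection
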
===Laptops: X hangs on login/out, worked around with Ctrl+Alt+Backspace===

If, while using the legacy NVIDIA drivers, Xorg hangs on login and logout (particularly with an odd screen split into two black and white/gray pieces), but logging in is still possible via Ctrl+Alt+Backspace (or whatever the new "kill X" keybind is), try adding this in {{ic|/etc/modprobe.d/modprobe.conf}}:

 options nvidia NVreg_Mobile=1

One user had luck with this instead, but it makes performance drop significantly for others:

 options nvidia NVreg_DeviceFileUID=0 NVreg_DeviceFileGID=33 NVreg_DeviceFileMode=0660 NVreg_SoftEDIDs=0 NVreg_Mobile=1

Note that {{ic|NVreg_Mobile}} needs to be changed according to the laptop:

* 1 for Dell laptops.
* 2 for non-Compal Toshiba laptops.
* 3 for other laptops.
* 4 for Compal Toshiba laptops.
* 5 for Gateway laptops.

See [http://http.download.nvidia.com/XFree86/Linux-x86/1.0-7182/README/readme.txt NVIDIA Driver's Readme: Appendix K] for more information.

===Refresh rate not detected properly by XRandR dependent utilities===

The XRandR X extension is not presently aware of multiple display devices on a single X screen; it only sees the {{ic|MetaMode}} bounding box, which may contain one or more actual modes. This means that if multiple MetaModes have the same bounding box, XRandR will not be able to distinguish between them.

In order to support {{ic|DynamicTwinView}}, the NVIDIA driver must make each MetaMode appear to be unique to XRandR. Presently, the NVIDIA driver accomplishes this by using the refresh rate as a unique identifier.

Use {{ic|nvidia-settings -q RefreshRate}} to query the actual refresh rate on each display device.

The XRandR extension is currently being redesigned by the X.Org community, so the refresh rate workaround may be removed at some point in the future.

This workaround can also be disabled by setting the {{ic|DynamicTwinView}} X configuration option to {{ic|false}}, which will disable NV-CONTROL support for manipulating MetaModes, but will cause the XRandR and XF86VidMode visible refresh rate to be accurate.

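For example, to disable the workaround, the option can be set in the {{ic|Device}} section of your Xorg configuration (a sketch; the identifier is illustrative):

 Section "Device"
     Identifier "Nvidia Card"
     Driver "nvidia"
     Option "DynamicTwinView" "false"
 EndSection
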
===No screens found on a laptop / NVIDIA Optimus===

On a laptop, if the NVIDIA driver cannot find any screens, you may have an NVIDIA Optimus setup: an Intel chipset connected to the screen and the video outputs, and an NVIDIA card that does all the hard work and writes to the chipset's video memory.

Check whether

 lspci | grep VGA

outputs something similar to:

 00:02.0 VGA compatible controller: Intel Corporation Core Processor Integrated Graphics Controller (rev 02)
 01:00.0 VGA compatible controller: nVidia Corporation Device 0df4 (rev a1)

NVIDIA has [http://www.phoronix.com/scan.php?page=news_item&px=MTE3MzY announced plans] to support Optimus in their Linux drivers at some point in the future.

You need to install the [[Intel Graphics|Intel]] driver to handle the screens; if you then want to run 3D software, run it through [[Bumblebee]] to make it use the NVIDIA card.

'''Possible workaround'''

On a Lenovo W520 with a Quadro 1000M and NVIDIA Optimus, one user entered the BIOS and changed the default graphics setting from 'Optimus' to 'Discrete', after which the NVIDIA drivers from pacman (295.20-1 at the time of writing) recognized the screens.

Steps:

* Enter the BIOS.
* Find the graphics settings (e.g. in the Config tab, Display submenu).
* Change 'Graphics Device' to 'Discrete Graphics' (this disables the Intel integrated graphics).
* Change OS Detection for NVIDIA Optimus to 'Disabled'.
* Save and exit.

===Screen(s) found, but none have a usable configuration===

On a laptop, the NVIDIA driver sometimes cannot find the active screen. This may happen because the graphics card has VGA/TV outputs. Examine {{ic|Xorg.0.log}} to see what is wrong.

Another thing to try is adding an invalid {{ic|"ConnectedMonitor"}} option to {{ic|Section "Device"}} to force Xorg to throw an error that shows you how to correct the configuration. [http://http.download.nvidia.com/XFree86/Linux-x86/304.51/README/xconfigoptions.html Here] is more about the ConnectedMonitor setting.

After restarting X, check {{ic|Xorg.0.log}} to get valid CRT-x, DFP-x, TV-x values.

{{ic|nvidia-xconfig --query-gpu-info}} could be helpful.

===No brightness control on laptops===

Try adding the following line to the {{ic|Device}} section of {{ic|20-nvidia.conf}}:

 Option "RegistryDwords" "EnableBrightnessControl=1"

If it still does not work, you can try installing [https://aur.archlinux.org/packages.php?ID=25467 nvidia-bl] or [https://aur.archlinux.org/packages.php?ID=50749 nvidiabl].

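A minimal sketch of such a file, assuming it lives at {{ic|/etc/X11/xorg.conf.d/20-nvidia.conf}} (the identifier is illustrative):

 Section "Device"
     Identifier "Nvidia Card"
     Driver "nvidia"
     Option "RegistryDwords" "EnableBrightnessControl=1"
 EndSection
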
===Black bars while watching fullscreen Flash videos with TwinView===

Follow the instructions presented here: [http://al.robotfuzz.com/content/workaround-fullscreen-flash-linux-multiheaded-desktops link].

===Backlight is not turning off in some occasions===

By default, DPMS should turn off the backlight with the timeouts set or by running xset. However, probably due to a bug in the proprietary NVIDIA drivers, the result is a blank screen with no power saving whatsoever. As a workaround until the bug is fixed, you can use {{ic|vbetool}} as root.

Install the {{Pkg|vbetool}} package.

Turn off your screen on demand; pressing a random key turns the backlight on again:

 vbetool dpms off && read -n1; vbetool dpms on

Alternatively, xrandr can disable and re-enable monitor outputs without requiring root:

 xrandr --output DP-1 --off; read -n1; xrandr --output DP-1 --auto

===Blue tint on videos with Flash===

An issue with {{Pkg|flashplugin}} versions 11.2.202.228-1 and 11.2.202.233-1 causes it to send the U/V planes in the incorrect order, resulting in a blue tint on certain videos. There are a few potential fixes for this bug:

* Install the latest {{Pkg|libvdpau}}.
* Patch {{ic|vdpau_trace.so}} with [https://bbs.archlinux.org/viewtopic.php?pid=1078368#p1078368 this makepkg].
* Right click on a video, select "Settings..." and uncheck "Enable hardware acceleration". Reload the page for it to take effect. Note that this disables GPU acceleration.
* [[Downgrading Packages|Downgrade]] the {{Pkg|flashplugin}} package to version 11.1.102.63-1 at most.
* Use {{AUR|google-chrome}} with the new [https://aur.archlinux.org/packages/?O=0&K=chromium-pepper-flash Pepper API].
* Try one of the few Flash alternatives.

The merits of each are discussed in [https://bbs.archlinux.org/viewtopic.php?id=137877 this thread]. To summarize: if you want all Flash sites (YouTube, Vimeo, etc.) to work properly in non-Chrome browsers, without feature regressions (such as losing hardware acceleration), without crashes/instability (enabling hardware decoding), without security concerns (multiple CVEs against older Flash versions), and without breaking the vdpau tracing library from its intended purpose, the least objectionable fix is to install {{AUR|libvdpau-git-flashpatch}}.

===Bleeding overlay with Flash===

This bug is due to the incorrect colour key being used by {{Pkg|flashplugin}} version 11.2.202.228-1 and causes the Flash content to "leak" into other pages or solid black backgrounds. To avoid this issue, simply install the latest {{Pkg|libvdpau}} or export {{ic|1=VDPAU_NVIDIA_NO_OVERLAY=1}} within either your shell profile (e.g. {{ic|~/.bash_profile}} or {{ic|~/.zprofile}}) or {{ic|~/.xinitrc}}.

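For example, for a Bash login shell the following line can be appended to {{ic|~/.bash_profile}}:

 export VDPAU_NVIDIA_NO_OVERLAY=1
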
===Full system freeze using Flash===

If you experience occasional full system freezes (only the mouse is moving) when using flashplugin and get errors like the following in {{ic|/var/log/errors.log}}:

 NVRM: Xid (0000:01:00): 31, Ch 00000007, engmask 00000120, intr 10000000

a possible workaround is to switch off hardware acceleration in Flash by setting the following in {{ic|/etc/adobe/mms.cfg}}:

 EnableLinuxHWVideoDecode=0

===Xorg fails to load or Red Screen of Death===

If you get a red screen and use GRUB 2, disable the GRUB 2 framebuffer by editing {{ic|/etc/default/grub}} and uncommenting {{ic|GRUB_TERMINAL_OUTPUT}}. For more information see [[Grub#Disable_framebuffer]].
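For example, the relevant line in {{ic|/etc/default/grub}} after uncommenting (a sketch; only this line is shown):

 GRUB_TERMINAL_OUTPUT=console

Then regenerate the configuration with {{ic|grub-mkconfig -o /boot/grub/grub.cfg}}.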
  
==See also==

* [https://forums.geforce.com/ NVIDIA User forums]
* [http://www.nvnews.net/vbulletin/forumdisplay.php?s=&forumid=14 NVIDIA forums (nvnews)]
* [ftp://download.nvidia.com/XFree86/Linux-x86/355.11/README/README.txt Official README for NVIDIA drivers, all on one text page (355.11, September 7, 2015)]
* [http://http.download.nvidia.com/XFree86/Linux-x86/1.0-7182/README/readme.txt Official readme for legacy NVIDIA drivers (1.0-7182)]
* [ftp://download.nvidia.com/XFree86/Linux-x86/355.11/README/xconfigoptions.html README Appendix B: X Config Options (355.11, direct link)]

Latest revision as of 16:20, 25 August 2016

This article covers installing and configuring NVIDIA's proprietary graphic card driver. For information about the open-source drivers, see Nouveau. If you have a laptop with hybrid Intel/NVIDIA graphics, see NVIDIA Optimus instead.

Installation

Warning: Avoid installing the NVIDIA driver through the package provided from the NVIDIA website. Installation through pacman allows upgrading the driver together with the rest of the system.

These instructions are for those using the stock linux or linux-lts packages. For custom kernel setup, skip to the next subsection.

1. If you do not know what graphics card you have, find out by issuing:

$ lspci -k | grep -A 2 -E "(VGA|3D)"

2. Determine the necessary driver version for your card by:

3. Install the appropriate driver for your card:

4. If you are on 64-bit and also need 32-bit OpenGL support, you must also install the equivalent lib32 package from the multilib repository (e.g. lib32-nvidia-libgl, lib32-nvidia-340xx-libgl or lib32-nvidia-304xx-libgl).

5. Reboot. The nvidia package contains a file which blacklists the nouveau module, so rebooting is necessary.

Once the driver has been installed, continue to #Configuration.

Unsupported drivers

If you have a GeForce 5 FX series card or older, Nvidia no longer supports drivers for your card. This means that these drivers do not support the current Xorg version. It thus might be easier if you use the nouveau driver, which supports the old cards with the current Xorg.

However, Nvidia's legacy drivers are still available and might provide better 3D performance/stability if you are willing to downgrade Xorg:

  • For GeForce 5 FX series cards [NV30-NV36], install the nvidia-173xx-dkms (AUR) package. Last supported Xorg version is 1.15.
  • For GeForce 2/3/4 MX/Ti series cards [NV11, NV17-NV28], install the nvidia-96xx-dkms (AUR) package. Last supported Xorg version is 1.12.
Tip: The legacy nvidia-96xx-dkms and nvidia-173xx-dkms drivers can also be installed from the unofficial [city] repository. (It is strongly advised that you do not skip any dependency restrictions when installing from there.)

Custom kernel

If you are using a custom kernel, compilation of the Nvidia kernel modules can be automated with DKMS.

Install the nvidia-dkms package (or a specific branch such as nvidia-340xx-dkms). The Nvidia module will be rebuilt after every Nvidia or kernel update thanks to the DKMS Pacman Hook.

Pure Video HD

At least a video card with second generation PureVideo HD is required for hardware video acceleration using VDPAU.

DRM kernel mode setting

Note: The NVIDIA driver does not provide an fbdev driver, so the kernel's compiled-in vesafb module cannot give a high-resolution console. However, the kernel's compiled-in efifb module does support a high-resolution NVIDIA console on EFI systems.[1]

nvidia 364.16 adds support for DRM kernel mode setting. To enable this feature, add the nvidia-drm.modeset=1 kernel parameter, and add nvidia, nvidia_modeset, nvidia_uvm and nvidia_drm to the MODULES setting of your initramfs configuration (see Mkinitcpio#MODULES).

Warning: Do not forget to regenerate the initramfs with mkinitcpio every time you update the driver.
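For example, a sketch of the relevant line (keep any modules already listed; the Pacman hook below can automate the rebuild):

/etc/mkinitcpio.conf
MODULES="... nvidia nvidia_modeset nvidia_uvm nvidia_drm"

Then regenerate the initramfs with mkinitcpio -p linux and add nvidia-drm.modeset=1 to the kernel parameters in your bootloader configuration.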

Pacman hook

To avoid the possibility of forgetting to update your initramfs after an nvidia upgrade, you can use a pacman hook like this

/etc/pacman.d/hooks/nvidia.hook
[Trigger]
Operation=Install
Operation=Upgrade
Operation=Remove
Type=Package
Target=nvidia

[Action]
Depends=mkinitcpio
When=PostTransaction
Exec=/usr/bin/mkinitcpio -p linux

Hardware accelerated video decoding with XvMC

Accelerated decoding of MPEG-1 and MPEG-2 videos via XvMC is supported on GeForce4, GeForce 5 FX, GeForce 6 and GeForce 7 series cards. See XvMC for details.

Switching between NVIDIA and nouveau drivers

This section is being considered for removal.

Reason: This script literally just installs NVIDIA and uninstalls nouveau or vice versa. Also it does not mention the blacklisting of the nouveau module. (Discuss in Talk:NVIDIA#)

If you need to switch between drivers, you may use the following script, run as root (say yes to all confirmations):

#!/bin/bash
BRANCH= # Enter a branch if needed, i.e. -340xx or -304xx
NVIDIA=nvidia${BRANCH} # If no branch entered above this would be "nvidia"
NOUVEAU=xf86-video-nouveau

# Replace -R with -Rs if you want to remove the unneeded dependencies
if [ $(pacman -Qqs ^mesa-libgl$) ]; then
    pacman -S $NVIDIA ${NVIDIA}-libgl # Add lib32-${NVIDIA}-libgl and ${NVIDIA}-lts if needed
    # pacman -R $NOUVEAU
elif [ $(pacman -Qqs ^${NVIDIA}$) ]; then
    pacman -S --needed $NOUVEAU mesa-libgl # Add lib32-mesa-libgl if needed
    pacman -R $NVIDIA # Add ${NVIDIA}-lts if needed
fi

Configuration

This article or section is out of date.

Reason: nvidia-xconfig should be avoided in 2016, and manual configuration isn't needed in most cases. Neither is the automatic configuration with nvidia-xconfig. (Discuss in Talk:NVIDIA#)

After installing the driver, it may not be necessary to create an Xorg server configuration file at all; you can run a test to see if the Xorg server functions correctly without one. However, a configuration file (prefer /etc/X11/xorg.conf.d/20-nvidia.conf over /etc/X11/xorg.conf) may still be required to adjust various settings. It can be generated by the NVIDIA Xorg configuration tool, or created manually. If created manually, it can be a minimal configuration (in the sense that it only passes the basic options to the Xorg server), or it can include a number of settings that bypass Xorg's auto-discovered or pre-configured options.

Note: For manual configuration see NVIDIA/Tips and tricks#Manual configuration.

Minimal configuration

A basic configuration block in 20-nvidia.conf (or deprecated in xorg.conf) would look like this:

/etc/X11/xorg.conf.d/20-nvidia.conf
Section "Device"
        Identifier "Nvidia Card"
        Driver "nvidia"
        VendorName "NVIDIA Corporation"
        Option "NoLogo" "true"
        #Option "UseEDID" "false"
        #Option "ConnectedMonitor" "DFP"
        # ...
EndSection
Tip: If upgrading from nouveau make sure to remove nouveau from /etc/mkinitcpio.conf. See Switching between NVIDIA and nouveau drivers if switching between the open and proprietary drivers often.

Automatic configuration

The NVIDIA package includes an automatic configuration tool to create an Xorg server configuration file (xorg.conf) and can be run by:

# nvidia-xconfig

This command will auto-detect and create (or edit, if already present) the /etc/X11/xorg.conf configuration according to present hardware.

If there are instances of DRI, ensure they are commented out:

#    Load        "dri"

Double check your /etc/X11/xorg.conf to make sure your default depth, horizontal sync, vertical refresh, and resolutions are acceptable.

Warning: Automatic configuration may still not work properly with Xorg server 1.8.

NVIDIA Settings

The nvidia-settings tool lets you configure many options using either CLI or GUI. Running nvidia-settings without any options launches the GUI, for CLI options see nvidia-settings(1).

You can run the GUI as a normal user and save the settings to ~/.nvidia-settings-rc. Then you can load the settings using $ nvidia-settings --load-config-only (for example in your xinitrc). Alternatively, run nvidia-settings as root, and then save the configuration to /etc/X11/xorg.conf.d/ as usual.
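For example, a minimal ~/.xinitrc sketch (the window manager line is only an illustration):

~/.xinitrc
nvidia-settings --load-config-only &
exec openbox-session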

Tip: If your X server is crashing on startup, it may be because the GUI-generated settings are corrupt. Try deleting the generated file and starting from scratch.

Multiple monitors

See Multihead for more general information.

Using NVIDIA Settings

The nvidia-settings tool can configure multiple monitors.

For CLI configuration, first get the CurrentMetaMode by running:

$ nvidia-settings -q CurrentMetaMode
Attribute 'CurrentMetaMode' (hostname:0.0): id=50, switchable=no, source=nv-control :: DPY-1: 2880x1620 @2880x1620 +0+0 {ViewPortIn=2880x1620, ViewPortOut=2880x1620+0+0}

Save everything after the :: to the end of the attribute (in this case: DPY-1: 2880x1620 @2880x1620 +0+0 {ViewPortIn=2880x1620, ViewPortOut=2880x1620+0+0}) and use to reconfigure your displays with nvidia-settings --assign "CurrentMetaMode=your_meta_mode".
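For example, using the MetaMode queried above, the reconfiguration command would be:

$ nvidia-settings --assign "CurrentMetaMode=DPY-1: 2880x1620 @2880x1620 +0+0 {ViewPortIn=2880x1620, ViewPortOut=2880x1620+0+0}"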

Tip: You can create shell aliases for the different monitor and resolution configurations you use.

ConnectedMonitor

If the driver does not properly detect a second monitor, you can force it to do so with ConnectedMonitor.

/etc/X11/xorg.conf

Section "Monitor"
    Identifier     "Monitor1"
    VendorName     "Panasonic"
    ModelName      "Panasonic MICRON 2100Ex"
    HorizSync       30.0 - 121.0 # this monitor has incorrect EDID, hence Option "UseEDIDFreqs" "false"
    VertRefresh     50.0 - 160.0
    Option         "DPMS"
EndSection

Section "Monitor"
    Identifier     "Monitor2"
    VendorName     "Gateway"
    ModelName      "GatewayVX1120"
    HorizSync       30.0 - 121.0
    VertRefresh     50.0 - 160.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device1"
    Driver         "nvidia"
    Option         "NoLogo"
    Option         "UseEDIDFreqs" "false"
    Option         "ConnectedMonitor" "CRT,CRT"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce 6200 LE"
    BusID          "PCI:3:0:0"
    Screen          0
EndSection

Section "Device"
    Identifier     "Device2"
    Driver         "nvidia"
    Option         "NoLogo"
    Option         "UseEDIDFreqs" "false"
    Option         "ConnectedMonitor" "CRT,CRT"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce 6200 LE"
    BusID          "PCI:3:0:0"
    Screen          1
EndSection

The duplicated Device section with the Screen entry is how you get X to use two monitors on one card without TwinView. Note that nvidia-settings will strip out any ConnectedMonitor options you have added.

TwinView

If you want only one big screen instead of two, set the TwinView argument to 1. This option should be used if you desire compositing. TwinView only works on a per-card basis, when all participating monitors are connected to the same card.

Option "TwinView" "1"

Example configuration:

/etc/X11/xorg.conf.d/10-monitor.conf
Section "ServerLayout"
    Identifier     "TwinLayout"
    Screen         0 "metaScreen" 0 0
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    Option         "Enable" "true"
EndSection

Section "Monitor"
    Identifier     "Monitor1"
    Option         "Enable" "true"
EndSection

Section "Device"
    Identifier     "Card0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"

    #refer to the link below for more information on each of the following options.
    Option         "HorizSync"          "DFP-0: 28-33; DFP-1 28-33"
    Option         "VertRefresh"        "DFP-0: 43-73; DFP-1 43-73"
    Option         "MetaModes"          "1920x1080, 1920x1080"
    Option         "ConnectedMonitor"   "DFP-0, DFP-1"
    Option         "MetaModeOrientation" "DFP-1 LeftOf DFP-0"
EndSection

Section "Screen"
    Identifier     "metaScreen"
    Device         "Card0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "TwinView" "True"
    SubSection "Display"
        Modes          "1920x1080"
    EndSubSection
EndSection

Device option information.

If you have multiple cards that are SLI capable, it is possible to run more than one monitor attached to separate cards (for example: two cards in SLI with one monitor attached to each). The "MetaModes" option in conjunction with SLI Mosaic mode enables this. Below is a configuration which works for the aforementioned example and runs GNOME flawlessly.

/etc/X11/xorg.conf.d/10-monitor.conf
Section "Device"
        Identifier      "Card A"
        Driver          "nvidia"
        BusID           "PCI:1:00:0"
EndSection

Section "Device"
        Identifier      "Card B"
        Driver          "nvidia"
        BusID           "PCI:2:00:0"
EndSection

Section "Monitor"
        Identifier      "Right Monitor"
EndSection

Section "Monitor"
        Identifier      "Left Monitor"
EndSection

Section "Screen"
        Identifier      "Right Screen"
        Device          "Card A"
        Monitor         "Right Monitor"
        DefaultDepth    24
        Option          "SLI" "Mosaic"
        Option          "Stereo" "0"
        Option          "BaseMosaic" "True"
        Option          "MetaModes" "GPU-0.DFP-0: 1920x1200+4480+0, GPU-1.DFP-0:1920x1200+0+0"
        SubSection      "Display"
                        Depth           24
        EndSubSection
EndSection

Section "Screen"
        Identifier      "Left Screen"
        Device          "Card B"
        Monitor         "Left Monitor"
        DefaultDepth    24
        Option          "SLI" "Mosaic"
        Option          "Stereo" "0"
        Option          "BaseMosaic" "True"
        Option          "MetaModes" "GPU-0.DFP-0: 1920x1200+4480+0, GPU-1.DFP-0:1920x1200+0+0"
        SubSection      "Display"
                        Depth           24
        EndSubSection
EndSection

Section "ServerLayout"
        Identifier      "Default"
        Screen 0        "Right Screen" 0 0
        Option          "Xinerama" "0"
EndSection
Manual CLI configuration with xrandr

The factual accuracy of this article or section is disputed.

Reason: Do these commands set up the monitors in TwinView mode? (Discuss in Talk:NVIDIA#)

If the previous solutions do not work for you, you can use your window manager's autostart implementation together with xorg-xrandr.

Some xrandr examples could be:

xrandr --output DVI-I-0 --auto --primary --left-of DVI-I-1

or:

xrandr --output DVI-I-1 --pos 1440x0 --mode 1440x900 --rate 75.0

Where:

  • --output is used to indicate the "monitor" to which the options are set.
  • DVI-I-1 is the name of the second monitor.
  • --pos is the position of the second monitor relative to the first.
  • --mode is the resolution of the second monitor.
  • --rate is the refresh rate (in Hz).
Vertical sync using TwinView

If you are using TwinView and vertical sync (the "Sync to VBlank" option in nvidia-settings), you will notice that only one screen is being properly synced, unless you have two identical monitors. Although nvidia-settings does offer an option to change which screen is being synced (the "Sync to this display device" option), this does not always work. A solution is to add the following environment variables at startup, for example append in /etc/profile:

export __GL_SYNC_TO_VBLANK=1
export __GL_SYNC_DISPLAY_DEVICE=DFP-0
export __VDPAU_NVIDIA_SYNC_DISPLAY_DEVICE=DFP-0

You can change DFP-0 with your preferred screen (DFP-0 is the DVI port and CRT-0 is the VGA port). You can find the identifier for your display from nvidia-settings in the "X Server XVideoSettings" section.

Gaming using TwinView

In case you want to play fullscreen games when using TwinView, you will notice that games recognize the two screens as being one big screen. While this is technically correct (the virtual X screen really is the size of your screens combined), you probably do not want to play on both screens at the same time.

To correct this behavior for SDL, try:

export SDL_VIDEO_FULLSCREEN_HEAD=1

For OpenGL, add the appropriate Metamodes to your xorg.conf in section Device and restart X:

Option "Metamodes" "1680x1050,1680x1050; 1280x1024,1280x1024; 1680x1050,NULL; 1280x1024,NULL;"

Another method that may either work alone or in conjunction with those mentioned above is starting games in a separate X server.

Mosaic mode

Mosaic mode is the only way to use more than 2 monitors across multiple graphics cards with compositing. Your window manager may or may not recognize the distinction between each monitor.

Base Mosaic

Base Mosaic mode works on any set of GeForce 8000 series or higher GPUs. It cannot be enabled from within the nvidia-settings GUI. You must either use the nvidia-xconfig command line program or edit xorg.conf by hand. Metamodes must be specified. The following is an example for four DFPs in a 2x2 configuration, each running at 1920x1024, with two DFPs connected to two cards:

$ nvidia-xconfig --base-mosaic --metamodes="GPU-0.DFP-0: 1920x1024+0+0, GPU-0.DFP-1: 1920x1024+1920+0, GPU-1.DFP-0: 1920x1024+0+1024, GPU-1.DFP-1: 1920x1024+1920+1024"
Note: While the documentation lists a 2x2 configuration of monitors, Nvidia has reduced that ability to just 3 monitors in Base Mosaic mode as of driver version 304. More monitors are available with a Quadro card, but with standard consumer cards, it is limited to three. The explanation given for this reduction is "Feature parity with the Windows driver". As of September 2014, Windows has no restriction on the number of monitors, even on the same driver version. This is not a bug, this is entirely by design.
SLI Mosaic

If you have an SLI configuration and each GPU is a Quadro FX 5800, Quadro Fermi or newer then you can use SLI Mosaic mode. It can be enabled from within the nvidia-settings GUI or from the command line with:

$ nvidia-xconfig --sli=Mosaic --metamodes="GPU-0.DFP-0: 1920x1024+0+0, GPU-0.DFP-1: 1920x1024+1920+0, GPU-1.DFP-0: 1920x1024+0+1024, GPU-1.DFP-1: 1920x1024+1920+1024"

Driver persistence

Nvidia has a daemon that is to be run at boot. See the Driver Persistence section of the Nvidia documentation for more details.

To start the persistence daemon at boot, enable the nvidia-persistenced.service. For manual usage see the upstream documentation.
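For example:

# systemctl enable nvidia-persistenced.service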

See also