From ArchWiki
Revision as of 21:50, 17 November 2018 by Afader (talk | contribs) (→‎Kernel module parameters: fix link)

This article covers the proprietary NVIDIA graphics card driver. For the open-source driver, see Nouveau. If you have a laptop with hybrid Intel/NVIDIA graphics, see NVIDIA Optimus instead.


Warning: Avoid installing the NVIDIA driver through the package provided from the NVIDIA website. Installation through pacman allows upgrading the driver together with the rest of the system.

These instructions are for those using the stock linux or linux-lts packages. For custom kernel setup, skip to the next subsection.

1. If you do not know what graphics card you have, find out by issuing:

$ lspci -k | grep -A 2 -E "(VGA|3D)"

2. Determine the necessary driver version for your card by finding its code name on the Nouveau wiki's code names page, or by looking the card up in NVIDIA's legacy card list: if your card is not listed there, you can use the latest driver.

3. Install the appropriate driver for your card:

  • For GeForce 600 series cards (except 610, 620, 625, 705, 800A and other low-end rebrands) and newer [NVEx and newer], install the nvidia or nvidia-lts package.

4. For 32-bit application support, also install the equivalent lib32 package from the multilib repository (e.g. lib32-nvidia-utils, lib32-nvidia-390xx-utils or lib32-nvidia-340xx-utils).

5. Reboot. The nvidia package contains a file which blacklists the nouveau module, so rebooting is necessary.
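Regarding step 4: the lib32 packages are only available once the multilib repository is enabled in /etc/pacman.conf. The stock excerpt to uncomment looks like this:

```shell
# /etc/pacman.conf (excerpt): uncomment these two lines, then refresh the
# package databases with 'pacman -Syu' before installing lib32 packages.
[multilib]
Include = /etc/pacman.d/mirrorlist
```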

Once the driver has been installed, continue to #Configuration.

Unsupported drivers

If you have a GeForce 7 series card or older (released in 2006 or earlier), Nvidia no longer provides driver support for your card, and these legacy drivers do not support the current Xorg version. It may therefore be easier to use the Nouveau driver, which supports the old cards with the current Xorg.

However, Nvidia's legacy drivers are still available and might provide better 3D performance/stability if you are willing to downgrade Xorg:

  • For GeForce 6/7 series cards [NV4x and NV6x], install the nvidia-304xx-dkmsAUR[broken link: archived in aur-mirror] package. Last supported Xorg version is 1.19.
  • For GeForce 5 FX series cards [NV30-NV36], install the nvidia-173xx-dkmsAUR package. Last supported Xorg version is 1.15.
  • For GeForce 2/3/4 MX/Ti series cards [NV11, NV17-NV28], install the nvidia-96xx-dkmsAUR package. Last supported Xorg version is 1.12.

Custom kernel

If you are using a custom kernel, compilation of the Nvidia kernel modules can be automated with DKMS.

Install the nvidia-dkms package (or a specific branch such as nvidia-390xx-dkms or nvidia-340xx-dkms). The Nvidia module will be rebuilt after every Nvidia or kernel update thanks to the DKMS pacman hook.

DRM kernel mode setting

nvidia 364.16 adds support for DRM (Direct Rendering Manager) kernel mode setting. To enable this feature, add the nvidia-drm.modeset=1 kernel parameter, and add nvidia, nvidia_modeset, nvidia_uvm and nvidia_drm to initramfs#MODULES.
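As a sketch of the above (assuming the stock linux kernel with mkinitcpio; adjust file locations to your setup), the early-loading configuration might look like this:

```shell
# /etc/mkinitcpio.conf (excerpt): add the NVIDIA modules to the MODULES array
MODULES=(nvidia nvidia_modeset nvidia_uvm nvidia_drm)

# Kernel command line (set in your bootloader's configuration):
#   nvidia-drm.modeset=1
```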

Do not forget to run mkinitcpio every time there is a nvidia driver update. See #Pacman hook to automate these steps.

Warning: Enabling KMS causes GDM and GNOME to default to Wayland, which currently suffers from very poor performance: FS#53284. A workaround is to configure GDM to use Xorg (see GDM#Use Xorg backend) or to use the GNOME on Xorg session instead.
Note: The NVIDIA driver does not provide an fbdev driver, so the kernel's compiled-in vesafb module cannot provide a high-resolution console. However, the kernel's compiled-in efifb module supports a high-resolution console on EFI systems. This method requires GRUB and is described in NVIDIA/Tips and tricks#Fixing terminal resolution.[1][2]

Pacman hook

To avoid the possibility of forgetting to update initramfs after an NVIDIA driver upgrade, you may want to use a pacman hook:

/etc/pacman.d/hooks/nvidia.hook

[Trigger]
Operation=Install
Operation=Upgrade
Operation=Remove
Type=Package
Target=nvidia
Target=linux
# Change the linux part above and in the Exec line if a different kernel is used

[Action]
Description=Update Nvidia module in initcpio
Depends=mkinitcpio
When=PostTransaction
NeedsTargets
Exec=/bin/sh -c 'while read -r trg; do case $trg in linux) exit 0; esac; done; /usr/bin/mkinitcpio -P'

Make sure the Target package set in this hook is the one you've installed in steps above (e.g. nvidia, nvidia-dkms, nvidia-lts or nvidia-ck-something).

Note: The complication in the Exec line above is in order to avoid running mkinitcpio multiple times if both nvidia and linux get updated. In case this doesn't bother you, the Target=linux and NeedsTargets lines may be dropped, and the Exec line may be reduced to simply Exec=/usr/bin/mkinitcpio -P.
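The logic of the Exec line can be illustrated in isolation: pacman feeds the matched target names to the hook on stdin, and the script exits early when linux itself is among them, because the kernel's own hook will regenerate the initramfs anyway. A standalone sketch, with mkinitcpio replaced by an echo so it can be run anywhere:

```shell
# Mimics the hook's Exec line: read matched targets from stdin and skip
# the rebuild when 'linux' is among them.
run_hook() {
    while read -r trg; do
        case $trg in
            linux) echo "skip: kernel hook will rebuild"; return 0 ;;
        esac
    done
    echo "would run: mkinitcpio -P"
}

printf 'nvidia\nlinux\n' | run_hook   # both updated: rebuild is skipped
printf 'nvidia\n' | run_hook          # only nvidia updated: rebuild runs
```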

Hardware accelerated video decoding


At least a video card with second generation PureVideo HD is required for hardware video acceleration using VDPAU.


Accelerated decoding of MPEG-1 and MPEG-2 videos via XvMC are supported on GeForce4, GeForce 5 FX, GeForce 6 and GeForce 7 series cards. See XvMC for details.


Configuration

The proprietary NVIDIA graphics card driver does not need any Xorg server configuration file. You can run a test to see if the Xorg server will function correctly without a configuration file. However, it may be required to create a configuration file (prefer /etc/X11/xorg.conf.d/20-nvidia.conf over /etc/X11/xorg.conf) in order to adjust various settings. This configuration can be generated by the NVIDIA Xorg configuration tool, or it can be created manually. If created manually, it can be a minimal configuration (in the sense that it will only pass the basic options to the Xorg server), or it can include a number of settings that can bypass Xorg's auto-discovered or pre-configured options.

Tip: For more configuration options see NVIDIA/Tips and tricks#Manual configuration and NVIDIA/Troubleshooting section.

Minimal configuration

A basic configuration block in 20-nvidia.conf (or the deprecated xorg.conf) would look like this:

Section "Device"
        Identifier "Nvidia Card"
        Driver "nvidia"
        VendorName "NVIDIA Corporation"
        BoardName "GeForce GTX 1050 Ti"
EndSection

Automatic configuration

The NVIDIA package includes an automatic configuration tool to create an Xorg server configuration file (xorg.conf), which can be run with:

# nvidia-xconfig

This command will auto-detect and create (or edit, if already present) the /etc/X11/xorg.conf configuration according to present hardware.

If there are instances of DRI, ensure they are commented out:

#    Load        "dri"

Double check your /etc/X11/xorg.conf to make sure your default depth, horizontal sync, vertical refresh, and resolutions are acceptable.

NVIDIA Settings

The nvidia-settings tool lets you configure many options using either CLI or GUI. Running nvidia-settings without any options launches the GUI, for CLI options see nvidia-settings(1).

You can run the CLI/GUI as a non-root user and save the settings to ~/.nvidia-settings-rc or save it as xorg.conf by using the option Save to X configuration File for a multi-user environment.

To load the ~/.nvidia-settings-rc for the current user:

$ nvidia-settings --load-config-only

See Autostarting to start this command on every boot.

Note: Xorg may not start or crash on startup after saving nvidia-settings changes. Adjusting or deleting the generated ~/.nvidia-settings-rc and/or Xorg file(s) should recover normal startup.

Multiple GPUs/SLI

If you're planning to use an SLI setup, you should check NVIDIA/Tips and tricks#Enabling SLI as manual configuration changes might be required for a working setup.

Multiple monitors

See Multihead for more general information.

Using NVIDIA Settings

The nvidia-settings tool can configure multiple monitors.

For CLI configuration, first get the CurrentMetaMode by running:

$ nvidia-settings -q CurrentMetaMode
Attribute 'CurrentMetaMode' (hostname:0.0): id=50, switchable=no, source=nv-control :: DPY-1: 2880x1620 @2880x1620 +0+0 {ViewPortIn=2880x1620, ViewPortOut=2880x1620+0+0}

Save everything after the :: to the end of the attribute (in this case: DPY-1: 2880x1620 @2880x1620 +0+0 {ViewPortIn=2880x1620, ViewPortOut=2880x1620+0+0}) and use to reconfigure your displays with nvidia-settings --assign "CurrentMetaMode=your_meta_mode".
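The extraction can also be scripted with POSIX parameter expansion; the sample line below mirrors the query output shown above and is illustrative only:

```shell
# Strip everything up to and including ':: ' from the query output,
# leaving just the MetaMode string to pass back via --assign.
query_output='Attribute '\''CurrentMetaMode'\'' (hostname:0.0): id=50, switchable=no, source=nv-control :: DPY-1: 2880x1620 @2880x1620 +0+0 {ViewPortIn=2880x1620, ViewPortOut=2880x1620+0+0}'
metamode=${query_output#*:: }
echo "$metamode"
# nvidia-settings --assign "CurrentMetaMode=$metamode"
```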

Tip: You can create shell aliases for the different monitor and resolution configurations you use.


If the driver does not properly detect a second monitor, you can force it to do so with ConnectedMonitor.


Section "Monitor"
    Identifier     "Monitor1"
    VendorName     "Panasonic"
    ModelName      "Panasonic MICRON 2100Ex"
    HorizSync       30.0 - 121.0 # this monitor has incorrect EDID, hence Option "UseEDIDFreqs" "false"
    VertRefresh     50.0 - 160.0
    Option         "DPMS"
EndSection

Section "Monitor"
    Identifier     "Monitor2"
    VendorName     "Gateway"
    ModelName      "GatewayVX1120"
    HorizSync       30.0 - 121.0
    VertRefresh     50.0 - 160.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device1"
    Driver         "nvidia"
    Option         "NoLogo"
    Option         "UseEDIDFreqs" "false"
    Option         "ConnectedMonitor" "CRT,CRT"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce 6200 LE"
    BusID          "PCI:3:0:0"
    Screen          0
EndSection

Section "Device"
    Identifier     "Device2"
    Driver         "nvidia"
    Option         "NoLogo"
    Option         "UseEDIDFreqs" "false"
    Option         "ConnectedMonitor" "CRT,CRT"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce 6200 LE"
    BusID          "PCI:3:0:0"
    Screen          1
EndSection

The duplicated device with Screen is how you get X to use two monitors on one card without TwinView. Note that nvidia-settings will strip out any ConnectedMonitor options you have added.


TwinView

If you want only one big screen instead of two, set the TwinView argument to 1. This option should be used if you desire compositing. TwinView only works on a per-card basis: all participating monitors must be connected to the same card.

Option "TwinView" "1"

Example configuration:

Section "ServerLayout"
    Identifier     "TwinLayout"
    Screen         0 "metaScreen" 0 0
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    Option         "Enable" "true"
EndSection

Section "Monitor"
    Identifier     "Monitor1"
    Option         "Enable" "true"
EndSection

Section "Device"
    Identifier     "Card0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"

    #refer to the link below for more information on each of the following options.
    Option         "HorizSync"          "DFP-0: 28-33; DFP-1: 28-33"
    Option         "VertRefresh"        "DFP-0: 43-73; DFP-1: 43-73"
    Option         "MetaModes"          "1920x1080, 1920x1080"
    Option         "ConnectedMonitor"   "DFP-0, DFP-1"
    Option         "MetaModeOrientation" "DFP-1 LeftOf DFP-0"
EndSection

Section "Screen"
    Identifier     "metaScreen"
    Device         "Card0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "TwinView" "True"
    SubSection "Display"
        Modes          "1920x1080"
    EndSubSection
EndSection
Device option information.

If you have multiple cards that are SLI capable, it is possible to run more than one monitor attached to separate cards (for example: two cards in SLI with one monitor attached to each). The "MetaModes" option in conjunction with SLI Mosaic mode enables this. Below is a configuration which works for the aforementioned example and runs GNOME flawlessly.

Section "Device"
        Identifier      "Card A"
        Driver          "nvidia"
        BusID           "PCI:1:00:0"
EndSection

Section "Device"
        Identifier      "Card B"
        Driver          "nvidia"
        BusID           "PCI:2:00:0"
EndSection

Section "Monitor"
        Identifier      "Right Monitor"
EndSection

Section "Monitor"
        Identifier      "Left Monitor"
EndSection

Section "Screen"
        Identifier      "Right Screen"
        Device          "Card A"
        Monitor         "Right Monitor"
        DefaultDepth    24
        Option          "SLI" "Mosaic"
        Option          "Stereo" "0"
        Option          "BaseMosaic" "True"
        Option          "MetaModes" "GPU-0.DFP-0: 1920x1200+4480+0, GPU-1.DFP-0:1920x1200+0+0"
        SubSection      "Display"
                        Depth           24
        EndSubSection
EndSection

Section "Screen"
        Identifier      "Left Screen"
        Device          "Card B"
        Monitor         "Left Monitor"
        DefaultDepth    24
        Option          "SLI" "Mosaic"
        Option          "Stereo" "0"
        Option          "BaseMosaic" "True"
        Option          "MetaModes" "GPU-0.DFP-0: 1920x1200+4480+0, GPU-1.DFP-0:1920x1200+0+0"
        SubSection      "Display"
                        Depth           24
        EndSubSection
EndSection

Section "ServerLayout"
        Identifier      "Default"
        Screen 0        "Right Screen" 0 0
        Option          "Xinerama" "0"
EndSection

Manual CLI configuration with xrandr

Note: The factual accuracy of this section is disputed; it is unclear whether these commands set up the monitors in TwinView mode.

If the above solutions do not work for you, you can use your window manager's autostart implementation together with xorg-xrandr.

Some xrandr examples could be:

xrandr --output DVI-I-0 --auto --primary --left-of DVI-I-1


xrandr --output DVI-I-1 --pos 1440x0 --mode 1440x900 --rate 75.0


  • --output is used to indicate the "monitor" to which the options are set.
  • DVI-I-1 is the name of the second monitor.
  • --pos is the position of the second monitor relative to the first.
  • --mode is the resolution of the second monitor.
  • --rate is the refresh rate (in Hz).
Vertical sync using TwinView

If you are using TwinView and vertical sync (the "Sync to VBlank" option in nvidia-settings), you will notice that only one screen is being properly synced, unless you have two identical monitors. Although nvidia-settings does offer an option to change which screen is being synced (the "Sync to this display device" option), this does not always work. A solution is to add the following environment variables at startup, for example append in /etc/profile:

export __GL_SYNC_TO_VBLANK=1
export __GL_SYNC_DISPLAY_DEVICE=DFP-0
export __VDPAU_NVIDIA_SYNC_DISPLAY_DEVICE=DFP-0

You can change DFP-0 with your preferred screen (DFP-0 is the DVI port and CRT-0 is the VGA port). You can find the identifier for your display from nvidia-settings in the "X Server XVideoSettings" section.

Gaming using TwinView

In case you want to play fullscreen games when using TwinView, you will notice that games recognize the two screens as being one big screen. While this is technically correct (the virtual X screen really is the size of your screens combined), you probably do not want to play on both screens at the same time.

To correct this behavior for SDL, try:

export SDL_VIDEO_FULLSCREEN_HEAD=1

For OpenGL, add the appropriate Metamodes to your xorg.conf in section Device and restart X:

Option "Metamodes" "1680x1050,1680x1050; 1280x1024,1280x1024; 1680x1050,NULL; 1280x1024,NULL;"

Another method that may either work alone or in conjunction with those mentioned above is starting games in a separate X server.

Mosaic mode

Mosaic mode is the only way to use more than 2 monitors across multiple graphics cards with compositing. Your window manager may or may not recognize the distinction between each monitor.

Base Mosaic

Base Mosaic mode works on any set of GeForce 8000 series or higher GPUs. It cannot be enabled from within the nvidia-settings GUI: you must either use the nvidia-xconfig command line program or edit xorg.conf by hand. Metamodes must be specified. The following is an example for four DFPs in a 2x2 configuration, each running at 1920x1024, with two DFPs connected to each of two cards:

$ nvidia-xconfig --base-mosaic --metamodes="GPU-0.DFP-0: 1920x1024+0+0, GPU-0.DFP-1: 1920x1024+1920+0, GPU-1.DFP-0: 1920x1024+0+1024, GPU-1.DFP-1: 1920x1024+1920+1024"
Note: While the documentation lists a 2x2 configuration of monitors, GeForce cards are artificially limited to 3 monitors in Base Mosaic mode. Quadro cards support more than 3 monitors. As of September 2014, the Windows driver has dropped this artificial restriction, but it remains in the Linux driver.
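The offsets in the --metamodes string follow directly from the panel geometry (each display sits at column*width, row*height); a purely illustrative shell sketch of how the 2x2 string above is composed:

```shell
# Build the 2x2 MetaModes string from the panel size; each display is
# placed at (column*width, row*height). Purely illustrative.
w=1920 h=1024
mm="GPU-0.DFP-0: ${w}x${h}+0+0, GPU-0.DFP-1: ${w}x${h}+${w}+0, GPU-1.DFP-0: ${w}x${h}+0+${h}, GPU-1.DFP-1: ${w}x${h}+${w}+${h}"
echo "$mm"
```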
SLI Mosaic

If you have an SLI configuration and each GPU is a Quadro FX 5800, Quadro Fermi or newer then you can use SLI Mosaic mode. It can be enabled from within the nvidia-settings GUI or from the command line with:

$ nvidia-xconfig --sli=Mosaic --metamodes="GPU-0.DFP-0: 1920x1024+0+0, GPU-0.DFP-1: 1920x1024+1920+0, GPU-1.DFP-0: 1920x1024+0+1024, GPU-1.DFP-1: 1920x1024+1920+1024"

Custom TDP Limit

Modern Nvidia graphics cards throttle their frequency to stay within their TDP and temperature limits. To increase performance, it is possible to raise the TDP limit, which will result in higher temperatures and higher power consumption.

The nvidia-smi tool (provided by nvidia-utils) is required.

Example (set the power limit to 160.30 W):

# nvidia-smi -pl 160.30

Set the power limit on boot (without driver persistence), for example with a systemd timer and service (unit file names are illustrative):

/etc/systemd/system/nvidia-tdp.timer

[Unit]
Description=Set nvidia power limit on boot

[Timer]
OnBootSec=5

[Install]
WantedBy=timers.target

/etc/systemd/system/nvidia-tdp.service

[Unit]
Description=Nvidia Powerlimit

[Service]
Type=oneshot
ExecStart=/usr/bin/nvidia-smi -pl 160.30

Then enable the nvidia-tdp.timer unit.

Driver persistence

Nvidia provides a persistence daemon that is intended to be run at boot. See the Driver Persistence section of the Nvidia documentation for more details.

To start the persistence daemon at boot, enable the nvidia-persistenced.service. For manual usage see the upstream documentation.

Kernel module parameters

Some options can be set as kernel module parameters; a full list can be obtained by running modinfo nvidia or by looking at nv-reg.h. See the Gentoo wiki as well.

For example, enabling the following (e.g. in /etc/modprobe.d/nvidia.conf) will turn on kernel mode setting, use PAT, and enable PCI Express 3.0. If your system supports these features, they should improve performance.

options nvidia-drm modeset=1 
options nvidia NVreg_UsePageAttributeTable=1 NVreg_EnablePCIeGen3=1

See also