From ArchWiki


Owners of ATI/AMD video cards have a choice between AMD's proprietary driver (catalystAUR) and the open source driver (xf86-video-ati). This article covers the open source driver.

The open source driver is currently not on par with the proprietary driver in terms of 3D performance on newer cards or reliable TV-out support. It does, however, offer better dual-head support, excellent 2D acceleration, and sufficient 3D acceleration for OpenGL-accelerated window managers such as Compiz or KWin.

If unsure, try the open source driver first; it will suit most needs and is generally less problematic (see the feature matrix for details).

Naming conventions

ATI's Radeon brand follows a naming scheme that relates each product to a market segment. Within this article, readers will see both product names (e.g. HD 4850, X1900) and code or core names (e.g. RV770, R580). Traditionally, a product series will correspond to a core series (e.g. the "X1000" product series includes the X1300, X1600, X1800, and X1900 products which utilize the "R500" core series – including the RV515, RV530, R520, and R580 cores).

For a table of core and product series, see Wikipedia:Comparison of AMD graphics processing units.


The xf86-video-ati (radeon) driver:

  • Works with Radeon chipsets up to HD 6xxx and 7xxxM (latest Northern Islands chipsets).
    • Radeons in the HD 77xx (Southern Islands) series are partially supported. Check the feature matrix for unsupported features.
    • Radeons up to the X1xxx series are fully supported, stable, and full 2D and 3D acceleration are provided.
  • Radeons from HD 2xxx to HD 6xxx have full 2D acceleration and functional 3D acceleration, but do not support all of the features that the proprietary driver provides.
  • Supports DRI1, RandR 1.2/1.3, EXA acceleration and kernel mode-setting/DRI2 (with the latest Linux kernel, libdrm and Mesa versions).

Generally, xf86-video-ati should be your first choice, no matter which ATI card you own. If you need better support for newer ATI cards, consider the proprietary catalyst driver.

Note: xf86-video-ati is specified as radeon for the kernel and in xorg.conf.


If Catalyst/fglrx has been previously installed, see here.

Installing xf86-video-ati

Install xf86-video-ati, available in the official repositories.

The -git version of the driver and other needed packages (linux-git, etc) can be found in the radeon repository or the AUR.


Configuration

Xorg will automatically load the driver, and it will use your monitor's EDID to set the native resolution. Configuration is only required for tuning the driver.

If you want manual configuration, create /etc/X11/xorg.conf.d/20-radeon.conf, and add the following:

Section "Device"
    Identifier "Radeon"
    Driver "radeon"
EndSection

Using this section, you can enable features and tweak the driver settings.

Kernel mode-setting (KMS)

Tip: If you have problems with the resolution, check this page.

KMS enables native resolution in the framebuffer and allows for instant console (tty) switching. KMS also enables newer technologies (such as DRI2) that help reduce artifacts and increase 3D performance, as well as kernel-space power saving.

KMS for ATI video cards requires the free Xorg user-space video driver xf86-video-ati.

Note: KMS is enabled by default for autodetected ATI/AMD cards. This section remains for configurations outside stock.

Early start

The following steps will start KMS as early as possible in the boot process.

1. Remove all conflicting UMS drivers from kernel command line:

  • Remove all vga= options from the kernel line in the bootloader configuration file. Using other framebuffer drivers (such as uvesafb or radeonfb) will conflict with KMS.
  • AGP speed can be set with radeon.agpmode=x kernel option, where x is 1, 2, 4, 8 (AGP speed) or -1 (PCI mode).

2. Then configure the initramfs:

  • If you have a special kernel outside of stock -ARCH (e.g. linux-zen), remember to use a separate mkinitcpio configuration file (e.g. /etc/mkinitcpio-zen.conf) and not /etc/mkinitcpio.conf.
  • Remove any framebuffer related modules from your mkinitcpio file.
  • Add radeon to MODULES array in your mkinitcpio file. For AGP support, it is necessary to add intel_agp (or ali_agp, ati_agp, amd_agp, amd64_agp etc.) before the radeon module.
  • Re-generate your initramfs.
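As a sketch, the mkinitcpio changes above amount to something like the following in /etc/mkinitcpio.conf (intel_agp is shown only as an example for an AGP board with an Intel chipset; PCI-E systems need just radeon):

```
# /etc/mkinitcpio.conf (excerpt)
MODULES="intel_agp radeon"
```

Afterwards re-generate the initramfs, e.g. with mkinitcpio -p linux for the stock kernel (adjust the preset for other kernels).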

Finally, reboot the system.

Late start

With this choice, KMS will be enabled when modules are loaded during the boot process.

If you have a special kernel (e.g. linux-zen), remember to use appropriate mkinitcpio configuration file, e.g. /etc/mkinitcpio-zen.conf. These instructions are written for the default kernel (linux).

Note: For AGP support, it may be necessary to add intel_agp, ali_agp, ati_agp, amd_agp, or amd64_agp to the appropriate .conf file in /etc/modules-load.d.
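For the stock kernel, this can be done with a modules-load.d fragment along these lines (intel_agp is only an assumed example; use the agp driver matching your chipset, or omit that line entirely for PCI-E cards):

```
# /etc/modules-load.d/radeon.conf
intel_agp
radeon
```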
  1. Remove all vga= options from the kernel line in the bootloader configuration file. Using other framebuffer drivers (such as uvesafb or radeonfb) will conflict with KMS. Remove any framebuffer related modules from /etc/mkinitcpio.conf. video= can now be used in conjunction with KMS.
  2. Reboot the system.

Performance tuning

The following options apply to /etc/X11/xorg.conf.d/20-radeon.conf.

ColorTiling is completely safe to enable and is reportedly enabled by default. Most users will notice increased performance, but it is not yet supported on R200 and earlier cards. It can still be enabled on those cards, but the workload is transferred to the CPU.

Option "ColorTiling" "on"

Acceleration architecture; this will work only on newer cards. If you enable this and then cannot get back into X, remove it.

Option "AccelMethod" "EXA"

Page flip is generally safe to enable. It was mostly used on older cards, since enabling it used to disable EXA; with recent drivers it can be used together with EXA.

Option "EnablePageFlip" "on"

The EXAVSync option attempts to avoid tearing by stalling the engine until the display controller has passed the destination region. It reduces tearing at the cost of performance and has been known to cause instability on some chips. It is most useful when enabling the Xv overlay for video on a 3D accelerated desktop, and it is not necessary when KMS (and thus DRI2 acceleration) is enabled.

Option "EXAVSync" "yes"

Below is a sample config file /etc/X11/xorg.conf.d/20-radeon.conf:

Section "Device"
	Identifier  "My Graphics Card"
	Driver	"radeon"
	Option	"SWcursor"              "off" #software cursor might be necessary on some rare occasions, hence set off by default
	Option	"EnablePageFlip"        "on"  #supported on all R/RV/RS4xx and older hardware, and set on by default
	Option	"AccelMethod"           "EXA" #valid options are XAA, EXA and Glamor. EXA is the default
	Option	"RenderAccel"           "on"  #enabled by default on all radeon hardware
	Option	"ColorTiling"           "on"  #enabled by default on RV300 and later radeon cards
	Option	"EXAVSync"              "off" #default is off, otherwise on. Only works if EXA is activated
	Option	"EXAPixmaps"            "on"  #when on, increases 2D performance, but may also cause artifacts on some old cards. Only works if EXA is activated
	Option	"AccelDFS"              "on"  #default is off, read the radeon manpage for more information
EndSection

Defining the gartsize, if not autodetected, can be done by adding radeon.gartsize=32 into kernel parameters. Size is in megabytes and 32 is for RV280 cards.

Alternatively, do it with a modprobe option in /etc/modprobe.d/radeon.conf:

options radeon gartsize=32

For further information and other options, read the radeon manpage and the module's info page: man radeon, modinfo radeon.

A useful tool to try is driconf. It allows you to modify several settings, such as vsync, anisotropic filtering and texture compression. With this tool it is also possible to disable the "Low Impact fallback", which is needed by some programs (e.g. Google Earth).

Deactivating PCI-E 2.0

Since kernel 3.6, PCI-E 2.0 support in radeon is turned on by default.

It can be unstable with some motherboards and can be deactivated by adding radeon.pcie_gen2=0 to the kernel command line.

See Phoronix article for more information.


Glamor

Glamor is a 2D acceleration method implemented through OpenGL; it should work with R300 and newer graphics cards.

Since xf86-video-ati 1:7.2.0-1, glamor is automatically enabled with the radeonsi driver (Southern Islands and newer cards); with other graphics cards you can use it by adding AccelMethod "glamor" to the Device section of your xorg.conf config file:

 Option	"AccelMethod"           "glamor"

However, you also need to add the following section:

Section "Module"
	Load "dri2"
	Load "glamoregl"
EndSection
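Putting the two pieces together, a minimal /etc/X11/xorg.conf.d/20-radeon.conf enabling glamor might look like this (a sketch; the Identifier string is arbitrary):

```
Section "Module"
	Load "dri2"
	Load "glamoregl"
EndSection

Section "Device"
	Identifier "Radeon"
	Driver "radeon"
	Option "AccelMethod" "glamor"
EndSection
```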

Hybrid graphics/AMD Dynamic Switchable Graphics

This is the technology used on recent laptops equipped with two GPUs: one power-efficient (generally an Intel integrated card) and one more powerful but more power-hungry (generally a Radeon or Nvidia card). There are three ways to get it to work:

  • If you do not need to run any GPU-hungry application, you can simply disable the discrete card: echo OFF > /sys/kernel/debug/vgaswitcheroo/switch. You can do more things with vgaswitcheroo (see the Ubuntu wiki for more information), but ultimately each card is bound to at most one graphics session; you cannot use both in a single session.
  • You can use PRIME. It is the proper way to use hybrid graphics on Linux but still requires a bit of manual intervention from the user.
  • You can also use bumblebee with radeon, there is a bumblebee-amd-gitAUR package on AUR.
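The vgaswitcheroo write described above can be sketched as a small shell function, wrapped so that the target file can be overridden (e.g. for testing); the real control file is /sys/kernel/debug/vgaswitcheroo/switch, and writing to it requires root and a mounted debugfs:

```shell
# Write a command to a vgaswitcheroo switch file.
vgaswitcheroo_write() {
    # $1: command (OFF, ON, IGD or DIS); $2: path to the switch file
    printf '%s\n' "$1" > "$2"
}

# Usage (as root):
#   vgaswitcheroo_write OFF /sys/kernel/debug/vgaswitcheroo/switch
```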


Power saving

With kernels prior to 3.11

With the radeon driver, power saving is disabled by default but the kernel provides a method to enable it using sysfs.

You can choose between two different methods; it is hard to say which is best, so try them yourself.

Dynamic frequency switching

This method dynamically changes the frequency depending on GPU load, so performance is ramped up when running GPU intensive apps, and ramped down when the GPU is idle. The re-clocking is attempted during vertical blanking periods, but due to the timing of the re-clocking functions, does not always complete in the blanking period, which can lead to flicker in the display. Due to this, dynpm only works when a single head is active.

It can be activated by simply running the following command:

# echo dynpm > /sys/class/drm/card0/device/power_method

Profile-based frequency switching

This method will allow you to select one of the five profiles (described below). Different profiles, for the most part, end up changing the frequency/voltage of the GPU. This method is not as aggressive, but is more stable and flicker free and works with multiple heads active.

To activate the method, run the following command:

# echo profile > /sys/class/drm/card0/device/power_method

Select one of the available profiles:

  • default uses the default clocks and does not change the power state. This is the default behaviour.
  • auto selects between mid and high power states based on whether the system is on battery power or not. The low power state is selected when the monitors are in the DPMS-off state.
  • low forces the gpu to be in the low power state all the time. Note that low can cause display problems on some laptops, which is why auto only uses low when monitors are off.
  • mid forces the gpu to be in the mid power state all the time. The low power state is selected when the monitors are in the DPMS-off state.
  • high forces the gpu to be in the high power state all the time. The low power state is selected when the monitors are in the DPMS-off state.

As an example, we will activate the low profile (replace low with any of the aforementioned profiles as necessary):

# echo low > /sys/class/drm/card0/device/power_profile

Persistent configuration

The activation described above is not persistent; it will not survive a reboot. To make it persistent, you can use systemd-tmpfiles, e.g. in /etc/tmpfiles.d/radeon-pm.conf (example for #Dynamic frequency switching):

w /sys/class/drm/card0/device/power_method - - - - dynpm

Alternatively, you may use this udev rule instead (example for #Profile-based frequency switching):

KERNEL=="dri/card0", SUBSYSTEM=="drm", DRIVERS=="radeon", ATTR{device/power_method}="profile", ATTR{device/power_profile}="low"
Note: If the above rule is failing, try removing the dri/ prefix.

Graphical tools

  • Radeon-tray — A small program to control the power profiles of your Radeon card via a systray icon. It is written in PyQt4 and is suitable for non-GNOME users.
  • power-play-switcher — A GUI for changing the powerplay setting of the open source driver for ATI Radeon video cards. (power-play-switcherAUR)
  • gnome-shell-extension-radeon-power-profile-manager — A small GNOME Shell extension that allows you to change the power profile of your Radeon card when using the open source drivers. (gnome-shell-extension-radeon-ppmAUR, gnome-shell-extension-radeon-power-profile-manager-gitAUR)

Other notes

Power management is supported on all ASICs (r1xx–evergreen) that include the appropriate power state tables in the video BIOS; not all boards do (especially older desktop cards).

To view the speed that the GPU is running at, run the following command; the output will look something like this:

$ cat /sys/kernel/debug/dri/0/radeon_pm_info
  default engine clock: 300000 kHz
  current engine clock: 300720 kHz
  default memory clock: 200000 kHz

If /sys/kernel/debug is empty, run this command:

# mount -t debugfs none /sys/kernel/debug

To permanently mount, add the following line to /etc/fstab:

debugfs   /sys/kernel/debug   debugfs   defaults   0   0

Whether voltage regulation is available depends on the GPU line, along with the radeon driver and kernel versions, so some cards may have little or no voltage regulation at all.

Thermal sensors are implemented via external i2c chips or via the internal thermal sensor (rv6xx–evergreen only). To get the temperature on ASICs that use i2c chips, you need to load the appropriate hwmon driver for the sensor used on your board (lm63, lm64, etc.); the drm will attempt to load it automatically. On boards that use the internal thermal sensor, the drm will set up the hwmon interface automatically. When the appropriate driver is loaded, the temperatures can be accessed via the lm_sensors tools or via sysfs in /sys/class/hwmon.

TV out

Note: This article or section is out of date.

Since August 2007, there is TV-out support for all Radeons with integrated TV-out.

It is somewhat limited for now: it does not always autodetect the output correctly, and only NTSC mode works.

First, check that you have an S-video output: xrandr should give you something like

Screen 0: minimum 320x200, current 1024x768, maximum 1280x1200
S-video disconnected (normal left inverted right x axis y axis)

Setting tv standard to use:

xrandr --output S-video --set "tv standard" ntsc

Adding a mode for it (currently it supports only 800x600):

xrandr --addmode S-video 800x600

Setting up a clone mode:

xrandr --output S-video --same-as VGA-0

So far so good. Now let us try to see what we have:

xrandr --output S-video --mode 800x600

At this point you should see a 800x600 version of your desktop on your TV.

To disable the output, do

xrandr --output S-video --off

You may also notice that the video is played on the monitor only and not on the TV. Where the Xv overlay is sent is controlled by the XV_CRTC attribute.

To send the output to the TV, run:

xvattr -a XV_CRTC -v 1
Note: you need to install xvattrAUR to execute this command.

To switch back to the monitor, change this to 0. -1 is used for automatic switching in dual-head setups.

Please see Enabling TV-Out Statically for how to enable TV-out in your xorg configuration file.

Force TV-out in KMS

The kernel can recognize a video= parameter in the following form (see KMS for more details):

video=<connector>:<xres>x<yres>[-<bpp>][@<refresh>][e|d]

where a trailing e forces the output to be enabled and d disables it.
For example:

video=DVI-I-1:1280x1024-24@60e
Parameters with whitespaces must be quoted:

"video=9-pin DIN-1:1024x768-24@60e"

Current mkinitcpio implementation also requires # in front. For example:

root=/dev/disk/by-uuid/d950a14f-fc0c-451d-b0d4-f95c2adefee3 ro quiet radeon.modeset=1 security=none # video=DVI-I-1:1280x1024-24@60e "video=9-pin DIN-1:1024x768-24@60e"
  • Grub can pass such command line as is.
  • Lilo needs backslashes for doublequotes (append # \"video=9-pin DIN-1:1024x768-24@60e\")
  • Grub2: TODO

You can get a list of your video outputs with the following command:

$ ls -1 /sys/class/drm/ | grep -E '^card[[:digit:]]+-' | cut -d- -f2-
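To illustrate what the pipeline does, here it is run against a hypothetical sample of /sys/class/drm entries (the names are made up for the example; non-connector entries such as card0 and version are filtered out):

```shell
# Feed sample directory names through the same grep/cut pipeline as above.
printf '%s\n' card0 version card0-DVI-I-1 'card0-9-pin DIN-1' \
    | grep -E '^card[[:digit:]]+-' | cut -d- -f2-
# prints:
#   DVI-I-1
#   9-pin DIN-1
```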

HDMI audio

HDMI audio is supported by the xf86-video-ati video driver. By default, the necessary support is disabled in kernel versions >= 3.0. However, if your Radeon card is listed in the Radeon Feature Matrix, you can add radeon.audio=1 to your kernel parameters. For example:

LABEL arch
    MENU LABEL Arch Linux
    LINUX ../vmlinuz-linux
    APPEND root=/dev/sda1 ro radeon.audio=1
    INITRD ../initramfs-linux.img

If HDMI audio does not work after installing the driver, test your setup with the procedure at Advanced_Linux_Sound_Architecture#HDMI_Output_Does_Not_Work.

Note: As of this writing (2013-05-20), drivers for the Southern Islands cards don't support HDMI Audio.

Dual Head setup

Independent X screens

Independent dual-head setups can be configured the usual way. However, you might want to know that the radeon driver has a "ZaphodHeads" option, which allows you to bind a specific Device section to an output of your choice, for instance:

       Section "Device"
       Identifier     "Device0"
       Driver         "radeon"
       Option         "ZaphodHeads"   "VGA-0"
       VendorName     "ATI"
       BusID          "PCI:1:0:0"
       Screen          0
       EndSection

This can be a life-saver, because some cards with more than two outputs (for instance one HDMI, one DVI, one VGA) will only select and use the HDMI+DVI outputs for a dual-head setup unless you explicitly specify "ZaphodHeads" "VGA-0".

Moreover, this option allows you to easily select the screen you want to mark as primary.

Enabling video acceleration

Recent mesa packages added support for MPEG1/2 decoding to the free drivers, exported via libvdpau; it is detected automatically.

You can force the driver used by setting the environment variable LIBVA_DRIVER_NAME to vdpau and VDPAU_DRIVER to the name of the driver core, e.g.:

export LIBVA_DRIVER_NAME=vdpau
export VDPAU_DRIVER=r600

for r600-based cards (all available VDPAU drivers are in /usr/lib/vdpau/).

Turn vsync off

The radeon driver will enable vsync by default, which is perfectly fine except for benchmarking. To turn it off, create ~/.drirc (or edit it if it already exists) and add the following section:

<driconf>
    <device screen="0" driver="dri2">
        <application name="Default">
            <option name="vblank_mode" value="0" />
        </application>
    </device>
    <!-- Other devices ... -->
</driconf>

Note that the driver here is effectively dri2, not your video card's code name (like r600).


Artifacts upon logging in

If encountering artifacts, first try starting X without /etc/X11/xorg.conf. Recent versions of Xorg are capable of reliable auto-detection and auto-configuration for most use cases. Outdated or improperly configured xorg.conf files are known to cause trouble.

In order to run without a configuration file, it is recommended that the xorg-input-drivers package group be installed.

Artifacts may also be related to kernel mode setting. Consider disabling KMS.

You may as well try disabling EXAPixmaps in /etc/X11/xorg.conf.d/20-radeon.conf:

Section "Device"
    Identifier "Radeon"
    Driver "radeon"
    Option "EXAPixmaps" "off"
EndSection

Further tweaking could be done by disabling AccelDFS:

Option "AccelDFS" "off"

Adding undetected resolutions

e.g. when EDID fails on a DisplayPort connection.

This issue is covered on the Xrandr page.

Slow performance with open-source drivers

Note: Make sure you are member of video group.

Some cards may try to use KMS by default. You can check whether this is the case by running:

dmesg | egrep "drm|radeon"

This command might show something like this, meaning it is trying to default to KMS:

[drm] radeon default to kernel modesetting.
[drm:radeon_driver_load_kms] *ERROR* Failed to initialize radeon, disabling IOCTL

If your card is not supported by KMS (anything older than r100), then you can disable KMS. This should fix the problem.
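KMS is usually disabled with the radeon.modeset=0 kernel parameter (nomodeset also works, but disables KMS for all drivers). An illustrative kernel line (the root device here is only an example):

```
root=/dev/sda1 ro radeon.modeset=0
```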

AGP is disabled (with KMS)

If you experience poor performance and dmesg shows something like this

[drm:radeon_agp_init] *ERROR* Unable to acquire AGP: -19

then check if the agp driver for your motherboard (e.g., via_agp, intel_agp etc.) is loaded before the radeon module, see Enabling KMS.

TV showing a black border around the screen

When connecting a TV to a Radeon HD 5770 using the HDMI port, the TV may show a blurry picture with a 2-3 cm border around it. This does not happen with the proprietary driver. This protection against overscanning (see Wikipedia:Overscan) can be turned off using xrandr:

xrandr --output HDMI-0 --set underscan off

Black screen with mouse cursor on resume from suspend in X

Waking from suspend on cards with 32MB or less can result in a black screen with a mouse pointer in X. Some parts of the screen may be redrawn when under the mouse cursor. Forcing EXAPixmaps to "enabled" in /etc/X11/xorg.conf.d/20-radeon.conf may fix the problem. See performance tuning for more information.

No desktop effects in KDE4 with X1300 and Radeon driver

A bug in KDE4 may prevent an accurate video hardware check, thereby deactivating desktop effects despite the X1300 having more than sufficient GPU power. A workaround may be to manually override such checks in KDE4 configuration files /usr/share/kde-settings/kde-profile/default/share/config/kwinrc and/or .kde/share/config/kwinrc.



Add the necessary override to the [Compositing] section of those files, and ensure that compositing is enabled with:


Black screen and no console, but X works in KMS

This is a solution to the no-console problem that may come up when using two or more ATI cards on the same PC (the Fujitsu Siemens Amilo PA 3553 laptop, for example, has this problem). It is caused by the fbcon console driver mapping itself to a framebuffer device that exists on the wrong card. This can be fixed by adding the following to the kernel boot line:

fbcon=map:1
This tells fbcon to map itself to the /dev/fb1 framebuffer device instead of /dev/fb0, which in this case exists on the wrong graphics card.

Some 3D applications show textures as all black or crash

You might need texture compression support, which is not included with the open source driver. Install libtxc_dxtn (and lib32-libtxc_dxtn for multilib systems).

2D performance (e.g. scrolling) is slow

If you have problems with 2D performance, such as scrolling in a terminal or browser, you might need to add Option "MigrationHeuristic" "greedy" to the "Device" section of your xorg.conf file.

Below is a sample config file /etc/X11/xorg.conf.d/20-radeon.conf:

Section "Device"
        Identifier  "My Graphics Card"
        Driver  "radeon"
        Option  "MigrationHeuristic"  "greedy"
EndSection

ATI X1600 (RV530 series): 3D applications show black windows

There are three possible solutions:

  • Try adding pci=nomsi to your boot loader's Kernel parameters.
  • If this does not work, try adding noapic instead of pci=nomsi.
  • If none of the above works, try running vblank_mode=0 glxgears or vblank_mode=1 glxgears to see which one works for you, then install driconf via pacman and set that option in ~/.drirc.
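For the last option, the resulting ~/.drirc would contain a section along these lines (a sketch, mirroring the #Turn vsync off section; use the vblank_mode value that worked in the glxgears test):

```
<driconf>
    <device screen="0" driver="dri2">
        <application name="Default">
            <option name="vblank_mode" value="0" />
        </application>
    </device>
</driconf>
```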

Vertical colored stripes on chipset RS482 (Xpress 200M Series) with/out KMS

The bug: with the Xpress 200M Series chipset (Radeon Xpress 1150), booting with KMS sometimes gives a screen with many vertical colored stripes as soon as Xorg starts; you cannot use Alt+SysRq+K or do anything else. See FS#21918 for more information.

The fix: disable DRI (there is no need to disable KMS).

Side effect: with DRI disabled and no kernel options (no nomodeset), the vertical stripes still appear at boot for about 5 seconds before KDM is displayed; the behavior is otherwise the same. Starting KDE desktop effects, for example, shows the vertical stripes again for about 5 seconds before returning to KDM.