ATI

Related articles

  • AMD Catalyst
  • AMDGPU
  • Xorg
  • Vulkan

This article covers the radeon open source driver which supports the majority of AMD (previously ATI) GPUs.

Selecting the right driver

Depending on the card you have, find the right driver in Xorg#AMD. This page has instructions for ATI.

If unsure, try this open source driver first; it will suit most needs and is generally less problematic. See the feature matrix to know what is supported and the decoder ring to translate marketing names (e.g. Radeon HD4330) to chip names (e.g. R700).

Installation

Note: If coming from the proprietary Catalyst driver, see AMD Catalyst#Uninstallation first.

Install the mesa package, which provides the DRI driver for 3D acceleration.

  • For 32-bit application support, also install the lib32-mesa package from the multilib repository.
  • For the DDX driver (which provides 2D acceleration in Xorg), install the xf86-video-ati package.

Support for accelerated video decoding is provided by the mesa-vdpau and lib32-mesa-vdpau packages.
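
As a sketch, the packages above can be installed in one go (drop the lib32-* packages if the multilib repository is not enabled):

# pacman -S mesa xf86-video-ati mesa-vdpau lib32-mesa lib32-mesa-vdpau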

Loading

The radeon kernel module should load automatically on system boot.

If it does not:

  • Make sure you do not have nomodeset or vga= as a kernel parameter, since radeon requires KMS.
  • Check that you have not disabled radeon by using any kernel module blacklisting.

Enable early KMS

See Kernel mode setting#Early KMS start.
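
As a sketch of what that page describes, early KMS start usually amounts to adding radeon to the MODULES array of mkinitcpio and regenerating the initramfs (assuming the default mkinitcpio setup):

/etc/mkinitcpio.conf
MODULES=(radeon)

# mkinitcpio -P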

Xorg configuration

Xorg will automatically load the driver and it will use your monitor's EDID to set the native resolution. Configuration is only required for tuning the driver.

If you want manual configuration, create /etc/X11/xorg.conf.d/20-radeon.conf, and add the following:

Section "Device"
    Identifier "Radeon"
    Driver "radeon"
EndSection

Using this section, you can enable features and tweak the driver settings.

Performance tuning

Enabling video acceleration

See Hardware video acceleration.

Graphical tools

  • WattmanGTK — A GTK3 user interface written in Python 3 that lets you view and monitor Radeon performance, fan speeds and power states, and overclock the graphics processor. It uses the AMDGPU kernel driver.
https://github.com/BoukeHaarsma23/WattmanGTK || wattman-gtk-gitAUR
Note: WattmanGTK requires the AMD OverDrive technology, which is enabled under GNU/Linux by setting the amdgpu.ppfeaturemask kernel parameter.
  • radeon-profile — Qt application for displaying info about a Radeon card.
https://github.com/marazmista/radeon-profile || radeon-profile-gitAUR
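
As an illustration of the note above (the value is a commonly used choice, not taken from this page), OverDrive can be exposed by enabling all PowerPlay feature bits:

amdgpu.ppfeaturemask=0xffffffff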

Driver options

The following options apply to /etc/X11/xorg.conf.d/20-radeon.conf.

Please read radeon(4) and RadeonFeature first before applying driver options.

Acceleration architecture; Glamor is available as a 2D acceleration method implemented through OpenGL, and it is the default for R600 (Radeon HD2000 series) and newer graphic cards. Older cards use EXA.

Option "AccelMethod" "glamor"

DRI3 is enabled by default since xf86-video-ati 7.8.0. For older drivers, which use DRI2 by default, switch to DRI3 with the following option:

Option "DRI" "3"

TearFree is a tearing prevention option which prevents tearing by using the hardware page flipping mechanism:

Option "TearFree" "on"

ColorTiling and ColorTiling2D are supposed to be enabled by default. Tiled mode can provide significant performance benefits with 3D applications. It is disabled if the DRM module is too old or if the current display configuration does not support it. KMS ColorTiling2D is only supported on R600 (Radeon HD2000 series) and newer chips:

Option "ColorTiling" "on"
Option "ColorTiling2D" "on"

When using Glamor as acceleration architecture, it is possible to enable the ShadowPrimary option, which enables a so-called "shadow primary" buffer for fast CPU access to pixel data, and separate scanout buffers for each display controller (CRTC). This may improve performance for some 2D workloads, potentially at the expense of other (e.g. 3D, video) workloads. Note that enabling this option currently disables Option "EnablePageFlip":

Option "ShadowPrimary" "on"

EXAVSync is only available when using EXA and can be enabled to avoid tearing by stalling the engine until the display controller has passed the destination region. It reduces tearing at the cost of performance and has been known to cause instability on some chips:

Option "EXAVSync" "yes"

Below is a sample configuration file of /etc/X11/xorg.conf.d/20-radeon.conf:

Section "Device"
	Identifier  "Radeon"
	Driver "radeon"
	Option "AccelMethod" "glamor"
        Option "DRI" "3"
        Option "TearFree" "on"
        Option "ColorTiling" "on"
        Option "ColorTiling2D" "on"
EndSection
Tip: driconfAUR is a tool which allows several settings to be modified: vsync, anisotropic filtering, texture compression, etc. Using this tool it is also possible to "disable Low Impact fallback" needed by some programs (e.g. Google Earth).

Kernel parameters

Tip: You may want to debug the new parameters with systool as stated in Kernel modules#Obtaining information.

Defining the gartsize, if not autodetected, can be done by adding radeon.gartsize=32 as a kernel parameter.

Note: Setting this parameter should not be needed anymore with modern AMD video cards:
[drm] Detected VRAM RAM=2048M, BAR=256M
[drm] radeon: 2048M of VRAM memory ready
[drm] radeon: 2048M of GTT memory ready.
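
As a sketch, assuming GRUB is the boot loader, the parameter is appended to the kernel command line and the configuration regenerated:

/etc/default/grub
GRUB_CMDLINE_LINUX_DEFAULT="quiet radeon.gartsize=32"

# grub-mkconfig -o /boot/grub/grub.cfg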

The changes take effect at the next reboot.

Deactivating PCIe 2.0

Since kernel 3.6, PCI Express 2.0 in radeon is turned on by default.

It may be unstable with some motherboards. It can be deactivated by adding radeon.pcie_gen2=0 as a kernel parameter.

See Phoronix article for more information.

Gallium Heads-Up Display

The radeon driver supports the activation of a heads-up display (HUD) which can draw transparent graphs and text on top of applications that are rendering, such as games. These can show values such as the current frame rate or the CPU load for each CPU core or an average of all of them. The HUD is controlled by the GALLIUM_HUD environment variable, and can be passed the following list of parameters among others:

  • "fps" - displays current frames per second
  • "cpu" - displays the average CPU load
  • "cpu0" - displays the CPU load for the first CPU core
  • "cpu0+cpu1" - displays the CPU load for the first two CPU cores
  • "draw-calls" - displays how many times each material in an object is drawn to the screen
  • "requested-VRAM" - displays how much VRAM is being used on the GPU
  • "pixels-rendered" - displays how many pixels are being displayed

To see a full list of parameters, as well as some notes on operating GALLIUM_HUD, you can also pass the "help" parameter to a simple application such as glxgears and see the corresponding terminal output:

# GALLIUM_HUD="help" glxgears

More information can be found from this mailing list post or this blog post.
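
For example, to overlay the frame rate together with the load of the first CPU core while running glxgears (parameters are comma-separated):

$ GALLIUM_HUD="fps,cpu0" glxgears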

Hybrid graphics/AMD Dynamic Switchable Graphics

This is the technology used on recent laptops equipped with two GPUs: one power-efficient (generally an Intel integrated card) and one more powerful and more power-hungry (generally Radeon or Nvidia). There are two ways to get it to work:

  • If it is not required to run 'GPU-hungry' applications, it is possible to disable the discrete card (see Ubuntu wiki): echo OFF > /sys/kernel/debug/vgaswitcheroo/switch. See the sketch after this list.
  • PRIME: The proper way to use hybrid graphics on Linux, though it still requires a bit of manual intervention from the user.
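
A minimal sketch of the vgaswitcheroo interface used above (assuming debugfs is mounted at /sys/kernel/debug, the default):

# cat /sys/kernel/debug/vgaswitcheroo/switch      # lists both GPUs and shows which one is active
# echo OFF > /sys/kernel/debug/vgaswitcheroo/switch      # powers down the GPU that is currently inactive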

Powersaving

Note: Power management is supported on all chips that include the appropriate power state tables in the vbios (R1xx and newer). "dpm" is only supported on R6xx and newer chips.

With the radeon driver, power saving is disabled by default and has to be enabled manually if desired.

You can choose between three different methods:

  1. dpm (enabled by default since kernel 3.13)
  2. dynpm
  3. profile

See https://www.x.org/wiki/RadeonFeature/#index3h2 for more details.

Dynamic power management

Since kernel 3.13, DPM is enabled by default for lots of AMD Radeon hardware. If you want to disable it, add the parameter radeon.dpm=0 to the kernel parameters.

Tip: DPM works on R6xx gpus, but is not enabled by default in the kernel (only R7xx and up). Setting the radeon.dpm=1 kernel parameter will enable dpm.

Unlike dynpm, the "dpm" method uses hardware on the GPU to dynamically change the clocks and voltage based on GPU load. It also enables clock and power gating.

There are 3 operation modes to choose from:

  • battery lowest power consumption
  • balanced sane default
  • performance highest performance

They can be changed via sysfs

# echo battery > /sys/class/drm/card0/device/power_dpm_state

For testing or debugging purposes, you can force the card to run in a set performance mode:

  • auto default; uses all levels in the power state
  • low enforces the lowest performance level
  • high enforces the highest performance level
# echo low > /sys/class/drm/card0/device/power_dpm_force_performance_level

Commandline Tools

  • radcard - A script to get and set DPM power states and levels

Old methods

Dynamic frequency switching

This method dynamically changes the frequency depending on GPU load, so performance is ramped up when running GPU intensive apps, and ramped down when the GPU is idle. The re-clocking is attempted during vertical blanking periods, but due to the timing of the re-clocking functions, does not always complete in the blanking period, which can lead to flicker in the display. Due to this, dynpm only works when a single head is active.

It can be activated by simply running the following command:

# echo dynpm > /sys/class/drm/card0/device/power_method

Profile-based frequency switching

This method will allow you to select one of the five profiles (described below). Different profiles, for the most part, end up changing the frequency/voltage of the GPU. This method is not as aggressive, but is more stable and flicker free and works with multiple heads active.

To activate the method, run the following command:

# echo profile > /sys/class/drm/card0/device/power_method

Select one of the available profiles:

  • default uses the default clocks and does not change the power state. This is the default behaviour.
  • auto selects between mid and high power states based on whether the system is on battery power or not.
  • low forces the gpu to be in the low power state all the time. Note that low can cause display problems on some laptops, which is why auto only uses low when monitors are off. The other profiles also select low when the monitors are in the DPMS-off state.
  • mid forces the gpu to be in the mid power state all the time.
  • high forces the gpu to be in the high power state all the time.

As an example, we will activate the low profile (replace low with any of the aforementioned profiles as necessary):

# echo low > /sys/class/drm/card0/device/power_profile

Persistent configuration

The methods described above are not persistent. To make them persistent, you may create a udev rule (example for #Profile-based frequency switching):

/etc/udev/rules.d/30-radeon-pm.rules
KERNEL=="dri/card0", SUBSYSTEM=="drm", DRIVERS=="radeon", ATTR{device/power_method}="profile", ATTR{device/power_profile}="low"

As another example, dynamic power management can be permanently forced to a certain performance level:

/etc/udev/rules.d/30-radeon-pm.rules
KERNEL=="dri/card0", SUBSYSTEM=="drm", DRIVERS=="radeon", ATTR{device/power_dpm_force_performance_level}="high"
Note: If the above rules are failing, try removing the dri/ prefix.

Graphical tools

  • Radeon-tray — A small program to control the power profiles of your Radeon card via systray icon. It is written in PyQt4 and is suitable for non-Gnome users.
https://github.com/StuntsPT/Radeon-tray || radeon-trayAUR

Other notes

To view the speed that the GPU is running at, run the following command; you will get output similar to this:

# cat /sys/kernel/debug/dri/0/radeon_pm_info
  state: PM_STATE_ENABLED
  default engine clock: 300000 kHz
  current engine clock: 300720 kHz
  default memory clock: 200000 kHz

The exact output depends on your GPU line, as well as on the radeon driver and kernel versions, so your card may not expose much (or any) voltage regulation at all.

Thermal sensors are implemented via external i2c chips or via the internal thermal sensor (rv6xx-evergreen only). To get the temperature on asics that use i2c chips, you need to load the appropriate hwmon driver for the sensor used on your board (lm63, lm64, etc.). The drm will attempt to load the appropriate hwmon driver. On boards that use the internal thermal sensor, the drm will set up the hwmon interface automatically. When the appropriate driver is loaded, the temperatures can be accessed via lm_sensors tools or via sysfs in /sys/class/hwmon.

Fan Speed

While the power saving features above should handle fan speeds quite well, some cards may still be too noisy in their idle state. In this case, and when your card supports it, you can change the fan speed manually.

Warning:
  • Keep in mind that the following method sets the fan speed to a fixed value, hence it will not adjust with the stress of the GPU, which can lead to overheating under heavy load.
  • Check GPU temperature when applying lower than standard values.

To control the GPU fan, see Fan speed control#AMDGPU sysfs fan control (amdgpu and radeon share the same controls for this).

For persistence, see the example in #Persistent configuration.

If a fixed value is not desired, a custom fan curve can be implemented manually, for example with a script that sets the fan speed depending on the current temperature (read from /sys/class/drm/card0/device/hwmon/hwmon0/temp1_input).
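
A minimal sketch of such a script, assuming the hwmon path above and the standard pwm1/pwm1_enable sysfs interface (the hwmon index and the temperature thresholds are placeholders; verify them for your card):

#!/bin/bash
# Tiny fan-curve sketch: map GPU temperature to a few fixed PWM steps.
hwmon=/sys/class/drm/card0/device/hwmon/hwmon0   # adjust the hwmon index for your card

echo 1 > "$hwmon/pwm1_enable"                    # 1 = manual fan control

while true; do
    temp=$(( $(cat "$hwmon/temp1_input") / 1000 ))   # millidegrees Celsius -> degrees
    if   [ "$temp" -lt 50 ]; then pwm=100            # ~40% of the 0-255 PWM range
    elif [ "$temp" -lt 70 ]; then pwm=170
    else                          pwm=255            # full speed under heavy load
    fi
    echo "$pwm" > "$hwmon/pwm1"
    sleep 5
done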

A GUI solution is available by installing radeon-profile-gitAUR.

TV out

First, check that you have an S-video output: xrandr should give you something like

Screen 0: minimum 320x200, current 1024x768, maximum 1280x1200
...
S-video disconnected (normal left inverted right x axis y axis)

Now we should tell Xorg that it is actually connected (it is, right?)

xrandr --output S-video --set "load detection" 1

Setting TV standard to use:

xrandr --output S-video --set "tv standard" ntsc

Adding a mode for it (currently supports only 800x600):

xrandr --addmode S-video 800x600

Clone mode:

xrandr --output S-video --same-as VGA-0

Now let us try to see what we have:

xrandr --output S-video --mode 800x600

At this point you should see an 800x600 version of your desktop on your TV.

To disable the output, do

xrandr --output S-video --off

Force TV-out in KMS

The kernel can recognize a video= parameter in the following form (see KMS for more details):

video=<conn>:<xres>x<yres>[M][R][-<bpp>][@<refresh>][i][m][eDd]

For example:

video=DVI-I-1:1280x1024-24@60e

Parameters with whitespaces must be quoted:

"video=9-pin DIN-1:1024x768-24@60e"

The current mkinitcpio implementation also requires # in front. For example:

root=/dev/disk/by-uuid/d950a14f-fc0c-451d-b0d4-f95c2adefee3 ro quiet radeon.modeset=1 security=none # video=DVI-I-1:1280x1024-24@60e "video=9-pin DIN-1:1024x768-24@60e"
  • GRUB Legacy can pass such command line as is.
  • LILO needs backslashes for doublequotes (append # \"video=9-pin DIN-1:1024x768-24@60e\")

You can get a list of your video outputs with the following command:

$ ls -1 /sys/class/drm/ | grep -E '^card[[:digit:]]+-' | cut -d- -f2-

HDMI audio

HDMI audio is supported in the xf86-video-ati video driver. To disable HDMI audio add radeon.audio=0 to your kernel parameters.

If there is no video after boot up, this driver option has to be disabled.

Note:
  • If HDMI audio does not work after installing the driver, test your setup with the procedure at Advanced Linux Sound Architecture/Troubleshooting#HDMI Output does not work.
  • If the sound is distorted in PulseAudio, try setting tsched=0 as described in PulseAudio/Troubleshooting#Glitches, skips or crackling and make sure the rtkit daemon is running.
  • Your sound card might use the same module, since HDA compliant hardware is pretty common. Advanced Linux Sound Architecture#Set the default sound card using one of the suggested methods, which include using the defaults node in alsa configuration.

Multihead setup

Using the RandR extension

See Multihead#RandR for how to set up multiple monitors using RandR.
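
For instance (the output names are placeholders; check the output of xrandr for the ones your card exposes), two monitors can be placed side by side with:

$ xrandr --output DVI-0 --auto --output HDMI-0 --auto --right-of DVI-0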

Independent X screens

Independent dual-headed setups can be configured the usual way. However you might want to know that the radeon driver has a "ZaphodHeads" option which allows you to bind a specific device section to an output of your choice:

/etc/X11/xorg.conf.d/20-radeon.conf
Section "Device"
  Identifier "Device0"
  Driver "radeon"
  Option "ZaphodHeads" "VGA-0"
  VendorName "ATI"
  BusID "PCI:1:0:0"
  Screen 0
EndSection

This can be a life-saver when using video cards that have more than two outputs. For instance, a card with one HDMI, one DVI and one VGA output will only select and use the HDMI+DVI outputs for the dual-head setup, unless you explicitly specify "ZaphodHeads" "VGA-0".

Turn vsync off

The radeon driver will probably enable vsync by default, which is perfectly fine except for benchmarking. To turn it off try the vblank_mode=0 environment variable or create ~/.drirc (edit it if it already exists) and add the following:

~/.drirc
<driconf>
    <device screen="0" driver="dri2">
        <application name="Default">
            <option name="vblank_mode" value="0" />
        </application>
    </device>
    <!-- Other devices ... -->
</driconf>
Note: Make sure the driver is dri2, not your video card code (like r600).

If vsync is still enabled, you can disable it by editing /etc/X11/xorg.conf.d/20-radeon.conf. See #Driver options.
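
One option worth trying there is SwapbuffersWait from radeon(4); a sketch of the corresponding entry (not guaranteed to help on every setup):

/etc/X11/xorg.conf.d/20-radeon.conf
Section "Device"
    Identifier "Radeon"
    Driver "radeon"
    Option "SwapbuffersWait" "off"
EndSection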

Troubleshooting

Performance and/or artifacts issues when using EXA

Note: This only applies to cards older than R600 (Radeon X1000 series and older). On newer cards you should use Glamor instead of EXA.

If you have 2D performance issues, like slow scrolling in a terminal or web browser, adding Option "MigrationHeuristic" "greedy" as a device option may solve the issue.

In addition, disabling EXAPixmaps may solve artifact issues, although this is generally not recommended and may cause other issues.

/etc/X11/xorg.conf.d/20-radeon.conf
Section "Device"
    Identifier "Radeon"
    Driver "radeon"
    Option "AccelMethod" "exa"
    Option "MigrationHeuristic" "greedy"
    #Option "EXAPixmaps" "off"
EndSection

Adding undetected/unsupported resolutions

See Xrandr#Adding undetected resolutions.

TV showing a black border around the screen

Note: Make sure the TV has been set up correctly (see its manual) before attempting the following solution.

When connecting a TV using the HDMI port, the TV may show a blurry picture with a 2-3 cm border around it. This is underscan, which protects against overscanning (see Wikipedia:Overscan), but it can be turned off using xrandr:

xrandr --output HDMI-0 --set underscan off

Black screen and no console, but X works in KMS

This is a solution to the no-console problem that might come up when using two or more ATI cards on the same PC (the Fujitsu Siemens Amilo PA 3553 laptop, for example, has this problem). It is caused by the fbcon console driver mapping itself to a framebuffer device on the wrong card. This can be fixed by adding the following to the kernel boot line:

fbcon=map:1

This tells fbcon to map itself to the /dev/fb1 framebuffer device instead of /dev/fb0, which in this case exists on the wrong graphics card. If that does not fix the problem, try booting with

fbcon=map:0

instead.

ATI X1600 (RV530 series) 3D applications show black windows

There are three possible solutions:

  • Try adding pci=nomsi to your boot loader Kernel parameters.
  • If this does not work, you can try adding noapic instead of pci=nomsi.
  • If none of the above work, then you can try running vblank_mode=0 glxgears or vblank_mode=1 glxgears to see which one works for you, then install driconfAUR and set that option in ~/.drirc.

Cursor corruption after coming out of sleep

If the cursor becomes corrupted, e.g. it repeats itself vertically after the monitor(s) come out of sleep, set "SWCursor" "True" in the "Device" section of the /etc/X11/xorg.conf.d/20-radeon.conf configuration file.
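
A sketch of the corresponding entry:

/etc/X11/xorg.conf.d/20-radeon.conf
Section "Device"
    Identifier "Radeon"
    Driver "radeon"
    Option "SWCursor" "True"
EndSection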

DisplayPort stays black on multimonitor mode

Try booting with the kernel parameter radeon.audio=0.

R9-390 Poor Performance and/or Instability

Firmware issues with R9-390 series cards include poor performance and crashes (frequently caused by gaming or using Google Maps), possibly related to DPM. Comment 115 of this bug report includes instructions for a fix.

QHD / UHD / 4k support over HDMI for older Radeon cards

Older cards have their pixel clock limited to 165 MHz for HDMI. Hence, they support QHD or 4k only via dual-link DVI, not over HDMI.

One possibility to work around this is to use custom modes with lower refresh rate, e.g. 30Hz.
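
A sketch of adding such a mode with xrandr (the output name and timings are placeholders; the mode's pixel clock must stay below the 165 MHz limit):

$ cvt 2560 1440 30
$ xrandr --newmode "2560x1440_30" <timings printed by cvt>
$ xrandr --addmode HDMI-0 2560x1440_30
$ xrandr --output HDMI-0 --mode 2560x1440_30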

Another one is a kernel patch removing the pixel clock limit, but this may damage the card!

Official kernel bug ticket with patch for 4.8: https://bugzilla.kernel.org/show_bug.cgi?id=172421

The patch introduces a new kernel parameter radeon.hdmimhz which alters the pixel clock limit.

Be sure to use a high speed HDMI cable for this.

See also

Benchmark showing the open source driver is on par performance-wise with the proprietary driver for many cards.