Talk:Kernel mode setting


Modesetting Driver Flips Off Atomic By Default

Users switching from xf86-video-intel to the modesetting driver may experience random Xorg crashes because of this. See the Phoronix article [1] -- cirrus (talk) 19:43, 7 November 2019 (GMT)

Thank you for the reference, I can confirm the crashes. Is there a way to work around them? (Apart from switching back to xf86-video-intel.) -- Pogojotz (talk) 12:29, 21 November 2019 (UTC)

Not that I'm aware of, sorry, though you could try Option "AccelMethod" "none" -- cirrus (talk) 15:23, 29 November 2019 (GMT)
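For reference, a minimal sketch of how that option could be set for the modesetting driver; the file name and Identifier are examples, not required values:

```
# /etc/X11/xorg.conf.d/20-modesetting.conf  (file name is an example)
Section "Device"
    Identifier "Intel Graphics"
    Driver     "modesetting"
    # Disable acceleration entirely as a crash workaround;
    # the default AccelMethod for this driver is "glamor".
    Option     "AccelMethod" "none"
EndSection
```

Note that disabling acceleration trades stability for performance, so it is only a workaround, not a fix.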

I had that setting all along. I am back now to using xf86-video-intel -- Pogojotz (talk) 12:39, 4 December 2019 (UTC)

xorg-server 1.16

Wrt [2], AFAIK you also need root rights if you're not using KMS. -- Karol (talk) 17:58, 2 August 2014 (UTC)

Early KMS start for GMA500 chip set

My computer is a Poulsbo. intel_agp and i915 did not work for me. I finally solved the problem by adding MODULES="gma500_gfx" to mkinitcpio.conf. Shall we mention this on this page? Haochen (talk) 19:33, 26 February 2015 (UTC)
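For anyone finding this later: current mkinitcpio uses array syntax rather than the quoted string above, so an early-KMS sketch for this chipset would look like this (assuming the gma500_gfx module name from the comment above):

```
# /etc/mkinitcpio.conf — load the GMA500 KMS driver in early userspace
MODULES=(gma500_gfx)
```

After editing, regenerate the initramfs, e.g. with mkinitcpio -P.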

My fonts are too tiny

This is an issue for HiDPI laptops. The solution for changing the TTY font is very vague, and the link "See Fonts#Changing the default font" is broken. Can someone clarify how to do this the right way? Jadelord (talk) 10:34, 11 February 2016 (UTC)

I think the link should be to Fonts#Console fonts. So for instance you could install terminus-font and run setfont ter-c32n. That should make the fonts a lot larger. See Fonts#Console fonts for how to list all available fonts and how to make it permanent. Lonaowna (talk) 11:19, 11 February 2016 (UTC)
Thanks! I managed to change it permanently by adding the consolefont hook in mkinitcpio.conf. The setfont command was not working for me, since I was trying it in a terminal emulator instead of on a TTY!
Good that you managed to change it! In a terminal emulator it should be really simple. For instance, in GNOME Terminal you can simply go to Edit -> Profile Preferences -> Custom font. But of course this is different for each one. Lonaowna (talk) 12:58, 11 February 2016 (UTC)
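To make the permanent setup from this thread concrete, here is a sketch combining the font suggested above with the consolefont hook; the HOOKS line is only an illustrative example and must be merged into your existing one:

```
# /etc/vconsole.conf — large Terminus console font (requires terminus-font)
FONT=ter-c32n

# /etc/mkinitcpio.conf — add consolefont so the font is applied early
# (example HOOKS line; keep your existing hooks and just add consolefont)
HOOKS=(base udev autodetect modconf block consolefont filesystems keyboard fsck)
```

Regenerate the initramfs afterwards for the hook to take effect.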

Is the advice to always disable "video=..." really a good one?

The advice:

At first, note that for any method you use, you should always disable:
 Any vga= options in your bootloader as these will conflict with the native resolution enabled by KMS.
 Any video= lines that enable a framebuffer that conflicts with the driver.

got me into trouble when I recently installed Arch for the first time. My monitor (a 4k TV) simply displays "Invalid Format" if I do not use any "video=" statement to set a compatible mode, and I wondered whether "that enable a framebuffer that conflicts with the driver" was a statement that was valid for the amdgpu KMS driver that I use.

Is there any more detailed information available on what "video=..." parameters could cause conflicts? --Lvml (talk) 19:13, 18 March 2017 (UTC)

I've been using video= virtually always for well over a decade and have discovered only one reason not to use it: with the xf86-video-intel DDX, its specification will be inherited by Xorg as a default, which may not be the intended result. I generally use vga= as well. It is applied only until KMS engages, so its life is brief, but it may enable init screen output earlier than otherwise. If KMS is broken and vga= is not applied, screen output on the ttys may be either pure black or 80x25 text mode. Mrmazda (talk) 19:12, 14 February 2021 (UTC)
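For readers unfamiliar with the syntax discussed in this thread, a sketch of the two kinds of parameters; the connector name and mode are assumptions for illustration (check your own connectors under /sys/class/drm/):

```
# Kernel command line examples (values are illustrative):

# Force a specific mode on one connector; KMS keeps this after it takes over.
# Syntax: video=<connector>:<width>x<height>[@<refresh>]
video=HDMI-A-1:3840x2160@30

# Legacy VESA framebuffer mode, used only until KMS engages.
# 0x317 selects 1024x768 at 16 bpp.
vga=0x317
```

This is the kind of video= line that can legitimately be needed for displays with missing or broken EDID, as in the 4k TV case above.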


> At first, note that for any method you use, you should always disable:
>    Any vga= options in your bootloader as these will conflict with the native resolution enabled by KMS.
>    Any video= lines that enable a framebuffer that conflicts with the driver.
>    Any other framebuffer drivers (such as uvesafb).

Does "disable" here mean disable in my kernel config, or just in the GRUB boot parameters? It's very common to have framebuffer drivers built into the kernel. Does that conflict with KMS?

--Njn (talk) 00:56, 29 September 2017 (UTC)

It's just about removing the parameters from the bootloader configuration etc. -- Lahwaacz (talk) 06:32, 29 September 2017 (UTC)

Calculating GTF VESA Modelines for xorg.conf using gtf & xvidtune

Why is there no discussion on this page about using gtf to create modelines for xorg.conf from the manufacturer's monitor documentation (e.g. from the published [h][w][hsync][vsync] information)? It is provided with the xorg-server package for that purpose. For older monitors that do not provide EDID (or correct EDID) information, that is the only way to configure the display correctly. There are still many high-resolution/high-frequency CRTs working away, and in many cases there is no EDID reporting.

Arch provides both gtf (in xorg-server) and xvidtune (in the xorg-xvidtune package), which are used to precisely tune the graphics mode for older monitors. It would be nice to include a discussion in this article (or at least on the Xorg page).
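As a sketch of the workflow being proposed (the Identifier is an example, and the Modeline placeholder must be replaced with the actual gtf output):

```
# Generate a GTF modeline from resolution and refresh rate:
#   gtf <hres> <vres> <refresh>
$ gtf 1024 768 85

# Copy the Modeline it prints into the Monitor section of xorg.conf:
Section "Monitor"
    Identifier "CRT0"
    # Modeline "1024x768_85.00" ...   <- paste the gtf output here
EndSection
```

xvidtune can then be used interactively to fine-tune the resulting mode.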

David C. Rankin, J.D.,P.E. -- Rankin Law Firm, PLLC (talk) 21:17, 2 March 2018 (UTC)

The ArchWiki is a collaborative effort. There probably is no section on this topic because no one who had the necessary knowledge has put in the effort. Please feel free to add whatever you see fit. -- Edh (talk) 22:30, 2 March 2018 (UTC)

Disabling KMS

Along with the nomodeset kernel parameter, for an Intel graphics card you need to add i915.modeset=0,
and for an Nvidia graphics card you need to add nouveau.modeset=0.
For an Nvidia Optimus dual-graphics system, you need to add all three kernel parameters (i.e. "nomodeset i915.modeset=0 nouveau.modeset=0").
nomodeset disables KMS for any and every GPU!
amdgpu.modeset=0 disables only AMD APUs/GPUs.
i915.modeset=0 disables only Intel IGPs.
nouveau.modeset=0 disables only NVidia GPUs.
radeon.modeset=0 disables only Radeon GPUs/IGPs.
Using nomodeset simultaneously with *.modeset=0 is redundant. Mrmazda (talk) 19:12, 14 February 2021 (UTC)
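To illustrate how the driver-specific parameters above would be applied persistently, a sketch for GRUB (the existing "quiet" entry is an example of options you may already have):

```
# /etc/default/grub — disable KMS only for the Intel IGP, per the list above
GRUB_CMDLINE_LINUX_DEFAULT="quiet i915.modeset=0"
```

After editing, regenerate the GRUB configuration, e.g. with grub-mkconfig -o /boot/grub/grub.cfg. Other bootloaders take the same parameters on their kernel command line.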