Talk:NVIDIA/Troubleshooting

Avoid screen tearing

Some common advice for fighting tearing likely turns out badly for most users:

  • ForceFullCompositionPipeline introduces additional input latency and causes new issues with certain applications (see the configuration sketch after this list).
  • Setting triple buffering in xorg.conf doesn't just "enable" triple buffering, it forces it for every application, even one that wants only double buffering (this breaks vsync in Firefox's OpenGL rendering).
  • Setting triple buffering (whether in xorg.conf or in the KDE Plasma config) introduces stuttering even in the most lightweight games, such as Counter-Strike 1.6.
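For reference, a minimal sketch of the xorg.conf configuration this advice is about (the Identifier values and the nvidia-auto-select metamode are illustrative assumptions, not taken from the article):

  Section "Device"
      Identifier "Device0"
      Driver "nvidia"
      # Forces triple buffering for every OpenGL application
      Option "TripleBuffer" "True"
  EndSection

  Section "Screen"
      Identifier "Screen0"
      # Forces the full composition pipeline for the current mode
      Option "metamodes" "nvidia-auto-select +0+0 {ForceFullCompositionPipeline=On}"
  EndSection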

-> The only method that doesn't come with nasty side effects seems to be the environment variable export __GL_YIELD="USLEEP" on KDE Plasma. I have benchmarked several games on a GTX 1070 and there was no real performance impact from it. It wouldn't really matter anyway, since compositing should be turned off for gaming (most games turn it off automatically). Since it can be set up in a Plasma config file, I think we should just link to this: https://wiki.archlinux.org/index.php/KDE#Screen_tearing_with_Nvidia
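For completeness, a minimal sketch of that Plasma setup (the file name is an arbitrary choice; Plasma sources any *.sh script in this directory at login):

  # ~/.config/plasma-workspace/env/nvidia-usleep.sh (file name is arbitrary)
  # Make the NVIDIA OpenGL driver sleep instead of busy-waiting when yielding,
  # which avoids the stutter described above when compositing with KWin.
  export __GL_YIELD="USLEEP"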

Aufkrawall (talk) 20:59, 8 May 2017 (UTC)

I'm not sure if it should be removed entirely, but it definitely demands some investigation. I see a lot of material on the web about ForceCompositionPipeline (no "Full") fixing tearing issues, with no mention of forcing triple buffering. There seem to be a lot of anecdotes from people saying it helps, so the effects may be specific to different setups.
Silverhammermba (talk) 15:02, 10 May 2017 (UTC)
ForceCompositionPipeline definitely fixes the tearing for good, but many users probably just don't notice the drawbacks it brings. For example, the increased latency even affects the mouse cursor, and stuttering issues may occur only occasionally or briefly. If I'm not mistaken, X by design doesn't run with vsync; vsync is supposed to be handled by the compositor. This works out of the box with GNOME's Mutter, but on NVIDIA only with bad performance (and in my experience, none of the tweaks help against that).
The website vsynctester is good for testing. As soon as I set up triple buffering for either the driver, KWin, or both, I am no longer able to get clean vsync in Firefox with OpenGL rendering; there is occasional stutter. There is no stutter if I either run without a compositor and rely solely on Firefox's own OpenGL vsync, or run KWin compositing with export __GL_YIELD="USLEEP" on top of it, which gives a smooth result and window compositing at the same time.
I've tested this in detail with a GTX 1070 and different drivers, and a GTX 780 Ti didn't seem to behave differently.
Aufkrawall (talk) 17:50, 10 May 2017 (UTC)
I cannot confirm that export __GL_YIELD="USLEEP" has "no real performance impact".
I talked to the devs at Feral Interactive, and they pointed out that this setting sends the OpenGL threads to sleep and reduces performance in games such as Deus Ex: Mankind Divided.
I don't know which games you tested, but every game I tested that needs a bit of GPU power runs worse with this setting.
Adequate (talk) 12:58, 11 May 2017 (UTC)
It costs ~5% performance in CPU-bound scenes here.
Hitman, 720p, max details:
  • default: https://abload.de/img/nosleep9haw7.png (~46 fps)
  • export __GL_YIELD="USLEEP": https://abload.de/img/sleepweypk.png (~44 fps)
Unigine Valley, 720p, max details, 1xAA:
  • default: https://abload.de/img/screenshot_20170511_1b7b48.png (97 fps)
  • export __GL_YIELD="USLEEP": https://abload.de/img/screenshot_20170511_1yead4.png (93.7 fps)
That 5% penalty is surely no higher than that of ForceFullCompositionPipeline, and it's by far a better deal than the occasional stutter caused by broken/forced triple buffering.
Aufkrawall (talk) 16:25, 11 May 2017 (UTC)


On my 980 Ti, I didn't notice a performance difference in raw FPS when changing from ForceFullCompositionPipeline to USLEEP (it might have been 1 or 2 fps), but USLEEP gives a much smoother experience. ForceFullCompositionPipeline gave me an odd effect where games felt strangely jerky even at 60 fps (with vsync on, on a 60 Hz display); I'm guessing it's to do with the way the FPS is limited and certain rendered frames are skipped. This is pure conjecture, but it felt like it would render 60 frames in under a second, hit the FPS cap, wait until the next second, then render another 60 frames and wait again. USLEEP gave a much smoother-feeling game while still hitting the FPS target.
I'm afraid I can't give a technical explanation, but USLEEP is a much more pleasant gaming experience. On recent NVIDIA drivers and for gaming, USLEEP was definitely the better option.
—This unsigned comment is by Tom B (talk) 21:25, 2 November 2017‎. Please sign your posts with ~~~~!

Triple buffer

When making this change, TripleBuffering should be enabled and AllowIndirectGLXProtocol should be disabled in the driver configuration as well.
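For context, the quoted advice maps to something like the following Device section (a sketch only; the Identifier is an assumption, and the option names used are the NVIDIA driver's TripleBuffer and AllowIndirectGLXProtocol):

  Section "Device"
      Identifier "Device0"
      Driver "nvidia"
      # Enable (and thereby force) triple buffering
      Option "TripleBuffer" "True"
      # Disable the indirect GLX protocol
      Option "AllowIndirectGLXProtocol" "off"
  EndSection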

Could I get an explanation for this (perhaps an authoritative source)? I can't find one. I'm using ForceCompositionPipeline without TripleBuffering and it works. --Cr0w (talk) 17:46, 17 September 2017 (UTC)

Avoid screen tearing, other disadvantages

I'd like to point out another disadvantage of "ForceFullCompositionPipeline=on": it heavily prolongs the time the driver takes to drop the GPU back to lower clocks after they are boosted by application activity. With this setting, it takes up to one minute to clock down again; with the setting disabled, it takes only about 15 seconds. With the environment variable export __GL_ExperimentalPerfStrategy=1 it takes only 5 seconds.
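A minimal sketch of one way to set this system-wide (using /etc/environment, which takes plain KEY=value lines; setting it per session in a shell profile works too):

  # /etc/environment -- no "export" keyword in this file
  # Lets the driver drop back to lower clocks sooner after GPU load ends
  __GL_ExperimentalPerfStrategy=1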

—This unsigned comment is by Nordlicht (talk) 16:16, 12 December 2019‎. Please sign your posts with ~~~~!

Sluggish slow motion

Switching from i915 to nvidia-lts caused display slow motion for me. The mouse pointer was still fast, but hovering over GNOME buttons took too long to show a response, and clicks took long to register.

This was solved by switching the BIOS to display on the dedicated GPU instead of the integrated one. The i915 driver is still loaded and required, though. Modesetting did not make a difference.

KubaF (talk) 14:55, 24 January 2021 (UTC)

amdvlk and nvidia packages should not coexist

They conflict with each other. This creates problems, especially for Optimus users with an AMD iGPU.

Source: https://www.reddit.com/r/linux_gaming/comments/rp3h9q/comment/iqkzqi7/

Arcsnim (talk) 06:52, 9 May 2023 (UTC)

Perhaps this should be reported as a package bug/request, so that the PKGBUILD#conflicts field can be updated. -- CodingKoopa (talk) 23:14, 15 July 2023 (UTC)
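If someone reports it, the change would presumably be a one-line addition along these lines (a hypothetical sketch; which package should declare the conflict is for the maintainers to decide):

  # Hypothetical excerpt from the nvidia-utils PKGBUILD
  conflicts=('amdvlk')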

Another fix for screen tearing

I'm new to Arch Linux and had trouble getting rid of my screen tearing, but following the instructions to set up nvidia-prime correctly fixed all of my problems. It might be worth mentioning, so that other noobs like me who decide to put Arch on their gaming laptops can make sure their devices are configured correctly. Facemelt (talk) 21:15, 8 October 2023 (UTC)
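For anyone checking the same setup, a quick sanity test (assuming the nvidia-prime package, which provides the prime-run wrapper; glxinfo comes from mesa-utils):

  # Should print the NVIDIA GPU as the renderer if render offload works
  prime-run glxinfo | grep "OpenGL renderer"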

NVIDIA Flags for Black-Screen Avoidance on Wayland

@Andreymal notes that as an NVIDIA user, they don't need those flags. As an NVIDIA user myself (Zotac 3090), I find that this fixes my black-screen and graphics-crashing issues in Plasma 6.0.2 on Wayland. Before I specified those modules and flags, nothing worked. I'll leave it as further work for people here to narrow down which of these flags are _really_ necessary.

I got them from: https://www.maketecheasier.com/wayland-work-with-nvidia-graphics-cards/ Nroth (talk) 20:49, 14 March 2024 (UTC)

The first two flags are already mentioned in Wayland#Requirements. The other two are related to vkbasaltAUR and Hardware video acceleration. — Lahwaacz (talk) 05:20, 15 March 2024 (UTC)
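For readers who land here without the earlier context: assuming the first two flags are the usual nvidia_drm parameters from Wayland#Requirements, they can be set via a modprobe configuration file, e.g.:

  # /etc/modprobe.d/nvidia.conf -- assuming these are the two flags in question;
  # modeset=1 is the well-known Wayland requirement, fbdev=1 needs a recent driver
  options nvidia_drm modeset=1 fbdev=1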