Talk:PRIME


PRIME GPU OFFLOADING

The section "PRIME GPU OFFLOADING" is, in my opinion, a collection of solutions outdating each other. I came from Bumblebee and optirun, which stopped working with nvidia 440.31, and tried my luck with the approach here. The hints linked in the Notes of this section were already outdated, and I found help in the README of the current NVIDIA driver version: https://download.nvidia.com/XFree86/Linux-x86_64/440.31/README/primerenderoffload.html

How about either linking to that document or putting a snippet of a working Xorg configuration, plus the packages to use, into that section?

—This unsigned comment is by Bollie (talk) 09:25, 26 November 2019 (UTC). Please sign your posts with ~~~~!
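
For reference, a minimal check along the lines of that README, assuming the proprietary driver is installed and glxinfo (from mesa-utils) is available, would be:

$ __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor

If offloading works, the vendor strings should report the NVIDIA GPU rather than the integrated one.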

The --setprovideroffloadsink option is still accurate for Mesa drivers. Not sure about the proprietary driver, but feel free to add more information with links to the sources. Lekensteyn (talk) 23:08, 1 December 2019 (UTC)
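
For the Mesa drivers, the flow is roughly the following sketch (the provider names are only examples; check the output of xrandr --listproviders for the actual names on a given machine):

$ xrandr --listproviders                         # note the provider names
$ xrandr --setprovideroffloadsink radeon Intel   # example names: offload source "radeon" (discrete), sink "Intel" (integrated)
$ DRI_PRIME=1 glxinfo | grep "OpenGL renderer"   # should now report the discrete GPU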

PRIME render offload

It's true that editing files under /usr/share/ will not survive package upgrades, but I didn't find any way to make it work that doesn't involve editing /usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf to remove the PrimaryGPU option. Either I could come up with a pacman hook (hacky as hell; see the sketch after the snippet below) or add a warning about it in the wiki. Right now it seems this depends on configuration alone, and forcing that default configuration from the package doesn't seem sane. A minimal correct 10-nvidia-drm-outputclass.conf file should look like this:

/usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf
Section "OutputClass"
    Identifier "nvidia"
    MatchDriver "nvidia-drm"
    Driver "nvidia"
    ModulePath "/usr/lib/nvidia/xorg"
    ModulePath "/usr/lib/xorg/modules"
EndSection
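
A rough sketch of the kind of pacman hook meant above (hypothetical file name; it simply strips the PrimaryGPU line again after every nvidia-utils install or upgrade):

/etc/pacman.d/hooks/nvidia-no-primarygpu.hook
[Trigger]
Operation = Install
Operation = Upgrade
Type = Package
Target = nvidia-utils

[Action]
Description = Dropping PrimaryGPU from 10-nvidia-drm-outputclass.conf
When = PostTransaction
Exec = /usr/bin/sed -i /PrimaryGPU/d /usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf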

Samsagax (talk) 19:45, 9 December 2019 (UTC)

Either an xorg.conf file or an /etc/X11/xorg.conf.d snippet has precedence over /usr/share/X11/xorg.conf.d. Please remove that section from PRIME render offload; it's not necessary. I'll make some changes later this week too, and I might remove that. —This unsigned comment is by Grazzolini (talk) 17:22, 10 December 2019‎. Please sign your posts with ~~~~!


I have created a package for this setup called nvidia-prime. It comes with a script and an xorg.conf.d snippet. During my tests, I found that using it without commenting out the PrimaryGPU option in 10-nvidia-drm-outputclass.conf gives a reverse PRIME setup by default, without any /etc/X11/xorg.conf or /etc/X11/xorg.conf.d snippet: because of the PrimaryGPU option, X uses the NVIDIA card for everything. If I comment out that option, I get the PRIME render offload setup instead. I'm going to discuss this with the nvidia-utils maintainer and see if we can either remove that snippet entirely, or at least remove the PrimaryGPU option. Grazzolini (talk) 00:08, 11 December 2019 (UTC)
I think removing that option, and every other option that doesn't add or impose a setting for the user, is the way to go. As a general rule, there should only be a sane default that won't interfere with user configuration, or at least backs it up. About the precedence: if what you say is true, then adding a snippet under /etc/X11/xorg.conf.d with the PrimaryGPU option set to "no" should do the trick, something like:
/etc/X11/xorg.conf.d/10-nvidia-drm-outputclass-primary-no.conf
Section "OutputClass"
    Identifier "nvidia"
    Option "PrimaryGPU" "no"
EndSection
I'll try to test this tonight. Aside from this, I think the whole section about PRIME offload should be rewritten. I can help with that and with my findings on setting up the NVIDIA proprietary driver specifically. Samsagax (talk) 17:01, 11 December 2019 (UTC)
Yes, setting it up in /etc/X11/xorg.conf or /etc/X11/xorg.conf.d should have precedence over /usr/share/X11/xorg.conf.d. I have opened FS#64805 for tracking this and I'm talking with the current maintainers, svenstaro and felixonmars. In addition to dropping the PrimaryGPU option, we should also drop the modesetting configuration for the Intel card from that file, because it means that even if you have xf86-video-intel installed, it won't be used unless you force it with an xorg.conf or xorg.conf.d snippet. Basically it's interfering with normal Xorg autodetection. It also makes Xorg.wrap fail and start X as root by default. Grazzolini (talk) 20:08, 11 December 2019 (UTC)
I just tested my theory, and adding the snippet above didn't help as long as PrimaryGPU is set in the file under /usr/share/X11/xorg.conf.d, so removing it or commenting it out is the way to go. —This unsigned comment is by Samsagax (talk) 22:43, 11 December 2019‎. Please sign your posts with ~~~~!


I have tested this as well, by copying 10-nvidia-drm-outputclass.conf from /usr/share/X11/xorg.conf.d to /etc/X11/xorg.conf.d, both removing the PrimaryGPU option and setting it to "no". Neither worked. The only solution is indeed for that file to drop the PrimaryGPU option. Grazzolini (talk) 02:31, 13 December 2019 (UTC)
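
For anyone checking which setup they ended up with after such changes, a quick comparison (assuming xrandr and glxinfo are installed) is:

$ xrandr --listproviders                        # both GPUs should be listed
$ glxinfo | grep "OpenGL renderer"              # integrated GPU expected when render offload is active
$ __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"   # NVIDIA GPU expected

If the plain glxinfo call already reports the NVIDIA GPU, X is using it as the primary GPU (the reverse PRIME behaviour described above).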

Hi, new here, proposing an edit to: "As per the official documentation, it only works with the modesetting driver over Intel graphics card." I have a working setup with an Intel HD Graphics 620 using the Intel driver and an NVIDIA GeForce 940MX using the NVIDIA driver (in an ASUS S510UQ laptop); I've confirmed this with xrandr --listproviders. Perhaps change it to "...it only works with the modesetting driver, but success has been had with the Intel driver instead..."? Irradium (talk) 01:20, 12 January 2020 (UTC)

Edited the article as proposed above. Irradium (talk) 22:10, 14 January 2020 (UTC)

Official PRIME solution

I have found the solution here to work pretty well when applied, and optimus-managerAUR with its default config in hybrid mode fixes the lack of a video output on setups that exhibit it.

I wonder if the wiki page could be updated to reflect that? --TheSola10 (talk) 12:01, 18 December 2019 (UTC)

The documentation states that "This feature requires a Turing or newer GPU", so it can't really be used for all cards. You are welcome to edit the page to add info about this dynamic power management, but there's nothing "Official" about using optimus-manager. Grazzolini (talk) 12:23, 18 December 2019 (UTC)
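
For reference, the dynamic power management described in that documentation is enabled through a kernel module option; a minimal sketch (the file name is just an example) looks like:

/etc/modprobe.d/nvidia-pm.conf
options nvidia NVreg_DynamicPowerManagement=0x02

The value 0x02 enables fine-grained power control, which the documentation says requires a Turing or newer GPU.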

Prime and Wayland

I might be mistaken, but I believe that Reverse Prime is not possible with the current (470.57.02-1) driver.

I think it would be nice to clarify the situation regarding PRIME and Wayland globally.

Pums974 (talk) 10:04, 21 July 2021 (UTC)

Configure applications to render using GPU

This section has examples for running an application offloaded to the NVIDIA GPU with dynamic power management enabled, using the environment variables __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia. prime-run can also be used for this purpose; it is a convenient wrapper around the same command (which can be verified by running cat $(which prime-run)). So that command could also be mentioned, as is done in External GPU#Xorg rendered on iGPU, PRIME render offload to eGPU, where both variants are shown. RaZorr (talk) 17:36, 16 March 2024 (UTC)
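
As a point of reference, the prime-run script shipped by nvidia-prime is roughly a one-line wrapper along these lines (check cat $(which prime-run) for the exact contents of the installed version):

#!/bin/bash
__NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only __GLX_VENDOR_LIBRARY_NAME=nvidia "$@"

Usage is then simply prime-run glxinfo, prime-run vkcube, and so on.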