https://wiki.archlinux.org/api.php?action=feedcontributions&user=Vi+six&feedformat=atomArchWiki - User contributions [en]2024-03-28T14:31:32ZUser contributionsMediaWiki 1.41.0https://wiki.archlinux.org/index.php?title=HiDPI&diff=528447HiDPI2018-07-01T15:48:02Z<p>Vi six: /* Official HiDPI support */ checked option path added</p>
<hr />
<div>[[Category:Graphics]]<br />
[[ja:HiDPI]]<br />
{{Related articles start}}<br />
{{Related|Font configuration}}<br />
{{Related articles end}}<br />
HiDPI (High Dots Per Inch) displays, also known by Apple's "[[wikipedia:Retina Display|Retina Display]]" marketing name, are screens with a high resolution in a relatively small format. They are mostly found in high-end laptops and monitors.<br />
<br />
Not all software behaves well in high-resolution mode yet. This page lists the most common tweaks that make working on a HiDPI screen more pleasant.<br />
<br />
== Desktop environments ==<br />
<br />
=== GNOME ===<br />
To enable HiDPI, go to Settings > Devices > Displays, or use {{ic|gsettings}}:<br />
<br />
$ gsettings set org.gnome.desktop.interface scaling-factor 2<br />
<br />
{{Note|1={{ic|scaling-factor}} only allows whole numbers to be set. 1 = 100%, 2 = 200%, etc...}}<br />
<br />
==== Fractional Scaling ====<br />
<br />
A whole-number setting of {{ic|2}}, {{ic|3}}, etc., which is all {{ic|scaling-factor}} allows, may not be ideal for certain HiDPI displays and smaller screens (e.g. small tablets). <br />
<br />
*Wayland<br />
Enable the fractional scaling experimental feature:<br />
<br />
$ gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"<br />
<br />
Then open Settings > Devices > Displays.<br />
<br />
*Xorg<br />
<br />
You can achieve any non-integer scale factor by using a combination of GNOME's {{ic|scaling-factor}} and [[xrandr]]. This combination keeps TTF fonts properly scaled so that they do not become blurry, as they would when using {{ic|xrandr}} alone. You specify the zoom-in factor with {{ic|gsettings}} and the zoom-out factor with [[xrandr]].<br />
<br />
First scale GNOME up to the minimum size which is already too big. Usually "2" is already too big; otherwise try "3", etc. Then start scaling down by setting a zoom-out factor with [[xrandr]]. First get the relevant output name; the examples below use {{ic|eDP1}}. Start with e.g. a 1.25x zoom-out. If the UI is still too big, increase the zoom-out factor; if it is too small, decrease it.<br />
$ xrandr --output eDP1 --scale 1.25x1.25<br />
<br />
{{Note|To allow the mouse to reach the whole screen, you may need to use the {{ic|--panning}} option as explained in [[#Side display]].}}<br />
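The two steps can be sketched as a pair of commands; the effective scale is the GNOME integer factor divided by the xrandr zoom-out, and the output name {{ic|eDP1}} is an assumption (check yours with {{ic|xrandr -q}}). The live commands are commented out since they need a running GNOME/Xorg session:

```shell
# Effective scale = GNOME integer factor / xrandr zoom-out
gnome_factor=2
xrandr_zoom=1.25
awk "BEGIN { print $gnome_factor / $xrandr_zoom }"   # 1.6

# Apply both (requires a running GNOME/Xorg session):
#   gsettings set org.gnome.desktop.interface scaling-factor $gnome_factor
#   xrandr --output eDP1 --scale ${xrandr_zoom}x${xrandr_zoom}
```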
<br />
{{Accuracy|The following was initially added under [[#X Resources]]. Clarify how it integrates with the info there or that above for GNOME.|section=GNOME ignores X settings}}<br />
<br />
GNOME ignores X settings because the DPI setting is hard-coded in the xsettings plugin of GNOME Settings Daemon.<br />
There is a blog entry about [http://blog.drtebi.com/2012/12/changing-dpi-setting-on-gnome-34.html recompiling Gnome Settings Daemon].<br />
The source documentation mentions another way to set the X settings DPI:<br />
<br />
You can use the dconf Editor and navigate to key <br />
<br />
/org/gnome/settings-daemon/plugins/xsettings/overrides<br />
<br />
and complement the entry with the value<br />
<br />
'Xft/DPI': <153600><br />
<br />
From README.xsettings:<br />
<br />
Note that variants must be specified in the usual way (wrapped in <>), and that the DPI in the above example is expressed in 1024ths of an inch: 153600/1024 = 150 DPI.<br />
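The same override can also be set from a shell; a sketch assuming a running dconf/GNOME session (the {{ic|<nowiki>|| true</nowiki>}} lets the snippet run harmlessly without one). 153600 corresponds to 150 DPI:

```shell
# Xft/DPI is expressed in 1024ths of an inch: 150 dpi * 1024 = 153600
dpi=150
xft_dpi=$((dpi * 1024))
echo "$xft_dpi"   # 153600

# Write the override (ignore failure when no dconf session is available)
if command -v dconf >/dev/null; then
  dconf write /org/gnome/settings-daemon/plugins/xsettings/overrides \
    "{'Xft/DPI': <$xft_dpi>}" || true
fi
```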
<br />
=== KDE ===<br />
<br />
You can use KDE's settings to fine-tune font, icon, and widget scaling. This solution affects both Qt and GTK+ applications.<br />
<br />
To adjust font, widget, and icon scaling together:<br />
<br />
# System Settings → Display and Monitor → Display Configuration → Scale Display <br />
# Drag the slider to the desired size<br />
# Restart for the settings to take effect<br />
<br />
To adjust only font scaling:<br />
<br />
# System Settings → Fonts<br />
# Check "Force fonts DPI" and adjust the DPI level to the desired value. This setting should take effect immediately for newly started applications. You will have to log out and log back in for it to take effect on the Plasma desktop.<br />
<br />
To adjust only icon scaling:<br />
<br />
# System Settings → Icons → Advanced<br />
# Choose the desired icon size for each category listed. This should take effect immediately.<br />
<br />
'''Non-integer Display Scale bug:'''<br />
<br />
Using non-integer values for Display Scale causes font rendering issues in some Qt applications (e.g. Okular).<br />
<br />
A workaround for this is to:<br />
# Set the scale value to 1<br />
# Adjust your fonts and icons and use "Force fonts DPI" (this affects all applications, including GTK ones, but does not cause font issues)<br />
# Restart KDE<br />
# If required, tune GTK applications using the {{ic|GDK_SCALE}}/{{ic|GDK_DPI_SCALE}} variables (as described in [[#GDK 3 (GTK+ 3)]])<br />
<br />
==== Tray icons with fixed size ====<br />
<br />
The tray icons are not scaled with the rest of the desktop, since Plasma ignores the Qt scaling settings by default. To make Plasma respect the Qt settings, set {{ic|PLASMA_USE_QT_SCALING}} to {{ic|1}}.<br />
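For example, a sketch exporting the variable from a startup script (Plasma sources scripts placed in {{ic|~/.config/plasma-workspace/env/}} before the session starts):

```shell
# Make Plasma respect the Qt scaling settings so tray icons scale too
export PLASMA_USE_QT_SCALING=1
echo "$PLASMA_USE_QT_SCALING"
```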
<br />
=== Xfce ===<br />
<br />
Go to Settings Manager → Appearance → Fonts, and change the DPI parameter. A value of 180 or 192 seems to work well on Retina screens. To get a more precise number, you can use {{ic|<nowiki>xdpyinfo | grep resolution</nowiki>}} and then double it.<br />
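The doubling step can be sketched as follows; the sample line stands in for real {{ic|xdpyinfo}} output, which needs a running X session:

```shell
# Extract the horizontal DPI from an xdpyinfo "resolution" line and double it
line="  resolution:    96x96 dots per inch"
dpi=$(echo "$line" | awk '{ split($2, a, "x"); print a[1] }')
echo $((dpi * 2))   # 192
```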
<br />
To enlarge icons in system tray, right-click on it (aim for empty space / top pixels / bottom pixels, so that you will not activate icons themselves) → “Properties” → set “Maximum icon size” to 32, 48 or 64.<br />
<br />
=== Cinnamon ===<br />
<br />
Cinnamon has good HiDPI support out of the box.<br />
<br />
=== Enlightenment ===<br />
<br />
For E18, go to the E Setting panel. In Look → Scaling, you can control the UI scaling ratios. A ratio of 1.2 seems to work well for the native resolution of the MBPr 15" screen.<br />
<br />
== X Server ==<br />
<br />
Some programs use the DPI given by the X server. Examples are i3 ([https://github.com/i3/i3/blob/next/libi3/dpi.c source]) and Chromium ([https://code.google.com/p/chromium/codesearch#chromium/src/ui/views/widget/desktop_aura/desktop_screen_x11.cc source]).<br />
<br />
To verify that the X Server has properly detected the physical dimensions of your monitor, use the ''xdpyinfo'' utility from the {{Pkg|xorg-xdpyinfo}} package:<br />
<br />
$ xdpyinfo | grep -B 2 resolution<br />
screen #0:<br />
dimensions: 3200x1800 pixels (423x238 millimeters)<br />
resolution: 192x192 dots per inch<br />
<br />
This example uses inaccurate dimensions (423mm x 238mm, even though the Dell XPS 9530 has 346mm x 194mm) to have a clean multiple of 96 DPI, in this case 192 DPI. This tends to work better than using the correct DPI; Pango renders fonts crisper in i3, for example.<br />
<br />
If the DPI displayed by xdpyinfo is not correct, see [[Xorg#Display size and DPI]] for how to fix it.<br />
<br />
== X Resources ==<br />
<br />
If you are not using a desktop environment such as KDE, Xfce, or other that manipulates the X settings for you, you can set the desired DPI setting manually via the {{ic|Xft.dpi}} variable in [[Xresources]]:<br />
<br />
{{hc|~/.Xresources|<nowiki><br />
Xft.dpi: 180<br />
Xft.autohint: 0<br />
Xft.lcdfilter: lcddefault<br />
Xft.hintstyle: hintfull<br />
Xft.hinting: 1<br />
Xft.antialias: 1<br />
Xft.rgba: rgb<br />
</nowiki>}}<br />
<br />
Make sure the settings are loaded properly when X starts, for instance in your {{ic|~/.xinitrc}} with {{ic|xrdb -merge ~/.Xresources}} (see [[Xresources]] for more information).<br />
<br />
This will make fonts render properly in most toolkits and applications; it will, however, not affect things such as icon size.<br />
Setting {{ic|Xft.dpi}} at the same time as a toolkit scale factor (e.g. {{ic|GDK_SCALE}}) may cause interface elements to be much larger than intended in some programs such as Firefox.<br />
<br />
== GUI toolkits ==<br />
<br />
=== Qt 5 ===<br />
<br />
Since Qt 5.6, Qt 5 applications can be instructed to honor screen DPI by setting the {{ic|QT_AUTO_SCREEN_SCALE_FACTOR}} environment variable:<br />
<br />
export QT_AUTO_SCREEN_SCALE_FACTOR=1<br />
<br />
If automatic detection of DPI does not produce the desired effect, scaling can be set manually per-screen ({{ic|QT_SCREEN_SCALE_FACTORS}}) or globally ({{ic|QT_SCALE_FACTOR}}). For more details see the [https://blog.qt.io/blog/2016/01/26/high-dpi-support-in-qt-5-6/ Qt blog post].<br />
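For example, a sketch for a laptop with an internal HiDPI panel and an external monitor; the output names {{ic|eDP1}} and {{ic|DP1}} are assumptions (check yours with {{ic|xrandr -q}}):

```shell
# Disable automatic detection so applications are not scaled twice
export QT_AUTO_SCREEN_SCALE_FACTOR=0
# Per-screen factors as name=factor pairs separated by semicolons
export QT_SCREEN_SCALE_FACTORS="eDP1=2;DP1=1"
echo "$QT_SCREEN_SCALE_FACTORS"
```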
<br />
{{Note|<br />
* If you manually set the screen factor, it is important to set {{ic|1=QT_AUTO_SCREEN_SCALE_FACTOR=0}} otherwise some applications which explicitly force high DPI enabling get scaled twice.<br />
* {{ic|QT_SCALE_FACTOR}} scales fonts, but {{ic|QT_SCREEN_SCALE_FACTORS}} does not scale fonts.<br />
* If you also set the font DPI manually in ''xrdb'' to support other toolkits, {{ic|QT_SCALE_FACTOR}} will give you huge fonts.<br />
}}<br />
<br />
=== GDK 3 (GTK+ 3) ===<br />
<br />
To scale UI elements by a factor of two:<br />
<br />
export GDK_SCALE=2<br />
<br />
To undo scaling of text:<br />
<br />
export GDK_DPI_SCALE=0.5<br />
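The two variables multiply for text, so {{ic|1=GDK_SCALE=2}} with {{ic|1=GDK_DPI_SCALE=0.5}} doubles the widgets while leaving fonts at their original size; a sketch of the arithmetic:

```shell
export GDK_SCALE=2
export GDK_DPI_SCALE=0.5
# Net font scale = GDK_SCALE * GDK_DPI_SCALE
awk "BEGIN { print $GDK_SCALE * $GDK_DPI_SCALE }"   # 1
```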
<br />
=== GTK+ 2 ===<br />
<br />
Scaling of UI elements is not supported by the toolkit itself, however it's possible to generate a theme with elements pre-scaled for HiDPI display using {{AUR|oomox-git}}.<br />
<br />
=== Elementary (EFL) ===<br />
<br />
To scale UI elements by a factor of 1.5:<br />
<br />
export ELM_SCALE=1.5<br />
<br />
For more details see https://phab.enlightenment.org/w/elementary/<br />
<br />
== Boot managers ==<br />
<br />
=== GRUB ===<br />
<br />
==== Lower the framebuffer resolution ====<br />
Set a lower resolution for the framebuffer as explained in [[GRUB/Tips and tricks#Setting the framebuffer resolution]].<br />
<br />
==== Change GRUB font size ====<br />
Find a ttf font that you like in {{ic|/usr/share/fonts/}}.<br />
<br />
Convert the font to a format that GRUB can utilize:<br />
<br />
# grub-mkfont -s 30 -o /boot/grubfont.pf2 /usr/share/fonts/FontFamily/FontName.ttf<br />
<br />
{{Note|Change the {{ic|-s 30}} parameter to modify the font size}}<br />
<br />
Edit {{ic|/etc/default/grub}} to set the new font as shown in [[GRUB/Tips and tricks#Background image and bitmap fonts]]:<br />
<br />
GRUB_FONT="/boot/grubfont.pf2"<br />
<br />
Update GRUB configuration by running {{ic|grub-mkconfig -o /boot/grub/grub.cfg}}<br />
<br />
== Applications ==<br />
<br />
=== Browsers ===<br />
<br />
==== Firefox ====<br />
<br />
Firefox should use the [[#GDK 3 (GTK+ 3)]] settings. However, the suggested {{ic|GDK_SCALE}} setting does not consistently scale the entirety of Firefox, and does not work for fractional values (e.g. a factor of 158 DPI / 96 DPI = 1.65 for a 1080p 14" laptop). You may want to use {{ic|GDK_DPI_SCALE}} instead.<br />
<br />
To override those, open Firefox's advanced preferences page ({{ic|about:config}}) and set the parameter {{ic|layout.css.devPixelsPerPx}} to {{ic|2}} (or find the value that suits you better; {{ic|2}} is a good choice for Retina screens), though this also does not consistently scale the entirety of Firefox. If Firefox is not scaling fonts, you may want to create {{ic|userChrome.css}} and add appropriate styles to it. More information about {{ic|userChrome.css}} is available at [http://kb.mozillazine.org/index.php?title=UserChrome.css mozillaZine].<br />
<br />
{{hc|~/.mozilla/firefox/<em><profile></em>/chrome/userChrome.css|@namespace url("http://www.mozilla.org/keymaster/gatekeeper/there.is.only.xul");<br />
<br />
/* #tabbrowser-tabs, #navigator-toolbox, menuitem, menu, ... */<br />
* {<br />
font-size: 15px !important;<br />
}<br />
<br />
/* exception for badge on adblocker */<br />
.toolbarbutton-badge {<br />
font-size: 8px !important;<br />
}<br />
}}<br />
<br />
If you use a HiDPI monitor such as a Retina display together with another monitor, you can use the [https://addons.mozilla.org/en-US/firefox/addon/autohidpi/ AutoHiDPI] add-on to automatically adjust the {{ic|layout.css.devPixelsPerPx}} setting for the active screen. Also, since Firefox version 49, it auto-scales based on your screen resolution, making it easier to deal with two or more screens.<br />
<br />
==== Chromium / Google Chrome ====<br />
<br />
Chromium should use the [[#GDK 3 (GTK+ 3)]] settings.<br />
<br />
To override those, use the {{ic|1=--force-device-scale-factor}} flag with a scaling value. This will scale all content and UI, including tab and font size. For example, {{ic|1=chromium --force-device-scale-factor=2}}.<br />
<br />
Using this option, a scaling factor of 1 would be normal scaling. Floating point values can be used. To make the change permanent, for Chromium, you can add it to {{ic|~/.config/chromium-flags.conf}}:<br />
<br />
{{hc|~/.config/chromium-flags.conf|2=--force-device-scale-factor=2}}<br />
<br />
To make this work for Chrome, add the same option to {{ic|~/.config/chrome-flags.conf}} instead.<br />
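A sketch that creates both flags files in one go (the factor 1.5 is an example; note that this overwrites any existing flags file):

```shell
# Persist the scale factor for Chromium, then copy it for Chrome
mkdir -p ~/.config
echo '--force-device-scale-factor=1.5' > ~/.config/chromium-flags.conf
cp ~/.config/chromium-flags.conf ~/.config/chrome-flags.conf
cat ~/.config/chrome-flags.conf
```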
<br />
If you use a HiDPI monitor such as Retina display together with another monitor, you can use the [https://chrome.google.com/webstore/detail/resolution-zoom/enjjhajnmggdgofagbokhmifgnaophmh reszoom] extension in order to automatically adjust the zoom level for the active screen.<br />
<br />
==== Opera ====<br />
<br />
Opera should use the [[#GDK 3 (GTK+ 3)]] settings.<br />
<br />
To override those, use the {{ic|1=--alt-high-dpi-setting=X}} command line option, where X is the desired DPI. For example, with {{ic|1=--alt-high-dpi-setting=144}} Opera will assume that the DPI is 144. Newer versions of Opera auto-detect the DPI using the font DPI setting (in KDE: the "Force fonts DPI" setting).<br />
<br />
=== Thunderbird ===<br />
<br />
See [[#Firefox]]. To access {{ic|about:config}}, go to Edit → Preferences → Advanced → Config editor.<br />
<br />
=== Wine applications ===<br />
<br />
Run<br />
$ winecfg<br />
and change the "dpi" setting found in the "Graphics" tab. This only affects the font size.<br />
<br />
=== Skype ===<br />
<br />
Skype for Linux ({{AUR|skypeforlinux-stable-bin}} package) uses [[#GTK+ 2]].<br />
<br />
=== Spotify ===<br />
<br />
Spotify can be launched with a custom scaling factor, for example<br />
$ spotify --force-device-scale-factor=1.5<br />
<br />
=== Zathura document viewer ===<br />
<br />
No modifications are required for document viewing.<br />
<br />
UI text scaling is specified via [https://pwmt.org/projects/zathura/documentation/ configuration file] (note that "font" is a [https://pwmt.org/projects/girara/options/ girara option]):<br />
<br />
set font "monospace normal 20"<br />
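For example, appending the option to zathura's configuration file (default path {{ic|~/.config/zathura/zathurarc}}):

```shell
# Add a larger UI font to zathurarc
mkdir -p ~/.config/zathura
echo 'set font "monospace normal 20"' >> ~/.config/zathura/zathurarc
tail -n 1 ~/.config/zathura/zathurarc
```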
<br />
=== Sublime Text 3 ===<br />
Sublime Text 3 has full support for display scaling. Go to Preferences > Settings > User Settings and add {{ic|"dpi_scale": 2.0}} to your settings [http://blog.wxm.be/2014/08/30/sublime-text-3-and-high-dpi-on-linux.html (source)].<br />
<br />
=== IntelliJ IDEA ===<br />
<br />
IntelliJ IDEA 15 and above should include HiDPI support.[http://blog.jetbrains.com/idea/2015/07/intellij-idea-15-eap-comes-with-true-hidpi-support-for-windows-and-linux/] If it does not work, the most convenient way to fix the problem in this case seems to be changing the Override Default Fonts setting:<br />
<br />
:''File -> Settings -> Behaviour & Appearance -> Appearance''<br />
<br />
The addition of {{ic|1=-Dhidpi=true}} to the vmoptions file in either {{ic|$HOME/.IdeaC14/}} or {{ic|/usr/share/intellij-idea-ultimate-edition/bin/}} of [https://youtrack.jetbrains.com/issue/IDEA-114944 release 14] should not be required anymore.<br />
<br />
=== NetBeans ===<br />
<br />
NetBeans allows the font size of its interface to be controlled using the {{ic|1=--fontsize}} parameter during startup. To make this change permanent edit the {{ic|1=/usr/share/netbeans/etc/netbeans.conf}} file and append the {{ic|1=--fontsize}} parameter to the {{ic|1=netbeans_default_options}} property.[http://wiki.netbeans.org/FaqFontSize]<br />
<br />
The editor font size can be controlled from Tools → Options → Fonts & Colors.<br />
<br />
The output window font size can be controlled from Tools → Options → Miscellaneous → Output.<br />
<br />
=== Gimp 2.8 ===<br />
<br />
Use a high-DPI theme, or adjust the {{ic|1=gtkrc}} of an existing theme. (Change all occurrences of the size {{ic|1=button}} to {{ic|1=dialog}}, for example for {{ic|1=GimpToolPalette::tool-icon-size}}.)<br />
<br />
There is also the [https://github.com/jedireza/gimp-hidpi gimp-hidpi] theme.<br />
<br />
=== Steam ===<br />
<br />
==== Official HiDPI support ====<br />
* Since 25 January 2018, the beta program includes official support for HiDPI, which should be detected automatically.<br />
* Steam -> Settings -> Interface -> check "Enlarge text and icons based on monitor size" (restart required)<br />
* If it is not detected automatically, use {{ic|1=GDK_SCALE=2}} to set the desired scale factor.<br />
<br />
==== Unofficial ====<br />
The [https://github.com/MoriTanosuke/HiDPI-Steam-Skin HiDPI-Steam-Skin] can be installed to increase the font size of the interface. While not perfect, it does improve usability. <br />
<br />
{{Note|The README for the HiDPI skin lists several possible locations for where to place the skin. The correct folder out of these can be identified by the presence of a file named {{ic|1=skins_readme.txt}}.}}<br />
<br />
[http://steamcommunity.com/groups/metroskin/discussions/0/517142253861033946/ MetroSkin Unofficial Patch] also helps with HiDPI on Steam with Linux.<br />
<br />
=== Java applications ===<br />
<br />
Java applications using the AWT/Swing framework can be scaled by defining the {{ic|sun.java2d.uiScale}} property when invoking {{ic|java}}. For example,<br />
<br />
java -Dsun.java2d.uiScale=2 -jar some_application.jar<br />
<br />
Since Java 9, the {{ic|GDK_SCALE}} environment variable is used to scale Swing applications accordingly.<br />
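To avoid passing the property on every invocation, it can be exported globally; a sketch using the standard {{ic|_JAVA_OPTIONS}} variable (the JVM prints a notice when it picks it up):

```shell
# Apply the Swing/AWT scale factor to every JVM invocation
export _JAVA_OPTIONS='-Dsun.java2d.uiScale=2'
echo "$_JAVA_OPTIONS"
```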
<br />
=== Mono applications ===<br />
<br />
According to [https://bugzilla.xamarin.com/show_bug.cgi?id=35870], Mono applications should be scalable like [[#GDK 3 (GTK+ 3)|GTK3]] applications.<br />
<br />
=== MATLAB ===<br />
<br />
Recent versions (R2017b) of [[Matlab]] allow setting the scale factor:<br />
>> s = settings;s.matlab.desktop.DisplayScaleFactor<br />
>> s.matlab.desktop.DisplayScaleFactor.PersonalValue = 2<br />
The settings take effect after MATLAB is restarted.<br />
<br />
=== VirtualBox ===<br />
<br />
{{Note|This only applies to KDE with scaling enabled.}}<br />
VirtualBox also applies the system-wide scaling to the virtual monitor, which reduces the maximum resolution inside VMs by your scaling factor (see [https://www.virtualbox.org/ticket/16604]).<br />
<br />
This can be worked around by calculating the inverse of your scaling factor and manually setting this new scaling factor for the VirtualBox execution, e.g.<br />
$ QT_SCALE_FACTOR=0.5 VirtualBox --startvm vm-name<br />
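The inverse can be computed as follows (a sketch; the VirtualBox invocation is commented out since it needs a display):

```shell
# Inverse of the system-wide scaling factor, e.g. 2 -> 0.5
scale=2
inverse=$(awk "BEGIN { print 1 / $scale }")
echo "$inverse"   # 0.5
# QT_SCALE_FACTOR=$inverse VirtualBox --startvm vm-name
```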
<br />
=== Unsupported applications ===<br />
<br />
{{AUR|run_scaled-git}} (which uses {{Pkg|xpra}} internally) can be used to scale applications.<br />
<br />
Another approach is to run the application full screen and without decoration in its own VNC desktop. Then scale the viewer. With Vncdesk ({{AUR|vncdesk-git}} from the [[AUR]]) you can set up a desktop per application, then start server and client with a simple command such as {{ic|vncdesk 2}}.<br />
<br />
[[x11vnc]] has an experimental option {{ic|-appshare}}, which opens one viewer per application window. Perhaps something could be hacked up with that.<br />
<br />
== Multiple displays ==<br />
The HiDPI setting applies to the whole desktop, so non-HiDPI external displays show everything too large. However, note that setting different scaling factors for different monitors is already supported in [[Wayland]].<br />
<br />
=== Side display ===<br />
One workaround is to use [[xrandr]]'s scale option. To have a non-HiDPI monitor ({{ic|DP-1}}) right of an internal HiDPI display ({{ic|eDP-1}}), one could run:<br />
<br />
xrandr --output eDP-1 --auto --output DP-1 --auto --scale 2x2 --right-of eDP-1<br />
<br />
When extending above the internal display, you may see part of the internal display on the external monitor. In that case, specify the position manually, e.g. using [https://gist.github.com/wvengen/178642bbc8236c1bdb67 this script].<br />
<br />
You may run into problems with your mouse not being able to reach the whole screen. That is a [https://bugs.freedesktop.org/show_bug.cgi?id=39949 known bug] with an xorg-server patch (or try the panning option, but that might cause other problems).<br />
<br />
An example of the panning syntax for a 4k laptop with an external 1920x1080 monitor to the right:<br />
<br />
xrandr --output eDP-1 --auto --output HDMI-1 --auto --panning 3840x2160+3840+0 --scale 2x2 --right-of eDP-1<br />
<br />
Generically, if your HiDPI monitor is AxB pixels, your regular monitor is CxD and you are scaling by [ExF], the command line for right-of placement is:<br />
<br />
xrandr --output eDP-1 --auto --output HDMI-1 --auto --panning [C*E]x[D*F]+[A]+0 --scale [E]x[F] --right-of eDP-1<br />
<br />
If panning is not a solution for you, it may be better to set the positions of the monitors and manually fix the total screen size.<br />
<br />
An example of the syntax for a 2560x1440 WQHD 210 DPI laptop monitor (eDP1) using native resolution placed below a 1920x1080 FHD 96 DPI external monitor (HDMI) scaled to match global DPI settings:<br />
<br />
xrandr --output eDP-1 --auto --pos 0x1458 --output HDMI-1 --scale 1.35x1.35 --auto --pos 0x0 --fb 2592x2898<br />
<br />
The total screen size (--fb) and positioning (--pos) are to be calculated taking into account the scaling factor.<br />
<br />
In this case, the laptop monitor (eDP1) is not scaled and uses its native mode, so it occupies 2560x1440. The external monitor (HDMI) is scaled, so it must be treated as a larger screen of (1920*1.35)x(1080*1.35) = 2592x1458. This gives the eDP1 Y position: 1080*1.35 = 1458. Since one monitor sits on top of the other, the total screen size is X = the greater of the eDP1 and HDMI widths, i.e. 1920*1.35 = 2592, and Y = the sum of the calculated heights of eDP1 and HDMI, i.e. 1440 + (1080*1.35) = 2898.<br />
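The arithmetic above can be sketched in integer shell arithmetic (values from this example; the scale factor is multiplied by 100 to avoid floating point):

```shell
A=2560; B=1440      # laptop (eDP1), unscaled
C=1920; D=1080      # external (HDMI)
E=135               # scale factor * 100, i.e. 1.35

scaled_w=$((C * E / 100))                 # scaled external width:  2592
scaled_h=$((D * E / 100))                 # scaled external height: 1458
fb_w=$(( A > scaled_w ? A : scaled_w ))   # total width  = max of the two
fb_h=$((B + scaled_h))                    # total height = sum, stacked
echo "--pos 0x${scaled_h} --fb ${fb_w}x${fb_h}"
```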
<br />
Generically, if your HiDPI monitor is AxB pixels, your regular monitor is CxD, you are scaling by [ExF] and the HiDPI monitor is placed below the regular one, the command line is:<br />
<br />
xrandr --output eDP-1 --auto --pos 0x[D*F] --output HDMI-1 --auto --scale [E]x[F] --pos 0x0 --fb [greater of A and (C*E)]x[B+(D*F)]<br />
<br />
You may adjust the "sharpness" parameter on your monitor settings to adjust the blur level introduced with scaling.<br />
<br />
{{Note|1=Above solution with {{ic|--scale 2x2}} does not work on some Nvidia cards. No solution is currently available. [https://bbs.archlinux.org/viewtopic.php?pid=1670840] A potential workaround exists with configuring {{ic|1=ForceFullCompositionPipeline=On}} on the {{ic|CurrentMetaMode}} via {{ic|nvidia-settings}}. For more info see [https://askubuntu.com/a/979551/763549].}}<br />
<br />
=== Multiple external monitors ===<br />
There might be some problems when scaling more than one external monitor with a lower DPI than the built-in HiDPI display. In that case, you may want to try downscaling the HiDPI display instead, with e.g.<br />
<br />
xrandr --output eDP1 --scale 0.5x0.5 --output DP2 --right-of eDP1 --output HDMI1 --right-of DP2<br />
<br />
In addition, when you downscale the HiDPI display, the font on the HiDPI display will be slightly blurry, but it is a different kind of blurriness compared with the one introduced by upscaling the external displays. You may compare and see which kind of blurriness is less problematic for you.<br />
<br />
=== Mirroring ===<br />
<br />
If all you want is to mirror ("unify") displays, this is easy as well:<br />
<br />
With AxB being your native HiDPI resolution (e.g. 3200x1800) and CxD your external screen resolution (e.g. 1920x1200):<br />
<br />
xrandr --output HDMI --scale [A/C]x[B/D]<br />
<br />
In this example, which is QHD+ (3200/1920 = 1.66 and 1800/1200 = 1.5):<br />
<br />
xrandr --output HDMI --scale 1.66x1.5<br />
<br />
For UHD to 1080p (3840/1920=2 2160/1080=2)<br />
<br />
xrandr --output HDMI --scale 2x2<br />
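The ratio arithmetic can be scripted; a sketch that computes the {{ic|--scale}} argument from the two resolutions (the UHD-to-1080p example from above):

```shell
# Mirror scale = HiDPI resolution divided by external resolution, per axis
A=3840; B=2160    # native HiDPI resolution
C=1920; D=1080    # external screen resolution
scale=$(awk "BEGIN { printf \"%.2fx%.2f\", $A / $C, $B / $D }")
echo "$scale"     # 2.00x2.00
# xrandr --output HDMI --scale "$scale"
```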
<br />
You may adjust the "sharpness" parameter on your monitor settings to adjust the blur level introduced with scaling.<br />
<br />
== Linux console ==<br />
<br />
The default [[w:Linux console|Linux console]] font will be very small on HiDPI displays. The largest font present in the {{Pkg|kbd}} package is {{ic|latarcyrheb-sun32}}, and other packages like {{Pkg|terminus-font}} contain further alternatives, such as {{ic|ter-132n}} (normal) and {{ic|ter-132b}} (bold). See [[Fonts#Console fonts]] for configuration details.<br />
<br />
After changing the font, it is often garbled and unreadable when changing to other virtual consoles ({{ic|tty2-6}}). To fix this you can [[Kernel_mode_setting#Forcing_modes_and_EDID|force specific mode]] for KMS, such as {{ic|1=video=2560x1600@60}} (substitute in the native resolution of your HiDPI display), and reboot.<br />
<br />
== See also ==<br />
<br />
* [http://www.phoronix.com/scan.php?page=article&item=linux_uhd4k_gpus Ultra HD 4K Linux Graphics Card Testing] (Nov 2013)<br />
* [http://www.eizo.com/library/basics/pixel_density_4k/ Understanding pixel density]</div>Vi sixhttps://wiki.archlinux.org/index.php?title=Steam/Troubleshooting&diff=515729Steam/Troubleshooting2018-04-03T13:01:55Z<p>Vi six: /* Launching games on Nvidia optimus laptops */ typo</p>
<hr />
<div>[[Category:Gaming]]<br />
[[ru:Steam/Troubleshooting]]<br />
[[ja:Steam/トラブルシューティング]]<br />
== Introduction ==<br />
<br />
# Make sure that you have followed [[Steam#Installation]].<br />
# If the Steam client or a game is not starting, or you get an error message about a library, read [[#Steam runtime]] and see [[#Debugging shared libraries]].<br />
# If the issue is related to networking, make sure that you have forwarded the [https://support.steampowered.com/kb_article.php?ref=8571-GLVN-8711 required ports for Steam].<br />
# If the issue is about a game, consult [[Steam/Game-specific troubleshooting]].<br />
<br />
=== Relevant online resources ===<br />
<br />
* [https://bbs.archlinux.org/viewforum.php?id=32 Multimedia and Games / Arch Linux Forums]<br />
* [https://github.com/ValveSoftware/steam-for-linux ValveSoftware/steam-for-linux] – Issue tracking for the Steam for Linux client<br />
* [https://steamcommunity.com/ Steam Community discussions of the game]<br />
* [https://help.steampowered.com/en/ Steam Support FAQ]<br />
<br />
== Steam runtime ==<br />
<br />
Steam for Linux ships with its own set of libraries called the [https://github.com/ValveSoftware/steam-runtime Steam runtime]. By default Steam launches all Steam Applications within the runtime environment.<br />
The Steam runtime is located at {{ic|~/.steam/root/ubuntu12_32/steam-runtime/}}.<br />
<br />
If you mix the Steam runtime libraries with system libraries you will run into binary incompatibility issues, see [https://github.com/ValveSoftware/steam-for-linux/issues/4768 steam-for-linux issue #4768].<br />
Binary incompatibility can lead to the Steam client and games not starting (manifesting as a crash, a hang or a silent exit), audio issues and various other problems.<br />
<br />
The {{Pkg|steam}} package offers three ways to launch Steam:<br />
<br />
* {{ic|steam-runtime}} (alias {{ic|steam}}), which overrides runtime libraries known to cause problems via the {{ic|LD_PRELOAD}} [[environment variable]] (see {{man|8|ld.so}}).<br />
* {{ic|steam-native}}, see [[#Steam native runtime]]<br />
* {{ic|/usr/lib/steam/steam}}, the default Steam launch script<br />
<br />
As the Steam runtime libraries are older they can lack newer features, e.g. the OpenAL version of the Steam runtime lacks [[Gaming#Binaural_Audio_with_OpenAL|HRTF]] and surround71 support.<br />
<br />
=== Steam native runtime ===<br />
<br />
{{Warning|Using the Steam native runtime is not recommended as it might break some games due to binary incompatibility and it might miss some libraries present in the Steam runtime.}}<br />
<br />
The {{ic|steam-native}} script launches Steam with the {{ic|1=STEAM_RUNTIME=0}} environment variable making it ignore its runtime and only use system libraries.<br />
<br />
The {{Pkg|steam-native-runtime}} meta package depends on over 120 packages to pose a native replacement of the Steam runtime, some games may however still require additional packages. You can also use the Steam native runtime without {{Pkg|steam-native-runtime}} by manually installing just the packages you need. See [[#Finding missing runtime libraries]].<br />
<br />
== Debugging shared libraries ==<br />
<br />
To see the shared libraries required by a program or a shared library run the {{ic|ldd}} command on it, see {{man|1|ldd}}. The {{ic|LD_LIBRARY_PATH}} and {{ic|LD_PRELOAD}} [[environment variables]] can alter which shared libraries are loaded, see {{man|8|ld.so}}. <br />
To correctly debug a program or shared library it is therefore important that these environment variables in your debug environment match the environment you wish to debug.<br />
<br />
If you figure out a missing library you can use [[pacman]] or [[pkgfile]] to search for packages that contain the missing library.<br />
<br />
=== Finding missing game libraries ===<br />
<br />
If a game fails to start, a possible reason is that it is missing required libraries. You can find out what libraries it requests by running {{ic|ldd ''game_executable''}}. {{ic|''game_executable''}} is likely located somewhere in {{ic|~/.steam/root/steamapps/common/}}. Please note that most of these "missing" libraries are actually already included with Steam, and do not need to be installed globally.<br />
<br />
=== Finding missing runtime libraries ===<br />
<br />
If individual games or Steam itself fail to launch when using {{ic|steam-native}}, you are probably missing libraries. To find the required libraries, run:<br />
<br />
$ cd ~/.steam/root/ubuntu12_32<br />
$ file * | grep ELF | cut -d: -f1 | LD_LIBRARY_PATH=. xargs ldd | grep 'not found' | sort | uniq<br />
<br />
Alternatively, run Steam with {{ic|steam-runtime}} and use the following command to see which non-system libraries Steam is using (not all of these are part of the Steam runtime):<br />
<br />
$ for i in $(pgrep steam); do sed '/\.local/!d;s/.* //g' /proc/$i/maps; done | sort | uniq<br />
<br />
== Debugging Steam ==<br />
<br />
The Steam launcher redirects its stdout and stderr to {{ic|/tmp/dumps/''USER''_stdout.txt}}.<br />
This means you do not have to run Steam from the command-line to see that output.<br />
<br />
It is possible to debug Steam to gain more information which could be useful to find out why something does not work.<br />
<br />
You can set the {{ic|DEBUGGER}} environment variable to one of {{ic|gdb}}, {{ic|cgdb}}, {{ic|valgrind}}, {{ic|callgrind}} and {{ic|strace}}, and then start {{ic|steam}}.<br />
<br />
For example, with {{Pkg|gdb}}:<br />
{{bc|1=$ DEBUGGER=gdb steam}}<br />
<br />
{{ic|gdb}} will open; then type {{ic|run}}, which will start {{ic|steam}}, and once the crash happens you can type {{ic|backtrace}} to see the call stack.<br />
<br />
== Runtime issues ==<br />
<br />
=== Segmentation fault when disabling runtime ===<br />
<br />
:[https://github.com/ValveSoftware/steam-for-linux/issues/3863 steam-for-linux issue #3863]<br />
<br />
As per the bug report above, Steam crashes with the following error message when run with {{ic|1=STEAM_RUNTIME=0}}:<br />
<br />
/home/''USER''/.local/share/Steam/steam.sh: line 756: <variable numeric code> Segmentation fault (core dumped)<br />
<br />
This happens because {{ic|steamclient.so}} is linked to {{ic|libudev.so.0}} ({{AUR|lib32-libudev0}}) which conflicts with {{ic|libudev.so.1}} ({{Pkg|lib32-systemd}}).<br />
<br />
A proposed workaround is to copy Steam's packaged 32-bit versions of libusb and libgudev to {{ic|/usr/lib32}}:<br />
<br />
# cp ~/.steam/root/ubuntu12_32/steam-runtime/i386/usr/lib/i386-linux-gnu/libgudev* /usr/lib32<br />
# cp ~/.steam/root/ubuntu12_32/steam-runtime/i386/lib/i386-linux-gnu/libusb* /usr/lib32<br />
<br />
Notice that the workaround is necessary because the bug affects systems with lib32-libgudev and lib32-libusb installed.<br />
<br />
Alternatively, prioritizing the loading of {{ic|libudev.so.1}} has been reported to work (see [https://github.com/ValveSoftware/steam-for-linux/issues/3863#issuecomment-203929113 comment on the same issue]):<br />
{{bc|1=$ LD_PRELOAD=/usr/lib32/libudev.so.1 STEAM_RUNTIME=0 steam}}<br />
<br />
=== 'GLIBCXX_3.X.XX' not found when using Bumblebee ===<br />
<br />
This error is likely caused by Steam packaging its own out-of-date {{ic|libstdc++.so.6}}. See [[#Steam runtime issues]]{{Broken section link}} about working around the bad library. See also [https://github.com/ValveSoftware/steam-for-linux/issues/3773 steam-for-linux issue 3773].<br />
<br />
=== Game crashes immediately ===<br />
<br />
This is likely due to [[#Steam runtime]] issues, see [[#Debugging shared libraries]].<br />
<br />
Disabling the in-game Steam Overlay in the game properties might help.<br />
<br />
Finally, if those do not work, check Steam's output for any error from the game. You may encounter the following:<br />
* {{ic|munmap_chunk(): invalid pointer}}<br />
* {{ic|free(): invalid pointer}}<br />
<br />
In these cases, try replacing the {{ic|libsteam_api.so}} file of the problematic game with one from a game that works. This error usually happens, when the Steam runtime is disabled, with games that have not been updated recently. It has been encountered with AYIM, Bastion and Monaco.<br />
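The swap itself is a plain file copy. The sketch below uses a throwaway directory standing in for your Steam library ({{ic|BrokenGame}} and {{ic|WorkingGame}} are made-up names), and backs up the original first:<br />

```shell
# Scratch stand-ins for two game directories in a Steam library.
lib=$(mktemp -d)
mkdir -p "$lib/common/BrokenGame/lib" "$lib/common/WorkingGame/lib"
echo "working copy" > "$lib/common/WorkingGame/lib/libsteam_api.so"
echo "broken copy" > "$lib/common/BrokenGame/lib/libsteam_api.so"
# Back up the broken game's file, then borrow the working game's copy.
cp "$lib/common/BrokenGame/lib/libsteam_api.so" \
   "$lib/common/BrokenGame/lib/libsteam_api.so.bak"
cp "$lib/common/WorkingGame/lib/libsteam_api.so" \
   "$lib/common/BrokenGame/lib/libsteam_api.so"
cat "$lib/common/BrokenGame/lib/libsteam_api.so"
```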
<br />
=== Version `CURL_OPENSSL_3` not found ===<br />
<br />
This happens because {{Pkg|curl}} on its own is no longer compatible with binaries built against older versions, so the compatibility libraries need to be installed.<br />
<br />
One of the following messages may show up:<br />
<br />
# Nuclear Throne<br />
./nuclearthrone: /usr/lib32/libcurl.so.4: version `CURL_OPENSSL_3' not found (required by ./nuclearthrone)<br />
<br />
# Devil Daggers<br />
./devildaggers: /usr/lib/libcurl.so.4: version `CURL_OPENSSL_3' not found (required by ./devildaggers)<br />
<br />
You need to install either {{Pkg|libcurl-compat}} or {{Pkg|lib32-libcurl-compat}} and link the compatibility library manually:<br />
<br />
# Nuclear Throne<br />
$ ln -s /usr/lib32/libcurl-compat.so.4.4.0 "''LIBRARY''/steamapps/common/Nuclear Throne/lib/libcurl.so.4"<br />
<br />
# Devil Daggers<br />
$ ln -s /usr/lib/libcurl-compat.so.4.4.0 ''LIBRARY''/steamapps/common/devildaggers/lib64/libcurl.so.4<br />
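The pattern in both cases is the same: point the filename the game loads ({{ic|libcurl.so.4}}) at the compatibility library. A throwaway-directory sketch of the mechanism (all paths illustrative):<br />

```shell
demo=$(mktemp -d)
touch "$demo/libcurl-compat.so.4.4.0"   # stands in for the real library
# Give the game the soname it asks for, resolved to the compat library.
ln -s "$demo/libcurl-compat.so.4.4.0" "$demo/libcurl.so.4"
readlink "$demo/libcurl.so.4"           # confirms where the link points
```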
<br />
== Audio issues ==<br />
<br />
If the sections below do not address the issue, using the [[#Steam native runtime]] might help.<br />
<br />
=== Configure PulseAudio ===<br />
<br />
Games that explicitly depend on ALSA can break PulseAudio. Follow the directions for [[PulseAudio#ALSA]] to make these games use PulseAudio instead.<br />
<br />
=== No audio or 756 Segmentation fault ===<br />
<br />
First, follow [[#Configure PulseAudio]] and see if that resolves the issue. If you do not have audio in the videos which play within the Steam client, it is possible that the ALSA libraries packaged with Steam are not working.<br />
<br />
Attempting to play back a video within the Steam client results in an error similar to:<br />
<br />
ALSA lib pcm_dmix.c:1018:(snd_pcm_dmix_open) unable to open slave<br />
<br />
A workaround is to rename or delete the {{ic|alsa-lib}} folder and the {{ic|libasound.so.*}} files. They can be found at:<br />
<br />
~/.steam/steam/ubuntu12_32/steam-runtime/i386/usr/lib/i386-linux-gnu/<br />
<br />
An alternative workaround is to add the {{ic|libasound.so.*}} library to the {{ic|LD_PRELOAD}} environment variable:<br />
<br />
LD_PRELOAD='/usr/$LIB/libasound.so.2 '${LD_PRELOAD} steam<br />
<br />
If audio still does not work, adding the PulseAudio libraries to the {{ic|LD_PRELOAD}} variable may help:<br />
<br />
LD_PRELOAD='/usr/$LIB/libpulse.so.0 /usr/$LIB/libpulse-simple.so.0 '${LD_PRELOAD}<br />
<br />
Be advised that their names may change over time. If so, look in<br />
<br />
~/.steam/ubuntu12_32/steam-runtime/i386/usr/lib/i386-linux-gnu<br />
<br />
and find the new libraries and their versions.<br />
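Note the quoting in the commands above: the literal token {{ic|$LIB}} is expanded by the dynamic linker itself (on Arch, to {{ic|lib32}} or {{ic|lib}} depending on the binary's architecture), so it must be single-quoted to keep the shell from expanding it first:<br />

```shell
# Single quotes preserve $LIB for ld.so; any existing value is appended.
LD_PRELOAD='/usr/$LIB/libpulse.so.0 /usr/$LIB/libpulse-simple.so.0 '${LD_PRELOAD}
# The shell variable still contains the literal token:
echo "$LD_PRELOAD"
```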
<br />
Bug reports have been filed: [https://github.com/ValveSoftware/steam-for-linux/issues/3376 #3376] and [https://github.com/ValveSoftware/steam-for-linux/issues/3504 #3504].<br />
<br />
=== FMOD sound engine ===<br />
{{Accuracy|No source / bug report.}}<br />
<br />
The [https://www.fmod.com/ FMOD] audio middleware package is a bit buggy, and as a result games using it may have sound problems.<br />
<br />
It usually occurs when an unused sound device is set as the default for ALSA. See [[Advanced Linux Sound Architecture#Set the default sound card]].<br />
<br />
:Affected games: Hotline Miami, Hotline Miami 2, Transistor<br />
<br />
=== PulseAudio & OpenAL: Audio streams can't be moved between devices ===<br />
<br />
If you use [[PulseAudio]] and cannot move an audio stream between sinks, it might be because recent OpenAL versions disallow moving audio streams by default. Try adding the following to your {{ic|~/.alsoftrc}}:<br />
<br />
[pulse]<br />
allow-moves=true<br />
<br />
== Steam client issues ==<br />
<br />
=== Cannot add library folder because of missing execute permissions ===<br />
<br />
If you add another Steam library folder on another drive, you might get the error message:<br />
<br />
New Steam library folder must be on a filesystem mounted with execute permissions<br />
<br />
Make sure you are mounting the filesystem with the correct flags in your {{ic|/etc/fstab}}, usually by adding {{ic|exec}} to the list of mount parameters. The parameter must occur after any {{ic|user}} or {{ic|users}} parameter, since these can imply {{ic|noexec}}.<br />
<br />
This error might also occur if your library folder does not contain a {{ic|steamapps}} directory. Previous versions used {{ic|SteamApps}} instead, so ensure the name is fully lowercase.<br />
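For instance, a library folder carried over from an old installation can be renamed in place; the sketch below uses a scratch directory standing in for your real library folder:<br />

```shell
folder=$(mktemp -d)              # stands in for your Steam library folder
mkdir "$folder/SteamApps"        # legacy mixed-case name
mv "$folder/SteamApps" "$folder/steamapps"
ls "$folder"
```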
<br />
This error can also occur because of Steam runtime issues and may be fixed following the [[#Dynamic linker]]{{Broken section link}} section.<br />
<br />
=== Unusually slow download speed ===<br />
<br />
If the download speed of your Steam apps (games, software…) through the client is unusually slow, but browsing the Steam store and streaming videos are unaffected, installing a DNS cache program such as [[dnsmasq]] can help [https://steamcommunity.com/app/221410/discussions/2/616189106498372437/].<br />
<br />
=== "Needs to be online" error ===<br />
If the Steam launcher refuses to start and you get an error saying: "''Fatal Error: Steam needs to be online to update''" while you are online, then there might be issues with name resolution. <br />
<br />
Try to install {{Pkg|nss-mdns}}.<br />
<br />
=== Steam forgets password ===<br />
<br />
:Related: [https://github.com/ValveSoftware/steam-for-linux/issues/5030 steam-for-linux#5030]<br />
Steam for Linux has a bug which causes it to forget the password of some users.<br />
<br />
As a workaround, after logging in to Steam, run<br />
$ chmod -w ~/.steam/registry.vdf<br />
This makes the file read-only so Steam cannot modify it, and thus cannot log you out.<br />
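To verify the file really is write-protected, inspect its permission bits; a scratch file stands in for {{ic|registry.vdf}} here:<br />

```shell
f=$(mktemp)                   # stands in for ~/.steam/registry.vdf
chmod -w "$f"
ls -l "$f" | cut -c1-10       # no 'w' flags remain in the mode string
```

To make the file writable again (for example, to switch accounts), run {{ic|chmod u+w ~/.steam/registry.vdf}}.<br />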
<br />
=== Preventing crash memory dumps ===<br />
<br />
Every time Steam crashes, it writes a memory dump to {{ic|/tmp/dumps/}}. If Steam falls into a crash loop, the dump files can become quite large. When {{ic|/tmp}} is mounted as [[tmpfs]], memory and swap space can be consumed needlessly.<br />
<br />
To prevent this, link {{ic|/tmp/dumps/}} to {{ic|/dev/null}}:<br />
# ln -s /dev/null /tmp/dumps<br />
<br />
Alternatively, create {{ic|/tmp/dumps}} with restrictive permissions, so that Steam is unable to write dump files to the directory:<br />
<br />
# mkdir /tmp/dumps<br />
# chmod 600 /tmp/dumps<br />
<br />
This also has the added benefit of Steam not uploading these dumps to Valve's servers.<br />
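A scratch-directory sketch of why the symlink variant works: once {{ic|dumps}} points at {{ic|/dev/null}}, creating files ''inside'' it fails, because {{ic|/dev/null}} is not a directory:<br />

```shell
d=$(mktemp -d)                      # stands in for /tmp
ln -s /dev/null "$d/dumps"
readlink "$d/dumps"                 # -> /dev/null
# Attempting to create a dump file under the link now fails (ENOTDIR):
( echo crash > "$d/dumps/core.txt" ) 2>/dev/null || echo "write refused"
```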
<br />
=== Steam license problem with playing videos ===<br />
<br />
Steam uses [[w:Widevine|Google's Widevine DRM]] for some videos. If it is not installed you will get the following error:<br />
<br />
This video requires a license to play which cannot be retrieved. This may be a temporary network condition. Please restart the video to try again.<br />
<br />
To solve this issue follow the [https://support.steampowered.com/kb_article.php?ref=8699-OASD-1871#15 ''Streaming Videos on Steam'' support page].<br />
<br />
== In-home streaming issues ==<br />
<br />
See [[Steam#In-home streaming]].<br />
<br />
=== In-home streaming does not work from archlinux host to archlinux guest ===<br />
<br />
Chances are you are missing {{Pkg|lib32-libcanberra}}. Once you [[install]] that, it should work as expected.<br />
<br />
With that, Steam should no longer crash when trying to launch a game through in-home streaming.<br />
<br />
=== Hardware decoding not available ===<br />
<br />
In-home streaming hardware decoding uses {{ic|vaapi}}, so it needs to be installed (or wrapped around {{ic|vdpau}}). See [[hardware video acceleration]]. Remember to install the {{ic|lib32}} versions as well.<br />
<br />
=== Big Picture Mode minimizes itself after losing focus ===<br />
<br />
This can occur when you play a game via in-home streaming or if you have a multi-monitor setup and move the mouse outside of BPM's window. To prevent this, set the following environment variable and restart Steam:<br />
<br />
export SDL_VIDEO_MINIMIZE_ON_FOCUS_LOSS=0<br />
<br />
See also the [https://github.com/ValveSoftware/steam-for-linux/issues/4769 steam-for-linux issue 4769].<br />
<br />
== Other issues ==<br />
<br />
=== Wrong ELF class ===<br />
<br />
If you see this message in Steam's console output<br />
<br />
ERROR: ld.so: object '~/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.<br />
<br />
you can safely ignore it. It is not really an error: Steam includes both 64-bit and 32-bit versions of some libraries and only one version will load successfully. This "error" is displayed even when Steam (and the in-game overlay) is working perfectly.<br />
<br />
=== Multiple monitors setup ===<br />
{{Expansion|Is this Nvidia-only? Can this be reproduced by anyone? Is there an upstream report?}}<br />
A setup with multiple monitors may prevent games from starting. Try disabling all additional displays and then running the game. You can re-enable them after the game has started successfully. <br />
<br />
You can also try running Steam with this environment variable set:<br />
<br />
export LD_LIBRARY_PATH=/usr/lib32/nvidia:/usr/lib/nvidia:$LD_LIBRARY_PATH<br />
<br />
=== Text is corrupt or missing ===<br />
<br />
The Steam Support [https://support.steampowered.com/kb_article.php?ref=1974-YFKL-4947 instructions] for Windows seem to work on Linux also.<br />
<br />
You can install the fonts via the {{AUR|steam-fonts}} package, or manually by downloading and [[fonts#Manual installation|installing]] [https://support.steampowered.com/downloads/1974-YFKL-4947/SteamFonts.zip SteamFonts.zip].<br />
<br />
{{Note|When Steam cannot find the Arial fonts, fontconfig likes to fall back on the Helvetica bitmap font. Steam does not render this and possibly other bitmap fonts correctly, so either removing problem fonts or [[Font configuration#Disable bitmap fonts|disabling bitmap fonts]] will most likely fix the issue without installing the Arial or ArialBold fonts.<br />
<br />
The font being used in place of Arial can be found with the command {{bc|$ fc-match -v Arial}}}}<br />
<br />
=== SetLocale('en_US.UTF-8') fails at game startup ===<br />
<br />
You need to generate the {{ic|en_US.UTF-8 UTF-8}} locale. See [[Locale#Generating locales]].<br />
<br />
=== Missing libc ===<br />
<br />
This could be due to a corrupt Steam executable. Check the output of:<br />
<br />
$ ldd ~/.local/share/Steam/ubuntu12_32/steam<br />
<br />
If {{ic|ldd}} claims that it is not a dynamic executable, then Steam likely corrupted the binary during an update. The following should fix the issue:<br />
<br />
$ cd ~/.local/share/Steam/<br />
$ ./steam.sh --reset<br />
<br />
If it does not, try deleting the {{ic|~/.local/share/Steam/}} directory and launching Steam again, telling it to reinstall itself.<br />
<br />
This error message can also occur due to a bug in Steam which occurs when your {{ic|$HOME}} directory ends in a slash (Valve GitHub [https://github.com/ValveSoftware/steam-for-linux/issues/3730 issue 3730]). This can be fixed by editing {{ic|/etc/passwd}} and changing {{ic|/home/<username>/}} to {{ic|/home/<username>}}, then logging out and in again. Afterwards, Steam should repair itself automatically.<br />
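The trailing slash can also be stripped mechanically. The sketch below edits a scratch copy with a made-up entry; for the real {{ic|/etc/passwd}}, prefer {{ic|vipw}} or your editor of choice:<br />

```shell
demo=$(mktemp)                 # stands in for /etc/passwd
echo 'user:x:1000:1000::/home/user/:/bin/bash' > "$demo"
# Remove a slash occurring right before the final (shell) field.
sed -i 's|/\(:[^:]*\)$|\1|' "$demo"
cat "$demo"
```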
<br />
=== Games do not launch on older Intel hardware ===<br />
<br />
:[https://steamcommunity.com/app/8930/discussions/1/540744299927655197/ source]<br />
<br />
On older Intel hardware which does not support OpenGL 3, such as Intel GMA chips or Westmere CPUs, games may immediately crash when run. This appears as a {{ic|gameoverlayrenderer.so}} error in {{ic|/tmp/dumps/mobile_stdout.txt}}, but {{ic|/tmp/gameoverlayrenderer.log}} shows a GLXBadFBConfig error. <br />
<br />
This can be fixed by forcing the game to use a later version of OpenGL than it requests.<br />
Add {{ic|1=MESA_GL_VERSION_OVERRIDE=3.1 MESA_GLSL_VERSION_OVERRIDE=140}} to your [[launch option]]s.<br />
<br />
=== Mesa: Game does not launch, complaining about OpenGL version supported by the card ===<br />
<br />
Some games are badly programmed and do not request the OpenGL version they actually use.<br />
With Mesa, an application has to request a specific core profile to get access to OpenGL versions above 3.0.<br />
If it does not make such a request, only OpenGL 3.0 and lower are available.<br />
<br />
This can be fixed by forcing the game to use the OpenGL version it actually needs.<br />
Add {{ic|1=MESA_GL_VERSION_OVERRIDE=4.1 MESA_GLSL_VERSION_OVERRIDE=410}} to your [[launch option]]s.<br />
<br />
=== 2K games do not run on XFS partitions ===<br />
<br />
{{Expansion|Seems to be a general issue, e.g. [https://github.com/ValveSoftware/Source-1-Games/issues/1685]}}<br />
<br />
If you are running 2K games such as Civilization 5 on [[XFS]] partitions, then the game may not start or run properly due to how the game loads files as it starts.<br />
[https://bbs.archlinux.org/viewtopic.php?id=185222]<br />
<br />
=== Steam controller not being detected correctly ===<br />
<br />
See [[Gamepad#Steam Controller]].<br />
<br />
=== Steam hangs on "Installing breakpad exception handler..." ===<br />
<br />
[https://bbs.archlinux.org/viewtopic.php?id=177245 BBS#177245]<br />
<br />
You have an Nvidia GPU and Steam has the following output:<br />
<br />
Running Steam on arch rolling 64-bit<br />
STEAM_RUNTIME is enabled automatically<br />
Installing breakpad exception handler for appid(steam)/version(0_client)<br />
<br />
Then nothing else happens. Ensure you have the correct drivers installed as well as their 32-bit versions: see [[NVIDIA#Installation]].<br />
<br />
=== Killing standalone compositors when launching games ===<br />
<br />
Utilising the {{ic|%command%}} switch, you can kill standalone compositors (such as Xcompmgr or [[Compton]]) - which can cause lag and tearing in some games on some systems - and relaunch them after the game ends by adding the following to your game's launch options:<br />
<br />
killall compton && %command%; nohup compton -b &<br />
<br />
Replace {{ic|compton}} in the above command with whatever your compositor is. You can also pass options to {{ic|%command%}} or {{ic|compton}}, of course.<br />
<br />
Steam will latch on to any processes launched after {{ic|%command%}} and your Steam status will show as in game. So in this example, we run the compositor through {{ic|nohup}} so it is not attached to Steam (it will keep running if you close Steam) and follow it with an ampersand so that the line of commands ends, clearing your Steam status.<br />
<br />
=== Symbol lookup error using DRI3 ===<br />
<br />
Steam outputs this error and exits.<br />
<br />
symbol lookup error: /usr/lib/libxcb-dri3.so.0: undefined symbol: xcb_send_request_with_fds<br />
<br />
To work around this, run Steam with {{ic|1=LIBGL_DRI3_DISABLE=1}}, disabling DRI3 for Steam.<br />
<br />
=== Launching games on Nvidia optimus laptops ===<br />
<br />
To play games which require the Nvidia GPU (for example, Hitman 2016) on an Optimus-enabled laptop, you should start Steam with the ''primusrun'' prefix. Otherwise, the game will not work.<br />
Keep in mind that issuing a command such as {{ic|primusrun steam}} while Steam is already running will not restart it. You should explicitly exit Steam and then start it via the {{ic|primusrun steam}} command, or start a game immediately on launch, for example with {{ic|primusrun steam steam://rungameid/236870}}.<br />
Once Steam has been launched with the primusrun prefix, you do not need to prefix individual games with primusrun or optirun.<br />
<br />
For primusrun, VSYNC is enabled by default. This can result in mouse input delay, slightly decreased performance, and the in-game FPS being locked to the refresh rate of the monitor/display.<br />
In order to disable VSYNC for primusrun, the default value of the {{ic|vblank_mode}} option needs to be overridden by an environment variable:<br />
<br />
{{ic|1=vblank_mode=0 primusrun steam}}<br />
<br />
The same applies to optirun when using primus as a bridge: <br />
<br />
{{ic|1=vblank_mode=0 optirun -b primus steam}}<br />
<br />
For more details see [[Bumblebee#Primusrun mouse delay (disable VSYNC)]].</div>
Vi six
https://wiki.archlinux.org/index.php?title=Steam/Troubleshooting&diff=515719
Steam/Troubleshooting - 2018-04-03T12:26:36Z
<p>Vi six: /* Launching games on Nvidia optimus laptops */ added example of disabling VSYNC for primusrun with overrided vblank_mode option.</p>
<hr />
<div>[[Category:Gaming]]<br />
[[ru:Steam/Troubleshooting]]<br />
[[ja:Steam/トラブルシューティング]]<br />
== Introduction ==<br />
<br />
# Make sure that you have followed [[Steam#Installation]].<br />
# If the Steam client / a game is not starting and/or you get an error message about a library, read [[#Steam runtime]] and see [[#Debugging shared libraries]].<br />
# If the issue is related to networking, make sure that you have forwarded the [https://support.steampowered.com/kb_article.php?ref=8571-GLVN-8711 required ports for Steam].<br />
# If the issue is about a game, consult [[Steam/Game-specific troubleshooting]].<br />
<br />
=== Relevant online resources ===<br />
<br />
* [https://bbs.archlinux.org/viewforum.php?id=32 Multimedia and Games / Arch Linux Forums]<br />
* [https://github.com/ValveSoftware/steam-for-linux ValveSoftware/steam-for-linux] – Issue tracking for the Steam for Linux client<br />
* [https://steamcommunity.com/ Steam Community discussions for the game]<br />
* [https://help.steampowered.com/en/ Steam Support FAQ]<br />
<br />
== Steam runtime ==<br />
<br />
Steam for Linux ships with its own set of libraries called the [https://github.com/ValveSoftware/steam-runtime Steam runtime]. By default Steam launches all Steam Applications within the runtime environment.<br />
The Steam runtime is located at {{ic|~/.steam/root/ubuntu12_32/steam-runtime/}}.<br />
<br />
If you mix the Steam runtime libraries with system libraries you will run into binary incompatibility issues, see [https://github.com/ValveSoftware/steam-for-linux/issues/4768 steam-for-linux issue #4768].<br />
Binary incompatibility can lead to the Steam client and games not starting (manifesting as a crash, as hanging or silently returning), audio issues and various other problems.<br />
<br />
The {{Pkg|steam}} package offers three ways to launch Steam:<br />
<br />
* {{ic|steam-runtime}} (alias {{ic|steam}}), which overrides runtime libraries known to cause problems via the {{ic|LD_PRELOAD}} [[environment variable]] (see {{man|8|ld.so}}).<br />
* {{ic|steam-native}}, see [[#Steam native runtime]]<br />
* {{ic|/usr/lib/steam/steam}}, the default Steam launch script<br />
<br />
As the Steam runtime libraries are older they can lack newer features, e.g. the OpenAL version of the Steam runtime lacks [[Gaming#Binaural_Audio_with_OpenAL|HRTF]] and surround71 support.<br />
<br />
=== Steam native runtime ===<br />
<br />
{{Warning|Using the Steam native runtime is not recommended as it might break some games due to binary incompatibility and it might miss some libraries present in the Steam runtime.}}<br />
<br />
The {{ic|steam-native}} script launches Steam with the {{ic|1=STEAM_RUNTIME=0}} environment variable making it ignore its runtime and only use system libraries.<br />
<br />
The {{Pkg|steam-native-runtime}} meta package depends on over 120 packages to provide a native replacement for the Steam runtime; some games may however still require additional packages. You can also use the Steam native runtime without {{Pkg|steam-native-runtime}} by manually installing just the packages you need. See [[#Finding missing runtime libraries]].<br />
<br />
== Debugging shared libraries ==<br />
<br />
To see the shared libraries required by a program or a shared library run the {{ic|ldd}} command on it, see {{man|1|ldd}}. The {{ic|LD_LIBRARY_PATH}} and {{ic|LD_PRELOAD}} [[environment variables]] can alter which shared libraries are loaded, see {{man|8|ld.so}}. <br />
To correctly debug a program or shared library it is therefore important that these environment variables in your debug environment match the environment you wish to debug.<br />
<br />
If you figure out a missing library you can use [[pacman]] or [[pkgfile]] to search for packages that contain the missing library.<br />
<br />
=== Finding missing game libraries ===<br />
<br />
If a game fails to start, a possible reason is that it is missing required libraries. You can find out what libraries it requests by running {{ic|ldd ''game_executable''}}. {{ic|''game_executable''}} is likely located somewhere in {{ic|~/.steam/root/steamapps/common/}}. Please note that most of these "missing" libraries are actually already included with Steam, and do not need to be installed globally.<br />
<br />
=== Finding missing runtime libraries ===<br />
<br />
If individual games or Steam itself is failing to launch when using {{ic|steam-native}} you are probably missing libraries. To find the required libraries run:<br />
<br />
$ cd ~/.steam/root/ubuntu12_32<br />
$ file * | grep ELF | cut -d: -f1 | LD_LIBRARY_PATH=. xargs ldd | grep 'not found' | sort | uniq<br />
<br />
Alternatively, run Steam with {{ic|steam-runtime}} and use the following command to see which non-system libraries Steam is using (not all of these are part of the Steam runtime):<br />
<br />
$ for i in $(pgrep steam); do sed '/\.local/!d;s/.* //g' /proc/$i/maps; done | sort | uniq<br />
<br />
== Debugging Steam ==<br />
<br />
The Steam launcher redirects its stdout and stderr to {{ic|/tmp/dumps/''USER''_stdout.txt}}.<br />
This means you do not have to run Steam from the command-line to see that output.<br />
<br />
It is possible to debug Steam to gain more information which could be useful to find out why something does not work.<br />
<br />
You can set {{ic|DEBUGGER}} environment variable with one of {{ic|gdb}}, {{ic|cgdb}}, {{ic|valgrind}}, {{ic|callgrind}}, {{ic|strace}} and then start {{ic|steam}}.<br />
<br />
For example with {{Pkg|gdb}}<br />
{{bc|1=$ DEBUGGER=gdb steam}}<br />
<br />
{{ic|gdb}} will open, then type {{ic|run}} which will start {{ic|steam}} and once crash happens you can type {{ic|backtrace}} to see call stack.<br />
<br />
== Runtime issues ==<br />
<br />
=== Segmentation fault when disabling runtime ===<br />
<br />
:[https://github.com/ValveSoftware/steam-for-linux/issues/3863 steam-for-linux issue #3863]<br />
<br />
As per the bug report above, Steam crashes with the following error message when run with {{ic|1=STEAM_RUNTIME=0}}:<br />
<br />
/home/''USER''/.local/share/Steam/steam.sh: line 756: <variable numeric code> Segmentation fault (core dumped)<br />
<br />
This happens because {{ic|steamclient.so}} is linked to {{ic|libudev.so.0}} ({{AUR|lib32-libudev0}}) which conflicts with {{ic|libudev.so.1}} ({{Pkg|lib32-systemd}}).<br />
<br />
A proposed workaround is to copy Steam's packaged 32-bit versions of libusb and libgudev to {{ic|/usr/lib32}}:<br />
<br />
# cp ~/.steam/root/ubuntu12_32/steam-runtime/i386/usr/lib/i386-linux-gnu/libgudev* /usr/lib32<br />
# cp ~/.steam/root/ubuntu12_32/steam-runtime/i386/lib/i386-linux-gnu/libusb* /usr/lib32<br />
<br />
Notice that the workaround is necessary because the bug affects systems with lib32-libgudev and lib32-libusb installed.<br />
<br />
Alternatively it has been successful to prioritize the loading of the libudev.so.1 (see [https://github.com/ValveSoftware/steam-for-linux/issues/3863#issuecomment-203929113 comment on the same issue]):<br />
{{bc|1=$ LD_PRELOAD=/usr/lib32/libudev.so.1 STEAM_RUNTIME=0 steam}}<br />
<br />
=== 'GLBCXX_3.X.XX' not found when using Bumblebee ===<br />
<br />
This error is likely caused because Steam packages its own out of date {{ic|libstdc++.so.6}}. See [[#Steam runtime issues]]{{Broken section link}} about working around the bad library. See also [https://github.com/ValveSoftware/steam-for-linux/issues/3773 steam-for-linux issue 3773].<br />
<br />
=== Game crashes immediately ===<br />
<br />
This is likely due to [[#Steam runtime]] issues, see [[#Debugging shared libraries]].<br />
<br />
Disabling the in-game Steam Overlay in the game properties might help.<br />
<br />
And finally, if those don't work, you should check Steam's output for any error from the game. You may encounter the following:<br />
* {{ic|munmap_chunk(): invalid pointer}}<br />
* {{ic|free(): invalid pointer}}<br />
<br />
In these cases, try replacing the {{ic|libsteam_api.so}} file from the problematic game with one of a game that works. This error usually happens for games that were not updated recently when Steam runtime is disabled. This error has been encountered with AYIM, Bastion and Monaco.<br />
<br />
=== Version `CURL_OPENSSL_3` not found ===<br />
<br />
This is because {{Pkg|curl}} alone is not compatible with previous versions. You need to install the compatibility libraries:<br />
<br />
One of the following messages may show up:<br />
<br />
# Nuclear Throne<br />
./nuclearthrone: /usr/lib32/libcurl.so.4: version `CURL_OPENSSL_3' not found (required by ./nuclearthrone)<br />
<br />
# Devil Daggers<br />
./devildaggers: /usr/lib/libcurl.so.4: version `CURL_OPENSSL_3' not found (required by ./devildaggers)<br />
<br />
You need to install either {{Pkg|libcurl-compat}} or {{Pkg|lib32-libcurl-compat}} and link the compatibility library manually:<br />
<br />
# Nuclear Throne<br />
$ ln -s /usr/lib32/libcurl-compat.so.4.4.0 "''LIBRARY''/steamapps/common/Nuclear Throne/lib/libcurl.so.4"<br />
<br />
# Devil Daggers<br />
$ ln -s /usr/lib/libcurl-compat.so.4.4.0 ''LIBRARY''/steamapps/common/devildaggers/lib64/libcurl.so.4<br />
<br />
== Audio issues ==<br />
<br />
If the sections below do not address the issue, using the [[#Steam native runtime]] might help.<br />
<br />
=== Configure PulseAudio ===<br />
<br />
Games that explicitly depend on ALSA can break PulseAudio. Follow the directions for [[PulseAudio#ALSA]] to make these games use PulseAudio instead.<br />
<br />
=== No audio or 756 Segmentation fault ===<br />
<br />
First [[#Configure PulseAudio]] and see if that resolves the issue. If you do not have audio in the videos which play within the Steam client, it is possible that the ALSA libraries packaged with Steam are not working.<br />
<br />
Attempting to playback a video within the steam client results in an error similar to:<br />
<br />
ALSA lib pcm_dmix.c:1018:(snd_pcm_dmix_open) unable to open slave<br />
<br />
A workaround is to rename or delete the {{ic|alsa-lib}} folder and the {{ic|libasound.so.*}} files. They can be found at:<br />
<br />
~/.steam/steam/ubuntu12_32/steam-runtime/i386/usr/lib/i386-linux-gnu/<br />
<br />
An alternative workaround is to add the {{ic|libasound.so.*}} library to the {{ic|LD_PRELOAD}} environment variable:<br />
<br />
LD_PRELOAD='/usr/$LIB/libasound.so.2 '${LD_PRELOAD} steam<br />
<br />
If audio still won't work, adding the Pulseaudio-libs to the {{ic|LD_PRELOAD}} variable may help:<br />
<br />
LD_PRELOAD='/usr/$LIB/libpulse.so.0 /usr/$LIB/libpulse-simple.so.0 '${LD_PRELOAD}<br />
<br />
Be advised that their names may change over time. If so, it is necessary to take a look in <br />
<br />
~/.steam/ubuntu12_32/steam-runtime/i386/usr/lib/i386-linux-gnu<br />
<br />
and find the new libraries and their versions.<br />
<br />
Bugs reports have been filed: [https://github.com/ValveSoftware/steam-for-linux/issues/3376 #3376] and [https://github.com/ValveSoftware/steam-for-linux/issues/3504 #3504]<br />
<br />
=== FMOD sound engine ===<br />
{{Accuracy|No source / bug report.}}<br />
<br />
The [https://www.fmod.com/ FMOD] audio middleware package is a bit buggy, and as a result games using it may have sound problems.<br />
<br />
It usually occurs when an unused sound device is used as default for ALSA. See [[Advanced Linux Sound Architecture#Set the default sound card]].<br />
<br />
:Affected games: Hotline Miami, Hotline Miami 2, Transistor<br />
<br />
=== PulseAudio & OpenAL: Audio streams can't be moved between devices ===<br />
<br />
If you use [[PulseAudio]] and cannot move an audio stream between sinks, it might be because recent OpenAL versions default to disallow audio streams from being moved. Try to add the following to your {{ic|~/.alsoftrc}}:<br />
<br />
[pulse]<br />
allow-moves=true<br />
<br />
== Steam client issues ==<br />
<br />
=== Cannot add library folder because of missing execute permissions ===<br />
<br />
If you add another Steam library folder on another drive, you might get the error message:<br />
<br />
New Steam library folder must be on a filesystem mounted with execute permissions<br />
<br />
Make sure you are mounting the filesystem with the correct flags in your {{ic|/etc/fstab}}, usually by adding {{ic|exec}} to the list of mount parameter. The parameter must occur after any {{ic|user}} or {{ic|users}} parameter since these can imply {{ic|noexec}}.<br />
<br />
This error might also occur if your library folder does not contain a {{ic|steamapps}} directory. Previous versions used {{ic|SteamApps}} instead, so ensure the name is fully lowercase.<br />
<br />
This error can also occur because of Steam runtime issues and may be fixed following the [[#Dynamic linker]]{{Broken section link}} section.<br />
<br />
=== Unusually slow download speed ===<br />
<br />
If your Steam apps (games, software…) download speed through the client is unusually slow, but browsing the Steam store and streaming videos is unaffected, installing a DNS cache program, such as [[dnsmasq]] can help [https://steamcommunity.com/app/221410/discussions/2/616189106498372437/].<br />
<br />
=== "Needs to be online" error ===<br />
If the Steam launcher refuses to start and you get an error saying: "''Fatal Error: Steam needs to be online to update''" while you are online, then there might be issues with name resolving. <br />
<br />
Try to install {{Pkg|nss-mdns}}.<br />
<br />
=== Steam forgets password ===<br />
<br />
:Related: [https://github.com/ValveSoftware/steam-for-linux/issues/5030 steam-for-linux#5030]<br />
Steam for Linux has a bug which causes it to forget the password of some users.<br />
<br />
As a workaround, after logging in to Steam, run<br />
$ chmod -w ~/.steam/registry.vdf<br />
This makes the file read-only so that Steam cannot modify it and log you out.<br />
<br />
=== Preventing crash memory dumps ===<br />
<br />
Every time Steam crashes, it writes a memory dump to {{ic|/tmp/dumps/}}. If Steam falls into a crash loop, the dump files can become quite large. When {{ic|/tmp}} is mounted as [[tmpfs]], memory and swap space can be consumed needlessly.<br />
<br />
To prevent this, link {{ic|/tmp/dumps/}} to {{ic|/dev/null}}:<br />
# ln -s /dev/null /tmp/dumps<br />
<br />
Alternatively, create {{ic|/tmp/dumps}} and restrict its permissions so that Steam is unable to write dump files to the directory:<br />
<br />
# mkdir /tmp/dumps<br />
# chmod 600 /tmp/dumps<br />
<br />
This also has the added benefit of Steam not uploading these dumps to Valve's servers.<br />
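<br />
The two alternatives above can be combined into a small sketch; the {{ic|disable_steam_dumps}} helper is illustrative, and the directory is parameterized only for testing (the real path is {{ic|/tmp/dumps}}):<br />

```shell
# Sketch: replace the dumps directory with a /dev/null symlink so Steam
# cannot write crash dumps there.
disable_steam_dumps() {
    dir="${1:-/tmp/dumps}"
    rm -rf "$dir"            # remove any existing directory or stale link
    ln -s /dev/null "$dir"   # writes to the path now go nowhere
}
```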
<br />
=== Steam license problem with playing videos ===<br />
<br />
Steam uses [[w:Widevine|Google's Widevine DRM]] for some videos. If it is not installed you will get the following error:<br />
<br />
This video requires a license to play which cannot be retrieved. This may be a temporary network condition. Please restart the video to try again.<br />
<br />
To solve this issue follow the [https://support.steampowered.com/kb_article.php?ref=8699-OASD-1871#15 ''Streaming Videos on Steam'' support page].<br />
<br />
== In-home streaming issues ==<br />
<br />
See [[Steam#In-home streaming]].<br />
<br />
=== In-home streaming does not work from archlinux host to archlinux guest ===<br />
<br />
Chances are you are missing {{Pkg|lib32-libcanberra}}. Once you [[install]] that, it should work as expected.<br />
<br />
With that, Steam should no longer crash when trying to launch a game through in-home streaming.<br />
<br />
=== Hardware decoding not available ===<br />
<br />
In-home streaming hardware decoding uses {{ic|vaapi}}, so it needs to be installed (or wrapped around {{ic|vdpau}}). See [[hardware video acceleration]]. Remember to install the {{ic|lib32}} versions as well.<br />
<br />
=== Big Picture Mode minimizes itself after losing focus ===<br />
<br />
This can occur when you play a game via in-home streaming or if you have a multi-monitor setup and move the mouse outside of BPM's window. To prevent this, set the following environment variable and restart Steam:<br />
<br />
export SDL_VIDEO_MINIMIZE_ON_FOCUS_LOSS=0<br />
<br />
See also the [https://github.com/ValveSoftware/steam-for-linux/issues/4769 steam-for-linux issue 4769].<br />
<br />
== Other issues ==<br />
<br />
=== Wrong ELF class ===<br />
<br />
If you see this message in Steam's console output<br />
<br />
ERROR: ld.so: object '~/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.<br />
<br />
you can safely ignore it. It is not really any error: Steam includes both 64- and 32-bit versions of some libraries and only one version will load successfully. This "error" is displayed even when Steam (and the in-game overlay) is working perfectly.<br />
<br />
=== Multiple monitors setup ===<br />
{{Expansion|Is this Nvidia-only? Can this be reproduced by anyone? Is there an upstream report?}}<br />
A setup with multiple monitors may prevent games from starting. Try disabling all additional displays, then run the game. You can re-enable them after the game has started successfully.<br />
<br />
You can also try running Steam with this environment variable set:<br />
<br />
export LD_LIBRARY_PATH=/usr/lib32/nvidia:/usr/lib/nvidia:$LD_LIBRARY_PATH<br />
<br />
=== Text is corrupt or missing ===<br />
<br />
The Steam Support [https://support.steampowered.com/kb_article.php?ref=1974-YFKL-4947 instructions] for Windows seem to work on Linux as well.<br />
<br />
You can install the required fonts via the {{AUR|steam-fonts}} package, or manually by downloading and [[fonts#Manual installation|installing]] [https://support.steampowered.com/downloads/1974-YFKL-4947/SteamFonts.zip SteamFonts.zip].<br />
<br />
{{Note|When Steam cannot find the Arial fonts, fontconfig falls back to the Helvetica bitmap font. Steam does not render this and possibly other bitmap fonts correctly, so either removing the problem fonts or [[Font configuration#Disable bitmap fonts|disabling bitmap fonts]] will most likely fix the issue without installing the Arial or Arial Bold fonts.<br />
<br />
The font being used in place of Arial can be found with the command {{bc|$ fc-match -v Arial}}}}<br />
<br />
=== SetLocale('en_US.UTF-8') fails at game startup ===<br />
<br />
You need to generate the {{ic|en_US.UTF-8 UTF-8}} locale. See [[Locale#Generating locales]].<br />
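<br />
To verify that the locale has been generated, a sketch check (the {{ic|has_en_us_locale}} helper is an illustrative name):<br />

```shell
# Sketch: verify that the en_US.UTF-8 locale has been generated.
# glibc's `locale -a` prints it as "en_US.utf8".
has_en_us_locale() {
    locale -a 2>/dev/null | grep -qi '^en_US\.utf-\{0,1\}8$'
}

has_en_us_locale || echo "Generate en_US.UTF-8: uncomment it in /etc/locale.gen and run locale-gen"
```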
<br />
=== Missing libc ===<br />
<br />
This could be due to a corrupt Steam executable. Check the output of:<br />
<br />
$ ldd ~/.local/share/Steam/ubuntu12_32/steam<br />
<br />
If {{ic|ldd}} claims that it is not a dynamic executable, Steam likely corrupted the binary during an update. The following should fix the issue:<br />
<br />
$ cd ~/.local/share/Steam/<br />
$ ./steam.sh --reset<br />
<br />
If it does not, try deleting the {{ic|~/.local/share/Steam/}} directory and launching Steam again, letting it reinstall itself.<br />
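<br />
These steps can be scripted as a sketch; {{ic|steam_binary_ok}} and {{ic|STEAM_DIR}} are illustrative names, and the reset is only attempted when {{ic|ldd}} actually reports a corrupted binary:<br />

```shell
# Sketch: detect a corrupted (non-dynamic) Steam binary before resetting.
STEAM_DIR="${STEAM_DIR:-$HOME/.local/share/Steam}"

steam_binary_ok() {
    # A healthy binary lists its shared libraries; a corrupted one makes
    # ldd report "not a dynamic executable".
    ! ldd "$1" 2>&1 | grep -q 'not a dynamic executable'
}

if [ -f "$STEAM_DIR/ubuntu12_32/steam" ] && ! steam_binary_ok "$STEAM_DIR/ubuntu12_32/steam"; then
    (cd "$STEAM_DIR" && ./steam.sh --reset)
fi
```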
<br />
This error message can also occur due to a bug in Steam which is triggered when your {{ic|$HOME}} directory ends in a slash (Valve GitHub [https://github.com/ValveSoftware/steam-for-linux/issues/3730 issue 3730]). This can be fixed by editing {{ic|/etc/passwd}} and changing {{ic|/home/<username>/}} to {{ic|/home/<username>}}, then logging out and in again. Afterwards, Steam should repair itself automatically.<br />
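<br />
A sketch for detecting the trailing slash (the {{ic|strip_trailing_slash}} helper is an illustrative name):<br />

```shell
# Sketch: warn if $HOME ends in a slash, which triggers the Steam bug.
strip_trailing_slash() {
    printf '%s\n' "${1%/}"   # drop one trailing slash, if present
}

case "$HOME" in
    */) echo "Fix your /etc/passwd entry: home should be $(strip_trailing_slash "$HOME")" ;;
esac
```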
<br />
=== Games do not launch on older Intel hardware ===<br />
<br />
:[https://steamcommunity.com/app/8930/discussions/1/540744299927655197/ source]<br />
<br />
On older Intel hardware that does not support OpenGL 3, such as Intel GMA chips or Westmere CPUs, games may crash immediately when run. The crash appears as a {{ic|gameoverlayrenderer.so}} error in {{ic|/tmp/dumps/mobile_stdout.txt}}, but {{ic|/tmp/gameoverlayrenderer.log}} shows a GLXBadFBConfig error.<br />
<br />
This can be fixed by forcing the game to use a later version of OpenGL than it requests.<br />
Add {{ic|1=MESA_GL_VERSION_OVERRIDE=3.1 MESA_GLSL_VERSION_OVERRIDE=140}} to your [[launch option]]s.<br />
<br />
=== Mesa: Game does not launch, complaining about OpenGL version supported by the card ===<br />
<br />
Some games are badly programmed: to use any OpenGL version above 3.0 with Mesa, an application has to request a specific core profile. If it does not make such a request, only OpenGL 3.0 and lower are available.<br />
<br />
This can be fixed by forcing the game to use the OpenGL version it actually needs.<br />
Add {{ic|1=MESA_GL_VERSION_OVERRIDE=4.1 MESA_GLSL_VERSION_OVERRIDE=410}} to your [[launch option]]s.<br />
<br />
=== 2K games do not run on XFS partitions ===<br />
<br />
{{Expansion|Seems to be a general issue, e.g. [https://github.com/ValveSoftware/Source-1-Games/issues/1685]}}<br />
<br />
If you are running 2K games such as Civilization 5 on [[XFS]] partitions, then the game may not start or run properly due to how the game loads files as it starts.<br />
[https://bbs.archlinux.org/viewtopic.php?id=185222]<br />
<br />
=== Steam controller not being detected correctly ===<br />
<br />
See [[Gamepad#Steam Controller]].<br />
<br />
=== Steam hangs on "Installing breakpad exception handler..." ===<br />
<br />
[https://bbs.archlinux.org/viewtopic.php?id=177245 BBS#177245]<br />
<br />
You have an Nvidia GPU and Steam has the following output:<br />
<br />
Running Steam on arch rolling 64-bit<br />
STEAM_RUNTIME is enabled automatically<br />
Installing breakpad exception handler for appid(steam)/version(0_client)<br />
<br />
Then nothing else happens. Ensure you have the correct drivers installed as well as their 32-bit versions: see [[NVIDIA#Installation]].<br />
<br />
=== Killing standalone compositors when launching games ===<br />
<br />
Using the {{ic|%command%}} switch, you can kill standalone compositors (such as Xcompmgr or [[Compton]]), which can cause lag and tearing in some games on some systems, and relaunch them after the game ends by adding the following to your game's launch options.<br />
<br />
killall compton && %command%; nohup compton &<br />
<br />
Replace {{ic|compton}} in the above command with your compositor. You can, of course, also pass options to {{ic|%command%}} or {{ic|compton}}.<br />
<br />
Steam will latch on to any processes launched after {{ic|%command%}} and your Steam status will show as in game. So in this example, we run the compositor through {{ic|nohup}} so it is not attached to Steam (it will keep running if you close Steam) and follow it with an ampersand so that the line of commands ends, clearing your Steam status.<br />
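<br />
As a sketch, the same launch option can be wrapped in a small script and used as {{ic|wrapper.sh %command%}}; the script name and the {{ic|COMPOSITOR}} variable are illustrative, not Steam conventions:<br />

```shell
# Sketch: kill the compositor, run the game, then relaunch the compositor
# detached from Steam. COMPOSITOR defaults to compton for illustration.
COMPOSITOR="${COMPOSITOR:-compton}"

run_without_compositor() {
    # Stop the compositor if it is running (killall is in the psmisc package)
    command -v killall >/dev/null && killall "$COMPOSITOR" 2>/dev/null || true
    status=0
    "$@" || status=$?                        # run the game (%command% expands here)
    nohup "$COMPOSITOR" >/dev/null 2>&1 &    # relaunch, not attached to Steam
    return $status
}

# Example (in a wrapper script): run_without_compositor "$@"
```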
<br />
=== Symbol lookup error using DRI3 ===<br />
<br />
Steam outputs this error and exits.<br />
<br />
symbol lookup error: /usr/lib/libxcb-dri3.so.0: undefined symbol: xcb_send_request_with_fds<br />
<br />
To work around this, run Steam with {{ic|1=LIBGL_DRI3_DISABLE=1}}, disabling DRI3 for Steam.<br />
<br />
=== Launching games on Nvidia optimus laptops ===<br />
<br />
To play games which require the NVIDIA GPU (for example, Hitman 2016) on an Optimus-enabled laptop, you should start Steam with the ''primusrun'' prefix; otherwise, the game will not work.<br />
Keep in mind that issuing a command such as {{ic|primusrun steam}} while Steam is already running will not restart it. You should explicitly exit and then start Steam via the {{ic|primusrun steam}} command, or start a game immediately after launch, for example with {{ic|primusrun steam steam://rungameid/236870}}.<br />
Once Steam has been launched with the primusrun prefix, you do not need to prefix individual games with primusrun or optirun.<br />
<br />
For primusrun, VSYNC is enabled by default; as a result, it can cause mouse input delay, slightly decrease performance, and lock the in-game FPS to the refresh rate of the monitor/display.<br />
To disable VSYNC for primusrun, the default value of the {{ic|vblank_mode}} option needs to be overridden by an environment variable:<br />
<br />
{{ic|1=vblank_mode=0 primusrun steam}}<br />
<br />
The same applies to optirun when it uses primus as a bridge:<br />
<br />
{{ic|1=vblank_mode=0 optirun -b primus steam}}<br />
<br />
For more details see [[Bumblebee#Primusrun mouse delay (disable VSYNC)]].</div>
Vi six
https://wiki.archlinux.org/index.php?title=Bumblebee&diff=469485 Bumblebee 2017-03-01T19:40:03Z
<p>Vi six: /* Installation */ better formatting</p>
<hr />
<div>[[Category:Graphics]]<br />
[[Category:X server]]<br />
[[es:Bumblebee]]<br />
[[fr:Bumblebee]]<br />
[[it:Bumblebee]]<br />
[[ja:Bumblebee]]<br />
[[ru:Bumblebee]]<br />
[[tr:Bumblebee]]<br />
[[zh-hans:Bumblebee]]<br />
{{Related articles start}}<br />
{{Related|NVIDIA Optimus}}<br />
{{Related|Nouveau}}<br />
{{Related|NVIDIA}}<br />
{{Related|Intel graphics}}<br />
{{Related articles end}}<br />
From Bumblebee's [https://github.com/Bumblebee-Project/Bumblebee/wiki/FAQ FAQ]:<br />
<br />
:Bumblebee is an effort to make NVIDIA Optimus enabled laptops work in GNU/Linux systems. Such feature involves two graphics cards with two different power consumption profiles plugged in a layered way sharing a single framebuffer.<br />
<br />
== Bumblebee: Optimus for Linux ==<br />
<br />
[http://www.nvidia.com/object/optimus_technology.html Optimus Technology] is a ''[http://hybrid-graphics-linux.tuxfamily.org/index.php?title=Hybrid_graphics hybrid graphics]'' implementation without a hardware multiplexer. The integrated GPU manages the display while the dedicated GPU handles the most demanding rendering and ships the work to the integrated GPU to be displayed. When the laptop is running on battery supply, the dedicated GPU is turned off to save power and prolong the battery life. It has also been tested successfully with desktop machines with Intel integrated graphics and an NVIDIA dedicated graphics card.<br />
<br />
Bumblebee is a software implementation comprising two parts:<br />
<br />
* Render programs off-screen on the dedicated video card and display them on the screen using the integrated video card. This bridge is provided by VirtualGL or primus (read further) and connects to an X server started for the discrete video card.<br />
* Disable the dedicated video card when it is not in use (see the [[#Power management]] section)<br />
<br />
It tries to mimic the Optimus technology behavior: using the dedicated GPU for rendering when needed and powering it down when not in use. The present releases only support rendering on demand; automatically starting a program with the discrete video card based on workload is not implemented.<br />
<br />
== Installation ==<br />
<br />
Before installing Bumblebee, check your BIOS and activate Optimus (older laptops call it "switchable graphics") if possible (the BIOS might not provide this option). If neither "Optimus" nor "switchable" appears in the BIOS, still make sure both GPUs will be enabled and that the integrated graphics (igfx) is the initial (primary) display. The display should be connected to the onboard integrated graphics, not the discrete graphics card. If integrated graphics had previously been disabled and discrete graphics drivers installed, be sure to remove {{ic|/etc/X11/xorg.conf}} or the file in {{ic|/etc/X11/xorg.conf.d}} related to the discrete graphics card.<br />
<br />
=== Installing Bumblebee with Intel/NVIDIA ===<br />
<br />
[[Install]]:<br />
* {{Pkg|bumblebee}} - The main package providing the daemon and client programs.<br />
* {{Pkg|mesa}} - An open-source implementation of the '''OpenGL''' specification.<br />
* {{Pkg|nvidia}} or {{Pkg|nvidia-340xx}} or {{Pkg|nvidia-304xx}} - Install appropriate NVIDIA driver. For more information read [[NVIDIA#Installation]].<br />
* Optionally install {{Pkg|xf86-video-intel}} - Intel driver.<br />
<br />
For 32-bit ([[Multilib]] must be enabled) applications support on 64-bit machines, install:<br />
* {{Pkg|lib32-virtualgl}} - A render/display bridge for 32 bit applications.<br />
* {{Pkg|lib32-nvidia-utils}} or {{Pkg|lib32-nvidia-340xx-utils}} or {{Pkg|lib32-nvidia-304xx-utils}} - match the version of the 64 bit package.<br />
<br />
In order to use Bumblebee, it is necessary to add your regular ''user'' to the {{ic|bumblebee}} group:<br />
<br />
# gpasswd -a ''user'' bumblebee<br />
<br />
Also [[enable]] {{ic|bumblebeed.service}}. Reboot your system and follow [[#Usage]].<br />
<br />
=== Installing Bumblebee with Intel/Nouveau ===<br />
<br />
{{Warning|This method is deprecated and [https://github.com/Bumblebee-Project/Bumblebee/issues/773 will not work anymore]. Use the nvidia module instead. If you want nouveau, use [[PRIME]].}}<br />
<br />
Install:<br />
* {{Pkg|xf86-video-nouveau}} - experimental 3D acceleration driver.<br />
* {{Pkg|mesa}} - Mesa classic DRI with Gallium3D drivers and 3D graphics libraries.<br />
<br />
{{Note|1=If, when using {{ic|primusrun}} on a system with the nouveau driver, you are getting:<br />
primus: fatal: failed to load any of the libraries: /usr/$LIB/nvidia/libGL.so.1 <br />
/usr/$LIB/nvidia/libGL.so.1: Cannot open shared object file: No such file or directory<br />
<br />
You should add the following in {{ic|/usr/bin/primus}} after {{ic|PRIMUS_libGL}}:<br />
export PRIMUS_libGLa='/usr/$LIB/libGL.so.1'<br />
<br />
If you want, create a new script (for example ''primusnouveau'').<br />
}}<br />
<br />
== Usage ==<br />
<br />
=== Test ===<br />
<br />
Install {{Pkg|mesa-demos}} and use {{ic|glxgears}} to test if Bumblebee works with your Optimus system:<br />
$ optirun glxgears -info<br />
<br />
If it fails, try the following commands:<br />
<br />
*64 bit system:<br />
$ optirun glxspheres64<br />
*32 bit system:<br />
$ optirun glxspheres32<br />
<br />
If the window with animation shows up, Optimus with Bumblebee is working.<br />
<br />
{{Note|If {{ic|glxgears}} fails but {{ic|glxspheres''XX''}} works, replace "{{ic|glxgears}}" with "{{ic|glxspheres''XX''}}" throughout this article.}}<br />
<br />
=== General usage ===<br />
<br />
$ optirun [options] ''application'' [application-parameters]<br />
<br />
For example, start Windows applications with Optimus:<br />
<br />
$ optirun wine application.exe<br />
<br />
For another example, open NVIDIA Settings panel with Optimus:<br />
<br />
$ optirun -b none nvidia-settings -c :8<br />
<br />
{{Note|A patched version of {{Pkg|nvdock}} is available in the package {{AUR|nvdock-bumblebee}}}}<br />
<br />
For a list of the options for {{ic|optirun}}, view its manual page:<br />
<br />
$ man optirun<br />
<br />
== Configuration ==<br />
<br />
You can configure the behaviour of Bumblebee to fit your needs. Fine tuning like speed optimization, power management and other features can be configured in {{ic|/etc/bumblebee/bumblebee.conf}}.<br />
<br />
=== Optimizing speed ===<br />
<br />
==== Using VirtualGL as bridge ====<br />
<br />
Bumblebee renders frames for your Optimus NVIDIA card in an invisible X server with VirtualGL and transports them back to your visible X server. Frames are compressed before they are transported; this saves bandwidth and can be used for speed optimization of Bumblebee.<br />
<br />
To use another compression method for a single application:<br />
<br />
$ optirun -c ''compress-method'' application<br />
<br />
The compression method affects CPU/GPU usage: compressed methods mostly load the CPU, while uncompressed methods mostly load the GPU.<br />
<br />
Compressed methods<br />
:*{{ic|jpeg}}<br />
:*{{ic|rgb}}<br />
:*{{ic|yuv}}<br />
<br />
Uncompressed methods<br />
:*{{ic|proxy}}<br />
:*{{ic|xv}}<br />
<br />
Here is a performance table tested with [[ASUS N550JV]] laptop and benchmark app {{AUR|unigine-heaven}}:<br />
<br />
{| class="wikitable"<br />
! Command !! FPS !! Score !! Min FPS !! Max FPS<br />
|-<br />
| optirun unigine-heaven || 25.0 || 630 || 16.4 || 36.1<br />
|-<br />
| optirun -c jpeg unigine-heaven || 24.2 || 610 || 9.5 || 36.8<br />
|-<br />
| optirun -c rgb unigine-heaven || 25.1 || 632 || 16.6 || 35.5<br />
|-<br />
| optirun -c yuv unigine-heaven || 24.9 || 626 || 16.5 || 35.8<br />
|-<br />
| optirun -c proxy unigine-heaven || 25.0 || 629 || 16.0 || 36.1<br />
|-<br />
| optirun -c xv unigine-heaven || 22.9 || 577 || 15.4 || 32.2<br />
|}<br />
{{Note|Lag spikes occurred when {{ic|jpeg}} compression method was used.}}<br />
<br />
To use a standard compression for all applications, set the {{ic|VGLTransport}} to {{ic|''compress-method''}} in {{ic|/etc/bumblebee/bumblebee.conf}}:<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|2=<br />
[...]<br />
[optirun]<br />
VGLTransport=proxy<br />
[...]<br />
}}<br />
<br />
You can also play with the way VirtualGL reads back the pixels from your graphic card. Setting {{ic|VGL_READBACK}} environment variable to {{ic|pbo}} should increase the performance. Compare these two:<br />
<br />
# PBO should be faster.<br />
VGL_READBACK=pbo optirun glxgears<br />
# The default value is sync.<br />
VGL_READBACK=sync optirun glxgears<br />
<br />
{{Note|CPU frequency scaling directly affects render performance.}}<br />
<br />
==== Primusrun ====<br />
<br />
{{Note|Since compositing hurts performance, invoking primus when a compositing WM is active is not recommended. See [[#Primus issues under compositing window managers]].}}<br />
{{ic|primusrun}} (from package {{Pkg|primus}}) is becoming the default choice, because it consumes less power and sometimes provides better performance than {{ic|optirun}}/{{ic|virtualgl}}. It may be run separately, but it does not accept options as {{ic|optirun}} does. Setting {{ic|primus}} as the bridge for {{ic|optirun}} provides more flexibility.<br />
<br />
For 32-bit applications support on 64-bit machines, install {{Pkg|lib32-primus}} ([[multilib]] must be enabled).<br />
<br />
Usage (run separately):<br />
$ primusrun glxgears<br />
<br />
Usage (as a bridge for {{ic|optirun}}):<br />
<br />
The default configuration sets {{ic|virtualgl}} as the bridge. Override that on the command line:<br />
$ optirun -b primus glxgears<br />
<br />
Or, set {{ic|1=Bridge=primus}} in {{ic|/etc/bumblebee/bumblebee.conf}} and you won't have to specify it on the command line.<br />
<br />
{{Tip|Refer to [[#Primusrun mouse delay (disable VSYNC)]] if you want to disable {{ic|VSYNC}}. It can also remove mouse input delay lag and slightly increase the performance.}}<br />
<br />
=== Power management ===<br />
<br />
The goal of the power management feature is to turn off the NVIDIA card when it is not used by Bumblebee any more. If {{Pkg|bbswitch}} (or {{Pkg|bbswitch-dkms}}) is installed, it will be detected automatically when the Bumblebee daemon starts. No additional configuration is necessary. However, {{Pkg|bbswitch}} is for [https://bugs.launchpad.net/ubuntu/+source/bbswitch/+bug/1338404/comments/6 Optimus laptops only and will not work on desktop computers]. So, Bumblebee power management is not available for desktop computers, and there is no reason to install {{Pkg|bbswitch}} on a desktop. (Nevertheless, the other features of Bumblebee do work on some desktop computers.)<br />
<br />
==== Default power state of NVIDIA card using bbswitch ====<br />
<br />
The default behavior of bbswitch is to leave the card power state unchanged. {{ic|bumblebeed}} does disable the card when started, so the following is only necessary if you use bbswitch without bumblebeed.<br />
<br />
Set {{ic|load_state}} and {{ic|unload_state}} module options according to your needs (see [https://github.com/Bumblebee-Project/bbswitch bbswitch documentation]).<br />
{{hc|/etc/modprobe.d/bbswitch.conf|2=<br />
options bbswitch load_state=0 unload_state=1<br />
}}<br />
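<br />
To check the power state that bbswitch currently reports, a sketch reader can be used; the {{ic|gpu_state}} helper is an illustrative name, and the control file is parameterized only for testing (the real path is {{ic|/proc/acpi/bbswitch}}):<br />

```shell
# Sketch: read the current discrete GPU power state from bbswitch.
BBSWITCH="${BBSWITCH:-/proc/acpi/bbswitch}"

gpu_state() {
    # bbswitch reports a line like: 0000:01:00.0 OFF
    awk '{ print $2; exit }' "${1:-$BBSWITCH}"
}

# Example: [ "$(gpu_state)" = OFF ] && echo "discrete GPU is powered down"
```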
<br />
==== Enable NVIDIA card during shutdown ====<br />
On some laptops, the NVIDIA card may not correctly initialize during boot if the card was powered off when the system was last shutdown. Therefore the Bumblebee daemon will power on the GPU when stopping the daemon (e.g. on shutdown) due to the (default) setting {{ic|TurnCardOffAtExit&#61;false}} in {{ic|/etc/bumblebee/bumblebee.conf}}. Note that this setting does not influence power state while the daemon is running, so if all {{ic|optirun}} or {{ic|primusrun}} programs have exited, the GPU will still be powered off.<br />
<br />
When you stop the daemon manually, you might want to keep the card powered off while still powering it on on shutdown. To achieve the latter, add the following [[systemd]] service (if using {{pkg|bbswitch}}):<br />
<br />
{{hc|/etc/systemd/system/nvidia-enable.service|2=<br />
[Unit]<br />
Description=Enable NVIDIA card<br />
DefaultDependencies=no<br />
<br />
[Service]<br />
Type=oneshot<br />
ExecStart=/bin/sh -c 'echo ON > /proc/acpi/bbswitch'<br />
<br />
[Install]<br />
WantedBy=shutdown.target<br />
}}<br />
<br />
Then [[enable]] the {{ic|nvidia-enable.service}} unit.<br />
<br />
==== Enable NVIDIA card after waking from suspend ====<br />
The bumblebee daemon may fail to activate the graphics card after suspending. A possible fix involves setting {{Pkg|bbswitch}} as the default method for power management in {{ic|/etc/bumblebee/bumblebee.conf}}:<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|2=<br />
[driver-nvidia]<br />
PMMethod=bbswitch<br />
<br />
# ...<br />
<br />
[driver-nouveau]<br />
PMMethod=bbswitch<br />
}}<br />
<br />
{{Note|This fix seems to work only after rebooting the system. Restarting the bumblebee service is not enough.}}<br />
<br />
=== Multiple monitors ===<br />
<br />
==== Outputs wired to the Intel chip ====<br />
<br />
If the port (DisplayPort/HDMI/VGA) is wired to the Intel chip, you can set up multiple monitors with xorg.conf. Set them to use the Intel card, but Bumblebee can still use the NVIDIA card. One example configuration is below for two identical screens with 1080p resolution and using the HDMI out.<br />
<br />
{{hc|/etc/X11/xorg.conf|2=<br />
Section "Screen"<br />
Identifier "Screen0"<br />
Device "intelgpu0"<br />
Monitor "Monitor0"<br />
DefaultDepth 24<br />
Option "TwinView" "0"<br />
SubSection "Display"<br />
Depth 24<br />
Modes "1920x1080_60.00"<br />
EndSubSection<br />
EndSection<br />
<br />
Section "Screen"<br />
Identifier "Screen1"<br />
Device "intelgpu1"<br />
Monitor "Monitor1"<br />
DefaultDepth 24<br />
Option "TwinView" "0"<br />
SubSection "Display"<br />
Depth 24<br />
Modes "1920x1080_60.00"<br />
EndSubSection<br />
EndSection<br />
<br />
Section "Monitor"<br />
Identifier "Monitor0"<br />
Option "Enable" "true"<br />
EndSection<br />
<br />
Section "Monitor"<br />
Identifier "Monitor1"<br />
Option "Enable" "true"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "intelgpu0"<br />
Driver "intel"<br />
Option "XvMC" "true"<br />
Option "UseEvents" "true"<br />
Option "AccelMethod" "UXA"<br />
BusID "PCI:0:2:0"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "intelgpu1"<br />
Driver "intel"<br />
Option "XvMC" "true"<br />
Option "UseEvents" "true"<br />
Option "AccelMethod" "UXA"<br />
BusID "PCI:0:2:0"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "nvidiagpu1"<br />
Driver "nvidia"<br />
BusID "PCI:0:1:0"<br />
EndSection<br />
<br />
}}<br />
<br />
You will probably need to change the BusID for both the Intel and the NVIDIA card.<br />
<br />
{{hc|<nowiki>$ lspci | grep VGA</nowiki>|<br />
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)<br />
}}<br />
<br />
The BusID here is 0:2:0.<br />
<br />
==== Output wired to the NVIDIA chip ====<br />
<br />
On some notebooks, the digital video output (HDMI or DisplayPort) is hardwired to the NVIDIA chip. If you want to use all the displays on such a system simultaneously, you have to run two X servers. The first will use the Intel driver for the notebook's panel and a display connected via VGA. The second will be started through optirun on the NVIDIA card and will drive the digital display.<br />
<br />
''intel-virtual-output'' is a tool provided in the {{Pkg|xf86-video-intel}} driver set, as of v2.99. Command-line usage is as follows:<br />
<br />
{{hc|$ intel-virtual-output [OPTION]... [TARGET_DISPLAY]...|<br />
-d <source display> source display<br />
-f keep in foreground (do not detach from console and daemonize)<br />
-b start bumblebee<br />
-a connect to all local displays (e.g. :1, :2, etc)<br />
-S disable use of a singleton and launch a fresh intel-virtual-output process<br />
-v all verbose output, implies -f<br />
-V <category> specific verbose output, implies -f<br />
-h this help}}<br />
<br />
If no target displays are specified on the command line, ''intel-virtual-output'' will attempt to connect to any local display. The detected displays can then be managed with tools such as xrandr or the KDE Display settings.<br />
<br />
The tool will also start bumblebee (which may be left as default install). See the [https://github.com/Bumblebee-Project/Bumblebee/wiki/Multi-monitor-setup Bumblebee wiki page] for more information.<br />
<br />
{{Note|In {{ic|/etc/bumblebee/xorg.conf.nvidia}} change the lines {{ic|UseEDID}} and {{ic|Option "AutoAddDevices" "false"}} to {{ic|"true"}}, if you are having trouble with device resolution detection. You will also need to comment out the line {{ic|Option "UseDisplayDevices" "none"}} in order to use the display connected to the NVIDIA GPU.}}<br />
<br />
When run in a terminal, it will daemonize itself unless the {{ic|-f}} switch is used. The advantage of using it in foreground mode is that once the external display is disconnected, ''intel-virtual-output'' can then be killed and bumblebee will disable the nvidia chip. Games can be run on the external screen by first exporting the display {{ic|1=export DISPLAY=:8}}, and then running the game with {{ic|optirun ''game_bin''}}, however, cursor and keyboard are not fully captured. Use {{ic|1=export DISPLAY=:0}} to revert back to standard operation.<br />
<br />
== CUDA without Bumblebee==<br />
<br />
You can use CUDA without bumblebee. All you need to do is ensure that the nvidia card is on:<br />
<br />
# tee /proc/acpi/bbswitch <<< ON<br />
<br />
Now, when you start a CUDA application, the necessary modules will be loaded automatically.<br />
<br />
To turn off the nvidia card after using CUDA do:<br />
<br />
# rmmod nvidia_uvm<br />
# rmmod nvidia<br />
# tee /proc/acpi/bbswitch <<< OFF<br />
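<br />
The power-on, run, power-off cycle above can be wrapped in a sketch helper; {{ic|run_cuda}} is an illustrative name, and the control file is parameterized only for testing (the real path is {{ic|/proc/acpi/bbswitch}}):<br />

```shell
# Sketch: run a CUDA program with the card powered on only for its duration.
BBSWITCH="${BBSWITCH:-/proc/acpi/bbswitch}"

run_cuda() {
    echo ON > "$BBSWITCH"       # power the card on
    status=0
    "$@" || status=$?           # run the CUDA program
    # Unload the driver modules so the card can be powered off again
    rmmod nvidia_uvm 2>/dev/null || true
    rmmod nvidia 2>/dev/null || true
    echo OFF > "$BBSWITCH"
    return $status
}

# Example (as root): run_cuda ./my_cuda_app
```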
<br />
== Troubleshooting ==<br />
<br />
{{Note|Please report bugs at [https://github.com/Bumblebee-Project/Bumblebee Bumblebee-Project]'s GitHub tracker as described in its [https://github.com/Bumblebee-Project/Bumblebee/wiki/Reporting-Issues wiki].}}<br />
<br />
=== [VGL] ERROR: Could not open display :8 ===<br />
<br />
There is a known problem with some wine applications that fork and kill the parent process without keeping track of it (for example, the free-to-play online game "Runes of Magic").<br />
<br />
This is a known problem with VirtualGL. As of Bumblebee 3.1, as long as {{Pkg|primus}} is installed, you can use it as your render bridge:<br />
<br />
$ optirun -b primus wine ''windows program''.exe<br />
<br />
If this does not work, an alternative workaround for this problem is:<br />
<br />
$ optirun bash<br />
$ optirun wine ''windows program''.exe<br />
<br />
If using NVIDIA drivers, a fix for this problem is to edit {{ic|/etc/bumblebee/xorg.conf.nvidia}} and change the {{ic|ConnectedMonitor}} option to {{ic|CRT-0}}.<br />
<br />
=== Xlib: extension "GLX" missing on display ":0.0" ===<br />
<br />
If you tried to install the NVIDIA driver from the NVIDIA website, it is not going to work.<br />
<br />
1. Uninstall that driver in a similar way:<br />
# ./NVIDIA-Linux-*.run --uninstall<br />
2. Remove the Xorg configuration file generated by NVIDIA:<br />
# rm /etc/X11/xorg.conf<br />
3. (Re)install the correct NVIDIA driver: [[#Installing Bumblebee with Intel/NVIDIA]]<br />
<br />
=== [ERROR]Cannot access secondary GPU: No devices detected ===<br />
<br />
In some instances, running {{ic|optirun}} will return:<br />
<br />
[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.<br />
[ERROR]Aborting because fallback start is disabled.<br />
<br />
In this case, you will need to move the file {{ic|/etc/X11/xorg.conf.d/20-intel.conf}} somewhere else, then [[restart]] the bumblebeed daemon and it should work. If you do need to set some options for the Intel module, merge {{ic|/etc/X11/xorg.conf.d/20-intel.conf}} into {{ic|/etc/X11/xorg.conf}} as a workaround.<br />
<br />
It could be also necessary to comment the driver line in {{ic|/etc/X11/xorg.conf.d/10-monitor.conf}}.<br />
<br />
If you are using the {{ic|nouveau}} driver, you could try switching to the {{ic|nvidia}} driver.<br />
<br />
You might need to define the NVIDIA card somewhere (e.g. in a file under {{ic|/etc/X11/xorg.conf.d}}), using the correct {{ic|BusID}} according to the {{ic|lspci}} output:<br />
<br />
{{bc|<br />
Section "Device"<br />
Identifier "nvidiagpu1"<br />
Driver "nvidia"<br />
BusID "PCI:0:1:0"<br />
EndSection<br />
}}<br />
<br />
Observe that {{ic|lspci}} outputs the bus address in hexadecimal, while Xorg expects the {{ic|BusID}} in decimal. So if the output of {{ic|lspci}} is, for example, {{ic|0a:00.0}}, the {{ic|BusID}} should be {{ic|PCI:10:0:0}}.<br />
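<br />
The conversion can also be done in a shell sketch (the {{ic|busid_from_lspci}} helper is an illustrative name):<br />

```shell
# Sketch: convert an lspci bus address (hex, "0a:00.0") to an Xorg BusID
# (decimal, "PCI:10:0:0").
busid_from_lspci() {
    # Split the address on ':' and '.', then print each part as decimal
    IFS=':.' read -r bus dev fn <<EOF
$1
EOF
    printf 'PCI:%d:%d:%d\n' "0x$bus" "0x$dev" "0x$fn"
}

# Example: busid_from_lspci 0a:00.0
```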
<br />
==== NVIDIA(0): Failed to assign any connected display devices to X screen 0 ====<br />
<br />
If the console output is:<br />
<br />
[ERROR]Cannot access secondary GPU - error: [XORG] (EE) NVIDIA(0): Failed to assign any connected display devices to X screen 0<br />
[ERROR]Aborting because fallback start is disabled.<br />
<br />
You can change this line in {{ic|/etc/bumblebee/xorg.conf.nvidia}}:<br />
<br />
Option "ConnectedMonitor" "DFP"<br />
<br />
to:<br />
<br />
Option "ConnectedMonitor" "CRT"<br />
<br />
==== Failed to initialize the NVIDIA GPU at PCI:1:0:0 (GPU fallen off the bus / RmInitAdapter failed!) ====<br />
<br />
Add {{ic|1=rcutree.rcu_idle_gp_delay=1}} to the [[kernel parameters]] of the [[bootloader]] configuration (see also the original [https://bbs.archlinux.org/viewtopic.php?id=169742 BBS post] for a configuration example).<br />
<br />
==== Could not load GPU driver ====<br />
<br />
If the console output is:<br />
<br />
[ERROR]Cannot access secondary GPU - error: Could not load GPU driver<br />
<br />
and if you try to load the nvidia module you get:<br />
<br />
modprobe nvidia<br />
modprobe: ERROR: could not insert 'nvidia': Exec format error<br />
<br />
This could be because the nvidia driver is out of sync with the Linux kernel, for example if you installed the latest nvidia driver and haven't updated the kernel in a while. A full system update might resolve the issue. If the problem persists you should try manually compiling the nvidia packages against your current kernel, for example with {{Pkg|nvidia-dkms}} or by compiling {{pkg|nvidia}} from the [[ABS]].<br />
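<br />
A quick way to confirm the mismatch, if the nvidia package is installed (both are standard tools; shown here only as a suggestion):<br />
<br />
$ uname -r<br />
$ modinfo -F vermagic nvidia<br />
<br />
If the kernel release and the module's version magic disagree, rebuild or reinstall the module as described above.<br />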
<br />
==== NOUVEAU(0): [drm] failed to set drm interface version ====<br />
<br />
Consider switching to the official nvidia driver. As commented [https://github.com/Bumblebee-Project/Bumblebee/issues/438#issuecomment-22005923 here], nouveau driver has some issues with some cards and bumblebee.<br />
<br />
=== /dev/dri/card0: failed to set DRM interface version 1.4: Permission denied ===<br />
<br />
This could be worked around by appending following lines in {{ic|/etc/bumblebee/xorg.conf.nvidia}} (see [https://github.com/Bumblebee-Project/Bumblebee/issues/580 here]):<br />
{{bc|<br />
Section "Screen"<br />
Identifier "Default Screen"<br />
Device "DiscreteNvidia"<br />
EndSection<br />
}}<br />
<br />
=== ERROR: ld.so: object 'libdlfaker.so' from LD_PRELOAD cannot be preloaded: ignored ===<br />
<br />
You probably want to start a 32-bit application with bumblebee on a 64-bit system. See the "For 32-bit..." section in [[#Installation]]. If the problem persists or if it is a 64-bit application, try using the [[#Primusrun|primus bridge]].<br />
<br />
=== Fatal IO error 11 (Resource temporarily unavailable) on X server ===<br />
<br />
Change {{ic|KeepUnusedXServer}} in {{ic|/etc/bumblebee/bumblebee.conf}} from {{ic|false}} to {{ic|true}}: your program forks into the background and Bumblebee does not know anything about it.<br />
<br />
=== Video tearing ===<br />
<br />
Video tearing is a somewhat common problem on Bumblebee. To fix it, you need to enable vsync. It should be enabled by default on the Intel card, but verify that from Xorg logs. To check whether or not it is enabled for NVIDIA, run: <br />
<br />
$ optirun nvidia-settings -c :8<br />
<br />
{{ic|1=X Server XVideo Settings -> Sync to VBlank}} and {{ic|1=OpenGL Settings -> Sync to VBlank}} should both be enabled. The Intel card generally has less tearing, so use it for video playback. In particular, use VA-API for video decoding (e.g. {{ic|mplayer-vaapi}} with the {{ic|-vsync}} parameter).<br />
<br />
Refer to the [[Intel#Video_tearing|Intel]]{{Broken section link}} article on how to fix tearing on the Intel card.<br />
<br />
If it is still not fixed, try to disable compositing from your desktop environment. Try also disabling triple buffering.<br />
<br />
=== Bumblebee cannot connect to socket ===<br />
<br />
You might get something like:<br />
<br />
$ optirun glxspheres64<br />
or (for 32 bit):<br />
{{hc|$ optirun glxspheres32|<br />
[ 1648.179533] [ERROR]You've no permission to communicate with the Bumblebee daemon. Try adding yourself to the 'bumblebee' group<br />
[ 1648.179628] [ERROR]Could not connect to bumblebee daemon - is it running?<br />
}}<br />
<br />
If you are already in the {{ic|bumblebee}} group ({{ic|<nowiki>$ groups | grep bumblebee</nowiki>}}), you may try [https://bbs.archlinux.org/viewtopic.php?pid=1178729#p1178729 removing the socket] {{ic|/var/run/bumblebeed.socket}}.<br />
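<br />
In command form (the socket path and unit name are the ones used elsewhere in this article):<br />
<br />
# rm /var/run/bumblebeed.socket<br />
# systemctl restart bumblebeed<br />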
<br />
Another reason for this error could be that you have not actually turned on both GPUs in your BIOS, and as a result, the Bumblebee daemon is in fact not running. Check the BIOS settings carefully and make sure Intel graphics (integrated graphics, which may be abbreviated in the BIOS as something like igfx) has been enabled or set to auto, and that it is the primary GPU. Your display should be connected to the onboard integrated graphics, not the discrete graphics card.<br />
<br />
If you mistakenly had the display connected to the discrete graphics card and Intel graphics was disabled, you probably installed Bumblebee after first trying to run NVIDIA alone. In this case, be sure to remove the {{ic|/etc/X11/xorg.conf}} or {{ic|.../20-nvidia...}} configuration files. If Xorg is instructed to use NVIDIA in a conf file, X will fail.<br />
<br />
=== Running X.org from console after login (rootless X.org) ===<br />
<br />
See [[Xorg#Rootless Xorg (v1.16)]].<br />
<br />
=== Primusrun mouse delay (disable VSYNC) ===<br />
<br />
For {{ic|primusrun}}, {{ic|VSYNC}} is enabled by default; as a result, it can cause mouse input lag and may slightly decrease performance. Test {{ic|primusrun}} with {{ic|VSYNC}} disabled:<br />
<br />
$ vblank_mode=0 primusrun glxgears<br />
<br />
If you are satisfied with the above setting, create an [[alias]] (e.g. {{ic|1=alias primusrun="vblank_mode=0 primusrun"}}).<br />
<br />
Performance comparison:<br />
<br />
{| class="wikitable"<br />
! VSYNC enabled !! FPS !! Score !! Min FPS !! Max FPS<br />
|-<br />
| FALSE || 31.5 || 793 || 22.3 || 54.8<br />
|-<br />
| TRUE || 31.4 || 792 || 18.7 || 54.2<br />
|}<br />
''Tested with an [[ASUS N550JV]] notebook and the benchmark app {{AUR|unigine-heaven}}.''<br />
<br />
{{Note|To disable vertical synchronization system-wide, see [[Intel graphics#Disable Vertical Synchronization (VSYNC)]].}}<br />
<br />
=== Primus issues under compositing window managers ===<br />
<br />
Since compositing hurts performance, invoking primus when a compositing WM is active is not recommended.[https://github.com/amonakov/primus#issues-under-compositing-wms]<br />
If you need to use primus with compositing and see flickering or bad performance, synchronizing primus' display thread with the application's rendering thread may help:<br />
<br />
$ PRIMUS_SYNC=1 primusrun ...<br />
<br />
This makes primus display the previously rendered frame.<br />
<br />
=== Problems with bumblebee after resuming from standby ===<br />
<br />
On some systems, it can happen that the nvidia module is loaded after resuming from standby.<br />
The solution is to install the {{Pkg|acpi_call}} and {{Pkg|acpi}} packages.<br />
<br />
=== Optirun doesn't work, no debug output ===<br />
<br />
Users are reporting that in some cases, even though Bumblebee was installed correctly, running <br />
<br />
$ optirun glxgears -info<br />
<br />
gives no output at all, and the glxgears window does not appear. Any program that needs 3D acceleration crashes:<br />
<br />
$ optirun bash<br />
$ glxgears<br />
Segmentation fault (core dumped)<br />
<br />
Apparently this is a bug in some versions of VirtualGL. A workaround is to [[install]] {{Pkg|primus}} and {{Pkg|lib32-primus}} and use them instead:<br />
<br />
$ primusrun glxspheres64<br />
$ optirun -b primus glxspheres64<br />
<br />
By default primus locks the framerate to the refresh rate of your monitor (usually 60 FPS); if needed, it can be unlocked by passing the {{ic|vblank_mode&#61;0}} environment variable.<br />
<br />
$ vblank_mode=0 primusrun glxspheres64<br />
<br />
Usually there is no need to display more frames than your monitor can handle, but you might want to for benchmarking or to get faster reactions in games. For example, if a game needs 3 frames to react to a mouse movement, then with {{ic|vblank_mode&#61;0}} the reaction will be as quick as your system can render those frames; without it, at a 60 Hz refresh rate, it will always take 3/60 = 1/20 of a second.<br />
<br />
You might want to edit {{ic|/etc/bumblebee/bumblebee.conf}} to use the primus render as default. If after an update you want to check if the bug has been fixed just use {{ic|optirun -b virtualgl}}.<br />
<br />
See [https://bbs.archlinux.org/viewtopic.php?pid=1643609 this forum post] for more information.<br />
<br />
=== Broken power management with kernel 4.8 ===<br />
{{Out of date|Fixed on nvidia 375.26+}}<br />
If you have a newer laptop (BIOS date 2015 or newer), then Linux 4.8 might break bbswitch ([https://github.com/Bumblebee-Project/bbswitch/issues/140 bbswitch issue 140]) since bbswitch does not support the newer, recommended power management method. As a result, the dGPU may fail to power on, fail to power off or worse.<br />
<br />
As a workaround, add {{ic|1=pcie_port_pm=off}} to your [[Kernel parameters]].<br />
<br />
Alternatively, if you are only interested in power saving (and perhaps use of external monitors), remove bbswitch and rely on [[Nouveau]] runtime power-management (which supports the new method).<br />
<br />
=== Lockup issue (lspci hangs) ===<br />
See [[NVIDIA_Optimus#Lockup_issue_.28lspci_hangs.29]] for an issue that affects new laptops with a GTX 965M (or similar).<br />
<br />
== See also ==<br />
<br />
* [http://www.bumblebee-project.org Bumblebee project repository]<br />
* [http://wiki.bumblebee-project.org/ Bumblebee project wiki]<br />
* [https://github.com/Bumblebee-Project/bbswitch Bumblebee project bbswitch repository]<br />
<br />
Join us at #bumblebee at freenode.net.</div>Vi sixhttps://wiki.archlinux.org/index.php?title=Bumblebee&diff=469484Bumblebee2017-03-01T19:37:42Z<p>Vi six: /* Installing Bumblebee with Intel/NVIDIA */ It is not necessary to install xf86-video-intel</p>
<hr />
<div>[[Category:Graphics]]<br />
[[Category:X server]]<br />
[[es:Bumblebee]]<br />
[[fr:Bumblebee]]<br />
[[it:Bumblebee]]<br />
[[ja:Bumblebee]]<br />
[[ru:Bumblebee]]<br />
[[tr:Bumblebee]]<br />
[[zh-hans:Bumblebee]]<br />
{{Related articles start}}<br />
{{Related|NVIDIA Optimus}}<br />
{{Related|Nouveau}}<br />
{{Related|NVIDIA}}<br />
{{Related|Intel graphics}}<br />
{{Related articles end}}<br />
From Bumblebee's [https://github.com/Bumblebee-Project/Bumblebee/wiki/FAQ FAQ]:<br />
<br />
:Bumblebee is an effort to make NVIDIA Optimus enabled laptops work in GNU/Linux systems. Such feature involves two graphics cards with two different power consumption profiles plugged in a layered way sharing a single framebuffer.<br />
<br />
== Bumblebee: Optimus for Linux ==<br />
<br />
[http://www.nvidia.com/object/optimus_technology.html Optimus Technology] is a ''[http://hybrid-graphics-linux.tuxfamily.org/index.php?title=Hybrid_graphics hybrid graphics]'' implementation without a hardware multiplexer. The integrated GPU manages the display while the dedicated GPU manages the most demanding rendering and ships the work to the integrated GPU to be displayed. When the laptop is running on battery supply, the dedicated GPU is turned off to save power and prolong the battery life. It has also been tested successfully with desktop machines with Intel integrated graphics and an NVIDIA dedicated graphics card.<br />
<br />
Bumblebee is a software implementation comprising two parts:<br />
<br />
* Render programs off-screen on the dedicated video card and display them on the screen using the integrated video card. This bridge is provided by VirtualGL or primus (read further) and connects to an X server started for the discrete video card.<br />
* Disable the dedicated video card when it is not in use (see the [[#Power management]] section)<br />
<br />
It tries to mimic the Optimus technology behavior: using the dedicated GPU for rendering when needed and powering it down when not in use. The present releases only support rendering on demand; automatically starting a program with the discrete video card based on workload is not implemented.<br />
<br />
== Installation ==<br />
<br />
Before installing Bumblebee, check your BIOS and activate Optimus (older laptops call it "switchable graphics") if possible (the BIOS may not provide this option). If neither "Optimus" nor "switchable" is in the BIOS, still make sure both GPUs are enabled and that the integrated graphics (igfx) is the initial (primary) display. The display should be connected to the onboard integrated graphics, not the discrete graphics card. If integrated graphics had previously been disabled and discrete graphics drivers installed, be sure to remove {{ic|/etc/X11/xorg.conf}} or the conf file in {{ic|/etc/X11/xorg.conf.d}} related to the discrete graphics card.<br />
<br />
=== Installing Bumblebee with Intel/NVIDIA ===<br />
<br />
[[Install]]:<br />
* {{Pkg|bumblebee}} - The main package providing the daemon and client programs.<br />
* {{Pkg|mesa}} - An open-source implementation of the '''OpenGL''' specification.<br />
* Optionally install {{Pkg|xf86-video-intel}} - Intel driver.<br />
* {{Pkg|nvidia}} or {{Pkg|nvidia-340xx}} or {{Pkg|nvidia-304xx}} - Install appropriate NVIDIA driver. For more information read [[NVIDIA#Installation]].<br />
<br />
For 32-bit ([[Multilib]] must be enabled) applications support on 64-bit machines, install:<br />
* {{Pkg|lib32-virtualgl}} - A render/display bridge for 32 bit applications.<br />
* {{Pkg|lib32-nvidia-utils}} or {{Pkg|lib32-nvidia-340xx-utils}} or {{Pkg|lib32-nvidia-304xx-utils}} - match the version of the 64 bit package.<br />
<br />
In order to use Bumblebee, it is necessary to add your regular ''user'' to the {{ic|bumblebee}} group:<br />
<br />
# gpasswd -a ''user'' bumblebee<br />
<br />
Also [[enable]] {{ic|bumblebeed.service}}. Reboot your system and follow [[#Usage]].<br />
<br />
=== Installing Bumblebee with Intel/Nouveau ===<br />
<br />
{{Warning|This method is deprecated and [https://github.com/Bumblebee-Project/Bumblebee/issues/773 will not work anymore]. Use the nvidia module instead. If you want nouveau, use [[PRIME]].}}<br />
<br />
Install:<br />
* {{Pkg|xf86-video-nouveau}} - experimental 3D acceleration driver.<br />
* {{Pkg|mesa}} - Mesa classic DRI with Gallium3D drivers and 3D graphics libraries.<br />
<br />
{{Note|1=If, when using {{ic|primusrun}} on a system with the nouveau driver, you are getting:<br />
primus: fatal: failed to load any of the libraries: /usr/$LIB/nvidia/libGL.so.1 <br />
/usr/$LIB/nvidia/libGL.so.1: Cannot open shared object file: No such file or directory<br />
<br />
You should add the following in {{ic|/usr/bin/primus}} after {{ic|PRIMUS_libGL}}:<br />
export PRIMUS_libGLa='/usr/$LIB/libGL.so.1'<br />
<br />
If you want, create a new script (for example ''primusnouveau'').<br />
}}<br />
<br />
== Usage ==<br />
<br />
=== Test ===<br />
<br />
Install {{Pkg|mesa-demos}} and use {{ic|glxgears}} to test if Bumblebee works with your Optimus system:<br />
$ optirun glxgears -info<br />
<br />
If it fails, try the following commands:<br />
<br />
*64 bit system:<br />
$ optirun glxspheres64<br />
*32 bit system:<br />
$ optirun glxspheres32<br />
<br />
If the window with animation shows up, Optimus with Bumblebee is working.<br />
<br />
{{Note|If {{ic|glxgears}} failed, but {{ic|glxspheres''XX''}} worked, always replace "{{ic|glxgears}}" with "{{ic|glxspheres''XX''}}" in all cases.}}<br />
<br />
=== General usage ===<br />
<br />
$ optirun [options] ''application'' [application-parameters]<br />
<br />
For example, start Windows applications with Optimus:<br />
<br />
$ optirun wine application.exe<br />
<br />
For another example, open NVIDIA Settings panel with Optimus:<br />
<br />
$ optirun -b none nvidia-settings -c :8<br />
<br />
{{Note|A patched version of {{Pkg|nvdock}} is available in the package {{AUR|nvdock-bumblebee}}}}<br />
<br />
For a list of the options for {{ic|optirun}}, view its manual page:<br />
<br />
$ man optirun<br />
<br />
== Configuration ==<br />
<br />
You can configure the behaviour of Bumblebee to fit your needs. Fine tuning, such as speed optimization and power management, can be configured in {{ic|/etc/bumblebee/bumblebee.conf}}.<br />
<br />
=== Optimizing speed ===<br />
<br />
==== Using VirtualGL as bridge ====<br />
<br />
Bumblebee renders frames for your Optimus NVIDIA card in an invisible X server with VirtualGL and transports them back to your visible X server. Frames are compressed before they are transported; this saves bandwidth and can be used for speed optimization of Bumblebee.<br />
<br />
To use another compression method for a single application:<br />
<br />
$ optirun -c ''compress-method'' application<br />
<br />
The compression method affects GPU/CPU usage: compressed methods mostly load the CPU, while uncompressed methods mostly load the GPU.<br />
<br />
Compressed methods<br />
:*{{ic|jpeg}}<br />
:*{{ic|rgb}}<br />
:*{{ic|yuv}}<br />
<br />
Uncompressed methods<br />
:*{{ic|proxy}}<br />
:*{{ic|xv}}<br />
<br />
Here is a performance table tested with an [[ASUS N550JV]] laptop and the benchmark app {{AUR|unigine-heaven}}:<br />
<br />
{| class="wikitable"<br />
! Command !! FPS !! Score !! Min FPS !! Max FPS<br />
|-<br />
| optirun unigine-heaven || 25.0 || 630 || 16.4 || 36.1<br />
|-<br />
| optirun -c jpeg unigine-heaven || 24.2 || 610 || 9.5 || 36.8<br />
|-<br />
| optirun -c rgb unigine-heaven || 25.1 || 632 || 16.6 || 35.5<br />
|-<br />
| optirun -c yuv unigine-heaven || 24.9 || 626 || 16.5 || 35.8<br />
|-<br />
| optirun -c proxy unigine-heaven || 25.0 || 629 || 16.0 || 36.1<br />
|-<br />
| optirun -c xv unigine-heaven || 22.9 || 577 || 15.4 || 32.2<br />
|}<br />
{{Note|Lag spikes occurred when {{ic|jpeg}} compression method was used.}}<br />
<br />
To use a standard compression for all applications, set the {{ic|VGLTransport}} to {{ic|''compress-method''}} in {{ic|/etc/bumblebee/bumblebee.conf}}:<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|2=<br />
[...]<br />
[optirun]<br />
VGLTransport=proxy<br />
[...]<br />
}}<br />
<br />
You can also tune the way VirtualGL reads back the pixels from your graphics card. Setting the {{ic|VGL_READBACK}} environment variable to {{ic|pbo}} should increase performance. Compare these two:<br />
<br />
# PBO should be faster.<br />
VGL_READBACK=pbo optirun glxgears<br />
# The default value is sync.<br />
VGL_READBACK=sync optirun glxgears<br />
<br />
{{Note|CPU frequency scaling directly affects render performance.}}<br />
<br />
==== Primusrun ====<br />
<br />
{{Note|Since compositing hurts performance, invoking primus when a compositing WM is active is not recommended. See [[#Primus issues under compositing window managers]].}}<br />
{{ic|primusrun}} (from package {{Pkg|primus}}) is becoming the default choice, because it consumes less power and sometimes provides better performance than {{ic|optirun}}/{{ic|virtualgl}}. It may be run separately, but it does not accept options as {{ic|optirun}} does. Setting {{ic|primus}} as the bridge for {{ic|optirun}} provides more flexibility.<br />
<br />
For 32-bit applications support on 64-bit machines, install {{Pkg|lib32-primus}} ([[multilib]] must be enabled).<br />
<br />
Usage (run separately):<br />
$ primusrun glxgears<br />
<br />
Usage (as a bridge for {{ic|optirun}}):<br />
<br />
The default configuration sets {{ic|virtualgl}} as the bridge. Override that on the command line:<br />
$ optirun -b primus glxgears<br />
<br />
Or, set {{ic|1=Bridge=primus}} in {{ic|/etc/bumblebee/bumblebee.conf}} and you won't have to specify it on the command line.<br />
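<br />
For example (a minimal fragment; other settings in the file are left unchanged):<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|2=<br />
[optirun]<br />
Bridge=primus<br />
}}<br />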
<br />
{{Tip|Refer to [[#Primusrun mouse delay (disable VSYNC)]] if you want to disable {{ic|VSYNC}}. It can also remove mouse input delay lag and slightly increase the performance.}}<br />
<br />
=== Power management ===<br />
<br />
The goal of the power management feature is to turn off the NVIDIA card when it is not used by Bumblebee any more. If {{Pkg|bbswitch}} (or {{Pkg|bbswitch-dkms}}) is installed, it will be detected automatically when the Bumblebee daemon starts. No additional configuration is necessary. However, {{Pkg|bbswitch}} is for [https://bugs.launchpad.net/ubuntu/+source/bbswitch/+bug/1338404/comments/6 Optimus laptops only and will not work on desktop computers]. So, Bumblebee power management is not available for desktop computers, and there is no reason to install {{Pkg|bbswitch}} on a desktop. (Nevertheless, the other features of Bumblebee do work on some desktop computers.)<br />
<br />
==== Default power state of NVIDIA card using bbswitch ====<br />
<br />
The default behavior of bbswitch is to leave the card power state unchanged. {{ic|bumblebeed}} does disable the card when started, so the following is only necessary if you use bbswitch without bumblebeed.<br />
<br />
Set {{ic|load_state}} and {{ic|unload_state}} module options according to your needs (see [https://github.com/Bumblebee-Project/bbswitch bbswitch documentation]).<br />
{{hc|/etc/modprobe.d/bbswitch.conf|2=<br />
options bbswitch load_state=0 unload_state=1<br />
}}<br />
<br />
==== Enable NVIDIA card during shutdown ====<br />
On some laptops, the NVIDIA card may not correctly initialize during boot if the card was powered off when the system was last shutdown. Therefore the Bumblebee daemon will power on the GPU when stopping the daemon (e.g. on shutdown) due to the (default) setting {{ic|TurnCardOffAtExit&#61;false}} in {{ic|/etc/bumblebee/bumblebee.conf}}. Note that this setting does not influence power state while the daemon is running, so if all {{ic|optirun}} or {{ic|primusrun}} programs have exited, the GPU will still be powered off.<br />
<br />
When you stop the daemon manually, you might want to keep the card powered off, while still having it powered on at shutdown. To achieve the latter, add the following [[systemd]] service (if using {{Pkg|bbswitch}}):<br />
<br />
{{hc|/etc/systemd/system/nvidia-enable.service|2=<br />
[Unit]<br />
Description=Enable NVIDIA card<br />
DefaultDependencies=no<br />
<br />
[Service]<br />
Type=oneshot<br />
ExecStart=/bin/sh -c 'echo ON > /proc/acpi/bbswitch'<br />
<br />
[Install]<br />
WantedBy=shutdown.target<br />
}}<br />
<br />
Then [[enable]] the {{ic|nvidia-enable.service}} unit.<br />
<br />
==== Enable NVIDIA card after waking from suspend ====<br />
The bumblebee daemon may fail to activate the graphics card after suspending. A possible fix involves setting {{Pkg|bbswitch}} as the default method for power management in {{ic|/etc/bumblebee/bumblebee.conf}}:<br />
<br />
{{hc|/etc/bumblebee/bumblebee.conf|2=<br />
[driver-nvidia]<br />
PMMethod=bbswitch<br />
<br />
# ...<br />
<br />
[driver-nouveau]<br />
PMMethod=bbswitch<br />
}}<br />
<br />
{{Note|This fix seems to work only after rebooting the system. Restarting the bumblebee service is not enough.}}<br />
<br />
=== Multiple monitors ===<br />
<br />
==== Outputs wired to the Intel chip ====<br />
<br />
If the port (DisplayPort/HDMI/VGA) is wired to the Intel chip, you can set up multiple monitors with xorg.conf. Set them to use the Intel card, but Bumblebee can still use the NVIDIA card. One example configuration is below for two identical screens with 1080p resolution and using the HDMI out.<br />
<br />
{{hc|/etc/X11/xorg.conf|2=<br />
Section "Screen"<br />
Identifier "Screen0"<br />
Device "intelgpu0"<br />
Monitor "Monitor0"<br />
DefaultDepth 24<br />
Option "TwinView" "0"<br />
SubSection "Display"<br />
Depth 24<br />
Modes "1920x1080_60.00"<br />
EndSubSection<br />
EndSection<br />
<br />
Section "Screen"<br />
Identifier "Screen1"<br />
Device "intelgpu1"<br />
Monitor "Monitor1"<br />
DefaultDepth 24<br />
Option "TwinView" "0"<br />
SubSection "Display"<br />
Depth 24<br />
Modes "1920x1080_60.00"<br />
EndSubSection<br />
EndSection<br />
<br />
Section "Monitor"<br />
Identifier "Monitor0"<br />
Option "Enable" "true"<br />
EndSection<br />
<br />
Section "Monitor"<br />
Identifier "Monitor1"<br />
Option "Enable" "true"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "intelgpu0"<br />
Driver "intel"<br />
Option "XvMC" "true"<br />
Option "UseEvents" "true"<br />
Option "AccelMethod" "UXA"<br />
BusID "PCI:0:2:0"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "intelgpu1"<br />
Driver "intel"<br />
Option "XvMC" "true"<br />
Option "UseEvents" "true"<br />
Option "AccelMethod" "UXA"<br />
BusID "PCI:0:2:0"<br />
EndSection<br />
<br />
Section "Device"<br />
Identifier "nvidiagpu1"<br />
Driver "nvidia"<br />
BusID "PCI:0:1:0"<br />
EndSection<br />
<br />
}}<br />
<br />
You will probably need to change the BusID for both the Intel and the NVIDIA card.<br />
<br />
{{hc|<nowiki>$ lspci | grep VGA</nowiki>|<br />
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)<br />
}}<br />
<br />
Here the BusID is 0:2:0, i.e. {{ic|PCI:0:2:0}} in the Xorg configuration.<br />
<br />
==== Output wired to the NVIDIA chip ====<br />
<br />
On some notebooks, the digital video output (HDMI or DisplayPort) is hardwired to the NVIDIA chip. If you want to use all the displays on such a system simultaneously, you have to run two X servers. The first will be using the Intel driver for the notebook's panel and a display connected via VGA. The second will be started through optirun on the NVIDIA card, and drives the digital display.<br />
<br />
''intel-virtual-output'' is a tool provided in the {{Pkg|xf86-video-intel}} driver set, as of v2.99. Command-line usage is as follows:<br />
<br />
{{hc|$ intel-virtual-output [OPTION]... [TARGET_DISPLAY]...|<br />
-d <source display> source display<br />
-f keep in foreground (do not detach from console and daemonize)<br />
-b start bumblebee<br />
-a connect to all local displays (e.g. :1, :2, etc)<br />
-S disable use of a singleton and launch a fresh intel-virtual-output process<br />
-v all verbose output, implies -f<br />
-V <category> specific verbose output, implies -f<br />
-h this help}}<br />
<br />
If no target displays are given on the command line, ''intel-virtual-output'' will attempt to connect to any local display. The detected displays can then be managed with tools such as ''xrandr'' or KDE's display settings.<br />
<br />
The tool will also start bumblebee (which may be left as default install). See the [https://github.com/Bumblebee-Project/Bumblebee/wiki/Multi-monitor-setup Bumblebee wiki page] for more information.<br />
<br />
{{Note|In {{ic|/etc/bumblebee/xorg.conf.nvidia}} change the lines {{ic|UseEDID}} and {{ic|Option "AutoAddDevices" "false"}} to {{ic|"true"}}, if you are having trouble with device resolution detection. You will also need to comment out the line {{ic|Option "UseDisplayDevices" "none"}} in order to use the display connected to the NVIDIA GPU.}}<br />
<br />
When run in a terminal, it will daemonize itself unless the {{ic|-f}} switch is used. The advantage of using it in foreground mode is that once the external display is disconnected, ''intel-virtual-output'' can then be killed and bumblebee will disable the nvidia chip. Games can be run on the external screen by first exporting the display {{ic|1=export DISPLAY=:8}}, and then running the game with {{ic|optirun ''game_bin''}}, however, cursor and keyboard are not fully captured. Use {{ic|1=export DISPLAY=:0}} to revert back to standard operation.<br />
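<br />
As a worked example of the above (the display numbers are the defaults mentioned in this section; {{ic|''game_bin''}} is a placeholder): run {{ic|intel-virtual-output -f}} in one terminal, then in a second terminal:<br />
<br />
$ export DISPLAY=:8<br />
$ optirun ''game_bin''<br />
$ export DISPLAY=:0<br />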
<br />
== CUDA without Bumblebee==<br />
<br />
You can use CUDA without bumblebee. All you need to do is ensure that the nvidia card is on:<br />
<br />
# tee /proc/acpi/bbswitch <<< ON<br />
<br />
Now, when you start a CUDA application, it will automatically load all the necessary modules.<br />
<br />
To turn off the nvidia card after using CUDA, run:<br />
<br />
# rmmod nvidia_uvm<br />
# rmmod nvidia<br />
# tee /proc/acpi/bbswitch <<< OFF<br />
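<br />
The steps above can be combined into a small wrapper; the following is only a sketch (it assumes {{Pkg|bbswitch}} is loaded and the script is run as root):<br />
<br />
{{bc|<br />
#!/bin/bash<br />
# Power on the card, run the given CUDA program, then power it off again.<br />
tee /proc/acpi/bbswitch <<< ON<br />
"$@"<br />
rmmod nvidia_uvm nvidia<br />
tee /proc/acpi/bbswitch <<< OFF<br />
}}<br />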
<br />
== Troubleshooting ==<br />
<br />
{{Note|Please report bugs at [https://github.com/Bumblebee-Project/Bumblebee Bumblebee-Project]'s GitHub tracker as described in its [https://github.com/Bumblebee-Project/Bumblebee/wiki/Reporting-Issues wiki].}}<br />
<br />
=== [VGL] ERROR: Could not open display :8 ===<br />
<br />
There is a known problem with some wine applications that fork and kill the parent process without keeping track of it (for example, the free-to-play online game "Runes of Magic").<br />
<br />
This is a known problem with VirtualGL. As of bumblebee 3.1, so long as you have it installed, you can use Primus as your render bridge:<br />
<br />
$ optirun -b primus wine ''windows program''.exe<br />
<br />
If this does not work, an alternative workaround for this problem is:<br />
<br />
$ optirun bash<br />
$ optirun wine ''windows program''.exe<br />
<br />
If using NVIDIA drivers, a fix for this problem is to edit {{ic|/etc/bumblebee/xorg.conf.nvidia}} and change the {{ic|ConnectedMonitor}} option to {{ic|CRT-0}}.<br />
<br />
=== Xlib: extension "GLX" missing on display ":0.0" ===<br />
<br />
If you installed the NVIDIA driver from the NVIDIA website, it will not work with Bumblebee.<br />
<br />
1. Uninstall that driver in a similar way:<br />
# ./NVIDIA-Linux-*.run --uninstall<br />
2. Remove the Xorg configuration file generated by NVIDIA:<br />
# rm /etc/X11/xorg.conf<br />
3. (Re)install the correct NVIDIA driver: [[#Installing Bumblebee with Intel/NVIDIA]]<br />
<br />
=== [ERROR]Cannot access secondary GPU: No devices detected ===<br />
<br />
In some instances, running {{ic|optirun}} will return:<br />
<br />
[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.<br />
[ERROR]Aborting because fallback start is disabled.<br />
<br />
In this case, you will need to move the file {{ic|/etc/X11/xorg.conf.d/20-intel.conf}} somewhere else, then [[restart]] the bumblebeed daemon. If you do need to change some options for the Intel module, a workaround is to merge {{ic|/etc/X11/xorg.conf.d/20-intel.conf}} into {{ic|/etc/X11/xorg.conf}}.<br />
<br />
It may also be necessary to comment out the driver line in {{ic|/etc/X11/xorg.conf.d/10-monitor.conf}}.<br />
<br />
If you're using the {{ic|nouveau}} driver you could try switching to the {{ic|nvidia}} driver.<br />
<br />
You might need to define the NVIDIA card somewhere (e.g. in a file under {{ic|/etc/X11/xorg.conf.d}}), using the correct {{ic|BusID}} according to {{ic|lspci}} output:<br />
<br />
{{bc|<br />
Section "Device"<br />
Identifier "nvidiagpu1"<br />
Driver "nvidia"<br />
BusID "PCI:0:1:0"<br />
EndSection<br />
}}<br />
<br />
Note that {{ic|lspci}} reports the bus address in hexadecimal, while Xorg expects decimal values. So if the output of {{ic|lspci}} is, for example, {{ic|0a:00.0}}, the {{ic|BusID}} should be {{ic|PCI:10:0:0}}.<br />
<br />
<br />
==== NVIDIA(0): Failed to assign any connected display devices to X screen 0 ====<br />
<br />
If the console output is:<br />
<br />
[ERROR]Cannot access secondary GPU - error: [XORG] (EE) NVIDIA(0): Failed to assign any connected display devices to X screen 0<br />
[ERROR]Aborting because fallback start is disabled.<br />
<br />
You can change this line in {{ic|/etc/bumblebee/xorg.conf.nvidia}}:<br />
<br />
Option "ConnectedMonitor" "DFP"<br />
<br />
to:<br />
<br />
Option "ConnectedMonitor" "CRT"<br />
<br />
==== Failed to initialize the NVIDIA GPU at PCI:1:0:0 (GPU fallen off the bus / RmInitAdapter failed!) ====<br />
<br />
Add {{ic|1=rcutree.rcu_idle_gp_delay=1}} to the [[kernel parameters]] of the [[bootloader]] configuration (see also the original [https://bbs.archlinux.org/viewtopic.php?id=169742 BBS post] for a configuration example).<br />
<br />
==== Could not load GPU driver ====<br />
<br />
If the console output is:<br />
<br />
[ERROR]Cannot access secondary GPU - error: Could not load GPU driver<br />
<br />
and if you try to load the nvidia module you get:<br />
<br />
modprobe nvidia<br />
modprobe: ERROR: could not insert 'nvidia': Exec format error<br />
<br />
This could be because the nvidia driver is out of sync with the Linux kernel, for example if you installed the latest nvidia driver but have not updated the kernel in a while. A full system update might resolve the issue. If the problem persists, try compiling the nvidia packages against your current kernel, for example with {{Pkg|nvidia-dkms}} or by compiling {{Pkg|nvidia}} from the [[ABS]].<br />
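<br />
A quick way to see such a mismatch (a sketch; the ''vermagic'' field reported by {{ic|modinfo}} names the kernel the module was built for):<br />
<br />
```shell
# Compare the running kernel with the kernel the nvidia module was
# built for. If the two differ, "Exec format error" is expected.
running=$(uname -r)
built_for=$(modinfo -F vermagic nvidia 2>/dev/null | awk '{print $1}')
echo "running kernel:   $running"
echo "module built for: ${built_for:-nvidia module not found}"
```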
<br />
==== NOUVEAU(0): [drm] failed to set drm interface version ====<br />
<br />
Consider switching to the official nvidia driver. As commented [https://github.com/Bumblebee-Project/Bumblebee/issues/438#issuecomment-22005923 here], the nouveau driver has issues with some cards in combination with bumblebee.<br />
<br />
=== /dev/dri/card0: failed to set DRM interface version 1.4: Permission denied ===<br />
<br />
This can be worked around by appending the following lines to {{ic|/etc/bumblebee/xorg.conf.nvidia}} (see [https://github.com/Bumblebee-Project/Bumblebee/issues/580 here]):<br />
{{bc|<br />
Section "Screen"<br />
Identifier "Default Screen"<br />
Device "DiscreteNvidia"<br />
EndSection<br />
}}<br />
<br />
=== ERROR: ld.so: object 'libdlfaker.so' from LD_PRELOAD cannot be preloaded: ignored ===<br />
<br />
You probably want to start a 32-bit application with bumblebee on a 64-bit system. See the "For 32-bit..." section in [[#Installation]]. If the problem persists or if it is a 64-bit application, try using the [[#Primusrun|primus bridge]].<br />
<br />
=== Fatal IO error 11 (Resource temporarily unavailable) on X server ===<br />
<br />
Change {{ic|KeepUnusedXServer}} in {{ic|/etc/bumblebee/bumblebee.conf}} from {{ic|false}} to {{ic|true}}. Your program forks into the background, and bumblebee does not know anything about it.<br />
<br />
=== Video tearing ===<br />
<br />
Video tearing is a somewhat common problem on Bumblebee. To fix it, you need to enable vsync. It should be enabled by default on the Intel card, but verify that in the Xorg logs. To check whether it is enabled for NVIDIA, run:<br />
<br />
$ optirun nvidia-settings -c :8<br />
<br />
{{ic|1=X Server XVideo Settings -> Sync to VBlank}} and {{ic|1=OpenGL Settings -> Sync to VBlank}} should both be enabled. The Intel card generally has less tearing, so use it for video playback. In particular, use VA-API for video decoding (e.g. {{ic|mplayer-vaapi}} with the {{ic|-vsync}} parameter).<br />
<br />
Refer to the [[Intel#Video_tearing|Intel]]{{Broken section link}} article on how to fix tearing on the Intel card.<br />
<br />
If it is still not fixed, try to disable compositing from your desktop environment. Try also disabling triple buffering.<br />
<br />
=== Bumblebee cannot connect to socket ===<br />
<br />
You might get something like:<br />
<br />
$ optirun glxspheres64<br />
or, for 32-bit:<br />
{{hc|$ optirun glxspheres32|<br />
[ 1648.179533] [ERROR]You've no permission to communicate with the Bumblebee daemon. Try adding yourself to the 'bumblebee' group<br />
[ 1648.179628] [ERROR]Could not connect to bumblebee daemon - is it running?<br />
}}<br />
<br />
If you are already in the {{ic|bumblebee}} group ({{ic|<nowiki>$ groups | grep bumblebee</nowiki>}}), you may try [https://bbs.archlinux.org/viewtopic.php?pid=1178729#p1178729 removing the socket] {{ic|/var/run/bumblebeed.socket}}.<br />
<br />
Another reason for this error could be that you have not actually enabled both GPUs in your BIOS, and as a result the Bumblebee daemon is not running. Check the BIOS settings carefully: make sure the integrated Intel graphics (may be abbreviated in the BIOS as something like igfx) is enabled or set to auto, and that it is the primary GPU. Your display should be connected to the integrated graphics, not to the discrete graphics card.<br />
<br />
If you mistakenly had the display connected to the discrete graphics card and Intel graphics was disabled, you probably installed Bumblebee after first trying to run NVIDIA alone. In this case, be sure to remove the {{ic|/etc/X11/xorg.conf}} or {{ic|.../20-nvidia...}} configuration files. If Xorg is instructed to use NVIDIA in a configuration file, X will fail.<br />
<br />
=== Running X.org from console after login (rootless X.org) ===<br />
<br />
See [[Xorg#Rootless Xorg (v1.16)]].<br />
<br />
=== Primusrun mouse delay (disable VSYNC) ===<br />
<br />
For {{ic|primusrun}}, {{ic|VSYNC}} is enabled by default; as a result, it can introduce mouse input lag or even slightly decrease performance. Test {{ic|primusrun}} with {{ic|VSYNC}} disabled:<br />
<br />
$ vblank_mode=0 primusrun glxgears<br />
<br />
If you are satisfied with the above setting, create an [[alias]] (e.g. {{ic|1=alias primusrun="vblank_mode=0 primusrun"}}).<br />
<br />
Performance comparison:<br />
<br />
{| class="wikitable"<br />
! VSYNC enabled !! FPS !! Score !! Min FPS !! Max FPS<br />
|-<br />
| FALSE || 31.5 || 793 || 22.3 || 54.8<br />
|-<br />
| TRUE || 31.4 || 792 || 18.7 || 54.2<br />
|}<br />
''Tested with an [[ASUS N550JV]] notebook and the benchmark app {{AUR|unigine-heaven}}.''<br />
<br />
{{Note|To disable vertical synchronization system-wide, see [[Intel graphics#Disable Vertical Synchronization (VSYNC)]].}}<br />
<br />
=== Primus issues under compositing window managers ===<br />
<br />
Since compositing hurts performance, invoking primus when a compositing WM is active is not recommended.[https://github.com/amonakov/primus#issues-under-compositing-wms]<br />
If you need to use primus with compositing and see flickering or bad performance, synchronizing primus' display thread with the application's rendering thread may help:<br />
<br />
$ PRIMUS_SYNC=1 primusrun ...<br />
<br />
This makes primus display the previously rendered frame.<br />
<br />
=== Problems with bumblebee after resuming from standby ===<br />
<br />
On some systems, the nvidia module may remain loaded after resuming from standby.<br />
The solution is to install the {{Pkg|acpi_call}} and {{Pkg|acpi}} packages.<br />
<br />
=== Optirun doesn't work, no debug output ===<br />
<br />
Users report that in some cases, even though Bumblebee is installed correctly, running<br />
<br />
$ optirun glxgears -info<br />
<br />
gives no output at all, and the glxgears window does not appear. Any program that needs 3D acceleration crashes:<br />
<br />
$ optirun bash<br />
$ glxgears<br />
Segmentation fault (core dumped)<br />
<br />
Apparently this is a bug in some versions of VirtualGL. A workaround is to [[install]] {{Pkg|primus}} and {{Pkg|lib32-primus}} and use them instead:<br />
<br />
$ primusrun glxspheres64<br />
$ optirun -b primus glxspheres64<br />
<br />
By default primus locks the framerate to the refresh rate of your monitor (usually 60 FPS); if needed, it can be unlocked by passing the {{ic|vblank_mode&#61;0}} environment variable.<br />
<br />
$ vblank_mode=0 primusrun glxspheres64<br />
<br />
Usually there is no need to display more frames than your monitor can handle, but you might want to for benchmarking, or to have faster reactions in games (e.g., if a game needs 3 frames to react to a mouse movement, with {{ic|vblank_mode&#61;0}} the reaction will be as quick as your system can handle; without it, it will always take 3/60 = 1/20 of a second).<br />
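<br />
The arithmetic behind that estimate, as a one-liner:<br />
<br />
```shell
# Worst-case reaction time when a game needs 3 frames to respond
# and the display is capped at 60 Hz: 3/60 s = 1/20 s.
awk 'BEGIN { printf "%.2f\n", 3 / 60 }'   # 0.05
```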
<br />
You might want to edit {{ic|/etc/bumblebee/bumblebee.conf}} to use the primus bridge by default. If after an update you want to check whether the bug has been fixed, just use {{ic|optirun -b virtualgl}}.<br />
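<br />
For reference, the relevant key should be {{ic|Bridge}} in the {{ic|[optirun]}} section of {{ic|/etc/bumblebee/bumblebee.conf}} (an excerpt; valid values are {{ic|auto}}, {{ic|virtualgl}} and {{ic|primus}}):<br />
<br />
```ini
[optirun]
# Use the primus bridge by default instead of VirtualGL
Bridge=primus
```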
<br />
See [https://bbs.archlinux.org/viewtopic.php?pid=1643609 this forum post] for more information.<br />
<br />
=== Broken power management with kernel 4.8 ===<br />
{{Out of date|Fixed on nvidia 375.26+}}<br />
If you have a newer laptop (BIOS date 2015 or newer), then Linux 4.8 might break bbswitch ([https://github.com/Bumblebee-Project/bbswitch/issues/140 bbswitch issue 140]) since bbswitch does not support the newer, recommended power management method. As a result, the dGPU may fail to power on, fail to power off or worse.<br />
<br />
As a workaround, add {{ic|1=pcie_port_pm=off}} to your [[Kernel parameters]].<br />
<br />
Alternatively, if you are only interested in power saving (and perhaps use of external monitors), remove bbswitch and rely on [[Nouveau]] runtime power-management (which supports the new method).<br />
<br />
=== Lockup issue (lspci hangs) ===<br />
See [[NVIDIA_Optimus#Lockup_issue_.28lspci_hangs.29]] for an issue that affects new laptops with a GTX 965M (or similar).<br />
<br />
== See also ==<br />
<br />
* [http://www.bumblebee-project.org Bumblebee project repository]<br />
* [http://wiki.bumblebee-project.org/ Bumblebee project wiki]<br />
* [https://github.com/Bumblebee-Project/bbswitch Bumblebee project bbswitch repository]<br />
<br />
Join us at #bumblebee at freenode.net.</div>Vi six