User talk:Lahwaacz

From ArchWiki

bot AUR to Official Repository edit

A recent bot edit (update Pkg/AUR templates) on the Gitolite page correctly changed the AUR template to Pkg but left the Arch User Repository link unchanged.

I fixed this, but would it be possible to modify the bot to take this into consideration?

I can imagine that blanket-changing AUR links to official repository links on any given page could be dangerous, but for common phrasing, or possibly by word distance, it would seem to be relatively safe.

Or is there some sort of post-run manual inspection that I am unaware of that handles this situation?

Specifically, this edit:


{{AUR|gitolite}} is available in the [[Arch User Repository]]


{{Pkg|gitolite}} is available in the [[Arch User Repository]]

(talk) 01:50, 1 April 2015 (UTC)

By "word distance" above what I _meant_ was Edit Distance ;)

I was initially thinking of the Hamming distance, but apparently that only applies to strings of equal length.

What looks more promising is the Levenshtein distance, specifically "Comparing a list of strings" from the Python Distance package.

Example shamelessly ripped from that page (mainly because I couldn't link directly to the relevant section):

>>> import distance
>>> sent1 = ['the', 'quick', 'brown', 'fox', 'jumps', 'over', 'the', 'lazy', 'dog']
>>> sent2 = ['the', 'lazy', 'fox', 'jumps', 'over', 'the', 'crazy', 'dog']
>>> distance.levenshtein(sent1, sent2)
3

(talk) 04:07, 1 April 2015 (UTC)

The bot currently does not touch the surrounding text at all; it only modifies the package templates or appends Template:Broken package link when the package is not found. This is obviously not perfect, and this behaviour may lead to some incorrect combinations as you noticed, but blindly fixing the package links and not the surrounding text is still considered an improvement. Checking the surrounding text manually would require a lot of manpower, which we don't have, so it is currently not done systematically. Feel free to ask for further details or see the most recent discussion: ArchWiki:Requests#Strategy_for_updating_package_templates.
Regarding automatic updates of the surrounding text: the edit distance gives a clue about whether a given edit should be performed, but it does not define how an edit should be performed. It can be useful where there are multiple feasible substitutions in the text and the strategy is to select the optimal substitution, e.g. by minimizing the Levenshtein distance. But we don't have any algorithm to generate feasible substitutions yet, so this technique fails. The surrounding-text substitution is also very context-sensitive, and wiki bots must be designed to minimize (ideally avoid completely) errors of the first kind, which in this case means modifying correct text to be incorrect. This makes defining general rules for the text substitution really hard; on the other hand, many rules would be necessary to cover even the basic forms of standard wording, so in the end both ways may be comparably hard. Anyway, if you have some ideas, I'm all ears :)
-- Lahwaacz (talk) 17:51, 1 April 2015 (UTC)
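To make the selection step discussed above concrete, here is a minimal sketch: given several candidate substitutions (the phrasings below are invented for illustration), pick the one closest to the original wording by token-level Levenshtein distance. A small hand-rolled implementation stands in for the `distance` package mentioned earlier; generating the candidates in the first place is exactly the missing piece noted in the reply.

```python
# Sketch only: selecting among feasible substitutions by minimizing the
# token-level Levenshtein distance. Candidate generation is not solved here.

def levenshtein(a, b):
    """Levenshtein distance between two token lists, via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (x != y)))   # substitution
        prev = cur
    return prev[-1]

# Hypothetical original sentence and candidate rewrites:
original = "is available in the Arch User Repository".split()
candidates = [
    "is available in the official repositories".split(),
    "can be installed from the official repositories".split(),
]

# Choose the candidate that disturbs the original wording the least.
best = min(candidates, key=lambda c: levenshtein(original, c))
print(" ".join(best))  # → is available in the official repositories
```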

bot checking links after move

Hi, re Talk:Touchpad Synaptics#adding libinput alternative. Touchpad Synaptics has 100+ backlinks, and updating the more important ones by hand is a bit tedious. I was just glancing over your clever GitHub bot scripts. It would be handy to have a script for after such moves: walk over the backlinks of Touchpad Synaptics and just replace "[[Touchpad Synaptics" with "[[Synaptics" in the links. That would leave all links to subsections intact. Leaving out the translations to handle manually, there would not be much to go wrong, would there? --Indigo (talk) 07:36, 26 September 2015 (UTC)

Hi, thanks for the suggestion. It would indeed be handy in this case, but most likely not generally. Imagine that there was a UUID page which was later generalized and renamed to Persistent block device naming, so the content about UUID is now only a section on that page. In this case the naive replacement would likely change the meaning of many sentences, and using shorter redirects for convenience is actually encouraged. There would have to be a list of whitelisted "harmless" replacements, which could even help to replace [[pacman|Install]] with [[Install]] etc. -- Lahwaacz (talk) 08:01, 26 September 2015 (UTC)
Yes, good examples, but you are already thinking universally :) I did not mean it to be that. For example, take the time when the bulk of the title-case moves were done. With such a script one could also avoid a lot of internal redirects. E.g. [1]. But it's ok, just an idea. Please close this if you think it applies only to singular cases with a simple enough replacement. --Indigo (talk) 10:02, 26 September 2015 (UTC)
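The whitelist idea from this thread could look something like the following minimal sketch. The entries are just the examples mentioned above, and the plain `str.replace` over each backlink's wikitext is a simplification, not the actual bot code. Because only the page-name prefix is replaced, links to subsections survive intact:

```python
# Sketch: apply only explicitly whitelisted "harmless" replacements to the
# wikitext of a backlink. Entries are the examples from the discussion above.
WHITELIST = {
    "[[Touchpad Synaptics": "[[Synaptics",
    "[[pacman|Install]]": "[[Install]]",
}

def apply_whitelist(wikitext):
    """Return wikitext with every whitelisted replacement applied."""
    for old, new in WHITELIST.items():
        wikitext = wikitext.replace(old, new)
    return wikitext

print(apply_whitelist("See [[Touchpad Synaptics#Configuration]] and [[pacman|Install]]."))
# → See [[Synaptics#Configuration]] and [[Install]].
```

A real bot would of course iterate over the backlinks via the MediaWiki API rather than over strings, but the per-page transformation would be this simple.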


Hi Lahwaacz,

It seems that aur-mirror has been down for a while. I'm not sure if this is intentional or not, but if it is, could you remove Template:Aur-mirror from pages? At least where it appears inside a Template:Broken package link, like {{Broken package link|{{aur-mirror|foobar}}}}.

If there is anything I can do to help, let me know.

Thanks! Lonaowna (talk) 14:56, 19 October 2016 (UTC)

Maybe drifting a bit off-topic... but I'm in favor of finally removing any and all packages that are not in AUR4 from the wiki. Users have had over a year to migrate, which is a century by Arch standards. -- Alad (talk) 16:21, 19 October 2016 (UTC)
I agree, especially on pages like List of games (already took care of that), and List of Applications (see Talk:List of applications#AUR3 packages). On other pages, where the non-existing packages are mentioned inline, it requires some more knowledge and effort to remove them. -- Lonaowna (talk) 16:34, 19 October 2016 (UTC)
Hmm... For the moment I just updated the template to point to Github instead. What would be the alternative "hint" without the link? It should still be different from just "package not found". -- Lahwaacz (talk) 18:08, 19 October 2016 (UTC)
The GitHub repository is fine as well. I think we can keep that one while we (carefully) remove/update all broken links. Thanks! Lonaowna (talk) 06:52, 20 October 2016 (UTC)
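Locating the affected templates for review could be done with a simple pattern scan over page wikitext; this is a hypothetical sketch, not the actual bot code:

```python
import re

# Sketch: find Broken package link templates that wrap an aur-mirror link,
# as discussed above, so the affected packages can be reviewed or removed.
PATTERN = re.compile(
    r"\{\{Broken package link\|\{\{aur-mirror\|([^}]+)\}\}\}\}",
    re.IGNORECASE,
)

def find_aur_mirror_links(wikitext):
    """Return the package names wrapped in broken aur-mirror templates."""
    return PATTERN.findall(wikitext)

print(find_aur_mirror_links("Install {{Broken package link|{{aur-mirror|foobar}}}} for X."))
# → ['foobar']
```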

PKGBUILD for AUR additional explanation

Where does it belong to then? Regards, -- wget (talk) 23:30, 23 January 2017 (UTC)

Here. -- Lahwaacz (talk) 23:35, 23 January 2017 (UTC)
Ok. I was actually hesitating between that page and the one I had actually written to. I'll repost to the right location then. Thanks for letting me know. -- wget (talk) 23:37, 23 January 2017 (UTC)
Actually I had to remove it again. I think it would be better if you proposed the example on a talk page; I have no idea what the point was, given the "Even this is not recommended..." sentence. -- Lahwaacz (talk) 11:31, 23 August 2017 (UTC)

Hi Lahwaacz! Thanks for reviewing my edits in the article about Haskell Cabal. Let me address your two accuracy notes:

- About installing the upstream Stack tool: I recommend the upstream installation because it's a single binary and it doesn't depend on any of the haskell-* libraries we have in the Arch repos. We need to avoid those libraries because Cabal doesn't cope well when both dynamic and static libraries (Haskell libraries, not ELF libraries) are present: once a library is added to the global Cabal registry, it never fetches the complementary library of the other linking type. See more in: .

- There is a difference between "using Cabal for static linking" and "using a statically linked Cabal": unfortunately, the way you configure Cabal when building it also determines the linking mode it will use (which matters given my previous point). So Cabal has to be built with the default static linking, and the tool will later build projects as statically linked. That's why we need a custom cabal-install built in the default mode, unlike the one currently shipped in Arch.

See here how cabal-install in the repos is configured to be built with, and to use, dynamic linking:

Maybe it sounds a bit confusing; I'm not sure how familiar you are with the issues we Haskell devs using Arch are having regarding dynamic/static linking. Throughout the article I'm referring to static compilation of *Haskell object code*, which is not the same as a pure static ELF binary depending on no shared library (*.so). It just means not splitting the Haskell object code into multiple shared (.so) libraries: everything goes into a single (static) binary, which may still perfectly well depend on dynamic linking for basic system shared objects (.so again) such as glibc.

Let me know how this sounds, and if you suggest any other improvements I'll add them before settling on the definitive version. Jimenezrick (talk) 20:19, 28 August 2017 (UTC)

OK, I've improved the whole section quite a bit; I hope it's much clearer now, especially for beginners. Jimenezrick (talk) 21:03, 28 August 2017 (UTC)

Thanks for the edits, but there is still a conflict between Haskell#Problems_with_linking and Haskell#Building_statically_linked_packages_with_Cabal_.28without_using_shared_libraries.29: the first section says that cabal-install (from the Arch package) has to be configured for dynamic linking, but the second section says that it is configured to link dynamically by default. I also completely don't understand the note starting with "This section is about..." in the second section. -- Lahwaacz (talk) 10:14, 29 August 2017 (UTC)
Right, regarding your first comment: I made a clarification about the default linking mode of cabal-install. It should be more correct now, as static linking is always the default mode (even if it doesn't work...). I guess the confusion is that cabal-install always tries to link statically by default, but on Arch this is completely broken because the maintainers decided to ship only dynamic libraries. The whole point of the section is how to get your own cabal-install that is able to link statically successfully, which is what most Haskell devs want.
About your second comment regarding the "Note:": static linking in the context of ELF binaries and the shared-object model that Linux uses is one thing; something completely different is that GHC additionally supports placing generated Haskell code into shared libraries (.so). It's popular to compile fully static Haskell binaries without using any .so files, but this guide isn't about that. It only tries to help you get a cabal-install that can compile monolithic Haskell binaries (which still use dynamic ELF linking) with all the Haskell code in a single fat binary, which is the default and most popular way of compiling Haskell code. The problem is that the Arch Haskell maintainers completely broke this way of linking in cabal-install because of the way they ship libraries (see ). So that note clarifies the distinction between the two senses of linking.
Jimenezrick (talk) 11:19, 29 August 2017 (UTC)
Thanks again for your edits, now it looks good to me. I'm not a Haskell developer, but it seems that the Arch developers chose a (relatively) new way which is supported by the compiler, while some other tools like Cabal are not ready for it. So I'm wondering, are there some bug reports to make cabal-install work with static linking on a system with only dynamic libraries?
As for the note, I've tried to improve it based on your replies here. Does it look good to you?
-- Lahwaacz (talk) 18:25, 29 August 2017 (UTC)
Hey Lahwaacz, thanks for that last edit in the "Note" part, it's easier to understand now, and thanks for all the feedback while writing this section, it's much more decent now ;)
About the general situation with Haskell in Arch: people are quite unhappy with the change since the maintainers decided to start doing only dynamic linking, and there have been multiple bug reports / complaints about it:
1 -
2 -
3 -
If you are a mere user of packages from the repos, it's OK; you won't notice anything except maybe those massive dependencies being pulled in. But if you are a dev, it's quite a big step backwards, as Haskell tooling has improved enormously in recent years in terms of reproducible builds and dependency control. As other dev communities (such as the Golang community) have also concluded, dynamic linking usually isn't worth the trouble. So the issue with Cabal is just that it can't handle having both dynamic and static linking on the same system. And the Arch maintainers have imposed on us a decision that isn't well accepted by the devs.
I was actually talking with folks in the #haskell IRC room and other Arch users feel the same, so I'm going to try to engage the Arch Haskell maintainers and see if I can persuade them to change this, as IMHO going back to the usual static linking would be a better trade-off.
Jimenezrick (talk) 19:17, 29 August 2017 (UTC)
OK, thanks for sharing. I hope you can come to a reasonable conclusion for both sides. -- Lahwaacz (talk) 22:09, 29 August 2017 (UTC)

Reversion of pip autoconversion tools.

Can you discuss why you reverted?

Obviously, as the author of PyPI2PKGBUILD, I may not be completely neutral on this topic. However:

As mentioned in, "it is always preferred to use pacman to install software" (which I definitely agree with). However, some important Python packages have been outdated in the Arch repositories for a long time (a major example being IPython, which was flagged five months ago and is quite widely used); moreover, many "smaller" Python packages have rather poor-quality, hand-written PKGBUILDs on the AUR ( is an example of a PKGBUILD which can mispackage if the user has a ~/.config/pip/pip.conf set). Thus, for packages that are not available in the official repos (or severely outdated there), I consider it better to autogenerate the PKGBUILD than to have the AUR flooded with low-quality PKGBUILDs.

Note that I specifically suggested that such autogenerated PKGBUILDs should be for personal use (I would consider uploading them to the AUR to be in complete contradiction of the philosophy explained above).

Anntzer (talk) 09:05, 24 September 2017 (UTC)

The Python package guidelines page is about writing PKGBUILDs manually, not about generating them. I doubt that the "for personal use" clause would prevent people from sharing the generated PKGBUILDs...
I don't see how outdated or poorly written PKGBUILDs are relevant here.
-- Lahwaacz (talk) 09:17, 24 September 2017 (UTC)
I can replace "for personal use" by a more explicit "Do not upload such PKGBUILDs to the AUR" if you prefer...
Outdated PKGBUILDs are relevant because one of the main selling points of Arch is exactly the ability to keep everything up to date. When a package as important as IPython is not updated for five months, this selling point simply does not apply anymore. Obviously it is easy for each user to fetch the IPython PKGBUILD, edit it manually and build a package for a more recent version, but it seems clear that doing this in an automatic fashion is preferable.
Bad-quality PKGBUILDs are relevant because they are just... highly annoying? I don't see why anyone would be happy with bad PKGBUILDs floating around the AUR (I know it is the responsibility of each user to check PKGBUILDs from the AUR, but certainly it is better to decrease the background noise?). Moreover, *even* if all packages on the AUR were of high quality, they would simply duplicate the information available on PyPI, but with an unnecessary delay. In fact, IMO, most PyPI packages should be autogenerated rather than obtained either from the Arch repos or from the AUR; the only exceptions being those that require a lengthy compilation step (and in that case the AUR does not really help), have non-detectable non-Python dependencies, or have incorrect metadata on PyPI that prevents their correct autopackaging.
Anntzer (talk) 11:02, 24 September 2017 (UTC)
So automatic generation is very important to you. Now, how is it relevant for the description of packaging guidelines, i.e. the standards that even the generators should conform to? -- Lahwaacz (talk) 11:16, 24 September 2017 (UTC)
Should we not all strive to improve the quality and up-to-date-ness of packages that Arch users can install? I do not consider this to be a solely personal issue; I also consider that in the vast majority of the cases, an autogenerated PKGBUILD is of better quality than most hand-written ones. In a sense, I would consider the autogenerated PKGBUILD to be the lowest bar for a PKGBUILD -- if you cannot write a better one then don't bother writing it yourself. And in case you are wondering, PyPI2PKGBUILD does follow the guidelines (I would consider anything else a bug) -- in fact I contributed some of them!...
If you consider that autogeneration tools are out of scope for this page, would you also strike out go-makepkg from the Go package guidelines, cblrepo from the Haskell package guidelines, and pacgem and gem2arch from the Ruby Gem package guidelines? Anntzer (talk) 11:42, 24 September 2017 (UTC)
Yes, I didn't like them either so I moved them to Creating_packages#PKGBUILD_generators. Do you like it enough? -- Lahwaacz (talk) 12:17, 24 September 2017 (UTC)
I would actually have written "Autogenerated PKGBUILDs should not be submitted to the AUR" (regardless of their quality), but heh. Anntzer (talk) 19:09, 24 September 2017 (UTC)
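For illustration, here is a much-simplified sketch of what a generator in the spirit of PyPI2PKGBUILD does: render a PKGBUILD skeleton from package metadata. The dict stands in for a PyPI JSON API response, the package name and URLs are invented, and the real tool of course handles dependencies, licenses, wheels and more; none of this is its actual code.

```python
# Hypothetical sketch of a PKGBUILD generator. The metadata dict below is
# hand-written for illustration; a real tool would fetch it from PyPI.
PKGBUILD_TEMPLATE = """\
pkgname=python-{name}
pkgver={version}
pkgrel=1
pkgdesc="{summary}"
arch=('any')
url="{url}"
depends=('python')
makedepends=('python-setuptools')
source=("{source}")

package() {{
  cd "$srcdir/{name}-{version}"
  python setup.py install --root="$pkgdir" --optimize=1
}}
"""

def generate_pkgbuild(meta):
    """Fill the PKGBUILD template from a metadata dict."""
    return PKGBUILD_TEMPLATE.format(**meta)

example = {
    "name": "somepkg",  # invented package name
    "version": "1.0",
    "summary": "An example package",
    "url": "https://example.org",
    "source": "https://example.org/somepkg-1.0.tar.gz",
}
pkgbuild = generate_pkgbuild(example)
print(pkgbuild)
```

As discussed above, such output would be for local use with makepkg, not for uploading to the AUR.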

About installation of "xf86-video-intel" package.

I did not have the xf86-video-intel package installed, yet I had intel_backlight. MrHritik (talk) 20:22, 27 September 2017 (UTC)

I'm pretty sure that the various backlight files in /sys are managed by the kernel, and are not dependent on which drivers you install, so that makes sense. I was wondering, however, whether the Driver parameter was actually needed in the example - that might be a cleaner fix than the note? -- Pypi (talk) 19:06, 27 September 2017 (UTC)
I can confirm that removing xf86-video-intel and the line "Driver intel" generates the error "No outputs have backlight property". MrHritik (talk) 20:22, 27 September 2017 (UTC)
OK, reverted again. -- Lahwaacz (talk) 06:34, 29 September 2017 (UTC)


Hi, about your revert: you can also use mkosi to create a container/directory tree (-t directory), so it can do the same and more. -- Nudin (talk) 11:33, 1 October 2017 (UTC)

Alright, how is the "more" relevant to systemd-nspawn though? -- Lahwaacz (talk) 17:30, 3 October 2017 (UTC)
Hi, mkosi lets you create images (or directory trees) of various distributions and allows you to do things like setting the root password or installing additional packages. systemd-nspawn allows you to boot such images/directory trees. So I thought mentioning mkosi as an alternative to manually creating a container with pacstrap or debootstrap would be worth it. -- Nudin (talk) 22:23, 5 October 2017 (UTC)