this feels like a silly thing to say but even though i’ve been using linux since 2004, I feel like i’m only recently learning that the impact of the GNU project’s software (and its design decisions) on me is even bigger than I thought

like even just the fact that (afaik) many of them used Emacs has an impact on me today

(please no “it’s GNU/Linux”)

for example I thought the “vim vs emacs” flamewars were silly (who cares? use what you want!)

but actually I feel like some of the GNU software design decisions are really influenced by emacs (readline, info pages) and that does actually have an effect

(please don’t tell me that readline has a vi mode)

(2/?)

also this guidance on command line arguments is great, I didn’t realize these things came from the GNU project and I really appreciate them https://www.gnu.org/prep/standards/html_node/Command_002dLine-Interfaces.html#Command_002dLine-Interfaces

(via @zwol)

(3/?)

Command-Line Interfaces (GNU Coding Standards)

also I didn’t realize that standardizing “--help” came from the GNU project, it makes me wonder if folks have proposed adding --help to programs that predated GNU (or are from a BSD project etc) and if so what that conversation looked like

I imagine it’s not always possible to do without breaking backwards compatibility
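The convention itself is tiny to implement; here's a minimal sketch in shell of the GNU behavior (--help and --version print to stdout and exit successfully), using a made-up `demo` function whose name and usage text are invented for illustration:

```shell
# minimal sketch of the GNU convention: --help and --version
# go to stdout and succeed ("demo" is a hypothetical command)
demo() {
  case "$1" in
    --help)    echo "Usage: demo [OPTION]... [FILE]..."; return 0 ;;
    --version) echo "demo 1.0"; return 0 ;;
  esac
  echo "demo: doing normal work"
}
```

So `demo --help` prints the usage line and succeeds, which is exactly what the GNU standards ask for.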

(4/?)

anyway i’ve been thinking about how to understand the way “the terminal” works, and it feels really important to understand the cultural impact of specific programs or projects (like xterm, the GNU project, etc)

i think it’s something a lot of people are intuitively aware of just from using the terminal and noticing patterns

(5/?)

@b0rk I couldn't agree more.

The more history I've learned about these things the more the way they work makes sense.

@b0rk sometimes i like to imagine the teletype printer that it emulates

@b0rk in '94 I used (sun's) db (dbg?, does anyone remember?) and a friend asked me why I didn't use gdb. It was such an amazingly different experience, the gdb ui seemed to be designed with care---dare I say love?---for an actual human (me!) using it.

I went to read all of gnu.org, the philosophy (empowering the user instead of keeping them ignorant), the coding standards (info instead of elitist manual pages, no arbitrary limits, etc ...) and decided I wanted to be part of this.

The reason some of us prefer to say GNU/Linux is rooted in the idea that even people who have been using "Linux" for decades may not have heard about GNU.

@janneke oh interesting what do you mean when you say man pages are elitist?

@b0rk

https://www.gnu.org/prep/standards/html_node/GNU-Manuals.html#GNU-Manuals

"GNU Manuals
The preferred document format for the GNU system is the Texinfo formatting language. Every GNU package should (ideally) have documentation in Texinfo both for reference and for learners."

Info manuals usually have a philosophy section, an introduction, a tutorial, and describe the relationship of the software with other software. Some manual pages nowadays also give examples, but in the 90s the main feature of a man page, as I experienced it as a newbie, was terseness, with (dare I say an elitist?) disregard for learners like myself.

GNU Manuals (GNU Coding Standards)

@b0rk Just imagine how amazing it would have been if there had been an info manual for Linux and for git?
@janneke @b0rk I think bork is asking a different question: why would info have been better? it’s been a loooooong time since I was a power user of either, but I remember finding info very frustrating. And I was a regular emacs user at the time so I should have known most of the shortcuts! So I’m not clear why it would have been amazing if there were info for git.
@janneke @b0rk (not saying man is particularly good, mind you, I just don’t recall why info was even theoretically any better)
@luis_in_brief @b0rk because info tells a story, has an introduction, and explicitly includes learners as their audience. Whereas man pages (at least in the 90s) attempted to not spend a single character too many?
@janneke I mean I spent 20 years using Linux every day without even realizing that info pages existed (beyond maybe once when I tried to use the `info` viewer and gave up instantly) so it's hard for me to personally relate to the idea that more info pages would have been helpful

@b0rk yeah, the info experience outside of emacs is pretty terrible. I believe there was a short period where GNOME/Yelp would seamlessly present info pages.

Info manuals tell a story, for power users and learners alike. They link between different concepts. They usually have a tutorial. All of that is missing, for example (not wanting to single out one non-gnu project), in an avalanche of manual pages.

If you want to learn about Linux (the kernel), wouldn't it be amazing if there was a manual for that? There is one for the Hurd.

@janneke I'm glad to hear that info pages were helpful to you! I think you're one of the first people I've heard say that and it's interesting to hear what your experience has been like
@janneke it reminds me of how someone told me they went to a free software conference and they were really surprised that almost everyone there was an emacs user, feels like there's a bit of a cultural divide between "free software" culture and the "tech industry" (even though of course a lot of folks in the "tech industry" rely on and care about free software)

@b0rk we have been living in mostly different bubbles.

When I read "We will use Emacs as our editor" and realized that was the one statement I had a problem with (being a happy VI user for some years), it occurred to me that this was probably caused by a problem with my perspective.

I then spent three summers trying to learn Emacs. That's some 25y ago. I can't say whether it was having context-aware info pages just a keystroke away, right next to your program text, or being able to copy stuff without even having to glance at your mouse. Or whether it was the automagical (re-)indentation of code. Or something like the debian-changelog-mode when creating a Debian package.

Together with a friend I went on to create GNU LilyPond, which was often praised for its documentation. I now hear similar praise about our Guix manual.

@janneke that's awesome, the LilyPond website looks really clear!

@b0rk thank you! It's all generated from [tex]info and is thus (mostly) also browsable offline as info pages.

If only the #GNU project would have taken this up, have improved and standardized it. Twenty years ago.

@b0rk of course, especially at the time, the (projected) LilyPond user base included musicians rather than programmers.
@janneke I've been appreciating the fish shell's approach to documentation recently, where `help THING` opens an HTML manual page (stored locally) in a browser.

@b0rk that seems useful (unless maybe when you're running a shell inside Emacs, you prolly don't want a GUI browser to spawn, but that's details).

Which reminds me, I believe there was a proposal to add up, previous, next semantics to html and it got rejected.

One of the amazing things about info (esp in emacs) is that you can read the whole document just by hitting the space bar, for next screen and next section alike. No such thing exists in web browsers today. Similarly you can isearch or regexp-search, go to a node, or jump to index entries. Once you're used to it, using HTML documentation becomes terribly cumbersome in comparison.

@b0rk @janneke I failed to read Info manuals although I consider myself comfortable with Emacs and tried several times over the years. Well, local manuals in HTML format missed me… a lot! Then one day, reading yet another message by some Guix wizard advocating for Info, I decided to try again and forced myself to rely only on Info for two weeks… Today, I don’t look back. 😀

Even, today, I “regret” that Info isn’t the standard for locally browsing any documentation. 😁

@b0rk somewhat aside, a lot of old unix tools would balk at "cmd file -opt" but gnu tools had no issue handling options coming after arguments

i had a friend who used to complain of "gnu bloatware" until he was forced to use (fairly old) solaris at work and he installed the gnu userland tools within minutes of trying to get the older tools to behave in ways he was used to
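That difference comes from GNU getopt's argument permutation: it reorders argv so options are recognized even after operands, while traditional implementations stop at the first non-option. A sketch (assuming GNU coreutils; `POSIXLY_CORRECT` opts back into the traditional rule):

```shell
cd "$(mktemp -d)"   # empty dir, so no file named "-l" exists here

# GNU getopt permutes argv, so options are recognized after operands:
ls /tmp -l >/dev/null && echo "permuted: parsed as 'ls -l /tmp'"

# POSIXLY_CORRECT restores stop-at-first-operand parsing, so "-l"
# becomes a file operand, which (in this empty dir) doesn't exist:
POSIXLY_CORRECT=1 ls /tmp -l >/dev/null 2>&1 || echo "strict: '-l' treated as a file"
```

Old Solaris/BSD userlands behaved like the second case all the time, which is presumably what that friend kept tripping over.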

@tef huh I didn’t know that thanks!
@b0rk @tef Yes, when I started out in EDA (electronic design automation) Solaris was the standard and we all immediately installed the GNU core tools because the Solaris ones sucked.

@b0rk @zwol

Yea; back in "the bad old days" it was common to roll our own command line parameter parsing.

Didn't have the memory for reusable libraries in a 64K byte address machine.

"getopt", with the "--" long option names was a significant improvement. And "--" by itself to *STOP* option processing at that point, forcing all other "words" to be file names.
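That `--` convention is still load-bearing today, e.g. for file names that happen to look like options (a sketch that should work with any POSIX-style userland):

```shell
cd "$(mktemp -d)"
touch -- '-n'    # without "--", touch would try to parse "-n" as an option
ls               # the directory now contains a file named "-n"
rm -- '-n'       # "--" again: "-n" is forced to be a file name
```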

@b0rk I didn’t realize this is what was going on, but I am always secretly pleased when entering text and can use emacs shortcuts
@b0rk there'd be no open source movement without the FSF and no open source Unix software without gcc and glibc. Or if there were, it'd be very very different. I think the community has outgrown the FSF now but I have great respect for what they started.
@nelson ah yeah tried to clarify that I mean the software and design decisions that were made there, not the cultural impact

@b0rk @nelson The GNU project was also an early champion of portability. They *wanted* their software to work everywhere to show that their stuff was *good*. So they built it that way. That had an outsized impact on how people thought about software portability.

(Fun fact, glibc is the only C standard library implementation that worked on multiple operating systems. Nobody else ever did that before and I haven't seen such a thing since.)

@Conan_Kudo It was more out of necessity; no free operating system existed back in the early 90s.

The GNU C library has only worked with the GNU system and variants (GNU/Linux, and GNU/kFreeBSD). I'm not aware of any other system where glibc was supported.

@b0rk @nelson

@b0rk are we talking about the cultural impact? Or are we talking about just the amount of GNU project software (even if it's not called GNU $SOFTWARE). I mean I was surprised to see that Orca is not "GNU Orca" and it's instead a part of the GNOME project https://help.gnome.org/users/orca/stable/index.html.en
Orca Screen Reader

@b0rk and then of course there is the reverse of "this uses the gnu name but isn't actually part of gnu anymore?" like https://www.gnutls.org/
GnuTLS

@puppygirlhornypost2 GNUTLS is part of the GNU project, and maintained by Simon Josefsson.

@b0rk

@amszmidt @b0rk weird I thought there was a falling out
@puppygirlhornypost2 There was. But water under the bridge….
@puppygirlhornypost2 i mean the software
@b0rk fair fair. the amount of software that's either directly from the GNU project but doesn't carry the GNU name and the amount of software built upon things like glib and other gnu libraries is almost unfathomable. I don't think we'd quite be where we are without GNU (and whether that's a positive or a negative is left as an exercise to the reader)

@b0rk yeah, i get irritated and some of their software is now crusty, but they got a snowball rolling that turned out to be bigger than i could have imagined.

i also remember seeing something about Wikipedia early (possibly Nupedia?) and thinking it was a foolish project--wouldn't get the mass or the accuracy.

for both projects i was extremely wrong, and the details of how the 'impossible' happened are fascinating!

@rf @b0rk I remember Nupedia well. I tried to help edit a few entries. 1999-2000 Eric would not have believed how things turned out.

In a way though, the evidence was already there with Linux and GNU that it was possible. Even though I had dabbled with both I didn't make the connection in the slightest that it could be applied to knowledge, too.

@b0rk it is absurd that this man just one day said "fuck it, we ball" and slaughtered AT&T's bazillion dollar OS business

that's like deciding to fight god

@b0rk people mock the "GNU Coding Standard" because its specification of how to indent C is legitimately weird. But that's actually the least important part of the document. There is stuff in there about how programs should *behave* that's been much more influential.

https://www.gnu.org/prep/standards/html_node/Program-Behavior.html

Program Behavior (GNU Coding Standards)

@zwol ooh I should look at this thanks

@zwol “Please define long-named options that are equivalent to the single-letter Unix-style options. We hope to make GNU more user friendly this way.”

i had no idea this was the goal and it’s so good

@b0rk @zwol This is defining behavior from GNU. It doesn't exist in CP/M or DOS. It doesn't exist in Unix. This "GNU-ism" is so useful that people straight up stopped using POSIX `getopt(3)` over it.
@Conan_Kudo @b0rk @zwol CP/M and DOS were shitty Unix ripoffs and their options syntax was terrible
@bascule @Conan_Kudo @b0rk I always thought CP/M's command line interface was ripping off VMS, or possibly whatever DEC's OS before VMS was

@Conan_Kudo @b0rk @zwol Eh? There's definitely still a bunch of getopt(3) usage, even though getopt_long(3) is in most Unixes' libc.

And when it comes to long option implementations (which, as the dd command shows with its IBM JCL-inspired syntax, aren't just a GNU thing), there's often different behavior, for example accepting = as a separator or not, as well as accepting abbreviated names or not.

Which can get a bit funky in codebases like git, for example, where long options are rather prevalent but its subcommands are implemented in different languages.
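On the `=` point: GNU getopt_long accepts both `=` and whitespace between a long option and its argument, while some other implementations only take one form. A quick sketch with GNU mkdir (assuming GNU coreutils):

```shell
d=$(mktemp -d)
# both spellings are accepted by GNU long-option parsing:
mkdir --mode=700 "$d/a"
mkdir --mode 700 "$d/b"
ls "$d"   # shows: a  b
```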

@Conan_Kudo I recently started making a makepkg clone-ish and at the start I thought "Imma try to stick to pure POSIX 2024 (+ cURL and bsdtar)" and not only was the shell syntax very limiting (specifically arrays, but I can work around that with HEREDOCs and newlines) but what made me decide "nope, Imma stick to "bash and GNU coreutils" is that all the tools only define the short options. It makes the scripts so unreadable :(

on a different note: I recently learned that GNU getopt allows abbreviating all long options as long as they don't clash (e.g. specifying --dir instead of --directory)
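A sketch of that abbreviation matching with GNU touch (assuming GNU coreutils): `--no-c` is an unambiguous prefix of `--no-create`, since the only other long option starting with `no-` is `--no-dereference`:

```shell
f=$(mktemp -u)       # a path that is NOT created (dry-run name)
touch --no-c "$f"    # --no-c unambiguously abbreviates --no-create
[ ! -e "$f" ] && echo "file was not created"
```

With an ambiguous prefix (say, `ls --al`, which could be `--all` or `--almost-all`) GNU getopt_long instead reports the option as ambiguous and fails.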

@NekkoDroid Yep, there are so many nice things about the GNU stuff that people just don't realize are from GNU.
@NekkoDroid @Conan_Kudo Heh, the arrays are from the Korn Shell so they're actually pretty portable (like even *BSD, illumos, … default-install portable), and they're one of the things someone else and I were wondering about getting into POSIX just a couple of weeks ago.
@b0rk @zwol i wish it would be clear that long option processing should be internationalized to also be friendly to novice non-English-speakers.

@gugurumbe @b0rk It says something about programmer brain that no one has ever seriously explored that possibility as far as I know, even though GNU and many others did do a lot of work on translating the *output* of programs.

Specifically, I think people become accustomed to thinking of programming languages as independent from natural language, even though almost all of them draw heavily on prior understanding of English. To the experienced, command line options are tokens of the scripting language "shell" and thus it maybe doesn't occur to them that they are also words in English that could usefully be translated.

It *would* be good to change that but it may well require a rethink of how the whole "shell" environment works ... which would be a good idea *anyway* of course...

@b0rk @zwol @uliwitness bring back MPW Commando 🙂
@jimluther @b0rk @zwol @uliwitness and Think Reference. There was something so good about the simplicity of the docs at that point as well as the immediate reaction of the program.
@jimluther @b0rk @zwol @uliwitness options on man pages should be selectable and executable
@zwol @b0rk the “no arbitrary limits” stance is harmful in today’s networked environments though…