RT @garybernhardt
1960s: "COBOL will let non-programmers make the software!"

1980s: "4GLs will let non-programmers make the software!"

2000s: "UML will let non-programmers make the software!"

2020s: "AI will let non-programmers make the software!"

@gasproni @garybernhardt A moment of zen: Non-programmers made all of the software all along.
@dabeaz @gasproni @garybernhardt almost my thought too, but the other way around. All these people are programmers.
@gerbrand @dabeaz @gasproni @garybernhardt exactly! And it's easy to forget there are *way more* non-professional programmers than there are professional developers... https://ciberlandia.pt/@villares/109885982178235703
Alexandre B A Villares 🐍 (@[email protected])

Friendly reminder: It's easy to forget there are many more people in the world that write code and are *not developers* than there are professional software developers. Scientists, journalists, lawyers, activists, med. doctors, writers, educators, artists, designers, and many others, can create computer programs! This paper from 2012 shows a 2005 estimate (for 2012) that there are almost 4x more people programming ("end user programmers") than professional programmers https://dl.acm.org/doi/10.1145/2212776.2212421 #EndUserProgramming #NotADeveloper

@gasproni @garybernhardt Yep. Talking of which, where are we on the thick client-thin client merry-go-round? I’ve lost track. 🤔🙂
@thirstybear @garybernhardt I think we are. Just look at graphql. Developers are creating thick clients for the web.
@gasproni @thirstybear @garybernhardt SPAs are thick clients, in so many ways.
@gasproni @garybernhardt It turns out that structured thinking is a trained skill. Who could have ever guessed.
@hrefna @gasproni @garybernhardt Enough MBAs still think that you can train any monkey to design software. 2 weeks of a coding bootcamp, and you are good to go.

@hrefna @gasproni @garybernhardt
Cough.

Coding != Designing

That's like comparing the work of a builder with the work of an architect.

Side note: experience shows that you still cannot get everyone to code; it requires a certain minimum of abstract-thinking ability.

Just as being able to create a nice colourful graph with Excel does not make you a data scientist.

@aizuchi @gasproni @garybernhardt Literally my career: picking up the pieces when non-programmers tried to code in a specific 4GL I won't mention here.

Sometimes they swore that they were, in fact, programmers. "It's easy! It's just like COBOL!" (It isn't easy, and it isn't like COBOL.)

@fishidwardrobe @aizuchi @gasproni @garybernhardt The most terrifying thing I ever saw was the code generator they gave my father in the late 80s, to input specs and create COBOL. He was an old electro-mechanical POTS exchange guy.

Yes, it produced code, but the code was unreadable, slow, and frequently wrong.

@gasproni @garybernhardt The Dutch tax department went big on 4GL (mostly Cool:Gen I think) and now they have trouble finding people who want to work with that stuff. It's gotten so bad that there can't be any new tax legislation for a few years, because they haven't caught up yet with existing legislation!

@heinragas @gasproni @garybernhardt Yes, it was Cool:Gen. I remember applying there for a job ages ago, and the job description mentioned Cool:Gen, but at the actual interview they asked me "are you prepared to learn Cobol? Because that's what we're going to train you in."

I wasn't.

And I guess they're still transitioning.

@halla When I started at Cap Gemini in 1998, every starter got a course in Cobol. Everybody was still in the throes of Y2k, and I heard tales of people getting reams of Cobol code on their desk, and a marker to mark everything that looked like a date. Someone who knew their stuff would then take a look to see if anything had to be done...

I wonder how much untenable legacy we are making today. Can't be nothing.

@heinragas I started with Oracle PL/SQL and Oracle Forms 3 in 1994, so I'm of the 3GL generation. It was a bit hellish, to be honest. If a PL/SQL routine got bigger than 64k (or 16k, I might misremember), you needed to add enough spaces so the code didn't get split between pages at the wrong place.
@halla @heinragas At least with Forms 3 you could crack it open with vi and fix it. (We could also use source control on it.) When Forms 4/4.5 came along, it was slow, required god's own PC, and the output was a binary, so we couldn't use SCCS anymore.

@gasproni @garybernhardt Somehow I managed to miss all of these trendy fads, maybe because I realized you still have to have programmers who can understand and maintain other people's code, and it's a lot easier to do when it was written by trained professionals to begin with.

@gasproni @garybernhardt Nobody said that about UML. There was a fad for generating program structure from UML tools, and vice versa, in the 90s (I wrote such a tool), but that was always a programmer's shortcut, never a replacement for programmers. And it turned out to not actually be a shortcut, which is why "round trip engineering" and "model driven architecture" are not a thing anymore.

#UML

@sue
@gasproni @garybernhardt
I remember a lot of hype around Rational Rose in the 90s - it came very close to "you won't need developers any more". Or at least "you won't need as many developers".
@gasproni @garybernhardt It is always the argument, but I am still in a job.
@gasproni @garybernhardt ... and the moment that non-programmers started creating code (in whatever manner), they became programmers.
@gasproni @garybernhardt …maybe it’s the non-programmers that’s the problem?
@aethylred @gasproni @garybernhardt maybe it's the MBAs, who set out to replace paid skilled labor with machines they can own, who are the problem.

@gasproni @garybernhardt The cognitive dissonance from thinking being a trade really messes with people, and they're determined to fix it by any means that doesn't involve updating their priors.

(People should experience trades more in youth.)

@gasproni @garybernhardt Ultimately, the reason software will never be made by non-programmers is that "a person who makes software" is pretty much the definition of "programmer". If new tech makes software creation accessible to you? Congratulations, you're a programmer now.
@gasproni @garybernhardt 1990s: "PHP will let non-programmers make the software!"
@gasproni @garybernhardt Back in the 1980s there was a program marketed called TLO, short for The Last One, because the company claimed it would be the last software you would ever buy: it would allow you to write all your own applications. I was dubious about it; it seemed to be a BASIC-like language and didn't address dealing with future developments in software. I last heard of it as an undergrad, when I overheard someone going "Remember TLO?" in a derisive tone.
@gasproni @garybernhardt @Citizenkahn We also should consider Grady Booch's correction of this tweet. He said: "2000s: 'UML will let non-programmers make the software!' I never claimed that. Not once. And I opposed the push some made to make the UML a programming language. The UML was a language for visualizing and reasoning about the design of a software-intensive system."
@sashag @gasproni @garybernhardt I imagine managers, directors, and VPs have always been looking for ways to cut us out. It's complex and takes folks with specialized experience to do the work. Salaries are what they are because the market for coders has a supply shortage. Hope that's the case for at least 10 more years.

@gasproni I wasn’t alive for the COBOL hype. I finished college in time for the UML hype. I do remember that.

Honestly, nobody I’ve met in the industry really believes that AI will be much more than an advanced auto correct once all this has played out.

@gasproni @garybernhardt I think one major misconception here is that people assume typing in code is the real challenge in programming. However, the real issue is to come up with a model of the solution for a vaguely specified process or feature.

That said, AI might be helpful in creating the code faster after the real work has been done.

@gasproni @garybernhardt

you'd think, over the course of 60 years, if non-programmers wanted to make the software, they'd have just learned programming.

@ladyparabellum @gasproni @garybernhardt The problem was always 'we don't want to pay for programmers'

@gasproni @garybernhardt
I had that conversation in 1985: the sharpest guy in my CompSci degree told me solemnly that we had to get into analysis work, because coding would be killed by 4GLs in a few years.

Later, I did learn SQL and used it heavily to avoid procedural coding, if that counts. And indeed, *basic* SQL was teachable, in courses, to draftsmen and to engineers who hated programming.

But big, long SQL statements with multiple tables and filters and subordinate SELECTs, nope. Programming.
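That dividing line is easy to see in a toy sketch. Below, using Python's built-in sqlite3 (table names and data invented for illustration), the first query is the "basic" kind; the second, with a join, an aggregate, and a subordinate SELECT, is the kind the thread calls programming:

```python
import sqlite3

# In-memory toy database; table and column names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE parts (id INTEGER, name TEXT, price REAL);
CREATE TABLE orders (part_id INTEGER, qty INTEGER);
INSERT INTO parts VALUES (1, 'bolt', 0.10), (2, 'beam', 25.0), (3, 'plate', 5.0);
INSERT INTO orders VALUES (1, 100), (2, 4), (2, 2), (3, 10);
""")

# "Basic" SQL, teachable to non-programmers: one table, one filter.
cheap = conn.execute("SELECT name FROM parts WHERE price < 1").fetchall()

# The kind that is programming: join + aggregate + subordinate SELECT.
big = conn.execute("""
SELECT p.name, SUM(o.qty * p.price) AS total
FROM parts p JOIN orders o ON o.part_id = p.id
GROUP BY p.name
HAVING total > (SELECT AVG(price) FROM parts)
ORDER BY total DESC
""").fetchall()

print(cheap)  # [('bolt',)]
print(big)    # [('beam', 150.0), ('plate', 50.0)]
```

The second statement forces you to reason about join keys, grouping, and evaluation order, which is exactly the structured thinking discussed upthread.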

@gasproni @garybernhardt

What DID avoid tons of coding was spreadsheets. Arguably, they ARE "functional programming", where tasks that would absolutely need hundreds of lines of code could be replaced by functions, sorts, "lookup" functions, etc - that did simple database stuff.

It isn't that these things didn't work, they did. They just weren't enough - every programmer knows "Feature creep": the customer always wants more.

They always wanted more than a no-code tool could give.
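The spreadsheet point translates directly: one LOOKUP-style formula per row replaces an explicit loop. A rough Python sketch (data and names invented) of what a single VLOOKUP column plus a SUM cell does:

```python
# What one spreadsheet LOOKUP column does, written out as code.
# Data and names are invented for illustration.

price_table = {"beam": 25.0, "bolt": 0.50, "plate": 5.0}  # the lookup range
orders = [("beam", 4), ("bolt", 100), ("plate", 10)]      # the input rows

# One "formula" per row: =qty * VLOOKUP(item, price_table, 2)
line_totals = [qty * price_table[item] for item, qty in orders]

# A SUM() cell over that column.
grand_total = sum(line_totals)

print(line_totals)  # [100.0, 50.0, 50.0]
print(grand_total)  # 200.0
```

No loops, no state, no control flow: the user writes one expression per cell and the grid does the rest, which is why "functional programming" is a fair description.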

@gasproni @garybernhardt
When a fellow engineer asks what she should learn to be able to do something useful with programming, I don't recommend JavaScript or Python, which need "programming environments" to work in.

I recommend (gag) Visual Basic for Excel. They already live in Excel, and being able to supercharge its functional programming with even LITTLE procedural macros, a few dozen lines long, can make a huge difference to its power and avoid having to go to a Real Programmer.

@gasproni @garybernhardt one of the more successful examples of "making programming accessible" was Visual Basic in the 90s. It was so widely adopted by the early 2000s that when MS designed .NET, the rule was "If you can't do it in VB, you can't do it!"
@gasproni @garybernhardt The Wahl home clippers let non-stylists cut their own hair. The results are similar.
@gasproni @garybernhardt @rubenerd I’ve been saying this since forever. Every technology introduced to “allow non-programmers to make software” just creates a new breed of programmer.
@johncarneyau @gasproni @garybernhardt @rubenerd Well, we do have 100x the programmers of the 1970s, right?

@gasproni @cstross @garybernhardt So I admit to being a cranky old Mac guy, but I genuinely think Hypercard was probably the single most successful product ever at letting “non-programmers make the software.”

https://thehistoryofcomputing.net/bill-atkinsons-hypercard

The History of Computing: Bill Atkinson's HyperCard


@wubfur @gasproni @garybernhardt I agree. Unfortunately Hypercard missed the boat when networked hypertext came along (it badly needed a URL schema and handler). LiveCode is still out there and was open source for a while, but has gone subscriptionware as a cross-platform development system. It's a pale shadow of the former ecosystem.

https://livecode.com

@wubfur @gasproni @cstross @garybernhardt easily one of the least controversial statements I've ever read

Innumerable gamedevs and creative coders got their start on HyperCard

@shi @gasproni @cstross @garybernhardt > Innumerable gamedevs and creative coders got their start on HyperCard

At which point they ceased to be non-programmers.

@gasproni @garybernhardt in 1981 I saw an article in UK mag about a software product called The Last One, supposedly a super 4GL that meant no one would ever have to program again. I have not heard of it since.
@gasproni @garybernhardt
All dates: ... and they will not test it properly.
@gasproni @garybernhardt The love for a silver bullet is the weakness of them all. These are all cases of trying to address the wrong problem: writing software is not in itself hard, figuring out the good design is. As we increase capability, the problems we aim to solve become harder, so the tasks stay just as challenging. Because of that development, we can do new things, and we do them at the border of our ability. Thinking anything else is naive, wishful thinking and/or hubris.
@gasproni @garybernhardt
Well, my association is LabVIEW. But you will never create software if you don't [try to] understand what happens inside the PC.
@gasproni @garybernhardt And without hype, since... uh... 1990s?, Excel has been allowing non-programmers to make software. Without even knowing that they're doing it. And with similar results to those other examples.

@gasproni @nazgul

Sir: If you'd seen some of the code I deal with at work, I think you'd have to concede that software is often made by non-programmers.

@xvf17 @gasproni The last code I had to deal with at M* was unformatted pasted routines (inheritance by copy and paste) with no organization and no comments. So I feel that.