@gilesgoat
#1 reason: very weak typing without runtime checking
C++ is worse than C! By being an almost-superset of C, it has all of the deficiencies of C, and adds many new ones of its own!
I will, however, admit that modern C++ does make it more practical to avoid the C pitfalls. Unfortunately it can't actually prevent them. The onus is on the programmer to know what to avoid. That's a crock.
@gilesgoat
C is a good language for a very limited problem domain. The problem is that it gets used for nearly everything, and mostly far outside that limited domain.
C combines some of the power of assembly language with almost all of the danger of assembly language.
@brouhaha @gilesgoat Oh, OK.
One might characterise C as "BCPL, but with the ability to address bytes not just words". From which point of view a microcontroller (where you're counting every single byte of ROM) is indeed a valid use case 🤣
@brouhaha @gilesgoat The customer or employer always gets what they want.
Unless it's FORTRAN, which I removed from my CV so that agents would stop phoning me about FORTRAN gigs, or Perl, which I always make clear at interview time that I will absolutely refuse to attempt to read, let alone write.
@shelldozer @brouhaha @gilesgoat I follow the Bellman's "what I tell you three times is true".
If an employer/customer asks me to do something stupid I tell them it's stupid. If they say do it anyway I tell them it's stupid again. If they ask a third time then I shut up and do it (unless to do so would break any of the codes of conduct I'm signed up to). They may have good business or political reasons to do something that doesn't make sense technically.
@brouhaha @gilesgoat This one was talking to custom hardware - some one-line-of-pixels-at-a-time machine vision device and some stepping motors - in a machine whose job was to quality-control microfilm of football pools coupons.
The idea was to photograph all the coupons before the matches were played, to guard against people altering coupons once the results were known. As with any paper handling system, particularly one which had to read vast numbers of folded or scrunched up forms in a very short time, the photography process wasn't entirely reliable. The job of this machine was to spot forms or batches that the camera machine had screwed up or torn and that needed separate handling.
Late 1970s.
Is it? I'd say it's a perfect language for a narrow class of rather simple problems. Which is why it seems such a good learner language - much like BASIC. Great for anything that can be written as a 10-liner, increasingly problematic past that.
I would say it combines some of the power of assembly with all the danger of assembly, while adding all the pitfalls of HLL shortcuts.
> For no particular reason, I am feeling nostalgic for Turbo Pascal.
Turbo Pascal for DOS is what we used to learn programming with in middle school in the early 2000s.
@resuna @brouhaha @uep If I recall correctly, UCSD Pascal couldn't be used to make a standalone, all-in-one-binary application. There were separate runtime libraries that had to be present.
Turbo Pascal was extremely fast to compile on a (Mac) 68000 CPU, but it produced code which was flaky on the 68020. I've forgotten the details but Turbo Pascal for the Mac didn't last too far into the Mac II era before being discontinued.
I remember learning to code in C using Lightspeed C on a Mac Plus.
@_the_cloud @resuna @brouhaha @uep
ICBW, as I only briefly worked on a system using it, but wasn't UCSD Pascal based on a byte-code interpreter runtime? (That was the "p-system" mentioned up-thread.)
Could that be what you're remembering?
@brouhaha @CliftonR @_the_cloud @resuna
Yeah, my impressions at the time (and comments here) were less about the bytecode aspect in particular and more about the idea of a complete environment focused on that language and source files and programs. I used it on Apple][, a lot, where you booted into it as a different system entirely. That was where I went to concentrate on learning "proper" programming in 80 columns. I would now say that the narrower scope that let it be complete was a feature, but that's not quite how I thought of it at the time.
Turbo didn't feel like that at all, but because it was an early version of what we now call an IDE on top of a general purpose operating system, it seemed like it was trying to recreate that. By then I had also been using enough other general purpose operating systems that the mismatch was jarring.
@stmuk @CliftonR @_the_cloud @resuna @uep
Here's the Microsoft pcode details, provided a while back by @fraggle
@[email protected] here's the pcode help file, one of many that I converted into html a few years ago. Appears to have the full byte code specification https://fragglet.github.io/dos-help-files/pcode.hlp/index.html
@CliftonR @_the_cloud @resuna @brouhaha @uep
Funny you should say that, I just dropped the UCSD manual into Datamuseum.dk's BitArchive:
I could always tell if a program had been built with Turbo Pascal because it wrote directly to the hardware, bypassing both MS-DOS and the BIOS, was unable to be redirected (even if it just output plain text), didn't always work correctly on my Kaypro 2000 or on systems with V20 chips, and occasionally scribbled on the wrong screen in DoubleDOS.
None of which was necessary 99.44% of the time.
So I had a hate/hate relationship with Borland.