For no particular reason, I am feeling nostalgic for Turbo Pascal.
I'm also feeling nostalgic for MPW Pascal, and the Macintosh Programmer's Workshop in general.
I was sad when Apple ditched Pascal for C.
Pascal was not perfect, by any means. I've programmed professionally in Ada, one of Pascal's successors, and liked it pretty well.
I wanted to use Modula-3, "an elegant weapon for a more civilized age," and dabbled in it a little, but there was no market for it.
People sometimes ask what my favorite programming language is. Probably Smalltalk, except that I prefer strong typing, and Smalltalk wants to be its own world, like FORTH, rather than being used to create "native" applications. I usually reply that I have no favorite, because all programming languages suck. Some suck less than others.
C was OK as a systems programming language for the PDP-11 in the 1970s. The minimalism that was absolutely _required_ for that has not been a benefit to good software design practices since the mid-1980s.
As the great C.A.R. Hoare said of ALGOL 60, "Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors."
When I tell people that I have no favorite programming language, inevitably their next question is what programming languages do I use.
"Mostly C and C++, and some Python. Various others rarely as needed."
Then I'm asked why, especially since I consider C to be terrible.
"The customer is always right."
@brouhaha "Then I'm asked why, especially since I consider C to be terrible." .... BLASPHEMY! 😎 Well, OK, it's not as good as C++, but how can you call C "terrible"!?

@gilesgoat
#1 reason: very weak typing without runtime checking

C++ is worse than C! By being an almost-superset of C, it has all of the deficiencies of C, and adds many new ones of its own!

I will, however, admit that modern C++ does make it more practical to avoid the C pitfalls. Unfortunately it can't actually prevent them. The onus is on the programmer to know what to avoid. That's a crock.

@brouhaha .. and yet I think 'those may be reasons to love C too' ... I am getting emotional here 😊

@gilesgoat
C is a good language for a very limited problem domain. The problem is that it gets used for nearly everything, and mostly far outside that limited domain.

C combines some of the power of assembly language with almost all of the danger of assembly language.

@brouhaha @gilesgoat Are there still people using C? - I haven't had to touch it for decades.
@TimWardCam @gilesgoat
Most of the C I'm paid to write is for microcontrollers, either on bare metal, or with a small RTOS like FreeRTOS or Zephyr. Most software I'm paid to write for host computers is in C++ or Python.

@brouhaha @gilesgoat Oh, OK.

One might characterise C as "BCPL, but with the ability to address bytes not just words". From which point of view a microcontroller (where you're counting every single byte of ROM) is indeed a valid use case 🤣

@TimWardCam @gilesgoat On the other hand, languages like Ada and Rust actually can effectively target small-memory microcontrollers. Ada actually does have facilities for managing bits and bytes (optional; Chapter 13, Representation Specifications), and they're actually more powerful than what C gives you with its undefined-layout bit fields.
@brouhaha @TimWardCam We have a multi-platform game engine that does graphics, audio, controllers, logic and anything needed, that started in C then evolved into C++. It works like a charm for that kind of application 🥰
@gilesgoat @brouhaha Sure, that was a natural evolution back in the day when Java was so slow that it could only really be considered a serious challenger to Visual Basic.
@TimWardCam @brouhaha My very first ATTEMPT at C, believe it or not, was with HiSoft C ON *TAPE* for the ZX Spectrum .. I could NOT compile "Hello World" 😅 My second 'serious' attempt was with Metacomco C on Microdrives on the Sinclair QL .. I did something .. FINALLY someone ported C68 for the QL on floppies, and there I finally started to learn C. Finally, at my first real paid job we were using C on Motorola 68030/20 machines, and I did use it sometimes on VAX/VMS .. it's all been C since 🙂
@TimWardCam @brouhaha Ah, I should have said "quite a big part of my job" was to port/write device drivers for various HW .. you may see why I love C that much 🙂
@gilesgoat @TimWardCam
Yes, I've been heavily involved in device drivers and network protocol stacks. C because it was the supported language, and what the customer wanted, not because it was the best language in any general sense.

@brouhaha @gilesgoat The customer or employer always gets what they want.

Unless it's FORTRAN, which I removed from my CV so that agents would stop phoning me about FORTRAN gigs, or Perl, which I always make clear at interview time that I will absolutely refuse to attempt to read, let alone write.

@TimWardCam @brouhaha Ok "just to say a bit of madness that got into me once" .. once I did open a book .. and my eyes went 😱 😱 😱 ... "I got the thing" that I MUST DO/KNOW something about it .. long story made short I *EVEN* managed to program a whole character set BY HAND for the VT220, and modify AND RUN a CP/M version of .. .. APL80 .. yes .. APL .. I remember managing to do a few things .. surprisingly that Z80 version was 'quite fast' .. but yeah, "language for Egyptians" 😂 😂
@gilesgoat @brouhaha Nobody ever asked me to do APL so the opportunity to refuse never arose!
@gilesgoat @TimWardCam @brouhaha If wiki is to be trusted, the first version of APL publicly available ran on the IBM 1130, which was remarkably similar in performance and capability to a typical CP/M machine: 64K bytes max memory, clock speed in the 200-500 kHz range. A 4 MHz Z80 might even have been faster than it. The Xerox 820-II I used a lot had dual single-sided double-density 8" drives which were (I think) 480 kB each, so it came close to the 1 MB stated disk requirement.
@gilesgoat @TimWardCam @brouhaha oh come on.
Perl.
I can write perl.
Easy.
I cannot read it, though. 🤣
@gunstick @gilesgoat @TimWardCam
I can't write it (for anything non-trivial) without also being able to read it.
@TimWardCam @gilesgoat
I don't mind Fortran. Maybe I should list it on my résumé.
I'm with you on Perl. I once started writing a two-pass cross-assembler for HP-21xx minicomputers in Awk. When I got sick of dealing with Awk limitations, someone recommended that I run my code through the Awk-to-Perl translator. I did that, got the assembler working, and published it as open source, but I swore to never touch Perl again.
@TimWardCam @brouhaha @gilesgoat What they want, or what they ask for?

@shelldozer @brouhaha @gilesgoat I follow the Bellman's "what I tell you three times is true".

If an employer/customer asks me to do something stupid I tell them it's stupid. If they say do it anyway I tell them it's stupid again. If they ask a third time then I shut up and do it (unless to do so would break any of the codes of conduct I'm signed up to). They may have good business or political reasons to do something that doesn't make sense technically.

@TimWardCam @shelldozer @brouhaha @gilesgoat "The things I tell you will not be wrong."
@_the_cloud @TimWardCam @brouhaha @gilesgoat Customer: "Can you print the whole Internet for me to look at later?"
@gilesgoat @brouhaha I was writing lots of assembler in those days.
@TimWardCam @brouhaha Me too, Motorola 68K asm .. nowadays .. so far I've only written Z80 and SG2650 asm ( for fun, on weekends ) 😂
@gilesgoat @brouhaha 8080 then 68K (never did much Z80, just treated it as an 8080). Oh, and one project on 2900 bit slice which was a bit unusual.
@TimWardCam @gilesgoat
I've long been a fan of bit-slice designs and microcoding. I started a low-volume email list on that topic some years back.

@brouhaha @gilesgoat This one was talking to custom hardware - some one-line-of-pixels-at-a-time machine vision device and some stepping motors - in a machine whose job was to quality control microfilm of football pools coupons.

The idea was to photograph all the coupons before the matches were played, to guard against people altering coupons once the results were known. As with any paper handling system, particularly one which had to read vast numbers of folded or scrunched up forms in a very short time, the photography process wasn't entirely reliable. The job of this machine was to spot forms or batches that the machine had screwed up or torn and that needed separate handling.

Late 1970s.

@TimWardCam @gilesgoat
I wrote a lot of assembly from the mid-1970s to the late 1980s. For a while beyond that I still wrote assembly for small PIC microcontrollers, and for bare metal startup and interrupt linkage on "real processors". Since the late 1990s, I've only written assembly where absolutely required (increasingly rare), or for esoteric personal projects.
@gilesgoat @TimWardCam
I'm not saying that it's impossible to build large and useful software systems in C or C++. I'm only saying that there are other languages that can do that even better, with fewer foot guns.
@TimWardCam @brouhaha @gilesgoat Writing C for an automotive ECU right now.

@brouhaha @gilesgoat

Is it? I'd say it's a perfect language for a narrow class of rather simple problems. Which is why it seems such a good learner language - much like BASIC. Great for anything that can be written as a 10-liner, increasingly problematic past that.

I would say it combines some of the power of assembly with all the danger of assembly, while adding all the pitfalls of HLL shortcuts.

@brouhaha

> For no particular reason, I am feeling nostalgic for Turbo Pascal.

Turbo Pascal for DOS is what we used to learn programming with in middle school in the early 2000s.

@brouhaha That's kinda my steady state.
@brouhaha
When I read this I suddenly felt nostalgic for the Terak UCSD p-System (http://pascal.hansotten.com/ucsd-p-system/terak-and-uscd-p-system/)
In my head I can hear the chunky clicking of the floppy drive right now...

@brouhaha it was cool and all but I always thought it was trying to be UCSD p-System and didn't quite manage it
@uep Having used both quite a bit, the only things "wrong" with Turbo Pascal compared to UCSD Pascal were:
* Not cross-platform - only supported Z80 and 8088/8086
* No separate compilation or "units" - fixed in Turbo Pascal 4.0 (Early UCSD actually didn't have this either.)

@brouhaha @uep

I don't think there were ever many apps written in UCSD but I think it was better behaved.

@resuna @brouhaha @uep If I recall correctly, UCSD Pascal couldn't be used to make a standalone, all-in-one-binary application. There were separate runtime libraries that had to be present.

Turbo Pascal was extremely fast to compile on a (Mac) 68000 CPU, but it produced code which was flaky on the 68020. I've forgotten the details but Turbo Pascal for the Mac didn't last too far into the Mac II era before being discontinued.

@_the_cloud @resuna @brouhaha

I remember learning to code in C using Lightspeed C on a Mac Plus

@_the_cloud @resuna @brouhaha @uep

ICBW, as I only briefly worked on a system using it, but wasn't UCSD Pascal based on a byte-code interpreter runtime? (That was the "p-system" mentioned up-thread.)

Could that be what you're remembering?

@CliftonR @_the_cloud @resuna @uep
Yes, the p-System mostly used bytecode ("p-code"). There were some native compilers later, and some p-System derivatives that used the host operating system rather than forcing the use of their own.
By the late 1980s, the use cases for a bytecode interpreter language had shrunk dramatically. Yet somehow Java brought that back in the late 1990s. I don't think JVM bytecode would have stuck around long, if not for most JVMs going to JIT.

@brouhaha @CliftonR @_the_cloud @resuna

Yeah, my impressions at the time (and comments here) were less about the bytecode aspect in particular and more about the idea of a complete environment focused on that language and source files and programs. I used it on Apple][, a lot, where you booted into it as a different system entirely. That was where I went to concentrate on learning "proper" programming in 80 columns. I would now say that the narrower scope that let it be complete was a feature, but that's not quite how I thought of it at the time.

Turbo didn't feel like that at all, but because it was an early version of what we now call an IDE on top of a general purpose operating system, it seemed like it was trying to recreate that. By then I had also been using enough other general purpose operating systems that the mismatch was jarring.

@brouhaha @CliftonR @_the_cloud @resuna @uep Oddly, Microsoft had some sort of C compiler which output p-code on various platforms, and which was used to port their first spreadsheet, Multiplan, even to systems like the C64 before they gave up on portability.
@stmuk @CliftonR @_the_cloud @resuna @uep
For a long time I thought that was just something Microsoft used internally, so I was surprised to learn that it was actually a documented and supported part of Visual C++ back then.
@brouhaha @CliftonR @_the_cloud @resuna @uep Thanks for that! I thought that as well! I'll have to investigate further.

@stmuk @CliftonR @_the_cloud @resuna @uep
Here's the Microsoft pcode details, provided a while back by @fraggle

https://social.coop/@fraggle/114418571014576084


@[email protected] here's the pcode help file, one of many that I converted into html a few years ago. Appears to have the full byte code specification https://fragglet.github.io/dos-help-files/pcode.hlp/index.html


@CliftonR @_the_cloud @resuna @brouhaha @uep

Funny you should say that, I just dropped the UCSD manual into Datamuseum.dk's BitArchive:

https://datamuseum.dk/wiki/Bits:30009790


@brouhaha

I could always tell if a program had been built with Turbo Pascal because it wrote directly to the hardware, bypassing both MS-DOS and the BIOS, was unable to be redirected (even if it just output plain text), didn't always work correctly on my Kaypro 2000 or on systems with V20 chips, and occasionally scribbled on the wrong screen in DoubleDOS.

None of which was necessary 99.44% of the time.

So I had a hate/hate relationship with Borland.

@resuna @brouhaha Turbo Pascal's floating point accuracy left something to be desired as well ...