I think that modern computer education is wrong and bad and I have a modest proposal/bad idea to fix it:

Teach computer science via living computer history.

Start folks off on (modern recreations of) early computers. Make them qualify by submitting a program on punch cards before graduating to a serial terminal.

Give them a text terminal on a very limited timesharing system running an older OS, and encourage them to make it their only computer.

Have them complete assignments on various 8-bit micros, and then on 68Ks.

Use Unix.

But, like, Unix Unix. Ed Unix.
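(If you’ve never touched ed: below is roughly what writing and running a tiny C program looks like in it. A sketch, file name and all; the $ lines are the shell, everything else is typed at ed. "a" appends lines, a lone "." ends input, "w" writes the file and prints the byte count, "q" quits. The program is K&R-era C, so a modern compiler will grumble about it.)

    $ ed hello.c
    ?hello.c
    a
    #include <stdio.h>

    main()
    {
            printf("hello, world\n");
    }
    .
    w
    58
    q
    $ cc hello.c && ./a.out
    hello, world

No syntax highlighting, no cursor, no feedback beyond a byte count and the occasional "?". That’s the point.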

Give folks an understanding of the history of computing, the depth and breadth of existing toolchains, and the capabilities and limitations of vintage designs.

And only after they have that historical perspective do you toss anything modern at them.

Let people live their lives, and write their code, in less than 1MB of RAM on a 12-bit or 16-bit minicomputer before moving on to something powerful enough to encourage bad decisions.

The history of modern computing isn’t very long. It can be learned and lived, and in doing so we’ll get better engineers, with the ability and perspective to write better software.

And, like, that’s way more modest than eating babies or whatever nonsense Swift was on about.
@ajroach42 but eating babies would also be good training for engineers.