I think that modern computer education is wrong and bad and I have a modest proposal/bad idea to fix it:

Teach computer science via living computer history.

Start folks off on (modern recreations of) early computers. Make them qualify by submitting a program on punch cards before graduating to a serial terminal.

Give them a text terminal on a very limited timesharing system running an older OS, and encourage them to make it their only computing environment.

Have them complete assignments on various 8-bit micros, and then on 68Ks.

Use Unix.

But, like, Unix Unix. Ed Unix.
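For anyone who has never met it, this is roughly what working in ed(1) looks like. A minimal sketch, scripted here so it can run non-interactively; at an actual ed prompt you would type the same commands by hand (the filename `greeting.txt` is just an example):

```shell
# A complete ed session: enter append mode ("a"), type one line of text,
# end input with a lone "." on its own line, write the buffer to a file
# ("w greeting.txt"), then quit ("q"). The -s flag suppresses byte counts.
printf 'a\nhello from 1973\n.\nw greeting.txt\nq\n' | ed -s
cat greeting.txt   # -> hello from 1973
```

No cursor, no screen redraw, one line at a time: exactly the discipline the terminal era imposed.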

Give folks an understanding of the history of computing, the depth and breadth of existing toolchains, and the capabilities and limitations of vintage designs.

And only after they have that historical perspective do you toss anything modern at them.

Let people live their lives, and write their code, in less than 1MB of RAM on a 12-bit or 16-bit minicomputer before moving on to something powerful enough to encourage bad decisions.

The history of modern computing isn’t very long. It can be learned and lived, and in doing so, we’ll get better engineers, with the ability and perspective to write better software.

There are several reasons for this (and about a million reasons against it, but that’s for another day):

1) Learning a programming language doesn’t make you an engineer.
2) How can we avoid past mistakes if we never discuss them?
3) How can we write better tools for the next generation of computers if we’ve never used the best tools from the last generation of computers?

4) We’ve been using the same rough architecture for a long time. Eons, in computer terms. We are long overdue for a change, but most modern computer education presents x86-64 as if it is without flaw and the inevitable path for our future, instead of just one option among many.

5) Abstraction is good. Learning how to solve a specific problem in a specific case is okay. Learning how to solve a class of problems across multiple cases is better.

So, should we force burgeoning computer scientists and engineers to qualify on a PDP-8 and an Altair 8800, and write assembly for an Apple II, before we let them get to a Linux box?

No. Probably not. It’d drive a bunch of people away from the industry, and probably not actually help many folks.

But we should totally offer living history classes, and folks who are serious about low-level development should be strongly encouraged to take them.

I’ll give an example of what I would like to see more of:

- teach students roughly how an Atari 2600 works.

- give them a bucket full of off-the-shelf parts.

- give them four weeks to design a game console and make it play a computer game.

@ajroach42
I would *so* apply for this (despite having left the education system a while ago)

:s/a2600/c64 😉

Doing the retro thing, using a punch card, using a UNIX terminal, typing in vi: a sense of history is useful in any field.

#history #cs

@Qwxlea
I think a program like this should exist, and be widely available.

I’m not sure it should actually be a requirement, but I’d love to see it in practice.