The two hardest problems in Computer Science are

1. Human communication
2. Getting people in tech to believe that human communication is important

@hazelweakly I fully admit to HATING the mandatory "arts" courses (English lit, sociology, history, etc) I had to do as part of my uni degree. If I were doing it now I would probably have chatgpt'ed my way through most of the essay writing. Which is what I worry is happening with the next generation 😬

I'm at the point in life/career where I realize these were actually really useful.

how do we actually illustrate the value of these subjects to future generations? For me I know part of the issue was time management - trying to focus on reading a book and reflecting on it while also juggling all the "engineering" course work. Separate them into their own year/semester?

@deliverator In my engineering degree plan, an engineering ethics course was mandatory. We discussed some high-profile failures like Bhopal and Chornobyl, as well as some lesser-known ones like the Hyatt Regency walkway collapse, the Therac-25, and the Gimli Glider incident. We also spent significant time on how IBM and other big companies built machines which powered Hitler’s extermination camps.

The programming and IT degrees available at the time didn’t include any ethics courses, which is part of why I’ve never felt people doing either should be called engineers.

@bob_zim yup, definitely part of the whole thing. Canada (or is it just Ontario?) has rules about having "engineer" in a job title. Some people complain about artificial gatekeeping, but words are supposed to have meaning. I don't think we've actually figured out what "software engineering" means, let alone have a means of doing it and validating it.

@bob_zim @deliverator my electronic engineering degree barely touched ethics, just as one topic in the "Engineering Management" subject that was a real cakewalk. Computer science didn't mention it.

We also had room for zero things outside of science, which I wish was different. It would have been nice to do some subjects on other topics.

@bob_zim @deliverator Further, I would say that unless you’re building your software to reliability standards (not goals, standards that you test) then you’re a programmer, not an engineer.

@cwg1231 @deliverator Even then, software is math. Beyond tested reliability, software can be *provably correct*.

@bob_zim @deliverator true. I haven’t played around with any of those cool formally verifiable languages yet, so I wasn’t comfortable asserting that they’re practical for industry use.

@cwg1231 @deliverator The seL4 kernel is pretty widely used (not as common as VxWorks, but it’s up there). It’s mostly C with a little assembly, and most versions are proven correct against a specification in Haskell using Isabelle. It’s a shining example that C code *can be* written correctly, but also of the lengths to which one must go to ensure this correctness.

Newer languages certainly make big parts of the process easier.

@cwg1231 @deliverator It’s just so weird how many software people have never heard of formal verification, and have no idea that you can write software which is provably free of implementation bugs.

On top of this, software doesn’t decay like a bridge! It doesn’t wear out like an engine does. Once it’s correct, it’s correct forever (or at least until the specification changes)! That should be huge! To me, the fact this isn’t even widely known—let alone striven for—is a sign of how unserious the software industry at large is.
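(For readers who haven’t seen this in practice: here’s a toy sketch, in Lean 4, of what a machine-checked proof looks like. The function and theorem names are made up for illustration; real verification efforts like seL4 are vastly larger, but the principle is the same: the proof covers *all* inputs, not just test cases.)

```lean
-- A tiny function together with a specification it provably meets.
def myMax (a b : Nat) : Nat := if a ≤ b then b else a

-- The checker rejects the file unless this holds for every possible input.
theorem myMax_ge_left (a b : Nat) : a ≤ myMax a b := by
  simp only [myMax]; split <;> omega

theorem myMax_ge_right (a b : Nat) : b ≤ myMax a b := by
  simp only [myMax]; split <;> omega
```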

@bob_zim @cwg1231 I don't want to rain on the formal verification parade. I think it could be an important piece of the puzzle! But I don't think it's a silver bullet.

Plus it doesn't solve the human communication piece of things, which is where this thread started.

@deliverator @cwg1231 I guess my point with that digression is that engineering is characterized by exactly two things—ethics and rigor—neither of which is practiced by the overwhelming majority of programmers.

Formal verification isn’t a total solution. A specification can still be bad in plenty of ways, but the software built to meet that specification can be correct. It’s difficult to do, but every real engineering discipline is difficult, and none of them can even offer provably correct solutions.

@bob_zim @deliverator excellent points all around. Regarding rigor, something that absolutely needs to enter the public discussion is confidence metrics for probabilistic software (i.e. any machine learning algorithm ever, especially computer vision). Any probabilistic algorithm being marketed without a probability disclosure is wildly irresponsible and should be publicly shamed. I’d go as far as saying probability disclosures ought to be legally mandated, as well as disclosure of the dataset used to produce that measurement.
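(To make the idea concrete: here’s a minimal sketch of what such a disclosure could look like in code. All the names, the dataset string, and the accuracy figure are assumptions for illustration, not any real API.)

```python
from dataclasses import dataclass

@dataclass
class Disclosure:
    """Hypothetical disclosure attached to every model prediction."""
    label: str
    confidence: float   # model's estimated probability for this label
    eval_dataset: str   # dataset the confidence was measured against (assumed name)
    eval_accuracy: float  # accuracy measured on that dataset (assumed figure)

def predict_with_disclosure(scores: dict[str, float]) -> Disclosure:
    # Pick the highest-scoring label, but report the probability openly
    # instead of returning a bare answer.
    label = max(scores, key=scores.get)
    return Disclosure(
        label=label,
        confidence=scores[label],
        eval_dataset="example-benchmark-v1",
        eval_accuracy=0.87,
    )

d = predict_with_disclosure({"cat": 0.62, "dog": 0.38})
print(d.label, d.confidence)  # cat 0.62
```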

While software doesn’t decay in the literal sense, it does decay metaphorically. Vulnerabilities are found and need to be patched. Dependencies become deprecated or unmaintained. We run out of seconds since 1 January 1970 countable by a signed 32-bit integer. The left-pad incident was a wake-up call for dependency maintenance. I’m vaguely aware of some frameworks for assessing the risk added to your project by a dependency, but I’ve never heard of a dependency being excluded from a project because of its risk. In that sense, we’re still in the fuck around and find out stage of software development. I hope we can change that soon.
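(The 32-bit rollover mentioned above is easy to check for yourself; this is just the standard Year 2038 arithmetic, nothing project-specific:)

```python
from datetime import datetime, timezone

# The largest value a signed 32-bit time_t can hold.
max_time_t = 2**31 - 1

# The last representable moment before a signed 32-bit counter overflows.
rollover = datetime.fromtimestamp(max_time_t, tz=timezone.utc)
print(rollover.isoformat())  # 2038-01-19T03:14:07+00:00
```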