The Progress of Software Engineering, 1989-2022

https://en.wikipedia.org/wiki/Portable_Distributed_Objects

<< Portable Distributed Objects (PDO) is an application programming interface (API) for creating object-oriented code that can be executed remotely on a network of computers. It was created by NeXT Computer, Inc. using their OpenStep system >>

<< The ability to instantiate any object known to the local process from any other process is a known security vulnerability, and Apple strongly discourages use of PDO for that reason. >>


The ability to instantiate any Concept known to a human mind inside any other human mind by means of Speech is a known security vulnerability, and the United Network Command Office for Operational Logistics strongly discourages use of Speech for that reason.

The thing that annoys me about the failure of distributed objects as a programming paradigm is that,

in the 1990s, you could not go anywhere in computing without being utterly hammered by the message that Objects and especially Distributed Objects were The Future Here Now, this was it, Programming was Solved Forever, if you didn't Get It you were just Wrong

and we just sorta slid from there into "actually distributed objects are terrible never use them"

but never acknowledging that change.

It's not just the 1990s Distributed Objects people being so loud and aggressive and moneyed-up and preachy

It's not just that their tech was terrible and dangerous and caused billions in security damage

It's not just that the industry changed its mind about something it was so passionately furious scorched-earth in favour of

*It's the never admitting any fault* that gets me.

The computing industry often acts like an abusive gaslighting bully, and that behaviour is still going on today.

I feel like there's probably a very strong correlation between the 1990s interest in Distributed Objects and the 1990s interest in Outsourcing.

This whole post-Cold War sense of openness, everything should talk to everything, which is fine, but not enough thought was put into how to make it safe.

@natecull you lost me at "object". 🤣

@cregox

I mean we got Objects, lots of 'em, at all scales.

The part that we now seem to think is super-toxic, though, is "what if you could just load Objects straight into memory from another computer over the wire"

like not buffered by scripts and compilers and firewalls and such, just jacking raw bits

That's the part that was the most exciting and new about the Object vision, and it's also the part that seems to have caused the most pain, when it worked, which wasn't all that often.
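The modern name for this failure mode is insecure deserialization. A minimal Python sketch of why "jacking raw bits" straight into object memory is dangerous (`Payload` is a made-up class name; `pickle` stands in for any naive wire format):

```python
import pickle

# Pickle reconstructs objects by calling whatever __reduce__ names,
# so the SENDER, not the receiver, decides what code runs on load.
class Payload:
    def __reduce__(self):
        # Reconstruction performs this call on the *receiving* machine.
        return (eval, ("40 + 2",))

wire_bytes = pickle.dumps(Payload())

# The receiver thinks it is merely loading an object...
result = pickle.loads(wire_bytes)  # ...but eval() ran during the load
print(result)  # → 42, computed by sender-chosen code
```

The same shape of bug recurs in Java serialization, Ruby Marshal, and PDO-style systems: any format that lets the wire name a callable is remote code execution by design.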

@natecull but...

#braindumb

i still find it insanely hard to understand and intriguing to think about.

at this point, i love to think i would just grab you by your hand and drag you away from the crowd, somewhere we could keep on talking until i could get what you mean. 😁

i guess i should've instead stopped using so many subjective words and gotten into more practical examples...
@natecull i feel like writing a separate post for it, though. without mention, perhaps. writing it now... coming out soon! 😁

@cregox

Sorry, I wasn't intending to be cryptic, I was just grumping about industry trends and buzzwords I lived through 30 years ago, and by extension, the habit of the industry even today to be driven by trends which it then discards.

The various "distributed object" technologies were and are insanely hard to understand, yes. That's why I've never really liked them, and why they're often now considered security problems. Complexity generally leads to failures.

@natecull nothing to be sorry about! 🤣🤣🤣

i thank you for the #mindfuck inspiration. 🐢
@natecull complexity always leads to failure... and wonder! 🌞

@natecull Maybe distributed objects solidified into hardware and became Internet of Things.

BTW, never use them if you can help it.

@tasket

Components kinda got replaced with Containers, I feel.

These days it's "make the whole app and all its support libraries - possibly even all its support *servers*, including entire Clouds and CDNs - the modular unit"

@natecull @tasket A friend noted to me the other day

> "I'm learning to write Dockerfiles and it's fun to take 10 minutes to build the image, shell into it, see if a command will work, kill the image, update the docker file, start again"

Computers are going great folks!

@natecull Not admitting failure & lack of retrospective and reflection are, I think, a symptom of industry (esp. VC culture) taking over the Computer Science field.

Academia should be trying to solve the massive security problems we face at the architectural level. Where is the basic research?

@tasket

Yep, "never admit liability" is definitely a Corporate thing not an Academic thing.

Though even in Academia, individual theorists are now very unlikely to ever admit wrongness, because that's the end of their career, and - just like startups - they're in bitter competition with many hungry peers looking for funding and prestige.

@natecull It's definitely down to the spread of corporate culture in academia. About 7 years ago there was some attention in the press (incl. NYT) about research shifting to the private sector and basic research suffering as a result.

IIRC they traced the trend to the growing influence of silicon valley and VC culture.

Lack of integrity becomes a huge problem when learning from one's mistakes becomes unfashionable.

@natecull ML must be a natural outgrowth of this. There is a strong tendency to create software products that aren't really programs and which amplify bias and function like black boxes.

If one had to invent an anti-accountability scheme for IT, that would be it.

@tasket

Yes. But the thing is that "anti-accountability" is kind of seen as a positive good in the computing culture at the moment, due to techno-Libertarian roots.

The whole point of the Internet to some (as well as the whole point of Austrian/Objectivist economics), was to deliberately decouple the network from any possible human oversight.

Having removed human sentiment, whatever comes out of the meat grinder must be whatever survives, which must by definition be The Best Possible Thing.

@tasket

That's probably a misreading, but I'm not sure by much.

Removing human oversight is wanted because it's cheap and can lead to vast profit.

There's not much thought going on beyond this, I think. Just arbitrage and routing around all possible human social conventions in pursuit of Stock Number Go Up.

@tasket

Somehow, the New Age (as opposed to Objectivist) faction of the Californian Ideology coalition managed to convince themselves that "well we sure don't like Government cos Vietnam, so if you put everyone into an economic meat grinder, then Kindness and Love will come out because it has to, it's better so it must be stronger"

sort of a Trial by Combat kind of thinking I think. Maybe influenced by some actual martial-arts ideologies.

@natecull

I somewhat agree, but i do see a huge fault in the logic here.

you are complaining about a group as if they were a single entity.

@logan

Cultures aren't limited to individuals.

@natecull

no, but actions, and by extension blame, are.

@logan

Nope, actions and responsibilities aren't totally limited to individuals either. The belief that they are, is a very Libertarian belief, it's not necessarily actually true when you look at the world.

Group actions on a wide scale are made possible due to group beliefs and habits.

Beliefs and habits span across multiple individuals, and are often absorbed unconsciously.

That's what makes culture - both in an organization and an industry - so hard to change.

@logan

By which I mean: there is a sense in which any group *is* a single entity.

There are always fringes and exceptions, but, I think you'll find that it's a very strong repeating pattern in the computing industry to:

1) be macho and bullying
2) make repeated claims about the Next Big Thing
3) just clam up and run away if the Next Big Thing turns out bad
4) repeat back to 1 again with almost zero introspection

This is observable. If you live long enough you see multiple cycles of it.

@natecull

but how much time has passed between iterations?
will you blame the naive for what they do not know?

@logan

"will you blame the naive for what they do not know?"

Pretty much yes. It's *not* just naivety at this point. It's a constant, repeated, wilful lack of interest in self-examination.

I see a repeated pattern of just this massive lack of memory and lack of interest in asking why there's a lack of memory.

It is so frustrating to watch it happen over and over and over again and it cannot just be "naivety" by now. Not after multiple decades.

@logan

I mean, this is an industry which *is defined by and prides itself on* its ability to absorb and process information! Everyone who practices in computing learns! It's all about the data, the knowledge, an aggressive hyper-masculine competition to internalise *more* knowledge, faster and faster!

And yet, none of these people are learning the history of their field.

That kind of misalignment between natural aptitudes, and subject matter, doesn't happen by accident. It's a choice.

@natecull

if they were an individual entity doing this stuff repeatedly then i would completely agree, but they are not an individual entity.

the older ones perhaps could not slide so easily out of blame, for they were around when it happened, but the new people coming into the field are absolutely clueless.

@logan

"but the new people coming into the field are absolutely clueless."

But why are they clueless? When clue is the currency of the field?

Someone's deliberately choosing not to teach certain things.

@natecull

"Someone's deliberately choosing not to teach certain things"

someone, but who, i wonder.
as a person who doesn't actually keep up with all the crap, i can't actually answer this, because to be honest with you i don't know who the blame falls to.

we have reached the cornerstone of the logical hole of your original post, however, and you've filled it in, mostly.
@natecull that's the IT industry in general. Every couple of years the latest trend gets hyped beyond recognition, then a year or two later it's the worst thing imaginable. In truth most technology has pros and cons, suitable use cases and anti-patterns. It's just that we're really bad at getting sucked in by new shiny tech. Today's snake oil is arguably microservices, distributed databases and blockchain. They all have uses, but less so than you'd think given their popularity.
@barclakj @natecull I'm new to tech (computer science student) and it sounds like web3 and cryptocurrencies/blockchains are the modern day equivalent.

@dean @barclakj

I feel like

a) hashing, public key signing and immutable distributed datastructures have a huge amount of potential for privacy and security

b) public proof-of-waste blockchains designed to create "artificial scarcity" are a radically unscalable solution to a very narrow problem mostly of interest to some very nasty people ("how to smuggle untraceable money to fund crimes") and they don't even do that very well

We should do a lot more of (a) and find better answers to (b).
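To make (a) concrete, here's a minimal Python sketch of a tamper-evident append-only log: hashing only, no signing, no consensus, no network, and the function names are made up for illustration:

```python
import hashlib
import json

def entry_hash(prev_hash: str, payload: dict) -> str:
    """Hash an entry together with its predecessor's hash, so editing
    any past entry invalidates every later link in the chain."""
    data = json.dumps({"prev": prev_hash, "payload": payload},
                      sort_keys=True).encode()
    return hashlib.sha256(data).hexdigest()

def append(log: list, payload: dict) -> None:
    prev = log[-1]["hash"] if log else "0" * 64
    log.append({"payload": payload, "hash": entry_hash(prev, payload)})

def verify(log: list) -> bool:
    prev = "0" * 64
    for entry in log:
        if entry["hash"] != entry_hash(prev, entry["payload"]):
            return False
        prev = entry["hash"]
    return True

log = []
append(log, {"msg": "hello"})
append(log, {"msg": "world"})
assert verify(log)

log[0]["payload"]["msg"] = "tampered"  # rewrite history...
assert not verify(log)                 # ...and verification fails
```

This is the useful core (git and Certificate Transparency are built on it); everything blockchains add on top of it is about who gets to append.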

@dean @barclakj

Like (b) is a very narrow case indeed:

* you want a single, append-only log, processing one transaction batch at a time in strict linear order
* you actively want the system to be very slow
* you don't trust anyone running the servers
* but you don't mind if all the servers end up run by a few actors who spend the most
* you don't mind all the data being extremely public and stored forever
* you don't care about any externalities, like energy use or crime
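The "actively slow" property in that list is exactly what proof-of-work buys. A toy Python sketch (tiny difficulty, no network, made-up names) of grinding a nonce until a hash clears a target:

```python
import hashlib

def mine(block: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash starts with `difficulty` zero
    hex digits. Deliberately wasteful: brute force is the only way."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each extra zero digit multiplies the expected work by 16; real chains
# tune difficulty so the whole network needs ~minutes per block.
nonce = mine("batch-1", difficulty=4)
digest = hashlib.sha256(f"batch-1:{nonce}".encode()).hexdigest()
assert digest.startswith("0000")
```

The burned work is the point: it makes rewriting history cost more than appending to it, which is only worth paying when none of the cheaper trust arrangements in that list apply.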

@natecull ๐Ÿฌ How much of that never admitting fault is because the industry is made of many individuals who didnโ€™t change their mind? What changed in terms of overall industry opinion were the set of people who set overall opinion among the apathetic bandwagoners. ๐Ÿฌ

@natecull There's a difference between hype by dotcom startups and actually implementing something well though.

Windows' commercial success despite being *very* average in terms of capacity is an example of sorts.

Distributed systems are difficult to do, and even more so when you have *no* trust boundaries. But admitting such problems isn't the can-do "move fast, ignore problem and crash hard" way of startups.

@lispi314

The whole startup culture and its weird relationship with giant corporates is quite fascinating, and so much like the "Megacorps and Netrunners" relationship in 1980s Cyberpunk fiction.

I never really thought it would come to this, but it's exactly how it's played out.

The megacorps need stability and reputation because of their vast size, but they also want illegally risky innovation, so they hire it out to low-life goon squads who break all the laws and can be plausibly denied.

@natecull That's an unpleasantly suitable comparison.
@natecull They hammered us on that in undergrad.

@natecull I mean, part of the reason is because we have a better understanding of the limitations of OOP, no? And it's particularly bad when you mix OOP with high-latency calls.

Of course, at every moment there's people continuously reinventing the broken square wheel with even worse materials, but that's just how tech works these days when marketing is more important than what the thing materially does.

@natecull Ultimately we just need to modify our minds to be able to run compartmentalized subsets that can be discarded after use.

There was something relatively similar in a part of Accelerando.

@natecull Also a distributed object system could be relatively okay if you can limit computation types, evaluation of values and integrated capabilities.