Been on a bit of a journey reading about Systems Thinking in the last few weeks. Why is that, you may ask? Well: point one, I wanted to understand the arguments for #SafetyByDesign as added to the UK #OnlineSafetyAct, which says that services must be "safe by design".
@jim I could actually buy that as a sensible requirement if social media platforms were actually designed by very serious people to meet social and cultural needs. But as a former dot-com era developer, all I can say is LOL ROFL. Nobody has a clue that it's going to become a service people depend on, or that safety is an issue, until it's too damn late! Until then it's just a neat toy a couple of devs are noodling with in their spare time for their own use.
@cstross It is a tall order for so many reasons: closed vs open systems; the adversarial or opposed goals of the regulator (risk reduction) and the platform (attention); low alignment between the legality of content and its risk; context dependency changing the nature of content and risk.
@cstross Still: if alignment between the users, community and platform is high, e.g. with Mastodon, then safety is much more realisable.
@jim As I keep yelling at people, the profitability of a business model does not confer legitimacy: that kind of thinking—very much a Silicon Valley thing—is highly problematic in commercial social media, which monetizes our social connections.
@cstross Quite. So the question is, which things are inside the safety "system"? If the business model is outside, then really it is just tech acting against "potentially problematic" content and users.
@jim I'm tempted to suggest as a rule of thumb that true security in social media starts with banning for-profit companies from running social media. (Non-profits? Sure. Charities? Sure. But Mark Zuckerberg or Elon Musk? Absolutely not.)
@cstross @jim The heart of the problem is a business model of secret tricks designed to drive engagement via outrage to sell ads. Can you have a for-profit social media platform that doesn’t do that? No idea.
@cstross @jim The “drive engagement via outrage to sell ads” thing is not new: the Daily Mail, everything from News Corp and various other low-quality outlets have been doing the same for many decades. Now X, Facebook and the rest don’t even have to pay “journalists” to make stuff up; there are enough racist scum who are happy to do that for free. Especially when the lies align with what the owners want in their heart of hearts.

@cstross @jim I always think of ShootingPeople.org. They closed last year after 27 years, but were an early social media model that worked precisely because community curation was funded, and that required building a little tech, borrowing money and being a limited company, back in 2002.

What’s different between them and the planet-tons of shit that followed is that they never stopped being ‘by filmmakers for filmmakers’. The founders didn’t sell out, they never automated moderation, every channel had a named ‘editor’, they nurtured members and tried to talk through problematic posts with the authors before publishing, and they stuck to daily scrollable digests over a non-stop firehose. I.e. the problem wasn’t that they were a tiny company that reinvested whatever it made; that would only have become a problem if it had changed. Instead ‘care’ was embedded all over, and it’s hard not to wonder if having two women co-founders was a part of that.

@cstross @jim Or at least monopolies? Once a communication medium has more than a certain share (5%, 2%, 1%?) of a market it needs to be treated like telephone lines or radio spectrum – i.e. a natural monopoly – and broken up around common infrastructure.

Charities have regulatory requirements that make me no more confident about them running giant things well than I am about co-ops doing it – though they're both good things. A charity or co-op with FOSS in its veins, maybe.

But instead the law could require that any digital enterprise beyond a certain size must have full data and identity portability, and interoperability on common open standards, or face being broken up?

@cstross

Hate to break the bad news to you 😜, but profitability = legitimacy is not just a Silicon Valley thing; it's deeply embedded in the so-called neoclassical economic doctrine, going back to Milton Friedman etc.

The idea is that business focuses on financial profits and politics/legislation sets the ethical/legal boundaries.

What they "failed" to account for is that corporate profits can easily buy politicians.

Total corruption follows, and the state of digital tech is proof 😟

@jim

@openrisk You've forgotten those businesses that governments dislike—illegal drugs, child pornography, human trafficking ... all highly profitable! @jim
@cstross @openrisk @jim Does the government really dislike the last two?
@lispi314 @openrisk @jim The government is not a person, so it can neither like nor dislike anything: the government is a swarm of loosely interconnected policies, some of them working antagonistically, driven by individuals and other hives (notably media outlets) with agendas.

@cstross

The list includes digital gambling (aka "prediction markets").

The idea of "markets" (=speculators) determining the likelihood (thus price and value) of everything goes also deep into the neoclassical mindset.

At its base it's a dehumanizing mindset that aims, as much as possible, to ignore or bypass "annoying" moral questions.

@jim

@jim "This is obviously some strange new meaning of the word 'safe' of which I was not previously aware." Arthur Dent. #thhgttg