Trying to fix a broken system by tinkering won't achieve online safety.

The issue? The Internet is centralised in the hands of a few tech behemoths.

Simply adding new rules while harms evolve around them is a losing battle.

We must build something better.

Read our latest blog ⬇️

https://www.openrightsgroup.org/blog/online-safety-needs-structural-change-not-more-layers-of-control/

#onlinesafety #onlinesafetyact #bigtech #ageverification #privacy #ukpolitics #ukpol

Online safety needs structural change, not more layers of control

Concerns about online safety expose a deeper problem about who controls the digital world children grow up in.

Open Rights Group

Lop off one head, two more appear.

This is because online harms are an emergent property of the domination of Big Tech.

Attention capture revs the engine of targeted advertising that powers the beast.

New rules, from age gating to content scanning, are directed at the user, not the business model.

#onlinesafety #onlinesafetyact #bigtech #ageverification #privacy #ukpolitics #ukpol

Design interventions alone can't significantly improve safety if the market rewards behaviours and outcomes that generate harm.

Large companies can absorb the cost of new rules. Smaller services often can't.

This reduces competition and locks people into dominant platforms that feed off their users.

#onlinesafety #onlinesafetyact #bigtech #ageverification #privacy #ukpolitics #ukpol

Age verification forces people to create profiles on platforms.

For porn sites, this means tying specific users to algorithmic profiles, enabling customised porn feeds designed to keep them on the platform for longer.

It also exposes people to greater risk from data breaches and advertising networks.

#onlinesafety #onlinesafetyact #bigtech #ageverification #privacy #ukpolitics #ukpol

Proposals to expand age checks to VPNs and features like infinite scrolling for UK users make things worse.

The result is more personal data swilling around inside tech firms and a permanent expansion of tracking infrastructure.

For VPNs, we're told to discard privacy for supposed 'safety'.

It makes no sense.

#onlinesafety #onlinesafetyact #bigtech #ageverification #privacy #ukpolitics #ukpol #vpn

The UK government is fattening the power of Big Tech, giving these companies greater control over our data.

With Palantir co-founder Peter Thiel backing age verification, we can't ignore how surveillance tech moves between military, intelligence and commercial markets.

Read more ⬇️

https://www.openrightsgroup.org/press-releases/roblox-reddit-and-discord-users-compelled-to-use-biometric-id-system-backed-by-palantir-co-founder-peter-thiel/

#onlinesafety #onlinesafetyact #bigtech #ageverification #privacy #ukpolitics #ukpol #palantir

Roblox, Reddit and Discord users compelled to use biometric ID system backed by Palantir co-founder Peter Thiel

Following global pushes for age assurance by governments, Persona is increasingly being used by major online platforms to carry out biometric age checks.

Open Rights Group

The approach to online safety is flawed.

We need a rights-based approach that challenges the dominance of Big Tech, not one that embeds permanent monitoring by requiring ID to access information and services.

Tell your MP to fix the UK Online Safety Act ⬇️

https://action.openrightsgroup.org/tell-your-mp-online-safety-act-isn%E2%80%99t-working

#onlinesafety #onlinesafetyact #bigtech #ageverification #privacy #ukpolitics #ukpol

Tell your MP: The Online Safety Act isn’t working

What's the problem? The Online Safety Act has been a disaster. Rather than protecting children, millions of adults are facing widespread censorship, and teenagers are having their freedom of expression restricted. Here are some of the key problems with the law: Wrongful censorship: the law places huge liabilities and threats of jail on platforms if they don't censor the right content. It does little to protect freedom of expression. The results are in. Footage of protest has been censored on X. Subreddits about stopping smoking, sexual health, and the news have been age-gated and shadow-banned.

Open Rights Group

@openrightsgroup

Good diagnosis, wrong answer.

IMO, the solution is to break the tech oligopoly - while ALSO having real regulation that holds service providers responsible.

Sure, there are things to improve in OSA. But the only ones who benefit from a total change of strategy are the techbros.

That's why we have both antitrust AND product safety laws. It's absurd to argue only antitrust is needed ...in any industry.

@TCatInReality @openrightsgroup For sure there are things that can be done. We are not saying that there should be nothing. Rather, we are saying that without action to align platforms with users, the pressure falls in effect solely on users, which cannot work.

@jim @openrightsgroup

Not sure if you wrote the ORG post, but it seems to me to blame the OSA.

E.g.
- the approach is "flawed"
- we need "not one that embeds...monitoring".

If the position is that society needs both, it should say that.

@TCatInReality @openrightsgroup This explains what we think in more detail. The OSA is tbh quite a mess and not delivering. The Digital Services Act is more proportionate and reflects free expression concerns better for example.

https://www.openrightsgroup.org/publications/how-to-fix-the-online-safety-act-a-rights-first-approach/

How to Fix the Online Safety Act: A Rights First Approach

In this report, we analyse the Online Safety Act (OSA or ‘the Act’) 2023, which imposes new duties on online service providers to protect children from harmful content, and Ofcom’s guidance on compliance with these duties.

Open Rights Group

@TCatInReality @openrightsgroup Overall, if we want online content regulation that works, we have to think in terms of open rather than closed systems. The OSA is all about traditional safety approaches (block this, do that) rather than recognising that we are in an open, unpredictable (human) environment, with an industry whose attention-driven business model pushes it to work against these requirements.

@jim @openrightsgroup

Thanks for sharing.

I read this document last year - and broadly agree. "Open" is necessary to prevent the concentration of power.

However, I think this still misses the key element of product safety and liability.

In every other industry, manufacturers/providers have a legal requirement to provide a safe product. *That* is also an essential element of any long-term fix in the online world.

@TCatInReality @jim @openrightsgroup
Well put. This is why I have two proposals I think are key:
First, if you promote material online to people who have not signed up to receive it (whether posts or adverts), you should be held to the same standards as any other publisher.
Second, we need a new law criminalising “deliberately or recklessly misleading the public”. Like libel, there would be few prosecutions, but it would make people more careful about what they publish.
#politics

@KimSJ @jim @openrightsgroup

I like both.

Of course the "misleading" legal basis would be challenging. But I think there is a way to expand "fraud" laws to cover the majority of your point.

I would add more items:
- product liability
- criminal penalties (not only monetary fines)
- algorithm transparency - and greater user controls
- default opt-outs
- much, much stricter data privacy and limits on corporate use of data.

@KimSJ @jim @openrightsgroup

And I think the fundamental concept behind #OSA is sound - that if you provide a platform, you need to decide whether it is all-ages or adult content - and you need to enforce that.

It is absurd, and impossible, to expect users to stay safe without knowing, upfront, what kind of space they're clicking into.

The big flaw is allowing the same greedy techbros who created this mess to create the age verification tools.
#ImproveOSADontEndIt

@TCatInReality @jim @openrightsgroup
I agree with your ambition, but I’m not sure it’s technically possible. I suspect it would be better to expend effort on providing the tools and associated training to help parents police their children’s Internet use (though that would also hand domestic abusers tools to further abuse spouses and the like).
There is no easy answer, but one thing is certain: governments who choose politically attractive quick fixes are doomed to make things worse!