Dear politicians,

Our wish for #2026: Stop using 'child safety' as an excuse for mass censorship and surveillance.

No one believes you.

And using children is just... sickening.

@Tutanota Stop allowing children under 16 or even 18 to get access to violent, abusive or pornographic content. That is why they use the words "child safety". I was an adult when I saw my first terrorist throat-cutting video, and those screams are still haunting me. I was almost about to faint; I was having a mild panic attack. You want children to get access to that? How about animal torture? Why do you want kids to have access to sick content? Why does the word Netiquette mean nothing to some anymore...
@catnipmedia maybe be a parent to your damn kids then, and not have a bunch of homophobic misogynist pedos decide what is and isn't "safe."
@thecyberwitch Oh, a personal attack, how nice! I can see, based on your enlightened attack, that you know all about children. So, dare I say you want all that content to be freely available? Are you a fan or creator of animal abuse videos perhaps?
@catnipmedia @thecyberwitch You're a fan of kids being used as sex slaves?? Being stolen and sold? You're not arguing against it.
If you actually wanted to protect children you wouldn't elect officials that like to fuck kids and you wouldn't support laws that do nothing to protect children. They actually make it harder to keep kids safe.
I can do false equivalences too!
@catnipmedia animal abuse videos are already illegal in most jurisdictions and banned on most platforms. "Freely available" sounds a bit like a self-report, sorry, maybe try rephrasing that.

@thecyberwitch @catnipmedia

Pitting individuals against the massive power of corporate bodies is futile. That's one reason we form governments.

Not all laws or regulations are bad.

We need functioning democratic processes to make laws that balance competing interests - and transparent governance to enforce them

Reflexively saying all laws are bad if the rationale includes ABC is nonsense. Just like calling for total non-regulation of any industry is extremely naive. We all know "power corrupts"

@TCatInReality trust in both corporate bodies and governments is at an all-time low since they're increasingly on the same side. What's being regulated are our movements and access to legitimate resources under a growing fascist threat. As I mentioned elsewhere in this thread, certain abusive material is illegal and most platforms ban it as a result, which is a much more sane level of regulation.

@thecyberwitch

Sure, but there is no industry that effectively self-regulates. Even the Catholic Church (famously).

If we should fight for anything, it's not to reflexively attack attempts at regulation. It should be to demand a more effective democratic process.

Better to break/weaken the government-corporate dependency than leave it alone and expect it to behave

@TCatInReality until we reach that democratic ideal, we need to recognise which "regulations" are beneficial and which ones aren't. Tracking the online and irl activities of every citizen via digital ID and Flock, "predictive policing" and censoring LGBTQ or political info = bad regulation. OSHA, FAA, banning CSAM etc = good regulation. Fighting the bad ones isn't "reflexive attacking."

@thecyberwitch

Sure. Those are fine examples - and I agree on those specifics.

But this thread started out with a broadside on the "child safety" rationale, and I joined in when it deteriorated into the "parents should manage their kids' safety" argument.

Those are the reflexive attacks I was referring to.

BTW, I appreciate your civil and well-thought-out responses.

@TCatInReality I still stand by that statement by and large, but it isn't as extreme (or reflexive) as it might have initially come across. Yes, reasonable laws should and DO exist to protect the vulnerable, but responsibility for children does primarily rest on parents, again by law. Neglectful parenting used as an excuse for mass surveillance and censorship is where I start to get a little snarky.

@thecyberwitch

Certainly, I oppose mass surveillance and (most) censorship.

And certainly, parents should parent

But I believe you misunderstand the world we live in if you think parents bear the bulk of responsibility for child safety. In our world, all governments have mountains of product safety laws for manufacturers. Then mountains more for safe usage and distribution.

It is in THIS world, already layered with safety, that we expect parents to protect their children.

1/2

@thecyberwitch

IMO, the same principles we apply to food, medicine, toys, buildings, etc should apply to online safety (for everyone, not just children).

And if you read OSA, that's what it does...and not only for "kids"

I do agree that "for the kids" is an abused rationale and often not true.

But techbros use pseudo-libertarian arguments and simplistic "for the kids is a lie" logic to block any regulation. It is THAT which I am objecting to.

We should improve OSA, not reverse it

/End

@TCatInReality which measures of the OSA as written do you support vs oppose? Do you support ID and face verification, and if so, are you ok with governments and third parties holding onto that info? With the number of data breaches that happen on top of that? Are you comfortable with a government increasingly hostile to trans people deciding what is "age-gated?" Where are existing laws failing, and why? My experience says enforcement. Many reports go untouched. OSA is no fix for that.

@thecyberwitch

Good questions and I will assume they are asked in good faith

1) I support the basic structure of OSA - That online providers need to do an audit; that they need to decide if they will host content that needs age controls or not; that they have a legal responsibility to have safety controls to enforce that decision; and that there is some type of age-gating

2) I think that the age gating process is woefully ill-defined and unregulated. Thankfully, it is not centralised.

1/n

@thecyberwitch

3) As written, the age gating is ripe for abuse and errors. This is the essential improvement needed (IMO)

4) IMO, hacking is a total red herring. Every bit of technology is at risk of hacking and we've seen many, many hacks of deeply sensitive data. No one says we should stop doing things online - rather, we continually improve security and fight to decentralise data (IMO, greater data privacy rights are another essential improvement, and not just in OSA)

2/n

@thecyberwitch

5) ID and face verification are fine options for those willing, but we need more options. If you read the OSA, those two were only some of the options. It is the unregulated age-gating techbros choosing these most invasive approaches to offer - not the gov

6) Yes, I am troubled by what an LGBTQ-"hostile" gov (or any human-rights-denying bias) could do with the law. But that's true of all laws. I'm not prepared to give techbros free rein. We need better democratic processes

3/n

@thecyberwitch

Now, I have tried to answer your questions earnestly.

Here's my main questions:
A) Do you think we have enough evidence of real world harms from techbros to start regulating the online world, or should we continue with total immunity and self-regulation?

B) If we need regulation, and given this is the process that we have, do we make the best of it and continually refine (like all laws) or do we wait (if so, for what)?

To me, this is a simple product safety issue.

/End

@TCatInReality social was a bad choice, the character limit is killing me.

1. Refining laws retroactively after rushing them through is a bad idea in general. Bills are what should be refined. That's what they're for. Bad laws should be repealed AND replaced by better ones.

2. Regulation has the wrong target. We don't cut out the services entirely, but we minimize the data available to breach to essentials only. Zero-trust should be embraced.

@TCatInReality 2 contd. Cut the predatory AI and algorithms used on kids and adults. Engagement-based metrics are poisoning society. Enforce mandatory reporting when individuals and platforms are made aware (the biggest problem in cases of online abuse in communities, and law enforcement has shrugged these cases off too much).

When we have an age verification method that doesn't make the techbros an agent for governments to circumvent constitutional privacy, I will be all for it.

@thecyberwitch

I can certainly support regulation to force greater transparency and control over AI and algorithms, counterbalance engagement metrics, and enforce reporting and safety moderation. 👍

And I'd love for other age verification options to be available. It's certainly possible - if there were better motivation.

@TCatInReality Also, "total immunity and self-regulation" hasn't existed since the mid-2000s for the surface web anyway, and I find that to be a massive mischaracterization of our current tech landscape.

@thecyberwitch

Fair enough, perhaps it is the US and non-European world that is "total immunity and self-regulation".

Europe does have "slight regulation and worthless enforcement mechanisms".

Can we agree on that?

@TCatInReality there's more nuance to that. The US has federal regulation that is largely unenforced unless it protects major copyright holders. It also varies by state, with California generally being the strictest, which is where most tech companies are headquartered.

EU should hold tight to GDPR (which is at risk of being gutted) and reject Chat Control, which is a tech-illiterate solution. Otherwise I agree that Europe in general currently manages rights and safety better.

@TCatInReality Calling it a "techbro pseudo-libertarian argument" in the face of these legitimate consequences, when we often see law enforcement fail to follow through on their laws protecting children, rings a little reductive to me. While laws protect food safety, parents can easily neglect their child's nutrition with junk food, and poor cooking can lead to food poisoning, so the responsibility ends with them. Existing laws support safety; it's the failure to enforce them that's the problem.