@erincandescent "only ever going to be filled in on the machines of children administered by parents who want such restrictions enforced"
You say this as if it's not a huge problem in itself. We should not be building or shipping tools for abusive parents to use to surveil or control their children.
@erincandescent That doesn't justify being party to it and essentially forcing distros to ship an abuse mechanism unless they actively patch it out (thereby having to make a highly charged political statement).
Yes a determined parent with technological know-how can always find a way to put such malware onto their child's machine. We should not be making it an out-of-the-box feature of "Linux".
RE: https://social.treehouse.systems/@mgorny/116274748222570834
@erincandescent Combined with other things, yes. See for example:
@erincandescent Right now, there is no standard place for a DOB field to be stored or for applications to know how to access that information or use it to enforce rules blocking access to information.
By creating standard places to store it and standard APIs to access it, you setup the infrastructure needed for these abuses to be something available out-of-the-box rather than requiring a ton of custom hackery by the abuser to setup.
@dalias I'm not sure what your actual argument is here.
Is it
@erincandescent @dalias that's fine, but don't make it easier for evangelicals to murder people along the way.
btw, when you say CNC I assume you don't mean computer numerical control, unless you want to block access to all of the precision machining content on YouTube.
@erincandescent @emma Nobody "accidentally wanders into pornography". That's an excuse.
If you're really worried about this, you mandate that porn sites have a splash page that says "this is a porn site. what you're about to see is sexually explicit. do you want to continue?"
Children who are not actually looking for porn are going to hit the back button stat.

@erincandescent None of the above.
My position is not that there is nothing harmful on the internet, but that for both fundamental reasons and reasons of political capture by people who wish harm to any children who are not straight cis neurotypical, any attempt to gate access to information will both block critically important non-harmful things and fail to block the most harmful things.
I could go into my views on how parents should deal with these truths, but I don't believe that "how else are we supposed to PrOtEcT tHe ChIlDrEn??????" is relevant.
Protecting children is not on the table here.
Doing harm to children and harm to people who need to be anonymous are what's on the table.
@robryk @dalias One of the things I see being discussed on the xdg-desktop-portal pull request is that the Age Verification service will provide the minimum viable API required to comply with the California/Coloradan/etc law, and that an entirely separate API would allow queries around specific content descriptors which would allow a much more capable decision system.
(In fact, you could imagine such an API providing a way for you to configure for yourself that you don't want to see certain things, or that they should be hidden by default.)
@robryk @erincandescent No, the direction we've always had is that *nobody has control* except someone who's hovering over them.
Government - pushed by industry, who wants to shed legal liability for the harms they are encouraging and amplifying on their platforms - is attempting to force us to participate in building a system of parental controls that's always there.
On top of that being bad enough in itself, it's a requirement they could change into "governmental controls" whenever they like.
@erincandescent @robryk Seriously? You think this agenda just popped up worldwide all of a sudden without someone funding it all?
The receipts purporting to pin it on Facebook haven't been verified yet, but I thought it was widely understood that they're doing this to avoid blanket bans, hoping instead to herd underage users onto reduced-harm versions of their platforms while keeping all the maximal-harm stuff in place for adult users.
@dalias @robryk This popped up all of a sudden? It's been building slowly for years now. It's a long-term political trend if you read the news.
I don't think social media bans will actually do anything useful. I also think that it's hard to think of any regulation that would mitigate the real harms here and not just open a different can of worms.
I mean, except for maybe legislating the abolition of Meta Platforms Inc and all associated companies.
@erincandescent @dalias @robryk It's likely less of "we really care about age info" and more "we care about liability and we want biometric data as it's just generally useful"
combined with
"the us government in this particular instance isn't supposed to collect this data and share it widely between departments /but it can buy it from third parties just fine/" kind of silliness
(disclaimer: I work at FB. I have no idea of what's actually going on internally here.)
@dalias @erincandescent The thing I am worried about is the first bit of software that tries to use that API, even when I am located outside the jurisdictions demanding age restrictions. And it does not really matter whether that software is a web browser using it for fingerprinting, a media player verifying that I am not playing an R-rated movie, or Steam collecting statistics.
I don't believe or trust that this stays opt-in: that if I don't provide anything, (1) nothing will complain and (2) nothing will use even that negative information against my will. I don't believe in adding an API that is supposed to go unused based on where I am geographically located.
(And I don't trust an API that emerged as a result of this geopolitical climate.)
I don't want such an API existing on my device, even if "the API itself is harmless" and the harm comes "just" from the applications that utilise it.
@ledoian @dalias it's Linux, you're root, you can just change the code to simply lie or do whatever you want; you have that capability.
Unless someone legislates that you can no longer actually control your own computer (and yes, people are trying to do that), or systems are legislated to collect some kind of proof, but that's a completely different legislative problem.
@loptimist @dalias this is broadly how the portal works; you ask about a range and the API returns bounds on the user's age based upon applicable laws (where the bound can be "unknown"/"not applicable", and probably should be outside of California/Colorado).
Yes, maybe "Can the user see a film rated (mpaa:r, usk:15+violence, bbfc:15+violence, …)?" would be a better API, and this is being discussed, but unfortunately that is not the API California and Colorado are mandating operating systems to supply (the mandated API must return to an application that the user falls into one of a handful of enumerated age ranges, and no more).
What they are aiming for with the age verification portal is precise minimal compliance with the law in a way that can trivially support similar laws elsewhere that e.g. differ in enumerated age ranges.
Where is that API located? On the OS, or within apps or websites?
If the API is provided by the OS, it means that third-party software would be able to collect age ranges, which is unacceptable in my opinion.
That's why I suggested that applications and content providers should provide a way for the OS to retrieve age ratings, and then let parents make decisions using parental control tools.
If that's not compliant with Californian law, then perhaps the law is flawed, as it does not adequately guarantee privacy.