Wait what, you can't use a *code* editor when you're under 18 now? 🤔
@marekfort @iamkonstantin Wait until you learn what your code editor can do.
@marekfort Children's rights to access information and skills are reduced to being a slave with a QR code printed on their forehead.
@marekfort yup same shit as when Audacity went corporate

@marekfort such ageist terms are often an attempt by corpos to cover their asses against “don’t harvest minors’ personal data” regulation

safe to assume they definitely plan on harvesting and exploiting users’ personal data

@marekfort “you own your AI-generated output” wheeze
@marekfort what's next!? Age verification!?

@marekfort If I can’t teach my kids to code with your editor, maybe I should not be using your editor at all?

So many questions…

@bomberstudios @marekfort you should try and teach with punch cards: never seen any TOS for them
@nepenthes @marekfort Sounds like a fun form of rebellion, to be honest

@marekfort Gram has pretty much landed at the exact right time.

https://social.nouveau.community/@andnull/116161245796027671

... and and and and ... (@[email protected])

Someone has made an AI free fork of Zed it seems: https://gram.liten.app/

Nouveau
@nerdd @marekfort I was (ignorantly) happy with Zed until I saw Gram yesterday and read their post on why the Zed ToS suck
@evilpilaf @marekfort I was aware of Zed pushing AI-features *hard*, but I held my breath.
@marekfort When you can't trust your own ai
@marekfort when you see minimum ages (18 or 16/13) it means that they want to market to you and will probably share your data with third parties.

@bartvdo @marekfort they say they don't sell your data...

Now, nobody said anything about giving it away for free!

@marekfort I generally like Zed but with all the AI stuff (which I disabled) and this now, I am not sure anymore.
@cameo007 @marekfort
Same. It's becoming a bit of a Frankenstein.

@marekfort

literal stepping stone to child slavery

@marekfort they may write code with a bad word in it.
We can't have that! Think of the children!
@marekfort are we protecting the kids yet?

@Someplaceunknown @marekfort

We're protecting them kids so goddamn hard, they'll never learn anything that could possibly help them in the future.

We're still letting them get raped by rich assholes, though. Don't get too excited.

@marekfort VC-funded company doing VC-funded company things. Last time I checked, Vim, Emacs, Kate etc... didn't ask for my age. Almost as if it's better to not follow the hype, who would have thought?
@lu_leipzig @marekfort Thanks for the reminder that Kate exists! I was selecting a code editor for a friend who just started coding and didn't want to recommend corporate-backed editors. Kate looks like a great editor for this situation.
we're doomed
@marekfort Because you might learn to code? God forbid! 💀

@jonossaseuraava @marekfort

/sarcasm
To be fair, C++ IS a danger to any mind. (just working on a project in C++ 😉)

@jonossaseuraava @marekfort Mommy, where do use after free bugs come from?
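For anyone who hasn't had the pleasure: the bug being joked about is reading memory after its owner has released it. A minimal C++ sketch (the function name is mine, not from the thread) of the dangling-pointer hazard, using `std::weak_ptr` to *detect* the freed object instead of dereferencing a dangling raw pointer:

```cpp
#include <memory>

// Returns true if the weak reference detects that its target was freed.
bool detects_use_after_free() {
    auto obj = std::make_shared<int>(42);
    std::weak_ptr<int> ref = obj;   // non-owning reference to the object

    obj.reset();                    // last owner gone: memory is released

    // A raw pointer taken earlier would now dangle; dereferencing it is
    // the classic use-after-free. weak_ptr lets us check expiry safely:
    return ref.expired() && ref.lock() == nullptr;
}
```

In real code the dangling read is undefined behavior, which is why tools like AddressSanitizer exist to catch it at runtime.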
@marekfort Honestly, I wouldn't be opposed to making everything that contains an LLM 18+... or 180+ for that matter, as long as shoving that stuff into absolutely everything becomes less and less attractive.
@dfyx @marekfort yeah and it's probably only a matter of time til the IDE starts doing kinky roleplay with users

@marekfort

Excuse me, WTF???

@ParadeGrotesque @marekfort taking bets it's because of some new “AI integration” feature

@marekfort @Gargron I think it’s probably more that you can’t enter into a legal agreement like new ToS unless you’re 18 (in the US, at least…)

Lawyers ruin everything.

@jimmylittle @marekfort @Gargron ...why do you need a Terms of Service agreement for *A TEXT EDITOR*

@techokami @marekfort @Gargron I repeat…

Lawyers ruin everything. 🫠

@jimmylittle @marekfort @Gargron take a step back and reassess the question.
*Why* does *A TEXT EDITOR* require a ToS? What is the text editor *DOING* that makes this something that is even needed in the first place?

I looked into it, and apparently it's because this editor is filled to the brim with AI generative bullshit.

Why does such a thing need to exist?

@techokami @jimmylittle @marekfort And telemetry. Why does a text editor need telemetry? *Excellent question!*

@JeremiahFieldhaven @techokami @marekfort Telemetry is in basically every piece of software. Is there a "Report Bug" button? Probably sends telemetry data. Crash reports? Telemetry. Product research on which features are used the most/least? Telemetry.

It can be used for nefarious purposes, tracking, and data theft. But it's also used for very important software quality metrics.

@jimmylittle @JeremiahFieldhaven @techokami @marekfort That is an extremely unserious take. No these are not "very important software quality metrics". Nobody needs metrics on that. It's explicitly invading people's privacy to avoid having to pay testers/QA or having to know how to ask questions and get feedback to be able to reproduce and debug an issue.
@dalias @JeremiahFieldhaven @techokami @marekfort Some of the products I work on have tens of millions of users. There’s no “asking questions to reproduce” at that scale, especially when we’re forbidden by COPPA from collecting any personal information. Anonymized telemetry, metrics, and crash logs make our products better.
@jimmylittle @JeremiahFieldhaven @techokami @marekfort You don't ask questions at scale. You let users report bugs, and then you engage with them to fix the bug, or hand off figuring out how to reproduce the bugs reported to QA/testing staff. You don't rummage through users' computers and their private data that happened to be in the core dumps or whatever without their knowledge or consent.

@dalias @JeremiahFieldhaven @techokami @marekfort Do you know what stopped thousands of bug reports (some legit, some just user error or confusion) coming in from tens of millions of users?

Fixing the bugs preemptively because it was reported to us by the software before the user noticed.

@jimmylittle @JeremiahFieldhaven @techokami @marekfort Yes, invading user privacy as a cost-cutting shortcut is how capitalism misbehaves.

@dalias @JeremiahFieldhaven @techokami @marekfort It’s not about cost cutting, it’s about providing better products.

Most people don’t give a shit about working with a dev to fix a bug or improve a product. They want their thing to work without thinking about it.

And some of us do it all without even tracking what city a user is in, let alone who they are individually.

Not all telemetry is bad.

@jimmylittle @JeremiahFieldhaven @techokami @marekfort Products are obviously, demonstrably worse than back when software shipped on disks and you had to get it right before shipping because there were no second chances to just push an update. Because back then, you actually had serious QA/testing.

I'm not suggesting you should make users do back-and-forth helping you fix bugs as the alternative to spying on them. I'm saying you should spend the money on QA teams rather than spying on users as a shortcut.

All telemetry is bad.

@techokami @marekfort @Gargron Every commercial piece of software has terms.

Even open source. That MIT license? It’s the legal terms that govern your use of the software.

Especially considering software that is hooked to an LLM is no longer just *A TEXT EDITOR*. It's connected to the internet. It can send your data elsewhere. It can receive data. All of that stuff requires legal approval.

I totally respect if someone doesn't want AI all up in their business. If that's you, don't use a text editor with AI features. We all have options.

@jimmylittle @techokami @marekfort @Gargron Um, no. No open source software has *terms*. It has one-way *grants* of rights to do things that the law might otherwise forbid you from doing (like copying and making derivative works), possibly subject to conditions. But you never have to "agree" to any "terms" in order to use it.

@dalias @techokami @marekfort @Gargron Sure you do. MIT, GPL, Apache are all open source agreements you tacitly “accept” by using it. The permission is granted “provided the user” follows certain guidelines.

Sure, there’s usually no checkbox to use open source software, but accepting the grant of license is usually legally regarded as accepting the restrictions (or _terms_) of the agreement.

It’s a distinction without a difference.

@jimmylittle @techokami @marekfort @Gargron No. GPL, in its ideological purity, is VERY VERY EXPLICIT (in v2 paragraph 5; in v3 paragraph 9) that you are not required to accept it. Anyone exercising their rights under the GPL has an obligation to ensure that recipients know they are not obligated to accept any "license agreement" or other terms as a condition for receiving or using the software.

Other licenses do not make a point of this because it's obvious. If you legally received a copy of the software, you own that and have every right to use it. The purpose of the license is only to tell you that, in addition to the right you have to use it, you can freely copy it for others, prepare derivative works, etc.

@dalias @techokami @marekfort @Gargron GPL is certainly the most permissive that I’m aware of (and I’m no expert, for sure), including passing the license on to derivative works.

But, let’s say you take GPL-licensed code and make something that you put a different license on – that is a violation of your GPL “grant”, right?

You can break the “terms” of open-source by the simple act of building closed software with it.

@jimmylittle @techokami @marekfort @Gargron That's not use. It's copying and preparing derivative works and it's illegal because copyright law says so. Not because of some terms you "agreed" to.
@marekfort Those kids have to use neovim and become gay... it's too late now...

@marekfort

Those under 18 years of age have limited legal capacity and require the consent of their legal guardian(s).

@marekfort @gereon That’s the stupidest thing I’ve seen all day, and I did check the news.