The goal is to make corporate data less profitable.
Even stuff as simple as setting your birthdate to 1970-01-01 everywhere, or adding [TEST] or [DELETED] as your name or in account notes anywhere they don't need to know your real name.
Using plugins like AdNauseam to poison ad trackers (and cost them marketing dollars).
Using VPNs set to different locations.
Signing into data broker sites to "correct" outdated info (they'll often let you do that with little-to-no proof of identity, but will require your passport or state ID to delete your info). Bonus points if you "correct" it to someone else's similar info.
When you sign up for anything, fill in only the required fields, and provide correct info only if it matters for you to use the service; otherwise, provide plausible but incorrect data.
If you use LLMs anywhere, use the free tier and always vote thumbs up for bad answers and down for good ones. It wastes their resources and drives up their costs while making their training data worse.
@alice scribbling notes furiously
For the less savvy among us, tysvm for this helpful advice
@Irenetherogue sure! There are low-tech ways to do it: just lie...to every corporation, app, and marketer you can. Make it plausible, but wrong.
Bonus: include something wildly implausible once in a while. It makes folx more likely to overlook the subtle ones.
@alice @Irenetherogue I got off when taken to court for nonpayment of Poll Tax (Thatcher thing, yes, I'm that old) because I poisoned their data by missing out a crucial box on the form.
Don't refuse to comply but *always* sabotage their data. It simply costs them more.
Haha, you'd like my mother, the guerilla witch. She gets customer cards in every shop and then swaps them with other people; bonus points if the two have strongly different consumer profiles.
When she's bored, she maliciously answers questionnaires from evil corporations.
She studied psychology and statistics and says "it's horribly difficult to get useful answers out of these marketing datasets anyway, so why not make it a bit harder for them?"
@earthworm TIL I have a second kid.
My education is in psychology and statistics, and I do shit like that whenever I can.
@alice @theorangetheme I once built a fuzz testing tool that "randomly" shuffled input around and tested it against things. "does my input validation survive utterly batshit inputs?"
Feeding the inputs through something like that would make sure they can't cache answers.
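A rough sketch of that kind of shuffler in Python (the names and the mutation strategy are my own assumptions, not the original tool):

```python
import random


def mutate(s: str, rng: random.Random) -> str:
    """Return a randomly shuffled variant of the input string."""
    chars = list(s)
    rng.shuffle(chars)
    # Occasionally splice in an "utterly batshit" character.
    if rng.random() < 0.3:
        pos = rng.randrange(len(chars) + 1)
        chars[pos:pos] = [chr(rng.randrange(0x20, 0x2FFF))]
    return "".join(chars)


def fuzz(validator, seed_input: str, rounds: int = 1000, seed: int = 0) -> bool:
    """Feed shuffled variants to a validator; True if it never crashes."""
    rng = random.Random(seed)
    for _ in range(rounds):
        try:
            validator(mutate(seed_input, rng))
        except Exception:
            return False
    return True
```

A validator that assumes well-formed input (say, one that parses the string as a number) fails immediately, while one that handles arbitrary text survives every round.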
Distant memories of hooking together two ELIZA instances...
'and then?'
@nickynah them's just good manners! I find it's also helpful to upload my favorite random cat photos until I hit the attachment limit. I mean, who doesn't like cat photos?
Philosophy!
@alice If you're selfhosting, have a look at iocaine: https://iocaine.madhouse-project.org/
If you upload pictures, maybe nightshade would be the right tool: https://nightshade.cs.uchicago.edu/userguide.html
@Numerfolt @alice yeah, we need to switch to offensive mode.
That makes me want to create a nightshade fuse FS.
So when you want to upload the image from your picture folder, it nightshades it on the fly.
@alice when I have to use a web app to order food, e.g. CoolBurgz (fictional), I will always put my email as e.g.
usually counts as valid.
a fair bit of the advice in here seems really good, but from what I know, AdNauseam isn't really worth using over just uBO
at least as of when I last looked into it a couple of years ago: it uses more resources on your machine, doesn't make any significant difference to the companies, and the high volume of "clicks" from you just makes you far more trackable, since no normal person browsing would click like that
also, I think it might be worth editing the last point to say "hopefully none of you are using LLMs, but if you're someone who does..."
@vantiss I have to disagree.
Say I go to Amazon. I use perfect tracking protection and I'm not signed in. I browse for a while, and click every ad they serve me. I've wasted a bunch of different companies' marketing money, my click data is worthless, and they don't know what ads to send me. I look like every other AdNauseam user, and they still don't know who I am.
Now say I do the exact same thing, but I sign into Amazon. The exact same thing happens, but they know who I am.
...
And as far as LLMs go, waste their fucking money and resources. Use every free option you can, and take it as an opportunity to poison their feedback. Don't give them any personal info, don't use them for critical questions, just flood them with garbage that pops this bubble even faster.
Even if you don't want LLMs in everything, companies will put them there unless it burns their wallets. The more we set fire to their AIs, the faster executives will learn it's a bad idea to use them.
@rabidchaos @flesh @alice @aj
If it is treating that null as a proper null there's a good chance there's constraints in place that'll fail and the app won't even check the failure...
Which can be fun, or not, depending on if it counts you as logged in after you submit the form or not
We should tax corporations by the gigabyte of storage they own.
It doesn't matter what they use it for, it should have a tangible yearly cost, to make them think about how much they store.
Wrt #PII, it might be a good idea to avoid entering data that's easily identifiable as trash, and to use generators instead.
@penguinrebellion that's why I said plausible, but fake.
Generators are good though.
There are, however, reasons to enter something wildly off every so often, like "[email protected]", because it tells companies that field is obviously fake. This both makes the plausible fakes more likely to slip by if they do use your data, and makes them more likely to discard your data for marketing and analytics purposes in general.
It should be noted that there will be something similar to the Year 2000 Problem in 2038: the common way to represent time, seconds since 1970-01-01 00:00 UTC stored as a signed 32-bit number, will wrap around and make computers think they're in the past.
Hopefully(?) we learned from Y2K and are preparing for that event already.
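The wraparound moment is easy to compute with a small stdlib sketch:

```python
import datetime

# A signed 32-bit time_t tops out at 2**31 - 1 seconds after the epoch.
MAX_INT32 = 2**31 - 1


def wraparound_moment():
    """Return the UTC datetime when a signed 32-bit time_t overflows."""
    epoch = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)
    return epoch + datetime.timedelta(seconds=MAX_INT32)


def as_int32(seconds: int) -> int:
    """Simulate storing a Unix timestamp in a signed 32-bit field."""
    return (seconds + 2**31) % 2**32 - 2**31
```

One second past `wraparound_moment()` (2038-01-19 03:14:07 UTC), the stored value flips negative, landing the clock back in December 1901.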