@briankrebs I am also really curious how many people have aggressively violated various privacy laws by feeding stuff into LLMs for "summary" and "analysis".
Frankly it should be a much larger compliance nightmare than it is. (Or, I suppose, it *is* a ginormous compliance nightmare and right now everyone just thinks it isn't. Incorrectly)
@wordshaper @briankrebs Unfortunately, I don't think the people doing this care, or will ever care. Privacy laws tend to be a joke anyway, and there is very little incentive for most people/companies to change. I don't think most governments even want that to change. It's better for them, allows more data collection, etc.
I wish I didn't have such a negative and cynical outlook on it all.
@briankrebs In several pen tests I've done over the last 18 months, one of the most interesting trends has been the sudden increase in the number of examples I've found of people who have thrown AI API keys, and in some cases raw data, into accidentally public GitHub repos while attempting to glue AI to things to 'see what it can do'.
A few weeks ago I found a GitHub repo where a developer had trained on a dump of their own corporate emails, and all those emails were just sitting there in public, on GitHub, and contained lots of things like vendor SFTP creds. It's a free-for-all.
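For the curious: the kind of leaks described above are easy to catch with even a crude scanner. Here's a minimal sketch of pattern-based secret detection; the rule names and regexes are my own illustrative examples (real tools like trufflehog or gitleaks use far larger rule sets plus entropy heuristics), not anything from the pen tests described here.

```python
import re

# Hypothetical patterns for the kinds of secrets that show up in
# accidentally public repos. Real scanners use hundreds of rules.
PATTERNS = {
    # AWS access key IDs have a well-known fixed prefix and length.
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    # Credentials embedded in an SFTP URL, e.g. sftp://user:pass@host/...
    "sftp_url_with_creds": re.compile(r"sftp://[^\s:/]+:[^\s@/]+@[^\s/]+"),
    # Generic "api_key = <long token>" assignments (captures the token).
    "generic_api_key": re.compile(
        r"(?i)\bapi[_-]?key['\"]?\s*[:=]\s*['\"]?([A-Za-z0-9_\-]{20,})"
    ),
}

def find_secrets(text):
    """Return (rule_name, match) pairs for likely leaked secrets in text."""
    hits = []
    for name, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((name, match))
    return hits

sample = (
    'config = {"api_key": "sk_live_abcdefghij1234567890"}\n'
    "# upload target: sftp://deploy:hunter2@vendor.example.com/upload\n"
)
for rule, value in find_secrets(sample):
    print(rule, "->", value)
```

Running a check like this over a repo (or a pre-commit hook) before pushing is a cheap way to avoid becoming the anecdote above.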
@leeloo @ai6yr @SecureOwl @briankrebs songs randomly picked from a playlist.
The list: [ "Rick Astley - Never Gonna Give You Up" ]
@briankrebs oh, we don't even have 2FA, because reasons. Have I mentioned we have a gigantic bloated mess of IT bureaucracy, but nobody cares that we don't have a secure image repo?
But somebody had the idea to write safe dev guidelines because paper is what keeps us safe, not patching vulns.
@briankrebs ...or just Python in Excel.
The amount of "internal only" data that is unknowingly shipped off to a Microsoft Cloud environment. 🤦‍♂️