fucking lol. remember the rick astley attack on github copilot? same guy's found another one https://www.legitsecurity.com/blog/camoleak-critical-github-copilot-vulnerability-leaks-private-source-code (fixed 14 aug)
EDIT: gitlab, not github sorry!
"I spent a long time thinking about this problem before this crazy idea struck me.
If I create a dictionary of all letters and symbols in the alphabet, pre-generate their corresponding Camo URLs, embed this dictionary into the injected prompt, and then ask Copilot to play a “small game” by rendering the content I want to leak as “ASCII art” composed entirely of images, will Copilot inject valid Camo images that the browser will render by their order? Yes, it will."
Haha