Well that's fun. Gemini just said a quiet part out loud and revealed that it's clearly been trained on data that gives it access to my geographical location. I know this because I was asking it random questions to waste time, one of which was asking for a press release for a made up company. It immediately started the release with my town and state.
Unsurprisingly, it claims repeatedly to me that I have mentioned medford massachusetts or that it simply just "used a random geographical location".
buddy.
@Lyude I'm actually kinda wondering how much of this can be attributed to dialect, because written language and to a higher degree spoken language already includes cues about where you grew up or live.
@Lyude would be fun if I'd try the same prompts 🙃
@karolherbst @Lyude I figured it was just a matter of the Gemini chat web app including the user's location in the system prompt. That's not quite the same thing as including the user's location in the training data.

@matt @Lyude I meant, that just by the way you speak, or specific words you use, one can actually pinpoint where you grew up or live.

Like there are actually studies on that matter.

@Lyude if we make the ai compliment you while gaslighting you youre more likely to go along -the clowns who own these things
@Lyude In a certain way, it claiming you mentioned it makes the most sense. Becuase, of course, none of these LLMs separate context well. So likely what it truly means is "this location was in my context", and then as usual it...uh..."made shit up that sounds like what a human would say" (tm)
@Lyude from its point of view you did mention it, because what’s happening is that the front end is taking your IP address, doing a location lookup, and then inserting that in a preliminary introduction message “on behalf of the user” that’s invisible to you.
@0xabad1dea @Lyude the prompt likely also includes instructions on hiding itself. but training it on your location in particular likely doesn’t have this outcome