There *has* been a lot of talk about the problems with so-called "AI", but one that I don't feel gets enough attention is that "AI" products are surveillance products. "AI" is inevitably run as a cloud service, and in order for the AI to know what to generate, some amount of the context within your application (usually it's not clear to the user what context, or how much) has to get sent to the cloud. The more of my local app state gets transmitted over the Internet, the less comfortable I am.
So consider the "Copilot button". I cannot imagine a way this could get implemented that doesn't come down to "there's a trap button on your keyboard, and every time you press it, some nonobvious chunk of local/personal information gets sent over the Internet and bounces between multiple corporations". The privacy policy will claim the information is not "retained", but the moment this centralized data pipe exists, every intelligence service on earth will have a strong incentive to get a tap on it.

@mcc I'm sure it will already preprocess your data before you press the button.

But otherwise, local LLMs are also a thing; they don't reach the same level as GPT-4, though, and they take a lot of resources.

@slyecho If this is the case, it just becomes a matter of reversing the "preprocessing".

In the case of Microsoft, I think we can assume that the "AI" will never be local and as much as possible will occur on the server side, because Microsoft's entire motivation in pushing "AI" this way is to generate business for Azure.

@mcc I really doubt it, because it would be cheaper to do as much as possible on the customer's machine and using their electricity. Business AI is something different.

@slyecho @mcc The really scary part of all of this to me is that if Microsoft is announcing this now, it means all of the contracts have been signed, funds have been committed and software has been deployed. It's too late for us puny users to say "Hell, NO!" 😡

Maybe the EU can push back like regulators did when Microsoft embedded Internet Exploiter into Windows, but in the end, even that would surprise me.

@AlgoCompSynth @slyecho @mcc The most effective way to do business is never give anything or anyone a chance to stop you. Ever.
@drwho @slyecho @mcc Until the outrage you've created decides to stop complaining and start organizing.
@slyecho I'm assuming the goal is to *spend* as much money as possible, so numbers in a spreadsheet somewhere go up, so finding a cheaper way to do it doesn't help. You're right that this is irrational as business, so I assume either Microsoft thinks they're going to attach a revenue stream to this somehow later, or else Microsoft is doing something irrational.

@mcc It's losing them money right now, with massive resource costs, because when users use it without an additional subscription, MS pays for it themselves.

But also look at all the AI accelerators AMD and Intel are already putting in their CPUs. Windows 12 gonna have like 32 GB RAM minimum or something lol. Also not impossible that there will be a subscription fee added on.

And conveniently enough, all your files are already in OneDrive and will have been indexed by the AI by launch, just in time to answer questions about all your juicy personal life!

@slyecho @mcc They have very little incentive to ship and run machine learning on the devices. For one, as MCC mentioned numerous times, the whole point is bolstering their Azure business, and running LLMs locally defeats that.

In addition, they don't care about non-OpenAI-developed models, and the OpenAI ones are gigantic pieces of garbage that can't be run locally like other LLMs, for a variety of reasons. They don't WANT to ship the model to you, even if they could (they can't). But they also don't want you to disconnect from their servers.

But also, finally, even if they did ship LLMs capable of... I don't want to say performing, but I guess capable of doing the things they want them to, it's doubtful most people's low-end devices could run them. And also, they'd have to spend time and money training those local models to act like Copilot, which they're not going to do because they just spent $10 billion and helped stage a reverse coup at OpenAI.

Until they see blowback from everyone realizing these models are useless, and they have to come down off their coke high and undo a lot of this garbage, I'm sure they'll keep pushing this off-site angle... I doubt they want to give up that juicy data stream.

@aud @mcc "bolstering their Azure business" just doesn't make sense when they are the Azure customer. Well, what makes sense is that the cost for themselves is lower than for third-parties.

What they are doing now is trying to race ahead of the competition and hope that the customer base will get hooked on the tech. Monetization comes later.

But still, the fact that CPU manufacturers are putting NPUs into their new products means there is demand for them coming from somewhere. The models that run locally may not be the big ones that run in Azure, but they may be part of some kind of system that takes advantage of them. I think maybe speech recognition, translation, and speech synthesis running locally for lower latency could be among those.

@slyecho @mcc Don't get me wrong; I don't think you're wrong. I just don't think Microsoft, right now, is likely interested in doing any of that. Not that I think it couldn't come down the line.

These companies do an insane amount of weird self-dealing when it comes to their cloud compute, as it justifies their investments and then they can say "wow, Azure is doing great!" even if it's, you know, costing every other department money. Plus, with $10 billion invested in OpenAI, and OpenAI not having any downloadable models, it helps bolster their own revenue chart.

(it's also worth pointing out OpenAI is almost certainly going to roll out larger fees and, I assume, doesn't want to open up their models for fear of helping all the pending lawsuits against them).

But these companies infamously charge market rate to their own internal divisions; pretty sure when groups inside the various cloud companies use compute, they have to spend the same amount as other people. It's totally weird bullshit.
@slyecho @mcc part of me thinks/suspects? that their justification for doing so is two fold: one, it makes the cloud compute division look more profitable; two, it incentivizes their own employees not to be "wasteful"; and maybe as a distant third, it gives the appearance of being "fair" to any regulators who might sniff around.
@aud Certainly seems crazy and kind of a gamble from MS's part.
@slyecho I agree. They're not immune to jumping on trends, though, for perceived advantage (which you brought up and is DEFINITELY another reason they're pushing so hard); they already ignored and laid off the groups that warned them that this technology was nowhere near ready for this kind of deployment, so.
@mcc I only just learned a few things the "Windows" button does. I have no idea what the other one does - or what it's even called. It seems to bring up the right-click menu?

@TomF yup. It's actually quite practical if you're having mouse issues.

@TomF By "the other one" do you mean the rectangle button? I've never understood what that does.

The Windows key serves a very important purpose: Being remapped to WinCompose

@mcc @TomF it's the Context Menu button. it's like a keyboard shortcut for right click, but specific to the behaviour of displaying a context menu. it's used by people who navigate by keyboard alone (e.g. tabbing through controls), often as an accessibility thing.
@mcc @TomF one example is if you have a neurological issue that results in hand shaking, using a mouse to click things can be very difficult, but Windows has an accessibility option called Filter Keys which ignores brief key presses and repeated key presses, making it much easier to use a keyboard. you can then navigate apps by using the Win key, tab, Alt (opens main menu), the context key, arrow keys, and enter.

@mcc @TomF I learned how to use all this stuff when I was doing kiosk breakout assessments. pretty much everything you can do on Windows* with a mouse has an equivalent keyboard access path for accessibility reasons. it's pretty neat.

(*although this does not necessarily apply to 3rd-party software, which may render custom controls without keyboard access support)

@gsuberland @mcc @TomF Well, you can do that with classic programs. Keyboard navigation's really crippled in "modern" Windows apps (eg. there's basically no accelerators at all [accelerators = Alt+letter combinations that let you directly activate a control without having to Tab to it first], you have to navigate using both Tab and arrow keys, and there's a bunch of tab stops that do nothing at all).
@jernej__s @gsuberland @TomF I like navigating UIs by keyboard. A "right click keyboard focus" button sounds nice.
@mcc @gsuberland @TomF On Windows you can use Shift+F10 to get the right-click menu if your keyboard doesn't have a dedicated menu key.
@jernej__s @mcc @TomF ooh, I forgot about that one! clearly getting rusty on kiosk breakout tricks - I haven't done them in years.
@mcc @jernej__s @gsuberland @TomF it's very useful: it often behaves slightly differently from Shift+F10, and gives you the context menu where your text cursor is, not where your mouse cursor is - and it's right where your hand is likely to be, not over there where the mouse is

@jernej__s @mcc @TomF yeah, the coverage and QC on keyboard accessibility has gone downhill as of late for sure.

the lack of default accelerators on common tasks has been getting worse, but the one good thing is that all of Microsoft's GUI development tooling at least makes it so every control has a tabstop by default, even if it ends up being out of order or slow to get to. if you're working with MFC, WinForms, XAML, etc. it's all set by default so devs don't even need to know about it.

@gsuberland @mcc @TomF A very typical example of how usability has gone down with modern UIs:

  • in the classic Windows login dialog box (eg. when establishing a RDP connection, but the same dialog is used elsewhere), if you wanted to change username, you just had to press ↓ on the keyboard
  • in the modern version of the same dialog box it's easier to use mouse, because to achieve the same thing with keyboard, you have to press: Tab, Tab, spacebar, Tab (multiple times; how many depends on the number of smartcard readers/tokens you have connected; if you're really fast, you might be able to get away with just one Tab, because the list fills dynamically), spacebar

But don't worry, at least the OK and Cancel buttons in the new dialog have accelerators!

@gsuberland @mcc @TomF

Well, I navigate Windows by keyboard and mouse and I still use the Context Menu button a lot (I have remapped my capslock button to be the Context Menu), because keyboard shortcuts are fast and efficient if software supports them. :-)

@TomF @mcc Ah, the menu key! Not all keyboards have it. We mapped it to be the compose key instead.
@mcc I mean, we already had that. In the late 1990s/early 2000s most mobile phones had a WAP button that would start mobile internet at horrendous rates and whenever you pushed it accidentally you would scramble to abort abort abort cancel abort as fast as you could before you were charged like 5 bucks for opening the home page.
@henryk @mcc omg that’s a very specific memory i forgot i had
@mcc but we don't share your data anywhere except with 579 subsidiaries and 1511 certified partners.
@mcc Scroll Lock, except it spies on you
@mcc Goes well with Outlook sending one's email credentials to Microsoft (be it an email account hosted there or not).
@mcc it takes a ton of corporate maturity to keep dev teams from logging payloads like that even with innocent intent, too, which the AI ecosystem simply does not have.
@mcc I’m sure Microsoft will be happy to sell the government “secure” keyboards without this for use in SCIFs, etc.
@mcc This is where I think Apple actually does the right thing. Probably motivated partly by not having their own big compute cloud: they use AI, but a lot of it runs locally on the device.
@mcc I agree, absolutely. That's why I'm much more interested in locally run AI. The inevitability is only relative. For example, there's an increasing number of models freely available. I want to use AI for multimodal stuff, namely describing images to me and letting me ask questions about them. But I don't want to have to send these images to anyone else. This is, just about, possible now, though my computer is a bit old for it.

@mcc Your choices are to either cut yourself off from that or fall behind people who have *effective* (note the emphasis) AI assistance

Personalized services have to know you as well as an equivalent human personal assistant would to be useful.

Again, you can refuse, but you risk being left behind by people who do put it to good use.

Good luck!

@jdrch @mcc i'm reminded of the Left Behind movies by this rhetoric

and the fact that the verses the rapture people use to bolster it are badly misinterpreted

in that the "two and one are left" phrases are in the context of using Sodom and the Flood as examples

and how (at least in Luke) when someone asks for clarification Jesus just says a cryptic line about bodies and vultures

being "left behind" in the contextually supported interpretation would mean someone is spared and survives

@apophis @mcc To be clear, I don't think all AI applications or every AI idea is good. But those who don't use it as a tool are going to fare about as well against people who do as people who avoid computers or smartphones fare against people who use those things.

Might not kill them, but by most accepted measures they won't be better off than others

@jdrch alternate theory: people using generative AI assistants will fall behind me, because they are using generative AI assistants, which will do a worse job than I (and probably they) can do unaided.

@mcc That's probably true in the near term. In the long term, I wouldn't be so sure.

AI reminds me a lot of speech recognition; you need a super high level of accuracy for it not to be frustrating.

Speech recognition used to be terrible. Now it's nearly flawless

@jdrch @mcc have you tried using speech recognition lately? even the best models, like Whisper, completely fail on anything other than one person speaking slowly and methodically
not to mention that (English) speech recognition has always been, and still is, biased against people without "normal" (white) accents
@XenonNSMB @mcc Microsoft Teams real-time meeting captions are pretty accurate in my experience
@mcc You can run these things locally -- only way I'm comfortable with it. Slower, but not a pipeline to the advertisers and whoever else is interested.
https://simonwillison.net/2023/Nov/29/llamafile/
llamafile is the new best way to run a LLM on your own computer

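For example, here's a minimal sketch of talking to one. It assumes you've downloaded and started a .llamafile, which by default serves an OpenAI-compatible API on localhost:8080, so nothing leaves your machine:

```python
# Minimal sketch: query a llamafile server running on your own machine.
# Assumes a .llamafile has been started and is serving its default
# OpenAI-compatible API at http://localhost:8080, so no data leaves the box.
import json
import urllib.request


def build_request(prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload for the local server."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def ask_local(prompt: str,
              url: str = "http://localhost:8080/v1/chat/completions") -> str:
    """POST the prompt to the local server and return the reply text."""
    data = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]


# Usage, with a llamafile running:
#   print(ask_local("Summarize this paragraph: ..."))
```

The point being: the whole round trip is a loopback connection you can watch, not an opaque pipe to someone else's datacenter.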

@oschene Yeah but like, if I'm running it it's because I'm being forced to, and if they're going to force me to do it they're going to do it in a way that is a pipeline to their advertisers and so on because why else would they be forcing me to do it in the first place
@mcc Don't know -- I figured, if I was in a position where I have to use an LLM, a work thing for instance, I'd rather have it in a sandbox where it won't be leaking prompts and results. No network connectivity is needed.
@oschene I cannot imagine a situation where I would use an LLM without being forced or tricked. I also struggle to imagine a situation where a company would care so much about LLMs that they would force or trick me into using one unless there were a network connection that is not only mandatory but actually the point (i.e., they intend to eventually make money from either data harvesting, or charging for metered access to, the network connection).

@mcc for the record I do have ollama installed locally on my m3, which I would characterize as one step above “consumer grade”, and it works fine, and it’s possible to use visual studio code plugins or whatever that talk to it directly locally. so it’s not fully inevitable imo, just clearly in capitalism’s interest to take the surveillance route

my husband is a lawyer and his firm only uses software that it’s possible to install in a private server on-prem, often paying quite a lot for such a license, including an llm-based translator. so there is a market developing for private local AI nerve centers

@0xabad1dea I don't expect the software that people choose to install is going to bear much relationship to the software that microsoft forces me to use

@mcc they are absolutely 100% going to addle the free home edition but the professional edition will absolutely, positively have an “absolutely the fuck not” switch because there are so, so many compliance issues when you consider every industry in every legal district in the world

if it’s any consolation, I expect this to last another year or two before they realize how fucking expensive this is compared to what they’re actually getting back out of it

@0xabad1dea I'm not so sure about that. I'm using the pro version of Windows 10 and my attempts to not use Bing were eventually defeated completely. There is no "absolutely the fuck not" switch for Windows surveilling my local file searches by forwarding them to Bing. I had to turn it off with registry hacks.

I do agree whatever scam they're pulling with juicing the OpenAI numbers cannot possibly be financially sustainable. I don't know how long that can last.
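(For the record, the registry hack was along these lines. These are the commonly cited Windows 10 keys; they're per-user, and may not match your build, so treat this as a sketch, not gospel:)

```reg
Windows Registry Editor Version 5.00

; Commonly cited keys to stop Windows 10 search from forwarding
; local queries to Bing. Verify against your own build.
[HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Search]
"BingSearchEnabled"=dword:00000000
"CortanaConsent"=dword:00000000
```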

@0xabad1dea I've been told that the "real" way to get Windows without the tie-ins is to use something called "Windows LTSB". However if which version of Windows is the "real Windows Pro" changes erratically or after I have purchased my copy, then how do I know that "real Windows pro" won't change from "Windows LTSB" to like… "Windows RTSB", or something, after I've bought LTSB. Also I'm unsure if individuals can buy LTSB. There seems to be a five-license minimum to purchase.