@mcc I'm sure it will already preprocess your data before you press the button.
But otherwise, local LLMs are also a thing, though they don't reach the same level as GPT-4 and they take a lot of resources.
@slyecho If this is the case, it just becomes a matter of reversing the "preprocessing".
In the case of Microsoft, I think we can assume that the "AI" will never be local and as much as possible will occur on the server side, because Microsoft's entire motivation in pushing "AI" this way is to generate business for Azure.
@slyecho @mcc The really scary part of all of this to me is that if Microsoft is announcing this now, it means all of the contracts have been signed, funds have been committed and software has been deployed. It's too late for us puny users to say "Hell, NO!" 😡
Maybe the EU can push back like they did when Microsoft embedded Internet Exploiter in Win95, but in the end, even that would surprise me.
@mcc It is losing them money right now due to massive resource costs, because if users use it without an additional subscription, MS pays for the compute themselves.
But also look at all the AI accelerators AMD and Intel are already putting in their CPUs. Windows 12 gonna have like 32 GB RAM minimum or something lol. Also not impossible that there will be a subscription fee added on.
And conveniently enough, all your files are already in OneDrive and were indexed by the AI by launch, just in time to answer questions about all your juicy personal life!
@aud @mcc "bolstering their Azure business" just doesn't make sense when they are the Azure customer. What does make sense is that the cost to themselves is lower than it is for third parties.
What they are doing now is trying to race ahead of the competition and hope that the customer base will get hooked on the tech. Monetization comes later.
But still, the fact that CPU manufacturers are putting NPUs into their new products means that demand for them is coming from somewhere. The models that run locally may not be the big ones that run in Azure, but they may be part of some kind of system that takes advantage of the hardware. Speech recognition, translation, and speech synthesis running locally for lower latency could be among those uses.
@TomF By "the other one" do you mean the rectangle button? I've never understood what that does.
The Windows key serves a very important purpose: Being remapped to WinCompose
@mcc @TomF I learned how to use all this stuff when I was doing kiosk breakout assessments. pretty much everything you can do on Windows* with a mouse has an equivalent keyboard access path for accessibility reasons. it's pretty neat.
(*although this does not necessarily apply to 3rd party software, which may render custom controls without keyboard access support)
@jernej__s @mcc @TomF yeah, the coverage and QC on keyboard accessibility has gone downhill as of late for sure.
the lack of default accelerators on common tasks has been getting worse, but the one good thing is that all of Microsoft's GUI development tooling at least makes it so every control has a tabstop by default, even if it ends up being out of order or slow to get to. if you're working with MFC, WinForms, XAML, etc. it's all set by default so devs don't even need to know about it.
@gsuberland @mcc @TomF A very typical example of how usability has gone down with modern UIs:
But don't worry, at least OK and Cancel button in the new dialog have accelerators!
Well, I navigate Windows by keyboard and mouse and I still use the Context Menu button a lot (I have remapped my capslock button to be the Context Menu), because keyboard shortcuts are fast and efficient if software supports them. :-)
@mcc Your choices are to either cut yourself off from that or fall behind people who have *effective* (note the emphasis) AI assistance
Personalized services have to know you as well as an equivalent human personal assistant would to be useful.
Again, you can refuse, but you risk being left behind by people who do put it to good use.
Good luck!
@apophis @mcc To be clear, I don't think all AI applications or every AI idea is good. But those who don't use it as a tool will fare against those who do about as well as people who avoid computers or smartphones fare against people who use them.
Might not kill them, but by most accepted measures they won't be better off than others
@mcc That's probably true in the near term. In the long term, I wouldn't be so sure.
AI reminds me a lot of speech recognition; you need a super high level of accuracy for it not to be frustrating.
Speech recognition used to be terrible. Now it's nearly flawless
@mcc for the record I do have ollama installed locally on my m3, which I would characterize as one step above “consumer grade”, and it works fine, and it’s possible to use visual studio code plugins or whatever that talk to it directly locally. so it’s not fully inevitable imo, just clearly in capitalism’s interest to take the surveillance route
my husband is a lawyer and his firm only uses software that it’s possible to install in a private server on-prem, often paying quite a lot for such a license, including an llm-based translator. so there is a market developing for private local AI nerve centers
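For what it's worth, talking to a local Ollama server from a script is only a few lines. Here's a minimal sketch against its HTTP API; the default endpoint (http://localhost:11434) and the model name "llama3" are assumptions, so adjust them to whatever you actually have pulled:

```python
import json
import urllib.request

# Default Ollama endpoint; an assumption -- change if your server runs elsewhere.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_llm(model: str, prompt: str) -> str:
    """POST the prompt to the local server and return the generated text."""
    data = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Usage (requires a running Ollama server with the model already pulled):
#   print(ask_local_llm("llama3", "Summarize this paragraph: ..."))
```

Nothing leaves your machine; the editor plugins mentioned above are doing essentially the same request under the hood.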
@mcc they are absolutely 100% going to addle the free home edition but the professional edition will absolutely, positively have an “absolutely the fuck not” switch because there are so, so many compliance issues when you consider every industry in every legal district in the world
if it’s any consolation, I expect this to last another year or two before they realize how fucking expensive this is compared to what they’re actually getting back out of it
@0xabad1dea I'm not so sure about that. I'm using the pro version of Windows 10 and my attempts to not use Bing were eventually defeated completely. There is no "absolutely the fuck not" switch for Windows surveilling my local file searches by forwarding them to Bing. I had to turn it off with registry hacks.
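For reference, the registry hacks in question usually come down to a couple of values. These key names are the commonly cited ones, but exact behavior varies by Windows build, so treat this as a sketch rather than a guaranteed fix:

```reg
Windows Registry Editor Version 5.00

; Older Windows 10 builds: disable Bing-backed results in Start menu search.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Search]
"BingSearchEnabled"=dword:00000000
"CortanaConsent"=dword:00000000

; Newer Windows 10 / Windows 11 builds: policy value to suppress web
; suggestions in the search box entirely.
[HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\Explorer]
"DisableSearchBoxSuggestions"=dword:00000001
```

A sign-out or Explorer restart is typically needed before the change takes effect.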
I do agree whatever scam they're pulling with juicing the OpenAI numbers cannot possibly be financially sustainable. I don't know how long that can last.