OK. Sure. Sure. Can I speak to an adult please. Put an adult on the line.
@mhoye "We here at Microsoft are simply making punching motions in the air and walking in a direction. If you happen to occupy the air we are punching, you are solely responsible for any results or consequences."
@darius You can’t sell a bottle of soda water under terms like these.
@mhoye jeez, sounds like Happy Fun Ball.
@mhoye "copilot is for entertainment purposes only" lmaoooo is it good when they have to apply fox news legal theory

RE: https://cosocial.ca/@mhoye/116325415496773597

@mhoye If they tell me that something is "for entertainment purposes only," while selling it to me as a revolution in the way people write software and/or conduct business, they have established that lying for the purposes of covering their ass is how they do business.

I have other sources of entertainment, and if there's one thing I've learned, it's that leopards who offer to shave your beard for entertainment purposes only always wind up having your face for lunch.

@mhoye Only a completely corrupt and captured legal system would ever accept "For entertainment purposes only" as a disclaimer conferring immunity from liability.

C'mon. Who spends literally billions of dollars to make AI just to entertain people? Can they show the jury their business plan to pivot to entertainment? Is this the xbox division?

@mhoye "Here's Excel. It's buggy. Go ahead and use it for the pro-forma cash flow and income statements you need to raise funds, but if _our_ code screws up and _you_ are arrested for fraud, leave us out: Excel is for entertainment purposes only."

If that is ridiculous, so is the Copilot weasel-clause.

@raganwald @mhoye isn't that basically where we are, in truth, with software? isn't all the consumer-grade stuff sold under the end-user-licence declaration that the vendor bears no legal responsibility for any trouble caused by the software?
How can I mislead you? Air Canada found liable for chatbot's bad advice on bereavement rates | CBC News

Air Canada has been ordered to pay compensation to a grieving grandchild who claimed they were misled into purchasing full-price flight tickets by an ill-informed chatbot.

@raganwald @mhoye Excel being buggy isn't the problem. The problem is that pretty well every amateur-written spreadsheet (eg those written by accountants rather than software engineers) is buggy.
@TimWardCam @mhoye That is what I used to call a "beautiful failure:" Letting outsiders write their own software means 100x more junk, but also programs that experts would never have known needed to be written.

@raganwald @mhoye Pharma companies should heighten the contradictions by labeling their drugs as “for entertainment purposes only.”

Would also make a dandy shield against malpractice. "Your splenectomy was for entertainment purposes only. While I understand you didn’t find my little gratis surprise – the resection of your liver – entertaining, you can hardly blame me for trying to lighten your day.”

@mhoye
> we make this tool but we cannot guarantee that it does something and if it does we cannot guarantee that it does it correctly. Use at your own risk also it's not a tool.
@mhoye i experienced windows 98, man... I'm not surprised at all 😆😆😆
@mhoye is there any URL for this document?
@Sphinx_Pouet @mhoye yeah what's this from??

@Sphinx_Pouet @noodlemaz I think I went through the same reaction you did.

"This can't possibly be real. It *must* be satire."
"..."
"Apparently, in the grim future of 2026, Poe's law is the only law."

@datarama @Sphinx_Pouet @noodlemaz Worth noting, though, that these are the Copilot *for individuals* terms of use. I haven’t been able to find a similar document governing Microsoft’s liability for using Copilot for work, and I’d love to get one.
@bob_zim @datarama @Sphinx_Pouet yes, true, I had noted that but promptly forgot on second sight 🙃

@noodlemaz @Sphinx_Pouet

www.microsoft.com/en-us/microsoft-copilot/for-individuals/termsofuse

@mhoye @noodlemaz thank you 🙇 I had looked for the exact wording but it was not showing up in search results
@mhoye do not taunt Copilot

@mhoye

  • Do not taunt Copilot.
  • If Copilot begins to smoke, get away immediately. Seek shelter and cover head.
  • Copilot contains a liquid core, which, if exposed due to rupture, should not be touched, inhaled, or looked at.
  • Ingredients of Copilot include an unknown glowing green substance which fell to Earth, presumably from outer space.
  • Discontinue use of Copilot if any of the following...

@mhoye "For example, we can’t promise that any Copilot’s Responses won’t infringe someone else’s rights (like their copyrights, trademarks, or rights of privacy) or defame them."

Remember that little period when Microsoft was promising to compensate Github Copilot users if they were sued for IP violations for Copilot output?

Anyway.

@mhoye "We don’t own Your Content, but we may use Your Content to operate Copilot and improve it. By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf."

We don't technically own your content, but we absolutely can do anything we want to with it. So can anyone else if we want them to. Suck it, peons.

@mhoye In some ways, they may be the most lucid company in the industry?
@mhoye "Copilot is for entertainment purposes only"
@mhoye ARE YOU NOT ENTERTAINED?!
@mhoye wow that doesn’t align with how they advertise it
@mhoye I found the *one* area where copilot doesn’t respond like a grovelling sycophant
@mhoye "But we're going to force it into every crevice of our ecosystem anyway. Nya nya nya."