Microsoft doesn’t think that we should take its A.I. assistant Copilot too seriously.
Last year, Microsoft quietly updated the terms for Copilot. Here’s what they say: “Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.”
How often do you hear a company tell you not to trust its own product?
That’s a sharp contrast to Microsoft’s CEO recently saying that you should use Copilot in your everyday life . . . and that you can even ask it to predict outcomes.
Now, Microsoft says the “entertainment purposes only” disclaimer is just “legacy language” from when Copilot was more of a search tool, and that it plans to update the terms soon.
But for now, maybe just double check anything Copilot tells you.