I had the opportunity to attend Microsoft’s annual Ignite conference in mid-November. Microsoft held the event at Chicago’s McCormick Place. According to the trade press, more than 200,000 people attended, about 14,000 of them in person.
The primary topic at most sessions was artificial intelligence (AI), specifically Microsoft’s version, called Copilot. And in true Redmond fashion, there are Copilots for almost everything.
Outlook will have a version that can read through all your emails, view the tasks on your calendar, and learn about your contacts. It will help you prioritize which emails to respond to first. In some of the more expensive tiers, Microsoft 365 (what used to be known as Office) will have Copilot read through your Word documents, Excel spreadsheets, and PowerPoint presentations to summarize information about a client or business prospect.
Of course, to do this, all your files must reside in Microsoft’s OneDrive cloud environment (not in your local Documents folder). Eventually, Copilot will be “smart enough” to read through SharePoint libraries, the cloud equivalent of commonly used document folders. Teams will have a Copilot that can schedule meetings with your colleagues. It will listen to each speaker and take notes, and at the end of the meeting it will summarize the session and send the recap to all the participants.
What is left unspoken is that all these activities require considerable data governance effort from any company that wants to deploy Copilot. That phrase means deciding who can access what data, and when, among countless other rules and settings. Well, guess what? Microsoft has another product, called Purview, that assists with this effort. Large organizations will hire consultants to take on the challenge. I have a few ideas about what smaller companies will do, but most cannot take on the monumental task, one that often involves months of meetings, of putting the appropriate rules in place. That raises the likelihood of data exfiltration and of users gaining access to information they normally could not see. The net result could be an increased risk of a data breach.
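To make that abstract phrase concrete, here is a toy sketch in Python of the kind of rule data governance boils down to. The roles, document labels, and the can_access helper are all hypothetical examples I made up for illustration; they are not Purview’s actual model, and a real deployment expresses hundreds of such rules through administrative tools rather than code.

```python
# Toy illustration of a single data-governance rule: who can see what, and when.
# The roles, labels, and helper below are hypothetical, not Purview's API.

from datetime import datetime, timezone

# Hypothetical mapping of job roles to the document labels each may read.
ROLE_PERMISSIONS = {
    "hr": {"personnel", "general"},
    "sales": {"pricing", "general"},
    "intern": {"general"},
}

def can_access(role: str, doc_label: str, now: datetime) -> bool:
    """Allow access only if the role covers the label and it is business hours."""
    allowed_labels = ROLE_PERMISSIONS.get(role, set())
    business_hours = 8 <= now.hour < 18
    return doc_label in allowed_labels and business_hours

# An intern asking Copilot to summarize a personnel file should be refused.
print(can_access("intern", "personnel", datetime.now(timezone.utc)))  # False
```

Multiply that one rule by every department, file type, and exception in a company, and you can see why the meetings take months.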
The Large Language Model (LLM) that forms the basis of Copilot is still undergoing rigorous testing to ensure it harbors no racist or misogynistic tendencies. Microsoft insists it will not use YOUR data to help build the model, citing privacy rules. Of course, I believe that implicitly. No, actually, I have my doubts.
But here’s the thing. While Copilot can summarize information from disparate sources and help “knowledge workers” (remember that awkward phrase?) speed up mundane tasks, I have a suspicion about how things will eventually work out: everything sent through this AI engine is going to sound plain and soulless. For example, say both John and Mary report to Janice, and all of their documents are in OneDrive and SharePoint. Janice asks each of them to summarize information from a recent set of meetings. Microsoft says that sometime next year, John and Mary will be able to collaborate through Copilot to build the resulting deliverable for Janice. My questions are: How useful will the presentation be? Will it contain enough information for decision-making, or will it be full of the verbal fluff common to business writing?
The implications of using AI for business are fascinating, even other-worldly. I was mesmerized by the carefully crafted live demonstrations of this technology, though there were moments when I had to laugh at how intrusive the AI was. Still, I realized Microsoft holds this event to evangelize its products and ensure partners sell more of them to small, midsize, and enterprise businesses.
Oh, and if you want Copilot for Microsoft 365, it will cost $30 per user per month on top of your existing Microsoft 365 subscription. For a ten-person office, that works out to $300 a month, or $3,600 a year. I don’t think many of my small business clients will roll this out to all their staff. Yet I believe a few savvy business owners will invest in the technology to see if it can speed up some manual processes or streamline putting together quotes and proposals.
Thanks, and safe computing!