On November 1, 2023, Microsoft's AI assistant Copilot, created in collaboration with OpenAI, became generally available. Initially, an organization needed a minimum of 300 users with Enterprise licenses just to be allowed to buy Copilot licenses, but in early 2024 Microsoft lifted that restriction and now also offers small and medium-sized organizations with Business licenses the opportunity to buy and use Copilot.
With features such as “intelligently” summarizing meetings and creating documents and presentations, Microsoft hopes the tool will eliminate the “boredom” of office work.
But there is also concern that the technology could create unemployment and make companies dangerously dependent on AI-powered tools. In addition, there may be problems complying with new AI regulation, which, for example, will probably require that people be told when they are interacting with an AI rather than a human.
For us in Sweden, the value of having access to Copilot is also limited because Copilot cannot yet handle Swedish. Today, Copilot is mainly available in English, but many functions are also available in German, French, Spanish, Italian, Japanese, Chinese (Simplified) and Portuguese (Brazil).
What differentiates Copilot from ChatGPT?
There are a few key factors that distinguish Microsoft Copilot from ChatGPT.
- For starters, data is sent to the Microsoft Azure OpenAI service rather than directly to OpenAI.
- Copilot uses your organization's internal data (more on that below) as an additional data source, on top of the data the model was originally trained on.
- Copilot is also fully integrated into tools people already know and are comfortable with: Office, Teams, SharePoint and so on.
- Copilot builds a profile of how you express yourself, so that what it generates comes closer to your own tone.
Limits on how much internal data can actually be used?
If we go by Microsoft's own documentation, “using your internal data” simply means automatically running a search query against Office 365 (technically Azure Cognitive Search), finding the relevant information, and then pasting a summary of it into the “prompt” sent to the AI model. You can already do (almost) the same thing with ChatGPT, just more manually, if you add a summarization step.
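To make the pattern concrete, here is a minimal sketch of what such a “search, then prompt” flow could look like if you built it yourself with the Azure SDKs. This is not Copilot's actual implementation; the endpoint, index name, field name and deployment name are placeholder assumptions.

```python
# Minimal "search, then prompt" sketch (retrieval-augmented generation).
# Assumptions: an Azure Cognitive Search index over your own content and an
# Azure OpenAI deployment; all names below are placeholders, not Copilot internals.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search_client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="office365-content",          # hypothetical index name
    credential=AzureKeyCredential("<search-api-key>"),
)

ai_client = AzureOpenAI(
    azure_endpoint="https://<your-openai-resource>.openai.azure.com",
    api_key="<azure-openai-key>",
    api_version="2024-02-01",
)

def answer_with_internal_data(question: str) -> str:
    # 1. Run a search query against the indexed internal data.
    hits = search_client.search(search_text=question, top=3)
    context = "\n".join(doc["content"] for doc in hits)  # assumes a "content" field

    # 2. Paste the retrieved text into the prompt sent to the model.
    response = ai_client.chat.completions.create(
        model="gpt-4o",  # your deployment name
        messages=[
            {"role": "system",
             "content": "Answer using only the context below.\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer_with_internal_data("Summarize last week's project status reports."))
```

The point of the sketch is simply that the “magic” is retrieval plus prompting: how useful the answer is depends heavily on what the search step finds, which also explains the limits on how much internal data can realistically be used per question.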
Copilot also requires the transcription services for Microsoft Teams to get better; without proper text it is difficult to generate anything meaningful. At the present stage, the Swedish transcription, for example, is in my opinion too poor for us to trust the results when Copilot summarizes what was said at a meeting and so on. But it is constantly improving.
We also expect increased demands for structure in SharePoint and Teams, for example, and for meaningful data to be stored in Office 365. But this time to make things easier for the AI rather than for the employees.
It will also be interesting to see what happens as more and more of that internal data is itself generated by the AI, which will then use it to generate even more data.
Potential risks of Copilot
With all technological advances, however, come risks. With Copilot, these include, among others:
- Hallucinations: AI can sometimes “hallucinate” or create information that is not in the source text, which can lead to erroneous conclusions.
- Erroneous conclusions: If the AI misunderstands the context or content of a conversation, it can jump to the wrong conclusions. You know that Olle and Stina express themselves in certain ways, so you interpret what each of them says differently. That is hard for an AI model to do, which can easily lead to misunderstandings and incorrect conclusions.
- Lost human touch: We interpret people's words and actions in many different ways. If AI takes over, we could lose this human aspect.
- Dependence on the technology: If we become too dependent on AI and something goes wrong, or if the provider changes its policies in a way we don't agree with, it can be hard to keep working without it.
- Regulatory problems: It is unclear how legislation such as Europe's AI Act will affect the use of AI assistants such as Copilot.
Is the interest as great as it sounds?
Microsoft Copilot currently costs $30 per user per month, and Microsoft's sales and partner machinery is working hard on the sell-in, but it remains to be seen whether the interest really is as great as it sounds.
Despite the many benefits, there are still many questions and concerns surrounding the use of AI assistants. More research and development, as well as transparency and dialogue, will be required to ensure that these tools are used responsibly and ethically.
Microsoft Copilot has the potential to revolutionize the workplace, but it is important that we approach this new technology with a healthy skepticism and a cautiously optimistic approach.