This Microsoft AI Assistant can attend a meeting for you
Microsoft 365 Copilot can summarize meetings held in Teams for people who don't attend. It can also instantly draft emails and create text documents, graphs, spreadsheets, and PowerPoint presentations.
Microsoft hopes the tool will eliminate "hard work," but some worry that such technology will replace employees.
There are also concerns that businesses could become too dependent on the help provided by artificial intelligence.
Both the European AI Act and China's AI regulations require people to know if they are interacting with AI rather than humans.
Collette Stallbaumer, head of Microsoft 365, said it is up to the individual using Copilot to disclose this.
"It is a tool, and people have a responsibility to use it responsibly," she said.
"Maybe when I send you this answer, I won't tell you that I used artificial intelligence to help generate it. But a human is always present and always in control."
Regardless, the EU states that companies developing artificial intelligence tools must ensure their responsible use.
A BBC journalist had the exclusive opportunity to test Copilot ahead of its wider rollout.
It uses the same underlying technology as ChatGPT, which was created by OpenAI, a company in which Microsoft has invested billions of dollars.
The tool was tested on the laptop of Derek Snyder, a Microsoft engineer, because Copilot is embedded in an individual's account, with access to their own and their company's data.
Microsoft says the data is secure and will not be used to train the tool.
"You only have access to data that you should be able to see," said Stallbaumer. "Copilot respects the data usage policy."
The first impression of Copilot is that it will certainly be a useful tool, but also a formidable competitor for those who do office work, especially at companies looking to cut costs.
A BBC journalist watched it confidently summarize a long email chain about the launch of a fictional product in a matter of seconds.
It then suggested a short reply. Using a simple drop-down menu to make the response longer and more casual, the chatbot generated a warm reply expressing admiration for the proposed ideas and excitement about being involved in the project, even though the sender had not read any of it.
The user could then edit the email before sending, or accept the AI-generated suggestion and send it as-is. Nothing in the email indicated that it contained Copilot-generated content.
A BBC journalist also watched the tool create a multi-slide PowerPoint presentation based on the content of a Word document in about 43 seconds. It can use the images embedded in the document or draw on a free image collection. During testing, Copilot created a simple but effective presentation, and also wrote speaker notes to accompany it.
The journalist added, however, that Copilot did not understand the request to make the presentation more "colorful". The Microsoft tool even referred him back to PowerPoint's own tools to fix things himself.
At the end of the testing, the team also tried out the meeting features in Microsoft Teams.
Copilot identified the themes that emerged during the meeting and provided a summary of each. According to the BBC, Copilot could also summarize what a particular person said and, in the case of a disagreement, it could produce a table of the pros and cons of the topics discussed. All of this took the tool just a few seconds.
Copilot was programmed not to answer questions about the performance of individuals in meetings—such as who was the best (or worst) speaker. Mr. Snyder was asked if he thought anyone would actually make the effort to attend meetings if they realized Copilot could save them time and effort. "Many meetings could become webinars,” he joked.
Copilot will initially cost $30 per month and requires an internet connection; it does not work offline.
Critics say this type of technology is likely to cause major disruption, especially to administrative jobs. Carissa Veliz, an associate professor at the University of Oxford's Institute for Ethics in AI, said she is also concerned that people are becoming too dependent on such tools. "What happens if the technology fails or gets hacked? There may be an error, or the introduction of a new policy that you may not agree with. And then, if you're so dependent on the system that you feel you can't live without it, what happens next?" she said.