Microsoft Security Copilot could be a seismic success for the tech industry
Microsoft has continued to capitalise on its investment into OpenAI by extending its Copilot AI functionality to cyber security.
Microsoft Security Copilot uses the GPT-4 generative AI to bring prompt-based cyber security detection and remediation functionality to Windows defenders.
The tool is being marketed as an assistant for security analysts, helping them triage data and search for potential vulnerabilities more quickly, among many other tasks.
The AI, which combines an advanced large language model (LLM) and Microsoft’s bespoke security-specific model, analyses an organisation’s IT environment against the 65 trillion signals received daily by Microsoft’s global threat intelligence team.
Example queries that security analysts can input to Copilot include: ‘How can I improve my security posture?’; ‘What are the trending threats?’; ‘Which alerts are being triggered the most?’; and ‘Tell me about my latest incidents’.
Microsoft Security Copilot will then respond at machine speed, in turn training the bespoke Microsoft security model, which will, over time, tune existing skills and create new ones.
Microsoft was insistent that Copilot would not use company data to train the model, instead taking learnings from the processes alone.
It said the tool will allow defenders to respond to incidents within minutes rather than days. It aims to offer analysts a streamlined way of summarising the incident and its context, expediting the investigation and ultimately the remediation.
It also believes Security Copilot will help onboard and train new analysts who may not fully understand how to triage certain types of data or investigate specific incidents.
Less experienced analysts will be able to learn from the different remediation approaches Copilot suggests and accelerate the development of defensive skills.
“Security Copilot then can help catch what other approaches might miss and augment an analyst’s work,” Microsoft said.
“In a typical incident, this boost translates into gains in the quality of detection, speed of response and ability to strengthen security posture.”
Launching as a preview initially, Security Copilot will integrate with Microsoft’s other end-to-end security products and will support an increasing number of third-party security products over time, it said.
Microsoft also conceded that Security Copilot still makes mistakes and, like any other generative AI product, suffers from hallucinations – outputs or responses that can appear logical and confident but are incorrect.
In a demonstration, the system gave a response referring to the non-existent ‘Windows 9’, which users could correct and flag as false.
As Security Copilot is a closed-loop learning system, users can send outputs back to Microsoft tagged with feedback.
What can Microsoft Security Copilot do?
The tool’s complete array of capabilities is too comprehensive to list in full, but in a demonstration, Microsoft showed off some of the standout capabilities for organisations.
Copilot can identify a specific machine that led to a ransomware infection via OneNote, for example, and provide security teams with a visualised summary of the incident.
In a more detailed example, Microsoft showed that Security Copilot could reverse engineer a PowerShell script. It was then able to produce a flowchart visualisation of the attack and download process in simple terms that a broad range of employees could understand.
Defenders can add links or files to the prompt bar and ask for information on them, like querying a log file and asking if there is any malicious activity inside.
Trawling through log files can be a laborious but necessary process for security analysts and having an AI assistant to scan through them at machine speed is likely to hasten incident response significantly.
How useful will Microsoft Security Copilot be?
If Security Copilot is anywhere near as successful as Copilot has been for GitHub users, then the launch could be a seismic one for the security industry.
GitHub Copilot has already amassed a huge and devoted user base since its launch in 2021, and the AI pair programmer is now generating nearly half of all the code on the platform.
GitHub Copilot was, and still is, seen as a massively significant advancement in the software development space, and it’s growing more capable with every version that’s released.
Microsoft appears to be on a mission to embed the Copilot brand into as many of its core products as possible. It recently announced Microsoft 365 Copilot, an integration of the AI into its Office apps and Teams, an indication that Copilot is going to spearhead a new generation of Microsoft products and tech advancements.
Security Copilot has already been dubbed the “security release of the year” by Sherrod DeGrippo, director of threat research strategy at Microsoft, who tweeted her excitement as the news emerged on Tuesday.
“This is incredible. Security practitioners, this is a game changer,” she added.
Ciaran Luttrell, senior director of SOC operations EMEA at eSentire, told ITPro that GPT-powered tools are “undoubtedly going to become more prevalent” thanks to their ability to reduce the time it takes to respond to threats.
However, he noted that it’s not a silver bullet and is by no means a replacement for skilled cyber security staff.
“In the right environment, Security Copilot has the potential to unburden security teams from some tedious and time-intensive tasks, and also to level the playing field somewhat between enterprise and SMBs who may not have the resources to invest in their security teams and tooling to the same level,” he said.
“It’s important to note, however, that users will still require the relevant security knowledge and understanding in order use the tool effectively. This is not a silver bullet and human expertise will still be required to interpret the output of these systems and to decide on what actions to take.
“We cannot expect it to replace security analysts anytime soon, it’s a Copilot and not a pilot.”
It will be interesting to observe the take-up of Microsoft Security Copilot and, over time, hear how organisations are using it in real incident response scenarios.
The big question will be how useful it is for different types of businesses at launch, given that support for third-party products is coming later at an undetermined date. Organisations often purchase multiple tools from different vendors to complete their security stacks, and without that telemetry it is hard to say how useful Security Copilot will be until those integrations arrive.
The proof will ultimately be in the results, but if Microsoft is able to replicate the success of GitHub Copilot, it’s likely to be a significant moment for the security industry and a positive step forward for defenders.
© Future Publishing