In this episode of the SecurityANGLE, I’m joined by fellow analyst, engineer, and member of theCUBE Collective community, Jo Peterson. Our guest today is Rob May, the Founder and Executive Chairman of ramsac, a UK-based IT support, managed services, and cybersecurity firm. We’re going to dive into all things Microsoft Copilot for Security, which went GA in early April, and unpack some of the nuances.
Watch: Talking Microsoft Copilot for Security with Cyber Expert Rob May here:
Microsoft Copilot for Security is a generative AI-powered chatbot, built on OpenAI's GPT-4 large language models, that uses natural language prompts to deliver insights and help guide an IT team's next steps.
Microsoft Copilot for Security works with other Microsoft Security products, including:
- Microsoft Defender XDR
- Microsoft Sentinel
- Microsoft Intune
- Microsoft Defender Threat Intelligence
- Microsoft Purview
- Microsoft Defender External Attack Surface Management
Copilot for Security can access data from these products and provide a Copilot-driven experience to help increase efficiency and effectiveness.
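To make that idea of a single prompt surface over many products concrete, here's a minimal, purely illustrative sketch of the kinds of natural language questions an analyst might ask and the Microsoft security products a Copilot for Security session could draw on to answer them. The prompts and the product mapping are hypothetical examples, not Microsoft's documented prompt library or API.

```python
# Purely illustrative: hypothetical analyst prompts and the Microsoft security
# products a Copilot for Security session might draw on to answer them.
# This is not Microsoft's API or prompt library.
example_prompts = {
    "Summarize the incidents raised in the last 24 hours and rank them by severity.":
        ["Microsoft Defender XDR", "Microsoft Sentinel"],
    "Which devices are missing the latest endpoint compliance policy?":
        ["Microsoft Intune"],
    "Is this file hash associated with any known threat actor?":
        ["Microsoft Defender Threat Intelligence"],
    "Show me data loss prevention alerts involving customer PII from this week.":
        ["Microsoft Purview"],
}

if __name__ == "__main__":
    for prompt, sources in example_prompts.items():
        print(f"Prompt: {prompt}")
        print(f"  Likely data sources: {', '.join(sources)}\n")
```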
Built on the tech giant’s existing threat intelligence gathering, Microsoft Copilot for Security gives IT professionals access to the latest information on security threats and benefits from the 78 trillion daily signals that Microsoft already collects.
Cybersecurity has been a growing business for Microsoft for some time, accounting for more than $20 billion in revenue for the company in fiscal year 2022, more than either gaming or search advertising. The new AI-powered solution could become a massive business opportunity and a serious challenge to others in the space, which is why we wanted to explore the nuances of Microsoft Copilot for Security with Rob today.
The Backstory on Rob May
I mentioned earlier that Rob is the Founder and Executive Chairman of ramsac. He's a technologist, thought leader, and the author of several books on cybersecurity and AI, which makes him a perfect fit for this conversation. He's also a UK Ambassador for Cybersecurity and has been a tech industry spokesperson for the past 30 years.
AI and Cybersecurity: Microsoft isn’t the Only Player Here
Microsoft is most definitely not the only player in this market touting AI-powered security offerings. CrowdStrike offers Charlotte AI, a generative AI security tool that can process over 2 trillion events per day, making 180 million indicator-of-attack decisions every second.
Zscaler recently acquired AI security startup Avalor to bolster its own AI-driven protections.
Both SentinelOne and Cloudflare have also developed AI-enhanced protections, and of course there was no shortage of discussions about these solutions a few short weeks ago at RSA Conference in San Francisco.
Our conversation with Rob on Microsoft Copilot for Security explored the following:
- How Microsoft's security solutions compare to those of other security vendors, and whether Microsoft's deep footprint in the enterprise landscape plays a role in the adoption numbers it touts.
- Copilot for Security can work as a standalone application drawing data from many different sources, or as an embedded chat window within other Microsoft security services. We discussed both scenarios and why that distinction matters.
- TechRadar has reported that Microsoft Copilot for Security is “decidedly not a doer.” That means it won't take actions on its own, like blocking emails or deleting suspicious files, but it will suggest, guide, and explain to the people using the platform. It's also prompt-based, so users will need to up their query game. We discussed whether Microsoft Copilot for Security might actually make some of the Microsoft products mentioned earlier more effective.
- Microsoft is adopting a subscription-based, usage-driven pricing model for Microsoft Copilot for Security, broken down into what it calls “Security Compute Units” (SCUs). Customers are billed monthly for the SCUs they provision, at a rate of $4 per SCU per hour, with a minimum of one hour of use (a rough cost sketch follows this list). Microsoft frames this as a way to let users start experimenting with Copilot for Security and then scale up as needed. We discussed how customers might view this pricing and shared thoughts on whether it will be seen as attractive, driving wider use and experimentation, or, conversely, as the start of a bill that could be opaque and unpredictable.
- Security teams are using Copilot for Security to upskill. Microsoft has also been beta testing the product over the past year and claims that users report Microsoft Copilot for Security helped them be 26% faster and 34% more accurate in their threat assessments. Microsoft also reports users were 46% more accurate at security summarization and event analysis than when working without it.
- What is the impact of a solution like Microsoft Copilot for Security for SMBs and the midmarket, where staffing and talent are big issues?
- We explored the potential security risks of using Copilot for Security, especially as they relate to data leakage, unauthorized access, and regulatory compliance. If a user has access to something, Copilot does as well. Does this create a risk of sensitive information being exposed, and how can companies stay secure when using LLM-powered tools like Copilot?
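Because the pricing item above invites some back-of-the-envelope math, here is a minimal sketch of how a monthly bill might be estimated, assuming the announced rate of $4 per SCU per hour and a fixed number of SCUs provisioned around the clock. Actual billing mechanics (partial hours, scaling SCUs up or down mid-month) may differ, so treat this as an approximation rather than Microsoft's billing formula.

```python
# Rough monthly cost estimate for Microsoft Copilot for Security,
# assuming the announced $4 per Security Compute Unit (SCU) per hour rate
# and a constant number of SCUs provisioned 24x7 for roughly 30 days.
# This is an approximation, not Microsoft's billing formula.

RATE_PER_SCU_HOUR = 4.00  # USD per SCU per hour, announced at GA

def estimate_monthly_cost(provisioned_scus: int, hours_provisioned: float = 24 * 30) -> float:
    """Estimate one month's spend for a constant number of provisioned SCUs."""
    return provisioned_scus * hours_provisioned * RATE_PER_SCU_HOUR

if __name__ == "__main__":
    for scus in (1, 3, 5):
        print(f"{scus} SCU(s) provisioned 24x7 for ~30 days: "
              f"${estimate_monthly_cost(scus):,.2f}")
        # 1 SCU -> $2,880.00; 3 SCUs -> $8,640.00; 5 SCUs -> $14,400.00
```

Even one continuously provisioned SCU adds up over a month, which is why the "experiment first, scale later" framing, and how easy it is to predict the bill, were worth discussing.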
We wrapped the show with Rob’s thoughts on how he sees the Microsoft Copilot for Security integrations and the overall ecosystem for the product unfolding in the next 12 months. This is a conversation you won’t want to miss.
Image credit: Pexels Mike Johnson
See some of my other cybersecurity coverage here:
Microsoft Copilot for Microsoft 365: What’s Ahead for Law Firms
Cybersecurity: Evolution of the AI Threat — 3 Stages to Watch in 2024
IBM and Palo Alto Networks Join Forces on the Cybersecurity Front