
How NextGen AI is Impacting Cybersecurity

Artificial intelligence is becoming an integral element of cybersecurity for both attackers and defenders, and it must be considered from all angles.


The use of artificial intelligence has accelerated rapidly in recent years, and it’s becoming a more common tool to address security issues. When we consider the intersection of AI and cybersecurity, it’s being used on both sides of the security struggle.

Attackers are weaponizing AI, forcing cybersecurity teams to rethink strategies to include defensive tactics to protect against AI-fueled attacks. But AI is also being enlisted as a tool for good: It can be leveraged as a part of an organization’s cybersecurity strategy for automated detection and recovery purposes, which can save an organization time and money.

In a recent CDW survey, 55% of respondents said their organization is pursuing AI to improve security. It’s a strong use case, and we're seeing this across the board with our customers. Why? Because it puts the modern CISO in a much better position to handle their roles and responsibilities. When you think about incident management and incident response, it’s important to know that you have the right information, and that you can trust that information, when making a decision.

With the advancement of next-generation AI and its capabilities in security solutions, organizations and CISOs can make intelligent risk decisions and responses much more effectively and efficiently than before. They’re able to respond much faster and limit the blast radius of impactful events. These events don’t necessarily have to turn into incidents that negatively impact your operational status.

Leveraging AI as a Valuable Tool in Cyberdefense

As a CISO myself, I’m meeting with a lot of customers who want to use cybersecurity tools with built-in AI capabilities. One of the typical challenges for these organizations is that they’re constrained by training, by staffing and by capabilities that aren’t in their employees’ current tool sets. One of the things CDW helps them understand is how cybersecurity with AI can help them identify indicators of compromise much faster, then respond in an automated fashion – with confidence.

If you think about the human element and what it takes to review a single incident, going through volumes of logs and correlating that data into the appropriate incident response, it takes a tremendous amount of time. It can also require someone with a lot of knowledge and expertise to find the hidden things that a junior or brand-new analyst wouldn’t see.

This is where next-generation AI capabilities, built into next-generation security solutions, can actually make a difference. For example, they can parse large volumes of data quickly and present the relevant outputs in a use case that makes sense, allowing the security team to decide whether to adopt that use case. And once we get comfortable with how we’ve trained that system, we can take security orchestration and automated response to the next level: An organization can allow lower-level incidents to be handled by SOC solutions with next-generation AI capabilities, freeing security teams to be more strategic and spend their time on the more complex issues that need to be addressed. AI-enabled cybersecurity solutions can also help provide a more holistic view into an organization’s environment, delivering relevant and timely information so teams can make intelligent decisions rapidly, accurately and with confidence in the response actions they take.
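As a simplified illustration of that kind of triage automation (not any vendor's actual product; the alert fields, risk scores and playbook functions here are hypothetical), a minimal Python sketch of severity-based routing might look like this: low-risk alerts go to an automated playbook, and everything else is escalated to a human analyst.

```python
# Hypothetical sketch of severity-based alert triage: low-risk alerts are
# routed to an automated playbook, higher-risk ones are escalated to analysts.
from dataclasses import dataclass


@dataclass
class Alert:
    source: str        # e.g., "endpoint", "firewall", "identity"
    indicator: str     # the observed indicator of compromise
    risk_score: float  # 0.0 (benign) to 1.0 (critical), e.g., from an ML model


AUTO_REMEDIATION_THRESHOLD = 0.4  # below this, the automated playbook handles it


def triage(alert: Alert) -> str:
    """Decide whether an alert is auto-remediated or escalated to a human."""
    if alert.risk_score < AUTO_REMEDIATION_THRESHOLD:
        return run_playbook(alert)
    return escalate_to_analyst(alert)


def run_playbook(alert: Alert) -> str:
    # Placeholder for an automated response, such as isolating a host
    # or disabling a credential, with the action logged for later review.
    return f"auto-remediated {alert.indicator} from {alert.source}"


def escalate_to_analyst(alert: Alert) -> str:
    # Placeholder for opening a ticket or paging the on-call analyst.
    return f"escalated {alert.indicator} (risk {alert.risk_score:.2f}) for review"


if __name__ == "__main__":
    print(triage(Alert("endpoint", "suspicious-script.ps1", 0.25)))
    print(triage(Alert("identity", "impossible-travel login", 0.85)))
```

In practice, the risk score would come from the AI system’s correlation of logs and telemetry, and every automated action would still be logged for human review.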

All Lines of Business Need To Be Aware of Data Protection Needs

But it’s not just cybersecurity teams that need to consider security concerns around AI. Other lines of business must build security into their AI strategy and operations, especially when it comes to data protection.

For example, your marketing team might be using a large language model to draft campaigns for customers, but it has to ensure it isn’t exposing proprietary information or customer insights the organization hasn’t received permission to share. More important, the team needs to be certain it isn’t violating regulations.
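As a simple illustration of that kind of guardrail (the patterns, project codename and function name below are hypothetical, and a short regex list is no substitute for real data loss prevention tooling and policy), a team might screen prompts for obviously sensitive strings before they ever reach an external model:

```python
# Hypothetical sketch: screen an LLM prompt for obviously sensitive strings
# before it leaves the organization. Real deployments would rely on proper
# DLP tooling and governance policy, not a short regex list.
import re

SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    # Example internal codename; a real list would come from governance policy.
    "internal_project": re.compile(r"\bProject\s+Falcon\b", re.IGNORECASE),
}


def screen_prompt(prompt: str) -> tuple[bool, list[str]]:
    """Return (allowed, findings); block the prompt if anything matches."""
    findings = [name for name, pattern in SENSITIVE_PATTERNS.items()
                if pattern.search(prompt)]
    return (len(findings) == 0, findings)


if __name__ == "__main__":
    ok, hits = screen_prompt(
        "Draft a campaign email for jane.doe@example.com about Project Falcon"
    )
    if not ok:
        print("Prompt blocked; sensitive content detected:", hits)
```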

When I talk to organizations about next-generation AI and what they're doing with it, the No. 1 resource used in all of these AI solutions is data. So, I always bring them back to the data and make sure they understand that it’s intertwined, it’s foundational. It is the basic building block for anything you’re going to do with AI. So, what are you doing from a framework perspective, from a governance perspective and from a security perspective to ensure that your data is protected?

Typically, governance is the last thing we see organizations pursue. One of my areas of expertise is data protection, which includes AI, data quality management, data security, data privacy, data governance and more. When I look at where organizations typically are on their data protection journey, governance is an afterthought; they push it down to the lowest level. Ethics cannot be pushed down to the lowest level. This is an organizational technology and capability that requires due diligence from the C-suite, the board of directors and legal counsel. Putting the right ethical framework in place should also be top of mind. So, I was pleasantly surprised to see that 51% of the survey respondents are already well down the road to implementing an AI framework that includes governance.

How Your AI Strategy Can Benefit From an Experienced Partner

A lot of the organizations that responded to the survey are turning to third parties for support with their AI initiatives. Companies such as CDW are uniquely positioned to support these customers. So, it wasn't surprising to me that 46% of respondents stated that their use case involves security, and that they are approaching third parties for AI expertise and support.

Organizations should be focused on selecting the right partners to move forward with their AI strategy. It’s one thing to say, “I need advisory services.” But the right partner will recognize what types of services are actually needed. Organizations should consider working with partners who can help them through the challenging stages of creating a framework; implementing, managing, monitoring and securing that framework; and building out the use cases in support of creating AI solutions. CDW has its finger on the pulse and has the experience and expertise on staff, making us uniquely positioned, from a security perspective, to help organizations plan effectively for the adversarial threats that are coming and proactively address them.

When I looked at the responses to the survey, it didn’t surprise me that 26% of respondents said that training and finding talent are the biggest pain points for their organizations, and that applies to security as well. One of my peers at CDW noted that most organizations he has worked with struggle with where to start with AI. They ask, “What solutions do we use? How do we govern AI and really take a lifecycle management approach to it?” This goes to the heart of why organizations that lack talent and training struggle to deploy their high-priority AI projects, including cybersecurity with AI capabilities. Because of this gap, 99% of respondents said they turn to partners such as CDW for help in overcoming shortfalls.

Learn more about the relationship between artificial intelligence and cybersecurity in the new CDW Artificial Intelligence Research Report.

Aaron McCray

Field CISO, CDW

Aaron McCray is a Field CISO at CDW.