The use of artificial intelligence (AI) in business has skyrocketed across industries, and the field of cybersecurity auditing is no exception.
By integrating AI into their workflows, cybersecurity auditors can refine their processes, reduce time spent on administrative tasks, enhance accuracy, gain deeper insights from large datasets, and better prepare their clients to stay one step ahead of cybercriminals. But before diving into the deep end, auditors need to understand the benefits and limitations of available tools to determine how their teams—and their clients—should approach AI.
How AI Is Reshaping the Cybersecurity Auditing Landscape
Cybersecurity auditors are already beginning to leverage AI to improve both their own productivity and outcomes for clients. AI-powered tools can elevate auditors’ reviews of complicated data sets, speed up routine work, and free auditors to spend more time sharing their insights and expertise and giving clients actionable recommendations on how to strengthen their cybersecurity and compliance programs.
Traditionally, a significant amount of an auditor’s time has been spent on simple but time-consuming tasks. AI tools can help with collecting documentation, compiling information for the audit file, and formatting that data into compliance reports.
Compliance automation tools are also playing a role in the rise of AI in cybersecurity auditing. Reviewing automated conclusions about control effectiveness, rather than performing time-consuming manual audit procedures, can be a big time-saver for auditing teams. To rely on the completeness and accuracy of a compliance automation tool’s reporting, an auditor may need to spend some upfront time digging into how the tool works and where its data comes from. Once that due diligence is complete, a trusted AI-powered compliance automation tool can greatly reduce the time it takes to complete cybersecurity audits like SOC 2 and ISO 27001.
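To make that idea concrete, here is a minimal sketch of the kind of automated control check such a tool might run behind the scenes. The data source (a JSON export of identity-provider users) and its field names are illustrative assumptions, not any vendor’s actual API; the point is that the conclusion an auditor reviews is tied to a traceable population and evidence source.

```python
# Hypothetical automated check for one control:
# "MFA is enforced for all active users."
import json
from datetime import datetime, timezone

def check_mfa_control(export_path: str) -> dict:
    with open(export_path) as f:
        users = json.load(f)

    active = [u for u in users if u.get("status") == "active"]
    violations = [u["email"] for u in active if not u.get("mfa_enabled")]

    return {
        "control": "MFA enforced for all active users",
        "source": export_path,  # lets the auditor trace the evidence
        "checked_at": datetime.now(timezone.utc).isoformat(),
        "population": len(active),
        "exceptions": violations,
        "effective": len(violations) == 0,
    }

if __name__ == "__main__":
    print(json.dumps(check_mfa_control("idp_users_export.json"), indent=2))
```

The auditor’s job shifts from gathering screenshots to validating that the export is complete and the logic above actually reflects the control being tested.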
AI is also reshaping the industry by shifting the focus from traditional point-in-time audits toward continuous monitoring. Leading compliance automation platforms are spearheading the charge, offering greater transparency into system configurations, real-time alerts on anomalies, and a record of configuration changes over time, so organizations aren’t limited to a single screenshot from a specific date as evidence that controls are in place.
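A simplified sketch of what continuous configuration monitoring can look like follows; the file paths and alert mechanism are assumptions for illustration only. Instead of one point-in-time screenshot, each run appends a timestamped fingerprint to an evidence trail and flags drift from the approved baseline.

```python
# Illustrative config-drift monitor: hash each snapshot, log it over time,
# and alert when the configuration no longer matches the approved baseline.
import hashlib, json, time
from pathlib import Path

BASELINE = Path("baseline_config.json")      # approved configuration
HISTORY = Path("config_history.jsonl")       # evidence of changes over time

def fingerprint(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def check_drift(current: Path) -> None:
    record = {
        "checked_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "config": str(current),
        "sha256": fingerprint(current),
        "drifted": fingerprint(current) != fingerprint(BASELINE),
    }
    with HISTORY.open("a") as f:
        f.write(json.dumps(record) + "\n")
    if record["drifted"]:
        print(f"ALERT: {current} no longer matches the approved baseline")

if __name__ == "__main__":
    check_drift(Path("prod_firewall_config.json"))  # e.g. run on a schedule
```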
We’re still far away from being able to thoroughly assess the cybersecurity posture of a company or a system with a single AI tool; however, great strides are being made to reduce the amount of “paper pushing” auditors have traditionally done.
It’s Not Just Auditing Firms
Particularly over the last year, auditors on my team and beyond have noticed an uptick in organizations across industries using AI in innovative ways, from drafting and proofreading marketing emails to automating and optimizing internal processes. But with these new benefits come new risks for companies—and their IT auditors—to consider.
Cybersecurity auditors should strive to learn how artificial intelligence, including generative AI, actually functions, so they’re well-equipped to ask clients how and when they use AI tools and what policies they have in place to ensure those tools don’t compromise confidentiality or privacy.
For clients who have woven AI capabilities into their workflows, auditors should have frank conversations about the limitations of existing compliance frameworks like SOC 2 when it comes to concluding on AI safety or trustworthiness. For example, for AI tools that lack clear explainability, it could be challenging to report conclusions against trust services criteria like processing integrity. In some instances, it might make sense for organizations to explore new and evolving standards for AI management, like ISO 42001, as well as frameworks developed by NIST and IBM.
All of these frameworks share a common theme: Auditors and their clients should refrain from inputting private data into third-party tools unless and until they have completed thorough vendor risk assessments, assessed potential security vulnerabilities, and implemented proper data protection measures. What’s more, organizations should be prepared to roll back or secure their AI usage if unintended issues or incidents arise.
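As a rough illustration of data protection before a prompt ever leaves the building, here is a minimal guardrail sketch. The approved-vendor list and redaction patterns are hypothetical and deliberately simple; they stand in for a real data loss prevention control, not a complete one.

```python
# Hypothetical pre-send guardrail: redact obvious identifiers and only
# allow AI vendors that have passed a vendor risk assessment.
import re

APPROVED_VENDORS = {"approved-ai-vendor.example"}  # populated after risk review

REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "[CARD]"),
]

def redact(text: str) -> str:
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

def safe_to_send(prompt: str, vendor_host: str) -> str:
    if vendor_host not in APPROVED_VENDORS:
        raise PermissionError(f"{vendor_host} has not passed a vendor risk assessment")
    return redact(prompt)

if __name__ == "__main__":
    print(safe_to_send("Contact jane.doe@client.com re: SSN 123-45-6789",
                       "approved-ai-vendor.example"))
```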
The Bottom Line
When it comes to cybersecurity compliance, there is no single AI tool that can serve as a one-stop shop for organizations, and auditors shouldn’t be concerned that artificial intelligence will replace their roles. AI tools will continue to get better at automating some of the manual tasks that auditors do today to create audit files, including workpapers and reports; however, humans will always be needed to assess risks, consult on best practices, interpret AI-generated insights, and ensure that systems are not just maintaining the status quo, but also continually improving over time.
AI won’t take our jobs—instead, AI will empower IT auditors to spend more of our days focused on critical thinking, strategic analysis, and what I believe are the most exciting parts of our roles: strengthening our clients’ security postures and helping them build long-lasting cyber resilience.