As Data Privacy Week 2025 approaches, cybersecurity experts are raising alarms about the growing impact of artificial intelligence on data protection. While AI offers unprecedented advances in real-time threat detection and predictive modeling, it also introduces significant risks, from automated phishing campaigns to accidental exposure of sensitive information through poor access controls. Experts emphasize the need for robust AI risk assessments and frameworks such as the NIST AI Risk Management Framework and ISO 42001 to manage this double-edged sword, urging organizations to balance AI innovation with vigilant oversight to protect personal data.
- “There’s an AI rush going on. Organizations are cutting corners to implement AI into their applications, which opens the risk for AI tools to process sensitive customer data. It’s crucial for organizations to begin performing risk assessments around the use of the AI tools and how they can impact the sensitive data managed by the organization. There are now multiple frameworks that can be used to help facilitate this risk assessment, such as the NIST AI Risk Management Framework and ISO 42001.” – Marc Rubbinaccio, Head of Compliance, Secureframe
- “AI will dominate data privacy conversations in 2025. It empowers defenders with real-time threat detection, predictive modeling, and automated responses through tools like SOAR (Security Orchestration, Automation, and Response). However, bad actors are also using AI to automate phishing campaigns, identify vulnerabilities faster, and evade detection with AI-designed malware. Organizations need to be aware of this double-edged sword and adopt AI-based threat detection tools to counter these tactics and protect the personal data they manage.” – Chris Gibson, CEO, FIRST
- “In 2025, we’re seeing a concerning trend where sensitive data exposure through AI isn’t primarily coming from sophisticated attacks – it’s happening through basic oversights in authorization and data access controls. Organizations are discovering that their AI systems are inadvertently sharing confidential information simply because they haven’t defined who should have access to what.” – Rob Truesdell, Chief Product Officer, Pangea
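Truesdell's point about authorization gaps lends itself to a concrete illustration. The sketch below shows one way to enforce document-level access controls at retrieval time, so that material a user is not entitled to see never reaches an AI model's context. It is a minimal example under assumed conditions: the document store, roles, ACLs, and model call are hypothetical placeholders, not any particular vendor's API.

```python
# Sketch of retrieval-time authorization: filter documents against the
# caller's entitlements *before* they can reach an AI model's prompt.
# All names here (Document, the corpus, the roles) are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Document:
    doc_id: str
    text: str
    allowed_roles: set[str] = field(default_factory=set)  # explicit per-document ACL


# Hypothetical corpus: one broadly readable document, one restricted to finance staff.
CORPUS = [
    Document("handbook-01", "Employee handbook: vacation policy...", {"employee", "finance"}),
    Document("payroll-q3", "Q3 payroll ledger with salaries...", {"finance"}),
]


def retrieve_for_user(query: str, user_roles: set[str]) -> list[Document]:
    """Return only documents the caller is entitled to see.

    Authorization is enforced at retrieval time, so an unauthorized
    document can never be placed into the model's context and echoed
    back in an answer.
    """
    matches = [d for d in CORPUS if query.lower() in d.text.lower()]
    return [d for d in matches if d.allowed_roles & user_roles]


def answer(query: str, user_roles: set[str]) -> str:
    context = retrieve_for_user(query, user_roles)
    if not context:
        return "No accessible documents match this request."
    # A real system would send the filtered context to an LLM here;
    # this sketch only shows which documents would reach the prompt.
    return "Context passed to model: " + ", ".join(d.doc_id for d in context)


if __name__ == "__main__":
    print(answer("payroll", {"employee"}))  # restricted document is filtered out
    print(answer("payroll", {"finance"}))   # finance role is allowed to see it
```

The design choice the sketch illustrates is the one implied in the quote: access decisions belong in the data path, not in the model. If authorization is applied only after generation, or not at all, the system has already mixed confidential content into its output.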