Disaster recovery and business continuity must now function in a far more risk-laden and complex environment. While ransomware continues to plague organizations, the executives responsible for meeting compliance and business continuity goals face several added challenges. They must balance the imperative to adopt generative AI (GenAI), and the larger data sets it generates to feed data science and analytics, against their responsibility to keep IT staff time and workloads aligned with overall revenue goals.

CIOs and their teams are refining their business continuity strategies to address the magnitude of unstructured data in their environments. They are seeking solutions and practices capable of efficiently managing data analytics fed by large language models (LLMs), while providing the replication safety net that supports business continuity. The specter of a breach and costly downtime is ever present.

The GenAI Juggernaut

Huawei estimates global data volume will reach 180 zettabytes by 2025, a staggering 80% of which will be unstructured data. Its Global Industry Vision report says that by 2030 some 25% of unstructured data will be used for production and decision making, a figure that will eventually reach 80%. GenAI is driving this volume by aggregating text, voice, documents, videos, emails, and messaging platforms.

Businesses are contributing to GenAI volume as they find utility in the technology for product development, customer experience applications and data analytics. They are also responding to GenAI’s efficiency benefits and its ability to automate tasks, e.g., virtual assistants, chatbots and the like.

While GenAI is beginning to exhibit value in various use cases, its rapid development, broad use of large data sets, potential for misapplication and lack of robust security controls make it a rich target for cyber criminals. As Gartner says, “Enterprises must prepare for malicious actors’ use of generative AI systems for cyber and fraud attacks, such as those that use deep fakes for social engineering of personnel, and ensure mitigating controls are put in place.”

Interestingly, a Gartner survey of 2,500 executives found only 7% viewed business continuity as a primary objective of GenAI. Customer experience and retention topped the list at 38%, followed by revenue growth at 26%.

Protecting Large Data Sets

The implication is that executives are not yet confident GenAI can help prevent data breaches or support business continuity, even as it contributes to the explosive growth of unstructured data and expands the opportunities for cyberattacks. As businesses continue to assemble large data sets using GenAI-powered processes, and transform this data into applications, there is an immediate need to ensure these data sets are protected and adhere to business continuity standards.

Contributing to the security challenge is the fact that different functional teams use and generate unstructured data in their own fashion, with varying degrees of security in place. IDC, in a study sponsored by Box, warns that application sprawl and fragmentation of unstructured data, “often with diverse sets of identity and authentication models and different administrative features,” contribute to a larger potential attack surface. Analyzing the cost of a data breach, IDC estimates that greater fragmentation roughly doubles the annual cost of security breaches: $4.5 million versus $2.2 million.

To avoid fragmentation, businesses need to set enterprise-wide standards for practices such as authentication and policy controls, and work with functional teams to accommodate specific needs like recovery point-in-time standards. Like any other use of data, unstructured data use must adhere to security compliance standards. IT security teams must also conduct regular audits to ensure standards and policies are being followed.
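As a rough sketch of what such a regular audit might look like in practice, the snippet below checks a list of unstructured-data stores against an enterprise-wide policy baseline. The store names, control fields, and thresholds are all illustrative assumptions, not tied to any particular product or real environment.

```python
# Illustrative audit: flag unstructured-data stores that fall short of an
# enterprise-wide baseline for authentication and recovery controls.

BASELINE = {
    "mfa_required": True,        # boolean controls must be enabled
    "encryption_at_rest": True,
    "max_rpo_minutes": 15,       # numeric controls are ceilings
}

stores = [
    {"name": "marketing-assets", "mfa_required": True,
     "encryption_at_rest": True, "max_rpo_minutes": 10},
    {"name": "ml-training-corpus", "mfa_required": False,
     "encryption_at_rest": True, "max_rpo_minutes": 60},
]

def audit(store, baseline):
    """Return the list of baseline controls this store fails to meet."""
    gaps = []
    for control, required in baseline.items():
        value = store.get(control)
        if isinstance(required, bool):
            if required and not value:
                gaps.append(control)
        else:  # numeric ceiling, e.g. maximum tolerated RPO in minutes
            if value is None or value > required:
                gaps.append(control)
    return gaps

for s in stores:
    gaps = audit(s, BASELINE)
    status = "OK" if not gaps else "non-compliant: " + ", ".join(gaps)
    print(f"{s['name']}: {status}")
```

The point of the sketch is the shape of the check, not the specific controls: whatever standards the enterprise sets, an automated comparison like this makes the audit repeatable across every functional team's stores.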

Integrating GenAI into Business Continuity

GenAI is evolving fast. Large language models offer the potential of significant business value but still must be subject to the same security and data protection practices as any other application or data asset. To ensure recovery and business continuity, there are several immediate considerations:

  • Visibility is a priority. The principle that you can’t manage assets you don’t know about holds true for unstructured data. Functional teams, or lines of business, must have visibility into the unstructured data in their environment. It is a fundamental practice for avoiding cyberattacks, data privacy breaches, and budget impact. With visibility, teams can judge which GenAI data and/or models are critical and categorize them accordingly to support continuity.
  • The cloud is king. Large language models are built on data that needs to live in the cloud, which requires best practices for securely storing that data and executing recovery as needed. The expense of, and lack of hardware support for, the data sets needed to train large language models makes on-premises storage prohibitive. If a business has adopted a multi-cloud strategy, it needs a solution that can support large data set migration across multiple cloud providers.
  • Recovery is the point. GenAI has changed the amount of data flowing through an organization and to the cloud. In refining a business continuity strategy to integrate GenAI, functional teams need to review their recovery time objectives (RTO) and recovery point objectives (RPO). These standards ensure backup and recovery processes are in place to recover any critical GenAI data sets or applications.
  • Replication is imperative. To support near-zero RTO and RPO objectives, replication technology can help enable compliance and fast data recovery by providing real-time cloud replication of actively used GenAI data. This method reduces costs and further ensures continuous data accuracy should recovery be necessary. In terms of business continuity, it enables development and analytics teams to continue working with the most up-to-date material.
  • No downtime is a must. Any data movement can hamper business continuity if it requires application downtime. Data migration solutions that can facilitate large-scale data changes and migration to the cloud will help minimize disruption.
  • Automation is the answer. In the event of a system failure, IT teams can use active-active replication across multi-cloud environments to ensure automatic failover and recovery, minimizing data loss and downtime.
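The RTO/RPO review above can be made concrete with a simple compliance check: compare observed replication lag against the RPO target and observed failover time against the RTO target. The sketch below is hypothetical; the data set name, targets, and observed values are illustrative assumptions.

```python
# Illustrative RPO/RTO check: replication lag bounds potential data loss
# (RPO), and failover time bounds potential downtime (RTO).
from dataclasses import dataclass

@dataclass
class ContinuityTarget:
    name: str
    rpo_seconds: int   # maximum tolerable data-loss window
    rto_seconds: int   # maximum tolerable downtime

def within_objectives(target, observed_lag_s, observed_failover_s):
    """True only if both the RPO and RTO objectives are met."""
    return (observed_lag_s <= target.rpo_seconds
            and observed_failover_s <= target.rto_seconds)

llm_corpus = ContinuityTarget("llm-training-corpus",
                              rpo_seconds=60, rto_seconds=300)

# Near-real-time replication keeps lag well inside the 60-second RPO.
print(within_objectives(llm_corpus, observed_lag_s=12, observed_failover_s=180))
# A nightly batch copy (lag measured in hours) would miss the same RPO.
print(within_objectives(llm_corpus, observed_lag_s=86_400, observed_failover_s=180))
```

Running a check like this continuously against monitoring data is one way to turn RPO/RTO from paper standards into alerts that fire before a recovery event exposes the gap.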

Practice Secure GenAI Use

Before the end of this decade, GenAI and unstructured data will become a larger and more powerful element in production, data science, and day-to-day aspects of business such as HR. Now is the time to review the business continuity practices in place to ensure business teams can effectively manage and secure the large language models and data sets they are using.

Updating security practices will help avoid the costs attributed to a data breach. Beyond direct monetary costs, there are other costs that reshape what business continuity means in the era of unstructured data. First, there is the reputational cost when a disruption occurs, or when compliance sanctions are levied after a data privacy breach. Second, there is the loss of individual customer trust and loyalty that results from a disruption.

Stringent compliance requirements regarding data protection, privacy, and continuity of operations, particularly in finance and healthcare, are an important cost and trust factor. It is too easy to introduce confidential data, perhaps unwittingly, into large language model training.

Security and functional teams will need to work together to set limits on unstructured data that poses a privacy threat. IT teams must also avoid fragmentation of policy controls, update recovery practices, and use replication technology to take the lead in ensuring business continuity.

ABOUT THE AUTHOR

Paul Scott-Murphy

Paul Scott-Murphy is chief technology officer at Cirata, the company that enables data leaders to continuously move petabyte-scale data to the cloud of their choice, fast and with no business disruption. He is responsible for the company’s product and technology strategy, including industry engagement, technical innovation, new market and product initiation and creation. This includes direct interaction with the majority of Cirata’s significant customers, partners and prospects. Previously vice president of product management for Cirata, and regional chief technology officer for TIBCO Software in Asia Pacific and Japan, Scott-Murphy has a Bachelor of Science with first class honors and a Bachelor of Engineering with first class honors from the University of Western Australia.
