87 Data Protection Predictions from 46 Experts for 2024

For our 5th annual Insight Jam LIVE!, Solutions Review editors sourced this resource guide of data protection predictions for 2024 from Insight Jam, its new community of enterprise tech experts.

Note: Data protection predictions are listed in the order we received them.

Data Protection Predictions from Experts for 2024


Bobby Cornwell, Vice President Strategic Partner Enablement & Integration at SonicWall

Expect to See New Regulations for Reporting Breaches

“In 2024, incoming cybersecurity regulations will force businesses to be more transparent about their breaches and attacks. Forthcoming legislation such as the EU’s NIS2 Directive and the Cyber Resilience Act will impose more stringent standards for cyber protection and establish clear reporting timelines in the event of a breach. As these directives take effect, businesses will be required to share early identifications of system vulnerabilities with their partners and suppliers or face fines. The aim of this is to prevent cybercriminals from inflicting widespread damage across multiple businesses. In 2024, it will be crucial to optimize the transparency afforded by these regulations; by dragging cybercriminals out into the open, authorities can more effectively curtail their illicit activity.”

Samir Zaveri, Practice Manager – Package Led Transformation at Wipro

Future of Data Protection in Hybrid Cloud Deployments

“On one hand, hybrid cloud adoption will continue to grow exponentially; on the other, organizations are looking to repatriate workloads back to private clouds. Data protection in a single cloud environment was already a challenge, and with data distributed across multiple clouds and cloud service providers, the challenge has grown even more. Today, organizations maintain a one-to-one mapping between source clouds and data backup and disaster recovery sites, which leads to multiple standard operating procedures and multiple points of data theft, along with inconsistent recovery SLAs.”

To overcome these challenges, organizations will need to adopt new capabilities

“Full workload portability is one of them. With portability, organizations have the ability to deploy workloads across different cloud service providers without having to adapt to each environment and with no changes needed to the application or the infrastructure. Portability will give organizations a way to consolidate multiple sources into one single data backup and disaster recovery site, as well as consolidate standard operating procedures (SOPs), all with consistent recovery SLAs.”

Eyal Arazi, Cloud Security Manager at Radware

Migration to the cloud will slow down as companies reverse course

“During the past few years, there has been a rapid adoption of multi-cloud strategies, with organizations often using three, four and even five different cloud environments. The problem, however, is that while organizations accumulated more cloud platforms, they ignored the problems of cross-cloud security consistency, visibility and management.

A recent survey commissioned by Radware suggests that organizations are now starting to reverse course. Despite the ongoing discussion about “the great cloud migration” and the abandonment of on-premises environments, approximately three quarters of organizations not only still use these environments but expect usage to increase in the next 12 months. Based on the report, look for more companies to consolidate their cloud environments from three or more to one or two platforms in 2024. While there will be consolidation around the cloud, most organizations will continue to maintain a combination of public cloud and on-prem platforms, resulting in a hybrid environment.”

Andrew Moloney, Chief Strategy Officer at SoftIron

Cloud strategy moves from “fashionable to rational” 

“Moving from an era when proposing a full-scale migration to the public cloud was a sure-fire route to promotion, the current maturation of the market, economic conditions, shifting performance requirements, and a dramatic simplification in building private cloud-native infrastructure will see a much more rational approach. Underpinning this will be a broader understanding of the difference between cloud “the model” and cloud “the place”, where how applications are built is decoupled from where they operate.”

A sovereign cloud shake out 

“We predict that many of the “pseudo” sovereign cloud projects – those that rely on obfuscated infrastructure and/or local third parties to operate them to provide a veneer of sovereignty – will not gain traction. AWS, late to the party in offering such a service and having recently launched its European Sovereign Cloud, may well be delivering too little, too late. Instead, those that offer true sovereign resilience – enabling nation-states to build, operate, inspect, and audit their own infrastructure on their own terms and turf – will become the preferred option.”

VMware acquisition accelerates the adoption of cloud-native infrastructure 

“Forced into seeking credible alternatives to VMware for virtualized infrastructure in on-prem data centers, existing VMware customers will take the opportunity to revisit their cloud strategy, making the rational decision to shift to a fully cloud-native infrastructure – one able to consolidate and simplify existing virtualized on-prem workloads while delivering true private cloud going forward. Finally, they will be able to deliver what VMware and Nutanix have promised for years but have never quite been able to deliver.”

A renaissance for Private Cloud 

“Partially related to our VMware prediction, and to the availability of cloud-native infrastructure that changes the economics of private cloud, the evolution of a more rational cloud strategy will see Cloud Centers of Excellence (CCoEs) and FinOps professionals grasp the opportunity to get an apples-to-apples comparison not just across public clouds, but now between public and private cloud. New open standards released in 2024, such as FOCUS, will help to enable this.

At the same time, shifts to distributed cloud architectures, enabling workloads to move from the edge to the core and back, will elevate the need to make private clouds more than just basic virtualized infrastructure.”

The death of “Hyper-Converged” 

“Hyperconvergence has already been effectively abandoned by its greatest exponent, Nutanix. The independent and elastic scaling limitations inherent in these architectures, plus their failure to fully deliver a cloud-native environment without significant integrations and third parties, will see hyperconvergence relegated from the data center to smaller, departmental-type solutions only.”

Software-defined fades; hardware and hard tech get sexy again 

“Fuelled by the hype around AI and the investments being made in the processing power to support it, we predict we’ll see a resurgence of interest in hardware innovation and in the hard tech required to support it. A new generation of start-ups will disrupt the inertia that has marked IT infrastructure design for the last couple of decades.”

Thomas Chauchefoin, Vulnerability Research at Sonar

AI-Assisted attacks to become more sophisticated and automated

“IT security attacks leveraging AI are expected to become more sophisticated and automated. Hackers will likely use AI to analyze vast amounts of data and launch targeted attacks. AI-driven phishing attacks capable of generating highly convincing and personalized messages, which trick users into revealing sensitive information, may increase. Furthermore, AI-powered malware could adapt and evolve in real time, making it more challenging for traditional antimalware detection systems to keep up.”

Stacy Hayes, Co-Founder and EVP, Americas at Assured Data Protection

“The managed services model will become increasingly attractive to traditional VARs in 2024, especially with more and more businesses looking to buy cloud and IT services on a usage basis. But making the transition from a traditional VAR to a provider of managed services is easier said than done. It’s not that VARs aren’t capable of diversifying – far from it – it’s just that the switch requires a fundamental shift in the way VARs do business, and that isn’t something you can change overnight. These large organizations are not built for this new world model. The in-house build and integration of new technology and go-to-market models takes too long and is too expensive to implement. VARs simply don’t have the people, the flexibility, or the know-how. With the economic headwinds as they are, Opex is king and no one has the Capex or the appetite for big in-house builds.

It is becoming increasingly difficult for VARs to provide a large portfolio of products and services to the standards customers demand. The speed at which the market moves and the growing reliance on data add to those demands. It is evident channel businesses are struggling to deliver what their customers want, whether it be on-premises or in the cloud. It is a common topic, and one I believe means VARs need to clearly understand what they can deliver themselves and what they need to outsource. Outsourcing, or white labelling, is a great way to deliver a high-quality and diverse portfolio to customers.

MSPs that have the know-how to use utility-based models effectively, that can execute immediately, that have experts in the space, and that deliver services tailored for the vendor, customer, and end user will be the partners of choice for VARs in 2024.”

Jim Liddle, Chief Innovation Officer at Nasuni

2024 will be a make-or-break year for data intelligence  

“Following the booming interest in AI in 2023, enterprises will face increased pressure from their boards to leverage AI to gain a competitive edge. That rush for an AI advantage is surfacing deeper data infrastructure issues that have been mounting for years. Before they can integrate AI effectively, organizations will first have to address how they collect, store, and manage their unstructured data, particularly at the edge.

AI doesn’t work in a vacuum and it’s just one part of the broader data intelligence umbrella. Many organizations have already implemented data analytics, machine learning and AI into their sales, customer support, and similar low-hanging initiatives, but struggle to integrate the technology in more sophisticated, high-value applications. 

Visibility, for example, is a crucial and often-overlooked first step towards data intelligence. A shocking number of companies store massive volumes of data simply because they don’t know what’s in it or whether they need it. Is the data accurate and up-to-date? Is it properly classified and ‘searchable’? Is it compliant? Does it contain personally identifiable information (PII), protected health information (PHI), or other sensitive information? Is it available on-demand or archived?  

In the coming year, companies across the board will be forced to come to terms with the data quality, governance, access, and storage requirements of AI before they can move forward with digital transformation or improvement programs to give them the desired competitive edge.” 

2024 will be the year of reckoning for both ransomware and compliance 

“The risk of ransomware and sophisticated attacks is ever-growing and will continue to spread internationally in 2024. Preventing the theft, encryption, misuse, or exposure of sensitive data will remain a daily concern for organizations indefinitely. Multi-layer protection has quickly become a matter of hygiene and even companies that invested in sophisticated, global ransomware protection products will need a belt and braces approach in the form of network, application, and access security, coupled with rapid data recovery solutions. 

Ransomware has typically been more prevalent in the US, with larger organizations and their larger data sets presenting more attractive targets for bad actors. In 2024, we’ll see more ransomware incidents in the UK as government agencies, health services, and critical infrastructure in both countries continue to lack the technology and funding to build adequate data protection and recovery capabilities. 

Organizations that haven’t addressed their data protection and recovery posture are now risking both security and compliance headaches, as regulatory penalties and recovery costs often outmatch ransom payouts. Europe still leads in data governance and regulation with the likes of GDPR, but legislation like the California Consumer Privacy Act (CCPA) is quickly spreading across the US. By delaying their investment in protection and compliance solutions until forced to, many large organizations will soon face the possibility of steep penalties, ransom demands, and business disruption simultaneously.” 

Russ Kennedy, Chief Product Officer at Nasuni

Enterprises will embrace hybrid infrastructure or fall behind 

“The next revolution in data will occur at the edge. After years of conflicting definitions and uncertainty, today’s leading businesses are realizing the necessity of truly hybrid infrastructure. To remain competitive in a data-driven world, enterprises need high performance processing at the edge, where data is generated, in combination with the scale, capacity, and advanced tools available in the cloud.  

Traditionally, large companies have used legacy storage vendors and traditional backup solutions to store and protect petabyte volumes of data. These legacy infrastructures are a performance bottleneck and can’t support the pace of growth, as analysts at William Blair recently highlighted.  

Over the next few years, we’ll see more organizations realize it’s not one or the other, but a combination of edge and cloud storage. According to Gartner, 50 percent of critical infrastructure applications will reside outside of the public cloud through 2027. Manufacturers, for example, need to quickly capture and consolidate the critical data coming from their physical systems and processes across the world, while keeping and leveraging that data for analytics year after year. Ready or not, we’ll see this edge-cloud mechanism force organizations to adopt and embrace truly hybrid infrastructure and ultimately transform their ability to drive more effective innovations and respond in a more agile way to customers’ evolving needs.” 

Organizations will continue to grapple with data infrastructure to support hybrid work long after the pandemic 

“The genie is out of the bottle and hybrid or remote is here to stay. Though the greatest economic upheavals have hopefully passed, we’re seeing the residual effects. Many companies are still trying to design or optimize infrastructure to accommodate hybrid work and reconfigured supply chains.  

Though organizations worked quickly to spin up the necessary systems, they simply weren’t designed to support thousands of remote workers. Inevitably, workers started using whatever tools necessary to collaborate, and many businesses saw a significant increase in shadow IT tools outside of sanctioned corporate IT programs. As we enter 2024, IT organizations are still grappling with the effects of remote work on top of mounting pressure to reduce costs and regain control of their disparate and sprawling corporate data assets. 

Some have tried to remedy the issue by mandating employees back into the office, but to attract and retain appropriate talent, businesses will need to provide enhanced multi-team collaboration options and the data infrastructure to scale it. Those that have the right data access solutions in place to streamline processes and remote collaboration will succeed in the hybrid work economy.” 

Matt Waxman, Senior Vice President and GM for Data Protection at Veritas Technologies

The first end-to-end AI-powered robo-ransomware attack will usher in a new era of cybercrime pain for organizations

“Nearly two-thirds (65 percent) of organizations experienced a successful ransomware attack over the past two years in which an attacker gained access to their systems. While startling in its own right, this is even more troubling when paired with recent developments in artificial intelligence (AI). Already, tools like WormGPT make it easy for attackers to improve their social engineering with AI-generated phishing emails that are much more convincing than those we’ve previously learned to spot. In 2024, cybercriminals will put AI into full effect with the first end-to-end AI-driven autonomous ransomware attacks. Beginning with robocall-like automation, eventually AI will be put to work identifying targets, executing breaches, extorting victims and then depositing ransoms into attackers’ accounts, all with alarming efficiency and little human interaction.”

Targeted cell-level data corruption will make ransomware more dangerous than ever

“As more organizations become better prepared to recover from ransomware attacks without paying ransoms, cybercriminals will be forced to continue evolving. In 2024, we expect hackers to turn to targeted cell-level data corruption attacks—code secretly implanted deep within a victim’s database that lies in wait to covertly alter or corrupt specific but undisclosed data if the target refuses to pay a ransom. The real threat is that victims will not know what data—if any, the hackers could be bluffing—has been altered or corrupted until after the repercussions set in, thus effectively rendering all their data untrustworthy. The only solution is to ensure they have secure copies of their data that they are 100 percent certain are uncorrupted and can be rapidly restored.”
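The defense this prediction points to hinges on holding backup copies you can prove are uncorrupted. A minimal sketch of that idea, assuming a simple file-based backup and using SHA-256 content hashes recorded at backup time (the function names and layout here are illustrative, not from any specific product):

```python
import hashlib
from pathlib import Path

def hash_file(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(backup_dir: Path) -> dict[str, str]:
    """Record a digest for every file at backup time."""
    return {str(p.relative_to(backup_dir)): hash_file(p)
            for p in backup_dir.rglob("*") if p.is_file()}

def verify_manifest(backup_dir: Path, manifest: dict[str, str]) -> list[str]:
    """Return the files whose current digest no longer matches the manifest."""
    return [name for name, digest in manifest.items()
            if hash_file(backup_dir / name) != digest]
```

Re-running `verify_manifest` before a restore surfaces exactly which files were silently altered, which is the property covert corruption attacks are designed to deny you.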

Adaptive data protection will autonomously fight hackers without organizations lifting a finger

“More than two-thirds of organizations are looking to boost their cyber resiliency with the help of AI. But, given AI’s dual nature as a force for both good and bad, the question going forward will be whether organizations’ AI-powered protection can evolve ahead of hackers’ AI-powered attacks. Part of that evolution in 2024 will be the emergence of AI-driven adaptive data protection. AI tools will be able to constantly monitor for changes in behavioral patterns to see if users might have been compromised. If the AI detects unusual activity, it can respond autonomously to increase their level of protection. For example, initiating more regular backups, sending them to differently optimized targets and overall creating a safer environment in defense against bad actors.”
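As a hedged illustration of the adaptive mechanism described here, the snippet below tightens a backup schedule when the observed activity rate deviates sharply from its historical baseline. The thresholds and interval values are invented for the example; a real product would rely on far richer behavioral models than a single z-score:

```python
import statistics

def backup_interval_minutes(baseline_events_per_hour: list[float],
                            current_rate: float,
                            normal_interval: float = 240.0,
                            hardened_interval: float = 15.0,
                            z_threshold: float = 3.0) -> float:
    """Shorten the backup interval when activity looks anomalous.

    Flags the current event rate as suspicious when it sits more than
    z_threshold standard deviations above the historical mean, and
    responds by switching to a much more frequent backup cadence.
    """
    mean = statistics.fmean(baseline_events_per_hour)
    stdev = statistics.pstdev(baseline_events_per_hour) or 1.0
    z = (current_rate - mean) / stdev
    return hardened_interval if z > z_threshold else normal_interval
```

The same trigger could also redirect backups to a differently optimized target, as the prediction suggests, rather than only changing the cadence.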

Generative AI-focused data compliance regulations will impact adoption

“For all its potential use cases, generative AI also carries heavy risks, not the least of which are data privacy concerns. Organizations that fail to put proper guardrails in place to stop employees from potentially breaching existing privacy regulations through the inappropriate use of generative AI tools are playing a dangerous game that is likely to bring significant consequences. Over the past 12 months, the average organization that experienced a data breach resulting in regulatory noncompliance shelled out more than US$336,000 in fines. Right now, most regulatory bodies are focused on how existing data privacy laws apply to generative AI, but as the technology continues to evolve, expect generative AI-specific legislation in 2024 that applies rules directly to these tools and the data used to train them.”

For every organization that makes the jump to the cloud, another will develop an on-premises datacenter as hybrid cloud equilibrium sets in

“The percentage of data stored in the cloud versus on-premises has steadily grown to the point where it is estimated that 57 percent of data is now stored in the cloud, with 43 percent on-premises. That growth has come from both mature companies with on-premises foundations making the jump to the cloud, and newer companies building their infrastructure in the cloud from the ground up. But both categories of organizations are learning that, for all its benefits, the cloud is not ideally suited for all applications and data. This is leading many companies that made the jump to the cloud to partially repatriate their data, and cloud-native companies to supplement their cloud infrastructure with on-premises computing and storage resources. As a result, in 2024, we’ll see hybrid cloud equilibrium—for every organization that makes the move to the cloud, another will build an on-premises datacenter.”

Cassius Rhue, VP of Customer Experience at SIOS Technology

Application high availability becomes universal

“As reliance on applications continues to rise, IT teams will be pressured to deliver efficient high availability for applications once considered non-essential. Once reserved for mission-critical systems, such as SQL Server, Oracle, SAP, and HANA, application high availability – typically delivered with HA clustering technology – will become a requirement for more systems, applications, and services throughout the enterprise.”

Cloud and OS agnostic high availability becomes an expected requirement for most applications

“IT teams will look for application HA solutions that are consistent across operating systems and clouds, reducing complexity and improving cost-efficiency. As the need for HA rises, companies running applications in both on-prem and cloud environments, as well as those running applications in both Windows and Linux environments, will look to streamline their application environments with HA solutions that deliver a consistent user interface across all of their environments, along with matching cloud and OS technical support and services from the HA vendor.”

The trend toward migration to SAP HANA is likely to continue in 2024

“The mandatory 2027 migration deadline will push more companies to migrate to SAP HANA. As companies migrate to SAP HANA, there will be an increased need for more sophisticated and flexible high availability and disaster recovery solutions that help them bridge the gap between existing systems and the new, more modern systems that take advantage of SAP HANA’s capabilities. Organizations will look for HA solutions that help them take advantage of emerging technologies and accelerate digital transformation, while not losing the HA and DR capabilities they already rely on.”

Automation becomes more common in high availability and disaster recovery efforts as data and analytics increase complexity

“As the volume and variety of data as well as the channels through which data are collected increase, organizations will require more information about why faults/failures occurred and how to address potential issues. Automation and orchestration tools will play a central role, streamlining root cause analysis, improving intelligent responses, and enhancing disaster recovery processes to further reduce downtime and enhance data availability.”

The focus on data security and compliance will intensify

“The focus on data retention, security, and access controls will intensify, prompting organizations to integrate enhanced security measures deeper into their high availability and disaster recovery solutions, services, and strategies. As the volume and variety of data, as well as the channels through which data are collected and processed, increase, organizations will require more security measures to be baked into their solutions.”

Sophisticated storage and DR strategies will become crucial to the demands of an increasingly dynamic and data-driven business landscape

“As the volume of unstructured data continues to surge, storage solutions are expected to prioritize scalability, tiered performance, and accessibility. Enterprises will also adopt more sophisticated and resilient DR strategies using multiple high availability (HA) nodes, and DR technologies that understand the complexity of tiered storage solutions. Cloud storage is expected to continue its ascendancy, with organizations increasingly relying on scalable and flexible cloud solutions to meet their expanding data requirements. At the same time, a growing number of companies will look to move workloads out of the cloud to on-prem environments in favor of more predictable costs and greater control over their environments.”

Justin Borgman, Co-Founder and CEO at Starburst

All things make a comeback and on-prem storage is having a resurgence

“Companies including Dell have heavily invested in their EMC portfolio. Enterprise customers will continue to recognize that enhancing on-premise storage hardware presents the faster path to mitigating rising cloud expenses. This modernization will allow companies to manage data gravity for on-premise data that cannot be easily relocated, ensuring a more efficient approach.”

Haoyuan Li, Founder and CEO at Alluxio

Hybrid and Multi-cloud Acceleration

“In 2024, the adoption of hybrid and multi-cloud strategies is expected to accelerate, both for strategic and tactical reasons. From a strategic standpoint, organizations will aim to avoid vendor lock-in and will want to retain sensitive data on-premises while still utilizing the scalable resources offered by cloud services. Tactically, due to the continued scarcity of GPUs, companies will seek to access GPUs or specific resources and services that are unique to certain cloud providers. A seamless combination of cross-region and cross-cloud services will become essential, enabling businesses to enhance performance, flexibility, and efficiency without compromising data sovereignty.”

From Specialized Storage to Optimized Commodity Storage for AI Platform

“The growth of AI workloads has driven the adoption of specialized high-performance computing (HPC) storage optimized for speed and throughput. But in 2024, we expect a shift towards commoditized storage. Cloud object stores, NVMe flash, and other storage solutions will be optimized for cost-efficient scalability. The high cost and complexity of specialized storage will give way to flexible, cheaper, easy-to-manage commodity storage tailored for AI needs, allowing more organizations to store and process data-intensive workloads using cost-effective solutions.”

Jimmy Tam, CEO at Peer Software

Active-Passive High Availability Practices Evolve – Active-Active Has its Moment

“Without continuous availability and real-time access to data, businesses risk losing out to competitors, making decisions with inaccurate information, and more. So it is no wonder that CIOs are starting to demand more from their data centers. In the coming 12 months, it is likely that many IT leaders will start to adopt active-active capabilities, improving performance by distributing the workload across several nodes to allow access to the resources of all servers. 

By moving away from active-passive technologies that simply don’t make the most of the available servers and often require manual intervention during outages, CIOs will ensure that data is actionable wherever it resides, is as close as possible to the end-user for performance, and that the load of data processing is spread across all compute and storage nodes whether it be at the edge, in the data center, or in the cloud.”
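One common way to realize the active-active pattern described above is consistent hashing, which spreads keys across every live node and, when a node fails, redistributes only that node's share to the survivors instead of failing over to an idle standby. The sketch below is illustrative only; the class and node names are invented, not a reference to any particular product:

```python
import bisect
import hashlib

class ActiveActiveRouter:
    """Spread requests across all live nodes with a consistent-hash ring."""

    def __init__(self, nodes: list[str], vnodes: int = 100):
        # Each node owns many "virtual" points on the ring for even spread.
        self._ring: list[tuple[int, str]] = []
        for node in nodes:
            for i in range(vnodes):
                bisect.insort(self._ring, (self._hash(f"{node}#{i}"), node))

    @staticmethod
    def _hash(key: str) -> int:
        return int.from_bytes(hashlib.sha256(key.encode()).digest()[:8], "big")

    def node_for(self, key: str) -> str:
        """Route a key to the first ring point at or after its hash."""
        idx = bisect.bisect_left(self._ring, (self._hash(key), "")) % len(self._ring)
        return self._ring[idx][1]

    def remove(self, node: str) -> None:
        """Simulate an outage: only the failed node's keys move elsewhere."""
        self._ring = [(p, n) for p, n in self._ring if n != node]
```

The property worth noting is the last one: removing a node reassigns only the keys it owned, so the remaining nodes keep serving their existing traffic without manual intervention.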

The storage industry will start to productize AI and ML 

“AI and Machine Learning have so much promise, but they’re not being adopted as quickly as anyone in the industry anticipated. There’s a clear reason why: users simply don’t know how to realize the technologies’ full potential. Beyond ChatGPT, which is easy to use and incredibly popular, there’s no real out-of-the-box product for enterprise storage customers. So unless organizations have a data scientist on hand to help them navigate the intricacies of AI and ML, they’re very likely to hold off when it comes to implementing any kind of solution.

This presents a great opportunity for the storage industry and the smart companies are already starting to think about it. Through 2024, we’ll see the beginning of the productization of AI and ML. Ready-to-use packages will be developed so that users can easily understand what the technologies can help them achieve, while being straightforward to set up and run. Then watch, as AI and ML adoption increases.”

Virtual Desktop Infrastructure is here to stay – but much will move back on-premise

“When Covid hit, VDIs were the reason many of us could continue to work. They offered users a flexible, consistent experience from wherever they logged in and became a lynchpin for organizations during the days of lockdown. But there was an issue: the hardware was difficult to get hold of. And the urgency we all became so used to during the pandemic meant there was no time to wait for the supply chain to right itself, so CIOs turned to the cloud. 

Don’t get me wrong, the cloud has clear benefits. It is easy to implement, and it is elastic in nature, quickly responding to and growing with our needs. But it can be very expensive and, because cloud providers tend to charge for each transaction, costs can be difficult to predict. Improved availability in the hardware supply chain will bring about a shift towards migrating highly transactional workloads back on-premise. Unhappy with writing blank checks, CFOs will rightly start to ask CIOs to demonstrate ROI and explain the cost difference between cloud and on-premise.”

JB Baker, Vice President of Marketing & Product Management at ScaleFlux

Sustainable Data Storage Becomes a Priority

“With sustainability rising as an urgent priority across industries, data storage solutions will be under increasing pressure to reduce their environmental impact. Organizations are ramping up investments in energy-efficient technologies to meet emissions requirements and goals. Data storage, projected to account for 14 percent of the global carbon footprint by 2040, will be a key focus area.

To minimize the footprint of the data center, storage leaders will need to look beyond device-level specs and take a solution-wide view. The criteria will expand to encompass data compression, energy expenditure, workload optimization, and more. The goal is to maximize efficiency and minimize power consumption across the storage infrastructure. As sustainability becomes a competitive differentiator, we will see rapid innovation in “green” data storage technologies, architectures, and management techniques. The storage domain will play a critical role in driving the sustainability transformation.”
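One of the solution-wide criteria named above, data compression, is straightforward to quantify: the fraction of bytes saved translates directly into stored capacity, and therefore power, avoided. A small illustrative helper (the function name and default level are our own, not from any vendor's tooling):

```python
import zlib

def compression_savings(payload: bytes, level: int = 6) -> float:
    """Fraction of bytes saved by compressing a payload before storing it.

    Returns a value in [0, 1): e.g. 0.9 means the compressed copy needs
    only 10 percent of the original capacity. Incompressible payloads,
    where zlib's overhead exceeds any gain, are clamped to 0.0.
    """
    if not payload:
        return 0.0
    compressed = zlib.compress(payload, level)
    return max(0.0, 1 - len(compressed) / len(payload))
```

Highly repetitive data such as logs often compresses by 90 percent or more, while already-compressed media saves essentially nothing, which is why workload-aware placement matters alongside compression itself.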

Anand Babu, Co-Founder & CEO at MinIO

Unstructured data becomes a core enterprise challenge

“Over the last few years, we have seen explosive growth in the semi-structured data world (log files, models, snapshots, artifactory code) which has, in turn, driven the growth of object storage.”

“In 2024, we’ll see an enterprise explosion of truly unstructured data (audio, video, meeting recordings, talks, presentations) as AI applications take flight. This is highly ‘learnable’ content from an AI perspective and gathering it into the AI data lake will greatly enhance the intelligence capacity of the enterprise as a whole, but it also comes with unique challenges.”

“There are distinct challenges with maintaining performance at 10s of petabytes. Those generally cannot be solved with traditional SAN/NAS solutions — they require the attributes of a modern, highly performant object store. This is why most of the AI/ML technologies (e.g., OpenAI, Anthropic, Kubeflow) leverage object stores, and why most databases are moving to be object storage centric.”

Jon France, CISO at ISC2

We’ll see an evolution, rather than a revolution of regulations

“The regulatory landscape will continue to stay hot – I think we’ll see more regulations governing AI and privacy in particular, and we’ll likely see more backlash around reporting requirements and a push for agencies to define what should actually be reported and at what thresholds of materiality. However, I don’t see a major overhaul coming. Instead, I think what we’ll see is sectors grappling with the tangible effects of the requirements that have been introduced. We’re no longer looking at these regulations as being on the horizon…in 2024, they’ll have to be adhered to. With this, I hope to see increased harmonization of regulations globally, so that multinational companies don’t run into navigational issues of not knowing which regulations and policies to follow and which don’t apply. We’re starting to see increased communication on a global scale, but we’re not there yet. It may be wishful thinking, but I predict we’ll see major global powers collaborating on what a cyber secure world should look like, and making policy decisions based on those discussions.”

Mark Cassetta, Chief Product Officer at Axiomatics

With recent legislation, the security market is poised to shift focus

“2023 saw a number of notable startups, especially those incorporating Generative AI into their offerings. As quickly as enterprises have started to adopt the technology, legislation has been discussed and shared to try to further protect the US identity and economy. This means that in 2024, just as the mobile (MDM) and cloud (CASB) platform shifts created their own security categories, we will see a security category quickly and formally emerge for Generative AI.”

Giorgio Regni, CTO at Scality

HDDs will live on, despite predictions of a premature death

“Some all-flash vendors prognosticate the end of spinning disk (HDD) media in the coming years. While flash media and solid state drives (SSDs) have clear benefits when it comes to latency, are making major strides in density, and the cost per GB is declining, we see HDDs holding a 3-5x density/cost advantage over high-density SSDs through 2028.

Therefore, the current call for HDD end-of-life is akin to the tape-is-dead arguments from 20 years ago. In a similar way, HDDs will likely survive for the foreseeable future as they continue to provide workload-specific value.”

End users will discover the value of unstructured data for AI

“The meteoric rise of large language models (LLMs) over the past year highlights the incredible potential they hold for organizations of all sizes and industries. They primarily leverage text-based training data. In the coming year, businesses will discover the value of their vast troves of unstructured data, in the form of images and other media.

This unstructured data will become a useful source of insights through AI/ML tooling for image recognition applications in healthcare, surveillance, transportation, and other business domains. Organizations will store petabytes of unstructured data in scalable ‘lakehouses’ that can feed this unstructured data to AI-optimized services in the core, edge and public cloud as needed to gain insights faster.”

Ransomware detection will be the next advancement in data protection solutions

“In recent years, the tech industry has made tremendous strides in protecting data against all manner of threats, including increasingly destructive malware and ransomware. This is exemplified by the rise of immutability in data protection and data storage solutions, especially for backup data.

While data protection and restoration are a major cornerstone that serves as a critical last line of defense in a layered cybersecurity infrastructure, new advancements in AI-generated ransomware detection capabilities will emerge in data protection and storage solutions in 2024.”

Managed services will become key to resolving the complexity of hybrid cloud

“Multi-cloud is a reality today for most enterprises, in their use of multiple SaaS and IaaS offerings from different vendors. However, the use of on-premises and public cloud in a single application or workload has become mired in the complexities of different application deployment models and multiple vendor APIs and orchestration frameworks.

While this has inhibited the powerful agility and cost-reduction promises of the hybrid-cloud model, throughout the coming year, organizations will increasingly leverage the experience and skills of managed service providers (MSPs) to solve these complexity issues and help them achieve business value and ROI.”

Amer Deeba, CEO and Co-Founder at Normalyze

SEC Regulations Will Impact The One Area We Don’t Want to Talk About: Your Data 

“As we know, the new SEC transparency requirements and ruling now require public companies to disclose their cybersecurity posture annually and cyber incidents within four days of determining an incident was material. In 2024, this major policy shift will have a significant effect on one key area: data, forcing businesses to think about security with data at the forefront. In response, enterprises will dedicate both effort and budget to support the SEC’s data-first strategy – implementing best practices that assure shareholders that their company’s most valuable asset – data – is protected. In 2024, companies will need to discover where their data resides and who can access it, while proactively remediating risks that have the highest monetary impact in the event of a breach. When faced with this dilemma, companies will lean on automation, specifically end-to-end, automated solutions that center on a holistic approach.

The recent ALPHV/BlackCat and MeridianLink breach underscores the importance for businesses of understanding exactly what data they have, where it lives, and how it is protected. In order to answer critical questions with confidence in the event of a breach and lower the probability of a breach, companies need to build better defenses. The risk of exposure/tagging is not novel, but with these new disclosure requirements, securing the target of such attacks – the data – has gone from a good first practice to an absolute necessity. Being proactive means that if a breach does occur, you can respond quickly, answer these critical questions, be in compliance with the SEC requirements, and, most importantly, recover. To summarize, in 2024 we’ll see organizations separated by their approach to data security. With these regulations, there is no alternative. Organizations must effectively remediate risks to lucrative sensitive data before breaches occur. Only this will allow organizations to respond decisively and confidently if an incident occurs.”

To Address the Influx of Data, Security Teams Must Approach Data Security Like a Team Sport

“As AI booms, the industry is facing increasing complexity and an influx of data, and companies are grappling with how to keep it all secure. At the height of AI technology adoption, companies will need to refocus in 2024 on what matters most – protecting their data as it gets used by machine learning models and new AI technologies. Businesses need to change their approach: success in the coming year, for organizations big and small, will come down to how they do so. The challenges this will bring require the depth and efficiency of AI and automated processes to ensure protection of cloud-resident sensitive data. As demands around data change in 2024, organizations will need to invest in their security and cloud ops teams, approaching data security like a team sport and building more efficient shared responsibility models to better protect data. These teams can then regain visibility of all data stores within an enterprise’s cloud or on-premises environment and trace possible attack paths, overprovisioned access, and risks that can lead to data exposure. Only by rethinking their approach to data, ensuring the right permissions and privileges, and efficiently implementing AI will companies enable their teams to be successful in 2024.”

Raul Martynek, CEO at DataBank

The AI cloud wars between hyperscalers will take center stage

“With Google’s latest investment in Anthropic, together with Microsoft’s stake in OpenAI as well as Nvidia’s support for GPU-as-a-service players like CoreWeave, we are beginning to see the emerging outlines of a new phase of competition in the public cloud driven by differentiated AI GPU clouds. In 2024, these new competition dynamics will take center stage as big tech seeks to outcompete each other in the race to realize artificial general intelligence. Nvidia will emerge as a giant competing on the same level as the ranks of Google, Microsoft and AWS. With its cutting-edge GPUs, I see Nvidia emerging as a very capable threat to big tech’s dominance in the public cloud space.”

Miroslav Klivansky, Global Practice Leader at Pure Storage

“In 2024, we will start to creep into GenAI’s trough of disillusionment (Gartner’s Hype Cycle defines this as a period when interest wanes as experiments and implementations fail to deliver) and will eventually industrialize the use of AI. As we shift away from the hype brought on by AI tools with a consumer-friendly UX, we’ll see companies better understand, invest in, and apply AI-specific solutions to their business needs.”

“In 2024, we can expect AI to optimize energy efficiency across energy-hungry industries (e.g., manufacturing) as it will be integral in optimizing the process and money savings. Deploying LLMs for inference at scale will also lead to surprisingly high power bills, leading companies to review their data center efficiency strategies and ESG initiatives.”

“One of the industries most ripe for innovation with the help of AI is healthcare. Not only does it have the potential to improve diagnostics, but it can also improve medical devices and automate administrative tasks. The latter will likely be disrupted first because these systems are electronically managed and their tasks can be quickly automated.”

The rate of AI innovation will slow down in the next year

“Over the last several years, AI innovation has been fueled by information sharing and open-source development. However, as companies increasingly invest in AI to give them a competitive edge and regulatory bodies seek to unpack the potential around AI’s broader impact, companies will likely be more aggressive when it comes to protecting their IP.”

Kurt Markley, Managing Director, Americas at Apricorn

Cyber Resilience

“The rapid growth of AI is helping bad actors more quickly create and deploy ransomware tools across a host of sectors. It’s been reported that generative AI helped to double ransomware attacks against sectors such as healthcare, municipalities and education between August 2022 and July 2023. Also concerning is the rate at which organizations choose to pay a ransom in order to secure their data. One research report shows that nearly half of respondents have a security policy in place to pay a ransom, with 45 percent admitting that bad actors still exposed their data even after paying the ransom.

Ransomware isn’t just a threat; in many instances it’s an inevitability. No data is too low-value and no organization is too small. The alarmingly high rate of paying a ransom and still having data exposed means that IT leaders have to take back control and put practices in place to protect their data and save their capital budget. It means that IT leaders can’t afford to slack off regarding cyber resilience.

While almost all IT leaders say they factor in data backups as part of their cyber security strategies, research we conducted earlier this year found that only one in four follow a best practice called the 3-2-1 rule, in which they keep three copies of data on two different media types, one of which is stored offsite and encrypted. Furthermore, this same research found that more than half of respondents kept their backups for 120 days or less, far shorter than the average 287 days it takes to detect a breach.

Given the likelihood that AI-driven ransomware will impact far higher numbers of organizations, it will be more important than ever in 2024 that organizations have a strong cyber resiliency plan in place that relies on two things: encryption of data and storage of it for an appropriate amount of time. IT leaders need to embrace the 3-2-1 rule and must encrypt their own data before bad actors steal it and encrypt it against them.”
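The 3-2-1 rule described above lends itself to an automated check. Below is a minimal sketch in Python; the inventory format and media names are illustrative assumptions, not the API of any particular backup product:

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    media: str        # e.g. "disk", "tape", "cloud-object" (hypothetical labels)
    offsite: bool     # stored away from the primary site
    encrypted: bool

def satisfies_3_2_1(copies) -> bool:
    """Check the 3-2-1 rule: at least 3 copies, on at least 2 media
    types, with at least one copy that is both offsite and encrypted."""
    return (
        len(copies) >= 3
        and len({c.media for c in copies}) >= 2
        and any(c.offsite and c.encrypted for c in copies)
    )

inventory = [
    BackupCopy("disk", offsite=False, encrypted=False),        # primary copy
    BackupCopy("tape", offsite=False, encrypted=True),         # local backup
    BackupCopy("cloud-object", offsite=True, encrypted=True),  # offsite backup
]
print(satisfies_3_2_1(inventory))
```

A policy check along these lines can run alongside backup jobs so that drift from the rule is caught before an incident rather than after one.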

Data Management Within Security Policy

“Data is no longer a byproduct of what an organization’s users create; it is the most valuable asset organizations have. Businesses, agencies and organizations have invested billions of dollars over the past decade to move their data assets to the cloud; the demand is so high that Gartner expects that public-cloud end user spending will reach $600B this year. These organizations made the move to the cloud, at least in part, because of a perception that the cloud was more secure than traditional on-prem options.

It’s estimated that 30 percent of cloud data assets contain sensitive information. All that data makes the cloud a juicy target and we expect that 2024 will continue to show that bad actors are cunning, clever and hard-working when it comes to pursuing data. The industry has seen triple the number of hacking groups attacking the cloud, with high-profile successes against VMware servers and the U.S. Pentagon taking place this year.

As IT teams spend more on moving and storing data in the cloud, organizations must spend the next 12 – 24 months auditing, categorizing and storing it accordingly. They need to gain deeper visibility into what data they have stored in the cloud, how data sets relate to one another, and whether the data is still meaningful to the operations of the organization. In doing so, they are advised to create specific security policies about how, where and for how long they store their data. These policies, when actively enforced, will help organizations better protect their most valuable asset – their data.”

Brian Land, VP of Global Sales Engineering at Lucidworks

Navigating the Privacy Terrain

“In 2024, brands are gearing up to face new challenges around privacy and ethics with the end of third-party cookies and the advent of new large language models (LLMs). This means they’ll be shaking things up in how they market and handle consumer data privacy. For example, they’ll have to find new methods for collecting user data and be more transparent about how they’re collecting that data. And when it comes to managing LLMs, they will adopt advanced encryption and secure data storage practices to safeguard user information. Rest assured, they’re working hard to get it right – making sure they follow the rules while still keeping consumers engaged and happy.”

Matt Watts, CTO at NetApp

There’s No Such Thing as Perfection

“If your company thinks the cloud will ease every IT woe your team is experiencing, you’ll want to think again. It won’t and it can’t. Migrations in hybrid multicloud environments strain both budgets and team resources and you’ll need to find ways to optimize operations both as you move to the cloud and every day thereafter. According to a recent report on data complexity, approximately 75 percent of global tech executives in the throes of cloud migration note they still have a sizable number of workloads remaining on-premises (between 30 percent and 80 percent). For most companies, maintaining an increasingly complex IT infrastructure will remain challenging as cost pressures mount alongside demands for greater innovation. In 2024, we’ll see companies abandon unrealistic ideas of creating the “perfect” cloud environment as they move toward an intelligent data infrastructure (IDI). IDI is the union of unified data storage with fully integrated data management that delivers both security and observability with a single pane of glass interface so you can store, control, and use data more easily no matter what applications, cloud services, or databases you’re using. Companies that choose IDI will experience greater agility in adapting to market conditions. With a more nimble infrastructure, IT can spend its time on innovation, skill building, and development that align with business priorities rather than simply maintaining their cloud environments.”

The Uncomfortable Truth: You’ve Already Had a Breach

“Today’s cyber threat landscape requires constant vigilance as you try to guess who, when, and how the next bad actor will attack. Whether an employee clicks on the wrong link, or an organized gang of cyber criminals is the culprit, you’ll need the right tools to quickly alert you of an attack so you can recover quickly. And, while preventing attacks is always the goal, the ability to keep bad actors out indefinitely is now a statistical anomaly. In fact, it’s predicted that by 2031 ransomware attacks will occur every 2 seconds at a cost of $265 billion each year. Because of this, 87 percent of C-suite and board-level executives see protecting themselves from ransomware as a high, or top, priority. And stolen data isn’t your biggest concern after an attack. It’s the lost productivity and business continuity as systems are repaired and data restored to get your business up and running again. In 2024, we’ll see more investment in IT security that ensures systems are secure by design and keeps business disruption to a minimum when there is an attack. Security infrastructures that include immutable data backups will add to peace of mind and mitigate downtime when cyber incidents are investigated.”

James Beecham, Founder and CEO at ALTR

While AI and LLMs continue to increase in popularity, so will the potential danger

“With the rapid rise of AI and LLMs in 2023, the business landscape has undergone a profound transformation, marked by innovation and efficiency. But this quick ascent has also given rise to concerns about the utilization and the safeguarding of sensitive data. Unfortunately, early indications reveal that the data security problem will only intensify next year. When prompted effectively, LLMs are adept at extracting valuable insight from training data, but this poses a unique set of challenges that require modern technical solutions. As the use of AI and LLMs continues to grow in 2024, it will be essential to balance the potential benefits with the need to mitigate risks and ensure responsible use. 

Without stringent data protection over the data that AI has access to, there is a heightened risk of data breaches that can result in financial losses, regulatory fines, and severe damage to the organization’s reputation. There is also a dangerous risk of insider threats within organizations, where trusted personnel can exploit AI and LLM tools for unauthorized data sharing whether it was done maliciously or not, potentially resulting in intellectual property theft, corporate espionage, and damage to an organization’s reputation.  

In the coming year, organizations will combat these challenges by implementing comprehensive data governance frameworks, including data classification, access controls, anonymization, frequent audits and monitoring, regulatory compliance, and consistent employee training. Also, SaaS-based data governance and data security solutions will play a critical role in keeping data protected, as they enable organizations to fit these tools into their existing frameworks without roadblocks.”

With increased data sharing, comes increased risk

“Two things will drive an increased need for governance and security in 2024. First, the need to share sensitive data outside of traditional on-premises systems means that businesses need increased real-time auditing and protection. It’s no surprise that sharing sensitive data outside the traditional four walls creates additional risks that need to be mitigated, so next year, businesses need – and want – to ensure that they have the right governance policies in place to protect it.

The other issue is that new data sets are starting to move to the cloud and need to be shared. The cloud is an increasingly popular platform for this, as it provides a highly scalable and cost-effective way to store and share data. However, as data moves to the cloud, businesses need to ensure that they have the right security policies in place to protect data, and that these policies are being followed. This includes ensuring that data is encrypted both at rest and in transit, and that the right access controls are in place to ensure that only authorized users can access the data. 

In 2024, to reduce these security risks, businesses will make even more of an effort to protect their data no matter where it resides.”

Rodman Ramezanian, Global Cloud Threat Lead at Skyhigh Security

Data Security and Privacy Concerns

“Organizations are increasingly concerned about the security and privacy of their data in the cloud. On-premises infrastructures tend to give organizations more control over their data.”

Andrew Hollister, CISO & VP Labs R&D at LogRhythm

Generative AI adoption will lead to major confidential data risks

“The cybersecurity landscape will confront a similar challenge with generative AI as it did previously with cloud computing. Just as there was initially a lack of understanding regarding the shared responsibility model associated with cloud computing, we find ourselves in a situation where gen AI adoption lacks clarity. Many are uncertain about how to effectively leverage gen AI, where its true value lies, and when and where it should not be employed. This predicament is likely to result in a significant risk of confidential information breaches through gen AI platforms.”

Angel Vina, CEO & Founder at Denodo

Organizations Will Need to Manage Cloud Costs More Effectively

“As businesses continue to shift data operations to the cloud, they face a significant hurdle: the relentless, unsustainable escalation of cloud data expenses. For the year ahead, the mandate is not just to rein in these rising costs but to do so while maintaining high-quality service and competitive performance. Surging cloud hosting and data management costs are preventing companies from effectively forecasting and budgeting, and the previously reliable costs of on-premises data storage have become overshadowed by the volatile pricing structures of the cloud.

Addressing this financial strain requires businesses to thoroughly analyze cloud expenses and seek efficiencies without sacrificing performance. This involves a detailed examination of data usage patterns, pinpointing areas of inefficiency, and consideration of more cost-effective storage options. To manage cloud data costs effectively, firms need to focus on the compute consumed by queries and the associated data egress volumes, tabulating the usage of datasets, and optimizing storage solutions. These efforts are enhanced by adopting financial operations (FinOps) principles, which blend financial accountability with the cloud’s flexible spending model.

By regularly monitoring expenditures, forecasting costs, and implementing financial best practices in cloud management, organizations can balance cost savings and operational efficacy, ensuring that their data strategies are economically and functionally robust. In 2024, we will see a significant rise in the use of FinOps dashboards to better manage cloud data charges.”
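The per-dataset tabulation of query compute and egress described above can be sketched in a few lines of Python. The unit rates and the log format here are illustrative assumptions, not real provider pricing:

```python
from collections import defaultdict

# Illustrative unit rates -- real cloud pricing varies by provider and region.
COMPUTE_RATE_PER_SEC = 0.0001   # $ per second of query compute (assumed)
EGRESS_RATE_PER_GB = 0.09       # $ per GB transferred out (assumed)

def tabulate_costs(query_log):
    """Aggregate estimated cost per dataset from a log of
    (dataset, compute_seconds, egress_gb) records."""
    totals = defaultdict(float)
    for dataset, compute_s, egress_gb in query_log:
        totals[dataset] += (compute_s * COMPUTE_RATE_PER_SEC
                            + egress_gb * EGRESS_RATE_PER_GB)
    return dict(totals)

log = [
    ("sales", 1200, 5.0),   # 1200 s of compute, 5 GB egress
    ("sales", 300, 1.0),
    ("logs", 9000, 0.2),
]
costs = tabulate_costs(log)
print(costs)
```

Feeding a breakdown like this into a FinOps dashboard makes it clear which datasets drive spend and where egress, rather than compute, is the dominant cost.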

Kevin Keaton, CIO/CISO at Red Cell Partners

Shifting Cyber Regulations Will Change the Status Quo

“The new SEC rules on cyber that go into effect in December for public companies are already causing, and will continue to cause, significant changes in how boards operate with regard to cyber risks – and I expect that navigating these rules will be the biggest cybersecurity challenge businesses face in 2024.”

Eric Herzog, Chief Marketing Officer at Infinidat

Triple play of cyber resiliency, detection, and recovery to create an overall corporate cybersecurity strategy 

“The convergence of cyber resilience, detection, and recovery on a single storage platform is fueling a trend for 2024 around higher levels of cybersecurity for enterprise storage. Reliance solely on backup is no longer enough to secure storage systems. Primary storage has become a main target of cybercriminals for the most insidious and hard-to-detect ransomware and malware attacks that wreak costly havoc on enterprises. Combining resilience (the ability to instill defensive security measures to repel attacks), detection (the ability to know when data is corrupted and whether a known good copy of data is free of ransomware or malware), and recovery (the ability to bounce back) from cyberattacks is the key to hardening storage infrastructure.

This trend to better secure storage systems is highly important because of the continued exponential increase in cyberattacks against enterprises of all types and in all industries.  Cybercrime is predicted to grow from $8 trillion worldwide in 2023 to more than $10 trillion in 2025. Cybercriminals attempted nearly 500 million ransomware attacks last year, marking the second-highest year ever recorded for ransomware attacks globally, and in the 2023 Fortune CEO survey of “Threats” to their companies, CEOs named cybersecurity their #2 concern. Ransomware attacks also represented 12 percent of breaches of critical infrastructure in the last year.

The convergence of cyber resilience, detection, and recovery on an integrated storage platform is an advancement over the past, commonly-used approach of disparate tools and technologies trying to combat cyberattacks in silos. Improving the security defenses of cyber storage for enterprises eliminates the vulnerabilities of the silos. It makes the cyber capabilities more air-tight and ensures a rapid recovery of data within minutes to thwart cybercriminals, nullifying ransom demands and preventing (or minimizing) any downtime or damage to the business. Ignoring this trend in 2024 could greatly harm an enterprise, especially one that doesn’t even know cybercriminals are lurking in their data infrastructure, no matter how good their other cybersecurity defenses are.”

Stacy Hayes, Co-Founder and EVP at Assured Data Protection

More channel players to use specialists for managed services 

“The managed services model will become increasingly attractive to traditional VARs in 2024, especially with more and more businesses looking to buy cloud and IT services on a usage basis. But making the transition from a traditional VAR to a provider of managed services is easier said than done. It’s not that VARs aren’t capable of diversifying, far from it; it’s just that the switch requires a fundamental shift in the way VARs do business, and that isn’t something you can change overnight. These large organizations are not built for this new world model. The in-house build and integration of new technology and go-to-market models takes too long and is too expensive to implement. VARs simply don’t have the people, the flexibility or the know-how. With the economic headwinds as they are, OPEX is king and no one has the CAPEX or the appetite for big in-house builds.

It is becoming increasingly difficult for VARs to provide a large portfolio of products and services to the standards customers demand. The speed at which the market moves and the reliance on data add to the greater demands from customers. It is evident channel businesses are struggling to deliver what their customers want, whether it be on-premises or in the cloud. It is a common topic, and one I believe means VARs need to clearly understand what they can deliver themselves and what they need to outsource. Outsourcing, or white labelling, is a great way to deliver a high-quality and diverse portfolio to customers.

MSPs that have the know-how to use utility-based models effectively, that can execute immediately, that have experts in the space, and that deliver services tailored for the vendor, customer and end user will be the partners of choice for VARs in 2024.”

Brian Dutton, Director, US Sales and Client Services at Assured Data Protection

More businesses to spend upfront for managed services to beat inflation 

“Businesses are becoming more cost-conscious as prices for cloud and SaaS services keep rising in line with inflation. Every time the large vendors and hyperscalers pass on costs to the customer, company CFOs and finance directors find themselves asking IT the question, ‘where can we cut costs?’ This is creating a dilemma for IT teams, who are left wondering how to keep the lights on and execute new digital and cloud strategies on a smaller budget. This is why so many have switched to an OPEX model that covers core capabilities, including DR and backup, based on a consumption model that is paid for in monthly installments. It has allowed them to cut CAPEX, operate on a per-TB model as opposed to wasting valuable data center resources, and focus their efforts on business priorities.

The impact and cost savings are tangible, but it’s also thrown a lifeline to SMBs and government organizations that simply don’t have the budget or infrastructure to support investment in new DR and backup solutions. The managed service option has become the preferred choice for large enterprises that have to prioritize transformation projects, and for SMEs, local schools and municipalities with budget limitations. We expect more businesses to adopt the utility-based model that managed service providers offer for cloud-based data management. It lightens the load on teams, while reducing risk and guaranteeing uptime and business continuity in the event of a disaster, data breach or ransomware attack. Another byproduct of this trend we’ve experienced is companies prepared to pay for services upfront, locking costs in for up to 6-12 months, or longer, to protect themselves against inflation. This makes financial sense, especially if you’re cash rich now and want to ensure your data is protected over the long term, when market volatility can affect prices elsewhere. We expect this to become the norm next year and for the foreseeable future.”

Andrew Eva, Director, CIO at Assured Data Protection

Scope three emissions compliance set to drive uptake of disaster recovery managed services 

“Sustainability is an issue that impacts every part of the economy, and increasingly, the technology sector is being held to account for its carbon emissions. Until recently, organizations have mostly had to concern themselves with two key emission classifications: scope one – emissions the organization is directly responsible for, and scope two – indirect emissions, such as electricity. Now, though, we’re seeing the impact of scope three emissions being felt. That is, all other emissions associated with an organization’s activities, including its supply chain. While scope three emissions aren’t yet legally enforceable, they are being widely adopted by large organizations, as legislation is inevitable and there’s a widespread desire to get ahead of the issue. We’re now seeing their impact filter down to smaller organizations.

This is an issue for the DR sector, and organizations that are leaders in sustainability are recognizing the challenge and the value of outsourcing this function to an MSP. By eliminating the need for data backup via a second site, which is costly to operate, doesn’t always utilize the latest power-efficient hardware, and is responsible for significant carbon emissions, ESG compliance becomes a lot more manageable. There’s also recognition that this isn’t simply offloading the problem, because MSP DR solutions achieve economies of scale by servicing multiple organizations via a shared facility, making them carbon-efficient for customers. Given the rate at which scope three is permeating, we expect to see more organizations adopt outsourced DR services. Both existing and future business for MSPs depends on helping customers and partners achieve ESG compliance.”

Eric Herzog, Chief Marketing Officer at Infinidat

Freeing up money from storage for AI and other critical IT projects 

“Dramatically reducing the costs of enterprise storage frees up IT budget to fund other significant new projects, such as AI projects, cybersecurity-related projects, or other strategic activities. This trend for 2024 will play a pivotal role in enterprises where there will be pressure to accelerate AI projects for the next stage of digital transformation, as well as to improve cybersecurity against more sophisticated AI-driven cyberattacks. With IT budgets projected by Gartner to grow by 8 percent in 2024, funding for these new projects will need to come from somewhere.

A smart approach to shifting IT spending internally that is taking hold is to remove costs from storage while simultaneously improving it. It sounds like a paradox at first sight, but the trend in 2024 will be to take advantage of three key factors that make this “paradox” an actual reality: (1) storage consolidation onto a single, scalable, high-availability, high-performance platform, (2) autonomous automation, and (3) pay-as-you-go, flexible consumption models for hybrid cloud (private cloud and public cloud) implementations of storage.

Storage consolidation, for example, replaces 20 storage arrays with one or two storage arrays that can scale into the multi-petabyte range with 100 percent availability guaranteed. Having fewer arrays immediately lowers costs in terms of IT resources, storage management, power, cooling, and space. These cost savings can be used for critical IT projects. Autonomous automation simplifies storage, intelligently automating processes and the handling of applications and workloads. Virtually no human intervention is needed. The storage systems run themselves, enabling a set-it-and-forget-it mode of operations. IT staff can focus on more value-adding activities and on building AI capabilities into the infrastructure and across the enterprise. Leveraging flexible consumption models to pay for storage, as needed and as efficiently as possible, lowers CAPEX and OPEX, freeing up money for these other IT projects. An extension of this trend is also to invest in enterprise storage solutions that deliver an ROI in one year or less, optimizing the budget.”
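The consolidation arithmetic above can be sketched with a quick calculation. The cost categories and dollar figures below are hypothetical illustrations for the sketch, not vendor data:

```python
# Hypothetical per-array annual operating costs (USD); real figures vary
# widely by vendor, site, and array class.
COST_PER_ARRAY = {"management": 40_000, "power_cooling": 12_000, "floor_space": 6_000}

def annual_opex(num_arrays: int) -> int:
    """Total yearly operating cost for a fleet of storage arrays."""
    return num_arrays * sum(COST_PER_ARRAY.values())

# Consolidating 20 arrays onto 2 platforms frees up budget for other IT projects.
before, after = annual_opex(20), annual_opex(2)
savings = before - after
print(savings)  # 1044000
```

Even with deliberately rough numbers like these, the point holds: operating cost scales with array count, so consolidation directly frees budget for AI and cybersecurity projects.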

The blossoming of green storage 

“Having more storage capacity in the same form factor and having fewer storage arrays have become part of an ongoing trend to make enterprise storage, along with data centers in general, more environmentally friendly. Fewer arrays mean a smaller carbon footprint, less to cool with the use of coolants, and less to recycle, translating into a much lower impact on the environment. Furthermore, consolidating multiple arrays into a single storage platform means using less energy, fitting a strategy to advance sustainability. Storage upgrades also bring energy savings, which translate into cost savings in light of the rising cost of energy. Consolidation also means much improved, more efficient utilization of space.

With the higher cost of energy, the need to reduce floorspace costs, the drive to lower carbon emissions, and the desire to reduce the environmental impact of recycling storage arrays, the new year will bring an increase in audits of enterprise storage infrastructure and more intense identification of inefficiencies and waste in the storage estate.

The blossoming of green storage will be demonstrated in 2024 by reduced energy consumed to power storage systems, while still protecting data. We’ll see bigger capacity systems being installed that take up less space than traditional arrays. Software-defined storage increases storage utilization and reduces overprovisioning of storage. Also, part of green storage will be managing data as part of a data lifecycle strategy for more agility and better compliance with sustainability standards.

As part of broader green IT initiatives, storage will come under greater scrutiny in 2024 to boost efficiency and conservation. Enterprises will increasingly turn to AI for the capabilities to optimize storage capacity and streamline management, resulting in more efficiency. Gartner predicts that by 2025 half of all data centers will deploy AI/ML to increase efficiency by up to 30%. AI will also be used to optimize cooling. More enterprises that use water-based cooling systems to cool their data centers where storage systems reside will be compelled to get on a path to become “water positive,” replenishing more water than they consume. Green IT is changing the way storage administrators need to think about the future of enterprise storage.”

Seamless hybrid cloud integration

“The shift to hybrid cloud in the enterprise has been ongoing for years, as enterprises figured out that a balanced approach to leveraging the public cloud and maximizing on-premises private cloud data infrastructure makes the most business sense. But what is new about this trend is how dominant the hybrid cloud has become for enterprise storage. New capabilities have made it extremely easy for enterprises to manage on-prem private cloud storage and public cloud storage as one integrated, software-defined infrastructure, as if the public cloud is just another “array” identified in the software’s user interface. Advancements in the past year to make managing hybrid cloud storage even simpler have unlocked the full-scale embrace of this approach to storage, especially for large enterprises.

Hybrid cloud has also become easier to scale. If a large enterprise needs to expand capacity quickly because of an unexpected burst in data traffic, it can scale fast on a single, software-defined, multi-petabyte storage system that is on-prem. It can obtain a cloud-like experience while leveraging its own infrastructure, without any downtime or complexity. At the heart of why hybrid cloud is so strong, and so attractive, are cost and control. Enterprises can keep costs lower by using an on-prem storage system after consolidating many arrays into one or two systems. They also don’t have to pay the hidden costs of moving data back and forth from the public cloud. They can use the public cloud for the use cases that are most appropriate, such as archived data, backup data, disaster recovery, or DevOps. Simultaneously, enterprises keep better control of their data by having it on-prem, ensuring compliance, especially with recent regulations about data governance and data privacy.”

Opening up the potential of containers in the hybrid cloud data center

“It’s estimated that approximately 90 percent of global organizations will be running containerized applications in production by 2026, according to Gartner analysts – up from 40 percent just two years ago. By 2026, it’s also estimated that as much as 20 percent of all enterprise applications will run in containers, more than doubling the percentage from 2020. The adoption and expanded usage of containers are definitely on the upswing. This multi-year trend is gaining momentum for 2024 because of the increasing need for enterprises to innovate faster than ever to meet evolving customer expectations. The point is that enterprises are becoming more digitally focused in order to better compete.

Containers, which exemplify a cloud-native approach, provide a cost-efficient way to automate the deployment of modern applications – and do it at scale – while making them portable, environment-agnostic, and less resource-dependent to achieve cost savings. Consequently, the rate at which new applications are being developed is monumental. IDC reported that by the end of 2023, roughly 500 million new “logical applications” will have been created – a number equivalent to the total number of applications developed over the past four decades.

According to the latest available Cloud Native Computing Foundation (CNCF) survey, 44 percent of respondents are already using containers for nearly all applications and business segments, and another 35 percent say containers are used for at least a few production applications.

More attention is being paid to thinking about the infrastructure – particularly the enterprise storage infrastructure – that supports containers. Storage is critical in the world of containers. The challenge with this trend is to pick the right enterprise storage, especially with enterprises globally needing to scale container environments into petabytes.

CSI is the standard for external primary storage and backup storage for container deployments, and it is becoming the default for Kubernetes storage environments and other container runtimes. Applications, workloads, and environments are transforming around Kubernetes, which has emerged as the de facto standard for container orchestration. It’s in your favor to work with an enterprise storage solutions provider that aligns with the CSI standard, because the container world is moving quickly and the versions of Kubernetes and the associated distributions get updated every three to six months.”

Skills gap in storage calls for an increase in automated storage

“A skills gap across the data center and, particularly, in enterprise storage has emerged, and the trend is that this gap is getting wider. Fewer IT professionals are choosing to specialize in storage, yet enterprise storage capacity continues to grow exponentially. While this makes storage administrators more valuable at this time, an increasing number of enterprises are having trouble staffing the roles to manage traditional, legacy storage in support of applications, workloads, and the entire data infrastructure. The shortage of qualified IT professionals creates a precarious situation for the future of many enterprises. Therefore, the trend in the enterprise space is to turn to AI-equipped autonomous automation of enterprise storage.

With autonomous automation going mainstream in the enterprise space, CIOs and IT leaders can take a “set it and forget it” approach to storage. They can reduce the risk of the skills gap jeopardizing their ability to have an always-on, reliable storage infrastructure. This approach, of course, requires a very simple user interface for the average IT professional to be able to manage the storage, whether to increase capacity, see insights into the performance of the storage system, or execute a rapid recovery from a cyberattack. At the same time, the use of autonomously automated storage frees up valuable IT headcount to be utilized in other areas of the datacenter and enterprise software environments.

The increase in cyberattacks has exposed the skills gap even more because enterprise storage has become the new frontier for pressure-testing the merging of cybersecurity and cyber resilience. If storage continues to be done in the traditional ways, enterprises will continue to be held ransom by bad actors unleashing ransomware. However, by automating cyber detection with ML-tuned algorithms, the skills gap gets plugged with new capabilities, making the normal IT person look like a “storage superstar.” The trend is toward self-directed, self-adjusting autonomous automation of enterprise storage to reduce the risks associated with any major skills gap.”

Redefining the user experience for enterprise storage

“For enterprise storage, user experience is no longer only about the graphical user interface (GUI). While the GUI is still important and should be as easy to use as possible, the scope of user experience has broadened to include elements that are essential today: guaranteed service level agreements (SLAs), white-glove service, and professional services. Enterprises are not only looking for a box on which to run their applications and workloads; they are increasingly looking for excellence in service and support to be built into their storage solution.

The trend is for enterprises to opt for the storage vendor that offers the best support, best SLAs, and best proactive professional services, yet at a low cost or for no charge. It redefines the expectations for user experience. Enterprises want to know they have an advocate within the storage provider who can lend their expertise to solve challenges for the customer. They want to know they can get L3 support direct on a moment’s notice. They want to have confidence in the integration capabilities of a professional services team.

They see this all as added value, and it has become either a dealmaker or a deal-breaker for many enterprises that are reevaluating their choice of storage solution vendor based on these criteria. User experience has been transformed into the total experience for the customer.”

Graham Russell, Market Intelligence Director at Own Company

More organizations will embrace the ongoing adoption of cloud and SaaS

“The SaaS revolution is turning industries into tech playgrounds: from healthcare to finance, the widespread embrace of Software as a Service applications has been a game-changer. But while SaaS may already seem ubiquitous, in reality many organizations have been slow to embrace it. 2024 is the year they will likely catch on. This shift will result in a broader market for SaaS providers, opening up new opportunities for growth and innovation. And with this increased adoption comes a greater need for data protection and cybersecurity measures. As organizations entrust mission-critical information to SaaS applications, the potential consequences of data loss and corruption become more significant.”

AI adoption will drive data breaches  

“As the adoption of AI continues to skyrocket, the risk of data breaches increases. The sophistication and reach of AI can inadvertently expose vulnerabilities in cybersecurity defenses, making organizations more susceptible to malicious attacks and unauthorized access.

This inevitable intersection of AI and data breaches is set to redefine the data protection and cybersecurity landscape in the near future. The silver lining? It will propel a renewed and intensified focus on data security issues. With each headline-grabbing breach, businesses are becoming increasingly vigilant about the safety of their business data. Organizations will be more focused than ever on being compliant with – and demonstrating compliance with – regulatory standards.”

AI adoption will prompt greater focus on data hygiene

“As the adoption of AI continues its rapid ascent, the spotlight on data hygiene is poised to become even more intense. AI’s voracious appetite for high-quality, accurate data makes the concept of data cleanliness a critical factor in unleashing the true potential of AI applications.

In response to this need for impeccable data, a notable trend is the strategic use of backup files. Traditionally seen as a safety net for data recovery, backup files are now being leveraged as a valuable resource for training and refining AI and machine learning models. These files, enriched with historical and real-world data, serve as a goldmine for organizations looking to enhance the depth and breadth of their AI algorithms.

Incorporating backup files into AI and machine learning models allows organizations to simulate diverse scenarios, ensuring that the algorithms are robust and adaptable to real-world complexities. This approach not only optimizes the performance of AI applications but also enhances the accuracy of predictions and decision-making processes.”

Organizations will pivot to a ‘platform of choice’ at the core of their tech stacks

“In 2024, organizations will strategically opt for a ‘platform of choice’ that will serve as the center of their tech stack. This shift will help businesses move away from the fragmented approach of using multiple vendors and applications, towards a more streamlined and integrated tech ecosystem. As a result, organizations will go ‘all in’ with a platform and seek to use applications that are built on that platform to achieve greater efficiency and cost savings. This ‘platform of choice’ approach will go beyond a mere technological preference. As organizations prioritize applications built natively on the chosen platform, they are hoping to also minimize the number of vendors and applications in their tech stacks, streamline their workflows, and gain increased negotiating power and potentially lower costs associated with managing multiple solutions.”

Seth Batey, Data Protection Officer and Senior Managing Privacy Counsel at Fivetran

IT and data management will play a crucial part in bolstering ESG programs in 2024

“IT and data management will play a crucial part in bolstering ESG programs in 2024. While ESG and the criteria for each prong, including what investors or customers look for, may differ, strong retention policies support a company’s effort to satisfy the ‘E’ prong, and strong privacy practices support the ‘S’ prong. Having strong IT and data management, including robust data classification and inventory, is necessary to implement retention policies and other privacy safeguards that can be easy wins for an ESG program.”

Steve Stone, Head of Rubrik Zero Labs at Rubrik

The accelerating data explosion will force a security strategy rethink

“In 2024, organizations will face a stiffer challenge in securing data across a rapidly expanding and changing surface area. One way they can address it is to have the same visibility into SaaS and cloud data as they have in their on-premises environments, in particular with existing capabilities. And that will be a major cybersecurity focus for many organizations next year. More will recognize that the entire security construct has shifted – it’s no longer about protecting individual castles but rather an interconnected caravan.”

AI will dominate the cybersecurity conversation

“Both attackers and defenders will step up their use of AI. The bad guys will use it more to generate malware quickly, automate attacks, and strengthen the effectiveness of social engineering campaigns. The good guys will counter by incorporating machine learning algorithms, natural language processing, and other AI-based tools into their cybersecurity strategies.

I believe there is almost no scenario where AI-driven deepfakes won’t be part of the pending U.S. Presidential election, among others. Even if the technology isn’t applied, it’s within the realm of possibility that AI deepfakes will be blamed for gaffes or embarrassing imagery.

We’ll also hear more about the role AI can play in solving the persistent cybersecurity talent gap, with AI-powered systems taking over more and more of the routine operations in security operations centers. 

When it comes to cybersecurity in 2024, AI will be everywhere.”

CISOs (and others) will feel pressure from recent government actions

“In late October, the Securities and Exchange Commission announced charges against SolarWinds Corporation, which was targeted by a Russian-backed hacking group, beginning in 2019, in one of the worst cyber-espionage incidents in U.S. history, and its chief information security officer, Timothy G. Brown. The complaint alleged that for more than two years, SolarWinds and Brown defrauded investors by overstating SolarWinds’ cybersecurity practices and understating or failing to disclose known risks.

The charges came nearly six months after a judge sentenced Joseph Sullivan, the former CISO at Uber, to three years of probation and ordered him to pay a $50,000 fine after a jury found him guilty of two felonies. Sullivan had been charged with covering up a ransomware attack while Uber was under investigation by the Federal Trade Commission for earlier lapses in data protection.

On top of all that, new SEC rules on cybersecurity and disclosure of breaches were set to take effect Dec. 15. They require public companies to comply with numerous incident reporting and governance disclosure requirements.

All of this will have CISOs looking over their shoulder in 2024. As if defending their organizations from bad actors wasn’t challenging enough, now they will have to pay more attention to documenting absolutely everything.”

Brian Spanswick, CISO and CIO at Cohesity

IT leaders will increase their focus on cyber resilience to prepare for 2024’s top security threats

“As attackers and their tools become more sophisticated in the age of generative AI, the need for organizations to be resilient and ensure they can limit business disruption during a cyber event will become even more critical. In turn, organizations will be further investing in cybersecurity fundamentals such as strong asset management practices, system patching, and data encryption.

Quickly recovering core business processes with aggressive recovery time and recovery point objectives significantly minimizes the disruption of a ransomware attack and reduces the leverage the attacker has when demanding payment. This is especially key as organizations must be prepared for the event when, not if, a cyberattack occurs.”
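The recovery-objective check described above can be expressed as a minimal calculation. The timestamps, backup cadence, and objectives below are illustrative assumptions, not recommendations:

```python
from datetime import datetime, timedelta

def rpo_met(last_backup: datetime, incident: datetime, rpo: timedelta) -> bool:
    """True if the data-loss window (time since the last good backup) is within the RPO."""
    return (incident - last_backup) <= rpo

def rto_met(incident: datetime, restored: datetime, rto: timedelta) -> bool:
    """True if service was restored within the RTO."""
    return (restored - incident) <= rto

incident = datetime(2024, 3, 1, 12, 0)
# Hourly backups, last one at 11:30: at most 30 minutes of data lost.
print(rpo_met(datetime(2024, 3, 1, 11, 30), incident, timedelta(hours=1)))  # True
# Service restored at 16:00 against a 2-hour RTO: objective missed.
print(rto_met(incident, datetime(2024, 3, 1, 16, 0), timedelta(hours=2)))   # False
```

The more aggressive the RPO and RTO, the smaller the data-loss window and outage an attacker can threaten, which is exactly the leverage reduction the prediction describes.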

As businesses move their workloads to the cloud, they will need to double down on data security by getting clear visibility of their attack surface – or risk misconfigurations and additional breaches

“Cloud transformation and hybridization are still in progress – and high-risk – for a number of organizations moving away from legacy systems. Data security posture management (DSPM) is a growing field to help customers manage this risk. In fact, Gartner has issued a report on this entitled ‘continuous threat exposure management.’ Organizations need to fully understand the security implications of the new attack surface created by moving workloads from on-premises to the cloud.”

As more companies implement generative AI, they will face a challenge similar to shadow IT, except shadow AI puts proprietary data in the public domain, representing a much greater risk

“The challenge is not knowing what algorithms are being used, the data fueling them, and who is using those algorithms. CISOs and organizations will need to ensure transparency and control around the growing use of GenAI. Gartner has called this out in its focus on ‘AI trust, risk, and security management.’”

In 2024, threat actors will continue demanding ransoms despite government agreements to not pay

“A group of nearly 50 countries recently pledged to no longer pay ransoms demanded as part of ransomware attacks. While this represents a diplomatic accomplishment, it won’t blunt the frequency or sophistication of attacks on government infrastructure. Attacks are so cheap and easy to launch, and the consequences so limited, that attackers will continue to probe for weaknesses in an automated, programmatic fashion. Further, nation-state-sponsored attackers will seek to sow chaos rather than financial gain from attacks.”

Dale Zabriskie, CISSP CCSK, Field CISO at Cohesity

Generative AI and security will come together in the worldwide fight against cybercrime and advanced-persistent threats

“Attackers will be leveraging AI tools to entice employees via social engineering to click and act recklessly, exploit zero-day vulnerabilities, and much more at a much faster rate. We can expect both adversaries and innovative defenders to leverage AI, and it will be a force multiplier in both of their efforts.”

Autonomous and Stateless AI Agents will be effective and efficient as nations and corporations fight off these ever-growing and evolving threats

“The technology world is evolving at a very rapid pace, and with this, the skills gap in emerging technologies is growing much wider than ever before. New tools need to be developed to act as a translation engine between native/natural language and engineering-speak or technical jargon.

To solve this, we are already starting to see the emerging trend of AI Agents – systems that act and reason with a set of predefined tools – to solve more complex situations than traditional RAG architectures can. Agent and tool combinations will be leveraged to assist humans in more complex systems management and operational automation.”

Greg Statton, Office of the CTO – Data and AI at Cohesity

The importance of domain-specific bodies of data that are clean, relevant, as well as a means for providing responsible and governed access to that data for use in LLM-based applications, will be paramount

“2023 was the year of the LLM – the bigger the better. Now that all the cloud vendors have selected their LLM provider of choice (or built their own), the world will turn to the importance of domain-specific bodies of data that are clean, relevant, as well as a means for providing responsible and governed access to that data for use in LLM-based applications.

Data will be classified into two main camps:

    • Dynamic Data – this is machine-generated data that is served via API or via event streams (think message bus data from live systems). This data is useful when looking at the current state of a thing, object, or system.

    • Static Data – This is data that lives for a while in a current state. This will mostly be documents that are generated by other knowledge workers. Data that has a shelf life of weeks/months/years.”
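The two camps above can be illustrated with a toy classifier. The record fields and source names here are assumptions for the sketch, not a product schema:

```python
# Sources the text describes as "dynamic": machine-generated data served
# via API or event streams (e.g. message bus data from live systems).
DYNAMIC_SOURCES = {"api", "event_stream", "message_bus"}

def classify(record: dict) -> str:
    """Return 'dynamic' for current-state machine data, 'static' for
    document-like data that lives in a given state for weeks or more."""
    if record.get("source") in DYNAMIC_SOURCES:
        return "dynamic"
    return "static"

print(classify({"source": "message_bus"}))                            # dynamic
print(classify({"source": "document_store", "shelf_life_days": 90}))  # static
```

In practice the split matters because the two camps feed LLM applications differently: dynamic data is best served to the model at query time, while static data can be indexed and governed ahead of time.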

Dr. Darren Williams, CEO and Founder at BlackFog

After a record-breaking 2023, we expect that ransomware will not ease anytime soon. Fundamentally, ransomware is becoming the main threat to all organizations, and insurance is no longer a viable option. Action needs to be taken. In 2024 we predict several new trends to take hold:

“Ransomware gangs will look for new ways to force victims into paying. We have already seen gangs contact the SEC directly, reporting victims immediately to inflict maximum damage, forcing regulatory, reputational and class action liabilities. We expect this is just the beginning of several new tactics to maximize payouts.”

“Organizations will realize that their existing security is not making any impact on the new threat vectors and will finally start to focus on the core problem, “data security” and “data exfiltration.”

“More than 40 percent of existing data exfiltration goes to China and Russia. We expect other countries such as North Korea to play larger roles in 2024.”

“We expect to see major infrastructure applications become threat vectors for cyber gangs, similar to the way the MOVEit exploit was developed. Hiding in plain sight is going to be the new mantra for cyber gangs as they continue to avoid detection.”

“We expect to see ransomware disrupt major infrastructure through IoT devices and non-traditional platforms. These diverse systems often have limited security designed in and have significant exposure for organizations, particularly in the manufacturing industry.”

Monica Kumar, Chief Marketing Officer at Hitachi Vantara

Recognizing cloud as an operational model

“In 2024, we will see a significant shift in the perception of cloud computing. Gone are the days when all public cloud was automatically good; we will now look at cloud as an ecosystem. Cloud will no longer be a fixed location, whether on-prem or in a public data center; it’s an operating model that offers cloud principles like agility, self-service, cost-effectiveness, and scalability. This transformation from a location to an operational framework is becoming increasingly clear as more cloud providers begin to leverage solutions that bridge the gap between on-prem and cloud deployments.”

Hybrid cloud sustainability is no longer a luxury; it’s a necessity 

“Business leaders have shown a bigger commitment to reducing their environmental footprint in recent years, with a focus on optimizing resource usage and enhancing the efficiency of data centers. A 2023 study from Hitachi Vantara found that nearly four in five IT leaders and C-level business executives have developed plans for achieving net-zero carbon emissions, with 60 percent saying the creation of eco-friendly data centers is a top priority. As businesses rely more on hybrid cloud solutions for their IT needs, these technologies must contribute to a sustainable future. Therefore, in 2024, hybrid cloud sustainability will transition from a ‘nice to have’ strategy to an absolute necessity due to its real implications for the business.

The shift towards hybrid cloud sustainability will include a range of initiatives. Data center infrastructure and data management practices will be overhauled to reduce unnecessary resource consumption. This may consist of eliminating hot spots and excess energy usage, enhancing cooling systems, and properly removing electrical waste. Businesses will implement strategies to intelligently optimize workloads in their hybrid cloud setups for reduced energy consumption.

This transformation won’t just align with business goals; it will also drastically lower energy costs and streamline data management operations to improve efficiency, protect resources, and substantially curb environmental impact.”

Steve Santamaria, CEO at Folio Photonics

The Rise of Optical Storage in Active Archives

“In 2024, there will be a transformation in how we store and archive data with the emergence of optical storage as an alternative for active archiving systems. This trend will be driven by the growing demand for storage solutions that are not only long-lasting and secure but also energy-efficient. This new generation of optical storage will gain traction, especially in sectors where stringent data retention rules are in place, thanks to its durability and resistance to environmental wear and tear. By incorporating state-of-the-art optical storage into active archiving, we’re looking at a viable, environmentally conscious alternative to conventional storage methods, bolstering data access and security. This movement is a testament to the increasing emphasis on both data preservation and environmental stewardship.”

Steve Leeper, VP of Product Marketing at Datadobi

“As artificial intelligence (AI) continues to weave into the fabric of modern business, the year 2024 is likely to witness a surge in the demand for enhanced data insight and mobility. Companies will need to gain insight into their data to strategically feed AI and machine learning platforms, ensuring the most valuable and relevant information is utilized for analysis. This granular data insight will become a cornerstone for businesses as they navigate the complexities of AI integration. At the same time, the mobility of data will emerge as a critical factor, with the need to efficiently transfer large and numerous datasets to AI systems for in-depth analysis and model refinement. The era of AI adoption will not just be about possessing vast amounts of data but about unlocking its true value through meticulous selection and agile movement.

The trajectory of storage technology is also poised for a significant shift as 2024 approaches, with declining flash prices driving a broad-scale transition towards all-flash object storage systems. This shift is expected to result in superior system performance, catering adeptly to the voracious data appetites and rapid access demands of AI-driven operations. As flash storage becomes more financially accessible, its integration into object storage infrastructures is likely to become the norm, offering the swift performance that traditional HDD-based object storage lacks and the scalability that NAS systems lack. This evolution will be particularly beneficial for handling the large datasets integral to AI workloads, which necessitate rapid throughput and scalability. Consequently, a data mobility wave may be seen, with datasets and workloads being transferred from outdated and sluggish storage architectures to cutting-edge all-flash object storage solutions. Such a move is anticipated not just for its speed but for its ability to meet the expanding data and performance requisites of burgeoning AI initiatives.

Just as importantly, in 2024, the landscape of data management will undergo a profound transformation as the relentless accumulation of data heightens the necessity for robust management solutions. According to Gartner’s projections, by 2027 no less than 40 percent of organizations will have implemented data storage management solutions to classify, garner insights from, and optimize their data assets, a significant leap from the 15 percent benchmark set in early 2023. This trend is likely to be propelled by the relentless expansion of data volumes, outpacing the rate at which companies can expand their IT workforce, thus elevating the indispensability of automation for data management at scale.

2024 is set to be a pivotal time for data management, with a shift towards API-centric architectures for meshed applications gaining traction. As customers increasingly demand that data management vendors offer API access to their functionalities, we are likely to see a mesh of interconnected applications seamlessly communicating with one another. Imagine ITSM (IT Service Management) and/or ITOM (IT Operations Management) software triggering actions in other applications via API calls in response to tickets — this interconnectedness will become commonplace. The trend towards API-first strategies will likely accelerate, driven by the desire to embed data management more integrally within the broader IT ecosystem. As a result, the development of self-service applications will flourish, enabling automated workflows and facilitating access to data management services without the need for manual oversight. This move towards a more integrated, automated IT environment is not just anticipated; it is imminent, reflecting a broader shift towards efficiency and interconnectivity within the technological landscape.
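The meshed, API-first pattern described above can be pictured as a small dispatcher that turns an ITSM ticket into a call against a data management API. The endpoint URLs, ticket fields, and category names below are invented for illustration; real ITSM/ITOM products and data management vendors each expose their own APIs:

```python
import json

# Hypothetical mapping from ticket category to a data management endpoint.
ACTIONS = {
    "capacity_alert": "https://dm.example.com/api/v1/archive-cold-data",
    "orphaned_data": "https://dm.example.com/api/v1/quarantine",
}

def handle_ticket(ticket: dict, post) -> str:
    """Translate an ITSM ticket into a data management API call.

    `post` is injected (e.g. requests.post in production, a stub in tests)
    so the workflow can run without a live endpoint."""
    url = ACTIONS[ticket["category"]]
    post(url, json.dumps({"ticket_id": ticket["id"]}))
    return url

# A stub `post` records the call instead of hitting the network.
calls = []
handle_ticket({"id": 101, "category": "orphaned_data"},
              post=lambda url, body: calls.append((url, body)))
```

The point of the sketch is the shape of the mesh: one application reacts to an event by invoking another application's API, with no human in the loop, which is what makes self-service, automated workflows possible.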

Finally, as we look toward 2024, we predict that an intensified focus on risk management will become a strategic imperative for companies worldwide. Governance, risk, and compliance (GRC) practices are anticipated to receive heightened attention as companies grapple with the complexities of managing access to data, aging data, orphaned data, and illegal/unwanted data, recognizing these as potential vulnerabilities. Moreover, immutable object storage and offline archival storage will continue to be essential tools in addressing the diverse risk management and data lifecycle needs within the market.”

Rohit Badlaney, CGM, Cloud Product and Industry Platforms at IBM

Businesses Must Close Skills Gaps to Ensure Hybrid Cloud Success in 2024

“In the past few years, hybrid cloud adoption has accelerated at an exponential rate with no signs of slowing down into 2024. However, businesses will face persistent challenges along their hybrid cloud journeys due to the widening skills gap in the tech workforce. In fact, a 2023 global survey from IBM found more than half of global decision-makers say that cloud skills remain a challenge to widespread cloud adoption. In response to this challenge, 72 percent of organizations have created new positions to meet new and evolving demands for cloud skills. As organizations refine their multi- and hybrid cloud strategies, they must take a comprehensive approach that addresses skills gaps, creating opportunities to expand current workers’ skills and welcoming new skilled talent.”

 
