Saturday, November 23, 2024

Data value, transformation and the need for multi-protocol data management

Enterprise corporations of all types have fully embraced digital transformation, along with the realization that data has become the lifeblood and key value enabler for their businesses. This is easy to see in industries such as media & entertainment, where our lives have been enriched by the instant availability of video on demand almost anywhere on earth. It is also true in travel, where online searches, reservations, and bookings are now commonplace. Healthcare is another industry experiencing these changes, with digital health records and digital images replacing paper and film nearly everywhere. Just look around, and you'll likely notice the transformation.

A key challenge in this period of transformation remains the dependency on legacy business applications. These applications are still present everywhere: in banks, hospitals, motion picture studios, travel agencies, and factories. They work well, do their job, and are difficult or costly to replace. They were created for pre-cloud deployment models, and are typically host-based and monolithic (neither modular nor distributed) in architecture. This means their usage and access have limits within the data center or the enterprise itself, and it typically makes them rigid and hard to enhance with new functionality. In short, they will continue to exist and flourish into the foreseeable future.

These legacy applications commonly use a classic style of access to data and storage, through file systems or remote file system protocols such as NFS (popular on Linux and older UNIX systems) and SMB (sometimes referred to as CIFS, the de facto file protocol standard on Microsoft Windows systems). We still see advancements in these legacy file protocols today: NFS v4 is becoming more widely adopted and capable, as is SMB 3.0, with newer capabilities that can be exploited for scalability, performance and security.

But a new era of cloud-native applications is emerging simultaneously. We now see applications deployed in the cloud, or even behind the firewall in enterprise private cloud deployments. These applications are designed for our new world of ubiquitous access from anywhere on the globe with 24×7 availability.

Consider the example of an online banking application, which we all depend upon both at home and remotely as we travel. These requirements are difficult to near-impossible to address with older legacy applications, since the model of deployment, distribution, and access has shifted. This ultimately has an impact on data management and storage access — and since object protocols work well over long distances and with the cloud, this is one reason the new era of cloud-native applications has embraced object storage protocols as its default method of data access. AWS has established a de facto standard with its popular AWS S3 API, but alternatives persist, including the OpenStack Swift protocol and cloud storage alternatives such as the Microsoft Azure Blob API.
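To make the file-versus-object distinction concrete, here is a minimal sketch in Python showing the same piece of data addressed both ways: as a POSIX path on an NFS or SMB mount, and as a bucket/key pair reached over HTTP in the S3 style. The mount point, bucket name, endpoint, and key below are hypothetical, and real S3 requests would also carry authentication headers; this only illustrates the two addressing models.

```python
from urllib.parse import quote

def s3_object_url(endpoint: str, bucket: str, key: str) -> str:
    """Build a path-style S3 URL for an object GET.

    (A virtual-hosted style also exists: https://<bucket>.<endpoint>/<key>.)
    """
    # quote() keeps '/' intact by default, so nested keys stay readable.
    return f"https://{endpoint}/{bucket}/{quote(key)}"

# File-protocol view: the application sees an ordinary path on an NFS/SMB mount.
file_path = "/mnt/radiology/studies/2024/scan-001.dcm"

# Object-protocol view: the same payload is a key in a bucket, fetched over HTTP,
# which is what makes it practical across long distances and cloud boundaries.
url = s3_object_url("s3.example.com", "radiology", "studies/2024/scan-001.dcm")
print(url)
```

The key practical difference is that the file view assumes a stateful mount inside the data center, while the object view is a stateless HTTP request that works from anywhere — which is exactly why cloud-native applications gravitate toward it.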

As the mix of applications in the enterprise increases, demand for storage capacity continues to grow exponentially. For example, applications for managing and archiving healthcare radiology images (digital MRI, CT scans and others) are now driving hundreds of terabytes per year of new data storage requirements in mid-to-large hospitals. These applications are still heavily file-based. Combine that with newer applications for genomics data management and analysis, backup, surveillance and more, and the need becomes clear: modern data management and storage must be multi-protocol ready, and must also support petabyte-level scale for the digital transformation era.

In the end, our changing world demands variety. With digital transformation increasing both the value of data and the complexity of applications, there is a clear need for multi-protocol data management and storage. This is a key driver for the continued support and enhancement of file and object storage coexisting in solutions such as the Scality RING.

In an upcoming post we’ll explore the RING’s native scale out file system (SOFS) and how it addresses the high demands of archiving applications in the healthcare world, video-on-demand, and for the big data archives needed for AI and machine learning.


About Us

Solved is a digital magazine exploring the latest innovations in Cloud Data Management and other topics related to Scality.
