IBM and Red Hat Advance Container Storage

IBM doubles down on Red Hat Kubernetes storage platform

IBM and Red Hat today revealed that the core technologies of Red Hat OpenShift Data Foundation (ODF) will become the foundation for the next generation of the IBM Spectrum Fusion storage platform.

Scott Baker, CMO of IBM Storage, says it’s clear the storage technologies used to build Red Hat ODF will be applicable to a wider range of use cases beyond cloud-native applications running on Kubernetes clusters. These core components include the Ceph open-source distributed storage system, the Rook storage orchestrator for Kubernetes, and the NooBaa data management platform.

Brent Compton, senior director of Red Hat Storage, says the goal is to create a unified storage platform that bridges cloud computing and on-premises computing environments to better enable two-way mobility of applications and data.

IBM has multiple storage platforms, but this transition is a clear indication that Ceph will become the foundation of its storage strategy going forward. IBM says it will assume responsibility for the future development of the Ceph, Rook, and NooBaa projects, in addition to all sales and marketing activities.

It’s unclear how widely adopted Red Hat ODF is, but Compton says the rate at which stateful applications are being built on Kubernetes clusters, both in the cloud and in on-premises computing environments, is accelerating considerably. In fact, many of these stateful applications are deployed on edge computing platforms to process and analyze data as it is created and consumed.

There’s no shortage of options when it comes to Kubernetes, but the biggest obstacle to achieving hybrid cloud computing is that data isn’t easily moved from one platform to another. In fact, most applications today are deployed within the confines of a single cloud or on-premises computing environment, which naturally results in many independently managed data silos. As a result, the total cost of IT only increases each time an organization chooses to deploy an application on a different platform, and these data silos prevent most organizations from truly achieving hybrid cloud computing.

It’s unclear how long it will take for IBM to realize its vision of breaking down the data silos that impede true hybrid cloud computing, but as more and more applications access data scattered across multiple cloud platforms, the problem becomes a major concern. Not only are applications invoking microservices to access data in a highly distributed computing environment, but the types of data being accessed are also diversifying.

In the meantime, organizations are investing in hiring data engineers alongside DevOps teams to implement best practices for automating the management of various storage systems. It is expected that a set of DataOps best practices will be defined that in many ways mimic the processes created to automate application development and deployment. The goal should be to modernize data management in a way that simplifies what is currently a highly fractured storage environment.

Naturally, none of these goals will be achieved overnight, but it’s obvious that as object storage systems mature, there are many more opportunities for progress.
