By: Josh Epstein
The forces of digital transformation are radically altering the world of enterprise IT and changing the face of IT infrastructure. The explosion of data from sources including ecommerce, healthtech, fintech, and internet of things (IoT) coupled with advanced analytics, machine learning, and AI is driving a new class of cloud-scale applications. Powerful mobile devices, ubiquitous broadband, and new as-a-service business models set the stage for a new class of applications that are available anytime and anywhere.
Data on demand has expanded the ability of stakeholders across the enterprise to access the information they need from any device, in any location. Traditional IT practices and infrastructure strategies can no longer keep up; performance planning has become increasingly costly, inefficient, and difficult to manage. To overcome these challenges, datacenter architects are taking a new approach, termed composable infrastructure, to ensure their datacenters deliver the agility, operational simplicity, and performance needed to compete in today’s environment.
As the need to support dynamic cloud-scale applications grows, IT organizations are turning to cloud-based solutions to improve scalability, flexibility, and connectivity. Cloud-scale infrastructures must be extremely dynamic, with the ability to respond to ever-changing performance and capacity requirements. Public cloud infrastructure as a service (IaaS) has continued to grow in popularity, with annual cloud spend increasing each year. Recently, IDC—in its Worldwide Semiannual Public Cloud Services Spending Guide, 2018—predicted worldwide spending on public cloud services and infrastructure will reach $160 billion in 2018, an increase of 23.2 percent over 2017.
While the flexibility of public cloud solutions is clear, application providers need to weigh several factors when choosing a cloud infrastructure strategy: security, regulatory compliance, the cost of scaling, and vendor lock-in to a given cloud provider.
For datacenter architects building infrastructures to support cloud-scale applications, there are four key forces that should be considered.
The IT world is steadily moving toward as-a-service infrastructures. Industries such as healthcare, retail, and finance are seeing tremendous growth in as-a-service infrastructure, which allows for dynamic workloads, flexibility, and a better user experience. Enterprises are beginning to transform into software-as-a-service (SaaS) businesses—the “SaaSification of the enterprise”—giving them the ability to respond quickly and improve engagement with customers, employees, and partners. As traditional on-premises datacenters are displaced, IT organizations supporting SaaS environments need to rethink how they procure, manage, optimize, and orchestrate storage resources.
Storage solutions need to deliver the operational simplicity of public IaaS while offering the performance and efficiency of traditional storage arrays. While some enterprises are moving toward implementing an IaaS strategy, it does not prove to be as cost-effective, secure, or customizable as private cloud offerings. As a result, storage providers need to deliver offerings that compete with IaaS providers.
There is a growing need for sophisticated, real-time analytics in fast, modern applications. The influx of data from across the enterprise means businesses must be able to analyze and act on that data as efficiently and effectively as possible. Real-time analytics is no longer an added bonus; it has become a must-have for enterprises looking to stay competitive in this data-intensive environment. As such, storage must be able to deliver cost-effective performance in analytics-intensive environments.
The distance between the application layer and the storage layer is compressing. Development and DevOps teams need to implement storage capabilities that optimize the performance, capabilities, and user experience of all applications. The concept of a physical array no longer aligns with the way modern applications are designed. Instead, applications have to be able to react to unexpected events and adapt to rapid changes in IT environments.
As cloud-scale applications continue to displace traditional on-premises datacenters, there are emerging opportunities for businesses to rethink data storage strategies to better fit the demands of today’s cloud-era applications. Legacy scale-up and scale-out architectures no longer meet the need for high efficiency, simplicity, and flexibility. Composable infrastructure, by contrast, addresses the challenges enterprises face today and is set to transform the storage industry for the better. With composable storage, IT organizations can achieve performance and efficiency at scale while keeping storage management simple.
Service providers may already be delivering next-generation cloud services, but that level of technology investment goes beyond what is practical for many enterprises. Enterprises and cloud service providers alike can instead embrace the software-defined datacenter (SDDC) model with composable storage, allowing them to overcome the major challenges and changes that are shifting the storage market as a whole. Service providers that want to remain viable—and competitive—years down the road should implement SDDC strategies that leverage composable storage. This architecture combines the best of scale-up and scale-out, with added capacity and compute performance, and the ability to dedicate specific performance characteristics to specific applications. By integrating storage platforms with higher-level data orchestration platforms, composable storage enables enterprise IT operations to become more efficient, flexible, and scalable.
Josh Epstein is the CMO for Kaminario.
Jul2018, Software Magazine