Making the Most of the Cloud


When people talk about the next generation of applications or infrastructure, the answer that echoes throughout the industry is the cloud. On the application side, the concept of “serverless” is becoming less of a pipe dream and more of a reality. On the infrastructure side, providers have already proven that it is possible to pay for compute on an hourly or even more granular basis.

In February of 2017, Amazon had a “hiccup” that caused a massive blackout impacting websites across the world. This doesn’t make Amazon a bad choice as a cloud provider, but it should remind people of the cloud’s real value proposition: infrastructure-as-a-service. If you think the cloud is a viable option for operating part or all of your business, then consider taking advantage of it. However, as the old adage goes, “Variety is the spice of life,” and in the context of cloud, this means “prepare to go multi-cloud.”

Utilizing multiple clouds simultaneously delivers the same benefits as running multiple private data centers. You can run select parts of your business applications in each cloud, or even load balance applications across clouds by deploying all of your services in a highly available, redundant way. This protects your business from any single infrastructure failure and strengthens your overall business continuity plans as well.

Container technologies such as Docker are a great way to leverage cloud offerings. While containers are not required, they make deploying software easier and allow you to better utilize the resources you are paying for. When considering Docker containers, think about deploying your own internal Docker registry and mirroring it between data centers. Then, as new container images are created, it becomes easy to ensure the software is available in every data center location, regardless of how many you operate. One of the main benefits of Docker containers is that they enable you to start viewing your infrastructure as simply a pool of resources waiting to be utilized.
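As a minimal sketch of that mirroring setup, the open-source `registry:2` image can run a private registry in one data center and act as a pull-through cache of it in another; the hostnames and image name below are hypothetical placeholders.

```shell
# Data center 1: run a private registry, persisting image data to local disk.
docker run -d --name dc1-registry -p 5000:5000 \
  -v /srv/registry:/var/lib/registry \
  registry:2

# Data center 2: run a second registry configured as a pull-through cache
# of the first. REGISTRY_PROXY_REMOTEURL is the registry's environment
# override for its proxy.remoteurl setting; the hostname is hypothetical.
docker run -d --name dc2-mirror -p 5000:5000 \
  -e REGISTRY_PROXY_REMOTEURL=https://dc1-registry.example.com:5000 \
  registry:2

# Clients in the second data center pull through the mirror, which fetches
# and caches any image it does not already hold.
docker pull dc2-mirror.example.com:5000/myteam/app:1.4
```

With this arrangement, an image pushed once becomes available in each location on first pull, and subsequent pulls are served locally.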

Deploying a container on a server still requires someplace safe to persist data. Protecting against the failure of an individual container or its host server is imperative to a successful implementation, which makes persistent storage a necessity for containerized applications. Whether your application writes log files, maintains internal state, leverages a database for general data model persistence, or uses decoupled messaging for communication, that data has to live somewhere durable.
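A simple way to illustrate this, assuming the Docker CLI and the official `postgres` image, is a named volume: the volume outlives any single container, so a failed container can be replaced and reattached to the same data. The container name, password, and paths here are illustrative.

```shell
# Create a named volume; its data persists independently of any container.
docker volume create app-data

# Run a database with its data directory mounted on the volume.
# (POSTGRES_PASSWORD is required by the official image; value is a placeholder.)
docker run -d --name appdb \
  -e POSTGRES_PASSWORD=changeme \
  -v app-data:/var/lib/postgresql/data \
  postgres:15

# If the container fails or needs replacing, remove it and start a new one
# against the same volume -- the database files are still there.
docker rm -f appdb
docker run -d --name appdb \
  -e POSTGRES_PASSWORD=changeme \
  -v app-data:/var/lib/postgresql/data \
  postgres:15
```

The same pattern applies to log directories, message broker state, or any other data that must survive a container restart.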

Converging all of these services into a single data platform is ideal, as this allows simplified management and deployment of persistent application client containers. With a converged platform, regardless of which server in your cluster of hardware your container gets deployed on, it will always be able to find and write data of any type. This is a major benefit when contemplating a move toward a serverless architecture.
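One concrete sketch of that "any server, same data" property, using Docker's built-in `local` volume driver with an NFS backend as a stand-in for whatever converged data platform you run (the server address and export path are hypothetical):

```shell
# Define a volume backed by shared storage rather than a single host's disk.
docker volume create \
  --driver local \
  --opt type=nfs \
  --opt o=addr=nfs.example.com,rw \
  --opt device=:/exports/app-data \
  shared-data

# Any container on any host where this volume is defined mounts the same
# data, so it no longer matters which server the scheduler picks.
docker run --rm -v shared-data:/data alpine ls /data
```

Dedicated volume plug-ins for specific data platforms follow the same idea; the point is that placement of the container and placement of the data are decoupled.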

Enabling a separation of authority between the software architecture and the data administration is critical to leveraging cloud infrastructure. Software engineers should not have to be concerned about the cost of storage options or which storage facility needs to be used. Moreover, they should not have to rewrite software when a new storage class becomes available or turns out to be a better fit based on cost or performance.

The line-of-business and systems administrators are the folks in an organization who should determine the ongoing balancing of the costs and performance of a system. If your data platform supports using ultrafast NVM Express solid-state drives or super-slow, yet reliable, object storage, then take advantage of that and pick and choose where your data lives within the data platform.

In order to maintain agility within your organization, do not force your engineers to write code specific to where data should land based on costs. Software engineers should worry about writing good software to meet the needs of the business, while systems administrators and business owners pick and choose where the data will reside. When looking toward the cloud, it isn’t acceptable to be forced into a trade-off between data agility, application agility, and infrastructure agility. You deserve them all, but it is up to you to take advantage of the options presented.


