Serverless Platforms Continue to Evolve

Modern applications have increasingly leveraged Kubernetes as the “OS of the cloud” because it abstracts the underlying cloud platform and coordinates the activities of multiple Docker containers.

Kubernetes does indeed radically simplify the deployment and administration of multi-service distributed applications. However, it has a significant learning curve, and maintaining a large-scale Kubernetes cluster can be daunting.

Serverless platforms offer an attractive alternative to Kubernetes for distributed applications. A serverless platform attempts to abstract the entirety of the underlying infrastructure and simply provides a black box that runs application code.

Serverless platforms let developers focus entirely on application code rather than on application infrastructure. Although serverless platforms lack some of the capabilities of Kubernetes, they radically reduce management and deployment overheads.

Serverless platforms are perhaps a logical extension of the platform-as-a-service (PaaS) cloud model. PaaS platforms such as Google App Engine hide the underlying virtual machine architecture of the platform but still involve permanently running server instances and often don’t scale automatically.

In contrast, a serverless environment simply runs transitory service requests without any permanently running infrastructure. Not all applications have a suitable architecture for a serverless deployment, but applications designed around modern microservices patterns often work well in a serverless framework.

Amazon’s Lambda framework was one of the first popular serverless offerings. Lambda is a “Function as a Service” (FaaS) offering that allows functions written in several popular programming languages to be executed on demand. All scaling is handled automatically by the serverless infrastructure.
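
In the FaaS model, the unit of deployment is just a handler function that the platform invokes with an event payload. A minimal sketch in Python follows; the handler signature matches Lambda's Python runtime convention, but the event's `name` field is a hypothetical example payload, not part of any Lambda API:

```python
import json

def handler(event, context):
    """Entry point invoked by the platform for each request.

    'event' carries the request payload; 'context' carries runtime
    metadata. The 'name' field here is a made-up example field.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

The platform decides when and where this function runs, spinning up as many concurrent copies as the incoming traffic requires.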

FaaS does limit the languages and runtime environments available to the developer, while a more recent breed of “Container as a Service” (CaaS) systems allows arbitrary containerized microservices to run on the serverless platform.

Serverless platforms have reduced administrative overheads, which can reduce staffing expenditure. They can also reduce hosting costs: whereas a Kubernetes or PaaS deployment typically incurs hosting charges even when idle, billing for a serverless deployment is typically based on actual utilization, not peak utilization.
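
The utilization-based billing difference can be sketched with some arithmetic. All rates and traffic volumes below are made-up illustrative numbers, not any vendor's actual pricing:

```python
HOURS_PER_MONTH = 730  # average hours in a month

def always_on_cost(instances: int, rate_per_hour: float) -> float:
    # Provisioned instances are billed around the clock, idle or busy.
    return instances * rate_per_hour * HOURS_PER_MONTH

def serverless_cost(requests: int, rate_per_million: float) -> float:
    # Billing tracks actual utilization: no requests, no charge.
    return requests / 1_000_000 * rate_per_million

# Hypothetical low-traffic service: two always-on instances versus
# three million serverless requests per month.
provisioned = always_on_cost(2, 0.05)
serverless = serverless_cost(3_000_000, 0.40)
```

For steady, high-volume traffic the comparison can flip the other way, which is why the utilization profile of the application matters as much as the headline rates.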

However, as with many platforms that provide greater levels of abstraction, serverless platforms offer increased simplicity at the expense of reduced flexibility. Serverless platforms work well when the application is composed of independent containers that are completely stateless and don’t interact with each other.

If your application consists of short-lived microservices that have no persistent storage and don’t need to interact with each other, then serverless is a good fit. However, if your application relies on long-lived services with their own local storage, you might find a completely serverless solution difficult to achieve. Furthermore, serverless platforms can place restrictions on the container sizes and/or programming languages that are supported.

And when things go wrong in a serverless platform, you will be more reliant on the platform vendor than is the case with a Kubernetes environment.

Serverless platforms are provided by all the major cloud vendors:

  • Google Cloud Run offers a container as a service capability, while Google Cloud Functions provides Functions as a Service.
  • Amazon’s AWS Lambda is a Function as a Service offering, while AWS Fargate is a containerized solution similar to Cloud Run.
  • Microsoft Azure Container Apps is a containerized serverless platform available in the Azure cloud.
  • Knative is a Kubernetes-based platform offering a serverless experience within a Kubernetes cluster. This allows enterprises to provide a serverless experience within a private Kubernetes-based cloud or allows developers with access to Kubernetes clusters a simpler deployment experience.
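
A CaaS deployment on platforms of this kind is typically just an ordinary HTTP service packaged in a container. The sketch below reads its listening port from the PORT environment variable, which follows the convention Cloud Run uses; other platforms may configure this differently:

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    """Responds to every GET with a plain-text acknowledgement."""

    def do_GET(self):
        body = b"ok\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        # Keep container logs quiet for this sketch.
        pass

def run(port: int) -> None:
    # Bind on all interfaces so the platform's proxy can reach us.
    HTTPServer(("", port), Handler).serve_forever()

# In a container image, the entry point would call:
#   run(int(os.environ.get("PORT", "8080")))
```

Because the service holds no state, the platform is free to start, stop, and replicate container instances as traffic rises and falls, including scaling to zero when the service is idle.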

There are also a variety of serverless platforms provided by smaller companies for more specialized purposes. For example, EDJX offers a serverless “edge” platform. The application code for EDJX must be written in a compiled language such as Rust or C++, which results in very small runtime footprints and consequently greater scalability for high-frequency, high-volume traffic such as might be encountered in an IoT scenario.

Serverless platforms don’t suit all applications but do offer big improvements in scalability, developer productivity, and running costs for those applications that can make the switch.