It’s no surprise that organizations are trying to do more with less. When it comes to provisioning infrastructure for their software, that means not just doing less of it but eliminating it altogether through the use of serverless technology.
According to Jeffrey Hammond, vice president and principal analyst at Forrester, one in four developers now regularly deploys to public clouds using serverless technology, up from 19% last year to 24% today. That compares with 28% of respondents who said they regularly deploy with containers.
The main reason containers are slightly ahead is that when organizations try to modernize existing apps, it’s easier to go from a virtual machine to a container than to embrace a serverless architecture, especially with something like AWS Lambda, which requires writing stateless applications, according to Hammond.
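For readers unfamiliar with what “stateless” means in practice, here is a minimal sketch of a Lambda-style handler in TypeScript. The API Gateway event shape is real, but the DynamoDB table name and payload fields are hypothetical placeholders; the point is only that any state that must outlive the invocation goes to an external store, not the function itself.

```typescript
// Minimal sketch of a stateless AWS Lambda handler (Node.js/TypeScript runtime).
// Assumptions: an API Gateway proxy integration and a hypothetical DynamoDB
// table named "orders"; names are illustrative, not taken from the article.
import { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";
import { DynamoDBClient, PutItemCommand } from "@aws-sdk/client-dynamodb";

// Clients may be created outside the handler so they can be reused across
// invocations, but no per-request state is kept in memory between calls.
const db = new DynamoDBClient({});

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  const order = JSON.parse(event.body ?? "{}");

  // All durable state lives in an external store, never in the function instance.
  await db.send(
    new PutItemCommand({
      TableName: "orders",
      Item: { id: { S: order.id }, total: { N: String(order.total) } },
    })
  );

  return { statusCode: 200, body: JSON.stringify({ ok: true }) };
};
```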
Also, the recently released Shift to Serverless survey from cloud-native programming platform provider Lightbend found that 83% of respondents were delighted with their serverless application development solutions. However, only a little over half of organizations expect the switch to serverless to be easy.
“If I just want to run my code and you worry about scaling it, then a serverless approach is very effective. If I don’t want to worry about having to size my database, if I want to use it as I need it, serverless extensions for things like Aurora make that a lot easier,” Hammond said. “So basically, as a developer, when I want to work at a higher level, when I have a very spiky workload, when I don’t particularly care to tune my infrastructure, I’d rather just focus on solving my business problem, a serverless approach is the way to go.”
While serverless is seeing a pickup in new domains, Doug Davis, who heads the CNCF Serverless Working Group and is an architect and product manager at IBM Cloud Code Engine, said that the main change in serverless is not in the technology itself; instead, providers are finding new ways to draw people onto their platforms.
“Serverless is what it is. It’s finer-grain microservices, scale to zero, pay-as-you-go, ignore the infrastructure, and all that good stuff. What I think might be new in the community at large is that people are still trying to find the right way to expose that to people,” Davis said. “But from the technology perspective, I’m not sure I see a lot necessarily changing, because I don’t think there’s a whole lot you can change right now.”
Abstracting away Kubernetes
The primary appeal of serverless for many organizations is having more and more of the infrastructure abstracted away from them. While Kubernetes revolutionized the way infrastructure is handled, many want to go further, Davis explained.
“As good as Kubernetes is from a feature perspective, I don’t think most people will say Kubernetes is easy to use. It abstracts the infrastructure, but then it presents you with different infrastructure,” Davis said.
While people don’t need to know which VM they’re using with Kubernetes, they still have to think about nodes, and even though they don’t need to know which load balancer they’re using, they still have to manage it.
“People realize not only do I not want to worry about the infrastructure from a service perspective, but I also don’t want to worry about it from a Kubernetes perspective,” Davis said. “I just want to hand you my container image or source code, and you run it all for me. I’ll tweak some little knobs to tell you what fine-tuning I want to do on it. That’s why I think projects like Knative are popular, not just because, yeah, it’s a new flavor of serverless, but it hides Kubernetes.”
Davis said the technology needs to be presented in a new way: hide the infrastructure, keep things abstract, and simply accept the workload in whatever form the user wants to hand over, rather than getting bogged down in whether it is serverless, platform-as-a-service, or container-as-a-service.
However, Arun Chandrasekaran, distinguished vice president and analyst at Gartner, noted that while serverless abstracts more away from the user, containers and Kubernetes are more open-source-oriented, so the barrier to entry for enterprises is lower. Serverless, by contrast, can be viewed as a bit of a “black box,” and many of today’s function platforms tend to be somewhat proprietary to their vendors.
“So serverless has some advantages in terms of the elasticity, in terms of the abstraction that it provides, in terms of the low operational overhead for developers. But on the flip side, your application in many cases needs to fit into an event-driven pattern to be a fit for serverless functions. Serverless can be a little opaque compared to running things like containers,” Chandrasekaran said. “I think of serverless and containers as having some overlapping use cases, but by and large they address very different customer requirements at this point.”
Davis said that some decision-makers are still wary of relinquishing control over their infrastructure because, in the past, that often equated to reduced functionality. But with the way serverless stands now, users won’t be losing functionality; they’ll be able to access it in a more streamlined way.
“I don’t think they buy that argument yet, and I think they’re skeptical. It will take time for them to believe,” Davis said, “that this is a fully featured Kubernetes under the covers.”
Other challenges stifling adoption include the difficulty developers have shifting to asynchronous ways of working. Some would also like more control over their runtime, including the autoscaling, security, and tenancy models, according to Forrester’s Hammond.
Hammond added that he is starting to see a bit of an intersection between serverless and containers, but the main thing that sets serverless apart is its auto-scaling features.
Vendors are defining serverless
Serverless is expanding, and some cloud vendors have started to label any service where one doesn’t have to provision or manage the infrastructure as serverless.
Even though these services are not serverless functions, one could argue that they’re broadly part of serverless computing, Gartner’s Chandrasekaran explained.
For example, there are services like Athena, Amazon’s interactive query service, or Fargate, a way to run containers without operating the container environment yourself.
However, Roman Shaposhnik, co-founder and VP of product and strategy at ZEDEDA, as well as a member of the board of directors of Linux Foundation Edge and vice president of the Legal Affairs Committee at the Apache Software Foundation, said that the term serverless is a bit confusing at the moment and that people typically mean two different things when they talk about it. Defining the technology clearly, he said, is essential to sparking broader interest.
“Google has these two services, and they kind of very confusingly call them serverless in both cases. One is called Google Cloud Functions, and the other is Google Cloud Run, and people are constantly confused. Google was such an interesting case for me because I expected Google to unify around Knative. Their Google Cloud Functions is completely separate, and they don’t seem interested in running it as an open-source project,” Shaposhnik said. “This is very symbolic of how the industry is confused. I feel like this is the biggest threat to adoption.”
This large basket of products has created an API sprawl rather than a tool sprawl, because the public cloud typically offers so much that if developers wanted to replicate all of it in an open-source serverless offering like Apache OpenWhisk, they would have to build a lot of things they have no interest in building.
“This is not even because vendors are evil. It’s just because only vendors can give you the full gamut of the APIs that would be meaningful to what they offer you because 90% of their APIs are closed-source and proprietary anyway. And if you want to make them effective, you might as well use a proprietary serverless platform. Like, what’s the big deal, right?” Shaposhnik said.
Serverless also commits users to a particular viewpoint that not everyone will necessarily enjoy. For companies doing a lot of hybrid work, needing to support multiple public clouds, especially with some deployments in a private data center, things can get painful pretty quickly, Shaposhnik explained.
OpenFaaS, an open-source framework for building and deploying serverless functions, is trying to carve out that niche by finding the sweet spot between what can easily be automated and the more problematic aspects.
“If you have enough of those easy things that you can automate, then you should probably use OpenFaaS, but everything else starts making less sense because if your deployment is super heterogeneous, you are not ready for serverless,” Shaposhnik said.
Open-source serverless platforms haven’t seen much uptake, he said, because they first need to find a wider environment to be embedded in.
“Basically, at this point, it is a bit of a solution looking for a problem, and until that wider environment into which it can be embedded successfully appears, I don’t think it will be fascinating,” Shaposhnik said.
In the serverless space, it is the proprietary, vendor-specific solutions that are pushing the field forward.
“I would say open-source is not as compelling as in some other spaces, and the reason is I think a lot of developers prefer open-source not necessarily because it’s free as in freedom but because it’s free as in beer,” Forrester’s Hammond said.
Because organizations pay for most functions by the gigabyte-second, developers can experiment, prototype, and prove value at very low cost. And most of them seem willing to pay for that to have all the infrastructure managed for them.
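To make gigabyte-second pricing concrete, here is a rough back-of-the-envelope calculation in TypeScript. All of the numbers, including the per-GB-second rate, are illustrative assumptions rather than any vendor’s published pricing, and per-request fees are ignored.

```typescript
// Back-of-the-envelope GB-second cost estimate for a pay-per-use function.
// All numbers here are illustrative assumptions, not vendor pricing.
const memoryGb = 0.5;          // 512 MB of configured memory
const durationSec = 0.2;       // 200 ms average execution time
const invocations = 1_000_000; // monthly invocations
const pricePerGbSecond = 0.0000166667; // assumed rate; check your provider

const gbSeconds = memoryGb * durationSec * invocations; // 100,000 GB-seconds
const computeCost = gbSeconds * pricePerGbSecond;       // ≈ $1.67

console.log(`${gbSeconds} GB-seconds ≈ $${computeCost.toFixed(2)} per month`);
```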
“So you do see some open source here, but it’s not necessarily at the same level as something like Kafka or PostgreSQL or any of those sorts of open-source libraries,” Hammond said.
With so many services and functions to choose from, some organizations are looking to serverless frameworks to help manage setting up the infrastructure.
Serverless frameworks can deploy all of the serverless infrastructure needed, pushing out both code and infrastructure through a more direct, abstracted experience.
In other words, “You don’t need to be an infrastructure expert to deploy a serverless architecture on AWS if you use these serverless frameworks,” said Austen Collins, founder and CEO of the company behind the Serverless Framework.
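As a rough illustration of what that abstraction looks like, below is a minimal sketch of a Serverless Framework service definition using its TypeScript config format (serverless.ts). The service name, runtime, and endpoint are placeholders, not an example from Collins.

```typescript
// Minimal sketch of a Serverless Framework service definition (serverless.ts).
// Service name, runtime, and handler paths are illustrative placeholders.
import type { AWS } from "@serverless/typescript";

const serverlessConfiguration: AWS = {
  service: "hello-service",
  frameworkVersion: "3",
  provider: {
    name: "aws",
    runtime: "nodejs18.x", // assumed runtime; use whatever your provider supports
  },
  functions: {
    hello: {
      handler: "src/handler.hello",
      // One HTTP endpoint; the framework provisions both the API and the function.
      events: [{ httpApi: { method: "get", path: "/hello" } }],
    },
  },
};

module.exports = serverlessConfiguration;
```

Running `serverless deploy` against a definition like this is what lets developers hand off the infrastructure details Collins describes.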
Collins added that the Serverless Framework has seen a massive increase in usage throughout the pandemic, going from 12 million downloads at the beginning of 2020 to 26 million now.
“I think a big difference between us and a Terraform project is developers use us. They really like Serverless Framework because it helps them deliver applications where Terraform is focused on just the hardcore infrastructure side and used by many Ops teams,” Collins said.
The framework’s growth can be attributed to the expanding use cases for serverless, which multiply every time a new infrastructure-as-a-service (IaaS) offering appears. “The cloud has nowhere else to go other than in a more serverless direction,” Collins added.
Many organizations are also realizing that they won’t keep up in this hyper-competitive era of innovation if they try to maintain and scale their software all by themselves.
“The key difference developers and teams will have to understand is that, number one, it lives exclusively on the cloud, so you’re using cloud services. You can’t spin up this architecture on your machine as easily. And also the development workflow is different, which is one great value of Serverless Framework,” Collins said. “But, once you pass that hurdle, you’ve got an architecture with the lowest overhead out of anything else on the market right now.”
All eyes are on serverless at the edge
The adoption of serverless has been broad-based, but larger organizations tend to embrace it more, especially if they need to give their software infrastructure global reach and don’t want to do that on top of their own hardware, Forrester’s Hammond explained.
In the past year, the industry started to see more interest in edge-oriented deployments, with customers wanting to run some of these workloads in edge computing environments, according to Gartner’s Chandrasekaran.
This is evident in content delivery network (CDN) companies such as Cloudflare, Fastly, or Akamai, all bringing new serverless products to market that primarily focus on edge computing.
“Edge is all about rapid elasticity. It’s about scaling up quickly and massively, but it’s also about scaling down when data is not coming from IoT endpoints; I don’t want to use the infrastructure, and I want the resources to be de-provisioned,” Chandrasekaran said.
According to Collins, serverless compute running at the edge is a use case that opens up new types of architectures, changing how applications are built so that processing happens closer to the end user for faster performance.
“So an interesting example, this is just how we’re leveraging it. We’ve got serverless.com served using Cloudflare Workers at the edge. And it’s all on one domain, but the different paths point to different architectures. So it’s the same domain, but we have compute running that looks at the path and forwards the request to different technology stacks, one for our documentation, one for our landing pages, and whatnot,” Collins said. “So a bunch of new architectural patterns are opening up, thanks to running serverless at the edge.”
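Here is a minimal sketch of the path-based routing pattern Collins describes, written as a Cloudflare Worker in TypeScript; the origin hostnames are hypothetical placeholders rather than serverless.com’s actual setup.

```typescript
// Minimal sketch of path-based routing in a Cloudflare Worker (modules syntax).
// The origin hostnames are hypothetical placeholders, not a real configuration.
export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);

    // Route different paths on the same domain to different backend stacks.
    if (url.pathname.startsWith("/docs")) {
      url.hostname = "docs-origin.example.com";    // documentation stack
    } else {
      url.hostname = "landing-origin.example.com"; // landing-page stack
    }

    // Forward the original request to the chosen origin.
    return fetch(new Request(url.toString(), request));
  },
};
```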
Another major trend the serverless space has seen is the growth of product extension models for integrations.
“If you’ve got a platform as a company and you want developers to extend it and use it and integrate it into their day-to-day work, the last thing you want to do is say, well now you’ve got to go stand up infrastructure on your premises or in another public cloud provider, just so that you can take advantage of our APIs,” Forrester’s Hammond said. “I think, increasingly, we will use serverless concepts as the glue by which we weld all these cloud-based platforms together.”
The extensions also involve continued improvements to serverless functions that add more programming languages and enhance the existing tooling in areas like security and monitoring.
For companies that are sold on a particular cloud and don’t care about multicloud or about whether, say, Amazon is locking them in, Shaposhnik said not using serverless would be foolish.
“Serverless would give you a lot of bang for the buck, effectively scripting and automating a lot of the things happening within the cloud,” Shaposhnik said.
Serverless is the architecture for volatility
Amid the business uncertainty brought on by the pandemic, serverless has become the architecture for volatility.
“Everyone seems to be talking about scaling up, but there’s this whole other aspect of what about if I need to scale down,” Serverless Framework founder and CEO Austen Collins said.
Many businesses dealing with events, sports, and anything in-person had to scale down operations almost immediately when shutdowns hit. For those running on serverless architectures, operations scaled down without anyone having to do anything.
The last 16 months have also seen a tremendous amount of employee turnover, especially in tech, so organizations are looking for ways to onboard new hires quickly by abstracting much of the infrastructure away, Collins added.
“I think it’s our customers with serverless architectures, which don’t require as much in-house expertise as running your own Kubernetes clusters, that have weathered this challenge better than anyone else,” Collins said. “Now, whenever there’s a mandate to shut down different types of businesses, we can see the differences in usage, in people’s applications, and in the scaling. Scaling up when things open up again is immediate, and they don’t have to do anything. Decision-makers are often now citing these same concerns.”
A serverless future: A tale of two companies
Carla Diaz, cofounder of Broadband Search, a company that helps people find the best internet and television services in their area, has been considering adopting a serverless architecture as the company revamps its digital plans.
“Since most of the team will be working from home rather than the office, it doesn’t make sense to continue hosting servers when adopting a cloud-based infrastructure. Overall, that is the main appeal of going serverless, especially if you are beginning to turn your work environment into a hybrid environment,” Diaz said.
Another reason Broadband Search is interested in moving to a cloud-based system is that the company no longer has to worry about the cost of buying more hardware, which can already be quite expensive, nor the cost of maintaining more equipment and the possible downtime if the integration is extensive. With the scalability of a serverless architecture, those are simply things the company doesn’t need to worry about anymore.
“By switching and removing the hardware component, the only real cost is to pay for the service, which will host your data off-site and allow you to scale your business’ IT needs either back or forward as needed,” Diaz added.
Dmitriy Yeryomin, a senior Golang developer at iTechArt Group, a one-stop source for custom software development, said that many of the 250-plus active projects within the company use serverless architecture.
“This type of architecture is not needed in every use case, and you should fully envision your project before considering serverless, microservice, or monolith architecture,” Yeryomin said.
For the company’s projects, Yeryomin said serverless helps divide the system into pieces that can be coded and deployed quickly, making the solution high-performance and easily scalable.
“In terms of benefits, serverless applications are well-suited to deploying and redeploying to the cloud while conveniently setting the environment and security parameters,” Yeryomin said. “I work mostly with AWS, and its UI has perfect monitoring and test service tools. Also, local invokes are great for testing and debugging services.”
However, Yeryomin said, the most challenging thing with serverless is execution time: the longer a Lambda function runs, the more expensive it gets.
“You can’t store data inside the function for longer than the function runs,” Yeryomin explained. “So background jobs are not for serverless applications.”
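A common workaround, sketched below under the assumption of an AWS setup with SQS (not taken from iTechArt’s projects), is to keep the function short-lived by pushing background work onto a queue and persisting any state externally.

```typescript
// Sketch: keep the function short-lived by handing background work to a queue.
// The queue URL and payload shape are hypothetical; this is not iTechArt's code.
import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient({});
const QUEUE_URL = process.env.REPORT_QUEUE_URL ?? ""; // assumed environment variable

export const handler = async (event: { reportId: string }) => {
  // Instead of generating the report here (a long-running background job),
  // enqueue it for a separate worker and return immediately.
  await sqs.send(
    new SendMessageCommand({
      QueueUrl: QUEUE_URL,
      MessageBody: JSON.stringify({ reportId: event.reportId }),
    })
  );

  // Nothing is kept in the function's memory or disk beyond this invocation.
  return { statusCode: 202, body: "Report generation queued" };
};
```

A separate consumer, whether another function or a container, can then pick the job off the queue and run it on its own schedule.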