This is the last post for 2018 on VamsiTalksTech.com. In retrospect, 2018 has been the best year of my career – one full of amazing customer interactions across a range of exciting technology, fantastic co-workers, and extensive writing. It is also heartening that my work is being syndicated across multiple outlets. When I first sat down to create the blog almost four years ago, I had no idea of the realm of possibilities – I just wanted to share my hard-won insights and learnings for the benefit of my readers. I owe you, my readers, a big thank you – none of this would have happened without your choosing to spend time on the site. I am looking forward to an exciting 2019 on the blog, with more posts covering virtually all facets of enterprise IT!
The Edge Challenge
Traditional software applications are developed by teams spread over one or two locations. These applications are built on a common framework and then modified or customized for each client. This is typically the model followed by large vendors of ERP, CRM, and other ISV-developed software. Expensive professional services engagements are needed to install these systems on premises, and some of them actually run into years. The cloud is beginning to change the calculus of such drawn-out and expensive engagement models. More and more companies are turning to managed services that deliver hybrid cloud capabilities as SaaS, in the hope of saving costs while gaining flexibility in a multi-tenant manner.
Edge applications are a unique yet growing subset of SaaS-delivered applications, as they are typically deployed across tens to thousands of remote locations. They differ from their datacenter cousins in that they need to run over unreliable connections with no technical operator on site.
Further, edge applications must:
- Support multiple teams working on a single codebase with centralized release management, given their scale; different teams own the user interface, the domain model, business functionality, etc.
- Support polyglot language and database development
- Support independent load and performance testing
- Support application versioning and rollback at the edge
- Support traffic shaping and independent updates for edge applications
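The versioning and rollback requirement maps naturally onto a Kubernetes Deployment. The manifest below is a minimal, hypothetical sketch (the application name, namespace, and image registry are placeholders) showing the settings that keep prior revisions available for rollback while updating a remote site without downtime:

```yaml
# Hypothetical edge application Deployment; names and image are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-app
  namespace: edge-store-0042
spec:
  replicas: 2
  revisionHistoryLimit: 10      # retain prior ReplicaSets so rollbacks are possible
  selector:
    matchLabels:
      app: edge-app
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 0         # keep serving during updates at remote sites
      maxSurge: 1
  template:
    metadata:
      labels:
        app: edge-app
        version: "1.4.2"
    spec:
      containers:
      - name: edge-app
        image: registry.example.com/edge-app:1.4.2
        readinessProbe:         # gate traffic on health, which matters on flaky links
          httpGet:
            path: /healthz
            port: 8080
```

Because the Deployment controller records each revision, a failed update at a remote location can be reverted to the previous working version without operator involvement at the site itself.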
Kubernetes is an ideal choice to run these applications in a manner that maximizes performance and scalability while simplifying change management.
Solution
While there are many patterns applicable to this challenge, the DevOps methodology below is a best-practices pattern predicated on immutable infrastructure using Docker and Kubernetes.
When using a pattern such as the above, a few things need to be kept in mind:
- Developers can choose to implement their business logic using microservices, serverless functions, or a combination of both. Each choice comes with its own advantages and tradeoffs.
- It is important that the CI/CD pipeline system – whether Spinnaker or Jenkins X – natively understands the state of the cloud deployment. In practice, that means working with the Kubernetes APIs to perform deployments, roll them back if needed, create load balancers, enforce security restrictions across namespaces, etc.
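As a sketch of what "natively understanding the deployment" can look like, a pipeline stage might drive the Kubernetes API through kubectl. The deployment name and manifest path below are hypothetical, and the commands assume access to a live cluster:

```shell
# Apply the desired state declaratively from the versioned manifests.
kubectl apply -f k8s/edge-app.yaml

# Block until the rollout converges, failing the pipeline stage on timeout.
kubectl rollout status deployment/edge-app --timeout=120s

# If post-deploy checks fail, revert to the previous recorded revision.
kubectl rollout undo deployment/edge-app
```

Tools like Spinnaker wrap this same API interaction in managed deployment strategies, but the underlying model – declarative apply, observed rollout status, revision-based rollback – is the same.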
- It is highly recommended to leverage a Service Mesh for a multitude of advantages, including abstracting networking logic such as authentication, connectivity, and retries away from the application itself. Service Meshes also provide a range of visibility and monitoring capabilities for microservices.
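To make the mesh advantages concrete, the fragment below is a hypothetical Istio VirtualService (the host and subset names are placeholders) that moves retry logic out of application code and shapes traffic between two versions – the kind of independent, incremental update called for at the edge:

```yaml
# Hypothetical Istio VirtualService; host and subset names are placeholders.
apiVersion: networking.istio.io/v1beta1
kind: VirtualService
metadata:
  name: edge-app
spec:
  hosts:
  - edge-app.example.internal
  http:
  - retries:
      attempts: 3                     # retry transient failures in the mesh, not in app code
      perTryTimeout: 2s
      retryOn: 5xx,connect-failure
    route:
    - destination:
        host: edge-app
        subset: v1
      weight: 90
    - destination:
        host: edge-app
        subset: v2                    # canary version receives a small share of traffic
      weight: 10
```

Shifting the `weight` values over successive pipeline runs lets a new edge application version be rolled out gradually, and shrinking the canary weight back to zero is an instant, application-independent rollback.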