DockerCon 2017: Using Docker to Modernize Your Legacy Apps
At the recent DockerCon, Docker launched a new initiative around modernizing legacy applications: the Modernize Traditional Apps Program. We were pleased to see this move, as it is legacy applications that sit at the heart of the innovation challenge in enterprise IT.
These applications consume massive amounts of budget and mindshare just in keeping the lights on; some estimates put that figure at up to 80% of annual IT budgets. Enterprise IT therefore needs the latest, greatest tools like Docker, available and proven in its arsenal, if it is to redirect resources back to new development and innovation.
Docker: Not Just for Microservices
Most people associate Docker with the more modern end of the architectural spectrum: cloud-native, polyglot, stateless microservices. This is, however, an incorrect assumption. I always explain that Docker can definitely be leveraged for more traditional monolithic applications. You can put the biggest, ugliest monolith in a Docker container; it will work just fine, and you'll start to secure some of the benefits of containerization right away.
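To make this concrete, here is a minimal sketch of containerizing a hypothetical legacy Java monolith. The application name (legacy-erp.war) is an assumption for illustration; the point is that the existing build artifact goes into an off-the-shelf base image with no code changes to the monolith itself.

```dockerfile
# Sketch only: wrap a hypothetical legacy WAR (legacy-erp.war is an
# assumed artifact name) in the official Tomcat base image.
FROM tomcat:8.5-jre8

# Copy the existing build artifact into Tomcat's webapps directory.
# The monolith itself is unchanged.
COPY legacy-erp.war /usr/local/tomcat/webapps/ROOT.war

# Tomcat's default HTTP port.
EXPOSE 8080

CMD ["catalina.sh", "run"]
```

A `docker build` against this file produces an image that runs the same way on a developer laptop, a test server, or production, which is where the immediate benefits come from.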
This misconception has slowed Docker uptake in the enterprise, as many end users feel they don't have the maturity for containers: “We’re miles away from that”. In reality, containers are one of the first steps on the journey to cloud native, once you accept that they can be applied in legacy estates.
A key side benefit of this is the portability Docker gives you. Many of our clients have a cloud or PaaS initiative underway. We advise that you can get started with Docker in the legacy estate, then lift and shift your containers to your ultimate target when it’s ready.
Because it’s a new paradigm, Docker is also sometimes perceived as quite complex. However, it’s actually quite simple for the basic use case of containerizing and deploying application binaries. You can get started in hours, and typically build a robust containerized pipeline within a month or two for a moderately complex enterprise application.
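The core of such a pipeline can be sketched in a few commands. The registry address and application name below are hypothetical placeholders, and the commands assume a Docker daemon, an internal registry, and a Dockerfile in the repository root:

```shell
#!/bin/sh
# Sketch of a basic build-and-deploy pipeline step.
# registry.example.com and legacy-erp are assumed names for illustration.
set -eu

APP=legacy-erp
TAG=$(git rev-parse --short HEAD)   # tag images by commit for traceability

# Build an image from the application's Dockerfile.
docker build -t registry.example.com/${APP}:${TAG} .

# Push it to the internal registry so every environment pulls
# the exact same artifact.
docker push registry.example.com/${APP}:${TAG}

# Deploy: replace the running container with the new version.
docker rm -f ${APP} 2>/dev/null || true
docker run -d --name ${APP} -p 8080:8080 registry.example.com/${APP}:${TAG}
```

Because every deployment is an immutable, commit-tagged image, rolling back is just running the previous tag, which is where much of the repeatability benefit comes from.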
This instantly allows you to secure benefits such as safer and more repeatable deployments, better resilience, better traceability, consistent environments, and easier rollback.
Some people might scratch their heads at introducing a modern tool like Docker into the oldest, deepest, darkest areas of their application estate. However, if you look at the business case, you will see that these applications carry significant cost: large teams with niche skills, long testing cycles, out-of-hours deployments, and high support loads.
The hidden cost is the impact on your ability to innovate. Changing these systems takes a long time, and this has downstream impacts on the other systems that integrate with your legacy estate. Adding agility here can add real velocity across a large application portfolio.
When taking all of this into account, redirecting DevOps automation resources into legacy areas of the application portfolio starts to look compelling.
Getting Started with Containers
During the early years of containers in the enterprise there was a fair degree of analysis paralysis: people couldn’t agree on where and how to deploy containers, or how they would fit into existing workflows.
One thing we have found to be successful is developing a container strategy: a period of strategic alignment on the technical and governance issues associated with moving to a containerized stack. This would of course include assessing whether to target containers at your legacy estate, considering both the business case and technical viability. If you would like to learn more about this process, please do contact us.