2024-03-19

How Docker is disrupting Legacy IT Companies


Thanks to its popularity, Docker has disrupted many companies and blurred the silos between Developers and Operations. This article is partially based on my own experiences.


How Docker is disrupting our daily life as programmers.

I will describe some of the disruptions that containers have provoked in IT companies.

I hope this article depicts familiar situations and gives you arguments to overcome the obstacles to the spread of container technology 🙂

Disclaimer: although I mention Docker quite a lot in this article, it is not a sponsored article. If you have a better alternative, simply replace Docker with another container technology; the arguments should still be valid.

If you appreciate this article, please share or like it.

Day one: I don’t need to spend four hours to set up my development environment

It is my first day on the job. A brand-new laptop with decent performance and features. Default operating system: Windows.

Well, I am a hardcore Linux developer and I have put a lot of effort into living without Excel, PowerPoint, and Outlook. And who knows?

Maybe my customers won’t be on Windows. So I am wondering how to create my new software development environments. Node.js? Java? Mobile? Each technology comes with its own tools, servers, and configuration.

What are the choices?

The company is fair and provides a decent laptop with sufficient power. However, software installation on the native OS is blocked, so I have to work with virtual machines. Virtual machines? What a cumbersome solution.

I need to download ISOs or OVAs and install my software on top of them. What is your virtual machine creation strategy? I ran a quick survey among my colleagues, who confirmed that they create one virtual machine per customer project. And they trade their virtual machines like Pokémon cards. WTF? A few dozen gigabytes are transferred over USB 3 on a tiny hard drive, and the team is ready. Well, after several hours.

Docker, or any similar container technology, offers me a better solution.

Here are the arguments:

  • Reduce your start-up time and be more efficient. You can find many images that provide a ready-to-use development environment (see the Node.js sketch after this list):
  • Docker image node.js dev: A Dev environment for JS
  • Docker image Ruby Dev, Docker image Ruby Dev2
  • Docker image C/C++ on Linux
  • Docker image Java Dev
  • Docker image PHP Dev
  • Broadcast your programming best practices by using the same environment across your team. As a tech lead, my mission is to make my colleagues better than me. To reach that goal, I try to provide them with the best tools, configuration, IDE, and automation to help them in their work. How many times have I had to provide a formatting style guide to indent their code? A syntax checker configuration? An IDE with the right plugins? All these issues can be solved by providing my own image and updating it regularly.
  • The time for web IDEs has probably come: Eclipse, Visual Studio, Borland Delphi, such IDEs have been used by generations of developers. They all come with the same advantages and drawbacks: powerful, clever code completion, nice OS integration, notifications, a whole bag of features. Clearly the developer has a great environment to write software, but these solutions do not scale well within a team. How do I share my configuration? My preferences? How do we share code? How do we communicate? To create consistency in your team, you will have to rely on a great IT administrator: a magician of the command line and PowerShell, able to set up every OS with the same configuration and to update it regularly.
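
To illustrate the first point, here is a minimal sketch of a throwaway Node.js development environment. It relies on the official node image from Docker Hub; the mounted path and published port are just assumptions for the example.

    # Start a disposable Node.js dev shell, mounting the current project
    # into the container; nothing gets installed on the host OS.
    docker run -it --rm \
      -v "$PWD":/usr/src/app \
      -w /usr/src/app \
      -p 3000:3000 \
      node:lts bash

Exit the shell and the container disappears, while the project files stay on the host.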

My recommendation is to rely on two kinds of tools to produce your software:

  • Lightweight code editors such as Atom or Microsoft's Visual Studio Code
  • Web IDEs: Cloud9 and Codenvy are great examples. Codenvy uses Eclipse Che, the web rewrite of Eclipse, to offer a SaaS IDE with a comparable set of features and configuration options.

The most amazing thing is that this great and complex system can be installed with a one-liner.
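
As a rough sketch, Eclipse Che shipped a Docker-based launcher of this shape at the time of writing; the local data path is a placeholder and the exact invocation may differ between Che versions, so check the current documentation.

    # Launch the Eclipse Che server through its Docker launcher image.
    # /path/to/che-data is a placeholder for a local data directory.
    docker run -it --rm \
      -v /var/run/docker.sock:/var/run/docker.sock \
      -v /path/to/che-data:/data \
      eclipse/che start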

Day two: Security everywhere, freedom and performance nowhere

As software specialists, we are well aware of the threats coming from web applications, unmonitored operating systems, and data breaches. Our daily duty is to protect our customers’ data.

The consequence for developers is that, as a rule, our computers are locked down. Software installation is double-checked by IT, the Internet is accessed through a proxy, antivirus, whitelists, and so on. Hard disks are encrypted, and so on. In short, all development has to be done in virtual machines. BUT virtual machines are such a pain to manage: huge disk space, hard to customize, fairly expensive (the VMware license cost to be able to take VM snapshots on every developer laptop…).

Docker Datacenter

Building virtual disk images is a tremendous task for system administrators: the images are slow to copy, hard to customize, and produced through mostly manual installations and snapshots. Developers do not get the flexibility they need to adapt to their customer projects.

Docker offers a neat and efficient way to produce images, thanks to the Dockerfile syntax: a good mix between automation and the traditional system administrator's work of writing shell scripts.
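
To give an idea of what this looks like, here is a minimal, hypothetical Dockerfile for a reviewable developer image; the base image, packages, and user name are assumptions, not a company standard.

    # Hypothetical developer image: a short, reviewable text file
    # instead of a multi-gigabyte virtual disk.
    FROM ubuntu:22.04
    RUN apt-get update && apt-get install -y --no-install-recommends \
            git openjdk-17-jdk maven \
        && rm -rf /var/lib/apt/lists/*
    # Run as a non-root user, as most security teams require.
    RUN useradd --create-home dev
    USER dev
    WORKDIR /home/dev

The security team reviews the Dockerfile, builds it once (for example, docker build -t internal/java-dev . with an image name of your choice), and publishes the result to the internal registry.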

Using Docker images offers enough security control to the system administrators, requires less maintenance effort, and developers can easily submit their own images to the security/administration team for review. But how do you store your Docker images? At present, I would recommend something like Docker Datacenter to host your company's images on-premise.

Day twenty: The typical legacy IT Project

Traditional “legacy” IT projects feature:

  • a code base
  • scripts to build the software
  • some manual test cases
  • huge and extensive installation documentation describing how to:
    • set up the test environment
    • set up the production environment
    • perform the maintenance, upgrades, and backups of the system
  • scripts to install the database schema

In practice, most IT projects force developers to manually install their development, test, and production environments using out-of-date, incomplete documentation. How does the software team run a QA session? Do they create a brand-new test environment with fresh data in a given state and the latest version of the software? The answer is probably no, definitely no.

Usually, IT teams rely on a single test server, painstakingly built over the scrum sprints, on a virtual machine. Do you think I am exaggerating? Ask your team: how much time would they need to recreate this environment? And what if their snapshot is lost or damaged?

Continuous Dockery, ElectricCloud image property

Docker provides several solutions to common IT project problems:

  • Deploying the customer application on the developer's laptop: Docker images are shared between developers so that everyone has access to a debug environment. Docker Compose can help the software development team build ready-to-use multi-container environments.
  • Initializing and populating a database for tests: another way to run integration tests is to rely on Docker to build a ready-to-use image of your data. Start the container, wait for it to be ready, execute your integration tests, and kill the container afterwards (a minimal sketch follows this list). Such a scenario is easy to create with Docker, even with proprietary databases such as Oracle or MSSQL. This article is a good introduction to Docker and test automation.

Deterministic Test Automation

  • Compiling for multiple targets and environments: developers often need to test their software in different environments and browsers. Docker images also provide a solution to this kind of environment complexity.
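
Here is the throwaway-database scenario from the list above as a minimal shell sketch. The postgres image and its pg_isready tool exist as shown, but the credentials, port, and test command are assumptions to adapt to your project.

    # Start a disposable database for the integration tests.
    docker run -d --name it-db -e POSTGRES_PASSWORD=test -p 5432:5432 postgres:15

    # Wait until the database accepts connections.
    until docker exec it-db pg_isready -U postgres; do sleep 1; done

    # Run the tests against it (placeholder command), then remove the
    # container whether the tests passed or not.
    mvn verify -Ddb.url=jdbc:postgresql://localhost:5432/postgres || TESTS_FAILED=1
    docker rm -f it-db
    exit ${TESTS_FAILED:-0}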

Day forty: The void of the production environment

Recently, I encountered a brilliant developer, working alone, maintaining a messy piece of PHP code. He was not the originator of the project, yet he had been in charge of it for two years. He told me that at the beginning of the project, his manager gave him a virtual machine with everything on it to help him. The same one he is still working on today.

Currently, he is struggling with the customer and that software. He and the customer have different deployment environments, and the differences between server, language, and framework versions are creating a huge mess.

Another project, another situation. This IT team has been relying on Ansible (with Puppet it would have been the same situation) to deploy their software in the different environments. Despite the improvements brought by Ansible's automation, there is always a slight tension when launching the Ansible scripts. Maybe it is system entropy, or virtual machine erosion; most likely, the reason is that the virtual machines have never been deleted and recreated. Anyway, there are subtle differences between the environments, and the Ansible deployments sometimes fail when new features are shipped.

System erosion

With that team, we reached a common point of view on when to use Ansible. We should rather use Ansible to prepare the virtual machines to host Docker: open the firewall, establish the network routes, set up the monitoring, and so on. The software itself is then shipped as a Docker image, copied by Ansible, and launched.
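
As a rough sketch of that split, the commands below show what "ship the software as a Docker image and launch it" can look like. The image name, version, and ports are placeholders, and the steps on the target VM are exactly the part an Ansible playbook would automate.

    # On the build machine: build the image and export it as a file.
    docker build -t myapp:1.4.2 .
    docker save myapp:1.4.2 | gzip > myapp-1.4.2.tar.gz
    scp myapp-1.4.2.tar.gz deploy@target-vm:/tmp/

    # On the target VM: load the image and (re)start the container.
    gunzip -c /tmp/myapp-1.4.2.tar.gz | docker load
    docker rm -f myapp 2>/dev/null || true
    docker run -d --name myapp -p 8080:8080 --restart unless-stopped myapp:1.4.2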

Docker and containers can simplify your software deployments, whether you have a private cloud or regular virtual machines. Simply install Docker on your virtual machines and change the way you ship your software; once past the initial effort, you won't regret it.

Day eighty: The good old mama’s Software Factory

The last situation where Docker and containers really shine is when you use Docker inside your software factory.

Docker can make your software factory evolve from a monolithic, all-purpose but slow and frustrating setup into a real Software Factory as a Service platform (if you like the term SFaaS, it's mine 🙂).

The main differences between a traditional software factory and an SFaaS are the following:

  • Product owners and team managers create new software factories for their projects directly through a web UI, picking the technologies and tools they need.
  • Developers can instantiate new environments to build or test the software without any back-and-forth, paperwork, or waiting for a round trip between Earth and Mars.
  • Integration engineers provide new tools and environments that projects can adopt if they wish.
  • Few interactions are needed between the infrastructure/system administration teams and the software teams. It is a win-win solution, and the IT bottleneck is removed.

Docker Software Factory: Marcel Birkner

I strongly recommend building software factories on top of containers, like these great initiative projects.

Conclusion

If you have read the whole article, I can only say a big thank you, and I hope you have learned a thing or two. The arrival of containers is really helping developers and ops, and I hope that IT companies fully embrace these technologies to make our profession more fun and attractive.

Sylvain Leroy

Senior Software Quality Manager and Solution Architect in Switzerland, I previously created my own company, Tocea, in the field of software quality assurance. Now I offer my knowledge and services through a small IT consulting company, Byoskill, and a website, www.byoskill.com. I currently live in Lausanne (CH).
