How to do a legacy software migration: a checklist for success

Legacy Software migration

Here is a small checklist on how to migrate a legacy application and ensure the migration's success.

This article is part of my work to share my knowledge about software migration at Byoskill.

First, scope your project according to the changes involved by the migration:

  • The organizational changes
  • The process changes
  • The technological changes

The organization

You will have to think about each person involved in the software that is going to be migrated. Which skills are required to develop, test, maintain and support it? How will you manage the sudden disruption to their daily work and motivate them to embrace the change?

I recommend building a clear picture of the team and not underestimating the need for training, evangelization and change management.

Changing technologies or frameworks, even without any feature changes, may severely impact your end users, the support and maintenance teams, and the integration engineers.

The processes

An automated migration differs from a classical IT software project: it disrupts both timelines and practices. Since the migration is accelerated by specialized tools, the success of the project depends on mature software development practices. Legacy software teams, however, usually still hit the wall on agility, silo breakup, continuous integration and test automation.

A legacy migration project requires a clear view of the state of the following processes:

  • SDLC: software development lifecycle.
    • What is the current process?
    • Are there any traps or caveats to be aware of?
    • How much time does it take to push a new feature from specification to production?
  • Test automation: you will have to answer many questions, each associated with well-identified risks.
    • How is the application tested?
    • At which levels?
    • Are the tests automated?
    • What is the coverage (estimated and measured)?
    • What are the requirements to set up a new test environment?
    • How much does it cost?
    • Is test data available?
    • How accurate is the test data?
    • Which tools (and licences) are required to execute the tests?
    • What would it take to obtain sufficient coverage for the migration project? (A minimal sketch of one approach follows after this list.)
  • Continuous Integration:
    • How is the software built?
    • How much time does it take to produce a new release?
    • What are the steps?
    • Which parts are tricky or manual?
    • What are the components to be built?
    • How many individual parts compose the software?
    • Which tools are used?
    • What would it take to obtain a complete CI/CD pipeline for the migration project?
  • DevOps:
    • Logging and monitoring support:
      • Will the tools still be compatible after the migration?
    • Automatic deployment:
      • What changes are required to maintain (or obtain) an automatic deployment of the solution?
    • Error and exception handling:
      • Detect and document any regressions in the way errors are caught and handled at runtime.
    • Performance testing:
      • Does the software have any automated performance tests?
      • Will they still be compatible?
  • Release Management:
    • How many active versions of the software are there?
    • What is the branch/release model?
    • Which versions must be maintained and ported?
    • What are the release frequency and the yearly schedule/roadmap?
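
To make the test coverage question concrete, here is a minimal characterization (golden master) test sketch: it records the legacy system's current behaviour and can later replay the same requests against the migrated system. It assumes the legacy application exposes an HTTP API and that Node.js 18+ is available for the built-in fetch; the base URL and request paths are placeholders:

```typescript
// Characterization-test sketch: capture the legacy responses once, then assert
// that the system under test still returns the same payloads.
// TARGET_URL and the request paths are hypothetical.
import assert from "node:assert/strict";
import { existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";

const TARGET_URL = process.env.TARGET_URL ?? "http://legacy.internal:8080"; // assumption
const CASES = ["/api/customers/42", "/api/orders?status=open"];             // placeholders

async function main() {
  mkdirSync("snapshots", { recursive: true });
  for (const path of CASES) {
    const body = await (await fetch(`${TARGET_URL}${path}`)).text();
    const snapshot = `snapshots/${path.replace(/[^a-z0-9]/gi, "_")}.txt`;
    if (!existsSync(snapshot)) {
      writeFileSync(snapshot, body); // first run against the legacy system records the expected behaviour
    } else {
      assert.equal(body, readFileSync(snapshot, "utf8"), `regression on ${path}`);
    }
  }
}

main().catch((err) => { console.error(err); process.exit(1); });
```

Run it once against the legacy system to record the snapshots, then point TARGET_URL at the migrated system: any difference is a functional regression to investigate.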

The technologies

A good legacy migration project prepares a clear description of the perimeter to be migrated and of the target solution.

It proceeds in three steps:

  • The current picture
  • The definition of the target
  • The migration itself

Drawing the current picture

The initial software assessment really matters. A solution architect will detect caveats and flaws in the current architecture that may critically slow down the migration project.

Such an assessment usually requires:

  • a technology survey: which technologies are used in the software, licensing issues, exact versions and upgrade options
  • an architecture assessment: review in priority the physical organization of the project (folders, files), the logical structure (components, packages, functional and technical layers) and the dependency matrix (cycles, code weaving, code smells)
  • an automated test assessment: code coverage, test documentation, test robustness/fragility, the main complex components and their level of testing
  • a code quality assessment: a quick review to identify the main risks first (reliability, security, maintainability)

The definition phase

This phase has four objectives:

  • Get an understanding of the cost and duration of the migration project
  • Evaluate the ROI of the operation with the customer
  • Eliminate the main technical and functional risks of the migration
  • Communicate to the customer the organizational and process changes to be carried out.

To achieve this, one main document (or specification) has to be produced: the migration guide.

The migration guide describes the target solution, the way to achieve it, the necessary steps, a risk analysis and RACI, and the cost estimation of each task.

This definition phase may be accompanied by a proof of concept (POC): a short development performed on the current solution to assess the feasibility of the target solution and to allow any necessary tests to be executed. It is critical to watch for functional and performance regressions in this POC; a minimal sketch of such a check follows.
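
As an illustration of the performance side, here is a minimal sketch comparing the latency of a hypothetical endpoint served by the current solution and by the POC (the URLs, sample size and 20% threshold are assumptions, not a standard):

```typescript
// Rough latency comparison between the current solution and the POC.
// Endpoints, sample size and the 20% threshold are arbitrary assumptions.
import { performance } from "node:perf_hooks";

const CURRENT = "http://legacy.internal:8080/api/search?q=smith"; // assumption
const POC     = "http://poc.internal:3000/api/search?q=smith";    // assumption

async function medianLatency(url: string, runs = 20): Promise<number> {
  const timings: number[] = [];
  for (let i = 0; i < runs; i++) {
    const start = performance.now();
    await fetch(url);                       // Node.js 18+ assumed for the built-in fetch
    timings.push(performance.now() - start);
  }
  return timings.sort((a, b) => a - b)[Math.floor(runs / 2)];
}

async function main() {
  const before = await medianLatency(CURRENT);
  const after = await medianLatency(POC);
  console.log(`current: ${before.toFixed(1)} ms, POC: ${after.toFixed(1)} ms`);
  if (after > before * 1.2) console.warn("possible performance regression (>20% slower)");
}

main().catch(console.error);
```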

The migration

The migration is not a Big Rewrite. 

It is an incremental, well-defined process in which automation removes the main source of failure in migration projects: execution time.

Indeed, the longer the migration takes, the more the project becomes endangered, debated and, finally, rejected.

A good migration project usually has the following qualities:

  • Incremental: in some way, the two technological environments can live as roommates inside the software (see the routing sketch after this list).
  • Fast: the amount of rewriting, manual fixes and iterations needed to reach the target solution has to be small.
  • Cost-effective: the cost of manual operations should be significantly smaller thanks to automation.
  • Critical: no legacy migration is justified without a real concern (security, investment, scale-up, business loss or expectations).
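
To make the "roommates" idea concrete, here is a minimal routing sketch in which already-migrated endpoints are served by the new stack while everything else still reaches the legacy one. The paths and internal URLs are hypothetical, and Node.js 18+ is assumed for the built-in fetch:

```typescript
// Strangler-style dispatcher: both technological environments live behind one facade.
// The migrated paths and the two service URLs below are placeholders.
const MIGRATED_PATHS = new Set(["/invoices", "/customers"]); // grows as the migration progresses
const LEGACY_BASE = "http://legacy.internal:8080";           // assumption
const MIGRATED_BASE = "http://migrated.internal:3000";       // assumption

export async function dispatch(path: string, init?: RequestInit): Promise<Response> {
  const base = MIGRATED_PATHS.has(path) ? MIGRATED_BASE : LEGACY_BASE;
  return fetch(`${base}${path}`, init);
}
```

Each iteration moves a few more paths into the migrated set, which keeps the project incremental and easy to roll back.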

How to make a software developer happy ?

Leave your comfort zone

To be or not to be (happy), that is the question. In this article, I share some thoughts about what could make a software developer happy at work. I wrote it with several audiences in mind: junior developers, senior tech leads and HR.

Continue Reading


Java developer testing toolbox

JBehave: code

An article dealing with Java applications, testing frameworks and related libraries. Continue Reading


TOP open-source dashboard solutions 2017

Dashboards can be a very efficient communication tool within a team, and between managers and business units. They help an organization rally around a vision and share common goals. They can also be useful to identify weaknesses in processes and adapt your strategy accordingly.

Continue Reading


How Docker is disrupting Legacy IT Companies

Thanks to its popularity, Docker has disrupted many companies and blurred the silos between Developers and Operations. In this article, based partially on my own experiences, I will depict some of the disruptions that containers have caused inside IT companies. I hope this article depicts familiar situations and gives you arguments to overcome the obstacles to container adoption 🙂

Disclaimer: although I quote Docker quite a lot in this article, it is not a sponsored article. If you have a better alternative, simply replace Docker with another container technology; the arguments should still be valid.

If you appreciate this article, please share or like it.

Day one: I don't need to spend four hours setting up my development environment

It is my first day on the job. A brand-new laptop, decent performance and features. Default operating system: Windows.

Well, I am a hardcore Linux developer and I have spent a lot of effort learning to live without Excel, PowerPoint and Outlook. And who knows? Maybe my customers won't be on Windows. So I am wondering how to create my new software development environments. Node.js? Java? Mobile? Each technology comes with its own tools, servers and configuration.

What are the choices? The company is fair and provides a decent laptop with sufficient power. However, software installation on the native OS has been blocked: I have to work with virtual machines. Virtual machines? What a cumbersome solution. I need to download ISOs or OVAs and install my software on them. What is your virtual machine creation strategy? I ran a quick survey among my colleagues, who confirmed that they create one virtual machine per customer project. And they trade their virtual machines like Pokémon. WTF? A few dozen gigabytes are transferred over USB 3 onto a tiny hard drive and the team is ready. Well, after several hours.

Docker, or any similar container technology, offers me a better solution.

Here are the arguments:

  • Reduce your startup time and be more efficient. You can find many Docker images to set up a ready-to-use development environment:

  • Docker image node.js dev : A Dev environment for JS
  • Docker image Ruby Dev, Docker image Ruby Dev2
  • Docker image C/C++ on Linux
  • Docker image Java Dev
  • Docker image PHP Dev

  • Broadcast your programming best practices by using the same environment across the team. As a tech lead, my mission is to make my colleagues better than me. To reach that goal, I try to provide them with the best tools, configuration, IDE and automation to help them in their work. How many times have I had to provide a formatting guideline to indent their code? A syntax checker configuration? An IDE with the right plugins? All these issues can be solved by providing my own Docker image and updating it regularly.

  • The time for web IDEs has probably come. Eclipse, Visual Studio, Borland Delphi: such IDEs have been used by generations of developers. They all come with the same advantages and drawbacks: powerful, clever code completion, nice OS integration and notifications, a whole bag of features. The developer clearly gets a great environment to write software, but these solutions do not scale well inside a team. How do I share my configuration? My preferences? How do we share code? How do we communicate? To create consistency in your team, you will have to rely on a great IT administrator: a magician of the command line and PowerShell, able to set up every OS with the same configuration and to update it regularly.

My recommendation is to rely on two kinds of tools to produce your software:

  • lightweight code editors such as Atom or Visual Studio Code from Microsoft
  • web IDEs such as Cloud9 or Codenvy. Codenvy is a great example: built on Eclipse Che, the web rewrite of Eclipse, it is a SaaS IDE with the same wealth of features and configuration. The most amazing thing is that this great and complex system can be installed with a Docker one-liner.

Day two: Security everywhere, freedom and performance nowhere

As software specialists, we are well aware of the threats coming from web applications, unmonitored operating systems and data breaches. Our daily duty is to protect our customers' data.

The consequence for developers is a set of rules: our computers are locked down. Software installation is double-checked by IT, the Internet is accessed through a proxy, with antivirus, whitelists and so on. Hard disks are encrypted. All development has to be done on virtual machines. BUT virtual machines are such a pain to manage: huge disk space, hard to customize, and fairly expensive (the VMware licence cost just to be able to take VM snapshots on every developer laptop…).

Docker Datacenter

Building virtual disk images is a tedious task for system administrators: the images are slow to copy, hard to customize, and produced through mostly manual installations and snapshots. Developers do not get the flexibility they need to adapt to their customer projects.

Docker offers a neat and efficient way to produce images, thanks to the Dockerfile language: a good mix between automation and the traditional system administration work of writing shell scripts.

Docker images give system administrators enough security control with less maintenance effort, and developers can easily submit their own images to the security/administration team for review. But how do you store your Docker images? At the present time, I would recommend something like Docker Datacenter to host your company's images on premises.

Day twenty: The typical legacy IT project

A traditional "legacy" IT project features:

  • a code base
  • scripts to build the software
  • some manual test cases
  • a huge and extensive installation documentation to set up:
    • the test environment
    • the production environment
    • the maintenance, upgrade and backup procedures of the system
  • scripts to install the database schema

In practice, most IT projects force developers to manually install their development, test and production environments using out-of-date, incomplete documentation. How does the software team run a QA session? Do they create a brand-new test environment with fresh data in a known state and the latest software version? The answer is probably no. Definitely no.

Usually, IT teams rely on a single test server, painstakingly built up over the Scrum sprints on a virtual machine. Do you think I am exaggerating? Ask your team how much time they would need to recreate this environment. And what if their snapshot is lost or damaged?

Continuous Dockery, ElectricCloud image property

Docker provides several solutions to common IT project problems:

  • Deploying the customer application on the developer's laptop: Docker images are shared between developers so that everyone has access to a debug environment. Docker Compose can help the software development team build ready-to-use environments.

  • Initializing and populating a database for tests: another way to run integration tests is to rely on Docker to build a ready-to-use image of your data. Start the container, wait for readiness, execute your integration tests and kill the container once used (see the sketch after this list). Such a scenario is easy to create with Docker, even with proprietary databases such as Oracle or MSSQL. This article is a good introduction to Docker and test automation.

Deterministic Test Automation

  • Building for multiple targets and environments: developers often need to test their software against different environments and browsers. Docker images also provide a solution to this complexity.
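
Here is a minimal sketch of the throwaway test database scenario described above, driving the Docker CLI from a Node.js script. The image name, port, readiness check and test command are placeholders, and a PostgreSQL-based image on a Unix-like host is assumed:

```typescript
// Disposable test database: start a container, wait for readiness, run the
// integration tests, then remove it. Image, port and commands are assumptions.
import { execSync } from "node:child_process";

const sh = (cmd: string) => execSync(cmd, { encoding: "utf8" }).trim();

// 1. Start a container built from an image that already embeds the test data set.
const id = sh("docker run -d -e POSTGRES_PASSWORD=test -p 5433:5432 my-test-data:latest");

try {
  // 2. Wait until the database accepts connections.
  for (let i = 0; i < 30; i++) {
    try { sh(`docker exec ${id} pg_isready -U postgres`); break; }
    catch { execSync("sleep 1"); } // Unix-like host assumed
  }
  // 3. Run the integration tests against the containerized database.
  execSync("npm test", { stdio: "inherit", env: { ...process.env, DB_PORT: "5433" } });
} finally {
  // 4. Kill the container once used: every run starts from the same known state.
  sh(`docker rm -f ${id}`);
}
```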

Day forty: The void of the production environment

Recently, I met a brilliant developer, working alone, maintaining a messy piece of PHP code. He was not the originator of the project but had been in charge of it for two years. He told me that the manager had given him a virtual machine with everything on it at the beginning of the project, to help him. The same one he is still working on.

Currently, he is struggling with the customer and that software. He and the customer have different deployment environments, and the differences in server, language and framework versions create a huge mess.

Another project, another situation. This IT team has been relying on Ansible (with Puppet it would have been the same story) to deploy their software to the different environments. Despite the improvements brought by Ansible's automation, there is always a slight tension when launching the Ansible scripts. Maybe it is system entropy or virtual machine erosion; most likely the reason is that the virtual machines have never been deleted and recreated. Anyway, there are subtle differences between the environments, and the Ansible deployments sometimes fail when new features are shipped.

System erosion

With that team, we reached a common point of view on when Ansible should be used. We should rather use Ansible to prepare the virtual machines to host Docker: open the firewall, establish the network routes, configure the monitoring and so on. The software itself is shipped as a Docker image, copied by Ansible and launched.

Docker and containers can simplify your software deployments whether you have a private cloud or regular virtual machines. Simply install Docker on your virtual machines and change the way you ship your software; once past the initial effort, you won't regret it.

Day eighty: The good old mama's software factory

The last situation where Docker and containers really shine is inside your software factory.

Docker can make your software factory evolve from a monolithic, all-purpose but slow and frustrating platform into a real Software Factory As A Service (if you like the term SFAAS, it's mine 🙂).

The main differences between a software factory and a SFAAS are the following:

  • Product owners and team managers create new software factories for their projects directly through a web UI, picking the technologies and tools they need.
  • Developers can instantiate new environments to build or test the software without any paperwork or waiting for a round trip between Earth and Mars.
  • Integration engineers provide new tools and environments, accessible to the projects that want them.
  • Few interactions are necessary between the infrastructure and system administration teams and the software teams. It is a win-win solution and the IT bottleneck is removed.

Docker Software Factory: Marcel Birkner

I strongly recommend building software factories on top of containers, as these great initiative projects do.


If you have read the whole article, I can only say a big thank you, and I hope you have learned a thing or two. The advent of containers is really helping developers and ops, and I hope IT companies will fully embrace these technologies to make our profession more fun and attractive.


The disappointing quest for a headless CMS in 2017

In 2017, this blog is powered by Hexo.js. However, I am looking for a replacement, since Hexo.js lacks some crucial features.


TL;DR: Hexo.js is too limited; I want online post editing!

I have recently been working on replacing the technology powering my blog. A major point is that I am disappointed with its theme. I would like to replace it with a new technology, Vue.js, which I have already discussed there.

Since I am replacing the whole front end, I have been using the great hexo-generator-json plugin. However, I still have major issues with my assets (stored alongside the posts), and the setup is not really compatible with a CDN solution.

The second feature I am missing is the ability to edit my posts online. I am a Medium user and I love its mobile application for creating and editing my posts as well as watching statistics. Something I did not think of at first is that it is impossible to create new posts with Hexo.js without a computer. Indeed, to publish, you have to generate the site using a full Node.js environment, commit and push your modifications to GitHub, deploy the Docker container and so on. I have mostly automated these tasks, but I still don't have a CI environment available for them.
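
For the record, here is roughly what that automation looks like in my setup; every command and name below is specific to my own workflow (and therefore an assumption), but it shows why a full environment is required just to publish a post:

```typescript
// Publish chore for a static Hexo blog: regenerate, push the sources, rebuild
// and push the container. Registry and branch names are hypothetical.
import { execSync } from "node:child_process";

const run = (cmd: string) => execSync(cmd, { stdio: "inherit" });

run("npx hexo clean && npx hexo generate");              // needs a full Node.js environment
run('git add . && git commit -m "New post" && git push origin master');
run("docker build -t blog:latest .");                    // container that serves the generated site
run("docker tag blog:latest registry.example.com/blog:latest && docker push registry.example.com/blog:latest");
```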

I did not want to switch back to Drupal or WordPress, which are for me equally bloated solutions, slow and hard to tune. I wanted a compromise: why not a NoSQL database, a light REST backend, an admin UI, and that's all? At the beginning of this blog, my plan was to build this backend myself, but I quickly decided to concentrate on the content rather than on the code.

Fortunately, the technologies have evolved, so I made a list of headless / API-first CMS solutions and tested them.

Headless CMS: what is it?

Headless CMS

I won't spend too much time on the details; a good description is available there.

Basically, legacy/traditional CMS are highly coupled solutions where the following components are tied together:

  • Database : SQL Databases
  • Backend : PHP or worse
  • Front end: a templated front end or theme highly coupled with the backend API; barely modifiable at best, disposable at worst.
  • Separated WS / RPC : External service to access the backend data, not used by the front-end.
  • Admin UI : Bundled Admin UI.

Usually this kind of CMS is shipped as one big block called WordPress, Drupal, Joomla and so on.

The good news is that even these famous solutions are evolving to adopt the following modern and well-known principles:

  1. Decoupled front end: CMS front ends should be decoupled. The UI accesses the blog data and content through a REST API (see the sketch after this list). Headless CMS UIs usually rely on technologies such as Angular, React or Vue.js.
  2. Responsive front end: a headless CMS makes it possible to create various UIs depending on the device: smart watch, website, search engine, etc.
  3. NoSQL database: handling documents and content is the speciality of NoSQL databases, which let you add your own custom fields, categories and organization.
  4. Framework: a headless CMS should provide libraries or frameworks (NPM modules and so on) to access the content and handle security.
  5. DevOps: such a solution should be dockerized.
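
As a small illustration of the decoupled front end from point 1, here is a sketch of a UI-side module fetching posts from a hypothetical headless CMS REST API (the endpoint and the fields are assumptions, not a specific product's API):

```typescript
// Minimal sketch of a decoupled front end pulling content from a headless CMS.
// The endpoint, fields and response shape are hypothetical; adapt them to your CMS.
interface Post {
  slug: string;
  title: string;
  markdown: string;    // body stored as Markdown, rendered by the front end
  publishedAt: string;
}

const API_URL = "https://cms.example.com/api/posts"; // assumption

export async function loadPosts(): Promise<Post[]> {
  const res = await fetch(API_URL, { headers: { Accept: "application/json" } });
  if (!res.ok) throw new Error(`CMS request failed: ${res.status}`);
  return (await res.json()) as Post[];
}

// The UI (Angular, React, Vue.js...) only consumes loadPosts(), so the backend
// can be swapped without touching the front end.
```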

My expectations

I expect a headless CMS to contain:

  • a REST backend
  • documented RESTful APIs
  • a NoSQL-compatible database driver
  • a bundled admin UI that accesses the REST backend through the API
  • a Docker image or a docker-compose file
  • the possibility to add custom fields
  • the possibility to edit content in Markdown
  • cloud file storage for my media
  • an optimized solution: I don't want another WordPress installation
  • a Node.js solution: I want something lightweight
  • a self-hosted solution: I want to deploy it on Google Cloud.


Here is the list of my experiments and my opinion on each of them.

Directus: No!

The docker-compose setup was not working (I used this project), but the plain Docker instructions worked for me.

I launched it and soon enough received a lot of technical alerts, spoiling the pleasure of a fresh installation.

Directus / Error message

My last blocking point, and the reason I rejected it: I did not find any way to create a content category (called a table) in the admin UI. It seems you have to manipulate the SQL database to create them: no thanks (rant here).

GetMesh: Meh

Uh uh, a Java solution to power a small blog: no thanks.


Drupal and WordPress: Hydra CMS

Too big, too well-known. The REST API is for sure the next security hole of these solutions.

But the reason for my rejection: the UI cannot be separated from the backend!! And why would I want a UI embedded in my backend when I want to create a SPA website?

I will use them once they have removed their UI from the installer.

I suggest calling them Hydra CMS.

GraphCMS: Hipster$$CMS



Looks great, but I want my own self-hosted solution and I don't want to pay for it.

Site here

Ghost: GirlyCMS

Honestly, I had a crush on Ghost. Sexy, a great installer, great documentation: everything to tempt me like an attractive woman.

The problem is that Ghost has almost everything to charm me, but it comes with an embedded UI!!!

I don't want a UI, I want to build my own 🙁

Apart from that point, GhostCMS is really great.

Ghost CMS

It even has a Slack integration and loves Markdown!!

Ghost CMS Site

Cockpit: Blind CMS

Cockpit CMS

Listed in the Awesome CMS List, Cockpit CMS is a rather small solution.

The good points are:

  • Docker works fine.
  • The concepts and architecture are OK.
  • Nice admin UI; I really appreciated the way to create my collections.

But what really disappointed me:

  • No documentation (REST and so on). For a developer, it is unusable.
  • PHP: there is no documentation and the REST API is coded in PHP… meh.
  • A lone developer: yes, he is brave and we should encourage him, but he is freaking alone.

In summary, I think this project goes in the right direction but has taken a tough and spiky path. PHP is clearly not the appropriate language for such a solution: compared to an Express server, the amount of work to deliver is too high. It really needs more active contributors to become a good solution and to fill the big documentation black hole. I cannot help, since I don't want to code in PHP again, but the solution could be great.

Site is here


KeystoneJS

Well, at first glance I rejected it: I could not find any Docker image, or the few I found were not working. But my first attempt was dumb. KeystoneJS is not a headless CMS by itself; it is rather an implementation of a CMS, fully customizable, with which you can create your own blog!

Powered by Express and Node.JS, two technologies I am particularly fond of!

The site is there

The positive sides of KeystoneJS:

  • A slick project creator using Yeoman!
  • Modern technologies; in my opinion, the best choice for creating a CMS
  • The bundle contains what I expect (admin UI, REST backend, NoSQL database (MongoDB))
  • Fully customizable collections and so on

The negative points are:

  • Maybe too much code to begin with
  • What is the maturity of the base implementation?
  • How much effort is required to build your own website?
  • I have not yet found an NPM module for building a REST client


I have rejected most of these solutions.

  • I tried twice to install Directus and migrate my data, but I gave up. I don't believe in its concepts.
  • The lack of API documentation in Cockpit (HTML or à la Swagger) blocked my attempt to use it and migrate my data. The fact that the solution is developed in PHP holds back my wish to support it, and to be honest I am not fond of PHP REST backends.
  • I really love Ghost, but I don't want their UI, I want mine. Otherwise I would have used it.
  • I tried Drupal and WordPress, but the required system resources plus the fact that I cannot disable the UI are a big NO for me.

The consequence is that I am going with KeystoneJS, and I hope it won't take too much work to power a new version of my blog.

Stay tuned!




Disruption in Software Quality Assessment?

Like many other markets, the SQA/ALM market will soon meet #disruption. Domains like machine learning, deep learning and cloud computing will force it to evolve in the next few years. This article presents some predictions about the future of quality tools.

Disruption in Software Quality Assessment

Disclaimer: I am not a native English speaker and I am perfecting my English skills by writing these articles. If this topic interests you, please comment below or share the article with your friends. Any syntax or grammar mistakes will be fixed thanks to your wise comments.

A new generation of software quality tools is going to emerge. Machine learning, deep learning, DevOps, continuous delivery, continuous integration, cloud computing: all these movements are influencing the SQA/ALM software vendors. It has never been so easy and cheap to produce a new static analysis tool to measure some aspect of a piece of software. The open-source movement and the market's evolution are the direct contributors to this state. Under the now-famous name of "linters", well-known and unknown developers alike are creating the tools their activities require. And the software vendors face a dilemma: "Should I continue to build my own tools? How should I react when confronted with this plethora of scanners?"

Until recently, software developers depended on the highly specialized skills of quality software vendors to detect, analyze and fix the bugs in their software. And it is a big source of frustration, on both sides. Developers usually complain that the rules do not reflect their real needs or the complexity of their software: "Quality tools do not detect real problems, or detect them too late, or bury them under a trillion false positives." Software vendors keep feeding the hungry crowd rule sets and standards; a crowd much, much bigger than their own forces.

I predict that the disruption may come from these directions:

  • From open source: sooner or later, the basic needs of developers will be fulfilled by the open-source offering. Tools like PMD, FindBugs and so on have inspired a whole generation of developers. Young developers coming to the field through Angular 2, React or Go are already educated about the benefits of quality tools, and they rely heavily on linters well integrated into their CI or their IDE (Atom, Code); a sketch of such an integration follows after this list. Twitter and Facebook continuously produce and release new open-source tools to help the developer community. The recent examples of Flow or Prepack help a lot of developers increase the quality of their products.
  • From digital technologies: the increasing maturity of machine learning and deep learning should shortly bring us new kinds of tools able to predict bugs, code defects and usual developer decisions. I believe that the scientific research from Microsoft and Google will contribute indirectly to the software quality tool market. This topic is, unsurprisingly, much discussed (here).
  • From the software development process transformation: movements like Agile, DevOps, continuous integration and deployment, and chatbots are deeply changing the way developers collaborate. Several aspects are changing: communication (Slack, HipChat), software building (Jenkins, Travis CI, Microsoft TFS & Azure), software deployment (containers, PaaS, Amazon AWS)… The way a product is conceived, built and deployed requires tracking and measuring several quality aspects. The integration effort to produce these metrics and KPIs is tremendous and has to be adapted to each organization. Will developers be satisfied with code quality alone, or will they require higher-level metrics extracted from their development process?
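
To illustrate the "linter integrated in the CI" idea from the first point, here is a minimal sketch of a CI quality gate built on ESLint's programmatic CLIEngine API (the API available at the time of writing; later ESLint versions replace it with the ESLint class). The file patterns and formatter are placeholders, not a recommendation:

```typescript
// Minimal CI quality gate driven by a linter's programmatic API (ESLint's CLIEngine).
// File globs are placeholders; the build fails whenever lint errors are found.
import { CLIEngine } from "eslint";

const cli = new CLIEngine({ useEslintrc: true });
const report = cli.executeOnFiles(["src/**/*.js"]);   // lint the project sources

const formatter = cli.getFormatter("stylish");
console.log(formatter(report.results));               // human-readable output in the CI log

if (report.errorCount > 0) {
  process.exit(1); // break the build: quality feedback stays close to the developer
}
```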


Who will be the future leaders of the ALM market? Who will be the fastest to adapt to the current technology and data disruption? Do you know of tools matching these descriptions?

If this article has been useful or interesting, stay connected: I will write new articles on this subject.

One of my future articles will present Codacy, an emerging code quality platform. This platform aims to bring quality control as early as possible into your development process, to detect bugs early and reliably. I will compare this solution with the famous market leader, SonarQube.


Codacy: a great example of a technological blog post


Some months ago, I wrote an article about a new static analysis tool and a great team, Codacy.

The original blog post is available on DZone and on my blog.

Continue Reading


Five trending technologies for Digital Transformation

Digital Transformation

Today I stumbled upon an interesting article: a summary of, and prediction about, some emerging technologies useful for digital transformation. Have you ever tried them?

These technologies are:

  • Apache Spark: the well-known open-source solution for machine learning and deep learning
  • Okta: a solution I used in a previous project to federate identity and authentication; a SaaS service that provides SSO/SAML authentication. Its usage could be better documented, though. Take a look at it!
  • MultiChain: a software tool for web assets and legal contracts on a blockchain; it lets its customers control whether the chain is private or public
  • Puppet: an IT automation solution that I have also used over the past two years to build software factories. I am rather dubious about how it compares nowadays against Docker and its ecosystem. It is basically a client-agent/server solution to deploy servers and applications. Worst of all, it is written in Ruby 🙂
  • Capriza: Capriza seems to be a low-code solution. I personally did not know it, since my new company has partnerships with other solutions (such as Appway). I will take a look at it too.

The link: 5 technologies for rapid Digital Transformation