
Database Lifecycle Management (DLM) Process and Tools

Applying agile principles to database development, DLM keeps the enterprise adaptable to a changing business environment and delivers more value to customers sooner.

[Figure: Database Lifecycle Management (DLM) Process Diagram]

[Figure: Popular DLM DevOps Tools]

What is DLM?

Database Lifecycle Management (DLM) is the application of agile software development practices to the process of database development. It facilitates the continuous refinement of even the most complex database applications. DLM ensures that the organization can deliver more value to customers faster by enabling quick adaptation to increasingly frequent changes in the business environment.

In simple terms, the DLM cycle begins when developers make changes to the code and commit them to a shared environment, where the changes are automatically tested. Any errors are reported and addressed, and the validated code is then integrated, triggering the next version release. Meanwhile, monitoring tools watch for performance issues. In typical DevOps fashion, there is full process visibility and continuous communication between development and production teams.
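The cycle described above can be sketched as a simple loop. This is purely illustrative (the stage names are made up, not taken from any specific DLM tool): a committed change either passes its automated tests or is sent back for fixes, and every validated change is integrated, released, and then watched by monitoring.

```python
# Illustrative sketch of the DLM cycle: commit -> test -> (fix) ->
# integrate -> release -> monitor. Stage names are hypothetical.
def dlm_cycle(change, tests_pass, monitor_ok):
    log = [f"commit: {change}"]
    if not tests_pass:
        # Errors are reported to developers, fixed, and retested.
        log += ["report errors", "fix", "retest"]
    # Validated code is integrated, triggering the next release.
    log += ["integrate", "release"]
    # Monitoring tools watch the released change for performance issues.
    log.append("monitor: ok" if monitor_ok else "monitor: investigate")
    return log

print(dlm_cycle("add index", tests_pass=False, monitor_ok=True))
# -> ['commit: add index', 'report errors', 'fix', 'retest',
#     'integrate', 'release', 'monitor: ok']
```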

The interdependence of databases in complex database applications makes DLM an even more crucial component of software development. Data, as one of the key assets of the modern enterprise, requires a rather different development and management approach than Application Lifecycle Management (ALM) provides.

Key Steps in DLM

I. DEVELOPMENT PHASE

1. Version Control – Version control is the foundation of DLM, making it easier to share code and track changes. As a result, the database application can be rolled back to the last known working version. Supervision, code auditing, and testing are other key benefits of version control.
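A minimal sketch of what version-controlled database changes look like in practice, assuming a simple versioned-migration scheme (the table and file names are hypothetical, not a specific tool's convention): each schema change is a numbered script kept in source control, and applying the scripts in order reproduces any known working version of the database.

```python
# Minimal versioned-migration sketch using SQLite. In a real project
# each entry would be a .sql file tracked in Git, e.g.
# migrations/001_create_users.sql, migrations/002_add_email.sql.
import sqlite3

MIGRATIONS = {
    1: "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)",
    2: "ALTER TABLE users ADD COLUMN email TEXT",
}

def current_version(conn):
    # A bookkeeping table records which migrations have been applied.
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (v INTEGER)")
    row = conn.execute("SELECT MAX(v) FROM schema_version").fetchone()
    return row[0] or 0

def migrate(conn, target):
    """Apply every migration above the current version, up to target."""
    for v in sorted(MIGRATIONS):
        if current_version(conn) < v <= target:
            conn.execute(MIGRATIONS[v])
            conn.execute("INSERT INTO schema_version VALUES (?)", (v,))
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn, target=2)
print(current_version(conn))  # -> 2
```

Because the scripts live in version control, rebuilding the schema at any past version is just a matter of replaying migrations up to that number.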

2. Continuous Integration – CI involves automatic testing and timely error notifications to prevent integration problems. At the heart of this step is the immediate feedback to developers on any errors. It begins when database changes are imported into the master code repository, which contains the latest working version of the code. A series of builds with integration tests is used to validate the code changes. After any errors are addressed, code changes are synced for the next version release.
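The CI feedback loop can be sketched as a pipeline that runs a series of checks and reports the first failure straight back to the developer. This is an illustrative model only (real pipelines are defined in tools such as Jenkins or GitLab CI, and the check names below are invented):

```python
# Sketch of a CI pipeline: run checks in order, fail fast, and only
# a fully green run is eligible for integration.
def run_pipeline(change, checks):
    for name, check in checks:
        if not check(change):
            return f"FAILED: {name}"  # immediate feedback to the developer
    return "PASSED: ready to merge"

# Hypothetical checks for a database change set.
checks = [
    ("syntax check", lambda c: c["sql_valid"]),
    ("integration tests", lambda c: c["tests_pass"]),
]

good = {"sql_valid": True, "tests_pass": True}
bad = {"sql_valid": True, "tests_pass": False}
print(run_pipeline(good, checks))  # -> PASSED: ready to merge
print(run_pipeline(bad, checks))   # -> FAILED: integration tests
```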

II. OPERATIONS PHASE

3. Release Management – A good version control system during the development phase, together with continuous integration, provides for a steady development pace, with frequent releases and minimized risk. Continuous delivery is automated, and all deployments safely make their way to the respective environments. Release Management tools ensure that the automatic deployment of changes is timely and error-free, delivering maximum value to the users.
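Automated promotion across environments can be sketched as follows. The environment names and the deploy callback are assumptions for illustration: a release advances through each environment in order and is halted at the first failed deployment, so a bad build never reaches production.

```python
# Sketch of release promotion: deploy to each environment in order,
# stopping at the first failure. Environment names are illustrative.
ENVIRONMENTS = ["staging", "pre-production", "production"]

def promote(release, deploy):
    deployed = []
    for env in ENVIRONMENTS:
        if not deploy(release, env):
            return deployed, f"halted at {env}"
        deployed.append(env)
    return deployed, "released"

# Simulated deploy step that fails in pre-production.
ok_up_to = {"staging": True, "pre-production": False}
deployed, status = promote("v1.4.0", lambda r, env: ok_up_to.get(env, True))
print(deployed, status)  # -> ['staging'] halted at pre-production
```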

4. Monitoring, High Availability, and Data Locality – With the automated deployment of database changes, it is critical to monitor for any performance issues. The constantly evolving state of the databases requires continuous communication between teams at the various stages of development and operations, which is easily accomplished with the proper tools. Performance analytics allows teams to study the impact of database changes. The performance insights revealed in that process are taken into account when planning the next development stages, continuing the DLM cycle.
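A post-deployment performance check might look like the sketch below. The query names, baseline timings, and tolerance are all invented for illustration: timings observed after a release are compared against a baseline, and any regressions feed back into planning the next development cycle.

```python
# Sketch of a performance-regression check after a release.
# Baselines and metric names are hypothetical.
BASELINE_MS = {"orders_by_customer": 40, "daily_report": 900}

def regressions(observed_ms, tolerance=1.5):
    """Flag queries that slowed beyond tolerance x their baseline."""
    return [q for q, ms in observed_ms.items()
            if ms > BASELINE_MS.get(q, float("inf")) * tolerance]

after_release = {"orders_by_customer": 95, "daily_report": 850}
print(regressions(after_release))  # -> ['orders_by_customer']
```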