DevOps: Moving to the Mainstream?

A recent analyst study predicted that, by 2016, 25% of Global 2000 companies will have adopted DevOps, investing over $2B in related tooling and support.  This transition is further evidenced by the influx of DevOps sessions at EMC World, an event traditionally focused on storage and infrastructure.

While we at EMC are having more and more DevOps conversations with our customers, and our Federation Enterprise Hybrid Cloud is an ideal platform for automating many DevOps processes, I find that the term and practice of DevOps is quickly becoming diluted: many enterprises race to implement automation without a broader strategy, approach, or understanding of how a DevOps transformation can radically change IT.

To level set, DevOps is a combination of three key components: pipelines, continuous delivery tool chains, and collaboration.  Let’s look at each in a bit more detail:


Pipeline

A pipeline is an instantiated delivery workflow or process.  In other words, it is the collection of steps needed to move an accepted change request or use case linearly from development, through testing, and ultimately into production.  In most organizations, hundreds of active pipelines are instantiated at any given time, each tracking and guiding a specific change through the development and release process.  Pipelines are temporary: once a change is successfully in production, or is terminated for any of a myriad of reasons, such as business packaging dependencies or failure, the pipeline is archived or decommissioned.
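The pipeline lifecycle described above can be sketched in a few lines of Python. This is an illustrative model, not a real tool integration: the stage names, the `Pipeline` class, and the change-request ID are all hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical linear stages; real organizations define their own workflow.
STAGES = ["development", "testing", "production"]

@dataclass
class Pipeline:
    """One temporary pipeline instance tracking a single change request."""
    change_id: str
    stage_index: int = 0
    archived: bool = False
    history: list = field(default_factory=list)

    @property
    def stage(self):
        return STAGES[self.stage_index]

    def promote(self):
        """Move the change linearly to the next stage."""
        if self.archived:
            raise RuntimeError("pipeline already archived")
        self.history.append(self.stage)
        if self.stage_index < len(STAGES) - 1:
            self.stage_index += 1
        else:
            # The change reached production: the pipeline is done, so archive it.
            self.archived = True

p = Pipeline("CR-1042")          # hypothetical change-request ID
p.promote()                      # development -> testing
p.promote()                      # testing -> production
p.promote()                      # production reached: pipeline is archived
print(p.stage, p.archived)       # production True
```

Hundreds of such objects existing concurrently, one per in-flight change, is essentially what "hundreds of active pipelines" means in practice.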

Continuous Delivery Tool Chain

The continuous delivery tool chain is an integrated set of tools that enable, automate, and optimize the aforementioned pipeline.  This tool chain is required to consistently and reliably provision and deploy the correct version of infrastructure matched with the correct version of the application, depending upon which stage of the software development lifecycle (SDLC) you are in.  It recognizes and embraces the mutualistic relationship between infrastructure and applications.  In early stages, this tool chain connects and orchestrates infrastructure provisioning, configuration management, and code deployment tools, such as Federation Enterprise Hybrid Cloud, Puppet, Artifactory, Git, and Jenkins.  As the practice advances, testing, monitoring, compliance, and other tools, such as Selenium, Nagios, and Black Duck, are incorporated into the chain, enabling rapid feedback to application teams.
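The idea of a tool chain that pairs a specific infrastructure version with a specific application version can be sketched as a simple ordered sequence of steps. This is a minimal illustration only: the step functions, context keys, and version strings are assumptions, standing in for real integrations with tools like Puppet or Jenkins.

```python
# Each step is a callable the chain runs in order, passing a shared context.
# The tool behavior here is simulated; a real chain would call tool APIs.

def provision_infra(ctx):
    """Stand up infrastructure at the requested version."""
    ctx["infra"] = f"stack-{ctx['infra_version']}"
    return ctx

def configure(ctx):
    """Apply configuration management to the provisioned stack."""
    ctx["config"] = f"config-applied@{ctx['infra']}"
    return ctx

def deploy_app(ctx):
    """Deploy the matching application version onto the stack."""
    ctx["deployed"] = f"app-{ctx['app_version']} on {ctx['infra']}"
    return ctx

TOOL_CHAIN = [provision_infra, configure, deploy_app]

def run_chain(stage, infra_version, app_version):
    """Run every step, keeping infra and app versions matched for the stage."""
    ctx = {"stage": stage,
           "infra_version": infra_version,
           "app_version": app_version}
    for step in TOOL_CHAIN:
        ctx = step(ctx)
    return ctx

result = run_chain("testing", infra_version="2.3", app_version="1.7")
print(result["deployed"])   # app-1.7 on stack-2.3
```

The point of the chain is exactly this pairing: the same orchestration runs at every SDLC stage, so the infrastructure and application versions can never drift apart between environments.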


Collaboration

As the portmanteau suggests, and as further evidenced above, DevOps requires close collaboration between development and infrastructure operations.  This collaboration isn’t just meeting every other Friday to talk shop; rather, it is joint accountability and responsibility for rapidly introducing change into production with a high degree of quality and reliability.  Through this deep collaboration, enterprises are able to create a development fabric vectored toward creating value, eliminating waste, and ultimately continuously improving.

Countless studies, articles, journals, and even my own PowerPoint decks assert the benefits of DevOps: faster release cycles, improved recovery times, higher success rates, and greater productivity.  As IT organizations begin the transition toward becoming service brokers and manage more and more resources through software-defined services, the shift to DevOps will become increasingly critical for long-term success.

About the Author: Bart Driscoll

Bart Driscoll is the Global Innovation Lead for Digital Services at Dell Technologies. This practice delivers a full spectrum of platform, data, application, and operations related services that help our clients navigate through the complexities and challenges of modernizing legacy portfolios, implementing continuous delivery systems, and adopting lean DevOps and agile practices. Bart’s passion for lean, collaborative systems combined with his tactical, action-oriented focus has helped Dell Technologies partner with some of the largest financial services and healthcare companies to begin the journey of digital transformation. Bart has broad experience in IT ranging from network engineering to help desk management to application development and testing. He has spent the last 22 years honing his application development and delivery skills in roles such as Information Architect, Release Manager, Test Manager, Agile Coach, Architect, and Project/Program Manager. Bart has held certifications from PMI, Agile Alliance, Pegasystems, and Six Sigma. Bart earned a bachelor’s degree from the College of the Holy Cross and a master’s degree from the University of Virginia.