
@DevOpsSummit: Blog Post


The data as a service solution: new technologies deliver data at the speed of DevOps

Data Demands of DevOps | Part 2

In Part 1 of this article, we explored how data all too often becomes the critical bottleneck in software development, delaying projects and undermining the benefits of DevOps tools and processes. In Part 2, we'll look at the emerging category of Data as a Service solutions, which turn data from a drag to a driver.

Data as a Service
To deliver on the promise of DevOps and hit continuous-release targets for even the largest, most complex, and most integrated applications, companies need solutions that provide the same flexibility for data as for code bases, and the same automation and repeatability for data as for configurations. They need Data as a Service (DaaS). DaaS solutions offer a single, integrated platform that serves up faithful copies of source data as easily as code or configuration, with sophisticated features for collaboration, project agility, and strong governance.

Provision and Branch
The most fundamental capability of any DaaS solution is the ability to deliver multiple copies of data promptly and with sophisticated automation. In order to deliver true DevOps capabilities, data standup should take no more time and effort than containerized code delivery or automated configuration: a few keystrokes and a few minutes.

DaaS solutions often deliver this capability through sophisticated block sharing and virtual files. Instead of moving data from system to system, such solutions keep a single repository of record, and then create virtual data instances by pointing to the correct set of blocks within the repository. That allows data provisioning to occur rapidly and automatically, and decouples time and effort from the size of the data set.
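The pointer-based approach above can be sketched in a few lines. This is a minimal, hypothetical model (not any vendor's actual API): a "virtual copy" is just a list of references to blocks in a shared repository, so provisioning cost scales with the number of pointers rather than the size of the data.

```python
# Illustrative sketch of block-sharing provisioning; class and method
# names are invented for this example, not a real DaaS API.

class BlockRepository:
    def __init__(self):
        self.blocks = {}    # block_id -> bytes (single repository of record)
        self.next_id = 0

    def ingest(self, data, block_size=4):
        """Split source data into blocks once; return a pointer list."""
        pointers = []
        for i in range(0, len(data), block_size):
            self.blocks[self.next_id] = data[i:i + block_size]
            pointers.append(self.next_id)
            self.next_id += 1
        return pointers

    def provision(self, pointers):
        """Create a virtual copy: duplicate the pointer list, not the blocks."""
        return list(pointers)

    def read(self, pointers):
        """Materialize a virtual copy by following its block pointers."""
        return b"".join(self.blocks[p] for p in pointers)

repo = BlockRepository()
prod = repo.ingest(b"customer-orders-table")
dev_copy = repo.provision(prod)   # near-instant: copies pointers only
assert repo.read(dev_copy) == b"customer-orders-table"
```

However large the underlying data set grows, `provision` only ever copies a pointer list, which is why provisioning time can be decoupled from data size.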

Of course, the ability to provision a full initial copy of the data is not enough. For developers and testers to achieve the flexibility they need, the DaaS solution must be able to branch the data as easily as code. A DevOps-ready DaaS solution lets end users spin off additional copies of the data they are working on, complete with whatever adjustments they have made, or provision copies of production data as of any point in time. With this capability, developers and testers can keep code and data in sync even as they pursue parallel projects, working on different versions or tests.
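Branching can be modeled as copy-on-write over the same shared block store. In this hedged sketch (illustrative names, not a real product API), branching copies only a pointer list, and a write to a branch allocates new storage only for the modified block, leaving the parent branch untouched:

```python
# Hypothetical copy-on-write branching sketch over a shared block store.

class Branch:
    def __init__(self, repo, pointers):
        self.repo = repo                  # shared dict: block_id -> bytes
        self.pointers = list(pointers)    # private pointer list

    def branch(self):
        """Spin off a new branch of the current state: pointer copy only."""
        return Branch(self.repo, self.pointers)

    def write_block(self, index, data):
        """Copy-on-write: only the modified block gets new storage."""
        new_id = max(self.repo, default=-1) + 1
        self.repo[new_id] = data
        self.pointers[index] = new_id

    def read(self):
        return b"".join(self.repo[p] for p in self.pointers)

repo = {0: b"alice|", 1: b"bob|", 2: b"carol"}   # blocks shared by all branches
prod = Branch(repo, [0, 1, 2])
test = prod.branch()                  # instant branch for a tester
test.write_block(1, b"BOB|")          # tester's change stays private
assert prod.read() == b"alice|bob|carol"
assert test.read() == b"alice|BOB|carol"
```

This is the same mechanism behind snapshot-and-clone storage systems: parallel branches share unchanged blocks, so dozens of developer and tester copies cost little more than one.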

Bookmark and Share
DevOps isn't just about self-sufficiency. It's also about sophisticated collaboration. Without a DaaS solution, data often becomes the bottleneck to efficient collaboration.

For example, suppose a QA engineer is reviewing a new piece of code for bugs. This is a DevOps workplace, so it isn't dull, automated testing; perhaps it's an advanced scenario test, or a complicated A/B test setup. Say the tester finds a bug. He sends a note outlining it to the developer he has been working closely with. The developer uses the note, automated configuration tools, and so on to get her code into the same state, but she's unable to reproduce the bug. She lets the tester know she can't find it. The tester verifies the bug, and together, dev and test confirm that their code is in the same state. The difference must be in the data.

With a legacy solution, there are two options. Either the developer files an ops ticket to get her data into the right state (a process that could take days or weeks, and might fail repeatedly, depending on how the tester got his data), or she takes over the tester's data set. The second option lets her run down the bug quickly, at the cost of preventing the tester from doing any work at all. Either way, the process is broken. And if the code being tested is part of a major push, a daily feature cadence, or a crucial patch for a bug running rampant in production, it becomes clear how disruptive this data management task can be.

With a DaaS solution, users can save data at any state and share a copy of that data with any other user, with the same few clicks they would use to share code. Developers and testers don't contend for the same data. They can even skip the process of checking whether the problem is a data mismatch. Instead, they share data readily for every task, as easily and naturally as they share code or underlying hardware resources.
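The bookmark-and-share workflow in the debugging story can be sketched as follows. This is a minimal, assumed model (the names `bookmark` and `checkout` are invented for illustration): a bookmark captures an immutable snapshot of a user's pointer list, and sharing it gives another user an identical but independent copy.

```python
# Hypothetical bookmark/share sketch; a bookmark freezes a data state,
# and checking it out yields an independent, modifiable copy.

bookmarks = {}

def bookmark(name, pointers):
    """Capture the current data state as an immutable named snapshot."""
    bookmarks[name] = tuple(pointers)

def checkout(name):
    """Give any user their own independent copy of a bookmarked state."""
    return list(bookmarks[name])

tester_view = [0, 7, 3]              # tester's block pointers when the bug appears
bookmark("bug-1234", tester_view)    # tester saves the exact failing state
dev_view = checkout("bug-1234")      # developer reproduces it in seconds
dev_view[1] = 9                      # ...and can modify it without contention
assert tuple(tester_view) == bookmarks["bug-1234"]
```

The tester keeps working on his own copy while the developer debugs hers; neither blocks the other, and no ops ticket is needed.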

Refresh and Reset
Along with initial environment setup and collaborative debugging, test cycles are some of the most voracious consumers of data in the software development lifecycle. With legacy data delivery methods, testers often have to wait many hours for data to be provisioned to their test environment, in order to run a fifteen-minute test. This creates a very low ceiling on the number of test cycles available in a day, and can prevent the early detection and collaborative resolution of issues that are the keys to DevOps quality.

A DaaS solution can refresh an environment in minutes, accelerating the test cycle by a factor of ten. Top-line solutions can do even more. A refresh repopulates the test environment with data from production, but a strong DaaS solution can simply rewind the data to its state immediately before the test. This means that any changes to the data need to be made only once. A test cycle characterized by long waits for data and repeated set-up activity can be replaced by one where each test is followed by a rapid, effortless reset, and any data set-up is performed just once.

Governance
The DevOps movement drives cross-functional collaboration to meet the needs of both developers and operations staff, and a good DaaS solution will serve both groups. The capabilities above outline some of the benefits a DaaS solution provides to Dev and Test teams, but the solution should meet Ops needs as well.

To do that, it needs a distinct set of permissions and management interfaces, so that Ops can carefully manage existing infrastructure and resources even as Dev and Test staff spin up their own environments as needed. A well-designed DaaS tool will not only save Ops time and effort by automating some of the dullest, most repetitive data-delivery tasks; it will also provide a full view of the team's resources for optimal management.

Conclusion
The growing acceptance of the DevOps philosophy, and the maturing ecosystem of associated tools, promises to revolutionize software development across industries, replacing outdated processes and models with collaborative teams that can truly deliver business value at digital speeds. Data as a Service solutions will be a key component of this revolution, enabling the full stack of environment creation, sharing, and management, and potentially doubling overall project delivery speed.

More Stories By Louis Evans

Louis Evans is a Product Marketing Manager at Delphix. He is a subject-matter expert developing content, surveys, and best practices pertinent to the DevOps community. Evans is also a speaker at DevOps-focused industry events. He is a graduate of Harvard College, with a degree in Social Studies and Mathematics.
