The DevOps Database | Part 4

A culture of continual experimentation and learning

In the final post in this series about bringing DevOps patterns to database change management, we’re going to discuss the Third Way.  Here’s a refresher on the Third Way from the introductory post in this series:

The Third Way: Culture of Continual Experimentation & Learning – This way emphasizes the benefits that can be realized through embracing experimentation, risk-taking, and learning from failure.  By adopting this kind of attitude, experimentation and risk-taking lead to innovation and improvement while embracing failure allows the organization to produce more resilient products and sharpen skills that allow teams to recover more quickly from unexpected failure when it does occur.

The Third Way is by far the most intriguing of “The Ways” to me. I’ve spent the lion’s share of my career in early-stage startups, where cycles of experimentation, learning, and failure are the norm. When bringing a new product or service to market, your latest release is never your last. It may not even be the last release this week. You are constantly experimenting with new workflows and technology, learning about your target market, and getting more valuable information from your early failures than your early successes. The Third Way is crucial to the success of an early-stage company.

While the benefits of the Third Way still apply to more established companies and product lines, practicing it becomes more difficult. The potential negatives of experimentation and risk-taking are much harder to stomach when you have a large base of paying customers with SLAs. This aversion to risk is most acute when you’re talking about your data platform, where outages, performance problems and data loss are not an option. Complicating matters further is how difficult it can be to unwind the database changes that were made to support a specific version of your app. Application code can usually be uninstalled and replaced with the previous working version fairly simply should problems arise. Reverting the database changes that support that version of the application is more akin to defusing a bomb. Database changes must be reverted delicately and meticulously to avoid errors and omissions that could negatively impact your data platform.

What DBAs and release managers need to facilitate experimentation and risk-taking on the data platform is a special combination of tools and process. This combination should make it easy to identify the root cause of issues, quickly remediate problems caused by application schema structure, and revert to a previous version of the schema safely. When we started Datical, we spent several hours in conversation and at whiteboards exploring these unique needs and hammering out a path to usher the Third Way into regular database activity.

A Rollback Designed With Every Change
The biggest problem with experimentation in the data platform is how difficult it is to move backward and forward through your schema’s version history. We feel the best way to bring more flexibility to the process of upgrading and reverting a schema is an attitude shift: your rollback strategy for each database change must become as important as the change itself. The best time to craft your rollback strategy is when the change itself is being designed. When the motivation for the change is fresh in your mind and the dependencies of the object being created, dropped or altered are clearly mapped out, a developer or DBA can better craft a rollback strategy for which every contingency has been considered. This leads to a stronger safety net and makes your application schema as agile and easily managed as your application code. The database no longer prevents you from being bold; instead, you can move quickly and safely between versions to accommodate experimentation.
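To make the idea concrete, here is a minimal sketch, in Python against SQLite, of what pairing every change with a rollback authored at design time might look like. This is a hypothetical illustration, not Datical’s actual product or API; the change IDs, table names, and helper functions are invented for the example.

```python
import sqlite3

# Hypothetical sketch: every schema change ships with the rollback that
# undoes it, written while the change itself is being designed.
CHANGES = [
    {
        "id": "001-create-orders",
        "apply": "CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL);",
        "rollback": "DROP TABLE orders;",
    },
    {
        "id": "002-add-status-column",
        "apply": "ALTER TABLE orders ADD COLUMN status TEXT;",
        # Older SQLite versions cannot drop a column, so this rollback
        # rebuilds the table without it -- exactly the contingency you
        # want mapped out while the change is fresh, not during an outage.
        "rollback": """
            CREATE TABLE orders_old (id INTEGER PRIMARY KEY, total REAL);
            INSERT INTO orders_old SELECT id, total FROM orders;
            DROP TABLE orders;
            ALTER TABLE orders_old RENAME TO orders;
        """,
    },
]

def deploy(conn):
    for change in CHANGES:
        conn.executescript(change["apply"])

def revert(conn):
    # Unwind in reverse order so dependencies come apart cleanly.
    for change in reversed(CHANGES):
        conn.executescript(change["rollback"])

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    deploy(conn)
    revert(conn)  # schema is now back to its pre-deploy state
```

Notice that the second rollback is far more involved than the change it reverses; that asymmetry is precisely why the rollback deserves the same design attention as the change itself.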

The Richness of Model-Based Comparison & Remediation
I’m a native Texan, born and raised in Austin. I know the jokes about how proud and vocal Texans can be about their home state. That being said, I spend more time telling people about the model-based approach Datical has applied to Database Change Management than I do telling them how wonderful it is that I was fortunate enough to be born in the best state in the country. The advantages of the model-based approach really shine through when it comes to experimentation and troubleshooting. The model allows you to annotate all objects and modifications to your application’s schema with the business reason that prompted them. This detailed history is invaluable when designing new changes or refactoring your schema as part of an experimental exercise. You immediately know which objects are most crucial, what your dependencies are, where you need to tread lightly, and which areas are ripe for experimentation due to the changing needs of your business. Designing intelligently dramatically reduces risk.
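As a toy illustration of what annotating a schema model with business reasons might look like (Datical’s real model format is richer than this; the structures and table names below are invented for the example):

```python
from dataclasses import dataclass, field

@dataclass
class Table:
    name: str
    columns: dict = field(default_factory=dict)  # column name -> SQL type
    reason: str = ""  # the business motivation travels with the object

# A toy schema model annotated with the reason each object exists.
model = {
    "customers": Table(
        name="customers",
        columns={"id": "INTEGER", "email": "TEXT"},
        reason="Core identity table; every downstream report depends on it.",
    ),
    "promo_codes": Table(
        name="promo_codes",
        columns={"code": "TEXT", "expires": "TEXT"},
        reason="Q2 marketing experiment; safe to refactor after the campaign.",
    ),
}

# When planning a refactor, the annotations say where to tread lightly
# and where experimentation is cheap.
for table in model.values():
    print(f"{table.name}: {table.reason}")
```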

Troubleshooting with models is also dramatically faster and more reliable than other methods. Programmatically comparing models allows you to determine the differences between two databases much more quickly than manually comparing diagrams or SQL scripts.  You know with certainty exactly what has changed and what is missing in a fraction of the time that human review takes.

Once you have identified the differences, remediation is as simple as plugging one, some or all of the determined differences into the model and deploying those changes to the non-compliant instance.
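A minimal sketch of that comparison-and-remediation loop, using plain dictionaries as stand-in models (again hypothetical; the function names and table data are invented for illustration):

```python
# Stand-in models: table name -> {column name: SQL type}.
expected = {
    "customers": {"id": "INTEGER", "email": "TEXT"},
    "promo_codes": {"code": "TEXT"},
}
actual = {  # a drifted, non-compliant instance
    "customers": {"id": "INTEGER", "email": "TEXT"},
}

def diff_models(expected, actual):
    """Programmatic comparison: report what is missing and what is extra."""
    missing = {t: cols for t, cols in expected.items() if t not in actual}
    extra = [t for t in actual if t not in expected]
    return missing, extra

def remediation_sql(missing):
    # Plug one, some, or all differences back in as deployable statements.
    statements = []
    for table, cols in missing.items():
        col_defs = ", ".join(f"{name} {sqltype}" for name, sqltype in cols.items())
        statements.append(f"CREATE TABLE {table} ({col_defs})")
    return statements

missing, extra = diff_models(expected, actual)
print("extra tables:", extra)    # []
print(remediation_sql(missing))  # ['CREATE TABLE promo_codes (code TEXT)']
```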

Flexible, Quick Rollback
If you’ve taken our advice to implement your rollback strategy when you implement a change, recovering from disaster becomes testable, fast, and simple.  Before going to production or a sensitive environment, you should always test your rollback steps in dev, test and staging.  This will allow you to make any tweaks or changes to your rollback strategy before you are in a pinch.  Think of it like testing your smoke alarm.  Hopefully you’ll never need it, but it’s nice to know that it’ll work if you do.
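That smoke-alarm test is easy to automate. A hypothetical sketch: deploy a change, run its pre-authored rollback, and assert the schema landed exactly where it started (here using SQLite’s sqlite_master catalog as the snapshot; the change itself is a stand-in):

```python
import sqlite3

def schema_snapshot(conn):
    # sqlite_master holds the full DDL of every object in the database.
    return conn.execute(
        "SELECT type, name, sql FROM sqlite_master ORDER BY name"
    ).fetchall()

def test_rollback_restores_schema():
    conn = sqlite3.connect(":memory:")
    before = schema_snapshot(conn)
    conn.executescript("CREATE TABLE orders (id INTEGER PRIMARY KEY);")  # deploy
    conn.executescript("DROP TABLE orders;")                             # rollback
    after = schema_snapshot(conn)
    assert before == after, "rollback did not return schema to its prior state"

test_rollback_restores_schema()
print("rollback verified")
```

Run this in dev, test, and staging on every release, and the rollback you reach for in production is one you have already exercised.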

Let’s say the worst happens. You deploy a new set of database changes and the application’s performance degrades or errors are logged. The decision is made to revert the entire installation to the previous version. Because you have carefully designed, tested and refined your rollback strategy, rolling back the database changes becomes a push-button operation or a single invocation of a command-line tool. No more running disparate SQL scripts or undoing changes on the fly. You can be confident that your database has been returned to the same state it was in before the upgrade.
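As a sketch of what that single invocation could look like, here is a hypothetical script (not any particular vendor’s CLI; a real tool would also track which changes are currently applied and revert only those, where this sketch assumes the whole list was deployed):

```python
import argparse
import sqlite3

# Pre-authored change/rollback pairs, as in the earlier sketch.
CHANGES = [
    ("001-create-orders",
     "CREATE TABLE orders (id INTEGER PRIMARY KEY);",
     "DROP TABLE orders;"),
    ("002-index-orders",
     "CREATE INDEX idx_orders ON orders (id);",
     "DROP INDEX idx_orders;"),
]

def main():
    parser = argparse.ArgumentParser(description="Deploy or revert schema changes")
    parser.add_argument("action", choices=["deploy", "rollback"])
    parser.add_argument("--db", default="app.db")
    args = parser.parse_args()

    conn = sqlite3.connect(args.db)
    if args.action == "deploy":
        for change_id, apply_sql, _ in CHANGES:
            conn.executescript(apply_sql)
            print(f"applied {change_id}")
    else:
        # The push-button path: replay the pre-authored rollbacks in
        # reverse order -- no hand-run SQL, no improvising during an outage.
        for change_id, _, rollback_sql in reversed(CHANGES):
            conn.executescript(rollback_sql)
            print(f"rolled back {change_id}")
    conn.close()

if __name__ == "__main__":
    main()
```

With something like this in place, `python release.py rollback` is the entire recovery procedure.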

Summary
The database has long been handled with kid gloves, and for good reason. Data is a precious resource for consumers and businesses. Consumers provide data to businesses and trust that it will be kept safe and used to offer them better products and services. Businesses rely on data to strategize, grow, and become more efficient and profitable. It is the lifeblood of our economy. As our ability to collect and process data grows, so does the rate at which an enterprise must act on what it learns from that data. Data must be kept safe, but the database must become more agile to accommodate the growing pressure for faster value realization of business intelligence initiatives. DevOps patterns hold the key to this necessary agility while maintaining or improving the security and integrity of data stores. The database and DBAs need to be brought into the application design process earlier and must be treated as the first-class stakeholders and assets that they are. Companies that acknowledge this and move to adopt DevOps patterns and include their database teams will be at a distinct competitive advantage over those that don’t. Don’t miss the boat!

More Stories By Pete Pickerill

Pete Pickerill is Vice President of Products and Co-founder of Datical. Pete is a software industry veteran who has built his career in Austin’s technology sector. Prior to co-founding Datical, he was employee number one at Phurnace Software and helped lead the company to a high-profile acquisition by BMC Software, Inc. Pete has spent the majority of his career in successful startups and the companies that acquired them, including Loop One (acquired by NeoPost Solutions), WholeSecurity (acquired by Symantec, Inc.) and Phurnace Software.
