Deployment Management with @Plutora | @CloudExpo [#DevOps #APM]


Adapt Environment Management to Cloud Deployments with Plutora

Despite the prevalence of public and private clouds in the enterprise, most IT departments still adhere to operational models designed for physical servers and infrastructure, models built around complex environment setup processes. As a result, organizations miss opportunities to take advantage of dynamic cloud-based deployments for non-production environments.

There are several reasons why it is often too difficult for enterprises to stand up a non-production environment for testing or staging. Databases have to be provisioned, application servers need to be configured, and a testing environment for a complex system can take weeks or months to certify as ready for use. All of this demands significant coordination and effort. Organizations typically simplify matters by treating environments as permanent, because they lack a tool like Plutora to help them track their environment management efforts.

  • Without Plutora, you'll use your clouds just like a colo facility and won't be able to adapt your processes to new possibilities. You'll over-provision your environments to make up for the lack of visibility into what's really needed.
  • With Plutora, you gain insight into how much effort is required to coordinate environment management tasks and you get a dashboard showing you when environments are critical during a release timeline. These two capabilities give you the opportunity to adapt environment management to the dynamic possibilities of cloud-based deployments.

Cloud vs. Colo: What's the Difference?

Before the advent of cloud computing systems such as OpenStack and EC2, we had colocation centers. Companies rented physical space and installed servers in data centers; I remember shipping expensive physical infrastructure to colo facilities back in 2001. Applications were installed directly on those physical assets, and if you needed a new testing environment you ordered more physical servers and sent them to your data center. Colos were all CapEx; your environments were your physical servers.

Fast-forward 14 years and most large enterprises are either using or planning to use a private cloud such as OpenStack. Many of these companies still rent space in colo facilities and still run dedicated data centers for their private clouds, but there is now an opportunity to model environment capacity as a dynamic function. If you need a new environment, you don't need to purchase more servers; you just provision more cores on the fly to adapt to changing demand.
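
To make "provision on the fly" concrete, here is a rough sketch of what it can look like against EC2 using boto3. The AMI ID, instance type, and tags are placeholders rather than anything Plutora-specific, and a private cloud such as OpenStack would follow the same pattern with its own SDK.

    # Minimal sketch: launch and later tear down a short-lived test host on demand.
    # The AMI ID, instance type, and tags below are placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    def provision_test_host(release_id):
        """Launch one instance tagged to a release so it can be found and torn down later."""
        response = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",   # placeholder AMI
            InstanceType="m3.large",           # placeholder size
            MinCount=1,
            MaxCount=1,
            TagSpecifications=[{
                "ResourceType": "instance",
                "Tags": [
                    {"Key": "purpose", "Value": "qa-environment"},
                    {"Key": "release", "Value": release_id},
                ],
            }],
        )
        return response["Instances"][0]["InstanceId"]

    def tear_down(instance_ids):
        """Release the capacity as soon as the environment is no longer needed."""
        ec2.terminate_instances(InstanceIds=instance_ids)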

This is the vision of cloud computing: the ability to react to demand as needed. Unfortunately, this vision runs into an obstacle: most organizations don't plan for dynamic environments because they are too difficult to manage and model.

So, what happens? A company invests millions in standing up internal clouds only to have application teams fire up static environments that run for years. In other words, your application teams use your cloud like an old colo because they haven't adapted their processes to it. They don't take advantage of the dynamic capabilities of cloud computing because it takes too much effort to "hydrate" an environment.

Infrastructure Isn't the Problem: It's the Applications

Application setup time is prohibitive, and the challenge isn't purely technical. When a large e-commerce site needs to set up an end-to-end testing environment, it might take ten departments coordinating more than fifty people across infrastructure, database, and application configuration work. That work can span the full end-to-end architecture, from the website down to the transaction databases that fuel orders. Instead of accounting for this setup time, many organizations simply write it off as lost time.

In a static environment, this setup process happens so infrequently that it isn't even tracked, and that is the root of the problem. If you only have to set up your QA and staging environments once every two years, why bother optimizing the process? (Hint: in the cloud it should happen far more frequently.)

There is a better way to address setup complexity. Start using Plutora to orchestrate the hundreds of steps necessary for standing up an environment. Use the tools we've designed to properly account for the time and effort it takes to get an environment ready for use. Once you've done this and identified the necessary effort, you can use Plutora to factor environment setup time into your release schedules.
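
To give a feel for the bookkeeping involved, here is a small hypothetical sketch (illustrative task names and durations, not Plutora's API): each tracked setup task carries an owner and a duration, and working backward from the release date tells you when environment setup has to begin.

    # Hypothetical sketch: sum the tracked setup tasks and work backward from the
    # release date to a "start setup by" date. All numbers are illustrative.
    from datetime import date, timedelta

    setup_tasks = [
        {"task": "provision databases",     "owner": "DBA team",        "days": 5},
        {"task": "configure app servers",   "owner": "middleware team", "days": 4},
        {"task": "load and mask test data", "owner": "data team",       "days": 6},
        {"task": "certify environment",     "owner": "QA team",         "days": 10},
    ]

    total_days = sum(task["days"] for task in setup_tasks)   # calendar days
    release_date = date(2015, 9, 1)                          # placeholder release date
    start_setup_by = release_date - timedelta(days=total_days)

    print(f"Total setup effort: {total_days} days")
    print(f"To hit the {release_date} release, start environment setup by {start_setup_by}")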

Use Plutora to Track Environments: Take Advantage of the Cloud

The environment setup problem has two sides. On one side, many application teams need to maintain multiple environments for parallel development tracks. On the other, there is the effort involved in setting up and tearing down complex end-to-end testing environments. What can be done?

First, no amount of planning is going to do away with the requirement for QA and staging environments, but properly planning when capacity is needed can help you identify opportunities for environment sharing and reuse. Even though your teams may require multiple staging and QA systems, a tool like Plutora can help you establish schedules for these systems so that you can effectively scale up or scale down cloud-based resources for environments as they are needed.
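
As a rough sketch of what schedule-driven scaling could look like (hypothetical booking data and placeholder instance IDs, not a Plutora integration), an environment only consumes cloud capacity inside its booked window:

    # Hypothetical sketch: keep an environment's instances running only while the
    # release schedule says the environment is booked. Data and IDs are placeholders.
    from datetime import date

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Booking windows would come from your environment schedule; these are placeholders.
    bookings = {
        "qa-env-1":      {"start": date(2015, 8, 1),  "end": date(2015, 8, 21), "instances": ["i-0aaa111122223333a"]},
        "staging-env-1": {"start": date(2015, 8, 15), "end": date(2015, 9, 1),  "instances": ["i-0bbb333344445555b"]},
    }

    today = date.today()
    for name, booking in bookings.items():
        if booking["start"] <= today <= booking["end"]:
            ec2.start_instances(InstanceIds=booking["instances"])
            print(f"{name}: booked today, instances running")
        else:
            ec2.stop_instances(InstanceIds=booking["instances"])
            print(f"{name}: not booked, instances stopped")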

Planning and tracking tasks involved in environment setup can also help you identify potential areas for automation that will reduce the time required to stand up and tear down testing environments.
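
For example, one simple way to surface those automation candidates from the tracked tasks is to flag anything that is still manual and consumes a large share of the setup window; the sketch below uses purely illustrative data.

    # Hypothetical sketch: flag tracked setup tasks that are still manual and
    # long-running as candidates for automation. Data is illustrative.
    tracked_tasks = [
        {"task": "provision databases",     "days": 5,  "manual": False},
        {"task": "configure app servers",   "days": 4,  "manual": True},
        {"task": "load and mask test data", "days": 6,  "manual": True},
        {"task": "certify environment",     "days": 10, "manual": True},
    ]

    candidates = [t for t in tracked_tasks if t["manual"] and t["days"] >= 4]
    for t in sorted(candidates, key=lambda t: t["days"], reverse=True):
        print(f"Automation candidate: {t['task']} ({t['days']} days, currently manual)")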

Times Have Changed, So Should Your Approach to Environment Management...

In the more static world of 2001, where you provisioned physical servers and mapped them to non-production environments, it didn't make sense to optimize environment setup and teardown because it was impractical to automate across the full architecture.

In 2015, it only takes a few minutes to spin up hundreds of virtual machines on a private cloud. Your organization needs to use a solution like Plutora to start capturing and optimizing the environment setup process so that you can take full advantage of the cloud. When you install OpenStack or start using Amazon EC2, you should adopt Plutora to help you manage your environments.


