Tackling Dependencies

Automated dependency resolution

You don't have to be around Linux for long before you hear about "the dependency problem," which is no problem at all for many users - until the day it bites them. In a nutshell, the problem is that most Linux applications depend on the operating system to provide various pieces of functionality that the applications need. These components most often take the form of shared libraries that are dynamically loaded and linked to the application at runtime. Problems occur when one or more of these libraries are replaced with a different (usually newer) version. Provided all the interfaces remain the same and the semantics of the functionality remain the same, there's no problem. However, due to security fixes, bug fixes, or new or improved functionality, the interfaces and semantics can and do change, and the changes can be enough to break the application.
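
To make the failure mode concrete, here is a minimal sketch in Python (all names, such as libfoo and report-tool, are hypothetical; this illustrates the general idea, not how any particular linker or distribution actually works). An application records the library interfaces it was built against, and an in-place library update that changes an interface breaks it the next time it loads.

```python
# Toy model of an application dynamically linked against shared libraries.
# All names are hypothetical; this only illustrates the failure mode.
from dataclasses import dataclass

@dataclass(frozen=True)
class SharedLib:
    name: str
    interface_version: int   # bumped when the interface or semantics change

def load(app_name, requires, installed):
    """Simulate runtime linking: fail if a required interface has changed."""
    for lib_name, wanted in requires.items():
        lib = installed.get(lib_name)
        if lib is None or lib.interface_version != wanted:
            raise RuntimeError(f"{app_name}: incompatible or missing {lib_name}")
    print(f"{app_name}: loaded OK")

installed = {"libfoo": SharedLib("libfoo", 1)}
requires = {"libfoo": 1}          # what the app was built and tested against

load("report-tool", requires, installed)         # works
installed["libfoo"] = SharedLib("libfoo", 2)     # an update changes the interface
try:
    load("report-tool", requires, installed)
except RuntimeError as err:
    print("breakage:", err)
```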

Individual libraries generally aren't distributed on their own but as part of a package of related components, and packages are installed as a whole. There's normally no way to take just one part of a package; in fact, doing so would be inviting trouble, since packages are composed of related components that need to be installed together to guarantee that they work correctly with one another.

It would seem reasonable to assume that once an application is installed and working there would be no reason to touch it, or any of the components that it depends on. However, there are myriad reasons why components end up changing. The most common are:

  • A new application is installed that requires a later version of a package containing one of the shared libraries used by the first application.
  • A security or bug fix affects a package used by the application.
  • An updated version of the OS from the distributor includes newer packages.
Whatever the cause, installing a package to satisfy the requirements of some totally unrelated piece of software can break a currently installed application, even when the two share no libraries at all, simply because some of the libraries of the two applications are shipped in the same package. The more applications that are installed, the greater the chance of problems, and the more complex it becomes to find a combination of packages that satisfies all of the dependencies.
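
The following sketch illustrates that package-level conflict with made-up package and library names: two applications share no libraries, yet upgrading the one package that contains libraries for both breaks the application that didn't need the upgrade.

```python
# Hypothetical illustration: two applications share no libraries, yet
# upgrading one package to satisfy the second application breaks the first,
# because both applications' libraries ship inside the same package.
packages = {
    "libs-common": {                               # made-up package name
        "1.0": {"liba.so": 1, "libb.so": 1},
        "2.0": {"liba.so": 2, "libb.so": 2},       # both libraries changed
    },
}

apps = {
    "app-old": {"libb.so": 1},   # uses only libb
    "app-new": {"liba.so": 2},   # needs the newer liba
}

def works(requirements, installed_files):
    return all(installed_files.get(f) == v for f, v in requirements.items())

installed = dict(packages["libs-common"]["1.0"])
print("before:", {name: works(req, installed) for name, req in apps.items()})

# Installing app-new forces libs-common 2.0; the package is replaced as a
# whole, so app-old's libb.so changes too, and app-old breaks.
installed = dict(packages["libs-common"]["2.0"])
print("after: ", {name: works(req, installed) for name, req in apps.items()})
```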

This problem isn't unique to Linux. It has long been a problem on many Unix systems that make heavy use of shared libraries, and it's essentially the same as the famous "DLL hell" that afflicted older versions of Windows (a Windows .dll file is a dynamically loaded shared library). Unix vendors reduced the problem to manageable proportions by tightly controlling the evolution of the system and giving third-party vendors plenty of advance warning of impending changes. Microsoft adopted similar tactics, becoming strict about which libraries third parties were free to change and which they weren't, and making it difficult for third-party applications to overwrite system-provided libraries.

Linux is essentially no better or worse at shared library management than Unix or Windows. What's different is that there's no central coordinating authority to make sure that changes happen in a controlled and consistent manner. In some respects this is one of the strengths of Open Source development: it allows change to happen at its natural pace and forces people to be more aware of the potential problems associated with change. It's also a weakness from the point of view of an end user who has to integrate components with differing dependency requirements into a single system.

Linux distributors spend a lot of time and effort making sure that their systems are delivered with all dependencies correctly resolved, and that any updates they create don't disturb this balance. They often go so far as to take security, bug, and performance fixes - which the upstream development group typically creates against the latest version of its software, probably not the version currently shipping - back-port them to the shipping version, and test to make sure the changes don't introduce any inconsistencies. However, distributions have limited control over third parties, both commercial and Open Source, that are reluctant to re-test their products with every new version of a package on which they depend.

One of the interesting benefits Linux holds for its enterprise adopters is that it's multi-sourced. Essentially the same product is available from multiple vendors, improving competition and avoiding lock-in. Although managing multiple Linux distributions adds overhead, some companies prefer not to put all their eggs in one basket and use multiple distributions to help ensure that they don't end up locked in to a specific distribution.

However, this adds a new dimension to the dependency problem.

In many instances, different packages are used on different distributions. There's no single standardized packaging system, although most major Linux distributors use the RPM packaging system, which defines how a package is constructed and what information it carries about the location of individual files and a set of dependency rules. Unfortunately, a package with the same name on two different distributions can contain different revisions of components, or even different components altogether. To add to the problem, not all Linux systems use the same packaging mechanism: Debian-based systems, for example, use their own .deb package format rather than RPM.

The end result is that the configuration management of these systems becomes quite complex. Applying something such as a security fix may necessitate other changes to bring the dependencies back into alignment, and then subsequent testing of all applications running on those revised systems/distributions before they can be declared stable and rolled out into production. This has to happen independently on each distribution platform.

Aduva OnStage

Having lived and worked with this problem, and recognizing the need for a better solution than the mostly manual process that they and everyone else was using, the founders of Aduva began work on trying to automate the dependency resolution process.

One of the key insights behind their system was that the package level - the level at which most people were working - is too high a level for successful resolution. They built a database (the Aduva Universal KnowledgeBase) of dependency information based on the contents of the packages: the individual files they contain.

By extending the dependency information down to the file level, with specific dependency rules for individual files, it becomes much easier to find solutions to dependency problems. Once a set of solutions is found, the information stored in the database about the composition of packages is used to select the set of packages, at the highest version levels possible, that implements a solution. Because the database stores distribution-specific package information, the system can derive package lists specific to individual distributions for the same dependencies.
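
The article doesn't describe Aduva's actual algorithm or data model, but a rough sketch of the idea, with invented package catalogs and a deliberately naive resolver, might look like this: dependencies are expressed on individual files, and for each distribution the resolver picks the highest-versioned packages that supply those files without conflicting with any other file requirement.

```python
# Hypothetical sketch of file-level resolution.  Package names, versions,
# and file interface numbers are all invented for illustration.
required_files = {"liba.so": 2, "libb.so": 1}     # file -> required interface

catalogs = {                                      # per-distribution catalogs
    "distro-A": {
        "core-libs": {"1.0": {"liba.so": 1, "libb.so": 1},
                      "2.1": {"liba.so": 2, "libb.so": 1}},
    },
    "distro-B": {
        "liba":     {"2.0": {"liba.so": 2}},
        "libextra": {"1.3": {"libb.so": 1}},
    },
}

def resolve(required, catalog):
    """Pick the highest-versioned packages that satisfy every file rule."""
    chosen, outstanding = {}, dict(required)
    for pkg, versions in catalog.items():
        # try the newest version of each package first
        for ver in sorted(versions, key=lambda v: tuple(map(int, v.split("."))),
                          reverse=True):
            files = versions[ver]
            provides_needed = any(outstanding.get(f) == i for f, i in files.items())
            conflicts = any(f in required and required[f] != i
                            for f, i in files.items())
            if provides_needed and not conflicts:
                chosen[pkg] = ver
                for f in files:
                    outstanding.pop(f, None)
                break
    return chosen if not outstanding else None    # None: no valid combination

for distro, catalog in catalogs.items():
    print(distro, "->", resolve(required_files, catalog))
```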

The database also contains information on security alerts, errata notifications, and fixes for packages and their components, so in building a specific package list the system takes these into account and finds a path through the dependency graph that avoids as many of them as possible, ideally all of them. Rarely, when re-evaluating dependencies to add a new application, the only valid path(s) will include components with known security issues. In that case the system still delivers its package list, but warns that the list will introduce known security problems, leaving system administrators to decide whether to continue or to consult with Aduva's Lab and professional services team to devise a solution.
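
As a hedged illustration of that selection step (the candidate package sets and advisory data are invented; this is not Aduva's implementation), the sketch below prefers the candidate solution with the fewest packages carrying known advisories and warns if even the best choice is flagged:

```python
# Hypothetical sketch: among valid candidate solutions, prefer the one that
# touches the fewest package versions with known advisories; warn otherwise.
advisories = {("openssl-pkg", "1.0.1"): ["CVE-XXXX-0001"]}   # made-up data

candidates = [
    {"openssl-pkg": "1.0.1", "app-runtime": "3.2"},   # valid but flagged
    {"openssl-pkg": "1.0.2", "app-runtime": "3.1"},   # valid and clean
]

def flagged(solution):
    """List every advisory attached to a (package, version) in the solution."""
    return [cve for pkg_ver in solution.items()
            for cve in advisories.get(pkg_ver, [])]

best = min(candidates, key=lambda s: len(flagged(s)))
issues = flagged(best)
if issues:
    print("WARNING: best available solution still includes", issues)
print("selected package set:", best)
```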

Keeping the database up-to-date is the key to success. Aduva works closely with Linux distributions, many different Open Source development communities, and various security groups to ensure that it has complete and current information. A set of tools automates and tests much of the process of determining dependencies.

The complete KnowledgeBase allows system configurations to be generated from combinations of packages beyond those directly supported by a given Linux distribution, provided those packages are known to the KnowledgeBase. Of course, real-world deployments will encounter a far greater variety of applications and third-party packages than the centrally maintained Universal KnowledgeBase covers. To make sure the dependency requirements of these additional components are taken into account when a configuration is built, a set of tools lets individual customers create their own KnowledgeBase with dependency rules specific to their particular software. This local KnowledgeBase is then used in conjunction with the central KnowledgeBase to ensure that the specific requirements of local software components are taken into account when determining a stable configuration.
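
A minimal sketch of how a local rule set might be combined with the central one (the rule format and all names are assumptions, not the actual KnowledgeBase schema): requirements from both sources are merged, with the stricter requirement winning wherever they overlap.

```python
# Hypothetical sketch of merging central and customer-local dependency rules.
central_required = {"liba.so": 2}                     # from the central KnowledgeBase
local_required   = {"liba.so": 2, "liblegacy.so": 1}  # from in-house software rules

def combine(central, local):
    """Union of file requirements; take the stricter (higher) one on overlap."""
    merged = dict(central)
    for f, iface in local.items():
        merged[f] = max(merged.get(f, iface), iface)
    return merged

print(combine(central_required, local_required))
```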

With this core technology in place Aduva has used it as a platform on which to build a set of tools designed to simplify Linux configuration management, application deployment, change management, and patch control for the enterprise. This set of management tools is sold under the name OnStage.

The OnStage toolset provides a very complete set of system configuration management tools: machine types can be defined, a configuration generated for a specific set of machines, and that configuration deployed automatically at the click of a button. Changes such as deploying an application, adding a patch, or altering the system configuration can be made to any given set of machines. Each set of changes is validated against the local and central KnowledgeBases, adjusted automatically to ensure the dependency rules are met, and pushed to the entire set of machines, either immediately or deferred until a specific time or other criteria are met. Configurations are recorded at each stage, making it trivial to back out any change or set of changes should that be required.
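
The rollback behavior can, at least conceptually, be modeled as recording a snapshot of the package configuration at every change, so that backing out is just a matter of restoring an earlier snapshot. The sketch below is only a guess at the general mechanism, with hypothetical labels and package names:

```python
# Hypothetical sketch: snapshot the package set at each change; rolling back
# a change restores the most recent earlier snapshot.
history = []   # list of (label, package-set) snapshots, oldest first

def apply_change(label, new_config):
    history.append((label, dict(new_config)))
    print("applied:", label, new_config)

def rollback(steps=1):
    del history[-steps:]               # discard the unwanted change(s)
    label, config = history[-1]
    print("rolled back to:", label, config)
    return config

apply_change("baseline", {"core-libs": "2.1"})
apply_change("security patch", {"core-libs": "2.2"})
rollback()          # restores the baseline configuration
```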

What's unique about the OnStage toolset is that it sits on the KnowledgeBase and takes the uncertainty out of making changes to a stable production platform, which is one more step in making Open Source a viable solution for the enterprise.

More Stories By Zev Laderman

Zev Laderman is chairman and CEO of Aduva. He was previously CEO of Tradeum, President of Verticalnet, and was a VP at Oracle. He has an MBA from the Stanford University Graduate School of Business.
