Tackling Dependencies

Automated dependency resolution

You don't have to be around Linux for long before you hear about "the dependency problem," which is no problem at all for many users - until the day it bites them. In a nutshell, most Linux applications depend on the operating system to provide various pieces of functionality. These components most often take the form of shared libraries that are dynamically loaded and linked to the application at runtime. Problems occur when one or more of these libraries is replaced with a different (usually newer) version. Provided the interfaces and their semantics remain the same, there's no problem. However, due to security fixes, bug fixes, or new or improved functionality, the interfaces and semantics can and do change, and the changes can be enough to break the application.
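To see how such a breakage surfaces, here's a minimal sketch in Python, whose ctypes module loads shared libraries at runtime much as a dynamically linked application does. The symbol it probes, gnu_get_libc_version, is glibc-specific, so the probe itself is an assumption about the installed C library; on a system whose C library doesn't export it, the failure appears only when the program runs:

    import ctypes
    import ctypes.util

    # Locate and load the system C library at runtime, the way the
    # dynamic linker binds a shared library to an application.
    libc = ctypes.CDLL(ctypes.util.find_library("c"))

    # The "application" assumes the library exposes a particular
    # interface. If the installed version doesn't provide that symbol,
    # the failure only shows up here, at runtime.
    try:
        func = libc.gnu_get_libc_version  # glibc-specific symbol
        func.restype = ctypes.c_char_p
        print("runtime C library version:", func().decode())
    except AttributeError:
        print("symbol missing: this C library lacks the expected interface")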

Individual libraries generally aren't distributed on their own but as part of a package of related components, and packages are installed as a whole. There's usually no way to take just one part of a package; in fact, doing so would invite trouble, since the components of a package need to be installed together to guarantee that they work correctly with one another.

It would seem reasonable to assume that once an application is installed and working there would be no reason to touch it, or any of the components that it depends on. However, there are myriad reasons why components end up changing. The most common are:

  • A new application is installed that requires a later version of a package containing one of the shared libraries used by the first application.
  • A security or bug fix affects a package used by the application.
  • An updated version of the OS from the distributor includes newer packages.
Whatever the cause, installing a package to satisfy the requirements of some totally unrelated piece of software can break a currently installed application - even one that has no libraries in common with the new software - simply because libraries used by the two applications are contained in the same package. The more applications that are installed, the greater the chance of problems, and the more complex it becomes to find a combination of packages that satisfies all of the dependencies.
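A toy model, with hypothetical package and library names, shows the mechanism: upgrading a package to satisfy one application silently replaces a library a second application needs, even though the two applications share no libraries.

    # Hypothetical packages: each version of "libfoo" provides a set of
    # shared libraries, and upgrading swaps the whole set at once.
    packages = {
        ("libfoo", "1.0"): {"liba.so.1", "libb.so.1"},
        ("libfoo", "2.0"): {"liba.so.2", "libb.so.1"},  # liba's interface changed
    }

    # app1 was installed against libfoo 1.0; new_app's metadata demands
    # libfoo >= 2.0 even though it only uses libb.
    requirements = {
        "app1": {"liba.so.1"},
        "new_app": {"libb.so.1"},
    }

    installed = packages[("libfoo", "2.0")]  # upgrade forced by new_app
    for app, needed in requirements.items():
        missing = needed - installed
        print(app, "OK" if not missing else f"BROKEN, missing {missing}")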

This problem isn't unique to Linux. It's been a long-time problem with many different Unix systems that make heavy use of shared libraries, and is essentially the same as the famous "dll hell" that afflicted older versions of Windows (a Windows .dll file is a dynamically loaded shared library). Unix vendors reduced the problem to manageable proportions by tightly controlling the evolution of the system and providing lots of advance warning to third-party vendors of impending changes. Microsoft adopted similar tactics and became adamant about which libraries third parties were free to change, and which they weren't, making it difficult for third-party applications to overwrite system-provided libraries.

Linux is essentially no better or worse in terms of shared library management than Unix or Windows. What's different is that there's no central coordinating authority to make sure that changes happen in a controlled and consistent manner. In some respects this is one of the strengths of Open Source development; it allows change to happen at its natural pace and forces people to be more aware of the potential problems associated with change. It's also a weakness from the point of view of an end user who has to integrate systems whose components have different dependency requirements.

Linux distributors spend a lot of time and effort making sure that their systems are delivered with all dependencies correctly resolved, and that any updates they create don't disturb this balance. They often go so far as to take security, bug, and performance fixes - which a component's development group typically creates against the latest version of its software, probably not the version currently shipping - back-port the changes to the shipping version, and test to make sure they don't introduce any inconsistencies. However, distributors have limited control over third parties, both commercial and Open Source, that are reluctant to re-test their products with every new version of a package on which they depend.

One of the interesting benefits Linux holds for its enterprise adopters is that it's multi-sourced. Essentially the same product is available from multiple vendors, improving competition and avoiding lock-in. Although managing multiple Linux distributions adds overhead, some companies prefer not to put all their eggs in one basket and deliberately run more than one distribution.

However, this adds a new dimension to the dependency problem.

In many instances, different packages are used on different distributions. There's no single standardized packaging system: most major Linux distributors use the RPM format, which defines how a package is constructed, where its individual files are installed, and a set of dependency rules. Unfortunately, the same package name on two different distributions can contain different revisions of components, or even different components altogether. To add to the problem, not all Linux systems use the same package mechanism; Debian-based systems, for example, use their own .deb format rather than RPM.
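To illustrate (with hypothetical version numbers and file names), the same package name can resolve to quite different contents on different distributions:

    # Hypothetical catalog: one package name, two distributions, two
    # different component revisions providing differently named files.
    catalog = {
        "distro_a": {"openssl": {"version": "0.9.7d",
                                 "files": {"libssl.so.0.9.7", "libcrypto.so.0.9.7"}}},
        "distro_b": {"openssl": {"version": "0.9.6m",
                                 "files": {"libssl.so.0.9.6", "libcrypto.so.0.9.6"}}},
    }

    for distro, pkgs in catalog.items():
        info = pkgs["openssl"]
        print(f'{distro}: openssl {info["version"]} provides {sorted(info["files"])}')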

The end result is that configuration management of these systems becomes quite complex. Applying something such as a security fix may necessitate other changes to bring the dependencies back into alignment, followed by testing of all applications running on the revised systems before they can be declared stable and rolled out into production. This has to happen independently on each distribution platform.

Aduva OnStage

Having lived and worked with this problem, and recognizing the need for a better solution than the mostly manual process that they and everyone else were using, the founders of Aduva began work on automating the dependency resolution process.

One of the key insights behind their system was that the package level - at which most people were working - is too high a level for successful resolution. They built a database (the Aduva Universal KnowledgeBase) of dependency information based on the contents of the packages - the individual files they contain.

By extending the dependency information down to the file level and including specific dynamic dependency rules, it becomes much easier to find solutions to dependency problems. Once a set of solutions is found, information stored in the database about the composition of the packages is used to select the set of packages, at the highest version levels possible, that implements a solution. Since the database stores information on distribution-specific packages, the system can derive package lists specific to individual distributions for the same dependencies.
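The gain from file-level resolution can be sketched roughly as follows. This is an illustrative reconstruction, not Aduva's actual schema: requirements are expressed as individual files, candidate packages are matched by the files they provide, and among matching packages the highest version wins.

    # Hypothetical knowledge base: (package, version, files provided).
    kb = [
        ("libfoo", "1.0", {"liba.so.1"}),
        ("libfoo", "1.2", {"liba.so.1"}),  # newer, still provides the needed file
        ("zlib",   "1.1", {"libz.so.1"}),
    ]

    def resolve(needed_files):
        """Pick, per package, the highest version providing a needed file."""
        chosen = {}
        for name, version, files in kb:
            if files & needed_files:
                # NOTE: real version comparison is more involved than
                # simple string ordering; this is enough for the sketch.
                if name not in chosen or version > chosen[name]:
                    chosen[name] = version
        return sorted(f"{name}-{ver}" for name, ver in chosen.items())

    print(resolve({"liba.so.1", "libz.so.1"}))  # ['libfoo-1.2', 'zlib-1.1']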

The database also contains information on security alerts, errata notifications, and fixes for packages and their components, so in building a specific package list the system takes these into account and finds a path through the dependency graph that avoids as many of them as possible - ideally all of them. Rarely, when re-evaluating dependencies to add a new application, every valid path will include components with known security issues. In that case the system still delivers its package list, but warns that the list will introduce known security problems, leaving system administrators to decide whether to continue or to consult Aduva's Lab and professional services team to devise a solution.
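The selection rule might be sketched like this, with hypothetical data: candidates carrying a known advisory are avoided whenever a clean alternative exists, and a warning is attached when none does.

    # Hypothetical advisory list: (package, version) pairs with known issues.
    advisories = {("libfoo", "1.2")}

    def pick(name, candidate_versions):
        """Prefer the highest version without an advisory; warn otherwise."""
        clean = [v for v in candidate_versions if (name, v) not in advisories]
        if clean:
            return max(clean), None
        return max(candidate_versions), \
            f"WARNING: every candidate of {name} has a known security issue"

    print(pick("libfoo", ["1.0", "1.2"]))  # ('1.0', None) - avoids flagged 1.2
    print(pick("libbar", ["2.1"]))         # clean, nothing to avoid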

Keeping the database up-to-date is the key to success. Aduva works closely with Linux distributions, many different Open Source development communities, and various security groups to ensure that it has complete and current information. A set of tools automates and tests much of the process of determining dependencies.

The complete KnowledgeBase allows system configurations to be generated from combinations of packages beyond those directly supported by a given Linux distribution, provided those packages are known to the KnowledgeBase. Of course, real-world deployments encounter a far greater variety of applications and third-party packages than the centrally maintained Universal KnowledgeBase covers. To make sure the extended dependency requirements of these additional components are taken into account, a set of tools lets individual customers create their own KnowledgeBase with dependency rules specific to their particular software. This local KnowledgeBase is then used in conjunction with the central KnowledgeBase to ensure that the requirements of local software components are taken into account when determining a stable configuration.
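Conceptually, resolution then consults both rule sets, with the local KnowledgeBase layered over the central one. A minimal sketch, with illustrative names:

    from collections import ChainMap

    # Hypothetical rule sets: application -> files it requires.
    central_rules = {"apache": {"libssl.so.0.9.7"}}
    local_rules = {"inhouse_billing": {"libssl.so.0.9.7", "libcustom.so.2"}}

    # The local KnowledgeBase takes precedence where both define a rule,
    # and extends the central one otherwise.
    rules = ChainMap(local_rules, central_rules)

    for app in ("apache", "inhouse_billing"):
        print(app, "requires", sorted(rules[app]))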

With this core technology in place, Aduva has used it as a platform on which to build a set of tools designed to simplify Linux configuration management, application deployment, change management, and patch control for the enterprise. This set of management tools is sold under the name OnStage.

The OnStage toolset provides a complete set of system configuration management tools, enabling machine types to be defined, a configuration generated for that specific set of machines, and the result deployed automatically at the click of a button. Changes - deploying an application, adding a patch, altering the system configuration - can be made to any given set of machines; each set of changes is validated against the local and central KnowledgeBases, automatically adjusted so the dependency rules are met, and pushed to the entire set of machines, either immediately or deferred until a specific time or other criteria are met. Configurations are recorded at each stage, making it trivial to back out any change or set of changes should that be required.
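The record-and-back-out behavior amounts to keeping a snapshot of the configuration before every change - a simplification of whatever OnStage actually stores, sketched here:

    history = []  # stack of configuration snapshots

    def apply_change(config, change):
        """Snapshot the current configuration, then apply the change."""
        history.append(dict(config))
        config.update(change)
        return config

    def back_out(config):
        """Restore the most recent snapshot, undoing the last change."""
        return history.pop() if history else config

    config = {"openssl": "0.9.6m"}
    config = apply_change(config, {"openssl": "0.9.7d"})  # patch deployed
    config = back_out(config)                             # trivially reverted
    print(config)  # {'openssl': '0.9.6m'}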

What's unique about the OnStage toolset is that it sits on the KnowledgeBase and takes the uncertainty out of making changes to a stable production platform, which is one more step in making Open Source a viable solution for the enterprise.

More Stories By Zev Laderman

Zev Laderman is chairman and CEO of Aduva. He was previously CEO of Tradeum, President of Verticalnet, and was a VP at Oracle. He has an MBA from the Stanford University Graduate School of Business.
