By Zev Laderman
August 12, 2005 04:00 PM EDT
You don't have to be around Linux for long before you hear about "the dependency problem," which is no problem at all for many users - until the day it bites them. In a nutshell, the problem is that most Linux applications depend on the operating system to provide various pieces of functionality they need. These components most often take the form of shared libraries that are dynamically loaded and linked to the application at runtime. Problems occur when one or more of these libraries are replaced with a different (usually newer) version. Provided the interfaces and their semantics remain the same, there's no problem. However, security fixes, bug fixes, and new or improved functionality can and do change both interfaces and semantics, and the changes can be enough to break the application.
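To make the failure mode concrete, here is a minimal sketch using Python's ctypes module as a stand-in for the dynamic linker; "libfoo.so.1" and foo_init are hypothetical names, not a real library. It shows the two ways a replaced library typically breaks an application at runtime: the library itself can't be loaded, or a symbol the application expects is gone.

```python
# Minimal sketch of a runtime dependency failure. "libfoo.so.1" and
# foo_init are hypothetical; ctypes plays the role of the dynamic
# linker that normally resolves shared libraries for a compiled program.
import ctypes

try:
    libfoo = ctypes.CDLL("libfoo.so.1")  # load the shared library at runtime
    libfoo.foo_init()                    # resolve and call a symbol it should export
except OSError as err:
    print(f"library missing or not loadable: {err}")
except AttributeError as err:
    print(f"installed version no longer exports the symbol: {err}")
```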
Individual libraries usually aren't distributed on their own but as part of a package of related components, and packages are installed as a whole. There's generally no way to take just one part of a package; in fact, doing so would invite trouble, since a package's components typically need to be installed together to guarantee that they work correctly with one another.
It would seem reasonable to assume that once an application is installed and working there would be no reason to touch it, or any of the components that it depends on. However, there are myriad reasons why components end up changing. The most common are:
- A new application is installed that requires a later version of a package containing one of the shared libraries used by the first application.
- A security or bug fix affects a package used by the application.
- An updated version of the OS from the distributor includes newer packages.
This problem isn't unique to Linux. It has long afflicted Unix systems that make heavy use of shared libraries, and is essentially the same as the famous "DLL hell" of older versions of Windows (a Windows .dll file is a dynamically loaded shared library). Unix vendors reduced the problem to manageable proportions by tightly controlling the evolution of their systems and giving third-party vendors plenty of advance warning of impending changes. Microsoft adopted similar tactics, becoming adamant about which libraries third parties were free to change and which they weren't, and making it difficult for third-party applications to overwrite system-provided libraries.
Linux is essentially no better or worse at shared library management than Unix or Windows. What's different is that there's no central coordinating authority to make sure changes happen in a controlled and consistent manner. In some respects this is one of the strengths of Open Source development: it allows change to happen at its natural pace and forces people to be more aware of the potential problems associated with change. It's also a weakness from the point of view of an end user who has to integrate systems whose components have different dependency requirements.
Linux distributors spend a lot of time and effort making sure that their systems are delivered with all dependencies correctly resolved, and that any updates they create don't disturb this balance. They often go so far as to take security, bug, and performance updates - which tend to be created by the component's development group against the latest version of the software, probably not the version currently shipping - and back-port the changes to the shipping version, testing to make sure they don't introduce any inconsistencies. However, distributors have limited control over third parties, both commercial and Open Source, that are reluctant to re-test their products with every new version of a package on which they have a dependency.
One of the interesting benefits Linux holds for its enterprise adopters is that it's multi-sourced. Essentially the same product is available from multiple vendors, improving competition and avoiding lock-in. Although managing multiple Linux distributions adds overhead, some companies prefer not to put all their eggs in one basket and use multiple distributions to help ensure that they don't end up locked in to a specific distribution.
However, this adds a new dimension to the dependency problem.
In many instances, different packages are used on different distributions. There's no single standardized packaging system: most major Linux distributors use the RPM packaging system, which defines how a package is constructed and what information it contains about individual file locations and dependency rules. Unfortunately, the same package name on two different distributions can contain different revisions of components, or even different components altogether. To add to the problem, not all Linux systems use the same package mechanism; Debian-based systems, for example, use their own .deb package format rather than RPM.
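On an RPM-based system a package's declared dependencies are directly visible. The sketch below is a hedged example that queries them from Python, assuming the standard rpm command-line tool is installed:

```python
# Query the capabilities a package declares it needs on an RPM-based
# system. Assumes the standard rpm command-line tool is available.
import subprocess

def package_requires(name: str) -> list[str]:
    result = subprocess.run(
        ["rpm", "-q", "--requires", name],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.splitlines()

# Example: the requirements of bash typically include shared-library
# entries such as libc.so.6 - exactly the dependencies discussed above.
print(package_requires("bash"))
```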
The end result is that the configuration management of these systems becomes quite complex. Applying something such as a security fix may necessitate other changes to bring the dependencies back into alignment, and then subsequent testing of all applications running on those revised systems/distributions before they can be declared stable and rolled out into production. This has to happen independently on each distribution platform.
Having lived and worked with this problem, and recognizing the need for a better solution than the mostly manual process that they and everyone else were using, the founders of Aduva began work on automating the dependency resolution process.
One of the key components of their system was the recognition that the package level - at which most people were working - is too high a level for successful resolution. They built a database (the Aduva Universal KnowledgeBase) of dependency information based on the contents of the packages - the individual files they contain.
By extending the dependency information down to the file level, including specific dynamic dependency rules, it becomes much easier to find solutions to dependency problems. Once a set of solutions is found, information stored in the database about the composition of the packages is used to select the set of packages with the highest possible version levels that implements a solution. Since the database records distribution-specific packages, the system can derive package lists specific to individual distributions for the same dependencies.
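The following toy sketch illustrates the idea - it is not Aduva's actual algorithm, and the package names, versions, and file lists are invented. Given which files each package version provides and which files are required, it picks the highest version of each package that still satisfies every file-level requirement:

```python
# Toy illustration of file-level dependency resolution: choose, per
# package, the highest version that provides a required file, then
# check that every required file is covered. All data is hypothetical.
PROVIDES = {
    "libssl": {"0.9.7": {"libssl.so.0.9.7"}, "0.9.8": {"libssl.so.0.9.8"}},
    "libfoo": {"1.0": {"libfoo.so.1"}, "2.0": {"libfoo.so.2"}},
}

def resolve(required: set[str]) -> dict[str, str]:
    chosen: dict[str, str] = {}
    for pkg, versions in PROVIDES.items():
        # Reverse string sort is a toy stand-in for real version comparison.
        for ver in sorted(versions, reverse=True):
            if versions[ver] & required:
                chosen[pkg] = ver
                break
    provided = set().union(*(PROVIDES[p][v] for p, v in chosen.items())) if chosen else set()
    missing = required - provided
    if missing:
        raise LookupError(f"unresolvable file dependencies: {missing}")
    return chosen

print(resolve({"libssl.so.0.9.8", "libfoo.so.1"}))  # {'libssl': '0.9.8', 'libfoo': '1.0'}
```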
The database also contains information on security alerts, errata notifications, and fixes for packages and their components, so in building a specific package list the system takes these into account and finds a path through the dependency graph that avoids as many of them as possible - ideally all of them. In rare cases, when re-evaluating dependencies to add a new application, the only valid paths include components with known security issues. In that case the system delivers its package list but warns that it will introduce known security problems, leaving system administrators to decide whether to continue or to consult with Aduva's Lab and professional services team to devise a solution.
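Continuing the toy sketch above (again with invented advisory data), the security-aware selection amounts to filtering out versions with known advisories and flagging the result when no clean choice remains:

```python
# Hypothetical advisory data layered onto the toy resolver above:
# prefer the highest version with no known advisory, and flag the
# result when every candidate version is affected.
ADVISORIES = {("libssl", "0.9.7")}

def pick_version(pkg: str, candidates: list[str]) -> tuple[str, bool]:
    clean = [v for v in candidates if (pkg, v) not in ADVISORIES]
    if clean:
        return max(clean), False   # highest clean version (toy comparison)
    return max(candidates), True   # no clean path: flag for the administrator

version, tainted = pick_version("libssl", ["0.9.7"])
if tainted:
    print(f"warning: libssl {version} carries known security issues")
```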
Keeping the database up-to-date is the key to success. Aduva works closely with Linux distributions, many different Open Source development communities, and various security groups to ensure that it has complete and current information. A set of tools automates and tests much of the process of determining dependencies.
The complete KnowledgeBase allows system configurations to be generated from combinations of packages beyond those directly supported by a given Linux distribution, provided those packages are known to the KnowledgeBase. Of course, real-world deployments encounter a huge variety of applications and third-party packages - more than the centrally maintained Universal KnowledgeBase can cover. To make sure the extended dependency requirements of these additional components are taken into account when configuring, a set of tools permits individual customers to create their own KnowledgeBase with dependency rules specific to their particular software. This local KnowledgeBase is then used in conjunction with the central KnowledgeBase to ensure that the requirements of local software components are taken into account when determining a stable configuration.
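A hedged sketch of how such a layered lookup might behave - the rule structure and entries here are invented for illustration, not Aduva's schema - has local, customer-specific rules consulted first, falling back to the central set for everything else:

```python
# Toy illustration of consulting a local KnowledgeBase alongside the
# central one. Rules map a component to the components it depends on;
# the structure and entries are invented for illustration.
CENTRAL_KB = {"app-server": ["libssl", "libxml"]}
LOCAL_KB = {"in-house-billing": ["app-server", "local-reporting-lib"]}

def lookup_rules(component: str) -> list[str]:
    # Local, site-specific rules take precedence; fall back to the
    # centrally maintained rules for everything else.
    if component in LOCAL_KB:
        return LOCAL_KB[component]
    return CENTRAL_KB.get(component, [])

print(lookup_rules("in-house-billing"))  # ['app-server', 'local-reporting-lib']
```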
Aduva has used this core technology as a platform on which to build a set of tools designed to simplify Linux configuration management, application deployment, change management, and patch control for the enterprise. This set of management tools is sold under the name OnStage.
The OnStage toolset provides a very complete set of system configuration management tools: machine types can be defined, a configuration generated for that specific set of machines, and the result deployed automatically at the click of a button. Changes - deploying an application, adding a patch, or altering the system configuration - can be made to any given set of machines; the change set is validated against the local and central KnowledgeBases, automatically adjusted to ensure dependency rules are met, and pushed to the entire set of machines, either immediately or deferred until a specific time or other criteria are met. Configurations are recorded at each stage, making it trivial to back out any change or set of changes should that be required.
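The record-at-each-stage behavior is what makes rollback cheap. A minimal sketch of the idea - not OnStage's implementation, with illustrative data - keeps a snapshot of the configuration before every change set is applied:

```python
# Minimal sketch of record-and-roll-back configuration management:
# snapshot the configuration before each change set so any change can
# be backed out. Not OnStage's implementation; data is illustrative.
history: list[dict[str, str]] = []

def apply_change_set(config: dict[str, str], changes: dict[str, str]) -> dict[str, str]:
    history.append(dict(config))   # record the pre-change configuration
    updated = dict(config)
    updated.update(changes)
    return updated

def rollback() -> dict[str, str]:
    return history.pop()           # restore the most recent recorded state

cfg = {"openssl": "0.9.7"}
cfg = apply_change_set(cfg, {"openssl": "0.9.8"})  # patch applied
cfg = rollback()                                   # backed out: {'openssl': '0.9.7'}
```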
What's unique about the OnStage toolset is that it sits on the KnowledgeBase and takes the uncertainty out of making changes to a stable production platform, which is one more step in making Open Source a viable solution for the enterprise.