By Zev Laderman
August 12, 2005 04:00 PM EDT
You don't have to be around Linux for long before you hear about "the dependency problem," which is no problem at all for many users - until the day it bites them. In a nutshell, the problem is that most Linux applications depend on the operating system to provide various pieces of functionality that the applications need. These components most often take the form of shared libraries that are dynamically loaded and linked to the application at runtime. Problems occur when one or more of these libraries are replaced with a different (usually newer) version. Provided all the interfaces remain the same and the semantics of the functionality remain the same, there's no problem. However, due to security fixes, bug fixes, or new or improved functionality, the interfaces and semantics can and do change, and the changes can be enough to break the application.
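To make the runtime linkage concrete, here's a minimal Python sketch (not from the original article) that lists the shared libraries a Linux binary is dynamically linked against by parsing the output of the standard ldd(1) tool; the /bin/ls path is just an illustrative example.

```python
# A minimal sketch (assumes a Linux system with ldd available):
# list the shared libraries a binary is dynamically linked against
# by parsing ldd(1) output. /bin/ls is just an illustrative target.
import subprocess

def shared_libs(binary):
    """Map each shared-library name to the path the loader resolves it to."""
    out = subprocess.run(["ldd", binary], capture_output=True, text=True)
    deps = {}
    for line in out.stdout.splitlines():
        # Typical line: "libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x...)"
        if "=>" in line:
            name, _, rest = line.strip().partition(" => ")
            deps[name] = rest.split(" ")[0]
    return deps

print(shared_libs("/bin/ls"))
```

Replacing any one of the libraries such a listing reports with an incompatible version is exactly the failure mode described above.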
Individual libraries usually aren't distributed on their own but as part of a package of related components, and packages are installed as a whole. There's generally no way to take just one part of a package; in fact, doing so would be inviting trouble, since packages are composed of related components that need to be installed together to guarantee that they work correctly with one another.
It would seem reasonable to assume that once an application is installed and working there would be no reason to touch it, or any of the components that it depends on. However, there are myriad reasons why components end up changing. The most common are:
- A new application is installed that requires a later version of a package containing one of the shared libraries used by the first application.
- A security or bug fix affects a package used by the application.
- An updated version of the OS from the distributor includes newer packages.
This problem isn't unique to Linux. It has long been a problem on many different Unix systems that make heavy use of shared libraries, and it's essentially the same as the famous "DLL hell" that afflicted older versions of Windows (a Windows .dll file is a dynamically loaded shared library). Unix vendors reduced the problem to manageable proportions by tightly controlling the evolution of their systems and giving third-party vendors plenty of advance warning of impending changes. Microsoft adopted similar tactics, becoming adamant about which libraries third parties were free to change and which they weren't, and making it difficult for third-party applications to overwrite system-provided libraries.
Linux is essentially no better or worse in terms of shared library management than Unix or Windows. What's different is that there's no central coordinating authority to make sure that changes happen in a controlled and consistent manner. In some respects this is one of the strengths of Open Source development; it allows change to happen at its natural pace and forces people to be more aware of the potential problems associated with change. It's also a weakness from the point of view of an end user who has to integrate systems whose components have different dependency requirements.
Linux distributors spend a lot of time and effort making sure that their systems are delivered with all dependencies correctly resolved, and that any updates they create don't disturb this balance. They often go so far as to take security, bug, and performance updates - which a component's development group tends to create against the latest version of its software, probably not the version currently shipping - back-port the changes to the shipping version, and test to make sure the changes don't introduce any inconsistencies. However, distributors have limited control over third parties, both commercial and Open Source, who are reluctant to re-test their products with every new version of a package on which they depend.
One of the interesting benefits Linux holds for its enterprise adopters is that it's multi-sourced. Essentially the same product is available from multiple vendors, improving competition and avoiding lock-in. Although managing multiple Linux distributions adds overhead, some companies prefer not to put all their eggs in one basket and use multiple distributions to help ensure that they don't end up locked in to a specific distribution.
However, this adds a new dimension to the dependency problem.
In many instances, different packages are used on different distributions. There's no single standardized packaging system: most major Linux distributors use the RPM format, which defines how a package is constructed, where its individual files are installed, and what dependency rules apply. Unfortunately, a package with the same name on two different distributions can contain different revisions of components, or even different components altogether. To add to the problem, not all Linux systems use the same packaging mechanism; Debian-based systems, for example, use their own .deb package format rather than RPM.
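As a rough illustration of how the same dependency question takes distribution-specific forms, here is a hedged Python sketch that shells out to the standard rpm and apt-cache tools; the "bash" package name is just an example.

```python
# A hedged sketch of asking the same question on two packaging systems:
# "what does this package depend on?" Both commands are standard tools;
# "bash" is just an example package name.
import subprocess

def rpm_requires(pkg):
    """Dependencies of an installed package on an RPM-based system."""
    out = subprocess.run(["rpm", "-q", "--requires", pkg],
                         capture_output=True, text=True)
    return out.stdout.splitlines()

def deb_depends(pkg):
    """Dependencies of a package on a Debian-based (.deb) system."""
    out = subprocess.run(["apt-cache", "depends", pkg],
                         capture_output=True, text=True)
    return [l.strip() for l in out.stdout.splitlines() if "Depends:" in l]
```

Only the function matching the host's packaging system will work, which is precisely the fragmentation the paragraph above describes.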
The end result is that configuration management of these systems becomes quite complex. Applying something as routine as a security fix may necessitate other changes to bring the dependencies back into alignment, followed by testing of all the applications running on the revised systems before they can be declared stable and rolled out into production. This has to happen independently on each distribution platform.
Having lived and worked with this problem, and recognizing the need for a better solution than the mostly manual process that they, like everyone else, were using, the founders of Aduva began work on automating the dependency resolution process.
One of the key insights behind their system was that the package level - at which most people were working - is too high a level for successful resolution. They built a database (the Aduva Universal KnowledgeBase) of dependency information based on the contents of the packages: the individual files they contain.
By extending the dependency information down to the file level, with specific dynamic dependency rules, it becomes much easier to find solutions to dependency problems. Once a set of solutions is found, information stored in the database about the composition of the packages is used to select the set of packages, at the highest version levels possible, that implements a solution. Since the database stores information on distribution-specific packages, the system can derive package lists specific to individual distributions for the same dependencies.
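The following simplified Python sketch illustrates the idea of file-level resolution described above; the package catalog and its file lists are hypothetical stand-ins for the KnowledgeBase, not Aduva's actual data model.

```python
# A simplified sketch of file-level resolution: the catalog below is a
# hypothetical stand-in for the KnowledgeBase, recording each package's
# version and the individual files it provides.
packages = {
    "openssl-0.9.7": {"version": (0, 9, 7), "files": {"libssl.so.0.9.7"}},
    "openssl-0.9.8": {"version": (0, 9, 8), "files": {"libssl.so.0.9.8"}},
    "zlib-1.2.3":    {"version": (1, 2, 3), "files": {"libz.so.1"}},
}

def resolve(required_files):
    """For each required file, pick the providing package with the
    highest version; fail loudly if a file can't be satisfied."""
    chosen = set()
    for f in required_files:
        providers = [(meta["version"], name)
                     for name, meta in packages.items()
                     if f in meta["files"]]
        if not providers:
            raise LookupError("no package provides " + f)
        chosen.add(max(providers)[1])  # highest version wins
    return chosen

print(resolve({"libz.so.1", "libssl.so.0.9.8"}))
```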
The database also contains security alerts, errata notifications, and fixes for packages and their components. In building a specific package list the system takes these into account and finds a path through the dependency graph that avoids as many known problems as possible - ideally all of them. Rarely, when re-evaluating dependencies to add a new application, every valid path will include components with known security issues. In that case the system still delivers its package list, but warns that it will introduce known security problems, leaving system administrators to decide whether to continue or to consult with Aduva's Lab and professional services team to devise a solution.
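A hedged sketch of that selection step: among candidate package sets that all satisfy the dependencies, prefer the one touching the fewest packages with known advisories, and warn if even the best choice isn't clean. The advisory list and candidate sets are hypothetical example data.

```python
# A hedged sketch of the security-aware selection step: among candidate
# package sets that all satisfy the dependencies, prefer the one touching
# the fewest packages with known advisories, and warn if none is clean.
# The advisory list and candidates are hypothetical example data.
advisories = {"openssl-0.9.7"}

def pick_solution(candidates):
    best = min(candidates, key=lambda s: len(s & advisories))
    flagged = best & advisories
    if flagged:
        print("WARNING: solution includes known security issues:", flagged)
    return best

print(pick_solution([{"openssl-0.9.7", "zlib-1.2.3"},
                     {"openssl-0.9.8", "zlib-1.2.3"}]))
```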
Keeping the database up-to-date is the key to success. Aduva works closely with Linux distributions, many different Open Source development communities, and various security groups to ensure that it has complete and current information. A set of tools automates and tests much of the process of determining dependencies.
The complete KnowledgeBase allows system configurations to be generated from combinations of packages beyond those directly supported by a given Linux distribution, provided those packages are known to the KnowledgeBase. Of course, real-world deployments involve a huge variety of applications and third-party packages - more than the centrally maintained Universal KnowledgeBase can cover. To make sure the extended dependency requirements of these additional components are taken into account during configuration, a set of tools lets individual customers create their own KnowledgeBase with dependency rules specific to their particular software. This local KnowledgeBase is then used in conjunction with the central KnowledgeBase to ensure that the requirements of local software components are taken into account when determining a stable configuration.
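As a rough sketch of how a local KnowledgeBase might be consulted alongside the central one (the rule format here, mapping a needed file to the packages that provide it, is a hypothetical simplification, not Aduva's schema):

```python
# A rough sketch of consulting a local KnowledgeBase alongside the central
# one, with local rules taking precedence. The rule format (file -> packages
# that provide it) is a hypothetical simplification, not Aduva's schema.
central_kb = {"libssl.so.0.9.8": {"openssl-0.9.8"}}
local_kb   = {"libacme.so.2":    {"acme-runtime-2.1"}}  # in-house component

def required_packages(needed_file):
    """Check local rules first, then fall back to the central KnowledgeBase."""
    for kb in (local_kb, central_kb):
        if needed_file in kb:
            return kb[needed_file]
    raise LookupError(needed_file + " is unknown to both KnowledgeBases")

print(required_packages("libacme.so.2"))
```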
With this core technology in place, Aduva has used it as the platform for a set of tools designed to simplify Linux configuration management, application deployment, change management, and patch control for the enterprise. This set of management tools is sold under the name OnStage.
The OnStage toolset provides a very complete set of system configuration management tools: machine types can be defined, a configuration generated for a specific set of machines, and the result automatically deployed at the click of a button. Changes - deploying an application, adding a patch, or altering the system configuration - can be made to any given set of machines; the changes are validated against the local and central KnowledgeBases, automatically adjusted to ensure dependency rules are met, and pushed to the entire set of machines, either immediately or deferred until a specific time or other criteria are met. Configurations are recorded at each stage, making it trivial to back out any change or set of changes should that be required.
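That record-and-roll-back behavior can be sketched as follows; the in-memory snapshot list is purely illustrative, standing in for OnStage's recorded configurations:

```python
# A purely illustrative sketch of record-and-roll-back: snapshot the
# package set before every change so any change can be reversed. The
# in-memory list stands in for OnStage's recorded configurations.
history = []

def apply_change(current, add=frozenset(), drop=frozenset()):
    """Snapshot the current configuration, then apply the change."""
    history.append(set(current))
    return (current - drop) | add

def rollback():
    """Restore the most recently recorded configuration."""
    return history.pop()

config = {"openssl-0.9.8", "zlib-1.2.3"}
config = apply_change(config, add={"myapp-1.0"})
config = rollback()  # back to the pre-change configuration
```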
What's unique about the OnStage toolset is that it sits on the KnowledgeBase and takes the uncertainty out of making changes to a stable production platform, which is one more step in making Open Source a viable solution for the enterprise.