



*SPECIAL ANALYSIS* Bruce Perens White Paper on UserLinux

Perens publishes white paper subtitled "Repairing the Economic Paradigm of Enterprise Linux"

Linux evangelist Bruce Perens has made available the first draft of his white paper, UserLinux: Repairing the Economic Paradigm of Enterprise Linux. At first read it sounds like a good idea, even though it bears many similarities to UnitedLinux. UnitedLinux to date seems to have had very little impact on the Linux user community, due to SCO's participation and the lack of universal support from Linux distribution vendors, most notably Red Hat.

Analysis of UnitedLinux’s results to date may be helpful to those thinking about jumping on the UserLinux bandwagon. This is not to say that UserLinux is destined for failure; on the contrary, Bruce’s effort to bring the same discussion to the community rather than the corporate level intrigues me. But it leads me to pose the following questions:

  • Can and will the community advance Linux in the enterprise faster than the distribution vendors?
  • If so, what differences between the two models will be the catalyst for success?
  • Is UserLinux really needed? How much does it potentially overlap with work being done by the OSDL and the Linux Standard Base?

One key theme in the initial Perens proposal is a structure that would include a central body setting direction for the community and making choices among applications supplied by those who cooperate with community efforts. He also envisions a technical plan that sets goals and maintains relationships with commercial organizations. He further mentions certification of solutions on UserLinux, a practice that has helped Red Hat take a leadership position in the enterprise Linux space.

A community-led project with the ability to certify solutions would be an interesting competitor to Red Hat’s Enterprise Linux. I have to wonder if a band of community developers with a proposed $1 million annual budget can make advances that rival or overshadow those of the corporate Linux community. Then again, from the humblest of beginnings, Linus’ kernel project and RMS’s GNU initiatives have taken a significant market share away from the world’s largest company.

From a technical perspective, Perens proposes Debian as the base technology for UserLinux, which is no coincidence since he is a former leader of the Debian project. This is an interesting proposal because Debian has as good a technology as anyone (arguably better by many) but seems to be the least commercial of any distribution.

There would be some irony in Debian becoming the technology that powers corporations throughout the world. Yet Debian seems like a logical choice to address one of the biggest problems with Linux today: application delivery and installation. The difficulty of installing applications across distributions, caused by conflicts and missing supporting libraries, could be solved by Debian's apt tools, which handle installation and dependency resolution far better than rpm does.
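To illustrate what apt-style resolution adds on top of a bare package format, here is a toy sketch in portable shell; the package names and the dependency graph are entirely made up for the example:

```shell
# Toy dependency resolver, analogous in spirit to what apt-get does
# before installing: schedule each (hypothetical) package's
# dependencies ahead of the package itself. POSIX sh; no real
# packages are touched.
deps_of() {
  case "$1" in
    evolution) echo "gtkhtml libsoup" ;;   # made-up dependency graph
    gtkhtml)   echo "libsoup" ;;
    *)         echo "" ;;
  esac
}
resolve() {
  for d in $(deps_of "$1"); do
    resolve "$d"                 # depth-first: dependencies first
  done
  case " $ORDER " in
    *" $1 "*) ;;                 # already scheduled, skip
    *) ORDER="$ORDER $1" ;;
  esac
}
ORDER=""
resolve evolution
echo "install order:$ORDER"      # -> install order: libsoup gtkhtml evolution
```

apt-get computes this kind of ordering (and fetches the packages) automatically; `rpm -i` on its own only reports the missing dependencies and stops.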

Decisions on a standardized GUI and Web server software are all points of contention for UserLinux. All in all, the proposal seems to point toward making choices among existing technologies and then working on better integration and overall usability. This strategy may sacrifice some of the innovation and healthy competition that exist today between "competing projects," but it could yield significant progress in Linux usability if the efforts of the individual groups were combined.

I am sure that UserLinux will continue to spark debate, because users need Linux to develop into a platform with some level of commonality across vendors. Once this commonality has been established, there needs to be a mechanism, vendor, or group of vendors providing software solutions, management, and hardware support equivalent to those available on other operating systems, at equivalent or better prices and with equivalent or better functionality.

Keeping this in mind, perhaps more interesting than UserLinux itself is the prospect of Linux being on the cusp of mainstream success. Ideas, and the extension of ideas like Bruce's, are the tinder that will spark widespread Open Source adoption. I encourage everyone who can to participate and help Bruce shape his proposal.

More Stories By Mark R. Hinkle

Mark Hinkle is the Senior Director, Open Source Solutions at Citrix. He is also a long-time open source expert and advocate. He is a co-founder of both the Open Source Management Consortium and the Desktop Linux Consortium. He has served as Editor-in-Chief for both LinuxWorld Magazine and Enterprise Open Source Magazine. Hinkle is also the author of the book "Windows to Linux Business Desktop Migration" (Thomson, 2006). His blog on open source, technology, and new media can be found at


Most Recent Comments
downwa 12/04/03 05:02:13 PM EST

More important than a standard package management / dependency resolution system is a standard for what package names to use, and what files they should contain. Debian has the lead here, simply because the RPM-based distributions have fragmented their choices to some extent. However, I agree that urpmi is superior in some respects to apt-get.

However, in my view, it would be better yet if we could stop relying on the package to contain dependency information, and instead extract it from the files themselves. That is, use ldd against all executables in a package to determine which libraries are needed, at install time. If the libraries are already in the system (no matter how they got there), the install proceeds, otherwise, a search for packages providing those libraries is made, both on local CDs and over the network (like urpmi can do). The information retrieved should be cached, to speed up future installs, and the CDs could already contain a local cache. The difference here is the cache would be dynamically built from the actual contents of the package, not from what the package creator "said" was in the package. Also, it wouldn't matter if your libraries were installed via RPM, apt, or simply extracting a .tgz file.
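The ldd-based extraction described above can be sketched directly. This assumes a glibc-style Linux system where `ldd` is on the path; the binaries probed are arbitrary examples:

```shell
# Derive a binary's shared-library needs from the file itself (via
# ldd) rather than from package metadata, as the comment proposes.
# Assumes Linux with ldd available; the binaries are just examples.
for bin in /bin/ls /bin/sh; do
  echo "== $bin"
  # ldd lines look like "libc.so.6 => /lib/... (0x...)";
  # keep only the library name on each "=>" line
  ldd "$bin" | awk '/=>/ {print $1}'
done
```

A resolver could then search local media or the network for packages providing each reported library, caching the results as suggested, so the cache reflects what the binaries actually require rather than what the packager declared.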

JDR 12/04/03 09:31:18 AM EST

I think that standardizing on a single "package management / dependency resolution" system is as important to the adoption of GNU/Linux as a standard libc was to the C programming language.

All system administrators must manage software installations, and it would be really nice if there was at least one (nearly) universal body of knowledge that would accomplish that task.

BeRT 12/04/03 05:48:19 AM EST

Red Hat Package Manager? RPM being a 'system'?

and that would be
apt-get install evolution

Duck 12/04/03 03:46:07 AM EST

From a technical perspective, Perens proposes Debian as the base technology for UserLinux, which is no coincidence since he is a former leader of the Debian project.

You make it sound like Perens chose Debian just because he is a former leader of the project. From reading the proposal, I think he had different reasons and sees his previous involvement in the Debian project as a benefit.

He cites several aspects of Debian as important: freedom; size (packages, developers, users); the package delivery system; responsive and open behavior toward bugs and security issues; the social contract; and the broad range of architectures supported by Debian.

You can either build a distribution from scratch or build upon an existing effort. If you want to have freedom and openness and your goal is to bring the community spirit to the enterprise, you're not gonna build upon RedHat's fedora, right?

Frank Wales 12/03/03 11:33:28 PM EST

"...taken a significant market share away from the world’s largest company."

Isn't that the Mitsubishi Group, or maybe GE?

Adam Boyle 12/03/03 07:14:33 PM EST

Can we stop being so ignorant about RPM, please!!! RPM is a packaging standard, not a delivery/dependency resolving mechanism. Please don't tell me that RPM is worse than apt-get, because you're comparing a package to a delivery mechanism. RPM is the equivalent of a .deb package, and they really are functionally equivalent.

If you want to compare delivery and dependency resolution mechanisms, try comparing Mandrake's urpmi or RedHat's up2date to apt-get. And urpmi is arguably better than apt-get!

Besides the fact that:

> urpmi evolution

takes fewer characters to type than:

> apt-get evolution

Just my 2 cents.

Chuck Wegrzyn 12/03/03 06:59:40 PM EST

Why use Debian? While it is a nice platform, and one I have used for a few years, I think the Gentoo system is actually a much better way to go. It is easier to customize and much easier to keep up-to-date. The downside to Gentoo is the installation - it is more complicated than any of the other distributions.

macb 12/03/03 06:59:32 PM EST

This idea sounds fine to me, particularly since it doesn't involve creating Yet Another Distribution. The worst that can happen, it seems, is that the idea will fail to pick up momentum but in the meantime will contribute to the existing Debian project.

What interests me is that companies like Red Hat and Novell continue to state publicly that they are NOT setting themselves up in competition with Microsoft. They then proceed to adopt, to the extent the GPL allows, Microsoft-like practices, and furthermore measure their success or failure as though competition with Microsoft is indeed what they have at the top of their agendas.

What they don't seem to get is that Microsoft, when (should I say if?) it falls will NOT be replaced by something else. I think there is every reason to believe that Microsoft is an historical anomaly produced by the single-minded greed and ego of one man, plus a large pinch of luck in being in the right place at the right time to capitalize on IBM's fumbling of the PC business. Sometimes I think IBM fumbled it a bit on purpose, since had they retained full control of the PC hardware and software as we know it today they would have found themselves back in anti-trust litigation and they would have had a harder time fending off the Justice Department than did Microsoft.

I like seeing companies like IBM involved with Linux and Open source. Why? because software is NOT their only business. They sell hardware and services, invent things and sell patent rights. Novell, Red Hat, Suse, Mandrake, etc are all set up, like Microsoft, to make a killing just selling an Operating system. When in history did any other company succeed by just selling an operating system? Just once, just Microsoft and it is a bad act to try and follow.

As Bruce said on The Linux Show yesterday, the key to success is more likely to be with areas of specialization. Who will corner the market in selling turnkey Linux-based systems to dentists? How about doctors, construction companies, import/export companies, and so on? Companies like Red Hat or Novell could of course use subsidiaries to focus on these industries, but to do so they will have to abandon their current mindsets and start thinking about BIG payrolls with lots of locally based employees pounding the pavement to both sell into and support these specialties. That, in my opinion is a formula that can work (it seems to be working for IBM anyway).

Finally, Linux is a great tool for universities. Students have full access to everything to tinker with and in some cases improve upon. This was actually also the case with IBM software before PCs came along. It will be this synergy between the mission of higher education and both commercial and individual computer users that will keep Linux going even if ALL of the commercial Linux ventures fail.

Too much attention has been paid to leveling the playing field between Microsoft and other commercial software companies. We have lost sight of the fact that the operating system *IS* the playing field and the only way to enjoy the full benefit of competing ideas is to have an operating system that is both equally available to everyone and also that runs on all hardware. There is only one operating system that satisfies those requirements now and it is hard to see how anything can stand in its way.

Randy Andy 12/03/03 06:39:56 PM EST

The variations between GNU/Linux distributions are regularly cited as a weakness of the GPL development model. I'll make a counter argument. This debate we are having, about how to standardize GNU/Linux configuration so that competing distributions can continue to compete without stifling growth in the app space, is one of the few places in the market where this is happening.

We could hastily lock down some distribution and thereby speed up short term acceptance, but the GPL community is committed to retaining competition by looking to achieve an open standard for the OS development as well.

This hasn't been achieved elsewhere. If it can be done, the GNU platform will be the ONLY place where competition flourishes both in the OS space AND the App space. Everywhere else, the OS's have prospered not from their own strength, but from the standardized environment that allow only the App competition to flourish.

This determination shows the power of freedom of choice. Linux users get what they want.

kaka 12/03/03 06:18:15 PM EST

From what I understood, redhat was invited to join UnitedLinux the DAY BEFORE the deadline...meaning they didn't want redhat to participate but didn't want to look as though they were competition.

Ed Mack 12/03/03 05:37:35 PM EST

Bruce, there pretty much is this already. Debian follows the standards set out for distribution file paths, and other distros do too, to a lesser degree.

For the file dependencies, Fedora, Debian, etc. have apt-style tools that solve this by automatically fetching them.

I don't personally see why a new distro should be made; if Debian were worked on in regards to the desktop, a helluva lot of people would win, not just users of yet another distro.

Bruce 12/03/03 10:19:59 AM EST

I think there should be more standardisation of program locations.
I know open source is all about freedom, but there should still be some standards on where files are installed, so that it would be the same for every distro by default. This would be a great help for newbies reading their first howtos, as well as for RPM in finding file dependencies.
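Much of the location standard this comment asks for already exists as the Filesystem Hierarchy Standard (FHS), which the Linux Standard Base incorporates. A minimal, distro-neutral sanity check might look like this (the directory list is abridged for illustration):

```shell
# Probe a few FHS-mandated directories that any compliant distro
# should provide; an installer could rely on these locations
# instead of guessing per-distro paths.
for d in /usr/bin /usr/lib /etc /var/log; do
  if [ -d "$d" ]; then
    echo "$d: present"
  else
    echo "$d: MISSING"
  fi
done
```

A distribution that passes such a check gives howto authors and packagers one set of paths to document, which is exactly the newbie-friendliness the comment is after.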
