
*SPECIAL Linux.SYS-CON.com ANALYSIS* Bruce Perens White Paper on UserLinux

Perens publishes white paper sub-titled "Repairing the Economic Paradigm of Enterprise Linux"

Linux evangelist Bruce Perens has made available his first draft of UserLinux: Repairing the Economic Paradigm of Enterprise Linux, which at first read sounds like a good idea, even though it bears many similarities to UnitedLinux. UnitedLinux to date has had very little impact on the Linux user community, due to SCO's participation and the lack of unanimous support from Linux distribution vendors, most notably Red Hat.

Analysis of UnitedLinux’s results to date may be helpful to those thinking about jumping on the UserLinux bandwagon. This is not to say that UserLinux is destined for failure; on the contrary, Bruce’s effort to bring the same discussion to the community rather than the corporate level intrigues me. But it leads me to pose the following questions:

  • Can and will the community advance Linux in the enterprise faster than the distribution vendors?
  • If so, what differences between the two models will be the catalyst for success?
  • Is UserLinux really needed? How much does UserLinux potentially overlap with work being done by the OSDL and the Linux Standard Base?

One key theme in Perens's initial proposal is the idea of a structure that would include a central body setting direction for the community and making choices among applications supplied by those who cooperate with community efforts. He also envisions a technical plan that sets goals and maintains relationships with commercial organizations, and he mentions certification of solutions on UserLinux, a practice that has helped Red Hat take a leadership position in the enterprise Linux space.

A community-led project with the ability to certify solutions would be an interesting competitor to Red Hat’s Enterprise Linux. I have to wonder if a band of community developers with a proposed $1 million annual budget can make advances that rival or overshadow those of the corporate Linux community. Then again, from the humblest of beginnings, Linus’ kernel project and RMS’s GNU initiatives have taken a significant market share away from the world’s largest company.

From a technical perspective, Perens proposes Debian as the base technology for UserLinux, which is no coincidence since he is a former leader of the Debian project. This is an interesting proposal because Debian's technology is as good as anyone's (arguably better, many would say), yet it is the least commercial of the major distributions.

There would be some irony in Debian becoming the technology that powers corporations throughout the world. Debian does seem like a logical choice to address one of the biggest problems with Linux today: application delivery and installation. The difficulty of installing applications across distributions, caused by conflicts and missing supporting libraries, could be solved by Debian's apt tools, which are far superior to rpm alone for installing packages and resolving dependencies.
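The difference can be sketched with a toy dependency resolver (an illustration only, not apt's actual algorithm; the package names and dependency graph below are hypothetical):

```python
# Toy model of what apt-style tools add on top of a bare package format:
# given a package -> dependencies map, compute the full install set in
# dependency order. A bare "rpm -i" would instead stop and report the
# first missing dependency. (Hypothetical package names, for illustration.)
def install_order(pkg, deps, seen=None):
    """Return the list of packages to install for `pkg`, dependencies first."""
    if seen is None:
        seen = []
    for dep in deps.get(pkg, []):
        if dep not in seen:
            install_order(dep, deps, seen)
    if pkg not in seen:
        seen.append(pkg)
    return seen

DEPS = {
    "evolution": ["gtk2", "libsoup"],
    "gtk2": ["glib2"],
    "libsoup": ["glib2"],
}

print(install_order("evolution", DEPS))
# → ['glib2', 'gtk2', 'libsoup', 'evolution']
```

The real tools of course also fetch the packages and handle version constraints and conflicts; the point here is only the automatic ordering and closure over dependencies.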

Decisions on a standardized GUI and on Web server software are all points of contention for UserLinux. All in all, the proposal seems to point toward making choices among existing technologies and then working on better integration and overall usability. This strategy may reduce the innovation and healthy competition that exist today between “competing projects,” but it could yield significant progress in Linux usability if the efforts of the individual groups were combined.

I am sure that UserLinux will continue to spark debate, because users need Linux to develop into a platform with some level of commonality across vendors. Once this commonality is established, there needs to be a mechanism, vendor, or group of vendors providing software solutions, management, and hardware support equivalent to those available on other operating systems, at equivalent or better prices and with equivalent or better functionality.

With this in mind, perhaps more interesting than UserLinux itself is the prospect of Linux being on the cusp of mainstream success. Ideas like Bruce's, and extensions of them, are the tinder that will spark widespread Open Source adoption. I encourage everyone who can to participate and help Bruce shape his proposal.

More Stories By Mark R. Hinkle

Mark Hinkle is the Senior Director, Open Source Solutions at Citrix. He is also a long-time open source expert and advocate. He is a co-founder of both the Open Source Management Consortium and the Desktop Linux Consortium. He has served as Editor-in-Chief for both LinuxWorld Magazine and Enterprise Open Source Magazine. Hinkle is also the author of the book "Windows to Linux Business Desktop Migration" (Thomson, 2006). His blog on open source, technology, and new media can be found at http://www.socializedsoftware.com.



Most Recent Comments
downwa 12/04/03 05:02:13 PM EST

More important than a standard package management / dependency resolution system is a standard for what package names to use, and what files they should contain. Debian has the lead here, simply because the RPM-based distributions have fragmented their choices to some extent. However, I agree that urpmi is superior in some respects to apt-get.

However, in my view, it would be better yet if we could stop relying on the package to contain dependency information, and instead extract it from the files themselves. That is, use ldd against all executables in a package to determine which libraries are needed, at install time. If the libraries are already in the system (no matter how they got there), the install proceeds, otherwise, a search for packages providing those libraries is made, both on local CDs and over the network (like urpmi can do). The information retrieved should be cached, to speed up future installs, and the CDs could already contain a local cache. The difference here is the cache would be dynamically built from the actual contents of the package, not from what the package creator "said" was in the package. Also, it wouldn't matter if your libraries were installed via RPM, apt, or simply extracting a .tgz file.
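The scheme described above can be sketched in a few lines: run ldd against each executable and parse out the libraries it actually links against. This is a rough illustration using canned ldd output; a real tool would invoke ldd itself and then search local media or the network for packages providing the missing libraries:

```python
import re

def needed_libs(ldd_output):
    """Extract shared-library names from `ldd` output text.

    Sketch of the idea above: derive dependencies from the binaries
    themselves rather than from what the package metadata claims.
    """
    libs = []
    for line in ldd_output.splitlines():
        # ldd prints lines of the form "  libfoo.so.N => /path (0x...)"
        m = re.match(r"\s*(\S+\.so[\w.]*)\s*=>", line)
        if m:
            libs.append(m.group(1))
    return libs

# Canned sample of ldd output (abridged), for illustration only:
SAMPLE = """\
        linux-gate.so.1 =>  (0xffffe000)
        libgtk-x11-2.0.so.0 => /usr/lib/libgtk-x11-2.0.so.0 (0x40030000)
        libc.so.6 => /lib/libc.so.6 (0x42000000)
"""

print(needed_libs(SAMPLE))
# → ['linux-gate.so.1', 'libgtk-x11-2.0.so.0', 'libc.so.6']
```

As the comment notes, the appeal is that the result reflects what is actually on disk, regardless of whether the libraries arrived via RPM, apt, or a plain tarball.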

JDR 12/04/03 09:31:18 AM EST

I think that standardizing on a single "package management / dependency resolution" system is as important to the adoption of GNU/Linux as a standard libc was to the C programming language.

All system administrators must manage software installations, and it would be really nice if there was at least one (nearly) universal body of knowledge that would accomplish that task.

BeRT 12/04/03 05:48:19 AM EST

Red Hat Package Manager? RPM being a 'system'?

and that would be
apt-get install evolution

Duck 12/04/03 03:46:07 AM EST

From a technical perspective, Perens proposes Debian as the base technology for UserLinux, which is no coincidence since he is a former leader of the Debian project.

You make it sound like Perens chose Debian just because he is a former leader of the project, from reading the proposal I think he had different reasons and sees his previous involvement in the Debian project as a benefit.

He notices several aspects of Debian as important: Freedom, size (packages, developers, users), package delivery system, responsive and open behavior towards bugs and security issues, social contract and the broad range of architectures supported by Debian.

You can either build a distribution from scratch or build upon an existing effort. If you want freedom and openness, and your goal is to bring the community spirit to the enterprise, you're not gonna build upon Red Hat's Fedora, right?

Frank Wales 12/03/03 11:33:28 PM EST

"...taken a significant market share away from the world’s largest company."

Isn't that the Mitsubishi Group, or maybe GE?

Adam Boyle 12/03/03 07:14:33 PM EST

Can we stop being so ignorant about RPM, please!!! RPM is a packaging standard, not a delivery/dependency resolving mechanism. Please don't tell me that RPM is worse than apt-get, because you're comparing a package to a delivery mechanism. RPM is the equivalent of a .deb package, and they really are functionally equivalent.

If you want to compare delivery and dependency resolution mechanisms, try comparing Mandrake's urpmi or RedHat's up2date to apt-get. And urpmi is arguably better than apt-get!

Besides the fact that:

> urpmi evolution

takes fewer characters to type than:

> apt-get evolution

Just my 2 cents.

Chuck Wegrzyn 12/03/03 06:59:40 PM EST

Why use Debian? While it is a nice platform, and one I have used for a few years, I think the Gentoo system is actually a much better way to go. It is easier to customize and much easier to keep up-to-date. The downside to Gentoo is the installation - it is more complicated than any of the other distributions.

macb 12/03/03 06:59:32 PM EST

This idea sounds fine to me, particularly since it doesn't involve creating Yet Another Distribution. The worst that can happen, it seems, is that the idea will fail to pick up momentum but in the meantime will contribute to the existing Debian project.

What interests me is that companies like Red Hat and Novell continue to state publicly that they are NOT setting themselves up in competition with Microsoft. They then proceed to adopt, to the extent the GPL allows, Microsoft-like practices, and furthermore measure their success or failure as though competition with Microsoft were indeed at the top of their agendas.

What they don't seem to get is that Microsoft, when (should I say if?) it falls will NOT be replaced by something else. I think there is every reason to believe that Microsoft is an historical anomaly produced by the single-minded greed and ego of one man, plus a large pinch of luck in being in the right place at the right time to capitalize on IBM's fumbling of the PC business. Sometimes I think IBM fumbled it a bit on purpose, since had they retained full control of the PC hardware and software as we know it today they would have found themselves back in anti-trust litigation and they would have had a harder time fending off the Justice Department than did Microsoft.

I like seeing companies like IBM involved with Linux and Open source. Why? because software is NOT their only business. They sell hardware and services, invent things and sell patent rights. Novell, Red Hat, Suse, Mandrake, etc are all set up, like Microsoft, to make a killing just selling an Operating system. When in history did any other company succeed by just selling an operating system? Just once, just Microsoft and it is a bad act to try and follow.

As Bruce said on The Linux Show yesterday, the key to success is more likely to be with areas of specialization. Who will corner the market in selling turnkey Linux-based systems to dentists? How about doctors, construction companies, import/export companies, and so on? Companies like Red Hat or Novell could of course use subsidiaries to focus on these industries, but to do so they will have to abandon their current mindsets and start thinking about BIG payrolls with lots of locally based employees pounding the pavement to both sell into and support these specialties. That, in my opinion is a formula that can work (it seems to be working for IBM anyway).

Finally, Linux is a great tool for universities. Students have full access to everything to tinker with and in some cases improve upon. This was actually also the case with IBM software before PCs came along. It will be this synergy between the mission of higher education and both commercial and individual computer users that will keep Linux going even if ALL of the commercial Linux ventures fail.

Too much attention has been paid to leveling the playing field between Microsoft and other commercial software companies. We have lost sight of the fact that the operating system *IS* the playing field and the only way to enjoy the full benefit of competing ideas is to have an operating system that is both equally available to everyone and also that runs on all hardware. There is only one operating system that satisfies those requirements now and it is hard to see how anything can stand in its way.

Randy Andy 12/03/03 06:39:56 PM EST

The variations between GNU/Linux distributions are regularly cited as a weakness of the GPL development model. I'll make a counter-argument. This debate we are having, about how to standardize GNU/Linux configuration so that competing distributions can continue to compete without stifling growth in the app space, is one of the few places in the market where this is happening.

We could hastily lock down some distribution and thereby speed up short term acceptance, but the GPL community is committed to retaining competition by looking to achieve an open standard for the OS development as well.

This hasn't been achieved elsewhere. If it can be done, the GNU platform will be the ONLY place where competition flourishes both in the OS space AND the App space. Everywhere else, the OS's have prospered not from their own strength, but from the standardized environment that allow only the App competition to flourish.

This determination shows the power of freedom of choice. Linux users get what they want.

kaka 12/03/03 06:18:15 PM EST

From what I understood, Red Hat was invited to join UnitedLinux the DAY BEFORE the deadline... meaning they didn't want Red Hat to participate but didn't want to look anticompetitive.

Ed Mack 12/03/03 05:37:35 PM EST

Bruce, there pretty much is this already. Debian follows the standards set out for distribution file paths, and other distros do too, to a lesser degree.

For the file dependencies, Fedora, Debian, etc. have apt-style tools that solve this by automatically fetching them.

I don't personally see why a new distro should be made. If Debian were worked on in regard to the desktop, a hell of a lot of people would win, not just users of yet another distro.

Bruce 12/03/03 10:19:59 AM EST

I think there should be more standardisation of program locations.
I know open source is all about freedom, but there should still be some standards on where files are installed, so that it would be the same for every distro by default. This would be a great help for newbies reading their first howtos, as well as for RPM finding file dependencies.
