How Linus changed his source habits & why it doesn't matter

Developers have three options for advancing the Linux kernel.

(LinuxWorld) -- Linux is in greater demand every day. People want to run Linux on everything from wristwatches to mainframes. Thankfully, the core kernel developers aren't burdened with the responsibility of making Linux run on a wristwatch, but they do have to deal with desktop and server demands. People want the kernel to accommodate more processors, different hardware architectures, more types of I/O, and more robust and complex network support and filtering, not to mention emerging peripherals ranging from special-purpose, high-speed InfiniBand network adapters to cheap digital cameras.

Talented programmers addressed these issues. They submitted patches, which they felt were dropped into a black hole. They felt ignored, and in some cases they were right. Some ISVs feel as if the core kernel developers are an exclusive club that resents and rejects contributions from outside the inner circle, especially from organizations where capitalism is involved. I don't know whether that's true (I doubt it), but the perception can be destructive.

Many people have offered suggestions on how to improve the process of accepting and integrating kernel patches. I don't have space to tackle them all, and I doubt my endorsement would make any difference. I do have a bit of advice for those who have more influence on the process. (If you're interested in reading up on all the ideas being tossed around, visit one of the many Linux kernel mailing list archives. You can find links to two of them at http://www.kernel.org/.)

Free advice for free software

Much of the discussion is politically motivated, and some of the antipathy is rooted in personality conflicts. Some of the suggestions are purely logistical in nature, for example, that kernel developers should use a more intelligent means of submitting and applying patches than the patch and diff utilities (a sketch of that traditional workflow follows).
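For context, here's the workflow those suggestions target: a contributor runs diff to capture the difference between the stock tree and a modified one, mails the resulting text to the maintainer, and the maintainer feeds it to patch. The sketch below reproduces the contributor's half of that round trip using Python's standard difflib module, which emits the same unified-diff format; the file names and contents are invented for illustration.

    # What "diff" produces: a unified patch, here generated with
    # Python's standard difflib. File names and contents are invented.
    import difflib

    old = ["int main(void)\n", "{\n", "    return 0;\n", "}\n"]
    new = ["int main(void)\n", "{\n",
           "    printf(\"hello\\n\");\n",
           "    return 0;\n", "}\n"]

    # unified_diff yields the same text format that "diff -u" emits
    # and that the "patch" utility consumes on the maintainer's end.
    print("".join(difflib.unified_diff(
        old, new, fromfile="a/hello.c", tofile="b/hello.c")))

The maintainer's half is simply running patch once per submission; with hundreds of submissions arriving by e-mail every week, it's easy to see why people want something smarter.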

It appears Linus Torvalds is doing just that. Earlier this week he announced that he had started using BitKeeper, a distributed source management system. He wrote in his weekly kernel update, "...I've spent about a week trying to change my working habits and scripting bitkeeper enough to (a) import a good revision controlled tree into it from the 2.4.x and 2.5.x patch-archives and (b) try to actually accept patches directly into bitkeeper."

This is good news. BitKeeper isn't an instant panacea, however. "Quite frankly, so far it has definitely made me slower -- it took me basically a week to get about 50 patches applied, but most of that time by far was writing scripts and just getting used to the thing," Torvalds writes. "I expect to be a bit slower to react to patches for a while yet, until the scripts are better."
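What might those scripts look like? Purely as an illustration, and not Torvalds' actual code, here is a minimal sketch of a patch-queue helper: it applies a numbered series of patches in order, dry-running each one first so a rejected hunk never leaves the tree half-patched. The directory layout, the file naming, and the BitKeeper commit step are all assumptions.

    # Hypothetical patch-queue helper, in the spirit of the scripts
    # Torvalds describes. Not his actual code; the layout and naming
    # are assumptions.
    import subprocess
    import sys
    from pathlib import Path

    def apply_series(tree: Path, queue: Path) -> None:
        for patch_file in sorted(queue.glob("*.patch")):
            # Dry-run first so a rejected hunk never half-applies.
            for extra in (["--dry-run"], []):
                result = subprocess.run(
                    ["patch", "-p1", *extra,
                     "-i", str(patch_file.resolve())],
                    cwd=tree)
                if result.returncode != 0:
                    sys.exit(f"failed on {patch_file.name}; fix and rerun")
            print(f"applied {patch_file.name}")
            # A BitKeeper user would check the result in at this point,
            # e.g. with "bk commit" (the command name is an assumption).

    if __name__ == "__main__":
        apply_series(Path("linux-2.5"), Path("patch-queue"))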

As great as BitKeeper might be and as wonderful as Torvalds has proven to be, we need to keep these two facts in mind:

  • Linus Torvalds is not God
  • The Linux kernel is open source, and it is licensed under the GPL

It is difficult for me to criticize the way Linus Torvalds folded patches into the Linux kernel in the past. Anyone tempted to do so should remember how much we Linux users owe to Torvalds' talent and years of hard work.

Nevertheless, for the sake of argument, allow me to assume the worst. Suppose Linus deliberately ignores patches from some people, and in so doing occasionally works against the best interest of all Linux developers and users. Suppose Linus rejects some patches, not because the patches are without merit but because they come from IBM or HP employees, and he doesn't want those companies to have any influence on the kernel.

Or, how about this? What if BitKeeper is the worst tool for collecting patches, evaluating them, and applying them? What if BitKeeper is perfectly fine, but Torvalds uses it incorrectly? What if he never "gets used to the thing"?

I seriously doubt the situation is anywhere near as bad as these worst-case scenarios, but it is likely that there is an element of truth to at least some of these claims. Why? Because Linus Torvalds is not God. He has a finite ego, IQ, energy level, and attention span, and he suffers to some degree from all of the other human limitations you and I share. He is predisposed to handling things one way and not another. It is possible that his way was the best way when the kernel was smaller and more manageable, but that it is no longer the best way. In other words, it may be true that regardless of the source management tools he uses, the Linux kernel has outgrown Linus Torvalds.

If so, there is still one more thing you can do. Remember that the Linux kernel is open source.

If you can set up a system that manages the progress of the Linux kernel better than Linus can, then go for it. Linus Torvalds may be able to stop you from calling your kernel "Linux," but he can't stop you from taking the kernel as it exists today and doing a better job advancing it. If you're worried that kernel developers simply won't cooperate with you out of a sense of loyalty to Linus, then keep reminding them that Linus Torvalds is not God. Eventually it will sink in.

Okay, now. Any takers?

If not, then I have one last bit of advice. Keep on making suggestions on how to improve the process. In the meantime, try to work with Linus the way he prefers to do things, whether you think he's right or not.

Postscript

In case you're wondering, here's how I feel about the current branches of Linux kernel development. I haven't been very happy with the Linus-managed 2.5 branch lately. It usually doesn't even compile on my system. Dave Jones is aggressive about cleaning up the problems with the 2.5 branch and merging fixes and features from other branches. Jones' patches usually compile for me (although they make it difficult, but not impossible, for me to use the NVidia accelerated driver). You have to be pretty daring to mess with any of the 2.5 branches, whether they're from Linus or Dave. I run the unstable branch of Debian, so I tend to take risks in order to learn more about Linux, but I'm not comfortable running any of the 2.5 kernels yet except as a short experiment.

I miss the frequency with which Alan Cox improved and experimented with the 2.4 branch, so I was very happy to see Alan post some patches to the 2.4.18-pre tree. I'm currently running 2.4.18-pre7-ac2, which has been working great.
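For the record, getting to that kernel is a matter of stacking patches on a stock tree. The sketch below assumes the usual conventions of the era: the pre-patch is diffed against the previous stable release (2.4.17 here), the ac patch applies on top of the pre tree, and both patch files have been downloaded and decompressed alongside the source. The paths and file names are assumptions.

    # Hypothetical recipe for building the 2.4.18-pre7-ac2 tree named
    # above. Assumes the pre patch applies to a stock 2.4.17 tree and
    # the ac patch applies on top of it; paths and names are assumptions.
    import subprocess
    from pathlib import Path

    tree = Path("linux")  # an unpacked linux-2.4.17 source tree

    for name in ("patch-2.4.18-pre7", "patch-2.4.18-pre7-ac2"):
        subprocess.run(
            ["patch", "-p1", "-i", str(Path("..", name))],
            cwd=tree,
            check=True)  # raises CalledProcessError on a rejected hunk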

Whenever Linus reaches a reasonably stable point in his kernel branch, I usually find myself going back to one of his versions. Whether I do so in the future is uncertain, which is as it should be. It all depends on who does it best.

Nicholas Petreley is a computer consultant and author in Asheville, NC.


