Linus changed his source habits & why it doesn't matter

Developers have three options for advancing the Linux kernel.

(LinuxWorld) -- Linux is in greater demand every day. People want to run Linux on everything from wristwatches to mainframes. Thankfully, the core kernel developers aren't burdened with the responsibility of making Linux run on a wristwatch, but they do have to deal with desktop and server demands. People want the kernel to accommodate more processors, different hardware architectures, more types of I/O, more robust and complex network support and filtering, not to mention emerging hardware peripherals ranging from InfiniBand special-purpose high-speed network adapters to cheap digital cameras.

Talented programmers addressed these issues. They submitted patches, which they felt were dropped into a black hole. They felt ignored, and, in some cases, they were right. Some ISVs feel as if the core kernel developers are an exclusive club that resents and rejects contributions from outside the inner circle, especially from organizations where capitalism is involved. I don't know if that's true (I doubt it), but the perception can be destructive.

Many people have offered various suggestions on how to improve the process of accepting and integrating kernel patches. I don't have space to tackle them all, and I doubt if my endorsement would make any difference. I do have a bit of advice for those who have more influence on the process. (If you're interested in reading up on all the ideas being tossed around, visit one of the many Linux kernel mailing list archives. You can find links for two of them at http://www.kernel.org/.)

Free advice for free software

Much of the discussion is politically motivated, and some of the antipathy is based on personality conflicts. Some of the suggestions are purely logistical: for example, that kernel developers should use a more intelligent means of submitting and applying patches than the patch and diff utilities.
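
For readers who have never touched the kernel's traditional workflow, here is a minimal sketch of what "the patch and diff utilities" means in practice. The directory names and the patch file name are made up for illustration; the commands themselves are the standard GNU diff and patch invocations contributors used at the time.

    # Produce a unified diff between a pristine tree and a modified copy
    # (directory and file names are hypothetical):
    diff -urN linux-2.5.3-vanilla/ linux-2.5.3-hacked/ > my-feature.patch

    # On the receiving end, apply the patch from inside the target tree:
    cd linux-2.5.3
    patch -p1 < ../my-feature.patch

Every contribution arrives as a plain text file like this one, and the maintainer applies each one by hand, which is exactly the bottleneck these suggestions aim to remove.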

It appears Linus Torvalds is doing just that. Earlier this week he announced that he has started using BitKeeper, a distributed source management system. He wrote in his weekly kernel update, "...I've spent about a week trying to change my working habits and scripting bitkeeper enough to (a) import a good revision controlled tree into it from the 2.4.x and 2.5.x patch-archives and (b) try to actually accept patches directly into bitkeeper."

This is good news. BitKeeper isn't an instant panacea, however. "Quite frankly, so far it has definitely made me slower -- it took me basically a week to get about 50 patches applied, but most of that time by far was writing scripts and just getting used to the thing," Torvalds writes. "I expect to be a bit slower to react to patches for a while yet, until the scripts are better."
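
To make the phrase "distributed source management" a little more concrete, here is a rough sketch of how a developer would track Linus's tree with BitKeeper. The repository URL and exact invocations are from memory of the tooling of that era and should be treated as illustrative rather than authoritative:

    # Clone a local, fully self-contained copy of the kernel repository:
    bk clone bk://linux.bkbits.net/linux-2.5 linux-2.5
    cd linux-2.5

    # Later, merge in whatever changesets have landed upstream:
    bk pull

Unlike CVS, every developer holds a complete repository with full history, so work can continue and changesets can be exchanged without a central server in the loop.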

As great as BitKeeper might be and as wonderful as Torvalds has proven to be, we need to keep these two facts in mind:

  • Linus Torvalds is not God
  • The Linux kernel is open source, and it is licensed under the GPL

It is difficult for me to criticize the way Linus Torvalds folded patches into the Linux kernel in the past. Anyone who is tempted to do so should remember how much we Linux users owe to the talent and years of hard work of Torvalds.

Nevertheless, for the sake of argument, allow me to assume the worst. Suppose Linus deliberately ignores patches from some people, and in so doing occasionally works against the best interest of all Linux developers and users. Suppose Linus rejects some patches, not because the patches are without merit but because they come from IBM or HP employees, and he doesn't want those companies to have any influence on the kernel.

Or, how about this? What if BitKeeper is the worst possible tool for collecting patches, evaluating them, and applying them? What if BitKeeper is perfectly fine, but Torvalds uses it incorrectly? What if he never "gets used to the thing"?

I seriously doubt the situation is nearly as bad as these worst-case scenarios, but it is likely that there is an element of truth to at least some of these claims. Why? Because Linus Torvalds is not God. He has a finite ego, IQ, energy level, and attention span, and he suffers to some degree from all of the other human limitations you and I share. He is predisposed to handling things one way and not another. It is possible that his way was the best way when the kernel was smaller and more manageable, but that it is no longer. In other words, it may be true that, regardless of the source management tools he uses, the Linux kernel has outgrown Linus Torvalds.

If so, there is still one more thing you can do. Remember that the Linux kernel is open source.

If you can set up a system that manages the progress of the Linux kernel better than Linus can, then go for it. Linus Torvalds may be able to stop you from calling your kernel "Linux," but he can't stop you from taking the kernel as it exists today and doing a better job advancing it. If you're worried that kernel developers simply won't cooperate with you out of a sense of loyalty to Linus, then keep reminding them that Linus Torvalds is not God. Eventually it will sink in.

Okay, now. Any takers?

If not, then I have one last bit of advice. Keep on making suggestions on how to improve the process. In the meantime, try to work with Linus the way he prefers to do things, whether you think he's right or not.

Postscript

In case you're wondering, here's how I feel about the current branches of development on the Linux kernel. I haven't been very happy with the Linus-managed 2.5 branch as of late. It usually doesn't even compile on my system. Dave Jones is aggressive about cleaning up the problems with the 2.5 branch and merging fixes and features from other branches. Jones' patches usually compile for me (although they make it difficult, but not impossible, for me to use the NVidia accelerated driver). You have to be pretty daring to mess with any of the 2.5 branches, whether they're from Linus or Dave. I run the unstable branch of Debian, so I tend to take risks in order to learn more about Linux, but I'm not comfortable running any of the 2.5 kernels yet except as a short experiment.
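
For anyone who wants to try one of these trees themselves, the build sequence I have in mind is the traditional 2.4-era one; the 2.5 series was in the middle of a build-system rewrite at the time, so treat this as a sketch rather than a recipe:

    # From the top of the unpacked kernel source tree:
    make menuconfig        # choose drivers and options
    make dep               # build dependency information (required in 2.4)
    make bzImage modules   # compile the kernel image and the modules
    make modules_install   # install modules under /lib/modules/<version>

    # Installing the bzImage and updating the bootloader is left to you
    # or to your distribution's tools.

A tree that "doesn't even compile" fails somewhere in that compile step, long before you have anything bootable.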

I miss the frequency with which Alan Cox improved and experimented with the 2.4 branch, so I was very happy to see Alan post some patches to the 2.4.18-pre tree. I'm currently running 2.4.18-pre7-ac2, which has been working great.

Whenever Linus reaches a reasonably stable point in his kernel branch, I usually find myself going back to one of his versions. Whether I do so in the future is uncertain, which is as it should be. It all depends on who does it best.

More Stories By Nicholas Petreley

Nicholas Petreley is a computer consultant and author in Asheville, NC.
