Introducing the Linux 2.6 Kernel

Already the subject of intense scrutiny, this new kernel will be the first major revamp of the Linux kernel in two years. We at Open Source Development Labs (OSDL) have worked with Linux developers and together completed more than 4,000 tests on publicly available development versions of this kernel.

In recent months, we have run the development kernel, known as 2.5, in our production environment with servers, achieving more than 30 days of continuous uptime. The 2.5 kernel will transition into 2.6, and OSDL is committed to its rapid adoption in the market. (OSDL is a global consortium backed by Computer Associates, Fujitsu, Hitachi, HP, IBM, Intel, NEC, and other major vendors.)

A fast and deep entry into the market would be a distinct change from what happened with Linux 2.4, when adoption took longer than the industry anticipated. This time around, however, the development community, including OSDL, has tested the kernel so extensively that we believe adoption will come much, much faster.

There are eight reasons why CIOs will decide to upgrade to a Linux distribution based on 2.6: seven are technical, and the eighth, and decisive, factor is cost. At OSDL, we divided the key 2.6 kernel feature improvements into seven categories: performance, scalability, availability, clustering, I/O, management, and serviceability. We found that Linux systems based on the 2.6 kernel will scale better on bigger machines, providing the opportunity to replace more proprietary Unix servers and to consolidate workloads on larger Linux systems. But it's not just the technical features; the clincher is the cost savings that these features will make possible for large organizations.

Businesses can save big money by implementing the new Linux kernel on Intel architecture-based servers. Amazon's move from Solaris to Linux on HP NetServer systems helped the company slash its technology capital budget by more than 25% in the first year alone. There's more: training costs drop, and additional savings follow because Unix technical staff can easily port their skills, procedures, and even many applications to Linux.

To borrow an insight from Clayton Christensen's book, The Innovator's Dilemma (HarperBusiness, 2000), Linux is a disruptive technology. The new kernel is going to allow Linux to pass Christensen's "good enough" test. This means that many organizations are going to begin moving their core data center operations over to Linux. It gets the job done for a lot less money. As proprietary architectures yield their performance advantage to Linux, Linux becomes "good enough" for most workloads.

Scalability
"Does Linux scale?" is often the first question an IT manager will ask when evaluating whether Linux can replace an enterprise Unix server. Our tests indicate that the Linux 2.6 kernel will scale much better than the 2.4 kernel. Most of the development of the 2.4 kernel was done on single-processor systems with some testing on dual-processor and larger systems. The larger 8- and 16-way machines are supported, but the 2.4 kernel isn't really aimed at those system sizes. With the 2.6 kernel, performance is dramatically improved on large machines.

As part of OSDL's charter we provide outside developers access to enterprise-class machines. Testing on multiprocessor machines is a vital part of the Linux development process and has resulted in an improved scheduler, kernel native threading, and overall refinement of the locking granularity.

We also did a lot of testing of these larger machines with databases, a classic resource-intensive, business-critical workload. OSDL provided the Database Test Suite, a fair-use implementation of Transaction Processing Performance Council (TPC) benchmarks. Database performance results comparing the Linux 2.4 kernel to the Linux 2.5 kernel are freely available from OSDL at www.osdl.org/projects/performance, and the source code for the tests is also available to developers.

Stability
"Is Linux stable on larger systems?" is probably the second most frequently asked question. OSDL put a lot of time and resources into testing the Linux 2.5 kernel through the Linux Stabilization Project. A description of the tests and results is available at www.osdl.org/projects/26lnxstblztn/results. Based on these tests and our experience with the 2.5 kernel, we expect that the Linux 2.6 kernel will be more stable than the Linux 2.4 kernel was when it was released.

There is a companion project to test scalability in a repeatable scientific environment. OSDL's Scalable Test Platform (STP) and Patch Lifecycle Manager (PLM) provide the Linux development community with an open, easy-to-use resource for testing custom kernels. STP works as the testing engine. PLM makes it easy to manage developers' patches against stock kernels. With a consistent set of hardware and test suites, developers can test new features in a controlled environment.

Planning for the Future
Because of the improvements in scalability, stability, performance, and availability in the kernel, Linux has reached the level where it can replace more expensive Unix servers. IT managers need to evaluate Linux's suitability for their data centers based on the features it will have at the time of deployment. The rapid development of Linux adds some challenges to adoption planning. IT managers need to become familiar with the improvements in the 2.6 kernel, determine its suitability for their enterprise, and insist on these features when preparing Requests for Proposal (RFPs) or making a purchase.

The Linux 2.6 kernel will support more hardware platforms, letting businesses cut management costs by reducing the number of operating systems they maintain. Instead of a variety of Unix versions, businesses can standardize on Linux across a range of hardware architectures. Most Linux systems run on industry-standard Intel architecture servers, available from almost every vendor, including Dell, HP, IBM, and NEC. Linux also runs on mainframes from IBM and Fujitsu, PowerPC-based servers from IBM, and Itanium-based servers from HP.

When IT managers plan for the future, they should keep in mind that Linux server use is growing and Unix server use is shrinking. According to industry research firm Gartner, hardware vendors shipped over 425,000 servers with Linux last year, up from 286,823 in 2001. During the same time period, shipments of Unix machines fell 9%. Due to technical improvements in the 2.6 kernel, we anticipate that this trend will accelerate. Many more companies will follow Amazon's early lead and realize significant cost savings by migrating from Unix to Intel architecture hardware.

With the release of the new kernel, OSDL is refocusing much of its work on end-user Global 2000 corporations. We're interested in learning more about your plans to use Linux. What challenges remain before you are prepared for production deployment? With classic disruptive technologies, much like the original personal computer, we know that adoption of "good enough" technology accelerates in new and surprising ways. Tell us what your plans are for Linux. We invite your organization to participate with OSDL in making Linux ready for your enterprise.

For More Information
Learn more by visiting www.osdl.org, the OSDL site. Here you'll find information on Carrier Grade Linux, Data Center Linux, OSDL Database Test Suite, Linux Stabilization Project, Scalability Test Platform, Patch Lifecycle Manager, and much more.

What Is the Kernel?
The Linux kernel is the core of a Linux system. It is only a small part of the many files installed on a server. Programs such as Web servers, databases, application servers, mail servers, compilers, text editors, image editors, and word processors are not part of the Linux kernel. The kernel controls access to system resources such as

  • CPU
  • RAM
  • Monitor, keyboard, mouse
  • Disk drives, CD-ROM drives
  • Tape drives, printers, and other peripherals and ports
  • Network access
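
Applications never touch these resources directly; they ask the kernel through system calls. As a minimal illustration of that mediation, here is a sketch using Python's thin wrappers around the open, write, read, and close system calls (the file path is created by the script itself, so the sketch is self-contained):

```python
import os
import tempfile

# User programs never drive the disk themselves; each of these calls
# traps into the kernel, which performs the actual I/O on the program's
# behalf and enforces access permissions.
path = os.path.join(tempfile.mkdtemp(), "demo.txt")

fd = os.open(path, os.O_WRONLY | os.O_CREAT, 0o600)  # open(2)
os.write(fd, b"mediated by the kernel\n")            # write(2)
os.close(fd)                                         # close(2)

fd = os.open(path, os.O_RDONLY)
data = os.read(fd, 64)                               # read(2)
os.close(fd)

print(data.decode(), end="")  # prints: mediated by the kernel
```

The same pattern holds for every item in the list above: the kernel sits between the program and the hardware, whether the resource is a disk, a network interface, or a printer.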

There are two types of Linux kernels: development and production (or stable). Development kernels have an odd minor version number (e.g., 2.3 or 2.5); production kernels have an even one (e.g., 2.4 or 2.6).

This numbering scheme divides Linux users into two categories. The first consists of developers and testers, who use the odd-numbered kernels, which change rapidly and may be unstable. The second consists of production users, who use even-numbered kernels, which change as little as possible.
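
The even/odd convention can be sketched in a few lines of Python (the helper name is our own, for illustration):

```python
def kernel_series(version: str) -> str:
    """Classify a 2.x kernel version by the parity of its minor number:
    odd minor = development, even minor = production (stable)."""
    minor = int(version.split(".")[1])
    return "development" if minor % 2 else "production (stable)"

print(kernel_series("2.5.68"))  # development
print(kernel_series("2.6.0"))   # production (stable)
```

Note that this rule applies to the kernel series of this era; it says nothing about the stability of any individual release within a series.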

Although there will be some settling of the fine points of kernel feature implementation and a deferral of features that aren't ready for production, a look at the 2.5 kernel will give a fairly good view of what the 2.6 kernel will become.

More Stories By Dave Fuller

Dave Fuller brings more than 25 years of data center technical and marketing experience to his current position leading the technical marketing group at OSDL, where he participates in both the Linux kernel stabilization project and the Data Center Linux working group. Prior to OSDL, Dave led IT activities at a start-up focused on Web commerce. At Sequent Computer Systems, he played key roles in technical services and oversaw technical sales support for the company's Asia-Pacific and Latin American operations.
