Can Linux Clusters Move into Mainstream Information Technology?

Exclusive Interview with Rob Lucke, Chief Solutions Officer for Vista Solutions Corp

The emergence of commodity supercomputing has brought clusters based on the Linux operating system into engineering and scientific research organizations that previously couldn't afford supercomputing resources of their own. But Linux clusters have the potential to become a hot topic in traditional information technology circles as well. The coming year may well be the tipping point at which Linux cluster technology escapes its current home in research organizations and takes up residence in the traditional data center.

Building Clustered Linux Systems by Robert Lucke, recently published by Prentice Hall Professional Technical Reference and HP Books, attempts to provide a starting point for organizations interested in building or evaluating their first Linux cluster.

We took this opportunity to have a chat with Robert and ask him a few questions on the subject of Linux and clustering.

What made you want to write a book about building clusters?

Before starting work on clusters, I spent a considerable amount of time tackling workgroup architecture and large-scale system administration problems for my engineering and scientific customers. When I got the opportunity to work on a prototype Itanium 2 cluster at Pacific Northwest National Laboratory, I was fascinated by the new technologies, like the high-speed interconnect from Quadrics, and recognized many familiar management and architectural issues. The more I learned, the more I saw applications for clusters in other, more "traditional" areas. The book itself was a learning experience for me and an attempt to collect and organize cluster-building information for organizations that are investigating clustered solutions.

Why do you think that clusters are an important architecture?

If you have the proper application software, a cluster can scale high-performance, high-availability, or high-throughput resources far beyond anything available in a single SMP system. Being able to do this with commodity hardware puts tremendous compute resources within reach of organizations that previously couldn't afford them. I see cluster architectures and techniques as the gateway to some of the resource virtualization that seems to be the Holy Grail of traditional IT departments today. I think this is exciting! I guess it's the love of finding an elegant solution to a problem that drives my excitement. Instead of being a marketecture, clusters represent a real solution to a whole class of scaling problems.
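To make the high-throughput case concrete (an illustration of ours, not Rob's): a cluster of commodity nodes can chew through a queue of independent jobs simply by farming them out over the network. Below is a minimal Python sketch of that pattern; the node names and the worker command are invented, and it assumes passwordless ssh to each node.

```python
# Hypothetical sketch: farming independent jobs out to cluster nodes
# over ssh, the "high-throughput" pattern described above.
# Node names and the worker command are invented for illustration.
import subprocess
from concurrent.futures import ThreadPoolExecutor

NODES = ["node01", "node02", "node03", "node04"]    # hypothetical hosts
JOBS = [f"render --frame {n}" for n in range(100)]  # hypothetical command

def run_on_node(node: str, job: str) -> str:
    """Run one job on one node via ssh and return a status line."""
    result = subprocess.run(["ssh", node, job],
                            capture_output=True, text=True)
    return f"{node}: {'ok' if result.returncode == 0 else 'FAILED'}"

# One worker thread per node; jobs are dealt out round-robin, so each
# node stays busy with its share of the queue.
with ThreadPoolExecutor(max_workers=len(NODES)) as pool:
    results = pool.map(run_on_node,
                       (NODES[i % len(NODES)] for i in range(len(JOBS))),
                       JOBS)
    for line in results:
        print(line)
```

Adding throughput is then a matter of adding hostnames to the list, which is exactly the kind of incremental scaling a single SMP box can't offer.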

When someone says cluster, what does it mean to you?

I have learned to be very, very careful with the word cluster. It's overloaded and the meaning depends on the audience. Using cluster in a scientific context evokes a different mental picture than if it were used by traditional IT folks. In general, I think of a cluster as a group of separate resources like systems, CPUs and RAM that gets poured into a mold. The shape of the mold determines the final shape and behavior of the clustered solution. Sizing the problem and determining the shape of the mold is the fun part for me.

Besides scientific and engineering environments, do you see any other applications for clusters?

I sure do! There are database clusters, web server clusters, file server clusters, visualization clusters, and on and on. Instead of building clusters that push the upper limits of RAM and CPU resources (thousands to tens of thousands of CPUs, for example), the company I work for concentrates on application-specific clusters. These are smaller, single-function clusters meant to run application configurations that would otherwise have required a large and expensive SMP system. The intent is to lower the complexity of building and managing the cluster while still providing a more cost-effective solution for the application. I think this type of approach is applicable in any kind of computing environment.
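As an invented illustration (not from the interview) of one such single-function cluster, the front end of a web server cluster can be as simple as a process that accepts connections and hands each one to the next back-end node in turn. The addresses below are hypothetical:

```python
# Hedged sketch of the idea behind a web-server cluster: a tiny TCP
# front end that forwards each connection to one of several back-end
# nodes, round-robin. Back-end addresses are hypothetical.
import socket
import threading
from itertools import cycle

BACKENDS = cycle([("10.0.0.11", 8080), ("10.0.0.12", 8080)])

def pump(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes from src to dst until src reaches end of stream."""
    while data := src.recv(4096):
        dst.sendall(data)
    dst.shutdown(socket.SHUT_WR)

def proxy(client: socket.socket, backend: tuple) -> None:
    """Splice one client connection onto one back-end node."""
    upstream = socket.create_connection(backend)
    threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
    pump(upstream, client)

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("0.0.0.0", 8000))
listener.listen()
while True:
    conn, _ = listener.accept()
    threading.Thread(target=proxy, args=(conn, next(BACKENDS)),
                     daemon=True).start()
```

A production front end adds health checks, session affinity, and failover, but the shape of the mold, to use Rob's phrase, is already visible.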

What are some of the common mistakes you've encountered in cluster building?

The single biggest problem I run into is what I call "pile o'hardware" syndrome. That's the notion that you just buy a whole bunch of cheap hardware, rack it up, and a cluster will magically appear out of the pile of pieces. It's still very common to underestimate the amount of work required to make physically separate resources work together as if they were one very large, manageable SMP system. A cluster is still a systems engineering problem, and one that can turn nasty if you aren't careful. But with advances in pre-racked, pre-cabled hardware from some of the hardware vendors, and with cluster software toolkits like OSCAR and Rocks, I see cluster building getting easier all the time.
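To show (with an example of our own, not something taken from OSCAR or Rocks) how much of that systems engineering is mundane verification: before a rack of machines behaves like one system, every node must at minimum be reachable and configured consistently. A hypothetical sanity sweep might look like this:

```python
# Hypothetical node sanity sweep for a freshly racked cluster.
# Checks that each node answers over ssh and reports kernel version
# and installed RAM, so mismatched nodes stand out. Hostnames invented.
import subprocess

NODES = [f"node{n:02d}" for n in range(1, 17)]  # a hypothetical 16-node rack
CHECKS = ["uname -r",                                        # kernel version
          "awk '/MemTotal/ {print $2, $3}' /proc/meminfo"]   # installed RAM

for node in NODES:
    for check in CHECKS:
        proc = subprocess.run(
            ["ssh", "-o", "ConnectTimeout=5", node, check],
            capture_output=True, text=True)
        value = proc.stdout.strip() if proc.returncode == 0 else "UNREACHABLE"
        print(f"{node:<8} {check:<48} {value}")
```

A mismatched kernel or a node with half the expected memory is exactly the kind of pile o'hardware surprise that keeps separate boxes from acting like one large system.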

Why do you think that Linux is the best cluster operating system?

One simple answer is choice. There are commercial distributions, free distributions, white-box distributions, and so forth. If you have a commercial software package like an Oracle database that's qualified against a particular commercial Linux distribution like SuSE or Red Hat, you can build a fully supported cluster configuration. If you want to do research or custom work, there are free distributions like Debian or Fedora. And because the source code is available, you can choose your starting point and your degree of customization. This is the best of all possible worlds.

The Linux operating system is stable, manageable and flexible. You are free to configure Linux as you see fit instead of trying to chip away at a black-box operating system that fights you every step of the way. There's a wealth of free management and development tools available. Oh, did I mention that Linux runs on a wide range of commodity hardware, both 32- and 64-bit? What's not to like? Nothing else comes close in my estimation.

What do you see ahead for clustered architectures?

I definitely see Linux clusters moving into mainstream information technology environments. If you look back, the scientific community tends to drive computing technologies that are later adopted by the more conservative IT organizations as business solutions. One modest example I can think of might be the World Wide Web and the Mosaic browser. I firmly believe that clusters, specifically Linux clusters, are poised to repeat this type of adoption pattern. I think we are very close, if not past, the tipping point.

What would you say to someone who is thinking about building his first cluster?

Do it, but do it with your eyes open. Do your homework before starting. Give yourself time to learn. Try not to fall into the pile o'hardware trap. Start small and scale up. Investigate starting points like openMosix, Rocks and OSCAR first. If you don't have time for the learning curve, then have a replicable solution designed and implemented for you.

Conclusion

In addition to their usefulness in scientific and engineering computing environments, I believe that Linux clusters and clustering techniques will become an important addition to the standard information technology solutions in the corporate data center. The trick is going to be sharing the cluster-building knowledge that's available in universities and research institutions with traditional information technology organizations. Because of its stability, flexibility, open nature, manageability, and availability on a wide range of commodity hardware, I believe that Linux is the correct choice for building clustered solutions. I am really looking forward to the next few years; I believe they will be exciting ones for both Linux and clusters.

About Rob Lucke

Rob Lucke is currently chief solutions officer for Vista Solutions Corp. (http://www.VistaSolutions.Net), concentrating on technical and scientific computing. Rob's fields of expertise include Linux compute clusters, technical systems architecture, large-scale system administration techniques, network file systems, heterogeneous interoperability, software development, and application- and system-level performance tuning. Rob has over 30 years of experience in computing and software of all types, from real-time data acquisition to transaction processing. His first book, Designing and Implementing Computer Workgroups, was published in 1999. His second book, Building Clustered Linux Systems, was published in September 2004. Rob is Red Hat Certified Engineer #807200931604117.

About Ibrahim Haddad

Ibrahim Haddad is a member of the management team at The Linux Foundation responsible for technical, legal and compliance projects and initiatives. Prior to that, he ran the Open Source Office at Palm, the Open Source Technology Group at Motorola, and Global Telecommunications Initiatives at The Open Source Development Labs. Ibrahim started his career as a member of the research team at Ericsson Research focusing on advanced research for system architecture of 3G wireless IP networks and on the adoption of open source software in telecom. Ibrahim graduated from Concordia University (Montréal, Canada) with a Ph.D. in Computer Science. He is a Contributing Editor to the Linux Journal. Ibrahim is fluent in Arabic, English and French. He can be reached via http://www.IbrahimHaddad.com.
