Can Linux Clusters Move into Mainstream Information Technology?

Exclusive Interview with Rob Lucke, Chief Solutions Officer for Vista Solutions Corp

The emergence of commodity supercomputing has driven clusters based on the Linux operating system into engineering and scientific research organizations that previously couldn't afford their own supercomputing resources. But Linux clusters also have the potential to become a hot topic in traditional information technology circles. The coming year may well be the tipping point at which Linux cluster technology escapes its current home in research organizations and moves into the traditional data center.

Building Clustered Linux Systems by Robert Lucke, recently published by Prentice Hall Professional Technical Reference and HP Books, attempts to provide a starting point for organizations interested in building or evaluating their first Linux cluster.

We took this opportunity to have a chat with Robert and ask him a few questions on the subject of Linux and clustering.

What made you want to write a book about building clusters?

Before starting work on clusters, I spent a considerable amount of time tackling workgroup architecture and large-scale system administration problems for my engineering and scientific customers. When I got the opportunity to work on a prototype Itanium 2 cluster at Pacific Northwest National Laboratory, I was fascinated by new technologies like the high-speed interconnect from Quadrics, and I recognized many familiar management and architectural issues. The more I learned, the more I saw applications for clusters in other, more "traditional" areas. The book itself was a learning experience for me and an attempt to collect and organize cluster-building information for organizations that are investigating clustered solutions.

Why do you think that clusters are an important architecture?

If you have the proper application software, a cluster can scale high-performance, high-availability, or high-throughput resources far beyond anything available in a single SMP system. Being able to do this with commodity hardware brings tremendous compute resources within reach of organizations that previously couldn't afford them. I see cluster architectures and techniques as the gateway to some of the resource virtualization that seems to be the Holy Grail of traditional IT departments today. I think this is exciting! I guess it's the love of finding an elegant solution to a problem that drives my excitement. Instead of being a marketecture, clusters represent a real solution to a whole class of scaling problems.
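The throughput scaling Lucke describes rests on dividing independent units of work across many workers. A minimal single-machine sketch of that pattern (a hypothetical illustration, not taken from the book) uses Python's multiprocessing pool as a stand-in for a cluster's worker nodes:

```python
# Sketch: an embarrassingly parallel workload divided across a pool of
# workers -- the same pattern a high-throughput cluster applies across
# physically separate nodes instead of local processes.
from multiprocessing import Pool

def simulate_trial(seed):
    # Stand-in for one independent unit of work (e.g., one simulation run).
    # A simple deterministic pseudo-random walk keeps the example
    # self-contained; a real cluster job would do far more per trial.
    x = seed
    for _ in range(1000):
        x = (x * 1103515245 + 12345) % (2 ** 31)
    return x % 2  # trivial per-trial result

if __name__ == "__main__":
    trials = range(10_000)
    # Four local processes play the role of four cluster nodes; a real
    # scheduler (e.g., one deployed by an OSCAR- or Rocks-built cluster)
    # would farm the same trials out over the interconnect.
    with Pool(processes=4) as pool:
        results = pool.map(simulate_trial, trials)
    print(sum(results), "of", len(results), "trials returned 1")
```

Because each trial is independent, adding workers (or nodes) increases aggregate throughput without any change to the per-trial code, which is exactly why commodity clusters compete so well against a single large SMP system for this class of workload.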

When someone says cluster, what does it mean to you?

I have learned to be very, very careful with the word cluster. It's overloaded and the meaning depends on the audience. Using cluster in a scientific context evokes a different mental picture than if it were used by traditional IT folks. In general, I think of a cluster as a group of separate resources like systems, CPUs and RAM that gets poured into a mold. The shape of the mold determines the final shape and behavior of the clustered solution. Sizing the problem and determining the shape of the mold is the fun part for me.

Besides scientific and engineering environments, do you see any other applications for clusters?

I sure do! There are database clusters, web server clusters, file server clusters, visualization clusters, and on and on. Instead of building clusters that push the upper limits of RAM and CPU resources (thousands to tens of thousands of CPUs, for example), the company I work for concentrates on application-specific clusters. These are smaller, single-function clusters that are meant to run application configurations that would otherwise have required a large and expensive SMP system. The intent is to lower the complexity of building and managing the cluster, but still provide a more cost-effective solution for the application. I think this type of approach is generally applicable in any type of computing environment.

What are some of the common mistakes you've encountered in cluster building?

The single biggest problem I run into is what I call pile o'hardware syndrome. That's the notion that you just buy a whole bunch of cheap hardware, rack it up, and a cluster will magically appear out of a pile of pieces. It's still very common to underestimate the amount of work required to make physically separate resources work together as if they were one very large, manageable SMP system. A cluster is still a systems engineering problem that can turn nasty if you aren't careful. But, with advances in pre-racked, pre-cabled hardware from some of the hardware vendors and the cluster software toolkits like OSCAR and Rocks, I see cluster building getting easier all the time.

Why do you think that Linux is the best cluster operating system?

One simple answer is choice. There are commercial distributions, free distributions, white-box distributions and so forth. If you have a commercial software package like an Oracle database that's qualified against a particular commercial Linux distribution like SuSE or Red Hat, you can build a fully supported cluster configuration. If you want to do research or custom work, there are free distributions like Debian or Fedora. Because the source code is available, you can choose your starting point and degree of customization. This is the best of all possible worlds.

The Linux operating system is stable, manageable and flexible. You are free to configure Linux as you see fit instead of trying to chip away at a black-box operating system that fights you every step of the way. There's a wealth of free management and development tools available. Oh, did I mention that Linux runs on a wide range of commodity hardware, both 32- and 64-bit? What's not to like? Nothing else comes close in my estimation.

What do you see ahead for clustered architectures?

I definitely see Linux clusters moving into mainstream information technology environments. If you look back, the scientific community tends to drive computing technologies that are later adopted by the more conservative IT organizations as business solutions. One modest example I can think of might be the World Wide Web and the Mosaic browser. I firmly believe that clusters, specifically Linux clusters, are poised to repeat this type of adoption pattern. I think we are very close, if not past, the tipping point.

What would you say to someone who is thinking about building his first cluster?

Do it, but do it with your eyes open. Do your homework before starting. Give yourself time to learn. Try not to fall into the pile o'hardware trap. Start small and scale up. Investigate starting points like openMosix, Rocks and OSCAR first. If you don't have time for the learning curve, then have a replicable solution designed and implemented for you.

Conclusion

In addition to their usefulness in scientific and engineering computing environments, I believe that Linux clusters and clustering techniques will be an important addition to the standard information technology solutions in the corporate datacenter. The trick is going to be sharing the cluster-building knowledge that's available in universities and research institutions with the traditional information technology organization. Because of its stability, flexibility, open nature, manageability and availability on a wide range of commodity hardware, I believe that Linux is the correct choice for the creation of clustered solutions. I am really looking forward to the next few years. I believe it will be an exciting time for both Linux and clusters.

About Rob Lucke

Rob Lucke is currently chief solutions officer for Vista Solutions Corp. (http://www.VistaSolutions.Net), concentrating on technical and scientific computing. Rob's fields of expertise include Linux compute clusters, technical systems architecture, large-scale system administration techniques, network file systems, heterogeneous interoperability, software development, and application- and system-level performance tuning. Rob has over 30 years of experience in computing and software of all types, from real-time data acquisition to transaction processing. His first book, Designing and Implementing Computer Workgroups, was published in 1999. His second book, Building Clustered Linux Systems, was published in September 2004. Rob is Red Hat Certified Engineer #807200931604117.

More Stories By Ibrahim Haddad

Ibrahim Haddad is a member of the management team at The Linux Foundation responsible for technical, legal and compliance projects and initiatives. Prior to that, he ran the Open Source Office at Palm, the Open Source Technology Group at Motorola, and Global Telecommunications Initiatives at The Open Source Development Labs. Ibrahim started his career as a member of the research team at Ericsson Research focusing on advanced research for system architecture of 3G wireless IP networks and on the adoption of open source software in telecom. Ibrahim graduated from Concordia University (Montréal, Canada) with a Ph.D. in Computer Science. He is a Contributing Editor to the Linux Journal. Ibrahim is fluent in Arabic, English and French. He can be reached via http://www.IbrahimHaddad.com.


