
Preparing for the Revolution

Dual-core technology for HPC Clusters

MPI (Message Passing Interface)
In the high-performance computing market, parallelism is often expressed using the MPI programming interface. In contrast to threaded approaches, MPI uses messages to copy memory from one process space (program) to another. This approach is very effective when the processors don't share local memory (e.g., when they're located on another motherboard in the HPC cluster). It can, however, be used for multi-core programming as well, and many programs that have already been ported to MPI can take advantage of multiple cores without any re-programming.

In our cash register analogy, the "MPI cashier" would call other stores on the phone and tick off the items in the shopping cart that they have to tabulate. The advantage here is that the size (scale) of the order can get very big and exceed the capacity of the cash registers of any one store (computer). MPI is available as a library for most languages (C/C++ and Fortran) in both commercial and Open Source packages. For Linux HPC clusters, hybrid methods that employ both OpenMP (intra-motherboard communication) and MPI (inter-motherboard communication) are now being studied.
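Real MPI programs use a library such as MPICH or Open MPI and a dedicated launcher, but the core idea, copying data between separate process address spaces with explicit send and receive messages, can be sketched with nothing more than Python's standard multiprocessing module. The two-process layout and names below are illustrative and are not part of any real MPI binding.

```python
# Sketch of message passing between processes that share no memory,
# in the spirit of MPI_Send/MPI_Recv. Standard library only; the
# "rank" roles below are an analogy, not a real MPI API.
from multiprocessing import Process, Pipe

def worker(conn):
    # "Rank 1": receive part of the order, tabulate it, send back a total.
    items = conn.recv()               # the message arrives as a copy, not shared memory
    conn.send(sum(items))
    conn.close()

def main():
    parent_conn, child_conn = Pipe()
    p = Process(target=worker, args=(child_conn,))
    p.start()
    # "Rank 0": phone the other store with part of the shopping cart.
    parent_conn.send([1.50, 2.25, 3.00])
    total = parent_conn.recv()        # blocking receive, as with MPI_Recv
    p.join()
    return total

if __name__ == "__main__":
    print(main())
```

Note that the worker never touches the sender's variables directly; all data crosses the process boundary as a message, which is exactly why this style scales past a single motherboard.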

Debugging Methods
There's an old joke that goes, "Every program works just fine; getting it to work the way you want it to is the trick." Parallel programs, like their sequential cousins, are no exception. In fact, parallel programs are a much harder proposition because, unlike serial programs, they introduce synchronization and data sharing. These properties can make it hard to fully understand program behavior in a real-world environment where program execution may not be easily replicated.
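A concrete example of the data-sharing hazard is two threads updating one shared counter. The read-modify-write sequence is not atomic, so unprotected updates can be lost on some runs and not others, which is precisely the hard-to-replicate behavior described above. This sketch uses Python's standard threading module to show the lock-protected version; the names are illustrative.

```python
# Two threads incrementing a shared counter. Without the lock, the
# read-modify-write sequence can interleave and silently drop updates,
# and whether it does can vary from run to run. Holding the lock
# serializes the update and makes the result deterministic.
import threading

COUNT = 100_000
total = 0
lock = threading.Lock()

def add():
    global total
    for _ in range(COUNT):
        with lock:          # serialize the read-modify-write
            total += 1

threads = [threading.Thread(target=add) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(total)                # 200000 every run with the lock held
```

Remove the `with lock:` line and the bug may or may not appear in any given run, which is why a debugger that can inspect threads in real time matters so much.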

The choice of debugger can be critical to the success of any multi-core programming project. Without the ability to see what the program is doing in real-time, multi-core applications can be impossible to develop in any meaningful timeframe.

The multi-core revolution forces software developers to evaluate their codes in terms of multi-core performance both now and as the number of cores increases in the future. In particular, many Linux HPC clusters can easily see a doubling of CPU cores simply by upgrading the processor. The following recommendations are designed to aid in the transition to multi-core architectures:

  • Assess the level of concurrency in your application: Concurrency isn't present (or necessary) in all applications. Some applications, by virtue of their algorithms, can only be executed in a serial fashion. Examining the algorithm is the only effective way to assess the presence of concurrency in your application. Careful attention to memory access patterns by various parts of the program is important since dependencies will inhibit the ability to operate concurrently.
  • Assess, if possible, whether concurrency will improve performance: If you can identify concurrent parts of your application, look carefully to see if executing them in parallel will result in application speed-up (reduced execution time). There are often parts of your program that could be executed concurrently, but have no effect on execution time. There's no point in spending time on these sections of code. Typically you should focus your attention on the computationally heavy parts of your application.
  • Assess the scalability of your applications: Another important question to ask is how scalable your application is. As more processors are added, parallel execution will always hit the point of diminishing returns. This means that creating more threads won't improve performance (it may actually hurt performance). If you find that your application can be scaled to a large number of processors, then using MPI may be a good choice. If, however, you don't envision using more than four processors, then pthreads or OpenMP are probably your best choices.
  • Make sure there's an adequate tool chain for your application: This last recommendation is critical to your application's success. There are many ways of using multiple processors, but not all of them have tool chains that provide the professional support and capabilities needed to produce software in a reasonable period of time and at a reasonable cost. Research projects should be avoided, and the use of mainstream compilers, debuggers, and profilers should be a high priority.
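The diminishing returns mentioned in the scalability recommendation can be estimated before any code is written using Amdahl's law: if a fraction p of the run time can be parallelized, the best possible speed-up on n processors is 1 / ((1 - p) + p / n). A quick sketch:

```python
# Amdahl's law: upper bound on speed-up when only a fraction p of the
# work can run in parallel across n processors.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the program parallelized, speed-up saturates
# well short of the processor count as n grows.
for n in (2, 4, 16, 64):
    print(n, round(amdahl_speedup(0.95, n), 2))
```

A curve that flattens at a handful of processors is a practical signal that pthreads or OpenMP on one motherboard is enough; a curve that keeps climbing argues for the MPI route.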
Taking advantage of multi-core technologies is the next step in software development and HPC cluster performance. Once a plan and infrastructure are in place the benefits of dual-core processors can be realized by customers and users alike. A commitment to parallel computing will provide solid and sustained performance growth into the future.

These sources can be consulted for general information about multi-core hardware and software.


MPI
  • Using MPI - 2nd Edition: Portable Parallel Programming with the Message Passing Interface, William Gropp, Ewing Lusk, Anthony Skjellum, ISBN: 0262571323
  • MPI Web site: www-unix.mcs.anl.gov/mpi/

More Stories By Douglas Eadline

Dr. Douglas Eadline has over 25 years of experience in high-performance computing. You can contact him through Basement Supercomputing (http://basement-supercomputing.com).

