
Linux Lunch

Debunking the myths

Over the past few months I've had lunch with a lot of old friends who were techies 10 years ago, but are now IT managers at large financial institutions. Linux came up in the conversations with most of them, and I was surprised by the misconceptions they have about the whole open source environment. I ended up debunking the myths that seem to exist around Linux. This is how most of my conversations went....

You've got to be kidding! I wouldn't want to bet my entire business on Linux, would I?
Funny you should say that. I remember you saying the same thing 10 years ago, but then we were talking about Windows. That's when you were using Windows 3.1, and Windows NT 3.1 was just about to be released. And indeed, back then it was much more sensible to stick with the system you were running at the time. But a few years later you were one of the first companies to adopt Windows NT 3.51 as a corporate platform. Now, Linux is maturing rapidly, and the time has come to start looking at it seriously as the next corporate platform.

We had a look at Linux a few years ago, but felt it was really just a toy operating system for hobbyists.
Over the past few years tremendous progress has been made. Linux is a very stable operating system. You can get drivers for just about any piece of hardware, and there's an incredible number of applications available for it, on the server side as well as on the desktop.

Strictly speaking, Linux is only a kernel. On top of it you run the GNU utilities (the shells and command-line tools), XFree86 (the X Window System implementation that provides the GUI), and desktop environments such as GNOME and KDE. Add to that productivity tools such as OpenOffice.org, Evolution, and so on. Most of these are included in every distribution; other pieces you may need to download separately.

With just the Linux kernel, you can't do much. But with all the projects going on at the moment, you can run a complete computing environment, on servers and on the desktop, based entirely on open source software. Within that you have a lot of choice as to which software you want to run. You're in no way tied to what one single company wants you to run.

I don't like the DIY approach. I don't want to involve my staff in compiling all the systems that I'm going to run on my computers.
You don't have to. You buy or download a "distribution," which contains all the binaries. To install this on a computer you go through a GUI-based installation procedure, selecting the options you want. It's no different from purchasing and installing a proprietary operating system.
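If you do download a distribution yourself, the one step worth doing before installation is an integrity check of the image. A minimal sketch using GNU coreutils; the file names here are hypothetical, but real distributions publish a checksum file alongside each image:

```shell
# Verify a downloaded distribution image against its published checksum.
# "distro.iso" and "SHA256SUMS" are made-up names for illustration.
sha256sum distro.iso          # print the image's checksum
sha256sum -c SHA256SUMS       # compare against the published checksum list
```

If the check prints "distro.iso: OK", the download is intact and you can proceed with the GUI installer.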

But I should really compile a kernel for my specific hardware, right?
Wrong. It's been well over a year since I last compiled a kernel. And even with that one I found later that the support I wanted was already in the kernel of the distribution I was using, so I didn't have to do it. At the moment all the systems in our office run with off-the-shelf kernels. The Linux kernel supports loadable modules and device drivers, so the kernel that comes with your distribution will in most cases be adequate, as it will load into memory only the portions it needs. If you have obscure hardware you may need to add a specific device driver to your system, but that's most likely also a loadable module. You don't need to recompile a new kernel for that. There's no difference from proprietary software on that front either.

Most distributions come with more than one precompiled kernel, each one optimized for a different processor, for example i386, i586, or i686. The installation procedure automatically selects the appropriate kernel for the processor found in the machine. This is in contrast to proprietary systems, which come in one version, compiled to run on the lowest processor they support.

But Linux is about freedom of choice: if you really want to squeeze the last bit of performance out of your machines, then you do have the option of compiling a tailor-made kernel for your particular hardware configuration.

What about modifying the kernel? I wouldn't want any of my staff slipping a dodgy home-grown modified kernel into my production systems.
Well, don't let them do that then. For your current systems you have strict change-control procedures. Upgrades and patches are applied only after extensive testing, and after sign-off by business users. You use the same procedures with Linux.

At the moment you have two types of software in your company: shrink-wrapped software that was developed by software companies, and in-house applications that were developed by your own developers. For the first group you only have binaries, pay a license fee, and rely on the vendor for fixes and further development. For the second group you're on your own because you need to maintain the source code yourself. You seem to be looking at Linux as being part of that second group. But your having access to the source code doesn't put it in that category. It is software created by people outside your company, for which you have a binary distribution, and for most of it you can get support. But you don't pay a license fee, and you do get free access to the source code.

There are tens of thousands of open source projects going on at the moment. I wouldn't expect the average Linux system administrator to make changes to the kernel itself. It's more likely that you'll want something changed in one of the other portions of the system. When we're talking about participating in open source projects, we're not just talking about the kernel, but about all the software that's available for the Linux platform.

So, if I run a binary distribution, what is the actual point in having the source code?
The point is that with open source software, you have free access to the source code. You don't actually need the source code to run the software. However, when there's an unexpected problem, it's very handy to have access to the source code. With proprietary systems, engineers often end up using trial and error to find a workaround for a problem. With Linux you can look inside to see what's going on, and find a workaround that way, or actually fix the problem.

If we make modifications or enhancements to that code, then we are on our own with an unsupported configuration, and we'll need to reapply those changes with every future release.
That's not how it works. If one of your developers makes a change to the software, to fix a bug or make something more efficient, then the idea is that you submit that change back to the project. The next official release will include that fix or enhancement. From then on, you use that new release, in binary form and with support.
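The mechanics of carrying a local fix and submitting it upstream are simple. A sketch of the traditional workflow using diff and patch; the directory layout and the file name config.py are made up for illustration:

```shell
# Work in a scratch directory; keep a pristine copy of the upstream
# source next to your modified tree.
cd "$(mktemp -d)"
mkdir -p upstream modified
echo 'greeting = "helo"' >  upstream/config.py    # upstream bug: a typo
echo 'greeting = "hello"' > modified/config.py    # your local fix

# Capture the fix as a unified diff -- this is what you send to the
# project. (diff exits nonzero when the files differ, hence "|| true".)
diff -u upstream/config.py modified/config.py > fix-typo.patch || true

# Anyone, including the upstream maintainers, can now apply it:
patch upstream/config.py < fix-typo.patch
```

Once the project accepts the patch, the next official release carries your fix, and you go back to running stock binaries.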

Developers don't work for free. Are you suggesting that I spend money on development and then give the work to the rest of the world for free?
When you submit a change back to the project, it will be reviewed by others before being accepted. They may suggest further improvements, spot and fix something your developer didn't get quite right, or rework the code in a better way. Maybe you put in three days to make the change, but you may get another three days of others' time checking and perfecting it. Later on, yet another person may build upon your enhancement to create even more functionality. You don't pay for that additional effort. The community principle works out very well economically.

And think about the license fees you pay for proprietary software.... Some of that money goes toward further development of the product you bought, but you have absolutely no say in that development. You can suggest to the company the fixes and features you'd like to see, but you have no say in what happens with the money you pay. Some of it will go into developing features that others want, but for which you have absolutely no need. With open source software you can contribute to the project by making fixes and doing some development in specific areas where you want it, instead of paying large sums to fund the overall development, and leaving someone else to decide where and how that money is applied. Open source puts you in control.

But what about other core software? For example, we use Oracle extensively. What are we going to do about that?
More and more software companies are porting their software to Linux, although that doesn't mean they are going open source. Oracle is one good example of a proprietary product that is available for Linux. Some purists don't like it because it's not open source, but in the end the choice is yours. If you want to stick with Oracle, you can do so, either on the platform that it runs on at the moment, or by running Oracle under Linux. If you want to go completely open source, then there are also open source database projects, such as PostgreSQL and MySQL. But nobody forces you to choose one or another. Open source is about choice. You make the choices.

Okay, I'm beginning to get the picture. But if we put Linux on our servers, do we then still need Windows on our desktop PCs? Do we then use Terminal Services or something similar?
No. Linux is a general-purpose operating system, not just a server operating system. In the commercial world it's currently most successful in the server arena, but it runs just as happily on the desktop. There are masses of end-user applications available for Linux, which you can run directly under Linux on a desktop (or laptop) computer. The files for those applications can be on the user's computer or on a shared directory on a server. But you also have the option of running applications on a server and displaying their GUI back on the workstations. Terminal Services on Windows are a fairly recent development, but X11 (the graphical subsystem on Linux) has supported remote application display since the 1980s, and is far more advanced.
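Remote application display needs no extra product on Linux; the common way to use it is X11 forwarding tunneled through SSH. A sketch assuming OpenSSH, with a hypothetical host name:

```
# /etc/ssh/sshd_config on the application server -- allow X forwarding:
X11Forwarding yes

# On the user's workstation, start a remote application and have its
# window appear on the local display ("appserver" is a made-up host):
#   ssh -X user@appserver oowriter
```

The application runs on the server; only its display traffic crosses the network.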

So, you reckon I can switch to Linux right now?
I would not advise you to go back to the office today and send your engineers around the company installing Linux on all the computers they can find. That would be suicide. Upgrading to any new platform is a major project that requires testing and planning – whether it's an upgrade to a major new version of the same environment, or to a completely different platform. Depending on the size of your company and the scope of such a project, you are talking about 6, 12, or 18 months. Even getting business approval to run such a project can take significant time.

But that still leaves the question: Why should we switch to Linux?
There are many different reasons for switching to Linux: lower costs, improved reliability and stability, more flexibility, choice, being in control. There are many factors that make open source systems better than systems based on proprietary software.

For a large enterprise to make the switch to open source software, these benefits will need to be quantified. To be able to say how it will benefit your organization, you will need to start looking at it. Then, when the next round of upgrades is due, you'll be able to make the business case for Linux as the next corporate platform.

Now's the time to start looking at the potential of an open source–based corporate computing environment.

Urban Legends of Linux

  • Linux is for hobbyists: Linux is a highly stable operating system suitable for corporate environments. Some of the Web's best-known sites rely entirely on Linux, for example, Google and Amazon.
  • Hardware support is limited: Most hardware providers have woken up to the importance of Linux and provide drivers or assist developers in writing drivers for their hardware.
  • Application availability is limited: There are tens of thousands of applications available for Linux. Most of them are open source, but more and more closed source applications are ported to Linux as well.
  • Systems are built entirely from source code: Just about all open source software is available in binary form. Some projects deliver source code only, but the various distributions then deliver binary packages of those applications.
  • Linux is a server OS: Within corporations Linux is most successful in the server arena, but as a general-purpose operating system it is equally suitable for the desktop.
  • The claims about lower costs are fake: There are far too many variables involved for anyone to make a generic statement about costs. The only way to find out how much your organization can save is by looking at how using Linux may impact that organization. For us, the enormous reduction in the amount of time spent managing our systems was well worth the effort of migrating to Linux.
About the Author

    Herman Verkade is a UK-based, independent consultant who specializes in the management of large-scale heterogeneous environments. Over the past 22 years he has worked mostly with financial institutions in the UK, the U.S., and continental Europe.

Comments
    Nyasha Mukura 12/10/03 07:20:09 AM EST

An excellent article. I love Linux - it's about time we put a stop to the insatiable M$ extortionist pricing model. I'm from a third-world country with an ever-depreciating currency, where the new forced licensing/pricing model from M$ merely strangles businesses with no corresponding value added to the bottom line. What we need is a site to discuss methodologies, approaches, and real-life experiences in moving from MS Windows to Linux. Viva Linux. Abasha MS Windows. Thank you, Linus.

    Kalevi Nyman 12/07/03 06:13:59 PM EST

An Evening at the Local Pub

Excellent article. Here in Sweden, too, the argument follows the same zero-knowledge outline. I just explained to a knows-nothing-but-Windows admin what a windowing system is and some of the workings of X. He was obviously amazed by the capabilities of X and had a hard time believing it all.

He also had a hard time believing how much is included in a standard distribution, say SuSE, RedHat, Debian, etc., all for peanuts, compared to anything else. I went through practically the same argument as your article. That discussion increased the local pub's beer sales significantly! :-)

    Kalevi Nyman

P.S. Please write some more of these! I guess good arguments for Linux deployment are not so hard to find these days :-)
