Linux Lunch

Debunking the myths

Over the past few months I've had lunch with a lot of old friends who were techies 10 years ago, but are now IT managers at large financial institutions. Linux came up in the conversations with most of them, and I was surprised by the misconceptions they have about the whole open source environment. I ended up debunking the myths that seem to exist around Linux. This is how most of my conversations went....

You've got to be kidding! I wouldn't want to bet my entire business on Linux, would I?
Funny you should say that. I remember you saying the same thing 10 years ago, but then we were talking about Windows. That's when you were using Windows 3.1, and Windows NT 3.1 was just about to be released. And indeed, back then it was much more sensible to stick with the system you were running at the time. But a few years later you were one of the first companies to adopt Windows NT 3.51 as a corporate platform. Now, Linux is maturing rapidly, and the time has come to start looking at it seriously as the next corporate platform.

We had a look at Linux a few years ago, but felt it was really just a toy operating system for hobbyists.
Over the past few years tremendous progress has been made. Linux is a very stable operating system. You can get drivers for just about any piece of hardware, and there's an incredible number of applications available for it, on the server side as well as on the desktop.

Strictly speaking, Linux is only a kernel. With it you run the GNU utilities (the shells and shell commands), XFree86 (which provides the GUI layer), and desktop environments such as GNOME and KDE. On top of that come productivity tools such as OpenOffice.org, Evolution, and so on. Most of these are included in every distribution; other bits you may need to download separately.

With just the Linux kernel, you can't do much. But with all the projects going on at the moment, you can run a complete computing environment, on servers and on the desktop, based entirely on open source software. Within that you have a lot of choice as to which software you want to run. You're in no way tied to what one single company wants you to run.

I don't like the DIY approach. I don't want to involve my staff in compiling all the systems that I'm going to run on my computers.
You don't have to. You buy or download a "distribution," which contains all the binaries. To install this on a computer you go through a GUI-based installation procedure, selecting the options you want. It's no different from purchasing and installing a proprietary operating system.

But I should really compile a kernel for my specific hardware, right?
Wrong. It's been well over a year since I last compiled a kernel, and even then I found out later that the support I wanted was already in my distribution's kernel, so the exercise was unnecessary. At the moment all the systems in our office run off-the-shelf kernels. The Linux kernel supports loadable modules and device drivers, so the kernel that comes with your distribution will in most cases be adequate: it loads into memory only the portions it needs. If you have obscure hardware you may need to add a specific device driver to your system, but that's most likely also a loadable module; you don't need to recompile the kernel for it. On that front, too, there's no difference from proprietary software.

Most distributions come with more than one precompiled kernel, each optimized for a different processor family, for example i386, i586, or i686. The installation procedure automatically selects the appropriate kernel for the processor found in the machine. This is in contrast to proprietary systems, which ship in a single version compiled for the lowest common denominator processor.

But Linux is about freedom of choice: if you really want to squeeze the last bit of performance out of your machines, then you do have the option of compiling a tailor-made kernel for your particular hardware configuration.
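
All of this is visible from the command line. A minimal sketch, assuming a typical distribution setup (the module name is only an example, and the exact make targets vary between kernel versions):

    # Show which kernel the machine is running (an off-the-shelf build).
    uname -r

    # List the driver modules the kernel has loaded for this hardware.
    lsmod

    # Load an extra driver on demand, without rebuilding anything.
    # (e100 is an example network driver; yours depends on the hardware.)
    modprobe e100

    # Only if you want that tailor-made kernel: configure and build it.
    # Exact build targets differ between kernel versions.
    cd /usr/src/linux
    make menuconfig
    make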

What about modifying the kernel? I wouldn't want any of my staff slipping a dodgy home-grown modified kernel into my production systems.
Well, don't let them do that then. For your current systems you have strict change-control procedures. Upgrades and patches are applied only after extensive testing, and after sign-off by business users. You use the same procedures with Linux.

At the moment you have two types of software in your company: shrink-wrapped software developed by software companies, and in-house applications written by your own developers. For the first group you only have binaries, you pay a license fee, and you rely on the vendor for fixes and further development. For the second group you're on your own, because you maintain the source code yourself. You seem to be putting Linux in that second group. But having access to the source code doesn't put it in that category. It is software created by people outside your company, it comes as a binary distribution, and for most of it you can get support. The differences are that you don't pay a license fee and you do get free access to the source code.

There are tens of thousands of open source projects going on at the moment. I wouldn't expect the average Linux system administrator to make changes to the kernel itself. It's more likely that you'll want something changed in one of the other portions of the system. When we're talking about participating in open source projects, we're not just talking about the kernel, but about all the software that's available for the Linux platform.

So, if I run a binary distribution, what is the actual point in having the source code?
The point is that with open source software, you have free access to the source code. You don't actually need the source code to run the software. However, when there's an unexpected problem, it's very handy to have access to the source code. With proprietary systems, engineers often end up using trial and error to find a workaround for a problem. With Linux you can look inside to see what's going on, and find a workaround that way, or actually fix the problem.
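
For example, on a Debian-style system you can pull down the exact source behind any package, or search the kernel tree for the message you're seeing; the package name and error text below are only illustrative:

    # Fetch the source code for an installed package (Debian-style;
    # requires a deb-src entry in /etc/apt/sources.list. RPM-based
    # distributions ship equivalent source packages).
    apt-get source openssh-server

    # Or search the kernel source tree for the error message you saw:
    grep -r "transmit timed out" /usr/src/linux/drivers/net/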

If we make modifications or enhancements to that code, then we are on our own with an unsupported configuration, and we'll need to reapply those changes with every future release.
That's not how it works. If one of your developers makes a change to the software, to fix a bug or make something more efficient, then the idea is that you submit that change back to the project. The next official release will include that fix or enhancement. From then on, you use that new release, in binary form and with support.
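
The mechanics of submitting a change are straightforward; the usual currency is a unified diff. A minimal sketch, with made-up project and file names:

    # Keep a pristine copy of the release, then make your fix.
    cp -r project-1.0 project-1.0.orig
    vi project-1.0/src/parser.c        # fix the bug here

    # Generate a unified diff and send it to the project's mailing
    # list or bug tracker.
    diff -urN project-1.0.orig project-1.0 > parser-fix.patch

    # Anyone can apply the same fix to their copy of the release:
    cd project-1.0 && patch -p1 < ../parser-fix.patch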

Developers don't work for free. Are you suggesting that I spend money on development and then give the work to the rest of the world for free?
When you submit a change back to the project, it will be reviewed by others before being accepted. They may come up with further improvements, spot something your developer didn't get quite right and fix it, or suggest a better approach. Maybe you put in three days to make the change, but you may get another three days of other people's time checking and perfecting it. Later on, someone else may build on your enhancement to create even more functionality. You don't pay for any of that additional effort. Economically, the community principle works out very well.

And think about the license fees you pay for proprietary software.... Some of that money goes toward further development of the product you bought, but you have absolutely no say in that development. You can suggest to the company the fixes and features you'd like to see, but you have no say in what happens with the money you pay. Some of it will go into developing features that others want, but for which you have absolutely no need. With open source software you can contribute to the project by making fixes and doing some development in specific areas where you want it, instead of paying large sums to fund the overall development, and leaving someone else to decide where and how that money is applied. Open source puts you in control.

But what about other core software? For example, we use Oracle extensively. What are we going to do about that?
More and more software companies are porting their software to Linux, although that doesn't mean they are going open source. Oracle is one good example of a proprietary product that is available for Linux. Some purists don't like it because it's not open source, but in the end the choice is yours. If you want to stick with Oracle, you can do so, either on the platform that it runs on at the moment, or by running Oracle under Linux. If you want to go completely open source, then there are also open source database projects, such as PostgreSQL and MySQL. But nobody forces you to choose one or another. Open source is about choice. You make the choices.
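
To give a feel for how little ceremony those open source databases demand, here is a minimal PostgreSQL session; the database and table are invented for illustration, and this assumes PostgreSQL is installed and running and your user may create databases:

    # Create a database and run a quick query as a regular user.
    createdb trading
    psql -d trading -c "CREATE TABLE trades (id serial, symbol text, qty integer)"
    psql -d trading -c "INSERT INTO trades (symbol, qty) VALUES ('ACME', 100)"
    psql -d trading -c "SELECT * FROM trades"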

Okay, I'm beginning to get the picture. But if we put Linux on our servers, do we then still need Windows on our desktop PCs? Do we then use Terminal Services or something similar?
No. Linux is a general-purpose operating system, not just a server operating system. In the commercial world it's currently most successful in the server arena, but it runs just as happily on the desktop. There are masses of end-user applications available for Linux, which you can run directly on a desktop (or laptop) computer. The files for those applications can live on the user's machine or on a shared directory on a server. But you also have the option of running applications on a server and displaying their GUI on the workstations. Terminal Services on Windows is a fairly recent development, but X11 (the graphical subsystem on Linux) has had remote application display built in since the 1980s, and is far more advanced.
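
For example, running an application on a server but using it from your desk takes a single command with ssh's X11 forwarding; the host name and application below are placeholders, and the server needs X11Forwarding enabled in its sshd configuration:

    # Start a program on the server; its windows appear on your desktop,
    # tunnelled over an encrypted ssh connection.
    ssh -X appserver.example.com oowriter

    # The traditional, unencrypted way still works on a trusted LAN:
    #   on your desktop:  xhost +appserver.example.com
    #   on the server:    DISPLAY=desktop.example.com:0 oowriter &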

So, you reckon I can switch to Linux right now?
I would not advise you to go back to the office today and send your engineers around the company installing Linux on every computer they can find. That would be suicide. Upgrading to any new platform is a major project that requires planning and testing, whether it's an upgrade to a major new version of the same environment or a move to a completely different platform. Depending on the size of your company and the scope of the project, you are talking about 6, 12, or 18 months. Even getting business approval to run such a project can take significant time.

But that still leaves the question: Why should we switch to Linux?
There are many reasons for switching to Linux: lower costs, improved reliability and stability, more flexibility, greater choice, and being in control. Together, these factors make open source systems a better foundation than systems based on proprietary software.

For a large enterprise to make the switch to open source software, these benefits will need to be quantified. To say how it will benefit your organization, you need to start looking at it now. Then, when the next round of upgrades is due, you'll be able to make the business case for Linux as the next corporate platform.

Now's the time to start looking at the potential of an open source–based corporate computing environment.

Urban Legends of Linux

  • Linux is for hobbyists: Linux is a highly stable operating system suitable for corporate environments. Some of the Web's best-known sites rely entirely on Linux, for example, Google and Amazon.
  • Hardware support is limited: Most hardware providers have woken up to the importance of Linux and provide drivers or assist developers in writing drivers for their hardware.
  • Application availability is limited: There are tens of thousands of applications available for Linux. Most of them are open source, but more and more closed-source applications are being ported to Linux as well.
  • Systems are built entirely from source code: Just about all open source software is available in binary form. Some projects deliver source code only, but the various distributions then deliver binary packages of those applications.
  • Linux is a server OS: Within corporations Linux is most successful in the server arena, but as a general-purpose operating system it is equally suitable for the desktop.
  • The claims about lower costs are fake: There are far too many variables involved for anyone to make a generic statement about costs. The only way to find out how much your organization can save is by looking at how using Linux may impact that organization. For us, the enormous reduction in the amount of time spent managing our systems was well worth the effort of migrating to Linux.
About the Author

Herman Verkade is a UK-based independent consultant who specializes in the management of large-scale heterogeneous environments. Over the past 22 years he has worked mostly with financial institutions in the UK, the U.S., and continental Europe.
