
Linux Lunch

Debunking the myths

Over the past few months I've had lunch with a lot of old friends who were techies 10 years ago, but are now IT managers at large financial institutions. Linux came up in the conversations with most of them, and I was surprised by the misconceptions they have about the whole open source environment. I ended up debunking the myths that seem to exist around Linux. This is how most of my conversations went....

You've got to be kidding! I wouldn't want to bet my entire business on Linux, would I?
Funny you should say that. I remember you saying the same thing 10 years ago, but then we were talking about Windows. That's when you were using Windows 3.1, and Windows NT 3.1 was just about to be released. And indeed, back then it was much more sensible to stick with the system you were running at the time. But a few years later you were one of the first companies to adopt Windows NT 3.51 as a corporate platform. Now, Linux is maturing rapidly, and the time has come to start looking at it seriously as the next corporate platform.

We had a look at Linux a few years ago, but felt it was really just a toy operating system for hobbyists.
Over the past few years tremendous progress has been made. Linux is a very stable operating system. You can get drivers for just about any piece of hardware, and there's an incredible number of applications available for it, on the server side as well as on the desktop.

If we're really precise, Linux is really only a kernel. With it you run GNU utilities (which are the shells and the shell commands), XFree86 (which is what gives you GUI capabilities), and desktop environments such as GNOME and KDE. On top of that you get productivity tools such as OpenOffice.org, Evolution, and so on. Most of these are included in all the distributions. Other bits you may need to download separately.

With just the Linux kernel, you can't do much. But with all the projects going on at the moment, you can run a complete computing environment, on servers and on the desktop, based entirely on open source software. Within that you have a lot of choice as to which software you want to run. You're in no way tied to what one single company wants you to run.

I don't like the DIY approach. I don't want to involve my staff in compiling all the systems that I'm going to run on my computers.
You don't have to. You buy or download a "distribution," which contains all the binaries. To install this on a computer you go through a GUI-based installation procedure, selecting the options you want. It's no different from purchasing and installing a proprietary operating system.

But I should really compile a kernel for my specific hardware, right?
Wrong. It's been well over a year since I last compiled a kernel. And even with that one I found later that the support I wanted was already in the kernel of the distribution I was using, so I didn't have to do it. At the moment all the systems in our office run with off-the-shelf kernels. The Linux kernel supports loadable modules and device drivers, so the kernel that comes with your distribution will in most cases be adequate, as it will load into memory only the portions it needs. If you have obscure hardware you may need to add a specific device driver to your system, but that's most likely also a loadable module. You don't need to recompile a new kernel for that. There's no difference from proprietary software on that front either.

Most distributions come with more than one precompiled kernel, each one optimized for a different processor, for example i386, i586, or i686. The installation procedure automatically selects the appropriate kernel for the processor found in the machine. This is in contrast to proprietary systems, which come in a single version, compiled for the lowest common denominator among the processors they support.

But Linux is about freedom of choice: if you really want to squeeze the last bit of performance out of your machines, then you do have the option of compiling a tailor-made kernel for your particular hardware configuration.
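The module mechanism described above is easy to verify on any modern distribution. A minimal sketch, assuming a standard Linux shell (the module list in your output will of course differ from machine to machine):

```shell
# Release string of the kernel your distribution installed --
# on a stock system this is the distributor's precompiled kernel,
# not one you built yourself.
uname -r

# Modules the kernel has loaded on demand for your hardware
# (name, size, use count); skipped gracefully if /proc/modules
# is not available in this environment.
[ -r /proc/modules ] && head -5 /proc/modules || true
```

Adding support for a new device usually means loading one more module (for example with `modprobe`), not rebuilding the kernel.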

What about modifying the kernel? I wouldn't want any of my staff slipping a dodgy home-grown modified kernel into my production systems.
Well, don't let them do that then. For your current systems you have strict change-control procedures. Upgrades and patches are applied only after extensive testing, and after sign-off by business users. You use the same procedures with Linux.

At the moment you have two types of software in your company: shrink-wrapped software that was developed by software companies, and in-house applications that were developed by your own developers. For the first group you only have binaries, pay a license fee, and rely on the vendor for fixes and further development. For the second group you're on your own because you need to maintain the source code yourself. You seem to be looking at Linux as being part of that second group. But your having access to the source code doesn't put it in that category. It is software created by people outside your company, for which you have a binary distribution, and for most of it you can get support. But you don't pay a license fee, and you do get free access to the source code.

There are tens of thousands of open source projects going on at the moment. I wouldn't expect the average Linux system administrator to make changes to the kernel itself. It's more likely that you'll want something changed in one of the other portions of the system. When we're talking about participating in open source projects, we're not just talking about the kernel, but about all the software that's available for the Linux platform.

So, if I run a binary distribution, what is the actual point in having the source code?
The point is that with open source software, you have free access to the source code. You don't actually need the source code to run the software. However, when there's an unexpected problem, it's very handy to have access to the source code. With proprietary systems, engineers often end up using trial and error to find a workaround for a problem. With Linux you can look inside to see what's going on, and find a workaround that way, or actually fix the problem.

If we make modifications or enhancements to that code, then we are on our own with an unsupported configuration, and we'll need to reapply those changes with every future release.
That's not how it works. If one of your developers makes a change to the software, to fix a bug or make something more efficient, then the idea is that you submit that change back to the project. The next official release will include that fix or enhancement. From then on, you use that new release, in binary form and with support.
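In practice, "submitting a change back" usually means sending the maintainers a unified diff rather than a whole modified source tree. A hypothetical sketch (the file names and the one-line fix are made up for illustration):

```shell
# Two versions of a source file: the project's original and our fixed copy.
printf 'int max(int a, int b) { return a > b ? a : b; }\n' > orig.c
printf 'int max(int a, int b) { return a >= b ? a : b; }\n' > fixed.c

# A unified diff is the conventional form for sending a change upstream.
# diff exits with status 1 when the files differ, so we absorb that here.
diff -u orig.c fixed.c || true

# Clean up the scratch files.
rm orig.c fixed.c
```

The maintainers can review, refine, and merge a diff like this far more easily than a full copy of your tree, which is part of why the review cycle described above works.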

Developers don't work for free. Are you suggesting that I spend money on development and then give the work to the rest of the world for free?
When you submit a change back to the project, it will be reviewed by others before being accepted. They may come up with some further improvements, or see that your developer did not get things completely right and fix that, or suggest how to fix it, or improve your developer's code in a different way. Maybe you put in three days to make the change, but you may get another three days of others' time to check and perfect that change. Later on, yet another person may build upon your enhancement to create even more functionality. You don't pay for that additional effort. The community principle works out very well economically.

And think about the license fees you pay for proprietary software.... Some of that money goes toward further development of the product you bought, but you have absolutely no say in that development. You can suggest to the company the fixes and features you'd like to see, but you have no say in what happens with the money you pay. Some of it will go into developing features that others want, but for which you have absolutely no need. With open source software you can contribute to the project by making fixes and doing some development in specific areas where you want it, instead of paying large sums to fund the overall development, and leaving someone else to decide where and how that money is applied. Open source puts you in control.

But what about other core software? For example, we use Oracle extensively. What are we going to do about that?
More and more software companies are porting their software to Linux, although that doesn't mean they are going open source. Oracle is one good example of a proprietary product that is available for Linux. Some purists don't like it because it's not open source, but in the end the choice is yours. If you want to stick with Oracle, you can do so, either on the platform that it runs on at the moment, or by running Oracle under Linux. If you want to go completely open source, then there are also open source database projects, such as PostgreSQL and MySQL. But nobody forces you to choose one or another. Open source is about choice. You make the choices.

Okay, I'm beginning to get the picture. But if we put Linux on our servers, do we then still need Windows on our desktop PCs? Do we then use Terminal Services or something similar?
No. Linux is a general purpose operating system, not just a server operating system. In the commercial world it's currently most successful in the server arena, but it runs just as happily on the desktop. There are masses of end-user applications available for Linux, which you can run directly under Linux on a desktop (or laptop) computer. The files for those applications can be on the user's computer or on a shared directory on a server. But you also have the option of running applications on a server and displaying their GUI back on the workstations. Terminal Services on Windows are a fairly recent development, but X11 (which is the graphical subsystem on Linux) has had remote application display for decades, and is far more advanced.
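The remote display mentioned above comes from X11's client-server design: the application is an X client, and the screen it draws on is an X server, which need not be on the same machine. A sketch, assuming OpenSSH with X11 forwarding enabled (the user, host, and application names are placeholders, so the remote command is shown in a comment rather than run):

```shell
# X clients find their display through the DISPLAY variable;
# "unset" here simply means no X server is attached to this session.
echo "DISPLAY=${DISPLAY:-unset}"

# With X11 forwarding, ssh sets DISPLAY on the far end so a remote
# application draws its windows on your local screen, e.g.:
#   ssh -X alice@appserver ooffice
```

No extra licensing or add-on product is involved; this has been part of the platform since long before Terminal Services existed.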

So, you reckon I can switch to Linux right now?
I would not advise you to go back to the office today and send your engineers around the company installing Linux on all the computers they can find. That would be suicide. Upgrading to any new platform is a major project that requires testing and planning – whether it's an upgrade to a major new version of the same environment, or to a completely different platform. Depending on the size of your company and the scope of such a project, you are talking about 6, 12, or 18 months. Even getting business approval to run such a project can take significant time.

But that still leaves the question: Why should we switch to Linux?
There are many reasons for switching to Linux: lower costs, improved reliability and stability, more flexibility, greater choice, and being in control. Together, these factors can make open source systems better than systems based on proprietary software.

For a large enterprise to make the switch to open source software, these benefits will need to be quantified. To be able to say how it will benefit your organization, you will need to start looking at it. Then, when the next round of upgrades is due, you'll be able to make the business case for Linux as the next corporate platform.

Now's the time to start looking at the potential of an open source–based corporate computing environment.

Urban Legends of Linux

  • Linux is for hobbyists: Linux is a highly stable operating system suitable for corporate environments. Some of the Web's best-known sites rely entirely on Linux, for example, Google and Amazon.
  • Hardware support is limited: Most hardware providers have woken up to the importance of Linux and provide drivers or assist developers in writing drivers for their hardware.
  • Application availability is limited: There are tens of thousands of applications available for Linux. Most of them are open source, but more and more closed source applications are ported to Linux as well.
  • Systems are built entirely from source code: Just about all open source software is available in binary form. Some projects deliver source code only, but the various distributions then deliver binary packages of those applications.
  • Linux is a server OS: Within corporations Linux is most successful in the server arena, but as a general-purpose operating system it is equally suitable for the desktop.
  • The claims about lower costs are fake: There are far too many variables involved for anyone to make a generic statement about costs. The only way to find out how much your organization can save is by looking at how using Linux may impact that organization. For us, the enormous reduction in the amount of time spent managing our systems was well worth the effort of migrating to Linux.
More Stories By Herman Verkade

    Herman Verkade is a UK-based, independent consultant who specializes in the management of large-scale heterogeneous environments. Over the past 22 years he has worked mostly with financial institutions in the UK, the U.S., and continental Europe.

