Making Sure Migration Is an Option

What pitfalls await your open source move

The base of Linux users continues to grow in the data center, on the desktop, and even in embedded devices. Industry analysts report that Linux server shipments have shown double-digit growth every quarter for more than two years. In many cases these servers are being used for expansion or new projects; inevitably, they will also be put into service to replace systems that once ran Unix or Windows.

In these cases there is usually an event that drives the migration: a hardware upgrade due to obsolescence or capacity concerns, a software maintenance renewal, or some similar occasion. Rarely do we see a wholesale replacement of all legacy systems; migration to Linux is usually done piecemeal, with one part of the infrastructure moving instead of renewing the investment in the incumbent systems. It also often means adopting open standards, which are a prime consideration for the open source community. Open source without open standards offers far less advantage than a fully open system.

Linux migration is usually a matter of expansion, adding Linux to an increasingly diverse environment. Desktop PCs may remain predominantly Windows, while file servers and application servers once hosted on Windows or even Novell NetWare may soon be hosted on Linux. In that case there are two hurdles to overcome, especially if the systems need to communicate with one another.

The first is the obvious hurdle of moving from one system to another. This is a short-term problem: usually very disruptive and requiring a considerable amount of planning and staging, but not especially unique, since you face many of the same problems moving from one version of Windows to another or from one brand of Unix to another.

The second problem is longer term and involves interoperability with existing systems. To lessen the burden here, you should be planning well before a migration. The consideration I believe is most important is whether your systems lock you in, making it difficult to change vendors should you become unhappy with one and want to investigate others. This applies to your data and your network services. For example, could data stored in DB/2, Sybase, or Oracle be stored in MySQL, or vice versa? Does one system have features you can't live without? Could documents originally authored in Microsoft Office be read in OpenOffice.org? If a new version of Windows becomes available, will it still let you access your Samba file system hosted on Linux? These are questions you should be asking regardless of platform. Running Apache on a FreeBSD server is a fine choice, but if you decide that Red Hat offers better value, can you move from one platform to the other? The answer is likely yes. Does the same hold true for Web applications developed on Microsoft's Internet Information Server?
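The database questions above have a practical test: if your data can be exported to a neutral format such as CSV, moving between DB/2, Oracle, MySQL, or any other engine is far less painful. Here is a minimal sketch in Python, using the built-in sqlite3 module as a stand-in database; the table and its contents are hypothetical, and in a real shop you would connect through the vendor's own driver instead:

```python
import csv
import io
import sqlite3

# Stand-in database; in practice this would be a connection to
# Oracle, DB/2, MySQL, etc. through that vendor's driver.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "Ada", "Tampa"), (2, "Grace", "Raleigh")],
)

# Export every row to CSV -- a format any database (or spreadsheet)
# on any platform can import, which is the whole point of the exercise.
cursor = conn.execute("SELECT * FROM customers")
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow([col[0] for col in cursor.description])  # header row
for row in cursor:
    writer.writerow(row)

print(buf.getvalue())
```

If that export step is easy, you are in good shape; if it requires a proprietary tool or produces a format only the original vendor can read, you have just found your lock-in.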

My advice is to adhere to open standards and portable file formats that can be migrated more easily later on. Even if you decide to stay with one vendor, it's much better for you to select which solutions to use than for your vendor to make that choice for you. Start by looking at what's already going on in your enterprise. On the desktop you are likely using Microsoft Office. Its next-generation file formats are XML-based and intuitively should allow for easier collaboration between Microsoft and OpenOffice.org users. However, watch closely to make sure they really do facilitate the sharing of files and that the promise is not just the result of a clever PR campaign.

Another thing to be wary of is a potential feature in Microsoft's yet-to-be-released Vista operating system: encrypted file systems. Since the product has not shipped, it's hard to know how this will affect cross-platform enterprises. My understanding is that this feature would encrypt the data on the hard drive so that, if your laptop were stolen, it could not be booted under another operating system and the data would not be accessible. At first glance this sounds like a valuable feature. The question, however, is whether it would prevent you from legitimately accessing that data from another operating system (as I do on my dual-boot Windows/Linux laptop). In fact, I often help Windows users migrate their data from an out-of-service PC using a bootable Linux CD and a network connection. Will I still have that option with future products? If this new technology prevents the bad guys from getting my data, does it also prevent me from accessing my data in a way that I choose? Is the encryption technology open, and does it allow me to authenticate to my data from systems other than Windows? Does it make it possible to share files with systems that aren't licensed to use this new cryptography? I'm unsure of the answer.
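One practical way to judge whether an "XML" file format is genuinely open is to look inside it without the authoring application. The newer XML-based office formats are, at bottom, ZIP archives of XML parts, so any platform with a standard ZIP library can at least enumerate and read the content. A sketch in Python; the archive built here is a deliberately tiny, hypothetical stand-in rather than a real office document, but the openness test it demonstrates is the same:

```python
import io
import zipfile

# Build a tiny stand-in shaped like an XML-based office file:
# a ZIP container holding XML parts. A real document would carry
# many more parts, but the principle is identical.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("content.xml", "<doc><p>Hello from an open format</p></doc>")
    zf.writestr("meta.xml", "<meta><author>someone</author></meta>")

# Any platform -- Linux, Windows, BSD -- can now inspect the parts
# with nothing but an ordinary ZIP and XML toolchain.
with zipfile.ZipFile(buf) as zf:
    for name in zf.namelist():
        print(name, "->", zf.read(name).decode())
```

If a vendor's "open" format resists even this kind of inspection, or wraps the XML in proprietary encryption, the openness is nominal and your migration options are narrower than advertised.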

I also worry about the inclusion of Digital Rights Management (DRM) technology in hardware. For example, the DVI connections present on many modern graphics cards are very similar to the HDMI (High-Definition Multimedia Interface) connections being used to carry audio and video over one cable in home entertainment equipment. Eventually, computers will use this same interface. Here's a bit of trivia: the HDMI standard includes an element called HDCP (High-bandwidth Digital Content Protection, developed by Intel) that does little to add value to my personal experience (I'm sure the recording and movie industries will offer some statistic about how reduced piracy keeps prices lower for me). However, it could mean that if I choose a "standard" graphics card with a standard PCI-E interface, I must also make sure it adheres to less obvious standards buried within my hardware if I want to watch a DVD or an HDTV broadcast. Does HDCP add value to me personally? Does it help me get more enjoyment out of my system? Should I be concerned about what's going on within the widgetry of my system? I would think so.

My point in mentioning these things is not to cause undue worry or to preach doom and gloom. My hope is to make you aware that as you adopt mainstream technologies, you may also unknowingly be adopting features that lock you into a product's technology. Some of these features will have real benefits; just make sure you are getting what you bargained for. Also ask what happens if these technologies add an additional point of failure. The reason I know so much about HDMI is that I recently bought a plasma TV, and while running cables from my HD receiver to my A/V receiver to my new TV, I found that either the receiver or the set-top box didn't properly implement the standard. The result was that the copy protection inhibited my ability to legally use my own equipment. HDCP never came up in the sales process, nor were the installers of my system aware of the potential problem.

Take the same situation in a different context: What happens if data stored on your Windows server becomes unavailable to your Linux servers because of some obscure DRM scheme? Does it shut down your operation? Does it add unnecessary complexity and inconvenience? These are the questions I would be asking before making my next investment in new technology. The freedom to migrate is one I believe to be more important than the actual act of migrating. I have made my decisions and continue to make them, but I do so on my terms, not those of vendors conspiring to lock me in. I don't know which conventions might be widely adopted in the future that would prevent me from using legally purchased products in a reasonable way (of course, the consumer's definition of a reasonable way and the vendor's are bound to differ). What I do know is that I need to be vigilant, watch for these gotchas, and take steps to avoid them. I would advise you to do the same.

More Stories By Mark R. Hinkle

Mark Hinkle is the Senior Director, Open Source Solutions at Citrix. He is also a long-time open source expert and advocate. He is a co-founder of both the Open Source Management Consortium and the Desktop Linux Consortium, and has served as Editor-in-Chief for both LinuxWorld Magazine and Enterprise Open Source Magazine. Hinkle is also the author of the book "Windows to Linux Business Desktop Migration" (Thomson, 2006). His blog on open source, technology, and new media can be found at http://www.socializedsoftware.com.


