
Migration Planning for Linux Desktop Adoption

Five Steps for a Successful Windows-to-Linux Migration Plan

Corporate migration to a Linux desktop requires rigorous pre-migration planning to succeed. The goal of migration is to finish with a Linux desktop that is cost-effective and responsive to the organization's needs. Without proper planning, the migration won't meet this goal and can become a technical and organizational challenge.

During pre-migration planning, it's crucial to answer the questions: What can I migrate? What do I need to migrate? How much is it going to cost? Who's going to do it? And when is it going to happen?

It's useful to keep in mind what the organization's IT requirements will be in the future, what the current hardware lifecycle is, and what the time frame is for software licensing.

A company or agency determined to move IT toward web-based applications for key business activities can easily switch to a Linux desktop with a browser such as Firefox. An organization determined to extend the hardware lifecycle from three to five years can consider Linux desktops to get more life out of its old hardware. An organization with a large, long-term software licensing deal that expires in the next three years should test the Linux desktop, if for no other reason than to use it as a negotiating weapon. Lowering costs with Mozilla, Evolution and OpenOffice does not necessarily require a complete migration to a Linux desktop.

Alacos (www.alacos.com) uses a best-practices desktop migration methodology called Migration Mapping. Migration Mapping consists of five key steps necessary to a successful migration. They are:

  1. Audit the Current Environment
  2. Analyze the Audit Data
  3. Design the Solution
  4. Map to an Organizational Matrix
  5. Automate the Organizational Transition

Audit the Current Environment

An audit, the first key step, determines the actual cost of your current software and hardware environment, the applications currently in use and how the IT infrastructure is bound together. The point of this step is to determine how third-party applications, proprietary applications and external-facing applications (both proprietary and licensed) are deployed and used in your organization.

The base data to gather are the number of desktops in use, the number of applications used, the printer situation, line-of-business applications and the server and database infrastructure. Data on usage should be coordinated in a spreadsheet that contains information on software and hardware by department, by function and by business need. The cost of the OS, software, hardware, printing, back-end and support should be included.
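Gathering this data by hand gets unwieldy beyond a few dozen desktops, so it helps to script the roll-up. The following is a minimal sketch, assuming a hypothetical CSV export (audit.csv) with "department", "installed_apps" and "annual_cost" columns; a real auditing tool will use different field names.

```python
# Sketch: roll up desktop audit data into a per-department summary.
# Assumes a hypothetical export, audit.csv, with one row per desktop and
# columns "department", "installed_apps" (semicolon-separated) and
# "annual_cost"; adjust the field names to match your auditing tool.
import csv
from collections import defaultdict

def summarize_audit(path="audit.csv"):
    desktops = defaultdict(int)    # desktops per department
    apps = defaultdict(set)        # distinct applications per department
    cost = defaultdict(float)      # OS + software + support cost per department

    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            dept = row["department"]
            desktops[dept] += 1
            apps[dept].update(a.strip() for a in row["installed_apps"].split(";") if a.strip())
            cost[dept] += float(row["annual_cost"])

    for dept in sorted(desktops):
        print(f"{dept}: {desktops[dept]} desktops, "
              f"{len(apps[dept])} distinct applications, "
              f"${cost[dept]:,.0f}/year")

if __name__ == "__main__":
    summarize_audit()
```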

The quickest, but least reliable, way to audit an organization's IT needs is to count software licenses. It will give you an overstated picture of software usage because most users rarely use all of the software installed on their desktops. This approach won't uncover any unlicensed software or illegal programs downloaded from the Internet that are being used. Knowing what's actually installed and used in an organization is crucial to creating a migration plan and integrating it successfully.

The most rigorous way to measure an organization's software use is to run an auditing and usage program on its network. Useful and actionable data will be available after a week of running auditing software, but it's preferable to run it for a full quarter to really understand changing software usage patterns within departments. For example, a finance department might use significantly different software during a quarterly or annual wrap-up than it does during a regular month.

There are various software tools available to conduct an audit. The Business Software Alliance (www.bsa.com) offers free trials of audit tools. Sassafras Software (www.sassafras.com) and Tally Systems (www.tallysystems.com) both offer commercial usage and auditing software. Licenses for usage-audit software typically cost $15 to $25 per computer. Microcost (www.microcost.com) offers usage consulting in Europe and produces a detailed report on software usage.

Beyond usage auditing, it's crucial to review the network and desktop data situation. Although many companies strive to keep all data on the network and backed up, that's rarely the case. Users have important business files on their desktops, as well as e-mail files and address books. Employees with laptops often don't keep data adequately synchronized to the network - if it's not automatic, they don't do it. Each organization is structured differently, so it's important to spend time reviewing what business data resides on the network and what lives on the user desktop.
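A quick way to scope the desktop data problem is to scan user profile directories for business files that live only on the local disk. The sketch below is illustrative only: the profile path and the extension list are assumptions, and a real review would compare the results against the network shares that are actually backed up.

```python
# Sketch: list business files that live only on a user's local profile.
# The profile path and extension list are illustrative; compare the output
# against the network shares that are actually backed up.
from pathlib import Path

BUSINESS_EXTENSIONS = {".doc", ".xls", ".ppt", ".odt", ".ods", ".pst", ".wab"}

def find_local_business_files(profile_dir):
    hits = []
    for path in Path(profile_dir).rglob("*"):
        if path.is_file() and path.suffix.lower() in BUSINESS_EXTENSIONS:
            hits.append((path, path.stat().st_size))
    return hits

if __name__ == "__main__":
    # Hypothetical mount point for a Windows user profile.
    for path, size in find_local_business_files("/mnt/winpc/Documents and Settings/jsmith"):
        print(f"{size / 1_048_576:6.1f} MB  {path}")
```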

The last audit to undertake is to inventory the IT skills of both an organization's IT staff and its general population. Who on the IT staff can run Linux capably? Who already uses Linux at home? Who likes OpenOffice or StarOffice? Ask these questions at an early stage to identify the Linux champions in your company.

Analyze the Audit Data

The second key step, analyzing the audit data, requires reporting and visualization capabilities. Use a spreadsheet to sort and define data by group and function. Use Visio or Draw to create a visual map of the network and desktop environment. The goal here is to detail the IT structure and determine where migrating to desktop Linux would add value. It also pinpoints which groups can be migrated to Linux most easily.
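One way to make that analysis concrete is to score each group by how much of its software already has a Linux path. The sketch below is a toy example: the department lists and the set of "Linux-ready" applications are illustrative assumptions drawn from a hypothetical audit, not a compatibility catalogue.

```python
# Sketch: rank departments by how much of their software already has a
# Linux path. The department lists and the LINUX_READY set are illustrative
# assumptions, not a compatibility catalogue.
LINUX_READY = {"Firefox", "Evolution", "OpenOffice Writer", "OpenOffice Calc"}

dept_apps = {
    "Sales":     {"Firefox", "Evolution", "OpenOffice Writer", "CRM (web)"},
    "Finance":   {"Excel + macros", "Proprietary ERP client", "Firefox"},
    "Reception": {"Firefox", "Evolution"},
}

def readiness(apps):
    # Count web-delivered applications as Linux-ready; everything else must
    # appear in LINUX_READY.
    ready = [a for a in apps if a in LINUX_READY or "(web)" in a]
    return len(ready) / len(apps)

for dept, apps in sorted(dept_apps.items(), key=lambda kv: -readiness(kv[1])):
    print(f"{dept:10s} {readiness(apps):4.0%} of applications have an obvious Linux path")
```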

It's usually a big surprise finding out what employees do with their desktops. The top activities are usually e-mail, Word documents and proprietary business applications (Internet or desktop). Applications like Excel and PowerPoint are usually found in heavy use only by small groups of key employees.

There are often a lot of programs installed on user desktops that shouldn't be there. They're either unlicensed software (remove it quickly before you're visited by the BSA) or programs downloaded from the Internet. It's always interesting to find hacking tools hidden on the in-house network.

Games and Internet use are generally higher than most companies would like, and good usage data may translate into changes in business policies. Switching to a Linux desktop will let IT administrators control software use more easily.

A typical audit will find that a company needs to replace its front-end mail client, office suite and web browser for security reasons and move employees to a network and printer environment using Samba.

Any changes in the desktop environment should reflect changes that have been made or will be made in the back-end infrastructure. Although one could switch to a new mail client and use a special connector to access a proprietary mail server, it's more useful to move the mail server to a Linux server running an open source mail server such as Open-Xchange.

Design the Solution

The third key step is to design the solution based on an analysis of the IT audit. Because the decision-making is based on data, a solution can be developed that meets the organization's needs. The goal is to build a Linux desktop that replaces Windows at a fraction of the cost with no loss in business capabilities. A data-centered process allows a company to settle on an IT structure that will keep it competitive in the future.

Use the data to determine what applications and servers need to be migrated. Then find out whether an alternative exists on Linux or whether a compromise solution is available.

The usual way to develop a solution is to go through a series of choices for each application. First, look for a Linux version of the software. If there's no Linux version, find a viable alternative, such as StarOffice instead of Microsoft Office. If there's no alternative, consider running the Windows application on Linux with an emulator or virtualization product such as CodeWeavers' CrossOver, Win4Lin or VMware. If that approach won't work, run a terminal server such as Citrix or LSP to deliver desktop access to the Windows program.

If none of these solutions work, then port the application to either a web-based application or a Linux desktop application. Porting an application is time-consuming and expensive, but if a company is already moving to web-based applications then that approach fits into its overall IT business strategy.
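As a rough illustration, this cascade can be expressed as a simple per-application lookup. The application names and catalogues below are placeholders, not a real compatibility database.

```python
# Sketch of the per-application decision cascade: native Linux version ->
# viable alternative -> emulator/virtualization -> terminal server -> port.
# The catalogues below are placeholders, not a real compatibility database.
NATIVE_ON_LINUX = {"Firefox", "Acrobat Reader"}
ALTERNATIVES = {"Microsoft Office": "StarOffice", "Outlook": "Evolution"}
RUNS_UNDER_EMULATION = {"Proprietary VB client"}   # e.g. CrossOver or Win4Lin
SERVED_VIA_TERMINAL = {"Legacy ERP front end"}     # e.g. Citrix

def migration_path(app):
    if app in NATIVE_ON_LINUX:
        return "install the Linux version"
    if app in ALTERNATIVES:
        return f"replace with {ALTERNATIVES[app]}"
    if app in RUNS_UNDER_EMULATION:
        return "run under an emulator or virtualization product"
    if app in SERVED_VIA_TERMINAL:
        return "deliver through a terminal server"
    return "port to a web-based or native Linux application"

for app in ("Firefox", "Microsoft Office", "Proprietary VB client", "In-house tool"):
    print(f"{app:25s} -> {migration_path(app)}")
```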

An organization will usually choose a Linux distribution that delivers enterprise-level support and training. The current desktop distributions that provide these services are the Novell Linux Desktop, Red Hat Enterprise, Sun's Java Desktop System, Mandrakesoft and Turbolinux. Xandros delivers customer support and service to smaller and mid-sized enterprises.

Choosing a new mail client, web browser and office suite is straightforward; Linux alternatives exist for each of these applications. StarOffice is the commercially supported office suite, while OpenOffice is freely available. The biggest problem with current office alternatives is that Excel macros don't work in Calc, the alternative spreadsheet program. This can be a challenge when moving financial and accounting groups to open source office alternatives, but it's worth re-creating those macros in Calc so everybody in a company uses the same standard program.

The other issue for office applications is communicating with external business partners. Using PDF as the standard for business communications will mitigate most document-exchange issues. PDF also reduces exposure to the macro viruses that travel in editable office documents.

One goal of the software design step is to drop applications. Many companies rely on outdated applications that don't meet business objectives. Moving to more flexible and modern applications can result in significant efficiency gains and cost savings.

Map to an Organizational Matrix

The fourth key step in the process is to map the software solution to the hardware across an organizational matrix. An application matrix for each group in a company is applied against the group's desktop hardware. Every piece of software that's going on the desktop should be determined prior to implementation. Every new hardware system should be defined at this point.
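One lightweight way to capture this matrix is as a simple data structure that can also drive rough cost estimates. Everything in the sketch below, from group names and package lists to per-seat costs, is an illustrative assumption.

```python
# Sketch: an organizational matrix mapping each group to the software image
# and hardware it will receive, plus a rough one-time cost estimate. Group
# names, package lists and per-seat costs are illustrative assumptions.
ORG_MATRIX = {
    "Sales": {
        "software": ["Novell Linux Desktop", "Firefox", "Evolution", "OpenOffice"],
        "hardware": "existing desktops, reimaged",
        "users": 40,
    },
    "Finance": {
        "software": ["Novell Linux Desktop", "Firefox", "OpenOffice", "Citrix client"],
        "hardware": "new thin clients",
        "users": 12,
    },
}

def cost_estimate(matrix, per_seat_training=150, thin_client=400):
    total = 0
    for group, plan in matrix.items():
        hardware = thin_client * plan["users"] if "thin client" in plan["hardware"] else 0
        training = per_seat_training * plan["users"]
        total += hardware + training
        print(f"{group:10s} {plan['users']:3d} users  hardware ${hardware:,}  training ${training:,}")
    print(f"Estimated one-time cost: ${total:,}")

cost_estimate(ORG_MATRIX)
```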

Once the solution has been mapped to the organizational matrix, software, hardware and training costs can be established. Overall, the up-front cost of moving to a Linux desktop and the cost of moving to the next-generation Windows system will be roughly equivalent, so the long-term savings come from the OS, office applications, lower IT costs for network management and lower security risks.

Automate the Organizational Transition

The fifth and final key step in pre-migration planning is to finalize the transition plan using automated migration tools. A transition plan lays the software design and organizational matrix against the real-world organization and timeline to determine how to automate mass migration. Although these considerations have already played a part in developing the IT solution, this is the moment to place those plans against group schedules and deadlines.

The goal is to set down in detail who in the organization will migrate, when it will occur and what post-migration support will be needed. Getting the timing right is important to the success of the migration. Consider the quarterly and yearly cycles of the various groups. Take into account the trade show plans of the sales and marketing staff. Remember quarterly and annual financial preparations. Migrating during a group's busiest time ensures failure.

Every possible migration process that can be automated is defined at this point. Automation is crucial to mass migration. The transition plan should include who's doing the work using which migration tools for servers and desktops. Manually migrating user data and manually installing a desktop OS and software is a painstaking, error-prone process that requires an experienced Windows and Linux technician. Since it's time-consuming, migrating manually is expensive.
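As a small taste of what automation looks like, the sketch below copies a user's business files from a mounted Windows profile into a new Linux home directory and reports what was moved. The paths and extensions are illustrative assumptions; commercial tools such as Alacos' Linux Migration Agent also convert mail, address books and browser settings, which this sketch does not attempt.

```python
# Sketch: copy a user's business files from a mounted Windows profile into a
# new Linux home directory and report what was moved. Paths and extensions are
# illustrative; a real migration tool also handles mail stores, address books
# and browser settings, which this sketch does not attempt.
import shutil
from pathlib import Path

def migrate_user_files(windows_profile, linux_home, extensions=(".doc", ".xls", ".ppt", ".odt")):
    src = Path(windows_profile)
    dst = Path(linux_home, "Documents")
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for path in src.rglob("*"):
        if path.is_file() and path.suffix.lower() in extensions:
            target = dst / path.name
            shutil.copy2(path, target)   # preserves timestamps
            copied.append(target)
    return copied

if __name__ == "__main__":
    for f in migrate_user_files("/mnt/winpc/Documents and Settings/jsmith", "/home/jsmith"):
        print("copied", f)
```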


The most important part of a Linux desktop migration is to develop a rigorous migration plan during the pre-migration phase. Migration Mapping consists of the five key steps in a successful plan - Audit the Current Environment, Analyze the Audit Data, Design the Solution, Map to an Organizational Matrix and Automate the Organizational Transition. Following these five key steps provides a data-centered approach to revising an organization's desktop and software infrastructure.


Alacos: Linux Migration Specialists

Alacos (www.alacos.com), a Linux software company located in Seattle, Washington, is focused on the migration of Windows data to Linux. Alacos is the maker of Linux Migration Agent, which can move data from a Windows PC to a Linux PC via a crossover cable or network. Linux Migration Agent can move e-mail from Outlook to Novell's Evolution e-mail client, transfer Internet Explorer browser settings to Mozilla and accomplish many other time-consuming tasks involved in a Windows-to-Linux migration.

More Stories By Luis Aguilar

Luis Aguilar is Vice President of Technology and a founder of Alacos, a Linux software company located in Seattle, Washington, focused on the migration of Windows data to Linux.

