Windows-to-Linux Desktop Migration Road Map

Managing application settings, system settings, and data

International Data Corporation (IDC) released a study in December 2004 noting that the worldwide Linux market for PCs, servers, and software will reach $35 billion by 2008. There is a general industry consensus that we are on the brink of a major Windows-to-Linux migration. Despite all the high-level discussion, however, little attention has been paid to the practical steps of moving from a Windows desktop to a Linux desktop. This article serves as a road map, outlining how to perform both a manual and an automated desktop migration.

Prep Steps: Advance Planning

In planning a desktop migration, it is essential to determine whether Linux supports applications comparable to those currently in use on the Windows desktop. First, determine which applications are critical to your migration. Next, identify equivalent applications for your new Linux environment. For example, will OpenOffice.org replace MS Office? Will Novell Evolution or Mozilla Mail replace MS Outlook? Last, determine which of these applications must be ported, whether through reengineering, through modification, or by using a third-party compatibility layer that allows the Windows application to run on Linux.

The next step is to choose your Linux graphical desktop environment, distribution, and Web browser. First, you should decide on a Linux distribution. There are many distributions available, including, but not limited to, Novell Linux Desktop (NLD), SUSE LINUX Pro, Red Hat Desktop (RHD), Xandros Desktop OS, Mandrake, and Turbolinux Desktop. Next, determine the graphical desktop environment - KDE or GNOME. Each user interface has its merits: SUSE LINUX defaults to KDE and Red Hat defaults to GNOME. Finally, consider which Web browser to use. Linux Web browsers include Firefox, Konqueror, and Mozilla.

If your organization uses personal digital assistants (PDAs), such as Palm Pilots or Pocket PCs, look into available enterprise support within your new Linux environment, specifically with regard to PDA syncing.

You also need to formulate a plan for the hardware on which your Linux desktop will run. Do you plan to buy new hardware or to utilize what you already have? Can Linux work on the hardware you are currently using? One of the advantages of Linux is that it runs well on older hardware, performing better than Microsoft Windows in this respect. Newer hardware can present challenges with Linux because, to date, there is typically no support "out of the box." However, companies such as HP, IBM, and Dell are improving their support for Linux.

Prior to performing the migration, you must consider how you plan to set up machines for your users. You must decide whether data will be stored on the network or locally. If data is to be stored on a network, perhaps a Linux thin client would be most suitable for your organization. However, if you have many laptop users, or other users who aren't always connected to the network server, a Linux rich client is more appropriate. It is also important to consider the access rights of a desktop user. Is software installation done by network administrators only? Who can access directories on the network? How do users interact with applications and programs?

Additionally, routine operational concerns, such as supporting upgrades, patches, and backups, have counterparts on the new Linux machines and should be planned for accordingly.

Manual Desktop Migration

After the planning process has been completed and you've determined everything you need in the new Linux environment, including application compatibility, it is recommended that you conduct a pilot migration (1-25 seats). This trial will expose any technical difficulties that need to be addressed before deployment across the organization takes place. You should budget up to one day of work for one technician per machine. Given enough practice and experience, one technician may be able to complete two or three machines each day.

OpenOffice
Custom dictionaries make a good starting point for a migration to OpenOffice. It is not possible to use the dictionaries from Microsoft Office because the format is slightly different. To grab custom dictionaries from Microsoft Word, first find the dictionaries in use by selecting Tools->Options from within Word, selecting the Spelling & Grammar tab, and then clicking Custom Dictionaries. Here you'll find a list of all your dictionaries (most of which will be in the user's Application Data directory, under the subfolder Microsoft\Proof). Copy all the dictionaries to the Linux machine and open them in OpenOffice. (Since the Word dictionary has just one word per line, the dictionary will open as a plain text file under OpenOffice.) Now run a spell check (from Tools->Spellcheck->Check) and repeatedly click the Add button. This will add the custom words into the new custom dictionary.
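
For larger rollouts, the word-collection step can be scripted. The following Python sketch, run on the Windows machine, merges the words from every Word custom dictionary it finds into a single plain-text file that can then be opened in OpenOffice for the spell-check pass described above. The Proof-folder path and the encoding handling are assumptions that may need adjusting for your environment.

import glob
import os

# Assumed location of Word custom dictionaries; adjust per user profile.
proof_dir = os.path.expandvars(r"%APPDATA%\Microsoft\Proof")

words = set()
for dic in glob.glob(os.path.join(proof_dir, "*.dic")):
    raw = open(dic, "rb").read()
    # Newer Word dictionaries are UTF-16 with a byte-order mark; older
    # ones are plain ANSI text. One word per line either way.
    if raw.startswith(b"\xff\xfe") or raw.startswith(b"\xfe\xff"):
        text = raw.decode("utf-16")
    else:
        text = raw.decode("latin-1")
    words.update(line.strip() for line in text.splitlines() if line.strip())

# One combined word list to open (and spell check) in OpenOffice.
with open("custom_words.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(sorted(words)))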

It is also important to migrate macros. Your migration will depend on how many macros you have and how complex they are. OpenOffice has its own macro format, which is similar but not identical to that of Microsoft Office. You will most likely have to treat macro migration as you'd treat the migration of an application, setting aside some time for rewriting and testing.

Many other Microsoft Office settings have equivalent configurations in OpenOffice. In most cases, default settings will be adequate for the majority of users. Some settings, however, may be configured so that all documents produced by a company follow the same theme. For example, you'll want to look into settings for Styles (available in Format->Styles & Formatting in Microsoft Word, and Format->Styles->Catalog in OpenOffice Writer) and Templates (saved as .dot files for Microsoft Word and .stw files for OpenOffice).
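
If you need an inventory of which templates to convert, a short script can list them. The sketch below is a rough illustration only, and the starting path is an assumption; it walks the Windows user profile and prints every Word .dot template so each can later be opened in OpenOffice Writer and re-saved as an .stw template.

import os

# Assumed starting point; narrow this to the Templates folder if you
# know where the company keeps its .dot files.
profile = os.path.expandvars("%USERPROFILE%")

for root, _dirs, files in os.walk(profile):
    for name in files:
        if name.lower().endswith(".dot"):
            print(os.path.join(root, name))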

Finally, in order to maintain productivity, you will probably want to configure toolbars and keyboard shortcuts to be as similar to those in the old environment as possible. The toolbars in Word are easily visible, and toolbars in OpenOffice may be changed by going to Tools->Configure, selecting the Toolbars tab, highlighting the toolbar to be edited, and clicking the Customize button. Keyboard shortcuts can be found in Word by selecting Tools->Customize and then clicking the Keyboard button; in OpenOffice, use Tools->Customize and then the Keyboard tab. As these settings are quite numerous, it might be best not to customize them for each user. Instead, determine a standard toolbar and keyboard layout, then, before migration, copy the company-standard OpenOffice settings folder (which, depending on the version, is ~/.openoffice, ~/.xopenoffice, or ~/OpenOffice-version) to each machine.
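
As a rough sketch of that deployment step (run as root on the Linux machine; the source path, the settings-folder name, and the /home layout are all assumptions), the following Python script copies the company-standard settings folder into each user's home directory:

import os
import shutil
import subprocess

STANDARD_SETTINGS = "/srv/migration/openoffice-standard"  # assumed source
TARGET_NAME = ".openoffice"  # adjust to match your OpenOffice version

for entry in os.listdir("/home"):
    home = os.path.join("/home", entry)
    target = os.path.join(home, TARGET_NAME)
    if not os.path.isdir(home) or os.path.exists(target):
        continue
    shutil.copytree(STANDARD_SETTINGS, target)
    # Hand ownership back to the user so OpenOffice can write to the
    # folder on first login.
    subprocess.run(["chown", "-R", entry, target], check=True)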

E-mail/Contacts/Calendars/Tasks
Bluntly put, work does not get done without e-mail. The successful migration of e-mail (along with contacts, calendars, and tasks) is therefore of the utmost importance. If you are using (and plan to continue using) a Microsoft Exchange Server with Microsoft Outlook clients, you'll want to migrate your clients to Novell Evolution and use the Exchange connector. Setup of the accounts and the migration should be automatic, since all data resides on the server. This process is similar for GroupWise - just use the GroupWise connector for Evolution. For both Exchange and GroupWise, you'll want to check that users are storing mail on the server and not on their local hard drives.

If your e-mail is provided via an IMAP server, most e-mail will migrate automatically. However, you'll have to manually move contacts, calendars, tasks, and any e-mail residing on the local hard drive (such as sent mail, drafts, or other saved messages). If you're using a POP server, you will have to migrate everything manually.

There is an open source product called Outport (http://outport.sourceforge.net) that can help you migrate from Microsoft Outlook to Evolution. Outport migrates most but not all data. On the Linux side, you'll have to set up accounts again and use Evolution import features to import data. You can visit the Outport homepage for further procedural details.

Data
Second in importance perhaps only to e-mail, the user's files must also be migrated to the new machine. If your organization already has a policy of keeping documents on a network share, this step is simple: set up the reference to the new network share. Otherwise, if you're migrating to network-based storage (either on a Samba server or using Novell's iFolder), you'll want to copy the documents from Windows to the network share, then simply set up the share on Linux. If you're instead going to keep documents on local hard drives, you'll need to copy all the documents over the network in some way. The simplest method is to begin by enabling sharing on the Windows machine for each folder that needs to be copied. Next, use the graphical tools (Konqueror on KDE and Nautilus on GNOME) to browse to those shares and copy the files locally. When copying files to the Linux machine, most files should be copied to a subfolder of the user's home directory.
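
If you'd rather script the copy than drag folders in Konqueror or Nautilus, a minimal sketch follows. It assumes the Windows share has already been mounted (for example with "mount -t cifs //winbox/Documents /mnt/winshare") and that the documents should land in a Documents subfolder of the current user's home directory; both paths are assumptions.

import os
import shutil

SOURCE = "/mnt/winshare"                  # assumed mount point of the Windows share
DEST = os.path.expanduser("~/Documents")  # assumed destination in the home directory

os.makedirs(DEST, exist_ok=True)
for name in os.listdir(SOURCE):
    src = os.path.join(SOURCE, name)
    dst = os.path.join(DEST, name)
    if os.path.isdir(src):
        shutil.copytree(src, dst, dirs_exist_ok=True)  # copy folders recursively
    else:
        shutil.copy2(src, dst)  # copy files, preserving timestamps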

Web Browsers
The easiest way to migrate from Internet Explorer to Firefox is to use Firefox's built-in import feature. First, install and launch Firefox on the original Windows machine. A prompt will ask if you'd like to import settings from Internet Explorer (click yes). Next, copy the Firefox profile (this is the folder named Mozilla, found in the user's Application Data directory) over to the Linux machine and put it in the user's home directory. You must, however, rename the folder .mozilla (the "." at the beginning means that the folder won't show up in most directory listings and is a required part of the file name). You may need to change permissions on the folder so it can be accessed by the user. Now, set up Firefox to have the appropriate extensions and plug-ins, such as Flash, Java, and anything else the user will need on a daily basis.
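
The copy-and-rename step looks roughly like the following sketch; the source path and the user name are hypothetical, so adjust both.

import os
import shutil
import subprocess

SOURCE = "/tmp/Mozilla"   # assumed: the "Mozilla" profile folder copied from Windows
USER = "alice"            # hypothetical user name

# The leading dot hides the folder, and Firefox on Linux expects it.
dest = os.path.join("/home", USER, ".mozilla")
shutil.copytree(SOURCE, dest)

# Make sure the user, not root, owns the profile.
subprocess.run(["chown", "-R", USER, dest], check=True)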

Desktop/Background/Look & Feel
Most noticeable to many end users is the wallpaper migration. To migrate the wallpaper, first find its location on Windows. (In the registry, browse to the key HKEY_CURRENT_USER\Control Panel\Desktop and look up the value Wallpaper. Or, if the user was using an Active Desktop wallpaper, find the Wallpaper value in HKEY_CURRENT_USER\Software\Microsoft\Internet Explorer\Desktop\General.) Copy the file over to Linux (preferably to somewhere in the user's home directory). If you're using GNOME, set the wallpaper by right-clicking on the desktop and choosing Change Desktop Background, then clicking Add Wallpaper. If you're using KDE, choose Configure Desktop, select the Background tab, and click on the file dialog button for the picture.
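
Reading the wallpaper path can also be scripted on the Windows side with Python's standard winreg module; a minimal sketch follows (the Active Desktop key mentioned above can be queried the same way if this value is empty).

import winreg  # part of the Python standard library on Windows

# Read the current wallpaper path from the registry key mentioned above.
key = winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop")
try:
    wallpaper, _value_type = winreg.QueryValueEx(key, "Wallpaper")
    print("Wallpaper file:", wallpaper)
finally:
    winreg.CloseKey(key)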

The KDE desktop can be made to look and feel much like Windows, with fewer differences than those found between Microsoft Windows 95 and XP themselves. For instance, the Redmond theme uses similar colors and nearly identical shapes to the Minimize, Maximize, and Close buttons in Windows 2000 (the theme can be set from KDE's Control Center, under Appearance & Themes->Window Decorations). Likewise, the Redmond splash screen provides a login screen that is familiar to Windows XP users. Other possibilities include using different icons, changing the look of the clock, and choosing among mouse pointer settings.

You may wish to make Linux look just like Windows, or you may decide to use some features that are unique to Linux. For instance, double-clicking the title bar can be configured to maximize and minimize the window exactly as in Windows (which is now the default on many Linux distributions). However, many long-time Linux users prefer the historical default "roll up" setting. Additionally, the GNOME desktop defaults to having the Applications menu and launchers on the top of the screen, leaving more room for the list of currently open windows on the bottom.

Instant Messaging (IM)
If you're using a protocol where the buddy list resides on the server, simply set up the accounts in the new Linux environment. If the user name is tied to the corporate login, the new account can be created directly; otherwise, launch the IM client on Windows to see what the user name was, and let the user enter his or her password the first time IM is used on the new system. Linux clients such as Gaim support common instant messaging protocols, including AIM and Jabber.

Automated Migration

A variety of software migration tools can save significant resources and technician time when moving from the Windows desktop to the Linux desktop. Two examples of these tools are Versora's Progression Desktop and Alacos' Linux Migration Agent. Customer surveys have indicated that an IT staff can migrate 20-25 machines (including testing) in eight hours if Progression Desktop is run on each machine. Reports also indicate that using Progression Desktop in conjunction with a systems management suite (such as ZENworks) can enable one technician to migrate up to 100 machines over the same eight-hour period. Compared to a manual migration, in which it takes one technician eight hours to migrate and test between one and three machines, an automated migration is a very attractive option.

The Linux Desktop Has Arrived!

If the recent migrations of high-profile global enterprises, municipalities, and universities are any indication, it's safe to say that the Windows-to-Linux desktop migration trend is gaining momentum daily. Given the growing frustration with Microsoft's licensing fees, inflexibility, security flaws, etc., organizations are becoming more and more open to alternatives. At the very least, IT staffs will be looking to support mixed environments.

Whether you choose to begin with a pilot project for your corporation or on your home network, I think you will be pleasantly surprised by the results. Performing the migration with one of the automated tools mentioned above will make it run even more smoothly. Wishing you good migrations!

SIDEBAR

Automated Migration

Step 1: Create a Platform Neutral Package
Insert the Progression Desktop CD and run through the install wizard. You will be prompted to choose which applications and system settings you wish to migrate from. When you have chosen, click Next. You'll now be prompted to choose which files to move, either by selecting files manually or via powerful filters. When you're finished, click Next. A prompt will now ask you where to put the PNP file. Choose a location and click Next. Progression Desktop will now create a Platform Neutral Package (PNP). This process may take anywhere from five minutes to two hours; though not required, it is faster if the PNP is written to the local machine. Note that these time estimates refer to computer time, not technician time: this step requires approximately 10 minutes of technician time per machine.

Step 2: Migration
Make the PNP file accessible to the Linux machine by enabling file sharing on the servers (recommended easiest method), by using a remote file server (most secure method), or by using burned CDs. Put the Progression Desktop CD into the Linux machine. Browse to the CD and double-click on ProgressionDesktop.sh. Enter a password if prompted, as Progression Desktop may need to install some dependencies. Follow the wizard instructions and select the PNP file.

Next, a prompt will ask you to choose either an express or a custom migration. An express migration automatically uses default settings; a custom migration lets you choose application destinations, new file paths for stored documents, and other, more advanced settings. After this step is completed, click Next and the data will be applied. If performed locally, the migration step should take between five minutes and two-and-a-half hours; done over a network connection, it will take longer. Keep in mind that the automated migration requires only about 10 minutes of technician time per machine, while a manual migration requires eight hours of technician time.

More Stories By Jon Walker

Jon Walker serves as CTO of Versora, an ISV providing Microsoft-to-Linux migration software. Mr. Walker has recently co-authored two white papers with Novell, titled Migrating from IIS Web Servers to Apache on SUSE LINUX Enterprise Server 9.0 and Migrating File and Print Servers from Windows to SUSE LINUX Enterprise Server 9. Prior to Versora, Mr. Walker was CTO/VP of Engineering for Miramar Systems. Software developed under his direction at Miramar has been deployed to over 20 million computers worldwide. Mr. Walker has also served as senior technologist for Nortel and Xing Technology (now part of RealNetworks).
