
Migrating the Desktop from NT to Linux

Commitment from your team is the key to success

At the end of 2004, Microsoft will stop supporting Windows NT. At that point, anyone using Windows NT will have several choices: follow Microsoft's upgrade path to Windows 2003, continue to use Windows NT without Microsoft support, or switch to Linux.

Switching to Linux is the cheapest, safest alternative, according to companies such as Tramp Trampolines and Polyscientific Enterprise Sdn. Bhd., a distributor of chemical and industrial products. Both of these companies made successful migrations from Windows NT to Linux and are happily using Linux on the desktop today, gaining cost savings and greater stability.

This article examines the Windows-to-Linux path for organizations using Windows NT as a desktop. We'll look at the first step, taking stock of the current situation, and then at the choices that have to be made based on it. Then we'll walk through the migration process and examine some of the problems and successes organizations have faced in making migrations work. Also covered are some of the recent technologies, such as live CDs, WINE (www.winehq.org), and Win4Lin (www.netraverse.com), as well as application equivalents and data conversion tools that make migration less painful, such as Rekall (www.totalrekall.co.uk).

Convincing the Business

The first step in any successful migration is a solid commitment from the decision makers. Every migration I have ever been involved in has strongly resembled an ungodly combination of a train wreck and a bar fight. It takes a clear plan (fail to plan, plan to fail) and a lot of willpower, combined with flexibility, to get to the end while keeping the bloodshed along the way to a minimum. Without real buy-in from the decision makers it's not just difficult, it's impossible.

It's also crucial to understand that not all the decision makers are in the boardroom: you want a core team of active supporters and a majority who are at least passive supporters of the migration effort. A little education and communication up front will go a long way toward reducing the costs of the project and ensuring the active, willing cooperation of your core team. This is the second step in a successful migration. When I say core team, I don't mean the experts who may be brought in to install and train users; I mean users who have bought into the new technologies and are willing to put in the extra effort needed to carry it through. You'll need them.

Identifying the Task at Hand

The third step in a successful migration is to take stock of the current state of the shop. You'll need to answer these questions:
  • What are our key applications?
  • What dependencies do they create?
  • Who are our key users?
  • How big is the job?
Many tools are available to do software inventory on the high end of things (see www.trackbird.com and www.expressmetrix.com/faq/software_inventory.asp). On the less-expensive end, Syslist (www.syslist.com) and AIDA32 (www.aida32.hu/aida-features.php?bit=32) are available.
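If neither class of tool fits your budget or environment, a rough inventory is easy to script yourself. The following is a minimal sketch, not one of the tools mentioned above, and it assumes a Python interpreter is available on each Windows machine: it walks the registry's Uninstall key and prints the names of the installed applications.

```python
# Minimal software-inventory sketch for a Windows desktop.
# Assumption: Python is installed on the machine being inventoried.
# It reads the standard Uninstall registry key and prints each
# application's display name; subkeys without a DisplayName (patches,
# components) are skipped.
import winreg

UNINSTALL = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL) as root:
    subkey_count = winreg.QueryInfoKey(root)[0]   # number of subkeys
    for i in range(subkey_count):
        name = winreg.EnumKey(root, i)
        with winreg.OpenKey(root, name) as app:
            try:
                display, _ = winreg.QueryValueEx(app, "DisplayName")
                print(display)
            except FileNotFoundError:
                # No DisplayName value for this entry; ignore it
                pass
```

Run on each desktop (or pushed out with whatever remote-execution facility you already have), the output gives you a first cut at the "what are our key applications" question.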

Once you have the answers to these questions you'll be in a position to conduct a systems triage: divide your key applications into those that can be replaced by functional equivalents, those that cannot be replaced, and those that must be converted in detail. An example of the first group is a word-processing package: OpenOffice can replace Microsoft Word.

The second group comprises two categories: applications that are unacceptably expensive to replace because of reengineering or retraining costs, and applications that cannot be replaced because of external requirements. A company may find that the retraining costs for moving people from Adobe Photoshop to the GIMP are unacceptably high, for example. Or they may have a requirement to provide material in certain formats that they cannot modify, such as a supplier whose largest customer stipulates that certain information must be transferred using Access or Excel.

The last group encompasses the "homegrown" components of the desktop system, such as Word macros or Visual Basic utilities, which would need to be rewritten in a new package.

This last group is where most of the migration "gotchas" lurk, and early identification of them is critical. Although zealots on both sides will often try to show that the choice between Windows and Linux is all or nothing, this isn't true in most cases. There is a range of technologies for running Windows applications on Linux, from WINE, CodeWeavers (www.codeweavers.com), and Win4Lin, which provide a basic environment for executing Windows applications directly within Linux, through to full virtual-machine environments such as VMware (www.vmware.com).

These technologies are quite solid and, properly applied, can give you the best of both worlds. Users use applications, and applications use operating systems, so giving the users applications they can work with, and the applications a stable, secure operating system, may be the best solution - or at least one that buys you a little more breathing space.
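As a rough illustration of the first approach, the sketch below (the paths are hypothetical, and it assumes WINE is installed and on the PATH) launches each candidate Windows executable under WINE and reports whether it at least starts - a useful first pass before your core team tests each application by hand.

```python
# Minimal "does it even launch?" check for Windows applications under WINE.
# Assumptions: the 'wine' binary is on the PATH and the executables have
# already been copied into the WINE prefix; the paths below are
# hypothetical examples.
import subprocess
import time

candidates = [
    "/home/alice/.wine/drive_c/Program Files/LegacyApp/legacy.exe",
    "/home/alice/.wine/drive_c/Tools/report_tool.exe",
]

for exe in candidates:
    proc = subprocess.Popen(["wine", exe])
    time.sleep(10)               # give the application time to start
    if proc.poll() is None:      # still running, so it at least launched
        print("%s: launched under WINE (stopping test instance)" % exe)
        proc.terminate()
    else:
        print("%s: exited early with code %s" % (exe, proc.returncode))
```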

Building the New Environment

Once the analysis is done, you'll be in a position to make evaluations that lead to firm decisions about the specific technologies and packages you'll be using. This is an area where open source stops being an abstraction and becomes a serious business advantage. You don't need to buy a pig in a poke - you can get several pigs and make them jump through hoops for very little cost.

If you take advantage of bootable Linux CD technologies such as KNOPPIX, you can reduce the cost of testing and evaluation significantly. For example, rather than setting up a test machine or network and moving over a typical set of material, you can simply boot your existing machines with KNOPPIX and try opening your existing Word documents with OpenOffice.

Your core users can try things like switching over to Linux and falling back to Windows when required. There are also a lot of resources for choosing Windows application equivalents on Linux and many articles describing Windows-to-Linux migration in general.

The best guide I've found is the Migration Guide put out by KBSt Publication Service, a 441-page PDF containing a thorough and well-written analysis sure to be useful to anyone looking at this.

The absence of license fees and the ready availability of much of the software sharply reduce the cost of doing an incremental migration. The variety among Linux distributions is an advantage here rather than a liability, because no matter what your existing hardware base is, you'll be able to find a distribution that will run on it. If the one you find can't do what you want, you'll be able to determine the needed upgrades much more precisely than by simply taking a minimum-requirements list from a vendor's sales material. On the other hand, if you want professional services to assist your evaluation, companies such as IBM (www-1.ibm.com/linux) and Racemi (www.racemi.com) offer consulting in this area.

I haven't found any products designed specifically to assist desktop migrations; however, two tools I often recommend are OpenOffice and Rekall. OpenOffice's ability to read Word and Excel formats and write a variety of formats makes it an ideal replacement for the Windows equivalents, while Rekall allows you to read an Access database via ODBC and write that data to PostgreSQL, MySQL, or a number of other databases. For the vast majority of desktop systems this will allow you to transfer the user data.
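Rekall does this work through its GUI; purely as an illustration of the same ODBC-based idea, here is a minimal Python sketch that copies one table from an Access .mdb file into PostgreSQL. It uses the pyodbc and psycopg2 libraries, and the driver string, file path, table layout, and connection details are hypothetical placeholders rather than anything taken from the tools above. Run on the Windows machine that holds the .mdb file, it pushes the data across to the new database server.

```python
# Minimal sketch: copy one table from an Access database to PostgreSQL.
# The driver name, file path, table name, and connection details are
# hypothetical placeholders; adjust them for your environment.
import pyodbc
import psycopg2

# Read the source rows from Access via the Windows ODBC driver
access = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb)};DBQ=C:\data\orders.mdb"
)
rows = access.cursor().execute(
    "SELECT id, customer, total FROM orders"
).fetchall()

# Write them to the new PostgreSQL server
pg = psycopg2.connect(host="dbserver", dbname="orders", user="migration")
cur = pg.cursor()
cur.execute("""
    CREATE TABLE IF NOT EXISTS orders (
        id       integer PRIMARY KEY,
        customer text,
        total    numeric
    )
""")
cur.executemany(
    "INSERT INTO orders (id, customer, total) VALUES (%s, %s, %s)",
    [tuple(r) for r in rows],
)
pg.commit()
```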

In situations where you cannot easily transfer data, you may have to change your approach and look for an equivalent or compatible software package. For example, Polyscientific Enterprise had a problem with Lotus SmartSuite documents not being readable by OpenOffice, and reassessed their business problem to look for a solution within another package.

Alternatively, you can use one of the methods described previously to run the Windows application on Linux. In any event, when you have decided on the correct mix of application packages, make sure they can work together. Having your core team perform interoperability testing - actually moving real data around and verifying the results - is the best way to discover problems. Once again, solutions such as KNOPPIX can be a real help at this point.
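As a small example of what verifying the results can mean in practice, the following sketch (reusing the hypothetical names from the earlier data-transfer sketch) compares row counts between the Access source and the PostgreSQL target; any difference is an immediate sign that data was dropped along the way.

```python
# Minimal verification sketch using the same hypothetical names as the
# earlier data-transfer example: compare row counts after the migration.
import pyodbc
import psycopg2

access = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb)};DBQ=C:\data\orders.mdb"
)
src_count = access.cursor().execute(
    "SELECT COUNT(*) FROM orders"
).fetchone()[0]

pg = psycopg2.connect(host="dbserver", dbname="orders", user="migration")
cur = pg.cursor()
cur.execute("SELECT COUNT(*) FROM orders")
dst_count = cur.fetchone()[0]

print("source rows: %d, target rows: %d" % (src_count, dst_count))
if src_count != dst_count:
    print("WARNING: row counts differ; investigate before going live")
```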

Realizing the Migration

So, after you've gotten a solid commitment, decided on your migration plan, assembled the core team, and built and tested your solution, you're faced with training and supporting your end users. Some suggestions to make this easier:
  • Try to do it a few users at a time, or one functional group at a time.
  • Evaluate the material available for free from places such as OpenOffice.org, and make it available through an internal Web application such as a forum or a wiki.
  • Set up a Web-based training package such as Moodle (www.moodle.org).
  • If you can, make your core team available to help people out.
  • Test your chosen architecture and software suite and ensure that it fulfils your functional requirements.
  • Test the interoperability of your new solutions with your legacy systems and verify that they work in a production environment before you commit them organization-wide.
  • Test your training and documentation setup using typical users with no previous background. Remember that if people can't be brought up to speed on the new solutions in a cost-effective way, it won't work.
  • Expect problems. Testing will reduce them, but not eliminate them, and you'll have to react quickly while under a great deal of stress.

Summary

People and commitment are the key to a successful migration. If you have them you can succeed - and if you can take advantage of the open source edge, you can do it for a lot less. Migrations are always a high-stress activity, and desktop migration is particularly so because it forces users to cope with more visible changes than, for example, upgrading an e-mail server. Careful goal definition, planning, solution evaluation, and end-user training are all critical components, as are a dedicated core team and a step-by-step approach. The lower cost, greater interoperability, and greater flexibility of open source technologies, when used properly as part of a well-thought-out and coordinated plan, will get you to the end of your migration path with a stable, secure, and lower-cost desktop.

References

  • "The Wrong Choice: After picking NT, Trampoline firm leaps to Linux": http://searchenterpriselinux.techtarget.com/ originalContent/0,289142,sid39_gci905078,00.html
  • "Open Source in SME Migration to Linux": http://opensource.mimos.my/fosscon2003cd/paper/slides/11_seah_hong_yee.pdf
  • Windows application equivalents on Linux: http://linuxshop.ru/linuxbegin/win-lin-soft-en/table.shtml
  • Switch to Linux: http://switch.demoni.ca
  • Linux for Microsoft Windows Users: http://mozillaquest.com/indexes/Linux4Windows_index.html
  • KBSt Migration Guide: www.bmi.bund.de/downloadde/25072/Download_englisch.pdf
About the Author

Rob Sutherland is an independent consultant in Toronto, specializing in providing support, analysis, and implementation assistance to small and medium-size companies moving into open source. For the past 25 years he has worked as a programmer, systems analyst, and IT support person for clients ranging from startups to state and federal governments. You can find out more about Rob at www.cheapersafer.com.



Comments
John Dean 08/24/04 03:42:32 AM EDT

Hi,
I would like to add a little information that has not yet found its way into many of the recent articles covering Rekall's feature list. Two new features are presently being worked on. The first is "Rekall on the Web": the idea is to Web-enable Rekall forms and reports, allowing users to produce either traditional desktop GUI database applications or data-driven Web applications. For more details on this feature, please visit the Total Rekall Web portal at http://www.totalrekall.co.uk. The second is an MS Access to Rekall conversion utility, which will likely form the basis of a commercial Enterprise Edition. The idea is to scan an Access .mdb file and extract the data and metadata so that an Access application can be re-created in Rekall's native format. In order to produce these features quickly we will need sponsorship, so that we can devote 100% of our time to the project.
