By Rob Sutherland
March 16, 2004 12:00 AM EST
At the end of 2004, Microsoft will stop supporting Windows NT. At that point, anyone using Windows NT will have several choices: follow Microsoft's upgrade path to Windows 2003, continue to use Windows NT without Microsoft support, or switch to Linux.
Switching to Linux is the cheapest, safest alternative, according to such companies as Tramp Trampolines and Polyscientific Enterprise Sdn. Bhd, a distributor of chemical and industrial products. Both of these companies made successful migrations from Windows NT to Linux and are happily using Linux as a desktop today, bringing them cost savings and greater stability.
This article examines the Windows-to-Linux path for organizations using Windows NT as a desktop. We'll look at the first step, taking stock of the current situation, and then at the choices that have to be made based on that. Then we'll look at the migration process and examine some of the problems and successes organizations have faced in making migrations work. Also covered are recent technologies such as Live CDs, WINE (www.winehq.org), and Win4Lin (www.netraverse.com), as well as application equivalents and data conversion tools, such as Rekall (www.totalrekall.co.uk), that make migration less painful.
Convincing the Business
The first step in any successful migration is to have a solid commitment from the decision makers. Every migration I have ever been involved in has strongly resembled an ungodly combination of a train wreck and a bar fight. It takes a clear plan (fail to plan, plan to fail) and a lot of willpower combined with flexibility to get through to the end while reducing to a minimum the amount of bloodshed along the way. Without real buy-in by the decision makers it's not just difficult, it's impossible.
It's crucial to understand as well that not all the decision makers are in the boardroom - it's best to have a core team of active supporters and a majority who are at least passive supporters of the migration effort. A little education and communication up front will go a long way in reducing the costs of the project and ensuring the active, willing cooperation of your core team. This is the second step in a successful migration. When I say core team, I don't mean the experts that may be brought in to install and train users; I mean users who have bought into the new technologies and are willing to put out the extra effort needed to carry it through. You'll need them.
Identifying the Task at Hand
The third step in a successful migration is to take stock of the current state of the shop. You'll need to answer these questions:
- What are our key applications?
- What dependencies do they create?
- Who are our key users?
- How big is the job?
Once you have the answers to these questions you'll be in a position to conduct a systems triage. In a systems triage you divide your key applications into those that can be replaced by functional equivalents, those that cannot be replaced, and those that must be converted in detail. An example of the first group might be a word-processing package - OpenOffice, for example, can replace Microsoft Word.
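The triage described above can be sketched as a simple pass over an application inventory. The application names, fields, and categories below are hypothetical examples chosen for illustration, not a prescribed list:

```python
# A minimal sketch of a systems triage: sort an application inventory
# into "replace" (functional equivalent exists), "keep" (cannot or
# should not be replaced), or "rewrite" (homegrown, must be converted
# in detail). All inventory entries are invented examples.
inventory = [
    {"app": "Microsoft Word",   "homegrown": False, "equivalent": "OpenOffice"},
    {"app": "Adobe Photoshop",  "homegrown": False, "equivalent": None},
    {"app": "Invoice VB macro", "homegrown": True,  "equivalent": None},
]

def triage(entry):
    if entry["homegrown"]:
        return "rewrite"   # must be converted in detail
    if entry["equivalent"]:
        return "replace"   # a functional equivalent exists
    return "keep"          # too costly or impossible to replace

buckets = {}
for entry in inventory:
    buckets.setdefault(triage(entry), []).append(entry["app"])

print(buckets)
```

Even a crude inventory like this, kept up to date as the evaluation proceeds, makes the size of each bucket - and therefore the size of the job - visible to the decision makers.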
The second group comprises two categories: applications that are unacceptably expensive to replace because of reengineering or retraining costs, and applications that cannot be replaced because of external requirements. A company may find that the retraining costs for moving people from Adobe Photoshop to the GIMP are unacceptably high, for example. Or they may have a requirement to provide material in certain formats that they cannot modify, such as a supplier whose largest customer stipulates that certain information must be transferred using Access or Excel.
The last group encompasses the "homegrown" components of the desktop system, such as Word macros or Visual Basic utilities, which would need to be rewritten in a new package.
This last group is where most of the migration "gotchas" lurk, and early identification of them is critical. Although zealots on both sides will often try to show that the choice between Windows and Linux is all or nothing, this isn't true in most cases. There is a set of technologies that allows Windows applications to be run on Linux. There are a lot of options here, from WINE, CodeWeavers (www.codeweavers.com), and Win4Lin, which provide a basic environment for executing Windows applications directly within Linux, through to full operating system emulation environments such as VMware (www.vmware.com).
These technologies are quite solid and when properly applied can give you the best of both worlds. Users use applications, and applications use operating systems, so a solution that gives the users applications that they can work with and the applications a stable, secure operating system may be the best solution - or at least one that gives you a little more breathing space.
Building the New Environment
Once the analysis is done, you'll be in a position to make evaluations that will lead to firm decisions about the specific technologies and packages you'll be using. This is an area where open source stops being an abstraction and becomes a serious business advantage. You don't need to buy a pig in a poke - you can get several pigs and make them jump through hoops for a very low cost.
If you take advantage of Linux on bootable CD technologies such as KNOPPIX, you can reduce the cost of testing and evaluation significantly. For example, rather than setting up a test machine or network and moving over a typical set of material, you can simply boot your existing machines with KNOPPIX and try opening your existing Word documents with OpenOffice.
Your core users can try things like switching over to Linux and falling back to Windows when required. There are also a lot of resources for choosing Windows application equivalents on Linux and many articles describing Windows-to-Linux migration in general.
The best guide I've found is the Migration Guide put out by the KBSt Publication Service, a 441-page PDF containing a thorough and well-written analysis sure to be useful to anyone considering a migration.
The absence of license fees and ready availability of much of the software sharply reduces the cost of doing an incremental migration. The variety among Linux distributions is an advantage here, rather than a liability, because no matter what your existing hardware base is, you'll be able to find a distribution that will run on it. If the one you find can't do what you want, you'll be able to determine the needed upgrades much more exactly than by simply taking a minimum requirements list from a vendor's sales material. On the other hand, if you want to obtain professional services to assist your evaluation, companies such as IBM (www-1.ibm.com/linux) and Racemi (www.racemi.com) offer consulting services in this area.
I haven't found any products designed specifically for assisting desktop migrations; however, two tools I often recommend are OpenOffice and Rekall. OpenOffice's ability to read Word and Excel formats and write a variety of formats make it an ideal replacement for the Windows equivalents, while Rekall allows you to read an Access database via ODBC and write that data to PostgreSQL, MySQL, or a number of other databases. For the vast majority of desktop systems this will allow you to transfer the user data.
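Rekall does this kind of transfer over a live ODBC connection; since that can't be shown self-contained here, the sketch below uses a CSV export and SQLite (both in the Python standard library) as stand-ins for the Access source and the PostgreSQL/MySQL target. The customer data is invented for illustration:

```python
import csv, io, sqlite3

# A stand-in sketch of the transfer Rekall automates: read rows exported
# from a legacy database (a CSV export here; ODBC in Rekall's case) and
# load them into a SQL database (SQLite here; PostgreSQL or MySQL in
# practice). The table and rows are hypothetical.
exported = io.StringIO("id,name,city\n1,Acme,Penang\n2,Globex,Ipoh\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, city TEXT)")

reader = csv.DictReader(exported)
conn.executemany(
    "INSERT INTO customers (id, name, city) VALUES (?, ?, ?)",
    [(row["id"], row["name"], row["city"]) for row in reader],
)

rows = conn.execute(
    "SELECT id, name, city FROM customers ORDER BY id"
).fetchall()
print(rows)
```

The shape of the job is the same whatever the endpoints: pull rows out of the legacy store, map the columns, and insert them into the new one - which is why a generic tool can cover the vast majority of desktop data.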
In situations where you cannot easily transfer data, you may have to change your approach and look for an equivalent or compatible software package instead. For example, Polyscientific Enterprise found that its Lotus SmartSuite documents could not be read by OpenOffice, and reassessed the underlying business problem to find a solution within another package.
You can use one of the methods described previously to run a Windows application on Linux. In any event, when you have decided on the correct mix of application packages, make sure they can work together. Having your core team perform interoperability testing by actually moving real data around and verifying the results is the best way to discover problems. Once again, solutions such as KNOPPIX can be a real help at this point.
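One way the core team can structure such interoperability checks is as a round-trip test: write real data out through the new tool, read it back, and verify nothing was lost. A minimal sketch of the idea, using a CSV round trip as a stand-in for the real document formats being exchanged:

```python
import csv, io

def round_trip(rows, fieldnames):
    """Write rows out and read them back, mimicking a cross-package
    data exchange; returns the rows as re-read."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    buf.seek(0)
    return list(csv.DictReader(buf))

# Hypothetical sample data standing in for real production records.
original = [
    {"product": "solvent", "qty": "40"},
    {"product": "resin",   "qty": "12"},
]
assert round_trip(original, ["product", "qty"]) == original
print("round trip OK")
```

The assertion is the point: a migration test that doesn't compare the data coming back against the data that went in will miss exactly the silent corruption that hurts most in production.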
Realizing the Migration
So, after you've gotten a solid commitment, decided on your migration plan, assembled the core team, and assembled and tested your solution, you're faced with training and supporting your end users. Some suggestions to make this easier:
- Try to do it a few users at a time, or one functional group at a time.
- Evaluate the material available for free from places such as Openoffice.org, and make this material available through an internal Web application such as a forum or a Wiki.
- Set up a Web-based training package such as Moodle (www.moodle.org).
- If you can, make your core team available to help people out.
- Test your chosen architecture and software suite and ensure that it fulfils your functional requirements.
- Test the interoperability of your new solutions with your legacy systems and verify that they work in a production environment before you commit them organization-wide.
- Test your training and documentation setup using typical users with no previous background. Remember that if people can't be brought up to speed on the new solutions in a cost-effective way, it won't work.
- Expect problems. Testing will reduce, but not eliminate, them, and you'll have to react quickly while under a great deal of stress.
Summary
People and commitment are the key to a successful migration. If you have them you can succeed - and if you can take advantage of the open source edge, you can do it for a lot less. Migrations are always a high-stress activity, and desktop migration is particularly so because it forces users to cope with more-visible changes than, for example, upgrading an e-mail server. Careful goal definition, planning, solution evaluation, and end-user training are all critical components, as is a dedicated core team and a step-by-step approach. The lower cost, greater interoperability, and greater flexibility of open source technologies, when used properly as part of a well-thought-out and coordinated plan, will get you to the end of your migration path with a stable, secure, and lower-cost desktop.